Input Operation Test with VARCHAR Data Type
How to run an input operation test with INSERT statements on VARCHAR data type?
✍: FYIcenter.com
If you want to test the INSERT statement performance with a variable record size using the VARCHAR data type, you can follow this tutorial. The VARCHAR data type uses a storage size of the string length plus 2 extra bytes.
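For example, a 3-character string stored in a VARCHAR column takes 5 bytes of storage. The quick check below only illustrates this formula; the +2 is added by hand to account for the 2-byte length prefix:

   DECLARE @v VARCHAR(100)
   SET @v = 'abc'
   SELECT DATALENGTH(@v) AS DataBytes,        -- 3 bytes of character data
          DATALENGTH(@v) + 2 AS StorageBytes  -- plus the 2-byte length overhead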
Create the following SQL script, fyi_varchar_insert.sql:
-- fyi_varchar_insert.sql
-- Copyright (c) FYIcenter.com

-- Number of records to insert and storage size (in bytes) per record
DECLARE @Count INT
SET @Count = 100
DECLARE @Size INT
SET @Size = 4

DECLARE @STime DATETIME
DECLARE @ETime DATETIME
DECLARE @Duration INT
DECLARE @I INT

-- Build a 5,760-character message by doubling the 45-character seed 7 times
DECLARE @Message VARCHAR(5760)
SET @Message = 'The quick brown fox jumps over the lazy dog. '
SET @I = 7
WHILE @I > 0 BEGIN
   SET @Message = @Message+@Message
   SET @I = @I - 1
END

CREATE TABLE FYI_VARCHAR (MESSAGE VARCHAR(4096))

-- Time @Count single-row INSERT statements; each value is @Size-2 characters,
-- so each record occupies @Size bytes of storage
SET @STime = GETDATE()
SET @I = 0
WHILE @I < @Count BEGIN
   INSERT INTO FYI_VARCHAR VALUES (SUBSTRING(@Message,1,@Size-2))
   SET @I = @I+1
END
SET @ETime = GETDATE()

DROP TABLE FYI_VARCHAR

-- Report throughput (inserts/sec) and latency (ms/insert)
SET @Duration = DATEDIFF(millisecond,@STime,@ETime)
SELECT @Size AS Size, @Count AS Count, @Duration AS Duration,
   1000*@Count/@Duration AS Throughput, @Duration/@Count AS Latency
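You can run the script from any client connected to your database, for example with the sqlcmd utility. The server, database, and login names below are placeholders for your own values:

   sqlcmd -S yourserver.database.windows.net -d yourdatabase -U youruser -P yourpassword -i fyi_varchar_insert.sql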
Each record now has a single VARCHAR field. The script inserts @Size-2 characters per record, so each stored value uses @Size bytes: the string length plus the 2-byte overhead.
Now run the script with different @Size and @Count values on your Azure SQL Database. You will get something like:
Size   Count   Duration (ms)   Throughput (inserts/sec)   Latency (ms/insert)
   4     100             550                        181                     5
   8     100             547                        182                     5
  16     100             523                        191                     5
  32     100             646                        154                     6
  64     100             573                        174                     5
 128     100             523                        191                     5
 256     100             626                        159                     6
 512     100             660                        151                     6
1024     100             747                        133                     7
2048     100             530                        188                     5
4096     100             670                        149                     6
4096    1000            9807                        101                     9
4096   10000          104630                         95                    10
4096   20000          210070                         95                    10
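Only the two assignments at the top of the script need to change between runs. For example, the last row of the table above corresponds to:

   SET @Count = 20000
   SET @Size = 4096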
Using the last test result, the actual input data throughput is 95 inserts/sec * 4,096 bytes = 389,120 bytes/sec, or about 380 KB/sec.
If we assume an I/O block size of 8 KB, a throughput of 380 KB/sec represents 47.5 IOPS.
If we check the DTU usage metrics for the last execution on the Azure portal, we see that 10 DTUs were used, which corresponds to 25 IOPS at our "Standard S0" service tier level:
   IO throughput (approximate): 2.5 IOPS per DTU
   IO latency (approximate): 5 ms (read), 10 ms (write)
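If you prefer to stay in T-SQL rather than the portal chart, Azure SQL Database also records resource usage in the sys.dm_db_resource_stats view, one row per 15-second interval. The query below is only a sketch of that approach:

   SELECT TOP 20
          end_time,
          avg_cpu_percent,
          avg_data_io_percent,    -- data I/O as a percentage of the tier limit
          avg_log_write_percent,  -- log writes as a percentage of the tier limit
          dtu_limit
     FROM sys.dm_db_resource_stats
    ORDER BY end_time DESC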
So our SQL script actually performed almost 2 times better than what the Azure metrics suggest: 47.5 IOPS vs. 25 IOPS.
⇒ Input Operation Test with Large Records
⇐ Input Operation Test with UUID Data Type
2019-07-21, 1439🔥, 0💬