Input Operation Test with Large Records

Q

How do I run an input operation test using INSERT statements with large records?

✍: FYIcenter.com

A

If you want to test INSERT statement performance with large record sizes, you need to use a table with multiple columns, since a single VARCHAR column is limited to 8,000 characters.
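
For reference, here is the limit that the multi-column layout works around; FYI_TOO_WIDE is just a hypothetical table name used for illustration:

-- This fails: a non-MAX VARCHAR column cannot be larger than 8,000 characters
CREATE TABLE FYI_TOO_WIDE (MSG VARCHAR(9000))

Splitting the record across 8 VARCHAR(4096) columns instead allows record sizes of up to about 32 KB in this test.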

Create the following SQL script, fyi_block_insert.sql:

-- fyi_block_insert.sql
-- Copyright (c) FYIcenter.com 

-- Test parameters: number of INSERT statements and record size in bytes
DECLARE @Count INT
SET @Count = 100
DECLARE @Size INT
SET @Size = 256

DECLARE @STime DATETIME
DECLARE @ETime DATETIME
DECLARE @Duration INT

-- Build a filler string by repeated doubling (45 characters x 2^7 = 5,760),
-- then trim it to the per-column size: @Size/8, minus 2, presumably to
-- allow for the 2-byte length overhead of each VARCHAR value
DECLARE @I INT
DECLARE @Message VARCHAR(5760)
SET @Message = 'The quick brown fox jumps over the lazy dog. '
SET @I = 7
WHILE @I > 0 BEGIN
   SET @Message = @Message+@Message
   SET @I = @I - 1
END
SET @Message = SUBSTRING(@Message,1,@Size/8-2)

-- Spread each record across 8 VARCHAR columns, since a single
-- non-MAX VARCHAR column is limited to 8,000 characters
CREATE TABLE FYI_BLOCK (
   MSG_1 VARCHAR(4096),
   MSG_2 VARCHAR(4096), 
   MSG_3 VARCHAR(4096), 
   MSG_4 VARCHAR(4096), 
   MSG_5 VARCHAR(4096), 
   MSG_6 VARCHAR(4096), 
   MSG_7 VARCHAR(4096), 
   MSG_8 VARCHAR(4096)
) 

-- Start timing the INSERT loop
SET @STime = GETDATE()

-- Run @Count sequential single-record INSERT statements
SET @I = 0
WHILE @I < @Count BEGIN
   INSERT INTO FYI_BLOCK VALUES (@Message,@Message,@Message,@Message,
      @Message,@Message,@Message,@Message)
   SET @I = @I+1
END

-- Stop timing and drop the test table
SET @ETime = GETDATE()
DROP TABLE FYI_BLOCK

-- Report the results: elapsed time in ms, records/sec, ms per record,
-- KB/sec, and IOPS (counting one I/O per 8 KB written)
SET @Duration = DATEDIFF(millisecond,@STime,@ETime)
SELECT @Size AS Size, @Count AS Count, @Duration AS Duration, 
  1000*@Count/@Duration AS 'Throughput Record', 
  @Duration/@Count AS Latency,
  @Size*1000*@Count/@Duration/1024 AS 'Throughput KB',
  @Size*1000*@Count/@Duration/1024/8 AS IOPS
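
One way to execute the script against your Azure SQL Database is the sqlcmd utility; the server, database, and login names below are placeholders to replace with your own:

sqlcmd -S yourserver.database.windows.net -d yourdatabase -U youruser -P yourpassword -i fyi_block_insert.sql

You can also paste the script into a query window in SQL Server Management Studio and run it there.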

Now run the script with different @Size and @Count values on your Azure SQL Database. You will get something like:

 Size   Count   Duration   Throughput   Latency   Throughput   IOPS
  (B)             (ms)     (rec/sec)      (ms)     (KB/sec)
  256     100        563          177         5           44      5
  512     100        574          174         5           87     10
 1024     100        587          170         5          170     21
 2048     100        627          159         6          318     39
 4096     100        590          169         5          676     84
 8192     100        693          144         6         1152    144
16384     100       1704           58        17          928    116
32768     100       3533           28        35          896    112
               
 8192    1000      13934           71        13          568     71
16384    1000      24010           41        24          656     82
               
 8192   10000     146900           68        14          544     68
16384   10000     249466           40        24          640     80
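
To read the result columns, take the 8,192-byte, 100-statement row as a worked example (the script uses integer arithmetic, so values are truncated):

Throughput Record = 1000 * 100 / 693  ≈ 144 records/sec
Latency           = 693 / 100         ≈ 6 ms per INSERT
Throughput KB     ≈ 8192 * 144 / 1024 ≈ 1,152 KB/sec
IOPS              ≈ 1,152 / 8         ≈ 144 (counting one I/O per 8 KB written)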

As you can see, throughput peaks at about 1,152 KB/sec when the record size matches the I/O block size of 8,192 bytes (8 KB) and the workload is light (a test of 100 sequential INSERT statements). At that peak, the test reports about 144 IOPS (1,152 KB/sec divided by the 8-KB block size).

Two other observations:

  • Records larger than the block size give poorer performance. For example, an 8-KB record size gives 1,152 KB/sec, while a 16-KB record size gives only 928 KB/sec.
  • A heavier workload gives poorer performance. For example, the 100-statement loop gives 1,152 KB/sec, while the 1000-statement loop gives only 568 KB/sec.
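
As a side note, if you want to repeat the test over many @Size and @Count values without editing the script each time, one option (only a sketch) is to use sqlcmd scripting variables. Change the two SET lines in fyi_block_insert.sql to:

SET @Count = $(Count)
SET @Size = $(Size)

and pass the values on the command line, for example:

sqlcmd -S yourserver.database.windows.net -d yourdatabase -U youruser -P yourpassword -v Size=8192 Count=100 -i fyi_block_insert.sql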

 

