Jenninha - 4 months ago
SQL Question

Fastest way to insert 30 thousand rows in a temp table on SQL Server with C#

I am trying to find out how I can improve my insert performance into a temporary table in SQL Server using C#. People have suggested that I use SqlBulkCopy, but I must be doing something wrong, because it seems to run much more slowly than simply building an SQL insert string instead.

My code to create the table using SqlBulkCopy is below:

public void MakeTable(string tableName, List<string> ids, SqlConnection connection)
{
    SqlCommand cmd = new SqlCommand("CREATE TABLE ##" + tableName + " (ID int)", connection);
    cmd.ExecuteNonQuery();

    DataTable localTempTable = new DataTable(tableName);

    DataColumn id = new DataColumn();
    id.DataType = System.Type.GetType("System.Int32");
    id.ColumnName = "ID";
    localTempTable.Columns.Add(id);

    foreach (var item in ids)
    {
        DataRow row = localTempTable.NewRow();
        row[0] = item;
        localTempTable.Rows.Add(row);
        localTempTable.AcceptChanges();
    }

    using (SqlBulkCopy bulkCopy = new SqlBulkCopy(connection))
    {
        bulkCopy.DestinationTableName = "##" + tableName;
        bulkCopy.WriteToServer(localTempTable);
    }
}


Done this way, my inserts take a long time to run. I got my inserts to work faster another way:

I built the insert statements as a string and appended them to my SQL create-temp-table statement:

Creation of the inserts string:

public string prepareInserts(string tableName, List<string> ids)
{
    List<string> inserts = new List<string>();

    var total = ids.Count;
    var size = 1000;

    var insert = 1;

    var skip = size * (insert - 1);

    var canPage = skip < total;

    while (canPage)
    {
        inserts.Add(" insert into ##" + tableName + " (ID) values "
            + String.Join(",", ids.Select(p => string.Format("({0})", p))
                .Skip(skip)
                .Take(size)
                .ToArray()));
        insert++;
        skip = size * (insert - 1);
        canPage = skip < total;
    }

    string joinedInserts = String.Join("\r\n", inserts.ToArray());

    return joinedInserts;
}


Using them in the SQL statement when creating the query:

inserts = prepareInserts(tableName, ids);

var query = @"IF EXISTS
(
SELECT *
FROM tempdb.dbo.sysobjects
WHERE ID = OBJECT_ID(N'tempdb..##" + tableName + @"')
)
BEGIN
DELETE FROM ##" + tableName + @"
END
ELSE
BEGIN
CREATE TABLE ##" + tableName + @"
(ID int)
END " + inserts;

var command = new SqlCommand(query, sqlConnection);
...


Since people have told me (on Stack Exchange: http://dba.stackexchange.com/questions/44217/fastest-way-to-insert-30-thousand-rows-in-sql-server/44222?noredirect=1#comment78137_44222 ) that I should use SqlBulkCopy and that it would be faster, I believe I should improve the way I do it. So if anyone can suggest how I can improve my SqlBulkCopy code, OR tell me if there is a better insert statement that can improve my application's performance, that would be great.

Mzf
Answer

Your problem may be in localTempTable.AcceptChanges(); since it commits your changes on every iteration of the loop.
If you move it out of the loop, as below, I think it will run faster:

    foreach (var item in ids)
    {
        DataRow row = localTempTable.NewRow();
        row[0] = item;
        localTempTable.Rows.Add(row);
    }

    localTempTable.AcceptChanges();

    using (SqlBulkCopy bulkCopy = new SqlBulkCopy(connection))
    {
        bulkCopy.DestinationTableName = "##" + tableName;
        bulkCopy.WriteToServer(localTempTable);
    }

From MSDN - DataSet.AcceptChanges

Commits all the changes made to this DataSet since it was loaded or since the last time AcceptChanges was called.
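As a side note, with the WriteToServer(DataTable) overload the AcceptChanges call may not be needed at all, since the rows are copied whether or not their changes have been "accepted". A minimal sketch of the method without it, with a BatchSize setting added (the value 5000 is just an assumption; tune it for your own data):

    public void MakeTable(string tableName, List<string> ids, SqlConnection connection)
    {
        using (var cmd = new SqlCommand("CREATE TABLE ##" + tableName + " (ID int)", connection))
        {
            cmd.ExecuteNonQuery();
        }

        var localTempTable = new DataTable(tableName);
        localTempTable.Columns.Add("ID", typeof(int));

        foreach (var item in ids)
        {
            // No AcceptChanges anywhere: committing the change tracking
            // per row only adds overhead before a bulk copy.
            localTempTable.Rows.Add(item);
        }

        using (var bulkCopy = new SqlBulkCopy(connection))
        {
            bulkCopy.DestinationTableName = "##" + tableName;
            bulkCopy.BatchSize = 5000; // assumed value; tune for your workload
            bulkCopy.WriteToServer(localTempTable);
        }
    }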
