Hi,
In this post I would like to share a bulk-insert approach using C# rather than SSIS.
First of all, why would I do that? Hmm…
Well, I came across this situation when working with substantially large data sets: the strategy for processing them as fast as possible is to partition the processing into small chunks and compute them in parallel.
But as great as SSIS is in its own way, it hasn't reached the mark of creating multiple threads/tasks (in the C# world) and processing them independently, so I usually opt for a custom-made console application for the task at hand.
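To sketch what I mean by chunked, task-based processing: the snippet below partitions a data set into fixed-size chunks and hands each chunk to an independent task. The chunk size, the sample data, and the ProcessChunk worker are all hypothetical placeholders; in a real loader the worker would do the actual database work.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;

class ChunkDemo
{
    // Hypothetical worker: in a real loader this is where each
    // chunk would be transformed and inserted into the database.
    static void ProcessChunk(IReadOnlyList<int> chunk)
    {
        Console.WriteLine($"Processing a chunk of {chunk.Count} rows");
    }

    static void Main()
    {
        List<int> rows = Enumerable.Range(1, 10_000).ToList();
        const int chunkSize = 1_000; // assumed chunk size for illustration

        // Partition the data set into fixed-size chunks...
        var chunks = new List<List<int>>();
        for (int i = 0; i < rows.Count; i += chunkSize)
            chunks.Add(rows.GetRange(i, Math.Min(chunkSize, rows.Count - i)));

        // ...and process each chunk as an independent task.
        Task[] tasks = chunks
            .Select(c => Task.Run(() => ProcessChunk(c)))
            .ToArray();
        Task.WaitAll(tasks);
    }
}
```

This is the pattern the rest of the post assumes: each task works on its own slice of the data, so the inserts themselves become the bottleneck, which is where bulk copy comes in.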
And when I built a tool along those lines, I ran into issues: even the small chunks of data I wanted to insert into a staging/fact table kept growing, and my inserts were slow.
To the rescue came this bulk-insert methodology: SQL bulk inserts can be performed from C# code as well, which helped me keep performance up to standard.
using (SqlCommand sqlCommand = new SqlCommand())
{
    sqlCommand.CommandTimeout = _commandTimeOut;
    sqlCommand.Connection = _sqlConnection;
    sqlCommand.CommandType = System.Data.CommandType.StoredProcedure;
    sqlCommand.CommandText = "[Schema].[dataproc]";
    sqlCommand.Parameters.Add("@runDate", SqlDbType.Int).Value = _runDate;

    // Keep the reader inside the command's scope and dispose it when done;
    // SqlBulkCopy streams rows from it straight into the destination table.
    using (SqlDataReader reader = sqlCommand.ExecuteReader())
    using (SqlBulkCopy sqlBcp = new SqlBulkCopy("SQL Connection String"))
    {
        sqlBcp.BulkCopyTimeout = 0; // no timeout, for large loads
        sqlBcp.DestinationTableName = "[Schema].[Destination Table]";
        sqlBcp.ColumnMappings.Add(new SqlBulkCopyColumnMapping("Reader Column Name", "Destination Table Column Name"));
        sqlBcp.ColumnMappings.Add(new SqlBulkCopyColumnMapping("Reader Column Name1", "Destination Table Column Name1"));
        sqlBcp.ColumnMappings.Add(new SqlBulkCopyColumnMapping("Reader Column Name2", "Destination Table Column Name2"));
        sqlBcp.WriteToServer(reader);
    }
}
I hope the above helps in your code as well.