I am collecting costs by employee from 6 sources and then inserting them into a GeneratedDetail table. The collected data has 1000+ rows, and inserting them one row at a time is very slow, so I needed to find a bulk insert solution.
I am using Dapper as the ORM in my C# project, with an MS SQL database.
Based on my research, my approach is as below. It now works quite fast, and I am satisfied with the execution speed. Feel free to comment and propose a better solution.
Note: spGeneratedDetailInsert is a stored procedure with many parameters.
public void GeneratedDetailInsertBulk1(List<GeneratedDetail> DetailList)
{
    using (IDbConnection connection = new System.Data.SqlClient.SqlConnection(connectionString))
    {
        var sql = "";
        foreach (var mydet in DetailList)
        {
            sql = sql + "Exec spGeneratedDetailInsert '" + mydet.BillingCode + "', " +
                  mydet.Emp_id + ", '" +
                  mydet.Emp_name + "', " +
                  mydet.HrId + ", " +
                  mydet.Entity_id + ", " +
                  mydet.BU_id + ", '" +
                  mydet.Currency + "', '" +
                  mydet.Item + "', " +
                  mydet.ExchangeRate + ", " +
                  mydet.InitValueRON + ", " +
                  mydet.InitValueCurr + ", " +
                  mydet.MUValueCurr + ", " +
                  mydet.TotalRon + ", '" +
                  mydet.ServiceType + "', '" +
                  mydet.CostCategory + "', '" +
                  mydet.UpdatedBy + "', '" +
                  mydet.UpdatedAt + "', '" +
                  mydet.PeriodId + "' ; ";
        }
        connection.Execute(sql);
    }
}
Note: I am not using Dapper Plus, and I don't want to.
Top comments (8)
Why not simple?
What you have there is a SQL injection vulnerability; don't concatenate strings to build SQL statements.
Also, Dapper supports passing a list of parameters and can execute that stored procedure once for each item in the list.
If it actually doesn't, a foreach will definitely work:
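A minimal sketch of the parameterized approach suggested above, assuming this method lives in the same class as the original one (so connectionString is available) and that the GeneratedDetail property names match the stored procedure's parameter names:

```csharp
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;
using Dapper;

public void GeneratedDetailInsertParameterized(List<GeneratedDetail> detailList)
{
    using (IDbConnection connection = new SqlConnection(connectionString))
    {
        // When Execute is given an IEnumerable, Dapper runs the command once
        // per element, binding each object's properties as parameters.
        // No string concatenation, so no SQL injection risk.
        connection.Execute(
            "spGeneratedDetailInsert",
            detailList,
            commandType: CommandType.StoredProcedure);
    }
}
```

Note this still makes one round trip per row, so it will not be dramatically faster than the original; its main win is safety and readability.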
The better way would be to not use stored procedures. If you need an actual bulk insert, convert your list to a DataTable, using a method such as the one here, and then do the actual bulk insert; take sample code from this StackOverflow question.

I appreciate your response. Structurally your solution is more elegant, but mine is 2x faster. I have 850 records, and my solution inserts them in 15 seconds, whereas the proposed solution takes around 33 seconds. Speed is very important here, as there will be around 1500-2000 records on a monthly basis, and we will use an Azure SQL database to store the data, so updates are a bit slow.
Thank you for your suggestion anyway, it's a great way to learn and try.
850 records in 15 seconds is still really, really slow :)
Are your inserts complicated so that you need a stored procedure?
With a proper bulk insert you could do all of that in less than a second.
Yes, I checked your suggestion with a proper SQL bulk insert. It works now, taking indeed around 1 second. I needed to convert the List to a DataTable using the suggested method. Thank you! :)
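For reference, the List-to-DataTable conversion plus SqlBulkCopy approach discussed above can be sketched roughly as follows. This is only a sketch: the destination table name dbo.GeneratedDetail is an assumption, and the reflection-based conversion stands in for whatever method the linked answer uses.

```csharp
using System;
using System.Collections.Generic;
using System.ComponentModel;
using System.Data;
using System.Data.SqlClient;

public void GeneratedDetailBulkInsert(List<GeneratedDetail> detailList)
{
    // Build a DataTable with one column per public property, via reflection.
    var table = new DataTable();
    var props = TypeDescriptor.GetProperties(typeof(GeneratedDetail));
    foreach (PropertyDescriptor prop in props)
        table.Columns.Add(prop.Name,
            Nullable.GetUnderlyingType(prop.PropertyType) ?? prop.PropertyType);

    foreach (var item in detailList)
    {
        var row = table.NewRow();
        foreach (PropertyDescriptor prop in props)
            row[prop.Name] = prop.GetValue(item) ?? DBNull.Value;
        table.Rows.Add(row);
    }

    // Stream the whole table to SQL Server in a single bulk operation.
    using (var connection = new SqlConnection(connectionString))
    using (var bulkCopy = new SqlBulkCopy(connection))
    {
        connection.Open();
        bulkCopy.DestinationTableName = "dbo.GeneratedDetail"; // assumed name
        foreach (DataColumn col in table.Columns)
            bulkCopy.ColumnMappings.Add(col.ColumnName, col.ColumnName);
        bulkCopy.WriteToServer(table);
    }
}
```

Explicit column mappings are worth keeping: without them, SqlBulkCopy maps by ordinal position, which silently breaks if the table's column order differs from the property order.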
There you go, way better than the initial way you had it, and many times faster too.
Hi, have you tried RepoDb.InsertAll(DetailList)? It issues "packed" statements. The first run could be slower by about 1 second due to compilation, but subsequent runs would be extremely fast.
Otherwise, there is RepoDb.BulkInsert, which uses SqlBulkCopy underneath and does a real bulk operation. That would only take 1 millisecond, I suppose.
Please let us know if you try it. Thanks!
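A hedged sketch of the RepoDb suggestion above, assuming the RepoDb.SqlServer NuGet package (and RepoDb.SqlServer.BulkOperations for BulkInsert) plus its one-time bootstrap call; check the RepoDb docs for the exact setup your version requires:

```csharp
using System.Collections.Generic;
using System.Data.SqlClient;
using RepoDb;

public void GeneratedDetailInsertRepoDb(List<GeneratedDetail> detailList)
{
    // One-time initialization, typically done at application startup.
    SqlServerBootstrap.Initialize();

    using (var connection = new SqlConnection(connectionString))
    {
        // InsertAll: batches rows into packed multi-statement inserts.
        connection.InsertAll(detailList);

        // BulkInsert: SqlBulkCopy under the hood (needs the
        // RepoDb.SqlServer.BulkOperations package).
        // connection.BulkInsert(detailList);
    }
}
```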
Thanks Michael, I figured it out already, see my post: dev.to/zoltanhalasz/bulk-insert-wi...
Great and thanks!