I need to load a large CSV file (1.5 GB) into the database. I tried inserting it row by row with INSERT, but that takes far too long: 100,000 rows load in about 2 minutes, and the file has 120,000,000 rows. I have read that SqlBulkCopy can do this much faster. The question is: can it load a CSV split into columns? In my CSV the Series and Number columns are separated by ",", and the target table has the same two columns. Are there other, faster ways to load the data, and how do I set up the field mapping?

string line = ""; string con_str = "server;Database=base;Trusted_Connection=True;"; using (SqlConnection connection = new SqlConnection(con_str)) { connection.Open(); using (StreamReader file = new StreamReader(new BufferedStream(File.OpenRead(@"D:\bzip2\WriteLines.csv"), 10 * 1024 * 1024))) { //string table = "ElmaBadPassport"; while ((line = file.ReadLine()) != null) { if (line.Length == 11) { string[] values = line.Split(','); SqlCommand cmd = new SqlCommand("Insert INTO ElmaBadPassport(Series,Number) VALUES (@series, @number)",connection); cmd.Parameters.AddWithValue("@series",values[0].ToString()); cmd.Parameters.AddWithValue("@Number", values[1].ToString()); cmd.ExecuteNonQuery(); //Console.WriteLine(cmd.ExecuteNonQuery()); } } } connection.Close(); } 
  • If you only need to do this once, you can use the Import/Export Wizard in SSMS. - Nick Proskuryakov
  • @NickProskuryakov it is slow because it inserts one row at a time. ESF Migration Toolkit handles this well and does it fast. - iluxa1810
  • @iluxa1810 if we are talking about speed, then load it through SSIS in several parallel streams; everything there is easily configured too. But as far as I understand, the author needs to do this from C#. - Nick Proskuryakov
  • @NickProskuryakov unfortunately, it has to be in C#, and repeatedly. We wrote software that downloads the source data, unpacks it, and then builds a new CSV with the corrected data. That CSV then has to be loaded into the database. - Andrey Sherman

1 answer

For a one-time solution I recommend ESF Migration Toolkit, which handles this job very well, whereas the standard import/export tool in SSMS inserts one row at a time. The downside is that it is a paid tool...

If you still want to implement this yourself, there is the SqlBulkCopyColumnMapping class, which lets you map a source field to a destination field. Keep in mind, however, that while the DBMS does not care about the letter case of column names, SqlBulkCopy does: it will not match names that are identical except for case.
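
For illustration, here is a minimal sketch of that approach, reusing the connection string, file path, and table from the question (the chunk size is an arbitrary assumption): it reads the CSV in chunks into a DataTable and pushes each chunk to the server with SqlBulkCopy, with an explicit column mapping.

    using System;
    using System.Data;
    using System.Data.SqlClient;
    using System.IO;

    class BulkLoader
    {
        static void Main()
        {
            // Assumption: same connection string, file path and table as in the question
            string con_str = "server;Database=base;Trusted_Connection=True;";
            const int chunkSize = 50000; // arbitrary; flush every 50k rows to bound memory use

            // Buffer whose columns describe the CSV layout
            var buffer = new DataTable();
            buffer.Columns.Add("Series", typeof(string));
            buffer.Columns.Add("Number", typeof(string));

            using (var connection = new SqlConnection(con_str))
            {
                connection.Open();
                using (var bulk = new SqlBulkCopy(connection))
                using (var file = new StreamReader(@"D:\bzip2\WriteLines.csv"))
                {
                    bulk.DestinationTableName = "ElmaBadPassport";
                    bulk.BulkCopyTimeout = 0;   // no timeout: 120M rows take a while
                    bulk.BatchSize = chunkSize;

                    // Explicit source -> destination mapping; the names must match
                    // the destination columns exactly, including letter case
                    bulk.ColumnMappings.Add("Series", "Series");
                    bulk.ColumnMappings.Add("Number", "Number");

                    string line;
                    while ((line = file.ReadLine()) != null)
                    {
                        string[] values = line.Split(',');
                        if (values.Length < 2) continue; // skip malformed rows
                        buffer.Rows.Add(values[0], values[1]);

                        if (buffer.Rows.Count >= chunkSize)
                        {
                            bulk.WriteToServer(buffer);
                            buffer.Clear(); // drop the flushed chunk before reading on
                        }
                    }

                    if (buffer.Rows.Count > 0)
                        bulk.WriteToServer(buffer); // flush the tail
                }
            }
        }
    }

ColumnMappings.Add takes the source and destination column names; here they happen to coincide, but the same call is what lets you map a CSV column onto a differently named table column.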

Also, as far as I know, you can run a BULK INSERT directly in MS SQL with a script like this:

    BULK INSERT SchoolsTemp
    FROM 'C:\CSVData\Schools.csv'
    WITH
    (
        FIRSTROW = 2,
        FIELDTERMINATOR = ',',  -- CSV field delimiter
        ROWTERMINATOR = '\n',   -- moves control to the next row
        ERRORFILE = 'C:\CSVDATA\SchoolsErrorRows.csv',
        TABLOCK
    )

Though I have not tried it myself.
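
If you need to drive it from C# anyway, the same script can be executed as an ordinary command. A sketch under the same assumptions as above (connection string and file path from the question); note that BULK INSERT opens the file on the SQL Server machine, so the path must exist on the server, not on the client:

    using System.Data.SqlClient;

    class BulkInsertRunner
    {
        static void Main()
        {
            // Assumption: same connection string and file path as in the question;
            // the path is resolved on the SQL Server machine, not on the client
            string con_str = "server;Database=base;Trusted_Connection=True;";
            const string sql = @"
                BULK INSERT ElmaBadPassport
                FROM 'D:\bzip2\WriteLines.csv'
                WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', TABLOCK);";

            using (var connection = new SqlConnection(con_str))
            using (var cmd = new SqlCommand(sql, connection))
            {
                cmd.CommandTimeout = 0; // the load can run for a long time
                connection.Open();
                cmd.ExecuteNonQuery();
            }
        }
    }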

P.S. And do not forget to drop the indexes and keys on the target table first, since they slow down the insert considerably.
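
As a sketch, assuming a hypothetical nonclustered index named IX_ElmaBadPassport_Number (substitute the real index names; do not disable the clustered index, or the table becomes inaccessible until it is rebuilt):

    using System.Data.SqlClient;

    class IndexToggle
    {
        // Index name is hypothetical -- substitute the real ones on the target table
        const string Disable = "ALTER INDEX IX_ElmaBadPassport_Number ON ElmaBadPassport DISABLE;";
        const string Rebuild = "ALTER INDEX IX_ElmaBadPassport_Number ON ElmaBadPassport REBUILD;";

        static void Exec(SqlConnection connection, string sql)
        {
            using (var cmd = new SqlCommand(sql, connection))
            {
                cmd.CommandTimeout = 0; // a rebuild over 120M rows is slow
                cmd.ExecuteNonQuery();
            }
        }

        static void Main()
        {
            string con_str = "server;Database=base;Trusted_Connection=True;";
            using (var connection = new SqlConnection(con_str))
            {
                connection.Open();
                Exec(connection, Disable);  // stop per-row index maintenance
                // ... run the bulk load here ...
                Exec(connection, Rebuild);  // rebuild once over the loaded data
            }
        }
    }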

  • i.e. SqlBulkCopyColumnMapping will take the source column name and write to the right destination column, provided the letter case matches? - Andrey Sherman
  • @AndreySherman, it should. Note that the input has to be a DataTable or a DataReader. - iluxa1810