Get my Udemy courses with a discount: www.felipe-gavilan.com/cursos?idioma=eng
GitHub: github.com/gavilanch/TenMillionRecords
@pavfrang · 2 years ago
Thank you for the great video! Memory consumption is truly great with this approach. In my experience, bulk copy operations with SqlBulkCopy are the fastest way to insert large chunks of data into the database. Unfortunately, a DataTable has to be loaded first; however, you can load the table in batches on another thread, and SqlBulkCopy then sends each batch of data to the database with a single command.
@gavilanch3803 · 2 years ago
True, batches are a solution. Though you need to implement transactions manually in that case, right?
@pavfrang · 2 years ago
@@gavilanch3803 No need for transactions. You just need an open SqlConnection (OK, there is one internally, but you don't really need to set any transaction, because on failure the current operation is reverted automatically). In practice, depending on the record size, the batch size should be tuned/calibrated (after some tests); the optimum batch size really gives the best performance in terms of time.
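For readers who want to try this, here is a minimal sketch of the SqlBulkCopy approach described in this thread (not the video's code; the connection string, the dbo.People target table, and the batch size are assumptions to calibrate against your own schema):

    using System.Data;
    using Microsoft.Data.SqlClient;

    public static class BulkLoader
    {
        // Sends one DataTable to the server in BatchSize-sized chunks,
        // with no explicit transaction: a failed chunk is rolled back automatically.
        public static void BulkInsert(string connectionString, DataTable batch)
        {
            using var connection = new SqlConnection(connectionString);
            connection.Open();

            using var bulkCopy = new SqlBulkCopy(connection)
            {
                DestinationTableName = "dbo.People", // hypothetical target table
                BatchSize = 10_000,                  // calibrate with your own tests
                BulkCopyTimeout = 0                  // disable the timeout for large loads
            };

            bulkCopy.WriteToServer(batch);
        }
    }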
@docdaven · 2 years ago
Try this helper function:

    using System.Data;
    using System.Reflection;

    private static DataTable ToDataTable<T>(List<T> items)
    {
        DataTable dataTable = new(typeof(T).Name);

        // Get all the public instance properties
        PropertyInfo[] props = typeof(T).GetProperties(BindingFlags.Public | BindingFlags.Instance);
        foreach (PropertyInfo prop in props)
        {
            // Defining the type of the data column gives a proper data table;
            // nullable properties must use their underlying type
            var type = prop.PropertyType.IsGenericType
                && prop.PropertyType.GetGenericTypeDefinition() == typeof(Nullable<>)
                    ? Nullable.GetUnderlyingType(prop.PropertyType)
                    : prop.PropertyType;

            // Setting column names as property names
            dataTable.Columns.Add(prop.Name, type);
        }

        foreach (T item in items)
        {
            var values = new object[props.Length];
            for (int i = 0; i < props.Length; i++)
            {
                // Inserting property values into the DataTable rows
                values[i] = props[i].GetValue(item, null);
            }
            dataTable.Rows.Add(values);
        }

        return dataTable;
    }
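A hypothetical usage sketch tying this helper to the bulk copy above (Person, LoadPeople, and connectionString are assumed names):

    List<Person> people = LoadPeople();              // hypothetical source of records
    DataTable table = ToDataTable(people);           // helper above; T inferred as Person
    BulkLoader.BulkInsert(connectionString, table);  // sketch from the earlier reply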
@Tamer_Ali · 2 years ago
Thanks, Mr. Felipe, for this video tutorial. Keep it up 👍
@parmarjitesh1979 · 2 years ago
Superb video as always. Keep it up.
@marcrippee4365 · 1 year ago
Brilliant, thank you
@DucaTech · 1 year ago
How does this compare to SQL Bulk Copy?
@hqcart1 · 1 year ago
Bulk copy is the fastest.
@amolsawant9048 · 7 days ago
Can I get the DB Stored Procedure Script?
@lifeinloop · 2 years ago
I just didn't understand how "yield return" returns all the records one by one, given that the GetData method was only called once...
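For anyone with the same question: yield return turns the method into a lazy iterator, so the single call only builds a state machine, and the body runs incrementally as the caller enumerates. A minimal sketch of the idea (simplified names, not the video's exact code):

    using System;
    using System.Collections.Generic;

    static IEnumerable<int> GetData()
    {
        for (int i = 1; i <= 3; i++)
        {
            // The method body pauses here and resumes on the caller's next MoveNext()
            yield return i;
        }
    }

    // GetData() is called once, but its body runs lazily, one item per loop iteration.
    foreach (int record in GetData())
        Console.WriteLine(record);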
@hqcart1 · 1 year ago
This video does not represent real-life applications:
1. How big are the records?
2. How many indexes do you have?
3. Are there any outside relations?
Any real application will have at least one of the above. There is also another factor: how big is your batch?
@Thalapraga · 1 year ago
Where can I get this code?
@EngineerASO · 2 years ago
Dear sir, is there another efficient way to insert multiple rows from an Access database into SQL Server? I tried using pass-through; it doesn't work when I use a form and try to select all records from its table and insert them into a SQL Server table... Thanks for your reply.
@rmcgraw7943 · 3 months ago
If you are converting from Access to SQL Server, MS has a tool for that; however, you have to rewrite all your queries. The best way is to export the Access data to text, then import it into SQL, but be aware that all your Access queries will alias the SQL tables as aliased tables inside of Access, and that will result in a table scan, so you'll need to rewrite all of those as SQL Server stored procedures, which you CAN then call as pass-through queries.
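To illustrate the text-import step described here, a minimal sketch assuming the Access data was exported to a CSV file; the dbo.People table and the file path are hypothetical, and note that BULK INSERT resolves the path on the SQL Server machine, not the client:

    using Microsoft.Data.SqlClient;

    using var connection = new SqlConnection(connectionString);
    connection.Open();

    using var command = new SqlCommand(
        @"BULK INSERT dbo.People
          FROM 'C:\exports\people.csv'
          WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2);",
        connection);
    command.ExecuteNonQuery();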
@DoctorMGL · 1 year ago
I have 800 million lines in a text file, and each line is about 100 characters. Using the same method you have (StreamReader) with async to read the file line by line and compare each one to a text I have in a string, something like (SearchText == TextFromFile), it takes 1 second to read every 50,000 lines, so 800 million lines at this speed will take me (4 hours) to finish. Any other way to speed it up? Because 4 hours is too long... I'm just trying to find out whether the text I have matches one in the text file, and it's a 35 GB file with 800 million lines.
@hqcart1 · 1 year ago
Maybe you are limited by I/O speed?
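If I/O plus per-line overhead really is the limit, one common mitigation is a much larger read buffer for a purely sequential scan, exiting on the first hit. A minimal sketch, assuming exact-match semantics and a hypothetical dataPath:

    using System.IO;
    using System.Text;

    static bool ContainsLine(string dataPath, string searchText)
    {
        // A 1 MB buffer instead of the small default cuts down on read calls
        using var reader = new StreamReader(dataPath, Encoding.UTF8,
            detectEncodingFromByteOrderMarks: true, bufferSize: 1 << 20);

        string? line;
        while ((line = reader.ReadLine()) != null)
        {
            if (line == searchText)
                return true; // stop at the first match instead of scanning the rest
        }
        return false;
    }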
@longuinni · 2 years ago
Is bulk insert faster?
@davestorm6718 · 2 years ago
Much faster, though you have to load the DataTable in memory first (as far as I know).
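For what it's worth, loading everything into memory first isn't strictly required: SqlBulkCopy.WriteToServer also accepts an IDataReader, and with EnableStreaming set it streams rows as the reader produces them. A minimal sketch, with the source query and target table as hypothetical names:

    using Microsoft.Data.SqlClient;

    using var source = new SqlConnection(sourceConnectionString);
    source.Open();
    using var command = new SqlCommand("SELECT * FROM dbo.PeopleSource", source);
    using var reader = command.ExecuteReader();

    using var destination = new SqlConnection(destinationConnectionString);
    destination.Open();
    using var bulkCopy = new SqlBulkCopy(destination)
    {
        DestinationTableName = "dbo.People",
        EnableStreaming = true // rows flow from the reader without buffering them all
    };
    bulkCopy.WriteToServer(reader);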
@HajvazS · 2 years ago
Hi, thank you for your videos :) They are great. Is it possible to contact you? I would like to offer you something, but I can't find any contact details on your website. Best regards