No jobs, no saving files somewhere. The best explanation. Love it. Liked and subscribed. ❤
@vartisan · 2 months ago
Great video! One thing to consider, though, is how this approach handles untrusted CSV files. Using methods like LOAD DATA INFILE is incredibly efficient but might not be safe when importing data from untrusted sources. Since this method doesn't allow for row-by-row validation or sanitization, it could introduce security risks like SQL injection or data corruption.
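For readers who want the row-by-row validation mentioned here, this is a minimal sketch using Laravel's Validator; the column names and the $rows variable are hypothetical, and it deliberately trades speed for safety:

```php
use Illuminate\Support\Facades\Validator;

$clean = [];

foreach ($rows as $row) { // $rows: parsed CSV rows (hypothetical)
    $validator = Validator::make($row, [
        'email' => ['required', 'email'],             // hypothetical columns
        'name'  => ['required', 'string', 'max:255'],
    ]);

    if ($validator->passes()) {
        $clean[] = $row; // only vetted rows are buffered for insert
    }
}
```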
@biryuk88 · 10 months ago
Great speaker and a useful video! Thanks 👍
@DirkZz · 9 months ago
Truly went all the way with the bulk insert scenarios, good vid!
@swancompany_inc · 10 months ago
This is a great larabit. Thanks for the video Jeremy!
@MT87840 · 10 months ago
What a great topic!
@JoseFranciscoIT · 10 months ago
Thanks for the video. One note: disabling foreign key checks in a production app can lead to slowdowns or, in the worst case, inserting an id that does not exist in the parent table, so for your safety I would skip disabling foreign key checks.
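For context, this is the toggle being discussed; a minimal sketch using Laravel's schema helper on MySQL, with try/finally so the checks are restored even if the insert throws ($rows is a hypothetical prepared batch):

```php
use Illuminate\Support\Facades\DB;
use Illuminate\Support\Facades\Schema;

Schema::disableForeignKeyConstraints(); // SET FOREIGN_KEY_CHECKS=0 on MySQL

try {
    DB::table('orders')->insert($rows); // $rows: prepared batch (hypothetical)
} finally {
    // Always restore the checks, even when the insert fails.
    Schema::enableForeignKeyConstraints();
}
```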
@shaikhanuman8012 · 10 months ago
Great content, thanks for sharing valuable information with Laravel developers. Thank you very much.
@OliverKurmis · 9 months ago
Using DB:: instead of the Model class should speed up the process quite a bit
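To illustrate the difference (the model and columns are hypothetical): the query builder skips Eloquent hydration, model events, and timestamp handling, which is where the time goes:

```php
use App\Models\User; // hypothetical Eloquent model
use Illuminate\Support\Facades\DB;

// Eloquent: hydrates a model and fires events for every row.
User::create(['name' => $name, 'email' => $email]);

// Query builder: a plain INSERT with no model overhead.
DB::table('users')->insert(['name' => $name, 'email' => $email]);
```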
@vic_casanas · 10 months ago
Wooooow, great video, super useful. Please, more like this 🤩
@hkhdev95 · 5 months ago
Amazing tut, thanks
@arthmelikyan · 10 months ago
I think using queues with chunking is also useful in this case, e.g., chunking the file and storing 10k records in the database during each queue iteration.
@blank001 · 9 months ago
If possible, I would recommend not using this method, because when queuing you essentially write to the DB twice for each job: once for the queued job and a second time for the actual records. If you are using Redis, then it's an acceptable case.
@arthmelikyan · 9 months ago
Sure, I mean the Redis queue driver, not the database one.
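A minimal sketch of the queues-with-chunking idea from this thread; the job class, table name, and batch size are all hypothetical:

```php
use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;
use Illuminate\Support\Facades\DB;

// Hypothetical job: inserts one pre-chunked batch of rows.
class InsertCsvChunk implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    public function __construct(private array $rows) {}

    public function handle(): void
    {
        DB::table('imports')->insert($this->rows);
    }
}

// While streaming the file, dispatch ~10k rows per job.
foreach (array_chunk($allRows, 10_000) as $chunk) {
    InsertCsvChunk::dispatch($chunk);
}
```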
@franciscojunior2141 · 10 months ago
That's amazing, thanks for the video. Excellent, well done 👍🏽
@АртурЗарипов-б2й · 10 months ago
Good job! Thank you very much!
@wadecodez · 10 months ago
Did not know about LOAD DATA INFILE, thanks!
@yasser.elgammal · 10 months ago
Great Topic, Thank you
@keyvanakbarzadeh · 1 month ago
fantastic
@SemenP-i4x · 8 months ago
Extremely cool, yeah
@BruceEmmanuelSueira · 10 months ago
That's amazing! Thank you!
@ParsclickTV · 9 months ago
very useful video
@mouhamaddiop1144 · 10 months ago
Just amazing
@grugbrain · 10 months ago
Amazing. Please share some ideas about database design for a scalable Laravel app.
@gabrielborges1185 · 9 months ago
Fantastic.
@edventuretech · 10 months ago
Thank you for sharing such a valuable and informative video. This is my first time reaching your channel. I'm going to follow you and share this video. I have a question: is it suitable to use a transaction and commit when bulk-inserting tons of records like this?
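On the transaction question, the usual Laravel pattern looks like this; a minimal sketch where $chunks is a hypothetical array of row batches. Everything commits together or rolls back together:

```php
use Illuminate\Support\Facades\DB;

DB::transaction(function () use ($chunks) {
    foreach ($chunks as $rows) {
        DB::table('imports')->insert($rows); // table name is hypothetical
    }
});
```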
@shahzadwaris7193 · 10 months ago
Hi, this is a great way of inserting into a single table where you don't need to perform any extra logic. But what if we want to perform some logic and store the data into different tables? Can you please cover that topic as well? I know it can be handled with queues, but I wasn't able to implement it efficiently; instead I overloaded the database with many mini jobs.
@docetapedro5007 · 9 months ago
Can I use bulk inserts to fetch data from an API and insert it into my database?
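In principle yes; the same chunked-insert pattern works for API data. A minimal sketch with a hypothetical endpoint and table, assuming the response keys already match the column names:

```php
use Illuminate\Support\Facades\DB;
use Illuminate\Support\Facades\Http;

// Hypothetical endpoint returning a JSON array of records.
$records = Http::get('https://api.example.com/users')->json();

// Insert in bounded batches rather than one giant statement.
foreach (array_chunk($records, 1_000) as $batch) {
    DB::table('users')->insert($batch);
}
```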
@FahadTariq-x3q · 9 months ago
Which platform do you use where you just give it the data and say "do it"?
@muhamadfikri7263 · 10 months ago
Great 🔥 What's the name of the VS Code theme?
@rafamorawiec · 4 months ago
Great video, but the last option is good only for simple imports. If you need validation or other Laravel/PHP work before/during the insert, then we are doomed :D
@GergelyCsermely · 10 months ago
thanks
@underflowexception · 10 months ago
What about exporting INSERT statements to an SQL file and importing that file with the MySQL client? Would that be more memory-efficient in some cases?
@arthmelikyan · 10 months ago
I don't think so; in that case you add one more unnecessary step, writing/creating a new file, and it is not memory-efficient because you take the same data and put it into the SQL file... Also, an insert query is limited: you can't insert 1M rows at once by default. Instead, you can immediately insert the prepared data into the database and free the memory. Using queues can also help: you can send the user a notification saying "your data insert is in progress" and then notify them when the process is finished, so the user does not wait 20/60/80+ seconds for a response from the server.
@michalbany5162 · 10 months ago
very good
@bryanhalstead · 9 months ago
I won't fight you. Thanks 🙂
@IndraKurniawan · 10 months ago
Chunking is way more accessible than threading. Still better than nothing at all.
@wadday · 10 months ago
What if we collect the information into an array named $data during the loop and then execute a single database insert query, similar to using Model::query()->insert($data);?
@tass2001 · 10 months ago
You can end up backing yourself into the corner of exceeding the DBMS's max_allowed_packet size, or, depending on the load the DBMS is enduring, you could bring the application to a halt because of row locks. I would batch it into sensible chunks: 100-1000 records at a time, depending on your dataset.
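The batching suggested here is a small wrapper around the insert; 500 is a hypothetical middle ground within the 100-1000 range mentioned, and the table name is made up:

```php
use Illuminate\Support\Facades\DB;

// One bounded INSERT per batch keeps packets small and locks short.
foreach (array_chunk($data, 500) as $batch) {
    DB::table('imports')->insert($batch); // table name is hypothetical
}
```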
@Fever1984 · 9 months ago
League/csv has been using generators for a long time now. I don't get why you would use Laravel and then not use a package like league/csv.
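For reference, this is the generator-backed reading the comment refers to; a minimal sketch with league/csv 9.x (the file path is hypothetical):

```php
use League\Csv\Reader;

// getRecords() returns a lazy iterator, so the whole file
// never has to sit in memory at once.
$reader = Reader::createFromPath('/path/to/huge.csv', 'r');
$reader->setHeaderOffset(0); // first row supplies the keys

foreach ($reader->getRecords() as $record) {
    // $record is an associative array keyed by the header row.
}
```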
@bidhanbaniya7605 · 10 months ago
You could have used a job here.
@mohammadashrafuddinferdous9347 · 10 months ago
Note to self before watching the full video: a PHP generator will be the way to read such big files line by line. Let's see if I'm right or wrong.
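That guess maps to a pattern like this; a minimal sketch of a line-by-line CSV generator (the path is hypothetical):

```php
// Yields one line at a time; memory stays flat regardless of file size.
function readLines(string $path): \Generator
{
    $handle = fopen($path, 'r');

    try {
        while (($line = fgets($handle)) !== false) {
            yield $line;
        }
    } finally {
        fclose($handle); // runs even if iteration stops early
    }
}

foreach (readLines('/path/to/huge.csv') as $line) {
    $row = str_getcsv($line);
    // ...validate and buffer $row for a chunked insert
}
```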