How to Bulk Insert Data With Laravel

11,791 views

Laracasts

1 day ago

As programmers, we often need to bulk insert a massive number of records into a database. Some languages and platforms offer built-in tools that make bulk inserting trivial. PHP (and Laravel), however, doesn't ship with such tools out of the box. But we can still bulk insert data efficiently! Let me show you how in this Larabit.
Watch Full Larabits Series on: laracasts.com/...
Watch thousands of videos, track your progress, and participate in a massive Laravel community at Laracasts.com.
Laracasts: laracasts.com
Laracasts Twitter: twitter.com/laracasts
Jeffrey Way Twitter: twitter.com/jeffrey_way
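
For a rough picture of the kind of approach the video is about (the `users` table, its columns, and the CSV path below are illustrative assumptions, not details taken from the video), streaming the file lazily and inserting in batches keeps memory usage flat:

```php
<?php

use Illuminate\Support\Facades\DB;
use Illuminate\Support\LazyCollection;

// Stream the CSV instead of loading it all into memory,
// then insert in batches so each query stays a reasonable size.
LazyCollection::make(function () {
    $handle = fopen(storage_path('app/users.csv'), 'r');

    while (($row = fgetcsv($handle)) !== false) {
        yield $row;
    }

    fclose($handle);
})
    ->skip(1) // header row
    ->map(fn (array $row) => [
        'name'  => $row[0],
        'email' => $row[1],
    ])
    ->chunk(1000)
    ->each(fn ($chunk) => DB::table('users')->insert($chunk->all()));
```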

Comments: 45
@bankai7654 3 months ago
No jobs, no saving files somewhere. The best explanation. Love it. Liked, subscribed. ❤
@vartisan 3 months ago
Great video! One thing to consider, though, is how this approach handles untrusted CSV files. Using methods like LOAD DATA INFILE is incredibly efficient but might not be safe when importing data from untrusted sources. Since this method doesn't allow for row-by-row validation or sanitization, it could introduce security risks like SQL injection or data corruption.
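
For reference, a LOAD DATA import from Laravel looks roughly like this. It assumes a MySQL connection with `local_infile` enabled (and `PDO::MYSQL_ATTR_LOCAL_INFILE => true` in the connection options), and as the comment above notes, nothing is validated per row, so only use it with files you control:

```php
<?php

use Illuminate\Support\Facades\DB;

$path = storage_path('app/users.csv'); // example path, trusted file only

// LOAD DATA cannot run through the prepared-statement protocol,
// so execute it directly on the underlying PDO connection.
DB::connection()->getPdo()->exec("
    LOAD DATA LOCAL INFILE '{$path}'
    INTO TABLE users
    FIELDS TERMINATED BY ','
    OPTIONALLY ENCLOSED BY '\"'
    LINES TERMINATED BY '\\n'
    IGNORE 1 LINES
    (name, email)
");
```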
@biryuk88 11 months ago
Great speaker and a useful video! Thanks 👍
@DirkZz 11 months ago
Truly went all the way with the bulk insert scenarios, good vid!
@OliverKurmis 11 months ago
Using DB:: instead of the Model class should speed up the process quite a bit
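
For anyone wondering what that difference looks like in practice (the table name and `$rows` below are placeholders), it is roughly the gap between one query per row through Eloquent and a single multi-row insert through the query builder:

```php
<?php

use App\Models\User;
use Illuminate\Support\Facades\DB;

// Eloquent: one INSERT per row, plus model hydration, events,
// mutators, and timestamp handling.
foreach ($rows as $row) {
    User::create($row);
}

// Query builder: one multi-row INSERT with none of the Eloquent
// overhead (and none of its conveniences either).
DB::table('users')->insert($rows);
```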
@JoseFranciscoIT 11 months ago
Thanks for the video. One note: disabling foreign key checks in a production app can lead to slowdowns or, in the worst case, an error from inserting an id that does not exist in the parent table, so to be safe I would skip disabling foreign key checks.
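
For completeness, this is the kind of toggle being cautioned against; if it is used at all, re-enabling the checks in a `finally` block at least guarantees they come back on. A sketch (example table and data), not a recommendation:

```php
<?php

use Illuminate\Support\Facades\DB;
use Illuminate\Support\Facades\Schema;

Schema::disableForeignKeyConstraints();

try {
    DB::table('orders')->insert($rows);
} finally {
    // Restore the checks even if the insert throws.
    Schema::enableForeignKeyConstraints();
}
```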
@swancompany_inc 11 months ago
This is a great Larabit. Thanks for the video, Jeremy!
@vic_casanas 11 months ago
Wooooow great video, super useful, please more like this 🤩
@shaikhanuman8012 11 months ago
Great content, thanks for sharing valuable information with Laravel developers. Thank you very much.
@franciscojunior2141 11 months ago
That's amazing, thanks for the video. Excellent, well done 👍🏽
@edventuretech 11 months ago
Thank you for sharing such a valuable and informative video. This is my first time reaching your channel. I am going to follow you and share this video. I have a question: is it suitable to use transactions and commits when bulk inserting tons of records like this?
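
Not something the video is quoted on here, but a common pattern is to commit once per batch rather than once per row. A minimal sketch, assuming an example table and a 1,000-row batch size:

```php
<?php

use Illuminate\Support\Facades\DB;

// One transaction per batch: a failure rolls back only that batch,
// and the database commits far less often than with row-by-row inserts.
foreach (array_chunk($rows, 1000) as $batch) {
    DB::transaction(fn () => DB::table('users')->insert($batch));
}
```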
@hkhdev95 6 months ago
Amazing tutorial, thanks
@MT87840 11 months ago
What a great topic!
@arthmelikyan 11 months ago
I think using queues with chunking is also useful in this case, e.g. chunking the file and storing 10k records in the database during each queue iteration.
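
A minimal sketch of that idea, assuming a made-up `InsertUserChunk` job class, an example table, and a 10k chunk size; note that the rows travel in the job payload, so the queue driver matters (see the replies below):

```php
<?php

namespace App\Jobs;

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;
use Illuminate\Support\Facades\DB;

class InsertUserChunk implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    public function __construct(public array $rows)
    {
    }

    public function handle(): void
    {
        // Each job inserts the chunk of rows it was dispatched with.
        DB::table('users')->insert($this->rows);
    }
}
```

Dispatching would then be `InsertUserChunk::dispatch($chunk)` for each `array_chunk($rows, 10000)` slice while reading the file.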
@blank001 11 months ago
If possible I would recommend not using this method, because when queuing you are essentially writing to the DB twice for each job: once for the queue entry and a second time for the actual records. If you are using Redis then that's an acceptable case.
@arthmelikyan 11 months ago
Sure, I'm talking about the Redis driver, not the DB driver.
@АртурЗарипов-б2й 11 months ago
Good job! Thank you very much!
@grugbrain 11 months ago
Amazing. Please share some ideas about database design for a scalable Laravel app.
@wadecodez 11 months ago
Did not know about INFILE, thanks!
@shahzadwaris7193 11 months ago
Hi, this is a great way of inserting for a single table where we don't need to perform any extra logic, but what if we want to perform some logic and store the data in different tables? Can you please cover that topic as well? I know it can be handled with queues, but I wasn't able to implement it efficiently; instead I overloaded the database with many mini jobs.
@muhamadfikri7263 11 months ago
Great 🔥 What's the name of the VS Code theme?
@docetapedro5007 11 months ago
Can I use bulk insert to fetch data from an API and insert it into my database?
@FahadTariq-x3q 10 months ago
Which platform do you use where you give it that data and just say "do it"?
@yasser.elgammal 11 months ago
Great topic, thank you
@ParsclickTV 11 months ago
Very useful video
@rathadev 14 days ago
Wow, that's so amazing
@BruceEmmanuelSueira 11 months ago
That's amazing! Thank you
@rafamorawiec 6 months ago
Great video, but the last option is only good for a simple import. If you need some checking or other Laravel/PHP work before/during the insert, then we are doomed :D
@underflowexception 11 months ago
What about exporting INSERT statements to a SQL file and using MySQL dump to import the file? Would that be more memory efficient in some cases?
@arthmelikyan 11 months ago
I don't think so. In that case you add one more useless step, which is writing/creating a new file, and it's not memory efficient because you take the same data and put it into the SQL file... Also, an insert query is limited; you can't insert 1M rows at once by default. Instead you can immediately insert the prepared data into the database and free the memory. Using queues can also help: you can send the user a notification saying "your data insert is in progress", and then notify them when the process is finished, so the user doesn't wait 20/60/80+ seconds for a response from the server.
@keyvanakbarzadeh 2 months ago
Fantastic
@SemenP-i4x 9 months ago
Extremely cool, yeah
@IndraKurniawan 11 months ago
Chunking is way more available than threading. Still better than none at all.
@mouhamaddiop1144 11 months ago
Just amazing
@gabrielborges1185 11 months ago
Fantastic.
@GergelyCsermely 11 months ago
Thanks
@michalbany5162 11 months ago
Very good
@bryanhalstead 11 months ago
I won't fight you. Thanks 🙂
@Fever1984 10 months ago
league/csv has been using generators for a long time now. I don't get why you would use Laravel and then not use a package like league/csv.
@wadday 11 months ago
What if we collect the information into an array named $data during the loop and then execute a single database insert query, similar to using Model::query()->insert($data);?
@tass2001 11 months ago
You can end up backing yourself into the corner of exceeding the max_allowed_packet size for the DBMS, or, depending on the load the DBMS is under, you could bring the application to a halt because of row locks. I would batch it into sensible chunks of 100-1000 records at a time, depending on your dataset.
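
In code, that batching advice amounts to something like this (500 is an arbitrary middle-ground batch size, and the table is an example):

```php
<?php

use Illuminate\Support\Facades\DB;

// Bounded batches keep each statement well under max_allowed_packet
// and avoid holding row locks for one enormous insert.
foreach (array_chunk($data, 500) as $batch) {
    DB::table('users')->insert($batch);
}
```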
@bidhanbaniya7605 11 months ago
You could have used a job here.
@mohammadashrafuddinferdous9347 11 months ago
Note to self before watching the full video: a PHP generator will be the option for reading such big files line by line. Let's see if I'm right or wrong.
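
For the record, the generator idea mentioned here looks roughly like this (the CSV path is an example):

```php
<?php

// Yield one CSV row at a time so memory stays flat regardless of
// how large the file is.
function readCsvRows(string $path): Generator
{
    $handle = fopen($path, 'r');

    try {
        while (($row = fgetcsv($handle)) !== false) {
            yield $row;
        }
    } finally {
        fclose($handle);
    }
}

foreach (readCsvRows(storage_path('app/users.csv')) as $row) {
    // Buffer $row into a batch and insert when the batch is full...
}
```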