All your videos are really very good. I like the way you explain the concepts so clearly in such a short duration.
@chandravaranasi9996 3 years ago
Thank you so much Siva for your detailed explanation. I understand that splitting the file and processing it is one way of doing it; I am just trying to understand the Mule 4 streaming option using deferred=true and @StreamCapable() in the Transform Message without splitting the file. Will it be faster, or does it have any disadvantages?
@hyderabadpropertyconsultant 5 years ago
Sir, I have one task on Salesforce bulk upsert (PK chunking). We are getting some limit issues. Could you please explain it if you are aware?
@AZeeee 4 years ago
Hi Siva. Couldn't we just read the file stream directly into batch, do the DWL conversion in the batch step only on the retrieved records, and just do a bulk insert? This way we don't read the whole file into memory and don't have to split.
@amateur_psychologist 3 years ago
Excellent video, sir. I have a question though: how is streaming different from specifying the block size in the batch scope in Mule? Isn't it the same from a memory perspective?
@velvetmovies1583 4 years ago
Hi Siva, rather than using a try scope in the batch step, can we use an aggregator and bulk insert to improve performance?
@radheshampandit1021 1 year ago
Hi @Siva, here I can see we are reading the file from a local file location. Can you please suggest how to handle the 1M file after downloading it from Azure Blob, and how to process it?
@shaikhmuneer5410 5 years ago
Thanks for sharing the video. It was a really good video for understanding how to process bulk records. 🙂
@suryaandey3611 4 years ago
Thanks Siva. Can we use the File connector in CloudHub? I think not, right?
@snehak9176 4 years ago
Very Informative Video! Thank you
@sivathankamanee-channel 4 years ago
Thank you.
@anilkumar1488 4 years ago
Nice video, Siva. What if we want to read a big file with the Mule File connector and pass it to a Java static method? How can we achieve this?
@akop3545 4 years ago
Thanks for sharing your knowledge. I did a similar implementation using memory-mapped files in .NET to both read and write huge fixed-length flat files.
@ramakrishnaravirala8669 4 years ago
Hi Siva, is there a Git repository where we can get the code for all these tutorials? That would help a lot.
@jasperkopalli8021 3 years ago
Hi Siva, how do we write a million-plus records coming from a DB into a single CSV? Can you help?
@naveenbabu6258 4 years ago
Thank you Siva, it's a very useful use case and nice learning. I learnt from this video that there is a way to process bulk/GB-sized files. Thanks for sharing this video; expecting more use-case videos for future learning.
@vveena3805 4 years ago
Could you please show how to write a single CSV from 10 CSV files?
@chinabrahmampolisetty6120 5 years ago
Siva, your explanation is excellent. I follow your every video. If possible, can you share the big CSV file?
@anilmekala05 5 years ago
Hi Siva, could you please upload these samples to GitHub? One small doubt: can we write the split files back into a single file?
@sivathankamanee-channel 5 years ago
Sure, Anil. Will do. Of course, the same technique applies to combining multiple files into one file. Use the File.List method to fetch all the files in a folder and iterate over them to perform the append operation.
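As an illustration of that combine step, here is a minimal Java sketch. It assumes the split files sit in a local folder; the paths and file pattern are hypothetical, and keeping the header only from the first file is an assumption, not something prescribed in the video:

import java.io.BufferedReader;
import java.io.BufferedWriter;
import java.io.IOException;
import java.nio.file.DirectoryStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

public class CsvCombiner {
    public static void main(String[] args) throws IOException {
        Path inDir = Paths.get("split-files");        // hypothetical folder holding the split CSVs
        Path target = Paths.get("combined.csv");      // hypothetical output file

        try (BufferedWriter writer = Files.newBufferedWriter(target);
             DirectoryStream<Path> parts = Files.newDirectoryStream(inDir, "*.csv")) {
            // Note: directory listing order is not guaranteed; sort the paths first if sequence matters.
            boolean first = true;
            for (Path part : parts) {                 // fetch the files and append them one by one
                try (BufferedReader reader = Files.newBufferedReader(part)) {
                    String line = reader.readLine();  // first line is assumed to be a header
                    if (first && line != null) {      // keep the header only from the first file
                        writer.write(line);
                        writer.newLine();
                        first = false;
                    }
                    while ((line = reader.readLine()) != null) {
                        writer.write(line);
                        writer.newLine();
                    }
                }
            }
        }
    }
}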
@anilmekala05 5 years ago
Hi Siva garu, I have been working as an IIB developer and designer. Your inputs helped me a lot with one of my similar requirements.
@sivathankamanee-channel 5 years ago
Thank you for your feedback, and glad it worked out well. :)
@sahilmatele8481 2 years ago
Hi Siva Sir, in my project I am getting millions of records in JSON format from a database. After receiving them, I need to do some mappings, convert the JSON to Java, and send it to Salesforce. Can you suggest a better approach for this? It would be very helpful.
@vitheshiyer5507 2 years ago
Hello Siva, the video is quite helpful and I have a query: once the big file is split into multiple small files and pushed into another folder, the "On New or Updated File" listener is going to run for each of those small files, again loading all the data into memory at (or almost at) the same time, which could lead to a memory issue. But that is the very problem the files were split to solve, so I am just trying to understand how this helps with the memory issue. Not sure if I'm missing something. Thanks.
@sureshkumar-fw7jx 3 years ago
Hi Siva, nice video sir, very informative. I have a question: only the 1st CSV will contain the field names; what if we need the field names in all the split CSVs?
@murthujaanghati8610 2 years ago
Hi Sir, when I try to run a batch and pass the value as the payload directly, I also use some variable values alongside the payload for token-related stuff. But I am getting a warning that the batch job is being created with a variable which is a streaming value, and streaming values are potentially large... Please suggest how to clear this warning. I am using almost 10 variables when the batch is running.
@joecook1375 3 years ago
Hi Siva, another great video. I was trying to read a file from an S3 bucket, split it into smaller files, and then write them to another S3 bucket or run a batch process after that. Would you be kind enough to add another video or demo for that scenario?
@JoseCardozo17 4 years ago
Hi Siva. Excellent video. Thanks. What approach would you take if the files could come in both Excel and CSV formats?
@sivathankamanee-channel 4 years ago
Hi Jose - You can introduce multiple channels or multiple listeners, but have the same common process flow to process the records.
@nikhilmahajan13 4 years ago
Thank you Siva sir, it's very useful. Appreciate your efforts.
@veerababu8275 5 years ago
Nice, Siva. Very good explanation. Please share a video on how to deploy the APIs to a server in a real-time scenario. It may be very helpful to many people like me. Thank you.
@dileepd7096 3 years ago
Thanks for the video. How can I do the same split on the SFTP server in order to process a 500 GB file?
@alonsoyt1940 4 years ago
Hi, good day. Do you think you can provide me with the code of the flow that you created in this video?
@baidhyanathnayak192 5 years ago
Hi Siva, another nice video (as usual). Thanks for sharing your knowledge.
@vishalshah8626 4 years ago
Superb! Simply great. Just a question: can't Mule provide this as a processor instead of asking us to write Java code? Looking at the Java code, it is so simple that it could easily be encapsulated in the form of a processor.
@vijaydhanakodi5591 5 years ago
If possible, try making videos on TX management in Mule.
@sivathankamanee-channel 5 years ago
Hi Vijay - Nowadays, TX management is not preferred in a non-transactional REST architecture. It is typically done using a single stored-procedure call, with the transaction handled inside the SP. If you still have a specific use case, let me know and I will analyse it. Thanks again.
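For illustration, a minimal JDBC sketch of that pattern, where the application makes a single stored-procedure call and the transaction is owned by the SP. The connection URL, credentials, and the process_order procedure are all hypothetical:

import java.sql.CallableStatement;
import java.sql.Connection;
import java.sql.DriverManager;

public class SpCallExample {
    public static void main(String[] args) throws Exception {
        // Hypothetical connection details; in Mule this would be a Database connector config.
        try (Connection con = DriverManager.getConnection(
                 "jdbc:mysql://localhost:3306/app", "user", "pass");
             CallableStatement cs = con.prepareCall("{call process_order(?, ?)}")) {
            cs.setLong(1, 1001L);          // hypothetical order id
            cs.setString(2, "CONFIRMED");  // hypothetical status
            cs.execute();                  // the SP begins/commits its own transaction internally
        }
    }
}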
@mayuria2447 3 years ago
Sir, if we have a database as the source with a million records, what is the best approach?
@snjgpt011 5 years ago
Thanks for sharing the knowledge. It is a very good video. I have a specific requirement where, instead of CSV, I am getting a large JSON file with a list nested in an object: "outer":{"inner":[{firstindex},{secondindex}...]}. This list is very big (a 5 GB file), and I need to convert it into CSV. Will the approach of splitting into smaller files work using Java, or do I need to ask the target system to send CSV instead of JSON?
@sivathankamanee-channel 5 years ago
Hi - Thanks for your feedback. For this requirement, you can use a JSON parser that streams the incoming file; it does the same thing, splitting the JSON array into individual records, either as CSV or as JSON itself.
@sivathankamanee-channel 5 years ago
Please try this link where different options are discussed: sites.google.com/site/gson/streaming
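As an illustration of that streaming idea, here is a minimal sketch using Gson's JsonReader (one of the options discussed at that link). It assumes the structure from the question above, with "outer" wrapping an "inner" array of flat records whose field values are primitives; the file names and the flat-record assumption are hypothetical:

import com.google.gson.stream.JsonReader;
import java.io.FileReader;
import java.io.FileWriter;
import java.io.IOException;

public class JsonToCsvStreamer {
    public static void main(String[] args) throws IOException {
        try (JsonReader reader = new JsonReader(new FileReader("big.json"));
             FileWriter out = new FileWriter("records.csv")) {
            reader.beginObject();                          // enter {"outer": ...}
            while (reader.hasNext()) {
                if (!"outer".equals(reader.nextName())) { reader.skipValue(); continue; }
                reader.beginObject();                      // enter {"inner": [...]}
                while (reader.hasNext()) {
                    if (!"inner".equals(reader.nextName())) { reader.skipValue(); continue; }
                    reader.beginArray();                   // stream the array element by element
                    while (reader.hasNext()) {
                        reader.beginObject();              // only one record is in memory at a time
                        StringBuilder row = new StringBuilder();
                        while (reader.hasNext()) {
                            reader.nextName();             // assumes flat records; nextString coerces primitives
                            if (row.length() > 0) row.append(',');
                            row.append(reader.nextString());
                        }
                        reader.endObject();
                        out.write(row.append('\n').toString());
                    }
                    reader.endArray();
                }
                reader.endObject();
            }
            reader.endObject();
        }
    }
}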
@gurubojja7953 4 years ago
Thanks for sharing the video. Can you please share videos on Salesforce integrations?
@baidhyanathnayak192 5 years ago
Hi Siva, can you please include some videos on watermarks and batch? Thanks.
@chughcs1 2 years ago
Instead of writing a Java file for streaming, can we not use the streaming in Mule's repeatable file streams?
@pushthemule 2 years ago
Maybe back in 2019 this would have been useful. Mule 4 is much more developed now; I would not use custom Java code to complete this solution.
@thedeveloper2513 5 years ago
Hi Siva, how many years of experience do you have in MuleSoft?
@sivathankamanee-channel 5 years ago
Hi - I have around 15+ years of experience working on Java and integration projects. :) Do you like all the topics? Please let me know.
@thedeveloper2513 5 years ago
Yeah, I can tell you have years of experience from the way you explain. Thank you for dedicating your time and skills to creating all the videos.
@surajrohankar4663 5 years ago
Hello Sir, is it possible to create multiple files without Java code, that is, using DW?
@sivathankamanee-channel 5 years ago
In my view, it is not recommended to use DW here, since DW is designed to retrieve the complete content into memory before processing. It is risky; the server would crash for sure.
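To illustrate the Java streaming alternative, here is a minimal sketch that reads the big CSV line by line, so only the current buffered line is held in memory. The paths, the chunk size, and copying the header into every part file are illustrative assumptions, not the video's exact code (which is linked in the description):

import java.io.BufferedReader;
import java.io.BufferedWriter;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

public class CsvSplitter {
    public static void main(String[] args) throws IOException {
        Path source = Paths.get("input/big-file.csv");   // hypothetical source path
        Path outDir = Paths.get("output");               // hypothetical output folder
        int linesPerFile = 100_000;                      // illustrative chunk size

        try (BufferedReader reader = Files.newBufferedReader(source)) {
            String header = reader.readLine();           // keep the header for every part file
            String line;
            int count = 0, part = 0;
            BufferedWriter writer = null;
            while ((line = reader.readLine()) != null) {
                if (count % linesPerFile == 0) {          // roll over to a new part file
                    if (writer != null) writer.close();
                    writer = Files.newBufferedWriter(outDir.resolve("part-" + (++part) + ".csv"));
                    writer.write(header);
                    writer.newLine();
                }
                writer.write(line);
                writer.newLine();
                count++;
            }
            if (writer != null) writer.close();
        }
    }
}

Copying the header into each part file also addresses the question further down about column names being lost when the file is split.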
@Rohit3424 5 years ago
Thanks Siva, well explained Sir Ji. Please share the Java code path, or if possible kindly share the above program's source code.
@saitejapendli7036 10 months ago
As usual, a very nice explanation of how to process millions of records in a batch. If possible, could you please share the big CSV file?
@krishnamurthy1945 5 years ago
Thank you for sharing your knowledge.
@faizanarif4295 2 years ago
Thanks a lot Siva, it's really helpful :)
@shailajanayak519 5 years ago
Sir, can you make a video on delimited flat files?
@suvarnadhawad1538 4 years ago
Can you guide us on how to download a file from a remote server to our local system?
@ganeshsrinivas7850 4 years ago
Hi Siva garu, I was impressed with your explanation, thank you. I have a small doubt about this video: after splitting into small files, we are keeping them in a static path, but after deploying to CloudHub, where can I place the split files? Could you please provide the solution?
@sivathankamanee-channel 4 years ago
Hi Ganesh. You need to use a shared folder or FTP.
@ravichandra1498 5 years ago
Awesome 👍
@amarreddy5786 4 years ago
How could we eliminate duplicates in millions of records?
@saitejamadha5079 4 years ago
The column names are being lost when the CSV file is split.
@sivathankamanee-channel 4 years ago
Hi Sai - Yes. You need to add the header in your Java code or adjust the DW accordingly.
@hellothere848 4 years ago
Doesn't Mule 4 support streaming out of the box?
@sivathankamanee-channel 4 years ago
Hello there - I haven't tried that solution yet. As far as I know, the DWL works only after loading the entire CSV. Please let me know if there is a better alternative. Thank you.
@vnpandey10 4 years ago
@sivathankamanee-channel It does support it.
@vp17in 4 years ago
@sivathankamanee-channel Hope you are well, and thanks for the great videos. A better alternative would be to use streaming in the File connector (Mule 3 has streams and Mule 4 has introduced repeatable streams) and coerce the payload to an Iterator (the MuleSoft-recommended approach; link given above by @Vikas Gupta). This ensures chunks of data are streamed to batch processing and the entire payload is NOT loaded into memory (avoiding an OutOfMemory exception). Inside the batch steps, we can use DataWeave to transform the payload to CSV, write to a database, send an email, and perform any other operation as required. This approach will not put pressure on memory and can handle very large files, such as 1 GB.
@sivathankamanee-channel 4 years ago
Hi @vp17in - Thanks. However, my intention is to provide a design for any integration platform, so that integration experts get some general guidelines on how to approach large payloads. This can be used in Java/J2EE, WSO2, Oracle SOA Suite, etc. Thanks for your valuable feedback. :)
@sudheerraja3059 3 years ago
Sir, you always make great videos, but sharing the codebase would help others too.
@Kannaswamydas 1 month ago
Why are 10k records failing to upsert through batch processing, and why is Mule restarting? The file size was more than 7 MB.
@edisonvargasmayorga8579 4 years ago
Thank you, a well-made video!
@YousufKhan-th4qd 4 years ago
Sir, can you please provide the complete code?
@ravichandra1498 5 years ago
Please share the Java code.
@sivathankamanee-channel 5 years ago
The link is added in the description, Ravi. :)
@ashwathnarayanm 4 years ago
Thank you so much😊
@protectglobe444 5 years ago
Useful!
@sudheerj2949 5 years ago
Thank you very much @Siva Thankamanee, this is really awesome. Also, I would very much appreciate it if you could make a tutorial on how to use the AWS SQS connector and different strategies for processing millions of records.
@sivathankamanee-channel 5 years ago
Sure, Sudheer. Will try with a new AWS account. :) Thanks for your appreciation!
@piyushkandoi3539 5 years ago
Hi, I have tried using the SQS connector and it works for me, but the issue I am facing is this: suppose I have 1000 messages in the queue, I start my receiver, and the receiver flow takes about 30 seconds to process; in this case, I am getting a Stack Overflow error. @Siva sir, it would be great if you could please cover this case.
@sivathankamanee-channel 5 years ago
Hi Piyush - Could you please share the specific SQS listening flow XML piece? I will take a look. Do you have both the start delay and the frequency set to the same value?
@piyushkandoi3539 5 years ago
@sivathankamanee-channel This code fetches messages from the SQS queue in batches of 100 in one shot, and we have put in a sleep of 20 seconds (assuming processing will take that much time). So it should run perfectly, but I am getting a Stack Overflow error in the CloudHub logs. Could you please share your email ID so that I can mail you the code?
@sivathankamanee-channel 5 years ago
Sure. Please send it to sivathankamaneeaws@gmail.com.