Excellent explanation. Keep up the good work, and thanks!!
@Jayanaresh_Kusume 3 years ago
Thanks for the crystal clear explanation 👍
@bnsagar90 4 years ago
Good explanation. I haven't found the batch streaming and sliding window concepts explained like this anywhere else.
@medotop330 3 years ago
And I am
@ramum4684 4 years ago
Excellent job you're doing, Lime Guru. Thanks for the effort. Expecting more on Apache Spark.
@lavanyamukka5597 2 years ago
Good explanation 😍
@deepanshuaggarwal7042 3 months ago
How do we recreate the state and add it to RocksDB when we restart our streaming job from a new checkpoint location?
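Not covered in the video, but here is a minimal Structured Streaming sketch of how state and checkpoints typically relate: the running aggregation below is the query's state, kept under the checkpoint directory by the configured state store provider (RocksDB in this sketch), so pointing the job at a brand-new checkpoint location starts with empty state. The source, host, port, and paths are hypothetical.

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

val spark = SparkSession.builder()
  .appName("StatefulWordCount")
  // RocksDB-backed state store, available since Spark 3.2
  .config("spark.sql.streaming.stateStore.providerClass",
    "org.apache.spark.sql.execution.streaming.state.RocksDBStateStoreProvider")
  .getOrCreate()

// Hypothetical socket source just to drive a stateful aggregation
val words = spark.readStream
  .format("socket")
  .option("host", "localhost")
  .option("port", 9999)
  .load()
  .select(explode(split(col("value"), " ")).as("word"))

// The running counts are the query's state; snapshots live under the checkpoint dir
val counts = words.groupBy("word").count()

counts.writeStream
  .outputMode("complete")
  .format("console")
  .option("checkpointLocation", "/tmp/chk/wordcount") // hypothetical path; state is kept here
  .start()
  .awaitTermination()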
@kuntalchowdhury5336 4 years ago
Good explanation.
@yogeshkawde8926 5 years ago
Job well done. Nicely explained. This video deserves more views and likes. Kudos.👍👍👍
@medotop330 3 years ago
Thanks so much
@debapriyamaity1202 5 years ago
Good explanation. Can I use Spark Streaming to read from a Hive table with millions of records? My use case is: a source Java program uses Spark Streaming to read data from a Hive table and keeps posting data packets to a Kafka connector. I am mainly concerned about memory usage. What is your opinion on this?
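Regarding the Hive-to-Kafka question above, here is a minimal sketch of one common pattern; a Hive table is not a streaming source, so a batch read pushed to the Kafka sink is typical. The table name, key column, broker, and topic are hypothetical, and Spark processes the table partition by partition rather than loading all rows into memory at once.

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("HiveToKafka")
  .enableHiveSupport()
  .getOrCreate()

// Batch read of the Hive table; rows are processed per partition in parallel,
// so the full table is never materialized on a single executor
val events = spark.table("db.events") // hypothetical Hive table

events
  .selectExpr("CAST(id AS STRING) AS key", "to_json(struct(*)) AS value") // hypothetical key column
  .write
  .format("kafka") // requires the spark-sql-kafka connector on the classpath
  .option("kafka.bootstrap.servers", "broker:9092") // hypothetical broker
  .option("topic", "events")                        // hypothetical topic
  .save()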
@sujitunim 5 years ago
Nicely explained
@bealegopro 5 years ago
Good job
@Remesh143 6 years ago
Hello, regarding the streaming source you mentioned (File): is it possible to use a file as a source and stream new lines as they are appended to a CSV file? If yes, which streaming source method do I need to call? Please help.
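For reference, a minimal sketch of Spark's file streaming source in Structured Streaming (spark.readStream with CSV format). Note that the file source picks up new files dropped into a monitored directory; it does not re-read lines appended to an existing file. The schema, columns, and directory path below are hypothetical.

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.types._

val spark = SparkSession.builder()
  .appName("FileStreamSource")
  .getOrCreate()

// The CSV file source requires an explicit schema
val schema = new StructType()
  .add("id", IntegerType)   // hypothetical columns
  .add("value", StringType)

// Monitors the directory and processes each *new* file as it arrives
val lines = spark.readStream
  .schema(schema)
  .option("header", "true")
  .csv("/data/incoming")    // hypothetical directory

lines.writeStream
  .format("console")
  .outputMode("append")
  .option("checkpointLocation", "/tmp/chk/files") // hypothetical checkpoint path
  .start()
  .awaitTermination()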
@akashhudge5735 4 years ago
In a DStream you get only one RDD per batch; see this link: stackoverflow.com/questions/35164634/how-many-rdds-does-dstream-generate-for-a-batch-interval
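To illustrate that point, a minimal DStream sketch (host, port, and batch interval are hypothetical): foreachRDD fires once per batch interval, and the single RDD it receives is the one generated for that batch.

import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

val conf = new SparkConf().setAppName("OneRddPerBatch").setMaster("local[2]")
// Each 10-second batch interval yields exactly one RDD on the DStream
val ssc = new StreamingContext(conf, Seconds(10))

val lines = ssc.socketTextStream("localhost", 9999) // hypothetical host/port

// Called once per batch; `rdd` is the single RDD produced for that interval
lines.foreachRDD { (rdd, time) =>
  println(s"Batch at $time has ${rdd.count()} records in one RDD")
}

ssc.start()
ssc.awaitTermination()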