Spark Interview Question | Scenario Based Questions | { Regexp_replace } | Using PySpark

35,902 views

Azarudeen Shahul

1 day ago

Comments: 45
@gayathrilakshmi6087 · 3 years ago
Can you please attach the data set and solution set so that we can practice? Thanks for all the excellent videos.
@deepakkini3835 · 3 years ago
This was thoroughly explained, a nice scenario. Could you please do some videos on weekly cohort analysis using window functions in Spark?
@AzarudeenShahul · 3 years ago
Sure, we will plan it. Thanks for your support.
@sensibleandhrite · 3 years ago
@AzarudeenShahul Can you explain the expressions used in regexp_replace? I didn't understand what $0 is.
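A quick way to see what $0 does: in the Java regex replacement syntax that Spark's regexp_replace uses, $0 stands for the entire matched text. Python's re module spells the same thing \g<0>, so a minimal sketch in plain Python (sample string is an assumption, just to illustrate the idea):

```python
import re

line = "a|b|c|d|e|f"

# Non-greedy field + pipe, repeated five times: matches everything
# up to and including the 5th pipe.
pattern = r"^(?:.*?\|){5}"

# \g<0> in Python re is the whole match -- the same role $0 plays in
# regexp_replace. Appending "-" inserts a dash right after the 5th pipe.
result = re.sub(pattern, r"\g<0>-", line)
print(result)  # a|b|c|d|e|-f
```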
@arnabbangal766 · 1 year ago
Sir, can you explain the regex expression more clearly, or provide any YouTube link where regex is explained nicely? Thanks. Your videos are very helpful.
@sumantaghosh9299 · 3 years ago
Nice explanation, and good questions too.
@murrthuzaalibaig1205 · 3 years ago
Can you explain the regex in detail and how you arrived at the expression?
@SurendraKapkoti · 2 years ago
Great explanation. Keep it up 👆
@sravankumar1767 · 2 years ago
Superb explanation, do more videos bro.
@riyazalimohammad633 · 2 years ago
Hello Azar! Amazing video. Is there a way we could replace the 5th pipe occurrence rather than adding "-" next to it? I want to replace the pipe with "-".
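One way to replace the 5th pipe itself rather than inserting after it: capture everything up to (but not including) the 5th pipe in group 1, leave the pipe outside the group, and put the group back with "-" appended. A sketch with Python's re (sample data is an assumption; the same pattern should work in regexp_replace with "$1-" as the replacement, since Spark uses Java regex):

```python
import re

line = "a|b|c|d|e|f"

# Group 1 = first four fields with their pipes, plus the fifth field's text.
# The trailing \| outside the group is the 5th pipe itself, so replacing
# with "\1-" swaps that pipe for "-" instead of adding "-" alongside it.
pattern = r"^((?:[^|]*\|){4}[^|]*)\|"
result = re.sub(pattern, r"\1-", line)
print(result)  # a|b|c|d|e-f
```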
@anuragvinit · 3 years ago
Thanks for this. This was asked to me during a Mindtree interview.
@AzarudeenShahul · 3 years ago
Cool, hope you were able to answer the question and crack the interview.
@bunnyvlogs7647 · 3 years ago
Great, brother. God bless you.
@pavanp7242 · 3 years ago
Good work, bro.
@dipakchavan4659 · 2 years ago
Hey Azarudeen, superb 👍🏻. Can you please provide the dataset?
@jk_gameplay9195 · 1 year ago
Awesome, bro.
@umakanthtagore6003 · 2 years ago
Thanks for this information. Can you please tell me whether this approach works on large data as well? Thanks in advance!
@AzarudeenShahul · 2 years ago
Yes, this approach can scale out to a large dataset. Let me know if you face any problems.
@PraveenKumar-xk5zv · 3 years ago
Very helpful, thank you!
@sangramrajpujari3829 · 3 years ago
Good video to improve our logic.
@shiyamprasath3105 · 2 years ago
Hi bro, I have seen this 5th-occurrence change in Scala, but the code is much harder than the PySpark version. Please share an easier version of the Scala code, bro.
@aN0nyMas · 2 years ago
In my test data the last record had just four columns, hence I got a schema error. Is there a way to handle this by ignoring the malformed data?
@sudarshanthota4444 · 2 years ago
Thank you very much for your videos.
@AzarudeenShahul · 2 years ago
Thanks for your support :)
@bikersview9926 · 2 years ago
@AzarudeenShahul Please share the txt file and code snippets.
@sumitrastogi1 · 3 years ago
Can you please share how to deploy a PySpark job in a production environment? Your videos are very helpful.
@arifulahsan8803 · 1 year ago
Hi, do you teach a Spark course?
@jittendrakumar3908 · 3 years ago
Please upload a video on ingesting data from an SAP server. This is very important, as we need to ingest data from different sources via PySpark.
@maheshk1678 · 3 years ago
Thanks for the nice video.
@ankitapriya6671 · 2 years ago
Can you share the post where you have provided the answer for this?
@maheshk1678 · 3 years ago
Could you explain the same with Kafka message streaming?
@AzarudeenShahul · 3 years ago
Sure.
@jittendrakumar3908 · 3 years ago
Also, please upload a video on finding the occurrences of a string in a word.
@ihba02_official · 3 years ago
Thank you so much, bro.
@purnimabharti2306 · 2 years ago
I didn't understand why, in some places, you convert the RDD to a DataFrame and then the DataFrame back to an RDD. Why is that?
@khushbusalunkhe677 · 2 years ago
Some transformations are not available on DataFrames but are available on RDDs, so to perform those operations the DataFrame was converted to an RDD and then back to a DataFrame.
@VinodR-vx8uh · 8 months ago
Can someone please explain the regexp pattern (.*?\\){5} and why $0 is used in "$0-"?
@duskbbd · 2 years ago
Why did you put $0 before the delimiter "-"?
@riyazalimohammad633 · 2 years ago
The $0 in the replacement string refers to the entire regex match (Java regex replacement syntax, which regexp_replace uses), so when Azar uses "$0-" it preserves the matched text and appends "-" to it.
@Azardeen-sb1wr · 1 year ago
Mohammed,Azar,BE-4year
Prakesh,Kummar,Btech-3year
Ram,Kumar,Mtech,3year
jhon,smith,BE,2year
# Can anyone share the PySpark code to delimit on the "-"?
@shyammtv.v.s.p4262 · 1 year ago
Input:
123#Australia,india,Pakistan
456#England,France
789#canada,USA
Output:
123#Australia
789#canada
456#England
456#France
123#india
123#Pakistan
How to solve this using PySpark or Scala?