AES encryption in python | Lec-25
22:58
error handling in python | Lec-23
29:13
function in python | Lec-21
23:03
7 months ago
.join() method in python | Lec-20
24:04
string in python | Lec-17
34:47
7 months ago
set in python | Lec-17
34:53
7 months ago
tuple in python | Lec-16
31:13
7 months ago
dictionary in python | Lec-14
33:55
while loop in python | Lec-12
20:10
for loop in python part-2 | Lec-11
38:12
for loop in python part-1 | Lec-10
46:38
list in python part-2 | Lec-9
38:19
list in python | Lec-8
39:38
9 months ago
if else in python | Lec-7
35:59
10 months ago
python assignment1
5:01
10 months ago
Comments
@sattyamghumare8212 16 hours ago
Sir, I have also been rejected by multiple companies, even though I worked hard and cracked the L1 and L2 DSA and Spark rounds. But just because you share everything about success and failure, I get the confidence to go further. I know I will succeed one day. Thank you ❤
@harmanpreetkaur8049 18 hours ago
This is the best series related to Spark that I have seen so far. Tried so many other videos and couldn't complete those. You make it so easy to understand. Hands down the best teacher! Thanks for all your efforts. Subscribed :)
@pavanrajyaguru 1 day ago
Great!!!
@tanmaykapil5362 2 days ago
Bhai, please also start a series of Python interview coding questions.
@vikeesonawane3697 2 days ago
inferSchema and read are not actions. In Spark, the terms inferSchema and read are not technically "actions". Instead, they are configuration and lazy operations that define how data is read and processed.

inferSchema: an option used with the read operation to automatically infer the schema of the input data. It is not an action but a configuration setting that helps Spark understand the structure of the data being read. (Note, though, that enabling it makes Spark scan the input up front to determine column types, which does launch a job.)

read: a method provided by Spark's DataFrameReader to load data from various sources (like CSV, JSON, Parquet, etc.) into a DataFrame. It is also not an action; it lazily prepares the data for further processing.
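The distinction discussed above (lazy setup vs. work-triggering actions) can be sketched without Spark at all. Below is a plain-Python analogy using generators — it is not Spark code, just an illustration of the same lazy-evaluation idea: defining the "plan" does no work, and only consuming it (the "action") triggers execution.

```python
# Plain-Python analogy of Spark's lazy evaluation (not Spark itself).
# Generators behave like transformations: building them does no work.

log = []  # records when work actually happens

def read_rows():
    # Analogous to spark.read: describes HOW to produce rows, lazily.
    for i in range(3):
        log.append(f"read {i}")
        yield i

def double(rows):
    # Analogous to a transformation: wraps the plan, still does no work.
    return (r * 2 for r in rows)

plan = double(read_rows())  # the "plan" is built here
assert log == []            # ...but nothing has been read yet

result = list(plan)         # list() plays the role of an "action"
assert result == [0, 2, 4]
assert log == ["read 0", "read 1", "read 2"]  # work happened only now
```

In Spark the same pattern holds: chains of transformations only describe a plan, and an action such as show(), collect(), or count() is what actually runs it.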
@ashwinichoure3203 2 days ago
Hi Sir, I am very thankful that I came across your playlist. Very nice, please keep it up.
@snehsparsh7954 2 days ago
# Start with the first number as the initial result
result = float(input("Enter the first number: "))
while True:
    # Take the operator as input
    operator = input("Enter operator (+, -, *, /) or '=' to get result: ")
    # Exit the loop if the user enters '='
    if operator == "=":
        print(f"The final result is: {result}")
        break
    # Ensure that the operator is valid
    if operator not in ['+', '-', '*', '/']:
        print("Invalid operator. Please enter a valid operator.")
        continue
    # Take the next number as input
    num = float(input("Enter next number: "))
    # Perform the operation based on the operator
    if operator == "+":
        result += num
    elif operator == "-":
        result -= num
    elif operator == "*":
        result *= num
    elif operator == "/":
        # Check for division by zero
        if num == 0:
            print("Error: Division by zero is not allowed.")
            continue
        result /= num
    # Display the updated result after each operation
    print(f"Current result: {result}")
@snehsparsh7954 2 days ago
# Q2: Insert the new number 15 so that the list stays sorted in descending order
arr = [202, 165, 89, 76, 12]
target = 15

# Step 1: Find the index where the target should be inserted
idx = 0
for num in arr:
    if num < target:
        break
    idx += 1
print(f"The target value should be inserted at index: {idx}")

# Step 2: Append a dummy value (None or any placeholder)
arr.append(None)

# Step 3: Shift elements to the right to make space for the target
for i in range(len(arr) - 1, idx, -1):  # Start from the last index
    arr[i] = arr[i - 1]  # Shift elements one position to the right

# Step 4: Insert the target at the correct index
arr[idx] = target
print(f"The newly sorted array with target value: {arr}")
@younevano 2 days ago
My entire code for implementing SCD2 in customer_dim_tbl based off sales_df:

# Append records of new customers in sales_df that never existed in customer_dim_tbl
new_customer_dim_df = sales_df.join(customer_dim_df,
        sales_df["customer_id"] == customer_dim_df["id"], "leftanti") \
    .select(sales_df["customer_id"].alias("id"),
            sales_df["customer_name"].alias("name"),
            sales_df["food_delivery_address"].alias("city"),
            sales_df["food_delivery_country"].alias("country"),
            lit('Y').alias("active"),
            sales_df["sales_date"].alias("effective_start_date"),
            lit('null').alias("effective_end_date"))
new_customer_dim_df = customer_dim_df.union(new_customer_dim_df)

# Append updated records of existing customers in customer_dim_tbl
new_2_customer_dim_df = customer_dim_df.join(sales_df,
        (customer_dim_df["active"] == 'Y') &
        (customer_dim_df["id"] == sales_df["customer_id"]) &
        (customer_dim_df["city"] != sales_df["food_delivery_address"]), "inner") \
    .select(customer_dim_df["id"].alias("id"),
            customer_dim_df["name"].alias("name"),
            sales_df["food_delivery_address"].alias("city"),
            sales_df["food_delivery_country"].alias("country"),
            customer_dim_df["active"].alias("active"),
            sales_df["sales_date"].alias("effective_start_date"),
            customer_dim_df["effective_end_date"].alias("effective_end_date"))
customer_dim_df = new_customer_dim_df.union(new_2_customer_dim_df)

# Update existing records to historical records
customer_window = Window.partitionBy("id").orderBy("effective_start_date") \
    .rowsBetween(Window.unboundedPreceding, Window.unboundedFollowing)
customer_date_window = Window.partitionBy("id").orderBy("effective_start_date")
customer_dim_df.withColumn("active",
        when(col("effective_start_date") != last("effective_start_date").over(customer_window), lit('N'))
        .otherwise(lit('Y'))) \
    .withColumn("effective_end_date", lead("effective_start_date").over(customer_date_window)) \
    .show()
@vinay2111 2 days ago
answer of the LeetCode problem:
df_output = df_per.join(df_add, df_per["personId"] == df_add["personId"], "left") \
    .select(df_per["firstName"], df_per["lastName"], df_add["city"], df_add["state"])
df_output.show()
@dayanandab.n3814 3 days ago
Thank you Manish Bhai
@ashishmishra-cx8ek 3 days ago
👏👏
@bhagyashreethote5956 3 days ago
Really helpful series
@dharanirajjain8316 3 days ago
number = int(input("enter the value"))
result = number % 2
if result == 0:
    print("even")
else:
    print("odd")
@ashishmishra-cx8ek 3 days ago
Clap for you Manish bhai.
@ashishmishra-cx8ek 3 days ago
Keep it up Manish bhai.
@PrKash5 3 days ago
Hi Manish. Very good lecture. One question: do we also make Spark code modular? I have worked on a few data engineering projects, but I have mostly seen a functional way of working, not a modular one.
@OmegaSpeedwill 3 days ago
Manish bhai, please make a roadmap for fresher data engineers in 2025. Is there anything else we need to learn?
@younevano 3 days ago
product_window = Window.partitionBy("product_id").orderBy("sales_date")
sales_window = Window.partitionBy("product_id").orderBy("sales_date").rowsBetween(-2, 0)
product_df.withColumn("row_number", row_number().over(product_window)) \
    .withColumn("avg_sales_last_3_months", round(avg("sales").over(sales_window), 2)) \
    .filter(col("row_number") >= 3).show()
@ABHINITAKumari-d1s 4 days ago
# print star pattern in reverse now
for i in range(6, 0, -1):
    logger.info(i * " * ")
@Daily_Code_Challenge 4 days ago
Thank you ❤
@Banzaraaa 4 days ago
C:\Spark\spark-3.4.4-bin-hadoop3\python\pyspark\shell.py:74: UserWarning: Failed to initialize Spark session.
  warnings.warn("Failed to initialize Spark session.")
Can anyone please tell me how to resolve the above issue? I am facing it while checking PySpark in Command Prompt.
@Samar009-m 4 days ago
I just have one word for this... awesome
@MuhammadAbdullah-of7in 4 days ago
Sir you're a legend!!
@coolraviraj24 4 days ago
farzi (fake) join 😂
@snehsparsh7954 5 days ago
Question 3:
paragraph = """ Programming aasan hai. We are going to learn this in depth. While learning we have to make sure that
we are implemeting all the logics by ourself. The aim here is to build our "4 BHK" house with the
help of 'Python programming'. We have total land is of \100 ft * 100ft /, to colmplete the house
we have total 6 labours with 'different skill set like "\\ building wall or building roof \\".
I have to print this paragraph as it is given here."""

numbered_paragraph = ' '.join(f"{idx + 1}: {line}" for idx, line in enumerate(paragraph.splitlines()))
print(numbered_paragraph)

O/p -
1:  Programming aasan hai. We are going to learn this in depth. While learning we have to make sure that 2: we are implemeting all the logics by ourself. The aim here is to build our "4 BHK" house with the 3: help of 'Python programming'. We have total land is of @ ft * 100ft /, to colmplete the house 4: we have total 6 labours with 'different skill set like "\ building wall or building roof \". 5: I have to print this paragraph as it is given here.
@shwetatejpalshah2333 5 days ago
You are a 💎
@ABHINITAKumari-d1s 5 days ago
num = int(input("enter your number : "))
if num % 2 == 0:
    logger.info("Given number is Even")
else:
    logger.info("Given number is Odd")
@saurabhgulati2505 6 days ago
count is also an action. Why didn't we consider it as a new job?
@Vinodkumar_19879 6 days ago
How can I connect with you?
@Vinodkumar_19879 6 days ago
Please please
@Vinodkumar_19879 6 days ago
Please help me.
@Vinodkumar_19879 6 days ago
Manish bhaiya, I want to talk to you.
@ashidreza384 7 days ago
Hello sir, I saw at the beginning that you said you were talking about corrupted records and the documentation. Can you please tell me which documentation you are referring to?
@AnandaKrishna-t3h 7 days ago
Nice
@bhupendrasonwane5335 7 days ago
thanks for the explanation :)
@gagansingh3481 7 days ago
Sir, can you please create a complete video on PySpark and Databricks?
@mayuragrawal9678 8 days ago
Thank you so much for making amazing videos; they clear so many doubts. You are amazing, Sir. Please keep going.
@kamalprajapati9955 8 days ago
Hi Manish, your tutorials are in-depth, cover the core concepts thoroughly, and are perfect for beginners to intermediates. I don't see any other tutorials out there that teach with such ease and such effort to make us understand. They really helped me crack a job interview with confidence, and I am in a good position today. I promise not to slip into a comfort zone; I will always skill up and rewatch your teachings to get back on track. Thank you for your efforts: they have made a real impact, and I will look out for more such knowledge from you. All the best for your next steps. Thank you.
@MuhammadAbdullah-of7in 8 days ago
My salute to you sir!
@Bharti-q7d 8 days ago
Can you make a video on how to connect Spark with MongoDB, and on installing Java, Spark, and Hadoop with compatible versions? I am getting a Py4JJavaError.
@younevano 9 days ago
customer_df.select("name").filter((col("referee_id") != 2) | (col("referee_id").isNull())).show()
@ShivamGupta-wn9mo 10 days ago
Hey, I completed the whole series and practised every question, and now I am confident in PySpark coding. Thanks!
@manish_kumar_1 9 days ago
That's great
@younevano 2 days ago
Same here and read all comments under each video and commented on others' questions :D
@ShivamGupta-wn9mo 10 days ago
3 ans:
window = Window.partitionBy("product_id").orderBy("sales_date").rowsBetween(-2, Window.currentRow)
df_running = product_df.withColumn("last_3_month_avg_sales", avg("sales").over(window)) \
    .withColumn("performance", when(col("sales") > col("last_3_month_avg_sales"), "Above Avg")
        .when(col("sales") < col("last_3_month_avg_sales"), "Below Avg")
        .otherwise("Equal to Avg"))
df_running.show()
@sahilbhatia1937 10 days ago
need more lectures like this
@Mr_Dattrao_B.Andhori 10 days ago
What is meant by "0 rows affected"?
@Mr_Dattrao_B.Andhori 10 days ago
thank you...
@sahilbhatia1937 11 days ago
great