Sir, I have also been rejected by multiple companies, even though I worked hard and cracked the L1 and L2 rounds (DSA, Spark). But just because you share everything about success and failure, I get the confidence to go further... I know I will get success one day... Thank you..❤
@harmanpreetkaur8049 18 hours ago
This is the best series related to Spark that I have seen so far. I tried so many other videos and couldn't complete them. You make it so easy to understand. Hands down the best teacher! Thanks for all your efforts. Subscribed :)
@pavanrajyaguru 1 day ago
Great!!!
@tanmaykapil5362 2 days ago
Bhai, please also start a series on Python interview coding questions.
@vikeesonawane3697 2 days ago
inferSchema and read are not actions. In Spark, these are not technically "actions"; they are operations used to define how data is read and processed.

inferSchema: an option used with the read operation to automatically infer the schema of the input data. It is not an action but a configuration setting that helps Spark understand the structure of the data being read.

read: a method provided by Spark's DataFrameReader to load data from sources like CSV, JSON, or Parquet into a DataFrame. It is also not an action; it only prepares the data for further processing.
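For illustration, a minimal sketch of this distinction (the SparkSession setup and file path below are assumed, not part of the comment above):

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("read_demo").getOrCreate()

# read() plus the inferSchema option only configure how loading will happen;
# no user-called action has run at this point
df = spark.read \
    .option("header", "true") \
    .option("inferSchema", "true") \
    .csv("/path/to/sample.csv")  # assumed path

# show() is an action: only now does Spark run a job to materialize rows
df.show()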
@ashwinichoure3203 2 days ago
Hi Sir, I am very thankful that I came across your playlist. Very nice, please keep it up.
@snehsparsh7954 2 days ago
# Seed the result with the first number
result = float(input("Enter the first number: "))

while True:
    # Take the operator as input
    operator = input("Enter operator (+, -, *, /) or '=' to get result: ")

    # Exit the loop if the user enters '='
    if operator == "=":
        print(f"The final result is: {result}")
        break

    # Ensure that the operator is valid
    if operator not in ['+', '-', '*', '/']:
        print("Invalid operator. Please enter a valid operator.")
        continue

    # Take the next number as input
    num = float(input("Enter next number: "))

    # Perform the operation based on the operator
    if operator == "+":
        result += num
    elif operator == "-":
        result -= num
    elif operator == "*":
        result *= num
    elif operator == "/":
        # Check for division by zero
        if num == 0:
            print("Error: Division by zero is not allowed.")
            continue
        result /= num

    # Display the updated result after each operation
    print(f"Current result: {result}")
@snehsparsh7954 2 days ago
# Q2: Insert the new number 15 in such a way that the list stays sorted in descending order
arr = [202, 165, 89, 76, 12]
target = 15

# Step 1: Find the index where the target should be inserted
idx = 0
for num in arr:
    if num < target:
        break
    idx += 1
print(f"The target value should be inserted at index: {idx}")

# Step 2: Append a dummy value (None or any placeholder)
arr.append(None)

# Step 3: Shift elements to the right to make space for the target
for i in range(len(arr) - 1, idx, -1):  # Start from the last index
    arr[i] = arr[i - 1]  # Shift elements one position to the right

# Step 4: Insert the target at the correct index
arr[idx] = target
print(f"The newly sorted array with target value: {arr}")
@younevano 2 days ago
My entire code for implementing SCD2 in customer_dim_tbl based on sales_df:

# imports added so the snippet runs standalone
from pyspark.sql.functions import col, lit, when, last, lead
from pyspark.sql.window import Window

# Append records of new customers in sales_df that never existed in customer_dim_tbl
new_customer_dim_df = sales_df.join(customer_dim_df, sales_df["customer_id"] == customer_dim_df["id"], "leftanti") \
    .select(sales_df["customer_id"].alias("id"),
            sales_df["customer_name"].alias("name"),
            sales_df["food_delivery_address"].alias("city"),
            sales_df["food_delivery_country"].alias("country"),
            lit('Y').alias("active"),
            sales_df["sales_date"].alias("effective_start_date"),
            lit('null').alias("effective_end_date"))

new_customer_dim_df = customer_dim_df.union(new_customer_dim_df)

# Append updated records of existing customers in customer_dim_tbl
new_2_customer_dim_df = customer_dim_df.join(sales_df,
        (customer_dim_df["active"] == 'Y') &
        (customer_dim_df["id"] == sales_df["customer_id"]) &
        (customer_dim_df["city"] != sales_df["food_delivery_address"]),
        "inner") \
    .select(customer_dim_df["id"].alias("id"),
            customer_dim_df["name"].alias("name"),
            sales_df["food_delivery_address"].alias("city"),
            sales_df["food_delivery_country"].alias("country"),
            customer_dim_df["active"].alias("active"),
            sales_df["sales_date"].alias("effective_start_date"),
            customer_dim_df["effective_end_date"].alias("effective_end_date"))

customer_dim_df = new_customer_dim_df.union(new_2_customer_dim_df)

# Update existing records to historical records
customer_window = Window.partitionBy("id").orderBy("effective_start_date") \
    .rowsBetween(Window.unboundedPreceding, Window.unboundedFollowing)
customer_date_window = Window.partitionBy("id").orderBy("effective_start_date")

customer_dim_df.withColumn("active",
        when(col("effective_start_date") != last("effective_start_date").over(customer_window), lit('N'))
        .otherwise(lit('Y'))) \
    .withColumn("effective_end_date", lead("effective_start_date").over(customer_date_window)) \
    .show()
@vinay2111 2 days ago
Answer to the LeetCode problem:

df_output = df_per.join(df_add, df_per["personId"] == df_add["personId"], "left") \
    .select(df_per["firstName"], df_per["lastName"], df_add["city"], df_add["state"])
df_output.show()
@dayanandab.n3814 3 days ago
Thank you Manish Bhai
@ashishmishra-cx8ek 3 days ago
👏👏
@bhagyashreethote5956 3 days ago
Really helpful series
@dharanirajjain8316 3 days ago
number = int(input("Enter the value: "))
result = number % 2
if result == 0:
    print("even")
else:
    print("odd")
@ashishmishra-cx8ek 3 days ago
Clap for you Manish bhai.
@ashishmishra-cx8ek 3 days ago
Keep it up Manish bhai.
@PrKash5 3 days ago
Hi Manish. Very good lecture. One question: do we also make Spark code modular? I have worked on a few data engineering projects, but I have mostly seen a functional way of working, not modular.
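As a rough illustration of one way "modular" Spark code can look, here is a minimal sketch; every function name, column, and path in it is assumed for the example:

from pyspark.sql import SparkSession, DataFrame
from pyspark.sql.functions import col

def read_orders(spark: SparkSession, path: str) -> DataFrame:
    # The read is isolated so the source format or path can change independently
    return spark.read.option("header", "true").csv(path)

def filter_completed(orders: DataFrame) -> DataFrame:
    # One small, individually testable transformation per function
    return orders.filter(col("status") == "COMPLETED")

def main():
    spark = SparkSession.builder.appName("modular_demo").getOrCreate()
    orders = read_orders(spark, "/path/to/orders.csv")  # assumed path
    filter_completed(orders).show()

if __name__ == "__main__":
    main()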
@OmegaSpeedwill 3 days ago
Manish bhai, please make a roadmap for freshers in data engineering for 2025. Is there anything else we need to learn?
import logging
logging.basicConfig(level=logging.INFO, format="%(message)s")  # minimal logger setup (assumed) so the snippet runs standalone
logger = logging.getLogger(__name__)

# Print the star pattern in reverse
for i in range(6, 0, -1):
    logger.info(i * " * ")
@Daily_Code_Challenge 4 days ago
Thank you ❤
@Daily_Code_Challenge 4 days ago
Thank you ❤
@Daily_Code_Challenge 4 days ago
Thank you ❤
@Banzaraaa 4 days ago
C:\Spark\spark-3.4.4-bin-hadoop3\python\pyspark\shell.py:74: UserWarning: Failed to initialize Spark session.
  warnings.warn("Failed to initialize Spark session.")

Can anyone please tell me how to resolve the above issue? I am facing it while checking pyspark in Command Prompt.
@Samar009-m 4 days ago
I just have one word for this... awesome!
@MuhammadAbdullah-of7in 4 days ago
Sir you're a legend!!
@coolraviraj24 4 days ago
farzi (fake) join 😂
@snehsparsh7954 5 days ago
Question 3:

paragraph = """ Programming aasan hai. We are going to learn this in depth. While learning we have to make sure that
we are implemeting all the logics by ourself. The aim here is to build our "4 BHK" house with the
help of 'Python programming'. We have total land is of \100 ft * 100ft /, to colmplete the house
we have total 6 labours with 'different skill set like "\\ building wall or building roof \\".
I have to print this paragraph as it is given here."""

numbered_paragraph = ' '.join(f"{idx + 1}: {line}" for idx, line in enumerate(paragraph.splitlines()))
print(numbered_paragraph)

Output:
1: Programming aasan hai. We are going to learn this in depth. While learning we have to make sure that
2: we are implemeting all the logics by ourself. The aim here is to build our "4 BHK" house with the
3: help of 'Python programming'. We have total land is of @ ft * 100ft /, to colmplete the house
4: we have total 6 labours with 'different skill set like "\ building wall or building roof \".
5: I have to print this paragraph as it is given here.
@shwetatejpalshah2333 5 days ago
You are a 💎
@ABHINITAKumari-d1s 5 days ago
import logging
logging.basicConfig(level=logging.INFO, format="%(message)s")  # minimal logger setup (assumed) so the snippet runs standalone
logger = logging.getLogger(__name__)

num = int(input("Enter your number: "))
if num % 2 == 0:
    logger.info("Given number is Even")
else:
    logger.info("Given number is Odd")
@saurabhgulati2505 6 days ago
count is also an action. Why didn't we consider it as a new job?
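For context, a small sketch (assuming an existing SparkSession named spark): each action, count() included, normally submits at least one job that is visible in the Spark UI, so this is easy to verify:

df = spark.range(10)   # transformation setup only: no job submitted yet
print(df.count())      # count() is an action: it submits a job
df.show()              # show() is another action: it submits another job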
@Vinodkumar_19879 6 days ago
How can I connect with you?
@Vinodkumar_19879 6 days ago
Please please
@Vinodkumar_19879 6 days ago
Please help me.
@Vinodkumar_19879 6 days ago
Manish bhaiya, I want to talk to you.
@ashidreza384 7 days ago
Hello sir, I saw that at the beginning you said you were talking about corrupted records and the documentation. Can you please tell me which documentation you are referring to?
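For reference, corrupted-record handling in Spark usually revolves around the reader's mode option; a minimal sketch, with the file path and schema assumed for the example:

from pyspark.sql import SparkSession
from pyspark.sql.types import StructType, StructField, StringType, IntegerType

spark = SparkSession.builder.appName("corrupt_demo").getOrCreate()

# _corrupt_record must be declared in the schema to capture malformed rows
schema = StructType([
    StructField("id", IntegerType(), True),
    StructField("name", StringType(), True),
    StructField("_corrupt_record", StringType(), True),
])

# PERMISSIVE (default) keeps bad rows, DROPMALFORMED drops them, FAILFAST raises an error
df = spark.read \
    .schema(schema) \
    .option("mode", "PERMISSIVE") \
    .option("columnNameOfCorruptRecord", "_corrupt_record") \
    .csv("/path/to/data.csv")  # assumed path
df.show(truncate=False)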
@AnandaKrishna-t3h 7 days ago
Nice
@bhupendrasonwane5335 7 days ago
thanks for the explanation :)
@gagansingh3481 7 days ago
Sir, can you please create a complete video on PySpark and Databricks?
@mayuragrawal9678 8 days ago
Thank you so much for making amazing videos; they clear up so many doubts. You are amazing, Sir. Please keep going.
@kamalprajapati9955 8 days ago
Hi Manish, your tutorials are in-depth, engage deeply with the core concepts, and are perfect for beginners to intermediates. I don't see any other tutorials out there taught with such ease and effort to make us understand. They really helped me crack a job interview with confidence, and I am in a good position today. I promise not to slip into a comfort zone; I will always keep skilling up and watching your teachings to get back on track. Thank you for your efforts; they have made a real impact, and I will look out for more such knowledge from you. All the best for your next steps. Thank you.
@MuhammadAbdullah-of7in 8 days ago
My salute to you sir!
@Bharti-q7d 8 days ago
Can you make a video on how to connect Spark with MongoDB, and on installing Java, Spark, and Hadoop with compatible versions? I am getting a py4j Java error.
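One possible wiring, sketched under the assumption that the MongoDB Spark Connector v10.x is used; the package version, URI, database, and collection names are placeholders, and the py4j error mentioned can have many causes, so this only shows the connector side:

# Launch with the connector on the classpath (version is an assumption):
#   spark-submit --packages org.mongodb.spark:mongo-spark-connector_2.12:10.1.1 app.py

from pyspark.sql import SparkSession

spark = SparkSession.builder \
    .appName("mongo_demo") \
    .config("spark.mongodb.read.connection.uri", "mongodb://localhost:27017") \
    .getOrCreate()  # placeholder URI

# Read one collection into a DataFrame (database/collection names are placeholders)
df = spark.read.format("mongodb") \
    .option("database", "test_db") \
    .option("collection", "test_coll") \
    .load()
df.show()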