Comments
@rafidbinsadeque499 · 22 days ago
I am getting the error "No module named 'airflow.providers.postgres.operators'", although I have it installed.
@sumitkumar2955 · 20 days ago
Please rebuild the image, start Airflow again, and check.
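The usual cause of this error is that the provider package is installed on the host but not inside the Airflow image that the containers run. A minimal sketch of baking it into the image (the base image tag and provider version are assumptions, not from the video; pin them to your environment):

```dockerfile
# Hypothetical Dockerfile extending the official Airflow image.
FROM apache/airflow:2.9.3
RUN pip install --no-cache-dir apache-airflow-providers-postgres
```

Then rebuild with `docker-compose build` and restart. Note also that recent provider versions deprecate `PostgresOperator` in favor of `SQLExecuteQueryOperator` from `airflow.providers.common.sql.operators.sql`, so the correct import path depends on the provider version installed.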
@tushargogiya4017 · a month ago
Sir, what if I already have a local DB in pgAdmin 4 and I want to use it in Airflow?
@sumitkumar2955 · a month ago
@tushargogiya4017 Yes, you can use it. You have to set that Postgres hostname in the Airflow config, and your Postgres must be accessible from the Airflow server.
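One hedged way to wire this up when Airflow runs in Docker and Postgres runs on the host machine (connection id, credentials, and database name below are placeholders): define the connection as an environment variable in docker-compose, using `host.docker.internal` as the hostname so the container can reach the host database.

```yaml
# docker-compose.yaml excerpt (sketch; names and credentials are placeholders)
services:
  airflow-webserver:
    environment:
      # Airflow picks up connections from AIRFLOW_CONN_<CONN_ID> variables
      AIRFLOW_CONN_MY_LOCAL_PG: postgresql://myuser:mypass@host.docker.internal:5432/mydb
```

On Linux, `host.docker.internal` may additionally require `extra_hosts: ["host.docker.internal:host-gateway"]` on the service.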
@tushargogiya4017 · a month ago
@sumitkumar2955 Thank you, your video helps me a lot. Great content!
@sumitkumar2955 · a month ago
@tushargogiya4017 Thanks.
@aritrakhatua1482 · a month ago
06:42: working through the process in Anaconda.
@kollajyotheswarareddy998 · a month ago
Hi, I am getting this error: "Python worker exited unexpectedly (crashed)". Could you please assist with this?
@sumitkumar2955 · a month ago
@kollajyotheswarareddy998 Have you followed all the steps correctly? Please check again.
@kollajyotheswarareddy998 · a month ago
@sumitkumar2955 Yes.
@kollajyotheswarareddy998 · a month ago
@sumitkumar2955 The remaining code executes, but when I run df.show() it throws an error.
@sumitkumar2955 · a month ago
@kollajyotheswarareddy998 Please ping me on WhatsApp: 8147085086.
@gauravkumar-jp1ic · a month ago
Nice, sir ji!
@sumitkumar2955 · a month ago
@gauravkumar-jp1ic 😂
@aavishkarmahajan6114 · a month ago
Nicely shown steps. Thanks!
@sumitkumar2955 · a month ago
@aavishkarmahajan6114 Thanks 🙏
@anithagracy6407 · 2 months ago
Super. Thank you. Very useful.
@sumitkumar2955 · 2 months ago
@anithagracy6407 Thanks 🙏
@HiddenAway-oq6wo · 2 months ago
Thank you very much. Your explanation is very clear and to the point. I had been struggling for the last week to connect DBeaver, with no help anywhere, but your video solved it. Thanks again, and keep up the good work.
@sumitkumar2955 · 2 months ago
@HiddenAway-oq6wo Happy to know that it helped you. Let me know if you have any other requirement and I will make a video on that topic.
@dark-crawler · 3 months ago
Thank you!
@sumitkumar2955 · 3 months ago
Thanks, please subscribe.
@sreenugangadevi9975 · 3 months ago
Thank you, it's useful.
@sumitkumar2955 · 3 months ago
@sreenugangadevi9975 Thanks. Please subscribe.
@chandankumarthakur08 · 3 months ago
Wow, thanks for this :)
@sumitkumar2955 · 3 months ago
@chandankumarthakur08 Thanks 😄
@syedsimra · 3 months ago
@sumitkumar2955 I followed the same steps as you did, but I still got an error while connecting to SFTP using WinSCP. It says "Access Denied". What could be the reason? I created the bucket role, the keys are spelled correctly (case-sensitive), I entered the correct password in WinSCP, etc. Can you please help here?
@sumitkumar2955 · 3 months ago
@syedsimra Maybe a role/policy issue.
@syedsimra · 3 months ago
@sumitkumar2955 Actually, I fixed the issue. It looks like this video is outdated; the updated published documentation gave me the fix. The old post had the secret name in the format SFTP/username, but the new post gives the correct updated format: aws/transfer/server-id/username.
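Based on the commenter's report, the Secrets Manager secret that AWS Transfer Family's custom identity provider looks up must be named with the server ID. A sketch of creating it (placeholders in angle brackets; the JSON keys in the secret body depend on the identity-provider template in use, so verify them against the current AWS documentation):

```
aws secretsmanager create-secret \
  --name "aws/transfer/<server-id>/<username>" \
  --secret-string '{"Password":"<password>","Role":"<role-arn>"}'
```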
@sumitkumar2955 · 3 months ago
@syedsimra Yes, this is a very old video; maybe I have to create a new one. Thanks for the information; it will help others. 🙏👍
@fisicateca17 · 3 months ago
Is the DBFS browser disabled forever? I can't access this option. 😢
@sumitkumar2955 · 3 months ago
@fisicateca17 Maybe you don't have the admin role.
@NaveedKhan_777 · 3 months ago
Brother, why is this DBFS setting not showing in my account?
@sumitkumar2955 · 3 months ago
Naveed Bhai, you may not have an admin account.
@NaveedKhan_777 · 3 months ago
@sumitkumar2955 What should I do?
@NaveedKhan_777 · 3 months ago
@sumitkumar2955 What should I do? I'm using Community Edition.
@ranjeethrikkala6344 · 4 months ago
Hi @sumitkumar2955. This is working with notebook instances, but not with SageMaker Studio JupyterLab notebooks. Can you please help?
@sumitkumar2955 · 4 months ago
@ranjeethrikkala6344 Sure, I will check and let you know.
@ranjeethrikkala6344 · 4 months ago
@sumitkumar2955 Hi Sumit, have you found a solution for the SageMaker Studio notebook? Also, the above code does not work when the notebook instance is stopped, which requires us to start it manually; in that case the trigger-based automation is not served.
@arafatabsi6546 · 4 months ago
Hi, the command "docker-compose up -d --no-deps --build airflow-webserver airflow-scheduler" recreates the containers (you can tell from the container IDs), which means that when I want to add a requirement, all my data in the containers is lost! What is the solution?
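Containers are ephemeral by design: `--build` recreates them, so anything written to a container's writable layer is gone. The fix is to keep state in named volumes or bind mounts, which survive container recreation. A sketch (service and volume names are assumptions; the official Airflow compose file uses a similar layout):

```yaml
# docker-compose.yaml excerpt: state lives in volumes, not in containers
services:
  postgres:
    image: postgres:13
    volumes:
      - postgres-db-volume:/var/lib/postgresql/data   # DB data persists across rebuilds
  airflow-webserver:
    volumes:
      - ./dags:/opt/airflow/dags                      # bind mounts for DAGs and logs
      - ./logs:/opt/airflow/logs

volumes:
  postgres-db-volume:
```

With this layout, rebuilding the webserver/scheduler images does not touch the database volume.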
@YashKumarJain-v5c · 5 months ago
I followed along as per the video, but the DAGs are not visible in the UI. Could you help me out?
@sumitkumar2955 · 5 months ago
@YashKumarJain-v5c Thanks for watching. Please check for permission/role issues. If it's still not working, ping me on WhatsApp: 8147085086.
@YashKumarJain-v5c · 4 months ago
@sumitkumar2955 Thank you for the reply. I solved the issue: I was using an already-set-up VPC and there was no NAT enabled.
@mayankv83 · 5 months ago
Thanks for the video. I am receiving an error while testing the Lambda function: "An error occurred (ValidationException) when calling the StartNotebookInstance operation: Status (InService) not in ([Stopped, Failed]). Unable to transition to (Pending) for Notebook Instance". Please help me.
@srk0706 · 4 months ago
You need the notebook instance in a startable state. Maybe add another Lambda to handle starting it first if you want it fully automated.
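The ValidationException quoted above means the Lambda called StartNotebookInstance while the instance was already InService; the API only allows that transition from Stopped or Failed. A minimal sketch of a guard (the boto3 calls are shown as comments since they need AWS credentials; only the pure status check is live code here, and the variable `name` is a placeholder):

```python
# StartNotebookInstance raises ValidationException unless the instance
# is currently in one of these states.
STARTABLE_STATES = {"Stopped", "Failed"}

def can_start(status: str) -> bool:
    """Return True if StartNotebookInstance is a valid call for this status."""
    return status in STARTABLE_STATES

# With boto3 (available in the Lambda runtime), the pattern would be:
#   sm = boto3.client("sagemaker")
#   status = sm.describe_notebook_instance(
#       NotebookInstanceName=name)["NotebookInstanceStatus"]
#   if can_start(status):
#       sm.start_notebook_instance(NotebookInstanceName=name)
#   # else: it is already InService (or transitioning), so skip the start call
```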
@AbhishekRoy-w2f · 5 months ago
Can we run the Lambda function without opening the terminal? I am only able to run it if I open the terminal; then the notebook run is successful. If I don't open the terminal, the Lambda function still succeeds, but the SageMaker notebook does not run.
@sumitkumar2955 · 5 months ago
"Can you run the Lambda function without opening the terminal?" What does that mean? Could you please tell me which terminal you are opening, and how you are running the Lambda?
@srk0706 · 4 months ago
Hi, I found out the hard way that the terminal needs to be running as well for the code to work, which is pretty expensive. I moved to a lifecycle configuration using nohup, and it's working. I use Lambda to start and stop the instance.
@franciscageorgue2207 · a month ago
@srk0706 Hello, could you show us the script you created for the lifecycle configuration, please?
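Not the commenter's actual script, but a sketch of the pattern they describe: an on-start lifecycle configuration that runs a notebook headlessly in the background with nohup. The notebook path, output path, and kernel name are assumptions.

```shell
#!/bin/bash
# SageMaker notebook-instance lifecycle configuration (on-start) sketch.
set -e

# Run the target notebook in the background so this script returns
# before the lifecycle configuration's 5-minute timeout.
nohup jupyter nbconvert --to notebook --execute \
    --ExecutePreprocessor.kernel_name=python3 \
    /home/ec2-user/SageMaker/my_job.ipynb \
    --output /home/ec2-user/SageMaker/my_job_output.ipynb \
    > /home/ec2-user/SageMaker/nbconvert.log 2>&1 &
```

A separate Lambda can then call StartNotebookInstance / StopNotebookInstance around this, as described above.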
@rithishkonduri508 · 6 months ago
Thanks, mate. It helped.
@sumitkumar2955 · 6 months ago
@rithishkonduri508 Happy to know 😊
@shubhampoul1643 · 6 months ago
Thanks, bro 🧡
@sumitkumar2955 · 6 months ago
@shubhampoul1643 Please subscribe and share with your friends. Thanks 😊👍
@Safar-e4o · 6 months ago
Please, sir, add the code sheet in the description.
@sumitkumar2955 · 6 months ago
Sure, but did I do any coding in this video?
@aashishpant · 6 months ago
Does this start the notebook instance by itself, or do we have to run this while the instance is running?
@sumitkumar2955 · 6 months ago
Yes, we have to run this while the instance is running. Thanks!
@vaibhavgupta7429 · 6 months ago
Commendable video, thanks a lot. I would appreciate it if the missing steps could also be documented in the doc.
@sumitkumar2955 · 6 months ago
Thanks 🙏 I believe I have covered all the steps, but I will check and update the blog below: deltafrog.net/trigger-sagemaker-jupyter-notebook-file-from-aws-lambda/
@yejinzai · 7 months ago
Thank you so much! Proven that it works!
@sumitkumar2955 · 7 months ago
Happy that it helped you 😃
@iamrahul_29 · 7 months ago
@SumitKumar I am getting the errors below when running df.show():
Py4JJavaError: An error occurred while calling o77.showString.
org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 1.0 failed 1 times, most recent failure: Lost task 0.0 in stage 1.0 (TID 1) (LAPTOP-6C4N7D8I executor driver): org.apache.spark.SparkException: Python worker failed to connect back.
Caused by: java.net.SocketTimeoutException: Accept timed out
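This "Python worker failed to connect back" / "Accept timed out" failure on Windows is commonly fixed by pointing Spark at the exact Python interpreter before the session is created. A sketch (the SparkSession part is commented out and assumes pyspark is installed):

```python
import os
import sys

# Make the driver and the workers use the same interpreter so the
# Python worker can connect back to the JVM.
os.environ["PYSPARK_PYTHON"] = sys.executable
os.environ["PYSPARK_DRIVER_PYTHON"] = sys.executable

# from pyspark.sql import SparkSession
# spark = SparkSession.builder.master("local[*]").getOrCreate()
# spark.range(5).show()
```

If the error persists, checking that HADOOP_HOME/winutils is set up and that the firewall allows local loopback connections is the usual next step.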
@nabeelasyed1034 · 7 months ago
How do I resolve the "externally-managed-environment" error while installing Apache Airflow?
@sumitkumar2955 · 7 months ago
Which step are you following? Please provide more information.
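That error is pip refusing to install into a distro-managed Python (PEP 668). The standard fix is to install Airflow inside a virtual environment. A sketch (the Airflow version and constraints URL follow Airflow's documented install pattern and should be adjusted to your Python/Airflow versions):

```shell
# Create and activate an isolated environment (avoids the PEP 668 block)
python3 -m venv airflow-venv
. airflow-venv/bin/activate

# Then install Airflow with its constraints file (network required):
# pip install "apache-airflow==2.9.3" \
#   --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-2.9.3/constraints-3.11.txt"
```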
@wintercherryblossom4020 · 8 months ago
Thanks, this is very helpful!
@sumitkumar2955 · 8 months ago
Thanks 😊 Please subscribe for more such videos.
@droptimistic7419 · 8 months ago
Which API did you use? Which account is the free account?
@sumitkumar2955 · 8 months ago
Unfortunately, the API is not working for me as well.
@abushama1638 · 8 months ago
Very informative.
@notanspameratall7293 · 8 months ago
Amazing, thanks for your tutorials and help!
@sumitkumar2955 · 8 months ago
Thanks for watching.
@abushama1638 · 8 months ago
Great tutorial!
@sumitkumar2955 · 8 months ago
Thanks, bhai.
@TechnoSparkBigData · 8 months ago
Nice video, sir!
@sumitkumar2955 · 8 months ago
Thanks, bhai.
@egunjobitunde8369 · 8 months ago
Free at last 🤩 After several videos, my df.show() always gave an error. Thanks a lot! This video is a lifesaver.
@AMITDAS-mr8xj · 8 months ago
Hi Sumit bro, I applied this approach to get the dbt packages installed, but I'm still getting a "module not found" error, although the installation went smoothly. Need help.
@sumitkumar2955 · 8 months ago
Ping me on WhatsApp: 8147085086.
@ManishKumar-mj3ko · 8 months ago
Can you please tell me about the user amit? When did you create that?
@ManishKumar-mj3ko · 8 months ago
Can you please share the video link for "Create SFTP server for S3 with username and password authentication without using a CloudFormation template"?
@sumitkumar2955 · 8 months ago
Hope you got the link.
@TechnoSparkBigData · 8 months ago
That is a very good feature; thanks for sharing, sir.
@sumitkumar2955 · 8 months ago
👍
@TheSarfarazahmed · 8 months ago
But the screen is blurred, please check.
@sumitkumar2955 · 8 months ago
Sure. Can you change the quality to 2K and check?
@TheSarfarazahmed · 8 months ago
Superb 👌
@abushama1638 · 8 months ago
Great, thanks a lot!
@notanspameratall7293 · 8 months ago
Is there any way I can upload a .parquet file into the Postgres database, from my local machine to the Postgres container?
@sumitkumar2955 · 8 months ago
You can use Python code to read the parquet file and prepare a DataFrame; then you can write the DataFrame into the Postgres DB.
@notanspameratall7293 · 8 months ago
@sumitkumar2955 But what if the parquet file is on my local computer? How can I make a DAG to upload the data to the PostgreSQL DB? By the way, thanks for your response.
@sumitkumar2955 · 8 months ago
@notanspameratall7293 Where is your Airflow running? If it is running locally, you can use the same Python code in the DAG you create.
@notanspameratall7293 · 8 months ago
@sumitkumar2955 I'm currently running Airflow with Docker Compose; I'm trying to insert datasets from my local machine into the PostgreSQL database.
@sumitkumar2955 · 8 months ago
@notanspameratall7293 kzbin.info/www/bejne/p37Cm4Rtjch9bposi=e9jOh0kxAQk-fL-r Could you please go through this video? It will be helpful. I save the file locally and read it from there, so you have to move your file into the container first using the Dockerfile. Once the file is in your Airflow container, you can read it easily. Please try it and let me know if you have any issues.
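The idea in this thread can be sketched as a single task function: mount (or COPY) the parquet file into the container, then read it with pandas and write it to Postgres. All names here (paths, connection URI, table) are placeholders, and pandas/SQLAlchemy are assumed to be installed in the Airflow image:

```python
def load_parquet_to_postgres(
    parquet_path: str = "/opt/airflow/data/input.parquet",  # file mounted into the container
    conn_uri: str = "postgresql://user:pass@postgres:5432/mydb",  # placeholder URI
    table: str = "my_table",
) -> int:
    """Read a parquet file and append its rows to a Postgres table.

    Returns the number of rows written. Imports are kept local so this
    module can be parsed even where pandas/sqlalchemy are absent.
    """
    import pandas as pd
    from sqlalchemy import create_engine

    df = pd.read_parquet(parquet_path)
    engine = create_engine(conn_uri)
    df.to_sql(table, engine, if_exists="append", index=False)
    return len(df)

# In a DAG this would be wrapped in a PythonOperator (or @task), e.g.:
#   PythonOperator(task_id="load_parquet", python_callable=load_parquet_to_postgres)
```

The key point from the thread stands either way: the file must be visible inside the Airflow container (via a bind mount in docker-compose or a COPY in the Dockerfile) before the task can read it.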
@jeffersonsilvadiniz7575 · 8 months ago
Solved my problem! I appreciate it.
@sumitkumar2955 · 8 months ago
Thanks! Please share and subscribe 🤠
@narcis.nedelut · 9 months ago
Thank you! It is working. I really appreciate your video and support! 👍
@sumitkumar2955 · 9 months ago
Thanks 🙏
@parisaemkani5730 · 9 months ago
Can I retrieve data from Twitter with a basic developer account? I need data from 01/01/2024 until 31/01/2024. I would be grateful if someone could help me with this; I am very new to the Twitter developer platform and do not know how to scrape data from Twitter for the 30 days of January.
@sumitkumar2955 · 9 months ago
Unfortunately, my API key is not working.
@bommanasravan8279 · 9 months ago
After so many videos, this one worked for me.
@sumitkumar2955 · 9 months ago
Happy to hear that it helped you. Please share and subscribe 🙏
@SouhaCherif-vp2vw · 9 months ago
Is it working in a container, or do I have to specify that?
@sumitkumar2955 · 9 months ago
This is AWS-managed Airflow; you don't have to specify anything.
@joshisaiah2054 · 10 months ago
Thanks for the tutorial. I tried to use your steps to send value_counts via SES, but it wasn't displaying well. The email was delivered, but the result was not well formatted. I was sending df[['col1','col2']].value_counts().to_frame(). Any hints?
@joshisaiah2054 · 10 months ago
I actually solved it: I transposed the result after converting it to a new DataFrame.
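What the commenter describes can be sketched like this: `value_counts()` returns a Series with a MultiIndex, which renders poorly in a plain-text email; converting it to a frame, flattening the index (or transposing, as they did), and sending HTML tends to format cleanly. Column names and the helper itself are illustrative, and pandas is assumed to be installed where this runs:

```python
def format_counts_for_email(df, cols=("col1", "col2")) -> str:
    """Return an HTML table of value counts suitable for an SES email body."""
    counts = (
        df[list(cols)]
        .value_counts()
        .to_frame(name="count")
        .reset_index()       # flatten the MultiIndex into ordinary columns
    )
    return counts.to_html(index=False)

# With SES, this string would go into the message body as
# {"Body": {"Html": {"Data": html}}} so the result renders as a real table.
```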
@sumitkumar2955 · 10 months ago
Thanks for watching. Happy that it helped you. Please share and subscribe. You can connect with me in case of any issue.