Comments
@stevejob677 5 hours ago
Great session.
@leeoswald9799 12 hours ago
Another terrible Indian accent video.
@kolawolegabriel6558 2 days ago
@Pythoholic, you are a gifted teacher. I appreciate your videos. Could you please make a detailed video on AWS S3 Access Points with a full practical demo?
@kanishkjohari2238 3 days ago
So nicely explained. I am just starting to get into hosting and this really helped me.
@abhijitjadhav4456 5 days ago
11 videos are still hidden; any update?
@Pythoholic 4 days ago
I will check and get back.
@abhijitjadhav4456 5 days ago
Not able to see the next video in this playlist.
@Pythoholic 4 days ago
I will update it. Thanks for the feedback.
@anwar6971 20 days ago
Too elaborate and very, very ambiguous.
@Pythoholic 12 days ago
Thanks for the feedback. Will try and improve.
@DaveLau0313 21 days ago
The bar is set too high. As a beginner, I can't follow this.
@c-bass413 25 days ago
Great demo.
@arfatbagwan48 1 month ago
"For each VPC that you want to associate with the Route 53 hosted zone, change the following VPC settings to true: enableDnsHostnames, enableDnsSupport." What about this? It needs a hands-on demo to understand the concept 😶
@ravinamohite9031 1 month ago
Best tutorial ever 😊 Thank you so much for teaching us from scratch 🙏🏻😊
@thelazychewresearch 1 month ago
Great video!
@priyatiwari6331 1 month ago
Wonderful explanation. Please make more videos on API Gateway, ECS, and Batch services.
@nerdy-zeig7774 1 month ago
So... it's just a cloud version of a jump server?
@sergioz7133 1 month ago
🤭 Promo sm
@nickwales4261 1 month ago
This was useful, but it would have been nice to see you actually make a request to the service.
@maddysunshine4173 1 month ago
Very helpful video.
@pulenteria 1 month ago
How do I withdraw my money from this site?? I had it in Trust and it moved to this site as if by magic.
@sidds09 1 month ago
Thank you. Please also cover the process of continuously updating the site, and do more projects.
@Pythoholic 1 month ago
Thanks sidds09
@JamesBrown-lq5nn 1 month ago
Good brief intro to hosted zones 👍
@venkatrao7868 1 month ago
Amazing explanation and demo!!
@rupaPrajapati-hg1ni 2 months ago
Hey, I had a quick question: can we still do this with Application Load Balancers, or do we have to use Network Load Balancers for service endpoint creation on the producer end?
@Pythoholic 1 month ago
As of recent updates, Application Load Balancers do support AWS PrivateLink. You can create an endpoint service in your VPC and specify an ALB as the service provider. This allows you to offer the applications behind your ALB privately to other VPCs through AWS PrivateLink. If there is any specific documentation you can point me to where they mention dropping support, I can help you better.
@pavan9076 2 months ago
Any specific reason we are using Lambda or SQS before SNS? Can't we send the payload directly to SNS?
@Pythoholic 2 months ago
You can definitely send payloads directly to Amazon SNS without using AWS Lambda or Amazon SQS. The choice to use services like Lambda or SQS before SNS often depends on the specific requirements of your architecture and the nature of the data processing involved. Here are some reasons why you might use AWS Lambda or Amazon SQS before sending a message to SNS:

1. **Data Processing with Lambda:**
   - **Complex Transformations:** If the payload needs to be processed or transformed before sending the notification, using Lambda is a common approach. For example, you might need to extract specific information from a larger dataset or format the message in a particular way.
   - **Conditional Logic:** Lambda allows you to implement complex logic to determine whether a notification should be sent. For instance, you might only want to trigger an SNS notification under certain conditions after inspecting the incoming data.
   - **Integration Logic:** Sometimes, the decision to send a notification via SNS depends on the results of integrating with other services. Lambda can handle these integrations (e.g., querying a database or calling an external API) before deciding to send a message to SNS.
2. **Decoupling with SQS:**
   - **Rate Limiting and Throttling:** If your system produces notifications at a high rate, you might use SQS to buffer these messages. This helps in managing throughput to SNS, especially if there are many subscribers or if the subscribers have rate limits.
   - **Reliability and Durability:** SQS provides a reliable queueing system that ensures that messages are not lost between your application and the notification service. If your system requires a high degree of reliability, using SQS as a buffer can help protect against data loss in case of transient errors.
   - **Retry Logic and Dead Letter Queues:** SQS can be used to manage retries in a more controlled manner. If a message fails to be processed or sent to SNS, it can be retried or moved to a Dead Letter Queue (DLQ) for further investigation. This helps in isolating problematic messages and ensures that your system remains resilient.
3. **Direct Use of SNS:**
   - **Simplicity:** If your application simply needs to send a notification with minimal processing, sending the payload directly to SNS is the simplest approach. This reduces the complexity and operational overhead of your architecture.
   - **Real-Time Notifications:** For scenarios where you need immediate notification without any delay, sending direct messages to SNS is preferable. This ensures that there is minimal latency between the event occurrence and the notification being sent out.
4. **Cost Considerations:**
   - Using Lambda or SQS introduces additional costs based on their usage (invocations for Lambda and the number of messages for SQS). Directly using SNS might be more cost-effective if the additional features provided by Lambda or SQS are not required for your use case.

In summary, while sending payloads directly to SNS is a common and straightforward approach, using Lambda or SQS as intermediaries can provide additional flexibility, processing capabilities, and reliability features based on your specific needs.
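For example, the "Conditional Logic" point can be sketched as a small pre-publish filter, the kind of check you would put in a Lambda sitting in front of SNS. The event fields, threshold, and function names here are purely illustrative, not from any AWS API:

```python
# Hypothetical pre-publish filter for a Lambda in front of SNS.
# Field names ("severity", "source", "detail") and the threshold are made up.

def should_notify(event: dict, min_severity: int = 3) -> bool:
    """Send a notification only for events severe enough to alert on."""
    return event.get("severity", 0) >= min_severity

def build_sns_params(event: dict) -> dict:
    """Shape the payload into Subject/Message parameters for an SNS publish."""
    return {
        "Subject": f"[ALERT] {event['source']}",
        "Message": f"severity={event['severity']}: {event['detail']}",
    }

event = {"source": "orders-api", "severity": 4, "detail": "error rate spike"}
if should_notify(event):
    params = build_sns_params(event)
    # A real Lambda would now call boto3's sns.publish(TopicArn=..., **params)
    print(params["Subject"])  # -> [ALERT] orders-api
```

The point of the sketch is that the decision and the message shaping live in code before SNS ever sees the payload; if neither is needed, publishing directly to SNS is simpler.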
@catchroniclesbyanik 2 months ago
Based on the formula, if I have a requirement of 3000 RCU and 1000 WCU, the table will initially create 2 partitions. But why 2, why not 1? One partition is sufficient for the requirement. Assume the data size is under 10 GB. Can anybody please explain?
@Pythoholic 2 months ago
DynamoDB might start with 2 partitions instead of 1 for your requirements of 3000 RCUs and 1000 WCUs to better balance the load and improve performance consistency, even if one partition can technically handle the capacity. This approach also enhances fault tolerance and operational flexibility by distributing the workload across multiple partitions from the start.
@catchroniclesbyanik 2 months ago
@@Pythoholic I think I have figured out why 2 partitions are needed and not 1. One partition can deliver either 3000 RCU or 1000 WCU or a mix of both (i.e. if it has 500 WCU it can have 1500 RCU). It is not possible for a partition to deliver both 3000 RCU and 1000 WCU. By this logic, if one partition delivers all the required 3000 RCU, one more partition is required to deliver the 1000 WCU. Since they are distributed equally among partitions, each partition will deliver 1500 RCU and 500 WCU.
@catchroniclesbyanik 2 months ago
The thing that is missing from most documentation online is that it's not 3000 RCU and 1000 WCU; it's 3000 RCU or 1000 WCU or a mix of both. By this logic, one partition won't be able to handle that load; you need two. I got this information from an AWS Events video posted on YT about DynamoDB.
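That reasoning can be written down as quick arithmetic. This is a rough model of the commonly cited per-partition limits (3000 RCU or 1000 WCU, or a proportional mix), not an official DynamoDB formula:

```python
import math

# Rough model: a partition can serve up to 3000 RCU *or* 1000 WCU, or a
# proportional mix, so the fraction of one partition a workload consumes is
# rcu/3000 + wcu/1000, and the table needs at least the ceiling of that.
def initial_partitions(rcu: int, wcu: int) -> int:
    return max(1, math.ceil(rcu / 3000 + wcu / 1000))

print(initial_partitions(3000, 1000))  # -> 2 (each serves 1500 RCU + 500 WCU)
print(initial_partitions(1500, 500))   # -> 1 (a mix one partition can handle)
```

With 3000 RCU and 1000 WCU the load fraction is 1.0 + 1.0 = 2.0 partitions, which matches the observation above that one partition cannot serve both maxima at once.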
@AbhayAR 2 months ago
Are you a man or James Bond?
@glennadams7047 2 months ago
Best explanation of endpoints! Well done, sir!!!
@narongritsrisawang3610 2 months ago
Very helpful, thank you :D <3
@kirodezno12 2 months ago
Great video, thanks! Keep up the good work!
@abibhavana 2 months ago
Do you have a CloudFront failover hands-on lab?
@Pythoholic 2 months ago
No sir, not yet. Will make one soon.
@abibhavana 2 months ago
@@Pythoholic yours is one of the best channels to learn AWS. Can you suggest some simple projects for beginners and intermediate learners?
@Pythoholic 2 months ago
Here are some simple AWS project ideas:

1. **Deploy a Static Website on S3:**
   - **Objective:** Learn how to use Amazon S3 to host static web resources like HTML, CSS, and JavaScript files.
   - **Key Services:** Amazon S3, AWS IAM, Route 53 (optional for custom domain).
   - **Steps:**
     1. Create an S3 bucket and enable static website hosting.
     2. Upload the static web files to the S3 bucket.
     3. Update the bucket policy to make the content publicly accessible.
     4. (Optional) Configure a custom domain using Route 53.
2. **Simple Web Application with DynamoDB:**
   - **Objective:** Develop a simple web application that uses DynamoDB to store data.
   - **Key Services:** AWS Lambda, Amazon API Gateway, Amazon DynamoDB, AWS IAM.
   - **Steps:**
     1. Create a DynamoDB table to store data (e.g., a simple CRUD for tasks or notes).
     2. Implement Lambda functions to handle CRUD operations on the DynamoDB table.
     3. Use API Gateway to create RESTful endpoints for the Lambda functions.
     4. Secure the application using appropriate IAM roles.
3. **Deploy a Python Flask App to Elastic Beanstalk:**
   - **Objective:** Learn how to deploy a simple Python Flask application using AWS Elastic Beanstalk.
   - **Key Services:** AWS Elastic Beanstalk, Amazon RDS (optional for a database).
   - **Steps:**
     1. Develop a simple Flask application on your local machine.
     2. Package the application with any dependencies and configuration files (`requirements.txt` and `.ebextensions`).
     3. Deploy the application to Elastic Beanstalk using the EB CLI or the AWS Management Console.
     4. (Optional) Add an RDS database and connect it to your application.
4. **Serverless Image Resizer:**
   - **Objective:** Create a serverless application that automatically resizes images uploaded to S3.
   - **Key Services:** AWS Lambda, Amazon S3, Amazon SNS or SQS, AWS IAM.
   - **Steps:**
     1. Create an S3 bucket to upload images.
     2. Implement a Lambda function triggered by S3 events to resize images using a library like Pillow.
     3. Store the resized images in a different S3 bucket or the same bucket with a different prefix.
     4. Configure any necessary permissions and roles for Lambda to access S3.
5. **Basic CloudWatch Dashboard:**
   - **Objective:** Set up a CloudWatch dashboard to monitor the performance and health of AWS services.
   - **Key Services:** Amazon CloudWatch.
   - **Steps:**
     1. Identify which metrics are important for your application or environment.
     2. Create a CloudWatch dashboard and add widgets to visualize these metrics.
     3. Set up CloudWatch alarms to notify you of any critical changes in the metrics.
6. **Simple Notification System:**
   - **Objective:** Implement a notification system using SNS to send messages to subscribed endpoints.
   - **Key Services:** Amazon SNS, AWS Lambda (optional), AWS IAM.
   - **Steps:**
     1. Create an SNS topic.
     2. Subscribe email addresses or phone numbers to the SNS topic.
     3. Publish messages to the topic either manually from the AWS console or programmatically using AWS SDKs.
7. **Lambda Function to Process Logs:**
   - **Objective:** Develop a Lambda function to process log files uploaded to S3.
   - **Key Services:** AWS Lambda, Amazon S3, AWS IAM.
   - **Steps:**
     1. Create an S3 bucket for log file uploads.
     2. Write a Lambda function to parse log files (e.g., for error monitoring or usage statistics).
     3. Set up an S3 event notification to trigger the Lambda function when new logs are uploaded.
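To give a taste of project 7, the log-parsing core can be prototyped entirely locally before wiring it to S3 and Lambda. The log format and function name below are made up for illustration; a real handler would read the object body from S3 (via boto3) using the bucket and key in the triggering event:

```python
from collections import Counter

# Illustrative parser for project 7: count ERROR lines per component.
# In a real Lambda this text would come from the S3 object named in the
# triggering event; it is inlined here so the logic runs locally.
def count_errors(log_text: str) -> Counter:
    counts = Counter()
    for line in log_text.splitlines():
        parts = line.split(" ", 2)  # assumed format: "LEVEL component message"
        if len(parts) >= 2 and parts[0] == "ERROR":
            counts[parts[1]] += 1
    return counts

sample = (
    "ERROR auth token expired\n"
    "INFO web request ok\n"
    "ERROR auth timeout\n"
    "ERROR db connection refused"
)
print(count_errors(sample))  # Counter({'auth': 2, 'db': 1})
```

Keeping the parsing in a pure function like this makes the Lambda handler a thin wrapper and lets you unit-test the logic without any AWS resources.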
@ZobiaKhan-mc1fs 2 months ago
19:00 bookmark
@vivekgc2051 2 months ago
Please provide the slides for all the videos ... please.
@saleemnashawaty3179 2 months ago
Thank you, Chief.
@StMarc-ow4wz 2 months ago
This video should be removed to avoid wasting others' time.
@Pythoholic 2 months ago
Point taken. Will try and improve. Sorry for the inconvenience.
@ankitnegi163 2 months ago
Nice video.
@root-User 2 months ago
Lovely explanation. This has cleared all my doubts related to graph databases. God bless you.
@TechJedi007 2 months ago
Not sure why, but I created the alarm, created a topic, and subscribed to the topic. I'm still unable to get any notifications after running the stress test, even though the thresholds I set were really low.
@sachinadi6715 2 months ago
This is an eye-opener when it comes to Ansible. Thanks for making this content, @pythoholic.
@Pythoholic 2 months ago
Glad you enjoy it!
@MrLala12321 2 months ago
Beautiful job, really appreciate it. Subscribed!
@FaltuKaam-vq7ko 2 months ago
Nicely explained.
@StMarc-ow4wz 2 months ago
This video could have been completed in 3 minutes if the creator did not keep repeating himself.
@Pythoholic 2 months ago
Thanks for the feedback. I will surely keep this in mind.
@agni4869 3 months ago
Awesome course and a very clear explanation of AWS. So helpful.
@jagan1957 3 months ago
I don't know where to start and where to end, but in this video you have talked about a lot of pertinent and pragmatic things that are relatable to any professional in the IT world. When I watched a few videos related to the AWS course, I realized that your videos were a class apart from the rest, giving vivid visualization. You are an inspiration for me. I graduated in 2011 and joined the Indian Navy as an officer, but for some reason had to quit and come back home. I took a three-month rest, then found a good job in Bangalore and started my career in 2012. I also had a lot of ups and downs, and more importantly, I too was laid off from my startup (Silicon Valley) a few months back as part of a big downsizing exercise. But as I had the skills and self-confidence, I was jumping with joy on the day of the layoffs that I had broken free of the shackles of this company, rather than feeling dejected about it. I am currently pursuing the AWS entry-level certifications and finding your content really great. I wish you all the very best and hope that you will be one of the finest content creators of this decade, as there is huge potential for online education.
@Pythoholic 2 months ago
Thanks a lot, Jagan, for the wonderful story you shared. It's amazing that you have kept going after facing so many challenges in your life. I am sure you will get everything you want from life. I am really happy that I could help you in any way possible; maybe my job as a content creator was just that. I will try my best to provide good content and share what I know to the best of my abilities. Thanks again :)
@meghanathms823 3 months ago
How do I fetch data from an AWS Timestream table into my React JS website?
@Pythoholic 3 months ago
I got these steps from a website; please check if they help.

1. **Set up AWS SDK:** First, make sure you have the AWS SDK for JavaScript installed in your React project. You can install it using npm or yarn:
```bash
npm install aws-sdk
```
2. **Configure AWS Credentials:** You need to configure your AWS credentials to access Timestream from your React app. You can do this by setting up the `AWS.config` object:
```javascript
import AWS from 'aws-sdk';

AWS.config.update({
  region: 'us-east-1', // Replace with your region
  accessKeyId: 'YOUR_ACCESS_KEY_ID',
  secretAccessKey: 'YOUR_SECRET_ACCESS_KEY'
});
```
For security reasons, it's recommended to use environment variables or AWS IAM roles instead of hardcoding your credentials.
3. **Set up Timestream Query Client:** Create a Timestream Query client to execute queries on your Timestream table.
```javascript
const timestreamQuery = new AWS.TimestreamQuery();
```
4. **Query Timestream Table:** Use the Timestream Query client to execute SQL queries on your Timestream table and fetch the data.
```javascript
const params = {
  QueryString: 'SELECT * FROM your_database.your_table LIMIT 10' // Replace with your query
};

timestreamQuery.query(params, (err, data) => {
  if (err) {
    console.error('Query Error:', err);
  } else {
    console.log('Query Results:', data);
    // Process and display the data in your React component
  }
});
```
5. **Display Data in React Component:** Once you have the data, you can display it in your React components as needed. You might want to store the data in the component's state and render it in the JSX.
```javascript
import React, { useState, useEffect } from 'react';

const MyComponent = () => {
  const [timestreamData, setTimestreamData] = useState([]);

  useEffect(() => {
    // Fetch Timestream data and set it to state
    timestreamQuery.query(params, (err, data) => {
      if (!err) {
        setTimestreamData(data.Rows); // Assuming data is in data.Rows
      }
    });
  }, []);

  return (
    <div>
      {timestreamData.map((row, index) => (
        <div key={index}>
          {/* Render your data here */}
        </div>
      ))}
    </div>
  );
};

export default MyComponent;
```
Remember to replace placeholders like `YOUR_ACCESS_KEY_ID`, `YOUR_SECRET_ACCESS_KEY`, `your_database`, and `your_table` with your actual AWS credentials and Timestream table details. Also, make sure to handle error cases and secure your AWS credentials properly.
@meghanathms823 3 months ago
@@Pythoholic thank you bro, it's working now.
@udaykumar-tb5kn 3 months ago
For SAA-C03, will this playlist be helpful, or do we have to do anything extra? Please help, bro.
@Pythoholic 3 months ago
The topics are basically cloud concepts. Please refer to the SAA-C03 exam guide and follow the course for the individual topics that you wish to learn. I hope that helps.
@udaykumar-tb5kn 3 months ago
SAA-C03 is the latest certificate, right? Is there anything extra we need to study for this exam? How is this playlist helpful for clearing the latest SAA-C03? Please reply.
@Pythoholic 3 months ago
Please refer to the latest syllabus doc for the current exam pattern, and follow the course for the content and the topics.
@udaykumar-tb5kn 3 months ago
@@Pythoholic OK, thanks for the reply. Is there any extra service knowledge we need to gain for the latest one? That is my question.
@Pythoholic 3 months ago
You need to cover the topics in the new one as per the exam syllabus. These videos are a bit old, so you can refer to them for your concept knowledge and read more about each service in the documentation. I might not be able to share the complete list of changes.
@udaykumar-tb5kn 3 months ago
@@Pythoholic Please suggest how I can pass the certification by following your playlist. I don't know what to study extra to clear it. Please help.
@udaykumar-tb5kn 1 month ago
Please reply.
@udaykumar-tb5kn 3 months ago
Any changes in AWS for 2024? If we prepare and follow this playlist, will we be able to attempt the 2024 Certified Solutions Architect exam? Please help.
@Pythoholic 3 months ago
Hello Uday, the concepts will be the same, but the domain weightings of the exam topics will be different.
@yusufnar 3 months ago
Is this AWS marketing? 😂😂
@yusufnar 3 months ago
All you said about performance is a big lie.
@Pythoholic 3 months ago
Thanks for the feedback, Yusuf. I will update it as per your feedback.