I'm only nine minutes in and this is clearly an excellent presentation. Clear, concise, and sufficiently broad for those coming fresh to the topic. Well done! Just finished. If you want to know about AWS and Big Data, this is the most worthwhile 45-minute presentation you will watch!
@yojackleon 4 years ago
That was so good, at the end I clapped from my desk at home :)
@samuel_william 2 years ago
Well, I am a big data architect and recently joined a fintech; this video helped me understand AWS services and how to use them for my project.
@shahrukhmushtaq603 3 years ago
Extremely helpful, a succinct overview of AWS offerings. Thanks for this piece.
@ravikiran3788 2 years ago
succinct presentation on Big Data services in AWS
@mrsaip 2 years ago
Fantastic presentation.
@Bill0102 9 months ago
I'm completely absorbed in this. I had the privilege of reading something similar: "Mastering AWS: A Software Engineers Guide" by Nathan Vale.
@bjorn1409 1 year ago
The title is very misleading... This video is basically a sales presentation of all the Amazon services. If you are new to Big Data, this video will not really help you.
@awssupport 1 year ago
Sorry to hear about this friction. Our teams care about opportunities you spot for improvement, which is why I've reached out internally to have your feedback reviewed and considered further. If you're interested, check out this Big Data info: go.aws/47E9x6H. For more tutorials and training on the topic, please explore the options here: go.aws/3GuQuQ2. ^JM
@et4493 8 months ago
Yup
@pareshpatel2377 3 years ago
As far as I know, and as the AWS documentation mentions, Kinesis Data Streams and Firehose are not storage services. They are used to ingest streaming data, which is then either pulled or pushed into storage services such as S3, Redshift, or others. It seems this video has mixed these concepts together and is confusing.
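Something like this rough boto3 sketch shows the pattern being described (the delivery stream name and the record contents are made up, and the stream is assumed to already exist with an S3 destination): Firehose only ingests the record, and the configured S3 bucket is where the data actually ends up stored.

```python
import json

import boto3

# Firehose only ingests and buffers records; the actual storage happens at the
# delivery stream's configured destination (an S3 bucket in this sketch).
firehose = boto3.client("firehose", region_name="us-east-1")

record = {"user_id": 42, "event": "page_view"}

# PutRecord hands the record to Firehose, which batches it and delivers it to
# the S3 destination configured on the (hypothetical) delivery stream.
firehose.put_record(
    DeliveryStreamName="clickstream-to-s3",  # hypothetical stream name
    Record={"Data": (json.dumps(record) + "\n").encode("utf-8")},
)
```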
@tom_see 3 years ago
Kinesis streams can store the streaming data for as long as we need (based on the configured retention period). Of course, 99% of use cases want to read this data from the stream ASAP, but it is still technically storing the data: for example, we can reprocess the same Kinesis stream of data over and over if we choose, from the very beginning, as long as the data is still retained.
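A minimal sketch of that reprocessing idea, assuming a hypothetical single-shard stream named "events": a TRIM_HORIZON shard iterator starts reading at the oldest record still retained, which is what lets the same stream be replayed from the beginning.

```python
import boto3

kinesis = boto3.client("kinesis", region_name="us-east-1")

STREAM = "events"  # hypothetical stream name

# Optionally extend how long records are retained (default is 24 hours).
# Illustrative only: this call fails if retention is already >= 168 hours.
kinesis.increase_stream_retention_period(
    StreamName=STREAM,
    RetentionPeriodHours=168,  # keep data for 7 days
)

# TRIM_HORIZON positions the iterator at the oldest retained record,
# so the whole retained window can be reprocessed from the beginning.
shard_id = kinesis.describe_stream(StreamName=STREAM)["StreamDescription"]["Shards"][0]["ShardId"]
iterator = kinesis.get_shard_iterator(
    StreamName=STREAM,
    ShardId=shard_id,
    ShardIteratorType="TRIM_HORIZON",
)["ShardIterator"]

while iterator:
    response = kinesis.get_records(ShardIterator=iterator, Limit=100)
    for record in response["Records"]:
        print(record["Data"])  # reprocess each retained record
    if not response["Records"]:
        break  # caught up with the retained data
    iterator = response.get("NextShardIterator")
```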