Kafka Tutorial Offset Management

  99,511 views

Learning Journal

7 years ago

Spark Programming and Azure Databricks ILT Master Class by Prashant Kumar Pandey. Fill out the Google Form for course inquiries.
forms.gle/Nxk8dQUPq4o4XsA47
-------------------------------------------------------------------
Data Engineering is one of the highest-paid jobs today.
It is going to remain among the top IT skills for years to come.
Are you in database development, data warehousing, ETL tools, data analysis, SQL, or PL/SQL development?
I have a well-crafted success path for you.
I will help you prepare for the data engineer and solution architect roles, depending on your profile and experience.
We created a course that takes you deep into core data engineering technology and helps you master it.
This course is for you if you are a working professional:
1. Aspiring to become a data engineer.
2. Looking to change your career to data engineering.
3. Looking to grow your data engineering career.
4. Preparing for the Databricks Spark Certification.
5. Preparing to crack Spark data engineering interviews.
ScholarNest is offering a one-stop integrated Learning Path.
The course is open for registration.
The course delivers an example-driven approach and project-based learning.
You will practice the skills using MCQs, coding exercises, and capstone projects.
The course comes with the following integrated services.
1. Technical support and Doubt Clarification
2. Live Project Discussion
3. Resume Building
4. Interview Preparation
5. Mock Interviews
Course Duration: 6 Months
Course Prerequisite: Programming and SQL Knowledge
Target Audience: Working Professionals
Batch start: Registration open
Fill out the below form for more details and course inquiries.
forms.gle/Nxk8dQUPq4o4XsA47
--------------------------------------------------------------------------
Learn more at www.scholarnest.com/
The best place to learn Data Engineering, Big Data, Apache Spark, Databricks, Apache Kafka, Confluent Cloud, AWS Cloud Computing, Azure Cloud, and Google Cloud: self-paced, instructor-led, and certification courses, plus practice tests.
========================================================
SPARK COURSES
-----------------------------
www.scholarnest.com/courses/s...
www.scholarnest.com/courses/s...
www.scholarnest.com/courses/s...
www.scholarnest.com/courses/s...
www.scholarnest.com/courses/d...
KAFKA COURSES
--------------------------------
www.scholarnest.com/courses/a...
www.scholarnest.com/courses/k...
www.scholarnest.com/courses/s...
AWS CLOUD
------------------------
www.scholarnest.com/courses/a...
www.scholarnest.com/courses/a...
PYTHON
------------------
www.scholarnest.com/courses/p...
========================================
We are also available on the Udemy platform.
Check out the link below for our courses on Udemy.
www.learningjournal.guru/cour...
=======================================
You can also find us on O'Reilly Learning
www.oreilly.com/library/view/...
www.oreilly.com/videos/apache...
www.oreilly.com/videos/kafka-...
www.oreilly.com/videos/spark-...
www.oreilly.com/videos/spark-...
www.oreilly.com/videos/apache...
www.oreilly.com/videos/real-t...
www.oreilly.com/videos/real-t...
=========================================
Follow us on Social Media
/ scholarnest
/ scholarnesttechnologies
/ scholarnest
/ scholarnest
github.com/ScholarNest
github.com/learningJournal/
========================================

Comments: 42
@ScholarNest (3 years ago)
Want to learn more Big Data technology courses? You can get lifetime access to our courses on the Udemy platform. Visit the link below for discounts and coupon codes. www.learningjournal.guru/courses/
@JemiloII (6 years ago)
Exactly what I was looking for and very clear of what to do, even though I am using a different programming language. Thank you for the knowledge shared.
@sreekanthmunigati6450 (7 years ago)
Excellent explanation. I liked your Kafka playlist: very clear, and it covers several points in each short video. Just play it for an hour and you will learn a lot about Kafka!
@ravishankar4521 (6 years ago)
Kudos to you, sir, for such a wonderful effort in making these videos.
@mukundsridhar4250 (4 years ago)
Awesome... that's the one word I can think of to describe your work.
@krushnachandrasahoo5034 (6 years ago)
Exemplary teaching, really excellent. I learnt a lot of things.
@jaineshmodi (6 years ago)
Sir, your videos are really good.
@rajareddy47444 (5 years ago)
Hi, thank you, sir, for sharing your knowledge. The way you take a concept and explain it is awesome. After watching your Spark videos, I got the confidence that I can face interviews. Thanks for showing how to do it in real time using GCP. Now I have started learning Kafka in parallel with Spark. These videos are done in the terminal. Can you please explain Kafka in real time with GCP as well? That would be a great advantage for those who are moving to this ecosystem. Thank you.
@drindianVlogs (6 years ago)
Nice tutorial. Learnt a lot of things here. Thanks a lot, sir, for the videos.
@kalpanabejawada2451 (5 years ago)
Nice explanation! Found it useful.
@boycy69 (3 years ago)
Great explanation, thanks!
@mohanbabubanavaram5211 (3 years ago)
Excellent explanation, covering various scenarios.
@decaalv (4 years ago)
Thank you Mr. Indian guy. I love you.
@ganesans4957 (1 year ago)
Great explanation. Started exploring your other videos.
@jkiran2020 (7 years ago)
Very nice content. Is the video streaming slow only for me?
@lokeshwarreddy3 (3 years ago)
Excellent, sir.
@vivektwentysix9064 (6 years ago)
Very good.
@saurav0777 (3 years ago)
What happens if commitAsync has been failing for a long time, say more than an hour, and we restart the application after that? It will reprocess the duplicate records. How do we handle such scenarios?
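For readers wondering about this scenario: after a restart, everything past the last successfully committed offset is redelivered, so a common defense is to make the processing itself idempotent. Below is a minimal sketch in plain Python; the record layout and function are hypothetical illustrations, not the Kafka client API.

```python
def process_batch(records, processed_keys, results):
    """Apply each record exactly once, deduplicating by (partition, offset)."""
    for partition, offset, value in records:
        key = (partition, offset)
        if key in processed_keys:   # duplicate from redelivery: skip it
            continue
        processed_keys.add(key)
        results.append(value)

# First delivery processes offsets 0-2; after a restart the broker
# redelivers offsets 1-2 plus a new record at offset 3.
seen, out = set(), []
process_batch([(0, 0, "a"), (0, 1, "b"), (0, 2, "c")], seen, out)
process_batch([(0, 1, "b"), (0, 2, "c"), (0, 3, "d")], seen, out)
print(out)  # each record was applied exactly once
```

In a real system the "already processed" set usually lives in the same store as the results (for example, as a unique key constraint), so the dedup check and the write are atomic.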
@jeetendrasingh1972 (3 years ago)
Great!
@rameshthamizhselvan2458 (5 years ago)
Is there any way to see the committed offsets?
@mayankwassup (5 years ago)
Hi, let's assume we have 3 topics with 1 partition each. There are 10 consumer groups, each group has a single consumer, and each consumer reads all three topics. In that scenario, how will the offsets be maintained for each consumer?
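Regarding the question above: Kafka stores the committed position per (consumer group, topic, partition), so the ten groups track their positions independently and never interfere with each other. An illustrative sketch in plain Python follows; the dict merely stands in for the internal __consumer_offsets topic and is not the real storage format.

```python
# One committed position per (group, topic, partition) triple.
offsets = {}

def commit(group, topic, partition, next_offset):
    """Record the next offset this group should read from this partition."""
    offsets[(group, topic, partition)] = next_offset

# Ten groups, each with one consumer reading all three single-partition
# topics; each group advances at its own pace (5 + g here, arbitrarily).
for g in range(10):
    for topic in ("topic-A", "topic-B", "topic-C"):
        commit(f"group-{g}", topic, 0, 5 + g)

# group-0 and group-9 keep separate positions on the same partition:
print(offsets[("group-0", "topic-A", 0)], offsets[("group-9", "topic-A", 0)])
```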
@vishvaspatel34 (7 years ago)
Thank you, sir, for the valuable information. What if, while reading from the broker, an exception occurs on the consumer side during an async commit, so the data needs to be read once again? With async commits, a higher offset may already have been committed by then. What happens in that case?
@ScholarNest (7 years ago)
Happy Holi :-)
@MohammadRasoolShaik (4 years ago)
Does Kafka allow committing a higher offset before committing a lower offset?
@pheiroz6307 (7 years ago)
Hi, in the example, why would one want to commit sync in case of an exception? If there is an exception mid-processing, i.e. one has processed only 10 messages out of the 50 pulled from Kafka, won't a commit sync incorrectly set the committed offset at 50? Since there was an exception, messages 11 to 50 never got processed, yet commit sync was invoked. Can you please clarify? Thanks!
@ScholarNest (7 years ago)
Yes, you are correct. Thanks for pointing this out. You watched carefully :-) I should use a particular offset in the commit sync instead of committing the current offset. Using the appropriate offset will make sure that only the 10 messages I successfully processed are committed.
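The fix described in this reply can be sketched as follows, in plain Python illustrating only the arithmetic. Kafka's convention is that the committed offset is the offset of the next record to consume, so after a failure mid-batch you commit last processed offset + 1; in the Java client you would pass that value to commitSync with a map of TopicPartition to OffsetAndMetadata.

```python
def safe_commit_position(fetched_offsets, processed_count):
    """Offset to commit = offset of the next unprocessed record."""
    if processed_count == 0:
        return fetched_offsets[0]   # nothing processed: recommit the start
    # Last processed record + 1, per Kafka's "next record" convention.
    return fetched_offsets[processed_count - 1] + 1

batch = list(range(100, 150))   # 50 fetched records, offsets 100-149
# Only 10 were processed before the exception, so we commit 110,
# not 150; offsets 110-149 will be redelivered and processed later.
print(safe_commit_position(batch, 10))
```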
@pheiroz6307 (7 years ago)
Sure :). Your tutorials are extremely clear in communication. Can you please do a tutorial on the use of the schema registry and its implementation when consuming Avro messages in Kafka? Thanks!
@ScholarNest (7 years ago)
I am working on a Schema Registry example with an Avro schema. Will upload it soon. Thanks.
@pheiroz6307 (7 years ago)
Awesome! Eagerly looking forward to it. Thanks.
@TheNayanava (7 years ago)
Sir, I have a doubt. In my code, I have set enable.auto.commit to true and the interval to 1000 ms, yet I get the following error: ERROR o.a.k.c.c.i.ConsumerCoordinator - Error UNKNOWN_MEMBER_ID occurred while committing offsets for group. From the error, I am assuming that the coordinator called for a partition rebalance before the consumer went to commit the offset, whereas the heartbeat interval is set to 3000. What could be the possible reason for this error?
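A common cause of UNKNOWN_MEMBER_ID is that the consumer was evicted from the group before it could commit: either its heartbeats stopped arriving within session.timeout.ms, or (on client versions 0.10.1 and later) the gap between poll() calls exceeded max.poll.interval.ms because batch processing took too long. The relevant settings are sketched below as a plain config dict; the values are examples only, not recommendations.

```python
# Illustrative consumer timing settings (example values, not a client call).
# If processing a batch can take long, raise max.poll.interval.ms or
# shrink the batch so the consumer is not expelled before committing.
consumer_config = {
    "session.timeout.ms": 10000,     # heartbeats must arrive within this window
    "heartbeat.interval.ms": 3000,   # how often heartbeats are sent
    "max.poll.interval.ms": 300000,  # max gap between poll() calls (0.10.1+)
}

# A usual rule of thumb: heartbeat interval at most 1/3 of session timeout.
print(consumer_config["heartbeat.interval.ms"] * 3
      <= consumer_config["session.timeout.ms"])
```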
@poorvivijay (5 years ago)
How does Kafka handle multiple consumers subscribing to the same topic (the single-publisher, multiple-consumers problem), given the restriction that only one consumer in a group can read a partition?
@karthikragunaths (7 years ago)
After 100 records are fetched by a consumer for processing, and 50 records are processed, if for whatever reason a rebalance is triggered in the broker, the consumer still processes the remaining 50 records. Is this correct? (I am assuming there is no way for the broker to stop the consumer, because the consumer is a stand-alone application and the communication is uni-directional from consumer to broker, not the other way.)
@ScholarNest (7 years ago)
You are correct in saying that the broker has no way of stopping the consumer, but the broker blocks the poll call during the rebalance.
@karthikragunaths (7 years ago)
Is the number of records fetched by poll() configurable?
@ScholarNest (7 years ago)
No
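A note for readers on newer clients: since Kafka 0.10, the consumer property max.poll.records does cap how many records a single poll() call returns, so the answer above applies only to older client versions. Shown here as a plain config dict for illustration, not a real client call; the addresses and names are placeholders.

```python
# Newer consumer clients (Kafka 0.10+) honor max.poll.records,
# which limits the number of records returned by one poll() call.
consumer_config = {
    "bootstrap.servers": "localhost:9092",  # placeholder broker address
    "group.id": "demo-group",               # hypothetical group name
    "max.poll.records": 100,                # at most 100 records per poll()
}
print(consumer_config["max.poll.records"])
```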
@Daniel_wwf (7 years ago)
At 1:08, shouldn't it be 99 for M100?
@ScholarNest (7 years ago)
It always points to the record that should be sent on the next request. For 100 records delivered, offsets 0-99 are done, and the next one is 100. So 100 is correct.
@Daniel_wwf (7 years ago)
I meant the following: we assumed having 100 messages in the partition, so the offsets should be 0-99. As M1 has offset 0, M100 should have offset 99.
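Both readings in this thread are consistent, and a quick sketch makes that visible: with 100 messages, the record offsets run 0 to 99 (M1 at offset 0, M100 at offset 99), while the consumer's committed position after reading them all is 100, the offset of the next record to fetch.

```python
messages = [f"M{i}" for i in range(1, 101)]           # M1 .. M100
offsets = {msg: i for i, msg in enumerate(messages)}  # record offsets 0..99

print(offsets["M1"], offsets["M100"])  # M1 sits at offset 0, M100 at 99
next_position = offsets["M100"] + 1    # committed position after reading all
print(next_position)                   # 100: offset of the next record
```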
@VishalYadav-di7ip (7 years ago)
Can we delete a specific offset?
@ScholarNest (7 years ago)
Why would you want to do that?
@VishalYadav-di7ip (7 years ago)
Sorry for the late reply... Sir, actually the interviewer asked me to do that.