Want to learn more Big Data technology courses? You can get lifetime access to our courses on the Udemy platform. Visit the link below for discounts and coupon codes. www.learningjournal.guru/courses/
@NIKUNJ2578 · 5 years ago
As Albert Einstein once said, "If you cannot explain something in simple terms, you have not understood it." This tutorial explains Kafka concepts in such simple terms that anybody can understand. Thank you very much for uploading the video! Subscribed and shared with friends!
@DeepakPatel-xl3tr · 2 years ago
great
@mohamedmuhad4827 · 1 year ago
How did the consumer work in this example when it did not have the latest version of the ClickRecord class? The consumer will not be able to set the newly added properties on the older version of the ClickRecord object, which was generated using the old schema.
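For context, Avro handles this case through schema resolution: the deserializer decodes the bytes using the writer's (new) schema, then projects the record onto the consumer's (old) reader schema, silently dropping fields the old ClickRecord class does not know about and filling defaults for anything missing. A minimal pure-Python sketch of that projection step (field names here are hypothetical, not taken from the video):

```python
# Conceptual sketch of Avro schema resolution: project a record written
# with a newer schema onto an older reader schema. Fields unknown to the
# reader are dropped; fields the writer omitted fall back to the reader
# schema's defaults.

def project(record, reader_schema):
    """reader_schema maps field name -> default value."""
    out = {}
    for name, default in reader_schema.items():
        out[name] = record.get(name, default)  # default if writer omitted it
    return out  # extra writer-only fields are simply ignored

# The new producer writes an extra 'referrer' field the old consumer never had.
new_record = {"session_id": "s1", "url": "/home", "referrer": "google"}
old_reader = {"session_id": "", "url": ""}  # the old ClickRecord's fields

print(project(new_record, old_reader))
```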
@drakezen · 6 years ago
Since you mention Confluent, it would be nice to have some sessions on what Confluent adds compared to Apache Kafka, and also on the usage of Confluent Control Center and how that might be useful for managing Confluent Kafka services.
@harishmi32007 · 5 years ago
Thank you! It's very easy to understand what you teach.
@SeekingHorizon · 4 years ago
Do we need Confluent Kafka for the schema registry? How do we achieve it with Apache Kafka?
@foruvasanth · 5 years ago
Do we need to create a new consumer and producer every time the schema changes?
@SunilKumar-uw2pf · 4 years ago
Hi Sir, I am able to produce multi-schema (e.g. Product and Customer) messages through AvroProducer using multiple producers, but I cannot find any API that helps us consume multi-schema (Product, Customer) messages from the same topic.
@MyTtest · 4 years ago
Thank you! I watched this video and part one on the same topic, but I am stuck since I need to implement a .NET solution (in C#) for Kafka/Avro... It looks like Confluent does not have Kafka for Windows (only if I use containers or run it in some Linux mode, etc.). It would be great if you had some videos on .NET/C# on the subject (it's really poorly documented...)
@jdang67 · 5 years ago
In my case, the data to be added to the Kafka topic is a complex object with inheritance. Without versioning, a JSON serializer should be enough. What is the best approach to deal with existing Java classes with inheritance?
@ScholarNest · 5 years ago
Thanks for putting this question. JSON and Avro are the most commonly used approaches. However, Avro does not support inheritance. I checked with JSON, and it works.
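To illustrate why JSON copes with inheritance (a generic sketch, not code from the tutorial): JSON simply captures whatever fields the concrete object has, and a type tag lets the consumer rebuild the right subclass on the other side. The class and field names below are made up for illustration:

```python
import json

class Event:
    def __init__(self, user):
        self.user = user

class ClickEvent(Event):  # subclass adds its own field on top of the parent's
    def __init__(self, user, url):
        super().__init__(user)
        self.url = url

def to_json(obj):
    # Embed a type tag so the consumer knows which subclass to reconstruct.
    payload = {"type": type(obj).__name__, **vars(obj)}
    return json.dumps(payload)

msg = to_json(ClickEvent("prashant", "/home"))
data = json.loads(msg)
print(data["type"], data["user"], data["url"])  # ClickEvent prashant /home
```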
@杨正云 · 7 years ago
As the earlier video mentioned, one partition is consumed by just one consumer. In this video you started two consumers, one for the old schema and one for the new schema, so I think they definitely consume different partitions. Does this mean I need to add new partitions for new schema changes? I can't let new consumers start working if there are no partitions available for them.
@杨正云 · 7 years ago
I realize that it is not always true that the number of consumers is the same as the number of partitions... So if the number of consumers is less than the number of partitions, I think adding new consumers is totally fine.
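Right: a group can have fewer consumers than partitions, and the group coordinator just spreads the partitions across whatever consumers exist, so each partition has exactly one owner while a consumer may own several. A toy round-robin sketch of that idea (not Kafka's actual assignor implementation):

```python
def assign(partitions, consumers):
    """Spread partitions round-robin across consumers: every partition gets
    exactly one consumer; a consumer may end up owning several partitions."""
    assignment = {c: [] for c in consumers}
    for i, p in enumerate(partitions):
        assignment[consumers[i % len(consumers)]].append(p)
    return assignment

# 3 partitions, 2 consumers: no idle consumer, no unread partition.
print(assign([0, 1, 2], ["old-schema-consumer", "new-schema-consumer"]))
```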
@matrixlnmi169 · 6 years ago
Sir, can we have a video on Cassandra?
@pheiroz6307 · 7 years ago
Hi, you mentioned that the schema ID is embedded in the message and the consumer/deserializer uses it to fetch the appropriate schema from the registry. Q: Can you please tell how/when the two schemas are registered in the Schema Registry?
@ScholarNest · 7 years ago
In the method explained in this example, we don't have to register the schema manually. It is taken care of by the serializer. The serializer is responsible for registering the new schema and embedding the schema ID in the message.
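For the curious, the Confluent serializer embeds that ID using a small framing: one magic byte (0), the 4-byte schema ID in big-endian, then the Avro-encoded payload. A pure-Python sketch of packing and unpacking that frame (the payload bytes here are a stand-in, not real Avro output):

```python
import struct

MAGIC_BYTE = 0

def frame(schema_id, avro_payload):
    # magic byte + 4-byte big-endian schema ID + Avro binary payload
    return struct.pack(">bI", MAGIC_BYTE, schema_id) + avro_payload

def unframe(message):
    magic, schema_id = struct.unpack(">bI", message[:5])
    assert magic == MAGIC_BYTE, "not a Confluent-framed message"
    # The deserializer would now look up schema_id in the registry
    # and decode the remaining bytes with that schema.
    return schema_id, message[5:]

msg = frame(42, b"\x02hi")       # stand-in payload, not real Avro bytes
sid, payload = unframe(msg)
print(sid, payload)              # 42 b'\x02hi'
```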
@rajeshbhupal6679 · 7 years ago
Your explanations are outstanding!!! Thank you so much. Sir, while answering one of the questions from the subscribers, you mentioned a use case where "Kafka Connect pulls data from an RDBMS (non-Avro) and sinks it into HDFS (Avro file)". Is it possible for you to create a video for this use case showing the steps from producer (RDBMS) --> Kafka broker --> consumer (Hadoop HDFS)?
@ScholarNest · 7 years ago
I will create a Kafka Connect tutorial shortly.
@rajeshbhupal6679 · 7 years ago
Thanks!!! May we all know your name? :-)
@ScholarNest · 7 years ago
:-)
@kristijankontus4846 · 5 years ago
@@ScholarNest Is there any chance you could do a tutorial on RabbitMQ in the future? Great job here, much appreciated.
@karthikkumar12 · 7 years ago
I started off with one video; it was so informative that I ended up going through all the videos (subscribed, of course). Great way to present things. I have a question: is it possible for a string/JSON message in Kafka to be deserialized to Avro by the consumer? string or JSON (non-Avro) writing apps -----> serializer ---> bytes ==> KAFKA ==> bytes ---> deserializer ---> Avro? I have a suggestion as well: can you make a video tutorial on the schema registry and/or Avro data?
@ScholarNest · 7 years ago
I already have a video on schema registry and Avro data.
@karthikkumar12 · 7 years ago
Thanks for your response. I will search for it. BTW, do you think this is possible: string or JSON (non-Avro) writing apps -----> serializer ---> bytes ==> KAFKA ==> bytes ---> deserializer ---> Avro?
@ScholarNest · 7 years ago
Yes, it is possible. However, what are you going to do with the Avro object in the end? I guess you want to store it in Hadoop or some other place. If that's what you are aiming for, Kafka Connect is the most suitable solution. I can see an analogy with a use case where Kafka Connect pulls data from an RDBMS (non-Avro) and sinks it into HDFS (Avro files).
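The conversion itself is just a deserializer decision on the consumer side: read the raw bytes from Kafka, parse the JSON, and build the target record in the fixed-field shape an Avro-generated class would impose. A library-free sketch of that consumer-side step (the field names are made up for illustration):

```python
import json

def json_bytes_to_record(raw, fields):
    """Consumer-side converter: raw JSON bytes from Kafka -> a fixed-field
    record, keeping only the fields the target schema declares, much as an
    Avro reader schema would."""
    data = json.loads(raw.decode("utf-8"))
    return {name: data[name] for name in fields}

# Producer wrote plain JSON; consumer projects it onto the target schema.
raw = b'{"id": 7, "city": "Bangalore", "junk": true}'
record = json_bytes_to_record(raw, ["id", "city"])
print(record)  # {'id': 7, 'city': 'Bangalore'}
```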
@markcberman · 7 years ago
Can someone share the GitHub link that is referred to in this video?