Excellent tutorial on real-time scenarios. Avro serialization is one of the real-world scenarios used in projects and also an important topic in Kafka interviews. Thanks a lot Basant for the effort put in, much appreciated!!
@gopisambasivarao528210 ай бұрын
Thanks Basant. Appreciate your efforts. God bless you.🙂🙏 Every week I wait for your videos. You are my Guru. 😊 Your way of explaining is awesome.
@kalaiselvankesavel297110 ай бұрын
Thanks for making such a detailed video about schema registry!!
@2RAJ212 ай бұрын
Thank you. I learned more about the Kafka Docker Compose setup.
@ManishSingh-dj4yu10 ай бұрын
Thanks for this tutorial. Your contribution to the developer community is invaluable.
@aadiraj612610 ай бұрын
Thank you Basant bhai.. Lucid explanation, loved it👍
@rahulrajsaini567012 күн бұрын
Super video. Such an important topic explained so simply and so well. Thank you sir for all your efforts.
@rajyahoob4 ай бұрын
Nice explanation, and a lot of other things to learn as well, such as Docker. Thank you as always.
@malleswarrao38876 ай бұрын
Excellent, sir. No words to describe your knowledge sharing.
@user-0987-a10 ай бұрын
Surprisingly, I had this kind of requirement using an Avro object. Many thanks.
@Javatechie10 ай бұрын
Glad it helped!
@ilronin8045 ай бұрын
Wowww, awesome! Thank you very much, so helpful!
@8588714 ай бұрын
Bro your content is exceptionally great..
@sgr7ss2 ай бұрын
Nice explanation
@mysavingclub10 ай бұрын
Nice video, you are putting in a lot of effort.
@Imjamalvali10 ай бұрын
Thanks bro, please do more videos like this, please keep up your good work.
@AAlphonse6828 ай бұрын
Thank you for a wonderful tutorial
@anushbabu502310 ай бұрын
Thanks, I was looking for this one. Please cover Kafka Connect also 😊
@Javatechie10 ай бұрын
Sure buddy
@manjunathbabu66097 ай бұрын
"Let's get started" has a separate fan base..
@itsmeibrahimm3 ай бұрын
Excellent tutorial
@suman8528Ай бұрын
Great content, waiting for Kafka Connect and Kafka Streams.
@JavatechieАй бұрын
I have struggled a lot with Kafka Connect but with no results so far. Once I successfully run a source and sink POC, I will update you, buddy 👍
@ivanhomziak20 күн бұрын
Thanks!
@Deepakblg9710 ай бұрын
❤❤ Thank you so much sir 🙏
@armandoruizgonzalez10 ай бұрын
Thanks Basant
@avisulimanoff123110 ай бұрын
Thanks for your useful tutorials. Short question please: what is the difference between your first change to the Avro schema, where sending the Employee information without age/dob was validated successfully, and your second change, where adding the middle name failed validation? I didn't notice any difference in the process between the two changes...
@Javatechie10 ай бұрын
Check the video again, I have explained the difference in schema evolution in the Control Center UI.
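To make the difference concrete: with the registry's default BACKWARD compatibility rule, removing fields (like age/dob) still lets the new schema read data written with the old one, but adding a new field without a default does not. A minimal local check with Avro's own SchemaCompatibility API illustrates this; the Employee field names below are assumptions for illustration, not the exact ones from the video.

import org.apache.avro.Schema;
import org.apache.avro.SchemaCompatibility;

public class EmployeeSchemaEvolutionCheck {
    public static void main(String[] args) {
        // Old registered Employee schema (field names assumed)
        Schema v1 = new Schema.Parser().parse(
            "{\"type\":\"record\",\"name\":\"Employee\",\"fields\":["
          + "{\"name\":\"id\",\"type\":\"string\"},"
          + "{\"name\":\"firstName\",\"type\":\"string\"}]}");

        // Change 1: add middleName as an optional field WITH a default
        Schema withDefault = new Schema.Parser().parse(
            "{\"type\":\"record\",\"name\":\"Employee\",\"fields\":["
          + "{\"name\":\"id\",\"type\":\"string\"},"
          + "{\"name\":\"firstName\",\"type\":\"string\"},"
          + "{\"name\":\"middleName\",\"type\":[\"null\",\"string\"],\"default\":null}]}");

        // Change 2: add middleName as a required field WITHOUT a default
        Schema withoutDefault = new Schema.Parser().parse(
            "{\"type\":\"record\",\"name\":\"Employee\",\"fields\":["
          + "{\"name\":\"id\",\"type\":\"string\"},"
          + "{\"name\":\"firstName\",\"type\":\"string\"},"
          + "{\"name\":\"middleName\",\"type\":\"string\"}]}");

        // BACKWARD compatibility asks: can the NEW schema (reader) read data written with the OLD one (writer)?
        System.out.println(SchemaCompatibility
            .checkReaderWriterCompatibility(withDefault, v1).getType());    // COMPATIBLE
        System.out.println(SchemaCompatibility
            .checkReaderWriterCompatibility(withoutDefault, v1).getType()); // INCOMPATIBLE
    }
}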
@amsfuy8 ай бұрын
From 1:00 on you explain that in the plain Kafka world we need to build new producer and consumer apps once a class whose objects we send to Kafka changes. From 41:00 on you generate a new Employee class based on the changed schema and then rebuild the producer and consumer as well. From my understanding, using schemas we still need to build new producer and consumer apps, so this is not an advantage gained from the whole schema overhead. The only advantage I see is that we do not change the classes manually. Am I overlooking something?
@РоманПивоваров-ф7ш7 ай бұрын
I'm stuck on the same point, still don't get it...
@Javatechie7 ай бұрын
No, your understanding is correct. It means we no longer need to rely on data classes like POJOs/events; even if the producer modifies the payload structure, my consumer will not break.
@kevinameda27117 ай бұрын
@@Javatechie I used everything the same as you, but when it came to evolving the schema and dropping some of the fields, I got a schema-change error 409. I am wondering why, because I used the same properties as you. Also, another thing: in the dashboard I am not seeing the messages even though the schemas and the topics are there. I am talking about port 9092/clusters/**, but 8081/subjects/** is showing.
@kevinameda27117 ай бұрын
These are the errors:

org.apache.kafka.common.errors.InvalidConfigurationException: Schema being registered is incompatible with an earlier schema; error code: 409

2024-04-23T09:29:46.988+03:00 ERROR 12236 --- [apache-schema-registry] [nio-8888-exec-1] o.a.c.c.C.[.[.[/].[dispatcherServlet] : Servlet.service() for servlet [dispatcherServlet] in context with path [] threw exception [Request processing failed: org.springframework.kafka.KafkaException: Send failed] with root cause

org.apache.kafka.common.errors.InvalidConfigurationException: Schema being registered is incompatible with an earlier schema; error code: 409
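For anyone hitting the same 409: it means the subject's compatibility rule (BACKWARD by default) rejected the new schema, typically because a changed or added field has no default value. One option is to ask the registry up front whether a candidate schema would be accepted before the producer fails. A rough sketch, assuming a recent io.confluent:kafka-schema-registry-client and the default <topic>-value subject naming (the topic name and candidate schema below are assumptions):

import io.confluent.kafka.schemaregistry.avro.AvroSchema;
import io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient;
import io.confluent.kafka.schemaregistry.client.SchemaRegistryClient;

public class CheckBeforeRegistering {
    public static void main(String[] args) throws Exception {
        SchemaRegistryClient client =
            new CachedSchemaRegistryClient("http://localhost:8081", 10);

        // Candidate Employee schema that drops fields / adds a required one (content assumed)
        String candidate = "{\"type\":\"record\",\"name\":\"Employee\",\"fields\":["
            + "{\"name\":\"id\",\"type\":\"string\"},"
            + "{\"name\":\"middleName\",\"type\":\"string\"}]}";

        // Asks the registry whether this schema satisfies the subject's compatibility rule
        boolean ok = client.testCompatibility("employee-topic-value", new AvroSchema(candidate));
        System.out.println("Would register without a 409: " + ok);
    }
}

If the change is intentional and old data no longer matters, the other route is relaxing the compatibility level for that subject (for example from the Control Center UI or the registry's /config endpoint) rather than changing the schema.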
@tusharrawal88510 ай бұрын
Hello sir, I have been following you for the last two years and you are awesome. Could you please help me with one interview question on threading: in your current project, where are you using threads, with an example?
@kanaillaurent5264 ай бұрын
Very helpful!
@utbhargav4 ай бұрын
Thank you so much. Why does the Avro-schema-generated DTO have @Deprecated on the attributes?
@dattatraybharde290210 ай бұрын
Nice video ❤
@mohammadturabali38705 ай бұрын
Awesome
@suprajainamadugu70839 ай бұрын
Thanks for the detailed information. But how do we use different versions on the consumer side while deserializing?
@arunvijay227910 ай бұрын
If the consumer is in a different microservice, I need to generate the Avro class for all the consumers, right? In future, if we need to add a new field, we first need to add a default such as empty; is my understanding correct?
@Javatechie10 ай бұрын
Yes, that's correct, but if your consumer is a different project then you only need to get the updated schema from the producer, that's it.
@saravanakumars5210 ай бұрын
Yes, exactly, because whatever version of the Avro object the producer is sending, the consumer is going to consume that object as per its version in the schema registry when deserializing.
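To illustrate that point: every record carries the id of the exact schema version it was written with, and the Confluent Avro deserializer fetches that version from the registry at read time. A minimal consumer-side sketch in the Spring Boot style the video uses; the bootstrap address, group id and registry URL are assumptions:

import java.util.HashMap;
import java.util.Map;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.core.ConsumerFactory;
import org.springframework.kafka.core.DefaultKafkaConsumerFactory;

import io.confluent.kafka.serializers.KafkaAvroDeserializer;

@Configuration
public class AvroConsumerConfig {

    @Bean
    public ConsumerFactory<String, Object> consumerFactory() {
        Map<String, Object> props = new HashMap<>();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "employee-group");          // assumed group id
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, KafkaAvroDeserializer.class);
        // Each record carries the id of the schema version it was written with;
        // the deserializer fetches that version from the registry when reading.
        props.put("schema.registry.url", "http://localhost:8081");
        // Map the decoded record onto the generated Employee class instead of a GenericRecord
        props.put("specific.avro.reader", true);
        return new DefaultKafkaConsumerFactory<>(props);
    }
}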
@santhoshg53427 ай бұрын
@@Javatechie If the producer sends a new schema, does the consumer need to generate the DTO files on the consumer end before it consumes? Until the new changes are deployed to prod, does it use the old schema? How exactly does schema versioning work for the consumer without any break if the consumer is a separate external service?
@vverma0115 ай бұрын
Could you please discuss more on Forward and Backward compatibility configurations.
@deekandau45965 ай бұрын
This is awesome content, thanks. One question: how do you share the schema across multiple microservices?
@Javatechie5 ай бұрын
It should be stored in the registry.
@deekandau45965 ай бұрын
@@Javatechie Sorry, I didn't get which registry you are talking about. Could you elaborate further? Can we store it in something like a config server? Just thinking...
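The registry being referred to is the Schema Registry itself (the service on port 8081 in the video): producers register each schema version there, and other services can pull it from there instead of copying the .avsc around or pushing it to a config server. A small sketch, assuming the Confluent client and the default <topic>-value subject name (topic name is an assumption):

import io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient;
import io.confluent.kafka.schemaregistry.client.SchemaMetadata;
import io.confluent.kafka.schemaregistry.client.SchemaRegistryClient;

public class FetchEmployeeSchema {
    public static void main(String[] args) throws Exception {
        SchemaRegistryClient client =
            new CachedSchemaRegistryClient("http://localhost:8081", 10);

        // Default subject naming strategy is <topic>-value
        SchemaMetadata latest = client.getLatestSchemaMetadata("employee-topic-value");

        System.out.println("Version : " + latest.getVersion());
        System.out.println("Schema  : " + latest.getSchema()); // the registered .avsc JSON
    }
}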
@arunabhtiwari47716 ай бұрын
Hi Basant, what is the default compatibility for the Avro schema registry?
@rathinmaheswaran7 ай бұрын
Hi Sir, I need to convert an input XML file to an Avro schema dynamically, instead of having it under the resources folder, and then build the Avro object and send it to a Kafka topic. Can you please help me with this?
@pradeeptamishra92252 ай бұрын
Sir, I have one question. If we put the producer and consumer in separate projects, then we should also keep the .avsc file in the consumer project. Then any change to the payload structure on the producer side needs the same change in the consumer-side Avro file, so in the end the consumer jar has to be redeployed to the server. So what is the benefit of the schema registry? Please answer.
@Javatechie2 ай бұрын
Actually this file should be placed in some common or centralized place like a config server or S3.
@letsCherishCoding10 ай бұрын
Nice explanation. Just have one doubt: can we use this when we are storing employee info in a database like Cassandra, as the database schema is fixed there? And if we want to accommodate new fields we will have to add them manually.
@Javatechie10 ай бұрын
But in the db schema you need to modify the field.
@letsCherishCoding10 ай бұрын
@@Javatechie Yes, I just wanted to know a couple of real-life use cases for the Avro schema.
@Javatechie10 ай бұрын
@@letsCherishCoding The purpose of Avro is to avoid failures during schema evolution. Next, you can think of any e-commerce or food delivery app that frequently makes changes to its payload.
@letsCherishCoding10 ай бұрын
@@Javatechie got it. Thanks
@LakshmiprasadKota7 ай бұрын
Nice tutorial, but if you demonstrated this concept by creating two different projects (producer and consumer) then it would be clearer and perfect.
@Javatechie7 ай бұрын
Yes, I agree, but to reduce the length of the video I kept both in the same project. What's the challenge here in separating the consumer into a different project?
@mithileshchandra20726 ай бұрын
@@Javatechie I think we need to generate the bean classes using the Avro schema in both the producer and consumer applications and then run both apps. I thought that if we separate the producer and consumer and change the schema in the producer, it would be reflected in the consumer app without a re-run, but it seems that's not possible; we have to re-run both apps. Please correct me if I'm wrong?
@Javatechie6 ай бұрын
You are correct; you only need to build your app again, that's all you need to do.
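One nuance worth adding: the rebuild is only forced because the consumer deserializes into the generated Employee class. A consumer that reads a GenericRecord (leaving specific.avro.reader at its default of false) keeps working across schema changes without regenerating anything, at the cost of accessing fields by name. This is a common alternative pattern, not what the video demonstrates; the topic, group id and field name below are assumptions, and the listener container is assumed to be wired with the Confluent Avro deserializer as in the video's config:

import org.apache.avro.generic.GenericRecord;
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Component;

@Component
public class EmployeeGenericListener {

    // With specific.avro.reader=false (the default) the Avro deserializer hands back
    // a GenericRecord built from whichever schema version the producer registered,
    // so this consumer does not need a regenerated Employee class after schema changes.
    @KafkaListener(topics = "employee-topic", groupId = "employee-generic-group") // names assumed
    public void listen(GenericRecord record) {
        Object firstName = record.get("firstName");   // field name assumed for illustration
        System.out.println("Received employee, firstName=" + firstName);
    }
}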
@supriyakavuri19888 ай бұрын
Hi Sir, how can we achieve this kind of versioning in open-source Kafka?
@Javatechie8 ай бұрын
Not possible supriya
@srinivastadimalla12323 ай бұрын
Any help on this is appreciated, Sir. I subscribed to your channel now and would like to continue the membership, and thank you very much for all the knowledge you are sharing with the community. How do I make it work with a build.gradle file, Sir?
@Javatechie3 ай бұрын
Hello Srinivas, thank you for following Javatechie. What's the problem you are facing with Gradle?
@CanalGeekDev2 ай бұрын
@@Javatechie Hello, can you please demonstrate how to set these dependencies using Gradle instead?
@aditvikramsingh192910 ай бұрын
Can you also cover Kafka Connect and Kafka Streams?
@Jothianand-g1v4 ай бұрын
Great tutorial! One question: do we need to add any SSL-related configuration when using the Confluent Cloud schema registry (https)? I'm getting the below error when my Spring Boot app tries to connect to the schema URL. SSLHandshakeException - PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target
@Javatechie4 ай бұрын
Yes, you need to add the keystore and truststore certificates in the producer and consumer configuration.
@Jothianand-g1v4 ай бұрын
@@Javatechie Thanks for the quick response
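For reference, the PKIX error means the JVM does not trust the certificate chain presented by the registry endpoint, so the registry client needs to be pointed at a truststore (a keystore is only needed if the endpoint requires mutual TLS). A rough sketch of where those settings go, using the standard schema.registry.*-prefixed keys of the Confluent serializer; all URLs, paths and credentials are placeholders, and the same schema.registry.* entries belong in the consumer config as well:

import java.util.HashMap;
import java.util.Map;

import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.common.serialization.StringSerializer;

import io.confluent.kafka.serializers.KafkaAvroSerializer;

public class SecuredRegistryProducerProps {

    // Producer-side sketch; placeholders must be replaced with real values
    public static Map<String, Object> producerProps() {
        Map<String, Object> props = new HashMap<>();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "pkc-xxxx.confluent.cloud:9092"); // placeholder
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, KafkaAvroSerializer.class);

        props.put("schema.registry.url", "https://psrc-xxxx.confluent.cloud"); // placeholder
        // Trust material for the HTTPS connection to the schema registry
        props.put("schema.registry.ssl.truststore.location", "/etc/ssl/kafka/client.truststore.jks");
        props.put("schema.registry.ssl.truststore.password", "changeit");
        // Confluent Cloud registries additionally need API-key authentication
        props.put("basic.auth.credentials.source", "USER_INFO");
        props.put("basic.auth.user.info", "<SR_API_KEY>:<SR_API_SECRET>");
        return props;
    }
}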
@shivamkhare-p5z7 ай бұрын
Which data type can we use for a date in the schema? Please suggest.
@riyakumari83778 ай бұрын
I have a doubt: what if we are not sure what kind of schema is coming from my database, as my app might be handling different types of JSON data? In that case, do I need to define a schema for every JSON payload one by one, or is there any other way?
@Javatechie8 ай бұрын
A schema is nothing but the contract between producer and consumer, buddy. For example, to work as an IAS officer you must clear the UPSC exam; that's the rule, or you can say the contract, set by the government. If you have multiple exams, you need to prepare for different subjects; likewise, each kind of payload needs its own schema.
@SurajKumar-l8d8y7 ай бұрын
Everything's working well, but my Control Center often shuts itself down after some time and I need to start it again from Docker Desktop. What could be the cause?
@ramanav31398 ай бұрын
Can you guide me on how to do Kafka clustering in a Spring Boot application and how to handle it?
@sudheerkumar-tp1mg10 ай бұрын
1. In the Confluent dashboard we are able to see the sensitive data; an interviewer asked me how to secure it, please suggest. Also, is Confluent free or paid? 2. If the consumer is down while the producer keeps producing messages continuously and the hardware resources are exhausted, how can Kafka handle this situation?
@Javatechie10 ай бұрын
Use a secure Kafka cluster, and yes, it's open source.
@sudheerkumar-tp1mg10 ай бұрын
@@Javatechie Thanks for the reply, Basant sir. Actually, content such as credit card numbers, social security numbers, etc. should be displayed in a masked manner; please suggest.
@GMSGAANAMANI9 ай бұрын
Consumers are configured with appropriate heartbeat settings.
@gauravjaiswal98086 ай бұрын
Hi sir, can you please create a video on employee biometric login using Spring Boot?
@attrayadas806710 ай бұрын
Can you please make a video on how to make a class immutable in Java?
@2000riddick3 ай бұрын
I need the same for a Gradle build.
@talhaansari57636 ай бұрын
Currently we are using it in our application.
@GR-gk8dh9 ай бұрын
14:44
@el_yisusT10 ай бұрын
Thanks a lot