
Kafka Tutorial - Schema Evolution Part 1

57,811 views

Learning Journal


Spark Programming and Azure Databricks ILT Master Class by Prashant Kumar Pandey - Fill out the Google form for course inquiries.
forms.gle/Nxk8...
-------------------------------------------------------------------
Data Engineering is one of the highest-paid jobs today.
It is going to remain among the top IT skills for years to come.
Are you in database development, data warehousing, ETL tools, data analysis, SQL, or PL/SQL development?
I have a well-crafted success path for you.
I will help you prepare for the data engineer and solution architect roles, depending on your profile and experience.
We created a course that takes you deep into core data engineering technology and helps you master it.
It is for working professionals who want to:
1. Become a data engineer.
2. Change their career to data engineering.
3. Grow their data engineering career.
4. Get the Databricks Spark certification.
5. Crack Spark data engineering interviews.
ScholarNest is offering a one-stop integrated Learning Path.
The course is open for registration.
The course delivers an example-driven approach and project-based learning.
You will practice these skills using MCQs, coding exercises, and capstone projects.
The course comes with the following integrated services.
1. Technical support and Doubt Clarification
2. Live Project Discussion
3. Resume Building
4. Interview Preparation
5. Mock Interviews
Course Duration: 6 Months
Course Prerequisite: Programming and SQL Knowledge
Target Audience: Working Professionals
Batch start: Registration Started
Fill out the below form for more details and course inquiries.
forms.gle/Nxk8...
--------------------------------------------------------------------------
Learn more at www.scholarnes...
The best place to learn Data Engineering, Big Data, Apache Spark, Databricks, Apache Kafka, Confluent Cloud, AWS Cloud Computing, Azure Cloud, and Google Cloud - self-paced, instructor-led, and certification courses, plus practice tests.
========================================================
SPARK COURSES
-----------------------------
www.scholarnes...
www.scholarnes...
www.scholarnes...
www.scholarnes...
www.scholarnes...
KAFKA COURSES
--------------------------------
www.scholarnes...
www.scholarnes...
www.scholarnes...
AWS CLOUD
------------------------
www.scholarnes...
www.scholarnes...
PYTHON
------------------
www.scholarnes...
========================================
We are also available on the Udemy Platform
Check out the link below for our courses on Udemy.
www.learningjo...
=======================================
You can also find us on Oreilly Learning
www.oreilly.co...
www.oreilly.co...
www.oreilly.co...
www.oreilly.co...
www.oreilly.co...
www.oreilly.co...
www.oreilly.co...
www.oreilly.co...
=========================================
Follow us on Social Media
/ scholarnest
/ scholarnesttechnologies
/ scholarnest
/ scholarnest
github.com/Sch...
github.com/lea...
========================================

Comments: 41
@ScholarNest · 3 years ago
Want to learn more Big Data technology courses? You can get lifetime access to our courses on the Udemy platform. Visit the link below for discounts and coupon codes. www.learningjournal.guru/courses/
@AmanGarg95 · 6 years ago
I've been watching the playlist right from the start. The method of delivery is concise, succinct and clear. Way to go Sir. Thanks a lot.
@bsrameshonline · 2 years ago
Nicely explained
@shafiahmed3382 · 4 years ago
Dear Sir, you have brilliant expertise in teaching the right content!
@111gangadhar · 4 years ago
Excellent tutorials, Sir. Clear and concise.
@MohammadRasoolShaik · 7 years ago
Could you please explain how to set up the Schema Registry on Windows? I understand that the Confluent Schema Registry is used to register Avro schemas, but how does it differentiate between versions of the same schema (based on the schema file name?) when producers and consumers use lower and higher versions of that schema? Also, apart from Avro, is this registry useful for any other tools or frameworks, or is it specific to Avro? Ideally, a schema registry shouldn't be Avro-specific.
@arpit35007 · 6 years ago
Please make a tutorial on Elasticsearch.
@s_rr_g9577 · 3 years ago
Great explanation. Thank you so much
@vikashverma9 · 4 years ago
Excellent tutorials
@chenhaukhoo · 6 years ago
I'm curious: isn't it simpler if we always serialize our object to a string (using Gson) before sending it to Kafka? On the consumer side, once we receive the string, we can simply deserialize it back into the object.
@ScholarNest · 6 years ago
+chen hau khoo Yes, we can do that easily. In fact, JSON is quite popular in simple scenarios, and JSON support is built into Kafka. However, when you have an evolving schema, Avro could be a better option. I have covered schema evolution in a video.
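For context, here is a minimal sketch of the string/JSON approach described in the question above, assuming a hypothetical ClickRecord POJO, a local broker, and a topic named clicks-json (none of these come from the video). It works, but every schema change then has to be handled by hand on both the producer and consumer sides:

```java
import java.util.Properties;
import com.google.gson.Gson;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class JsonClickProducer {
    // Hypothetical POJO used only for illustration.
    static class ClickRecord {
        String sessionId;
        String browser;
        ClickRecord(String sessionId, String browser) {
            this.sessionId = sessionId;
            this.browser = browser;
        }
    }

    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");   // assumed broker address
        props.put("key.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");

        Gson gson = new Gson();
        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // Serialize the object to a JSON string and send it as a plain string value.
            String json = gson.toJson(new ClickRecord("S1", "Chrome"));
            producer.send(new ProducerRecord<>("clicks-json", json));
        }
        // The consumer would call gson.fromJson(value, ClickRecord.class) to rebuild the object,
        // but any field added or renamed later must be coordinated manually on both sides.
    }
}
```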
@letme4u · 5 years ago
Thanks a lot for the wonderful share.
@unmeshkadam4876 · a year ago
Sir, how do you configure the Schema Registry?
@arunkumarramanujam · 4 years ago
Nice video
@s_rr_g9577 · 3 years ago
Awesome, Sir.
@Nilayam-DD · 5 years ago
Very useful videos.
@AliKahoot · 7 years ago
Thanks for the great tutorial, Very well explained.
@kunalgupta6152 · 7 years ago
Hi, it's a nice tutorial on schemas in Kafka. Just one clarification: you told us earlier that the schema is embedded in the data and that the deserializer extracts the schema and then deserializes the data. So what is the need for a schema registry when the data already has the schema embedded in it?
@ScholarNest · 7 years ago
Embedding the schema in each record would increase the size of every record and ultimately impact performance. So the schema is stored in the registry, and only a schema ID is embedded in the message record.
@sonunitjsr223 · 6 years ago
Same question here as well. The ClickRecord.java class has the schema (variable name: SCHEMA$) along with the data, and while writing the producer/consumer code you are using ClickRecord.java, so you already have the schema embedded in the Java file. Why do we need the schema registry?
@DagangWei · 5 years ago
@sonunitjsr223 Same question. Since ClickRecord.java is generated from the schema, the consumer side already knows how to deserialize the message, so why do we need the schema registry?
@jiger83 · 4 years ago
@DagangWei As @Learning Journal mentioned above, that approach can be costly in terms of network, storage, and other processing costs, so it's better to use a schema registry. You only supply the full schema if you don't use a schema registry; otherwise, only the schema ID is sent in the message.
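To make the discussion above concrete, here is a minimal, hedged sketch of a producer configured against a Schema Registry using Confluent's KafkaAvroSerializer: the serializer registers the schema once and then ships only a short schema ID with each record. The topic name, broker and registry addresses, and the one-field schema are illustrative assumptions, not the exact values used in the video:

```java
import java.util.Properties;
import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericRecord;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class AvroClickProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");          // assumed broker address
        props.put("key.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");
        // Confluent's Avro serializer talks to the Schema Registry on our behalf.
        props.put("value.serializer",
                "io.confluent.kafka.serializers.KafkaAvroSerializer");
        props.put("schema.registry.url", "http://localhost:8081"); // assumed registry address

        // Illustrative one-field schema; the video's ClickRecord schema has more fields.
        Schema schema = new Schema.Parser().parse(
            "{\"type\":\"record\",\"name\":\"ClickRecord\",\"fields\":["
          + "{\"name\":\"session_id\",\"type\":\"string\"}]}");

        GenericRecord click = new GenericData.Record(schema);
        click.put("session_id", "S1");

        try (KafkaProducer<String, GenericRecord> producer = new KafkaProducer<>(props)) {
            // The serializer registers the schema once (per subject) and then embeds only a
            // small numeric schema ID in each message, not the full schema text.
            producer.send(new ProducerRecord<>("clicks-avro", click));
        }
    }
}
```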
@user-rr3cc3uv2h · 7 years ago
Great tutorial! I am looking forward to seeing how to make old and new producers/consumers work together, because right now I can't see how that could happen...
@user-rr3cc3uv2h · 7 years ago
After watching it a second time, I got it :)
@____R__ · 4 years ago
I still didn't get that. Is it covered in any other video?
@shristiraj9907 · 4 years ago
What is the advantage of using the Avro serializer/deserializer over the following approach? I created a Google Protobuf object, converted it into a ByteString, sent it as the message, and used org.apache.kafka.common.serialization.ByteArraySerializer and org.apache.kafka.common.serialization.ByteArrayDeserializer.
@ScholarNest · 4 years ago
What if your producer changes and now adds one new field to the message record? Can you use the same consumer without changing it?
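That is exactly the scenario Avro defaults are designed for. The sketch below is illustrative only (the field names and the extra referrer field are assumptions, not the video's exact schema): version 2 adds a field with a default value, and Avro's own SchemaCompatibility check confirms that a consumer using v2 can still read data written with v1:

```java
import org.apache.avro.Schema;
import org.apache.avro.SchemaCompatibility;

public class EvolutionCheck {
    public static void main(String[] args) {
        // Version 1 of a hypothetical ClickRecord schema.
        Schema v1 = new Schema.Parser().parse(
            "{\"type\":\"record\",\"name\":\"ClickRecord\",\"fields\":["
          + "{\"name\":\"session_id\",\"type\":\"string\"}]}");

        // Version 2 adds a new field WITH a default value. A consumer using v2 can still
        // read old v1 data because Avro fills in the default for the missing field.
        Schema v2 = new Schema.Parser().parse(
            "{\"type\":\"record\",\"name\":\"ClickRecord\",\"fields\":["
          + "{\"name\":\"session_id\",\"type\":\"string\"},"
          + "{\"name\":\"referrer\",\"type\":\"string\",\"default\":\"none\"}]}");

        // Can a v2 reader read data written with the v1 schema?
        SchemaCompatibility.SchemaPairCompatibility result =
            SchemaCompatibility.checkReaderWriterCompatibility(v2, v1);
        System.out.println(result.getType());   // expected: COMPATIBLE
    }
}
```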
@Anshmaster2016 · 7 years ago
Nicely explained... good job. Since Avro schemas are defined in JSON, is there any requirement that the data must also be in JSON, Avro, or ORC format, or can a simple flat file or CSV also be processed with an Avro/JSON schema?
@ScholarNest · 7 years ago
Avro itself is a data format. If your data is in another format, your producer needs to encode it into an Avro object, as we have done in the example code.
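As a hedged illustration of that encoding step, the snippet below turns one CSV line into an Avro GenericRecord using a hypothetical two-field schema; the resulting record is what would then be handed to an Avro-aware producer:

```java
import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericRecord;

public class CsvToAvro {
    public static void main(String[] args) {
        // Hypothetical schema used only for illustration.
        Schema schema = new Schema.Parser().parse(
            "{\"type\":\"record\",\"name\":\"ClickRecord\",\"fields\":["
          + "{\"name\":\"session_id\",\"type\":\"string\"},"
          + "{\"name\":\"browser\",\"type\":\"string\"}]}");

        // A line from a flat file / CSV source.
        String csvLine = "S1,Chrome";
        String[] cols = csvLine.split(",");

        // Encode the CSV fields into an Avro record before handing it to the producer.
        GenericRecord record = new GenericData.Record(schema);
        record.put("session_id", cols[0]);
        record.put("browser", cols[1]);

        System.out.println(record);  // {"session_id": "S1", "browser": "Chrome"}
    }
}
```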
@yusufdm5472 · 7 years ago
Nice tutorial. Is there a .NET/C# equivalent of the Java SDK for Kafka that includes all the advanced topics you covered, like custom partitioners, commits, schema evolution, etc.?
@ScholarNest · 7 years ago
You can use the Kafka REST Proxy if you want to use Kafka from C#.
@nguyen4so9 · 7 years ago
Excellent!
@krishnam1260 · 7 years ago
Very well explained. Thanks :)
@nishantagnihotri8028 · 11 months ago
Where is the link? I am not able to download it; it's showing "download from Maven Central".
@Harikrishna-ie4um · 7 years ago
Awesome.
@Modern_revolution · 5 years ago
Thank you so much.
@pratiksarvan · 7 years ago
Excellent!!
7 years ago
Thank you for the tutorial, but I have a question about the schema registry: who sets it up, and where?
@ScholarNest · 7 years ago
The Schema Registry is an optional component in the Kafka ecosystem. If you need it, the cluster admin should set it up on a dedicated host machine.
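Once the registry is running, clients only need its address in their configuration. Below is a minimal, hedged sketch of a consumer wired to a Schema Registry via Confluent's KafkaAvroDeserializer; the broker and registry addresses, topic, and group id are illustrative assumptions, and the poll(Duration) call reflects newer client versions than the one shown in the video series:

```java
import java.time.Duration;
import java.util.Collections;
import java.util.Properties;
import org.apache.avro.generic.GenericRecord;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class AvroClickConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");          // assumed broker address
        props.put("group.id", "click-consumers");                  // assumed consumer group
        props.put("key.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");
        // Confluent's Avro deserializer uses the schema ID in each message to fetch the
        // writer's schema from the registry before decoding the record.
        props.put("value.deserializer",
                "io.confluent.kafka.serializers.KafkaAvroDeserializer");
        props.put("schema.registry.url", "http://localhost:8081"); // assumed registry address
        // Optionally set "specific.avro.reader" to "true" to get generated classes
        // (such as the video's ClickRecord) instead of GenericRecord.

        try (KafkaConsumer<String, GenericRecord> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(Collections.singletonList("clicks-avro"));
            ConsumerRecords<String, GenericRecord> records = consumer.poll(Duration.ofSeconds(1));
            for (ConsumerRecord<String, GenericRecord> record : records) {
                System.out.println(record.value());
            }
        }
    }
}
```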
@9962366673 · 5 years ago
@ScholarNest The "ClickRecord" class can serve as the schema for serializing and deserializing, right? Why does it require the schema registry when we pass ClickRecord as the value serializer... Please clarify this part. Thank you.
Kafka Tutorial - Schema Evolution Part 2
9:59
Learning Journal
30K views
Kafka Tutorial - Exactly once processing
13:33
Learning Journal
53K views
Kafka Tutorial - Custom Serializer
11:06
Learning Journal
55K views
Kafka Tutorial - Fault Tolerance
12:08
Learning Journal
170K views
Master Databricks and Apache Spark Step by Step: Lesson 1 - Introduction
32:23
Kafka Tutorial - Custom Partitioner
12:57
Learning Journal
70K views
Kafka Tutorial - Producer API
11:38
Learning Journal
141K views