If I have an app ingesting CSV data into Kafka, can I use an Avro schema for that CSV data?
@axual3573 · 2 years ago
Yes, that is very possible. There are two ways to do this.

1. A full, specific record definition. The record definition will have a namespace and a record name reflecting the specific data you want to store. Each column of the CSV becomes a named field inside the record, with the correct type of course. You then alter your application to map each CSV row to that Avro record type.

2. A generic CSV record definition. If your CSV data is more flexible, for example each file can have different headers and value types, it is harder to write a specific Avro record definition. Instead you can define a record named something like CsvRecord containing a single field whose type is an Avro map. The values of that map can be a union of null, string, int, float, etc. This allows you to store each CSV column as a separate entry in the map.

You can read up on the Avro specification and how to define a schema at avro.apache.org/docs/1.11.1/specification/ Hope this answers your question.
@nishikanttayade7446 · 2 years ago
@@axual3573 Thanks for taking the time to reply 👍
@axual3573 · 2 years ago
@@nishikanttayade7446 If you would like to explore Axual's event streaming platform, you can request a free trial right here: axual.com/trial/ :D