Just getting started with Kafka, but this video makes me realise how useful it’s going to be. Great video, thank you
@rmoff 3 years ago
Thanks!
@raoulvaneijndhoven1473 3 years ago
This is great, thank you. I have a question regarding timestamp conversion, but I placed it on the community page.
@sdon101 1 year ago
Very interesting series of videos, very helpful. A little remark: at 38:58, it seems the order value being inserted was way higher than the currently displayed maximum (22190899.73 vs 216233.09), and still the maximum was not updated.
@paulocasaretto527 2 years ago
Very useful! Thanks Robin!
@karthikb.s.k.4486 2 years ago
Thank you Robin for a great video. If we edit the same CSV file, I think all the records get processed again. Can we change this behaviour? I have seen the same thing with the S3 source connector.
@Agrossio 11 months ago
Great video!! Will this work for processing a CSV with 1,000,000 records? Would it take less than an hour to save them into an Oracle database?
@anak-anakindonesia 3 years ago
Thanks, this is a good video. You saved me time, bro. 👍
@rmoff 3 years ago
Glad I could help!
@hitrem 2 years ago
Thank you so much for this guide! But I've got an issue and I don't know if it's normal. When I cp a .csv file (same name) back into my unprocessed directory, the file is processed again and the offset goes from 500 to 501, 502, etc. Is this normal? Also, when a file is processed it creates an "orders.csv" subdirectory in the processed directory. Is this due to some update?
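For context on the reprocessing: the spooldir connector moves each file it finishes into its configured finished.path, so a same-named file copied back into input.path looks like a brand-new file and gets read again, with offsets continuing to climb. A minimal sketch of a spooldir config of the kind used in this demo style (property names per the connector's docs; the paths, topic, and connector name here are illustrative):

    # finished.path is where processed files land; anything matching
    # input.file.pattern that (re)appears in input.path is treated as new
    curl -s -X PUT -H "Content-Type: application/json" \
        http://localhost:8083/connectors/source-csv-spooldir-00/config \
        -d '{
          "connector.class": "com.github.jcustenborder.kafka.connect.spooldir.SpoolDirCsvSourceConnector",
          "topic": "orders_spooldir_00",
          "input.path": "/data/unprocessed",
          "finished.path": "/data/processed",
          "error.path": "/data/error",
          "input.file.pattern": ".*\\.csv",
          "schema.generation.enabled": "true",
          "csv.first.row.as.header": "true"
        }'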
@drakezen 3 years ago
Brilliant
@gregorydonovan7181 3 years ago
Great video Robin. QQ - is there a way to publish an event once a file has been completely ingested? Does ksqlDB provide any hooks that might help? I need to ingest a customer file and then send subsets of that file to a number of vendors. I figure I'll have a consumer for each vendor.
@rmoff 3 years ago
I've answered your question over here: forum.confluent.io/t/submit-message-when-csv-has-been-ingested/1658/2
@mouhammaddiakhate3546 3 years ago
Just Awesome !!
@rmoff 3 years ago
Thanks, glad you liked it :)
@nokap2695 3 years ago
At 5:20 I hit this error, any idea how to fix it?
HTTP/1.1 405 Method Not Allowed
X-Confluent-Control-Center-Version: 6.2.1
X-Confluent-Control-Session: 96af01d2-6b69-45ea-937c-1f42c8aa7f78
Strict-Transport-Security: max-age=31536000
Content-Length: 0
@rmoff 3 years ago
Hi, please post this at forum.confluent.io/ :)
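For what it's worth, an HTTP 405 usually means the verb/endpoint combination isn't supported. If you're creating the connector against the Kafka Connect REST API directly (rather than through Control Center), these are the two standard creation calls; the host, port, connector name, and config file name here are illustrative:

    # POST /connectors expects a body wrapping "name" and "config";
    # PUT /connectors/<name>/config is the idempotent create-or-update form
    curl -s -X PUT -H "Content-Type: application/json" \
        http://localhost:8083/connectors/source-csv-spooldir-00/config \
        -d @spooldir-config.json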
@gauravlotekar660 4 years ago
You're the champion!!
@shubhamgawali8030 2 years ago
Hi Robin, great video! I have one requirement: can we send only the file name, not the file content, whenever a new file is created in the directory?
@rmoff 2 years ago
I don't know if there is a connector that does this. You could ask at forum.confluent.io/.
@rmoff 4 years ago
For questions about the connector and Apache Kafka in general please head to confluent.io/community/ask-the-community/
@drhouse1980 3 years ago
Nice video, do you know if this connector can get the filename?
@gregorydonovan7181 3 years ago
@@drhouse1980 He shows how to get the headers via kafkacat, but the question I have is how to then turn this into a topic that other consumers can subscribe to. For example, after a customer sends a request, you then want to ship out requests to vendors and marry the data up later.
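On the first part (seeing the filename): the spooldir connector writes file metadata into the record headers, which is what the video inspects with kafkacat. A sketch of reading them with kcat — the broker address, topic, and exact header names (e.g. file.name, file.path) should be checked against your versions:

    # -C consume, -c1 stop after one record, -f format string (%h = headers)
    kcat -b localhost:9092 -t orders_spooldir_00 -C -c1 \
        -f 'Headers: %h\nValue: %s\n'

On the second part, one common pattern is a stream processor such as ksqlDB filtering the ingested topic into per-vendor topics that each vendor's consumer subscribes to.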
@schoesa 2 years ago
If I run docker-compose up -d, it hangs every time while downloading the Kafka Connect JDBC plugin from Confluent Hub.
@rmoff 2 years ago
hi, the best place to get help is at www.confluent.io/en-gb/community/ask-the-community/ :)
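If the hang is the confluent-hub download that runs on container startup, one workaround (a sketch, with an illustrative version tag) is to install the plugin once into a derived image instead of fetching it on every docker-compose up:

    # Run in a Dockerfile RUN step (or once inside the Connect container)
    confluent-hub install --no-prompt confluentinc/kafka-connect-jdbc:10.2.0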
@larrosapablo 4 years ago
Hi, is there any connector for a CSV sink? Thanks!
@laifimohammed1202 3 years ago
Very awesome. I'm a beginner and it's helpful.
@rmoff 3 years ago
Thanks for the comment :) Glad I could help!
@MsMilka87 4 years ago
Thanks!
@kaisneffati8801 3 years ago
Does it support file changes? When the file changes, I want to re-read it!
@rmoff 3 years ago
The spooldir connector moves files to a new folder once ingested. I don't know if the functionality you describe is available in other connectors. Check out www.confluent.io/hub/streamthoughts/kafka-connect-file-pulse and www.confluent.io/hub/mmolimar/kafka-connect-fs perhaps. For any more questions, head to forum.confluent.io/ :)
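Both of those install from Confluent Hub the same way as spooldir does; a sketch, with version tags left to the CLI to resolve:

    confluent-hub install --no-prompt streamthoughts/kafka-connect-file-pulse:latest
    confluent-hub install --no-prompt mmolimar/kafka-connect-fs:latest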
@bisworanjanbarik7350 3 years ago
This is a really great video. I want to load a huge CSV file into a database every day. Can I use the Kafka CSV connector?
@rmoff 3 years ago
If you just have a CSV file and a database, I don't think adding Kafka in just to do the load would make any sense - there are plenty of database tools to load the CSV file directly. If you already have the data in a Kafka topic and want to load it into a database then you can use the JDBC Sink connector. For more questions head over to forum.confluent.io.
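For that topic-to-database leg, a minimal JDBC Sink sketch — connection details, credentials, and names are illustrative, and the topic's records need a schema (e.g. Avro) for auto.create to build the table:

    curl -s -X PUT -H "Content-Type: application/json" \
        http://localhost:8083/connectors/sink-jdbc-orders-00/config \
        -d '{
          "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
          "topics": "orders_spooldir_00",
          "connection.url": "jdbc:oracle:thin:@oracle:1521/ORCLPDB1",
          "connection.user": "connect_user",
          "connection.password": "connect_password",
          "auto.create": "true",
          "insert.mode": "insert"
        }'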
@nesreenmohd665 9 months ago
Thanks
@phemsobki1929 3 years ago
Hi, how can I move data from a SQL Server database running on Windows Server into Kafka?
@rmoff 3 years ago
See rmoff.dev/no-more-silos. Suitable connectors include the JDBC Source connector or Debezium.
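A minimal JDBC Source sketch for SQL Server — every connection detail, table, and column name here is illustrative (Debezium is the log-based alternative and has its own configuration):

    curl -s -X PUT -H "Content-Type: application/json" \
        http://localhost:8083/connectors/source-jdbc-mssql-00/config \
        -d '{
          "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
          "connection.url": "jdbc:sqlserver://mssql-host:1433;databaseName=sales",
          "connection.user": "connect_user",
          "connection.password": "connect_password",
          "mode": "incrementing",
          "incrementing.column.name": "order_id",
          "table.whitelist": "orders",
          "topic.prefix": "mssql-00-"
        }'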
@phemsobki1929 3 years ago
@@rmoff thanks
@vishnumurali522 4 years ago
Hi @rmoff, may I know how to get the same CSV file data from an SFTP location which uses key-based authentication...
@vishnumurali522 4 years ago
I can't figure out how to specify the key values; I'm sending the request to start the connector from Postman.
@wardsworld 4 years ago
Great video and amazing content! Could you please share a repo/link with the code used in this video? :)
@rmoff 4 years ago
Thanks, glad you liked it! The code is here: github.com/confluentinc/demo-scene/tree/master/csv-to-kafka
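To run it, roughly (the path comes from the repo link above; the compose setup may have drifted since the video was recorded):

    git clone https://github.com/confluentinc/demo-scene.git
    cd demo-scene/csv-to-kafka
    docker-compose up -d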
@stefen_taime 3 years ago
How do I solve this? % ERROR: Failed to query metadata for topic orders_spooldir_00: Local: Broker transport failure
@rmoff 3 years ago
Hi, please post this at forum.confluent.io/ :)
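"Broker transport failure" generally means the client never reached a broker at all — a wrong address/port, or an advertised listener the client can't resolve — rather than anything topic-specific. A first diagnostic sketch (address illustrative):

    # -L asks for cluster metadata; if this also fails, it's connectivity
    # or listener config, not the topic
    kcat -b localhost:9092 -L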
@georgelza 6 months ago
Realized this is an "old-ish" video... you don't show at any point how you started your kafkacat container; also, of course, kafkacat has now been replaced/renamed to kcat.
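For anyone following along today, a sketch of running kcat from its container image — the image tag and network flags are illustrative, and inside a Docker Compose network you'd point -b at the broker's service name instead of localhost:

    docker run --rm --network=host edenhill/kcat:1.7.1 \
        -b localhost:9092 -C -t orders_spooldir_00 -o beginning -e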