I have a MySQL server on a remote host with row-based binlog enabled. I want to implement a data collection bus using Kafka (whether the source is MySQL or some other log file format, the bus stores the data and makes it available to other services). I then want to consume these Kafka topics to reproduce the exact same MySQL database for others to use.
Does this make sense? How would I realize it?
Hi @wanghaisheng, I assume that MySQL's built-in replication is not enough for you, and you're better off pushing your data into Kafka so you can forward it to other downstream systems, MySQL or not.
Given the above, mypipe will allow you to tune into MySQL replication streams and convert them into Avro encoded data pushed into Kafka topics. You can then consume these topics and do what you wish with the data.
mypipe encodes data using Avro either "specifically" or "generically", and you can consume this data if you know how to decode it. This is documented in the README. Let me know if it is unclear to you and we'll try to make the documentation better where needed.
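To give a feel for what "decoding generically" involves: at the wire level, Avro's binary encoding writes ints/longs as zigzag varints and strings as length-prefixed UTF-8, with a record being just its fields concatenated in schema order. The sketch below is not mypipe's actual payload format (see its README for that); it is a minimal, stdlib-only illustration of those Avro binary primitives, with a hypothetical `{"id": long, "name": string}` record:

```python
import io

def encode_long(n: int) -> bytes:
    """Avro's int/long encoding: zigzag, then base-128 varint."""
    n = (n << 1) ^ (n >> 63)  # zigzag: small negatives become small positives
    out = bytearray()
    while True:
        byte = n & 0x7F
        n >>= 7
        if n:
            out.append(byte | 0x80)  # high bit set: more bytes follow
        else:
            out.append(byte)
            return bytes(out)

def decode_long(buf: io.BytesIO) -> int:
    shift, acc = 0, 0
    while True:
        b = buf.read(1)[0]
        acc |= (b & 0x7F) << shift
        if not (b & 0x80):
            break
        shift += 7
    return (acc >> 1) ^ -(acc & 1)  # undo zigzag

def encode_string(s: str) -> bytes:
    """Avro strings: length (as a long) followed by UTF-8 bytes."""
    data = s.encode("utf-8")
    return encode_long(len(data)) + data

def decode_string(buf: io.BytesIO) -> str:
    length = decode_long(buf)
    return buf.read(length).decode("utf-8")

# A record {"id": long, "name": string} is its fields back-to-back.
payload = encode_long(42) + encode_string("alice")
buf = io.BytesIO(payload)
print(decode_long(buf), decode_string(buf))  # 42 alice
```

In practice you would use an Avro library with the schema that mypipe publishes rather than hand-rolling this, but it shows why a consumer must know the schema: the byte stream carries no field names or type tags of its own.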