If your data is in Avro format, you can skip the column definitions entirely; KSQL will fetch the column info from the Schema Registry. Here is an example: https://www.confluent.io/blog/ksql-december-release
For JSON, you need to define the columns in the CREATE STREAM/TABLE statement.
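To illustrate the difference, here is a minimal sketch of both forms. The topic and column names are hypothetical; only the structure (omitting columns for Avro, listing them for JSON) is the point:

```sql
-- Avro: no column list needed; KSQL reads the schema from the Schema Registry.
-- Topic name 'pageviews-avro' is an assumed example.
CREATE STREAM pageviews_avro
  WITH (kafka_topic='pageviews-avro', value_format='AVRO');

-- JSON: columns must be declared explicitly in the statement.
CREATE STREAM pageviews_json (viewtime BIGINT, userid VARCHAR, pageid VARCHAR)
  WITH (kafka_topic='pageviews-json', value_format='JSON');
```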
Closing out this ticket as part of cleaning up our backlog. Hojjat's answer is the most up to date: JSON support in the Schema Registry will hopefully be coming in the near future, at which point you will not need to specify the schema in KSQL. You can follow that feature request here: confluentinc/schema-registry#220
I'm trying to extract all the columns from a topic into a stream. Is there a way to do so, or do I have to list them out?
CREATE STREAM mystream(id BIGINT, projectName VARCHAR, buildDuration BIGINT,......) WITH (kafka_topic='rawdata', value_format='JSON');
Is there a wildcard like * to create a stream with all of the columns from the topic?