Describe the bug
When using a Protobuf schema from the Schema Registry, ksqlDB interprets enum fields as VARCHAR(STRING) instead of BIGINT/INTEGER.
This causes a lot of issues: I have another topic/stream whose Kafka messages are based on a rather large Protobuf file, so I can't manually write out the CREATE STREAM statement.
Versions
Sample of protobuf file:
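(The original file contents were not preserved in this report; below is a minimal illustrative sketch, assuming a hypothetical `Order` message with an enum field, of the kind of schema that triggers the behaviour.)

```protobuf
syntax = "proto3";

package example;

// Any message with an enum field reproduces the issue.
enum OrderStatus {
  ORDER_STATUS_UNSPECIFIED = 0;
  CREATED = 1;
  SHIPPED = 2;
}

message Order {
  int64 id = 1;
  OrderStatus status = 2; // inferred by ksqlDB as VARCHAR(STRING) instead of an integer type
}
```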
Describing the stream where I give the CREATE STREAM statement all the columns and types explicitly (this stream returns Kafka messages):
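(The original statement is likewise not preserved; a sketch of what an explicit declaration might look like, assuming the hypothetical `Order` message on a topic named `orders`, with the enum column declared as the integer type the reporter expects:)

```sql
CREATE STREAM orders_explicit (
  id BIGINT,
  status INTEGER  -- enum field declared explicitly as an integer
) WITH (
  KAFKA_TOPIC = 'orders',
  VALUE_FORMAT = 'PROTOBUF'
);
```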
Describing the stream where I create it from the schema registered in the Schema Registry (this stream does not return Kafka messages):
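(A sketch of the schema-inference variant, assuming the same hypothetical topic; here ksqlDB pulls the columns from the Schema Registry and maps the enum to VARCHAR:)

```sql
CREATE STREAM orders_inferred WITH (
  KAFKA_TOPIC = 'orders',
  VALUE_FORMAT = 'PROTOBUF'
);
```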
Schema Registry (the schema was POSTed with normalize=true):
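(For context, a sketch of how such a registration call might look against the Schema Registry REST API; the subject name `orders-value`, the local URL, and the inline schema are assumptions:)

```bash
# Register a Protobuf schema with normalization enabled (subject and schema are placeholders)
curl -X POST "http://localhost:8081/subjects/orders-value/versions?normalize=true" \
  -H "Content-Type: application/vnd.schemaregistry.v1+json" \
  -d '{"schemaType": "PROTOBUF", "schema": "syntax = \"proto3\"; enum OrderStatus { ORDER_STATUS_UNSPECIFIED = 0; CREATED = 1; } message Order { int64 id = 1; OrderStatus status = 2; }"}'
```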