Describe the bug

The JSON deserializer considers a field to be required while it actually is not.

To Reproduce

I run the following push query to print data in ksqldb-cli:

SELECT * FROM DATASET EMIT CHANGES;

Then, I produce the following data into my topic:
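The exact payload is not reproduced here; as a stand-in, a record of the shape implied by the error below would look like this. Only the uid field and the "datafile" sub-object come from the report, the other field names are purely illustrative, and the optional uid is simply omitted:

{
  "pid": "dataset-001",
  "datafile": {
    "path": "/data/raw/file-001.h5",
    "size": 1024
  }
}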
Expected behavior

The data should be displayed in the ksqldb console.
Actual behaviour

Nothing is printed out in the ksqldb console.
According to the ksqldb-server logs, the JSON deserializer considers the uid field to be required, while it is actually not required in the schema (a sketch of the relevant schema fragment follows the stack trace):
[2024-08-29 16:23:08,319] ERROR {"type":0,"deserializationError":{"target":"value","errorMessage":"Error deserializing message from topic: dataset","recordB64":null,"cause":["Invalid value: null used for required field: \"uid\", schema type: STRING"],"topic":"dataset"},"recordProcessingError":null,"productionError":null,"serializationError":null,"kafkaStreamsThreadError":null} (processing.CSAS_DATASET_SCICAT_23.KafkaTopic_Left.Source.deserializer)
[2024-08-29 16:23:08,320] WARN stream-thread [_confluent-ksql-default_query_CSAS_DATASET_SCICAT_23-35b599ae-5eb6-4e02-a0b4-a49c1fff6741-StreamThread-1] task [0_0] Skipping record due to deserialization error. topic=[dataset] partition=[0] offset=[10] (org.apache.kafka.streams.processor.internals.RecordDeserializer)
org.apache.kafka.common.errors.SerializationException: Error deserializing message from topic: dataset
at io.confluent.ksql.serde.connect.KsqlConnectDeserializer.deserialize(KsqlConnectDeserializer.java:55)
at io.confluent.ksql.serde.connect.ConnectFormat$StructToListDeserializer.deserialize(ConnectFormat.java:239)
at io.confluent.ksql.serde.connect.ConnectFormat$StructToListDeserializer.deserialize(ConnectFormat.java:218)
at io.confluent.ksql.serde.GenericDeserializer.deserialize(GenericDeserializer.java:59)
at io.confluent.ksql.logging.processing.LoggingDeserializer.tryDeserialize(LoggingDeserializer.java:61)
at io.confluent.ksql.logging.processing.LoggingDeserializer.deserialize(LoggingDeserializer.java:48)
at org.apache.kafka.common.serialization.Deserializer.deserialize(Deserializer.java:62)
at org.apache.kafka.streams.processor.internals.SourceNode.deserializeValue(SourceNode.java:58)
at org.apache.kafka.streams.processor.internals.RecordDeserializer.deserialize(RecordDeserializer.java:66)
at org.apache.kafka.streams.processor.internals.RecordQueue.updateHead(RecordQueue.java:204)
at org.apache.kafka.streams.processor.internals.RecordQueue.addRawRecords(RecordQueue.java:128)
at org.apache.kafka.streams.processor.internals.PartitionGroup.addRawRecords(PartitionGroup.java:284)
at org.apache.kafka.streams.processor.internals.StreamTask.addRecords(StreamTask.java:1039)
at org.apache.kafka.streams.processor.internals.TaskManager.addRecordsToTasks(TaskManager.java:1782)
at org.apache.kafka.streams.processor.internals.StreamThread.pollPhase(StreamThread.java:1254)
at org.apache.kafka.streams.processor.internals.StreamThread.runOnceWithoutProcessingThreads(StreamThread.java:955)
at org.apache.kafka.streams.processor.internals.StreamThread.runLoop(StreamThread.java:710)
at org.apache.kafka.streams.processor.internals.StreamThread.run(StreamThread.java:669)
Caused by: org.apache.kafka.connect.errors.DataException: Invalid value: null used for required field: "uid", schema type: STRING
at org.apache.kafka.connect.data.ConnectSchema.validateValue(ConnectSchema.java:220)
at org.apache.kafka.connect.data.Struct.validate(Struct.java:233)
at org.apache.kafka.connect.data.ConnectSchema.validateValue(ConnectSchema.java:250)
at org.apache.kafka.connect.data.ConnectSchema.validateValue(ConnectSchema.java:213)
at org.apache.kafka.connect.data.ConnectSchema.validateValue(ConnectSchema.java:255)
at org.apache.kafka.connect.data.Struct.put(Struct.java:216)
at io.confluent.connect.json.JsonSchemaData.lambda$static$11(JsonSchemaData.java:234)
at io.confluent.connect.json.JsonSchemaData.toConnectData(JsonSchemaData.java:636)
at io.confluent.connect.json.JsonSchemaConverter.toConnectData(JsonSchemaConverter.java:135)
at io.confluent.connect.json.JsonSchemaConverter.toConnectData(JsonSchemaConverter.java:121)
at io.confluent.ksql.serde.connect.KsqlConnectDeserializer.deserialize(KsqlConnectDeserializer.java:49)
... 17 more
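For reference, here is a minimal sketch of what the relevant fragment of the value schema declares: uid is a property of the "datafile" sub-object but is not listed in its required array, so per the JSON Schema specification it is optional. This is a reconstruction from the report, not the actual registered schema:

{
  "type": "object",
  "properties": {
    "datafile": {
      "type": "object",
      "properties": {
        "uid": { "type": "string" }
      },
      "required": []
    }
  }
}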
Additional context

This was working fine with Confluent version 7.2.0 of cp-ksqldb-server and the CLI.
The issue has been present since version 7.3.0.
As a workaround, I can add empty values for the optional fields (see the example below), but this is not convenient at all.
The problem only affects the optional fields of the "datafile" sub-object; there is no problem with the optional fields of the top-level object.
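Concretely, the workaround means never omitting the optional fields of the sub-object and instead producing them with empty values, e.g. (same hypothetical field names as in the payload sketch above):

{
  "pid": "dataset-001",
  "datafile": {
    "path": "/data/raw/file-001.h5",
    "size": 1024,
    "uid": ""
  }
}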