Streaming integrator CDC Capture column - Facing issue #148

Open
uarulraj486 opened this issue Aug 29, 2020 · 0 comments
uarulraj486 commented Aug 29, 2020

Description:
I am using the wso2ei-7.0.2 Streaming Integrator, connecting to a SQL Server 2017 database. Using the OSGi conversion command, I created the database driver bundle and added it to the Streaming Integrator's library folder. The Streaming Integrator connects to the database and the Debezium connector also connects, but I am getting the error message below.
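For reference, a minimal sketch of a listening-mode CDC source like the one in the failing app. The database name, user, table, and stream name are taken from the log below; the password is a placeholder and the two stream attributes are hypothetical stand-ins for the actual ReportManagement columns:

```siddhi
-- Sketch of a listening-mode cdc source for SQL Server (attribute names
-- are hypothetical; url/username/table.name match the log output).
@source(type = 'cdc',
    url = 'jdbc:sqlserver://localhost:1433;databaseName=QBS',
    username = 'test',
    password = '<password>',
    table.name = 'dbo.ReportManagement',
    operation = 'insert',
    @map(type = 'keyvalue'))
define stream insertStream (ReportId int, ReportName string);
```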
Error Log
[2020-08-28 18:53:42,699] INFO {org.apache.kafka.connect.runtime.WorkerConfig} - Worker configuration property 'internal.value.converter' is deprecated and may be removed in an upcoming release. The specified value 'org.apache.kafka.connect.json.JsonConverter' matches the default, so this property can be safely removed from the worker configuration.
[2020-08-28 18:53:42,778] INFO {io.debezium.connector.common.BaseSourceTask} - Starting SqlServerConnectorTask with configuration:
[2020-08-28 18:53:42,779] INFO {io.debezium.connector.common.BaseSourceTask} - connector.class = io.debezium.connector.sqlserver.SqlServerConnector
[2020-08-28 18:53:42,779] INFO {io.debezium.connector.common.BaseSourceTask} - database.history.file.filename = C:\PROGRA1\WSO2\STREAM1\103623~1.1\bin..\cdc\history\Siddhi\CaptureStream.dat
[2020-08-28 18:53:42,779] INFO {io.debezium.connector.common.BaseSourceTask} - database.user = testpoc
[2020-08-28 18:53:42,780] INFO {io.debezium.connector.common.BaseSourceTask} - database.dbname = xxx
[2020-08-28 18:53:42,781] INFO {io.debezium.connector.common.BaseSourceTask} - offset.storage = io.siddhi.extension.io.cdc.source.listening.InMemoryOffsetBackingStore
[2020-08-28 18:53:42,782] INFO {io.debezium.connector.common.BaseSourceTask} - database.server.name = localhost_1433
[2020-08-28 18:53:42,782] INFO {io.debezium.connector.common.BaseSourceTask} - database.port = 1433
[2020-08-28 18:53:42,782] INFO {io.debezium.connector.common.BaseSourceTask} - table.whitelist = dbo.ReportManagement
[2020-08-28 18:53:42,783] INFO {io.debezium.connector.common.BaseSourceTask} - cdc.source.object = 1921124184
[2020-08-28 18:53:42,783] INFO {io.debezium.connector.common.BaseSourceTask} - database.hostname = localhost
[2020-08-28 18:53:42,783] INFO {io.debezium.connector.common.BaseSourceTask} - database.password = ********
[2020-08-28 18:53:42,783] INFO {io.debezium.connector.common.BaseSourceTask} - name = SiddhiCaptureStream
[2020-08-28 18:53:42,784] INFO {io.debezium.connector.common.BaseSourceTask} - server.id = 6225
[2020-08-28 18:53:42,784] INFO {io.debezium.connector.common.BaseSourceTask} - database.history = io.debezium.relational.history.FileDatabaseHistory
[2020-08-28 18:53:42,889] INFO {io.debezium.util.Threads} - Requested thread factory for connector SqlServerConnector, id = localhost_1433 named = error-handler
[2020-08-28 18:53:42,893] INFO {io.debezium.util.Threads} - Requested thread factory for connector SqlServerConnector, id = localhost_1433 named = change-event-source-coordinator
[2020-08-28 18:53:42,895] INFO {io.debezium.util.Threads} - Creating thread debezium-sqlserverconnector-localhost_1433-change-event-source-coordinator
[2020-08-28 18:53:42,901] INFO {io.debezium.connector.sqlserver.SqlServerSnapshotChangeEventSource} - No previous offset has been found
[2020-08-28 18:53:42,901] INFO {io.debezium.connector.sqlserver.SqlServerSnapshotChangeEventSource} - According to the connector configuration both schema and data will be snapshotted
[2020-08-28 18:53:42,901] INFO {io.debezium.relational.RelationalSnapshotChangeEventSource} - Snapshot step 1 - Preparing
[2020-08-28 18:53:42,902] INFO {io.debezium.relational.RelationalSnapshotChangeEventSource} - Snapshot step 2 - Determining captured tables
[2020-08-28 18:53:43,101] INFO {io.debezium.relational.RelationalSnapshotChangeEventSource} - Snapshot step 3 - Locking captured tables
[2020-08-28 18:53:43,103] INFO {io.debezium.connector.sqlserver.SqlServerSnapshotChangeEventSource} - Executing schema locking
[2020-08-28 18:53:43,103] INFO {io.debezium.connector.sqlserver.SqlServerSnapshotChangeEventSource} - Locking table QBS.dbo.ReportManagement
[2020-08-28 18:53:43,104] INFO {io.debezium.relational.RelationalSnapshotChangeEventSource} - Snapshot step 4 - Determining snapshot offset
[2020-08-28 18:53:43,109] INFO {io.debezium.relational.RelationalSnapshotChangeEventSource} - Snapshot step 5 - Reading structure of captured tables
[2020-08-28 18:53:43,109] INFO {io.debezium.connector.sqlserver.SqlServerSnapshotChangeEventSource} - Reading structure of schema 'QBS'
[2020-08-28 18:53:43,611] INFO {io.debezium.relational.RelationalSnapshotChangeEventSource} - Snapshot step 6 - Persisting schema history
[2020-08-28 18:53:43,631] INFO {io.debezium.connector.sqlserver.SqlServerSnapshotChangeEventSource} - Schema locks released.
[2020-08-28 18:53:43,631] INFO {io.debezium.relational.RelationalSnapshotChangeEventSource} - Snapshot step 7 - Snapshotting data
[2020-08-28 18:53:43,632] INFO {io.debezium.relational.RelationalSnapshotChangeEventSource} - Exporting data from table 'QBS.dbo.ReportManagement'
[2020-08-28 18:53:43,632] INFO {io.debezium.relational.RelationalSnapshotChangeEventSource} - For table 'QBS.dbo.ReportManagement' using select statement: 'SELECT * FROM [dbo].[ReportManagement]'
[2020-08-28 18:53:43,645] INFO {io.debezium.relational.RelationalSnapshotChangeEventSource} - Finished exporting 38 records for table 'QBS.dbo.ReportManagement'; total duration '00:00:00.012'
[2020-08-28 18:53:43,647] INFO {io.debezium.relational.RelationalSnapshotChangeEventSource} - Snapshot step 8 - Finalizing
[2020-08-28 18:53:43,727] INFO {io.debezium.connector.sqlserver.SqlServerStreamingChangeEventSource} - Last position recorded in offsets is 00000060:000043e8:0001(NULL)[1]
[2020-08-28 19:26:12,686] INFO {io.debezium.connector.sqlserver.SqlServerStreamingChangeEventSource} - Schema will be changed for Capture instance "dbo_ReportManagement" [sourceTableId=QBS.dbo.ReportManagement, changeTableId=QBS.cdc.dbo_ReportManagement_CT, startLsn=00000060:00005938:0030, changeTableObjectId=1618104805, stopLsn=NULL]
[2020-08-28 19:30:07,339] INFO {org.wso2.msf4j.internal.websocket.WebSocketServerSC} - All required capabilities are available of WebSocket service component is available.
[2020-08-28 19:30:07,521] INFO {org.wso2.carbon.metrics.core.config.model.JmxReporterConfig} - Creating JMX reporter for Metrics with domain 'org.wso2.carbon.metrics'
[2020-08-28 19:30:07,534] INFO {org.wso2.msf4j.analytics.metrics.MetricsComponent} - Metrics Component is activated
[2020-08-28 19:30:07,536] INFO {org.wso2.carbon.databridge.agent.internal.DataAgentDS} - Successfully deployed Agent Server
[2020-08-28 19:30:07,548] INFO {org.wso2.msf4j.internal.websocket.EndpointsRegistryImpl} - Endpoint Registered : /console
[2020-08-28 19:30:07,990] INFO {org.wso2.msf4j.MicroservicesRunner} - Microservices server started in 301ms
[2020-08-28 19:30:07,991] INFO {org.wso2.transport.http.netty.contractimpl.listener.ServerConnectorBootstrap$HttpServerConnector} - HTTP(S) Interface starting on host 0.0.0.0 and port 7370
[2020-08-28 19:30:07,998] INFO {org.wso2.carbon.event.simulator.core.service.CSVFileDeployer} - CSV file deployer initiated.
[2020-08-28 19:30:08,000] INFO {org.wso2.carbon.event.simulator.core.service.SimulationConfigDeployer} - Simulation config deployer initiated.
[2020-08-28 19:30:09,487] INFO {org.wso2.carbon.business.rules.templates.editor.core.internal.StartupComponent} - Template Editor Started on : http://192.168.42.59:9390/template-editor
[2020-08-28 19:30:09,603] INFO {org.wso2.carbon.siddhi.editor.core.internal.StartupComponent} - Editor Started on : http://192.168.42.59:9390/editor
[2020-08-28 19:30:09,754] INFO {org.wso2.carbon.siddhi.editor.core.internal.WorkspaceDeployer} - Siddhi App CDCWithListeningMode-Update.siddhi successfully deployed.
[2020-08-28 19:30:09,798] INFO {org.wso2.carbon.siddhi.editor.core.internal.WorkspaceDeployer} - Siddhi App Delete_fromUsers_toCustomer.siddhi successfully deployed.
[2020-08-28 19:30:09,805] INFO {org.wso2.carbon.siddhi.editor.core.internal.WorkspaceDeployer} - Siddhi App Delete_fromUsers_toSale.siddhi successfully deployed.
[2020-08-28 19:30:09,821] INFO {org.wso2.carbon.siddhi.editor.core.internal.WorkspaceDeployer} - Siddhi App Delete_fromUsers_toSales.siddhi successfully deployed.
[2020-08-28 19:30:09,827] INFO {org.wso2.carbon.siddhi.editor.core.internal.WorkspaceDeployer} - Siddhi App Delete_Log.siddhi successfully deployed.
[2020-08-28 19:30:09,829] INFO {org.wso2.carbon.siddhi.editor.core.internal.WorkspaceDeployer} - Siddhi App Deployment.siddhi successfully deployed.
[2020-08-28 19:30:09,973] INFO {org.wso2.carbon.siddhi.editor.core.internal.WorkspaceDeployer} - Siddhi App HttpRequestResponseSample.siddhi successfully deployed.
[2020-08-28 19:30:09,981] INFO {org.wso2.carbon.siddhi.editor.core.internal.WorkspaceDeployer} - Siddhi App Insert.siddhi successfully deployed.
[2020-08-28 19:30:09,985] INFO {org.wso2.carbon.siddhi.editor.core.internal.WorkspaceDeployer} - Siddhi App InsertCustomers.siddhi successfully deployed.
[2020-08-28 19:30:09,995] INFO {org.wso2.carbon.siddhi.editor.core.internal.WorkspaceDeployer} - Siddhi App Insert_CapturedColumn.siddhi successfully deployed.
[2020-08-28 19:30:10,009] INFO {org.wso2.carbon.siddhi.editor.core.internal.WorkspaceDeployer} - Siddhi App Insert_CSV.siddhi successfully deployed.
[2020-08-28 19:30:10,014] INFO {org.wso2.carbon.siddhi.editor.core.internal.WorkspaceDeployer} - Siddhi App Insert_DB_Log.siddhi successfully deployed.
[2020-08-28 19:30:10,028] INFO {org.wso2.carbon.siddhi.editor.core.internal.WorkspaceDeployer} - Siddhi App Insert_fromUsers_toCustomers.siddhi successfully deployed.
[2020-08-28 19:30:10,037] INFO {org.wso2.carbon.siddhi.editor.core.internal.WorkspaceDeployer} - Siddhi App Insert_fromUsers_toSales.siddhi successfully deployed.
[2020-08-28 19:30:10,042] INFO {org.wso2.carbon.siddhi.editor.core.internal.WorkspaceDeployer} - Siddhi App Insert_fromUsers_toSales_Concat.siddhi successfully deployed.
[2020-08-28 19:30:10,045] INFO {org.wso2.carbon.siddhi.editor.core.internal.WorkspaceDeployer} - Siddhi App Insert_Log.siddhi successfully deployed.
[2020-08-28 19:30:10,062] INFO {org.wso2.carbon.siddhi.editor.core.internal.WorkspaceDeployer} - Siddhi App Insert_log_file.siddhi successfully deployed.
[2020-08-28 19:30:10,077] INFO {org.wso2.carbon.siddhi.editor.core.internal.WorkspaceDeployer} - Siddhi App Insert_Log_Report.siddhi successfully deployed.
[2020-08-28 19:30:10,096] INFO {org.wso2.carbon.siddhi.editor.core.internal.WorkspaceDeployer} - Siddhi App ReceiveAndCount.siddhi successfully deployed.
[2020-08-28 19:30:10,120] INFO {org.wso2.carbon.siddhi.editor.core.internal.WorkspaceDeployer} - Siddhi App ReceiveAPI.siddhi successfully deployed.
[2020-08-28 19:30:10,124] INFO {org.wso2.carbon.siddhi.editor.core.internal.WorkspaceDeployer} - Siddhi App ReceiveHTTPInJsonFormatWithCustomMapping.siddhi successfully deployed.
[2020-08-28 19:30:10,139] INFO {org.wso2.carbon.siddhi.editor.core.internal.WorkspaceDeployer} - Siddhi App Siddhi.siddhi successfully deployed.
[2020-08-28 19:30:10,143] INFO {org.wso2.carbon.siddhi.editor.core.internal.WorkspaceDeployer} - Siddhi App SiddhiApp.siddhi successfully deployed.
[2020-08-28 19:30:10,160] INFO {org.wso2.carbon.siddhi.editor.core.internal.WorkspaceDeployer} - Siddhi App SiddhiApp1.siddhi successfully deployed.
[2020-08-28 19:30:10,166] INFO {org.wso2.carbon.siddhi.editor.core.internal.WorkspaceDeployer} - Siddhi App SiddhiAppDatasource.siddhi successfully deployed.
[2020-08-28 19:30:10,174] INFO {org.wso2.carbon.siddhi.editor.core.internal.WorkspaceDeployer} - Siddhi App update_fromUsers_toCustomers.siddhi successfully deployed.
[2020-08-28 19:30:10,179] INFO {org.wso2.carbon.siddhi.editor.core.internal.WorkspaceDeployer} - Siddhi App update_fromUsers_toSales.siddhi successfully deployed.
[2020-08-28 19:30:10,183] INFO {org.wso2.carbon.siddhi.editor.core.internal.WorkspaceDeployer} - Siddhi App Update_Log.siddhi successfully deployed.
[2020-08-28 19:30:10,184] INFO {org.wso2.carbon.siddhi.extensions.installer.core.internal.StartupComponent} - Siddhi Extensions Installer Core Startup Listener Service Component is Activated.
[2020-08-28 19:30:10,191] INFO {org.wso2.msf4j.internal.MicroservicesServerSC} - All microservices are available
[2020-08-28 19:30:10,194] INFO {org.wso2.transport.http.netty.contractimpl.listener.ServerConnectorBootstrap$HttpServerConnector} - HTTP(S) Interface starting on host 0.0.0.0 and port 9390
[2020-08-28 19:30:10,194] INFO {org.wso2.transport.http.netty.contractimpl.listener.ServerConnectorBootstrap$HttpServerConnector} - HTTP(S) Interface starting on host 0.0.0.0 and port 9743
[2020-08-28 19:30:10,199] INFO {org.wso2.carbon.analytics.idp.client.core.utils.IdPServiceUtils} - IdP client of type 'local' is started.
[2020-08-28 19:30:10,349] INFO {org.wso2.carbon.databridge.receiver.binary.internal.BinaryDataReceiverServiceComponent} - org.wso2.carbon.databridge.receiver.binary.internal.Service Component is activated
[2020-08-28 19:30:10,354] INFO {org.wso2.carbon.databridge.receiver.thrift.internal.ThriftDataReceiverDS} - Service Component is activated
[2020-08-28 19:30:10,649] INFO {org.wso2.carbon.uiserver.internal.deployment.listener.AppTransportBinder} - Web app 'business-rules' is available at 'https://192.168.42.59:9743/business-rules'.
[2020-08-28 19:30:10,912] INFO {org.wso2.carbon.uiserver.internal.deployment.listener.AppTransportBinder} - Web app 'policies' is available at 'https://192.168.42.59:9743/policies'.
[2020-08-28 19:30:10,915] INFO {org.wso2.carbon.kernel.internal.CarbonStartupHandler} - WSO2 Streaming Integrator Tooling started in 6.219 sec
[2020-08-28 19:31:59,763] INFO {org.wso2.carbon.siddhi.editor.core.internal.EditorConsoleService} - Connected with user : 8cec4bfffec672ff-0000413c-00000009-3961d66be7c406b4-619f6c87
[2020-08-28 19:32:21,006] INFO {org.apache.kafka.connect.json.JsonConverterConfig} - JsonConverterConfig values:
converter.type = key
schemas.cache.size = 1000
schemas.enable = true

[2020-08-28 19:32:21,007] INFO {org.apache.kafka.connect.json.JsonConverterConfig} - JsonConverterConfig values:
converter.type = value
schemas.cache.size = 1000
schemas.enable = false

[2020-08-28 19:32:21,010] INFO {io.debezium.embedded.EmbeddedEngine$EmbeddedConfig} - EmbeddedConfig values:
access.control.allow.methods =
access.control.allow.origin =
bootstrap.servers = [localhost:9092]
client.dns.lookup = default
config.providers = []
connector.client.config.override.policy = None
header.converter = class org.apache.kafka.connect.storage.SimpleHeaderConverter
internal.key.converter = class org.apache.kafka.connect.json.JsonConverter
internal.value.converter = class org.apache.kafka.connect.json.JsonConverter
key.converter = class org.apache.kafka.connect.json.JsonConverter
listeners = null
metric.reporters = []
metrics.num.samples = 2
metrics.recording.level = INFO
metrics.sample.window.ms = 30000
offset.flush.interval.ms = 60000
offset.flush.timeout.ms = 5000
offset.storage.file.filename =
offset.storage.partitions = null
offset.storage.replication.factor = null
offset.storage.topic =
plugin.path = null
rest.advertised.host.name = null
rest.advertised.listener = null
rest.advertised.port = null
rest.extension.classes = []
rest.host.name = null
rest.port = 8083
ssl.client.auth = none
task.shutdown.graceful.timeout.ms = 5000
value.converter = class org.apache.kafka.connect.json.JsonConverter

[2020-08-28 19:32:21,010] INFO {org.apache.kafka.connect.runtime.WorkerConfig} - Worker configuration property 'internal.key.converter' is deprecated and may be removed in an upcoming release. The specified value 'org.apache.kafka.connect.json.JsonConverter' matches the default, so this property can be safely removed from the worker configuration.
[2020-08-28 19:32:21,011] INFO {org.apache.kafka.connect.runtime.WorkerConfig} - Worker configuration property 'internal.value.converter' is deprecated and may be removed in an upcoming release. The specified value 'org.apache.kafka.connect.json.JsonConverter' matches the default, so this property can be safely removed from the worker configuration.
[2020-08-28 19:32:21,095] INFO {io.debezium.connector.common.BaseSourceTask} - Starting SqlServerConnectorTask with configuration:
[2020-08-28 19:32:21,096] INFO {io.debezium.connector.common.BaseSourceTask} - connector.class = io.debezium.connector.sqlserver.SqlServerConnector
[2020-08-28 19:32:21,097] INFO {io.debezium.connector.common.BaseSourceTask} - database.history.file.filename = C:\PROGRA1\WSO2\STREAM1\103623~1.1\bin..\cdc\history\Insert_CapturedColumn\insertStream.dat
[2020-08-28 19:32:21,097] INFO {io.debezium.connector.common.BaseSourceTask} - database.user = test
[2020-08-28 19:32:21,097] INFO {io.debezium.connector.common.BaseSourceTask} - database.dbname = QBS
[2020-08-28 19:32:21,097] INFO {io.debezium.connector.common.BaseSourceTask} - offset.storage = io.siddhi.extension.io.cdc.source.listening.InMemoryOffsetBackingStore
[2020-08-28 19:32:21,098] INFO {io.debezium.connector.common.BaseSourceTask} - database.server.name = localhost_1433
[2020-08-28 19:32:21,098] INFO {io.debezium.connector.common.BaseSourceTask} - database.port = 1433
[2020-08-28 19:32:21,098] INFO {io.debezium.connector.common.BaseSourceTask} - table.whitelist = dbo.ReportManagement
[2020-08-28 19:32:21,098] INFO {io.debezium.connector.common.BaseSourceTask} - cdc.source.object = 1455150863
[2020-08-28 19:32:21,099] INFO {io.debezium.connector.common.BaseSourceTask} - database.hostname = localhost
[2020-08-28 19:32:21,099] INFO {io.debezium.connector.common.BaseSourceTask} - database.password = ********
[2020-08-28 19:32:21,099] INFO {io.debezium.connector.common.BaseSourceTask} - name = Insert_CapturedColumninsertStream
[2020-08-28 19:32:21,100] INFO {io.debezium.connector.common.BaseSourceTask} - server.id = 6265
[2020-08-28 19:32:21,100] INFO {io.debezium.connector.common.BaseSourceTask} - database.history = io.debezium.relational.history.FileDatabaseHistory
[2020-08-28 19:32:21,307] INFO {io.debezium.util.Threads} - Requested thread factory for connector SqlServerConnector, id = localhost_1433 named = error-handler
[2020-08-28 19:32:21,311] INFO {io.debezium.util.Threads} - Requested thread factory for connector SqlServerConnector, id = localhost_1433 named = change-event-source-coordinator
[2020-08-28 19:32:21,312] INFO {io.debezium.util.Threads} - Creating thread debezium-sqlserverconnector-localhost_1433-change-event-source-coordinator
[2020-08-28 19:32:21,316] INFO {io.debezium.connector.sqlserver.SqlServerSnapshotChangeEventSource} - No previous offset has been found
[2020-08-28 19:32:21,316] INFO {io.debezium.connector.sqlserver.SqlServerSnapshotChangeEventSource} - According to the connector configuration both schema and data will be snapshotted
[2020-08-28 19:32:21,317] INFO {io.debezium.relational.RelationalSnapshotChangeEventSource} - Snapshot step 1 - Preparing
[2020-08-28 19:32:21,317] INFO {io.debezium.relational.RelationalSnapshotChangeEventSource} - Snapshot step 2 - Determining captured tables
[2020-08-28 19:32:21,376] INFO {io.debezium.relational.RelationalSnapshotChangeEventSource} - Snapshot step 3 - Locking captured tables
[2020-08-28 19:32:21,378] INFO {io.debezium.connector.sqlserver.SqlServerSnapshotChangeEventSource} - Executing schema locking
[2020-08-28 19:32:21,378] INFO {io.debezium.connector.sqlserver.SqlServerSnapshotChangeEventSource} - Locking table QBS.dbo.ReportManagement
[2020-08-28 19:32:21,379] INFO {io.debezium.relational.RelationalSnapshotChangeEventSource} - Snapshot step 4 - Determining snapshot offset
[2020-08-28 19:32:21,381] INFO {io.debezium.relational.RelationalSnapshotChangeEventSource} - Snapshot step 5 - Reading structure of captured tables
[2020-08-28 19:32:21,382] INFO {io.debezium.connector.sqlserver.SqlServerSnapshotChangeEventSource} - Reading structure of schema 'QBS'
[2020-08-28 19:32:21,532] INFO {io.debezium.relational.RelationalSnapshotChangeEventSource} - Snapshot step 6 - Persisting schema history
[2020-08-28 19:32:21,560] INFO {io.debezium.connector.sqlserver.SqlServerSnapshotChangeEventSource} - Schema locks released.
[2020-08-28 19:32:21,561] INFO {io.debezium.relational.RelationalSnapshotChangeEventSource} - Snapshot step 7 - Snapshotting data
[2020-08-28 19:32:21,561] INFO {io.debezium.relational.RelationalSnapshotChangeEventSource} - Exporting data from table 'QBS.dbo.ReportManagement'
[2020-08-28 19:32:21,562] INFO {io.debezium.relational.RelationalSnapshotChangeEventSource} - For table 'QBS.dbo.ReportManagement' using select statement: 'SELECT * FROM [dbo].[ReportManagement]'
[2020-08-28 19:32:21,573] INFO {io.debezium.relational.RelationalSnapshotChangeEventSource} - Finished exporting 38 records for table 'QBS.dbo.ReportManagement'; total duration '00:00:00.012'
[2020-08-28 19:32:21,575] INFO {io.debezium.relational.RelationalSnapshotChangeEventSource} - Snapshot step 8 - Finalizing
[2020-08-28 19:32:21,605] INFO {io.debezium.connector.sqlserver.SqlServerStreamingChangeEventSource} - Last position recorded in offsets is 00000060:000063b0:0001(NULL)[1]
[2020-08-28 19:33:13,592] ERROR {io.debezium.relational.TableSchemaBuilder} - Error requesting a row value, row: 2, requested index: 2 at position 2
[2020-08-28 19:33:13,592] ERROR {io.debezium.pipeline.ErrorHandler} - Producer failure org.apache.kafka.connect.errors.ConnectException: Data row is smaller than a column index, internal schema representation is probably out of sync with real database schema
at io.debezium.relational.TableSchemaBuilder.validateIncomingRowToInternalMetadata(TableSchemaBuilder.java:217)
at io.debezium.relational.TableSchemaBuilder.lambda$createValueGenerator$4(TableSchemaBuilder.java:243)
at io.debezium.relational.TableSchema.valueFromColumnData(TableSchema.java:143)
at io.debezium.relational.RelationalChangeRecordEmitter.emitCreateRecord(RelationalChangeRecordEmitter.java:68)
at io.debezium.relational.RelationalChangeRecordEmitter.emitChangeRecords(RelationalChangeRecordEmitter.java:43)
at io.debezium.pipeline.EventDispatcher.dispatchDataChangeEvent(EventDispatcher.java:141)
at io.debezium.connector.sqlserver.SqlServerStreamingChangeEventSource.lambda$execute$1(SqlServerStreamingChangeEventSource.java:228)
at io.debezium.jdbc.JdbcConnection.prepareQuery(JdbcConnection.java:493)
at io.debezium.connector.sqlserver.SqlServerConnection.getChangesForTables(SqlServerConnection.java:143)
at io.debezium.connector.sqlserver.SqlServerStreamingChangeEventSource.execute(SqlServerStreamingChangeEventSource.java:151)
at io.debezium.pipeline.ChangeEventSourceCoordinator.lambda$start$0(ChangeEventSourceCoordinator.java:91)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)

[2020-08-28 19:33:13,594] INFO {io.debezium.util.Threads} - Creating thread debezium-sqlserverconnector-localhost_1433-error-handler
[2020-08-28 19:33:13,595] ERROR {io.debezium.connector.sqlserver.SqlServerConnectorTask} - Interrupted while stopping java.lang.InterruptedException
at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.awaitNanos(AbstractQueuedSynchronizer.java:2067)
at java.util.concurrent.ThreadPoolExecutor.awaitTermination(ThreadPoolExecutor.java:1475)
at java.util.concurrent.Executors$DelegatedExecutorService.awaitTermination(Executors.java:675)
at io.debezium.pipeline.ErrorHandler.stop(ErrorHandler.java:52)
at io.debezium.connector.sqlserver.SqlServerConnectorTask.cleanupResources(SqlServerConnectorTask.java:205)
at io.debezium.pipeline.ErrorHandler.lambda$setProducerThrowable$0(ErrorHandler.java:42)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)

[2020-08-28 19:33:13,806] INFO {io.debezium.connector.sqlserver.SqlServerConnectorTask} - Connector has already been stopped
[2020-08-28 19:33:13,810] ERROR {io.siddhi.core.stream.input.source.Source} - Error on 'Insert_CapturedColumn'. Connection to the database lost. Error while connecting at Source 'cdc' at 'insertStream'. Will retry in '5 sec'. io.siddhi.core.exception.ConnectionUnavailableException: Connection to the database lost.
at io.siddhi.extension.io.cdc.source.CDCSource.lambda$connect$1(CDCSource.java:496)
at io.debezium.embedded.EmbeddedEngine.run(EmbeddedEngine.java:899)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.kafka.connect.errors.ConnectException: An exception ocurred in the change event producer. This connector will be stopped.
at io.debezium.connector.base.ChangeEventQueue.throwProducerFailureIfPresent(ChangeEventQueue.java:170)
at io.debezium.connector.base.ChangeEventQueue.poll(ChangeEventQueue.java:151)
at io.debezium.connector.sqlserver.SqlServerConnectorTask.poll(SqlServerConnectorTask.java:161)
at io.debezium.embedded.EmbeddedEngine.run(EmbeddedEngine.java:814)
... 3 more
Caused by: org.apache.kafka.connect.errors.ConnectException: Data row is smaller than a column index, internal schema representation is probably out of sync with real database schema
at io.debezium.relational.TableSchemaBuilder.validateIncomingRowToInternalMetadata(TableSchemaBuilder.java:217)
at io.debezium.relational.TableSchemaBuilder.lambda$createValueGenerator$4(TableSchemaBuilder.java:243)
at io.debezium.relational.TableSchema.valueFromColumnData(TableSchema.java:143)
at io.debezium.relational.RelationalChangeRecordEmitter.emitCreateRecord(RelationalChangeRecordEmitter.java:68)
at io.debezium.relational.RelationalChangeRecordEmitter.emitChangeRecords(RelationalChangeRecordEmitter.java:43)
at io.debezium.pipeline.EventDispatcher.dispatchDataChangeEvent(EventDispatcher.java:141)
at io.debezium.connector.sqlserver.SqlServerStreamingChangeEventSource.lambda$execute$1(SqlServerStreamingChangeEventSource.java:228)
at io.debezium.jdbc.JdbcConnection.prepareQuery(JdbcConnection.java:493)
at io.debezium.connector.sqlserver.SqlServerConnection.getChangesForTables(SqlServerConnection.java:143)
at io.debezium.connector.sqlserver.SqlServerStreamingChangeEventSource.execute(SqlServerStreamingChangeEventSource.java:151)
at io.debezium.pipeline.ChangeEventSourceCoordinator.lambda$start$0(ChangeEventSourceCoordinator.java:91)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
... 3 more

[2020-08-28 19:33:18,819] INFO {org.apache.kafka.connect.json.JsonConverterConfig} - JsonConverterConfig values:
converter.type = key
schemas.cache.size = 1000
schemas.enable = true

[2020-08-28 19:33:18,820] INFO {org.apache.kafka.connect.json.JsonConverterConfig} - JsonConverterConfig values:
converter.type = value
schemas.cache.size = 1000
schemas.enable = false

[2020-08-28 19:33:18,820] INFO {io.debezium.embedded.EmbeddedEngine$EmbeddedConfig} - EmbeddedConfig values:
access.control.allow.methods =
access.control.allow.origin =
bootstrap.servers = [localhost:9092]
client.dns.lookup = default
config.providers = []
connector.client.config.override.policy = None
header.converter = class org.apache.kafka.connect.storage.SimpleHeaderConverter
internal.key.converter = class org.apache.kafka.connect.json.JsonConverter
internal.value.converter = class org.apache.kafka.connect.json.JsonConverter
key.converter = class org.apache.kafka.connect.json.JsonConverter
listeners = null
metric.reporters = []
metrics.num.samples = 2
metrics.recording.level = INFO
metrics.sample.window.ms = 30000
offset.flush.interval.ms = 60000
offset.flush.timeout.ms = 5000
offset.storage.file.filename =
offset.storage.partitions = null
offset.storage.replication.factor = null
offset.storage.topic =
plugin.path = null
rest.advertised.host.name = null
rest.advertised.listener = null
rest.advertised.port = null
rest.extension.classes = []
rest.host.name = null
rest.port = 8083
ssl.client.auth = none
task.shutdown.graceful.timeout.ms = 5000
value.converter = class org.apache.kafka.connect.json.JsonConverter

[2020-08-28 19:33:18,822] INFO {org.apache.kafka.connect.runtime.WorkerConfig} - Worker configuration property 'internal.key.converter' is deprecated and may be removed in an upcoming release. The specified value 'org.apache.kafka.connect.json.JsonConverter' matches the default, so this property can be safely removed from the worker configuration.
[2020-08-28 19:33:18,822] INFO {org.apache.kafka.connect.runtime.WorkerConfig} - Worker configuration property 'internal.value.converter' is deprecated and may be removed in an upcoming release. The specified value 'org.apache.kafka.connect.json.JsonConverter' matches the default, so this property can be safely removed from the worker configuration.
[2020-08-28 19:33:18,873] INFO {io.debezium.connector.common.BaseSourceTask} - Starting SqlServerConnectorTask with configuration:
[2020-08-28 19:33:18,873] INFO {io.debezium.connector.common.BaseSourceTask} - connector.class = io.debezium.connector.sqlserver.SqlServerConnector
[2020-08-28 19:33:18,874] INFO {io.debezium.connector.common.BaseSourceTask} - database.history.file.filename = C:\PROGRA1\WSO2\STREAM1\103623~1.1\bin..\cdc\history\Insert_CapturedColumn\insertStream.dat
[2020-08-28 19:33:18,874] INFO {io.debezium.connector.common.BaseSourceTask} - database.user = test
[2020-08-28 19:33:18,874] INFO {io.debezium.connector.common.BaseSourceTask} - database.dbname = QBS
[2020-08-28 19:33:18,875] INFO {io.debezium.connector.common.BaseSourceTask} - offset.storage = io.siddhi.extension.io.cdc.source.listening.InMemoryOffsetBackingStore
[2020-08-28 19:33:18,875] INFO {io.debezium.connector.common.BaseSourceTask} - database.server.name = localhost_1433
[2020-08-28 19:33:18,875] INFO {io.debezium.connector.common.BaseSourceTask} - database.port = 1433
[2020-08-28 19:33:18,876] INFO {io.debezium.connector.common.BaseSourceTask} - table.whitelist = dbo.ReportManagement
[2020-08-28 19:33:18,876] INFO {io.debezium.connector.common.BaseSourceTask} - cdc.source.object = 1455150863
[2020-08-28 19:33:18,877] INFO {io.debezium.connector.common.BaseSourceTask} - database.hostname = localhost
[2020-08-28 19:33:18,877] INFO {io.debezium.connector.common.BaseSourceTask} - database.password = ********
[2020-08-28 19:33:18,878] INFO {io.debezium.connector.common.BaseSourceTask} - name = Insert_CapturedColumninsertStream
[2020-08-28 19:33:18,878] INFO {io.debezium.connector.common.BaseSourceTask} - server.id = 6265
[2020-08-28 19:33:18,878] INFO {io.debezium.connector.common.BaseSourceTask} - database.history = io.debezium.relational.history.FileDatabaseHistory
[2020-08-28 19:33:18,922] INFO {io.debezium.connector.sqlserver.SqlServerConnectorTask} - Found previous offset SqlServerOffsetContext [sourceInfoSchema=Schema{io.debezium.connector.sqlserver.Source:STRUCT}, sourceInfo=SourceInfo [serverName=localhost_1433, changeLsn=NULL, commitLsn=00000060:000063b0:0001, eventSerialNo=null, snapshot=FALSE, sourceTime=null], partition={server=localhost_1433}, snapshotCompleted=true, eventSerialNo=0]
[2020-08-28 19:33:18,942] INFO {io.debezium.util.Threads} - Requested thread factory for connector SqlServerConnector, id = localhost_1433 named = error-handler
[2020-08-28 19:33:18,943] INFO {io.debezium.util.Threads} - Requested thread factory for connector SqlServerConnector, id = localhost_1433 named = change-event-source-coordinator
[2020-08-28 19:33:18,943] INFO {io.debezium.util.Threads} - Creating thread debezium-sqlserverconnector-localhost_1433-change-event-source-coordinator
[2020-08-28 19:33:18,944] INFO {io.debezium.connector.sqlserver.SqlServerSnapshotChangeEventSource} - A previous offset indicating a completed snapshot has been found. Neither schema nor data will be snapshotted.
[2020-08-28 19:33:18,962] INFO {io.debezium.connector.sqlserver.SqlServerStreamingChangeEventSource} - Last position recorded in offsets is 00000060:000063b0:0001(NULL)[0]
[2020-08-28 19:33:18,967] ERROR {io.debezium.relational.TableSchemaBuilder} - Error requesting a row value, row: 2, requested index: 2 at position 2
[2020-08-28 19:33:18,968] ERROR {io.debezium.pipeline.ErrorHandler} - Producer failure org.apache.kafka.connect.errors.ConnectException: Data row is smaller than a column index, internal schema representation is probably out of sync with real database schema
at io.debezium.relational.TableSchemaBuilder.validateIncomingRowToInternalMetadata(TableSchemaBuilder.java:217)
at io.debezium.relational.TableSchemaBuilder.lambda$createValueGenerator$4(TableSchemaBuilder.java:243)
at io.debezium.relational.TableSchema.valueFromColumnData(TableSchema.java:143)
at io.debezium.relational.RelationalChangeRecordEmitter.emitCreateRecord(RelationalChangeRecordEmitter.java:68)
at io.debezium.relational.RelationalChangeRecordEmitter.emitChangeRecords(RelationalChangeRecordEmitter.java:43)
at io.debezium.pipeline.EventDispatcher.dispatchDataChangeEvent(EventDispatcher.java:141)
at io.debezium.connector.sqlserver.SqlServerStreamingChangeEventSource.lambda$execute$1(SqlServerStreamingChangeEventSource.java:228)
at io.debezium.jdbc.JdbcConnection.prepareQuery(JdbcConnection.java:493)
at io.debezium.connector.sqlserver.SqlServerConnection.getChangesForTables(SqlServerConnection.java:143)
at io.debezium.connector.sqlserver.SqlServerStreamingChangeEventSource.execute(SqlServerStreamingChangeEventSource.java:151)
at io.debezium.pipeline.ChangeEventSourceCoordinator.lambda$start$0(ChangeEventSourceCoordinator.java:91)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)

[2020-08-28 19:33:18,970] INFO {io.debezium.util.Threads} - Creating thread debezium-sqlserverconnector-localhost_1433-error-handler
[2020-08-28 19:33:18,971] ERROR {io.debezium.connector.sqlserver.SqlServerConnectorTask} - Interrupted while stopping java.lang.InterruptedException
at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.awaitNanos(AbstractQueuedSynchronizer.java:2067)
at java.util.concurrent.ThreadPoolExecutor.awaitTermination(ThreadPoolExecutor.java:1475)
at java.util.concurrent.Executors$DelegatedExecutorService.awaitTermination(Executors.java:675)
at io.debezium.pipeline.ErrorHandler.stop(ErrorHandler.java:52)
at io.debezium.connector.sqlserver.SqlServerConnectorTask.cleanupResources(SqlServerConnectorTask.java:205)
at io.debezium.pipeline.ErrorHandler.lambda$setProducerThrowable$0(ErrorHandler.java:42)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)

[2020-08-28 19:33:19,443] INFO {io.debezium.connector.sqlserver.SqlServerConnectorTask} - Connector has already been stopped
[2020-08-28 19:33:19,445] ERROR {io.siddhi.core.stream.input.source.Source} - Error on 'Insert_CapturedColumn'. Connection to the database lost. Error while connecting at Source 'cdc' at 'insertStream'. Will retry in '5 sec'. io.siddhi.core.exception.ConnectionUnavailableException: Connection to the database lost.
at io.siddhi.extension.io.cdc.source.CDCSource.lambda$connect$1(CDCSource.java:496)
at io.debezium.embedded.EmbeddedEngine.run(EmbeddedEngine.java:899)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.kafka.connect.errors.ConnectException: An exception ocurred in the change event producer. This connector will be stopped.
at io.debezium.connector.base.ChangeEventQueue.throwProducerFailureIfPresent(ChangeEventQueue.java:170)
at io.debezium.connector.base.ChangeEventQueue.poll(ChangeEventQueue.java:151)
at io.debezium.connector.sqlserver.SqlServerConnectorTask.poll(SqlServerConnectorTask.java:161)
at io.debezium.embedded.EmbeddedEngine.run(EmbeddedEngine.java:814)
... 3 more
Caused by: org.apache.kafka.connect.errors.ConnectException: Data row is smaller than a column index, internal schema representation is probably out of sync with real database schema
at io.debezium.relational.TableSchemaBuilder.validateIncomingRowToInternalMetadata(TableSchemaBuilder.java:217)
at io.debezium.relational.TableSchemaBuilder.lambda$createValueGenerator$4(TableSchemaBuilder.java:243)
at io.debezium.relational.TableSchema.valueFromColumnData(TableSchema.java:143)
at io.debezium.relational.RelationalChangeRecordEmitter.emitCreateRecord(RelationalChangeRecordEmitter.java:68)
at io.debezium.relational.RelationalChangeRecordEmitter.emitChangeRecords(RelationalChangeRecordEmitter.java:43)
at io.debezium.pipeline.EventDispatcher.dispatchDataChangeEvent(EventDispatcher.java:141)
at io.debezium.connector.sqlserver.SqlServerStreamingChangeEventSource.lambda$execute$1(SqlServerStreamingChangeEventSource.java:228)
at io.debezium.jdbc.JdbcConnection.prepareQuery(JdbcConnection.java:493)
at io.debezium.connector.sqlserver.SqlServerConnection.getChangesForTables(SqlServerConnection.java:143)
at io.debezium.connector.sqlserver.SqlServerStreamingChangeEventSource.execute(SqlServerStreamingChangeEventSource.java:151)
at io.debezium.pipeline.ChangeEventSourceCoordinator.lambda$start$0(ChangeEventSourceCoordinator.java:91)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
... 3 more

(The connector then retries every 5 seconds. The cycles at 19:33:24 and 19:33:30 repeat the log above verbatim apart from timestamps: the same JsonConverterConfig/EmbeddedConfig dump, the same SqlServerConnectorTask startup with the same configuration, the same `Data row is smaller than a column index, internal schema representation is probably out of sync with real database schema` producer failure, and the same `ConnectionUnavailableException` from the Siddhi cdc source. Duplicate cycles omitted for brevity.)
[2020-08-28 19:33:31,514] WARN {org.wso2.msf4j.internal.MSF4JHttpConnectorListener} - Error in http connector listener : 'Remote client closed the connection before initiating inbound request'
(At 19:33:35 the next retry cycle begins again with the identical configuration dump and startup sequence, and keeps looping with the same error.)
[2020-08-28 19:33:35,796] INFO {io.debezium.util.Threads} - Creating thread debezium-sqlserverconnector-localhost_1433-change-event-source-coordinator
[2020-08-28 19:33:35,797] INFO {io.debezium.connector.sqlserver.SqlServerSnapshotChangeEventSource} - A previous offset indicating a completed snapshot has been found. Neither schema nor data will be snapshotted.
[2020-08-28 19:33:35,807] INFO {io.debezium.connector.sqlserver.SqlServerStreamingChangeEventSource} - Last position recorded in offsets is 00000060:000063b0:0001(NULL)[0]
[2020-08-28 19:33:35,811] ERROR {io.debezium.relational.TableSchemaBuilder} - Error requesting a row value, row: 2, requested index: 2 at position 2
[2020-08-28 19:33:35,812] ERROR {io.debezium.pipeline.ErrorHandler} - Producer failure org.apache.kafka.connect.errors.ConnectException: Data row is smaller than a column index, internal schema representation is probably out of sync with real database schema
at io.debezium.relational.TableSchemaBuilder.validateIncomingRowToInternalMetadata(TableSchemaBuilder.java:217)
at io.debezium.relational.TableSchemaBuilder.lambda$createValueGenerator$4(TableSchemaBuilder.java:243)
at io.debezium.relational.TableSchema.valueFromColumnData(TableSchema.java:143)
at io.debezium.relational.RelationalChangeRecordEmitter.emitCreateRecord(RelationalChangeRecordEmitter.java:68)
at io.debezium.relational.RelationalChangeRecordEmitter.emitChangeRecords(RelationalChangeRecordEmitter.java:43)
at io.debezium.pipeline.EventDispatcher.dispatchDataChangeEvent(EventDispatcher.java:141)
at io.debezium.connector.sqlserver.SqlServerStreamingChangeEventSource.lambda$execute$1(SqlServerStreamingChangeEventSource.java:228)
at io.debezium.jdbc.JdbcConnection.prepareQuery(JdbcConnection.java:493)
at io.debezium.connector.sqlserver.SqlServerConnection.getChangesForTables(SqlServerConnection.java:143)
at io.debezium.connector.sqlserver.SqlServerStreamingChangeEventSource.execute(SqlServerStreamingChangeEventSource.java:151)
at io.debezium.pipeline.ChangeEventSourceCoordinator.lambda$start$0(ChangeEventSourceCoordinator.java:91)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)

[2020-08-28 19:33:35,813] INFO {io.debezium.util.Threads} - Creating thread debezium-sqlserverconnector-localhost_1433-error-handler
[2020-08-28 19:33:35,814] ERROR {io.debezium.connector.sqlserver.SqlServerConnectorTask} - Interrupted while stopping java.lang.InterruptedException
at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.awaitNanos(AbstractQueuedSynchronizer.java:2067)
at java.util.concurrent.ThreadPoolExecutor.awaitTermination(ThreadPoolExecutor.java:1475)
at java.util.concurrent.Executors$DelegatedExecutorService.awaitTermination(Executors.java:675)
at io.debezium.pipeline.ErrorHandler.stop(ErrorHandler.java:52)
at io.debezium.connector.sqlserver.SqlServerConnectorTask.cleanupResources(SqlServerConnectorTask.java:205)
at io.debezium.pipeline.ErrorHandler.lambda$setProducerThrowable$0(ErrorHandler.java:42)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)

[2020-08-28 19:33:36,295] INFO {io.debezium.connector.sqlserver.SqlServerConnectorTask} - Connector has already been stopped
[2020-08-28 19:33:36,296] ERROR {io.siddhi.core.stream.input.source.Source} - Error on 'Insert_CapturedColumn'. Connection to the database lost. Error while connecting at Source 'cdc' at 'insertStream'. Will retry in '5 sec'. io.siddhi.core.exception.ConnectionUnavailableException: Connection to the database lost.
at io.siddhi.extension.io.cdc.source.CDCSource.lambda$connect$1(CDCSource.java:496)
at io.debezium.embedded.EmbeddedEngine.run(EmbeddedEngine.java:899)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.kafka.connect.errors.ConnectException: An exception ocurred in the change event producer. This connector will be stopped.
at io.debezium.connector.base.ChangeEventQueue.throwProducerFailureIfPresent(ChangeEventQueue.java:170)
at io.debezium.connector.base.ChangeEventQueue.poll(ChangeEventQueue.java:151)
at io.debezium.connector.sqlserver.SqlServerConnectorTask.poll(SqlServerConnectorTask.java:161)
at io.debezium.embedded.EmbeddedEngine.run(EmbeddedEngine.java:814)
... 3 more
Caused by: org.apache.kafka.connect.errors.ConnectException: Data row is smaller than a column index, internal schema representation is probably out of sync with real database schema
at io.debezium.relational.TableSchemaBuilder.validateIncomingRowToInternalMetadata(TableSchemaBuilder.java:217)
at io.debezium.relational.TableSchemaBuilder.lambda$createValueGenerator$4(TableSchemaBuilder.java:243)
at io.debezium.relational.TableSchema.valueFromColumnData(TableSchema.java:143)
at io.debezium.relational.RelationalChangeRecordEmitter.emitCreateRecord(RelationalChangeRecordEmitter.java:68)
at io.debezium.relational.RelationalChangeRecordEmitter.emitChangeRecords(RelationalChangeRecordEmitter.java:43)
at io.debezium.pipeline.EventDispatcher.dispatchDataChangeEvent(EventDispatcher.java:141)
at io.debezium.connector.sqlserver.SqlServerStreamingChangeEventSource.lambda$execute$1(SqlServerStreamingChangeEventSource.java:228)
at io.debezium.jdbc.JdbcConnection.prepareQuery(JdbcConnection.java:493)
at io.debezium.connector.sqlserver.SqlServerConnection.getChangesForTables(SqlServerConnection.java:143)
at io.debezium.connector.sqlserver.SqlServerStreamingChangeEventSource.execute(SqlServerStreamingChangeEventSource.java:151)
at io.debezium.pipeline.ChangeEventSourceCoordinator.lambda$start$0(ChangeEventSourceCoordinator.java:91)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
... 3 more

(The same cycle — EmbeddedConfig dump, SqlServerConnectorTask startup, "Data row is smaller than a column index" producer failure, and "Connection to the database lost" — then repeats every 5 seconds on each retry; the duplicate log cycles are omitted here.)
[2020-08-28 19:33:52,765] INFO {io.debezium.connector.common.BaseSourceTask} - database.server.name = localhost_1433
[2020-08-28 19:33:52,766] INFO {io.debezium.connector.common.BaseSourceTask} - database.port = 1433
[2020-08-28 19:33:52,769] INFO {io.debezium.connector.common.BaseSourceTask} - table.whitelist = dbo.ReportManagement
[2020-08-28 19:33:52,781] INFO {io.debezium.connector.common.BaseSourceTask} - cdc.source.object = 1455150863
[2020-08-28 19:33:52,793] INFO {io.debezium.connector.common.BaseSourceTask} - database.hostname = localhost
[2020-08-28 19:33:52,799] INFO {io.debezium.connector.common.BaseSourceTask} - database.password = ********
[2020-08-28 19:33:52,812] INFO {io.debezium.connector.common.BaseSourceTask} - name = Insert_CapturedColumninsertStream
[2020-08-28 19:33:52,815] INFO {io.debezium.connector.common.BaseSourceTask} - server.id = 6265
[2020-08-28 19:33:52,829] INFO {io.debezium.connector.common.BaseSourceTask} - database.history = io.debezium.relational.history.FileDatabaseHistory
[2020-08-28 19:33:52,873] INFO {io.debezium.connector.sqlserver.SqlServerConnectorTask} - Found previous offset SqlServerOffsetContext [sourceInfoSchema=Schema{io.debezium.connector.sqlserver.Source:STRUCT}, sourceInfo=SourceInfo [serverName=localhost_1433, changeLsn=NULL, commitLsn=00000060:000063b0:0001, eventSerialNo=null, snapshot=FALSE, sourceTime=null], partition={server=localhost_1433}, snapshotCompleted=true, eventSerialNo=0]
[2020-08-28 19:33:52,879] INFO {io.debezium.util.Threads} - Requested thread factory for connector SqlServerConnector, id = localhost_1433 named = error-handler
[2020-08-28 19:33:52,880] INFO {io.debezium.util.Threads} - Requested thread factory for connector SqlServerConnector, id = localhost_1433 named = change-event-source-coordinator
[2020-08-28 19:33:52,881] INFO {io.debezium.util.Threads} - Creating thread debezium-sqlserverconnector-localhost_1433-change-event-source-coordinator
[2020-08-28 19:33:52,899] INFO {io.debezium.connector.sqlserver.SqlServerSnapshotChangeEventSource} - A previous offset indicating a completed snapshot has been found. Neither schema nor data will be snapshotted.
[2020-08-28 19:33:52,916] INFO {io.debezium.connector.sqlserver.SqlServerStreamingChangeEventSource} - Last position recorded in offsets is 00000060:000063b0:0001(NULL)[0]
[2020-08-28 19:33:52,927] ERROR {io.debezium.relational.TableSchemaBuilder} - Error requesting a row value, row: 2, requested index: 2 at position 2
[2020-08-28 19:33:52,928] ERROR {io.debezium.pipeline.ErrorHandler} - Producer failure org.apache.kafka.connect.errors.ConnectException: Data row is smaller than a column index, internal schema representation is probably out of sync with real database schema
at io.debezium.relational.TableSchemaBuilder.validateIncomingRowToInternalMetadata(TableSchemaBuilder.java:217)
at io.debezium.relational.TableSchemaBuilder.lambda$createValueGenerator$4(TableSchemaBuilder.java:243)
at io.debezium.relational.TableSchema.valueFromColumnData(TableSchema.java:143)
at io.debezium.relational.RelationalChangeRecordEmitter.emitCreateRecord(RelationalChangeRecordEmitter.java:68)
at io.debezium.relational.RelationalChangeRecordEmitter.emitChangeRecords(RelationalChangeRecordEmitter.java:43)
at io.debezium.pipeline.EventDispatcher.dispatchDataChangeEvent(EventDispatcher.java:141)
at io.debezium.connector.sqlserver.SqlServerStreamingChangeEventSource.lambda$execute$1(SqlServerStreamingChangeEventSource.java:228)
at io.debezium.jdbc.JdbcConnection.prepareQuery(JdbcConnection.java:493)
at io.debezium.connector.sqlserver.SqlServerConnection.getChangesForTables(SqlServerConnection.java:143)
at io.debezium.connector.sqlserver.SqlServerStreamingChangeEventSource.execute(SqlServerStreamingChangeEventSource.java:151)
at io.debezium.pipeline.ChangeEventSourceCoordinator.lambda$start$0(ChangeEventSourceCoordinator.java:91)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)

[2020-08-28 19:33:52,929] INFO {io.debezium.util.Threads} - Creating thread debezium-sqlserverconnector-localhost_1433-error-handler
[2020-08-28 19:33:52,943] ERROR {io.debezium.connector.sqlserver.SqlServerConnectorTask} - Interrupted while stopping java.lang.InterruptedException
at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.awaitNanos(AbstractQueuedSynchronizer.java:2067)
at java.util.concurrent.ThreadPoolExecutor.awaitTermination(ThreadPoolExecutor.java:1475)
at java.util.concurrent.Executors$DelegatedExecutorService.awaitTermination(Executors.java:675)
at io.debezium.pipeline.ErrorHandler.stop(ErrorHandler.java:52)
at io.debezium.connector.sqlserver.SqlServerConnectorTask.cleanupResources(SqlServerConnectorTask.java:205)
at io.debezium.pipeline.ErrorHandler.lambda$setProducerThrowable$0(ErrorHandler.java:42)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)

[2020-08-28 19:33:53,380] INFO {io.debezium.connector.sqlserver.SqlServerConnectorTask} - Connector has already been stopped
[2020-08-28 19:33:53,381] ERROR {io.siddhi.core.stream.input.source.Source} - Error on 'Insert_CapturedColumn'. Connection to the database lost. Error while connecting at Source 'cdc' at 'insertStream'. Will retry in '5 sec'. io.siddhi.core.exception.ConnectionUnavailableException: Connection to the database lost.
at io.siddhi.extension.io.cdc.source.CDCSource.lambda$connect$1(CDCSource.java:496)
at io.debezium.embedded.EmbeddedEngine.run(EmbeddedEngine.java:899)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.kafka.connect.errors.ConnectException: An exception ocurred in the change event producer. This connector will be stopped.
at io.debezium.connector.base.ChangeEventQueue.throwProducerFailureIfPresent(ChangeEventQueue.java:170)
at io.debezium.connector.base.ChangeEventQueue.poll(ChangeEventQueue.java:151)
at io.debezium.connector.sqlserver.SqlServerConnectorTask.poll(SqlServerConnectorTask.java:161)
at io.debezium.embedded.EmbeddedEngine.run(EmbeddedEngine.java:814)
... 3 more
Caused by: org.apache.kafka.connect.errors.ConnectException: Data row is smaller than a column index, internal schema representation is probably out of sync with real database schema
at io.debezium.relational.TableSchemaBuilder.validateIncomingRowToInternalMetadata(TableSchemaBuilder.java:217)
at io.debezium.relational.TableSchemaBuilder.lambda$createValueGenerator$4(TableSchemaBuilder.java:243)
at io.debezium.relational.TableSchema.valueFromColumnData(TableSchema.java:143)
at io.debezium.relational.RelationalChangeRecordEmitter.emitCreateRecord(RelationalChangeRecordEmitter.java:68)
at io.debezium.relational.RelationalChangeRecordEmitter.emitChangeRecords(RelationalChangeRecordEmitter.java:43)
at io.debezium.pipeline.EventDispatcher.dispatchDataChangeEvent(EventDispatcher.java:141)
at io.debezium.connector.sqlserver.SqlServerStreamingChangeEventSource.lambda$execute$1(SqlServerStreamingChangeEventSource.java:228)
at io.debezium.jdbc.JdbcConnection.prepareQuery(JdbcConnection.java:493)
at io.debezium.connector.sqlserver.SqlServerConnection.getChangesForTables(SqlServerConnection.java:143)
at io.debezium.connector.sqlserver.SqlServerStreamingChangeEventSource.execute(SqlServerStreamingChangeEventSource.java:151)
at io.debezium.pipeline.ChangeEventSourceCoordinator.lambda$start$0(ChangeEventSourceCoordinator.java:91)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
... 3 more
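The "Data row is smaller than a column index, internal schema representation is probably out of sync with real database schema" failure above is Debezium reporting that the CDC change table no longer matches the source table — typically because a column was added to `dbo.ReportManagement` after CDC was enabled. SQL Server capture instances are fixed at creation time, so after a schema change the capture instance has to be re-created (and the connector's history/offset files under the `cdc\history` folder cleared) before streaming can resume. A hedged sketch of that re-creation, assuming the default capture-instance name `dbo_ReportManagement`:

```sql
-- Drop the stale capture instance (instance name assumed to be the default).
EXEC sys.sp_cdc_disable_table
    @source_schema    = N'dbo',
    @source_name      = N'ReportManagement',
    @capture_instance = N'dbo_ReportManagement';

-- Re-create it so the change table reflects the table's current column list.
EXEC sys.sp_cdc_enable_table
    @source_schema        = N'dbo',
    @source_name          = N'ReportManagement',
    @role_name            = NULL,
    @supports_net_changes = 0;
```

After re-enabling, the connector should be restarted with a fresh history file so its internal schema representation is rebuilt from the new capture instance.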

... (on the 19:33:58 retry the identical JsonConverterConfig / EmbeddedConfig dumps, SqlServerConnectorTask startup entries, and the same "Data row is smaller than a column index" stack traces are logged again verbatim, ending once more with the ConnectionUnavailableException and a 5-second retry) ...

[2020-08-28 19:35:21,314] INFO {org.wso2.carbon.siddhi.editor.core.internal.WorkspaceDeployer} - Siddhi App Insert_CapturedColumn.siddhi successfully deployed.
[2020-08-28 19:35:37,159] ERROR {org.wso2.carbon.siddhi.editor.core.internal.EditorMicroservice} - Unable to generate design view io.siddhi.core.exception.SiddhiAppCreationException: Error on 'Insert_CapturedColumn' between @ Line: 14. Position: 0 and @ Line: 14. Position: 21. Different definition same as output 'define stream logStream (ID int, PRODUCT string, Active int, Des string)' already exist as '@sink( type = "log")define stream logStream (ID int, PRODUCT string, Active int)'
at org.wso2.carbon.siddhi.editor.core.util.designview.designgenerator.DesignGenerator.getEventFlow(DesignGenerator.java:58)
at org.wso2.carbon.siddhi.editor.core.internal.EditorMicroservice.getDesignView(EditorMicroservice.java:1116)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.wso2.msf4j.internal.router.HttpMethodInfo.invokeResource(HttpMethodInfo.java:187)
at org.wso2.msf4j.internal.router.HttpMethodInfo.invoke(HttpMethodInfo.java:143)
at org.wso2.msf4j.internal.MSF4JHttpConnectorListener.dispatchMethod(MSF4JHttpConnectorListener.java:218)
at org.wso2.msf4j.internal.MSF4JHttpConnectorListener.lambda$onMessage$47(MSF4JHttpConnectorListener.java:129)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
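The `SiddhiAppCreationException` above pinpoints a second, separate problem: `Insert_CapturedColumn.siddhi` defines `logStream` twice with different attribute lists — once with `Des string` and once without it, under the `@sink` annotation. Keeping a single definition that carries both the annotation and the full attribute list should resolve it. A sketch only (assuming `Des` is the newly captured column, not the reporter's actual app):

```siddhi
@sink(type = 'log')
define stream logStream (ID int, PRODUCT string, Active int, Des string);
```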

... (the identical "Unable to generate design view" error and stack trace are logged a second time milliseconds later) ...

[2020-08-28 19:36:09,422] INFO {org.wso2.msf4j.internal.websocket.WebSocketServerSC} - All required capabilities are available of WebSocket service component is available.
[2020-08-28 19:36:09,635] INFO {org.wso2.carbon.metrics.core.config.model.JmxReporterConfig} - Creating JMX reporter for Metrics with domain 'org.wso2.carbon.metrics'
[2020-08-28 19:36:09,674] INFO {org.wso2.msf4j.analytics.metrics.MetricsComponent} - Metrics Component is activated
[2020-08-28 19:36:09,712] INFO {org.wso2.carbon.databridge.agent.internal.DataAgentDS} - Successfully deployed Agent Server
[2020-08-28 19:36:09,939] INFO {org.wso2.msf4j.internal.websocket.EndpointsRegistryImpl} - Endpoint Registered : /console
[2020-08-28 19:36:11,492] INFO {org.wso2.msf4j.MicroservicesRunner} - Microservices server started in 1330ms
[2020-08-28 19:36:11,494] INFO {org.wso2.transport.http.netty.contractimpl.listener.ServerConnectorBootstrap$HttpServerConnector} - HTTP(S) Interface starting on host 0.0.0.0 and port 7370
[2020-08-28 19:36:11,504] INFO {org.wso2.carbon.event.simulator.core.service.CSVFileDeployer} - CSV file deployer initiated.
[2020-08-28 19:36:11,506] INFO {org.wso2.carbon.event.simulator.core.service.SimulationConfigDeployer} - Simulation config deployer initiated.
[2020-08-28 19:36:11,744] INFO {org.wso2.carbon.business.rules.templates.editor.core.internal.StartupComponent} - Template Editor Started on : http://192.168.42.59:9390/template-editor
[2020-08-28 19:36:11,857] INFO {org.wso2.carbon.siddhi.editor.core.internal.StartupComponent} - Editor Started on : http://192.168.42.59:9390/editor
[2020-08-28 19:36:12,005] INFO {org.wso2.carbon.siddhi.editor.core.internal.WorkspaceDeployer} - Siddhi App CDCWithListeningMode-Update.siddhi successfully deployed.
[2020-08-28 19:36:12,029] INFO {org.wso2.carbon.siddhi.editor.core.internal.WorkspaceDeployer} - Siddhi App Delete_fromUsers_toCustomer.siddhi successfully deployed.
[2020-08-28 19:36:12,034] INFO {org.wso2.carbon.siddhi.editor.core.internal.WorkspaceDeployer} - Siddhi App Delete_fromUsers_toSale.siddhi successfully deployed.
[2020-08-28 19:36:12,039] INFO {org.wso2.carbon.siddhi.editor.core.internal.WorkspaceDeployer} - Siddhi App Delete_fromUsers_toSales.siddhi successfully deployed.
[2020-08-28 19:36:12,044] INFO {org.wso2.carbon.siddhi.editor.core.internal.WorkspaceDeployer} - Siddhi App Delete_Log.siddhi successfully deployed.
[2020-08-28 19:36:12,045] INFO {org.wso2.carbon.siddhi.editor.core.internal.WorkspaceDeployer} - Siddhi App Deployment.siddhi successfully deployed.
[2020-08-28 19:36:12,179] INFO {org.wso2.carbon.siddhi.editor.core.internal.WorkspaceDeployer} - Siddhi App HttpRequestResponseSample.siddhi successfully deployed.
[2020-08-28 19:36:12,183] INFO {org.wso2.carbon.siddhi.editor.core.internal.WorkspaceDeployer} - Siddhi App Insert.siddhi successfully deployed.
[2020-08-28 19:36:12,208] INFO {org.wso2.carbon.siddhi.editor.core.internal.WorkspaceDeployer} - Siddhi App InsertCustomers.siddhi successfully deployed.
[2020-08-28 19:36:12,217] INFO {org.wso2.carbon.siddhi.editor.core.internal.WorkspaceDeployer} - Siddhi App Insert_CapturedColumn.siddhi successfully deployed.
[2020-08-28 19:36:12,223] INFO {org.wso2.carbon.siddhi.editor.core.internal.WorkspaceDeployer} - Siddhi App Insert_CSV.siddhi successfully deployed.
[2020-08-28 19:36:12,226] INFO {org.wso2.carbon.siddhi.editor.core.internal.WorkspaceDeployer} - Siddhi App Insert_DB_Log.siddhi successfully deployed.
[2020-08-28 19:36:12,232] INFO {org.wso2.carbon.siddhi.editor.core.internal.WorkspaceDeployer} - Siddhi App Insert_fromUsers_toCustomers.siddhi successfully deployed.
[2020-08-28 19:36:12,235] INFO {org.wso2.carbon.siddhi.editor.core.internal.WorkspaceDeployer} - Siddhi App Insert_fromUsers_toSales.siddhi successfully deployed.
[2020-08-28 19:36:12,238] INFO {org.wso2.carbon.siddhi.editor.core.internal.WorkspaceDeployer} - Siddhi App Insert_fromUsers_toSales_Concat.siddhi successfully deployed.
[2020-08-28 19:36:12,241] INFO {org.wso2.carbon.siddhi.editor.core.internal.WorkspaceDeployer} - Siddhi App Insert_Log.siddhi successfully deployed.
[2020-08-28 19:36:12,244] INFO {org.wso2.carbon.siddhi.editor.core.internal.WorkspaceDeployer} - Siddhi App Insert_log_file.siddhi successfully deployed.
[2020-08-28 19:36:12,249] INFO {org.wso2.carbon.siddhi.editor.core.internal.WorkspaceDeployer} - Siddhi App Insert_Log_Report.siddhi successfully deployed.
[2020-08-28 19:36:12,267] INFO {org.wso2.carbon.siddhi.editor.core.internal.WorkspaceDeployer} - Siddhi App ReceiveAndCount.siddhi successfully deployed.
[2020-08-28 19:36:12,275] INFO {org.wso2.carbon.siddhi.editor.core.internal.WorkspaceDeployer} - Siddhi App ReceiveAPI.siddhi successfully deployed.
[2020-08-28 19:36:12,278] INFO {org.wso2.carbon.siddhi.editor.core.internal.WorkspaceDeployer} - Siddhi App ReceiveHTTPInJsonFormatWithCustomMapping.siddhi successfully deployed.
[2020-08-28 19:36:12,280] INFO {org.wso2.carbon.siddhi.editor.core.internal.WorkspaceDeployer} - Siddhi App Siddhi.siddhi successfully deployed.
[2020-08-28 19:36:12,281] INFO {org.wso2.carbon.siddhi.editor.core.internal.WorkspaceDeployer} - Siddhi App SiddhiApp.siddhi successfully deployed.
[2020-08-28 19:36:12,284] INFO {org.wso2.carbon.siddhi.editor.core.internal.WorkspaceDeployer} - Siddhi App SiddhiApp1.siddhi successfully deployed.
[2020-08-28 19:36:12,286] INFO {org.wso2.carbon.siddhi.editor.core.internal.WorkspaceDeployer} - Siddhi App SiddhiAppDatasource.siddhi successfully deployed.
[2020-08-28 19:36:12,293] INFO {org.wso2.carbon.siddhi.editor.core.internal.WorkspaceDeployer} - Siddhi App update_fromUsers_toCustomers.siddhi successfully deployed.
[2020-08-28 19:36:12,296] INFO {org.wso2.carbon.siddhi.editor.core.internal.WorkspaceDeployer} - Siddhi App update_fromUsers_toSales.siddhi successfully deployed.
[2020-08-28 19:36:12,301] INFO {org.wso2.carbon.siddhi.editor.core.internal.WorkspaceDeployer} - Siddhi App Update_Log.siddhi successfully deployed.
[2020-08-28 19:36:12,302] INFO {org.wso2.carbon.siddhi.extensions.installer.core.internal.StartupComponent} - Siddhi Extensions Installer Core Startup Listener Service Component is Activated.
[2020-08-28 19:36:12,307] INFO {org.wso2.msf4j.internal.MicroservicesServerSC} - All microservices are available
[2020-08-28 19:36:12,309] INFO {org.wso2.transport.http.netty.contractimpl.listener.ServerConnectorBootstrap$HttpServerConnector} - HTTP(S) Interface starting on host 0.0.0.0 and port 9390
[2020-08-28 19:36:12,310] INFO {org.wso2.transport.http.netty.contractimpl.listener.ServerConnectorBootstrap$HttpServerConnector} - HTTP(S) Interface starting on host 0.0.0.0 and port 9743
[2020-08-28 19:36:12,315] INFO {org.wso2.carbon.analytics.idp.client.core.utils.IdPServiceUtils} - IdP client of type 'local' is started.
[2020-08-28 19:36:12,450] INFO {org.wso2.carbon.databridge.receiver.binary.internal.BinaryDataReceiverServiceComponent} - org.wso2.carbon.databridge.receiver.binary.internal.Service Component is activated
[2020-08-28 19:36:12,456] INFO {org.wso2.carbon.databridge.receiver.thrift.internal.ThriftDataReceiverDS} - Service Component is activated
[2020-08-28 19:36:12,789] INFO {org.wso2.carbon.uiserver.internal.deployment.listener.AppTransportBinder} - Web app 'business-rules' is available at 'https://192.168.42.59:9743/business-rules'.
[2020-08-28 19:36:12,966] INFO {org.wso2.carbon.uiserver.internal.deployment.listener.AppTransportBinder} - Web app 'policies' is available at 'https://192.168.42.59:9743/policies'.
[2020-08-28 19:36:12,969] INFO {org.wso2.carbon.kernel.internal.CarbonStartupHandler} - WSO2 Streaming Integrator Tooling started in 6.455 sec
[2020-08-28 19:36:19,663] INFO {org.wso2.carbon.siddhi.editor.core.internal.EditorConsoleService} - Connected with user : 8cec4bfffec672ff-00005764-00000009-0f8e404a24c7fdf6-f65da2f2
[2020-08-28 19:36:39,549] INFO {org.wso2.carbon.siddhi.editor.core.internal.WorkspaceDeployer} - Siddhi App Insert_CapturedColumn.siddhi successfully deployed.
[2020-08-28 19:36:48,789] INFO {org.apache.kafka.connect.json.JsonConverterConfig} - JsonConverterConfig values:
converter.type = key
schemas.cache.size = 1000
schemas.enable = true

[2020-08-28 19:36:48,790] INFO {org.apache.kafka.connect.json.JsonConverterConfig} - JsonConverterConfig values:
converter.type = value
schemas.cache.size = 1000
schemas.enable = false

[2020-08-28 19:36:48,793] INFO {io.debezium.embedded.EmbeddedEngine$EmbeddedConfig} - EmbeddedConfig values:
access.control.allow.methods =
access.control.allow.origin =
bootstrap.servers = [localhost:9092]
client.dns.lookup = default
config.providers = []
connector.client.config.override.policy = None
header.converter = class org.apache.kafka.connect.storage.SimpleHeaderConverter
internal.key.converter = class org.apache.kafka.connect.json.JsonConverter
internal.value.converter = class org.apache.kafka.connect.json.JsonConverter
key.converter = class org.apache.kafka.connect.json.JsonConverter
listeners = null
metric.reporters = []
metrics.num.samples = 2
metrics.recording.level = INFO
metrics.sample.window.ms = 30000
offset.flush.interval.ms = 60000
offset.flush.timeout.ms = 5000
offset.storage.file.filename =
offset.storage.partitions = null
offset.storage.replication.factor = null
offset.storage.topic =
plugin.path = null
rest.advertised.host.name = null
rest.advertised.listener = null
rest.advertised.port = null
rest.extension.classes = []
rest.host.name = null
rest.port = 8083
ssl.client.auth = none
task.shutdown.graceful.timeout.ms = 5000
value.converter = class org.apache.kafka.connect.json.JsonConverter

[2020-08-28 19:36:48,794] INFO {org.apache.kafka.connect.runtime.WorkerConfig} - Worker configuration property 'internal.key.converter' is deprecated and may be removed in an upcoming release. The specified value 'org.apache.kafka.connect.json.JsonConverter' matches the default, so this property can be safely removed from the worker configuration.
[2020-08-28 19:36:48,795] INFO {org.apache.kafka.connect.runtime.WorkerConfig} - Worker configuration property 'internal.value.converter' is deprecated and may be removed in an upcoming release. The specified value 'org.apache.kafka.connect.json.JsonConverter' matches the default, so this property can be safely removed from the worker configuration.
[2020-08-28 19:36:48,876] INFO {io.debezium.connector.common.BaseSourceTask} - Starting SqlServerConnectorTask with configuration:
[2020-08-28 19:36:48,877] INFO {io.debezium.connector.common.BaseSourceTask} - connector.class = io.debezium.connector.sqlserver.SqlServerConnector
[2020-08-28 19:36:48,878] INFO {io.debezium.connector.common.BaseSourceTask} - database.history.file.filename = C:\PROGRA1\WSO2\STREAM1\103623~1.1\bin..\cdc\history\Insert_CapturedColumn\insertStream.dat
[2020-08-28 19:36:48,878] INFO {io.debezium.connector.common.BaseSourceTask} - database.user = test
[2020-08-28 19:36:48,878] INFO {io.debezium.connector.common.BaseSourceTask} - database.dbname = QBS
[2020-08-28 19:36:48,879] INFO {io.debezium.connector.common.BaseSourceTask} - offset.storage = io.siddhi.extension.io.cdc.source.listening.InMemoryOffsetBackingStore
[2020-08-28 19:36:48,879] INFO {io.debezium.connector.common.BaseSourceTask} - database.server.name = localhost_1433
[2020-08-28 19:36:48,879] INFO {io.debezium.connector.common.BaseSourceTask} - database.port = 1433
[2020-08-28 19:36:48,879] INFO {io.debezium.connector.common.BaseSourceTask} - table.whitelist = dbo.ReportManagement
[2020-08-28 19:36:48,880] INFO {io.debezium.connector.common.BaseSourceTask} - cdc.source.object = 40671814
[2020-08-28 19:36:48,880] INFO {io.debezium.connector.common.BaseSourceTask} - database.hostname = localhost
[2020-08-28 19:36:48,880] INFO {io.debezium.connector.common.BaseSourceTask} - database.password = ********
[2020-08-28 19:36:48,880] INFO {io.debezium.connector.common.BaseSourceTask} - name = Insert_CapturedColumninsertStream
[2020-08-28 19:36:48,881] INFO {io.debezium.connector.common.BaseSourceTask} - server.id = 5890
[2020-08-28 19:36:48,882] INFO {io.debezium.connector.common.BaseSourceTask} - database.history = io.debezium.relational.history.FileDatabaseHistory
[2020-08-28 19:36:48,983] INFO {io.debezium.util.Threads} - Requested thread factory for connector SqlServerConnector, id = localhost_1433 named = error-handler
[2020-08-28 19:36:48,989] INFO {io.debezium.util.Threads} - Requested thread factory for connector SqlServerConnector, id = localhost_1433 named = change-event-source-coordinator
[2020-08-28 19:36:48,991] INFO {io.debezium.util.Threads} - Creating thread debezium-sqlserverconnector-localhost_1433-change-event-source-coordinator
[2020-08-28 19:36:48,994] INFO {io.debezium.connector.sqlserver.SqlServerSnapshotChangeEventSource} - No previous offset has been found
[2020-08-28 19:36:48,995] INFO {io.debezium.connector.sqlserver.SqlServerSnapshotChangeEventSource} - According to the connector configuration both schema and data will be snapshotted
[2020-08-28 19:36:48,996] INFO {io.debezium.relational.RelationalSnapshotChangeEventSource} - Snapshot step 1 - Preparing
[2020-08-28 19:36:48,997] INFO {io.debezium.relational.RelationalSnapshotChangeEventSource} - Snapshot step 2 - Determining captured tables
[2020-08-28 19:36:49,021] INFO {io.debezium.relational.RelationalSnapshotChangeEventSource} - Snapshot step 3 - Locking captured tables
[2020-08-28 19:36:49,022] INFO {io.debezium.connector.sqlserver.SqlServerSnapshotChangeEventSource} - Executing schema locking
[2020-08-28 19:36:49,022] INFO {io.debezium.connector.sqlserver.SqlServerSnapshotChangeEventSource} - Locking table QBS.dbo.ReportManagement
[2020-08-28 19:36:49,023] INFO {io.debezium.relational.RelationalSnapshotChangeEventSource} - Snapshot step 4 - Determining snapshot offset
[2020-08-28 19:36:49,026] INFO {io.debezium.relational.RelationalSnapshotChangeEventSource} - Snapshot step 5 - Reading structure of captured tables
[2020-08-28 19:36:49,027] INFO {io.debezium.connector.sqlserver.SqlServerSnapshotChangeEventSource} - Reading structure of schema 'QBS'
[2020-08-28 19:36:49,046] INFO {io.debezium.relational.RelationalSnapshotChangeEventSource} - Snapshot step 6 - Persisting schema history
[2020-08-28 19:36:49,066] INFO {io.debezium.connector.sqlserver.SqlServerSnapshotChangeEventSource} - Schema locks released.
[2020-08-28 19:36:49,066] INFO {io.debezium.relational.RelationalSnapshotChangeEventSource} - Snapshot step 7 - Snapshotting data
[2020-08-28 19:36:49,067] INFO {io.debezium.relational.RelationalSnapshotChangeEventSource} - Exporting data from table 'QBS.dbo.ReportManagement'
[2020-08-28 19:36:49,067] INFO {io.debezium.relational.RelationalSnapshotChangeEventSource} - For table 'QBS.dbo.ReportManagement' using select statement: 'SELECT * FROM [dbo].[ReportManagement]'
[2020-08-28 19:36:49,080] INFO {io.debezium.relational.RelationalSnapshotChangeEventSource} - Finished exporting 39 records for table 'QBS.dbo.ReportManagement'; total duration '00:00:00.012'
[2020-08-28 19:36:49,083] INFO {io.debezium.relational.RelationalSnapshotChangeEventSource} - Snapshot step 8 - Finalizing
[2020-08-28 19:36:49,110] INFO {io.debezium.connector.sqlserver.SqlServerStreamingChangeEventSource} - Last position recorded in offsets is 00000060:000064b0:0003(NULL)[1]
[2020-08-28 19:36:59,117] ERROR {io.debezium.relational.TableSchemaBuilder} - Error requesting a row value, row: 2, requested index: 2 at position 2
[2020-08-28 19:36:59,117] ERROR {io.debezium.pipeline.ErrorHandler} - Producer failure org.apache.kafka.connect.errors.ConnectException: Data row is smaller than a column index, internal schema representation is probably out of sync with real database schema
at io.debezium.relational.TableSchemaBuilder.validateIncomingRowToInternalMetadata(TableSchemaBuilder.java:217)
at io.debezium.relational.TableSchemaBuilder.lambda$createValueGenerator$4(TableSchemaBuilder.java:243)
at io.debezium.relational.TableSchema.valueFromColumnData(TableSchema.java:143)
at io.debezium.relational.RelationalChangeRecordEmitter.emitCreateRecord(RelationalChangeRecordEmitter.java:68)
at io.debezium.relational.RelationalChangeRecordEmitter.emitChangeRecords(RelationalChangeRecordEmitter.java:43)
at io.debezium.pipeline.EventDispatcher.dispatchDataChangeEvent(EventDispatcher.java:141)
at io.debezium.connector.sqlserver.SqlServerStreamingChangeEventSource.lambda$execute$1(SqlServerStreamingChangeEventSource.java:228)
at io.debezium.jdbc.JdbcConnection.prepareQuery(JdbcConnection.java:493)
at io.debezium.connector.sqlserver.SqlServerConnection.getChangesForTables(SqlServerConnection.java:143)
at io.debezium.connector.sqlserver.SqlServerStreamingChangeEventSource.execute(SqlServerStreamingChangeEventSource.java:151)
at io.debezium.pipeline.ChangeEventSourceCoordinator.lambda$start$0(ChangeEventSourceCoordinator.java:91)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)

[2020-08-28 19:36:59,121] INFO {io.debezium.util.Threads} - Creating thread debezium-sqlserverconnector-localhost_1433-error-handler
[2020-08-28 19:36:59,125] ERROR {io.debezium.connector.sqlserver.SqlServerConnectorTask} - Interrupted while stopping java.lang.InterruptedException
at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.awaitNanos(AbstractQueuedSynchronizer.java:2067)
at java.util.concurrent.ThreadPoolExecutor.awaitTermination(ThreadPoolExecutor.java:1475)
at java.util.concurrent.Executors$DelegatedExecutorService.awaitTermination(Executors.java:675)
at io.debezium.pipeline.ErrorHandler.stop(ErrorHandler.java:52)
at io.debezium.connector.sqlserver.SqlServerConnectorTask.cleanupResources(SqlServerConnectorTask.java:205)
at io.debezium.pipeline.ErrorHandler.lambda$setProducerThrowable$0(ErrorHandler.java:42)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)

[2020-08-28 19:36:59,483] INFO {io.debezium.connector.sqlserver.SqlServerConnectorTask} - Connector has already been stopped
[2020-08-28 19:36:59,486] ERROR {io.siddhi.core.stream.input.source.Source} - Error on 'Insert_CapturedColumn'. Connection to the database lost. Error while connecting at Source 'cdc' at 'insertStream'. Will retry in '5 sec'. io.siddhi.core.exception.ConnectionUnavailableException: Connection to the database lost.
at io.siddhi.extension.io.cdc.source.CDCSource.lambda$connect$1(CDCSource.java:496)
at io.debezium.embedded.EmbeddedEngine.run(EmbeddedEngine.java:899)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.kafka.connect.errors.ConnectException: An exception ocurred in the change event producer. This connector will be stopped.
at io.debezium.connector.base.ChangeEventQueue.throwProducerFailureIfPresent(ChangeEventQueue.java:170)
at io.debezium.connector.base.ChangeEventQueue.poll(ChangeEventQueue.java:151)
at io.debezium.connector.sqlserver.SqlServerConnectorTask.poll(SqlServerConnectorTask.java:161)
at io.debezium.embedded.EmbeddedEngine.run(EmbeddedEngine.java:814)
... 3 more
Caused by: org.apache.kafka.connect.errors.ConnectException: Data row is smaller than a column index, internal schema representation is probably out of sync with real database schema
at io.debezium.relational.TableSchemaBuilder.validateIncomingRowToInternalMetadata(TableSchemaBuilder.java:217)
at io.debezium.relational.TableSchemaBuilder.lambda$createValueGenerator$4(TableSchemaBuilder.java:243)
at io.debezium.relational.TableSchema.valueFromColumnData(TableSchema.java:143)
at io.debezium.relational.RelationalChangeRecordEmitter.emitCreateRecord(RelationalChangeRecordEmitter.java:68)
at io.debezium.relational.RelationalChangeRecordEmitter.emitChangeRecords(RelationalChangeRecordEmitter.java:43)
at io.debezium.pipeline.EventDispatcher.dispatchDataChangeEvent(EventDispatcher.java:141)
at io.debezium.connector.sqlserver.SqlServerStreamingChangeEventSource.lambda$execute$1(SqlServerStreamingChangeEventSource.java:228)
at io.debezium.jdbc.JdbcConnection.prepareQuery(JdbcConnection.java:493)
at io.debezium.connector.sqlserver.SqlServerConnection.getChangesForTables(SqlServerConnection.java:143)
at io.debezium.connector.sqlserver.SqlServerStreamingChangeEventSource.execute(SqlServerStreamingChangeEventSource.java:151)
at io.debezium.pipeline.ChangeEventSourceCoordinator.lambda$start$0(ChangeEventSourceCoordinator.java:91)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
... 3 more

[2020-08-28 19:37:04,492] INFO {org.apache.kafka.connect.json.JsonConverterConfig} - JsonConverterConfig values:
converter.type = key
schemas.cache.size = 1000
schemas.enable = true

[2020-08-28 19:37:04,493] INFO {org.apache.kafka.connect.json.JsonConverterConfig} - JsonConverterConfig values:
converter.type = value
schemas.cache.size = 1000
schemas.enable = false

[2020-08-28 19:37:04,493] INFO {io.debezium.embedded.EmbeddedEngine$EmbeddedConfig} - EmbeddedConfig values:
access.control.allow.methods =
access.control.allow.origin =
bootstrap.servers = [localhost:9092]
client.dns.lookup = default
config.providers = []
connector.client.config.override.policy = None
header.converter = class org.apache.kafka.connect.storage.SimpleHeaderConverter
internal.key.converter = class org.apache.kafka.connect.json.JsonConverter
internal.value.converter = class org.apache.kafka.connect.json.JsonConverter
key.converter = class org.apache.kafka.connect.json.JsonConverter
listeners = null
metric.reporters = []
metrics.num.samples = 2
metrics.recording.level = INFO
metrics.sample.window.ms = 30000
offset.flush.interval.ms = 60000
offset.flush.timeout.ms = 5000
offset.storage.file.filename =
offset.storage.partitions = null
offset.storage.replication.factor = null
offset.storage.topic =
plugin.path = null
rest.advertised.host.name = null
rest.advertised.listener = null
rest.advertised.port = null
rest.extension.classes = []
rest.host.name = null
rest.port = 8083
ssl.client.auth = none
task.shutdown.graceful.timeout.ms = 5000
value.converter = class org.apache.kafka.connect.json.JsonConverter

[2020-08-28 19:37:04,494] INFO {org.apache.kafka.connect.runtime.WorkerConfig} - Worker configuration property 'internal.key.converter' is deprecated and may be removed in an upcoming release. The specified value 'org.apache.kafka.connect.json.JsonConverter' matches the default, so this property can be safely removed from the worker configuration.
[2020-08-28 19:37:04,495] INFO {org.apache.kafka.connect.runtime.WorkerConfig} - Worker configuration property 'internal.value.converter' is deprecated and may be removed in an upcoming release. The specified value 'org.apache.kafka.connect.json.JsonConverter' matches the default, so this property can be safely removed from the worker configuration.
[2020-08-28 19:37:04,547] INFO {io.debezium.connector.common.BaseSourceTask} - Starting SqlServerConnectorTask with configuration:
[2020-08-28 19:37:04,547] INFO {io.debezium.connector.common.BaseSourceTask} - connector.class = io.debezium.connector.sqlserver.SqlServerConnector
[2020-08-28 19:37:04,551] INFO {io.debezium.connector.common.BaseSourceTask} - database.history.file.filename = C:\PROGRA1\WSO2\STREAM1\103623~1.1\bin..\cdc\history\Insert_CapturedColumn\insertStream.dat
[2020-08-28 19:37:04,560] INFO {io.debezium.connector.common.BaseSourceTask} - database.user = test
[2020-08-28 19:37:04,561] INFO {io.debezium.connector.common.BaseSourceTask} - database.dbname = QBS
[2020-08-28 19:37:04,562] INFO {io.debezium.connector.common.BaseSourceTask} - offset.storage = io.siddhi.extension.io.cdc.source.listening.InMemoryOffsetBackingStore
[2020-08-28 19:37:04,570] INFO {io.debezium.connector.common.BaseSourceTask} - database.server.name = localhost_1433
[2020-08-28 19:37:04,573] INFO {io.debezium.connector.common.BaseSourceTask} - database.port = 1433
[2020-08-28 19:37:04,574] INFO {io.debezium.connector.common.BaseSourceTask} - table.whitelist = dbo.ReportManagement
[2020-08-28 19:37:04,583] INFO {io.debezium.connector.common.BaseSourceTask} - cdc.source.object = 40671814
[2020-08-28 19:37:04,585] INFO {io.debezium.connector.common.BaseSourceTask} - database.hostname = localhost
[2020-08-28 19:37:04,585] INFO {io.debezium.connector.common.BaseSourceTask} - database.password = ********
[2020-08-28 19:37:04,600] INFO {io.debezium.connector.common.BaseSourceTask} - name = Insert_CapturedColumninsertStream
[2020-08-28 19:37:04,603] INFO {io.debezium.connector.common.BaseSourceTask} - server.id = 5890
[2020-08-28 19:37:04,612] INFO {io.debezium.connector.common.BaseSourceTask} - database.history = io.debezium.relational.history.FileDatabaseHistory
[2020-08-28 19:37:04,672] INFO {io.debezium.connector.sqlserver.SqlServerConnectorTask} - Found previous offset SqlServerOffsetContext [sourceInfoSchema=Schema{io.debezium.connector.sqlserver.Source:STRUCT}, sourceInfo=SourceInfo [serverName=localhost_1433, changeLsn=NULL, commitLsn=00000060:000064b0:0003, eventSerialNo=null, snapshot=FALSE, sourceTime=null], partition={server=localhost_1433}, snapshotCompleted=true, eventSerialNo=0]
[2020-08-28 19:37:04,689] INFO {io.debezium.util.Threads} - Requested thread factory for connector SqlServerConnector, id = localhost_1433 named = error-handler
[2020-08-28 19:37:04,690] INFO {io.debezium.util.Threads} - Requested thread factory for connector SqlServerConnector, id = localhost_1433 named = change-event-source-coordinator
[2020-08-28 19:37:04,694] INFO {io.debezium.util.Threads} - Creating thread debezium-sqlserverconnector-localhost_1433-change-event-source-coordinator
[2020-08-28 19:37:04,698] INFO {io.debezium.connector.sqlserver.SqlServerSnapshotChangeEventSource} - A previous offset indicating a completed snapshot has been found. Neither schema nor data will be snapshotted.
[2020-08-28 19:37:04,942] INFO {io.debezium.connector.sqlserver.SqlServerStreamingChangeEventSource} - Last position recorded in offsets is 00000060:000064b0:0003(NULL)[0]
[2020-08-28 19:37:04,988] ERROR {io.debezium.relational.TableSchemaBuilder} - Error requesting a row value, row: 2, requested index: 2 at position 2
[2020-08-28 19:37:04,989] ERROR {io.debezium.pipeline.ErrorHandler} - Producer failure org.apache.kafka.connect.errors.ConnectException: Data row is smaller than a column index, internal schema representation is probably out of sync with real database schema
at io.debezium.relational.TableSchemaBuilder.validateIncomingRowToInternalMetadata(TableSchemaBuilder.java:217)
at io.debezium.relational.TableSchemaBuilder.lambda$createValueGenerator$4(TableSchemaBuilder.java:243)
at io.debezium.relational.TableSchema.valueFromColumnData(TableSchema.java:143)
at io.debezium.relational.RelationalChangeRecordEmitter.emitCreateRecord(RelationalChangeRecordEmitter.java:68)
at io.debezium.relational.RelationalChangeRecordEmitter.emitChangeRecords(RelationalChangeRecordEmitter.java:43)
at io.debezium.pipeline.EventDispatcher.dispatchDataChangeEvent(EventDispatcher.java:141)
at io.debezium.connector.sqlserver.SqlServerStreamingChangeEventSource.lambda$execute$1(SqlServerStreamingChangeEventSource.java:228)
at io.debezium.jdbc.JdbcConnection.prepareQuery(JdbcConnection.java:493)
at io.debezium.connector.sqlserver.SqlServerConnection.getChangesForTables(SqlServerConnection.java:143)
at io.debezium.connector.sqlserver.SqlServerStreamingChangeEventSource.execute(SqlServerStreamingChangeEventSource.java:151)
at io.debezium.pipeline.ChangeEventSourceCoordinator.lambda$start$0(ChangeEventSourceCoordinator.java:91)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)

[2020-08-28 19:37:04,992] INFO {io.debezium.util.Threads} - Creating thread debezium-sqlserverconnector-localhost_1433-error-handler
[2020-08-28 19:37:05,014] ERROR {io.debezium.connector.sqlserver.SqlServerConnectorTask} - Interrupted while stopping java.lang.InterruptedException
at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.awaitNanos(AbstractQueuedSynchronizer.java:2067)
at java.util.concurrent.ThreadPoolExecutor.awaitTermination(ThreadPoolExecutor.java:1475)
at java.util.concurrent.Executors$DelegatedExecutorService.awaitTermination(Executors.java:675)
at io.debezium.pipeline.ErrorHandler.stop(ErrorHandler.java:52)
at io.debezium.connector.sqlserver.SqlServerConnectorTask.cleanupResources(SqlServerConnectorTask.java:205)
at io.debezium.pipeline.ErrorHandler.lambda$setProducerThrowable$0(ErrorHandler.java:42)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)

[2020-08-28 19:37:05,189] INFO {io.debezium.connector.sqlserver.SqlServerConnectorTask} - Connector has already been stopped
[2020-08-28 19:37:05,191] ERROR {io.siddhi.core.stream.input.source.Source} - Error on 'Insert_CapturedColumn'. Connection to the database lost. Error while connecting at Source 'cdc' at 'insertStream'. Will retry in '5 sec'. io.siddhi.core.exception.ConnectionUnavailableException: Connection to the database lost.
at io.siddhi.extension.io.cdc.source.CDCSource.lambda$connect$1(CDCSource.java:496)
at io.debezium.embedded.EmbeddedEngine.run(EmbeddedEngine.java:899)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.kafka.connect.errors.ConnectException: An exception ocurred in the change event producer. This connector will be stopped.
at io.debezium.connector.base.ChangeEventQueue.throwProducerFailureIfPresent(ChangeEventQueue.java:170)
at io.debezium.connector.base.ChangeEventQueue.poll(ChangeEventQueue.java:151)
at io.debezium.connector.sqlserver.SqlServerConnectorTask.poll(SqlServerConnectorTask.java:161)
at io.debezium.embedded.EmbeddedEngine.run(EmbeddedEngine.java:814)
... 3 more
Caused by: org.apache.kafka.connect.errors.ConnectException: Data row is smaller than a column index, internal schema representation is probably out of sync with real database schema
at io.debezium.relational.TableSchemaBuilder.validateIncomingRowToInternalMetadata(TableSchemaBuilder.java:217)
at io.debezium.relational.TableSchemaBuilder.lambda$createValueGenerator$4(TableSchemaBuilder.java:243)
at io.debezium.relational.TableSchema.valueFromColumnData(TableSchema.java:143)
at io.debezium.relational.RelationalChangeRecordEmitter.emitCreateRecord(RelationalChangeRecordEmitter.java:68)
at io.debezium.relational.RelationalChangeRecordEmitter.emitChangeRecords(RelationalChangeRecordEmitter.java:43)
at io.debezium.pipeline.EventDispatcher.dispatchDataChangeEvent(EventDispatcher.java:141)
at io.debezium.connector.sqlserver.SqlServerStreamingChangeEventSource.lambda$execute$1(SqlServerStreamingChangeEventSource.java:228)
at io.debezium.jdbc.JdbcConnection.prepareQuery(JdbcConnection.java:493)
at io.debezium.connector.sqlserver.SqlServerConnection.getChangesForTables(SqlServerConnection.java:143)
at io.debezium.connector.sqlserver.SqlServerStreamingChangeEventSource.execute(SqlServerStreamingChangeEventSource.java:151)
at io.debezium.pipeline.ChangeEventSourceCoordinator.lambda$start$0(ChangeEventSourceCoordinator.java:91)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
... 3 more

[2020-08-28 19:37:10,203] INFO {org.apache.kafka.connect.json.JsonConverterConfig} - JsonConverterConfig values:
converter.type = key
schemas.cache.size = 1000
schemas.enable = true

[2020-08-28 19:37:10,204] INFO {org.apache.kafka.connect.json.JsonConverterConfig} - JsonConverterConfig values:
converter.type = value
schemas.cache.size = 1000
schemas.enable = false

[2020-08-28 19:37:10,208] INFO {io.debezium.embedded.EmbeddedEngine$EmbeddedConfig} - EmbeddedConfig values:
access.control.allow.methods =
access.control.allow.origin =
bootstrap.servers = [localhost:9092]
client.dns.lookup = default
config.providers = []
connector.client.config.override.policy = None
header.converter = class org.apache.kafka.connect.storage.SimpleHeaderConverter
internal.key.converter = class org.apache.kafka.connect.json.JsonConverter
internal.value.converter = class org.apache.kafka.connect.json.JsonConverter
key.converter = class org.apache.kafka.connect.json.JsonConverter
listeners = null
metric.reporters = []
metrics.num.samples = 2
metrics.recording.level = INFO
metrics.sample.window.ms = 30000
offset.flush.interval.ms = 60000
offset.flush.timeout.ms = 5000
offset.storage.file.filename =
offset.storage.partitions = null
offset.storage.replication.factor = null
offset.storage.topic =
plugin.path = null
rest.advertised.host.name = null
rest.advertised.listener = null
rest.advertised.port = null
rest.extension.classes = []
rest.host.name = null
rest.port = 8083
ssl.client.auth = none
task.shutdown.graceful.timeout.ms = 5000
value.converter = class org.apache.kafka.connect.json.JsonConverter

[2020-08-28 19:37:10,216] INFO {org.apache.kafka.connect.runtime.WorkerConfig} - Worker configuration property 'internal.key.converter' is deprecated and may be removed in an upcoming release. The specified value 'org.apache.kafka.connect.json.JsonConverter' matches the default, so this property can be safely removed from the worker configuration.
[2020-08-28 19:37:10,225] INFO {org.apache.kafka.connect.runtime.WorkerConfig} - Worker configuration property 'internal.value.converter' is deprecated and may be removed in an upcoming release. The specified value 'org.apache.kafka.connect.json.JsonConverter' matches the default, so this property can be safely removed from the worker configuration.
[2020-08-28 19:37:10,227] ERROR {io.siddhi.core.stream.input.source.Source} - Error on 'Insert_CapturedColumn'. Task EmbeddedEngine{id=Insert_CapturedColumninsertStream} rejected from java.util.concurrent.ThreadPoolExecutor@1f558765[Terminated, pool size = 0, active threads = 0, queued tasks = 0, completed tasks = 2] Error while connecting at Source 'cdc' at 'insertStream'. java.util.concurrent.RejectedExecutionException: Task EmbeddedEngine{id=Insert_CapturedColumninsertStream} rejected from java.util.concurrent.ThreadPoolExecutor@1f558765[Terminated, pool size = 0, active threads = 0, queued tasks = 0, completed tasks = 2]
at java.util.concurrent.ThreadPoolExecutor$AbortPolicy.rejectedExecution(ThreadPoolExecutor.java:2063)
at java.util.concurrent.ThreadPoolExecutor.reject(ThreadPoolExecutor.java:830)
at java.util.concurrent.ThreadPoolExecutor.execute(ThreadPoolExecutor.java:1379)
at java.util.concurrent.Executors$DelegatedExecutorService.execute(Executors.java:668)
at io.siddhi.extension.io.cdc.source.CDCSource.connect(CDCSource.java:502)
at io.siddhi.extension.io.cdc.source.CDCSource.connect(CDCSource.java:56)
at io.siddhi.core.stream.input.source.Source.connectWithRetry(Source.java:160)
at io.siddhi.core.stream.input.source.Source$1.run(Source.java:185)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)

[2020-08-28 19:38:31,507] WARN {org.wso2.msf4j.internal.MSF4JHttpConnectorListener} - Error in http connector listener : 'Remote client closed the connection before initiating inbound request'

Suggested Labels:

Suggested Assignees:

Affected Product Version: wso2ei-7.0.2 (WSO2 Streaming Integrator Tooling)

OS, DB, other environment details and versions: Windows (per the `C:\PROGRA~1\WSO2\...` paths in the logs), Microsoft SQL Server 2017

Steps to reproduce:

Related Issues:

@uarulraj486 uarulraj486 changed the title Streaming integrator Capture column - Facing issue Streaming integrator CDC Capture column - Facing issue Aug 29, 2020