Kafka Connect JDBC Sink Connector: How to delete a record that doesn't have a NULL value?

Is there a (recommended) way to delete a record from a Kafka Connect JDBC Sink Connector where the record's value is not NULL?

For example, if my JSON configuration includes the following:

...
"delete.enabled": "true",
"pk.mode": "record_key",
...

And my record's value is non-null, is there a way to have that record be deleted in the database?

I ask because the record's value has a field that marks whether it should be deleted, i.e. a column like "Operation" where "Operation" == "D" should result in a delete in the database via JDBC.

If there is a standard/recommended way to do this, I would love to hear it. My only other idea was to write a custom transform that checks the "Operation" column for the value "D" and, if it matches, passes back the record with the primary key intact but the value set to NULL, i.e. a tombstone record, which the connector should then pick up as a delete. Is that a possibility?

I appreciate any help, thank you!

asked Oct 23 '25 by bmoe24x

1 Answer

No responses yet, but I got my somewhat hacky solution to work:

  • Created a custom Transform that sets the record's value to NULL (i.e. makes it a tombstone record) if a certain condition is met (in my case, checking a field in the record's value); see the sketch after this list
  • Returned the original record unchanged when the condition is not met
  • Packaged the transform into a JAR
  • Put the JAR on the worker's "plugin.path"
  • Made sure "delete.enabled": "true" and "pk.mode": "record_key" are set so that tombstone records are actually deleted
  • Included the transform and any relevant configuration in the body of the POST request that instantiates the connector; an example config is shown below
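A minimal sketch of what such a transform might look like, assuming the record value is a Connect Struct with an "Operation" field; the class and package names are hypothetical and error handling is omitted:

package com.example.kafka.transforms; // hypothetical package

import java.util.Map;

import org.apache.kafka.common.config.ConfigDef;
import org.apache.kafka.connect.connector.ConnectRecord;
import org.apache.kafka.connect.data.Struct;
import org.apache.kafka.connect.transforms.Transformation;

public class TombstoneOnDelete<R extends ConnectRecord<R>> implements Transformation<R> {

    @Override
    public R apply(R record) {
        Object value = record.value();
        // Only inspect Struct values; anything else passes through untouched.
        if (value instanceof Struct) {
            Struct struct = (Struct) value;
            Object op = struct.schema().field("Operation") == null
                    ? null
                    : struct.get("Operation");
            if ("D".equals(op)) {
                // Null value and null value schema make this a tombstone;
                // with delete.enabled=true the JDBC sink deletes by the record key.
                return record.newRecord(record.topic(), record.kafkaPartition(),
                        record.keySchema(), record.key(), null, null, record.timestamp());
            }
        }
        // Condition not met: return the original record unchanged.
        return record;
    }

    @Override
    public ConfigDef config() {
        return new ConfigDef();
    }

    @Override
    public void close() {
    }

    @Override
    public void configure(Map<String, ?> configs) {
    }
}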
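And a sketch of the connector config that wires the transform in. The connector name, topic, and fully qualified transform class are placeholders; the JDBC sink options mirror the question:

{
  "name": "my-jdbc-sink",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
    "topics": "my-topic",
    "delete.enabled": "true",
    "pk.mode": "record_key",
    "transforms": "tombstoneOnDelete",
    "transforms.tombstoneOnDelete.type": "com.example.kafka.transforms.TombstoneOnDelete"
  }
}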

Hope this helps

answered Oct 26 '25 by bmoe24x