We have a message scheduler that generates a hash key from a message's attributes and uses it as the record key when publishing to a Kafka topic.
This is done for de-duplication purposes. However, I am not sure how I could test this de-duplication without actually setting up a local cluster and checking that it performs as expected.
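To make the scenario concrete, here is a minimal sketch of the kind of key generation described above. The function name `message_key` and the flat-dict attribute shape are assumptions, not the scheduler's actual code; the point is only that the key must be deterministic so identical attributes always collide.

```python
import hashlib
import json

def message_key(attributes: dict) -> str:
    """Build a deterministic hash key from message attributes.

    Sorting the keys makes the JSON serialization stable, so the same
    attributes always produce the same key regardless of dict order.
    """
    canonical = json.dumps(attributes, sort_keys=True, separators=(",", ":"))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

# Two messages with the same attributes produce the same key, so a
# keyed de-duplication step downstream can drop the second one.
a = message_key({"id": 42, "type": "reminder"})
b = message_key({"type": "reminder", "id": 42})
assert a == b
```

The key-generation logic itself is plain code with no Kafka dependency, so it can be unit-tested in isolation; the open question is only the broker-side behaviour.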
Searching online for tools that mock a Kafka topic has not helped, and I am concerned that I am perhaps thinking about this the wrong way.
Ultimately, whatever is used to mock the Kafka topic should behave the same way as a local cluster - i.e. provide de-duplication of records inserted with the same key.
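The behaviour being asked for can be sketched as a tiny in-memory stand-in: keep only the latest record per key, which is what a compacted Kafka topic eventually converges to. The class name is hypothetical, and note one important caveat: a real broker compacts asynchronously in background cleaner threads, so it does not drop duplicates at insert time the way this mock does.

```python
class CompactedTopicMock:
    """In-memory stand-in for a compacted Kafka topic.

    Keeps only the latest value per key. This is *stricter* than real
    log compaction, which only removes older records for a key
    eventually, not at produce time.
    """

    def __init__(self):
        # dict preserves insertion order; re-produced keys keep their slot
        self._log = {}

    def produce(self, key, value):
        self._log[key] = value

    def records(self):
        return list(self._log.items())

topic = CompactedTopicMock()
topic.produce("k1", "first")
topic.produce("k2", "other")
topic.produce("k1", "second")  # same key: replaces the earlier record
assert topic.records() == [("k1", "second"), ("k2", "other")]
```

A mock like this can exercise your scheduler's keying logic, but precisely because it is stricter than the broker, it cannot prove that Kafka itself de-duplicates the way you expect.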
Are there any such tools?
If you need to verify a Kafka-specific feature, or an implementation that depends on a Kafka-specific feature, then the only way to do it is by using Kafka!
Does Kafka have any tests around its deduplication logic? If so, relying on those upstream tests may be enough to mitigate your organization's perceived risk of failure.
If Kafka does NOT have any sort of tests around its topic deduplication, or you are concerned about breaking changes, then it is important to have automated checks around Kafka-specific functionality. This can be done through integration tests. I have had much success recently with Docker-based integration test pipelines. Once the initial legwork of creating a Kafka Docker image is done (one is probably already available from the community), it becomes trivial to set up integration test pipelines. A pipeline could look like:
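As a sketch of such a pipeline (the image name, ports, and steps below are assumptions to adjust for your environment, not a prescribed setup):

```yaml
# docker-compose.yml - single-broker test environment (sketch).
# "apache/kafka" is an assumed community image; substitute whatever
# image your team already uses.
services:
  kafka:
    image: apache/kafka:latest
    ports:
      - "9092:9092"
```

1. `docker compose up -d` to start the broker.
2. Wait until the broker accepts connections on `localhost:9092`.
3. Run the integration tests: produce records with duplicate keys, then consume and assert on what comes back.
4. `docker compose down` to tear everything down, so each run starts clean.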
I think the important thing is to make sure Kafka integration tests are minimized to ONLY include tests that absolutely rely on Kafka-specific functionality. Even using Docker Compose, they may be orders of magnitude slower than unit tests (roughly 1 ms vs. 1 s per test). Another thing to consider is that the overhead of maintaining an integration pipeline may outweigh the risk of simply trusting that Kafka will provide the topic deduplication it claims to.