Good data quality is one of the most critical requirements in decoupled architectures such as microservices or data mesh. Apache Kafka became the de facto standard for these architectures, but Kafka itself is a dumb broker that only stores byte arrays; the Schema Registry enforces message structures on top of it. This blog post looks at enhancements that leverage data contracts for policies and rules to enforce good data quality at the field level and to enable advanced use cases such as routing malicious messages to a dead letter queue.
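To make the Schema Registry's role concrete, here is a minimal sketch of a producer that serializes Avro records against a registered schema, so structurally invalid data is rejected at serialization time rather than landing on the topic as opaque bytes. It assumes a Confluent Schema Registry and the `KafkaAvroSerializer`; the topic name, schema, and endpoints are placeholders.

```java
import java.util.Properties;
import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericRecord;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;

public class OrderProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG,
                "org.apache.kafka.common.serialization.StringSerializer");
        // The Avro serializer registers/validates the schema with Schema Registry
        // before any bytes reach the broker.
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG,
                "io.confluent.kafka.serializers.KafkaAvroSerializer");
        props.put("schema.registry.url", "http://localhost:8081");

        // Example schema (placeholder): an order with an id and an amount.
        Schema schema = new Schema.Parser().parse(
                "{\"type\":\"record\",\"name\":\"Order\",\"fields\":["
              + "{\"name\":\"order_id\",\"type\":\"string\"},"
              + "{\"name\":\"amount\",\"type\":\"double\"}]}");

        GenericRecord order = new GenericData.Record(schema);
        order.put("order_id", "o-4711");
        order.put("amount", 99.95);

        try (KafkaProducer<String, GenericRecord> producer = new KafkaProducer<>(props)) {
            // A record that does not conform to the registered schema fails here,
            // at serialization, instead of being stored as an arbitrary byte array.
            producer.send(new ProducerRecord<>("orders",
                    order.get("order_id").toString(), order));
        }
    }
}
```

The schema guarantees structure only; the data contract enhancements discussed below add field-level policies and rules on top of it.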

From Point-To-Point and Spaghetti To Decoupled Microservices With Apache Kafka

Point-to-point HTTP / REST APIs create tightly coupled services. Data lakes and lakehouses enforce a monolithic architecture instead of open data sharing and choosing the best technology for each problem. Hence, Apache Kafka became the de facto standard for microservice and data mesh architectures. Data streaming with Kafka is complementary (not competitive!) to APIs, data lakes/lakehouses, and other data platforms.
