Route, Layer and Process Kafka Messages with Tansu Services
Learn how Tansu uses the Service and Layer traits to route, layer and process Apache Kafka messages in a modular and composable way.
I’m Peter, the founder of tansu.io. We develop, license and support an Apache Kafka® compatible broker with PostgreSQL and S3 storage engines.
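The Service and Layer pattern will be familiar from crates such as tower: a Service turns a request into a response, and a Layer wraps one service in another to add behaviour such as logging, metrics or routing around it. The following is a minimal, dependency-free sketch of that idea; the simplified synchronous trait shapes and the `Broker`, `LogLayer` and `LogService` types are illustrative assumptions, not Tansu's actual definitions.

```rust
/// A service turns a request into a response.
trait Service<Request> {
    type Response;
    fn call(&mut self, req: Request) -> Self::Response;
}

/// A layer wraps one service in another, adding behaviour around it.
trait Layer<S> {
    type Service;
    fn layer(&self, inner: S) -> Self::Service;
}

/// Innermost service: a stand-in for handling a Kafka API request.
struct Broker;

impl Service<String> for Broker {
    type Response = String;

    fn call(&mut self, req: String) -> Self::Response {
        format!("handled: {req}")
    }
}

/// A layer that logs every request before delegating to the inner service.
struct LogLayer;

struct LogService<S> {
    inner: S,
}

impl<S> Layer<S> for LogLayer {
    type Service = LogService<S>;

    fn layer(&self, inner: S) -> Self::Service {
        LogService { inner }
    }
}

impl<S> Service<String> for LogService<S>
where
    S: Service<String, Response = String>,
{
    type Response = String;

    fn call(&mut self, req: String) -> Self::Response {
        println!("request: {req}");
        self.inner.call(req)
    }
}

fn main() {
    // Compose: LogLayer wraps the innermost Broker service.
    let mut service = LogLayer.layer(Broker);
    let response = service.call("ApiVersions".to_string());
    println!("{response}");
}
```

Running the sketch prints the logged request followed by the response, showing how a layer composes around an inner service without that service knowing it has been wrapped.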
Learn how Tansu validates Kafka messages and automatically converts them into Apache Parquet format, simplifying data processing. In this tutorial we use a Protocol Buffer schema to transform taxi ride data into Parquet files; Apache Avro and JSON schemas are supported as well. We also cover how Tansu integrates with Apache Kafka, validates messages against a schema, and is configured with storage engines such as S3 and PostgreSQL, before querying the resulting Parquet files with DuckDB.
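Once the messages land as Parquet, any Parquet-aware tool can read them. As a quick illustration, here is a sketch using the duckdb Rust crate to count rows in a hypothetical rides.parquet output file; the file name is a placeholder, and the tutorial may equally use the DuckDB CLI for the same query.

```rust
use duckdb::{params, Connection, Result};

fn main() -> Result<()> {
    // Open an in-memory DuckDB database; no server required.
    let conn = Connection::open_in_memory()?;

    // "rides.parquet" is a placeholder for the Parquet file Tansu produced.
    let mut stmt = conn.prepare("SELECT count(*) FROM read_parquet('rides.parquet')")?;
    let rides: i64 = stmt.query_row(params![], |row| row.get(0))?;

    println!("{rides} taxi rides");
    Ok(())
}
```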
Tansu, a Kafka-compatible broker with S3 or PostgreSQL storage engines, undergoes automated smoke testing with GitHub workflows and BATS.
In this article we deploy Tansu on Fly using Tigris Data's S3-compatible storage.