SQLite storage for Tansu
Using SQLite as a storage engine with Tansu, a Kafka-compatible streaming platform: producing and consuming Protobuf messages using generated test data.
Learn how Tansu uses the Service and Layer traits to route, layer, and process Apache Kafka messages in a modular and composable way.
Learn how to use Tansu to validate Kafka messages and automatically convert them into Apache Parquet format. In this tutorial, we use a Protocol Buffer schema to transform taxi ride data into Parquet files; Apache Avro and JSON schemas are supported as well. Discover how Tansu integrates with Apache Kafka, validates messages against schemas, and can be configured with storage engines such as S3 and PostgreSQL, then use tools like DuckDB to query the resulting Parquet files.
Tansu, a Kafka-compatible broker using S3 or PostgreSQL, undergoes automated smoke testing with GitHub workflows and BATS.
In this article we deploy Tansu on Fly using Tigris Data's S3-compatible storage.
Using serde, quote, syn and proc_macro2 to implement the Kafka protocol in Rust.