I’m Peter, the founder of tansu.io. We develop, license and support an Apache Kafka® compatible broker with SQLite, PostgreSQL and S3 storage engines, along with Apache Parquet, Apache Iceberg and Delta Lake support.

SQLite storage for Tansu

Use SQLite as a storage engine with Tansu, a Kafka compatible streaming platform, to produce and consume Protobuf messages using generated test data, as in the sketch below.
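
A minimal sketch of producing and consuming Protobuf messages against a Tansu broker, using the kafka-python client. The broker address localhost:9092, the topic name "taxi", and the generated module taxi_pb2 (with its Ride message and fields) are assumptions for illustration, not part of Tansu itself; substitute the classes generated by protoc from your own schema.

    # Produce and consume Protobuf-encoded messages via a Kafka-compatible broker.
    from kafka import KafkaProducer, KafkaConsumer
    import taxi_pb2  # hypothetical module generated by protoc from taxi.proto

    # Serialize one Protobuf message and send it to the "taxi" topic.
    producer = KafkaProducer(bootstrap_servers="localhost:9092")
    ride = taxi_pb2.Ride(vendor_id=1, passenger_count=2, trip_distance=3.5)
    producer.send("taxi", value=ride.SerializeToString())
    producer.flush()

    # Read the topic back from the beginning and decode each record.
    consumer = KafkaConsumer("taxi",
                             bootstrap_servers="localhost:9092",
                             auto_offset_reset="earliest",
                             consumer_timeout_ms=5000)
    for record in consumer:
        decoded = taxi_pb2.Ride()
        decoded.ParseFromString(record.value)
        print(decoded)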

Effortlessly Convert Kafka Messages to Apache Parquet with Tansu: A Step-by-Step Guide

Learn how Tansu validates Kafka messages and automatically converts them into Apache Parquet format, simplifying data processing. This tutorial demonstrates using a Protocol Buffers schema to transform taxi ride data into Parquet files, with Apache Avro and JSON schemas supported as well. Discover how Tansu integrates with Apache Kafka, supports schema validation, and is easily configured with storage engines such as S3 and PostgreSQL, and how tools like DuckDB can query the resulting Parquet files, making your data pipeline seamless and efficient.
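
As a taste of the querying step, here is a minimal sketch using DuckDB's Python API to inspect the Parquet output. The file name rides.parquet and the trip_distance column are assumptions for illustration; point the query at the Parquet files written for your topic (a local path, or an s3:// URL with DuckDB's httpfs extension loaded).

    # Query Parquet output with DuckDB.
    import duckdb

    con = duckdb.connect()
    rows = con.execute(
        "SELECT count(*) AS rides, round(avg(trip_distance), 2) AS avg_distance "
        "FROM read_parquet('rides.parquet')"
    ).fetchall()
    print(rows)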