In today's data-driven world, efficient and reliable data APIs are crucial for businesses to make informed decisions. However, building low-latency data APIs that connect to multiple data sources is a daunting task, requiring extensive knowledge of data sources, data transformations, and REST or gRPC API generation.
This is where Dozer comes in: an open-source data API backend built around a real-time SQL engine that automatically generates REST and gRPC APIs. Using a simple YAML configuration, Dozer can connect to any data source, transform and store data in an embedded cache powered by LMDB, automatically create secondary indexes, and serve low-latency data APIs.

With Dozer, it's possible to combine multiple data sources in real time and instantly obtain low-latency data APIs with nothing more than that configuration. In this talk, we'll demonstrate how easy and powerful it is to build data APIs with Dozer using Python and Postgres.
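To make the YAML-driven workflow concrete, here is a minimal sketch of what such a configuration can look like. The overall shape (connections, sources, endpoints) follows Dozer's documented configuration layout, but the specific names, credentials, and table details below are illustrative assumptions, not values from this talk:

```yaml
# Illustrative Dozer configuration sketch; connection details,
# table names, and paths are placeholder assumptions.
app_name: pycon-demo
connections:
  - name: pg
    config: !Postgres
      user: postgres
      password: postgres
      host: localhost
      port: 5432
      database: demo
sources:
  - name: users
    table_name: users
    connection: pg
endpoints:
  - name: users
    path: /users
    table_name: users
```

From a file like this, Dozer ingests the Postgres table into its embedded cache and exposes it through generated REST and gRPC endpoints, with no API code written by hand.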
We will then walk through a concrete use case, showing how Dozer can be used with Python and Postgres to create a data API. Starting from the YAML configuration file that Dozer uses to generate the API endpoints, we will show how Dozer can combine data from multiple sources and expose the result through a single endpoint.
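Once Dozer is serving the generated endpoints, they can be consumed from Python like any other HTTP API. The sketch below builds a query request against a hypothetical generated REST endpoint; the base URL, the `/query` path, and the `$filter`/`$limit` body shape are assumptions for illustration, not guaranteed parts of Dozer's API:

```python
import json
import urllib.request

# Assumed local Dozer REST address; adjust to your deployment.
BASE_URL = "http://localhost:8080"

def build_query(endpoint: str, filter_: dict = None, limit: int = 50) -> urllib.request.Request:
    """Build a POST request against a hypothetical Dozer query endpoint.

    The payload shape ($filter/$limit) is an illustrative assumption.
    """
    body = json.dumps({"$filter": filter_ or {}, "$limit": limit}).encode()
    return urllib.request.Request(
        f"{BASE_URL}/{endpoint}/query",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# Construct a request for up to 10 users from Singapore.
req = build_query("users", {"country": "SG"}, limit=10)
# Sending it requires a running Dozer instance, e.g.:
#   resp = urllib.request.urlopen(req)
```

Because the heavy lifting (ingestion, caching, indexing) happens inside Dozer, the client side stays this small.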
Attendees will leave this talk with a clear understanding of how Dozer can be used to create powerful data APIs with minimal setup and configuration, along with a better sense of the benefits of real-time SQL engines and how they can be leveraged to build fast, efficient data pipelines.