Today, I’m excited to announce that DeltaStream has secured $15M in Series A funding from New Enterprise Associates (NEA), Galaxy Interactive, and Sanabil Investments. This brings our total raised to $25M and will accelerate DeltaStream's vision of providing a serverless stream processing platform to manage, secure, and process all streaming data.

The Beginnings of Our Story

I joined Confluent in 2016 because I was given the opportunity to build a SQL layer on top of Kafka so users could build streaming applications in SQL. ksqlDB was the product we created, and it was one of the first SQL processing layers on top of Apache Kafka. While ksqlDB was a significant first step, it had limitations, including being too tightly coupled with Kafka, working with only a single Kafka cluster, and generating significant network traffic on the Kafka cluster.

The need for a next-generation stream processing platform was obvious. It had to be a completely new platform built from the ground up, so it made sense to start from scratch. Starting DeltaStream was the beginning of a journey to revolutionize the way organizations manage and process streaming data. The challenges we set out to solve were:

  1. Ease of use: writing SQL queries is all a user should worry about
  2. Have a single data layer to process/analyze all streaming and batch data, such as data from Kafka, Kinesis, Postgres, and Snowflake
  3. Standardize and authorize access to all data
  4. Enable high-scale and resiliency
  5. Offer flexible deployment models

Building DeltaStream

At DeltaStream, Apache Flink is our processing/computing engine. Apache Flink has emerged as the gold standard for stream processing, with proven capabilities and a large, vibrant community. It’s a foundational piece of our platform, but there’s much more. Here is how we solved the challenges outlined above:

Ease of use

We have abstracted away the complexities of running Apache Flink and made it serverless. Users don’t have to think about infrastructure and can instead focus on writing queries. DeltaStream handles all the operations, including fault tolerance and elasticity.
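To give a feel for this, the snippet below is a minimal sketch of a continuous SQL query; the stream and column names are invented, and the exact DeltaStream syntax may differ:

    -- Hypothetical example: keep only failed page views from a source stream.
    -- Stream/column names are illustrative; exact DeltaStream syntax may differ.
    CREATE STREAM error_views AS
      SELECT viewtime, userid, pageid
      FROM pageviews
      WHERE status_code >= 500;

The point is that this is the entire user-facing surface: there is no cluster to size, no job to deploy, and no checkpointing to configure.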

Single Data Layer

DeltaStream can read across many modern streaming stores, databases, and data lakes. We then organize this data into a logical hierarchy, making it easy to analyze and process the underlying data. 
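As a rough sketch of what that logical hierarchy enables, the hypothetical query below joins a Kafka-backed stream with a Postgres-backed table through qualified names; the store, schema, and column names are invented, and the exact syntax may differ:

    -- Hypothetical example: enrich a Kafka-backed orders stream with a
    -- Postgres-backed customers table via the logical namespace.
    -- Names and syntax are illustrative only.
    SELECT o.order_id, o.amount, c.segment
    FROM kafka_store.public.orders o
    JOIN postgres_store.public.customers c
      ON o.customer_id = c.customer_id;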

Standardize Access

We built a governance layer to manage access through fine-grained permissions across all data rather than across disparate data stores. For example, you can manage access to data in your Kafka clusters and Kinesis streams all within DeltaStream.
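As an illustration, role-based grants in such a governance layer might look like the sketch below; the role, store, and object names are invented, and the exact DeltaStream syntax may differ:

    -- Hypothetical example: give one role read access to data regardless of
    -- whether it is backed by Kafka or Kinesis. Names/syntax are illustrative.
    GRANT SELECT ON kafka_store.public.orders TO ROLE analysts;
    GRANT SELECT ON kinesis_store.public.clickstream TO ROLE analysts;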

Enable High Scale and Resiliency

Each DeltaStream query is run in isolation, eliminating the “noisy neighbor” problem. Queries can be scaled up/down independently.

Flexible Deployment Models

In addition to our cloud service, we provide a bring-your-own-cloud (BYOC) option for companies that want more control over their data. This is essential for highly regulated industries and companies with strict data security policies.

Beyond Flink itself, we also wanted to provide a full suite of analytics by enabling users to build real-time materialized views with sub-second latency.
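As a minimal sketch, a continuously maintained materialized view over a stream could be declared along these lines; the names are invented, and the exact DeltaStream syntax may differ:

    -- Hypothetical example: a materialized view that is kept up to date as new
    -- order events arrive. Names/syntax are illustrative only.
    CREATE MATERIALIZED VIEW orders_by_status AS
      SELECT status, COUNT(*) AS order_count
      FROM orders
      GROUP BY status;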

What’s next for DeltaStream

We’re just getting started. Here are a few things we’re planning:

  • Increase the number of stores we can read from and write to, including platforms such as Apache Iceberg and ClickHouse
  • Increase the number of clients/adapters we support, including dbt and Python
  • Support multi-cloud deployments
  • Leverage AI to enable users with no SQL knowledge to interact with DeltaStream

If you are a streaming and real-time data enthusiast and would like to help build the future of streaming data, please reach out to us; we are hiring for engineering and GTM roles!

If you want to experience how DeltaStream enables users to maximize the value of their streaming data, try it for yourself by heading to deltastream.io.

Finally, I would like to thank our customers, community, team, and partners—including our investors—for their unwavering support. Together, we are making stream processing a reality for organizations of all sizes.