WarpStream

@warpstream
4 Followers
13 Following
32 Posts
WarpStream is an Apache Kafka® compatible data streaming platform built directly on top of object storage: no inter-AZ bandwidth costs, no disks to manage, and infinite scalability, all within your VPC.
Homepage: https://www.warpstreamlabs.com
YouTube: https://www.youtube.com/@warpstreamlabs

I did a fun thing today. I used Streamlit for the first time and wired it up to the
@warpstream #ApacheKafka drop-in replacement, so I could have a webpage that prompts me for cluster credentials, gives me a list of available topics, and lets me pick some records to see, which it then displays on the page. I did one last thing: take inbound data and summarize a field into a bar chart. It's all in this short video.

#python #kafka #dataengineering #streamingdata

https://youtu.be/NGRHezHyOmk
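The field-summarization step described above boils down to tallying one field's values across the consumed records so they can feed a bar chart. A minimal sketch in plain Python (the function name and record shape are my own, not from the video):

```python
from collections import Counter

def summarize_field(records, field):
    """Tally the values of one field across consumed records.

    The resulting mapping of value -> count is ready to hand to a
    charting widget such as Streamlit's st.bar_chart.
    Illustrative only; records are assumed to be plain dicts.
    """
    return Counter(r[field] for r in records if field in r)
```

For example, `summarize_field([{"color": "red"}, {"color": "blue"}, {"color": "red"}], "color")` yields counts of 2 for "red" and 1 for "blue".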


How did I ingest 4 million records from @warpstream into #ParadeDB on my laptop? Using #Bento - find out how. #apachekafka #dataengineering #postgres
https://youtu.be/ZjBQ-DzrYyw

@warpstream provides powerful and easy-to-use features for managing your #ApacheKafka compatible topics in a WarpStream cluster. I made this 2-minute video to show you how it all works.

#dataengineering #datastreaming #kafka #streamingdata

https://youtu.be/yn0dMWoGw4M


💸 We're making a bold claim: we can reduce Kafka costs by as much as 10x (by removing inter-AZ networking fees and using object storage). Our benchmarks show how we do it. 📊
#apachekafka #dataengineering
https://www.warpstream.com/blog/warpstream-benchmarks-and-tco
Public Benchmarks and TCO Analysis

Benchmarking databases – and maintaining fairness and integrity while doing so – is a notoriously difficult task to get right, especially in the data streaming space. Vendors want their systems to produce mouth-watering results, and so unnatural configurations divorced from customer realities (AKA “vanity” benchmarks) get tested, and it's ultimately the end user who is left holding the bag upon realizing that their actual TCO is a lot higher than they were led to believe.

Interested in learning more about @warpstream_labs zero disk architecture and how it makes Kafka cheaper and simpler? Check out @richardartoul's complete Kafka Summit London talk.

Wouldn't it be great if you could run Kafka in your own environment, eliminate inter-zone networking fees, never have data leave your environment, and not have to worry about cross-account permissions? You can with WarpStream Bring Your Own Cloud.
#apachekafka #dataengineering #streamingdata
https://www.warpstream.com/blog/secure-by-default-how-warpstreams-byoc-deployment-model-secures-the-most-sensitive-workloads
Secure by default: How WarpStream’s BYOC deployment model secures the most sensitive workloads

WarpStream's Zero Disk Architecture enables a BYOC deployment model that is secure by default and does not require any external access to the customer's environment.

I've wanted to find a way to take an #ApacheKafka producer from @warpstream and get its data into DuckDB for a couple of months. Last week, I was looking at Estuary and saw they had both a #Kafka connector and a MotherDuck connector, so I saw my chance. I hit up their data engineer Daniel Palma and, by gum, it took us just a few minutes to generate some good sample data with ShadowTraffic and get two tables into MotherDuck.

#motherduck #duckdb #dataengineering #streamingdata
https://youtu.be/920VzW6jE1E

Warp Solutions: Estuary & WarpStream

Huge upgrade: WarpStream understands your data now. Integrate with any external schema registry and fully validate produced records (field level). Protobuf support is coming soon 😉
#apachekafka #dataengineering #streamingdata
https://www.warpstream.com/blog/announcing-warpstream-schema-validation
Announcing WarpStream Schema Validation

WarpStream now has the capability to connect to external schema registries, and verify that records actually conform to the provided schema.
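The core idea of field-level validation can be sketched in a few lines of plain Python. This is only an illustration of the concept; WarpStream validates records against a real external schema registry (e.g. Avro or JSON Schema), not this toy format, and the function name and schema shape here are hypothetical:

```python
def conforms(record: dict, schema: dict) -> bool:
    """Field-level check: every field declared in the schema must be
    present in the record with the expected Python type.

    schema maps field name -> expected type, e.g. {"id": int, "name": str}.
    Illustrative only; not the WarpStream API.
    """
    return all(
        field in record and isinstance(record[field], expected_type)
        for field, expected_type in schema.items()
    )
```

A broker that enforces this at produce time can reject malformed records before they ever land in a topic, instead of leaving consumers to discover them later.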

Excellent post from our engineer Aratz on measuring consumer group lag in time instead of offsets.
#apachekafka #dataengineering #streamingdata
https://www.warpstream.com/blog/the-kafka-metric-youre-not-using-stop-counting-messages-start-measuring-time
The Kafka Metric You're Not Using: Stop Counting Messages, Start Measuring Time

Traditional offset-based monitoring can be misleading due to varying message sizes and consumption rates. To address this, you can introduce a time-based metric for a more accurate assessment of consumer group lag.

In addition to WarpStream being SOC 2 Type I certified, we are now SOC 2 Type II certified. This reinforces our best-in-class security practices and production readiness. Learn more and request our report via our Trust Center 👉 https://security.warpstream.com
#soc2 #kafka
WarpStream Labs Trust Center
