Updated March 2023

Below are eight case studies showcasing how our Kafka experts have supported clients with Kafka challenges.  These case studies span a variety of fields and highlight the wide range of applications for Kafka across industries.

Information Security

CHALLENGE:  Client needs to monitor all servers, devices, applications, and laptops at this 500+ employee company.  Devices that cannot produce directly to Kafka must be configured to output to a second location that can produce to Kafka on their behalf.  Client needs real-time alerting and dashboards to view the status of the company.

SOLUTION:  Dattell worked with several teams in the company to deploy log collection and producing programs across all servers, laptops, and applications.  Where needed, Dattell worked with the company to create unique pipelines for devices, such as firewalls, that do not support producing directly to Kafka.  Due to the sensitive nature of the data being collected, Dattell secured the entire pipeline both in transit and at rest.
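
The security layer is client-specific, but a minimal sketch of a log producer secured with mutual TLS (encryption in transit plus certificate verification on both sides) is shown below.  The broker address, topic name, and keystore/truststore paths are hypothetical placeholders; the matching broker-side settings are omitted, and encryption at rest is handled at the storage layer rather than in client code.

    import java.util.Properties;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerRecord;

    public class SecureLogProducer {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put("bootstrap.servers", "kafka1.example.internal:9093"); // hypothetical broker
            props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
            props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

            // Encrypt data in transit and verify certificates on both sides (mutual TLS).
            props.put("security.protocol", "SSL");
            props.put("ssl.truststore.location", "/etc/kafka/secrets/truststore.jks");    // hypothetical path
            props.put("ssl.truststore.password", "changeit");
            props.put("ssl.keystore.location", "/etc/kafka/secrets/client.keystore.jks"); // hypothetical path
            props.put("ssl.keystore.password", "changeit");
            props.put("ssl.key.password", "changeit");

            try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
                // Hypothetical topic for collected security events.
                producer.send(new ProducerRecord<>("security-logs", "host-42", "auth failure event"));
            }
        }
    }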

TACTICS:
•  Implemented multiple layers of security, including certificate verification and access control
•  Optimized for high throughput and low cost
•  Saved the client millions in licensing costs

Trading Brokerage

CHALLENGE:  Client needs exact message ordering, exactly-once delivery, and extremely low latency for the purchase and sale of cryptocurrencies on their exchange.

SOLUTION:  Dattell worked with the client to configure producers to send small batches of data to a specific Kafka partition, decreasing latency and preventing out-of-order messages.  Dattell optimized the Kafka brokers for durability and low latency, and assisted the client with consumer architectures that consume with low latency.
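
As a hedged illustration of these settings, the sketch below shows a producer configured for idempotent, strictly ordered, low-latency sends.  The broker address, topic, and key are hypothetical.

    import java.util.Properties;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerRecord;

    public class TradeProducer {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put("bootstrap.servers", "kafka.example.internal:9092"); // hypothetical broker
            props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
            props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

            // Idempotence deduplicates retried sends broker-side, so each message is
            // written once per partition and retries cannot reorder messages.
            props.put("enable.idempotence", "true");
            props.put("acks", "all");    // wait for all in-sync replicas: durability
            props.put("linger.ms", "0"); // send immediately: favor latency over batching

            try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
                // Keying every trade by instrument sends it to a single partition,
                // so all trades for that instrument arrive in strict order.
                producer.send(new ProducerRecord<>("trades", "BTC-USD", "sell 0.5 @ 27000"));
            }
        }
    }

For exactly-once guarantees spanning a full consume-transform-produce pipeline, Kafka's transactional producer (a transactional.id plus read_committed consumers) extends the same idea; it is omitted here for brevity.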

TACTICS:
•  Trained employees on best practices for low latency and guaranteed exactly-once delivery
•  Optimized for low latency and durability
•  100% uptime managed service

Machine Learning

CHALLENGE:  Oil rigs produce a large amount of data over spotty internet connections and with limited hardware resources.  A 100% complete data set is required to develop models that predict failures and guide preventative maintenance.

SOLUTION:  Dattell worked with the client on a minimal persistent producer implementation, sized for the rigs' limited hardware, and a Kafka broker cluster configured with the required processing guarantees.  Dattell worked with the client to implement a single consumer group per topic while maintaining durability and availability.
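
A sketch of a producer tuned for unreliable links follows; with idempotence enabled and a long delivery timeout, the client retries through outages without duplicating or reordering data.  The broker address, topic, and sizing are hypothetical, and durable on-disk buffering across process restarts would live in the application layer, which is not shown.

    import java.util.Properties;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerRecord;

    public class RigProducer {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put("bootstrap.servers", "rig-gateway.example.internal:9092"); // hypothetical
            props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
            props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

            props.put("enable.idempotence", "true");      // retries cannot duplicate or reorder
            props.put("acks", "all");                     // confirm replication before success
            props.put("retries", Integer.MAX_VALUE);      // keep retrying through link outages
            props.put("delivery.timeout.ms", "86400000"); // tolerate up to a day offline
            props.put("buffer.memory", "16777216");       // 16 MB cap for constrained hardware

            try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
                // When the buffer fills during an outage, send() blocks (up to
                // max.block.ms), giving the sensor loop natural backpressure.
                producer.send(new ProducerRecord<>("rig-sensors", "pump-7", "vibration=0.83"));
            }
        }
    }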

TACTICS:
•  100% data delivered
•  Accepted added cost for durability and availability
•  Lowered cost by accepting higher latency and lower throughput

Customer Tracking

CHALLENGE:  Client needs to monitor the driving performance of hundreds of thousands of vehicles while retaining as close to 100% of the data as possible, in order to develop machine learning techniques that predict the cost per vehicle required to remain profitable.

SOLUTION:  Dattell worked with the client to build infrastructure capable of processing 100+ TB of data per day.  Dattell optimized for the lowest hardware cost while guaranteeing at-least-once delivery of all messages.
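
At-least-once delivery depends on the consumer as much as the producer; the sketch below commits offsets only after records are processed, so a crash replays data instead of losing it.  The broker address, group id, and topic are hypothetical.

    import java.time.Duration;
    import java.util.Collections;
    import java.util.Properties;
    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.clients.consumer.ConsumerRecords;
    import org.apache.kafka.clients.consumer.KafkaConsumer;

    public class TelemetryConsumer {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put("bootstrap.servers", "kafka.example.internal:9092"); // hypothetical
            props.put("group.id", "vehicle-telemetry");                    // hypothetical
            props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
            props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
            props.put("enable.auto.commit", "false"); // commit only after processing succeeds

            try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
                consumer.subscribe(Collections.singletonList("vehicle-telemetry"));
                while (true) {
                    ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
                    for (ConsumerRecord<String, String> record : records) {
                        process(record); // persist to storage before committing
                    }
                    // Committing after processing means a crash replays records
                    // rather than losing them: at-least-once delivery.
                    consumer.commitSync();
                }
            }
        }

        static void process(ConsumerRecord<String, String> record) {
            System.out.printf("%s -> %s%n", record.key(), record.value());
        }
    }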

TACTICS:
•  Massive project focusing on providing high value for the lowest price
•  Data-driven decision making – benchmarks drove purchases
•  Saved the client tens of millions in licensing costs

Video Streaming

CHALLENGE:  Client needs to collect log data from thousands of applications running globally into a single location for analysis and discovery.  Many different types of applications and programming languages are in use across this large business.  The amount of data being indexed exceeds ten terabytes per day.

SOLUTION:  Dattell worked with the client to integrate different Kafka producers into their applications, optimizing for throughput.  Dattell built a Kafka broker cluster that handles spikes of up to 40 TB per day.  Dattell built a consumer pipeline that pulled data from Kafka, transformed it, and loaded it into Elasticsearch, saving the company tens of millions of dollars per year on licensing fees.
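
The main producer-side throughput levers are batching and compression; a hedged sketch with hypothetical broker address, topic, and sizing follows.

    import java.util.Properties;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerRecord;

    public class LogProducerConfig {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put("bootstrap.servers", "kafka.example.internal:9092"); // hypothetical
            props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
            props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

            // Throughput levers: bigger batches, a short linger to fill them, compression.
            props.put("batch.size", "262144");    // 256 KB batches amortize request overhead
            props.put("linger.ms", "20");         // wait up to 20 ms to fill a batch
            props.put("compression.type", "lz4"); // modest CPU cost, far less network and disk
            props.put("acks", "1");               // leader-only ack trades durability for speed

            try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
                producer.send(new ProducerRecord<>("app-logs", null, "GET /watch 200 12ms"));
            }
        }
    }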

TACTICS:
•  Optimized for throughput
•  Adapted Kafka implementation to the environment
•  Benchmarked implementation for definitive validation of high load

Medical Manufacturing

CHALLENGE:  Company automating the drug manufacturing process with multiple machines needs Kafka as a message bus that guarantees delivery of 100% of data while keeping costs low.

SOLUTION:  Dattell worked with the company to integrate open source Kafka producers, optimized for durability, availability, and cost, into their code.  Dattell built a Kafka broker cluster that acts both as a message bus for the machines producing drugs and as a starting point for artificial intelligence (AI) to determine the best procedure for making the drugs.  Dattell worked with the client to implement multiple consumer groups that consume the data both to perform normal tasks and to develop an AI.
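
Kafka fans the same stream out to independent consumer groups, each tracking its own offsets, which is how the normal control tasks and the AI work can read every message without interfering.  A minimal sketch follows, with group ids and topic name hypothetical.

    import java.util.Collections;
    import java.util.Properties;
    import org.apache.kafka.clients.consumer.KafkaConsumer;

    public class FanOutExample {
        static KafkaConsumer<String, String> consumerFor(String groupId) {
            Properties props = new Properties();
            props.put("bootstrap.servers", "kafka.example.internal:9092"); // hypothetical
            props.put("group.id", groupId);
            props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
            props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
            KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props);
            consumer.subscribe(Collections.singletonList("machine-events")); // hypothetical topic
            return consumer;
        }

        public static void main(String[] args) {
            // Two independent groups: each receives every message on the topic.
            KafkaConsumer<String, String> control = consumerFor("machine-control");
            KafkaConsumer<String, String> aiTraining = consumerFor("ai-training");
            // ...poll each consumer on its own thread (KafkaConsumer is not thread-safe).
        }
    }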

TACTICS:
•  100% data delivered
•  Accepted added cost for durability, low latency, and high availability
•  Lowered cost by relaxing scalability and throughput requirements

Precision Manufacturing

CHALLENGE:  Client needs exact parts and verification that all parts are installed correctly.  Client requires low latency, high throughput, and guaranteed at-least-once delivery.  If an unplanned explosion occurs and the machines' internal records cannot be recovered, as much externally captured data as possible is needed to discover the cause of the explosion.

SOLUTION:  Dattell worked with the client to build multiple Kafka broker clusters along with matching Kafka producer and consumer implementations.  The precision manufacturing cluster focused on durability, scalability, and availability at the expense of cost and latency.  The cluster serving the potentially explosive process focused on latency and at-least-once delivery at the expense of all other categories.
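
The two clusters' opposite priorities show up directly in topic settings.  The sketch below compresses the contrast onto a single cluster for illustration, with hypothetical topic names and sizes: a heavily replicated topic for the precision manufacturing data, and a single-replica, latency-first topic for the potentially explosive process.

    import java.util.List;
    import java.util.Map;
    import java.util.Properties;
    import org.apache.kafka.clients.admin.AdminClient;
    import org.apache.kafka.clients.admin.NewTopic;

    public class ClusterTopicSetup {
        public static void main(String[] args) throws Exception {
            Properties props = new Properties();
            props.put("bootstrap.servers", "kafka.example.internal:9092"); // hypothetical

            // Durable manufacturing topic: survive broker loss at the cost of latency.
            NewTopic manufacturing = new NewTopic("assembly-verification", 12, (short) 3)
                    .configs(Map.of("min.insync.replicas", "2"));

            // Low-latency telemetry topic: single replica, favoring speed over durability.
            NewTopic telemetry = new NewTopic("blast-telemetry", 12, (short) 1);

            try (AdminClient admin = AdminClient.create(props)) {
                admin.createTopics(List.of(manufacturing, telemetry)).all().get();
            }
        }
    }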

TACTICS:
•  Sub-1-millisecond delivery time
•  After reviewing the business cases, Dattell delivered multiple solutions
•  Isolated failures by using multiple clusters

Radio Streaming

CHALLENGE:  Client needs to collect log data from millions of user applications and route it to many different destinations, with varying stages of transformation along the way.

SOLUTION:  Dattell built a multi-tiered topic subscription and transformation platform in which a single message undergoes several transformations and insertions into various databases.  Dattell worked with the client to integrate different Kafka producers into their applications, optimizing for throughput.  Dattell built a Kafka broker cluster that handles spikes of up to ten terabytes per day.
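
Each tier of such a platform reduces to a consume-transform-produce loop; the sketch below shows one stage, with hypothetical topic names and a stand-in transformation.

    import java.time.Duration;
    import java.util.Collections;
    import java.util.Properties;
    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.clients.consumer.KafkaConsumer;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerRecord;

    public class TransformTier {
        public static void main(String[] args) {
            Properties cProps = new Properties();
            cProps.put("bootstrap.servers", "kafka.example.internal:9092"); // hypothetical
            cProps.put("group.id", "enrich-tier");                          // hypothetical
            cProps.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
            cProps.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");

            Properties pProps = new Properties();
            pProps.put("bootstrap.servers", "kafka.example.internal:9092");
            pProps.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
            pProps.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

            try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(cProps);
                 KafkaProducer<String, String> producer = new KafkaProducer<>(pProps)) {
                consumer.subscribe(Collections.singletonList("logs-raw")); // hypothetical topic
                while (true) {
                    for (ConsumerRecord<String, String> record : consumer.poll(Duration.ofSeconds(1))) {
                        // Stand-in transformation; each tier applies its own step.
                        String enriched = record.value().trim().toLowerCase();
                        producer.send(new ProducerRecord<>("logs-enriched", record.key(), enriched));
                    }
                }
            }
        }
    }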

TACTICS:
•  Built custom platform to match unique need
•  Optimized for throughput and durability
•  Benchmarked each stage of the process to prevent any one service from becoming a bottleneck

Have Kafka Questions?

Managed Kafka on your environment with 24/7 support.

Consulting support to implement, troubleshoot, and optimize Kafka.

Schedule a call with a Kafka solution architect.
