Technical Deep Dive: Use Kafka to Optimize Real-Time Analytics in Financial Services & IoT Applications

Date: Wednesday, February 6, 2019
Time: 8:00 am Pacific / 10:00 am Central / 11:00 am Eastern / 4:00 pm London / 5:00 pm CET


When it comes to the fast-paced nature of capital markets and IoT, the ability to analyze data in real time is critical to gaining an edge. It’s not just about the quantity of data you can analyze at once; it’s about the speed, scale, and quality of the data at your fingertips.

Modern streaming data technologies like Apache Kafka and the broader Confluent platform can help detect opportunities and threats in real time. They can improve profitability, yield, and performance. Combining Kafka with Panopticon visual analytics provides a powerful foundation for optimizing your operations.

Use cases in capital markets include transaction cost analysis (TCA), risk monitoring, surveillance of trading and trader activity, compliance, and optimizing profitability of electronic trading operations. Use cases in IoT include monitoring manufacturing processes, logistics, and connected vehicle telemetry and geospatial data.

This webinar will include in-depth practical demonstrations of how Confluent and Panopticon work together to support several key applications. In this webinar, you will learn:

  • Why Apache Kafka is widely used to improve the performance of complex operational systems
  • How Confluent and Panopticon open new opportunities to analyze operational data in real time
  • How to quickly identify and act on anomalies, fast-emerging trends, clusters, and outliers
  • How to scale data ingestion and data processing
  • How to build new analytics dashboards in minutes
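
To make the anomaly-detection theme above concrete, here is a minimal sketch of one common streaming approach: flagging values that deviate sharply from a rolling baseline (a z-score check). All names here are hypothetical and the input is a plain Python list; in a real deployment the values would arrive from a Kafka topic via a consumer loop, and the demos in the webinar use Panopticon rather than hand-rolled code.

```python
from collections import deque
from statistics import mean, stdev

def make_detector(window=50, threshold=3.0):
    """Return a function that flags values deviating more than
    `threshold` standard deviations from a rolling mean."""
    history = deque(maxlen=window)  # bounded window of recent values

    def check(value):
        is_anomaly = False
        if len(history) >= 10:  # wait for a stable baseline
            mu, sigma = mean(history), stdev(history)
            if sigma > 0 and abs(value - mu) / sigma > threshold:
                is_anomaly = True
        history.append(value)
        return is_anomaly

    return check

# Hypothetical usage: feed a stream of trade prices through the detector.
detect = make_detector(window=20, threshold=3.0)
stream = [100.0, 100.2, 99.9, 100.1, 100.0, 99.8, 100.3, 100.1,
          100.0, 99.9, 100.2, 150.0]  # the last value is an obvious spike
flags = [detect(v) for v in stream]
print(flags[-1])  # the spike is flagged: True
```

The same per-message check could run inside a Kafka consumer callback, which is what makes this style of detection practical at streaming scale.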

Speakers


Peter Simpson
VP Panopticon
Streaming Analytics
Datawatch
Peter Simpson drives the vision and strategy for Datawatch’s Panopticon streaming analytics solutions. He joined Datawatch through its acquisition of Panopticon, where he was SVP of Research & Development, responsible for product strategy, development, sales engineering, and services, and helped expand the customer base to most top-tier global financial institutions.

Prior to Datawatch, he was a Product Manager at Instant Information, a provider of news analytics solutions to trading desks. He also held analytical roles for several years at HSBC Global Markets within Technology, Equities, and Research. Peter holds an MSc in Information Systems Engineering and a BSc in Physics with Space Science & Technology.


Tom Underhill
Partner Solutions Architect
Confluent
Tom Underhill helps Confluent’s many partners succeed in implementing solutions on the Confluent Platform. He joined Confluent after several years of consulting in the Big Data and analytics space, where he led technical teams delivering large-scale integration projects. His passion is liberating data from silos, turning batch into real time, and building systems that scale.