Technical Deep Dive: Using Kafka to Optimize Real-Time Analytics in Financial Services

Date: Wednesday, January 9, 2019
Time: 8:00 am Pacific / 10:00 am Central / 11:00 am Eastern / 4:00 pm London / 5:00 pm CET

Click here to register

In the fast-paced capital markets, the ability to analyze data in real time is critical to gaining an edge. It’s not just about the quantity of data you can analyze at once; it’s about the speed, scale, and quality of the data at your fingertips. The faster you can respond to your data, the more easily you can capitalize on fast-moving market events.

Modern streaming data technologies like Apache Kafka and Confluent KSQL, the streaming SQL engine for Kafka, can help detect trading opportunities and threats in real time, improving profitability and performance. Combining Kafka with Panopticon visual analytics provides a powerful foundation for optimizing electronic trading operations.
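
To make this concrete, here is a minimal sketch of the kind of KSQL statement involved: it declares a stream directly over a Kafka topic of trade events. The topic name and schema (trades, symbol, price, qty) are illustrative assumptions, not material from the webinar itself.

    -- Hypothetical trade stream; the topic name and schema are assumptions for illustration.
    CREATE STREAM trades (symbol VARCHAR, price DOUBLE, qty BIGINT)
      WITH (KAFKA_TOPIC='trades', VALUE_FORMAT='JSON');

Once a stream like this is declared, it can be queried continuously with SQL, and the results can be pushed onward to visual analytics tools such as Panopticon as events arrive.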

Use cases in capital markets include transaction cost analysis (TCA), risk monitoring, surveillance of trading and trader activity, compliance, and optimizing profitability of electronic trading operations.

This webinar will include in-depth, practical demonstrations of how to use Apache Kafka, Confluent KSQL, and Panopticon together to support several key applications. You will learn:

  • Why Apache Kafka is widely used to improve electronic trading performance
  • How KSQL and Panopticon open up opportunities to analyze and improve profitability in real time
  • How to quickly identify and act on anomalies (see the sketch after this list)
  • How to identify and react immediately to market events
  • How to scale data ingestion and data processing
  • How to build new analytics dashboards in minutes
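
As a sketch of the anomaly-detection point above, a windowed KSQL aggregation over the assumed trades stream from the earlier example can flag symbols whose activity spikes within a one-minute window; the 1,000-trades-per-minute threshold is an arbitrary illustrative value.

    -- Continuously flag symbols with unusually high activity per one-minute window.
    -- The threshold of 1000 trades per minute is an illustrative assumption.
    SELECT symbol, COUNT(*) AS trade_count
    FROM trades
    WINDOW TUMBLING (SIZE 1 MINUTE)
    GROUP BY symbol
    HAVING COUNT(*) > 1000;

Because KSQL queries run continuously, output like this can be written back to a Kafka topic and picked up by a live dashboard rather than polled in batches.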

Speakers


Peter Simpson
VP, Panopticon Streaming Analytics
Datawatch

Peter Simpson is responsible for driving the vision and strategy for Datawatch’s Panopticon streaming analytics solutions. Peter joined Datawatch through its acquisition of Panopticon, where, as SVP of Research & Development, he was responsible for product strategy, development, sales engineering, and services, and helped expand the customer base to most top-tier global financial institutions.

Prior to Datawatch, he was Product Manager at Instant Information, a provider of news analytics solutions to trading desks. He also spent several years in analytical roles at HSBC Global Markets across Technology, Equities, and Research. Peter holds an MSc in Info Systems Engineering and a BSc in Physics with Space Science & Technology.


Tom Underhill
Partner Solutions Architect
Confluent

Tom Underhill is responsible for helping Confluent’s many partners succeed in implementing solutions on the Confluent Platform. Tom joined Confluent after several years of consulting in the Big Data and Analytics space, where he led technical teams delivering large-scale integration projects. His passion has always been liberating data from silos, turning batch into real time, and building systems that scale.
