Market Data Volatility and Growth Make Appliance Refreshes Essential

Key considerations when designing your analytics appliance refresh for increasingly volatile market data that is set to grow by more than 50% over the next two years.

By Scott Peerbolte | July 12, 2019

A technology refresh is not optional for electronic traders. It is a business imperative, as market volatility, fueled by trade wars and economic disruption, drives more technology innovation and optimization in the pursuit of competitive advantage.

Following the $300 million that capital markets firms spent on data in 2018, 2019 will see a greater focus on gathering more of it, storing it in new ways, using machine learning and AI to analyze it, and acting on it more systematically than ever before, according to Greenwich Associates.

Data explosion

Market data volumes have increased by 30% over the last three years, and further growth of 56% is predicted over the next two¹. Higher volumes mean bigger traffic peaks around the market open and close. Increasing volatility means more frequent spikes as quantitative traders hunt for novel trading signals in news feeds. This more complex and dynamic mix of market data services is driving adoption of next-generation networks and infrastructure services, and it presents a technical challenge for analysis and capture appliances, which are often sized for mean market conditions rather than peaks.
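
To see why mean-based sizing falls short, consider a back-of-envelope projection. This is a minimal sketch: the 56% growth figure comes from the prediction above, while the current mean rate and the peak-to-mean burst ratio are hypothetical placeholders you would replace with your own measurements.

```python
# Back-of-envelope projection: grow today's mean rate by the predicted
# 56% over two years, then size for peaks rather than for the mean.
mean_rate_gbps = 4.0    # hypothetical current mean market data rate
growth = 0.56           # predicted two-year growth (OPRA proxy, see footnote)
peak_to_mean = 5.0      # hypothetical burst ratio at market open/close

projected_mean = mean_rate_gbps * (1 + growth)
required_peak = projected_mean * peak_to_mean

print(f"Projected mean rate:  {projected_mean:.1f} Gbps")
print(f"Peak capacity needed: {required_peak:.1f} Gbps")
# An appliance sized for today's 4 Gbps mean would face ~31 Gbps bursts
# after two years of growth.
```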

The ability to monitor and analyze vast quantities of data-in-motion over small time windows shows how algorithms, strategies and IT systems are keeping up with the volume. It provides insights for shaving milliseconds, or in some environments nanoseconds, off data delivery or execution times, while also providing intelligence to improve the aspects of order execution with the greatest impact on profitability.

Examining your infrastructure to ensure it can handle the increase in volatility and data volumes should be par for the course. The way monitoring appliances are configured in data centers, for example, should be inspected for readiness as the evolution from 10Gb to 40Gb, and eventually 100Gb, connectivity gathers pace. You will need a bigger pipe to handle the up-and-down swings of volatility as you aggregate data in your data center before sending it to a monitoring port.

At the same time, changing regulatory requirements around data retention have a direct impact on storage capacity. Different trading firms will have their own requirements for network data capture: whether it must be retained for an hour, a day, a week or a month determines the size of an appliance and its capability when it comes to writing and storing data.
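
To make the retention trade-off concrete, here is a minimal sketch of the storage arithmetic. The 10Gbps sustained capture rate is a hypothetical assumption, and real sizing would also account for indexing overhead and traffic mix.

```python
# Storage required to retain full-rate network capture for a given window.
capture_rate_gbps = 10.0  # hypothetical sustained capture rate

def storage_tb(rate_gbps: float, hours: float) -> float:
    """Terabytes needed to retain `hours` of capture at `rate_gbps`."""
    bytes_total = rate_gbps / 8 * 1e9 * hours * 3600
    return bytes_total / 1e12

for label, hours in [("1 hour", 1), ("1 day", 24),
                     ("1 week", 24 * 7), ("1 month", 24 * 30)]:
    print(f"{label:>8}: {storage_tb(capture_rate_gbps, hours):8.1f} TB")
# At 10Gbps sustained, an hour of capture is ~4.5 TB; a month is over 3 PB.
```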

We regularly carry out performance health checks with customers, looking at their requirements and aligning them to next-generation architecture. We have a range of options to ensure Corvil appliances remain fit for purpose, and Corvil's Appliance Performance Dashboard shows how well clients are handling current volumes and helps them plan for volatile data feeds.

Getting Ahead in the Appliance Refresh Game

Workflows and use cases drive the appliance specification, and the variables are different for every client. Consuming different kinds of data, for example, will push you towards certain appliances, while aggregation design decisions are all about the math.

It is very easy in the aggregation game to get yourself into trouble. If you have 1% average utilization across eleven 10Gb monitoring points on your network – which is not uncommon – the aggregate is already over 1Gbps, and open and close bursts of ten times the average will saturate a 10Gb monitoring port. Without a 40Gb connection you will have to rely on port density to handle the traffic and give you the headroom you need to avoid congestion. If you are aggregating multiple 10Gb links, the better design decision is to feed a 40Gb interface at your monitoring point.
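
The arithmetic behind that warning is simple enough to sketch. The link count and utilization mirror the example above; the 10x burst multiplier is an assumption about open/close behavior, not a measured figure.

```python
# Aggregate utilization across monitored links vs. monitoring-port capacity.
num_links = 11
link_speed_gbps = 10.0
avg_utilization = 0.01   # 1% average, as in the example above
burst_multiplier = 10.0  # assumed peak-to-average ratio at open/close

avg_aggregate = num_links * link_speed_gbps * avg_utilization
peak_aggregate = avg_aggregate * burst_multiplier

for port_gbps in (10.0, 40.0):
    status = "saturated" if peak_aggregate > port_gbps else "ok"
    print(f"{port_gbps:.0f}Gb port: avg {avg_aggregate:.1f} Gbps, "
          f"peak {peak_aggregate:.1f} Gbps -> {status}")
# The 1.1 Gbps average looks harmless, but an 11 Gbps burst saturates a
# 10Gb monitoring port, while a 40Gb port retains comfortable headroom.
```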

Throughput and physical storage limitations should also be taken into account. As a result, some clients may need to supplement their existing analytics infrastructure with a similar appliance; for some we might recommend an upgrade to a somewhat larger appliance; and others may be best served by our new flagship. The Corvil 9000 is built for high-performance trading analytics (3 million messages per second for the FIX protocol) and 40Gbps sustained capture without compression.
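
A quick headroom check against an appliance's rated message throughput might look like the sketch below. The 3 million msg/s rating and 56% growth projection come from this article; the current peak message rate is a hypothetical input.

```python
# Will today's FIX message rate still fit the appliance after growth?
rated_msgs_per_sec = 3_000_000  # Corvil 9000 rated FIX throughput
current_peak_msgs = 1_600_000   # hypothetical current peak message rate
growth = 0.56                   # predicted two-year volume growth

projected_peak = current_peak_msgs * (1 + growth)
headroom = rated_msgs_per_sec - projected_peak
verdict = "within" if headroom > 0 else "over"
print(f"Projected peak: {projected_peak:,.0f} msg/s "
      f"({verdict} rated capacity, headroom {headroom:,.0f} msg/s)")
```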

Regardless of the recommended architecture, our goal remains the same: to empower customers to reliably accommodate periods of high volatility, such as market open and close, without losing the visibility or analytics needed to assure market data delivery, optimize order execution and meet compliance demands.

Schedule a meeting to arrange an assessment of your existing environment.
Learn more about the Corvil Appliance Performance Dashboard.

¹ OPRA total messages per day used as a proxy for market data volumes.
Source: OPRA Bandwidth Notices for Feb 2016, 2017 and 2019


Scott Peerbolte, Sales Engineering Director, North America, Corvil
Corvil is the leader in performance monitoring and analytics for electronic financial markets. The world’s financial markets companies turn to Corvil analytics for the unique visibility and intelligence we provide to assure the speed, transparency, and compliance of their businesses globally. Corvil watches over and assures the outcome of electronic transactions with a value in excess of $1 trillion, every day.
@corvilinc
