Corvil Connectors make it easy to stream your Corvil data directly into Compliance, Big Data Analytics and Operations platforms. No code – just install, configure and stream.
Corvil Connectors tear down data silos that hinder collaboration and decision-making. Now everyone can leverage the power of Corvil’s real-time data streams within their preferred decision-making or analytics tools.
Corvil Connectors are the glue that joins the Corvil Streaming API to receiving systems. A growing library of connectors, supporting the major Big Data, Analytics, and Operations platforms, is provided at no cost to existing customers. Outside of the library, creating a connector for an unsupported system is straightforward, with sample code provided as a starting point.
Configure the Connector to write to your receiving system, and you’re up and running in minutes. No need to get familiar with the Corvil Streaming API. Simple.
Corvil-supported connectors scale to hundreds of thousands of events per second, depending on the receiving system. If a client goes down for any reason, the connector picks up where it left off, retrieving the missing data and ensuring a complete data set.
Put your Corvil data powerfully to work. Whether you need a complete record for compliance, or want detailed operations information combined with other data sources, Corvil Connectors deliver reliable data, in real time, at the speed you need.
Combine Corvil real-time data streams with other internal data sets to drive unique business insights.
Use Corvil Streams to send just the data you need. Avoid incurring excessive data costs in the receiving system. For example, stream details of slow transactions, content of every failed login, or connections from domains flagged by Threat Intelligence.
Comprehensive visibility across network and application tiers with zero integration requirements.
“We use Splunk all the time and it was critical for us to have the Corvil data represented within Splunk, as well as use the Corvil UI when needed.”
VoIP Operations Manager, Tier-1 Global Bank
A growing library of connectors is provided at no cost to existing customers, allowing an authoritative record to stream directly from network data into your chosen big data solution.
Cloudera is the leader in enterprise big data management, powered by Apache Hadoop™. Corvil’s partnership with Cloudera allows joint customers to stream Corvil data continuously and at scale into Cloudera Enterprise, where it can be processed, analyzed, modeled and served using a variety of engines that are part of the Cloudera data management platform.
The collaboration was motivated by requests from companies in the financial services sector that are seeking to address the escalating burden of governance, risk, and compliance in an increasingly regulated and scrutinized financial industry.
Corvil leverages its comprehensive real-time visibility into network data to stream high-value, low-volume security events to HP ArcSight, including threat-intelligence-based indicator detections, user activity tracking, and entity-based anomaly detection. Every Corvil-generated event sent to HP ArcSight includes a "click-back" option that enables rapid investigation and response, giving Security Analysts immediate visibility into the bigger picture of network activity associated with the impacted hosts and users.
Carbon Black’s Cb Response product provides endpoint/host-based incident response and threat hunting: it continuously records and captures all threat activity, letting analysts hunt threats in real time, visualize the complete attack kill chain, and then respond to and remediate attacks quickly.
Corvil Security Analytics provides a similar capability but from the network control point.
The Corvil Carbon Black connector integrates Carbon Black’s Cb Response with Corvil Security Analytics enabling a number of workflows involving automated data exchange, orchestration and seamless pivot across endpoint/host and network control points.
Highlighted use-cases include: End-to-End Threat Visibility, Live and Retrospective Investigation, Threat Hunting, Response and Ecosystem Enablement. For more details, see our Endpoint Security solution.
Integrate Corvil Streams with the Apache Kafka messaging backbone to provide scalable pub/sub access to event data and analytics, sourced live from your network. Originally developed at LinkedIn, Kafka is designed for fast and fault-tolerant delivery to multiple receivers.
Use Corvil’s Kafka Producer to republish Corvil Stream data as Kafka messages with CSV payload, easily consumable by any Kafka client.
Many products have support for ingesting data from Kafka, including Hadoop HDFS, Presto, Storm and MongoDB.
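Because the payload is plain CSV, consuming it takes only a few lines in any language. The sketch below shows how a Kafka client might parse one message payload into event records; the field names are illustrative assumptions, not the connector's actual schema (which comes from the stream's configuration).

```python
import csv
import io

# Hypothetical field layout for a Corvil Stream published as CSV.
# Real streams define their own fields; these names are for illustration only.
HEADER = ["timestamp_ns", "client", "server", "latency_us"]

def parse_payload(payload: str):
    """Parse one CSV Kafka message payload into a list of event dicts."""
    reader = csv.DictReader(io.StringIO(payload), fieldnames=HEADER)
    return [dict(row) for row in reader]

# A single-event payload, as a consumer might receive it:
payload = "1514764800000000000,10.0.0.5,10.0.0.9,42\n"
events = parse_payload(payload)
```

From here the events can be handed to whatever downstream system the Kafka consumer feeds.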
Apache Flume is a common mechanism for loading data into HDFS and other systems. Flume is extensible, defining a range of Sources that read data and Sinks that write data to various destinations, including HDFS. It is robust and fault tolerant, with tunable reliability and many failover and recovery mechanisms, and it uses a simple, extensible data model that suits online analytic applications.
Corvil has defined a Flume Source that allows a Corvil Stream to be connected into the Flume agent. The agent is then configured to send the data on to the appropriate Sink. In addition to HDFS, this also allows data to be put into HBase, MongoDB and Cassandra - amongst others.
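As a sketch of the wiring described above, a Flume agent configuration might look like the following. The source type/class name and its properties are hypothetical placeholders, not the shipped connector's actual settings; only the overall Source → Channel → Sink shape follows standard Flume conventions.

```properties
# Hypothetical Flume agent: Corvil Source -> memory channel -> HDFS Sink.
agent.sources = corvil
agent.channels = mem
agent.sinks = hdfs

# The fully qualified source class below is illustrative only.
agent.sources.corvil.type = com.corvil.flume.CorvilSource
agent.sources.corvil.channels = mem

agent.channels.mem.type = memory
agent.channels.mem.capacity = 10000

agent.sinks.hdfs.type = hdfs
agent.sinks.hdfs.channel = mem
agent.sinks.hdfs.hdfs.path = /corvil/streams/%Y-%m-%d
```

Swapping the Sink definition is all it takes to direct the same stream at HBase, MongoDB, or Cassandra instead.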
Amazon Kinesis is a platform for streaming data on AWS. Kinesis is architected for high-throughput and is well integrated with the AWS product suite.
The Corvil Kinesis connector republishes Corvil Streams data as a JSON payload, widely supported by Kinesis receivers.
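Kinesis delivers each record's data as a base64-encoded blob, so a receiver decodes the blob and then parses the JSON. A minimal sketch, with illustrative field names that are assumptions rather than the connector's actual output:

```python
import base64
import json

def decode_record(data_b64: bytes) -> dict:
    """Decode one Kinesis record's base64 data blob into a Corvil event dict."""
    return json.loads(base64.b64decode(data_b64))

# Simulate a record blob as the Kinesis API would return it.
# The "stream" and "latency_us" fields here are hypothetical.
blob = base64.b64encode(
    json.dumps({"stream": "order-latency", "latency_us": 42}).encode()
)
event = decode_record(blob)
```

The same decode step works whether records are fetched directly or fanned out to Lambda or Kinesis Data Firehose consumers.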
Many Corvil customers are using Splunk as a key part of their IT Operations, bringing together log data, alerts, and other information. As a result, getting Corvil data into Splunk is an essential part of their workflows. The Corvil Splunk connector, available from Splunkbase, will have you up and running in no time. You can ingest complete streams, or limit ingestion to specific types of events to reduce your data volume. Extra features include automatically created callbacks that can take you directly from Splunk to Corvil searches, granular data, and dashboards.
The Corvil Splunk connector comes in the form of a simple Splunk app that uses the Splunk Modular Input mechanism. With the app installed, the list of available inputs will now include a Corvil input. You can create multiple Corvil inputs to pull in data from as many analytics streams as required across your Corvil deployment.
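Splunk modular inputs are configured as `[scheme://name]` stanzas in `inputs.conf`. The stanza below is a hypothetical sketch only; the scheme name and parameter names are assumptions, not the app's actual settings.

```ini
# Hypothetical inputs.conf stanza for a Corvil modular input.
# Parameter names are illustrative, not the app's real configuration keys.
[corvil://order-latency]
appliance = cne01.example.com
stream = order-latency
sourcetype = corvil:event
disabled = 0
```

Creating further stanzas of the same shape is how multiple streams across a deployment would be pulled in side by side.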
Elastic provides an open-source search and analytics suite for IT Operations Analytics and other use cases. The stack includes Elasticsearch, Logstash, and Kibana, for data storage, log analysis, and dashboarding, respectively.
Corvil data can be ingested into Elasticsearch using either the Kafka or Flume connectors. We recommend the Kafka connector, as current Elastic releases do not appear to offer general Flume support.
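One way to route the Kafka connector's output into Elasticsearch is a Logstash pipeline with a Kafka input and an Elasticsearch output. The topic name, host names, and index pattern below are assumptions for illustration:

```conf
input {
  kafka {
    bootstrap_servers => "kafka01:9092"
    topics => ["corvil-events"]    # hypothetical topic name
  }
}
filter {
  csv { }    # the Corvil Kafka Producer publishes CSV payloads
}
output {
  elasticsearch {
    hosts => ["http://elastic01:9200"]
    index => "corvil-%{+YYYY.MM.dd}"
  }
}
```

With the data indexed, Kibana dashboards can be built directly on the Corvil fields.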
The Java Database Connectivity (JDBC) API provides database-independent connectivity between the Java programming language and a wide range of databases. The Corvil JDBC connector injects Corvil data streams into relational databases such as Oracle Database, Microsoft SQL Server, and Oracle MySQL, enabling database analysts to apply SQL queries to Corvil data streams.
The JDBC Connector is ideal for sending Corvil data directly to any JDBC-compliant relational database system.
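To illustrate the workflow this enables — not the connector's own implementation, which uses JDBC from Java — the sketch below uses Python's built-in sqlite3 as a stand-in relational database. The table layout and column names are assumptions: one table per stream, one row per event.

```python
import sqlite3

# Stand-in for any SQL database the JDBC connector might feed.
# Column names are illustrative, not the connector's actual schema.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE corvil_events ("
    " ts_ns INTEGER, client TEXT, server TEXT, latency_us INTEGER)"
)
rows = [
    (1514764800000000000, "10.0.0.5", "10.0.0.9", 42),
    (1514764800000001000, "10.0.0.6", "10.0.0.9", 7500),
]
conn.executemany("INSERT INTO corvil_events VALUES (?, ?, ?, ?)", rows)

# A database analyst can now apply plain SQL to the stream,
# e.g. find transactions slower than 1 ms:
slow = conn.execute(
    "SELECT client, latency_us FROM corvil_events WHERE latency_us > 1000"
).fetchall()
```

The same query pattern carries over unchanged to Oracle Database, SQL Server, or MySQL once the connector has populated the table.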
Corvil’s real-time data, with highly accurate timestamps, is streamed into kdb+ to enable time-series analysis for compliance reporting, data mining, correlation with other data sources, and reconciliation analysis for electronic brokerages.
kdb+, from Kx Systems, is a high-performance time-series database widely used across the financial sector. Many Corvil customers feed Corvil data into kdb+ to support multiple use cases: a complete record of transactions for compliance reporting, data mining and correlation with other data sources, and reconciliation analysis for electronic brokerages.
kdb+ is highly customizable; the Corvil connector is designed to work with a kdb+tick deployment. It connects to the Corvil analytics stream, downloads the Google Protobuf schema, and converts it into a kdb+ table definition. Once the database table is consistent with the analytics stream, the connector streams the events into the kdb+tick real-time database.
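The schema-conversion step might look roughly like the sketch below, which turns a list of protobuf fields into an empty kdb+ table definition in q syntax. Both the field list and the protobuf-to-q type mapping are assumptions for illustration, not the connector's actual mapping.

```python
# Assumed mapping from protobuf scalar types to kdb+ column types.
PB_TO_Q = {"uint64": "long", "string": "symbol", "double": "float"}

def q_table(name: str, fields: list) -> str:
    """Render an empty kdb+ table definition for the given (name, pb_type) fields."""
    cols = ";".join(f"{fname}:`{PB_TO_Q[ftype]}$()" for fname, ftype in fields)
    return f"{name}:([]{cols})"

# Hypothetical stream schema: a timestamp, a client id, and a latency value.
defn = q_table("corvil", [("ts", "uint64"), ("client", "string"), ("latency", "double")])
```

Once a definition like this matches the stream, appending events becomes a plain tick-style insert into the real-time database.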
MongoDB is a NoSQL database that stores data using a flexible document data model similar to JSON. MongoDB allows data models to change as needs and schemas evolve, making it a popular choice with development teams; once a model is fixed, validation rules can be enforced. MongoDB has a scalable architecture, with sharding and replication for scalability and high availability, and it makes extensive use of RAM, providing in-memory speed with on-disk capacity.
We recommend the Corvil Kafka connector for ingesting data into MongoDB.
The Corvil Syslog connector republishes Corvil Streams data as syslog events. Particularly useful for low-volume streams containing detailed alert content, the Syslog connector produces output that can be consumed by a huge range of systems.
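For a feel of what a single alert looks like on the wire, here is a sketch that renders an event as an RFC 3164-style syslog line. The facility, severity, and message fields are illustrative choices, not the connector's actual output format.

```python
import time

FACILITY_LOCAL0 = 16   # assumed facility for Corvil events
SEV_WARNING = 4

def to_syslog(host: str, app: str, msg: str, ts: float) -> str:
    """Render one event as an RFC 3164-style syslog line."""
    pri = FACILITY_LOCAL0 * 8 + SEV_WARNING   # PRI = facility * 8 + severity
    stamp = time.strftime("%b %d %H:%M:%S", time.gmtime(ts))
    return f"<{pri}>{stamp} {host} {app}: {msg}"

# Hypothetical failed-login alert, timestamped at the Unix epoch for determinism.
line = to_syslog("cne01", "corvil", "failed login user=alice src=10.0.0.5", 0.0)
```

Any syslog daemon or SIEM that accepts this classic format can ingest such lines directly.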
The Corvil Console connector is ideally suited for rapid prototyping of Corvil data integrations. The Console connector writes Corvil streams to a local file or to standard output, so that you can concentrate on valuable analysis rather than data plumbing.