How Deep Neural Networks Built on AWS Can Help Predict and Prevent Security Threats
This article was originally published on the AWS Partner Network (APN) Blog.
At Deep Instinct, we apply end-to-end deep learning to cybersecurity.
Deep learning is inspired by the human brain. Once a brain learns to identify an object, its identification becomes second nature. Similarly, as Deep Instinct’s artificial neural network learns to detect more and more types of cyber threats, its prediction capabilities become instinctive. As a result, malware both known and new can be predicted and prevented in zero-time.
Deep Instinct is an AWS Partner Network (APN) Select Technology Partner whose predictive threat prevention platform can be applied against known and unknown threats, whether file-based or fileless. It works entirely on raw data to classify files as malicious or benign, and can be applied to any device and all major operating systems.
As the critical infrastructure behind Deep Instinct’s deep learning network architecture, Amazon Web Services (AWS) supports the training of the deep learning neural network and drives real-time threat intelligence, file reputation, and event analysis.
This post addresses how Deep Instinct uses AWS services to process millions of queries per day and to support the huge processing power involved. Dealing with such a large number of data samples in each training cycle can be challenging, but AWS allows us to operate an architecture that supports millions of endpoints with strict KPIs and SLAs for performance and availability.
Creating Datasets for Training the Deep Neural Networks
The accuracy of Deep Instinct’s inference model depends on training with as many data samples as possible to ensure reliability and resilience. The reliability of the inference models improves incrementally with each training iteration.
The dataset is created from multiple sources and stored in Amazon Simple Storage Service (Amazon S3). From each of these files, Deep Instinct uses the raw data to support the training cycles so the neural network can perform non-linear correlations.
This creates a significant computing challenge, though, as the pre-training stages demand computing power and memory beyond what generally available CPUs can provide. The entire training phase is therefore conducted in-house by Deep Instinct using GPUs.
For this purpose, Amazon Elastic Compute Cloud (Amazon EC2) is utilized to create Spark clusters. Spark is a distributed computing framework for big data processing and analysis.
Amazon EC2, meanwhile, offers the ideal infrastructure to host these clusters, serving as an expandable, low-configuration alternative to running cluster computing in-house. The EC2 machines run as Amazon EC2 Spot Instances, which lets Deep Instinct keep costs low while maintaining business continuity.
Using Spark allows for linear scalability, which is conducive to growing the number of training samples. Even if the datasets grow by orders of magnitude, the training process is minimally affected and keeps pace without changes to the underlying code to accommodate the increased load.
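As the post notes, the network trains on raw data rather than hand-engineered features. As a rough illustration of what that can mean in practice (the function names and vector size here are hypothetical, not taken from Deep Instinct's pipeline), raw file bytes might be normalized into fixed-size vectors before training:

```python
# Hypothetical sketch: turning raw file bytes into a fixed-size numeric
# vector for neural network training, with no hand-engineered features.
# The function name and the 1024-byte window are illustrative assumptions.

def raw_bytes_to_vector(data: bytes, size: int = 1024) -> list:
    """Truncate or zero-pad raw bytes to a fixed length, scaled to [0, 1]."""
    window = data[:size]
    padded = window + b"\x00" * (size - len(window))
    return [b / 255.0 for b in padded]

# Each training sample keeps its raw-byte representation; the label comes
# from the dataset's known verdict (malicious or benign).
sample = raw_bytes_to_vector(b"MZ\x90\x00")  # e.g. the start of a PE header
```

In a Spark-based pipeline, a transformation like this would be mapped across the files stored in Amazon S3, which is what makes the preprocessing scale linearly with the number of samples.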
Supporting Deep Instinct’s Threat Intelligence
According to the Ponemon Institute, enterprises waste an average of 425 hours a week responding to and investigating false positives, costing them an average of $1.37 million per year.
Furthermore, a study by Vanson Bourne, which polled 600 IT decision-makers (300 from IT operations and 300 from IT security), indicates that 55 percent of CISOs consider their solution’s verdicts to be unreliable or wrong. These poor results attest to an industry-wide need for solutions that combine high detection rates with the lowest possible false positive rates.
False positives are widely considered an unavoidable challenge in cybersecurity, so effectively reducing them frees security staff to focus on more productive work.
To meet this need for customers, Deep Instinct prioritized the investment in our file reputation services. A separate repository was required for storing events and file scanning data from production environments to further reduce the rate of false positives.
This second database was hosted in just one location so that data would be searchable and easily correlated with existing data from our AWS-based D-Cloud reputation layer. This cloud-based environment stores file records, threat intelligence, prevalence, event information, and rich metadata on billions of files from dozens of different sources.
That said, the scalability of Deep Instinct’s solution was also important, so it could handle the constant stream of inquiries arriving every day; a normal week can see thousands of inquiries from Deep Instinct’s agents around the globe. A centralized server was necessary, and responses have to be fast.
Billions of files span the full spectrum of file types and platforms, including PE, PDF, OLE, APK, fonts, RTF, TIFF, SWF, and Mach-O. Across this enormous variety, all previously known malware has been labeled with its respective verdict.
D-Cloud provides a comprehensive file reputation service that further reduces false positives and improves Deep Instinct’s threat intelligence infrastructure. This infrastructure supports endpoints around the globe, so no matter where an agent is located, it can query D-Cloud and receive a verdict in less than 200 milliseconds.
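A latency budget that strict usually implies a fallback path on the agent side. As a hypothetical sketch (the function names and fallback behavior here are assumptions, not Deep Instinct's documented design), an agent might race the cloud lookup against a deadline and fall back to its local inference model if the verdict does not arrive in time:

```python
# Hypothetical sketch of an agent-side reputation lookup with a strict
# latency budget: query the cloud reputation service, and if no verdict
# arrives within ~200 ms, fall back to the local inference model.
import concurrent.futures
import time

def lookup_with_deadline(query_cloud, local_verdict, file_hash, deadline_s=0.2):
    """Return the cloud verdict if it beats the deadline, else the local one."""
    with concurrent.futures.ThreadPoolExecutor(max_workers=1) as pool:
        future = pool.submit(query_cloud, file_hash)
        try:
            return future.result(timeout=deadline_s)
        except concurrent.futures.TimeoutError:
            return local_verdict(file_hash)
```

The key design point is that the endpoint never blocks on the network: the local deep learning model can always render a verdict on its own, and the cloud lookup only refines it when it responds in time.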
Stages of Verifying Data Inquiries
Deep Instinct’s threat intelligence database on AWS receives millions of new samples each week from a global pool of sources, making it a trusted source of information that can be used to verify all types of data inquiries.
The serverless AWS Lambda function is a small piece of software that contains all of the logic, including the protocols and methods to fetch data from the database. It runs at the edge through Amazon CloudFront, a fast, highly secure, and programmable content delivery network (CDN).
CloudFront can update or upgrade the D-Client (end device) in the fastest way available: AWS operates CloudFront edge locations around the world, so each user is served from a location geographically close to them.
Lambda@Edge acts as the intermediary between the two components of the software. Before any request is made, the protocol between the agent and Amazon DynamoDB must be defined: how the table is queried and how the response is relayed back.
The Lambda function is created in the cloud, and the agent team is provided with all the details on how to query data on files.
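To make the protocol concrete, here is a minimal sketch of what such a query contract could look like. The table name, key, and attribute names below are illustrative assumptions, not Deep Instinct's actual schema; in a real Lambda function the request would be sent with a DynamoDB client such as boto3's, which is omitted here so the shapes themselves are the focus:

```python
# Hypothetical sketch of the agent <-> DynamoDB query protocol.
# "file-reputation", "sha256", "verdict", and "prevalence" are assumed
# names for illustration only.
from typing import Optional

def build_get_item_request(file_hash: str) -> dict:
    """A DynamoDB GetItem request keyed on the file's SHA-256 hash."""
    return {
        "TableName": "file-reputation",               # assumed table name
        "Key": {"sha256": {"S": file_hash.lower()}},  # normalized hash key
        "ProjectionExpression": "verdict, prevalence",
    }

def parse_verdict(item: Optional[dict]) -> str:
    """Relay a verdict back to the agent; unseen files map to 'unknown'."""
    if not item:
        return "unknown"
    return item.get("verdict", {}).get("S", "unknown")
```

Defining both halves up front, the request the agent sends and the response field it reads, is exactly the agreement the post describes between the agent team and the cloud team.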
Customer Use Case: Designing for Millions of Devices Worldwide
This tailor-made security solution involved a complex design for millions of devices worldwide. With our D-Cloud infrastructure in place, Deep Instinct was uniquely positioned to offer a solution with strict update-time, availability, and scalability requirements to meet HP’s production needs.
As a commercially available product, HP Sure Sense is a self-managed solution. It has no management console because the deployment environment doesn’t have any security officers who would be following events and analyzing the type of events that are occurring.
For HP, this necessitated a high level of confidence in the inference model’s decisions, because there is no remediation workflow behind them. The solution’s strong detection rate and low false positive rate became crucial to the security product’s success.
In delivering the project for HP, a key objective for our team at Deep Instinct was to provide updated services and threat intelligence that would scale to meet the customer’s anticipated increased volume, while also addressing operational and product requirements.
Such requirements included providing a CDN, serverless infrastructure, load balancing, and domain name services (DNS) at the most cost-effective rate. Deep Instinct worked closely with AWS solution architects to define, configure, and design the infrastructure.
Here’s how AWS services were incorporated into HP Sure Sense:
- Amazon CloudFront is a fast CDN service that securely delivers data, videos, applications, and APIs to customers globally with low latency and high transfer speeds, all within a developer-friendly environment.
- Amazon S3 is the object storage service containing the raw metadata to support the training cycles.
- Amazon API Gateway provides scalable integration with multiple sources.
- Amazon Route 53 provides a DNS service to support communication.
- AWS Lambda fetches the details and passes them on to the agent via Amazon API Gateway.
- Amazon DynamoDB is a NoSQL database that runs completely on serverless infrastructure.
- Amazon EC2 hosts all of the computations.
This infrastructure was designed with the ability to perform upgrades and updates for the deep learning model infrastructure in a way that would enable significant scaling.
The infrastructure was tested with a high-volume agent simulation, and once it proved successful was ready for implementation with HP Sure Sense.
With AWS, Deep Instinct’s product architecture was easy to implement, with a payment model that scales well for growth. AWS is also supported by a large community of users and a wealth of informational resources, including user manuals and tutorials. This all contributed to the successful completion of the project within its constraints.
To learn more about Deep Instinct’s product architecture, download our technical brief.
A critical part of our production architecture, AWS supports both the development and training of the core deep learning neural network (D-Brain) and Deep Instinct’s research and threat intelligence services.
The inference capability of the deep learning neural network depends on training with as many data samples as possible to ensure reliability. In cybersecurity, quality is key: the more iterations used to validate the models, the better the deep learning solution performs.