Eliminating Encryption Blind Spots in Network Security

In the early days of the internet, data flowed freely and in plain sight. Emails could be read by anyone curious enough to eavesdrop on an SMTP server or intercept the messages in transit, and the same was true of web browser traffic. E-commerce was limited because there was no standardized method for encrypting data between the browser and the web server.

This all changed for browser traffic in 1994, when Netscape introduced the Secure Sockets Layer (SSL) protocol. With SSL, data could be encrypted in transit as it flowed across the internet, and an eavesdropper would need a website’s private encryption key to decipher it. Initially, SSL was used only when the sensitivity of the information being transmitted made encryption necessary. Websites would commonly redirect a user from the insecure HTTP version of a page to the secure HTTPS version once sensitive information such as credit card numbers or user credentials was being transmitted. Encrypted traffic made up only a small percentage of all internet traffic.

By 2010, websites had begun a broader shift toward encrypting all site traffic. Processing resources had become abundant enough that administrators no longer had to worry about encryption slowing the user experience. In 2011, Google began encrypting all search traffic for users logged into an account, and by 2013 it had quietly begun encrypting all search traffic. Today, nearly 90% of all internet traffic is encrypted in transit. The move to encryption by default has come largely on the heels of growing privacy concerns online. But with the increase in privacy also came a blind spot for security.

The privacy gained for users through encryption also created privacy for attackers. As WatchGuard Technologies reported in its Q1 2020 Internet Security Report, two-thirds of malware is now delivered over encrypted connections. That encrypted traffic creates an opportunity for attackers to slip malicious code through network defenses undetected. Perimeter security systems must evolve to inspect all incoming and outgoing HTTP and HTTPS traffic wherever possible. Some compliance standards, such as PCI DSS (Payment Card Industry Data Security Standard) and HIPAA (Health Insurance Portability and Accountability Act of 1996), require that certain data remain encrypted from end to end; that traffic cannot be effectively inspected by a firewall.

Decrypting, inspecting, and re-encrypting all of this traffic is a drain on network perimeter devices. NSS Labs reports that SSL-based attacks require 15 times more resources on the server side than on the client side, so securing a network against encrypted attacks can once again have a significant impact on the user experience. Throughput degradation is a common problem when processing and scanning secure traffic: the more security features that are enabled, the lower the throughput rate. In the case of Unified Threat Management (UTM) devices, this means provisioning enough processing power to handle anticipated traffic bursts with all security layers turned on, including HTTPS inspection. As more web traffic becomes encrypted, organizations need a network security solution that keeps them secure while maintaining performance for end users.
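
To make the mechanics concrete, here is a minimal sketch in Python of what an HTTPS inspection ("break and inspect") proxy does: it terminates the client's TLS session with the proxy's own certificate, opens a second TLS session to the real server, and scans the decrypted bytes in between before re-encrypting and forwarding them. This is an illustration only, not how any particular vendor implements it; the certificate paths, upstream address, and byte-string "signatures" are hypothetical placeholders, and a real UTM appliance performs this work with purpose-built scanning engines and an internal CA that managed clients are configured to trust.

import socket
import ssl
import threading

# Hypothetical certificate and key for the inspection proxy, issued by an
# internal CA that client devices have been configured to trust.
PROXY_CERT = "proxy-cert.pem"
PROXY_KEY = "proxy-key.pem"

LISTEN_ADDR = ("0.0.0.0", 8443)       # where clients connect
UPSTREAM_ADDR = ("example.com", 443)  # the origin server being proxied

# Toy payload signatures standing in for a real scanning engine.
SIGNATURES = [b"EICAR", b"<script>evil"]

def looks_malicious(payload: bytes) -> bool:
    """Return True if the decrypted payload matches a known signature."""
    return any(sig in payload for sig in SIGNATURES)

def pump(src, dst, label):
    """Copy data between the two TLS sockets, scanning each decrypted chunk."""
    try:
        while True:
            data = src.recv(4096)
            if not data:
                break
            if looks_malicious(data):
                print(f"[{label}] blocked a chunk matching a signature")
                break
            dst.sendall(data)  # re-encrypted by the outgoing TLS socket
    except OSError:
        pass  # the other direction closed the sockets

def handle(client_sock):
    # 1. Terminate the client's TLS session using the proxy's own certificate.
    server_ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
    server_ctx.load_cert_chain(PROXY_CERT, PROXY_KEY)
    client_tls = server_ctx.wrap_socket(client_sock, server_side=True)

    # 2. Open a fresh TLS session to the real origin server.
    upstream_ctx = ssl.create_default_context()
    upstream_tls = upstream_ctx.wrap_socket(
        socket.create_connection(UPSTREAM_ADDR),
        server_hostname=UPSTREAM_ADDR[0],
    )

    # 3. Relay traffic in both directions, inspecting the plaintext in between.
    threading.Thread(target=pump, args=(client_tls, upstream_tls, "client->server"),
                     daemon=True).start()
    pump(upstream_tls, client_tls, "server->client")
    client_tls.close()
    upstream_tls.close()

def main():
    with socket.create_server(LISTEN_ADDR) as listener:
        while True:
            conn, _ = listener.accept()
            threading.Thread(target=handle, args=(conn,), daemon=True).start()

if __name__ == "__main__":
    main()

Even in this toy form, every client connection costs the device two full TLS sessions plus a scan of every decrypted byte, which is exactly where the throughput penalty described above comes from.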

As threats continue to increase along with the cost of recovering from a breach, organizations can’t afford encryption blind spots in their network security. Gaining better visibility into all network traffic is one of the most important steps you can take today to improve overall security. Contact us to learn how you can best gain this visibility without degrading performance and stay one step ahead of the latest threats.
