Defrag This | Read. Reflect. Reboot.

Can AI, Analytics And Cognitive Computing Enhance Cybersecurity Defenses?

Michael O'Dwyer | June 22, 2017



We need a new cybersecurity solution that can block threats, eliminate human error and adequately protect valuable data. Cognitive computing may well be the answer.

Yippee! Yet another alarmist cybersecurity story. Let’s dive right in with some statistics that demonstrate cybercrime is a problem worth discussing. PwC’s Global Economic Crime Survey 2016 (it’s a free download, not one that costs thousands of dollars) indicates that cybercrime is the second most reported economic crime, affecting 32% of organizations. What is telling is that most companies neither prepare for, nor understand, the risks of cybercrime. Only 37% of organizations have a cyber incident response plan. In addition, senior management is dropping the ball on preparation: according to the survey, fewer than half of senior managers requested information on their company’s state of cyber-readiness.

With thousands of new malware and phishing variants appearing each day (no stats needed, as we all know this), how is it that companies still bury their heads in the sand when it comes to cybersecurity? What we need is a new solution that can block threats, eliminate human error and adequately protect valuable data. Cognitive computing may well be the answer.

What Is Cognitive Computing?

Fans of the gameshow Jeopardy! understand the game format. For example: “Supplier of the world’s most effective secure FTP solutions” returns the answer “What is Ipswitch”, obviously.

See what I did there? You hardly even noticed my subliminal reference, right? Okay, moving on. Cognitive computing is a little more difficult to describe. Artificial intelligence is where a computing system learns from its interactions without additional programming, but cognitive computing gets more complex.

“There is an ongoing debate as to what qualifies as AI and what is more likely a reasonably sophisticated conditional algorithm. Cognitive computing is a relatively new evolution in the space with underpinnings at the nexus of massively parallel computing power availability, neural networking and dark data. In my opinion, an AI exists if the system learns in real-time and thereby becomes better at its given task without requiring additional programming. Cognitive computing is more specialized. It specifically deals with unstructured data like English prose, and while it can dip into and leverage structured data, it excels in the former. This is not to suggest that cognitive computing can hold a conversation in a given language, but rather that it can identify patterns within unstructured dark data that were formerly impenetrable,” said Damion Hankejh, chief strategy officer, director and innovating founder at BOHH Labs, a Bay Area-based fintech security company.

In fact, in 2011, the ability of cognitive computing was effectively demonstrated by IBM’s Watson, described as the world’s first cognitive system. It defeated Ken Jennings and Brad Rutter at Jeopardy!, proving it capable of working with unstructured data (prose or text) in a meaningful manner and then using that data to beat the best human players of the game. Technophobes can relax: it did not become ‘self-aware’ or threaten humanity at the time, and it hasn’t since.


A True Innovation in Cybersecurity

What IBM’s Watson did indicate was the potential for improvement without additional programming. Perhaps this could be utilized in cybersecurity? Sure, but not without analytics.

“The advent of predictive analytics with underpinnings in AI and cognitive computing is one of the very few genuinely new developments the security industry has seen in decades. It levies a powerful new tool built at the nexus of big data, massively parallel computing and cognitive systems that can analyze breach attempt and success data in near real-time to assist security specialists in identifying credible threats. Cognitive cybersecurity systems will grow increasingly capable of autonomous operation, further improving their predictive defenses,” said Hankejh.

Supercomputing does not necessarily need vast software and hardware resources.

“A colleague recently asserted that ‘IBM Watson is not a supercomputer.’ I disagreed, noting that while Watson is software and can be run on a laptop, it cannot achieve general usefulness without a massively parallel computing infrastructure [to connect with]. Watson is indeed a supercomputer,” said Hankejh.

What has really changed in cybersecurity in decades? Not much.

“By far the most interesting development in cybersecurity is the advent of cognitive computing predictive analytics – the application of cognitive computing to numerous terabytes of breach data. In fact, aside from Watson’s push into this space, there is nothing otherwise notable in the cybersecurity space since the advent of firewalls in 1980 and network encryption in Netscape’s 1994 Navigator,” insisted Hankejh.

He could well be right, as the average home user still relies on antivirus, anti-malware and anti-spyware solutions that have hardly changed, apart from signature-database expansion and faster updates as new patches are released.

Read: Which Cybersecurity Approach Is Best For Your Business?

Infallibility Still A Dream

Assuming companies can afford the latest cybersecurity solutions, it is still impossible to be fully protected or to eliminate all false positive alerts.

“Predictive analytics systems, including cognitive computing approaches, are a powerful tool for network analysts. The sheer number of breach attempts on a given company is unmanageable by humans alone. There are still false positives, though these systems are nonetheless reducing 100K+ breach attempts to a few thousand worthy of human review,” said Hankejh.
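The triage idea Hankejh describes can be illustrated with a toy sketch: score each event by how unusual it is relative to the baseline of observed traffic, then surface only the highest-scoring handful for human review. The event fields, the rarity-based score and the review budget below are all assumptions made for illustration; a real predictive system would use trained models over far richer telemetry.

```python
import math
from collections import Counter

def triage(events, review_budget=3):
    """Rank events by the rarity of their (source, action) pair and
    return only the top few for human review."""
    counts = Counter((e["source"], e["action"]) for e in events)
    total = len(events)
    # Rarer combinations get higher anomaly scores (surprisal, in bits).
    scored = [
        (-math.log2(counts[(e["source"], e["action"])] / total), e)
        for e in events
    ]
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [e for _, e in scored[:review_budget]]

# Mostly routine traffic plus a few oddities.
events = (
    [{"source": "10.0.0.5", "action": "login_ok"}] * 95
    + [{"source": "203.0.113.9", "action": "login_fail"}] * 4
    + [{"source": "198.51.100.7", "action": "admin_escalation"}]
)
queue = triage(events, review_budget=2)
# The lone admin_escalation event scores highest and tops the review queue.
```

Scaled up, the same principle is what turns 100K+ raw attempts into a review queue an analyst can actually work through.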

What about catching human errors, such as those associated with phishing that allow hackers into a system or network?

“Without question, cognitive computing predictive analytics can reduce human errors,” said Hankejh, citing WannaCry as an exception that requires a new path forward in cybersecurity.

“It could have been prevented, for instance, by patches from Microsoft for known vulnerabilities in unsupported operating systems. Alas, that is a business case decision that went unexecuted,” he added.

In his opinion, a better path for future cybersecurity improvement is a solution that is business-case-, software- and infrastructure-agnostic, one that secures against external and internal threats alike.

In conclusion, it appears that predictive analytics in a cognitive system is indeed a worthy innovation in a cybersecurity industry that struggles to deal with human error and constantly changing malware variants. At a company level, leveraging high-performance computing for cyber defenses is an expensive proposition, and it still comes nowhere near a guaranteed defense against breaches.

“The ideal cybersecurity solution doesn’t require HPC, routine patching or dedicated support. It would likely be an evolution in the appliance or SaaS space to immunize networks from brute force attacks, man-in-the-middle hacks and notably quantum decryption, which as a breach tool will arrive sooner than anyone cares to think about,” said Hankejh.

So, there you have it. As cybersecurity improves, savvy hackers are utilizing the same innovations for their own benefit. In the meantime, as high-profile breaches continue, all we can do is try to stem the tide, and cognitive computing may well provide the means to at least reduce the risk of a successful breach.

Topics: security


THIS POST WAS WRITTEN BY Michael O'Dwyer

An Irishman based in Hong Kong, Michael O’Dwyer is a business & technology journalist, independent consultant and writer who specializes in writing for enterprise, small business and IT audiences. With 20+ years of experience in everything from IT and electronic component-level failure analysis to process improvement and supply chains (and an in-depth knowledge of Klingon), Michael is a sought-after writer whose quality sources, deep research and quirky sense of humor ensure he’s welcome in high-profile publications such as The Street and Fortune 100 IT portals.
