Why Cognitive Computing Is about to Change the Way You Think about Data Analytics

You don’t need to look much further than the estimated $3 billion in venture capital invested in new analytics-related ventures from 2009 to 2013 to get an idea of the excitement surrounding big data and analytics.

What’s more, this sum doesn’t even count the mountains of money being poured into R&D and acquisitions by the larger players.

Amidst all of the money and buzz flowing into developing software to allow organizations to make better use of their data, an exciting new trend is beginning to emerge: a shift from traditional programmatic computing to cognitive computing.

The Advent of Cognitive Computing

Cognitive computing, at a high level, revolves around the idea of training a computer to think like a human brain. IBM in particular has championed cognitive computing as it seeks to make waves with Watson. (You may recall that the company very publicly invested $1 billion in creating the division.)

I recently had a discussion with Steve Gold, VP of marketing and sales for IBM’s Watson division. We discussed the implications of cognitive computing as we transition away from our traditional interactions with computers, and what it means for the future of analytics.

Big data and analytics are inherently complex and difficult for people to wrap their heads around. Creating predictive models, building algorithms, and even just manipulating data has traditionally been the domain of a highly trained and often scarce subset of people in organizations. This restriction changes, however, when we imagine an interface through which we engage with data more organically, either by typing or speaking a question in natural language.

As an executive of a company, I may not be able to write code, but I can certainly say something to the effect of: “What would be the most profitable location for a new storefront?”

The intelligent data systems we interact with will be able to process these sorts of requests and very quickly give us our desired answers. In my discussion with Steve Gold, he was quick to point out both Watson’s ability to ask qualifying questions and the importance of its doing so.

The analytics interfaces of the future will be able to identify key factors and prompt us to answer questions such as “What locations are we considering?” or “How much are we willing to spend on a store?” If these intelligent interactions still seem a bit far-fetched to you, think of Watson’s stellar performance on Jeopardy! a few years back, or watch this video demoing a new age of smart shopping assistants.
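The back-and-forth described above, where the system notices which parameters a question leaves unspecified and prompts for them, can be sketched as a simple slot-filling loop. The slot names and prompts below are hypothetical illustrations drawn from the store-location example, not Watson’s actual API:

```python
# Minimal slot-filling sketch: detect which parameters a request is
# missing and surface qualifying questions until every slot is filled.
# Slot names and prompts are invented for illustration.

REQUIRED_SLOTS = {
    "candidate_locations": "What locations are we considering?",
    "budget": "How much are we willing to spend on a store?",
}

def qualifying_questions(filled_slots):
    """Return follow-up questions for any slots the user hasn't answered."""
    return [prompt for slot, prompt in REQUIRED_SLOTS.items()
            if slot not in filled_slots]

def answer_when_ready(filled_slots):
    """Run the analysis only once all qualifying questions are resolved."""
    missing = qualifying_questions(filled_slots)
    if missing:
        return {"status": "need_more_info", "questions": missing}
    # With all slots filled, a real system would now run the analysis.
    return {"status": "ready", "slots": filled_slots}
```

For example, asking about a new storefront while supplying only a budget would come back with the single remaining question about candidate locations.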

There is an opportunity here to enhance our ability to effectively analyze data. Whether you are a data scientist, an executive, or a front-line employee, you will have a new medium of interaction with your analysis. In the same way that our mobile devices and Google searches are ‘brain extenders’ that multiply the amount of knowledge accessible to us, cognitive computing could similarly expand the range of data-driven questions we can answer.

Of course, IBM is not the only horse in this race to bring us cognitive computing. Google’s deep learning initiative, dubbed ‘Google Brain’, and Microsoft’s Project Adam are both bringing artificial intelligence to the brink of becoming frighteningly real.

The Intersection of Cognitive Computing, Natural Language Processing and BI

The quantum leap in all of this is the ability to take a question posed in natural language and perform the analysis the same way a human would construct and analyze the data.

There is no shortage of nuance to how I might ask my questions. Certainly, understanding the context in which questions are asked is pivotal to making these solutions useful. To this end, a number of high profile offerings either already available or coming to the market are aimed at bringing our interactions with devices to a new level. These include Microsoft’s Cortana, Apple’s Siri, Google Now, and Intel’s Jarvis. Each represents an exciting advance into taking the spoken word and turning it into actionable commands for machine processing.

Bringing the issue closer to BI specifically, companies such as Looker and DataRPM are pioneering natural language search to answer data-driven questions. Each is going to market with a distinctly new-age way to interact with an organization’s data. Looker sports a browser-based interface, while DataRPM’s interface includes speech recognition among its features.

These companies are betting that business users want a way to ask questions other than building queries and models or shipping requests off to the IT department. The self-service and immediacy of information these tools afford make for a very compelling business case.
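At its simplest, the bet these vendors are making can be illustrated by mapping a constrained natural-language question onto a structured query. The pattern, table name, and SQL shape below are a toy sketch invented for illustration, not how Looker or DataRPM actually work:

```python
import re

# Toy natural-language-to-query sketch: recognize one question pattern
# ("what was/were <measure> by <dimension>") and emit a SQL-like string.
# Real products handle far more variety; everything here is illustrative.

def question_to_query(question):
    m = re.match(r"what (?:was|were) (\w+) by (\w+)", question.lower())
    if not m:
        return None  # pattern not understood; a real system would ask back
    measure, dimension = m.groups()
    return (f"SELECT {dimension}, SUM({measure}) "
            f"FROM sales GROUP BY {dimension}")
```

The interesting engineering is everything this sketch omits: resolving synonyms, inferring which table to hit, and handling the long tail of phrasings.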

As R&D dollars continue to pour into cognitive computing, we will continue to see a shift towards more natural interactions between man and machine. As these technologies come of age there is an opportunity to make natural language the standard medium through which we interact with data.

The implications are important to consider: such a shift would not only increase the efficiency with which we analyze data, but also enormously increase the number and types of people who have access to it.

About the Author: James Haight is a research analyst at Blue Hill Research focusing on analytics and covering a variety of emerging enterprise technologies. James brings a unique perspective to technology research, drawing from his background in economics, executive compensation, and microfinance.


