- cognitive computing
- advanced user experience
- cloud infrastructure
- digital identity
There are many definitions of cognitive computing; the one I like to use is based on four key needs of future technology: machines need to learn, machines need to think, machines need to talk to other machines, and machines need to interact with people.
There are many parallels between machines and human beings. At birth, we know nothing and immediately start to learn.
Machines are the same. They must be trained with specific types of data sets to begin to learn. Because machines do not have the same type of brains as humans, scientists develop sets of rules, built around very specific taxonomies of words, to train them.
Machines can be taught, for example, that the name Robert may also have name variants of Bob, Rob, or Bobby. They can be taught relationships between things, such as Exxon “is related” to oil, refineries, and gasoline.
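The kind of taxonomy rules described above can be sketched very simply in code. This is a minimal illustration, not any vendor's actual system; the dictionaries and function names are invented for the example.

```python
# Name-variant rules: map a canonical name to its known variants.
# These rules are illustrative, mirroring the Robert/Exxon examples above.
NAME_VARIANTS = {
    "Robert": {"Bob", "Rob", "Bobby"},
}

# "Is related to" rules linking an entity to associated concepts.
RELATED_TO = {
    "Exxon": {"oil", "refineries", "gasoline"},
}

def canonicalize(name: str) -> str:
    """Resolve a name variant back to its canonical form."""
    for canonical, variants in NAME_VARIANTS.items():
        if name == canonical or name in variants:
            return canonical
    return name

def related_concepts(entity: str) -> set:
    """Return the concepts the taxonomy links to an entity."""
    return RELATED_TO.get(entity, set())

print(canonicalize("Bobby"))       # Robert
print(related_concepts("Exxon"))   # oil, refineries, gasoline
```

Real systems encode vastly larger taxonomies, but the principle is the same: explicit rules give the machine its starting knowledge.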
Machine learning, though, is only as good as the data sets on which the machines are trained. So learning comes with a fundamental problem: just because a machine (or a human, for that matter) learns, it does not mean either one understands.
Thus, we move to the next piece of cognitive computing: artificial intelligence, where machines begin to ‘think.’
Artificial intelligence, broadly speaking, is the next stage of attempting to apply human-like characteristics to data and information to get results and answers.
Recommendation engines (you bought this, you might like that) serve as a simple example of marrying learning with intelligence.
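The "you bought this, you might like that" idea can be reduced to a few lines. The sketch below uses invented purchase data and a bare-bones co-occurrence approach; production recommendation engines are far more sophisticated.

```python
from collections import Counter

# Fabricated purchase histories, purely for illustration.
purchases = {
    "alice": {"book", "lamp"},
    "bob":   {"book", "lamp", "desk"},
    "carol": {"book", "desk"},
}

def recommend(user: str) -> list:
    """Suggest items bought by users who share purchases with `user`."""
    owned = purchases[user]
    scores = Counter()
    for other, items in purchases.items():
        if other != user and owned & items:   # shares at least one purchase
            for item in items - owned:        # things `user` doesn't own yet
                scores[item] += 1
    return [item for item, _ in scores.most_common()]

print(recommend("alice"))   # ['desk']
```

Both Bob and Carol share a purchase with Alice and both own a desk, so the desk is recommended: learned data (purchase histories) married to a simple form of reasoning (shared taste implies shared interest).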
Again, though, machines, like people, will find as they learn and begin to reason that they don't have all the answers and need to discover more information.
Machine-to-machine communication is typically called the “Internet of Things.” A machine might not know how many hours a fan blade in a jet engine has been in operation, but it can ask a sensor in that engine for that information.
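The jet-engine example can be sketched as one machine querying another. The classes and readings below are hypothetical, just to show the shape of the interaction: the maintenance system does not store operating hours itself; it asks the sensor.

```python
class FanBladeSensor:
    """A toy stand-in for an in-engine sensor (values are invented)."""

    def __init__(self, hours_in_operation: float):
        self._hours = hours_in_operation

    def read(self, metric: str) -> float:
        if metric == "hours_in_operation":
            return self._hours
        raise KeyError(f"unknown metric: {metric}")


class MaintenanceSystem:
    """A machine that consults another machine instead of guessing."""

    def __init__(self, sensor: FanBladeSensor, service_interval: float):
        self.sensor = sensor
        self.service_interval = service_interval

    def needs_service(self) -> bool:
        hours = self.sensor.read("hours_in_operation")
        return hours >= self.service_interval


sensor = FanBladeSensor(hours_in_operation=12_500)
system = MaintenanceSystem(sensor, service_interval=10_000)
print(system.needs_service())   # True
```

In a real deployment the `read` call would be a network request to a device, but the division of labor is the same: the knowledge lives at the sensor, and other machines fetch it on demand.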
Thus, we can see how machines learn, think, and get more information. However, as with any technology, it means nothing if we cannot make it feel personal and relevant to a consumer.
Thus, the last component of cognitive computing – the human-computer interface – ties nicely into advanced user experience.
With cognitive computing, we use technology called ‘natural language processing’ to enable humans to ask questions in a ‘normal’ way to get answers.
We catch a glimpse of this human-computer interface in an advanced user experience when we ask Apple’s Siri, Amazon’s Alexa, or Google Home for information.
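At its most stripped-down, answering a “normal” question means mapping free-form words to a known intent. The sketch below does this with simple keyword matching; the assistants named above use far more sophisticated language models, and the intents and answers here are invented for illustration.

```python
import re

# Hypothetical intents: keywords that trigger them, and a canned answer.
INTENTS = {
    "weather": (("weather", "rain", "sunny"), "Today looks clear and mild."),
    "time":    (("time", "clock", "hour"),    "It is currently 3:00 PM."),
}

def answer(question: str) -> str:
    """Match a free-form question to an intent by keyword overlap."""
    words = set(re.findall(r"[a-z']+", question.lower()))
    for keywords, reply in INTENTS.values():
        if words & set(keywords):
            return reply
    return "Sorry, I don't know that one."

print(answer("What's the weather like today?"))
```

The gap between this toy and a real assistant is exactly where natural language processing lives: handling phrasing the rules never anticipated.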
Building in the cloud almost seems obvious in 2017, and yet when I joined in 2010, that was not the case due to prevalent security concerns.
Today, seven years on, the services that Amazon, Microsoft, and Google provide, with their size, security, and scale, far outweigh anything one might build anew.
New solutions can be robustly architected and deployed with geographic redundancy in minutes courtesy of the global footprint of these infrastructure providers.
It’s an exciting world that awaits us with technologies that will revolutionize how we engage and interact with content regardless of the profession we work in.
Delivering this technology in an effective user experience will take it from the fringes to being a part of our everyday life and work-life blur.
It’s a future that we are looking forward to bringing to life for our customers, to help them get the answers and insight they need more efficiently.
Discover how to optimize your productivity and engage your audience with Refinitiv Wealth Management Solutions