Kathy Gibson at Gartner Symposium, Cape Town — Artificial intelligence (AI) is becoming more pervasive — and it is starting to enable intelligent apps and things.
This is among the trends that we can look forward to in 2018, according to Gartner’s list of the top 10 technology trends.
Brian Burke, vice-president and head of research at Gartner, refers to the intelligent digital mesh as the overarching concept.
Intelligent refers to technology that is becoming more insightful and aware of context; digital is about technology that spans the digital and physical worlds, becoming immersive and more autonomous; and mesh refers to the underlying technologies that enable these trends while keeping them dynamic and secure.
“Each of these — intelligence, digital and mesh — will have a different impact on your organisation — you need to analyse how you can use them to increase your competitive advantage — or how your competitors will use them to displace you.
“These are not standalone technologies: the combinatorial effect will have the most impact on your business, your industry and the world overall.”
By 2020, 30% of CIOs will have AI as one of their top five investment priorities, while 40% of new development projects will have AI components delivered by joint teams of data scientists and programmers.
“We think it will be a bit like electricity was 100 years ago: we will add it to just about everything that we do.
“What we won’t have in this decade is general AI,” Burke adds. “We won’t have R2D2, but we can apply it to very narrow fields.”
AI has been around for a long time, Burke points out. It was researched as far back as the 1960s. Today it has evolved into supervised learning, unsupervised learning and reinforcement learning.
Deep learning and machine learning are where we are today, he adds.
“Machine learning is where we have data objects, and neural networks are taught to recognise data by tagged objects.”
This is good for face, speech and image recognition.
Unsupervised learning is where data is fed into a system, which finds unexpected patterns in it, Burke adds.
Reinforcement learning happens when an algorithm has a goal and is allowed to try to reach it, being rewarded when it succeeds.
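To make the reinforcement-learning idea concrete, here is a minimal sketch in Python (not from Burke's presentation): a hypothetical agent on a short number line is given a goal position and learns, through trial, error and reward, to reach it. The task, reward values and learning settings are illustrative assumptions.

```python
import random

# Minimal, illustrative reinforcement-learning loop: an agent on positions
# 0..5 learns to reach the goal at position 5 and is rewarded only on arrival.
STATES = range(6)      # positions 0..5
ACTIONS = (-1, +1)     # step left or step right
GOAL = 5

q = {(s, a): 0.0 for s in STATES for a in ACTIONS}   # learned action values
alpha, gamma, epsilon = 0.5, 0.9, 0.2                # illustrative settings

for episode in range(500):
    state = 0
    while state != GOAL:
        # Explore occasionally; otherwise pick the best-known action.
        if random.random() < epsilon:
            action = random.choice(ACTIONS)
        else:
            action = max(ACTIONS, key=lambda a: q[(state, a)])
        nxt = min(max(state + action, 0), GOAL)
        reward = 1.0 if nxt == GOAL else 0.0          # reward on reaching the goal
        best_next = max(q[(nxt, a)] for a in ACTIONS)
        q[(state, action)] += alpha * (reward + gamma * best_next - q[(state, action)])
        state = nxt

# After training, the learned policy steps right from every non-goal position.
print([max(ACTIONS, key=lambda a: q[(s, a)]) for s in STATES if s != GOAL])
```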
“Data is driving a lot of this,” Burke adds. “But probably more important is the development in compute technology. AI today uses mostly GPU chips, but the major players are looking at new technologies to develop chips that focus specifically on machine learning tasks.”
By 2020, Gartner believes that deep neural networks (DNNs) and machine learning applications will represent a $10-billion market opportunity for semiconductor vendors.
AI will be applied to apps and analytics on one hand, and things on the other.
We are seeing the application of AI in apps and analytics now: packaged applications, custom applications and security/operations are already available. Intelligence will soon be widely used in chatbots, virtual assistants and advisors.
Analytics is becoming more pervasive, thanks to AI. Augmented analytics is helping data scientists and end users gain better insights.
In terms of things, we’re already seeing consumer appliances, industrial equipment and medical devices, with robots, drones and autonomous vehicles coming soon.
Pilot projects can already be seen in retail scenarios, drone deliveries and even a hotel staffed only by robots. There are also interesting moves in the medical sphere. Intelligence is being applied to a raft of consumer appliances, and is even being piloted in self-sailing ships.
A further development in driving the Internet of Things is the concept of artificial swarm intelligence, where large numbers of things can learn to act in unison, or co-operatively — for instance, in traffic management.
Gartner predicts that by 2022, IoT will save consumers and business $1-trillion a year in maintenance, services and consumables.
On the infrastructure front, the industry is now talking about cloud to edge connectivity. “Where we are moving towards, at least with IoT, is to put more intelligence on the edge,” Burke says.
In this scenario, assets on the edge communicate with a gateway that aggregates, transforms, filters and forwards data to an edge server, which does additional analytics and control before forwarding the data to the server or cloud that looks after monitoring and management.
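As a rough illustration of that gateway role (an assumption for this article, not a reference design from Gartner), the Python sketch below takes raw readings from edge devices, filters out implausible values, aggregates each device's data and forwards a compact summary upstream to an edge server or cloud endpoint.

```python
from statistics import mean

# Illustrative edge-gateway sketch: raw sensor readings arrive from devices,
# the gateway filters obvious noise, aggregates per device, and forwards a
# compact summary upstream (to an edge server or cloud endpoint).

def filter_readings(readings, low=-40.0, high=125.0):
    """Drop values outside the sensor's plausible range."""
    return [r for r in readings if low <= r <= high]

def aggregate(readings_by_device):
    """Reduce each device's readings to a single average."""
    return {
        device: round(mean(values), 2)
        for device, values in readings_by_device.items()
        if values
    }

def forward(summary):
    """Stand-in for sending the summary to an edge server or cloud API."""
    print("forwarding upstream:", summary)

raw = {
    "pump-1": [21.3, 21.6, 999.0],    # 999.0 is a faulty spike to be filtered
    "pump-2": [19.8, 20.1, 20.0],
}
filtered = {device: filter_readings(vals) for device, vals in raw.items()}
forward(aggregate(filtered))
```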
“Moving intelligence to the edge means that we are able to react to events at the edge,” Burke says. This is important in applications like autonomous vehicles.
“But we need to communicate with the external world as well, provide data to the cloud for analysis, and interact with other vehicles for intelligent collaboration.”
So there is a need for intelligence at the edge for quick response, and the data also needs to be provided to a broader set of stakeholders.
This is true for a range of applications, for instance in healthcare.
Digital twins are a fairly new trend that is now coming to fruition.
This involves creating a digital representation of the physical device, which is loaded up with sensors that record events and send the data to the digital twin.
It is happening today in industrial machines, usually for observation, operation and optimisation.
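A minimal sketch of the idea, with invented asset names and thresholds, might look like the following: sensor events stream into a software twin that mirrors the machine's state and records conditions an operator or an optimisation routine could act on.

```python
from dataclasses import dataclass, field

# Illustrative digital-twin sketch: a software mirror of one industrial machine.
# Sensor events update the twin's state; simple rules support observation and
# optimisation decisions without touching the physical asset. The thresholds
# and field names are invented for the example.

@dataclass
class DigitalTwin:
    asset_id: str
    max_temperature: float = 90.0
    state: dict = field(default_factory=dict)
    alerts: list = field(default_factory=list)

    def apply_sensor_event(self, reading: dict) -> None:
        """Mirror the latest sensor reading into the twin's state."""
        self.state.update(reading)
        temp = self.state.get("temperature")
        if temp is not None and temp > self.max_temperature:
            self.alerts.append(f"{self.asset_id}: temperature {temp} above limit")

twin = DigitalTwin(asset_id="compressor-7")
twin.apply_sensor_event({"temperature": 72.5, "rpm": 1480})
twin.apply_sensor_event({"temperature": 93.1})
print(twin.state)   # current mirrored state of the physical machine
print(twin.alerts)  # conditions an operator or optimiser could act on
```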
“But here’s the challenge,” Burke says. “You have the technology provider and the enterprise ecosystem — and they need to be integrated. Right now, that is your problem because there is no consistent communication between these providers.”
The idea of the digital twin is evolving into new areas. For instance, people will soon be represented by digital twins — a development that will be interesting for healthcare providers and insurers.
The concept can also extend to digital twinning of cars, trucks and even cities.
Conversational systems are extremely significant, Burke adds.
“In the past, when we interacted with computers, we had to be computer literate. Conversational systems flip that model: the complexity is done by the computers — humans signal their intent; the computer deciphers the intent and provides the response.
“This is massively important in the way we interact with technology.”
This is currently seen in voice-activated personal assistants like Siri, Cortana and Alexa, but will soon evolve way beyond this, Burke says.
New technologies will include virtual personal assistants, virtual employee assistants and virtual commercial assistants.
“More and more it is going to shift away from making people interact with technology.”
Increasingly, the interface is going to be voice. The system then processes the language, applies context awareness and intent handling, and integrates with information systems before sending the response back.
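A toy version of that pipeline, with invented intents and a canned back-end lookup standing in for real information systems, could be sketched as follows.

```python
# Illustrative conversational-pipeline sketch: a transcribed utterance is
# matched to an intent, enriched with context, routed to a back-end lookup,
# and turned into a reply. The intents and the weather answer are invented.

INTENT_KEYWORDS = {
    "weather": ("weather", "rain", "forecast"),
    "greeting": ("hello", "hi", "good morning"),
}

def detect_intent(utterance: str) -> str:
    """Very rough intent handling based on keyword matching."""
    words = utterance.lower()
    for intent, keywords in INTENT_KEYWORDS.items():
        if any(k in words for k in keywords):
            return intent
    return "unknown"

def fulfil(intent: str, context: dict) -> str:
    """Stand-in for integration with back-end information systems."""
    if intent == "weather":
        city = context.get("city", "your area")
        return f"Tomorrow in {city} looks dry."   # canned back-end answer
    if intent == "greeting":
        return "Hello! How can I help?"
    return "Sorry, I didn't catch that."

context = {"city": "Cape Town"}                   # context awareness
print(fulfil(detect_intent("Will it rain tomorrow?"), context))
```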
“We are moving to a multi-sensory, multi-modal model,” Burke says. “It will be auditory, visual, touch, haptics, smell, taste and more.”
Right now, we interact with a few devices that are not interconnected. “What will happen is that we will maintain conversation across multiple devices — but that is in the future.”
Immersive experiences build on the trends of augmented reality and virtual reality (AR/VR). “Other sensory input is making for a much more immersive experience,” Burke says.
VR currently doesn't provide the user with any real-world information, while AR presents the user with the real world plus some virtual augmentation.
This is shifting, Burke says. The addition of sensors in the physical world and increased intelligence are driving the move to mixed reality, using all kinds of sensory inputs: spatial, motion, audio, visual, tactile and more.
“What we are ultimately moving towards is the integration of perception and interaction,” Burke says. “And we will get to the transparent immersive experience.”
On the infrastructure front, blockchain and the distributed ledger are moving — but slowly.
“We are seeing vendors starting to come out with frameworks,” Burke says. “Where it is right now is that financial services are leading the way; government is starting to come on board.
“Other commercial applications are starting to emerge, particularly in supply chain.
“But blockchain is still in the early deployment stage. There are few large-scale deployments and there is still a lack of trust around the technology.”
Event-driven IT is starting to happen. “By 2020, event-driven processing will be necessary,” says Burke.
Currently, most systems are request-centric, and event-centric systems will complement these.
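The difference is easy to sketch: in the hypothetical publish/subscribe example below, producers emit events and any number of subscribers react to them, rather than a caller requesting data and waiting for a single response.

```python
from collections import defaultdict

# Illustrative event-driven sketch: producers publish events, and any number
# of subscribers react. The event names and handlers are invented.

subscribers = defaultdict(list)

def subscribe(event_type, handler):
    """Register a handler to react to a given type of event."""
    subscribers[event_type].append(handler)

def publish(event_type, payload):
    """Notify every subscriber interested in this event."""
    for handler in subscribers[event_type]:
        handler(payload)

subscribe("order.created", lambda order: print("billing:", order["id"]))
subscribe("order.created", lambda order: print("warehouse:", order["id"]))

# The producer does not know, or wait for, the consumers.
publish("order.created", {"id": "A-1001", "total": 249.00})
```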
On the security front, the trend is towards continuous adaptive risk and trust assessment.
Systems need to get to the point where they can predict security events before they occur, rather than simply responding to attacks.
Among the technologies that will be seen in this environment are deception technologies, which trap threats before they cause harm.
This is leading to the rise of DevSecOps: adding security into the DevOps cycle as an integral part of any development initiative.
“We will do this by automating a lot of the security administration that needs to happen to make DevOps secure,” Burke says.
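As one hedged example of what that automation might look like (the choice of scanner and the wiring are assumptions, not part of Burke's talk), a pipeline could run a small security gate after its test stage and block deployment when the scan reports findings.

```python
import subprocess
import sys

# Illustrative DevSecOps gate: run an automated security scan as part of the
# pipeline and stop the deployment step if it fails. The scanner chosen here
# (pip-audit) is a hypothetical example; any tool that exits non-zero when it
# finds problems could be dropped in instead.

SCAN_COMMAND = ["pip-audit"]

def security_gate() -> bool:
    """Return True only if the automated security scan passes."""
    result = subprocess.run(SCAN_COMMAND)
    return result.returncode == 0

if __name__ == "__main__":
    if not security_gate():
        print("security gate failed: blocking deployment")
        sys.exit(1)
    print("security gate passed: continuing pipeline")
```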