We’re living in a golden generation of technological innovation. If you’ve watched a world-class soccer or rugby team in action, you’ll recognise the signs - technologies brimming with potential coming of age at the same time, creating a whole that far transcends its individual parts. Like any good team, the magic comes from how the players interact with one another.
Sensors are getting smaller and more cost-effective while computing power is soaring thanks to hardware innovations like edge computing. Artificial intelligence is becoming more capable of making autonomous decisions just as our ability to feed it large amounts of data is increasing. Advances in image and gesture recognition as well as natural language processing are redrawing the boundaries of what data is, while new user interfaces such as AR-powered devices and wearables open up opportunities to engage with that data.
In other words, potentially disruptive technologies have matured enough over the last few years that we can start mixing and matching them in entirely new ways. Put them together in the right configurations - sensor-enabled wearable devices with high processing speeds and data-crunching AI, for example - and their capabilities can go far beyond anything we’ve been able to do before.
Welcome to the age of convergence. The phrase that sums it up best? “Put two things together that have not been put together before, and the world changes.”
Houston, we have a solution
Ever since we lit the first campfire, humans have been experimenting with new ways of combining technologies. What if I combined the hot, flamey stuff I made by striking two rocks together with the meat I hunted using this pointy stick?
One of the best examples for understanding the ongoing process of digital convergence is digital twin technology, which uses data from sensors to simulate real-world physical assets and respond dynamically to changing environmental factors. While the term has only gained mainstream recognition in the last few years, the concept has been around since the Space Race - think of the scenes from Apollo 13 when Houston scientists recreate the environment aboard the spacecraft and come up with survival options based on what the crew has available.
The technology available at the time meant NASA was limited to using a table full of parts as its simulation environment. As the decades passed and the technologies became available, NASA scientists were able to simulate mission environments (among many other things) far more accurately on computers. As sensor, computing and analytics technologies advanced, the possibilities of what digital twin tech could do grew exponentially.
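The core loop of a digital twin - sensor readings keep a virtual model synchronised with a physical asset, and the model responds to changing conditions - can be sketched in a few lines. This is a minimal, illustrative example; the pump, its sensor fields and the safety threshold are all hypothetical, not drawn from any real platform.

```python
class PumpTwin:
    """A toy virtual model of a hypothetical physical pump."""

    def __init__(self, max_safe_temp=80.0):
        self.max_safe_temp = max_safe_temp  # degrees Celsius (assumed limit)
        self.temperature = None
        self.rpm = None

    def ingest(self, reading):
        """Synchronise the twin's state with the latest sensor data."""
        self.temperature = reading["temperature"]
        self.rpm = reading["rpm"]

    def recommend(self):
        """Respond dynamically to the conditions the sensors report."""
        if self.temperature is not None and self.temperature > self.max_safe_temp:
            return "throttle"  # slow the pump down so it can cool
        return "continue"


# Feed the twin a (made-up) sensor reading and ask it for a decision.
twin = PumpTwin()
twin.ingest({"temperature": 85.2, "rpm": 1500})
print(twin.recommend())  # prints "throttle"
```

In a real deployment the `ingest` step would be a stream of telemetry and the `recommend` step a physics or machine-learning model, but the shape of the loop - observe, update, respond - is the same one NASA ran with a table full of parts.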
Nowadays, real-time analytics capable of simulating a physical environment are widespread beyond NASA. GE’s Predix platform has over a million digital twins active across its ecosystem, primarily in the manufacturing sector but spreading to the aviation, utility and transport industries.
This year’s Consumer Electronics Show saw Kia showcase its Real-time Emotion Adaptive Driving System (READ), an interface that adjusts a car’s temperature, seat positioning, lighting and music based on the user’s preferences. Other car manufacturers have explored data-driven car systems before, but READ takes the concept a step further by incorporating facial recognition technologies that allow it to read your emotions. Instead of you having to input the relevant data yourself, your car could potentially recognise from your expressions when you need the aircon turned up or the station changed.
Or take the combination of AI, real-time analytics and 3D printing. The US Navy and Lockheed Martin are working together on robots that can make complex decisions about how to optimise 3D printing production. Using machine learning algorithms, the robots would be able to monitor and adjust the design process far faster and more efficiently than humans could. Developers like AI Build are investigating the possibility of the factory of the future as a service, where users can access autonomous 3D printing services from anywhere in the world.
As exciting as these use cases are on their own, the potential new applications that emerge from them are even more so. Who’s to say the next generation of cars won’t be able to recognise from someone’s speech that they are too tired or intoxicated to drive, and take over driving duties? Just imagine a fully autonomous 3D printer deployed to a rural area that can print medical supplies on demand.
The 5G piece of the puzzle
As convergence unites potentially game-changing technologies, South Africans might be sceptical about how long these applications will take to reach our shores. There’s reason to think we will be enjoying the spread of true IoT much sooner than expected: the arrival of 5G.
Last year, Vodacom launched one of the world’s first commercial 5G networks in Lesotho, using newly allocated 3.5 GHz spectrum. Many more providers across South Africa, Kenya and Nigeria have similar plans scheduled for the next few years. The biggest barrier to next-generation mobile connectivity in SA has always been our limited spectrum. Thankfully, the last few years have seen telecom operators and government work together to try to solve the spectrum issue, and Lesotho’s successful use of the 3.5 GHz band provides a roadmap for ICASA in the years ahead.
The potential opened up by the convergence of AI, sensor, processing and other disruptive technologies is heavily reliant on universal connectivity and the free flow of data. For years, South Africans have faced a shortage of last mile connectivity options, especially as you move outside urban areas. 5G could be the catalyst in delivering reliable connectivity to these areas.
As IoT starts to embed itself in our everyday lives, it’s time to get playful about possibilities. Think big about what outcomes you’re seeking, and then think about what LEGO pieces you need.
You don’t need to know every development in every area, but you do need a rough idea of what’s happening in the main ones - AI, IoT and AR especially. Keep an eye on how organisations are using combinations of technologies instead of seeing new use cases through the lens of a single trend. Think holistically about new advances in your industry: explore how the interaction of different technologies created that specific example, and what other applications they might open up.
Finally, keep a close eye on consumers and how their behaviour shifts in response to these new interplays of technologies. As incredible as some of these new applications are, their success ultimately depends on whether they make sense for your customers or users.