5 Big Technology Innovations Of 2018
27 March, 2018 / Articles
Today IBM is announcing its ‘Five-for-Five’ report, highlighting five amazing technological advances that will have a real impact on lives in the next half decade.
From AI-powered micro-cameras monitoring the world’s oceans to the graduation of quantum computers from the lab into the real world, here’s a summary of these game-changing breakthroughs.
- AI-powered robot microscopes will help clean up the world’s water supplies
Water shortage is a problem that could affect up to a quarter of the world’s population by 2025. The behavior of microscopic plankton can give vital clues on everything from chemical pollution levels to temperature change.
Autonomous, robotic cameras developed by IBM and powered by AI have the potential to monitor this behavior in more detail than has been possible before. Data from the cameras can be analyzed to give real-time insights into factors affecting water quality and life in our lakes and oceans.
Ahead of today’s release, Jeff Welser, vice president and lab director at IBM Research, told me: “So with the internet of things (IoT) we talk about putting sensors everywhere – and this is an example of just how far we can take this when we combine it with AI.
“We know people are going to have all kinds of problems with clean water in the future, and we know there are micro-organisms in water that, if we can get them to tell us what’s happening, would be a really great way to understand any potential problems.”
Making the devices as low-powered as possible is essential in order to deploy them at scale. To this end, they contain no lenses, focus mechanisms, or other complicated mechanical parts; they simply track shadows and movement through light sensors.
“We can get a lot of information from that”, Welser says. “Are the microscopic organisms moving around as they should be? There’s a lot of interesting science on what that behavior means.”
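The article doesn't describe IBM's actual algorithm, but the idea of extracting behavioral signals from raw light-sensor readings can be sketched with simple frame differencing. The function below, with entirely hypothetical data and thresholds, scores how much movement a clip of lensless "frames" contains:

```python
# Illustrative sketch (not IBM's implementation): detecting organism
# activity from raw light-sensor frames by frame differencing.

def activity_score(frames, threshold=10):
    """Fraction of sensor cells whose intensity changed by more than
    `threshold` between consecutive frames, averaged over the clip."""
    if len(frames) < 2:
        return 0.0
    changed = total = 0
    for prev, curr in zip(frames, frames[1:]):
        for p, c in zip(prev, curr):
            total += 1
            if abs(c - p) > threshold:
                changed += 1
    return changed / total

# A moving shadow: one dark cell shifts position each frame.
still  = [[100, 100, 100, 100]] * 3
moving = [[20, 100, 100, 100],
          [100, 20, 100, 100],
          [100, 100, 20, 100]]

assert activity_score(still) == 0.0
assert activity_score(moving) == 0.5
```

A real deployment would feed scores like this into an AI model trained to distinguish normal plankton motion from the sluggish or erratic behavior that signals pollution.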
- Crypto-anchors and blockchains to fight counterfeiters
Nobody likes knockoffs – blockchain and crypto-anchors will help to crack down on counterfeiting, as well as ensuring security in the food supply chain.
With $600 billion a year lost to the global economy through fraud and counterfeiting, blockchain offers the potential to ensure the provenance of everything from food to diamonds and life-saving medicines.
In a global economy, goods pass through many different sets of hands between their point of production and the end consumer. This leaves them open to tampering and theft, problems which blockchain technology could help to eliminate.
In order to work, however, there needs to be a tamper-proof link between the physical products and the digital records on the blockchain. This is where crypto-anchors come in – microscopic codes or identifiers which can serve as “digital fingerprints” to ensure security at every stage of the journey.
“The challenge here is that the blockchain can record all the transactions but somewhere you’ve got to link the transactions to the actual physical object itself – so that you know the banana that got scanned is the actual banana that got to you,” Welser tells me.
“What crypto anchors do is they basically embed tiny codes, like microscopic QR codes, in a way that makes it so that if you tried to replace it with a similar one, you could tell it had been tampered with. When you put those codes onto the blockchain, the supply chain is then protected.”
- Lattice cryptography will discourage even quantum-powered hackers
Complex algebraic structures called lattices will become a valuable tool in the age of quantum computers. With more and more sensitive data being collected and stored online, security measures will need to keep pace with the growing capability of hackers, as virtually unlimited amounts of computing power become cheaper and more available.
Until now, ever-more-complex cryptography – from 64-bit encryption to 128-bit and 256-bit – has been the standard response to the increasing amount of CPU power available to hackers. As quantum computing becomes mainstream, this will no longer be enough.
“The reality is there’s constantly a battle on with cybersecurity, we need to make sure we continue to have cryptography and encryption that can keep the bad guys out, and all of that relies on the fact that the maths is so hard to do that trying to solve it with a computer takes an unreasonable amount of time,” Welser says.
“We have to make sure that as computers get faster, we can continue to keep ahead of them. In particular, this is a concern with the quantum computers that are coming up.”
Lattice cryptography involves encoding data within high-dimensional algebraic structures which even theoretical million-qubit quantum computers will find tough to crack. It also opens up the possibility of Fully Homomorphic Encryption (FHE), which will enable computers to operate on data while it is still in an encrypted state – eliminating the security flaw inherent in existing systems whereby data has to be decrypted (and thus made vulnerable to hackers) in order to be processed. This could, for example, mean credit reference systems which can make credit scoring decisions without personal data ever being exposed.
- AI bias will explode, but only unbiased AI will survive
The most sophisticated AI systems are only as good as the data they are trained on, and if that data has been collected in a biased or compromised way, then results are unlikely to fit with the real world which we are attempting to model.
Devising new ways to monitor for bias, and to eliminate it at the source, is key to creating AI software which accurately reflects reality, rather than the biased human view of reality that AI promises to help us transcend.
Welser tells me “One of the hopes of AI is that it will help us make decisions in less biased ways, because AI won’t have human biases. So if you’re making decisions on mortgages, or who should get bail, or who you should recruit, all of those things have biases built into them.
“AI systems would hopefully be able to make those decisions with less bias, but the challenge is that the AI gets trained on data, and if that data has a bias then your AI will be biased.
“We spend a lot of time right now working on how the systems we’re training aren’t inadvertently learning bias, and also protecting them from players who might be trying to teach them bias that we don’t want them to have.”
AI systems trained in this way to provide an unbiased, objective model of the world are likely to be the most successful. This will help us to tackle moral and ethical problems which will be encountered by any industries or fields of research attempting to use AI to tackle social issues or make decisions that will affect human lives.
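One simple bias probe of the kind Welser describes is to compare a model's approval rate across groups, a check known as demographic parity. The sketch below uses entirely hypothetical decision data; the group labels and threshold are invented for illustration:

```python
# Illustrative bias probe (hypothetical data): compare approval rates
# across groups to flag possible demographic-parity violations.

def approval_rates(decisions):
    """decisions: list of (group, approved) pairs -> approval rate per group."""
    totals, approved = {}, {}
    for group, ok in decisions:
        totals[group] = totals.get(group, 0) + 1
        approved[group] = approved.get(group, 0) + int(ok)
    return {g: approved[g] / totals[g] for g in totals}

decisions = ([("A", True)] * 8 + [("A", False)] * 2
           + [("B", True)] * 4 + [("B", False)] * 6)

rates = approval_rates(decisions)
gap = abs(rates["A"] - rates["B"])
assert rates == {"A": 0.8, "B": 0.4}
assert gap > 0.1   # a large gap flags the data or model for review
```

A gap this wide does not prove unfairness on its own, but it tells engineers exactly where to look before the system is deployed on decisions like mortgages, bail, or hiring.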
- Quantum computing will move from the research labs into the real world
Half a decade from now, quantum computing will be an essential element of any computer engineering degree, IBM researchers are today predicting. Rather than a technology shrouded in mystery, it will be fundamentally understood and in practical use, solving problems in many disciplines and industries.
“We’re doing a lot with quantum computers,” Welser says. “We have a 15-qubit system on the cloud which anyone can go and use, and we’re seeing a lot of interesting things. There have been over 100,000 hits on it now, from people going and writing programs on it.
“But it’s still a toy at the moment – a researcher’s playground. I think that in the next five years we will have systems that are both large enough and have low enough error rates that we will see some really interesting things that have real value.
“The most likely area is things like quantum chemistry – simulations of things like molecules or chemical bonds.
“Right now we use very large high-performance computing systems but even with those, once we go beyond simulating a few molecules or atoms it becomes very difficult because there are too many variables. And of course, as we are working at the sub-atomic level those are quantum variables. They can be simulated very directly with a quantum computer.”
An understanding of quantum computing will be essential for those looking for careers in any scientific field, and students will leave university with hands-on experience of running practical experiments on quantum-powered machines. And just as most engineers or scientists could today outline what is meant by the computing term “bit”, in five years’ time the term “qubit” will be widely understood.
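What "qubit" means can be shown in a few lines: where a bit is 0 or 1, a qubit is a pair of amplitudes whose squares give measurement probabilities. This minimal sketch simulates a single qubit with real amplitudes and a Hadamard gate, the standard gate for creating an equal superposition:

```python
import math

# Minimal single-qubit sketch: a state is a pair of amplitudes (a, b)
# for |0> and |1>; squared amplitudes are measurement probabilities.

def hadamard(state):
    """Hadamard gate: maps |0> to an equal superposition of |0> and |1>."""
    a, b = state
    r = 1 / math.sqrt(2)
    return (r * (a + b), r * (a - b))

def probabilities(state):
    a, b = state
    return (a * a, b * b)   # valid for real amplitudes, as used here

qubit = hadamard((1.0, 0.0))          # start in |0>, apply Hadamard
p0, p1 = probabilities(qubit)
assert abs(p0 - 0.5) < 1e-9 and abs(p1 - 0.5) < 1e-9   # 50/50 outcome

# Applying Hadamard twice returns exactly to |0>: gates are reversible.
p0, _ = probabilities(hadamard(hadamard((1.0, 0.0))))
assert abs(p0 - 1.0) < 1e-9
```

Simulating n qubits this way needs a vector of 2^n amplitudes, which is exactly why classical machines choke beyond a few dozen qubits and why Welser expects real quantum hardware to take over tasks like quantum chemistry.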
Bernard Marr is a best-selling author & keynote speaker on business, technology and big data. His new book is Data Strategy.