
6 surprising innovations for the future of computing

By Dan Wellers, Fawn Fitter | 5 min reading time

As silicon-based transistors become so tiny that they bump up against the laws of physics, manufacturing techniques can no longer keep up. That signals the upper limits of Moore’s Law, which posits that the number of transistors on a microprocessor (and therefore its computing power) doubles roughly every two years. But does that mean the era of exponential tech-driven change is about to come to a screeching halt?

Absolutely not.

 

Moore’s Law has never been an immutable truth, like gravity or the conservation of energy. It’s been more of a self-fulfilling prophecy: it set expectations for chip makers to meet, and so they did. That helped stoke the world’s insatiable hunger for more and more computing power – and that demand isn’t going to disappear just because we’ve taken silicon-based microprocessors about as far as they can go. So now we need to explore new ways of packing more power into ever tinier spaces.
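
To put that doubling cadence in concrete terms, here is a back-of-the-envelope sketch in Python – an illustration of the arithmetic only, not a real industry forecast:

```python
# Back-of-the-envelope Moore's Law arithmetic: transistor counts
# double roughly every two years. Illustrative numbers only.
def projected_transistors(start_count: int, years: int) -> int:
    """Project a transistor count forward under a two-year doubling cadence."""
    return start_count * 2 ** (years // 2)

# A hypothetical 1-billion-transistor chip, projected 10 years out:
print(projected_transistors(1_000_000_000, 10))  # 32,000,000,000
```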

Want a quick overview?

Read “The Future of Computing: Post-Silicon Computing.”

The future of computing is being shaped by transistors made from materials other than silicon. It’s being amplified by approaches that have nothing to do with transistor speed, such as deep-learning software and the ability to crowdsource excess computing power to create what amounts to distributed supercomputers. It may even redefine computing itself.

Here are some of the landmarks on computing’s new frontiers:

  • Graphene-based transistors: Graphene – one carbon atom thick and more conductive than any other known material (see The Super Materials Revolution) – can be rolled up into tiny tubes and combined with other 2D materials to move electrons faster, in less space and using less energy, than even the smallest silicon transistor. Until recently, though, producing nanotubes was too messy and error-prone to be commercially feasible. In 2019, however, a team of MIT researchers developed a process for creating a 16-bit carbon nanotube microprocessor that successfully executed a set of instructions to print out a message beginning “Hello, World!” The process eliminated enough defects in the nanotubes that it could move from lab to factory within five years.
  • Quantum computing: Even the most powerful conventional computer can only assign a one or a zero to each bit. Quantum computing, by contrast, uses quantum bits, or qubits, which can be a zero, a one, or any combination of the two at once. (Brain bending, yes, but see WIRED’s surprisingly understandable explanation – and the minimal simulation after this list.) Current quantum computers are noisy and unreliable, but in the next 10 or 20 years they could help us design new materials and chemical compounds and create unhackable channels of communication to protect everything from financial transactions to troop movements.
  • DNA data storage: Convert data to base 4 and you can encode it on synthetic DNA (see the sketch after this list). Why would we want to do that? Simple: We already know how to sequence (read), synthesize (write to), and copy DNA. A little bit of it stores a whole lot of information; some researchers believe we could meet the world’s entire data storage needs for a year with a cubic meter of powdered E. coli DNA. And it’s remarkably stable, as demonstrated by scientists who used a scrap of bone to reconstruct the genome of a cave bear that died 300,000 years ago. DNA-based data storage as a service (because you’re probably not going to invest in your own DNA synthesis and sequencing tools) may be just a few years away.
  • Neuromorphic technology: The goal of this technology is to create a computer that mimics the architecture of the human brain in order to achieve human levels of problem solving – and perhaps even cognition at some point – while requiring hundreds of thousands of times less energy than traditional transistor-based processors. We aren’t there yet, but in early 2020, Intel rolled out a new server based on neuromorphic chips that it claims has roughly the same neural capacity as a small mammal’s brain. And in a development that would once have been science fiction, an international team of researchers has linked artificial and biological neurons so that they communicate like a biological nervous system – but one that uses internet protocols. (A toy spiking-neuron sketch follows this list.)
  • Optical computing: Computing with photons – mapping data onto light-intensity levels and varying the light intensity to perform calculations – is still in its earliest stages, but it could enable high-efficiency, low-power processing and data transmission. At nanoscale, optical computing would operate at the literal speed of light.
  • Distributed computing: Every computer that’s idling in sleep mode or isn’t operating at full capacity has compute cycles that can be used for other things. A client that runs in the background allows that computer to download workloads from a remote server, perform calculations locally, and upload the results back to the server (see the client-loop sketch after this list). The current apex of this distributed model is Folding@home, which is modeling protein molecules to find cures for diseases like Alzheimer’s, cancer, and, most recently, COVID-19. The project now has nearly 750,000 participants and a collective 1.5 exaflops of power – that is, the ability to perform 1.5 quintillion calculations per second. That’s 75% of the projected speed of the El Capitan supercomputer, which is expected to be the world’s fastest when it comes out in 2023.
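
To make the qubit idea above a bit more concrete, here is a minimal simulation of a single qubit – our own toy sketch, not any vendor’s quantum SDK. The state is a pair of amplitudes, and measurement collapses it to 0 or 1 with probabilities set by those amplitudes:

```python
import random

# Toy single-qubit model: the state is a pair of complex amplitudes
# (alpha, beta) with |alpha|^2 + |beta|^2 = 1. Measurement collapses
# it to 0 with probability |alpha|^2 and to 1 with probability |beta|^2.
def measure(alpha: complex, beta: complex) -> int:
    return 0 if random.random() < abs(alpha) ** 2 else 1

# An equal superposition: "a zero and a one at once" until measured.
alpha = beta = 2 ** -0.5  # 1/sqrt(2)
samples = [measure(alpha, beta) for _ in range(10_000)]
print(sum(samples) / len(samples))  # ~0.5: about half the readings are 1
```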
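
The base-4 conversion behind DNA storage is also easy to sketch. The two-bits-per-base mapping below is a common classroom illustration, not the error-corrected encoding any particular lab actually uses:

```python
# Illustrative base-4 codec: each byte becomes four nucleotides.
# Real DNA storage schemes add error correction and avoid repetitive
# sequences; this shows only the core idea of mapping bits to bases.
BASES = "ACGT"  # 00 -> A, 01 -> C, 10 -> G, 11 -> T

def encode(data: bytes) -> str:
    return "".join(BASES[(byte >> shift) & 0b11]
                   for byte in data for shift in (6, 4, 2, 0))

def decode(strand: str) -> bytes:
    digits = [BASES.index(base) for base in strand]
    return bytes((digits[i] << 6) | (digits[i + 1] << 4) |
                 (digits[i + 2] << 2) | digits[i + 3]
                 for i in range(0, len(digits), 4))

strand = encode(b"Hello, World!")
print(strand)          # 'CAGACGCCCGTA...' -- four bases per byte
print(decode(strand))  # b'Hello, World!'
```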
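
Neuromorphic chips compute with spiking neurons rather than clocked logic. As a toy illustration of that concept (not Intel’s actual design), here is the classic leaky integrate-and-fire model from the textbooks:

```python
# Toy leaky integrate-and-fire neuron, the textbook abstraction behind
# spiking, neuromorphic hardware: charge leaks away over time, inputs
# add to it, and the neuron "fires" when a threshold is crossed.
def simulate(inputs, leak=0.9, threshold=1.0):
    potential, spike_times = 0.0, []
    for t, current in enumerate(inputs):
        potential = potential * leak + current  # leak, then integrate
        if potential >= threshold:              # fire and reset
            spike_times.append(t)
            potential = 0.0
    return spike_times

# A steady trickle of weak input eventually accumulates into spikes.
print(simulate([0.3] * 20))  # [3, 7, 11, 15, 19]
```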
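
Finally, the download-compute-upload loop behind projects like Folding@home fits in a few lines. The server URL, endpoints, and workload format below are hypothetical placeholders, not Folding@home’s real API:

```python
import time
import requests  # third-party HTTP library: pip install requests

SERVER = "https://example.org/api"  # hypothetical volunteer-computing server

def run_client():
    """Repeatedly fetch a work unit, compute locally, and upload the result."""
    while True:
        task = requests.get(f"{SERVER}/work").json()   # download a workload
        result = sum(x * x for x in task["numbers"])   # stand-in computation
        requests.post(f"{SERVER}/result",              # upload the result
                      json={"id": task["id"], "value": result})
        time.sleep(1)  # yield the machine between work units
```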

We may be approaching the limits of what silicon chips can do, but technology itself is still accelerating. It’s unlikely to stop being the driving force in modern life. Its influence will only increase as new computing technologies push robotics, artificial intelligence, machine-to-human interfaces, nanotechnology, and other world-shaking advances past today’s accepted limits.

 

In short, exponential growth in computing may not be able to go on forever, but its end is still much further in the future than we might think.

About the Authors

Dan Wellers
is the Digital Futures Global Lead and Senior Analyst at the SAP Insights research center.

Fawn Fitter
writes about the intersection of business and technology.
