TechRadar
Dr. Roman Caudillo

Inventing tomorrow’s technology with hyperdimensional computing

It may sound like something out of the latest Star Wars or Marvel film, but hyperdimensional (HD) computing is, in fact, very real. Simply put, it is a novel machine-learning paradigm inspired by theoretical neuroscience.

HD computing borrows principles from how our brains perform complex tasks, chiefly the transformation of dense sensory data into a high-dimensional sparse representation in which the relevant information is easier to separate. Applied to machine learning, this yields gains in performance and energy efficiency without sacrificing accuracy, with the added benefit of robustness to noise. HD computing is also highly parallelizable and, when paired with the right hardware, could form the basis of the next generation of machine learning.

Unlike AI or quantum computing, HD computing has not yet entered the popular cultural zeitgeist. So, having addressed what it is in simple terms, let's go further. What exactly is HD computing? What are its capabilities? And how is it being used today?

Just what is hyperdimensional computing?

As outlined above, HD computing is inspired by how humans (and other animals) use their senses to gather information. When you take in visual information, for instance, it arrives as dense sensory input signals from firing neurons.

These electrical signals travel from your retina, through the optic nerve, to your brain, where they are transformed, or "exploded", into a high-dimensional sparse representation that engages thousands of neurons: in other words, what you see. By moving from a compact depiction to a seemingly less efficient high-dimensional sparse representation, scientists believe, the brain can more easily separate the most crucial information.

HD computing mimics this pattern-based computation in human memory using large vectors with tens or even hundreds of thousands of dimensions. For example, if you take data with an initially dense representation, such as a real number, and re-encode it as a hypervector of thousands of bits, an HD model can more easily recognize which data shares a pattern with other data, essentially separating the most crucial information.
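
To make that concrete, here is a minimal Python sketch (the dimensionality, random seed, and noise level are illustrative assumptions, not taken from any particular HD system). It shows the two properties described above: randomly drawn hypervectors are nearly orthogonal, so distinct patterns are easy to tell apart, while a heavily corrupted copy of a hypervector still clearly matches its original.

```python
import numpy as np

D = 10_000                      # hypervector dimensionality, in the "thousands of bits" range
rng = np.random.default_rng(0)  # fixed seed for reproducibility

def random_hypervector():
    # Random bipolar (+1/-1) hypervector; two independent draws are nearly orthogonal
    return rng.choice([-1, 1], size=D)

def similarity(a, b):
    # Normalized dot product: ~0 for unrelated hypervectors, 1 for identical ones
    return a @ b / D

x, y = random_hypervector(), random_hypervector()
print(f"unrelated pair:  {similarity(x, y):+.3f}")    # close to 0

# A noisy copy of x, with 30% of its components flipped, still clearly matches x
flips = rng.random(D) < 0.3
x_noisy = np.where(flips, -x, x)
print(f"noisy copy of x: {similarity(x, x_noisy):+.3f}")  # around +0.4, far above 0
```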

Alongside this, HD computing uses fast single-pass training, which enables real-time learning and reasoning. Unlike deep neural networks, which require complex infrastructure to train, HD computing models can learn online on something as simple as a mobile phone or a sensor. The use of hypervectors also makes HD computing remarkably robust to noise, while hardware accelerators make it energy efficient. All these properties make HD computing an excellent fit for AI tasks, from classifying images and objects to detecting, storing, binding, and unbinding noisy patterns.
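
To illustrate that single-pass property, the toy sketch below (my own example, not the training scheme of any specific HD system) builds one class "prototype" per label in a single summation pass over the data, with no gradients and no epochs, then classifies new samples by similarity to those prototypes.

```python
import numpy as np

D = 10_000
rng = np.random.default_rng(1)

def train(samples, labels, num_classes):
    # Single-pass training: bundle (sum) each class's hypervectors into a prototype
    prototypes = np.zeros((num_classes, D))
    for hv, y in zip(samples, labels):
        prototypes[y] += hv
    return prototypes

def classify(hv, prototypes):
    # Predict the class whose prototype is most similar (largest dot product)
    return int(np.argmax(prototypes @ hv))

# Toy demo: two classes, each defined by an underlying random pattern,
# observed only through noisy copies (20% of components flipped)
base = rng.choice([-1, 1], size=(2, D))

def noisy_sample(c, p=0.2):
    flips = rng.random(D) < p
    return np.where(flips, -base[c], base[c])

prototypes = train([noisy_sample(c) for c in (0, 0, 1, 1)], [0, 0, 1, 1], 2)
print(classify(noisy_sample(0), prototypes),
      classify(noisy_sample(1), prototypes))   # prints: 0 1
```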

How is HD computing being used today?

All this talk of what HD computing is, but what about it in action? Well, there have already been strides toward using it to deliver real value in the real world.

Over the past five years, the Joint University Microelectronics Program (JUMP) CRISP Center at the University of California San Diego (UCSD) has worked with Intel Labs and a host of other industry collaborators to use HD computing to solve the memory and storage challenges posed by COVID-19 wastewater surveillance and personalized recommendation systems.

The research was led by Principal Investigator Tajana Simunic Rosing, Professor of Computer Science and Engineering, and of Electrical and Computer Engineering, at UCSD. When the COVID-19 pandemic hit in 2020, organizations and institutions across the globe were forced to shut down. UCSD, however, continued operating, thanks in part to the joint efforts of the CRISP Center and the UCSD School of Medicine to track virus genomic sequences in wastewater.

Analyzing any microbiome is a data-intensive task, and COVID-19 wastewater surveillance was no different: it generated up to 10TB of data daily and required expensive operations such as sequence alignment, which alone can take days for viral genomic analysis, with months needed to process the full genomic data. A large part of the collaborative research focused on developing novel ways to use HD computing, combined with hardware accelerators, to overcome these memory bottlenecks.

The team found that HD computing accelerated the genomic sequence tracking process from days to hours, which proved vital during the pandemic. They achieved this by using hypervectors to represent human and viral genes, speeding up DNA sequencing in the COVID-19 wastewater surveillance effort by 200x, because the patterns could be compared in parallel, ultimately helping UCSD prevent outbreaks by detecting 85 per cent of cases early.
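
The UCSD team's exact encoding isn't described here, but a common way HD systems represent genomic reads, sketched below as an assumption-laden toy, is to assign each base a random hypervector, bind the bases of each k-mer (with a positional shift), and bundle the k-mers into one hypervector per read. Whole reads can then be compared with a single dot product, which is what makes the comparison so amenable to parallel hardware.

```python
import numpy as np

D = 10_000
rng = np.random.default_rng(2)
BASES = {b: rng.choice([-1, 1], size=D) for b in "ACGT"}  # one hypervector per base

def encode_sequence(seq, k=5):
    # Bind the bases of each k-mer (elementwise multiply, with a cyclic
    # shift encoding each base's position), then bundle all k-mers into
    # a single hypervector representing the whole read.
    total = np.zeros(D)
    for i in range(len(seq) - k + 1):
        kmer_hv = np.ones(D)
        for j, b in enumerate(seq[i:i + k]):
            kmer_hv = kmer_hv * np.roll(BASES[b], j)
        total += kmer_hv
    return total

def similarity(a, b):
    return (a @ b) / (np.linalg.norm(a) * np.linalg.norm(b))

read = "ACGTACGGTTCAAGT"
print(similarity(encode_sequence(read), encode_sequence(read)))               # 1.0
print(similarity(encode_sequence(read), encode_sequence("TTGGCCAATTGGCCA")))  # near 0
```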

It may still be in its infancy, but HD computing holds great potential. From revolutionizing how we create AI to enabling secure, personalized drug discovery, HD computing could be at the forefront of helping us solve some of the hardest challenges we face.
