BrainChip aims to cash in on industrial automation with neuromorphic chip Akida

via Tech Channel News

Dubai: Neuromorphic chips, which mimic how the brain processes information, have been around for a while, but they remain largely in research at Intel, IBM, MIT, Stanford and elsewhere.

Neuromorphic computing is the next level of artificial intelligence: it aims to create systems that work the way neurons fire and interact in the human brain.

IBM has TrueNorth and Intel has Loihi, but both are research chips rather than commercial products.

But one company that is trying to cash in on industrial automation is the Australian-listed tech company BrainChip, which says it is bringing artificial intelligence to the edge in a way that is beyond the capabilities of other products.

“Both IBM and Intel have neuromorphic chips, but our chip is more efficient than others, and we go one step further: it can convert CNNs to SNNs or run native SNNs,” Anil Mankar, Chief Development Officer and Co-Founder at BrainChip, told Tech Channel News.

Convolutional Neural Networks (CNNs) are a category of neural networks that have proven very effective at identifying faces, objects and traffic signs, as well as powering vision in robots and self-driving cars; they are widely used in pattern recognition and image processing.

Spiking Neural Networks (SNNs) are the next generation of machine learning and operate on spikes – discrete events that occur at points in time – rather than continuous values.
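
To make that distinction concrete, here is a minimal sketch in plain Python of a leaky integrate-and-fire neuron, the textbook building block of an SNN; the parameter values are illustrative, and this is not BrainChip’s implementation.

```python
def lif_neuron(input_current, threshold=1.0, leak=0.9):
    """Leaky integrate-and-fire neuron: the membrane potential
    accumulates input and decays each step, emitting a discrete
    spike (1) when it crosses the threshold, then resetting.
    All parameter values here are illustrative."""
    potential = 0.0
    spikes = []
    for current in input_current:
        potential = leak * potential + current  # integrate with leak
        if potential >= threshold:              # threshold crossing
            spikes.append(1)                    # discrete event: a spike
            potential = 0.0                     # reset after firing
        else:
            spikes.append(0)
    return spikes

# A steady input yields a sparse train of discrete spikes rather than
# a continuous activation value.
print(lif_neuron([0.4] * 10))  # [0, 0, 1, 0, 0, 1, 0, 0, 1, 0]
```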

BrainChip has a new neuromorphic processor known as Akida, from the Greek word for spike, which can handle today’s CNN workloads and is ready for the third generation of AI – SNNs.

“There are a lot of algorithms developed for CNNs, and a lot of algorithms that have not yet been developed for SNNs. Akida can have 1.2 million neurons and 10 billion synapses – the connections that carry signals between neurons – compared to Intel’s Loihi chip, which has 128,000 neurons and 100 million synapses, and our efficiency is better than IBM’s and Intel’s,” Mankar said.

No real learning in DLA

The Akida neural processor analyses inputs such as images, sounds, data patterns and other sensor data, and extracts useful information from events in the data.

In a Deep Learning Accelerator (DLA), he said, there is no real learning taking place; the network is simply trained on a dataset.

“When a child sees a dog, his mother has to tell him that it is a dog, and his brain stores it. The next time he sees a dog, he knows that it is a dog. If instead his mother shows him the picture of the dog and says it is a cat, the child’s brain stores it as a cat,” he said.


In a neuromorphic system, however, he said the hardware can learn from the repetitive patterns coming in as spikes, much as the human brain does, and it can learn across all five senses – something a DLA cannot do.

“Our chip can do what a DLA does, such as audio signatures, keyword spotting, and finding or classifying objects in video frames like a CNN. But what neuromorphic hardware does very well is unsupervised learning: finding patterns you don’t know about by looking at the events coming in and learning from them, with on-chip learning, and you can label them afterwards,” he said.
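
As a rough illustration of that unsupervised, label-afterwards workflow – a generic online-clustering sketch, not BrainChip’s algorithm – the snippet below groups incoming event vectors into patterns as they arrive, so a human can attach labels to the discovered patterns later.

```python
import numpy as np

def online_cluster(events, distance_threshold=1.0):
    """Group incoming event vectors into patterns on the fly: each event
    either joins the nearest known pattern (nudging its centre) or founds
    a new one. Labels can be attached to the patterns afterwards."""
    centres, assignments = [], []
    for event in events:
        if centres:
            dists = [np.linalg.norm(event - c) for c in centres]
            k = int(np.argmin(dists))
            if dists[k] < distance_threshold:
                centres[k] = 0.9 * centres[k] + 0.1 * event  # adapt pattern
                assignments.append(k)
                continue
        centres.append(event.astype(float))  # a pattern not seen before
        assignments.append(len(centres) - 1)
    return centres, assignments

rng = np.random.default_rng(1)
events = np.concatenate([rng.normal(0, 0.1, (20, 2)),   # one recurring pattern
                         rng.normal(5, 0.1, (20, 2))])  # a second pattern
centres, assignments = online_cluster(events)
print(len(centres))  # 2 patterns discovered; label them afterwards
```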

With incremental learning, he said, new classifiers can be added to the network without retraining the entire network.
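
The interview does not describe how this works internally; as a hedged sketch of the general idea, the nearest-prototype classifier below adds a new class by storing one averaged feature vector, leaving the rest of the network (here, a frozen feature extractor that is assumed rather than shown) untouched.

```python
import numpy as np

class IncrementalClassifier:
    """Nearest-prototype classifier: each class is one stored feature
    vector, so adding a class is a dictionary insert, not a retrain."""

    def __init__(self):
        self.prototypes = {}  # label -> mean feature vector

    def add_class(self, label, feature_vectors):
        # Learn a new class from a few examples by averaging their
        # feature embeddings; no existing class is modified.
        self.prototypes[label] = np.mean(feature_vectors, axis=0)

    def predict(self, feature_vector):
        # Classify by the nearest stored prototype (Euclidean distance).
        return min(self.prototypes,
                   key=lambda label: np.linalg.norm(
                       self.prototypes[label] - feature_vector))

# In practice the features would come from a frozen backbone network.
clf = IncrementalClassifier()
clf.add_class("dog", [np.array([1.0, 0.1]), np.array([0.9, 0.2])])
clf.add_class("cat", [np.array([0.1, 1.0])])  # added later, no retraining
print(clf.predict(np.array([0.95, 0.15])))    # -> dog
```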

Living on the edge

Neuromorphic hardware is very good at spotting an abnormality in a data pattern without needing to retrain in the cloud, and it can benefit sectors such as security, smart cities, home and building automation, autonomous driving, IoT, medical, drones and biometrics.

At the edge, sensor inputs are analysed at the point of acquisition rather than being transmitted via the cloud to a datacentre.

BrainChip has signed several important partnerships over the past several months, with Ford, Valeo and most recently with Vorago to develop AI solutions for NASA.

The BrainChip Early Access Program is available to a select group of customers that require early access to the Akida device, evaluation boards and dedicated support.
“Many edge-AI processors take advantage of the spatial sparsity in neural network models to eliminate unnecessary computations and save power, but neuromorphic processors achieve further savings by performing event-based computation, which exploits the temporal sparsity inherent in data generated by audio, vision, olfactory, lidar and other edge sensors,” Mankar said.
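
As a rough sketch of the event-based idea Mankar describes – not Akida’s actual pipeline – the loop below computes deltas against the previous frame and touches only the inputs that changed, so the work scales with activity rather than with input size.

```python
import numpy as np

def event_based_step(prev_frame, frame, weights, threshold=0.01):
    """One event-style time step: form deltas against the previous frame
    and accumulate output updates only where the input actually changed,
    skipping the (typically many) static inputs entirely."""
    delta = frame - prev_frame
    events = np.abs(delta) > threshold    # temporal-sparsity mask
    update = np.zeros(weights.shape[1])
    for i in np.flatnonzero(events):      # visit only the active inputs
        update += delta[i] * weights[i]   # one multiply-accumulate row each
    return update, int(events.sum())

rng = np.random.default_rng(0)
weights = rng.normal(size=(1000, 10))
prev = rng.random(1000)
frame = prev.copy()
frame[:5] += 0.5                          # only 5 of 1000 inputs change
update, n_events = event_based_step(prev, frame, weights)
print(n_events)                           # 5 -> 5 rows of work, not 1000
```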

Moreover, he said that neuromorphic chips can play a big part in machine learning when they sit close to the sensor, making the sensor itself smart.

“What we have taken so far are baby steps: making algorithms that mimic how the brain does something at low power. The industry is still very niche, and it will take hundreds of years for us to understand the brain, because it is complex and there are columns in the brain. Nobody can match the human capacity to detect objects,” he said.

A human brain has about 86 billion neurons and 100 trillion synapses, and it runs on roughly 20W of power.

Product commercialisation

“Neuromorphic hardware will allow machines to do exactly what the five senses of a human being – sight, hearing, smell, touch and taste – do: sense, differentiate and classify.

“Everybody is moving away from computation-based deep learning to neuromorphic. Intel has a research chip, and researchers are using it to find new algorithms for neuromorphic hardware,” he said.

However, Mankar said the company is not going after research but is commercialising the product for current applications.

The 28nm chip costs close to $15 in mass volumes.

BrainChip does not want to rest on its laurels and has already started R&D on a next-generation neuromorphic chip with memory.