On a Path to Artificial Brains

via EETimes

Now that computers are ubiquitous, we are looking for more ways to delegate to them the tasks that we were previously content to perform ourselves: operating industrial robots, driving cars and detecting disease. That transition has propelled advances in new approaches, including neuromorphic computing.

Since computers were expensive and rare in the early days, it made sense for engineers to focus on applications at which humans were inherently poor: the things we couldn’t possibly do quickly without machines, like performing complex arithmetic and mathematical operations. From an engineering standpoint, this meant favoring the precision and programmability of zeroes and ones, ANDs and NOTs, and separate memory and processing over the approaches used by more analog and distributed platforms.

Hence, it is no surprise that the current shift toward intelligent machines is making us rethink our engineering priorities. Speed in deciding when and how much to brake a car is much more important than absolute precision in deciding whether the animal in your path is a dog or a cat. In addition, if the intelligence is to be embodied in a phone, a car or a robot, then computational density and battery life must both be maximized.

It may have taken three decades but, because of this priority shift, neuromorphic computing – which uses lessons from biology to build more efficient, brain-like machines – is finally starting to take off. In the five years ending in 2019, $2 billion was invested in neuromorphic companies. That estimate does not include internal spending by the likes of IBM, Intel and Samsung.

This EE Times Special Project opens with my introduction to the basic concepts of neuromorphic electronics, exploring why the approach offers large potential advantages in speed, weight, area and power for neural processing.

We also examine the engineering trade-offs.

Next, we take a deep dive into the commercial development of neuromorphic architectures and emerging applications. Peter van der Made describes work within his company, BrainChip, on keyword spotting, examining why this could be among the first applications for neuromorphic computing.

For now, neuromorphic engineering is particularly suited to real-time sensory processing. Because this is such a critical area, our package includes two stories. The first is a report from the field by EE Times Europe Editor-in-Chief Anne-Françoise Pelé exploring vision sensors in mobile phones and on production lines. The second is the personal perspective of Tobi Delbrück, one of neuromorphic engineering founder Carver Mead’s PhD students and a professor at the Institute of Neuroinformatics in Zurich, Switzerland. In his spare time, Delbrück has also founded three neuromorphic start-ups.

We also take a closer look at the ideal hardware types for the kinds of sophisticated machines capable of approaching human intelligence. Chris Eliasmith and Terry Stewart of Applied Brain Research and the University of Waterloo, Canada, have built models and even software that would allow this kind of intelligence to be implemented. They explore the hardware architectures needed to make it happen, including the neuromorphic chips they’ve already tested.

In my new EE Times blog, Brains and Machines, I’ll take a look at the potential for improving the connectivity (and computational power) of silicon brains using various 3D fabrication techniques. This was partly inspired by a conversation I had with Kwabena Boahen last year in which we discussed, among other things, the importance of neuron density in the human brain.

Prophesee’s Jean-Luc Jaffard explains in his story, Bringing Neuromorphic Vision to the Edge, how the industry has begun to see the use of neuromorphic vision proliferate, thanks to its ability to significantly improve the functionality and performance of machine vision-enabled systems while reducing their power consumption.

Finally, to address the possibility that conventional technology could make neuromorphic approaches unnecessary, Sally Ward-Foxton ponders whether the chips designed to accelerate inference and similar tasks are likely to compete with – and possibly beat – their brain-inspired counterparts.