09 Sep 2020 – BrainChip Holdings Limited (ASX:BRN) President & CEO Louis DiNardo provides an update on collaboration and partnerships for the company’s AI chip, Akida.
WELCOME TO BRAINCHIP
BrainChip’s Akida™ is a revolutionary advanced neural networking processor that brings artificial intelligence to the edge in a way that existing technologies cannot. The solution is high-performance, small, and ultra-low power, and it enables a wide array of edge capabilities, including continuous learning and inference.
INCREMENTAL LEARNING WITH AKIDA
This video demonstrates the power of incremental learning on Akida with one-shot learning examples.
Interview & Presentation from FNN online event, April 21, 2020 (ASX:BRN)
BrainChip Holdings (ASX:BRN) CEO Lou DiNardo discusses the Akida integrated circuit, its advantages in event-based convolutional (e-conv) networks, its applications, and the company’s move to wafer production and first engineering samples by Q3 2020.
BRAINCHIP & TATA ROBOT VIDEO DEMO
In this demonstration, the BrainChip Engineering team shows human gesture recognition using spiking input on the Akida™ neuromorphic platform – NeurIPS Vancouver 2019.
BrainChip provides a Market Update on March 25, 2020. This update precedes the Company’s lodging of the March quarter 4C, as the global impact of the coronavirus has led to numerous investor enquiries.
ACTUAL TECH MEDIA ECOCAST: SUPPORTING EMERGING AI, ML, AND DATA SCIENCE INITIATIVES
This presentation by BrainChip COO Roger Levinson covers BrainChip’s Akida™, which brings highly efficient AI processing capabilities to edge devices such as motion sensors, medical devices, automotive systems, and drones.
EDA CAFE: BRAINCHIP
In this interview, BrainChip COO Roger Levinson shares with EDA Cafe how BrainChip is taking AI to the edge.
AKIDA RUNNING MOBILENET ON IMAGENET
The Akida Development Environment running the full MobileNet V1 on the ImageNet 1000-class dataset.
AKIDA GESTURE RECOGNITION WITH INCREMENTAL LEARNING USING SAMSUNG DVS CAMERA
This video demonstrates how the Akida event-domain neural processor can learn and respond to new hand gestures using the Samsung DVS camera as input. The entire neural processing solution consumes less than 1 mW.