
IBM Research Europe Unveils Game-changing 64-core Mixed-signal In-memory Computing Chip

Scientists at IBM Research Europe have pioneered a new realm of in-memory computing with a 64-core chip built on phase-change memory devices. The technology is designed to accelerate deep neural networks, preserving the accuracy of deep learning algorithms while significantly reducing computation time and power consumption.

Manuel Le Gallo, co-author of the research paper, shared that the team had been exploring phase-change memory (PCM) devices for computation for over seven years. The journey began with a demonstration that neuronal functions could be implemented using individual PCM devices. Since then, IBM Research Europe has shown that PCM devices can benefit computing domains such as scientific computing and deep neural network inference. With their latest chip, the researchers have moved a step closer to an end-to-end analog AI inference accelerator.

Le Gallo and his collaborators accomplished this feat by combining PCM-based cores with digital processing units, linking the two via an on-chip digital communication network. The resulting chip integrates 64 analog PCM-based cores, each incorporating a 256-by-256 crossbar array of synaptic unit cells.
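The appeal of such a crossbar is that it computes an entire matrix-vector product in a single analog step: weights are stored as device conductances, inputs are applied as voltages, and each column's output current sums the products by Kirchhoff's current law. The following sketch is a purely illustrative digital model of that principle (the sizes match the article; the noise model and values are assumptions, not IBM's measured device behaviour):

```python
import numpy as np

rng = np.random.default_rng(0)

CROSSBAR_SIZE = 256  # each of the chip's 64 cores holds a 256x256 array

# Conductance matrix G standing in for one core's synaptic unit cells
# (illustrative values; real PCM conductances are bounded and drift over time).
G = rng.uniform(0.0, 1.0, size=(CROSSBAR_SIZE, CROSSBAR_SIZE))

# Input voltages V encoding one activation vector.
V = rng.uniform(0.0, 1.0, size=CROSSBAR_SIZE)

# The analog array produces all 256 column currents in parallel;
# digitally, that is a single matrix-vector multiply: I = G^T V.
I = G.T @ V

# PCM devices are noisy, so a toy read-noise term models why analog
# results must be converted and cleaned up digitally afterwards.
I_noisy = I + rng.normal(0.0, 0.01, size=CROSSBAR_SIZE)

print(I_noisy.shape)
```

Because the multiply-accumulate happens where the weights are stored, no weight data moves between memory and processor, which is the source of the time and energy savings the article describes.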

Compact, time-based analog-to-digital converters integrated into each core bridge the analog and digital domains. Le Gallo explained that each core also contains lightweight digital processing units that carry out rectified linear unit (ReLU) neuronal activation functions and scaling operations. Additionally, a global digital processing unit located in the centre of the chip facilitates long short-term memory (LSTM) network operations.
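Conceptually, the per-core digital post-processing takes the raw integer codes produced by the ADC, rescales them into real-valued activations, and applies the ReLU nonlinearity. A minimal sketch, assuming a simple affine scaling (the function name and parameters are illustrative, not from the paper):

```python
import numpy as np

def adc_postprocess(adc_codes: np.ndarray, scale: float, offset: float = 0.0) -> np.ndarray:
    """Rescale raw ADC codes to activations, then apply ReLU.

    Hypothetical stand-in for the lightweight digital processing units
    the article describes; scale/offset values are assumptions.
    """
    activations = adc_codes * scale + offset   # scaling operation
    return np.maximum(activations, 0.0)        # rectified linear unit

codes = np.array([-5, -1, 0, 2, 7], dtype=np.int32)
out = adc_postprocess(codes, scale=0.5)
print(out)  # ReLU zeroes the negatives: 0, 0, 0, 1.0, 3.5
```

Keeping this step in cheap digital logic lets the analog cores handle only the expensive matrix-vector products, while nonlinearities stay exact.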

A defining characteristic of IBM's in-memory computing chip is the digital communication network connecting the memory cores to the global processing unit. This design enables the chip to execute all computations associated with the individual layers of a neural network on-chip, resulting in dramatic reductions in computation time and power consumption.

To assess the new chip's effectiveness, the IBM research team conducted a comprehensive study, applying deep learning algorithms on their chip and evaluating its performance. The results were encouraging, with deep neural networks trained on the CIFAR-10 image dataset for image recognition tasks achieving an outstanding accuracy rate of 92.81% when run on the chip.
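For context, a top-1 accuracy figure such as the reported 92.81% is typically computed by comparing the network's highest-scoring class against the ground-truth label for each test image. The sketch below uses synthetic stand-in data, not IBM's experimental results:

```python
import numpy as np

def top1_accuracy(logits: np.ndarray, labels: np.ndarray) -> float:
    """Fraction of samples whose highest-scoring class matches the label."""
    predictions = logits.argmax(axis=1)
    return float((predictions == labels).mean())

rng = np.random.default_rng(42)
labels = rng.integers(0, 10, size=1000)       # CIFAR-10 has 10 classes
logits = rng.normal(size=(1000, 10))
logits[np.arange(1000), labels] += 5.0        # bias toward correct answers

print(f"{top1_accuracy(logits, labels):.2%}")
```

The interesting point in the study is not the metric itself but that this accuracy was retained while the matrix operations ran on noisy analog hardware.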

Undoubtedly, the accomplishments of IBM Research Europe are a leap forward in the development of Analog In-Memory Computing (AIMC) chips that can efficiently meet the requirements of deep learning algorithms. In the coming years, the architecture introduced by Le Gallo and his team could be refined to deliver even better performance.

Beyond the breakthrough itself, the potential use cases for this technology in no-code and low-code environments should not be overlooked. Scalable, high-performance in-memory computing chips could offer considerable value by integrating with no-code platforms such as AppMaster. Such integration could significantly improve machine learning implementations, offering lower latency, higher speed, and improved efficiency to users.
