ArterisIP Advances Machine Learning SoC Design with Ncore 2.0 Cache Coherent Interconnect and Resilience Package

SoC interconnect IP enables highly scalable neural network systems with integrated hardware functional safety features for ISO 26262 ASIL D compliance

Linley Autonomous Hardware Conference 2017, SANTA CLARA, Calif. — April 6, 2017 — ArterisIP, the innovative supplier of silicon-proven commercial system-on-chip (SoC) interconnect IP, today announced the Ncore 2.0 Cache Coherent Interconnect IP and the optional Ncore Resilience Package to accelerate and enhance the creation of next-generation designs for autonomous driving systems and advanced driver assistance systems (ADAS).

Ncore 2.0 offers new capabilities:

Ncore is a distributed, heterogeneous cache coherent interconnect that enables SoC design teams to easily integrate custom processing elements using embedded low-latency proxy caches (also called “I/O caches”). In neural network machine learning SoCs, where workloads are typically partitioned across different processing elements, low-latency proxy caches offer a more hardware- and software-efficient way for those elements to communicate than fixed internal SRAMs or scratchpad memories. These types of architectures are at the core of autonomous driving systems.
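As a rough illustration of the software-efficiency point above, the following minimal C sketch contrasts a scratchpad-style handoff, where software must explicitly copy a data tile into and out of a fixed local SRAM, with a cache-coherent handoff through a proxy cache, where software simply shares a pointer and the interconnect hardware moves only the cache lines actually touched. This is a hypothetical model for illustration only; the buffer names, tile size, and doubling operation are assumptions, not ArterisIP code or API.

/*
 * Hypothetical illustration (not ArterisIP code): compares the software
 * overhead of a scratchpad-based handoff with a cache-coherent handoff
 * between two processing elements sharing a data tile.
 */
#include <stddef.h>
#include <string.h>

#define TILE_WORDS 1024  /* illustrative tile size */

/* Scratchpad model: software explicitly copies data into a fixed local
 * SRAM, computes on it, then copies the result back out. */
void scratchpad_handoff(const int *dram_src, int *scratchpad, int *dram_dst)
{
    memcpy(scratchpad, dram_src, TILE_WORDS * sizeof(int)); /* copy in   */
    for (size_t i = 0; i < TILE_WORDS; i++)                 /* compute   */
        scratchpad[i] *= 2;
    memcpy(dram_dst, scratchpad, TILE_WORDS * sizeof(int)); /* copy out  */
}

/* Coherent model: with a proxy (I/O) cache keeping the processing element
 * coherent, software passes a pointer and computes in place; no explicit
 * staging copies are needed in software. */
void coherent_handoff(int *shared_buffer)
{
    for (size_t i = 0; i < TILE_WORDS; i++)
        shared_buffer[i] *= 2;
}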

“We engineered the Ncore 2.0 cache coherent interconnect and the accompanying Ncore Resilience Package based on feedback from our customers, who are creating the world’s most efficient machine learning systems that will power the future of autonomous driving,” said K. Charles Janac, President and CEO of ArterisIP. “Our IP was specifically created for autonomous hardware SoC designers, enabling them to simultaneously meet the competing needs of design complexity and functional safety.”

Supporting Quotations

Sanechips (Subsidiary of ZTE):

“We are very impressed with the new ArterisIP Ncore interconnect IP technology,” said Mr. Yu Li, VP at Sanechips (Subsidiary of ZTE). “ArterisIP Ncore 2.0 interconnect IP offers even higher scalability along with the Coherent Memory Cache, which reduces DRAM accesses while maintaining area efficiency.”

The Linley Group:

“Designers of machine-learning SoCs must integrate heterogeneous processor cores and accelerators into a complex system that can handle the high data bandwidth and low latency required for such applications,” said Mike Demler, senior analyst for the Linley Group. “ArterisIP’s new Ncore 2.0 interconnect IP with resiliency features addresses these issues by enabling designers to implement heterogeneous cache-coherent machine-learning architectures for applications such as the rapidly developing autonomous-vehicle market.”

ResilTech:

“Implementing functional safety mechanisms in SoC interconnect hardware is imperative for highly complex machine learning SoCs,” said Andrea Bondavalli, Professor of Computer Science and head of the Resilient Computing Lab at the University of Florence, and scientific advisor to ResilTech, a leading functional safety consultancy. “Doing so simplifies software development and maintenance while freeing scarce processing resources to perform functional work rather than safety checking.”

About ArterisIP

ArterisIP provides system-on-chip (SoC) interconnect IP to accelerate SoC semiconductor assembly for a wide range of applications from IoT to mobile phones, cameras, automobiles, SSD controllers and servers, for customers such as Samsung, Huawei / HiSilicon, Mobileye, Altera (Intel), and Texas Instruments. ArterisIP products include the Ncore cache coherent and FlexNoC non-coherent interconnect IP, as well as the optional Resilience Package (functional safety) and PIANO automated timing closure capabilities. Customer results obtained by using the ArterisIP product line include lower power, higher performance, more efficient design reuse and faster SoC development, leading to lower development and production costs. For more information, visit www.arteris.com or find us on LinkedIn at https://www.linkedin.com/company/arteris.