Successful conclusion to EU’s three-year TULIPP project for embedded image processing and vision applications

Palaiseau, France – July 11, 2019. The Tulipp (Towards Ubiquitous Low-power Image Processing Platforms) Consortium has announced a highly successful conclusion to the EU’s three-year project. Beginning in January 2016, the Tulipp project targeted the development of high-performance, energy-efficient systems for the growing range of complex, vision-based image processing applications. The Tulipp project was funded with nearly €4 million from Horizon 2020, the European Union’s biggest research and innovation programme to date.

The conclusion of the Tulipp project sees the release of a comprehensive reference platform for vision-based embedded system designers, enabling computer vision product designers to readily address the combined design constraints of low power, low latency, high performance and real-time image processing. The Tulipp reference platform includes a full development kit, comprising an FPGA-based embedded multicore computing board, a parallel real-time operating system and a development tool chain with guidelines, coupled with ‘real-world’ Use Cases focusing on diverse applications such as medical X-ray imaging, driver assistance and autonomous drones with obstacle avoidance. The complete Tulipp ecosystem was demonstrated earlier in the year to vision-based system designers in a series of hands-on tutorials.

“The Tulipp project has achieved all of its objectives,” said Philippe Millet of Thales and Tulipp’s Project Co-ordinator. “By taking a diverse range of application domains as the basis for defining a common reference processing platform that captures the commonality of real-time, high-performance image processing and vision applications, it has successfully addressed the fundamental challenges facing today’s embedded vision-based system designers.”

Developed by Sundance Multiprocessor Technology, each instance of the Tulipp processing platform is 40mm x 50mm and is compliant with the PC/104 embedded processor board standard. The hardware platform utilizes the powerful multicore Xilinx Zynq UltraScale+ MPSoC which contains, along with the Xilinx FinFET+ FPGA, an ARM Cortex-A53 quad-core CPU, an ARM Mali-400 MP2 Graphics Processing Unit (GPU), and a real-time processing unit (RPU) containing a dual-core ARM Cortex-R5 32-bit real-time processor based on the ARM v7-R architecture. A separate expansion module (VITA 57.1 FMC) allows application-specific boards with different flavours of input and output interfaces to be created while keeping the interfaces with the processing module consistent.

Coupled with the Tulipp hardware platform is a parallel, low-latency embedded real-time operating system developed by Hipperos specifically to manage complex multi-threaded embedded applications in a predictable manner. Precise real-time co-ordination sustains a high frame rate without missed deadlines or dropped data. Additionally, to facilitate the efficient development of image processing applications on the Tulipp hardware, and to help vision-based system designers understand the impact of their functional mapping and scheduling choices on the available resources, the Tulipp reference platform has been extended with performance analysis and power measurement features developed by Norges Teknisk-Naturvitenskapelige Universitet (NTNU) and Technische Universität Dresden (TUD) and implemented in the Tulipp STHEM toolchain.
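
As a purely illustrative aside (this is not the STHEM toolchain itself, and the pipeline stage shown is hypothetical), the minimal Python sketch below indicates the kind of per-stage timing data such analysis features expose, which is what lets a designer see where a particular mapping or scheduling choice costs latency.

```python
# Minimal illustrative sketch, not the STHEM toolchain: time each stage of a
# simple image pipeline so the latency cost of a mapping or scheduling choice
# becomes visible. The stage function passed in is hypothetical.
import time

def timed_stage(name, func, *args):
    """Run one pipeline stage and report its wall-clock latency in milliseconds."""
    start = time.perf_counter()
    result = func(*args)
    elapsed_ms = (time.perf_counter() - start) * 1000.0
    print(f"{name}: {elapsed_ms:.2f} ms")
    return result

# Hypothetical usage: frame = timed_stage("denoise", denoise_filter, raw_frame)
```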

The insights of the Tulipp Consortium’s experts have also been captured in a set of guidelines, consisting of practical advice, best-practice approaches and recommended implementation methods, to help vision-based system designers select the optimal implementation strategy for their own applications. These guidelines will become a Tulipp book, to be published by Springer by the end of 2019 and supported by endorsements from the growing ecosystem of developers currently testing the concept.

To further demonstrate the applicability of a common reference processing platform that captures the commonality of real-time, high-performance image processing and vision applications – comprising the hardware, the operating system and a programming environment – Tulipp has also developed three ‘real-world’ Use Cases in distinctly diverse application domains: medical X-ray imaging, automotive Advanced Driver Assistance Systems (ADAS) and Unmanned Aerial Vehicles (UAVs).

Tulipp’s medical X-ray imaging Use Case demonstrates advanced image enhancement algorithms for X-ray images running at high frame rates. It focuses on improving the performance of X-ray imaging Mobile C-Arms, which provide an internal view of a patient’s body in real time during the course of an operation, delivering increases in surgeon efficiency and accuracy with minimal incision sizes, aiding faster patient recovery, lowering nosocomial disease risks and reducing by 75% the radiation doses to which patients and staff are exposed.
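
The consortium’s actual enhancement algorithms are not detailed in this announcement; as a hedged illustration of the kind of processing a low-dose use case implies, the sketch below chains a standard denoising step with local contrast enhancement using OpenCV. The parameter values, and the assumption of an 8-bit greyscale input frame, are illustrative only.

```python
# Illustrative sketch only, not the Tulipp medical Use Case's algorithm:
# a generic low-dose X-ray style enhancement chain using OpenCV.
import cv2

def enhance_xray_frame(frame_gray):
    """Denoise a noisy (low-dose) 8-bit greyscale frame, then boost local contrast."""
    # Non-local means denoising suppresses the extra noise that comes with a
    # reduced radiation dose.
    denoised = cv2.fastNlMeansDenoising(frame_gray, None, h=10,
                                        templateWindowSize=7,
                                        searchWindowSize=21)
    # CLAHE (adaptive histogram equalisation) restores local contrast so fine
    # anatomical detail remains visible.
    clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
    return clahe.apply(denoised)
```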

ADAS adoption is dependent on the implementation of vision systems, or on combinations of vision and radar, and the algorithms must be capable of integration into a small, energy-efficient Electronic Control Unit (ECU). An ADAS algorithm should be able to process a video stream with a frame size of 640x480 at a full 30Hz, or at least at half that rate. The Tulipp ADAS Use Case demonstrates real-time pedestrian recognition based on the Viola & Jones algorithm. Using the Tulipp reference platform, the ADAS Use Case achieves a processing time of 66ms per frame, which means the algorithm meets the target of running on every second image when the camera runs at 30Hz.
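
As an illustration of the published figures rather than the consortium’s optimised implementation, the sketch below runs OpenCV’s stock Viola & Jones-style cascade classifier on every second frame of a 640x480, 30Hz stream. The cascade file name and camera index are assumptions based on the standard OpenCV distribution.

```python
# Minimal sketch of Viola & Jones-style pedestrian detection with OpenCV,
# processing every second frame of a 30Hz, 640x480 stream (a 66ms budget
# per processed frame). Cascade file and camera index are assumptions.
import cv2

cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_fullbody.xml")

cap = cv2.VideoCapture(0)                 # assumed camera index
cap.set(cv2.CAP_PROP_FRAME_WIDTH, 640)
cap.set(cv2.CAP_PROP_FRAME_HEIGHT, 480)

frame_index = 0
while True:
    ok, frame = cap.read()
    if not ok:
        break
    frame_index += 1
    if frame_index % 2:                   # skip every other frame: 15Hz processing
        continue
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    pedestrians = cascade.detectMultiScale(gray, scaleFactor=1.1,
                                           minNeighbors=3, minSize=(48, 96))
    for (x, y, w, h) in pedestrians:      # mark each detection
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)

cap.release()
```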

Tulipp’s UAV Use Case demonstrates a real-time obstacle avoidance system for UAVs based on a stereo camera setup with the cameras orientated in the direction of flight. Although such drones are often described as autonomous, most current systems are still remotely piloted by humans. The Use Case uses disparity maps, computed from the camera images, to locate obstacles in the flight path and to steer the UAV around them automatically. This capability is a necessary step towards fully autonomous drones.
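
To illustrate the disparity-map principle only (this is not the Tulipp flight code; the block-matching parameters, the central ‘flight path’ region and the obstacle threshold are illustrative assumptions), a minimal OpenCV sketch might look like this:

```python
# Illustrative sketch of stereo-based obstacle detection: compute a disparity
# map from a rectified greyscale stereo pair and flag anything close ahead.
import cv2
import numpy as np

def obstacle_ahead(left_gray, right_gray, disparity_threshold=48.0):
    """Return True if a nearby obstacle appears in the assumed flight-path region."""
    # Semi-global block matching on a rectified pair; larger disparity means
    # a closer object.
    matcher = cv2.StereoSGBM_create(minDisparity=0, numDisparities=64, blockSize=9)
    disparity = matcher.compute(left_gray, right_gray).astype(np.float32) / 16.0

    # Look only at the central region, taken here as roughly the flight path.
    h, w = disparity.shape
    centre = disparity[h // 4:3 * h // 4, w // 4:3 * w // 4]

    # If a noticeable fraction of that region exceeds the disparity threshold,
    # treat it as an obstacle to steer around.
    return float(np.mean(centre > disparity_threshold)) > 0.05
```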

“As image processing and vision applications grow in complexity and diversity, and become increasingly embedded by their very nature, vision-based system designers need to know that they can simply and easily solve the design constraint challenges of low power, low latency, high performance and reliable real-time image processing that face them,” concluded Philippe Millet. “The EU’s Tulipp project has delivered just that. Moreover, the ecosystem of stakeholders that we have created along the way will ensure that it will continue to deliver in the future. Tulipp will truly leave a legacy.”

# # #

About TULIPP and its Partners

Tulipp (Towards Ubiquitous Low-power Image Processing Platforms) was funded by the European Union’s Horizon 2020 programme. It began its work in 2016 and concluded in 2019. Its focus was the development of high-performance, energy-efficient embedded systems for the growing range of increasingly complex image processing applications emerging across a broad range of industry sectors. Tulipp concentrated on providing vision-based system designers with a reference platform that defines implementation rules and interfaces designed to tackle power consumption issues while delivering guaranteed, high-performance computing power. For more information on Tulipp, please visit: http://www.tulipp.eu. For further information on the Tulipp consortium members, see:

 

Thales - www.thalesgroup.com

Efficient Innovation SAS - www.efficient-innovation.fr

Fraunhofer IOSB – www.iosb.fraunhofer.de

Hipperos – www.hipperos.com

Norges Teknisk-Naturvitenskapelige Universitet – www.ntnu.no

Technische Universität Dresden – tu-dresden.de

Sundance Multiprocessor Technology – www.sundance.com

Synective Labs – www.synective.se


