Lately, attention is turning towards self-driving cars again. Some people envision driverless vehicles on the road within the next couple of years; others struggle to trust a “robot” with their lives. While some fear artificial intelligence with evil intentions, it might be time to ask what ultimate goals drive us, and the teams at the heart of the AD and ADAS industry, forward.

Fasten your seatbelts

The numbers are pretty brutal to start with, so I’d recommend buckling up and keeping your lane-keeping and emergency brake assist on. Let’s glance at how things stand.

It is estimated that fatal and…


aiSim started development on the premise that extensive testing is essential to ensure the safety and reliability of any automated driving product. aiSim not only simulates the model space but also generates accurate virtual sensor data, including camera images. All of this can run in real time, even for the complex sensor setups required for L4 robotaxis, in closed loop, reacting to live control signals.
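The “closed loop” is the key difference from simply replaying recorded data: the virtual world only advances once it has received the control output computed from its own sensor data. Below is a minimal sketch of that loop; every type, function, and number is purely illustrative and is not aiSim’s actual API.

```cpp
#include <cstdint>
#include <cstdio>
#include <vector>

// Hypothetical stand-in types; aiSim's real interfaces are not shown here.
struct SensorFrame { std::vector<std::uint8_t> cameraImage; double timestampSec; };
struct ControlCommand { double steeringRad = 0.0; double accelMps2 = 0.0; };

// Placeholder simulator step: advances the virtual world, applying the last
// control command, and returns freshly rendered sensor data.
SensorFrame simulateStep(const ControlCommand& /*lastCmd*/, double t) {
    return SensorFrame{std::vector<std::uint8_t>(1920 * 1080 * 3), t};
}

// Placeholder AD stack: computes a new command from the latest sensor frame.
ControlCommand runAdStack(const SensorFrame& /*frame*/) {
    return ControlCommand{};
}

int main() {
    ControlCommand cmd;
    const double dt = 1.0 / 30.0;             // e.g. a 30 Hz camera pipeline
    for (int step = 0; step < 300; ++step) {  // 10 simulated seconds
        // Closed loop: the virtual world reacts to the command the AD
        // software derived from the previous frame's sensor data.
        SensorFrame frame = simulateStep(cmd, step * dt);
        cmd = runAdStack(frame);
    }
    std::printf("closed-loop run finished\n");
}
```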

Written by: Gábor Könyvesi

aiSim has large-scale Software-in-the-Loop (SiL) testing capabilities, where integration with AD software means that the program modules associated with hardware interfacing are replaced with modules interfacing with the simulated world…
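One way to picture this module swap is through a shared interface with a hardware-backed and a simulation-backed implementation. The sketch below uses hypothetical names; it is not aiSim’s or any particular AD stack’s actual API.

```cpp
#include <cstdint>
#include <vector>

// Hypothetical camera interface that the AD software is written against.
struct ICameraSource {
    virtual ~ICameraSource() = default;
    virtual std::vector<std::uint8_t> captureFrame() = 0;
};

// In-vehicle build: the module talks to the physical camera driver.
struct HardwareCamera : ICameraSource {
    std::vector<std::uint8_t> captureFrame() override {
        // ...read a frame from the real sensor via the vendor driver...
        return {};
    }
};

// SiL build: the same interface, but frames come from the simulated world.
struct SimulatedCamera : ICameraSource {
    std::vector<std::uint8_t> captureFrame() override {
        // ...request a rendered image for the virtual camera pose...
        return {};
    }
};

// The rest of the AD software never needs to know which one it received,
// so identical perception/planning/control code runs in both setups.
void processFrame(ICameraSource& camera) {
    auto frame = camera.captureFrame();
    (void)frame;  // ...perception, planning, control...
}

int main() {
    SimulatedCamera simCam;
    processFrame(simCam);  // in the SiL harness
}
```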


aiMotive and NI collaborate to integrate NI VeriStand with aiMotive’s automated driving simulator (aiSim) to create an end-to-end Hardware-in-the-Loop (HiL) setup for automated driving systems, even for the most complex sensor configurations.

Budapest, Hungary, July 1, 2021 — One of the biggest challenges in validating an automated driving system is the real-time simulation of the entire end-to-end system, with all sensors modelled accurately in a realistic 3D environment. aiMotive, a technology leader in automated driving (AD), and NI collaborated to integrate NI VeriStand with aiMotive’s AD simulator (aiSim) in an end-to-end HiL test setup that models multiple high-definition cameras…


Technology leadership claims can often be “economical” with the truth, and this is particularly true for NPUs (Neural Processing Units). Overstated performance can lead to procurement decisions that fall short of expectations.

Written by Tony King-Smith

The result: missed engineering targets, missed deadlines, and escalating R&D costs. For its aiWare NPU, AImotive focuses on benchmarking NPU efficiency to give AI engineers the best possible information. Many other NPU suppliers quote raw TOPS and hardware utilization as a measure of performance. What’s the difference, and why does it matter?
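To see why the distinction matters, here is a back-of-the-envelope comparison of headline peak TOPS versus effective throughput on a real workload. The figures are purely illustrative and do not describe any actual NPU.

```cpp
#include <cstdio>

int main() {
    // Illustrative numbers only; not vendor benchmarks.
    const double peakTopsA = 100.0, efficiencyA = 0.30;  // big headline TOPS, low efficiency
    const double peakTopsB =  64.0, efficiencyB = 0.90;  // smaller peak, high efficiency

    // What the AI engineer actually gets on the target NN workload is the
    // effective throughput, not the datasheet peak.
    std::printf("NPU A: %.1f effective TOPS\n", peakTopsA * efficiencyA);  // 30.0
    std::printf("NPU B: %.1f effective TOPS\n", peakTopsB * efficiencyB);  // 57.6
}
```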


Respected automotive executive joins Budapest-based automated driving company

Budapest, Hungary, June 3, 2021 — AImotive, a leading automated driving technology company based in Budapest, today announced that Arnaud Lagandré will be joining its team as Chief Commercial Officer. Arnaud Lagandré brings over 25 years of international automotive experience, gathered in European, Asian, and North American positions at Continental, to the Budapest-based team.

The executive has a deep understanding of the ADAS and automated driving markets, having spent the past six years leading Continental’s ADAS LiDAR segment in Santa Barbara, California. Arnaud led a team of engineers working on developing a…


The 4th generation of aiWare™ automotive NPU hardware IP delivers up to 64 TOPS per core, leveraging innovative wavefront-processing algorithms and an upgraded memory architecture for dramatically improved PPA* and enhanced built-in ISO 26262 safety support

Budapest, Hungary, 27th May 2021 — AImotive, one of the world’s leading suppliers of scalable modular automated driving technologies, today announced the latest release of its award-winning aiWare NPU hardware IP. Featuring substantial upgrades to on-chip memory architecture, innovative new wavefront-processing algorithms and enhanced ISO26262-compliant safety features, aiWare4 delivers the ultimate scalable solution from the most challenging single-chip edge applications to the highest performance central…


Simulating the sensor modalities used in automated driving efficiently and accurately is an immense challenge. The best solution is to rely on GPU-accelerated raytracing techniques and the efficient distribution of tasks. Today, the Vulkan API is the only tool that can support such a system, which is why we’ve integrated it into aiSim.

Written by Balázs Teréki

Seven years ago, our team set out on a mission to create the world’s first ISO26262 certified automotive-grade simulator for automated driving development. This was when the first versions of aiSim were born, but creating a simulator capable of supporting automotive development was challenging.

Accurately and efficiently simulating all major sensor modalities used in automated driving requires robust hardware and complex code. For example, an automotive LiDAR sensor will emit millions of laser rays each second. Raytracing technology makes simulating this possible, but not automatically efficient.
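To get a rough sense of scale, consider a hypothetical spinning LiDAR (the parameters below are illustrative and do not describe any specific product):

```cpp
#include <cstdio>

int main() {
    // Hypothetical spinning-LiDAR parameters, for illustration only.
    const int    channels        = 64;    // vertical laser channels
    const int    azimuthSteps    = 1800;  // 0.2-degree horizontal resolution
    const double rotationsPerSec = 20.0;  // scan rate

    const double raysPerSecond = channels * azimuthSteps * rotationsPerSec;
    std::printf("Primary rays per second: %.1f million\n", raysPerSecond / 1e6);
    // ~2.3 million primary rays/s, before any secondary bounces the raytracer
    // may trace for reflections or multi-return modelling.
}
```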

A crucial element of aiSim 3.0’s efficiency is scheduling the…


In 2020, AImotive’s aiSim™ simulator was certified to TCL 3 according to ISO 26262:2018 by TÜV-Nord, making it the world’s first comprehensive ISO 26262-certified simulator for the development of automated driving solutions. Let’s go a layer deeper and see what’s under the hood: what makes aiSim’s rendering engine unique.

Written by: Zoltán Hortsin

Game engines vs. sensor simulation: why “physically-based” is our motto

Most game engines focus on spectacular visuals, but they are not necessarily physically correct. In developing aiSim, we aim for physically-based image synthesis, because for automated driving solutions the main emphasis is not on the aesthetic experience but on being as realistic as possible. This does not mean that aiSim cannot handle spectacular effects: they can be turned on and off, and how many of these visuals are rendered can be set by users depending on the use case. Another element that is not visible initially is that our…
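As a purely hypothetical sketch of what such per-use-case toggles could look like (this is not aiSim’s configuration format or API):

```cpp
#include <string>

// Hypothetical render-effect settings; not aiSim's actual configuration.
struct RenderEffects {
    bool lensFlare    = false;
    bool bloom        = false;
    bool motionBlur   = false;
    int  maxWeatherFx = 0;  // e.g. number of rain-spray particle effects
};

// A perception regression test might disable cosmetic effects entirely,
// while a demo scenario might enable them for visual fidelity.
RenderEffects effectsForUseCase(const std::string& useCase) {
    RenderEffects fx;
    if (useCase == "demo") {
        fx.lensFlare = fx.bloom = fx.motionBlur = true;
        fx.maxWeatherFx = 500;
    }
    return fx;
}

int main() {
    RenderEffects fx = effectsForUseCase("perception-test");
    (void)fx;
}
```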


Next-generation simulation slingshots advanced safety features closer to reality with multi-node and multi-client support

Budapest, Hungary, April 20, 2021 — AImotive, a global leader in automated driving (AD) technology, today announced aiSim 3.0, the next generation of the world’s first ISO 26262-certified simulator for the development and validation of ADAS and AD systems. aiSim 3.0 brings multi-node and multi-client capability together with physics-based sensor simulation, enabling high, measurable correlation between virtual and real-world testing, from large-scale software-in-the-loop runs to real-time environments.

Multi-client support allows several ego vehicles or multi-ECU control systems to be placed in the same virtual environment to test interactions between numerous automated vehicles or system components. …


Demonstrating industry-leading NPU efficiency for demanding automotive vision NN workloads as first samples are delivered to lead customers

Budapest, Hungary and Seoul, South Korea, 15th April 2021 — AImotive, a world-leading supplier of scalable automated driving technologies, and Nextchip Co., Ltd., a dedicated automotive vision technology company, today announced that AImotive has successfully demonstrated automotive NN (Neural Network) vision applications executing at up to 98% efficiency on the aiWare3P™ NPU (Neural Network Processor Unit) used on Nextchip’s latest Apache5 IEP (Imaging Edge Processor). Also featuring an advanced ISP supporting imaging sensors up to 5.7Mpixel …

AImotive Team

aiMotive’s 220-strong team develops a suite of technologies to enable AI-based automated driving solutions built to increase road safety around the world.
