High-level synthesis for autonomous vehicles
Autonomous vehicles require a sophisticated framework of sensors to function. These sensors, including LiDAR, radar and cameras, continuously generate a high volume of real-time data about the environment surrounding the car. The sensors constantly stream their output to powerful domain controllers connected to a processing unit for analysis. The discrete data from the different sensors are then merged into meaningful information about the vehicle's location, speed, direction and surroundings. This process is known as sensor fusion.
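The idea behind sensor fusion can be sketched with a minimal example. The function and values below are hypothetical, not from the white paper: two sensors, say radar and LiDAR, report the same quantity (distance to an obstacle) with different noise levels, and inverse-variance weighting combines them into a single estimate that is more reliable than either reading alone.

```python
def fuse(estimates):
    """Fuse (value, variance) pairs by inverse-variance weighting.

    Each pair is one sensor's reading and its noise variance; the
    fused variance is always lower than the best individual sensor's.
    """
    total_weight = sum(1.0 / var for _, var in estimates)
    fused_value = sum(val / var for val, var in estimates) / total_weight
    fused_variance = 1.0 / total_weight
    return fused_value, fused_variance

# Hypothetical readings: radar says 25.0 m (variance 4.0),
# LiDAR says 24.2 m (variance 1.0). The fused estimate leans
# toward the more precise LiDAR reading.
value, variance = fuse([(25.0, 4.0), (24.2, 1.0)])
```

Real ADAS pipelines use far richer techniques (Kalman filters, occupancy grids, learned fusion networks), but the principle is the same: weight each sensor by how much it can be trusted.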
Sensor fusion is typically performed on custom hardware, either FPGAs or ASICs. The fused data is then processed to make decisions that drive the ADAS functions, such as turns, braking or speed adaptation. The hardware incorporates machine-learning algorithms within sophisticated artificial intelligence (AI) applications to enable real-time processing of the sensor data.
In this white paper, Mentor, a Siemens Business, explores how autonomous-vehicle designers are grappling with new silicon architectures optimized for neural computing and computer vision, to build better autonomous vehicle solutions and bring them to market faster than ever before.