The camera is one of the leading sensors (alongside Lidar and Radar) used to enable advanced driver assistance systems (ADAS) and automated driving (AD). Car manufacturers (OEMs) and various tier-1 suppliers expect camera use in ADAS and AD to rise steeply in the coming years; analysts project that the camera-based ADAS market will grow at a CAGR of 18.7% during 2017-2022. Cameras in cars are used as standalone sensors and are sometimes fused with other cameras, Lidar, or Radar sensors.
Various applications of Cameras in Advanced Driver Assistance Systems and Automated Driving
A camera is used to monitor both the surroundings of the vehicle and the driver. In this blog, we briefly discuss the various applications of cameras in ADAS and automated driving, along with the advantages of using a camera. Camera-based ADAS can broadly be classified into two categories based on the placement of the camera.
Exterior Camera- A single camera or multiple cameras placed outside the car for applications like Lane Departure Warning, Traffic Sign Recognition, Surround View, Park Assist and Road Vulnerability Detection
- Lane departure warning – It is standard, safe practice to stick to one lane while travelling. When a feature like cruise control is engaged, the speed of the vehicle is not controlled by the driver but is instead set to a fixed value, and the vehicle ECU does the rest; steering, however, remains the driver's job. To assist the driver here, a camera placed at the front of the car identifies the lane markings, and the system warns the driver (or, in lane-keep assist, steers) so that the vehicle sticks to one lane. As an extension of this simple application, automated cars use the same sensory information to make decisions such as switching lanes and adjusting speed to match the car in front.
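The decision logic that sits behind a lane departure warning can be sketched quite compactly. The snippet below is a minimal illustration, assuming a vision front-end has already detected the lateral distance to the left and right lane markings; the `margin` threshold is a made-up illustrative value, not one taken from any production system.

```python
def lane_departure_warning(left_x: float, right_x: float,
                           margin: float = 0.3) -> str:
    """Return a warning state from detected lane-line offsets.

    left_x / right_x: lateral distance (m) from the vehicle centreline
    to the left and right lane markings (left is negative).
    margin: minimum acceptable clearance (m) before warning.
    """
    lane_centre = (left_x + right_x) / 2.0   # offset of the lane centre
    if abs(left_x) < margin:
        return "WARN_LEFT"                   # drifting over the left line
    if abs(right_x) < margin:
        return "WARN_RIGHT"                  # drifting over the right line
    if abs(lane_centre) > margin:
        return "CENTERING"                   # nudge back toward the centre
    return "OK"

# Left line 1.6 m away, right line only 0.2 m away -> drifting right
print(lane_departure_warning(-1.6, 0.2))  # WARN_RIGHT
```

A lane-keep assist system would feed the same offsets into a steering controller instead of (or in addition to) raising a warning.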
- Traffic sign recognition- Autonomous cars need to interact with other cars on the road, which means adhering to the same traffic signs and traffic lights already in use. To help the car identify the various signs and signals, a camera-based system is used: the video captured by the sensor is processed by a central processing unit, which runs machine vision algorithms to identify traffic signs and traffic lights and feeds the result into the car's overall autonomous decisions.
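A per-frame classifier will occasionally misfire, so recognition systems typically debounce its output over time before acting on it. Below is a minimal sketch of such temporal smoothing, assuming a hypothetical upstream classifier that emits one label per video frame; the window and vote counts are illustrative defaults.

```python
from collections import Counter, deque

def stable_sign(frame_labels, window=5, min_votes=3):
    """Debounce per-frame classifier outputs with a sliding majority vote.

    frame_labels: iterable of class labels, one per video frame.
    Returns the confirmed label for each frame, or None while no label
    has accumulated enough votes in the recent window.
    """
    recent = deque(maxlen=window)
    confirmed = []
    for label in frame_labels:
        recent.append(label)
        winner, votes = Counter(recent).most_common(1)[0]
        confirmed.append(winner if votes >= min_votes else None)
    return confirmed

# A single-frame "yield" glitch does not override the stable "stop" reading
print(stable_sign(["stop", "stop", "yield", "stop", "stop"]))
# [None, None, None, 'stop', 'stop']
```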
- Surround view- While travelling on roads, humans can take in the entire surroundings of the car; an autonomous car achieves the same by means of cameras placed around the vehicle. The data from these sensors is collected and compressed for the machine vision algorithms, then processed to identify any foreign objects in the path of the vehicle.
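Conceptually, a surround-view system warps each camera feed to a bird's-eye perspective and composites the patches into one top-down canvas. The sketch below shows only the naive tiling step, assuming four already-warped single-channel patches of equal size; a real system would calibrate the cameras and blend the overlapping regions.

```python
import numpy as np

def compose_surround_view(front, rear, left, right):
    """Naively tile four pre-warped bird's-eye patches into one canvas.

    Assumes all patches are HxW single-channel arrays of the same shape.
    The vehicle occupies the (empty) centre cell of the 3x3 grid.
    """
    h, w = front.shape
    canvas = np.zeros((3 * h, 3 * w), dtype=front.dtype)
    canvas[0:h,       w:2 * w] = front   # top-centre
    canvas[2 * h:3 * h, w:2 * w] = rear  # bottom-centre
    canvas[h:2 * h,   0:w]     = left    # middle-left
    canvas[h:2 * h,   2 * w:3 * w] = right  # middle-right
    return canvas
```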
- Park assist- Parking in any metro city is a difficult task, even for a veteran driver. To assist drivers in parking, camera-based park-assist systems are used. These give the driver a complete picture of the surrounding area, making parking easier. In automated driving, the car uses the same camera-based park-assist system to park itself.
- Road vulnerability detection- Road vulnerability is a generic term covering objects such as street signs, traffic signs and road lane markings. As in generic object detection, input from the camera is run through a CNN model, generally trained on a traffic-sign database (the most commonly used being GTSRB). An ADAS system notes any traffic sign the driver missed and adjusts parameters such as vehicle speed accordingly. In automated driving, this helps the autonomous system identify traffic signs and road lane markings and plan its speed and movement on a busy street accordingly. Once these CNN models are developed, running them on an embedded platform is another challenge, as explained here.
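To make the CNN pipeline mentioned above concrete, the block below implements a single convolution → ReLU → global-average-pooling → linear-score stage in plain NumPy. The kernels and class weights are random stand-ins, not trained GTSRB parameters; a real network stacks many such layers and learns its weights from data.

```python
import numpy as np

def conv2d(img, kernel):
    """Valid-mode 2-D convolution (cross-correlation, as in most
    deep-learning frameworks) of a single-channel image."""
    kh, kw = kernel.shape
    h, w = img.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

def classify_sign(img, kernels, class_weights):
    """One conv layer -> ReLU -> global average pool -> linear scores.
    Returns the index of the highest-scoring class."""
    feats = np.array([np.maximum(conv2d(img, k), 0).mean() for k in kernels])
    scores = class_weights @ feats   # (n_classes, n_kernels) @ (n_kernels,)
    return int(np.argmax(scores))

rng = np.random.default_rng(0)
img = rng.random((32, 32))                    # GTSRB images are ~32x32 after resize
kernels = rng.standard_normal((4, 3, 3))      # hypothetical filter bank
class_weights = rng.standard_normal((43, 4))  # GTSRB defines 43 sign classes
print(classify_sign(img, kernels, class_weights))
```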
Interior Camera- A camera placed inside the cabin for applications like Driver Monitoring and Occupancy Detection
- Driver monitoring system (DMS)- In this camera-based advanced driver assistance system, the camera faces inward to monitor driver behaviour; Euro NCAP mandates a driver monitoring system by 2025. DMS watches for behaviours that might hinder the right response while driving, keeping track of facial features such as eyelid and mouth movement. For a driver monitoring solution to predict driver inattention accurately, the underlying CNN models must be trained on datasets that are diverse in gender, age, ethnicity, etc.
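One widely used eyelid cue in drowsiness detection is the eye aspect ratio (EAR), computed from six eye landmarks produced by a facial-landmark detector. The sketch below assumes such landmarks are already available; the 0.2 threshold is a commonly cited starting point that would be tuned per dataset.

```python
import math

def eye_aspect_ratio(landmarks):
    """landmarks: six (x, y) eye points p1..p6, with p1/p4 the horizontal
    corners, p2/p3 on the upper lid and p6/p5 on the lower lid.
    EAR drops toward 0 as the eye closes."""
    p1, p2, p3, p4, p5, p6 = landmarks

    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])

    return (dist(p2, p6) + dist(p3, p5)) / (2.0 * dist(p1, p4))

def eyes_closed(landmarks, threshold=0.2):
    # A DMS would require the EAR to stay below the threshold for
    # several consecutive frames before flagging drowsiness.
    return eye_aspect_ratio(landmarks) < threshold
```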
- Occupancy detection- News articles regularly report children or pets left behind in a parked car on a hot summer day, sometimes with fatal results. This issue can be addressed with a single-camera solution: the underlying system uses CNN models to identify any human or pet still present in the vehicle after it is locked.
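Before waking a power-hungry CNN classifier, a parked-car system can run a cheap motion check on the cabin camera. The sketch below uses simple frame differencing on grayscale frames; the pixel and area thresholds are illustrative values, and the CNN would still decide *what* is present once motion is flagged.

```python
import numpy as np

def occupancy_detected(frames, pixel_thresh=25, area_frac=0.01):
    """Flag motion between consecutive cabin frames.

    frames: sequence of uint8 grayscale images of identical shape.
    Returns True if, in any consecutive pair, more than area_frac of
    the pixels changed by more than pixel_thresh grey levels.
    """
    for prev, cur in zip(frames, frames[1:]):
        # widen to int16 so the subtraction cannot wrap around
        diff = np.abs(cur.astype(np.int16) - prev.astype(np.int16))
        moving = np.count_nonzero(diff > pixel_thresh)
        if moving > area_frac * diff.size:
            return True
    return False
```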
Various challenges with camera-based ADAS systems
Camera-based ADAS, as explained earlier, is here to make driving safer and simpler. However, it comes with some challenges that need to be dealt with to achieve the desired results. Some of the challenges that camera-based ADAS puts forth are listed below.
- Developing complex algorithm suites capable of achieving human-level driving accuracy
- Transferring data from the sensors to the processing units responsible for decision-making based on the vehicle's surroundings
- Compressing data for seamless transfer of the camera streams collected from across the vehicle
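The compression challenge above must be lossless if downstream vision algorithms are to see exactly what the sensor saw. As a minimal sketch, the block below round-trips a raw frame buffer through the standard zlib codec; zlib here is a stand-in for whatever codec the actual in-vehicle link would use.

```python
import zlib

def compress_frame(frame_bytes: bytes, level: int = 6) -> bytes:
    """Losslessly compress a raw frame buffer before it is sent over
    the in-vehicle network."""
    return zlib.compress(frame_bytes, level)

def decompress_frame(blob: bytes) -> bytes:
    return zlib.decompress(blob)

raw = bytes(range(256)) * 100           # stand-in for a raw image buffer
blob = compress_frame(raw)
assert decompress_frame(blob) == raw    # round-trip is bit-exact (lossless)
print(f"{len(raw)} -> {len(blob)} bytes")
```

Image-specific lossless codecs (e.g. PNG-style predictive filtering) typically beat a general-purpose byte compressor on camera data, but the bit-exact round-trip requirement is the same.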
Autonomous cars and cars with advanced driver assistance systems are equipped with a host of sensors for various object detection and driver monitoring applications. The camera is one such sensor that has long been enabling autonomy. With advances in computer vision and machine learning, dependency on automation is growing, making it ever more important to treat safety as a top priority. The applications explained in this article are the primary applications of any autonomous-driving-capable car or car equipped with advanced driver assistance systems.