Automotive lighting: from vision to driving assistance

20th April 2020
Alex Lynn

“Autonomous vehicle technologies have a direct impact on the traditional vehicle market and, behind that, on the automotive lighting industry,” said Martin Vallo, PhD, Technology & Market Analyst, Solid-state Lighting at Yole Développement (Yole).

“ADAS clearly represents a strong opportunity for automotive lighting companies. For the highest levels of autonomy, a combination of sensors, together with AI and digital lighting, will be implemented for all-weather capability.”

Without doubt, automotive lighting will evolve from a pure ‘Vision’ function to a ‘Driving Assistance’ function.

The market research & strategy consulting company Yole analyzes this evolution in a new dedicated automotive lighting report, titled ‘Automotive Advanced Front-Lighting Systems report’. Yole offers a comprehensive overview and deep understanding of the automotive lighting market and its advanced technologies.

This report analyzes the current status and future trends of the automotive front-lighting market, reviews the structure and future trends of the automotive lighting industry, and examines advanced front-lighting systems (AFLS) used in automotive applications, especially ADAS, together with the associated roadmaps. The study also provides market insights and details on benefits and drawbacks, integration status, development roadmaps, market forecasts and much more.

Lighting is evolving from a basic passive feature, used simply to help the driver see the road in the dark, to an active function able to detect oncoming traffic and reduce glare.

The evolution of lighting technology toward higher resolution makes it possible to enable new functions such as projections onto the road. Trends toward ADAS, the digitalization of lighting and the deep integration of advanced sensors will enhance the functional content of headlamps, bringing synergy between the automotive lighting and sensing industries.

The emergence of ADAS was driven by the development and integration of sensors into cars, starting with basic functions and progressing to complex ones: infrared LEDs for rain sensing; radar for blind-spot monitoring, cruise control and adaptive cruise control; and cameras for traffic sign recognition or lane departure warning.

“The integration of sensors in ADAS vehicles is mandatory,” according to Yole’s team in the ‘Automotive Advanced Front-Lighting Systems report’.

Furthermore, the development of advanced ADAS is linked to the development of innovative new sensors that can be introduced into vehicles, and to the data processing behind them. Camera-based ADAS currently spans several levels of complexity. The basic level is adaptive cruise control (ACC), where the camera is used to monitor the distance to the vehicle ahead.

The advanced level is traffic sign recognition, where the camera must be able to read numbers; and the complex level corresponds to driver monitoring, where the camera identifies the driver’s eyes and detects drowsiness.
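By way of illustration of the basic level (not taken from Yole’s report), the sketch below shows how a camera-derived distance estimate might feed a simple time-gap-based adaptive cruise controller. All function names, gains and gap values are assumptions for illustration only.

```python
# Illustrative sketch: time-gap based adaptive cruise control using a
# camera-derived distance estimate. All names and tuning values are
# hypothetical, not taken from the Yole report.

def acc_target_speed(distance_m: float, ego_speed_mps: float,
                     set_speed_mps: float, desired_gap_s: float = 2.0,
                     gain: float = 0.5) -> float:
    """Return a speed command that keeps roughly a fixed time gap
    to the preceding vehicle, capped at the driver's set speed."""
    if ego_speed_mps <= 0.1:
        # Nearly stopped: pick a speed consistent with the desired gap.
        return min(set_speed_mps, distance_m / desired_gap_s)
    current_gap_s = distance_m / ego_speed_mps        # time headway now
    gap_error_s = current_gap_s - desired_gap_s       # > 0 means too far back
    speed_cmd = ego_speed_mps + gain * gap_error_s * ego_speed_mps
    return max(0.0, min(speed_cmd, set_speed_mps))

# Example: following at 25 m/s with only 40 m of headway (1.6 s gap)
print(acc_target_speed(distance_m=40.0, ego_speed_mps=25.0,
                       set_speed_mps=30.0))   # commands a speed below 25 m/s
```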

Pierrick Boulay, Technology & Market Analyst, Solid-state Lighting at Yole, said: “The development of advanced ADAS functions means an increasing penetration of electronics into the vehicle. It started with one module for one function, but now some devices are multi-functional, like the front camera(s), and this evolution is also occurring at the sensor level, for example with the combination of the camera and radar.”

Increased integration of electronics also benefits exterior lighting applications. Indeed, the sensors and the data they collect can be processed for advanced lighting functions. A clear example is high-resolution lighting, where the camera is used to detect oncoming or preceding vehicles.
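As a rough illustration of how such camera detections could drive a high-resolution (segmented or matrix LED) headlamp, the hedged sketch below switches off the beam segments whose angles overlap a detected vehicle. The segment count, field of view and data format are assumptions, not figures from the report.

```python
# Illustrative sketch: glare-free high beam with a segmented (matrix) LED
# headlamp. Segment count, field of view and detection format are
# hypothetical; production systems are far more sophisticated.

from dataclasses import dataclass

N_SEGMENTS = 32          # horizontal LED segments across the beam (assumed)
FOV_DEG = 40.0           # beam covers -20 deg .. +20 deg around the axis

@dataclass
class Detection:
    azimuth_deg: float   # angle of the detected vehicle from the camera
    width_deg: float     # angular width to blank out around it

def segment_mask(detections: list[Detection]) -> list[bool]:
    """Return per-segment on/off flags: False where a vehicle was detected."""
    seg_width = FOV_DEG / N_SEGMENTS
    mask = [True] * N_SEGMENTS
    for det in detections:
        lo = det.azimuth_deg - det.width_deg / 2
        hi = det.azimuth_deg + det.width_deg / 2
        for i in range(N_SEGMENTS):
            seg_center = -FOV_DEG / 2 + (i + 0.5) * seg_width
            if lo <= seg_center <= hi:
                mask[i] = False   # dim this segment to avoid glaring the driver
    return mask

# Oncoming car slightly left of centre: segments around -5 deg are switched off
print(segment_mask([Detection(azimuth_deg=-5.0, width_deg=4.0)]))
```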

In this dynamic context of innovation, OEMs are playing a key role. Martin Vallo from Yole comments: “OEM requirements were quite different one or two years ago, but they now tend to homogenize as OEMs better understand the technology.”

As an example, OEMs have several requirements at different levels when integrating LiDAR. First, optical integration must limit the loss in detection range to 10%, so a high level of transmission (90%) is required, and the surface in front of the LiDAR should be close to normal to limit beam deviation. Regarding thermal integration, the LiDAR needs to operate in all conditions, so the sensor has to handle temperatures above 95°C in specific situations.
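The 90% transmission and 10% range figures are consistent with a simple back-of-envelope model: if received power falls with the square of distance and the cover attenuates the signal once on the way out and once on the return, the maximum range at a fixed detection threshold scales roughly linearly with the one-way transmission. The short calculation below is that assumed simplification, not the report’s methodology.

```python
# Back-of-envelope check (an assumption, not Yole's model): with received
# power P_rx proportional to T**2 / R**2 for a one-way cover transmission T,
# the detection range at a fixed sensitivity threshold scales as R_max ~ T.

def relative_range(transmission: float) -> float:
    """Maximum detection range relative to a perfectly clear cover."""
    # P_rx(R_max) = threshold  =>  T**2 / R_max**2 = const  =>  R_max ~ T
    return transmission

for T in (1.00, 0.95, 0.90):
    loss_pct = (1.0 - relative_range(T)) * 100.0
    print(f"transmission {T:.0%} -> range loss of about {loss_pct:.0f}%")
# 90% transmission gives roughly the 10% range-loss budget quoted above.
```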

Management of internal humidity will be mandatory. De-icing will also be problematic, and a washing and heating system for the outer lens must be evaluated. For shock and mechanical integration, LiDAR is not easy to break, so in the case of a pedestrian impact the effect on the lower leg must be taken into account.

Depending on the vertical field of view of the LiDAR, an aiming system may be mandatory to compensate for mechanical dispersion, load, acceleration and braking. Finally, styling integration requires a compromise between performance, technical constraints and aesthetics. LiDAR can be integrated in two ways, visible or invisible, depending on whether the OEM wants to show off its technology or not.
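To illustrate why vehicle pitch matters for aiming, here is a hedged geometric sketch: with the assumed numbers below (not from the report), a pitch change of a degree or two under load or braking shifts the scan by several metres of height at 100 m and consumes a significant share of a narrow vertical field of view unless an aiming system compensates.

```python
# Illustrative geometry only; the field-of-view and pitch values are
# assumptions, not figures from the report.

import math

VERTICAL_FOV_DEG = 10.0      # assumed vertical field of view of the LiDAR
RANGE_M = 100.0              # distance at which the shift is evaluated

def vertical_shift(pitch_deg: float, range_m: float = RANGE_M) -> float:
    """Height by which the centre of the scan moves at a given range
    when the vehicle pitches (loading, acceleration, braking)."""
    return range_m * math.tan(math.radians(pitch_deg))

for pitch in (0.5, 1.0, 2.0):
    used_fov = 2.0 * pitch   # margin consumed if pitch can go both ways
    print(f"pitch {pitch:.1f} deg -> {vertical_shift(pitch):.1f} m shift at "
          f"{RANGE_M:.0f} m, {used_fov / VERTICAL_FOV_DEG:.0%} of the FoV")
```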
