Eye spy: Q&A with Seeing Machines


How occupant monitoring systems can enhance automotive safety and user experience by using AI to track driver attention and adapt to environments.


Occupant monitoring systems (OMS) are becoming a game-changer in the automotive industry for safety and user experience. These systems are not just about detecting who is in the car; they combine 2D cameras with 3D depth sensors to deliver precise occupant tracking. With AI, they assess driver distraction by monitoring eye movements and head positions, giving real-time insight into whether a driver is paying attention. Edge computing takes this a step further, processing data on the spot to cut down on lag time, which is critical for safety responses. These systems are also designed to adapt to different environments, handling everything from bright sunlight to dimly lit conditions without missing a beat. As the technology keeps advancing, OMS are set to redefine what safety looks like in vehicles. We spoke to Akshay Asthana, Chief Scientist (Artificial Intelligence) at Seeing Machines, to get more insight into these developments.

Key takeaways:

  • Sensor fusion: Seeing Machines uses a combination of 2D RGB-IR cameras and 3D depth sensors to improve the accuracy of occupant detection.
  • AI for distraction detection: The company employs neural network algorithms to track driver behavior and identify distractions based on data from the driver and the vehicle's interior.
  • Edge computing: Their algorithms are designed to run efficiently on automotive-grade edge devices, ensuring quick data processing and low latency.
  • Environmental adaptation: The OMS is built to perform under various environmental conditions, utilizing advanced optics and algorithms to adjust to factors like lighting and weather.

The following is an edited transcript of the conversation.

S&P Global Mobility: How does Seeing Machines use sensor fusion from cameras and other technologies to enhance the accuracy of occupant detection in its OMS?

Akshay Asthana: We recently announced a partnership with Airy3D, and we are building a state-of-the-art full cabin monitoring system that leverages a high-quality 5 MP 2D RGB-IR image along with a highly accurate 3D depth sensor. This additional 3D information is invaluable and allows us to build features that would simply not be possible with a 2D camera alone. In addition, we are exploring several other fusion algorithms, at the feature/algorithm level, that utilize the additional sensor data in the most optimal way to deliver the best products for our customers.
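To illustrate one simple way a depth reading augments a 2D image, the sketch below back-projects a detected 2D keypoint into a 3D cabin position under a standard pinhole-camera model. The intrinsics and values are hypothetical, not Seeing Machines' hardware or fusion algorithm:

```python
def backproject(u, v, depth_m, fx, fy, cx, cy):
    """Convert a pixel (u, v) plus its depth reading (meters) into
    3D camera-frame coordinates using the pinhole camera model.

    fx, fy: focal lengths in pixels; cx, cy: principal point.
    """
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return (x, y, depth_m)

# Hypothetical intrinsics for a cabin camera
FX, FY, CX, CY = 1400.0, 1400.0, 1296.0, 972.0

# A face keypoint detected at the image center, 0.8 m from the camera,
# resolves to a point straight ahead of the lens.
print(backproject(CX, CY, 0.8, FX, FY, CX, CY))  # (0.0, 0.0, 0.8)
```

With 2D detections alone, the same pixel could correspond to any point along that ray; the depth sensor pins down where on the ray the occupant actually is.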

What specific AI algorithms developed by Seeing Machines are employed to enhance real-time decision-making in OMS, particularly in addressing distracted driving?

Our strategy is to detect as many of the factors that distract drivers as possible, then leverage a cutting-edge neural network-based method to infer the distraction status. At the heart of this are our core algorithms, which can accurately track the driver (eyes, head, full body) in the car and detect other relevant objects, such as other occupants, mobile phones, and cigarettes, that can cause distraction. These signals are fed into the distraction algorithm, which uses this holistic scene information from various time instances to determine the state and severity of the driver distraction.
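As a loose, rule-based stand-in for the temporal inference described above (the function, signals, and thresholds are all hypothetical, not Seeing Machines' neural-network method), per-frame scene signals from the tracker could be aggregated over a time window like this:

```python
def distraction_severity(frames, off_road_thresh=0.5, phone_thresh=0.2):
    """Aggregate per-frame scene signals over a time window.

    Each frame is a dict with:
      gaze_on_road  - bool, eyes directed at the road
      phone_visible - bool, a phone detected near the driver
    Returns "none", "moderate", or "severe" (illustrative thresholds).
    """
    n = len(frames)
    if n == 0:
        return "none"
    off_road_frac = sum(not f["gaze_on_road"] for f in frames) / n
    phone_frac = sum(f["phone_visible"] for f in frames) / n
    if off_road_frac > off_road_thresh and phone_frac > phone_thresh:
        return "severe"
    if off_road_frac > off_road_thresh or phone_frac > phone_thresh:
        return "moderate"
    return "none"

# Driver looking away while a phone is visible for a full window
window = [{"gaze_on_road": False, "phone_visible": True}] * 10
print(distraction_severity(window))  # severe
```

The key idea it mirrors is that severity comes from the combination of signals over time, not any single frame.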

How is edge computing integrated into your OMS to facilitate faster data processing and reduce latency in occupant monitoring?

We specialize in the development of algorithms that can be deployed on any automotive-grade edge device, with or without accelerators. Almost all of our algorithms rely on cutting-edge custom-designed neural networks. These networks need to be carefully trained in a quantization-aware manner for deployment at lower precision (most often 8-bit, but it can be lower) to deliver the best accuracy at the lowest possible latency on these edge devices. We are at the forefront of building highly efficient models that retain the best possible accuracy on deployment. This allows us to run all our OMS features efficiently on any automotive-grade platform.
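The core numeric step behind 8-bit deployment can be sketched as symmetric int8 quantization. Quantization-aware training simulates this quantize-then-dequantize round trip in the forward pass so the network learns weights that survive the precision loss; the minimal example below shows only the round trip itself, not Seeing Machines' training pipeline:

```python
def quantize_symmetric_int8(weights):
    """Map float weights to int8 codes with one per-tensor scale.

    scale is chosen so the largest-magnitude weight maps to +/-127;
    returns (codes, scale). Illustrative per-tensor scheme only.
    """
    scale = max(abs(w) for w in weights) / 127.0 or 1.0
    codes = [max(-127, min(127, round(w / scale))) for w in weights]
    return codes, scale

def dequantize(codes, scale):
    """Recover approximate float weights from int8 codes."""
    return [c * scale for c in codes]

weights = [0.5, -1.0, 0.25, 0.0]
codes, scale = quantize_symmetric_int8(weights)
restored = dequantize(codes, scale)
# Each restored weight is within one quantization step of the original.
```

Deployed at 8 bits, each weight costs a quarter of the memory of float32 and maps onto fast integer arithmetic on automotive accelerators, which is where the latency win comes from.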

In what ways can your OMS integrate with other smart vehicle systems, such as navigation and infotainment, to enhance user experience?

We use our proprietary algorithms to enable integration with the other systems in a vehicle.

How do environmental factors like lighting and weather conditions impact the performance of Seeing Machines' OMS, and what solutions are being developed to mitigate these effects?

Environmental factors are among the most important influences on the performance of our algorithms. They are front and center during the algorithm design phase, and we address them in multiple ways. For example, we use the right type of optics and camera design to minimize the impact, we have clever algorithms that adjust sensor settings to give the best image quality at capture time, and we leverage big data to train models that are robust to all commonly occurring environmental noise factors.
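The capture-time adjustment mentioned above can be pictured as a simple control loop on sensor exposure. This proportional-control sketch is a generic illustration under assumed parameters, not the company's actual auto-exposure logic:

```python
def adjust_exposure(mean_luma, exposure_ms, target=118, gain=0.4,
                    min_ms=0.1, max_ms=33.0):
    """One proportional auto-exposure step.

    mean_luma: average 8-bit brightness (0-255) of the last frame.
    Nudges exposure toward the luma target, clamped to an assumed
    valid sensor range (min_ms..max_ms).
    """
    error = (target - mean_luma) / 255.0
    new_ms = exposure_ms * (1.0 + gain * error)
    return max(min_ms, min(max_ms, new_ms))

# Bright sunlight washing out the frame -> shorten exposure
print(adjust_exposure(mean_luma=250, exposure_ms=10.0))
# Dim night cabin -> lengthen exposure, up to the sensor limit
print(adjust_exposure(mean_luma=30, exposure_ms=10.0))
```

Run every frame, a loop like this keeps the driver's face well exposed as lighting swings between direct sun and darkness, before any robustness the trained models themselves add.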

What challenges have you faced when integrating your OMS with existing vehicle safety systems, especially regarding compliance with new regulations?

Vehicle safety implementation strategies are heavily dependent on OEM implementations and vary across different OEMs. Even though the ultimate safety goals are very similar across different OEMs, the detailed implementations may vary significantly and require different algorithm strategies to achieve the desired safety outcome.

What global standards is Seeing Machines aligning its OMS with, and how do these standards differ across regions such as the EU and US?

Seeing Machines aligns the safety-relevant feature scope of its OMS with international safety standards and regulations for the relevant target markets. Although regulations are evolving at different paces, we strive to deliver a cohesive solution for our customers that meets the demands of a global market. We recognize that varying regulations, such as those in the EU and the US, must be considered in our approach.
