As reported in a press release on Nov. 5, SoftBank Corp. has introduced a traffic understanding multimodal AI designed to enhance autonomous driving through remote support. The AI runs on SoftBank's multi-access edge computing (MEC) platform and other edge AI servers, which offer low latency and high security.
The AI aims to improve vehicle safety and reduce operational costs, and because it operates in real time it can maintain a comprehensive picture of an autonomous vehicle's status. A field trial of the system was slated to begin at Keio University's Shonan Fujisawa Campus in October, with the goal of assessing the AI's ability to provide efficient remote assistance in complex or unforeseen driving situations.
The multimodal AI integrates various types of data, including forward-facing vehicle footage, to evaluate traffic conditions and potential hazards, and then recommends actions to ensure safe driving. The system has been trained on a wide array of Japanese traffic knowledge, including manuals, regulations and risk scenarios, giving it the broad traffic understanding the task requires. Video and other data are transmitted in real time over a 5G network and analyzed on graphics processing units (GPUs), allowing the traffic understanding multimodal AI to identify risks and advise on countermeasures, thereby enabling remote support for autonomous driving. Initially, remote operators relay instructions to vehicles based on the AI's analysis, but the ultimate aim is for the AI to instruct vehicles directly, enabling fully unmanned operations.
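To make that flow concrete, the minimal Python sketch below mirrors the reported pipeline: frames arrive from the vehicle over the 5G link, an edge-hosted model judges the risk, and the resulting advice is shown to a remote operator who relays instructions. Every class, function and field name here is an illustrative assumption; SoftBank has not published an API or code for the system.

```python
# Illustrative sketch only: all names below are hypothetical and are not
# part of SoftBank's published system.
from dataclasses import dataclass
from enum import Enum


class Risk(Enum):
    NONE = "none"
    CAUTION = "caution"
    HAZARD = "hazard"


@dataclass
class Advice:
    risk: Risk
    action: str     # e.g. "maintain speed", "slow down", "stop before crosswalk"
    rationale: str  # natural-language explanation shown to the remote operator


def analyze_frame(frame: bytes, telemetry: dict) -> Advice:
    """Stand-in for the GPU-hosted multimodal model on the MEC server.

    In the reported system this step fuses forward-facing video with other
    vehicle data and the model's trained traffic knowledge (manuals,
    regulations, risk scenarios) to judge the situation.
    """
    # Dummy result so the sketch runs end to end.
    return Advice(Risk.NONE, "maintain speed", "no hazards detected in this frame")


def remote_support_loop(vehicle_stream, operator_console):
    """Receive data over the 5G link, analyze at the edge, relay advice.

    Initially a human operator decides what to send to the vehicle; the
    stated long-term goal is for the AI to instruct the vehicle directly.
    """
    for frame, telemetry in vehicle_stream:       # frames streamed over 5G
        advice = analyze_frame(frame, telemetry)  # edge GPU inference
        operator_console.show(advice)             # operator relays instructions
```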
An example scenario from the field trial involves preventing accidents at crosswalks by identifying and reacting to risks not immediately evident to autonomous vehicles, such as pedestrians obscured by stationary vehicles. This scenario underscores the AI's capability to issue real-time, situation-specific guidance to autonomously operating vehicles.
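As an illustration of that crosswalk case, the sketch below shows how a simplified scene description with an obstructed view might be turned into situation-specific guidance. The fields, logic and wording are assumptions made for illustration and do not reflect SoftBank's actual decision rules.

```python
# Hypothetical illustration of the crosswalk scenario described above.
from dataclasses import dataclass


@dataclass
class SceneDescription:
    """Simplified output a perception model might produce for one frame."""
    approaching_crosswalk: bool
    crosswalk_distance_m: float
    view_of_crosswalk_blocked: bool  # e.g. by a stationary vehicle ahead


def advise(scene: SceneDescription) -> str:
    """Map a scene description to the kind of guidance a remote operator
    (or, eventually, the AI directly) could pass to the vehicle."""
    if scene.approaching_crosswalk and scene.view_of_crosswalk_blocked:
        # A pedestrian hidden behind the stationary vehicle cannot be ruled
        # out, so the safe action is to slow down and be ready to stop.
        return (f"Slow down; be prepared to stop at the crosswalk about "
                f"{scene.crosswalk_distance_m:.0f} m ahead.")
    if scene.approaching_crosswalk:
        return "Proceed with caution through the crosswalk."
    return "Maintain current driving plan."


print(advise(SceneDescription(True, 25.0, True)))
# -> "Slow down; be prepared to stop at the crosswalk about 25 m ahead."
```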