A delivery robot operating in Moscow recently created an unusual scene when it failed to yield to a special motorcade at an intersection. Although the robot was crossing on a green light, it reportedly did not react quickly enough to a police inspector who was actively clearing traffic for the approaching motorcade. While the initial escort vehicle managed to pass, the main Aurus limousine was compelled to stop, allowing the robot to proceed. This incident swiftly ignited a public discussion online regarding the readiness of such autonomous vehicles to correctly prioritize emergency and special vehicles like ambulances or fire trucks in future scenarios.

Experts generally concur that autonomous robots are typically programmed to yield to special transport. However, Ekaterina Rodina, Director of External Development at Automakon Group of Companies and author of the “Rodina i Roboty” Telegram channel, argued that this particular event exposed significant deficiencies in the system’s decision logic. She pinpointed several possible weak points: the robot’s sensor capabilities, its perception software, its decision-making module, and insufficient testing protocols. For engineers, the incident provides invaluable real-world data for further system refinement.
Rodina further explained that the robot’s perception system needs more robust training to recognize special signals, such as flashing lights and sirens, from diverse angles and in varying conditions; she suggested the system probably lacked sufficient training data for these specific contexts. In her view, it failed to assign a high enough priority to the special signals, likely misinterpreting the situation as “danger passed” once the lead vehicle of the motorcade had gone by. It then rigidly followed its programmed “green light – proceed” rule, showing no capability to anticipate the continued movement of the motorcade. Rodina also noted that such scenarios are inherently rare and therefore difficult to simulate and train for. The robot encountered a unique confluence of factors: a pedestrian crossing, a traffic police vehicle that had already passed, and a subsequent vehicle with flashing lights — a combination its algorithm could not rapidly process into a correct decision.
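The failure mode Rodina describes — treating the danger as over once one escort vehicle has passed, then reverting to the rigid “green light – proceed” rule — can be illustrated with a minimal decision sketch. All names and the timeout value below are hypothetical, chosen purely to show the idea of holding priority for a while after a special signal is last observed:

```python
from dataclasses import dataclass

# Hypothetical perception inputs; names and fields are illustrative only,
# not taken from any real delivery-robot software stack.
@dataclass
class Perception:
    traffic_light_green: bool
    special_signal_active: bool   # flashing lights / siren currently detected
    special_signal_age_s: float   # seconds since a special signal was last seen

def decide(p: Perception, convoy_timeout_s: float = 10.0) -> str:
    """Return 'proceed' or 'wait' at a crossing.

    The key difference from a rigid "green light -> proceed" rule:
    a recently seen special signal keeps the robot waiting, on the
    assumption that a motorcade may consist of several vehicles
    following the one already observed.
    """
    if p.special_signal_active:
        return "wait"
    # Do not declare "danger passed" the moment one escort vehicle goes by:
    # hold for a timeout window in case more convoy vehicles follow.
    if p.special_signal_age_s < convoy_timeout_s:
        return "wait"
    return "proceed" if p.traffic_light_green else "wait"
```

With this kind of hold-off, a green light alone would not have been enough to send the robot into the intersection only seconds after the escort vehicle passed; the timeout (or, better, an explicit convoy-prediction model) bridges the gap between individual vehicles.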
In contrast, a human driver who failed to yield to a motorcade in a similar situation would face severe penalties under Russian law: a fine of 10,000 rubles or a driving license suspension of six months to a year. As of now, there have been no official reports regarding potential liability for the robot’s owner in connection with this incident.

