Published on Saturday, October 15, 2022, by Amira Saadeh
People have envisioned the cars of the future as autonomous for decades, and as technology advances, that vehicular future draws ever closer. The 1980s witnessed the first self-sufficient, truly autonomous cars through projects such as the Eureka Prometheus Project, launched by Mercedes-Benz and Bundeswehr University Munich in 1987. When it comes to passenger safety, there is no such thing as too many precautions, and animated googly eyes are the latest innovation.
Several sensors strategically placed around the vehicle allow the autonomous car to create, maintain, and operate a map of its surroundings. Radar sensors monitor the position of nearby vehicles, while video cameras detect traffic lights, read road signs, track other cars, and look for pedestrians.
As for distance measurements, road edge detection, and marking identification, the automobile relies on LiDAR (light detection and ranging) sensors that bounce light pulses off the car’s surroundings. Ultrasonic sensors in the wheels detect curbs and other vehicles when parking.
The autonomous vehicle’s sophisticated software then processes the sensory input, plots a path, and, based on hard-coded rules, obstacle avoidance algorithms, predictive modeling, and object recognition, sends instructions to the car’s actuators: these actuators control acceleration, braking, and steering.
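The sense-process-actuate loop described above can be sketched in a few lines. This is a deliberately simplified illustration, not any real vehicle's software: the obstacle fields, headway thresholds, and command names are all assumptions made for the example.

```python
from dataclasses import dataclass

@dataclass
class Obstacle:
    distance_m: float  # distance ahead of the car, from fused sensor data
    lateral_m: float   # offset from the planned path (0 = directly ahead)

def plan_action(obstacles: list[Obstacle], speed_mps: float) -> str:
    """Pick a high-level command for the actuators from the obstacle map."""
    for obs in obstacles:
        in_path = abs(obs.lateral_m) < 1.5              # roughly half a lane width
        if in_path and obs.distance_m < speed_mps * 2:  # under 2 s of headway
            return "brake"
        if in_path and obs.distance_m < speed_mps * 4:  # under 4 s of headway
            return "coast"
    return "accelerate"

# The planner's output would then be translated into concrete throttle,
# brake, and steering commands by the actuator layer.
print(plan_action([Obstacle(distance_m=20, lateral_m=0.0)], speed_mps=15))  # brake
```

A production planner would of course weigh predictive models and object classes rather than fixed headway cutoffs, but the overall shape — fused sensor input in, actuator command out — is the same.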
According to the National Highway Traffic Safety Administration (NHTSA), 42,915 people died in motor vehicle traffic crashes in 2021, and an estimated 94 percent of serious crashes are linked to human error. Self-driving cars could remove much of that human error. The most common safety features found in autonomous vehicles are listed below.
AEB systems predict a possible collision and respond autonomously: the system activates the vehicle's brakes to prevent an impact, either stopping the car completely or slowing it down. Reverse Automatic Braking and Pedestrian and Cyclist Detection are among its essential features, and the Insurance Institute for Highway Safety (IIHS) has studied the system's effectiveness.
FCW systems likewise rely on radar, LiDAR, or cameras to operate. In contrast to AEB, FCW technology does not take control of the car; it issues visual, vibration, or sound warnings so the driver can act in the face of an imminent collision. FCW and AEB are often paired into a single integrated system. In a survey conducted in 2022, 56 percent of drivers said they were satisfied with FCW technology.
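One common way such paired systems are described is in terms of time-to-collision (TTC): warn the driver early, and brake autonomously only when the margin becomes critical. The sketch below assumes illustrative thresholds; real systems tune these values per vehicle and speed.

```python
FCW_TTC_S = 2.5  # warn the driver below this time-to-collision (assumed)
AEB_TTC_S = 1.0  # brake autonomously below this (assumed)

def collision_response(gap_m: float, closing_speed_mps: float) -> str:
    """Return 'none', 'warn' (FCW), or 'brake' (AEB) from range data."""
    if closing_speed_mps <= 0:       # not closing on the vehicle ahead
        return "none"
    ttc = gap_m / closing_speed_mps  # seconds until impact at current speeds
    if ttc < AEB_TTC_S:
        return "brake"
    if ttc < FCW_TTC_S:
        return "warn"
    return "none"

print(collision_response(gap_m=30, closing_speed_mps=20))  # TTC = 1.5 s -> "warn"
```

The escalation order matters: the warning gives the driver a chance to act first, and autonomous braking engages only if the gap keeps shrinking.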
A 2022 IIHS study shows that SUVs and other large vehicles often hit pedestrians while turning, chiefly because of blind spots: the bigger the car, the more blind spots it has. BSW systems use sensors to scan the driver's blind spots and warn them when a vehicle, pedestrian, or other object occupies a blind spot and the driver begins a steering, braking, or lane change maneuver toward it.
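The two-part trigger described above — an occupied zone plus an attempted maneuver — reduces to a simple conjunction. The inputs here are invented for illustration; a real system would derive them from sensor fusion and steering/signal telemetry.

```python
def bsw_alert(zone_occupied: bool, turn_signal_on: bool,
              steering_toward_zone: bool) -> bool:
    """Fire the blind-spot warning only when the zone is occupied AND
    the driver shows intent to move into it."""
    maneuver_intent = turn_signal_on or steering_toward_zone
    return zone_occupied and maneuver_intent

print(bsw_alert(True, True, False))   # occupied zone + signaled lane change
print(bsw_alert(True, False, False))  # occupied but no maneuver: stay quiet
```

Gating the alert on driver intent keeps the system from nagging constantly in dense traffic, where blind spots are almost always occupied.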
RCTA uses sensors or cameras to check the vehicle’s rear and sides when a car reverses.
LDW alerts the driver when the car drifts out of its lane without a turn signal; it does not, however, take control of the vehicle. According to statistics, LDW systems can reduce crashes by 11 percent and injuries by 21 percent.
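The core LDW rule — drifting past the lane edge with no signal — can be sketched with simplified geometry. The lane half-width here is an assumed constant; real systems estimate lane boundaries from camera imagery frame by frame.

```python
LANE_HALF_WIDTH_M = 1.8  # assumed half-width of a typical lane

def ldw_alert(lateral_offset_m: float, turn_signal_on: bool) -> bool:
    """lateral_offset_m: distance of the car's centerline from lane center.
    Alert when the car crosses the lane edge without signaling."""
    drifting = abs(lateral_offset_m) > LANE_HALF_WIDTH_M
    return drifting and not turn_signal_on

print(ldw_alert(2.0, False))  # drifted out with no signal: alert
print(ldw_alert(2.0, True))   # intentional lane change: no alert
```

Suppressing the alert when the signal is on is what distinguishes a drift from a deliberate lane change.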
ACC refers to complex safety systems that maintain a safe following distance, using sensory data collected by cameras, lasers, radar, or LiDAR hardware installed on the car.
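One textbook way to express "safe following distance" is a constant time headway: the desired gap grows with the car's own speed. Below is a toy proportional controller under that assumption; the gain, headway, and acceleration limits are all illustrative, not taken from any production system.

```python
TIME_HEADWAY_S = 2.0  # desired following gap, expressed in seconds (assumed)
GAIN = 0.5            # proportional gain on the gap error (assumed)

def acc_acceleration(gap_m: float, own_speed_mps: float) -> float:
    """Return a commanded acceleration (m/s^2), clamped to comfortable limits."""
    desired_gap = TIME_HEADWAY_S * own_speed_mps
    error = gap_m - desired_gap  # positive -> too far behind, speed up
    return max(-3.0, min(1.5, GAIN * error))

print(acc_acceleration(gap_m=20, own_speed_mps=15))  # gap too small -> brakes
```

At 15 m/s the desired gap is 30 m, so a 20 m gap yields a braking command; the clamp keeps both braking and acceleration within passenger-comfortable bounds.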
During the 14th International Conference on Automotive User Interfaces, researchers from the University of Tokyo and Kyoto University presented a paper discussing the potential of animated googly eyes as a safety feature in autonomous vehicles.
They considered a critical street-crossing situation in which a pedestrian is in a hurry to cross the street. If the car is not "looking" at the pedestrian, implying its sensors have not registered them, the pedestrian can judge that they should not cross, thereby avoiding a potential traffic accident.
To test this, the researchers organized an empirical study using 360-degree video recordings of an actual car fitted with animated googly (robotic) eyes.
They then compared a no-eyes car (a normal-looking car) and the eyes car in the critical street-crossing scenario under four experimental conditions.
The results showed that the animated googly eyes could reduce traffic accidents for male pedestrians in critical cases (both car and pedestrian go ahead) and improve traffic efficiency for female pedestrians in non-critical cases (both car and pedestrian stop), suggesting that eyes could help pedestrians make faster street-crossing decisions and, in specific situations, reduce traffic accidents. Furthermore, the eyes' gaze evokes a sense of security when aimed at the pedestrian and a sense of danger when aimed away from them.
Self-driving vehicles are the cars of the future, and innovators and manufacturers alike are working to increase their safety and efficiency. As unsettling as robotic eyes may seem, animated googly eyes are worth a deeper inspection. The research is still bare bones; we will see what the future holds.
Inside Telecom provides you with an extensive list of content covering all aspects of the tech industry. Keep an eye on our Technology section to stay informed and up-to-date with our daily articles.