What are the technical limitations of Autopilot?

Why can Tesla, which claims that "all vehicles in production will be capable of fully autonomous driving", not even avoid obstacles that are visible from hundreds of meters away? To answer that, we can analyze a few of Tesla's well-known accidents.

Tesla announced on the 30th: "In the moments before the collision, Autopilot was engaged, and the adaptive cruise control follow distance was set to the minimum."

The statement said the car's system had issued several visual prompts and one audible prompt asking the driver to put his hands on the steering wheel. "But in the six seconds before the accident, the system did not detect the driver's hands on the wheel. From 150 meters away, the driver had a clear view of the concrete highway barrier and roughly five seconds in which to react. Unfortunately, the driving log shows that the driver took no action."
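As a quick sanity check on the figures in that statement: 150 meters of visible distance and five seconds of reaction time imply a travel speed of about 30 m/s, a plausible highway speed. A one-line verification in Python:

```python
# Implied speed from the figures quoted in Tesla's statement.
distance_m = 150.0       # distance at which the barrier was visible
reaction_time_s = 5.0    # reaction time claimed in the statement
speed_mps = distance_m / reaction_time_s   # 30.0 m/s
print(f"{speed_mps:.1f} m/s = {speed_mps * 3.6:.0f} km/h")  # 30.0 m/s = 108 km/h
```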

The disasters caused by Tesla's exaggerated marketing language

This was already the second fatality involving Tesla's automated driving. In May 2016, a Tesla Model S crashed in Florida, USA. Before the incident, the owner, Joshua Brown, had likewise engaged Autopilot and given up control of the vehicle. The accident occurred because the system failed to register the truck crossing ahead of him.

That crash was the world's first autopilot-related death to attract widespread concern. Although the investigation by the US National Highway Traffic Safety Administration did not find design flaws in the system, it did fault Tesla for the word "autopilot" in its marketing: the exaggerated vocabulary makes it easy for bolder drivers to assume the system is fully driverless, and thus to give up control of the vehicle.

Since then, Tesla has changed the Chinese translation of Autopilot from "automatic driving" to "automatic assisted driving". Musk also said at a press conference: "It's autopilot, not autonomous" (Autopilot is not automatic driving, but automatic assisted driving).

The Autopilot software was also upgraded: when the driver's hands leave the steering wheel for a certain length of time, the system flashes a reminder on the dashboard and sounds a warning tone to prompt the driver to take over the vehicle; if the driver still refuses to take over, the Autopilot function is disengaged.
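The escalation described above can be sketched as a simple state machine. This is a minimal illustration in Python of the visual-then-audible-then-disengage sequence; the timing thresholds are invented for the example and are not Tesla's actual parameters.

```python
import enum

class Alert(enum.Enum):
    NONE = 0
    VISUAL = 1      # flashing reminder on the dashboard
    AUDIBLE = 2     # warning tone
    DISENGAGE = 3   # Autopilot released; driver must take over

# Illustrative thresholds (seconds); the real values are not public.
VISUAL_AFTER, AUDIBLE_AFTER, DISENGAGE_AFTER = 15.0, 30.0, 60.0

def escalate(hands_off_s: float) -> Alert:
    """Map continuous hands-off-wheel time to an escalating alert level."""
    if hands_off_s >= DISENGAGE_AFTER:
        return Alert.DISENGAGE
    if hands_off_s >= AUDIBLE_AFTER:
        return Alert.AUDIBLE
    if hands_off_s >= VISUAL_AFTER:
        return Alert.VISUAL
    return Alert.NONE

print(escalate(20.0))   # Alert.VISUAL
print(escalate(75.0))   # Alert.DISENGAGE
```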

Although the industry considers Tesla's latest Autopilot 2.0 system fairly mature in longitudinal control, its assisted-steering functions still leave room for improvement; feedback from drivers has been mixed.

At the same time, functional limitations mean the current Autopilot 2.0 cannot simply be transplanted into urban environments. Although it can be engaged smoothly for short stretches (two minutes here, three minutes there), city traffic is complex: the system must cross signal-controlled intersections without lane markings, and vehicles follow each other far more closely. All of this shows the system still has plenty to learn before it can drive in the city.

Technical analysis also tells us that the current Tesla Autopilot still faces scenarios that existing technology cannot handle, and numerous accidents have amply proved that the current system cannot recognize and react to 100% of stationary obstacles. It therefore cannot be called autonomous driving.

So why can a Tesla equipped with the almost deified Autopilot system not avoid even obstacles visible from hundreds of meters away?

What are the technical limitations of Autopilot?

If you had to choose between "braking when you don't need to brake" and "not braking when you do need to brake", which would you pick?

I believe more than 90% of people would choose the former, because the latter sounds as if it could easily be fatal.

But actual practice is exactly the opposite: engineers working on Autopilot choose the latter, without exception. A setting that sounds this dangerous is something the engineers do deliberately. Why?

Current autonomous-driving technology cannot yet achieve full self-driving. Even with highly capable hardware and extensive simulation during development, the real road environment still produces many situations the system cannot judge. When that happens, the system has only two options: a "false positive" or a "false negative". In other words, when the system cannot clearly determine whether there is an obstacle ahead, should it brake immediately just in case (a false alarm), or should it ignore the uncertain danger (a miss)?
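The dilemma can be made concrete with a toy decision rule. The scores and thresholds below are assumptions invented for illustration; the point is only that a single confidence threshold forces a choice between the two failure modes.

```python
# One detection-confidence score, one threshold: the threshold's position
# decides which failure mode the system is biased toward.
BRAKE_CAUTIOUS = 0.3     # low bar: brakes often, many false positives
BRAKE_PERMISSIVE = 0.9   # high bar: brakes rarely, risks false negatives

def should_brake(obstacle_confidence: float, threshold: float) -> bool:
    """Brake only when the perception stack is confident enough."""
    return obstacle_confidence >= threshold

# An ambiguous return (say, a metal plate on the road) might score ~0.5:
ambiguous = 0.5
print(should_brake(ambiguous, BRAKE_CAUTIOUS))    # True  -> phantom braking
print(should_brake(ambiguous, BRAKE_PERMISSIVE))  # False -> potential miss
```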

Many people feel the system should be biased toward false positives: in the spirit of "better to believe a danger exists than to assume it doesn't", accidents would be avoided to the greatest possible extent. But that is not how it works. Imagine this: you are on the highway with Autopilot engaged, a car is closing fast from behind, and the road ahead is wide open. Suddenly a large steel plate appears on the road ahead. The radar judges it to be a huge obstacle and slams on the brakes, and the car behind has no time to react...

In that situation, the only safe response is to take no action at all.

Millimeter-wave radar in particular is acutely sensitive to metal reflections. A steel plate on the road, a raised manhole cover, even the bottom of a drink can: to a millimeter-wave radar, each looks like a wall. On real roads, which are full of such clutter, a car that brakes inexplicably at every strong return does not merely deliver a miserable driving experience; the phantom braking itself can increase the danger.
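This is why production radar stacks commonly filter out returns that are stationary relative to the ground (Tesla's exact behavior is not public, so treat this as a generic sketch). Field names and the 1 m/s cutoff below are assumptions for illustration:

```python
from dataclasses import dataclass

@dataclass
class RadarReturn:
    range_m: float            # distance to the reflector
    closing_speed_mps: float  # how fast we are closing on it

def is_tracked(ret: RadarReturn, ego_speed_mps: float) -> bool:
    """Keep only returns that move relative to the ground (other traffic)."""
    ground_speed = ego_speed_mps - ret.closing_speed_mps
    return abs(ground_speed) > 1.0  # near-zero => stationary clutter, dropped

# A concrete barrier 150 m ahead closes at exactly our own speed, so its
# ground speed is ~0 and it is dropped along with the manhole covers:
barrier = RadarReturn(range_m=150.0, closing_speed_mps=30.0)
print(is_tracked(barrier, ego_speed_mps=30.0))  # False: filtered as clutter
```

The same filter that silences manhole covers also silences a concrete barrier, which is one plausible reading of the accident described at the top of this article.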

Engineers therefore adopt "miss"-biased logic, preserving the driving experience and avoiding the safety hazards of excessive braking. But the moment the system misses a danger that really is there, a major disaster becomes very likely, which is why manufacturers keep stressing that "your hands must not leave the steering wheel while Autopilot is engaged": the driver must be ready to take over the vehicle at any moment.

Tesla also states this plainly in the owner's manual:

Traffic-Aware Cruise Control may not brake or decelerate to avoid a stationary vehicle, especially in this situation: you are driving at more than 80 km/h, the vehicle ahead of you changes lanes, and a stationary vehicle or object suddenly appears in front of you. Drivers should always watch the road ahead and be prepared to take immediate corrective action. Complete reliance on Traffic-Aware Cruise Control can lead to serious casualties.

Going back to the question at the very beginning: why can a Tesla that claims "all vehicles in production are capable of fully autonomous driving" not even avoid obstacles visible from hundreds of meters away? Let us analyze a few of Tesla's well-known accidents.

In May 2016, a Tesla Model S collided with a turning trailer while driving in Autopilot mode, killing the driver. Tesla explained the accident as follows:

At the time, the Model S was traveling with Autopilot engaged on a two-way road with a central divider when a tractor-trailer crossed the road perpendicular to the Model S. Against a brightly lit sky, neither the driver nor Autopilot noticed the white side of the trailer, so the braking system was not activated in time. Because the trailer was crossing the road and its body rode high off the ground, this particular combination of circumstances caused the Model S to pass under the trailer and strike its underside.

Tesla never gave a clear, convincing explanation of the accident. But speaking as engineers in this industry, and considering the state of intelligent-driving technology and the difficulties it faces, we can venture the following inference. On the hardware side, that car carried the first-generation Autopilot suite: a front-facing camera from Mobileye, a millimeter-wave radar from Bosch, and 12 ultrasonic sensors. In the priority hierarchy, the camera was dominant.


Tesla officially blamed the strong sunlight and the trailer's white body: the camera simply failed to see the trailer, and the accident followed. But those are probably only secondary causes. The key issue is more likely this: the Mobileye camera used in that generation of the system had been trained mostly on the front and rear profiles of vehicles, with limited training on side-on views, and a trailer seen from the side is a rather unusual shape. Analyzing the outline, the camera did not classify it as an obstacle. Meanwhile, because the space under the trailer was open, the millimeter-wave radar got no solid reflection when it scanned; or perhaps it did judge the trailer ahead to be potentially dangerous, but with the camera dominant, the radar's verdict carried too little weight. Two fuzzy judgments superimposed, the system chose to "miss", and the accident followed.
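That inference can be illustrated with a toy camera-led fusion rule. The weights, scores, and threshold below are invented for illustration and do not reflect any actual Tesla or Mobileye parameters:

```python
# Camera-led fusion: vision dominates the fused obstacle score.
CAMERA_WEIGHT, RADAR_WEIGHT = 0.7, 0.3
BRAKE_THRESHOLD = 0.5

def fused_score(camera_score: float, radar_score: float) -> float:
    """Weighted blend of the two sensors' obstacle confidence."""
    return CAMERA_WEIGHT * camera_score + RADAR_WEIGHT * radar_score

# White trailer against a bright sky: the camera sees almost nothing (0.1);
# the radar gets only weak returns under the trailer's open midsection (0.4).
score = fused_score(camera_score=0.1, radar_score=0.4)
print(score, score >= BRAKE_THRESHOLD)  # ~0.19 False -> the system "misses"
```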

Dan Galves, an executive at Mobileye, also said after the accident:

Current collision-avoidance technology, i.e. automatic emergency braking, is suitable only for car-following situations and was designed only for the problem of following the vehicle ahead. In other words, when another vehicle is crossing laterally, the current Autopilot system does not by itself have sufficient judgment.

So in September of the same year, Tesla announced an upgrade to the Autopilot technology. The second-generation Autopilot would use radar rather than the camera as the dominant sensor, judging the surrounding conditions through eight cameras and 12 ultrasonic sensors. The settings were adjusted as well: if the driver's hands are off the steering wheel for a certain period, the system issues an alarm, and if the driver repeatedly ignores the system's alarms, the automatic steering software stops being available.

The statement also said:

To let the vehicle better process the data collected by the sensors, it will be fitted with a more powerful computer, with 40 times the computing power of the previous generation, running a new neural network developed by Tesla to process vision, sonar, and radar signals. This system achieves a view the driver cannot have, watching every direction at once and at a speed far beyond human perception.

On this reading, Tesla already has the hardware foundation for autonomous driving. But that does not mean Tesla will have full self-driving capability right away: each individual sensor, and the fusion between the different sensors, needs time to learn and improve.
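In the toy fusion model sketched earlier, the announced change amounts to swapping the weights so that radar dominates. Note that under the same invented numbers, the weak return still falls short of the brake threshold, which is one way to see why a sensor-priority change alone does not instantly solve the stationary-obstacle problem and why the fusion still "takes time to learn":

```python
# Radar-led fusion: the same toy model as before, weights swapped.
CAMERA_WEIGHT, RADAR_WEIGHT = 0.3, 0.7   # radar now dominant (illustrative)
BRAKE_THRESHOLD = 0.5

# Same trailer scenario as above: camera 0.1, weak radar return 0.4.
score = CAMERA_WEIGHT * 0.1 + RADAR_WEIGHT * 0.4
print(score, score >= BRAKE_THRESHOLD)  # ~0.31 False -> still a miss
```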
