
Why is Level 2 autonomous driving of heavy-duty trucks difficult?

Level 2 autonomous driving is becoming commonplace in passenger cars, but for large trucks the situation is a little different. Based on the content of the announcement, I want to delve into the company's related technologies and those of the Daimler Group.

Development of autonomous driving technology

Roughly speaking, Level 2 autonomous driving is a driving support system that combines adaptive cruise control (ACC) and lane-keeping control (LKC). It recognizes the lane and the preceding vehicle with cameras and sensors and controls the accelerator, brakes, and steering to accelerate, decelerate, stop, and restart appropriately.
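To make that combination concrete, here is a minimal sketch of how ACC and LKC can cooperate in a single control step. It is a toy model under my own assumptions (the `Perception` fields, gains, and limits are invented for illustration), not Fuso's or Daimler's actual controller.

```python
from dataclasses import dataclass

@dataclass
class Perception:
    lead_gap_m: float         # distance to the preceding vehicle
    lead_closing_mps: float   # closing speed (positive = gap shrinking)
    lane_offset_m: float      # lateral offset from the lane centre (+ = right)
    lane_heading_rad: float   # heading error relative to the lane

def acc_command(p: Perception, ego_speed_mps: float,
                set_speed_mps: float = 22.0, time_gap_s: float = 2.0) -> float:
    """Longitudinal acceleration (m/s^2): hold a time gap, otherwise track the set speed."""
    desired_gap = time_gap_s * ego_speed_mps
    follow = 0.3 * (p.lead_gap_m - desired_gap) - 0.5 * p.lead_closing_mps
    cruise = 0.4 * (set_speed_mps - ego_speed_mps)
    return max(-3.0, min(1.5, min(follow, cruise)))

def lkc_command(p: Perception) -> float:
    """Steering angle (rad): correct lateral offset and heading error, within a small limit."""
    steer = -0.08 * p.lane_offset_m - 0.5 * p.lane_heading_rad
    return max(-0.1, min(0.1, steer))
```

The two commands run side by side every cycle: the longitudinal one decides acceleration or braking, the lateral one decides a small steering correction.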

In the case of heavy-duty trucks, each company already offers vehicles equipped with ACC, and some provide warnings and brake control for lane departure and lane keeping. However, models that control direction through active steering intervention have not yet been put on the market.

Mitsubishi Fuso will launch the 2019 Super Great, which supports Level 2 autonomous driving, this fall. The technology that achieves this is called Active Drive Assist (ADA). ADA is built on proximity control (equivalent to ACC) and LDP (a lane departure warning system with steering control).

Within the Daimler Group, autonomous driving technology is researched and developed at Mercedes-Benz Trucks in Europe, Daimler Trucks North America, and Mitsubishi Fuso Truck and Bus in Japan (Asia). Daimler considers North America the center of autonomous driving development because of its legal framework, road conditions, and market needs. In fact, in North America, Tesla's Semi is undergoing trials and TuSimple's self-driving trucks are being tested on regular UPS routes (commercial driving on public roads).

North America will probably lead the market launch of self-driving trucks at Level 4 and above, while Japan cooperates on experiments with right-hand-drive vehicles and on the development of vehicle control technology. Daimler does not plan to pursue Level 3 autonomous driving because it offers little benefit to drivers or operators.

The Daimler Group shares sensors.

The various sensors that form the infrastructure of autonomous driving are called the "global sensor set" and are designed and developed to a common standard. The sensor hardware is not procured as off-the-shelf supplier products; instead, each supplier builds it to Daimler's specifications. In other words, the technology related to autonomous driving is effectively developed in-house, and the sensor hardware has its own platform.

The installed software is finely tuned at each of the three development sites mentioned above. Different countries and regions have different lane-marking colors, shapes, widths, lengths, and standards, and road construction, cityscapes, and climate also differ. Sensor information such as camera images therefore has to be interpreted, and the control tuned, for each country.
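As a purely hypothetical illustration of this regional tuning, one could imagine a per-market parameter set layered on top of shared detection code. The field names and the rough figures below are invented for the sketch; they are not official road standards or Daimler's configuration.

```python
# Hypothetical per-market tuning values read by a shared lane-detection stack.
# Figures are rough illustrations only, not official standards.
LANE_PROFILES = {
    "EU": {"marking_colors": ["white"],           "typ_lane_width_m": 3.5,  "drive_side": "right"},
    "US": {"marking_colors": ["white", "yellow"], "typ_lane_width_m": 3.7,  "drive_side": "right"},
    "JP": {"marking_colors": ["white", "yellow"], "typ_lane_width_m": 3.25, "drive_side": "left"},
}

def load_lane_profile(market: str) -> dict:
    """Return the tuning set the shared detection code uses for a given market."""
    return LANE_PROFILES[market]

print(load_lane_profile("JP")["drive_side"])  # left
```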

The camera unit installed in the 2019 Super Great is also part of the global sensor set. Although it is a monocular camera, its sensing functions have been enhanced. In addition to the automatic braking and the detection of the preceding vehicle's distance and speed required for ACC, it can now identify and detect oncoming vehicles, parked vehicles ahead, pedestrians, and bicycles.

Tesla and others have adopted control that relies mainly on camera-based image recognition for autonomous driving, and this represents one global trend. Mitsubishi Fuso also uses millimeter-wave radar for its collision mitigation braking and for side obstacle detection (prevention of entanglement), but for Level 2 autonomous driving it has shifted to image sensing.

Lane keeping is challenging for large trucks because there is little margin within the lane.

The advantage of using a camera is that it makes it easier to recognize vehicles, bicycles, pedestrians, buildings, and many other objects. Radar cannot determine the shape of a target. LiDAR captures the range and the state (point cloud) of a target at close range, but it does not provide enough information to distinguish whether the target is a car or a box. The new camera unit has enhanced sensitivity and resolution and is equipped with two processors to improve pedestrian detection and identification performance at night.
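To make the trade-off between the three modalities concrete, the sketch below shows what a detection record might carry from each sensor: radar gives range and relative speed but no class, LiDAR gives a point-cloud extent, and only the camera supplies a semantic label. The data structure is an illustrative assumption, not the format used by the actual sensor set.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class Detection:
    sensor: str                                      # "radar", "lidar" or "camera"
    range_m: Optional[float]                         # radar/LiDAR measure this; camera estimates it
    rel_speed_mps: Optional[float]                   # radar measures this directly
    extent_m: Optional[Tuple[float, float, float]]   # LiDAR: box fitted to the point cloud
    label: Optional[str]                             # camera: "car", "pedestrian", "bicycle", ...

# Radar: precise range and closing speed, but no idea whether it is a car or a box.
radar_hit = Detection("radar", range_m=54.2, rel_speed_mps=-3.1, extent_m=None, label=None)
# LiDAR: shape of the point cloud, still no semantic class.
lidar_hit = Detection("lidar", range_m=54.0, rel_speed_mps=None, extent_m=(4.5, 1.8, 1.5), label=None)
# Camera: semantic label, which is what pedestrian and cyclist detection needs.
camera_hit = Detection("camera", range_m=55.0, rel_speed_mps=None, extent_m=None, label="car")
```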

One reason why Level 2 autonomous driving, already common in passenger cars, has been slow to reach large trucks is the body's physical size and weight. Even with the same lane width, the controllable range differs between a heavy-duty truck and a passenger car: a heavy-duty truck drifts out of the lane quickly with only a slight deviation or a small delay in control.
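A back-of-the-envelope calculation shows how much smaller the margin is. Assuming a 3.5 m lane (a round figure chosen only for illustration), a 2.5 m-wide truck has about 0.5 m of clearance on each side, while a 1.8 m-wide passenger car has about 0.85 m.

```python
def margin_per_side(lane_width_m: float, vehicle_width_m: float) -> float:
    """Clearance on each side when the vehicle is perfectly centred in the lane."""
    return (lane_width_m - vehicle_width_m) / 2.0

print(margin_per_side(3.5, 2.5))  # heavy-duty truck: 0.5 m per side
print(margin_per_side(3.5, 1.8))  # passenger car: ~0.85 m per side
```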

Volvo Trucks is trying to achieve the delicate control required for autonomous driving with electronically controlled steer-by-wire steering. Mitsubishi Fuso (Daimler) uses a conventional power steering system with electric assist, but it has developed its own control ECU and uses the camera unit's information described above to perform control that predicts the lane's curvature. An acceleration sensor also measures the yaw moment, and the vehicle's actual motion is fed back into the control. As a result, differences due to the truck's weight (which changes with the load) and the road surface condition can be absorbed to some extent.
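One common way to structure such control, sketched below under my own assumptions (the gains and wheelbase value are invented; this is not Fuso's published design), is a feedforward steering term computed from the predicted lane curvature plus a feedback term on the measured yaw rate, so that differences from load and road surface show up as a correctable error rather than a retuning job.

```python
import math

def steering_angle(curvature_1pm: float, speed_mps: float,
                   measured_yaw_rate_rps: float,
                   wheelbase_m: float = 6.0, k_fb: float = 0.3) -> float:
    """Feedforward from the predicted lane curvature plus feedback on the yaw-rate error."""
    # Feedforward: kinematic steering angle needed to follow the predicted curvature.
    delta_ff = math.atan(wheelbase_m * curvature_1pm)
    # Feedback: compare the yaw rate the curvature implies with what the truck actually does;
    # load and road-surface effects appear here and get corrected.
    desired_yaw_rate = speed_mps * curvature_1pm
    delta_fb = k_fb * (desired_yaw_rate - measured_yaw_rate_rps)
    return delta_ff + delta_fb

# Example: a 500 m radius curve at about 80 km/h with the truck slightly under-rotating.
print(steering_angle(curvature_1pm=1 / 500, speed_mps=22.2, measured_yaw_rate_rps=0.040))
```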

 
