The Case IH Magnum autonomous concept tractor, shown in the field with the Early Riser 2150 planter, looks futuristic with its cab-less design, carbon-fiber front fenders and signature LED status running lights.

CNH advances driverless tech for tractors

Precision farming and machine automation already play a significant role in agriculture. CNH Industrial’s Innovation Group is focusing on key times of the year when farm work still requires long days in the field, particularly when harvesting a crop or planting the next one. Working with Utah-based technology provider Autonomous Solutions Inc. (ASI), the Innovation Group developed concept autonomous technology to meet this challenge and demonstrated it via tractor concepts based on the existing Case IH Magnum and New Holland T8 high-horsepower conventional tractors.

“There have been a number of groups and product platforms that have been involved with automation of some of the tractor onboard systems, and those are all enablers that allowed us to put together a very successful autonomous concept vehicle program,” John Posselius, CNH Industrial Head of Agricultural Innovation Technology, told Truck & Off-Highway Engineering. “Things as simple as having ISOBUS Class 3 capabilities on our tractor allows us to communicate by wire to all of the important functions on the tractor such as the hydraulics, the hydraulic remote, the three-point hitch, PTO, steering, transmission and engine control.”
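
The by-wire access Posselius describes is what makes supervisory autonomy practical: every major tractor function can be commanded electronically, so the autonomy software only needs to issue setpoints while the tractor's existing controllers close the loops. The sketch below is purely illustrative, assuming a hypothetical ByWireController class and bus object; it is not the ISO 11783 (ISOBUS) message set or any CNH software.

```python
# Illustrative only: a by-wire command layer of the kind ISOBUS Class 3 enables.
# Class, method and parameter names are hypothetical, not CNH's or ISO 11783's.

from dataclasses import dataclass

@dataclass
class TractorCommand:
    """One by-wire setpoint for a tractor subsystem."""
    subsystem: str   # e.g. "engine", "hitch", "pto", "steering"
    parameter: str   # e.g. "target_rpm", "position_pct", "state", "curvature"
    value: float

class ByWireController:
    """Issues setpoints to tractor functions over a bus object (mocked here)."""

    def __init__(self, bus):
        self.bus = bus  # assumed to expose send(TractorCommand)

    def set_engine_speed(self, rpm: float):
        self.bus.send(TractorCommand("engine", "target_rpm", rpm))

    def set_hitch_position(self, percent: float):
        self.bus.send(TractorCommand("hitch", "position_pct", percent))

    def engage_pto(self, on: bool):
        self.bus.send(TractorCommand("pto", "state", 1.0 if on else 0.0))

    def steer(self, curvature: float):
        self.bus.send(TractorCommand("steering", "curvature", curvature))
```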

The concept tractors are configured as two distinct versions: the cab-less Case IH Magnum and the New Holland T8 NHDrive concept that maintains its cab for operating flexibility. Both use a conventional engine, transmission, chassis and implement couplings.

A fully interactive interface has been developed to control the tractors. It comprises three operating screens: one plots the path and shows the tractor’s progress; one displays live camera feeds with up to four views (two front, two rear); and one allows key machine and implement parameters, such as engine speed and implement settings, to be monitored and modified.

Once path plotting has finished, the user can choose a job from a pre-programmed menu by selecting the vehicle, choosing the field and then setting the tractor out on its task. The sequence takes about 30 seconds.
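
Functionally, that 30-second start sequence reduces to a handful of on-screen selections followed by a dispatch command. The sketch below is a minimal illustration using hypothetical names (Job, fleet.vehicles, dispatch); the actual CNH interface is not public.

```python
# Minimal sketch of the job-start sequence described above; all names are
# illustrative placeholders, not CNH's software.

def start_job(fleet, jobs):
    """Select a pre-programmed job, pick vehicle and field, then dispatch."""
    job = choose(jobs)                # pick from the pre-programmed menu
    tractor = choose(fleet.vehicles)  # select the vehicle
    field = choose(job.fields)        # choose the field
    path = job.planned_path(field)    # path plotting has already been completed
    tractor.dispatch(path)            # set the tractor out on its task

def choose(options):
    # stand-in for the operator's on-screen selection
    return options[0]
```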

The two tractors have a complete sensing and perception package in common, which includes radar, Lidar and video cameras to ensure obstacles in the tractor’s path or that of the implement are detected and avoided. If an object is detected in the tractor’s path, visual and audio warnings appear on the control interface—either tablet interface or desktop—which offers a choice of how the tractor should respond: by waiting for human intervention, driving around the obstacle using either a manually or automatically plotted path, or driving onwards if the object is not a danger.
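
Conceptually, those response options reduce to a simple decision flow once an obstacle is flagged. The sketch below illustrates that flow with hypothetical names (Response, planner.detour_around); it is not CNH's implementation.

```python
# Sketch of the obstacle-handling flow described above; enum values and
# function names are illustrative assumptions.

from enum import Enum, auto

class Response(Enum):
    WAIT_FOR_HUMAN = auto()
    REPLAN_AROUND = auto()   # manually or automatically plotted detour
    CONTINUE = auto()        # object judged not to be a danger

def handle_detection(obstacle, interface, tractor, planner):
    tractor.pause()
    interface.raise_alert(obstacle, visual=True, audio=True)
    choice = interface.await_operator_choice()    # tablet or desktop
    if choice is Response.WAIT_FOR_HUMAN:
        return                                    # stay paused until cleared
    if choice is Response.REPLAN_AROUND:
        detour = planner.detour_around(obstacle)  # auto-plotted or operator-drawn
        tractor.follow(detour)
    elif choice is Response.CONTINUE:
        tractor.resume()
```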

“The sensing and perception is a real challenge,” said Posselius. “We’ve built in some nice systems in our concept vehicles and they do what we need right now. But one of the real challenges to truly move forward is our sensing and perception has to get much smarter.”

When operating parameters become critical, as in the case of low fuel or seed levels, the same notification system is used. Any critical machine alarm or loss of a critical machine-control function causes the autonomous vehicle to stop automatically; a stop button on the control interface can also be activated manually.
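
In outline, the behavior described above separates warnings that merely notify the operator from faults that force a stop. The following sketch assumes hypothetical status fields and method names purely for illustration.

```python
# Illustrative monitoring loop body: notifications for consumables, automatic
# stop on critical faults, manual stop from the interface. Names are assumed.

def monitor_step(tractor, interface):
    if interface.stop_button_pressed():
        tractor.stop()                  # manual stop from tablet or desktop
        return
    status = tractor.read_status()
    if status.low_fuel or status.low_seed:
        interface.notify(status)        # same alert path as obstacle warnings
    if status.critical_alarm or not status.controls_ok:
        tractor.stop()                  # automatic stop
        interface.raise_alert(status, visual=True, audio=True)
```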

Machine tasks can be modified in real time, for example if a storm is approaching. In the future, these concept tractors will be able to use “Big Data” such as real-time satellite weather information to automatically make best use of ideal conditions, independent of human input and regardless of the time of day, the company claims. For example, the tractor would stop automatically should it become apparent that weather would cause a problem, then recommence work when conditions have sufficiently improved; or it could be sent to another field altogether where conditions are better.
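
Because this is described as a future capability, the sketch below is necessarily speculative: it assumes a hypothetical weather feed and a simple rain-rate threshold to show how such a pause/resume supervisor might look.

```python
# Speculative sketch of weather-driven pause/resume; the weather feed, the
# rain-rate threshold and all method names are assumptions for illustration.

def weather_supervisor(tractor, weather_feed, rain_threshold_mm_per_h=5.0):
    forecast = weather_feed.nowcast(tractor.position())  # hypothetical satellite/nowcast source
    if forecast.rain_rate > rain_threshold_mm_per_h:
        tractor.pause()        # stop before conditions become a problem
    elif tractor.is_paused() and forecast.rain_rate == 0:
        tractor.resume()       # recommence when conditions have improved
    # alternatively, the tractor could be dispatched to another field
    # where conditions are better
```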

The tablet interface also can be mounted in another machine, whose operator can then supervise the autonomous tractor’s activities. From the seat of a combine or tractor, for example, the operator can monitor the progress and, if necessary, modify the performance of an autonomous tractor/planter combination working in the same or a neighboring field. This allows autonomous tractors to “seamlessly integrate” into an existing farm machinery fleet with minimal operational changes.

According to CNH, the autonomous technologies have been designed so that, in the future, they could be further developed to enable their application across the full range of equipment in a farmer’s fleet. This could encompass the full range of tractors, harvesting equipment and support vehicles, such as sprayers.

Because CNH Industrial is a diversified company with operations in three segments (Commercial Vehicle, Agricultural and Construction Machinery), transfer of technology from one application to another is not only possible but already happening. CNH’s construction business is in the early stages of applying autonomous technology to some of its smaller equipment.

“What we develop in one sphere we can very easily adapt and apply in the others,” a spokeswoman told TOHE. “You’ve got truck platooning [by Iveco] and all the technology behind that, which we can sort of cherry-pick what we can from the experience there and then apply it to the Ag sphere and Construction business. We’re not operating in silos.”

With the autonomous tractors, the company is already working with some customers in the U.S. to set up an initial pilot program over a small group of farms with diverse operating conditions and environments. The program, which is expected to start next year, will help to determine how these products work in the real world and where some of the snags might be when operating in different conditions.

“So far, work has been strictly under the engineering organizations, specifically the Innovation Group, but we are broadening that,” said Posselius. “As we work with our customers, what we’re trying to see is how they would use something like this if it was a production piece of equipment. What specific needs do they have that we may not have foreseen yet? A lot of that work will now be done by our other organizations that deal closer with our customers.”
