Perception system development for automated combine-to-cart unloading during harvest of grain crops
During harvest, a combine harvester must frequently unload grain into another container to empty its hopper for the next round of crop intake. An efficient farming practice is unloading on-the-go, in which the combine transfers the material to a tractor-drawn cart without stopping harvesting. Although desirable, unloading on-the-go requires highly skilled labor and complete concentration, as the combine operator must simultaneously monitor the cart fullness, control the combine's movement, and communicate with the tractor operator to coordinate the two vehicles.
Automating the unloading-on-the-go process can ease the operator's burden, lower the demand for skilled labor, and enhance harvesting productivity. This work is part of the Auto Unload project, a collaboration between Purdue University and John Deere to develop a fully automated grain unloading-on-the-go system.
To fully automate the unloading-on-the-go operation, a vision-based perception system is needed to track the grain cart location and monitor the fill status inside the cart. This work develops novel tools, frameworks, and algorithms for the design and evaluation of perception systems for automatic grain unloading applications.
First, a simulation environment for vision-based perception systems was developed using the Unreal Engine. The virtual environment is tailored to grain unloading scenarios with a combine and a tractor-drawn cart, and it simulates perception sensors, including cameras and LiDAR. The simulation tool is used to model sensor behavior in the unloading environment and to determine a preferable sensor placement for the automatic unloading system design.
The fill status information extracted by the perception system is used by the controller for decision making. To investigate the impact of the perception system on the unloading control, a novel co-simulation (CoSim) framework was developed that integrates models of the perception system and the controller so that they run concurrently and interact with each other. The proposed CoSim framework simulates perception in Unreal Engine and the system models and dynamics in MATLAB/Simulink. Two-way communication was established between Unreal and MATLAB/Simulink to mimic the complex interactions between the perception and control modules, and between the automatic unloading system and the external environment. Two simulation cases demonstrated that CoSim can be used for both system design and system evaluation. The CoSim scheme can also be transferred to other applications given the versatility of the software packages used. To the best of the author's knowledge, this was the first work to demonstrate closed-loop simulation of a complex automation task using Unreal and MATLAB/Simulink.
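The closed-loop exchange at the heart of the CoSim framework can be illustrated with a minimal sketch. The snippet below is not the dissertation's implementation: it stands in for the Unreal and Simulink endpoints with two local UDP sockets and a hypothetical JSON message format (a `fill_fraction` status and an `auger_on` command), to show how a perceived fill status flows to the controller and a control command flows back each step.

```python
import json
import socket
import threading

def simulink_stub(sock, n_steps):
    """Stand-in for the Simulink side: receive fill status, reply with a command."""
    for _ in range(n_steps):
        data, addr = sock.recvfrom(4096)
        fill = json.loads(data)["fill_fraction"]
        cmd = {"auger_on": fill < 0.9}  # toy threshold controller
        sock.sendto(json.dumps(cmd).encode(), addr)

# Two UDP sockets on localhost mimic the Unreal <-> Simulink link.
ctrl_sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
ctrl_sock.bind(("127.0.0.1", 0))
ctrl_addr = ctrl_sock.getsockname()

percep_sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
percep_sock.settimeout(2.0)

n_steps = 5
worker = threading.Thread(target=simulink_stub, args=(ctrl_sock, n_steps))
worker.start()

fill = 0.0
commands = []
for _ in range(n_steps):
    # "Unreal" side: send the perceived fill status, wait for the command.
    percep_sock.sendto(json.dumps({"fill_fraction": fill}).encode(), ctrl_addr)
    data, _ = percep_sock.recvfrom(4096)
    commands.append(json.loads(data)["auger_on"])
    if commands[-1]:
        fill += 0.25  # auger running -> cart fills up
worker.join()
print(commands)  # auger command switches off once the cart is nearly full
```

The loop terminates with the auger commanded off, showing the two-way coupling: the controller's decision depends on the perceived state, and the simulated state evolves in response to the command.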
To develop a reliable perception system, experimental validation is necessary. For the Auto Unload project, a stereo-camera-based perception system was chosen as the production-intent system because it generates high-resolution 3D images at low cost.
This work proposes a novel LiDAR-based benchmark system to evaluate the in-situ performance of the stereo-camera-based perception within unloading-on-the-go automation systems. The benchmark system incorporates a dust filtering strategy so that it remains usable in dusty conditions, and it runs simultaneously with the stereo-camera-based system to provide benchmark data during in-field testing. Experimental results demonstrated that the proposed benchmark system provides consistent benchmark data in both clear and dusty environments. Compared with existing evaluation methods for sensing/perception, the benchmark system in this dissertation achieves in-situ evaluation of a complex perception task (i.e., a deformable grain profile), yielding a holistic view of perception system performance in real-world grain unloading operations.
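The abstract does not detail the dust filtering strategy, but one common family of approaches exploits the fact that dust returns are transient while the grain surface and cart edges change slowly between scans. The sketch below is an illustrative temporal-consistency filter under that assumption, not the dissertation's method: a beam's return is kept only if its range agrees with the previous scan within a tolerance.

```python
import numpy as np

def filter_dust(ranges_curr, ranges_prev, max_jump=0.5):
    """Keep LiDAR returns whose range is stable between consecutive scans.

    ranges_curr, ranges_prev: per-beam range measurements in metres
    (NaN = no return). Dust puffs produce large scan-to-scan range jumps,
    so unstable beams are dropped (set to NaN).
    """
    stable = np.abs(ranges_curr - ranges_prev) < max_jump
    return np.where(stable, ranges_curr, np.nan)

# Tiny example: beam 2 jumps by 2 m between scans (a dust puff) and is removed.
prev = np.array([4.0, 4.1, 4.2, 4.3])
curr = np.array([4.0, 4.1, 2.2, 4.35])
out = filter_dust(curr, prev)
print(out)  # [4.0, 4.1, nan, 4.35]
```

In practice such filters are often combined with intensity thresholds or multi-echo information, since dust clouds also tend to produce weak, diffuse returns.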
The evaluation results show that the stereo-camera perception system was susceptible to dust and intermittently failed to perceive the grain profile. To enable robust grain perception, a data fusion strategy was proposed that fuses the measurement data from the perception system with the prediction from a grain fill model. Experimental results demonstrate that the fusion strategy increased the accuracy and coverage of the grain profile estimation and worked robustly under different conditions. Unlike the prevalent Kalman filter-based fusion approach, which is computationally expensive for high-dimensional data, the proposed algorithm achieves sufficient accuracy for unloading automation while maintaining a low computation cost.
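A low-cost fusion of this kind can be sketched as a per-cell weighted blend with model fallback. The snippet below is an illustration of the general idea under assumed names (`measured`, `predicted`, a fixed measurement weight `w_meas`), not the dissertation's algorithm: cells where the camera produced a valid height get a blend of measurement and model prediction, while occluded cells fall back to the prediction alone, so the fused profile keeps full coverage at essentially the cost of one array pass.

```python
import numpy as np

def fuse_profile(measured, predicted, w_meas=0.8):
    """Blend a stereo-camera grain-height profile with a fill-model prediction.

    measured:  per-cell grain heights from the perception system,
               NaN where the camera failed (e.g., dust occlusion).
    predicted: per-cell heights from the grain fill model.
    Returns a fused profile with no gaps.
    """
    valid = ~np.isnan(measured)
    fused = predicted.copy()
    fused[valid] = w_meas * measured[valid] + (1.0 - w_meas) * predicted[valid]
    return fused

measured = np.array([1.0, np.nan, 0.9, np.nan])  # two cells lost to dust
predicted = np.array([1.1, 1.0, 1.0, 0.8])
fused = fuse_profile(measured, predicted)
print(fused)  # [1.02, 1.0, 0.92, 0.8]
```

Compared with a Kalman filter over the full grid, which would maintain and update a covariance for every cell, a fixed-weight blend of this form involves no matrix operations, which is consistent with the abstract's emphasis on low computation cost for high-dimensional profile data.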