In France, the average age of farmers is constantly increasing, which poses a major challenge for the sustainability of the sector.

Agricultural robotics is proving to be a promising solution to address the shortage of labor and meet the challenges of productivity and safety.

It is in this context that the first Hackathon of the “Grand Défi de la Robotique Agricole” (GDRA) took place.

The Hackathon: Principles, proceedings, rules, and challenges

Organized by Robagri via France 2030, the GDRA Hackathon aimed to consolidate and accelerate the robotization of the agricultural sector by bringing together researchers and professionals from the field.

This first edition aims to address 3 major challenges related to the arrival of robots in the agricultural environment:

  1. Itinerary creation
  2. Pedestrian detection
  3. Obstacle avoidance

Teams from all over the world competed on each of these topics during a qualification phase from December 2023 to January 2024.

Based on the performance of each solution presented and the opinion of a jury of experts, the top 2 teams from each challenge were qualified for a final phase.

The Innolab team of T&S took up the challenge by registering for the pedestrian detection and obstacle avoidance challenges.

The Hackathon: A Platform for Exchange and Confrontation for T&S and its AG-Tech Solutions

Since 2019, T&S has been developing all-terrain intelligent mobility solutions through its R&D team, Innolab.

Specialized in AG-Tech, the team has acquired a lot of expertise on the themes of perception, planning, control, and safety functions.

This Hackathon was an opportunity for T&S to confront its expertise with other players in the market.

The Competition Starts

During the qualification phase of the hackathon, T&S chose to participate in challenges 2 and 3: pedestrian detection and obstacle avoidance, respectively.

The Complexity of Pedestrian Detection in Agricultural Environments

Obstacle detection in agricultural environments is particularly complex.

The detection and identification of environmental elements can be compromised by many factors:

  • Changes in brightness: day, night, shadows cast by the canopy and crops
  • Weather hazards: rain, snow, hail, fog
  • Heavy occlusion caused by vegetation: farmers commonly walk from one crop row to another to inspect and harvest each plant. The high density of vegetation creates a natural camouflage that prevents robots from correctly identifying pedestrians.

Reliably identifying people and distinguishing them from other objects has only become possible thanks to recent improvements in artificial intelligence.

Limitations of Classical Methods and Rise of AI

Traditional obstacle detection methods can detect obstacles but cannot determine their nature, a crucial limitation for the safety of agricultural robots. AI now makes precise identification of obstacles possible.

Advantages of Robot Instrumentation

Easier and more realistic than environmental instrumentation, robot instrumentation offers flexibility and adaptability.

The Innolab team favored an on-board solution based on AI and robot perception for reliable and precise obstacle detection by designing an innovative solution based on a LiDAR sensor.

Pedestrian Detection for Agricultural Robots: Choosing LiDAR

Pedestrian detection tasks are usually addressed with sensors such as cameras, thermal cameras or LiDARs.

As the competition took place in a simulator, using a thermal camera was not an option.

In addition, cameras used in autonomous vehicles are unreliable at night or without adequate lighting.

LiDARs, which work on a principle similar to sonar, perform equally well day and night. They also provide unrivalled accuracy when measuring the distance to both near and distant objects.

This technology allows pedestrians to be detected with high precision, even in difficult weather conditions and in the presence of obstacles.

Detecting Pedestrians in Complex Environments

The system developed by the Innolab team adapts to variations in brightness, weather hazards and the heavy obstruction caused by vegetation.

It uses a robust deep learning model to distinguish pedestrians from obstacles and ensure accurate real-time localization.
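To make the pipeline concrete, here is a minimal sketch of the classic LiDAR processing steps involved: converting a 2D scan to Cartesian points, grouping nearby points into clusters, and classifying each cluster. The size heuristic below is only a hypothetical stand-in for the team's deep learning classifier, and all function names and thresholds are illustrative assumptions, not the actual system.

```python
import math

def polar_to_xy(ranges, angle_min, angle_inc):
    """Convert a 2D LiDAR scan (ranges in metres) to Cartesian points,
    dropping invalid (infinite/NaN) returns."""
    return [(r * math.cos(angle_min + i * angle_inc),
             r * math.sin(angle_min + i * angle_inc))
            for i, r in enumerate(ranges) if math.isfinite(r)]

def cluster_points(points, gap=0.3):
    """Group consecutive scan points whose spacing stays below `gap` metres."""
    clusters, current = [], []
    for p in points:
        if current and math.dist(current[-1], p) > gap:
            clusters.append(current)
            current = []
        current.append(p)
    if current:
        clusters.append(current)
    return clusters

def looks_like_pedestrian(cluster, min_size=0.2, max_size=0.8):
    """Crude bounding-box size check standing in for the learned model:
    a person-sized cluster falls in a narrow footprint range."""
    xs = [p[0] for p in cluster]
    ys = [p[1] for p in cluster]
    diagonal = math.dist((min(xs), min(ys)), (max(xs), max(ys)))
    return min_size <= diagonal <= max_size
```

In a real system, the heuristic classifier would be replaced by the trained network, and the clusters would be tracked across scans to provide the real-time localization mentioned above.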

Obstacle Avoidance in an Agricultural Environment

Participation in the obstacle avoidance challenge allowed the Innolab team to improve its use of key tools (ROS, Gazebo, Docker) and to strengthen its expertise in obstacle avoidance.

Dual Requirement for the Obstacle Avoidance Method

  1. Bypassing Moving Obstacles

The method must allow the robot to detect and avoid moving obstacles, thus ensuring smooth and safe navigation.

  2. Precise Navigation in Confined Spaces

The method must also allow the robot to pass close to fixed obstacles, such as rows of closely spaced vines, for better efficiency in complex environments.

Existing Approaches to Obstacle Avoidance

Unlike road vehicles and their overtaking maneuvers in simple environments, obstacle avoidance for agricultural robots is a greater challenge given the multitude of elements and specific constraints of fields.

In the case of autonomous robots in constrained environments, research focuses on three areas:

  • Empirical and geometric approaches (low computational cost, but can get “stuck” in complex cases)
  • Path planning approaches by optimization (very time-consuming, but valid solutions even in constrained environments)
  • Model predictive control approaches (high computational cost, but consider the dynamics of the vehicle and obstacles, recommended for high-speed vehicles)

A Simple and Effective LIDAR-Based Approach

2D real-time obstacle map generated from LiDAR sensor data

Given the development time available and the implementation constraints, the team deliberately focused on relatively simple, computationally inexpensive approaches.

The chosen method is based on the construction of a real-time obstacle map from a 2D LiDAR sensor.

With each new measurement, the map is updated to refine detection and consider moving obstacles. This map, coupled with the robot's orientation and maneuverability information, allows the danger posed by obstacles around the robot to be quantified.

The system assesses the danger of obstacles and adapts the robot's trajectory to avoid collisions.

This simple and robust approach adapts to a wide variety of environments.
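The obstacle-map idea described above can be sketched in a few lines. In this toy version, each scan reinforces occupied cells while older evidence decays, so obstacles that move away fade out of the map; a danger score then weights occupied cells inside the robot's forward cone by proximity. The class name, decay scheme, and scoring formula are all illustrative assumptions, not the team's exact formulation.

```python
import math

class ObstacleMap:
    """Minimal 2D occupancy grid updated from successive LiDAR scans."""

    def __init__(self, res=0.25, decay=0.5):
        self.res = res      # cell size in metres
        self.decay = decay  # evidence retained between scans
        self.grid = {}      # (i, j) cell -> confidence in [0, 1]

    def update(self, points):
        """Fold a new scan (Cartesian points in the robot frame) into the map."""
        # Fade old evidence so obstacles that have moved disappear over time.
        self.grid = {c: v * self.decay for c, v in self.grid.items()
                     if v * self.decay > 0.05}
        for x, y in points:
            cell = (int(x // self.res), int(y // self.res))
            self.grid[cell] = min(1.0, self.grid.get(cell, 0.0) + 0.6)

    def danger(self, heading, fov=math.radians(60), horizon=5.0):
        """Score obstacles inside the robot's forward cone (0.0 = clear)."""
        score = 0.0
        for (i, j), conf in self.grid.items():
            x = (i + 0.5) * self.res
            y = (j + 0.5) * self.res
            d = math.hypot(x, y)
            if d == 0.0 or d > horizon:
                continue
            # Wrap the relative bearing into [-pi, pi].
            rel = (math.atan2(y, x) - heading + math.pi) % (2 * math.pi) - math.pi
            if abs(rel) <= fov / 2:
                score += conf / d  # closer obstacles weigh more
        return score
```

The danger score can then feed a steering rule: the robot compares candidate headings and follows the one with the lowest score, which is one simple way to couple the map with the robot's orientation and maneuverability information.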

Semantic Segmentation: A Crucial Issue for the Autonomy of Agricultural Robots

Semantic segmentation for agricultural robots is complex due to external conditions and the need to learn to identify non-standardized obstacles.

Fortunately, T&S and its Englab and Innolab teams have expertise in AI and data generation and processing tools, allowing for better obstacle detection.

Promising Results and Great Adaptability

With these two proposed solutions, T&S was able to prove its skills by climbing to 1st place in the pedestrian detection challenge thanks to the detection rate and accuracy of its solution.

This 1st place in the qualification phase secured our ticket to the final phase, held during the 2024 edition of the World FIRA!

The Final at WFIRA 2024

The World FIRA is one of the largest, if not the largest, AG-Tech events.

This exhibition brings together farmers, industrialists, robot manufacturers and researchers.

Over three days, the event offers demonstrations of cutting-edge robots, conferences, presentations of research projects, and educational sessions.

The GDRA Hackathon: A Thrilling Final

Innolab chose LiDAR sensors to optimize its obstacle detection solution
3D simulator view with the LiDAR data

Day 1: Immersion in the 4D-Virtualiz Simulator

On February 6, 2024, the GDRA Hackathon finals got off to a flying start with the presentation of the new 4D-Virtualiz simulator and its new challenges.

This state-of-the-art simulator can add measurement noise to sensors, generate unfavorable weather scenarios, and integrate realistic pedestrians into the environment, in addition to standard obstacles (buildings, fences, moving agricultural machines, pedestrians, vegetation, crops).

The Innolab Team Put to the Test

Despite these new constraints, our expert team did not disappoint. Thanks to their rigorous preparation, they obtained rapid results, with performances similar to those of the previous phase.

A New Jury for a New Challenge

The evaluation criteria from the qualification phase were not used to separate the finalist teams. Indeed, as each challenge is unique, an objective comparison was not possible. This is why a second jury of experts was appointed to select the winning team.

After deliberation, the T&S Innolab team was named the winner of the GDRA hackathon thanks to its innovative pedestrian detection solution.


From Simulation to Reality: Innolab Tests Its Solution in the Field

After a month and a half of fierce competition on simulators, the Innolab team is ready to take on a new challenge: testing its pedestrian detection solution in a real agricultural environment.

To do this, they will rely on their all-terrain KIPP test platform.

These crucial experiments will make it possible to confirm the technical choices made throughout the project and to validate the operation of this innovative system in the real world.

The addition of cameras to the system is also envisaged to meet the specific requirements of certain application cases.

Stay tuned to follow the next advances of this promising project!
