
Litter Robot

HackIllinois Litter Detection & Mapping Robot

Goals

The goal of our project was to create an autonomous robot with a genuine real-world application, using the toolkit provided by the John Deere sponsors.

The toolkit we were provided with included the following components:

  • Raspberry Pi
  • Wheels
  • Power Bank
  • Ultrasonic Sensors
  • Camera
  • Pi Hat (optional)

Purpose and Approach

We decided that a meaningful application for an autonomous robot would be garbage identification and collection.

Our first goal was to build a functioning rover that could move as instructed. To do this, we used the Raspberry Pi and power bank as the core of the frame and attached the remaining components (wheels, sensors, and camera) to extend its functionality.
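
The kit documentation isn't reproduced here, so the snippet below is only a minimal sketch of how the wheels can be driven from the Pi, assuming one gpiozero Motor per side; the GPIO pin numbers and the helper functions are illustrative assumptions, not our exact wiring.

    from time import sleep
    from gpiozero import Motor
    
    # Assumed wiring: each motor driver channel on a pair of GPIO pins
    left_motor = Motor(forward=7, backward=8)
    right_motor = Motor(forward=9, backward=10)
    
    def forward(speed=0.5, duration=1.0):
        """Drive both motors forward for a short burst."""
        left_motor.forward(speed)
        right_motor.forward(speed)
        sleep(duration)
        left_motor.stop()
        right_motor.stop()
    
    def turn_left(speed=0.5, duration=0.5):
        """Pivot in place by driving the wheels in opposite directions."""
        left_motor.backward(speed)
        right_motor.forward(speed)
        sleep(duration)
        left_motor.stop()
        right_motor.stop()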

After the rover successfully moved according to our instructions, we implemented functionality for the ultrasonic sensors and the camera. The ultrasonic sensors would change the rover's direction whenever it was about to collide with an obstacle, and the camera would detect the objects in front of the rover; a rough sketch of this obstacle-check loop appears after the camera snippet below.

    import os
    import pandas as pd
    import tensorflow_hub as hub
    from picamera2 import Picamera2
    
    # Initialize Picamera
    picam2 = Picamera2()
    picam2.start()
    
    # Load the object detection model
    detector = hub.load("https://tfhub.dev/tensorflow/efficientdet/lite2/detection/1")
    labels = pd.read_csv('labels.csv', sep=';', index_col='ID')
    labels = labels['OBJECT (2017 REL.)']
    
    # Set the directory to save snapshots
    save_directory = "./snapshots/"
    os.makedirs(save_directory, exist_ok=True)

This snippet is a portion of how we used the camera to take snapshots of the items in front of the rover. Whenever the rover got close to colliding with an object, it would stop and take a picture, then alter its course.
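
The obstacle-check loop itself is not reproduced above, so the following is only a minimal sketch of that behaviour, assuming a gpiozero DistanceSensor for the ultrasonic reading; the GPIO pin numbers, the 20 cm threshold, and the file-naming scheme are illustrative assumptions rather than our exact code.

    import time
    from gpiozero import DistanceSensor
    from picamera2 import Picamera2
    
    # Assumed wiring: ultrasonic echo on GPIO 24, trigger on GPIO 23
    sensor = DistanceSensor(echo=24, trigger=23)
    picam2 = Picamera2()
    picam2.start()
    
    OBSTACLE_THRESHOLD_M = 0.20  # assumed "about to collide" distance
    
    while True:
        if sensor.distance < OBSTACLE_THRESHOLD_M:
            # Stop, photograph the obstacle, then steer away (motor calls omitted)
            filename = f"./snapshots/obstacle_{int(time.time())}.jpg"
            picam2.capture_file(filename)
        time.sleep(0.1)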

With this implementation, we could pass the snapshots through a pretrained TensorFlow model to detect and label the objects the rover faced. This let us check whether the rover had encountered any objects in "garbage" categories, such as bottles, paper, and similar items.
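
As an illustration of that step, the sketch below runs the detector and labels loaded in the earlier snippet over a single snapshot and keeps only the detections we would treat as garbage. The score threshold, the GARBAGE_CLASSES set, and the assumed (boxes, scores, classes, num_detections) output ordering are assumptions for the example, not a definitive description of our pipeline.

    import numpy as np
    import tensorflow as tf
    from PIL import Image
    
    # Illustrative subset of COCO labels we would count as garbage
    GARBAGE_CLASSES = {"bottle", "cup", "book"}
    
    def find_garbage(image_path, detector, labels, score_threshold=0.4):
        # Load the snapshot as a uint8 batch of one image
        img = np.asarray(Image.open(image_path).convert("RGB"))
        rgb_tensor = tf.expand_dims(tf.convert_to_tensor(img, dtype=tf.uint8), 0)
    
        # Assumed output ordering for the TF Hub EfficientDet-Lite detection model
        boxes, scores, classes, num_detections = detector(rgb_tensor)
    
        found = []
        for score, cls in zip(scores.numpy()[0], classes.numpy()[0].astype(int)):
            name = labels.get(cls, "unknown")
            if score >= score_threshold and name in GARBAGE_CLASSES:
                found.append((name, float(score)))
        return found
    
    # Example: check one of the saved snapshots
    # print(find_garbage("./snapshots/obstacle_1.jpg", detector, labels))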

Finally, the locations of the garbage objects, measured relative to the rover's starting position, were tracked, and a map was created showing the location of every trash object the rover encountered.
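
The mapping code is not shown above; one simple way to build such a map is to dead-reckon an (x, y) position from the movement commands and plot every point where garbage was flagged. The helper functions and the matplotlib plot below are a sketch under that assumption, not our exact implementation.

    import math
    import matplotlib.pyplot as plt
    
    # Dead-reckoned pose relative to the rover's starting position
    x, y, heading = 0.0, 0.0, 0.0   # metres, metres, radians
    garbage_locations = []          # (x, y, label) for every flagged object
    
    def drive(distance_m):
        """Advance the estimated position by a straight-line move."""
        global x, y
        x += distance_m * math.cos(heading)
        y += distance_m * math.sin(heading)
    
    def turn(angle_rad):
        """Rotate the estimated heading in place."""
        global heading
        heading += angle_rad
    
    def record_garbage(label):
        """Remember where a piece of garbage was detected."""
        garbage_locations.append((x, y, label))
    
    def plot_map():
        """Plot every flagged location relative to the start and save the map."""
        for gx, gy, label in garbage_locations:
            plt.scatter(gx, gy)
            plt.annotate(label, (gx, gy))
        plt.xlabel("x (m)")
        plt.ylabel("y (m)")
        plt.title("Detected litter relative to start")
        plt.savefig("litter_map.png")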

Conclusion

This autonomous robot earned us a place in the final shark-tank-themed presentation before a panel of judges, representing both the Hack Knights path, which was the most competitive, and the John Deere track.