The algorithms detect and classify objects made of synthetic materials based on the distinctive way they reflect polarized light: light reflected from human-made objects often differs from light reflected by natural surfaces such as vegetation, soil, and rocks.
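As a rough illustration of the idea, the sketch below computes standard per-pixel polarization features (degree and angle of linear polarization from four polarizer-angle images) and feeds them to a generic classifier. The article does not describe the researchers' actual model or features, so the classifier choice, array names, and training data here are assumptions.

```python
# Illustrative sketch only: the classifier choice and data are assumptions,
# not the researchers' actual method.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def polarization_features(i0, i45, i90, i135):
    """Per-pixel features from a polarimetric camera.

    i0, i45, i90, i135: intensity images at four polarizer angles (same shape).
    Returns an (n_pixels, 3) feature matrix: total intensity, degree of linear
    polarization (DoLP), and angle of linear polarization (AoLP).
    """
    s0 = 0.5 * (i0 + i45 + i90 + i135)   # total intensity (Stokes S0)
    s1 = i0 - i90                        # Stokes S1
    s2 = i45 - i135                      # Stokes S2
    dolp = np.sqrt(s1**2 + s2**2) / np.clip(s0, 1e-6, None)
    aolp = 0.5 * np.arctan2(s2, s1)
    return np.stack([s0.ravel(), dolp.ravel(), aolp.ravel()], axis=1)

# Hypothetical training data: labeled pixels (1 = synthetic debris, 0 = natural background).
rng = np.random.default_rng(0)
frames = [rng.random((64, 64)) for _ in range(4)]   # stand-in for real camera frames
X = polarization_features(*frames)
y = rng.integers(0, 2, size=X.shape[0])             # stand-in labels

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X, y)
debris_mask = clf.predict(X).reshape(64, 64)        # per-pixel debris/background map
```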
The researchers tested the polarimetric camera both on the ground and from a US Coast Guard helicopter flying at the same altitude at which the camera-equipped drones will operate.
Once the system is fully operational, data collected by the drone-based machine learning system will be used to produce maps showing where marine debris is concentrated along the coast, guiding rapid response and removal efforts. The researchers will train NOAA Marine Debris Program staff in the use of the new system and provide a standard operating procedures manual.
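A minimal sketch of how georeferenced detections might be aggregated into such a concentration map follows; the coordinates, grid resolution, and output format are assumptions, as the article does not describe the actual mapping workflow.

```python
# Hypothetical aggregation of georeferenced debris detections into a density grid.
import numpy as np

def debris_density_grid(lons, lats, lon_edges, lat_edges):
    """Count debris detections per grid cell to highlight concentration hotspots."""
    counts, _, _ = np.histogram2d(lons, lats, bins=[lon_edges, lat_edges])
    return counts

# Example with made-up detection coordinates along a stretch of coastline.
rng = np.random.default_rng(1)
lons = rng.uniform(-156.5, -156.0, 500)   # assumed longitude range
lats = rng.uniform(20.6, 21.0, 500)       # assumed latitude range
grid = debris_density_grid(lons,
                           lats,
                           np.linspace(-156.5, -156.0, 51),
                           np.linspace(20.6, 21.0, 41))
hotspot = np.unravel_index(grid.argmax(), grid.shape)   # cell with the most detections
```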