By Kate Gersh

Holy moly, technology! We have exciting news from our Neighbors to Nature: Cache Creek Study (N2N), a partnership with Friends of Pathways, The Nature Conservancy-Wyoming, and the Bridger-Teton National Forest (BTNF). The project's aim is to provide valuable wildlife, plant phenology, trail, and recreation-use data to the BTNF. That data will inform the forthcoming BTNF Forest Plan revision process and shape the way we recreate on our trail system while coexisting with wildlife.

A component of this large-scale citizen science project is the deployment of 27 game cameras throughout our local Cache Creek, Snow King, and Game Creek trail system, and here is where the exciting update comes into play. Over the course of the N2N project so far, the team has collected a whopping 719,173 images from these game cameras! Most of them contain "nothing there" because the cameras were triggered by wind, snowfall, or moving vegetation. We have been searching for a way to process this large quantity of camera trap images and, especially, to pre-sort the non-detections, making the job of vetting images for wildlife identification faster and more fun for our volunteers. Thankfully, The Nature Conservancy-Wyoming recently collaborated with The Nature Conservancy-California to batch process all 719,173 game camera images through Microsoft's artificial intelligence (AI) detector. The AI is trained on millions of images from a variety of ecosystems to detect animals, people, and vehicles in camera trap photos.
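For the technically curious: the description above is consistent with Microsoft's MegaDetector, whose batch runs produce a JSON file of per-image detections. Here is a minimal sketch, assuming that output format, of how the images could be tallied by detector label. The file name and confidence cutoff are illustrative assumptions, not the project's actual settings.

```python
import json
from collections import Counter

# Illustrative inputs; not the actual N2N files or settings.
OUTPUT_JSON = "detector_output.json"   # hypothetical path to the batch results
CONF_THRESHOLD = 0.8                   # assumed cutoff for trusting a detection

with open(OUTPUT_JSON) as f:
    results = json.load(f)

# MegaDetector-style output maps category IDs to labels,
# e.g. {"1": "animal", "2": "person", "3": "vehicle"}.
categories = results["detection_categories"]

tally = Counter()
for image in results["images"]:
    confident = [d for d in image.get("detections", [])
                 if d["conf"] >= CONF_THRESHOLD]
    if not confident:
        tally["nothing there"] += 1
    else:
        # Label each image by its highest-confidence detection.
        best = max(confident, key=lambda d: d["conf"])
        tally[categories[best["category"]]] += 1

for label, count in tally.most_common():
    print(f"{label}: {count} images")
```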

Here are some example images of what detector output looks like:

In a sample of 7,500 images, non-detections (i.e., images with nothing there) made up 69.9%. If that rate holds across all 719,173 images, roughly 500,000 "nothing there" photos could be filtered out automatically. The AI also detects people, and with a high level of confidence we can withhold images containing human subjects from further review, which matters for privacy (please note: the AI DOES NOT identify faces). Animal detections accounted for 290 images, or 3.9% of the sample. However, Microsoft's AI does not identify animals to species; it just finds them. So this is only a first stage, and the N2N project still needs volunteers, like yourself, to continue verifying images through the project's Zooniverse platform. Machine learning accelerates this process by letting volunteers and researchers spend their time on the images that matter most. While this streamlining hasn't happened yet, we are excited to be closing in on getting the AI-processed images into Zooniverse, so soon you will be seeing far fewer "nothing here" photos!
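To make that triage concrete, here is a small, hypothetical sketch of the sorting logic just described: non-detections are set aside, images of people are withheld for privacy, and animal images are queued for volunteer review on Zooniverse. The labels are assumed to come from a tally step like the earlier sketch; this is not the project's actual code.

```python
def triage(label: str) -> str:
    """Route one labeled image into a processing bucket.

    Labels are assumed to match the tally sketch above:
    "nothing there", "animal", "person", or "vehicle".
    """
    if label == "nothing there":
        return "skip"       # roughly 70% of images: no volunteer time needed
    if label == "person":
        return "withhold"   # kept out of Zooniverse to protect privacy
    if label == "vehicle":
        return "skip"       # not wildlife, so no species ID needed
    return "review"         # animal images go to volunteers for species ID

# A tiny demonstration with made-up labels:
for label in ["nothing there", "animal", "person", "vehicle"]:
    print(label, "->", triage(label))
```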

Join the Study, Help Classify Images on Zooniverse

We need you to help us comb through the hundreds of thousands of images captured on our cameras and collect the data they provide. That information will then be analyzed by scientists at The Nature Conservancy-Wyoming. By devoting as little or as much of your time as you like, you will be providing critical data to the Bridger-Teton National Forest that will benefit both people and wildlife.

Access the Zooniverse site here.

Funny side note: It took Dr. Courtney Larson, Conservation Scientist at The Nature Conservancy-Wyoming, over a week to digitally transfer 719,173 images to the AI processor!
