Texas A&M Research Team Receives USDA Grant To Develop Innovative Wildlife Monitoring System

A research team led by the Texas A&M College of Veterinary Medicine & Biomedical Sciences (CVMBS) has been awarded a Conservation Innovation Grant from the U.S. Department of Agriculture (USDA) to develop a new artificial intelligence-based wildlife monitoring system.

Purple martins using a nesting box with the new camera system
Photo by Doug Bonham, Field Data Technologies

The Conservation Innovation Grant program, under the USDA Natural Resources Conservation Service, supports the development of new tools, approaches, practices, and technologies to further natural resource conservation on private lands.

The nearly $700,000 grant will be used by principal investigator and CVMBS associate professor Dr. Donald Brightsmith and his team to integrate camera, image, and sensor data to create a tool to monitor wildlife typically difficult to observe, including pollinators, reptiles, amphibians, and nesting birds.

Currently, landowners who undertake wildlife conservation efforts, such as setting up nesting boxes for birds or preserving areas of natural land, have few options for measuring the results besides bringing in teams of scientists to conduct hand surveys, a costly and lengthy process. Because private land makes up the majority of the United States, landowners’ efforts play a key role in the country’s overall wildlife management and conservation.

Brightsmith’s team, in conjunction with colleagues from the University of California, Santa Barbara; the University of Hawaii; and private industry, is working to develop a low-cost and easy-to-use system that will allow producers to monitor wildlife on their land and understand how their actions are directly affecting the local environment.

“Part of our grant is to make it so a typical landowner can easily use a laptop or phone app to see the information that came in from a specific camera, such as where that camera is on a map; the weather, temperature, light, and humidity there; and the critters that were at that camera,” Brightsmith said.

Commercial wildlife cameras already allow landowners to monitor wildlife on a small scale, but by using artificial intelligence to aggregate and analyze data from a number of cameras and locations, landowners will be able to see a much more complete picture.

The camera system will photograph the scene and, whenever there is a significant change, forward images to a central computer that will use artificial intelligence to identify species, behaviors, and trends. According to Connie Woodman, a member of Brightsmith’s team and a graduate student in the CVMBS’ Department of Veterinary Pathobiology (VTPB), this process has traditionally required scientists to spend dozens of hours conducting surveys and sorting through photos.
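A "significant change" trigger of the kind described here is commonly implemented with simple frame differencing: compare each new frame to the previous one and flag the image when enough pixels have changed. The minimal Python/NumPy sketch below illustrates that general idea only; the function name and threshold values are illustrative assumptions, not details of the team's actual system.

```python
import numpy as np

def significant_change(prev_frame: np.ndarray, frame: np.ndarray,
                       pixel_threshold: int = 25,
                       changed_fraction: float = 0.01) -> bool:
    """Return True when enough pixels differ between two grayscale frames.

    pixel_threshold and changed_fraction are illustrative values, not
    parameters taken from the grant project.
    """
    # Cast to a signed type so the subtraction cannot wrap around.
    diff = np.abs(frame.astype(np.int16) - prev_frame.astype(np.int16))
    changed = np.count_nonzero(diff > pixel_threshold)
    return changed / diff.size >= changed_fraction

# Example: a mostly static scene vs. one where a bright object enters.
prev = np.zeros((120, 160), dtype=np.uint8)   # empty 120x160 frame
curr = prev.copy()
curr[40:80, 60:100] = 200                     # an animal enters part of the frame
print(significant_change(prev, curr))         # prints True (~8% of pixels changed)
```

Only frames that trip a trigger like this would be uploaded for species identification, which keeps bandwidth and AI processing costs low for a field-deployed camera.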

“If this sort of technology could be available at low cost or free to farmers, it could really impact the ability to see if conserving farmland and private land is working,” Woodman said. “It’s just too expensive to have a hands-on survey for every property owner who wants to apply for government support to maintain wild lands and wildlife populations.”

“One of the other objectives within this is if landowners are doing land management activities like cutting, spraying, or planting, they will be able to look at the data coming in to immediately see how those changes in the ecosystem have impacted key reptiles, amphibians, birds, and bees,” Brightsmith said.

Baby chickadees inside a nesting box
Photo by Doug Bonham, Field Data Technologies

“A landowner can do something, or not do something, and find out how the environment changes over a short period of time with relatively little need for outside monitoring or assistance because they can be hooked directly to the data from their land,” he said.

The camera system will be used in three different setups—a ground setup to monitor reptiles and amphibians, a veil trap setup for insects, and a bird nesting box setup. In all three uses, the animals photographed will be free from any harm or human interference.

An early version of the nest box setup has already been deployed in the Pacific Northwest by collaborators within Field Data Technologies and is providing invaluable data on chickadee and purple martin nesting behaviors.

“It’s doing more than just taking photos; it’s spitting out data. How many eggs? When was there a change? How many times is the nest being visited? How many times does a chickadee investigate a nest box before it decides that it’s good enough?” Woodman said. “With that data we can tell a land manager, ‘The chickadee started visiting the nest boxes ‘x’ weeks before it used them. So if you wait until June to hang up the box, you won’t get any use this year.’

“There are all of these management recommendations that can be enabled by data,” she said. “And when data collection is automated, we can ask a database questions instead of staring at hours of video and photos that don’t convey meaning. We can do really amazing things to modify how we interact with the ecosystem and support these animals.”

While the Conservation Innovation Grant will directly support the technology’s creation and use in supporting landowners, Brightsmith hopes that the technology will one day also be adapted to create systems that could survey for invasive species and agricultural pests, monitor wildlife recovery after natural disasters, and more.

“If we build a technology that’s fairly straightforward to train the AI, someone else can take our platform and tweak it,” he said. “Our objective is to create a system that has unlimited potential.”

The system will initially be tested in Arizona, California, Texas, and Montana to see how it works in different environments. The team has already partnered with producers and other landowners near Austin and San Antonio and in southern California to see how the technology holds up around large animals like cattle.

Along with Brightsmith and Woodman, collaborators on this Conservation Innovation Grant include Drs. Chris Evelyn and Katja Seltmann, from the University of California, Santa Barbara, Cheadle Center for Biodiversity and Ecological Restoration; Dr. Ethel Villalobos, from the University of Hawaii at Manoa; and Doug Bonham, senior electronics engineer at Microsoft and the founder and president of Field Data Technologies of Essex, Montana, a 501(c)(3) nonprofit.

###

Story by Megan Myers, CVMBS Communications

For more information about the Texas A&M College of Veterinary Medicine & Biomedical Sciences, please visit our website at vetmed.tamu.edu or join us on Facebook, Instagram, and Twitter.

Contact Information: Jennifer Gauntt, Director of CVMBS Communications, Texas A&M College of Veterinary Medicine & Biomedical Sciences; jgauntt@cvm.tamu.edu; 979-862-4216