Colby Loucks is probably the first rhino poacher to wear a T-shirt featuring a chemistry joke.
Monday morning, Loucks played the role of a poacher, strutting down a road at a Maryland dairy farm as the World Wildlife Fund tested a new approach to halt poaching of elephants and rhinos in Africa. His green T-shirt featured the periodic table of the elements, with the words "I wear this shirt periodically" underneath.
As the Wildlife Crime Technology Project’s director turned left and approached a group of cows — playing the role of rhinos — other members of his team huddled around a laptop to see if their camera and software would properly identify Loucks as a human.
The team wants to use thermal cameras and machine learning to identify humans and trigger automated alerts to nearby park rangers when suspected poachers cross into parks.
They hope the technology can make it more affordable to monitor miles of roads in East Africa and help curb the growing poaching problem. In South Africa, for example, from 1980 to 2007 an average of just nine rhinos were poached per year. That number skyrocketed to 1,215 in 2014. The population of forest elephants in Central Africa declined 62 percent from 2002 to 2011.
“We’re trying to take the human out of the loop,” said Eric Becker, an engineer at the World Wildlife Fund. “We looked at every technology that exists. This may sound expensive but compared to the others it’s a fraction of the cost.”
The $7,000 thermal cameras were chosen over everything from drones to buried fiber optics, seismic technology and radar systems. The work is being funded by a grant from Google’s Global Impact Award.
The plan is to mount the cameras on poles along roadways. A small computer attached to the camera will run software that identifies moving objects and classifies them. Solar panels will power the cameras and computer. If a human walks into a park, the camera can recognize the movement and send a text message or e-mail to park rangers via radio signals.
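In miniature, that pipeline — find warm pixels, classify the blob, alert rangers — might be sketched like this. The temperature band, the shape-based stand-in for the classifier, and the `notify` hook are all illustrative assumptions, not details of the WWF system:

```python
# Hypothetical sketch of the detect-classify-alert loop described above.
HUMAN_TEMP_C = (30.0, 38.0)  # assumed band for human body heat at the lens

def warm_pixels(frame, band=HUMAN_TEMP_C):
    """Return (row, col) coordinates of pixels whose temperature is in the band."""
    lo, hi = band
    return [(r, c)
            for r, row in enumerate(frame)
            for c, temp in enumerate(row)
            if lo <= temp <= hi]

def classify(pixels):
    """Crude stand-in for the classifier: label a warm blob by its shape."""
    if not pixels:
        return "nothing"
    rows = [r for r, _ in pixels]
    cols = [c for _, c in pixels]
    height = max(rows) - min(rows) + 1
    width = max(cols) - min(cols) + 1
    # Upright humans read taller than wide; four-legged animals the reverse.
    return "human" if height > width else "animal"

def monitor(frame, notify):
    """Classify one thermal frame and fire an alert if a human is seen."""
    label = classify(warm_pixels(frame))
    if label == "human":
        notify("possible poacher detected")  # e.g. text or e-mail via radio
    return label
```

A frame here is just a grid of per-pixel temperatures; the real system would run something like this continuously on the pole-mounted computer.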
The team tested their approach earlier this year in Africa. Lately Becker has set the camera up at a soccer field near his home and used his dog as a test animal. Monday, the researchers headed to the Prigel Family Creamery, just north of Baltimore, to get a location with longer sight lines and a setting vaguely reminiscent of the African savanna.
The testing was a chance to measure just how far the cameras could effectively see. The group had hoped for up to a kilometer of range. The farther the cameras can see, the fewer are needed to provide coverage, which would lower costs.
In the tests the software was quick to identify Loucks as he walked within a few hundred feet of the camera. But at greater distances it struggled. Becker thinks narrowing the temperature range the camera focuses on could solve that problem. At times the software also drew the wrong conclusions, identifying an SUV and a cow as human.
Becker said it would take the algorithm a couple of days to learn the environment and to properly classify objects. For example, it should eventually ignore regular movements such as swaying grass or tree limbs.
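One simple way such "learning the environment" can work — and this is a generic sketch, not the project's actual algorithm — is to keep a running average and spread for each pixel and flag only readings well outside that pixel's normal behavior. A pixel watching swaying grass develops a wide normal spread, so its motion stops triggering alerts:

```python
# Minimal per-pixel anomaly model; learning rate and 3x threshold are
# illustrative assumptions.
class PixelModel:
    def __init__(self, alpha=0.1):
        self.alpha = alpha  # learning rate: how fast the model adapts
        self.mean = None    # running average temperature for this pixel
        self.dev = 1.0      # running absolute deviation ("normal spread")

    def update(self, temp):
        """Return True if temp is anomalous for this pixel, then learn from it."""
        if self.mean is None:
            self.mean = temp
            return False
        anomalous = abs(temp - self.mean) > 3.0 * self.dev
        self.mean += self.alpha * (temp - self.mean)
        self.dev += self.alpha * (abs(temp - self.mean) - self.dev)
        return anomalous
```

Early on the model flags almost everything — which matches the couple of days Becker says the algorithm needs — and settles down as each pixel's statistics converge.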
While the algorithm already looks for cues such as torsos and swinging arms to identify humans, it includes an option to teach it by having a person classify photos as containing humans, animals or other objects. Becker said the group would be working with the software maker to expand the categories of things it could identify.
In July, they’ll head to East Africa for on-site testing. Which country is unclear, as they’ll have to strike an agreement with the host government. If all goes well, Loucks’s team envisions deploying the system by year’s end.