How Artificial Intelligence Can Stop Sex Trafficking
For Matt Osborne, finding exploited children typically starts with a walk on the beach, and it ends with hands cuffed behind his back. It’s almost always the same—Osborne and a few friends travel somewhere that’s known for sex tourism and walk along the beach or hang out in area nightclubs, not to look for girls but to be seen themselves. A group of white American men is easy to spot in heavily touristed resort towns in Asia, Central America, and South America, so it doesn’t take long to make a connection.
“They approach us,” Osborne says. “At first, everything is innocuous. Want to go jet ski or parasailing? Buy a margarita or beer? They offer us drugs, and the conversation always turns to girls. And if you let them talk long enough and say, ‘What else do you have? What else do you have?’ Then sooner or later, they always offer us young girls.”
For Osborne, that’s where the real work begins. A former CIA analyst, Osborne is senior vice president for rescue and rehabilitation at Operation Underground Railroad (OUR), a California-based nonprofit that extracts children from sex trafficking rings across the globe. Working undercover with local law enforcement officials, Osborne’s team makes contact with a pimp and arranges to have kids, usually girls, brought to a party packed with male operatives posing as wealthy American buyers and middlemen while female operatives pose as their girlfriends. Traffickers and victims are searched for weapons upon arrival and “girlfriends” take the kids to a back room under the pretense of dressing them in lingerie for the party, though the costume change never actually happens. Meanwhile, a few male operatives finish the financial transaction, with Osborne’s team secretly recording the entire process.
Once the right evidence is gathered, Osborne gives the signal, local cops rush in, and everyone, operatives and victims included, is handcuffed and taken away. After questioning, victims are released to their parents or family, if that’s a safe option. If it’s not, they go to pre-vetted shelters where they receive food, medical treatment, and psychological counseling, sometimes on OUR’s dime, while Osborne’s team quietly slips out of the country. Local authorities often take sole credit for the bust. Osborne’s team regroups to do it all again somewhere else.
“It is the most gut-wrenching thing to have to look into these girls’ eyes and have to pretend that I’m sizing them up,” Osborne says. “I see in their eyes the eyes of my 14-year-old and my 11-year-old. We’ve rescued girls younger than that.”
Tracking Down Traffickers
Since OUR’s launch in late 2013, the nonprofit says it has assisted in rescue operations for 571 victims (180 of whom were minors) in 12 countries and in the arrests of 250 suspected traffickers. The organization only extracts when invited to do so by local governments and works with local prosecutors, law enforcement, and the U.S. embassy for months before an operation to carefully choreograph how a bust will go down and what evidence is required for conviction. Sometimes aiding a trafficking case means posing as buyers. Other times it means providing financial, technological, or training resources in cash-strapped countries where those aren’t accessible.
“It’s very, very difficult a lot of times to gain convictions because each country’s trafficking laws are different,” Osborne says, though he adds that having a trafficking transaction captured in high-definition audio and video can significantly help prosecutors. “Some countries don’t even know what their trafficking laws are because they’ve never pursued these types of cases.”
OUR and an increasing number of researchers across the U.S. are building technologies to improve capture and conviction rates and help law enforcement go after trafficking kingpins. Working in partnership with the cybercrime analytics company Delitor, Inc., OUR is developing proprietary software to track trafficker travel routes and help law enforcement determine if an escort ad was posted by an organized trafficking ring.
ID Via Machine Learning
Following a trafficker’s online footprints is tough, says Wade Shen, a program manager for the U.S. Defense Advanced Research Projects Agency (DARPA), in part because traffickers are good at evading the commercial web indexing bots that search engines like Google use to catalogue the web. Traffickers post escort ads both on sites that are low-priority for indexing bots and on anonymous “deep web” sites that aren’t indexed by bots at all. Ads are often altered or removed after 10 to 15 minutes, contact information changes frequently, and posts use a variety of non-standard writing formats to make them less searchable.
Shen is part of a multimillion-dollar effort to develop better ways of mining the web to track illegal activity. Starting work in 2015, DARPA’s Memex program is a partnership of 17 contracting teams building tools that can collect content ignored by or unavailable to commercial search engines, analyze that content for hidden patterns, and build models to predict behavior. Focusing on trafficking for its first year, Memex has debuted 50 software programs and tools aimed at enhancing online search capabilities, some of which law enforcement officials are currently using to find leads and build cases against traffickers. Memex has also analyzed more than 100 million escort ads and uncovered new indicators that can help agents separate organized trafficking rings from adult prostitutes working solo. One of those indicators is price data, Shen says, a factor enforcement agents historically haven’t used to build cases.
If the prices listed in an ad increase or decrease depending on how safe or physically dangerous the advertised sex acts or situation are, “then it’s much more likely that they are an independent contractor,” Shen says, adding that traffickers who aren’t personally taking on risks, like catching a sexually transmitted disease through unprotected sex, tend to use less variable pricing structures. When combined with other information, such as the number of ads uploaded by a single person and the number of escorts represented in the same ad, “you can actually start to model behavior of rings of human traffickers.”
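The price-variance signal Shen describes can be sketched as a simple heuristic. The function and sample price lists below are hypothetical illustrations, not Memex code:

```python
from statistics import mean, pstdev

def price_variability(prices):
    """Coefficient of variation of listed prices (std dev / mean).
    Higher values suggest risk-sensitive pricing, which, per Shen's
    description, is more typical of independent workers than of rings."""
    if not prices or mean(prices) == 0:
        return 0.0
    return pstdev(prices) / mean(prices)

# Hypothetical price lists scraped from two sets of ads
independent = [150, 220, 300, 180, 400]   # prices vary with the risk of the act
ring        = [200, 200, 210, 200, 205]   # flat pricing regardless of risk

print(price_variability(independent) > price_variability(ring))  # True
```

In practice such a signal would only be one feature among many, combined with ad volume and the number of escorts per ad as Shen notes.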
Identifying pimps is a small piece of the problem. Gathering enough evidence to prove sex trafficking and get a conviction is often a much bigger obstacle. The United Nations’ International Labour Organization estimates that 4.5 million people worldwide are victims of forced sexual exploitation. In the U.S., nearly 2,700 sex trafficking cases were reported in the first half of 2016—roughly 15 cases every day—according to statistics gathered by the National Human Trafficking Resource Center. If the second half of the year is comparable, 2016 will see a nearly 30% increase in cases over the previous year. Prosecuting those cases is difficult in the U.S. and almost impossible in other countries. A 2014 report by the United Nations’ Office on Drugs and Crime collected data from 128 countries and found that while more than 90% of reporting nations criminalized human trafficking (including labor trafficking), 40% reported fewer than 10 convictions per year. Nineteen countries did not have any convictions between 2010 and 2012.
Tim Hoppock, a detective with the Austin Police Department in Texas, has worked in the Human Trafficking and Vice Unit for three years. The department makes roughly one arrest per month, but Hoppock says that getting convictions is often a challenge. To get a first-degree felony conviction, which carries a sentence of up to life in prison along with a lifetime requirement to register as a sex offender, Texas requires prosecutors to prove that a victim is either under age 18 or an adult who has been continuously trafficked for at least 30 days and engaged in forced labor or sex at least twice in that period. It can take months of gathering evidence and corroborating victims’ stories to prove uninterrupted trafficking over that time, and once evidence is assembled, cases can be derailed or dropped if victims leave town, won’t testify, or are unreliable on the stand. Charges can also be pled down and sentences negotiated, Hoppock says, meaning that traffickers may be tried for a second-degree felony even though they were picked up for a first-degree crime.
“These cases are difficult from start to finish,” Hoppock says. “Even the prosecutors aren’t very comfortable with our cases because they’re so few and far between.”
Human trafficking cases can live or die based on how quickly detectives gather evidence. Several emerging technologies aim to speed up that process, including Traffic Jam, a software program launched in 2013 that combs through escort ads and uses machine learning to find patterns that can connect ads across multiple geographic locations to the same organization or pimp. In addition to tracking standard search metrics like contact information and search terms in escort descriptions, Traffic Jam can also identify similar photos that appear across different ads as well as stylistic markers like spelling errors and writing patterns that often follow an ad writer wherever they post. Instead of manually finding a few ads, waiting for a subpoena, then going back for more, Hoppock says that Traffic Jam fast-tracks the process by pulling a more comprehensive list of ads from across the country, including ones he wouldn’t immediately recognize as having a shared author.
“If I punch in a phone number into Traffic Jam and it gives me one ad with that number but it links to 50 ads on a different phone number that I previously didn’t know about, that can help me identify new victims,” he says. “It can definitely help me corroborate a victim’s story about where they’ve been and for how long.”
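The transitive linking Hoppock describes, where one known phone number surfaces ads posted under numbers investigators didn’t know about, can be sketched as a connected-components pass over ads that share any contact number. The `link_ads` function and ad records below are hypothetical illustrations, not Traffic Jam’s implementation:

```python
from collections import defaultdict

def link_ads(ads):
    """Cluster ads that share any phone number, transitively:
    if ad A shares a number with ad B, and B with ad C,
    all three land in the same cluster."""
    # Union-find over ad indices
    parent = list(range(len(ads)))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path halving
            i = parent[i]
        return i

    def union(i, j):
        parent[find(i)] = find(j)

    # Index ads by each phone number they list, then merge
    by_phone = defaultdict(list)
    for idx, ad in enumerate(ads):
        for phone in ad["phones"]:
            by_phone[phone].append(idx)
    for indices in by_phone.values():
        for other in indices[1:]:
            union(indices[0], other)

    clusters = defaultdict(set)
    for idx in range(len(ads)):
        clusters[find(idx)].add(idx)
    return list(clusters.values())

# Hypothetical ads: ad 0 and ad 1 share a number; ad 1 and ad 2 share another
ads = [
    {"phones": {"555-0100"}},
    {"phones": {"555-0100", "555-0199"}},
    {"phones": {"555-0199"}},
    {"phones": {"555-0142"}},   # unrelated ad
]
print(sorted(map(sorted, link_ads(ads))))  # [[0, 1, 2], [3]]
```

A real system would union on many more signals, such as the shared photos and stylistic markers mentioned above, not phone numbers alone.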
Marinus Analytics, the company that makes Traffic Jam, reports that the program is used by hundreds of law enforcement agencies across the U.S. and Canada, and has aided in rescue operations for at least 300 victims. They’re not the only player in the field. In the past two years, several programs aimed at mining escort ads have emerged including Spotlight—produced by the Ashton Kutcher/Demi Moore-founded nonprofit, Thorn—and DIG, an open source Memex-backed project that catalogues about 5,000 webpages every hour and transforms that content into a searchable database of escort ads.
Ad-mining algorithms require data and lots of it. To train machines to identify information across varying formats—to understand, for instance, that an “eight” in a phone number could also be written as “8” or “e1ght”—researchers need thousands of examples of different ways each piece of information could be written. To get enough training data, DIG uses Amazon Mechanical Turk to recruit people to read escort ads and highlight key information like eye color, hair color, and the working names of the people featured.
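A hand-written sketch of how variant spellings like “eight,” “8,” and “e1ght” can be mapped back to the same digit uses character-class patterns per digit word. Real systems like DIG learn these variants from thousands of annotated examples rather than relying on rules like these, and the lookup tables below are illustrative assumptions:

```python
import re

# Common letter-to-digit substitutions, expressed as regex character classes
LEET_CLASSES = {"o": "[o0]", "i": "[i1l]", "e": "[e3]", "a": "[a4]",
                "s": "[s5]", "t": "[t7]", "b": "[b8]"}

WORDS = {"zero": "0", "one": "1", "two": "2", "three": "3", "four": "4",
         "five": "5", "six": "6", "seven": "7", "eight": "8", "nine": "9"}

def word_pattern(word):
    """Build a pattern that matches a digit word with leet substitutions,
    so 'eight' also matches 'e1ght' or '3ight'."""
    return "".join(LEET_CLASSES.get(ch, ch) for ch in word)

def normalize_phone(text):
    """Reduce an obfuscated phone string to bare digits."""
    text = text.lower()
    for word, digit in WORDS.items():
        text = re.sub(word_pattern(word), digit, text)
    return re.sub(r"\D", "", text)   # strip everything but digits

print(normalize_phone("five five five - 0 one e1ght 2"))  # 5550182
```

Once every ad’s contact information is normalized into a canonical form like this, ads written in wildly different styles become directly comparable.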
Since trafficker ads already use a variety of writing formats and are designed to evade search methods, getting clean data is the project’s biggest obstacle, says Pedro Szekely, the University of Southern California computer scientist who co-founded DIG. “We’re dealing with data where people are actively lying to go undetected, where people are doing a lot of spam,” Szekely says. “Just teasing out what is real data and what is bad data is already a big challenge.”
Data Hidden in Photos
Mining ads isn’t the only way that computer scientists are tackling traffickers. Released last year, Microsoft’s free PhotoDNA tool goes through pictures uploaded to social media and photo-sharing services and identifies images depicting sexual abuse. Using a process called hash-matching—which involves dividing images into a grid and assigning numerical “hash” values to individual pieces—PhotoDNA compares the hash set of the uploaded image against hash sets derived from a series of known illegal images provided by the National Center for Missing & Exploited Children. Taking a completely different approach, TraffickCam, a program created last year, is building a crowdsourced database of hotel room images law enforcement officials can use to determine where photos from escort ads were taken, and how traffickers are moving victims. Anyone can download TraffickCam’s smartphone app and upload interior photos of hotel rooms to the program’s collection of 1.5 million images.
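In simplified form, grid-based hashing looks something like the toy sketch below: split an image into cells, summarize each cell numerically, and call two images a match when every cell summary is close. PhotoDNA’s actual signature computation is far more sophisticated (and proprietary); this only illustrates the general idea:

```python
def grid_hash(pixels, grid=4):
    """Toy grid hash: split a square grayscale image (a 2D list of 0-255
    intensities) into grid x grid cells and record each cell's mean."""
    n = len(pixels)
    step = n // grid
    sig = []
    for gy in range(grid):
        for gx in range(grid):
            cell = [pixels[y][x]
                    for y in range(gy * step, (gy + 1) * step)
                    for x in range(gx * step, (gx + 1) * step)]
            sig.append(sum(cell) // len(cell))
    return sig

def matches(sig_a, sig_b, tolerance=8):
    """Signatures match if every cell is within tolerance, so small edits
    (re-compression, slight brightness shifts) don't defeat the check."""
    return all(abs(a - b) <= tolerance for a, b in zip(sig_a, sig_b))

# An 8x8 synthetic "image" and a slightly brightened copy of it
img = [[(x * y) % 256 for x in range(8)] for y in range(8)]
tweaked = [[min(255, p + 3) for p in row] for row in img]
print(matches(grid_hash(img), grid_hash(tweaked)))  # True
```

The tolerance is the key design point: an exact cryptographic hash would change completely under a one-pixel edit, while a per-cell tolerance lets near-duplicates of known abuse imagery still be flagged.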
At Carnegie Mellon University’s CyLab Biometrics Center, director Marios Savvides is focused on victim identification, a problem that’s particularly tough when victims are young.
“If a baby is abducted, for example at the age of two or three, at the age of five or six, even their parents won’t be able to identify them facially,” Savvides says. “How do you identify those victims?”
Children’s faces change as they grow up, but their eyes, specifically their irises, generally don’t. Savvides has developed a long-range iris scanner that captures data from up to 40 feet away. Originally designed to help soldiers ID people in combat zones, the scanner could one day be applied to trafficking cases, Savvides hopes, potentially by allowing law enforcement to install scanners at major transit hubs to identify victims from a distance. Successful iris identification requires cops to have a picture of the victim’s iris, so instead of waiting for tragedy to strike, Savvides is modifying smartphone cameras to enable them to capture high-resolution iris photos. An accompanying app would allow parents to upload photos of their children’s eyes in case law enforcement ever needs them.
A Never Ending Battle
Regardless of how good new trafficking programs and technologies get, there are obstacles to putting them into widespread use, says Wade Shen from DARPA Memex.
“Somebody has to actually invest in the maintenance and upkeep of collection platforms and tools,” he says. “Software we think of as a living thing, and if people aren’t updating it and maintaining it and keeping it running on platforms, it will eventually die.”
Shen doesn’t believe that new technologies will end trafficking entirely, but they are a step in that direction.
“It will of course change the tactics traffickers use, but that’s a good thing,” he says. “We want to raise their costs.”