LOUISVILLE, Ky. (WHAS11) -- One hundred thousand escort ads are posted online daily. Thousands of those ads are children being sold for sex, according to Thorn.
Court precedents do not hold websites and social media platforms accountable for content posted by third-party users, unlike print media. Therefore, stopping the spread of illegal material and illegal activity falls largely on users to report it.
Traffickers talking to your kids?
Local investigators have also entered messages from a trafficker that appear to be recruiting girls into sex trafficking as evidence in a pending Kentucky human trafficking case.
The WHAS11 iTeam asked several of the major social media sites about their policies regarding human trafficking and posted content.
Here are responses and statements from the spokespeople respectively:
FACEBOOK AND INSTAGRAM
“It is against Facebook's Community Standards and Instagram's Community Guidelines to post material that coordinates human trafficking and human smuggling and this type of material will be removed when reported to us. We encourage people to use the reporting links found across our site so that our team of experts can review the content swiftly.”
- Sex trafficking has no place on Facebook or Instagram. Our Community Standards and Community Guidelines, respectively, make it very clear that human trafficking and smuggling are against our policies. This is true across the platform.
- We remove content that threatens or promotes sexual violence or exploitation. This includes the sexual exploitation of minors and sexual assault.
- We have a team of professional investigators and work with agencies across the world that seek to identify and rescue victims and bring perpetrators to justice.
- We’ve worked with NGOs and others in the tech industry to combat child exploitative images (CEI) with the use of cutting-edge technology such as PhotoDNA.
- PhotoDNA is a technology that scans all images on Facebook and Instagram and flags known child exploitative material so that future uploads of that imagery are prevented from surfacing on the platform at all.
- If someone does attempt to upload known CEI, our systems prevent upload, the account that attempted upload is immediately taken down, and we report all instances of such content to the National Center for Missing and Exploited Children (NCMEC).
- We also make it easy for people to report violations of our policy, and we prioritize reports of child sexual exploitation.
- If we believe a child is in immediate danger, we proactively refer cases to local law enforcement.
- We have developed strong relationships with NCMEC, its international counterpart, the International Centre for Missing and Exploited Children (ICMEC), and other NGOs to work together to disrupt and prevent the sexual abuse of children online.
- We launched AMBER Alerts on Facebook in 2011 to help families and authorities successfully recover missing children and have since expanded the program to over 12 countries.
- People in a designated search area where local law enforcement has activated an AMBER Alert will see the alert in their News Feed. The alert includes a photo of the missing child, a description, the location of the abduction, and any other pertinent, available information. Users can share the alert with friends to spread awareness, tapping into an organic desire to help.
- We know the chances of finding a missing child increase when more people are on the lookout, especially in the critical first hours.
- Our goal is to help get these alerts out quickly to the people who are in the best position to help.
- We have created shortcuts on Facebook and Instagram to provide education and additional resources (developed in conjunction with the National Human Trafficking Resource Center) to people who search for terms related to sex trafficking. These terms have been provided by internal and external experts and when someone searches for them on Facebook, we will have a pop-up that reminds them sex trafficking is illegal and violates our policies and shares resources for getting help.
- If people encounter content that indicates someone is in immediate physical danger related to human trafficking, we ask that they contact their local law enforcement immediately.
- People can report Groups/Pages, profiles, images, videos, and comments. When reported, content is reviewed 24 hours a day, 7 days a week, in over 40 languages. And because of the risk of real world harm, material relating to human trafficking is prioritized for review.
- People can also choose to report this content to us via a special form. This form is accessible via our Help Center.
- We work with the National Human Trafficking Resource Center, operated by the Polaris Project, to provide resources and assist victims of human trafficking.
- We also connect people to local organizations including:
- United States: National Human Trafficking Resource Center
- Canada: Contact Canadian Crime Stoppers
- Latin America: Bilateral Safety Corridor Coalition (BSCC)
- United Kingdom: Blue Blindfold UK
- Other Countries: National Human Trafficking Resource Center, operated by Polaris Project
TWITTER
“We do not tolerate child sexual exploitation on Twitter or Periscope. When we are made aware of links to images of or content promoting child sexual exploitation, they are removed from the platform and reported to the National Center for Missing and Exploited Children, which works with police globally to investigate.”
- Twitter works in coalition with every major technology company on ways to eliminate child sexual exploitation online. In 2013 we implemented PhotoDNA technology, designed by Microsoft, to identify and remove known online images of child abuse. We have established relationships with organizations such as INHOPE, Interpol, Thorn, and the Internet Watch Foundation.
- Our policy is posted here.
- When we are made aware of links to images promoting child sexual exploitation they are removed and reported to the National Center for Missing & Exploited Children (NCMEC).
- NCMEC exchanges information and sound practices with international partners to help achieve the mutual goal of reducing child sexual victimization in all countries.
- We instruct users who see images of Child Sexual Exploitation (CSE) on Twitter to report them immediately via our web form. Every report is reviewed by a member of our Trust and Safety team.
- Our website includes clear guidelines for law enforcement personnel seeking to request information about Twitter users, however, we do not require a law enforcement request to remove CSE images.
SNAPCHAT
"Snapchat is a fun way to talk with friends and family through pictures and videos. Snapchat is not, however, a safe platform for illegal activity. We will continue to work closely with law enforcement."
- Snapchat’s more than 178 million daily active users generate 3.5 billion ‘snaps’ a day, making Snapchat one of the most used cameras in the world, according to the company.
- While Snapchat claims the vast majority of Snapchatters use the app for fun, the company complies with valid legal requests from law enforcement and publishes a Transparency Report detailing this activity twice a year.
- Snapchat provides tools in the app for Snapchatters to report violations of its Community Guidelines and Terms of Service.
- According to Snapchat, they have a ‘Trust & Safety’ team that works to review abuse reports and act when they become aware of a violation.
- Snapchat has also partnered with safety experts (including ConnectSafely, iKeepSafe, Net Family News, and the UK Safer Internet Centre) to build a Safety Center where parents, teachers, and teens can find safety tips, research, and resources about Snapchat. These groups form the company's Safety Advisory Board and provide insights that guide its policies and product development. The Safety Center also includes a Parents Guide.
- Snapchat says users do not have browsable public profiles that include things like location, interests, or age. On Snapchat there are no public likes or public comments. By default, a user cannot receive any messages on Snapchat from someone who hasn’t already been added as a friend on the app. Snapchat says users can also block anyone for any reason.
- According to Snapchat, they report any images of child exploitation and sexual abuse to law enforcement via NCMEC (National Center for Missing and Exploited Children).
BACKPAGE.COM
The WHAS11 News iTeam attempted to contact Backpage.com multiple times but never received a direct statement. The site's posted Terms of Use, however, prohibit users from:
- Transmitting any information, data, text, files, links, software, chats, communication or other materials that is unlawful,…..
(a) Posting adult content or explicit adult material unless: (i) such material is specifically permitted in designated adult categories and permitted under applicable federal, state, and local law; and (ii) you are at least 18 years of age or older and not considered to be a minor in your state of residence;
(b) Posting, anywhere on the Site, obscene or lewd and lascivious graphics or photographs which depict genitalia or actual or simulated sexual acts, as determined in the sole discretion of backpage.com;
(c) Posting any solicitation directly or in “coded” fashion for any illegal service exchanging sexual favors for money or other valuable consideration;
(d) Posting any material on the Site that exploits minors in any way;
(e) Posting any material on the Site that in any way constitutes or assists in human trafficking.
- Posting any ad for products or services, use or sale of which is prohibited by any law or regulation;
Take the issue to your leaders.
For more information on how you can help or get help, we have included additional links and hotlines:
YMCA Safe Place: call 502-635-5233 or text SAFE and address, city, state to 69866
National Human Trafficking Hotline: 1-888-373-7888
National Runaway Safeline: 1-800-RUNAWAY (786-2929)
Missing Children and Child Pornography: 1-800-THE-LOST (843-5678)