Ex-FBI Contractor Pleads Guilty to Child Sexual Abuse Involving Minor Boys

The contractor played online games with minors under conditions in which the losing minor player would remove a piece of clothing.
The U.S. Department of Justice building in Washington on June 28, 2023. (Madalina Vasiliu/The Epoch Times)
Naveen Athrappully
11/10/2023

A former FBI contractor has pleaded guilty to charges of sexually exploiting boys and producing child sexual abuse material (CSAM), according to the Department of Justice.

Beginning in February, Brett Janes, 33, of Arlington, Virginia, allegedly enticed minors to produce CSAM, according to a Nov. 7 press release from the DOJ.

Mr. Janes was arrested on May 31, a day after a judge approved a warrant for his arrest.

In August, he was charged with child exploitation offenses.

On Tuesday, Mr. Janes “pleaded guilty to one count of sexual exploitation of children, including using children to produce CSAM, and one count of receipt of child pornography. He is scheduled to be sentenced on Feb. 27, 2024, and faces a mandatory minimum of 15 years in prison.”

According to court documents, Mr. Janes is alleged to have contacted roughly a dozen minor boys over Discord and Snapchat. He is accused of grooming the boys by telling them that he worked for U.S. intelligence and of threatening to kill himself if they did not continue communicating with him.

The former FBI contractor is also said to have purchased hundreds of CSAM videos and images through Telegram.

After his arrest on May 31, Mr. Janes appeared in court the same day. Because he did not have a lawyer, the court appointed a public defender.

On Aug. 23, a federal grand jury in the Eastern District of Virginia charged him with two counts of sexual exploitation of children and production of CSAM, one count of attempted coercion and enticement, and one count of receipt of child pornography.

In addition to a mandatory minimum of 15 years in prison, Mr. Janes faces a maximum penalty of life imprisonment. Actual sentences for federal crimes are typically less than the maximum penalties.

“A federal district court judge will determine any sentence after considering the U.S. Sentencing Guidelines and other statutory factors,” according to the release.

The case was brought as part of Project Safe Childhood, a nationwide initiative from the DOJ. Launched in May 2006, the initiative aims to combat the epidemic of child sexual exploitation and abuse.

Grooming Victims

One of the victims in the case was a 13-year-old boy who met Mr. Janes while playing an online shooter game.

After the duo began messaging each other, the contractor asked the boy to play a version of the game in which the loser would remove a piece of clothing on video chat. He gave the boy around $500. The two also allegedly engaged in sexual acts.

Mr. Janes used “tactics consistent with ‘grooming’ to entice [the minor] to play strip video games,” according to FBI agent Paul Fisher, who authored an affidavit in support of an arrest warrant.

In one case, Mr. Janes allegedly tried to get a boy to visit his house even after learning that the boy was only 14 years old. In another instance, he allegedly threatened to commit suicide after a boy took the money he offered but then stopped responding to him.

“if you ever need [expletive] i’m here. But it’s still [expletive] up that you took money while I’m hammered. Guess im happy that it happened sooner rather than later and im just going to kill myself cause no one likes me,” Mr. Janes allegedly wrote in a message after the boy stopped responding to his messages.

The contractor was identified because he had sent pictures of his official badges to the minors. Authorities also identified the IP address, email address, and phone number he used to contact them.

Child Exploitation Laws

Mr. Janes’s arrest comes as calls for stronger child abuse laws have gathered momentum.

In September, the National Association of Attorneys General sent a letter to congressional leaders, asking them to propose solutions to tackle the use of artificial intelligence in creating CSAM. It pointed to the dangers posed by AI’s ability to quickly generate new CSAM.

“AI tools can rapidly and easily create ‘deepfakes’ by studying real photographs of abused children to generate new images showing those children in sexual positions,” it said. “Deepfakes can also be generated by overlaying photographs of otherwise unvictimized children on the internet with photographs of abused children to create new CSAM involving the previously unharmed children.”

“Whether the children in the source photographs for deepfakes are physically abused or not, creation and circulation of sexualized images depicting actual children threatens the physical, psychological, and emotional wellbeing of the children who are victimized by it, as well as that of their parents.”

The letter called for expanding restrictions on CSAM to “explicitly cover” AI-generated material. It asked Congress to establish an expert commission to study how AI can be used to exploit children and to identify ways to prevent such abuse.

In August, Rep. Ann Wagner (R-Mo.) introduced the Child Online Safety Modernization Act, bipartisan legislation seeking to enhance the reporting of online child exploitation.

“Under current law, there are no requirements regarding what online platforms must include in a CyberTipline report. Due to this legal gap, platforms do not consistently report substantive and actionable information in their report,” a summary of the legislation said.

This leaves the National Center for Missing and Exploited Children (NCMEC) and law enforcement unable to locate and rescue the children, it stated.

The legislation would make it mandatory for online platforms to include information that would help identify and locate a child depicted in CSAM, as well as the individuals behind such content.

Last year, the NCMEC CyberTipline received more than 32 million reports of online CSAM, an 89 percent jump from 2019. In more than 50 percent of these cases, the reports lacked sufficient data for law enforcement to investigate, the summary said.