Instagram’s algorithm directly connects teens to ‘drug dealers selling everything from opioids to party drugs,’ researchers say

Instagram promoted hashtags related to the buying of illegal substances to users as young as 13, according to new research.
  • Instagram recommended hashtags related to illegal drugs to teenagers as young as 13, researchers at the Tech Transparency Project found. 
  • "The platform's algorithms helped the underage accounts connect directly with drug dealers selling everything from opioids to party drugs," the researchers said.
  • Instagram has faced increased scrutiny around how the platform impacts children.

Instagram recommended hashtags related to illegal substances to users as young as 13, and its algorithms led them to accounts claiming to sell drugs, including opioids and party drugs, in violation of Instagram policy, researchers found. 

Researchers at the Tech Transparency Project said they set up multiple new Instagram accounts, creating one for a 13-year-old user, two representing 14-year-old users, two for 15-year-old users, and two for 17-year-old users. According to the report, it took two clicks for the hypothetical teen accounts to access accounts that claimed to be drug dealers.

In comparison, it took researchers five clicks to log out of an account on the Instagram app. 

"Not only did Instagram allow the hypothetical teens to easily search for age-restricted and illegal drugs, but the platform's algorithms helped the underage accounts connect directly with drug dealers selling everything from opioids to party drugs," the Tech Transparency Project said in a news release outlining its findings. 

"We prohibit drug sales on Instagram," a Meta spokesperson told Insider on Tuesday. "We removed 1.8 million pieces of content related to drug sales in the last quarter alone, and due to our improving detection technology, the prevalence of such content is about 0.05 percent of content viewed, or about 5 views per every 10,000.

"We'll continue to improve in this area in our ongoing efforts to keep Instagram safe, particularly for our youngest community members," the spokesperson added.

While Instagram bans hashtags for illegal substances, researchers at the Tech Transparency Project found that the app would recommend alternative hashtags for some drugs after users typed into the Instagram search bar. 

"For example, when one of our teen users started typing the phrase 'buyxanax' into Instagram's search bar, the platform started auto-filling results for buying Xanax before the user was even finished typing," the researchers said. "When the minor clicked on one of the suggested accounts, they instantly got a direct line to a Xanax dealer. The entire process took seconds and involved just two clicks."

Instagram said it has blocked problematic hashtags identified in the report

The "buyxanax" hashtag and other hashtags outlined in the report, including "#mdma" and "#buyfentanyl," have since been blocked by Instagram, the Meta spokesperson told Insider, adding "we're reviewing additional hashtags to understand if there are further violations of our policies." 

When one of the teen accounts followed a user claiming to be a drug dealer, the app's algorithm recommended other accounts similarly appearing to sell drugs, according to Tech Transparency Project's report.

According to Instagram's community guidelines, selling drugs on the platform is against policy. But researchers said they found drug dealers operating "openly" on the platform, offering pills including the opioid OxyContin.

"Many of these dealers mention drugs directly in their account names to advertise their services," the researchers said.

Instagram announced in July that all accounts for users aged 16 or younger would be set to private by default, but researchers found that only accounts created through the Instagram mobile app, and not through Instagram's website, were actually set to private.

These findings come as Instagram, and its parent company Meta (formerly Facebook), face increasing scrutiny for how the platform affects minors. 

The company on Tuesday announced it was rolling out new safety features for teenagers, including tools to help users spend less time on the app, reduce unwanted interactions with adults and exposure to sensitive content, and give parents more oversight of their children's accounts, NPR reported.

The announcement came just one day before Instagram head Adam Mosseri is scheduled to testify Wednesday before the US Senate Subcommittee on Consumer Protection, Product Safety and Data Security. Mosseri is expected to be questioned about Instagram's influence on young users.

In October, former Facebook employee and whistleblower Frances Haugen said Facebook had internal data showing that Instagram was toxic to teenagers, particularly young girls. Internal Facebook research provided by Haugen showed 13.5% of teen girls said Instagram worsened suicidal thoughts, and 17% of teen girls said Instagram contributed to eating disorders, NPR reported.

Read the original article on Business Insider
