A bipartisan group of state attorneys general is investigating how Instagram attracts and potentially harms children and young adults.

The probe follows revelations from a whistleblower about how Instagram's parent company Meta, formerly known as Facebook, has studied the risks of the photo-sharing app to its youngest users, including exacerbating body image issues for some teenage girls.

"Facebook, now Meta, has failed to protect young people on its platforms and instead chose to ignore or, in some cases, double down on known manipulations that pose a real threat to physical and mental health – exploiting children in the interest of profit," Massachusetts Attorney General Maura Healey, who is co-leading the states' investigation, said in a statement.

"Meta can no longer ignore the threat that social media can pose to children for the benefit of their bottom line," she said.

The group, which also includes prosecutors from Nebraska, California, Florida, Kentucky, New Jersey, Tennessee, Texas and Vermont, is examining whether Meta violated consumer protection laws and put the public at risk.

Pressure on Meta has been mounting since former employee Frances Haugen leaked thousands of pages of internal documents about how the company has studied and dealt with a range of problems, from hate speech to the "Stop the Steal" movement to the mental health impacts of Instagram on teen users.

The documents also show that Meta is fighting to attract and retain the attention of young people, amid competition from apps like TikTok and Snapchat.

The states are investigating the techniques Meta uses to get young people to log into Instagram more frequently and spend more time scrolling the app, and how those features might harm users.

"When social media platforms treat our children as mere commodities to manipulate for longer screen time engagement and data extraction, it becomes imperative for state attorneys general to engage our investigative authority under our consumer protection laws," said Nebraska Attorney General Doug Peterson.

"These accusations are false and demonstrate a deep misunderstanding of the facts," Instagram spokesperson Liza Crenshaw said in a statement. "While challenges in protecting young people online impact the entire industry, we've led the industry in combating bullying and supporting people struggling with suicidal thoughts, self-injury, and eating disorders."

She pointed to new features Instagram has introduced, including a "Take a Break" prompt that users can enable, and parental supervision tools for teenagers' accounts.

After Instagram's internal research on the risks to teenagers' mental health was first reported by the Wall Street Journal in September, lawmakers and regulators renewed calls for Meta to scrap its plans to launch a version of the app for kids 12 and under. (Instagram, like most social media apps, prohibits users younger than 13 because of federal privacy law.)

Shortly afterward, Meta said it was putting the project on hold.

Editor's note: Meta pays NPR to license NPR content.
