Facebook-owned Instagram has been criticised for keeping secret its internal research into the effect social media has on teenage users.
According to the Wall Street Journal, its studies showed teenagers blamed Instagram for increased levels of anxiety and depression.
Campaign groups and MPs have said it is proof the company puts profit first.
Instagram said the research showed its "commitment to understanding complex and difficult issues".
The Wall Street Journal's report, not disputed by Facebook, finds:
A 2019 presentation slide said: "We make body-image issues worse for one in three teenage girls"
Another slide said teenagers blamed Instagram for increased levels of anxiety and depression
In 2020, research found 32% of teenage girls surveyed said when they felt bad about their bodies, Instagram made them feel worse
Some 13% of UK teenagers and 6% of US users surveyed traced a desire to kill themselves to Instagram
Instagram conducted multiple focus groups, online surveys and diary studies over a number of years
In 2021, it conducted large-scale research involving tens of thousands of people, pairing user responses with its own data about time spent on Instagram and what was viewed
In response to the WSJ report, Instagram published a lengthy blog defending its research.
The WSJ story focused "on a limited set of findings and casts them in a negative light", it said, but the issue was far more complex.
"We've done extensive work around bullying, suicide and self-injury, and eating disorders, to help make Instagram a safe and supportive place for everyone," the company said in its post.
"Based on our research and feedback from experts, we've developed features so people can protect themselves from bullying, we've given everyone the option to hide 'like' counts and we've continued to connect people who may be struggling with local support organisations."
It was working on prompts to encourage people repeatedly dwelling on negative subjects to look at different topics, it said.
And it promised to be more transparent about its research in future.
'Profit before harm'
But Andy Burrows, head of child safety online at the National Society for the Prevention of Cruelty to Children, said it was "appalling they chose to sit on their hands rather than act on evidence".
"Instead of working to make the site safe, they've obstructed researchers, regulators, and government and run a PR [public-relations] and lobbying campaign in an attempt to prove the opposite."
MP Damian Collins, who is chairing the UK parliamentary committee looking at how big technology companies should be regulated to protect users' safety, said it was time to "hold them to account".
"The Wall Street Journal Facebook files investigation has exposed how the company, time and again, puts profit before harm," he said.
"Its own research is telling it that a large number of teen Instagram users say the service makes them feel worse about themselves - but the company just wants to make sure they keep coming back."
The Online Safety Bill aims to give regulator Ofcom the power to fine companies that fail to act on content that could cause harm.
US campaign group Fairplay (formerly the Campaign for a Commercial-Free Childhood) said the news showed Instagram was no place for children.
"In a move straight out of big tobacco's playbook, Facebook downplayed the negative effects of its product and hid this research from the public and even from members of Congress who specifically asked for it," it said.
"And in the ultimate display of chutzpah and disregard for children, the company now wants to hook young kids on Instagram."
Fairplay also called on the US government to demand that Facebook release its research and to block its plans to launch Instagram Youth.
It was revealed earlier this year that Facebook planned to create an advert-free Instagram for under-13s, designed to keep them safe.
'No fix'
Jonathan Haidt, a social psychologist at New York University's Stern School of Business, told BBC Radio 4's Today programme he had met Facebook boss Mark Zuckerberg to discuss the social network's effect on mental health.
"He was interested but he believes the research is ambiguous and does not point to harm," Mr Haidt said.
"Of course, now we know they had their own research which did suggest harm."
"They had focus groups, online surveys, diary studies - so this was not one chance finding.
"I wouldn't expect them to come forward the first time they find evidence of harm and say, 'Oh my God our product is harmful,' but if they have multiple sources of evidence and there is evidence outside the company too, then I think the picture is pretty clear."
But, he added, it would take root-and-branch changes at the company to make any difference.
"The platform encourages children to post photos of themselves, to be raided by others including strangers around the world," he said.
"If this is the business model, there is no way to fix it."