SYDNEY -- Facebook monitored the posts of Australian children and used algorithms to identify them so advertisers could target them during their "most vulnerable moments", media reported, drawing criticism of the social media giant.
A confidential 23-page Facebook document, prepared by two of the company's top Australian executives, outlines in pinpoint detail how the social network can target "moments when young people need a confidence boost", The Australian reported on Sunday.
Facebook collected information on users' moods, including feelings of being "worthless", "overwhelmed" and "nervous", and shared it with advertisers, who could use it to target them.
Facebook admitted it was wrong to target the children and apologized.
"We have opened an investigation to understand the process failure and improve our oversight. We will undertake disciplinary and other processes as appropriate," a Facebook spokeswoman told The Australian.
"While the data on which this research is based was aggregated and presented consistent with applicable privacy and legal protections, including the removal of any personally identifiable information, our internal process sets a standard higher than required by law," she added.
Facebook's tactic violates the guidelines of the Australian Code for Advertising and Marketing Communications to Children.
The revelation also points to how Facebook can be used for covert surveillance, something most social networking sites claim to be fighting against.
There had been rumors about Facebook's advertising sales methods, but until now there was no proof to corroborate them.
"The document is an insight on how Facebook gathers psychological insights on 6.4 million 'high schoolers', 'tertiary students' and 'young Australians, New Zealanders... in the workforce' to sell targeted advertising," the report noted.
The document states that the detailed information on mood shifts among young people is "based on internal Facebook data, shareable under non-disclosure agreement only, and is not publicly available".
Facebook has not disclosed whether similar practices exist elsewhere.
The practice recalls a 2014 psychological experiment Facebook conducted on 600,000 of its users without their knowledge.
Facebook tweaked those users' News Feeds to highlight either positive or negative posts from their friends. The social media giant then monitored the users' responses to study the impact of their friends' attitudes.