Meta issued an apology Wednesday evening after a “mistake” caused Instagram’s recommendation algorithm to flood users with disturbing and violent videos, some depicting fatal shootings and gruesome accidents.
The issue affected a wide range of users, including minors.
The disturbing content, which was recommended to users without their consent, featured graphic depictions of individuals being shot, struck by vehicles, and suffering horrific injuries.
While some videos carried “sensitive content” warnings, others appeared without restrictions.
A Wall Street Journal reporter’s Instagram account was flooded with back-to-back clips of people being shot, mangled by machinery, and violently thrown from amusement park rides.
These videos originated from pages with names such as “BlackPeopleBeingHurt,” “Shocking,” and “PeopleDyingHub” – accounts the journalist did not follow.
The metrics in some of these posts suggested that the Instagram algorithm had dramatically increased their visibility.
View counts on certain videos exceeded those of other posts from the same accounts by millions.
“We have fixed an error that caused some users to see content in their Instagram feeds that should not have been recommended,” an Instagram spokesperson said late Wednesday.
“We apologize for the mistake.”
Despite the apology, the company declined to specify the scale of the problem.
However, even after Meta claimed the problem was resolved, a Wall Street Journal reporter continued to see videos depicting shootings and fatal accidents late Wednesday evening.
These disturbing clips appeared alongside paid advertisements for law firms, massage studios, and the e-commerce platform Temu.
The incident comes as Meta continues to overhaul its content moderation policies, particularly around the automated detection of objectionable material.
In a statement issued on January 7, Meta announced that it would change how it enforces certain content rules, citing concerns that past moderation practices had led to unnecessary censorship.
As part of the change, the company said it would refocus its automated systems on “illegal and high-severity violations, such as terrorism, child sexual exploitation, drugs, fraud and scams,” rather than scanning for all policy violations.
For less serious violations, Meta indicated that it would rely on users to report problematic content before taking action.
The company also acknowledged that its systems had been overly aggressive in demoting posts that might “violate” its standards and said it was in the process of eliminating most of those demotions.
Meta has also scaled back automated suppression of content in some categories, though the company did not confirm whether its violent and graphic content policies had changed as part of these adjustments.
According to the company’s transparency report, Meta removed more than 10 million pieces of violent and graphic content from Instagram between July and September last year.
Nearly 99% of this material was proactively removed by the company’s systems before it was reported by users.
However, Wednesday’s incident left some users shaken.
Grant Robinson, a 25-year-old working in the supply chain industry, was one of those affected.
“It’s hard to comprehend that this is what was served to me,” Robinson told the newspaper.
“I saw 10 people die today.”
Robinson noted that similar videos had appeared for all of his male friends, aged 22 to 27, none of whom typically engage with violent content on the platform.
Many have interpreted these changes as an attempt by Zuckerberg to repair relations with President Donald Trump, who has been a vocal critic of Meta’s moderation policies.
A company spokesperson confirmed on X that Zuckerberg visited the White House earlier this month “to discuss how Meta can help the administration defend and advance American technology leadership abroad.”
Meta’s shift in moderation strategy comes after significant staff reductions.
During a wave of tech industry layoffs in 2022 and 2023, the company cut approximately 21,000 jobs – nearly a quarter of its workforce – including positions on its trust and safety and civic integrity teams.
Image Source : nypost.com