Why it matters:
There is growing concern about the proliferation of flagrant ads selling illegal drugs like psilocybin and LSD on Meta’s platforms, despite the company’s efforts to filter illegal content with algorithms. The issue is complicated by the fact that those same automated filters sometimes remove educational cannabis material while letting illegal drug ads stay up.
What they are saying:
Journalists and experts are criticizing Meta for its inconsistent moderation of drug-related content. Some have even managed to get ads for LSD approved on Facebook, highlighting the flaws in Meta’s automated moderation system. There is also debate over whether the decline in online advertising spending has eroded Meta’s moderation capacity.
The big picture:
Social media platforms like Facebook rely increasingly on machine learning and algorithms to police content, raising concerns about how accurate and effective those systems really are, as the toy sketch below illustrates. Canada is also moving to regulate harmful content on social media, including drug ads, through new online harms legislation.
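The inconsistency described above is easy to reproduce with even a toy filter. Below is a minimal, hypothetical Python sketch. It is not Meta’s actual system, and the blocked-term list and sample ads are invented, but it shows how keyword-style screening can reject legitimate educational material while approving a lightly obfuscated drug ad.

```python
# Hypothetical sketch of a keyword-based ad filter. The term list, threshold logic,
# and example ads are invented for illustration; real moderation pipelines are far
# more sophisticated, but the failure modes rhyme.

BLOCKED_TERMS = {"psilocybin", "lsd", "cannabis", "shrooms"}

def naive_ad_filter(ad_text: str) -> str:
    """Reject an ad if it contains any blocked term; otherwise approve it."""
    words = ad_text.lower().split()
    return "rejected" if any(term in words for term in BLOCKED_TERMS) else "approved"

if __name__ == "__main__":
    # False positive: an educational post about cannabis policy gets rejected.
    print(naive_ad_filter("Webinar: how cannabis legalization affects public health"))
    # False negative: a drug ad that swaps in symbols sails through approval.
    print(naive_ad_filter("Premium l$d tabs, discreet shipping, DM to order"))
```

Swapping the keyword check for a machine-learning classifier raises accuracy but does not eliminate the trade-off: tightening the model to catch obfuscated ads tends to sweep in more legitimate educational posts, and loosening it does the reverse.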
What to watch:
Watch whether Meta improves its algorithms to catch illegal drug ads before they run on its platforms. Also worth watching: how Canada’s new online harms legislation is enforced against tech companies like Meta.
My take:
Ads selling illegal drugs on platforms like Meta’s highlight the limits of automated content moderation. Cracking down on illegal drug ads is essential, but legitimate educational content should not be swept up in the process. It will be interesting to see how Meta addresses these issues and how regulatory measures in countries like Canada reshape the online environment.