Suing for Social Media Addiction


It’s not a rare thing to see the founder of Facebook ducking and weaving before the irate comments of Congress as he explains, for yet another time, why his network does not harm, has no intention to harm and, if harm arose, it was unforeseen and unintended. This dance of mendacity has been going on for years, and reached another level when Mark Zuckerberg took the stand on February 18. Zuck has finally found his way to court, where he faced cross-examination by counsel under the attention of a jury.

The trial, being held in Los Angeles, is considering the extent to which social media platforms are addictive to children through such features as push notifications, infinite scroll and personally targeted notifications. (The analogy here is with tobacco.) Google’s YouTube is also a defendant. (The suit shrank somewhat after TikTok and Snapchat agreed to settle shortly before the trial.) The CEO of Instagram-owner Meta Platforms insisted in court that the plaintiffs, including parents, teens, and school districts, were “mischaracterising” the nature of internal communications produced by Meta on the subject of addiction.

Mark Lanier, lawyer for the lead plaintiff in the case, had a few samples to produce in court. One was a 2019 email to Zuckerberg and three highly placed Meta executives questioning the company’s lack of enforcement protocols regarding age limitations. Authored by Nick Clegg, Meta’s then head of global affairs and a former Liberal Democrat MP and UK Deputy Prime Minister, the email suggests that it would be “difficult to claim we’re doing all we can”.

An externally authored 2019 research report, commissioned on behalf of Instagram, was also raised, noting how teenagers using the platform felt “hooked despite how it makes them feel”, with such users having “an addict’s narrative about their Instagram use.” Such use “can make them feel good, it can make them feel bad, they wish they could spend less time caring about it.” (Rather weakly, Zuckerberg could only point to a minor qualification: that the report had not been written by Meta personnel.)

Further evidence adduced showed a company focus on teenagers and increasing their time spent on the platforms. A 2017 email from an executive noted how “Mark has decided the top priority for the company is teens”. Under questioning by Meta’s lawyer, Paul Schmidt, Zuckerberg presented an image of evolved concern, having spent years addressing “problematic use” on his platforms. Focusing merely on time spent on the platforms, he argued, would not have enabled Meta to endure. Examples of restrictive tools, such as Instagram’s daily limits, alerts regarding time spent, and turning off notifications at night, were cited as instances of caring correctives.

However unpalatable Meta’s CEO is, wariness should arise when issues such as children’s safety and addiction are cited as legal and policy concerns, particularly when they come from priestly, holier-than-thou lawmakers and advocates. Legislators find such themes as “duty of care” and “harm” irresistible when regulating conduct they disapprove of. Users of a service or a product – in this case, social media – are treated as hopeless, easily seduced and incapable of volition. Habit is thereby confused with addiction, while parents and school instructors deem themselves incapable of guiding children and instructing them on healthier habits.

Using the tobacco analogy is also problematic: social media addiction is not, as yet, recognised as a clinical condition, its pathological dimension understudied. A study published last November in Nature’s Scientific Reports argues that social media behavioural addiction is misunderstood, with “reason to believe that users overestimate their addiction”, frequently using “large builds of habit to automatically activate, scroll, post, and react to social media.” The perception of addiction most likely arose “from popular media’s frequent labeling of social media as addictive (vs. habit forming).” The consequence of this was critical: convincing individuals that their use of a social media platform was an addiction impaired perceived control, increased self-blame and blame of the app, and fostered a degree of helplessness in cutting back use. “This effect,” the study reports, “is aligned with past literature showing that merely seeing addiction scales can negatively impact feelings of well-being.”

Little wonder that so much stock is being placed on the Kids Online Safety Act (KOSA), which has gone through various iterations of severity when it comes to policing free speech. Its author, US Senator Marsha Blackburn of Tennessee, was teacherly in condemning Zuckerberg. The Facebook founder merely “followed his usual playbook of denial and deceit while sitting just a few steps away from parents who have tragically lost their children as a consequence of the way his platforms are designed to harm young users,” she stated. “86% of Americans want Big Tech companies to be held accountable for their role in the social media addiction crisis.” As with Big Tobacco, the social media giants were “trying to keep kids hooked on products that hurt them.”

KOSA ostensibly mandates the platforms to furnish minors with options to protect their information, disable the addictive aspects of the product, and opt out of algorithmic recommendations curated for the user. Parents are provided with controls on protecting their children and detecting harmful behaviour, with both parents and educators able to report harmful behaviour via a dedicated channel. Relevant audits on measures taken by the social media platforms to address risks posed to children are also mandated.

Most problematic of all is the scope of harm that the legislation is meant to address. A duty is created for the platforms to prevent and mitigate risks arising from the promotion of suicide, eating disorders, substance abuse, sexual exploitation and advertising on restricted products (alcohol and tobacco stand out). Joe Mullin of the Electronic Frontier Foundation claims that the “core function” of the legislation “is to let government agencies sue platforms, big or small, that don’t block or restrict content someone later claims contributed to one of these harms.” And as liability is the central problem, avoiding it will result in over-censoring tendencies. “It’s what happens when speech becomes a legal risk.”

As Mullin goes on to mention, the list of “harms in KOSA’s ‘duty of care’ provision is so broad and vague that no platform will know what to do regarding any given piece of content.” KOSA is also filled with insufferably vague terms such as “compulsive usage”, despite there being no accepted medical definition of what that means in the context of using social media platforms.

It therefore follows that any posted material discussing topics that might prove harmful, even if that material can address more positive features of the topic (for instance, the issue of drug avoidance, an appreciation of body appearance, or advice on coping with depression), will not be hosted for fear “that an attorney general or FTC [Federal Trade Commission] lawyer might later decide the content was harmful.” By allegedly targeting the harm in question, groups and individuals dedicated to discussing strategies to avoid it will be excluded. The social media behemoths do need to be held accountable, but we should all be cautious how that accountability is asserted and enforced.



This content originally appeared on Dissident Voice and was authored by Binoy Kampmark.