“I Was Addicted to Social Media – Now I’m Suing the Tech Giants”

Addiction, a suicide attempt and years of depression – how a 21-year-old woman’s favourite hobby turned into a nightmare

Hundreds of families are suing some of the world’s biggest technology companies, which they say are knowingly putting their children in danger.

The BBC spoke to one of the alleged victims, who became addicted to social networking sites and describes the risks hidden behind seemingly harmless internet surfing and the enjoyable hours spent searching for something fun.

“At the age of 12, I literally fell into the addiction trap. And I couldn’t get my life back throughout my teenage years.” With these words begins the story of Taylor Little, a girl whose addiction to social media nearly ruined her life.

Taylor’s interest in social media spiraled out of control, leading to the most difficult time of her life – suicide attempts and years of depression.

Taylor, who is now 21, described tech companies to the BBC as “big, bad monsters”.

According to Taylor, companies are deliberately putting products into the hands of children that are highly addictive and damaging to their health.

That’s why Taylor and hundreds of other American families are suing four of the world’s biggest tech companies.

The invincible Silicon Valley, whose power is ruining lives?
The lawsuit against Meta – owner of Facebook and Instagram – plus TikTok, Google and Snap Inc, owner of Snapchat, is one of the biggest in Silicon Valley.

The plaintiffs include ordinary families and schools across the US. They claim the platforms intentionally harm the psyches and mental health of adolescents.

Attorneys for the families see the death of 14-year-old British schoolgirl Molly Russell as an important example of the potential harm teenagers face.

Last year, they followed the inquest into her death by video link from Washington, seeking evidence to use in the US trial. Molly’s name is mentioned a dozen times in the main complaint filed with the California court.

The families involved in the case received a significant boost recently when a federal judge ruled that the companies could not invoke the First Amendment of the US Constitution, which safeguards freedom of speech, to block the lawsuit.
Similar to withdrawals
Taylor, who lives in Colorado, recounts that before getting her first smartphone, she was active and sociable, enjoying activities such as dance and theatre.

“If I had my phone taken away, it felt like having withdrawals. It was unbearable. Literally, when I say it was addictive, I don’t mean it was habit-forming. I mean, my body and mind craved that”, said Taylor.

Taylor recalls the first piece of social media content she encountered: someone’s distressing self-harm page featuring explicit images of wounds and cuts.

“At 11 years old, I clicked on a page and saw that with no warning. I didn’t seek it out. I didn’t request it. I can still vividly see it. Even at 21, I can still see it.”

Taylor also grappled with content related to body image and eating disorders.

“That was – or is – like a cult. It felt like a cult. You’re incessantly bombarded with images of a body that you can’t attain without risking your life.

“You can’t escape that.”

Lawyers for Taylor and the other plaintiffs have now taken a new approach to the litigation, focusing on the design of the platforms rather than on individual posts, comments or images. They argue that the apps contain design features that cause addiction and harm.

Defence of the Accused
Meta released a statement saying: “Our thoughts are with the families represented in these complaints.

“We want to reassure every parent that we have their interests at heart in the work we are doing to provide teens with safe, supportive experiences online.”

TikTok declined to comment.

“The allegations in these complaints are simply not true. Protecting kids across our platforms has always been core to our work”, Google told the BBC.

Snap also told the BBC in a statement that Snapchat “was designed to remove the pressure to be perfect. We vet all content before it can reach a large audience to prevent the spread of anything that could be harmful.”

Taylor is familiar with the story of Molly Russell from north-west London, who took her own life after being inundated with a barrage of negative and depressing content on Instagram. A subsequent inquest into Molly’s death concluded that she died “while suffering from depression and the negative effects of online content”. Taylor notes the striking similarity between their stories.

“I feel incredibly lucky to have survived. And my heart breaks in ways I can’t put into words for people like Molly.

“I’m happy. I really love my life. I’m in a place I didn’t think I would live to see.”

It makes Taylor determined to see the legal action through.

“They know we’re dying. They don’t care. They make money off us dying.

“All hope I have for better social media is entirely dependent on us winning and forcing them to make it – because they will never, ever, ever choose to.”

The negativity on social media, including hate speech from other users, can be detrimental to adolescents and sensitive people who struggle to cope with abuse, even from strangers.

Editor @ DevStyleR