Children are even more captive to social media companies than we realize. Many of them have surrendered their online autonomy to their phones so completely that they reject the very idea of searching the internet. For them, the only acceptable online environment is one personalized by big tech algorithms that feed them individualized content.

As our children’s free time and imaginations merge ever more with the social media they consume, we must understand that unregulated access to the internet comes at a cost. Something similar awaits adults. With the advent of AI, a spiritual loss looms as we outsource countless human rituals, searching, trial and error, to machines. But it is not too late to change course.

This spring, I visited with a group of high school students in suburban Connecticut to talk about the role social media plays in their daily lives and in their mental health. More children today report feeling depressed, lonely and disconnected than ever before. More teens, especially teen girls and LGBTQ teens, are seriously considering suicide. I wanted to talk candidly about how social media helps and hurts mental health. By the end of the 90-minute conversation, I was more concerned than ever about the well-being of our children, and of the society they will inherit.

There are many problems with children and teenagers using social media, from deteriorating mental health to dangerous and age-inappropriate content to the lackluster efforts tech companies make to enforce their own age verification rules. But the high school students I met with alerted me to an even more insidious consequence of minors’ growing reliance on social media: the death of searching, trial and error, and discovery. Algorithmic recommendations now do the work of discovering and pursuing interests, finding community, and learning about the world. Children today, simply put, are not learning how to be curious, critical adults, and they don’t seem to know what they’ve been missing.

A week before meeting the students, I introduced the Protecting Kids on Social Media Act with three of my colleagues in the Senate: Brian Schatz, Democrat of Hawaii, and Republicans Katie Britt of Alabama and Tom Cotton of Arkansas. The bill is a comprehensive attempt to protect young people on social media, prioritizing stronger age verification practices and banning children under 13 from using social media altogether. But one provision of the bill was particularly alarming to this group of students: a ban on social media companies using the data they collect on children (what they look at and swipe on) to build and feed the algorithms that spoon individualized content back to users. These high school students have become hooked, perhaps even addicted, to the algorithms of social media.

Their dependence on technology sounds familiar to most of us. Many of us can hardly remember a time when we didn’t have Amazon to turn to for a last-minute gift, or when we waited by the radio for our favorite songs to play. Today, information, entertainment and connection are delivered to us on a conveyor belt, with less effort and searching required of us than ever before.

Retreating from the rituals of discovery comes at a cost. We all know instinctively that the journeys in life matter as much as the destinations. It is in the wandering that we learn what we like and what we don’t. The sweat it takes to reach a result makes that result more fulfilling and satisfying.

Why should students try to find a song or poem they like when an algorithm will do it for them? Why risk exploring something new when their phones will simply send them endless content related to things they already care about?

What the teenagers I talked to didn’t know is that these algorithms were designed in a way that inevitably makes, and keeps, users unhappy. According to an advisory issued by the surgeon general this year, “there are ample indicators that social media can also have a profound risk of harm to the mental health and well-being of children and adolescents.” A report from the nonprofit Center for Countering Digital Hate found that users could be served content related to suicide less than three minutes after downloading TikTok. Five minutes after that, they might come across a community promoting eating disorder content. Instagram is full of soft pornography, offering a gateway to harder material on other sites (which are often just as lax about age verification). And all over social media are highly curated and filtered fake lives, creating a sense of envy and inadequacy in the developing brains of teenagers.

Social media companies know that content that generates negative feelings holds our attention longer than content that makes us feel good. It’s the same reason local news leads with the shooting or the house fire, not the local food drive. If you’re a teenager who feels bad about yourself, your social media will usually continue to feed you videos and images that are likely to exacerbate negative feelings.

These kids may think they need the algorithm, but the algorithm actually makes many of them feel worse. It’s no accident that teenage rates of sadness and suicide rose just as algorithmically driven social media content took over the lives of children and teenagers.

The feedback from the students in Connecticut left me more convinced than ever that this legislation is essential. By taking steps to separate young people from their social media dependency and prompting them to engage in real exploration to find connection and fulfillment, we can restore the lost rituals of adolescence that, for centuries, have made us who we are.

The role that social media has played in the declining mental health of teenagers also gives us a preview of what’s to come for adults, as artificial intelligence and machine learning spread ever faster through our own lives. The psychological impact of the coming handoff of thousands of basic everyday human tasks to machines will make the impact of social media look like child’s play. Today, machines help us find a song we like. Tomorrow, the machines will not only find the song; they will also create it. Just as we were not prepared for the impact that social media algorithms would have on our children, we are probably not prepared for the spiritual loss that will come when we outsource countless human functions to computers.

Regardless of whether the Protecting Kids on Social Media Act becomes law, we should begin a broader dialogue, with adults and children from all walks of life, to determine whether we will truly be happier as a species when machines and algorithms do all the work for us, or whether fulfillment comes only when people actually do the work of searching and discovering, of being human.
