Following Facebook, US lawmakers are now questioning internet companies such as Snap, TikTok, and YouTube about how content on their platforms affects children and adolescents.
Social media has long been a contentious topic where children and teenagers are concerned. Many social activists, psychologists, and therapists have raised concerns about addiction to these platforms.
However, the debate has drawn voices on all sides, and many individuals have defended the use of social media. Still, the uncertainty persists: from the government's standpoint, there are no hard and fast rules about how much time one should spend on the internet.
Both Democrats and Republicans expressed concerns about consumer protection, worried that these internet giants encourage disruptive behavior and endanger young users.
Executives from Snap, TikTok, and YouTube testified about their algorithms, safety measures, and initiatives to protect users' mental health.
Sen. Marsha Blackburn, a Republican, said that for far too long these social media platforms have been permitted to post and promote content that can mislead child and adolescent users.
She said she received calls from many parents, teachers, and mental health professionals in the weeks leading up to the hearing, all of whom voiced concern about how long this could go on.
According to sources, Senator Richard Blumenthal, chairman of the Senate subcommittee, said that everything these companies do is designed to add users, especially kids, and keep them on their apps for extended periods.
Michael Beckerman, TikTok’s head of public policy in the Americas, defended the app, arguing that it is not a follower-based social network. According to him, TikTok users watch TikTok videos and create TikTok videos; TikTok, he claims, is a distinct kind of platform.
Jennifer Stout, Snap’s VP of global public policy, explained that the app was created as an alternative to social media, stressing that photographs on the platform are deleted by default.
YouTube said it removed 7 million accounts suspected of belonging to young children and preteens in the first three months. On both YouTube Kids and YouTube, autoplay is handled differently for viewers under the age of 18.
Facebook faced a similar situation in September, when a whistleblower testified at a Senate hearing about the social network's harmful effects on adolescent users.
Facebook later responded by saying it had paused development of a new app for kids due to rising concerns from US lawmakers and advocacy groups.