The social dilemma: Should social media platforms be more regulated?

Photo Courtesy: P.Kijsanayothin | Getty Images

In an era where algorithms shape our opinions, timelines dictate our moods, and going viral can make or break reputations, the question remains: Should social media platforms be more regulated? 

For some, the answer is a resounding yes. Critics argue that unchecked digital powerhouses like Meta, X (formerly Twitter), and TikTok have too much influence over public discourse, mental health, and even national security. 

“Absolutely, social media platforms should be regulated, specifically by age,” says Chloe Waites, a public relations major here at Florida A&M. “Having access to social media at a young age can expose children to harmful content, cyberbullying, and unrealistic standards, all of which can negatively impact their mental and emotional development. As a member of Generation Z, I’ve witnessed both the positive and negative effects that social media can have.” 

Waites points to the rise in misinformation, privacy breaches, and mental health concerns as proof that regulation isn't just necessary; it's urgent. With artificial intelligence being integrated into these platforms and content spreading faster than ever before, some believe the stakes have never been higher. 

Jourdan Garnett, a health general science scholar, agrees that regulation may be necessary, especially given society’s heavy dependence on digital platforms. 

“I believe that it’s all about perspective, but based on how society is now, yes, it should be regulated,” Garnett said. “Society now is very technological and social media based. Most people get their information from social media or any platform that is quick to access.” 

Garnett noted that the speed of digital information isn’t the only issue — it’s also about accuracy and integrity. 

“People don’t take the time to fact check if the information that they are receiving is properly sourced and/or correct,” she said. “Especially when it comes to platforms that are owned by people who have strong and very known political views.” 

This raises broader concerns about bias, transparency, and the ethical responsibility of tech companies to ensure that their platforms aren’t shaping public opinion through the lens of personal or political agendas.

Meanwhile, lawmakers across the globe are grappling with whether, and how, to rein in social media giants. The European Union has already implemented the Digital Services Act, aimed at increasing transparency and limiting harmful content. In the U.S., proposed legislation like the Kids Online Safety Act continues to stir debate over how far regulation should go. 

Still, many feel caught in the middle, wanting protection without overreach, freedom without chaos. 

As society becomes more dependent on digital spaces, the urgency of this conversation grows. Whether through government policy, platform responsibility, or public pressure, one thing is clear: the way we interact online is no longer just a personal choice; it's a public issue.