Social Media Companies Proving They Can’t Be Trusted


Social media has a social responsibility even if its platforms’ leaders don’t take ownership of it. Scrutiny is increasing, and a new report is calling for outside corrective action.

The article “Social Media Making Political Polarization Worse: Report,” written by Rebecca Klar in The Hill, discusses the challenges the report details and the recommendations it makes. The report, released by the NYU Stern Center for Business and Human Rights, doesn’t mince words about the problems and dangers already present, and those still developing with little to impede them.

"We’re not just talking about political polarization just in the abstract, but it has these very specific consequences which we are seeing basically eroding aspects of democracy and civil relationships among people and trust in institutions and so forth,” said Paul M. Barrett, one of the authors of the report was quoted as saying.

In addition, the report states that leaders are more than capable of displaying sounder judgment and making better decisions.

“The fact that they actually acknowledge that they have the capacity, in their lingo, to ‘turn the dial’ at certain points and they acknowledge that they've done this in certain emergency situations I think proves a very strong implication that they know there is a connection to what they're doing and this social, political problem,” Barrett said in the article. “The question we raised is, if you could do it temporarily, explain to us why you wouldn't want to do that generally?” he added. 

The authors of the report aren’t just identifying issues and complaining. They bring recommendations as solutions to the conversation, advising social media companies to do what is within their power: adjust their algorithms to lessen polarization systemically and improve the “dial-turning” measures. The authors also make an assertive and reasonable call for transparency of plans, process and action.

Leaders of these platforms are not fully responsible for the nation’s troubles, yet what they can do, they should do. And because of the significant deficiencies, a call to a higher authority is being made.

“Social media companies cannot rescue the United States from itself. But these companies can, and must, reform their practices when they cause harm to democracy. In light of the industry’s failure to engage in sufficiently vigorous self-regulation, however, it is now time for the government to step in, as well,” the report states.

Communication Intelligence magazine spoke to Barrett in a short interview to dive a little deeper.

CIM: Ethics — what is your professional observation and analysis as to why tech platforms seem to be uninterested? Or is this not a matter of ethics?

Paul M. Barrett, deputy director and senior research scholar, NYU Stern Center for Business and Human Rights: “The major tech platforms have made certain incremental reforms in recent years to address the spread of dis- and misinformation and other harmful content. They haven't made more fundamental changes because no one has forced them to. The platforms remain hugely popular with billions of users worldwide. Advertisers continue to provide the lion's share of platform revenue, and the U.S. government doesn't regulate the social media industry in a sustained way. The government has filed antitrust lawsuits against Facebook and Google, but those legal actions are at a very preliminary stage.”

CIM: What are Facebook (and Twitter) communicating to the country, the world and lawmakers with their deficiencies of governance in how their technology works?

PMB: “They are implicitly communicating that, while self-regulation by the industry would be preferable, the time has come for government intervention.”

CIM: What possible solutions do you see as being available -- and what could those solutions' benefits look like in practice?

PMB: “One solution would be for the social media industry to reconsider its fundamental business model, which currently maximizes user engagement to generate advertising revenue.

“Maximizing user engagement is problematic because sensationalistic, incendiary content is most likely to draw ‘likes,’ shares, and comments. So, if the platforms' algorithms favor that kind of content to stoke engagement, they will tend to spread material that provokes divisiveness, anger, and distrust of government and civil society institutions; democratic practices, such as elections; and facts that should be commonly embraced, such as the need for masking and vaccination in response to a lethal pandemic.”

Michael Toebe

Founder, writer, editor and publisher
