RUSSELL WANGERSKY: Trying to put the genie back in the bottle

Social media giants like Twitter and Facebook have been taking baby steps towards policing their products. — Reuters file photo

It’s never easy to go backwards.

The newspaper business is a good example: decades ago, as the internet grew, most papers saw websites as an add-on, rather than a new business model, and started posting their news stories for free. It made a bit of sense; at that time, circulation revenue paid a small fraction of the cost of a daily paper, with most of the money needed to gather and write the news coming from advertising.

Problem is, as I said, it’s tough to go backwards — after essentially convincing people that internet news was free (except for the bagloads of money cable and phone companies get you to pay for your internet connection), it’s been an uphill climb for news organizations to try to get users to pay for news.

Genies don’t go willingly back into their bottles.

By offering it for free, we convinced the market that the hard work of researching, writing and editing news was without value — or, more to the point, without cost.

Sorry — that’s not exactly where I was going with this.

I was actually talking about the difficulty of something else going backwards.

For years and years, social media companies have argued that they bear no responsibility for what people might say on their platforms; that, like your internet provider, they were merely a pipeline. Something along the lines of “you can’t blame the letter carrier if you don’t like what you get in the mail.”

Problem is, they’ve always been a publisher, if not in name, at least in action. Ask any opinion editor at a newspaper — if a letter to the editor libels someone, the paper gets sued, too, even if nothing in the letter was edited.

Lately, facing considerable criticism, the social media giants have been taking baby steps towards policing their products, implicitly admitting that if you deliver, magnify or widely broadcast lies and slander, you’re an integral piece of the problem.

Last week, both Facebook and Twitter limited the ability of users to share a story from the New York Post about Joe Biden, pointing out there were concerns about the accuracy and truth of the story.

But it doesn’t stop there.

YouTube said Wednesday that it would remove any content that contained false COVID-19 vaccine information, like claiming vaccines would kill people, cause infertility or implant tracking microchips. Facebook has banned postings that deny or distort facts about the Holocaust and has stepped in to limit postings promoting the QAnon conspiracy.

Twitter has also moved, with limited success, to haul down QAnon accounts, along with faked accounts trying to influence the U.S. election. And there have been restrictions or warnings placed on inaccurate or misleading social media statements by U.S. President Donald Trump, as well as senior Trump staffers.

Policing by social media firms is slowly ramping up and will likely increase.

Problem is, the horse has long since left the barn. And coaxing it back inside poses distinct problems.

Users might have been more willing to accept restrictions when platforms were launched — now, the first reaction is that social media firms are censoring their users. But the problem for the social media firms is even larger than that. How, for example, do they decide what gets banned, and what gets allowed? Anyone who has ever dealt with abusive attacks knows that the line of what constitutes abuse is fluid, rather than sharply defined. It isn’t an easy yes or no.

To fully police the wild and woolly social media world is going to take lots of time — and lots and lots of money.

I wish the social media giants luck: they’ll need it.

And we do, too.

Russell Wangersky’s column appears in SaltWire newspapers and websites across Atlantic Canada. He can be reached at [email protected] — Twitter: @wangersky.
