The Register: How do we combat mass global misinformation?

SLAM: debunk creationism, pseudoscience, and superstitions. Discuss logic and morality.

Moderator: Alyrium Denryle

Jedi Knight
Posts: 581
Joined: 2016-02-28 08:15am

The Register: How do we combat mass global misinformation?

Post by SolarpunkFan »

Apologies in advance if this is in the wrong forum.

The Register
How do we combat mass global misinformation? How about making the internet a little harder to use

We’re blinded by our trust of the Search box and its inscrutability

A friend recently texted me in a panic from Florida, where he’d been consoling relatives following a death in the family. Those relatives had returned a positive COVID-19 test, then a negative test. My friend was exposed to them, so got his own test. It came back negative. His relatives were tested again. Negative.

What should he do, he asked me?

Well, I helpfully suggested, why don’t you Google the accuracy of the bog-standard polymerase chain reaction (PCR) test you’ve just had? That’ll give you some error bars and allow you to manage your risks.

Oh, he replied. I tried that.

I had no idea what he meant by this, so I typed the following into a search box: “WHO PCR COVID test accuracy”...

...and got back a tsunami of conspiracy and antivax propaganda.

I suddenly understood why my friend had reached out to me - halfway around the world - rather than trying to find the answer on his own.

My friend had stumbled into a data void, a phenomenon described by social media researcher Danah Boyd, who explored areas of the internet where there should be strong, factually based content. Data voids are spaces that have instead been crowded out by insanity, propaganda - and outright recruiting for various radical causes. They're one reason for, and one path towards, organised and violent extremism.

That a data void exists on a topic as important as the accuracy of PCR tests highlights the weakness of algorithmic approaches to indexing the Web, something we’ve been doing in one form or another since DEC launched AltaVista twenty-six years ago.

When search engines came along we assumed machines could index the web.

It’s now clear that human curation has a bigger role, separating the chaff (and the spam) from the grain.

Slow, expensive and dependent on humans, curation breaks business models that print money by consigning the hard bits of running web businesses to automation.

Yet automation clearly does the job incompletely, and at worst - as we can see with my search results - can amplify dangerous misinformation.

Which is why Google’s threat to pull search from Australia in the face of laws that force it to share profits rings increasingly hollow.

Yes, Bing, DuckDuckGo and others would simply fill the vacuum left by Google. Yet a lack of Google could serve a greater purpose, because making the Internet harder to use could be exactly what we need.

For a generation we’ve been able to type anything we like into an empty white box and find something that felt like the answer to our needs. Over time we've assumed it may also be the truth. Or truthful enough.

Yet the complete lack of transparency between typing a search request and receiving results (among Google’s deepest secrets) has also become its - and our - greatest weakness. Our ability to find things has atrophied precisely because all the work has been automated.

Now, when we need to find something of vital importance, we often can’t: if it lies beyond the range of a search engine, we have no other way to reach it.

Google can’t write an algorithm to fix that; it can only curate its way towards knowledge.

That was the founding principle of Yahoo! which started life as a curated index of everything on the web. The web’s explosive growth made it more and more difficult to operate or use a curated index; that’s when search took over. Ever since, everything we’ve done on the web has been centered on a blind spot that might, at last, be too big to ignore.

If Australians switched en masse from Google to Baidu, they might find themselves learning a different version of Chinese history in which the Tiananmen Square massacre did not happen.

That sort of blind spot we can see coming because China tells us what it doesn't want us to know.

It’s harder to anticipate blind spots that only become apparent in moments of greatest need. And in those moments it may be too late to do anything but try to quickly recover those long-abandoned skills for finding important information.

No one wants Google to pull the plug on Search in Australia, or elsewhere. Yet it could present a fantastic opportunity: a chance to learn, again, what it means to dig for information. We would value that information, because of the time we invested in finding it.

In an era of information overload - and stripped of the blinders of an empty white box hiding all that complexity - we get to decide what’s valuable. We’ll spend time searching it out, verifying it, curating it, and then - we’ll share it with our peers, our children, and our nation. It’ll be slow. But it won’t be wrong. And it won’t leave us blinded by the simplicity of a white box.
"wouldn't that create a big explosion?"

"A black hole is like, this giant bunghole in outer space... it sucks up the whole universe, and then it's like, it grinds it up and sends it all to Hell or something." - Butt Head
Sith Marauder
Posts: 4068
Joined: 2012-08-06 07:58pm
Location: British Columbia, Canada

Re: The Register: How do we combat mass global misinformation?

Post by Jub »

For a search like that, why not simply use Google Scholar to weed out the BS? You could also use search modifiers to remove unwanted terms from the results. I get that it's annoying, but we already have ways to solve this, and my 32-year-old self learned them in high school.
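For instance, a couple of standard Google operators (`site:`, the `-` exclusion prefix, and quoted phrases) would have steered the original search toward primary sources; the exact queries below are only illustrative:

```
"PCR test" accuracy site:who.int
covid PCR "false negative rate" -conspiracy -hoax
```

`site:` restricts results to a single domain, a quoted phrase must appear verbatim, and a leading `-` drops pages containing that term.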
Emperor's Hand
Posts: 8832
Joined: 2003-05-11 09:41pm

Re: The Register: How do we combat mass global misinformation?

Post by Solauren »

Also, since Google and other search engines can already pop up warnings about dangerous websites, links from those engines (and from other major sites, like Facebook) to known disinformation sources could carry similar warnings.
I've been asked why I still follow a few of the people I know on Facebook with 'interesting political habits and view points'.

It's so when they comment on or approve of something, I know what pages to block/what not to vote for.
Sith Marauder
Posts: 4449
Joined: 2002-10-29 07:19pm
Location: DC Metro Area

Re: The Register: How do we combat mass global misinformation?

Post by Exonerate »

This is a painfully naive way of looking at only one thin slice of what makes up misinformation. Google very much relies on human feedback to determine how to rank pages - in fact, it was one of the innovations that made it the dominant search engine. I remember Yahoo. There's a reason Google beat them. If anything, Google democratized the process of curation by incorporating signals from its myriad users. The problem is not that there isn't enough human involvement; it's that one person's information is another's misinformation - for everything you believe is true, there is somebody out there who fervently believes otherwise. Most people are ill-equipped to determine what is "true", especially outside their areas of familiarity, for lack of knowledge, time, or perspective. To put it bluntly, if people could critically think their way out of the lamented misinformation, it wouldn't be a problem in the first place.
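The link-analysis innovation alluded to here is PageRank, which treats every hyperlink as a human "vote" for the target page. A toy power-iteration sketch (the function name and the three-page graph are mine, not Google's, and production ranking uses far more signals than this):

```python
def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # start with uniform rank
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:  # dangling page: spread its rank evenly
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:  # each outlink carries an equal share of this page's rank
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
        rank = new_rank
    return rank

web = {
    "A": ["B", "C"],  # A "votes" for B and C by linking to them
    "B": ["C"],
    "C": ["A"],
}
scores = pagerank(web)
# C collects votes from both A and B, so it ends up ranked highest.
```

The "democratized curation" point falls out of the structure: no editor ranks anything; the aggregate linking behaviour of everyone publishing on the web does.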

At the individual level, there are a lot of ways to shortcut rational thinking. There are plenty of studies on how emotive topics, such as politics, bypass critical examination. On those topics, most people are not really interested in discovering the truth, but in confirming what they want to believe is true. Trusting the messenger is another shortcut, which is why social media has been such a prolific vector of misinformation. I've certainly accepted some things at face value because they came from somebody I knew, only for them to crumble when I got suspicious and applied closer scrutiny. At the group/population level, you get the bandwagon effect - most people will go along with their peers... or, when the first page of Google results is all similar in content, accept that as the truth (which is then reinforced as "high quality").

What the author is really calling for is a return to a time when most everybody could agree on what was true - for somebody to wave a magic wand and make everything that isn't official truth disappear. This is just a proposal to offload the difficult work of sense-making, epistemological discovery, critical thinking, etc. onto a selection of people who they think can be trusted to do it. The recent experience with COVID-19 should starkly illustrate the flaws of this approach: in early 2020, many public health institutions - with the best intentions - discouraged the general usage of masks to mitigate the risk of COVID, because they were afraid it would squeeze the supply for healthcare workers and might provide a false sense of safety that would encourage riskier behaviour. This was despite good evidence that mask usage was effective in mitigating COVID, and despite dissent from health scientists. Still, it became the official truth. Shortly afterwards, they had to do a 180 and tell people to start wearing masks. Then you got Trump applying political pressure to change school reopening guidelines, and so on. The point is this: if the most prestigious health agencies - professional, competent, well-intentioned, acting in their sphere of expertise - got it wrong when it mattered most, do you really want, say, Twitter/Facebook/Google curating everything?
