Search Polarises Extremes

A simple illustration of how non-AI search algorithms are the cause of the polarising memes that dominate public discourse.

You know from experience that Google search terms taken out of context can return irrelevant items, so as you start to type you think about which key terms are likely to be relevant to your particular interest. Whether you're finding something specific or finding out about something, explicitly or implicitly you have a search strategy in mind – even if it is the completely neutral "I've no idea where to look or what I'm looking for specifically, I'm just looking for ideas". That's your experience and your (human) mind working so far.

However, Google offers a helpful type-ahead of previous and popular search strings, often before you've completed the first word you had in mind. To save typing and mental effort, how often do you think, yeah, that string might do the job, and click on it? Relevant early returns? If not, go back, type another word or two, and pick one of the alternative strings offered. Obviously, Google's prioritising of the returns carries its own skewed prejudices – a commercial attention-grabbing game of SEO and the like – but the very act of selecting an already-trending search string skews you towards the popular options. The non-AI search algorithm has intercepted your own human-intelligent strategy. Binary extremes emerge naturally, even if your original intent was neutral.
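To see why, consider a toy model of that feedback loop – purely a hypothetical sketch, nothing to do with how Google's autocomplete actually works. Candidate search strings all start equal; each simulated searcher either accepts one of the currently most-clicked suggestions or ignores the type-ahead and types their own; and every click feeds back into the counts that decide the next round of suggestions. The parameters (twenty candidate strings, three suggestions offered, a 70% chance of accepting one) are arbitrary assumptions.

```python
import random

def simulate_typeahead(num_searches=10_000, num_strings=20,
                       num_suggested=3, follow_suggestion=0.7, seed=1):
    """Toy feedback loop, not Google's real autocomplete.

    Each simulated search either accepts one of the `num_suggested`
    most-clicked strings so far (probability `follow_suggestion`) or
    ignores the suggestions and picks any candidate string with equal
    probability. Every click feeds back into the counts that decide
    the next round of suggestions."""
    random.seed(seed)
    clicks = [0] * num_strings          # every candidate string starts equal
    for _ in range(num_searches):
        if random.random() < follow_suggestion:
            # type-ahead: offer the currently most popular strings
            top = sorted(range(num_strings), key=clicks.__getitem__,
                         reverse=True)[:num_suggested]
            choice = random.choice(top)
        else:
            # independent searcher: any string is equally likely
            choice = random.randrange(num_strings)
        clicks[choice] += 1             # the click reinforces the ranking
    return sorted(clicks, reverse=True)

if __name__ == "__main__":
    print(simulate_typeahead())
```

On a typical run, the few strings that happened to attract the earliest clicks end up with the bulk of the traffic while the rest languish – even though every string started identical and nearly a third of the simulated searchers ignored the suggestions altogether.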

(My own strategy at this point is to notice whether there is a Wikipedia item near the top or to the right and select that first – read around the subject, spot the contentious hot topics to avoid if possible – before continuing. But that takes effort, so inevitably the search strings and returns on offer dominate the traffic, and the traffic reinforces the algorithms – even search-string clicks that don't return any relevant items!)
