
10 ways algorithms rewrite how you think

Algorithms don’t simply serve up random stuff on your feed. They tweak and filter what you see in ways designed to grab your attention or shift your mood. And yes, they can change your opinions. Scientists have studied exactly how they do this. Here are ten research-backed ways algorithms rewrite how you think. Which one do you think hits hardest?

Featured Image Credit: Shutterstock.

Search rankings

Image Credit: Shutterstock.

In one experiment, researchers adjusted the order of search results. Undecided voters shifted their preferences toward the top-ranked candidate, sometimes by more than 20%, and most of them never realized anything had changed. The information you see first can shape your political opinions. No flashy ads required.

Autocomplete

Image Credit: Shutterstock.

Autocomplete is the feature that suggests complete phrases as you start typing. Studies found that simply hiding negative suggestions and showing friendlier ones changed how people judged a candidate afterward. The strangest part? The only thing that changed was the suggested phrase, and that alone was enough to shift opinions.

Repetition

Image Credit: Shutterstock.

Say something once and it’s mere noise. Say it five times and it suddenly feels true. Psychologists call this the “illusory truth effect”: even when people know a statement is false, repeated exposure makes it seem more believable. See a claim often enough on social media, and you may start to believe it.

Engagement-tuned feeds

Image Credit: Shutterstock.

Every moral-emotional word you add to a political post boosts its shares by around 20%. Lab studies have confirmed the cycle: an outraged post gets more likes, which encourages more outraged posts. That’s why you see so many angry political posts. It isn’t random. Algorithms lean toward heated content, and that changes what gets repeated in your feed.

Personalization

Image Credit: Shutterstock.

Millions of American adults see far less news from the opposing political side. Why? Ranking systems. Conservatives see about 5% less content from liberals, while liberals see about 8% less from conservatives, and that’s before anyone even clicks. The filtering happens at the exposure stage, so people simply never see the stuff that would challenge their opinions.

Popularity bias

Image Credit: Shutterstock.

It’s no secret that algorithms chase the crowd. Studies show these systems boost whatever is already trending, reinforcing a rich-get-richer loop: what’s popular gets more popular. The downside? A small set of songs or articles soaks up everyone’s attention, while the quieter ones practically vanish in the algorithm’s loop.

Visible ratings

Image Credit: Shutterstock.

One study found that a single early upvote on a post increased the odds of further upvotes by 32%. A downvote didn’t work the same way, though. Readers saw the positive score and judged the content through that lens; effectively, they started liking it because other people did. Over time, that herding can tilt entire discussions.

Comment ordering

Image Credit: Shutterstock.

What other people comment matters, too. In one study, the top comment changed what most readers believed about a post, and people judged the same article differently depending on which comments sat at the top. Same story, different perception, simply because of comment order. How weird.

Algorithmic fact-check labels

Image Credit: Shutterstock.

Those little warning tags like “missing context” or “disputed” are hardly random. Most of the time, automated systems decide which posts get flagged, and many people treat the flags as a clear verdict of truth or falsehood. A labeled post becomes less believable, while an unlabeled one seems more trustworthy by comparison. Often, the label matters more than the content.

Shadow-banning

Image Credit: Shutterstock.

Some posts never travel far, and it’s not always clear why. You might think you’re posting as usual, yet the algorithm quietly limits how many people actually see it; that’s called “shadow banning.” The result is that fewer critical or opposing takes show up in people’s feeds. Soon enough, conversations slow down and opinions shift.

Sources: Please see here for a complete listing of all sources that were consulted in the preparation of this article.
