JACKSON, Miss. (WLBT) - How much is social media influencing what you do or what you think? Experts warn popular platforms like Facebook and Instagram can actually cause more harm than good and make spreading misinformation that much easier.
It’s no secret that social media sites like Facebook, Twitter and Snapchat want people to look at their phones as much as possible.
The screen time given to social media -- and by extension, its ads -- is how these publicly traded companies make money.
However, there’s another side to this, revealed in a new Netflix documentary, that demonstrates how these sites are changing our behavior in ways many people may never have considered.
“The Social Dilemma” reveals that Twitter, Facebook and other social media companies aren’t just tracking what their users do on these platforms; they’re also filling users’ news feeds with opinions and other content those users are more likely to agree with.
Former high-level executives from these social media companies say in the film that what you see in your news feed is driven by what you like: a feedback loop that serves up more like-minded opinions the more you like and share.
“We wind up being in an echo chamber where we just end up feeding back on what we already know. We call that confirmation bias also, because we are just looking for things to confirm our existing beliefs instead of trying to go out to see other viewpoints and have that back-and-forth dialogue,” said Millsaps College associate professor of psychology and neuroscience Sabrina Grondhuis. “So what ultimately winds up happening is we wind up believing that way more people agree with us and are consistent with what our own beliefs are than what is true in actual reality.”
Grondhuis said using social media -- particularly posting comments and photos for likes -- also produces a reaction similar to an addictive substance.
“Anytime we feel that something we’re doing is justified or validated, we do feel this little boost from it,” Grondhuis said.
So if you see a post you’re curious about -- even if it contains misinformation -- you might believe it and not know right away that it’s incorrect.
Facebook and Twitter’s programs, or algorithms, monitor that activity and serve up similar posts, because doing so holds your interest and keeps you on their sites longer. That strategy can send users down a rabbit hole until they eventually believe what they’re reading.
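The feedback loop described above can be sketched in a few lines of code. This is a toy simulation, not any platform’s actual algorithm: the topics, the simple like-count scoring, and the assumption that a user always engages with the top item are all illustrative inventions.

```python
from collections import Counter

def recommend(posts, liked_topics, k=3):
    """Rank posts by how often the user already liked that topic (toy engagement score)."""
    counts = Counter(liked_topics)  # missing topics score zero
    return sorted(posts, key=lambda p: counts[p["topic"]], reverse=True)[:k]

posts = [
    {"id": 1, "topic": "politics"},
    {"id": 2, "topic": "sports"},
    {"id": 3, "topic": "politics"},
    {"id": 4, "topic": "science"},
    {"id": 5, "topic": "politics"},
]

# One initial like is enough to start the loop.
liked = ["politics"]
for _ in range(3):
    feed = recommend(posts, liked)
    liked.append(feed[0]["topic"])  # the user engages with the top recommendation

print(liked)  # the feed keeps reinforcing the same topic
```

Even in this crude model, a single early signal dominates every later feed, which is the echo-chamber dynamic Grondhuis describes.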
“Even if we do attempt to find diverse viewpoints, either the algorithms or those filters we set up for ourselves by selecting our friends and people we’re following, it’s just gonna give us back what we already believe, and so it’s really prohibiting us from actually accessing additional information,” said Grondhuis.
That additional information, she said, might have been fact-checked by reputable news organizations, but false claims and conspiracy theories spread more rapidly instead.
Political science professor Nathan Shrader said social media has also further polarized people on both sides of the aisle, and he believes it shows people shouldn’t get their news from social media, a responsibility he said everybody shares.
“I don’t want to take the burden off of the American citizens. Because at the end of the day, we are doing two things. We are electing to use those platforms. That’s the first thing. The second thing is we can still choose other venues or outlets for information,” Shrader said.