Media Engagement Metrics and Recommendation Engines

…now you gotta decide how much you’re going to give up, how much of your intellectual independence.”

Chamath Palihapitiya, former vice-president for user growth at Facebook

This isn’t traditional IT security, but I think it’s relevant. When I see businesses burning, I see a lack of security. When I see companies hiring private security because the police are being disbanded, I see a lack of security. When I see things like the flyer about fixing the ‘Jewish Privilege’ issue, I see a lack of security. When I see areas of a city that are blocked off so that no one can enter, I see a lack of security.

When media companies started moving to online platforms and social media became ‘a thing’, they started wooing advertisers with engagement metrics, showing how many clicks, likes, and shares their articles generated. Once those metrics began to drive dollars, media companies quickly figured out what it takes to get us to click and share an article. Guess what that is: whatever makes you the angriest is whatever you are most likely to share.
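To make that incentive concrete, here is a toy sketch of how an engagement score might be computed. The field names and weights below are invented for illustration and are not any platform’s real formula; the point is simply that the actions which spread content the furthest count for the most, so the content that provokes the strongest reaction rises to the top.

```python
# Toy engagement score -- NOT any platform's real formula.
# Field names and weights are hypothetical, chosen only to illustrate the
# incentive: reactions that spread content (shares, heated comments) are
# worth far more to an ad-driven business than a quiet read.

ENGAGEMENT_WEIGHTS = {
    "view": 1,      # a passive read earns almost nothing
    "like": 5,
    "comment": 20,  # arguments in the comments keep people on the page
    "share": 50,    # shares recruit new eyeballs, the most valuable action
}

def engagement_score(article_events):
    """Sum weighted interactions for one article.

    article_events is a dict like {"view": 10_000, "like": 300, "share": 90}.
    """
    return sum(ENGAGEMENT_WEIGHTS.get(kind, 0) * count
               for kind, count in article_events.items())

calm_explainer = {"view": 10_000, "like": 200, "comment": 30, "share": 20}
outrage_piece  = {"view": 10_000, "like": 150, "comment": 600, "share": 900}

print(engagement_score(calm_explainer))  # 12600
print(engagement_score(outrage_piece))   # 67750 -- anger wins the ranking
```

With the same number of views, the piece that makes people angry enough to argue and share scores several times higher, so it is the one that gets promoted and the one advertisers end up paying for.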

Couple that with the social validation feedback loop created by social media and you have the perfect storm for the free world to shred itself. Chamath Palihapitiya, who was vice-president for user growth at Facebook before he left the company in 2011, said: “The short-term, dopamine-driven feedback loops that we have created are destroying how society works. No civil discourse, no cooperation, just misinformation and mistruth.” Another Facebook insider to speak out about this is Antonio Garcia-Martinez, who says that Facebook uses the data it collects about people to manipulate them for its own profit. Others, like Ken LaCorte, simply exploit the divisions it creates to make more money.

For those who aren’t familiar, let’s take a side trip into neurochemistry. When you scroll through your social media feed and something interesting pops up, your brain gives you a shot of dopamine. When you post and get a like or a comment, you get another shot of dopamine. Since it’s intermittent, somewhat random reinforcement, it’s extremely effective at maintaining the behavior. In short, it keeps you on the social media site scrolling, clicking, and posting for more gratification by driving your subconscious impulses.
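If it helps to see the schedule rather than read about it, here is a minimal simulation of that intermittent reward pattern. The 15% reward probability is an arbitrary illustrative number, not a measurement of any real feed; what matters is that the payoff is unpredictable.

```python
import random

# Toy simulation of an intermittent (variable-ratio) reward schedule.
# REWARD_PROBABILITY is an assumed, illustrative value.

REWARD_PROBABILITY = 0.15  # roughly 1 in 7 scrolls "pays off"

def scroll_feed(num_scrolls, seed=None):
    """Return the scroll positions where the user got a rewarding post."""
    rng = random.Random(seed)
    rewards = []
    for position in range(num_scrolls):
        if rng.random() < REWARD_PROBABILITY:
            rewards.append(position)
    return rewards

hits = scroll_feed(100, seed=42)
print(f"{len(hits)} rewarding posts in 100 scrolls, first few at {hits[:5]}")
# Because the payoff is unpredictable, the next scroll always *might* be the
# one that delivers -- which is exactly what keeps the behavior going.
```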

Facebook and others have long been aware that their recommendation algorithms are stoking polarization and divisiveness. The recommendation engines create echo chambers where you get more of whatever you look at, which, given how media metrics work, means you get more of what makes you angry. The Wall Street Journal reported that a staggering 64% of the people who joined extremist groups on Facebook did so because the recommendation engine suggested them.
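A toy simulation shows how quickly that loop narrows a feed. The topics, click rates, and update rule below are all invented for illustration; real recommendation engines are far more sophisticated, but the basic dynamic is the same: whatever gets clicked gets recommended more, so the feed collapses around it.

```python
# Minimal sketch of an engagement-maximizing recommender feedback loop.
# Topics, click rates, and the learning rule are assumed for illustration.

CLICK_RATE = {"outrage": 0.30, "news": 0.10, "hobbies": 0.08}  # assumed rates

def run_feedback_loop(rounds=20, posts_per_round=100):
    # Start with an even mix of topics in the feed.
    mix = {topic: 1 / len(CLICK_RATE) for topic in CLICK_RATE}
    for _ in range(rounds):
        # Expected clicks per topic this round.
        clicks = {t: mix[t] * posts_per_round * CLICK_RATE[t] for t in mix}
        total = sum(clicks.values())
        # "Learn": next round's feed is proportional to what got clicked.
        mix = {t: clicks[t] / total for t in mix}
    return mix

print(run_feedback_loop())
# After a handful of rounds, the outrage share approaches 100%, even though
# it started as just one topic among three.
```

Nothing in that loop sets out to radicalize anyone; it only optimizes for clicks. The narrowing toward the most provocative content is a side effect of the objective.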

Facebook has finally admitted that it took advertising money from Russian operatives who set out to divide voters in the 2016 election. Facebook also admitted that it took advertising money that resulted in an ethnic cleansing in Myanmar. Ethnic cleansing is a polite way of saying that a lot of people died. India recently had an incident in which WhatsApp (owned by Facebook) messages prompted the lynching of seven people. Facebook and others have repeatedly ignored their own research about how dangerous this has become because it is profitable.

All around the world, people are dying because recommendation engines and media metrics are feeding fear, anger, and paranoia. When does it become bloody enough to shut these platforms down and force them to rethink their business model?
