What do you call it when media try to manipulate your feelings without first asking for informed consent?
Tuesday.
Example: The average Facebook user sees only 20 percent of the 1,500 stories per day that could have shown up in their news feed. The posts you receive are determined by algorithms whose bottom line is Facebook's bottom line. The company is constantly adjusting all kinds of dials, quietly looking for the optimal mix to make us spend more of our time and money on Facebook. Of course the more we're on Facebook, the more information they have about us to fine-tune their formulas for picking ads to show us. That's their business model: We create and give Facebook, for free, the content they use and the data they mine to hold our attention, which Facebook in turn sells to advertisers.
Those are the terms of service that everyone, without reading, clicks "I Agree" to -- and not just for Facebook. We make comparable mindless contracts all the time with Gmail, Yahoo, Twitter, Amazon, Siri, Yelp, Pandora and tons of other apps, retailers and advertiser-supported news and entertainment. If you're online, if you use a smartphone, you're an experimental subject in proprietary research studies of how best to target, engage and monetize you. They're always testing content, design, headlines, graphics, prices, promotions, profiling tools, you name it, and you've opted in whether you realize it or not.
Many of these experiments hinge on our feelings, because much of what makes us come, stay, buy, like, share, comment and come back is emotional, not rational. So it should surprise no one that Facebook wants to know what makes its users happier. But when the company acknowledged last month that it had tested -- on 700,000 people, for one week -- whether increasing the fraction of upbeat posts in their news feeds made them feel more upbeat (it did), a firestorm broke out.
Complete story at - Please Manipulate Me | The Smirking Chimp