In the TED Talk video above, Eli Pariser shows that personalization algorithms are being used everywhere, from Facebook's news feed, to Google's search results, and even to the news on Yahoo News. More worryingly, he claims that the "filter bubbles" created by these personalized results are a major problem for society, as they limit users' exposure to new ideas. Pariser argues that to limit the effects of filter bubbles, personalization algorithms need to include an "embedded ethics" that shows us things that are "uncomfortable, or challenging or important".
What do you think of Pariser's claims? Come to Pugwash this week to share your thoughts with others. We'll be asking questions like:
- Are filter bubbles a genuine problem for our society?
- If so, should a company like Facebook be obligated to show us "challenging" content that might make us less likely to stay on the site?
- Alternatively, is there a way to incentivize companies to use personalization algorithms with "embedded ethics"?
- Is there a difference between the filter bubbles created by personalization algorithms and those created by manually curated news sources? If so, what is it?
- For example, if you can easily find a news station or blog that shares your beliefs, does it matter that a service like Yahoo News is adapting to those beliefs?