Decentralized news media

Background

News production has degenerated in response to the concentration of outlet ownership and the dynamics of present-day social media. The multiplication of news channels, instead of giving us greater diversity of opinion and more nuanced approaches to storytelling, has fragmented the audience, with each channel catering to a narrower segment with more fixed beliefs. As these channels court their particular audiences, they become ever more careful to avoid confronting them with facts which contradict their narratives. Such a system produces filter bubbles, in which the citizens of democracies are split into subpopulations that share fewer and fewer value assumptions and less and less common terminology.

The clickbait culture incentivized by the typical social media platform discourages nuance in the reporting and transmission of news stories. Context stripping, cherry-picking, and good-versus-evil storytelling are extremely successful in the Darwinian environment of algorithmic information filtering.

Eliminating these algorithms from the transmission of information is not possible. The correct path forward is to improve them, making the filtering process more effective at transmitting better stories.

The solution is to decentralize the power of news production and transmission. News production can be decentralized by giving reporters ownership of their stories on a platform which organizes reporters with similar standards and values. News transmission can be decentralized by giving consumers greater control over their experience: readers, listeners, and viewers are given greater control over the algorithm that filters content, so that it reflects their own standards and values.
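
The reader-controlled filtering described here can be made concrete with a minimal sketch. Everything in it is an assumption introduced for illustration rather than a design drawn from this wiki: the Story fields, the ReaderProfile weights, and the scoring rule are hypothetical, and they only show the general shape of a ranking function whose parameters are owned by the consumer rather than the platform.

```python
from dataclasses import dataclass, field

# Hypothetical story record; the tags and quality scores are illustrative only.
@dataclass
class Story:
    headline: str
    reporter: str
    topic_tags: set[str]
    sourcing_score: float   # 0.0-1.0, how well-sourced the piece is (assumed metric)
    nuance_score: float     # 0.0-1.0, penalizes good-versus-evil framing (assumed metric)

# Weights owned and edited by the reader, not by the platform.
@dataclass
class ReaderProfile:
    topic_weights: dict[str, float] = field(default_factory=dict)
    sourcing_weight: float = 1.0
    nuance_weight: float = 1.0

def rank_feed(stories: list[Story], profile: ReaderProfile) -> list[Story]:
    """Order a feed according to the reader's own declared standards and values."""
    def score(story: Story) -> float:
        topic_score = sum(profile.topic_weights.get(tag, 0.0)
                          for tag in story.topic_tags)
        return (topic_score
                + profile.sourcing_weight * story.sourcing_score
                + profile.nuance_weight * story.nuance_score)
    return sorted(stories, key=score, reverse=True)

# Example: a reader who prioritizes well-sourced local and economic coverage.
feed = rank_feed(
    [Story("Local budget analysis", "reporter_a", {"local", "economy"}, 0.9, 0.8),
     Story("Outrage of the day", "reporter_b", {"politics"}, 0.3, 0.1)],
    ReaderProfile(topic_weights={"local": 1.0, "economy": 0.5}),
)
```

The point of such a design is that every weight the algorithm uses is visible to and editable by the reader, so the filtering criteria are never hidden from the person being filtered for.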

Design Structure

Criticisms

Filter bubbles

A natural criticism of giving news readers greater power over the content they consume is that it will make filter bubbles more extreme. When people have more power to choose the type of content they see, their feeds become more personalized. If people can avoid news that contradicts their values, they are less likely ever to examine those values, and so less likely to deepen their understanding of them. With less exposure to facts that contradict their current worldviews, they are also less likely to improve their understanding of the world.

That criticism describes precisely the position we have already been put in by the current algorithmic filtering on social media and the audience fragmentation caused by channel proliferation. But the way out is not to retreat by eliminating the diversity of channels or the use of algorithmic filtering. The way forward is to improve those tools so that they benefit society.

Filter bubbles can be eliminated by giving people more information. When a platform indicates what type of information a reader is consuming, that reader at least knows what kind of bubble they have chosen, and gains a clearer picture of what other types of content are available.
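
Such an indicator could take many forms; the following is a minimal sketch under assumed data structures (a list of perspective tags attached to a reader's recent history), not a design specified by this wiki. It reports the share of each perspective in the history together with a normalized diversity score, which a platform could display alongside the feed.

```python
from collections import Counter
import math

def bubble_report(consumed_tags: list[str]) -> dict[str, float]:
    """Share of recently consumed stories per perspective tag."""
    counts = Counter(consumed_tags)
    total = sum(counts.values())
    if total == 0:
        return {}
    return {tag: count / total for tag, count in counts.items()}

def diversity_score(consumed_tags: list[str]) -> float:
    """Normalized Shannon entropy: 0.0 = a single perspective, 1.0 = an even mix."""
    shares = list(bubble_report(consumed_tags).values())
    if len(shares) <= 1:
        return 0.0
    entropy = -sum(p * math.log(p) for p in shares)
    return entropy / math.log(len(shares))

# Example: a history that skews heavily toward one perspective.
history = ["perspective_a"] * 8 + ["perspective_b"] * 2
print(bubble_report(history))              # {'perspective_a': 0.8, 'perspective_b': 0.2}
print(round(diversity_score(history), 2))  # 0.72
```

Surfacing a number like this does not force anyone out of their bubble; it simply makes the shape of the bubble visible, which is exactly the extra information the argument above calls for.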