Decentralized news media

Background

News production has degenerated in response to the concentration of outlet ownership and the current social media experience. The multiplication of news channels, instead of giving us greater diversity of opinion and more nuanced approaches to storytelling, has fragmented the audience as each channel caters to a narrower audience with more fixed beliefs. As these channels court their particular audiences, they become ever more careful to avoid confronting them with any facts which contradict their narratives. Such a system leads to filter bubbles where citizens in democracies are fragmented into subpopulations which share fewer and fewer value assumptions and less common terminology.

The clickbait culture incentivized by the typical social media platform discourages nuance in the reporting and transmission of news stories. Context stripping, cherry-picking, and good-versus-evil storytelling are extremely successful in the Darwinian environment of algorithmic information filtering.

Eliminating the use of these algorithms in the transmission of information is not possible. The correct path forward is to improve the algorithms, making the filtering process more effective at transmitting better stories.

The solution is to decentralize the power of news production and transmission. News production can be decentralized by giving reporters ownership of their stories on a platform which organizes reporters with similar standards and values. News transmission can be decentralized by giving consumers of news media (readers, listeners, and viewers) greater control over the algorithm that filters their content, according to their own standards and values.
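
As a purely illustrative sketch (not a specification of any existing platform), consumer control over the filtering algorithm could be as simple as letting each reader set explicit weights over story attributes and ranking the feed by the resulting score. The attribute names and weights below are assumptions made for the example.

```python
from dataclasses import dataclass

@dataclass
class Story:
    title: str
    source_reputation: float  # 0..1, e.g. derived from reporter reputation
    topical_relevance: float  # 0..1, match to the reader's stated interests
    viewpoint_novelty: float  # 0..1, distance from the reader's usual feed

def score(story: Story, weights: dict[str, float]) -> float:
    """Rank a story by the reader's own, explicitly chosen weights."""
    return (weights["reputation"] * story.source_reputation
            + weights["relevance"] * story.topical_relevance
            + weights["novelty"] * story.viewpoint_novelty)

# Example: a reader who deliberately up-weights unfamiliar viewpoints.
reader_weights = {"reputation": 0.5, "relevance": 0.2, "novelty": 0.3}
feed = [
    Story("Local council budget", 0.9, 0.4, 0.2),
    Story("Opposing view on zoning", 0.7, 0.3, 0.9),
]
for story in sorted(feed, key=lambda s: score(s, reader_weights), reverse=True):
    print(story.title, round(score(story, reader_weights), 2))
```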

Properly weighted incentive mechanisms are crucial for designing an algorithm which successfully represents the group's values. Reputation tokens which pay off in the future incentivize members to police the system, as long as policing is properly rewarded. DGF is designed to provide a governance structure which supports that possibility.
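
The following toy sketch illustrates how a future-oriented reputation token can reward policing; it uses placeholder parameters and is not the actual DGF mechanism. Reputation holds a pro-rata claim on future fee pools, and a successful challenge slashes the offender's reputation while granting part of it to the challenger.

```python
class ReputationLedger:
    """Toy model: reputation gives a pro-rata claim on future fee pools."""

    def __init__(self, challenge_reward_ratio: float = 0.5):
        self.balances: dict[str, float] = {}
        self.challenge_reward_ratio = challenge_reward_ratio  # placeholder parameter

    def mint(self, member: str, amount: float) -> None:
        self.balances[member] = self.balances.get(member, 0.0) + amount

    def distribute_fees(self, fee_pool: float) -> dict[str, float]:
        """Pay this period's fees pro rata to reputation holders (the 'future payoff')."""
        total = sum(self.balances.values())
        return {m: fee_pool * bal / total for m, bal in self.balances.items()} if total else {}

    def uphold_challenge(self, offender: str, challenger: str, slashed: float) -> None:
        """Policing: slash the offender's reputation and reward the challenger with part of it."""
        slashed = min(slashed, self.balances.get(offender, 0.0))
        self.balances[offender] = self.balances.get(offender, 0.0) - slashed
        self.mint(challenger, slashed * self.challenge_reward_ratio)

ledger = ReputationLedger()
ledger.mint("reporter_a", 100)   # earned by publishing a well-reviewed story
ledger.mint("reporter_b", 100)
ledger.uphold_challenge("reporter_b", "reporter_a", slashed=40)  # misleading story caught
print(ledger.distribute_fees(1000))  # reporter_a now holds a larger claim on future fees
```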

Changes in news gathering

The design of the incentive algorithm is crucial for incentivizing reporters to write stories which are not misleading. But before a story is written, reporters need a nuanced understanding of the sources of information that contribute to it. To confidently report valid information to their news consumers, they require trustworthy sources. No reporter can possibly be a complete authority on the subject they are reporting on; every writer relies on other reporters. News itself requires a system of newsgathering which has integrity at every level.

As the fourth estate breaks up into smaller outlets with shorter lifespans, and as legacy media outlets devolve and field fewer trusted professional reporters on the ground, news stories will require ever more nuanced policing and vetting of sources. A decentralized system which properly rewards such policing is necessary to give the proper incentives to seek the truth. A future-oriented, valuable, and meaningful reputation token is necessary to build this system of incentives. A dynamic governance mechanism is required so the system can incentivize the rebalancing of the incentive mechanism itself. This will always be necessary, because the landscape of reporting will always change, with new reporters, new audience interests, and new media of information transmission.
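
One hypothetical way to picture such a rebalancing mechanism: reputation-weighted proposals that update the incentive parameters themselves (here, the illustrative challenge-reward ratio from the sketch above). A real mechanism would add quorums, bounds, and delays; this is only an assumption-laden illustration.

```python
def reputation_weighted_update(current_value: float,
                               votes: dict[str, float],
                               reputation: dict[str, float]) -> float:
    """Set a new parameter value as the reputation-weighted mean of members' proposals.

    `votes` maps member -> proposed value; `reputation` maps member -> weight.
    """
    total = sum(reputation.get(m, 0.0) for m in votes)
    if total == 0:
        return current_value  # no weighted support: keep the old value
    return sum(v * reputation.get(m, 0.0) for m, v in votes.items()) / total

# Members propose new values for the (hypothetical) challenge-reward ratio.
new_ratio = reputation_weighted_update(
    current_value=0.5,
    votes={"reporter_a": 0.6, "editor_c": 0.4},
    reputation={"reporter_a": 120, "editor_c": 80},
)
print(new_ratio)  # 0.52
```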

Design Structure

Criticisms

Filter bubbles

A natural criticism of the idea of giving news readers greater power over the content they consume is that it will make filter bubbles more extreme. Giving people more power to choose the type of content they see makes their content feeds more personalized. If people can avoid news that contrasts with their values, they are less likely to ever examine those values, and so less likely to improve their understanding of them. With less exposure to facts which contradict their current worldviews, they are less likely to improve their understanding of the world.

That criticism describes precisely the position we have already been put in by the current algorithmic filtering on social media and the audience fragmentation caused by channel proliferation. But the way out is not to retreat and eliminate the diversity of channels or the use of algorithmic filtering. The way forward is to improve those tools to benefit society.

Filter bubbles can be eliminated by giving people more information. When a platform indicates what type of information a user is consuming, the user at least knows what type of bubble they have chosen, and gains a better understanding of what other types of content creation are available.
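
As a hypothetical example of such an indicator, a platform could report the distribution of content categories in a reader's recent history together with a normalized diversity score, so the reader can see how narrow the bubble they have chosen is. The category labels and metric below are assumptions for the illustration.

```python
import math
from collections import Counter

def bubble_report(consumed_categories: list[str]) -> tuple[dict[str, float], float]:
    """Return the share of each category consumed and a 0..1 diversity score
    (normalized Shannon entropy; 0 = single-category bubble, 1 = evenly spread)."""
    counts = Counter(consumed_categories)
    total = sum(counts.values())
    shares = {cat: n / total for cat, n in counts.items()}
    if len(counts) < 2:
        return shares, 0.0
    entropy = -sum(p * math.log(p) for p in shares.values())
    return shares, entropy / math.log(len(counts))

history = ["politics", "politics", "politics", "sports", "science"]
shares, diversity = bubble_report(history)
print(shares)     # {'politics': 0.6, 'sports': 0.2, 'science': 0.2}
print(diversity)  # ~0.86: fairly concentrated, but not a single-topic bubble
```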