Algorithmic bias
"When I would query them for summaries of historical figures, it usually got the most important things.
"Something different happened when I generated summaries for more obscure figures. It became apparent the summaries were skipping over the things I liked best about them.
"Popular figures benefit from a steady flow of subjective analyses made by individuals over time. For less popular figures, somewhere between obscure and somewhat popular, the most interesting things about them fades into the background and isn’t emerged in the summaries.
"The same bias emerges differently when research papers are summarized. The findings of the paper that are already well known get a lot of presence in the summary. The novel findings, eg. the most important ideas, are skipped over precisely because they are less well known.
"What’s happening here seems like the right behavior for contexts that are improved by optimizing for consensus.
"Whenever it’s true that important ideas that are not well known can live alongside important ideas that are consensus, it is also true that unique communities can exist alongside much bigger mainstream communities without drawing much attention to themselves."