Join the Program on Democracy and the Internet (PDI) and moderator Alex Stamos in conversation with Ronald E. Robertson for "Engagement Outweighs Exposure to Partisan and Unreliable News within Google Search."
This session is part of the Fall Seminar Series, a months-long series designed to bring researchers, policymakers, scholars, and industry professionals together to share research, findings, and trends in the cyber policy space. Both in-person (Stanford affiliation required) and virtual attendance (open to the public) are available; registration is required.
If popular online platforms systematically expose their users to partisan and unreliable news, they could potentially contribute to societal issues like rising political polarization. This concern is central to the echo chamber and filter bubble debates, which critique the roles that user choice and algorithmic curation play in guiding users to different online information sources. These roles can be measured in terms of exposure, the URLs seen while using an online platform, and engagement, the URLs selected while on that platform or browsing the web more generally. However, due to the challenges of obtaining ecologically valid exposure data (what real users saw during their regular platform use), studies in this vein often examine only engagement data, or estimate exposure via simulated behavior or inference. Despite the centrality of search engines to the contemporary information ecosystem, few such studies have focused on web search, and even fewer have examined both exposure and engagement on any platform. To address these gaps, we conducted a two-wave study pairing surveys with ecologically valid measures of exposure and engagement on Google Search during the 2018 and 2020 US elections. We found that participants' partisan identification had a small and inconsistent relationship with the amount of partisan and unreliable news they were exposed to on Google Search, a more consistent relationship with the search results they chose to follow, and the most consistent relationship with their overall engagement. That is, compared to the news sources our participants were exposed to on Google Search, we found more identity-congruent and unreliable news sources in their engagement choices, both within Google Search and overall. These results suggest that exposure and engagement with partisan or unreliable news on Google Search are driven not primarily by algorithmic curation, but by users' own choices.
Dr. Ronald E. Robertson received his Ph.D. in Network Science from Northeastern University in 2021. He was advised by Christo Wilson, a computer scientist, and David Lazer, a political scientist. In his research, Dr. Robertson uses computational tools, behavioral experiments, and qualitative user studies to measure user activity, algorithmic personalization, and choice architecture on online platforms. By rooting his questions in findings and frameworks from the social, behavioral, and network sciences, he aims to foster a deeper and more widespread understanding of how humans and algorithms interact in digital spaces. Prior to Northeastern, Dr. Robertson obtained a BA in Psychology from the University of California San Diego and worked with research psychologist Robert Epstein at the American Institute for Behavioral Research and Technology.