Virtual Library

Our virtual library is an online repository of all of the reports, papers, and briefings that IST has produced, as well as works that have influenced our thinking.

Op-ed

ROOST Reminds Us Why Open Source Tools Matter

Reports

Navigating AI Compliance, Part 2: Risk Mitigation Strategies for Safeguarding Against Future Failures

Mariami Tkeshelashvili and Tiffany Saade

Reports

Deterring the Abuse of U.S. IaaS Products: Recommendations for a Consortium Approach

Steve Kelly, Tiffany Saade

Podcasts

TechnologIST Talks: Looking Back and Looking Ahead: Deep Dive on the New Cybersecurity Executive Order

Carole House, Megan Stifel, and Steve Kelly

Podcasts

TechnologIST Talks: The Offense-Defense Balance

Philip Reiner and Heather Adkins

Reports

The Generative Identity Initiative: Exploring Generative AI’s Impact on Cognition, Society, and the Future

Gabrielle Tran, Eric Davis

Podcasts

TechnologIST Talks: A Transatlantic Perspective on Quantum Tech

Megan Stifel and Markus Pflitsch

Contribute to our Library!

We welcome suggestions from readers and will consider adding further resources; much of our work has already grown out of crowd-sourced collaboration. If you are an author whose work is listed here and you do not wish it to be included in our repository, please let us know.

Modulating Trust

Leah Walker and Zoë Brammer

SUMMARY

Social trust, meaning trust in other people and institutions, is critical to the DCDI problem set. But trust is not always beneficial. Although trust in technology can facilitate economic transactions, it can also diminish our capacity for skepticism. Consumers tend to prefer technologies that they trust, and sellers and developers of technology find more success when there is more trust in their systems. Yet trust placed too freely in technologies can also expose those same consumers to identity theft, addiction, misinformation, and fraud. Misplaced trust in online information, and in sources of online information, can create vulnerability to disinformation, affective polarization, and anti-democratic behavior.

The key findings of the DCDI research into trust include:

  • People are increasingly dependent on, and distrustful of, digital technology; however, they do not behave as though they mistrust it. Rather, people continue to use technology intensively in all aspects of daily life, despite being aware of its risks and its potential for manipulation.
  • The democratization of truth, the idea that everyone can have their own truth rather than deriving it from a few reputable sources, can erode the notion of objectivity and shared beliefs. Instead, people choose beliefs based on group identity and rationalize false beliefs to avoid cognitive dissonance.
  • Humans are programmed to trust those closest to them the most, which can also mean trusting those with whom they identify most strongly. This phenomenon extends to influencers: nano-influencers in particular exploit the human inclination to trust that which is near and dear, building devoted followings of like-minded individuals. The role someone plays within an ingroup perpetuates certain behavior, and thought leaders earn bigger rewards (e.g., followers, money through Patreon, or merchandise sales) for promoting more extreme or more polarizing content.
  • Digital technologies are affecting the cognitive processes that comprise trust, including memory, attention, and reasoning.