Psst is a nonprofit dedicated to creating safer channels for collective whistleblowing. For four months in 2025, ASML partnered with Psst to develop novel whistleblowing technologies. We built two things: first, a set of prototypes for employment verification, enabling a whistleblower to prove that they work at a particular company without revealing personal information (e.g., their name); and second, a new storage system for securely holding disclosures until multiple similar tips are identified, i.e., until the whistleblowers have strength in numbers. Below, we describe what we built and why we built it. We are also excited to announce that all of our code is open source: check out our GitHub (#1, #2, #3), project page, and Senior Engineer Nora Trapp’s technical deep-dive for more details!

Motivation

Psst was founded by three people: a former journalist, a writer/activist, and a lawyer, with a combined decade of experience helping whistleblowers. Psst’s founders had seen the costs of blowing the whistle alone, as well as the strength of worker collectives who came forward together. When ASML met with Psst late last year, Psst wanted to develop an online storage system (dubbed “the Safe”) that would accept disclosures from workers and keep them encrypted until they could be “matched” with other, similar disclosures. So, for example, if only one person submitted a tip about their workplace, their disclosure would be inaccessible to Psst. Only when other, similar disclosures about that same workplace had been received would the set of tips be revealed to Psst, along with a secure way to contact the sources. This would enable whistleblowers to come forward with the confidence that they wouldn’t be stepping out alone. To protect whistleblower privacy and safety, the Safe and its “matching” system would need some basic source verification, and would need to be resistant to tampering, leaks, and human error, whether by the whistleblower or by the Psst team.

“Had I been able to connect with other Boeing employees… a more comprehensive picture of what was going wrong across the company might have been made public earlier. A collective approach using Psst Safe could have… potentially saved lives.”  

Ed Pierson, Boeing 737 MAX whistleblower and Psst board member

In 2013, the launch of SecureDrop granted critical protection to whistleblowers in the digital age. An open-source system that uses the private network Tor and dedicated hardware to enable secure, anonymous whistleblowing, SecureDrop is now maintained by the Freedom of the Press Foundation and remains a vital tool for whistleblowers to come forward safely. But today, in an age of highly accessible encrypted tools like Signal, SecureDrop’s usability challenges call for new innovations for whistleblowers. We hope to contribute to this future.

We collaborated with Psst to develop two pieces of the Safe.

  1. Safe, private employment verification as a first step in whistleblower verification. We developed a set of prototypes that allow a whistleblower to prove a single aspect of their identity (namely, that they work at a particular organization) without alerting their employer and without revealing other personal details (e.g., their name or email address). For technical details, see ASML Senior Engineer Nora Trapp’s deep-dive into what we built and why: Selfhood in the Shadows: Verification Tools.
  2. Matching whistleblowers with each other in a way that is automated, trustworthy, and resistant to tampering, leaks, and human error. Our goal was to allow whistleblowers within the same company to securely find each other and gain strength in numbers. We designed a “rendezvous protocol” that uses threshold cryptography and secret-sharing schemes to ensure that Psst cannot read an individual disclosure before a matching disclosure is found. This creates safety for whistleblowers, who don’t need to place as much trust in Psst, and relieves Psst of the risk of making a mistake and breaking trust with their users. Learn more in Selfhood in the Shadows: Matching Tools.
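The technical deep-dive has the full protocol design; purely as a loose illustration of the secret-sharing idea behind it, here is a minimal 2-of-n Shamir sketch in Python. All names and parameters are hypothetical and this is not Psst’s actual protocol: it only shows how a key protecting a disclosure can be split so that no single share reveals it, while any two shares (e.g., two matching disclosures) reconstruct it.

```python
import secrets

# Illustrative 2-of-n Shamir secret sharing over a prime field.
# A single share leaks nothing about the secret; any two shares
# recover it via Lagrange interpolation.

PRIME = 2**127 - 1  # a Mersenne prime, large enough for a 16-byte key

def make_shares(secret: int, n: int) -> list[tuple[int, int]]:
    """Split `secret` into n shares; any 2 of them reconstruct it."""
    slope = secrets.randbelow(PRIME)  # random degree-1 polynomial f(x) = secret + slope*x
    return [(x, (secret + slope * x) % PRIME) for x in range(1, n + 1)]

def reconstruct(s1: tuple[int, int], s2: tuple[int, int]) -> int:
    """Recover f(0) = secret from two points by Lagrange interpolation."""
    (x1, y1), (x2, y2) = s1, s2
    # f(0) = (y1*x2 - y2*x1) / (x2 - x1)  (mod PRIME)
    inv = pow(x2 - x1, -1, PRIME)
    return (y1 * x2 - y2 * x1) * inv % PRIME
```

In a full threshold design the shares would be held by different parties (or attached to different disclosures), so that the matching service alone can never decrypt a lone tip.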

What we’ve built is infrastructure: components that can be adopted, forked, or extended by the communities that need them most. For example, if you are: 

  • A newsroom looking for new, safe source verification methods; 
  • A labor and/or advocacy org supporting collective disclosure; 
  • A researcher working on privacy, anonymity, or digital identity; 
  • A platform or toolmaker looking to add privacy-preserving verification to your products; 

and want to learn how to use these tools, check out our GitHub (#1, #2, #3) or reach out to us at asml@cyber.harvard.edu.

Whistleblowing is critical to enabling public accountability

Organizational transparency is essential to public interest tech work: it helps us hold powerful companies accountable, protect against harmful technologies, and build new and better systems that are informed by the shortcomings of the old. Whistleblowers are critical sources of transparency: per the Signals Network’s 2024 report, “over the past decade, the biggest revelations about Big Tech’s activities have come not from the companies, or even civil society or governments. Instead, they have come from Big Tech’s own employees.” 

We note that whistleblowing is just one example of why selective, secure identity disclosure is important. We all suffer on an internet where we have so little control over how we prove our digital identities: where joining a social network can require handing over your biometric data, and where bringing just one piece of your identity from one platform to the next can create a sweeping digital footprint. Our prototypes for Psst point to a future in which individuals, not platforms or employers, hold the keys to their own identity data and use those keys on their own terms. These prototypes are a small piece of our effort to redesign digital identity with users’ privacy, safety, and agency at the core.


Lara (she/her) is a product manager for the Applied Social Media Lab’s discourse pod. She’s working on tools that address barriers to discourse around trust, digital identity, privacy, and algorithmic...