Signal has taken an important step, enabling its “Screen security” setting by default on Windows 11 to prevent Recall from recording the app’s contents.
Signal is widely regarded as the most secure mainstream messaging platform, giving users true end-to-end encryption (E2EE) to protect their communications. Unfortunately, screenshots can circumvent that security, creating a record of conversations that can be shared outside the app.
Windows 11’s new Recall feature is particularly worrying in this context, since it takes screenshots of everything the user is doing, converts those screenshots into text-based data, and stores that data in a database that can be searched via natural-language prompts. While potentially convenient, such a feature is a security nightmare.
Microsoft has worked to improve Recall’s security, repeatedly delaying its rollout to address concerns. Despite those efforts, the feature still has disturbing failings, as security researcher Kevin Beaumont pointed out. After saying the original version of Recall would “set cybersecurity back a decade & endanger customers,” Beaumont examined Microsoft’s revamped official release.
“The feature to filter sensitive data doesn’t appear to work reliably, across multiple devices from testing,” Beaumont found.
“For example, I updated my credit card in Microsoft’s own account interface, and Recall recorded it.
“In this snapshot I’d typed an invalid credit card number, but it also captured the valid card number. It indexed both, and both were findable under “credit card” in Recall search. It captured and indexed the CVV, too.
“It’s unclear why Recall saved this — possibly because I use Vivaldi as a web browser? Either way — I’d assumed it wasn’t saving this as sensitive information filter was on… but it just didn’t work reliably for me. In some cases, great. In other cases, I was surprised by what it captured. You basically need to be careful to review what Recall is recording, which is difficult when it records everything you do. The best advice I can give is pause Recall before shopping online to ensure it isn’t recording, then reenable it afterwards.”
Signal’s Solution
Signal has decided to take matters into its own hands, disabling Windows 11’s ability to screenshot the app by default.
If you’re wondering why we’re only implementing this on Windows right now, it’s because the purpose of this setting is to protect your Signal messages from Microsoft Recall.
Although Microsoft made several adjustments over the past twelve months in response to critical feedback, the revamped version of Recall still places any content that’s displayed within privacy-preserving apps like Signal at risk. As a result, we are enabling an extra layer of protection by default on Windows 11 in order to help maintain the security of Signal Desktop on that platform even though it introduces some usability trade-offs. Microsoft has simply given us no other option.
Signal says users will be left with a blank screenshot, similar to what happens when trying to screenshot some copyrighted material in a web browser.
If you attempt to take a screenshot of Signal Desktop when screen security is enabled, nothing will appear. This limitation can be frustrating, but it might look familiar to you if you’ve ever had the audacity to try and take a screenshot of a movie or TV show on Windows. According to Microsoft’s official developer documentation, setting the correct Digital Rights Management (DRM) flag on the application window will ensure that “content won’t show up in Recall or any other screenshot application.” So that’s exactly what Signal Desktop is now doing on Windows 11 by default.
As Signal points out, the issue could be more effectively resolved if Microsoft offered granular controls, giving app developers the ability to block Recall. Since Microsoft doesn’t do this, Signal has to take the more dramatic step of blocking screenshots of the app altogether.
We hope that the AI teams building systems like Recall will think through these implications more carefully in the future. Apps like Signal shouldn’t have to implement “one weird trick” in order to maintain the privacy and integrity of their services without proper developer tools. People who care about privacy shouldn’t be forced to sacrifice accessibility upon the altar of AI aspirations either.
Recall Underscores a Larger Problem With AI and Microsoft
Issues With AI
Recall is, in many ways, the poster child for what is wrong with some AI applications and features. While the feature is certainly impressive, and makes it much easier to find data on a computer, Microsoft’s repeated missteps in deploying it highlight the security issues involved.
Just because AI can be used to do something doesn’t always mean it should be. What’s more, when AI is used to add or augment a feature, security needs to be a primary consideration from day one, not an afterthought bolted on later, as appears to be the case with Recall.
Issues With Microsoft’s Security Culture
Recall’s troubles also point to problems with Microsoft’s security culture. The company had a rough 2024, suffering multiple security incidents and drawing the ire of lawmakers over its lax security posture. Microsoft has long been accused of prioritizing ease of use, new features, and rushing products out to compete with rivals, all at the expense of building those products from the ground up as securely as possible.
As any developer will point out, it’s often far more difficult to secure a product after it’s released than it is to build that security in from the outset. Recall is an example of that. Microsoft failed to properly account for the security implications of a Recall-like feature, had to postpone Recall’s launch repeatedly, and still failed to fully address the issues in the final product.
Either Microsoft was in such a hurry to deploy a major AI feature on Windows that it rushed development and ignored the potential security issues, or it didn’t fully understand the security implications of Recall, an even more troubling possibility.
Signal’s response to Microsoft Recall is just the latest example of the challenges and risks that AI brings to developers and users alike.