4 Windows 11 features you must disable if you care about your privacy
Windows 11 has always been a minefield to navigate when it comes to data privacy. Microsoft's operating system is the most popular desktop OS in the world, holding about 67% of the market share as of 2026. That means its inherent privacy drawbacks aren't a niche concern; they affect the vast majority of the computing world as we know it.
While Windows offers some powerful features, many intrusive ones operate on an "opt-out" rather than an "opt-in" basis: they are enabled by default unless you request otherwise, with their data-harvesting settings buried under layers of menus and sub-menus. If you want to reclaim some of your digital sovereignty, consider disabling these four features.
Windows Telemetry
It can track your hardware IDs and app usage
There is a reason why, when you search for "Windows Telemetry" on any search engine, the first ten or twenty results are tutorials on how to disable the service. The service isn't looked upon favorably by most average users, let alone privacy-conscious ones, and for good reason.
While Microsoft claims in its official documentation that "required" diagnostic data is the minimum level of data needed to keep your device reliable and secure, it never explicitly defines what that "minimum" encompasses. In fact, the documentation later states that specific items are "subject to change to give Microsoft flexibility to collect data needed for the purposes described." Microsoft also explicitly states that its "optional" diagnostic data may include information about the websites users browse, which it classifies as "usage" data. For privacy-focused users, terms like these should be met with an automatic denial.
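If you want to dial diagnostic data down to its floor, the documented Group Policy registry value is `AllowTelemetry` under the `DataCollection` key. The sketch below assumes a Pro, Enterprise, or Education edition where the policy is honored (on Pro, the lowest setting of 0 silently falls back to 1, "Required"); run it from an elevated PowerShell prompt.

```shell
# Set diagnostic data to the lowest level the edition allows
# (0 = Security, honored only on Enterprise/Education; Pro treats it as 1 = Required).
New-Item -Path "HKLM:\SOFTWARE\Policies\Microsoft\Windows\DataCollection" -Force | Out-Null
Set-ItemProperty -Path "HKLM:\SOFTWARE\Policies\Microsoft\Windows\DataCollection" `
    -Name "AllowTelemetry" -Type DWord -Value 0

# Optionally stop and disable the "Connected User Experiences and Telemetry" service
Stop-Service -Name "DiagTrack" -Force
Set-Service -Name "DiagTrack" -StartupType Disabled
```

This mirrors what the Settings toggle does, but at the policy level, so a feature update is less likely to quietly flip it back.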
Windows Recall
It has been compromised before, and it will be compromised again
Windows Recall tops the list of features that no user ever requested in an operating system. The feature is designed to take snapshots of your screen every few seconds, ostensibly to help you "find things" you remember seeing but can't recall where. In principle, it addresses a genuine problem and tries to alleviate some of the user's cognitive load, but it also creates a searchable, chronological record of your digital activity.
Microsoft has made many attempts to get users on board with this feature after the massive backlash it faced at launch. In 2024, Microsoft addressed some of the privacy and security concerns raised by users and overhauled the security architecture of Recall, protecting its snapshots with VBS Enclaves.
This overhaul was a direct response to the infamous "TotalRecall" exploit first detailed by Wired in mid-2024. At the time, offensive security expert Alex Hagenah, the same researcher behind the current 2026 bypass, released a proof-of-concept tool demonstrating that the original iteration of Recall stored every screenshot and OCR-processed word in an unencrypted SQLite database, leaving them trivially accessible to malware. Unless you "recall" asking for spyware, it's best not to use this feature.
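On Copilot+ builds that ship Recall, it can be removed as an optional Windows feature and blocked by policy. A hedged sketch, assuming the documented DISM feature name `Recall` and the `DisableAIDataAnalysis` policy value apply to your build; run both from an elevated prompt:

```shell
# Remove the Recall optional feature entirely
Dism /Online /Disable-Feature /Featurename:Recall

# Belt and suspenders: set the policy that stops snapshot saving for this user
reg add "HKCU\Software\Policies\Microsoft\Windows\WindowsAI" /v DisableAIDataAnalysis /t REG_DWORD /d 1 /f
```

The DISM route removes the binaries rather than merely pausing snapshots, which is the safer posture if you never intend to use the feature.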
Forced cloud syncing via OneDrive
You didn't buy a 2TB SSD for nothing, did you?
Windows 11 has been trying to persuade users to buy into its "cloud-by-default" architecture for years now. In recent builds, Microsoft has made it arduous to keep user files strictly local, and that's why the Word document you "saved" the night before turns up, after a painstaking search across your drive, on your OneDrive desktop instead of... the actual desktop on your PC.
The reason for this can be traced back to the installation. During setup, the OS defaults to syncing your Desktop, Pictures and Documents folders to OneDrive. It's certainly annoying, but what's concerning is the fact that once your data is synced to OneDrive, it falls under the scope of Microsoft's Services Agreement.
This gives law enforcement agencies a legal avenue to your data. Microsoft's own transparency reports confirm that it complies with thousands of law enforcement requests annually, and in multiple jurisdictions, agencies can subpoena your cloud data directly from Microsoft.
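You can unlink your account from OneDrive's own settings, but on editions that honor policies you can also switch file sync off wholesale. A sketch, assuming the documented `DisableFileSyncNGSC` policy value ("Prevent the usage of OneDrive for file storage") and the standard `winget` package ID; run from an elevated prompt:

```shell
# Block OneDrive file sync via the documented Group Policy registry value
reg add "HKLM\SOFTWARE\Policies\Microsoft\Windows\OneDrive" /v DisableFileSyncNGSC /t REG_DWORD /d 1 /f

# Or remove the OneDrive client altogether
winget uninstall Microsoft.OneDrive
```

Before either step, move anything living only in the cloud back to a local folder, since unsynced "online-only" placeholder files won't survive the unlink.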
Inking & Typing Personalization
It's literally a keylogger, and it's not even in disguise
If you found a physical device or a background service that recorded everything you typed and sent it to a third party, you'd call it a security breach. When Microsoft does it, they call it "Inking & typing personalization."
The documentation states that "Microsoft will collect samples of content you type or write," adding that the content is divided into samples stripped of any unique identifiers or sequencing information that could be used to associate the input with a user. Whether that is enough for a user to trust Microsoft's data processing methods is a whole different question.
The problem is that, by sending these samples to the cloud, Microsoft is introducing a vulnerability similar to the one it created with Recall. This level of function creep, anonymization promises notwithstanding, should concern anyone who wants their keyboard to function only as a method of text input, not as a cloud service that guesstimates what comes next.
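Beyond flipping the toggle under Settings > Privacy & security > Inking & typing personalization, the setting maps to a handful of per-user registry values. A sketch, assuming the commonly documented value names (`TIPC\Enabled` and the `InputPersonalization` restrictions) still apply to current builds; no elevation needed since these live under HKCU:

```shell
# Turn off inking & typing personalization for the current user
reg add "HKCU\Software\Microsoft\Input\TIPC" /v Enabled /t REG_DWORD /d 0 /f
reg add "HKCU\Software\Microsoft\InputPersonalization" /v RestrictImplicitInkCollection /t REG_DWORD /d 1 /f
reg add "HKCU\Software\Microsoft\InputPersonalization" /v RestrictImplicitTextCollection /t REG_DWORD /d 1 /f
```

Because these are per-user values, repeat them for every account on the machine that you want covered.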
No user should pay for a feature with their data, especially if they're features they never asked for in the first place.
Microsoft desperately needs more transparency surrounding its features
Windows 11's privacy pitfalls have become serious enough to make reconsidering the platform altogether worthwhile, and they're one of the reasons users cite when migrating to Linux. Although Microsoft has made consistent efforts to safeguard user privacy over the past few years, it must adopt genuine opt-in consent and plain-language disclosures so that users can make informed decisions about the data they're trading away.