### 2.4 Limitations
1. **Application Data Isolation**
In the past, we have made [application data isolation](https://2019.www.torproject.org/projects/torbrowser/design/#app-data-isolation) an explicit goal, whereby all evidence of the existence of Tor Browser usage can be removed via secure deletion of the installation folder.
This is not generally achievable, and we should not pretend that it is.
Even if we somehow knew with certainty every condition under which the operating system leaks session data to disk, we would still face a problem with no viable solution.
The operating system necessarily runs at a higher level of privilege and power than Tor Browser does, meaning it can store data in places the browser process does not have access to.
For example, on Windows it is a common design pattern for user-space platform API calls to be routed to and implemented in service processes running as `SYSTEM` or `Administrator` (which are similar to root on Linux) via RPC mechanisms.
These services can write whatever they like to the `HKLM` registry hive, whereas the browser cannot; it does not have the required privileges.
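The privilege gap described above can be illustrated with a minimal sketch, assuming Python's standard `winreg` module; the key name is purely hypothetical. A non-elevated process attempting to create a key under `HKEY_LOCAL_MACHINE` is denied, which is exactly why a browser-level process could not scrub data that a privileged service wrote there.

```python
import sys

def can_write_hklm(subkey=r"SOFTWARE\HypotheticalLeakTest"):
    """Try to create (and then remove) a key under HKLM.

    Returns True or False on Windows, or None on other platforms,
    where the Windows registry does not exist.
    """
    if sys.platform != "win32":
        return None
    import winreg  # standard library, Windows-only

    try:
        handle = winreg.CreateKey(winreg.HKEY_LOCAL_MACHINE, subkey)
    except PermissionError:
        # A non-elevated process (like the browser) is denied write
        # access, so it also cannot *remove* data that a SYSTEM
        # service leaked into this hive.
        return False
    winreg.CloseKey(handle)
    winreg.DeleteKey(winreg.HKEY_LOCAL_MACHINE, subkey)
    return True
```

Run as a regular user this returns `False`; only an elevated (Administrator) process succeeds, which is precisely the privilege the browser should never hold.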
To hypothetically solve this problem in the general case, we would need to modify the browser to either work around any data-leaking external API calls or implement cleanup functionality for each platform to wipe the offending data from disk.
Some of this cleanup would necessarily require elevated privileges (e.g. Admin or root) to clean up leaks made by the operating system itself, which goes against our principle of least privilege.
We would also need continual audits, for each supported operating system, CPU architecture, major+minor version, and hardware-vendor customisation, to identify all of the conditions under which the operating system itself leaks information about the user's browsing session, plus testing infrastructure to catch regressions on each of these platforms.
In order to remove a leak to `HKLM` or other similar data-stores, the browser *itself* would need the capability to elevate itself to the same level of permissions as the process which leaked in the first place.
Such a capability would work directly against our [least privilege](#44-least-privilege) security requirement, and could negate the browser sandboxing efforts of the past few decades which (mostly) ensure 0-day exploits do not take over your system while you watch cat videos on the internet.
Practically speaking, it is not possible to provide this functionality with a level of confidence required for cases where physical access is a concern.
The majority of deployed Tor Browser installs run on platforms which either explicitly disrespect user agency and privacy (for-profit platforms such as Android, macOS, and Windows) or whose threat model may be less extreme than that of some of our users (the various flavours of Linux and BSD).
Users whose threat model *does* include the need to hide evidence of their usage of Tor Browser should use Tor Browser with the [Tails operating system](https://tails.net/).
Tails is a purpose-built Linux-based operating system which is ephemeral by default, and also supports full-disk encryption for optional persistent storage if needed.
It essentially provides whole operating system level data isolation to its users with a level of confidence unachievable for Tor Browser on its own.
## 3. Adversary Model
The browser's adversaries have a number of possible goals, capabilities, and attack types that can be used to illustrate the design requirements for the browser.