@@ -82,7 +82,7 @@ This document describes the [adversary model](#3-adversary-model), [design requi
 This document is also meant to serve as a set of design requirements and to describe a reference implementation of a Private Browsing Mode that defends against active network adversaries, in addition to the passive forensic local adversary currently addressed by the major browsers.
-For more practical information regarding Tor Browser development, please consult the [Tor Browser Hacking Guide](https://trac.torproject.org/projects/tor/wiki/doc/TorBrowser/Hacking).
+For more practical information regarding Tor Browser development, please consult the [Tor Browser Hacking Guide](https://gitlab.torproject.org/tpo/applications/tor-browser/-/wikis/Hacking).
 ### 1.1. Browser Component Overview
...
@@ -165,7 +165,7 @@ In addition to the above design requirements, the technology decisions about Tor
 4.**Minimize Global Privacy Options**
-[Another failure of Torbutton](https://trac.torproject.org/projects/tor/ticket/3100) was the options panel. Each option that detectably alters browser behavior can be used as a fingerprinting tool. Similarly, all extensions [should be disabled in the mode](https://blog.chromium.org/2010/06/extensions-in-incognito.html) except as an opt-in basis. We should not load system-wide and/or operating system provided addons or plugins.
+[Another failure of Torbutton](https://gitlab.torproject.org/legacy/trac/-/issues/3100) was the options panel. Each option that detectably alters browser behavior can be used as a fingerprinting tool. Similarly, all extensions [should be disabled in the mode](https://blog.chromium.org/2010/06/extensions-in-incognito.html) except on an opt-in basis. We should not load system-wide or operating-system-provided addons or plugins.
 Instead of global browser privacy options, privacy decisions should be made [per URL bar origin](https://wiki.mozilla.org/Privacy/Features/Site-based_data_management_UI) to eliminate the possibility of linkability between domains. For example, when a plugin object (or a JavaScript access of window.plugins) is present in a page, the user should be given the choice of allowing that plugin object for that URL bar origin only. The same goes for exemptions to the third-party cookie policy, geolocation, and any other privacy permissions.
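The per-origin decision model above can be sketched as a small permission store (hypothetical code, not Torbutton or Firefox internals); the key point is that grants are keyed by the URL bar origin, so there is no global option state to fingerprint or to link across domains:

```python
# Hypothetical sketch of per-URL-bar-origin privacy permissions; the class
# and method names are illustrative, not a Firefox API.
class OriginPermissions:
    def __init__(self):
        self._grants = set()  # {(first_party_origin, permission)}

    def grant(self, first_party_origin, permission):
        # e.g. permission = "plugin", "third-party-cookies", "geolocation"
        self._grants.add((first_party_origin, permission))

    def is_allowed(self, first_party_origin, permission):
        # No global default: a grant on one first party never leaks to another.
        return (first_party_origin, permission) in self._grants
```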
...
@@ -305,7 +305,7 @@ The adversary can perform the following attacks from a number of different posit
 The Implementation section is divided into subsections, each of which corresponds to a [Design Requirement](#2-design-requirements-and-philosophy). Each subsection is divided into specific web technologies or properties. The implementation is then described for that property.
-In some cases, the implementation meets the design requirements in a non-ideal way (for example, by disabling features). In rare cases, there may be no implementation at all. Both of these cases are denoted by differentiating between the **Design Goal** and the **Implementation Status** for each property. Corresponding bugs in the [Tor bug tracker](https://trac.torproject.org/projects/tor/report) are typically linked for these cases.
+In some cases, the implementation meets the design requirements in a non-ideal way (for example, by disabling features). In rare cases, there may be no implementation at all. Both of these cases are denoted by differentiating between the **Design Goal** and the **Implementation Status** for each property. Corresponding bugs in the [Tor Browser issue tracker](https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues) are typically linked for these cases.
 ### 4.1. Proxy Obedience
...
@@ -359,7 +359,7 @@ Tor Browser State is separated from existing browser state through use of a cust
 As an additional defense-in-depth measure, we set `browser.cache.disk.enable`, `browser.cache.offline.enable`, `signon.rememberSignons`, and `browser.formfill.enable` to **false**, `browser.download.manager.retention` to **1**, and both `browser.sessionstore.privacy_level` and `network.cookie.lifetimePolicy` to **2**. Many of these preferences are likely redundant with `browser.privatebrowsing.autostart` enabled, but we have not done the auditing work to ensure that yet.
-For more details on disk leak bugs and enhancements, see the [tbb-disk-leak tag in our bugtracker](https://trac.torproject.org/projects/tor/query?keywords=~tbb-disk-leak&status=!closed).
+For more details on disk leak bugs and enhancements, see the [Disk Leak](https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/?sort=updated_desc&state=opened&label_name%5B%5D=Disk%20Leak&first_page_size=20) label in our issue tracker.
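For reference, this subsection's defense-in-depth preference set can be summarized as a name-to-value map. This is a sketch of the stated intent (disk-backed storage off, download history kept minimal, state cleared with the session), not a verbatim excerpt of Tor Browser's prefs file:

```python
# Sketch of the disk-avoidance preference set; values express the intent
# described in the text (disk-backed features disabled), and are not a
# verbatim copy of Tor Browser's actual prefs.js.
DISK_AVOIDANCE_PREFS = {
    "browser.privatebrowsing.autostart": True,   # always-on Private Browsing
    "browser.cache.disk.enable": False,          # no disk cache
    "browser.cache.offline.enable": False,       # no offline app cache
    "signon.rememberSignons": False,             # no saved passwords
    "browser.formfill.enable": False,            # no form-fill history
    "browser.download.manager.retention": 1,     # minimal download history
    "browser.sessionstore.privacy_level": 2,     # never store session data
    "network.cookie.lifetimePolicy": 2,          # cookies expire with session
}
```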
 ### 4.4. Application Data Isolation
...
@@ -386,7 +386,7 @@ Unfortunately, many aspects of browser state can serve as identifier storage, an
 **Design Goal**: All cookies MUST be double-keyed to the URL bar origin and third-party origin.
-**Implementation Status**: Double-keying cookies should just work by setting `privacy.firstparty.isolate` to **true**. However, [we have not audited that](https://trac.torproject.org/projects/tor/ticket/21905) yet and there is still the [UI part missing for managing cookies in Private Browsing Mode](https://trac.torproject.org/projects/tor/ticket/10353). We therefore opted to keep third-party cookies disabled for now by setting `network.cookie.cookieBehavior` to **1**.
+**Implementation Status**: Double-keying cookies should just work by setting `privacy.firstparty.isolate` to **true**. However, [we have not audited that](https://gitlab.torproject.org/legacy/trac/-/issues/21905) yet, and the [UI for managing cookies in Private Browsing Mode is still missing](https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/10353). We therefore opted to keep third-party cookies disabled for now by setting `network.cookie.cookieBehavior` to **1**.
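The double-keying requirement can be illustrated with a toy cookie jar (illustrative code, not Gecko's cookie service): each cookie is stored under the pair (URL bar origin, cookie origin), so a third-party cookie set while visiting one site is invisible when the same third party is embedded elsewhere.

```python
# Toy model of double-keyed cookie storage; not Firefox's implementation.
class DoubleKeyedCookieJar:
    def __init__(self):
        self._jar = {}  # {(first_party, cookie_origin): {name: value}}

    def set_cookie(self, first_party, cookie_origin, name, value):
        self._jar.setdefault((first_party, cookie_origin), {})[name] = value

    def get_cookie(self, first_party, cookie_origin, name):
        # The same tracker origin under a different first party sees nothing,
        # so the cookie cannot link the two visits.
        return self._jar.get((first_party, cookie_origin), {}).get(name)
```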
 2.**Cache**
...
@@ -414,13 +414,13 @@ Unfortunately, many aspects of browser state can serve as identifier storage, an
 **Design Goal**: Users should be able to click-to-play flash objects from trusted sites. To make this behavior unlinkable, we wish to include a settings file for all platforms that disables flash cookies using the [Flash settings manager](https://www.macromedia.com/support/documentation/en/flashplayer/help/settings_manager03.html).
-**Implementation Status**: We are currently [having difficulties](https://trac.torproject.org/projects/tor/ticket/3974) causing Flash player to use this settings file on Windows, so Flash remains difficult to enable.
+**Implementation Status**: We are currently [having difficulties](https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/3974) causing Flash player to use this settings file on Windows, so Flash remains difficult to enable.
 7.**SSL+TLS session resumption**
 **Design Goal**: TLS session resumption tickets and SSL Session IDs MUST be limited to the URL bar domain.
-Implementation Status: We disable TLS Session Tickets and SSL Session IDs by setting `security.ssl.disable_session_identifiers` to **true**. To compensate for the increased round trip latency from disabling these performance optimizations, we also enable [TLS False Start](https://tools.ietf.org/html/draft-bmoeller-tls-falsestart-00) via the Firefox Pref `security.ssl.enable_false_start`. However, URL bar domain isolation should be working both for session tickets and session IDs but we [have not verified that yet](https://trac.torproject.org/projects/tor/ticket/17252).
+**Implementation Status**: We disable TLS Session Tickets and SSL Session IDs by setting `security.ssl.disable_session_identifiers` to **true**. To compensate for the increased round trip latency from disabling these performance optimizations, we also enable [TLS False Start](https://tools.ietf.org/html/draft-bmoeller-tls-falsestart-00) via the Firefox Pref `security.ssl.enable_false_start`. However, URL bar domain isolation should already cover both session tickets and session IDs, but we [have not verified that yet](https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/17252).
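The intended isolation for resumption state can be pictured as a session cache keyed by the URL bar domain in addition to the server (an assumed model for illustration; the real state lives inside the TLS library):

```python
# Assumed model of URL-bar-domain-isolated TLS resumption state; the actual
# implementation is in NSS/Gecko, not in code like this.
class IsolatedSessionCache:
    def __init__(self):
        self._sessions = {}

    def store(self, url_bar_domain, server, ticket):
        self._sessions[(url_bar_domain, server)] = ticket

    def resume(self, url_bar_domain, server):
        # A ticket learned on one first party cannot be replayed, and thus
        # linked, when the same server is contacted from another first party.
        return self._sessions.get((url_bar_domain, server))
```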
 8.**Tor circuit and HTTP connection linkability**
...
@@ -450,7 +450,7 @@ Unfortunately, many aspects of browser state can serve as identifier storage, an
 Non-automated redirect chains that require user input at some step (such as federated login systems) SHOULD still allow identifiers to persist.
-**Implementation status**: There are numerous ways for the user to be redirected, and the Firefox API support to detect each of them is poor. We have a [trac bug open](https://trac.torproject.org/projects/tor/ticket/3600) to implement what we can.
+**Implementation status**: There are numerous ways for the user to be redirected, and the Firefox API support to detect each of them is poor. We have an [open issue](https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/40787) to implement what we can.
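The persistence rule in the design goal above reduces to a tiny predicate (a hypothetical helper, since, as noted, the Firefox APIs for detecting each redirect type are poor): identifiers survive a redirect chain only if some hop in it required user input.

```python
# Hypothetical policy check; the hop dicts and `user_interaction` flag are
# illustrative, not a real Firefox data structure.
def identifiers_may_persist(redirect_chain):
    """redirect_chain: list of hops, each {'url': ..., 'user_interaction': bool}."""
    # Fully automated chains (no user input at any hop) must not carry state.
    return any(hop["user_interaction"] for hop in redirect_chain)
```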
 13.**window.name**
...
@@ -496,7 +496,7 @@ Unfortunately, many aspects of browser state can serve as identifier storage, an
 Firefox provides the feature to [connect speculatively](https://www.igvita.com/2015/08/17/eliminating-roundtrips-with-preconnect/) to remote hosts if that is either indicated in the HTML file (e.g. by [link rel="preconnect" and rel="prefetch"](https://w3c.github.io/resource-hints/)) or otherwise deemed beneficial.
-Firefox does not support rel="prerender", and Mozilla has disabled speculative connections and rel="preconnect" usage where a proxy is used (see [comment 3 in bug 18762](https://trac.torproject.org/projects/tor/ticket/18762#comment:3) for further details). Explicit prefetching via the rel="prefetch" attribute is still performed, however.
+Firefox does not support rel="prerender", and Mozilla has disabled speculative connections and rel="preconnect" usage where a proxy is used (see [comment 3 in issue tor-browser#18762](https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/18762#note_2630269) for further details). Explicit prefetching via the rel="prefetch" attribute is still performed, however.
 All pre-loaded links and speculative connections MUST be isolated to the URL bar domain, if enabled. This includes isolating both Tor circuit use and the caching and associated browser state for the prefetched resource.
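Circuit isolation for such requests follows the same first-party keying Tor Browser uses elsewhere: the URL bar domain is fed into the SOCKS proxy credentials so that tor's stream isolation (IsolateSOCKSAuth) places different first parties on different circuits. A minimal sketch (the function and username format are illustrative, not the actual Torbutton code):

```python
# Illustrative sketch of first-party stream isolation via SOCKS credentials.
# With tor's IsolateSOCKSAuth, streams carrying different credentials are
# kept on different circuits; equal credentials may share a circuit.
def socks_isolation_key(url_bar_domain, session_nonce="0"):
    # The username encodes the first party; a nonce lets the browser rotate
    # circuits (e.g. on "New Identity") without changing the scheme.
    return ("torbrowser--" + url_bar_domain, session_nonce)
```

Prefetched and preconnected resources must be tagged with the same key as the page that requested them, or the speculative connection itself becomes a cross-site link.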
...
@@ -510,7 +510,7 @@ Unfortunately, many aspects of browser state can serve as identifier storage, an
 **Implementation Status**: This functionality is provided by setting `privacy.firstparty.isolate` to **true**.
-For more details on identifier linkability bugs and enhancements, see the [tbb-linkability tag in our bugtracker](https://trac.torproject.org/projects/tor/query?keywords=~tbb-linkability&status=!closed).
+For more details on identifier linkability bugs and enhancements, see the [Linkability](https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/?sort=updated_desc&state=opened&label_name%5B%5D=Linkability&first_page_size=20) label in our issue tracker.
@@ -614,7 +614,7 @@ Where our actual implementation differs from an ideal solution, we separately de
 Plugins add to fingerprinting risk via two main vectors: their mere presence in `window.navigator.plugins` (because they are optional, end-user installed third party software), as well as their internal functionality.
-**Design Goal**: All plugins that have not been specifically audited or sandboxed MUST be disabled. To reduce linkability potential, even sandboxed plugins SHOULD NOT be allowed to load objects until the user has clicked through a click-to-play barrier. Additionally, version information SHOULD be reduced or obfuscated until the plugin object is loaded. For Flash, we wish to [provide a settings.sol](https://trac.torproject.org/projects/tor/ticket/3974) file to disable Flash cookies, and to restrict P2P features that are likely to bypass proxy settings. We'd also like to restrict access to fonts and other system information (such as IP address and MAC address) in such a sandbox.
+**Design Goal**: All plugins that have not been specifically audited or sandboxed MUST be disabled. To reduce linkability potential, even sandboxed plugins SHOULD NOT be allowed to load objects until the user has clicked through a click-to-play barrier. Additionally, version information SHOULD be reduced or obfuscated until the plugin object is loaded. For Flash, we wish to [provide a settings.sol](https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/3974) file to disable Flash cookies, and to restrict P2P features that are likely to bypass proxy settings. We'd also like to restrict access to fonts and other system information (such as IP address and MAC address) in such a sandbox.
 **Implementation Status**: Currently, we entirely disable all plugins in Tor Browser. However, as a compromise due to the popularity of Flash, we allow users to re-enable it; Flash objects are then blocked behind a click-to-play barrier that is available only after the user has specifically enabled plugins. Flash is the only plugin available; the rest are entirely blocked from loading by the Firefox patches mentioned in the [Proxy Obedience section](#41-proxy-obedience). We also set the Firefox preference `plugin.expose_full_path` to **false**, to avoid leaking plugin installation information.
...
@@ -650,13 +650,13 @@ Where our actual implementation differs from an ideal solution, we separately de
 For Windows and macOS we use a preference, `font.system.whitelist`, to restrict the fonts being used to those in the whitelist. This functionality is provided by setting `privacy.resistFingerprinting` to **true**. The whitelist for Windows and macOS contains both a set of [Noto fonts](https://www.google.com/get/noto) which we bundle and fonts provided by the operating system. For Linux systems we only bundle fonts and [deploy](https://gitlab.torproject.org/tpo/applications/tor-browser-build/-/blob/main/projects/browser/Bundle-Data/linux/Data/fontconfig/fonts.conf) a fonts.conf file to restrict the browser to those fonts exclusively. In addition, we set the `font.name.*` preferences for macOS and Linux to make sure that a given code point is always displayed with the same font. This is not guaranteed even if we bundle all the fonts Tor Browser uses, as fonts may be loaded in a different order on different systems. Setting the above-mentioned preferences works around this issue by specifying the font to use explicitly.
-Allowing fonts provided by the operating system for Windows and macOS users is currently a compromise between fingerprintability resistance and usability concerns. We are still investigating the right balance between them and have created a [ticket in our bug tracker](https://trac.torproject.org/projects/tor/ticket/18097) to summarize the current state of our defense and future work that remains to be done.
+Allowing fonts provided by the operating system for Windows and macOS users is currently a compromise between fingerprintability resistance and usability concerns. We are still investigating the right balance between them and have created a [ticket in our issue tracker](https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/18097) to summarize the current state of our defense and future work that remains to be done.
 7.**Monitor, Widget, and OS Desktop Resolution**
 Both CSS and JavaScript have access to a lot of information about the screen resolution, usable desktop size, OS widget size, toolbar size, title bar size, and OS desktop widget sizing information that are not at all relevant to rendering and serve only to provide information for fingerprinting. Since many aspects of desktop widget positioning and size are user configurable, these properties yield customized information about the computer, even beyond the monitor size.
-**Design Goal**: Our design goal here is to reduce the resolution information down to the bare minimum required for properly rendering inside a content window. We intend to report all rendering information correctly with respect to the size and properties of the content window, but report an effective size of 0 for all border material, and also report that the desktop is only as big as the inner content window. Additionally, new browser windows are sized such that their content windows are one of a few fixed sizes based on the user's desktop resolution. In addition, to further reduce resolution-based fingerprinting, we are [investigating zoom/viewport-based mechanisms](https://trac.torproject.org/projects/tor/ticket/7256) that might allow us to always report the same desktop resolution regardless of the actual size of the content window, and simply scale to make up the difference. As an alternative to zoom-based solutions we are testing a [different approach](https://trac.torproject.org/projects/tor/ticket/14429) in our alpha series that tries to round the browser window at all times to a multiple 200x100 pixels. Regardless which solution we finally pick, until it will be available the user should also be informed that maximizing their windows can lead to fingerprintability under the current scheme.
+**Design Goal**: Our design goal here is to reduce the resolution information down to the bare minimum required for properly rendering inside a content window. We intend to report all rendering information correctly with respect to the size and properties of the content window, but report an effective size of 0 for all border material, and also report that the desktop is only as big as the inner content window. Additionally, new browser windows are sized such that their content windows are one of a few fixed sizes based on the user's desktop resolution. In addition, to further reduce resolution-based fingerprinting, we are [investigating zoom/viewport-based mechanisms](https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/7256) that might allow us to always report the same desktop resolution regardless of the actual size of the content window, and simply scale to make up the difference. As an alternative to zoom-based solutions, we are testing a [different approach](https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/14429) in our alpha series that tries to round the browser window at all times to a multiple of 200x100 pixels. Regardless of which solution we finally pick, until it is available the user should also be informed that maximizing their windows can lead to fingerprintability under the current scheme.
 **Implementation Status**: We automatically resize new browser windows to a 200x100 pixel multiple based on desktop resolution by backporting patches from [bug 1330882](https://bugzilla.mozilla.org/show_bug.cgi?id=1330882) and setting `privacy.resistFingerprinting` to **true**. To minimize the effect of the long tail of large monitor sizes, we also cap the window size at 1000 pixels in each direction. The same preference makes window.screen report the client content window size, reports a `window.devicePixelRatio` of 1.0, and returns content-window-relative points for DOM events. We also force popups to open in new tabs (via `browser.link.open_newwindow.restriction`), to avoid full-screen popups inferring information about the browser resolution. In addition, we prevent auto-maximizing on browser start, and inform users that maximized windows are detrimental to privacy in this mode.
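The sizing rule above amounts to rounding each dimension of the content area down to its step and capping it (a simplified sketch; the exact rules in the Firefox patches may differ at edge cases such as very small screens):

```python
# Simplified sketch of Tor Browser's new-window sizing: round the content
# area down to a multiple of 200x100 pixels, capping each dimension at 1000.
def rounded_window_size(avail_width, avail_height):
    width = min(avail_width - avail_width % 200, 1000)
    height = min(avail_height - avail_height % 100, 1000)
    return width, height
```

For example, a 1366x768 desktop yields a 1000x700 content window, shared with every other user on a similarly sized screen.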
...
@@ -698,7 +698,7 @@ Where our actual implementation differs from an ideal solution, we separately de
 15.**System Uptime**
-It is possible to get the system uptime of a Tor Browser user by querying the **Event.timestamp** property. We avoid this by setting `dom.event.highrestimestamp.enabled` to **true**. This might seem to be counterintuitive at first glance but the effect of setting that preference to true is a [normalization](https://trac.torproject.org/projects/tor/ticket/17046#comment:8) of `evt.timestamp` and `new Event('').timeStamp`. Together with clamping the timer resolution to 100ms this provides an effective means against system uptime fingerprinting.
+It is possible to get the system uptime of a Tor Browser user by querying the **Event.timeStamp** property. We avoid this by setting `dom.event.highrestimestamp.enabled` to **true**. This might seem counterintuitive at first glance, but the effect of setting that preference to true is a [normalization](https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/17046) of `evt.timeStamp` and `new Event('').timeStamp`. Together with clamping the timer resolution to 100ms, this provides an effective means against system uptime fingerprinting.
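The combined effect can be modeled as reporting event times relative to page load rather than to boot, at the clamped 100 ms resolution (assumed semantics for illustration; the real behavior is implemented inside Gecko):

```python
# Illustrative model of a normalized, clamped event timestamp.
def normalized_event_timestamp(uptime_ms, page_load_uptime_ms, resolution_ms=100):
    relative_ms = uptime_ms - page_load_uptime_ms  # hides absolute uptime
    return relative_ms - relative_ms % resolution_ms
```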
 16.**Keyboard Layout Fingerprinting**
...
@@ -726,7 +726,7 @@ Where our actual implementation differs from an ideal solution, we separately de
 We set `dom.enable_user_timing` and `dom.enable_resource_timing` to **false** to disable these explicit timing sources. Furthermore, we clamp the resolution of explicit clocks to 100ms by setting `privacy.resistFingerprinting` to **true** thanks to [Mozilla bug 1217238](https://bugzilla.mozilla.org/show_bug.cgi?id=1217238). This includes `performance.now()`, `new Date().getTime()`, `audioContext.currentTime`, `canvasStream.currentTime`, `video.currentTime`, `audio.currentTime`, `new File([], "").lastModified`, `new File([], "").lastModifiedDate.getTime()`, `animation.startTime`, `animation.currentTime`, `animation.timeline.currentTime`, and `document.timeline.currentTime`.
-While clamping the clock resolution to 100ms is a step towards mitigating timing-based side channel fingerprinting, it is by no means sufficient. It turns out that it is possible to subvert our clamping of explicit clocks by using [implicit ones](https://www.usenix.org/system/files/conference/usenixsecurity16/sec16_paper_kohlbrenner.pdf), e.g. extrapolating the true time by running a busy loop with a predictable operation in it. We are tracking [this problem](https://trac.torproject.org/projects/tor/ticket/16110) in our bug tracker and are working with the research community and Mozilla to develop and test a proper solution to this part of our defense against timing-based side channel fingerprinting risks.
+While clamping the clock resolution to 100ms is a step towards mitigating timing-based side channel fingerprinting, it is by no means sufficient. It turns out that it is possible to subvert our clamping of explicit clocks by using [implicit ones](https://www.usenix.org/system/files/conference/usenixsecurity16/sec16_paper_kohlbrenner.pdf), e.g. extrapolating the true time by running a busy loop with a predictable operation in it. We are tracking [this problem](https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/16110) in our bug tracker and are working with the research community and Mozilla to develop and test a proper solution to this part of our defense against timing-based side channel fingerprinting risks.
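The implicit-clock technique is easy to sketch: spin on the clamped clock and count iterations of a predictable operation while it is stuck on one value; the iteration count is itself a fine-grained timer. The demonstration below substitutes a simulated clock so the effect is reproducible without real timing:

```python
# Demonstration of recovering sub-clamp timing from a busy loop.
def iterations_per_tick(read_clamped_clock):
    start = read_clamped_clock()
    n = 0
    while read_clamped_clock() == start:
        n += 1  # each predictable-cost iteration is an implicit clock tick
    return n

# Simulated clock: advances 1 ms per read, but is reported clamped to 100 ms,
# mirroring the explicit-clock clamping described above.
def make_fake_clock():
    state = {"ms": 0}
    def read():
        state["ms"] += 1
        return state["ms"] - state["ms"] % 100
    return read
```

Roughly 100 loop iterations fit inside one clamp window here, so the attacker recovers the 1 ms granularity the clamp was meant to hide.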
 19.**resource:// and chrome:// URIs Leaks**
...
@@ -750,7 +750,7 @@ Where our actual implementation differs from an ideal solution, we separately de
...
@@ -750,7 +750,7 @@ Where our actual implementation differs from an ideal solution, we separately de
[JavaScript performance fingerprinting](https://cseweb.ucsd.edu/~hovav/dist/jspriv.pdf) is the act of profiling the performance of various JavaScript functions for the purpose of fingerprinting the JavaScript engine and the CPU.
**Design Goal**: We have [several potential mitigation approaches](https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/3059) to reduce the accuracy of performance fingerprinting without risking too much damage to functionality. Our current favorite is to reduce the resolution of the `Event.timeStamp` and the JavaScript `Date()` object, while also introducing jitter. We believe that JavaScript time resolution may be reduced all the way up to the second before it seriously impacts site operation. Our goal with this quantization is to increase the amount of time it takes to mount a successful attack. [Mowery et al](https://cseweb.ucsd.edu/~hovav/dist/jspriv.pdf) found that even with the default precision in most browsers, they required up to 120 seconds of amortization and repeated trials to get stable results from their feature set. We intend to work with the research community to establish the optimum trade-off between quantization+jitter and amortization time, as well as identify highly variable JavaScript operations. As long as these attacks take several seconds or more to execute, they are unlikely to be appealing to advertisers, and are also very likely to be noticed if deployed against a large number of people.
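The quantization-plus-jitter direction can be sketched like this (illustrative only, not shipped code):

```js
// Illustrative sketch of the quantization + jitter direction described
// above (not shipped code). Timestamps are floored to a coarse bucket
// and a random offset within the bucket is added, so an attacker must
// amortize over many more trials to recover stable timings.
function fuzzedTime(ms, resolutionMs = 1000) {
  const bucket = Math.floor(ms / resolutionMs) * resolutionMs;
  const jitter = Math.random() * resolutionMs;  // uniform within one bucket
  return bucket + jitter;
}
```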
**Implementation Status**: Currently, our mitigation against performance fingerprinting is to disable [Navigation Timing](https://www.w3.org/TR/navigation-timing/) by setting the Firefox preference `dom.enable_performance` to **false**, and to disable the [Mozilla Video Statistics](https://developer.mozilla.org/en-US/docs/Web/API/HTMLVideoElement#Gecko-specific_properties) API extensions by setting the preference `media.video_stats.enabled` to **false**, too.
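In Firefox preference-file (prefs.js) syntax, these two settings are:

```js
// Firefox prefs.js syntax for the two preferences named above.
user_pref("dom.enable_performance", false);     // disable Navigation Timing
user_pref("media.video_stats.enabled", false);  // disable Video Statistics API
```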
As we mentioned in the introduction of this section, OS type fingerprinting is currently considered a lower priority, due simply to the numerous ways that characteristics of the operating system type may leak into content, and the comparatively low contribution of OS to overall entropy. In particular, there are likely to be many ways to measure the differences in widget size, scrollbar size, and other rendered details on a page. Also, directly exported OS routines (such as those from the standard C math library) expose differences in their implementations through their return values.
**Design Goal**: We intend to reduce or eliminate OS type fingerprinting to the best extent possible, but recognize that the effort for reward on this item is not as high as other areas. The entropy on the current OS distribution is somewhere around 2 bits, which is much lower than other vectors which can also be used to fingerprint configuration and user-specific information.
**Implementation Status**: At least two HTML5 features have a different implementation status across the major OS vendors and/or the underlying hardware: the [Network Connection API](https://developer.mozilla.org/en-US/docs/DOM/window.navigator.connection), and the [Sensor API](https://wiki.mozilla.org/Sensor_API). We disable these APIs through the Firefox preferences `dom.network.enabled` and `device.sensors.enabled`, setting both to **false**.
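In prefs.js syntax, these are:

```js
// Firefox prefs.js syntax for the two preferences named above.
user_pref("dom.network.enabled", false);     // disable the Network Connection API
user_pref("device.sensors.enabled", false);  // disable the Sensor API
```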
For more details on fingerprinting bugs and enhancements, see the [Fingerprinting](https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/?sort=updated_desc&state=opened&label_name%5B%5D=Fingerprinting&first_page_size=20) label in our bug tracker.
### 4.7. Long-Term Unlinkability via "New Identity" button
**Design Goal**: We want to deploy a mechanism that reduces the accuracy of [useful features](https://en.wikipedia.org/wiki/Feature_selection) available for classification. This mechanism would either impact the true and false positive accuracy rates, or reduce the number of web pages that could be classified at a given accuracy rate.
Ideally, this mechanism would be as light-weight as possible, and would be tunable in terms of overhead. We suspect that it may even be possible to deploy a mechanism that reduces feature extraction resolution without any network overhead. In the no-overhead category, we have [HTTPOS](https://freehaven.net/anonbib/cache/LZCLCP_NDSS11.pdf) and [better use of HTTP pipelining and/or SPDY](https://blog.torproject.org/blog/experimental-defense-website-traffic-fingerprinting). In the tunable/low-overhead category, we have [Adaptive Padding](https://arxiv.org/abs/1512.00524) and [Congestion-Sensitive BUFLO](https://www3.cs.stonybrook.edu/~xcai/fp.pdf). It may also be possible to [tune such defenses](https://gitlab.torproject.org/tpo/core/tor/-/issues/7028) such that they only use existing spare Guard bandwidth capacity in the Tor network, making them also effectively no-overhead.
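As a toy illustration of reducing feature-extraction resolution (not any of the cited defenses verbatim), padding transfer sizes up to coarse buckets removes exact-length features from a classifier's view:

```js
// Toy illustration of reducing feature resolution (not any of the cited
// defenses verbatim): padding transfer sizes up to the next power-of-two
// bucket, so a classifier observes coarse buckets instead of exact lengths.
function paddedSize(bytes) {
  if (bytes <= 1) return 1;
  return 2 ** Math.ceil(Math.log2(bytes));
}
```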
**Implementation Status**: None at present.
In general, it should not be possible for onclick handlers to alter the navigation destination of 'a' tags, silently transform them into POST requests, or otherwise create situations where a user believes they are clicking on a link leading to one URL that ends up on another. This functionality is deceptive and is frequently a vector for malware and phishing attacks. Unfortunately, many legitimate sites also employ such transparent link rewriting, and blanket disabling this functionality ourselves will simply cause Tor Browser to fail to navigate properly on these sites.
Automated cross-origin redirects are one form of this behavior that is possible for us to [address ourselves](https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/40787), as they are comparatively rare and can be handled with site permissions.