4. [Implementation](#4-implementation)
<!--
4.1 [Proxy Obedience](#41-proxy-obedience)
4.2 [State Separation](#42-state-separation)
4.3 [Disk Avoidance](#43-disk-avoidance)
4.4 [Least Privilege](#44-least-privilege)
4.5 [Cross-Origin Identifier Unlinkability](#45-cross-origin-identifier-unlinkability)
4.6 [Cross-Origin Fingerprinting Unlinkability](#46-cross-origin-fingerprinting-unlinkability)
4.7 [Long-Term Unlinkability via "New Identity" button](#47-long-term-unlinkability-via-new-identity-button)
4.8 [Other Security Measures](#48-other-security-measures)
-->
5. [Build Security and Package Integrity](#5-build-security-and-package-integrity)
5.1 [Achieving Binary Reproducibility](#51-achieving-binary-reproducibility)
## 4. Implementation
**TODO**: Re-write this section based on the current Tor Browser implementation.
Each subsection should include mitigations provided by:
- preference
- build-flag
- tor-browser patch
<!--
The Implementation section is divided into subsections, each of which corresponds to a [Design Requirement](#2-design-requirements-and-philosophy).
Each subsection is divided into specific web technologies or properties.
The implementation is then described for that property.
In some cases, the implementation meets the design requirements in a non-ideal way.
In rare cases, there may be no implementation at all.
Both of these cases are denoted by differentiating between the **Design Goal** and the **Implementation Status** for each property.
Corresponding bugs in the [Tor Browser issue tracker](https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues) are typically linked for these cases.
-->
### 4.1 Proxy Obedience
Proxy obedience is assured through the following:
1. **Firefox proxy settings, patches, and build flags**
Our Firefox preferences file sets the Firefox proxy settings to use Tor directly as a SOCKS proxy.
It sets `network.proxy.socks_remote_dns`, `network.proxy.socks_version`, `network.proxy.socks_port`, and `network.dns.disablePrefetch`.
To prevent proxy bypass by WebRTC calls, we disable WebRTC at compile time with the `--disable-webrtc` configure switch, as well as set the pref `media.peerconnection.enabled` to false.
We also patch Firefox in order to provide several defense-in-depth mechanisms for proxy safety.
We patch [OCSP and PKIX code](https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/13028) to prevent the non-proxied command-line utility functions from being usable while linked into the browser.
We could find no direct paths to these routines in the browser, but it seemed better to be safe than sorry.
During every Extended Support Release transition, we perform [in-depth code audits](https://gitlab.torproject.org/tpo/applications/tor-browser-spec/-/tree/main/audits) to verify that there are no system calls or XPCOM activity in the source tree that do not use the browser proxy settings.
We have verified that these settings and patches properly proxy HTTPS, OCSP, HTTP, FTP, gopher (now defunct), DNS, SafeBrowsing Queries, all JavaScript activity, including HTML5 audio and video objects, addon updates, WiFi geolocation queries, searchbox queries, XPCOM addon HTTPS/HTTP activity, WebSockets, and live bookmark updates.
We have also verified that external protocol helpers, such as SMB URLs and other custom protocol handlers, are all blocked.
`TODO: build with --enable-proxy-bypass-protection`
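For illustration, these proxy settings would look roughly as follows in a `user.js` sketch. This is not our authoritative preferences file: the SOCKS host/port values and `network.proxy.type` are assumptions added here for completeness.

```js
// Illustrative sketch of the proxy-obedience prefs discussed above.
user_pref("network.proxy.type", 1);                 // manual proxy configuration (assumption)
user_pref("network.proxy.socks", "127.0.0.1");      // bundled Tor client (assumed host)
user_pref("network.proxy.socks_port", 9150);        // assumed port; may differ per bundle
user_pref("network.proxy.socks_version", 5);
user_pref("network.proxy.socks_remote_dns", true);  // resolve DNS through the SOCKS proxy
user_pref("network.dns.disablePrefetch", true);     // no DNS prefetching outside the proxy
user_pref("media.peerconnection.enabled", false);   // WebRTC off (also disabled at compile time)
```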
2. **Disabling plugins**
NPAPI plugins have historically been a major source of vulnerabilities and proxy bypasses that the browser specifically had to deal with.
Fortunately, they have been deprecated since 2016 and were [finally removed from Firefox entirely in 2021](https://en.wikipedia.org/wiki/NPAPI#Firefox).
A second type of plugin, [Gecko Media Plugins](https://wiki.mozilla.org/GeckoMediaPlugins) (GMPs), is still available.
These are mainly third party codecs and [EME](https://www.w3.org/TR/encrypted-media/) content decryption modules.
We currently disable these plugins as they either can't be built reproducibly or are binary blobs which we are not allowed to audit (or both).
For the EME case we use the `--disable-eme` configure switch and set `browser.eme.ui.enabled`, `media.gmp-eme-adobe.visible`, `media.gmp-eme-adobe.enabled`, `media.gmp-widevinecdm.visible`, `media.gmp-widevinecdm.enabled`, `media.eme.enabled`, and `media.eme.apiVisible` to **false** to indicate to the user that this feature is disabled.
For GMPs in general we make sure that the external server is not even pinged for updates/downloads in the first place by setting `media.gmp-manager.url.override` to `data:text/plain`, and avoid any UI with `media.gmp-provider.enabled` set to **false**.
Moreover, we disable GMP downloads via local fallback by setting `media.gmp-manager.updateEnabled` to **false**.
To reduce our attack surface we exclude the ClearKey EME system, too.
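The preferences named in this item, collected into one illustrative `user.js` sketch:

```js
// Disable EME UI and content decryption modules.
user_pref("browser.eme.ui.enabled", false);
user_pref("media.gmp-eme-adobe.visible", false);
user_pref("media.gmp-eme-adobe.enabled", false);
user_pref("media.gmp-widevinecdm.visible", false);
user_pref("media.gmp-widevinecdm.enabled", false);
user_pref("media.eme.enabled", false);
user_pref("media.eme.apiVisible", false);
// Never contact the GMP update server, hide the provider UI,
// and disable local-fallback downloads.
user_pref("media.gmp-manager.url.override", "data:text/plain");
user_pref("media.gmp-provider.enabled", false);
user_pref("media.gmp-manager.updateEnabled", false);
```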
3. **External App Blocking and Drag Event Filtering**
External apps can be induced to load files that perform network activity.
Unfortunately, there are cases where such apps can be launched automatically with little to no user input.
In order to prevent this, we ship [Firefox](https://gitlab.torproject.org/legacy/trac/-/issues/8324) [patches](https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/41613) to provide the user with a popup whenever the browser attempts to launch a helper application.
~Furthermore, we ship a [patch for Linux users](https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/23044) that makes sure `sftp://` and `smb://` URLs are not passed along to the operating system as this can lead to proxy bypasses on systems that have GIO/GnomeVFS support.~
`TODO: this patch has been uplifted and is a pref now`
Additionally, modern desktops now preemptively fetch any URLs in Drag and Drop events as soon as the drag is initiated.
This download happens independent of the browser's Tor settings, and can be triggered by something as simple as holding the mouse button down for slightly too long while clicking on an image link.
We filter [drag and drop events](https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/41613) before the OS downloads the URLs the events contained.
`TODO: this bit should be updated by ma1 to talk about our updated drag+drop patches`
4. **Disabling system extensions and clearing the addon allow-list**
Firefox addons can perform arbitrary activity on your computer, including bypassing Tor.
For this reason we clear the addon allow-list (`xpinstall.whitelist.add`), so that users are prompted before installing addons regardless of the source.
We also exclude system-level addons from the browser through the use of `extensions.enabledScopes` and `extensions.autoDisableScopes`.
Furthermore, we set `extensions.systemAddon.update.url` and `extensions.hotfix.id` to an empty string in order to avoid the risk of getting extensions installed into the browser by Mozilla, and remove unused system extensions with a [Firefox patch](https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/21431).
In order to make it harder for users to accidentally install extensions which Mozilla presents to them on the *about:addons* page, we hide the *Get Addons* option on it by setting `extensions.getAddons.showPane` to **false**.
`TODO: not quite true, pdfjs is a system extension we include`
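A sketch of the extension-related preferences from this item; the integer scope values below are illustrative assumptions, since the exact values are not given above:

```js
user_pref("xpinstall.whitelist.add", "");            // clear the addon allow-list
user_pref("extensions.enabledScopes", 5);            // profile + application scopes (illustrative)
user_pref("extensions.autoDisableScopes", 15);       // auto-disable addons from all scopes (illustrative)
user_pref("extensions.systemAddon.update.url", "");  // no Mozilla-pushed system addons
user_pref("extensions.hotfix.id", "");               // no Mozilla hotfix addon
user_pref("extensions.getAddons.showPane", false);   // hide the Get Addons pane
```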
### 4.2 State Separation
Browser state is separated from any pre-existing browser state through use of a custom Firefox profile, and by setting the `$HOME` environment variable to the root of the bundle's directory.
The browser also does not load any system-wide extensions (through the use of `extensions.enabledScopes` and `extensions.autoDisableScopes`).
Furthermore, plugins are disabled, which prevents Flash cookies from leaking from a pre-existing Flash directory.
### 4.3 Disk Avoidance
**Design Goal**: The User Agent MUST (at user option) prevent all disk records of browser activity.
The user SHOULD be able to optionally enable URL history and other history features if they so desire.
**Implementation Status**: We are working towards this goal through several mechanisms.
First, we set the Firefox Private Browsing preference `browser.privatebrowsing.autostart` to **true**.
We also had to disable the media cache by setting the pref `media.cache_size` to **0**, to prevent HTML5 videos from being written to the OS temporary directory, which happened regardless of the private browsing mode setting.
Finally, we set `security.nocertdb` to **true** to make the intermediate certificate store memory-only.
As an additional defense-in-depth measure, we set `browser.cache.disk.enable`, `browser.cache.offline.enable`, `signon.rememberSignons`, and `browser.formfill.enable` to **false**, `browser.download.manager.retention` to **1**, and both `browser.sessionstore.privacy_level` and `network.cookie.lifetimePolicy` to **2**.
Many of these preferences are likely redundant with `browser.privatebrowsing.autostart` enabled, but we have not done the auditing work to ensure that yet.
For more details on disk leak bugs and enhancements, see the [Disk Leak](https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/?sort=updated_desc&state=opened&label_name%5B%5D=Disk%20Leak&first_page_size=20) tag in our issue tracker.
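Collected into one sketch, the disk-avoidance preferences described above:

```js
user_pref("browser.privatebrowsing.autostart", true);  // always in Private Browsing Mode
user_pref("media.cache_size", 0);                      // keep media out of the OS temp dir
user_pref("security.nocertdb", true);                  // memory-only intermediate cert store
// Defense-in-depth; likely redundant with permanent Private Browsing Mode.
user_pref("browser.cache.disk.enable", false);
user_pref("browser.cache.offline.enable", false);
user_pref("signon.rememberSignons", false);
user_pref("browser.formfill.enable", false);
user_pref("browser.download.manager.retention", 1);
user_pref("browser.sessionstore.privacy_level", 2);
user_pref("network.cookie.lifetimePolicy", 2);
```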
### 4.4 Least Privilege
**Design Goal**: The browser MUST run with as few permissions and capabilities as possible to function.
**Implementation Status**: Tor Browser inherits Firefox ESR's upstream sandboxing protections.
Tor Browser for Android disables some additional platform permissions related to telemetry and advertising, which are of course not needed.
On Windows, the browser installer does not require Administrator privileges to run.
We additionally have patched Tor Browser's updater to remove the code-paths which could trigger elevated execution on Windows.
### 4.5 Cross-Origin Identifier Unlinkability
The Cross-Origin Identifier Unlinkability design requirement is satisfied through first party isolation of all browser identifier sources.
First party isolation means that all identifier sources and browser state are scoped (isolated) using the URL bar domain.
This scoping is performed in combination with any additional third party scope.
When first party isolation is used with explicit identifier storage that already has a constrained third party scope (such as cookies and DOM storage), this approach is referred to as "double-keying".
`TODO: 3rd party cookies are disabled, not double-keyed`
~The benefit of this approach comes not only in the form of reduced linkability, but also in terms of simplified privacy UI.
If all stored browser state and permissions become associated with the URL bar origin, the six or seven different pieces of privacy UI governing these identifiers and permissions can become just one piece of UI.
For instance, a window that lists the URL bar origin for which browser state exists, possibly with a context-menu option to drill down into specific types of state or permissions.
An example of this simplification can be seen in Figure 1.~
#### ~Figure 1. Improving the Privacy UI~
![Improving the Privacy UI](uploads/b60467a8bfbfa04789c316dfd20cf097/improving-privacy-ui.png)
~This example UI is a mock-up of how isolating identifiers to the URL bar domain can simplify the privacy UI for all data - not just cookies.
Once browser identifiers and site permissions operate on a URL bar basis, the same privacy window can represent browsing history, DOM Storage, HTTP Auth, search form history, login values, and so on within a context menu for each site.~
#### Identifier Unlinkability Defenses
~Unfortunately, many aspects of browser state can serve as identifier storage, and no other browser vendor or standards body had invested the effort to enumerate or otherwise deal with these vectors for third party tracking.
As such, we have had to enumerate and isolate these identifier sources on a piecemeal basis.
This has gotten better lately with Mozilla stepping up and helping us with uplifting our patches, and with contributing their own patches where we lacked proper fixes.
However, we are not done yet with our unlinkability defense as new identifier sources are still getting added to the web platform.~
`TODO: less commentary here`
Here is the list that we have discovered and dealt with to date:
1. **Cookies**
~**Design Goal**: All cookies MUST be double-keyed to the URL bar origin and third-party origin.~
~**Implementation Status**: Double-keying cookies should just work by setting `privacy.firstparty.isolate` to **true**.
However, [we have not audited that](https://gitlab.torproject.org/legacy/trac/-/issues/21905) yet and there is still the [UI part missing for managing cookies in Private Browsing Mode](https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/10353).
We therefore opted to keep third-party cookies disabled for now by setting `network.cookie.cookieBehavior` to **1**.~
`TODO: So 3rd party cookies realistically are going away relatively soon, chrome is starting to phase them out entirely in 2024: https://developers.google.com/privacy-sandbox/3pcd`
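For reference, the cookie-related preferences as described (struck through) above:

```js
user_pref("network.cookie.cookieBehavior", 1);  // block third-party cookies
user_pref("privacy.firstparty.isolate", true);  // first-party isolation of identifier sources
```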
2. **Cache**
**Design Goal**: All cache entries MUST be isolated to the URL bar domain.
**Implementation Status**: We isolate the content and image cache to the URL bar domain by setting `privacy.firstparty.isolate` to **true**.
Furthermore, there is the [CacheStorage API](https://developer.mozilla.org/en-US/docs/Web/API/CacheStorage).
It is currently not available in the browser, as we do not allow third party cookies and are in Private Browsing Mode by default.
Because its cache entries are written to disk, the CacheStorage API [was disabled](https://bugzilla.mozilla.org/show_bug.cgi?id=1173467) in that mode in Firefox, similar to how IndexedDB is handled.
There are [thoughts](https://bugzilla.mozilla.org/show_bug.cgi?id=1117808) about enabling it by providing a memory-only database, but that is still a work in progress.
Even if users leave Private Browsing Mode and enable third party cookies, the storage is isolated to the URL bar domain by `privacy.firstparty.isolate` set to **true**.
Finally, we have the asm.js cache.
A script's cache entry is keyed [to the origin of the script](https://blog.mozilla.org/luke/2014/01/14/asm-js-aot-compilation-and-startup-performance/), among other things such as the type of CPU, the build ID, and the source characters of the asm.js module.
Lacking a good solution for binding it to the URL bar domain instead, we decided to disable asm.js for the time being by setting `javascript.options.asmjs` to **false**.
It remains to be seen whether keying the cache entry to, e.g., the source characters of the asm.js module would help to avoid its use for cross-origin tracking of users; we have not investigated that yet.
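The two preferences at work in this item, as a sketch:

```js
user_pref("privacy.firstparty.isolate", true);  // key content/image caches to the URL bar domain
user_pref("javascript.options.asmjs", false);   // disable asm.js, and with it the asm.js cache
```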
3. **HTTP Authentication**
HTTP Authorization headers can be used to encode [silent third party tracking identifiers](http://jeremiahgrossman.blogspot.com/2007/04/tracking-users-without-cookies.html).
To prevent this, we set `privacy.firstparty.isolate` to **true**.
4. **DOM Storage**
DOM storage for third party domains MUST be isolated to the URL bar domain, to prevent linkability between sites.
We achieve this by setting `privacy.firstparty.isolate` to **true**.
5. **IndexedDB Storage**
IndexedDB storage for third party domains MUST be isolated to the URL bar domain, to prevent linkability between sites.
By default [IndexedDB storage](https://developer.mozilla.org/en-US/docs/Web/API/IndexedDB_API) is disabled as the browser is using Firefox's Private Browsing Mode and does not allow third party cookies.
There are [thoughts](https://bugzilla.mozilla.org/show_bug.cgi?id=781982) about enabling this API in Private Browsing Mode as well but that is still work in progress.
However, if users leave this mode and enable third party cookies, isolation to the URL bar domain is still achieved by `privacy.firstparty.isolate` set to **true**.
6. **Flash cookies**
**Design Goal**: Users should be able to click-to-play flash objects from trusted sites.
To make this behavior unlinkable, we wish to include a settings file for all platforms that disables flash cookies using the [Flash settings manager](https://www.macromedia.com/support/documentation/en/flashplayer/help/settings_manager03.html).
**Implementation Status**: We are currently [having difficulties](https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/3974) causing Flash player to use this settings file on Windows, so Flash remains difficult to enable.
`TODO: remove this whole section`
7. **SSL+TLS session resumption**
**Design Goal**: TLS session resumption tickets and SSL Session IDs MUST be limited to the URL bar domain.
**Implementation Status**: We disable TLS Session Tickets and SSL Session IDs by setting `security.ssl.disable_session_identifiers` to **true**.
To compensate for the increased round trip latency from disabling these performance optimizations, we also enable [TLS False Start](https://tools.ietf.org/html/draft-bmoeller-tls-falsestart-00) via the Firefox Pref `security.ssl.enable_false_start`.
URL bar domain isolation should also work for both session tickets and session IDs, but we [have not verified that yet](https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/17252).
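In `user.js` form:

```js
user_pref("security.ssl.disable_session_identifiers", true);  // no session tickets / session IDs
user_pref("security.ssl.enable_false_start", true);           // recover some handshake latency
```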
8. **Tor circuit and HTTP connection linkability**
**Design Goal**: Tor circuits and HTTP connections from a third party in one URL bar origin MUST NOT be reused for that same third party in another URL bar origin.
**Implementation Status**: The isolation functionality is provided by a component that [sets the SOCKS username and password for each request](https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/3455).
The Tor client has logic to prevent connections with different SOCKS usernames and passwords from using the same Tor circuit.
Firefox has existing logic to ensure that connections with SOCKS proxies do not re-use existing HTTP Keep-Alive connections unless the proxy settings match.
[We extended this logic](https://bugzilla.mozilla.org/show_bug.cgi?id=1200802) to cover SOCKS username and password authentication, providing us with HTTP Keep-Alive unlinkability.
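To sketch the idea (the real component's credential format is an implementation detail; the format below is a hypothetical example, not the one Tor Browser actually uses): deriving the SOCKS credentials from the URL bar domain means Tor's SOCKS-authentication isolation puts each first party on its own circuit.

```js
// Hypothetical illustration of per-first-party SOCKS credentials.
function socksCredentialsFor(firstPartyDomain) {
  return {
    username: "firstparty:" + firstPartyDomain,  // e.g. "firstparty:example.com"
    password: "0",  // any fixed value; only the pair must differ between first parties
  };
}
```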
9. **SharedWorkers**
[SharedWorkers](https://developer.mozilla.org/en-US/docs/Web/API/SharedWorker) are a special form of JavaScript Worker threads that have a scope shared between all threads from the same JavaScript origin.
They MUST be isolated to the URL bar domain.
That is, a SharedWorker launched by a third party under one URL bar domain MUST NOT have access to the objects created by that same third party loaded under another URL bar domain.
This functionality is provided by setting `privacy.firstparty.isolate` to **true**.
10. **blob: URIs (URL.createObjectURL)**
The [URL.createObjectURL](https://developer.mozilla.org/en-US/docs/Web/API/URL/createObjectURL) API allows a site to load arbitrary content into a random UUID that is stored in the user's browser, and this content can be accessed via a URL of the form `blob:UUID` from any other content element anywhere on the web.
While this UUID value is neither under control of the site nor predictable, it can still be used to tag a set of users that are of high interest to an adversary.
URIs created with URL.createObjectURL MUST be limited in scope to the first party URL bar domain that created them.
We provide the isolation in the browser by setting `privacy.firstparty.isolate` to **true**.
11. **SPDY and HTTP/2**
**Design Goal**: SPDY and HTTP/2 connections MUST be isolated to the URL bar domain.
Furthermore, all associated means that could be used for cross-domain user tracking (alt-svc headers come to mind) MUST adhere to this design principle as well.
**Implementation Status**: SPDY and HTTP/2 are currently disabled by setting the Firefox preferences `network.http.spdy.enabled`, `network.http.spdy.enabled.v2`, `network.http.spdy.enabled.v3`, `network.http.spdy.enabled.v3-1`, `network.http.spdy.enabled.http2`, `network.http.spdy.enabled.http2draft`, `network.http.altsvc.enabled`, and `network.http.altsvc.oe` to **false**.
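In `user.js` form:

```js
user_pref("network.http.spdy.enabled", false);
user_pref("network.http.spdy.enabled.v2", false);
user_pref("network.http.spdy.enabled.v3", false);
user_pref("network.http.spdy.enabled.v3-1", false);
user_pref("network.http.spdy.enabled.http2", false);
user_pref("network.http.spdy.enabled.http2draft", false);
user_pref("network.http.altsvc.enabled", false);
user_pref("network.http.altsvc.oe", false);
```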
12. **Automated cross-origin redirects**
**Design Goal**: To prevent attacks aimed at subverting the Cross-Origin Identifier Unlinkability [privacy requirement](#22-privacy-requirements), the browser MUST NOT store any identifiers (cookies, cache, DOM storage, HTTP auth, etc) for cross-origin redirect intermediaries that do not prompt for user input.
For example, if a user clicks on a bit.ly URL that redirects to a doubleclick.net URL that finally redirects to a cnn.com URL, only cookies from cnn.com should be retained after the redirect chain completes.
Non-automated redirect chains that require user input at some step (such as federated login systems) SHOULD still allow identifiers to persist.
**Implementation status**: There are numerous ways for the user to be redirected, and the Firefox API support to detect each of them is poor.
We have a [bug open in our issue tracker](https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/40787) to implement what we can.
13. **window.name**
[window.name](https://developer.mozilla.org/En/DOM/Window.name) is a magical DOM property that for historical reasons is allowed to retain a persistent value for the lifespan of a browser tab.
This behavior allowed information to be shared/leaked across different websites, as the same value persisted across cross-site navigation, and the browser used to deploy ad-hoc mitigations against this obvious linkability issue.
However, browser vendors eventually acknowledged this problem, which nowadays is fixed in Firefox (and therefore in derivative browsers) by partitioning the `window.name` property's value per-site and resetting it on cross-site navigation.
14. **Auto form-fill**
We disable the password saving functionality in the browser as part of our [Disk Avoidance](#43-disk-avoidance) requirement.
However, since users may decide to re-enable disk history records and password saving, we also set the [signon.autofillForms](http://kb.mozillazine.org/Signon.autofillForms) preference to false to prevent saved values from immediately populating fields upon page load.
Since JavaScript can read these values as soon as they appear, setting this preference prevents automatic linkability from stored passwords.
15. **HSTS and HPKP supercookies**
An extreme (but not impossible) attack to mount is the creation of [HSTS](https://www.leviathansecurity.com/blog/archives/12-The-Double-Edged-Sword-of-HSTS-Persistence-and-Privacy.html) [supercookies](https://www.radicalresearch.co.uk/lab/hstssupercookies/).
Since HSTS effectively stores one bit of information per domain name, an adversary in possession of numerous domains can use them to construct cookies based on stored HSTS state.
HPKP provides [a mechanism for user tracking](https://zyan.scripts.mit.edu/presentations/toorcon2015.pdf) across domains as well.
It allows abusing the requirement to provide a backup pin, together with the option to report a pin validation failure.
In a tracking scenario, every user gets a unique SHA-256 value serving as a backup pin.
This value is sent back after (deliberate) pin validation failures, in effect working as a cookie.
**Design Goal**: HSTS and HPKP MUST be isolated to the URL bar domain.
**Implementation Status**: Currently, both HSTS and HPKP state are cleared by New Identity, but we don't defend against the creation and usage of these supercookies between `New Identity` invocations.
16. **Broadcast Channels**
The BroadcastChannel API allows communication between different browsing contexts (tabs, windows, iframes) of the same origin.
However, to avoid cross-origin linkability, broadcast channels MUST instead be isolated to the URL bar domain.
We provide the isolation by setting `privacy.firstparty.isolate` to **true**.
17. **OCSP**
OCSP requests go to Certificate Authorities (CAs) to check for revoked certificates.
They are sent when the browser visits a website via HTTPS and no cached result is available.
Thus, to avoid information leaks, e.g. to exit relays, OCSP requests MUST go over the same circuit as the HTTPS request causing them and MUST therefore be isolated to the URL bar domain.
The resulting cache entries MUST be bound to the URL bar domain as well.
This functionality is provided by setting `privacy.firstparty.isolate` to **true**.
18. **Favicons**
**Design Goal**: When visiting a website, its favicon is fetched via a request originating from the browser itself (similar to the OCSP mechanism mentioned in the previous section).
Those requests MUST be isolated to the URL bar domain.
**Implementation Status**: Favicon requests are isolated to the URL bar domain by setting `privacy.firstparty.isolate` to **true**.
19. **mediasource: URIs and MediaStreams**
Much like blob URLs, mediasource: URIs and MediaStreams can be used to tag users.
Therefore, mediasource: URIs and MediaStreams MUST be isolated to the URL bar domain.
This functionality is provided by setting `privacy.firstparty.isolate` to **true**.
20. **Speculative and prefetched connections**
Firefox provides the feature to [connect speculatively](https://www.igvita.com/2015/08/17/eliminating-roundtrips-with-preconnect/) to remote hosts if that is either indicated in the HTML file (e.g. by [link rel="preconnect" and rel="prefetch"](https://w3c.github.io/resource-hints/)) or otherwise deemed beneficial.
Firefox does not support rel="prerender", and Mozilla has disabled speculative connections and rel="preconnect" usage where a proxy is used (see [comment 3 in issue tor-browser#18762](https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/18762#note_2630269) for further details).
Explicit prefetching via the rel="prefetch" attribute is still performed, however.
All pre-loaded links and speculative connections MUST be isolated to the URL bar domain, if enabled.
This includes isolating both Tor circuit use, as well as the caching and associate browser state for the prefetched resource.
For automatic speculative connects and rel="preconnect", we leave them disabled as per the Mozilla default for proxy settings.
However, if enabled, speculative connects will be isolated to the proper first party Tor circuit by the same mechanism as is used for HTTP Keep-Alive.
This is true for rel="prefetch" requests as well.
For rel="preconnect", we set `privacy.firstparty.isolate` to **true**.
This isolation makes both preconnecting and cache warming via rel="prefetch" ineffective for links to domains other than the current URL bar domain.
For links to the same domain as the URL bar domain, the full cache warming benefit is obtained.
As an optimization, any preconnecting to domains other than the current URL bar domain can thus be disabled (perhaps with the exception of frames), but we do not do this.
We allow these requests to proceed, but we isolate them.
21. **Permissions API**
The Permissions API allows a website to query the status of different permissions.
Although permissions are keyed to the origin, that is not enough to alleviate cross-linkability concerns: as more and more permissions and their state become accessible under this API, the combined permission state could work like an identifier.
**Design Goal**: Permissions MUST be isolated to the URL bar domain.
**Implementation Status**: This functionality is provided by setting `privacy.firstparty.isolate` to **true**.
For more details on identifier linkability bugs and enhancements, see the [Linkability](https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/?sort=updated_desc&state=opened&label_name%5B%5D=Linkability&first_page_size=20) label in our issue tracker.
### 4.6 Cross-Origin Fingerprinting Unlinkability
Browser fingerprinting is the act of inspecting browser behaviors and features in an attempt to differentiate and track individual users.
Fingerprinting attacks are typically broken up into passive and active vectors.
Passive fingerprinting makes use of any information the browser provides automatically to a website without any specific action on the part of the website.
Active fingerprinting makes use of any information that can be extracted from the browser by some specific website action, usually involving JavaScript.
Some definitions of browser fingerprinting also include supercookies and cookie-like identifier storage, but we deal with those issues separately in the [preceding section on identifier linkability](#45-cross-origin-identifier-unlinkability).
For the most part, however, we do not differentiate between passive or active fingerprinting sources, since many active fingerprinting mechanisms are very rapid, and can be obfuscated or disguised as legitimate functionality.
Instead, we believe fingerprinting can only be rationally addressed if we understand where the problem comes from, what sources of issues are the most severe, what types of defenses are suitable for which sources, and have a consistent strategy for designing defenses that maximizes our ability to study defense efficacy.
The following subsections address these issues from a high level, and we then conclude with a list of our current specific defenses.
#### Sources of Fingerprinting Issues
All browser fingerprinting issues arise from one of four primary sources: end-user configuration details, device and hardware characteristics, operating system vendor and version differences, and browser vendor and version differences.
Additionally, user behavior itself provides one more source of potential fingerprinting.
In order to help prioritize and inform defenses, we now list these sources in order from most severe to least severe in terms of the amount of information they reveal, and describe them in more detail.
1. **End-user Configuration Details**
End-user configuration details are by far the most severe threat to fingerprinting, as they will quickly provide enough information to uniquely identify a user.
We believe it is essential to avoid exposing platform configuration details to website content at all costs.
We also discourage excessive fine-grained customization by minimizing and aggregating user-facing privacy and security options, as well as by discouraging the use of additional plugins and addons.
When it is necessary to expose configuration details in the course of providing functionality, we strive to do so only on a per-site basis via site permissions, to avoid linkability.
2. **Device and Hardware Characteristics**
Device and hardware characteristics can be determined in three ways: they can be reported explicitly by the browser, they can be inferred through browser functionality, or they can be extracted through statistical measurements of system performance.
We are most concerned with the cases where this information is either directly reported or can be determined via a single use of an API or feature, and prefer to either alter functionality to prevent exposing the most variable aspects of these characteristics, place such features behind site permissions, or disable them entirely.
On the other hand, because statistical inference of system performance requires many iterations to achieve accuracy in the face of noise and concurrent activity, we are less concerned with this mechanism of extracting this information.
We also expect that reducing the resolution of JavaScript's time sources will significantly increase the duration of execution required to extract accurate results, and thus make statistical approaches both unattractive and highly noticeable due to excessive resource consumption.
3. **Operating System Vendor and Version Differences**
Operating system vendor and version differences permeate many different aspects of the browser.
While it is possible to address these issues with some effort, the relative lack of diversity in operating systems causes us to primarily focus our efforts on passive operating system fingerprinting mechanisms at this point in time.
For the purposes of protecting user anonymity, it is not strictly essential that the operating system be completely concealed, though we recognize that it is useful to reduce this differentiation ability where possible, especially for cases where the specific version of a system can be inferred.
4. **User Behavior**
While somewhat outside the scope of browser fingerprinting, for completeness it is important to mention that users themselves theoretically might be fingerprinted through their behavior while interacting with a website.
This behavior includes e.g. keystrokes, mouse movements, click speed, and writing style.
Basic vectors such as keystroke and mouse usage fingerprinting can be mitigated by altering JavaScript's notion of time.
More advanced issues like writing style fingerprinting are the domain of [other tools](https://github.com/psal/anonymouth/blob/master/README.md).
5. **Browser Vendor and Version Differences**
Due to vast differences in feature set and implementation behavior even between different ([minor](https://tsyrklevich.net/2014/10/28/abusing-strict-transport-security/)) versions of the same browser, browser vendor and version differences are simply not possible to conceal in any realistic way.
It is only possible to minimize the differences among different installations of the same browser vendor and version.
We make no effort to mimic any other major browser vendor, and in fact most of our fingerprinting defenses serve to differentiate users from normal Firefox users.
Because of this, any study that lumps browser vendor and version differences into its analysis of the fingerprintability of a population is largely useless for evaluating either attacks or defenses.
Unfortunately, this includes popular large-scale studies such as [Panopticlick](https://panopticlick.eff.org/) and [Am I Unique](https://amiunique.org/).
To gather usable data about the browser's fingerprinting defenses, we launched a Google Summer of Code project in 2016, called [FPCentral](https://github.com/plaperdr/fp-central), with the aim of providing our own testbed.
We set this up during 2017 and [have it available now](https://fpcentral.tbb.torproject.org/) for further integration into our quality assurance efforts and possible research into improving our fingerprinting defenses and measuring their effectiveness.
#### General Fingerprinting Defenses
To date, the team has concerned itself only with developing defenses for APIs that have already been standardized and deployed.
Once an API or feature has been standardized and widely deployed, defenses to the associated fingerprinting issues tend to have only a few options available to compensate for the lack of up-front privacy design.
In our experience, so far these options have been limited to value spoofing, subsystem modification or reimplementation, virtualization, site permissions, and feature removal.
We will now describe these options and the fingerprinting sources they tend to work best with.
1. **Value Spoofing**
Value spoofing can be used for simple cases where the browser provides some aspect of the user's configuration details, devices, hardware, or operating system directly to a website.
It becomes less useful when the fingerprinting method relies on behavior to infer aspects of the hardware or operating system, rather than obtain them directly.
2. **Subsystem Modification or Reimplementation**
In cases where simple spoofing is not enough to properly conceal underlying device characteristics or operating system details, the underlying subsystem that provides the functionality for a feature or API may need to be modified or completely reimplemented.
This is most common in cases where customizable or version-specific aspects of the user's operating system are visible through the browser's featureset or APIs, usually because the browser directly exposes OS-provided implementations of underlying features.
In these cases, such OS-provided implementations must be replaced by a generic implementation, or at least modified by an implementation wrapper layer that makes effort to conceal any user-customized aspects of the system.
3. **Virtualization**
Virtualization is needed when simply reimplementing a feature in a different way is insufficient to fully conceal the underlying behavior.
This is most common in instances of device and hardware fingerprinting, but since the notion of time can also be virtualized, virtualization also can apply to any instance where an accurate measurement of wall clock time is required for a fingerprinting vector to attain high accuracy.
4. **Site Permissions**
In the event that reimplementation or virtualization is too expensive in terms of performance or engineering effort, and the relative expected usage of a feature is rare, site permissions can be used to prevent the usage of a feature for cross-site tracking.
Unfortunately, site permissions become less effective once a feature is already widely overused and abused by many websites, since warning fatigue typically sets in for most users after just a few permission requests.
5. **Feature or Functionality Removal**
Due to the current bias in favor of invasive APIs that expose the maximum amount of platform information, some features and APIs are simply not salvageable in their current form.
When such invasive features serve only a narrow domain or use case, or when there are alternate ways of accomplishing the same task, these features and/or certain aspects of their functionality may be simply removed.
#### Strategies for Defense: Randomization versus Uniformity
When applying a form of defense to a specific fingerprinting vector or source, there are two general strategies available: either the implementation for all users of a single browser version can be made to behave as uniformly as possible, or the user agent can attempt to randomize its behavior so that each interaction between a user and a site provides a different fingerprint.
Although [some research suggests](https://www.microsoft.com/en-us/research/wp-content/uploads/2016/02/tr1-1.pdf) that randomization can be effective, so far striving for uniformity has generally proved to be a better strategy for the browser for the following reasons:
1. **Evaluation and measurement difficulties**
The fact that randomization causes behaviors to differ slightly with every site visit makes it appealing at first glance, but this same property makes it very difficult to objectively measure its effectiveness.
By contrast, an implementation that strives for uniformity is very simple to evaluate.
Despite their current flaws, a properly designed version of [Panopticlick](https://panopticlick.eff.org/) or [Am I Unique](https://amiunique.org/) could report the entropy and uniqueness rates for all users of a single user agent version, without the need for complicated statistics about the variance of the measured behaviors.
[FPCentral](https://fpcentral.tbb.torproject.org/fp) is trying to achieve that for the browser by providing feedback on acceptable browser properties and giving guidance on possible improvements.
Randomization (especially incomplete randomization) may also provide a false sense of security.
When a fingerprinting attempt makes naive use of randomized information, a fingerprint will appear unstable, but may not actually be sufficiently randomized to impede a dedicated adversary.
Sophisticated fingerprinting mechanisms may either ignore randomized information, or incorporate knowledge of the distribution and range of randomized values into the creation of a more stable fingerprint (by either removing the randomness, modeling it, or averaging it out).
2. **Randomization is not a shortcut**
While many end-user configuration details that the browser currently exposes may be safely replaced by false information, randomization of these details must be just as exhaustive as an approach that seeks to make these behaviors uniform.
When confronting either strategy, the adversary can still make use of any details which have not been altered to be either sufficiently uniform or sufficiently random.
Furthermore, the randomization approach seems to break down when it is applied to deeper issues where underlying system functionality is directly exposed.
In particular, it is not clear how to randomize the capabilities of hardware attached to a computer in such a way that it either convincingly behaves like other hardware, or such that the exact properties of the hardware that vary from user to user are sufficiently randomized.
Similarly, truly concealing operating system version differences through randomization may require multiple reimplementations of the underlying operating system functionality to ensure that every operating system version is covered by the range of possible behaviors.
3. **Usability issues**
When randomization is introduced to features that affect site behavior, it can be very distracting for this behavior to change between visits of a given site.
For the simplest cases, this will lead to minor visual nuisances.
However, when this information affects reported functionality or hardware characteristics, sometimes a site will function one way on one visit, and another way on a subsequent visit.
4. **Performance costs**
Randomizing involves performance costs.
This is especially true if the fingerprinting surface is large (like in a modern browser) and one needs more elaborate randomizing strategies (including randomized virtualization) to ensure that the randomization fully conceals the true behavior.
Many calls to a cryptographically secure random number generator during the course of a page load will both serve to exhaust available entropy pools, as well as lead to increased computation while loading a page.
5. **Increased vulnerability surface**
Improper randomization might introduce a new fingerprinting vector, as the process of generating the values for the fingerprintable attributes could be itself susceptible to side-channel attacks, analysis, or exploitation.
#### Specific Fingerprinting Defenses
The following defenses are listed roughly in order of most severe fingerprinting threat first.
This ordering is based on the above intuition that user configurable aspects of the computer are the most severe source of fingerprintability, followed by device characteristics and hardware, and then finally operating system vendor and version information.
Where our actual implementation differs from an ideal solution, we separately describe our **Design Goal** and our **Implementation Status**.
1. **Plugins**
Plugins add to fingerprinting risk via two main vectors: their mere presence in `window.navigator.plugins` (because they are optional, end-user installed third party software), as well as their internal functionality.
**Design Goal**: All plugins that have not been specifically audited or sandboxed MUST be disabled.
To reduce linkability potential, even sandboxed plugins SHOULD NOT be allowed to load objects until the user has clicked through a click-to-play barrier.
Additionally, version information SHOULD be reduced or obfuscated until the plugin object is loaded.
For Flash, we wish to [provide a settings.sol](https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/3974) file to disable Flash cookies, and to restrict P2P features that are likely to bypass proxy settings.
We'd also like to restrict access to fonts and other system information (such as IP address and MAC address) in such a sandbox.
**Implementation Status**: Currently, we entirely disable all plugins in the browser.
However, as a compromise due to the popularity of Flash, we allow users to re-enable Flash, and flash objects are blocked behind a click-to-play barrier that is available only after the user has specifically enabled plugins.
Flash is the only plugin available, the rest are entirely blocked from loading by the Firefox patches mentioned in the [Proxy Obedience section](#41-proxy-obedience).
We also set the Firefox preference `plugin.expose_full_path` to **false**, to avoid leaking plugin installation information.
2. **HTML5 Canvas Image Extraction**
After plugins and plugin-provided information, we believe that the [HTML5 Canvas](https://developer.mozilla.org/en-US/docs/HTML/Canvas) is the single largest fingerprinting threat browsers face today.
[Studies](https://cseweb.ucsd.edu/~hovav/dist/canvas.pdf) [show](https://securehomes.esat.kuleuven.be/~gacar/persistent/the_web_never_forgets.pdf) that the Canvas can provide an easy-access fingerprinting target: The adversary simply renders WebGL, font, and named color data to a Canvas element, extracts the image buffer, and computes a hash of that image data.
Subtle differences in the video card, font packs, and even font and graphics library versions allow the adversary to produce a stable, simple, high-entropy fingerprint of a computer.
In fact, the hash of the rendered image can be used almost identically to a tracking cookie by the web server.
In some sense, the canvas can be seen as the union of many other fingerprinting vectors.
If WebGL were normalized through software rendering, system colors standardized, and a fixed collection of fonts shipped with the browser (see later points in this list), it might not be necessary to create a canvas permission.
However, until then, we gate canvas reading and related functionality behind a site permission.
This functionality is provided by setting `privacy.resistFingerprinting` to **true**.
If the user hasn't previously allowed the site in the URL bar to access Canvas image data, pure white image data is returned to the JavaScript APIs.
Third-party content, however, is not allowed to extract canvas image data at all.
3. **Open TCP Port and Local Network Fingerprinting**
In Firefox, by using either WebSockets or XHR, it is possible for remote content to [enumerate the list of TCP ports open on 127.0.0.1](http://www.andlabs.org/tools/jsrecon.html), as well as on any other machines on the local network.
In other browsers, this can be accomplished by DOM events on image or script tags.
This open vs filtered vs closed port list can provide a very unique fingerprint of a machine, because it essentially enables the detection of many different popular third party applications and optional system services (Skype, Bitcoin, Bittorrent and other P2P software, SSH ports, SMB and related LAN services, CUPS and printer daemon config ports, mail servers, and so on).
It is also possible to determine when ports are closed versus filtered/blocked (and thus probe custom firewall configuration).
We prevent access to 127.0.0.1/localhost by ensuring that even these requests are still sent by Firefox to our SOCKS proxy (i.e. we set `network.proxy.no_proxies_on` to the empty string).
The local Tor client then rejects them, since it is configured to reject connections to internal IP addresses by default.
Access to the local network is forbidden via the same mechanism.
We also disable the WebRTC API as mentioned previously, since even if it were usable over Tor, it still currently provides the local IP address and associated network information to websites.
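The preference at the core of this defense, as a sketch:

```js
user_pref("network.proxy.no_proxies_on", "");      // no proxy exceptions, not even localhost
user_pref("media.peerconnection.enabled", false);  // WebRTC disabled (defense-in-depth)
```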
4. **Invasive Authentication Mechanisms (NTLM and SPNEGO)**
Both NTLM and SPNEGO authentication mechanisms can leak the hostname, and in some cases the current username.
The only reason why these aren't a more serious problem is that they typically involve user interaction, and likely aren't an attractive vector for this reason.
However, because it is not clear if certain carefully-crafted error conditions in these protocols could cause them to reveal machine information and still fail silently prior to the password prompt, these authentication mechanisms should either be disabled, or placed behind a site permission before their use.
This has been resolved upstream in the fix for [Mozilla 1046421](https://bugzilla.mozilla.org/show_bug.cgi?id=1046421).
5. **USB Device ID Enumeration via the GamePad API**
The [GamePad API](https://developer.mozilla.org/en-US/docs/Web/Guide/API/Gamepad) provides web pages with the [USB device id, product id, and driver name](https://dvcs.w3.org/hg/gamepad/raw-file/default/gamepad.html#widl-Gamepad-id) of all connected game controllers, as well as detailed information about their capabilities.
It's our opinion that this API needs to be completely redesigned to provide an abstract notion of a game controller rather than offloading all of the complexity associated with handling specific game controller models to web content authors.
For systems without a game controller, a standard controller can be virtualized through the keyboard, which will serve to both improve usability by normalizing user interaction with different games, as well as eliminate fingerprinting vectors.
Barring that, this API should be behind a site permission in Private Browsing Modes.
For now though, we simply disable it via the pref `dom.gamepad.enabled`.
6. **Fonts**
According to the Panopticlick study, fonts provide the most linkability when they are available as an enumerable list in file system order, via either the Flash or Java plugins.
However, it is still possible to use CSS and/or JavaScript to query for the existence of specific fonts.
With a large enough pre-built list to query, a large amount of fingerprintable information may still be available, especially given that additional fonts often end up installed by third party software and for multilingual support.
**Design Goal**: Font-based fingerprinting MUST be rendered ineffective.
**Implementation Status**: We investigated shipping a predefined set of fonts to all of our users, allowing only those fonts to be used by websites to the exclusion of system fonts.
We are currently following this approach, which has been suggested by researchers previously.
This defense is available for all three supported platforms: Windows, macOS, and Linux, although the implementations vary in detail.
For Windows and macOS we use a preference, `font.system.whitelist`, to restrict fonts being used to those in the allow-list.
This functionality is provided by setting `privacy.resistFingerprinting` to **true**.
The allow-list for Windows and macOS contains both a set of [Noto fonts](https://www.google.com/get/noto) which we bundle and fonts provided by the operating system.
For Linux systems we only bundle fonts and [deploy](https://gitlab.torproject.org/tpo/applications/tor-browser-build/-/blob/main/projects/browser/Bundle-Data/linux/Data/fontconfig/fonts.conf) a fonts.conf file to restrict the browser to use those fonts exclusively.
In addition to that we set the `font.name.*` preferences for macOS and Linux to make sure that a given code point is always displayed with the same font.
This is not guaranteed even if we bundle all the fonts the browser uses as it can happen that fonts are loaded in a different order on different systems.
Setting the above mentioned preferences works around this issue by specifying the font to use explicitly.
Allowing fonts provided by the operating system for Windows and macOS users is currently a compromise between fingerprintability resistance and usability concerns.
We are still investigating the right balance between them and have created a [ticket in our issue tracker](https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/18097) to summarize the current state of our defense and future work that remains to be done.
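A sketch of the font preferences involved; the family names below are illustrative stand-ins, as the shipped allow-list is much longer and platform-specific:

```js
// Restrict usable fonts to an explicit allow-list (Windows/macOS).
user_pref("font.system.whitelist", "Noto Sans, Noto Serif, Noto Sans Symbols");
// Pin default font choices so code points always map to the same font (macOS/Linux).
user_pref("font.name.serif.x-western", "Noto Serif");
user_pref("font.name.sans-serif.x-western", "Noto Sans");
```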
7. **Monitor, Widget, and OS Desktop Resolution**
Both CSS and JavaScript have access to a lot of information about the screen resolution, usable desktop size, OS widget size, toolbar size, title bar size, and OS desktop widget sizing information that are not at all relevant to rendering and serve only to provide information for fingerprinting.
Since many aspects of desktop widget positioning and size are user configurable, these properties yield customized information about the computer, even beyond the monitor size.
**Design Goal**: Our design goal here is to reduce the resolution information down to the bare minimum required for properly rendering inside a content window.
We intend to report all rendering information correctly with respect to the size and properties of the content window, but report an effective size of 0 for all border material, and also report that the desktop is only as big as the inner content window.
Additionally, new browser windows are sized such that their content windows are one of a few fixed sizes based on the user's desktop resolution.
In addition, to further reduce resolution-based fingerprinting, we are [investigating zoom/viewport-based mechanisms](https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/7256) that might allow us to always report the same desktop resolution regardless of the actual size of the content window, and simply scale to make up the difference.
As an alternative to zoom-based solutions, we are testing a [different approach](https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/14429) in our alpha series that tries to round the browser window at all times to a multiple of 200x100 pixels.
Regardless of which solution we finally pick, until it is available users should be informed that maximizing their windows can lead to fingerprintability under the current scheme.
**Implementation Status**: We automatically resize new browser windows to a 200x100 pixel multiple based on desktop resolution by backporting patches from [bug 1330882](https://bugzilla.mozilla.org/show_bug.cgi?id=1330882) and setting `privacy.resistFingerprinting` to **true**.
To minimize the effect of the long tail of large monitor sizes, we also cap the window size at 1000 pixels in each direction.
In addition to that, we set `privacy.resistFingerprinting` to **true** to use the client content window size for `window.screen`, and to report a `window.devicePixelRatio` of 1.0.
Similarly, we use that preference to return content-window-relative points for DOM events.
We also force popups to open in new tabs (via `browser.link.open_newwindow.restriction`), to avoid full-screen popups inferring information about the browser resolution.
In addition, we prevent auto-maximizing on browser start, and inform users that maximized windows are detrimental to privacy in this mode.
`TODO: this section is out of date`
8. **Display Media information**
Beyond simple resolution information, a large amount of so-called "Media" information is also exported to content.
Even without JavaScript, CSS has access to a lot of information about the device orientation, system theme colors, and other desktop and display features that are not at all relevant to rendering and also user configurable.
Most of this information comes from [CSS Media Queries](https://developer.mozilla.org/en-US/docs/Web/Guide/CSS/Media_queries), but Mozilla has exposed [several user and OS theme defined color values](https://developer.mozilla.org/en-US/docs/Web/CSS/color_value#System_Colors) to CSS as well.
**Design Goal**: A website MUST NOT be able to infer anything that the user has configured about their computer.
Additionally, it SHOULD NOT be able to infer machine-specific details such as screen orientation or type.
**Implementation Status**: We set `ui.use_standins_for_native_colors` to **true** to report a fixed set of system colors to content window CSS, and prevent detection of font smoothing on macOS with the help of `privacy.resistFingerprinting` set to **true**.
We use the same preference, too, to always report landscape-primary for the [screen orientation](https://w3c.github.io/screen-orientation/).
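For illustration, the following sketch shows how content can probe such media features from script via `window.matchMedia`; under the defenses above, these probes return uniform answers for all users:

```typescript
// Probing display features via CSS media queries from script.
// With the defenses above, every user reports the same answers.
const probes = [
  "(orientation: landscape)", // spoofed to landscape-primary for everyone
  "(monochrome)",
  "(display-mode: fullscreen)",
];
for (const query of probes) {
  console.log(query, window.matchMedia(query).matches);
}
```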
9. **WebGL**
WebGL is fingerprintable both through information that is exposed about the underlying driver and optimizations, as well as through performance fingerprinting.
Because of the large amount of potential fingerprinting vectors and the [previously unexposed vulnerability surface](https://www.contextis.com/resources/blog/webgl-new-dimension-browser-exploitation/), we deploy a similar strategy against WebGL as for plugins.
First, WebGL Canvases have click-to-play placeholders (provided by NoScript), and do not run until authorized by the user.
Second, we obfuscate driver information by setting the Firefox preferences `webgl.disable-extensions`, `webgl.min_capability_mode`, and `webgl.disable-fail-if-major-performance-caveat` to **true** which reduces the information provided by the following WebGL API calls: `getParameter()`, `getSupportedExtensions()`, and `getExtension()`.
Furthermore, WebGL2 is disabled by setting `webgl.enable-webgl2` to **false**.
Making the minimal WebGL mode usable is accomplished in [Mozilla 1217290](https://bugzilla.mozilla.org/show_bug.cgi?id=1217290).
Another option for WebGL might be to use software-only rendering, using a library such as [Mesa](https://www.mesa3d.org/).
The use of such a library would avoid hardware-specific rendering differences.
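A typical probe that the preferences above degrade is sketched below; where available, the `WEBGL_debug_renderer_info` extension exposes driver vendor and renderer strings:

```typescript
// A common WebGL fingerprinting probe: reading driver strings and the
// extension list. Minimal capability mode and disabled extensions reduce
// what these calls reveal.
const canvas = document.createElement("canvas");
const gl = canvas.getContext("webgl");
if (gl) {
  const dbg = gl.getExtension("WEBGL_debug_renderer_info");
  if (dbg) {
    console.log(gl.getParameter(dbg.UNMASKED_VENDOR_WEBGL));
    console.log(gl.getParameter(dbg.UNMASKED_RENDERER_WEBGL));
  }
  console.log(gl.getSupportedExtensions());
}
```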
10. **MediaDevices API**
The [MediaDevices API](https://developer.mozilla.org/en-US/docs/Web/API/MediaDevices) provides access to connected media input devices like cameras and microphones, as well as screen sharing.
In particular, it allows web content to easily enumerate those devices with `MediaDevices.enumerateDevices()`.
This relies on WebRTC, which we currently do not compile in.
Nevertheless, we disable this feature for now as a defense-in-depth by setting `media.peerconnection.enabled` and `media.navigator.enabled` to **false**.
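For reference, the enumeration that these preferences neutralize is a short probe; each entry's `kind` and `label` would otherwise contribute device-specific bits:

```typescript
// Device enumeration blocked by the preferences above; with them unset,
// cameras and microphones show up with identifying labels.
if (navigator.mediaDevices) {
  navigator.mediaDevices.enumerateDevices().then((devices) => {
    for (const d of devices) console.log(d.kind, d.label, d.deviceId);
  });
}
```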
11. **MIME Types**
Which MIME Types are registered with an operating system depends to a great deal on the application software and/or drivers a user chose to install.
Web pages can not only estimate the number of MIME types registered by checking `navigator.mimeTypes.length`; they are even able to test whether particular MIME types are available, which can have a non-negligible impact on a user's fingerprint.
We prevent both of these information leaks by setting `privacy.resistFingerprinting` to **true**.
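The two probes in question are sketched below; with `privacy.resistFingerprinting` set, the reported list is uniform, so neither carries discriminating information:

```typescript
// Counting and testing registered MIME types.
console.log(navigator.mimeTypes.length);
const pdf = navigator.mimeTypes.namedItem("application/pdf");
console.log(pdf ? "PDF handler registered" : "no PDF handler");
```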
12. **Web Speech API**
The Web Speech API consists of two parts: SpeechSynthesis (Text-to-Speech) and SpeechRecognition (Asynchronous Speech Recognition).
The latter is still disabled in Firefox.
However, the former is enabled by default, and there is a risk that `speechSynthesis.getVoices()` has access to computer-specific speech packages, making them enumerable by web content.
Moreover, there are callbacks that would allow JavaScript to time how long a phrase takes to be "uttered".
To prevent both we set `media.webspeech.synth.enabled` to **false**.
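The enumeration this preference disables looks like the sketch below; installed voice packages differ per machine and operating system:

```typescript
// Voice enumeration; with media.webspeech.synth.enabled set to false the
// API is unavailable and this probe yields nothing.
if ("speechSynthesis" in window) {
  for (const voice of window.speechSynthesis.getVoices()) {
    console.log(voice.name, voice.lang, voice.localService);
  }
}
```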
13. **Touch API**
Touch events are able to reveal the absolute screen coordinates of a device which would defeat our approach to mitigate leaking the screen size as described above.
In order to prevent that we implemented two defenses: first, we disable the Touch API by setting `dom.w3c_touch_events.enabled` to **false**.
Second, for those users who really need or want this API available, we patched the code to return content-window-relative coordinates.
Furthermore, we made sure that the touch area described by `Touch.radiusX`, `Touch.radiusY`, and `Touch.rotationAngle` does not leak further information and that `Touch.force` does not reveal how much pressure a user applied to the surface.
This is accomplished by reporting back **1** for the first two properties and **0.0** for the last two, thanks to [Mozilla 1382499](https://bugzilla.mozilla.org/show_bug.cgi?id=1382499), when `privacy.resistFingerprinting` is set to **true**.
14. **Battery Status API**
The Battery Status API provides access to information about the system's battery charge level.
From Firefox 52 on it is disabled for web content.
Initially, it was possible on Linux to get a double-precision floating point value for the charge level, which meant there was a large number of possible values, making it behave almost like an identifier that allowed tracking a user cross-origin.
Even after that was fixed (and on other platforms, where the precision was just two significant digits anyway), the risk of tracking users remained: combined with `chargingTime` and `dischargingTime`, the number of possible values [was estimated to be in the millions](https://senglehardt.com/papers/iwpe17_battery_status_case_study.pdf) under normal conditions.
We avoid all those possible issues by disabling the Battery Status API, setting `dom.battery.enabled` to **false**.
15. **System Uptime**
It is possible to get the system uptime of a user by querying the `Event.timeStamp` property.
We avoid this by setting `dom.event.highrestimestamp.enabled` to **true**.
This might seem counterintuitive at first glance, but the effect of setting that preference to true is a [normalization](https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/17046) of `evt.timeStamp` and `new Event('').timeStamp`.
Together with clamping the timer resolution to 100ms this provides an effective means against system uptime fingerprinting.
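As a sketch, the probe in question is a single expression; on affected builds the timestamp of a freshly created event was relative to system start rather than to the page:

```typescript
// Uptime probe: with the normalization and 100ms clamping above, this no
// longer reveals how long the system has been running.
const ts = new Event("probe").timeStamp;
console.log(`event timeStamp: ${ts} ms`);
```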
16. **Keyboard Layout Fingerprinting**
Keyboard events provide a way for a website to find out information about the keyboard layout of its visitors.
In fact there are [several dimensions](https://developers.google.com/web/updates/2016/04/keyboardevent-keys-codes) to this fingerprinting vector.
The `KeyboardEvent.code` property represents a physical key that can't be changed by the keyboard layout nor by the modifier state.
On the other hand the `KeyboardEvent.key` property contains the character that is generated by that key.
This is dependent on things like keyboard layout, locale and modifier keys.
**Design Goal**: Websites MUST NOT be able to infer any information about the keyboard of a user.
**Implementation Status**: [Mozilla 1222285](https://bugzilla.mozilla.org/show_bug.cgi?id=1222285) takes care of spoofing `KeyboardEvent.code` and `KeyboardEvent.keyCode` by providing consensus (US-English-style) fake properties.
This is achieved by hiding the user's use of the numpad, and any non-QWERTY US English keyboard.
Characters from non-en-US languages currently return an empty `KeyboardEvent.code` and a `KeyboardEvent.keyCode` of 0. Moreover, neither `Alt`, `Shift`, nor `AltGr` keyboard events are reported to content.
This functionality is provided by setting `privacy.resistFingerprinting` to **true**.
We are currently not taking the deployed browser locale or the locale indicated by a loaded document into account when spoofing the keyboard layout.
We think that would be the right thing to do in the longer run, to mitigate possible usability issues and broken functionality on websites.
Similarly to how users of non-English browser bundles can currently choose between keeping the Accept header spoofed or not, they would then be able to keep a spoofed US English keyboard layout or have one spoofed according to the actual browser locale or the language of the document.
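For illustration, this is what a page observes per keystroke; with the spoofing above, `code` and `keyCode` follow a US English QWERTY layout regardless of the physical keyboard:

```typescript
// Logging the keyboard-layout-revealing properties of key events.
document.addEventListener("keydown", (e: KeyboardEvent) => {
  console.log(`key=${e.key} code=${e.code} keyCode=${e.keyCode}`);
});
```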
17. **User Agent and HTTP Headers**
**Design Goal**: All browser users MUST provide websites with an identical user agent and HTTP header set for a given request type.
We omit the Firefox minor revision, and report a popular Windows platform.
If the software is kept up to date, these headers should remain identical across the population even when updated.
**Implementation Status**: Firefox provides several options for controlling the browser user agent string which we leverage.
We also set similar prefs for controlling the Accept-Language and Accept-Charset headers, which we spoof to English by default.
18. **Timing-based Side Channels**
Attacks based on timing side channels are nothing new in the browser context.
[Cache-based](http://sip.cs.princeton.edu/pub/webtiming.pdf), [cross-site timing](https://www.abortz.net/papers/timingweb.pdf), and [pixel stealing](https://www.contextis.com/documents/2/Browser_Timing_Attacks.pdf) attacks, to name just a few, have been investigated in the past.
While their fingerprinting potential varies, all timing-based attacks have in common that they need sufficiently fine-grained clocks.
**Design Goal**: Websites MUST NOT be able to fingerprint a user by exploiting timing-based side channels.
**Implementation Status**: The cleanest solution to timing-based side channels would be to get rid of them.
This has been [proposed](https://acmccs.github.io/papers/p163-caoA.pdf) in the research community.
However, we remain skeptical, as it does not seem to be trivial even when considering just a [single](https://bugzilla.mozilla.org/show_bug.cgi?id=711043) [side channel](https://cseweb.ucsd.edu/~dkohlbre/papers/subnormal.pdf), and [more and more potential side channels](https://gruss.cc/files/fantastictimers.pdf) keep showing up.
Thus, we rely on disabling all possible timing sources or making them coarse-grained enough in order to render timing side channels unsuitable as a means for fingerprinting browser users.
We set `dom.enable_user_timing` and `dom.enable_resource_timing` to **false** to disable these explicit timing sources.
Furthermore, we clamp the resolution of explicit clocks to 100ms by setting `privacy.resistFingerprinting` to **true** thanks to [Mozilla 1217238](https://bugzilla.mozilla.org/show_bug.cgi?id=1217238).
This includes `performance.now()`, `new Date().getTime()`, `audioContext.currentTime`, `canvasStream.currentTime`, `video.currentTime`, `audio.currentTime`, `new File([], "").lastModified`, `new File([], "").lastModifiedDate.getTime()`, `animation.startTime`, `animation.currentTime`, `animation.timeline.currentTime`, and `document.timeline.currentTime`.
While clamping the clock resolution to 100ms is a step towards mitigating timing-based side channel fingerprinting, it is by no means sufficient.
It turns out that it is possible to subvert our clamping of explicit clocks by using [implicit ones](https://www.usenix.org/system/files/conference/usenixsecurity16/sec16_paper_kohlbrenner.pdf), e.g. extrapolating the true time by running a busy loop with a predictable operation in it.
We are tracking [this problem](https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/16110) in our bug tracker and are working with the research community and Mozilla to develop and test a proper solution to this part of our defense against timing-based side channel fingerprinting risks.
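A minimal sketch of such an implicit clock follows: by spinning between two edges of the clamped clock, a page can recover a time unit far finer than 100ms:

```typescript
// Implicit clock sketch: count busy-loop iterations within one clamped
// 100ms interval; the count itself becomes a fine-grained timer unit.
function iterationsPerClockEdge(): number {
  const t0 = performance.now();
  while (performance.now() === t0) { /* align to the next clock edge */ }
  const t1 = performance.now();
  let count = 0;
  while (performance.now() === t1) count++; // spin through a full interval
  return count;
}
console.log(`busy-loop iterations per interval: ${iterationsPerClockEdge()}`);
```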
19. **resource:// and chrome:// URIs Leaks**
Due to [bugs in Firefox](https://bugzilla.mozilla.org/show_bug.cgi?id=1120398) it is possible to detect the locale and the platform of a user.
Moreover, it is possible to [find out the extensions](https://www.usenix.org/system/files/conference/usenixsecurity17/sec17-sanchez-rola.pdf) a user has installed.
This is done by including `resource://` and/or `chrome://` URIs in web content, which point to resources included in the browser itself or in installed extensions, and exploiting the differing behavior that results: the browser raises an exception if a webpage requests a resource from an extension that is not installed.
This does not happen if the extension is indeed installed but the resource path does not exist.
20. **Locale Fingerprinting**
We provide non-English users the option of concealing their OS and browser locale from websites.
It is debatable whether this should be as high a priority as information specific to the user's computer, but for completeness, we attempt to maintain this property.
**Implementation Status**: We set the fallback character set to windows-1252 for all locales, via `intl.charset.default`.
We also set `javascript.use_us_english_locale` to **true** to instruct the JS engine to use en-US as its internal C locale for all Date, Math, and exception handling.
21. **Timezone and Clock Offset**
While the latency in Tor connections varies anywhere from milliseconds to a few seconds, it is still possible for the remote site to detect large differences between the user's clock and an official reference time source.
**Design Goal**: All users MUST report the same timezone to websites.
Currently, we choose UTC for this purpose, although an equally valid argument could be made for EDT/EST due to the large English-speaking population density (coupled with the fact that we spoof a US English user agent).
Additionally, the Tor software should detect if the user's clock is significantly divergent from the clocks of the relays that it connects to, and use this to reset the clock values used in the browser to something reasonably accurate.
Alternatively, the browser can obtain this clock skew via a mechanism similar to that used in [tlsdate](https://github.com/ioerror/tlsdate).
**Implementation Status**: We set the timezone to UTC by setting `privacy.resistFingerprinting` to **true** thanks to [Mozilla 1330890](https://bugzilla.mozilla.org/show_bug.cgi?id=1330890).
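Two common timezone probes that this spoofing neutralizes are sketched below; with the defense in place, both report UTC for every user:

```typescript
// Timezone probes: both are spoofed to UTC by privacy.resistFingerprinting.
console.log(new Date().getTimezoneOffset()); // minutes from UTC; spoofed to 0
console.log(Intl.DateTimeFormat().resolvedOptions().timeZone);
```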
22. **JavaScript Performance Fingerprinting**
[JavaScript performance fingerprinting](https://cseweb.ucsd.edu/~hovav/dist/jspriv.pdf) is the act of profiling the performance of various JavaScript functions for the purpose of fingerprinting the JavaScript engine and the CPU.
**Design Goal**: We have [several potential mitigation approaches](https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/3059) to reduce the accuracy of performance fingerprinting without risking too much damage to functionality.
Our current favorite is to reduce the resolution of the `Event.timeStamp` and the JavaScript `Date()` object, while also introducing jitter.
We believe that JavaScript time resolution may be reduced all the way to one-second granularity before it seriously impacts site operation.
Our goal with this quantization is to increase the amount of time it takes to mount a successful attack.
[Mowery et al](https://cseweb.ucsd.edu/~hovav/dist/jspriv.pdf) found that even with the default precision in most browsers, they required up to 120 seconds of amortization and repeated trials to get stable results from their feature set.
We intend to work with the research community to establish the optimum trade-off between quantization+jitter and amortization time, as well as identify highly variable JavaScript operations.
As long as these attacks take several seconds or more to execute, they are unlikely to be appealing to advertisers, and are also very likely to be noticed if deployed against a large number of people.
**Implementation Status**: Currently, our mitigation against performance fingerprinting is to disable [Navigation Timing](https://www.w3.org/TR/navigation-timing/) by setting the Firefox preference `dom.enable_performance` to **false**, and to disable the [Mozilla Video Statistics](https://developer.mozilla.org/en-US/docs/Web/API/HTMLVideoElement#Gecko-specific_properties) API extensions by setting the preference `media.video_stats.enabled` to **false**, too.
23. **Keystroke Fingerprinting**
Keystroke fingerprinting is the act of measuring key strike time and key flight time.
It is seeing increasing use as a biometric.
**Design Goal**: We intend to rely on the same mechanisms for defeating JavaScript performance fingerprinting: timestamp quantization and jitter.
**Implementation Status**: We clamp keyboard event resolution to 100ms by setting `privacy.resistFingerprinting` to **true** thanks to [Mozilla 1217238](https://bugzilla.mozilla.org/show_bug.cgi?id=1217238).
24. **Amount of Processor Cores (hardwareConcurrency)**
Modern computers have multiple physical processor cores available in their CPU.
For optimum performance, native code typically attempts to run as many threads as there are cores, and `navigator.hardwareConcurrency` makes the number of those threads (i.e. logical processors) available to web content.
**Design Goal**: Websites MUST NOT be able to fingerprint a user taking advantage of the amount of logical processors available.
**Implementation Status**: We set `dom.maxHardwareConcurrency` to **1** to report the same amount of logical processors for everyone.
However, there are [probabilistic ways](https://github.com/oftn/core-estimator) of determining the same information, which we are not currently defending against.
Moreover, we might even want to consider a more elaborate defense against this fingerprinting technique that does not make all users uniform but rather [follows a bucket approach](https://bugs.torproject.org/22127), as we currently do in our defense against screen size exfiltration.
25. **Web Audio API**
The [Web Audio API](https://developer.mozilla.org/en-US/docs/Web/API/Web_Audio_API) provides several means to aid in fingerprinting users.
At the simplest level it allows differentiating between users who have the API available and those who don't by checking for an `AudioContext` or `OscillatorNode` object.
However, the Web Audio API reveals more bits of information if audio signals generated with an `OscillatorNode` are processed, as [hardware and software differences](https://senglehardt.com/papers/ccs16_online_tracking.pdf) influence the results.
We disable the Web Audio API by setting `dom.webaudio.enabled` to **false**.
That has the positive side effect that it disables one of several means to perform [ultrasound cross-device tracking](https://petsymposium.org/2017/papers/issue2/paper18-2017-2-source.pdf) as well, which is based on having `AudioContext` available.
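A sketch of the kind of fingerprint this preference blocks follows: render a fixed oscillator through a compressor offline and summarize the samples, which vary with the audio stack and hardware (the summation below is a cheap stand-in for a real hash):

```typescript
// OfflineAudioContext fingerprint sketch; disabling dom.webaudio.enabled
// removes these constructors entirely.
async function audioFingerprint(): Promise<number> {
  const ctx = new OfflineAudioContext(1, 44100, 44100); // 1 channel, 1 second
  const osc = ctx.createOscillator();
  osc.type = "triangle";
  osc.frequency.value = 10000;
  const compressor = ctx.createDynamicsCompressor();
  osc.connect(compressor);
  compressor.connect(ctx.destination);
  osc.start();
  const rendered = await ctx.startRendering();
  // Sum a slice of samples as a stand-in for hashing the whole buffer.
  return rendered
    .getChannelData(0)
    .slice(4500, 5000)
    .reduce((acc, sample) => acc + Math.abs(sample), 0);
}
audioFingerprint().then((fp) => console.log(`audio fingerprint: ${fp}`));
```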
26. **MediaError.message**
The `MediaError` object allows the user agent to report errors that occurred while handling media, for instance using `audio` or `video` elements.
The `message` property provides specific diagnostic information to help understand the error condition.
As a defense-in-depth we make sure that no fingerprinting-relevant information leaks to websites this way by returning just an empty string.
27. **Connection State**
It is possible to monitor the connection state of a browser over time with [navigator.onLine](https://developer.mozilla.org/en-US/docs/Web/API/NavigatorOnLine/onLine).
We prevent this by setting `network.manage-offline-status` to **false**.
28. **Reader View**
[Reader View](https://support.mozilla.org/t5/Basic-Browsing/Firefox-Reader-View-for-clutter-free-web-pages/ta-p/38466) is a Firefox feature for viewing web pages clutter-free and easily adjusted to one's own needs and preferences.
To avoid fingerprintability risks we make users uniform by setting `reader.parse-on-load.enabled` to **false** and `browser.reader.detectedFirstArticle` to **true**.
This makes sure that documents are not parsed on load (which is disabled on some devices due to memory consumption anyway), and we pretend that everybody has already used that feature in the past.
29. **Contacting Mozilla Services**
The browser is based on Firefox which is a Mozilla product.
Quite naturally, Mozilla is interested in making users aware of new features and in gathering information to learn about the most pressing needs Firefox users are facing.
This is often implemented by contacting Mozilla services, be it for displaying further information about a new feature or by [sending (aggregated) data back for analysis](https://wiki.mozilla.org/Telemetry).
While some of those mechanisms are disabled by default on release channels (such as telemetry data), others are not.
We make sure that none of those Mozilla services are contacted to avoid possible fingerprinting risks.
In particular, we disable GeoIP-based search results by setting `browser.search.countryCode` and `browser.search.region` to **US** and `browser.search.geoip.url` to the empty string.
Furthermore, we disable Selfsupport and Unified Telemetry by setting `browser.selfsupport.enabled` and `toolkit.telemetry.unified` to **false**, and we make sure no related ping reaches Mozilla by setting `datareporting.healthreport.about.reportUrlUnified` to **data:text/plain,**.
The same is done with `datareporting.healthreport.about.reportUrl` and the new-tab tiles related preferences `browser.newtabpage.directory.ping` and `browser.newtabpage.directory.source`.
`browser.newtabpage.remote` is set to **false** in this context as well, as a defense-in-depth given that this feature is already off by default.
Additionally, we disable the UITour backend by setting `browser.uitour.enabled` to **false** and avoid getting Mozilla experiments installed into the browser by flipping `experiments.enabled` to **false**.
On the update side, we prevent the browser from pinging the new [Kinto](https://wiki.mozilla.org/Firefox/Kinto) service for block-list updates, as it is not yet used for them anyway.
This is done by setting `services.blocklist.update_enabled` to **false**.
The captive portal detection code is disabled as well as it phones home to Mozilla.
We set `network.captive-portal-service.enabled` to **false** to achieve that.
Unrelated to that, we make sure that Mozilla does not receive TLS error reports from browser users by hiding the respective checkbox, with `security.ssl.errorReporting.enabled` set to **false**.
And while we have the Push API disabled as there are no Service Workers available in the browser yet, we remove the value for `dom.push.serverURL` as a defense-in-depth.
Finally, we set `privacy.resistFingerprinting.block_mozAddonManager` to **true** to prevent Mozilla's websites from querying whether particular extensions are installed, and what their state is, via the `navigator.mozAddonManager` API.
We have [Safebrowsing](https://wiki.mozilla.org/Security/Safe_Browsing) disabled in the browser.
In order to avoid pinging providers for list updates we remove the entries for `browser.safebrowsing.provider.mozilla.updateURL` and `browser.safebrowsing.provider.mozilla.gethashURL` (and the values for Google related preferences as well).
30. **Operating System Type Fingerprinting**
As we mentioned in the introduction of this section, OS type fingerprinting is currently considered a lower priority, due simply to the numerous ways that characteristics of the operating system type may leak into content, and the comparatively low contribution of OS to overall entropy.
In particular, there are likely to be many ways to measure the differences in widget size, scrollbar size, and other rendered details on a page.
Also, directly exported OS routines (such as those from the standard C math library) expose differences in their implementations through their return values.
**Design Goal**: We intend to reduce or eliminate OS type fingerprinting to the best extent possible, but recognize that the effort for reward on this item is not as high as other areas.
The entropy on the current OS distribution is somewhere around 2 bits, which is much lower than other vectors which can also be used to fingerprint configuration and user-specific information.
**Implementation Status**: At least two HTML5 features have a different implementation status across the major OS vendors and/or the underlying hardware: the [Network Connection API](https://developer.mozilla.org/en-US/docs/DOM/window.navigator.connection), and the [Sensor API](https://wiki.mozilla.org/Sensor_API).
We disable these APIs through the Firefox preferences `dom.network.enabled` and `device.sensors.enabled`, setting both to **false**.
For more details on fingerprinting bugs and enhancements, see the [Fingerprinting](https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/?sort=updated_desc&state=opened&label_name%5B%5D=Fingerprinting&first_page_size=20) label in the Tor Browser issue tracker.
### 4.7 Long-Term Unlinkability via "New Identity" button
In order to avoid long-term linkability, we provide a "New Identity" context menu option in the browser.
**Design Goal**: All linkable identifiers and browser state MUST be cleared by this feature.
**Implementation Status**: First, the browser disables JavaScript in all open tabs and windows by using both the [browser.docShell.allowJavaScript](https://developer.mozilla.org/en-US/docs/XPCOM_Interface_Reference/nsIDocShell#Attributes) attribute as well as [nsIDOMWindowUtil.suppressEventHandling()](https://developer.mozilla.org/en-US/docs/XPCOM_Interface_Reference/nsIDOMWindowUtils#suppressEventHandling%28%29).
We then stop all page activity for each tab using [browser.webNavigation.stop(nsIWebNavigation.STOP_ALL)](https://developer.mozilla.org/en-US/docs/XPCOM_Interface_Reference/nsIWebNavigation#stop%28%29).
We then clear the site-specific Zoom by temporarily disabling the preference `browser.zoom.siteSpecific`, and clear the GeoIP wifi token URL `geo.wifi.access_token` and the last opened URL preference (if it exists).
Each tab is then closed.
After closing all tabs, we then clear the searchbox and findbox text and emit ["browser:purge-session-history"](https://developer.mozilla.org/en-US/docs/Supporting_private_browsing_mode#Private_browsing_notifications) (which instructs addons and various Firefox components to clear their session state).
Then we manually clear the following state: HTTP auth, SSL state, crypto tokens, OCSP state, site-specific content preferences (including HSTS state), the undo tab history, content and image cache, offline and memory cache, offline storage, Cache storage, IndexedDB storage, asm.js cache, cookies, DOM storage, the safe browsing key, the Google wifi geolocation token (if it exists), and the domain isolator state.
We also clear NoScript's site and temporary permissions, and all other browser site permissions.
After the state is cleared, we then close all remaining HTTP Keep-Alive connections and then send the NEWNYM signal to the Tor control port to cause a new circuit to be created.
Finally, a fresh browser window is opened, and the current browser window is closed (this does not spawn a new Firefox process, only a new window).
Upon the close of the final window, an unload handler is fired to invoke the [garbage collector](https://developer.mozilla.org/en-US/docs/Mozilla/Tech/XPCOM/Reference/Interface/nsIDOMWindowUtils#garbageCollect%28%29), which has the effect of immediately purging any blob:UUID URLs that were created by website content via [URL.createObjectURL](https://developer.mozilla.org/en-US/docs/Web/API/URL/createObjectURL).
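For reference, the NEWNYM step speaks the Tor control protocol. A minimal Node.js sketch follows, assuming a control port on 9051 and password authentication (the credential is a placeholder, not a real configuration):

```typescript
// Minimal sketch of the NEWNYM control-port exchange (Node.js; port and
// password are assumptions for illustration).
import * as net from "node:net";

const ctrl = net.connect(9051, "127.0.0.1", () => {
  ctrl.write('AUTHENTICATE "password"\r\n'); // placeholder credential
  ctrl.write("SIGNAL NEWNYM\r\n");           // request fresh circuits
  ctrl.end("QUIT\r\n");
});
ctrl.on("data", (chunk) => process.stdout.write(chunk.toString())); // expect 250 OK
```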
### 4.8 Other Security Measures
In addition to the above mechanisms that are devoted to preserving privacy while browsing, we also have a number of technical mechanisms to address other privacy and security issues.
1. **Security Slider**
In order to provide vulnerability surface reduction for users that need high security, we have implemented a "Security Slider" to allow users to make a tradeoff between usability and security while minimizing the total number of choices (to reduce fingerprinting).
Using metrics collected from Mozilla's bug tracker, we analyzed the vulnerability counts of core components, and used [information gathered from a study performed by iSec Partners](https://github.com/iSECPartners/publications/tree/master/reports/Tor%20Browser%20Bundle) to inform which features should be disabled at which security levels.
The Security Slider consists of three positions:
- **Low (default)**
At this security level, the preferences are the browser defaults.
This includes three features that were formerly governed by the slider at higher security levels: `gfx.font_rendering.graphite.enabled` is now set to **false**, after Mozilla was convinced that [leaving it enabled is too risky](https://bugzilla.mozilla.org/show_bug.cgi?id=1255731).
Even though Mozilla reverted that decision after another round of fixing critical Graphite bugs, we remain skeptical and keep that feature disabled for now.
`network.jar.block-remote-files` is set to **true**.
Mozilla tried to block remote JAR files in Firefox 45 but needed to revert that decision due to breaking IBM's iNotes.
While Mozilla [is working on getting this disabled again](https://bugzilla.mozilla.org/show_bug.cgi?id=1329336), we already take the protective stance now and block remote JAR files even at the low security level.
Finally, we exempt asm.js from the security slider and block it on all levels.
See the [Disk Avoidance](#43-disk-avoidance) and the cache linkability concerns in the [Cross-Origin Identifier Unlinkability](#45-cross-origin-identifier-unlinkability) sections for further details.
- **Medium**
At this security level, we disable the ION JIT (`javascript.options.ion`), native regular expressions (`javascript.options.native_regexp`), Baseline JIT (`javascript.options.baselinejit`), WebAudio (`media.webaudio.enabled`), MathML (`mathml.disabled`), SVG Opentype font rendering (`gfx.font_rendering.opentype_svg.enabled`), and make HTML5 audio and video click-to-play via NoScript (`noscript.forbidMedia`).
Furthermore, we only allow JavaScript to run if it is loaded over HTTPS and the URL bar is HTTPS (by setting `noscript.global` to false and `noscript.globalHttpsWhitelist` to true).
- **High**
This security level inherits the preferences from the Medium level, and additionally disables remote fonts (`noscript.forbidFonts`), completely disables JavaScript (by unsetting `noscript.globalHttpsWhitelist`), and disables SVG images (`svg.in-content.enabled`).
2. **Website Traffic Fingerprinting Defenses**
[Website Traffic Fingerprinting](#33-adversary-capabilities---attacks) is a statistical attack to attempt to recognize specific encrypted website activity.
**Design Goal**: We want to deploy a mechanism that reduces the accuracy of [useful features](https://en.wikipedia.org/wiki/Feature_selection) available for classification.
This mechanism would either impact the true and false positive accuracy rates, or reduce the number of web pages that could be classified at a given accuracy rate.
Ideally, this mechanism would be as light-weight as possible, and would be tunable in terms of overhead.
We suspect that it may even be possible to deploy a mechanism that reduces feature extraction resolution without any network overhead.
In the no-overhead category, we have [HTTPOS](https://freehaven.net/anonbib/cache/LZCLCP_NDSS11.pdf) and [better use of HTTP pipelining and/or SPDY](https://blog.torproject.org/blog/experimental-defense-website-traffic-fingerprinting).
In the tunable/low-overhead category, we have [Adaptive Padding](https://arxiv.org/abs/1512.00524) and [Congestion-Sensitive BUFLO](https://www3.cs.stonybrook.edu/~xcai/fp.pdf).
It may be also possible to [tune such defenses](https://gitlab.torproject.org/tpo/core/tor/-/issues/7028) such that they only use existing spare Guard bandwidth capacity in the Tor network, making them also effectively no-overhead.
**Implementation Status**: Nothing?
3. **Privacy-preserving update notification**
~~In order to inform the user when their browser is out of date, we perform a privacy-preserving update check asynchronously in the background.
The check uses Tor to download the file https://www.torproject.org/projects/torbrowser/RecommendedTBBVersions and searches that version list for the current value for the local preference `torbrowser.version`.
If the value from our preference is present in the recommended version list, the check is considered to have succeeded and the user is up to date.
If not, it is considered to have failed and an update is needed.
The check is triggered upon browser launch, new window, and new tab, but is rate limited so as to happen no more frequently than once every 1.5 hours.~~
~~If the check fails, we cache this fact, and update the Torbutton graphic to display a flashing warning icon and insert a menu option that provides a link to our download page.
Additionally, we reset the value for the browser homepage to point to a [page that informs the user](https://check.torproject.org/?lang=en-US&small=1&uptodate=0) that their browser is out of date.~~
We also make use of the in-browser Mozilla updater, and have [patched the updater](https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/4234) to avoid sending OS and Kernel version information as part of its update pings.
## 5. Build Security and Package Integrity
...