# Tor Browser issues
https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues

## Backport security fixes (Android & wontfix) from Firefox 120 to 115.5-based Tor Browser (#42287)
https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/42287 (ma1, 2023-11-21)

<details>
<summary>Explanation of Variables</summary>
- `$(ESR_VERSION)`: the Mozilla defined ESR version, used in various places for building tor-browser tags, labels, etc
- **Example**: `102.8.0`
- `$(RR_VERSION)`: the Mozilla defined Rapid-Release version; Tor Browser for Android is based off of the `$(ESR_VERSION)`, but Mozilla's Firefox for Android is based off of the `$(RR_VERSION)` so we need to keep track of security vulnerabilities to backport from the monthly Rapid-Release train and our frozen ESR train.
- **Example**: `110`
- `$(PROJECT_NAME)`: the name of the browser project, either `base-browser` or `tor-browser`
- `$(TOR_BROWSER_MAJOR)`: the Tor Browser major version
- **Example**: `12`
- `$(TOR_BROWSER_MINOR)`: the Tor Browser minor version
- **Example**: either `0` or `5`; Alpha's is always `(Stable + 5) % 10`
- `$(BUILD_N)`: a project's build revision within its branch; many of the Firefox-related projects have a `$(BUILD_N)` suffix, and it may differ between projects even when they contribute to the same build.
- **Example**: `build1`
</details>
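As a concrete illustration of how these variables compose into the tag names used in the Sign/Tag steps below (all values here are hypothetical examples, not a real release):

```javascript
// Hypothetical example values; a real release substitutes its own.
const ESR_VERSION = "115.5.0";
const PROJECT_NAME = "tor-browser";
const TOR_BROWSER_MAJOR = "13";
const TOR_BROWSER_MINOR = "0";
const BUILD_N = "build1";

// Tag format: $(PROJECT_NAME)-$(ESR_VERSION)-$(TOR_BROWSER_MAJOR).$(TOR_BROWSER_MINOR)-1-$(BUILD_N)
const tag = `${PROJECT_NAME}-${ESR_VERSION}-${TOR_BROWSER_MAJOR}.${TOR_BROWSER_MINOR}-1-${BUILD_N}`;
console.log(tag); // tor-browser-115.5.0-13.0-1-build1
```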
**NOTE:** It is assumed that the `tor-browser` rebases (stable and alpha) have already happened and that `build1` build tags exist for both `base-browser` and `tor-browser` (stable and alpha)
### **Bookkeeping**
- [x] Link this issue to the appropriate [Release Prep](https://gitlab.torproject.org/tpo/applications/tor-browser-build/-/issues/?sort=updated_desc&state=opened&label_name%5B%5D=Release%20Prep) issues (stable and alpha).
### **Security Vulnerabilities Report**: https://www.mozilla.org/en-US/security/advisories/
- Potentially Affected Components:
- `firefox`/`geckoview`: https://github.com/mozilla/gecko-dev
- `application-services`: https://github.com/mozilla/application-services
- `android-components` (ESR 102 only): https://github.com/mozilla-mobile/firefox-android
- `fenix` (ESR 102 only): https://github.com/mozilla-mobile/firefox-android
- `firefox-android`: https://github.com/mozilla-mobile/firefox-android
**NOTE:** `android-components` and `fenix` used to have their own repos, but since November 2022 they have converged to a single `firefox-android` repo. Any backports will require manually porting patches over to our legacy repos until we have transitioned to ESR 115.
- [x] Go through the `Security Vulnerabilities fixed in Firefox $(RR_VERSION)` report and create a candidate list of CVEs which potentially need to be backported in this issue:
- CVEs which are explicitly labeled as 'Android' only
- CVEs which are fixed in Rapid Release but not in ESR
- 'Memory safety bugs' fixed in Rapid Release but not in ESR
- [x] Foreach issue:
- Create link to the CVE on [mozilla.org](https://www.mozilla.org/en-US/security/advisories/)
- **Example**: https://www.mozilla.org/en-US/security/advisories/mfsa2023-05/#CVE-2023-25740
- Create link to the associated Bugzilla issues (found in the CVE description)
- Create links to the relevant `gecko-dev`/other commit hashes which need to be backported OR a brief justification for why the fix does not need to be backported
- To find the `gecko-dev` version of a `mozilla-central` commit, search for a unique string from the relevant `mozilla-central` commit message in the `gecko-dev/release` branch log.
- **NOTE:** This process is unfortunately somewhat poorly defined/ad-hoc given the general variation in how Bugzilla issues are labeled and resolved. In general this is going to involve a bit of hunting to identify needed commits or determining whether or not the fix is relevant.
### CVEs
- [x] https://www.mozilla.org/en-US/security/advisories/mfsa2023-49/#CVE-2023-6210 // CVE-2023-6210: Mixed-content resources not blocked in a javascript: pop-up
- [Bug 1801501](https://bugzilla.mozilla.org/show_bug.cgi?id=1801501)
- **Note**: NO backport, Tor Browser unaffected
- [x] https://www.mozilla.org/en-US/security/advisories/mfsa2023-49/#CVE-2023-6213 // CVE-2023-6213: Memory safety bugs fixed in Firefox 120
- [Bug 1849265](https://bugzilla.mozilla.org/show_bug.cgi?id=1849265)
- **Note**: NO backport, Tor Browser unaffected
- [x] https://www.mozilla.org/en-US/security/advisories/mfsa2023-49/#CVE-2023-6211 // CVE-2023-6211: Clickjacking to load insecure pages in HTTPS-only mode
- [Bug 1850200](https://bugzilla.mozilla.org/show_bug.cgi?id=1850200)
- **Note**: NO backport, already backported in previous Tor Browser stable
- [x] https://www.mozilla.org/en-US/security/advisories/mfsa2023-49/#CVE-2023-6213 // CVE-2023-6213: Memory safety bugs fixed in Firefox 120
- [Bug 1851118](https://bugzilla.mozilla.org/show_bug.cgi?id=1851118)
- **Note**: NO backport, risky and complex patch, not exploitable in Tor Browser supported configurations
### **tor-browser**: https://gitlab.torproject.org/tpo/applications/tor-browser.git
- [ ] Backport any Android-specific security fixes from Firefox rapid-release
- [ ] Backport patches to `tor-browser` stable branch
- [ ] Open MR
- [ ] Merge
- [ ] Rebase patches onto:
- [ ] `base-browser` stable
- [ ] `tor-browser` alpha
- [ ] `base-browser` alpha
- [ ] Sign/Tag commits:
- **Tag**: `$(PROJECT_NAME)-$(ESR_VERSION)-$(TOR_BROWSER_MAJOR).$(TOR_BROWSER_MINOR)-1-$(BUILD_N)`
- **Message**: `Tagging $(BUILD_N) for $(ESR_VERSION)-based stable|alpha`
- [ ] `base-browser` stable
- [ ] `tor-browser` stable
- [ ] `base-browser` alpha
- [ ] `tor-browser` alpha
- [ ] Push tags to `upstream`
- **OR**
- [x] No backports
### **application-services**: https://gitlab.torproject.org/tpo/applications/application-services
- **NOTE**: we will need to set up a GitLab copy of this repo and update `tor-browser-build` before we can apply security backports here
- [ ] Backport any Android-specific security fixes from Firefox rapid-release
- [ ] Backport patches to `application-services` stable branch
- [ ] Open MR
- [ ] Merge
- [ ] Rebase patches onto `application-services` alpha
- [ ] Sign/Tag commits:
- **Tag**: `application-services-$(ESR_VERSION)-$(TOR_BROWSER_MAJOR).$(TOR_BROWSER_MINOR)-1-$(BUILD_N)`
- **Message**: `Tagging $(BUILD_N) for $(ESR_VERSION)-based stable|alpha`
- [ ] `application-services` stable
- [ ] `application-services` alpha
- [ ] Push tags to `upstream`
- **OR**
- [x] No backports
### **android-components (Optional, ESR 102)**: https://gitlab.torproject.org/tpo/applications/android-components.git
- [ ] Backport any Android-specific security fixes from Firefox rapid-release
- **NOTE**: Since November 2022, this repo has been merged with `fenix` into a singular `firefox-android` repo: https://github.com/mozilla-mobile/firefox-android. Any backport will require a patch rewrite to apply to our legacy `android-components` project.
- [ ] Backport patches to `android-components` stable branch
- [ ] Open MR
- [ ] Merge
- [ ] Rebase patches onto `android-components` alpha
- [ ] Sign/Tag commits:
- **Tag**: `android-components-$(ESR_VERSION)-$(TOR_BROWSER_MAJOR).$(TOR_BROWSER_MINOR)-1-$(BUILD_N)`
- **Message**: `Tagging $(BUILD_N) for $(ESR_VERSION)-based stable|alpha`
- [ ] `android-components` stable
- [ ] `android-components` alpha
- [ ] Push tags to `upstream`
- **OR**
- [ ] No backports
### **fenix (Optional, ESR 102)**: https://gitlab.torproject.org/tpo/applications/fenix.git
- [ ] Backport any Android-specific security fixes from Firefox rapid-release
- **NOTE**: Since February 2023, this repo has been merged with `android-components` into a singular `firefox-android` repo: https://github.com/mozilla-mobile/firefox-android. Any backport will require a patch rewrite to apply to our legacy `fenix` project.
- [ ] Backport patches to `fenix` stable branch
- [ ] Open MR
- [ ] Merge
- [ ] Rebase patches onto `fenix` alpha
- [ ] Sign/Tag commits:
- **Tag**: `fenix-$(ESR_VERSION)-$(TOR_BROWSER_MAJOR).$(TOR_BROWSER_MINOR)-1-$(BUILD_N)`
- **Message**: `Tagging $(BUILD_N) for $(ESR_VERSION)-based stable|alpha`
- [ ] `fenix` stable
- [ ] `fenix` alpha
- [ ] Push tags to `upstream`
- **OR**
- [x] No backports
### **firefox-android**: https://gitlab.torproject.org/tpo/applications/firefox-android
- [ ] Backport any Android-specific security fixes from Firefox rapid-release
- [ ] Backport patches to `firefox-android` stable branch
- [ ] Open MR
- [ ] Merge
- [ ] Rebase patches onto `firefox-android` alpha
- [ ] Sign/Tag commits:
- **Tag**: `firefox-android-$(ESR_VERSION)-$(TOR_BROWSER_MAJOR).$(TOR_BROWSER_MINOR)-1-$(BUILD_N)`
- **Message**: `Tagging $(BUILD_N) for $(ESR_VERSION)-based stable|alpha`
- [ ] `firefox-android` stable
- [ ] `firefox-android` alpha
- [ ] Push tags to `upstream`
- **OR**
- [x] No backports

## do something about autoconfig.js / cfg / js? (#42143)
https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/42143 (Thorin, 2024-03-18)

e.g. https://github.com/mullvad/mullvad-browser/issues/137#issuecomment-1742215179
Things
- tweak about:config interstitial
- lock prefs where appropriate
- do something with user.js?
- do something with userChrome/content? (there is a pref we can lock)
- UX stuff (education, remove dead UI)
and now do something with autoconfig.js? I don't think we have an issue for this, so here it is. Maybe we can make this a meta. As for the autoconfig, I think tom had/has an issue on this upstream as a security concern
edit: another example: https://old.reddit.com/r/firefox/comments/16x7qbq/cant_disable_sponsored_shortcuts_on_home_screen/
- do we protect the user here against 3rd party software meddling with settings? cc: @pierov

## Integrate cross-tab identity leak protection into Tor Browser with native UX (#41112)
https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/41112 (donuts, 2024-03-27)

In response to the potential for cache side channel attacks reported in https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/41071, @ma1 deployed [Cross-tab Identity Leak Protection](https://noscript.net/usage/#crosstab-identity-leak-protection) (or "TabGuard") in NoScript 11.4.8. However, some users are finding the warning confusing and/or are suffering from warning fatigue – e.g.:
```
<Jeremy_Rand_36C3[m]> So far at least 2 users in #tor have been very confused about the NoScript warnings that were recently added. One of them thought the warning meant his identity had already leaked, and panicked and shut off Tor Browser. Seems like we should ask the UX Team to evaluate how we can improve this, now that we have some breathing room since the vulnerability is mitigated.
<Jeremy_Rand_36C3[m]> One of the two users I noticed who was confused about the warning was one of my co-workers, who is very technically proficient, including about Tor, and even he couldn't understand what the warning was about, what triggered it, and what the correct course of action was
<Jeremy_Rand_36C3[m]> Then you have a less sophisticated user who thought the warning meant he was already pwned and panicked
<Jeremy_Rand_36C3[m]> I was hoping the UX Team might be able to evaluate how this warning can be better presented so that users don't get confused or make bad decisions when they see it
```
We're planning on integrating this feature into Tor Browser as part of the work to migrate the Security Level feature in https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/40925. We should take this opportunity to improve the UX in general, in addition to converting the feature into standard Tor Browser UI patterns.

## Lock javascript.options.spectre.disable_for_isolated_content to false (#41943)
https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/41943 (richard, 2023-10-03)

Link: https://bugzilla.mozilla.org/show_bug.cgi?id=1774178
I suspect we will want to lock this pref and prevent users from reducing their security; this would need to be looked into.

## TTP-02-001 WP1: XSS in TorConnect's captive portal (Info) (#41766)
https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/41766 (richard, 2023-10-19)

>>>
## Description:
TorConnect's captive portal performs a redirect to a URL that is retrieved from the **redirect** parameter located in the **query** string. No validation is performed to guarantee that the scheme of the URL is valid before it is used in the redirection. Note that the redirect is performed after the user successfully connects to Tor.
Fortunately, arbitrary JavaScript execution is prevented due to the strict CSP policy that is applied to the `about:torconnect` page. Hence, the severity has appropriately been set at **Info** only.
## Affected file:
_browser/components/torconnect/content/aboutTorConnect.js_
## Affected code:
```javascript
async init() {
  // see if a user has a final destination after bootstrapping
  let params = new URLSearchParams(new URL(document.location.href).search);
  if (params.has("redirect")) {
    const encodedRedirect = params.get("redirect");
    this.redirect = decodeURIComponent(encodedRedirect);
  } else {
    // if the user gets here manually or via the button in the urlbar
    // then we will redirect to about:tor
    this.redirect = "about:tor";
  } [...]
}
```
## Steps to reproduce:
1. Open the Tor Browser and access `about:torconnect?redirect=javascript:alert(document.domain);`
2. Click on Connect and check the DevTools to verify that JavaScript execution was prevented by CSP.
To mitigate this issue, Cure53 advises validating the scheme of the URL from the **redirect** parameter, and verifying it against an allow-list of safe schemes.
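The advised allow-list check could be sketched as follows (a hypothetical helper, not the actual Tor Browser patch; the function name and scheme list are illustrative assumptions):

```javascript
// Hypothetical sketch of the advised mitigation: only honor redirect targets
// whose scheme is on an explicit allow-list, falling back to about:tor.
const SAFE_SCHEMES = ["http:", "https:", "about:"];

function safeRedirectTarget(redirect, fallback = "about:tor") {
  try {
    const url = new URL(redirect);
    return SAFE_SCHEMES.includes(url.protocol) ? redirect : fallback;
  } catch (e) {
    // Unparseable input is treated as unsafe.
    return fallback;
  }
}

console.log(safeRedirectTarget("javascript:alert(document.domain)")); // "about:tor"
console.log(safeRedirectTarget("https://example.com/"));              // "https://example.com/"
```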
>>>

## TTP-02-006 WP1: Information leaks via custom homepage (Low) (#41765)
https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/41765 (richard, 2023-11-07)

>>>
## Description
It was discovered that setting a custom homepage can lead to information leaks under specific circumstances, namely when malicious approaches are combined with use of the **Reset your Identity** feature. When a user has their custom homepage open in a browser tab and then decides to reset their identity, the homepage will automatically open again after the browser restarts with the **new identity**. If the custom homepage is malicious, it could track the moment the user left the page and infer that the new user who shortly afterwards accessed the page is the same as the previous user.
Furthermore, a malicious webpage could use the `onbeforeunload` function to determine with confidence whether the user initiated an identity reset. If the user tried to close the browser or navigate away, the `onbeforeunload` dialog would be displayed and block further actions, giving enough time for the script to ping the server. In contrast, if the user chose to reset their identity, the browser would be automatically closed, and no ping would be sent. The PoC below demonstrates how the above sequence could be achieved. Additional steps to track when the user left and rejoined the page would have to be added to properly infer the user's new identity.
## PoC:
```html
<script>
let exit;
onbeforeunload = () => { exit = true; return ""; };
let timer = setInterval(() => {
  if (exit) {
    let img = new Image();
    img.src = "/exited";
    clearInterval(timer);
    timer = false;
  }
}, 1);
</script>
```
## Steps to reproduce:
1. Open the Tor Browser and connect to it.
2. Save the PoC above as an HTML file and open it on the Tor Browser.
3. Observe a request made to /exited if the user tried to close the browser or navigate away from the tab. See that the data will be handled differently if the user tries to reset their identity.
To mitigate this issue, Cure53 advises removing the ability to set custom homepages from the options available to users. Alternatively, the custom homepage should not be opened automatically upon usage of the Reset your Identity feature.
>>>

## TTP-02-002 WP1: Redirect prevents switching to new Tor Circuit (Info) (#41767)
https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/41767 (richard, 2024-03-12)

>>>
## Description:
It was discovered that navigation initiated through the new Tor Circuit feature can be hijacked. This can be accomplished by redirecting the current website to a cached page immediately after the Tor Circuit switch starts. As a result, the attacker-initiated navigation occurs before the Tor Circuit's browser-initiated navigation and, subsequently, the next step is canceled.
An attacker could exploit this vulnerability to prevent users from switching circuits while browsing a malicious webpage. Although this prevents the user from changing their Tor Circuit, it was concluded that this does not pose any immediate security risk, and as such, the severity mark was appropriately set at Info.
## PoC:
```html
<?php
header("Cache-Control: max-age=604800");
header("Age: 100");
?>
<html>
<script>
let status = false;
onbeforeunload = () => {
  status = true;
};
let timer = setInterval(() => {
  if (status) {
    status = false;
    clearInterval(timer);
    location.href = location.href;
  }
}, 1);
</script>
</html>
```
## Steps to reproduce:
1. Open the Tor Browser and connect to it.
2. Save the PoC above as a PHP file and serve it through a PHP server.
3. Access the file a few times through the Tor Browser to make sure it gets cached by the browser.
4. Click on the **Tor Circuit** button and then on the **New Tor circuit for this site** option.
5. The page will quickly be reloaded but the Circuit will remain the same.
To mitigate this issue, Cure53 advises forcing the navigation initiated by the new **Tor Circuit** feature to be completed. Cancellation of a user-initiated navigation is ill-advised in this scenario. However, during the testing phase, the team was unable to pinpoint the specific code responsible for this issue. As a result, the mitigation advice provided is currently incomplete.
>>>

## TTP-02-005 WP1: Redirect to about:blank hides the new Tor Circuit button (Info) (#41768)
https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/41768 (richard, 2023-10-19)

>>>
## Description:
It is possible to hide the **Tor Circuit** button from the address bar for a given tab by listening to the `onbeforeunload` event and redirecting the page to `about:blank` when the event is triggered.
If a user attempts to reset their identity by clicking on the **New Tor circuit for this site** option, the navigation can be hijacked by the attacker's script. A blank page will be displayed as a consequence. If the user attempts to navigate back to the previous page using the Back button, the **Tor Circuit** button will not be displayed in the address bar.
Similarly to [TTP-02-002](https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/41767), this issue was found not to pose any immediate security risk and is included as **Info** only.
## PoC:
```html
<script>
let status;
onbeforeunload = () => {
  status = true;
};
let timer = setInterval(() => {
  if (status) {
    status = false;
    clearInterval(timer);
    location = "about:blank";
  }
}, 1);
</script>
```
## Steps to reproduce:
1. Open the Tor Browser and connect to it.
2. Save the PoC above as an HTML file and open it in the browser.
3. Click on the **Tor Circuit** button and then on the **New Tor circuit for this site** option.
4. The page will be redirected to `about:blank`.
5. Click on the **Back** option and observe that the **Tor Circuit** button is hidden for this page.
To mitigate this issue, Cure53 advises applying the same mitigation as specified in the [TTP-02-002](https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/41767) ticket. Given these issues seem to be related and they might share the same root cause, it is recommended to consider and address them together.
>>>

## TTP-02-007 WP1: Missing about: pages in shouldShowTorConnect check (Info) (#41769)
https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/41769 (richard, 2023-10-19)

>>>
## Description:
It was discovered that the `about:welcome`, `about:privatebrowsing`, and `about:home` pages are not redirected to `about:tor` when they are accessed by a user who has not connected to Tor yet.
While this behavior does not present any immediate security risk, it can potentially cause confusion or alarm users who may access these pages before being connected to the Tor network. To ensure consistency across all about: pages, it is recommended to deploy relevant changes.
## Affected file:
`browser/base/content/utilityOverlay.js`
## Affected code:
```javascript
if (TorConnect.shouldShowTorConnect) {
  if (
    url === "about:tor" ||
    (url === "about:newtab" &&
      Services.prefs.getBoolPref("browser.newtabpage.enabled", false))
  ) {
    url = TorConnect.getRedirectURL(url);
  }
}
```
In order to reproduce this issue, simply open the Tor Browser, access `about:home`, and note that the page does not perform an automated redirection to `about:tor`.
To mitigate the problem, Cure53 advises including additional checks to validate whether the URL matches `about:welcome`, `about:privatebrowsing`, or `about:home`. If a match is found, the page should be redirected to `about:tor`.
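The advised check could be sketched like this (hypothetical helper names; the real fix would live in `utilityOverlay.js` and go through `TorConnect.getRedirectURL`):

```javascript
// Hypothetical sketch: extend the redirect condition to cover the other
// about: pages named in the report, not just about:tor / about:newtab.
const REDIRECTED_ABOUT_PAGES = [
  "about:tor",
  "about:welcome",
  "about:privatebrowsing",
  "about:home",
];

function needsTorConnectRedirect(url, newTabPageEnabled) {
  return (
    REDIRECTED_ABOUT_PAGES.includes(url) ||
    (url === "about:newtab" && newTabPageEnabled)
  );
}

console.log(needsTorConnectRedirect("about:home", false)); // true
console.log(needsTorConnectRedirect("https://example.com/", false)); // false
```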
>>>

## Prevent NoScript from being removed / disabled until core functionality has been migrated to Tor Browser (#41598)
https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/41598 (richard, 2023-03-27)

Users can currently uninstall or disable NoScript, which makes Security Level just silently fail in interesting ways. We should fix this issue.

## Disable MOZ_DISABLE_CONTENT_SANDBOX (#25916)
https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/25916 (Tom Ritter, 2023-07-12; Sponsor 131 - Phase 2 - Privacy Browser)

MOZ_DISABLE_CONTENT_SANDBOX can be used at runtime to disable the content sandbox. If an attacker can influence this, we're probably already sunk, but just like we disable the "Dump all your TLS Session Keys here please" environment variable, we should disable this one too.

## Crypto warning weaknesses (#41539)
https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/41539 (henry, 2023-01-30)

The "Bug 40209: Implement Basic Crypto Safety" patch (`73640da2c4e719493b45fb6140f7ad2666326d89`) tries to prevent users from using malicious crypto addresses on HTTP websites. It does this under the following conditions:
1. The website is HTTP and not `.onion` (so vulnerable to being spoofed).
2. The user [copies or cuts text](https://gitlab.torproject.org/tpo/applications/tor-browser/-/commit/73640da2c4e719493b45fb6140f7ad2666326d89#17431c47080b50e91d17ade0423f534d7467c15d_0_75)
3. And the copied text [looks like a crypto address](https://gitlab.torproject.org/tpo/applications/tor-browser/-/commit/73640da2c4e719493b45fb6140f7ad2666326d89#17431c47080b50e91d17ade0423f534d7467c15d_0_78)
In this case it shows the user a popup warning them about the potentially inserted crypto address.
## Weaknesses
I can think of three weaknesses to this approach.
### White space
Currently, [we only trim the copied text](https://gitlab.torproject.org/tpo/applications/tor-browser/-/commit/73640da2c4e719493b45fb6140f7ad2666326d89#17431c47080b50e91d17ade0423f534d7467c15d_0_77) rather than remove all whitespace within as well. This means that you can just insert some whitespace in the address (they could make it look presentational, or use CSS to hide it) and the user won't get a warning.
It is not that unusual for text inputs to consume (some) whitespace. And even if they didn't, a user who has already copied the text will probably just remove the whitespace themselves after pasting.
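A sketch of the suggested hardening (the helper and the regex below are simplified, illustrative stand-ins, not the patch's real pattern set):

```javascript
// Hypothetical sketch: strip ALL whitespace before matching, instead of only
// trimming the ends, so whitespace injected mid-address no longer evades the check.
function normalizeClipboardText(text) {
  return text.replace(/\s+/g, "");
}

// Simplified Bitcoin-address-shaped pattern, for illustration only.
const BTC_LIKE = /^[13][a-km-zA-HJ-NP-Z1-9]{25,34}$/;

const copied = "1A1zP1eP5QGefi2 DMPTfTL5SLmv7Div fNa"; // address with injected spaces
console.log(BTC_LIKE.test(copied.trim()));                  // false: evades a trim-only check
console.log(BTC_LIKE.test(normalizeClipboardText(copied))); // true: caught after normalization
```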
### Drag and drop
No warning is triggered if the user starts dragging the crypto address. Maybe this doesn't come up much, but the website could try and encourage it by just writing "Drag and drop the address below". Or setting `user-select: none` but making the address draggable.
### Copying the address manually
If you set `user-select: none` on the address then there is no way to copy the text. If the user already trusts the HTTP website, they may just copy out the address by hand. Though given the length of some addresses, maybe they wouldn't bother.
## Risk
I'm not sure how high the risk is since we have HTTPS-always now. But we have decided to still keep the crypto warning in place as a protective measure.

## Prevent automatic HTTP redirects (#18107)
https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/18107 (Trac, 2023-01-05)

Apparently, at some point this feature was removed from Firefox. The option "Advanced -> General -> Warn me when websites try to redirect" doesn't seem to work. For example, this link redirects automatically: http://bit.ly/M4DEDa
I think that automatic HTTP redirects are a potential attack vector. (See, for example, [1]). Can the option to disable them be restored?
[1] https://www.reddit.com/r/TOR/comments/41bfwq/tor_exits_can_strip_ssl_inject_malicious_js_then/
**Trac**:
**Username**: slycelote

## [Feature proposal] Verification of onion service integrity (#41041)
https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/41041 (Erik Moeller, 2023-05-02)

## Problem statement
[SecureDrop](https://securedrop.org/) and similar onion services that seek to provide end-to-end-encrypted communications (between sender and designated recipient) have a bootstrapping problem: if the server is compromised, users cannot be sure that their communications are in fact end-to-end encrypted. Server-provided code or cryptographic key material may have been tampered with.
This is not addressed by existing web standards like [SRI](https://developer.mozilla.org/en-US/docs/Web/Security/Subresource_Integrity) and [CSP](https://developer.mozilla.org/en-US/docs/Web/HTTP/CSP), both of which depend on the server being a trusted resource to begin with. In the context of WhatsApp E2EE, Cloudflare/Facebook have recently piloted the use of an [integrity verification browser extension](https://blog.cloudflare.com/cloudflare-verifies-code-whatsapp-web-serves-users/).
We similarly need a way to securely ship authenticated JavaScript and WASM code and to ensure that script execution is limited to those resources only.
The executed code would be the same for all SecureDrop instances of the same version. This requirement is both to prevent browser exploits from untrusted sites and from trusted but compromised websites, as well as to prevent MITM attacks from trusted but compromised websites.
## Proposal
We suggest that an integrity verification feature can be built on top of the existing [about:rulesets](https://gitlab.torproject.org/tpo/applications/tor-browser/-/merge_requests/262) functionality in Tor Browser, which maps full-length onion addresses against short names in the form `<service-name>.<namespace>.tor.onion`, e.g., `nytimes.securedrop.tor.onion`, and ships this information as a signed ruleset.
In this proposal, a ruleset provider could act as a verifier of a set of hashes (e.g., sha-256) which correspond to accepted response bodies for specific paths, e.g., `/index.html`, `/1.0.0/`. Subresources could then be verified using SRI.
Tor Browser would need to compute the hash based on the decompressed response body, before rendering the page, and display an error message if it does not correspond to one of the accepted hashes.
Interactions with Tor Browser's safety settings and the NoScript extension will need to be considered; ideally we'd like to ensure script execution is limited to resources that are verified directly or indirectly (e.g., via SRI hashes in a verified resource).
## Alternatives and implementation
We’d be happy to discuss alternative approaches that seem like a better fit from the Tor Project’s perspective, and are open to partnering directly with you on the implementing, testing and piloting any agreed upon approach.https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/22985Can we simplify and clarify click-to-play of audio/video?2023-01-05T17:37:48ZArthur EdelsteinCan we simplify and clarify click-to-play of audio/video?Right now click-to-play of videos is quite cumbersome and has poor usability. For example on youtube, this is what I observe on Medium Security.
* On first page load, no video or audio is visible -- the video box is gray. A "musical notes" icon appears in the middle of the video box, and an "orbiting dots" indicator seems to indicate some problem loading. After a few seconds the video box goes black and it says "an error occurred." Then after another few seconds the "musical notes" icon reappears.
* If I click on the "musical notes" icon, then a confirmation box appears that says "Temporarily allow ... [URLs and codec gibberish]". If I click OK, then the whole page reloads. Again I get a gray video box with orbiting dots. This time there is a film canister icon in the middle of the dots.
* If I click on the film canister it says, "Temporarily allow [URL and more codec gibberish]". Again I click OK, the page reloads and the video finally plays.
So here, click-to-play required two clicks and two reloads (plus confirmation clicks). Ideally it should require only one reload. The option to click to play the video should be much more clear (it should probably have the text "Click to Play"). The click-to-play button shouldn't disappear when the youtube page tries to re-load the video. If a confirmation prompt is to be shown, then it should clearly explain to the user that video/audio is about to be loaded, and what the security concerns are.Sponsor 131 - Phase 5 - Ongoing Maintenancehttps://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/28621Investigate "website fingerprinting through cache occupancy channel"2023-01-05T17:31:44ZArthur EdelsteinInvestigate "website fingerprinting through cache occupancy channel"See this paper:
https://arxiv.org/abs/1811.07153
> Robust Website Fingerprinting Through the Cache Occupancy Channel
> Anatoly Shusterman, Lachlan Kang, Yarden Haskal, Yosef Meltser, Prateek Mittal, Yossi Oren, Yuval Yarom
> (Submitted on 17 Nov 2018)
>
> Website fingerprinting attacks, which use statistical analysis on network traffic to compromise user privacy, have been shown to be effective even if the traffic is sent over anonymity-preserving networks such as Tor. The classical attack model used to evaluate website fingerprinting attacks assumes an on-path adversary, who can observe all traffic traveling between the user's computer and the Tor network. In this work we investigate these attacks under a different attack model, in which the adversary is capable of running a small amount of unprivileged code on the target user's computer. Under this model, the attacker can mount cache side-channel attacks, which exploit the effects of contention on the CPU's cache, to identify the website being browsed. In an important special case of this attack model, a JavaScript attack is launched when the target user visits a website controlled by the attacker. The effectiveness of this attack scenario has never been systematically analyzed, especially in the open-world model which assumes that the user is visiting a mix of both sensitive and non-sensitive sites. In this work we show that cache website fingerprinting attacks in JavaScript are highly feasible, even when they are run from highly restrictive environments, such as the Tor Browser. Specifically, we use machine learning techniques to classify traces of cache activity. Unlike prior works, which try to identify cache conflicts, our work measures the overall occupancy of the last-level cache. We show that our approach achieves high classification accuracy in both the open-world and the closed-world models.
We further show that our techniques are resilient both to network-based defenses and to side-channel countermeasures introduced to modern browsers as a response to the Spectre attack.https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/27462Use OSS-Fuzz for Tor Browser2023-01-05T17:28:25ZGeorg KoppenUse OSS-Fuzz for Tor BrowserWe might be able to leverage OSS-Fuzz for getting our browser patches fuzzed. At least we should investigate the requirements and necessary changes we'd need to make on our side to make this happen.We might be able to leverage OSS-Fuzz for getting our browser patches fuzzed. At least we should investigate the requirements and necessary changes we'd need to make on our side to make this happen.https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/27123Investigate PING/SETTINGS-related timing side-channels2023-01-05T17:28:23ZArthur EdelsteinInvestigate PING/SETTINGS-related timing side-channelsWe are auditing HTTP/2 for tracking vectors in #14592. But a more difficult question for HTTP/2 (and potentially HTTP 1.x) are timing side channels.We are auditing HTTP/2 for tracking vectors in #14592. But a more difficult question for HTTP/2 (and potentially HTTP 1.x) are timing side channels.https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/23214Defend against stack overflow due to overly deep nested (unclosed) XML tags a...2023-01-05T17:20:54ZGeorg KoppenDefend against stack overflow due to overly deep nested (unclosed) XML tags and similar vectorsThere are several ways to get Tor Browser crashed due to missing mitigations while dealing with overly deep nested XML tags (see: https://bugzilla.mozilla.org/show_bug.cgi?id=485941 for an example). For Mozilla this is just annoying but ...There are several ways to get Tor Browser crashed due to missing mitigations while dealing with overly deep nested XML tags (see: https://bugzilla.mozilla.org/show_bug.cgi?id=485941 for an example). 
For Mozilla this is just annoying but depending on the circumstances we might come to a different conclusion due to our different threat model.
We should try to come up with something that handles those cases more gracefully and in a less dangerous way.https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/22974NoScript (and Tor Browser) vulnerable to Mozilla Add-On Code Execution2023-01-05T17:19:55ZTom Rittertom@ritter.vgNoScript (and Tor Browser) vulnerable to Mozilla Add-On Code ExecutionPer legacy/trac#22966 it sounds like NoScript is not signed with a developer key (the 'updateKey' feature described here: https://developer.mozilla.org/en-US/Add-ons/Install_Manifests#updateKey )
updateKey allows the extension developer to require updates be signed with a key only they control. Without it, Mozilla can rewrite extensions and effectively get arbitrary code execution via an add-on.
There are a few things at play here.
1) We could disable add-on updating altogether to mitigate this in 52.
2) In 59, when the only 'full' add-ons are 'system' add-ons, we'll need to figure this out ourselves anyway. This will probably involve Tor signing Tor Launcher and TorButton with its own system add-on keys. Dev Tools is an open question.
3) In 59, when Web Extensions are around this won't be as big of a concern. Mozilla can't get code execution but could neuter the effect of an add-on or turn it into spyware (assuming we keep extension updating in place). Whether web extensions will support an updateKey mechanism is an open question (they don't now, EFF wants it. Tor might wish to lend support to the argument. If Tor could get another partner repack to join in that would help even more I bet.)https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/21961should torbrowser enable network.IDN_show_punycode by default?2024-02-07T09:04:16Zcypherpunksshould torbrowser enable network.IDN_show_punycode by default?Firefox and torbrowser do not show punycodes by default.
The attack vector is discussed here, including a demo:
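The homograph attack behind this can be reproduced with Python's built-in punycode codec (an illustrative sketch; the Cyrillic string below is the kind of confusable name the linked demo uses):

```python
# "аррӏе" is spelled entirely with Cyrillic letters (а, р, ӏ, е) but renders
# almost identically to the Latin "apple". In punycode form the difference
# is obvious, which is why showing punycode defeats the spoof.
spoof = "\u0430\u0440\u0440\u04cf\u0435"  # аррӏе
encoded = spoof.encode("punycode").decode("ascii")
print("displayed:", spoof)
print("punycode form: xn--" + encoded)

# Round-trip: decoding the punycode recovers the confusable original,
# and the spoof string is not equal to the Latin name it imitates.
assert encoded.encode("ascii").decode("punycode") == spoof
assert spoof != "apple"
```

This is why a `network.IDN_show_punycode = true` default trades a little IDN usability for phishing resistance: the user sees the `xn--` form instead of the lookalike glyphs.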
https://www.wordfence.com/blog/2017/04/chrome-firefox-unicode-phishing/Sponsor 131 - Phase 5 - Ongoing Maintenancehttps://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/21030Test integration of PartitionAlloc/HardenedPartitionAlloc in Tor Browser2024-03-13T18:10:28ZGeorg KoppenTest integration of PartitionAlloc/HardenedPartitionAlloc in Tor BrowserPartitionAlloc/HardenedPartitionAlloc (https://github.com/struct/HardenedPartitionAlloc) have some interesting security properties we like to use (see: legacy/trac#20998 for a similar ticket wrt jemalloc). We should try to test them and ...PartitionAlloc/HardenedPartitionAlloc (https://github.com/struct/HardenedPartitionAlloc) have some interesting security properties we like to use (see: legacy/trac#20998 for a similar ticket wrt jemalloc). We should try to test them and compare them to jemalloc with partitioning enabled.https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/20971Try building Tor Browser with SafeStack2023-01-05T17:05:02ZArthur EdelsteinTry building Tor Browser with SafeStackSafeStack is part of the [Levee](http://dslab.epfl.ch/proj/cpi/) project and prevents stack smashing attacks. It is reported to have a negligible performance hit.
Together, Levee's components, SafeStack and CPI or CPS, are supposed to prevent code flow hijacking. Once CPI and CPS have been released, we should try those as well.https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/20314Make SVG click-to-play and support fallback2023-01-05T17:03:20ZbugzillaMake SVG click-to-play and support fallbackCurrently TBB uses the worst option: entirely disabled. Even no white rectangle on a white background. It's not fair that videos have CTP, but images haven't. NoScript is most suitable now for this feature.Currently TBB uses the worst option: entirely disabled. Even no white rectangle on a white background. It's not fair that videos have CTP, but images haven't. NoScript is most suitable now for this feature.Sponsor 131 - Phase 5 - Ongoing Maintenancehttps://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/17734Use PDF.js to sanitize saved PDFs2023-01-05T16:59:02ZcypherpunksUse PDF.js to sanitize saved PDFsPDF files often have malicious content within itself, which can be used to compromise the security of the system. Rendering PDF file with PDF.js is often slow and broken, which makes the users to open the files with native readers. Unfor...PDF files often have malicious content within itself, which can be used to compromise the security of the system. Rendering PDF file with PDF.js is often slow and broken, which makes the users to open the files with native readers. Unfortunately, there is no good sanitizers: they are mostly written in script languages (s.a. Python and Ruby) and require their runtime. It will be very useful to have a tool to remove malicious content from downloaded PDF implemented in JS right in browser. Fortunately, Firefox already has PDF parsing library inside its PDF.js engine.
* Use PDF.js to parse PDF into internal representation, but do not render it.
* Decompress and destream it.
* Remove all potentially malicious tags (this should be tweakable in popup window similar to "Clear Recent History"): JS, fonts, flash (and other objects calling plugins), 3d, forms, signatures, remote content, anything else not needed for rendering directly.
* Recreate PDF file from the internal representation recomputing all the recomputable fields to destroy memory corruption exploits.
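The pipeline in the bullets above could look roughly like this — a toy model operating on a dict-based stand-in for the parsed internal representation (the key names mirror real PDF dictionary keys, but the actual PDF.js parsing and re-serialization steps are elided):

```python
# Keys that carry active or identifying content; a real list would be
# user-tweakable, as suggested above.
DANGEROUS_KEYS = {"/JS", "/JavaScript", "/OpenAction", "/AA", "/Launch",
                  "/EmbeddedFiles", "/RichMedia", "/AcroForm", "/XFA"}

def sanitize(obj):
    """Recursively strip dangerous entries from a parsed PDF object tree."""
    if isinstance(obj, dict):
        return {k: sanitize(v) for k, v in obj.items() if k not in DANGEROUS_KEYS}
    if isinstance(obj, list):
        return [sanitize(v) for v in obj]
    return obj  # scalars/streams pass through (a real tool would re-encode them)

doc = {
    "/Root": {
        "/OpenAction": {"/S": "/JavaScript", "/JS": "app.alert(1)"},
        "/Pages": {"/Kids": [{"/Contents": "BT ... ET", "/AA": {}}]},
    }
}
clean = sanitize(doc)
print("/OpenAction" in clean["/Root"])       # False
print(clean["/Root"]["/Pages"]["/Kids"][0])  # {'/Contents': 'BT ... ET'}
```

The last bullet (recreating the file and recomputing all derived fields) is what this sketch omits, and it is the step that actually neutralizes memory-corruption payloads hidden in malformed offsets and lengths.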
First I asked about it in the PDF.js bug tracker, but they refused because it is not the goal of that project.
1. FOSS under GPL3. See https://github.com/gorhill/uBlock
2. It is actively maintained and very popular.
3. It's designed to be efficient on CPU and memory. See https://github.com/gorhill/uBlock#performance
From https://github.com/gorhill/uBlock#philosophy:
> uBlock Origin is not an ad blocker; it's a general-purpose blocker. Furthermore, advanced mode allows uBlock₀ to work in default-deny mode, which mode will cause all 3rd-party network requests to be blocked by default, unless allowed by the user.
Its behavior is governed through filter lists, which are maintained by Adblock Plus, Disconnect, the community, or other sources. Users can control which lists are downloaded and most are fetched through HTTPS.
I have read through https://www.torproject.org/projects/torbrowser/design/#philosophy, but this was written several years ago and I believe that the landscape has changed and that it's time to revisit those assumptions. Arguments include:
1. Default denial of cross-site (3rd party) requests, unless allowed by the user. This eliminates CSRFs and prevents contact with ad networks and trackers in the first place. This supplements browser security by preventing ad networks from tracking users across a browser session.
2. If all users use Ublock Origin, then everyone has the same fingerprint.
3. Adblockers are now relatively common among tech-savvy users, to the point where they consider webpages to be broken if ads get in their way. The existence of ads may drive a user to install an insecure adblocker or to use their native non-Tor browser.
4. Ublock Origin would save significant bandwidth, reducing the load on the Tor network and increasing the responsiveness of webpages in the Tor Browser.
<n8fr8> might be good to revisit these assumptions, but make sure to read on in the design document to get the full understanding
<helix> I wonder how many people install adblockers anyway. I have like 4 extra extensions for ad/tracking blocking
<n8fr8> true that
<helix> my memory was fuzzy but I recall there also being some concern that blocking ads might increase sites' contempt towards tor users, but this was like 2011-2012 and the situation was quite different
<nickm> It seems like it follows some kind of design antipattern to me; "Assuming that we deliver security with X, Y adds no additional security. Therefore, not Y." then again, I am not a TB person and do not want to step on their toes here
<n8fr8> the world has changed wrt to ad blockers being seen as anti-social... Apple now supports them after all.
<kernelcorn> helix: so many non-Tor users use adblockers that I doubt that Tor users would make a significant impact
<helix> kernelcorn: I agree now - I'm saying that the timeframe in which that decision was made had a different landscape
<helix> I think it's probably worth revisiting the topic to see if it's still true
Ticket legacy/trac#10914 is related.richardrichard2023-11-15https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/17509Write a patch for additional -ldl needed when compiling Tor Browser with ASan...2023-01-05T16:58:29ZGeorg KoppenWrite a patch for additional -ldl needed when compiling Tor Browser with ASan and GCC 5This is a reminder to investigate and write a patch for https://bugzilla.mozilla.org/show_bug.cgi?id=1213698.This is a reminder to investigate and write a patch for https://bugzilla.mozilla.org/show_bug.cgi?id=1213698.Sponsor 131 - Phase 5 - Ongoing Maintenancehttps://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/13747Make sure tor browser handles mixed content in .onions correctly2023-01-05T16:56:03ZWilliam BudingtonMake sure tor browser handles mixed content in .onions correctlyThe .onion URL for a given THS instance is a fingerprint of the public key, thus ensuring authenticity of the service. For this reason, some assume the same security assurances for .onion addresses as they would for https, with the adde...The .onion URL for a given THS instance is a fingerprint of the public key, thus ensuring authenticity of the service. For this reason, some assume the same security assurances for .onion addresses as they would for https, with the added assurances that hidden services provide. For instance, the major browsers have chosen to not load http resources when accessing an https site, blocking mixed content. However, there is no protection against mixed content being loaded in the TBB for .onion addresses when they include resources from http URLs. For any .onion URL which includes http resources, an attacker controlling an exit node could perform a Man in the Middle attack, providing malicious javascript which modifies the content of the DOM.
One would hope that an http THS would never include remote resources from an http site if they would like to protect their users. In fact, one would hope that a THS would never load any resources at all from a source they do not control. But this is no guarantee that they won't. It seems like a good security measure to disallow http resources from being loaded in TBB.https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/15825webgl.disable-extensions true about:config setting may allow DoS and breaks w...2023-11-04T01:15:51Zcypherpunkswebgl.disable-extensions true about:config setting may allow DoS and breaks websitesReference legacy/trac#3323 and legacy/trac#6370 ...
"The conclusion is that if we set webgl.min_capability_mode and webgl.disable-extensions, our primary API-level fingerprinting concerns are addressed."
However, I am concerned because this presumably disables security extensions such as GL_ARB_robustness too, making it easier for malicious content to cause crashes on the user's computer (some of which can lead to things such as remote code execution).https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/30271Validate untrusted TLS certificates to ensure Exits aren't performing an attack2023-01-05T16:36:06ZTom Rittertom@ritter.vgValidate untrusted TLS certificates to ensure Exits aren't performing an attackAs described in https://gitweb.torproject.org/tor-browser-spec.git/tree/proposals/103-selfsigned-user-safety.txt - we can validate untrusted certificates via a separate circuit to ensure the exit node is not performing a TLS MITM attack ...As described in https://gitweb.torproject.org/tor-browser-spec.git/tree/proposals/103-selfsigned-user-safety.txt - we can validate untrusted certificates via a separate circuit to ensure the exit node is not performing a TLS MITM attack on end users.https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/20955Tor Browser memory hardening2023-01-05T16:23:56ZArthur EdelsteinTor Browser memory hardeningHere's a parent ticket for memory hardening for Tor Browser.
See also notes at [[doc/TorBrowser/Hardening]]Sponsor 131 - Phase 5 - Ongoing Maintenancehttps://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/41351Move the crypto protection patch earlier in the patchset2023-02-15T08:26:56ZPier Angelo VendrameMove the crypto protection patch earlier in the patchsetThe patch for bug #40209 (e.g., ae81c697dfb66792ec5454a19e728f91abfee24d) could be moved to be with security level and new identity (so, possibly part of base browser, or be the first excluded patch).
The only problem is that it depends on TorStrings.jsm.
We should either wait for #40924 to be completed, or do a workaround, like I've done for #40925 and #40926.Sponsor 131 - Phase 2 - Privacy BrowserPier Angelo VendramePier Angelo Vendramehttps://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/40087Use "Safer" as the default security level2023-01-05T16:13:52ZMatthew FinkelUse "Safer" as the default security levelWe should move toward shipping Tor Browser with Safer as the default security level. Safer includes a more sane and reasonable compromise between usability and security, and (we suspect) most users never change the security level from th...We should move toward shipping Tor Browser with Safer as the default security level. Safer includes a more sane and reasonable compromise between usability and security, and (we suspect) most users never change the security level from the default. Therefore, we should use a safe(r) default.
This ticket can track what is needed for this:
- [ ] #40086
- [ ] #33000
- [ ] #19850
- [ ] #22981https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/23362consider performing network operations in a dedicated process2023-01-05T16:13:48Zcypherpunksconsider performing network operations in a dedicated processESR59 will have approx. 8 processes, excluding content processes. And it makes sense to run them all in strong sandboxes without network access. To achieve this it could be helpful to discuss and coordinate this work with Mozilla in http...ESR59 will have approx. 8 processes, excluding content processes. And it makes sense to run them all in strong sandboxes without network access. To achieve this it could be helpful to discuss and coordinate this work with Mozilla in https://bugzilla.mozilla.org/show_bug.cgi?id=1322426.Sponsor 131 - Phase 5 - Ongoing Maintenancehttps://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/41468backport 1600437 : Disable CBC-mode ECDSA ciphers and stop advertising ECDSA+...2023-11-20T16:45:31ZThorinbackport 1600437 : Disable CBC-mode ECDSA ciphers and stop advertising ECDSA+SHA1following on from #40183
- FF109+ [1600437](https://bugzilla.mozilla.org/show_bug.cgi?id=1600437)
- [patch](https://hg.mozilla.org/mozilla-central/rev/d0ac295c1b62)
IDK if this makes any difference really, but it's more than just pref flips. And we could drop the two prefs added in https://gitlab.torproject.org/tpo/applications/tor-browser/-/merge_requests/433
these two
```js
pref("security.ssl3.ecdhe_ecdsa_aes_256_sha", false, locked);
pref("security.ssl3.ecdhe_ecdsa_aes_128_sha", false, locked);
```Pier Angelo VendramePier Angelo Vendramehttps://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/41506Remove TrustCor root certificates2023-01-30T08:35:42ZGusRemove TrustCor root certificates“Certificate Authorities have highly trusted roles in the internet ecosystem and it is unacceptable for a CA to be closely tied, through ownership and operation, to a company engaged in the distribution of malware,” Mozilla’s Kathleen Wi...“Certificate Authorities have highly trusted roles in the internet ecosystem and it is unacceptable for a CA to be closely tied, through ownership and operation, to a company engaged in the distribution of malware,” Mozilla’s Kathleen Wilson wrote to a mailing list for browser security experts. “Trustcor’s responses via their Vice President of CA operations further substantiates the factual basis for Mozilla’s concerns.”
https://www.washingtonpost.com/technology/2022/11/30/trustcor-internet-authority-mozilla/
concerns about Trustcor - https://groups.google.com/a/mozilla.org/g/dev-security-policy/c/oxX69KFvsm4/m/etbBho-VBQAJma1ma1https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/41452WebExtension "Content Script"2023-01-05T15:17:01ZP9LmZu22jmVbWebExtension "Content Script"Recently TorBrowser disallows running "content scripts" from the WebExtension API. I understand that content scripts can read user data and therefore are generally excluded in TorBrowser. But is there any way to disable this protection? ...Recently TorBrowser disallows running "content scripts" from the WebExtension API. I understand that content scripts can read user data and therefore are generally excluded in TorBrowser. But is there any way to disable this protection? In about:config I didn't find a solution unfortunately.
Version 11.5.7https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/22584More RWX memory pages for TBB on some Windows versions2022-11-30T16:58:09ZArthur EdelsteinMore RWX memory pages for TBB on some Windows versionsA cypherpunk has reported some RWX memory pages were observed for Tor Browser on Windows 7 and Windows 10. See:
* ticket:21617#comment:4
* ticket:21617#comment:7
* ticket:21617#comment:14Sponsor 131 - Phase 5 - Ongoing Maintenancehttps://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/33479PDF fullscreen Presentation Mode doesn't letterbox2023-05-03T13:40:28ZcypherpunksPDF fullscreen Presentation Mode doesn't letterbox1. Open a PDF file in a new tab so it opens in the browser's internal PDF viewer. Here's one. https://gitweb.torproject.org/company/policies.git/plain/corpdocs/IRS-Determination-Letter.pdf
2. Click the 4-outward-arrows (fullscreen?) icon on the PDF toolbar. Its tooltip when you hover on it says, "Switch to Presentation Mode"
3. Observe that Presentation Mode is not letterboxed.
PDF Presentation Mode is distinct from browser full screen (F11 key) and from maximize.
Is this exploitable at all? Is the internal PDF API fingerprintable? Tor Browser warns at download time not to open files in external viewers, since those could circumvent Tor.
Similar vectors:
* legacy/trac#32713, Letterboxing doesn't work when fullscreening videos
* legacy/trac#12609, HTML5 fullscreen API makes TB fingerprintable
Inspired by:
* https://blog.torproject.org/comment/286752#comment-286752https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/17216Make Tor Browser's updater work over Hidden Services2022-11-30T16:46:33ZIsis LovecruftMake Tor Browser's updater work over Hidden ServicesThis would provide additional cover traffic for other HSes. Another proposal from the (second) HS guard discovery protections meeting at the 2015 Berlin Tor developer meeting was to only have clients check for new Tor Browser updates via...This would provide additional cover traffic for other HSes. Another proposal from the (second) HS guard discovery protections meeting at the 2015 Berlin Tor developer meeting was to only have clients check for new Tor Browser updates via some HS(es), and then do the actual download of the update over the regular non-HS mirrors.https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/40944Prompt if Tor Browser is zoomed2023-11-04T01:34:46ZbugzillaPrompt if Tor Browser is zoomedDon't we need to display some kind of toolbar message or otherwise warn the user against zooming their Tor Browser window like in legacy/trac#7255?
Because zooming changes resolution to very rare values.https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/23664Deal with UUID for content sandbox temp folder on Windows and Mac2024-01-05T16:08:12ZGeorg KoppenDeal with UUID for content sandbox temp folder on Windows and Maccomment:56:ticket:16010 mentioned:
```
Very important side issue is that the sandboxing feature adds `security.sandbox.content.tempDirSuffix` pref which is a 128-bit GUID that allows to uniquely identify your copy of Tor Browser. It is persistent and leaves unique traces on every machine you use in system %TEMP% folder.
```
We should find a good way dealing with that. Maybe a first start is to set the pref, so that every Windows user has the same sandbox temp dir name.Sponsor 131 - Phase 5 - Ongoing Maintenancehttps://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/41131Review Mozilla 1738983: Enable Background Update by default on Release starti...2022-12-09T14:40:46ZrichardReview Mozilla 1738983: Enable Background Update by default on Release starting in FX96## https://bugzilla.mozilla.org/show_bug.cgi?id=1738983
Updater changes, odds are you're already aware and handled in the rebase alreadyhttps://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/5791Gather apparmor/selinux/seatbelt profiles for each component of TBB2022-11-30T16:20:35ZRoger DingledineGather apparmor/selinux/seatbelt profiles for each component of TBBIt's increasingly clear that shipping TBB without any "system call permissions" wrappers is an arms race that is too easy to lose. Bug 5741 is the latest of what will continue to be many instances.
The Tor wiki has a variety of instructions on putting your TBB in a VM, or running it wrapped by apparmor, or somebody saying the word SELinux, etc.
We should gather all these instructions together, and start vetting them with the goal of integrating as many as we can into the main build processes, and providing the rest as "for experts, you can be even safer if".
We need a volunteer with good security taste to get this started. I could easily see this project being a bounty too.https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/40207Tor Browser is writing to Windows registry on every start2022-11-30T15:19:24ZGeorg KoppenTor Browser is writing to Windows registry on every startI got a report from a cypherpunk:
```
https://gitlab.torproject.org/tpo/applications/tor-browser/-/wikis/Platform-Installation
Firefox is still writing to Windows Registry on every start:
Computer\HKEY_CURRENT_USER\SOFTWARE\Mozilla\Firefox\Launcher
There it stores all the paths TBB was started from.
That also allows an attacker to permanently disable Launcher Process
security feature, and even any hiccup can do/leads to it:
about:support
Launcher Process Disabled due to failure
```

Milestone: Sponsor 131 - Phase 2 - Privacy Browser.

---

https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/40609
**Investigate Firefox's per-site "Disable Javascript" feature** (Matthew Finkel, 2022-12-08)

fe6cfda83acdbdd9f1576f710a1aa0d4116635b2

Assignee: ma1.

---

https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/31031
**Tor Browser trying to read /etc/machine-id on start** (Trac, 2022-11-30)

Steps to reproduce:
- Tor Browser from the official website
- Download and enable the AppArmor profile from https://github.com/Whonix/apparmor-profile-torbrowser (you may need to modify 2 or 3 lines due to different naming, e.g. change `*-browser` to `*-browser*`)
- Start TorBrowser
- Inspect `/var/log/kern.log`
You'll see a message like
`Jun 29 01:23:45 debian kernel: [xxxxxx.xxxxxx] audit: type=1400 audit(xxxxxxxxxx.xxx:xx): apparmor="DENIED" operation="open" profile="/**/*-browser*/Browser/firefox" name="/etc/machine-id" pid=xxxx comm="firefox.real" requested_mask="r" denied_mask="r" fsuid=1000 ouid=0`
Not sure if this behaviour is also present in Firefox, maybe test it when I have time.
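For profile authors who would rather enforce and silence this denial than allow the read, a one-line addition should do. This is only a sketch, assuming it is placed inside the browser's profile block of the Whonix profile linked above:

```
# AppArmor profile fragment (assumption: added inside the firefox profile
# block of apparmor-profile-torbrowser); explicitly denies the read and
# suppresses the audit log noise.
deny /etc/machine-id r,
```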
---
Debian 10 "Buster"
Tor Browser 8.5.3
AppArmor 2.13.2-10
**Trac**:
**Username**: rain-undefined

---

https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/19750
**Sandboxing in Tor Browser** (Arthur Edelstein, 2022-11-30)

Here's a parent ticket to track efforts to sandbox Tor Browser. Please use this ticket to discuss various approaches and link to email discussions where available.

---

https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/41149
**Review Mozilla 1762576: Firefox is not allowing Symantec DLP to inject DLL into the browser for Data Loss Prevention software** (richard, 2022-10-21)

## https://bugzilla.mozilla.org/show_bug.cgi?id=1762576
Here's a thought: let's not let random processes inject DLLs into tor-browser (to be clear, I propose we revert / disable this functionality).

Milestone: Sponsor 131 - Phase 3 - Major ESR 102 Migration. Assignee: Dan Ballard.

---

https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/32492
**Unexpected NoScript behavior when security level is pinned using user.js** (Trac, 2022-10-05)

If a Tor Browser user attempts to pin the security level using `user.js` (see below), Tor Browser will launch with the pinned security level, but NoScript will not respect that choice and will instead retain its previous behavior. For example, if the user pins the security level to "Safest" using `user.js`, closes Tor Browser with the security level set to "Safer", and then re-launches Tor Browser, NoScript will behave as though the security setting is "Safer", blocking non-HTTPS JavaScript but allowing HTTPS JavaScript to run.
This behavior is potentially dangerous: the user will believe that all Tor Browser security features follow their pinned choice, and the shield icon will reflect the pinned security level, but NoScript may behave differently. For example, NoScript may run JavaScript without the user's knowledge even though the user pinned the security level to "Safest".
Reproduced in:
- Tor Browser 9.0 and 9.0.1 (the first affected version is unknown)
- NoScript 11.0.8 (the first affected version is unknown)
- Debian 9 (stretch)
How to reproduce:
- `user.js` allows pinning of Tor Browser (Firefox) parameters upon launch.
1. Create `user.js` in: `<tor-browser-top>/Browser/TorBrowser/Data/Browser/profile.default/`
2. Pin the security level to "Safest". Add the line: `user_pref("extensions.torbutton.security_slider", 1);`
3. Launch Tor Browser, change the security level from "Safest" to something different, then close Tor Browser.
4. Launch Tor Browser again, and confirm the security level is set to "Safest".
5. Access a website that requires JavaScript to work properly.
6. Confirm whether or not JavaScript is running.
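Putting steps 1 and 2 together, the entire pinned profile is a single file with a single line (path and pref value exactly as given above):

```js
// <tor-browser-top>/Browser/TorBrowser/Data/Browser/profile.default/user.js
user_pref("extensions.torbutton.security_slider", 1);
```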
**Trac**:
**Username**: kj

Milestone: Sponsor 131 - Phase 2 - Privacy Browser. Assignee: ma1.

---

https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/18288
**Sign Tor Browser binaries on Windows (not just the setup executable)** (Georg Koppen, 2022-07-09)

Mozilla has been signing Firefox binaries for a while now, beyond providing signatures for the setup executable. https://blogs.msdn.com/b/ieinternals/archive/2011/03/22/authenticode-code-signing-for-developers-for-file-downloads-building-smartscreen-application-reputation.aspx has some things to say about that.

---

https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/41071
**Targeted Deanonymization via the Cache Side Channel** (Ghost User, 2024-01-29)

https://leakuidatorplusteam.github.io/
A paper describing the attacks will appear in the 31st USENIX Security Symposium (Boston, 10–12 August, 2022). A preprint of the paper is available [here](https://leakuidatorplusteam.github.io/preprint.pdf). The paper is the result of a collaboration between a group of researchers at the New Jersey Institute of Technology: Mojtaba Zaheri, Yossi Oren, and Reza Curtmola.
According to the authors, this attack has some nasty elements:
- It can precisely target any user with a specific public identifier, while leaving non-targeted users untouched.
- It can target users logged into highly popular resource-sharing services, for example, Google, Dropbox, Twitter, and Facebook.
- It works on users of any browser, including Tor Browser.
- It's scalable to attack large numbers of users.
- It gives no indication to the victim that they are being attacked.
- Effective countermeasures may involve a compromise of usability.
> On the Internet, the casual person surfing a website has a reasonable expectation that their identity remains private. We reveal new cache-based target deanonymization attacks which threaten user anonymity: An attacker who has complete or partial control over a website can learn whether a specific target (i.e., a unique individual) is browsing the website. The attacker knows this target only through a public identifier, such as an email address or a Twitter handle.
>
> The attacks leverage the sharing/blocking functionality provided by resource-sharing services such as YouTube, Google Drive, Dropbox, or Twitter. The target user is assumed to be logged into such a sharing service. The attacks exploit the CPU cache side channel on the target’s device, and can bypass isolation mechanisms and various defenses deployed by browser vendors or resource-sharing services.
>
> We evaluated the attacks on multiple hardware microarchitectures, multiple operating systems and multiple browser versions, including the highly-secure Tor Browser, and demonstrated practical targeted deanonymization attacks on major sites, including Google, Twitter, LinkedIn, TikTok, Facebook, Instagram and Reddit. The attack runs in less than 3 seconds in most cases, and can be scaled to target a large number of users.

Assignee: ma1. Due: 2022-08-10.

---

https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/28147
**[meta] Improve Tor Browser Content Process Sandbox** (Tom Ritter <tom@ritter.vg>, 2022-07-12)

This ticket is specifically for tightening the content process sandbox.
An attacker who achieves code execution inside the content process sandbox should not be able to achieve the most valuable goals (proxy bypass / a persistent user identifier) from inside the content process, and should instead need a sandbox escape.

---

https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/25559
**Miscellaneous security- and privacy-related prefs for Tor Browser** (Arthur Edelstein, 2022-11-09)

JKT has been working on some prefs he suggested we might consider:
* security.mixed_content.upgrade_display_content
* Upgrades passive mixed content to HTTPS transparently
* network.ftp.enabled
* disable FTP
* security.insecure_connection_icon.enabled and security.insecure_connection_icon.pbmode.enabled
* security.insecure_connection_text.enabled and security.insecure_connection_text.pbmode.enabled
* Both of these mark HTTP connections as insecure. One with a broken lock icon, the other with text saying ‘Not Secure’
* Insecure flash content:
* security.mixed_content.block_object_subrequest
* Sensors:
* device.sensors.*.enabled (motion, proximity, ambientLight and orientation) && the Event constructors are now also included in device.sensors.enabled
* `device.sensors.enabled` set to False in RF (https://bugzilla.mozilla.org/show_bug.cgi?id=1369319)
* dom.registerProtocolHandler.insecure.enabled
* browser.cache.offline.insecure.enable
* dom.registerContentHandler.enabled
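Several of the prefs above could be pinned in a `user.js` sketch like the following; the values are assumptions about the intended hardened settings, not confirmed Tor Browser defaults:

```js
// user.js sketch -- values are assumptions, not confirmed Tor Browser defaults
user_pref("security.mixed_content.upgrade_display_content", true);
user_pref("network.ftp.enabled", false);
user_pref("security.insecure_connection_icon.enabled", true);
user_pref("security.insecure_connection_text.enabled", true);
user_pref("security.mixed_content.block_object_subrequest", true);
user_pref("device.sensors.enabled", false);
user_pref("dom.registerProtocolHandler.insecure.enabled", false);
```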
Others being pondered:
* Http-disabled
* I believe this is to block all HTTP connections.

Milestone: Sponsor 131 - Phase 3 - Major ESR 102 Migration. Assignee: Pier Angelo Vendrame.

---

https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/3246
**Isolate HTTP cookies according to first and third party domain contexts** (Mike Perry, 2022-01-11)

Right now, we've set Tor Browser to block third party cookies. This will probably break some sites. There is a less intrusive option described at https://wiki.mozilla.org/Thirdparty that we should use.
**Rebase** and test existing patches (originating from https://bugzilla.mozilla.org/show_bug.cgi?id=565965)
**Revise requirements** according to preliminary tests and devise a broad test plan.
**Reimplement and retest** to guarantee proper isolation without severely impeding cookie dependent applications.
**Document** the implementation and optionally a contrast of browser cookie handling.
Pave the way towards an **improved privacy panel**, including a new cookie inspector and an API supporting such a UI.
----
**Note:** This is a metaticket composed of work items in child tickets.

---

https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/3347
**Permanently Opt-In to YouTube's HTML5 Beta Test** (Trac, 2022-01-11)

An option to permanently opt-in to YouTube's HTML5 beta test would greatly enhance the Torbutton user experience. Bonus points if the option would work even if cookies are disabled.
**Trac**:
**Username**: katmagic

Milestone: TorBrowserBundle 2.2.x-stable. Assignee: Mike Perry.

---

https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/3875
**TBB's Firefox should use optimistic data socks handshake variant** (Roger Dingledine, 2022-01-11)

Tor Proposal 181 lets the Tor client save a round-trip if the application speaks socks in a special way. In short, the application needs to send its data before hearing that the socks connection was successful. It's supported as of Tor 0.2.3.3-alpha.
Ian originally suggested hacking polipo to use a modified socks handshake. With polipo out of the picture for TBB, we should make Firefox itself do it.
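The optimistic variant can be sketched as follows, assuming a plain SOCKS5 CONNECT with a domain-name address; nothing here touches the network, it only builds the byte strings the client would send:

```python
import struct

def socks5_connect_request(host: str, port: int) -> bytes:
    """SOCKS5 CONNECT request with a domain-name address (ATYP=0x03)."""
    name = host.encode("ascii")
    return b"\x05\x01\x00\x03" + bytes([len(name)]) + name + struct.pack(">H", port)

# Classic flow:     greeting -> method reply -> CONNECT -> connect reply -> payload.
# Optimistic flow:  after the method reply, send the CONNECT request *and* the
# first application bytes together, without waiting for the connect reply --
# saving one full round trip (the point of proposal 181).
greeting = b"\x05\x01\x00"            # version 5, one auth method: no-auth
payload = b"GET / HTTP/1.0\r\n\r\n"   # hypothetical first application bytes
optimistic_burst = socks5_connect_request("example.com", 80) + payload
```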
Is this something Torbutton should (could) do, or should we patch the Firefox we include in TBB?

Milestone: TorBrowserBundle 2.3.x-stable. Assignee: Mike Perry.

---

https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/4152
**Implement Bottom Up Randomization (Windows platform)** (bastik, 2023-04-25)

To improve ASLR efficiency you could add Bottom Up Randomization.
Matt Miller told Didier Stevens how he did it, so I know that too.
“It works by reserving a random number (between [0,256]) of 64K regions via VirtualAlloc. This has the effect of consuming a small portion of the bottom part of the address space. Since the Windows kernel assigns base addresses for collided DLLs by searching for a free region starting at the bottom of the address space, bottom up randomization ensures that a random base address will be assigned. Without bottom up randomization the bottom part of the address space remains fairly static (with some exceptions, such as due to heap, stack, and EXE randomization).”
Code:

```c
int iIter;
int iRand;
srand(time(NULL));
iRand = rand() % 256 + 1;
for (iIter = 0; iIter < iRand; iIter++)
    VirtualAlloc(NULL, 64*1024, MEM_COMMIT | MEM_RESERVE, PAGE_NOACCESS);
```
"In stead of 15 base addresses, with the most frequent address being using 30% of the time, my Bottom Up Randomization implementation gives me more than 300 addresses after 150.000 runs. And there’s no single address being used more than 0,5% of the time."
An comment adds that only MEM_RESERVE should be used for VirtualAlloc, because MEM_COMMIT would require more memory. Didier Stevens replies that this is possible although the additional memory wouldn't be much.
Here's the link: http://blog.didierstevens.com/2011/09/29/add-bottom-up-randomization-to-your-own-source-code/
BTW: It's impossible to choose a component, because all binaries (Tor/Vidalia at least) should make use of it.

---

https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/4234
**Deploy experimental builds using the Firefox update process** (Mike Perry, 2023-11-13)

Sure, it's probably not hardened against version downgrade attacks, interruption attacks, no-progress attacks, and maybe not even against CA compromises.
But it's gotta be better than nothing, and maybe it is easily serviceable into something that will work for us.
Users are having a hard time manually working with our TBB packages if they want to preserve bookmarks, settings, and history, and are getting themselves into trouble by copying pieces of them over each other incorrectly while trying to manually upgrade:
https://lists.torproject.org/pipermail/tor-talk/2011-October/021771.html
I think any form of process that automates this for them is a step above the status quo. It's just a matter of finding out whether it is significantly less time and effort to deploy than Thandy, and what the security tradeoffs are.

Milestone: TorBrowserBundle 2.3.x-stable.

---

https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/4280
**build changes for TBB** (Jacob Appelbaum, 2022-01-11)

re legacy/trac#2176 - I came up with a list of things I think we should disable or change in TBB's build process.
I noticed that as it stands, we don't disable stuff like JSctypes. Which, well, if it's anything like python ctypes, holy moley!
```
diff --git a/build-scripts/config/dot_mozconfig b/build-scripts/config/dot_mozconfig
index 9333a6f..227bd01 100755
--- a/build-scripts/config/dot_mozconfig
+++ b/build-scripts/config/dot_mozconfig
@@ -5,5 +5,16 @@ mk_add_options MOZ_APP_DISPLAYNAME=TorBrowser
ac_add_options --enable-optimize
ac_add_options --enable-strip
+ac_add_options --enable-install-strip
ac_add_options --disable-tests
ac_add_options --disable-debug
+ac_add_options --disable-ctypes
+ac_add_options --disable-necko-disk-cache
+ac_add_options --disable-necko-wifio
+ac_add_options --disable-installer
+ac_add_options --disable-updater
+ac_add_options --disable-parental-controls
+
+
+# Linux options
+ac_add_options --disable-dbus
```

---

https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/4335
**Per-urlbar domain plugin control** (Trac, 2022-01-11)

How about, instead of completely having to dis/enable all plug-ins when using Tor, you allow site exceptions (like YouTube for Flash Player), as well as a way to select which plug-ins to disable and which not? (I personally only have two plug-ins, Shockwave Flash and Adobe Acrobat, and I don't think the risks that apply to Flash also apply to Acrobat; if I'm mistaken, please correct me.)
**Trac**:
**Username**: trallala

---

https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/4522
**Add privilege separation for bundled browser** (Trac, 2022-01-11)

TBB comes with Firefox, which runs with full user privileges by default. A single vulnerability, for example in its rendering or JavaScript code, can be used to access private data stored on the system, or to bypass Tor and reveal IP address and location.
Modern OSs offer security mechanisms to run 3rd party applications with reduced privileges:
Windows Vista and later have Protected/Low Integrity Mode.
OS X has seatbelt, fully usable at least since Lion.
Linux has several mechanisms, seccomp is in the kernel and should be available on all recent distros, SELinux and Apparmor are more distro specific (Red Hat, Fedora, Ubuntu).
Firefox upstream doesn't make use of any of them yet but that shouldn't stop redistributors with different security requirements...
Firefox is also the only major browser that doesn't have a multi-process architecture to further limit the privileges of code that handles untrusted input. I don't think anything can be done about that short of waiting for Electrolysis to make it into Aurora, or switching the browser to something else in the meantime, which is probably undesirable for many reasons.
However sandboxing the firefox process could be done right now with relatively little difficulty. The heavy-lifting has been done already, Chromium has several sandbox mechanisms to cover all major platforms.
A few links to get started:
For Windows:
a few icacls commands are enough for a basic configuration.
https://wiki.mozilla.org/Mozilla_2/Protected_mode
http://superuser.com/questions/30668/how-to-run-firefox-in-protected-mode-i-e-at-low-integrity-level
For OS X:
http://developer.apple.com/library/mac/#documentation/Security/Conceptual/AppSandboxDesignGuide/AboutAppSandbox/AboutAppSandbox.html
http://dev.chromium.org/developers/design-documents/sandbox/osx-sandboxing-design
For Linux:
http://code.google.com/p/chromium/wiki/LinuxSandboxing
Ubuntu comes with a Firefox Apparmor profile which just needs to be adapted to point at the correct binary.
For *BSD:
jail is available across the board
None of these are designed with the threat model of Tor in mind. Special focus would be needed to protect the IP address from the browser.
Summary:
Firefox's outdated security architecture, together with the JavaScript-heavy web and modern drive-by exploits, makes the current TBB increasingly susceptible to application-level attacks.
Levels of security and resilience against application vulnerabilities similar to the "anonymizing middlebox" (a transparent proxy in a separate computer or VM) can be achieved with privilege separation.
Make it happen before Electrolysis comes out (is it even still on their roadmap?)
**Trac**:
**Username**: kteel

---

https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/4894
**TBB permissions problem in multi-user OS X environment** (Trac, 2022-01-11)

Installing a centrally-accessible copy of TBB on OS X results in only the installing user being able to launch the bundle, owing to a permissions issue:
```
[Warning] ../../Contents/Resources/Data/Tor is not owned by this user (REDACTED, REDACTED) but by REDACTED (REDACTED). Perhaps you are running Tor as the wrong user?
[Warning] Failed to parse/validate config: Couldn't access/create private data directory "../../Contents/Resources/Data/Tor"
[Error] Reading config failed--see warnings above.
```
If possible, TBB should be re-made in such a fashion that it can be installed on OS X by one user, yet used successfully by others, without having to manually undertake a permissions-workaround.
Problem encountered with TBB Version 2.2.35-4 - OS X (64-Bit) on OS X 10.6.8.
**Trac**:
**Username**: h8a14i20QH

---

https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/5024
**compile time hardening of TBB (RELRO, canary, PIE)** (cypherpunks, 2022-01-11)

Would be nice if TBB (for Linux and OS X at least) would come with gcc hardening features applied.
Output of checksec.sh:

```
vidalia  3925  No RELRO  No canary found  NX enabled  No PIE
tor      3933  No RELRO  No canary found  NX enabled  No PIE
firefox  3935  No RELRO  No canary found  NX enabled  No PIE
```
compared to bundled Firefox in Ubuntu:
```
firefox  8779  Full RELRO  Canary found  NX enabled  PIE enabled
```

Assignee: Erinn Clark.

---

https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/5767
**Document auditing setups for testers to use** (Mike Perry, 2023-01-05)

We've got a TBB AppArmor profile at https://trac.torproject.org/projects/tor/wiki/doc/AppArmorForTBB. On legacy/trac#5741, some dude named unknown posted iptables rules that log violations. I hear there is also an OSX Seatbelt policy floating around somewhere that may also be useful.
We should create a meta document, or perhaps just describe on https://trac.torproject.org/projects/tor/wiki/doc/build/BuildSignoff, how to use these things to test for disk leaks, proxy issues, oddities, and other violations.

---

https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/6458
**Double-key HSTS for third party content** (Mike Perry, 2022-01-11)

With proper cache+identifier siloing to url bar origin, it is no longer a security issue to allow 3rd party content from HSTS urls to get loaded from non-HSTS sites. Therefore, we can disable HSTS enforcement for third parties in this case.
This will eliminate a super-cookie vector that HSTS creates (registering 32 domains, using HSTS for each domain as a bit).
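The supercookie mechanism can be sketched like this (domain names are hypothetical placeholders):

```python
# Sketch of the HSTS supercookie described above: 32 attacker-controlled
# domains, each either primed with HSTS or not, encode one bit apiece of a
# per-user identifier.

def domains_to_prime(user_id: int, n_bits: int = 32) -> set:
    """Domains to visit over HTTPS-with-HSTS when tagging this user."""
    return {f"d{i}.tracker.example" for i in range(n_bits) if (user_id >> i) & 1}

def recover_id(upgraded: set, n_bits: int = 32) -> int:
    """Later, probe each domain over plain HTTP; the ones the browser
    silently upgrades to HTTPS reveal the stored bits."""
    uid = 0
    for i in range(n_bits):
        if f"d{i}.tracker.example" in upgraded:
            uid |= 1 << i
    return uid
```

Double-keying HSTS to the url bar origin breaks exactly this read-back step for third-party probes.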
This is going to be a painful patch to write, though...

---

https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/6528
**Combine cache isolation patches and bind to pref** (Mike Perry, 2022-01-11)

We have two cache isolation patches that we should try to get merged upstream, especially since one of them is a crazy large change to the image cache operation and is likely to generate conflicts in future releases.
However, the patches need work before they can be merged. In addition to the pref, we'll want to make the normal cache isolation built-in, instead of relying on torbutton's stanford-safecache.js and associated observers.
Someone will probably also need to write/update tests. Bleh.
Getting stuff like this polished and merged is a non-trivial amount of work. At a guess, I'd say around 3 to 4 days' worth in total?

---

https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/6560
**AppArmor, SELinux and other protections** (Jacob Appelbaum, 2022-01-11)

We should create AppArmor, SELinux and other kernel-level protection configurations for TorBrowser.

---

https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/6577
**WebSockets seem totally broken by Firefox SOCKS settings** (Mike Perry, 2022-01-11)

http://html5demos.com/web-socket is failing on both TBB 2.2.x and 2.3.x. I also tested a vanilla FF14 with SOCKS proxy settings, and the socket breaks there, too.
This is thus probably an upstream Firefox bug. We should at least file a bugzilla bug and bring it to someone's attention, I guess.

---

https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/6948
**Shared memory for zygote mind meld** (Jacob Appelbaum, 2022-03-15)

As I mentioned in legacy/trac#6937, I think we could really use a shared-memory mutex as some kind of meeting point for various Tor-specific applications. This would likely be created by Tor, or by a zygote that launches things like Tor and the Tor Browser. In theory, we could then have the launcher application ensure we don't launch extra copies of Tor, and we could ensure that we know _how_ we may connect as well as how to authenticate; lots of stuff becomes possible with a little IPC love.
I have a sketch of the zygote/launcher process, but I need to take it from my notepad to text, so I'm merely capturing this part of the process in this ticket. More tickets to come.

---

https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/7008
**Make it safe to run Flash in TBB** (Roger Dingledine, 2022-01-11)

Here's what we wrote:
"The Tor Project will design a sandbox to allow Tor Browser Bundle users to safely use Adobe Flash plugins, or compatible technology, with a majority of web sites on the Internet. We will work with experts in the fi...Here's what we wrote:
"The Tor Project will design a sandbox to allow Tor Browser Bundle users to safely use Adobe Flash plugins, or compatible technology, with a majority of web sites on the Internet. We will work with experts in the field of sandbox technology to develop a solution for Microsoft Windows, Apple OS X, and Linux operating systems. This implementation will be integrated into the alpha-release branch of the Tor Browser Bundle packages."
Originally we'd been planning to have trams lead this project, but it's been a year or more since we originally proposed it, so we should try to rope him back in. I'm assigning to Mike to start with, since we need his help deciding what direction to take. We have basically a full-time person of funding to devote here, so let's do it right.

Assignee: Mike Perry.

---

https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/7255
**Prompt if Tor Browser is Maximized** (Mike Perry, 2022-01-11)

We should display some kind of toolbar message or otherwise warn the user against maximizing their Tor Browser window, because maximization reveals monitor resolution and toolbar sizes.

Assignee: Georg Koppen.

---

https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/7256
**Explore zoom-based alternatives to fixed window sizes** (Mike Perry, 2022-01-11)

Right now, we set the size of new Tor Browser windows such that their content area is a 200x100 multiple. We also lie to content that the entire desktop resolution is this size.
However, this potentially leaks information for users who maximize their browser windows, as such windows will no longer be rounded.
We could play with zooming such that maximized windows do not reveal Firefox decoration sizes. We could also set the zoom level automatically such that we end up with a content window size of a 200x100 multiple as well.
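The rounding idea can be sketched as follows. This is illustrative shell arithmetic only, with made-up dimensions, not actual Tor Browser code:

```shell
# Round a maximized content area down to the nearest 200x100 multiple;
# the zoom level would then be chosen so content only ever sees the
# rounded size. All numbers here are hypothetical.
round_dim() {  # usage: round_dim <pixels> <step>
    echo $(( $1 / $2 * $2 ))
}

round_dim 1366 200   # width  -> 1200
round_dim 728 100    # height -> 700
```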
Not sure how complicated this will be. See also legacy/trac#7255 for a potentially simpler stopgap.https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/7446TorButton should not fixup .onion domains2022-01-11T19:33:40ZSteven MurdochTorButton should not fixup .onion domainsI received the following email, which might be worth investigating:
> Sorry to bother you with this, but didn't know who else to contact.
>
> Defaults (about:config) for TorBrowser should include:
>
> browser.fixup.alternate.enabled;false
>
> to prevent injecting www. & .com on timed-out sites.
>
> Thanks for all your great work; it means a lot to a lot of people.cypherpunkscypherpunkshttps://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/7501Audit PDF.js2022-03-15T17:28:41ZMike PerryAudit PDF.jsWhile I'm reviewing and testing Firefox 17 for ESR, I should see if I can get PDF.js working well enough to include it.
I'll also need to review its source code for obvious signs of proxy bypass and potential third party state storage, though.
https://addons.mozilla.org/en-US/firefox/addon/pdfjs/https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/7561Contents of FTP requests are cached and not isolated to the URL bar origin2022-01-11T19:33:40ZGeorg KoppenContents of FTP requests are cached and not isolated to the URL bar originContents of FTP requests can get cached but are currently not isolated to the URL bar origin which contradicts the goal of section 3.5.2 of the Tor Browser design documentation. The relevant code is here: https://mxr.mozilla.org/mozilla-...Contents of FTP requests can get cached but are currently not isolated to the URL bar origin which contradicts the goal of section 3.5.2 of the Tor Browser design documentation. The relevant code is here: https://mxr.mozilla.org/mozilla-central/source/netwerk/protocol/ftp/nsFtpConnectionThread.cpp
There are two things to note:
1) This caching works a bit differently than the familiar HTTP caching. E.g. there are no ETags and no headers involved, which makes scalable exploitation much harder (that's the only reason why I think the prio is normal, IMO).
2) Furthermore, only directory listings can get cached, not "normal" files like CSS or JS files loaded via FTP.Georg KoppenGeorg Koppenhttps://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/8282integrate tbb firefox osx sandbox in build2022-01-11T19:33:40ZTracintegrate tbb firefox osx sandbox in buildWe should make it possible to create bundles for osx that are sandboxed default, by updating the make file with additional targets.
The finalized bundle needs to have the following files in the appbundle root:
sandbox/ff.sb
sandbox/tor-wrapper
sandbox/tor.sb
Library/Vidalia/vidalia.conf
Contents/MacOS/TorBrowser.app/Contents/MacOS/firefox-wrapper
**Trac**:
**Username**: tramshttps://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/8288security, relability and repeatability issues in the TBB build process2022-01-11T19:33:40ZJacob Appelbaumsecurity, relability and repeatability issues in the TBB build processCurrently when building TBB on any system, we open the builder up to compromise. We also open ourselves up to reliability issues as a mirror might vanish and leave us out in the cold.
We rely on fetching software from servers that we do not control and in doing so, we use insecure transport mechanisms. Building TBB should not allow a local network attacker to get code execution on the builder's machine. I propose that we host at least one HTTPS mirror of the required source code. I've opened bug legacy/trac#8286 to discuss this topic and to propose patches. I believe this will make our build process more reliable as a third-party downed mirror will not prevent a build.
We also do not verify the dependencies for TBB - if someone were to simply tamper with the remote server's archive, the builder would be in trouble. I've opened a ticket to add what I think should be the current expected hashes to the build process in bug legacy/trac#8283. I think it would also make sense to _check_ against the expected hashes; I may or may not open a separate bug for that issue - thoughts?
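A minimal sketch of what such a hash check could look like in the build scripts (the file names here are hypothetical stand-ins, not the real build inputs):

```shell
# Sketch: pin the SHA-256 of a fetched dependency and refuse to build on
# any mismatch. The tarball is a stand-in for a real download.
printf 'example dependency payload\n' > dep.tar.gz

# In a real build, this list would be committed alongside the build scripts,
# not generated on the fly as done here for demonstration.
sha256sum dep.tar.gz > expected-sha256sums.txt

# Before building: verify every pinned hash, abort on mismatch.
if sha256sum -c expected-sha256sums.txt; then
    echo "hashes OK, proceeding"
else
    echo "hash mismatch, aborting" >&2
    exit 1
fi
```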
The goal of being able to build TBB on OS X from a clean slate is currently being discussed in legacy/trac#8246, and I think it is a reasonable goal to try to work homebrew into the process. Homebrew ensures that a similar hash check is done on software before it installs the software. Thus we'll have a nearly totally trusted chain of tools and source code to build TBB on OS X. Later, I think we should ensure this is the same for all platforms.
```
~/tor-browser_en-US % find .| xargs -n 1 scanelf -a -v
TYPE PAX PERM ENDIAN STK/REL/PTL TEXTREL RPATH BIND FILE
ET_DYN PeMRxS 0755 LE RW- --- RW- - - LAZY ./Lib/libevent_extra-2.0.so.5
ET_DYN PeMRxS 0755 LE RW- --- RW- - - LAZY ./Lib/libpng15.so.15
ET_DYN PeMRxS 0755 LE RW- --- RW- - - LAZY ./Lib/libpng15.so.15.13.0
ET_DYN PeMRxS 0755 LE RW- --- RW- - - LAZY ./Lib/libevent_core-2.0.so.5
ET_DYN PeMRxS 0755 LE RW- --- RW- - /srv/build-trees/build-alpha/x86_64/built/lib LAZY ./Lib/libQtGui.so.4
ET_DYN PeMRxS 0755 LE RW- --- RW- - /srv/build-trees/build-alpha/x86_64/built/lib LAZY ./Lib/libQtCore.so.4
ET_DYN PeMRxS 0644 LE RW- --- RW- - - LAZY ./Lib/libcrypto.so.1.0.0
ET_DYN PeMRxS 0644 LE RW- --- RW- - - LAZY ./Lib/libssl.so.1.0.0
ET_DYN PeMRxS 0755 LE RW- --- RW- - - LAZY ./Lib/libevent-2.0.so.5
ET_DYN PeMRxS 0755 LE RW- --- RW- - /srv/build-trees/build-alpha/x86_64/built/lib LAZY ./Lib/libQtNetwork.so.4
ET_DYN PeMRxS 0755 LE RW- --- RW- - /srv/build-trees/build-alpha/x86_64/built/lib LAZY ./Lib/libQtXml.so.4
TYPE PAX PERM ENDIAN STK/REL/PTL TEXTREL RPATH BIND FILE
ET_DYN PeMRxS 0755 LE RW- --- RW- - - LAZY ./Lib/libevent_extra-2.0.so.5
TYPE PAX PERM ENDIAN STK/REL/PTL TEXTREL RPATH BIND FILE
ET_DYN PeMRxS 0755 LE RW- --- RW- - - LAZY ./Lib/libpng15.so.15
TYPE PAX PERM ENDIAN STK/REL/PTL TEXTREL RPATH BIND FILE
ET_DYN PeMRxS 0755 LE RW- --- RW- - - LAZY ./Lib/libz/libz.so.1
TYPE PAX PERM ENDIAN STK/REL/PTL TEXTREL RPATH BIND FILE
ET_DYN PeMRxS 0755 LE RW- --- RW- - - LAZY ./Lib/libz/libz.so.1
TYPE PAX PERM ENDIAN STK/REL/PTL TEXTREL RPATH BIND FILE
ET_DYN PeMRxS 0755 LE RW- --- RW- - - LAZY ./Lib/libpng15.so.15.13.0
TYPE PAX PERM ENDIAN STK/REL/PTL TEXTREL RPATH BIND FILE
ET_DYN PeMRxS 0755 LE RW- --- RW- - - LAZY ./Lib/libevent_core-2.0.so.5
TYPE PAX PERM ENDIAN STK/REL/PTL TEXTREL RPATH BIND FILE
ET_DYN PeMRxS 0755 LE RW- --- RW- - /srv/build-trees/build-alpha/x86_64/built/lib LAZY ./Lib/libQtGui.so.4
TYPE PAX PERM ENDIAN STK/REL/PTL TEXTREL RPATH BIND FILE
ET_DYN PeMRxS 0755 LE RW- --- RW- - /srv/build-trees/build-alpha/x86_64/built/lib LAZY ./Lib/libQtCore.so.4
TYPE PAX PERM ENDIAN STK/REL/PTL TEXTREL RPATH BIND FILE
ET_DYN PeMRxS 0644 LE RW- --- RW- - - LAZY ./Lib/libcrypto.so.1.0.0
TYPE PAX PERM ENDIAN STK/REL/PTL TEXTREL RPATH BIND FILE
ET_DYN PeMRxS 0644 LE RW- --- RW- - - LAZY ./Lib/libssl.so.1.0.0
TYPE PAX PERM ENDIAN STK/REL/PTL TEXTREL RPATH BIND FILE
ET_DYN PeMRxS 0755 LE RW- --- RW- - - LAZY ./Lib/libevent-2.0.so.5
TYPE PAX PERM ENDIAN STK/REL/PTL TEXTREL RPATH BIND FILE
ET_DYN PeMRxS 0755 LE RW- --- RW- - /srv/build-trees/build-alpha/x86_64/built/lib LAZY ./Lib/libQtNetwork.so.4
TYPE PAX PERM ENDIAN STK/REL/PTL TEXTREL RPATH BIND FILE
ET_DYN PeMRxS 0755 LE RW- --- RW- - /srv/build-trees/build-alpha/x86_64/built/lib LAZY ./Lib/libQtXml.so.4
TYPE PAX PERM ENDIAN STK/REL/PTL TEXTREL RPATH BIND FILE
ET_EXEC PeMRxS 0755 LE RW- --- RW- - - LAZY ./App/vidalia
ET_EXEC PeMRxS 0755 LE RW- R-- RW- - /srv/build-trees/build-alpha/x86_64/built/lib NOW ./App/tor
TYPE PAX PERM ENDIAN STK/REL/PTL TEXTREL RPATH BIND FILE
ET_EXEC PeMRxS 0755 LE RW- --- RW- - - LAZY ./App/vidalia
TYPE PAX PERM ENDIAN STK/REL/PTL TEXTREL RPATH BIND FILE
ET_EXEC PeMRxS 0755 LE RW- --- RW- - - LAZY ./App/Firefox/firefox-bin
ET_EXEC PeMRxS 0755 LE RW- --- RW- - - LAZY ./App/Firefox/webapprt-stub
ET_DYN PeMRxS 0755 LE RW- --- RW- - - LAZY ./App/Firefox/libmozalloc.so
ET_EXEC PeMRxS 0755 LE RW- --- RW- - - LAZY ./App/Firefox/firefox
ET_DYN PeMRxS 0755 LE RW- --- RW- - - LAZY ./App/Firefox/libsoftokn3.so
ET_DYN PeMRxS 0755 LE RW- --- RW- - - LAZY ./App/Firefox/libxpcom.so
ET_DYN PeMRxS 0755 LE RW- --- RW- - - LAZY ./App/Firefox/libnssdbm3.so
ET_DYN PeMRxS 0755 LE RW- --- RW- - - LAZY ./App/Firefox/libplc4.so
ET_DYN PeMRxS 0755 LE RW- --- RW- - - LAZY ./App/Firefox/libxul.so
ET_EXEC PeMRxS 0755 LE RW- --- RW- - - LAZY ./App/Firefox/mozilla-xremote-client
ET_DYN PeMRxS 0755 LE RW- --- RW- - - LAZY ./App/Firefox/libnssckbi.so
ET_EXEC PeMRxS 0755 LE RW- --- RW- - - LAZY ./App/Firefox/plugin-container
ET_DYN PeMRxS 0755 LE RW- --- RW- - - LAZY ./App/Firefox/libnss3.so
ET_DYN PeMRxS 0755 LE RW- --- RW- - - LAZY ./App/Firefox/libmozsqlite3.so
ET_EXEC PeMRxS 0755 LE RW- --- RW- - - LAZY ./App/Firefox/updater
ET_DYN PeMRxS 0755 LE RW- --- RW- - - LAZY ./App/Firefox/libssl3.so
ET_DYN PeMRxS 0755 LE RW- --- RW- - - LAZY ./App/Firefox/libplds4.so
ET_DYN PeMRxS 0755 LE RW- --- RW- - - LAZY ./App/Firefox/libfreebl3.so
ET_DYN PeMRxS 0755 LE RW- --- RW- - - LAZY ./App/Firefox/libnssutil3.so
ET_DYN PeMRxS 0755 LE RW- --- RW- - - LAZY ./App/Firefox/libnspr4.so
ET_DYN PeMRxS 0755 LE RW- --- RW- - - LAZY ./App/Firefox/libsmime3.so
TYPE PAX PERM ENDIAN STK/REL/PTL TEXTREL RPATH BIND FILE
ET_EXEC PeMRxS 0755 LE RW- --- RW- - - LAZY ./App/Firefox/firefox-bin
TYPE PAX PERM ENDIAN STK/REL/PTL TEXTREL RPATH BIND FILE
ET_EXEC PeMRxS 0755 LE RW- --- RW- - - LAZY ./App/Firefox/webapprt-stub
TYPE PAX PERM ENDIAN STK/REL/PTL TEXTREL RPATH BIND FILE
ET_DYN PeMRxS 0755 LE RW- --- RW- - - LAZY ./App/Firefox/libmozalloc.so
TYPE PAX PERM ENDIAN STK/REL/PTL TEXTREL RPATH BIND FILE
ET_EXEC PeMRxS 0755 LE RW- --- RW- - - LAZY ./App/Firefox/firefox
TYPE PAX PERM ENDIAN STK/REL/PTL TEXTREL RPATH BIND FILE
ET_DYN PeMRxS 0755 LE RW- --- RW- - - LAZY ./App/Firefox/libsoftokn3.so
TYPE PAX PERM ENDIAN STK/REL/PTL TEXTREL RPATH BIND FILE
ET_DYN PeMRxS 0755 LE RW- --- RW- - - LAZY ./App/Firefox/libxpcom.so
TYPE PAX PERM ENDIAN STK/REL/PTL TEXTREL RPATH BIND FILE
ET_DYN PeMRxS 0755 LE RW- --- RW- - - LAZY ./App/Firefox/libnssdbm3.so
TYPE PAX PERM ENDIAN STK/REL/PTL TEXTREL RPATH BIND FILE
ET_DYN PeMRxS 0755 LE RW- --- RW- - - LAZY ./App/Firefox/components/libdbusservice.so
ET_DYN PeMRxS 0755 LE RW- --- RW- - - LAZY ./App/Firefox/components/libbrowsercomps.so
TYPE PAX PERM ENDIAN STK/REL/PTL TEXTREL RPATH BIND FILE
ET_DYN PeMRxS 0755 LE RW- --- RW- - - LAZY ./App/Firefox/components/libdbusservice.so
TYPE PAX PERM ENDIAN STK/REL/PTL TEXTREL RPATH BIND FILE
ET_DYN PeMRxS 0755 LE RW- --- RW- - - LAZY ./App/Firefox/components/libbrowsercomps.so
TYPE PAX PERM ENDIAN STK/REL/PTL TEXTREL RPATH BIND FILE
ET_DYN PeMRxS 0755 LE RW- --- RW- - - LAZY ./App/Firefox/libplc4.so
TYPE PAX PERM ENDIAN STK/REL/PTL TEXTREL RPATH BIND FILE
ET_DYN PeMRxS 0755 LE RW- --- RW- - - LAZY ./App/Firefox/libxul.so
TYPE PAX PERM ENDIAN STK/REL/PTL TEXTREL RPATH BIND FILE
ET_EXEC PeMRxS 0755 LE RW- --- RW- - - LAZY ./App/Firefox/mozilla-xremote-client
TYPE PAX PERM ENDIAN STK/REL/PTL TEXTREL RPATH BIND FILE
ET_DYN PeMRxS 0755 LE RW- --- RW- - - LAZY ./App/Firefox/libnssckbi.so
TYPE PAX PERM ENDIAN STK/REL/PTL TEXTREL RPATH BIND FILE
ET_EXEC PeMRxS 0755 LE RW- --- RW- - - LAZY ./App/Firefox/plugin-container
TYPE PAX PERM ENDIAN STK/REL/PTL TEXTREL RPATH BIND FILE
ET_DYN PeMRxS 0755 LE RW- --- RW- - - LAZY ./App/Firefox/libnss3.so
TYPE PAX PERM ENDIAN STK/REL/PTL TEXTREL RPATH BIND FILE
ET_DYN PeMRxS 0755 LE RW- --- RW- - - LAZY ./App/Firefox/libmozsqlite3.so
TYPE PAX PERM ENDIAN STK/REL/PTL TEXTREL RPATH BIND FILE
ET_EXEC PeMRxS 0755 LE RW- --- RW- - - LAZY ./App/Firefox/updater
TYPE PAX PERM ENDIAN STK/REL/PTL TEXTREL RPATH BIND FILE
ET_DYN PeMRxS 0755 LE RW- --- RW- - - LAZY ./App/Firefox/libssl3.so
TYPE PAX PERM ENDIAN STK/REL/PTL TEXTREL RPATH BIND FILE
ET_DYN PeMRxS 0755 LE RW- --- RW- - - LAZY ./App/Firefox/libplds4.so
TYPE PAX PERM ENDIAN STK/REL/PTL TEXTREL RPATH BIND FILE
ET_DYN PeMRxS 0755 LE RW- --- RW- - - LAZY ./App/Firefox/libfreebl3.so
TYPE PAX PERM ENDIAN STK/REL/PTL TEXTREL RPATH BIND FILE
ET_DYN PeMRxS 0755 LE RW- --- RW- - - LAZY ./App/Firefox/libnssutil3.so
TYPE PAX PERM ENDIAN STK/REL/PTL TEXTREL RPATH BIND FILE
ET_DYN PeMRxS 0755 LE RW- --- RW- - - LAZY ./App/Firefox/libnspr4.so
TYPE PAX PERM ENDIAN STK/REL/PTL TEXTREL RPATH BIND FILE
ET_DYN PeMRxS 0755 LE RW- --- RW- - - LAZY ./App/Firefox/libsmime3.so
TYPE PAX PERM ENDIAN STK/REL/PTL TEXTREL RPATH BIND FILE
ET_EXEC PeMRxS 0755 LE RW- R-- RW- - /srv/build-trees/build-alpha/x86_64/built/lib NOW ./App/tor
```
The output is explained in <a href="http://www.gentoo.org/proj/en/hardened/pax-utils.xml">the pax-utils</a> documentation.
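The LAZY/NOW values in the BIND column can be reproduced with readelf. A sketch, assuming gcc and GNU binutils are available (the source file and library names are made up for illustration):

```shell
# Sketch: build one shared object with lazy binding and one with full
# RELRO + immediate binding, then inspect the dynamic sections the way
# scanelf does.
cat > demo.c <<'EOF'
int answer(void) { return 42; }
EOF

gcc -shared -fPIC -Wl,-z,lazy -o lazy.so demo.c
gcc -shared -fPIC -Wl,-z,relro,-z,now -o now.so demo.c

# Only the hardened build carries the BIND_NOW / FLAGS NOW dynamic tags.
readelf -d now.so  | grep -E 'NOW' && echo "now.so: NOW"
readelf -d lazy.so | grep -E 'NOW' || echo "lazy.so: LAZY"
```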
A few things come to mind - one is that all our binaries should be set to BIND 'NOW' at run time. There are likely other things we could/should improve about these builds.Mike PerryMike Perryhttps://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/9461Tor AppArmor profile prevents flashproxy-client from starting2022-01-11T19:33:40ZproperTor AppArmor profile prevents flashproxy-client from startingSince legacy/trac#9460, and after looking at /etc/apparmor.d/system_tor, I am certain that Tor won't be allowed to execute flashproxy-client. (Didn't test.)weasel (Peter Palfrader)weasel (Peter Palfrader)https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/9623Referers being sent from hidden service websites2023-02-14T17:48:26ZcypherpunksReferers being sent from hidden service websitesCurrently, when browsing a hidden service website, clicking a clearnet/hidden service link sends the current address as the referer.
I think Tor Browser should, in certain cases, treat websites on .onion addresses the same as https:// websites on the clearnet.
Normally, when you click on a http link from a https website, it doesn't send any referer.
Tor Browser should at least use this same https behavior for http hidden services (both are encrypted, right?). No referers should be sent to clearnet sites or to other hidden services; the current behavior is unacceptable. I believe it shouldn't send referers for https links either, so send nothing at all.
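The https analogy can be sketched as shell pseudologic (this is not Firefox code, just the decision rule proposed above: a .onion origin counts as secure and never leaks a referer to a plain-http clearnet target; the URLs are hypothetical):

```shell
# Decide whether a Referer header should accompany a navigation,
# treating .onion origins as secure in the same way as https origins.
is_secure() {
    case "$1" in
        https://*)                       return 0 ;;
        http://*.onion|http://*.onion/*) return 0 ;;
        *)                               return 1 ;;
    esac
}

send_referer() {   # prints "send" or "omit" for a source -> target navigation
    if is_secure "$1" && ! is_secure "$2"; then
        echo omit   # secure -> insecure: strip the referer
    else
        echo send
    fi
}

send_referer "http://example.onion/page" "http://clearnet.example/"   # omit
send_referer "http://a.example/" "http://b.example/"                  # send
```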
Rather than such a partial solution, I still believe using the [smart referer](https://addons.mozilla.org/en-us/firefox/addon/smart-referer/) extension is a better solution overall.https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/9864Make it easier for users to do file verification2022-01-11T19:33:40ZMatt PaganMake it easier for users to do file verificationVerifying the contents of the Tor Browser Bundle seems to be one of the most confusing things that we ask users to do. The help desk often gets requests from users seeking guidance on verifying bundles.
Our website documentation on file signature verification can be found at https://www.torproject.org/docs/verifying-signatures.html.en. Multiple users have reported that these instructions are confusing. I don't think this is entirely the fault of the page's author.
There are several issues here to consider:
1) On the file verification page we tell Windows users to download Gpg4win so they can verify the bundles. Unfortunately there's no verification tool for gpg4win.
2) The signature verification page will be out-of-date once TBB 3 becomes stable. Verifying TBB 3 requires users to verify a signed text file of sha256sums, and then take the sha256sum of the package and see if it matches what's in the signed text file. Currently there is no way to take the sha256sum of anything on Windows unless you compile a program to do it yourself or download and run an unverified .exe file from any number of http-only websites that show up in a Google search.
3) The command line interface is intimidating for many people. There are no instructions on our website for using GUI GnuPG frontends.SheriefSheriefhttps://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/10065Improve Hardening for TBB3.02022-01-11T19:33:40ZMike PerryImprove Hardening for TBB3.0In the rush to get the Windows builds working correctly, we may have disabled hardening options that we shouldn't have. I experienced crashes on Windows 7 with a few options as well. We should get to the bottom of those issues and if they persist, we should ping the mingw-w64 people and find out what the issue is.
Here's a set of tools for Windows to validate hardening:
http://www.microsoft.com/en-us/download/details.aspx?id=11910
https://www.microsoft.com/en-us/download/details.aspx?id=29851
Here's one for Linux:
http://www.trapkit.de/tools/checksec.html
Here's some random documentation:
http://wiki.debian.org/Hardening
http://stackoverflow.com/questions/13276692/safeseh-gs-on-g
https://developer.pidgin.im/ticket/15290Erinn ClarkErinn Clarkhttps://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/10138Ship 64bit builds for OS X with Gitian2022-01-11T19:33:40ZGeorg KoppenShip 64bit builds for OS X with GitianHaving a 64bit clang cross-compiler starting with ESR24 hopefully soon lets us envision a not so distant future where we could use that to produce 64bit TBBs for Mac OS X.https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/10250Disable RC4 in TBB Firefox2022-01-11T19:33:41ZJesse VictorsDisable RC4 in TBB FirefoxAttacks against RC4 have recently been reported as plausible, and Microsoft, among other groups, have recommended avoiding RC4 for symmetric-key encryption. I would recommend blacklisting cipher suites that rely upon RC4 so that other stronger algorithms, such as AES, will be preferred instead, so as to avoid these attacks.
For example, I have disabled 0x9c, 0x35, 0x5, 0x4, 0x2f, and 0xa in Chromium because they do not provide perfect forward secrecy, and 0xc007, 0xc011, and 0x66 because they rely on RC4 but do provide perfect forward secrecy.https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/10281Investigate usage of alternate memory allocators and memory hardening options2022-01-11T19:33:41ZMike PerryInvestigate usage of alternate memory allocators and memory hardening optionsOne thing we can do to improve the security of TBB is to build it with an alternate semi-hardened malloc implementation that attempts to randomize the allocation pattern and performs some minimal checks to guard against heap overflows and reference count issues in Firefox (perhaps by also enabling some additional reference count debugging features already in Firefox).
Such allocator behavior may make exploitation of various use-after-free vulnerabilities more difficult, as it would be harder to predict the location of reallocated regions during exploitation in order to get a target object to overlay an incorrectly freed object.
The downside is that this will likely come at the performance cost of lost locality, increased fragmentation, and the additional overhead of reference count checks, but this may be an acceptable cost for improved hardening against exploits.
The first question is: are there any existing drop-in replacement memory allocators we can use in place of Firefox's current jemalloc implementation?
The second question is will any of the Firefox refcounting checks actually help, or will they just increase runtime for no real benefit?Arthur EdelsteinArthur Edelsteinhttps://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/10393Torbrowser updates are verified through the Tor consensus2022-01-11T19:33:41ZTom LowenthalTorbrowser updates are verified through the Tor consensusTorbrowser's updater checks the Tor consensus to see whether Torbrowser's current version is recommended. If not, the updater gets updates from a location described in the consensus, and verifies downloaded updates against a hash provide...Torbrowser's updater checks the Tor consensus to see whether Torbrowser's current version is recommended. If not, the updater gets updates from a location described in the consensus, and verifies downloaded updates against a hash provided in the consensus.https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/10394Torbrowser's updater updates HTTPS-everywhere2022-01-11T19:33:41ZTom LowenthalTorbrowser's updater updates HTTPS-everywhereLet's think about shipping HTTPS-Everywhere solely via our updater, disabling update pings for that extension as well.Let's think about shipping HTTPS-Everywhere solely via our updater, disabling update pings for that extension as well.https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/10397Torbrowser's updater integrates additional protections from Thandy's threat m...2022-03-21T20:18:40ZTom LowenthalTorbrowser's updater integrates additional protections from Thandy's threat modelhttps://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/10498Get only the NoScript we want to our users2022-03-21T20:19:59ZcypherpunksGet only the NoScript we want to our usersNoscript is Firefox extension, known for years security tool and simplest way to stop stuff. 
The author of NoScript never used a [public repository](http://forums.informaction.com/viewtopic.php?p=10981#p10981) to demonstrate development progress; all known code was available as a standalone archive or file from [AMO](https://addons.mozilla.org/). However, the author used to sign the components of the archive [before version 2.6.6.9](http://hackademix.net/2013/07/20/noscript-and-flashgot-unsigned/). All we can do now is try to verify that the files weren't modified on the way, and there is still a chance to recreate the development history by hand or via a third-party [repository of version differences](https://github.com/avian2/noscript).
TBB fetches NoScript from AMO's servers during builds and via run-time add-on updates. Do we trust them that much?https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/10515Compile Firefox with buffer overflow protection2022-01-11T19:33:41ZbastikCompile Firefox with buffer overflow protectionIt appears that Firefox.exe is not compiled with buffer overflow protection enabled.
https://en.wikipedia.org/wiki/Buffer_overflow_protection
Browsers have holes, and this is better than relying on being fast enough when it comes to upgrades.https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/10599Investigate building TBB with SoftBound or AddressSanitizer2022-01-11T19:33:18ZMike PerryInvestigate building TBB with SoftBound or AddressSanitizerWe should see if we can get TBB to build with SoftBound+CETS, a memory-safety extension to LLVM: http://acg.cis.upenn.edu/softbound/
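What the requested protection does can be demonstrated on a small scale (assuming gcc; this is a toy program, not the actual Firefox build configuration):

```shell
# Sketch: compile a program with and without stack-smashing protection,
# then look for the __stack_chk_fail canary check the way checksec-style
# tools do.
cat > demo.c <<'EOF'
#include <string.h>
void copy(const char *src) { char buf[16]; strcpy(buf, src); }
int main(void) { copy("hello"); return 0; }
EOF

gcc -O0 -fno-stack-protector      -o unprotected demo.c
gcc -O0 -fstack-protector-strong  -o protected   demo.c

# Only the protected binary references the canary-failure handler.
nm protected   | grep -q __stack_chk_fail && echo "protected: canary present"
nm unprotected | grep -q __stack_chk_fail || echo "unprotected: no canary"
```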
Apparently to get full benefit we may need to annotate the Mozilla allocator, but we should be able to make a test build without that annotation (it will just treat the entire malloc pool as one allocation).
SAFECode is apparently an extension to SoftBound, but it has only been rebased to LLVM 3.2 (whereas SoftBound has been kept up to date with LLVM 3.4): http://safecode.cs.illinois.edu/
Other resources:
* https://events.ccc.de/congress/2013/Fahrplan/events/5412.html (CCC talk about building FreeBSD with Softbound)
* http://media.ccc.de/browse/congress/2013/30C3_-_5412_-_en_-_saal_1_-_201312271830_-_bug_class_genocide_-_andreas_bogk.html (Video for the same)
* http://blog.regehr.org/archives/939 (see especially the comments)
* http://lists.cs.uiuc.edu/pipermail/llvmdev/2012-April/048569.html (Related projects to SoftBound, including some enhancements/alternatives)Georg KoppenGeorg Koppenhttps://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/10820Create preferences for Firefox patches that need them2022-01-11T19:33:18ZMike PerryCreate preferences for Firefox patches that need themWe have several patches that could use a pref to control whether they are enabled.
Ideally, there would be a pref to control each one so that it only applies to private browsing mode windows (though this could also be broken off into separate tickets if it is substantial).
This ticket will serve as the parent ticket for tickets for prefs for our individual patches.
Feel free to file child tickets for any of those patches. All patches that need prefs are also fair game for any bounty programs or interview processes we may run.
Our patch set can be perused at:
https://gitweb.torproject.org/tor-browser.git/shortlog/refs/heads/tor-browser-24.3.0esr-1
An additional Google Doc with more information and Tor Trac and Mozilla Bugzilla bug numbers can be found at:
https://docs.google.com/spreadsheet/ccc?key=0AroPYigJXMK4dFhzUGl5eFFkY09XbjBSTlNVS3o2SWc
Not all patches obviously require prefs and/or private browsing mode detection, but there are many that do (especially the fingerprinting and isolation patches).https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/10840Revert #10682 and block fetches for real2022-01-11T19:33:18ZcypherpunksRevert #10682 and block fetches for reallegacy/trac#10682 should block the connection for updates for sure, and just in case, let's try to connect to localhost.https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/11096Randomize MAC address before start of Tor2022-01-11T19:33:18ZTracRandomize MAC address before start of TorI realize this is a tricky ask, as changing the MAC address of a computer requires root privileges. However, I think it is worth finding a suitable way of doing this.
Based on analysis of court documents and conversations with people in the government malware industry, it is my understanding that US government malware that has targeted Tor users (via TBB exploits) has specifically sought out the MAC address of the infected target's machine. Knowing the MAC address allows the government, at a later date, to verify that the machine they probed with their malware is the same device as the one they have seized through a raid of the person's home or office.
As long as the government is going to use the MAC address as a unique identifier, we might as well try to make it difficult for them.
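Whatever mechanism is chosen would need root to apply the change, but the generation step itself is simple. A sketch (the interface name in the comment is hypothetical):

```shell
# Sketch: generate a random locally-administered, unicast MAC address.
# Applying it (e.g. `ip link set dev eth0 address "$mac"` on Linux, with
# a hypothetical interface name) requires root, so only generation is shown.
random_mac() {
    # Read five random bytes as unsigned decimals and format them as the
    # trailing octets. Leading octet 0x02 sets the locally-administered
    # bit and clears the multicast bit, so the result can never collide
    # with a vendor-assigned (OUI) address.
    set -- $(od -An -N5 -tu1 /dev/urandom)
    printf '02:%02x:%02x:%02x:%02x:%02x\n' "$1" "$2" "$3" "$4" "$5"
}

random_mac
```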
**Trac**:
**Username**: csoghoianhttps://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/11511Investigate why TorLauncher is sometimes not loaded when starting TBB2022-01-11T19:33:18ZGeorg KoppenInvestigate why TorLauncher is sometimes not loaded when starting TBBNot sure how to frame this, but it seems there is the possibility that the Tor Browser Bundle is not working properly at least on some Linux machines: on #tor on Saturday there was a user, aurel, who extracted a fresh 64bit TBB 3.5.4 but could not start browsing. The reason was that about:addons showed a missing TorLauncher. We should investigate how this can happen.
```
/home/ubuntu/build/tor-browser/js/src/jsobj.cpp:1008:17: runtime error: load of value 120, which is not a valid value for type 'bool'
pkix_pl_object.c:580:31: runtime error: left shift of 4276994303 by 32 places cannot be represented in type 'long int'
/home/ubuntu/build/tor-browser/db/sqlite3/src/sqlite3.c:62742:22: runtime error: left shift of 173 by 24 places cannot be represented in type 'int'
/home/ubuntu/build/tor-browser/layout/style/nsCSSParser.cpp:4861:53: runtime error: load of value 128, which is not a valid value for type 'bool'
/home/ubuntu/build/tor-browser/layout/style/../base/nsStyleConsts.h:27:12: runtime error: load of value 4, which is not a valid value for type 'Side'
/home/ubuntu/build/tor-browser/layout/style/nsCSSParser.cpp:6181:3: runtime error: load of value 4, which is not a valid value for type 'Side'
/home/ubuntu/build/tor-browser/layout/style/nsCSSParser.cpp:7962:5: runtime error: load of value 4, which is not a valid value for type 'Side'
/home/ubuntu/build/tor-browser/dom/workers/Workers.h:81:18: runtime error: load of value 4294967295, which is not a valid value for type 'JSGCParamKey'
/home/ubuntu/build/tor-browser/dom/workers/Workers.h:135:23: runtime error: load of value 4294967295, which is not a valid value for type 'JSGCParamKey'
```

---

https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/12420
**Investigate deploying STACK to check for optimization-unstable code** (Georg Koppen, 2022-03-21)

Optimization-unstable code (code that is unexpectedly eliminated by compiler optimizations due to undefined behavior in the program) can lead to serious bugs in programs. We should think about deploying STACK, which helps to detect this class of bugs, at least when building our hardened bundles. Relevant reading material:
http://kqueue.org/blog/2013/09/17/cltq/
http://css.csail.mit.edu/stack/
http://pdos.csail.mit.edu/papers/stack:sosp13.pdf
http://pdos.csail.mit.edu/papers/ub:apsys12.pdf

---

https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/12425
**Investigate setjmp/longjmp-based exception handling for Tor Browser on Windows** (Georg Koppen, 2022-03-21)

As GCC does not implement Structured Exception Handling (SEH), we might want to enable setjmp/longjmp-based exception handling for Tor Browser on Windows. We should do this at least if there are no other exception-handling mechanisms enabled on Windows.

---

https://gitlab.torproject.org/tpo/applications/tor-browser/-/issues/12426
**Make use of HeapEnableTerminationOnCorruption in Tor Browser on Windows** (Georg Koppen, 2022-01-11)

This function gets defined in ipc/chromium/src/base/process_util* but is only used in the test suite: https://mxr.mozilla.org/mozilla-esr24/source/ipc/chromium/src/base/test_suite.h. We should make more use of it in the code itself. See https://blogs.msdn.com/b/oldnewthing/archive/2013/12/27/10484882.aspx for more information.