Compare revisions
0: cs:Connected ro:Secondary/Primary ds:UpToDate/UpToDate C r-----
1: cs:Connected ro:Secondary/Primary ds:UpToDate/UpToDate C r-----
2: cs:Connected ro:Secondary/Primary ds:UpToDate/UpToDate C r-----
0: cs:SyncTarget ro:Secondary/Primary ds:Inconsistent/UpToDate C r-----
1: cs:SyncTarget ro:Secondary/Primary ds:Inconsistent/UpToDate C r-----
2: cs:Connected ro:Secondary/Primary ds:UpToDate/UpToDate C r-----
9: cs:WFConnection ro:Primary/Unknown ds:UpToDate/DUnknown C r-----
10: cs:WFConnection ro:Primary/Unknown ds:UpToDate/DUnknown C r-----
18: cs:SyncSource ro:Primary/Secondary ds:UpToDate/Inconsistent C r-----
DRBD CRITICAL: Device 10 WFConnection UpToDate, Device 9 WFConnection UpToDate
DRBD CRITICAL: Device 28 WFConnection UpToDate, Device 3 WFConnection UpToDate, Device 31 WFConnection UpToDate, Device 4 WFConnection UpToDate
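Status lines like the ones above come from DRBD's `/proc/drbd` interface; a quick way to check replication state on a node is (a sketch, assuming `drbd-utils` is installed):

```
# per-device connection state, roles and disk states
cat /proc/drbd
# newer drbd-utils expose the same information via
drbdadm status
```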
paper jam, and use a fresh ream of paper as used paper tends to jam
openstack keypair create --public-key=~/.ssh/id_rsa.pub anarcat
openstack keypair create --public-key=<(gpg --export-ssh-key anarcat@debian.org) anarcat
* `--keypair=anarcat` refers to the keypair created in the
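To double-check that the key was registered with the expected fingerprint, the same OpenStack CLI can list and show keypairs (a sketch):

```
openstack keypair list
openstack keypair show anarcat
```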
sudo mkfs.msdos /dev/loop0 &&
This page describes the role of the help desk coordinator. This role is currently handled by Colin "Phoul" Childs.
### Failed update with 'Error running context: An error occured during authentication'
> Your node is synching the entire blockchain and validating the consensus rules...
hel.icmp.hetzner.com
[this paper]: http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.187.3062&rep=rep1&type=pdf
DMARC result, which is then analysed by the [Verify DMARC scrip][].
[Verify DMARC scrip]: https://rt.torproject.org/Admin/Scrips/Modify.html?id=34
typically an unexpected and unplanned emergency that derails the above
0.10 release of the crate also supports signature generation, PIN
[codespell]
skip = *.json,*.csv,./.git/*/*,./.git/*/*/*
exclude-file = .codespellexclude
.ikiwiki
doc/css
doc/js
doc/*.html
---
# this is a pre-processing job that runs inside a git-enabled
# container
#
# it is designed to run when a new commit is pushed to the repository,
# not merge requests, for which there is a separate job below.
#
# it finds the modified files to make sure we only run the linter on
# those files. it uses a separate image because
# markdownlint/markdownlint doesn't ship with git (and runs as a
# regular user, so we can't install it either)
find-files:
stage: build
image: containers.torproject.org/tpo/tpa/base-images/debian:stable
script:
- apt update && apt install -yy --no-install-recommends git ca-certificates
- export LATEST_COMMIT_SHA="${CI_MERGE_REQUEST_SOURCE_BRANCH_SHA:-${CI_COMMIT_SHA}}"
- echo "commit SHA $LATEST_COMMIT_SHA"
- |
echo "working on files... $(git diff-tree --no-commit-id --name-only -r $LATEST_COMMIT_SHA | tee changed-files.txt)"
rules:
- if: $CI_MERGE_REQUEST_SOURCE_BRANCH_SHA != null || $CI_COMMIT_SHA != null
- changes:
paths:
- "*.md"
- "**/*.md"
artifacts:
paths:
- changed-files.txt
expose_as: 'changed files'
# this compares the current repo with the actual wiki to make sure
# we're not missing any commits, and will fail the push if we need to
# pull from the wiki
#
# it doesn't run on merge requests to leave those poor people alone
fail-on-desync-wiki:
stage: build
image: containers.torproject.org/tpo/tpa/base-images/debian:stable
script:
- apt update && apt install -yy --no-install-recommends git ca-certificates
- git fetch wiki || git remote add -f wiki https://gitlab.torproject.org/tpo/tpa/team.wiki.git
- git merge --ff-only wiki/master
rules:
- if: '$CI_COMMIT_BRANCH == $CI_DEFAULT_BRANCH'
# not an actual, full link check, just some internal consistency
# checks for now
linkcheck:
image: containers.torproject.org/tpo/tpa/base-images/python:bookworm
script:
- ./bin/check-links.py -v
# this runs after the "build" stage above, and consumes the
# `changed-files.txt` artifact from that stage, regardless of the job
# which generated it.
mdlint:
image: containers.torproject.org/tpo/tpa/base-images/debian:trixie
needs:
- job: find-files
artifacts: true
script:
- echo C.UTF-8 > /etc/locale.gen
- apt update && apt install -yy markdownlint locales
- |
echo "working on files: $(cat changed-files.txt)"
- env LANG=C.UTF-8 ./bin/mdl-wrapper $(cat changed-files.txt)
rules:
- changes:
paths:
- "*.md"
- "**/*.md"
# this will simply run all the time, regardless of which files
# changed, so it doesn't require the above
mdlintall:
image: containers.torproject.org/tpo/tpa/base-images/debian:trixie
needs:
- job: find-files
artifacts: true
script:
- echo C.UTF-8 > /etc/locale.gen
- apt update && apt install -yy markdownlint locales
- echo 'this is important to get the return value of mdl, not grep'
- set -o pipefail
- |
env LANG=C.UTF-8 mdl . | ( grep -v "Kramdown Warning: No link definition for link ID '\[\?_toc_\]\?' found on line" || true )
# this could be turned into allow_failures:exit_codes 2 when
# everything but [[_toc_]] is fixed (because that will never be)
allow_failure: true
codespell:
image:
name: containers.torproject.org/tpo/tpa/base-images/debian:stable
needs:
- job: find-files
artifacts: true
before_script:
- apt update
- apt install -qy codespell
script:
- codespell $(cat changed-files.txt)
rules:
- changes:
paths:
- "*.md"
- "**/*.md"
codespellall:
image:
name: containers.torproject.org/tpo/tpa/base-images/debian:stable
needs:
- job: find-files
artifacts: true
before_script:
- apt update
- apt install -qy codespell
script:
- codespell
allow_failure: true
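The same checks can be reproduced locally before pushing (a sketch; it assumes `codespell` and `markdownlint` (`mdl`) are installed and that you are at the root of the wiki checkout):

```
# spell check the whole tree, as the codespellall job does
codespell
# lint only the Markdown files touched by the last commit, as mdlint does
./bin/mdl-wrapper $(git diff-tree --no-commit-id --name-only -r HEAD | grep '\.md$')
```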
---
howto/backup: service/backup
howto/template: service/template
service/donate-review: service/donate
howto/prometheus: service/prometheus
all
# MD003 Header style
#
# default is "consistent", would be better to set to ATX (because it
# makes "outlines" and grepping more readable), but too many matches
exclude_rule "MD003"
# MD004 - Unordered list style
#
# we don't care if people switch halfway, for now, too many matches
exclude_rule "MD004"
# MD005 Inconsistent indentation for list items at the same level
#
# For some reason, it breaks with:
#
# * a
# * b
# * c
#
# which looks fine to me
exclude_rule "MD005"
# MD006 starting bulleted lists at the beginning of the line
#
# too many matches
exclude_rule "MD006"
# MD007 Unordered list indentation
#
# same as the above
exclude_rule "MD007"
# MD009 trailing spaces
#
# no rationale provided by markdownlint, too many matches
exclude_rule "MD009"
# MD010 Hard tabs
#
# matches tabs even in code blocks, which is frustrating because
# sometimes tabs *are* used in terminal output
exclude_rule "MD010"
# MD013 - Line length
#
# no rationale provided by markdownlint, too many matches
exclude_rule "MD013"
# MD024 Multiple headers with the same content
#
# the rationale is that some markdown parsers would generate duplicate
# header anchors, but we consider those parsers broken. in any case,
# that is not how the GitLab wiki parser behaves
exclude_rule "MD024"
# MD025 Multiple top level headers in the same document
#
# corollary of the above
exclude_rule "MD025"
# MD026 Trailing punctuation in header
#
# allow exclamation marks and question marks in headings
rule 'MD026', :punctuation => ".,;:"
# MD029 Ordered list item prefix
#
# allow 1. 1. 1. and 1.2.3....
exclude_rule "MD029"
# MD034 Bare URL used
#
# seems legit, but too many matches.
exclude_rule "MD034"
# MD033 Inline HTML
#
# <kbd> is too useful and pretty
exclude_rule "MD033"
# MD040 Fenced code blocks should have a language specified
#
# we have too many shell scripts and random samples
exclude_rule "MD040"
# MD041 First line in file should be a top level header
#
# it's fine to start a document without a heading. it's called a
# lead. the document title should be in the front matter, not in a
# heading
exclude_rule "MD041"
# MD046 Code block style
#
# no rationale provided, seems legit to mix ``` and prefixes
exclude_rule "MD046"
# this is a markdownlint configuration file, see this documentation
# for more information:
# https://github.com/markdownlint/markdownlint/blob/master/docs/configuration.md
# this includes our default style in the styles/ subdirectory, next to
# this configuration file
style "#{File.dirname(__FILE__)}/.mdl-style.rb"
# this will show compile-time warnings from the Kramdown engine, which
# will identify errors like [link][target] without a [target]:.
show_kramdown_warnings true
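Assuming the configuration above is the repository's `.mdlrc`, sitting next to `.mdl-style.rb` at the top of the checkout, `mdl` picks both up automatically when run from there, for example:

```
# lint a single page, or the whole wiki
mdl doc/accounts.md
mdl .
```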
StylesPath = .vale
MinAlertLevel = suggestion
Vocab = tpa
[*.md]
BasedOnStyles = Vale
---
extends: spelling
message: "Did you really mean '%s'?"
level: warning
ignore:
- gitlab/spelling-exceptions.txt
[Aa]narcat
Ansible
Arti
calcurses
colocation
cState
Debian
DRBD
gaba
Gandi
Ganeti
GitHub
GitLab
Gitolite
Gmail
Gollum
Greasemonkey
Hetzner
Hugo
iCal
ikhal
ikiwiki
Inkscape
IPSec
Isabela
Kaniko
kez
khal
Korganizer
lavamind
lego
Lektor
Linaro
mdBook
MediaWiki
mkdocs
Nagios
Nextcloud
Nitrokey
NVMe
PCIe
pgbarman
Podman
Pospesel
Riseup
Scaleway
Sunet
Supermicro
TPA
Trac
vdirsyncer
VLAN
WikiMedia
Yubico
YubiKey
#!/bin/sh
set -e
set -u
WIKI_DIR=~/wikis/help.torproject.org
podman run -it --rm \
-v $PWD:/docs \
-v $WIKI_DIR/.vale:/styles \
-v $WIKI_DIR/.vale.ini:/docs/.vale.ini \
-w /docs \
docker.io/jdkato/vale "${1:-.}"
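Assuming the wrapper above is saved as an executable script (the path `bin/vale` below is only an example), it can be pointed at a single page or, with no argument, at the whole tree:

```
# check one page
./bin/vale doc/accounts.md
# check everything under the current directory (the default)
./bin/vale
```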
# Quick links
* [How to get help!](policy/tpa-rfc-2-support#how-to-get-help)
* [User documentation](docs)
* [Sysadmin howtos](howto)
* [Services](service)
* [Support](support)
* [User documentation](doc)
* [Sysadmin how-to's](howto)
* [Service list](service)
* [Machine list](https://db.torproject.org/machines.cgi)
* [Policies](policy)
* [Meetings](meeting)
* [Roadmaps](roadmap)
---
## [View All Pages](pages)
<!-- when this page is updated, home.md must be as well. -->
#!/usr/bin/python3
# coding: utf-8
"""check for missing things in the wiki"""
# Copyright (C) 2024 Antoine Beaupré <anarcat@debian.org>
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as
# published by the Free Software Foundation, either version 3 of the
# License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
from __future__ import division, absolute_import
from __future__ import print_function, unicode_literals
import argparse
from glob import glob
import logging
import os.path
import re
import sys
from typing import Iterator
def parse_args():
parser = argparse.ArgumentParser(description=__doc__, epilog="""""")
parser.add_argument(
"--verbose",
"-v",
dest="log_level",
action="store_const",
const="info",
default="warning",
)
parser.add_argument(
"--debug",
"-d",
dest="log_level",
action="store_const",
const="debug",
default="warning",
)
parser.add_argument(
"--directory", "-t", default=".", help="base directory, default: %(default)s"
)
parser.add_argument(
"--exclude",
"-e",
default=[49],
nargs="+",
help="exclude the given RFC numbers, default: %(default)s",
)
parser.add_argument("--dryrun", "-n", action="store_true", help="do nothing")
args = parser.parse_args()
try:
args.exclude = list(map(int, args.exclude))
except ValueError as exc:
parser.error("could not parse --exclude value: %s" % exc)
return args
RFC_FILENAME_RE = r"tpa-rfc-(\d+)-[^)]*"
def find_policy_numbers_toc(path: str) -> set[int]:
with open(path) as fp:
policy_toc = fp.read()
found_policy_numbers = set()
for m in re.finditer(r"\(policy/" + RFC_FILENAME_RE + r"\)", policy_toc):
found_policy_numbers.add(int(m.group(1)))
return found_policy_numbers
def find_policy_numbers_filenames(directory: str) -> set[int]:
policy_filenames = glob(os.path.join(directory, "policy/tpa-rfc-*.md"))
policy_numbers_filenames = set()
for policy_file in policy_filenames:
m = re.search(RFC_FILENAME_RE, policy_file)
assert m, "oops, %s doesn't match pattern %s, regex doesn't match glob?" % (
policy_file,
RFC_FILENAME_RE,
)
policy_numbers_filenames.add(int(m.group(1)))
return policy_numbers_filenames
def find_gaps(number_list: list[int]) -> Iterator[int]:
prev = None
for i in sorted(number_list):
if prev is None:
prev = i
continue
if i != prev + 1:
yield from range(prev + 1, i)
prev = i
def main():
errors = 0
args = parse_args()
logging.basicConfig(format="%(message)s", level=args.log_level.upper())
policy_numbers_toc = find_policy_numbers_toc(
os.path.join(args.directory, "policy.md")
)
logging.debug(
"found %d policy items in policy.md: %s",
len(policy_numbers_toc),
policy_numbers_toc,
)
policy_numbers_filenames = find_policy_numbers_filenames(args.directory)
logging.debug(
"found %d policy files in policy/: %s",
len(policy_numbers_filenames),
policy_numbers_filenames,
)
if len(policy_numbers_filenames) != len(policy_numbers_toc):
logging.error(
"number of policy items mismatch between policy.md (%d) and policy/ directory (%d)",
len(policy_numbers_toc),
len(policy_numbers_filenames),
)
errors += 1
logging.debug("policy_numbers_toc: %r", policy_numbers_toc)
logging.debug("policy_numbers_filenames: %r", policy_numbers_filenames)
missing_filenames = [
x for x in policy_numbers_toc if x not in policy_numbers_filenames
]
missing_toc = [
x for x in policy_numbers_filenames if x not in policy_numbers_toc
]
if missing_filenames:
errors += 1
logging.info("missing filenames: %s", missing_filenames)
if missing_toc:
errors += 1
logging.info("missing from policy.md: %s", missing_toc)
filenames_gaps = [
x for x in find_gaps(policy_numbers_filenames) if x not in args.exclude
]
if filenames_gaps:
errors += 1
logging.info("gaps found in policy numbers filenames: %s", filenames_gaps)
toc_gaps = [x for x in find_gaps(policy_numbers_toc) if x not in args.exclude]
if toc_gaps:
errors += 1
logging.info("gaps found in policy numbers TOC: %s", toc_gaps)
logging.info(
"last RFC number is %s, next one would logically be %s",
max(policy_numbers_filenames),
max(policy_numbers_filenames) + 1,
)
if errors:
logging.error("check failed: %d errors found", errors)
sys.exit(1)
if __name__ == "__main__":
main()
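Assuming the checker above is saved as an executable script (the name `check-policy-numbers` below is only an example), a typical run from the root of the wiki checkout looks like this; it exits non-zero when gaps or mismatches are found:

```
./check-policy-numbers --verbose --exclude 49
```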
#!/bin/sh
if [ -n "$SKIP_LINT" ] ; then
echo "skipping markdown checks because SKIP_LINT is defined"
exit 0
fi
# This script will pass provided arguments to markdownlint (mdl), with
# Kramdown warnings enabled, but with the GitLab-specific (and
# non-standard) [[_TOC_]] blob removed. This only works on single
# files; if provided a directory, it will just throw the entire thing
# at mdl.
for path in "$@"; do
if [ -d "$path" ]; then
echo "checking directory $path..."
mdl "$path"
elif ! [ -e "$path" ]; then
echo "file $path does not exist, maybe it was removed or renamed? skipping."
else
# this could be a symlink, a normal file, or whatever, but it exists
case "$path" in
*.md|*.mdwn|*.markdown)
echo "checking file $path..."
# this also removes [x] style checklists which kramdown doesn't like
sed 's/^\[\[_TOC_\]\]/TOC_PLACEHOLDER/;s/^ *\([*-]\|[0-9][0-9]*\.\) *\[[x ]\] /\* /' "$path" | mdl - ;;
esac
fi
done
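The wrapper is normally invoked by the `mdlint` CI job above, but it can also be run by hand on specific files, or skipped entirely through the environment:

```
# lint one page
./bin/mdl-wrapper doc/accounts.md
# bypass the check entirely
SKIP_LINT=1 ./bin/mdl-wrapper doc/accounts.md
```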
# User Documentation
This documentation is primarily aimed at users.
Note: most of this documentation is a little chaotic and needs to be
merged with the [service listing](service). You might be interested in
one of the following quick links instead:
* [Email](service/email)
* [Forum (Discourse)](service/forum)
* [GitLab](howto/gitlab)
* [Nextcloud](service/nextcloud)
* [IRC](howto/irc)
* [Yubikeys](howto/yubikey)
Other documentation:
<!-- update with `ls -d doc/*.md | sed 's/.md$//;s/\(.*\)/ * [\1](doc\/\1)/'` -->
* [accounts](doc/accounts)
......@@ -9,7 +22,7 @@ This documentation is primarily aimed at users.
* [bits-and-pieces](doc/bits-and-pieces)
* [extra](doc/extra)
* [hardware-requirements](doc/hardware-requirements)
* [how-to-get-help](support)
* [naming-scheme](doc/naming-scheme)
* [reporting-email-problems](doc/reporting-email-problems)
* [services](doc/services)
......
......@@ -4,6 +4,9 @@ title: git, shell, ldap, etc. accounts
[[_TOC_]]
Note that this documentation needs work, as it overlaps with user
creation procedures, see [issue 40129](https://gitlab.torproject.org/tpo/tpa/team/-/issues/40129).
# torproject.org Accounts #
The Tor project keeps all user information in a central LDAP database which
......@@ -14,11 +17,11 @@ It also stores group memberships which in turn affects which users can log into
which [hosts](https://db.torproject.org/machines.cgi).
This document should be consistent with the [Tor membership
policy](https://gitlab.torproject.org/tpo/community/policies/-/blob/HEAD/docs/membership.md),
in case of discrepancy between the two documents, the membership
policy overrules this document.
## <a id="ldap-or-alias">Decision tree: LDAP account or email alias?</a> ##
## <a name="ldap-or-alias">Decision tree: LDAP account or email alias?</a> ##
Here is a simple decision tree to help you decide if a new contributor
needs an LDAP account, or if an email alias will do. (All things being
......@@ -37,11 +40,8 @@ Regardless of whether they are a Core Contributor:
Are they a Core Contributor?
* Do they want to make their own personal clones of our git repos, for example to propose patches and changes?
* They don't need an LDAP account for just this case anymore, since gitlab can host git repos. (They are also welcome to put their personal git repos on external sites if they prefer.)
* Do they need to log in to our servers to use our shared irc host?
* They should get an LDAP account.
* If they're not a Core Contributor, they should put their IRC somewhere
......@@ -53,8 +53,10 @@ Are they a Core Contributor?
to maintain services, then Tor Project Inc should request an LDAP account.
* If they are not a staff member, then an existing Core Contributor should
request an LDAP account, and explain why they need access.
* Do they need to be able to *send* email using an @torproject.org address?
* In our 2022/2023 process of locking down email, it's increasingly necessary for people to have a full LDAP account in order to deliver their Tor mail to the internet properly.
See <a href="../../howto/create-a-new-user">New LDAP accounts</a> for details.
See [New LDAP accounts](howto/create-a-new-user) for details.
### Email alias reasons ###
......@@ -65,9 +67,9 @@ If none of the above cases apply:
* Are they a staff member?
* Tor Project Inc should request an email alias.
See <a href="aliases">Changing email aliases</a> for details.
See <a href="#aliases">Changing email aliases</a> for details.
## <a id="new-account">New LDAP accounts</a> ##
## <a name="new-account">New LDAP accounts</a> ##
New accounts have to be sponsored by somebody who already has a torproject.org
account. If you need an account created, please find somebody in the project
......@@ -100,7 +102,7 @@ The sponsor will [create a ticket in GitLab](https://gitlab.torproject.org/tpo/t
#### username policy ####
Usernames are allocated on a first-come, first-served basis. Usernames
should be checked for conflict with commonly used administrative
aliases (`root`, `abuse`, ...) or abusive names (`killall*`, ...). In
particular, the following have special meaning for various services
and should be avoided:
......@@ -117,12 +119,12 @@ and should be avoided:
webmaster
That list, [taken from the leap
project](https://leap.se/git/leap_platform.git/tree/puppet/modules/site_postfix/manifests/mx/static_aliases.pp?id=eac3056c237d523f4786593922fe8f88eb65dff7)
is not exhaustive and your own judgement should be used to spot
possibly problematic aliases. See also those other possible lists:
* [systemli](https://github.com/systemli/userli/blob/cc7bd7d8744df321aaea5b6289a231657dbc7643/config/reserved_names.txt)
* [LEAP](https://leap.se/git/leap_platform.git/tree/puppet/modules/site_postfix/manifests/mx/static_aliases.pp?id=eac3056c237d523f4786593922fe8f88eb65dff7)
* [immerda](https://git.immerda.ch/iapi/tree/lib/iapi/helpers/forbidden_aliases.rb)
### Step n+1 ###
......@@ -133,7 +135,7 @@ and either approved or rejected.
If the board indicates their assent, the sysadmin team will then create the
account as requested.
## <a id="retiring-account">Retiring accounts</a> ##
## <a name="retiring-account">Retiring accounts</a> ##
If you won't be using your LDAP account for a while, it's good security
hygiene to have it disabled. Disabling an LDAP account is a simple
......@@ -149,7 +151,7 @@ are sufficient to confirm a disable request."
and accept that email forwarding for the person will stop working too,
or add a new line in the email alias so email keeps working.)
## <a id="get-access">Getting added to an existing group/Getting access to a specific host</a> ##
## <a name="get-access">Getting added to an existing group/Getting access to a specific host</a> ##
Almost all privileges in our infrastructure, such as account on a particular
host, sudo access to a role account, or write permissions to a specific
......@@ -173,22 +175,22 @@ you be added.
To find out who is on a specific group you can ssh to perdulce:
ssh perdulce.torproject.org
Then you can run:
getent group
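For example, to see which groups a particular person is in, or to list the members of a single group (the user and group names below are just placeholders):

```
getent group | grep -w anarcat
getent group tpa
```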
See also: the `"Host specific passwords"` section below
## <a id="aliases">Changing email aliases</a> ##
## <a name="aliases">Changing email aliases</a> ##
Create a ticket specifying the alias, the new address to add, and a
brief motivation for the change.
For specifics, see the "The sponsor will create a ticket" section above.
### <a id="new-aliases">Adding a new email alias</a> ###
### <a name="new-aliases">Adding a new email alias</a> ###
#### Personal Email Aliases ####
......@@ -202,13 +204,13 @@ Contributors.
Tor Project Inc and Core Contributors can request group email aliases for new
functions or projects.
### <a id="existing-aliases">Getting added to an existing email alias</a> ###
### <a name="existing-aliases">Getting added to an existing email alias</a> ###
Similar to being added to an LDAP group, the right way to get added
to an existing email alias is by getting somebody who is already on
that alias to file a ticket asking for you to be added.
## <a id="password-reset">Changing/Resetting your passwords</a> ##
## <a name="password-reset">Changing/Resetting your passwords</a> ##
### LDAP ###
......@@ -242,6 +244,8 @@ schedule to all hosts.
There are also delays involved in the mail loop, of course.
<a name="sudo" />
### Host specific passwords / sudo passwords ###
Your LDAP password can *not* be used to authenticate to `sudo` on
......@@ -261,8 +265,15 @@ Alternatively, or additionally, you can have per-host sudo passwords
-- just select the appropriate host in the pull-down box.
Once set on the web interface, you will have to confirm the new
settings by sending a signed challenge to the mail interface. The
challenge is a single line, without line breaks, provided by the web
interface. Using the challenge, you first need to produce an
OpenPGP signature:
echo 'confirm sudopassword ...' | gpg --armor --sign
With it you can compose an email to changes@db.torproject.org, sending
the challenge in the body, followed by the OpenPGP signature.
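For example (a sketch; the exact challenge string comes from the web interface, and any mail client can be used instead of the `mail` command shown here):

```
# sign the challenge exactly as provided, without extra line breaks
echo 'confirm sudopassword ...' | gpg --armor --sign > challenge.asc
# send the signed challenge to the mail interface
mail -s 'sudo password change' changes@db.torproject.org < challenge.asc
```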
Note that setting a sudo password will only enable you to use sudo to
configured accounts on configured hosts. Consult the output of "sudo
......@@ -272,7 +283,7 @@ you don't need to nor can use sudo.)
Do mind the delays in LDAP and sudo passwords change, mentioned in the
previous section.
## <a id="key-rollover">Changing/Updating your OpenPGP key</a> ##
## <a name="key-rollover">Changing/Updating your OpenPGP key</a> ##
If you are planning on migrating to a new OpenPGP key and you also want to
change your key in LDAP, or if you just want to update the copy of your key
......
......@@ -24,28 +24,29 @@ being expanded a bit to deserve their own page.
they're being deactivated in puppet.
### Clients
* `tor-puppet/modules/bacula/manifests/client.pp` gives an idea of
where things are at on backup clients.
* Clients run the Bacula File Daemon, `bacula-fd(8)`.
## Onion sites
* Example from a vhost template
<% if scope.function_onion_global_service_hostname(['crm-2018.torproject.org']) -%>
<Virtualhost *:80>
ServerName <%= scope.function_onion_global_service_hostname(['crm-2018.torproject.org']) %>
Use vhost-inner-crm-2018.torproject.org
</VirtualHost>
<% end -%>
* Function defined in
`tor-puppet/modules/puppetmaster/lib/puppet/parser/functions/onion_global_service_hostname.rb`
parses
`/srv/puppet.torproject.org/puppet-facts/onionbalance-services.yaml`.
* `onionbalance-services.yaml` is populated through
`onion::balance` (`tor-puppet/modules/onion/manifests/balance.pp`)
* `onion:balance` uses the `onion_balance_service_hostname` fact from `tor-puppet/modules/torproject_org/lib/facter/onion-services.rb`
## Puppet
......
---
title: Hardware requirements
---
# Physical hardware requirements
If you want to [donate hardware][], there are specific requirements
for machines we manage that you should follow. For other donations,
......@@ -13,24 +10,94 @@ please see the [donation site][].
[donation site]: https://donate.torproject.org/
This list is not final, and if you have questions, please [contact
us](support). Note that we also accept virtual machine "donations"
now, for which the requirements are different; see below.
## Must have
* Out of band management with dedicated network port, preferably a
BMC, or failing that, serial console and networked power bars
* Rackmount
* No human intervention to power on or reboot
* Warranty or post-warranty hardware support, preferably provided by
the sponsor
* Under the 'ownership' of Tor, although long-term loans can also
work
* Rescue system (PXE bootable OS or remotely loadable ISO image)
## Nice to have
* Production quality rather than pre-production hardware
* Support for multiple drives (so we can do RAID) although this can
be waived for disposable servers like build boxes
* Hosting for the machine: we do not run our own data centers or
rack, so it would be preferable if you can also find a hosting
location for the machine; see the [hosting requirements](#hosting-requirements) below
for details
## To avoid
* proprietary Java/ActiveX remote consoles
* hardware RAID, unless supported with open drivers in the mainline
Linux kernel and userland utilities
# Hosting requirements
These are requirements that apply to actual physical or virtual
hosting of machines.
## Must have
* 100-400W per unit density, depending on workload
* 1-10gbit, unmetered
* dual stack (IPv4 and IPv6)
* IPv4 address space (at least one per unit, typically 4-8 per unit)
* out of band access (IPMI or serial)
* rescue systems (e.g. PXE booting)
* remote hands SLA ("how long to replace a broken hard drive?")
## Nice to have
* "clean" IP addresses (for mail delivery, etc)
* complete /24 IPv4, donated to the Tor project
* private VLANs with local network
* BGP announcement capabilities
* not in Europe or North America
* free, or ~$150/unit
# Virtual machine requirements
## Must have
Without those, we will basically need to be convinced before accepting
the machines:
* Debian OS
* Shell access (over SSH)
* Unattended reboots or upgrades
The latter might require more explanation. It means the machine can
be rebooted without intervention from an operator. That seems trivial,
but some setups make it difficult. This is essential so that we can
apply Linux kernel upgrades. Alternatively, manual reboots are
acceptable if such security upgrades are applied automatically.
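On a plain Debian VM this is typically handled with the unattended-upgrades package, optionally allowing automatic reboots when a kernel update requires one (a sketch of a minimal setup, not a description of TPA's actual Puppet-managed configuration):

```
apt install unattended-upgrades
# enable the periodic apt runs that apply security updates
dpkg-reconfigure -plow unattended-upgrades
# optionally, let the machine reboot itself when an update requires it:
# in /etc/apt/apt.conf.d/50unattended-upgrades, set
#   Unattended-Upgrade::Automatic-Reboot "true";
```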
## Nice to have
Those we would have in an ideal world, but are not deal breakers:
* Full disk encryption
* Rescue OS boot to install our own OS
* Remote console
* Provisioning API (cloud-init, OpenStack, etc)
* Reverse DNS
* Real IP address (no NAT)
## To avoid
Those are basically deal breakers, but we have been known to accept
such situations in extreme cases:
* No control over the running kernel
* Proprietary drivers
How to get help
===============
This policy was moved to [the support section](support).
---
title: Lektor website development environment on macOS
---
# Overview
The aim of this document is to explain the steps required to set up a local
Lektor development environment suitable for working on Tor Project websites
based on the Lektor platform.
We'll be using the [Sourcetree][] git GUI to provide a user-friendly method of
working with the various website's git repositories.
[Sourcetree]: https://www.sourcetreeapp.com/
# Prerequisites
First we'll install a few prerequisite packages, including Sourcetree.
You must have administrator privileges to install these software packages.
First we'll install the Xcode package.
Open the Terminal app and enter:
xcode-select --install
Click `Install` on the dialog that appears.
Now, we'll install the `brew` package manager, again via the Terminal:
/bin/bash -c "$(curl -fsSL
https://raw.githubusercontent.com/Homebrew/install/HEAD/install.sh)"
Now we're ready to install a few more tools:
brew install coreutils git git-lfs python3.8
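If Sourcetree does not set up Git LFS hooks automatically, it may also be necessary to initialize Git LFS once for your user:

```
git lfs install
```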
And lastly we need to download and install Sourcetree. This can be done from
the app's website: https://www.sourcetreeapp.com/
Follow the installer prompts, entering name and email address so that the git
commits are created with adequate identifying information.
![](lektor-dev-macos/install_preferences.png)
# Connect GitLab account
*This step is only required if you want to create Merge Requests in GitLab.*
Next, we'll create a GitLab token to allow Sourcetree to retrieve and update
projects.
* Navigate to https://gitlab.torproject.org/-/profile/personal_access_tokens
* Enter `sourcetree` under **Token name**
* Choose an expiration date, ideally not more than a few months
* Check the box next to `api`
* Click **Create personal access token**
* Copy the token into your clipboard
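Optionally, you can verify the token from the Terminal before handing it to Sourcetree (replace `<token>` with the value copied above; this is just a sanity check against the GitLab API):

```
curl --header "PRIVATE-TOKEN: <token>" "https://gitlab.torproject.org/api/v4/projects?membership=true"
```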
Now, open Sourcetree and click the **Connect...** button on the main window,
then **Add...**, and fill in the dialog as below. Paste the token in the
**Password** field.
![](lektor-dev-macos/add_account.png)
Click the **Save** button.
The **Remote** tab on the main window should now show a list of git
repositories available on the Tor Project GitLab.
To clone a project, enter its name (e.g. `tpo` or `blog`) in the **Filter
repositories** input box and click the **Clone** link next to it.
Depending on the project, a dialog titled **Git LFS: install required** may
then appear. If so, click **Yes** to ensure all the files in the project are
downloaded from GitLab.