Sync i18n with recent changes

Slavi Pantaleev
2025-01-27 09:56:21 +02:00
parent 43d5596086
commit 83eedc44f1
347 changed files with 24578 additions and 22358 deletions


@@ -1,5 +1,5 @@
# SOME DESCRIPTIVE TITLE.
# Copyright (C) 2018-2024, Slavi Pantaleev, Aine Etke, MDAD community members
# Copyright (C) 2018-2025, Slavi Pantaleev, Aine Etke, MDAD community members
# This file is distributed under the same license as the matrix-docker-ansible-deploy package.
# FIRST AUTHOR <EMAIL@ADDRESS>, YEAR.
#
@@ -8,7 +8,7 @@ msgid ""
msgstr ""
"Project-Id-Version: matrix-docker-ansible-deploy \n"
"Report-Msgid-Bugs-To: \n"
"POT-Creation-Date: 2024-12-20 07:23+0200\n"
"POT-Creation-Date: 2025-01-27 09:54+0200\n"
"PO-Revision-Date: YEAR-MO-DA HO:MI+ZONE\n"
"Last-Translator: FULL NAME <EMAIL@ADDRESS>\n"
"Language-Team: LANGUAGE <LL@li.org>\n"
@@ -61,7 +61,7 @@ msgid "You may be thinking **if all files are stored locally as well, what's the
msgstr ""
#: ../../../docs/configuring-playbook-synapse-s3-storage-provider.md:20
msgid "You can run some scripts to delete the local files once in a while (which we do automatically by default - see [Periodically cleaning up the local filesystem](#periodically-cleaning-up-the-local-filesystem)), thus freeing up local disk space. If these files are needed in the future (for serving them to users, etc.), Synapse will pull them from the media storage provider on demand."
msgid "You can run some scripts to delete the local files once in a while (which we do automatically by default see [Periodically cleaning up the local filesystem](#periodically-cleaning-up-the-local-filesystem)), thus freeing up local disk space. If these files are needed in the future (for serving them to users, etc.), Synapse will pull them from the media storage provider on demand."
msgstr ""
#: ../../../docs/configuring-playbook-synapse-s3-storage-provider.md:22
@@ -69,185 +69,205 @@ msgid "While you will need some local disk space around, it's only to accommodat
msgstr ""
#: ../../../docs/configuring-playbook-synapse-s3-storage-provider.md:24
msgid "Installing"
msgid "Adjusting the playbook configuration"
msgstr ""
#: ../../../docs/configuring-playbook-synapse-s3-storage-provider.md:26
msgid "After [creating the S3 bucket and configuring it](configuring-playbook-s3.md#bucket-creation-and-security-configuration), you can proceed to configure `s3-storage-provider` in your configuration file (`inventory/host_vars/matrix.example.com/vars.yml`):"
msgid "After [creating the S3 bucket and configuring it](configuring-playbook-s3.md#bucket-creation-and-security-configuration), add the following configuration to your `inventory/host_vars/matrix.example.com/vars.yml` file:"
msgstr ""
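The vars.yml snippet this entry points at is a code block in the docs, so it does not appear among the translatable strings. As a rough sketch of what it contains (key names other than `matrix_synapse_ext_synapse_s3_storage_provider_config_endpoint_url` are assumed from the playbook's naming convention; the authoritative list is in `roles/custom/matrix-synapse/defaults/main.yml`):

```yaml
# Illustrative sketch only — verify key names against
# roles/custom/matrix-synapse/defaults/main.yml before using.
matrix_synapse_ext_synapse_s3_storage_provider_enabled: true
matrix_synapse_ext_synapse_s3_storage_provider_config_bucket: your-bucket-name
matrix_synapse_ext_synapse_s3_storage_provider_config_region_name: some-region-name
matrix_synapse_ext_synapse_s3_storage_provider_config_endpoint_url: https://s3.example.com
matrix_synapse_ext_synapse_s3_storage_provider_config_access_key_id: access-key-goes-here
matrix_synapse_ext_synapse_s3_storage_provider_config_secret_access_key: secret-key-goes-here
matrix_synapse_ext_synapse_s3_storage_provider_config_storage_class: STANDARD
```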
#: ../../../docs/configuring-playbook-synapse-s3-storage-provider.md:48
msgid "Extending the configuration"
msgstr ""
#: ../../../docs/configuring-playbook-synapse-s3-storage-provider.md:50
msgid "If you have existing files in Synapse's media repository (`/matrix/synapse/media-store/..`):"
msgid "There are some additional things you may wish to configure about the server."
msgstr ""
#: ../../../docs/configuring-playbook-synapse-s3-storage-provider.md:52
msgid "new files will start being stored both locally and on the S3 store"
msgstr ""
#: ../../../docs/configuring-playbook-synapse-s3-storage-provider.md:53
msgid "the existing files will remain on the local filesystem only until [migrating them to the S3 store](#migrating-your-existing-media-files-to-the-s3-store)"
msgid "Take a look at:"
msgstr ""
#: ../../../docs/configuring-playbook-synapse-s3-storage-provider.md:54
msgid "at some point (and periodically in the future), you can delete local files which have been uploaded to the S3 store already"
msgid "`roles/custom/matrix-synapse/defaults/main.yml` for some variables that you can customize via your `vars.yml` file"
msgstr ""
#: ../../../docs/configuring-playbook-synapse-s3-storage-provider.md:56
msgid "Regardless of whether you need to [Migrate your existing files to the S3 store](#migrating-your-existing-media-files-to-the-s3-store) or not, make sure you've familiarized yourself with [How it works?](#how-it-works) above and [Periodically cleaning up the local filesystem](#periodically-cleaning-up-the-local-filesystem) below."
msgid "Usage"
msgstr ""
#: ../../../docs/configuring-playbook-synapse-s3-storage-provider.md:58
msgid "Migrating your existing media files to the S3 store"
msgid "If you have existing files in Synapse's media repository (`/matrix/synapse/storage/media-store/…`):"
msgstr ""
#: ../../../docs/configuring-playbook-synapse-s3-storage-provider.md:60
msgid "Migrating your existing data can happen in multiple ways:"
msgid "new files will start being stored both locally and on the S3 store"
msgstr ""
#: ../../../docs/configuring-playbook-synapse-s3-storage-provider.md:61
msgid "the existing files will remain on the local filesystem only until [migrating them to the S3 store](#migrating-your-existing-media-files-to-the-s3-store)"
msgstr ""
#: ../../../docs/configuring-playbook-synapse-s3-storage-provider.md:62
msgid "at some point (and periodically in the future), you can delete local files which have been uploaded to the S3 store already"
msgstr ""
#: ../../../docs/configuring-playbook-synapse-s3-storage-provider.md:64
msgid "Regardless of whether you need to [Migrate your existing files to the S3 store](#migrating-your-existing-media-files-to-the-s3-store) or not, make sure you've familiarized yourself with [How it works?](#how-it-works) above and [Periodically cleaning up the local filesystem](#periodically-cleaning-up-the-local-filesystem) below."
msgstr ""
#: ../../../docs/configuring-playbook-synapse-s3-storage-provider.md:66
msgid "Migrating your existing media files to the S3 store"
msgstr ""
#: ../../../docs/configuring-playbook-synapse-s3-storage-provider.md:68
msgid "Migrating your existing data can happen in multiple ways:"
msgstr ""
#: ../../../docs/configuring-playbook-synapse-s3-storage-provider.md:70
msgid "[using the `s3_media_upload` script from `synapse-s3-storage-provider`](#using-the-s3_media_upload-script-from-synapse-s3-storage-provider) (very slow when dealing with lots of data)"
msgstr ""
#: ../../../docs/configuring-playbook-synapse-s3-storage-provider.md:63
#: ../../../docs/configuring-playbook-synapse-s3-storage-provider.md:71
msgid "[using another tool in combination with `s3_media_upload`](#using-another-tool-in-combination-with-s3_media_upload) (quicker when dealing with lots of data)"
msgstr ""
#: ../../../docs/configuring-playbook-synapse-s3-storage-provider.md:65
msgid "Using the `s3_media_upload` script from `synapse-s3-storage-provider`"
msgstr ""
#: ../../../docs/configuring-playbook-synapse-s3-storage-provider.md:67
msgid "Instead of using `s3_media_upload` directly, which is very slow and painful for an initial data migration, we recommend [using another tool in combination with `s3_media_upload`](#using-another-tool-in-combination-with-s3_media_upload)."
msgstr ""
#: ../../../docs/configuring-playbook-synapse-s3-storage-provider.md:69
msgid "To copy your existing files, SSH into the server and run `/matrix/synapse/ext/s3-storage-provider/bin/shell`."
msgstr ""
#: ../../../docs/configuring-playbook-synapse-s3-storage-provider.md:71
msgid "This launches a Synapse container, which has access to the local media store, Postgres database, S3 store and has some convenient environment variables configured for you to use (`MEDIA_PATH`, `BUCKET`, `ENDPOINT`, `UPDATE_DB_DAYS`, etc)."
msgstr ""
#: ../../../docs/configuring-playbook-synapse-s3-storage-provider.md:73
msgid "Then use the following commands (`$` values come from environment variables - they're **not placeholders** that you need to substitute):"
msgid "💡 **Note**: instead of using `s3_media_upload` directly, which is very slow and painful for an initial data migration, we recommend [using another tool in combination with `s3_media_upload`](#using-another-tool-in-combination-with-s3_media_upload)."
msgstr ""
#: ../../../docs/configuring-playbook-synapse-s3-storage-provider.md:75
msgid "`s3_media_upload update-db $UPDATE_DB_DURATION` - create a local SQLite database (`cache.db`) with a list of media repository files (from the `synapse` Postgres database) eligible for operating on"
msgstr ""
#: ../../../docs/configuring-playbook-synapse-s3-storage-provider.md:76
msgid "`$UPDATE_DB_DURATION` is influenced by the `matrix_synapse_ext_synapse_s3_storage_provider_update_db_day_count` variable (defaults to `0`)"
msgid "Using the `s3_media_upload` script from `synapse-s3-storage-provider`"
msgstr ""
#: ../../../docs/configuring-playbook-synapse-s3-storage-provider.md:77
msgid "`$UPDATE_DB_DURATION` defaults to `0d` (0 days), which means **include files which haven't been accessed for more than 0 days** (that is, **all files will be included**)."
msgstr ""
#: ../../../docs/configuring-playbook-synapse-s3-storage-provider.md:78
msgid "`s3_media_upload check-deleted $MEDIA_PATH` - check whether files in the local cache still exist in the local media repository directory"
msgid "To copy your existing files, SSH into the server and run `/matrix/synapse/ext/s3-storage-provider/bin/shell`."
msgstr ""
#: ../../../docs/configuring-playbook-synapse-s3-storage-provider.md:79
msgid "`s3_media_upload upload $MEDIA_PATH $BUCKET --delete --storage-class $STORAGE_CLASS --endpoint-url $ENDPOINT` - uploads locally-stored files to S3 and deletes them from the local media repository directory"
msgid "This launches a Synapse container, which has access to the local media store, Postgres database, S3 store and has some convenient environment variables configured for you to use (`MEDIA_PATH`, `BUCKET`, `ENDPOINT`, `UPDATE_DB_DAYS`, etc)."
msgstr ""
#: ../../../docs/configuring-playbook-synapse-s3-storage-provider.md:81
msgid "The `s3_media_upload upload` command may take a lot of time to complete."
msgid "Then use the following commands (`$` values come from environment variables — they're **not placeholders** that you need to substitute):"
msgstr ""
#: ../../../docs/configuring-playbook-synapse-s3-storage-provider.md:83
msgid "Instead of running the above commands manually in the shell, you can also run the `/matrix/synapse/ext/s3-storage-provider/bin/migrate` script which will run the same commands automatically. We demonstrate how to do it manually, because:"
msgid "`s3_media_upload update-db $UPDATE_DB_DURATION` — create a local SQLite database (`cache.db`) with a list of media repository files (from the `synapse` Postgres database) eligible for operating on"
msgstr ""
#: ../../../docs/configuring-playbook-synapse-s3-storage-provider.md:84
msgid "`$UPDATE_DB_DURATION` is influenced by the `matrix_synapse_ext_synapse_s3_storage_provider_update_db_day_count` variable (defaults to `0`)"
msgstr ""
#: ../../../docs/configuring-playbook-synapse-s3-storage-provider.md:85
msgid "it's what the upstream project demonstrates and it teaches you how to use the `s3_media_upload` tool"
msgid "`$UPDATE_DB_DURATION` defaults to `0d` (0 days), which means **include files which haven't been accessed for more than 0 days** (that is, **all files will be included**)."
msgstr ""
#: ../../../docs/configuring-playbook-synapse-s3-storage-provider.md:86
msgid "allows you to check and verify the output of each command, to catch mistakes"
msgid "`s3_media_upload check-deleted $MEDIA_PATH` — check whether files in the local cache still exist in the local media repository directory"
msgstr ""
#: ../../../docs/configuring-playbook-synapse-s3-storage-provider.md:87
msgid "includes progress bars and detailed output for each command"
msgid "`s3_media_upload upload $MEDIA_PATH $BUCKET --delete --storage-class $STORAGE_CLASS --endpoint-url $ENDPOINT` — uploads locally-stored files to S3 and deletes them from the local media repository directory"
msgstr ""
#: ../../../docs/configuring-playbook-synapse-s3-storage-provider.md:88
msgid "allows you to easily interrupt slow-running commands, etc. (the `/matrix/synapse/ext/s3-storage-provider/bin/migrate` starts a container without interactive TTY support, so `Ctrl+C` may not work and you and require killing via `docker kill ..`)"
#: ../../../docs/configuring-playbook-synapse-s3-storage-provider.md:89
msgid "The `s3_media_upload upload` command may take a lot of time to complete."
msgstr ""
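Put together, the migration sequence described by the entries above amounts to running the following inside the shell started by `/matrix/synapse/ext/s3-storage-provider/bin/shell` (the `$` values are environment variables provided by that shell, not placeholders):

```sh
# Run inside the container shell launched by /matrix/synapse/ext/s3-storage-provider/bin/shell;
# all $-variables are pre-set by that environment.
s3_media_upload update-db $UPDATE_DB_DURATION    # build cache.db listing eligible media files
s3_media_upload check-deleted $MEDIA_PATH        # check cached entries still exist locally
s3_media_upload upload $MEDIA_PATH $BUCKET --delete --storage-class $STORAGE_CLASS --endpoint-url $ENDPOINT
```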
#: ../../../docs/configuring-playbook-synapse-s3-storage-provider.md:90
msgid "Using another tool in combination with `s3_media_upload`"
#: ../../../docs/configuring-playbook-synapse-s3-storage-provider.md:91
msgid "Instead of running the above commands manually in the shell, you can also run the `/matrix/synapse/ext/s3-storage-provider/bin/migrate` script which will run the same commands automatically. We demonstrate how to do it manually, because:"
msgstr ""
#: ../../../docs/configuring-playbook-synapse-s3-storage-provider.md:92
msgid "To migrate your existing local data to S3, we recommend to:"
#: ../../../docs/configuring-playbook-synapse-s3-storage-provider.md:93
msgid "it's what the upstream project demonstrates and it teaches you how to use the `s3_media_upload` tool"
msgstr ""
#: ../../../docs/configuring-playbook-synapse-s3-storage-provider.md:94
msgid "**first** use another tool ([`aws s3`](#copying-data-to-amazon-s3) or [`b2 sync`](#copying-data-to-backblaze-b2), etc.) to copy the local files to the S3 bucket"
msgid "allows you to check and verify the output of each command, to catch mistakes"
msgstr ""
#: ../../../docs/configuring-playbook-synapse-s3-storage-provider.md:95
msgid "includes progress bars and detailed output for each command"
msgstr ""
#: ../../../docs/configuring-playbook-synapse-s3-storage-provider.md:96
msgid "**only then** [use the `s3_media_upload` tool to finish the migration](#using-the-s3_media_upload-script-from-synapse-s3-storage-provider) (this checks to ensure all files are uploaded and then deletes the local files)"
msgid "allows you to easily interrupt slow-running commands, etc. (the `/matrix/synapse/ext/s3-storage-provider/bin/migrate` starts a container without interactive TTY support, so `Ctrl+C` may not work and you and require killing via `docker kill …`)"
msgstr ""
#: ../../../docs/configuring-playbook-synapse-s3-storage-provider.md:98
msgid "Copying data to Amazon S3"
msgid "Using another tool in combination with `s3_media_upload`"
msgstr ""
#: ../../../docs/configuring-playbook-synapse-s3-storage-provider.md:100
msgid "To migrate your existing local data to S3, we recommend to:"
msgstr ""
#: ../../../docs/configuring-playbook-synapse-s3-storage-provider.md:102
msgid "**first** use another tool ([`aws s3`](#copying-data-to-amazon-s3) or [`b2 sync`](#copying-data-to-backblaze-b2), etc.) to copy the local files to the S3 bucket"
msgstr ""
#: ../../../docs/configuring-playbook-synapse-s3-storage-provider.md:104
msgid "**only then** [use the `s3_media_upload` tool to finish the migration](#using-the-s3_media_upload-script-from-synapse-s3-storage-provider) (this checks to ensure all files are uploaded and then deletes the local files)"
msgstr ""
#: ../../../docs/configuring-playbook-synapse-s3-storage-provider.md:106
msgid "Copying data to Amazon S3"
msgstr ""
#: ../../../docs/configuring-playbook-synapse-s3-storage-provider.md:108
msgid "To copy to AWS S3, start a container on the Matrix server like this:"
msgstr ""
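The actual command is a code block in the docs, which is why it does not appear among these strings. A hedged sketch of the idea, assuming an aws-cli container image and an env file at `/matrix/synapse/ext/s3-storage-provider/env` (both assumptions, not taken from the playbook), is an `aws s3 sync` against the local media store:

```sh
# Sketch only — image tag, env-file path and mount details are assumptions.
docker run -it --rm \
  --env-file=/matrix/synapse/ext/s3-storage-provider/env \
  --mount type=bind,src=/matrix/synapse/storage/media-store,dst=/work,ro \
  --entrypoint=/bin/sh \
  docker.io/amazon/aws-cli:latest \
  -c 'aws s3 sync /work/. s3://$BUCKET/'
```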
#: ../../../docs/configuring-playbook-synapse-s3-storage-provider.md:112
#: ../../../docs/configuring-playbook-synapse-s3-storage-provider.md:120
msgid "Copying data to an S3 alternative using the aws-s3 tool"
msgstr ""
#: ../../../docs/configuring-playbook-synapse-s3-storage-provider.md:114
msgid "To copy to a provider other than AWS S3 (e.g. Wasabi, Digital Ocean Spaces, etc.), you can use the command for [Copying data to Amazon S3](#copying-data-to-amazon-s3) with an added `--endpoint-url=$ENDPOINT` argument."
#: ../../../docs/configuring-playbook-synapse-s3-storage-provider.md:122
msgid "To copy to a provider other than AWS S3 (e.g. Storj, Wasabi, Digital Ocean Spaces, etc.), you can use the command for [Copying data to Amazon S3](#copying-data-to-amazon-s3) with an added `--endpoint-url=$ENDPOINT` argument."
msgstr ""
#: ../../../docs/configuring-playbook-synapse-s3-storage-provider.md:116
#: ../../../docs/configuring-playbook-synapse-s3-storage-provider.md:124
msgid "Add this argument to the command **as-is** (`$ENDPOINT` is an environment variable corresponding to `matrix_synapse_ext_synapse_s3_storage_provider_config_endpoint_url`, so you don't need to touch it). Make sure to add the argument **before** the final quote (`'`) of the command."
msgstr ""
#: ../../../docs/configuring-playbook-synapse-s3-storage-provider.md:118
#: ../../../docs/configuring-playbook-synapse-s3-storage-provider.md:126
msgid "Copying data to Backblaze B2"
msgstr ""
#: ../../../docs/configuring-playbook-synapse-s3-storage-provider.md:120
#: ../../../docs/configuring-playbook-synapse-s3-storage-provider.md:128
msgid "You can copy files to Backblaze B2 either by following the [Copying data to an S3 alternative using the aws-s3 tool](#copying-data-to-an-s3-alternative-using-the-aws-s3-tool) or by using the B2-specific [b2 command-line tool](https://www.backblaze.com/b2/docs/quick_command_line.html) as described below."
msgstr ""
#: ../../../docs/configuring-playbook-synapse-s3-storage-provider.md:122
#: ../../../docs/configuring-playbook-synapse-s3-storage-provider.md:130
msgid "To copy the data using the `b2` tool, start a container on the Matrix server like this:"
msgstr ""
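As with the AWS example, the concrete command lives in a docs code block. A sketch of the approach with the B2 CLI (the container image, credential variable names and paths below are assumptions) is an authorize-then-sync:

```sh
# Sketch only — image, credential names and paths are assumptions; adjust to your setup.
docker run -it --rm \
  -e B2_KEY_ID=your-key-id \
  -e B2_APPLICATION_KEY=your-application-key \
  -e B2_BUCKET=your-bucket-name \
  --mount type=bind,src=/matrix/synapse/storage/media-store,dst=/work,ro \
  --entrypoint=/bin/sh \
  docker.io/backblazeit/b2:latest \
  -c 'b2 authorize-account "$B2_KEY_ID" "$B2_APPLICATION_KEY" && b2 sync /work "b2://$B2_BUCKET"'
```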
#: ../../../docs/configuring-playbook-synapse-s3-storage-provider.md:136
#: ../../../docs/configuring-playbook-synapse-s3-storage-provider.md:144
msgid "Periodically cleaning up the local filesystem"
msgstr ""
#: ../../../docs/configuring-playbook-synapse-s3-storage-provider.md:138
#: ../../../docs/configuring-playbook-synapse-s3-storage-provider.md:146
msgid "As described in [How it works?](#how-it-works) above, when new media is uploaded to the Synapse homeserver, it's first stored locally and then also stored on the remote S3 storage."
msgstr ""
#: ../../../docs/configuring-playbook-synapse-s3-storage-provider.md:140
#: ../../../docs/configuring-playbook-synapse-s3-storage-provider.md:148
msgid "By default, we periodically ensure that all local files are uploaded to S3 and are then removed from the local filesystem. This is done automatically using:"
msgstr ""
#: ../../../docs/configuring-playbook-synapse-s3-storage-provider.md:142
#: ../../../docs/configuring-playbook-synapse-s3-storage-provider.md:150
msgid "the `/matrix/synapse/ext/s3-storage-provider/bin/migrate` script"
msgstr ""
#: ../../../docs/configuring-playbook-synapse-s3-storage-provider.md:143
msgid ".. invoked via the `matrix-synapse-s3-storage-provider-migrate.service` service"
#: ../../../docs/configuring-playbook-synapse-s3-storage-provider.md:151
msgid " invoked via the `matrix-synapse-s3-storage-provider-migrate.service` service"
msgstr ""
#: ../../../docs/configuring-playbook-synapse-s3-storage-provider.md:144
msgid ".. triggered by the `matrix-synapse-s3-storage-provider-migrate.timer` timer, every day at 05:00"
#: ../../../docs/configuring-playbook-synapse-s3-storage-provider.md:152
msgid " triggered by the `matrix-synapse-s3-storage-provider-migrate.timer` timer, every day at 05:00"
msgstr ""
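Since the periodic migration is driven by ordinary systemd units, the usual tooling can be used to check on it (a usage sketch; the unit names are the ones listed above):

```sh
# When is the next automatic migration run scheduled?
systemctl list-timers matrix-synapse-s3-storage-provider-migrate.timer

# What did the last run do?
journalctl -u matrix-synapse-s3-storage-provider-migrate.service
```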
#: ../../../docs/configuring-playbook-synapse-s3-storage-provider.md:146
msgid "So.. you don't need to perform any maintenance yourself."
#: ../../../docs/configuring-playbook-synapse-s3-storage-provider.md:154
msgid "So you don't need to perform any maintenance yourself."
msgstr ""