Compare commits


67 Commits

Author SHA1 Message Date
e64ada0067 docs and recipes update 2024-09-13 01:48:38 +03:00
d9a2045d32 try to improve parser 2024-09-13 01:42:43 +03:00
07eb930bd1 simplify typed get 2024-09-11 03:14:23 +03:00
e7eccca342 completely remove makepkg calls 2024-09-11 02:57:37 +03:00
6e232f0cd6 pkgbuild parser impl 2024-09-10 17:50:34 +03:00
05f87a36d6 generate filenames without using makepkg 2024-09-06 16:16:52 +03:00
1d85a61cc4 feat: get rid of jquery (#133) 2024-09-05 02:26:52 +03:00
689de82139 build: make cerberus dependency optional 2024-09-04 22:28:25 +03:00
5b9f35220f feat: implement stats subcommand (#132) 2024-09-04 22:28:25 +03:00
8fc4d7b4a5 feat: allow filter events by timestamp 2024-09-04 22:28:25 +03:00
cedf18ac7a chore: add rss generation to samples 2024-09-04 22:28:25 +03:00
164b6d7956 feat: add event log and update chart to package info modal 2024-09-04 22:28:25 +03:00
27e595cdf4 feat: remove duplicates from the toast 2024-09-04 22:28:25 +03:00
020560d341 refactor: simplify Validator class 2024-09-04 22:28:25 +03:00
cdef67986b feat: allow cross reference in the configuration (#131) 2024-09-04 22:28:25 +03:00
dddcd0bfce feat: implement rss generation (#130) 2024-09-04 22:28:25 +03:00
a0784b7af1 feat: add ability to log sql statements 2024-09-04 22:28:25 +03:00
4c4c9b2bfd feat: serve logs and events from the newest to oldest, but keep the
ordering

So basically the initial implementation, with limit=1, would emit the
oldest record in the series. The new implementation returns the most
recent one instead

The response is still sorted in ascending order
2024-09-04 22:28:25 +03:00
5c34c051cb feat: log package update events 2024-09-04 22:28:25 +03:00
4fa44b0532 refactor: allow event to receive keyword arguments
This change also replaces the dataclass implementation of the class with a
custom one
2024-09-04 22:28:25 +03:00
f167ce7d3b feat: add timer for metrics purposes 2024-09-04 22:28:25 +03:00
950b9e4289 docs: update booleans in docs 2024-09-04 22:28:25 +03:00
264aeb7150 feat: implement audit log tables and methods (#129) 2024-09-04 22:28:25 +03:00
be7169c5df feat: replace scan paths options with a single one
It has been found that the previous system didn't allow configuring
specific cases (e.g. a whitelisted directory inside /usr/lib/cmake). The
current solution replaces the two options with a single one, which also
allows regular expressions

Also the PackageArchive class has been moved to the core package, because it
is more about service rather than model
2024-09-04 22:25:54 +03:00
9c1e9ecbdc Release 2.14.1 2024-09-04 22:01:04 +03:00
4b2f6bbee9 bug: fix removal of the packages
It has been broken since the reporter improvements, because it effectively
1) didn't call remove functions in database
2) used empty repository identifier for web service

With those changes it also raises an exception when you try to call id on
an empty identifier
2024-09-04 21:50:33 +03:00
fd8c8a00d0 chore: small contributing guide update 2024-09-04 21:49:31 +03:00
eaf1984eb3 refactor: fix some IDE warnings 2024-09-04 21:49:31 +03:00
794dddccd9 build: update pytest configuration to suppress deprecation warnings 2024-09-04 21:49:31 +03:00
7bd7f95f76 Release 2.14.0 2024-08-23 14:37:05 +03:00
375374c396 docs: improve waiter classes docs 2024-08-23 14:33:07 +03:00
d1ad5ecc11 feat: add ability to suppress git hints
It can be done by setting options in command. The commit author/email is
also now using this logic
2024-08-23 14:33:07 +03:00
1eb4d8e47f feat: add blacklisted paths to implicit dependencies processing
It has been found that in some cases additional packages have been added
as dependencies, like usr/share/applications, usr/lib/cmake, etc

This commit adds an ability to blacklist specific paths from processing
2024-08-23 14:33:07 +03:00
0861548b56 docs: split faq into multiple files 2024-08-20 16:44:40 +03:00
e9e4172063 feat: add support of pam authentication
Add naive implementation of user password check by calling su command.
Also change some authentication methods to require username to be a string
instead of an optional string
2024-08-20 16:44:40 +03:00
beb6156795 fix: print current and updated version correctly
The issue appears in case the versions are the same (e.g. rebuild); in
this case the printer doesn't increment the version as the builder does.

Also util has been renamed to utils, keeping backward compatibility
2024-08-16 16:24:11 +03:00
dbfb460557 feat: optimize archive reading
Instead of trying to load every database and look for files, this commit
introduces the optimization in which the service loads packages first,
groups them by database and loads files later.

In some cases it significantly decreases the time for loading files
2024-08-14 17:07:10 +03:00
f7f76c4119 fix: explicitly process list of packages
Small workaround to remove debug packages from being processed
2024-08-14 17:07:10 +03:00
88ee300b9e fix: remove trailing slash when loading packages files from a database 2024-08-14 17:07:10 +03:00
6f30c687c2 fix: skip debug packages as well 2024-08-14 17:07:10 +03:00
c023ebe165 docs: update documentation for implicit dependencies resolution 2024-08-14 17:07:10 +03:00
54b99cacfd feat: remove excess dependencies leaves (#128)
This MR improves implicit dependencies processing by reducing tree leaves using the following algorithm:

* remove paths which belong to any base package
* remove packages which are (opt)dependencies of one of the packages which provide the same path. It also tries to handle circular dependencies by excluding them from being "satisfied"
* remove packages which are already satisfied by any child path
2024-08-14 17:07:10 +03:00
4f5166ff25 feat: improve lock mechanisms
* improve lock mechanisms

* use /run/ahriman for socket

* better waiter
2024-08-14 17:07:10 +03:00
c8afcbf36a feat: implement local reporter mode (#126)
* implement local reporter mode

* simplify watcher class

* review changes

* do not update unknown status

* allow empty key patches via api

* fix some pylint warnings in tests
2024-08-14 17:07:10 +03:00
2b9880bd3c feat: allow to use simplified keys for context
The initial implementation requires an explicit context key name to be set.
Though it is still useful sometimes (e.g. if there should be two
variables with the same type), in the most common scenarios internally
only the type is required. This commit extends the set and get methods to
allow constructing ContextKey from a type directly

Also it breaks old keys, since - in order to reduce the amount of possible
mistakes - internal classes use this generation method
2024-08-14 17:07:10 +03:00
3be5cdafe8 feat: add ability to check broken dependencies (#122)
* implement elf dynamic linking check

* load local database too in pacman wrapper
2024-08-14 17:07:10 +03:00
668be41c3e type: drop MiddlewareType in favour of Middleware builtin 2024-08-14 17:07:10 +03:00
3353daec6d type: fix mypy warn for fresh unixsocket release 2024-08-14 17:07:10 +03:00
eef4d2dd98 type: remove another unused mypy directive 2024-08-14 17:07:10 +03:00
b15161554e build: use requests-unixsocket2 fork
Since requests-2.32.0, the http+unix url scheme is broken, check
https://github.com/msabramo/requests-unixsocket/issues/73 for more
details
2024-08-14 17:07:10 +03:00
bb4a0d75fc Release 2.13.8 2024-05-12 11:53:19 +03:00
bca0df41d1 fix: drop integrity check for javascript
It has been added to improve security, however, it changes over time for
no reason ¯\_(ツ)_/¯ I guess either cdn was hacked or fuck js
2024-05-12 11:49:12 +03:00
07b77be6b8 Release 2.13.7 2024-05-09 13:26:40 +03:00
2b33510ada fix: parse array variable from command 2024-05-09 13:21:42 +03:00
6d05389639 Release 2.13.6 2024-05-05 21:59:30 +03:00
daf9841717 fix: update integrity checksums for momentjs and daterangepicker 2024-05-05 21:17:30 +03:00
0d243a781a refactor: update code to the latest python (3.12+) 2024-05-05 21:17:30 +03:00
cf2e66a934 fix: remove debug packages together with normal ones (#124) 2024-05-05 21:17:30 +03:00
f01f35238d Release 2.13.5 2024-04-04 13:33:03 +03:00
d30d512eb6 fix: update Repo.init to the latest pacman release 2024-04-04 13:16:05 +03:00
0437f90e5a build: install base-devel package 2024-04-04 13:16:03 +03:00
3cab65855a fix: lazy web component initialization
In some cases (probably slow internet) in-place initialization can cause
an exception, because elements are not available yet. This commit moves
events initialization to $()
2024-04-04 13:14:17 +03:00
ecfb615f97 feat: add ability to disable debug packages distribution
The feature is implemented as supplying !debug option to makepkg when
generating package list. In this case debug packages still will be
built, however, they will not be added to the repository
2024-04-04 13:14:17 +03:00
243983ee64 docs: update docs 2024-02-10 03:12:09 +02:00
812c03d1eb Release 2.13.4 2024-02-09 17:47:01 +02:00
01597c531b fix: return only built packages from task
Since the last updates makepkg --packagelist also adds debug packages
which causes errors
2024-02-09 17:37:50 +02:00
4fec42eac8 refactor: rename packages http methods to own package
docs: update docs import
2024-01-22 02:20:11 +02:00
332 changed files with 20225 additions and 10783 deletions


@ -10,17 +10,15 @@ echo -e '[arcanisrepo]\nServer = https://repo.arcanis.me/$arch\nSigLevel = Never
# refresh the image
pacman -Syu --noconfirm
# main dependencies
pacman -Sy --noconfirm devtools git pyalpm python-cerberus python-inflection python-passlib python-requests python-srcinfo python-systemd sudo
pacman -Sy --noconfirm devtools git pyalpm python-inflection python-passlib python-pyelftools python-requests python-srcinfo python-systemd sudo
# make dependencies
pacman -Sy --noconfirm python-build python-flit python-installer python-tox python-wheel
pacman -Sy --noconfirm --asdeps base-devel python-build python-flit python-installer python-tox python-wheel
# optional dependencies
if [[ -z $MINIMAL_INSTALL ]]; then
# VCS support
pacman -Sy --noconfirm breezy darcs mercurial subversion
# web server
pacman -Sy --noconfirm python-aioauth-client python-aiohttp python-aiohttp-apispec-git python-aiohttp-cors python-aiohttp-jinja2 python-aiohttp-security python-aiohttp-session python-cryptography python-jinja
# additional features
pacman -Sy --noconfirm gnupg python-boto3 rsync
pacman -Sy --noconfirm gnupg python-boto3 python-cerberus python-matplotlib rsync
fi
# FIXME since 1.0.4 devtools requires dbus to be run, which doesn't work now in container
cp "docker/systemd-nspawn.sh" "/usr/local/bin/systemd-nspawn"
@ -32,25 +30,28 @@ mv dist/ahriman-*.tar.gz package/archlinux
chmod +777 package/archlinux # because fuck you that's why
cd package/archlinux
sudo -u nobody -- makepkg -cf --skipchecksums --noconfirm
sudo -u nobody -- makepkg --packagelist | pacman -U --noconfirm -
sudo -u nobody -- makepkg --packagelist | grep -v -- -debug- | pacman -U --noconfirm -
# create machine-id which is required by build tools
systemd-machine-id-setup
# remove unused dependencies
pacman -Qdtq | pacman -Rscn --noconfirm -
# initial setup command as root
[[ -z $MINIMAL_INSTALL ]] && WEB_ARGS=("--web-port" "8080")
ahriman -a x86_64 -r "github" service-setup --packager "ahriman bot <ahriman@example.com>" "${WEB_ARGS[@]}"
# validate configuration
ahriman service-config-validate --exit-code
# enable services
systemctl enable ahriman-web
systemctl enable ahriman@x86_64-github.timer
if [[ -z $MINIMAL_INSTALL ]]; then
# validate configuration
ahriman service-config-validate --exit-code
# run web service (detached)
sudo -u ahriman -- ahriman web &
WEB_PID=$!
fi
# add the first package
sudo -u ahriman -- ahriman package-add --now ahriman
sudo -u ahriman -- ahriman --log-handler console package-add --now ahriman
# check if package was actually installed
test -n "$(find "/var/lib/ahriman/repository/github/x86_64" -name "ahriman*pkg*")"
# run package check


@ -82,6 +82,7 @@ limit-inference-results=100
# List of plugins (as comma separated values of python module names) to load,
# usually to register additional checkers.
load-plugins=pylint.extensions.docparams,
pylint.extensions.bad_builtin,
definition_order,
import_order,
@ -131,6 +132,8 @@ attr-naming-style=snake_case
# style.
#attr-rgx=
bad-functions=print,
# Bad variable names which should always be refused, separated by a comma.
bad-names=foo,
bar,


@ -3,7 +3,7 @@ version: 2
build:
os: ubuntu-20.04
tools:
python: "3.11"
python: "3.12"
python:
install:
@ -12,6 +12,7 @@ python:
extra_requirements:
- docs
- s3
- validator
- web
formats:


@ -36,6 +36,7 @@ Again, the most checks can be performed by `tox` command, though some additional
Notes:
Very important note about this function
Probably multi-line
Args:
argument(str): an argument. This argument has
@ -70,6 +71,7 @@ Again, the most checks can be performed by `tox` command, though some additional
Attributes:
CLAZZ_ATTRIBUTE(int): (class attribute) a brand-new class attribute
instance_attribute(str): an instance attribute
with the long description
Examples:
Very informative class usage example, e.g.::
@ -92,7 +94,7 @@ Again, the most checks can be performed by `tox` command, though some additional
```
* Type annotations are the must, even for local functions. For the function argument `self` (for instance methods) and `cls` (for class methods) should not be annotated.
* For collection types built-in classes must be used if possible (e.g. `dict` instead of `typing.Dict`, `tuple` instead of `typing.Tuple`). In case if built-in type is not available, but `collections.abc` provides interface, it must be used (e.g. `collections.abc.Awaitable` instead of `typing.Awaitable`, `collections.abc.Iterable` instead of `typing.Iterable`). For union classes, the bar operator (`|`) must be used (e.g. `float | int` instead of `typing.Union[float, int]`), which also includes `typinng.Optional` (e.g. `str | None` instead of `Optional[str]`).
* For collection types built-in classes must be used if possible (e.g. `dict` instead of `typing.Dict`, `tuple` instead of `typing.Tuple`). In case if built-in type is not available, but `collections.abc` provides interface, it must be used (e.g. `collections.abc.Awaitable` instead of `typing.Awaitable`, `collections.abc.Iterable` instead of `typing.Iterable`). For union classes, the bar operator (`|`) must be used (e.g. `float | int` instead of `typing.Union[float, int]`), which also includes `typing.Optional` (e.g. `str | None` instead of `Optional[str]`).
* `classmethod` should (almost) always return `Self`. In case of mypy warning (e.g. if there is a branch in which function doesn't return the instance of `cls`) consider using `staticmethod` instead.
* Recommended order of function definitions in class:
@ -132,7 +134,7 @@ Again, the most checks can be performed by `tox` command, though some additional
* For any path interactions `pathlib.Path` must be used.
* Configuration interactions must go through `ahriman.core.configuration.Configuration` class instance.
* In case if class load requires some actions, it is recommended to create class method which can be used for class instantiating.
* The code must follow the exception safety, unless it is explicitly asked by end user. It means that most exceptions must be handled and printed to log, no other actions must be done (e.g. raising another exception).
* The most (expected) exceptions must be handled and printed to log, allowing service to continue work. However, fatal and (in some cases) unexpected exceptions may lead to the application termination.
* Exceptions without parameters should be raised without parentheses, e.g.:
```python


@ -31,12 +31,43 @@ RUN useradd -m -d "/home/build" -s "/usr/bin/nologin" build && \
echo "build ALL=(ALL) NOPASSWD: ALL" > "/etc/sudoers.d/build"
COPY "docker/install-aur-package.sh" "/usr/local/bin/install-aur-package"
## install package dependencies
## darcs is not installed on purpose, because it requires a lot of haskell packages which dramatically increase image size
RUN pacman -Sy --noconfirm --asdeps devtools git pyalpm python-cerberus python-inflection python-passlib python-requests python-srcinfo && \
pacman -Sy --noconfirm --asdeps python-build python-flit python-installer python-wheel && \
pacman -Sy --noconfirm --asdeps breezy git mercurial python-aiohttp python-boto3 python-cryptography python-jinja python-requests-unixsocket python-systemd rsync subversion && \
runuser -u build -- install-aur-package python-aioauth-client python-aiohttp-apispec-git python-aiohttp-cors \
python-aiohttp-jinja2 python-aiohttp-session python-aiohttp-security
RUN pacman -Sy --noconfirm --asdeps \
devtools \
git \
pyalpm \
python-inflection \
python-passlib \
python-pyelftools \
python-requests \
python-srcinfo \
&& \
pacman -Sy --noconfirm --asdeps \
base-devel \
python-build \
python-flit \
python-installer \
python-wheel \
&& \
pacman -Sy --noconfirm --asdeps \
git \
python-aiohttp \
python-boto3 \
python-cerberus \
python-cryptography \
python-jinja \
python-matplotlib \
python-systemd \
rsync \
&& \
runuser -u build -- install-aur-package \
python-aioauth-client \
python-webargs \
python-aiohttp-apispec-git \
python-aiohttp-cors \
python-aiohttp-jinja2 \
python-aiohttp-session \
python-aiohttp-security \
python-requests-unixsocket2
## FIXME since 1.0.4 devtools requires dbus to be run, which doesn't work now in container
COPY "docker/systemd-nspawn.sh" "/usr/local/bin/systemd-nspawn"


@ -2,7 +2,7 @@
[![tests status](https://github.com/arcan1s/ahriman/actions/workflows/tests.yml/badge.svg)](https://github.com/arcan1s/ahriman/actions/workflows/tests.yml)
[![setup status](https://github.com/arcan1s/ahriman/actions/workflows/setup.yml/badge.svg)](https://github.com/arcan1s/ahriman/actions/workflows/setup.yml)
[![Docker Image Version (latest semver)](https://img.shields.io/docker/v/arcan1s/ahriman?label=Docker%20image)](https://hub.docker.com/r/arcan1s/ahriman)
[![Docker Image Version (latest semver)](https://img.shields.io/docker/v/arcan1s/ahriman?label=Docker%20image&sort=semver)](https://hub.docker.com/r/arcan1s/ahriman)
[![CodeFactor](https://www.codefactor.io/repository/github/arcan1s/ahriman/badge)](https://www.codefactor.io/repository/github/arcan1s/ahriman)
[![Documentation Status](https://readthedocs.org/projects/ahriman/badge/?version=latest)](https://ahriman.readthedocs.io)
@ -33,10 +33,12 @@ Every available option is described in the [documentation](https://ahriman.readt
The application provides reasonable defaults which allow to use it out-of-box; however additional steps (like configuring build toolchain and sudoers) are recommended and can be easily achieved by following install instructions.
## [FAQ](https://ahriman.readthedocs.io/en/stable/faq.html)
## [FAQ](https://ahriman.readthedocs.io/en/stable/faq/index.html)
## Live demos
* [Build status page](https://ahriman-demo.arcanis.me). You can log in as `demo` user by using `demo` password. However, you will not be able to run tasks. [HTTP API documentation](https://ahriman-demo.arcanis.me/api-docs) is also available.
* [Repository index](https://repo.arcanis.me/arcanisrepo/x86_64/).
* [Telegram feed](https://t.me/arcanisrepo).
Do you have any success story? You can [share it](https://github.com/arcan1s/ahriman/issues/new?template=04-discussion.md)!


@ -8,9 +8,6 @@ cat <<EOF > "/etc/ahriman.ini.d/00-docker.ini"
[repository]
root = $AHRIMAN_REPOSITORY_ROOT
[settings]
database = $AHRIMAN_REPOSITORY_ROOT/ahriman.db
[web]
host = $AHRIMAN_HOST


@ -6,7 +6,7 @@ for PACKAGE in "$@"; do
BUILD_DIR="$(mktemp -d)"
git clone https://aur.archlinux.org/"$PACKAGE".git "$BUILD_DIR"
cd "$BUILD_DIR"
makepkg --noconfirm --install --rmdeps --syncdeps
makepkg --nocheck --noconfirm --install --rmdeps --syncdeps
cd /
rm -r "$BUILD_DIR"
done

File diff suppressed because it is too large.

Binary image changed (1.2 MiB before, 1.2 MiB after).


@ -172,6 +172,14 @@ ahriman.application.handlers.sign module
:no-undoc-members:
:show-inheritance:
ahriman.application.handlers.statistics module
----------------------------------------------
.. automodule:: ahriman.application.handlers.statistics
:members:
:no-undoc-members:
:show-inheritance:
ahriman.application.handlers.status module
------------------------------------------


@ -20,6 +20,14 @@ ahriman.core.alpm.pacman module
:no-undoc-members:
:show-inheritance:
ahriman.core.alpm.pacman\_database module
-----------------------------------------
.. automodule:: ahriman.core.alpm.pacman_database
:members:
:no-undoc-members:
:show-inheritance:
ahriman.core.alpm.repo module
-----------------------------


@ -36,6 +36,14 @@ ahriman.core.auth.oauth module
:no-undoc-members:
:show-inheritance:
ahriman.core.auth.pam module
----------------------------
.. automodule:: ahriman.core.auth.pam
:members:
:no-undoc-members:
:show-inheritance:
Module contents
---------------


@ -4,6 +4,14 @@ ahriman.core.build\_tools package
Submodules
----------
ahriman.core.build\_tools.package\_archive module
-------------------------------------------------
.. automodule:: ahriman.core.build_tools.package_archive
:members:
:no-undoc-members:
:show-inheritance:
ahriman.core.build\_tools.sources module
----------------------------------------


@ -108,6 +108,22 @@ ahriman.core.database.migrations.m012\_last\_commit\_sha module
:no-undoc-members:
:show-inheritance:
ahriman.core.database.migrations.m013\_dependencies module
----------------------------------------------------------
.. automodule:: ahriman.core.database.migrations.m013_dependencies
:members:
:no-undoc-members:
:show-inheritance:
ahriman.core.database.migrations.m014\_auditlog module
------------------------------------------------------
.. automodule:: ahriman.core.database.migrations.m014_auditlog
:members:
:no-undoc-members:
:show-inheritance:
Module contents
---------------


@ -28,6 +28,22 @@ ahriman.core.database.operations.changes\_operations module
:no-undoc-members:
:show-inheritance:
ahriman.core.database.operations.dependencies\_operations module
----------------------------------------------------------------
.. automodule:: ahriman.core.database.operations.dependencies_operations
:members:
:no-undoc-members:
:show-inheritance:
ahriman.core.database.operations.event\_operations module
---------------------------------------------------------
.. automodule:: ahriman.core.database.operations.event_operations
:members:
:no-undoc-members:
:show-inheritance:
ahriman.core.database.operations.logs\_operations module
--------------------------------------------------------


@ -44,6 +44,14 @@ ahriman.core.formatters.configuration\_printer module
:no-undoc-members:
:show-inheritance:
ahriman.core.formatters.event\_stats\_printer module
----------------------------------------------------
.. automodule:: ahriman.core.formatters.event_stats_printer
:members:
:no-undoc-members:
:show-inheritance:
ahriman.core.formatters.package\_printer module
-----------------------------------------------
@ -52,6 +60,14 @@ ahriman.core.formatters.package\_printer module
:no-undoc-members:
:show-inheritance:
ahriman.core.formatters.package\_stats\_printer module
------------------------------------------------------
.. automodule:: ahriman.core.formatters.package_stats_printer
:members:
:no-undoc-members:
:show-inheritance:
ahriman.core.formatters.patch\_printer module
---------------------------------------------


@ -60,6 +60,14 @@ ahriman.core.report.report\_trigger module
:no-undoc-members:
:show-inheritance:
ahriman.core.report.rss module
------------------------------
.. automodule:: ahriman.core.report.rss
:members:
:no-undoc-members:
:show-inheritance:
ahriman.core.report.telegram module
-----------------------------------


@ -12,6 +12,14 @@ ahriman.core.repository.cleaner module
:no-undoc-members:
:show-inheritance:
ahriman.core.repository.event\_logger module
--------------------------------------------
.. automodule:: ahriman.core.repository.event_logger
:members:
:no-undoc-members:
:show-inheritance:
ahriman.core.repository.executor module
---------------------------------------


@ -60,6 +60,14 @@ ahriman.core.util module
:no-undoc-members:
:show-inheritance:
ahriman.core.utils module
-------------------------
.. automodule:: ahriman.core.utils
:members:
:no-undoc-members:
:show-inheritance:
Module contents
---------------


@ -12,6 +12,14 @@ ahriman.core.status.client module
:no-undoc-members:
:show-inheritance:
ahriman.core.status.local\_client module
----------------------------------------
.. automodule:: ahriman.core.status.local_client
:members:
:no-undoc-members:
:show-inheritance:
ahriman.core.status.watcher module
----------------------------------


@ -60,6 +60,30 @@ ahriman.models.counters module
:no-undoc-members:
:show-inheritance:
ahriman.models.dependencies module
----------------------------------
.. automodule:: ahriman.models.dependencies
:members:
:no-undoc-members:
:show-inheritance:
ahriman.models.event module
---------------------------
.. automodule:: ahriman.models.event
:members:
:no-undoc-members:
:show-inheritance:
ahriman.models.filesystem\_package module
-----------------------------------------
.. automodule:: ahriman.models.filesystem_package
:members:
:no-undoc-members:
:show-inheritance:
ahriman.models.internal\_status module
--------------------------------------
@ -84,6 +108,14 @@ ahriman.models.log\_record\_id module
:no-undoc-members:
:show-inheritance:
ahriman.models.metrics\_timer module
------------------------------------
.. automodule:: ahriman.models.metrics_timer
:members:
:no-undoc-members:
:show-inheritance:
ahriman.models.migration module
-------------------------------
@ -204,6 +236,14 @@ ahriman.models.result module
:no-undoc-members:
:show-inheritance:
ahriman.models.scan\_paths module
---------------------------------
.. automodule:: ahriman.models.scan_paths
:members:
:no-undoc-members:
:show-inheritance:
ahriman.models.sign\_settings module
------------------------------------


@ -44,6 +44,14 @@ ahriman.web.schemas.counters\_schema module
:no-undoc-members:
:show-inheritance:
ahriman.web.schemas.dependencies\_schema module
-----------------------------------------------
.. automodule:: ahriman.web.schemas.dependencies_schema
:members:
:no-undoc-members:
:show-inheritance:
ahriman.web.schemas.error\_schema module
----------------------------------------
@ -52,6 +60,22 @@ ahriman.web.schemas.error\_schema module
:no-undoc-members:
:show-inheritance:
ahriman.web.schemas.event\_schema module
----------------------------------------
.. automodule:: ahriman.web.schemas.event_schema
:members:
:no-undoc-members:
:show-inheritance:
ahriman.web.schemas.event\_search\_schema module
------------------------------------------------
.. automodule:: ahriman.web.schemas.event_search_schema
:members:
:no-undoc-members:
:show-inheritance:
ahriman.web.schemas.file\_schema module
---------------------------------------
@ -156,6 +180,14 @@ ahriman.web.schemas.package\_status\_schema module
:no-undoc-members:
:show-inheritance:
ahriman.web.schemas.package\_version\_schema module
---------------------------------------------------
.. automodule:: ahriman.web.schemas.package_version_schema
:members:
:no-undoc-members:
:show-inheritance:
ahriman.web.schemas.pagination\_schema module
---------------------------------------------


@ -0,0 +1,21 @@
ahriman.web.views.v1.auditlog package
=====================================
Submodules
----------
ahriman.web.views.v1.auditlog.events module
-------------------------------------------
.. automodule:: ahriman.web.views.v1.auditlog.events
:members:
:no-undoc-members:
:show-inheritance:
Module contents
---------------
.. automodule:: ahriman.web.views.v1.auditlog
:members:
:no-undoc-members:
:show-inheritance:


@ -0,0 +1,69 @@
ahriman.web.views.v1.packages package
=====================================
Submodules
----------
ahriman.web.views.v1.packages.changes module
--------------------------------------------
.. automodule:: ahriman.web.views.v1.packages.changes
:members:
:no-undoc-members:
:show-inheritance:
ahriman.web.views.v1.packages.dependencies module
-------------------------------------------------
.. automodule:: ahriman.web.views.v1.packages.dependencies
:members:
:no-undoc-members:
:show-inheritance:
ahriman.web.views.v1.packages.logs module
-----------------------------------------
.. automodule:: ahriman.web.views.v1.packages.logs
:members:
:no-undoc-members:
:show-inheritance:
ahriman.web.views.v1.packages.package module
--------------------------------------------
.. automodule:: ahriman.web.views.v1.packages.package
:members:
:no-undoc-members:
:show-inheritance:
ahriman.web.views.v1.packages.packages module
---------------------------------------------
.. automodule:: ahriman.web.views.v1.packages.packages
:members:
:no-undoc-members:
:show-inheritance:
ahriman.web.views.v1.packages.patch module
------------------------------------------
.. automodule:: ahriman.web.views.v1.packages.patch
:members:
:no-undoc-members:
:show-inheritance:
ahriman.web.views.v1.packages.patches module
--------------------------------------------
.. automodule:: ahriman.web.views.v1.packages.patches
:members:
:no-undoc-members:
:show-inheritance:
Module contents
---------------
.. automodule:: ahriman.web.views.v1.packages
:members:
:no-undoc-members:
:show-inheritance:


@ -7,7 +7,9 @@ Subpackages
.. toctree::
:maxdepth: 4
ahriman.web.views.v1.auditlog
ahriman.web.views.v1.distributed
ahriman.web.views.v1.packages
ahriman.web.views.v1.service
ahriman.web.views.v1.status
ahriman.web.views.v1.user


@ -4,14 +4,6 @@ ahriman.web.views.v1.status package
Submodules
----------
ahriman.web.views.v1.status.changes module
------------------------------------------
.. automodule:: ahriman.web.views.v1.status.changes
:members:
:no-undoc-members:
:show-inheritance:
ahriman.web.views.v1.status.info module
---------------------------------------
@ -20,46 +12,6 @@ ahriman.web.views.v1.status.info module
:no-undoc-members:
:show-inheritance:
ahriman.web.views.v1.status.logs module
---------------------------------------
.. automodule:: ahriman.web.views.v1.status.logs
:members:
:no-undoc-members:
:show-inheritance:
ahriman.web.views.v1.status.package module
------------------------------------------
.. automodule:: ahriman.web.views.v1.status.package
:members:
:no-undoc-members:
:show-inheritance:
ahriman.web.views.v1.status.packages module
-------------------------------------------
.. automodule:: ahriman.web.views.v1.status.packages
:members:
:no-undoc-members:
:show-inheritance:
ahriman.web.views.v1.status.patch module
----------------------------------------
.. automodule:: ahriman.web.views.v1.status.patch
:members:
:no-undoc-members:
:show-inheritance:
ahriman.web.views.v1.status.patches module
------------------------------------------
.. automodule:: ahriman.web.views.v1.status.patches
:members:
:no-undoc-members:
:show-inheritance:
ahriman.web.views.v1.status.repositories module
-----------------------------------------------


@ -0,0 +1,21 @@
ahriman.web.views.v2.packages package
=====================================
Submodules
----------
ahriman.web.views.v2.packages.logs module
-----------------------------------------
.. automodule:: ahriman.web.views.v2.packages.logs
:members:
:no-undoc-members:
:show-inheritance:
Module contents
---------------
.. automodule:: ahriman.web.views.v2.packages
:members:
:no-undoc-members:
:show-inheritance:


@ -7,7 +7,7 @@ Subpackages
.. toctree::
:maxdepth: 4
ahriman.web.views.v2.status
ahriman.web.views.v2.packages
Module contents
---------------


@ -1,21 +0,0 @@
ahriman.web.views.v2.status package
===================================
Submodules
----------
ahriman.web.views.v2.status.logs module
---------------------------------------
.. automodule:: ahriman.web.views.v2.status.logs
:members:
:no-undoc-members:
:show-inheritance:
Module contents
---------------
.. automodule:: ahriman.web.views.v2.status
:members:
:no-undoc-members:
:show-inheritance:


@ -192,6 +192,7 @@ Idea is to add package to a build queue from which it will be handled automatica
* If supplied argument is file, then application moves the file to the directory with built packages. Same rule applies for directory, but in this case it copies every package-like file from the specified directory.
* If supplied argument is directory and there is ``PKGBUILD`` file there, it will be treated as local package. In this case it will queue this package to build and copy source files (``PKGBUILD`` and ``.SRCINFO``) to caches.
* If supplied argument looks like URL (i.e. it has scheme - e.g. ``http://`` which is neither ``data`` nor ``file``), it tries to download the package from the specified remote source.
* If supplied argument is not a file, then the application tries to look up the specified name in AUR and clones it into the directory with manual updates. This scenario can also handle package dependencies which are missing in repositories.
This logic can be overridden by specifying the ``source`` parameter, which is partially useful if you would like to add a package from AUR, but there is a local directory cloned from AUR. Also, official repository calls are hidden behind an explicit source definition.
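For illustration only, the resolution order above could be sketched as follows; the helper name and the returned labels are hypothetical, not the service API:

.. code-block:: python

   from pathlib import Path
   from urllib.parse import urlparse


   def resolve_source(argument: str) -> str:
       """guess the package source using the resolution order described above"""
       path = Path(argument)
       if path.is_file():
           return "archive"  # built package file, moved to the packages directory
       if path.is_dir() and (path / "PKGBUILD").is_file():
           return "local"  # local package, queued for build
       if path.is_dir():
           return "directory"  # copy every package-like file from it
       if urlparse(argument).scheme not in ("", "data", "file"):
           return "remote"  # download from the remote source
       return "aur"  # fall back to AUR lookup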
@ -206,10 +207,20 @@ Remove packages
This flow removes package from filesystem, updates repository database and also runs synchronization and reporting methods.
Check outdated packages
^^^^^^^^^^^^^^^^^^^^^^^
There are a few ways for packages to be marked as out-of-date and hence require a rebuild. They are the following:
#. User requested an update of the package. It can be caused by calling the ``package-add`` subcommand (or ``package-update`` with arguments).
#. The most common way for packages to be marked as out-of-date is that the version in AUR (or in the official repositories) is newer than in the repository.
#. In addition to the above, if a package is named as VCS (e.g. has the suffix ``-git``) and the last update was more than the specified threshold ago, the service will also try to fetch sources and check if the revision is newer than the built one (see the sketch after this list).
#. In addition, there is an ability to check if the dependencies of the package have been updated (e.g. if a linked library has been renamed or the modules directory - e.g. in case of python and ruby packages - has been changed). If so, the package will be marked as out-of-date as well.
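A rough sketch of rule 3 above (the helper and the suffix list are illustrative only; the real check also fetches sources and compares revisions):

.. code-block:: python

   import time

   VCS_SUFFIXES = ("-git", "-svn", "-hg", "-bzr")  # illustrative suffix list


   def vcs_check_due(package_base: str, last_update: float, vcs_allowed_age: float) -> bool:
       """True if the package looks like a VCS one and the threshold has passed"""
       if not package_base.endswith(VCS_SUFFIXES):
           return False
       return time.time() - last_update > vcs_allowed_age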
Update packages
^^^^^^^^^^^^^^^
This feature is divided into to the following stages: check AUR for updates and run rebuild for required packages. Whereas check does not do anything except for check itself, update flow is the following:
This feature is divided into the following stages: check AUR for updates and run rebuild for required packages. The package update flow is the following:
#. Process every built package first. Those packages are usually added manually.
#. Run sync and report methods.
@ -259,6 +270,24 @@ The application is able to automatically bump package release (``pkgrel``) durin
#. If it has ``major.minor`` notation (e.g. ``1.1``), then increment last part by 1, e.g. ``1.1 -> 1.2``, ``1.0.1 -> 1.0.2``.
#. If ``pkgrel`` is a number (e.g. ``1``), then append 1 to the end of the string, e.g. ``1 -> 1.1``.
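A minimal sketch of these bump rules (a hypothetical helper, not the project code; it ignores the surrounding version comparison logic):

.. code-block:: python

   def bump_pkgrel(pkgrel: str) -> str:
       """bump package release according to the rules described above"""
       if "." in pkgrel:
           # dotted notation: increment the last component
           prefix, _, minor = pkgrel.rpartition(".")
           return f"{prefix}.{int(minor) + 1}"
       # plain number: append ".1"
       return f"{pkgrel}.1"


   assert bump_pkgrel("1.1") == "1.2"
   assert bump_pkgrel("1.0.1") == "1.0.2"
   assert bump_pkgrel("1") == "1.1"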
Implicit dependencies resolution
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
In addition to the depends/optional/make/check depends lists, the server also handles implicit dependencies. After a successful build, the application traverses the build tree and finds
* Libraries to which the binaries (ELF-files) are linked. To do so, the ``NEEDED`` section of the ELF-files is read.
* Directories which contain files of the package, but do not belong to this package. This case covers, for example, python and ruby submodules.
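The ``NEEDED`` extraction can be illustrated with ``pyelftools`` (which the service depends on); the helper below is a simplified sketch, not the actual implementation:

.. code-block:: python

   from pathlib import Path

   from elftools.elf.elffile import ELFFile


   def needed_libraries(path: Path) -> list[str]:
       """read DT_NEEDED entries (linked shared libraries) from an ELF file"""
       with path.open("rb") as binary:
           elf = ELFFile(binary)  # raises ELFError for non-ELF files
           dynamic = elf.get_section_by_name(".dynamic")
           if dynamic is None:  # not dynamically linked
               return []
           return [tag.needed for tag in dynamic.iter_tags() if tag.entry.d_tag == "DT_NEEDED"]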
Having the initial dependencies tree, the application looks for packages which contain those paths (both files and directories) and creates the initial packages list. After that, the packages list is reduced in the following way:
* From any leaf exclude the package itself and possible debug packages.
* If the entry (i.e. file or directory) belongs to a package which is in the base group, it will be removed.
* If there is a package which depends on another package which provides the same entry, the package will be removed.
* After that, if there is a package which *optionally* depends on another package in the remaining list, the package will be removed.
* And finally, if there is any path which is a child of the entry and it contains the same package, the package from the smaller entry will be removed.
All those implicit dependencies are stored in the database and extracted on each check. In case any of the repository packages doesn't contain the entry anymore (e.g. the shared object version has changed or the modules directory has changed), the dependent package will be marked as out-of-date.
Core functions reference
------------------------
@ -366,7 +395,7 @@ Web application requires the following python packages to be installed:
* Additional web features also require ``aiohttp-apispec`` (autogenerated documentation), ``aiohttp_cors`` (CORS support, required by documentation).
* In addition, authorization feature requires ``aiohttp_security``, ``aiohttp_session`` and ``cryptography``.
* In addition to base authorization dependencies, OAuth2 also requires ``aioauth-client`` library.
* In addition if you would like to disable authorization for local access (recommended way in order to run the application itself with reporting support), the ``requests-unixsocket`` library is required.
* In addition if you would like to disable authorization for local access (recommended way in order to run the application itself with reporting support), the ``requests-unixsocket2`` library is required.
Middlewares
^^^^^^^^^^^


@ -17,14 +17,33 @@ There are two variable types which have been added to default ones, they are pat
Path values, except for casting to ``pathlib.Path`` type, will be also expanded to absolute paths relative to the configuration path. E.g. if path is set to ``ahriman.ini.d/logging.ini`` and root configuration path is ``/etc/ahriman.ini``, the value will be expanded to ``/etc/ahriman.ini.d/logging.ini``. In order to disable path expand, use the full path, e.g. ``/etc/ahriman.ini.d/logging.ini``.
Configuration allows string interpolation from environment variables, e.g.:
Configuration allows string interpolation from the same configuration file, e.g.:
.. code-block:: ini
[section]
key = ${another_key}
another_key = value
will read value for the ``section.key`` option from ``section.another_key``. In case if the cross-section reference is required, the ``${section:another_key}`` notation must be used. It also allows string interpolation from environment variables, e.g.:
.. code-block:: ini
[section]
key = $SECRET
will try to read value from ``SECRET`` environment variable. In case if the required environment variable wasn't found, it will keep original value (i.e. ``$SECRET`` in the example). Dollar sign can be set as ``$$``.
will try to read value from ``SECRET`` environment variable. In case if the required environment variable wasn't found, it will keep original value (i.e. ``$SECRET`` in the example). Dollar sign can be set as ``$$``. All those interpolations are applied in succession and, as expected, recursively, e.g. the following configuration:
.. code-block:: ini
[section1]
key = ${section2:key}
[section2]
key = ${home}
home = $HOME
will eventually lead to the ``section1.key`` option being set to the value of the ``HOME`` environment variable (if available).
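A standalone sketch of how such layered interpolation could be resolved (a hypothetical helper, not the service implementation):

.. code-block:: python

   import os
   import re


   def resolve(sections: dict[str, dict[str, str]], section: str, key: str) -> str:
       """recursively resolve ${key}, ${section:key} and $ENV references"""
       value = sections[section][key]

       def substitute(match: re.Match) -> str:
           reference = match.group(1)
           if ":" in reference:  # cross-section reference
               other_section, other_key = reference.split(":", maxsplit=1)
           else:  # same-section reference
               other_section, other_key = section, reference
           return resolve(sections, other_section, other_key)

       value = re.sub(r"\$\{([^}]+)\}", substitute, value)
       return os.path.expandvars(value)  # unknown environment variables are kept as-is


   sections = {
       "section1": {"key": "${section2:key}"},
       "section2": {"key": "${home}", "home": "$HOME"},
   }
   print(resolve(sections, "section1", "key"))  # value of $HOME, if set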
There is also an additional subcommand which allows validating the configuration and printing found errors. In order to do so, run the ``service-config-validate`` subcommand, e.g.:
@ -53,6 +72,7 @@ libalpm and AUR related configuration. Group name can refer to architecture, e.g
* ``mirror`` - package database mirror used by pacman for synchronization, string, required. This option supports standard pacman substitutions with ``$arch`` and ``$repo``. Note that the mentioned mirror should contain all repositories which are set by ``alpm.repositories`` option.
* ``repositories`` - list of pacman repositories, used for package search, space separated list of strings, required.
* ``root`` - root for alpm library, string, required. In the most cases it must point to the system root.
* ``sync_files_database`` - download files database from mirror, boolean, required.
* ``use_ahriman_cache`` - use local pacman package cache instead of system one, boolean, required. With this option enabled you might want to refresh database periodically (available as additional flag for some subcommands). If set to ``no``, databases must be synchronized manually.
``auth`` group
@ -60,15 +80,17 @@ libalpm and AUR related configuration. Group name can refer to architecture, e.g
Base authorization settings. ``OAuth`` provider requires ``aioauth-client`` library to be installed.
* ``target`` - specifies authorization provider, string, optional, default ``disabled``. Allowed values are ``disabled``, ``configuration``, ``oauth``.
* ``target`` - specifies authorization provider, string, optional, default ``disabled``. Allowed values are ``disabled``, ``configuration``, ``oauth``, ``pam``.
* ``allow_read_only`` - allow requesting status APIs without authorization, boolean, required.
* ``client_id`` - OAuth2 application client ID, string, required in case if ``oauth`` is used.
* ``client_secret`` - OAuth2 application client secret key, string, required in case if ``oauth`` is used.
* ``cookie_secret_key`` - secret key which will be used for cookies encryption, string, optional. It must be 32 bytes URL-safe base64-encoded and can be generated as following ``base64.urlsafe_b64encode(os.urandom(32)).decode("utf8")``. If not set, it will be generated automatically; note, however, that in this case, all sessions will be automatically invalidated during the service restart.
* ``full_access_group`` - name of the secondary group (e.g. ``wheel``) to be used as admin group in the service, string, required in case if ``pam`` is used.
* ``max_age`` - parameter which controls both cookie expiration and token expiration inside the service in seconds, integer, optional, default is 7 days.
* ``oauth_icon`` - OAuth2 login button icon, string, optional, default is ``google``. Must be valid `Bootstrap icon <https://icons.getbootstrap.com/>`__ name.
* ``oauth_provider`` - OAuth2 provider class name as is in ``aioauth-client`` (e.g. ``GoogleClient``, ``GithubClient`` etc), string, required in case if ``oauth`` is used.
* ``oauth_scopes`` - scopes list for OAuth2 provider, which will allow retrieving user email (which is used for checking user permissions), e.g. ``https://www.googleapis.com/auth/userinfo.email`` for ``GoogleClient`` or ``user:email`` for ``GithubClient``, space separated list of strings, required in case if ``oauth`` is used.
* ``permit_root_login`` - allow login as root user, boolean, optional, default ``no``.
* ``salt`` - additional password hash salt, string, optional.
Authorized users are stored inside internal database, if any of external providers (e.g. ``oauth``) are used, the password field for non-service users must be empty.
@ -81,8 +103,10 @@ Build related configuration. Group name can refer to architecture, e.g. ``build:
* ``archbuild_flags`` - additional flags passed to ``archbuild`` command, space separated list of strings, optional.
* ``build_command`` - default build command, string, required.
* ``ignore_packages`` - list packages to ignore during a regular update (manual update will still work), space separated list of strings, optional.
* ``include_debug_packages`` - distribute debug packages, boolean, optional, default ``yes``.
* ``makepkg_flags`` - additional flags passed to ``makepkg`` command, space separated list of strings, optional.
* ``makechrootpkg_flags`` - additional flags passed to ``makechrootpkg`` command, space separated list of strings, optional.
* ``scan_paths`` - paths to be used for implicit dependencies scan, space separated list of strings, optional. If any of those paths is matched against the path, it will be added to the allowed list.
* ``triggers`` - list of ``ahriman.core.triggers.Trigger`` class implementation (e.g. ``ahriman.core.report.ReportTrigger ahriman.core.upload.UploadTrigger``) which will be loaded and run at the end of processing, space separated list of strings, optional. You can also specify triggers by their paths, e.g. ``/usr/lib/python3.10/site-packages/ahriman/core/report/report.py.ReportTrigger``. Triggers are run in the order of definition.
* ``triggers_known`` - optional list of ``ahriman.core.triggers.Trigger`` class implementations which are not run automatically and used only for trigger discovery and configuration validation.
* ``vcs_allowed_age`` - maximal age in seconds of the VCS packages before their version will be updated with its remote source, integer, optional, default is 7 days.
@ -128,7 +152,7 @@ Web server settings. This feature requires ``aiohttp`` libraries to be installed
* ``port`` - port to bind, integer, optional.
* ``service_only`` - disable status routes (including logs), boolean, optional, default ``no``.
* ``static_path`` - path to directory with static files, string, required.
* ``templates`` - path to templates directories, space separated list of strings, required.
* ``templates`` - path to templates directories, space separated list of paths, required.
* ``unix_socket`` - path to the listening unix socket, string, optional. If set, server will create the socket on the specified address which can (and will) be used by application. Note, that unlike usual host/port configuration, unix socket allows to perform requests without authorization.
* ``unix_socket_unsafe`` - set unsafe (o+w) permissions to unix socket, boolean, optional, default ``yes``. This option is enabled by default, because it is supposed that unix socket is created in safe environment (only web service is supposed to be used in unsafe), but it can be disabled by configuration.
* ``wait_timeout`` - wait timeout in seconds, maximum amount of time to be waited before lock will be free, integer, optional.
@ -246,11 +270,12 @@ Section name must be either ``email`` (plus optional architecture name, e.g. ``e
* ``password`` - SMTP password to authenticate, string, optional.
* ``port`` - SMTP port for sending emails, integer, required.
* ``receivers`` - SMTP receiver addresses, space separated list of strings, required.
* ``rss_url`` - link to RSS feed, string, optional.
* ``sender`` - SMTP sender address, string, required.
* ``ssl`` - SSL mode for SMTP connection, one of ``ssl``, ``starttls``, ``disabled``, optional, default ``disabled``.
* ``template`` - Jinja2 template name, string, required.
* ``template_full`` - Jinja2 template name for full package description index, string, optional.
* ``templates`` - path to templates directories, space separated list of strings, required.
* ``templates`` - path to templates directories, space separated list of paths, required.
* ``user`` - SMTP user to authenticate, string, optional.
``html`` type
@ -261,9 +286,10 @@ Section name must be either ``html`` (plus optional architecture name, e.g. ``ht
* ``type`` - type of the report, string, optional, must be set to ``html`` if exists.
* ``homepage`` - link to homepage, string, optional.
* ``link_path`` - prefix for HTML links, string, required.
* ``path`` - path to html report file, string, required.
* ``path`` - path to HTML report file, string, required.
* ``rss_url`` - link to RSS feed, string, optional.
* ``template`` - Jinja2 template name, string, required.
* ``templates`` - path to templates directories, space separated list of strings, required.
* ``templates`` - path to templates directories, space separated list of paths, required.
``remote-call`` type
^^^^^^^^^^^^^^^^^^^^
@ -276,6 +302,20 @@ Section name must be either ``remote-call`` (plus optional architecture name, e.
* ``manual`` - update manually built packages, boolean, optional, default ``no``.
* ``wait_timeout`` - maximum amount of time in seconds to be waited before remote process will be terminated, integer, optional, default ``-1``.
``rss`` type
^^^^^^^^^^^^
Section name must be either ``rss`` (plus optional architecture name, e.g. ``rss:x86_64``) or random name with ``type`` set.
* ``type`` - type of the report, string, optional, must be set to ``rss`` if exists.
* ``homepage`` - link to homepage, string, optional.
* ``link_path`` - prefix for HTML links, string, required.
* ``max_entries`` - maximal amount of entries to be included to the report, negative means no limit, integer, optional, default ``-1``.
* ``path`` - path to generated RSS file, string, required.
* ``rss_url`` - link to RSS feed, string, optional.
* ``template`` - Jinja2 template name, string, required.
* ``templates`` - path to templates directories, space separated list of paths, required.
``telegram`` type
^^^^^^^^^^^^^^^^^
@ -286,9 +326,10 @@ Section name must be either ``telegram`` (plus optional architecture name, e.g.
* ``chat_id`` - telegram chat id, either string with ``@`` or integer value, required.
* ``homepage`` - link to homepage, string, optional.
* ``link_path`` - prefix for HTML links, string, required.
* ``rss_url`` - link to RSS feed, string, optional.
* ``template`` - Jinja2 template name, string, required.
* ``template_type`` - ``parse_mode`` to be passed to telegram API, one of ``MarkdownV2``, ``HTML``, ``Markdown``, string, optional, default ``HTML``.
* ``templates`` - path to templates directories, space separated list of strings, required.
* ``templates`` - path to templates directories, space separated list of paths, required.
* ``timeout`` - HTTP request timeout in seconds, integer, optional, default is ``30``.
``upload`` group

File diff suppressed because it is too large.

docs/faq/backup.rst (new file, 35 lines)

@ -0,0 +1,35 @@
Backup and restore
------------------
The service provides several commands aimed at easy repository backup and restore. If you would like to move a repository from the server ``server1.example.com`` to another server ``server2.example.com``, you have to perform the following steps:
#.
On the source server ``server1.example.com`` run ``repo-backup`` command, e.g.:
.. code-block:: shell
ahriman repo-backup /tmp/repo.tar.gz
This command will pack all configuration files together with database file into the archive specified as command line argument (i.e. ``/tmp/repo.tar.gz``). In addition it will also archive ``cache`` directory (the one which contains local clones used by e.g. local packages) and ``.gnupg`` of the ``ahriman`` user.
#.
Copy created archive from source server ``server1.example.com`` to target ``server2.example.com``.
#.
Install package as usual on the target server ``server2.example.com`` if you didn't yet.
#.
Extract archive e.g. by using subcommand:
.. code-block:: shell
ahriman repo-restore /tmp/repo.tar.gz
An additional argument ``-o``/``--output`` can be used to specify extraction root (``/`` by default).
#.
Rebuild repository:
.. code-block:: shell
sudo -u ahriman ahriman repo-rebuild --from-database

docs/faq/distributed.rst (new file, 320 lines)

@ -0,0 +1,320 @@
Distributed builds
------------------
The service allows running builds on multiple machines and collecting packages on the main node. There are several ways to achieve it; this section describes the officially supported methods.
Remote synchronization and remote server call
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
This setup requires at least two instances of the service:
#. Web service (with opt-in authorization enabled), later will be referenced as ``master`` node.
#. Application instances responsible for build, later will be referenced as ``worker`` nodes.
In this example the following settings are assumed:
* Repository architecture is ``x86_64``.
* Master node address is ``master.example.com``.
Master node configuration
"""""""""""""""""""""""""
The only requirements for the master node are that the API must be available for worker nodes to call (e.g. the port must be exposed to the internet, or to the local network in case of VPN, etc.) and that file upload must be enabled:
.. code-block:: ini
[web]
enable_archive_upload = yes
In addition, the following settings are recommended for the master node:
*
As it has been mentioned above, it is recommended to enable authentication (see :doc:`How to enable basic authorization <web>`) and create system user which will be used later. Later this user (if any) will be referenced as ``worker-user``.
*
In order to be able to spawn multiple processes at the same time, wait timeout must be configured:
.. code-block:: ini
[web]
wait_timeout = 0
Worker nodes configuration
""""""""""""""""""""""""""
#.
First of all, in this setup you need to split your repository into chunks manually, e.g. if you have repository on master node with packages ``A``, ``B`` and ``C``, you need to split them between all available workers, as example:
* Worker #1: ``A``.
* Worker #2: ``B`` and ``C``.
Hint: ``repo-tree`` subcommand provides ``--partitions`` argument.
#.
Each worker must be configured to upload files to master node:
.. code-block:: ini
[upload]
target = remote-service
[remote-service]
#.
Worker must be configured to access web on master node:
.. code-block:: ini
[status]
address = https://master.example.com
username = worker-user
password = very-secure-password
As it has been mentioned above, ``status.address`` must be available for workers. In case if unix socket is used, it can be passed in the same option as usual. Optional ``status.username``/``status.password`` can be supplied in case if authentication was enabled on master node.
#.
Each worker must call master node on success:
.. code-block:: ini
[report]
target = remote-call
[remote-call]
manual = yes
After successful synchronization (see above), the built packages will be put into a directory from which they will be read during a manual update, thus the ``remote-call.manual`` flag is required.
#.
Change order of trigger runs. This step is required, because by default the report trigger is called before the upload trigger and we would like to achieve the opposite:
.. code-block:: ini
[build]
triggers = ahriman.core.gitremote.RemotePullTrigger ahriman.core.upload.UploadTrigger ahriman.core.report.ReportTrigger ahriman.core.gitremote.RemotePushTrigger
In addition, the following settings are recommended for workers:
*
You might want to wait until the report trigger is completed; in this case the following option must be set:
.. code-block:: ini
[remote-call]
wait_timeout = 0
Dependency management
"""""""""""""""""""""
By default worker nodes don't know anything about the master node's packages, thus they will try to build each dependency on their own. However, using the ``AHRIMAN_REPOSITORY_SERVER`` docker variable (or the ``--server`` flag for the setup command), it is possible to specify the address of the master node for the devtools configuration.
Repository and packages signing
"""""""""""""""""""""""""""""""
You can sign packages on worker nodes and then signatures will be synced to master node. In order to do so, you need to configure worker node as following, e.g.:
.. code-block:: ini
[sign]
target = package
key = 8BE91E5A773FB48AC05CC1EDBED105AED6246B39
Note, however, that in this case, signatures will not be validated on master node and just will be copied to repository tree.
If you would like to sign only database files (aka repository sign), it has to be configured only on master node as usual, e.g.:
.. code-block:: ini
[sign]
target = repository
key = 8BE91E5A773FB48AC05CC1EDBED105AED6246B39
Double node minimal docker example
""""""""""""""""""""""""""""""""""
Master node config (``master.ini``) as:
.. code-block:: ini
[auth]
target = configuration
[web]
enable_archive_upload = yes
wait_timeout = 0
Command to run master node:
.. code-block:: shell
docker run --privileged -p 8080:8080 -e AHRIMAN_PORT=8080 -v master.ini:/etc/ahriman.ini.d/overrides.ini arcan1s/ahriman:latest web
The user ``worker-user`` has been created additionally. Worker node config (``worker.ini``) as:
.. code-block:: ini
[status]
address = http://172.17.0.1:8080
username = worker-user
password = very-secure-password
[upload]
target = remote-service
[remote-service]
[report]
target = remote-call
[remote-call]
manual = yes
wait_timeout = 0
[build]
triggers = ahriman.core.gitremote.RemotePullTrigger ahriman.core.upload.UploadTrigger ahriman.core.report.ReportTrigger ahriman.core.gitremote.RemotePushTrigger
The address above (``http://172.17.0.1:8080``) is reachable from inside the worker container (it is the default docker bridge address of the host).
Command to run worker node:
.. code-block:: shell
docker run --privileged -v worker.ini:/etc/ahriman.ini.d/overrides.ini -it arcan1s/ahriman:latest package-add ahriman --now
The command above will build the ``ahriman`` package, upload it to the master node and, finally, update the master node repository.
Check proof-of-concept setup `here <https://github.com/arcan1s/ahriman/tree/master/recipes/distributed-manual>`__.
Addition of new package and repository update
"""""""""""""""""""""""""""""""""""""""""""""
Just run the command on a worker as usual; the built packages will be automatically uploaded to the master node. Note that the automatic update process must be disabled on the master node.
Package removal
"""""""""""""""
This action must be done in two steps:
#. Remove package on worker.
#. Remove package on master node.
Delegate builds to remote workers
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
This setup heavily uses the upload feature described above and, in addition, automatically delegates the build process to the build machines. Same as above, there must be at least two instances available (``master`` and ``worker``); however, all ``worker`` nodes must be run in the web service mode.
Master node configuration
"""""""""""""""""""""""""
In addition to the configuration above, the worker list must be defined in configuration file (``build.workers`` option), i.e.:
.. code-block:: ini
[build]
workers = https://worker1.example.com https://worker2.example.com
[web]
enable_archive_upload = yes
wait_timeout = 0
In the example above, ``https://worker1.example.com`` and ``https://worker2.example.com`` are remote ``worker`` node addresses available for ``master`` node.
If authentication is required (which is the recommended way to set it up), it can be configured by using the ``status`` section as usual.
Worker nodes configuration
""""""""""""""""""""""""""
It is required to point to the master node repository, otherwise internal dependencies will not be handled correctly. In order to do so, the ``--server`` argument (or ``AHRIMAN_REPOSITORY_SERVER`` environment variable for docker images) can be used.
Also, in case if authentication is enabled, the same user with the same password must be created for all workers.
It is also recommended to set ``web.wait_timeout`` to infinite in case of multiple conflicting runs and ``service_only`` to ``yes`` in order to disable status endpoints.
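For example, a hedged sketch of these recommended worker-side options (assuming that ``0`` is treated as an infinite timeout here, as in the master node examples above):
.. code-block:: ini
[web]
service_only = yes
wait_timeout = 0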
Other settings are the same as mentioned above.
Triple node minimal docker example
""""""""""""""""""""""""""""""""""
In this example, all instances are run on the same machine with address ``172.17.0.1`` with ports available outside of container. Master node config (``master.ini``) as:
.. code-block:: ini
[auth]
target = configuration
[status]
username = builder-user
password = very-secure-password
[build]
workers = http://172.17.0.1:8081 http://172.17.0.1:8082
[web]
enable_archive_upload = yes
wait_timeout = 0
Command to run master node:
.. code-block:: shell
docker run --privileged -p 8080:8080 -e AHRIMAN_PORT=8080 -v master.ini:/etc/ahriman.ini.d/overrides.ini arcan1s/ahriman:latest web
Worker nodes (applicable for all workers) config (``worker.ini``) as:
.. code-block:: ini
[auth]
target = configuration
[status]
address = http://172.17.0.1:8080
username = builder-user
password = very-secure-password
[upload]
target = remote-service
[remote-service]
[report]
target = remote-call
[remote-call]
manual = yes
wait_timeout = 0
[web]
service_only = yes
[build]
triggers = ahriman.core.upload.UploadTrigger ahriman.core.report.ReportTrigger
Command to run worker nodes (considering there will be two workers, one is on ``8081`` port and other is on ``8082``):
.. code-block:: shell
docker run --privileged -p 8081:8081 -e AHRIMAN_PORT=8081 -v worker.ini:/etc/ahriman.ini.d/overrides.ini arcan1s/ahriman:latest web
docker run --privileged -p 8082:8082 -e AHRIMAN_PORT=8082 -v worker.ini:/etc/ahriman.ini.d/overrides.ini arcan1s/ahriman:latest web
Unlike the previous setup, it is not required to mount the repository root for ``worker`` nodes, because they do not use it anyway.
Check proof-of-concept setup `here <https://github.com/arcan1s/ahriman/tree/master/recipes/distributed>`__.
Addition of new package, package removal, repository update
"""""""""""""""""""""""""""""""""""""""""""""""""""""""""""
In all scenarios, the update process must be run only on the ``master`` node. Unlike the manually distributed setup described above, automatic updates must be enabled only for the ``master`` node.
Automatic worker nodes discovery
""""""""""""""""""""""""""""""""
Instead of setting the ``build.workers`` option, it is also possible to configure the services to load the worker list dynamically. To do so, the ``ahriman.core.distributed.WorkerLoaderTrigger`` and ``ahriman.core.distributed.WorkerTrigger`` must be used for the ``master`` and ``worker`` nodes respectively. See the recipes for more details.
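A hedged sketch of such a configuration, assuming these triggers are enabled via the ``build.triggers`` option like the other triggers in this document (kindly check the recipes for the authoritative configuration):
.. code-block:: ini
; master node
[build]
triggers = ahriman.core.distributed.WorkerLoaderTrigger ahriman.core.report.ReportTrigger ahriman.core.upload.UploadTrigger
; worker nodes
[build]
triggers = ahriman.core.distributed.WorkerTrigger ahriman.core.upload.UploadTrigger ahriman.core.report.ReportTrigger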
Known limitations
"""""""""""""""""
* Workers don't support local packages. However, it is possible to build custom packages by providing sources via the ``ahriman.core.gitremote.RemotePullTrigger`` trigger.
* No dynamic node discovery. If one of the worker nodes is unavailable, the build process will fail.
* No pkgrel bump on conflicts.
* An identical user must be created for all workers. However, the ``master`` node user can be different from this one.

docs/faq/docker.rst (new file)

@ -0,0 +1,115 @@
Docker image
------------
We provide official images which can be found under:
* docker registry ``arcan1s/ahriman``;
* ghcr.io registry ``ghcr.io/arcan1s/ahriman``.
These images are totally identical.
The docker image is updated on each commit to master as well as on each release. If you would like to use the latest (probably unstable) build, use the ``edge`` tag; ``latest`` points to the most recent tagged version; any specific version tag is also available.
The default action (if no arguments are provided) is ``repo-update``. Basically the idea is to run the container, e.g.:
.. code-block:: shell
docker run --privileged -v /path/to/local/repo:/var/lib/ahriman arcan1s/ahriman:latest
The ``--privileged`` flag is required to make mounting possible inside the container. In order to make the data available outside of the container, you need to mount the local (parent) directory inside the container by using the ``-v /path/to/local/repo:/var/lib/ahriman`` argument, where ``/path/to/local/repo`` is the path to the repository on the local machine. In addition, you can pass your own configuration overrides by using the same ``-v`` flag, e.g.:
.. code-block:: shell
docker run --privileged -v /path/to/local/repo:/var/lib/ahriman -v /path/to/overrides/overrides.ini:/etc/ahriman.ini.d/10-overrides.ini arcan1s/ahriman:latest
The action can be specified during run, e.g.:
.. code-block:: shell
docker run --privileged -v /path/to/local/repo:/var/lib/ahriman arcan1s/ahriman:latest package-add ahriman --now
For more details please refer to the docker FAQ.
Environment variables
^^^^^^^^^^^^^^^^^^^^^
The following environment variables are supported:
* ``AHRIMAN_ARCHITECTURE`` - architecture of the repository, default is ``x86_64``.
* ``AHRIMAN_DEBUG`` - if set all commands will be logged to console.
* ``AHRIMAN_FORCE_ROOT`` - force run ahriman as root instead of guessing by subcommand.
* ``AHRIMAN_HOST`` - host for the web interface, default is ``0.0.0.0``.
* ``AHRIMAN_MULTILIB`` - if set (default) multilib repository will be used, disabled otherwise.
* ``AHRIMAN_OUTPUT`` - controls logging handler, e.g. ``syslog``, ``console``. The name must be found in logging configuration. Note that if ``syslog`` handler is used you will need to mount ``/dev/log`` inside container because it is not available there.
* ``AHRIMAN_PACKAGER`` - packager name from which packages will be built, default is ``ahriman bot <ahriman@example.com>``.
* ``AHRIMAN_PACMAN_MIRROR`` - override pacman mirror server if set.
* ``AHRIMAN_PORT`` - HTTP server port if any, default is empty.
* ``AHRIMAN_POSTSETUP_COMMAND`` - if set, the command which will be called (as root) after the setup command, but before any other actions.
* ``AHRIMAN_PRESETUP_COMMAND`` - if set, the command which will be called (as root) right before the setup command.
* ``AHRIMAN_REPOSITORY`` - repository name, default is ``aur-clone``.
* ``AHRIMAN_REPOSITORY_SERVER`` - optional override for the repository URL. Useful if you would like to download packages from remote instead of local filesystem.
* ``AHRIMAN_REPOSITORY_ROOT`` - repository root. Because of filesystem rights it is required to override default repository root. By default, it uses ``ahriman`` directory inside ahriman's home, which can be passed as mount volume.
* ``AHRIMAN_UNIX_SOCKET`` - full path to unix socket which is used by web server, default is empty. Note that more likely you would like to put it inside ``AHRIMAN_REPOSITORY_ROOT`` directory (e.g. ``/var/lib/ahriman/ahriman/ahriman-web.sock``) or to ``/run/ahriman``.
* ``AHRIMAN_USER`` - ahriman user, usually must not be overwritten, default is ``ahriman``.
* ``AHRIMAN_VALIDATE_CONFIGURATION`` - if set (default) validate service configuration.
You can pass any of these variables by using ``-e`` argument, e.g.:
.. code-block:: shell
docker run --privileged -e AHRIMAN_PORT=8080 -v /path/to/local/repo:/var/lib/ahriman arcan1s/ahriman:latest
Daemon service
^^^^^^^^^^^^^^
There is a special ``repo-daemon`` subcommand which emulates a systemd timer and performs repository updates periodically:
.. code-block:: shell
docker run --privileged -v /path/to/local/repo:/var/lib/ahriman arcan1s/ahriman:latest repo-daemon
This command uses the same rules as ``repo-update``, thus it also requires e.g. the ``--privileged`` flag. Check also the `examples <https://github.com/arcan1s/ahriman/tree/master/recipes/daemon>`__.
Web service setup
^^^^^^^^^^^^^^^^^
For that you need a web container instance running permanently; it can be achieved by the following command:
.. code-block:: shell
docker run --privileged -p 8080:8080 -e AHRIMAN_PORT=8080 -e AHRIMAN_UNIX_SOCKET=/var/lib/ahriman/ahriman/ahriman-web.sock -v /path/to/local/repo:/var/lib/ahriman arcan1s/ahriman:latest
Note the ``AHRIMAN_PORT`` environment variable, which is required in order to enable the web service. An additional port binding via ``-p 8080:8080`` is required to expose the port outside of the container.
The ``AHRIMAN_UNIX_SOCKET`` variable is not required, but highly recommended, as it can be used for interprocess communication. If you set this variable, make sure that this path is available outside of the container if you are going to use multiple docker instances.
If you are using the ``AHRIMAN_UNIX_SOCKET`` variable, it has to be passed to every subsequent container run as well, e.g.:
.. code-block:: shell
docker run --privileged -e AHRIMAN_UNIX_SOCKET=/var/lib/ahriman/ahriman/ahriman-web.sock -v /path/to/local/repo:/var/lib/ahriman arcan1s/ahriman:latest
Otherwise, you would need to pass ``AHRIMAN_PORT`` and mount container network to the host system (``--net=host``), e.g.:
.. code-block:: shell
docker run --privileged --net=host -e AHRIMAN_PORT=8080 -v /path/to/local/repo:/var/lib/ahriman arcan1s/ahriman:latest
Simple server with authentication can be found in `examples <https://github.com/arcan1s/ahriman/tree/master/recipes/web>`__ too.
Multi-repository web service
""""""""""""""""""""""""""""
The idea is pretty much the same as just running the web service. However, it is required to run setup commands for each repository, except for the one specified by the ``AHRIMAN_REPOSITORY`` and ``AHRIMAN_ARCHITECTURE`` variables.
In order to create configuration for additional repositories, the ``AHRIMAN_POSTSETUP_COMMAND`` variable should be used, e.g.:
.. code-block:: shell
docker run --privileged -p 8080:8080 -e AHRIMAN_PORT=8080 -e AHRIMAN_UNIX_SOCKET=/var/lib/ahriman/ahriman/ahriman-web.sock -e AHRIMAN_POSTSETUP_COMMAND="ahriman --architecture x86_64 --repository aur-clone-v2 service-setup --build-as-user ahriman --packager 'ahriman bot <ahriman@example.com>'" -v /path/to/local/repo:/var/lib/ahriman arcan1s/ahriman:latest
The command above will also create configuration for the repository named ``aur-clone-v2``.
Note, however, that the command above is only required in case if the service is going to be used to run subprocesses. Otherwise, everything else (web interface, status, etc) will be handled as usual.
Configuration `example <https://github.com/arcan1s/ahriman/tree/master/recipes/multirepo>`__.

docs/faq/examples.rst (new file)

@ -0,0 +1,12 @@
Use cases
---------
There is a collection of some specific recipes which can be found in `the repository <https://github.com/arcan1s/ahriman/tree/master/recipes>`__.
Most of them can be run (the ``AHRIMAN_PASSWORD`` environment variable is required in most setups) as simply as:
.. code-block:: shell
AHRIMAN_PASSWORD=demo docker compose up
Note, however, that these are just examples of specific configurations for specific cases and they are not intended to be used as is in a real environment.

docs/faq/general.rst (new file)

@ -0,0 +1,427 @@
General topics
--------------
What is the purpose of the project
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
This project has been created in order to maintain self-hosted Arch Linux user repository without manual intervention - checking for updates and building packages.
How to install ahriman
^^^^^^^^^^^^^^^^^^^^^^
TL;DR
.. code-block:: shell
yay -S ahriman
ahriman -a x86_64 -r aur-clone service-setup --packager "ahriman bot <ahriman@example.com>"
systemctl enable --now ahriman@x86_64-aur-clone.timer
Long answer
"""""""""""
The idea is to install the package as usual, create working directory tree, create configuration for ``sudo`` and ``devtools``. Detailed description of the setup instruction can be found :doc:`here </setup>`.
Run as daemon
"""""""""""""
The alternative way (though not recommended) is to run service instead of timer:
.. code-block:: shell
systemctl enable --now ahriman-daemon@x86_64-aur-clone
How to validate settings
^^^^^^^^^^^^^^^^^^^^^^^^
There is special command which can be used in order to validate current configuration:
.. code-block:: shell
ahriman service-config-validate --exit-code
This command will print found errors, based on `cerberus <https://docs.python-cerberus.org/>`__, e.g.:
.. code-block:: shell
auth
ssalt: unknown field
target: none or more than one rule validate
oneof definition 0: unallowed value mapping
oneof definition 1: field 'salt' is required
oneof definition 2: unallowed value mapping
oneof definition 2: field 'salt' is required
oneof definition 2: field 'client_id' is required
oneof definition 2: field 'client_secret' is required
gitremote
pull_url: unknown field
If the additional ``--exit-code`` flag is supplied, the application will return a non-zero exit code on errors, which can be useful in scripts.
What does "architecture specific" mean / How to configure for different architectures
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Some sections can be configured per architecture. The service will merge architecture specific values into the common settings. In order to specify settings for a specific architecture you must include it in the section name.
For example, the section
.. code-block:: ini
[build]
build_command = extra-x86_64-build
states that default build command is ``extra-x86_64-build``. But if there is section
.. code-block:: ini
[build:i686]
build_command = extra-i686-build
the ``extra-i686-build`` command will be used for the ``i686`` architecture. You can also override settings for different repositories and architectures; in this case the section names will be ``build:aur-clone`` (repository name only) and ``build:aur-clone:i686`` (both repository name and architecture).
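For example, to override the build command only for the ``i686`` architecture of the ``aur-clone`` repository:
.. code-block:: ini
[build:aur-clone:i686]
build_command = extra-i686-build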
How to generate build reports
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Normally you would probably like to generate only one report for the specific type, e.g. only one email report. In order to do so you will need to have the following configuration:
.. code-block:: ini
[report]
target = email
[email]
...
or in case of multiple architectures and *different* reporting settings:
.. code-block:: ini
[report]
target = email
[email:i686]
...
[email:x86_64]
...
But for some cases you would like to have multiple different reports with the same type (e.g. sending different templates to different addresses). For these cases you will need to specify section name in target and type in section, e.g. the following configuration can be used:
.. code-block:: ini
[report]
target = email_1 email_2
[email_1]
type = email
...
[email_2]
type = email
...
How to add new package
^^^^^^^^^^^^^^^^^^^^^^
.. code-block:: shell
sudo -u ahriman ahriman package-add ahriman --now
The ``--now`` flag is totally optional and just runs the ``repo-update`` subcommand after registering the new package. Thus the extended flow is the following:
.. code-block:: shell
sudo -u ahriman ahriman package-add ahriman
sudo -u ahriman ahriman repo-update
How to build package from local PKGBUILD
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
TL;DR
.. code-block:: shell
sudo -u ahriman ahriman package-add /path/to/local/directory/with/PKGBUILD --now
Before using this command you will need to create a local directory, put the ``PKGBUILD`` there and generate ``.SRCINFO`` by using the ``makepkg --printsrcinfo > .SRCINFO`` command. These packages will be stored locally and *will be ignored* during automatic updates; in order to update the package you will need to run the ``package-add`` command again.
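A minimal sketch of the preparation steps (the directory name here is arbitrary):
.. code-block:: shell
mkdir my-package && cd my-package
# create or copy the PKGBUILD here, then generate .SRCINFO
makepkg --printsrcinfo > .SRCINFO
sudo -u ahriman ahriman package-add "$PWD" --now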
How to copy package from another repository
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
As simple as adding a package from an archive. Considering the case when you would like to copy the package ``package`` with version ``ver-rel`` from the repository ``source-repository`` to ``target-repository`` (same architecture), the command will be the following:
.. code-block:: shell
sudo -u ahriman ahriman -r target-repository package-add /var/lib/ahriman/repository/source-repository/x86_64/package-ver-rel-x86_64.pkg.tar.zst
In addition, you can remove source package as usual later.
This feature is particularly useful for managing multiple repositories like ``[testing]`` and ``[extra]``.
How to fetch PKGBUILDs from remote repository
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
For that purpose you could use the ``RemotePullTrigger`` trigger. To do so you will need to configure the trigger as follows:
.. code-block:: ini
[remote-pull]
target = gitremote
[gitremote]
pull_url = https://github.com/username/repository
During the next application run it will fetch the repository from the specified URL and will try to find packages there which can be used as local sources.
This feature can also be used to build packages which are not listed in AUR; an example of its use can be found `here <https://github.com/arcan1s/ahriman/tree/master/recipes/pull>`__.
How to push updated PKGBUILDs to remote repository
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
For that purpose you'd need to use another trigger called ``RemotePushTrigger``. Configure the trigger as follows:
.. code-block:: ini
[remote-push]
target = gitremote
[gitremote]
push_url = https://github.com/username/repository
Unlike the ``RemotePullTrigger``, the ``RemotePushTrigger`` will most likely require authorization. It is highly recommended to use application tokens instead of your password (e.g. for GitHub you can generate tokens `here <https://github.com/settings/tokens>`__ with the ``public_repo`` scope). Authorization can be supplied by using the authorization part of the URL, e.g. ``https://key:token@github.com/username/repository``.
How to change PKGBUILDs before build
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
It is supported as well. The recommended way is to patch a specific variable or function, e.g. by running
.. code-block:: shell
sudo -u ahriman ahriman patch-add ahriman version
This command will prompt for the new value of the PKGBUILD variable ``version``. You can also write it to a file and read from it:
.. code-block:: shell
sudo -u ahriman ahriman patch-add ahriman version version.patch
The command also supports arrays, but in this case you need to specify full array, e.g.
.. code-block:: shell
sudo -u ahriman ahriman patch-add ahriman depends
Post new function or variable value below. Press Ctrl-D to finish:
(python python-aiohttp)
^D
will set the ``depends`` PKGBUILD variable (exactly) to the array ``["python", "python-aiohttp"]``.
Alternatively you can create full-diff patches, which are calculated by using ``git diff`` from current PKGBUILD master branch:
#.
Clone sources from AUR.
#.
Make changes you would like to (e.g. edit ``PKGBUILD``, add external patches).
#.
Run command
.. code-block:: shell
sudo -u ahriman ahriman patch-set-add /path/to/local/directory/with/PKGBUILD
The last command will calculate diff from current tree to the ``HEAD`` and will store it locally. Patches will be applied on any package actions (e.g. it can be used for dependency management).
It is also possible to create simple patch during package addition, e.g.:
.. code-block:: shell
sudo -u ahriman ahriman package-add ahriman --variable PKGEXT=.pkg.tar.xz
The ``--variable`` argument accepts variables in shell-like format: quoting and lists are supported as usual, but functions are not. This feature is particularly useful in order to override specific makepkg variables during build.
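A hedged sketch of passing a list value (the exact quoting may depend on your shell):
.. code-block:: shell
sudo -u ahriman ahriman package-add ahriman --variable 'depends=(python python-aiohttp)'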
How to build package from official repository
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
It is the same as adding any other package, but due to restrictions you must specify source explicitly, e.g.:
.. code-block:: shell
sudo -u ahriman ahriman package-add pacman --source repository
This feature heavily depends on the local pacman cache. In order to use it, it is recommended to either run ``pacman -Sy`` before the interaction or to use the internal application cache with the ``--refresh`` flag.
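For example, using the internal cache refresh:
.. code-block:: shell
sudo -u ahriman ahriman package-add pacman --source repository --refresh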
Package build fails because it cannot validate PGP signature of source files
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
TL;DR
.. code-block:: shell
sudo -u ahriman ahriman service-key-import ...
How to update VCS packages
^^^^^^^^^^^^^^^^^^^^^^^^^^
Normally the service handles VCS packages correctly. The version is updated in clean chroot, no additional actions are required.
How to review changes before build
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
In this scenario, the update process must be separated into several stages. First, it is required to check updates:
.. code-block:: shell
sudo -u ahriman ahriman repo-check
During the check process, the service will generate changes from the last known commit and will send them to the remote service. In order to review source file changes, the web interface or a special subcommand can be used:
.. code-block:: shell
ahriman package-changes ahriman
After validation, the operator can run update process with approved list of packages, e.g.:
.. code-block:: shell
sudo -u ahriman ahriman repo-update ahriman
How to remove package
^^^^^^^^^^^^^^^^^^^^^
.. code-block:: shell
sudo -u ahriman ahriman package-remove ahriman
Also, there is the ``repo-remove-unknown`` command, which checks the packages in AUR and local storage and removes the ones which have been removed from there.
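For example:
.. code-block:: shell
sudo -u ahriman ahriman repo-remove-unknown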
Remove commands also remove any package files (patches, caches etc).
How to sign repository
^^^^^^^^^^^^^^^^^^^^^^
The repository sign feature is available in several configurations. The recommended way is to sign only the repository database file with a single key instead of trying to sign each package. However, the steps are pretty much the same, only the configuration differs a bit. For more details about the options kindly refer to the :doc:`configuration reference </configuration>`.
#.
First you would need to create the key on your local machine:
.. code-block:: shell
gpg --full-generate-key
This command will ask you several questions. Most of them may be left at their defaults, but you will need to fill in a real name and email address. Because the service doesn't support passphrases at the moment, the passphrase must be left blank.
#.
The command above will generate key and print its fingerprint, something like ``8BE91E5A773FB48AC05CC1EDBED105AED6246B39``. Copy it.
#.
Export your private key by using the fingerprint above:
.. code-block:: shell
gpg --export-secret-keys -a 8BE91E5A773FB48AC05CC1EDBED105AED6246B39 > repository-key.gpg
#.
Copy the specified key to the build machine (i.e. where the service is running).
#.
Import the specified key to the service user:
.. code-block:: shell
sudo -u ahriman gpg --import repository-key.gpg
Don't forget to remove the key from filesystem after import.
#.
Change trust level to ``ultimate``:
.. code-block:: shell
sudo -u ahriman gpg --edit-key 8BE91E5A773FB48AC05CC1EDBED105AED6246B39
The command above will drop you into gpg shell, in which you will need to type ``trust``, choose ``5 = I trust ultimately``, confirm and exit ``quit``.
#.
Proceed with service configuration according to the :doc:`configuration </configuration>`:
.. code-block:: ini
[sign]
target = repository
key = 8BE91E5A773FB48AC05CC1EDBED105AED6246B39
How to rebuild packages after library update
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
TL;DR
.. code-block:: shell
sudo -u ahriman ahriman repo-rebuild --depends-on python
You can even rebuild the whole repository (which is particularly useful if you would like to change the packager) if you do not supply the ``--depends-on`` option. This action will automatically increment the ``pkgrel`` value; if you don't want that, the ``--no-increment`` option has to be supplied.
However, note that you do not need to rebuild the repository if you just changed a signing option; use the ``repo-sign`` command instead.
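For example, to (re)sign the repository without rebuilding it:
.. code-block:: shell
sudo -u ahriman ahriman repo-sign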
Automated broken dependencies detection
"""""""""""""""""""""""""""""""""""""""
After a successful build, the application extracts all linked libraries and used directories and stores them in the database. During the check process, the application extracts the pacman databases and checks whether file names have changed (e.g. a new python release renamed the ``/usr/lib/python3.x`` directory to ``/usr/lib/python3.y``, or the soname of a linked library changed). If broken dependencies are detected, the package will be added to the rebuild queue.
In order to disable this check completely, the ``--no-check-files`` flag can be used.
In addition, it is possible to control which paths are used for the check via the ``build.scan_paths`` option, which supports regular expressions. Leaving ``build.scan_paths`` blank will effectively disable the check as well.
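For example, the default shipped configuration scans ``/usr/lib`` paths while excluding cmake files (this mirrors the configuration sample shown later in this changeset):
.. code-block:: ini
[build]
scan_paths = ^usr/lib(?!/cmake).*$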
How to install built packages
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Add the following lines to your ``pacman.conf``:
.. code-block:: ini
[repository]
Server = file:///var/lib/ahriman/repository/$repo/$arch
(You might need to add ``SigLevel`` option according to the pacman documentation.)
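For example, a hedged variant for an unsigned repository (check the pacman documentation for the ``SigLevel`` values appropriate to your setup):
.. code-block:: ini
[repository]
SigLevel = Optional TrustAll
Server = file:///var/lib/ahriman/repository/$repo/$arch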
How to serve repository
^^^^^^^^^^^^^^^^^^^^^^^
Easy. For example, nginx configuration (without SSL) will look like:
.. code-block::
server {
listen 80;
server_name repo.example.com;
location / {
autoindex on;
root /var/lib/ahriman/repository;
}
}
Example of the status page configuration is the following (status service is using 8080 port):
.. code-block::
server {
listen 80;
server_name builds.example.com;
location / {
proxy_set_header Host $host;
proxy_set_header X-Real-IP $remote_addr;
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
proxy_set_header X-Forwarded-Proto $scheme;
proxy_pass http://127.0.0.1:8080;
}
}
Some more examples can be found in configuration `recipes <https://github.com/arcan1s/ahriman/tree/master/recipes>`__.

docs/faq/index.rst (new file)

@ -0,0 +1,17 @@
FAQ
===
.. toctree::
:maxdepth: 2
general
docker
non-x86_64-setup
synchronization
reporting
distributed
maintenance-packages
web
backup
examples
misc

docs/faq/maintenance-packages.rst (new file)

@ -0,0 +1,73 @@
Maintenance packages
--------------------
Generate keyring package
^^^^^^^^^^^^^^^^^^^^^^^^
The application provides a special plugin which generates a keyring package. This plugin heavily depends on the ``sign`` group settings; however, it is possible to override them. The minimal package can be generated in the following way:
#.
Edit configuration:
.. code-block:: ini
[keyring]
target = keyring-generator
By default it will use ``sign.key`` as the trusted key and all other keys as packager ones. For all available options refer to the :doc:`configuration </configuration>`.
#.
Create package source files:
.. code-block:: shell
sudo -u ahriman ahriman repo-create-keyring
This command will generate the PKGBUILD, the revoked and trusted listings and the keyring itself, and will register the package in the database.
#.
Build new package as usual:
.. code-block:: shell
sudo -u ahriman ahriman package-add aur-clone-keyring --source local --now
where ``aur-clone`` is your repository name.
This plugin might have some issues; in case you face any of them, kindly create a `new issue <https://github.com/arcan1s/ahriman/issues/new/choose>`__.
Generate mirrorlist package
^^^^^^^^^^^^^^^^^^^^^^^^^^^
The application also provides a special plugin which generates a mirrorlist package. It is possible to distribute this package as usual later. The package can be generated in the following way:
#.
Edit configuration:
.. code-block:: ini
[mirrorlist]
target = mirrorlist-generator
[mirrorlist-generator]
servers = https://repo.example.com/$arch
The ``mirrorlist-generator.servers`` option must contain the list of available mirrors; the ``$arch`` and ``$repo`` variables are supported. For more options kindly refer to the :doc:`configuration </configuration>`.
#.
Create package source files:
.. code-block:: shell
sudo -u ahriman ahriman repo-create-mirrorlist
This command will generate the PKGBUILD and the mirrorlist file and will register the package in the database.
#.
Build new package as usual:
.. code-block:: shell
sudo -u ahriman ahriman package-add aur-clone-mirrorlist --source local --now
where ``aur-clone`` is your repository name.

docs/faq/misc.rst (new file)

@ -0,0 +1,100 @@
Other topics
------------
How does it differ from %another-manager%?
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Short answer: I do not know. Also, for some references, credits to `Alad <https://github.com/AladW>`__, who `did <https://wiki.archlinux.org/title/User:Alad/Local_repo_tools>`__ a really good investigation of the existing alternatives.
`arch-repo-manager <https://github.com/Martchus/arch-repo-manager>`__
"""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""
Actually looks pretty good; had I found it earlier, I probably would not have started this project. Most of the features (like the web interface or additional helpers) are already implemented or planned. However, this project seems to be at an early alpha stage (as of Nov 2022), is written in C++ (neither a pro nor a con) and lacks documentation.
`archrepo2 <https://github.com/lilydjwg/archrepo2>`__
"""""""""""""""""""""""""""""""""""""""""""""""""""""
Don't know, haven't tried it. But it lacks documentation at least.
* ``ahriman`` has web interface.
* ``archrepo2`` doesn't have synchronization and reporting.
* ``archrepo2`` actively uses direct shell calls and ``yaourt`` components.
* ``archrepo2`` has a constantly running process instead of a timer process (which is neither a pro nor a con).
`repoctl <https://github.com/cassava/repoctl>`__
""""""""""""""""""""""""""""""""""""""""""""""""
* ``ahriman`` has web interface.
* ``repoctl`` does not have reporting feature.
* ``repoctl`` does not support local packages and patches.
* Some actions are not fully automated in ``repoctl`` (e.g. package update still requires manual intervention for the build itself).
* ``repoctl`` has better AUR interaction features. With colors!
* ``repoctl`` has much easier configuration and even completion.
* ``repoctl`` is able to store old packages.
* ``repoctl`` is able to host the repository from the same command, whereas ``ahriman`` relies on external services (e.g. nginx).
`repod <https://gitlab.archlinux.org/archlinux/repod>`__
""""""""""""""""""""""""""""""""""""""""""""""""""""""""
An official tool provided by the distribution; it has clean logic, but it is just a helper for ``repo-add``, e.g. it doesn't work with AUR and all package builds have to be handled separately.
`repo-scripts <https://github.com/arcan1s/repo-scripts>`__
""""""""""""""""""""""""""""""""""""""""""""""""""""""""""
Though ahriman was originally created as an attempt to improve this project, the latter still lacks a lot of features:
* ``ahriman`` has web interface.
* ``ahriman`` has better reporting with template support.
* ``ahriman`` has more synchronization features (there was only ``rsync`` based).
* ``ahriman`` supports local packages and patches.
* ``repo-scripts`` doesn't have dependency management.
...and so on. ``repo-scripts`` also has poor architecture and low quality code, and uses the outdated ``yaourt`` and ``package-query``.
`toolbox <https://github.com/chaotic-aur/toolbox>`__
""""""""""""""""""""""""""""""""""""""""""""""""""""
It is an automation tool for the ``repoctl`` mentioned above. Except for being written in shell, it looks pretty cool and also offers some additional features like patches, remote synchronization and reporting.
How to check service logs
^^^^^^^^^^^^^^^^^^^^^^^^^
By default, the service writes logs to ``journald``, which can be accessed by using the ``journalctl`` command (logs are written to the journal of the user under which the command is run). In order to retrieve the logs for the process you can use the following command:
.. code-block:: shell
sudo journalctl SYSLOG_IDENTIFIER=ahriman
You can also ask to forward logs to ``stderr``, just set ``--log-handler`` flag, e.g.:
.. code-block:: shell
ahriman --log-handler console ...
You can even configure logging as you wish; kindly refer to the python ``logging`` module `configuration documentation <https://docs.python.org/3/library/logging.config.html>`__.
The application uses a Java-like logger naming concept, e.g. the class ``Application`` imported from the ``ahriman.application.application`` package will have a logger called ``ahriman.application.application.Application``. In order to e.g. change the logging configuration for the whole application package, it is possible to change values for the ``ahriman.application`` logger; likewise, editing the ``ahriman`` logger configuration will change logging for the whole application (unless there are overrides for more specific loggers).
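A hedged sketch of such an override, mirroring the format of the shipped ``logging.ini`` (the ``application`` logger name here is arbitrary and would also need to be registered in the ``[loggers]`` section ``keys`` list):
.. code-block:: ini
[logger_application]
level = DEBUG
qualname = ahriman.application
propagate = 0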
HTML customization
^^^^^^^^^^^^^^^^^^
It is possible to customize the HTML templates. In order to do so, create the template files somewhere (refer to the Jinja2 documentation and the service source code for available parameters) and prepend the ``templates`` option with the value pointing to this directory.
In addition, the default HTML templates support style customization out of the box. In order to customize the style, just put a file named ``user-style.jinja2`` into the templates directory.
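A hedged sketch, assuming the web interface templates are configured via the ``web`` section as in the configuration sample (the custom path is a placeholder and is prepended so it takes precedence):
.. code-block:: ini
[web]
templates = /path/to/custom/templates /usr/share/ahriman/templates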
Web API extension
^^^^^^^^^^^^^^^^^
The application loads web views dynamically, so it is relatively easy to extend its API. In order to do so:
#. Create view class which is derived from ``ahriman.web.views.base.BaseView`` class.
#. Create implementation for this class.
#. Put file into ``ahriman.web.views`` package.
#. Restart application.
For more details about the implementation and possibilities, kindly refer to the module documentation, the source code and the `aiohttp documentation <https://docs.aiohttp.org/en/stable/>`__.
I did not find my question
^^^^^^^^^^^^^^^^^^^^^^^^^^
`Create an issue <https://github.com/arcan1s/ahriman/issues>`__ with type **Question**.

docs/faq/non-x86_64-setup.rst (new file)

@ -0,0 +1,99 @@
Non-x86_64 architecture setup
-----------------------------
The following section describes how to set up ahriman with an architecture different from x86_64, using i686 as an example. In most cases a base repository is available, e.g. the archlinux32 repositories for the i686 architecture; if a base repository is not available, the steps are a bit different, however, the idea remains the same.
The example of setup with docker compose can be found `here <https://github.com/arcan1s/ahriman/tree/master/recipes/i686>`__.
Physical server setup
^^^^^^^^^^^^^^^^^^^^^
In this example we are going to use files and packages which are provided by the official repositories of the used architecture. Note that versions might differ, thus you need to find the correct versions on the distribution web site, e.g. `archlinux32 packages <https://www.archlinux32.org/packages/>`__.
#.
First, considering having base Arch Linux system, we need to install keyring for the specified repositories, e.g.:
.. code-block:: shell
wget https://pool.mirror.archlinux32.org/i686/core/archlinux32-keyring-20230705-1.0-any.pkg.tar.zst
pacman -U archlinux32-keyring-20230705-1.0-any.pkg.tar.zst
#.
In order to run the ``devtools`` scripts for a custom architecture, they also need a specific ``makepkg`` configuration; it can be retrieved by installing the ``devtools`` package of the distribution, e.g.:
.. code-block:: shell
wget https://pool.mirror.archlinux32.org/i686/extra/devtools-20221208-1.2-any.pkg.tar.zst
pacman -U devtools-20221208-1.2-any.pkg.tar.zst
Alternatively, you can create your own ``makepkg`` configuration and save it as ``/usr/share/devtools/makepkg.conf.d/i686.conf``.
#.
Setup repository as usual:
.. code-block:: shell
ahriman -a i686 service-setup --mirror 'https://de.mirror.archlinux32.org/$arch/$repo' --no-multilib ...
In addition to the usual options, you need to specify the following ones:
* ``--mirror`` - link to the mirrors which will be used instead of official repositories.
* ``--no-multilib`` - in the example we are using i686 architecture for which multilib repository doesn't exist.
#.
That's all Folks!
Docker container setup
^^^^^^^^^^^^^^^^^^^^^^
There are two possible ways to achieve the same setup by using a docker container. The first one is to just mount the required files inside the container and run it as usual (with specific environment variables). The other one is to create your own container based on the official one:
#.
Clone official container as base:
.. code-block:: dockerfile
FROM arcan1s/ahriman:latest
#.
Init pacman keys. This command is required in order to populate distribution keys:
.. code-block:: dockerfile
RUN pacman-key --init
#.
Install packages as it was described above:
.. code-block:: dockerfile
RUN pacman --noconfirm -Sy wget
RUN wget https://pool.mirror.archlinux32.org/i686/extra/devtools-20221208-1.2-any.pkg.tar.zst && pacman --noconfirm -U devtools-20221208-1.2-any.pkg.tar.zst
RUN wget https://pool.mirror.archlinux32.org/i686/core/archlinux32-keyring-20230705-1.0-any.pkg.tar.zst && pacman --noconfirm -U archlinux32-keyring-20230705-1.0-any.pkg.tar.zst
#.
At that point you should have a full ``Dockerfile`` like:
.. code-block:: dockerfile
FROM arcan1s/ahriman:latest
RUN pacman-key --init
RUN pacman --noconfirm -Sy wget
RUN wget https://pool.mirror.archlinux32.org/i686/extra/devtools-20221208-1.2-any.pkg.tar.zst && pacman --noconfirm -U devtools-20221208-1.2-any.pkg.tar.zst
RUN wget https://pool.mirror.archlinux32.org/i686/core/archlinux32-keyring-20230705-1.0-any.pkg.tar.zst && pacman --noconfirm -U archlinux32-keyring-20230705-1.0-any.pkg.tar.zst
#.
After that you can build your own container, e.g.:
.. code-block:: shell
docker build --tag ahriman-i686:latest .
#.
Now you can run the locally built container as usual, passing the environment variables for the setup command:
.. code-block:: shell
docker run --privileged -p 8080:8080 -e AHRIMAN_ARCHITECTURE=i686 -e AHRIMAN_PACMAN_MIRROR='https://de.mirror.archlinux32.org/$arch/$repo' -e AHRIMAN_MULTILIB= ahriman-i686:latest

docs/faq/reporting.rst (new file)

@ -0,0 +1,119 @@
Reporting
---------
How to report by email
^^^^^^^^^^^^^^^^^^^^^^
#.
Install dependencies:
.. code-block:: shell
yay -S --asdeps python-jinja
#.
Configure the service:
.. code-block:: ini
[report]
target = email
[email]
host = smtp.example.com
link_path = http://example.com/aur-clone/x86_64
password = ...
port = 465
receivers = me@example.com
sender = me@example.com
user = me@example.com
How to generate index page
^^^^^^^^^^^^^^^^^^^^^^^^^^
#.
Install dependencies:
.. code-block:: shell
yay -S --asdeps python-jinja
#.
Configure the service:
.. code-block:: ini
[report]
target = html
[html]
path = ${repository:root}/repository/aur-clone/x86_64/index.html
link_path = http://example.com/aur-clone/x86_64
With this configuration, the generated ``index.html`` will also be automatically synced to the remote services (e.g. S3).
How to generate RSS feed for index page
"""""""""""""""""""""""""""""""""""""""
In addition to previous steps, the following configuration is required:
.. code-block:: ini
[report]
target = html rss
[html]
rss_url = ${html:link_path}/rss.xml
[rss]
link_path = ${html:link_path}
path = ${repository:root}/repository/ahriman-demo/x86_64/rss.xml
rss_url = ${html:link_path}/rss.xml
With the appended configuration, the service will also generate ``rss.xml``, link it to the generated ``index.html`` and put them together.
How to post build report to telegram
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
#.
It still requires additional dependencies:
.. code-block:: shell
yay -S --asdeps python-jinja
#.
Register bot in telegram. You can do it by starting chat with `@BotFather <https://t.me/botfather>`__. For more details please refer to `official documentation <https://core.telegram.org/bots>`__.
#.
Optionally (if you want to post messages to a chat):
#. Create a telegram channel.
#. Invite your bot into the channel.
#. Make your channel public.
#.
Get the chat id if you want to use the numerical id, or just use the id prefixed with ``@`` (e.g. ``@ahriman``). If you are not using a chat, the chat id is your user id. If you don't want to make the channel public, you can use `this guide <https://stackoverflow.com/a/33862907>`__.
#.
Configure the service:
.. code-block:: ini
[report]
target = telegram
[telegram]
api_key = aaAAbbBBccCC
chat_id = @ahriman
link_path = http://example.com/aur-clone/x86_64
``api_key`` is the one sent by `@BotFather <https://t.me/botfather>`__, and ``chat_id`` is the value retrieved in the previous step.
If you did everything correctly, you should receive a message with the next update. A quick credentials check can be done by using the following command:
.. code-block:: shell
curl 'https://api.telegram.org/bot{api_key}/sendMessage?chat_id={chat_id}&text=hello'
(replace ``{chat_id}`` and ``{api_key}`` with the values from configuration).

docs/faq/synchronization.rst (new file)

@ -0,0 +1,131 @@
Remote synchronization
----------------------
How to sync repository to another server
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
There are several choices:
#.
Easy and cheap, just share your local files through the internet, e.g. for ``nginx``:
.. code-block::
server {
location / {
autoindex on;
root /var/lib/ahriman/repository/;
}
}
#.
You can also upload your packages using ``rsync`` to any available server. In order to use it you would need to configure ahriman first:
.. code-block:: ini
[upload]
target = rsync
[rsync]
remote = 192.168.0.1:/srv/repo
After that just add ``/srv/repo`` to the ``pacman.conf`` as usual. You can also upload to S3 (``Server = https://s3.eu-central-1.amazonaws.com/repository/aur-clone/x86_64``) or to GitHub (``Server = https://github.com/ahriman/repository/releases/download/aur-clone-x86_64``).
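For example, a hedged ``pacman.conf`` entry for the S3 case above, assuming the repository is named ``aur-clone``:
.. code-block:: ini
[aur-clone]
Server = https://s3.eu-central-1.amazonaws.com/repository/aur-clone/x86_64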
How to sync to S3
^^^^^^^^^^^^^^^^^
#.
Install dependencies:
.. code-block:: shell
pacman -S python-boto3
#.
Create a bucket (e.g. ``repository``).
#.
Create a user with write access to the bucket:
.. code-block::
{
"Version": "2012-10-17",
"Statement": [
{
"Sid": "ListObjectsInBucket",
"Effect": "Allow",
"Action": [
"s3:ListBucket"
],
"Resource": [
"arn:aws:s3:::repository"
]
},
{
"Sid": "AllObjectActions",
"Effect": "Allow",
"Action": "s3:*Object",
"Resource": [
"arn:aws:s3:::repository/*"
]
}
]
}
#.
Create an API key for the user and store it.
#.
Configure the service as following:
.. code-block:: ini
[upload]
target = s3
[s3]
access_key = ...
bucket = repository
region = eu-central-1
secret_key = ...
S3 with SSL
"""""""""""
In order to configure S3 on a custom domain with SSL (and some other features, like redirects), CloudFront should be used.
#. Configure S3 as described above.
#. In bucket properties, enable static website hosting with hosting type "Host a static website".
#. Go to AWS Certificate Manager and create public certificate on your domain. Validate domain as suggested.
#. Go to CloudFront and create a distribution. The following settings are required:
* For the origin domain, choose the S3 bucket.
* Tick the "use website endpoint" option.
* Disable caching.
* Select issued certificate.
#. Point DNS record to CloudFront address.
How to sync to GitHub releases
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
#.
Create a repository.
#.
`Create API key <https://github.com/settings/tokens>`__ with scope ``public_repo``.
#.
Configure the service as following:
.. code-block:: ini
[upload]
target = github
[github]
owner = ahriman
password = ...
repository = repository
username = ahriman

docs/faq/web.rst (new file)

@ -0,0 +1,145 @@
Web service
-----------
How to setup web service
^^^^^^^^^^^^^^^^^^^^^^^^
#.
Install dependencies:
.. code-block:: shell
yay -S --asdeps python-aiohttp python-aiohttp-jinja2 python-aiohttp-apispec>=3.0.0 python-aiohttp-cors
#.
Configure service:
.. code-block:: ini
[web]
port = 8080
#.
Start the web service ``systemctl enable --now ahriman-web``.
How to enable basic authorization
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
#.
Install dependencies 😊:
.. code-block:: shell
yay -S --asdeps python-aiohttp-security python-aiohttp-session python-cryptography
#.
Configure the service to enable authorization:
.. code-block:: ini
[auth]
target = configuration
salt = somerandomstring
The ``salt`` parameter is optional, but recommended, and can be set to any (random) string.
#.
In order to provide access for reporting from application instances, you can use unix sockets (the recommended way) with the following configuration (note that it requires the ``python-requests-unixsocket2`` package to be installed):
.. code-block:: ini
[web]
unix_socket = /run/ahriman/ahriman-web.sock
This socket path must be available to the web service instance and to all application instances (e.g. if you are using a docker container - see above - you need to make sure that the socket is passed to the root filesystem).
By the way, the unix socket variable will be set automatically if the ``--web-unix-socket`` argument is supplied to the ``setup`` subcommand.
Alternatively, you need to create a user for the service:
.. code-block:: shell
sudo -u ahriman ahriman user-add -r full api
This command will ask for the password; just type it into stdin. **Do not** leave the field blank, otherwise the user will not be able to authorize. Finally, configure the application:
.. code-block:: ini
[status]
username = api
password = pa55w0rd
#.
Create end-user with password:
.. code-block:: shell
sudo -u ahriman ahriman user-add -r full my-first-user
#.
Restart web service ``systemctl restart ahriman-web``.
Using PAM authentication
""""""""""""""""""""""""
There is also the ability to allow system users to log in. To do so, the following configuration has to be set:
.. code-block:: ini
[auth]
target = pam
full_access_group = wheel
With this setup, every user (except root) will be able to log in by using their system password. If the user belongs to the ``wheel`` group, full access will be granted automatically. It is also possible to manually add or block users, or change user rights, via the usual user management process.
How to enable OAuth authorization
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
#.
Create OAuth web application, download its ``client_id`` and ``client_secret``.
#.
Guess what? Install dependencies:
.. code-block:: shell
yay -S --asdeps python-aiohttp-security python-aiohttp-session python-cryptography python-aioauth-client
#.
Configure the service:
.. code-block:: ini
[auth]
target = oauth
client_id = ...
client_secret = ...
[web]
address = https://example.com
Configure ``oauth_provider`` and ``oauth_scopes`` in case you would like to use a provider other than Google. The scope must grant access to the user email. ``web.address`` is required to make the callback URL available from the internet.
#.
If you are not going to use a unix socket, you also need to create a service user (remember to set the ``auth.salt`` option beforehand if required):
.. code-block:: shell
sudo -u ahriman ahriman user-add --as-service -r full api
#.
Create end-user:
.. code-block:: shell
sudo -u ahriman ahriman user-add -r full my-first-user
When it asks for the password, leave it blank.
#.
Restart web service ``systemctl restart ahriman-web``.
How to implement own interface
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
You can write your own interface by using the API provided by the web service. Full autogenerated API documentation is available at ``http://localhost:8080/api-docs``.


@ -33,7 +33,7 @@ Contents
setup
configuration
command-line
faq
faq/index
migration
architecture
advanced-usage


@ -1,18 +1,15 @@
# Maintainer: Evgeniy Alekseev
pkgname='ahriman'
pkgver=2.13.3
pkgver=2.14.1
pkgrel=1
pkgdesc="ArcH linux ReposItory MANager"
arch=('any')
url="https://github.com/arcan1s/ahriman"
license=('GPL3')
depends=('devtools>=1:1.0.0' 'git' 'pyalpm' 'python-cerberus' 'python-inflection' 'python-passlib' 'python-requests' 'python-srcinfo')
depends=('devtools>=1:1.0.0' 'git' 'pyalpm' 'python-inflection' 'python-passlib' 'python-pyelftools' 'python-requests' 'python-srcinfo')
makedepends=('python-build' 'python-flit' 'python-installer' 'python-wheel')
optdepends=('breezy: -bzr packages support'
'darcs: -darcs packages support'
'mercurial: -hg packages support'
'python-aioauth-client: web server with OAuth2 authorization'
optdepends=('python-aioauth-client: web server with OAuth2 authorization'
'python-aiohttp: web server'
'python-aiohttp-apispec>=3.0.0: web server'
'python-aiohttp-cors: web server'
@ -20,12 +17,13 @@ optdepends=('breezy: -bzr packages support'
'python-aiohttp-security: web server with authorization'
'python-aiohttp-session: web server with authorization'
'python-boto3: sync to s3'
'python-cerberus: configuration validator'
'python-cryptography: web server with authorization'
'python-requests-unixsocket: client report to web server by unix socket'
'python-matplotlib: usage statistics chart'
'python-requests-unixsocket2: client report to web server by unix socket'
'python-jinja: html report generation'
'python-systemd: journal support'
'rsync: sync by using rsync'
'subversion: -svn packages support')
'rsync: sync by using rsync')
source=("https://github.com/arcan1s/ahriman/releases/download/$pkgver/$pkgname-$pkgver.tar.gz"
'ahriman.sysusers'
'ahriman.tmpfiles')

View File

@ -1 +1,2 @@
d /var/lib/ahriman 0755 ahriman ahriman
d /var/lib/ahriman 0755 ahriman ahriman
d /run/ahriman 0755 ahriman ahriman

View File

@ -6,7 +6,7 @@ logging = ahriman.ini.d/logging.ini
; Perform database migrations on the application start. Do not touch this option unless you know what are you doing.
;apply_migrations = yes
; Path to the application SQLite database.
database = /var/lib/ahriman/ahriman.db
database = ${repository:root}/ahriman.db
[alpm]
; Path to pacman system database cache.
@ -17,6 +17,8 @@ mirror = https://geo.mirror.pkgbuild.com/$repo/os/$arch
repositories = core extra multilib
; Pacman's root directory. In the most cases it must point to the system root.
root = /
; Sync files databases too, which is required by deep dependencies check.
sync_files_database = yes
; Use local packages cache. If this option is enabled, the service will be able to synchronize databases (available
; as additional option for some subcommands). If set to no, databases must be synchronized manually.
use_ahriman_cache = yes
@ -32,6 +34,8 @@ allow_read_only = yes
; Cookie secret key to be used for cookies encryption. Must be valid 32 bytes URL-safe base64-encoded string.
; If not set, it will be generated automatically.
;cookie_secret_key =
; Name of the secondary group to be used as admin group in the service.
;full_access_group = wheel
; Authentication cookie expiration in seconds.
;max_age = 604800
; OAuth2 provider icon for the web interface.
@ -40,18 +44,26 @@ allow_read_only = yes
;oauth_provider = GoogleClient
; Scopes list for OAuth2 provider. Required if oauth is used.
;oauth_scopes = https://www.googleapis.com/auth/userinfo.email
; Allow login as root user (only if PAM is used).
;permit_root_login = no
; Optional password salt.
;salt =
[build]
; List of additional flags passed to archbuild command.
;archbuild_flags =
; Path to build command.
;build_command =
; List of packages to be ignored during automatic updates.
;ignore_packages =
; Include debug packages.
;include_debug_packages = yes
; List of additional flags passed to makechrootpkg command.
;makechrootpkg_flags =
; List of additional flags passed to makepkg command.
makepkg_flags = --nocolor --ignorearch
; List of paths to be used for implicit dependency scan. Regular expressions are supported.
scan_paths = ^usr/lib(?!/cmake).*$
; List of enabled triggers in the order of calls.
triggers = ahriman.core.gitremote.RemotePullTrigger ahriman.core.report.ReportTrigger ahriman.core.upload.UploadTrigger ahriman.core.gitremote.RemotePushTrigger
; List of well-known triggers. Used only for configuration purposes.
@ -107,9 +119,9 @@ host = 127.0.0.1
; Disable status (e.g. package status, logs, etc) endpoints. Useful for build only modes.
;service_only = no
; Path to directory with static files.
static_path = /usr/share/ahriman/templates/static
static_path = ${templates}/static
; List of directories with templates.
templates = /usr/share/ahriman/templates
templates = ${prefix}/share/ahriman/templates
; Path to unix socket. If none set, unix socket will be disabled.
;unix_socket =
; Allow unix socket to be world readable.
@ -200,14 +212,14 @@ target = console
; Console reporting trigger configuration sample.
[console]
; Trigger type name
; Trigger type name.
;type = console
; Use utf8 symbols in output.
use_utf = yes
; Email reporting trigger configuration sample.
[email]
; Trigger type name
; Trigger type name.
;type = email
; Optional URL to the repository homepage.
;homepage=
@ -223,6 +235,8 @@ use_utf = yes
;port =
; List of emails to receive the reports.
;receivers =
; Optional link to the RSS feed.
;rss_url =
; Sender email.
;sender =
; SMTP server SSL mode, one of ssl, starttls, disabled.
@ -232,13 +246,13 @@ template = email-index.jinja2
; Template name to be used for full packages list generation (same as HTML report).
;template_full =
; List of directories with templates.
templates = /usr/share/ahriman/templates
templates = ${prefix}/share/ahriman/templates
; SMTP user.
;user =
; HTML reporting trigger configuration sample.
[html]
; Trigger type name
; Trigger type name.
;type = html
; Optional URL to the repository homepage.
;homepage=
@ -246,14 +260,16 @@ templates = /usr/share/ahriman/templates
;link_path =
; Output path for the HTML report.
;path =
; Optional link to the RSS feed.
;rss_url =
; Template name to be used.
template = repo-index.jinja2
; List of directories with templates.
templates = /usr/share/ahriman/templates
templates = ${prefix}/share/ahriman/templates
; Remote service callback trigger configuration sample.
[remote-call]
; Trigger type name
; Trigger type name.
;type = remote-call
; Call for AUR packages update.
;aur = no
@ -264,9 +280,26 @@ templates = /usr/share/ahriman/templates
; Wait until remote process will be terminated in seconds.
;wait_timeout = -1
; RSS reporting trigger configuration sample.
[rss]
; Trigger type name.
;type = rss
; Optional URL to the repository homepage.
;homepage=
; Prefix for packages links. Link to a package will be formed as link_path / filename.
;link_path =
; Output path for the RSS report.
;path =
; Optional link to the RSS feed.
;rss_url =
; Template name to be used.
template = rss.jinja2
; List of directories with templates.
templates = ${prefix}/share/ahriman/templates
; Telegram reporting trigger configuration sample.
[telegram]
; Trigger type name
; Trigger type name.
;type = telegram
; Telegram bot API key.
;api_key =
@ -276,12 +309,14 @@ templates = /usr/share/ahriman/templates
;homepage=
; Prefix for packages links. Link to a package will be formed as link_path / filename.
;link_path =
; Optional link to the RSS feed.
;rss_url =
; Template name to be used.
template = telegram-index.jinja2
; Telegram specific template mode, one of MarkdownV2, HTML or Markdown.
;template_type = HTML
; List of directories with templates.
templates = /usr/share/ahriman/templates
templates = ${prefix}/share/ahriman/templates
; HTTP request timeout in seconds.
;timeout = 30
@ -292,7 +327,7 @@ target =
; GitHub upload trigger configuration sample.
[github]
; Trigger type name
; Trigger type name.
;type = github
; GitHub repository owner username.
;owner =
@ -309,14 +344,14 @@ target =
; Remote instance upload trigger configuration sample.
[remote-service]
; Trigger type name
; Trigger type name.
;type = remote-service
; HTTP request timeout in seconds.
;timeout = 30
; rsync upload trigger configuration sample.
[rsync]
; Trigger type name
; Trigger type name.
;type = rsync
; rsync command to run.
command = rsync --archive --compress --partial --delete
@ -326,7 +361,7 @@ command = rsync --archive --compress --partial --delete
; S3 upload trigger configuration sample.
[s3]
; Trigger type name
; Trigger type name.
;type = s3
; AWS services access key.
;access_key =

View File

@ -1,5 +1,5 @@
[loggers]
keys = root,http,stderr,boto3,botocore,nose,s3transfer
keys = root,http,stderr,boto3,botocore,nose,s3transfer,sql
[handlers]
keys = console_handler,journald_handler,syslog_handler
@ -64,3 +64,8 @@ propagate = 0
level = INFO
qualname = s3transfer
propagate = 0
[logger_sql]
level = INFO
qualname = sql
propagate = 0

View File

@ -6,8 +6,8 @@
<meta name="viewport" content="width=device-width, initial-scale=1, shrink-to-fit=no">
<!-- Embed elements Elements via Web Component -->
<script src="https://cdn.jsdelivr.net/npm/@stoplight/elements@7.13.7/web-components.min.js" integrity="sha384-aKMPitODat9Dqj3Mva9Rs9jS5Z3KPSW0sFlAOazuULJMFYhAfmORI5SlH9aWIst8" crossorigin="anonymous" type="application/javascript"></script>
<link rel="stylesheet" href="https://cdn.jsdelivr.net/npm/@stoplight/elements@7.13.7/styles.min.css" integrity="sha384-wPzTs1aFAoGq9gqp9NAs2YVTkFXcU2d6Bx11aKRFhVw2B7o1bCwaV9pGHTlUfD2+" crossorigin="anonymous" type="text/css">
<script src="https://cdn.jsdelivr.net/npm/@stoplight/elements@7.13.7/web-components.min.js" crossorigin="anonymous" type="application/javascript"></script>
<link rel="stylesheet" href="https://cdn.jsdelivr.net/npm/@stoplight/elements@7.13.7/styles.min.css" crossorigin="anonymous" type="text/css">
</head>
<body>

View File

@ -44,28 +44,28 @@
</button>
<ul class="dropdown-menu">
<li>
<button id="package-add-button" class="btn dropdown-item" data-bs-toggle="modal" data-bs-target="#package-add-modal" hidden>
<button id="package-add-button" class="btn dropdown-item" data-bs-toggle="modal" data-bs-target="#package-add-modal">
<i class="bi bi-plus"></i> add
</button>
</li>
<li>
<button id="package-update-button" class="btn dropdown-item" onclick="packagesUpdate()" hidden>
<button id="package-update-button" class="btn dropdown-item" onclick="packagesUpdate()">
<i class="bi bi-play"></i> update
</button>
</li>
<li>
<button id="package-rebuild-button" class="btn dropdown-item" data-bs-toggle="modal" data-bs-target="#package-rebuild-modal" hidden>
<button id="package-rebuild-button" class="btn dropdown-item" data-bs-toggle="modal" data-bs-target="#package-rebuild-modal">
<i class="bi bi-arrow-clockwise"></i> rebuild
</button>
</li>
<li>
<button id="package-remove-button" class="btn dropdown-item" onclick="packagesRemove()" disabled hidden>
<button id="package-remove-button" class="btn dropdown-item" onclick="packagesRemove()" disabled>
<i class="bi bi-trash"></i> remove
</button>
</li>
</ul>
<button id="key-import-button" type="button" class="btn btn-info" data-bs-toggle="modal" data-bs-target="#key-import-modal" hidden>
<button id="key-import-button" type="button" class="btn btn-info" data-bs-toggle="modal" data-bs-target="#key-import-modal">
<i class="bi bi-key"></i><span class="d-none d-sm-inline"> import key</span>
</button>
{% endif %}

View File

@ -1,8 +1,12 @@
<script>
const alertPlaceholder = $("#alert-placeholder");
const alertPlaceholder = document.getElementById("alert-placeholder");
function createAlert(title, message, clz, action, id) {
id ??= md5(title + message); // MD5 id from the content
if (alertPlaceholder.querySelector(`#alert-${id}`)) return; // check if there are duplicates
function createAlert(title, message, clz, action) {
const wrapper = document.createElement("div");
wrapper.id = `alert-${id}`;
wrapper.classList.add("toast", clz);
wrapper.role = "alert";
wrapper.ariaLive = "assertive";
@ -19,21 +23,21 @@
body.innerText = message;
wrapper.appendChild(body);
alertPlaceholder.append(wrapper);
alertPlaceholder.appendChild(wrapper);
const toast = new bootstrap.Toast(wrapper);
wrapper.addEventListener("hidden.bs.toast", () => {
wrapper.addEventListener("hidden.bs.toast", _ => {
wrapper.remove(); // bootstrap doesn't remove elements
(action || reload)();
});
toast.show();
}
function showFailure(title, description, jqXHR, errorThrown) {
function showFailure(title, description, error) {
let details;
try {
details = $.parseJSON(jqXHR.responseText).error; // execution handler json error response
details = JSON.parse(error.text).error; // execution handler json error response
} catch (_) {
details = errorThrown;
details = error.text ?? error.message ?? error;
}
createAlert(title, description(details), "text-bg-danger");
}

View File

@ -36,58 +36,69 @@
</div>
<script>
const keyImportModal = $("#key-import-modal");
const keyImportForm = $("#key-import-form");
keyImportModal.on("hidden.bs.modal", () => {
keyImportBodyInput.text("");
keyImportForm.trigger("reset");
});
const keyImportModal = document.getElementById("key-import-modal");
const keyImportForm = document.getElementById("key-import-form");
const keyImportBodyInput = $("#key-import-body-input");
const keyImportCopyButton = $("#key-import-copy-button");
const keyImportBodyInput = document.getElementById("key-import-body-input");
const keyImportCopyButton = document.getElementById("key-import-copy-button");
const keyImportFingerprintInput = $("#key-import-fingerprint-input");
const keyImportServerInput = $("#key-import-server-input");
const keyImportFingerprintInput = document.getElementById("key-import-fingerprint-input");
const keyImportServerInput = document.getElementById("key-import-server-input");
async function copyPgpKey() {
const logs = keyImportBodyInput.text();
await copyToClipboard(logs, keyImportCopyButton);
const key = keyImportBodyInput.textContent;
await copyToClipboard(key, keyImportCopyButton);
}
function fetchPgpKey() {
const key = keyImportFingerprintInput.val();
const server = keyImportServerInput.val();
const key = keyImportFingerprintInput.value;
const server = keyImportServerInput.value;
if (key && server) {
$.ajax({
url: "/api/v1/service/pgp",
data: {"key": key, "server": server},
type: "GET",
dataType: "json",
success: response => { keyImportBodyInput.text(response.key); },
});
makeRequest(
"/api/v1/service/pgp",
{
query: {
key: key,
server: server,
},
convert: response => response.json(),
},
data => { keyImportBodyInput.textContent = data.key; },
);
}
}
function importPgpKey() {
const key = keyImportFingerprintInput.val();
const server = keyImportServerInput.val();
const key = keyImportFingerprintInput.value;
const server = keyImportServerInput.value;
if (key && server) {
$.ajax({
url: "/api/v1/service/pgp",
data: JSON.stringify({key: key, server: server}),
type: "POST",
contentType: "application/json",
success: _ => {
keyImportModal.modal("hide");
makeRequest(
"/api/v1/service/pgp",
{
method: "POST",
json: {
key: key,
server: server,
},
},
_ => {
bootstrap.Modal.getOrCreateInstance(keyImportModal).hide();
showSuccess("Success", `Key ${key} has been imported`);
},
error: (jqXHR, _, errorThrown) => {
error => {
const message = _ => `Could not import key ${key} from ${server}`;
showFailure("Action failed", message, jqXHR, errorThrown);
showFailure("Action failed", message, error);
},
});
);
}
}
ready(_ => {
keyImportModal.addEventListener("hidden.bs.modal", _ => {
keyImportBodyInput.textContent = "";
keyImportForm.reset();
});
});
</script>

View File

@ -34,47 +34,57 @@
</div>
<script>
const loginModal = $("#login-modal");
const loginForm = $("#login-form");
loginModal.on("hidden.bs.modal", () => {
loginForm.trigger("reset");
});
const loginModal = document.getElementById("login-modal");
const loginForm = document.getElementById("login-form");
const loginPasswordInput = $("#login-password");
const loginUsernameInput = $("#login-username");
const showHidePasswordButton = $("#login-show-hide-password-button");
const loginPasswordInput = document.getElementById("login-password");
const loginUsernameInput = document.getElementById("login-username");
const showHidePasswordButton = document.getElementById("login-show-hide-password-button");
function login() {
const password = loginPasswordInput.val();
const username = loginUsernameInput.val();
const password = loginPasswordInput.value;
const username = loginUsernameInput.value;
if (username && password) {
$.ajax({
url: "/api/v1/login",
data: JSON.stringify({username: username, password: password}),
type: "POST",
contentType: "application/json",
success: _ => {
loginModal.modal("hide");
showSuccess("Logged in", `Successfully logged in as ${username}`, () => location.href = "/");
makeRequest(
"/api/v1/login",
{
method: "POST",
json: {
username: username,
password: password,
},
},
error: (jqXHR, _, errorThrown) => {
const message = _ => `Could not login as ${username}`;
showFailure("Login error", message, jqXHR, errorThrown);
_ => {
bootstrap.Modal.getOrCreateInstance(loginModal).hide();
showSuccess("Logged in", `Successfully logged in as ${username}`, _ => location.href = "/");
},
});
error => {
const message = _ =>
username === "admin" && password === "admin"
? "You've entered a password for user \"root\", did you make a typo in username?"
: `Could not login as ${username}`;
showFailure("Login error", message, error);
},
);
}
}
function showPassword() {
if (loginPasswordInput.attr("type") === "password") {
loginPasswordInput.attr("type", "text");
showHidePasswordButton.removeClass("bi-eye");
showHidePasswordButton.addClass("bi-eye-slash");
if (loginPasswordInput.getAttribute("type") === "password") {
loginPasswordInput.setAttribute("type", "text");
showHidePasswordButton.classList.remove("bi-eye");
showHidePasswordButton.classList.add("bi-eye-slash");
} else {
loginPasswordInput.attr("type", "password");
showHidePasswordButton.removeClass("bi-eye-slash");
showHidePasswordButton.addClass("bi-eye");
loginPasswordInput.setAttribute("type", "password");
showHidePasswordButton.classList.remove("bi-eye-slash");
showHidePasswordButton.classList.add("bi-eye");
}
}
ready(_ => {
loginModal.addEventListener("hidden.bs.modal", _ => {
loginForm.reset();
});
});
</script>

View File

@ -41,45 +41,14 @@
</div>
<script>
const packageAddModal = $("#package-add-modal");
const packageAddForm = $("#package-add-form");
packageAddModal.on("shown.bs.modal", () => {
$(`#package-add-repository-input option[value="${repository.architecture}-${repository.repository}"]`).prop("selected", true);
});
packageAddModal.on("hidden.bs.modal", () => {
packageAddVariablesDiv.empty();
packageAddForm.trigger("reset");
});
const packageAddModal = document.getElementById("package-add-modal");
const packageAddForm = document.getElementById("package-add-form");
const packageAddInput = $("#package-add-input");
const packageAddRepositoryInput = $("#package-add-repository-input");
const packageAddKnownPackagesList = $("#package-add-known-packages-dlist");
packageAddInput.keyup(() => {
clearTimeout(packageAddInput.data("timeout"));
packageAddInput.data("timeout", setTimeout($.proxy(() => {
const value = packageAddInput.val();
const packageAddInput = document.getElementById("package-add-input");
const packageAddRepositoryInput = document.getElementById("package-add-repository-input");
const packageAddKnownPackagesList = document.getElementById("package-add-known-packages-dlist");
if (value.length >= 3) {
$.ajax({
url: "/api/v1/service/search",
data: {"for": value},
type: "GET",
dataType: "json",
success: response => {
const options = response.map(pkg => {
const option = document.createElement("option");
option.value = pkg.package;
option.innerText = `${pkg.package} (${pkg.description})`;
return option;
});
packageAddKnownPackagesList.empty().append(options);
},
});
}
}, this), 500));
});
const packageAddVariablesDiv = $("#package-add-variables-div");
const packageAddVariablesDiv = document.getElementById("package-add-variables-div");
function packageAddVariableInputCreate() {
const variableInput = document.createElement("div");
@ -109,7 +78,7 @@
variableButtonRemove.classList.add("btn");
variableButtonRemove.classList.add("btn-outline-danger");
variableButtonRemove.innerHTML = "<i class=\"bi bi-trash\"></i>";
variableButtonRemove.onclick = _ => { return variableInput.remove(); };
variableButtonRemove.onclick = _ => { variableInput.remove(); };
// bring them together
variableInput.appendChild(variableNameInput);
@ -117,27 +86,26 @@
variableInput.appendChild(variableValueInput);
variableInput.appendChild(variableButtonRemove);
packageAddVariablesDiv.append(variableInput);
packageAddVariablesDiv.appendChild(variableInput);
}
function patchesParse() {
const patches = packageAddVariablesDiv.find(".package-add-variable").map((_, element) => {
const richElement = $(element);
const patches = Array.from(packageAddVariablesDiv.getElementsByClassName("package-add-variable")).map(element => {
return {
key: richElement.find(".package-add-variable-name").val(),
value: richElement.find(".package-add-variable-value").val(),
key: element.querySelector(".package-add-variable-name").value,
value: element.querySelector(".package-add-variable-value").value,
};
}).filter((_, patch) => patch.key).get();
}).filter(patch => patch.key);
return {patches: patches};
}
function packagesAdd(packages, patches, repository) {
packages = packages ?? packageAddInput.val();
packages = packages ?? packageAddInput.value;
patches = patches ?? patchesParse();
repository = repository ?? getRepositorySelector(packageAddRepositoryInput);
if (packages) {
packageAddModal.modal("hide");
bootstrap.Modal.getOrCreateInstance(packageAddModal).hide();
const onSuccess = update => `Packages ${update} have been added`;
const onFailure = error => `Package addition failed: ${error}`;
doPackageAction("/api/v1/service/add", [packages], repository, onSuccess, onFailure, patches);
@ -145,15 +113,54 @@
}
function packagesRequest(packages, patches) {
packages = packages ?? packageAddInput.val();
packages = packages ?? packageAddInput.value;
patches = patches ?? patchesParse();
const repository = getRepositorySelector(packageAddRepositoryInput);
if (packages) {
packageAddModal.modal("hide");
bootstrap.Modal.getOrCreateInstance(packageAddModal).hide();
const onSuccess = update => `Packages ${update} have been requested`;
const onFailure = error => `Package request failed: ${error}`;
doPackageAction("/api/v1/service/request", [packages], repository, onSuccess, onFailure, patches);
}
}
ready(_ => {
packageAddModal.addEventListener("shown.bs.modal", _ => {
const option = packageAddRepositoryInput.querySelector(`option[value="${repository.architecture}-${repository.repository}"]`);
option.selected = "selected";
});
packageAddModal.addEventListener("hidden.bs.modal", _ => {
packageAddVariablesDiv.replaceChildren();
packageAddForm.reset();
});
packageAddInput.addEventListener("keyup", _ => {
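// debounce: each keystroke resets the pending timer, so the search request below fires only after 500 ms without typing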
clearTimeout(packageAddInput.requestTimeout);
packageAddInput.requestTimeout = setTimeout(_ => {
const value = packageAddInput.value;
if (value.length >= 3) {
makeRequest(
"/api/v1/service/search",
{
query: {
for: value,
},
convert: response => response.json(),
},
data => {
const options = data.map(pkg => {
const option = document.createElement("option");
option.value = pkg.package;
option.innerText = `${pkg.package} (${pkg.description})`;
return option;
});
packageAddKnownPackagesList.replaceChildren(...options);
},
);
}
}, 500);
});
});
</script>

View File

@ -45,8 +45,9 @@
<nav>
<div class="nav nav-tabs" role="tablist">
<button id="package-info-logs-button" class="nav-link active" data-bs-toggle="tab" data-bs-target="#package-info-logs" type="button" role="tab" aria-controls="package-info-logs" aria-selected="true"><h3>Build logs</h3></button>
<button id="package-info-changes-button" class="nav-link" data-bs-toggle="tab" data-bs-target="#package-info-changes" type="button" role="tab" aria-controls="package-info-changes" aria-selected="false"><h3>Changes</h3></button>
<button id="package-info-logs-button" class="nav-link active" data-bs-toggle="tab" data-bs-target="#package-info-logs" type="button" role="tab" aria-controls="package-info-logs" aria-selected="true">Build logs</button>
<button id="package-info-changes-button" class="nav-link" data-bs-toggle="tab" data-bs-target="#package-info-changes" type="button" role="tab" aria-controls="package-info-changes" aria-selected="false">Changes</button>
<button id="package-info-events-button" class="nav-link" data-bs-toggle="tab" data-bs-target="#package-info-events" type="button" role="tab" aria-controls="package-info-events" aria-selected="false">Events</button>
</div>
</nav>
<div class="tab-content" id="nav-tabContent">
@ -56,11 +57,30 @@
<div id="package-info-changes" class="tab-pane fade" role="tabpanel" aria-labelledby="package-info-changes-button" tabindex="0">
<pre class="language-diff"><code id="package-info-changes-input" class="pre-scrollable language-diff"></code><button id="package-info-changes-copy-button" type="button" class="btn language-diff" onclick="copyChanges()"><i class="bi bi-clipboard"></i> copy</button></pre>
</div>
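<!-- events tab: the table below is populated by loadEvents(); the chart canvas stays hidden unless "package-updated" events are returned -->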
<div id="package-info-events" class="tab-pane fade" role="tabpanel" aria-labelledby="package-info-events-button" tabindex="0">
<canvas id="package-info-events-update-chart" hidden></canvas>
<table id="package-info-events-table"
data-classes="table table-hover"
data-sortable="true"
data-sort-name="timestamp"
data-sort-order="desc"
data-toggle="table">
<thead class="table-primary">
<tr>
<th data-align="right" data-field="timestamp">date</th>
<th data-field="event">event</th>
<th data-field="message">description</th>
</tr>
</thead>
</table>
</div>
</div>
</div>
<div class="modal-footer">
<button id="package-info-update-button" type="submit" class="btn btn-success" onclick="packageInfoUpdate()" data-bs-dismiss="modal" hidden><i class="bi bi-play"></i><span class="d-none d-sm-inline"> update</span></button>
<button id="package-info-remove-button" type="submit" class="btn btn-danger" onclick="packageInfoRemove()" data-bs-dismiss="modal" hidden><i class="bi bi-trash"></i><span class="d-none d-sm-inline"> remove</span></button>
{% if not auth.enabled or auth.username is not none %}
<button id="package-info-update-button" type="submit" class="btn btn-success" onclick="packageInfoUpdate()" data-bs-dismiss="modal"><i class="bi bi-play"></i><span class="d-none d-sm-inline"> update</span></button>
<button id="package-info-remove-button" type="submit" class="btn btn-danger" onclick="packageInfoRemove()" data-bs-dismiss="modal"><i class="bi bi-trash"></i><span class="d-none d-sm-inline"> remove</span></button>
{% endif %}
<button type="button" class="btn btn-secondary" onclick="showPackageInfo()"><i class="bi bi-arrow-clockwise"></i><span class="d-none d-sm-inline"> reload</span></button>
<button type="button" class="btn btn-primary" data-bs-dismiss="modal"><i class="bi bi-x"></i><span class="d-none d-sm-inline"> close</span></button>
</div>
@ -69,61 +89,54 @@
</div>
<script>
const packageInfoModal = $("#package-info-modal");
const packageInfoModalHeader = $("#package-info-modal-header");
const packageInfo = $("#package-info");
packageInfoModal.on("hidden.bs.modal", () => {
packageInfoAurUrl.empty();
packageInfoDepends.empty();
packageInfoGroups.empty();
packageInfoLicenses.empty();
packageInfoPackager.empty();
packageInfoPackages.empty();
packageInfoUpstreamUrl.empty();
packageInfoVersion.empty();
const packageInfoModal = document.getElementById("package-info-modal");
const packageInfoModalHeader = document.getElementById("package-info-modal-header");
const packageInfo = document.getElementById("package-info");
packageInfoVariablesBlock.attr("hidden", true);
packageInfoVariablesDiv.empty();
const packageInfoLogsInput = document.getElementById("package-info-logs-input");
const packageInfoLogsCopyButton = document.getElementById("package-info-logs-copy-button");
packageInfoLogsInput.empty();
packageInfoChangesInput.empty();
const packageInfoChangesInput = document.getElementById("package-info-changes-input");
const packageInfoChangesCopyButton = document.getElementById("package-info-changes-copy-button");
packageInfoModal.trigger("reset");
// so far bootstrap-table only operates with jquery elements
const packageInfoEventsTable = $(document.getElementById("package-info-events-table"));
const packageInfoEventsUpdateChartCanvas = document.getElementById("package-info-events-update-chart");
let packageInfoEventsUpdateChart = null;
hideInfoControls(true);
});
const packageInfoAurUrl = document.getElementById("package-info-aur-url");
const packageInfoDepends = document.getElementById("package-info-depends");
const packageInfoGroups = document.getElementById("package-info-groups");
const packageInfoLicenses = document.getElementById("package-info-licenses");
const packageInfoPackager = document.getElementById("package-info-packager");
const packageInfoPackages = document.getElementById("package-info-packages");
const packageInfoUpstreamUrl = document.getElementById("package-info-upstream-url");
const packageInfoVersion = document.getElementById("package-info-version");
const packageInfoLogsInput = $("#package-info-logs-input");
const packageInfoLogsCopyButton = $("#package-info-logs-copy-button");
const packageInfoVariablesBlock = document.getElementById("package-info-variables-block");
const packageInfoVariablesDiv = document.getElementById("package-info-variables-div");
const packageInfoChangesInput = $("#package-info-changes-input");
const packageInfoChangesCopyButton = $("#package-info-changes-copy-button");
const packageInfoAurUrl = $("#package-info-aur-url");
const packageInfoDepends = $("#package-info-depends");
const packageInfoGroups = $("#package-info-groups");
const packageInfoLicenses = $("#package-info-licenses");
const packageInfoPackager = $("#package-info-packager");
const packageInfoPackages = $("#package-info-packages");
const packageInfoUpstreamUrl = $("#package-info-upstream-url");
const packageInfoVersion = $("#package-info-version");
const packageInfoVariablesBlock = $("#package-info-variables-block");
const packageInfoVariablesDiv = $("#package-info-variables-div");
function clearChart() {
packageInfoEventsUpdateChartCanvas.hidden = true;
if (packageInfoEventsUpdateChart) {
packageInfoEventsUpdateChart.data = {};
packageInfoEventsUpdateChart.update();
}
}
async function copyChanges() {
const changes = packageInfoChangesInput.text();
const changes = packageInfoChangesInput.textContent;
await copyToClipboard(changes, packageInfoChangesCopyButton);
}
async function copyLogs() {
const logs = packageInfoLogsInput.text();
const logs = packageInfoLogsInput.textContent;
await copyToClipboard(logs, packageInfoLogsCopyButton);
}
function hideInfoControls(hidden) {
packageInfoRemoveButton.attr("hidden", hidden);
packageInfoUpdateButton.attr("hidden", hidden);
function highlight(element) {
delete element.dataset.highlighted;
hljs.highlightElement(element);
}
function insertVariable(packageBase, variable) {
@ -150,12 +163,13 @@
variableButtonRemove.classList.add("btn-outline-danger");
variableButtonRemove.innerHTML = "<i class=\"bi bi-trash\"></i>";
variableButtonRemove.onclick = _ => {
$.ajax({
url: `/api/v1/packages/${packageBase}/patches/${variable.key}`,
type: "DELETE",
dataType: "json",
success: _ => variableInput.remove(),
});
makeRequest(
`/api/v1/packages/${packageBase}/patches/${variable.key}`,
{
method: "DELETE",
},
_ => variableInput.remove(),
);
};
// bring them together
@ -164,45 +178,93 @@
variableInput.appendChild(variableValueInput);
variableInput.appendChild(variableButtonRemove);
packageInfoVariablesDiv.append(variableInput);
packageInfoVariablesDiv.appendChild(variableInput);
}
function loadChanges(packageBase, onFailure) {
$.ajax({
url: `/api/v1/packages/${packageBase}/changes`,
data: {
architecture: repository.architecture,
repository: repository.repository,
makeRequest(
`/api/v1/packages/${packageBase}/changes`,
{
query: {
architecture: repository.architecture,
repository: repository.repository,
},
convert: response => response.json(),
},
type: "GET",
dataType: "json",
success: response => {
const changes = response.changes;
packageInfoChangesInput.text(changes || "");
packageInfoChangesInput.map((_, el) => hljs.highlightElement(el));
data => {
const changes = data.changes;
packageInfoChangesInput.textContent = changes ?? "";
highlight(packageInfoChangesInput);
},
error: onFailure,
});
onFailure,
);
}
function loadEvents(packageBase, onFailure) {
packageInfoEventsTable.bootstrapTable("showLoading");
clearChart();
makeRequest(
"/api/v1/events",
{
query: {
architecture: repository.architecture,
repository: repository.repository,
object_id: packageBase,
limit: 30,
},
convert: response => response.json(),
},
data => {
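// event.created is a unix timestamp in seconds, hence the multiplication by 1000 for the JS Date constructor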
const events = data.map(event => {
return {
timestamp: new Date(1000 * event.created).toISOStringShort(),
event: event.event,
message: event.message || "",
};
});
const chart = data.filter(event => event.event === "package-updated");
packageInfoEventsTable.bootstrapTable("load", events);
packageInfoEventsTable.bootstrapTable("hideLoading");
if (packageInfoEventsUpdateChart) {
packageInfoEventsUpdateChart.config.data = {
labels: chart.map(event => new Date(1000 * event.created).toISOStringShort()),
datasets: [{
label: "update duration, s",
data: chart.map(event => event.data.took),
cubicInterpolationMode: "monotone",
tension: 0.4,
}],
};
packageInfoEventsUpdateChart.update();
}
packageInfoEventsUpdateChartCanvas.hidden = !chart.length;
},
onFailure,
);
}
function loadLogs(packageBase, onFailure) {
$.ajax({
url: `/api/v2/packages/${packageBase}/logs`,
data: {
architecture: repository.architecture,
repository: repository.repository,
makeRequest(
`/api/v2/packages/${packageBase}/logs`,
{
query: {
architecture: repository.architecture,
repository: repository.repository,
},
convert: response => response.json(),
},
type: "GET",
dataType: "json",
success: response => {
const logs = response.map(log_record => {
data => {
const logs = data.map(log_record => {
return `[${new Date(1000 * log_record.created).toISOString()}] ${log_record.message}`;
});
packageInfoLogsInput.text(logs.join("\n"));
packageInfoLogsInput.map((_, el) => hljs.highlightElement(el));
packageInfoLogsInput.textContent = logs.join("\n");
highlight(packageInfoLogsInput);
},
error: onFailure,
});
onFailure,
);
}
function loadPackage(packageBase, onFailure) {
@ -214,16 +276,17 @@
return ["bg-secondary", "text-white"];
};
$.ajax({
url: `/api/v1/packages/${packageBase}`,
data: {
architecture: repository.architecture,
repository: repository.repository,
makeRequest(
`/api/v1/packages/${packageBase}`,
{
query: {
architecture: repository.architecture,
repository: repository.repository,
},
convert: response => response.json(),
},
type: "GET",
dataType: "json",
success: response => {
const description = response.find(Boolean);
data => {
const description = data.find(Boolean);
const packages = Object.keys(description.package.packages);
const aurUrl = description.package.remote.web_url;
const upstreamUrls = Array.from(
@ -233,80 +296,111 @@
)
).sort();
packageInfo.text(`${description.package.base} ${description.status.status} at ${new Date(1000 * description.status.timestamp).toISOStringShort()}`);
packageInfo.textContent = `${description.package.base} ${description.status.status} at ${new Date(1000 * description.status.timestamp).toISOStringShort()}`;
packageInfoModalHeader.removeClass();
packageInfoModalHeader.addClass("modal-header");
headerClass(description.status.status).forEach(clz => packageInfoModalHeader.addClass(clz));
packageInfoModalHeader.classList.remove(...packageInfoModalHeader.classList);
packageInfoModalHeader.classList.add("modal-header");
headerClass(description.status.status).forEach(clz => packageInfoModalHeader.classList.add(clz));
packageInfoAurUrl.html(aurUrl ? safeLink(aurUrl, aurUrl, "AUR link").outerHTML : "");
packageInfoDepends.html(listToTable(
packageInfoAurUrl.innerHTML = aurUrl ? safeLink(aurUrl, aurUrl, "AUR link").outerHTML : "";
packageInfoDepends.innerHTML = listToTable(
Object.values(description.package.packages)
.reduce((accumulator, currentValue) => {
return accumulator.concat(currentValue.depends.filter(v => packages.indexOf(v) === -1))
.concat(currentValue.make_depends.filter(v => packages.indexOf(v) === -1).map(v => `${v} (make)`))
.concat(currentValue.opt_depends.filter(v => packages.indexOf(v) === -1).map(v => `${v} (optional)`));
}, [])
));
packageInfoGroups.html(listToTable(extractListProperties(description.package, "groups")));
packageInfoLicenses.html(listToTable(extractListProperties(description.package, "licenses")));
packageInfoPackager.text(description.package.packager);
packageInfoPackages.html(listToTable(packages));
packageInfoUpstreamUrl.html(upstreamUrls.map(url => safeLink(url, url, "upstream link").outerHTML).join("<br>"));
packageInfoVersion.text(description.package.version);
hideInfoControls(false);
);
packageInfoGroups.innerHTML = listToTable(extractListProperties(description.package, "groups"));
packageInfoLicenses.innerHTML = listToTable(extractListProperties(description.package, "licenses"));
packageInfoPackager.textContent = description.package.packager;
packageInfoPackages.innerHTML = listToTable(packages);
packageInfoUpstreamUrl.innerHTML = upstreamUrls.map(url => safeLink(url, url, "upstream link").outerHTML).join("<br>");
packageInfoVersion.textContent = description.package.version;
},
error: (jqXHR, _, errorThrown) => {
hideInfoControls(true);
onFailure(jqXHR, null, errorThrown);
},
});
onFailure,
);
}
function loadPatches(packageBase, onFailure) {
$.ajax({
url: `/api/v1/packages/${packageBase}/patches`,
type: "GET",
dataType: "json",
success: response => {
packageInfoVariablesDiv.empty();
response.map(patch => insertVariable(packageBase, patch));
packageInfoVariablesBlock.attr("hidden", response.length === 0);
makeRequest(
`/api/v1/packages/${packageBase}/patches`,
{
convert: response => response.json(),
},
error: onFailure,
});
data => {
packageInfoVariablesDiv.replaceChildren();
data.map(patch => insertVariable(packageBase, patch));
packageInfoVariablesBlock.hidden = !data.length;
},
onFailure,
);
}
function packageInfoRemove() {
const packageBase = packageInfoModal.data("package");
if (packageBase) return packagesRemove([packageBase]);
const packageBase = packageInfoModal.package;
packagesRemove([packageBase]);
}
function packageInfoUpdate() {
const packageBase = packageInfoModal.data("package");
if (packageBase) return packagesAdd(packageBase, [], repository);
const packageBase = packageInfoModal.package;
packagesAdd(packageBase, [], repository);
}
function showPackageInfo(packageBase) {
const isPackageBaseSet = packageBase !== undefined;
if (isPackageBaseSet)
packageInfoModal.data("package", packageBase); // set package base as currently used
else
packageBase = packageInfoModal.data("package"); // read package base from the current window attribute
if (isPackageBaseSet) {
// set package base as currently used
packageInfoModal.package = packageBase;
} else {
// read package base from the current window attribute
packageBase = packageInfoModal.package;
}
const onFailure = (jqXHR, _, errorThrown) => {
const onFailure = error => {
if (isPackageBaseSet) {
const message = error => `Could not load package ${packageBase} info: ${error}`;
showFailure("Load failure", message, jqXHR, errorThrown);
const message = details => `Could not load package ${packageBase} info: ${details}`;
showFailure("Load failure", message, error);
}
};
loadPackage(packageBase, onFailure);
loadPatches(packageBase, onFailure);
loadLogs(packageBase, onFailure);
loadChanges(packageBase, onFailure)
loadChanges(packageBase, onFailure);
loadEvents(packageBase, onFailure);
if (isPackageBaseSet) packageInfoModal.modal("show");
if (isPackageBaseSet) {
bootstrap.Modal.getOrCreateInstance(packageInfoModal).show();
}
}
ready(_ => {
packageInfoEventsUpdateChart = new Chart(packageInfoEventsUpdateChartCanvas, {
type: "line",
data: {},
options: {
responsive: true,
},
});
packageInfoModal.addEventListener("hidden.bs.modal", _ => {
packageInfoAurUrl.textContent = "";
packageInfoDepends.textContent = "";
packageInfoGroups.textContent = "";
packageInfoLicenses.textContent = "";
packageInfoPackager.textContent = "";
packageInfoPackages.textContent = "";
packageInfoUpstreamUrl.textContent = "";
packageInfoVersion.textContent = "";
packageInfoVariablesBlock.hidden = true;
packageInfoVariablesDiv.replaceChildren();
packageInfoLogsInput.textContent = "";
packageInfoChangesInput.textContent = "";
packageInfoEventsTable.bootstrapTable("load", []);
clearChart();
});
});
</script>

View File

@ -33,25 +33,31 @@
</div>
<script>
const packageRebuildModal = $("#package-rebuild-modal");
const packageRebuildForm = $("#package-rebuild-form");
packageRebuildModal.on("shown.bs.modal", () => {
$(`#package-rebuild-repository-input option[value="${repository.architecture}-${repository.repository}"]`).prop("selected", true);
const packageRebuildModal = document.getElementById("package-rebuild-modal");
const packageRebuildForm = document.getElementById("package-rebuild-form");
});
packageRebuildModal.on("hidden.bs.modal", () => { packageRebuildForm.trigger("reset"); });
const packageRebuildDependencyInput = $("#package-rebuild-dependency-input");
const packageRebuildRepositoryInput = $("#package-rebuild-repository-input");
const packageRebuildDependencyInput = document.getElementById("package-rebuild-dependency-input");
const packageRebuildRepositoryInput = document.getElementById("package-rebuild-repository-input");
function packagesRebuild() {
const packages = packageRebuildDependencyInput.val();
const packages = packageRebuildDependencyInput.value;
const repository = getRepositorySelector(packageRebuildRepositoryInput);
if (packages) {
packageRebuildModal.modal("hide");
bootstrap.Modal.getOrCreateInstance(packageRebuildModal).hide();
const onSuccess = update => `Repository rebuild has been run for packages which depend on ${update}`;
const onFailure = error => `Repository rebuild failed: ${error}`;
doPackageAction("/api/v1/service/rebuild", [packages], repository, onSuccess, onFailure);
}
}
ready(_ => {
packageRebuildModal.addEventListener("shown.bs.modal", _ => {
const option = packageRebuildRepositoryInput.querySelector(`option[value="${repository.architecture}-${repository.repository}"]`);
option.selected = "selected";
});
packageRebuildModal.addEventListener("hidden.bs.modal", _ => {
packageRebuildForm.reset();
});
});
</script>

View File

@ -1,77 +1,34 @@
<script>
const keyImportButton = $("#key-import-button");
const packageAddButton = $("#package-add-button");
const packageRebuildButton = $("#package-rebuild-button");
const packageRemoveButton = $("#package-remove-button");
const packageUpdateButton = $("#package-update-button");
const packageInfoRemoveButton = $("#package-info-remove-button");
const packageInfoUpdateButton = $("#package-info-update-button");
const packageRemoveButton = document.getElementById("package-remove-button");
const packageUpdateButton = document.getElementById("package-update-button");
let repository = null;
$("#repositories a").on("click", (event) => {
const element = event.target;
repository = {
architecture: element.dataset.architecture,
repository: element.dataset.repository,
};
packageUpdateButton.html(`<i class="bi bi-play"></i> update<span class="d-none d-sm-inline"> ${safe(repository.repository)} (${safe(repository.architecture)})</span>`);
$(`#${element.id}`).tab("show");
reload();
});
const table = $("#packages");
table.on("check.bs.table uncheck.bs.table check-all.bs.table uncheck-all.bs.table", () => {
packageRemoveButton.prop("disabled", !table.bootstrapTable("getSelections").length);
});
table.on("click-row.bs.table", (self, data, row, cell) => {
if (0 === cell || "base" === cell) {
const method = data[0] === true ? "uncheckBy" : "checkBy"; // fck javascript
table.bootstrapTable(method, {field: "id", values: [data.id]});
} else showPackageInfo(data.id);
});
table.on("created-controls.bs.table", () => {
const pickerInput = $(".bootstrap-table-filter-control-timestamp");
pickerInput.daterangepicker({
autoUpdateInput: false,
locale: {
cancelLabel: "Clear",
},
});
// so far bootstrap-table only operates with jquery elements
const table = $(document.getElementById("packages"));
pickerInput.on("apply.daterangepicker", (event, picker) => {
pickerInput.val(`${picker.startDate.format("YYYY-MM-DD")} - ${picker.endDate.format("YYYY-MM-DD")}`);
table.bootstrapTable("triggerSearch");
});
pickerInput.on("cancel.daterangepicker", () => {
pickerInput.val("");
table.bootstrapTable("triggerSearch");
});
});
const statusBadge = $("#badge-status");
const versionBadge = $("#badge-version");
const statusBadge = document.getElementById("badge-status");
const versionBadge = document.getElementById("badge-version");
function doPackageAction(uri, packages, repository, successText, failureText, data) {
const queryParams = $.param({
architecture: repository.architecture,
repository: repository.repository,
}); // it will never be empty btw
$.ajax({
url: `${uri}?${queryParams}`,
data: JSON.stringify(Object.assign({}, {packages: packages}, data || {})),
type: "POST",
contentType: "application/json",
success: _ => {
makeRequest(
uri,
{
method: "POST",
query: {
architecture: repository.architecture,
repository: repository.repository,
},
json: Object.assign({}, {packages: packages}, data || {}),
},
_ => {
const message = successText(packages.join(", "));
showSuccess("Success", message);
},
error: (jqXHR, _, errorThrown) => {
showFailure("Action failed", failureText, jqXHR, errorThrown);
error => {
showFailure("Action failed", failureText, error);
},
});
);
}
function filterListGroups() {
@ -87,10 +44,10 @@
}
function getRepositorySelector(selector) {
const selected = selector.find(":selected");
const selected = selector.options[selector.selectedIndex];
return {
architecture: selected.data("architecture"),
repository: selected.data("repository"),
architecture: selected.getAttribute("data-architecture"),
repository: selected.getAttribute("data-repository"),
};
}
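// assumed markup consumed by getRepositorySelector (illustrative values only):
// <option value="x86_64-some-repo" data-architecture="x86_64" data-repository="some-repo">some-repo (x86_64)</option>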
@ -98,14 +55,6 @@
return table.bootstrapTable("getSelections").map(row => row.id);
}
function hideControls(hidden) {
keyImportButton.attr("hidden", hidden);
packageAddButton.attr("hidden", hidden);
packageRebuildButton.attr("hidden", hidden);
packageRemoveButton.attr("hidden", hidden);
packageUpdateButton.attr("hidden", hidden);
}
function packagesRemove(packages) {
packages = packages ?? getSelection();
const onSuccess = update => `Packages ${update} have been removed`;
@ -135,16 +84,17 @@
return "btn-outline-secondary";
};
$.ajax({
url: "/api/v1/packages",
data: {
architecture: repository.architecture,
repository: repository.repository,
makeRequest(
"/api/v1/packages",
{
query: {
architecture: repository.architecture,
repository: repository.repository,
},
convert: response => response.json(),
},
type: "GET",
dataType: "json",
success: response => {
const payload = response.map(description => {
data => {
const payload = data.map(description => {
const package_base = description.package.base;
const web_url = description.package.remote.web_url;
return {
@ -163,10 +113,9 @@
table.bootstrapTable("load", payload);
table.bootstrapTable("uncheckAll");
table.bootstrapTable("hideLoading");
hideControls(false);
},
error: (jqXHR, _, errorThrown) => {
if ((jqXHR.status === 401) || (jqXHR.status === 403)) {
error => {
if ((error.status === 401) || (error.status === 403)) {
// authorization error
const text = "In order to see statuses you must login first.";
table.find("tr.unauthorized").remove();
@ -174,39 +123,39 @@
table.bootstrapTable("hideLoading");
} else {
// other errors
const message = error => `Could not load list of packages: ${error}`;
showFailure("Load failure", message, jqXHR, errorThrown);
const message = details => `Could not load list of packages: ${details}`;
showFailure("Load failure", message, error);
}
hideControls(true);
},
});
);
$.ajax({
url: "/api/v1/status",
data: {
architecture: repository.architecture,
repository: repository.repository,
makeRequest(
"/api/v1/status",
{
query: {
architecture: repository.architecture,
repository: repository.repository,
},
convert: response => response.json(),
},
type: "GET",
dataType: "json",
success: response => {
versionBadge.html(`<i class="bi bi-github"></i> ahriman ${safe(response.version)}`);
data => {
versionBadge.innerHTML = `<i class="bi bi-github"></i> ahriman ${safe(data.version)}`;
statusBadge
.popover("dispose")
.attr("data-bs-content", `${response.status.status} at ${new Date(1000 * response.status.timestamp).toISOStringShort()}`)
.popover();
statusBadge.removeClass();
statusBadge.addClass("btn");
statusBadge.addClass(badgeClass(response.status.status));
statusBadge.classList.remove(...statusBadge.classList);
statusBadge.classList.add("btn");
statusBadge.classList.add(badgeClass(data.status.status));
const popover = bootstrap.Popover.getOrCreateInstance(statusBadge);
popover.dispose();
statusBadge.dataset.bsContent = `${data.status.status} at ${new Date(1000 * data.status.timestamp).toISOStringShort()}`;
bootstrap.Popover.getOrCreateInstance(statusBadge);
},
});
);
}
function selectRepository() {
const fragment = window.location.hash.replace("#", "") || "{{ repositories[0].id }}";
const element = $(`#${fragment}-link`);
element.click();
document.getElementById(`${fragment}-link`).click();
}
function statusFormat(value) {
@ -220,9 +169,65 @@
return {classes: cellClass(value)};
}
$(() => {
table.bootstrapTable({});
statusBadge.popover();
ready(_ => {
document.querySelectorAll("#repositories a").forEach(element => {
element.onclick = _ => {
repository = {
architecture: element.dataset.architecture,
repository: element.dataset.repository,
};
if (packageUpdateButton) {
packageUpdateButton.innerHTML = `<i class="bi bi-play"></i> update<span class="d-none d-sm-inline"> ${safe(repository.repository)} (${safe(repository.architecture)})</span>`;
}
bootstrap.Tab.getOrCreateInstance(document.getElementById(element.id)).show();
reload();
};
});
table.on("check.bs.table uncheck.bs.table check-all.bs.table uncheck-all.bs.table", _ => {
if (packageRemoveButton) {
packageRemoveButton.disabled = !table.bootstrapTable("getSelections").length;
}
});
table.on("click-row.bs.table", (self, data, row, cell) => {
if (0 === cell || "base" === cell) {
const method = data[0] === true ? "uncheckBy" : "checkBy"; // fck javascript
table.bootstrapTable(method, {field: "id", values: [data.id]});
} else showPackageInfo(data.id);
});
table.on("created-controls.bs.table", _ => {
new easepick.create({
element: document.querySelector(".bootstrap-table-filter-control-timestamp"),
css: [
"https://cdn.jsdelivr.net/npm/@easepick/bundle@1.2.1/dist/index.css",
],
grid: 2,
calendars: 2,
autoApply: false,
locale: {
cancel: "Clear",
},
RangePlugin: {
tooltip: false,
},
plugins: [
"RangePlugin",
],
setup: picker => {
picker.on("select", _ => { table.bootstrapTable("triggerSearch"); });
// replace "Cancel" behaviour to "Clear"
picker.onClickCancelButton = element => {
if (picker.isCancelButton(element)) {
picker.clear();
picker.hide();
table.bootstrapTable("triggerSearch");
}
};
},
});
});
bootstrap.Popover.getOrCreateInstance(statusBadge);
selectRepository();
});
</script>
</script>

View File

@ -7,6 +7,10 @@
{% include "utils/style.jinja2" %}
{% include "user-style.jinja2" ignore missing %}
{% if rss_url is not none %}
<link rel="alternate" href="{{ rss_url }}" type="application/rss+xml">
{% endif %}
</head>
<body>
@ -101,32 +105,13 @@ SigLevel = Database{% if has_repo_signed %}Required{% else %}Never{% endif %} Pa
</div>
<script>
const table = $("#packages");
table.on("created-controls.bs.table", () => {
const pickerInput = $(".bootstrap-table-filter-control-timestamp");
pickerInput.daterangepicker({
autoUpdateInput: false,
locale: {
cancelLabel: "Clear",
},
});
const table = $(document.getElementById("packages"));
pickerInput.on("apply.daterangepicker", (event, picker) => {
pickerInput.val(`${picker.startDate.format("YYYY-MM-DD")} - ${picker.endDate.format("YYYY-MM-DD")}`);
table.bootstrapTable("triggerSearch");
});
pickerInput.on("cancel.daterangepicker", () => {
pickerInput.val("");
table.bootstrapTable("triggerSearch");
});
});
const pacmanConf = $("#pacman-conf");
const pacmanConfCopyButton = $("#copy-btn");
const pacmanConf = document.getElementById("pacman-conf");
const pacmanConfCopyButton = document.getElementById("copy-btn");
async function copyPacmanConf() {
const conf = pacmanConf.text();
const conf = pacmanConf.textContent;
await copyToClipboard(conf, pacmanConfCopyButton);
}
@ -141,6 +126,40 @@ SigLevel = Database{% if has_repo_signed %}Required{% else %}Never{% endif %} Pa
function filterListLicenses() {
return extractDataList(table.bootstrapTable("getData"), "licenses");
}
ready(_ => {
table.on("created-controls.bs.table", _ => {
new easepick.create({
element: document.querySelector(".bootstrap-table-filter-control-timestamp"),
css: [
"https://cdn.jsdelivr.net/npm/@easepick/bundle@1.2.1/dist/index.css",
],
grid: 2,
calendars: 2,
autoApply: false,
locale: {
cancel: "Clear",
},
RangePlugin: {
tooltip: false,
},
plugins: [
"RangePlugin",
],
setup: picker => {
picker.on("select", _ => { table.bootstrapTable("triggerSearch"); });
// replace "Cancel" behaviour to "Clear"
picker.onClickCancelButton = element => {
if (picker.isCancelButton(element)) {
picker.clear();
picker.hide();
table.bootstrapTable("triggerSearch");
}
};
},
});
});
});
</script>
</body>

View File

@ -0,0 +1,27 @@
<?xml version="1.0" encoding="utf-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom">
<channel>
<title>{{ repository }}: Recent package updates</title>
{% if homepage is not none %}
<link>{{ homepage }}</link>
{% endif %}
<description>Recently updated packages in the {{ repository }}.</description>
{% if rss_url is not none %}
<atom:link href="{{ rss_url }}" rel="self"/>
{% endif %}
<language>en-us</language>
<lastBuildDate>{{ last_update }}</lastBuildDate>
{% for package in packages %}
<item>
<title>{{ package.name }} {{ package.version }} {{ package.architecture }}</title>
<link>{{ link_path }}/{{ package.filename }}</link>
<description>{{ package.description }}</description>
<pubDate>{{ package.build_date }}</pubDate>
<guid isPermaLink="false">{{ package.tag }}</guid>
<category>{{ repository }}</category>
<category>{{ package.architecture }}</category>
</item>
{% endfor %}
</channel>
</rss>

View File

@ -1,38 +1,30 @@
<script src="https://cdn.jsdelivr.net/npm/jquery@3.7.1/dist/jquery.min.js" integrity="sha384-1H217gwSVyLSIfaLxHbE7dRb3v4mYCKbpQvzx0cegeju1MVsGrX5xXxAvs/HgeFs" crossorigin="anonymous" type="application/javascript"></script>
<script src="https://cdn.jsdelivr.net/npm/jquery@3.7.1/dist/jquery.min.js" crossorigin="anonymous" type="application/javascript"></script>
<script src="https://cdn.jsdelivr.net/npm/js-md5@0.8.3/src/md5.min.js" crossorigin="anonymous" type="application/javascript"></script>
<script src="https://cdn.jsdelivr.net/npm/moment@2.29.4/moment.min.js" integrity="sha384-8hHkOkbWN1TLWwet/jpbJ0zbx3FJDeYJgQ8dX1mRrv/vfCfHCqFSFZYCgaMML3z9" crossorigin="anonymous" type="application/javascript"></script>
<script src="https://cdn.jsdelivr.net/npm/daterangepicker@3.1.0/daterangepicker.min.js" integrity="sha384-u4eJN1VWrTf/FnYYQJo2kqJyVxEQf5UmWY4iUcNAoLenOEtEuCkfwc5bKvZOWBi5" crossorigin="anonymous" type="application/javascript"></script>
<script src="https://cdn.jsdelivr.net/npm/tableexport.jquery.plugin@1.30.0/tableExport.min.js" crossorigin="anonymous" type="application/javascript"></script>
<script src="https://cdn.jsdelivr.net/npm/tableexport.jquery.plugin@1.28.0/tableExport.min.js" integrity="sha384-1Rz4Kz/y1rSWw+ZsjTcxB684XgofbO8iizY+UFIzCwFeQ+QUyhBNWBMh/STOyomI" crossorigin="anonymous" type="application/javascript"></script>
<script src="https://cdn.jsdelivr.net/npm/jquery-resizable-columns@0.2.3/dist/jquery.resizableColumns.min.js" crossorigin="anonymous" type="application/javascript"></script>
<script src="https://cdn.jsdelivr.net/npm/jquery-resizable-columns@0.2.3/dist/jquery.resizableColumns.min.js" integrity="sha384-IazMVNyYoUNx6357fWJoqtHYUWWCNHIXxFVtbpVgvImQNWuRP2WbHPaIb3QF8j97" crossorigin="anonymous" type="application/javascript"></script>
<script src="https://cdn.jsdelivr.net/npm/@popperjs/core@2.11.8/dist/umd/popper.min.js" crossorigin="anonymous" type="application/javascript"></script>
<script src="https://cdn.jsdelivr.net/npm/bootstrap@5.3.3/dist/js/bootstrap.min.js" crossorigin="anonymous" type="application/javascript"></script>
<script src="https://cdn.jsdelivr.net/npm/bootstrap-table@1.23.2/dist/bootstrap-table.min.js" crossorigin="anonymous" type="application/javascript"></script>
<script src="https://cdn.jsdelivr.net/npm/@popperjs/core@2.11.8/dist/umd/popper.min.js" integrity="sha384-I7E8VVD/ismYTF4hNIPjVp/Zjvgyol6VFvRkX/vR+Vc4jQkC+hVqc2pM8ODewa9r" crossorigin="anonymous" type="application/javascript"></script>
<script src="https://cdn.jsdelivr.net/npm/bootstrap@5.3.2/dist/js/bootstrap.min.js" integrity="sha384-BBtl+eGJRgqQAUMxJ7pMwbEyER4l1g+O15P+16Ep7Q9Q+zqX6gSbd85u4mG4QzX+" crossorigin="anonymous" type="application/javascript"></script>
<script src="https://cdn.jsdelivr.net/npm/bootstrap-table@1.22.1/dist/bootstrap-table.min.js" integrity="sha384-GVLHfbEvuGA/RFiQ3MK2ClEJkWYJXABg55t9LpoDPZFGIsSq8xhFlQydm5poV2jW" crossorigin="anonymous" type="application/javascript"></script>
<script src="https://cdn.jsdelivr.net/npm/bootstrap-table@1.23.2/dist/extensions/export/bootstrap-table-export.min.js" crossorigin="anonymous" type="application/javascript"></script>
<script src="https://cdn.jsdelivr.net/npm/bootstrap-table@1.23.2/dist/extensions/resizable/bootstrap-table-resizable.js" crossorigin="anonymous" type="application/javascript"></script>
<script src="https://cdn.jsdelivr.net/npm/bootstrap-table@1.23.2/dist/extensions/filter-control/bootstrap-table-filter-control.js" crossorigin="anonymous" type="application/javascript"></script>
<script src="https://cdn.jsdelivr.net/npm/bootstrap-table@1.22.1/dist/extensions/export/bootstrap-table-export.min.js" integrity="sha384-g9OAB1Moamcy8+l1Q/tajHlMf6NTkS79ehKLTYbA80aQRbRhFCjrSuezv+FE2Kwe" crossorigin="anonymous" type="application/javascript"></script>
<script src="https://cdn.jsdelivr.net/npm/bootstrap-table@1.22.1/dist/extensions/resizable/bootstrap-table-resizable.js" integrity="sha384-wd8Vc6Febikdnsnk9vthRWRvMwffw246vhqiqNO3aSNe1maTEA07Vh3zAQiSyDji" crossorigin="anonymous" type="application/javascript"></script>
<script src="https://cdn.jsdelivr.net/npm/bootstrap-table@1.22.1/dist/extensions/filter-control/bootstrap-table-filter-control.js" integrity="sha384-NIqcjpr/3eZI1iNzz7hgT5rgp70qFUzkZffeCgVva9gi80B5vqcm7gn+8QvlWxko" crossorigin="anonymous" type="application/javascript"></script>
<script src="https://cdn.jsdelivr.net/npm/@easepick/bundle@1.2.1/dist/index.umd.min.js" crossorigin="anonymous" type="application/javascript"></script>
<script src="https://cdn.jsdelivr.net/gh/highlightjs/cdn-release@11.9.0/build/highlight.min.js" integrity="sha384-F/bZzf7p3Joyp5psL90p/p89AZJsndkSoGwRpXcZhleCWhd8SnRuoYo4d0yirjJp" crossorigin="anonymous" type="application/javascript"></script>
<script src="https://cdn.jsdelivr.net/gh/highlightjs/cdn-release@11.10.0/build/highlight.min.js" crossorigin="anonymous" type="application/javascript"></script>
<script src="https://cdn.jsdelivr.net/npm/chart.js@4.4.4/dist/chart.umd.min.js" crossorigin="anonymous" type="application/javascript"></script>
<script>
async function copyToClipboard(text, button) {
if (navigator.clipboard === undefined) {
const input = document.createElement("textarea");
input.innerHTML = text;
document.body.appendChild(input);
input.select();
document.execCommand("copy");
document.body.removeChild(input);
} else {
await navigator.clipboard.writeText(text);
}
button.html("<i class=\"bi bi-clipboard-check\"></i> copied");
setTimeout(()=> {
button.html("<i class=\"bi bi-clipboard\"></i> copy");
await navigator.clipboard.writeText(text);
button.innerHTML = "<i class=\"bi bi-clipboard-check\"></i> copied";
setTimeout(_ => {
button.innerHTML = "<i class=\"bi bi-clipboard\"></i> copy";
}, 2000);
}
@ -73,6 +65,47 @@
.join("<br>");
}
function makeRequest(url, params, onSuccess, onFailure) {
const requestParams = {
method: params.method,
body: params.json ? JSON.stringify(params.json) : params.json,
headers: {
"Accept": "application/json",
"Content-Type": "application/json",
},
};
if (params.query) {
const query = new URLSearchParams(params.query);
url += `?${query.toString()}`;
}
const convert = params.convert ?? (response => response.text());
return fetch(url, requestParams)
.then(response => {
if (response.ok) {
return convert(response);
} else {
const error = new Error("Network request error");
error.status = response.status;
error.statusText = response.statusText;
return response.text().then(text => {
error.text = text;
throw error;
});
}
})
.then(data => onSuccess && onSuccess(data))
.catch(error => onFailure && onFailure(error));
}
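// usage sketch, mirroring the /api/v1/status call used elsewhere in these templates; repository values are placeholders:
// makeRequest(
//     "/api/v1/status",
//     {
//         query: {architecture: "x86_64", repository: "some-repo"},
//         convert: response => response.json(),
//     },
//     data => console.log(data.version),
//     error => console.error(error.status, error.text ?? error.message),
// );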
function ready(fn) {
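// run fn once the DOM has been parsed: immediately if it already is, otherwise on DOMContentLoaded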
if (document.readyState === "complete" || document.readyState === "interactive") {
setTimeout(fn, 1);
} else {
document.addEventListener("DOMContentLoaded", fn);
}
}
function safe(string) {
return String(string)
.replace(/&/g, "&amp;")
@ -86,7 +119,9 @@
const element = document.createElement("a");
element.href = url;
element.innerText = text;
if (title) element.title = title;
if (title) {
element.title = title;
}
return element;
}

View File

@ -1,17 +1,15 @@
<link rel="stylesheet" href="https://cdn.jsdelivr.net/npm/bootstrap@5.3.2/dist/css/bootstrap.min.css" integrity="sha384-T3c6CoIi6uLrA9TneNEoa7RxnatzjcDSCmG1MXxSR1GAsXEV/Dwwykc2MPK8M2HN" crossorigin="anonymous" type="text/css">
<link rel="stylesheet" href="https://cdn.jsdelivr.net/npm/bootstrap-icons@1.11.1/font/bootstrap-icons.css" integrity="sha384-4LISF5TTJX/fLmGSxO53rV4miRxdg84mZsxmO8Rx5jGtp/LbrixFETvWa5a6sESd" crossorigin="anonymous" type="text/css">
<link rel="stylesheet" href="https://cdn.jsdelivr.net/npm/bootstrap@5.3.3/dist/css/bootstrap.min.css" crossorigin="anonymous" type="text/css">
<link rel="stylesheet" href="https://cdn.jsdelivr.net/npm/bootstrap-icons@1.11.1/font/bootstrap-icons.css" crossorigin="anonymous" type="text/css">
<link rel="stylesheet" href="https://cdn.jsdelivr.net/npm/bootstrap-table@1.22.1/dist/bootstrap-table.min.css" integrity="sha384-sN3NwxbjH33ZidqZnPmX+nQ5IF+LoiI7HvZSoZj5wGacmu0/q4RJfsN0xqN+LIa5" crossorigin="anonymous" type="text/css">
<link rel="stylesheet" href="https://cdn.jsdelivr.net/npm/bootstrap-table@1.23.2/dist/bootstrap-table.min.css" crossorigin="anonymous" type="text/css">
<link rel="stylesheet" href="https://cdn.jsdelivr.net/npm/jquery-resizable-columns@0.2.3/dist/jquery.resizableColumns.css" integrity="sha384-1sLxvR8mXzjhvFY9f8mzXl97DNLepeZ0PnRiMMdm/rQsKjsrPZPJxYle2wwT2PMg" crossorigin="anonymous" type="text/css">
<link rel="stylesheet" href="https://cdn.jsdelivr.net/npm/jquery-resizable-columns@0.2.3/dist/jquery.resizableColumns.css" crossorigin="anonymous" type="text/css">
<link rel="stylesheet" href="https://cdn.jsdelivr.net/npm/bootstrap-table@1.22.1/dist/extensions/filter-control/bootstrap-table-filter-control.css" integrity="sha384-4Glx18jZ0Un+yDG6KUpYJ/af8hkssJ02jRASuFv23gfCl0mTXaVMPI9cB4cn3GvE" crossorigin="anonymous" type="text/css">
<link rel="stylesheet" href="https://cdn.jsdelivr.net/npm/bootstrap-table@1.23.2/dist/extensions/filter-control/bootstrap-table-filter-control.css" crossorigin="anonymous" type="text/css">
<link rel="stylesheet" href="https://cdn.jsdelivr.net/npm/bootswatch@5.3.2/dist/cosmo/bootstrap.min.css" integrity="sha384-RfV5VNj9uqyOdZbN0hFNmoq56291KK2Y4iKhoRAbcfBfjYlpasjxK6TefPjxiAiN" crossorigin="anonymous" type="text/css">
<link rel="stylesheet" href="https://cdn.jsdelivr.net/npm/bootswatch@5.3.3/dist/cosmo/bootstrap.min.css" crossorigin="anonymous" type="text/css">
<link rel="stylesheet" href="https://cdn.jsdelivr.net/npm/daterangepicker@3.1.0/daterangepicker.css" integrity="sha384-zLkQsiLfAQqGeIJeKLC+rcCR1YoYaQFLCL7cLDUoKE1ajKJzySpjzWGfYS2vjSG+" crossorigin="anonymous" type="text/css">
<link rel="stylesheet" href="https://cdn.jsdelivr.net/gh/highlightjs/cdn-release@11.9.0/build/styles/github.min.css" integrity="sha384-eFTL69TLRZTkNfYZOLM+G04821K1qZao/4QLJbet1pP4tcF+fdXq/9CdqAbWRl/L" crossorigin="anonymous" type="text/css">
<link rel="stylesheet" href="https://cdn.jsdelivr.net/gh/highlightjs/cdn-release@11.10.0/build/styles/github.min.css" crossorigin="anonymous" type="text/css">
<style>
.pre-scrollable {

View File

@ -27,12 +27,12 @@ _shtab_ahriman_patch_list_option_strings=('-h' '--help' '-e' '--exit-code' '-v'
_shtab_ahriman_patch_remove_option_strings=('-h' '--help' '-v' '--variable')
_shtab_ahriman_patch_set_add_option_strings=('-h' '--help' '-t' '--track')
_shtab_ahriman_repo_backup_option_strings=('-h' '--help')
_shtab_ahriman_repo_check_option_strings=('-h' '--help' '--changes' '--no-changes' '-e' '--exit-code' '--vcs' '--no-vcs' '-y' '--refresh')
_shtab_ahriman_check_option_strings=('-h' '--help' '--changes' '--no-changes' '-e' '--exit-code' '--vcs' '--no-vcs' '-y' '--refresh')
_shtab_ahriman_repo_check_option_strings=('-h' '--help' '--changes' '--no-changes' '--check-files' '--no-check-files' '-e' '--exit-code' '--vcs' '--no-vcs' '-y' '--refresh')
_shtab_ahriman_check_option_strings=('-h' '--help' '--changes' '--no-changes' '--check-files' '--no-check-files' '-e' '--exit-code' '--vcs' '--no-vcs' '-y' '--refresh')
_shtab_ahriman_repo_create_keyring_option_strings=('-h' '--help')
_shtab_ahriman_repo_create_mirrorlist_option_strings=('-h' '--help')
_shtab_ahriman_repo_daemon_option_strings=('-h' '--help' '-i' '--interval' '--aur' '--no-aur' '--changes' '--no-changes' '--dependencies' '--no-dependencies' '--dry-run' '--increment' '--no-increment' '--local' '--no-local' '--manual' '--no-manual' '--partitions' '--no-partitions' '-u' '--username' '--vcs' '--no-vcs' '-y' '--refresh')
_shtab_ahriman_daemon_option_strings=('-h' '--help' '-i' '--interval' '--aur' '--no-aur' '--changes' '--no-changes' '--dependencies' '--no-dependencies' '--dry-run' '--increment' '--no-increment' '--local' '--no-local' '--manual' '--no-manual' '--partitions' '--no-partitions' '-u' '--username' '--vcs' '--no-vcs' '-y' '--refresh')
_shtab_ahriman_repo_daemon_option_strings=('-h' '--help' '-i' '--interval' '--aur' '--no-aur' '--changes' '--no-changes' '--check-files' '--no-check-files' '--dependencies' '--no-dependencies' '--dry-run' '--increment' '--no-increment' '--local' '--no-local' '--manual' '--no-manual' '--partitions' '--no-partitions' '-u' '--username' '--vcs' '--no-vcs' '-y' '--refresh')
_shtab_ahriman_daemon_option_strings=('-h' '--help' '-i' '--interval' '--aur' '--no-aur' '--changes' '--no-changes' '--check-files' '--no-check-files' '--dependencies' '--no-dependencies' '--dry-run' '--increment' '--no-increment' '--local' '--no-local' '--manual' '--no-manual' '--partitions' '--no-partitions' '-u' '--username' '--vcs' '--no-vcs' '-y' '--refresh')
_shtab_ahriman_repo_rebuild_option_strings=('-h' '--help' '--depends-on' '--dry-run' '--from-database' '--increment' '--no-increment' '-e' '--exit-code' '-s' '--status' '-u' '--username')
_shtab_ahriman_rebuild_option_strings=('-h' '--help' '--depends-on' '--dry-run' '--from-database' '--increment' '--no-increment' '-e' '--exit-code' '-s' '--status' '-u' '--username')
_shtab_ahriman_repo_remove_unknown_option_strings=('-h' '--help' '--dry-run')
@ -47,8 +47,8 @@ _shtab_ahriman_repo_sync_option_strings=('-h' '--help')
_shtab_ahriman_sync_option_strings=('-h' '--help')
_shtab_ahriman_repo_tree_option_strings=('-h' '--help' '-p' '--partitions')
_shtab_ahriman_repo_triggers_option_strings=('-h' '--help')
_shtab_ahriman_repo_update_option_strings=('-h' '--help' '--aur' '--no-aur' '--changes' '--no-changes' '--dependencies' '--no-dependencies' '--dry-run' '-e' '--exit-code' '--increment' '--no-increment' '--local' '--no-local' '--manual' '--no-manual' '-u' '--username' '--vcs' '--no-vcs' '-y' '--refresh')
_shtab_ahriman_update_option_strings=('-h' '--help' '--aur' '--no-aur' '--changes' '--no-changes' '--dependencies' '--no-dependencies' '--dry-run' '-e' '--exit-code' '--increment' '--no-increment' '--local' '--no-local' '--manual' '--no-manual' '-u' '--username' '--vcs' '--no-vcs' '-y' '--refresh')
_shtab_ahriman_repo_update_option_strings=('-h' '--help' '--aur' '--no-aur' '--changes' '--no-changes' '--check-files' '--no-check-files' '--dependencies' '--no-dependencies' '--dry-run' '-e' '--exit-code' '--increment' '--no-increment' '--local' '--no-local' '--manual' '--no-manual' '-u' '--username' '--vcs' '--no-vcs' '-y' '--refresh')
_shtab_ahriman_update_option_strings=('-h' '--help' '--aur' '--no-aur' '--changes' '--no-changes' '--check-files' '--no-check-files' '--dependencies' '--no-dependencies' '--dry-run' '-e' '--exit-code' '--increment' '--no-increment' '--local' '--no-local' '--manual' '--no-manual' '-u' '--username' '--vcs' '--no-vcs' '-y' '--refresh')
_shtab_ahriman_service_clean_option_strings=('-h' '--help' '--cache' '--no-cache' '--chroot' '--no-chroot' '--manual' '--no-manual' '--packages' '--no-packages' '--pacman' '--no-pacman')
_shtab_ahriman_clean_option_strings=('-h' '--help' '--cache' '--no-cache' '--chroot' '--no-chroot' '--manual' '--no-manual' '--packages' '--no-packages' '--pacman' '--no-pacman')
_shtab_ahriman_repo_clean_option_strings=('-h' '--help' '--cache' '--no-cache' '--chroot' '--no-chroot' '--manual' '--no-manual' '--packages' '--no-packages' '--pacman' '--no-pacman')
@ -243,6 +243,8 @@ _shtab_ahriman_repo_check__h_nargs=0
_shtab_ahriman_repo_check___help_nargs=0
_shtab_ahriman_repo_check___changes_nargs=0
_shtab_ahriman_repo_check___no_changes_nargs=0
_shtab_ahriman_repo_check___check_files_nargs=0
_shtab_ahriman_repo_check___no_check_files_nargs=0
_shtab_ahriman_repo_check__e_nargs=0
_shtab_ahriman_repo_check___exit_code_nargs=0
_shtab_ahriman_repo_check___vcs_nargs=0
@ -254,6 +256,8 @@ _shtab_ahriman_check__h_nargs=0
_shtab_ahriman_check___help_nargs=0
_shtab_ahriman_check___changes_nargs=0
_shtab_ahriman_check___no_changes_nargs=0
_shtab_ahriman_check___check_files_nargs=0
_shtab_ahriman_check___no_check_files_nargs=0
_shtab_ahriman_check__e_nargs=0
_shtab_ahriman_check___exit_code_nargs=0
_shtab_ahriman_check___vcs_nargs=0
@ -270,6 +274,8 @@ _shtab_ahriman_repo_daemon___aur_nargs=0
_shtab_ahriman_repo_daemon___no_aur_nargs=0
_shtab_ahriman_repo_daemon___changes_nargs=0
_shtab_ahriman_repo_daemon___no_changes_nargs=0
_shtab_ahriman_repo_daemon___check_files_nargs=0
_shtab_ahriman_repo_daemon___no_check_files_nargs=0
_shtab_ahriman_repo_daemon___dependencies_nargs=0
_shtab_ahriman_repo_daemon___no_dependencies_nargs=0
_shtab_ahriman_repo_daemon___dry_run_nargs=0
@ -291,6 +297,8 @@ _shtab_ahriman_daemon___aur_nargs=0
_shtab_ahriman_daemon___no_aur_nargs=0
_shtab_ahriman_daemon___changes_nargs=0
_shtab_ahriman_daemon___no_changes_nargs=0
_shtab_ahriman_daemon___check_files_nargs=0
_shtab_ahriman_daemon___no_check_files_nargs=0
_shtab_ahriman_daemon___dependencies_nargs=0
_shtab_ahriman_daemon___no_dependencies_nargs=0
_shtab_ahriman_daemon___dry_run_nargs=0
@ -358,6 +366,8 @@ _shtab_ahriman_repo_update___aur_nargs=0
_shtab_ahriman_repo_update___no_aur_nargs=0
_shtab_ahriman_repo_update___changes_nargs=0
_shtab_ahriman_repo_update___no_changes_nargs=0
_shtab_ahriman_repo_update___check_files_nargs=0
_shtab_ahriman_repo_update___no_check_files_nargs=0
_shtab_ahriman_repo_update___dependencies_nargs=0
_shtab_ahriman_repo_update___no_dependencies_nargs=0
_shtab_ahriman_repo_update___dry_run_nargs=0
@ -380,6 +390,8 @@ _shtab_ahriman_update___aur_nargs=0
_shtab_ahriman_update___no_aur_nargs=0
_shtab_ahriman_update___changes_nargs=0
_shtab_ahriman_update___no_changes_nargs=0
_shtab_ahriman_update___check_files_nargs=0
_shtab_ahriman_update___no_check_files_nargs=0
_shtab_ahriman_update___dependencies_nargs=0
_shtab_ahriman_update___no_dependencies_nargs=0
_shtab_ahriman_update___dry_run_nargs=0
@ -584,7 +596,7 @@ _set_new_action() {
current_action_nargs=1
fi
current_action_args_start_index=$(( $word_index + 1 ))
current_action_args_start_index=$(( $word_index + 1 - $pos_only ))
current_action_is_positional=$2
}
@ -611,6 +623,7 @@ _shtab_ahriman() {
local prefix=_shtab_ahriman
local word_index=0
local pos_only=0 # "--" delimiter not encountered yet
_set_parser_defaults
word_index=1
@ -619,26 +632,30 @@ _shtab_ahriman() {
while [ $word_index -ne $COMP_CWORD ]; do
local this_word="${COMP_WORDS[$word_index]}"
if [[ -n $sub_parsers && " ${sub_parsers[@]} " == *" ${this_word} "* ]]; then
# valid subcommand: add it to the prefix & reset the current action
prefix="${prefix}_$(_shtab_replace_nonword $this_word)"
_set_parser_defaults
fi
if [[ $pos_only = 1 || " $this_word " != " -- " ]]; then
if [[ -n $sub_parsers && " ${sub_parsers[@]} " == *" ${this_word} "* ]]; then
# valid subcommand: add it to the prefix & reset the current action
prefix="${prefix}_$(_shtab_replace_nonword $this_word)"
_set_parser_defaults
fi
if [[ " ${current_option_strings[@]} " == *" ${this_word} "* ]]; then
# a new action should be acquired (due to recognised option string or
# no more input expected from current action);
# the next positional action can fill in here
_set_new_action $this_word false
fi
if [[ " ${current_option_strings[@]} " == *" ${this_word} "* ]]; then
# a new action should be acquired (due to recognised option string or
# no more input expected from current action);
# the next positional action can fill in here
_set_new_action $this_word false
fi
if [[ "$current_action_nargs" != "*" ]] && \
[[ "$current_action_nargs" != "+" ]] && \
[[ "$current_action_nargs" != *"..." ]] && \
(( $word_index + 1 - $current_action_args_start_index >= \
$current_action_nargs )); then
$current_action_is_positional && let "completed_positional_actions += 1"
_set_new_action "pos_${completed_positional_actions}" true
if [[ "$current_action_nargs" != "*" ]] && \
[[ "$current_action_nargs" != "+" ]] && \
[[ "$current_action_nargs" != *"..." ]] && \
(( $word_index + 1 - $current_action_args_start_index - $pos_only >= \
$current_action_nargs )); then
$current_action_is_positional && let "completed_positional_actions += 1"
_set_new_action "pos_${completed_positional_actions}" true
fi
else
pos_only=1 # "--" delimiter encountered
fi
let "word_index+=1"
@ -646,7 +663,7 @@ _shtab_ahriman() {
# Generate the completions
if [[ "${completing_word}" == -* ]]; then
if [[ $pos_only = 0 && "${completing_word}" == -* ]]; then
# optional argument started: use option strings
COMPREPLY=( $(compgen -W "${current_option_strings[*]}" -- "${completing_word}") )
else

View File

@ -1,4 +1,4 @@
.TH AHRIMAN "1" "2024\-01\-12" "ahriman" "Generated Python Manual"
.TH AHRIMAN "1" "2024\-09\-04" "ahriman" "Generated Python Manual"
.SH NAME
ahriman
.SH SYNOPSIS
@ -391,7 +391,7 @@ PKGBUILD variable or function name. If variable is a function, it must end with
path to file which contains function or variable value. If not set, the value will be read from stdin
.SH COMMAND \fI\,'ahriman patch\-list'\/\fR
usage: ahriman patch\-list [\-h] [\-e] [\-v VARIABLE] [package]
usage: ahriman patch\-list [\-h] [\-e] [\-v VARIABLE] package
list available patches for the package
@ -447,7 +447,9 @@ backup repository settings and database
path of the output archive
.SH COMMAND \fI\,'ahriman repo\-check'\/\fR
usage: ahriman repo\-check [\-h] [\-\-changes | \-\-no\-changes] [\-e] [\-\-vcs | \-\-no\-vcs] [\-y] [package ...]
usage: ahriman repo\-check [\-h] [\-\-changes | \-\-no\-changes] [\-\-check\-files | \-\-no\-check\-files] [\-e] [\-\-vcs | \-\-no\-vcs]
[\-y]
[package ...]
check for packages updates. Same as repo\-update \-\-dry\-run \-\-no\-manual
@ -460,6 +462,10 @@ filter check by package base
\fB\-\-changes\fR, \fB\-\-no\-changes\fR
calculate changes from the latest known commit if available. Only applicable in dry run mode
.TP
\fB\-\-check\-files\fR, \fB\-\-no\-check\-files\fR
enable or disable checking of broken dependencies (e.g. dynamically linked libraries or modules directories)
.TP
\fB\-e\fR, \fB\-\-exit\-code\fR
return non\-zero exit status if result is empty
@ -484,9 +490,9 @@ create package which contains list of available mirrors as set by configuration.
.SH COMMAND \fI\,'ahriman repo\-daemon'\/\fR
usage: ahriman repo\-daemon [\-h] [\-i INTERVAL] [\-\-aur | \-\-no\-aur] [\-\-changes | \-\-no\-changes]
[\-\-dependencies | \-\-no\-dependencies] [\-\-dry\-run] [\-\-increment | \-\-no\-increment]
[\-\-local | \-\-no\-local] [\-\-manual | \-\-no\-manual] [\-\-partitions | \-\-no\-partitions]
[\-u USERNAME] [\-\-vcs | \-\-no\-vcs] [\-y]
[\-\-check\-files | \-\-no\-check\-files] [\-\-dependencies | \-\-no\-dependencies] [\-\-dry\-run]
[\-\-increment | \-\-no\-increment] [\-\-local | \-\-no\-local] [\-\-manual | \-\-no\-manual]
[\-\-partitions | \-\-no\-partitions] [\-u USERNAME] [\-\-vcs | \-\-no\-vcs] [\-y]
start process which periodically will run update process
@ -503,6 +509,10 @@ enable or disable checking for AUR updates
\fB\-\-changes\fR, \fB\-\-no\-changes\fR
calculate changes from the latest known commit if available. Only applicable in dry run mode
.TP
\fB\-\-check\-files\fR, \fB\-\-no\-check\-files\fR
enable or disable checking of broken dependencies (e.g. dynamically linked libraries or modules directories)
.TP
\fB\-\-dependencies\fR, \fB\-\-no\-dependencies\fR
process missing package dependencies
@ -649,9 +659,9 @@ run triggers on empty build result as configured by settings
instead of running all triggers as set by configuration, just process specified ones in order of mention
.SH COMMAND \fI\,'ahriman repo\-update'\/\fR
usage: ahriman repo\-update [\-h] [\-\-aur | \-\-no\-aur] [\-\-changes | \-\-no\-changes] [\-\-dependencies | \-\-no\-dependencies]
[\-\-dry\-run] [\-e] [\-\-increment | \-\-no\-increment] [\-\-local | \-\-no\-local]
[\-\-manual | \-\-no\-manual] [\-u USERNAME] [\-\-vcs | \-\-no\-vcs] [\-y]
usage: ahriman repo\-update [\-h] [\-\-aur | \-\-no\-aur] [\-\-changes | \-\-no\-changes] [\-\-check\-files | \-\-no\-check\-files]
[\-\-dependencies | \-\-no\-dependencies] [\-\-dry\-run] [\-e] [\-\-increment | \-\-no\-increment]
[\-\-local | \-\-no\-local] [\-\-manual | \-\-no\-manual] [\-u USERNAME] [\-\-vcs | \-\-no\-vcs] [\-y]
[package ...]
check for packages updates and run build process if requested
@ -669,6 +679,10 @@ enable or disable checking for AUR updates
\fB\-\-changes\fR, \fB\-\-no\-changes\fR
calculate changes from the latest known commit if available. Only applicable in dry run mode
.TP
\fB\-\-check\-files\fR, \fB\-\-no\-check\-files\fR
enable or disable checking of broken dependencies (e.g. dynamically linked libraries or modules directories)
.TP
\fB\-\-dependencies\fR, \fB\-\-no\-dependencies\fR
process missing package dependencies
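
The --check-files toggle shown above is wired into repo-check, repo-daemon and repo-update alike and defaults to enabled. As a hypothetical invocation, skipping the broken-dependency scan during a dry run would look something like:

ahriman repo-update --dry-run --no-check-files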

View File

@ -86,7 +86,7 @@ _shtab_ahriman_options=(
{-a,--architecture}"[filter by target architecture (default\: None)]:architecture:"
{-c,--configuration}"[configuration path (default\: \/etc\/ahriman.ini)]:configuration:"
"--force[force run, remove file lock (default\: False)]"
{-l,--lock}"[lock file (default\: \/tmp\/ahriman.lock)]:lock:"
{-l,--lock}"[lock file (default\: ahriman.pid)]:lock:"
"--log-handler[explicit log handler specification. If none set, the handler will be guessed from environment (default\: None)]:log_handler:(console syslog journald)"
{-q,--quiet}"[force disable any logging (default\: False)]"
{--report,--no-report}"[force enable or disable reporting to web service (default\: True)]:report:"
@ -120,6 +120,7 @@ _shtab_ahriman_aur_search_options=(
_shtab_ahriman_check_options=(
"(- : *)"{-h,--help}"[show this help message and exit]"
{--changes,--no-changes}"[calculate changes from the latest known commit if available. Only applicable in dry run mode (default\: True)]:changes:"
{--check-files,--no-check-files}"[enable or disable checking of broken dependencies (e.g. dynamically linked libraries or modules directories) (default\: True)]:check_files:"
{-e,--exit-code}"[return non-zero exit status if result is empty (default\: False)]"
{--vcs,--no-vcs}"[fetch actual version of VCS packages (default\: True)]:vcs:"
"*"{-y,--refresh}"[download fresh package databases from the mirror before actions, -yy to force refresh even if up to date (default\: False)]"
@ -153,6 +154,7 @@ _shtab_ahriman_daemon_options=(
{-i,--interval}"[interval between runs in seconds (default\: 43200)]:interval:"
{--aur,--no-aur}"[enable or disable checking for AUR updates (default\: True)]:aur:"
{--changes,--no-changes}"[calculate changes from the latest known commit if available. Only applicable in dry run mode (default\: True)]:changes:"
{--check-files,--no-check-files}"[enable or disable checking of broken dependencies (e.g. dynamically linked libraries or modules directories) (default\: True)]:check_files:"
{--dependencies,--no-dependencies}"[process missing package dependencies (default\: True)]:dependencies:"
"--dry-run[just perform check for updates, same as check command (default\: False)]"
{--increment,--no-increment}"[increment package release (pkgrel) on duplicate (default\: True)]:increment:"
@ -278,7 +280,7 @@ _shtab_ahriman_patch_list_options=(
"(- : *)"{-h,--help}"[show this help message and exit]"
{-e,--exit-code}"[return non-zero exit status if result is empty (default\: False)]"
"*"{-v,--variable}"[if set, show only patches for specified PKGBUILD variables (default\: None)]:variable:"
":package base (default\: None):"
":package base:"
)
_shtab_ahriman_patch_remove_options=(
@ -322,6 +324,7 @@ _shtab_ahriman_repo_backup_options=(
_shtab_ahriman_repo_check_options=(
"(- : *)"{-h,--help}"[show this help message and exit]"
{--changes,--no-changes}"[calculate changes from the latest known commit if available. Only applicable in dry run mode (default\: True)]:changes:"
{--check-files,--no-check-files}"[enable or disable checking of broken dependencies (e.g. dynamically linked libraries or modules directories) (default\: True)]:check_files:"
{-e,--exit-code}"[return non-zero exit status if result is empty (default\: False)]"
{--vcs,--no-vcs}"[fetch actual version of VCS packages (default\: True)]:vcs:"
"*"{-y,--refresh}"[download fresh package databases from the mirror before actions, -yy to force refresh even if up to date (default\: False)]"
@ -363,6 +366,7 @@ _shtab_ahriman_repo_daemon_options=(
{-i,--interval}"[interval between runs in seconds (default\: 43200)]:interval:"
{--aur,--no-aur}"[enable or disable checking for AUR updates (default\: True)]:aur:"
{--changes,--no-changes}"[calculate changes from the latest known commit if available. Only applicable in dry run mode (default\: True)]:changes:"
{--check-files,--no-check-files}"[enable or disable checking of broken dependencies (e.g. dynamically linked libraries or modules directories) (default\: True)]:check_files:"
{--dependencies,--no-dependencies}"[process missing package dependencies (default\: True)]:dependencies:"
"--dry-run[just perform check for updates, same as check command (default\: False)]"
{--increment,--no-increment}"[increment package release (pkgrel) on duplicate (default\: True)]:increment:"
@ -460,6 +464,7 @@ _shtab_ahriman_repo_update_options=(
"(- : *)"{-h,--help}"[show this help message and exit]"
{--aur,--no-aur}"[enable or disable checking for AUR updates (default\: True)]:aur:"
{--changes,--no-changes}"[calculate changes from the latest known commit if available. Only applicable in dry run mode (default\: True)]:changes:"
{--check-files,--no-check-files}"[enable or disable checking of broken dependencies (e.g. dynamically linked libraries or modules directories) (default\: True)]:check_files:"
{--dependencies,--no-dependencies}"[process missing package dependencies (default\: True)]:dependencies:"
"--dry-run[just perform check for updates, same as check command (default\: False)]"
{-e,--exit-code}"[return non-zero exit status if result is empty (default\: False)]"
@ -601,6 +606,7 @@ _shtab_ahriman_update_options=(
"(- : *)"{-h,--help}"[show this help message and exit]"
{--aur,--no-aur}"[enable or disable checking for AUR updates (default\: True)]:aur:"
{--changes,--no-changes}"[calculate changes from the latest known commit if available. Only applicable in dry run mode (default\: True)]:changes:"
{--check-files,--no-check-files}"[enable or disable checking of broken dependencies (e.g. dynamically linked libraries or modules directories) (default\: True)]:check_files:"
{--dependencies,--no-dependencies}"[process missing package dependencies (default\: True)]:dependencies:"
"--dry-run[just perform check for updates, same as check command (default\: False)]"
{-e,--exit-code}"[return non-zero exit status if result is empty (default\: False)]"
@ -736,4 +742,12 @@ _shtab_ahriman() {
typeset -A opt_args
_shtab_ahriman "$@"
if [[ $zsh_eval_context[-1] == eval ]]; then
# eval/source/. command, register function for later
compdef _shtab_ahriman -N ahriman
else
# autoload from fpath, call function directly
_shtab_ahriman "$@"
fi

View File

@ -17,9 +17,9 @@ authors = [
]
dependencies = [
"cerberus",
"inflection",
"passlib",
"pyelftools",
"requests",
"srcinfo",
]
@ -61,6 +61,9 @@ pacman = [
s3 = [
"boto3",
]
stats = [
"matplotlib",
]
tests = [
"pytest",
"pytest-aiohttp",
@ -70,6 +73,9 @@ tests = [
"pytest-resource-path",
"pytest-spec",
]
validator = [
"cerberus",
]
web = [
"Jinja2",
"aioauth-client",
@ -80,7 +86,8 @@ web = [
"aiohttp_session",
"aiohttp_security",
"cryptography",
"requests-unixsocket", # required by unix socket support
"requests-unixsocket2", # required by unix socket support
"setuptools", # required by aiohttp-apispec
]
[tool.flit.sdist]
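
Because cerberus and matplotlib are now optional, they are only pulled in through the new extras. Assuming the usual pip workflow for this package, installing them on demand might look like:

pip install 'ahriman[stats,validator]'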

View File

@ -12,6 +12,7 @@ Collection of the examples of docker compose configuration files, which covers s
* [Index](index): repository with index page generator enabled.
* [Multi repo](multirepo): run web service with two separated repositories.
* [OAuth](oauth): web service with OAuth (GitHub provider) authentication enabled.
* [PAM](pam): web service with PAM authentication enabled.
* [Pull](pull): normal service, but in addition with pulling packages from another source (e.g. GitHub repository).
* [Sign](sign): create repository with database signing.
* [Web](web): simple web service with authentication enabled.

View File

@ -1,6 +1,7 @@
# Index
1. Setup repository named `ahriman-demo` with architecture `x86_64`.
2. Generate index page.
2. Generate index page and RSS feed.
3. Repository is available at `http://localhost:8080/repo`.
4. Index page is available at `http://localhost:8080/repo/ahriman-demo/x86_64/index.html`.
5. RSS feed is available at `http://localhost:8080/repo/ahriman-demo/x86_64/rss.xml`.

View File

@ -1,6 +1,12 @@
[report]
target = html
target = html rss
[html]
path = /var/lib/ahriman/ahriman/repository/ahriman-demo/x86_64/index.html
path = ${repository:root}/repository/ahriman-demo/x86_64/index.html
link_path = http://localhost:8080/repo/ahriman-demo/x86_64
rss_url = ${html:link_path}/rss.xml
[rss]
link_path = ${html:link_path}
path = ${repository:root}/repository/ahriman-demo/x86_64/rss.xml
rss_url = ${html:link_path}/rss.xml
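
The recipe now relies on configuration interpolation instead of hard-coded paths. With ${repository:root} pointing at /var/lib/ahriman/ahriman (the value previously spelled out above), the [rss] section would resolve roughly to:

[rss]
link_path = http://localhost:8080/repo/ahriman-demo/x86_64
path = /var/lib/ahriman/ahriman/repository/ahriman-demo/x86_64/rss.xml
rss_url = http://localhost:8080/repo/ahriman-demo/x86_64/rss.xml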

recipes/pam/README.md Normal file
View File

@ -0,0 +1,6 @@
# PAM
1. Create system user `demo` with password from `AHRIMAN_PASSWORD` environment variable and group `wheel`.
2. Setup repository named `ahriman-demo` with architecture `x86_64`.
3. Start web server at port `8080`.
4. Repository is available at `http://localhost:8080/repo`.
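
A minimal way to try this recipe, assuming docker compose v2 and the password supplied through the environment variable referenced by the compose file:

AHRIMAN_PASSWORD=demo-password docker compose up -d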

recipes/pam/compose.yml Normal file
View File

@ -0,0 +1,63 @@
services:
backend:
image: arcan1s/ahriman:edge
privileged: true
environment:
AHRIMAN_DEBUG: yes
AHRIMAN_OUTPUT: console
AHRIMAN_PASSWORD: ${AHRIMAN_PASSWORD}
AHRIMAN_PORT: 8080
AHRIMAN_PRESETUP_COMMAND: useradd -d / -G wheel -M demo; (cat /run/secrets/password; echo; cat /run/secrets/password) | passwd demo
AHRIMAN_REPOSITORY: ahriman-demo
AHRIMAN_UNIX_SOCKET: /var/lib/ahriman/ahriman/ahriman.sock
configs:
- source: service
target: /etc/ahriman.ini.d/99-settings.ini
secrets:
- password
volumes:
- type: volume
source: repository
target: /var/lib/ahriman
volume:
nocopy: true
healthcheck:
test: curl --fail --silent --output /dev/null http://backend:8080/api/v1/info
interval: 10s
start_period: 30s
command: web
frontend:
image: nginx
ports:
- 8080:80
configs:
- source: nginx
target: /etc/nginx/conf.d/default.conf
volumes:
- type: volume
source: repository
target: /srv
read_only: true
volume:
nocopy: true
configs:
nginx:
file: nginx.conf
service:
file: service.ini
secrets:
password:
environment: AHRIMAN_PASSWORD
volumes:
repository:

recipes/pam/nginx.conf Normal file
View File

@ -0,0 +1,18 @@
server {
listen 80;
location /repo {
rewrite ^/repo/(.*) /$1 break;
autoindex on;
root /srv/ahriman/repository;
}
location / {
proxy_set_header Host $host;
proxy_set_header X-Real-IP $remote_addr;
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
proxy_set_header X-Forwarded-Proto $scheme;
proxy_pass http://backend:8080;
}
}

recipes/pam/service.ini Normal file
View File

@ -0,0 +1,3 @@
[auth]
target = pam
full_access_group = wheel

View File

@ -17,4 +17,4 @@
# You should have received a copy of the GNU General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
#
__version__ = "2.13.3"
__version__ = "2.14.1"

View File

@ -19,16 +19,16 @@
#
# pylint: disable=too-many-lines
import argparse
import tempfile
from pathlib import Path
from typing import TypeVar
from ahriman import __version__
from ahriman.application import handlers
from ahriman.core.util import enum_values, extract_user
from ahriman.core.utils import enum_values, extract_user
from ahriman.models.action import Action
from ahriman.models.build_status import BuildStatusEnum
from ahriman.models.event import EventType
from ahriman.models.log_handler import LogHandler
from ahriman.models.package_source import PackageSource
from ahriman.models.sign_settings import SignSettings
@ -73,8 +73,7 @@ def _parser() -> argparse.ArgumentParser:
parser.add_argument("-c", "--configuration", help="configuration path", type=Path,
default=Path("/") / "etc" / "ahriman.ini")
parser.add_argument("--force", help="force run, remove file lock", action="store_true")
parser.add_argument("-l", "--lock", help="lock file", type=Path,
default=Path(tempfile.gettempdir()) / "ahriman.lock")
parser.add_argument("-l", "--lock", help="lock file", type=Path, default=Path("ahriman.pid"))
parser.add_argument("--log-handler", help="explicit log handler specification. If none set, the handler will be "
"guessed from environment",
type=LogHandler, choices=enum_values(LogHandler))
@ -121,6 +120,7 @@ def _parser() -> argparse.ArgumentParser:
_set_repo_report_parser(subparsers)
_set_repo_restore_parser(subparsers)
_set_repo_sign_parser(subparsers)
_set_repo_statistics_parser(subparsers)
_set_repo_status_update_parser(subparsers)
_set_repo_sync_parser(subparsers)
_set_repo_tree_parser(subparsers)
@ -446,7 +446,7 @@ def _set_patch_list_parser(root: SubParserAction) -> argparse.ArgumentParser:
"""
parser = root.add_parser("patch-list", help="list patch sets",
description="list available patches for the package", formatter_class=_formatter)
parser.add_argument("package", help="package base", nargs="?")
parser.add_argument("package", help="package base")
parser.add_argument("-e", "--exit-code", help="return non-zero exit status if result is empty", action="store_true")
parser.add_argument("-v", "--variable", help="if set, show only patches for specified PKGBUILD variables",
action="append")
@ -537,6 +537,9 @@ def _set_repo_check_parser(root: SubParserAction) -> argparse.ArgumentParser:
parser.add_argument("--changes", help="calculate changes from the latest known commit if available. "
"Only applicable in dry run mode",
action=argparse.BooleanOptionalAction, default=True)
parser.add_argument("--check-files", help="enable or disable checking of broken dependencies "
"(e.g. dynamically linked libraries or modules directories)",
action=argparse.BooleanOptionalAction, default=True)
parser.add_argument("-e", "--exit-code", help="return non-zero exit status if result is empty", action="store_true")
parser.add_argument("--vcs", help="fetch actual version of VCS packages",
action=argparse.BooleanOptionalAction, default=True)
@ -605,6 +608,9 @@ def _set_repo_daemon_parser(root: SubParserAction) -> argparse.ArgumentParser:
parser.add_argument("--changes", help="calculate changes from the latest known commit if available. "
"Only applicable in dry run mode",
action=argparse.BooleanOptionalAction, default=True)
parser.add_argument("--check-files", help="enable or disable checking of broken dependencies "
"(e.g. dynamically linked libraries or modules directories)",
action=argparse.BooleanOptionalAction, default=True)
parser.add_argument("--dependencies", help="process missing package dependencies",
action=argparse.BooleanOptionalAction, default=True)
parser.add_argument("--dry-run", help="just perform check for updates, same as check command", action="store_true")
@ -622,8 +628,7 @@ def _set_repo_daemon_parser(root: SubParserAction) -> argparse.ArgumentParser:
parser.add_argument("-y", "--refresh", help="download fresh package databases from the mirror before actions, "
"-yy to force refresh even if up to date",
action="count", default=False)
parser.set_defaults(handler=handlers.Daemon, exit_code=False,
lock=Path(tempfile.gettempdir()) / "ahriman-daemon.lock", package=[])
parser.set_defaults(handler=handlers.Daemon, exit_code=False, lock=Path("ahriman-daemon.pid"), package=[])
return parser
@ -732,6 +737,30 @@ def _set_repo_sign_parser(root: SubParserAction) -> argparse.ArgumentParser:
return parser
def _set_repo_statistics_parser(root: SubParserAction) -> argparse.ArgumentParser:
"""
add parser for repository statistics subcommand
Args:
root(SubParserAction): subparsers for the commands
Returns:
argparse.ArgumentParser: created argument parser
"""
parser = root.add_parser("repo-statistics", help="repository statistics",
description="fetch repository statistics", formatter_class=_formatter)
parser.add_argument("package", help="fetch only events for the specified package", nargs="?")
parser.add_argument("--chart", help="create updates chart and save it to the specified path", type=Path)
parser.add_argument("-e", "--event", help="event type filter",
type=EventType, choices=enum_values(EventType), default=EventType.PackageUpdated)
parser.add_argument("--from-date", help="only fetch events which are newer than the date")
parser.add_argument("--limit", help="limit response by specified amount of events", type=int, default=-1)
parser.add_argument("--offset", help="skip specified amount of events", type=int, default=0)
parser.add_argument("--to-date", help="only fetch events which are older than the date")
parser.set_defaults(handler=handlers.Statistics, lock=None, quiet=True, report=False, unsafe=True)
return parser
def _set_repo_status_update_parser(root: SubParserAction) -> argparse.ArgumentParser:
"""
add parser for repository status update subcommand
@ -826,6 +855,9 @@ def _set_repo_update_parser(root: SubParserAction) -> argparse.ArgumentParser:
parser.add_argument("--changes", help="calculate changes from the latest known commit if available. "
"Only applicable in dry run mode",
action=argparse.BooleanOptionalAction, default=True)
parser.add_argument("--check-files", help="enable or disable checking of broken dependencies "
"(e.g. dynamically linked libraries or modules directories)",
action=argparse.BooleanOptionalAction, default=True)
parser.add_argument("--dependencies", help="process missing package dependencies",
action=argparse.BooleanOptionalAction, default=True)
parser.add_argument("--dry-run", help="just perform check for updates, same as check command", action="store_true")
@ -871,7 +903,7 @@ def _set_service_clean_parser(root: SubParserAction) -> argparse.ArgumentParser:
action=argparse.BooleanOptionalAction, default=False)
parser.add_argument("--pacman", help="clear directory with pacman local database cache",
action=argparse.BooleanOptionalAction, default=False)
parser.set_defaults(handler=handlers.Clean, quiet=True, unsafe=True)
parser.set_defaults(handler=handlers.Clean, lock=None, quiet=True, unsafe=True)
return parser
@ -1130,8 +1162,8 @@ def _set_web_parser(root: SubParserAction) -> argparse.ArgumentParser:
argparse.ArgumentParser: created argument parser
"""
parser = root.add_parser("web", help="web server", description="start web server", formatter_class=_formatter)
parser.set_defaults(handler=handlers.Web, architecture="", lock=Path(tempfile.gettempdir()) / "ahriman-web.lock",
report=False, repository="", parser=_parser)
parser.set_defaults(handler=handlers.Web, architecture="", lock=Path("ahriman-web.pid"), report=False,
repository="", parser=_parser)
return parser
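
The repo-statistics parser added earlier in this file accepts optional date filters and a chart path. A hypothetical invocation that plots update frequency per package since the beginning of August could be:

ahriman repo-statistics --from-date 2024-08-01 --chart /tmp/updates.png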

View File

@ -62,10 +62,13 @@ class Application(ApplicationPackages, ApplicationRepository):
"""
known_packages: set[str] = set()
# local set
# this action is not really needed in case if ``alpm.use_ahriman_cache`` set to yes, because pacman
# will eventually contain all the local packages
for base in self.repository.packages():
for package, properties in base.packages.items():
known_packages.add(package)
known_packages.update(properties.provides)
# known pacman databases
known_packages.update(self.repository.pacman.packages())
return known_packages
@ -117,8 +120,7 @@ class Application(ApplicationPackages, ApplicationRepository):
process_dependencies(bool): if no set, dependencies will not be processed
Returns:
list[Package]: updated packages list. Packager for dependencies will be copied from
original package
list[Package]: updated packages list. Packager for dependencies will be copied from the original package
Examples:
In most cases, in order to avoid build failure, it is required to add missing packages, which can be
@ -158,8 +160,7 @@ class Application(ApplicationPackages, ApplicationRepository):
package = Package.from_aur(package_name, username)
with_dependencies[package.base] = package
# register package in local database
self.database.package_base_update(package)
# register package in the database
self.repository.reporter.set_unknown(package)
return list(with_dependencies.values())

View File

@ -27,7 +27,7 @@ from typing import Any
from ahriman.application.application.application_properties import ApplicationProperties
from ahriman.core.build_tools.sources import Sources
from ahriman.core.exceptions import UnknownPackageError
from ahriman.core.util import package_like
from ahriman.core.utils import package_like
from ahriman.models.package import Package
from ahriman.models.package_source import PackageSource
from ahriman.models.result import Result
@ -65,7 +65,7 @@ class ApplicationPackages(ApplicationProperties):
"""
package = Package.from_aur(source, username)
self.database.build_queue_insert(package)
self.database.package_base_update(package)
self.reporter.set_unknown(package)
def _add_directory(self, source: str, *_: Any) -> None:
"""
@ -139,7 +139,7 @@ class ApplicationPackages(ApplicationProperties):
"""
package = Package.from_official(source, self.repository.pacman, username)
self.database.build_queue_insert(package)
self.database.package_base_update(package)
self.reporter.set_unknown(package)
def add(self, names: Iterable[str], source: PackageSource, username: str | None = None) -> None:
"""

View File

@ -21,6 +21,7 @@ from ahriman.core.configuration import Configuration
from ahriman.core.database import SQLite
from ahriman.core.log import LazyLogging
from ahriman.core.repository import Repository
from ahriman.core.status import Client
from ahriman.models.pacman_synchronization import PacmanSynchronization
from ahriman.models.repository_id import RepositoryId
@ -63,3 +64,13 @@ class ApplicationProperties(LazyLogging):
str: repository architecture
"""
return self.repository_id.architecture
@property
def reporter(self) -> Client:
"""
instance of the web/database client
Returns:
Client: repository reporter
"""
return self.repository.reporter

View File

@ -39,15 +39,13 @@ class ApplicationRepository(ApplicationProperties):
Args:
packages(Iterable[Package]): list of packages to retrieve changes
"""
last_commit_hashes = self.database.hashes_get()
for package in packages:
last_commit_sha = last_commit_hashes.get(package.base)
last_commit_sha = self.reporter.package_changes_get(package.base).last_commit_sha
if last_commit_sha is None:
continue # skip check in case if we can't calculate diff
changes = self.repository.package_changes(package, last_commit_sha)
self.repository.reporter.package_changes_set(package.base, changes)
self.repository.reporter.package_changes_update(package.base, changes)
def clean(self, *, cache: bool, chroot: bool, manual: bool, packages: bool, pacman: bool) -> None:
"""
@ -91,10 +89,7 @@ class ApplicationRepository(ApplicationProperties):
packages(Iterable[str]): only sign specified packages
"""
# copy to prebuilt directory
for package in self.repository.packages():
# no one requested this package
if packages and package.base not in packages:
continue
for package in self.repository.packages(packages):
for archive in package.packages.values():
if archive.filepath is None:
self.logger.warning("filepath is empty for %s", package.base)
@ -179,7 +174,7 @@ class ApplicationRepository(ApplicationProperties):
return result
def updates(self, filter_packages: Iterable[str], *,
aur: bool, local: bool, manual: bool, vcs: bool) -> list[Package]:
aur: bool, local: bool, manual: bool, vcs: bool, check_files: bool) -> list[Package]:
"""
get list of packages to run update process
@ -189,6 +184,7 @@ class ApplicationRepository(ApplicationProperties):
local(bool): enable or disable checking of local packages for updates
manual(bool): include or exclude manual updates
vcs(bool): enable or disable checking of VCS packages
check_files(bool): check for broken dependencies
Returns:
list[Package]: list of out-of-dated packages
@ -201,5 +197,7 @@ class ApplicationRepository(ApplicationProperties):
updates.update({package.base: package for package in self.repository.updates_local(vcs=vcs)})
if manual:
updates.update({package.base: package for package in self.repository.updates_manual()})
if check_files:
updates.update({package.base: package for package in self.repository.updates_dependencies(filter_packages)})
return [package for _, package in sorted(updates.items())]
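
Since check_files is keyword-only like its siblings, a call with the new flag is, as a sketch, along the lines of:

packages = application.updates([], aur=True, local=True, manual=True, vcs=True, check_files=True)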

View File

@ -38,6 +38,7 @@ from ahriman.application.handlers.service_updates import ServiceUpdates
from ahriman.application.handlers.setup import Setup
from ahriman.application.handlers.shell import Shell
from ahriman.application.handlers.sign import Sign
from ahriman.application.handlers.statistics import Statistics
from ahriman.application.handlers.status import Status
from ahriman.application.handlers.status_update import StatusUpdate
from ahriman.application.handlers.structure import Structure

View File

@ -50,12 +50,13 @@ class Add(Handler):
application.add(args.package, args.source, args.username)
patches = [PkgbuildPatch.from_env(patch) for patch in args.variable] if args.variable is not None else []
for package in args.package: # for each requested package insert patch
application.database.patches_insert(package, patches)
for patch in patches:
application.reporter.package_patches_update(package, patch)
if not args.now:
return
packages = application.updates(args.package, aur=False, local=False, manual=True, vcs=False)
packages = application.updates(args.package, aur=False, local=False, manual=True, vcs=False, check_files=False)
packages = application.with_dependencies(packages, process_dependencies=args.dependencies)
packagers = Packagers(args.username, {package.base: package.packager for package in packages})

View File

@ -18,10 +18,10 @@
# along with this program. If not, see <http://www.gnu.org/licenses/>.
#
import argparse
import pwd
import tarfile
from pathlib import Path
from tarfile import TarFile
from pwd import getpwuid
from ahriman.application.handlers.handler import Handler
from ahriman.core.configuration import Configuration
@ -49,7 +49,7 @@ class Backup(Handler):
report(bool): force enable or disable reporting
"""
backup_paths = Backup.get_paths(configuration)
with TarFile(args.path, mode="w") as archive: # well we don't actually use compression
with tarfile.open(args.path, mode="w") as archive: # well we don't actually use compression
for backup_path in backup_paths:
archive.add(backup_path)
@ -77,7 +77,7 @@ class Backup(Handler):
# gnupg home with imported keys
uid, _ = repository_paths.root_owner
system_user = pwd.getpwuid(uid)
system_user = getpwuid(uid)
gnupg_home = Path(system_user.pw_dir) / ".gnupg"
if gnupg_home.is_dir():
paths.add(gnupg_home)

View File

@ -56,4 +56,4 @@ class Change(Handler):
ChangesPrinter(changes)(verbose=True, separator="")
Change.check_if_empty(args.exit_code, changes.is_empty)
case Action.Remove:
client.package_changes_set(args.package, Changes())
client.package_changes_update(args.package, Changes())

View File

@ -59,7 +59,7 @@ class Handler:
repository_id(RepositoryId): repository unique identifier
Returns:
bool: True on success, False otherwise
bool: ``True`` on success, ``False`` otherwise
"""
try:
configuration = Configuration.from_path(args.configuration, repository_id)
@ -129,7 +129,7 @@ class Handler:
check condition and flag and raise ExitCode exception in case if it is enabled and condition match
Args:
enabled(bool): if False no check will be performed
enabled(bool): if ``False`` no check will be performed
predicate(bool): indicates condition on which exception should be thrown
Raises:

View File

@ -98,12 +98,14 @@ class Patch(Handler):
PkgbuildPatch: created patch for the PKGBUILD function
"""
if patch_path is None:
# pylint: disable=bad-builtin
print("Post new function or variable value below. Press Ctrl-D to finish:", file=sys.stderr)
patch = "".join(list(sys.stdin))
else:
patch = patch_path.read_text(encoding="utf8")
patch = patch.strip() # remove spaces around the patch
return PkgbuildPatch(variable, patch)
# remove spaces around the patch and parse to correct type
parsed = PkgbuildPatch.parse(patch.strip())
return PkgbuildPatch(variable, parsed)
@staticmethod
def patch_set_create(application: Application, package_base: str, patch: PkgbuildPatch) -> None:
@ -115,25 +117,28 @@ class Patch(Handler):
package_base(str): package base
patch(PkgbuildPatch): patch descriptor
"""
application.database.patches_insert(package_base, [patch])
application.reporter.package_patches_update(package_base, patch)
@staticmethod
def patch_set_list(application: Application, package_base: str | None, variables: list[str] | None,
def patch_set_list(application: Application, package_base: str, variables: list[str] | None,
exit_code: bool) -> None:
"""
list patches available for the package base
Args:
application(Application): application instance
package_base(str | None): package base
package_base(str): package base
variables(list[str] | None): extract patches only for specified PKGBUILD variables
exit_code(bool): exit with error on empty search result
"""
patches = application.database.patches_list(package_base, variables)
patches = [
patch
for patch in application.reporter.package_patches_get(package_base, None)
if variables is None or patch.key in variables
]
Patch.check_if_empty(exit_code, not patches)
for base, patch in patches.items():
PatchPrinter(base, patch)(verbose=True, separator=" = ")
PatchPrinter(package_base, patches)(verbose=True, separator=" = ")
@staticmethod
def patch_set_remove(application: Application, package_base: str, variables: list[str] | None) -> None:
@ -145,4 +150,8 @@ class Patch(Handler):
package_base(str): package base
variables(list[str] | None): remove patches only for specified PKGBUILD variables
"""
application.database.patches_remove(package_base, variables)
if variables is not None:
for variable in variables: # iterate over single variable
application.reporter.package_patches_remove(package_base, variable)
else:
application.reporter.package_patches_remove(package_base, None) # just pass as is
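
These handler changes pair with the CLI change that makes the package argument of patch-list mandatory. With a placeholder package base, listing only the pkgver patches would look like:

ahriman patch-list some-package -v pkgver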

View File

@ -76,7 +76,7 @@ class Rebuild(Handler):
if from_database:
return [
package
for (package, last_status) in application.database.packages_get()
for (package, last_status) in application.reporter.package_get(None)
if status is None or last_status.status == status
]

View File

@ -18,8 +18,7 @@
# along with this program. If not, see <http://www.gnu.org/licenses/>.
#
import argparse
from tarfile import TarFile
import tarfile
from ahriman.application.handlers.handler import Handler
from ahriman.core.configuration import Configuration
@ -45,5 +44,5 @@ class Restore(Handler):
configuration(Configuration): configuration instance
report(bool): force enable or disable reporting
"""
with TarFile(args.path) as archive:
archive.extractall(path=args.output)
with tarfile.open(args.path) as archive:
archive.extractall(path=args.output) # nosec

View File

@ -43,7 +43,7 @@ class Search(Handler):
SORT_FIELDS = {
field.name
for field in fields(AURPackage)
if field.default_factory is not list # type: ignore[comparison-overlap]
if field.default_factory is not list
}
@classmethod

View File

@ -0,0 +1,170 @@
#
# Copyright (c) 2021-2024 ahriman team.
#
# This file is part of ahriman
# (see https://github.com/arcan1s/ahriman).
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
#
import argparse
import datetime
import itertools
from collections.abc import Callable
from pathlib import Path
from ahriman.application.application import Application
from ahriman.application.handlers.handler import Handler
from ahriman.core.configuration import Configuration
from ahriman.core.formatters import EventStatsPrinter, PackageStatsPrinter
from ahriman.core.utils import pretty_datetime
from ahriman.models.event import Event
from ahriman.models.repository_id import RepositoryId
class Statistics(Handler):
"""
repository statistics handler
"""
ALLOW_MULTI_ARCHITECTURE_RUN = False # conflicting io
@classmethod
def run(cls, args: argparse.Namespace, repository_id: RepositoryId, configuration: Configuration, *,
report: bool) -> None:
"""
callback for command line
Args:
args(argparse.Namespace): command line args
repository_id(RepositoryId): repository unique identifier
configuration(Configuration): configuration instance
report(bool): force enable or disable reporting
"""
application = Application(repository_id, configuration, report=True)
from_date = to_date = None
if (value := args.from_date) is not None:
from_date = datetime.datetime.fromisoformat(value).timestamp()
if (value := args.to_date) is not None:
to_date = datetime.datetime.fromisoformat(value).timestamp()
events = application.reporter.event_get(args.event, args.package, from_date, to_date, args.limit, args.offset)
match args.package:
case None:
Statistics.stats_per_package(args.event, events, args.chart)
case _:
Statistics.stats_for_package(args.event, events, args.chart)
@staticmethod
def event_stats(event_type: str, events: list[Event]) -> None:
"""
calculate event stats
Args:
event_type(str): event type
events(list[Event]): list of events
"""
times = [event.get("took") for event in events if event.get("took") is not None]
EventStatsPrinter(f"{event_type} duration, s", times)(verbose=True)
@staticmethod
def plot_packages(event_type: str, events: dict[str, int], path: Path) -> None:
"""
plot packages frequency
Args:
event_type(str): event type
events(dict[str, int]): number of events per package base
path(Path): path to save plot
"""
from matplotlib import pyplot as plt
x, y = list(events.keys()), list(events.values())
plt.bar(x, y)
plt.xlabel("Package base")
plt.ylabel("Frequency")
plt.title(f"Frequency of the {event_type} event per package")
plt.savefig(path)
@staticmethod
def plot_times(event_type: str, events: list[Event], path: Path) -> None:
"""
plot events timeline
Args:
event_type(str): event type
events(list[Event]): list of events
path(Path): path to save plot
"""
from matplotlib import pyplot as plt
figure = plt.figure()
x, y = zip(*[(pretty_datetime(event.created), event.get("took")) for event in events])
plt.plot(x, y)
plt.xlabel("Event timestamp")
plt.ylabel("Duration, s")
plt.title(f"Duration of the {event_type} event")
figure.autofmt_xdate()
plt.savefig(path)
@staticmethod
def stats_for_package(event_type: str, events: list[Event], chart_path: Path | None) -> None:
"""
calculate statistics for a package
Args:
event_type(str): event type
events(list[Event]): list of events
chart_path(Path): path to save plot if any
"""
# event statistics
Statistics.event_stats(event_type, events)
# chart if enabled
if chart_path is not None:
Statistics.plot_times(event_type, events, chart_path)
@staticmethod
def stats_per_package(event_type: str, events: list[Event], chart_path: Path | None) -> None:
"""
calculate overall statistics
Args:
event_type(str): event type
events(list[Event]): list of events
chart_path(Path): path to save plot if any
"""
key: Callable[[Event], str] = lambda event: event.object_id
by_object_id = {
object_id: len(list(related))
for object_id, related in itertools.groupby(sorted(events, key=key), key=key)
}
# distribution per package
PackageStatsPrinter(by_object_id)(verbose=True)
EventStatsPrinter(f"{event_type} frequency", list(by_object_id.values()))(verbose=True)
# event statistics
Statistics.event_stats(event_type, events)
# chart if enabled
if chart_path is not None:
Statistics.plot_packages(event_type, by_object_id, chart_path)
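
The per-package aggregation above is a plain sort-then-group count. A standalone sketch of the same idiom with made-up identifiers instead of Event objects:

import itertools

object_ids = ["vim", "gcc", "vim", "vim", "gcc"]
# group equal identifiers together (groupby requires sorted input) and count each group
by_object_id = {
    object_id: len(list(related))
    for object_id, related in itertools.groupby(sorted(object_ids))
}
print(by_object_id)  # {'gcc': 2, 'vim': 3}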

View File

@ -51,12 +51,8 @@ class StatusUpdate(Handler):
match args.action:
case Action.Update if args.package:
# update packages statuses
packages = application.repository.packages()
for base in args.package:
if (local := next((package for package in packages if package.base == base), None)) is not None:
client.package_add(local, args.status)
else:
client.package_update(base, args.status)
for package in args.package:
client.package_update(package, args.status)
case Action.Update:
# update service status
client.status_update(args.status)

View File

@ -48,7 +48,8 @@ class Update(Handler):
application = Application(repository_id, configuration, report=report, refresh_pacman_database=args.refresh)
application.on_start()
packages = application.updates(args.package, aur=args.aur, local=args.local, manual=args.manual, vcs=args.vcs)
packages = application.updates(args.package, aur=args.aur, local=args.local, manual=args.manual, vcs=args.vcs,
check_files=args.check_files)
if args.dry_run: # some check specific actions
if args.changes: # generate changes if requested
application.changes(packages)
@ -76,5 +77,5 @@ class Update(Handler):
Callable[[str], None]: in case if dry_run is set it will return print, logger otherwise
"""
def inner(line: str) -> None:
return print(line) if dry_run else application.logger.info(line)
return print(line) if dry_run else application.logger.info(line) # pylint: disable=bad-builtin
return inner

View File

@ -25,7 +25,6 @@ from typing import Any
from ahriman.application.handlers.handler import Handler
from ahriman.core.configuration import Configuration
from ahriman.core.configuration.schema import CONFIGURATION_SCHEMA, ConfigurationSchema
from ahriman.core.configuration.validator import Validator
from ahriman.core.exceptions import ExtensionError
from ahriman.core.formatters import ValidationPrinter
from ahriman.core.triggers import TriggerLoader
@ -51,6 +50,8 @@ class Validate(Handler):
configuration(Configuration): configuration instance
report(bool): force enable or disable reporting
"""
from ahriman.core.configuration.validator import Validator
schema = Validate.schema(repository_id, configuration)
validator = Validator(configuration=configuration, schema=schema)

View File

@ -18,7 +18,10 @@
# along with this program. If not, see <http://www.gnu.org/licenses/>.
#
import argparse
import fcntl
import os
from io import TextIOWrapper
from pathlib import Path
from types import TracebackType
from typing import Literal, Self
@ -27,8 +30,8 @@ from ahriman import __version__
from ahriman.core.configuration import Configuration
from ahriman.core.exceptions import DuplicateRunError
from ahriman.core.log import LazyLogging
from ahriman.core.status.client import Client
from ahriman.core.util import check_user
from ahriman.core.status import Client
from ahriman.core.utils import check_user
from ahriman.models.build_status import BuildStatusEnum
from ahriman.models.repository_id import RepositoryId
from ahriman.models.waiter import Waiter
@ -36,7 +39,7 @@ from ahriman.models.waiter import Waiter
class Lock(LazyLogging):
"""
wrapper for application lock file
wrapper for application lock file. Credits for idea to https://github.com/bmhatfield/python-pidfile.git
Attributes:
force(bool): remove lock file on start if any
@ -56,7 +59,7 @@ class Lock(LazyLogging):
>>> configuration = Configuration()
>>> try:
>>> with Lock(args, RepositoryId("x86_64", "aur-clone"), configuration):
>>> perform_actions()
>>> do_something()
>>> except Exception as exception:
>>> handle_exceptions(exception)
"""
@ -70,8 +73,15 @@ class Lock(LazyLogging):
repository_id(RepositoryId): repository unique identifier
configuration(Configuration): configuration instance
"""
self.path: Path | None = \
args.lock.with_stem(f"{args.lock.stem}_{repository_id.id}") if args.lock is not None else None
self.path: Path | None = None
if args.lock is not None:
self.path = args.lock
if not repository_id.is_empty:
self.path = self.path.with_stem(f"{args.lock.stem}_{repository_id.id}")
if not self.path.is_absolute():
# prepend full path to the lock file
self.path = Path("/") / "run" / "ahriman" / self.path
self._pid_file: TextIOWrapper | None = None
self.force: bool = args.force
self.unsafe: bool = args.unsafe
@ -80,6 +90,72 @@ class Lock(LazyLogging):
self.paths = configuration.repository_paths
self.reporter = Client.load(repository_id, configuration, report=args.report)
@staticmethod
def perform_lock(fd: int) -> bool:
"""
perform file lock
Args:
fd(int): file descriptor
Returns:
bool: ``True`` in case if file is locked and ``False`` otherwise
"""
try:
fcntl.flock(fd, fcntl.LOCK_EX | fcntl.LOCK_NB)
except OSError:
return False
return True
def _open(self) -> None:
"""
create lock file
"""
if self.path is None:
return
self._pid_file = self.path.open("a+")
def _watch(self) -> bool:
"""
watch until lock disappear
Returns:
bool: ``True`` in case if file is locked and ``False`` otherwise
"""
# there are reasons why we are not using inotify here. First of all, if we used it, it would lead to
# race conditions, because multiple processes would be notified at the same time. Secondly, it is a good
# library, but platform-specific, and we only need to check if the file exists
if self._pid_file is None:
return False
waiter = Waiter(self.wait_timeout)
return bool(waiter.wait(lambda fd: not self.perform_lock(fd), self._pid_file.fileno()))
def _write(self, *, is_locked: bool = False) -> None:
"""
write pid to the lock file
Args:
is_locked(bool, optional): indicates if file was already locked or not (Default value = False)
Raises:
DuplicateRunError: if it cannot lock PID file
"""
if self._pid_file is None:
return
if not is_locked:
if not self.perform_lock(self._pid_file.fileno()):
raise DuplicateRunError
self._pid_file.seek(0) # reset position and remove file content if any
self._pid_file.truncate()
self._pid_file.write(str(os.getpid())) # write current pid
self._pid_file.flush() # flush data to disk
self._pid_file.seek(0) # reset position again
def check_user(self) -> None:
"""
check if current user is actually owner of ahriman root
@ -100,46 +176,33 @@ class Lock(LazyLogging):
"""
remove lock file
"""
if self.path is None:
return
self.path.unlink(missing_ok=True)
if self._pid_file is not None: # close file descriptor
try:
self._pid_file.close()
except IOError:
pass # suppress any IO errors occur
if self.path is not None: # remove lock file
self.path.unlink(missing_ok=True)
def create(self) -> None:
def lock(self) -> None:
"""
create lock file
Raises:
DuplicateRunError: if lock exists and no force flag supplied
create pid file
"""
if self.path is None:
return
try:
self.path.touch(exist_ok=self.force)
except FileExistsError:
raise DuplicateRunError from None
def watch(self) -> None:
"""
watch until lock disappear
"""
# there are reasons why we are not using inotify here. First of all, if we would use it, it would bring to
# race conditions because multiple processes will be notified in the same time. Secondly, it is good library,
# but platform-specific, and we only need to check if file exists
if self.path is None:
return
waiter = Waiter(self.wait_timeout)
waiter.wait(self.path.is_file)
if self.force: # remove lock if force flag is set
self.clear()
self._open()
is_locked = self._watch()
self._write(is_locked=is_locked)
def __enter__(self) -> Self:
"""
default workflow is the following:
#. Check user UID
#. Check if there is lock file
#. Check web status watcher status
#. Open lock file
#. Wait for lock file to be free
#. Create lock file and directory tree
#. Write current PID to the lock file
#. Report to status page if enabled
Returns:
@ -147,8 +210,7 @@ class Lock(LazyLogging):
"""
self.check_user()
self.check_version()
self.watch()
self.create()
self.lock()
self.reporter.status_update(BuildStatusEnum.Building)
return self
@ -163,7 +225,7 @@ class Lock(LazyLogging):
exc_tb(TracebackType): exception traceback if any
Returns:
Literal[False]: always False (do not suppress any exception)
Literal[False]: always ``False`` (do not suppress any exception)
"""
self.clear()
status = BuildStatusEnum.Success if exc_val is None else BuildStatusEnum.Failed
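A hedged usage sketch of the reworked locking flow; the module path and the constructor signature are assumptions and are not shown in this diff:
from ahriman.application.lock import Lock  # assumed module path
def process(args, repository_id, configuration):
    # __enter__ checks the user, opens the PID file, waits for and acquires the lock,
    # writes the current PID and reports the Building status
    with Lock(args, repository_id, configuration):  # assumed constructor signature
        ...  # perform repository operations here
    # __exit__ clears the lock and reports Success or Failed depending on the exception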


@ -38,12 +38,12 @@ class _Context:
"""
self._content: dict[str, Any] = {}
def get(self, key: ContextKey[T]) -> T:
def get(self, key: ContextKey[T] | type[T]) -> T:
"""
get value for the specified key
Args:
key(ContextKey[T]): context key name
key(ContextKey[T] | type[T]): context key name
Returns:
T: value associated with the key
@ -52,29 +52,37 @@ class _Context:
KeyError: in case if the specified context variable was not found
ValueError: in case if type of value is not an instance of specified return type
"""
if not isinstance(key, ContextKey):
key = ContextKey.from_type(key)
if key.key not in self._content:
raise KeyError(key.key)
value = self._content[key.key]
if not isinstance(value, key.return_type):
raise ValueError(f"Value {value} is not an instance of {key.return_type}")
return value
def set(self, key: ContextKey[T], value: T) -> None:
def set(self, key: ContextKey[T] | type[T], value: T) -> None:
"""
set value for the specified key
Args:
key(ContextKey[T]): context key name
key(ContextKey[T] | type[T]): context key name
value(T): context value associated with the specified key
Raises:
KeyError: in case if the specified context variable already exists
ValueError: in case if type of value is not an instance of specified return type
"""
if not isinstance(key, ContextKey):
key = ContextKey.from_type(key)
if key.key in self._content:
raise KeyError(key.key)
if not isinstance(value, key.return_type):
raise ValueError(f"Value {value} is not an instance of {key.return_type}")
self._content[key.key] = value
def __iter__(self) -> Iterator[str]:
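An illustrative sketch of the bare-type fallback added to get() and set(): a plain type is converted into a ContextKey before lookup. The module paths and the generated key name are assumptions:
from ahriman.core.alpm.pacman import Pacman          # example value type
from ahriman.models.context_key import ContextKey    # assumed module path
key = ContextKey.from_type(Pacman)  # presumably equivalent to ContextKey("pacman", Pacman)
# hence context.set(Pacman, instance) and context.get(Pacman) behave like the explicit-key calls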


@ -17,25 +17,33 @@
# You should have received a copy of the GNU General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
#
import itertools
import shutil
import tarfile
from collections.abc import Callable, Generator
from collections.abc import Generator, Iterable
from functools import cached_property
from pathlib import Path
from pyalpm import DB, Handle, Package, SIG_PACKAGE, error as PyalpmError # type: ignore[import-not-found]
from pyalpm import DB, Handle, Package, SIG_DATABASE_OPTIONAL, SIG_PACKAGE_OPTIONAL # type: ignore[import-not-found]
from string import Template
from ahriman.core.alpm.pacman_database import PacmanDatabase
from ahriman.core.configuration import Configuration
from ahriman.core.log import LazyLogging
from ahriman.core.util import trim_package
from ahriman.core.utils import trim_package
from ahriman.models.pacman_synchronization import PacmanSynchronization
from ahriman.models.repository_id import RepositoryId
from ahriman.models.repository_paths import RepositoryPaths
class Pacman(LazyLogging):
"""
alpm wrapper
Attributes:
configuration(Configuration): configuration instance
refresh_database(PacmanSynchronization): synchronize local cache to remote
repository_id(RepositoryId): repository unique identifier
repository_paths(RepositoryPaths): repository paths instance
"""
def __init__(self, repository_id: RepositoryId, configuration: Configuration, *,
@ -48,8 +56,11 @@ class Pacman(LazyLogging):
configuration(Configuration): configuration instance
refresh_database(PacmanSynchronization): synchronize local cache to remote
"""
self.__create_handle_fn: Callable[[], Handle] = lambda: self.__create_handle(
repository_id, configuration, refresh_database=refresh_database)
self.configuration = configuration
self.repository_id = repository_id
self.repository_paths = configuration.repository_paths
self.refresh_database = refresh_database
@cached_property
def handle(self) -> Handle:
@ -59,40 +70,39 @@ class Pacman(LazyLogging):
Returns:
Handle: generated pyalpm handle instance
"""
return self.__create_handle_fn()
return self.__create_handle(refresh_database=self.refresh_database)
def __create_handle(self, repository_id: RepositoryId, configuration: Configuration, *,
refresh_database: PacmanSynchronization) -> Handle:
def __create_handle(self, *, refresh_database: PacmanSynchronization) -> Handle:
"""
create pacman handle
Args:
repository_id(RepositoryId): repository unique identifier
configuration(Configuration): configuration instance
refresh_database(PacmanSynchronization): synchronize local cache to remote
Returns:
Handle: fully initialized pacman handle
"""
root = configuration.getpath("alpm", "root")
pacman_root = configuration.getpath("alpm", "database")
use_ahriman_cache = configuration.getboolean("alpm", "use_ahriman_cache")
mirror = configuration.get("alpm", "mirror")
paths = configuration.repository_paths
database_path = paths.pacman if use_ahriman_cache else pacman_root
pacman_root = self.configuration.getpath("alpm", "database")
use_ahriman_cache = self.configuration.getboolean("alpm", "use_ahriman_cache")
database_path = self.repository_paths.pacman if use_ahriman_cache else pacman_root
root = self.configuration.getpath("alpm", "root")
handle = Handle(str(root), str(database_path))
for repository in configuration.getlist("alpm", "repositories"):
database = self.database_init(handle, repository, mirror, repository_id.architecture)
self.database_copy(handle, database, pacman_root, paths, use_ahriman_cache=use_ahriman_cache)
for repository in self.configuration.getlist("alpm", "repositories"):
database = self.database_init(handle, repository, self.repository_id.architecture)
self.database_copy(handle, database, pacman_root, use_ahriman_cache=use_ahriman_cache)
# install repository database too
local_database = self.database_init(handle, self.repository_id.name, self.repository_id.architecture)
self.database_copy(handle, local_database, pacman_root, use_ahriman_cache=use_ahriman_cache)
if use_ahriman_cache and refresh_database:
self.database_sync(handle, force=refresh_database == PacmanSynchronization.Force)
return handle
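A hedged sketch of the pyalpm bootstrap performed by __create_handle(), runnable on an Arch system with pyalpm installed; the paths and the mirror URL are placeholders:
from pyalpm import Handle, SIG_DATABASE_OPTIONAL, SIG_PACKAGE_OPTIONAL
handle = Handle("/", "/var/lib/pacman")  # placeholder root and database path
core = handle.register_syncdb("core", SIG_DATABASE_OPTIONAL | SIG_PACKAGE_OPTIONAL)
core.servers = ["https://geo.mirror.pkgbuild.com/core/os/x86_64"]  # placeholder mirror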
def database_copy(self, handle: Handle, database: DB, pacman_root: Path, paths: RepositoryPaths, *,
use_ahriman_cache: bool) -> None:
def database_copy(self, handle: Handle, database: DB, pacman_root: Path, *, use_ahriman_cache: bool) -> None:
"""
copy database from the operating system root to the ahriman local home
@ -100,7 +110,6 @@ class Pacman(LazyLogging):
handle(Handle): pacman handle which will be used for database copying
database(DB): pacman database instance to be copied
pacman_root(Path): operating system pacman root
paths(RepositoryPaths): repository paths instance
use_ahriman_cache(bool): use local ahriman cache instead of system one
"""
def repository_database(root: Path) -> Path:
@ -122,30 +131,36 @@ class Pacman(LazyLogging):
return  # database for some reason does not exist
self.logger.info("copy pacman database from operating system root to ahriman's home")
shutil.copy(src, dst)
paths.chown(dst)
self.repository_paths.chown(dst)
def database_init(self, handle: Handle, repository: str, mirror: str, architecture: str) -> DB:
def database_init(self, handle: Handle, repository: str, architecture: str) -> DB:
"""
create database instance from pacman handler and set its properties
Args:
handle(Handle): pacman handle which will be used for database initializing
repository(str): pacman repository name (e.g. core)
mirror(str): arch linux mirror url
architecture(str): repository architecture
Returns:
DB: loaded pacman database instance
"""
self.logger.info("loading pacman database %s", repository)
database: DB = handle.register_syncdb(repository, SIG_PACKAGE)
database: DB = handle.register_syncdb(repository, SIG_DATABASE_OPTIONAL | SIG_PACKAGE_OPTIONAL)
# replace variables in mirror address
variables = {
"arch": architecture,
"repo": repository,
}
database.servers = [Template(mirror).safe_substitute(variables)]
if repository != self.repository_id.name:
mirror = self.configuration.get("alpm", "mirror")
# replace variables in mirror address
variables = {
"arch": architecture,
"repo": repository,
}
server = Template(mirror).safe_substitute(variables)
else:
# special case, same database, use local storage instead
server = f"file://{self.repository_paths.repository}"
database.servers = [server]
return database
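An illustrative example of the mirror template substitution above, using a placeholder mirror URL:
from string import Template
mirror = "https://geo.mirror.pkgbuild.com/$repo/os/$arch"  # placeholder mirror template
server = Template(mirror).safe_substitute({"arch": "x86_64", "repo": "core"})
# server == "https://geo.mirror.pkgbuild.com/core/os/x86_64"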
@ -160,13 +175,55 @@ class Pacman(LazyLogging):
self.logger.info("refresh ahriman's home pacman database (force refresh %s)", force)
transaction = handle.init_transaction()
for database in handle.get_syncdbs():
try:
database.update(force)
except PyalpmError:
self.logger.exception("exception during update %s", database.name)
PacmanDatabase(database, self.configuration).sync(force=force)
transaction.release()
def package_get(self, package_name: str) -> Generator[Package, None, None]:
def files(self, packages: Iterable[str]) -> dict[str, set[str]]:
"""
extract list of known packages from the databases
Args:
packages(Iterable[str]): filter by package names
Returns:
dict[str, set[str]]: map of package name to its list of files
"""
def extract(tar: tarfile.TarFile, versions: dict[str, str]) -> Generator[tuple[str, set[str]], None, None]:
for package_name, version in versions.items():
path = Path(f"{package_name}-{version}") / "files"
try:
content = tar.extractfile(str(path))
except KeyError:
# in case the database and its files archive have desynced somehow, extractfile will raise
# KeyError because the entry doesn't exist
content = None
if content is None:
continue
# this is just an array of files; however, directories come with a trailing slash,
# which previously was removed by the conversion to ``pathlib.Path``
files = {filename.decode("utf8").rstrip().removesuffix("/") for filename in content.readlines()}
yield package_name, files
# sort is required for the following group by operation
descriptors = sorted(
(package for package_name in packages for package in self.package(package_name)),
key=lambda package: package.db.name
)
result: dict[str, set[str]] = {}
for database_name, pacman_packages in itertools.groupby(descriptors, lambda package: package.db.name):
database_file = self.repository_paths.pacman / "sync" / f"{database_name}.files.tar.gz"
if not database_file.is_file():
continue # no database file found
package_names = {package.name: package.version for package in pacman_packages}
with tarfile.open(database_file, "r:gz") as archive:
result.update(extract(archive, package_names))
return result
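A standalone sketch of reading the files list for a single package from a pacman files database archive, matching the extraction above; the database path, package name and version are placeholders:
import tarfile
def read_package_files(database_file: str, package: str, version: str) -> set[str]:
    with tarfile.open(database_file, "r:gz") as archive:
        content = archive.extractfile(f"{package}-{version}/files")
        if content is None:
            return set()
        # strip trailing newlines and the directory trailing slash, as above
        return {line.decode("utf8").rstrip().removesuffix("/") for line in content.readlines()}
# read_package_files("/var/lib/pacman/sync/core.files.tar.gz", "glibc", "2.39-1")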
def package(self, package_name: str) -> Generator[Package, None, None]:
"""
retrieve list of the packages from the repository by name


@ -0,0 +1,171 @@
#
# Copyright (c) 2021-2024 ahriman team.
#
# This file is part of ahriman
# (see https://github.com/arcan1s/ahriman).
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
#
import os
import shutil
from email.utils import parsedate_to_datetime
from pathlib import Path
from pyalpm import DB # type: ignore[import-not-found]
from urllib.parse import urlparse
from ahriman.core.configuration import Configuration
from ahriman.core.exceptions import PacmanError
from ahriman.core.http import SyncHttpClient
class PacmanDatabase(SyncHttpClient):
"""
implementation for database sync, because pyalpm is not always enough
Attributes:
LAST_MODIFIED_HEADER(str): last modified header name
database(DB): pyalpm database object
repository_paths(RepositoryPaths): repository paths instance
sync_files_database(bool): sync files database
"""
LAST_MODIFIED_HEADER = "Last-Modified"
def __init__(self, database: DB, configuration: Configuration) -> None:
"""
default constructor
Args:
database(DB): pyalpm database object
configuration(Configuration): configuration instance
"""
SyncHttpClient.__init__(self)
self.timeout = None # reset timeout
self.database = database
self.repository_paths = configuration.repository_paths
self.sync_files_database = configuration.getboolean("alpm", "sync_files_database")
@staticmethod
def copy(remote_path: Path, local_path: Path) -> None:
"""
copy local database file
Args:
remote_path(Path): path to source (remote) file
local_path(Path): path to locally stored file
"""
shutil.copy(remote_path, local_path)
def download(self, url: str, local_path: Path) -> None:
"""
download remote file and store it to local path with the correct last modified headers
Args:
url(str): remote url to request file
local_path(Path): path to locally stored file
Raises:
PacmanError: in case if no last-modified header was found
"""
response = self.make_request("GET", url, stream=True)
if self.LAST_MODIFIED_HEADER not in response.headers:
raise PacmanError("No last-modified header found")
with local_path.open("wb") as local_file:
for chunk in response.iter_content(chunk_size=1024):
local_file.write(chunk)
# set correct (a,m)time for the file
remote_changed = parsedate_to_datetime(response.headers[self.LAST_MODIFIED_HEADER]).timestamp()
os.utime(local_path, (remote_changed, remote_changed))
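A small standalone example of the Last-Modified handling: parse the header value and mirror it into the local file's access and modification times (the file name and header value are placeholders):
import os
from email.utils import parsedate_to_datetime
from pathlib import Path
local_path = Path("core.files.tar.gz")  # placeholder file
local_path.touch(exist_ok=True)
timestamp = parsedate_to_datetime("Wed, 04 Sep 2024 19:28:25 GMT").timestamp()
os.utime(local_path, (timestamp, timestamp))  # so is_outdated() can compare it against the remote header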
def is_outdated(self, url: str, local_path: Path) -> bool:
"""
check if local file is outdated
Args:
url(str): remote url to request last modified header
local_path(Path): path to locally stored file
Returns:
bool: ``True`` in case if remote file is newer than local file
Raises:
PacmanError: in case if no last-modified header was found
"""
if not local_path.is_file():
return True  # no local file found, update is required
response = self.make_request("HEAD", url)
if self.LAST_MODIFIED_HEADER not in response.headers:
raise PacmanError("No last-modified header found")
remote_changed = parsedate_to_datetime(response.headers[self.LAST_MODIFIED_HEADER]).timestamp()
local_changed = local_path.stat().st_mtime
return remote_changed > local_changed
def sync(self, *, force: bool) -> None:
"""
sync packages and files databases
Args:
force(bool): force database synchronization (same as ``pacman -Syy``)
"""
try:
self.sync_packages(force=force)
if self.sync_files_database:
self.sync_files(force=force)
except Exception:
self.logger.exception("exception during update %s", self.database.name)
def sync_files(self, *, force: bool) -> None:
"""
sync files by using http request
Args:
force(bool): force database synchronization (same as ``pacman -Syy``)
"""
server = next(iter(self.database.servers))
filename = f"{self.database.name}.files.tar.gz"
url = f"{server}/{filename}"
remote_uri = urlparse(url)
local_path = self.repository_paths.pacman / "sync" / filename
match remote_uri.scheme:
case "http" | "https":
if not force and not self.is_outdated(url, local_path):
return
self.download(url, local_path)
case "file":
# just copy the file, as it is a relatively cheap operation; no need to check timestamps
self.copy(Path(remote_uri.path), local_path)
case other:
raise PacmanError(f"Unknown or unsupported URL scheme {other}")
def sync_packages(self, *, force: bool) -> None:
"""
sync packages by using built-in pyalpm methods
Args:
force(bool): force database synchronization (same as ``pacman -Syy``)
"""
self.database.update(force)
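A minimal sketch of the URL scheme dispatch used by sync_files(); the URLs are placeholders:
from urllib.parse import urlparse
for url in ("https://mirror.example.org/core.files.tar.gz", "file:///srv/repo/core.files.tar.gz"):
    match urlparse(url).scheme:
        case "http" | "https":
            print(url, "-> download over HTTP and preserve Last-Modified")
        case "file":
            print(url, "-> copy from the local filesystem")
        case other:
            raise ValueError(f"Unknown or unsupported URL scheme {other}")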


@ -56,6 +56,6 @@ class OfficialSyncdb(Official):
raise UnknownPackageError(package_name)
try:
return next(AURPackage.from_pacman(package) for package in pacman.package_get(package_name))
return next(AURPackage.from_pacman(package) for package in pacman.package(package_name))
except StopIteration:
raise UnknownPackageError(package_name) from None


@ -21,7 +21,7 @@ from pathlib import Path
from ahriman.core.exceptions import BuildError
from ahriman.core.log import LazyLogging
from ahriman.core.util import check_output
from ahriman.core.utils import check_output
from ahriman.models.repository_paths import RepositoryPaths
@ -68,7 +68,7 @@ class Repo(LazyLogging):
path(Path): path to archive to add
"""
check_output(
"repo-add", *self.sign_args, "-R", str(self.repo_path), str(path),
"repo-add", *self.sign_args, "--remove", str(self.repo_path), str(path),
exception=BuildError.from_process(path.name),
cwd=self.paths.repository,
logger=self.logger,
@ -78,8 +78,13 @@ class Repo(LazyLogging):
"""
create empty repository database
"""
check_output("repo-add", *self.sign_args, str(self.repo_path),
cwd=self.paths.repository, logger=self.logger, user=self.uid)
# since pacman 6.1.0, repo-add doesn't create an empty database when no packages are supplied,
# so this code creates the empty files instead
if self.repo_path.exists():
return # database is already created, skip this part
self.repo_path.touch(exist_ok=True)
(self.paths.repository / f"{self.name}.db").symlink_to(self.repo_path)
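A standalone sketch of the empty-database workaround: create the database tarball and the plain .db symlink next to it; the repository path and the file name layout are placeholders:
from pathlib import Path
repository = Path("/tmp/demo-repo")          # placeholder repository root
repository.mkdir(parents=True, exist_ok=True)
repo_path = repository / "demo.db.tar.gz"    # assumed database file name layout
if not repo_path.exists():
    repo_path.touch()
    (repository / "demo.db").symlink_to(repo_path)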
def remove(self, package: str, filename: Path) -> None:
"""


@ -81,32 +81,35 @@ class Auth(LazyLogging):
case AuthSettings.OAuth:
from ahriman.core.auth.oauth import OAuth
return OAuth(configuration, database)
case AuthSettings.PAM:
from ahriman.core.auth.pam import PAM
return PAM(configuration, database)
case _:
return Auth(configuration)
async def check_credentials(self, username: str | None, password: str | None) -> bool:
async def check_credentials(self, username: str, password: str | None) -> bool:
"""
validate user password
Args:
username(str | None): username
username(str): username
password(str | None): entered password
Returns:
bool: True in case if password matches, False otherwise
bool: ``True`` in case if password matches, ``False`` otherwise
"""
del username, password
return True
async def known_username(self, username: str | None) -> bool:
async def known_username(self, username: str) -> bool:
"""
check if user is known
Args:
username(str | None): username
username(str): username
Returns:
bool: True in case if user is known and can be authorized and False otherwise
bool: ``True`` in case if user is known and can be authorized and ``False`` otherwise
"""
del username
return True
@ -121,7 +124,7 @@ class Auth(LazyLogging):
context(str | None): URI request path
Returns:
bool: True in case if user is allowed to do this request and False otherwise
bool: ``True`` in case if user is allowed to do this request and ``False`` otherwise
"""
del username, required, context
return True


@ -38,7 +38,7 @@ async def authorized_userid(*args: Any, **kwargs: Any) -> Any:
**kwargs(Any): named argument list as provided by authorized_userid function
Returns:
Any: None in case if no aiohttp_security module found and function call otherwise
Any: ``None`` in case if no aiohttp_security module found and function call otherwise
"""
if _has_aiohttp_security:
return await aiohttp_security.authorized_userid(*args, **kwargs) # pylint: disable=no-value-for-parameter
@ -54,7 +54,7 @@ async def check_authorized(*args: Any, **kwargs: Any) -> Any:
**kwargs(Any): named argument list as provided by authorized_userid function
Returns:
Any: None in case if no aiohttp_security module found and function call otherwise
Any: ``None`` in case if no aiohttp_security module found and function call otherwise
"""
if _has_aiohttp_security:
return await aiohttp_security.check_authorized(*args, **kwargs) # pylint: disable=no-value-for-parameter
@ -70,7 +70,7 @@ async def forget(*args: Any, **kwargs: Any) -> Any:
**kwargs(Any): named argument list as provided by authorized_userid function
Returns:
Any: None in case if no aiohttp_security module found and function call otherwise
Any: ``None`` in case if no aiohttp_security module found and function call otherwise
"""
if _has_aiohttp_security:
return await aiohttp_security.forget(*args, **kwargs) # pylint: disable=no-value-for-parameter
@ -86,7 +86,7 @@ async def remember(*args: Any, **kwargs: Any) -> Any:
**kwargs(Any): named argument list as provided by authorized_userid function
Returns:
Any: None in case if no aiohttp_security module found and function call otherwise
Any: ``None`` in case if no aiohttp_security module found and function call otherwise
"""
if _has_aiohttp_security:
return await aiohttp_security.remember(*args, **kwargs) # pylint: disable=no-value-for-parameter
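These wrappers presumably rely on an optional-import flag along these lines; a hedged sketch of the pattern, not necessarily the module's exact code:
try:
    import aiohttp_security
    _has_aiohttp_security = True
except ImportError:
    _has_aiohttp_security = False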

Some files were not shown because too many files have changed in this diff