Compare commits

...

9 Commits

Author SHA1 Message Date
2feaa14f46 Release 2.18.0 2025-06-13 16:37:58 +03:00
9653fc4f4a type: support new configparser signatures 2025-05-31 02:16:07 +03:00
bcd46c66e8 test: use new-style fixtures instead of event_loop for asyncio 2025-05-12 15:57:05 +03:00
6ea56faede build: fix tox environment creation with the latest updates 2025-05-09 15:00:40 +03:00
9e346530f2 refactor: use backslashreplace error handling instead of guessing encoding 2025-05-08 14:03:47 +03:00
d283dccc1e type: update for new cors release 2025-03-17 13:56:59 +02:00
8a4e900ab9 docs: update docs
This commit includes the following changes:
* add the newly added option to the configuration reference
* remove a few legacy options from the configuration schemas used for
  validation, which might otherwise lead to validation errors.
  Note, however, that these settings will still be read by the service
* add a link to AURCache
* hide the service-setup command description under a spoiler
2025-03-17 13:43:04 +02:00
fa6cf8ce36 website: use date instead of version for listing logs
website: make a dropdown from the logs versions to add some space
2025-03-13 15:45:31 +02:00
a706fbb751 bug: handle dependencies iteratively (fix #141)
It has been found that if there are missing dependencies then the whole
process breaks instead of just skipping the affected packages. During package
addition it is fine-ish, but it breaks the updates run
2025-03-13 15:45:27 +02:00
33 changed files with 1036 additions and 876 deletions

View File

@ -8,6 +8,10 @@ on:
- '*'
- '!*rc*'
permissions:
contents: read
packages: write
jobs:
docker-image:

View File

@ -2,6 +2,9 @@ name: Regress
on: workflow_dispatch
permissions:
contents: read
jobs:
run-regress-tests:

View File

@ -5,6 +5,9 @@ on:
tags:
- '*'
permissions:
contents: write
jobs:
make-release:

View File

@ -8,6 +8,9 @@ on:
branches:
- master
permissions:
contents: read
jobs:
run-setup-minimal:

View File

@ -10,6 +10,9 @@ on:
schedule:
- cron: 1 0 * * *
permissions:
contents: read
jobs:
run-tests:

View File

@ -9,13 +9,7 @@ build:
python:
install:
- method: pip
path: .
extra_requirements:
- docs
- s3
- validator
- web
- requirements: docs/requirements.txt
formats:
- pdf

File diff suppressed because it is too large

View File

@ -15,9 +15,8 @@ import sys
from pathlib import Path
from ahriman import __version__
# support package imports
basedir = Path(__file__).resolve().parent.parent / "src"
sys.path.insert(0, str(basedir))
@ -29,6 +28,7 @@ copyright = f"2021-{datetime.date.today().year}, ahriman team"
author = "ahriman team"
# The full version, including alpha/beta/rc tags
from ahriman import __version__
release = __version__
@ -91,7 +91,13 @@ autoclass_content = "both"
autodoc_member_order = "groupwise"
autodoc_mock_imports = ["cryptography", "pyalpm"]
autodoc_mock_imports = [
"aioauth_client",
"aiohttp_security",
"aiohttp_session",
"cryptography",
"pyalpm",
]
autodoc_default_options = {
"no-undoc-members": True,

View File

@ -81,6 +81,7 @@ Base configuration settings.
* ``apply_migrations`` - perform database migrations on the application start, boolean, optional, default ``yes``. Useful if you are using the git version. Note, however, that this option must be changed only if you know what you are doing and are going to handle migrations manually.
* ``database`` - path to the application SQLite database, string, required.
* ``include`` - path to directory with configuration files overrides, string, optional. Files will be read in alphabetical order.
* ``keep_last_logs`` - number of build logs to be kept for each package, integer, optional, default ``0``. Logs will be cleared at the end of each process.
* ``logging`` - path to logging configuration, string, required. Check ``logging.ini`` for reference.
``alpm:*`` groups
@ -217,7 +218,7 @@ Mirrorlist generator plugin
``remote-pull`` group
---------------------
Remote git source synchronization settings. Unlike ``Upload`` triggers, these triggers are used for PKGBUILD synchronization - they fetch PKGBUILDs from the remote repository before the update process.
Remote git source synchronization settings. Unlike ``upload`` triggers, these triggers are used for PKGBUILD synchronization - they fetch PKGBUILDs from the remote repository before the update process.
It supports authorization; to do so you'd need to prefix the URL with the authorization part, e.g. ``https://key:token@github.com/arcan1s/ahriman.git``. It is highly recommended to use application tokens instead of your user authorization details. Alternatively, you can use any other option supported by git, e.g.:

View File

@ -56,6 +56,13 @@ Though originally I've created ahriman by trying to improve the project, it stil
It is an automation tool for ``repoctl`` mentioned above. Except for using shell, it looks pretty cool and also offers some additional features like patches, remote synchronization (isn't it?) and reporting.
`AURCache <https://github.com/Lukas-Heiligenbrunner/AURCache>`__
""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""
That's a really cool project if you are looking for a simple service to build AUR packages. It provides a very informative dashboard and is easy to configure and use. However, it doesn't provide a direct way to control the build process (e.g. it is neither trivial to build packages for architectures which are not supported by default nor to change build flags).
Also, this application relies on a docker setup (e.g. builders are only available as special docker containers). In addition, it uses ``paru`` to build packages instead of ``devtools``.
How to check service logs
^^^^^^^^^^^^^^^^^^^^^^^^^

128
docs/requirements.txt Normal file
View File

@ -0,0 +1,128 @@
# This file was autogenerated by uv via the following command:
# uv pip compile --group ../pyproject.toml:docs --extra s3 --extra validator --extra web --output-file ../docs/requirements.txt ../pyproject.toml
aiohappyeyeballs==2.6.1
# via aiohttp
aiohttp==3.11.18
# via
# ahriman (../pyproject.toml)
# aiohttp-cors
# aiohttp-jinja2
aiohttp-cors==0.8.1
# via ahriman (../pyproject.toml)
aiohttp-jinja2==1.6
# via ahriman (../pyproject.toml)
aiosignal==1.3.2
# via aiohttp
alabaster==1.0.0
# via sphinx
argparse-manpage==4.6
# via ahriman (../pyproject.toml:docs)
attrs==25.3.0
# via aiohttp
babel==2.17.0
# via sphinx
bcrypt==4.3.0
# via ahriman (../pyproject.toml)
boto3==1.38.11
# via ahriman (../pyproject.toml)
botocore==1.38.11
# via
# boto3
# s3transfer
cerberus==1.3.7
# via ahriman (../pyproject.toml)
certifi==2025.4.26
# via requests
charset-normalizer==3.4.2
# via requests
docutils==0.21.2
# via
# sphinx
# sphinx-argparse
# sphinx-rtd-theme
frozenlist==1.6.0
# via
# aiohttp
# aiosignal
idna==3.10
# via
# requests
# yarl
imagesize==1.4.1
# via sphinx
inflection==0.5.1
# via ahriman (../pyproject.toml)
jinja2==3.1.6
# via
# aiohttp-jinja2
# sphinx
jmespath==1.0.1
# via
# boto3
# botocore
markupsafe==3.0.2
# via jinja2
multidict==6.4.3
# via
# aiohttp
# yarl
packaging==25.0
# via sphinx
propcache==0.3.1
# via
# aiohttp
# yarl
pydeps==3.0.1
# via ahriman (../pyproject.toml:docs)
pyelftools==0.32
# via ahriman (../pyproject.toml)
pygments==2.19.1
# via sphinx
python-dateutil==2.9.0.post0
# via botocore
requests==2.32.3
# via
# ahriman (../pyproject.toml)
# sphinx
roman-numerals-py==3.1.0
# via sphinx
s3transfer==0.12.0
# via boto3
shtab==1.7.2
# via ahriman (../pyproject.toml:docs)
six==1.17.0
# via python-dateutil
snowballstemmer==3.0.0.1
# via sphinx
sphinx==8.2.3
# via
# ahriman (../pyproject.toml:docs)
# sphinx-argparse
# sphinx-rtd-theme
# sphinxcontrib-jquery
sphinx-argparse==0.5.2
# via ahriman (../pyproject.toml:docs)
sphinx-rtd-theme==3.0.2
# via ahriman (../pyproject.toml:docs)
sphinxcontrib-applehelp==2.0.0
# via sphinx
sphinxcontrib-devhelp==2.0.0
# via sphinx
sphinxcontrib-htmlhelp==2.1.0
# via sphinx
sphinxcontrib-jquery==4.1
# via sphinx-rtd-theme
sphinxcontrib-jsmath==1.0.1
# via sphinx
sphinxcontrib-qthelp==2.0.0
# via sphinx
sphinxcontrib-serializinghtml==2.0.0
# via sphinx
stdlib-list==0.11.1
# via pydeps
urllib3==2.4.0
# via
# botocore
# requests
yarl==1.20.0
# via aiohttp

View File

@ -12,19 +12,22 @@ Initial setup
sudo ahriman -a x86_64 -r aur service-setup ...
``service-setup`` literally does the following steps:
.. admonition:: Details
:collapsible: closed
#.
Create ``/var/lib/ahriman/.makepkg.conf`` with ``makepkg.conf`` overrides if required (at least you might want to set ``PACKAGER``):
``service-setup`` literally does the following steps:
.. code-block:: shell
#.
Create ``/var/lib/ahriman/.makepkg.conf`` with ``makepkg.conf`` overrides if required (at least you might want to set ``PACKAGER``):
echo 'PACKAGER="ahriman bot <ahriman@example.com>"' | sudo -u ahriman tee -a /var/lib/ahriman/.makepkg.conf
.. code-block:: shell
#.
Configure build tools (it is required for correct dependency management system):
echo 'PACKAGER="ahriman bot <ahriman@example.com>"' | sudo -u ahriman tee -a /var/lib/ahriman/.makepkg.conf
#.
#.
Configure build tools (it is required for correct dependency management system):
#.
Create build command (you can choose any name for command, basically it should be ``{name}-{arch}-build``):
.. code-block:: shell
@ -67,7 +70,7 @@ Initial setup
echo 'ahriman ALL=(ALL) NOPASSWD:SETENV: CARCHBUILD_CMD' | tee -a /etc/sudoers.d/ahriman
chmod 400 /etc/sudoers.d/ahriman
This command supports several arguments, kindly refer to its help message.
This command supports several arguments, kindly refer to its help message.
#.
Start and enable ``ahriman@.timer`` via ``systemctl``:

View File

@ -2,7 +2,7 @@
pkgbase='ahriman'
pkgname=('ahriman' 'ahriman-core' 'ahriman-triggers' 'ahriman-web')
pkgver=2.17.1
pkgver=2.18.0
pkgrel=1
pkgdesc="ArcH linux ReposItory MANager"
arch=('any')

View File

@ -60,10 +60,13 @@
<div class="tab-content" id="nav-tabContent">
<div id="package-info-logs" class="tab-pane fade show active" role="tabpanel" aria-labelledby="package-info-logs-button" tabindex="0">
<div class="row">
<div class="col-2">
<nav id="package-info-logs-versions" class="nav flex-column"></nav>
<div class="col-1 dropend">
<button id="package-info-logs-dropdown" class="btn dropdown-toggle" type="button" data-bs-toggle="dropdown" aria-expanded="false">
<i class="bi bi-list"></i>
</button>
<nav id="package-info-logs-versions" class="dropdown-menu" aria-labelledby="package-info-logs-dropdown"></nav>
</div>
<div class="col-10">
<div class="col-11">
<pre class="language-console"><code id="package-info-logs-input" class="pre-scrollable language-console"></code><button id="package-info-logs-copy-button" type="button" class="btn language-console" onclick="copyLogs()"><i class="bi bi-clipboard"></i> copy</button></pre>
</div>
</div>
@ -309,9 +312,9 @@
)
.map(version => {
const link = document.createElement("a");
link.classList.add("nav-link");
link.classList.add("dropdown-item");
link.textContent = version.version;
link.textContent = new Date(1000 * version.created).toISOStringShort();
link.href = "#";
link.onclick = _ => {
const logs = data

View File

@ -27,10 +27,4 @@
top: 0;
right: 5px;
}
.nav-link.active {
pointer-events: none;
cursor: default;
color: black !important;
}
</style>

View File

@ -635,6 +635,7 @@ _set_new_action() {
# ${!x} -> ${hello} -> "world"
_shtab_ahriman() {
local completing_word="${COMP_WORDS[COMP_CWORD]}"
local previous_word="${COMP_WORDS[COMP_CWORD-1]}"
local completed_positional_actions
local current_action
local current_action_args_start_index
@ -691,6 +692,10 @@ _shtab_ahriman() {
if [[ $pos_only = 0 && "${completing_word}" == -* ]]; then
# optional argument started: use option strings
COMPREPLY=( $(compgen -W "${current_option_strings[*]}" -- "${completing_word}") )
elif [[ "${previous_word}" == ">" || "${previous_word}" == ">>" ||
"${previous_word}" =~ ^[12]">" || "${previous_word}" =~ ^[12]">>" ]]; then
# handle redirection operators
COMPREPLY=( $(compgen -f -- "${completing_word}") )
else
# use choices & compgen
local IFS=$'\n' # items may contain spaces, so delimit using newline

View File

@ -1,4 +1,4 @@
.TH AHRIMAN "1" "2025\-01\-05" "ahriman" "Generated Python Manual"
.TH AHRIMAN "1" "2025\-06\-13" "ahriman" "Generated Python Manual"
.SH NAME
ahriman
.SH SYNOPSIS

View File

@ -25,15 +25,64 @@ dependencies = [
dynamic = ["version"]
[project.optional-dependencies]
journald = [
"systemd-python",
]
# FIXME technically this dependency is required, but in some cases we do not have access to
# the libalpm which is required in order to install the package. Thus in case if we do not
# really need to run the application we can move it to "optional" dependencies
pacman = [
"pyalpm",
]
reports = [
"Jinja2",
]
s3 = [
"boto3",
]
shell = [
"IPython"
]
stats = [
"matplotlib",
]
unixsocket = [
"requests-unixsocket2", # required by unix socket support
]
validator = [
"cerberus",
]
web = [
"aiohttp",
"aiohttp_cors",
"aiohttp_jinja2",
]
web_api-docs = [
"ahriman[web]",
"aiohttp-apispec",
"setuptools", # required by aiohttp-apispec
]
web_auth = [
"ahriman[web]",
"aiohttp_session",
"aiohttp_security",
"cryptography",
]
web_oauth2 = [
"ahriman[web_auth]",
"aioauth-client",
]
[project.scripts]
ahriman = "ahriman.application.ahriman:run"
[project.urls]
Documentation = "https://ahriman.readthedocs.io/"
Repository = "https://github.com/arcan1s/ahriman"
Changelog = "https://github.com/arcan1s/ahriman/releases"
[project.scripts]
ahriman = "ahriman.application.ahriman:run"
[project.optional-dependencies]
[dependency-groups]
check = [
"autopep8",
"bandit",
@ -47,24 +96,6 @@ docs = [
"shtab",
"sphinx-argparse",
"sphinx-rtd-theme>=1.1.1", # https://stackoverflow.com/a/74355734
]
journald = [
"systemd-python",
]
# FIXME technically this dependency is required, but in some cases we do not have access to
# the libalpm which is required in order to install the package. Thus in case if we do not
# really need to run the application we can move it to "optional" dependencies
pacman = [
"pyalpm",
]
s3 = [
"boto3",
]
shell = [
"IPython"
]
stats = [
"matplotlib",
]
tests = [
"pytest",
@ -75,22 +106,6 @@ tests = [
"pytest-resource-path",
"pytest-spec",
]
validator = [
"cerberus",
]
web = [
"Jinja2",
"aioauth-client",
"aiohttp",
"aiohttp-apispec",
"aiohttp_cors",
"aiohttp_jinja2",
"aiohttp_session",
"aiohttp_security",
"cryptography",
"requests-unixsocket2", # required by unix socket support
"setuptools", # required by aiohttp-apispec
]
[tool.flit.sdist]
include = [

View File

@ -17,4 +17,4 @@
# You should have received a copy of the GNU General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
#
__version__ = "2.17.1"
__version__ = "2.18.0"

View File

@ -117,7 +117,7 @@ class Application(ApplicationPackages, ApplicationRepository):
Args:
packages(list[Package]): list of source packages of which dependencies have to be processed
process_dependencies(bool): if no set, dependencies will not be processed
process_dependencies(bool): if set to ``False``, dependencies will not be processed
Returns:
list[Package]: updated packages list. Packager for dependencies will be copied from the original package
@ -130,6 +130,9 @@ class Application(ApplicationPackages, ApplicationRepository):
>>> packages = application.with_dependencies(packages, process_dependencies=True)
>>> application.print_updates(packages, log_fn=print)
"""
if not process_dependencies or not packages:
return packages
def missing_dependencies(source: Iterable[Package]) -> dict[str, str | None]:
# append list of known packages with packages which are in current sources
satisfied_packages = known_packages | {
@ -145,22 +148,29 @@ class Application(ApplicationPackages, ApplicationRepository):
if dependency not in satisfied_packages
}
if not process_dependencies or not packages:
return packages
def new_packages(root: Package) -> dict[str, Package]:
portion = {root.base: root}
while missing := missing_dependencies(portion.values()):
for package_name, packager in missing.items():
if (source_dir := self.repository.paths.cache_for(package_name)).is_dir():
# there is local cache, load package from it
leaf = Package.from_build(source_dir, self.repository.architecture, packager)
else:
leaf = Package.from_aur(package_name, packager)
portion[leaf.base] = leaf
# register package in the database
self.repository.reporter.set_unknown(leaf)
return portion
known_packages = self._known_packages()
with_dependencies = {package.base: package for package in packages}
while missing := missing_dependencies(with_dependencies.values()):
for package_name, username in missing.items():
if (source_dir := self.repository.paths.cache_for(package_name)).is_dir():
# there is local cache, load package from it
package = Package.from_build(source_dir, self.repository.architecture, username)
else:
package = Package.from_aur(package_name, username)
with_dependencies[package.base] = package
# register package in the database
self.repository.reporter.set_unknown(package)
with_dependencies: dict[str, Package] = {}
for package in packages:
with self.in_package_context(package.base, package.version): # use the same context for the logger
try:
with_dependencies |= new_packages(package)
except Exception:
self.logger.exception("could not process dependencies of %s, skip the package", package.base)
return list(with_dependencies.values())
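The hunk above moves dependency resolution into a per-package helper guarded by try/except, so one broken dependency chain no longer aborts the whole run (see the commit message for #141). A condensed, runnable sketch of the resulting control flow follows; resolve_tree is a hypothetical stand-in for the AUR/local-cache lookups, not an ahriman API:

import logging

logger = logging.getLogger("sketch")

def resolve_tree(package: str) -> dict[str, str]:
    # hypothetical stand-in: return {base: version} for the package and its missing dependencies
    if package == "broken":
        raise RuntimeError("missing dependency")
    return {package: "1.0.0"}

def with_dependencies(packages: list[str]) -> list[str]:
    result: dict[str, str] = {}
    for package in packages:
        try:
            # resolve each root package independently of the others
            result |= resolve_tree(package)
        except Exception:
            # a failure now skips only this package instead of breaking the whole update run
            logger.exception("could not process dependencies of %s, skip the package", package)
    return list(result)

print(with_dependencies(["ahriman", "broken", "python-schedule"]))  # -> ['ahriman', 'python-schedule']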

View File

@ -53,7 +53,7 @@ class Handler:
Wrapper for all command line actions, though each derived class implements :func:`run()` method, it usually
must not be called directly. The recommended way is to call :func:`execute()` class method, e.g.::
>>> from ahriman.application.handlers import Add
>>> from ahriman.application.handlers.add import Add
>>>
>>> Add.execute(args)
"""

View File

@ -57,10 +57,6 @@ CONFIGURATION_SCHEMA: ConfigurationSchema = {
"path_exists": True,
"path_type": "file",
},
"suppress_http_log_errors": {
"type": "boolean",
"coerce": "boolean",
}
},
},
"alpm": {
@ -347,10 +343,6 @@ CONFIGURATION_SCHEMA: ConfigurationSchema = {
"coerce": "integer",
"min": 0,
},
"password": {
"type": "string",
"empty": False,
},
"port": {
"type": "integer",
"coerce": "integer",
@ -379,11 +371,6 @@ CONFIGURATION_SCHEMA: ConfigurationSchema = {
},
"empty": False,
},
"timeout": {
"type": "integer",
"coerce": "integer",
"min": 0,
},
"unix_socket": {
"type": "path",
"coerce": "absolute_path",
@ -392,10 +379,6 @@ CONFIGURATION_SCHEMA: ConfigurationSchema = {
"type": "boolean",
"coerce": "boolean",
},
"username": {
"type": "string",
"empty": False,
},
"wait_timeout": {
"type": "integer",
"coerce": "integer",

View File

@ -23,7 +23,7 @@ import sys
from collections.abc import Generator, Mapping, MutableMapping
from string import Template
from typing import ClassVar
from typing import Any, ClassVar
from ahriman.core.configuration.shell_template import ShellTemplate
@ -85,7 +85,7 @@ class ShellInterpolator(configparser.Interpolation):
"prefix": sys.prefix,
}
def before_get(self, parser: MutableMapping[str, Mapping[str, str]], section: str, option: str, value: str,
def before_get(self, parser: MutableMapping[str, Mapping[str, str]], section: Any, option: Any, value: str,
defaults: Mapping[str, str]) -> str:
"""
interpolate option value
@ -100,8 +100,8 @@ class ShellInterpolator(configparser.Interpolation):
Args:
parser(MutableMapping[str, Mapping[str, str]]): option parser
section(str): section name
option(str): option name
section(Any): section name
option(Any): option name
value(str): source (not-converted) value
defaults(Mapping[str, str]): default values
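For reference, before_get is the hook configparser invokes every time an option value is read; the change above only widens the section and option annotations to Any, presumably to match the newer configparser type signatures mentioned in the commit message. A tiny self-contained illustration of the hook itself (a toy example, not ahriman code):

import configparser

class UpperInterpolator(configparser.Interpolation):
    # toy interpolator: upper-cases every value on read
    def before_get(self, parser, section, option, value, defaults):
        return value.upper()

parser = configparser.ConfigParser(interpolation=UpperInterpolator())
parser.read_string("[demo]\nkey = value")
print(parser["demo"]["key"])  # -> VALUE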

View File

@ -95,19 +95,6 @@ class DuplicateRunError(RuntimeError):
self, "Another application instance is run. This error can be suppressed by using --force flag.")
class EncodeError(ValueError):
"""
exception used for bytes encoding errors
"""
def __init__(self, encodings: list[str]) -> None:
"""
Args:
encodings(list[str]): list of encodings tried
"""
ValueError.__init__(self, f"Could not encode bytes by using {encodings}")
class ExitCode(RuntimeError):
"""
special exception which has to be thrown to return non-zero status without error message

View File

@ -67,14 +67,6 @@ class ReportTrigger(Trigger):
"type": "string",
"allowed": ["email"],
},
"full_template_path": {
"type": "path",
"coerce": "absolute_path",
"excludes": ["template_full"],
"required": True,
"path_exists": True,
"path_type": "file",
},
"homepage": {
"type": "string",
"empty": False,
@ -132,26 +124,16 @@ class ReportTrigger(Trigger):
},
"template": {
"type": "string",
"excludes": ["template_path"],
"dependencies": ["templates"],
"required": True,
"empty": False,
},
"template_full": {
"type": "string",
"excludes": ["template_path"],
"dependencies": ["templates"],
"required": True,
"empty": False,
},
"template_path": {
"type": "path",
"coerce": "absolute_path",
"excludes": ["template"],
"required": True,
"path_exists": True,
"path_type": "file",
},
"templates": {
"type": "list",
"coerce": "list",
@ -199,19 +181,10 @@ class ReportTrigger(Trigger):
},
"template": {
"type": "string",
"excludes": ["template_path"],
"dependencies": ["templates"],
"required": True,
"empty": False,
},
"template_path": {
"type": "path",
"coerce": "absolute_path",
"excludes": ["template"],
"required": True,
"path_exists": True,
"path_type": "file",
},
"templates": {
"type": "list",
"coerce": "list",
@ -225,76 +198,6 @@ class ReportTrigger(Trigger):
},
},
},
"telegram": {
"type": "dict",
"schema": {
"type": {
"type": "string",
"allowed": ["telegram"],
},
"api_key": {
"type": "string",
"required": True,
"empty": False,
},
"chat_id": {
"type": "string",
"required": True,
"empty": False,
},
"homepage": {
"type": "string",
"empty": False,
"is_url": ["http", "https"],
},
"link_path": {
"type": "string",
"required": True,
"empty": False,
"is_url": [],
},
"rss_url": {
"type": "string",
"empty": False,
"is_url": ["http", "https"],
},
"template": {
"type": "string",
"excludes": ["template_path"],
"dependencies": ["templates"],
"required": True,
"empty": False,
},
"template_path": {
"type": "path",
"coerce": "absolute_path",
"excludes": ["template"],
"required": True,
"path_exists": True,
"path_type": "file",
},
"template_type": {
"type": "string",
"allowed": ["MarkdownV2", "HTML", "Markdown"],
},
"templates": {
"type": "list",
"coerce": "list",
"schema": {
"type": "path",
"coerce": "absolute_path",
"path_exists": True,
"path_type": "dir",
},
"empty": False,
},
"timeout": {
"type": "integer",
"coerce": "integer",
"min": 0,
},
},
},
"remote-call": {
"type": "dict",
"schema": {
@ -354,19 +257,10 @@ class ReportTrigger(Trigger):
},
"template": {
"type": "string",
"excludes": ["template_path"],
"dependencies": ["templates"],
"required": True,
"empty": False,
},
"template_path": {
"type": "path",
"coerce": "absolute_path",
"excludes": ["template"],
"required": True,
"path_exists": True,
"path_type": "file",
},
"templates": {
"type": "list",
"coerce": "list",
@ -380,6 +274,67 @@ class ReportTrigger(Trigger):
},
},
},
"telegram": {
"type": "dict",
"schema": {
"type": {
"type": "string",
"allowed": ["telegram"],
},
"api_key": {
"type": "string",
"required": True,
"empty": False,
},
"chat_id": {
"type": "string",
"required": True,
"empty": False,
},
"homepage": {
"type": "string",
"empty": False,
"is_url": ["http", "https"],
},
"link_path": {
"type": "string",
"required": True,
"empty": False,
"is_url": [],
},
"rss_url": {
"type": "string",
"empty": False,
"is_url": ["http", "https"],
},
"template": {
"type": "string",
"dependencies": ["templates"],
"required": True,
"empty": False,
},
"template_type": {
"type": "string",
"allowed": ["MarkdownV2", "HTML", "Markdown"],
},
"templates": {
"type": "list",
"coerce": "list",
"schema": {
"type": "path",
"coerce": "absolute_path",
"path_exists": True,
"path_type": "dir",
},
"empty": False,
},
"timeout": {
"type": "integer",
"coerce": "integer",
"min": 0,
},
},
},
}
def __init__(self, repository_id: RepositoryId, configuration: Configuration) -> None:

View File

@ -83,6 +83,20 @@ class UploadTrigger(Trigger):
},
},
},
"remote-service": {
"type": "dict",
"schema": {
"type": {
"type": "string",
"allowed": ["ahriman", "remote-service"],
},
"timeout": {
"type": "integer",
"coerce": "integer",
"min": 0,
},
},
},
"rsync": {
"type": "dict",
"schema": {
@ -107,20 +121,6 @@ class UploadTrigger(Trigger):
},
},
},
"remote-service": {
"type": "dict",
"schema": {
"type": {
"type": "string",
"allowed": ["ahriman", "remote-service"],
},
"timeout": {
"type": "integer",
"coerce": "integer",
"min": 0,
},
},
},
"s3": {
"type": "dict",
"schema": {

View File

@ -24,7 +24,6 @@ from pathlib import Path
from typing import Any, ClassVar, IO, Self
from ahriman.core.alpm.pkgbuild_parser import PkgbuildParser, PkgbuildToken
from ahriman.core.exceptions import EncodeError
from ahriman.models.pkgbuild_patch import PkgbuildPatch
@ -34,13 +33,13 @@ class Pkgbuild(Mapping[str, Any]):
model and proxy for PKGBUILD properties
Attributes:
DEFAULT_ENCODINGS(list[str]): (class attribute) list of encoding to be applied on the file content
DEFAULT_ENCODINGS(str): (class attribute) default encoding to be applied on the file content
fields(dict[str, PkgbuildPatch]): PKGBUILD fields
"""
fields: dict[str, PkgbuildPatch]
DEFAULT_ENCODINGS: ClassVar[list[str]] = ["utf8", "latin-1"]
DEFAULT_ENCODINGS: ClassVar[str] = "utf8"
@property
def variables(self) -> dict[str, str]:
@ -58,13 +57,13 @@ class Pkgbuild(Mapping[str, Any]):
}
@classmethod
def from_file(cls, path: Path, encodings: list[str] | None = None) -> Self:
def from_file(cls, path: Path, encoding: str | None = None) -> Self:
"""
parse PKGBUILD from the file
Args:
path(Path): path to the PKGBUILD file
encodings(list[str] | None, optional): the encoding of the file (Default value = None)
encoding(str | None, optional): the encoding of the file (Default value = None)
Returns:
Self: constructed instance of self
@ -77,15 +76,10 @@ class Pkgbuild(Mapping[str, Any]):
content = input_file.read()
# decode bytes content based on either
encodings = encodings or cls.DEFAULT_ENCODINGS
for encoding in encodings:
try:
io = StringIO(content.decode(encoding))
return cls.from_io(io)
except ValueError:
pass
encoding = encoding or cls.DEFAULT_ENCODINGS
io = StringIO(content.decode(encoding, errors="backslashreplace"))
raise EncodeError(encodings)
return cls.from_io(io)
@classmethod
def from_io(cls, stream: IO[str]) -> Self:
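The effect of the new error handler, shown outside of ahriman for illustration: bytes which are not valid UTF-8 are preserved as escape sequences instead of raising (or requiring a second decoding attempt with latin-1):

raw = b"pkgdesc='caf\xe9 package'"  # a latin-1 byte which is invalid in utf8

# previous behaviour (roughly): try utf8, fall back to latin-1, raise EncodeError if all fail
# new behaviour: always decode as utf8 and keep undecodable bytes visible
text = raw.decode("utf8", errors="backslashreplace")
print(text)  # pkgdesc='caf\xe9 package'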

View File

@ -17,7 +17,7 @@
# You should have received a copy of the GNU General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
#
import aiohttp_cors # type: ignore[import-untyped]
import aiohttp_cors
from aiohttp.web import Application
@ -36,7 +36,7 @@ def setup_cors(application: Application) -> aiohttp_cors.CorsConfig:
aiohttp_cors.CorsConfig: generated CORS configuration
"""
cors = aiohttp_cors.setup(application, defaults={
"*": aiohttp_cors.ResourceOptions(
"*": aiohttp_cors.ResourceOptions( # type: ignore[no-untyped-call]
expose_headers="*",
allow_headers="*",
allow_methods="*",

View File

@ -18,7 +18,7 @@
# along with this program. If not, see <http://www.gnu.org/licenses/>.
#
from aiohttp.web import HTTPBadRequest, HTTPNotFound, Request, StreamResponse, View
from aiohttp_cors import CorsViewMixin # type: ignore[import-untyped]
from aiohttp_cors import CorsViewMixin
from collections.abc import Awaitable, Callable
from typing import ClassVar, TypeVar

View File

@ -113,6 +113,40 @@ def test_with_dependencies(application: Application, package_ahriman: Package, p
], any_order=True)
def test_with_dependencies_exception(application: Application, package_ahriman: Package,
package_python_schedule: Package, mocker: MockerFixture) -> None:
"""
must skip packages if exception occurs
"""
def create_package_mock(package_base) -> MagicMock:
mock = MagicMock()
mock.base = package_base
mock.depends_build = []
mock.packages_full = [package_base]
return mock
package_python_schedule.packages = {
package_python_schedule.base: package_python_schedule.packages[package_python_schedule.base]
}
package_ahriman.packages[package_ahriman.base].depends = ["devtools", "python", package_python_schedule.base]
package_ahriman.packages[package_ahriman.base].make_depends = ["python-build", "python-installer"]
packages = {
package_ahriman.base: package_ahriman,
package_python_schedule.base: package_python_schedule,
"python": create_package_mock("python"),
"python-installer": create_package_mock("python-installer"),
}
mocker.patch("pathlib.Path.is_dir", autospec=True, side_effect=lambda p: p.name == "python")
mocker.patch("ahriman.models.package.Package.from_aur", side_effect=lambda *args: packages[args[0]])
mocker.patch("ahriman.models.package.Package.from_build", side_effect=Exception)
mocker.patch("ahriman.application.application.Application._known_packages",
return_value={"devtools", "python-build", "python-pytest"})
assert not application.with_dependencies([package_ahriman], process_dependencies=True)
def test_with_dependencies_skip(application: Application, package_ahriman: Package, mocker: MockerFixture) -> None:
"""
must skip processing of dependencies

View File

@ -3,9 +3,7 @@ import pytest
from io import BytesIO, StringIO
from pathlib import Path
from pytest_mock import MockerFixture
from unittest.mock import MagicMock
from ahriman.core.exceptions import EncodeError
from ahriman.models.pkgbuild import Pkgbuild
from ahriman.models.pkgbuild_patch import PkgbuildPatch
@ -46,18 +44,6 @@ def test_from_file_latin(pkgbuild_ahriman: Pkgbuild, mocker: MockerFixture) -> N
load_mock.assert_called_once_with(pytest.helpers.anyvar(int))
def test_from_file_unknown_encoding(pkgbuild_ahriman: Pkgbuild, mocker: MockerFixture) -> None:
"""
must raise exception when encoding is unknown
"""
open_mock = mocker.patch("pathlib.Path.open")
io_mock = open_mock.return_value.__enter__.return_value = MagicMock()
io_mock.read.return_value.decode.side_effect = EncodeError(pkgbuild_ahriman.DEFAULT_ENCODINGS)
with pytest.raises(EncodeError):
assert Pkgbuild.from_file(Path("local"))
def test_from_io(pkgbuild_ahriman: Pkgbuild, mocker: MockerFixture) -> None:
"""
must correctly load from io

View File

@ -1,8 +1,8 @@
import pytest
import pytest_asyncio
from aiohttp.test_utils import TestClient
from aiohttp.web import Application, Resource, UrlMappingMatchInfo
from asyncio import BaseEventLoop
from collections.abc import Awaitable, Callable
from marshmallow import Schema
from pytest_mock import MockerFixture
@ -164,15 +164,13 @@ def application_with_auth(configuration: Configuration, user: User, spawner: Spa
return application
@pytest.fixture
def client(application: Application, event_loop: BaseEventLoop, aiohttp_client: Any,
mocker: MockerFixture) -> TestClient:
@pytest_asyncio.fixture
async def client(application: Application, aiohttp_client: Any, mocker: MockerFixture) -> TestClient:
"""
web client fixture
Args:
application(Application): application fixture
event_loop(BaseEventLoop): context event loop
aiohttp_client(Any): aiohttp client fixture
mocker(MockerFixture): mocker object
@ -180,37 +178,35 @@ def client(application: Application, event_loop: BaseEventLoop, aiohttp_client:
TestClient: web client test instance
"""
mocker.patch("pathlib.Path.iterdir", return_value=[])
return event_loop.run_until_complete(aiohttp_client(application))
return await aiohttp_client(application)
@pytest.fixture
def client_with_auth(application_with_auth: Application, event_loop: BaseEventLoop, aiohttp_client: Any,
mocker: MockerFixture) -> TestClient:
"""
web client fixture with full authorization functions
Args:
application_with_auth(Application): application fixture
event_loop(BaseEventLoop): context event loop
aiohttp_client(Any): aiohttp client fixture
mocker(MockerFixture): mocker object
Returns:
TestClient: web client test instance
"""
mocker.patch("pathlib.Path.iterdir", return_value=[])
return event_loop.run_until_complete(aiohttp_client(application_with_auth))
@pytest.fixture
def client_with_oauth_auth(application_with_auth: Application, event_loop: BaseEventLoop, aiohttp_client: Any,
@pytest_asyncio.fixture
async def client_with_auth(application_with_auth: Application, aiohttp_client: Any,
mocker: MockerFixture) -> TestClient:
"""
web client fixture with full authorization functions
Args:
application_with_auth(Application): application fixture
event_loop(BaseEventLoop): context event loop
aiohttp_client(Any): aiohttp client fixture
mocker(MockerFixture): mocker object
Returns:
TestClient: web client test instance
"""
mocker.patch("pathlib.Path.iterdir", return_value=[])
return await aiohttp_client(application_with_auth)
@pytest_asyncio.fixture
async def client_with_oauth_auth(application_with_auth: Application, aiohttp_client: Any,
mocker: MockerFixture) -> TestClient:
"""
web client fixture with full authorization functions
Args:
application_with_auth(Application): application fixture
aiohttp_client(Any): aiohttp client fixture
mocker(MockerFixture): mocker object
@ -219,4 +215,4 @@ def client_with_oauth_auth(application_with_auth: Application, event_loop: BaseE
"""
mocker.patch("pathlib.Path.iterdir", return_value=[])
application_with_auth[AuthKey] = MagicMock(spec=OAuth)
return event_loop.run_until_complete(aiohttp_client(application_with_auth))
return await aiohttp_client(application_with_auth)
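With the event_loop fixture gone, pytest-asyncio awaits the async fixtures above by itself, and tests consume them like any other fixture. A minimal usage sketch, assuming the default strict mode (hence the explicit marker); the asserted status is arbitrary:

import pytest

@pytest.mark.asyncio
async def test_index(client):
    # client is the async web client fixture defined above
    response = await client.get("/")
    assert response.status < 500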

46
tox.ini
View File

@ -1,9 +1,9 @@
[tox]
envlist = check, tests
isolated_build = True
isolated_build = true
labels =
release = version, docs, publish
dependencies = -e .[journald,pacman,s3,shell,stats,validator,web]
dependencies = -e .[journald,pacman,reports,s3,shell,stats,unixsocket,validator,web,web_api-docs,web_auth,web_oauth2]
project_name = ahriman
[mypy]
@ -24,10 +24,13 @@ commands =
[testenv:check]
description = Run common checks like linter, mypy, etc
dependency_groups =
check
deps =
{[tox]dependencies}
-e .[check]
pip_pre = true
setenv =
CFLAGS="-Wno-unterminated-string-initialization"
MYPYPATH=src
commands =
autopep8 --exit-code --max-line-length 120 -aa -i -j 0 -r "src/{[tox]project_name}" "tests/{[tox]project_name}"
@ -38,16 +41,19 @@ commands =
[testenv:docs]
description = Generate source files for documentation
depends =
version
deps =
{[tox]dependencies}
-e .[docs]
changedir = src
allowlist_externals =
bash
find
mv
changedir = src
dependency_groups =
docs
depends =
version
deps =
{[tox]dependencies}
uv
pip_pre = true
setenv =
SPHINX_APIDOC_OPTIONS=members,no-undoc-members,show-inheritance
commands =
@ -59,22 +65,26 @@ commands =
# remove autogenerated modules rst files
find ../docs -type f -name "{[tox]project_name}*.rst" -delete
sphinx-apidoc -o ../docs .
# compile list of dependencies for rtd.io
uv pip compile --group ../pyproject.toml:docs --extra s3 --extra validator --extra web --output-file ../docs/requirements.txt --quiet ../pyproject.toml
[testenv:html]
description = Generate html documentation
dependency_groups =
docs
deps =
{[tox]dependencies}
-e .[docs]
recreate = True
pip_pre = true
recreate = true
commands =
sphinx-build -b html -a -j auto -W docs {envtmpdir}{/}html
[testenv:publish]
description = Create and publish release to GitHub
depends =
docs
allowlist_externals =
git
depends =
docs
passenv =
SSH_AUTH_SOCK
commands =
@ -86,18 +96,22 @@ commands =
[testenv:tests]
description = Run tests
dependency_groups =
tests
deps =
{[tox]dependencies}
-e .[tests]
pip_pre = true
setenv =
CFLAGS="-Wno-unterminated-string-initialization"
commands =
pytest {posargs}
[testenv:version]
description = Bump package version
deps =
packaging
allowlist_externals =
sed
deps =
packaging
commands =
# check if version is set and validate it
{envpython} -c 'from packaging.version import Version; Version("{posargs}")'