Compare commits

...

11 Commits

Author SHA1 Message Date
3b82b8d49f Release 2.14.2 2024-09-20 02:00:14 +03:00
bb404d43d2 Revert "fix: update Repo.init to the latest pacman release"
This reverts commit d30d512eb6.
2024-09-20 01:53:22 +03:00
9c1e9ecbdc Release 2.14.1 2024-09-04 22:01:04 +03:00
4b2f6bbee9 bug: fix removal of the packages
It has been broken since the reporter improvements, because it effectively
1) didn't call remove functions in database
2) used empty repository identifier for web service

With those changes it also raises an exception when you try to call id on an
empty identifier
2024-09-04 21:50:33 +03:00
fd8c8a00d0 chore: small contributing guide update 2024-09-04 21:49:31 +03:00
eaf1984eb3 refactor: fix some IDE warnings 2024-09-04 21:49:31 +03:00
794dddccd9 build: update pytest configuration to suppress deprecation warnings 2024-09-04 21:49:31 +03:00
7bd7f95f76 Release 2.14.0 2024-08-23 14:37:05 +03:00
375374c396 docs: improve waiter classes docs 2024-08-23 14:33:07 +03:00
d1ad5ecc11 feat: add ability to suppress git hints
It can be done by setting options in command. The commit author/email is
also now using this logic
2024-08-23 14:33:07 +03:00
1eb4d8e47f feat: add blacklisted paths to implicit dependencies processing
It has been found that in some cases additional packages have been added
as dependencies, like usr/share/applications, usr/lib/cmake, etc

This commit adds an ability to blacklist specific paths from processing
2024-08-23 14:33:07 +03:00
45 changed files with 6739 additions and 6059 deletions

View File

@ -82,6 +82,7 @@ limit-inference-results=100
# List of plugins (as comma separated values of python module names) to load,
# usually to register additional checkers.
load-plugins=pylint.extensions.docparams,
pylint.extensions.bad_builtin,
definition_order,
import_order,
@ -131,6 +132,8 @@ attr-naming-style=snake_case
# style.
#attr-rgx=
bad-functions=print,
# Bad variable names which should always be refused, separated by a comma.
bad-names=foo,
bar,
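The ``pylint.extensions.bad_builtin`` plugin together with ``bad-functions=print`` makes pylint warn about bare ``print`` calls; where console output is intentional, this change set silences the warning inline. A minimal sketch of that pattern (the ``report`` function here is hypothetical, only the suppression comment comes from this diff):

```python
def report(line: str) -> None:
    """Print a line to the console; console output is deliberate here."""
    # pylint: disable=bad-builtin
    print(line)
```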

View File

@ -132,7 +132,7 @@ Again, the most checks can be performed by `tox` command, though some additional
* For any path interactions `pathlib.Path` must be used.
* Configuration interactions must go through `ahriman.core.configuration.Configuration` class instance.
* In case if class load requires some actions, it is recommended to create class method which can be used for class instantiating.
* The code must follow the exception safety, unless it is explicitly asked by end user. It means that most exceptions must be handled and printed to log, no other actions must be done (e.g. raising another exception).
* Most (expected) exceptions must be handled and printed to log, allowing the service to continue working. However, fatal and (in some cases) unexpected exceptions may lead to the application termination.
* Exceptions without parameters should be raised without parentheses, e.g.:
```python

View File

@ -40,3 +40,5 @@ The application provides reasonable defaults which allow to use it out-of-box; h
* [Build status page](https://ahriman-demo.arcanis.me). You can log in as `demo` user by using `demo` password. However, you will not be able to run tasks. [HTTP API documentation](https://ahriman-demo.arcanis.me/api-docs) is also available.
* [Repository index](https://repo.arcanis.me/arcanisrepo/x86_64/).
* [Telegram feed](https://t.me/arcanisrepo).
Do you have any success story? You can [share it](https://github.com/arcan1s/ahriman/issues/new?template=04-discussion.md)!

File diff suppressed because it is too large.

Image changed: 1.1 MiB (before) → 1.2 MiB (after).

View File

@ -228,6 +228,14 @@ ahriman.models.result module
:no-undoc-members:
:show-inheritance:
ahriman.models.scan\_paths module
---------------------------------
.. automodule:: ahriman.models.scan_paths
:members:
:no-undoc-members:
:show-inheritance:
ahriman.models.sign\_settings module
------------------------------------

View File

@ -81,7 +81,9 @@ Authorized users are stored inside internal database, if any of external provide
Build related configuration. Group name can refer to architecture, e.g. ``build:x86_64`` can be used for x86_64 architecture specific settings.
* ``allowed_scan_paths`` - paths to be used for implicit dependencies scan, space separated list of paths, optional.
* ``archbuild_flags`` - additional flags passed to ``archbuild`` command, space separated list of strings, optional.
* ``blacklisted_scan_paths`` - paths to be excluded from implicit dependencies scan, space separated list of paths, optional. Normally, each element of this option should be a child path of some ``allowed_scan_paths`` element (see the sketch below).
* ``build_command`` - default build command, string, required.
* ``ignore_packages`` - list packages to ignore during a regular update (manual update will still work), space separated list of strings, optional.
* ``include_debug_packages`` - distribute debug packages, boolean, optional, default ``yes``.
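Both scan path options are plain space separated lists which the service reads as absolute paths (the ``RepositoryProperties`` change further down in this diff consumes them via ``Configuration.getpathlist``). A minimal standalone sketch of that parsing, with a hypothetical helper name:

```python
from pathlib import Path

def parse_path_list(value: str) -> list[Path]:
    """Split a space separated option value into paths."""
    return [Path(item) for item in value.split()]

allowed_scan_paths = parse_path_list("/usr/lib")
blacklisted_scan_paths = parse_path_list("/usr/lib/cmake")
print(allowed_scan_paths, blacklisted_scan_paths)
```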
@ -132,7 +134,7 @@ Web server settings. This feature requires ``aiohttp`` libraries to be installed
* ``port`` - port to bind, integer, optional.
* ``service_only`` - disable status routes (including logs), boolean, optional, default ``no``.
* ``static_path`` - path to directory with static files, string, required.
* ``templates`` - path to templates directories, space separated list of strings, required.
* ``templates`` - path to templates directories, space separated list of paths, required.
* ``unix_socket`` - path to the listening unix socket, string, optional. If set, server will create the socket on the specified address which can (and will) be used by application. Note, that unlike usual host/port configuration, unix socket allows to perform requests without authorization.
* ``unix_socket_unsafe`` - set unsafe (o+w) permissions to unix socket, boolean, optional, default ``yes``. This option is enabled by default, because it is supposed that unix socket is created in safe environment (only web service is supposed to be used in unsafe), but it can be disabled by configuration.
* ``wait_timeout`` - wait timeout in seconds, maximum amount of time to be waited before lock will be free, integer, optional.
@ -254,7 +256,7 @@ Section name must be either ``email`` (plus optional architecture name, e.g. ``e
* ``ssl`` - SSL mode for SMTP connection, one of ``ssl``, ``starttls``, ``disabled``, optional, default ``disabled``.
* ``template`` - Jinja2 template name, string, required.
* ``template_full`` - Jinja2 template name for full package description index, string, optional.
* ``templates`` - path to templates directories, space separated list of strings, required.
* ``templates`` - path to templates directories, space separated list of paths, required.
* ``user`` - SMTP user to authenticate, string, optional.
``html`` type
@ -267,7 +269,7 @@ Section name must be either ``html`` (plus optional architecture name, e.g. ``ht
* ``link_path`` - prefix for HTML links, string, required.
* ``path`` - path to html report file, string, required.
* ``template`` - Jinja2 template name, string, required.
* ``templates`` - path to templates directories, space separated list of strings, required.
* ``templates`` - path to templates directories, space separated list of paths, required.
``remote-call`` type
^^^^^^^^^^^^^^^^^^^^
@ -292,7 +294,7 @@ Section name must be either ``telegram`` (plus optional architecture name, e.g.
* ``link_path`` - prefix for HTML links, string, required.
* ``template`` - Jinja2 template name, string, required.
* ``template_type`` - ``parse_mode`` to be passed to telegram API, one of ``MarkdownV2``, ``HTML``, ``Markdown``, string, optional, default ``HTML``.
* ``templates`` - path to templates directories, space separated list of strings, required.
* ``templates`` - path to templates directories, space separated list of paths, required.
* ``timeout`` - HTTP request timeout in seconds, integer, optional, default is ``30``.
``upload`` group

View File

@ -370,7 +370,16 @@ TL;DR
You can even rebuild the whole repository (which is particularly useful in case if you would like to change packager) if you do not supply ``--depends-on`` option. This action will automatically increment ``pkgrel`` value; in case if you don't want to, the ``--no-increment`` option has to be supplied.
However, note that you do not need to rebuild repository in case if you just changed signing option, just use ``repo-sign`` command instead.
Automated broken dependencies detection
"""""""""""""""""""""""""""""""""""""""
After a successful build, the application extracts all linked libraries and used directories and stores them in the database. During the check process, the application extracts pacman databases and checks whether file names have been changed (e.g. a new python release renamed the ``/usr/lib/python3.x`` directory to ``/usr/lib/python3.y``, or the soname of a linked library has changed). In case if broken dependencies have been detected, the package will be added to the rebuild queue.
In order to disable this check completely, the ``--no-check-files`` flag can be used.
In addition, it is possible to control which paths are used for the check via the ``build.allowed_scan_paths`` and ``build.blacklisted_scan_paths`` options. Leaving ``build.allowed_scan_paths`` blank effectively disables the check as well.
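A minimal sketch of the allow/blacklist check described above; it mirrors the ``ScanPaths`` model introduced later in this change set (paths are compared relative to ``/``, and the ``is_allowed`` helper here is an illustration rather than the shipped code):

```python
from pathlib import Path

def is_allowed(path: Path, allowed: list[Path], blacklisted: list[Path]) -> bool:
    """Return True if path is under an allowed prefix and not under a blacklisted one."""
    if any(path.is_relative_to(item) for item in blacklisted):
        return False  # path is blacklisted
    return any(path.is_relative_to(item) for item in allowed)

print(is_allowed(Path("usr/lib/python3.12"), [Path("usr/lib")], [Path("usr/lib/cmake")]))  # True
print(is_allowed(Path("usr/lib/cmake/foo"), [Path("usr/lib")], [Path("usr/lib/cmake")]))   # False
```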
How to install built packages
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

View File

@ -1,7 +1,7 @@
# Maintainer: Evgeniy Alekseev
pkgname='ahriman'
pkgver=2.13.8
pkgver=2.14.2
pkgrel=1
pkgdesc="ArcH linux ReposItory MANager"
arch=('any')

View File

@ -50,8 +50,12 @@ allow_read_only = yes
;salt =
[build]
; List of paths to be used for implicit dependency scan
allowed_scan_paths = /usr/lib
; List of additional flags passed to archbuild command.
;archbuild_flags =
; List of paths to be excluded from implicit dependency scan. Usually they should be subpaths of allowed_scan_paths
blacklisted_scan_paths = /usr/lib/cmake
; Path to build command
;build_command =
; List of packages to be ignored during automatic updates.

View File

@ -1,4 +1,4 @@
.TH AHRIMAN "1" "2024\-05\-12" "ahriman" "Generated Python Manual"
.TH AHRIMAN "1" "2024\-09\-19" "ahriman" "Generated Python Manual"
.SH NAME
ahriman
.SH SYNOPSIS
@ -391,7 +391,7 @@ PKGBUILD variable or function name. If variable is a function, it must end with
path to file which contains function or variable value. If not set, the value will be read from stdin
.SH COMMAND \fI\,'ahriman patch\-list'\/\fR
usage: ahriman patch\-list [\-h] [\-e] [\-v VARIABLE] [package]
usage: ahriman patch\-list [\-h] [\-e] [\-v VARIABLE] package
list available patches for the package

View File

@ -86,7 +86,7 @@ _shtab_ahriman_options=(
{-a,--architecture}"[filter by target architecture (default\: None)]:architecture:"
{-c,--configuration}"[configuration path (default\: \/etc\/ahriman.ini)]:configuration:"
"--force[force run, remove file lock (default\: False)]"
{-l,--lock}"[lock file (default\: \/tmp\/ahriman.lock)]:lock:"
{-l,--lock}"[lock file (default\: ahriman.pid)]:lock:"
"--log-handler[explicit log handler specification. If none set, the handler will be guessed from environment (default\: None)]:log_handler:(console syslog journald)"
{-q,--quiet}"[force disable any logging (default\: False)]"
{--report,--no-report}"[force enable or disable reporting to web service (default\: True)]:report:"
@ -280,7 +280,7 @@ _shtab_ahriman_patch_list_options=(
"(- : *)"{-h,--help}"[show this help message and exit]"
{-e,--exit-code}"[return non-zero exit status if result is empty (default\: False)]"
"*"{-v,--variable}"[if set, show only patches for specified PKGBUILD variables (default\: None)]:variable:"
":package base (default\: None):"
":package base:"
)
_shtab_ahriman_patch_remove_options=(

View File

@ -17,4 +17,4 @@
# You should have received a copy of the GNU General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
#
__version__ = "2.13.8"
__version__ = "2.14.2"

View File

@ -98,6 +98,7 @@ class Patch(Handler):
PkgbuildPatch: created patch for the PKGBUILD function
"""
if patch_path is None:
# pylint: disable=bad-builtin
print("Post new function or variable value below. Press Ctrl-D to finish:", file=sys.stderr)
patch = "".join(list(sys.stdin))
else:

View File

@ -77,5 +77,5 @@ class Update(Handler):
Callable[[str], None]: in case if dry_run is set it will return print, logger otherwise
"""
def inner(line: str) -> None:
return print(line) if dry_run else application.logger.info(line)
return print(line) if dry_run else application.logger.info(line) # pylint: disable=bad-builtin
return inner

View File

@ -75,7 +75,9 @@ class Lock(LazyLogging):
"""
self.path: Path | None = None
if args.lock is not None:
self.path = args.lock.with_stem(f"{args.lock.stem}_{repository_id.id}")
self.path = args.lock
if not repository_id.is_empty:
self.path = self.path.with_stem(f"{args.lock.stem}_{repository_id.id}")
if not self.path.is_absolute():
# prepend full path to the lock file
self.path = Path("/") / "run" / "ahriman" / self.path
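A compact sketch of the lock path resolution above, assuming ``args.lock = Path("ahriman.pid")``; the expected results match the updated ``test_path`` test further down in this diff, and the ``lock_path`` helper is illustrative only:

```python
from pathlib import Path

def lock_path(lock: Path, repository_suffix: str | None) -> Path:
    """Mimic Lock.path: append the repository id and anchor relative paths under /run/ahriman."""
    path = lock
    if repository_suffix:  # non-empty repository identifier
        path = path.with_stem(f"{lock.stem}_{repository_suffix}")
    if not path.is_absolute():
        path = Path("/") / "run" / "ahriman" / path
    return path

print(lock_path(Path("ahriman.pid"), "x86_64-aur-clone"))  # /run/ahriman/ahriman_x86_64-aur-clone.pid
print(lock_path(Path("ahriman.pid"), None))                # /run/ahriman/ahriman.pid
```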

View File

@ -43,7 +43,7 @@ class Pacman(LazyLogging):
configuration(Configuration): configuration instance
refresh_database(PacmanSynchronization): synchronize local cache to remote
repository_id(RepositoryId): repository unique identifier
repository_path(RepositoryPaths): repository paths instance
repository_paths(RepositoryPaths): repository paths instance
"""
def __init__(self, repository_id: RepositoryId, configuration: Configuration, *,
@ -188,8 +188,8 @@ class Pacman(LazyLogging):
Returns:
dict[str, set[str]]: map of package name to its list of files
"""
def extract(tar: tarfile.TarFile, package_names: dict[str, str]) -> Generator[tuple[str, set[str]], None, None]:
for package_name, version in package_names.items():
def extract(tar: tarfile.TarFile, versions: dict[str, str]) -> Generator[tuple[str, set[str]], None, None]:
for package_name, version in versions.items():
path = Path(f"{package_name}-{version}") / "files"
try:
content = tar.extractfile(str(path))

View File

@ -59,7 +59,8 @@ class PacmanDatabase(SyncHttpClient):
self.sync_files_database = configuration.getboolean("alpm", "sync_files_database")
def copy(self, remote_path: Path, local_path: Path) -> None:
@staticmethod
def copy(remote_path: Path, local_path: Path) -> None:
"""
copy local database file

View File

@ -68,7 +68,7 @@ class Repo(LazyLogging):
path(Path): path to archive to add
"""
check_output(
"repo-add", *self.sign_args, "--remove", str(self.repo_path), str(path),
"repo-add", *self.sign_args, "-R", str(self.repo_path), str(path),
exception=BuildError.from_process(path.name),
cwd=self.paths.repository,
logger=self.logger,
@ -78,13 +78,8 @@ class Repo(LazyLogging):
"""
create empty repository database. It just calls add with empty arguments
"""
# since pacman-6.1.0 repo-add doesn't create empty database in case if no packages supplied
# this code creates empty files instead
if self.repo_path.exists():
return # database is already created, skip this part
self.repo_path.touch(exist_ok=True)
(self.paths.repository / f"{self.name}.db").symlink_to(self.repo_path)
check_output("repo-add", *self.sign_args, str(self.repo_path),
cwd=self.paths.repository, logger=self.logger, user=self.uid)
def remove(self, package: str, filename: Path) -> None:
"""

View File

@ -19,6 +19,7 @@
#
import shutil
from collections.abc import Generator
from pathlib import Path
from ahriman.core.exceptions import CalledProcessError
@ -38,10 +39,14 @@ class Sources(LazyLogging):
DEFAULT_BRANCH(str): (class attribute) default branch to process git repositories.
Must be used only for local stored repositories, use RemoteSource descriptor instead for real packages
DEFAULT_COMMIT_AUTHOR(tuple[str, str]): (class attribute) default commit author to be used if none set
GITCONFIG(dict[str, str]): (class attribute) git config options to suppress annoying hints
"""
DEFAULT_BRANCH = "master" # default fallback branch
DEFAULT_COMMIT_AUTHOR = ("ahriman", "ahriman@localhost")
GITCONFIG = {
"init.defaultBranch": DEFAULT_BRANCH,
}
@staticmethod
def changes(source_dir: Path, last_commit_sha: str | None) -> str | None:
@ -106,15 +111,15 @@ class Sources(LazyLogging):
instance.fetch_until(sources_dir, branch=branch)
elif remote.git_url is not None:
instance.logger.info("clone remote %s to %s using branch %s", remote.git_url, sources_dir, branch)
check_output("git", "clone", "--quiet", "--depth", "1", "--branch", branch, "--single-branch",
check_output(*instance.git(), "clone", "--quiet", "--depth", "1", "--branch", branch, "--single-branch",
remote.git_url, str(sources_dir), cwd=sources_dir.parent, logger=instance.logger)
else:
# it will cause an exception later
instance.logger.error("%s is not initialized, but no remote provided", sources_dir)
# and now force reset to our branch
check_output("git", "checkout", "--force", branch, cwd=sources_dir, logger=instance.logger)
check_output("git", "reset", "--quiet", "--hard", f"origin/{branch}",
check_output(*instance.git(), "checkout", "--force", branch, cwd=sources_dir, logger=instance.logger)
check_output(*instance.git(), "reset", "--quiet", "--hard", f"origin/{branch}",
cwd=sources_dir, logger=instance.logger)
# move content if required
@ -136,7 +141,7 @@ class Sources(LazyLogging):
bool: True in case if there is any remote and false otherwise
"""
instance = Sources()
remotes = check_output("git", "remote", cwd=sources_dir, logger=instance.logger)
remotes = check_output(*instance.git(), "remote", cwd=sources_dir, logger=instance.logger)
return bool(remotes)
@staticmethod
@ -150,7 +155,7 @@ class Sources(LazyLogging):
instance = Sources()
if not (sources_dir / ".git").is_dir():
# skip initializing in case if it was already
check_output("git", "init", "--quiet", "--initial-branch", instance.DEFAULT_BRANCH,
check_output(*instance.git(), "init", "--quiet", "--initial-branch", instance.DEFAULT_BRANCH,
cwd=sources_dir, logger=instance.logger)
# extract local files...
@ -220,7 +225,7 @@ class Sources(LazyLogging):
return # no changes to push, just skip action
git_url, branch = remote.git_source()
check_output("git", "push", "--quiet", git_url, branch, cwd=sources_dir, logger=instance.logger)
check_output(*instance.git(), "push", "--quiet", git_url, branch, cwd=sources_dir, logger=instance.logger)
def add(self, sources_dir: Path, *pattern: str, intent_to_add: bool = False) -> None:
"""
@ -241,7 +246,7 @@ class Sources(LazyLogging):
self.logger.info("found matching files %s", found_files)
# add them to index
args = ["--intent-to-add"] if intent_to_add else []
check_output("git", "add", *args, *[str(fn.relative_to(sources_dir)) for fn in found_files],
check_output(*self.git(), "add", *args, *[str(fn.relative_to(sources_dir)) for fn in found_files],
cwd=sources_dir, logger=self.logger)
def commit(self, sources_dir: Path, message: str | None = None,
@ -264,15 +269,16 @@ class Sources(LazyLogging):
if message is None:
message = f"Autogenerated commit at {utcnow()}"
args = ["--message", message]
environment: dict[str, str] = {}
if commit_author is None:
commit_author = self.DEFAULT_COMMIT_AUTHOR
user, email = commit_author
environment["GIT_AUTHOR_NAME"] = environment["GIT_COMMITTER_NAME"] = user
environment["GIT_AUTHOR_EMAIL"] = environment["GIT_COMMITTER_EMAIL"] = email
gitconfig = {
"user.email": email,
"user.name": user,
}
check_output("git", "commit", "--quiet", *args, cwd=sources_dir, logger=self.logger, environment=environment)
check_output(*self.git(gitconfig), "commit", "--quiet", *args, cwd=sources_dir, logger=self.logger)
return True
@ -290,7 +296,7 @@ class Sources(LazyLogging):
args = []
if sha is not None:
args.append(sha)
return check_output("git", "diff", *args, cwd=sources_dir, logger=self.logger)
return check_output(*self.git(), "diff", *args, cwd=sources_dir, logger=self.logger)
def fetch_until(self, sources_dir: Path, *, branch: str | None = None, commit_sha: str | None = None) -> None:
"""
@ -306,18 +312,37 @@ class Sources(LazyLogging):
commits_count = 1
while commit_sha is not None:
command = ["git", "fetch", "--quiet", "--depth", str(commits_count)]
command = self.git() + ["fetch", "--quiet", "--depth", str(commits_count)]
if branch is not None:
command += ["origin", branch]
check_output(*command, cwd=sources_dir, logger=self.logger) # fetch one more level
try:
# check if there is an object in current git directory
check_output("git", "cat-file", "-e", commit_sha, cwd=sources_dir, logger=self.logger)
check_output(*self.git(), "cat-file", "-e", commit_sha, cwd=sources_dir, logger=self.logger)
commit_sha = None # reset search
except CalledProcessError:
commits_count += 1 # increase depth
def git(self, gitconfig: dict[str, str] | None = None) -> list[str]:
"""
git command prefix
Args:
gitconfig(dict[str, str] | None, optional): additional git config flags if any (Default value = None)
Returns:
list[str]: git command prefix with valid default flags
"""
gitconfig = gitconfig or {}
def configuration_flags() -> Generator[str, None, None]:
for option, value in (self.GITCONFIG | gitconfig).items():
yield "-c"
yield f"{option}=\"{value}\""
return ["git"] + list(configuration_flags())
def has_changes(self, sources_dir: Path) -> bool:
"""
check if there are changes in current git tree
@ -329,7 +354,7 @@ class Sources(LazyLogging):
bool: True if there are uncommitted changes and False otherwise
"""
# there is --exit-code argument to diff, however, there might be other process errors
changes = check_output("git", "diff", "--cached", "--name-only", cwd=sources_dir, logger=self.logger)
changes = check_output(*self.git(), "diff", "--cached", "--name-only", cwd=sources_dir, logger=self.logger)
return bool(changes)
def head(self, sources_dir: Path, ref_name: str = "HEAD") -> str:
@ -344,7 +369,7 @@ class Sources(LazyLogging):
str: HEAD commit hash
"""
# we might want to parse git files instead though
return check_output("git", "rev-parse", ref_name, cwd=sources_dir, logger=self.logger)
return check_output(*self.git(), "rev-parse", ref_name, cwd=sources_dir, logger=self.logger)
def move(self, pkgbuild_dir: Path, sources_dir: Path) -> None:
"""
@ -372,7 +397,7 @@ class Sources(LazyLogging):
# create patch
self.logger.info("apply patch %s from database at %s", patch.key, sources_dir)
if patch.is_plain_diff:
check_output("git", "apply", "--ignore-space-change", "--ignore-whitespace",
check_output(*self.git(), "apply", "--ignore-space-change", "--ignore-whitespace",
cwd=sources_dir, input_data=patch.serialize(), logger=self.logger)
else:
patch.write(sources_dir / "PKGBUILD")
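The new ``git()`` helper above prepends ``-c`` configuration flags to every git invocation. A standalone sketch of its behaviour, matching the ``test_git`` cases further down in this diff (the module-level ``GITCONFIG`` and free function stand in for the class attributes):

```python
GITCONFIG = {"init.defaultBranch": "master"}

def git(gitconfig: dict[str, str] | None = None) -> list[str]:
    """Build the git command prefix from class defaults plus per-call overrides."""
    flags: list[str] = []
    for option, value in (GITCONFIG | (gitconfig or {})).items():
        flags += ["-c", f"{option}=\"{value}\""]
    return ["git"] + flags

print(git())                                     # ['git', '-c', 'init.defaultBranch="master"']
print(git({"user.email": "ahriman@localhost"}))  # ... plus '-c', 'user.email="ahriman@localhost"'
```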

View File

@ -169,6 +169,14 @@ CONFIGURATION_SCHEMA: ConfigurationSchema = {
"build": {
"type": "dict",
"schema": {
"allowed_scan_paths": {
"type": "list",
"coerce": "list",
"schema": {
"type": "path",
"coerce": "absolute_path",
},
},
"archbuild_flags": {
"type": "list",
"coerce": "list",
@ -177,6 +185,14 @@ CONFIGURATION_SCHEMA: ConfigurationSchema = {
"empty": False,
},
},
"blacklisted_scan_paths": {
"type": "list",
"coerce": "list",
"schema": {
"type": "path",
"coerce": "absolute_path",
},
},
"build_command": {
"type": "string",
"required": True,

View File

@ -27,6 +27,7 @@ from ahriman.core.configuration import Configuration
from ahriman.core.database.migrations import Migrations
from ahriman.core.database.operations import AuthOperations, BuildOperations, ChangesOperations, \
DependenciesOperations, LogsOperations, PackageOperations, PatchOperations
from ahriman.models.repository_id import RepositoryId
# pylint: disable=too-many-ancestors
@ -102,23 +103,26 @@ class SQLite(
self.with_connection(lambda connection: Migrations.migrate(connection, configuration))
paths.chown(self.path)
def package_clear(self, package_base: str) -> None:
def package_clear(self, package_base: str, repository_id: RepositoryId | None = None) -> None:
"""
completely remove package from all tables
Args:
package_base(str): package base to remove
repository_id(RepositoryId, optional): repository unique identifier override (Default value = None)
Examples:
This method completely removes the package from all tables and must be used, e.g. on package removal::
>>> database.package_clear("ahriman")
"""
self.build_queue_clear(package_base)
self.patches_remove(package_base, [])
self.logs_remove(package_base, None)
self.changes_remove(package_base)
self.dependencies_remove(package_base)
self.build_queue_clear(package_base, repository_id)
self.patches_remove(package_base, None)
self.logs_remove(package_base, None, repository_id)
self.changes_remove(package_base, repository_id)
self.dependencies_remove(package_base, repository_id)
self.package_remove(package_base, repository_id)
# remove local cache too
self._repository_paths.tree_clear(package_base)

View File

@ -80,7 +80,8 @@ class Executor(PackageInfo, Cleaner):
# clear changes and update commit hash
self.reporter.package_changes_update(single.base, Changes(last_commit_sha))
# update dependencies list
dependencies = PackageArchive(self.paths.build_directory, single, self.pacman).depends_on()
package_archive = PackageArchive(self.paths.build_directory, single, self.pacman, self.scan_paths)
dependencies = package_archive.depends_on()
self.reporter.package_dependencies_update(single.base, dependencies)
# update result set
result.add_updated(single)

View File

@ -29,6 +29,7 @@ from ahriman.models.packagers import Packagers
from ahriman.models.pacman_synchronization import PacmanSynchronization
from ahriman.models.repository_id import RepositoryId
from ahriman.models.repository_paths import RepositoryPaths
from ahriman.models.scan_paths import ScanPaths
from ahriman.models.user import User
from ahriman.models.user_access import UserAccess
@ -46,6 +47,7 @@ class RepositoryProperties(LazyLogging):
repo(Repo): repo commands wrapper instance
reporter(Client): build status reporter instance
repository_id(RepositoryId): repository unique identifier
scan_paths(ScanPaths): scan paths for the implicit dependencies
sign(GPG): GPG wrapper instance
triggers(TriggerLoader): triggers holder
vcs_allowed_age(int): maximal age of the VCS packages before they will be checked
@ -78,6 +80,11 @@ class RepositoryProperties(LazyLogging):
self.reporter = Client.load(repository_id, configuration, database, report=report)
self.triggers = TriggerLoader.load(repository_id, configuration)
self.scan_paths = ScanPaths(
allowed_paths=configuration.getpathlist("build", "allowed_scan_paths", fallback=[]),
blacklisted_paths=configuration.getpathlist("build", "blacklisted_scan_paths", fallback=[]),
)
@property
def architecture(self) -> str:
"""

View File

@ -310,7 +310,7 @@ class Client:
def set_unknown(self, package: Package) -> None:
"""
set package status to unknown. Unlike other methods, this method also checks if package is known,
and - in case if it is - it silently skips updatd
and - in case if it is - it silently skips update
Args:
package(Package): current package properties

View File

@ -184,7 +184,7 @@ class LocalClient(Client):
Args:
package_base(str): package base to remove
"""
self.database.package_clear(package_base)
self.database.package_clear(package_base, self.repository_id)
def package_status_update(self, package_base: str, status: BuildStatusEnum) -> None:
"""

View File

@ -140,7 +140,6 @@ class Watcher(LazyLogging):
with self._lock:
self._known.pop(package_base, None)
self.client.package_remove(package_base)
self.package_logs_remove(package_base, None)
def package_status_update(self, package_base: str, status: BuildStatusEnum) -> None:
"""

View File

@ -30,6 +30,7 @@ from ahriman.core.utils import walk
from ahriman.models.dependencies import Dependencies
from ahriman.models.filesystem_package import FilesystemPackage
from ahriman.models.package import Package
from ahriman.models.scan_paths import ScanPaths
@dataclass
@ -39,13 +40,15 @@ class PackageArchive:
Attributes:
package(Package): package descriptor
root(Path): path to root filesystem
pacman(Pacman): alpm wrapper instance
root(Path): path to root filesystem
scan_paths(ScanPaths): scan paths holder
"""
root: Path
package: Package
pacman: Pacman
scan_paths: ScanPaths
@staticmethod
def dynamic_needed(binary_path: Path) -> list[str]:
@ -165,6 +168,10 @@ class PackageArchive:
if any(package.package_name in base_packages for package in packages):
continue
# check path against the allowed/blacklisted paths
if not self.scan_paths.is_allowed(path):
continue
# remove explicit dependencies
packages = [package for package in packages if package.is_root_package(packages, include_optional=False)]
# remove optional dependencies

View File

@ -41,9 +41,12 @@ class RepositoryId:
Returns:
str: unique id for this repository
Raises:
ValueError: if repository identifier is empty
"""
if self.is_empty:
return ""
raise ValueError("Repository ID is called on empty repository identifier")
return f"{self.architecture}-{self.name}" # basically the same as used for command line
@property

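The behaviour change in ``id`` is covered by the updated unit tests below; a short usage illustration, assuming the ``ahriman`` package is importable:

```python
from ahriman.models.repository_id import RepositoryId

print(RepositoryId("arch", "repo").id)  # arch-repo
try:
    RepositoryId("", "").id  # empty identifier now raises instead of returning ""
except ValueError as exception:
    print(exception)
```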
View File

@ -113,7 +113,7 @@ class RepositoryPaths(LazyLogging):
Returns:
Path: full patch to devtools chroot directory
"""
# for the chroot directory devtools will create own tree, and we don"t have to specify architecture here
# for the chroot directory devtools will create own tree, and we don't have to specify architecture here
return self.root / "chroot" / self.repository_id.name
@property

View File

@ -0,0 +1,58 @@
#
# Copyright (c) 2021-2024 ahriman team.
#
# This file is part of ahriman
# (see https://github.com/arcan1s/ahriman).
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
#
from dataclasses import dataclass
from pathlib import Path
@dataclass(frozen=True, kw_only=True)
class ScanPaths:
"""
paths used for filesystem scan
Attributes:
allowed_paths(list[Path]): list of whitelisted paths
blacklisted_paths(list[Path]): list of paths to be skipped from scan
"""
allowed_paths: list[Path]
blacklisted_paths: list[Path]
def __post_init__(self) -> None:
"""
compute relative to / paths
"""
object.__setattr__(self, "allowed_paths", [path.relative_to("/") for path in self.allowed_paths])
object.__setattr__(self, "blacklisted_paths", [path.relative_to("/") for path in self.blacklisted_paths])
def is_allowed(self, path: Path) -> bool:
"""
check whether path is allowed to scan or not
Args:
path(Path): path to be checked
Returns:
bool: ``True`` in case if :attr:`allowed_paths` contains an element which is a parent of the path and
:attr:`blacklisted_paths` doesn't, ``False`` otherwise
"""
if any(path.is_relative_to(blacklisted) for blacklisted in self.blacklisted_paths):
return False # path is blacklisted
# check if we actually have to check this path
return any(path.is_relative_to(allowed) for allowed in self.allowed_paths)
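A usage sketch for the new ``ScanPaths`` model, mirroring the unit tests added later in this diff (assuming the ``ahriman`` package is importable):

```python
from pathlib import Path
from ahriman.models.scan_paths import ScanPaths

scan_paths = ScanPaths(allowed_paths=[Path("/usr/lib")], blacklisted_paths=[Path("/usr/lib/cmake")])
print(scan_paths.is_allowed(Path("usr") / "lib" / "python3.12"))  # True
print(scan_paths.is_allowed(Path("usr") / "lib" / "cmake"))       # False
```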

View File

@ -67,7 +67,7 @@ class WaiterTaskFinished(WaiterResult):
indicates whether the waiter completed with success or not
Returns:
Literal[True]: always False
Literal[True]: always ``True``
"""
return True
@ -82,7 +82,7 @@ class WaiterTimedOut(WaiterResult):
indicates whether the waiter completed with success or not
Returns:
Literal[False]: always False
Literal[False]: always ``False``
"""
return False
@ -108,7 +108,7 @@ class Waiter:
check if timer is out
Returns:
bool: True in case current monotonic time is more than :attr:`start_time` and :attr:`wait_timeout`
bool: ``True`` in case current monotonic time is more than :attr:`start_time` and :attr:`wait_timeout`
doesn't equal to 0
"""
since_start = time.monotonic() - self.start_time
@ -124,7 +124,7 @@ class Waiter:
**kwargs(Params.kwargs): keyword arguments for check call
Returns:
WaiterResult: consumed time in seconds
WaiterResult: waiter result object
"""
while not (timed_out := self.is_timed_out()) and in_progress(*args, **kwargs):
time.sleep(self.interval)

View File

@ -14,6 +14,7 @@ from ahriman.core.configuration import Configuration
from ahriman.core.exceptions import DuplicateRunError, UnsafeRunError
from ahriman.models.build_status import BuildStatus, BuildStatusEnum
from ahriman.models.internal_status import InternalStatus
from ahriman.models.repository_id import RepositoryId
def test_path(args: argparse.Namespace, configuration: Configuration) -> None:
@ -30,6 +31,8 @@ def test_path(args: argparse.Namespace, configuration: Configuration) -> None:
args.lock = Path("ahriman.pid")
assert Lock(args, repository_id, configuration).path == Path("/run/ahriman/ahriman_x86_64-aur-clone.pid")
assert Lock(args, RepositoryId("", ""), configuration).path == Path("/run/ahriman/ahriman.pid")
with pytest.raises(ValueError):
args.lock = Path("/")
assert Lock(args, repository_id, configuration).path # special case

View File

@ -8,12 +8,12 @@ from ahriman.core.alpm.pacman_database import PacmanDatabase
from ahriman.core.exceptions import PacmanError
def test_copy(pacman_database: PacmanDatabase, mocker: MockerFixture) -> None:
def test_copy(mocker: MockerFixture) -> None:
"""
must copy local database file
"""
copy_mock = mocker.patch("shutil.copy")
pacman_database.copy(Path("remote"), Path("local"))
PacmanDatabase.copy(Path("remote"), Path("local"))
copy_mock.assert_called_once_with(Path("remote"), Path("local"))

View File

@ -26,28 +26,13 @@ def test_repo_add(repo: Repo, mocker: MockerFixture) -> None:
def test_repo_init(repo: Repo, mocker: MockerFixture) -> None:
"""
must create empty database files
must call repo-add with empty package list on repo initializing
"""
mocker.patch("pathlib.Path.exists", return_value=False)
touch_mock = mocker.patch("pathlib.Path.touch")
symlink_mock = mocker.patch("pathlib.Path.symlink_to")
check_output_mock = mocker.patch("ahriman.core.alpm.repo.check_output")
repo.init()
touch_mock.assert_called_once_with(exist_ok=True)
symlink_mock.assert_called_once_with(repo.repo_path)
def test_repo_init_skip(repo: Repo, mocker: MockerFixture) -> None:
"""
must do not create files if database already exists
"""
mocker.patch("pathlib.Path.exists", return_value=True)
touch_mock = mocker.patch("pathlib.Path.touch")
symlink_mock = mocker.patch("pathlib.Path.symlink_to")
repo.init()
touch_mock.assert_not_called()
symlink_mock.assert_not_called()
check_output_mock.assert_called_once() # it will be checked later
assert check_output_mock.call_args[0][0] == "repo-add"
def test_repo_remove(repo: Repo, mocker: MockerFixture) -> None:

View File

@ -74,7 +74,7 @@ def test_fetch_empty(remote_source: RemoteSource, mocker: MockerFixture) -> None
check_output_mock.assert_not_called()
def test_fetch_existing(remote_source: RemoteSource, mocker: MockerFixture) -> None:
def test_fetch_existing(sources: Sources, remote_source: RemoteSource, mocker: MockerFixture) -> None:
"""
must fetch new package via fetch command
"""
@ -86,18 +86,19 @@ def test_fetch_existing(remote_source: RemoteSource, mocker: MockerFixture) -> N
head_mock = mocker.patch("ahriman.core.build_tools.sources.Sources.head", return_value="sha")
local = Path("local")
assert Sources.fetch(local, remote_source) == "sha"
assert sources.fetch(local, remote_source) == "sha"
fetch_mock.assert_called_once_with(local, branch=remote_source.branch)
check_output_mock.assert_has_calls([
MockCall("git", "checkout", "--force", remote_source.branch, cwd=local, logger=pytest.helpers.anyvar(int)),
MockCall("git", "reset", "--quiet", "--hard", f"origin/{remote_source.branch}",
MockCall(*sources.git(), "checkout", "--force", remote_source.branch,
cwd=local, logger=pytest.helpers.anyvar(int)),
MockCall(*sources.git(), "reset", "--quiet", "--hard", f"origin/{remote_source.branch}",
cwd=local, logger=pytest.helpers.anyvar(int)),
])
move_mock.assert_called_once_with(local.resolve(), local)
head_mock.assert_called_once_with(local)
def test_fetch_new(remote_source: RemoteSource, mocker: MockerFixture) -> None:
def test_fetch_new(sources: Sources, remote_source: RemoteSource, mocker: MockerFixture) -> None:
"""
must fetch new package via clone command
"""
@ -107,19 +108,21 @@ def test_fetch_new(remote_source: RemoteSource, mocker: MockerFixture) -> None:
head_mock = mocker.patch("ahriman.core.build_tools.sources.Sources.head", return_value="sha")
local = Path("local")
assert Sources.fetch(local, remote_source) == "sha"
assert sources.fetch(local, remote_source) == "sha"
check_output_mock.assert_has_calls([
MockCall("git", "clone", "--quiet", "--depth", "1", "--branch", remote_source.branch, "--single-branch",
remote_source.git_url, str(local), cwd=local.parent, logger=pytest.helpers.anyvar(int)),
MockCall("git", "checkout", "--force", remote_source.branch, cwd=local, logger=pytest.helpers.anyvar(int)),
MockCall("git", "reset", "--quiet", "--hard", f"origin/{remote_source.branch}",
MockCall(*sources.git(), "clone", "--quiet", "--depth", "1", "--branch", remote_source.branch,
"--single-branch", remote_source.git_url, str(local),
cwd=local.parent, logger=pytest.helpers.anyvar(int)),
MockCall(*sources.git(), "checkout", "--force", remote_source.branch,
cwd=local, logger=pytest.helpers.anyvar(int)),
MockCall(*sources.git(), "reset", "--quiet", "--hard", f"origin/{remote_source.branch}",
cwd=local, logger=pytest.helpers.anyvar(int))
])
move_mock.assert_called_once_with(local.resolve(), local)
head_mock.assert_called_once_with(local)
def test_fetch_new_without_remote(mocker: MockerFixture) -> None:
def test_fetch_new_without_remote(sources: Sources, mocker: MockerFixture) -> None:
"""
must fetch nothing in case if no remote set
"""
@ -129,10 +132,11 @@ def test_fetch_new_without_remote(mocker: MockerFixture) -> None:
head_mock = mocker.patch("ahriman.core.build_tools.sources.Sources.head", return_value="sha")
local = Path("local")
assert Sources.fetch(local, RemoteSource(source=PackageSource.Archive)) == "sha"
assert sources.fetch(local, RemoteSource(source=PackageSource.Archive)) == "sha"
check_output_mock.assert_has_calls([
MockCall("git", "checkout", "--force", Sources.DEFAULT_BRANCH, cwd=local, logger=pytest.helpers.anyvar(int)),
MockCall("git", "reset", "--quiet", "--hard", f"origin/{Sources.DEFAULT_BRANCH}",
MockCall(*sources.git(), "checkout", "--force", sources.DEFAULT_BRANCH,
cwd=local, logger=pytest.helpers.anyvar(int)),
MockCall(*sources.git(), "reset", "--quiet", "--hard", f"origin/{sources.DEFAULT_BRANCH}",
cwd=local, logger=pytest.helpers.anyvar(int))
])
move_mock.assert_called_once_with(local.resolve(), local)
@ -153,15 +157,15 @@ def test_fetch_relative(remote_source: RemoteSource, mocker: MockerFixture) -> N
head_mock.assert_called_once_with(local)
def test_has_remotes(mocker: MockerFixture) -> None:
def test_has_remotes(sources: Sources, mocker: MockerFixture) -> None:
"""
must ask for remotes
"""
check_output_mock = mocker.patch("ahriman.core.build_tools.sources.check_output", return_value="origin")
local = Path("local")
assert Sources.has_remotes(local)
check_output_mock.assert_called_once_with("git", "remote", cwd=local, logger=pytest.helpers.anyvar(int))
assert sources.has_remotes(local)
check_output_mock.assert_called_once_with(*sources.git(), "remote", cwd=local, logger=pytest.helpers.anyvar(int))
def test_has_remotes_empty(mocker: MockerFixture) -> None:
@ -172,7 +176,7 @@ def test_has_remotes_empty(mocker: MockerFixture) -> None:
assert not Sources.has_remotes(Path("local"))
def test_init(mocker: MockerFixture) -> None:
def test_init(sources: Sources, mocker: MockerFixture) -> None:
"""
must create empty repository at the specified path
"""
@ -183,9 +187,9 @@ def test_init(mocker: MockerFixture) -> None:
commit_mock = mocker.patch("ahriman.core.build_tools.sources.Sources.commit")
local = Path("local")
Sources.init(local)
check_output_mock.assert_called_once_with("git", "init", "--quiet", "--initial-branch", Sources.DEFAULT_BRANCH,
cwd=local, logger=pytest.helpers.anyvar(int))
sources.init(local)
check_output_mock.assert_called_once_with(*sources.git(), "init", "--quiet", "--initial-branch",
sources.DEFAULT_BRANCH, cwd=local, logger=pytest.helpers.anyvar(int))
add_mock.assert_called_once_with(local, "PKGBUILD", ".SRCINFO", "local")
commit_mock.assert_called_once_with(local)
@ -267,7 +271,7 @@ def test_patch_create_with_newline(mocker: MockerFixture) -> None:
assert Sources.patch_create(Path("local"), "glob").endswith("\n")
def test_push(package_ahriman: Package, mocker: MockerFixture) -> None:
def test_push(package_ahriman: Package, sources: Sources, mocker: MockerFixture) -> None:
"""
must correctly push files to remote repository
"""
@ -277,11 +281,11 @@ def test_push(package_ahriman: Package, mocker: MockerFixture) -> None:
commit_author = ("commit author", "user@host")
local = Path("local")
Sources.push(local, package_ahriman.remote, "glob", commit_author=commit_author)
sources.push(local, package_ahriman.remote, "glob", commit_author=commit_author)
add_mock.assert_called_once_with(local, "glob")
commit_mock.assert_called_once_with(local, commit_author=commit_author)
check_output_mock.assert_called_once_with(
"git", "push", "--quiet", package_ahriman.remote.git_url, package_ahriman.remote.branch,
*sources.git(), "push", "--quiet", package_ahriman.remote.git_url, package_ahriman.remote.branch,
cwd=local, logger=pytest.helpers.anyvar(int))
@ -308,7 +312,7 @@ def test_add(sources: Sources, mocker: MockerFixture) -> None:
sources.add(local, "pattern1", "pattern2")
glob_mock.assert_has_calls([MockCall("pattern1"), MockCall("pattern2")])
check_output_mock.assert_called_once_with(
"git", "add", "1", "2", "1", "2", cwd=local, logger=sources.logger
*sources.git(), "add", "1", "2", "1", "2", cwd=local, logger=sources.logger
)
@ -323,7 +327,7 @@ def test_add_intent_to_add(sources: Sources, mocker: MockerFixture) -> None:
sources.add(local, "pattern1", "pattern2", intent_to_add=True)
glob_mock.assert_has_calls([MockCall("pattern1"), MockCall("pattern2")])
check_output_mock.assert_called_once_with(
"git", "add", "--intent-to-add", "1", "2", "1", "2", cwd=local, logger=sources.logger
*sources.git(), "add", "--intent-to-add", "1", "2", "1", "2", cwd=local, logger=sources.logger
)
@ -350,13 +354,8 @@ def test_commit(sources: Sources, mocker: MockerFixture) -> None:
user, email = sources.DEFAULT_COMMIT_AUTHOR
assert sources.commit(local, message=message)
check_output_mock.assert_called_once_with(
"git", "commit", "--quiet", "--message", message,
cwd=local, logger=sources.logger, environment={
"GIT_AUTHOR_NAME": user,
"GIT_AUTHOR_EMAIL": email,
"GIT_COMMITTER_NAME": user,
"GIT_COMMITTER_EMAIL": email,
}
*sources.git(), "-c", f"user.email=\"{email}\"", "-c", f"user.name=\"{user}\"",
"commit", "--quiet", "--message", message, cwd=local, logger=sources.logger
)
@ -383,13 +382,8 @@ def test_commit_author(sources: Sources, mocker: MockerFixture) -> None:
user, email = author = ("commit author", "user@host")
assert sources.commit(Path("local"), message=message, commit_author=author)
check_output_mock.assert_called_once_with(
"git", "commit", "--quiet", "--message", message,
cwd=local, logger=sources.logger, environment={
"GIT_AUTHOR_NAME": user,
"GIT_AUTHOR_EMAIL": email,
"GIT_COMMITTER_NAME": user,
"GIT_COMMITTER_EMAIL": email,
}
*sources.git(), "-c", f"user.email=\"{email}\"", "-c", f"user.name=\"{user}\"",
"commit", "--quiet", "--message", message, cwd=local, logger=sources.logger
)
@ -404,13 +398,8 @@ def test_commit_autogenerated_message(sources: Sources, mocker: MockerFixture) -
assert sources.commit(Path("local"))
user, email = sources.DEFAULT_COMMIT_AUTHOR
check_output_mock.assert_called_once_with(
"git", "commit", "--quiet", "--message", pytest.helpers.anyvar(str, strict=True),
cwd=local, logger=sources.logger, environment={
"GIT_AUTHOR_NAME": user,
"GIT_AUTHOR_EMAIL": email,
"GIT_COMMITTER_NAME": user,
"GIT_COMMITTER_EMAIL": email,
}
*sources.git(), "-c", f"user.email=\"{email}\"", "-c", f"user.name=\"{user}\"",
"commit", "--quiet", "--message", pytest.helpers.anyvar(str, strict=True), cwd=local, logger=sources.logger
)
@ -422,7 +411,7 @@ def test_diff(sources: Sources, mocker: MockerFixture) -> None:
local = Path("local")
assert sources.diff(local)
check_output_mock.assert_called_once_with("git", "diff", cwd=local, logger=sources.logger)
check_output_mock.assert_called_once_with(*sources.git(), "diff", cwd=local, logger=sources.logger)
def test_diff_specific(sources: Sources, mocker: MockerFixture) -> None:
@ -433,7 +422,7 @@ def test_diff_specific(sources: Sources, mocker: MockerFixture) -> None:
local = Path("local")
assert sources.diff(local, "hash")
check_output_mock.assert_called_once_with("git", "diff", "hash", cwd=local, logger=sources.logger)
check_output_mock.assert_called_once_with(*sources.git(), "diff", "hash", cwd=local, logger=sources.logger)
def test_fetch_until(sources: Sources, mocker: MockerFixture) -> None:
@ -450,10 +439,12 @@ def test_fetch_until(sources: Sources, mocker: MockerFixture) -> None:
local = Path("local")
sources.fetch_until(local, branch="master", commit_sha="sha")
check_output_mock.assert_has_calls([
MockCall("git", "fetch", "--quiet", "--depth", "1", "origin", "master", cwd=local, logger=sources.logger),
MockCall("git", "cat-file", "-e", "sha", cwd=local, logger=sources.logger),
MockCall("git", "fetch", "--quiet", "--depth", "2", "origin", "master", cwd=local, logger=sources.logger),
MockCall("git", "cat-file", "-e", "sha", cwd=local, logger=sources.logger),
MockCall(*sources.git(), "fetch", "--quiet", "--depth", "1", "origin", "master",
cwd=local, logger=sources.logger),
MockCall(*sources.git(), "cat-file", "-e", "sha", cwd=local, logger=sources.logger),
MockCall(*sources.git(), "fetch", "--quiet", "--depth", "2", "origin", "master",
cwd=local, logger=sources.logger),
MockCall(*sources.git(), "cat-file", "-e", "sha", cwd=local, logger=sources.logger),
])
@ -466,8 +457,9 @@ def test_fetch_until_first(sources: Sources, mocker: MockerFixture) -> None:
local = Path("local")
sources.fetch_until(local, branch="master")
check_output_mock.assert_has_calls([
MockCall("git", "fetch", "--quiet", "--depth", "1", "origin", "master", cwd=local, logger=sources.logger),
MockCall("git", "cat-file", "-e", "HEAD", cwd=local, logger=sources.logger),
MockCall(*sources.git(), "fetch", "--quiet", "--depth", "1", "origin", "master",
cwd=local, logger=sources.logger),
MockCall(*sources.git(), "cat-file", "-e", "HEAD", cwd=local, logger=sources.logger),
])
@ -480,11 +472,27 @@ def test_fetch_until_all_branches(sources: Sources, mocker: MockerFixture) -> No
local = Path("local")
sources.fetch_until(local)
check_output_mock.assert_has_calls([
MockCall("git", "fetch", "--quiet", "--depth", "1", cwd=local, logger=sources.logger),
MockCall("git", "cat-file", "-e", "HEAD", cwd=local, logger=sources.logger),
MockCall(*sources.git(), "fetch", "--quiet", "--depth", "1", cwd=local, logger=sources.logger),
MockCall(*sources.git(), "cat-file", "-e", "HEAD", cwd=local, logger=sources.logger),
])
def test_git(sources: Sources) -> None:
"""
must correctly generate git command
"""
assert sources.git() == ["git", "-c", "init.defaultBranch=\"master\""]
def test_git_overrides(sources: Sources) -> None:
"""
must correctly generate git command with additional settings
"""
assert sources.git({"user.email": "ahriman@localhost"}) == [
"git", "-c", "init.defaultBranch=\"master\"", "-c", "user.email=\"ahriman@localhost\""
]
def test_has_changes(sources: Sources, mocker: MockerFixture) -> None:
"""
must correctly identify if there are changes
@ -493,12 +501,12 @@ def test_has_changes(sources: Sources, mocker: MockerFixture) -> None:
check_output_mock = mocker.patch("ahriman.core.build_tools.sources.check_output", return_value="M a.txt")
assert sources.has_changes(local)
check_output_mock.assert_called_once_with("git", "diff", "--cached", "--name-only",
check_output_mock.assert_called_once_with(*sources.git(), "diff", "--cached", "--name-only",
cwd=local, logger=sources.logger)
check_output_mock = mocker.patch("ahriman.core.build_tools.sources.check_output", return_value="")
assert not sources.has_changes(local)
check_output_mock.assert_called_once_with("git", "diff", "--cached", "--name-only",
check_output_mock.assert_called_once_with(*sources.git(), "diff", "--cached", "--name-only",
cwd=local, logger=sources.logger)
@ -510,7 +518,7 @@ def test_head(sources: Sources, mocker: MockerFixture) -> None:
local = Path("local")
assert sources.head(local) == "sha"
check_output_mock.assert_called_once_with("git", "rev-parse", "HEAD", cwd=local, logger=sources.logger)
check_output_mock.assert_called_once_with(*sources.git(), "rev-parse", "HEAD", cwd=local, logger=sources.logger)
def test_head_specific(sources: Sources, mocker: MockerFixture) -> None:
@ -521,7 +529,7 @@ def test_head_specific(sources: Sources, mocker: MockerFixture) -> None:
local = Path("local")
assert sources.head(local, "master") == "sha"
check_output_mock.assert_called_once_with("git", "rev-parse", "master", cwd=local, logger=sources.logger)
check_output_mock.assert_called_once_with(*sources.git(), "rev-parse", "master", cwd=local, logger=sources.logger)
def test_move(sources: Sources, mocker: MockerFixture) -> None:
@ -554,7 +562,7 @@ def test_patch_apply(sources: Sources, mocker: MockerFixture) -> None:
local = Path("local")
sources.patch_apply(local, patch)
check_output_mock.assert_called_once_with(
"git", "apply", "--ignore-space-change", "--ignore-whitespace",
*sources.git(), "apply", "--ignore-space-change", "--ignore-whitespace",
cwd=local, input_data=patch.value, logger=sources.logger
)

View File

@ -4,6 +4,7 @@ from pytest_mock import MockerFixture
from ahriman.core.configuration import Configuration
from ahriman.core.database import SQLite
from ahriman.models.repository_id import RepositoryId
def test_load(configuration: Configuration, mocker: MockerFixture) -> None:
@ -35,7 +36,7 @@ def test_init_skip_migration(database: SQLite, configuration: Configuration, moc
migrate_schema_mock.assert_not_called()
def test_package_clear(database: SQLite, mocker: MockerFixture) -> None:
def test_package_clear(database: SQLite, repository_id: RepositoryId, mocker: MockerFixture) -> None:
"""
must clear package data
"""
@ -44,12 +45,14 @@ def test_package_clear(database: SQLite, mocker: MockerFixture) -> None:
logs_mock = mocker.patch("ahriman.core.database.SQLite.logs_remove")
changes_mock = mocker.patch("ahriman.core.database.SQLite.changes_remove")
dependencies_mock = mocker.patch("ahriman.core.database.SQLite.dependencies_remove")
package_mock = mocker.patch("ahriman.core.database.SQLite.package_remove")
tree_clear_mock = mocker.patch("ahriman.models.repository_paths.RepositoryPaths.tree_clear")
database.package_clear("package")
build_queue_mock.assert_called_once_with("package")
patches_mock.assert_called_once_with("package", [])
logs_mock.assert_called_once_with("package", None)
changes_mock.assert_called_once_with("package")
dependencies_mock.assert_called_once_with("package")
database.package_clear("package", repository_id)
build_queue_mock.assert_called_once_with("package", repository_id)
patches_mock.assert_called_once_with("package", None)
logs_mock.assert_called_once_with("package", None, repository_id)
changes_mock.assert_called_once_with("package", repository_id)
dependencies_mock.assert_called_once_with("package", repository_id)
package_mock.assert_called_once_with("package", repository_id)
tree_clear_mock.assert_called_once_with("package")

View File

@ -158,7 +158,7 @@ def test_package_remove(local_client: LocalClient, package_ahriman: Package, moc
"""
package_mock = mocker.patch("ahriman.core.database.SQLite.package_clear")
local_client.package_remove(package_ahriman.base)
package_mock.assert_called_once_with(package_ahriman.base)
package_mock.assert_called_once_with(package_ahriman.base, local_client.repository_id)
def test_package_status_update(local_client: LocalClient, package_ahriman: Package, mocker: MockerFixture) -> None:

View File

@ -101,13 +101,11 @@ def test_package_remove(watcher: Watcher, package_ahriman: Package, mocker: Mock
must remove package base
"""
cache_mock = mocker.patch("ahriman.core.status.local_client.LocalClient.package_remove")
logs_mock = mocker.patch("ahriman.core.status.watcher.Watcher.package_logs_remove", create=True)
watcher._known = {package_ahriman.base: (package_ahriman, BuildStatus())}
watcher.package_remove(package_ahriman.base)
assert not watcher._known
cache_mock.assert_called_once_with(package_ahriman.base)
logs_mock.assert_called_once_with(package_ahriman.base, None)
def test_package_remove_unknown(watcher: Watcher, package_ahriman: Package, mocker: MockerFixture) -> None:

View File

@ -7,6 +7,7 @@ from pytest_mock import MockerFixture
from ahriman import __version__
from ahriman.core.alpm.pacman import Pacman
from ahriman.core.alpm.remote import AUR
from ahriman.core.configuration import Configuration
from ahriman.models.build_status import BuildStatus, BuildStatusEnum
from ahriman.models.counters import Counters
from ahriman.models.filesystem_package import FilesystemPackage
@ -17,6 +18,7 @@ from ahriman.models.package_description import PackageDescription
from ahriman.models.package_source import PackageSource
from ahriman.models.remote_source import RemoteSource
from ahriman.models.repository_paths import RepositoryPaths
from ahriman.models.scan_paths import ScanPaths
@pytest.fixture
@ -77,7 +79,7 @@ def internal_status(counters: Counters) -> InternalStatus:
@pytest.fixture
def package_archive_ahriman(package_ahriman: Package, repository_paths: RepositoryPaths, pacman: Pacman,
passwd: Any, mocker: MockerFixture) -> PackageArchive:
scan_paths: ScanPaths, passwd: Any, mocker: MockerFixture) -> PackageArchive:
"""
package archive fixture
@ -85,6 +87,7 @@ def package_archive_ahriman(package_ahriman: Package, repository_paths: Reposito
package_ahriman(Package): package test instance
repository_paths(RepositoryPaths): repository paths test instance
pacman(Pacman): pacman test instance
scan_paths(ScanPaths): scan paths test instance
passwd(Any): passwd structure test instance
mocker(MockerFixture): mocker object
@ -92,7 +95,7 @@ def package_archive_ahriman(package_ahriman: Package, repository_paths: Reposito
PackageArchive: package archive test instance
"""
mocker.patch("ahriman.models.repository_paths.getpwuid", return_value=passwd)
return PackageArchive(repository_paths.build_directory, package_ahriman, pacman)
return PackageArchive(repository_paths.build_directory, package_ahriman, pacman, scan_paths)
@pytest.fixture
@ -158,3 +161,20 @@ def pyalpm_package_description_ahriman(package_description_ahriman: PackageDescr
type(mock).provides = PropertyMock(return_value=package_description_ahriman.provides)
type(mock).url = PropertyMock(return_value=package_description_ahriman.url)
return mock
@pytest.fixture
def scan_paths(configuration: Configuration) -> ScanPaths:
"""
scan paths fixture
Args:
configuration(Configuration): configuration test instance
Returns:
ScanPaths: scan paths test instance
"""
return ScanPaths(
allowed_paths=configuration.getpathlist("build", "allowed_scan_paths"),
blacklisted_paths=configuration.getpathlist("build", "blacklisted_scan_paths"),
)

View File

@ -134,8 +134,10 @@ def test_refine_dependencies(package_archive_ahriman: PackageArchive, mocker: Mo
path1 = Path("usr") / "lib" / "python3.12"
path2 = path1 / "site-packages"
path3 = Path("etc")
path4 = Path("var") / "lib" / "whatever"
path3 = Path("usr") / "lib" / "path"
path4 = Path("usr") / "lib" / "whatever"
path5 = Path("usr") / "share" / "applications"
path6 = Path("etc")
package1 = FilesystemPackage(package_name="package1", depends={"package5"}, opt_depends={"package2"})
package2 = FilesystemPackage(package_name="package2", depends={"package1"}, opt_depends=set())
@ -149,6 +151,8 @@ def test_refine_dependencies(package_archive_ahriman: PackageArchive, mocker: Mo
path2: [package1, package2, package3, package5],
path3: [package1, package4],
path4: [package1],
path5: [package1],
path6: [package1],
}) == {
path1: [package6],
path2: [package1, package5],

View File

@ -7,10 +7,17 @@ def test_id() -> None:
"""
must correctly generate id
"""
assert RepositoryId("", "").id == ""
assert RepositoryId("arch", "repo").id == "arch-repo"
def test_id_empty() -> None:
"""
must raise exception on empty identifier
"""
with pytest.raises(ValueError):
assert RepositoryId("", "").id
def test_is_empty() -> None:
"""
must check if repository id is empty or not

View File

@ -0,0 +1,42 @@
from pathlib import Path
from ahriman.models.scan_paths import ScanPaths
def test_post_init(scan_paths: ScanPaths) -> None:
"""
must convert paths to / relative
"""
assert all(not path.is_absolute() for path in scan_paths.allowed_paths)
assert all(not path.is_absolute() for path in scan_paths.blacklisted_paths)
def test_is_allowed() -> None:
"""
must check if path is subpath of one in allowed list
"""
assert ScanPaths(allowed_paths=[Path("/") / "usr"], blacklisted_paths=[]).is_allowed(Path("usr"))
assert ScanPaths(allowed_paths=[Path("/") / "usr"], blacklisted_paths=[]).is_allowed(Path("usr") / "lib")
assert not ScanPaths(allowed_paths=[Path("/") / "usr"], blacklisted_paths=[]).is_allowed(Path("var"))
def test_is_blacklisted() -> None:
"""
must check if path is not subpath of one in blacklist
"""
assert ScanPaths(
allowed_paths=[Path("/") / "usr"],
blacklisted_paths=[Path("/") / "usr" / "lib"],
).is_allowed(Path("usr"))
assert ScanPaths(
allowed_paths=[Path("/") / "usr", Path("/") / "var"],
blacklisted_paths=[Path("/") / "usr" / "lib"],
).is_allowed(Path("var"))
assert not ScanPaths(
allowed_paths=[Path("/") / "usr"],
blacklisted_paths=[Path("/") / "usr" / "lib"],
).is_allowed(Path(" usr") / "lib")
assert not ScanPaths(
allowed_paths=[Path("/") / "usr"],
blacklisted_paths=[Path("/") / "usr" / "lib"],
).is_allowed(Path("usr") / "lib" / "qt")

View File

@ -201,7 +201,7 @@ def test_service_not_found(base: BaseView) -> None:
must raise HTTPNotFound if no repository found
"""
with pytest.raises(HTTPNotFound):
base.service(RepositoryId("", ""))
base.service(RepositoryId("repo", "arch"))
def test_service_package(base: BaseView, repository_id: RepositoryId, mocker: MockerFixture) -> None:

View File

@ -20,7 +20,9 @@ salt = salt
allow_read_only = no
[build]
allowed_scan_paths = /usr/lib
archbuild_flags =
blacklisted_scan_paths = /usr/lib/cmake
build_command = extra-x86_64-build
ignore_packages =
makechrootpkg_flags =

View File

@ -11,6 +11,7 @@ flags = --implicit-reexport --strict --allow-untyped-decorators --allow-subclass
[pytest]
addopts = --cov=ahriman --cov-report=term-missing:skip-covered --no-cov-on-fail --cov-fail-under=100 --spec
asyncio_default_fixture_loop_scope = function
asyncio_mode = auto
spec_test_format = {result} {docstring_summary}