Package server#
Mirroring#
Debmirror#
Debmirror is one of the existing programs able to mirror Debian APT packages. In this example we will use the Apache HTTP server to serve the packages.
See also
Debmirror: creiamo un mirror Debian - Guide@Debianizzati.Org [1]
Mirantis Documentation: Usage ~ DEBMIRROR [2]
Apache Tips & Tricks: Hide a file type from directory indexes · MD/Blog [3]
Set Up A Local Ubuntu / Debian Mirror with Apt-Mirror | Programster’s Blog [4]
Debian – Mirror Size [5]
Comment #4 : Bug #882941 : Bugs : debmirror package : Ubuntu [6]
Debian – Setting up a Debian archive mirror [7]
Better Default Directory Views with HTAccess | Perishable Press [8]
install
apt-get install debmirror bindfs
create a new user
useradd -m -s /bin/bash -U debmirror
passwd debmirror
usermod -aG jobs debmirror
mkdir /home/debmirror/data
chown debmirror:debmirror /home/debmirror/data
Note

In this example
/home/debmirror/data
is the base directory where all packages are served from.

Tip

I suggest using normal HDDs rather than SSDs: size is more important than speed in this case.
create the jobs directories
mkdir -p /home/jobs/{services,scripts}/by-user/debmirror
load APT’s keyring into a new keyring owned by the
debmirror
user

sudo -i -u debmirror
gpg --keyring /usr/share/keyrings/debian-archive-keyring.gpg --export \
    | gpg --no-default-keyring --keyring trustedkeys.gpg --import
exit
add this
configuration
for the standard Debian repository

# The config file is a perl script so take care to follow perl syntax.
# Any setting in /etc/debmirror.conf overrides these defaults and
# ~/.debmirror.conf overrides those again. Take only what you need.
#
# The syntax is the same as on the command line and variable names
# loosely match option names. If you don't recognize something here
# then just stick to the command line.
#
# Options specified on the command line override settings in the config
# files.

# Location of the local mirror (use with care)
# $mirrordir="/path/to/mirrordir"

# Output options
$verbose=1;
$progress=1;
$debug=0;

$remoteroot="debian";

# Download options
$host="debian.netcologne.de";
$download_method="rsync";
@dists="stable,oldstable,buster-updates,bullseye-updates,buster-backports,bullseye-backports";
@sections="main";
@arches="amd64,all,any";
$omit_suite_symlinks=0;
$skippackages=0;
# @rsync_extra="none";
$i18n=1;
$getcontents=1;
$do_source=1;
$max_batch=0;

# Includes other translations as well.
# See the exclude option in
# https://help.ubuntu.com/community/Debmirror
@includes="Translation-(en|it).*";

# @di_dists="dists";
# @di_archs="arches";

# Save mirror state between runs; value sets validity of cache in days
$state_cache_days=0;

# Security/Sanity options
$ignore_release_gpg=0;
$ignore_release=0;
$check_md5sums=0;
$ignore_small_errors=1;

# Cleanup
$cleanup=0;
$post_cleanup=1;

# Locking options
$timeout=300;

# Rsync options
$rsync_batch=200;
$rsync_options="-aIL --partial --bwlimit=10240";

# FTP/HTTP options
$passive=0;
# $proxy="http://proxy:port/";

# Dry run
$dry_run=0;

# Don't keep diff files but use them
$diff_mode="use";

# The config file must return true or perl complains.
# Always copy this.
1;
create the
Systemd service unit file
[Unit]
Description=Debmirror debian
Requires=network-online.target
After=network-online.target

[Service]
Type=simple
ExecStart=/usr/bin/debmirror --config-file=/home/jobs/scripts/by-user/debmirror/debmirror.debian.conf /home/debmirror/data/debian
User=debmirror
Group=debmirror
create the
Systemd service timer unit file
[Unit]
Description=Once a day debmirror debian

[Timer]
OnCalendar=*-*-* 1:30:00
Persistent=true

[Install]
WantedBy=timers.target
create a directory readable by Apache
mkdir -p /srv/http/debian
chown www-data:www-data /srv/http/debian
chmod 700 /srv/http/debian
Add this to the fstab file
/home/debmirror/data /srv/http/debian fuse.bindfs auto,force-user=www-data,force-group=www-data,ro 0 0
Note

This mount entry makes the directory exposed to the webserver, in this case
/srv/http/debian
, read-only.
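To sanity-check the bindfs entry, a script can parse the mounts table and confirm that the export really is read-only. A minimal sketch, using only the standard /proc/mounts format; the `is_readonly_mount` helper is illustrative, not part of debmirror or bindfs:

```python
def is_readonly_mount(mounts_text: str, mount_point: str) -> bool:
    """Return True if mount_point is listed with the 'ro' option.

    mounts_text follows the /proc/mounts format:
    <source> <mount point> <fstype> <options> <dump> <pass>
    """
    for line in mounts_text.splitlines():
        fields = line.split()
        if len(fields) >= 4 and fields[1] == mount_point:
            return 'ro' in fields[3].split(',')
    return False


# On a live system you would pass open('/proc/mounts').read() instead.
sample = ('/home/debmirror/data /srv/http/debian fuse.bindfs '
          'ro,force-user=www-data,force-group=www-data 0 0')
print(is_readonly_mount(sample, '/srv/http/debian'))  # True
```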
serve the files via HTTP by creating a new
Apache virtual host
. Replace
FQDN
with the appropriate domain and include this file from the Apache configuration

<IfModule mod_ssl.c>
    <VirtualHost *:443>
        UseCanonicalName on
        Keepalive On
        RewriteEngine on

        ServerName ${FQDN}

        # Set the icons also to avoid 404 errors.
        Alias /icons/ "/usr/share/apache2/icons/"

        DocumentRoot "/srv/http/debian"
        <Directory "/srv/http/debian">
            Options -ExecCGI -Includes
            Options +Indexes +SymlinksIfOwnerMatch
            IndexOptions NameWidth=* +SuppressDescription FancyIndexing Charset=UTF-8 VersionSort FoldersFirst
            ReadmeName footer.html
            IndexIgnore header.html footer.html

            #
            # AllowOverride controls what directives may be placed in .htaccess files.
            # It can be "All", "None", or any combination of the keywords:
            #   AllowOverride FileInfo AuthConfig Limit
            #
            AllowOverride All

            #
            # Controls who can get stuff from this server.
            #
            Require all granted
        </Directory>

        SSLCompression off

        Include /etc/letsencrypt/options-ssl-apache.conf
        SSLCertificateFile /etc/letsencrypt/live/${FQDN}/fullchain.pem
        SSLCertificateKeyFile /etc/letsencrypt/live/${FQDN}/privkey.pem
    </VirtualHost>
</IfModule>
create a new
text file
that will serve as basic instructions for configuring the APT sources file. Replace
FQDN
with the appropriate domain

<h1>Examples</h1>

Change <code>/etc/apt/sources.list</code> to one of these:

<h2>Bullseye distribution</h2>
<pre>
deb https://${FQDN}/debian bullseye main
deb https://${FQDN}/debian bullseye-updates main
deb-src https://${FQDN}/debian-security bullseye-security main
deb https://${FQDN}/debian bullseye-backports main
deb [arch=amd64] https://${FQDN}/docker bullseye stable
deb [arch=amd64] https://${FQDN}/gitea gitea main
deb [arch=amd64] https://${FQDN}/postgresql bullseye-pgdg main
</pre>

<h2>Buster distribution</h2>
<pre>
deb https://${FQDN}/debian buster main
deb https://${FQDN}/debian buster-updates main
deb-src https://${FQDN}/debian-security buster/updates main
deb https://${FQDN}/debian buster-backports main
deb [arch=amd64] https://${FQDN}/docker buster stable
deb [arch=amd64] https://${FQDN}/gitea gitea main
deb [arch=amd64] https://${FQDN}/postgresql buster-pgdg main
</pre>

<h1>Repositories</h1>

<code>stable</code> and <code>oldstable</code> distributions are available if applicable.

<h2>debian</h2>
<p>Supported architectures:</p>
<ul>
    <li><code>amd64</code></li>
    <li><code>all</code></li>
    <li><code>any</code></li>
</ul>

<h2>debian-security</h2>
<p>Supported architectures:</p>
<ul>
    <li><code>amd64</code></li>
    <li><code>all</code></li>
    <li><code>any</code></li>
</ul>

<h2>docker</h2>
<p>Supported architectures:</p>
<ul>
    <li><code>amd64</code></li>
</ul>

<h2>gitea</h2>
<p>Supported architectures:</p>
<ul>
    <li><code>amd64</code></li>
</ul>

<h2>postgresql</h2>
<p>Supported architectures:</p>
<ul>
    <li><code>amd64</code></li>
</ul>
Note
This example includes some unofficial repositories.
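The stanzas in the instructions file differ between distributions only in the suite names (note bullseye-security versus buster/updates for the security archive). A hedged sketch of how the official-repository lines could be templated, using a hypothetical mirror domain; the `sources_stanza` helper is illustrative:

```python
def sources_stanza(fqdn: str, dist: str) -> str:
    """Build the official-repository lines of sources.list for one distribution.

    The security suite naming changed with bullseye:
    'bullseye-security' vs the older 'buster/updates'.
    """
    security = f"{dist}-security" if dist == "bullseye" else f"{dist}/updates"
    return "\n".join([
        f"deb https://{fqdn}/debian {dist} main",
        f"deb https://{fqdn}/debian {dist}-updates main",
        f"deb-src https://{fqdn}/debian-security {security} main",
        f"deb https://{fqdn}/debian {dist}-backports main",
    ])


print(sources_stanza("mirror.example.org", "bullseye"))
```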
restart the Apache webserver
systemctl restart apache2
fix the permissions
chown -R debmirror:debmirror /home/jobs/{services,scripts}/by-user/debmirror
chmod 700 -R /home/jobs/{services,scripts}/by-user/debmirror
run the deploy script
Unofficial Debian sources#
If you want to mirror unofficial Debian sources the same instructions apply; you just need to change the key import step
sudo -i -u debmirror
gpg \
--no-default-keyring \
--keyring trustedkeys.gpg \
--import ${package_signing_key}
exit
Note
package_signing_key
is provided by the repository maintainers.
PyPI server#
Build Python packages using git sources and push them to a self-hosted PyPI server.
Server#
See also
Minimal PyPI server for uploading & downloading packages with pip/easy_install Resources [9]
follow the Docker instructions
create the jobs directories
mkdir -p /home/jobs/scripts/by-user/root/docker/pypiserver
chmod 700 /home/jobs/scripts/by-user/root/docker/pypiserver
install and run pypiserver. Use this
Docker compose file
version: '3.7'
services:
  pypiserver-authenticated:
    image: pypiserver/pypiserver:latest
    volumes:
      # Authentication file.
      - type: bind
        source: /home/jobs/scripts/by-user/root/docker/pypiserver/auth
        target: /data/auth
      # Python files.
      - type: bind
        source: /data/pypiserver/packages
        target: /data/packages
    ports:
      - "127.0.0.1:4000:8080"
    # I have purposefully removed the
    #   --fallback-url https://pypi.org/simple/
    # option to have a fully isolated environment.
    command: --disable-fallback --passwords /data/auth/.htpasswd --authenticate update /data/packages
create a
Systemd unit file
. See also the Docker compose services section

[Unit]
Requires=docker.service
Requires=network-online.target
After=docker.service
After=network-online.target

[Service]
Type=simple
WorkingDirectory=/home/jobs/scripts/by-user/root/docker/pypiserver
ExecStart=/usr/bin/docker-compose up --remove-orphans
ExecStop=/usr/bin/docker-compose down --remove-orphans
Restart=always

[Install]
WantedBy=multi-user.target
fix the permissions
chmod 700 /home/jobs/scripts/by-user/root/docker/pypiserver
chmod 700 -R /home/jobs/services/by-user/root
run the deploy script
set the reverse proxy port in your webserver configuration to
4000
Apache configuration#
If you use Apache as your webserver you should enable caching: the upstream documentation shows how to configure pypiserver for Nginx but not for Apache.
create a new
Apache virtual host
. Replace
FQDN
with the appropriate domain

###############
# pypiserver  #
###############
<IfModule mod_ssl.c>
    <VirtualHost *:443>
        UseCanonicalName on
        Keepalive On
        RewriteEngine on

        ServerName ${FQDN}

        SSLCompression off

        RewriteRule ^/simple$ /simple/ [R]
        ProxyPass / http://127.0.0.1:4000/ Keepalive=On max=50 timeout=300 connectiontimeout=10
        ProxyPassReverse / http://127.0.0.1:4000/
        RequestHeader set X-Forwarded-Proto "https"
        RequestHeader set X-Forwarded-Port "443"
        RequestHeader set X-Forwarded-Host "${FQDN}"
        Header set Service "pypi"

        CacheRoot "/var/cache/apache"
        CacheEnable disk /
        CacheDirLevels 4
        CacheDirLength 1
        CacheDefaultExpire 3600
        CacheIgnoreNoLastMod On
        CacheIgnoreCacheControl On
        CacheMaxFileSize 640000
        CacheReadSize 1024
        CacheIgnoreQueryString On
        CacheIgnoreHeaders X-Forwarded-Proto X-Forwarded-For X-Forwarded-Host

        # Debug. Turn these two variables off after testing.
        CacheHeader on
        CacheDetailHeader On

        Include /etc/letsencrypt/options-ssl-apache.conf
        SSLCertificateFile /etc/letsencrypt/live/${FQDN}/fullchain.pem
        SSLCertificateKeyFile /etc/letsencrypt/live/${FQDN}/privkey.pem
    </VirtualHost>
</IfModule>
Warning

The included
Cache*
options are very aggressive!

create the cache directory
mkdir /var/cache/apache
chown www-data:www-data /var/cache/apache
chmod 775 /var/cache/apache
enable the Apache modules
a2enmod cache cache_disk
systemctl start apache-htcacheclean.service
systemctl restart apache2
check for a cache hit. Replace
FQDN
with the appropriate domain

curl -s https://${FQDN} 1>/dev/null 2>/dev/null
curl -s -D - https://${FQDN} | head -n 20
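With CacheHeader enabled, Apache's mod_cache adds an X-Cache header (typically a value like "HIT from <hostname>" or "MISS from <hostname>") which the curl -D output above exposes. A small sketch for extracting it from raw header text; the `x_cache_status` helper is illustrative:

```python
def x_cache_status(raw_headers: str) -> str:
    """Return the value of the X-Cache header, or '' if absent."""
    for line in raw_headers.splitlines():
        name, _, value = line.partition(':')
        if name.strip().lower() == 'x-cache':
            return value.strip()
    return ''


# Sample response headers as produced by 'curl -s -D -'.
sample = ('HTTP/1.1 200 OK\r\n'
          'Content-Type: text/html\r\n'
          'X-Cache: HIT from mirror.example.org\r\n')
print(x_cache_status(sample))  # HIT from mirror.example.org
```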
Note

The
/packages/
page does not get cached.

set
CacheHeader
and
CacheDetailHeader
to
Off
restart Apache
systemctl restart apache2
Virtual machine compiling the packages#
I suggest using a virtual machine to compile packages to improve isolation and security: arbitrary code might be executed when compiling a package.
See also
A collection of scripts I have written and/or adapted that I currently use on my systems as automated tasks [13]
Git repository pointers and configurations to build Python packages from source [14]
python - pushd through os.system - Stack Overflow [15]
Pass options to `build_ext` · Issue #328 · pypa/build · GitHub [16]
create a virtual machine with Debian Bullseye (stable) and transform it into Sid (unstable). Using the unstable version will provide more up to date software for development.
See the QEMU server section. You might need to assign a lot of disk space.
connect to the virtual machine. See the QEMU client section
create a new user
sudo -i
useradd --system -s /bin/bash -U python-source-package-updater
passwd python-source-package-updater
usermod -aG jobs python-source-package-updater
create the jobs directories. See reference
mkdir -p /home/jobs/{scripts,services}/by-user/python-source-package-updater
chown -R python-source-package-updater:python-source-package-updater /home/jobs/{scripts,services}/by-user/python-source-package-updater
chmod 700 -R /home/jobs/{scripts,services}/by-user/python-source-package-updater
install these packages in the virtual machine:
apt-get install build-essential fakeroot devscripts git python3-dev python3-all-dev \
    games-python3-dev libgmp-dev libssl-dev libssl1.1=1.1.1k-1 libcurl4-openssl-dev \
    python3-pip python3-build twine libffi-dev graphviz libgraphviz-dev pkg-config \
    clang-tools libblas-dev astro-all libblas-dev libatlas-base-dev libopenblas-dev \
    libgsl-dev libblis-dev liblapack-dev liblapack3 libgslcblas0 libopenblas-base \
    libatlas3-base libblas3 clang-9 clang-13 clang-12 clang-11 sphinx-doc \
    libbliss-dev libblis-dev libbliss2 libblis64-serial-dev libblis64-pthread-dev \
    libblis64-openmp-dev libblis64-3-serial libblis64-dev libblis64-3-pthread \
    libblis64-3-openmp libblis64-3 libblis3-serial libblis3-pthread \
    libblis-serial-dev libblis-pthread-dev libargon2-dev libargon2-0 libargon2-1
Note
This is just a selection. Some Python packages need other dependencies not listed here.
install the dependencies of the script
apt-get install python3-yaml python3-fpyutils
pip3 install --user platformdirs
install fpyutils. See reference
add the
script
#!/usr/bin/env python3
#
# build_python_packages.py
#
# Copyright (C) 2023 Franco Masotti (franco \D\o\T masotti {-A-T-} tutanota \D\o\T com)
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
#

import logging
import os
import pathlib
import shlex
import signal
import subprocess
import sys

import fpyutils
import platformdirs
import yaml


def _setup_logging() -> logging.Logger:
    # See
    # https://python3docs.franco.net.eu.org/howto/logging.html#logging-advanced-tutorial
    logger = logging.getLogger('build_python_packages.py')
    logger.setLevel(logging.DEBUG)

    # Console logging
    ch = logging.StreamHandler()
    ch.setLevel(logging.DEBUG)
    formatter = logging.Formatter(
        '%(asctime)s - %(name)s - %(levelname)s - %(message)s')
    ch.setFormatter(formatter)
    logger.addHandler(ch)

    return logger


def _read_yaml_file(yaml_file: str) -> dict:
    data: dict = dict()
    if pathlib.Path(yaml_file).is_file():
        data = yaml.load(open(yaml_file), Loader=yaml.SafeLoader)

    return data


def _write_yaml_file(data: dict, yaml_file: str) -> None:
    with open(yaml_file, 'w') as f:
        f.write(yaml.dump(data))


# This class represents a single submodule.
class SubmoduleConfiguration:
    def __init__(self,
                 path: str,
                 skip_repository: bool = False,
                 mark_skip_repository_successfull: bool = False,
                 mark_failed_build_or_upload_successfull: bool = False,
                 relative_base_directory_override: str = '',
                 override_commands: dict = dict(),
                 ref_checkout: list = list()):
        r"""Create a SubmoduleConfiguration."""
        self.path: str = shlex.quote(path)
        self.skip_repository: bool = skip_repository

        # Common to all SubmoduleConfiguration instances.
        self.mark_skip_repository_successfull = mark_skip_repository_successfull
        self.mark_failed_build_or_upload_successfull = mark_failed_build_or_upload_successfull

        # Remote configuration
        self.relative_base_directory_override: str = relative_base_directory_override
        self.override_commands: dict = override_commands
        self.ref_checkout: list = ref_checkout

    def _execute_override_commands(self, command_type: str = 'pre'):
        cmd: list
        if (isinstance(self.override_commands, dict)
                and self.override_commands != dict()):
            # Command type must be {pre,build,post}
            for block in self.override_commands[command_type]:
                cmd = list()
                for c in self.override_commands[command_type][block]:
                    cmd.append(shlex.quote(c))
                try:
                    subprocess.run(cmd, check=True)
                    logger.info('override command executed correctly')
                except subprocess.CalledProcessError as e:
                    # Print but do not abort the program.
                    logger.warning(e)
                    logger.info('error executing override command')


class Cache:
    def __init__(self):
        r"""Create the Cache structure."""
        self.path: str = ''

        # cache[base_path] = [tag_0, tag_1, ..., tag_n]
        self.cache: dict = dict()

    def _init(self):
        platformdir: platformdirs.AppDirs = platformdirs.AppDirs(
            'build_python_packages')
        platformdir.user_cache_path.mkdir(mode=0o700,
                                          exist_ok=True,
                                          parents=True)
        self.path = pathlib.Path(platformdir.user_cache_dir, 'cache.yml')

    def _read(self):
        self.cache = _read_yaml_file(self.path)
        logger.info('cache read')

    def _write(self):
        _write_yaml_file(self.cache, self.path)
        logger.info('cache written')

    def _update(self, package_path: str, tag: str):
        if tag != '':
            if package_path not in self.cache:
                self.cache[package_path]: list = list()
            self.cache[package_path].append(tag)
            logger.info('cache updated')
        else:
            logger.info('cache not updated because of tagless repository')


class Executables:
    def __init__(self,
                 git: str = 'git',
                 python: str = 'python3',
                 twine: str = 'twine',
                 rm: str = 'rm'):
        r"""Save the paths of all necessary executables."""
        self.git: str = shlex.quote(git)
        self.python: str = shlex.quote(python)
        self.twine: str = shlex.quote(twine)
        self.rm: str = shlex.quote(rm)


class PypiCredentials:
    def __init__(self, url: str, user: str, password: str):
        r"""Save the PyPI credentials to be used for a mirror."""
        self.url = shlex.quote(url)
        self.user = user
        self.password = password


class GitRepository:
    def __init__(self, path: str, executables: Executables):
        r"""Initialize a generic empty GIT repository."""
        self.path: str = shlex.quote(path)
        self.executables: Executables = executables

    def _get_tags(self) -> list:
        s = subprocess.run([self.executables.git, '-C', self.path, 'tag'],
                           check=True,
                           capture_output=True)
        logger.info('obtained git tags')
        return s.stdout.decode('UTF-8').rstrip().split('\n')

    def _remove_untracked_files(self):
        fpyutils.shell.execute_command_live_output(self.executables.git +
                                                   ' -C ' + self.path +
                                                   ' checkout --force --')
        fpyutils.shell.execute_command_live_output(self.executables.git +
                                                   ' -C ' + self.path +
                                                   ' clean -d -x --force')

    def _get_last_ref_timestamp(self) -> str:
        return subprocess.run(
            [
                self.executables.git, '-C', self.path, 'log', '-1',
                '--pretty=%ct'
            ],
            check=True,
            capture_output=True).stdout.decode('UTF-8').strip()

    def _tag_checkout(self, tag: str):
        # Checkout repository with tags: avoid checking out tagless repositories.
        if tag != '':
            fpyutils.shell.execute_command_live_output(
                shlex.quote(self.executables.git) + ' -C ' + self.path +
                ' checkout ' + tag)


class Dist:
    def __init__(self, path: str, executables: Executables):
        r"""Initialize a Dist which is a subset of a package."""
        self.path = shlex.quote(path)
        self.executables = executables

    def _build(self, git_repository_timestamp: str):
        r"""Build the Python package in a reproducible way.

        Remove all dev, pre-releases, etc information from the package name.
        Use a static timestamp.
        See https://github.com/pypa/build/issues/328#issuecomment-877028239
        """
        git_repository_timestamp = shlex.quote(git_repository_timestamp)
        subprocess.run([
            self.executables.python, '-m', 'build', '--sdist', '--wheel',
            '-C--build-option=egg_info', '-C--build-option=--no-date',
            '-C--build-option=--tag-build=', self.path
        ],
                       check=True,
                       env=dict(os.environ,
                                SOURCE_DATE_EPOCH=git_repository_timestamp))

    def _upload(self, pypi_credentials: PypiCredentials):
        r"""Push the compiled package to a remote PyPI server."""
        subprocess.run([
            self.executables.twine, 'upload', '--repository-url',
            pypi_credentials.url, '--non-interactive', '--skip-existing',
            str(pathlib.Path(self.path, 'dist/*'))
        ],
                       check=True,
                       env=dict(os.environ,
                                TWINE_PASSWORD=pypi_credentials.password,
                                TWINE_USERNAME=pypi_credentials.user))


class Package:
    def __init__(self, path: str, tag: str, repo: GitRepository,
                 submodule_configuration: SubmoduleConfiguration, cache: Cache,
                 executables: Executables):
        r"""Initialize a Package which is a subset of a worker."""
        self.path: str = shlex.quote(path)

        # Do not quote tag: str() == '', shlex.quote(str()) == "''"
        self.tag = tag

        self.repo = repo
        self.submodule_configuration: SubmoduleConfiguration = submodule_configuration
        self.cache: Cache = cache
        self.executables = executables
        self.dist: Dist = Dist(self.path, self.executables)

    def _clean(self):
        fpyutils.shell.execute_command_live_output(
            self.executables.rm + ' -rf ' +
            fpyutils.path.add_trailing_slash(self.path) + 'build ' +
            fpyutils.path.add_trailing_slash(self.path) + 'dist')

    def _work(self, pypi_credentials: PypiCredentials) -> bool:
        # Returns True if build and upload are successful, False otherwise.
        update_cache: bool = False
        successfull: bool = False

        self.repo._remove_untracked_files()
        self._clean()

        # Do not checkout empty tags
        if self.tag != '':
            self.repo._tag_checkout(self.tag)

        try:
            self.submodule_configuration._execute_override_commands('pre')

            # Replace build command if necessary.
            if self.submodule_configuration.override_commands != dict():
                self.submodule_configuration._execute_override_commands(
                    'build')
            else:
                self.dist._build(self.repo._get_last_ref_timestamp())

            # Post
            self.submodule_configuration._execute_override_commands('post')

            self.dist._upload(pypi_credentials)

            if self.tag != '':
                update_cache = True
            successfull = True
            logger.info('package built successfully')
        except subprocess.CalledProcessError:
            logger.info('error building package')
            if (self.submodule_configuration.
                    mark_failed_build_or_upload_successfull
                    and self.tag != ''):
                successfull = True
                update_cache = True

        if update_cache:
            self.cache._update(pathlib.Path(self.path).stem, self.tag)

        self.repo._remove_untracked_files()

        return successfull


class RepositoryWorker:
    def __init__(self, path: str,
                 submodule_configuration: SubmoduleConfiguration,
                 executables: Executables):
        r"""Initialize a worker which corresponds to a GIT submodule."""
        # Working directory full path of Python code.
        self.path = shlex.quote(path)
        self.submodule_configuration: SubmoduleConfiguration = submodule_configuration
        self.executables: Executables = executables
        self.successfull_tags: int = 0
        self.total_tags: int = 0

    def _work(self, cache: Cache, pypi_credentials: PypiCredentials):
        repo: GitRepository = GitRepository(self.path, self.executables)
        tags: list = repo._get_tags()
        self.total_tags: int = len(tags)
        for i, tag in enumerate(tags):
            logger.info('processing git tag ' + str(i + 1) + ' of ' +
                        str(len(tags)))
            if self.submodule_configuration.skip_repository:
                logger.info('git tag ' + str(i + 1) + ' is skipped')
                if self.submodule_configuration.mark_skip_repository_successfull:
                    self.successfull_tags += 1
                    logger.info('marking skipped git tag ' + str(i + 1) +
                                ' as successful')
                    cache._update(pathlib.Path(self.path).stem, tag)
            elif (pathlib.Path(self.path).stem in cache.cache
                  and tag in cache.cache[pathlib.Path(self.path).stem]):
                self.successfull_tags += 1
                logger.info('git tag ' + str(i + 1) + ' already in cache')
            else:
                p = Package(
                    path=self.path,
                    tag=tag,
                    repo=repo,
                    submodule_configuration=self.submodule_configuration,
                    cache=cache,
                    executables=self.executables)
                self.successfull_tags += int(p._work(pypi_credentials))


class GitParentRepository(GitRepository):
    def __init__(self, path: str, remote: str, checkout_branch: str,
                 local_sumodules_configuration: dict, cache: Cache,
                 executables: Executables):
        r"""Initialize the main repository."""
        super().__init__(path, executables)

        # usually set to 'origin'
        self.remote: str = shlex.quote(remote)
        self.checkout_branch: str = shlex.quote(checkout_branch)

        # self.configuration = SubmoduleConfiguration.all()
        self.submodules_configuration: dict = dict()
        self.submodules: list = list()
        self.local_sumodules_configuration = local_sumodules_configuration
        self.cache: Cache = cache
        self.total_successfull_tags: int = 0
        self.total_tags: int = 0

    def _get_updates(self):
        logger.info(
            'pulling parent repository changes, this might take a while')
        fpyutils.shell.execute_command_live_output(self.executables.git +
                                                   ' -C ' + self.path +
                                                   ' pull ' + self.remote +
                                                   ' ' + self.checkout_branch)
        fpyutils.shell.execute_command_live_output(
            self.executables.git + ' -C ' + self.path +
            ' submodule foreach --recursive git reset --hard')
        fpyutils.shell.execute_command_live_output(self.executables.git +
                                                   ' -C ' + self.path +
                                                   ' submodule sync')

        # We might need to add the '--recursive' option for 'git submodule update'
        # to build certain packages. This means that we still depend from external
        # services at build time if we use that option.
        fpyutils.shell.execute_command_live_output(
            self.executables.git + ' -C ' + self.path +
            ' submodule update --init --remote')
        fpyutils.shell.execute_command_live_output(
            self.executables.git + ' -C ' + self.path +
            ' submodule foreach git fetch --tags --force')
        logger.info('parent repository changes pulled')

    def _append_local_submodules_configuration(self):
        self.submodules_configuration[
            'local'] = self.local_sumodules_configuration

    # Read the 'configuration.yaml' file in the repository
    def _read_submodules_configuration(self):
        remote_configuration_file = pathlib.Path(self.path,
                                                 'configuration.yaml')
        if remote_configuration_file.is_file():
            self.submodules_configuration = yaml.load(
                open(remote_configuration_file), Loader=yaml.SafeLoader)
            self.submodules_configuration[
                'remote'] = self.submodules_configuration['submodules']
            del self.submodules_configuration['submodules']
            logger.info(
                'parent repository submodules configuration was read correctly'
            )
        else:
            logger.info('no repository submodules configuration present')

        self._append_local_submodules_configuration()
        logger.info('all submodules configuration was read')

    def _get_submodules(self):
        self.submodules = [
            x for x in pathlib.Path(self.path, 'submodules').iterdir()
        ]
        logger.info('got submodules directories list')

    def _call_worker(self, pypi_credentials: PypiCredentials):
        self._get_submodules()

        signal.signal(signal.SIGINT,
                      lambda signal, frame: self._signal_handler())
        signal.signal(signal.SIGTERM,
                      lambda signal, frame: self._signal_handler())

        for i in range(0, len(self.submodules)):
            logger.info('remaining ' + str(len(self.submodules) - i + 1) +
                        ' submodules')
            d: pathlib.Path = self.submodules[i]
            dirname: str = pathlib.Path(self.path, d).stem
            skip_repository: bool = False
            relative_base_directory_ovr: str = ''
            override_commands: dict = dict()
            ref_checkout: list = list()
            if dirname in self.submodules_configuration['local'][
                    'skip_repository']:
                skip_repository = True
            if dirname in self.submodules_configuration['remote']:
                relative_base_directory_ovr = self.submodules_configuration[
                    'remote'][dirname]['base_directory_override']
                override_commands = self.submodules_configuration['remote'][
                    dirname]['override_commands']
                ref_checkout = self.submodules_configuration['remote'][
                    dirname]['ref_checkout']

            submodule_cfg = SubmoduleConfiguration(
                path=d.stem,
                skip_repository=skip_repository,
                mark_skip_repository_successfull=self.submodules_configuration[
                    'local']['mark_skip_repository_successfull'],
                mark_failed_build_or_upload_successfull=self.
                submodules_configuration['local']
                ['mark_failed_build_or_upload_successfull'],
                relative_base_directory_override=relative_base_directory_ovr,
                override_commands=override_commands,
                ref_checkout=ref_checkout)
            worker = RepositoryWorker(path=str(d),
                                      submodule_configuration=submodule_cfg,
                                      executables=self.executables)
            worker._work(cache=self.cache, pypi_credentials=pypi_credentials)
            self.total_successfull_tags += worker.successfull_tags
            self.total_tags += worker.total_tags

    def _signal_handler(self):
        logger.info('signal received. finished queued workers and writing ' +
                    str(len(self.cache.cache)) +
                    ' repository elements to cache before exit')
        self.cache._write()
        sys.exit(1)

    def _stats(self):
        logger.info('total successfull tags: ' +
                    str(self.total_successfull_tags))
        logger.info('total tags: ' + str(self.total_tags))


class Notify:
    def __init__(self, gotify: dict, email: dict):
        r"""Save data for the notifications."""
        self.message: str = ''
        self.gotify: dict = gotify
        self.email: dict = email

    def _get_message(self, parent_repo: GitParentRepository):
        self.message = ''.join([
            'total tags: ',
            str(parent_repo.total_tags), '\n', 'total successfull tags: ',
            str(parent_repo.total_successfull_tags), '\n',
            'tag successfull ratio: ',
            str(parent_repo.total_successfull_tags / parent_repo.total_tags)
        ])

    def _send(self):
        m = self.gotify['message'] + '\n' + self.message
        if self.gotify['enabled']:
            fpyutils.notify.send_gotify_message(self.gotify['url'],
                                                self.gotify['token'], m,
                                                self.gotify['title'],
                                                self.gotify['priority'])
        if self.email['enabled']:
            fpyutils.notify.send_email(
                self.message, self.email['smtp_server'], self.email['port'],
                self.email['sender'], self.email['user'],
                self.email['password'], self.email['receiver'],
                self.email['subject'])


logger: logging.Logger = _setup_logging()


def main():
    config = _read_yaml_file(shlex.quote(sys.argv[1]))
    if config == dict():
        raise ValueError

    cache = Cache()
    cache._init()
    cache._read()

    execs = Executables(git=config['executables']['git'],
                        python=config['executables']['python'],
                        twine=config['executables']['twine'],
                        rm=config['executables']['rm'])
    pypi = PypiCredentials(config['pypi']['url'], config['pypi']['user'],
                           config['pypi']['password'])
    parent_repo = GitParentRepository(config['repository']['path'],
                                      config['repository']['remote'],
                                      config['repository']['checkout_branch'],
                                      config['submodules'], cache, execs)

    parent_repo._read_submodules_configuration()
    parent_repo._get_updates()
    parent_repo._call_worker(pypi)
    cache._write()

    n = Notify(config['notify']['gotify'], config['notify']['email'])
    n._get_message(parent_repo)
    n._send()


if __name__ == '__main__':
    main()
add the
configuration
#
# build_python_packages.yaml
#
# Copyright (C) 2021-2022 Franco Masotti (franco \D\o\T masotti {-A-T-} tutanota \D\o\T com)
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.

notify:
  email:
    enabled: false
    smtp_server: 'smtp.gmail.com'
    port: 465
    sender: 'myusername@gmail.com'
    user: 'myusername'
    password: 'my awesome password'
    receiver: 'myusername@gmail.com'
    subject: 'update action'
  gotify:
    enabled: false
    url: '<gotify url>'
    token: '<app token>'
    title: 'update action'
    message: 'update action'
    priority: 5

repository:
  path: '/home/jobs/scripts/by-user/pypi-source-packages-updater/python-packages-source'
  remote: 'origin'
  checkout_branch: 'dev'

submodules:
  mark_skip_repository_successfull: true
  mark_failed_build_or_upload_successfull: true
  # Directory names.
  skip_repository: []

executables:
  git: 'git'
  python: 'python3'
  rm: 'rm'
  twine: 'twine'

pypi:
  url: '<PyPI URL>'
  user: '<PyPI username>'
  password: '<PyPI password>'
add the
helper script
. update_and_build_python_packages.sh clones and updates the python-packages-source [14] repository and compiles all the packages

#!/usr/bin/env bash

REPOSITORY='https://software.franco.net.eu.org/frnmst/python-packages-source.git'

pushd /home/jobs/scripts/by-user/python-source-packages-updater

export PATH=$PATH:/home/python-source-packages-updater/.local/bin/

git clone "${REPOSITORY}"
pushd python-packages-source

git checkout dev
git pull

# Always commit and push to dev only.
[ "$(git branch --show-current)" = 'dev' ] || exit 1

# Update all submodules and the stats.
make install-dev
make submodules-update
make submodules-add-gitea
make stats
git add -A
git commit -m "Submodule updates."
git push
popd

# Compile the packages.
./build_python_packages.py ./build_python_packages.yaml

# Cleanup.
rm -rf python-packages-source

popd
add the
Systemd service file
[Unit]
Description=Build python packages

[Service]
Type=simple
ExecStart=/home/jobs/scripts/by-user/python-source-packages-updater/update_and_build_python_packages.sh
User=python-source-packages-updater
Group=python-source-packages-updater
StandardOutput=null
StandardError=null
add the
Systemd timer unit file
[Unit]
Description=Once every week build python packages

[Timer]
OnCalendar=Weekly
Persistent=true

[Install]
WantedBy=timers.target
fix the permissions
chown -R python-source-packages-updater:python-source-packages-updater /home/jobs/{scripts,services}/by-user/python-source-packages-updater
chmod 700 -R /home/jobs/{scripts,services}/by-user/python-source-packages-updater
run the deploy script
To be able to compile most packages you need to manually compile at least these basic ones and push them to your local PyPI server.
setuptools
setuptools_scm
wheel
You can clone the python-packages-source [14] repository, then compile and upload these basic packages.
sudo -i -u python-source-package-updater
git clone https://software.franco.net.eu.org/frnmst/python-packages-source.git
cd python-packages-source/setuptools
python3 -m build --sdist --wheel
twine upload --repository-url ${your_pypi_index_url} dist/*
exit
Important

Some packages might need different dependencies. Have a look at the
setup_requires
variable in
setup.py
or in
setup.cfg
, or at
requires
in the
pyproject.toml
file. If you cannot compile some, download them directly from pypi.python.org.
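For the setup.cfg case the declared build dependencies can be inspected with the standard library alone, since the file is INI-formatted. A sketch with an illustrative helper name (pyproject.toml would need a TOML parser instead):

```python
import configparser


def setup_requires(setup_cfg_text: str) -> list:
    """Return the entries of [options] setup_requires from a setup.cfg string."""
    cp = configparser.ConfigParser()
    cp.read_string(setup_cfg_text)
    raw = cp.get('options', 'setup_requires', fallback='')
    # Multi-line INI values keep one requirement per line.
    return [line.strip() for line in raw.splitlines() if line.strip()]


sample = """
[options]
setup_requires =
    setuptools_scm
    wheel
"""
print(setup_requires(sample))  # ['setuptools_scm', 'wheel']
```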
Updating the package graph#
When you run the helper script you can update the stats graph automatically by using a GIT commit hook. The following script generates the graph and copies it to a webserver directory.
connect via SSH to the GIT remote machine and install the dependencies
sudo -i
apt-get install git python3 make pipenv
exit
connect to the git deploy user
sudo -i -u git-deploy
configure your remote: add this to the post-receive hooks
#!/usr/bin/bash -l

IMAGE=""$(echo -n 'frnmst/python-packages-source' | sha1sum | awk '{print $1 }')"_graph0.png"
DOMAIN='assets.franco.net.eu.org'
TMP_GIT_CLONE=""${HOME}"/tmp/python-packages-source"
PUBLIC_WWW="/var/www/${DOMAIN}/image/${IMAGE}"

git clone "${GIT_DIR}" "${TMP_GIT_CLONE}"
pushd "${TMP_GIT_CLONE}"
make install
make plot OUTPUT="${PUBLIC_WWW}"
chmod 770 "${PUBLIC_WWW}"
popd
rm --recursive --force "${TMP_GIT_CLONE}"
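The IMAGE variable in the hook derives the file name from the SHA-1 of the repository slug. The same naming can be reproduced in Python, e.g. to locate the generated graph from another script; the function name is illustrative:

```python
import hashlib


def graph_image_name(repo_slug: str, index: int = 0) -> str:
    """Mirror the hook's naming: sha1(<slug>) followed by '_graph<index>.png'."""
    digest = hashlib.sha1(repo_slug.encode('utf-8')).hexdigest()
    return '{}_graph{}.png'.format(digest, index)


name = graph_image_name('frnmst/python-packages-source')
print(name)
```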
Using the PyPI server#
change the PyPI index of your programs. See for example https://software.franco.net.eu.org/frnmst/python-packages-source#client-configuration
Footnotes