mirror of https://github.com/flatcar/scripts.git
synced 2025-09-30 10:01:32 +02:00

Merge pull request #2342 from flatcar/krnowak/pkg-auto

pkg-auto: Add package automation scripts

commit f76f6db755
2
pkg_auto/Makefile
Normal file
@@ -0,0 +1,2 @@
shellcheck:
	docker run --rm -v "$$PWD:/mnt" koalaman/shellcheck:latest --norc --shell=bash --source-path=SCRIPTDIR --source-path=SCRIPTDIR/impl --external-sources --check-sourced *.sh impl/*.sh
58
pkg_auto/README.md
Normal file
@@ -0,0 +1,58 @@
Scripts for package update automation
=====================================

A quick start from a blank state:

- clone the scripts repo and create worktrees for the old state and the new state (and optionally for the branch with the package automation scripts, if they are not part of the `main` branch yet):
  - `git clone https://github.com/flatcar/scripts.git scripts/main`
  - `cd scripts/main`
  - `PKG_AUTO="${PWD}/pkg_auto"`
  - `git worktree add --branch weekly-updates ../weekly-updates origin/buildbot/weekly-portage-stable-package-updates-2024-09-23`
- prepare for generating reports (create a directory, download the necessary files, create a config):
  - `mkdir ../../weekly-updates`
  - `cd ../../weekly-updates`
  - `"${PKG_AUTO}/download_sdk_and_listings.sh" -s ../../scripts/main -x aux-cleanups aux`
    - call `"${PKG_AUTO}/download_sdk_and_listings.sh" -h` to get help
  - `"${PKG_AUTO}/generate_config.sh" -a aux -n weekly-updates -o main -r reports -s ../../scripts/main -x file,wd-cleanups config`
    - call `"${PKG_AUTO}/generate_config.sh" -h` to get help
- generate the reports:
  - `"${PKG_AUTO}/generate_reports.sh" -w wd config`
  - if the command above fails, the `reports` directory (see the `-r reports` flag in the call to `generate_config.sh` above) will contain reports that may hint at why the failure happened
  - the `reports` directory may also contain files like `warnings` or `manual-work-needed`
    - the items in the `warnings` file should be addressed and the report generation rerun, see below
    - the items in `manual-work-needed` are things to be done while processing the updates
  - to rerun the report generation, the leftovers of the previous run should be removed beforehand:
    - `source wd-cleanups`
    - `rm -rf reports`
    - `"${PKG_AUTO}/generate_reports.sh" -w wd config`
- if generating the reports succeeded, process the updates and update the PR with the changelogs and update summaries:
  - this is the manual part, described below
- after everything is done (e.g. the PR got merged), things need cleaning up:
  - `source wd-cleanups`
  - `rm -rf reports`
  - `source aux-cleanups`

Processing the updates (the manual part)
========================================

The generated reports directory contains an `updates` directory. Within it are two files, `summary_stubs` and `changelog_stubs`; the rest of the entries are the categories and packages that were updated. The first file, `summary_stubs`, lists the packages that have changed, with TODO items associated with each package. It is mostly meant to be pasted into the pull request description as an aid for the reviewers. The latter, `changelog_stubs`, can serve as a base for the changelog entry that should be added to the `scripts` repo.

For each package in `summary_stubs` there are TODO items, which are basically of four kinds:

- review the changes in the ebuild
- review the changes outside the ebuild (metadata, patch files)
- review the occurrences of the package name in the scripts repository
- add a link to the release notes in case of a package update

It is possible that none of the changes in a package are relevant to Flatcar (for instance, when a package only got stabilized for hppa); in that case the package can simply be dropped from `summary_stubs`. Note that the package update as such is still relevant, so entries for actual updates should stay in the file.

The entries in `changelog_stubs` should be reviewed for relevancy (minor SDK-only packages should likely be dropped, as they are seldom of interest to end users), and the remaining entries should be updated with proper links to release notes.

There may also be entries in `manual-work-needed` that need addressing. Most often the reason is that a new package was added, or an existing package stopped being pulled in. This would need an entry added to `summary_stubs`.

Another thing to do is to check [the security reports](https://github.com/flatcar/Flatcar/issues?q=is%3Aopen+is%3Aissue+label%3Aadvisory). If an updated package brings a fix for a security issue, it should be mentioned in the summary, and an entry should be added to a separate security changelog.

Other scripts
=============

There are other scripts in this directory. `inside_sdk_container.sh` is executed by `generate_reports.sh` inside the SDK to collect the package information. `sync_packages.sh` updates the packages and saves the result to a new branch. `update_packages.sh` is `sync_packages.sh` + `generate_reports.sh`.
47
pkg_auto/config_template
Normal file
@@ -0,0 +1,47 @@
# Path to the toplevel directory of the scripts repo.
scripts: ..

# Path to the directory with auxiliary files.
aux: ../../downloads

# Path to the directory where update reports will be stored.
reports: ../../reports

# Base scripts branch for old state.
#
# Old state serves as a state before the updates.
#
# Optional, defaults to origin/main.
#old-base: origin/main

# Base scripts branch for new state.
#
# New state serves as a state that's either after the updates or to-be-updated.
#
# Optional, defaults to the old-base value.
#new-base: origin/main

# Cleanups mode with a semicolon-separated list of extra arguments.
#
# Can be:
#
# file,${path_to_file}
#   stores cleanup steps in the file, which can later be sourced to perform the cleanups
#
# trap
#   executes the cleanups on script exit
#
# ignore
#   performs no cleanups at all
#
# Optional, defaults to ignore.
#cleanups: file,../../x-cleanups
#cleanups: trap
#cleanups: ignore

# Override the image name to use for an SDK container.
#
# Optional, defaults to
# ghcr.io/flatcar/flatcar-sdk-all:${last_nightly_version_id_in_base}-${last_nightly_build_id_in_base}
#amd64-sdk-img:
#arm64-sdk-img:
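A filled-in config matching the `generate_config.sh` invocation from the README quick start might look like this (paths and branch names are illustrative):

```
scripts: ../../scripts/main
aux: aux
reports: reports
old-base: main
new-base: weekly-updates
cleanups: file,wd-cleanups
```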
265
pkg_auto/download_sdk_and_listings.sh
Executable file
@@ -0,0 +1,265 @@
#!/bin/bash

##
## Downloads package SDKs from bincache and loads them with
## docker. Downloads package listings from bincache. Version can be
## taken either from the latest nightly tag in the passed scripts
## directory (with the -s option) or from specified version ID and
## build ID (with the -v and -b options). The results are written to
## the passed downloads directory.
##
## Parameters:
## -b <ID>: build ID, conflicts with -s
## -h: this help
## -s <DIR>: scripts repo directory, conflicts with -v and -b
## -v <ID>: version ID, conflicts with -s
## -nd: skip downloading of docker images
## -p: download packages SDK images instead of the standard one (valid
##     only when downloading docker images)
## -nl: skip downloading of listings
## -x <FILE>: cleanup file
##
## Positional:
## 1: downloads directory
##

set -euo pipefail

source "$(dirname "${BASH_SOURCE[0]}")/impl/util.sh"
source "${PKG_AUTO_IMPL_DIR}/cleanups.sh"

CLEANUP_FILE=
SCRIPTS=
VERSION_ID=
BUILD_ID=
SKIP_DOCKER=
SKIP_LISTINGS=
PKGS_SDK=

while [[ ${#} -gt 0 ]]; do
    case ${1} in
        -b)
            if [[ -n ${SCRIPTS} ]]; then
                fail '-b cannot be used at the same time with -s'
            fi
            if [[ -z ${2:-} ]]; then
                fail 'missing value for -b'
            fi
            BUILD_ID=${2}
            shift 2
            ;;
        -h)
            print_help
            exit 0
            ;;
        -p)
            PKGS_SDK=x
            shift
            ;;
        -s)
            if [[ -n ${VERSION_ID} ]] || [[ -n ${BUILD_ID} ]]; then
                fail '-s cannot be used at the same time with -v or -b'
            fi
            if [[ -z ${2:-} ]]; then
                fail 'missing value for -s'
            fi
            SCRIPTS=${2}
            shift 2
            ;;
        -v)
            if [[ -n ${SCRIPTS} ]]; then
                fail '-v cannot be used at the same time with -s'
            fi
            if [[ -z ${2:-} ]]; then
                fail 'missing value for -v'
            fi
            VERSION_ID=${2}
            shift 2
            ;;
        -x)
            if [[ -z ${2:-} ]]; then
                fail 'missing value for -x'
            fi
            CLEANUP_FILE=${2}
            shift 2
            ;;
        -nd)
            SKIP_DOCKER=x
            shift
            ;;
        -nl)
            SKIP_LISTINGS=x
            shift
            ;;
        --)
            shift
            break
            ;;
        -*)
            fail "unknown flag '${1}'"
            ;;
        *)
            break
            ;;
    esac
done

if [[ ${#} -ne 1 ]]; then
    fail 'Expected one positional parameter: a downloads directory'
fi

DOWNLOADS_DIR=$(realpath "${1}"); shift

if [[ -z ${SCRIPTS} ]] && [[ -z ${VERSION_ID} ]]; then
    fail 'need to pass either -s or -v (the latter with the optional -b too)'
fi

if [[ -n ${CLEANUP_FILE} ]]; then
    dirname_out "${CLEANUP_FILE}" cleanup_dir
    # shellcheck disable=SC2154 # cleanup_dir is assigned in dirname_out
    mkdir -p "${cleanup_dir}"
    unset cleanup_dir
    setup_cleanups file "${CLEANUP_FILE}"
else
    setup_cleanups ignore
fi

if [[ ! -d "${DOWNLOADS_DIR}" ]]; then
    add_cleanup "rmdir ${DOWNLOADS_DIR@Q}"
    mkdir "${DOWNLOADS_DIR}"
fi

function download() {
    local url output
    url="${1}"; shift
    output="${1}"; shift

    info "Downloading ${url}"
    curl \
        --fail \
        --show-error \
        --location \
        --retry-delay 1 \
        --retry 60 \
        --retry-connrefused \
        --retry-max-time 60 \
        --connect-timeout 20 \
        "${url}" >"${output}"
}

if [[ -n ${SCRIPTS} ]]; then
    # shellcheck disable=SC1091 # sourcing generated file
    VERSION_ID=$(source "${SCRIPTS}/sdk_container/.repo/manifests/version.txt"; printf '%s' "${FLATCAR_VERSION_ID}")
    # shellcheck disable=SC1091 # sourcing generated file
    BUILD_ID=$(source "${SCRIPTS}/sdk_container/.repo/manifests/version.txt"; printf '%s' "${FLATCAR_BUILD_ID}")
fi

ver_plus="${VERSION_ID}${BUILD_ID:++}${BUILD_ID}"
ver_dash="${VERSION_ID}${BUILD_ID:+-}${BUILD_ID}"

exts=(zst bz2 gz)

# shellcheck disable=SC2034 # used indirectly as cmds_name and cmds
zst_cmds=(
    zstd
)

# shellcheck disable=SC2034 # used indirectly as cmds_name and cmds
bz2_cmds=(
    lbunzip2
    pbunzip2
    bunzip2
)

# shellcheck disable=SC2034 # used indirectly as cmds_name and cmds
gz_cmds=(
    unpigz
    gunzip
)

function download_sdk() {
    local image_name=${1}; shift
    local tarball_name=${1}; shift
    local url_dir=${1}; shift

    if docker images --format '{{.Repository}}:{{.Tag}}' | grep -q -x -F "${image_name}"; then
        return 0
    fi

    info "No ${image_name} available in docker, pulling it from bincache"
    local ext full_tarball_name tb
    for ext in "${exts[@]}"; do
        full_tarball_name="${tarball_name}.tar.${ext}"
        tb="${DOWNLOADS_DIR}/${full_tarball_name}"
        if [[ -s ${tb} ]]; then
            break
        else
            add_cleanup "rm -f ${tb@Q}"
            if download "${url_dir}/${full_tarball_name}" "${tb}"; then
                break
            fi
        fi
    done
    info "Loading ${image_name} into docker"
    cmds_name="${ext}_cmds"
    if ! declare -p "${cmds_name}" >/dev/null 2>/dev/null; then
        fail "Failed to extract ${tb@Q} - no tools to extract ${ext@Q} files"
    fi
    declare -n cmds="${ext}_cmds"
    loaded=
    for cmd in "${cmds[@]}"; do
        if ! command -v "${cmd}" >/dev/null; then
            info "${cmd@Q} is not available"
            continue
        fi
        info "Using ${cmd@Q} to extract the tarball"
        "${cmd}" -d -c "${tb}" | docker load
        add_cleanup "docker rmi ${image_name@Q}"
        loaded=x
        break
    done
    if [[ -z ${loaded} ]]; then
        fail "Failed to extract ${tb@Q} - no known available tool to extract it"
    fi
    unset -n cmds
}

URL_DIR="https://bincache.flatcar-linux.net/containers/${ver_dash}"

if [[ -z ${SKIP_DOCKER} ]] && [[ -z ${PKGS_SDK} ]]; then
    download_sdk "ghcr.io/flatcar/flatcar-sdk-all:${ver_dash}" "flatcar-sdk-all-${ver_dash}" "${URL_DIR}"
fi

declare -a dsal_arches
get_valid_arches dsal_arches

for arch in "${dsal_arches[@]}"; do
    if [[ -z ${SKIP_DOCKER} ]] && [[ -n ${PKGS_SDK} ]]; then
        # Note: download_sdk appends the .tar.${ext} suffix itself, so
        # only the tarball base name is passed here.
        download_sdk "flatcar-packages-${arch}:${ver_dash}" "flatcar-packages-${arch}-${ver_dash}" "${URL_DIR}"
    fi

    if [[ -z ${SKIP_LISTINGS} ]]; then
        listing_dir="${DOWNLOADS_DIR}/${arch}"
        add_cleanup "rmdir ${listing_dir@Q}"
        mkdir "${listing_dir}"
        base_url="https://bincache.flatcar-linux.net/images/${arch}/${ver_plus}"

        for infix in '' 'rootfs-included-sysexts'; do
            index_html="${listing_dir}/${infix}${infix:+-}index.html"
            url="${base_url}${infix:+/}${infix}"
            add_cleanup "rm -f ${index_html@Q}"
            download "${url}/" "${index_html}"

            # get the names of all files ending with _packages.txt
            mapfile -t listing_files < <(grep -F '_packages.txt"' "${index_html}" | sed -e 's#.*"\(\./\)\?\([^"]*\)".*#\2#')

            for listing in "${listing_files[@]}"; do
                info "Downloading ${listing} for ${arch}"
                listing_path="${listing_dir}/${listing}"
                add_cleanup "rm -f ${listing_path@Q}"
                download "${url}/${listing}" "${listing_path}"
            done
        done
    fi
done
info 'Done'
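The listing-name extraction used by the script above can be exercised in isolation. This standalone sketch runs the same grep/sed pipeline against a made-up fragment of a bincache index page (the HTML sample is illustrative, not a real listing):

```shell
#!/bin/bash
# Standalone sketch of the listing-name extraction from
# download_sdk_and_listings.sh; the index HTML below is a made-up sample.
set -euo pipefail

index_html='<a href="./flatcar_production_image_packages.txt">image</a>
<a href="./flatcar_production_image_contents.txt">contents</a>
<a href="flatcar_developer_container_packages.txt">devcontainer</a>'

# Keep only lines referencing *_packages.txt files and strip the
# surrounding markup, including an optional leading "./".
mapfile -t listing_files < <(printf '%s\n' "${index_html}" \
    | grep -F '_packages.txt"' \
    | sed -e 's#.*"\(\./\)\?\([^"]*\)".*#\2#')

printf '%s\n' "${listing_files[@]}"
```

Only the two `_packages.txt` entries survive, with the `./` prefix removed.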
170
pkg_auto/generate_config.sh
Executable file
@@ -0,0 +1,170 @@
#!/bin/bash

set -euo pipefail

##
## Generate a config.
##
## Parameters:
## -a: aux directory
## -d: debug package - list many times
## -h: this help
## -i: SDK image override in form of ${arch}:${name}, the name part
##     should be a valid docker image with an optional tag
## -ip: add SDK image overrides using flatcar-packages images
## -n: new base
## -o: old base
## -r: reports directory
## -s: scripts directory
## -x: cleanup opts
##
## Positional:
## 1: path for config file
##

source "$(dirname "${BASH_SOURCE[0]}")/impl/util.sh"
source "${PKG_AUTO_IMPL_DIR}/cleanups.sh"

# shellcheck disable=SC2034 # used by name below
gc_aux_directory=''
# shellcheck disable=SC2034 # used by name below
gc_new_base=''
# shellcheck disable=SC2034 # used by name below
gc_old_base=''
# shellcheck disable=SC2034 # used by name below
gc_reports_directory=''
# shellcheck disable=SC2034 # used by name below
gc_scripts_directory=''
# shellcheck disable=SC2034 # used by name below
gc_cleanup_opts=''
# gc_${arch}_sdk_img are declared on demand
gc_debug_packages=()

declare -A opt_map
opt_map=(
    ['-a']=gc_aux_directory
    ['-n']=gc_new_base
    ['-o']=gc_old_base
    ['-r']=gc_reports_directory
    ['-s']=gc_scripts_directory
    ['-x']=gc_cleanup_opts
)

declare -a gc_arches
get_valid_arches gc_arches

while [[ ${#} -gt 0 ]]; do
    case ${1} in
        -d)
            if [[ -z ${2:-} ]]; then
                fail 'missing value for -d'
            fi
            gc_debug_packages+=( "${2}" )
            shift 2
            ;;
        -h)
            print_help
            exit 0
            ;;
        -i)
            if [[ -z ${2:-} ]]; then
                fail 'missing value for -i'
            fi
            arch=${2%%:*}
            image_name=${2#*:}
            var_name="gc_${arch}_sdk_img"
            unset arch
            # shellcheck disable=SC2178 # shellcheck does not grok refs
            declare -n ref="${var_name}"
            unset var_name
            # shellcheck disable=SC2178 # shellcheck does not grok refs
            ref=${image_name}
            unset image_name
            unset -n ref
            shift 2
            ;;
        -ip)
            for arch in "${gc_arches[@]}"; do
                var_name="gc_${arch}_sdk_img"
                # shellcheck disable=SC2178 # shellcheck does not grok refs
                declare -n ref="${var_name}"
                unset var_name
                # shellcheck disable=SC2178 # shellcheck does not grok refs
                ref="flatcar-packages-${arch}"
                unset -n ref
            done
            unset arch
            shift
            ;;
        --)
            shift
            break
            ;;
        -*)
            var_name=${opt_map["${1}"]:-}
            if [[ -z ${var_name} ]]; then
                fail "unknown flag '${1}'"
            fi
            if [[ -z ${2:-} ]]; then
                fail "missing value for ${1}"
            fi
            # shellcheck disable=SC2178 # shellcheck does not grok refs
            declare -n ref="${var_name}"
            # shellcheck disable=SC2178 # shellcheck does not grok refs
            ref=${2}
            unset -n ref
            unset var_name
            shift 2
            ;;
        *)
            break
            ;;
    esac
done

join_by gc_debug_packages_csv ',' "${gc_debug_packages[@]}"

declare -a pairs
pairs=(
    'scripts' gc_scripts_directory
    'aux' gc_aux_directory
    'reports' gc_reports_directory
    'old-base' gc_old_base
    'new-base' gc_new_base
    'cleanups' gc_cleanup_opts
)

for arch in "${gc_arches[@]}"; do
    pairs+=( "${arch}-sdk-img" "gc_${arch}_sdk_img" )
done

pairs+=( 'debug-packages' gc_debug_packages_csv )

if [[ ${#} -ne 1 ]]; then
    fail 'expected one positional parameter: a path for the config'
fi

config=${1}; shift

{
    opt_idx=0
    name_idx=1
    while [[ ${name_idx} -lt "${#pairs[@]}" ]]; do
        opt=${pairs["${opt_idx}"]}
        name=${pairs["${name_idx}"]}
        opt_idx=$((opt_idx + 2))
        name_idx=$((name_idx + 2))
        # shellcheck disable=SC2178 # shellcheck does not grok refs
        declare -n ref="${name}"
        if [[ -n ${ref:-} ]]; then
            printf '%s: %s\n' "${opt}" "${ref}"
        fi
        unset -n ref
    done
    unset opt_idx name_idx
} >"${config}"

info 'Done, but note that the config is not guaranteed to be valid!'
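The generic `-*` branch above relies on bash name references: the flag is looked up in `opt_map`, and the resulting variable name is written through a nameref. A minimal standalone sketch of that pattern (flag and variable names are illustrative):

```shell
#!/bin/bash
# Sketch of the opt_map/nameref pattern used by generate_config.sh:
# look the flag up in an associative array, then write the named
# variable through a `declare -n` reference.
set -euo pipefail

gc_old_base=''
declare -A opt_map=( ['-o']=gc_old_base )

flag='-o'
value='origin/main'

var_name=${opt_map["${flag}"]:-}
# declare -n makes ref an alias for the variable named in var_name.
declare -n ref="${var_name}"
ref=${value}
unset -n ref

echo "${gc_old_base}"
```

After the assignment through `ref`, `gc_old_base` holds `origin/main`.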
56
pkg_auto/generate_reports.sh
Executable file
@@ -0,0 +1,56 @@
#!/bin/bash

set -euo pipefail

##
## Generates reports.
##
## Parameters:
## -w: path to use for workdir
## -h: this help
##
## Positional:
## 1: config file
##

source "$(dirname "${BASH_SOURCE[0]}")/impl/util.sh"
source "${PKG_AUTO_IMPL_DIR}/pkg_auto_lib.sh"

workdir=''

while [[ ${#} -gt 0 ]]; do
    case ${1} in
        -h)
            print_help
            exit 0
            ;;
        -w)
            if [[ -z ${2:-} ]]; then
                fail 'missing value for -w'
            fi
            workdir=${2}
            shift 2
            ;;
        --)
            shift
            break
            ;;
        -*)
            fail "unknown flag '${1}'"
            ;;
        *)
            break
            ;;
    esac
done

if [[ ${#} -ne 1 ]]; then
    fail 'expected one positional parameter: a config file'
fi

config_file=${1}; shift

setup_workdir_with_config "${workdir}" "${config_file}"
generate_package_update_reports
153
pkg_auto/impl/cleanups.sh
Normal file
@@ -0,0 +1,153 @@
#!/bin/bash

#
# Cleanups
#
# This is basically a command stack to be executed at some deferred
# time, so the last command added will be the first to be executed at
# cleanup time.
#
# Cleanups are implemented through two functions, setup_cleanups and
# add_cleanup, prefixed with _${type}_. So for type "foo" the
# functions would be _foo_setup_cleanups and _foo_add_cleanup.
#
# setup_cleanups may take some extra parameters that are specific to
# the type. For example, the file type takes a path where the commands
# will be stored.
#
# add_cleanup takes one or more commands to add to the cleanup stack.
#

if [[ -z ${__CLEANUPS_SH_INCLUDED__:-} ]]; then
__CLEANUPS_SH_INCLUDED__=x

source "$(dirname "${BASH_SOURCE[0]}")/util.sh"

# Sets up a cleanup stack of a given type. A type may need some extra
# parameter, which comes after a comma. Possible types are:
#
# - file: requires an extra argument with the cleanup file location;
#   an example could be "file,/path/to/cleanups-file"
# - trap: executed on shell exit
# - ignore: noop
#
# Params:
#
# 1 - type of cleanup
function setup_cleanups() {
    local kind
    kind=${1}; shift

    if [[ -n ${_cleanups_sh_cleanup_kind_:-} ]]; then
        fail "cannot set cleanups to '${kind}', they are already set up to '${_cleanups_sh_cleanup_kind_}'"
    fi

    declare -g _cleanups_sh_cleanup_kind_

    _ensure_valid_cleanups_sh_cleanup_kind_ "${kind}"
    _cleanups_sh_cleanup_kind_=${kind}
    _call_cleanup_func setup_cleanups "${@}"
}

# Adds commands to the cleanup stack.
#
# Params:
#
# @ - commands, one per parameter
function add_cleanup() {
    _call_cleanup_func add_cleanup "${@}"
}

#
# Implementation details.
#

# "file" cleanups

function _file_setup_cleanups() {
    if [[ ${#} -ne 1 ]]; then
        fail 'missing cleanup file location argument for file cleanups'
    fi

    declare -g _file_cleanup_file
    _file_cleanup_file=$(realpath "${1}"); shift
    add_cleanup "rm -f ${_file_cleanup_file@Q}"
}

function _file_add_cleanup() {
    local fac_cleanup_dir tmpfile
    dirname_out "${_file_cleanup_file}" fac_cleanup_dir

    tmpfile=$(mktemp -p "${fac_cleanup_dir}")
    printf '%s\n' "${@}" >"${tmpfile}"
    if [[ -f "${_file_cleanup_file}" ]]; then
        cat "${_file_cleanup_file}" >>"${tmpfile}"
    fi
    mv -f "${tmpfile}" "${_file_cleanup_file}"
}

# "trap" cleanups

function _trap_update_trap() {
    # shellcheck disable=SC2064 # using double quotes on purpose instead of single quotes
    trap "${_trap_cleanup_actions}" EXIT
}

function _trap_setup_cleanups() {
    declare -g _trap_cleanup_actions
    _trap_cleanup_actions=':'

    declare -g -A _trap_cleanup_snapshots
    _trap_cleanup_snapshots=()

    _trap_update_trap
}

function _trap_add_cleanup() {
    local tac_joined
    join_by tac_joined ' ; ' "${@/%/' || :'}"
    _trap_cleanup_actions="${tac_joined} ; ${_trap_cleanup_actions}"
    _trap_update_trap
}

# "ignore" cleanups

function _ignore_setup_cleanups() {
    :
}

function _ignore_add_cleanup() {
    :
}

function _ensure_valid_cleanups_sh_cleanup_kind_() {
    local kind
    kind=${1}; shift

    local -a functions=(
        setup_cleanups
        add_cleanup
    )

    local func
    for func in "${functions[@]/#/_${kind}_}"; do
        if ! declare -pF "${func}" >/dev/null 2>/dev/null; then
            fail "kind '${kind}' is not a valid cleanup kind, function '${func}' is not defined"
        fi
    done
}

function _call_cleanup_func() {
    local func_name
    func_name=${1}; shift
    if [[ -z "${_cleanups_sh_cleanup_kind_}" ]]; then
        _cleanups_sh_cleanup_kind_='trap'
    fi

    local func
    func="_${_cleanups_sh_cleanup_kind_}_${func_name}"

    "${func}" "${@}"
}

fi
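The prepend-then-source behaviour of the file cleanups can be demonstrated without the rest of the library. In this standalone sketch (function and variable names are made up, mirroring `_file_add_cleanup`), the most recently added command runs first when the file is sourced:

```shell
#!/bin/bash
# Standalone sketch of the file-based cleanup stack: new commands are
# prepended, so sourcing the file unwinds the stack in LIFO order.
set -euo pipefail

cleanup_file=$(mktemp)
run_log=''

sketch_add_cleanup() {
    # Prepend the new commands, mirroring _file_add_cleanup.
    local tmpfile
    tmpfile=$(mktemp)
    printf '%s\n' "${@}" >"${tmpfile}"
    cat "${cleanup_file}" >>"${tmpfile}"
    mv -f "${tmpfile}" "${cleanup_file}"
}

sketch_add_cleanup 'run_log+="first-added;"'
sketch_add_cleanup 'run_log+="second-added;"'

# Sourcing executes the most recently added command first.
source "${cleanup_file}"
rm -f "${cleanup_file}"
echo "${run_log}"
```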
183
pkg_auto/impl/gentoo_ver.sh
Normal file
@@ -0,0 +1,183 @@
# Copyright 1999-2021 Gentoo Authors
# Distributed under the terms of the GNU General Public License v2

# @AUTHOR:
# Ulrich Müller <ulm@gentoo.org>
# Michał Górny <mgorny@gentoo.org>

# It is a cut-down and modified version of the now-gone eapi7-ver.eclass.

if [[ -z ${__GENTOO_VER_SH_INCLUDED__:-} ]]; then
__GENTOO_VER_SH_INCLUDED__=x

source "$(dirname "${BASH_SOURCE[0]}")/util.sh"

VER_ERE="^([0-9]+(\.[0-9]+)*)([a-z]?)((_(alpha|beta|pre|rc|p)[0-9]*)*)(-r[0-9]+)?$"

# @FUNCTION: _ver_compare_int
# @USAGE: <a> <b>
# @RETURN: 0 if <a> -eq <b>, 1 if <a> -lt <b>, 3 if <a> -gt <b>
# @INTERNAL
# @DESCRIPTION:
# Compare two non-negative integers <a> and <b>, of arbitrary length.
# If <a> is equal to, less than, or greater than <b>, return 0, 1, or 3
# as exit status, respectively.
_ver_compare_int() {
    local a=$1 b=$2 d=$(( ${#1}-${#2} ))

    # Zero-pad to equal length if necessary.
    if [[ ${d} -gt 0 ]]; then
        printf -v b "%0${d}d%s" 0 "${b}"
    elif [[ ${d} -lt 0 ]]; then
        printf -v a "%0$(( -d ))d%s" 0 "${a}"
    fi

    [[ ${a} > ${b} ]] && return 3
    [[ ${a} == "${b}" ]]
}

# @FUNCTION: _ver_compare
# @USAGE: <va> <vb>
# @RETURN: 1 if <va> < <vb>, 2 if <va> = <vb>, 3 if <va> > <vb>
# @INTERNAL
# @DESCRIPTION:
# Compare two versions <va> and <vb>. If <va> is less than, equal to,
# or greater than <vb>, return 1, 2, or 3 as exit status, respectively.
_ver_compare() {
    local va=${1} vb=${2} a an al as ar b bn bl bs br re LC_ALL=C

    re=${VER_ERE}

    [[ ${va} =~ ${re} ]] || fail "${FUNCNAME[0]}: invalid version: ${va}"
    an=${BASH_REMATCH[1]}
    al=${BASH_REMATCH[3]}
    as=${BASH_REMATCH[4]}
    ar=${BASH_REMATCH[7]}

    [[ ${vb} =~ ${re} ]] || fail "${FUNCNAME[0]}: invalid version: ${vb}"
    bn=${BASH_REMATCH[1]}
    bl=${BASH_REMATCH[3]}
    bs=${BASH_REMATCH[4]}
    br=${BASH_REMATCH[7]}

    # Compare numeric components (PMS algorithm 3.2)
    # First component
    _ver_compare_int "${an%%.*}" "${bn%%.*}" || return

    while [[ ${an} == *.* && ${bn} == *.* ]]; do
        # Other components (PMS algorithm 3.3)
        an=${an#*.}
        bn=${bn#*.}
        a=${an%%.*}
        b=${bn%%.*}
        if [[ ${a} == 0* || ${b} == 0* ]]; then
            # Remove any trailing zeros
            [[ ${a} =~ 0+$ ]] && a=${a%"${BASH_REMATCH[0]}"}
            [[ ${b} =~ 0+$ ]] && b=${b%"${BASH_REMATCH[0]}"}
            [[ ${a} > ${b} ]] && return 3
            [[ ${a} < ${b} ]] && return 1
        else
            _ver_compare_int "${a}" "${b}" || return
        fi
    done
    [[ ${an} == *.* ]] && return 3
    [[ ${bn} == *.* ]] && return 1

    # Compare letter components (PMS algorithm 3.4)
    [[ ${al} > ${bl} ]] && return 3
    [[ ${al} < ${bl} ]] && return 1

    # Compare suffixes (PMS algorithm 3.5)
    as=${as#_}${as:+_}
    bs=${bs#_}${bs:+_}
    while [[ -n ${as} && -n ${bs} ]]; do
        # Compare each suffix (PMS algorithm 3.6)
        a=${as%%_*}
        b=${bs%%_*}
        if [[ ${a%%[0-9]*} == "${b%%[0-9]*}" ]]; then
            _ver_compare_int "${a##*[a-z]}" "${b##*[a-z]}" || return
        else
            # Check for p first
            [[ ${a%%[0-9]*} == p ]] && return 3
            [[ ${b%%[0-9]*} == p ]] && return 1
            # Hack: Use that alpha < beta < pre < rc alphabetically
            [[ ${a} > ${b} ]] && return 3 || return 1
        fi
        as=${as#*_}
        bs=${bs#*_}
    done
    if [[ -n ${as} ]]; then
        [[ ${as} == p[_0-9]* ]] && return 3 || return 1
    elif [[ -n ${bs} ]]; then
        [[ ${bs} == p[_0-9]* ]] && return 1 || return 3
    fi

    # Compare revision components (PMS algorithm 3.7)
    _ver_compare_int "${ar#-r}" "${br#-r}" || return

    return 2
}

# @FUNCTION: ver_test
# @USAGE: [<v1>] <op> <v2>
# @DESCRIPTION:
# Check if the relation <v1> <op> <v2> is true. If <v1> is not specified,
# default to ${PVR}. <op> can be -gt, -ge, -eq, -ne, -le, -lt.
# Both versions must conform to the PMS version syntax (with optional
# revision parts), and the comparison is performed according to
# the algorithm specified in the PMS.
ver_test() {
    local va op vb

    if [[ $# -eq 3 ]]; then
        va=${1}
        shift
    else
        va=${PVR}
    fi

    [[ $# -eq 2 ]] || fail "${FUNCNAME[0]}: bad number of arguments"
|
||||||
|
|
||||||
|
op=${1}
|
||||||
|
vb=${2}
|
||||||
|
|
||||||
|
case ${op} in
|
||||||
|
-eq|-ne|-lt|-le|-gt|-ge) ;;
|
||||||
|
*) fail "${FUNCNAME[0]}: invalid operator: ${op}" ;;
|
||||||
|
esac
|
||||||
|
|
||||||
|
_ver_compare "${va}" "${vb}"
|
||||||
|
test $? "${op}" 2
|
||||||
|
}
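The closing `test $? "${op}" 2` line works because `_ver_compare` encodes its verdict in the exit status (1/2/3 for lower/equal/greater), so any of the six `test` operators can be applied to that status against the midpoint 2. A minimal, self-contained sketch of that trick, with a hypothetical stub standing in for `_ver_compare`:

```shell
# Stub standing in for _ver_compare: pretend the first version is lower,
# i.e. exit status 1 (2 would mean equal, 3 greater).
compare_stub() { return 1; }

rv=0
compare_stub || rv=${?}

# "-lt 2" asks: did the comparator report less-than?
result='not lower'
if test "${rv}" -lt 2; then
    result='lower'
fi
echo "${result}"
```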

# symbolic names for use with gentoo_ver_cmp_out
GV_LT=1
GV_EQ=2
GV_GT=3

# Compare two versions. The result can be compared against GV_LT, GV_EQ and GV_GT variables.
#
# Params:
#
# 1 - version 1
# 2 - version 2
# 3 - name of variable to store the result in
function gentoo_ver_cmp_out() {
    local v1 v2
    v1=${1}; shift
    v2=${1}; shift
    local -n out_ref=${1}; shift

    out_ref=0
    _ver_compare "${v1}" "${v2}" || out_ref=${?}
    case ${out_ref} in
        "${GV_LT}"|"${GV_EQ}"|"${GV_GT}")
            return 0
            ;;
        *)
            fail "unexpected return value ${out_ref} from _ver_compare for ${v1} and ${v2}"
            ;;
    esac
}

fi

485 pkg_auto/impl/inside_sdk_container_lib.sh Normal file
@@ -0,0 +1,485 @@
#!/bin/bash

if [[ -z ${__INSIDE_SDK_CONTAINER_LIB_SH_INCLUDED__:-} ]]; then
__INSIDE_SDK_CONTAINER_LIB_SH_INCLUDED__=x

source "$(dirname "${BASH_SOURCE[0]}")/util.sh"

# Invokes emerge to get a report about built packages for a given
# metapackage in the given root that has a portage configuration.
#
# Params:
#
# 1 - root filesystem with the portage config
# 2 - metapackage to get the deps from
function emerge_pretend() {
    local root package
    root=${1}; shift
    package=${1}; shift

    # Probably a bunch of those flags are not necessary, but I'm not
    # touching it - they seem to be working. :)
    local -a emerge_opts=(
        --config-root="${root}"
        --root="${root}"
        --sysroot="${root}"
        --pretend
        --columns
        --nospinner
        --oneshot
        --color n
        --emptytree
        --verbose
        --verbose-conflicts
        --verbose-slot-rebuilds y
        --changed-deps y
        --changed-deps-report y
        --changed-slot y
        --changed-use
        --newuse
        --complete-graph y
        --deep
        --rebuild-if-new-slot y
        --rebuild-if-unbuilt y
        --with-bdeps y
        --dynamic-deps y
        --update
        --ignore-built-slot-operator-deps y
        --selective n
        --keep-going y
    )
    local rv
    rv=0
    emerge "${emerge_opts[@]}" "${package}" || rv=${?}
    if [[ ${rv} -ne 0 ]]; then
        echo "WARNING: emerge exited with status ${rv}" >&2
    fi
}

# Gets package list for SDK.
function package_info_for_sdk() {
    local root
    root='/'

    ignore_crossdev_stuff "${root}"
    emerge_pretend "${root}" coreos-devel/sdk-depends
    revert_crossdev_stuff "${root}"
}

# Gets package list for board of a given architecture.
#
# Params:
#
# 1 - architecture
function package_info_for_board() {
    local arch
    arch=${1}; shift

    local root
    root="/build/${arch}-usr"

    # Ignore crossdev stuff in both SDK root and board root - emerge
    # may query SDK stuff for the board packages.
    ignore_crossdev_stuff /
    ignore_crossdev_stuff "${root}"
    emerge_pretend "${root}" coreos-devel/board-packages
    revert_crossdev_stuff "${root}"
    revert_crossdev_stuff /
}

# Set the directory where the emerge output and the results of
# processing it will be stored. EO stands for "emerge output".
#
# Params:
#
# 1 - directory path
function set_eo() {
    local dir=${1}; shift

    SDK_EO="${dir}/sdk-emerge-output"
    BOARD_EO="${dir}/board-emerge-output"
    # shellcheck disable=SC2034 # used indirectly in cat_eo_f
    SDK_EO_F="${SDK_EO}-filtered"
    # shellcheck disable=SC2034 # used indirectly in cat_eo_f
    BOARD_EO_F="${BOARD_EO}-filtered"
    # shellcheck disable=SC2034 # used indirectly in cat_eo_w
    SDK_EO_W="${SDK_EO}-warnings"
    # shellcheck disable=SC2034 # used indirectly in cat_eo_w
    BOARD_EO_W="${BOARD_EO}-warnings"
}

# Print the contents of a file whose path is stored in a variable of
# a given name.
#
# Params:
#
# 1 - name of the variable
function cat_var() {
    local var_name
    var_name=${1}; shift
    local -n ref="${var_name}"

    if [[ -z "${ref+isset}" ]]; then
        fail "${var_name} unset"
    fi
    if [[ ! -e "${ref}" ]]; then
        fail "${ref} does not exist"
    fi

    cat "${ref}"
}

# Print contents of the emerge output of a given kind. Kind can be
# either sdk or board.
#
# Params:
#
# 1 - kind
function cat_eo() {
    local kind
    kind=${1}; shift

    cat_var "${kind^^}_EO"
}

# Print contents of the filtered emerge output of a given kind. Kind
# can be either sdk or board. The filtered emerge output
# contains only lines with package information.
#
# Params:
#
# 1 - kind
function cat_eo_f() {
    local kind
    kind=${1}; shift
    cat_var "${kind^^}_EO_F"
}

# Print contents of a file that stores warnings that were printed by
# emerge. The warnings are specific to a kind (sdk or board).
#
# Params:
#
# 1 - kind
function cat_eo_w() {
    local kind
    kind=${1}; shift

    cat_var "${kind^^}_EO_W"
}

# JSON output would be more verbose, but probably would not require
# these abominations below. But, alas, emerge doesn't have that yet.

# status            package name  version slot repo          target (opt)   keyvals                    size
# |--------------|  |----------|  |#-g1-----------#--#-g2-#| |--|-g------|  |-g----------#-#-g-----|   |---|
# [ebuild   R    ~] virtual/rust  [1.71.1:0/llvm-16::coreos] to /some/root  USE="-rustfmt" FOO="bar"   0 KiB
#
# Actually, there can also be another "version slot repo" part between
# the first "version slot repo" and "target" part.
STATUS_RE='\[[^]]*]' # 0 groups
PACKAGE_NAME_RE='[^[:space:]]*' # 0 groups
VER_SLOT_REPO_RE='\[\([^]]\+\)::\([^]]\+\)]' # 2 groups
TARGET_RE='to[[:space:]]\+\([^[:space:]]\)\+' # 1 group
KEYVALS_RE='\([[:space:]]*[A-Za-z0-9_]*="[^"]*"\)*' # 1 group (but containing only the last pair!)
SIZE_RE='[[:digit:]]\+[[:space:]]*[[:alpha:]]*B' # 0 groups
SPACES_RE='[[:space:]]\+' # 0 groups
NONSPACES_RE='[^[:space:]]\+' # 0 groups
NONSPACES_WITH_COLON_RE='[^[:space:]]*:' # 0 groups

FULL_LINE_RE='^'"${STATUS_RE}${SPACES_RE}${PACKAGE_NAME_RE}"'\('"${SPACES_RE}${VER_SLOT_REPO_RE}"'\)\{1,2\}\('"${SPACES_RE}${TARGET_RE}"'\)\?\('"${SPACES_RE}${KEYVALS_RE}"'\)*'"${SPACES_RE}${SIZE_RE}"'$'
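The regexp fragments above can be checked in isolation against the sample line from the comment, outside the library. This is a sketch: the fragments are copied verbatim from above, and plain GNU `grep` is used here on the assumption that the library's `xgrep` is a thin wrapper around `grep`:

```shell
# Copied from the library above; BRE with GNU \+ / \? extensions.
STATUS_RE='\[[^]]*]'
PACKAGE_NAME_RE='[^[:space:]]*'
VER_SLOT_REPO_RE='\[\([^]]\+\)::\([^]]\+\)]'
TARGET_RE='to[[:space:]]\+\([^[:space:]]\)\+'
KEYVALS_RE='\([[:space:]]*[A-Za-z0-9_]*="[^"]*"\)*'
SIZE_RE='[[:digit:]]\+[[:space:]]*[[:alpha:]]*B'
SPACES_RE='[[:space:]]\+'

FULL_LINE_RE='^'"${STATUS_RE}${SPACES_RE}${PACKAGE_NAME_RE}"'\('"${SPACES_RE}${VER_SLOT_REPO_RE}"'\)\{1,2\}\('"${SPACES_RE}${TARGET_RE}"'\)\?\('"${SPACES_RE}${KEYVALS_RE}"'\)*'"${SPACES_RE}${SIZE_RE}"'$'

# The sample package line from the comment above.
line='[ebuild   R    ~] virtual/rust [1.71.1:0/llvm-16::coreos] to /some/root USE="-rustfmt" FOO="bar" 0 KiB'
if printf '%s\n' "${line}" | grep -q -e "${FULL_LINE_RE}"; then
    echo 'matched'
fi
```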

# Filters sdk reports to get the package information.
function filter_sdk_eo() {
    cat_eo sdk | xgrep -e "${FULL_LINE_RE}"
}

# Filters board reports for a given arch to get the package
# information.
#
# Params:
#
# 1 - architecture
function filter_board_eo() {
    local arch
    arch=${1}; shift

    # Replace ${arch}-usr in the output with a generic word BOARD.
    cat_eo board | \
        xgrep -e "${FULL_LINE_RE}" | \
        sed -e "s#/build/${arch}-usr/#/build/BOARD/#"
}

# Filters sdk reports to get anything but the package information
# (i.e. junk).
function junk_sdk_eo() {
    cat_eo sdk | xgrep -v -e "${FULL_LINE_RE}"
}

# Filters board reports to get anything but the package information
# (i.e. junk).
function junk_board_eo() {
    cat_eo board | xgrep -v -e "${FULL_LINE_RE}"
}

# More regexp-like abominations follow.

# There may also be a line like:
#
# [blocks B      ] <dev-util/gdbus-codegen-2.76.4 ("<dev-util/gdbus-codegen-2.76.4" is soft blocking dev-libs/glib-2.76.4)
#
# But currently we don't care about those - they land in junk.

SLOT_INFO_SED_FILTERS=(
    # if there is no slot information in version, add :0
    #
    # assumption here is that version is a second word
    -e "/^${NONSPACES_RE}${SPACES_RE}${NONSPACES_WITH_COLON_RE}/ ! s/^\(${NONSPACES_RE}${SPACES_RE}${NONSPACES_RE}\)/\1:0/"
)

PKG_VER_SLOT_SED_FILTERS=(
    # from line like:
    #
    # [ebuild   R    ~] virtual/rust [1.71.1:0/llvm-16::coreos] USE="-rustfmt" 0 KiB
    #
    # extract package name, version and optionally a slot if it exists, the result would be:
    #
    # virtual/rust 1.71.1:0/llvm-16
    -e "s/^${STATUS_RE}${SPACES_RE}\(${PACKAGE_NAME_RE}\)${SPACES_RE}${VER_SLOT_REPO_RE}.*/\1 \2/"
    "${SLOT_INFO_SED_FILTERS[@]}"
)
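These filters can be exercised standalone. In this sketch, the two sed expressions are copied from the arrays above into a hypothetical `extract` helper (GNU sed assumed); the second input line has no slot in its version, so the fallback `:0` gets appended:

```shell
# Regexp fragments copied from the library above.
STATUS_RE='\[[^]]*]'
PACKAGE_NAME_RE='[^[:space:]]*'
VER_SLOT_REPO_RE='\[\([^]]\+\)::\([^]]\+\)]'
SPACES_RE='[[:space:]]\+'
NONSPACES_RE='[^[:space:]]\+'
NONSPACES_WITH_COLON_RE='[^[:space:]]*:'

# Hypothetical helper combining the two PKG_VER_SLOT_SED_FILTERS expressions.
extract() {
    sed \
        -e "s/^${STATUS_RE}${SPACES_RE}\(${PACKAGE_NAME_RE}\)${SPACES_RE}${VER_SLOT_REPO_RE}.*/\1 \2/" \
        -e "/^${NONSPACES_RE}${SPACES_RE}${NONSPACES_WITH_COLON_RE}/ ! s/^\(${NONSPACES_RE}${SPACES_RE}${NONSPACES_RE}\)/\1:0/"
}

out1=$(echo '[ebuild   R    ~] virtual/rust [1.71.1:0/llvm-16::coreos] USE="-rustfmt" 0 KiB' | extract)
out2=$(echo '[ebuild  N     ] app-misc/foo [1.2.3::gentoo] 0 KiB' | extract)
echo "${out1}"
echo "${out2}"
```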

PKG_VER_SLOT_KV_SED_FILTERS=(
    # from line like:
    #
    # [ebuild   R    ~] virtual/rust [1.71.1:0/llvm-16::coreos] USE="-rustfmt" 0 KiB
    #
    # extract package name, version, optionally a slot if it exists and key value pairs if any, the result would be:
    #
    # virtual/rust 1.71.1:0/llvm-16 USE="-rustfmt"
    -e "s/${STATUS_RE}${SPACES_RE}\(${PACKAGE_NAME_RE}\)${SPACES_RE}${VER_SLOT_REPO_RE}\(${SPACES_RE}${VER_SLOT_REPO_RE}\)\?\(${SPACES_RE}${TARGET_RE}\)\?${SPACES_RE}\(${KEYVALS_RE}\)${SPACES_RE}${SIZE_RE}\$/\1 \2 \9/"
    "${SLOT_INFO_SED_FILTERS[@]}"
)

PKG_REPO_SED_FILTERS=(
    # from line like:
    #
    # [ebuild   R    ~] virtual/rust [1.71.1:0/llvm-16::coreos] USE="-rustfmt" 0 KiB
    #
    # extract package name and repo, the result would be:
    #
    # virtual/rust coreos
    -e "s/^${STATUS_RE}${SPACES_RE}\(${PACKAGE_NAME_RE}\)${SPACES_RE}${VER_SLOT_REPO_RE}${SPACES_RE}.*/\1 \3/"
)

# Applies some sed filter over the emerge output of a given kind.
# Results are printed.
#
# Params:
#
# 1 - kind (sdk or board)
# @ - parameters passed to sed
function sed_eo_and_sort() {
    local kind
    kind=${1}; shift
    # rest goes to sed

    cat_eo_f "${kind}" | sed "${@}" | sort
}

# Applies some sed filter over the SDK emerge output. Results are
# printed.
#
# Params:
#
# @ - parameters passed to sed
function packages_for_sdk() {
    # args are passed to sed_eo_and_sort

    sed_eo_and_sort sdk "${@}"
}

# Applies some sed filter over the board emerge output. Results are
# printed.
#
# Params:
#
# @ - parameters passed to sed
function packages_for_board() {
    # args are passed to sed_eo_and_sort

    sed_eo_and_sort board "${@}"
}

# Prints package name, slot and version information for SDK.
function versions_sdk() {
    local -a sed_opts
    sed_opts=(
        "${PKG_VER_SLOT_SED_FILTERS[@]}"
    )
    packages_for_sdk "${sed_opts[@]}"
}

# Prints package name, slot, version and key-values information for
# SDK. Key-values may be something like USE="foo bar -baz".
function versions_sdk_with_key_values() {
    local -a sed_opts
    sed_opts=(
        "${PKG_VER_SLOT_KV_SED_FILTERS[@]}"
    )
    packages_for_sdk "${sed_opts[@]}"
}

# Prints package name, slot and version information for board.
function versions_board() {
    local -a sed_opts
    sed_opts=(
        -e '/to \/build\/BOARD\// ! d'
        "${PKG_VER_SLOT_SED_FILTERS[@]}"
    )
    packages_for_board "${sed_opts[@]}"
}

# Prints package name, slot, version and key-values information for
# build dependencies of board. Key-values may be something like
# USE="foo bar -baz".
function board_bdeps() {
    local -a sed_opts
    sed_opts=(
        -e '/to \/build\/BOARD\// d'
        "${PKG_VER_SLOT_KV_SED_FILTERS[@]}"
    )
    packages_for_board "${sed_opts[@]}"
}

# Print package name and source repository names information for SDK.
function package_sources_sdk() {
    local -a sed_opts
    sed_opts=(
        "${PKG_REPO_SED_FILTERS[@]}"
    )
    packages_for_sdk "${sed_opts[@]}"
}

# Print package name and source repository names information for
# board.
function package_sources_board() {
    local -a sed_opts
    sed_opts=(
        "${PKG_REPO_SED_FILTERS[@]}"
    )
    packages_for_board "${sed_opts[@]}"
}

# Checks if no errors were produced by emerge when generating
# reports. It is assumed that emerge will print a line with "ERROR" in
# it to denote a failure.
function ensure_no_errors() {
    local kind

    for kind in sdk board; do
        if cat_eo_w "${kind}" | grep --quiet --fixed-strings 'ERROR'; then
            fail "there are errors in emerge output warnings files"
        fi
    done
}

# Stores a path to a package.provided file inside the given root
# filesystem portage configuration. Mostly used to ignore
# cross-toolchains.
#
# Params:
#
# 1 - path to root filesystem with the portage configuration
# 2 - name of a variable where the path will be stored
function get_provided_file() {
    local root path_var_name
    root=${1}; shift
    path_var_name=${1}; shift
    local -n path_ref="${path_var_name}"

    # shellcheck disable=SC2034 # reference to external variable
    path_ref="${root}/etc/portage/profile/package.provided/ignore_cross_packages"
}

# Marks packages coming from the crossdev repo as provided at a very high
# version. We do this, because updating their native counterparts will
# cause emerge to complain that cross-<triplet>/<package> is masked
# (like for sys-libs/glibc and cross-x86_64-cros-linux-gnu/glibc),
# because it has no keywords. In theory, we could try updating the
# <ROOT>/etc/portage/package.mask/cross-<triplet> file created by the
# crossdev tool to unmask the new version, but it's an unnecessary
# hassle - the native and cross packages are supposed to be the same ebuild
# anyway, so the update information about the cross package is redundant.
#
# Params:
#
# 1 - root directory
# 2 - ID of the crossdev repository (optional, defaults to x-crossdev)
function ignore_crossdev_stuff() {
    local root crossdev_repo_id
    root=${1}; shift
    crossdev_repo_id=${1:-x-crossdev}; shift || :

    local crossdev_repo_path
    crossdev_repo_path=$(portageq get_repo_path "${root}" "${crossdev_repo_id}")

    local ics_path ics_dir
    get_provided_file "${root}" ics_path
    dirname_out "${ics_path}" ics_dir

    sudo mkdir -p "${ics_dir}"
    env --chdir="${crossdev_repo_path}" find . -type l | \
        cut -d/ -f2-3 | \
        sed -e 's/$/-9999/' | \
        sudo tee "${ics_path}" >/dev/null
}
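The `find | cut | sed` pipeline at the end can be tried against a throwaway directory tree that loosely mimics the crossdev repository layout (the layout here is hypothetical, and GNU coreutils `env --chdir` is assumed), to show the `package.provided` lines it emits:

```shell
tmp=$(mktemp -d)
mkdir -p "${tmp}/repo/cross-x86_64-cros-linux-gnu"
# Pretend the package directories are symlinks, as crossdev sets them up.
ln -s /dev/null "${tmp}/repo/cross-x86_64-cros-linux-gnu/gcc"
ln -s /dev/null "${tmp}/repo/cross-x86_64-cros-linux-gnu/glibc"

# Same pipeline as in ignore_crossdev_stuff, minus sudo/tee; sorted for
# deterministic output.
provided=$(env --chdir="${tmp}/repo" find . -type l | cut -d/ -f2-3 | sed -e 's/$/-9999/' | sort)
echo "${provided}"
rm -rf "${tmp}"
```

Each emitted line marks one cross package as provided at version 9999, which is what keeps emerge from trying to rebuild it.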

# Reverts effects of the ignore_crossdev_stuff function.
#
# Params:
#
# 1 - root directory
function revert_crossdev_stuff() {
    local root
    root=${1}; shift

    local ics_path ics_dir
    get_provided_file "${root}" ics_path
    dirname_out "${ics_path}" ics_dir

    sudo rm -f "${ics_path}"
    if dir_is_empty "${ics_dir}"; then
        sudo rmdir "${ics_dir}"
    fi
}

# Checks if the expected reports were generated by emerge.
function ensure_valid_reports() {
    local kind var_name
    for kind in sdk board; do
        var_name="${kind^^}_EO_F"
        if [[ ! -s ${!var_name} ]]; then
            fail "report files are missing or are empty"
        fi
    done
}

# Drops the empty warning files in a given directory.
#
# Params:
#
# 1 - path to the directory
function clean_empty_warning_files() {
    local dir
    dir=${1}; shift

    local file
    for file in "${dir}/"*'-warnings'; do
        if [[ ! -s ${file} ]]; then
            rm -f "${file}"
        fi
    done
}

fi

573 pkg_auto/impl/mvm.sh Normal file
@@ -0,0 +1,573 @@
#!/bin/bash

#
# "mvm" stands for "multi-valued map", so these are maps of scalars
# (strings, numbers) to other containers (arrays or maps).
#
# mvm is implemented with a map that contains some predefined keys,
# like "name", "constructor", "storage", etc.
#
# The "storage" field is the actual "map" part of the "mvm", and the
# values stored in it are names of the global variables being the
# "multi-valued" part of the "mvm". In the code these variables are
# referred to as "mvc" meaning "multi-value container".
#
# The "constructor" and "destructor" fields are here to properly
# implement creating and destroying mvcs. The "adder" field is for
# adding elements to an mvc.
#
# There is also a "counter" field which, together with the "name"
# field, is used for creating the names for mvc variables.
#
# The "extras" field is for user-defined mapping. The mvm will clear
# the mapping itself, but if the values are anything other than simple
# scalars (e.g. names of variables) then the cleanup of those is the
# user's task.
#
# There is also an optional field named "iteration_helper" which is a
# callback invoked when iterating over the mvm.
#
# In order to implement a new mvc type, the following functions need
# to be implemented:
#
# <type>_constructor - takes an mvc name; should create an mvc with the
#                      passed name.
# <type>_destructor - takes an mvc name; should unset an mvc with the
#                     passed name, should likely take care of cleaning
#                     up the values stored in the mvc
# <type>_adder - takes an mvc name and values to be added; should add
#                the values to the mvc
# <type>_iteration_helper - optional; takes a key, an mvc name, a
#                           callback and extra arguments to be
#                           forwarded to the callback; should invoke
#                           the callback with the extra arguments, the
#                           key, the mvc name and optionally some
#                           extra arguments the helper deems useful
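To make the description above concrete, here is a stripped-down, self-contained sketch of the storage mechanism (hypothetical `demo_*` names, none of the real mvm plumbing): each key maps to the name of a generated global array, and a nameref is used to reach the per-key container.

```shell
declare -A demo_storage=()   # key -> name of the global array holding the values
demo_counter=0

demo_add() {
    local key=${1}; shift
    local mvc_name=${demo_storage["${key}"]:-}
    if [[ -z ${mvc_name} ]]; then
        # Create a fresh global array and remember its name under the key.
        mvc_name="demo_mvc_${demo_counter}"
        demo_counter=$((demo_counter + 1))
        declare -g -a "${mvc_name}"
        demo_storage["${key}"]=${mvc_name}
    fi
    # Append the remaining arguments through a nameref.
    local -n mvc_ref=${mvc_name}
    mvc_ref+=( "${@}" )
}

demo_add sdk app-shells/bash
demo_add sdk sys-apps/coreutils
demo_add board sys-kernel/gentoo-sources

declare -n sdk_ref=${demo_storage['sdk']}
echo "sdk has ${#sdk_ref[@]} entries: ${sdk_ref[*]}"
```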

if [[ -z ${__MVM_SH_INCLUDED__:-} ]]; then
__MVM_SH_INCLUDED__=x

source "$(dirname "${BASH_SOURCE[0]}")/util.sh"

# Used for creating unique names for extras and storage maps.
MVM_COUNTER=0

# mvm API

# Creates a new mvm with a passed name and, optionally, a type and
# extras. The name must be globally unique. The type is optional. If
# no type is passed, an array mvm will be assumed. Otherwise the type
# must be valid, i.e. it must provide a constructor, a destructor, an
# adder and, optionally, an iteration helper. The built-in types are
# "mvm_mvc_array", "mvm_mvc_set" and "mvm_mvc_map". If any extras are
# passed, they must be preceded with a double dash to avoid ambiguity
# between type and a first extras key. Extras are expected to be even
# in count, odd elements will be used as keys, even elements will be
# used as values.
#
# Params:
#
# 1 - name of the mvm
# @ - optional mvc type, optionally followed by double dash and extras
#     key-value pairs.
function mvm_declare() {
    local mvm_var_name
    mvm_var_name=${1}; shift

    if declare -p "${mvm_var_name}" >/dev/null 2>/dev/null; then
        fail "variable ${mvm_var_name} already exists, declaring mvm for it would clobber it"
    fi

    local value_handler_prefix
    value_handler_prefix=''
    if [[ ${#} -gt 0 ]]; then
        if [[ ${1} != '--' ]]; then
            value_handler_prefix=${1}
            shift
        fi
        if [[ ${#} -gt 0 ]]; then
            if [[ ${1} != '--' ]]; then
                fail "missing double-dash separator between optional value handler prefix and extra key value pairs for '${mvm_var_name}'"
            fi
            shift
        fi
    fi
    if [[ -z ${value_handler_prefix} ]]; then
        value_handler_prefix=mvm_mvc_array
    fi
    # rest are key value pairs for extras

    mvm_debug "${mvm_var_name}" "using prefix ${value_handler_prefix}"

    local constructor destructor adder iteration_helper
    constructor="${value_handler_prefix}_constructor"
    destructor="${value_handler_prefix}_destructor"
    adder="${value_handler_prefix}_adder"
    iteration_helper="${value_handler_prefix}_iteration_helper"

    local func
    for func in "${constructor}" "${destructor}" "${adder}"; do
        if ! declare -pF "${func}" >/dev/null 2>/dev/null; then
            fail "'${func}' is not a function, is '${value_handler_prefix}' a valid prefix?"
        fi
    done

    if ! declare -pF "${iteration_helper}" >/dev/null 2>/dev/null; then
        mvm_debug "${mvm_var_name}" "no iteration helper available"
        iteration_helper=''
    fi

    local extras_idx storage_idx extras_map_var_name storage_map_var_name
    extras_idx=$((MVM_COUNTER))
    storage_idx=$((MVM_COUNTER + 1))
    extras_map_var_name="mvm_stuff_${extras_idx}"
    storage_map_var_name="mvm_stuff_${storage_idx}"

    MVM_COUNTER=$((MVM_COUNTER + 2))

    declare -g -A "${mvm_var_name}" "${extras_map_var_name}" "${storage_map_var_name}"

    mvm_debug "${mvm_var_name}" "extras map: ${extras_map_var_name}, storage_map: ${storage_map_var_name}"

    local -n storage_map_ref=${storage_map_var_name}
    storage_map_ref=()

    local -n mvm_ref=${mvm_var_name}
    # shellcheck disable=SC2034 # it's a reference to external variable
    mvm_ref=(
        ['name']="${mvm_var_name}"
        ['constructor']="${constructor}"
        ['destructor']="${destructor}"
        ['adder']="${adder}"
        ['iteration_helper']="${iteration_helper}"
        ['counter']=0
        ['extras']="${extras_map_var_name}"
        ['storage']="${storage_map_var_name}"
    )
    local -n extras_map_ref=${extras_map_var_name}
    while [[ ${#} -gt 1 ]]; do
        mvm_debug "${mvm_var_name}" "adding ${1} -> ${2} pair to extras"
        extras_map_ref["${1}"]=${2}
        shift 2
    done
    if [[ ${#} -gt 0 ]]; then
        fail "odd number of parameters for extra key value information for '${mvm_var_name}'"
    fi
}

# Takes a name of mvm, a callback, and extra parameters that will be
# forwarded to the callback. Before invoking the callback, the
# function will declare a local variable called "mvm" which is a
# reference to the variable with the passed name. The "mvm" variable
# can be used for easy access to the map within the callback.
#
# The convention is that the function foo_barize will use mvm_call to
# invoke a callback named foo_c_barize. The foo_c_barize function can
# invoke other _c_ infixed functions, like mvm_c_get_extra or
# mvm_c_get.
#
# Params:
#
# 1 - name of mvm variable
# 2 - name of the callback
# @ - arguments for the callback
function mvm_call() {
    local name func
    name=${1}; shift
    func=${1}; shift
    # rest are func args

    mvm_debug "${name}" "invoking ${func} with args: ${*@Q}"

    # The "mvm" variable can be used by ${func} now.
    local -n mvm=${name}
    "${func}" "${@}"
}
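The nameref handoff that `mvm_call` relies on can be illustrated in isolation (hypothetical `call_with_ref`/`print_count` names): bash's dynamic scoping makes a local nameref declared in the caller visible to the callee.

```shell
declare -A fruits=( ['name']='fruits' ['count']=2 )

call_with_ref() {
    local name=${1}; shift
    local func=${1}; shift
    # "mvm" is a local nameref here, but the callee sees it too.
    local -n mvm=${name}
    "${func}" "${@}"
}

print_count() {
    echo "${mvm['name']} has ${mvm['count']} entries"
}

call_with_ref fruits print_count
```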

# Internal function that generates a name for mvc based on passed name
# and counter.
function __mvm_mvc_name() {
    local name counter mvc_name_var_name
    name=${1}; shift
    counter=${1}; shift
    mvc_name_var_name=${1}; shift
    local -n mvc_name_ref=${mvc_name_var_name}

    # shellcheck disable=SC2034 # it's a reference to external variable
    mvc_name_ref="mvm_${name}_mvc_${counter}"
}

# Destroy the mvm with passed name.
#
# Params:
#
# 1 - name of mvm to destroy
function mvm_unset() {
    mvm_call "${1}" mvm_c_unset "${@:2}"
}

# Helper function for mvm_unset invoked through mvm_call.
function mvm_c_unset() {
    local counter name extras_map_var_name storage_map_var_name destructor mvm_mcu_mvc_name

    counter=${mvm['counter']}
    name=${mvm['name']}
    extras_map_var_name=${mvm['extras']}
    storage_map_var_name=${mvm['storage']}
    destructor=${mvm['destructor']}

    while [[ ${counter} -gt 0 ]]; do
        counter=$((counter - 1))
        __mvm_mvc_name "${name}" "${counter}" mvm_mcu_mvc_name
        "${destructor}" "${mvm_mcu_mvc_name}"
    done
    unset "${storage_map_var_name}"
    unset "${extras_map_var_name}"
    unset "${name}"
}

# Gets a value from the extras map for a given key.
#
# Params:
#
# 1 - name of the mvm variable
# 2 - extra key
# 3 - name of a variable where the extra value will be stored
function mvm_get_extra() {
    mvm_call "${1}" mvm_c_get_extra "${@:2}"
}

# Helper function for mvm_get_extra invoked through mvm_call.
function mvm_c_get_extra() {
    local extra extra_var_name
    extra=${1}; shift
    extra_var_name=${1}; shift
    local -n extra_ref=${extra_var_name}

    local extras_map_var_name
    extras_map_var_name=${mvm['extras']}
    # shellcheck disable=SC2178 # shellcheck doesn't grok references to arrays
    local -n extras_map_ref=${extras_map_var_name}

    # shellcheck disable=SC2034 # it's a reference to external variable
    extra_ref=${extras_map_ref["${extra}"]:-}
}

# Gets a name of the mvc for a given key.
#
# Params:
#
# 1 - name of the mvm variable
# 2 - key
# 3 - name of a variable where the mvc name will be stored
function mvm_get() {
    mvm_call "${1}" mvm_c_get "${@:2}"
}

# Helper function for mvm_get invoked through mvm_call.
function mvm_c_get() {
    local key value_var_name
    key=${1}; shift
    value_var_name=${1}; shift
    local -n value_ref=${value_var_name}

    local storage_map_var_name
    storage_map_var_name=${mvm['storage']}
    # shellcheck disable=SC2178 # shellcheck doesn't grok references to arrays
    local -n storage_map_ref=${storage_map_var_name}

    # shellcheck disable=SC2034 # it's a reference to external variable
    value_ref=${storage_map_ref["${key}"]:-}
}

# Internal function for creating a new mvc.
function __mvm_c_make_new_mvc() {
    local key mvc_name_var_name
    key=${1}; shift
    mvc_name_var_name=${1}; shift

    local name counter storage_map_var_name
    name=${mvm['name']}
    counter=${mvm['counter']}
    storage_map_var_name=${mvm['storage']}
    # shellcheck disable=SC2178 # shellcheck doesn't grok references to arrays
    local -n storage_map_ref=${storage_map_var_name}

    __mvm_mvc_name "${name}" "${counter}" "${mvc_name_var_name}"

    local constructor
    constructor=${mvm['constructor']}

    "${constructor}" "${!mvc_name_var_name}"
    mvm['counter']=$((counter + 1))
    storage_map_ref["${key}"]=${!mvc_name_var_name}
}

# Adds passed elements to the mvm under the given key. If an mvc for
# the key didn't exist in the mvm, it gets created.
#
# Params:
|
||||||
|
#
|
||||||
|
# 1 - name of the mvm variable
|
||||||
|
# 2 - key
|
||||||
|
# @ - elements
|
||||||
|
function mvm_add() {
|
||||||
|
mvm_call "${1}" mvm_c_add "${@:2}"
|
||||||
|
}
|
||||||
|
|
||||||
|
# Helper function for mvm_add invoked through mvm_call.
|
||||||
|
function mvm_c_add() {
|
||||||
|
local key
|
||||||
|
key=${1}; shift
|
||||||
|
# rest are values to add
|
||||||
|
|
||||||
|
local adder mvm_mca_mvc_name
|
||||||
|
adder=${mvm['adder']}
|
||||||
|
mvm_c_get "${key}" mvm_mca_mvc_name
|
||||||
|
|
||||||
|
if [[ -z ${mvm_mca_mvc_name} ]]; then
|
||||||
|
__mvm_c_make_new_mvc "${key}" mvm_mca_mvc_name
|
||||||
|
fi
|
||||||
|
"${adder}" "${mvm_mca_mvc_name}" "${@}"
|
||||||
|
}
|
||||||
|
|
||||||
|
# Removes the key from the mvm.
|
||||||
|
#
|
||||||
|
# Params:
|
||||||
|
#
|
||||||
|
# 1 - name of the mvm variable
|
||||||
|
# 2 - key
|
||||||
|
function mvm_remove() {
|
||||||
|
mvm_call "${1}" mvm_c_remove "${@:2}"
|
||||||
|
}
|
||||||
|
|
||||||
|
# Helper function for mvm_remove invoked through mvm_call.
|
||||||
|
function mvm_c_remove() {
|
||||||
|
local key
|
||||||
|
key=${1}; shift
|
||||||
|
|
||||||
|
local storage_map_var_name
|
||||||
|
storage_map_var_name=${mvm['storage']}
|
||||||
|
# shellcheck disable=SC2178 # shellcheck doesn't grok references to arrays
|
||||||
|
local -n storage_map_ref=${storage_map_var_name}
|
||||||
|
|
||||||
|
if [[ -z ${storage_map_ref["${key}"]:-} ]]; then
|
||||||
|
return 0
|
||||||
|
fi
|
||||||
|
|
||||||
|
local var_name=${storage_map_ref["${key}"]}
|
||||||
|
unset "storage_map_ref[${key}]"
|
||||||
|
|
||||||
|
local destructor
|
||||||
|
destructor=${mvm['destructor']}
|
||||||
|
|
||||||
|
"${destructor}" "${var_name}"
|
||||||
|
}
|
||||||
|
|
||||||
|
# Iterates over the key-mvc pairs and invokes a callback for each. The
|
||||||
|
# function also takes some extra parameters to forward to the
|
||||||
|
# callback. The callback will receive, in order, extra parameters, a
|
||||||
|
# key, an mvc name, and possibly some extra parameters from the
|
||||||
|
# iteration helper, if such exists for the mvm.
|
||||||
|
#
|
||||||
|
# Params:
|
||||||
|
#
|
||||||
|
# 1 - name of the mvm variable
|
||||||
|
# 2 - callback
|
||||||
|
# @ - extra parameters forwarded to the callback
|
||||||
|
function mvm_iterate() {
|
||||||
|
mvm_call "${1}" mvm_c_iterate "${@:2}"
|
||||||
|
}
|
||||||
|
|
||||||
|
# Helper function for mvm_iterate invoked through mvm_call.
|
||||||
|
function mvm_c_iterate() {
|
||||||
|
local callback
|
||||||
|
callback=${1}; shift
|
||||||
|
# rest are extra args passed to callback
|
||||||
|
|
||||||
|
local storage_map_var_name helper
|
||||||
|
storage_map_var_name=${mvm['storage']}
|
||||||
|
# shellcheck disable=SC2178 # shellcheck doesn't grok references to arrays
|
||||||
|
local -n storage_map_ref=${storage_map_var_name}
|
||||||
|
helper=${mvm['iteration_helper']}
|
||||||
|
|
||||||
|
local key value
|
||||||
|
if [[ -n "${helper}" ]]; then
|
||||||
|
for key in "${!storage_map_ref[@]}"; do
|
||||||
|
value=${storage_map_ref["${key}"]}
|
||||||
|
"${helper}" "${key}" "${value}" "${callback}" "${@}"
|
||||||
|
done
|
||||||
|
else
|
||||||
|
for key in "${!storage_map_ref[@]}"; do
|
||||||
|
value=${storage_map_ref["${key}"]}
|
||||||
|
"${callback}" "${@}" "${key}" "${value}"
|
||||||
|
done
|
||||||
|
fi
|
||||||
|
}
|
||||||
|
|
||||||
|
# debugging
|
||||||
|
|
||||||
|
declare -A MVM_DEBUG_NAMES=()
|
||||||
|
|
||||||
|
# Enables printing debugging info for a specified mvm.
|
||||||
|
#
|
||||||
|
# Params:
|
||||||
|
#
|
||||||
|
# 1 - name of the mvm variable
|
||||||
|
function mvm_debug_enable() {
|
||||||
|
local mvm_var_name=${1}; shift
|
||||||
|
MVM_DEBUG_NAMES["${mvm_var_name}"]=x
|
||||||
|
}
|
||||||
|
|
||||||
|
# Print debugging info about the mvm if debugging for it was enabled
|
||||||
|
# beforehand.
|
||||||
|
#
|
||||||
|
# Params:
|
||||||
|
#
|
||||||
|
# 1 - name of the mvm variable
|
||||||
|
# @ - strings to be printed
|
||||||
|
function mvm_debug() {
|
||||||
|
local name=${1}; shift
|
||||||
|
|
||||||
|
if [[ -n ${MVM_DEBUG_NAMES["${name}"]:-} ]]; then
|
||||||
|
info "MVM_DEBUG(${name}): ${*}"
|
||||||
|
fi
|
||||||
|
}
|
||||||
|
|
||||||
|
# Disables printing debugging info for a specified mvm.
|
||||||
|
#
|
||||||
|
# Params:
|
||||||
|
#
|
||||||
|
# 1 - name of the mvm variable
|
||||||
|
function mvm_debug_disable() {
|
||||||
|
local mvm_var_name=${1}; shift
|
||||||
|
unset "MVM_DEBUG_NAMES[${mvm_var_name}]"
|
||||||
|
}
|
||||||
|
|
||||||
|
# Array mvm, the default. Provides an iteration helper that sends all
|
||||||
|
# the array values to the iteration callback.
|
||||||
|
|
||||||
|
function mvm_mvc_array_constructor() {
|
||||||
|
local array_var_name
|
||||||
|
array_var_name=${1}; shift
|
||||||
|
|
||||||
|
declare -g -a "${array_var_name}"
|
||||||
|
|
||||||
|
local -n array_ref=${array_var_name}
|
||||||
|
array_ref=()
|
||||||
|
}
|
||||||
|
|
||||||
|
function mvm_mvc_array_destructor() {
|
||||||
|
local array_var_name
|
||||||
|
array_var_name=${1}; shift
|
||||||
|
|
||||||
|
unset "${array_var_name}"
|
||||||
|
}
|
||||||
|
|
||||||
|
function mvm_mvc_array_adder() {
|
||||||
|
local array_var_name
|
||||||
|
array_var_name=${1}; shift
|
||||||
|
# shellcheck disable=SC2178 # shellcheck doesn't grok references to arrays
|
||||||
|
local -n array_ref=${array_var_name}
|
||||||
|
|
||||||
|
array_ref+=( "${@}" )
|
||||||
|
}
|
||||||
|
|
||||||
|
# iteration_helper is optional
|
||||||
|
function mvm_mvc_array_iteration_helper() {
|
||||||
|
local key array_var_name callback
|
||||||
|
key=${1}; shift
|
||||||
|
array_var_name=${1}; shift
|
||||||
|
callback=${1}; shift
|
||||||
|
# rest are extra args passed to cb
|
||||||
|
|
||||||
|
# shellcheck disable=SC2178 # shellcheck doesn't grok references to arrays
|
||||||
|
local -n array_ref=${array_var_name}
|
||||||
|
"${callback}" "${@}" "${key}" "${array_var_name}" "${array_ref[@]}"
|
||||||
|
}
|
||||||
|
|
||||||
|
# Map mvm. When adding elements to the mvc, it is expected that the
|
||||||
|
# number of items passed will be even. Odd elements will be used as
|
||||||
|
# keys, even elements will be used as values.
|
||||||
|
#
|
||||||
|
# No iteration helper.
|
||||||
|
|
||||||
|
function mvm_mvc_map_constructor() {
|
||||||
|
local map_var_name
|
||||||
|
map_var_name=${1}; shift
|
||||||
|
|
||||||
|
declare -g -A "${map_var_name}"
|
||||||
|
|
||||||
|
local -n map_ref=${map_var_name}
|
||||||
|
map_ref=()
|
||||||
|
}
|
||||||
|
|
||||||
|
function mvm_mvc_map_destructor() {
|
||||||
|
local map_var_name
|
||||||
|
map_var_name=${1}; shift
|
||||||
|
|
||||||
|
unset "${map_var_name}"
|
||||||
|
}
|
||||||
|
|
||||||
|
function mvm_mvc_map_adder() {
|
||||||
|
local map_var_name
|
||||||
|
map_var_name=${1}; shift
|
||||||
|
# shellcheck disable=SC2178 # shellcheck doesn't grok references to arrays
|
||||||
|
local -n map_ref=${map_var_name}
|
||||||
|
|
||||||
|
while [[ ${#} -gt 1 ]]; do
|
||||||
|
# shellcheck disable=SC2034 # it's a reference to external variable
|
||||||
|
map_ref["${1}"]=${2}
|
||||||
|
shift 2
|
||||||
|
done
|
||||||
|
}
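The pairing rule in the map adder above (odd positional arguments become keys, even ones become values) can be sketched in isolation. This is a minimal demo with the mvm plumbing (mvm_call, nameref lookup) left out; `demo_map` and `demo_map_adder` are made-up names, not part of the library.

```shell
# Standalone sketch of the map-adder pairing rule: consume arguments
# two at a time, first as key, second as value. A trailing unpaired
# argument is silently ignored, matching the loop condition above.
declare -A demo_map=()
function demo_map_adder() {
    while [[ ${#} -gt 1 ]]; do
        demo_map["${1}"]=${2}
        shift 2
    done
}
demo_map_adder alpha 1 beta 2 gamma 3
echo "${demo_map[beta]}"   # prints '2'
```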

# Set mvm. Behaves like the array mvm, but all elements in each set
# are unique and the order of the elements is not guaranteed to be the
# same as the order of insertion.

function mvm_mvc_set_constructor() {
    local set_var_name
    set_var_name=${1}; shift

    declare -g -A "${set_var_name}"

    # shellcheck disable=SC2178 # shellcheck does not grok refs
    local -n set_ref=${set_var_name}
    set_ref=()
}

function mvm_mvc_set_destructor() {
    local set_var_name
    set_var_name=${1}

    unset "${set_var_name}"
}

function mvm_mvc_set_adder() {
    local set_var_name
    set_var_name=${1}; shift

    # shellcheck disable=SC2178 # shellcheck doesn't grok references to arrays
    local -n set_ref=${set_var_name}
    while [[ ${#} -gt 0 ]]; do
        set_ref["${1}"]=x
        shift
    done
}

# iteration_helper is optional
function mvm_mvc_set_iteration_helper() {
    local key set_var_name callback

    key=${1}; shift
    set_var_name=${1}; shift
    callback=${1}; shift
    # rest are extra args passed to cb

    # shellcheck disable=SC2178 # shellcheck doesn't grok references to arrays
    local -n set_ref=${set_var_name}
    "${callback}" "${@}" "${key}" "${set_var_name}" "${!set_ref[@]}"
}

fi
3247	pkg_auto/impl/pkg_auto_lib.sh	Normal file
File diff suppressed because it is too large.
274	pkg_auto/impl/print_profile_tree.sh	Executable file
@@ -0,0 +1,274 @@
#!/bin/bash

##
## Prints profile information in the form of an inheritance tree
## and/or evaluation order.
##
## Parameters:
## -h: this help
## -ni: no inheritance tree
## -ne: no evaluation order
## -nh: no headers
##
## Environment variables:
## ROOT
##

set -euo pipefail

source "$(dirname "${BASH_SOURCE[0]}")/util.sh"

: "${ROOT:=/}"

print_inheritance_tree=x
print_evaluation_order=x
print_headers=x

while [[ ${#} -gt 0 ]]; do
    case ${1} in
        -h)
            print_help
            exit 0
            ;;
        -ni)
            print_inheritance_tree=
            ;;
        -ne)
            print_evaluation_order=
            ;;
        -nh)
            print_headers=
            ;;
        *)
            fail "unknown flag ${1}"
            ;;
    esac
    shift
done

all_repo_names=()
read -a all_repo_names -r < <(portageq get_repos "${ROOT}")

declare -A repo_data repo_data_r
# name to path
repo_data=()
# path to name
repo_data_r=()

for repo_name in "${all_repo_names[@]}"; do
    repo_path=$(portageq get_repo_path "${ROOT}" "${repo_name}")
    repo_path=$(realpath "${repo_path}")
    repo_data["${repo_name}"]="${repo_path}"
    repo_data_r["${repo_path}"]="${repo_name}"
done

unset all_repo_names

function get_repo_from_profile_path() {
    local path
    path=${1}; shift
    local -n repo_dir_ref=${1}; shift

    # shellcheck disable=SC2034 # it's a reference to external variable
    repo_dir_ref="${path%/profiles/*}"
}

function repo_path_to_name() {
    local path
    path=${1}; shift
    local -n name_ref=${1}; shift

    # shellcheck disable=SC2034 # it's a reference to external variable
    name_ref=${repo_data_r["${path}"]:-'<unknown>'}
}

function repeat_string() {
    local str ntimes out_str_var_name
    str=${1}; shift
    ntimes=${1}; shift
    out_str_var_name=${1}; shift
    local -n out_str_ref="${out_str_var_name}"

    if [[ ${ntimes} -eq 0 ]]; then
        out_str_ref=""
        return 0
    elif [[ ${ntimes} -eq 1 ]]; then
        out_str_ref="${str}"
        return 0
    fi
    local add_one
    add_one=$((ntimes % 2))
    repeat_string "${str}${str}" $((ntimes / 2)) "${out_str_var_name}"
    if [[ ${add_one} -gt 0 ]]; then
        out_str_ref+="${str}"
    fi
}
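The repeat_string recursion above halves the count and doubles the string, appending one extra copy when the count is odd, so the work is logarithmic in the repeat count. A minimal self-contained sketch of the same doubling scheme (`repeat_string_demo` is a made-up name, independent of the script's nameref helper):

```shell
# Doubling recursion: repeat(s, n) = repeat(ss, n/2), plus one extra
# copy of s when n is odd. The result is written through a nameref.
function repeat_string_demo() {
    local str=${1} ntimes=${2} out_var=${3}
    local -n out_ref=${out_var}
    if [[ ${ntimes} -eq 0 ]]; then
        out_ref=''
        return 0
    elif [[ ${ntimes} -eq 1 ]]; then
        out_ref=${str}
        return 0
    fi
    local add_one=$((ntimes % 2))
    repeat_string_demo "${str}${str}" $((ntimes / 2)) "${out_var}"
    if [[ ${add_one} -gt 0 ]]; then
        out_ref+=${str}
    fi
}
result=
repeat_string_demo '| ' 3 result
echo "${result}"   # prints '| | | '
```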

function process_profile() {
    local repo_name profile_path
    repo_name=${1}; shift
    profile_path=${1}; shift
    local -n children_ref=${1}; shift

    local parent_file line pp_new_repo_name new_profile_path pp_new_repo_path
    local -a children

    parent_file="${profile_path}/parent"
    children=()
    if [[ -e ${parent_file} ]]; then
        while read -r line; do
            if [[ ${line} = *:* ]]; then
                pp_new_repo_name=${line%%:*}
                if [[ -z ${pp_new_repo_name} ]]; then
                    pp_new_repo_name=${repo_name}
                fi
                pp_new_repo_path=${repo_data["${pp_new_repo_name}"]}
                new_profile_path="${pp_new_repo_path}/profiles/${line#*:}"
                children+=( "${pp_new_repo_name}" "${new_profile_path}" )
            elif [[ ${line} = /* ]]; then
                pp_new_repo_path=
                get_repo_from_profile_path "${line}" pp_new_repo_path
                pp_new_repo_name=
                repo_path_to_name "${pp_new_repo_path}" pp_new_repo_name
                children+=( "${pp_new_repo_name}" "${line}" )
            else
                pp_new_repo_path=$(realpath "${profile_path}/${line}")
                children+=( "${repo_name}" "${pp_new_repo_path}" )
            fi
        done <"${parent_file}"
    fi

    # shellcheck disable=SC2034 # it's a reference to external variable
    children_ref=( "${children[@]}" )
}

function get_profile_name() {
    local repo_name profile_path
    repo_name="${1}"; shift
    profile_path="${1}"; shift
    local -n profile_name_ref=${1}; shift

    local repo_path profile_name
    repo_path=${repo_data["${repo_name}"]}
    profile_name=${profile_path#"${repo_path}/profiles/"}

    # shellcheck disable=SC2034 # it's a reference to external variable
    profile_name_ref="${profile_name}"
}

make_profile_path="${ROOT%/}/etc/portage/make.profile"
top_profile_dir_path=$(realpath "${make_profile_path}")
top_repo_path=
get_repo_from_profile_path "${top_profile_dir_path}" top_repo_path
top_repo_name=
repo_path_to_name "${top_repo_path}" top_repo_name

if [[ -n ${print_inheritance_tree} ]]; then

    set -- '0' "${top_repo_name}" "${top_profile_dir_path}"

    profile_tree=()

    while [[ ${#} -gt 2 ]]; do
        indent=${1}; shift
        repo_name=${1}; shift
        profile_path=${1}; shift

        lines=
        fork=
        if [[ ${indent} -gt 0 ]]; then
            if [[ ${indent} -gt 1 ]]; then
                repeat_string '| ' $((indent - 1)) lines
            fi
            fork='+-'
        fi
        g_profile_name=
        get_profile_name "${repo_name}" "${profile_path}" g_profile_name
        profile_tree+=( "${lines}${fork}${repo_name}:${g_profile_name}" )
        g_profile_children=()

        process_profile "${repo_name}" "${profile_path}" g_profile_children

        new_profiles=()
        new_indent=$((indent + 1))
        pc_idx=0
        while [[ $((pc_idx + 1)) -lt "${#g_profile_children[@]}" ]]; do
            new_repo_name=${g_profile_children["${pc_idx}"]}
            new_profile_path=${g_profile_children[$((pc_idx + 1))]}
            new_profiles+=( "${new_indent}" "${new_repo_name}" "${new_profile_path}" )
            pc_idx=$((pc_idx + 2))
        done

        set -- "${new_profiles[@]}" "${@}"
    done

    if [[ -n ${print_headers} ]]; then
        echo
        echo 'profile inheritance tree:'
        echo
    fi
    for line in "${profile_tree[@]}"; do
        echo "${line}"
    done

fi

if [[ -n ${print_evaluation_order} ]]; then

    set -- "${top_repo_name}" "${top_profile_dir_path}" '0'

    profile_eval=()

    while [[ ${#} -gt 2 ]]; do
        repo_name=${1}; shift
        profile_path=${1}; shift
        num_parents=${1}; shift
        # each parent is a repo name and a profile path, so two items per parent
        num_parent_items=$((num_parents * 2))
        parents=( "${@:1:${num_parent_items}}" )
        shift "${num_parent_items}"
        g_profile_children=()

        process_profile "${repo_name}" "${profile_path}" g_profile_children

        new_args=()
        if [[ ${#g_profile_children[@]} -eq 0 ]]; then
            to_evaluate=( "${repo_name}" "${profile_path}" "${parents[@]}" )
            te_idx=0
            while [[ $((te_idx + 1)) -lt "${#to_evaluate[@]}" ]]; do
                new_repo_name=${to_evaluate["${te_idx}"]}
                new_profile_path=${to_evaluate[$((te_idx + 1))]}
                g_new_profile_name=
                get_profile_name "${new_repo_name}" "${new_profile_path}" g_new_profile_name
                profile_eval+=( "${new_repo_name}:${g_new_profile_name}" )
                te_idx=$((te_idx + 2))
            done
        else
            last_idx=$(( ${#g_profile_children[@]} - 2 ))
            pc_idx=0
            while [[ $((pc_idx + 1)) -lt "${#g_profile_children[@]}" ]]; do
                new_repo_name=${g_profile_children["${pc_idx}"]}
                new_profile_path=${g_profile_children[$((pc_idx + 1))]}
                new_args+=( "${new_repo_name}" "${new_profile_path}" )
                if [[ ${pc_idx} -eq ${last_idx} ]]; then
                    new_args+=( $((num_parents + 1)) "${repo_name}" "${profile_path}" "${parents[@]}" )
                else
                    new_args+=( 0 )
                fi
                pc_idx=$((pc_idx + 2))
            done
        fi

        set -- "${new_args[@]}" "${@}"
    done

    if [[ -n ${print_headers} ]]; then
        echo
        echo 'profile evaluation order:'
        echo
    fi
    for line in "${profile_eval[@]}"; do
        echo "${line}"
    done

fi
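Both loops in the script above walk the profile tree iteratively by treating the positional parameters as a work stack: `set -- children... rest...` pushes the children of the current node in front of the remaining work. A minimal sketch of that idiom on a made-up tree (the `tree` map, node names, and `visited` array are hypothetical, not from the script):

```shell
# Depth-first traversal without recursion: pop the first positional
# parameter, then push its children back in front of the remaining ones.
declare -A tree=( [root]='a b' [a]='a1' [b]='' [a1]='' )
visited=()
set -- root
while [[ ${#} -gt 0 ]]; do
    node=${1}; shift
    visited+=( "${node}" )
    children=()
    for child in ${tree["${node}"]}; do
        children+=( "${child}" )
    done
    set -- "${children[@]}" "${@}"
done
echo "${visited[@]}"   # prints 'root a a1 b'
```

The profile scripts use the same trick with richer per-node frames (indent level or parent list) instead of bare node names.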
206	pkg_auto/impl/sort_packages_list.py	Executable file
@@ -0,0 +1,206 @@
#!/usr/bin/python3

# The package list file is a document that consists of a header and an
# empty-line-separated list of package groups. The header is a list of
# all lines until the first package group. A package group is a list
# of packages of the same category and possibly some related
# comments. The comments are usually about packages that are
# temporarily excluded from the list, so they usually have two
# parts - a free-form part and a commented-out package list part;
# for example:
#
# # Temporarily excluded from automated updates, because reasons.
# #
# # sys-devel/whatever
#
# The script tries to preserve the comments and their ordering, so it
# associates the free-form part with the package name.
#
# The script also deduplicates the packages while sorting. An edge
# case is when a package appears multiple times and is not
# commented-out at least once - all commented-out entries are dropped.
#
# Implementation-wise, the document has a list of lines being a
# header, a list of free-form comments and a map of category name to a
# group. A group is a list of packages, where each package has a name,
# information whether it is commented out, and may have a free-form
# comment associated with it.

import re
import sys

class FreeForm:
    def __init__(self, lines):
        self.lines = lines

class Pkg:
    def __init__(self, idx, name, out):
        self.free_form_idx = idx
        self.name = name
        self.commented_out = out

class Group:
    def __init__(self):
        self.pkgs = []
        self.pkg_names_set = set()

class Document:
    def __init__(self):
        self.header = []
        self.free_forms = []
        self.groups = {}

class Reader:
    category_or_pkg_pattern = re.compile("^[a-z0-9-]+(?:/[A-Za-z0-9-_+]+)?$")
    parsing_header = 1
    parsing_group = 2
    parsing_comment = 3

    def __init__(self, doc):
        self.doc = doc
        self.parsing_stage = Reader.parsing_header
        self.current_comments = []
        self.free_form_idx_for_next_pkg = None

    def get_group(self, category):
        if category not in self.doc.groups:
            new_group = Group()
            self.doc.groups[category] = new_group
            return new_group
        return self.doc.groups[category]

    def add_pkg_impl(self, idx, name, out):
        category = name.split('/', 1)[0]
        group = self.get_group(category)
        if name in group.pkg_names_set:
            if not out:
                for pkg in group.pkgs:
                    if pkg.name == name:
                        pkg.commented_out = False
                        break
        else:
            group.pkg_names_set.add(name)
            group.pkgs += [Pkg(idx, name, out)]
            return True
        return False

    def add_pkg(self, name):
        if self.add_pkg_impl(self.free_form_idx_for_next_pkg, name, False):
            self.free_form_idx_for_next_pkg = None

    class CommentBatch:
        def __init__(self, ff_lines, p_lines):
            self.free_form_lines = ff_lines
            self.pkg_lines = p_lines

    def get_batches(self):
        batches = []
        free_form_lines = []
        pkg_lines = []
        for line in self.current_comments:
            line = line.lstrip('#').strip()
            if not line:
                if not pkg_lines:
                    free_form_lines += [line]
            elif Reader.category_or_pkg_pattern.match(line):
                pkg_lines += [line]
            else:
                if pkg_lines:
                    while not free_form_lines[-1]:
                        free_form_lines = free_form_lines[:-1]
                    batches += [Reader.CommentBatch(free_form_lines, pkg_lines)]
                    free_form_lines = []
                    pkg_lines = []
                free_form_lines += [line]
        self.current_comments = []
        if free_form_lines or pkg_lines:
            batches += [Reader.CommentBatch(free_form_lines, pkg_lines)]
        return batches

    def process_current_comments(self):
        for batch in self.get_batches():
            free_form_idx = None
            if batch.free_form_lines:
                free_form_idx = len(self.doc.free_forms)
                self.doc.free_forms += [FreeForm(batch.free_form_lines)]
            if batch.pkg_lines:
                for line in batch.pkg_lines:
                    self.add_pkg_impl(free_form_idx, line, True)
            else:
                self.free_form_idx_for_next_pkg = free_form_idx

    def read(self, input):
        while line := input.readline():
            line = line.strip()
            if self.parsing_stage == Reader.parsing_header:
                if not line:
                    self.parsing_stage = Reader.parsing_group
                elif line.startswith('#'):
                    self.doc.header += [line]
                else:
                    self.parsing_stage = Reader.parsing_group
                    self.add_pkg(line)
            elif self.parsing_stage == Reader.parsing_group:
                if not line:
                    pass
                elif line.startswith('#'):
                    self.current_comments += [line]
                    self.parsing_stage = Reader.parsing_comment
                else:
                    self.add_pkg(line)
            elif self.parsing_stage == Reader.parsing_comment:
                if not line:
                    self.parsing_stage = Reader.parsing_group
                    self.process_current_comments()
                elif line.startswith('#'):
                    self.current_comments += [line]
                else:
                    self.parsing_stage = Reader.parsing_group
                    self.process_current_comments()
                    self.add_pkg(line)
        if self.current_comments:
            self.process_current_comments()

class Writer:
    def __init__(self, doc):
        self.doc = doc

    def write(self, output):
        output_lines = []
        if self.doc.header:
            output_lines += self.doc.header
            output_lines += ['']
        for category in sorted(self.doc.groups):
            last_free_form_idx = None
            for pkg in sorted(self.doc.groups[category].pkgs, key=lambda pkg: pkg.name):
                if pkg.free_form_idx != last_free_form_idx:
                    last_free_form_idx = pkg.free_form_idx
                    if pkg.free_form_idx is not None:
                        for line in self.doc.free_forms[pkg.free_form_idx].lines:
                            if line:
                                output_lines += [f"# {line}"]
                            else:
                                output_lines += ['#']
                if pkg.commented_out:
                    output_lines += [f"# {pkg.name}"]
                else:
                    output_lines += [f"{pkg.name}"]
            output_lines += ['']
        while not output_lines[0]:
            output_lines = output_lines[1:]
        while not output_lines[-1]:
            output_lines = output_lines[:-1]
        for line in output_lines:
            print(line, file=output)

if __name__ == "__main__":
    if len(sys.argv) != 2:
        print(f"1 argument expected, got {len(sys.argv) - 1}", file=sys.stderr)
        sys.exit(1)
    filename = sys.argv[1]
    doc = Document()
    with open(filename, 'r', encoding='UTF-8') as file:
        reader = Reader(doc)
        reader.read(file)
    writer = Writer(doc)
    writer.write(sys.stdout)
316	pkg_auto/impl/sync_with_gentoo.sh	Executable file
@@ -0,0 +1,316 @@
|
|||||||
|
#!/bin/bash
|
||||||
|
|
||||||
|
##
|
||||||
|
## Used for syncing with gentoo. Needs to be called from the
|
||||||
|
## toplevel-directory of portage-stable. If syncing everything or
|
||||||
|
## syncing metadata/glsa specifically, it is expected that the Gentoo
|
||||||
|
## repo will have the GLSA files stored in metadata/glsa too.
|
||||||
|
##
|
||||||
|
## Parameters:
|
||||||
|
## -h: this help
|
||||||
|
## -b: be brief, print only names of changed entries and errors
|
||||||
|
## -s: skip adding source git commit hash information to commits
|
||||||
|
##
|
||||||
|
## Positional:
|
||||||
|
## 0: Gentoo repository
|
||||||
|
## #: Entries to update (can be a package name, eclass, category, some special
|
||||||
|
## directories like profiles or . for everything)
|
||||||
|
##
|
||||||
|
## Example invocations:
|
||||||
|
##
|
||||||
|
## sync_with_gentoo -h
|
||||||
|
##
|
||||||
|
## Print a help message.
|
||||||
|
##
|
||||||
|
## sync_with_gentoo dev-libs/nettle app-crypt/argon2
|
||||||
|
##
|
||||||
|
## This will update the packages, each in a separate commit. The
|
||||||
|
## commit message will contain the commit hash from gentoo repo.
|
||||||
|
##
|
||||||
|
## sync_with_gentoo dev-libs
|
||||||
|
##
|
||||||
|
## This will update all the packages in dev-libs category. The
|
||||||
|
## commit message will contain the commit hash from gentoo repo.
|
||||||
|
##
|
||||||
|
|
||||||
|
set -euo pipefail
|
||||||
|
|
||||||
|
source "$(dirname "${BASH_SOURCE[0]}")/util.sh"
|
||||||
|
|
||||||
|
BRIEF=
|
||||||
|
SKIP_GIT_INFO=
|
||||||
|
|
||||||
|
while true; do
|
||||||
|
case ${1-} in
|
||||||
|
-h)
|
||||||
|
print_help
|
||||||
|
exit 0
|
||||||
|
;;
|
||||||
|
-b)
|
||||||
|
BRIEF=x
|
||||||
|
shift
|
||||||
|
;;
|
||||||
|
-s)
|
||||||
|
SKIP_GIT_INFO=x
|
||||||
|
shift
|
||||||
|
;;
|
||||||
|
--)
|
||||||
|
shift
|
||||||
|
break
|
||||||
|
;;
|
||||||
|
-*)
|
||||||
|
fail "unknown flag '${1}'"
|
||||||
|
;;
|
||||||
|
*)
|
||||||
|
break
|
||||||
|
;;
|
||||||
|
esac
|
||||||
|
done

if [[ $# -lt 2 ]]; then
    fail 'expected at least two positional parameters: a Gentoo repository and at least one package; use -h to print help'
fi

if [[ ! -e 'profiles/repo_name' ]]; then
    fail 'sync is only possible from the top-level directory of the ebuild packages repository (a directory from which "./profiles/repo_name" is accessible)'
fi

function vcall() {
    if [[ -z ${BRIEF} ]]; then
        "${@}"
    fi
}

function bcall() {
    if [[ -n ${BRIEF} ]]; then
        "${@}"
    fi
}

GENTOO=$(realpath "${1}"); shift
# rest are package names

if [[ $(realpath '.') = "${GENTOO}" ]]; then
    fail 'trying to sync within a Gentoo repo?'
fi

if [[ -z ${SKIP_GIT_INFO} ]] && [[ ! -e ${GENTOO}/.git ]]; then
    info "Skipping adding source git commit hash information to commits, ${GENTOO@Q} is not a git repository"
    SKIP_GIT_INFO=x
fi

glsa_repo=${GENTOO}/metadata/glsa
if [[ -z ${SKIP_GIT_INFO} ]] && [[ -e ${glsa_repo} ]] && [[ ! -e ${glsa_repo}/.git ]] && [[ $(git -C "${GENTOO}" status --porcelain -- metadata/glsa) = '?? metadata/glsa' ]]; then
    info "Skipping adding source git commit hash information to commits, ${glsa_repo@Q} exists, but it is neither a git repository nor a part of the Gentoo git repository"
    SKIP_GIT_INFO=x
fi
unset glsa_repo

# Synchronizes the given path with its Gentoo counterpart. Returns
# true if there were changes.
#
# Params:
#
# 1 - path within the ebuild repo
function sync_git_prepare() {
    local path
    path=${1}; shift

    local gentoo_path
    gentoo_path="${GENTOO}/${path}"

    if [[ ! -e "${gentoo_path}" ]]; then
        info "no ${path@Q} in Gentoo repository"
        if [[ ${path} = 'metadata/glsa' ]]; then
            info "did you forget to clone https://gitweb.gentoo.org/data/glsa.git/ into ${gentoo_path@Q}?"
        fi
        return 1
    fi

    local -a rsync_opts=( --archive --delete-before )

    case ${path} in
        profiles)
            rsync_opts+=( --exclude /profiles/repo_name )
            ;;
    esac

    local parent
    dirname_out "${path}" parent
    mkdir --parents "${parent}"
    rsync "${rsync_opts[@]}" "${gentoo_path}" "${parent}"
    if [[ -n $(git status --porcelain -- "${path}") ]]; then
        bcall info "updated ${path}"
        git add "${path}"
        return 0
    fi
    return 1
}

# Creates a git commit. If adding the Gentoo commit ID is enabled, the
# given path is used to get the ID of the commit with the last change
# in that path. The name parameter denotes which part has changed, and
# the sync parameter denotes whether the commit updates an existing
# package or adds a new one.
#
# Params:
#
# 1 - path
# 2 - name
# 3 - not empty if an existing package was updated, or an empty string
#     if the package is new
function commit_with_gentoo_sha() {
    local path name sync
    path=${1}; shift
    name=${1}; shift
    # the third parameter is optional, so no shift here
    sync=${1:-}

    local -a commit_extra=()
    if [[ -z ${SKIP_GIT_INFO} ]]; then
        local commit

        commit=$(git -C "${GENTOO}/${path}" log --pretty=oneline -1 -- . | cut -f1 -d' ')
        commit_extra+=( --message "It's from Gentoo commit ${commit}." )
        unset commit
    fi
    local commit_msg="${name}: Add from Gentoo"
    if [[ -n "${sync}" ]]; then
        commit_msg="${name}: Sync with Gentoo"
    fi
    git commit --quiet --message "${commit_msg}" "${commit_extra[@]}"
    GIT_PAGER='cat' vcall git show --stat
}

# Simple path sync and commit; takes the contents from Gentoo at the
# given path and puts it in the repo.
#
# Params:
#
# 1 - path to sync
# 2 - name for commit message
function path_sync() {
    local path name
    path=${1}; shift
    name=${1}; shift

    local sync
    sync=''
    if [[ -e "${path}" ]]; then
        sync='x'
    fi

    if sync_git_prepare "${path}"; then
        commit_with_gentoo_sha "${path}" "${name}" "${sync}"
    else
        vcall info "no changes in ${path}"
    fi
}

# Goes over the given directory and syncs its subdirectories or
# files. No commit is created.
#
# Params:
#
# 1 - path to the directory
function prepare_dir() {
    local dir
    dir=${1}; shift

    local pkg mod=''
    for pkg in "${dir}/"*; do
        if sync_git_prepare "${pkg}"; then
            mod=x
        fi
    done
    if [[ -n ${mod} ]]; then
        return 0
    fi
    return 1
}

# Syncs an entire category of packages and creates a commit. Note that
# if the category already exists, no new packages will be added.
#
# Params:
#
# 1 - path to the category directory
function category_sync() {
    local path
    path=${1}; shift

    if [[ ! -e "${path}" ]]; then
        if sync_git_prepare "${path}"; then
            commit_with_gentoo_sha "${path}" "${path}"
        fi
    else
        if prepare_dir "${path}"; then
            commit_with_gentoo_sha "${path}" "${path}" 'x'
        fi
    fi
}

# Syncs the entire repo. No new packages will be added.
function everything_sync() {
    local path mod=''

    for path in *; do
        case ${path} in
            licenses|profiles|scripts)
                if sync_git_prepare "${path}"; then
                    mod=x
                fi
                ;;
            metadata)
                # do only metadata updates
                if sync_git_prepare metadata/glsa; then
                    mod=x
                fi
                ;;
            eclass|virtual|*-*)
                if prepare_dir "${path}"; then
                    mod=x
                fi
                ;;
            changelog|*.md)
                # ignore those
                :
                ;;
            *)
                info "Unknown entry ${path@Q}, ignoring"
                ;;
        esac
    done
    if [[ -n ${mod} ]]; then
        commit_with_gentoo_sha '.' '*' 'x'
    fi
}

shopt -s extglob

for cpn; do
    cpn=${cpn%%*(/)}
    case ${cpn} in
        .)
            everything_sync
            ;;
        licenses|profiles|scripts)
            path_sync "${cpn}" "${cpn}"
            ;;
        eclass/*.eclass)
            path_sync "${cpn}" "${cpn%.eclass}"
            ;;
        metadata/glsa)
            path_sync "${cpn}" "${cpn}"
            ;;
        metadata)
            fail "metadata directory can't be synced, did you mean metadata/glsa?"
            ;;
        virtual/*/*|*-*/*/*)
            fail "invalid thing to sync: ${cpn}"
            ;;
        virtual/*|*-*/*)
            path_sync "${cpn}" "${cpn}"
            ;;
        eclass|virtual|*-*)
            category_sync "${cpn}"
            ;;
        *)
            fail "invalid thing to sync: ${cpn}"
            ;;
    esac
done
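The dispatch above routes each positional argument purely by its shape: a lone category name, a category/package pair, an eclass file, and so on. A standalone sketch of that pattern matching (a hypothetical `classify` helper, not part of the repo) that can be used to experiment with the globs:

```shell
#!/bin/bash
# Hypothetical classifier mirroring the case dispatch above; extglob is
# required for the *(/) pattern that strips trailing slashes.
set -euo pipefail
shopt -s extglob

function classify() {
    local cpn=${1%%*(/)}
    case ${cpn} in
        .) echo 'everything' ;;
        licenses|profiles|scripts) echo 'path' ;;
        eclass/*.eclass) echo 'eclass-file' ;;
        metadata/glsa) echo 'path' ;;
        metadata) echo 'invalid' ;;
        virtual/*/*|*-*/*/*) echo 'invalid' ;;
        virtual/*|*-*/*) echo 'package' ;;
        eclass|virtual|*-*) echo 'category' ;;
        *) echo 'invalid' ;;
    esac
}

classify 'dev-lang/python'   # prints "package"
```

Note that in `case` patterns `*` also matches `/`, so the order of the arms matters: the three-component `*-*/*/*` arm has to reject malformed arguments before the two-component `*-*/*` arm accepts packages.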
253
pkg_auto/impl/util.sh
Normal file
@ -0,0 +1,253 @@
#!/bin/bash

if [[ -z ${__UTIL_SH_INCLUDED__:-} ]]; then
__UTIL_SH_INCLUDED__=x

# Works like dirname, but without spawning new processes.
#
# Params:
#
# 1 - path to operate on
# 2 - name of a variable which will contain the dirname of the path
function dirname_out() {
    local path dir_var_name
    path=${1}; shift
    dir_var_name=${1}; shift
    local -n dir_ref=${dir_var_name}

    if [[ -z ${path} ]]; then
        dir_ref='.'
        return 0
    fi
    local cleaned_up dn
    # strip trailing slashes
    cleaned_up=${path%%*(/)}
    # strip duplicated slashes
    cleaned_up=${cleaned_up//+(\/)/\/}
    # strip last component
    dn=${cleaned_up%/*}
    if [[ -z ${dn} ]]; then
        dir_ref='/'
        return 0
    fi
    if [[ ${cleaned_up} = "${dn}" ]]; then
        dir_ref='.'
        return 0
    fi
    # shellcheck disable=SC2034 # it's a reference to an external variable
    dir_ref=${dn}
}

# Works like basename, but without spawning new processes.
#
# Params:
#
# 1 - path to operate on
# 2 - name of a variable which will contain the basename of the path
function basename_out() {
    local path base_var_name
    path=${1}; shift
    base_var_name=${1}; shift
    local -n base_ref=${base_var_name}

    if [[ -z ${path} ]]; then
        base_ref=''
        return 0
    fi
    local cleaned_up dn
    # strip trailing slashes
    cleaned_up=${path%%*(/)}
    if [[ -z ${cleaned_up} ]]; then
        base_ref='/'
        return 0
    fi
    # strip duplicated slashes
    cleaned_up=${cleaned_up//+(\/)/\/}
    # keep last component
    dn=${cleaned_up##*/}
    # shellcheck disable=SC2034 # it's a reference to an external variable
    base_ref=${dn}
}
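Both helpers above return their result through a nameref instead of printing it, so no subshell is spawned. A condensed standalone copy of the two bodies (inlined here so the sketch runs without sourcing util.sh; note that the `*(/)` and `+(/)` patterns require extglob):

```shell
#!/bin/bash
# Condensed copies of dirname_out/basename_out from util.sh, for
# illustration only; extglob is required for the extended patterns.
set -euo pipefail
shopt -s extglob

function dirname_out() {
    local path=${1}
    local -n dir_ref=${2}
    [[ -z ${path} ]] && { dir_ref='.'; return 0; }
    local cleaned_up=${path%%*(/)}        # strip trailing slashes
    cleaned_up=${cleaned_up//+(\/)/\/}    # squash duplicated slashes
    local dn=${cleaned_up%/*}             # drop the last component
    [[ -z ${dn} ]] && { dir_ref='/'; return 0; }
    [[ ${cleaned_up} = "${dn}" ]] && { dir_ref='.'; return 0; }
    dir_ref=${dn}
}

function basename_out() {
    local path=${1}
    local -n base_ref=${2}
    [[ -z ${path} ]] && { base_ref=''; return 0; }
    local cleaned_up=${path%%*(/)}        # strip trailing slashes
    [[ -z ${cleaned_up} ]] && { base_ref='/'; return 0; }
    cleaned_up=${cleaned_up//+(\/)/\/}    # squash duplicated slashes
    base_ref=${cleaned_up##*/}            # keep the last component
}

dir='' base=''
dirname_out '/usr//local/bin/' dir
basename_out '/usr//local/bin/' base
echo "${dir} ${base}"   # prints "/usr/local bin"
```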

if [[ ${BASH_SOURCE[-1]##*/} = 'util.sh' ]]; then
    THIS=${BASH}
    basename_out "${THIS}" THIS_NAME
    THIS_DIR=.
else
    THIS=${BASH_SOURCE[-1]}
    basename_out "${THIS}" THIS_NAME
    dirname_out "${THIS}" THIS_DIR
fi

THIS=$(realpath "${THIS}")
THIS_DIR=$(realpath "${THIS_DIR}")
dirname_out "${BASH_SOURCE[0]}" PKG_AUTO_IMPL_DIR
PKG_AUTO_IMPL_DIR=$(realpath "${PKG_AUTO_IMPL_DIR}")
# shellcheck disable=SC2034 # may be used by scripts sourcing this file
PKG_AUTO_DIR=$(realpath "${PKG_AUTO_IMPL_DIR}/..")

# Prints an info line.
#
# Params:
#
# @ - strings to print
function info() {
    printf '%s: %s\n' "${THIS_NAME}" "${*}"
}

# Prints info lines.
#
# Params:
#
# @ - lines to print
function info_lines() {
    printf '%s\n' "${@/#/"${THIS_NAME}: "}"
}

# Prints an info line to stderr and fails the execution.
#
# Params:
#
# @ - strings to print
function fail() {
    info "${@}" >&2
    exit 1
}

# Prints info lines to stderr and fails the execution.
#
# Params:
#
# @ - lines to print
function fail_lines() {
    info_lines "${@}" >&2
    exit 1
}

# Yells a message.
#
# Params:
#
# @ - strings to yell
function yell() {
    echo
    echo '!!!!!!!!!!!!!!!!!!'
    echo " ${*}"
    echo '!!!!!!!!!!!!!!!!!!'
    echo
}

# Prints help. Help is taken from the lines prefixed with double
# hashes in the top sourcer of this file.
function print_help() {
    if [[ ${THIS} != "${BASH}" ]]; then
        grep '^##' "${THIS}" | sed -e 's/##[[:space:]]*//'
    fi
}
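The `info_lines` helper above prefixes every argument in one expansion: `${@/#/prefix}` anchors an empty pattern at the start of each positional parameter and replaces it with the prefix. A small standalone sketch (the `THIS_NAME` value here is an assumption for the demo; util.sh derives it from `BASH_SOURCE`):

```shell
#!/bin/bash
# Standalone sketch of the prefixing trick used by info_lines:
# ${@/#/prefix} prepends the prefix to every argument.
set -euo pipefail

THIS_NAME='demo.sh'   # assumed value; normally computed via basename_out

function info_lines() {
    printf '%s\n' "${@/#/"${THIS_NAME}: "}"
}

info_lines 'first line' 'second line'
# prints:
#   demo.sh: first line
#   demo.sh: second line
```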

# Joins passed strings with a given delimiter.
#
# Params:
#
# 1 - name of a variable that will contain the joined result
# 2 - delimiter
# @ - strings to join
function join_by() {
    local output_var_name delimiter first

    output_var_name=${1}; shift
    delimiter=${1}; shift
    first=${1-}
    if shift; then
        printf -v "${output_var_name}" '%s' "${first}" "${@/#/${delimiter}}"
    else
        local -n output_ref=${output_var_name}
        # shellcheck disable=SC2034 # it's a reference to an external variable
        output_ref=''
    fi
}
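The `printf -v` trick above works because printf reapplies the `%s` format once per remaining argument, and `${@/#/${delimiter}}` prepends the delimiter to every argument after the first. A condensed standalone copy:

```shell
#!/bin/bash
# Condensed standalone copy of join_by for illustration; behavior
# should match the helper above.
set -euo pipefail

function join_by() {
    local output_var_name=${1} delimiter=${2}
    shift 2
    local first=${1-}
    if shift; then
        # '%s' is repeated for every argument; all arguments after the
        # first carry the delimiter prepended via ${@/#/...}
        printf -v "${output_var_name}" '%s' "${first}" "${@/#/${delimiter}}"
    else
        # no strings at all: the result is an empty string
        printf -v "${output_var_name}" '%s' ''
    fi
}

joined=''
join_by joined ', ' alpha beta gamma
echo "${joined}"   # prints "alpha, beta, gamma"
```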

# Checks if a directory is empty; returns true if so, otherwise false.
#
# Params:
#
# 1 - path to a directory
function dir_is_empty() {
    local dir
    dir=${1}; shift

    # nullglob makes the glob expand to nothing for an empty directory
    [[ -z $(shopt -s nullglob; echo "${dir}"/*) ]]
}

# Just like diff, but ignores the return value.
function xdiff() {
    diff "${@}" || :
}

# Just like grep, but ignores the return value.
function xgrep() {
    grep "${@}" || :
}

# Strips leading and trailing whitespace from the passed parameter.
#
# Params:
#
# 1 - string to strip
# 2 - name of a variable where the result of stripping will be stored
function strip_out() {
    local l
    l=${1}; shift
    local -n out_ref=${1}; shift

    local t
    t=${l}
    t=${t/#+([[:space:]])}
    t=${t/%+([[:space:]])}
    # shellcheck disable=SC2034 # it's a reference to an external variable
    out_ref=${t}
}
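The two substitutions above anchor an extglob pattern at the start (`/#`) and end (`/%`) of the string, each removing one run of whitespace; interior whitespace is left untouched. A condensed standalone copy:

```shell
#!/bin/bash
# Condensed standalone copy of strip_out; extglob is needed for the
# +([[:space:]]) patterns.
set -euo pipefail
shopt -s extglob

function strip_out() {
    local l=${1}
    local -n out_ref=${2}
    l=${l/#+([[:space:]])}   # drop the leading whitespace run
    l=${l/%+([[:space:]])}   # drop the trailing whitespace run
    out_ref=${l}
}

stripped=''
strip_out $'\t  keep inner  spaces \n' stripped
echo "[${stripped}]"   # prints "[keep inner  spaces]"
```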

# Gets supported architectures.
#
# Params:
#
# 1 - name of an array variable where the architectures will be stored
function get_valid_arches() {
    # shellcheck disable=SC2178 # shellcheck doesn't grok references to arrays
    local -n arches_ref=${1}; shift

    # shellcheck disable=SC2034 # it's a reference to an external variable
    arches_ref=( 'amd64' 'arm64' )
}

# Generates all pairs from a given sequence of strings. Each pair will
# be stored in the given variable and items in the pair will be
# separated by the given separator. For N strings, (N * N - N) / 2
# pairs will be generated.
#
# Params:
#
# 1 - name of an array variable where the pairs will be stored
# 2 - separator string
# @ - strings
function all_pairs() {
    # shellcheck disable=SC2178 # shellcheck doesn't grok references to arrays
    local -n pairs_ref=${1}; shift
    local sep=${1}; shift

    # indices in ${@} are 1-based; index 0 would give the script name
    local idx=1 next_idx

    pairs_ref=()
    while [[ ${idx} -lt ${#} ]]; do
        next_idx=$((idx + 1))
        while [[ ${next_idx} -le ${#} ]]; do
            pairs_ref+=( "${!idx}${sep}${!next_idx}" )
            next_idx=$((next_idx + 1))
        done
        idx=$((idx + 1))
    done
}
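For instance, three inputs yield (3 * 3 - 3) / 2 = 3 pairs. A condensed standalone copy showing the indirect expansion (`${!idx}` reads the idx-th positional parameter):

```shell
#!/bin/bash
# Condensed standalone copy of all_pairs for illustration.
set -euo pipefail

function all_pairs() {
    local -n pairs_ref=${1}; shift
    local sep=${1}; shift

    local idx=1 next_idx
    pairs_ref=()
    while [[ ${idx} -lt ${#} ]]; do
        next_idx=$((idx + 1))
        while [[ ${next_idx} -le ${#} ]]; do
            # ${!idx} is indirect expansion: the idx-th positional parameter
            pairs_ref+=( "${!idx}${sep}${!next_idx}" )
            next_idx=$((next_idx + 1))
        done
        idx=$((idx + 1))
    done
}

pairs=()
all_pairs pairs ':' amd64 arm64 riscv
echo "${pairs[@]}"   # prints "amd64:arm64 amd64:riscv arm64:riscv"
```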

fi
102
pkg_auto/inside_sdk_container.sh
Executable file
@ -0,0 +1,102 @@
#!/bin/bash

##
## Gathers information about SDK and board packages. Also collects
## info about the actual build deps of board packages, which may be
## useful for verifying whether the SDK provides those.
##
## Reports generated:
## sdk-pkgs - contains package information for the SDK
## sdk-pkgs-kv - contains package information with key values (USE, PYTHON_TARGETS, CPU_FLAGS_X86) for the SDK
## board-pkgs - contains package information for the board for the chosen architecture
## board-bdeps - contains package information with key values (USE, PYTHON_TARGETS, CPU_FLAGS_X86) of board build dependencies
## sdk-profiles - contains a list of profiles used by the SDK, in evaluation order
## board-profiles - contains a list of profiles used by the board for the chosen architecture, in evaluation order
## sdk-package-repos - contains package information with their repos for the SDK
## board-package-repos - contains package information with their repos for the board
## sdk-emerge-output - contains raw emerge output for the SDK, being a base for other reports
## board-emerge-output - contains raw emerge output for the board, being a base for other reports
## sdk-emerge-output-filtered - contains only the lines with package information for the SDK
## board-emerge-output-filtered - contains only the lines with package information for the board
## sdk-emerge-output-junk - contains only the junk lines for the SDK
## board-emerge-output-junk - contains only the junk lines for the board
## *-warnings - warnings printed by emerge or other tools
##
## Parameters:
## -h: this help
##
## Positional:
## 1 - architecture (amd64 or arm64)
## 2 - reports directory
##

set -euo pipefail

source "$(dirname "${BASH_SOURCE[0]}")/impl/util.sh"
source "${PKG_AUTO_IMPL_DIR}/inside_sdk_container_lib.sh"

while [[ ${#} -gt 0 ]]; do
    case ${1} in
        -h)
            print_help
            exit 0
            ;;
        --)
            shift
            break
            ;;
        -*)
            fail "unknown flag '${1}'"
            ;;
        *)
            break
            ;;
    esac
done

if [[ ${#} -ne 2 ]]; then
    fail 'Expected two parameters: board architecture and reports directory'
fi

arch=${1}; shift
reports_dir=${1}; shift

mkdir -p "${reports_dir}"

set_eo "${reports_dir}"

echo 'Running pretend-emerge to get a complete report for the SDK'
package_info_for_sdk >"${SDK_EO}" 2>"${SDK_EO_W}"
echo 'Running pretend-emerge to get a complete report for the board'
package_info_for_board "${arch}" >"${BOARD_EO}" 2>"${BOARD_EO_W}"

ensure_no_errors

echo 'Separating emerge info from junk in SDK emerge output'
filter_sdk_eo >"${SDK_EO_F}" 2>>"${SDK_EO_W}"
junk_sdk_eo >"${SDK_EO}-junk" 2>>"${SDK_EO_W}"
echo 'Separating emerge info from junk in board emerge output'
filter_board_eo "${arch}" >"${BOARD_EO_F}" 2>>"${BOARD_EO_W}"
junk_board_eo >"${BOARD_EO}-junk" 2>>"${BOARD_EO_W}"

ensure_valid_reports

echo 'Generating SDK packages listing'
versions_sdk >"${reports_dir}/sdk-pkgs" 2>"${reports_dir}/sdk-pkgs-warnings"
echo 'Generating SDK packages listing with key-values (USE, PYTHON_TARGETS, CPU_FLAGS_X86, etc.)'
versions_sdk_with_key_values >"${reports_dir}/sdk-pkgs-kv" 2>"${reports_dir}/sdk-pkgs-kv-warnings"
echo 'Generating board packages listing'
versions_board >"${reports_dir}/board-pkgs" 2>"${reports_dir}/board-pkgs-warnings"
echo 'Generating board packages bdeps listing'
board_bdeps >"${reports_dir}/board-bdeps" 2>"${reports_dir}/board-bdeps-warnings"
echo 'Generating SDK profiles evaluation list'
ROOT=/ "${PKG_AUTO_IMPL_DIR}/print_profile_tree.sh" -ni -nh >"${reports_dir}/sdk-profiles" 2>"${reports_dir}/sdk-profiles-warnings"
echo 'Generating board profiles evaluation list'
ROOT="/build/${arch}-usr" "${PKG_AUTO_IMPL_DIR}/print_profile_tree.sh" -ni -nh >"${reports_dir}/board-profiles" 2>"${reports_dir}/board-profiles-warnings"
echo 'Generating SDK package source information'
package_sources_sdk >"${reports_dir}/sdk-package-repos" 2>"${reports_dir}/sdk-package-repos-warnings"
echo 'Generating board package source information'
package_sources_board >"${reports_dir}/board-package-repos" 2>"${reports_dir}/board-package-repos-warnings"

echo 'Cleaning empty warning files'
clean_empty_warning_files "${reports_dir}"
60
pkg_auto/sync_packages.sh
Executable file
@ -0,0 +1,60 @@
#!/bin/bash

##
## Syncs the packages with Gentoo.
##
## Parameters:
## -w: path to use for workdir
## -h: this help
##
## Positional:
## 1: config file
## 2: new branch name with updates
## 3: gentoo repo
##

set -euo pipefail

source "$(dirname "${BASH_SOURCE[0]}")/impl/util.sh"
source "${PKG_AUTO_IMPL_DIR}/pkg_auto_lib.sh"

workdir=''

while [[ ${#} -gt 0 ]]; do
    case ${1} in
        -h)
            print_help
            exit 0
            ;;
        -w)
            if [[ -z ${2:-} ]]; then
                fail 'missing value for -w'
            fi
            workdir=${2}
            shift 2
            ;;
        --)
            shift
            break
            ;;
        -*)
            fail "unknown flag '${1}'"
            ;;
        *)
            break
            ;;
    esac
done

if [[ ${#} -ne 3 ]]; then
    fail 'expected three positional parameters: a config file, a final branch name and a path to a Gentoo repo'
fi

config_file=${1}; shift
saved_branch_name=${1}; shift
gentoo=${1}; shift

setup_workdir_with_config "${workdir}" "${config_file}"
perform_sync_with_gentoo "${gentoo}"
save_new_state "${saved_branch_name}"
60
pkg_auto/update_packages.sh
Executable file
@ -0,0 +1,60 @@
#!/bin/bash

##
## Updates the packages.
##
## Parameters:
## -w: path to use for workdir
## -h: this help
##
## Positional:
## 1: config file
## 2: new branch name with updates
## 3: gentoo repo
##

set -euo pipefail

source "$(dirname "${BASH_SOURCE[0]}")/impl/util.sh"
source "${PKG_AUTO_IMPL_DIR}/pkg_auto_lib.sh"

workdir=''

while [[ ${#} -gt 0 ]]; do
    case ${1} in
        -h)
            print_help
            exit 0
            ;;
        -w)
            if [[ -z ${2:-} ]]; then
                fail 'missing value for -w'
            fi
            workdir=${2}
            shift 2
            ;;
        --)
            shift
            break
            ;;
        -*)
            fail "unknown flag '${1}'"
            ;;
        *)
            break
            ;;
    esac
done

if [[ ${#} -ne 3 ]]; then
    fail 'expected three positional parameters: a config file, a final branch name and a path to a Gentoo repo'
fi

config_file=${1}; shift
saved_branch_name=${1}; shift
gentoo=${1}; shift

setup_workdir_with_config "${workdir}" "${config_file}"
perform_sync_with_gentoo "${gentoo}"
save_new_state "${saved_branch_name}"
generate_package_update_reports