armbian_build/lib/tools/info/output-gha-matrix.py
Ricardo Pardini b92575381a pipeline: inventory all board vars; add not-eos-with-video; introduce TARGETS_FILTER_INCLUDE
> How to use:
>
> `./compile.sh inventory` - does just the board inventory; look for output in `output/info`
>
> `./compile.sh targets-dashboard` - does inventory, targets compositing, and images info; look for output in `output/info`, read the instructions output by the command if you want to load the OpenSearch dashboards.
>
> `./compile.sh targets` - does the full targets compositing and artifacts, look for output in `output/info`
>
> If you don't have a `userpatches/targets.yaml`, _one will be provided for you_ defaulting to Jammy minimal CLI
> and Jammy xfce desktop, for all boards in all branches. You can pass filters via `TARGETS_FILTER_INCLUDE=...` to narrow.
>

- board JSON inventory:
  - more generic regex parsing of variables from board files:
    - all top-level (non-indented) variables are parsed and included in the JSON board inventory
    - this allows us to add new variables to the board files without having to update the parser
    - variables can be bare, `export` or `declare -g`, but **_must_ be quoted** (single or double) and UPPER_CASE
  - some special treatment for certain variables:
    - `KERNEL_TARGET` is parsed as a _comma-separated_ list of valid `BRANCH`es
    - `BOARD_MAINTAINER` is parsed as a _space-separated_ list of valid maintainer GitHub usernames, emitted as `BOARD_MAINTAINERS: [...]` in the JSON
      - the script complains if `BOARD_MAINTAINER` is not set in core boards; an empty value is still allowed
    - `HAS_VIDEO_OUTPUT="no"` causes `BOARD_HAS_VIDEO: false` in the JSON (for desktop-only inventorying, see below)
- introduce `not-eos-with-video` in `items-from-inventory` at the targets compositor
  - the same as `not-eos`, but with added `BOARD_HAS_VIDEO: true` filter, see above
- introduce `TARGETS_FILTER_INCLUDE` for targets compositor
  - this filters the targets _after_ compositing (but before getting image info), based on the board inventory data
  - it's a comma-separated list of `key:value` pairs, which are OR-ed together
  - a new virtual key `BOARD_SLASH_BRANCH` is added to the post-compositing inventory, for filtering a specific BOARD/BRANCH combo (e.g. `odroidhc4/edge`)
  - some interesting possible filters:
    - `TARGETS_FILTER_INCLUDE="BOARD:odroidhc4"`: _only_ build a single board, all branches. JIRA [AR-1806]
    - `TARGETS_FILTER_INCLUDE="BOARD_SLASH_BRANCH:odroidhc4/current"`: _only_ build a single board/branch combo
    - `TARGETS_FILTER_INCLUDE="BOARD:odroidhc4,BOARD:odroidn2"`: _only_ build _two_ boards, all branches.
    - `TARGETS_FILTER_INCLUDE="BOARD_MAINTAINERS:rpardini"`: build all boards and branches where rpardini is a maintainer
    - `TARGETS_FILTER_INCLUDE="BOARDFAMILY:rockchip64"`: build all boards and branches in the rockchip64 family
  - image-info-only variables like `LINUXFAMILY` are **not** available for filtering at this stage
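The OR semantics of these filters can be sketched as follows (hypothetical function for illustration; the real filter lives in the targets compositor):

```python
def matches_filter_include(target: dict, filter_include: str) -> bool:
    """Return True if the target matches any of the comma-separated
    key:value pairs (OR semantics). Illustrative sketch only."""
    if not filter_include:
        return True  # no filter: keep everything
    for pair in filter_include.split(","):
        key, _, value = pair.partition(":")
        candidate = target.get(key)
        # list-valued keys (e.g. BOARD_MAINTAINERS) match if they contain the value
        if isinstance(candidate, list):
            if value in candidate:
                return True
        elif candidate == value:
            return True
    return False
```

Note `partition(":")` splits only on the first colon, so values containing `/` (like `odroidhc4/edge` for `BOARD_SLASH_BRANCH`) pass through intact.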
- rename `config/templates` `targets-all-cli.yaml` to `targets-default.yaml`
    - this is used when no `userpatches/targets.yaml` is found
    - new default includes all boards vs branches for non-EOS boards
        - also desktop for all boards that _don't_ have `HAS_VIDEO_OUTPUT='no'`
- introduce simplified `targets-dashboard` CLI:
  - does only inventory, compositing, and image info, but not artifact reducing, etc.
  - ignore desktop builds in the OpenSearch indexer
  - update the OpenSearch Dashboards, including new information now available
- invert the logic used for `CLEAN_INFO` and `CLEAN_MATRIX`
    - defaults to `yes` now, so new users/CI don't get hit by stale caches by default
    - the repo pipeline CLIs usually run against saved/restored `output/info` artifacts, so those don't clean by default
2023-07-26 15:15:02 +02:00


#!/usr/bin/env python3
#
# SPDX-License-Identifier: GPL-2.0
# Copyright (c) 2023 Ricardo Pardini <ricardo@pardini.net>
# This file is a part of the Armbian Build Framework https://github.com/armbian/build/
#
import json
import logging
import os
import sys
sys.path.append(os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
from common import armbian_utils
from common import gha
# Prepare logging
armbian_utils.setup_logging()
log: logging.Logger = logging.getLogger("output-gha-matrix")
def resolve_gha_runner_tags_via_pipeline_gha_config(input: dict, artifact_name: str, artifact_arch: str):
    log.debug(f"Resolving GHA runner tags for artifact/image '{artifact_name}' '{artifact_arch}'")
    # if no config, default to "ubuntu-latest" as a last resort
    ret = "ubuntu-latest"
    if "pipeline" not in input:
        log.warning(f"No 'pipeline' config in input, defaulting to '{ret}'")
        return ret
    pipeline = input["pipeline"]
    if "gha" not in pipeline:
        log.warning(f"No 'gha' config in input.pipeline, defaulting to '{ret}'")
        return ret
    gha = pipeline["gha"]
    if (gha is None) or ("runners" not in gha):
        log.warning(f"No 'runners' config in input.pipeline.gha, defaulting to '{ret}'")
        return ret
    runners = gha["runners"]
    if "default" in runners:
        ret = runners["default"]
        log.debug(f"Found 'default' config in input.pipeline.gha.runners, defaulting to '{ret}'")
    # Now, 'by-name' first.
    if "by-name" in runners:
        by_names = runners["by-name"]
        if artifact_name in by_names:
            ret = by_names[artifact_name]
            log.debug(f"Found 'by-name' value '{artifact_name}' config in input.pipeline.gha.runners, using '{ret}'")
    # Now, 'by-name-and-arch' second.
    artifact_name_and_arch = f"{artifact_name}{f'-{artifact_arch}' if artifact_arch is not None else ''}"
    if "by-name-and-arch" in runners:
        by_names_and_archs = runners["by-name-and-arch"]
        if artifact_name_and_arch in by_names_and_archs:
            ret = by_names_and_archs[artifact_name_and_arch]
            log.debug(f"Found 'by-name-and-arch' value '{artifact_name_and_arch}' config in input.pipeline.gha.runners, using '{ret}'")
    log.debug(f"Resolved GHA runs_on for name:'{artifact_name}' arch:'{artifact_arch}' to runs_on:'{ret}'")
    return ret

def generate_matrix_images(info) -> list[dict]:
    # each image
    matrix = []
    for image_id in info["images"]:
        image = info["images"][image_id]
        if armbian_utils.get_from_env("IMAGES_ONLY_OUTDATED_ARTIFACTS") == "yes":
            log.info(f"IMAGES_ONLY_OUTDATED_ARTIFACTS is set: outdated artifacts: {image['outdated_artifacts_count']} for image {image_id}")
            skip = image["outdated_artifacts_count"] == 0
            if skip:
                log.warning(f"Skipping image {image_id} because it has no outdated artifacts")
                continue
        if armbian_utils.get_from_env("SKIP_IMAGES") == "yes":
            log.warning(f"Skipping image {image_id} because SKIP_IMAGES=yes")
            continue
        desc = f"{image['image_file_id']} {image_id}"
        inputs = image['in']
        image_arch = image['out']['ARCH']
        runs_on = resolve_gha_runner_tags_via_pipeline_gha_config(inputs, "image", image_arch)
        cmds = (armbian_utils.map_to_armbian_params(inputs["vars"], True) + inputs["configs"])  # image build is the "build" command, omitted here
        invocation = " ".join(cmds)
        item = {"desc": desc, "runs_on": runs_on, "invocation": invocation}
        matrix.append(item)
    return matrix

def generate_matrix_artifacts(info):
    # each artifact
    matrix = []
    for artifact_id in info["artifacts"]:
        artifact = info["artifacts"][artifact_id]
        skip = bool(artifact["oci"]["up-to-date"])
        if skip:
            continue
        artifact_name = artifact['in']['artifact_name']
        desc = f"{artifact['out']['artifact_name']}={artifact['out']['artifact_version']}"
        inputs = artifact['in']['original_inputs']
        artifact_arch = None
        # Try via the inputs to the artifact...
        if "inputs" in artifact['in']:
            if "ARCH" in artifact['in']['inputs']:
                artifact_arch = artifact['in']['inputs']['ARCH']
        runs_on = resolve_gha_runner_tags_via_pipeline_gha_config(inputs, artifact_name, artifact_arch)
        cmds = (["artifact"] + armbian_utils.map_to_armbian_params(inputs["vars"], True) + inputs["configs"])
        invocation = " ".join(cmds)
        item = {"desc": desc, "runs_on": runs_on, "invocation": invocation}
        matrix.append(item)
    return matrix

# generate images or artifacts?
type_gen = sys.argv[1]

# read the outdated artifacts+images json file passed as second argument as a json object
with open(sys.argv[2]) as f:
    info = json.load(f)

matrix = None
if type_gen == "artifacts":
    matrix = generate_matrix_artifacts(info)
elif type_gen == "images":
    matrix = generate_matrix_images(info)
else:
    log.error(f"Unknown type: {type_gen}")
    sys.exit(1)

# third argument is the number of chunks wanted.
ideal_chunk_size = 150
max_chunk_size = 250
# check if sys.argv[3] exists...
if len(sys.argv) >= 4:
    num_chunks = int(sys.argv[3])
else:
    log.warning(f"Number of chunks not specified. Calculating automatically, matrix: {len(matrix)} chunk ideal: {ideal_chunk_size}.")
    # calculate num_chunks by dividing the matrix size by the ideal chunk size, rounding always up.
    num_chunks = int(len(matrix) / ideal_chunk_size) + 1
    log.warning(f"Number of chunks: {num_chunks}")

# distribute the matrix items equally along the chunks; try to keep every chunk the same size.
chunks = []
for i in range(num_chunks):
    chunks.append([])
for i, item in enumerate(matrix):
    chunks[i % num_chunks].append(item)

# ensure chunks are not too big
for i, chunk in enumerate(chunks):
    if len(chunk) > ideal_chunk_size:
        log.warning(f"Chunk '{i + 1}' is bigger than ideal: {len(chunk)}")
        if len(chunk) > max_chunk_size:
            log.error(f"Chunk '{i + 1}' is too big: {len(chunk)}")
            sys.exit(1)
    # Directly set outputs for _each_ GHA chunk here. (code below is for all the chunks)
    gha.set_gha_output(f"{type_gen}-chunk-json-{i + 1}", json.dumps({"include": chunk}))
    # An output that is used to test for an empty matrix.
    gha.set_gha_output(f"{type_gen}-chunk-not-empty-{i + 1}", "yes" if len(chunk) > 0 else "no")
    gha.set_gha_output(f"{type_gen}-chunk-size-{i + 1}", len(chunk))
    # For the full matrix, we can't have empty chunks; use a "really" field to indicate a fake entry added to make it non-empty.
    if len(chunk) == 0:
        log.warning(f"Chunk '{i + 1}' for '{type_gen}' is empty, adding fake invocation.")
        chunks[i] = [{"desc": "Fake matrix element so matrix is not empty", "runs_on": "ubuntu-latest", "invocation": "none", "really": "no"}]
    else:
        for item in chunk:
            item["really"] = "yes"

# massage the chunks so they're objects with an "include" key, the way GHA likes it.
all_chunks = {}
for i, chunk in enumerate(chunks):
    log.info(f"Chunk {i + 1} has {len(chunk)} elements.")
    all_chunks[f"chunk{i + 1}"] = {"include": chunk}

print(json.dumps(all_chunks))
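The round-robin distribution the script uses guarantees that chunk sizes never differ by more than one. The core of it, extracted as a standalone sketch:

```python
def round_robin_chunks(matrix: list, num_chunks: int) -> list:
    # same distribution as the script above: item i goes to chunk i % num_chunks,
    # so chunk sizes differ by at most one
    chunks = [[] for _ in range(num_chunks)]
    for i, item in enumerate(matrix):
        chunks[i % num_chunks].append(item)
    return chunks
```

For example, 7 matrix items over 3 chunks yields sizes 3, 2, 2 rather than 7, 0, 0, which is why the empty-chunk fake-invocation handling only triggers when the matrix itself is smaller than the chunk count.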