Ubuntu 20.04 is EOL. Per our support and package policies we no longer
need to develop or test for that platform.
Signed-off-by: Ryan Cragun <me@ryan.ec>
While working on VAULT-34829 it became apparent that if our new backporter
could know which branches are active and which CE counterparts they map to,
we could omit the need for `ce` backport labels entirely and instead
automatically backport to the corresponding active CE branches.
To facilitate that we can re-use our `.release/versions.hcl` file as it is
the current source of truth for our present backport assistant workflow.
Here we add a new `pipeline releases list versions` command that is capable
of decoding that file and optionally displaying it. It will be used in the
next PR that fully implements VAULT-34829.
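As a quick sketch of the intended usage (the exact flags are assumptions at
this stage, not a final interface), the command is invoked like the rest of
the `pipeline` tool:
```shell
# Decode .release/versions.hcl and display the versions it declares.
# Illustrative invocation; exact flags may differ.
go run ./tools/pipeline/... releases list versions
```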
As part of this work we refactored `pipeline releases` to include a new `list`
sub-command and moved both `list-active-versions` and `versions` under it.
We also include a few small fixes that were noticed:
- `.release/versions.hcl` was not up-to-date
- Our cached dynamic config was not getting recreated when the pipeline
tool changed. That has been fixed so now dynamic config should always
get recreated when the pipeline binary changes
- We now initialize a git client when using the `github` sub-command.
This will be used in more forthcoming work
- Update our changed file detection to resolve some incorrect groupings
- Add some additional changed file helpers that will be used in forthcoming
work
Signed-off-by: Ryan Cragun <me@ryan.ec>
* enos(artifactory): unify dev and test scenario artifactory metadata into new module
There was previously a lot of shared logic between
`build_artifactory_artifact` and `build_artifactory_package` with regard
to building an artifact name. When it comes down to it, both modules are
very similar; their only major difference is whether they search for any
artifact (released or not) by a combination of `revision`, `edition`,
`version`, and `type`, or for a released artifact by a combination of
`version`, `edition`, and `type`.
Rather than bolt on new `s390x` and `fips1403` artifact metadata to
both, I factored their metadata for package names and such into a
unified and shared `artifact/metadata` module that is now called by
both.
This was tricky as dev and test scenarios currently differ in what
we pass in as the `vault_version`, but we hope to remove that
difference soon. We also add metadata support for the forthcoming
FIPS 140-3.
This commit was tested extensively along with other test scenarios in
support of `s390x`, but it will be useful immediately for FIPS 140-3,
so I've extracted it out.
Signed-off-by: Ryan Cragun <me@ryan.ec>
* Fix artifactory metadata before merge
The initial pass of the artifactory metadata was largely untested and
extracted from a different branch. After testing, this commit fixes a
few issues with the metadata module.
In order to test this I also had to fix an issue where AWS secrets
engine testing became a requirement but is impossible unless you execute
against a blessed AWS account that has the required roles. Instead, we now
make that verification opt-in via a new variable.
We also make some improvements to the pki-verify-certificates script so
that it works reliably against all our supported distros.
We also update our dynamic configuration to use the updated versions in
samples.
Signed-off-by: Ryan Cragun <me@ryan.ec>
`gosimports` is the preferred style for module imports and it is
enforced via CI. I've found that things often manage to drift, so I've
taken the liberty of updating our pre-commit hook to verify our import
formatting before a change is committed.
Along with updating the formatting helper I've also run `make fmt` to
resolve any formatting drift that managed to make it into the codebase.
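As a rough sketch of what the updated hook enforces (illustrative, not the
literal hook script), the check amounts to:
```shell
#!/usr/bin/env bash
# Illustrative pre-commit check: reject the commit if gosimports reports
# any staged Go files whose import formatting has drifted.
files=$(git diff --cached --name-only --diff-filter=ACM | grep '\.go$' || true)
if [ -n "$files" ]; then
  drift=$(gosimports -l $files)
  if [ -n "$drift" ]; then
    echo "import formatting drift detected; run 'make fmt' and re-stage:"
    echo "$drift"
    exit 1
  fi
fi
```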
Signed-off-by: Ryan Cragun <me@ryan.ec>
* VAULT-33074: add `github` sub-command to `pipeline`
Investigating test workflow failures is a common task that engineers on the
sustaining rotation perform. This task often requires quite a bit of
manual labor: inspecting all failed/cancelled workflows in the GitHub UI
on a per repo/branch/workflow basis and performing root cause analysis.
As we work to improve our pipeline discoverability, this PR adds a new `github`
sub-command to the `pipeline` utility that allows querying for such workflows
and returning machine-readable or human-readable summaries in a single place.
Eventually we plan to send a summary of this data to an OTEL collector
automatically, but for now sustaining engineers can use it to query for
workflows using a wide range of criteria.
A common pattern for investigating build/enos test failure workflows would be:
```shell
export GITHUB_TOKEN="YOUR_TOKEN"
go run -race ./tools/pipeline/... github list-workflow-runs -o hashicorp -r vault -d '2025-01-13..2025-01-23' --branch main --status failure build
```
This will list `build` workflow runs in `hashicorp/vault` repo for the
`main` branch with the `status` or `conclusion` of `failure` within the date
range of `2025-01-13..2025-01-23`.
A sustaining engineer will likely do this for both `vault` and
`vault-enterprise` repositories along with `enos-release-testing-oss` and
`enos-release-testing-ent` workflows in addition to `build` in order to
get a full picture of the last week's failures.
You can also use this utility to summarize workflows based on other
statuses, branches, HEAD SHAs, event triggers, GitHub actors, etc. For
a full list of filter arguments you can pass `-h` to the sub-command.
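For example:
```shell
# Print every supported filter flag for the sub-command.
go run -race ./tools/pipeline/... github list-workflow-runs -h
```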
> [!CAUTION]
> Be careful not to run this without setting strict filter arguments.
> Failing to do so could result in trying to summarize way too many
> workflows resulting in your API token being disabled for an hour.
Signed-off-by: Ryan Cragun <me@ryan.ec>
As the Vault pipeline and release processes evolve over time, so too must the tooling that drives them. Historically we've utilized a combination of CI features and shell scripts wrapped into make targets to drive our CI. While this
approach has worked, it requires careful consideration of which features to use (bash in CI almost never matches bash on developer machines, etc.) and often requires a deep understanding of several CLI tools (jq, etc.). `make` itself also has limitations in user experience, e.g. passing flags.
As we're all in on GitHub Actions as our pipeline coordinator, continuing to build and utilize CLI tools to perform our pipeline tasks makes sense. This PR adds a new CLI tool called `pipeline` which we can use to build new isolated tasks that we can string together in GitHub Actions. We intend to use this utility as the interface for future release automation work; see VAULT-27514.
For the first task in this new `pipeline` tool, I've chosen to build two small sub-commands:
* `pipeline releases list-versions` - Allows us to list Vault versions within a range. The range is configurable either by setting `--upper` and/or `--lower` bounds, or by using `--nminus` to go back N-X versions from the current branch's version. As CE and ENT do not have version parity we also consider the `--edition` flag, as well as zero or more `--skip` flags to exclude specific versions.
* `pipeline generate enos-dynamic-config` - Creates dynamic enos configuration based on the branch and the current list of release versions. It takes largely the same flags as the `releases list-versions` command, but it also expects a `--dir` for the enos directory and a `--file` where the dynamic configuration will be written. This allows us to dynamically update and feed the latest versions into our sampling algorithm to get coverage over all supported prior versions. Example invocations are sketched below.
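As an example of how the two commands fit together (the flag values here are illustrative only, not prescriptive):
```shell
# List the CE versions from the current branch version back to N-2,
# excluding one specific version. Values are examples, not real releases.
go run ./tools/pipeline/... releases list-versions --edition ce --nminus 2 --skip 1.17.3

# Generate the enos dynamic configuration into the enos directory.
go run ./tools/pipeline/... generate enos-dynamic-config --dir ./enos --file dynamic-config.hcl
```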
We then integrate these new tools into the pipeline itself and cache the dynamic config on a weekly basis. We also cache the pipeline tool itself as it will likely become a repository for pipeline-specific tooling. The caching strategy for the `pipeline` tool will make most workflows that require it very fast.
Signed-off-by: Ryan Cragun <me@ryan.ec>