vault/tools/tools.sh
Ryan Cragun ce5885279b
VAULT-31181: Add pipeline tool to Vault (#28536)
As the Vault pipeline and release processes evolve over time, so too must the tooling that drives them. Historically we've driven CI with a combination of CI features and shell scripts wrapped in make targets. While this approach has worked, it requires careful consideration of which features to use (bash in CI almost never matches bash on developer machines, etc.) and often requires a deep understanding of several CLI tools (jq, etc.). `make` itself also has limitations in user experience, e.g. when passing flags.

As we're all in on GitHub Actions as our pipeline coordinator, continuing to build and use CLI tools to perform our pipeline tasks makes sense. This PR adds a new CLI tool called `pipeline` which we can use to build new isolated tasks that we can string together in GitHub Actions. We intend to use this utility as the interface for future release automation work; see VAULT-27514.

For the first tasks in this new `pipeline` tool, I've chosen to build two small sub-commands:

* `pipeline releases list-versions` - Lists Vault versions within a range. The range is configurable either by setting `--upper` and/or `--lower` bounds, or by using `--nminus` to go back N minor versions from the current branch's version. As CE and ENT do not have version parity, we also consider the `--edition` flag, as well as zero or more `--skip` flags to exclude specific versions.

* `pipeline generate enos-dynamic-config` - Creates dynamic enos configuration based on the branch and the current list of release versions. It takes largely the same flags as `releases list-versions`, but it also expects a `--dir` for the enos directory and a `--file` where the dynamic configuration will be written. This lets us dynamically feed the latest versions into our sampling algorithm and get coverage over all supported prior versions.
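
The two sub-commands above can be sketched as hypothetical invocations. Only the sub-command and flag names come from the description; every value (editions, versions, paths) is illustrative, not taken from the actual pipeline:

```shell
# List CE versions from a lower bound up to the current branch's version,
# skipping one specific release (values are illustrative):
pipeline releases list-versions --edition ce --lower 1.14.0 --skip 1.15.2

# Regenerate the dynamic enos config for the last three minor versions
# (the --dir and --file values here are hypothetical):
pipeline generate enos-dynamic-config --edition ce --nminus 3 \
  --dir enos --file dynamic_config.hcl
```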

We then integrate these new tools into the pipeline itself and regenerate the cached dynamic config on a weekly basis. We also cache the `pipeline` tool itself, as it will likely become a repository for pipeline-specific tooling; this caching strategy makes most workflows that require the tool very fast.


Signed-off-by: Ryan Cragun <me@ryan.ec>
2024-10-23 15:31:24 -06:00


#!/usr/bin/env bash
# Copyright (c) HashiCorp, Inc.
# SPDX-License-Identifier: BUSL-1.1
set -euo pipefail
# Determine the root directory of the repository.
repo_root() {
  git rev-parse --show-toplevel
}
# Install an external Go tool.
go_install() {
  if go install "$1"; then
    echo "--> $1 ✔"
  else
    echo "--> $1 ✖" 1>&2
    return 1
  fi
}
# Check for a tool binary in the path.
check_tool() {
  if builtin type -P "$2" &> /dev/null; then
    echo "--> $2 ✔"
  else
    echo "--> $2 ✖"
    echo "Could not find required $1 tool $2. Run 'make tools-$1' to install it." 1>&2
    return 1
  fi
}
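
`check_tool` leans on bash's `type -P`, which resolves a name strictly as an executable on `PATH` and ignores aliases and shell functions, unlike plain `type` or `which`. A minimal sketch of the same probe (the missing-tool name is made up):

```shell
#!/usr/bin/env bash
set -euo pipefail

# `type -P` prints the path of an executable and exits non-zero when
# the name is not found on PATH.
if builtin type -P sh &> /dev/null; then
  echo "sh: found"
fi
if ! builtin type -P no-such-tool-xyz &> /dev/null; then
  echo "no-such-tool-xyz: missing"
fi
```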
# Install external tools.
install_external() {
  local tools
  # If you update this list, please also update check_external below and our external tools
  # install action: .github/actions/install-external-tools/action.yml
  tools=(
    honnef.co/go/tools/cmd/staticcheck@latest
    github.com/bufbuild/buf/cmd/buf@v1.45.0
    github.com/favadi/protoc-go-inject-tag@latest
    github.com/golangci/misspell/cmd/misspell@latest
    github.com/golangci/revgrep/cmd/revgrep@latest
    github.com/loggerhead/enumer@latest
    github.com/rinchsan/gosimports/cmd/gosimports@latest
    golang.org/x/tools/cmd/goimports@latest
    google.golang.org/protobuf/cmd/protoc-gen-go@latest
    google.golang.org/grpc/cmd/protoc-gen-go-grpc@latest
    gotest.tools/gotestsum@latest
    mvdan.cc/gofumpt@latest
    mvdan.cc/sh/v3/cmd/shfmt@latest
  )

  echo "==> Installing external tools..."
  for tool in "${tools[@]}"; do
    go_install "$tool"
  done
}
# Check that all external tools are installed.
check_external() {
  # Ensure that all external tools are available. In CI we'll prefer installing pre-built external
  # tools for speed instead of go install so that we don't require downloading Go modules and
  # compiling tools from scratch in every CI job.
  # See .github/actions/install-external-tools.yml for that workflow.
  local tools
  tools=(
    buf
    enumer
    gofumpt
    goimports
    gosimports
    gotestsum
    misspell
    protoc-gen-go
    protoc-gen-go-grpc
    protoc-go-inject-tag
    revgrep
    shfmt
    staticcheck
  )

  echo "==> Checking for external tools..."
  for tool in "${tools[@]}"; do
    check_tool external "$tool"
  done
}
# Install internal tools.
install_internal() {
  local tools
  # If you update this list, please also update check_internal below.
  tools=(
    codechecker
    stubmaker
  )

  echo "==> Installing internal tools..."
  pushd "$(repo_root)/tools" &> /dev/null
  for tool in "${tools[@]}"; do
    go_install ./"$tool"
  done
  popd &> /dev/null
}
# Check that all internal tools are installed.
check_internal() {
  # Ensure that all required internal tools are available.
  local tools
  tools=(
    codechecker
    stubmaker
  )

  echo "==> Checking for internal tools..."
  for tool in "${tools[@]}"; do
    check_tool internal "$tool"
  done
}
# Install our pipeline tools. In some cases these may require access to internal repositories so
# they are excluded from our baseline toolset.
install_pipeline() {
  echo "==> Installing pipeline tools..."
  pushd "$(repo_root)/tools/pipeline" &> /dev/null
  if env GOPRIVATE=github.com/hashicorp go install ./...; then
    echo "--> pipeline ✔"
  else
    echo "--> pipeline ✖"
    popd &> /dev/null
    return 1
  fi
  popd &> /dev/null
}
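
`install_pipeline` prefixes `go install` with `env GOPRIVATE=github.com/hashicorp`, which tells the Go toolchain to fetch matching module paths directly (skipping the public module proxy and checksum database), and `env VAR=value cmd` scopes the variable to that one command. A small sketch of the scoping behavior alone, independent of Go:

```shell
#!/usr/bin/env bash
set -euo pipefail

# The variable is visible inside the command run by `env`...
env GOPRIVATE=github.com/hashicorp sh -c 'echo "inside: $GOPRIVATE"'
# ...but is not exported into the calling shell afterwards.
echo "after: ${GOPRIVATE:-unset}"
```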
# Check that all required pipeline tools are installed.
check_pipeline() {
  echo "==> Checking for pipeline tools..."
  check_tool pipeline pipeline
}

# Install tools.
install() {
  install_internal
  install_external
}

# Check tools.
check() {
  check_internal
  check_external
}
main() {
  # Default $1 to empty so that running with no arguments falls through to
  # the usage error instead of tripping `set -u`.
  case "${1:-}" in
  install-external)
    install_external
    ;;
  install-internal)
    install_internal
    ;;
  install-pipeline)
    install_pipeline
    ;;
  check-external)
    check_external
    ;;
  check-internal)
    check_internal
    ;;
  check-pipeline)
    check_pipeline
    ;;
  install)
    install
    ;;
  check)
    check
    ;;
  *)
    echo "unknown sub-command: '${1:-}'" >&2
    exit 1
    ;;
  esac
}

main "$@"
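
Given the `vault/tools/tools.sh` path shown above, typical invocations from the repository root would look like the following (the file location is the only assumption here; the sub-commands are straight from `main`):

```shell
./tools/tools.sh install           # install internal + external tools
./tools/tools.sh check             # verify everything is on PATH
./tools/tools.sh install-pipeline  # may need access to private HashiCorp repos
```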