vault/tools/pipeline/internal/pkg/git/pull.go
Ryan Cragun d595a95c01
[VAULT-37096] pipeline(github): add github copy pr command (#31095)
After the merge workflow has been reversed and branches hosted in
`hashicorp/vault` are downstream from community branches hosted in
`hashicorp/vault-enterprise`, most contributions to the source code
will originate in `hashicorp/vault-enterprise` and be backported to
a community branch hosted in `hashicorp/vault-enterprise`. These
community branches will be considered the primary source of truth and
we'll automatically push changes from them to mirrors hosted in
`hashicorp/vault`.

This workflow ought to yield a massive efficiency boost for HashiCorp
contributors with access to `hashicorp/vault-enterprise`. Before, the
workflow would look something like:
  - Develop a change in vault-enterprise
  - Manually extract relevant changes from your vault-enterprise branch
    into a new community vault branch.
  - Add any stubs that might be required to support any enterprise-only
    changes.
  - Get the community change reviewed. If changes are necessary, it often
    means changing and testing them on both the enterprise and community
    branches.
  - Merge the community change
  - Wait for it to sync to enterprise
  - *Hope your changes have not broken the build*. If they have, fix the
    build.
  - Update your enterprise branch
  - Get the enterprise branch reviewed again
  - Merge enterprise change
  - Deal with complicated backports.

After, the workflow will look like:
  - Develop the change on enterprise
  - Get the change reviewed
  - Address feedback and test on the same branch
  - Merge the change
  - Automation will extract community changes and create a community
    backport PR for you, depending on the changed files and branch
    activity.
  - Automation will create any enterprise backports for you.
  - Fix any backport as necessary
  - Merge the changes
  - The pipeline will automatically push the changes to the community
    branch mirror hosted in hashicorp/vault.

No more
 - Duplicative reviews
 - Risky merges
 - Waiting for changes to sync from community to enterprise
 - Manual decomposition of changes into enterprise and community parts
 - *Doing the same PR 3 times*
 - Dealing with different backport processes depending on which branches
   are active.

These changes do come at a cost, however. Since changes always originate
in `vault-enterprise`, only HashiCorp employees can take advantage of the
workflow. We need to be able to support community contributions that
originate from the mirrors while retaining attribution.

That's what this PR is designed to do. The community will be able to
open a pull request as normal and have it reviewed as such, but rather
than merging it into the mirror we'll instead copy the PR and open it
against the corresponding enterprise base branch and have it merged
from there. The change will automatically get backported to the
community branch if necessary, which eventually makes it back to the
mirror in hashicorp/vault.

To handle our squash merge workflow while retaining the correct
attribution, we'll automatically create merge commits in the copied PR
that include `Co-Authored-By:` trailers for all commit authors on the
original PR.
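
For example (author names here are illustrative), such a merge commit
message might end with trailers like:

  Co-Authored-By: Jane Doe <jane.doe@example.com>
  Co-Authored-By: Alex Roe <alex.roe@example.com>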

We also take care to ensure that the HashiCorp maintainers that approve
the PR and/or are assigned to it are also assigned to the copied PR.

This change contains only the tooling to enable this; the workflow that
drives it will be implemented in VAULT-34827.

Signed-off-by: Ryan Cragun <me@ryan.ec>
2025-06-25 15:20:57 -06:00

// Copyright (c) HashiCorp, Inc.
// SPDX-License-Identifier: BUSL-1.1

package git

import (
	"context"
	"fmt"
	"strconv"
	"strings"
)

// RecurseSubmodules is a sub-module recurse mode
type RecurseSubmodules = string

const (
	RecurseSubmodulesYes      RecurseSubmodules = "yes"
	RecurseSubmodulesOnDemand RecurseSubmodules = "on-demand"
	RecurseSubmodulesNo       RecurseSubmodules = "no"
)

// PullOpts are the git pull flags and arguments
// See: https://git-scm.com/docs/git-pull
type PullOpts struct {
	// Options
	Quiet               bool              // --quiet
	Verbose             bool              // --verbose
	RecurseSubmodules   RecurseSubmodules // --recurse-submodules=
	NoRecurseSubmodules bool              // --no-recurse-submodules

	// Merge options
	Autostash               bool                  // --autostash
	AllowUnrelatedHistories bool                  // --allow-unrelated-histories
	DoCommit                bool                  // --commit
	NoDoCommit              bool                  // --no-commit
	Cleanup                 Cleanup               // --cleanup=
	FF                      bool                  // --ff
	FFOnly                  bool                  // --ff-only
	NoFF                    bool                  // --no-ff
	GPGSign                 bool                  // --gpg-sign
	GPGSignKeyID            string                // --gpg-sign=<key-id>
	Log                     uint                  // --log=
	NoAutostash             bool                  // --no-autostash
	NoLog                   bool                  // --no-log
	NoRebase                bool                  // --no-rebase
	NoStat                  bool                  // --no-stat
	NoSquash                bool                  // --no-squash
	NoVerify                bool                  // --no-verify
	Stat                    bool                  // --stat
	Squash                  bool                  // --squash
	Strategy                MergeStrategy         // --strategy=
	StrategyOptions         []MergeStrategyOption // --strategy-option=
	Rebase                  RebaseStrategy        // --rebase=
	Verify                  bool                  // --verify

	// Fetch options
	All           bool // --all
	Append        bool // --append
	Atomic        bool // --atomic
	Depth         uint // --depth
	Deepen        uint // --deepen
	Force         bool // --force
	NoTags        bool // --no-tags
	Porcelain     bool // --porcelain
	Progress      bool // --progress
	Prune         bool // --prune
	PruneTags     bool // --prune-tags
	SetUpstream   bool // --set-upstream
	Unshallow     bool // --unshallow
	UpdateShallow bool // --update-shallow

	// Targets
	Repository string   // <repository>
	Refspec    []string // <refspec>
}

// Pull runs the git pull command
func (c *Client) Pull(ctx context.Context, opts *PullOpts) (*ExecResponse, error) {
	return c.Exec(ctx, "pull", opts)
}

// String returns the options as a string
func (o *PullOpts) String() string {
	return strings.Join(o.Strings(), " ")
}

// Strings returns the options as a string slice
func (o *PullOpts) Strings() []string {
	if o == nil {
		return nil
	}

	opts := []string{}
	if o.All {
		opts = append(opts, "--all")
	}
	if o.Atomic {
		opts = append(opts, "--atomic")
	}
	if o.Autostash {
		opts = append(opts, "--autostash")
	}
	if o.DoCommit {
		opts = append(opts, "--commit")
	}
	if o.Depth > 0 {
		opts = append(opts, "--depth", strconv.FormatUint(uint64(o.Depth), 10))
	}
	if o.Deepen > 0 {
		opts = append(opts, "--deepen", strconv.FormatUint(uint64(o.Deepen), 10))
	}
	if o.FF {
		opts = append(opts, "--ff")
	}
	if o.FFOnly {
		opts = append(opts, "--ff-only")
	}
	if o.Force {
		opts = append(opts, "--force")
	}
	if o.GPGSign {
		opts = append(opts, "--gpg-sign")
	}
	if o.GPGSignKeyID != "" {
		opts = append(opts, fmt.Sprintf("--gpg-sign=%s", o.GPGSignKeyID))
	}
	if o.Log > 0 {
		opts = append(opts, fmt.Sprintf("--log=%d", o.Log))
	}
	if o.Squash {
		opts = append(opts, "--squash")
	}
	if o.Stat {
		opts = append(opts, "--stat")
	}
	for _, opt := range o.StrategyOptions {
		opts = append(opts, "-X", string(opt))
	}
	if o.NoAutostash {
		opts = append(opts, "--no-autostash")
	}
	if o.NoDoCommit {
		opts = append(opts, "--no-commit")
	}
	if o.NoFF {
		opts = append(opts, "--no-ff")
	}
	if o.NoLog {
		opts = append(opts, "--no-log")
	}
	if o.NoRebase {
		opts = append(opts, "--no-rebase")
	}
	if o.NoRecurseSubmodules {
		opts = append(opts, "--no-recurse-submodules")
	}
	if o.NoSquash {
		opts = append(opts, "--no-squash")
	}
	if o.NoStat {
		opts = append(opts, "--no-stat")
	}
	if o.NoTags {
		opts = append(opts, "--no-tags")
	}
	if o.NoVerify {
		opts = append(opts, "--no-verify")
	}
	if o.Porcelain {
		opts = append(opts, "--porcelain")
	}
	if o.Progress {
		opts = append(opts, "--progress")
	}
	if o.Prune {
		opts = append(opts, "--prune")
	}
	if o.PruneTags {
		opts = append(opts, "--prune-tags")
	}
	if o.Quiet {
		opts = append(opts, "--quiet")
	}
	if o.Rebase != "" {
		opts = append(opts, fmt.Sprintf("--rebase=%s", string(o.Rebase)))
	}
	if o.SetUpstream {
		opts = append(opts, "--set-upstream")
	}
	if o.Unshallow {
		opts = append(opts, "--unshallow")
	}
	if o.Verbose {
		opts = append(opts, "--verbose")
	}
	if o.Repository != "" {
		opts = append(opts, o.Repository)
	}
	if len(o.Refspec) > 0 {
		opts = append(opts, o.Refspec...)
	}

	return opts
}
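
// For illustration (given the flag handling above), an options value like
//
//	&PullOpts{FFOnly: true, Prune: true, Repository: "origin", Refspec: []string{"main"}}
//
// serializes via Strings to
//
//	[]string{"--ff-only", "--prune", "origin", "main"}
//
// i.e. the arguments that follow `git pull` on the command line.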