On Thu, Jul 09, 2020 at 12:13:11PM +0200, Philippe Mathieu-Daudé wrote:
> On 7/9/20 10:55 AM, Erik Skultety wrote:
> > On Wed, Jul 08, 2020 at 10:46:56PM -0400, Cleber Rosa wrote:
> >> This script is intended to be used right after a push to a branch.
> >>
> >> By default, it will look for the pipeline associated with the commit
> >> that is the HEAD of the *local* staging branch.  It can be used as a
> >> one time check, or with the `--wait` option to wait until the pipeline
> >> completes.
> >>
> >> If the pipeline is successful, then a merge of the staging branch into
> >> the master branch should be the next step.
> >>
> >> Signed-off-by: Cleber Rosa <[email protected]>
> >> ---
> >>  scripts/ci/gitlab-pipeline-status | 156 ++++++++++++++++++++++++++++++
> >>  1 file changed, 156 insertions(+)
> >>  create mode 100755 scripts/ci/gitlab-pipeline-status
> >>
> >> diff --git a/scripts/ci/gitlab-pipeline-status b/scripts/ci/gitlab-pipeline-status
> >> new file mode 100755
> >> index 0000000000..4a9de39872
> >> --- /dev/null
> >> +++ b/scripts/ci/gitlab-pipeline-status
> >> @@ -0,0 +1,156 @@
> >> +#!/usr/bin/env python3
> >> +#
> >> +# Copyright (c) 2019-2020 Red Hat, Inc.
> >> +#
> >> +# Author:
> >> +#  Cleber Rosa <[email protected]>
> >> +#
> >> +# This work is licensed under the terms of the GNU GPL, version 2 or
> >> +# later.  See the COPYING file in the top-level directory.
> >> +
> >> +"""
> >> +Checks the GitLab pipeline status for a given commit commit
> >
> > s/commit$/(hash|sha|ID|)
> >
> >> +"""
> >> +
> >> +# pylint: disable=C0103
> >> +
> >> +import argparse
> >> +import http.client
> >> +import json
> >> +import os
> >> +import subprocess
> >> +import time
> >> +import sys
> >> +
> >> +
> >> +def get_local_staging_branch_commit():
> >> +    """
> >> +    Returns the commit sha1 for the *local* branch named "staging"
> >> +    """
> >> +    result = subprocess.run(['git', 'rev-parse', 'staging'],
> >
> > If one day Peter decides that "staging" is not a cool name anymore and use a
> > different name for the branch :) we should account for that and make it a
> > variable, possibly even parametrize this function with it.
>
> This script can be used by any fork, not only Peter.
> So having a parameter (default to 'staging') is a requisite IMO.
>
Right, as explained in the reply to Erik, this is just used for
finding the commit ID for the staging branch. Still, I'm making it
configurable in a new patch, and if people want, we can change the
current behavior to accept any kind of revision (but this would
probably mean changing the options or names, given that -c/--commit is
quite descriptive).
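
Just to illustrate the direction I have in mind (this is only a sketch,
and the function name plus the `or None` tweak for the cwd are mine, not
the final patch):

```python
import os
import subprocess


def get_local_branch_commit(branch='staging'):
    """
    Returns the commit sha1 for the given *local* branch
    (defaults to "staging", so current behavior is unchanged)
    """
    result = subprocess.run(['git', 'rev-parse', branch],
                            stdin=subprocess.DEVNULL,
                            stdout=subprocess.PIPE,
                            stderr=subprocess.DEVNULL,
                            # 'or None' keeps this runnable when the
                            # script's directory is the current dir
                            cwd=os.path.dirname(__file__) or None,
                            universal_newlines=True).stdout.strip()
    # with stderr suppressed, "git rev-parse <missing-branch>" echoes
    # the branch name back on stdout
    if result == branch:
        raise ValueError("There's no local branch named '%s'" % branch)
    if len(result) != 40:
        raise ValueError("Branch %s HEAD doesn't look like a sha1" % branch)
    return result
```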
> >> +                            stdin=subprocess.DEVNULL,
> >> +                            stdout=subprocess.PIPE,
> >> +                            stderr=subprocess.DEVNULL,
> >> +                            cwd=os.path.dirname(__file__),
> >> +                            universal_newlines=True).stdout.strip()
> >> +    if result == 'staging':
> >> +        raise ValueError("There's no local staging branch")
> >
> > "There's no local branch named 'staging'" would IMO be more descriptive, so
> > as
> > not to confuse it with staging in git.
> >
> >> +    if len(result) != 40:
> >> +        raise ValueError("Branch staging HEAD doesn't look like a sha1")
> >> +    return result
> >> +
> >> +
> >> +def get_pipeline_status(project_id, commit_sha1):
> >> + """
> >> + Returns the JSON content of the pipeline status API response
> >> + """
> >> + url = '/api/v4/projects/{}/pipelines?sha={}'.format(project_id,
> >> + commit_sha1)
> >> + connection = http.client.HTTPSConnection('gitlab.com')
> >> + connection.request('GET', url=url)
> >> + response = connection.getresponse()
> >> + if response.code != http.HTTPStatus.OK:
> >> + raise ValueError("Failed to receive a successful response")
> >> + json_response = json.loads(response.read())
> >
> > a blank line separating the commentary block would slightly help readability
> >
> >> +    # afaict, there should one one pipeline for the same project + commit
> >
> > s/one one/be only one/
>
> 'afaict' is not a word.
>
Yes, good point. Thomas has addressed this.
> >
> >> +    # if this assumption is false, we can add further filters to the
> >> +    # url, such as username, and order_by.
> >> +    if not json_response:
> >> +        raise ValueError("No pipeline found")
> >> +    return json_response[0]
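
And should that assumption ever break, the query building could grow the
extra filters mentioned in the comment.  A sketch of what I mean (the
helper name is illustrative; `username` and `order_by` are parameters
the GitLab pipelines API documents):

```python
from urllib.parse import urlencode


def build_pipelines_url(project_id, commit_sha1, extra_filters=None):
    """
    Builds the pipelines API query path; extra_filters can carry
    additional narrowing parameters such as 'username' or 'order_by'
    should more than one pipeline match the project + commit.
    """
    params = {'sha': commit_sha1}
    if extra_filters:
        params.update(extra_filters)
    return '/api/v4/projects/{}/pipelines?{}'.format(project_id,
                                                     urlencode(params))
```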
> >> +
> >> +
> >> +def wait_on_pipeline_success(timeout, interval,
> >> +                             project_id, commit_sha):
> >> +    """
> >> +    Waits for the pipeline to end up to the timeout given
> >
> > "Waits for the pipeline to finish within the given timeout"
> >
> >> + """
> >> + start = time.time()
> >> + while True:
> >> + if time.time() >= (start + timeout):
> >> + print("Waiting on the pipeline success timed out")
> >
> > s/success//
> > (the pipeline doesn't always have to finish with success)
> >
> >> +            return False
> >> +
> >> +        status = get_pipeline_status(project_id, commit_sha)
> >> +        if status['status'] == 'running':
> >> +            time.sleep(interval)
> >> +            print('running...')
>
> If we want to automate the use of this script by a daemon, it would
> be better to use the logging class. Then maybe 'running...' is for
> the DEBUG level, Other print() calls can be updated to WARN/INFO
> levels.
>
Makes sense. I'll look into using proper logging in a future
improvement series.
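
For the record, something along these lines (only a sketch: the log
levels follow your suggestion, the handling of states other than
'running' is my assumption of the final loop shape, and `get_status`
is injected as a callable purely to keep the example self-contained):

```python
import logging
import time

log = logging.getLogger('gitlab-pipeline-status')


def wait_on_pipeline(timeout, interval, get_status):
    """
    Waits for the pipeline to finish within the given timeout;
    get_status is a callable returning the pipeline status dict.
    """
    start = time.time()
    while True:
        if time.time() >= (start + timeout):
            log.warning("Waiting on the pipeline timed out")
            return False
        status = get_status()
        if status['status'] == 'running':
            # chatty progress information goes to DEBUG
            log.debug('running...')
            time.sleep(interval)
            continue
        if status['status'] == 'success':
            log.info('pipeline succeeded')
            return True
        # any other terminal state is treated as a failure
        log.warning("pipeline ended with status: %s", status['status'])
        return False
```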
Thanks,
- Cleber.
