On Thu, Jul 16, 2020 at 11:53:57AM +0200, Pavel Hrdina wrote:
Having the limit set to 100 is fine in most cases, but occasionally a
larger series will have more than 100 patches, which makes the
check-dco job fail.
Signed-off-by: Pavel Hrdina <phrdina@redhat.com>
---
.gitlab-ci.yml | 2 +-
ci/cirrus/build.yml | 2 +-
2 files changed, 2 insertions(+), 2 deletions(-)
diff --git a/.gitlab-ci.yml b/.gitlab-ci.yml
index 702198ec8e4..c997dc6df25 100644
--- a/.gitlab-ci.yml
+++ b/.gitlab-ci.yml
@@ -1,5 +1,5 @@
variables:
- GIT_DEPTH: 100
+ GIT_DEPTH: 1000
If it's only needed for the DCO job, then override the GIT_DEPTH
variable only in the DCO job definition, leaving all the other
jobs at 100 for speed.
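As a sketch of that suggestion (the job name `check-dco` and its stage
are assumptions based on the failing job mentioned above, not taken
from the actual .gitlab-ci.yml):

```yaml
check-dco:
  stage: sanity_checks
  variables:
    # Override the global GIT_DEPTH (100) only for this job, so the
    # DCO check can walk deep patch series without slowing down the
    # other jobs' clones.
    GIT_DEPTH: 1000
```

GitLab merges job-level variables over the global `variables:` block,
so every other job keeps the shallow depth of 100.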
stages:
- sanity_checks
diff --git a/ci/cirrus/build.yml b/ci/cirrus/build.yml
index 893e13d7241..49e90b6d67b 100644
--- a/ci/cirrus/build.yml
+++ b/ci/cirrus/build.yml
@@ -14,7 +14,7 @@ build_task:
install_script:
- @INSTALL_COMMAND@ @PKGS@
clone_script:
- - git clone --depth 100 "$CI_REPOSITORY_URL" .
+ - git clone --depth 1000 "$CI_REPOSITORY_URL" .
This hunk seems redundant given the subject's stated goal, which only
concerns the check-dco job.
- git fetch origin "$CI_COMMIT_REF_NAME"
- git reset --hard "$CI_COMMIT_SHA"
build_script:
Regards,
Daniel
--
|: https://berrange.com      -o-    https://www.flickr.com/photos/dberrange :|
|: https://libvirt.org         -o-            https://fstop138.berrange.com :|
|: https://entangle-photo.org    -o-    https://www.instagram.com/dberrange :|