
Add External build strategy #7949


Closed
jimmidyson wants to merge 2 commits

Conversation

jimmidyson
Contributor

Ref: #6954

This just adds a simple External build strategy that indicates that builds will be handled by an external system, e.g. Jenkins. I expect that annotations on the build config will be used to configure anything required on the external job, e.g. path to Jenkinsfile in repo.

I'm also working on a Jenkins plugin that watches build configs with this build strategy & creates/updates/deletes Jenkins jobs as required.

Looking for feedback please.

/cc @jstrachan @rawlingsj @bparees @smarterclayton @mfojtik

@@ -218,6 +218,10 @@ func (g *BuildGenerator) Instantiate(ctx kapi.Context, request *buildapi.BuildRe
return nil, err
}

if bc.Spec.Strategy.ExternalStrategy != nil {
return nil, &GeneratorFatalError{fmt.Sprintf("can't instantiate from BuildConfig %s/%s: BuildConfig uses an External build strategy", bc.Namespace, bc.Name)}
}
Contributor

i think we ultimately are going to want to instantiate these just like any other BC: we'll instantiate it to create a build object, and then something will be monitoring for those new build objects to bridge into launching the jenkins job.

Contributor

(that will necessitate changes to our build controller logic to ignore these though.)

Contributor

I think we'd do that before we merge this.
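To make the controller change concrete, here is a minimal Go sketch of the kind of guard being discussed, assuming the same ExternalStrategy pointer as in the diff above; the types are simplified stand-ins, not the real origin build controller:

```go
package main

import "log"

// Simplified stand-ins for the origin API types; only the field this
// discussion touches is modelled here.
type ExternalBuildStrategy struct{}

type BuildStrategy struct {
	ExternalStrategy *ExternalBuildStrategy
}

type Build struct {
	Name     string
	Strategy BuildStrategy
}

// handleBuild sketches the guard the build controller would need: builds
// using the External strategy are acknowledged but never scheduled as pods,
// since an external system (e.g. Jenkins) is expected to run them.
func handleBuild(b *Build) {
	if b.Strategy.ExternalStrategy != nil {
		log.Printf("build %q uses an External strategy; leaving execution to the external system", b.Name)
		return
	}
	// ...normal pod-based build handling would continue here...
}

func main() {
	handleBuild(&Build{Name: "example", Strategy: BuildStrategy{ExternalStrategy: &ExternalBuildStrategy{}}})
}
```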

Contributor

i would imagine this will kick off the jenkins job and return the jenkins job logs (by launching a pod that will connect to jenkins and stream the logs).

Contributor

yeah, i figured, i realize this PR is very much a WIP to spur discussion.

Contributor Author

Absolutely - this is an early WIP PR to ensure we all understand & agree on the approach.

@bparees
Contributor

bparees commented Mar 11, 2016

Thanks @jimmidyson! At first pass it looks like the right direction, will spend more time w/ this post-travel.

@@ -303,6 +303,9 @@ type BuildStrategy struct {

// CustomStrategy holds the parameters to the Custom build strategy
CustomStrategy *CustomBuildStrategy `json:"customStrategy,omitempty" description:"holds parameters to the Custom build strategy"`

// ExternalStrategy holds the parameters to the External build strategy
ExternalStrategy *ExternalBuildStrategy `json:"externalStrategy,omitempty" description:"holds parameters to the External build strategy"`
Contributor

I think, based on everything we've said, we want to make this be somewhat specific to workflow but not necessarily (implicitly) specific to Jenkins. We want to pick the most general abstraction that solves the following problems:

  1. Users want to define a complex workflow on their builds
  2. A third party is going to implement that workflow
  3. The third party needs to report back the workflow status to the status sub resource on the build config and clients need to read it (as a versionable API struct)
  4. A client can parameterize the external workflow (what parameters will we support?)
  5. If the workflow has a single definition (a Jenkinsfile), can we allow users to directly support that (are there any cases where a Jenkinsfile needs multiple files to work, because it includes another groovy script in the same dir?)

Contributor

+1

Contributor

what does the example flow look like? does that mean the user will have to create multiple build configs, one for each step in the pipeline?

Contributor

no. the build steps are defined in the jenkinsfile (or whatever mechanism the pipeline engine of choice uses to define a workflow). This buildconfig is purely a wrapper around that external workflow/pipeline process. it doesn't care about individual steps.

some of the pipeline steps might be represented by other "normal" buildconfigs (eg an s2i build) but that's not necessarily always going to be the case.

Contributor

@bparees so openshift itself does not know what the final pipeline looks like? i mean if the jenkinsfile defines it, how are we going to visualize it in the console? i was expecting something like linked build configurations (prev step/next step), where some build configs will be external and some local (regular docker/s2i/custom builds)...

Contributor

@mfojtik fabric knows how to interrogate a jenkins workflow job definition and visualize it. that's what will come to openshift. The buildconfig will just point to that workflow job. (that's one scenario. another is that that buildconfig actually contains enough metadata on its own to define a workflow, without a jenkinsfile).

but no, there will not be linked buildconfigurations; not all steps will be represented as buildconfiguration objects (eg a human approval step)

Contributor

Think of the build config as representing an entire build process. The external engine will write back a pretty detailed status blob that will either be strictly API versioned (generic steps, tasks, stages, pipelines, whatever) or we'll figure out how to let it evolve and still have naive UI clients show it. If that external process uses builds, other build configs, deployment configs, or things not on the platform, they'll set the appropriate labels / annotations to let naive clients do something kind of useful, but mostly drive status.

Contributor

thanks Ben, I think i am getting the idea now :)

Contributor

we can get the Jenkins plugin to write the status of each Jenkins pipeline build back into the OpenShift Build object with details of which stages have completed, which are pending & the expected duration of each stage, so that the console or CLI can visualise pipeline builds purely using the metadata inside the OpenShift Build object.
openshift/jenkins-sync-plugin#2
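To make that concrete, here is a small Go sketch of the kind of stage metadata the sync plugin could serialise into a Build annotation; the struct fields and the annotation key are illustrative assumptions, not a settled schema:

```go
package main

import (
	"encoding/json"
	"fmt"
	"time"
)

// StageStatus is a hypothetical per-stage record the Jenkins sync plugin
// could write back; none of these names are part of the OpenShift API.
type StageStatus struct {
	Name              string        `json:"name"`
	State             string        `json:"state"` // e.g. "Complete", "Running", "Pending", "Failed"
	EstimatedDuration time.Duration `json:"estimatedDuration"`
}

func main() {
	stages := []StageStatus{
		{Name: "build", State: "Complete", EstimatedDuration: 2 * time.Minute},
		{Name: "integration-test", State: "Running", EstimatedDuration: 5 * time.Minute},
		{Name: "deploy-staging", State: "Pending", EstimatedDuration: 1 * time.Minute},
	}

	payload, err := json.Marshal(stages)
	if err != nil {
		panic(err)
	}

	// The plugin would set this as an annotation on the Build object so the
	// console or CLI can render the pipeline without talking to Jenkins.
	annotations := map[string]string{
		"jenkins.openshift.io/stages": string(payload), // hypothetical annotation key
	}
	fmt.Println(annotations["jenkins.openshift.io/stages"])
}
```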

@jimmidyson
Contributor Author

Is this level of abstraction right (External) or do we want to be more specific, e.g. JenkinsPipeline?

FYI Jenkins Workflow has been renamed to Jenkins Pipeline, so please make sure we're using this name to prevent confusion.

@mfojtik
Contributor

mfojtik commented Mar 11, 2016

@jimmidyson i'm ok with external.

@bparees
Contributor

bparees commented Mar 11, 2016

@jimmidyson I think External is correct. We may have internal structs within the strategy that are Jenkins-specific (and optional, in case you are using something other than Jenkins in the future), but the strategy is just generically "External".

@smarterclayton
Contributor

I'd like to answer the two other questions before we decide on the name - types and kinds of parameters (how would it be used for jenkins pipeline specifically) and then what does the "status" look like. Can someone here take the todo to mock up both of those for a representative pipeline of moderate to high complexity?

@bparees
Contributor

bparees commented Mar 11, 2016

"There are two mutually exclusive techniques that are used in the early stages of programming: The Software Engineering method, and the ever-popular Brute Force strategy. Right from the start of our computer careers, we are told that any problem can be broken down into manageable pieces, and that these pieces can be linked together to form a logically constructed program; the method used by Software Engineers. This process is time consuming, yet incredibly simple. Keep the pieces as small as possible, construct each one separately, get it to working, and plug it in. ``This method can be applied to any problem you'll ever have to solve, in the field of computer science, or in real life situations,'' says the textbook. Sure. If you've got the time.

Brute Force can similarly be applied to any real life situation, and in the early stages it's quicker than the Software Engineering method. It's instinctive, spontaneous, and produces concrete results almost immediately. Read the problem, get a general idea of where you're headed, and head there. Start simply, and then build the sucker. If you don't understand something, ignore it. If it doesn't work, throw it out. Assume you know more about what you're doing than you actually do. It's kind of like picking a nice living room set, and building a house around it."

(I will take the todo to mock up the flows)

@smarterclayton
Contributor

smarterclayton commented Mar 11, 2016 via email

@bparees
Contributor

bparees commented Mar 11, 2016

I will still take the todo but i wouldn't object to seeing @jimmidyson and @jstrachan's version as well, since they have more experience w/ jenkinsfile pipeline definitions.

@smarterclayton
Contributor

smarterclayton commented Mar 12, 2016 via email

@jimmidyson
Contributor Author

I'm conscious that while we need to have a target design in mind, I want to implement iteratively because things will become clearer as we go. Personally I'm still learning a lot about CD, pipelines in general & Jenkins pipeline specifically, but it really feels like we're moving in the right direction already.

@mfojtik
Contributor

mfojtik commented Mar 13, 2016

@smarterclayton @jimmidyson @bparees can we work through a real-world example of using a pipeline/external BC? Let's say I have 4 services backed by 4 pods. Each pod runs a single container with an application. I have the frontend (Rails app), a payment REST app (Node.js), an analytics/monitoring app (Sinatra?) and a storage REST app (Go?). Every app has its own Git repo (or the Node.js and Rails apps share one common repo). Now I create this topology in a project named "myapp-prod".

Now I want to build a pipeline that will test, deploy to stage and deploy to prod. So I create the Jenkinsfile in the Rails app repo (?). Then I create the BuildConfig with the external build strategy. This will add a Jenkins service (or discover an existing service?) and instruct Jenkins to pick up the Jenkinsfile (?). That Jenkinsfile will define the workflow for building all components? Now I push a change to the Node.js component. Jenkins will rebuild the Node.js app image and then what? I assume you can define the steps in the Jenkinsfile. Will Jenkins then update the BC for each step?

Maybe you already discussed this in Miami, but it will help me visualize this in my brain ;-)

@jstrachan
Contributor

in terms of what metadata we'd wanna put into the JenkinsStrategy a start would be:

  • a URL of where the Jenkinsfile is, or
  • a path, relative to the git repository, of where the Jenkinsfile is. The default behaviour will be to refer to the Jenkinsfile inside the git repository of the source of the project being built (with Jenkinsfile being the default name)
  • later on we could add trigger stuff (do we poll for git changes etc)

This Jenkinsfile could define, say, 3 stages. Once the BuildConfig is created, if Jenkins is running with this plugin https://github.com/fabric8io/openshift-jenkins-sync-plugin - the plugin would automatically create a Jenkins job for this pipeline (by watching OpenShift's BuildConfigs as per openshift/jenkins-sync-plugin#1).

When this Jenkins pipeline job is run (via web hook, via oc start-build, or via the OpenShift or Jenkins consoles), a new Jenkins Job Build would be created for the Jenkins Job. Then this part of the plugin openshift/jenkins-sync-plugin#2 will ensure that there's an OpenShift Build object for each Jenkins Job Build of a Jenkins Job which has an OpenShift BuildConfig - plus we'd dump the stage metrics (what stages there are, which are completed/failed/running, what the estimated duration is, etc.) into the Build - initially as an annotation, but ideally in the Build schema. i.e. the openshift-jenkins-sync plugin would lazily create a Build object and periodically update it at key times during the pipeline (e.g. when a stage changes its state).
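The actual sync plugin is a Jenkins (Java) plugin, but the shape of the loop it runs is roughly the following Go sketch; the event and job-store types are hypothetical stand-ins rather than the plugin's or OpenShift's real APIs:

```go
package main

import "fmt"

// BuildConfigEvent is a hypothetical watch event for BuildConfig objects.
type BuildConfigEvent struct {
	Type       string // "ADDED", "MODIFIED" or "DELETED"
	Namespace  string
	Name       string
	IsPipeline bool // true when the BuildConfig uses the external/Jenkins pipeline strategy
}

// JenkinsJobs is a hypothetical interface to the Jenkins job store.
type JenkinsJobs interface {
	CreateOrUpdate(namespace, name string) error
	Delete(namespace, name string) error
}

// sync mirrors pipeline BuildConfigs into Jenkins jobs: create/update on add
// or modify, delete on removal; non-pipeline BuildConfigs are ignored.
func sync(events <-chan BuildConfigEvent, jobs JenkinsJobs) {
	for ev := range events {
		if !ev.IsPipeline {
			continue
		}
		var err error
		switch ev.Type {
		case "ADDED", "MODIFIED":
			err = jobs.CreateOrUpdate(ev.Namespace, ev.Name)
		case "DELETED":
			err = jobs.Delete(ev.Namespace, ev.Name)
		}
		if err != nil {
			fmt.Printf("sync %s/%s: %v\n", ev.Namespace, ev.Name, err)
		}
	}
}

// fakeJobs is a stand-in implementation used only to make the sketch runnable.
type fakeJobs struct{}

func (fakeJobs) CreateOrUpdate(ns, name string) error { fmt.Println("create/update", ns, name); return nil }
func (fakeJobs) Delete(ns, name string) error         { fmt.Println("delete", ns, name); return nil }

func main() {
	events := make(chan BuildConfigEvent, 1)
	events <- BuildConfigEvent{Type: "ADDED", Namespace: "myapp", Name: "pipeline", IsPipeline: true}
	close(events)
	sync(events, fakeJobs{})
}
```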

@mfojtik
Contributor

mfojtik commented Mar 14, 2016

@jstrachan should the URL for the Jenkinsfile be an optional API field for JenkinsStrategy?

@openshift-bot openshift-bot added the needs-rebase Indicates a PR cannot be merged because it has merge conflicts with HEAD. label Mar 14, 2016
@openshift-bot openshift-bot removed the needs-rebase Indicates a PR cannot be merged because it has merge conflicts with HEAD. label Mar 14, 2016
@smarterclayton
Contributor

smarterclayton commented Mar 15, 2016 via email

@jstrachan
Contributor

@mfojtik yeah, most things we can configure on a Pipeline job in jenkins should probably go into the JenkinsStrategy; either embedding the Jenkinsfile, providing a URL to it or referencing it from a path relative to the git repo

@jimmidyson
Contributor Author

Most people are going to be storing their Jenkinsfile in the source repo referenced from the build config source - that should be the primary use case.

As Jenkins pipeline accepts only an inline Jenkinsfile or a Jenkinsfile from SCM, users that want to access the Jenkinsfile via a URL will be able to specify the Jenkinsfile URL in JenkinsPipelineStrategy. The Jenkins plugin can download the file & create/update the Jenkins job as needed. Question: how do users trigger an update to the Jenkinsfile if it's accessed via a URL?

@jstrachan
Contributor

@jimmidyson we can't ;) which is why I like the idea of either the Jenkinsfile being stored in git or inside the BuildConfig. Maybe we punt on the URL option for now?

@jimmidyson
Contributor Author

@jstrachan That's the answer I was hoping for ;)

@jimmidyson
Contributor Author

Added JenkinsfilePath (relative to source repo root) & Jenkinsfile (inline pipeline definition) to Jenkins pipeline strategy.
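For reference, a Go sketch of what that strategy type could look like with those two fields; the names and JSON tags here are assumptions based on this comment, not necessarily the final API:

```go
package api

// JenkinsPipelineBuildStrategy holds the parameters to the Jenkins pipeline
// build strategy. The two fields mirror the options above; in practice only
// one of them would be set.
type JenkinsPipelineBuildStrategy struct {
	// JenkinsfilePath is the path of the Jenkinsfile, relative to the root
	// of the source repository (defaulting to "Jenkinsfile").
	JenkinsfilePath string `json:"jenkinsfilePath,omitempty"`

	// Jenkinsfile is an inline pipeline definition, used instead of
	// JenkinsfilePath when the pipeline is embedded in the BuildConfig itself.
	Jenkinsfile string `json:"jenkinsfile,omitempty"`
}
```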

@jstrachan
Contributor

@jimmidyson sounds good!

@bparees
Contributor

bparees commented Mar 15, 2016

@jimmidyson @jstrachan Keep in mind that if you have a legitimate source repo (eg a java app) that contains a jenkinsfile, you're really going to potentially end up with 2 buildconfigs:

  1. external buildconfig. points to the source repo+relative directory to pick up the jenkinsfile.
  2. s2i buildconfig. points to the source repo for building the application image. this buildconfig is triggered by one of the stages in the Jenkinsfile.

where (2) is optional because the pipeline defined in (1) could just build the war (or the whole image) directly, rather than doing it in openshift.

My point being that the only thing (1) cares about is the jenkinsfile; what repo it's in, or anything else that may or may not be in that repo, should be irrelevant to that buildconfig.

@jimmidyson
Contributor Author

@bparees Agreed - the source repo for a pipeline build config should be the source repo containing the Jenkinsfile.

@bparees
Contributor

bparees commented Mar 15, 2016

@jimmidyson well, the other point I should have made is, if i have a microservices app that's built from 5 repos, and a pipeline that knows how to build/deploy all 5 pieces, which repo do i put the jenkinsfile in?

I would think in a case like that i would want a separate repo just for the jenkinsfile, that contains nothing else.

@smarterclayton
Contributor

API is approved, we can follow up with additional status and the sub resources in a subsequent pull after we merge this.

@smarterclayton
Contributor

I think conversions are fixed now.

On Fri, Apr 22, 2016 at 10:10 AM, Jimmi Dyson wrote:

Requires #8511 to fix generated conversions.

@jimmidyson
Contributor Author

Rebased.

@openshift-bot openshift-bot removed the needs-rebase Indicates a PR cannot be merged because it has merge conflicts with HEAD. label Apr 28, 2016
@jimmidyson
Contributor Author

@smarterclayton Doesn't look like generation is sorted :( Same errors in Jenkins check build (https://ci.openshift.redhat.com/jenkins/job/test_pull_requests_origin_check/452/console)

@smarterclayton
Contributor

Please squash down to 2 commits - one for Michal's, one for yours.

@smarterclayton
Contributor

Until we cut over to 1.6 on RHEL, you'll need to generate conversions on go 1.4.2.

@smarterclayton
Contributor

I'd like to merge this tomorrow morning - squash, regen on go 1.4.2 (or just revert to what is checked in), I'll do a final pass, and we'll get this in.

@jimmidyson
Contributor Author

Squashed to 2 commits (one for Jenkins pipeline strategy, one for on-demand provision of Jenkins). Regenerated stuff too, hope it passes muster...

@jimmidyson
Contributor Author

Obviously the regen'd stuff fails on travis which uses go 1.5.3/1.6...

@smarterclayton
Contributor

smarterclayton commented Apr 29, 2016 via email

@jimmidyson
Contributor Author

Hmm failure seems unrelated. Can someone please retest this?

@smarterclayton
Contributor

[merge]

@smarterclayton
Contributor

LGTM - thank you for the hard work on this

@openshift-bot
Contributor

Evaluated for origin merge up to d9fe147

@smarterclayton
Contributor

[test]

@openshift-bot
Contributor

Evaluated for origin test up to d9fe147

@smarterclayton smarterclayton changed the title [WIP] Add External build strategy Add External build strategy Apr 29, 2016
@openshift-bot
Contributor

continuous-integration/openshift-jenkins/test FAILURE (https://ci.openshift.redhat.com/jenkins/job/test_pr_origin/3449/)

@smarterclayton
Contributor

Your unit tests are still failing. I'll try to pull together a fix and
merge.


@openshift-bot
Contributor

continuous-integration/openshift-jenkins/merge FAILURE (https://ci.openshift.redhat.com/jenkins/job/merge_pull_requests_origin/5760/)

@smarterclayton
Contributor

smarterclayton commented Apr 29, 2016 via email

@smarterclayton
Contributor

Let's link in the follow ups here as they open - next steps are:

  • to formally define the status API after we've had some time to experiment with it
  • get the jenkins dev image in
  • turn on the conformance test for jenkins
  • get ootb examples checked in openshift

???

On Sat, Apr 30, 2016 at 6:26 PM, OpenShift Bot wrote:

Closed #7949 via #8693.
