Jenkins: how to copy artifacts from parallel child jobs
I am using Jenkins to automate parallel JMeter tests. This is set up as two separate Jenkins pipeline jobs: a parent job and a child job. The child job accepts a set of parameters and runs a JMeter test against the target service. This works, and it archives four CSVs and one XML file on each build. The parent job runs the child job multiple times in parallel on different nodes. Currently it runs it twice while I test, but the eventual intent is to spawn 10 or 20 child jobs at a time. The parallel execution works: every time the parent job runs, the child job records two builds and archives their artifacts. The question is how to configure the Copy Artifacts plugin to retrieve the artifacts from the child job so they can be archived on the parent job.
The child job has a parameter named ParentBuildTag, of type "Build selector for Copy Artifact". The "Permission to Copy Artifact" checkbox is checked, with the "Projects to allow copy artifacts" field set to *. The parent job triggers the child job like this:
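For context, the child job's archiving step presumably looks something like the following. This is a minimal sketch: the filter pattern is an assumption, since the question only says the job archives four CSVs and one XML per build.

```groovy
// Child job (CC_DGN_Test) -- hypothetical post section.
// The '*.csv, *.xml' pattern is assumed, not taken from the original job.
post {
    always {
        archiveArtifacts artifacts: '*.csv, *.xml', fingerprint: true
    }
}
```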
stage('Node 2') {
agent { node { label 'PIPELINE' } }
steps {
script {
node2 = build job: 'CC_DGN_Test',
parameters: [
string(name: 'dummy', value: "2"),
string(name: 'ParentBuildTag', value: "${BUILD_TAG}"),
string(name: 'Labels', value: "JMETER"),
...additional parameters snipped...
]
}
}
}
The console log shows this error:
Error when executing always post condition:
hudson.AbortException: Unable to find a build for artifact copy from: CC_DGN_Test
at hudson.plugins.copyartifact.CopyArtifact.perform(CopyArtifact.java:412)
at org.jenkinsci.plugins.workflow.steps.CoreStep$Execution.run(CoreStep.java:80)
at org.jenkinsci.plugins.workflow.steps.CoreStep$Execution.run(CoreStep.java:67)
at org.jenkinsci.plugins.workflow.steps.SynchronousNonBlockingStepExecution.lambda$start$0(SynchronousNonBlockingStepExecution.java:47)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
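A likely cause of this error: the buildParameter selector expects the *name* of a build-selector parameter (one whose value is Copy Artifact selector XML), not an arbitrary string such as the parent's BUILD_TAG. A hedged sketch of that usage, where 'BUILD_SELECTOR' is a hypothetical parameter name defined on the current job:

```groovy
// buildParameter resolves the selector from a parameter's VALUE, which must
// be Copy Artifact selector XML -- passing a build tag string cannot match
// any build, hence "Unable to find a build for artifact copy".
copyArtifacts projectName: 'CC_DGN_Test',
              selector: buildParameter('BUILD_SELECTOR')
```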
Nothing is copied to the parent. The build tag is printed correctly to the console log (from a print statement in post{}).
The build number is also printed correctly to the console log, yet no error is recorded and nothing is copied.
pipeline {
agent { node { label 'PIPELINE' } }
options {
timeout(time: 1, unit: 'HOURS')
buildDiscarder(logRotator(numToKeepStr: '100'))
timestamps()
}
environment {
node1 = ""
node2 = ""
}
stages {
stage('Clean Up') {
steps {
cleanWs()
}
}
stage('Test') {
parallel {
stage('Node 1') {
agent { node { label 'PIPELINE' } }
steps {
script {
node1 = build job: 'CC_DGN_Test',
parameters: [
string(name: 'dummy', value: "1"),
string(name: 'ParentBuildTag', value: "${BUILD_TAG}"),
string(name: 'Labels', value: "JMETER"),
...additional parameters snipped...
]
}
}
}
stage('Node 2') {
agent { node { label 'PIPELINE' } }
steps {
script {
node2 = build job: 'CC_DGN_Test',
parameters: [
string(name: 'dummy', value: "2"),
string(name: 'ParentBuildTag', value: "${BUILD_TAG}"),
string(name: 'Labels', value: "JMETER"),
...additional parameters snipped...
]
}
}
}
}
}
}
post {
always {
script {
copyArtifacts optional: false, projectName: 'CC_DGN_Test', selector: buildParameter("${BUILD_TAG}")
archiveArtifacts "*.xml"
}
cleanWs()
}
}
}
My goal, with the current configuration, is for the parent job to contain eight CSVs and two XMLs in total once the job completes, but right now nothing gets archived with the parent job. Where is my copyArtifacts syntax going wrong?

Your approach #2 is correct. You just need to convert
node2.number
to a string:
selector: specific("${node2.number}")
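Applied to the pipeline in the question, the copy step inside each parallel stage could look like this. This is a sketch; the target: subdirectory is an assumption added so that artifacts from the two children do not overwrite each other when they share filenames.

```groovy
script {
    node2 = build job: 'CC_DGN_Test',
        parameters: [string(name: 'dummy', value: "2") /* ...snipped... */]
    // specific() takes the build number as a String; GString interpolation
    // converts the integer node2.number for us.
    copyArtifacts projectName: 'CC_DGN_Test',
                  filter: '*.xml, *.csv',
                  selector: specific("${node2.number}"),
                  target: 'node2'   // hypothetical subdir to avoid collisions
}
```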
You can also invoke the child job from a method. Here is an example script:
#!groovy
pipeline {
    environment {
        childJobName = "Test/MyChildJob"
    }
    stages {
        stage('ChildJobs') {
            parallel {
                stage('ChildJob1') {
                    steps {
                        runJob(childJobName, '@tag1 @tag2', 'job1')
                    }
                }
                stage('ChildJob2') {
                    steps {
                        runJob(childJobName, '@tag3 @tag4', 'job2')
                    }
                }
            }
        }
    }
    post {
        cleanup {
            cleanWs()
        }
    }
}

def runJob(String jobName, String tags, String rootReportDir) {
    def childJob = build job: jobName, propagate: false, wait: true, parameters: [string(name: 'TAGS', value: tags)]
    copyArtifacts filter: 'report.html', projectName: jobName, selector: specific("${childJob.number}"), target: rootReportDir
    archiveArtifacts artifacts: "${rootReportDir}/report.html"
    if (childJob.result == "FAILURE") {
        bat "exit 1"
    }
}
In this example the child jobs are all the same Jenkins job; the parent simply passes different parameters to each one.
The report file produced by each child job is copied into rootReportDir in the parent job. rootReportDir should be unique for each child job so that each report has a unique path when archived on the parent. Were you able to solve this? @edt_devel No, I was not. At the moment the child jobs have a pipeline step in which they handle their own artifacts; the parent job exists only to spawn the configured number of child jobs and to hold the schedule definition.
stage('Node 2') {
agent { node { label 'PIPELINE' } }
steps {
script {
node2 = build job: 'CC_DGN_Test',
parameters: [
string(name: 'dummy', value: "2"),
string(name: 'ParentBuildTag', value: "${BUILD_TAG}"),
string(name: 'Labels', value: "JMETER"),
...additional parameters snipped...
]
print "Build number (node 2) = " + node2.number //prints build number to console e.g. "Build number (node 2) = 102"
copyArtifacts optional: false, filter: '*.xml, *.csv', fingerprintArtifacts: true, projectName: 'CC_DGN_Test', selector: specific(node2.number)
}
}
}
properties([parameters([
[$class: 'BuildSelectorParameter',
defaultSelector: upstream(fallbackToLastSuccessful: true),
description: '',
name: 'ParentBuildTag']])
])
copyArtifacts(
projectName: 'CC_DGN_Test',
selector: [
$class: 'ParameterizedBuildSelector',
parameterName: 'ParentBuildTag'
]
);
stash includes: '*.xml', name: 'node1xml'
unstash 'node1xml'
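For completeness: the stash/unstash attempt above cannot work across jobs, because a stash is scoped to a single pipeline run. It moves files between stages or nodes of the same build, not from a child job to its parent. A minimal sketch of valid usage within one pipeline (stage names and the 'OTHER' label are illustrative):

```groovy
// stash/unstash only transfer files between nodes of the SAME build.
stage('Produce') {
    steps {
        stash includes: '*.xml', name: 'node1xml'
    }
}
stage('Consume') {
    agent { node { label 'OTHER' } }
    steps {
        unstash 'node1xml'   // restores the stashed files into this workspace
    }
}
```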