build-python3-image: [3.12, BASE_IMAGE_TAG=3.12-bookworm, Dockerfile.python-3] ○ success

Duration: 11s
Queued: 3s
📁 Stage: docker-image
🖥 Runner: linux-aws-1
Average Duration: 30s (this job: 11s)
Failure Rate: 0.7% (last 30 days)

Job Execution Phases


Job Analysis

Job Status: Passed


Full Job Log

257 lines
1 14:27:23 Running with gitlab-runner 18.5.0 (bda84871)
2 14:27:23 on gitlab-runner-linux-1-86d76d468c-dqnmc wRxjPbsJX, system ID: r_rbm7B2gpCnst
3 14:27:23 feature flags: FF_USE_FASTZIP:true, FF_USE_NEW_BASH_EVAL_STRATEGY:true, FF_SCRIPT_SECTIONS:true, FF_USE_ADVANCED_POD_SPEC_CONFIGURATION:true, FF_PRINT_POD_EVENTS:true, FF_USE_DUMB_INIT_WITH_KUBERNETES_EXECUTOR:true, FF_LOG_IMAGES_CONFIGURED_FOR_JOB:true, FF_CLEAN_UP_FAILED_CACHE_EXTRACT:true, FF_TIMESTAMPS:true, FF_GIT_URLS_WITHOUT_TOKENS:true
4 14:27:23 Resolving secrets
5 14:27:23 section_start:1765290443:prepare_executor
6 14:27:23 +Preparing the "kubernetes" executor
7 14:27:23 "CPURequest" overwritten with "2"
8 14:27:23 "MemoryRequest" overwritten with "4G"
9 14:27:23 Using Kubernetes namespace: gitlab-runner
10 14:27:23 Using Kubernetes executor with image registry.scandit.com/dockerfiles/kaniko:v1.26.3-crane@sha256:15e0e485d8fe32a3e4f08116e163a6e5473014e297910cfdec3c58e2880a0e66 ...
11 14:27:23 Using attach strategy to execute scripts...
12 14:27:23 Using effective pull policy of [Always] for container helper
13 14:27:23 Using effective pull policy of [Always] for container init-permissions
14 14:27:23 Using effective pull policy of [Always] for container build
15 14:27:23 section_end:1765290443:prepare_executor
16 14:27:23 +section_start:1765290443:prepare_script
17 14:27:23 +Preparing environment
18 14:27:23 Using FF_USE_POD_ACTIVE_DEADLINE_SECONDS, the Pod activeDeadlineSeconds will be set to the job timeout: 1h0m0s...
19 14:27:23 WARNING: Advanced Pod Spec configuration enabled, merging the provided PodSpec to the generated one. This is a beta feature and is subject to change. Feedback is collected in this issue: https://gitlab.com/gitlab-org/gitlab-runner/-/issues/29659 ...
20 14:27:23 Subscribing to Kubernetes Pod events...
21 14:27:23 Type Reason Message
22 14:27:23 Normal Scheduled Successfully assigned gitlab-runner/runner-wrxjpbsjx-project-621-concurrent-0-osbial2m to ip-10-0-38-72.eu-central-1.compute.internal
23 14:27:24 Normal Pulled Container image "gitlab/gitlab-runner-helper:x86_64-v18.5.0" already present on machine
24 14:27:24 Normal Created Created container: init-permissions
25 14:27:24 Normal Started Started container init-permissions
26 14:27:25 Normal Pulling Pulling image "498954711405.dkr.ecr.eu-central-1.amazonaws.com/dockerfiles/kaniko@sha256:15e0e485d8fe32a3e4f08116e163a6e5473014e297910cfdec3c58e2880a0e66"
27 14:27:26 Normal Pulled Successfully pulled image "498954711405.dkr.ecr.eu-central-1.amazonaws.com/dockerfiles/kaniko@sha256:15e0e485d8fe32a3e4f08116e163a6e5473014e297910cfdec3c58e2880a0e66" in 1.101s (1.101s including waiting). Image size: 46487061 bytes.
28 14:27:26 Normal Created Created container: build
29 14:27:26 Normal Started Started container build
30 14:27:26 Normal Pulled Container image "gitlab/gitlab-runner-helper:x86_64-v18.5.0" already present on machine
31 14:27:26 Normal Created Created container: helper
32 14:27:26 Normal Started Started container helper
33 14:27:30 Running on runner-wrxjpbsjx-project-621-concurrent-0-osbial2m via gitlab-runner-linux-1-86d76d468c-dqnmc...
34 14:27:30
35 14:27:30 section_end:1765290450:prepare_script
36 14:27:30 +section_start:1765290450:get_sources
37 14:27:30 +Getting source from Git repository
38 14:27:31 Gitaly correlation ID: 01KC1R500G55T1A89NQD7KKP94
39 14:27:31 Fetching changes with git depth set to 1...
40 14:27:31 Initialized empty Git repository in /build/internal/gitlab-templates/.git/
41 14:27:31 Created fresh repository.
42 14:27:31 Checking out 940006cc as detached HEAD (ref is refs/merge-requests/507/merge)...
43 14:27:31
44 14:27:31 Skipping Git submodules setup
45 14:27:31
46 14:27:31 section_end:1765290451:get_sources
47 14:27:31 +section_start:1765290451:step_script
48 14:27:31 +Executing "step_script" stage of the job script
49 14:27:32 section_start:1765290451:section_pre_build_script_0[hide_duration=true,collapsed=true] $ function cleanup {
50 14:27:32 rv=$?
51 14:27:32 if [ $rv -ne 0 ]; then
52 14:27:32 echo ""
53 14:27:32 echo " Failure Cause Analysis might help, please open this link:"
54 14:27:32 echo " https://failure-cause-analysis.zrh.int.scandit.io/analysis/projects/${CI_PROJECT_ID}/jobs/${CI_JOB_ID}"
55 14:27:32 echo ""
56 14:27:32 fi
57 14:27:32 echo ""
58 14:27:32 echo "Grafana Pod-View: https://grafana.scandit.com/d/k8s_views_pods/kubernetes-views-pods?orgId=1&refresh=1m&var-datasource=${GRAFANA_DATASOURCE}&var-host=${SC_K8S_NODE_NAME}&var-namespace=${SC_K8S_NAMESPACE}&var-pod=${HOSTNAME}&var-resolution=15&from=${__start_time}000&to=${EPOCHSECONDS}000"
59 14:27:32 echo "Grafana Node-View: https://grafana.scandit.com/d/k8s_views_nodes/kubernetes-views-nodes?orgId=1&refresh=1m&var-datasource=${GRAFANA_DATASOURCE}&var-node=${SC_K8S_NODE_NAME}&var-resolution=15s&from=${__start_time}000&to=${EPOCHSECONDS}000"
60 14:27:32 echo ""
61 14:27:32 exit $rv
62 14:27:32 }
63 14:27:32 trap cleanup EXIT
64 14:27:32 echo "INFO: This is the CI job pre_build_script"
65 14:27:32 echo "INFO: It's defined in the backend/infra/aws repo."
66 14:27:32 echo "INFO: These additional Scandit variables are available to you:"
67 14:27:32 echo " SC_K8S_NODE_NAME: $SC_K8S_NODE_NAME"
68 14:27:32 echo " SC_K8S_IMAGE_ID: $SC_K8S_IMAGE_ID"
69 14:27:32 echo "cpu (r/l): ${SC_K8S_REQUESTS_CPU}/${SC_K8S_LIMITS_CPU}"
70 14:27:32 if command -v numfmt >/dev/null 2>&1; then
71 14:27:32 echo "memory (r/l): $(numfmt --to=iec --suffix=B $SC_K8S_REQUESTS_MEMORY)/$(numfmt --to=iec --suffix=B $SC_K8S_LIMITS_MEMORY)"
72 14:27:32 else
73 14:27:32 echo "memory (r/l): ${SC_K8S_REQUESTS_MEMORY}/${SC_K8S_LIMITS_MEMORY}"
74 14:27:32 fi
75 14:27:32 __start_time=${EPOCHSECONDS}
76 14:27:32 echo ""
77 14:27:32 echo "Grafana Pod-View: https://grafana.scandit.com/d/k8s_views_pods/kubernetes-views-pods?orgId=1&refresh=1m&var-datasource=${GRAFANA_DATASOURCE}&var-host=${SC_K8S_NODE_NAME}&var-namespace=${SC_K8S_NAMESPACE}&var-pod=${HOSTNAME}&var-resolution=15&from=${__start_time}000&to=now"
78 14:27:32 echo "Grafana Node-View: https://grafana.scandit.com/d/k8s_views_nodes/kubernetes-views-nodes?orgId=1&refresh=1m&var-datasource=${GRAFANA_DATASOURCE}&var-node=${SC_K8S_NODE_NAME}&var-resolution=15s&from=${__start_time}000&to=now"
79 14:27:32 echo ""
80 14:27:32 echo "Setting up credentials for Gitlab Python registries"
81 14:27:32 mkdir -p ~
82 14:27:32 echo "machine gitlab.scandit.com" > ~/.netrc
83 14:27:32 echo "login gitlab-ci-token" >> ~/.netrc
84 14:27:32 echo "password ${CI_JOB_TOKEN}" >> ~/.netrc
85 14:27:32 chmod 600 ~/.netrc
86 14:27:32 if command -v git &> /dev/null && [ "$(id -u)" -ne 0 ]; then
87 14:27:32 git config --global --add safe.directory $CI_PROJECT_DIR
88 14:27:32 fi
89 14:27:32 # Sonarqube server is running on the same cluster. Use internal address
90 14:27:32 export SONAR_HOST_URL="http://sonarqube.sonarqube.svc.cluster.local:9000"
91 14:27:32 section_end:1765290451:section_pre_build_script_0
92 14:27:32 INFO: This is the CI job pre_build_script
93 14:27:32 INFO: It's defined in the backend/infra/aws repo.
94 14:27:32 INFO: These additional Scandit variables are available to you:
95 14:27:32 SC_K8S_NODE_NAME: ip-10-0-38-72.eu-central-1.compute.internal
96 14:27:32 SC_K8S_IMAGE_ID:
97 14:27:32 cpu (r/l): 2/4
98 14:27:32 memory (r/l): 4000000000/17179869184
99 14:27:32
100 14:27:32 Grafana Pod-View: https://grafana.scandit.com/d/k8s_views_pods/kubernetes-views-pods?orgId=1&refresh=1m&var-datasource=lu1rmx27z&var-host=ip-10-0-38-72.eu-central-1.compute.internal&var-namespace=gitlab-runner&var-pod=runner-wrxjpbsjx-project-621-concurrent-0-osbial2m&var-resolution=15&from=1765290451000&to=now
101 14:27:32 Grafana Node-View: https://grafana.scandit.com/d/k8s_views_nodes/kubernetes-views-nodes?orgId=1&refresh=1m&var-datasource=lu1rmx27z&var-node=ip-10-0-38-72.eu-central-1.compute.internal&var-resolution=15s&from=1765290451000&to=now
102 14:27:32
103 14:27:32 Setting up credentials for Gitlab Python registries
104 14:27:32 $ echo $DOCKER_CONFIG_JSON > /kaniko/.docker/config.json
105 14:27:32 section_start:1765290452:section_script_step_1[hide_duration=true,collapsed=true] $ if [ "$CONTAINER_USER_HOME" != "" ]; then
106 14:27:32 mkdir -p "$CONTAINER_USER_HOME"
107 14:27:32 cp ~/.netrc "$CONTAINER_USER_HOME"
108 14:27:32 if [ "$CONTAINER_USER_UID" != "" ]; then
109 14:27:32 chown "$CONTAINER_USER_UID" "$CONTAINER_USER_HOME/.netrc"
110 14:27:32 fi
111 14:27:32 fi
112 14:27:32 section_end:1765290452:section_script_step_1
113 14:27:32 section_start:1765290452:section_script_step_2[hide_duration=true,collapsed=true] $ function copy_files() {
114 14:27:32 local src="$1"
115 14:27:32 local trg="$2"
116 14:27:32 for f in $src; do
117 14:27:32 t="$trg/`dirname $f`"
118 14:27:32 mkdir -p $t || true
119 14:27:32 echo "Copy $f"
120 14:27:32 cp -pr $f $trg/$f
121 14:27:32 done
122 14:27:32 }
123 14:27:32 function recursive_hash() {
124 14:27:32 local dir="$1"
125 14:27:32 (find $dir -type f -exec sha256sum {} + ; find $dir -type d) | sort | sha256sum | cut -d ' ' -f1
126 14:27:32 }
127 14:27:32 function remote_docker_digest() {
128 14:27:32 local images="$1"
129 14:27:32 echo $images | xargs -n 1 crane digest
130 14:27:32 }
131 14:27:32 function remote_image_exists() {
132 14:27:32 local image="$1"
133 14:27:32 crane manifest $image > /dev/null 2>&1
134 14:27:32 }
135 14:27:32 function remote_images_are_identical() {
136 14:27:32 local imageA="$1"
137 14:27:32 local imageB="$2"
138 14:27:32 if [[ $(remote_docker_digest "$imageA") == $(remote_docker_digest "$imageB") ]]; then
139 14:27:32 return 0
140 14:27:32 else
141 14:27:32 return 1
142 14:27:32 fi
143 14:27:32 }
144 14:27:32 function copy_image() {
145 14:27:32 local image="$1"
146 14:27:32 local remotes="$2"
147 14:27:32 local backup_ext="$3"
148 14:27:32 echo "$image"
149 14:27:32 local source_digest=$(remote_docker_digest $image)
150 14:27:32 local target_digest
151 14:27:32 for registry in $remotes; do
152 14:27:32 if target_digest=$(remote_docker_digest $registry); then
153 14:27:32 if [ "$target_digest" != "$source_digest" ]; then
154 14:27:32 echo "image outdated, overwriting with newest version"
155 14:27:32 crane copy $image $registry
156 14:27:32 crane copy $image ${registry}${backup_ext}
157 14:27:32 fi
158 14:27:32 else
159 14:27:32 echo "image does not exist, writing newest version"
160 14:27:32 crane copy $image $registry
161 14:27:32 crane copy $image ${registry}${backup_ext}
162 14:27:32 fi
163 14:27:32 done
164 14:27:32 }
165 14:27:32 section_end:1765290452:section_script_step_2
166 14:27:32 section_start:1765290452:section_script_step_3[hide_duration=true,collapsed=true] $ if [ "$CONTAINER_SUBDIR" != "" ]; then
167 14:27:32 echo "Entering subpath $CONTAINER_SUBDIR"
168 14:27:32 cd $CONTAINER_SUBDIR
169 14:27:32 fi
170 14:27:32 section_end:1765290452:section_script_step_3
171 14:27:32 $ copy_files "$CONTAINER_IMPLICIT_REQUIREMENTS $CONTAINER_REQUIREMENTS" "$CONTAINER_CONTEXT_PATH"
172 14:27:32 Copy Dockerfile.python-3
173 14:27:32 Copy requirements.txt
174 14:27:32 Copy .python-version
175 14:27:32 Copy .pip-version
176 14:27:32 $ echo "$CONTAINER_BUILD_ENVIRONMENT" > $CONTAINER_CONTEXT_PATH/.docker-build-env
177 14:27:32 $ docker_checksum=$(recursive_hash $CONTAINER_CONTEXT_PATH)
178 14:27:32 section_start:1765290452:section_script_step_7[hide_duration=true,collapsed=true] $ if [ "$CONTAINER_IMAGE_NAME" == "" ]; then
179 14:27:32 final_image_name=${CONTAINER_IMAGE_URL}
180 14:27:32 else
181 14:27:32 final_image_name=${CONTAINER_IMAGE_URL}/${CONTAINER_IMAGE_NAME}
182 14:27:32 fi
183 14:27:32 section_end:1765290452:section_script_step_7
184 14:27:32 $ final_image_url=${final_image_name}:${docker_checksum}
185 14:27:32 section_start:1765290452:section_script_step_9[hide_duration=true,collapsed=true] $ if [ "${PIPELINE_IMAGE_REFS}" == "1" ]; then
186 14:27:32 echo $CONTAINER_IMAGE_VARIABLE=${final_image_url}-P${CI_PROJECT_ID}-${CI_PIPELINE_ID} > $CI_PROJECT_DIR/docker_image_build.env
187 14:27:32 else
188 14:27:32 echo $CONTAINER_IMAGE_VARIABLE=$final_image_url > $CI_PROJECT_DIR/docker_image_build.env
189 14:27:32 fi
190 14:27:32 section_end:1765290452:section_script_step_9
191 14:27:32 $ echo ${CONTAINER_IMAGE_VARIABLE}_HASH=$docker_checksum >> $CI_PROJECT_DIR/docker_image_build.env
192 14:27:32 section_start:1765290452:section_script_step_11[hide_duration=true,collapsed=true] $ if [ "${FORCE_BUILD}" != "true" ] || command -v crane &> /dev/null; then
193 14:27:32 echo $REGISTRY_PASSWORD | crane auth login $REGISTRY -u $REGISTRY_USER --password-stdin
194 14:27:32 fi
195 14:27:32 section_end:1765290452:section_script_step_11
196 14:27:32
197 14:27:32 WARNING! Your credentials are stored unencrypted in '/kaniko/.docker/config.json'.
198 14:27:32 Configure a credential helper to remove this warning. See
199 14:27:32 https://docs.docker.com/go/credential-store/
200 14:27:32
201 14:27:32 2025/12/09 14:27:32 logged in via /kaniko/.docker/config.json
202 14:27:32 section_start:1765290452:section_script_step_12[hide_duration=true,collapsed=true] $ if [ "${FORCE_BUILD}" != "true" ] && remote_image_exists "$final_image_url"; then
203 14:27:32 echo "Image already exists, skip the build."
204 14:27:32 echo "$final_image_url"
205 14:27:32 if [[ "$CI_COMMIT_BRANCH" == "$CI_DEFAULT_BRANCH" ]]; then
206 14:27:32 _EXT=""
207 14:27:32 _BACKUP_EXT="-CI${CI_JOB_ID}-$(date '+%Y%m%d')"
208 14:27:32 elif [[ -n "$CI_MERGE_REQUEST_ID" ]]; then
209 14:27:32 _EXT="-MR${CI_MERGE_REQUEST_IID}"
210 14:27:32 _BACKUP_EXT=""
211 14:27:32 elif [[ "$CI_COMMIT_REF_PROTECTED" == "true" ]]; then
212 14:27:32 _EXT="-${CI_COMMIT_REF_SLUG}"
213 14:27:32 _BACKUP_EXT="-CI${CI_JOB_ID}-$(date '+%Y%m%d')"
214 14:27:32 else
215 14:27:32 _EXT="-${CI_COMMIT_REF_SLUG}"
216 14:27:32 _BACKUP_EXT=""
217 14:27:32 fi
218 14:27:32 for _TAG in $CONTAINER_IMAGE_TAG; do
219 14:27:32 echo "Copying ${final_image_url} to ${final_image_name}:${_TAG}${_EXT}"
220 14:27:32 copy_image "${final_image_url}" "${final_image_name}:${_TAG}${_EXT}" "${_BACKUP_EXT}"
221 14:27:32 done
222 14:27:32 if [ "${PIPELINE_IMAGE_REFS}" == "1" ]; then
223 14:27:32 _EXT="-P${CI_PROJECT_ID}-${CI_PIPELINE_ID}"
224 14:27:32 echo "Copying ${final_image_url} to ${final_image_url}${_EXT}"
225 14:27:32 copy_image "${final_image_url}" "${final_image_url}${_EXT}"
226 14:27:32 for _TAG in $CONTAINER_IMAGE_TAG; do
227 14:27:32 echo "Copying ${final_image_url} to ${final_image_name}:${_TAG}${_EXT}"
228 14:27:32 copy_image "${final_image_url}" "${final_image_name}:${_TAG}${_EXT}"
229 14:27:32 done
230 14:27:32 fi
231 14:27:32 exit 0
232 14:27:32 fi
233 14:27:32 section_end:1765290452:section_script_step_12
234 14:27:32 Image already exists, skip the build.
235 14:27:32 registry.scandit.com/internal/gitlab-templates/python:84b74ce43bf4f781828bf5a8fb662404b640c5ec93aad6d8d207b534b1aa643f
236 14:27:32 Copying registry.scandit.com/internal/gitlab-templates/python:84b74ce43bf4f781828bf5a8fb662404b640c5ec93aad6d8d207b534b1aa643f to registry.scandit.com/internal/gitlab-templates/python:3.12-MR507
237 14:27:32 registry.scandit.com/internal/gitlab-templates/python:84b74ce43bf4f781828bf5a8fb662404b640c5ec93aad6d8d207b534b1aa643f
238 14:27:32
239 14:27:32 Grafana Pod-View: https://grafana.scandit.com/d/k8s_views_pods/kubernetes-views-pods?orgId=1&refresh=1m&var-datasource=lu1rmx27z&var-host=ip-10-0-38-72.eu-central-1.compute.internal&var-namespace=gitlab-runner&var-pod=runner-wrxjpbsjx-project-621-concurrent-0-osbial2m&var-resolution=15&from=1765290451000&to=1765290452000
240 14:27:32 Grafana Node-View: https://grafana.scandit.com/d/k8s_views_nodes/kubernetes-views-nodes?orgId=1&refresh=1m&var-datasource=lu1rmx27z&var-node=ip-10-0-38-72.eu-central-1.compute.internal&var-resolution=15s&from=1765290451000&to=1765290452000
241 14:27:32
242 14:27:32
243 14:27:32 section_end:1765290452:step_script
244 14:27:32 +section_start:1765290452:upload_artifacts_on_success
245 14:27:32 +Uploading artifacts for successful job
246 14:27:33 Uploading artifacts...
247 14:27:33 docker_image_build.env: found 1 matching artifact files and directories
248 14:27:33 Uploading artifacts as "dotenv" to coordinator... 201 Created correlation_id=01KC1R5A10C3501JD2GF1C3NMK id=46308182 responseStatus=201 Created token=64_exH2gG
249 14:27:33
250 14:27:33 section_end:1765290453:upload_artifacts_on_success
251 14:27:33 +section_start:1765290453:cleanup_file_variables
252 14:27:33 +Cleaning up project directory and file based variables
253 14:27:34
254 14:27:34 section_end:1765290454:cleanup_file_variables
255 14:27:34 +
256 14:27:34 Job succeeded
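
The "Image already exists, skip the build" path near the end of the log works because the job tags images with a content hash of the build context. A minimal, self-contained sketch of that hashing step (the context directory and file contents below are hypothetical; `recursive_hash` mirrors the helper defined in the job script above):

```shell
#!/bin/sh
# Hash every file's contents plus the directory layout of a build context,
# producing one digest that changes only when the context changes.
set -eu

recursive_hash() {
  dir="$1"
  (find "$dir" -type f -exec sha256sum {} + ; find "$dir" -type d) \
    | sort | sha256sum | cut -d ' ' -f1
}

# Hypothetical build context, echoing the files copied in this job.
ctx=$(mktemp -d)
printf 'FROM python:3.12-bookworm\n' > "$ctx/Dockerfile.python-3"
printf 'requests==2.32.0\n'          > "$ctx/requirements.txt"

h1=$(recursive_hash "$ctx")
# Re-hashing an unchanged context yields the same digest...
h2=$(recursive_hash "$ctx")
# ...while any content change yields a different one.
printf 'urllib3\n' >> "$ctx/requirements.txt"
h3=$(recursive_hash "$ctx")

echo "$h1"
rm -rf "$ctx"
```

The job then builds `final_image_url=${final_image_name}:${docker_checksum}` from this digest and, if `crane manifest` already finds that tag in the registry, skips the build entirely and only copies tags.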
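
The branch in log lines 205–217 selects a tag suffix and an optional dated backup suffix per ref type. A standalone sketch of that selection (the function name `pick_ext` and its positional-argument interface are my own; the branch logic follows the script in the log, collapsing `CI_MERGE_REQUEST_ID`/`CI_MERGE_REQUEST_IID` into one argument):

```shell
#!/bin/sh
# Choose the image-tag suffixes the way the job script does:
#   default branch      -> no suffix,   dated backup tag
#   merge request       -> -MR<iid>,    no backup
#   other protected ref -> -<slug>,     dated backup tag
#   anything else       -> -<slug>,     no backup
# Prints two lines: the tag extension, then the backup extension.
# Usage: pick_ext BRANCH DEFAULT_BRANCH MR_IID PROTECTED SLUG JOB_ID
pick_ext() {
  branch="$1"; default="$2"; mr_iid="$3"; protected="$4"; slug="$5"; job_id="$6"
  stamp=$(date '+%Y%m%d')
  if [ "$branch" = "$default" ]; then
    echo ""; echo "-CI${job_id}-${stamp}"
  elif [ -n "$mr_iid" ]; then
    echo "-MR${mr_iid}"; echo ""
  elif [ "$protected" = "true" ]; then
    echo "-${slug}"; echo "-CI${job_id}-${stamp}"
  else
    echo "-${slug}"; echo ""
  fi
}

# Example: an MR pipeline, matching the "-MR507" suffix seen in this log.
pick_ext "" "main" "507" "false" "mr-branch" "12345"
```

With this suffix in hand, the script copies the content-hash tag to `${final_image_name}:${_TAG}${_EXT}` for each configured tag, exactly as the `Copying ... to ...:3.12-MR507` line shows.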