The remote_execute_dataflow_workflow recipe [1], which runs on [2], sometimes fails with:
Full logs at [3]
Traceback (most recent call last):
  File "/b/rr/tmpr6yvcn/w/infra/packages/dataflow/cq_attempts.py", line 247, in <module>
    main()
  File "/b/rr/tmpr6yvcn/w/infra/packages/dataflow/cq_attempts.py", line 243, in main
    p.run()
  File "/b/rr/tmpr6yvcn/w/infra/ENV/local/lib/python2.7/site-packages/apache_beam/pipeline.py", line 176, in run
    return self.runner.run(self)
  File "/b/rr/tmpr6yvcn/w/infra/ENV/local/lib/python2.7/site-packages/apache_beam/runners/dataflow/dataflow_runner.py", line 252, in run
    self.dataflow_client.create_job(self.job), self)
  File "/b/rr/tmpr6yvcn/w/infra/ENV/local/lib/python2.7/site-packages/apache_beam/utils/retry.py", line 168, in wrapper
    return fun(*args, **kwargs)
  File "/b/rr/tmpr6yvcn/w/infra/ENV/local/lib/python2.7/site-packages/apache_beam/runners/dataflow/internal/apiclient.py", line 439, in create_job
    return self.submit_job_description(job)
  File "/b/rr/tmpr6yvcn/w/infra/ENV/local/lib/python2.7/site-packages/apache_beam/utils/retry.py", line 168, in wrapper
    return fun(*args, **kwargs)
  File "/b/rr/tmpr6yvcn/w/infra/ENV/local/lib/python2.7/site-packages/apache_beam/runners/dataflow/internal/apiclient.py", line 477, in submit_job_description
    response = self._client.projects_locations_jobs.Create(request)
  File "/b/rr/tmpr6yvcn/w/infra/ENV/local/lib/python2.7/site-packages/apache_beam/runners/dataflow/internal/clients/dataflow/dataflow_v1b3_client.py", line 553, in Create
    config, request, global_params=global_params)
  File "/b/rr/tmpr6yvcn/w/infra/ENV/local/lib/python2.7/site-packages/apitools/base/py/base_api.py", line 723, in _RunMethod
    return self.ProcessHttpResponse(method_config, http_response, request)
  File "/b/rr/tmpr6yvcn/w/infra/ENV/local/lib/python2.7/site-packages/apitools/base/py/base_api.py", line 729, in ProcessHttpResponse
    self.__ProcessHttpResponse(method_config, http_response, request))
  File "/b/rr/tmpr6yvcn/w/infra/ENV/local/lib/python2.7/site-packages/apitools/base/py/base_api.py", line 600, in __ProcessHttpResponse
    http_response.request_url, method_config, request)
apitools.base.py.exceptions.HttpError: HttpError accessing <https://dataflow.googleapis.com/v1b3/projects/chrome-infra-events/locations/us-central1/jobs?alt=json>: response: <{'status': '409', 'content-length': '337', 'x-xss-protection': '1; mode=block', 'x-content-type-options': 'nosniff', 'transfer-encoding': 'chunked', 'vary': 'Origin, X-Origin, Referer', 'server': 'ESF', '-content-encoding': 'gzip', 'cache-control': 'private', 'date': 'Tue, 05 Jun 2018 14:47:23 GMT', 'x-frame-options': 'SAMEORIGIN', 'content-type': 'application/json; charset=UTF-8'}>, content <{
  "error": {
    "code": 409,
    "message": "(9cbd709750c708a4): The workflow could not be created. Causes: (75f6647baeec6540): There is already an active job named cq-attempts-2f6ac2cff6f9df70d4c7fae956ec0e02dccd1213. If you want to submit a second job, try again by setting a different name.",
    "status": "ALREADY_EXISTS"
  }
}
>
[1] https://cs.chromium.org/chromium/infra/recipes/recipes/remote_execute_dataflow_workflow.py
[2] https://uberchromegw.corp.google.com/i/internal.infra.cron/builders/dataflow-workflow-cq-attempts
[3] https://logs.chromium.org/v/?s=infra-internal%2Fbb%2Finternal.infra.cron%2Fdataflow-workflow-cq-attempts%2F7689%2F%2B%2Frecipes%2Fsteps%2FRemote_execute%2F0%2Fstdout
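The 409 ALREADY_EXISTS above means the job was submitted with a fixed name (cq-attempts-<revision hash>) while a previous job with that exact name was still active. One possible mitigation (a sketch only, not the recipe's actual code; unique_job_name is a hypothetical helper) is to append a timestamp when building the --job_name the recipe passes to the pipeline, so resubmissions never collide:

```python
import re
import time


def unique_job_name(base):
    """Derive a Dataflow job name that will not collide with an
    earlier, still-active submission of the same workflow.

    Dataflow requires job names to match [a-z]([-a-z0-9]*[a-z0-9])?,
    so the result is lowercased and invalid characters become hyphens.
    """
    # UTC timestamp suffix makes each submission unique.
    suffix = time.strftime('%Y%m%d-%H%M%S', time.gmtime())
    name = '%s-%s' % (base, suffix)
    return re.sub(r'[^-a-z0-9]', '-', name.lower())
```

The recipe could then pass this value via the pipeline's --job_name option instead of the bare revision-based name; the trade-off is that the Dataflow console no longer shows a single stable job name per builder.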
Comment 1 by mmoss@google.com, Jun 7 2018