Some time ago, our self-hosted GitLab instance started rejecting CI artifact uploads because the archives were too big:
ERROR: Uploading artifacts as "archive" to coordinator... too large archive id=something responseStatus=413 Request Entity Too Large status=413 token=something FATAL: too large
ERROR: Job failed: exit code 1
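For context, the error shows up at the end of any job that declares artifacts whose resulting archive exceeds the configured limit. A minimal `.gitlab-ci.yml` sketch (the job name and paths are made up for illustration):

```yaml
# Hypothetical job: any job with an "artifacts" section whose archive
# is larger than the allowed size fails with the 413 error above.
build:
  script:
    - make dist
  artifacts:
    paths:
      - dist/
    expire_in: 1 week
```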
The only resolution idea we found was to raise the maximum build artifacts size (it's under /admin/application_settings). On its own this did not work for us; the error still occurred.
Reference articles:
From the official GitLab docs:
The maximum size of the job artifacts can be set at the instance, group, or project level.
The value is in MB and the default is 100 MB per job.
To change it at the instance level:
On the top bar, select Menu > Admin.
On the left sidebar, select Settings > CI/CD.
Change the value of maximum artifacts size (in MB).
Select Save changes for the changes to take effect.
Group level (this overrides the instance setting):
To change it at the group level:
Go to the group’s Settings > CI/CD > General Pipelines.
Change the value of maximum artifacts size (in MB).
Select Save changes for the changes to take effect.
Project level (this overrides the instance and group settings):
To change it at the project level:
Go to the project’s Settings > CI/CD > General Pipelines.
Change the value of maximum artifacts size (in MB).
Select Save changes for the changes to take effect.
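The instance-level setting can also be changed through the GitLab REST API (`PUT /api/v4/application/settings` with the `max_artifacts_size` parameter). The host and token below are placeholders:

```shell
# Requires an admin personal access token; host, token, and the
# 500 MB value are placeholders -- adjust to your instance.
curl --request PUT \
  --header "PRIVATE-TOKEN: <admin-token>" \
  "https://gitlab.example.com/api/v4/application/settings?max_artifacts_size=500"
```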
Ref: GitLab docs, "Maximum artifacts size"

The solution to this issue is twofold: raise the maximum artifacts size (under /admin/application_settings) and also increase the client_max_body_size of the NGINX bundled with GitLab. NGINX rejects any request body larger than client_max_body_size with 413 Request Entity Too Large before GitLab ever sees it, which is why changing the application setting alone was not enough.
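On an Omnibus install, the bundled NGINX limit lives in /etc/gitlab/gitlab.rb. A sketch of the change (the 1024m value is just an example; pick something above your largest expected artifact):

```ruby
# /etc/gitlab/gitlab.rb
# Raise the request-body limit of the NGINX bundled with GitLab.
# Uploads above this limit are rejected by NGINX with
# "413 Request Entity Too Large" before reaching GitLab itself.
nginx['client_max_body_size'] = '1024m'
```

Apply the change with `sudo gitlab-ctl reconfigure`.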