Today I ran into a bit of a pickle 🥒. I was upgrading GitLab on an Ubuntu server using Aptitude. I’ve done this hundreds of times, and GitLab has never had the slightest issue updating for me in the past. Today it was time for a new experience.
There was an error running gitlab-ctl reconfigure:

storage_directory[/var/opt/gitlab/.ssh] (gitlab::gitlab-shell line 38) had an error: Mixlib::ShellOut::ShellCommandFailed: ruby_block[directory resource: /var/opt/gitlab/.ssh] (/opt/gitlab/embedded/cookbooks/cache/cookbooks/package/resources/storage_directory.rb line 33) had an error: Mixlib::ShellOut::ShellCommandFailed: Expected process to exit with [0], but received '1'
---- Begin output of chgrp git /var/opt/gitlab/.ssh ----
STDOUT:
STDERR: chgrp: invalid group: ‘git’
---- End output of chgrp git /var/opt/gitlab/.ssh ----
Ran chgrp git /var/opt/gitlab/.ssh returned 1
So, to make a long story short… When GitLab is updated, it runs the command gitlab-ctl reconfigure. This command essentially performs some checks to make sure that everything is upgraded and set up correctly. One of its internal steps ensures that a particular folder exists and has the correct permissions.
# In /opt/gitlab/embedded/cookbooks/cache/cookbooks/package/resources/storage_directory.rb
33:   ruby_block "directory resource: #{new_resource.path}" do
34:     block do
35:       # Ensure the directory exists
36:       storage_helper.ensure_directory_exists(new_resource.path)
37:
38:       # Ensure the permissions are set
39:       storage_helper.ensure_permissions_set(new_resource.path)
40:
41:       # Error out if we have not achieved the target permissions
42:       storage_helper.validate!(new_resource.path)
43:     end
44:     not_if { storage_helper.validate(new_resource.path) }
45:   end
46: end
This code failed to set the group of the folder /var/opt/gitlab/.ssh because the required group, git, supposedly does not exist. Though the error seems fairly obvious, the problem I faced was that the group actually does exist.
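If you run into the same error, it is worth verifying this for yourself first. A quick way to check (assuming the group is called git, as in the error above) is to query the group database as root:

# Ask the name service switch whether the group resolves
getent group git

# Or inspect the group file directly (root can always read it)
grep '^git:' /etc/group

Both commands should print a git entry if the group really exists, which is exactly what made the error so puzzling.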
Luckily, I remembered that after setting up this server I had also done some server hardening. One step in that process was to restrict access permissions on specific system files. So I went and checked which files I had modified, and one of them happened to be /etc/group.

What I had done was take away read privileges on those system files so they could only be accessed by their owner, the root user. This is typically not an issue, but it turns out that even though Aptitude was being run by the root user, the gitlab-ctl process must have been trying to set the group of that folder as a different user account, one that had no permission to read /etc/group.
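You can reproduce the symptom without touching GitLab at all. The sketch below assumes the group is named git, that group lookups go through /etc/group (the default files backend), that the file has already been restricted to root-only access, and it uses nobody as a stand-in for whatever unprivileged account the reconfigure step actually ran as (the real account is an assumption on my part):

# As root, the lookup works because root can read /etc/group
getent group git

# As an unprivileged user, the files backend of the name service
# switch cannot read /etc/group, so the lookup comes back empty
sudo -u nobody getent group git

# chgrp resolves the group name the same way, so it reports the
# group as invalid before it even touches the target directory
mkdir -p /tmp/chgrp-test
sudo -u nobody chgrp git /tmp/chgrp-test    # chgrp: invalid group: ‘git’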
The solution: sudo chmod og+r /etc/group, to permit users other than root to read the file again. After having Aptitude retry the upgrade process, everything worked great!
After you have finished the upgrade, you can go ahead and revert the permission change (sudo chmod og-r /etc/group). But keep in mind that the same issue will probably happen again the next time you upgrade GitLab.
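If you would rather not leave /etc/group world-readable between upgrades, you can wrap the whole thing in a few commands. This is just a sketch of the workaround described above, not an official GitLab procedure, and the aptitude invocation stands in for however you normally run the upgrade:

# Temporarily allow non-root users to read the group file
sudo chmod og+r /etc/group

# Run the upgrade as usual (e.g. via Aptitude)
sudo aptitude safe-upgrade

# Restore the hardened permissions afterwards
sudo chmod og-r /etc/group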