GitLab Label Change Notifications

January 23, 2020


A large team of developers is working hard to hit a timeline that is closing quickly. The number of merge requests has gone up significantly. Branches created off of branches cause the same code to appear in multiple merge requests. Merge request reviews fall behind. Communication is lacking.


We need a way to mark a merge request as "Needs Work," and GitLab doesn't currently have this feature. Getting people to send a Slack notification to the team every time a merge request is updated or reviewed is challenging. We need an automated way of sending Slack notifications to the team.


Here is the solution I came up with. GitLab has a feature called Labels, which lets you, among other things, categorize merge requests. We can use labels to accomplish our goal. First, I created a label called "Needs Work". This label can be added to any merge request and is visible on the Merge Requests page. That gets us part of the way to implementing our "Needs Work" solution. Next, we need a way of sending notifications to Slack. Unfortunately, GitLab does not send notifications when a merge request label changes, so a custom solution is required.
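If you'd rather create the label from code than from the UI, python-gitlab can do that too. A minimal sketch (the color value is my own choice, and project is an already-fetched python-gitlab project object):

```python
def needs_work_attrs(color="#FF0000"):
    # Attribute dict accepted by python-gitlab's project.labels.create()
    return {"name": "Needs Work", "color": color}

# With a connected python-gitlab project object:
# label = project.labels.create(needs_work_attrs())
```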

I threw together a quick Python script that uses GitLab's resource label events API. Every minute, via cron, we check all open merge requests for label events, filtering for the one label we want to notify on.
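The core of that filtering step is simple: walk a merge request's label events and keep the last one that touches our label. As a standalone sketch, with event dicts mimicking the shape the API returns:

```python
LABEL_ID = 124

def latest_label_event(events, label_id=LABEL_ID):
    """Return the most recent event for the given label, or None."""
    match = None
    for event in events:
        if event['label']['id'] == label_id:
            match = event  # events arrive oldest-first, so keep overwriting
    return match

events = [
    {'id': 10, 'action': 'add', 'label': {'id': 124, 'name': 'Needs Work'}},
    {'id': 11, 'action': 'add', 'label': {'id': 7, 'name': 'Backend'}},
    {'id': 12, 'action': 'remove', 'label': {'id': 124, 'name': 'Needs Work'}},
]
```

With the sample data above, the function returns the event with id 12, the most recent "Needs Work" change.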

Optional: I also created labels for each member of the team so they can assign themselves to a merge request.


First, the configuration file for python-gitlab (by default, ~/.python-gitlab.cfg):

[global]
default = general
ssl_verify = true
timeout = 5
per_page = 100

[general]
url = https://gitlab.<your-company>.com
private_token = <your-token>
api_version = 4
Next, the script itself:

import gitlab
import json
import slack

DATA_FILE = 'data.json'
LABEL_ID = 124
GROUP_ID = 0  # replace with your GitLab group ID
slack_token = '<your-slack-token>'

# Initiate a Slack client
client = slack.WebClient(token=slack_token)

# Initiate a GitLab object from the config file above
gl = gitlab.Gitlab.from_config('general')

# Load previous data from our file
with open(DATA_FILE) as json_file:
  # Convert to integer keys instead of strings
  data = json.load(json_file, object_hook=lambda d: {int(k) if k.lstrip('-').isdigit() else k: v for k, v in d.items()})

# Load the group object
group = gl.groups.get(GROUP_ID)
# Get all merge requests from the group with the opened state
merges = group.mergerequests.list(state='opened')
for merge in merges:
  try:
    # Load the project object
    project = gl.projects.get(merge.project_id)
    # Load the merge request object
    mergerequest = project.mergerequests.get(merge.iid)
    # List the label events for this merge request
    events = mergerequest.resourcelabelevents.list()

    # Get the last event that matches our label id
    event = None
    for eve in events:
      if eve.label['id'] == LABEL_ID:
        event = eve
    if event is None:
      continue

    # Check if our project key is already set
    if merge.project_id not in data:
      data[merge.project_id] = {}

    preveventid = data[merge.project_id].get(mergerequest.iid, 0)
    curreventid = event.id

    # Check if the event has changed since the last run
    if preveventid != curreventid:
      data[merge.project_id][mergerequest.iid] = curreventid
      label = event.label['name']
      action = 'added' if event.action == 'add' else 'removed'
      prourl = project.web_url
      propath = project.path_with_namespace
      mrurl = mergerequest.web_url
      mrref = mergerequest.reference
      mrsrc = mergerequest.source_branch
      # Send the message to the author directly if updates are needed
      #  or send the message to the team for additional review
      channel = "saas" if action == 'removed' else "@" + merge.author['username']

      message = "Label *" + label + "* " + action + " from <" + mrurl + "|" + mrref + " *" + mrsrc + "*> in <" + prourl + "|" + propath + ">"

      # Send the message
      client.chat_postMessage(channel=channel, text=message)
  except Exception:
    print("An exception occurred")

# Save the data to our file
with open(DATA_FILE, 'w') as outfile:
  json.dump(data, outfile)
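One subtlety worth calling out: json.dump writes dictionary keys as strings, so project IDs stored as integers would come back as strings on the next run and the comparison would never match. That is what the object_hook in the script handles. Here it is in isolation:

```python
import json

def int_keys(d):
    # Convert digit-string keys back to ints (handles negatives too)
    return {int(k) if k.lstrip('-').isdigit() else k: v for k, v in d.items()}

data = {42: {7: 12}}                           # project 42, MR 7, last event 12
raw = json.dumps(data)                         # keys are written as strings
naive = json.loads(raw)                        # {'42': {'7': 12}} -- string keys
fixed = json.loads(raw, object_hook=int_keys)  # {42: {7: 12}} -- ints restored
```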
Finally, schedule the script to run every minute with cron:

crontab -e
* * * * * cd /root/cron && /usr/bin/python3 <script-name>.py > labels.log
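For reference, the message the script posts uses Slack's mrkdwn syntax, where <url|text> renders as a link and *text* renders bold. Pulled out into a standalone function (the parameter names are just for illustration), the formatting is:

```python
def format_message(label, action, mrurl, mrref, mrsrc, prourl, propath):
    # Slack mrkdwn: <url|text> is a link, *text* is bold
    return ("Label *" + label + "* " + action + " from <" + mrurl + "|" + mrref
            + " *" + mrsrc + "*> in <" + prourl + "|" + propath + ">")

msg = format_message("Needs Work", "added",
                     "https://gitlab.example.com/g/p/-/merge_requests/7", "!7",
                     "feature-x", "https://gitlab.example.com/g/p", "g/p")
```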



I wrote this script in a couple of hours, and I know it could be improved and optimized. If you end up taking this and making it better, let me know so I can update this post. Or if you find it helpful and want to share, let me know via a tweet.