"Microsoft Copilot Studio Exploit Leaks Sensitive Cloud Data"
Researchers at Tenable discovered and exploited a Server-Side Request Forgery (SSRF) vulnerability in Copilot Studio, Microsoft's tool for creating custom Artificial Intelligence (AI) chatbots. By abusing the flaw, they made the service issue HTTP requests that reached Microsoft's internal infrastructure, including the Instance Metadata Service (IMDS) and internal Cosmos DB instances, exposing sensitive information on internal cloud services and potentially affecting multiple tenants. This article continues to discuss the SSRF bug discovered in Microsoft's chatbot creation tool.
Dark Reading reports "Microsoft Copilot Studio Exploit Leaks Sensitive Cloud Data"
Submitted by grigby1