Severity
high

Hijacking Azure Machine Learning Notebooks

Published Wed, Jan 8th, 2025
Platforms

Summary

Azure Machine Learning (AML) notebooks can be hijacked by attackers with access to the workspace's Storage Account, allowing them to inject malicious code. A now-fixed vulnerability allowed attackers holding only the Reader role to escalate to code execution. The article details the attack methods, including modifying notebook files, obtaining managed identity tokens, and exfiltrating data. It also introduces a tool for dumping AML workspace credentials.
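The injection step relies on the fact that a notebook is plain JSON stored as a blob in the workspace's Storage Account. A minimal sketch of what tampering could look like, assuming the attacker has already downloaded the file (the payload and helper name are illustrative, not from the original article):

```python
import json

def inject_cell(notebook_json: str, payload: str) -> str:
    """Hypothetical sketch: prepend a malicious code cell to an .ipynb's JSON."""
    nb = json.loads(notebook_json)
    malicious_cell = {
        "cell_type": "code",
        "execution_count": None,
        "metadata": {},
        "outputs": [],
        "source": [payload],
    }
    # Prepend so the payload runs before the victim's own cells.
    nb["cells"].insert(0, malicious_cell)
    return json.dumps(nb)
```

The attacker would then upload the modified blob back to the Storage Account; the injected cell executes with the compute instance's managed identity the next time the victim runs the notebook.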

Affected Services

Azure Machine Learning, Azure Storage

Remediation

Restrict Storage Account access, monitor for suspicious modifications to .ipynb files, and ensure proper access controls on AML workspaces.
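One way to implement the "monitor for suspicious modifications" portion of the remediation is simple integrity baselining of notebook files. A minimal sketch, assuming the notebook bytes have already been fetched from the Storage Account (blob retrieval, e.g. via azure-storage-blob, is out of scope here):

```python
import hashlib

def baseline(notebooks: dict[str, bytes]) -> dict[str, str]:
    """Record a SHA-256 digest for each .ipynb (name -> hex digest)."""
    return {name: hashlib.sha256(data).hexdigest() for name, data in notebooks.items()}

def changed_notebooks(notebooks: dict[str, bytes], known: dict[str, str]) -> list[str]:
    """Return names of notebooks whose content no longer matches the baseline."""
    return [
        name
        for name, data in notebooks.items()
        if known.get(name) != hashlib.sha256(data).hexdigest()
    ]
```

Any notebook flagged by such a check should be reviewed before it is executed, since an injected cell runs with the compute's managed identity.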

Tracked CVEs

No tracked CVEs

References

Entry Status
Stub (AI-Generated)
Disclosure Date
-
Exploitability Period
-
Known ITW Exploitation
-
Detection Methods
Monitor Azure Activity Logs for suspicious API calls to list AML workspace secrets and Storage Account keys. Set up diagnostics to detect anomalous modifications to .ipynb files in associated Storage Accounts.
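The Activity Log check above can be sketched as a filter over exported log records. This assumes logs have been exported as JSON; the record shape varies by export path (the REST API wraps the operation name in an object, event-hub exports use a plain string), so both are handled:

```python
# Standard Azure resource-provider actions for listing keys; matching either
# indicates someone enumerated Storage Account or AML workspace credentials.
SUSPICIOUS_OPERATIONS = {
    "Microsoft.Storage/storageAccounts/listKeys/action",
    "Microsoft.MachineLearningServices/workspaces/listKeys/action",
}

def flag_key_listing(records: list[dict]) -> list[dict]:
    """Return Activity Log records for key-listing operations."""
    flagged = []
    for rec in records:
        op = rec.get("operationName")
        if isinstance(op, dict):  # REST API shape: {"value": ..., "localizedValue": ...}
            op = op.get("value")
        if op in SUSPICIOUS_OPERATIONS:
            flagged.append(rec)
    return flagged
```

Flagged records can then be correlated with the caller identity and source IP fields to spot enumeration from unexpected principals.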
Piercing Index Rating
-
Discovered by
Karl Fosaaen, NetSPI