Last updated: April 24, 2026
- Lets Detrics fully operate transfers and schema migrations
- Scopes all data access to one specific dataset
- Prevents Detrics from seeing or reading any other dataset or table in your project — even their names
Image persistence is not compatible with least-privilege today. If you enable image persistence, the Detrics service account needs project-level roles/storage.admin to auto-create the Cloud Storage bucket that hosts the images — there is no narrower grant that works with the current flow. If your security policy doesn’t allow project-level Storage permissions, leave image persistence disabled. Everything else on this page applies normally.

How It Works
Detrics’ write path requires two distinct kinds of permission:
- The ability to submit jobs to BigQuery (queries, load jobs, DML). Job creation is a project-level action in BigQuery — it cannot be scoped to a dataset.
- The ability to read and write data. This can be scoped to a single dataset.
Job submission alone is not enough: BigQuery returns Access Denied unless the service account also has data permissions on the specific dataset the job touches. So combining a project-level jobUser with a dataset-level dataEditor gives Detrics exactly what it needs and nothing more.
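The two-layer split can be made concrete as plain data. Everything in this sketch — the project ID, dataset name, service-account email, and the `required_grants` helper — is a hypothetical illustration, not part of any Detrics or Google API:

```python
# Sketch of the two grants the least-privilege setup combines.
# All identifiers below are hypothetical examples.

def required_grants(project_id: str, dataset_id: str, sa_email: str) -> dict:
    """Return the project-level and dataset-level grants as plain dicts."""
    return {
        # Project level: job submission only -- no data access.
        "project_binding": {
            "resource": project_id,
            "role": "roles/bigquery.jobUser",
            "member": f"serviceAccount:{sa_email}",
        },
        # Dataset level: read/write data, but only inside this one dataset.
        "dataset_grant": {
            "resource": f"{project_id}.{dataset_id}",
            "role": "roles/bigquery.dataEditor",
            "userByEmail": sa_email,
        },
    }

grants = required_grants("my-project", "detrics_marketing",
                         "detrics@example.iam.gserviceaccount.com")
print(grants["project_binding"]["role"])  # roles/bigquery.jobUser
print(grants["dataset_grant"]["role"])    # roles/bigquery.dataEditor
```

Neither grant alone is sufficient: the first can start jobs but touch no data, the second can touch data only inside the named dataset.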
Required Roles
| Scope | Role | What it allows |
|---|---|---|
| Project | roles/bigquery.jobUser | Submit BigQuery jobs (queries, load jobs, DML). No data access. |
| Dataset (only the one Detrics writes to) | roles/bigquery.dataEditor | Create, read, modify, and delete tables in this dataset. Insert and update data. |
The dataset must be pre-created by you under this configuration — Detrics won’t have permission to create it automatically. See Pre-creating the Dataset below.
Step-by-Step Setup
1. Pre-create the BigQuery Dataset
In Google Cloud Console:
- Navigate to BigQuery
- Click your project → Create Dataset
- Set the Dataset ID (e.g., detrics_marketing) and Data location (must match what you’ll configure in Detrics)
- Click Create Dataset
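If you prefer the CLI to the Console, the same dataset can be pre-created with the `bq` command-line tool. A minimal sketch that only assembles the command; the project, dataset, and location values are hypothetical examples, and you would run the printed command yourself with the bq CLI:

```python
# Build the `bq mk` command that pre-creates the dataset from the CLI.
# Values are hypothetical; substitute your own project, dataset, and location.
def bq_mk_dataset_cmd(project_id: str, dataset_id: str, location: str) -> list[str]:
    return [
        "bq",
        f"--location={location}",   # must match the location configured in Detrics
        "mk",
        "--dataset",                # create a dataset, not a table
        f"{project_id}:{dataset_id}",
    ]

cmd = " ".join(bq_mk_dataset_cmd("my-project", "detrics_marketing", "US"))
print(cmd)
# bq --location=US mk --dataset my-project:detrics_marketing
```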
2. Grant BigQuery Job User at the Project Level
- Go to Google Cloud Console → IAM & Admin → IAM
- Make sure the correct project is selected in the top dropdown
- Click Grant Access
- In New principals, paste your Detrics service account email (shown on your destination page)
- In Select a role, search for and select BigQuery Job User
- Click Save
3. Grant BigQuery Data Editor on Your Dataset
- In BigQuery Console, click your dataset (the one you created in step 1)
- Click SHARING → Permissions
- Click ADD PRINCIPAL
- Paste the Detrics service account email
- Select role BigQuery Data Editor
- Click Save
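To double-check step 3 programmatically, you can fetch the dataset’s access entries (for example with `bq show --format=prettyjson project:dataset`, which lists them under `access`) and confirm the service account appears. A sketch over hypothetical sample entries:

```python
# Check whether a service account holds data-editor rights in a dataset's
# access list. The entries below are hypothetical sample data shaped like
# the `access` array returned by `bq show --format=prettyjson`.
def sa_has_data_editor(access_entries: list[dict], sa_email: str) -> bool:
    # The grant may be reported as the IAM role name or as legacy WRITER.
    editor_roles = {"roles/bigquery.dataEditor", "WRITER"}
    return any(
        entry.get("userByEmail") == sa_email and entry.get("role") in editor_roles
        for entry in access_entries
    )

access = [
    {"role": "OWNER", "userByEmail": "admin@example.com"},
    {"role": "WRITER", "userByEmail": "detrics@example.iam.gserviceaccount.com"},
]
print(sa_has_data_editor(access, "detrics@example.iam.gserviceaccount.com"))  # True
```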
4. Test the Connection
In Detrics, open your destination and click Test Connection. All five stages should pass:
- BigQuery Connectivity
- Project Access
- Dataset Permissions (verifies your pre-created dataset)
- Table Permissions (creates and deletes a test table inside the dataset)
- Job Permissions (loads and reads a test row)
What Detrics Can and Cannot Do
With this configuration, Detrics has access to a precise, verifiable subset of your project.

Can do (within the granted dataset only)
- Create, alter, and drop tables
- Insert, update, and delete rows
- Read data from tables it created
- Run queries that reference only this dataset
- Run schema migrations using its safe temp-table-then-rename pattern
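The temp-table-then-rename migration pattern can be pictured as a short DDL sequence. This is an illustrative sketch, not Detrics’ actual implementation; the table and column names are hypothetical. Note that every statement stays inside the granted dataset, which is why dataset-scoped dataEditor is sufficient:

```python
# Illustrative sketch of a temp-table-then-rename schema migration in
# BigQuery DDL. Not Detrics' actual code; names are hypothetical.
def migration_sql(dataset: str, table: str, select_expr: str) -> list[str]:
    tmp = f"{table}__migrating"
    return [
        # 1. Materialize the new schema next to the live table.
        f"CREATE TABLE `{dataset}.{tmp}` AS SELECT {select_expr} FROM `{dataset}.{table}`",
        # 2. Swap the new table into place.
        f"DROP TABLE `{dataset}.{table}`",
        f"ALTER TABLE `{dataset}.{tmp}` RENAME TO `{table}`",
    ]

stmts = migration_sql("detrics_marketing", "campaigns",
                      "* EXCEPT (spend), CAST(spend AS NUMERIC) AS spend")
for s in stmts:
    print(s)
```

Because the original table exists until step 2, a failure in step 1 leaves the live data untouched.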
Cannot do (anywhere else in your project)
- See the existence or names of other datasets
- List tables or read metadata in other datasets
- Query data from any other dataset
- Create or delete datasets
- Read IAM policies, billing data, or any other GCP resource
Pre-creating the Dataset
In the standard setup, Detrics auto-creates the dataset on first use. Under least-privilege, that creation permission is intentionally not granted, so you create the dataset yourself once during setup. Day-to-day operation does not require any further intervention from you — Detrics manages tables and rows within the dataset on its own. If you ever need to add a second dataset (for example, to separate environments), repeat the dataset-level grant for each one.

Troubleshooting
Test fails at Stage 4 (Table Permissions) with 'datasets.create permission denied'
The dataset doesn’t exist yet and Detrics tried to create it. Pre-create the dataset in BigQuery Console (see Step 1 above) and re-run the test.
Test fails at Stage 1 (BigQuery Connectivity) with 'jobs.create permission denied'
The BigQuery Job User role wasn’t granted at the project level, or hasn’t propagated yet. Confirm the role is assigned in IAM & Admin → IAM, then wait ~60 seconds and retry.

Test fails at Stage 4 (Table Permissions) with 'tables.create permission denied'
The BigQuery Data Editor role wasn’t granted on the dataset itself. Open the dataset in BigQuery Console → SHARING → Permissions and confirm the service account is listed with BigQuery Data Editor.

Service account can read other datasets it shouldn't have access to
This usually means the service account inherited a broader role from somewhere — most commonly a basic Viewer, Editor, or Owner role at the project level, or an explicit grant on another dataset. Audit:
- Project IAM — confirm the service account has only BigQuery Job User
- Each suspicious dataset — open it in BigQuery Console → SHARING → Permissions and remove the service account if present
- Inherited roles — basic project roles (Viewer/Editor/Owner) automatically grant data-level read across all datasets. Look for “Viewers of project X” or “Editors of project X” entries on dataset sharing panels — those are inherited from project-level basic roles
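The project-level part of this audit can be scripted. A sketch that flags unexpected roles, using hypothetical sample bindings shaped like the `bindings` array in `gcloud projects get-iam-policy PROJECT_ID --format=json` output:

```python
# Flag project-level roles held by the service account other than the one
# expected under least-privilege. Sample bindings below are hypothetical.
EXPECTED = {"roles/bigquery.jobUser"}

def unexpected_roles(bindings: list[dict], sa_email: str) -> list[str]:
    member = f"serviceAccount:{sa_email}"
    return sorted(
        b["role"]
        for b in bindings
        if member in b.get("members", []) and b["role"] not in EXPECTED
    )

bindings = [
    {"role": "roles/bigquery.jobUser",
     "members": ["serviceAccount:detrics@example.iam.gserviceaccount.com"]},
    {"role": "roles/editor",  # a basic role: grants read across all datasets
     "members": ["user:admin@example.com",
                 "serviceAccount:detrics@example.iam.gserviceaccount.com"]},
]
print(unexpected_roles(bindings, "detrics@example.iam.gserviceaccount.com"))
# ['roles/editor']
```

An empty result means the service account holds only the expected project-level role; anything else is a candidate for removal.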
When to Use This Setup
Choose least-privilege when:
- Your GCP project is shared with other workloads or contains sensitive data
- Your security or compliance team requires least-privilege access for third-party integrations
- You want a defense-in-depth posture even if Detrics’ credentials were ever compromised
Choose the standard setup instead when:
- The GCP project is dedicated to Detrics (no other data lives there)
- You’d rather not pre-create the dataset yourself
- You need image persistence (requires project-level Storage Admin, not compatible with least-privilege)
- Convenience matters more than scope-limited access in your context