Policy types for @turbot/gcp-dataflow

GCP > Dataflow > API Enabled

Check whether the GCP Dataflow API is enabled.

API Enabled refers specifically to the API state of a service in a cloud project.
This control determines whether the API state matches the desired level.

The GCP > Dataflow > API Enabled control compares
the API state against the API Enabled policies,
raises an alarm, and takes the defined enforcement action.

URI
tmod:@turbot/gcp-dataflow#/policy/types/dataflowApiEnabled
Valid Value
[
  "Skip",
  "Check: Disabled",
  "Check: Enabled",
  "Check: Enabled if Dataflow > Enabled",
  "Enforce: Disabled",
  "Enforce: Enabled",
  "Enforce: Enabled if Dataflow > Enabled"
]
Schema
{
  "type": "string",
  "enum": [
    "Skip",
    "Check: Disabled",
    "Check: Enabled",
    "Check: Enabled if Dataflow > Enabled",
    "Enforce: Disabled",
    "Enforce: Enabled",
    "Enforce: Enabled if Dataflow > Enabled"
  ],
  "default": "Skip"
}

GCP > Dataflow > Approved Regions [Default]

A list of GCP regions in which GCP Dataflow resources are approved for use.

The expected format is an array of region names. You may use the '*' and
'?' wildcard characters.

This policy is the default value for all GCP Dataflow resources' Approved > Regions policies.
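
For example, a policy value that approves one named region plus every US region might look like this (an illustrative sketch; substitute the regions your organization actually approves):

- 'europe-west2'
- 'us-*'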

URI
tmod:@turbot/gcp-dataflow#/policy/types/dataflowApprovedRegionsDefault
Default Template Input
"{\n regions: policyValue(uri:\"tmod:@turbot/gcp#/policy/types/approvedRegionsDefault\") {\n value\n }\n}\n"
Default Template
"{% if $.regions.value | length == 0 %} [] {% endif %}{% for item in $.regions.value %}- '{{ item }}'\n{% endfor %}"

GCP > Dataflow > CMDB

Configure whether to record and synchronize details for GCP Dataflow into the CMDB.

The CMDB control is responsible for populating and updating all the attributes for that resource type in the Guardrails CMDB.
All policies and controls in Guardrails are based around the resource, so usually the CMDB policy is set to "Enforce: Enabled".

If set to Skip, all changes to the CMDB are paused - no new resources will be discovered, no updates will be made, and deleted resources will not be removed.

To clean up resources and stop tracking changes, set this policy to "Enforce: Disabled".

CMDB controls also use the Regions policy associated with the resource. If a resource's region is not in the GCP > Dataflow > Regions policy, the CMDB control will delete the resource from the CMDB.

(Note: Setting CMDB to "Skip" will also pause these changes.)

URI
tmod:@turbot/gcp-dataflow#/policy/types/dataflowCmdb
Valid Value
[
  "Skip",
  "Enforce: Enabled",
  "Enforce: Disabled"
]
Schema
{
  "type": "string",
  "enum": [
    "Skip",
    "Enforce: Enabled",
    "Enforce: Disabled"
  ],
  "example": [
    "Skip"
  ],
  "default": "Enforce: Enabled"
}

GCP > Dataflow > Enabled

Configure whether GCP Dataflow is enabled.

URI
tmod:@turbot/gcp-dataflow#/policy/types/dataflowEnabled
Valid Value
[
  "Enabled",
  "Enabled: Metadata Only",
  "Disabled"
]
Schema
{
  "type": "string",
  "enum": [
    "Enabled",
    "Enabled: Metadata Only",
    "Disabled"
  ],
  "example": [
    "Enabled"
  ],
  "default": "Disabled"
}

GCP > Dataflow > Job > Active

Determine the action to take when a GCP Dataflow job is not active, based on the GCP > Dataflow > Job > Active > * policies.

The control determines whether the resource is in active use, and if not,
has the ability to delete / cleanup the resource. When running an automated
compliance environment, it's common to end up with a wide range of alarms
that are difficult and time consuming to clear. The Active control brings
automated, well-defined control to this process.

The Active control checks the status of all defined Active policies for the
resource (GCP > Dataflow > Job > Active > *), raises an alarm, and takes the defined enforcement
action. Each Active sub-policy can calculate a status of active, inactive
or skipped. Generally, if the resource appears to be Active for any reason
it will be considered Active.
Note the contrast with Approved, where if the
resource appears to be Unapproved for any reason it will be considered
Unapproved.

See Active for more information.

URI
tmod:@turbot/gcp-dataflow#/policy/types/jobActive
Valid Value
[
  "Skip",
  "Check: Active"
]
Schema
{
  "type": "string",
  "enum": [
    "Skip",
    "Check: Active"
  ],
  "example": [
    "Check: Active"
  ],
  "default": "Skip"
}

GCP > Dataflow > Job > Active > Age

The age after which the GCP Dataflow job
is no longer considered active. If a create time is unavailable, the time Guardrails discovered the resource is used.

The Active
control determines whether the resource is in active use, and if not, has
the ability to delete / cleanup the resource. When running an automated
compliance environment, it's common to end up with a wide range of alarms
that are difficult and time consuming to clear. The Active control brings
automated, well-defined control to this process.

The Active control checks the status of all defined Active policies for the
resource (GCP > Dataflow > Job > Active > *),
raises an alarm, and takes the defined enforcement action. Each Active
sub-policy can calculate a status of active, inactive or skipped. Generally,
if the resource appears to be Active for any reason it will be considered Active.
Note the contrast with Approved, where if the resource appears to be Unapproved
for any reason it will be considered Unapproved.

See Active for more information.

URI
tmod:@turbot/gcp-dataflow#/policy/types/jobActiveAge
Valid Value
[
  "Skip",
  "Force inactive if age > 1 day",
  "Force inactive if age > 3 days",
  "Force inactive if age > 7 days",
  "Force inactive if age > 14 days",
  "Force inactive if age > 30 days",
  "Force inactive if age > 60 days",
  "Force inactive if age > 90 days",
  "Force inactive if age > 180 days",
  "Force inactive if age > 365 days"
]
Schema
{
  "type": "string",
  "enum": [
    "Skip",
    "Force inactive if age > 1 day",
    "Force inactive if age > 3 days",
    "Force inactive if age > 7 days",
    "Force inactive if age > 14 days",
    "Force inactive if age > 30 days",
    "Force inactive if age > 60 days",
    "Force inactive if age > 90 days",
    "Force inactive if age > 180 days",
    "Force inactive if age > 365 days"
  ],
  "example": [
    "Force inactive if age > 90 days"
  ],
  "default": "Skip"
}

GCP > Dataflow > Job > Active > Last Modified

The number of days since the GCP Dataflow job was last modified before it is considered
inactive.

The Active
control determines whether the resource is in active use, and if not, has
the ability to delete / cleanup the resource. When running an automated
compliance environment, it's common to end up with a wide range of alarms
that are difficult and time consuming to clear. The Active control brings
automated, well-defined control to this process.

The Active control checks the status of all defined Active policies for the
resource (GCP > Dataflow > Job > Active > *), raises an alarm, and takes the defined enforcement
action. Each Active sub-policy can calculate a status of active, inactive
or skipped. Generally, if the resource appears to be Active for any reason
it will be considered Active.
Note the contrast with Approved, where if the
resource appears to be Unapproved for any reason it will be considered
Unapproved.

URI
tmod:@turbot/gcp-dataflow#/policy/types/jobActiveLastModified
Valid Value
[
  "Skip",
  "Active if last modified <= 1 day",
  "Active if last modified <= 3 days",
  "Active if last modified <= 7 days",
  "Active if last modified <= 14 days",
  "Active if last modified <= 30 days",
  "Active if last modified <= 60 days",
  "Active if last modified <= 90 days",
  "Active if last modified <= 180 days",
  "Active if last modified <= 365 days",
  "Force active if last modified <= 1 day",
  "Force active if last modified <= 3 days",
  "Force active if last modified <= 7 days",
  "Force active if last modified <= 14 days",
  "Force active if last modified <= 30 days",
  "Force active if last modified <= 60 days",
  "Force active if last modified <= 90 days",
  "Force active if last modified <= 180 days",
  "Force active if last modified <= 365 days"
]
Schema
{
  "type": "string",
  "enum": [
    "Skip",
    "Active if last modified <= 1 day",
    "Active if last modified <= 3 days",
    "Active if last modified <= 7 days",
    "Active if last modified <= 14 days",
    "Active if last modified <= 30 days",
    "Active if last modified <= 60 days",
    "Active if last modified <= 90 days",
    "Active if last modified <= 180 days",
    "Active if last modified <= 365 days",
    "Force active if last modified <= 1 day",
    "Force active if last modified <= 3 days",
    "Force active if last modified <= 7 days",
    "Force active if last modified <= 14 days",
    "Force active if last modified <= 30 days",
    "Force active if last modified <= 60 days",
    "Force active if last modified <= 90 days",
    "Force active if last modified <= 180 days",
    "Force active if last modified <= 365 days"
  ],
  "example": [
    "Active if last modified <= 90 days"
  ],
  "default": "Skip"
}

GCP > Dataflow > Job > Approved

Determine the action to take when a GCP Dataflow job is not approved, based on the GCP > Dataflow > Job > Approved > * policies.

The Approved control checks the status of the defined Approved sub-policies for the resource. If the resource is not approved according to any of these policies, this control raises an alarm and takes the defined enforcement action.

For any enforcement actions that specify "if new", e.g., "Enforce: Delete unapproved if new", this control will only take the enforcement action for resources created within the last 60 minutes.

See Approved for more information.

URI
tmod:@turbot/gcp-dataflow#/policy/types/jobApproved
Valid Value
[
  "Skip",
  "Check: Approved"
]
Schema
{
  "type": "string",
  "enum": [
    "Skip",
    "Check: Approved"
  ],
  "example": [
    "Check: Approved"
  ],
  "default": "Skip"
}

GCP > Dataflow > Job > Approved > Custom

Determine whether the GCP Dataflow job is allowed to exist.
This policy will be evaluated by the Approved control. If a GCP Dataflow job is not approved, it will be subject to the action specified in the GCP > Dataflow > Job > Approved policy.
See Approved for more information.

Note: The policy value must be a string with a value of "Approved", "Not approved" or "Skip", or in the form of YAML objects. The object(s) must contain the key result with its value as Approved or Not approved. A custom title and message can also be added using the keys title and message respectively.
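
As a minimal sketch, a value using the YAML object form (the title and message strings here are placeholders) could be:

- title: Owner label
  result: Not approved
  message: Job is missing a required owner label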

URI
tmod:@turbot/gcp-dataflow#/policy/types/jobApprovedCustom
Schema
{
  "example": [
    "Approved",
    "Not approved",
    "Skip",
    {
      "result": "Approved"
    },
    {
      "title": "string",
      "result": "Not approved"
    },
    {
      "title": "string",
      "result": "Approved",
      "message": "string"
    },
    [
      {
        "title": "string",
        "result": "Approved",
        "message": "string"
      },
      {
        "title": "string",
        "result": "Not approved",
        "message": "string"
      }
    ]
  ],
  "anyOf": [
    {
      "type": "array",
      "items": {
        "type": "object",
        "properties": {
          "title": {
            "type": "string",
            "pattern": "^[\\W\\w]{1,32}$"
          },
          "message": {
            "type": "string",
            "pattern": "^[\\W\\w]{1,128}$"
          },
          "result": {
            "type": "string",
            "pattern": "^(Approved|Not approved|Skip)$"
          }
        },
        "required": [
          "result"
        ],
        "additionalProperties": false
      }
    },
    {
      "type": "object",
      "properties": {
        "title": {
          "type": "string",
          "pattern": "^[\\W\\w]{1,32}$"
        },
        "message": {
          "type": "string",
          "pattern": "^[\\W\\w]{1,128}$"
        },
        "result": {
          "type": "string",
          "pattern": "^(Approved|Not approved|Skip)$"
        }
      },
      "required": [
        "result"
      ],
      "additionalProperties": false
    },
    {
      "type": "string",
      "pattern": "^(Approved|Not approved|Skip)$"
    }
  ],
  "default": "Skip"
}

GCP > Dataflow > Job > Approved > Encryption at Rest

Define the minimum level of encryption required for GCP > Dataflow > Job.
This policy will be evaluated by the Approved control. If a GCP Dataflow job does not meet the minimum encryption level specified, it will be subject to the action specified in the GCP > Dataflow > Job > Approved policy.
See Approved for more information.

URI
tmod:@turbot/gcp-dataflow#/policy/types/jobApprovedEncryptionAtRest
Valid Value
[
  "Google managed key",
  "Google managed key or higher",
  "Customer managed key",
  "Customer managed key or higher",
  "Encryption at Rest > Customer Managed Key"
]
Schema
{
  "type": "string",
  "enum": [
    "Google managed key",
    "Google managed key or higher",
    "Customer managed key",
    "Customer managed key or higher",
    "Encryption at Rest > Customer Managed Key"
  ],
  "example": [
    "Google managed key"
  ],
  "default": "Google managed key or higher"
}

GCP > Dataflow > Job > Approved > Encryption at Rest > Customer Managed Key

The ID of a GCP KMS symmetric key that must be used as the encryption key for a GCP > Dataflow > Job.
This policy will be evaluated by the Approved control. If a GCP Dataflow job is not encrypted with the specified key, it will be subject to the action specified in the GCP > Dataflow > Job > Approved policy.
See Approved for more information.

URI
tmod:@turbot/gcp-dataflow#/policy/types/jobApprovedEncryptionAtRestCustomerManagedKey
Schema
{
  "type": "string",
  "example": "projects/my-kms-project/locations/us-east1/keyRings/my-keyring/cryptoKeys/my-key",
  "default": ""
}

GCP > Dataflow > Job > Approved > Regions

A list of GCP regions in which GCP Dataflow jobs are approved for use.

The expected format is an array of region names. You may use the '*' and '?' wildcard characters.

This policy will be evaluated by the Approved control. If a GCP Dataflow job is created in a region that is not in the approved list, it will be subject to the action specified in the GCP > Dataflow > Job > Approved policy.

See Approved for more information.

URI
tmod:@turbot/gcp-dataflow#/policy/types/jobApprovedRegions
Default Template Input
"{\n regions: policyValue(uri:\"tmod:@turbot/gcp-dataflow#/policy/types/dataflowApprovedRegionsDefault\") {\n value\n }\n}\n"
Default Template
"{% if $.regions.value | length == 0 %} [] {% endif %}{% for item in $.regions.value %}- &#39;{{ item }}&#39;&#92;n{% endfor %}"

GCP > Dataflow > Job > Approved > Usage

Determine whether the GCP Dataflow job is allowed to exist.

This policy will be evaluated by the Approved control. If a GCP Dataflow job is not approved, it will be subject to the action specified in the GCP > Dataflow > Job > Approved policy.

See Approved for more information.

URI
tmod:@turbot/gcp-dataflow#/policy/types/jobApprovedUsage
Valid Value
[
  "Not approved",
  "Approved",
  "Approved if GCP > Dataflow > Enabled"
]
Schema
{
  "type": "string",
  "enum": [
    "Not approved",
    "Approved",
    "Approved if GCP > Dataflow > Enabled"
  ],
  "example": [
    "Not approved"
  ],
  "default": "Approved if GCP > Dataflow > Enabled"
}

GCP > Dataflow > Job > CMDB

Configure whether to record and synchronize details for the GCP Dataflow job into the CMDB.

The CMDB control is responsible for populating and updating all the attributes for that resource type in the Guardrails CMDB.
All policies and controls in Guardrails are based around the resource, so usually the CMDB policy is set to "Enforce: Enabled".

If set to Skip, all changes to the CMDB are paused - no new resources will be discovered, no updates will be made, and deleted resources will not be removed.

To clean up resources and stop tracking changes, set this policy to "Enforce: Disabled".

CMDB controls also use the Regions policy associated with the resource. If a resource's region is not in the GCP > Dataflow > Job > Regions policy, the CMDB control will delete the resource from the CMDB.

(Note: Setting CMDB to "Skip" will also pause these changes.)

URI
tmod:@turbot/gcp-dataflow#/policy/types/jobCmdb
Valid Value
[
  "Skip",
  "Enforce: Enabled",
  "Enforce: Enabled if Dataflow API is enabled",
  "Enforce: Disabled"
]
Schema
{
  "type": "string",
  "enum": [
    "Skip",
    "Enforce: Enabled",
    "Enforce: Enabled if Dataflow API is enabled",
    "Enforce: Disabled"
  ],
  "example": [
    "Skip"
  ],
  "default": "Enforce: Enabled if Dataflow API is enabled"
}

GCP > Dataflow > Job > Regions

A list of GCP regions in which GCP Dataflow jobs are supported for use.

Any jobs in a region not listed here will not be recorded in the CMDB.

The expected format is an array of region names. You may use the '*' and
'?' wildcard characters.
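
For example, to support every US region plus the numbered europe-west regions ('*' matches any sequence of characters, '?' matches a single character):

- 'us-*'
- 'europe-west?'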

URI
tmod:@turbot/gcp-dataflow#/policy/types/jobRegions
Schema
{
  "allOf": [
    {
      "$ref": "gcp#/definitions/regionNameMatcherList"
    },
    {
      "default": [
        "asia-east1",
        "asia-northeast1",
        "asia-southeast1",
        "australia-southeast1",
        "europe-west1",
        "europe-west2",
        "europe-west3",
        "europe-west4",
        "northamerica-northeast1",
        "us-central1",
        "us-east1",
        "us-east4",
        "us-west1"
      ]
    }
  ]
}

GCP > Dataflow > Job > Usage

Check the number of GCP Dataflow jobs in use for this project against the configured usage limit.

You can configure the behavior of the control with this GCP > Dataflow > Job > Usage policy.

URI
tmod:@turbot/gcp-dataflow#/policy/types/jobUsage
Valid Value
[
  "Skip",
  "Check: Usage <= 85% of Limit",
  "Check: Usage <= 100% of Limit"
]
Schema
{
  "type": "string",
  "enum": [
    "Skip",
    "Check: Usage <= 85% of Limit",
    "Check: Usage <= 100% of Limit"
  ],
  "example": [
    "Check: Usage <= 85% of Limit"
  ],
  "default": "Skip"
}

GCP > Dataflow > Job > Usage > Limit

Maximum number of GCP Dataflow jobs that can be created in this project.

URI
tmod:@turbot/gcp-dataflow#/policy/types/jobUsageLimit
Schema
{
  "type": "integer",
  "minimum": 0,
  "default": 25
}

GCP > Dataflow > Labels Template [Default]

A template used to generate the label keys and values for GCP Dataflow resources.

By default, all Dataflow resource Labels > Template policies will use this value.

URI
tmod:@turbot/gcp-dataflow#/policy/types/dataflowLabelsTemplate
Default Template Input
"{\n defaultLabels: policyValue(uri:\"tmod:@turbot/gcp#/policy/types/defaultLabelsTemplate\") {\n value\n }\n}\n"
Default Template
"{%- if $.defaultLabels.value | length == 0 %} [] {%- elif $.defaultLabels.value != undefined %}{{ $.defaultLabels.value | dump | safe }}{%- else %}{% for item in $.defaultLabels.value %}- {{ item }}{% endfor %}{% endif %}"

GCP > Dataflow > Permissions

Configure whether permissions policies are in effect for GCP Dataflow.
This setting does not affect Project level permissions (GCP/Admin, GCP/Owner, etc.).

Note: The behavior of this policy depends on the value of GCP > Permissions.

URI
tmod:@turbot/gcp-dataflow#/policy/types/dataflowPermissions
Valid Value
[
"Enabled",
"Disabled",
"Enabled if GCP > Dataflow > Enabled"
]
Schema
{
  "type": "string",
  "enum": [
    "Enabled",
    "Disabled",
    "Enabled if GCP > Dataflow > Enabled"
  ],
  "example": [
    "Enabled"
  ],
  "default": "Enabled if GCP > Dataflow > Enabled"
}

GCP > Dataflow > Permissions > Levels

Define the permissions levels that can be used to grant access to Dataflow
in a GCP project. Permissions levels defined here will appear in the UI to assign access to Guardrails users.

Note: Some services do not use all permissions levels, and any permissions level that has
no permissions associated will not be created even if it is selected here.

URI
tmod:@turbot/gcp-dataflow#/policy/types/dataflowPermissionsLevels
Default Template Input
[
  "{\n item: project {\n turbot{\n id\n }\n }\n}\n",
  "{\n availableLevels: policyValues(filter:\"policyTypeLevel:self resourceId:{{ $.item.turbot.id }} policyTypeId:'tmod:@turbot/gcp-iam#/policy/types/permissionsLevelsDefault'\") {\n items {\n value\n }\n }\n}\n"
]
Default Template
"{% if $.availableLevels.items[0].value | length == 0 %} [] {% endif %}{% for item in $.availableLevels.items[0].value %}- {{ item }}&#92;n{% endfor %}"
Schema
{
  "type": "array",
  "items": {
    "type": "string",
    "enum": [
      "Metadata",
      "ReadOnly",
      "Operator",
      "Admin",
      "Owner"
    ]
  }
}

GCP > Dataflow > Permissions > Levels > Modifiers

A map of GCP API operations to Guardrails permission levels, used to customize Guardrails' standard permissions.
You can add, remove or redefine the mapping of GCP API operations to Guardrails permissions levels here.

Note: Modifiers are cumulative - if you add a permission to the metadata level, it is also added
to readOnly, operator and admin. Modifier policies set here will "roll up" to the GCP level too - if
you add a permission to Admin, it will be granted to GCP/Dataflow/Admin and also GCP/Admin.

For example:
- "storage.bucket.create": admin
- "sql.database.create": metadata

URI
tmod:@turbot/gcp-dataflow#/policy/types/dataflowPermissionsLevelsModifiers

GCP > Dataflow > Regions [Default]

A list of GCP regions in which GCP Dataflow resources are supported for use.

The expected format is an array of region names. You may use the '*' and
'?' wildcard characters.

This policy is the default value for all GCP Dataflow resources' Regions policies.

URI
tmod:@turbot/gcp-dataflow#/policy/types/dataflowRegionsDefault
Schema
{
  "allOf": [
    {
      "$ref": "gcp#/definitions/regionNameMatcherList"
    },
    {
      "default": [
        "asia-east1",
        "asia-east2",
        "asia-northeast1",
        "asia-northeast2",
        "asia-northeast3",
        "asia-south1",
        "asia-southeast1",
        "asia-southeast2",
        "australia-southeast1",
        "europe-north1",
        "europe-west1",
        "europe-west2",
        "europe-west3",
        "europe-west4",
        "europe-west6",
        "northamerica-northeast1",
        "southamerica-east1",
        "us-central1",
        "us-east1",
        "us-east4",
        "us-west1",
        "us-west2",
        "us-west3",
        "us-west4"
      ]
    }
  ]
}

GCP > Turbot > Event Handlers > Logging > Sink > Compiled Filter > @turbot/gcp-dataflow

GCP logs advanced filter
used to specify a subset of log entries that is forwarded to the Guardrails Event Handlers
by the logging sink on behalf of GCP Dataflow.

URI
tmod:@turbot/gcp-dataflow#/policy/types/dataflowCustomEventPatterns
Schema
{
  "type": "string",
  "default": "(resource.type = dataflow_step AND protoPayload.authorizationInfo.permission = dataflow.jobs.create)"
}

GCP > Turbot > Permissions > Compiled > Levels > @turbot/gcp-dataflow

A calculated policy that Guardrails uses to create a compiled list of ALL permission
levels for GCP Dataflow that is used as input to
the stack that manages the Guardrails IAM permissions objects.

URI
tmod:@turbot/gcp-dataflow#/policy/types/gcpLevelsCompiled

GCP > Turbot > Permissions > Compiled > Service Permissions > @turbot/gcp-dataflow

A calculated policy that Guardrails uses to create a compiled list of ALL
permissions for GCP Dataflow that is used as
input to the control that manages the IAM stack.

URI
tmod:@turbot/gcp-dataflow#/policy/types/gcpCompiledServicePermissions