Job is the Schema for the Jobs API. Creates a job in Dataflow according to a provided config file.
Type: CRD
Group: dataflow.gcp.upbound.io
Version: v1beta1
apiVersion: dataflow.gcp.upbound.io/v1beta1
kind: Job
JobSpec defines the desired state of Job
No description provided.
List of experiments that should be used by the job. An example value is ["enable_stackdriver_agent_metrics"].
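As a sketch, the experiments list would be set under `spec.forProvider` (field placement assumed from the example manifest in this page):

```yaml
# Hypothetical fragment: enabling the Stackdriver agent metrics experiment.
spec:
  forProvider:
    experiments:
      - enable_stackdriver_agent_metrics
```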
ProviderConfigReference specifies how the provider that will be used to create, observe, update, and delete this managed resource should be configured.
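A minimal sketch of this reference, assuming a ProviderConfig named `default` exists in the cluster:

```yaml
# Hypothetical fragment: selecting the ProviderConfig used to reconcile this Job.
spec:
  providerConfigRef:
    name: default
```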
Policies for referencing.
ProviderReference specifies the provider that will be used to create, observe, update, and delete this managed resource. Deprecated: Please use ProviderConfigReference, i.e. providerConfigRef
Policies for referencing.
PublishConnectionDetailsTo specifies the connection secret config which contains a name, metadata and a reference to secret store config to which any connection details for this managed resource should be written. Connection details frequently include the endpoint, username, and password required to connect to the managed resource.
WriteConnectionSecretToReference specifies the namespace and name of a Secret to which any connection details for this managed resource should be written. Connection details frequently include the endpoint, username, and password required to connect to the managed resource. This field is planned to be replaced in a future release in favor of PublishConnectionDetailsTo. Currently, both can be set independently, and connection details will be published to both without affecting each other.
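A minimal sketch of this reference; the Secret name and namespace here are illustrative:

```yaml
# Hypothetical fragment: write any connection details to a Secret
# named "job-conn" in the "upbound-system" namespace.
spec:
  writeConnectionSecretToRef:
    name: job-conn
    namespace: upbound-system
```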
JobStatus defines the observed state of Job.
No description provided.
List of experiments that should be used by the job. An example value is ["enable_stackdriver_agent_metrics"].
Conditions of the resource.
big-data-job
apiVersion: dataflow.gcp.upbound.io/v1beta1
kind: Job
metadata:
  annotations:
    meta.upbound.io/example-id: dataflow/v1beta1/job
    upjet.upbound.io/manual-intervention: This example resource needs an input file stored in a bucket
  labels:
    testing.upbound.io/example-name: big_data_job
  name: big-data-job
spec:
  forProvider:
    name: dataflow-job
    parameters:
      inputFile: gs://my-bucket/file.txt
      output: gs://my-bucket/output.txt
    region: us-central1
    tempGcsLocation: gs://my-bucket/tmp_dir
    templateGcsPath: gs://dataflow-templates/latest/Word_Count
© 2022 Upbound, Inc.