Commit

Automatically generated by magic modules for service: dataproc_v1 and resource: Projects__regions__job.

This commit includes the following changes:
- Singular Resource
- Plural Resource
- Documentation updates
- Terraform configuration
- Integration tests

Signed-off-by: Samir <[email protected]>
sa-progress committed Jun 24, 2024
1 parent b3f4a77 commit c51359d
Showing 61 changed files with 2,986 additions and 0 deletions.
393 changes: 393 additions & 0 deletions docs/resources/google_dataproc_project_region_job.md

Large diffs are not rendered by default.

90 changes: 90 additions & 0 deletions docs/resources/google_dataproc_project_region_jobs.md
@@ -0,0 +1,90 @@
+++

title = "google_dataproc_project_region_jobs Resource"
platform = "gcp"
draft = false
gh_repo = "inspec-gcp"

[menu.inspec]
title = "google_dataproc_project_region_jobs"
identifier = "inspec/resources/gcp/google_dataproc_project_region_jobs Resource"
parent = "inspec/resources/gcp"
+++

Use the `google_dataproc_project_region_jobs` InSpec audit resource to test the properties of a Google ProjectRegionJob.

## Installation
{{% inspec_gcp_install %}}

## Syntax
A `google_dataproc_project_region_jobs` resource block is used to test a Google ProjectRegionJob resource.

## Examples
```ruby
describe google_dataproc_project_region_jobs() do
it { should exist }
end
```

## Properties
Properties that can be accessed from the `google_dataproc_project_region_jobs` resource:

See [google_dataproc_project_region_job.md](google_dataproc_project_region_job.md) for more detailed information
* `references`: an array of `google_dataproc_project_region_job` reference
* `placements`: an array of `google_dataproc_project_region_job` placement
* `hadoop_jobs`: an array of `google_dataproc_project_region_job` hadoop_job
* `spark_jobs`: an array of `google_dataproc_project_region_job` spark_job
* `pyspark_jobs`: an array of `google_dataproc_project_region_job` pyspark_job
* `hive_jobs`: an array of `google_dataproc_project_region_job` hive_job
* `pig_jobs`: an array of `google_dataproc_project_region_job` pig_job
* `spark_r_jobs`: an array of `google_dataproc_project_region_job` spark_r_job
* `spark_sql_jobs`: an array of `google_dataproc_project_region_job` spark_sql_job
* `presto_jobs`: an array of `google_dataproc_project_region_job` presto_job
* `trino_jobs`: an array of `google_dataproc_project_region_job` trino_job
* `flink_jobs`: an array of `google_dataproc_project_region_job` flink_job
* `statuses`: an array of `google_dataproc_project_region_job` status
* `status_histories`: an array of `google_dataproc_project_region_job` status_history
* `yarn_applications`: an array of `google_dataproc_project_region_job` yarn_applications
* `driver_output_resource_uris`: an array of `google_dataproc_project_region_job` driver_output_resource_uri
* `driver_control_files_uris`: an array of `google_dataproc_project_region_job` driver_control_files_uri
* `labels`: an array of `google_dataproc_project_region_job` labels
* `schedulings`: an array of `google_dataproc_project_region_job` scheduling
* `job_uuids`: an array of `google_dataproc_project_region_job` job_uuid
* `dones`: an array of `google_dataproc_project_region_job` done
* `driver_scheduling_configs`: an array of `google_dataproc_project_region_job` driver_scheduling_config

## Filter Criteria
This resource supports all of the above properties as filter criteria, which can be used
with `where` as a block or a method.
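
As a hedged sketch (the `project` and `region` parameter names and the `job_uuid` filter field are assumptions for illustration, not taken from this commit), a filtered control might look like:

```ruby
# Hypothetical filter usage; field names follow the property list above.
describe google_dataproc_project_region_jobs(project: 'my-project', region: 'us-central1')
           .where(job_uuid: 'example-uuid') do
  it { should exist }
end
```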

## GCP Permissions

Ensure the [Cloud Dataproc API](https://console.cloud.google.com/apis/library/dataproc.googleapis.com) is enabled for the current project.
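
The API can be enabled from the console link above or, assuming the `gcloud` CLI is installed and configured for the target project, from the command line:

```sh
# Enable the Cloud Dataproc API for the active project.
gcloud services enable dataproc.googleapis.com
```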
@@ -0,0 +1,37 @@
# frozen_string_literal: false

# ----------------------------------------------------------------------------
#
# *** AUTO GENERATED CODE *** Type: MMv1 ***
#
# ----------------------------------------------------------------------------
#
# This file is automatically generated by Magic Modules and manual
# changes will be clobbered when the file is regenerated.
#
# Please read more about how to change this file in README.md and
# CONTRIBUTING.md located at the root of this package.
#
# ----------------------------------------------------------------------------
module GoogleInSpec
module Dataproc
module Property
class ProjectRegionJobDriverSchedulingConfig
attr_reader :memory_mb

attr_reader :vcores

def initialize(args = nil, parent_identifier = nil)
return if args.nil?
@parent_identifier = parent_identifier
@memory_mb = args['memoryMb']
@vcores = args['vcores']
end

def to_s
"#{@parent_identifier} ProjectRegionJobDriverSchedulingConfig"
end
end
end
end
end
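
The generated property classes all follow the same shape: accept the raw API response hash, copy camelCase API keys into snake_case readers, and return early when the block is absent. A standalone sketch of that pattern (the class name here is hypothetical; the real class lives under `GoogleInSpec::Dataproc::Property`):

```ruby
# Illustrative sketch of the generated property-class pattern.
class DriverSchedulingConfigSketch
  attr_reader :memory_mb, :vcores

  def initialize(args = nil, parent_identifier = nil)
    return if args.nil? # an absent config block leaves every reader nil

    @parent_identifier = parent_identifier
    @memory_mb = args['memoryMb'] # camelCase API key -> snake_case reader
    @vcores = args['vcores']
  end

  def to_s
    "#{@parent_identifier} DriverSchedulingConfigSketch"
  end
end

cfg = DriverSchedulingConfigSketch.new({ 'memoryMb' => 1024, 'vcores' => 2 }, 'job-1')
puts cfg.memory_mb # prints 1024
```

The nil guard matters because the API omits job-type-specific blocks entirely rather than returning empty hashes.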
55 changes: 55 additions & 0 deletions libraries/google/dataproc/property/projectregionjob_flink_job.rb
@@ -0,0 +1,55 @@
# frozen_string_literal: false

# ----------------------------------------------------------------------------
#
# *** AUTO GENERATED CODE *** Type: MMv1 ***
#
# ----------------------------------------------------------------------------
#
# This file is automatically generated by Magic Modules and manual
# changes will be clobbered when the file is regenerated.
#
# Please read more about how to change this file in README.md and
# CONTRIBUTING.md located at the root of this package.
#
# ----------------------------------------------------------------------------
require 'google/dataproc/property/projectregionjob_flink_job_logging_config'
require 'google/dataproc/property/projectregionjob_flink_job_logging_config_driver_log_levels'
require 'google/dataproc/property/projectregionjob_flink_job_properties'
module GoogleInSpec
module Dataproc
module Property
class ProjectRegionJobFlinkJob
attr_reader :main_jar_file_uri

attr_reader :main_class

attr_reader :args

attr_reader :jar_file_uris

attr_reader :savepoint_uri

attr_reader :properties

attr_reader :logging_config

def initialize(args = nil, parent_identifier = nil)
return if args.nil?
@parent_identifier = parent_identifier
@main_jar_file_uri = args['mainJarFileUri']
@main_class = args['mainClass']
@args = args['args']
@jar_file_uris = args['jarFileUris']
@savepoint_uri = args['savepointUri']
@properties = GoogleInSpec::Dataproc::Property::ProjectRegionJobFlinkJobProperties.new(args['properties'], to_s)
@logging_config = GoogleInSpec::Dataproc::Property::ProjectRegionJobFlinkJobLoggingConfig.new(args['loggingConfig'], to_s)
end

def to_s
"#{@parent_identifier} ProjectRegionJobFlinkJob"
end
end
end
end
end
@@ -0,0 +1,35 @@
# frozen_string_literal: false

# ----------------------------------------------------------------------------
#
# *** AUTO GENERATED CODE *** Type: MMv1 ***
#
# ----------------------------------------------------------------------------
#
# This file is automatically generated by Magic Modules and manual
# changes will be clobbered when the file is regenerated.
#
# Please read more about how to change this file in README.md and
# CONTRIBUTING.md located at the root of this package.
#
# ----------------------------------------------------------------------------
require 'google/dataproc/property/projectregionjob_flink_job_logging_config_driver_log_levels'
module GoogleInSpec
module Dataproc
module Property
class ProjectRegionJobFlinkJobLoggingConfig
attr_reader :driver_log_levels

def initialize(args = nil, parent_identifier = nil)
return if args.nil?
@parent_identifier = parent_identifier
@driver_log_levels = GoogleInSpec::Dataproc::Property::ProjectRegionJobFlinkJobLoggingConfigDriverLogLevels.new(args['driverLogLevels'], to_s)
end

def to_s
"#{@parent_identifier} ProjectRegionJobFlinkJobLoggingConfig"
end
end
end
end
end
@@ -0,0 +1,34 @@
# frozen_string_literal: false

# ----------------------------------------------------------------------------
#
# *** AUTO GENERATED CODE *** Type: MMv1 ***
#
# ----------------------------------------------------------------------------
#
# This file is automatically generated by Magic Modules and manual
# changes will be clobbered when the file is regenerated.
#
# Please read more about how to change this file in README.md and
# CONTRIBUTING.md located at the root of this package.
#
# ----------------------------------------------------------------------------
module GoogleInSpec
module Dataproc
module Property
class ProjectRegionJobFlinkJobLoggingConfigDriverLogLevels
attr_reader :additional_properties

def initialize(args = nil, parent_identifier = nil)
return if args.nil?
@parent_identifier = parent_identifier
@additional_properties = args['additionalProperties']
end

def to_s
"#{@parent_identifier} ProjectRegionJobFlinkJobLoggingConfigDriverLogLevels"
end
end
end
end
end
@@ -0,0 +1,34 @@
# frozen_string_literal: false

# ----------------------------------------------------------------------------
#
# *** AUTO GENERATED CODE *** Type: MMv1 ***
#
# ----------------------------------------------------------------------------
#
# This file is automatically generated by Magic Modules and manual
# changes will be clobbered when the file is regenerated.
#
# Please read more about how to change this file in README.md and
# CONTRIBUTING.md located at the root of this package.
#
# ----------------------------------------------------------------------------
module GoogleInSpec
module Dataproc
module Property
class ProjectRegionJobFlinkJobProperties
attr_reader :additional_properties

def initialize(args = nil, parent_identifier = nil)
return if args.nil?
@parent_identifier = parent_identifier
@additional_properties = args['additionalProperties']
end

def to_s
"#{@parent_identifier} ProjectRegionJobFlinkJobProperties"
end
end
end
end
end
58 changes: 58 additions & 0 deletions libraries/google/dataproc/property/projectregionjob_hadoop_job.rb
@@ -0,0 +1,58 @@
# frozen_string_literal: false

# ----------------------------------------------------------------------------
#
# *** AUTO GENERATED CODE *** Type: MMv1 ***
#
# ----------------------------------------------------------------------------
#
# This file is automatically generated by Magic Modules and manual
# changes will be clobbered when the file is regenerated.
#
# Please read more about how to change this file in README.md and
# CONTRIBUTING.md located at the root of this package.
#
# ----------------------------------------------------------------------------
require 'google/dataproc/property/projectregionjob_hadoop_job_logging_config'
require 'google/dataproc/property/projectregionjob_hadoop_job_logging_config_driver_log_levels'
require 'google/dataproc/property/projectregionjob_hadoop_job_properties'
module GoogleInSpec
module Dataproc
module Property
class ProjectRegionJobHadoopJob
attr_reader :main_jar_file_uri

attr_reader :main_class

attr_reader :args

attr_reader :jar_file_uris

attr_reader :file_uris

attr_reader :archive_uris

attr_reader :properties

attr_reader :logging_config

def initialize(args = nil, parent_identifier = nil)
return if args.nil?
@parent_identifier = parent_identifier
@main_jar_file_uri = args['mainJarFileUri']
@main_class = args['mainClass']
@args = args['args']
@jar_file_uris = args['jarFileUris']
@file_uris = args['fileUris']
@archive_uris = args['archiveUris']
@properties = GoogleInSpec::Dataproc::Property::ProjectRegionJobHadoopJobProperties.new(args['properties'], to_s)
@logging_config = GoogleInSpec::Dataproc::Property::ProjectRegionJobHadoopJobLoggingConfig.new(args['loggingConfig'], to_s)
end

def to_s
"#{@parent_identifier} ProjectRegionJobHadoopJob"
end
end
end
end
end
@@ -0,0 +1,35 @@
# frozen_string_literal: false

# ----------------------------------------------------------------------------
#
# *** AUTO GENERATED CODE *** Type: MMv1 ***
#
# ----------------------------------------------------------------------------
#
# This file is automatically generated by Magic Modules and manual
# changes will be clobbered when the file is regenerated.
#
# Please read more about how to change this file in README.md and
# CONTRIBUTING.md located at the root of this package.
#
# ----------------------------------------------------------------------------
require 'google/dataproc/property/projectregionjob_hadoop_job_logging_config_driver_log_levels'
module GoogleInSpec
module Dataproc
module Property
class ProjectRegionJobHadoopJobLoggingConfig
attr_reader :driver_log_levels

def initialize(args = nil, parent_identifier = nil)
return if args.nil?
@parent_identifier = parent_identifier
@driver_log_levels = GoogleInSpec::Dataproc::Property::ProjectRegionJobHadoopJobLoggingConfigDriverLogLevels.new(args['driverLogLevels'], to_s)
end

def to_s
"#{@parent_identifier} ProjectRegionJobHadoopJobLoggingConfig"
end
end
end
end
end
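
Nested blocks like `loggingConfig` delegate parsing to a child property class and pass `to_s` down as the child's parent identifier, so error output names the full ancestry. A standalone sketch with hypothetical class names (the real pattern is the `ProjectRegionJob*LoggingConfig` pair above):

```ruby
# Hypothetical standalone sketch of the nested-property pattern.
class DriverLogLevelsSketch
  attr_reader :additional_properties

  def initialize(args = nil, parent_identifier = nil)
    return if args.nil?

    @parent_identifier = parent_identifier
    @additional_properties = args['additionalProperties']
  end
end

class LoggingConfigSketch
  attr_reader :driver_log_levels

  def initialize(args = nil, parent_identifier = nil)
    return if args.nil?

    @parent_identifier = parent_identifier
    # The child parses its own sub-hash; to_s ties its identity to this parent.
    @driver_log_levels = DriverLogLevelsSketch.new(args['driverLogLevels'], to_s)
  end

  def to_s
    "#{@parent_identifier} LoggingConfigSketch"
  end
end

cfg = LoggingConfigSketch.new(
  { 'driverLogLevels' => { 'additionalProperties' => { 'root' => 'INFO' } } },
  'job-1'
)
puts cfg.driver_log_levels.additional_properties # prints {"root"=>"INFO"}
```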