Class WorkerComputeConfiguration
- All Implemented Interfaces:
 Serializable, SdkPojo, ToCopyableBuilder<WorkerComputeConfiguration.Builder,WorkerComputeConfiguration>
The configuration of the compute resources for workers running an analysis with the Clean Rooms SQL analytics engine.
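For orientation, here is a minimal sketch of constructing this configuration with its generated fluent builder. The WorkerComputeType constant used (CR_1X) and the worker count are illustrative assumptions; consult the WorkerComputeType enum in your SDK version for the values actually available.

    import software.amazon.awssdk.services.cleanrooms.model.WorkerComputeConfiguration;
    import software.amazon.awssdk.services.cleanrooms.model.WorkerComputeType;

    // Build an immutable WorkerComputeConfiguration via the generated fluent builder.
    // WorkerComputeType.CR_1X is assumed for illustration only.
    WorkerComputeConfiguration config = WorkerComputeConfiguration.builder()
            .type(WorkerComputeType.CR_1X) // worker compute configuration type
            .number(4)                     // number of workers
            .build();

The resulting object is immutable; use toBuilder() (documented below) to derive a modified copy.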
- 
Nested Class Summary
Nested Classes:
- static interface WorkerComputeConfiguration.Builder
Method Summary
- static WorkerComputeConfiguration.Builder builder()
- final boolean equals(Object obj)
- final boolean equalsBySdkFields(Object obj): Indicates whether some other object is "equal to" this one by SDK fields.
- final <T> Optional<T> getValueForField(String fieldName, Class<T> clazz)
- final int hashCode()
- final Integer number(): The number of workers.
- properties(): The configuration properties for the worker compute environment.
- static Class<? extends WorkerComputeConfiguration.Builder> serializableBuilderClass()
- WorkerComputeConfiguration.Builder toBuilder(): Take this object and create a builder that contains all of the current property values of this object.
- final String toString(): Returns a string representation of this object.
- final WorkerComputeType type(): The worker compute configuration type.
- final String typeAsString(): The worker compute configuration type.

Methods inherited from interface software.amazon.awssdk.utils.builder.ToCopyableBuilder:
- copy
- 
Method Details
- 
type
The worker compute configuration type.
If the service returns an enum value that is not available in the current SDK version,
type will return WorkerComputeType.UNKNOWN_TO_SDK_VERSION. The raw value returned by the service is available from typeAsString().
- Returns:
 - The worker compute configuration type.
 - See Also:
  - WorkerComputeType
 - 
typeAsString
The worker compute configuration type.
If the service returns an enum value that is not available in the current SDK version,
type will return WorkerComputeType.UNKNOWN_TO_SDK_VERSION. The raw value returned by the service is available from typeAsString().
- Returns:
 - The worker compute configuration type.
 - See Also:
  - WorkerComputeType
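Taken together, type() and typeAsString() support a defensive pattern when the service returns a value this SDK version does not model. A brief sketch, assuming a configuration instance named config obtained elsewhere:

    // If the service returns a type this SDK version does not know,
    // type() yields UNKNOWN_TO_SDK_VERSION while typeAsString() still
    // carries the raw value sent by the service.
    if (config.type() == WorkerComputeType.UNKNOWN_TO_SDK_VERSION) {
        System.out.println("Unrecognized worker compute type: " + config.typeAsString());
    } else {
        WorkerComputeType type = config.type(); // a modeled enum value
        // handle the known type
    }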
 - 
number
The number of workers.
SQL queries support a minimum value of 2 and a maximum value of 400.
PySpark jobs support a minimum value of 4 and a maximum value of 128.
- Returns:
 - The number of workers.
   SQL queries support a minimum value of 2 and a maximum value of 400.
   PySpark jobs support a minimum value of 4 and a maximum value of 128.
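As a hedged illustration of these bounds, the following sketch sizes two configurations within the documented ranges; the specific counts (32 and 16) are arbitrary examples, and the service enforces the limits at request time.

    // Worker counts must fall within the documented ranges:
    // SQL queries: 2 to 400 workers; PySpark jobs: 4 to 128 workers.
    WorkerComputeConfiguration sqlWorkers = WorkerComputeConfiguration.builder()
            .number(32) // within the 2..400 range for SQL queries
            .build();

    WorkerComputeConfiguration pySparkWorkers = WorkerComputeConfiguration.builder()
            .number(16) // within the 4..128 range for PySpark jobs
            .build();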
 
 - 
properties
The configuration properties for the worker compute environment. These properties allow you to customize the compute settings for your Clean Rooms workloads.
- Returns:
 - The configuration properties for the worker compute environment. These properties allow you to customize the compute settings for your Clean Rooms workloads.
 
 - 
toBuilder
Description copied from interface: ToCopyableBuilder
Take this object and create a builder that contains all of the current property values of this object.
- Specified by:
 toBuilder in interface ToCopyableBuilder<WorkerComputeConfiguration.Builder,WorkerComputeConfiguration>
- Returns:
 - a builder for type T
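A brief sketch of the copy-and-modify pattern this method supports, assuming an existing instance named config: the returned builder starts from the current property values, so only the fields being changed need to be set.

    // toBuilder() seeds the builder with the current property values;
    // build() produces a new immutable object with the overrides applied.
    WorkerComputeConfiguration resized = config.toBuilder()
            .number(8) // change only the worker count; other fields carry over
            .build();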
 
 - 
builder
 - 
serializableBuilderClass
 - 
hashCode
 - 
equals
 - 
equalsBySdkFields
Description copied from interface: SdkPojo
Indicates whether some other object is "equal to" this one by SDK fields. An SDK field is a modeled, non-inherited field in an SdkPojo class, and is generated based on a service model. If an SdkPojo class does not have any inherited fields, equalsBySdkFields and equals are essentially the same.
- Specified by:
 equalsBySdkFields in interface SdkPojo
- Parameters:
 obj - the object to be compared with
- Returns:
 - true if the other object is equal to this object by SDK fields, false otherwise.
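A short sketch of the distinction, assuming two instances built with identical property values: equalsBySdkFields compares only the modeled SDK fields, which for a generated class like this one typically agrees with equals.

    WorkerComputeConfiguration a = WorkerComputeConfiguration.builder().number(4).build();
    WorkerComputeConfiguration b = WorkerComputeConfiguration.builder().number(4).build();

    // Both comparisons consider only modeled, non-inherited SDK fields here,
    // so they agree for this generated class.
    boolean sameByFields = a.equalsBySdkFields(b); // true
    boolean sameByEquals = a.equals(b);            // true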
 
 - 
toString
 - 
getValueForField
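Since this entry's body is otherwise empty, a hedged sketch of reflective field access, assuming an instance named config. The field-name key used here ("Number") is an assumption; consult sdkFieldNameToField() for the exact keys defined by the service model.

    import java.util.Optional;

    // Look up a modeled field by name instead of using the typed accessor.
    // The "Number" key is an assumption; verify it against sdkFieldNameToField().
    Optional<Integer> workers = config.getValueForField("Number", Integer.class);
    workers.ifPresent(n -> System.out.println("Workers: " + n));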
 - 
sdkFields
 - 
sdkFieldNameToField
- Specified by:
 sdkFieldNameToField in interface SdkPojo
- Returns:
 - The mapping between the field name and its corresponding field.
 
 