Documentation
Index
Constants
This section is empty.
Variables
This section is empty.
Functions
func BedrockBatchSfn_IsConstruct
func BedrockBatchSfn_IsConstruct(x interface{}) *bool
Checks if `x` is a construct.
Use this method instead of `instanceof` to properly detect `Construct` instances, even when the construct library is symlinked.
Explanation: in JavaScript, multiple copies of the `constructs` library on disk are seen as independent, completely different libraries. As a consequence, the class `Construct` in each copy of the `constructs` library is seen as a different class, and an instance of one class will not test as `instanceof` the other class. `npm install` will not create installations like this, but users may manually symlink construct libraries together or use a monorepo tool: in those cases, multiple copies of the `constructs` library can be accidentally installed, and `instanceof` will behave unpredictably. It is safest to avoid `instanceof` and to use this type-testing method instead.
Returns: true if `x` is an object created from a class which extends `Construct`. Experimental.
func NewBedrockBatchSfn_Override
func NewBedrockBatchSfn_Override(b BedrockBatchSfn, parent constructs.Construct, id *string, props *BedrockBatchSfnProps)
Experimental.
Types
type BedrockBatchSfn
type BedrockBatchSfn interface {
awsstepfunctions.StateMachineFragment
// The states to chain onto if this fragment is used.
// Experimental.
EndStates() *[]awsstepfunctions.INextable
// Descriptive identifier for this chainable.
// Experimental.
Id() *string
// The tree node.
// Experimental.
Node() constructs.Node
// The start state of this state machine fragment.
// Experimental.
StartState() awsstepfunctions.State
// Continue normal execution with the given state.
// Experimental.
Next(next awsstepfunctions.IChainable) awsstepfunctions.Chain
// Prefix the IDs of all states in this state machine fragment.
//
// Use this to avoid multiple copies of the state machine all having the
// same state IDs.
// Experimental.
PrefixStates(prefix *string) awsstepfunctions.StateMachineFragment
// Wrap all states in this state machine fragment up into a single state.
//
// This can be used to add retry or error handling onto this state
// machine fragment.
//
// Be aware that this changes the result of the inner state machine
// to be an array with the result of the state machine in it. Adjust
// your paths accordingly. For example, change 'outputPath' to
// '$[0]'.
// Experimental.
ToSingleState(options *awsstepfunctions.SingleStateOptions) awsstepfunctions.Parallel
// Returns a string representation of this construct.
// Experimental.
ToString() *string
}
A state machine fragment that creates a Bedrock Batch Inference Job and waits for its completion.
Input schema:

```
{
  "job_name": string,        // Required. Name of the batch inference job
  "manifest_keys": string[], // Required. List of S3 keys of the input manifest files
  "model_id": string         // Required. Model ID to use
}
```

Output schema:

```
{
  "status": string, // Required. Status of the batch job. One of: "Completed" or "PartiallyCompleted"
  "bucket": string, // Required. S3 bucket where the output is stored
  "keys": string[]  // Required. Array of S3 keys for the output files
}
```

Error schema:

```
{
  "status": string, // Required. One of: "Failed", "Stopped", or "Expired"
  "error": string,  // Required. Error code
  "cause": string   // Required. Error message
}
```

Experimental.
func NewBedrockBatchSfn
func NewBedrockBatchSfn(parent constructs.Construct, id *string, props *BedrockBatchSfnProps) BedrockBatchSfn
Experimental.
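A minimal usage sketch, assuming a surrounding CDK app with an existing `stack` and that this package, the `aws-cdk-go` modules (`awscdk`, `awss3`, `awsstepfunctions`), and `jsii` are imported; the construct IDs, timeout, and prefix are illustrative, not prescribed by this package:

```go
// Sketch only: wire the fragment into a state machine.
func addBatchStep(stack awscdk.Stack) {
	// Buckets for the batch job's input manifests and output files.
	inputBucket := awss3.NewBucket(stack, jsii.String("BatchInput"), nil)
	outputBucket := awss3.NewBucket(stack, jsii.String("BatchOutput"), nil)

	batch := NewBedrockBatchSfn(stack, jsii.String("BedrockBatch"), &BedrockBatchSfnProps{
		BedrockBatchInputBucket:  inputBucket,
		BedrockBatchOutputBucket: outputBucket,
		Timeout:                  awscdk.Duration_Hours(jsii.Number(72)),
	})

	// Prefix state IDs so several copies of the fragment can coexist,
	// then use the fragment as the machine definition.
	awsstepfunctions.NewStateMachine(stack, jsii.String("BatchPipeline"), &awsstepfunctions.StateMachineProps{
		DefinitionBody: awsstepfunctions.DefinitionBody_FromChainable(batch.PrefixStates(jsii.String("Batch"))),
	})
}
```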
type BedrockBatchSfnProps
type BedrockBatchSfnProps struct {
// The S3 bucket where the Bedrock Batch Inference Job gets the input manifests.
// Experimental.
BedrockBatchInputBucket awss3.IBucket `field:"required" json:"bedrockBatchInputBucket" yaml:"bedrockBatchInputBucket"`
// The S3 bucket where the Bedrock Batch Inference Job stores the output.
// Experimental.
BedrockBatchOutputBucket awss3.IBucket `field:"required" json:"bedrockBatchOutputBucket" yaml:"bedrockBatchOutputBucket"`
// IAM policy used for Bedrock batch processing.
//
// The policy must have the following permissions for the models and inference profiles you plan to use:
// - bedrock:InvokeModel
// - bedrock:CreateModelInvocationJob.
// Default: const bedrockBatchPolicy = new iam.ManagedPolicy(this, 'BedrockBatchPolicy', {
// statements: [
// new iam.PolicyStatement({
// sid: 'Inference',
// actions: ['bedrock:InvokeModel', 'bedrock:CreateModelInvocationJob'],
// resources: [
// 'arn:aws:bedrock:*::foundation-model/*',
// Stack.of(this).formatArn({
// service: 'bedrock',
// resource: 'inference-profile',
// resourceName: '*',
// }),
// ],
// }),
// ],
	// });
//
// Experimental.
BedrockBatchPolicy awsiam.IManagedPolicy `field:"optional" json:"bedrockBatchPolicy" yaml:"bedrockBatchPolicy"`
// JSONPath expression to select part of the state to be the input to this state.
//
	// May also be the special value JsonPath.DISCARD, which will cause the effective input to be the empty object {}.
	//
	// Input schema:
	// ```
	// {
	//   "job_name": string,        // Required. Name of the batch inference job
	//   "manifest_keys": string[], // Required. List of S3 keys of the input manifest files
	//   "model_id": string         // Required. Model ID to use
	// }
	// ```
// Default: The entire task input (JSON path '$').
//
// Experimental.
InputPath *string `field:"optional" json:"inputPath" yaml:"inputPath"`
	// JSONPath expression to indicate where to inject the state's output.
	//
	// May also be the special value JsonPath.DISCARD, which will cause the state's input to become its output.
	//
	// Output schema:
	// ```
	// {
	//   "status": string, // Required. Status of the batch job. One of: "Completed" or "PartiallyCompleted"
	//   "bucket": string, // Required. S3 bucket where the output is stored
	//   "keys": string[]  // Required. Array of S3 keys for the output files
	// }
	// ```
// Default: Replaces the entire input with the result (JSON path '$').
//
// Experimental.
ResultPath *string `field:"optional" json:"resultPath" yaml:"resultPath"`
// The timeout duration for the batch inference job.
//
// Must be between 24 hours and 1 week (168 hours).
// Default: Duration.hours(48)
//
// Experimental.
Timeout awscdk.Duration `field:"optional" json:"timeout" yaml:"timeout"`
}
Experimental.