# 🐚 Shell Plugin
The Shell Plugin enables Heimdall to execute shell commands directly on configured clusters (typically local). It provides a simple, flexible way to integrate existing scripts and command-line tools into the Heimdall orchestration flow.
> ⚠️ **Security Note:** Use with caution. Improperly configured shell commands can introduce security vulnerabilities. Always validate inputs and restrict command access where possible.
## 🧩 Plugin Overview

- **Plugin Name:** `shell`
- **Execution Mode:** Sync or Async
- **Use Case:** Integrating shell scripts, CLI tools, or simple automation jobs
## ⚙️ Defining a Shell Command
A shell command requires a command in its context—typically a path to a script or binary.
```yaml
- name: my-shell-script-0.0.1
  status: active
  plugin: shell
  version: 0.0.1
  description: Run super application
  context:
    command:
      - /usr/local/bin/super-script.sh
  tags:
    - type:super-command
  cluster_tags:
    - type:localhost
```
🔸 This defines an asynchronous shell command that runs `/usr/local/bin/super-script.sh`. Once the job completes, logs can be accessed via:

```
GET /api/v1/job/<job_id>/stdout
GET /api/v1/job/<job_id>/stderr
```
## 🖥️ Cluster Configuration
To run shell commands, define a cluster (typically localhost) that supports command execution:
```yaml
- name: localhost-0.0.1
  status: active
  version: 0.0.1
  description: Just a localhost
  tags:
    - type:localhost
    - data:local
```
This cluster will be matched against `cluster_criteria` in job submissions.
## 🚀 Submitting a Shell Job

You can pass arguments to your shell command via the `context.arguments` field in the job payload:
```json
{
  "name": "my-super-job",
  "version": "0.0.2",
  "command_criteria": ["type:super-command"],
  "cluster_criteria": ["data:local"],
  "context": {
    "arguments": ["arg1", "arg2", "..."]
  }
}
```
🔹 Equivalent execution:

```shell
/usr/local/bin/super-script.sh "arg1" "arg2" "..."
```
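For orientation, here is a minimal sketch of what a receiving script might do with those arguments. The real `super-script.sh` is not shown in this README, so the function name and behavior below are purely illustrative; the point is only that `context.arguments` arrive as ordinary positional parameters:

```shell
# Hypothetical stand-in for super-script.sh: echo back the positional
# parameters that Heimdall's shell plugin passes through.
run_super_script() {
  printf 'received %d argument(s)\n' "$#"
  for arg in "$@"; do
    printf 'arg: %s\n' "$arg"
  done
}

run_super_script "arg1" "arg2"
```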
## 📦 Job Context & Runtime
Each shell-based job is executed in a dedicated working directory created by Heimdall at runtime. This directory includes:
- `context.json`: provides structured context for the job, including job, command, and cluster metadata.
- Temporary output or result files generated by your script.
You can extract values from the context in your scripts using tools like jq:
```shell
jq '.job.name' context.json
```
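As a self-contained sketch, the snippet below fabricates a stand-in `context.json` and extracts a field with `jq`. Only the `.job.name` path comes from the example above; the other fields are assumptions about the layout, not Heimdall's actual schema:

```shell
# Stand-in context.json; field names other than .job.name are
# illustrative assumptions, not Heimdall's documented schema.
cat > context.json <<'EOF'
{
  "job": {"name": "my-super-job", "version": "0.0.2"},
  "command": {"name": "my-shell-script-0.0.1"},
  "cluster": {"name": "localhost-0.0.1"}
}
EOF

# Extract a single value for use later in the script.
job_name=$(jq -r '.job.name' context.json)
echo "job: $job_name"
```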
## 📊 Returning Job Results
To return structured data from a shell command, create a result.json file in the job's working directory. Heimdall will parse this and record the output if it matches the expected format:
```go
type column struct {
	Name string     `yaml:"name,omitempty" json:"name,omitempty"`
	Type columnType `yaml:"type,omitempty" json:"type,omitempty"`
}

type Result struct {
	Columns []*column `yaml:"columns,omitempty" json:"columns,omitempty"`
	Data    [][]any   `yaml:"data,omitempty" json:"data,omitempty"`
}
```
✅ Example:

```json
{
  "columns": [
    {"name": "status", "type": "string"},
    {"name": "duration", "type": "int"}
  ],
  "data": [
    ["success", 120]
  ]
}
```
If valid, this result will be made available via:

```
GET /api/v1/job/<job_id>/result
```
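One way a script can produce such a file is to assemble it at the end of its run. A sketch, assuming the script measures its own duration (the `sleep` stands in for real work):

```shell
# Emit a result.json matching the columns/data shape above.
# The "work" and the duration value here are placeholders.
start=$(date +%s)
sleep 1   # stand-in for the job's real work
end=$(date +%s)

cat > result.json <<EOF
{
  "columns": [
    {"name": "status", "type": "string"},
    {"name": "duration", "type": "int"}
  ],
  "data": [["success", $((end - start))]]
}
EOF
```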
## 🧠 Best Practices
- Validate inputs to avoid command injection.
- Use dedicated scripts to abstract complex logic away from Heimdall config.
- Store sensitive data in secure environment variables or secrets—never pass them as command-line arguments.
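For the first point, a common defensive pattern is to allowlist the characters an argument may contain before the script uses it. A minimal POSIX sh sketch; the `is_safe_arg` name and the permitted character set are illustrative, tighten them for your use case:

```shell
# Reject any argument containing characters outside a conservative
# allowlist (letters, digits, dot, underscore, hyphen).
is_safe_arg() {
  case "$1" in
    "") return 1 ;;                # empty arguments are rejected
    *[!A-Za-z0-9._-]*) return 1 ;; # anything outside the allowlist
    *) return 0 ;;
  esac
}

is_safe_arg "report-2024.txt" && echo "ok"
```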