dsl-reference.md (6 additions, 6 deletions)
@@ -1519,7 +1519,7 @@ from: .order.pet
### Output

-Documents the structure - and optionally configures the filtering of - workflow/task output data.
+Documents the structure - and optionally configures the transformations of - workflow/task output data.

It's crucial for authors to document the schema of output data whenever feasible. This documentation empowers consuming applications to provide contextual auto-suggestions when handling runtime expressions.
@@ -1544,11 +1544,11 @@ output:
petId:
type: string
required: [ petId ]
-as:
-petId: '${ .pet.id }'
+as:
+petId: '${ .pet.id }'
export:
as:
-'.petList += [ . ]'
+'.petList += [ $task.output ]'
```

### Export
@@ -1566,13 +1566,13 @@ Optionally, the context might have an associated schema.
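To make the updated `output`/`export` snippet easier to read in isolation, here is a rough sketch of a complete task using the two expressions changed above. The task name, the HTTP call details, and the endpoint are invented for illustration, and the indentation is assumed:

```yaml
do:
  - getPet:
      call: http
      with:
        method: get
        endpoint: https://example.com/pets/1   # placeholder endpoint
      output:
        as:
          petId: '${ .pet.id }'                # transformed task output: keep only the pet id
      export:
        as: '.petList += [ $task.output ]'     # append the task's raw output to the shared context
```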
dsl.md (96 additions, 24 deletions)
@@ -146,39 +146,97 @@ Once the task has been executed, different things can happen:
In Serverless Workflow DSL, data flow management is crucial to ensure that the right data is passed between tasks and to the workflow itself.

-Here's how data flows through a workflow based on various filtering stages:
+Here's how data flows through a workflow based on various transformation stages:

-1. **Filter Workflow Input**
-Before the workflow starts, the input data provided to the workflow can be filtered to ensure only relevant data is passed into the workflow context. This step allows the workflow to start with a clean and focused dataset, reducing potential overhead and complexity in subsequent tasks.
+1. **Transform Workflow Input**
+Before the workflow starts, the input data provided to the workflow can be transformed to ensure only relevant data in the expected format is passed into the workflow context. This can be done using the top-level `input.from` expression. It is evaluated against the raw workflow input and defaults to the identity expression, which leaves the input unchanged. This step allows the workflow to start with a clean and focused dataset, reducing potential overhead and complexity in subsequent tasks. The result of this expression is set as the initial value of the `$context` runtime expression argument and passed to the first task.

-*Example: If the workflow receives a JSON object as input, a filter can be applied to remove unnecessary fields and retain only those that are required for the workflow's execution.*
+*Example: If the workflow receives a JSON object as input, a transformation can be applied to remove unnecessary fields and retain only those that are required for the workflow's execution.*

-2. **Filter First Task Input**
-The input data for the first task can be filtered to match the specific requirements of that task. This ensures that the first task receives only the necessary data it needs to perform its operations.
+2. **Transform First Task Input**
+The input data for the first task can be transformed to match the specific requirements of that task. This ensures that the first task receives only the data required to perform its operations. This can be done using the task's `input.from` expression. It is evaluated against the transformed workflow input and defaults to the identity expression, which leaves the input unchanged. The result of this expression is set as the `$input` runtime expression argument and passed to the task. Runtime expressions used within the task definition are evaluated against this transformed input.

-*Example: If the first task is a function call that only needs a subset of the workflow input, a filter can be applied to provide only those fields needed for the function to execute.*
+*Example: If the first task is a function call that only needs a subset of the workflow input, a transformation can be applied to provide only those fields needed for the function to execute.*

-3. **Filter First Task Output**
-After the first task completes, its output can be filtered before passing it to the next task or storing it in the workflow context. This helps in managing the data flow and keeping the context clean by removing any unnecessary data produced by the task.
+3. **Transform First Task Output**
+After the first task completes, its output can be transformed before passing it to the next task or storing it in the workflow context. Transformations are applied using the `output.as` runtime expression. It is evaluated against the raw task output and defaults to the identity expression, which leaves the output unchanged. Its result becomes the input of the next task. To update the context, one uses the `export.as` runtime expression. It is evaluated against the raw output and defaults to an expression that returns the existing context. The result of this runtime expression replaces the workflow's current context and the content of the `$context` runtime expression argument. This helps manage the data flow and keep the context clean by removing any unnecessary data produced by the task.

-*Example: If the first task returns a large dataset, a filter can be applied to retain only the relevant results needed for subsequent tasks.*
+*Example: If the first task returns a large dataset, a transformation can be applied to retain only the relevant results needed for subsequent tasks.*

-4. **Filter Last Task Input**
-Before the last task in the workflow executes, its input data can be filtered to ensure it receives only the necessary information. This step is crucial for ensuring that the final task has all the required data to complete the workflow successfully.
+4. **Transform Last Task Input**
+Before the last task in the workflow executes, its input data can be transformed to ensure it receives only the necessary information. This can be done using the task's `input.from` expression. It is evaluated against the transformed output of the previous task and defaults to the identity expression, which leaves the input unchanged. The result of this expression is set as the `$input` runtime expression argument and passed to the task. Runtime expressions used within the task definition are evaluated against this transformed input. This step is crucial for ensuring the final task has all the required data to complete the workflow successfully.

-*Example: If the last task involves generating a report, the input filter can ensure that only the data required for the report generation is passed to the task.*
+*Example: If the last task involves generating a report, the input transformation can ensure that only the data required for the report generation is passed to the task.*

-5. **Filter Last Task Output**
-After the last task completes, its output can be filtered before it is considered as the workflow output. This ensures that the workflow produces a clean and relevant output, free from any extraneous data that might have been generated during the task execution.
+5. **Transform Last Task Output**
+After the last task completes, its output can be transformed before it is considered the workflow output. Transformations are applied using the `output.as` runtime expression. It is evaluated against the raw task output and defaults to the identity expression, which leaves the output unchanged. Its result is passed to the workflow's `output.as` runtime expression. This ensures that the workflow produces a clean and relevant output, free from any extraneous data that might have been generated during the task execution.

-*Example: If the last task outputs various statistics, a filter can be applied to retain only the key metrics that are relevant to the stakeholders.*
+*Example: If the last task outputs various statistics, a transformation can be applied to retain only the key metrics that are relevant to the stakeholders.*

-6. **Filter Workflow Output**
-Finally, the overall workflow output can be filtered before it is returned to the caller or stored. This step ensures that the final output of the workflow is concise and relevant, containing only the necessary information that needs to be communicated or recorded.
+6. **Transform Workflow Output**
+Finally, the overall workflow output can be transformed before it is returned to the caller or stored. Transformations are applied using the workflow's `output.as` runtime expression. It is evaluated against the last task's transformed output and defaults to the identity expression, which leaves the output unchanged. This step ensures that the final output of the workflow is concise and relevant, containing only the necessary information that needs to be communicated or recorded.

-*Example: If the workflow's final output is a summary report, a filter can ensure that the report contains only the most important summaries and conclusions, excluding any intermediate data.*
+*Example: If the workflow's final output is a summary report, a transformation can ensure that the report contains only the most important summaries and conclusions, excluding any intermediate data.*

-By applying filters at these strategic points, Serverless Workflow DSL ensures that data flows through the workflow in a controlled and efficient manner, maintaining clarity and relevance at each stage of execution. This approach helps in managing complex workflows and ensures that each task operates with the precise data it requires, leading to more predictable and reliable workflow outcomes.
+By applying transformations at these strategic points, Serverless Workflow DSL ensures that data flows through the workflow in a controlled and efficient manner, maintaining clarity and relevance at each execution stage. This approach helps manage complex workflows and ensures that each task operates with the precise data required, leading to more predictable and reliable workflow outcomes.
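The following minimal sketch maps the six stages above onto the corresponding DSL properties. The document metadata, task names, endpoints, and field names are invented for illustration:

```yaml
document:
  dsl: '1.0.0-alpha1'                              # assumed DSL version
  namespace: examples
  name: data-flow-sketch
  version: '1.0.0'
input:
  from: '${ { customer: .customer } }'             # 1. transform workflow input
do:
  - validateCustomer:
      call: http
      with:
        method: post
        endpoint: https://example.com/validate     # placeholder endpoint
      input:
        from: '${ .customer }'                      # 2. transform first task input
      output:
        as: '${ { valid: (.status == "OK") } }'     # 3. transform first task output
      export:
        as: '.validation = $task.output'            #    and update the shared context
  - buildReport:
      call: http
      with:
        method: get
        endpoint: https://example.com/report        # placeholder endpoint
      input:
        from: '${ { includeDetails: .valid } }'     # 4. transform last task input
      output:
        as: '${ .report }'                           # 5. transform last task output
output:
  as: '${ { summary: .summary } }'                   # 6. transform workflow output
```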
@@ -202,8 +260,9 @@ When the evaluation of an expression fails, runtimes **must** raise an error wit
| Name | Type | Description |
|:-----|:----:|:------------|
-| context |`any`| The task's context data. |
-| input |`any`| The task's filtered input. |
+| context |`map`| The task's context data. |
+| input |`any`| The task's transformed input. |
+| output |`any`| The task's transformed output. |
| secrets |`map`| A key/value map of the workflow secrets.<br>To avoid unintentional bleeding, secrets can only be used in the `input.from` runtime expression. |
| task |[`taskDescriptor`](#task-descriptor)| Describes the current task. |
| workflow |[`workflowDescriptor`](#workflow-descriptor)| Describes the current workflow. |
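As a rough illustration of how these arguments appear inside expressions, here is a sketch with an invented task name, endpoint, and field names, assuming `$context` and `$input` are available at the respective points; `$secrets` is limited to `input.from` as noted above:

```yaml
do:
  - getOrder:
      call: http
      with:
        method: get
        endpoint: https://example.com/orders/1                                    # placeholder endpoint
      input:
        from: '${ { orderId: ($context.orderId), token: ($secrets.apiToken) } }'  # $secrets usable only here
      output:
        as: '${ { order: ., requestedOrderId: ($input.orderId) } }'               # $input is the transformed input
```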
@@ -225,7 +284,8 @@ This argument contains information about the runtime executing the workflow.
|:-----|:----:|:------------|:--------|
| name |`string`| The task's name. |`getPet`|
| definition |`map`| The task's definition (specified under the name) as a parsed object |`{ "call": "http", "with": { ... } }`|
-| input |`any`| The task's input *BEFORE* the `input.from` expression. For the result of `input.from` expression use the context of the runtime expression (for jq `.`) | - |
+| input |`any`| The task's *raw* input (i.e. *BEFORE* the `input.from` expression). For the result of the `input.from` expression use the context of the runtime expression (for jq `.`) | - |
+| output |`any`| The task's *raw* output (i.e. *BEFORE* the `output.as` expression). | - |
| startedAt.iso8601 |`string`| The start time of the task as an ISO 8601 date time string. It uses `T` as the date-time delimiter, either UTC (`Z`) or a time zone offset (`+01:00`). The precision can be either seconds, milliseconds or nanoseconds |`2022-01-01T12:00:00Z`, `2022-01-01T12:00:00.123456Z`, `2022-01-01T12:00:00.123+01:00`|
| startedAt.epochMillis |`integer`| The start time of the task as an integer value of milliseconds since midnight of 1970-01-01 UTC |`1641024000123` (="2022-01-01T08:00:00.123Z") |
| startedAt.epochNanos |`integer`| The start time of the task as an integer value of nanoseconds since midnight of 1970-01-01 UTC |`1641024000123456` (="2022-01-01T08:00:00.123456Z") |
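For instance, these descriptor fields could be used to record execution metadata in the context, following the style of the `export.as` example in dsl-reference.md above; the `executionLog` context field is invented:

```yaml
export:
  as: '.executionLog += [ { task: ($task.name), startedAt: ($task.startedAt.iso8601) } ]'
```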
@@ -236,11 +296,23 @@ This argument contains information about the runtime executing the workflow.
|:-----|:----:|:------------|:--------|
| id |`string`| A unique id of the workflow execution. No specific format is imposed | UUIDv4: `4a5c8422-5868-4e12-8dd9-220810d2b9ee`, ULID: `0000004JFGDSW1H037G7J7SFB9`|
| definition |`map`| The workflow's definition as a parsed object |`{ "document": { ... }, "do": [...] }`|
-| input |`any`| The workflow's input *BEFORE* the `input.from` expression. For the result of `input.from` expression use the `$input` argument | - |
+| input |`any`| The workflow's *raw* input (i.e. *BEFORE* the `input.from` expression). For the result of the `input.from` expression use the `$input` argument | - |
| startedAt.iso8601 |`string`| The start time of the execution as an ISO 8601 date time string. It uses `T` as the date-time delimiter, either UTC (`Z`) or a time zone offset (`+01:00`). The precision can be either seconds, milliseconds or nanoseconds |`2022-01-01T12:00:00Z`, `2022-01-01T12:00:00.123456Z`, `2022-01-01T12:00:00.123+01:00`|
| startedAt.epochMillis |`integer`| The start time of the execution as an integer value of milliseconds since midnight of 1970-01-01 UTC |`1641024000123` (="2022-01-01T08:00:00.123Z") |
| startedAt.epochNanos |`integer`| The start time of the execution as an integer value of nanoseconds since midnight of 1970-01-01 UTC |`1641024000123456` (="2022-01-01T08:00:00.123456Z") |

+The following table shows which arguments are available for each runtime expression:
+
+| Runtime Expression | Evaluated on | Produces |`$context`|`$input`|`$output`|`$secrets`|`$task`|`$workflow`|
Serverless Workflow is designed with resilience in mind, acknowledging that errors are an inevitable part of any system. The DSL provides robust mechanisms to identify, describe, and handle errors effectively, ensuring the workflow can recover gracefully from failures.