Loop
Iterates over an array resolved from a data reference. The single "each item" branch runs once per array element; after all iterations, the "done" branch fires with the accumulated results.
When to use
- Process a list of items one at a time. ("For each contact in the HubSpot response, create a Postmark email.")
- Bulk operations that need per-item error handling or conditional per-item behaviour.
Configuration
| Field | Required | What it does |
|---|---|---|
| `arrayPath` | Yes | Data reference pointing to the array to iterate. Use the data picker to select an array. |
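Conceptually, a data reference like `{{hubspot.data.contacts}}` is a dotted lookup through the outputs of prior nodes. The resolver below is a hypothetical sketch of that lookup, not the product's actual code; the data picker builds the reference string for you.

```python
def resolve_path(outputs, ref):
    """Hypothetical resolver: walk a dotted reference such as
    '{{hubspot.data.contacts}}' through prior node outputs."""
    node = outputs
    for key in ref.strip("{}").split("."):  # '{{a.b.c}}' -> ['a', 'b', 'c']
        node = node[key]
    return node

# Example prior-node outputs (illustrative shape, not a real HubSpot payload)
outputs = {"hubspot": {"data": {"contacts": [{"email": "a@example.com"}]}}}
contacts = resolve_path(outputs, "{{hubspot.data.contacts}}")
```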
Handles
- Each Item (`each_item`) — fires once per element. Inside, you can reference `{{<loop_id>.data.item}}` (the current element) and `{{<loop_id>.data.index}}` (zero-based position).
- Done (`done`) — fires once after all iterations. Reference `{{<loop_id>.data.results}}` for the array of per-iteration outputs.
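As a mental model, the two handles behave like the sketch below. This is illustrative Python, not the engine's implementation; in particular, whether `count` includes failed iterations is an assumption.

```python
def run_loop(items, each_item):
    """Illustrative model of the Loop node's two handles."""
    results = []
    failed = 0
    for index, item in enumerate(items):
        # the each_item handle fires once per element, with item and index
        try:
            results.append(each_item(item, index))
        except Exception:
            failed += 1  # assumption: failed iterations are counted, not re-raised
    # the done handle fires once, with the accumulated per-iteration outputs
    return {"results": results, "count": len(items), "failedCount": failed}

out = run_loop(["a@example.com", "b@example.com"],
               lambda item, index: f"sent:{index}:{item}")
```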
Example
```
[Trigger] → [Gateway Call: HubSpot list contacts]
  → [Loop arrayPath={{hubspot.data.contacts}}]
      ↘ each_item → [Gateway Call: Postmark send]
      ↘ done → [Emit Event: bulk_email_complete]
```

Inside the each_item branch, the current contact (one element of the HubSpot response) is referenced as `{{loop.data.item.email}}`, `{{loop.data.item.first_name}}`, etc.
What it outputs
```
{
  index: 42,        // while inside each_item
  item: <current>,  // while inside each_item
  results: [...],   // after the loop, on the done handle
  count: 43,        // after the loop
  failedCount: 2    // after the loop
}
```
Limits
- Max iterations: 1000 per single loop invocation. If you need to process more, chunk the input upstream.
- Single wrapped child: in the current release, a Loop can wrap exactly one downstream node. If you need multi-step per-iteration work, emit an event inside the loop and handle it in a separate workflow.
- No break/continue: you can't short-circuit a loop from inside. Use a Filter upstream to pre-trim the array.
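Chunking the input upstream of the Loop node can be as simple as slicing the array into pieces no larger than the cap. A minimal sketch:

```python
MAX_ITERATIONS = 1000  # the per-invocation cap described above

def chunk(items, size=MAX_ITERATIONS):
    """Split an oversized list into loop-sized slices for separate invocations."""
    return [items[i:i + size] for i in range(0, len(items), size)]

parts = chunk(list(range(2500)))  # 2500 items -> chunks of 1000, 1000, 500
```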
Gotchas
- Non-array `arrayPath`: if the reference resolves to a non-array (string, number, null), the loop is skipped entirely — the `done` branch fires with `results: []`. No error.
- Concurrency: iterations run sequentially, not in parallel. For parallel per-item work, use Parallel Split instead (if the number of items is known and small).
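The non-array behaviour can be modelled as below. This is a sketch of the documented gotcha only; the real engine's type check and skip logic are assumptions.

```python
def run_loop_tolerant(value, each_item):
    """Sketch of the non-array gotcha: anything that is not a list skips
    every iteration, and done still fires with results: [] (no error)."""
    if not isinstance(value, list):
        return {"results": []}  # skipped silently, per the gotcha above
    return {"results": [each_item(item, i) for i, item in enumerate(value)]}

skipped = run_loop_tolerant("not-an-array", lambda item, i: item)
normal = run_loop_tolerant(["x", "y"], lambda item, i: item.upper())
```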