The workflow-internal API surface, exposed as Durable.workflow. Every method on this class is designed to be called inside a workflow function — they participate in deterministic replay and durable state management.

Method           Purpose
proxyActivities  Create durable activity proxies with retry
sleepFor         Durable, crash-safe sleep
waitFor          Pause until a signal is received
signal           Send data to a waiting workflow
execChild        Spawn and await a child workflow
startChild       Spawn a child workflow (fire-and-forget)
execHook         Spawn a hook and await its signal response
execHookBatch    Spawn multiple hooks in parallel
hook             Low-level hook spawning
interrupt        Terminate a running workflow

Method  Purpose
search  Read/write flat HASH key-value data
enrich  One-shot HASH enrichment
entity  Structured JSONB document storage
emit    Publish events to the event bus
trace   Emit OpenTelemetry trace spans

Method        Purpose
getContext    Access workflow ID, namespace, replay state
random        Deterministic pseudo-random numbers
all           Workflow-safe Promise.all
didInterrupt  Type guard for engine control-flow errors
import { Durable } from '@hotmeshio/hotmesh';
import * as activities from './activities';

export async function orderWorkflow(orderId: string): Promise<string> {
  // Proxy activities for durable execution
  const { validateOrder, processPayment, sendReceipt } =
    Durable.workflow.proxyActivities<typeof activities>({
      activities,
      retryPolicy: { maximumAttempts: 3 },
    });

  await validateOrder(orderId);

  // Durable sleep (survives restarts)
  await Durable.workflow.sleepFor('5 seconds');

  const receipt = await processPayment(orderId);

  // Store searchable metadata
  await Durable.workflow.enrich({ orderId, status: 'paid' });

  // Wait for external approval signal
  const approval = await Durable.workflow.waitFor<{ ok: boolean }>('approve');
  if (!approval.ok) return 'cancelled';

  await sendReceipt(orderId, receipt);
  return receipt;
}

Properties

all: (<T>(...promises: Promise<T>[]) => Promise<T[]>) = all

Type declaration

    • <T>(...promises): Promise<T[]>
    • A workflow-safe version of Promise.all that applies a micro-delay before parallel execution to ensure correct sequencing of the deterministic execution counter. Use this when you need to run multiple durable operations concurrently within a workflow function.

      In most cases, standard Promise.all works correctly for Durable operations (e.g., parallel waitFor calls). Use Durable.workflow.all when you observe counter-sequencing issues with complex parallel patterns.

      import { Durable } from '@hotmeshio/hotmesh';
      import * as activities from './activities';

      export async function parallelWorkflow(): Promise<[string, number]> {
        const { fetchName, fetchScore } = Durable.workflow.proxyActivities<typeof activities>();

        const [name, score] = await Durable.workflow.all(
          fetchName('user-1'),
          fetchScore('user-1'),
        );

        return [name, score];
      }

      Type Parameters

      • T

      Parameters

      • Rest ...promises: Promise<T>[]

        An array of promises to execute concurrently.

      Returns Promise<T[]>

      A promise resolving to an array of results.

didInterrupt: ((error: Error) => boolean) = didInterrupt

Type declaration

    • (error): boolean
    • Type guard that returns true if an error is a Durable engine control-flow signal rather than a genuine application error.

      Durable uses thrown errors internally to suspend workflow execution for durable operations like sleepFor, waitFor, proxyActivities, and execChild. These errors must be re-thrown (not swallowed) so the engine can persist state and schedule the next step.

      Always use didInterrupt in catch blocks inside workflow functions to avoid accidentally swallowing engine signals.

      Related engine error classes: DurableChildError, DurableFatalError, DurableMaxedError, DurableProxyError, DurableRetryError, DurableSleepError, DurableTimeoutError, DurableWaitForError, DurableWaitForAllError

      import { Durable } from '@hotmeshio/hotmesh';
      import * as activities from './activities';

      export async function safeWorkflow(): Promise<string> {
        const { riskyOperation } = Durable.workflow.proxyActivities<typeof activities>();

        try {
          return await riskyOperation();
        } catch (error) {
          // CRITICAL: re-throw engine signals
          if (Durable.workflow.didInterrupt(error)) {
            throw error;
          }
          // Handle real application errors
          return 'fallback-value';
        }
      }

      // Common pattern in interceptors
      const interceptor: WorkflowInterceptor = {
        async execute(ctx, next) {
          try {
            return await next();
          } catch (error) {
            if (Durable.workflow.didInterrupt(error)) {
              throw error; // always re-throw engine signals
            }
            // Log and re-throw application errors
            console.error('Workflow failed:', error);
            throw error;
          }
        },
      };

      Parameters

      • error: Error

        The error to check.

      Returns boolean

      true if the error is a Durable engine interruption signal.

didRun: Object = didRun
emit: ((events: StringAnyType, config?: {
    once: boolean;
}) => Promise<boolean>) = emit

Type declaration

    • (events, config?): Promise<boolean>
    • Emits pub/sub events to the event bus, allowing workflows to broadcast messages to external subscribers. Each entry in the events map is published as a separate message on the corresponding topic.

      By default (config.once = true), events are emitted exactly once per workflow execution — the isSideEffectAllowed guard prevents re-emission on replay. Set config.once = false to emit on every re-execution (rarely needed).

      import { Durable } from '@hotmeshio/hotmesh';
      import * as activities from './activities';

      // Emit a domain event when an order is processed
      export async function orderWorkflow(orderId: string): Promise<void> {
        const { processOrder } = Durable.workflow.proxyActivities<typeof activities>();
        const result = await processOrder(orderId);

        await Durable.workflow.emit({
          'order.completed': { orderId, total: result.total },
          'analytics.event': { type: 'order', orderId },
        });
      }

      // Emit progress events during a long-running workflow
      export async function batchWorkflow(items: string[]): Promise<void> {
        const { processItem } = Durable.workflow.proxyActivities<typeof activities>();

        for (let i = 0; i < items.length; i++) {
          await processItem(items[i]);
          await Durable.workflow.emit(
            { 'batch.progress': { completed: i + 1, total: items.length } },
            { once: false }, // emit on every execution (progress updates)
          );
        }
      }

      Parameters

      • events: StringAnyType

        A mapping of topic to message payload.

      • Optional config: {
            once: boolean;
        } = ...

        If true, events emit only once (idempotent).

        • once: boolean

      Returns Promise<boolean>

      true after emission completes.

enrich: ((fields: StringStringType) => Promise<boolean>) = enrich

Type declaration

    • (fields): Promise<boolean>
    • Adds custom key-value metadata to the workflow's backend HASH record in a single call. This is a convenience wrapper around search().set(fields) that handles session creation automatically.

      Enrichment runs exactly once per workflow execution — the underlying search session ensures idempotency on replay.

      Use enrich for quick one-shot writes. For repeated reads/writes within the same workflow, prefer acquiring a search() session handle directly.

      import { Durable } from '@hotmeshio/hotmesh';
      import * as activities from './activities';

      export async function onboardingWorkflow(userId: string): Promise<void> {
        // Tag the workflow record with queryable metadata
        await Durable.workflow.enrich({
          userId,
          stage: 'verification',
          startedAt: new Date().toISOString(),
        });

        const { verifyIdentity } = Durable.workflow.proxyActivities<typeof activities>();
        await verifyIdentity(userId);

        await Durable.workflow.enrich({ stage: 'complete' });
      }

      Parameters

      • fields: StringStringType

        A mapping of field names to string values to write to the workflow's HASH record.

      Returns Promise<boolean>

      true when enrichment completes.

entity: (() => Promise<Entity>) = entity

Type declaration

    • (): Promise<Entity>
    • Returns an Entity session handle for interacting with the workflow's structured JSON document storage. Unlike search() (flat HASH key-value pairs), entity() provides a JSONB document store with deep merge, append, and path-based get operations.

      Each call produces a unique session ID tied to the deterministic execution counter, ensuring correct replay behavior.

      import { Durable } from '@hotmeshio/hotmesh';
      import * as activities from './activities';

      export async function userProfileWorkflow(userId: string): Promise<UserProfile> {
        const entity = await Durable.workflow.entity();

        // Initialize a structured document
        await entity.set({
          user: { id: userId, status: 'active' },
          preferences: { theme: 'dark', locale: 'en-US' },
        });

        // Deep merge: adds name without overwriting existing fields
        await entity.merge({ user: { name: 'Alice', email: 'alice@example.com' } });
        // user is now: { id: userId, status: 'active', name: 'Alice', email: '...' }

        // Append to an array
        await entity.set({ user: { tags: ['premium'] } });
        await entity.append({ user: { tags: ['verified'] } });
        // user.tags is now: ['premium', 'verified']

        // Read a nested path
        const user = await entity.get('user');
        return user as UserProfile;
      }

      // Accumulate state across activities
      export async function pipelineWorkflow(input: string): Promise<PipelineResult> {
        const entity = await Durable.workflow.entity();
        const { step1, step2, step3 } = Durable.workflow.proxyActivities<typeof activities>();

        const r1 = await step1(input);
        await entity.merge({ pipeline: { step1: r1 } });

        const r2 = await step2(r1);
        await entity.merge({ pipeline: { step2: r2 } });

        const r3 = await step3(r2);
        await entity.merge({ pipeline: { step3: r3 } });

        return await entity.get('pipeline') as PipelineResult;
      }

      Returns Promise<Entity>

      An entity session scoped to the current workflow job.
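The merge and append semantics described above can be modeled in plain TypeScript. This is an illustrative sketch only — the `deepMerge`, `deepAppend`, and `Json` identifiers below are hypothetical and not part of the Durable API; the Entity store applies these operations against its backing JSONB document.

```typescript
// Hypothetical model of entity merge/append semantics (not engine code).
type Json = { [key: string]: any };

function isPlainObject(v: any): v is Json {
  return v !== null && typeof v === 'object' && !Array.isArray(v);
}

// merge: deep-merges nested objects; primitives and arrays in the patch replace
function deepMerge(target: Json, patch: Json): Json {
  const out: Json = { ...target };
  for (const [key, value] of Object.entries(patch)) {
    out[key] =
      isPlainObject(value) && isPlainObject(out[key])
        ? deepMerge(out[key], value)
        : value;
  }
  return out;
}

// append: like merge, but arrays concatenate instead of replacing
function deepAppend(target: Json, patch: Json): Json {
  const out: Json = { ...target };
  for (const [key, value] of Object.entries(patch)) {
    if (Array.isArray(value) && Array.isArray(out[key])) {
      out[key] = [...out[key], ...value];
    } else if (isPlainObject(value) && isPlainObject(out[key])) {
      out[key] = deepAppend(out[key], value);
    } else {
      out[key] = value;
    }
  }
  return out;
}

let doc: Json = { user: { id: 'u1', status: 'active', tags: ['premium'] } };
doc = deepMerge(doc, { user: { name: 'Alice' } });
doc = deepAppend(doc, { user: { tags: ['verified'] } });
console.log(doc.user);
// { id: 'u1', status: 'active', tags: ['premium', 'verified'], name: 'Alice' }
```

The key distinction: set() replaces values wholesale, merge() preserves sibling fields at every depth, and append() extends arrays in place.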

execChild: (<T>(options: WorkflowOptions) => Promise<T>) = execChild

Type declaration

    • <T>(options): Promise<T>
    • Spawns a child workflow and awaits its result. The child runs as an independent job with its own lifecycle, retry policy, and dimensional isolation. If the child fails, the error is propagated to the parent as a typed error (DurableFatalError, DurableMaxedError, DurableTimeoutError, or DurableRetryError).

      On replay, the stored child result is returned immediately without re-spawning the child workflow.

      If options.workflowId is provided, it is used directly. Otherwise, the child ID is generated from the entity/workflow name, a GUID, the parent's dimensional coordinates, and the execution index — ensuring uniqueness across parallel and re-entrant executions.

      import { Durable } from '@hotmeshio/hotmesh';

      // Spawn a child workflow and await its result
      export async function parentWorkflow(orderId: string): Promise<string> {
        const result = await Durable.workflow.execChild<{ status: string }>({
          taskQueue: 'payments',
          workflowName: 'processPayment',
          args: [orderId, 99.99],
          config: {
            maximumAttempts: 3,
            backoffCoefficient: 2,
          },
        });
        return result.status;
      }

      // Fan-out: spawn multiple children in parallel
      export async function batchWorkflow(items: string[]): Promise<string[]> {
        const results = await Promise.all(
          items.map((item) =>
            Durable.workflow.execChild<string>({
              taskQueue: 'processors',
              workflowName: 'processItem',
              args: [item],
            }),
          ),
        );
        return results;
      }

      // Entity-based child (uses entity name as task queue)
      const user = await Durable.workflow.execChild<UserRecord>({
        entity: 'user',
        args: [{ name: 'Alice', email: 'alice@example.com' }],
        workflowId: 'user-alice', // deterministic ID
        expire: 3600, // 1 hour TTL
      });

      Type Parameters

      • T

        The return type of the child workflow.

      Parameters

      • options: WorkflowOptions

        Child workflow configuration (task queue, workflow name, args, retry config).

      Returns Promise<T>

      The child workflow's return value.

execHook: (<T>(options: ExecHookOptions) => Promise<T>) = execHook

Type declaration

    • <T>(options): Promise<T>
    • Combines hook() + waitFor() into a single call: spawns a hook function on a target workflow and suspends the current workflow until the hook signals completion. This is the recommended pattern for request/response communication between workflow threads.

      A signalId is automatically generated (or use the one you provide) and injected as the last argument to the hooked function as { signal: string, $durable: true }. The hook function must call Durable.workflow.signal(signalInfo.signal, result) to deliver its response back to the waiting workflow.

      • execChild spawns a new workflow job with its own lifecycle.
      • execHook runs within an existing workflow's job, in an isolated dimensional thread. This is lighter-weight and shares the parent job's data namespace.
      import { Durable } from '@hotmeshio/hotmesh';
      import * as activities from './activities';

      // Orchestrator: spawn a hook and await its result
      export async function reviewWorkflow(docId: string): Promise<string> {
        const verdict = await Durable.workflow.execHook<{ approved: boolean }>({
          taskQueue: 'reviewers',
          workflowName: 'reviewDocument',
          args: [docId],
        });

        return verdict.approved ? 'accepted' : 'rejected';
      }

      // The hooked function (runs on the 'reviewers' worker)
      export async function reviewDocument(
        docId: string,
        signalInfo?: { signal: string; $durable: boolean },
      ): Promise<{ approved: boolean }> {
        const { analyzeDocument } = Durable.workflow.proxyActivities<typeof activities>();
        const score = await analyzeDocument(docId);
        const result = { approved: score > 0.8 };

        // Signal the waiting workflow with the result
        if (signalInfo?.signal) {
          await Durable.workflow.signal(signalInfo.signal, result);
        }
        return result;
      }

      // With explicit signalId for traceability
      const result = await Durable.workflow.execHook<AnalysisResult>({
        taskQueue: 'analyzers',
        workflowName: 'runAnalysis',
        args: [datasetId],
        signalId: `analysis-${datasetId}`,
      });

      Type Parameters

      • T

        The type of data returned by the hook function's signal.

      Parameters

      • options: ExecHookOptions

        Hook configuration including target workflow and arguments.

      Returns Promise<T>

      The signal result from the hooked function.

execHookBatch: (<T>(hookConfigs: BatchHookConfig<any>[]) => Promise<T>) = execHookBatch

Type declaration

    • <T>(hookConfigs): Promise<T>
    • Executes multiple hooks in parallel and awaits all their signal responses, returning a keyed object of results. This is the recommended way to run concurrent hooks — it solves a race condition where calling Promise.all([execHook(), execHook()]) would throw before all waitFor registrations complete.

      1. Fire all hooks via Promise.all (registers streams immediately).
      2. Await all signals via Promise.all (all waitFor registrations happen together before any DurableWaitForError is thrown).
      3. Combine results into a { [key]: result } map.
      import { Durable } from '@hotmeshio/hotmesh';
      import * as activities from './activities';

      // Fan-out to multiple AI agents, gather all perspectives
      export async function researchWorkflow(query: string): Promise<Summary> {
        const perspectives = await Durable.workflow.execHookBatch<{
          optimistic: PerspectiveResult;
          skeptical: PerspectiveResult;
          neutral: PerspectiveResult;
        }>([
          {
            key: 'optimistic',
            options: {
              taskQueue: 'agents',
              workflowName: 'analyzeOptimistic',
              args: [query],
            },
          },
          {
            key: 'skeptical',
            options: {
              taskQueue: 'agents',
              workflowName: 'analyzeSkeptical',
              args: [query],
            },
          },
          {
            key: 'neutral',
            options: {
              taskQueue: 'agents',
              workflowName: 'analyzeNeutral',
              args: [query],
            },
          },
        ]);

        // All three results are available as typed properties
        const { synthesize } = Durable.workflow.proxyActivities<typeof activities>();
        return await synthesize(
          perspectives.optimistic,
          perspectives.skeptical,
          perspectives.neutral,
        );
      }

      // Parallel validation with different services
      const checks = await Durable.workflow.execHookBatch<{
        fraud: { safe: boolean };
        compliance: { approved: boolean };
      }>([
        {
          key: 'fraud',
          options: {
            taskQueue: 'fraud-detection',
            workflowName: 'checkFraud',
            args: [transactionId],
          },
        },
        {
          key: 'compliance',
          options: {
            taskQueue: 'compliance',
            workflowName: 'checkCompliance',
            args: [transactionId],
          },
        },
      ]);

      if (checks.fraud.safe && checks.compliance.approved) {
        // proceed with transaction
      }

      Type Parameters

      • T extends Record<string, any>

        Object type with keys matching the batch hook keys.

      Parameters

      • hookConfigs: BatchHookConfig<any>[]

        Array of hook configurations with unique keys.

      Returns Promise<T>

      Object mapping each config's key to its signal response.

executeChild: (<T>(options: WorkflowOptions) => Promise<T>) = executeChild

Type declaration

    • <T>(options): Promise<T>
    • Alias of execChild. Spawns a child workflow and awaits its result; see execChild above for full behavior, child-ID generation, and examples.

      Type Parameters

      • T

        The return type of the child workflow.

      Parameters

      Returns Promise<T>

      The child workflow's return value.

getContext: (() => WorkflowContext) = getContext

Type declaration

    • (): WorkflowContext
    • Returns the current workflow's execution context, providing access to the workflow ID, replay state, dimensional coordinates, connection info, and other runtime metadata.

      The context is populated by the worker's wrapWorkflowFunction and stored in AsyncLocalStorage, making it available to any code running within the workflow function's call stack.

      import { Durable } from '@hotmeshio/hotmesh';

      // Access the workflow ID and namespace
      export async function contextAwareWorkflow(): Promise<string> {
        const ctx = Durable.workflow.getContext();
        console.log(`Running workflow ${ctx.workflowId} in ${ctx.namespace}`);
        return ctx.workflowId;
      }

      // Check if the current execution is a replay
      export async function replayAwareWorkflow(): Promise<void> {
        const { counter, workflowDimension } = Durable.workflow.getContext();

        // Use context for logging/debugging
        console.log(`Execution counter: ${counter}, dimension: ${workflowDimension}`);
      }

      // Pass context info to child workflows
      export async function parentWorkflow(): Promise<void> {
        const { workflowId } = Durable.workflow.getContext();

        await Durable.workflow.execChild({
          taskQueue: 'children',
          workflowName: 'childWorkflow',
          args: [workflowId], // pass parent ID to child
        });
      }

      Returns WorkflowContext

      The current workflow context.

hook: ((options: HookOptions) => Promise<string>) = hook

Type declaration

    • (options): Promise<string>
    • Spawns a hook execution against an existing workflow job. The hook runs in an isolated dimensional thread within the target job's namespace, allowing it to read/write the same job state without interfering with the main workflow thread.

      This is the low-level primitive behind execHook(). Use hook() directly when you need fire-and-forget hook execution or when you manage signal coordination yourself.

      • If taskQueue and workflowName (or entity) are provided, the hook targets that specific workflow type.
      • If neither is provided, the hook targets the current workflow. However, targeting the same topic as the current workflow is rejected to prevent infinite loops.

      The isSideEffectAllowed guard ensures hooks fire exactly once — on replay, the hook is not re-spawned.

      import { Durable } from '@hotmeshio/hotmesh';

      // Fire-and-forget: spawn a hook without waiting for its result
      export async function notifyWorkflow(userId: string): Promise<void> {
        await Durable.workflow.hook({
          taskQueue: 'notifications',
          workflowName: 'sendNotification',
          args: [userId, 'Your order has shipped'],
        });
        // Continues immediately, does not wait for the hook
      }

      // Manual signal coordination (equivalent to execHook)
      export async function manualHookPattern(itemId: string): Promise<string> {
        const signalId = `process-${itemId}`;

        await Durable.workflow.hook({
          taskQueue: 'processors',
          workflowName: 'processItem',
          args: [itemId, signalId],
        });

        // Manually wait for the hook to signal back
        return await Durable.workflow.waitFor<string>(signalId);
      }

      // Hook with retry configuration
      await Durable.workflow.hook({
        taskQueue: 'enrichment',
        workflowName: 'enrichProfile',
        args: [profileId],
        config: {
          maximumAttempts: 5,
          backoffCoefficient: 2,
          maximumInterval: '1m',
        },
      });

      Parameters

      • options: HookOptions

        Hook configuration including target workflow and arguments.

      Returns Promise<string>

      The resulting hook/stream ID.

interrupt: ((jobId: string, options?: JobInterruptOptions) => Promise<string | void>) = interrupt

Type declaration

    • (jobId, options?): Promise<string | void>
    • Terminates a running workflow job by its ID. The target job's status is set to an error code indicating abnormal termination, and any pending activities or timers are cancelled.

      This is the workflow-internal interrupt — it can only be called from within a workflow function. For external interruption, use hotMesh.interrupt() directly.

      The interrupt fires exactly once per workflow execution — the isSideEffectAllowed guard prevents re-interrupting on replay.

      import { Durable } from '@hotmeshio/hotmesh';
      import * as activities from './activities';

      // Cancel a child workflow from the parent
      export async function supervisorWorkflow(): Promise<void> {
        const childId = await Durable.workflow.startChild({
          taskQueue: 'workers',
          workflowName: 'longTask',
          args: [],
        });

        // Wait for a timeout, then cancel the child
        await Durable.workflow.sleepFor('5 minutes');
        await Durable.workflow.interrupt(childId, {
          reason: 'Timed out waiting for child',
          descend: true, // also interrupt any grandchild workflows
        });
      }

      // Self-interrupt on validation failure
      export async function validatedWorkflow(input: string): Promise<void> {
        const { workflowId } = Durable.workflow.getContext();
        const { validate } = Durable.workflow.proxyActivities<typeof activities>();

        const isValid = await validate(input);
        if (!isValid) {
          await Durable.workflow.interrupt(workflowId, {
            reason: 'Invalid input',
          });
        }
      }

      Parameters

      • jobId: string

        The ID of the workflow job to interrupt.

      • Optional options: JobInterruptOptions = {}

        Interruption options (reason, descend, etc.).

      Returns Promise<string | void>

      The result of the interruption, if any.

isSideEffectAllowed: Object = isSideEffectAllowed
proxyActivities: (<ACT>(options?: ActivityConfig) => ProxyType<ACT>) = proxyActivities

Type declaration

    • <ACT>(options?): ProxyType<ACT>
    • Creates a typed proxy for calling activity functions with durable execution, automatic retry, and deterministic replay. This is the primary way to invoke side-effectful code (HTTP calls, database writes, file I/O) from within a workflow function.

      Activities execute on a separate worker process via message queue, isolating side effects from the deterministic workflow function. Each proxied call is assigned a unique execution index, and on replay the stored result is returned without re-executing the activity.

      • Default: Activities route to {workflowTaskQueue}-activity.
      • Explicit taskQueue: Activities route to {taskQueue}-activity, enabling shared/global activity worker pools across workflows.
      Option              Default  Description
      maximumAttempts     50       Max retries before the activity is marked as failed
      backoffCoefficient  2        Exponential backoff multiplier
      maximumInterval     '5m'     Cap on delay between retries
      throwOnError        true     Throw on activity failure (set false to return the error)
      import { Durable } from '@hotmeshio/hotmesh';
      import * as activities from './activities';

      // Standard pattern: register and proxy activities inline
      export async function orderWorkflow(orderId: string): Promise<string> {
        const { validateOrder, chargePayment, sendConfirmation } =
          Durable.workflow.proxyActivities<typeof activities>({
            activities,
            retryPolicy: {
              maximumAttempts: 3,
              backoffCoefficient: 2,
              maximumInterval: '30s',
            },
          });

        await validateOrder(orderId);
        const receipt = await chargePayment(orderId);
        await sendConfirmation(orderId, receipt);
        return receipt;
      }

      // Remote activities: reference a pre-registered worker pool by taskQueue
      interface PaymentActivities {
        processPayment: (amount: number) => Promise<string>;
        refundPayment: (txId: string) => Promise<void>;
      }

      export async function refundWorkflow(txId: string): Promise<void> {
        const { refundPayment } =
          Durable.workflow.proxyActivities<PaymentActivities>({
            taskQueue: 'payments',
            retryPolicy: { maximumAttempts: 5 },
          });

        await refundPayment(txId);
      }

      // Interceptor with shared activity pool
      const auditInterceptor: WorkflowInterceptor = {
        async execute(ctx, next) {
          const { auditLog } = Durable.workflow.proxyActivities<{
            auditLog: (id: string, action: string) => Promise<void>;
          }>({
            taskQueue: 'shared-audit',
            retryPolicy: { maximumAttempts: 3 },
          });

          await auditLog(ctx.get('workflowId'), 'started');
          const result = await next();
          await auditLog(ctx.get('workflowId'), 'completed');
          return result;
        },
      };

      // Graceful error handling (no throw)
      const { riskyOperation } = Durable.workflow.proxyActivities<typeof activities>({
        activities,
        retryPolicy: { maximumAttempts: 1, throwOnError: false },
      });

      const result = await riskyOperation();
      if (result instanceof Error) {
        // handle gracefully
      }

      Type Parameters

      • ACT

        The activity type map (use typeof activities for inline registration).

      Parameters

      • Optional options: ActivityConfig

        Activity configuration including retry policy and routing.

      Returns ProxyType<ACT>

      A typed proxy object mapping activity names to their durable wrappers.
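As a rough illustration of how the retry options in the table above interact: the wait before each retry grows geometrically by backoffCoefficient and is capped at maximumInterval. The sketch below assumes a hypothetical 1-second base interval (`baseMs`) — the engine's actual base delay and scheduling are internal details.

```typescript
// Sketch: how maximumAttempts / backoffCoefficient / maximumInterval
// shape the retry schedule. Illustrative only, not engine code.
function backoffScheduleMs(
  maximumAttempts: number,
  backoffCoefficient: number,
  maximumIntervalMs: number,
  baseMs = 1000, // assumed base interval for illustration
): number[] {
  const delays: number[] = [];
  // One delay precedes each retry; the first attempt runs immediately
  for (let attempt = 1; attempt < maximumAttempts; attempt++) {
    const raw = baseMs * Math.pow(backoffCoefficient, attempt - 1);
    delays.push(Math.min(raw, maximumIntervalMs)); // cap at maximumInterval
  }
  return delays;
}

// maximumAttempts: 5, backoffCoefficient: 2, maximumInterval: '5s'
const schedule = backoffScheduleMs(5, 2, 5000);
console.log(schedule); // [1000, 2000, 4000, 5000]
```

Note how the cap kicks in on the final retry: without maximumInterval the fourth delay would be 8000 ms.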

random: (() => number) = random

Type declaration

    • (): number
    • Returns a deterministic pseudo-random number between 0 and 1. The value is derived from the current execution counter, so it produces the same result on every replay of the workflow at this execution point.

      Use this instead of Math.random() inside workflow functions. Math.random() is non-deterministic and would break replay correctness.

      import { Durable } from '@hotmeshio/hotmesh';
      import * as activities from './activities';

      // Generate a deterministic unique suffix
      export async function uniqueWorkflow(): Promise<string> {
        const suffix = Math.floor(Durable.workflow.random() * 10000);
        return `item-${suffix}`;
      }

      // A/B test routing (deterministic per workflow execution)
      export async function experimentWorkflow(userId: string): Promise<string> {
        const { variantA, variantB } = Durable.workflow.proxyActivities<typeof activities>();

        if (Durable.workflow.random() < 0.5) {
          return await variantA(userId);
        } else {
          return await variantB(userId);
        }
      }

      Returns number

      A deterministic pseudo-random number in [0, 1).

search: (() => Promise<Search>) = search

Type declaration

    • (): Promise<Search>
    • Returns a Search session handle for reading and writing key-value data on the workflow's backend HASH record. Search fields are flat string key-value pairs stored alongside the job state, making them queryable via Durable.Client.workflow.search() (FT.SEARCH).

      Each call produces a unique session ID tied to the deterministic execution counter, ensuring correct replay behavior.

      Use search() for flat, indexable key-value data. For structured JSON documents, use entity() instead.

      import { Durable } from '@hotmeshio/hotmesh';
      import * as activities from './activities';

      export async function orderWorkflow(orderId: string): Promise<void> {
        const search = await Durable.workflow.search();

        // Write searchable fields
        await search.set({
          orderId,
          status: 'processing',
          createdAt: new Date().toISOString(),
        });

        const { processOrder } = Durable.workflow.proxyActivities<typeof activities>();
        await processOrder(orderId);

        // Update status
        await search.set({ status: 'completed' });

        // Read a field back
        const status = await search.get('status');
      }

      // Increment a numeric counter
      export async function counterWorkflow(): Promise<number> {
        const search = await Durable.workflow.search();
        await search.set({ count: '0' });
        await search.incr('count', 1);
        await search.incr('count', 1);
        return Number(await search.get('count')); // 2
      }

      Returns Promise<Search>

      A search session scoped to the current workflow job.
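Because search fields are flat string key-value pairs, nested objects must be flattened (and values stringified) before being passed to search.set(). A hypothetical helper sketch, using dot-delimited keys as one possible convention (flattenForSearch is not part of the API):

```typescript
// Hypothetical helper: flattens a nested object into the flat string
// key-value shape that search.set() expects on the backend HASH record.
function flattenForSearch(
  obj: Record<string, unknown>,
  prefix = '',
): Record<string, string> {
  const out: Record<string, string> = {};
  for (const [key, value] of Object.entries(obj)) {
    const path = prefix ? `${prefix}.${key}` : key;
    if (value !== null && typeof value === 'object' && !Array.isArray(value)) {
      // Recurse into nested objects, extending the key path
      Object.assign(out, flattenForSearch(value as Record<string, unknown>, path));
    } else {
      // Leaves (and arrays) are stringified
      out[path] = String(value);
    }
  }
  return out;
}

// flattenForSearch({ order: { id: 'o1', total: 42 } })
// → { 'order.id': 'o1', 'order.total': '42' }
```

For genuinely structured documents, entity() (JSONB storage) avoids the need for flattening entirely.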

signal: ((signalId: string, data: Record<any, any>) => Promise<string>) = signal

Type declaration

    • (signalId, data): Promise<string>
    • Sends a signal payload to a paused workflow thread that is awaiting this signalId via waitFor(). Signals are the primary mechanism for inter-workflow communication and for delivering results from hook functions back to the orchestrating workflow.

      signal is the send side of the coordination pair. The receive side is waitFor(). A signal can be sent from:

      • Another workflow function
      • A hook function (most common pattern with execHook)
      • An external client via Durable.Client.workflow.signal()

      Signals fire exactly once per workflow execution — the isSideEffectAllowed guard ensures they are not re-sent on replay.

      import { Durable } from '@hotmeshio/hotmesh';

      // Hook function that signals completion back to the parent workflow
      export async function processOrder(
        orderId: string,
        signalInfo?: { signal: string; $durable: boolean },
      ): Promise<{ total: number }> {
        const { calculateTotal } = Durable.workflow.proxyActivities<typeof activities>();
        const total = await calculateTotal(orderId);

        // Signal the waiting workflow with the result
        if (signalInfo?.signal) {
          await Durable.workflow.signal(signalInfo.signal, { total });
        }
        return { total };
      }

      // Cross-workflow coordination: workflow A signals workflow B
      export async function coordinatorWorkflow(): Promise<void> {
        const { prepareData } = Durable.workflow.proxyActivities<typeof activities>();
        const data = await prepareData();

        // Signal another workflow that is paused on waitFor('data-ready')
        await Durable.workflow.signal('data-ready', { payload: data });
      }

      // External signal from an API handler (outside a workflow)
      const client = new Durable.Client({ connection });
      await client.workflow.signal('approval-signal', { approved: true });

      Parameters

      • signalId: string

        Unique signal identifier that matches a waitFor() call.

      • data: Record<any, any>

        The payload to deliver to the waiting workflow.

      Returns Promise<string>

      The resulting hook/stream ID.

sleepFor: ((duration: string) => Promise<number>) = sleepFor

Type declaration

    • (duration): Promise<number>
    • Suspends workflow execution for a durable, crash-safe duration. Unlike setTimeout, this sleep survives process restarts — the engine persists the wake-up time and resumes the workflow when the timer expires.

      On replay, sleepFor returns immediately with the stored duration (no actual waiting occurs). This makes it safe for deterministic re-execution.

      Accepts any human-readable duration string parsed by the ms module: '5 seconds', '30s', '2 minutes', '1m', '1 hour', '2h', '1 day', '7d'.
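As an illustration of how these formats map to seconds, here is a hypothetical mini-parser (the engine itself delegates to the ms module; durationToSeconds is not part of the API):

```typescript
// Illustrative only: mirrors the ms-style formats listed above
// ('5 seconds', '30s', '2 minutes', '1 hour', '7d').
const UNIT_SECONDS: Record<string, number> = {
  s: 1, second: 1, seconds: 1,
  m: 60, minute: 60, minutes: 60,
  h: 3600, hour: 3600, hours: 3600,
  d: 86400, day: 86400, days: 86400,
};

function durationToSeconds(duration: string): number {
  // Match "<amount><optional space><unit>", e.g. "30s" or "2 minutes"
  const match = duration.trim().match(/^(\d+)\s*([a-z]+)$/i);
  if (!match) throw new Error(`Unparseable duration: ${duration}`);
  const [, amount, unit] = match;
  const factor = UNIT_SECONDS[unit.toLowerCase()];
  if (factor === undefined) throw new Error(`Unknown unit: ${unit}`);
  return Number(amount) * factor;
}

// durationToSeconds('2 minutes') → 120
```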

      import { Durable } from '@hotmeshio/hotmesh';

      // Simple delay before continuing
      export async function reminderWorkflow(userId: string): Promise<void> {
        const { sendReminder } = Durable.workflow.proxyActivities<typeof activities>();

        // Wait 24 hours (survives server restarts)
        await Durable.workflow.sleepFor('24 hours');
        await sendReminder(userId, 'Your trial expires tomorrow');

        // Wait another 6 days
        await Durable.workflow.sleepFor('6 days');
        await sendReminder(userId, 'Your trial has expired');
      }

      // Exponential backoff with retry loop
      export async function pollingWorkflow(resourceId: string): Promise<string> {
        const { checkStatus } = Durable.workflow.proxyActivities<typeof activities>();

        for (let attempt = 0; attempt < 10; attempt++) {
          const status = await checkStatus(resourceId);
          if (status === 'ready') return status;

          // Exponential backoff: 1s, 2s, 4s, 8s, ...
          const delay = Math.pow(2, attempt);
          await Durable.workflow.sleepFor(`${delay} seconds`);
        }
        return 'timeout';
      }
      // Enforce a minimum duration: run an activity in parallel with a durable sleep
      const { fetchData } = Durable.workflow.proxyActivities<typeof activities>();
      const [result] = await Promise.all([
        fetchData(id),
        Durable.workflow.sleepFor('30 seconds'),
      ]);

      Parameters

      • duration: string

        A human-readable duration string.

      Returns Promise<number>

      The resolved duration in seconds.
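The exponential-backoff example above computes its delays inline; the same schedule can also be factored into a small pure helper that is easy to unit-test (backoffSchedule and its cap parameter are hypothetical, not part of the API):

```typescript
// Hypothetical helper: builds the backoff schedule used by the polling
// loop above (1s, 2s, 4s, ...), capped so long waits stay bounded.
function backoffSchedule(attempts: number, capSeconds = 300): string[] {
  return Array.from({ length: attempts }, (_, attempt) => {
    const delay = Math.min(Math.pow(2, attempt), capSeconds);
    return `${delay} seconds`;
  });
}

// backoffSchedule(4) → ['1 seconds', '2 seconds', '4 seconds', '8 seconds']
```

Each entry is a valid sleepFor duration string, so the schedule can be iterated directly: `for (const d of backoffSchedule(10)) await Durable.workflow.sleepFor(d);`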

startChild: ((options: WorkflowOptions) => Promise<string>) = startChild

Type declaration

    • (options): Promise<string>
    • Spawns a child workflow in fire-and-forget mode. The parent workflow continues immediately without waiting for the child to complete. Returns the child's job ID for later reference (e.g., to interrupt or query the child).

      This is a convenience wrapper around execChild with await: false.

      import { Durable } from '@hotmeshio/hotmesh';

      export async function dispatchWorkflow(taskId: string): Promise<string> {
        // Fire-and-forget: start the child and continue immediately
        const childJobId = await Durable.workflow.startChild({
          taskQueue: 'background',
          workflowName: 'longRunningTask',
          args: [taskId],
        });

        // Optionally store the child ID for monitoring
        const search = await Durable.workflow.search();
        await search.set({ childJobId });

        return childJobId;
      }

      Parameters

      • options: WorkflowOptions

        Configuration for the child workflow (e.g., taskQueue, workflowName, args).

      Returns Promise<string>

      The child workflow's job ID.

trace: ((attributes: StringScalarType, config?: {
    once: boolean;
}) => Promise<boolean>) = trace

Type declaration

    • (attributes, config?): Promise<boolean>
    • Emits a distributed trace span to the configured telemetry sink (e.g., OpenTelemetry). The span is linked to the workflow's trace context (traceId, spanId) for end-to-end observability across workflow executions, activities, and child workflows.

      By default (config.once = true), the trace is emitted exactly once per workflow execution — it will not re-fire on replay.

      import { Durable } from '@hotmeshio/hotmesh';

      // Emit trace spans at key workflow milestones
      export async function orderWorkflow(orderId: string): Promise<void> {
        await Durable.workflow.trace({
          'order.id': orderId,
          'order.stage': 'started',
        });

        const { processPayment } = Durable.workflow.proxyActivities<typeof activities>();
        await processPayment(orderId);

        await Durable.workflow.trace({
          'order.id': orderId,
          'order.stage': 'payment_completed',
        });
      }

      // Trace on every re-execution (for debugging replay behavior)
      await Durable.workflow.trace(
        { 'debug.counter': counter, 'debug.phase': 'retry' },
        { once: false },
      );

      Parameters

      • attributes: StringScalarType

        Key-value attributes to attach to the trace span.

      • Optional config: {
            once: boolean;
        } = ...

        Trace configuration. If once is true (the default), the trace fires only once per workflow execution (idempotent across replays).

        • once: boolean

      Returns Promise<boolean>

      true if tracing succeeded.

waitFor: (<T>(signalId: string) => Promise<T>) = waitFor

Type declaration

    • <T>(signalId): Promise<T>
    • Pauses the workflow until a signal with the given signalId is received. The workflow suspends durably — it survives process restarts and will resume exactly once when the matching signal() call delivers data.

      waitFor is the receive side of the signal coordination pair. The send side is signal(), which can be called from another workflow, a hook function, or externally via Durable.Client.workflow.signal().

      On replay, waitFor returns the previously stored signal payload immediately (no actual suspension occurs).

      import { Durable } from '@hotmeshio/hotmesh';

      // Human-in-the-loop approval pattern
      export async function approvalWorkflow(orderId: string): Promise<boolean> {
        const { submitForReview } = Durable.workflow.proxyActivities<typeof activities>();

        await submitForReview(orderId);

        // Pause indefinitely until a human approves or rejects
        const decision = await Durable.workflow.waitFor<{ approved: boolean }>('approval');

        return decision.approved;
      }

      // Later, from outside the workflow (e.g., an API handler):
      await client.workflow.signal('approval', { approved: true });

      // Fan-in: wait for multiple signals in parallel
      export async function gatherWorkflow(): Promise<[string, number]> {
        const [name, score] = await Promise.all([
          Durable.workflow.waitFor<string>('name-signal'),
          Durable.workflow.waitFor<number>('score-signal'),
        ]);
        return [name, score];
      }

      // Paired with hook: spawn work and wait for the result
      export async function orchestrator(input: string): Promise<string> {
        const signalId = `result-${Durable.workflow.random()}`;

        // Spawn a hook that will signal back when done
        await Durable.workflow.hook({
          taskQueue: 'processors',
          workflowName: 'processItem',
          args: [input, signalId],
        });

        // Wait for the hook to signal completion
        return await Durable.workflow.waitFor<string>(signalId);
      }

      Type Parameters

      • T

        The type of data expected in the signal payload.

      Parameters

      • signalId: string

        A unique signal identifier shared by the sender and receiver.

      Returns Promise<T>

      The data payload associated with the received signal.

Methods