This page describes the planned AionContext surface for aion-langgraph. The design is intentionally grounded in LangGraph’s own invocation-scoped runtime context model. The goal is to avoid forcing Aion transport metadata into graph state just to make reply routing, history, and normalized events available inside nodes and tools.

Overview

The intended shape is:
from dataclasses import dataclass

from aion.shared.types import A2AInbox

# Thread, Message, Event, and AgentIdentity are part of the planned
# aion surface; their import paths are not yet finalized.


@dataclass
class AionContext:
    inbox: A2AInbox | None
    thread: Thread
    message: Message | None
    event: Event
    self: AgentIdentity
Access it from LangGraph nodes through Runtime[AionContext]:
from langgraph.runtime import Runtime


async def node(state: State, runtime: Runtime[AionContext]) -> dict:
    # `State` is whatever state schema the graph defines; the Aion
    # context rides alongside it on the runtime, not inside graph state.
    thread = runtime.context.thread
    inbound = runtime.context.message
    return {}

Properties

| Property | Purpose |
| --- | --- |
| `inbox` | Raw `A2AInbox` snapshot for hybrid authoring and debugging |
| `thread` | Normalized messaging thread or context for the current turn |
| `message` | Normalized inbound message when the current event is message-like |
| `event` | Normalized event descriptor for message, reaction, command, or card action turns |
| `self` | Agent identity and configuration visible to the current invocation |
The important distinction is that `thread` and `message` are developer-facing convenience surfaces, while `inbox` remains the lower-level transport escape hatch.
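To make the convenience-vs-escape-hatch split concrete, here is a minimal sketch using stdlib-only stand-in types (`StubMessage`, `StubThread`, `StubInbox`, `StubContext` are illustrations, not the real aion types): a helper prefers the normalized surface and only falls back to the raw inbox snapshot when no normalized message is available.

```python
from __future__ import annotations

from dataclasses import dataclass


# Stand-ins for illustration only; the real Thread, Message, and A2AInbox
# shapes come from the planned aion surface.
@dataclass
class StubMessage:
    text: str


@dataclass
class StubThread:
    id: str


@dataclass
class StubInbox:
    raw_payload: dict


@dataclass
class StubContext:
    inbox: StubInbox | None
    thread: StubThread
    message: StubMessage | None


def summarize(ctx: StubContext) -> str:
    # Prefer the normalized convenience surface...
    if ctx.message is not None:
        return f"[{ctx.thread.id}] {ctx.message.text}"
    # ...and drop to the raw transport snapshot only when needed.
    if ctx.inbox is not None:
        return f"[{ctx.thread.id}] raw: {ctx.inbox.raw_payload}"
    return f"[{ctx.thread.id}] (no inbound payload)"


ctx = StubContext(
    inbox=StubInbox(raw_payload={"kind": "reaction"}),
    thread=StubThread(id="thr-1"),
    message=None,
)
print(summarize(ctx))  # no normalized message, so this falls back to the inbox
```

Most node code should only ever touch the first branch; reaching for `inbox` is a signal that the normalized surface is missing something.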

Why This Lives in Runtime Context

LangGraph gives us a clean separation between:
  • graph state, which is ideal for LLM-facing data such as state.messages
  • invocation-scoped runtime context, which is ideal for immutable request data and dependency injection
That is a better fit for Aion than storing routing metadata in graph state, because:
  • the inbound messaging target is part of the current request, not long-term graph memory
  • the shared response buffer is request-scoped
  • history lookups and outbound side effects should not require custom state reducers
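The state/context split above can be modeled in a few lines. This is a sketch, not the aion implementation: one way to capture "immutable request data" is a frozen dataclass, where graph state keeps evolving across a turn while the per-invocation context cannot be mutated (`RequestContext` and its `reply_target` field are hypothetical names).

```python
import dataclasses
from dataclasses import dataclass

# Graph state is the mutable, update-driven part of a turn...
state = {"messages": ["hello"]}
state["messages"].append("world")  # evolves via node updates/reducers

# ...while runtime context is a per-invocation snapshot. A frozen
# dataclass is one way to enforce that it stays read-only.
@dataclass(frozen=True)
class RequestContext:
    reply_target: str  # routing metadata: part of the request, not memory


ctx = RequestContext(reply_target="channel-42")
try:
    ctx.reply_target = "elsewhere"  # type: ignore[misc]
except dataclasses.FrozenInstanceError:
    print("context is immutable for the life of the invocation")
```

Because the context is rebuilt for every invocation, routing metadata never leaks into checkpointed state and never needs a custom reducer.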

LangGraph thread_id vs Aion Thread IDs

LangGraph already uses config["configurable"]["thread_id"] for checkpointing and persistence. That should remain LangGraph’s own persistence identifier. The planned Aion thread.id should instead represent the inbound messaging context or conversation identifier carried by the distribution and A2A extensions. Those two IDs can be related, but they should not be conflated.
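One way to keep the two identifiers related but distinct is to derive the LangGraph checkpoint `thread_id` deterministically from the Aion conversation id. The mapping below (`checkpoint_thread_id`, the `lg-` prefix, hashing per agent) is purely an assumption for illustration, not part of the planned design.

```python
import hashlib


def checkpoint_thread_id(aion_thread_id: str, agent: str) -> str:
    # Hypothetical mapping: hash the Aion conversation id per agent so the
    # two identifiers stay related but are never conflated.
    digest = hashlib.sha256(f"{agent}:{aion_thread_id}".encode()).hexdigest()
    return f"lg-{digest[:16]}"


# The Aion thread id is carried by the distribution and A2A extensions...
aion_thread_id = "spaces/abc/threads/xyz"

# ...while LangGraph's persistence identifier lives in config["configurable"].
config = {"configurable": {"thread_id": checkpoint_thread_id(aion_thread_id, "my-agent")}}
print(config["configurable"]["thread_id"])
```

A derived id like this gives each agent its own checkpoint lineage per conversation while leaving `thread.id` free to mean exactly what the messaging transport says it means.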

Example

from langgraph.graph import END, START, MessagesState, StateGraph
from langgraph.runtime import Runtime

from aion.langgraph import AionContext


async def router(state: MessagesState, runtime: Runtime[AionContext]) -> dict:
    event = runtime.context.event

    if event.kind == "command":
        await runtime.context.thread.reply("Command received.")
        return {}

    return {}


builder = StateGraph(MessagesState, context_schema=AionContext)
builder.add_node("router", router)
builder.add_edge(START, "router")
builder.add_edge("router", END)
graph = builder.compile()
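At invocation time, recent LangGraph releases accept the context object alongside the input (roughly `graph.ainvoke(inputs, context=AionContext(...))`; check the installed LangGraph version's Runtime docs). The sketch below simulates the router's behavior with stdlib stand-ins (`StubEvent`, `StubThread`, `StubContext` are illustrative, not real aion types), so the control flow can be followed without a running deployment.

```python
import asyncio
from dataclasses import dataclass, field


# Stand-ins mirroring the example above; the real types come from aion.langgraph.
@dataclass
class StubEvent:
    kind: str


@dataclass
class StubThread:
    replies: list = field(default_factory=list)

    async def reply(self, text: str) -> None:
        # In the real surface this would route back over the transport.
        self.replies.append(text)


@dataclass
class StubContext:
    event: StubEvent
    thread: StubThread


async def router(state: dict, ctx: StubContext) -> dict:
    # Mirrors the router node: outbound side effects go through the
    # context's thread, not through graph state.
    if ctx.event.kind == "command":
        await ctx.thread.reply("Command received.")
    return {}


ctx = StubContext(event=StubEvent(kind="command"), thread=StubThread())
asyncio.run(router({"messages": []}, ctx))
print(ctx.thread.replies)  # ['Command received.']
```

Note that the reply never appears in the returned state update; the transport side effect and the LLM-facing message history stay on separate surfaces.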