Documentation Index
Fetch the complete documentation index at: https://docs.aion.to/llms.txt
Use this file to discover all available pages before exploring further.
This page documents the lower-level helpers that already exist in
aion.langgraph today.
Use these helpers when you need direct control over emitted
TaskStatusUpdateEvent or TaskArtifactUpdateEvent objects. For ordinary
reply and post authoring, the higher-level Thread and Message APIs
described elsewhere in this section are the intended fluent surface.
These helpers still participate in the same request-scoped response
accumulator. They do not change the standard precedence among the response
buffer, `a2a_outbox`, and the framework-native fallback.
These helpers are intended for streaming requests, where Aion Server
consumes `stream_mode=["values", "messages", "custom", "updates"]`.
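When `stream_mode` is passed as a list, LangGraph yields `(mode, chunk)` tuples, and payloads written through a node's `StreamWriter` surface under the `"custom"` mode. The stdlib-only sketch below illustrates that consumption shape; the payload dictionaries are placeholders for illustration, not real emitter output.

```python
from typing import Any, Iterator, Tuple


def demux(stream: Iterator[Tuple[str, Any]]) -> dict:
    """Group (mode, chunk) tuples by stream mode, as a consumer would."""
    buckets: dict = {}
    for mode, chunk in stream:
        buckets.setdefault(mode, []).append(chunk)
    return buckets


# Placeholder events imitating stream_mode=["values", "messages", "custom", "updates"]
sample = iter([
    ("updates", {"my_node": {"step": 1}}),
    ("custom", {"kind": "file_artifact"}),  # StreamWriter payloads surface here
    ("values", {"state": "final"}),
])
buckets = demux(sample)
print(sorted(buckets))  # ['custom', 'updates', 'values']
```

Events under the `"custom"` mode are what the emitter helpers documented below produce.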
## Event Helpers
| Helper | Emitted A2A Event | Behavior |
|---|---|---|
| `emit_file_artifact` | `TaskArtifactUpdateEvent` | Emits a file artifact with a file part. |
| `emit_data_artifact` | `TaskArtifactUpdateEvent` | Emits a structured artifact with a data part. |
| `emit_message` | `TaskStatusUpdateEvent` or `TaskArtifactUpdateEvent` | Emits full-message or streaming updates. |
| `emit_task_update` | `TaskStatusUpdateEvent` | Emits one combined task update. |
## Functions
### emit_file_artifact(...)

```python
emit_file_artifact(
    writer,
    *,
    url=None,
    base64=None,
    mime_type,
    name=None,
    artifact_id=None,
    append=False,
    is_last_chunk=True,
)
```

Emits a file artifact, mapped to an A2A `Artifact` with a file part.
| Parameter | Description |
|---|---|
| `writer` | LangGraph `StreamWriter` from the node signature |
| `url` | File URL for remote files; mutually exclusive with `base64` |
| `base64` | File content as base64; mutually exclusive with `url` |
| `mime_type` | MIME type such as `application/pdf` or `image/png` |
| `name` | Artifact name (default: `file`) |
| `artifact_id` | Explicit artifact ID; auto-generated if not provided |
| `append` | Set `True` to append to a previously sent artifact |
| `is_last_chunk` | Set `False` if more chunks are coming |
Use cases: generated PDFs or images, chunked file streaming, external file
references.
Example:
```python
from uuid import uuid4

from langgraph.types import StreamWriter

from aion.langgraph import emit_file_artifact


def my_node(state: dict, writer: StreamWriter):
    # Reference a remote file by URL
    emit_file_artifact(
        writer,
        url="https://example.com/report.pdf",
        mime_type="application/pdf",
        name="report",
    )

    # Stream base64 chunks under a single artifact ID; file_chunks is a list
    # of base64-encoded strings produced earlier in the node
    artifact_id = str(uuid4())
    for i, chunk_base64 in enumerate(file_chunks):
        emit_file_artifact(
            writer,
            base64=chunk_base64,
            mime_type="text/plain",
            artifact_id=artifact_id,
            append=True,
            is_last_chunk=(i == len(file_chunks) - 1),
        )
    return state
```
### emit_data_artifact(...)

```python
emit_data_artifact(writer, data, name=None, artifact_id=None, append=False, is_last_chunk=True)
```

Emits a structured data artifact. `data` must be JSON-serializable.
| Parameter | Description |
|---|---|
| `writer` | LangGraph `StreamWriter` from the node signature |
| `data` | Dictionary or any JSON-serializable value |
| `name` | Artifact name (default: `data`) |
| `artifact_id` | Explicit artifact ID; auto-generated if not provided |
| `append` | Set `True` to append to a previously sent artifact |
| `is_last_chunk` | Set `False` if more chunks are coming |
Use cases: analysis outputs, metrics, and structured response payloads.
Example:
```python
from langgraph.types import StreamWriter

from aion.langgraph import emit_data_artifact


def my_node(state: dict, writer: StreamWriter):
    emit_data_artifact(
        writer,
        {"status": "success", "results": ["a", "b"]},
        name="analysis",
    )
    return state
```
### emit_message(...)

```python
emit_message(writer, message, ephemeral=False)
```

Emits a programmatic message during graph execution. Supports full messages and
streaming chunks.
| Parameter | Description |
|---|---|
| `writer` | LangGraph `StreamWriter` from the node signature |
| `message` | LangChain `AIMessage` or `AIMessageChunk` |
| `ephemeral` | If `True`, the event is sent to the client but not persisted in task history |
With `ephemeral=False` (the default):

- `AIMessage` -> `TaskStatusUpdateEvent` (state `working`, `message=...`); persisted in task history.
- `AIMessageChunk` -> `TaskArtifactUpdateEvent` (`STREAM_DELTA`); streamed, not persisted.

With `ephemeral=True`:

- `AIMessage` or `AIMessageChunk` -> `TaskArtifactUpdateEvent` (`EPHEMERAL_MESSAGE`): emitted to the client and filtered out by the task store.
- Does not change durable response fallback behavior.
Use cases: progress notifications, thinking indicators, and transient status
events.
Example:
```python
from langchain_core.messages import AIMessage, AIMessageChunk
from langgraph.types import StreamWriter

from aion.langgraph import emit_message


def my_node(state: dict, writer: StreamWriter):
    # Streaming chunk: emitted as a STREAM_DELTA artifact, not persisted
    emit_message(writer, AIMessageChunk(content="Processing..."))

    # Ephemeral status message: sent to the client, filtered from task history
    emit_message(
        writer,
        AIMessage(content="Searching knowledge base..."),
        ephemeral=True,
    )
    return state
```
### emit_task_update(...)

```python
emit_task_update(writer, message=None, metadata=None)
```

Emits one combined task update containing a message, metadata, or both.
Only `AIMessage` is accepted for `message`; for chunk streaming, use
`emit_message()`.
| Parameter | Description |
|---|---|
| `writer` | LangGraph `StreamWriter` from the node signature |
| `message` | Optional full message (`AIMessage` only) |
| `metadata` | Optional metadata dictionary to merge into task metadata |
At least one of `message` or `metadata` must be provided. Keys with the
`aion:` prefix in `metadata` are ignored.
Use cases: step completion updates with metadata, progress plus message in one
event.
Example:
```python
from langchain_core.messages import AIMessage
from langgraph.types import StreamWriter

from aion.langgraph import emit_task_update


def my_node(state: dict, writer: StreamWriter):
    # One combined event: message plus metadata merged into task metadata
    emit_task_update(
        writer,
        message=AIMessage(content="Processing complete"),
        metadata={"progress": 100},
    )
    return state
```