# Examples

Runnable SDK examples for Rust, Python, and Node.js — each using real LLM configuration.
Every example loads real LLM configuration from `~/.a3s/config.hcl` or `$A3S_CONFIG` and is available in all three SDKs.

Set `A3S_CONFIG=/path/to/config.hcl` or create `~/.a3s/config.hcl` before running any example. See Providers for the config format.
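The lookup order above (an explicit `$A3S_CONFIG` path, falling back to `~/.a3s/config.hcl`) can be sketched in Python. This is only an illustration of the documented precedence; the helper name `resolve_config_path` is hypothetical, and the SDKs presumably perform this resolution internally:

```python
import os
from pathlib import Path

def resolve_config_path() -> Path:
    """Resolve the a3s config file using the documented lookup order:
    an explicit $A3S_CONFIG path wins, otherwise fall back to
    ~/.a3s/config.hcl. (Helper name is hypothetical, for illustration.)"""
    explicit = os.environ.get("A3S_CONFIG")
    if explicit:
        return Path(explicit)
    return Path.home() / ".a3s" / "config.hcl"
```

If neither location exists, the examples have no provider credentials to load, so creating one of the two files is a prerequisite for every example below.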
## Quick Setup

**Rust**

```bash
cd crates/code/core
cargo run --example <name>
```

**Python**

```bash
cd crates/code/sdk/python
pip install -e .
python examples/<name>.py
```

**Node.js**

```bash
cd crates/code/sdk/node
npm install && npm run build
node examples/<name>.js
```

## Example Index
- **Quick Start**: Basic send, streaming, and multi-turn conversation
- **Streaming**: Real-time event stream with `AgentEvent`
- **Model Switching**: Switch providers and models at runtime
- **Direct Tools**: Call tools directly without involving the LLM
- **Planning**: Task decomposition with goal tracking
- **Hooks**: Lifecycle event interception (`PreToolUse`, `PostToolUse`, ...)
- **Auto-Compact**: Automatic context window compaction
- **Memory**: Persistent file-based memory store
- **Security**: Input taint tracking and output sanitization
- **Skills**: Built-in and custom skill system
- **Batch Tool**: Parallel tool execution in a single turn
- **Vector RAG**: Semantic code search via vector embeddings
- **Lane Queue**: Priority-based task queue with preemption
- **External Tasks**: Multi-machine coordinator/worker pattern
- **Git Worktree**: Parallel workspace isolation with the `git_worktree` tool
- **Prompt Slots**: Slot-based system prompt customization (role, guidelines, style)
## Feature Coverage

Each example page includes property tables covering:

- Feature Overview
- Core (always available)
- Session configuration
- Extension points