Create Agent
Create intelligent AI agents with memory and conversation capabilities using Python.
Overview
The Agent class combines Chat, Memory, and Storage capabilities into a unified interface for building sophisticated AI applications. Agents maintain conversation context, store persistent data, and can be extended with custom tools and behaviors.
Constructor
Agent(config: AgentConfig, broker: ZGComputeBroker, storage_client: ZGStorageClient)
Parameters
config (AgentConfig): Configuration object for the agent
broker (ZGComputeBroker): The compute broker for connecting to the 0G network
storage_client (ZGStorageClient): The storage client for persistent memory
AgentConfig
@dataclass
class AgentConfig:
    name: str
    provider_address: str
    memory_bucket: str
    max_ephemeral_messages: int = 50
    temperature: float = 0.7
    max_tokens: int = 1000
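The defaults above can be verified by instantiating the dataclass directly. The definition is restated below so the snippet runs standalone; in real code you would import AgentConfig from zg_ai_sdk instead.

```python
from dataclasses import dataclass

# Restated from the SDK definition above so this snippet is self-contained.
@dataclass
class AgentConfig:
    name: str
    provider_address: str
    memory_bucket: str
    max_ephemeral_messages: int = 50
    temperature: float = 0.7
    max_tokens: int = 1000

# Only the required fields are passed; the rest fall back to their defaults.
cfg = AgentConfig(
    name='My Assistant',
    provider_address='0x0000000000000000000000000000000000000000',  # placeholder
    memory_bucket='my-agent-memory',
)
print(cfg.temperature, cfg.max_tokens)  # 0.7 1000
```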
Convenience Function
create_agent()
Create a pre-configured agent with default settings.
async def create_agent(config: Dict[str, Any]) -> Agent
Parameters:
config (Dict[str, Any]): Agent configuration dictionary
Configuration Options:
name (str): Agent name
provider_address (str): 0G network provider address
memory_bucket (str): Storage bucket for memory
private_key (str): Private key for blockchain operations
rpc_url (str, optional): Custom RPC endpoint
indexer_rpc (str, optional): Custom indexer endpoint
kv_rpc (str, optional): Custom KV storage endpoint
max_ephemeral_messages (int, optional): Max conversation history
temperature (float, optional): Response randomness (0.0-2.0)
max_tokens (int, optional): Maximum response length
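The optional fields above can be combined in a single configuration dictionary. The sketch below shows one such dict; the endpoint URLs and private key are placeholders, not real 0G values.

```python
# A fuller configuration dict for create_agent(). Endpoint URLs and the
# private key are placeholders, not real 0G network values.
advanced_config = {
    'name': 'Research Assistant',
    'provider_address': '0xf07240Efa67755B5311bc75784a061eDB47165Dd',
    'memory_bucket': 'research-agent-memory',
    'private_key': 'your-private-key',
    'rpc_url': 'https://example-rpc.invalid',          # optional custom RPC endpoint
    'indexer_rpc': 'https://example-indexer.invalid',  # optional indexer endpoint
    'kv_rpc': 'https://example-kv.invalid',            # optional KV storage endpoint
    'max_ephemeral_messages': 100,   # keep more conversation history
    'temperature': 0.3,              # lower = more deterministic responses
    'max_tokens': 2000,
}
# The agent would then be created as usual:
# agent = await create_agent(advanced_config)
```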
Examples
Basic Agent Creation
import asyncio
from zg_ai_sdk import create_agent

async def main():
    # Create a basic agent
    agent = await create_agent({
        'name': 'My Assistant',
        'provider_address': '0xf07240Efa67755B5311bc75784a061eDB47165Dd',
        'memory_bucket': 'my-agent-memory',
        'private_key': 'your-private-key'
    })

    # Initialize the agent
    await agent.init()

    # Set system prompt
    agent.set_system_prompt('You are a helpful AI assistant.')

    # Have a conversation
    response = await agent.ask('Hello, what can you help me with?')
    print(response)

asyncio.run(main())
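A multi-agent system follows the same pattern, one create_agent() call per agent. The sketch below builds per-agent configs from a shared template; the role names and bucket-naming scheme are illustrative assumptions, not SDK requirements.

```python
# Illustrative multi-agent setup: a shared base config plus per-agent overrides.
# The roles and the '<role>-memory' bucket naming are assumptions for this sketch.
base = {
    'provider_address': '0xf07240Efa67755B5311bc75784a061eDB47165Dd',
    'private_key': 'your-private-key',
}

agent_configs = {
    role: {**base, 'name': role.title(), 'memory_bucket': f'{role}-memory'}
    for role in ('researcher', 'writer')
}

# Each agent would then be created and initialized as in the basic example:
# for role, cfg in agent_configs.items():
#     agents[role] = await create_agent(cfg)
#     await agents[role].init()
print(sorted(agent_configs))  # ['researcher', 'writer']
```

Giving each agent its own memory_bucket keeps their persistent memories isolated while they share the same provider and key.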
Core Methods
init()
Initialize the agent and test connections.
async def init(self) -> None
Example:
agent = await create_agent(config)
await agent.init() # Must call before using agent
ask()
Send a simple question to the agent.
async def ask(self, input_text: str) -> str
Parameters:
input_text (str): The question or message to send
Returns: str - The agent’s response
Example:
response = await agent.ask('What is machine learning?')
print(response)
chat_with_context()
Have a conversation with full context and memory.
async def chat_with_context(self, input_text: str) -> str
Parameters:
input_text (str): The message to send
Returns: str - The agent’s response with full context
Example:
# Messages are automatically added to conversation history
response1 = await agent.chat_with_context('My name is Alice')
response2 = await agent.chat_with_context('What is my name?')  # Will remember Alice
stream_chat()
Stream responses in real-time.
async def stream_chat(
    self,
    input_text: str,
    on_chunk: Callable[[str], None]
) -> str
Parameters:
input_text (str): The message to send
on_chunk (Callable): Function to handle each response chunk
Returns: str - Complete response after streaming
Example:
def print_chunk(chunk: str):
    print(chunk, end='', flush=True)

response = await agent.stream_chat('Tell me a story', print_chunk)
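The on_chunk callback can also accumulate chunks rather than print them. The loop below only simulates per-chunk delivery (assuming, per the description above, that stream_chat invokes on_chunk once for each response chunk); in real use you would pass collect as the on_chunk argument.

```python
# Collect streamed chunks into a list via a closure. The for-loop stands in
# for stream_chat delivering chunks; in real use:
#     response = await agent.stream_chat('Tell me a story', collect)
chunks = []

def collect(chunk: str):
    chunks.append(chunk)

for piece in ('Once ', 'upon ', 'a ', 'time.'):  # simulated stream
    collect(piece)

full_response = ''.join(chunks)
print(full_response)  # Once upon a time.
```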
System Prompt Management
set_system_prompt()
Set the agent’s system prompt to define behavior.
def set_system_prompt(self, prompt: str) -> None
Example:
agent.set_system_prompt('''
You are a helpful coding assistant. When helping with code:
1. Provide clear explanations
2. Include comments in code examples
3. Suggest best practices
4. Ask for clarification when needed
''')
save_system_prompt()
Save the current system prompt to persistent memory.
async def save_system_prompt(self) -> None
Example:
agent.set_system_prompt('You are a creative writing assistant.')
await agent.save_system_prompt()  # Persists across sessions
Memory Methods
remember()
Store data in persistent memory.
async def remember(self, key: str, value: Any) -> None
recall()
Retrieve data from persistent memory.
async def recall(self, key: str) -> Any
forget()
Remove data from persistent memory.
async def forget(self, key: str) -> None
Example:
# Store user preferences
await agent.remember('user_preferences', {
    'language': 'Python',
    'experience_level': 'intermediate'
})

# Retrieve preferences
prefs = await agent.recall('user_preferences')

# Remove old data
await agent.forget('temporary_data')
Conversation Management
save_conversation()
Save current conversation to persistent storage.
async def save_conversation(self, conversation_id: Optional[str] = None) -> str
Returns: str - The conversation ID
load_conversation()
Load a previously saved conversation.
async def load_conversation(self, conversation_id: str) -> None
clear_conversation()
Clear current conversation from memory.
def clear_conversation(self) -> None
Example:
# Have a conversation
await agent.ask('Hello')
await agent.ask('How are you?')

# Save it
conv_id = await agent.save_conversation('greeting_session')

# Start fresh
agent.clear_conversation()

# Load previous conversation
await agent.load_conversation('greeting_session')
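Since save_conversation() accepts an optional ID, one convenient pattern (an assumption of this sketch, not an SDK convention) is a deterministic per-user, per-day ID, so a given day's session can always be reloaded by reconstructing the same string.

```python
from datetime import date

# Hypothetical helper: deterministic conversation IDs per user and day,
# so the same session can be found again with load_conversation().
def daily_conversation_id(user: str, day: date) -> str:
    return f"{user}-{day.isoformat()}"

conv_id = daily_conversation_id('alice', date(2024, 1, 15))
print(conv_id)  # alice-2024-01-15

# In real use:
# conv_id = await agent.save_conversation(daily_conversation_id('alice', date.today()))
```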
Configuration Methods
set_temperature()
Adjust response creativity.
def set_temperature(self, temperature: float) -> None
set_max_tokens()
Set maximum response length.
def set_max_tokens(self, max_tokens: int) -> None
Example:
agent.set_temperature(0.9)  # More creative
agent.set_max_tokens(2000)  # Longer responses
Introspection Methods
get_stats()
Get agent statistics and current state.
def get_stats(self) -> Dict[str, Any]
Returns: Dictionary with agent statistics
get_service_info()
Get information about the connected AI service.
async def get_service_info(self) -> ServiceMetadata
Example:
stats = agent.get_stats()
print(f"Agent: {stats['name']}")
print(f"Messages in memory: {stats['memory']['ephemeral_messages']}")

service_info = await agent.get_service_info()
print(f"Model: {service_info.model}")
Error Handling
from zg_ai_sdk import SDKError

try:
    agent = await create_agent(config)
    await agent.init()
    response = await agent.ask('Hello')
except SDKError as e:
    print(f"SDK Error: {e.message} (Code: {e.code})")
except Exception as e:
    print(f"Unexpected error: {e}")
Best Practices
Always Initialize: Call await agent.init() before using the agent
Set System Prompts: Define clear behavior with system prompts
Handle Errors: Implement proper error handling for network issues
Manage Memory: Use conversation management for long sessions
Save Important Data: Store critical information in persistent memory
Monitor Usage: Check agent stats periodically for performance insights
Next Steps