# Create Chat

Create a new chat instance to interact with AI models using Python.
## Constructor

```python
Chat(broker: ZGComputeBroker, provider_address: str, temperature: float = 0.7, max_tokens: int = 1000)
```
### Parameters

- `broker` (ZGComputeBroker): The compute broker instance for connecting to the 0G network
- `provider_address` (str): The provider address on the 0G network for the AI model
- `temperature` (float): Controls randomness in responses (0.0-2.0); defaults to `0.7`
- `max_tokens` (int): Maximum number of tokens in the response; defaults to `1000`
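The documented ranges can be checked before constructing a `Chat`. A minimal sketch; the `validate_chat_params` helper is illustrative, not part of the SDK:

```python
# Hypothetical validation mirroring the documented parameter ranges.
# Not part of the 0G AI SDK; shown only to make the ranges concrete.
def validate_chat_params(temperature: float, max_tokens: int) -> None:
    """Raise ValueError if values fall outside the documented ranges."""
    if not 0.0 <= temperature <= 2.0:
        raise ValueError(f"temperature must be in [0.0, 2.0], got {temperature}")
    if max_tokens <= 0:
        raise ValueError(f"max_tokens must be positive, got {max_tokens}")

validate_chat_params(0.7, 1000)  # the defaults pass silently
```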
### Response

Returns a `Chat` instance that can be used to send messages and stream responses.
### Example

```python
import asyncio

from zg_ai_sdk import create_agent

async def main():
    agent = await create_agent({
        'name': 'My Assistant',
        'provider_address': '0xf07240Efa67755B5311bc75784a061eDB47165Dd',  # llama-3.3-70b-instruct
        'memory_bucket': 'my-agent-memory',
        'private_key': 'your-private-key',
        'max_tokens': 2000,
        'temperature': 0.8
    })

    response = await agent.ask('Hello, how are you?')
    print(response)

asyncio.run(main())
```
## Available Models

The 0G AI SDK connects to models running on the 0G decentralized compute network:

| Model | Provider Address | Description | Verification |
|---|---|---|---|
| llama-3.3-70b-instruct | 0xf07240Efa67755B5311bc75784a061eDB47165Dd | State-of-the-art 70B parameter model for general AI tasks | TEE (TeeML) |
| deepseek-r1-70b | 0x3feE5a4dd5FDb8a32dDA97Bed899830605dBD9D3 | Advanced reasoning model optimized for complex problem solving | TEE (TeeML) |
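For convenience, the table above can be mirrored as a lookup dict so code selects a model by name rather than hard-coding addresses inline. A small sketch; the `MODELS` mapping is a local convention, not an SDK export:

```python
# Provider addresses from the table above, keyed by model name.
MODELS = {
    "llama-3.3-70b-instruct": "0xf07240Efa67755B5311bc75784a061eDB47165Dd",
    "deepseek-r1-70b": "0x3feE5a4dd5FDb8a32dDA97Bed899830605dBD9D3",
}

# Pick an address by model name instead of pasting the hex string inline.
provider_address = MODELS["llama-3.3-70b-instruct"]
```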
## Methods

### ask()

Send a simple question to the AI model.

```python
async def ask(self, question: str, system_prompt: Optional[str] = None) -> str
```

**Parameters:**

- `question` (str): The question to ask the AI
- `system_prompt` (Optional[str]): Optional system prompt to set context

**Returns:** `str` - The AI's response

**Example:**

```python
response = await chat.ask('What is machine learning?')
print(response)
```
### chat_completion()

Send a structured chat completion request.

```python
async def chat_completion(self, messages: List[ChatMessage]) -> ChatCompletionResponse
```

**Parameters:**

- `messages` (List[ChatMessage]): List of chat messages

**Returns:** `ChatCompletionResponse` - Structured response with usage info

**Example:**

```python
from zg_ai_sdk import ChatMessage

messages = [
    ChatMessage(role='system', content='You are a helpful assistant'),
    ChatMessage(role='user', content='Explain quantum computing')
]

response = await chat.chat_completion(messages)
print(response.choices[0].message.content)
```
### stream_chat_completion()

Stream responses for real-time chat.

```python
async def stream_chat_completion(
    self,
    messages: List[ChatMessage],
    on_chunk: callable
) -> str
```

**Parameters:**

- `messages` (List[ChatMessage]): List of chat messages
- `on_chunk` (callable): Callback function invoked with each chunk of the response

**Returns:** `str` - Complete response text

**Example:**

```python
def handle_chunk(chunk: str):
    print(chunk, end='', flush=True)

messages = [ChatMessage(role='user', content='Tell me a story')]
full_response = await chat.stream_chat_completion(messages, handle_chunk)
```
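The chunk-callback pattern can be exercised without a network connection. The sketch below uses `fake_stream`, a stand-in for the real method (not an SDK function), to show how each chunk is delivered to the callback in order and how the full text is returned at the end:

```python
import asyncio
from typing import Callable, List

# Stand-in for stream_chat_completion: yields fixed chunks instead of
# talking to the 0G network, but follows the same callback contract.
async def fake_stream(chunks: List[str], on_chunk: Callable[[str], None]) -> str:
    collected = []
    for chunk in chunks:
        on_chunk(chunk)          # fires once per chunk, in arrival order
        collected.append(chunk)
        await asyncio.sleep(0)   # yield control, as a real stream would
    return "".join(collected)    # the complete response text

received: List[str] = []
full = asyncio.run(fake_stream(["Once ", "upon ", "a time"], received.append))
```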
### get_service_info()

Get information about the AI service.

```python
async def get_service_info(self) -> ServiceMetadata
```

**Returns:** `ServiceMetadata` - Service information including endpoint and model

**Example:**

```python
info = await chat.get_service_info()
print(f"Model: {info.model}, Endpoint: {info.endpoint}")
```
## Configuration Methods

### set_temperature()

Update the temperature setting.

```python
def set_temperature(self, temperature: float) -> None
```

**Parameters:**

- `temperature` (float): New temperature value (0.0-2.0)

**Example:**

```python
chat.set_temperature(0.9)  # More creative responses
```
### set_max_tokens()

Update the maximum tokens setting.

```python
def set_max_tokens(self, max_tokens: int) -> None
```

**Parameters:**

- `max_tokens` (int): New maximum tokens value

**Example:**

```python
chat.set_max_tokens(2000)  # Allow longer responses
```
### get_config()

Get the current configuration.

```python
def get_config(self) -> Dict[str, Any]
```

**Returns:** `Dict[str, Any]` - Current configuration settings

**Example:**

```python
config = chat.get_config()
print(f"Temperature: {config['temperature']}, Max Tokens: {config['max_tokens']}")
```
## Error Handling

The constructor and methods raise exceptions if:

- An invalid provider address is provided
- The network connection fails
- Invalid configuration parameters are passed

```python
from zg_ai_sdk import SDKError

try:
    chat = Chat(
        broker=broker,
        provider_address='invalid-address',
        temperature=0.7
    )
    response = await chat.ask('Hello')
except SDKError as error:
    print(f'Failed to create chat: {error.message} (Code: {error.code})')
except Exception as error:
    print(f'Unexpected error: {error}')
```
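Transient network failures are often worth retrying before surfacing an error. A sketch of a generic backoff wrapper around an awaitable call such as `chat.ask()`; the `ask_with_retry` helper is hypothetical, not part of the SDK:

```python
import asyncio

# Hypothetical retry wrapper with exponential backoff. Retries on any
# exception; the final attempt re-raises so the caller still sees errors.
async def ask_with_retry(ask, question: str, retries: int = 3, base_delay: float = 0.01) -> str:
    for attempt in range(retries):
        try:
            return await ask(question)
        except Exception:
            if attempt == retries - 1:
                raise
            await asyncio.sleep(base_delay * 2 ** attempt)

# Demonstration with a flaky stand-in for chat.ask that fails twice:
calls = {"n": 0}
async def flaky_ask(question: str) -> str:
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient network failure")
    return f"answer to: {question}"

result = asyncio.run(ask_with_retry(flaky_ask, "Hello"))
```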
## Next Steps