Python Quickstart
Get started with the Nebula SDK Python library in just a few minutes.

Installation
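The exact package name is not pinned down here; as a sketch, install from PyPI under the assumed name `nebula-sdk`, or install directly from the GitHub repository listed at the bottom of this page:

```bash
# The PyPI package name is an assumption; if it is not published under this
# name, install straight from the repository instead.
pip install nebula-sdk
# or:
pip install git+https://github.com/0glabs/nebula-sdk-python.git
```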
Basic Setup
First, import and initialize the SDK:
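A minimal sketch of what initialization might look like; the module name `nebula`, the `NebulaClient` class, and the `private_key` argument are assumptions, so check the SDK reference for the actual names:

```python
# Hypothetical import and constructor; the real names may differ.
from nebula import NebulaClient  # assumed module and class name

# Access to the 0G compute network is assumed to be wallet-based here;
# private_key is a placeholder for whatever credential the SDK expects.
client = NebulaClient(private_key="0xYOUR_PRIVATE_KEY")
```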
Your First Chat

Create a simple chat interaction:
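A sketch of a single-turn chat, assuming a `client.chat()` method that takes a model name (from the Available Models table below) and an OpenAI-style message list:

```python
# Hypothetical single-turn chat; the method name and arguments are assumptions.
response = client.chat(
    model="llama-3.3-70b-instruct",  # listed in the Available Models table below
    messages=[{"role": "user", "content": "Hello! What can you do?"}],
)
print(response)
```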
Streaming Chat

For real-time responses, use streaming:
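A sketch of streaming output, assuming the same `chat()` call accepts a `stream=True` flag and yields text chunks as they arrive:

```python
# Hypothetical streaming call; the stream flag and chunk format are assumptions.
for chunk in client.chat(
    model="llama-3.3-70b-instruct",
    messages=[{"role": "user", "content": "Tell me a short story."}],
    stream=True,
):
    print(chunk, end="", flush=True)  # print each text chunk as it arrives
print()
```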
Adding Memory

Enhance your chat with persistent memory:
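A sketch of persistent memory, assuming a `Memory` class backed by a local store; the class, constructor, and `add`/`get` methods are placeholders:

```python
# Hypothetical memory API; the Memory class and its methods are placeholders.
from nebula import Memory  # assumed

memory = Memory(path="./chat_memory")  # assumed on-disk store
memory.add("user_name", "Alice")       # persist a fact
print(memory.get("user_name"))         # available on later runs as well
```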
Creating an Agent with Context

Build an agent that maintains conversation context:
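A sketch of an agent that keeps context across turns, assuming an `Agent` class that wraps the client, a system prompt, and the memory store from the previous step; every name here is a placeholder:

```python
# Hypothetical agent wrapper; the class and its arguments are assumptions.
from nebula import Agent  # assumed

agent = Agent(
    client=client,
    model="llama-3.3-70b-instruct",
    system_prompt="You are a helpful assistant.",
    memory=memory,  # reuse the store from the previous step
)

print(agent.run("My name is Alice."))
print(agent.run("What is my name?"))  # earlier turns are carried as context
```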
Advanced Configuration

Customize your agent with advanced settings:
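A sketch of tuning an agent, assuming keyword arguments such as `temperature`, `max_tokens`, and `timeout`; the available options are assumptions here:

```python
# Hypothetical tuning options; the keyword arguments are assumptions.
agent = Agent(
    client=client,
    model="deepseek-r1-70b",  # reasoning-focused model from the table below
    system_prompt="You are a careful analyst.",
    temperature=0.2,   # lower randomness for analytical answers
    max_tokens=1024,   # cap the response length
    timeout=60,        # seconds to wait before giving up on a request
)
```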
Error Handling

Implement proper error handling for production use:
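A generic error-handling pattern; the SDK's own exception classes are not listed here, so this catches broad built-in errors and should be narrowed once the real types are known:

```python
# Generic pattern; swap the broad except clauses for the SDK's own exception
# types once you know them.
import logging

try:
    response = client.chat(
        model="llama-3.3-70b-instruct",
        messages=[{"role": "user", "content": "Hello"}],
    )
except TimeoutError:
    logging.warning("Request timed out; retry with backoff.")
except Exception as exc:  # placeholder for SDK-specific errors
    logging.error("Chat request failed: %s", exc)
else:
    print(response)
```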
Environment Configuration

Set up environment variables for easier configuration:
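A sketch of reading configuration from environment variables; the variable names `NEBULA_PRIVATE_KEY` and `NEBULA_MODEL` are conventions chosen for this example, not names the SDK requires:

```python
# The variable names below are conventions chosen for this example.
import os

from nebula import NebulaClient  # assumed import, as above

client = NebulaClient(private_key=os.environ["NEBULA_PRIVATE_KEY"])
model = os.environ.get("NEBULA_MODEL", "llama-3.3-70b-instruct")
```

Keeping credentials in the environment rather than in source code makes it easier to switch keys between development and production.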
Next Steps

Now that you have the basics down, explore more advanced features:

- Python Chat API: Learn about advanced chat features and streaming
- Python Memory System: Dive deep into persistent memory and data storage
- Python Agent Framework: Build sophisticated AI agents with custom behaviors
- Agent Tools: Extend agents with custom tools and integrations
Available Models
The Nebula SDK connects to models running on the 0G decentralized compute network:

| Model | Provider Address | Best For | Verification |
|---|---|---|---|
| llama-3.3-70b-instruct | 0xf07240Efa67755B5311bc75784a061eDB47165Dd | General AI tasks, conversations, creative writing | TEE (TeeML) |
| deepseek-r1-70b | 0x3feE5a4dd5FDb8a32dDA97Bed899830605dBD9D3 | Complex reasoning, problem-solving, analysis | TEE (TeeML) |
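As a rough sketch, a model from the table can be selected by name; whether the provider address must be passed explicitly depends on the SDK, so the `provider` argument below is an assumption:

```python
# Hypothetical: the provider argument may not be required by the real API.
response = client.chat(
    model="deepseek-r1-70b",
    provider="0x3feE5a4dd5FDb8a32dDA97Bed899830605dBD9D3",  # address from the table above
    messages=[{"role": "user", "content": "Walk me through the reasoning step by step."}],
)
print(response)
```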
Examples Repository
Check out complete examples in our GitHub repository (see the GitHub link below).

Community and Support
- Documentation: https://docs.0g.ai
- GitHub: https://github.com/0glabs/nebula-sdk-python
- Discord: https://discord.gg/0g
- Email: support@0g.ai