Quick Deployment
Docker Quick Start
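A typical quick start, close to the upstream Open WebUI run command, looks like this (adjust the host port and container name as needed):

```bash
# Sketch: start Open WebUI on http://localhost:3000 with persistent data.
docker run -d \
  -p 3000:8080 \
  -v open-webui:/app/backend/data \
  --name open-webui \
  --restart always \
  ghcr.io/open-webui/open-webui:main
```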
Docker Compose Deployment
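A minimal sketch of an equivalent docker-compose.yml, with the APIYI connection passed through environment variables (covered in the next section; verify the variable names against your Open WebUI version's environment reference):

```yaml
# Sketch: docker-compose.yml for Open WebUI with persistent storage.
# The OPENAI_* variables are discussed in the APIYI configuration section below.
services:
  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    ports:
      - "3000:8080"
    environment:
      - OPENAI_API_BASE_URL=https://api.apiyi.com/v1
      - OPENAI_API_KEY=sk-your-apiyi-key
    volumes:
      - open-webui:/app/backend/data
    restart: always

volumes:
  open-webui:
```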
Configure APIYI
Method 1: Environment Variable Configuration
Set environment variables during deployment:
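For example, with plain docker run the same settings can be passed with -e flags; a minimal sketch, assuming the OPENAI_API_BASE_URL and OPENAI_API_KEY variables used by current Open WebUI releases:

```bash
# Sketch: point Open WebUI at APIYI via environment variables at startup.
docker run -d \
  -p 3000:8080 \
  -e OPENAI_API_BASE_URL="https://api.apiyi.com/v1" \
  -e OPENAI_API_KEY="sk-your-apiyi-key" \
  -v open-webui:/app/backend/data \
  --name open-webui \
  ghcr.io/open-webui/open-webui:main
```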
Method 2: Interface Configuration

- Access the Open WebUI admin interface
- Go to Settings > Connections
- In the OpenAI API section, configure:
  - API Base URL: https://api.apiyi.com/v1
  - API Key: enter your APIYI key
- Click Save to apply the configuration
Configuration Key Points
- The API Base URL must include the `/v1` suffix
- The API Key can be obtained from the APIYI Console
- The environment variable method is recommended, as it makes the configuration easier to manage and update
Supported Models
Open WebUI supports the following model series through APIYI:

Recommended Models
| Model Series | Model ID | Features |
|---|---|---|
| GPT-4 Turbo | gpt-4-turbo-2024-04-09 | Latest GPT-4, balanced performance |
| Claude Sonnet | claude-sonnet-4-20250514 | Long text processing, creative writing |
| Gemini Pro | gemini-2.5-pro | Multimodal capabilities, fast response |
| GPT-3.5 Turbo | gpt-3.5-turbo | Economical and practical, daily conversation |
Featured Function Models
| Feature | Recommended Model | Description |
|---|---|---|
| Vision Understanding | gpt-4-vision-preview | Image analysis and understanding |
| Code Generation | gpt-4-turbo | Programming assistance and code optimization |
| Reasoning Thinking | claude-sonnet-4-20250514-thinking | Shows thinking process |
Core Features
RAG (Retrieval-Augmented Generation)
Open WebUI supports document upload and knowledge base features:

- Document Upload
  - Supports PDF, TXT, DOCX and other formats
  - Automatic vectorization and storage
  - Supports multilingual documents
- Knowledge Base Management
  - Create specialized knowledge bases
  - Document classification and tagging
  - Intelligent retrieval matching
OpenAI Compatible API
Open WebUI provides a complete OpenAI-compatible API:
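As a rough sketch, assuming a local deployment on port 3000 and the /api/chat/completions path plus Bearer-key authentication used by recent Open WebUI releases (API keys are generated under Settings > Account), a direct request might look like this:

```bash
# Sketch: call Open WebUI's OpenAI-compatible chat endpoint directly.
# Confirm the /api/chat/completions path for your Open WebUI version.
curl http://localhost:3000/api/chat/completions \
  -H "Authorization: Bearer YOUR_OPEN_WEBUI_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gpt-4-turbo-2024-04-09",
    "messages": [{"role": "user", "content": "Hello!"}]
  }'
```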
Tool Integration

Supports external tools and plugins:

- Web search
- Code execution
- Image generation
- Document processing
Advanced Configuration
Multi-Model Configuration
Configure multiple model sources in `docker-compose.yml`:
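A sketch, assuming the semicolon-separated OPENAI_API_BASE_URLS and OPENAI_API_KEYS variables supported by recent Open WebUI releases (entries are matched by position; verify the names for your version):

```yaml
# Sketch: point Open WebUI at more than one OpenAI-compatible backend.
services:
  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    ports:
      - "3000:8080"
    environment:
      # Semicolon-separated lists; the first key pairs with the first URL, and so on.
      - OPENAI_API_BASE_URLS=https://api.apiyi.com/v1;https://api.openai.com/v1
      - OPENAI_API_KEYS=sk-your-apiyi-key;sk-your-openai-key
    volumes:
      - open-webui:/app/backend/data

volumes:
  open-webui:
```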
User Permission Management
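One illustrative sketch, assuming the ENABLE_SIGNUP and DEFAULT_USER_ROLE environment variables supported by recent Open WebUI releases; confirm the exact names and accepted values for your version:

```yaml
# Sketch: control self-registration and the role assigned to new accounts.
services:
  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    environment:
      - ENABLE_SIGNUP=true          # allow self-registration...
      - DEFAULT_USER_ROLE=pending   # ...but hold new accounts for admin approval
```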
Data Persistence
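Mount the container's /app/backend/data directory, where current Open WebUI images keep users, chats, uploaded documents, and the vector store; a minimal sketch:

```yaml
# Sketch: persist Open WebUI data across container restarts and image upgrades.
services:
  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    volumes:
      # Named volume; a bind mount such as ./open-webui-data:/app/backend/data also works.
      - open-webui:/app/backend/data

volumes:
  open-webui:
```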
API Integration Examples
Python Integration
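One possible sketch, calling the APIYI endpoint directly with the official openai Python package (`pip install openai`), using the same base URL and key configured above; the model ID is taken from the recommended models table:

```python
# Sketch: call the APIYI OpenAI-compatible endpoint with the openai SDK.
from openai import OpenAI

client = OpenAI(
    base_url="https://api.apiyi.com/v1",  # same base URL configured in Open WebUI
    api_key="sk-your-apiyi-key",          # your APIYI key
)

response = client.chat.completions.create(
    model="gpt-4-turbo-2024-04-09",
    messages=[{"role": "user", "content": "Hello from Python!"}],
)

print(response.choices[0].message.content)
```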
JavaScript Integration
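One possible sketch using fetch (run as an ES module in Node 18+ or in a browser), posting to the same APIYI chat completions endpoint:

```javascript
// Sketch: call the APIYI OpenAI-compatible endpoint with fetch.
const response = await fetch("https://api.apiyi.com/v1/chat/completions", {
  method: "POST",
  headers: {
    "Authorization": "Bearer sk-your-apiyi-key", // your APIYI key
    "Content-Type": "application/json",
  },
  body: JSON.stringify({
    model: "gpt-4-turbo-2024-04-09",
    messages: [{ role: "user", content: "Hello from JavaScript!" }],
  }),
});

const data = await response.json();
console.log(data.choices[0].message.content);
```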
Troubleshooting
Common Issues
Connection Failed
- Check that the API Base URL is correct: https://api.apiyi.com/v1
- Verify that the API Key is valid
- Confirm firewall settings

Model Unavailable
- Check the account balance
- Confirm the model is within the service scope
- Check the APIYI service status

Document Upload Failed
- Check that the file format is supported
- Confirm there is sufficient storage space
- Verify the file size limits
Log Debugging
Enable debug mode:
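One way to do this, assuming the GLOBAL_LOG_LEVEL variable used by recent Open WebUI releases, is to restart the container with verbose logging and follow its logs:

```bash
# Sketch: restart Open WebUI with verbose logging, then follow the logs.
# Verify the GLOBAL_LOG_LEVEL variable name for your Open WebUI version.
docker run -d \
  -p 3000:8080 \
  -e GLOBAL_LOG_LEVEL=DEBUG \
  -v open-webui:/app/backend/data \
  --name open-webui \
  ghcr.io/open-webui/open-webui:main

docker logs -f open-webui
```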
Best Practices

Performance Optimization

- Model Selection
  - Use GPT-3.5 Turbo for daily conversation
  - Use GPT-4 Turbo for complex tasks
  - Use Claude Sonnet for long-text processing
- Caching Strategy
  - Enable conversation caching
  - Set a reasonable cache expiration time
  - Regularly clean unused cache
- Resource Management
  - Monitor memory usage
  - Set reasonable concurrency limits
  - Regularly back up user data