📦 Installation Guide¶
System Requirements¶
- Python 3.8 or higher
- Ollama with Mistral:7b model
- 4GB RAM (minimum)
- Internet connection (for initial model download)
- 10GB free disk space (for models and dependencies)
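Before installing, you can quickly confirm that your Python interpreter meets the minimum version (this assumes python3 is on your PATH; on Windows the command may simply be python):
# Print the Python version (should report 3.8 or higher)
python3 --version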
Installing Ollama¶
macOS/Linux¶
# Download and install Ollama
curl -fsSL https://ollama.ai/install.sh | sh
# Download the Mistral:7b model
ollama pull mistral:7b
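As an optional sanity check, you can send a one-off prompt to the downloaded model; this assumes the Ollama service is already running, which the installer normally starts for you:
# Run a single prompt against the downloaded model
ollama run mistral:7b "Say hello"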
Windows¶
- Download the installer from Ollama's website
- Run the installer
- Open a new command prompt and run:
ollama pull mistral:7b
Installing shapi¶
Using pip (Recommended)¶
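Assuming the package is published on PyPI under the name shapi, a standard pip install should work:
# Install the latest release from PyPI (package name assumed to be shapi)
pip install shapi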
From Source¶
# Clone the repository
git clone https://github.com/yourusername/shapi.git
cd shapi
# Install in development mode
pip install -e .
Development Setup¶
For contributing to shapi:
git clone https://github.com/yourusername/shapi.git
cd shapi
# Install with development dependencies
pip install -e ".[dev]"
# Install pre-commit hooks
pre-commit install
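After installing the hooks, you can run all checks once against the whole repository to confirm your development environment is set up correctly:
# Run all pre-commit hooks against every file in the repository
pre-commit run --all-files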
Verifying Installation¶
# Check Ollama is running
ollama list
# Check shapi installation
shapi --version
# Check system status
shapi status
Troubleshooting¶
Common Issues¶
- Ollama not found
  - Ensure Ollama is properly installed and in your system PATH
  - Try restarting your terminal or computer
- Model not found
  - Verify you've pulled the correct model: `ollama pull mistral:7b`
  - Check your internet connection
- Permission errors
  - On Linux/macOS, you might need to use `sudo` for Ollama commands
  - Consider adding your user to the `docker` group if using Docker
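If Ollama appears to be installed but shapi cannot reach it, a quick check is to query the local Ollama API directly; this assumes Ollama is serving on its default port 11434:
# List locally available models via the Ollama HTTP API (default port 11434)
curl http://localhost:11434/api/tags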