Quick Start

Get up and running with AI Documentation Agent in under 5 minutes!

Prerequisites

Before you begin, ensure you have:

  • ✅ Python 3.8 or higher
  • ✅ Ollama installed and running
  • ✅ Git (for cloning the repository)
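
If you're unsure whether these are already installed, a quick check from a terminal will tell you (version numbers will vary on your machine):

python --version   # should report 3.8 or higher; on some systems the command is python3
ollama --version
git --version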

Installation Steps

1. Clone the Repository

git clone https://github.com/deepak-sekarbabu/ai-doc-agent.git
cd ai-doc-agent

2. Install Dependencies

pip install -r config/requirements.txt
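
Optionally, you can install into a virtual environment first to keep the dependencies isolated. This uses standard Python tooling and is not specific to this project:

python -m venv .venv
source .venv/bin/activate            # on Windows: .venv\Scripts\activate
pip install -r config/requirements.txt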

3. Configure Environment

cp config/.env.example .env

The default configuration works for most users. Edit .env if you need custom settings.
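
For reference, a minimal .env might look like the sketch below. API_TIMEOUT is the only key confirmed elsewhere in this guide (see Quick Troubleshooting); the other names are illustrative assumptions, so check config/.env.example for the actual keys.

# Illustrative sketch only -- confirm key names against config/.env.example
API_TIMEOUT=600                        # request timeout in seconds (see Quick Troubleshooting)
# OLLAMA_HOST=http://localhost:11434   # assumed key name for the Ollama endpoint
# DEFAULT_MODEL=llama2:7b              # assumed key name for the default model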

4. Start Ollama

ollama serve
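
Ollama listens on http://localhost:11434 by default. If you want to confirm the server is up before continuing:

curl http://localhost:11434/api/tags   # returns JSON listing installed models when the server is running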

5. Pull an LLM Model

# Recommended for getting started (fast, good quality)
ollama pull llama2:7b

# Or choose an alternative:
# ollama pull mistral        # Better quality
# ollama pull codellama      # Best for code
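
Once the pull finishes, you can confirm which models are available locally:

ollama list   # shows downloaded models and their sizes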

Your First Documentation

Generate docs for the sample project:

python run.py --directory ./examples --output my_first_docs

This will:

  1. ✅ Analyze the example project
  2. ✅ Generate comprehensive documentation
  3. ✅ Save to output/my_first_docs.md

Check the output:

# View the generated documentation
cat output/my_first_docs.md

# Or open in your editor
code output/my_first_docs.md

Generate Docs for Your Project

Now try it on your own project:

python run.py --directory /path/to/your/project

Quick Examples

# Analyze current directory
python run.py

# Generate HTML documentation
python run.py --directory ./my-app --format html

# Maximum quality with more iterations
python run.py --directory ./my-app --iterations 5 --max-files 100

# Specify project type for better results
python run.py --directory ./api --project-type backend

Common Commands

# Quick documentation (fast)
python run.py --max-files 15 --iterations 2

# Standard documentation
python run.py --directory ~/my-project

# High-quality documentation
python run.py --directory ~/my-project --iterations 5 --model codellama

# Generate HTML or PDF
python run.py --format html
python run.py --format pdf

# Verbose output for debugging
python run.py --verbose

Understanding the Output

The generated documentation includes:

  1. Project Overview - High-level description and purpose
  2. Architecture - System design and component structure
  3. Key Components - Detailed module documentation
  4. Development Setup - Installation and configuration
  5. Deployment Guide - Build and hosting instructions
  6. File Documentation - Functions, classes, and methods
  7. Best Practices - Standards and recommendations

Quick Troubleshooting

Cannot connect to Ollama

Solution: Make sure Ollama is running:

ollama serve

No files found

Solution: Check the directory path:

python run.py --directory /absolute/path/to/project --verbose

API Timeout

Solution: Reduce the number of files or increase the timeout:

python run.py --max-files 20
# Or edit .env: API_TIMEOUT=600

Next Steps

✅ You're all set! The sections below cover tips for best results and a full command reference.

Tips for Best Results

Choose the Right Model

  • Small projects → llama2:7b (fast)
  • Medium projects → mistral (balanced)
  • Large/complex projects → codellama (best quality)

Adjust File Count

  • Start with --max-files 20 for testing
  • Increase to 50-100 for comprehensive docs
  • Too many files may cause timeouts

Iterative Refinement

  • 2 iterations → Quick documentation
  • 3 iterations → Standard quality (default)
  • 5 iterations → Maximum quality
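
Putting these tips together, a single run on a mid-sized project might look like this (the path and values are just examples; all flags come from the Command Reference below):

python run.py --directory ./my-app --model mistral --max-files 50 --iterations 3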

Command Reference

| Option                | Description          | Example                  |
|-----------------------|----------------------|--------------------------|
| --directory DIR       | Project to analyze   | --directory ~/my-app     |
| --format FORMAT       | Output format        | --format html            |
| --output FILE         | Output filename      | --output my_docs         |
| --max-files N         | Max files to analyze | --max-files 50           |
| --iterations N        | Refinement cycles    | --iterations 5           |
| --model MODEL         | LLM model            | --model codellama        |
| --project-type TYPE   | Project type         | --project-type backend   |
| --verbose             | Verbose logging      | --verbose                |

Support

Need help?