feat: add local LLM support via Ollama#29

Open
skull463 wants to merge 3 commits into mskayyali:main from skull463:feat-local-llm
Conversation

@skull463

Added a new **Local** provider to support offline models via Ollama:

- Bypasses the API key requirement and defaults to `http://localhost:11434/v1`.
- Updates the Next.js Content Security Policy (`connect-src`) to allow localhost connections.
- Adds a preset list of popular local models (Llama 3, Mistral, Qwen, etc.).

Tested locally and working.
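Since the diff itself isn't shown in this conversation, here is a minimal sketch of what such a provider entry and CSP tweak might look like; the `Provider` shape, field names, and model IDs are assumptions for illustration, not the PR's actual code. Only the default endpoint (`http://localhost:11434/v1`) and the no-API-key behavior come from the description above.

```typescript
// Hypothetical provider shape; the real project's types may differ.
interface Provider {
  id: string;
  name: string;
  baseURL: string;
  requiresApiKey: boolean;
  models: string[];
}

// A "Local" provider pointing at Ollama's OpenAI-compatible endpoint,
// as described in the PR: no API key needed, localhost default URL.
const localProvider: Provider = {
  id: "local",
  name: "Local (Ollama)",
  baseURL: "http://localhost:11434/v1",
  requiresApiKey: false,
  // Preset list of popular local models (illustrative tags).
  models: ["llama3", "mistral", "qwen2.5"],
};

// The CSP change would extend connect-src so the browser may reach
// the local Ollama server (directive value shown as a plain string).
const connectSrc = "connect-src 'self' http://localhost:11434";

console.log(localProvider.baseURL, "|", connectSrc);
```

In a Next.js app the `connect-src` string above would typically be folded into the `Content-Security-Policy` header set via `headers()` in `next.config.js`; the exact wiring depends on how this repo builds its CSP.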

@skull463 skull463 closed this Apr 14, 2026
@skull463 skull463 reopened this Apr 14, 2026