
Add Ollama, LM Studio providers and Tauri desktop app#16

Open
kpulik wants to merge 22 commits into mskayyali:main from kpulik:feat/local-providers-tauri

Conversation

kpulik commented Apr 8, 2026

Summary

  • Local AI providers: adds Ollama and LM Studio as providers. No API key is required; models are auto-detected from the running server and shown in a dropdown.
  • Tauri desktop app: adds a Tauri v2 scaffold so nodepad can run as a native desktop app (npm run tauri:dev / npm run tauri:build), with no CORS restrictions for local providers.
  • CORS proxy routes: two new API routes (/api/local-models, /api/local-chat) proxy requests to local servers so everything works in the browser without extra configuration.
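To make the proxy idea concrete, here is a minimal sketch of the URL mapping a /api/local-models proxy route might use. Ollama lists models at GET /api/tags, while LM Studio exposes the OpenAI-compatible GET /v1/models; the default ports match the allow-list discussed later in this PR (11434 for Ollama, 1234 for LM Studio). The helper name is illustrative, not the PR's actual code.

```typescript
// Hypothetical helper a /api/local-models proxy route could use to build
// the upstream URL. Defaults and endpoints: Ollama on 11434 (GET /api/tags),
// LM Studio on 1234 (OpenAI-compatible GET /v1/models).
type LocalProvider = "ollama" | "lmstudio"

const DEFAULT_BASE: Record<LocalProvider, string> = {
  ollama: "http://localhost:11434",
  lmstudio: "http://localhost:1234",
}

export function buildModelsUrl(provider: LocalProvider, baseUrl?: string): string {
  // Fall back to the provider preset when no custom base URL is set,
  // and strip trailing slashes so path joining stays clean.
  const base = (baseUrl?.trim() || DEFAULT_BASE[provider]).replace(/\/+$/, "")
  return provider === "ollama" ? `${base}/api/tags` : `${base}/v1/models`
}
```

The actual route handler would fetch this URL server-side and relay the JSON back to the browser, sidestepping CORS.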

Changed files

| File | What |
| --- | --- |
| lib/ai-settings.ts | New ollama/lmstudio provider types, dynamic model fetching via fetchLocalModels(), isLocalProvider() helper, auth headers skipped for local providers |
| components/project-sidebar.tsx | Auto-fetched model dropdown for local providers, loading/empty states, refresh button, server URL field, API key field hidden for local providers |
| lib/ai-enrich.ts | Routes local provider chat completions through the /api/local-chat proxy |
| lib/ai-ghost.ts | Same proxy routing for ghost/synthesis generation |
| app/api/local-models/route.ts | New: proxies model list requests to Ollama/LM Studio (CORS bypass) |
| app/api/local-chat/route.ts | New: proxies chat completion requests to local servers (CORS bypass) |
| next.config.mjs | CSP updated for localhost; conditional output: "export" for Tauri builds |
| src-tauri/ | New: Tauri v2 project scaffold (Rust backend, config, icons) |
| package.json | Added @tauri-apps/cli, @tauri-apps/api, and tauri:dev/tauri:build scripts |
| README.md | Setup instructions for browser, desktop app, and local providers |

How to use

Browser (unchanged): npm run dev → localhost:3000

Desktop app: npm run tauri:dev (dev) or npm run tauri:build (distributable)

Local providers: Settings → select Ollama or LM Studio → models appear automatically

Test plan

  • Verify existing cloud providers (OpenRouter, OpenAI, Z.ai) still work unchanged
  • Select the Ollama provider with Ollama running: models should auto-populate in the dropdown
  • Select the LM Studio provider with the LM Studio server running: models should auto-populate
  • Add a note with a local provider selected: enrichment should complete
  • Verify ghost/synthesis works with local providers
  • npm run tauri:dev opens a native window with the app working
  • npm run tauri:build produces a distributable binary

- Add Ollama and LM Studio as local AI providers (no API key needed)
- Auto-detect installed models via dropdown (fetched from local server)
- Proxy local requests through Next.js API routes to bypass CORS
- Add Tauri v2 scaffold for native desktop builds (.dmg/.msi/.AppImage)
- Conditional static export for Tauri builds (browser mode unchanged)
- Update README with setup instructions for browser, desktop, and local providers

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
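The "conditional static export for Tauri builds" from the commit list might look roughly like this in next.config.mjs. The environment variable name (TAURI_BUILD) is an assumption for illustration; the PR may key off a different flag.

```javascript
// next.config.mjs — sketch only. The env var name (TAURI_BUILD) is a guess,
// not necessarily what this PR uses.
const isTauriBuild = process.env.TAURI_BUILD === "true"

/** @type {import('next').NextConfig} */
const nextConfig = {
  // Tauri bundles a static site, so switch to `output: "export"` only for
  // Tauri builds; normal browser builds keep API routes (needed for the
  // /api/local-models and /api/local-chat CORS proxies).
  ...(isTauriBuild ? { output: "export" } : {}),
}

export default nextConfig
```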
Copilot AI review requested due to automatic review settings April 8, 2026 21:09

Copilot AI left a comment


Pull request overview

Adds first-class support for local AI providers (Ollama, LM Studio) and introduces a Tauri v2 desktop scaffold so nodepad can run as a native app, alongside new proxy API routes intended to work around browser CORS limitations.

Changes:

  • Added ollama / lmstudio provider types with dynamic local model discovery and local-provider header behavior.
  • Updated settings UI to support local providers (server URL, refreshable model dropdown, hide API key).
  • Added Tauri v2 scaffold + conditional Next static export + CSP updates, plus new /api/local-models and /api/local-chat proxy routes.

Reviewed changes

Copilot reviewed 16 out of 34 changed files in this pull request and generated 9 comments.

Show a summary per file
| File | Description |
| --- | --- |
| src-tauri/tauri.conf.json | Adds Tauri app configuration (build commands, CSP, bundle icons). |
| src-tauri/src/main.rs | Tauri entrypoint delegating to app_lib::run(). |
| src-tauri/src/lib.rs | Tauri builder setup (log plugin in debug) and app run. |
| src-tauri/icons/StoreLogo.png | Adds Tauri icon asset. |
| src-tauri/icons/Square89x89Logo.png | Adds Tauri icon asset. |
| src-tauri/icons/Square71x71Logo.png | Adds Tauri icon asset. |
| src-tauri/icons/Square44x44Logo.png | Adds Tauri icon asset. |
| src-tauri/icons/Square30x30Logo.png | Adds Tauri icon asset. |
| src-tauri/icons/Square150x150Logo.png | Adds Tauri icon asset. |
| src-tauri/icons/Square142x142Logo.png | Adds Tauri icon asset. |
| src-tauri/icons/Square107x107Logo.png | Adds Tauri icon asset. |
| src-tauri/icons/32x32.png | Adds Tauri icon asset. |
| src-tauri/icons/128x128@2x.png | Adds Tauri icon asset. |
| src-tauri/icons/128x128.png | Adds Tauri icon asset. |
| src-tauri/Cargo.toml | Adds Rust crate config and Tauri dependencies. |
| src-tauri/capabilities/default.json | Adds default Tauri capabilities configuration. |
| src-tauri/build.rs | Adds Tauri build script. |
| README.md | Documents browser vs desktop setup and local provider usage. |
| package.json | Adds Tauri scripts and dependencies. |
| package-lock.json | Locks new Tauri dependencies. |
| next.config.mjs | Adds conditional output: "export" for Tauri builds plus CSP updates. |
| lib/ai-settings.ts | Adds provider types, local model fetcher, local-provider helpers, header/baseUrl logic. |
| lib/ai-ghost.ts | Routes local-provider ghost completions via /api/local-chat. |
| lib/ai-enrich.ts | Routes local-provider enrichment completions via /api/local-chat. |
| components/project-sidebar.tsx | Adds local provider UI (server URL, dynamic models, refresh, API key hiding). |
| app/opengraph-image.tsx | Forces static generation (helpful for export builds). |
| app/api/local-models/route.ts | Adds proxy route to fetch model lists from local servers. |
| app/api/local-chat/route.ts | Adds proxy route to send chat completions to local servers. |
| .gitignore | Ignores out/ and src-tauri/target/. |


Comment thread app/api/local-chat/route.ts Outdated
Comment thread app/api/local-models/route.ts Outdated
Comment thread lib/ai-settings.ts Outdated
Comment thread lib/ai-enrich.ts Outdated
Comment thread lib/ai-ghost.ts Outdated
Comment thread components/project-sidebar.tsx
Comment thread app/api/local-chat/route.ts Outdated
Comment thread app/api/local-models/route.ts Outdated
Comment thread next.config.mjs
- Fix IPv6 loopback: [::1] → ::1 in ALLOWED_HOSTS (URL parser strips brackets)
- Detect Tauri runtime via __TAURI_INTERNALS__; bypass proxy routes and
  call local servers directly (static export has no API routes)
- Remove sentinel "local" API key; use empty string + isLocalProvider()
- Add SSRF guards: proxy routes check request Host is localhost, restrict
  target ports to 11434/1234 only
- Scope CSP localhost entries to dev only; production uses /api/* proxy
  (covered by 'self')

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
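The SSRF guard and Tauri runtime detection described in this commit can be sketched as two small helpers. Function names are illustrative; the host set includes both IPv6 loopback spellings defensively, since the commit notes the URL parser's bracket handling was the source of the original bug.

```typescript
// Sketch of the port/host allow-list from the commit message: only the
// default Ollama (11434) and LM Studio (1234) ports on loopback pass.
// Names are illustrative, not the PR's actual code.
const ALLOWED_HOSTS = new Set(["localhost", "127.0.0.1", "::1", "[::1]"])
const ALLOWED_PORTS = new Set(["11434", "1234"])

export function isAllowedLocalTarget(target: string): boolean {
  let url: URL
  try {
    url = new URL(target)
  } catch {
    // Unparseable input is rejected outright.
    return false
  }
  return ALLOWED_HOSTS.has(url.hostname) && ALLOWED_PORTS.has(url.port)
}

export function isTauriRuntime(): boolean {
  // Tauri v2 injects __TAURI_INTERNALS__ into the webview's global scope;
  // when present, the app can call local servers directly instead of going
  // through the (absent, in a static export) proxy routes.
  return typeof (globalThis as any).__TAURI_INTERNALS__ !== "undefined"
}
```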

Copilot AI left a comment


Pull request overview

Copilot reviewed 16 out of 34 changed files in this pull request and generated 8 comments.



Comment thread app/api/local-chat/route.ts
Comment thread app/api/local-chat/route.ts Outdated
Comment thread app/api/local-models/route.ts
Comment thread app/api/local-models/route.ts Outdated
Comment thread components/project-sidebar.tsx Outdated
Comment thread components/project-sidebar.tsx
Comment thread lib/ai-settings.ts
Comment thread src-tauri/tauri.conf.json
- Replace Host-header SSRF guard with explicit NODE_ENV === "production"
  check (Host is client-controlled and spoofable)
- Remove isLocalHost entirely — production guard is sufficient
- Fix stale loading state: clear localModelsLoading in .finally() and
  when switching away from local providers
- Trim customBaseUrl in getBaseUrl() and on save so whitespace-only
  values fall back to preset
- Remove unsafe-eval from Tauri production CSP (only needed in dev)

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
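The trim-then-fallback behavior from this commit is simple enough to sketch as a pure helper (name illustrative): a whitespace-only custom base URL should behave exactly like an unset one.

```typescript
// Sketch of the customBaseUrl handling described above: trim the custom
// value, and fall back to the provider preset when it is empty or
// whitespace-only. The function name is illustrative.
export function resolveBaseUrl(
  customBaseUrl: string | undefined,
  presetBaseUrl: string,
): string {
  const trimmed = customBaseUrl?.trim()
  // "" and "   " both collapse to the preset; a real value survives trimmed.
  return trimmed ? trimmed : presetBaseUrl
}
```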

Copilot AI left a comment


Pull request overview

Copilot reviewed 16 out of 34 changed files in this pull request and generated 6 comments.



Comment thread components/project-sidebar.tsx Outdated
Comment thread components/project-sidebar.tsx
Comment thread app/api/local-chat/route.ts
Comment thread app/api/local-models/route.ts
Comment thread next.config.mjs Outdated
Comment thread README.md
- Debounce customBaseUrl effect (500ms) to avoid per-keystroke fetches
- Clear modelId when switching to local provider (populated by fetch)
- Clarify in README and CSP comments that local providers work in dev
  and Tauri desktop, not hosted production deployments

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
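The 500 ms debounce of the customBaseUrl effect boils down to a standard trailing-edge debounce; the real PR wires this into a React effect with cleanup, but the core mechanism can be sketched as a plain function:

```typescript
// Generic trailing-edge debounce: rapid repeated calls collapse into one
// invocation after `ms` of quiet. A sketch of the mechanism, not the PR's
// actual React effect code.
export function debounce<A extends unknown[]>(
  fn: (...args: A) => void,
  ms: number,
): (...args: A) => void {
  let timer: ReturnType<typeof setTimeout> | undefined
  return (...args: A) => {
    // Each call cancels the pending one, so only the last call fires.
    if (timer !== undefined) clearTimeout(timer)
    timer = setTimeout(() => fn(...args), ms)
  }
}
```

In the settings UI, wrapping the model-fetch trigger this way means typing a server URL fires one request per pause, not one per keystroke.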

Copilot AI left a comment


Pull request overview

Copilot reviewed 16 out of 34 changed files in this pull request and generated 1 comment.

Comments suppressed due to low confidence (1)

lib/ai-settings.ts:307

  • loadAIConfig() now allows local providers without an API key, but it can still return a config with an empty modelId (e.g., if the user saves immediately after switching to a local provider before model fetch completes, or leaves it blank). Downstream calls send model: "", which many OpenAI-compatible servers reject. Consider returning null (or a more explicit error) when modelId is falsy, or enforcing a non-empty modelId at save time for local providers.
export function loadAIConfig(): AIConfig | null {
  const s = loadSettings()
  // Local providers don't need an API key
  if (!s.apiKey && !isLocalProvider(s.provider)) return null
  const models = getModelsForProvider(s.provider)
  const model = models.find(m => m.id === s.modelId)
  const modelId = model?.id ?? models[0]?.id ?? s.modelId ?? DEFAULT_MODEL_ID
  const supportsGrounding =
    (s.provider === "openrouter" || s.provider === "openai") &&
    s.webGrounding &&
    (model?.supportsGrounding ?? false)
  return { apiKey: s.apiKey, modelId, supportsGrounding, provider: s.provider, customBaseUrl: s.customBaseUrl }
}

Comment thread components/project-sidebar.tsx Outdated

Copilot AI left a comment


Pull request overview

Copilot reviewed 17 out of 36 changed files in this pull request and generated 5 comments.

Comments suppressed due to low confidence (1)

lib/ai-settings.ts:307

  • For local providers, getModelsForProvider() returns an empty array. If the user saves settings with an empty/unknown modelId (e.g. just switched to Ollama/LM Studio and fetch hasn’t populated yet), loadAIConfig() falls back to DEFAULT_MODEL_ID (openai/gpt-4o). That value will then be sent to the local server and will reliably fail. Consider special-casing local providers so modelId never falls back to a cloud default (e.g. return null / throw a clear error when s.modelId is empty, or keep modelId as-is for locals and let the UI enforce selection).
export function loadAIConfig(): AIConfig | null {
  const s = loadSettings()
  // Local providers don't need an API key
  if (!s.apiKey && !isLocalProvider(s.provider)) return null
  const models = getModelsForProvider(s.provider)
  const model = models.find(m => m.id === s.modelId)
  const modelId = model?.id ?? models[0]?.id ?? s.modelId ?? DEFAULT_MODEL_ID
  const supportsGrounding =
    (s.provider === "openrouter" || s.provider === "openai") &&
    s.webGrounding &&
    (model?.supportsGrounding ?? false)
  return { apiKey: s.apiKey, modelId, supportsGrounding, provider: s.provider, customBaseUrl: s.customBaseUrl }
}


Comment thread components/project-sidebar.tsx Outdated
Comment thread components/project-sidebar.tsx Outdated
Comment thread components/project-sidebar.tsx
Comment thread app/api/local-models/route.ts Outdated
Comment thread src-tauri/Cargo.toml

Copilot AI left a comment


Pull request overview

Copilot reviewed 17 out of 36 changed files in this pull request and generated 5 comments.



Comment thread components/project-sidebar.tsx
Comment thread app/api/local-chat/route.ts Outdated
Comment thread app/api/local-models/route.ts Outdated
Comment thread lib/ai-settings.ts Outdated
Comment thread app/api/local-models/route.ts Outdated
kpulik requested a review from Copilot April 8, 2026 22:31

Copilot AI left a comment


Pull request overview

Copilot reviewed 18 out of 37 changed files in this pull request and generated 4 comments.



Comment thread src-tauri/tauri.conf.json
Comment thread lib/ai-settings.ts
Comment thread app/api/local-chat/route.ts
Comment thread app/api/local-models/route.ts

Copilot AI left a comment


Pull request overview

Copilot reviewed 18 out of 37 changed files in this pull request and generated 3 comments.



Comment thread app/api/local-models/route.ts
Comment thread lib/ai-settings.ts
Comment thread lib/ai-settings.ts

Copilot AI left a comment


Pull request overview

Copilot reviewed 18 out of 37 changed files in this pull request and generated 2 comments.



Comment thread lib/ai-settings.ts
Comment thread components/project-sidebar.tsx

Copilot AI left a comment


Pull request overview

Copilot reviewed 18 out of 37 changed files in this pull request and generated 1 comment.



Comment thread components/project-sidebar.tsx Outdated

Copilot AI left a comment


Pull request overview

Copilot reviewed 18 out of 37 changed files in this pull request and generated 1 comment.



Comment thread components/project-sidebar.tsx

Copilot AI left a comment


Pull request overview

Copilot reviewed 18 out of 37 changed files in this pull request and generated 3 comments.



Comment thread components/project-sidebar.tsx
Comment thread lib/ai-settings.ts
Comment thread lib/ai-settings.ts

Copilot AI left a comment


Pull request overview

Copilot reviewed 18 out of 37 changed files in this pull request and generated no new comments.




kpulik commented Apr 9, 2026

Ready for final review/merge.

  • Addressed all Copilot review comments.
  • Copilot now reports no new comments.
  • Branch is up to date with base and merges cleanly.
  • Local checks completed: production build and production start succeed; browser endpoint returns HTTP 200.
  • Note: lint script currently requires eslint to be added/configured (non-blocking for this PR).

Dev-020 added a commit to Dev-020/nodepad_Dev that referenced this pull request May 7, 2026
- Add getSafeContentTypeConfig helper to lib/content-types.ts
- Update TileCard, GraphArea, and other UI components with robust fallbacks
- Add proactive contentType validation to AI enrichment parser
- Document crash fix in BACKLOG.md as issue mskayyali#16
