@taylorpatterson-T1D commented on Feb 11, 2026

This PR replaces the legacy FoodFinder PR #2329

Summary

FoodFinder adds AI-powered food identification to Loop's carb entry screen, helping people with diabetes quickly and accurately log meals. It integrates directly into the existing Add Carb Entry flow with zero changes to Loop's core dosing logic.

The Problem We're Solving:

Carb counting is the single hardest daily task for people managing diabetes with Loop. Every meal requires estimating carbohydrate content — and getting it wrong directly impacts Time in Range. Current workflow: the user mentally estimates carbs, types a number, and hopes for the best. There's no assistance, no database lookup, no learning from past meals.

What FoodFinder Does

Within the existing Add Carb Entry screen, FoodFinder provides four ways to identify food and auto-populate carb values:

Search modes:

  • Text search — OpenFoodFacts database lookup with barcode scanning
  • AI image analysis — Take a photo of your plate; AI identifies foods, portions, and macros
  • Voice/dictation — Describe your meal naturally; AI generates nutritional analysis
  • Menu/recipe OCR — Photograph a menu or recipe in multiple languages; on-device OCR extracts text for AI analysis

Key features:

  • BYO API key model — users configure their own AI provider (OpenAI, Anthropic, Google, or any compatible endpoint)
  • Secure API key storage via iOS Keychain
  • Feature toggle (FoodFinder_FeatureFlags.isEnabled) — completely dormant when disabled
  • Analysis history with configurable retention (24 hours, 7, 14, or 30 days)
  • Favorite food integration with thumbnail persistence
  • AI-recommended absorption times, from a conservative model anchored to Loop's 3-hour default
  • Advanced dosing notes (FPU, fiber, GI, insulin timing) behind a separate toggle
  • LoopInsights data hooks for future AI-driven tuning recommendations
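
To make the toggle model concrete, here is a minimal sketch of how the integration points can be gated. FoodFinder_FeatureFlags.isEnabled is the flag named in this PR; the backing UserDefaults key and the generic wrapper view are illustrative assumptions, not the PR's actual code.

```swift
import SwiftUI

// Sketch only: `isEnabled` is the flag named in this PR; the backing
// UserDefaults key and the wrapper view below are illustrative.
enum FoodFinder_FeatureFlags {
    /// Master on/off switch surfaced in Loop Settings → FoodFinder.
    static var isEnabled: Bool {
        UserDefaults.standard.bool(forKey: "FoodFinder.isEnabled")
    }
}

/// Wraps any FoodFinder entry view (e.g. the PR's FoodFinder_EntryPoint) so that
/// nothing is rendered, and the carb entry screen behaves exactly like stock Loop,
/// when the flag is off.
struct FeatureGated<Content: View>: View {
    @ViewBuilder var content: () -> Content

    var body: some View {
        if FoodFinder_FeatureFlags.isEnabled {
            content()
        }
    }
}
```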

User-Configurable Settings

All settings are in Loop Settings → FoodFinder:

| Setting | Description |
| --- | --- |
| FoodFinder Toggle | Master on/off switch for the entire feature |
| AI Provider | BYO API key — any OpenAI-compatible endpoint with image support, the Anthropic Messages API, or Google Generative AI |
| API Key | Stored in the iOS Keychain (encrypted at rest, excluded from backups) |
| Model Selection | User picks a preferred model (e.g., gpt-4o, claude-sonnet-4-5-20250929, gemini-2.0-flash); the model must support image input |
| Analysis History Retention | Last 24 hours, 7 days, 14 days, or 30 days |
| Advanced Dosing Recommendations | Optional AI-generated dosing context (disabled by default) |
| Advanced API Settings | Custom endpoint paths for self-hosted or Azure deployments |
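
For context, a minimal sketch of Keychain-backed API key storage with the Security framework; the service string, type name, and error handling are illustrative assumptions, not the PR's actual constants.

```swift
import Foundation
import Security

/// Illustrative Keychain wrapper; the PR's real storage type is not shown here.
enum FoodFinderAPIKeyStore {
    private static let service = "org.loopkit.Loop.FoodFinder"  // hypothetical service name

    private static func baseQuery(account: String) -> [String: Any] {
        [kSecClass as String: kSecClassGenericPassword,
         kSecAttrService as String: service,
         kSecAttrAccount as String: account]
    }

    static func save(_ apiKey: String, account: String) throws {
        // Replace any existing key for this provider before adding the new one.
        SecItemDelete(baseQuery(account: account) as CFDictionary)

        var attributes = baseQuery(account: account)
        attributes[kSecValueData as String] = Data(apiKey.utf8)
        // Encrypted at rest, readable only after first unlock, never migrated to another device.
        attributes[kSecAttrAccessible as String] = kSecAttrAccessibleAfterFirstUnlockThisDeviceOnly

        let status = SecItemAdd(attributes as CFDictionary, nil)
        guard status == errSecSuccess else {
            throw NSError(domain: NSOSStatusErrorDomain, code: Int(status), userInfo: nil)
        }
    }

    static func load(account: String) -> String? {
        var query = baseQuery(account: account)
        query[kSecReturnData as String] = true
        query[kSecMatchLimit as String] = kSecMatchLimitOne

        var item: CFTypeRef?
        guard SecItemCopyMatching(query as CFDictionary, &item) == errSecSuccess,
              let data = item as? Data else { return nil }
        return String(data: data, encoding: .utf8)
    }
}
```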

Safety Considerations

  • FoodFinder never boluses automatically. It only pre-populates the carb entry field — the user always reviews, edits, and confirms before any insulin action occurs.
  • Confidence scoring is displayed for every AI analysis (20–97% range). Low-confidence results are visually flagged.
  • The user can edit all values. AI-suggested carbs, absorption time, and serving count are starting points, not final answers.
  • API keys are stored in iOS Keychain — encrypted at rest, never in UserDefaults or plain text.
  • Feature toggle allows complete disable without removing code.
  • No data leaves the device except the API call to the user's own configured AI provider. No analytics, no telemetry, no third-party tracking.
  • Absorption time model — conservative, sensible adjustments based on all known nutriments; most meals stay at 3–3.5 hours, with a hard cap of 5 hours (a rough sketch of the adjustment logic follows this list)
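
The sketch below makes the absorption-time numbers concrete. The 3-hour anchor, the per-factor increments, and the 5-hour cap are the values described in this PR; the tier thresholds for FPU, fiber, and meal size are illustrative guesses only.

```swift
import Foundation

/// Illustrative adjustment logic: only the 3 h anchor, the increments, and the
/// 5 h cap come from this PR; the tier thresholds below are guesses.
struct AbsorptionEstimate {
    static let defaultHours = 3.0   // Loop's default absorption time
    static let maxHours = 5.0       // conservative cap

    static func hours(fatProteinUnits fpu: Double, fiberGrams: Double, carbGrams: Double) -> Double {
        var hours = defaultHours

        // Fat/protein units: +0 / +0.5 / +1.0 hr
        switch fpu {
        case ..<1: hours += 0
        case ..<3: hours += 0.5
        default:   hours += 1.0
        }

        // Fiber: +0 / +0.25 / +0.5 hr
        switch fiberGrams {
        case ..<5:  hours += 0
        case ..<10: hours += 0.25
        default:    hours += 0.5
        }

        // Meal size (total carbs): +0 / +0.25 / +0.5 hr
        switch carbGrams {
        case ..<60:  hours += 0
        case ..<100: hours += 0.25
        default:     hours += 0.5
        }

        return min(hours, maxHours)
    }
}

// Example: a typical mixed meal stays in the 3–3.5 hour band described above.
// AbsorptionEstimate.hours(fatProteinUnits: 1.5, fiberGrams: 4, carbGrams: 55) == 3.5
```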

Architecture and impact on existing Loop code

FoodFinder was designed for minimal integration footprint and easy containment within Loop:

  • 34 new files in FoodFinder/ subdirectories, all prefixed FoodFinder_
  • ~29 lines of core integration across 3 existing files (CarbEntryView, SettingsView, FavoriteFoodDetailView), plus small supporting edits to the view models and favorite-food views listed below
  • No LoopKit modifications — everything lives in the Loop target
  • All FoodFinder code compiles out cleanly when the feature flag is disabled

Modified existing files:

| File | What changed |
| --- | --- |
| CarbEntryView.swift | Embeds FoodFinder_EntryPoint (~5 lines) plus the analysis history picker |
| SettingsView.swift | Adds the FoodFinder settings row and navigation link |
| CarbEntryViewModel.swift | Adds analysis history state and restored-result bindings |
| FavoriteFoodDetailView.swift | Adds an "Analyze with AI" button for saved favorites |
| FavoriteFoodsView.swift | Adds FoodFinder thumbnail support on favorite food rows |
| AddEditFavoriteFoodView.swift | Accepts a pre-populated name/image from FoodFinder |
| AddEditFavoriteFoodViewModel.swift | Passes through the thumbnail image binding |

New file locations (all under Loop/):

| Directory | Files | Purpose |
| --- | --- | --- |
| Models/FoodFinder/ | 3 | Data models, analysis records, input result types |
| View Models/FoodFinder/ | 1 | Search state, AI analysis, product selection logic |
| Views/FoodFinder/ | 8 | All FoodFinder UI (entry point, camera, scanner, settings, search, etc.) |
| Services/FoodFinder/ | 12 | AI analysis, API clients, barcode scanning, voice, image storage, routing |
| Resources/FoodFinder/ | 1 | Feature flags and configuration |
| Documentation/FoodFinder/ | 1 | README |
| LoopTests/FoodFinder/ | 3 | Unit tests for barcode, OpenFoodFacts, and voice search |

Screenshots

  1. FoodFinder search bar with text search results
IMG_1590
  2. Barcode scanner camera capture
IMG_1591
  3. AI camera view analyzing a plate of food
IMG_1592
  4. Product info card with nutrition circles and confidence badge
IMG_1593
  5. Menu/recipe analysis showing translated items
IMG_1594
  6. FoodFinder settings page with AI provider configuration
IMG_1595 IMG_1596
  7. Analysis history picker showing recent meals
IMG_1597
  8. Favorite food with FoodFinder thumbnail
IMG_1598

Video Demo

YouTube Demo: https://youtu.be/i8xToAYBe4M

Test plan

  • Enable FoodFinder in Settings → verify toggle works
  • Text search for single-word common foods → verify carbs populate correctly
  • Scan a barcode → verify product lookup and nutrition
  • Take a photo of a meal → verify AI identifies foods with reasonable carb estimates
  • Dictate a meal description → use keyboard dictation in the search field to describe a meal; verify the description is routed to AI analysis
  • Photograph a menu or recipe → verify on-device OCR extracts the text and the AI generates a nutrition analysis
  • Adjust servings on AI result → verify carbs and nutriments scale proportionally
  • Tap "Why X hrs?" pill → verify reasoning expands
  • Save a food as favorite → verify thumbnail and nutrition persist
  • Select from recent analyses → verify history picker works
  • Disable FoodFinder → verify carb entry screen returns to normal

Requesting review from @marionbarker, based on availability.

FoodFinder adds barcode scanning, AI camera analysis, voice search, and
text-based food lookup to Loop's carb entry workflow. All feature code
lives in dedicated FoodFinder/ subdirectories with FoodFinder_ prefixed
filenames for clean isolation and portability to other Loop forks.

Integration touchpoints: ~29 lines across 3 existing files
(CarbEntryView, SettingsView, FavoriteFoodDetailView). Feature is
controlled by a single toggle in FoodFinder_FeatureFlags.swift.

New files: 34 (11 views, 3 models, 13 services, 2 view models,
1 feature flags, 1 documentation, 3 tests)

Voice search (microphone button) now uses the AI analysis pipeline
instead of USDA text search, enabling natural language food descriptions
like "a medium bowl of spicy ramen and a side of gyoza". Text-typed
searches continue using USDA/OpenFoodFacts as before.

Changes:
- SearchBar: Add mic button with voice search callback
- SearchRouter: Add analyzeFoodByDescription() routing through AI providers
- SearchViewModel: Add performVoiceSearch() async method
- EntryPoint: Wire VoiceSearchView sheet to AI analysis pipeline

Replace the separate mic button with automatic natural language detection.
When the user dictates into the search field via iOS keyboard dictation,
the text is analyzed: short queries (1-3 words like "apple") use USDA,
while longer descriptive phrases (4+ words like "a medium bowl of spicy
ramen and a side of gyoza") automatically route to the AI analysis path.
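
A minimal sketch of such a word-count cutoff (the PR's actual isNaturalLanguageQuery() may weigh more signals than this):

```swift
import Foundation

/// Rough sketch of the routing rule described above: 1–3 word queries stay on
/// database text search, 4+ word descriptions go to the AI analysis path.
func isNaturalLanguageQuery(_ text: String) -> Bool {
    let words = text.split(whereSeparator: { $0.isWhitespace })
    return words.count >= 4
}

// isNaturalLanguageQuery("apple")                                            // false → USDA/OpenFoodFacts
// isNaturalLanguageQuery("a medium bowl of spicy ramen and a side of gyoza") // true  → AI analysis
```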

Changes:
- SearchBar: Remove mic button and onVoiceSearchTapped parameter
- SearchViewModel: Add isNaturalLanguageQuery() heuristic, route detected
  natural language through performVoiceSearch in performFoodSearch
- EntryPoint: Remove voice search sheet, wire onGenerativeSearchResult
  callback to handleAIFoodAnalysis

The Python script created group definitions but didn't properly attach
all of them to their parent groups. Fixes:
- Services group → now child of Loop app root (was orphaned)
- Resources group → now child of Loop app root (was orphaned)
- Documentation group → now child of project root (was orphaned)
- ViewModels/FoodFinder → moved from Loop root to View Models group
- Tests/FoodFinder → moved from project root to LoopTests group

…, analysis history

- Fix triple barcode fire by consuming scan result immediately in Combine sink
- Replace AsyncImage with pre-downloaded thumbnail to avoid SwiftUI rebuild issues
- Use smallest OFF thumbnail (100px) with static food icon fallback for slow servers
- Add secure Keychain storage for AI provider API keys
- Add analysis history tracking with FoodFinder_AnalysisRecord
- Consolidate AI provider settings and remove BYOTestConfig
- Remove barcode connectivity pre-check that added 3+ seconds latency per scan
- Add NSCache to ImageDownloader for thumbnail deduplication (50 items, 10 MB); a configuration sketch follows this list
- Remove artificial minimumSearchDuration delay from search and error paths
- Merge duplicate Combine observers into single combineLatest for AI recomputation
- Decode image_thumb_url from OpenFoodFacts API for smallest available thumbnail
- Wrap 369 bare print() calls in #if DEBUG across 8 FoodFinder files
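
A configuration sketch for the thumbnail cache mentioned above; only the 50-item / 10 MB limits come from this PR, and the type and method names are illustrative.

```swift
import UIKit

/// Illustrative in-memory thumbnail cache; the PR's actual ImageDownloader
/// internals are not shown here.
final class ThumbnailCache {
    private let cache = NSCache<NSURL, UIImage>()

    init() {
        cache.countLimit = 50                    // at most 50 cached thumbnails
        cache.totalCostLimit = 10 * 1024 * 1024  // ~10 MB, using encoded byte size as cost
    }

    func image(for url: URL) -> UIImage? {
        cache.object(forKey: url as NSURL)
    }

    func store(_ image: UIImage, data: Data, for url: URL) {
        cache.setObject(image, forKey: url as NSURL, cost: data.count)
    }
}
```
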
…eaders

File consolidations (6 files removed, 2 new files created):

1. FoodFinder_ScanResult.swift + FoodFinder_VoiceResult.swift
   → FoodFinder_InputResults.swift

2. FoodFinder_FavoriteDetailView.swift + FoodFinder_FavoriteEditView.swift
   + FoodFinder_FavoritesView.swift → FoodFinder_FavoritesHelpers.swift

3. FoodFinder_AISettingsManager.swift
   → absorbed into FoodFinder_AIProviderConfig.swift

4. FoodFinder_FavoritesViewModel.swift
   → absorbed into FoodFinder_SearchViewModel.swift

Other changes:
- Fix long analysis titles overflowing the screen by programmatically
  truncating picker row names and constraining food type to 20 chars
- Improve AI prompts for menu/recipe/text image analysis
- Add text-only AI analysis path in AIServiceManager
- Increase AI token budget for multi-item responses
- Standardize all 26 FoodFinder file headers with consistent format
- Add originalAICarbs and aiConfidencePercent fields to
  FoodFinder_AnalysisRecord for tracking AI estimate accuracy
- Add Notification.Name.foodFinderMealLogged for real-time
  meal event observation
- Add MealDataProvider protocol with date-range query interface
  and AnalysisHistoryStore conformance (a rough protocol sketch follows this list)
- Add "Last 30 days" retention option to Analysis History settings
- Absorption time model: conservative adjustments anchored to Loop's
  3-hour default. FPU adds +0/+0.5/+1.0 hr (was +1/+2.5/+4), fiber
  +0/+0.25/+0.5 (was +0/+1/+2), meal size +0/+0.25/+0.5 (was +0/+1/+2).
  Cap reduced from 8 to 5 hours. Updated AI prompt and 3 examples.
- OCR routing fix: raised menu detection threshold from 1 to 5 significant
  lines and always include image on menu path to prevent food photo
  misclassification (fixes "Unidentifiable Food Item" on food photos).
- Inline "Why X hrs?" pill on Absorption Time row replaces standalone
  DisclosureGroup row. Purple centered pill with fixed width, expands
  reasoning on tap. Uses AIAbsorptionTimePickerRow when AI-generated.
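
Rough shapes for the meal-data hooks described above: originalAICarbs, aiConfidencePercent, the MealDataProvider name, and the notification name come from this PR, while the remaining fields, the raw string value, and the method signature are assumptions.

```swift
import Foundation

/// Hypothetical record shape; only originalAICarbs and aiConfidencePercent are
/// fields named in this PR, the rest is illustrative.
struct FoodFinder_AnalysisRecord {
    let date: Date
    let foodType: String
    let carbs: Double               // grams, as confirmed by the user
    let originalAICarbs: Double?    // grams, as originally estimated by the AI
    let aiConfidencePercent: Int?   // 20–97 confidence shown in the UI
}

/// Date-range query interface described in this PR; the exact signature is assumed.
protocol MealDataProvider {
    func mealRecords(from start: Date, to end: Date) -> [FoodFinder_AnalysisRecord]
}

extension Notification.Name {
    /// Posted when a FoodFinder analysis is committed as a carb entry
    /// (the raw string value here is a guess).
    static let foodFinderMealLogged = Notification.Name("FoodFinder.mealLogged")
}
```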

@taylorpatterson-T1D changed the title from "FoodFinder: AI-powered food identification for carb entry" to "FoodFinder: AI-powered food identification for carb entry (supports Loop 3.10.0)" on Feb 11, 2026.