FoodFinder: AI-powered food identification for carb entry (supports Loop 3.10.0) #2404
Open
taylorpatterson-T1D wants to merge 12 commits into LoopKit:dev from TaylorJPatterson:feat/FoodFinder
+16,349 −97
FoodFinder adds barcode scanning, AI camera analysis, voice search, and text-based food lookup to Loop's carb entry workflow. All feature code lives in dedicated FoodFinder/ subdirectories with FoodFinder_ prefixed filenames for clean isolation and portability to other Loop forks. Integration touchpoints: ~29 lines across 3 existing files (CarbEntryView, SettingsView, FavoriteFoodDetailView). Feature is controlled by a single toggle in FoodFinder_FeatureFlags.swift. New files: 34 (11 views, 3 models, 13 services, 2 view models, 1 feature flags, 1 documentation, 3 tests)
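The description above says the whole feature is gated by a single toggle in FoodFinder_FeatureFlags.swift. A minimal sketch of that gating pattern, assuming the flag is named `isEnabled` (the real file may expose different names):

```swift
import Foundation

// Hypothetical sketch of the single-toggle gating described above; the
// real FoodFinder_FeatureFlags.swift may differ in shape and naming.
struct FoodFinder_FeatureFlags {
    /// Master switch: when false, every FoodFinder entry point stays hidden
    /// and no FoodFinder code path runs.
    let isEnabled: Bool
}

/// Integration touchpoints in existing views guard on the flag before
/// showing any FoodFinder UI, keeping the feature completely dormant
/// when disabled.
func shouldShowFoodFinderEntryPoint(_ flags: FoodFinder_FeatureFlags) -> Bool {
    flags.isEnabled
}
```

With the flag off, the ~29 integration lines in existing files become no-ops, which is what makes the feature easy to carry in other Loop forks.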
Voice search (microphone button) now uses the AI analysis pipeline instead of USDA text search, enabling natural-language food descriptions like "a medium bowl of spicy ramen and a side of gyoza". Text-typed searches continue using USDA/OpenFoodFacts as before.

Changes:
- SearchBar: add mic button with voice search callback
- SearchRouter: add analyzeFoodByDescription() routing through AI providers
- SearchViewModel: add performVoiceSearch() async method
- EntryPoint: wire VoiceSearchView sheet to AI analysis pipeline
Replace the separate mic button with automatic natural-language detection. When the user dictates into the search field via iOS keyboard dictation, the text is analyzed: short queries (1-3 words like "apple") use USDA, while longer descriptive phrases (4+ words like "a medium bowl of spicy ramen and a side of gyoza") automatically route to the AI analysis path.

Changes:
- SearchBar: remove mic button and onVoiceSearchTapped parameter
- SearchViewModel: add isNaturalLanguageQuery() heuristic, route detected natural language through performVoiceSearch in performFoodSearch
- EntryPoint: remove voice search sheet, wire onGenerativeSearchResult callback to handleAIFoodAnalysis
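The word-count heuristic described above could look like the following sketch. The function name matches the commit message; the implementation details are assumptions (the real SearchViewModel method may use additional signals beyond word count):

```swift
import Foundation

// Sketch of the natural-language detection heuristic described above:
// short queries (1-3 words) go to USDA text search, while longer
// descriptive phrases (4+ words) route to the AI analysis path.
func isNaturalLanguageQuery(_ query: String) -> Bool {
    // Split on any whitespace so dictated text with odd spacing
    // still counts words correctly.
    let words = query.split(whereSeparator: { $0.isWhitespace })
    return words.count >= 4
}
```

For example, "apple" stays on the USDA path, while "a medium bowl of spicy ramen and a side of gyoza" triggers AI analysis.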
The Python script created group definitions but didn't properly attach all of them to their parent groups. Fixes:
- Services group → now child of Loop app root (was orphaned)
- Resources group → now child of Loop app root (was orphaned)
- Documentation group → now child of project root (was orphaned)
- ViewModels/FoodFinder → moved from Loop root to View Models group
- Tests/FoodFinder → moved from project root to LoopTests group
…, analysis history
- Fix triple barcode fire by consuming scan result immediately in Combine sink
- Replace AsyncImage with pre-downloaded thumbnail to avoid SwiftUI rebuild issues
- Use smallest OFF thumbnail (100px) with static food icon fallback for slow servers
- Add secure Keychain storage for AI provider API keys
- Add analysis history tracking with FoodFinder_AnalysisRecord
- Consolidate AI provider settings and remove BYOTestConfig
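The triple-barcode-fire fix above works by consuming the scan result the moment the Combine sink receives it. Since the Combine plumbing is Apple-platform-specific, here is a platform-neutral sketch of the underlying consume-on-read pattern (type and property names are hypothetical):

```swift
import Foundation

// Sketch of the "consume scan result immediately" fix described above.
// In the PR this happens inside a Combine sink; the pattern is the same:
// read the value and clear it in one step, so repeated publisher
// emissions or SwiftUI view rebuilds cannot process the same barcode
// more than once.
final class ScanResultBox {
    private var pending: String?  // hypothetical: latest scanned barcode

    func publish(_ barcode: String) { pending = barcode }

    /// Returns the pending result at most once, clearing it on read.
    func consume() -> String? {
        defer { pending = nil }
        return pending
    }
}
```

A second call to `consume()` returns nil, which is what prevents the duplicate lookups.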
- Remove barcode connectivity pre-check that added 3+ seconds latency per scan
- Add NSCache to ImageDownloader for thumbnail deduplication (50 items, 10MB)
- Remove artificial minimumSearchDuration delay from search and error paths
- Merge duplicate Combine observers into single combineLatest for AI recomputation
- Decode image_thumb_url from OpenFoodFacts API for smallest available thumbnail
- Wrap 369 bare print() calls in #if DEBUG across 8 FoodFinder files
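The thumbnail deduplication above can be sketched with NSCache using the limits the commit names (50 items, 10 MB). The wrapper type and method names are illustrative; the real ImageDownloader may be structured differently:

```swift
import Foundation

// Sketch of the thumbnail cache described above: an NSCache capped at
// 50 entries and ~10 MB, keyed by URL string, so repeated list rows
// reuse an already-downloaded thumbnail instead of re-fetching it.
final class ThumbnailCache {
    private let cache = NSCache<NSString, NSData>()

    init() {
        cache.countLimit = 50                    // at most 50 thumbnails
        cache.totalCostLimit = 10 * 1024 * 1024  // ~10 MB of image bytes
    }

    func store(_ data: Data, forURL url: String) {
        // Cost is the byte count, so totalCostLimit tracks real memory use.
        cache.setObject(data as NSData, forKey: url as NSString, cost: data.count)
    }

    func data(forURL url: String) -> Data? {
        cache.object(forKey: url as NSString).map { Data(referencing: $0) }
    }
}
```

NSCache also evicts automatically under memory pressure, which suits a scrolling search-results list.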
…eaders

File consolidations (6 files removed, 2 new files created):
1. FoodFinder_ScanResult.swift + FoodFinder_VoiceResult.swift → FoodFinder_InputResults.swift
2. FoodFinder_FavoriteDetailView.swift + FoodFinder_FavoriteEditView.swift + FoodFinder_FavoritesView.swift → FoodFinder_FavoritesHelpers.swift
3. FoodFinder_AISettingsManager.swift → absorbed into FoodFinder_AIProviderConfig.swift
4. FoodFinder_FavoritesViewModel.swift → absorbed into FoodFinder_SearchViewModel.swift

Other changes:
- Fix long analysis titles overflowing the screen by programmatically truncating picker row names and constraining food type to 20 chars
- Improve AI prompts for menu/recipe/text image analysis
- Add text-only AI analysis path in AIServiceManager
- Increase AI token budget for multi-item responses
- Standardize all 26 FoodFinder file headers with consistent format
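The programmatic truncation mentioned above (picker row names, 20-character food types) could be as simple as the following sketch; the helper name and ellipsis choice are assumptions, only the 20-character cap comes from the commit message:

```swift
import Foundation

// Sketch of the overflow fix described above: clip long AI-generated
// food titles so they cannot overflow a picker row. Strings at or
// under the limit pass through unchanged; longer ones are cut to
// (limit - 1) characters plus an ellipsis, for exactly `limit` total.
func truncatedRowName(_ name: String, limit: Int = 20) -> String {
    guard name.count > limit else { return name }
    return String(name.prefix(limit - 1)) + "…"
}
```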
- Add originalAICarbs and aiConfidencePercent fields to FoodFinder_AnalysisRecord for tracking AI estimate accuracy
- Add Notification.Name.foodFinderMealLogged for real-time meal event observation
- Add MealDataProvider protocol with date-range query interface and AnalysisHistoryStore conformance
- Add "Last 30 days" retention option to Analysis History settings
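A sketch of how the MealDataProvider protocol and its date-range query interface might fit together. The record fields mirror those named in the commit; the protocol's exact signature, the carbs field, and the in-memory store standing in for AnalysisHistoryStore are assumptions:

```swift
import Foundation

// Hypothetical record shape; originalAICarbs and aiConfidencePercent
// are the accuracy-tracking fields named in the commit message.
struct AnalysisRecord {
    let date: Date
    let carbs: Double              // carbs the user actually entered
    let originalAICarbs: Double?   // AI's original estimate, if any
    let aiConfidencePercent: Int?
}

// Sketch of the date-range query interface described above.
protocol MealDataProvider {
    /// Returns meal records whose dates fall within the given range.
    func meals(from start: Date, to end: Date) -> [AnalysisRecord]
}

// Minimal in-memory conformance, standing in for AnalysisHistoryStore.
final class InMemoryMealStore: MealDataProvider {
    private var records: [AnalysisRecord] = []
    func log(_ record: AnalysisRecord) { records.append(record) }
    func meals(from start: Date, to end: Date) -> [AnalysisRecord] {
        records.filter { $0.date >= start && $0.date <= end }
    }
}
```

The same interface naturally supports the "Last 30 days" retention option: pruning is just a query for everything older than the cutoff.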
- Absorption time model: conservative adjustments anchored to Loop's 3-hour default. FPU adds +0/+0.5/+1.0 hr (was +1/+2.5/+4), fiber +0/+0.25/+0.5 (was +0/+1/+2), meal size +0/+0.25/+0.5 (was +0/+1/+2). Cap reduced from 8 to 5 hours. Updated AI prompt and 3 examples.
- OCR routing fix: raised menu detection threshold from 1 to 5 significant lines and always include image on menu path to prevent food photo misclassification (fixes "Unidentifiable Food Item" on food photos).
- Inline "Why X hrs?" pill on Absorption Time row replaces standalone DisclosureGroup row. Purple centered pill with fixed width, expands reasoning on tap. Uses AIAbsorptionTimePickerRow when AI-generated.
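The absorption-time arithmetic above can be sketched directly. The hour offsets and the 3-hour anchor and 5-hour cap come from the commit message; the three-tier naming (low/medium/high) and the function shape are assumptions about how the model is expressed:

```swift
import Foundation

// Sketch of the conservative absorption-time model described above:
// adjustments are added to Loop's 3-hour default and the total is
// capped at 5 hours.
enum Tier: Int { case low = 0, medium = 1, high = 2 }

func absorptionHours(fpu: Tier, fiber: Tier, mealSize: Tier) -> Double {
    let base = 3.0                                    // Loop's default
    let fpuAdd      = [0.0, 0.5, 1.0][fpu.rawValue]   // fat-protein units
    let fiberAdd    = [0.0, 0.25, 0.5][fiber.rawValue]
    let mealSizeAdd = [0.0, 0.25, 0.5][mealSize.rawValue]
    return min(base + fpuAdd + fiberAdd + mealSizeAdd, 5.0)  // hard cap
}
```

Worked examples: a plain snack (all low) stays at the 3-hour default; a high-fat, high-fiber, large meal is 3 + 1.0 + 0.5 + 0.5 = 5.0 hours, exactly at the cap, so the old model's 8-hour extremes are no longer reachable.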
This PR replaces the legacy FoodFinder PR #2329
Summary
FoodFinder adds AI-powered food identification to Loop's carb entry screen, helping people with diabetes quickly and accurately log meals. It integrates directly into the existing Add Carb Entry flow with zero changes to Loop's core dosing logic.
The Problem We're Solving:
Carb counting is the single hardest daily task for people managing diabetes with Loop. Every meal requires estimating carbohydrate content — and getting it wrong directly impacts Time in Range. Current workflow: the user mentally estimates carbs, types a number, and hopes for the best. There's no assistance, no database lookup, no learning from past meals.
What FoodFinder Does
FoodFinder adds AI-powered food identification directly into Loop's existing Add Carb Entry screen. It provides four ways to identify food and auto-populate carb values:
Search modes:
- Barcode scanning
- AI camera analysis
- Voice search
- Text-based food lookup
Key features:
- Single feature flag (FoodFinder_FeatureFlags.isEnabled) — completely dormant when disabled

User-Configurable Settings
All settings are in Loop Settings → FoodFinder:
- AI model selection (gpt-4o, claude-sonnet-4-5-20250929, gemini-2.0-flash) — be sure to use models that support image processing

Safety Considerations
Architecture and impact on existing Loop code
FoodFinder was designed for minimal integration footprint and easy containment within Loop:
- All new code lives in FoodFinder/ subdirectories, with all filenames prefixed FoodFinder_

Modified existing files:
- CarbEntryView.swift — FoodFinder_EntryPoint (~5 lines) + analysis history picker
- SettingsView.swift
- CarbEntryViewModel.swift
- FavoriteFoodDetailView.swift
- FavoriteFoodsView.swift
- AddEditFavoriteFoodView.swift
- AddEditFavoriteFoodViewModel.swift

New file locations (all under Loop/):
- Models/FoodFinder/
- View Models/FoodFinder/
- Views/FoodFinder/
- Services/FoodFinder/
- Resources/FoodFinder/
- Documentation/FoodFinder/
- LoopTests/FoodFinder/

Screenshots
Video Demo
YouTube Demo: https://youtu.be/i8xToAYBe4M
Test plan
Requesting review by @marionbarker based on availability.