
Support filtering by duplicate request groups. Add duplicate request drawer UI.#7965

Draft
lucanovera wants to merge 4 commits into main from
ENG-1806-FE-Add-DSR-duplication-detection-activity-log-link

Conversation

@lucanovera
Contributor

@lucanovera lucanovera commented Apr 20, 2026

Ticket ENG-1806

Description Of Changes

Replaces the generic "View Log" action on the Duplicate Request Detection entry in the privacy request activity timeline with a new "View duplicates" drawer that lists every request in the same duplicate group as a table (Request ID → Created at → Source → Status). The current request is pinned at the top with a "Current" tag; the other rows link to their detail pages in new tabs.

Getting there required two small but important backend pieces:

  1. A new duplicate_request_group_id filter on PrivacyRequestFilter, wired into filter_privacy_request_queryset. The frontend calls POST /privacy-request/search with the current request's duplicate_request_group_id to fetch the group members — no new endpoint, no new query hook (searchPrivacyRequests already extends Partial<PrivacyRequestFilter>).
  2. Recreating the missing index on privacyrequest.duplicate_request_group_id. Migration c09e76282dd1 created this index when the column was a String; migration 80d28dea3b6b dropped the column/index and re-added the column as a UUID foreign key but never recreated the index. Postgres does not auto-index FK columns. There was no existing production query hitting this column (the column was write-only — set by DuplicateDetectionService and never read as a filter) so the gap hadn't surfaced, but the new filter is the first query pattern to hit it. Without the index this would seq-scan privacyrequest on every call.
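The filter wiring described in point 1 can be sketched roughly as follows. This is a hedged stand-in, not the actual implementation: `PrivacyRequestFilter` here is a hypothetical dataclass mirroring the extended schema, and a plain list comprehension stands in for the SQLAlchemy `.filter()` call inside `filter_privacy_request_queryset`:

```python
from dataclasses import dataclass
from typing import Optional
from uuid import UUID

# Hypothetical stand-in for the extended PrivacyRequestFilter schema.
@dataclass
class PrivacyRequestFilter:
    duplicate_request_group_id: Optional[UUID] = None

def filter_by_duplicate_group(requests: list, filters: PrivacyRequestFilter) -> list:
    # In the real code this is a queryset .filter() inside
    # filter_privacy_request_queryset; a plain list stands in here.
    if filters.duplicate_request_group_id is None:
        return requests
    return [
        r for r in requests
        if r.get("duplicate_request_group_id") == filters.duplicate_request_group_id
    ]
```

Because the filter is optional, omitting it leaves the search behavior unchanged, which is why no new endpoint or query hook is needed.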

The migration follows the repo's existing large-table pattern (a7241db3ee6a_add_identity_indexes.py): inline CREATE INDEX under 1M rows, otherwise register the key in post_upgrade_background_migration_tasks so post_upgrade_index_creation.py runs CREATE INDEX CONCURRENTLY at application startup (non-blocking). index=True was added to the SQLAlchemy column so future alembic autogenerate runs keep model and DB aligned.
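The large-table branch can be illustrated with a minimal sketch. The 1M-row threshold, index name, and CONCURRENTLY statement come from the description above; the function name and return shape are hypothetical, not the migration's actual code:

```python
# Sketch of the conditional-migration pattern described above (modeled on
# a7241db3ee6a_add_identity_indexes.py). plan_index_creation is hypothetical.
LARGE_TABLE_THRESHOLD = 1_000_000

INDEX_NAME = "ix_privacyrequest_duplicate_request_group_id"
INLINE_SQL = f"CREATE INDEX {INDEX_NAME} ON privacyrequest (duplicate_request_group_id)"
DEFERRED_SQL = f"CREATE INDEX CONCURRENTLY {INDEX_NAME} ON privacyrequest (duplicate_request_group_id)"

def plan_index_creation(row_count: int):
    """Return (strategy, sql): inline DDL under the threshold, otherwise a
    deferred CONCURRENTLY statement registered for the startup task."""
    if row_count < LARGE_TABLE_THRESHOLD:
        return ("inline", INLINE_SQL)
    return ("deferred", DEFERRED_SQL)
```

The deferred branch matters because `CREATE INDEX CONCURRENTLY` cannot run inside a transaction block, so it has to happen outside the normal Alembic migration, at application startup.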

Frontend note: the timeline branches inside ActivityTimeline.tsx — when the timeline entry is Duplicate Request Detection and the current request has a duplicate_request_group_id, clicking it opens the new drawer; otherwise it falls through to the existing LogDrawer (e.g. legacy records without a group id).
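The branch condition itself is small; as a language-neutral sketch of the check that lives in ActivityTimeline.tsx (the constant name is taken from the file list below, the function name is hypothetical):

```python
# Shared constant exported from usePrivacyRequestEventLogs per the PR.
DUPLICATE_DETECTION_DATASET_NAME = "Duplicate Request Detection"

def opens_duplicates_drawer(entry_title, duplicate_request_group_id) -> bool:
    # Legacy records without a group id fall through to the existing LogDrawer.
    return (
        entry_title == DUPLICATE_DETECTION_DATASET_NAME
        and duplicate_request_group_id is not None
    )
```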

Code Changes

Backend

  • src/fides/api/alembic/migrations/versions/xx_2026_04_20_1200_e8f9a0b1c2d3_recreate_ix_privacyrequest_duplicate_request_group_id.py — new conditional migration to recreate ix_privacyrequest_duplicate_request_group_id
  • src/fides/api/migrations/post_upgrade_index_creation.py — registered deferred CREATE INDEX CONCURRENTLY statement under privacyrequest
  • src/fides/api/models/privacy_request/privacy_request.py — added index=True on duplicate_request_group_id
  • src/fides/api/schemas/privacy_request.py — added duplicate_request_group_id: Optional[UUID] to PrivacyRequestFilter
  • src/fides/service/privacy_request/privacy_request_query_utils.py — applied filter in filter_privacy_request_queryset
  • tests/ops/api/v1/endpoints/privacy_request/test_privacy_request_duplicate_group_filtering.py — new file, 4 tests (returns group members, excludes unrelated groups, empty for unknown group id, 422 for invalid UUID)

Frontend

  • clients/admin-ui/src/features/privacy-requests/events-and-logs/DuplicatesDrawer.tsx — new component; Ant Drawer + Table with Request ID → Created at → Source → Status, current row pinned at top with "Current" tag, other rows use RouterLink with target="_blank"
  • clients/admin-ui/src/features/privacy-requests/events-and-logs/ActivityTimeline.tsx — state + conditional render; branches to the new drawer when the entry is Duplicate Request Detection and a group id is set
  • clients/admin-ui/src/features/privacy-requests/events-and-logs/ActivityTimelineEntry.tsx — dynamic button label ("View duplicates" vs "View Log")
  • clients/admin-ui/src/features/privacy-requests/events-and-logs/hooks/usePrivacyRequestEventLogs.ts — flag isDuplicateDetection on the timeline item; exports shared DUPLICATE_DETECTION_DATASET_NAME constant
  • clients/admin-ui/src/features/privacy-requests/types.ts — added PrivacyRequestEntity.duplicate_request_group_id and ActivityTimelineItem.isDuplicateDetection/hasDuplicateGroup
  • clients/admin-ui/src/types/api/models/PrivacyRequestFilter.ts — regenerated to include duplicate_request_group_id

Steps to Confirm

Backend

  1. Pull this branch and run migrations. Confirm ix_privacyrequest_duplicate_request_group_id exists on privacyrequest:
    SELECT indexname FROM pg_indexes
    WHERE tablename = 'privacyrequest'
      AND indexname = 'ix_privacyrequest_duplicate_request_group_id';
    On a prod-sized snapshot (> 1M rows), instead confirm the migration_key is registered and then completed by the startup task:
    SELECT key, completed_at FROM post_upgrade_background_migration_tasks
    WHERE task_type = 'index'
      AND key = 'ix_privacyrequest_duplicate_request_group_id';
  2. Enable duplicate detection locally:
    CONFIG__duplicate_detection__enabled=true
    CONFIG__duplicate_detection__match_identity_fields=["email"]
    
  3. Submit two privacy requests with the same email within the detection window. Wait for detection to run; the second request should end up with status duplicate.
  4. Verify the new filter on the API directly:
    POST /api/v1/privacy-request/search
    { "duplicate_request_group_id": "<uuid-from-step-3>" }
    
    Returns both requests.
  5. Invalid UUID returns 422:
    POST /api/v1/privacy-request/search
    { "duplicate_request_group_id": "not-a-uuid" }
    
  6. Unknown UUID returns an empty page:
    POST /api/v1/privacy-request/search
    { "duplicate_request_group_id": "00000000-0000-0000-0000-000000000000" }
    
  7. Query plan sanity check (run ANALYZE privacyrequest first if the plan looks off):
    EXPLAIN (ANALYZE, BUFFERS)
    SELECT id FROM privacyrequest
    WHERE duplicate_request_group_id = '<uuid>';
    Expect Index Scan using ix_privacyrequest_duplicate_request_group_id, not Seq Scan.
  8. Run the new test file:
    pytest tests/ops/api/v1/endpoints/privacy_request/test_privacy_request_duplicate_group_filtering.py
    

Frontend

  1. With duplicate detection enabled and two duplicate requests created (backend steps 2–3 above), open /privacy-requests/<second-request-id>.
  2. In the activity timeline, the Duplicate Request Detection entry should show "View duplicates" (not "View Log").
  3. Click it → drawer opens with columns Request ID → Created at → Source → Status and both requests listed.
  4. The current request is pinned at the top with a "Current" tag next to its ID and is not clickable.
  5. The other row's Request ID is a link — click it and confirm it opens the other request's detail page in a new browser tab (current tab stays on the current request).
  6. Sanity check on a non-duplicate request: open a regular privacy request detail page. There should be no Duplicate Request Detection entry. Clicking "View Log" on other skipped/errored entries still opens the regular LogDrawer unchanged.

Pre-Merge Checklist

  • Issue requirements met
  • All CI pipelines succeeded
  • CHANGELOG.md updated
    • Add a db-migration label to the entry if your change includes a DB migration
    • Add a high-risk label to the entry if your change includes a high-risk change (i.e. potential for performance impact or unexpected regression) that should be flagged
    • Updates unreleased work already in Changelog, no new entry necessary
  • UX feedback:
    • All UX related changes have been reviewed by a designer
    • No UX review needed
  • Followup issues:
    • Followup issues created
    • No followup issues
  • Database migrations:
    • Ensure that your downrev is up to date with the latest revision on main
    • Ensure that your downgrade() migration is correct and works
      • If a downgrade migration is not possible for this change, please call this out in the PR description!
    • No migrations
  • Documentation:
    • Documentation complete, PR opened in fidesdocs
    • Documentation issue created in fidesdocs
    • If there are any new client scopes created as part of the pull request, remember to update public-facing documentation that references our scope registry
    • No documentation updates required

@vercel
Contributor

vercel Bot commented Apr 20, 2026

The latest updates on your projects. Learn more about Vercel for GitHub.

2 Skipped Deployments
Project Deployment Actions Updated (UTC)
fides-plus-nightly Ignored Ignored Preview Apr 20, 2026 7:53pm
fides-privacy-center Ignored Ignored Apr 20, 2026 7:53pm


@github-actions

github-actions Bot commented Apr 20, 2026

Title Lines Statements Branches Functions
admin-ui Coverage: 8%
6.29% (2771/44031) 5.49% (1376/25035) 4.41% (575/13033)
fides-js Coverage: 78%
78.98% (1962/2484) 65.55% (1214/1852) 72.57% (336/463)
privacy-center Coverage: 88%
85.97% (331/385) 81.36% (179/220) 78.87% (56/71)

@codecov

codecov Bot commented Apr 20, 2026

Codecov Report

✅ All modified and coverable lines are covered by tests.
✅ Project coverage is 83.17%. Comparing base (d8860a9) to head (82cdaa7).
⚠️ Report is 9 commits behind head on main.

❌ Your project check has failed because the head coverage (83.17%) is below the target coverage (85.00%). You can increase the head coverage or adjust the target coverage.

Additional details and impacted files
@@            Coverage Diff             @@
##             main    #7965      +/-   ##
==========================================
- Coverage   85.04%   83.17%   -1.87%     
==========================================
  Files         631      631              
  Lines       41203    41217      +14     
  Branches     4806     4808       +2     
==========================================
- Hits        35040    34284     -756     
- Misses       5071     5711     +640     
- Partials     1092     1222     +130     

☔ View full report in Codecov by Sentry.

@lucanovera lucanovera added the db-migration label Apr 20, 2026

Labels

db-migration — indicates that a change includes a database migration
