diff --git a/CHANGELOG.md b/CHANGELOG.md
index 7be0667..7c3da5b 100644
--- a/CHANGELOG.md
+++ b/CHANGELOG.md
@@ -34,6 +34,8 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0
 - `SegmentStarted` event now carries `start_byte` and `end_byte` so downstream consumers can identify which byte range a segment covers
 
 ### Added
+- **Clear completed / Clear failed downloads**: two new toolbar buttons in the Downloads view, separated from the bulk actions by a vertical separator. Each opens a confirmation dialog with an optional "Also delete files from disk" checkbox gated behind a prominent red warning panel. Success and error outcomes are reported via toasts.
+- `sonner` toast notifications (new dependency) mounted globally in `App.tsx`, with a thin `src/lib/toast.ts` wrapper so components do not depend on the library directly.
 - YouTube 1080p+ support: when `resolve_stream_url` returns `AdaptiveStreamOnly`, `download_media_start` now falls back to `download_to_file` which delegates the full DASH download + ffmpeg merge to yt-dlp. The merged file is moved to the

diff --git a/docs/superpowers/plans/2026-04-16-youtube-download-to-file.md b/docs/superpowers/plans/2026-04-16-youtube-download-to-file.md
new file mode 100644
index 0000000..fa6a751
--- /dev/null
+++ b/docs/superpowers/plans/2026-04-16-youtube-download-to-file.md
@@ -0,0 +1,1365 @@
# YouTube `download_to_file` Implementation Plan

> **For agentic workers:** REQUIRED SUB-SKILL: Use superpowers:subagent-driven-development (recommended) or superpowers:executing-plans to implement this plan task-by-task. Steps use checkbox (`- [ ]`) syntax for tracking.

**Goal:** Enable 1080p+ downloads from YouTube by delegating the DASH download+merge to yt-dlp through a new `download_to_file` function in the plugin, then register the resulting file as a completed download in Vortex.
**Architecture:** The YouTube plugin exposes a new WASM function `download_to_file` that runs yt-dlp with `bestvideo+bestaudio --merge-output-format` and returns the path of the merged file. On the Vortex side, when `resolve_stream_url` fails with `AdaptiveStreamOnly`, the engine falls back to `download_to_file`, moves the temp file into the downloads folder, and records the download as completed via a new `RegisterLocalFileCommand`.

**Tech Stack:** Rust, extism-pdk 1.4, yt-dlp subprocess, Tauri 2 IPC, CQRS CommandBus, thiserror, serde_json

---

## Files touched

### Plugin (`vortex-mod-youtube/`)
| File | Action | Responsibility |
|------|--------|----------------|
| `src/extractor.rs` | Modify | Add `yt_dlp_args_for_download_to_file` + `DEFAULT_DOWNLOAD_TIMEOUT_MS` + `parse_download_path_from_stdout` |
| `src/plugin_api.rs` | Modify | Add `#[plugin_fn] download_to_file` |
| `plugin.toml` | Modify | Bump version to `1.2.0` |
| `Cargo.toml` | Modify | Bump version to `1.2.0` |
| `CHANGELOG.md` | Modify | Add `[1.2.0]` section |

### Vortex core (`vortex/src-tauri/src/`)
| File | Action | Responsibility |
|------|--------|----------------|
| `domain/error.rs` | Modify | Add `AdaptiveStreamOnly` variant |
| `domain/ports/driven/plugin_loader.rs` | Modify | Add `DownloadedFileInfo` + `download_to_file` method |
| `adapters/driven/plugin/extism_loader.rs` | Modify | Implement `download_to_file` + detect `AdaptiveStreamOnly` in `resolve_stream_url` |
| `application/commands/mod.rs` | Modify | Add `RegisterLocalFileCommand` |
| `application/commands/register_local_file.rs` | Create | `handle_register_local_file` handler |
| `adapters/driving/tauri_ipc.rs` | Modify | `AdaptiveStreamOnly` → `download_to_file` fallback in `download_media_start` |
| `vortex/CHANGELOG.md` | Modify | `[Unreleased]` section |

---

## Task 1 — Plugin: `extractor.rs` helpers

**Files:**
- Modify: `vortex-mod-youtube/src/extractor.rs`

- [ ] **Step 1: Write the failing tests**

Append to the end of the `#[cfg(test)] mod tests` block in `extractor.rs`:

```rust
#[test]
fn download_args_include_bestvideo_plus_bestaudio() {
    let args = yt_dlp_args_for_download_to_file("https://youtu.be/abc", "1080p", "mp4", "/tmp/vx", false);
    let fmt_idx = args.iter().position(|a| a == "--format").unwrap();
    assert!(args[fmt_idx + 1].contains("bestvideo"), "selector must start with bestvideo");
    assert!(args[fmt_idx + 1].contains("bestaudio"), "selector must include bestaudio");
}

#[test]
fn download_args_audio_only_uses_bestaudio() {
    let args = yt_dlp_args_for_download_to_file("https://youtu.be/abc", "", "m4a", "/tmp/vx", true);
    let fmt_idx = args.iter().position(|a| a == "--format").unwrap();
    assert!(args[fmt_idx + 1].starts_with("bestaudio"), "audio_only must start with bestaudio");
}

#[test]
fn download_args_include_merge_output_format() {
    let args = yt_dlp_args_for_download_to_file("https://youtu.be/abc", "1080p", "mp4", "/tmp/vx", false);
    assert!(args.contains(&"--merge-output-format".into()));
    let idx = args.iter().position(|a| a == "--merge-output-format").unwrap();
    assert_eq!(args[idx + 1], "mp4");
}

#[test]
fn download_args_include_output_template_with_dir() {
    let args = yt_dlp_args_for_download_to_file("https://youtu.be/abc", "720p", "mp4", "/tmp/vx", false);
    let out_idx = args.iter().position(|a| a == "--output").unwrap();
    assert!(args[out_idx + 1].starts_with("/tmp/vx/"), "output template must be in output_dir");
}

#[test]
fn download_args_include_print_after_move() {
    let args = yt_dlp_args_for_download_to_file("https://youtu.be/abc", "1080p", "mp4", "/tmp/vx", false);
    let idx = args.iter().position(|a| a == "--print").unwrap();
    assert_eq!(args[idx + 1], "after_move:%(filepath)s");
}

#[test]
fn parse_download_path_returns_last_nonempty_line() {
    let stdout = "\n/tmp/vx/dQw4w9WgXcQ.mp4\n";
    let path = parse_download_path_from_stdout(stdout).unwrap();
    assert_eq!(path, "/tmp/vx/dQw4w9WgXcQ.mp4");
}

#[test]
fn parse_download_path_empty_stdout_returns_error() {
    let result = parse_download_path_from_stdout(" \n \n");
    assert!(matches!(result, Err(PluginError::NoMatchingFormat)));
}
```

- [ ] **Step 2: Verify the tests fail**

```bash
cd /home/matvei/projets/vx/vortex-mod-youtube
cargo test 2>&1 | grep -E "FAILED|error\[" | head -20
```
Expected: compilation errors (missing functions).

- [ ] **Step 3: Implement the functions in `extractor.rs`**

Add after the existing `DEFAULT_TIMEOUT_MS` constant (line 28):

```rust
/// Default timeout for a full video download+merge — 30 minutes.
pub const DEFAULT_DOWNLOAD_TIMEOUT_MS: u64 = 1_800_000;
```

Add after `yt_dlp_args_for_stream_url` (before the `// ── Tests` section):

```rust
/// Build yt-dlp args for a full download+merge operation.
///
/// Unlike `yt_dlp_args_for_stream_url`, this actually downloads the video
/// and audio streams, merges them via ffmpeg (spawned internally by yt-dlp),
/// and writes the final file to `output_dir`. The merged file path is printed
/// to stdout via `--print after_move:%(filepath)s` for the caller to read.
///
/// The format selector prefers DASH streams (bestvideo+bestaudio) which allow
/// 1080p and above, unlike the `best[protocol=https]` selector used by
/// `resolve_stream_url` which is limited to pre-merged ≤720p streams.
pub fn yt_dlp_args_for_download_to_file(
    url: &str,
    quality: &str,
    format: &str,
    output_dir: &str,
    audio_only: bool,
) -> Vec<String> {
    let selector = build_download_format_selector(quality, format, audio_only);
    let output_template = format!("{output_dir}/%(id)s.%(ext)s");

    vec![
        "--format".into(),
        selector,
        // Harmless for audio-only downloads: there is nothing to merge.
        "--merge-output-format".into(),
        format.into(),
        "--output".into(),
        output_template,
        "--print".into(),
        "after_move:%(filepath)s".into(),
        "--no-playlist".into(),
        "--no-warnings".into(),
        "--".into(),
        url.into(),
    ]
}

/// Build a yt-dlp format selector for DASH download+merge.
///
/// For video: prefers `bestvideo[height<=H]+bestaudio`, which selects
/// the best DASH video/audio streams up to the requested height and lets
/// yt-dlp merge them via ffmpeg. Falls back to `best[height<=H]` for
/// services that only offer pre-merged streams.
///
/// For audio-only: uses `bestaudio`.
fn build_download_format_selector(quality: &str, format: &str, audio_only: bool) -> String {
    let height: Option<u32> = quality.trim_end_matches('p').parse().ok();
    let has_format = !format.is_empty() && format.chars().all(|c| c.is_ascii_alphanumeric());

    if audio_only {
        if has_format {
            format!("bestaudio[ext={format}]/bestaudio")
        } else {
            "bestaudio".into()
        }
    } else {
        match height {
            Some(h) => format!(
                "bestvideo[height<={h}]+bestaudio/bestvideo[height<={h}]+bestaudio[ext=m4a]/best[height<={h}]"
            ),
            None => "bestvideo+bestaudio/best".into(),
        }
    }
}

/// Parse the final merged file path from yt-dlp stdout.
///
/// With `--print after_move:%(filepath)s`, yt-dlp appends one line to stdout
/// containing the absolute path of the merged output file. We take the last
/// non-empty line to be robust against any incidental output.
pub fn parse_download_path_from_stdout(stdout: &str) -> Result<String, PluginError> {
    stdout
        .lines()
        .rev()
        .map(str::trim)
        .find(|l| !l.is_empty())
        .map(str::to_string)
        .ok_or(PluginError::NoMatchingFormat)
}
```

- [ ] **Step 4: Verify the tests pass**

```bash
cd /home/matvei/projets/vx/vortex-mod-youtube
cargo test extractor 2>&1 | tail -20
```
Expected: all `extractor::tests::download_args_*` and `parse_download_path_*` tests PASS.

- [ ] **Step 5: Lint**

```bash
cd /home/matvei/projets/vx/vortex-mod-youtube
cargo clippy -- -D warnings 2>&1 | tail -20
```
Expected: no warnings.

- [ ] **Step 6: Commit**

```bash
cd /home/matvei/projets/vx/vortex-mod-youtube
git add src/extractor.rs
git commit -m "feat(extractor): add yt_dlp_args_for_download_to_file and parse helpers"
```

---

## Task 2 — Plugin: `download_to_file` WASM function

**Files:**
- Modify: `vortex-mod-youtube/src/plugin_api.rs`

- [ ] **Step 1: Implement `download_to_file` in `plugin_api.rs`**

Add the import at the top of the file (after the existing imports):

```rust
use crate::extractor::{
    build_subprocess_request, parse_subprocess_response,
    yt_dlp_args_for_download_to_file, parse_download_path_from_stdout,
    DEFAULT_DOWNLOAD_TIMEOUT_MS,
};
```

> Note: `DEFAULT_DOWNLOAD_TIMEOUT_MS` is not yet among the existing imports; add only what is missing.

Add the function after `resolve_stream_url` (before `call_yt_dlp`):

```rust
/// Download a video/audio file using yt-dlp's native download+merge pipeline.
///
/// Use this when `resolve_stream_url` returns `AdaptiveStreamOnly` — i.e. when
/// the requested quality is only available as DASH streams that must be merged
/// with ffmpeg. yt-dlp handles the multi-stream download and ffmpeg merge
/// internally; the merged file is written to `output_dir` and its path is
/// returned.
///
/// Input: JSON `{ "url", "quality"?, "format"?, "output_dir", "audio_only"? }`
/// Output: absolute path of the merged file (raw string, not JSON)
#[plugin_fn]
pub fn download_to_file(input: String) -> FnResult<String> {
    #[derive(serde::Deserialize)]
    struct Input {
        url: String,
        #[serde(default)]
        quality: String,
        #[serde(default)]
        format: String,
        output_dir: String,
        #[serde(default)]
        audio_only: bool,
    }

    let params: Input =
        serde_json::from_str(&input).map_err(|e| error_to_fn_error(PluginError::SerdeJson(e)))?;

    ensure_single_video(&params.url).map_err(error_to_fn_error)?;

    let args = yt_dlp_args_for_download_to_file(
        &params.url,
        &params.quality,
        &params.format,
        &params.output_dir,
        params.audio_only,
    );

    // Override the timeout: a full download+merge can take 30+ minutes.
    let req = crate::extractor::SubprocessRequest {
        binary: "yt-dlp".into(),
        args,
        timeout_ms: DEFAULT_DOWNLOAD_TIMEOUT_MS,
    };
    let req_json = serde_json::to_string(&req)
        .map_err(|e| error_to_fn_error(PluginError::SerdeJson(e)))?;

    let resp_json = unsafe { run_subprocess(req_json)? };
    let stdout = parse_subprocess_response(&resp_json).map_err(error_to_fn_error)?;

    parse_download_path_from_stdout(&stdout).map_err(error_to_fn_error)
}
```

> **Note**: `SubprocessRequest` is currently `pub(crate)` in `extractor.rs`. Change its visibility to `pub` so `plugin_api.rs` can use it directly. Alternatively, expose a `build_subprocess_request_with_timeout(args, timeout_ms)` helper.

- [ ] **Step 2: Adjust the visibility of `SubprocessRequest` in `extractor.rs`**

In `vortex-mod-youtube/src/extractor.rs`, line 12, change:
```rust
pub struct SubprocessRequest {
```
(check whether it is already `pub`; if so, there is nothing to do).

If `SubprocessRequest` is `pub(crate)`, make it `pub`.
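For reference, a host-side input for this export, matching the JSON contract documented above, might look like this (all values illustrative):

```json
{
  "url": "https://www.youtube.com/watch?v=dQw4w9WgXcQ",
  "quality": "1080p",
  "format": "mp4",
  "output_dir": "/tmp/vx",
  "audio_only": false
}
```

The plugin's response is the raw merged-file path (e.g. `/tmp/vx/dQw4w9WgXcQ.mp4`), not JSON.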
- [ ] **Step 3: Compile**

```bash
cd /home/matvei/projets/vx/vortex-mod-youtube
cargo build 2>&1 | tail -20
```
Expected: successful build.

- [ ] **Step 4: Lint**

```bash
cd /home/matvei/projets/vx/vortex-mod-youtube
cargo clippy -- -D warnings 2>&1 | tail -20
```
Expected: no warnings.

- [ ] **Step 5: Commit**

```bash
cd /home/matvei/projets/vx/vortex-mod-youtube
git add src/plugin_api.rs src/extractor.rs
git commit -m "feat(plugin): add download_to_file WASM export for DASH+merge support"
```

---

## Task 3 — Plugin: version bump + CHANGELOG

**Files:**
- Modify: `vortex-mod-youtube/plugin.toml`
- Modify: `vortex-mod-youtube/Cargo.toml`
- Modify: `vortex-mod-youtube/CHANGELOG.md` (create if absent)

- [ ] **Step 1: Bump `plugin.toml` to `1.2.0`**

In `plugin.toml`, line 3:
```toml
version = "1.2.0"
```

- [ ] **Step 2: Bump `Cargo.toml` to `1.2.0`**

In `Cargo.toml`, line 3:
```toml
version = "1.2.0"
```

- [ ] **Step 3: Update / create `CHANGELOG.md`**

If `CHANGELOG.md` does not exist, create it with this content.
If it exists, add the `[1.2.0]` section at the top (after the title):

```markdown
## [1.2.0] - 2026-04-16

### Added
- `download_to_file` plugin function: delegates DASH download + ffmpeg merge to
  yt-dlp, enabling true 1080p/1440p/2160p downloads. Called by Vortex core when
  `resolve_stream_url` returns `AdaptiveStreamOnly` (i.e. when YouTube only
  offers the requested quality as separate video+audio DASH streams).

### Changed
- `DEFAULT_DOWNLOAD_TIMEOUT_MS` set to 30 minutes for `download_to_file`
  (vs 60 seconds for `resolve_stream_url`).
```

- [ ] **Step 4: Verify the full build**

```bash
cd /home/matvei/projets/vx/vortex-mod-youtube
cargo test 2>&1 | tail -10
```
Expected: all tests PASS.

- [ ] **Step 5: Commit**

```bash
cd /home/matvei/projets/vx/vortex-mod-youtube
git add plugin.toml Cargo.toml CHANGELOG.md
git commit -m "chore(release): bump to 1.2.0 — add download_to_file"
```

---

## Task 4 — Vortex core: `DomainError::AdaptiveStreamOnly`

**Files:**
- Modify: `vortex/src-tauri/src/domain/error.rs`

- [ ] **Step 1: Write the failing test**

In `domain/error.rs`, add inside `#[cfg(test)] mod tests`:

```rust
#[test]
fn test_display_adaptive_stream_only() {
    let err = DomainError::AdaptiveStreamOnly;
    assert_eq!(
        err.to_string(),
        "Video is only available as adaptive stream (DASH/HLS); use download_to_file"
    );
}
```

- [ ] **Step 2: Verify the test fails**

```bash
cd /home/matvei/projets/vx/vortex
cargo test domain::error -- --nocapture 2>&1 | tail -10
```
Expected: compilation error (missing variant).

- [ ] **Step 3: Add the variant**

In `domain/error.rs`, add to the `DomainError` enum (after `PluginError`):

```rust
AdaptiveStreamOnly,
```

In `impl std::fmt::Display for DomainError`, add to the `match`:

```rust
DomainError::AdaptiveStreamOnly => write!(
    f,
    "Video is only available as adaptive stream (DASH/HLS); use download_to_file"
),
```

- [ ] **Step 4: Verify the test passes**

```bash
cd /home/matvei/projets/vx/vortex
cargo test test_display_adaptive_stream_only 2>&1 | tail -10
```
Expected: PASS.

- [ ] **Step 5: Verify existing code still compiles**

```bash
cd /home/matvei/projets/vx/vortex
cargo build --workspace 2>&1 | grep -E "^error" | head -10
```
Expected: no errors (the compiler will flag any non-exhaustive `match` wherever `DomainError` is matched exhaustively; fix each occurrence).
- [ ] **Step 6: Fix non-exhaustive matches**

If the compiler flags non-exhaustive matches (e.g. in `impl From` blocks), add the arm:
```rust
DomainError::AdaptiveStreamOnly => { /* same handling as PluginError */ }
```

- [ ] **Step 7: Commit**

```bash
cd /home/matvei/projets/vx/vortex
git add src-tauri/src/domain/error.rs
git commit -m "feat(domain): add AdaptiveStreamOnly error variant"
```

---

## Task 5 — Vortex core: `PluginLoader` trait + `DownloadedFileInfo`

**Files:**
- Modify: `vortex/src-tauri/src/domain/ports/driven/plugin_loader.rs`

- [ ] **Step 1: Write the failing test**

Add to `plugin_loader.rs` (or to a dedicated test file such as `domain/ports/driven/tests.rs` if one exists):

```rust
#[cfg(test)]
mod plugin_loader_tests {
    use super::*;
    use crate::domain::error::DomainError;

    struct MinimalLoader;
    impl PluginLoader for MinimalLoader {
        fn load(&self, _: &crate::domain::model::plugin::PluginManifest) -> Result<(), DomainError> { Ok(()) }
        fn unload(&self, _: &str) -> Result<(), DomainError> { Ok(()) }
        fn resolve_url(&self, _: &str) -> Result<Option<crate::domain::model::plugin::PluginInfo>, DomainError> { Ok(None) }
        fn list_loaded(&self) -> Result<Vec<crate::domain::model::plugin::PluginInfo>, DomainError> { Ok(vec![]) }
        fn set_enabled(&self, _: &str, _: bool) -> Result<(), DomainError> { Ok(()) }
    }

    #[test]
    fn test_download_to_file_default_returns_not_found() {
        let loader = MinimalLoader;
        let result = loader.download_to_file("https://youtu.be/x", "1080p", "mp4", "/tmp", false);
        assert!(matches!(result, Err(DomainError::NotFound(_))));
    }
}
```

- [ ] **Step 2: Verify the test fails**

```bash
cd /home/matvei/projets/vx/vortex
cargo test test_download_to_file_default_returns_not_found 2>&1 | tail -10
```
Expected: compilation error (missing method).

- [ ] **Step 3: Add `DownloadedFileInfo` and the `download_to_file` method**

In `plugin_loader.rs`, add before the `PluginLoader` trait:

```rust
/// Result of a `download_to_file` plugin call.
pub struct DownloadedFileInfo {
    /// Absolute path to the merged output file on the host filesystem.
    pub path: std::path::PathBuf,
    /// File size in bytes (obtained from host `std::fs::metadata`).
    pub size: u64,
}
```

Add inside the `PluginLoader` trait (after `resolve_stream_url`):

```rust
/// Download a video/audio file using the plugin's native download+merge
/// pipeline (e.g. yt-dlp DASH). Used as fallback when `resolve_stream_url`
/// returns `AdaptiveStreamOnly`.
///
/// The plugin downloads both streams to `output_dir`, merges them, and
/// returns the absolute path of the merged file. The host then reads the
/// file size via `std::fs::metadata`.
///
/// Returns `Err(DomainError::NotFound)` by default (adapters that do not
/// support this operation should rely on the default).
fn download_to_file(
    &self,
    _url: &str,
    _quality: &str,
    _format: &str,
    _output_dir: &str,
    _audio_only: bool,
) -> Result<DownloadedFileInfo, DomainError> {
    Err(DomainError::NotFound(
        "download_to_file not supported by this loader".into(),
    ))
}
```

- [ ] **Step 4: Verify the test passes**

```bash
cd /home/matvei/projets/vx/vortex
cargo test test_download_to_file_default_returns_not_found 2>&1 | tail -10
```
Expected: PASS.
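With the trait method in place, the caller-side fallback (wired up later in `tauri_ipc.rs`'s `download_media_start`) can be sketched in isolation. This is a self-contained illustration with simplified stand-in types, not the production code:

```rust
use std::path::PathBuf;

// Simplified stand-ins for the real domain types (illustration only).
#[derive(Debug)]
enum DomainError {
    AdaptiveStreamOnly,
    NotFound(String),
}

struct DownloadedFileInfo {
    path: PathBuf,
    size: u64,
}

trait PluginLoader {
    fn resolve_stream_url(&self, url: &str) -> Result<String, DomainError>;
    // Mirrors the plan's default: loaders without download support return NotFound.
    fn download_to_file(&self, _url: &str, _output_dir: &str) -> Result<DownloadedFileInfo, DomainError> {
        Err(DomainError::NotFound("download_to_file not supported".into()))
    }
}

// A loader that only offers DASH streams for this URL.
struct DashOnlyLoader;
impl PluginLoader for DashOnlyLoader {
    fn resolve_stream_url(&self, _url: &str) -> Result<String, DomainError> {
        Err(DomainError::AdaptiveStreamOnly)
    }
    fn download_to_file(&self, _url: &str, output_dir: &str) -> Result<DownloadedFileInfo, DomainError> {
        Ok(DownloadedFileInfo { path: PathBuf::from(format!("{output_dir}/video.mp4")), size: 42 })
    }
}

/// Try the direct-URL path first; fall back to download_to_file only on
/// AdaptiveStreamOnly, propagating every other error unchanged.
fn start(loader: &dyn PluginLoader, url: &str, out: &str) -> Result<String, DomainError> {
    match loader.resolve_stream_url(url) {
        Ok(stream_url) => Ok(stream_url),
        Err(DomainError::AdaptiveStreamOnly) => {
            let info = loader.download_to_file(url, out)?;
            let _ = info.size; // the host records the size on the download entity
            Ok(info.path.display().to_string())
        }
        Err(e) => Err(e),
    }
}

fn main() {
    let path = start(&DashOnlyLoader, "https://youtu.be/abc", "/tmp/vx").unwrap();
    assert_eq!(path, "/tmp/vx/video.mp4");
    println!("{path}");
}
```

The key design point the sketch shows: only `AdaptiveStreamOnly` triggers the fallback; all other errors surface to the caller unchanged.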
- [ ] **Step 5: Commit**

```bash
cd /home/matvei/projets/vx/vortex
git add src-tauri/src/domain/ports/driven/plugin_loader.rs
git commit -m "feat(domain): add DownloadedFileInfo and download_to_file to PluginLoader trait"
```

---

## Task 6 — Vortex core: implement `download_to_file` in `ExtismPluginLoader` + detect `AdaptiveStreamOnly`

**Files:**
- Modify: `vortex/src-tauri/src/adapters/driven/plugin/extism_loader.rs`

- [ ] **Step 1: Write the failing tests**

In the `#[cfg(test)] mod tests` module of `extism_loader.rs`, add:

```rust
#[test]
fn test_resolve_stream_url_maps_adaptive_stream_error() {
    // Verify that a PluginError containing "adaptive stream" in its message
    // gets mapped to DomainError::AdaptiveStreamOnly.
    let msg = "video is only available as an adaptive stream (HLS/DASH) at this quality; try 360p or 480p for a direct download";
    assert!(is_adaptive_stream_error(msg));
}

#[test]
fn test_resolve_stream_url_does_not_map_other_errors() {
    assert!(!is_adaptive_stream_error("no format matches requested quality"));
    assert!(!is_adaptive_stream_error("yt-dlp failed (exit code 1): video unavailable"));
}
```

- [ ] **Step 2: Verify the tests fail**

```bash
cd /home/matvei/projets/vx/vortex
cargo test test_resolve_stream_url_maps_adaptive 2>&1 | tail -10
```
Expected: compilation error.

- [ ] **Step 3: Add the `is_adaptive_stream_error` helper and implement `download_to_file`**

In `extism_loader.rs`, inside the `impl PluginLoader for ExtismPluginLoader` block, add:

```rust
fn resolve_stream_url(
    &self,
    url: &str,
    quality: &str,
    format: &str,
    audio_only: bool,
) -> Result<String, DomainError> {
    // Find the plugin that claims this URL.
    let info = self
        .resolve_url(url)?
        .ok_or_else(|| DomainError::PluginError(format!("no plugin can handle URL: {url}")))?;

    if info.name() == "builtin-http" {
        return Err(DomainError::NotFound("builtin-http".into()));
    }

    let input = serde_json::json!({
        "url": url,
        "quality": quality,
        "format": format,
        "audio_only": audio_only,
    })
    .to_string();

    self.registry
        .call_plugin(info.name(), "resolve_stream_url", &input)
        .map_err(|e| {
            let msg = e.to_string();
            if is_adaptive_stream_error(&msg) {
                DomainError::AdaptiveStreamOnly
            } else {
                DomainError::PluginError(format!(
                    "plugin '{}' resolve_stream_url failed: {msg}",
                    info.name()
                ))
            }
        })
}

fn download_to_file(
    &self,
    url: &str,
    quality: &str,
    format: &str,
    output_dir: &str,
    audio_only: bool,
) -> Result<crate::domain::ports::driven::DownloadedFileInfo, DomainError> {
    let info = self
        .resolve_url(url)?
        .ok_or_else(|| DomainError::PluginError(format!("no plugin can handle URL: {url}")))?;

    if info.name() == "builtin-http" {
        return Err(DomainError::NotFound("builtin-http".into()));
    }

    let input = serde_json::json!({
        "url": url,
        "quality": quality,
        "format": format,
        "output_dir": output_dir,
        "audio_only": audio_only,
    })
    .to_string();

    let path_str = self
        .registry
        .call_plugin(info.name(), "download_to_file", &input)
        .map_err(|e| {
            DomainError::PluginError(format!(
                "plugin '{}' download_to_file failed: {e}",
                info.name()
            ))
        })?;

    let path = std::path::PathBuf::from(path_str.trim());

    // Validate the returned path is within output_dir (path traversal protection).
    let canon_output = std::path::Path::new(output_dir)
        .canonicalize()
        .map_err(|e| DomainError::StorageError(format!("output_dir invalid: {e}")))?;
    let canon_path = path
        .canonicalize()
        .map_err(|e| DomainError::StorageError(format!("returned path invalid: {e}")))?;
    if !canon_path.starts_with(&canon_output) {
        return Err(DomainError::ValidationError(format!(
            "plugin returned path outside output_dir: {}",
            path.display()
        )));
    }

    let size = std::fs::metadata(&canon_path)
        .map(|m| m.len())
        .unwrap_or(0);

    Ok(crate::domain::ports::driven::DownloadedFileInfo {
        path: canon_path,
        size,
    })
}
```

Add the free function (outside the `impl` block):

```rust
/// Returns `true` if the plugin error message indicates an adaptive-only stream.
///
/// Matches the exact error text emitted by `vortex-mod-youtube`'s
/// `PluginError::AdaptiveStreamOnly` variant. Kept as a named function so
/// it can be unit-tested independently of the Extism runtime.
fn is_adaptive_stream_error(msg: &str) -> bool {
    msg.contains("adaptive stream")
}
```

> **Note:** Replace the existing `resolve_stream_url` implementation in `impl PluginLoader for ExtismPluginLoader`; do not duplicate it.

- [ ] **Step 4: Verify the tests pass**

```bash
cd /home/matvei/projets/vx/vortex
cargo test test_resolve_stream_url_maps_adaptive test_resolve_stream_url_does_not_map 2>&1 | tail -10
```
Expected: PASS.

- [ ] **Step 5: Run all workspace tests**

```bash
cd /home/matvei/projets/vx/vortex
cargo test --workspace 2>&1 | tail -20
```
Expected: no FAILED.
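The canonicalize-and-`starts_with` containment check from Step 3 can be exercised on its own. A minimal, self-contained sketch (the temp-dir layout and the `is_within` helper are illustrative, not part of the codebase):

```rust
use std::fs;
use std::path::Path;

/// True when `candidate` resolves to a location inside `dir`, after
/// canonicalize() resolves symlinks and `..` components (both must exist).
fn is_within(dir: &Path, candidate: &Path) -> std::io::Result<bool> {
    Ok(candidate.canonicalize()?.starts_with(dir.canonicalize()?))
}

fn main() -> std::io::Result<()> {
    let root = std::env::temp_dir().join("vx_containment_demo");
    fs::create_dir_all(root.join("out"))?;
    fs::write(root.join("out").join("video.mp4"), b"")?;
    fs::write(root.join("evil.mp4"), b"")?;

    // A file genuinely inside the output dir passes...
    assert!(is_within(&root.join("out"), &root.join("out").join("video.mp4"))?);
    // ...while a `..` escape is rejected even though the target exists.
    assert!(!is_within(&root.join("out"), &root.join("out/../evil.mp4"))?);
    println!("ok");
    Ok(())
}
```

This is why the check happens on the canonicalized paths rather than the raw string the plugin returned: a naive prefix check on the raw path would accept `out/../evil.mp4`.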
- [ ] **Step 6: Commit**

```bash
cd /home/matvei/projets/vx/vortex
git add src-tauri/src/adapters/driven/plugin/extism_loader.rs
git commit -m "feat(plugin): implement download_to_file in ExtismPluginLoader + detect AdaptiveStreamOnly"
```

---

## Task 7 — Vortex core: `RegisterLocalFileCommand` + handler

**Files:**
- Modify: `vortex/src-tauri/src/application/commands/mod.rs`
- Create: `vortex/src-tauri/src/application/commands/register_local_file.rs`

- [ ] **Step 1: Write the failing tests**

Create `vortex/src-tauri/src/application/commands/register_local_file.rs` with the tests first:

```rust
//! Handler for `RegisterLocalFileCommand`.
//!
//! Registers an already-downloaded local file as a Completed download.
//! Used after `download_to_file` produces a merged file via yt-dlp.

use std::path::PathBuf;

use crate::application::command_bus::CommandBus;
use crate::application::error::AppError;
use crate::domain::event::DomainEvent;
use crate::domain::model::download::{Download, DownloadId, Url};

impl CommandBus {
    pub async fn handle_register_local_file(
        &self,
        cmd: super::RegisterLocalFileCommand,
    ) -> Result<DownloadId, AppError> {
        todo!()
    }
}

#[cfg(test)]
mod tests {
    use std::collections::HashMap;
    use std::path::PathBuf;
    use std::sync::Mutex;

    use crate::application::command_bus::CommandBus;
    use crate::application::commands::RegisterLocalFileCommand;
    use crate::domain::error::DomainError;
    use crate::domain::event::DomainEvent;
    use crate::domain::model::config::{AppConfig, ConfigPatch};
    use crate::domain::model::credential::Credential;
    use crate::domain::model::download::{Download, DownloadId, DownloadState};
    use crate::domain::model::http::HttpResponse;
    use crate::domain::model::meta::DownloadMeta;
    use crate::domain::model::plugin::{PluginInfo, PluginManifest};
    use crate::domain::ports::driven::{
        ClipboardObserver, ConfigStore, CredentialStore, DownloadEngine, DownloadRepository,
        EventBus, FileStorage, HttpClient, PluginLoader,
    };
    use std::sync::Arc;

    // ── Minimal mocks (copies from start_download.rs tests) ──────────────────

    struct MockRepo(Mutex<HashMap<u64, Download>>);
    impl MockRepo {
        fn new() -> Self { Self(Mutex::new(HashMap::new())) }
    }
    impl DownloadRepository for MockRepo {
        fn find_by_id(&self, id: DownloadId) -> Result<Option<Download>, DomainError> {
            Ok(self.0.lock().unwrap().get(&id.0).cloned())
        }
        fn save(&self, d: &Download) -> Result<(), DomainError> {
            self.0.lock().unwrap().insert(d.id().0, d.clone()); Ok(())
        }
        fn delete(&self, id: DownloadId) -> Result<(), DomainError> {
            self.0.lock().unwrap().remove(&id.0); Ok(())
        }
        fn find_by_state(&self, s: DownloadState) -> Result<Vec<Download>, DomainError> {
            Ok(self.0.lock().unwrap().values().filter(|d| d.state() == s).cloned().collect())
        }
    }
    struct MockEngine;
    impl DownloadEngine for MockEngine {
        fn start(&self, _: &Download) -> Result<(), DomainError> { Ok(()) }
        fn pause(&self, _: DownloadId) -> Result<(), DomainError> { Ok(()) }
        fn resume(&self, _: DownloadId) -> Result<(), DomainError> { Ok(()) }
        fn cancel(&self, _: DownloadId) -> Result<(), DomainError> { Ok(()) }
    }
    struct MockBus(Mutex<Vec<DomainEvent>>);
    impl MockBus { fn new() -> Self { Self(Mutex::new(vec![])) } }
    impl EventBus for MockBus {
        fn publish(&self, e: DomainEvent) { self.0.lock().unwrap().push(e); }
        fn subscribe(&self, _: Box<dyn Fn(DomainEvent) + Send>) {}
    }
    struct MockHttp;
    impl HttpClient for MockHttp {
        fn head(&self, _: &str) -> Result<HttpResponse, DomainError> { Err(DomainError::NetworkError("no".into())) }
        fn get_range(&self, _: &str, s: u64, e: u64) -> Result<Vec<u8>, DomainError> { Ok(vec![0u8; (e - s + 1) as usize]) }
        fn supports_range(&self, _: &str) -> Result<bool, DomainError> { Ok(false) }
    }
    struct MockFs;
    impl FileStorage for MockFs {
        fn create_file(&self, _: &std::path::Path, _: u64) -> Result<(), DomainError> { Ok(()) }
        fn write_segment(&self, _: &std::path::Path, _: u64, _: &[u8]) -> Result<(), DomainError> { Ok(()) }
        fn read_meta(&self, _: &std::path::Path) -> Result<Option<DownloadMeta>, DomainError> { Ok(None) }
        fn write_meta(&self, _: &std::path::Path, _: &DownloadMeta) -> Result<(), DomainError> { Ok(()) }
        fn delete_meta(&self, _: &std::path::Path) -> Result<(), DomainError> { Ok(()) }
    }
    struct MockPlugin;
    impl PluginLoader for MockPlugin {
        fn load(&self, _: &PluginManifest) -> Result<(), DomainError> { Ok(()) }
        fn unload(&self, _: &str) -> Result<(), DomainError> { Ok(()) }
        fn resolve_url(&self, _: &str) -> Result<Option<PluginInfo>, DomainError> { Ok(None) }
        fn list_loaded(&self) -> Result<Vec<PluginInfo>, DomainError> { Ok(vec![]) }
        fn set_enabled(&self, _: &str, _: bool) -> Result<(), DomainError> { Ok(()) }
    }
    struct MockCfg;
    impl ConfigStore for MockCfg {
        fn get_config(&self) -> Result<AppConfig, DomainError> { Ok(AppConfig::default()) }
        fn update_config(&self, _: ConfigPatch) -> Result<AppConfig, DomainError> { Ok(AppConfig::default()) }
    }
    struct MockCred;
    impl CredentialStore for MockCred {
        fn get(&self, _: &str) -> Result<Option<Credential>, DomainError> { Ok(None) }
        fn store(&self, _: &str, _: &Credential) -> Result<(), DomainError> { Ok(()) }
        fn delete(&self, _: &str) -> Result<(), DomainError> { Ok(()) }
    }
    struct MockClip;
    impl ClipboardObserver for MockClip {
        fn start(&self) -> Result<(), DomainError> { Ok(()) }
        fn stop(&self) -> Result<(), DomainError> { Ok(()) }
        fn get_urls(&self) -> Result<Vec<String>, DomainError> { Ok(vec![]) }
    }
    struct FakeArchive;
    impl crate::domain::ports::driven::ArchiveExtractor for FakeArchive {
        fn detect_format(&self, _: &std::path::Path) -> Result<Option<crate::domain::model::archive::ArchiveFormat>, DomainError> { Ok(None) }
        fn can_extract(&self, _: &std::path::Path) -> Result<bool, DomainError> { Ok(false) }
        fn extract(&self, _: &std::path::Path, _: &std::path::Path, _: Option<&str>) -> Result<crate::domain::model::archive::ExtractSummary, DomainError> {
            Ok(crate::domain::model::archive::ExtractSummary { extracted_files: 0, extracted_bytes: 0, duration_ms: 0, warnings: vec![] })
        }
        fn list_contents(&self, _: &std::path::Path, _: Option<&str>) -> Result<Vec<String>, DomainError> { Ok(vec![]) }
        fn detect_segments(&self, _:
&std::path::Path) -> Result>, DomainError> { Ok(None) } + } + + fn make_bus() -> (CommandBus, Arc, Arc) { + let repo = Arc::new(MockRepo::new()); + let events = Arc::new(MockBus::new()); + let bus = CommandBus::new( + repo.clone(), Arc::new(MockEngine), events.clone(), + Arc::new(MockFs), Arc::new(MockHttp), Arc::new(MockPlugin), + Arc::new(MockCfg), Arc::new(MockCred), Arc::new(MockClip), + Arc::new(FakeArchive), None, + ); + (bus, repo, events) + } + + #[tokio::test] + async fn test_register_local_file_creates_completed_download() { + let (bus, repo, _) = make_bus(); + + let cmd = RegisterLocalFileCommand { + source_url: "https://www.youtube.com/watch?v=dQw4w9WgXcQ".to_string(), + destination_path: PathBuf::from("/tmp/downloads/video.mp4"), + filename: "Rick Astley - Never Gonna Give You Up.mp4".to_string(), + source_hostname: Some("www.youtube.com".to_string()), + file_size: 52_428_800, + }; + + let id = bus.handle_register_local_file(cmd).await.unwrap(); + + let saved = repo.0.lock().unwrap().get(&id.0).cloned().unwrap(); + assert_eq!(saved.state(), DownloadState::Completed); + assert_eq!(saved.file_name(), "Rick Astley - Never Gonna Give You Up.mp4"); + assert_eq!(saved.source_hostname(), "www.youtube.com"); + } + + #[tokio::test] + async fn test_register_local_file_emits_created_and_completed_events() { + let (bus, _, events) = make_bus(); + + let cmd = RegisterLocalFileCommand { + source_url: "https://www.youtube.com/watch?v=dQw4w9WgXcQ".to_string(), + destination_path: PathBuf::from("/tmp/downloads/video.mp4"), + filename: "video.mp4".to_string(), + source_hostname: None, + file_size: 0, + }; + + let id = bus.handle_register_local_file(cmd).await.unwrap(); + + let evs = events.0.lock().unwrap(); + assert!(evs.iter().any(|e| *e == DomainEvent::DownloadCreated { id }), "must emit DownloadCreated"); + assert!(evs.iter().any(|e| *e == DomainEvent::DownloadCompleted { id }), "must emit DownloadCompleted"); + } +} +``` + +- [ ] **Step 2 : Vérifier que les tests 
échouent** + +```bash +cd /home/matvei/projets/vx/vortex +cargo test test_register_local_file 2>&1 | tail -20 +``` +Attendu : erreur de compilation (`RegisterLocalFileCommand` manquant, `todo!()`). + +- [ ] **Step 3 : Ajouter `RegisterLocalFileCommand` dans `commands/mod.rs`** + +Dans `commands/mod.rs`, ajouter : +- `mod register_local_file;` dans la liste des modules (après `mod start_download;`) +- La struct de commande (à la fin du fichier) : + +```rust +#[derive(Debug)] +pub struct RegisterLocalFileCommand { + /// Original source URL (e.g. "https://www.youtube.com/watch?v=...") + /// used to populate the download record's URL field. + pub source_url: String, + /// Absolute path where the merged file has been moved by the caller. + pub destination_path: PathBuf, + /// Final filename (e.g. "Rick Astley - Never Gonna Give You Up.mp4"). + pub filename: String, + /// Origin hostname override (e.g. "www.youtube.com"). + pub source_hostname: Option, + /// File size in bytes. + pub file_size: u64, +} +impl Command for RegisterLocalFileCommand {} +``` + +- [ ] **Step 4 : Implémenter `handle_register_local_file` dans `register_local_file.rs`** + +Remplacer le `todo!()` par l'implémentation : + +```rust +impl CommandBus { + pub async fn handle_register_local_file( + &self, + cmd: super::RegisterLocalFileCommand, + ) -> Result { + let url = Url::new(&cmd.source_url)?; + + let id = next_download_id(); + let dest = cmd.destination_path.to_string_lossy().to_string(); + + let mut download = Download::new(id, url, cmd.filename, dest); + + if let Some(hostname) = cmd.source_hostname { + download = download.with_source_hostname(hostname); + } + if cmd.file_size > 0 { + download.set_file_size(cmd.file_size); + } + + // Transition: Queued → Downloading → Completed + download.start().map_err(|e| AppError::Domain(e))?; + let completed_event = download.complete().map_err(|e| AppError::Domain(e))?; + + self.download_repo().save(&download)?; + 
+        self.event_bus().publish(DomainEvent::DownloadCreated { id });
+        self.event_bus().publish(completed_event);
+
+        Ok(id)
+    }
+}
+```
+
+Add the `next_download_id` function: it is already defined in `start_download.rs` but is private. Options:
+- move it into a shared `commands/id_gen.rs` module, or
+- duplicate it in `register_local_file.rs` (YAGNI: two call sites do not yet justify an abstraction when the files sit in the same module).
+
+**Choose duplication** (two files, same module):
+
+```rust
+use std::sync::atomic::{AtomicU64, Ordering};
+
+static NEXT_LOCAL_SEQ: AtomicU64 = AtomicU64::new(0);
+
+fn next_download_id() -> DownloadId {
+    let seq = NEXT_LOCAL_SEQ.fetch_add(1, Ordering::Relaxed) & 0xFFF;
+    let ts = std::time::SystemTime::now()
+        .duration_since(std::time::UNIX_EPOCH)
+        .unwrap_or_default()
+        .as_millis() as u64;
+    DownloadId((ts << 12) | seq)
+}
+```
+
+> **Note**: If `Download::set_file_size` does not exist yet, add it in `domain/model/download.rs`:
+> ```rust
+> pub fn set_file_size(&mut self, bytes: u64) {
+>     self.file_size = Some(crate::domain::model::download::FileSize(bytes));
+>     self.downloaded_bytes = bytes;
+> }
+> ```
+
+- [ ] **Step 5: Verify the tests pass**
+
+```bash
+cd /home/matvei/projets/vx/vortex
+cargo test test_register_local_file 2>&1 | tail -10
+```
+Expected: 2 tests PASS.
+
+- [ ] **Step 6: All workspace tests**
+
+```bash
+cd /home/matvei/projets/vx/vortex
+cargo test --workspace 2>&1 | grep -E "FAILED|test result" | tail -10
+```
+Expected: 0 FAILED.
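The `next_download_id` scheme above packs a millisecond timestamp into the high bits and a 12-bit wrap-around sequence into the low bits. A standalone sketch (plain `u64` rather than the real `DownloadId` newtype, so the names here are illustrative) shows the intent:

```rust
use std::sync::atomic::{AtomicU64, Ordering};
use std::time::{SystemTime, UNIX_EPOCH};

static NEXT_LOCAL_SEQ: AtomicU64 = AtomicU64::new(0);

// Same layout as the plan's next_download_id: (timestamp_ms << 12) | sequence.
fn next_id() -> u64 {
    let seq = NEXT_LOCAL_SEQ.fetch_add(1, Ordering::Relaxed) & 0xFFF;
    let ts = SystemTime::now()
        .duration_since(UNIX_EPOCH)
        .unwrap_or_default()
        .as_millis() as u64;
    (ts << 12) | seq
}

fn main() {
    let a = next_id();
    let b = next_id();
    // Even within the same millisecond, the sequence bits keep ids distinct.
    assert_ne!(a, b);
    // The low 12 bits carry the sequence counter (0, then 1).
    assert_eq!(a & 0xFFF, 0);
    assert_eq!(b & 0xFFF, 1);
}
```

Collisions would require minting more than 4096 ids in one millisecond, which is why duplicating the function across two files is safe as long as each file keeps its own counter and the ids stay process-local.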
+
+- [ ] **Step 7: Commit**
+
+```bash
+cd /home/matvei/projets/vx/vortex
+git add src-tauri/src/application/commands/mod.rs src-tauri/src/application/commands/register_local_file.rs src-tauri/src/domain/model/download.rs
+git commit -m "feat(commands): add RegisterLocalFileCommand for yt-dlp merged downloads"
+```
+
+---
+
+## Task 8 — Vortex core: `tauri_ipc.rs` — `AdaptiveStreamOnly` fallback
+
+**Files:**
+- Modify: `vortex/src-tauri/src/adapters/driving/tauri_ipc.rs`
+
+- [ ] **Step 1: Modify `download_media_start`**
+
+Replace the `let stream_url = tokio::task::spawn_blocking(...)...await...?;` section and the `let cmd = StartDownloadCommand { ... }` + `state.command_bus.handle_start_download(cmd)...` section
+
+with:
+
+```rust
+// Plugin calls are synchronous (Extism runs inside a Mutex). Run on the
+// blocking thread pool so we don't starve the async executor.
+enum StreamResolution {
+    CdnUrl(String),
+    LocalFile {
+        path: std::path::PathBuf,
+        size: u64,
+        filename: String,
+    },
+}
+
+let plugin_loader = state.plugin_loader.clone();
+let url_clone = url.clone();
+let quality_clone = quality.clone();
+let format_clone = format.clone();
+let title_clone = title.clone();
+
+let resolution = tokio::task::spawn_blocking(move || -> Result<StreamResolution, String> {
+    match plugin_loader.resolve_stream_url(
+        &url_clone,
+        &quality_clone,
+        &format_clone,
+        audio_only,
+    ) {
+        Ok(cdn_url) => Ok(StreamResolution::CdnUrl(cdn_url)),
+
+        Err(crate::domain::error::DomainError::AdaptiveStreamOnly) => {
+            // yt-dlp must handle the full download+merge.
+            let temp_dir = std::env::temp_dir().join("vortex-downloads");
+            std::fs::create_dir_all(&temp_dir).map_err(|e| format!("failed to create temp dir: {e}"))?;
+
+            let file_info = plugin_loader
+                .download_to_file(
+                    &url_clone,
+                    &quality_clone,
+                    &format_clone,
+                    temp_dir.to_str().unwrap_or("/tmp/vortex-downloads"),
+                    audio_only,
+                )
+                .map_err(|e| format!("download_to_file failed: {e}"))?;
+
+            // Determine final filename: prefer title override, else keep yt-dlp's name.
+            let filename = title_clone
+                .as_deref()
+                .filter(|t| !t.trim().is_empty())
+                .map(|t| format!("{}.{}", sanitize_filename(t), format_clone))
+                .unwrap_or_else(|| {
+                    file_info
+                        .path
+                        .file_name()
+                        .and_then(|n| n.to_str())
+                        .unwrap_or("download")
+                        .to_string()
+                });
+
+            // Determine final destination directory.
+            let dest_dir = dirs::download_dir().unwrap_or_else(|| std::path::PathBuf::from("."));
+            let dest_path = dest_dir.join(&filename);
+
+            // Atomic move (same filesystem) → fallback copy+delete.
+            if std::fs::rename(&file_info.path, &dest_path).is_err() {
+                std::fs::copy(&file_info.path, &dest_path)
+                    .map_err(|e| format!("failed to copy merged file: {e}"))?;
+                let _ = std::fs::remove_file(&file_info.path);
+            }
+
+            Ok(StreamResolution::LocalFile {
+                path: dest_path,
+                size: file_info.size,
+                filename,
+            })
+        }
+
+        Err(crate::domain::error::DomainError::NotFound(_)) => {
+            if is_known_media_platform(&url_clone) {
+                Err(
+                    "No media plugin installed for this URL. \
+                     Open the Plugin Store and install the appropriate plugin (e.g. vortex-mod-youtube)."
+                        .to_string(),
+                )
+            } else {
+                Ok(StreamResolution::CdnUrl(url_clone))
+            }
+        }
+
+        Err(e) => Err(format!("Failed to resolve stream URL: {e}")),
+    }
+})
+.await
+.map_err(|e| format!("Task join error: {e}"))??;
+
+match resolution {
+    StreamResolution::CdnUrl(stream_url) => {
+        let filename = title
+            .as_deref()
+            .filter(|t| !t.trim().is_empty())
+            .map(|t| format!("{}.{}", sanitize_filename(t), format));
+
+        let cmd = crate::application::commands::StartDownloadCommand {
+            url: stream_url,
+            destination: None,
+            filename,
+            source_hostname_override,
+        };
+        state
+            .command_bus
+            .handle_start_download(cmd)
+            .await
+            .map(|id| id.0)
+            .map_err(|e| e.to_string())
+    }
+
+    StreamResolution::LocalFile { path, size, filename } => {
+        let cmd = crate::application::commands::RegisterLocalFileCommand {
+            source_url: url,
+            destination_path: path,
+            filename,
+            source_hostname: source_hostname_override,
+            file_size: size,
+        };
+        state
+            .command_bus
+            .handle_register_local_file(cmd)
+            .await
+            .map(|id| id.0)
+            .map_err(|e| e.to_string())
+    }
+}
+```
+
+> **Note**: The `StreamResolution` enum must be declared **inside** the `download_media_start` function (not at module level) so it stays local.
+
+- [ ] **Step 2: Remove the dead code**
+
+Delete the now-unused lines (the old `let stream_url` block, `let filename`, `let cmd = StartDownloadCommand...`, `state.command_bus.handle_start_download(...)`).
+
+- [ ] **Step 3: Compile**
+
+```bash
+cd /home/matvei/projets/vx/vortex
+cargo build --workspace 2>&1 | grep -E "^error" | head -20
+```
+Expected: no compilation errors.
+
+- [ ] **Step 4: All tests**
+
+```bash
+cd /home/matvei/projets/vx/vortex
+cargo test --workspace 2>&1 | grep -E "FAILED|test result" | tail -10
+```
+Expected: 0 FAILED.
+
+- [ ] **Step 5: Lint**
+
+```bash
+cd /home/matvei/projets/vx/vortex
+cargo clippy --workspace -- -D warnings 2>&1 | grep "^error" | head -10
+```
+Expected: 0 errors.
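The rename-then-copy move in the fallback branch above can be read as a tiny helper on its own. This sketch (the `move_file` name is illustrative, not code from the plan) uses only the standard library:

```rust
use std::fs;
use std::io;
use std::path::Path;

/// Move `src` to `dest`: try an atomic rename first (works only within
/// one filesystem), then fall back to copy + best-effort delete.
fn move_file(src: &Path, dest: &Path) -> io::Result<()> {
    if fs::rename(src, dest).is_ok() {
        return Ok(());
    }
    fs::copy(src, dest)?;
    // Best-effort cleanup of the source; ignore failure, as the plan does.
    let _ = fs::remove_file(src);
    Ok(())
}

fn main() -> io::Result<()> {
    let dir = std::env::temp_dir();
    let src = dir.join("vortex-move-demo-src.bin");
    let dest = dir.join("vortex-move-demo-dest.bin");
    fs::write(&src, b"merged")?;
    move_file(&src, &dest)?;
    assert!(!src.exists());
    assert_eq!(fs::read(&dest)?, b"merged");
    let _ = fs::remove_file(&dest);
    Ok(())
}
```

`fs::rename` is atomic only within a single filesystem; the temp dir and the user's downloads folder are often on different mounts, which is why the copy + delete fallback is needed.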
+
+- [ ] **Step 6: Update `vortex/CHANGELOG.md`**
+
+In the `[Unreleased]` section:
+
+```markdown
+### Added
+- YouTube 1080p+ support: when `resolve_stream_url` returns `AdaptiveStreamOnly`,
+  `download_media_start` now falls back to `download_to_file` which delegates the
+  full DASH download + ffmpeg merge to yt-dlp. The merged file is moved to the
+  downloads folder and registered as a completed download.
+
+### Fixed
+- YouTube downloads silently downgrading to 360p when 1080p was requested but only
+  DASH streams were available.
+```
+
+- [ ] **Step 7: Commit**
+
+```bash
+cd /home/matvei/projets/vx/vortex
+git add src-tauri/src/adapters/driving/tauri_ipc.rs CHANGELOG.md
+git commit -m "feat(download): fallback to download_to_file when AdaptiveStreamOnly (fixes 1080p YouTube)"
+```
+
+---
+
+## Task 9 — Build the plugin, release, update the registry
+
+- [ ] **Step 1: Build the WASM**
+
+```bash
+cd /home/matvei/projets/vx/vortex-mod-youtube
+cargo build --target wasm32-wasip1 --release 2>&1 | tail -10
+```
+Expected: `Finished release [optimized] target(s)`.
+
+- [ ] **Step 2: Compute the checksums**
+
+```bash
+cd /home/matvei/projets/vx/vortex-mod-youtube
+sha256sum target/wasm32-wasip1/release/vortex_mod_youtube.wasm
+sha256sum plugin.toml
+```
+Copy both SHA-256 hashes for the next step.
+
+- [ ] **Step 3: Tag + push the plugin**
+
+```bash
+cd /home/matvei/projets/vx/vortex-mod-youtube
+git tag -a v1.2.0 -m "Release v1.2.0 — add download_to_file for 1080p DASH support"
+git push && git push --tags
+```
+
+- [ ] **Step 4: Create the GitHub Release**
+
+```bash
+cd /home/matvei/projets/vx/vortex-mod-youtube
+gh release create v1.2.0 \
+  target/wasm32-wasip1/release/vortex_mod_youtube.wasm \
+  plugin.toml \
+  --title "v1.2.0 — 1080p DASH support" \
+  --notes "## What's new
+
+### Added
+- \`download_to_file\` plugin function: delegates DASH download + ffmpeg merge to
+  yt-dlp, enabling true 1080p/1440p/2160p downloads from YouTube.
+
+### Fixed
+- Downloads silently downgrading to 360p when 1080p was requested but only DASH
+  streams were available.
+
+## Checksums
+See \`plugin.toml\` included in this release for SHA-256 verification."
+```
+
+- [ ] **Step 5: Update `vortex/registry/registry.toml`**
+
+Replace the existing `vortex-mod-youtube` entry with:
+
+```toml
+[[plugin]]
+name = "vortex-mod-youtube"
+description = "YouTube video/playlist/shorts/channel downloader via yt-dlp"
+author = "vortex-community"
+version = "1.2.0"
+category = "crawler"
+repository = "https://github.com/mpiton/vortex-mod-youtube"
+checksum_sha256 = "<WASM_SHA256>"
+checksum_sha256_toml = "<TOML_SHA256>"
+official = true
+min_vortex_version = "0.1.0"
+```
+
+Replace `<WASM_SHA256>` and `<TOML_SHA256>` with the values computed in Step 2.
+
+- [ ] **Step 6: Commit the registry**
+
+```bash
+cd /home/matvei/projets/vx/vortex
+git add registry/registry.toml
+git commit -m "chore(registry): bump vortex-mod-youtube to 1.2.0"
+```
+
+---
+
+## Self-Review
+
+### Spec coverage
+
+| Requirement | Task |
+|----------|-------|
+| Plugin `download_to_file` WASM export | Tasks 1, 2 |
+| yt-dlp DASH + ffmpeg merge | Task 1 (`yt_dlp_args_for_download_to_file`) |
+| `DomainError::AdaptiveStreamOnly` | Task 4 |
+| `PluginLoader::download_to_file` trait | Task 5 |
+| `ExtismPluginLoader` implementation | Task 6 |
+| `AdaptiveStreamOnly` IPC fallback | Task 8 |
+| `RegisterLocalFileCommand` + handler | Task 7 |
+| Plugin v1.2.0 release | Task 9 |
+| Registry updated | Task 9 |
+
+### Consistency checks
+- `DownloadedFileInfo` defined in Task 5, used in Tasks 6 and 8 ✓
+- `RegisterLocalFileCommand` defined in Task 7 (mod.rs), used in Task 8 ✓
+- `DomainError::AdaptiveStreamOnly` defined in Task 4, detected in Task 6, matched in Task 8 ✓
+- `parse_download_path_from_stdout` defined in Task 1, imported in Task 2 ✓
+- `DEFAULT_DOWNLOAD_TIMEOUT_MS` defined in Task 1, used in Task 2 ✓
diff --git
a/docs/superpowers/plans/2026-04-17-clear-completed-and-failed-downloads.md b/docs/superpowers/plans/2026-04-17-clear-completed-and-failed-downloads.md new file mode 100644 index 0000000..193285a --- /dev/null +++ b/docs/superpowers/plans/2026-04-17-clear-completed-and-failed-downloads.md @@ -0,0 +1,1477 @@ +# Clear Completed & Failed Downloads — Implementation Plan + +> **For agentic workers:** REQUIRED SUB-SKILL: Use superpowers:subagent-driven-development (recommended) or superpowers:executing-plans to implement this plan task-by-task. Steps use checkbox (`- [ ]`) syntax for tracking. + +**Goal:** Add two bulk-clear buttons (completed & failed) to the Downloads toolbar with an optional "also delete files from disk" checkbox gated by a prominent red warning panel. Provide success/error toasts. + +**Architecture:** One parameterised domain Command (`ClearDownloadsByStateCommand`) handled on `CommandBus`. Two thin Tauri IPC handlers restricting the state to `Completed` or `Error`. Domain-level guard rejects any non-terminal state. Frontend adds a reusable `ClearDownloadsDialog` and two buttons in `ActionsBar`. Sonner is installed once and wrapped by `src/lib/toast.ts` to avoid leaking the library everywhere. + +**Tech Stack:** Rust (Tauri 2, tokio, tracing, thiserror), React 19, TypeScript, TanStack Query v5, Zustand, Tailwind 4, shadcn/ui (Radix Dialog + Checkbox + Separator + Button), sonner, Vitest + Testing Library, react-i18next. 
+ +**Spec:** `docs/superpowers/specs/2026-04-17-clear-completed-and-failed-downloads-design.md` + +--- + +## File structure + +### Create + +- `src-tauri/src/application/commands/clear_downloads_by_state.rs` — handler + unit tests +- `src/views/DownloadsView/ClearDownloadsDialog.tsx` — reusable confirmation dialog +- `src/views/DownloadsView/__tests__/ClearDownloadsDialog.test.tsx` +- `src/lib/toast.ts` — sonner wrapper +- `src/views/DownloadsView/__tests__/ActionsBar.test.tsx` — upgraded (new suites added) + +### Modify + +- `src-tauri/src/application/commands/mod.rs` — declare new module + `ClearDownloadsByStateCommand` struct +- `src-tauri/src/adapters/driving/tauri_ipc.rs` — add `download_clear_completed` + `download_clear_failed` +- `src-tauri/src/lib.rs` — export new IPC handlers + register in `invoke_handler![...]` +- `src/views/DownloadsView/ActionsBar.tsx` — add buttons + separator + dialog state + toast calls +- `src/i18n/locales/en.json` + `src/i18n/locales/fr.json` — new keys under `downloads.actions.*` and `downloads.toast.*` +- `src/App.tsx` — mount `` +- `package.json` / `package-lock.json` — add `sonner` +- `CHANGELOG.md` — `[Unreleased] > Added` entry + +--- + +## Task 1: Add `ClearDownloadsByStateCommand` struct + +**Files:** +- Modify: `src-tauri/src/application/commands/mod.rs` + +- [ ] **Step 1: Add the command struct** + +Insert between `RemoveDownloadCommand` (line 114-118) and `ResolveLinksCommand`: + +```rust +#[derive(Debug)] +pub struct ClearDownloadsByStateCommand { + pub state: crate::domain::model::download::DownloadState, + pub delete_files: bool, +} +impl Command for ClearDownloadsByStateCommand {} +``` + +Also add the module declaration near the top (after `mod remove_download;`): + +```rust +mod clear_downloads_by_state; +``` + +- [ ] **Step 2: Compile-check** + +Run: `cargo check -p vortex-core 2>&1 | tail -20` +Expected: compile error `unresolved module 'clear_downloads_by_state'` (we declare it next). 
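The `impl Command for … {}` line above relies on a marker trait with no methods; stripped of the real Vortex types, the pattern looks like this (all names here are illustrative, not the actual crate API):

```rust
use std::fmt::Debug;

/// Marker trait: tags a plain struct as a CQRS command.
trait Command: Debug {}

#[derive(Debug)]
struct ClearDownloadsByStateCommand {
    state: &'static str,
    delete_files: bool,
}
impl Command for ClearDownloadsByStateCommand {}

/// Generic entry points can now require the marker bound,
/// so only declared commands flow through the bus.
fn log_command<C: Command>(cmd: &C) -> String {
    format!("{cmd:?}")
}

fn main() {
    let cmd = ClearDownloadsByStateCommand { state: "Completed", delete_files: false };
    assert!(log_command(&cmd).contains("ClearDownloadsByStateCommand"));
}
```

The empty impl costs nothing at runtime; it only lets the type system distinguish "commands" from arbitrary structs.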
+
+- [ ] **Step 3: Commit**
+
+```bash
+git add src-tauri/src/application/commands/mod.rs
+git commit -m "feat(download): declare ClearDownloadsByStateCommand
+
+Adds the CQRS command struct for the upcoming bulk clear handler. The module
+file is created in task 2."
+```
+
+---
+
+## Task 2: RED — failing test for the "completed" happy path
+
+**Files:**
+- Create: `src-tauri/src/application/commands/clear_downloads_by_state.rs`
+
+- [ ] **Step 1: Scaffold the file with test-only content**
+
+Create the file with:
+
+```rust
+// Handler lives here. Test module sits at the bottom.
+
+#[cfg(test)]
+mod tests {
+    use std::collections::HashMap;
+    use std::path::Path;
+    use std::sync::{Arc, Mutex};
+
+    use crate::application::command_bus::CommandBus;
+    use crate::application::commands::ClearDownloadsByStateCommand;
+    use crate::application::error::AppError;
+    use crate::domain::error::DomainError;
+    use crate::domain::event::DomainEvent;
+    use crate::domain::model::config::{AppConfig, ConfigPatch};
+    use crate::domain::model::credential::Credential;
+    use crate::domain::model::download::{Download, DownloadId, DownloadState, Url};
+    use crate::domain::model::http::HttpResponse;
+    use crate::domain::model::meta::DownloadMeta;
+    use crate::domain::model::plugin::{PluginInfo, PluginManifest};
+    use crate::domain::ports::driven::{
+        ArchiveExtractor, ClipboardObserver, ConfigStore, CredentialStore, DownloadEngine,
+        DownloadRepository, EventBus, FileStorage, HttpClient, PluginLoader,
+    };
+
+    // ---------- Mocks (copied from remove_download.rs — kept inline to stay
+    // consistent with the existing test style in this crate) ----------
+
+    struct MockDownloadRepo {
+        store: Mutex<HashMap<u64, Download>>,
+    }
+    impl MockDownloadRepo {
+        fn new() -> Self { Self { store: Mutex::new(HashMap::new()) } }
+        fn with(self, dl: Download) -> Self {
+            self.store.lock().unwrap().insert(dl.id().0, dl);
+            self
+        }
+    }
+    impl DownloadRepository for MockDownloadRepo {
+        fn find_by_id(&self, id: DownloadId) -> Result<Option<Download>, DomainError> {
+            Ok(self.store.lock().unwrap().get(&id.0).cloned())
+        }
+        fn save(&self, d: &Download) -> Result<(), DomainError> {
+            self.store.lock().unwrap().insert(d.id().0, d.clone()); Ok(())
+        }
+        fn delete(&self, id: DownloadId) -> Result<(), DomainError> {
+            self.store.lock().unwrap().remove(&id.0); Ok(())
+        }
+        fn find_by_state(&self, s: DownloadState) -> Result<Vec<Download>, DomainError> {
+            Ok(self.store.lock().unwrap().values().filter(|d| d.state() == s).cloned().collect())
+        }
+    }
+
+    struct MockDownloadEngine;
+    impl DownloadEngine for MockDownloadEngine {
+        fn start(&self, _: &Download) -> Result<(), DomainError> { Ok(()) }
+        fn pause(&self, _: DownloadId) -> Result<(), DomainError> { Ok(()) }
+        fn resume(&self, _: DownloadId) -> Result<(), DomainError> { Ok(()) }
+        fn cancel(&self, _: DownloadId) -> Result<(), DomainError> { Ok(()) }
+    }
+
+    struct MockEventBus { events: Mutex<Vec<DomainEvent>> }
+    impl MockEventBus { fn new() -> Self { Self { events: Mutex::new(Vec::new()) } } }
+    impl EventBus for MockEventBus {
+        fn publish(&self, e: DomainEvent) { self.events.lock().unwrap().push(e); }
+        fn subscribe(&self, _: Box<dyn Fn(DomainEvent) + Send + Sync>) {}
+    }
+
+    struct MockFileStorage { deleted_metas: Mutex<Vec<String>> }
+    impl MockFileStorage { fn new() -> Self { Self { deleted_metas: Mutex::new(Vec::new()) } } }
+    impl FileStorage for MockFileStorage {
+        fn create_file(&self, _: &Path, _: u64) -> Result<(), DomainError> { Ok(()) }
+        fn write_segment(&self, _: &Path, _: u64, _: &[u8]) -> Result<(), DomainError> { Ok(()) }
+        fn read_meta(&self, _: &Path) -> Result<Option<DownloadMeta>, DomainError> { Ok(None) }
+        fn write_meta(&self, _: &Path, _: &DownloadMeta) -> Result<(), DomainError> { Ok(()) }
+        fn delete_meta(&self, p: &Path) -> Result<(), DomainError> {
+            self.deleted_metas.lock().unwrap().push(p.to_string_lossy().into_owned()); Ok(())
+        }
+    }
+
+    struct MockHttpClient;
+    impl HttpClient for MockHttpClient {
+        fn head(&self, _: &str) -> Result<HttpResponse, DomainError> {
+            Ok(HttpResponse { status_code: 200, headers: HashMap::new(), body: vec![] })
+        }
+        fn get_range(&self, _: &str, _: u64, _: u64) -> Result<Vec<u8>, DomainError> { Ok(vec![]) }
+        fn supports_range(&self, _: &str) -> Result<bool, DomainError> { Ok(true) }
+    }
+
+    struct MockPluginLoader;
+    impl PluginLoader for MockPluginLoader {
+        fn load(&self, _: &PluginManifest) -> Result<(), DomainError> { Ok(()) }
+        fn unload(&self, _: &str) -> Result<(), DomainError> { Ok(()) }
+        fn resolve_url(&self, _: &str) -> Result<Option<String>, DomainError> { Ok(None) }
+        fn list_loaded(&self) -> Result<Vec<PluginInfo>, DomainError> { Ok(vec![]) }
+        fn set_enabled(&self, _: &str, _: bool) -> Result<(), DomainError> { Ok(()) }
+    }
+
+    struct MockConfigStore;
+    impl ConfigStore for MockConfigStore {
+        fn get_config(&self) -> Result<AppConfig, DomainError> { Ok(AppConfig::default()) }
+        fn update_config(&self, _: ConfigPatch) -> Result<AppConfig, DomainError> { Ok(AppConfig::default()) }
+    }
+
+    struct MockCredentialStore;
+    impl CredentialStore for MockCredentialStore {
+        fn get(&self, _: &str) -> Result<Option<Credential>, DomainError> { Ok(None) }
+        fn store(&self, _: &str, _: &Credential) -> Result<(), DomainError> { Ok(()) }
+        fn delete(&self, _: &str) -> Result<(), DomainError> { Ok(()) }
+    }
+
+    struct MockClipboardObserver;
+    impl ClipboardObserver for MockClipboardObserver {
+        fn start(&self) -> Result<(), DomainError> { Ok(()) }
+        fn stop(&self) -> Result<(), DomainError> { Ok(()) }
+        fn get_urls(&self) -> Result<Vec<String>, DomainError> { Ok(vec![]) }
+    }
+
+    struct FakeArchiveExtractor;
+    impl ArchiveExtractor for FakeArchiveExtractor {
+        fn detect_format(&self, _: &Path) -> Result<Option<String>, DomainError> { Ok(None) }
+        fn can_extract(&self, _: &Path) -> Result<bool, DomainError> { Ok(false) }
+        fn extract(&self, _: &Path, _: &Path, _: Option<&str>) -> Result<crate::domain::model::archive::ExtractSummary, DomainError> {
+            Ok(crate::domain::model::archive::ExtractSummary { extracted_files: 0, extracted_bytes: 0, duration_ms: 0, warnings: vec![] })
+        }
+        fn list_contents(&self, _: &Path, _: Option<&str>) -> Result<Vec<String>, DomainError> { Ok(vec![]) }
+        fn detect_segments(&self, _: &Path) -> Result<Option<Vec<String>>, DomainError> { Ok(None) }
+    }
+
+    // ---------- Fixture helpers ----------
+
+    fn completed_download(id: u64, path: &str) -> Download {
+        let mut d = Download::new(
+            DownloadId(id),
+            Url::new("http://example.com/f.zip").unwrap(),
+            format!("f{id}.zip"),
+            path.to_string(),
+        );
+        d.start().unwrap();
+        // Drive the state machine to Completed. The domain exposes `mark_completed`
+        // (see domain/model/download.rs — confirm the exact method name when running
+        // the test; if the API differs, adjust here).
+        d.mark_completed().unwrap();
+        d
+    }
+
+    fn errored_download(id: u64, path: &str) -> Download {
+        let mut d = Download::new(
+            DownloadId(id),
+            Url::new("http://example.com/f.zip").unwrap(),
+            format!("f{id}.zip"),
+            path.to_string(),
+        );
+        d.start().unwrap();
+        d.mark_error("boom".to_string()).unwrap();
+        d
+    }
+
+    struct TestHarness {
+        bus: CommandBus,
+        event_bus: Arc<MockEventBus>,
+        file_storage: Arc<MockFileStorage>,
+    }
+
+    fn make_harness(repo: MockDownloadRepo) -> TestHarness {
+        let event_bus = Arc::new(MockEventBus::new());
+        let file_storage = Arc::new(MockFileStorage::new());
+        let bus = CommandBus::new(
+            Arc::new(repo),
+            Arc::new(MockDownloadEngine),
+            event_bus.clone(),
+            file_storage.clone(),
+            Arc::new(MockHttpClient),
+            Arc::new(MockPluginLoader),
+            Arc::new(MockConfigStore),
+            Arc::new(MockCredentialStore),
+            Arc::new(MockClipboardObserver),
+            Arc::new(FakeArchiveExtractor),
+            None,
+        );
+        TestHarness { bus, event_bus, file_storage }
+    }
+
+    // ---------- Tests ----------
+
+    #[tokio::test]
+    async fn test_clear_completed_returns_count_and_deletes_from_db() {
+        let repo = MockDownloadRepo::new()
+            .with(completed_download(1, "/tmp/a.zip"))
+            .with(completed_download(2, "/tmp/b.zip"));
+        let h = make_harness(repo);
+
+        let cmd = ClearDownloadsByStateCommand {
+            state: DownloadState::Completed,
+            delete_files: false,
+        };
+        let count = h.bus.handle_clear_downloads_by_state(cmd).await.unwrap();
+
+        assert_eq!(count, 2);
+        assert!(h.bus.download_repo().find_by_id(DownloadId(1)).unwrap().is_none());
assert!(h.bus.download_repo().find_by_id(DownloadId(2)).unwrap().is_none()); + } +} +``` + +**Important caveat:** the helpers `mark_completed()` and `mark_error("boom")` must match the real API of `Download`. If the names differ (e.g. `complete()`, `fail(msg)`, or state transitions via a different method), adjust *only these two lines* by reading `src-tauri/src/domain/model/download.rs`. Do not invent a new API. + +- [ ] **Step 2: Run the test, confirm it fails** + +Run: `cargo test -p vortex-core test_clear_completed_returns_count_and_deletes_from_db 2>&1 | tail -30` +Expected: FAIL — the method `handle_clear_downloads_by_state` does not exist on `CommandBus`. + +- [ ] **Step 3: Do NOT commit yet. Continue to Task 3.** + +--- + +## Task 3: GREEN — minimal happy-path implementation + +**Files:** +- Modify: `src-tauri/src/application/commands/clear_downloads_by_state.rs` + +- [ ] **Step 1: Add the handler on top of the test module** + +Insert at the top of the file (before `#[cfg(test)] mod tests`): + +```rust +use std::path::Path; + +use crate::application::command_bus::CommandBus; +use crate::application::error::AppError; +use crate::domain::event::DomainEvent; +use crate::domain::model::download::DownloadState; + +impl CommandBus { + pub async fn handle_clear_downloads_by_state( + &self, + cmd: super::ClearDownloadsByStateCommand, + ) -> Result { + if !matches!(cmd.state, DownloadState::Completed | DownloadState::Error) { + return Err(AppError::Validation( + "state must be Completed or Error".into(), + )); + } + + let downloads = self.download_repo().find_by_state(cmd.state)?; + let mut count: u32 = 0; + + for download in downloads { + if cmd.delete_files { + let dest = Path::new(download.destination_path()); + if dest.exists() { + if let Err(e) = std::fs::remove_file(dest) { + tracing::warn!( + path = %dest.display(), + error = %e, + "failed to delete download file" + ); + } + } + let meta_path = format!("{}.vortex-meta", download.destination_path()); + if let 
Err(e) = self.file_storage().delete_meta(Path::new(&meta_path)) { + tracing::warn!( + path = %meta_path, + error = %e, + "failed to delete .vortex-meta sidecar" + ); + } + } + + if let Err(e) = self.download_repo().delete(download.id()) { + tracing::error!( + id = download.id().0, + error = %e, + "failed to delete download from repository" + ); + continue; + } + + self.event_bus() + .publish(DomainEvent::DownloadRemoved { id: download.id() }); + count += 1; + } + + Ok(count) + } +} +``` + +- [ ] **Step 2: Run the test, confirm it passes** + +Run: `cargo test -p vortex-core test_clear_completed_returns_count_and_deletes_from_db 2>&1 | tail -20` +Expected: PASS. + +- [ ] **Step 3: Full workspace tests still green** + +Run: `cargo test --workspace 2>&1 | tail -5` +Expected: all pass. + +- [ ] **Step 4: Commit** + +```bash +git add src-tauri/src/application/commands/clear_downloads_by_state.rs +git commit -m "feat(download): add handle_clear_downloads_by_state + +Bulk-clears downloads in terminal states (Completed, Error) with optional +on-disk file deletion. Emits one DownloadRemoved event per successfully +cleared download. Non-terminal states are rejected with AppError::Validation. +Filesystem failures are logged and ignored (best-effort)." 
+``` + +--- + +## Task 4: RED → GREEN — error state + validation + event emission + idempotence + +**Files:** +- Modify: `src-tauri/src/application/commands/clear_downloads_by_state.rs` + +Append these tests to the `tests` module: + +- [ ] **Step 1: Write the extra test suite** + +```rust +#[tokio::test] +async fn test_clear_failed_returns_count() { + let repo = MockDownloadRepo::new() + .with(errored_download(1, "/tmp/a.zip")) + .with(completed_download(2, "/tmp/b.zip")); + let h = make_harness(repo); + + let cmd = ClearDownloadsByStateCommand { + state: DownloadState::Error, + delete_files: false, + }; + let count = h.bus.handle_clear_downloads_by_state(cmd).await.unwrap(); + + assert_eq!(count, 1); + // The completed one must remain untouched. + assert!(h.bus.download_repo().find_by_id(DownloadId(2)).unwrap().is_some()); +} + +#[tokio::test] +async fn test_clear_non_terminal_state_returns_validation_error() { + let h = make_harness(MockDownloadRepo::new()); + let cmd = ClearDownloadsByStateCommand { + state: DownloadState::Downloading, + delete_files: false, + }; + let err = h.bus.handle_clear_downloads_by_state(cmd).await.unwrap_err(); + assert!(matches!(err, AppError::Validation(_))); +} + +#[tokio::test] +async fn test_clear_emits_one_removed_event_per_cleared_download() { + let repo = MockDownloadRepo::new() + .with(completed_download(1, "/tmp/a.zip")) + .with(completed_download(2, "/tmp/b.zip")); + let h = make_harness(repo); + + let cmd = ClearDownloadsByStateCommand { + state: DownloadState::Completed, + delete_files: false, + }; + h.bus.handle_clear_downloads_by_state(cmd).await.unwrap(); + + let events = h.event_bus.events.lock().unwrap(); + let removed: Vec<_> = events + .iter() + .filter_map(|e| match e { + DomainEvent::DownloadRemoved { id } => Some(*id), + _ => None, + }) + .collect(); + assert_eq!(removed.len(), 2); + assert!(removed.contains(&DownloadId(1))); + assert!(removed.contains(&DownloadId(2))); +} + +#[tokio::test] +async fn 
test_clear_with_delete_files_calls_filestorage_delete_meta() { + let repo = MockDownloadRepo::new().with(completed_download(1, "/tmp/a.zip")); + let h = make_harness(repo); + + let cmd = ClearDownloadsByStateCommand { + state: DownloadState::Completed, + delete_files: true, + }; + h.bus.handle_clear_downloads_by_state(cmd).await.unwrap(); + + let metas = h.file_storage.deleted_metas.lock().unwrap(); + assert_eq!(metas.len(), 1); + assert_eq!(metas[0], "/tmp/a.zip.vortex-meta"); +} + +#[tokio::test] +async fn test_clear_without_delete_files_skips_filestorage() { + let repo = MockDownloadRepo::new().with(completed_download(1, "/tmp/a.zip")); + let h = make_harness(repo); + + let cmd = ClearDownloadsByStateCommand { + state: DownloadState::Completed, + delete_files: false, + }; + h.bus.handle_clear_downloads_by_state(cmd).await.unwrap(); + + assert!(h.file_storage.deleted_metas.lock().unwrap().is_empty()); +} + +#[tokio::test] +async fn test_clear_missing_file_is_idempotent() { + // Path that surely does not exist on disk. + let repo = MockDownloadRepo::new() + .with(completed_download(1, "/nonexistent/definitely/not/here.zip")); + let h = make_harness(repo); + + let cmd = ClearDownloadsByStateCommand { + state: DownloadState::Completed, + delete_files: true, + }; + let count = h.bus.handle_clear_downloads_by_state(cmd).await.unwrap(); + assert_eq!(count, 1); +} + +#[tokio::test] +async fn test_clear_empty_returns_zero() { + let h = make_harness(MockDownloadRepo::new()); + let cmd = ClearDownloadsByStateCommand { + state: DownloadState::Completed, + delete_files: true, + }; + let count = h.bus.handle_clear_downloads_by_state(cmd).await.unwrap(); + assert_eq!(count, 0); + assert!(h.event_bus.events.lock().unwrap().is_empty()); +} +``` + +- [ ] **Step 2: Run all the handler's tests** + +Run: `cargo test -p vortex-core clear_downloads_by_state 2>&1 | tail -20` +Expected: all 7 tests PASS. 
+ +- [ ] **Step 3: Full workspace** + +Run: `cargo test --workspace 2>&1 | tail -5` +Expected: all pass. + +- [ ] **Step 4: Commit** + +```bash +git add src-tauri/src/application/commands/clear_downloads_by_state.rs +git commit -m "test(download): exhaustive tests for clear-by-state handler + +Covers: Error state, validation guard against non-terminal states, event +emission, file deletion gating, idempotence on missing files, and empty +result." +``` + +--- + +## Task 5: Tauri IPC handlers + +**Files:** +- Modify: `src-tauri/src/adapters/driving/tauri_ipc.rs` +- Modify: `src-tauri/src/lib.rs` + +- [ ] **Step 1: Add the two IPC handlers** + +Open `tauri_ipc.rs`, find the `download_remove` handler (line 138-153). Immediately after it, insert: + +```rust +#[tauri::command] +pub async fn download_clear_completed( + state: State<'_, AppState>, + delete_files: bool, +) -> Result { + let cmd = ClearDownloadsByStateCommand { + state: crate::domain::model::download::DownloadState::Completed, + delete_files, + }; + state + .command_bus + .handle_clear_downloads_by_state(cmd) + .await + .map_err(|e| e.to_string()) +} + +#[tauri::command] +pub async fn download_clear_failed( + state: State<'_, AppState>, + delete_files: bool, +) -> Result { + let cmd = ClearDownloadsByStateCommand { + state: crate::domain::model::download::DownloadState::Error, + delete_files, + }; + state + .command_bus + .handle_clear_downloads_by_state(cmd) + .await + .map_err(|e| e.to_string()) +} +``` + +Locate the imports at the top of `tauri_ipc.rs` (anywhere `RemoveDownloadCommand` is brought in) and add `ClearDownloadsByStateCommand` to the `use` list, e.g.: + +```rust +use crate::application::commands::{ + // ...existing imports... + ClearDownloadsByStateCommand, + // ... 
+}; +``` + +- [ ] **Step 2: Export the new handlers and register them in the Tauri builder** + +In `src-tauri/src/lib.rs` (line 55-63) update the `pub use` statement to add `download_clear_completed, download_clear_failed` to the exported identifiers (keep alphabetical order): + +```rust +pub use adapters::driving::tauri_ipc::{ + self, AppState, clipboard_state, clipboard_toggle, command_get_media_metadata, download_cancel, + download_clear_completed, download_clear_failed, + download_count_by_state, download_detail, download_list, download_logs, download_media_start, + download_pause, download_pause_all, download_remove, download_resume, download_resume_all, + download_retry, download_set_priority, download_start, link_resolve, plugin_disable, + plugin_enable, plugin_install, plugin_list, plugin_store_install, plugin_store_list, + plugin_store_refresh, plugin_store_update, plugin_uninstall, settings_get, settings_update, + status_bar_get, +}; +``` + +Then find `.invoke_handler(tauri::generate_handler![...])` (around line 270-300 — search for `download_pause_all,`) and add `download_clear_completed, download_clear_failed,` to the list (again keep alphabetical / section grouping with existing `download_*` entries). + +- [ ] **Step 3: Compile check** + +Run: `cargo check --workspace 2>&1 | tail -10` +Expected: no errors, no warnings. + +- [ ] **Step 4: Run full test suite** + +Run: `cargo test --workspace 2>&1 | tail -5` +Expected: all pass. + +- [ ] **Step 5: Commit** + +```bash +git add src-tauri/src/adapters/driving/tauri_ipc.rs src-tauri/src/lib.rs +git commit -m "feat(download): expose clear-completed and clear-failed IPC + +Two distinct Tauri commands restrict the target state to Completed or Error +respectively. Non-terminal states cannot be reached from the frontend." 
+```
+
+---
+
+## Task 6: Install sonner and mount the Toaster
+
+**Files:**
+- Modify: `package.json` (via `npm install`)
+- Create: `src/lib/toast.ts`
+- Modify: `src/App.tsx`
+
+- [ ] **Step 1: Install sonner**
+
+Run: `npm install sonner`
+Expected: `package.json` `dependencies.sonner` present; `package-lock.json` updated.
+
+- [ ] **Step 2: Create the toast wrapper**
+
+Create `src/lib/toast.ts`:
+
+```ts
+import { toast as sonnerToast } from 'sonner';
+
+export const toast = {
+  success: (message: string) => sonnerToast.success(message),
+  error: (message: string) => sonnerToast.error(message),
+};
+```
+
+- [ ] **Step 3: Mount the Toaster**
+
+Edit `src/App.tsx`. Import sonner's Toaster at the top:
+
+```tsx
+import { Toaster } from 'sonner';
+```
+
+Place the `<Toaster />` component once at the top level of the app tree, inside the outermost provider and outside the routes, e.g. just before the provider's closing tag. The wrapper names below are illustrative; adjust to the current `App.tsx` structure:
+
+```tsx
+<AppProviders>
+  <Routes>
+    {/* ...routes... */}
+  </Routes>
+  <Toaster />
+</AppProviders>
+```
+
+- [ ] **Step 4: Verify build**
+
+Run: `npm run build 2>&1 | tail -10`
+Expected: success.
+
+- [ ] **Step 5: Commit**
+
+```bash
+git add package.json package-lock.json src/lib/toast.ts src/App.tsx
+git commit -m "feat(ui): add sonner toast infrastructure
+
+Installs sonner (~5 kB gzipped), mounts the Toaster in App, and adds a thin
+src/lib/toast wrapper so components depend on our abstraction rather than
+the sonner API directly."
+```
+
+---
+
+## Task 7: i18n keys
+
+**Files:**
+- Modify: `src/i18n/locales/en.json`
+- Modify: `src/i18n/locales/fr.json`
+
+- [ ] **Step 1: Edit `en.json`**
+
+Find the `"downloads"` block, locate the `"actions"` object at line 133-137 and extend it. Also append a sibling `"toast"` object and a `"clearDialog"` object.
+ +Replace the current `"actions"` block and the following lines with: + +```json +"actions": { + "pauseAll": "Pause All", + "resumeAll": "Resume All", + "cancelSelected": "Cancel Selected", + "clearCompleted": "Clear completed", + "clearFailed": "Clear failed" +}, +"clearDialog": { + "titleCompleted_one": "Clear {{count}} completed download?", + "titleCompleted_other": "Clear {{count}} completed downloads?", + "titleFailed_one": "Clear {{count}} failed download?", + "titleFailed_other": "Clear {{count}} failed downloads?", + "description": "This removes the download entries from Vortex. They will no longer appear in the list.", + "deleteFilesLabel": "Also delete files from disk", + "warningTitle": "Permanent deletion", + "warningBody": "Files will be removed from your disk. This action cannot be undone.", + "confirm": "Clear", + "confirmWithFiles": "Clear and delete files", + "cancel": "Cancel" +}, +"toast": { + "clearedCompleted_one": "{{count}} completed download cleared", + "clearedCompleted_other": "{{count}} completed downloads cleared", + "clearedFailed_one": "{{count}} failed download cleared", + "clearedFailed_other": "{{count}} failed downloads cleared", + "clearError": "Failed to clear downloads: {{error}}" +}, +``` + +- [ ] **Step 2: Edit `fr.json`** + +Apply the symmetric French translations. Example values: + +```json +"actions": { + "pauseAll": "Tout mettre en pause", + "resumeAll": "Tout reprendre", + "cancelSelected": "Annuler la sélection", + "clearCompleted": "Effacer terminés", + "clearFailed": "Effacer en erreur" +}, +"clearDialog": { + "titleCompleted_one": "Effacer {{count}} téléchargement terminé ?", + "titleCompleted_other": "Effacer {{count}} téléchargements terminés ?", + "titleFailed_one": "Effacer {{count}} téléchargement en erreur ?", + "titleFailed_other": "Effacer {{count}} téléchargements en erreur ?", + "description": "Les entrées seront retirées de Vortex. 
Elles n'apparaîtront plus dans la liste.", + "deleteFilesLabel": "Également supprimer les fichiers du disque", + "warningTitle": "Suppression définitive", + "warningBody": "Les fichiers seront supprimés de votre disque. Cette action est irréversible.", + "confirm": "Effacer", + "confirmWithFiles": "Effacer et supprimer les fichiers", + "cancel": "Annuler" +}, +"toast": { + "clearedCompleted_one": "{{count}} téléchargement terminé effacé", + "clearedCompleted_other": "{{count}} téléchargements terminés effacés", + "clearedFailed_one": "{{count}} téléchargement en erreur effacé", + "clearedFailed_other": "{{count}} téléchargements en erreur effacés", + "clearError": "Échec de l'effacement des téléchargements : {{error}}" +}, +``` + +**Preserve any pre-existing keys in `fr.json`** that are not listed above (e.g., the original `pauseAll` French value if it already existed). Only the three blocks listed here are affected. + +- [ ] **Step 3: Validate JSON** + +Run: `node -e "require('./src/i18n/locales/en.json'); require('./src/i18n/locales/fr.json'); console.log('ok')"` +Expected: `ok`. 
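The `_one` / `_other` suffixes in both locale files follow i18next's CLDR-based plural convention: at lookup time the library appends the plural category that `Intl.PluralRules` reports for `count`. A minimal sketch of that resolution (simplified; the real library also handles language fallbacks, nesting, and interpolation escaping):

```typescript
// Simplified model of i18next (v21+) plural-key resolution.
const resources: Record<string, string> = {
  'downloads.toast.clearedCompleted_one': '{{count}} completed download cleared',
  'downloads.toast.clearedCompleted_other': '{{count}} completed downloads cleared',
};

function t(key: string, opts: { count: number }): string {
  // In English, Intl.PluralRules reports "one" for 1 and "other" for the rest.
  const category = new Intl.PluralRules('en').select(opts.count);
  const template =
    resources[`${key}_${category}`] ?? resources[`${key}_other`] ?? key;
  return template.replace('{{count}}', String(opts.count));
}

console.log(t('downloads.toast.clearedCompleted', { count: 1 })); // "1 completed download cleared"
console.log(t('downloads.toast.clearedCompleted', { count: 3 })); // "3 completed downloads cleared"
```

French uses the same suffixes but its own CLDR rules: in `fr`, `count: 0` also selects `_one`, which is why the singular French strings must read naturally for zero as well.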
+
+- [ ] **Step 4: Commit**
+
+```bash
+git add src/i18n/locales/en.json src/i18n/locales/fr.json
+git commit -m "i18n(download): add keys for clear completed/failed dialog and toasts"
+```
+
+---
+
+## Task 8: `ClearDownloadsDialog` component — RED
+
+**Files:**
+- Create: `src/views/DownloadsView/__tests__/ClearDownloadsDialog.test.tsx`
+
+- [ ] **Step 1: Write the failing tests**
+
+```tsx
+import { describe, it, expect, vi, beforeEach } from 'vitest';
+import { render, screen } from '@testing-library/react';
+import userEvent from '@testing-library/user-event';
+import { I18nextProvider } from 'react-i18next';
+import i18n from '@/i18n/i18n';
+import { ClearDownloadsDialog } from '@/views/DownloadsView/ClearDownloadsDialog';
+
+function renderDialog(
+  overrides: Partial<Parameters<typeof ClearDownloadsDialog>[0]> = {},
+) {
+  const props = {
+    open: true,
+    onOpenChange: vi.fn(),
+    targetState: 'completed' as const,
+    count: 3,
+    onConfirm: vi.fn().mockResolvedValue(undefined),
+    ...overrides,
+  };
+  render(
+    <I18nextProvider i18n={i18n}>
+      <ClearDownloadsDialog {...props} />
+    </I18nextProvider>,
+  );
+  return props;
+}
+
+describe('ClearDownloadsDialog', () => {
+  beforeEach(() => vi.clearAllMocks());
+
+  it('renders the completed title with the provided count', () => {
+    renderDialog({ targetState: 'completed', count: 3 });
+    expect(screen.getByText(/Clear 3 completed downloads\?/i)).toBeInTheDocument();
+  });
+
+  it('renders the failed title when targetState is error', () => {
+    renderDialog({ targetState: 'error', count: 2 });
+    expect(screen.getByText(/Clear 2 failed downloads\?/i)).toBeInTheDocument();
+  });
+
+  it('does not show the warning panel by default', () => {
+    renderDialog();
+    expect(screen.queryByText(/Permanent deletion/i)).not.toBeInTheDocument();
+  });
+
+  it('reveals the warning panel when the checkbox is checked', async () => {
+    const user = userEvent.setup();
+    renderDialog();
+    await user.click(screen.getByRole('checkbox', { name: /also delete files from disk/i }));
+    expect(screen.getByText(/Permanent deletion/i)).toBeInTheDocument();
+  });
+
+  it('primary button label switches when the checkbox is checked', async () => {
+    const user = userEvent.setup();
+    renderDialog();
+    expect(screen.getByRole('button', { name: /^clear$/i })).toBeInTheDocument();
+    await user.click(screen.getByRole('checkbox', { name: /also delete files from disk/i }));
+    expect(
+      screen.getByRole('button', { name: /clear and delete files/i }),
+    ).toBeInTheDocument();
+  });
+
+  it('calls onConfirm with deleteFiles:false when the box is not checked', async () => {
+    const user = userEvent.setup();
+    const props = renderDialog();
+    await user.click(screen.getByRole('button', { name: /^clear$/i }));
+    expect(props.onConfirm).toHaveBeenCalledWith(false);
+  });
+
+  it('calls onConfirm with deleteFiles:true when the box is checked', async () => {
+    const user = userEvent.setup();
+    const props = renderDialog();
+    await user.click(screen.getByRole('checkbox', { name: /also delete files from disk/i }));
+    await user.click(screen.getByRole('button', { name: /clear and delete files/i }));
+    expect(props.onConfirm).toHaveBeenCalledWith(true);
+  });
+
+  it('calls onOpenChange(false) when cancel is clicked', async () => {
+    const user = userEvent.setup();
+    const props = renderDialog();
+    await user.click(screen.getByRole('button', { name: /cancel/i }));
+    expect(props.onOpenChange).toHaveBeenCalledWith(false);
+  });
+});
+```
+
+- [ ] **Step 2: Run, confirm RED**
+
+Run: `npx vitest run src/views/DownloadsView/__tests__/ClearDownloadsDialog.test.tsx 2>&1 | tail -15`
+Expected: FAIL — module not found (`ClearDownloadsDialog`).
+
+---
+
+## Task 9: `ClearDownloadsDialog` component — GREEN
+
+**Files:**
+- Create: `src/views/DownloadsView/ClearDownloadsDialog.tsx`
+
+- [ ] **Step 1: Write the component**
+
+```tsx
+import { useEffect, useState } from 'react';
+import { AlertTriangle } from 'lucide-react';
+import { useTranslation } from 'react-i18next';
+import {
+  Dialog,
+  DialogContent,
+  DialogFooter,
+  DialogHeader,
+  DialogTitle,
+  DialogDescription,
+} from '@/components/ui/dialog';
+import { Button } from '@/components/ui/button';
+import { Checkbox } from '@/components/ui/checkbox';
+
+export type ClearDownloadsTarget = 'completed' | 'error';
+
+interface Props {
+  open: boolean;
+  onOpenChange: (open: boolean) => void;
+  targetState: ClearDownloadsTarget;
+  count: number;
+  onConfirm: (deleteFiles: boolean) => Promise<void> | void;
+}
+
+export function ClearDownloadsDialog({
+  open,
+  onOpenChange,
+  targetState,
+  count,
+  onConfirm,
+}: Props) {
+  const { t } = useTranslation();
+  const [deleteFiles, setDeleteFiles] = useState(false);
+  const [submitting, setSubmitting] = useState(false);
+
+  // Reset checkbox every time the dialog opens so the destructive option is
+  // never pre-selected.
+  useEffect(() => {
+    if (open) setDeleteFiles(false);
+  }, [open]);
+
+  const titleKey =
+    targetState === 'completed'
+      ? 'downloads.clearDialog.titleCompleted'
+      : 'downloads.clearDialog.titleFailed';
+
+  const confirmLabel = deleteFiles
+    ? t('downloads.clearDialog.confirmWithFiles')
+    : t('downloads.clearDialog.confirm');
+
+  const handleConfirm = async () => {
+    if (submitting) return;
+    setSubmitting(true);
+    try {
+      await onConfirm(deleteFiles);
+      onOpenChange(false);
+    } finally {
+      setSubmitting(false);
+    }
+  };
+
+  return (
+    <Dialog open={open} onOpenChange={onOpenChange}>
+      <DialogContent>
+        <DialogHeader>
+          <DialogTitle>{t(titleKey, { count })}</DialogTitle>
+          <DialogDescription>
+            {t('downloads.clearDialog.description')}
+          </DialogDescription>
+        </DialogHeader>
+
+        <div className="flex items-center gap-2">
+          <Checkbox
+            id="clear-delete-files"
+            checked={deleteFiles}
+            onCheckedChange={(v) => setDeleteFiles(v === true)}
+          />
+          <label htmlFor="clear-delete-files" className="text-sm">
+            {t('downloads.clearDialog.deleteFilesLabel')}
+          </label>
+        </div>
+
+        {deleteFiles && (
+          <div className="flex items-start gap-2 rounded-md border border-destructive bg-destructive/10 p-3 text-destructive">
+            <AlertTriangle className="mt-0.5 h-4 w-4 shrink-0" />
+            <div>
+              <p className="font-medium">
+                {t('downloads.clearDialog.warningTitle')}
+              </p>
+              <p className="text-sm">{t('downloads.clearDialog.warningBody')}</p>
+            </div>
+          </div>
+        )}
+
+        <DialogFooter>
+          <Button
+            variant="outline"
+            disabled={submitting}
+            onClick={() => onOpenChange(false)}
+          >
+            {t('downloads.clearDialog.cancel')}
+          </Button>
+          <Button
+            variant={deleteFiles ? 'destructive' : 'default'}
+            disabled={submitting}
+            onClick={handleConfirm}
+          >
+            {confirmLabel}
+          </Button>
+        </DialogFooter>
+      </DialogContent>
+    </Dialog>
+  );
+}
+```
+
+- [ ] **Step 2: Run the dialog tests**
+
+Run: `npx vitest run src/views/DownloadsView/__tests__/ClearDownloadsDialog.test.tsx 2>&1 | tail -15`
+Expected: all 8 tests PASS.
+
+- [ ] **Step 3: Commit**
+
+```bash
+git add src/views/DownloadsView/ClearDownloadsDialog.tsx \
+  src/views/DownloadsView/__tests__/ClearDownloadsDialog.test.tsx
+git commit -m "feat(downloads): add ClearDownloadsDialog component
+
+Reusable confirmation dialog for bulk clearing. Optional 'also delete files'
+checkbox toggles a prominent red warning panel and switches the primary
+button to the destructive variant with a matching label."
+```
+
+---
+
+## Task 10: `ActionsBar` integration — RED
+
+**Files:**
+- Create: `src/views/DownloadsView/__tests__/ActionsBar.test.tsx`
+
+- [ ] **Step 1: Inspect the existing ActionsBar test (if any)**
+
+Run: `ls src/views/DownloadsView/__tests__/ 2>&1`
+If `ActionsBar.test.tsx` already exists, open it and append the new suites below rather than creating a new file.
+
+- [ ] **Step 2: Write the tests**
+
+```tsx
+import { describe, it, expect, vi, beforeEach } from 'vitest';
+import { render, screen, waitFor } from '@testing-library/react';
+import userEvent from '@testing-library/user-event';
+import { QueryClient, QueryClientProvider } from '@tanstack/react-query';
+import { I18nextProvider } from 'react-i18next';
+import i18n from '@/i18n/i18n';
+import { ActionsBar } from '@/views/DownloadsView/ActionsBar';
+
+const invokeMock = vi.fn();
+vi.mock('@tauri-apps/api/core', () => ({
+  invoke: (...args: unknown[]) => invokeMock(...args),
+}));
+
+const toastMock = { success: vi.fn(), error: vi.fn() };
+vi.mock('@/lib/toast', () => ({ toast: toastMock }));
+
+function wrap(ui: React.ReactElement) {
+  const qc = new QueryClient({ defaultOptions: { queries: { retry: false } } });
+  return (
+    <QueryClientProvider client={qc}>
+      <I18nextProvider i18n={i18n}>{ui}</I18nextProvider>
+    </QueryClientProvider>
+  );
+}
+
+// The view pre-seeds the count-by-state query; we do the same here.
+function seedCounts(qc: QueryClient, counts: Record<string, number>) {
+  qc.setQueryData(['downloads', 'countByState'], counts);
+}
+
+describe('ActionsBar — clear completed/failed', () => {
+  beforeEach(() => {
+    invokeMock.mockReset();
+    toastMock.success.mockReset();
+    toastMock.error.mockReset();
+  });
+
+  it('disables "Clear completed" when Completed count is 0', () => {
+    const qc = new QueryClient({ defaultOptions: { queries: { retry: false } } });
+    seedCounts(qc, { Completed: 0, Error: 3 });
+    render(
+      <QueryClientProvider client={qc}>
+        <I18nextProvider i18n={i18n}>
+          <ActionsBar />
+        </I18nextProvider>
+      </QueryClientProvider>,
+    );
+    expect(screen.getByRole('button', { name: /clear completed/i })).toBeDisabled();
+  });
+
+  it('disables "Clear failed" when Error count is 0', () => {
+    const qc = new QueryClient({ defaultOptions: { queries: { retry: false } } });
+    seedCounts(qc, { Completed: 1, Error: 0 });
+    render(
+      <QueryClientProvider client={qc}>
+        <I18nextProvider i18n={i18n}>
+          <ActionsBar />
+        </I18nextProvider>
+      </QueryClientProvider>,
+    );
+    expect(screen.getByRole('button', { name: /clear failed/i })).toBeDisabled();
+  });
+
+  it('invokes download_clear_completed with deleteFiles:false and shows success toast', async () => {
+    invokeMock.mockResolvedValueOnce(3);
+    const user = userEvent.setup();
+    const qc = new QueryClient({ defaultOptions: { queries: { retry: false } } });
+    seedCounts(qc, { Completed: 3, Error: 0 });
+
+    render(
+      <QueryClientProvider client={qc}>
+        <I18nextProvider i18n={i18n}>
+          <ActionsBar />
+        </I18nextProvider>
+      </QueryClientProvider>,
+    );
+    await user.click(screen.getByRole('button', { name: /clear completed/i }));
+    await user.click(await screen.findByRole('button', { name: /^clear$/i }));
+
+    await waitFor(() => {
+      expect(invokeMock).toHaveBeenCalledWith('download_clear_completed', {
+        deleteFiles: false,
+      });
+    });
+    await waitFor(() => {
+      expect(toastMock.success).toHaveBeenCalledWith(
+        expect.stringContaining('3'),
+      );
+    });
+  });
+
+  it('invokes download_clear_failed with deleteFiles:true when checkbox checked', async () => {
+    invokeMock.mockResolvedValueOnce(2);
+    const user = userEvent.setup();
+    const qc = new QueryClient({ defaultOptions: { queries: { retry: false } } });
+    seedCounts(qc, { Completed: 0, Error: 2 });
+
+    render(
+      <QueryClientProvider client={qc}>
+        <I18nextProvider i18n={i18n}>
+          <ActionsBar />
+        </I18nextProvider>
+      </QueryClientProvider>,
+    );
+    await user.click(screen.getByRole('button', { name: /clear failed/i }));
+    await user.click(await screen.findByRole('checkbox', { name: /also delete files from disk/i }));
+    await user.click(screen.getByRole('button', { name: /clear and delete files/i }));
+
+    await waitFor(() => {
+      expect(invokeMock).toHaveBeenCalledWith('download_clear_failed', {
+        deleteFiles: true,
+      });
+    });
+  });
+
+  it('shows error toast when the mutation rejects', async () => {
+    invokeMock.mockRejectedValueOnce(new Error('boom'));
+    const user = userEvent.setup();
+    const qc = new QueryClient({ defaultOptions: { queries: { retry: false } } });
+    seedCounts(qc, { Completed: 1, Error: 0 });
+
+    render(
+      <QueryClientProvider client={qc}>
+        <I18nextProvider i18n={i18n}>
+          <ActionsBar />
+        </I18nextProvider>
+      </QueryClientProvider>,
+    );
+    await user.click(screen.getByRole('button', { name: /clear completed/i }));
+    await user.click(await screen.findByRole('button', { name: /^clear$/i }));
+
+    await waitFor(() => {
+      expect(toastMock.error).toHaveBeenCalledWith(
+        expect.stringContaining('boom'),
+      );
+    });
+  });
+});
+```
+
+- [ ] **Step 3: Run, confirm RED**
+
+Run: `npx vitest run src/views/DownloadsView/__tests__/ActionsBar.test.tsx 2>&1 | tail -20`
+Expected: FAIL — buttons not rendered / `@/lib/toast` not used by ActionsBar yet.
+
+---
+
+## Task 11: `ActionsBar` integration — GREEN
+
+**Files:**
+- Modify: `src/views/DownloadsView/ActionsBar.tsx`
+
+- [ ] **Step 1: Rewrite `ActionsBar.tsx`**
+
+Replace the file with:
+
+```tsx
+import { useRef, useState } from 'react';
+import { CheckCheck, Pause, Play, X, XCircle } from 'lucide-react';
+import { useTranslation } from 'react-i18next';
+import { useQueryClient } from '@tanstack/react-query';
+import { Button } from '@/components/ui/button';
+import { Separator } from '@/components/ui/separator';
+import { useTauriMutation } from '@/api/hooks';
+import { downloadQueries } from '@/api/queries';
+import { useUiStore } from '@/stores/uiStore';
+import { toast } from '@/lib/toast';
+import {
+  ClearDownloadsDialog,
+  type ClearDownloadsTarget,
+} from './ClearDownloadsDialog';
+
+const INVALIDATE_KEYS = [
+  downloadQueries.lists(),
+  downloadQueries.countByState(),
+] as const;
+
+export function ActionsBar() {
+  const { t } = useTranslation();
+  const queryClient = useQueryClient();
+  const selectedDownloadIds = useUiStore((s) => s.selectedDownloadIds);
+  const setSelectedDownloadIds = useUiStore((s) => s.setSelectedDownloadIds);
+  const clearSelection = useUiStore((s) => s.clearSelection);
+
+  const pauseAll = useTauriMutation('download_pause_all', {
+    invalidateKeys: INVALIDATE_KEYS,
+  });
+
+  const resumeAll = useTauriMutation('download_resume_all', {
+    invalidateKeys: INVALIDATE_KEYS,
+  });
+
+  const cancelDownload = useTauriMutation('download_cancel', {
+    invalidateKeys: INVALIDATE_KEYS,
+  });
+
+  const clearCompleted = useTauriMutation<number>(
+    'download_clear_completed',
+    {
+      invalidateKeys: INVALIDATE_KEYS,
+      onSuccess: (count) => {
+        toast.success(t('downloads.toast.clearedCompleted', { count }));
+      },
+      onError: (err) => {
+        toast.error(t('downloads.toast.clearError', { error: err.message }));
+      },
+    },
+  );
+
+  const clearFailed = useTauriMutation<number>(
+    'download_clear_failed',
+    {
+      invalidateKeys: INVALIDATE_KEYS,
+      onSuccess: (count) => {
+        toast.success(t('downloads.toast.clearedFailed', { count }));
+      },
+      onError: (err) => {
+        toast.error(t('downloads.toast.clearError', { error: err.message }));
+      },
+    },
+  );
+
+  const cancellingRef = useRef(false);
+  const handleCancelSelected = async () => {
+    if (cancellingRef.current) return;
+    cancellingRef.current = true;
+    const snapshot = [...selectedDownloadIds];
+    try {
+      const results = await Promise.allSettled(
+        snapshot.map((id) => cancelDownload.mutateAsync({ id: Number(id) })),
+      );
+      const failedIds = snapshot.filter((_, i) => results[i].status === 'rejected');
+      const currentIds = useUiStore.getState().selectedDownloadIds;
+      const unchanged =
+        currentIds.length === snapshot.length
+        && currentIds.every((id, i) => id === snapshot[i]);
+      if (unchanged) {
+        if (failedIds.length === 0) clearSelection();
+        else setSelectedDownloadIds(failedIds);
+      }
+    } finally {
+      cancellingRef.current = false;
+    }
+  };
+
+  const hasSelection = selectedDownloadIds.length > 0;
+
+  // Counts. We read the cache directly so the bar is fully reactive even
+  // when the DownloadsView does not explicitly pass counts down.
+  const counts =
+    queryClient.getQueryData<Record<string, number>>(
+      downloadQueries.countByState(),
+    ) ?? {};
+  const completedCount = counts.Completed ?? 0;
+  const errorCount = counts.Error ?? 0;
+
+  const [dialogTarget, setDialogTarget] =
+    useState<ClearDownloadsTarget | null>(null);
+  const dialogOpen = dialogTarget !== null;
+  const dialogCount = dialogTarget === 'completed' ? completedCount : errorCount;
+
+  const handleDialogConfirm = async (deleteFiles: boolean) => {
+    if (dialogTarget === 'completed') {
+      await clearCompleted.mutateAsync({ deleteFiles });
+    } else if (dialogTarget === 'error') {
+      await clearFailed.mutateAsync({ deleteFiles });
+    }
+  };
+
+  return (
+    <div className="flex items-center gap-2">
+      {hasSelection ? (
+        <>
+          <span className="text-sm text-muted-foreground">
+            {t('downloads.selectedCount', { count: selectedDownloadIds.length })}
+          </span>
+          <Button variant="outline" size="sm" onClick={handleCancelSelected}>
+            <XCircle />
+            {t('downloads.actions.cancelSelected')}
+          </Button>
+        </>
+      ) : (
+        <>
+          <Button variant="outline" size="sm" onClick={() => pauseAll.mutate({})}>
+            <Pause />
+            {t('downloads.actions.pauseAll')}
+          </Button>
+          <Button variant="outline" size="sm" onClick={() => resumeAll.mutate({})}>
+            <Play />
+            {t('downloads.actions.resumeAll')}
+          </Button>
+          <Separator orientation="vertical" className="h-6" />
+          <Button
+            variant="outline"
+            size="sm"
+            disabled={completedCount === 0}
+            onClick={() => setDialogTarget('completed')}
+          >
+            <CheckCheck />
+            {t('downloads.actions.clearCompleted')}
+          </Button>
+          <Button
+            variant="outline"
+            size="sm"
+            disabled={errorCount === 0}
+            onClick={() => setDialogTarget('error')}
+          >
+            <X />
+            {t('downloads.actions.clearFailed')}
+          </Button>
+        </>
+      )}
+
+      {dialogTarget !== null && (
+        <ClearDownloadsDialog
+          open={dialogOpen}
+          onOpenChange={(o) => !o && setDialogTarget(null)}
+          targetState={dialogTarget}
+          count={dialogCount}
+          onConfirm={handleDialogConfirm}
+        />
+      )}
+    </div>
+ ); +} +``` + +- [ ] **Step 2: Run the ActionsBar tests** + +Run: `npx vitest run src/views/DownloadsView/__tests__/ActionsBar.test.tsx 2>&1 | tail -20` +Expected: all 5 new tests PASS. + +- [ ] **Step 3: Run full frontend suite** + +Run: `npx vitest run 2>&1 | tail -5` +Expected: all pass. + +- [ ] **Step 4: Lint and type-check** + +Run: `npx oxlint . 2>&1 | tail -10` +Run: `npx tsc --noEmit 2>&1 | tail -10` +Expected: clean. + +- [ ] **Step 5: Commit** + +```bash +git add src/views/DownloadsView/ActionsBar.tsx \ + src/views/DownloadsView/__tests__/ActionsBar.test.tsx +git commit -m "feat(downloads): wire Clear completed/failed buttons in ActionsBar + +Adds two new toolbar buttons separated from bulk actions by a vertical +Separator. Each opens the ClearDownloadsDialog and fires the corresponding +Tauri mutation, followed by a success or error toast." +``` + +--- + +## Task 12: CHANGELOG update + +**Files:** +- Modify: `CHANGELOG.md` + +- [ ] **Step 1: Edit** + +Under `[Unreleased] > Added`, append: + +```markdown +- Clear completed and clear failed downloads from the Downloads toolbar, with + an optional "also delete files from disk" confirmation guarded by a + prominent warning. Each action reports its outcome via a toast. +- Sonner-based toast notifications (new library dependency). +``` + +- [ ] **Step 2: Commit** + +```bash +git add CHANGELOG.md +git commit -m "docs: changelog entry for clear completed/failed downloads" +``` + +--- + +## Task 13: Manual smoke test (not scripted) + +- [ ] **Step 1:** Run `npm run tauri dev`. In a second shell, trigger real downloads so some finish and some fail (for example, one legit URL + one 404). + +- [ ] **Step 2:** Click **Clear completed** without checking the box → confirm toast, list shrinks, files still on disk. + +- [ ] **Step 3:** Click **Clear failed** with the box checked → confirm warning appears red, confirm destructive variant, click confirm → files gone from disk. 
+
+- [ ] **Step 4:** Trigger an artificial error (e.g. flip the DB into read-only mode for a second, or forge a mutation rejection via `tauri-pilot ipc`) → confirm the error toast shows the backend message.
+
+- [ ] **Step 5:** Record findings in the PR description.
+
+---
+
+## Task 14: Adversarial review
+
+- [ ] **Step 1:** Dispatch four agents **in parallel**:
+  - `rust-reviewer` — focus on the new command, guard, and idempotence.
+  - `typescript-reviewer` — focus on `ClearDownloadsDialog` and `ActionsBar`, hooks usage, invalidation correctness.
+  - `security-reviewer` — focus on the file-deletion path: can anything outside the intended file be removed? Is the IPC boundary safe?
+  - `code-reviewer` — general quality, over-engineering, unused code.
+
+- [ ] **Step 2:** Fix any finding that rates "must fix" in its own commit (`fix(...): address <reviewer>: <finding>`).
+
+- [ ] **Step 3:** Re-run `cargo test --workspace` and `npx vitest run` after fixes.
+
+---
+
+## Task 15: Optional — PR and follow-up issue
+
+- [ ] **Step 1:** `/git-create-pr` (or `gh pr create`) with title `feat(downloads): clear completed and failed downloads` and a body summarising the change and linking the spec.
+
+- [ ] **Step 2:** `/issue-create` (or `gh issue create`) — "Migrate all success/error feedback app-wide to sonner toasts". Mention that this feature seeded the toast infrastructure and the rest of the app should follow.
+
+---
+
+## Self-review notes
+
+- Every spec section has at least one task covering it (dialog, button+separator, i18n, sonner, adversarial review, follow-up issue).
+- No TBDs. All code blocks are complete and self-contained.
+- Method signatures are consistent: `handle_clear_downloads_by_state(cmd) -> Result<u32, AppError>` is used identically across the handler file, the IPC file, and the frontend expectation (`Promise<number>`).
+- Types match across the boundary: `{ deleteFiles: boolean }` (camelCase) in TS, `delete_files: bool` in Rust (serde auto-derives camelCase on the `#[tauri::command]` parameter, per Tauri defaults). +- The `mark_completed` / `mark_error` helpers used in tests are flagged as "verify-API-before-use" so the implementer adjusts them if the domain exposes a different name. diff --git a/package-lock.json b/package-lock.json index 11be2f8..d58d834 100644 --- a/package-lock.json +++ b/package-lock.json @@ -1,12 +1,12 @@ { "name": "vortex", - "version": "0.0.0", + "version": "0.2.0", "lockfileVersion": 3, "requires": true, "packages": { "": { "name": "vortex", - "version": "0.0.0", + "version": "0.2.0", "dependencies": { "@radix-ui/react-slot": "^1.2.4", "@tanstack/react-query": "^5.97.0", @@ -23,6 +23,7 @@ "react-dom": "^19.0.0", "react-i18next": "^17.0.2", "react-router": "^7.14.0", + "sonner": "^2.0.7", "tailwind-merge": "^3.5.0", "zustand": "^5.0.12" }, @@ -6147,6 +6148,16 @@ "dev": true, "license": "ISC" }, + "node_modules/sonner": { + "version": "2.0.7", + "resolved": "https://registry.npmjs.org/sonner/-/sonner-2.0.7.tgz", + "integrity": "sha512-W6ZN4p58k8aDKA4XPcx2hpIQXBRAgyiWVkYhT7CvK6D3iAu7xjvVyhQHg2/iaKJZ1XVJ4r7XuwGL+WGEK37i9w==", + "license": "MIT", + "peerDependencies": { + "react": "^18.0.0 || ^19.0.0 || ^19.0.0-rc", + "react-dom": "^18.0.0 || ^19.0.0 || ^19.0.0-rc" + } + }, "node_modules/source-map-js": { "version": "1.2.1", "resolved": "https://registry.npmjs.org/source-map-js/-/source-map-js-1.2.1.tgz", diff --git a/package.json b/package.json index becd2c3..815787c 100644 --- a/package.json +++ b/package.json @@ -32,6 +32,7 @@ "react-dom": "^19.0.0", "react-i18next": "^17.0.2", "react-router": "^7.14.0", + "sonner": "^2.0.7", "tailwind-merge": "^3.5.0", "zustand": "^5.0.12" }, diff --git a/src-tauri/src/adapters/driving/tauri_ipc.rs b/src-tauri/src/adapters/driving/tauri_ipc.rs index 96fc719..3deed61 100644 --- a/src-tauri/src/adapters/driving/tauri_ipc.rs 
+++ b/src-tauri/src/adapters/driving/tauri_ipc.rs
@@ -13,10 +13,11 @@ use crate::adapters::driven::logging::download_log_store::DownloadLogStore;
 use crate::application::command_bus::CommandBus;
 use crate::application::commands::store_install::{StoreInstallCommand, StoreUpdateCommand};
 use crate::application::commands::{
-    CancelDownloadCommand, DisablePluginCommand, EnablePluginCommand, InstallPluginCommand,
-    PauseAllDownloadsCommand, PauseDownloadCommand, RemoveDownloadCommand, ResolveLinksCommand,
-    ResolvedLinkDto, ResumeAllDownloadsCommand, ResumeDownloadCommand, RetryDownloadCommand,
-    SetPriorityCommand, StartDownloadCommand, UninstallPluginCommand, UpdateConfigCommand,
+    CancelDownloadCommand, ClearDownloadsByStateCommand, DisablePluginCommand, EnablePluginCommand,
+    InstallPluginCommand, PauseAllDownloadsCommand, PauseDownloadCommand, RemoveDownloadCommand,
+    ResolveLinksCommand, ResolvedLinkDto, ResumeAllDownloadsCommand, ResumeDownloadCommand,
+    RetryDownloadCommand, SetPriorityCommand, StartDownloadCommand, UninstallPluginCommand,
+    UpdateConfigCommand,
 };
 use crate::application::error::AppError;
 use crate::application::queries::{
@@ -152,6 +153,38 @@ pub async fn download_remove(
         .map_err(|e| e.to_string())
 }
 
+#[tauri::command]
+pub async fn download_clear_completed(
+    state: State<'_, AppState>,
+    delete_files: bool,
+) -> Result<u32, String> {
+    let cmd = ClearDownloadsByStateCommand {
+        state: DownloadState::Completed,
+        delete_files,
+    };
+    state
+        .command_bus
+        .handle_clear_downloads_by_state(cmd)
+        .await
+        .map_err(|e| e.to_string())
+}
+
+#[tauri::command]
+pub async fn download_clear_failed(
+    state: State<'_, AppState>,
+    delete_files: bool,
+) -> Result<u32, String> {
+    let cmd = ClearDownloadsByStateCommand {
+        state: DownloadState::Error,
+        delete_files,
+    };
+    state
+        .command_bus
+        .handle_clear_downloads_by_state(cmd)
+        .await
+        .map_err(|e| e.to_string())
+}
+
 // --- Plugin Commands ---
 
 #[tauri::command]
diff --git a/src-tauri/src/application/commands/clear_downloads_by_state.rs b/src-tauri/src/application/commands/clear_downloads_by_state.rs
new file mode 100644
index 0000000..67caab7
--- /dev/null
+++ b/src-tauri/src/application/commands/clear_downloads_by_state.rs
@@ -0,0 +1,504 @@
+use std::io::ErrorKind;
+use std::path::Path;
+
+use crate::application::command_bus::CommandBus;
+use crate::application::error::AppError;
+use crate::domain::event::DomainEvent;
+use crate::domain::model::download::DownloadState;
+
+impl CommandBus {
+    pub async fn handle_clear_downloads_by_state(
+        &self,
+        cmd: super::ClearDownloadsByStateCommand,
+    ) -> Result<u32, AppError> {
+        if !matches!(cmd.state, DownloadState::Completed | DownloadState::Error) {
+            return Err(AppError::Validation(
+                "state must be Completed or Error".to_string(),
+            ));
+        }
+
+        let downloads = self.download_repo().find_by_state(cmd.state)?;
+        let mut count: u32 = 0;
+
+        for download in downloads {
+            // Repository delete first — if the durable store rejects the write
+            // we must not orphan files on disk under a gone DB row.
+ if let Err(e) = self.download_repo().delete(download.id()) { + tracing::error!( + id = download.id().0, + error = %e, + "failed to delete download from repository; skipping file cleanup" + ); + continue; + } + + if cmd.delete_files { + let dest = Path::new(download.destination_path()); + if let Err(e) = std::fs::remove_file(dest) + && e.kind() != ErrorKind::NotFound + { + tracing::warn!( + path = %dest.display(), + error = %e, + "failed to delete download file" + ); + } + if let Err(e) = self.file_storage().delete_meta(dest) { + tracing::warn!( + path = %format!("{}.vortex-meta", download.destination_path()), + error = %e, + "failed to delete .vortex-meta sidecar" + ); + } + } + + self.event_bus() + .publish(DomainEvent::DownloadRemoved { id: download.id() }); + count += 1; + } + + Ok(count) + } +} + +#[cfg(test)] +mod tests { + use std::collections::HashMap; + use std::path::Path; + use std::sync::{Arc, Mutex}; + + use crate::application::command_bus::CommandBus; + use crate::application::commands::ClearDownloadsByStateCommand; + use crate::application::error::AppError; + use crate::domain::error::DomainError; + use crate::domain::event::DomainEvent; + use crate::domain::model::config::{AppConfig, ConfigPatch}; + use crate::domain::model::credential::Credential; + use crate::domain::model::download::{Download, DownloadId, DownloadState, Url}; + use crate::domain::model::http::HttpResponse; + use crate::domain::model::meta::DownloadMeta; + use crate::domain::model::plugin::{PluginInfo, PluginManifest}; + use crate::domain::ports::driven::{ + ArchiveExtractor, ClipboardObserver, ConfigStore, CredentialStore, DownloadEngine, + DownloadRepository, EventBus, FileStorage, HttpClient, PluginLoader, + }; + + struct MockDownloadRepo { + store: Mutex>, + } + impl MockDownloadRepo { + fn new() -> Self { + Self { + store: Mutex::new(HashMap::new()), + } + } + fn with(self, dl: Download) -> Self { + self.store.lock().unwrap().insert(dl.id().0, dl); + self + } + } + impl 
DownloadRepository for MockDownloadRepo { + fn find_by_id(&self, id: DownloadId) -> Result, DomainError> { + Ok(self.store.lock().unwrap().get(&id.0).cloned()) + } + fn save(&self, d: &Download) -> Result<(), DomainError> { + self.store.lock().unwrap().insert(d.id().0, d.clone()); + Ok(()) + } + fn delete(&self, id: DownloadId) -> Result<(), DomainError> { + self.store.lock().unwrap().remove(&id.0); + Ok(()) + } + fn find_by_state(&self, s: DownloadState) -> Result, DomainError> { + Ok(self + .store + .lock() + .unwrap() + .values() + .filter(|d| d.state() == s) + .cloned() + .collect()) + } + } + + struct MockDownloadEngine; + impl DownloadEngine for MockDownloadEngine { + fn start(&self, _: &Download) -> Result<(), DomainError> { + Ok(()) + } + fn pause(&self, _: DownloadId) -> Result<(), DomainError> { + Ok(()) + } + fn resume(&self, _: DownloadId) -> Result<(), DomainError> { + Ok(()) + } + fn cancel(&self, _: DownloadId) -> Result<(), DomainError> { + Ok(()) + } + } + + struct MockEventBus { + events: Mutex>, + } + impl MockEventBus { + fn new() -> Self { + Self { + events: Mutex::new(Vec::new()), + } + } + } + impl EventBus for MockEventBus { + fn publish(&self, e: DomainEvent) { + self.events.lock().unwrap().push(e); + } + fn subscribe(&self, _: Box) {} + } + + struct MockFileStorage { + deleted_metas: Mutex>, + } + impl MockFileStorage { + fn new() -> Self { + Self { + deleted_metas: Mutex::new(Vec::new()), + } + } + } + impl FileStorage for MockFileStorage { + fn create_file(&self, _: &Path, _: u64) -> Result<(), DomainError> { + Ok(()) + } + fn write_segment(&self, _: &Path, _: u64, _: &[u8]) -> Result<(), DomainError> { + Ok(()) + } + fn read_meta(&self, _: &Path) -> Result, DomainError> { + Ok(None) + } + fn write_meta(&self, _: &Path, _: &DownloadMeta) -> Result<(), DomainError> { + Ok(()) + } + fn delete_meta(&self, p: &Path) -> Result<(), DomainError> { + self.deleted_metas + .lock() + .unwrap() + .push(p.to_string_lossy().into_owned()); + Ok(()) + } 
+    }
+
+    struct MockHttpClient;
+    impl HttpClient for MockHttpClient {
+        fn head(&self, _: &str) -> Result<HttpResponse, DomainError> {
+            Ok(HttpResponse {
+                status_code: 200,
+                headers: HashMap::new(),
+                body: vec![],
+            })
+        }
+        fn get_range(&self, _: &str, _: u64, _: u64) -> Result<Vec<u8>, DomainError> {
+            Ok(vec![])
+        }
+        fn supports_range(&self, _: &str) -> Result<bool, DomainError> {
+            Ok(true)
+        }
+    }
+
+    struct MockPluginLoader;
+    impl PluginLoader for MockPluginLoader {
+        fn load(&self, _: &PluginManifest) -> Result<(), DomainError> {
+            Ok(())
+        }
+        fn unload(&self, _: &str) -> Result<(), DomainError> {
+            Ok(())
+        }
+        fn resolve_url(&self, _: &str) -> Result<Option<String>, DomainError> {
+            Ok(None)
+        }
+        fn list_loaded(&self) -> Result<Vec<String>, DomainError> {
+            Ok(vec![])
+        }
+        fn set_enabled(&self, _: &str, _: bool) -> Result<(), DomainError> {
+            Ok(())
+        }
+    }
+
+    struct MockConfigStore;
+    impl ConfigStore for MockConfigStore {
+        fn get_config(&self) -> Result<AppConfig, DomainError> {
+            Ok(AppConfig::default())
+        }
+        fn update_config(&self, _: ConfigPatch) -> Result<AppConfig, DomainError> {
+            Ok(AppConfig::default())
+        }
+    }
+
+    struct MockCredentialStore;
+    impl CredentialStore for MockCredentialStore {
+        fn get(&self, _: &str) -> Result<Option<Credential>, DomainError> {
+            Ok(None)
+        }
+        fn store(&self, _: &str, _: &Credential) -> Result<(), DomainError> {
+            Ok(())
+        }
+        fn delete(&self, _: &str) -> Result<(), DomainError> {
+            Ok(())
+        }
+    }
+
+    struct MockClipboardObserver;
+    impl ClipboardObserver for MockClipboardObserver {
+        fn start(&self) -> Result<(), DomainError> {
+            Ok(())
+        }
+        fn stop(&self) -> Result<(), DomainError> {
+            Ok(())
+        }
+        fn get_urls(&self) -> Result<Vec<String>, DomainError> {
+            Ok(vec![])
+        }
+    }
+
+    struct FakeArchiveExtractor;
+    impl ArchiveExtractor for FakeArchiveExtractor {
+        fn detect_format(
+            &self,
+            _: &Path,
+        ) -> Result<Option<crate::domain::model::archive::ArchiveFormat>, DomainError> {
+            Ok(None)
+        }
+        fn can_extract(&self, _: &Path) -> Result<bool, DomainError> {
+            Ok(false)
+        }
+        fn extract(
+            &self,
+            _: &Path,
+            _: &Path,
+            _: Option<&str>,
+        ) -> Result<crate::domain::model::archive::ExtractSummary, DomainError> {
+            Ok(crate::domain::model::archive::ExtractSummary {
+                extracted_files: 0,
+                extracted_bytes: 0,
+                duration_ms: 0,
+                warnings: vec![],
+            })
+        }
+        fn list_contents(
+            &self,
+            _: &Path,
+            _: Option<&str>,
+        ) -> Result<Vec<crate::domain::model::archive::ArchiveEntry>, DomainError> {
+            Ok(vec![])
+        }
+        fn detect_segments(
+            &self,
+            _: &Path,
+        ) -> Result<Option<Vec<std::path::PathBuf>>, DomainError> {
+            Ok(None)
+        }
+    }
+
+    fn completed_download(id: u64, path: &str) -> Download {
+        let mut d = Download::new(
+            DownloadId(id),
+            Url::new("http://example.com/f.zip").unwrap(),
+            format!("f{id}.zip"),
+            path.to_string(),
+        );
+        d.start().unwrap();
+        d.complete().unwrap();
+        d
+    }
+
+    fn errored_download(id: u64, path: &str) -> Download {
+        let mut d = Download::new(
+            DownloadId(id),
+            Url::new("http://example.com/f.zip").unwrap(),
+            format!("f{id}.zip"),
+            path.to_string(),
+        );
+        d.start().unwrap();
+        d.fail("boom".to_string()).unwrap();
+        d
+    }
+
+    struct TestHarness {
+        bus: CommandBus,
+        event_bus: Arc<MockEventBus>,
+        file_storage: Arc<MockFileStorage>,
+    }
+
+    fn make_harness(repo: MockDownloadRepo) -> TestHarness {
+        let event_bus = Arc::new(MockEventBus::new());
+        let file_storage = Arc::new(MockFileStorage::new());
+        let bus = CommandBus::new(
+            Arc::new(repo),
+            Arc::new(MockDownloadEngine),
+            event_bus.clone(),
+            file_storage.clone(),
+            Arc::new(MockHttpClient),
+            Arc::new(MockPluginLoader),
+            Arc::new(MockConfigStore),
+            Arc::new(MockCredentialStore),
+            Arc::new(MockClipboardObserver),
+            Arc::new(FakeArchiveExtractor),
+            None,
+        );
+        TestHarness {
+            bus,
+            event_bus,
+            file_storage,
+        }
+    }
+
+    #[tokio::test]
+    async fn test_clear_completed_returns_count_and_deletes_from_db() {
+        let repo = MockDownloadRepo::new()
+            .with(completed_download(1, "/tmp/a.zip"))
+            .with(completed_download(2, "/tmp/b.zip"));
+        let h = make_harness(repo);
+
+        let cmd = ClearDownloadsByStateCommand {
+            state: DownloadState::Completed,
+            delete_files: false,
+        };
+        let count = h.bus.handle_clear_downloads_by_state(cmd).await.unwrap();
+
+        assert_eq!(count, 2);
+        assert!(
+            h.bus
+                .download_repo()
+                .find_by_id(DownloadId(1))
+                .unwrap()
+                .is_none()
+        );
+        assert!(
+            h.bus
+                .download_repo()
+                .find_by_id(DownloadId(2))
+                .unwrap()
+                .is_none()
+        );
+    }
+
+    #[tokio::test]
+    async fn test_clear_failed_returns_count() {
+        let repo = MockDownloadRepo::new()
+            .with(errored_download(1, "/tmp/a.zip"))
+            .with(completed_download(2, "/tmp/b.zip"));
+        let h = make_harness(repo);
+
+        let cmd = ClearDownloadsByStateCommand {
+            state: DownloadState::Error,
+            delete_files: false,
+        };
+        let count = h.bus.handle_clear_downloads_by_state(cmd).await.unwrap();
+
+        assert_eq!(count, 1);
+        assert!(
+            h.bus
+                .download_repo()
+                .find_by_id(DownloadId(2))
+                .unwrap()
+                .is_some()
+        );
+    }
+
+    #[tokio::test]
+    async fn test_clear_non_terminal_state_returns_validation_error() {
+        let h = make_harness(MockDownloadRepo::new());
+        let cmd = ClearDownloadsByStateCommand {
+            state: DownloadState::Downloading,
+            delete_files: false,
+        };
+        let err = h
+            .bus
+            .handle_clear_downloads_by_state(cmd)
+            .await
+            .unwrap_err();
+        assert!(matches!(err, AppError::Validation(_)));
+    }
+
+    #[tokio::test]
+    async fn test_clear_emits_one_removed_event_per_cleared_download() {
+        let repo = MockDownloadRepo::new()
+            .with(completed_download(1, "/tmp/a.zip"))
+            .with(completed_download(2, "/tmp/b.zip"));
+        let h = make_harness(repo);
+
+        let cmd = ClearDownloadsByStateCommand {
+            state: DownloadState::Completed,
+            delete_files: false,
+        };
+        h.bus.handle_clear_downloads_by_state(cmd).await.unwrap();
+
+        let events = h.event_bus.events.lock().unwrap();
+        let removed: Vec<_> = events
+            .iter()
+            .filter_map(|e| match e {
+                DomainEvent::DownloadRemoved { id } => Some(*id),
+                _ => None,
+            })
+            .collect();
+        assert_eq!(removed.len(), 2);
+        assert!(removed.contains(&DownloadId(1)));
+        assert!(removed.contains(&DownloadId(2)));
+    }
+
+    #[tokio::test]
+    async fn test_clear_with_delete_files_calls_filestorage_delete_meta() {
+        let repo = MockDownloadRepo::new().with(completed_download(1, "/tmp/a.zip"));
+        let h = make_harness(repo);
+
+        let cmd =
+            ClearDownloadsByStateCommand {
+                state: DownloadState::Completed,
+                delete_files: true,
+            };
+        h.bus.handle_clear_downloads_by_state(cmd).await.unwrap();
+
+        let metas = h.file_storage.deleted_metas.lock().unwrap();
+        assert_eq!(metas.len(), 1);
+        assert_eq!(metas[0], "/tmp/a.zip");
+    }
+
+    #[tokio::test]
+    async fn test_clear_without_delete_files_skips_filestorage() {
+        let repo = MockDownloadRepo::new().with(completed_download(1, "/tmp/a.zip"));
+        let h = make_harness(repo);
+
+        let cmd = ClearDownloadsByStateCommand {
+            state: DownloadState::Completed,
+            delete_files: false,
+        };
+        h.bus.handle_clear_downloads_by_state(cmd).await.unwrap();
+
+        assert!(h.file_storage.deleted_metas.lock().unwrap().is_empty());
+    }
+
+    #[tokio::test]
+    async fn test_clear_missing_file_is_idempotent() {
+        let repo = MockDownloadRepo::new().with(completed_download(
+            1,
+            "/nonexistent/definitely/not/here.zip",
+        ));
+        let h = make_harness(repo);
+
+        let cmd = ClearDownloadsByStateCommand {
+            state: DownloadState::Completed,
+            delete_files: true,
+        };
+        let count = h.bus.handle_clear_downloads_by_state(cmd).await.unwrap();
+        assert_eq!(count, 1);
+    }
+
+    #[tokio::test]
+    async fn test_clear_empty_returns_zero() {
+        let h = make_harness(MockDownloadRepo::new());
+        let cmd = ClearDownloadsByStateCommand {
+            state: DownloadState::Completed,
+            delete_files: true,
+        };
+        let count = h.bus.handle_clear_downloads_by_state(cmd).await.unwrap();
+        assert_eq!(count, 0);
+        assert!(h.event_bus.events.lock().unwrap().is_empty());
+    }
+}
diff --git a/src-tauri/src/application/commands/mod.rs b/src-tauri/src/application/commands/mod.rs
index efb4448..f2e36b0 100644
--- a/src-tauri/src/application/commands/mod.rs
+++ b/src-tauri/src/application/commands/mod.rs
@@ -4,6 +4,7 @@
 //! Handler implementations live in submodules and add methods to `CommandBus`.
 mod cancel_download;
+mod clear_downloads_by_state;
 mod extract_archive;
 mod install_plugin;
 mod pause_all;
@@ -117,6 +118,13 @@ pub struct RemoveDownloadCommand {
 }
 impl Command for RemoveDownloadCommand {}
 
+#[derive(Debug)]
+pub struct ClearDownloadsByStateCommand {
+    pub state: crate::domain::model::download::DownloadState,
+    pub delete_files: bool,
+}
+impl Command for ClearDownloadsByStateCommand {}
+
 #[derive(Debug)]
 pub struct ResolveLinksCommand {
     pub urls: Vec<String>,
diff --git a/src-tauri/src/lib.rs b/src-tauri/src/lib.rs
index 0a81b95..30cc2cb 100644
--- a/src-tauri/src/lib.rs
+++ b/src-tauri/src/lib.rs
@@ -54,12 +54,12 @@ pub use domain::model::ExtractionConfig;
 pub use adapters::driving::tauri_ipc::{
     self, AppState, clipboard_state, clipboard_toggle, command_get_media_metadata, download_cancel,
-    download_count_by_state, download_detail, download_list, download_logs, download_media_start,
-    download_pause, download_pause_all, download_remove, download_resume, download_resume_all,
-    download_retry, download_set_priority, download_start, link_resolve, plugin_disable,
-    plugin_enable, plugin_install, plugin_list, plugin_store_install, plugin_store_list,
-    plugin_store_refresh, plugin_store_update, plugin_uninstall, settings_get, settings_update,
-    status_bar_get,
+    download_clear_completed, download_clear_failed, download_count_by_state, download_detail,
+    download_list, download_logs, download_media_start, download_pause, download_pause_all,
+    download_remove, download_resume, download_resume_all, download_retry, download_set_priority,
+    download_start, link_resolve, plugin_disable, plugin_enable, plugin_install, plugin_list,
+    plugin_store_install, plugin_store_list, plugin_store_refresh, plugin_store_update,
+    plugin_uninstall, settings_get, settings_update, status_bar_get,
 };
 
 #[cfg_attr(mobile, tauri::mobile_entry_point)]
@@ -279,6 +279,8 @@ pub fn run() {
             download_resume_all,
             download_set_priority,
             download_remove,
+            download_clear_completed,
+            download_clear_failed,
             download_list,
             download_detail,
             download_logs,
diff --git a/src/App.tsx b/src/App.tsx
index 024f94a..55a8e85 100644
--- a/src/App.tsx
+++ b/src/App.tsx
@@ -1,6 +1,7 @@
 import './i18n/i18n';
 import { BrowserRouter, Routes, Route, Navigate } from "react-router";
 import { QueryClientProvider } from "@tanstack/react-query";
+import { Toaster } from "sonner";
 import { TooltipProvider } from "@/components/ui/tooltip";
 import { AppLayout } from "@/layouts/AppLayout";
 import {
@@ -39,6 +40,7 @@ export function App() {
+      <Toaster />
   );
diff --git a/src/i18n/locales/en.json b/src/i18n/locales/en.json
index 9b691ac..204d511 100644
--- a/src/i18n/locales/en.json
+++ b/src/i18n/locales/en.json
@@ -133,7 +133,29 @@
     "actions": {
       "pauseAll": "Pause All",
       "resumeAll": "Resume All",
-      "cancelSelected": "Cancel Selected"
+      "cancelSelected": "Cancel Selected",
+      "clearCompleted": "Clear completed",
+      "clearFailed": "Clear failed"
+    },
+    "clearDialog": {
+      "titleCompleted_one": "Clear {{count}} completed download?",
+      "titleCompleted_other": "Clear {{count}} completed downloads?",
+      "titleFailed_one": "Clear {{count}} failed download?",
+      "titleFailed_other": "Clear {{count}} failed downloads?",
+      "description": "This removes the download entries from Vortex. They will no longer appear in the list.",
+      "deleteFilesLabel": "Also delete files from disk",
+      "warningTitle": "Permanent deletion",
+      "warningBody": "Files will be removed from your disk. This action cannot be undone.",
+      "confirm": "Clear",
+      "confirmWithFiles": "Clear and delete files",
+      "cancel": "Cancel"
+    },
+    "toast": {
+      "clearedCompleted_one": "{{count}} completed download cleared",
+      "clearedCompleted_other": "{{count}} completed downloads cleared",
+      "clearedFailed_one": "{{count}} failed download cleared",
+      "clearedFailed_other": "{{count}} failed downloads cleared",
+      "clearError": "Failed to clear downloads: {{error}}"
     },
     "selectedCount_one": "{{count}} selected",
     "selectedCount_other": "{{count}} selected",
diff --git a/src/i18n/locales/fr.json b/src/i18n/locales/fr.json
index 37fec89..1d69763 100644
--- a/src/i18n/locales/fr.json
+++ b/src/i18n/locales/fr.json
@@ -133,7 +133,29 @@
     "actions": {
       "pauseAll": "Tout suspendre",
      "resumeAll": "Tout reprendre",
-      "cancelSelected": "Annuler la sélection"
+      "cancelSelected": "Annuler la sélection",
+      "clearCompleted": "Effacer terminés",
+      "clearFailed": "Effacer en erreur"
+    },
+    "clearDialog": {
+      "titleCompleted_one": "Effacer {{count}} téléchargement terminé ?",
+      "titleCompleted_other": "Effacer {{count}} téléchargements terminés ?",
+      "titleFailed_one": "Effacer {{count}} téléchargement en erreur ?",
+      "titleFailed_other": "Effacer {{count}} téléchargements en erreur ?",
+      "description": "Les entrées seront retirées de Vortex. Elles n'apparaîtront plus dans la liste.",
+      "deleteFilesLabel": "Également supprimer les fichiers du disque",
+      "warningTitle": "Suppression définitive",
+      "warningBody": "Les fichiers seront supprimés de votre disque. Cette action est irréversible.",
+      "confirm": "Effacer",
+      "confirmWithFiles": "Effacer et supprimer les fichiers",
+      "cancel": "Annuler"
+    },
+    "toast": {
+      "clearedCompleted_one": "{{count}} téléchargement terminé effacé",
+      "clearedCompleted_other": "{{count}} téléchargements terminés effacés",
+      "clearedFailed_one": "{{count}} téléchargement en erreur effacé",
+      "clearedFailed_other": "{{count}} téléchargements en erreur effacés",
+      "clearError": "Échec de l'effacement des téléchargements : {{error}}"
     },
     "selectedCount_one": "{{count}} sélectionné",
     "selectedCount_other": "{{count}} sélectionnés",
diff --git a/src/lib/toast.ts b/src/lib/toast.ts
new file mode 100644
index 0000000..948b7a3
--- /dev/null
+++ b/src/lib/toast.ts
@@ -0,0 +1,6 @@
+import { toast as sonnerToast } from 'sonner';
+
+export const toast = {
+  success: (message: string) => sonnerToast.success(message),
+  error: (message: string) => sonnerToast.error(message),
+};
diff --git a/src/views/DownloadsView/ActionsBar.tsx b/src/views/DownloadsView/ActionsBar.tsx
index ebdcf88..e0bf19d 100644
--- a/src/views/DownloadsView/ActionsBar.tsx
+++ b/src/views/DownloadsView/ActionsBar.tsx
@@ -1,10 +1,16 @@
-import { useRef } from 'react';
-import { Pause, Play, X } from 'lucide-react';
+import { useRef, useState } from 'react';
+import { CheckCheck, Pause, Play, X, XCircle } from 'lucide-react';
 import { useTranslation } from 'react-i18next';
 import { Button } from '@/components/ui/button';
-import { useTauriMutation } from '@/api/hooks';
+import { Separator } from '@/components/ui/separator';
+import { useTauriMutation, useTauriQuery } from '@/api/hooks';
 import { downloadQueries } from '@/api/queries';
 import { useUiStore } from '@/stores/uiStore';
+import { toast } from '@/lib/toast';
+import {
+  ClearDownloadsDialog,
+  type ClearDownloadsTarget,
+} from './ClearDownloadsDialog';
 
 const INVALIDATE_KEYS = [
   downloadQueries.lists(),
@@ -29,8 +35,33 @@ export function ActionsBar() {
     invalidateKeys: INVALIDATE_KEYS,
   });
 
-  const cancellingRef = useRef(false);
+  const clearCompleted = useTauriMutation(
+    'download_clear_completed',
+    {
+      invalidateKeys: INVALIDATE_KEYS,
+      onSuccess: (count) => {
+        toast.success(t('downloads.toast.clearedCompleted', { count }));
+      },
+      onError: (err) => {
+        toast.error(t('downloads.toast.clearError', { error: err.message }));
+      },
+    },
+  );
+  const clearFailed = useTauriMutation(
+    'download_clear_failed',
+    {
+      invalidateKeys: INVALIDATE_KEYS,
+      onSuccess: (count) => {
+        toast.success(t('downloads.toast.clearedFailed', { count }));
+      },
+      onError: (err) => {
+        toast.error(t('downloads.toast.clearError', { error: err.message }));
+      },
+    },
+  );
+
+  const cancellingRef = useRef(false);
   const handleCancelSelected = async () => {
     if (cancellingRef.current) return;
     cancellingRef.current = true;
@@ -41,14 +72,12 @@
     );
     const failedIds = snapshot.filter((_, i) => results[i].status === 'rejected');
     const currentIds = useUiStore.getState().selectedDownloadIds;
-    const unchanged = currentIds.length === snapshot.length
+    const unchanged =
+      currentIds.length === snapshot.length
       && currentIds.every((id, i) => id === snapshot[i]);
     if (unchanged) {
-      if (failedIds.length === 0) {
-        clearSelection();
-      } else {
-        setSelectedDownloadIds(failedIds);
-      }
+      if (failedIds.length === 0) clearSelection();
+      else setSelectedDownloadIds(failedIds);
      }
    } finally {
      cancellingRef.current = false;
@@ -57,8 +86,34 @@
   const hasSelection = selectedDownloadIds.length > 0;
+
+  // Subscribes to the shared cache entry so the button enabled/disabled state
+  // reactively tracks state transitions. Mirrors the staleTime used by the
+  // primary consumer in DownloadsView so the two reads share a single request.
+  const { data: counts } = useTauriQuery<Record<string, number>>(
+    'download_count_by_state',
+    undefined,
+    { queryKey: downloadQueries.countByState(), staleTime: 2000 },
+  );
+  const completedCount = counts?.Completed ?? 0;
+  const errorCount = counts?.Error ?? 0;
+
+  const [dialogTarget, setDialogTarget] =
+    useState<ClearDownloadsTarget | null>(null);
+  const dialogOpen = dialogTarget !== null;
+  const dialogCount = dialogTarget === 'completed' ? completedCount : errorCount;
+
+  const handleDialogConfirm = async (deleteFiles: boolean) => {
+    if (dialogTarget === 'completed') {
+      await clearCompleted.mutateAsync({ deleteFiles });
+    } else if (dialogTarget === 'error') {
+      await clearFailed.mutateAsync({ deleteFiles });
+    }
+  };
+
   return (
-
+
       {hasSelection ? (
         <>
@@ -82,8 +137,39 @@
             {t('downloads.actions.resumeAll')}
+          <Separator orientation="vertical" />
+          <Button
+            disabled={completedCount === 0}
+            onClick={() => setDialogTarget('completed')}
+          >
+            <CheckCheck />
+            {t('downloads.actions.clearCompleted')}
+          </Button>
+          <Button
+            disabled={errorCount === 0}
+            onClick={() => setDialogTarget('error')}
+          >
+            <XCircle />
+            {t('downloads.actions.clearFailed')}
+          </Button>
         </>
       )}
+
+      {dialogTarget !== null && (
+        <ClearDownloadsDialog
+          open={dialogOpen}
+          onOpenChange={(o) => !o && setDialogTarget(null)}
+          targetState={dialogTarget}
+          count={dialogCount}
+          onConfirm={handleDialogConfirm}
+        />
+      )}
   );
 }
diff --git a/src/views/DownloadsView/ClearDownloadsDialog.tsx b/src/views/DownloadsView/ClearDownloadsDialog.tsx
new file mode 100644
index 0000000..f5d55c6
--- /dev/null
+++ b/src/views/DownloadsView/ClearDownloadsDialog.tsx
@@ -0,0 +1,121 @@
+import { useEffect, useState } from 'react';
+import { AlertTriangle } from 'lucide-react';
+import { useTranslation } from 'react-i18next';
+import {
+  Dialog,
+  DialogContent,
+  DialogFooter,
+  DialogHeader,
+  DialogTitle,
+  DialogDescription,
+} from '@/components/ui/dialog';
+import { Button } from '@/components/ui/button';
+import { Checkbox } from '@/components/ui/checkbox';
+
+export type ClearDownloadsTarget = 'completed' | 'error';
+
+interface Props {
+  open: boolean;
+  onOpenChange: (open: boolean) => void;
+  targetState: ClearDownloadsTarget;
+  count: number;
+  onConfirm: (deleteFiles: boolean) => Promise<void> | void;
+}
+
+export function ClearDownloadsDialog({
+  open,
+  onOpenChange,
+  targetState,
+  count,
+  onConfirm,
+}: Props) {
+  const { t } = useTranslation();
+  const [deleteFiles, setDeleteFiles] = useState(false);
+  const [submitting, setSubmitting] = useState(false);
+
+  useEffect(() => {
+    if (open) setDeleteFiles(false);
+  }, [open]);
+
+  const titleKey =
+    targetState === 'completed'
+      ? 'downloads.clearDialog.titleCompleted'
+      : 'downloads.clearDialog.titleFailed';
+
+  const confirmLabel = deleteFiles
+    ? t('downloads.clearDialog.confirmWithFiles')
+    : t('downloads.clearDialog.confirm');
+
+  const handleConfirm = async () => {
+    if (submitting) return;
+    setSubmitting(true);
+    try {
+      await onConfirm(deleteFiles);
+      onOpenChange(false);
+    } catch {
+      // Failure is surfaced via the mutation's onError toast; we only
+      // keep the dialog open and re-enable the button here.
+    } finally {
+      setSubmitting(false);
+    }
+  };
+
+  return (
+    <Dialog open={open} onOpenChange={onOpenChange}>
+      <DialogContent>
+        <DialogHeader>
+          <DialogTitle>{t(titleKey, { count })}</DialogTitle>
+          <DialogDescription>
+            {t('downloads.clearDialog.description')}
+          </DialogDescription>
+        </DialogHeader>
+
+        <label>
+          <Checkbox
+            checked={deleteFiles}
+            onCheckedChange={(checked) => setDeleteFiles(checked === true)}
+          />
+          {t('downloads.clearDialog.deleteFilesLabel')}
+        </label>
+
+        {deleteFiles && (
+          <div>
+            <AlertTriangle />
+            <p>{t('downloads.clearDialog.warningTitle')}</p>
+            <p>{t('downloads.clearDialog.warningBody')}</p>
+          </div>
+        )}
+
+        <DialogFooter>
+          <Button onClick={() => onOpenChange(false)} disabled={submitting}>
+            {t('downloads.clearDialog.cancel')}
+          </Button>
+          <Button onClick={handleConfirm} disabled={submitting}>
+            {confirmLabel}
+          </Button>
+        </DialogFooter>
+      </DialogContent>
+    </Dialog>
+  );
+}
diff --git a/src/views/DownloadsView/__tests__/ActionsBar.test.tsx b/src/views/DownloadsView/__tests__/ActionsBar.test.tsx
index aa745b3..642c234 100644
--- a/src/views/DownloadsView/__tests__/ActionsBar.test.tsx
+++ b/src/views/DownloadsView/__tests__/ActionsBar.test.tsx
@@ -1,37 +1,60 @@
 import { describe, it, expect, vi, beforeEach } from 'vitest';
-import { render, screen } from '@testing-library/react';
+import { render, screen, waitFor } from '@testing-library/react';
 import userEvent from '@testing-library/user-event';
 import { QueryClient, QueryClientProvider } from '@tanstack/react-query';
 import { useUiStore } from '@/stores/uiStore';
 import { ActionsBar } from '../ActionsBar';
+import { downloadQueries } from '@/api/queries';
+
+const { invokeMock, toastMock } = vi.hoisted(() => ({
+  invokeMock: vi.fn(),
+  toastMock: { success: vi.fn(), error: vi.fn() },
+}));
 
 vi.mock('@tauri-apps/api/core', () => ({
-  invoke: vi.fn().mockResolvedValue(undefined),
+  invoke: (...args: unknown[]) => invokeMock(...args),
 }));
 
-function renderWithQuery(ui: React.ReactElement) {
+vi.mock('@/lib/toast', () => ({ toast: toastMock }));
+
+function makeClient(counts?: Record<string, number>) {
   const queryClient = new QueryClient({
     defaultOptions: { queries: { retry: false }, mutations: { retry: false } },
   });
+  if (counts) {
+    queryClient.setQueryData(downloadQueries.countByState(), counts);
+  }
+  return queryClient;
+}
+
+function renderBar(counts?: Record<string, number>) {
+  const queryClient = makeClient(counts);
   return render(
-    <QueryClientProvider client={queryClient}>{ui}</QueryClientProvider>,
+    <QueryClientProvider client={queryClient}>
+      <ActionsBar />
+    </QueryClientProvider>,
   );
 }
 
 beforeEach(() => {
   useUiStore.setState({ selectedDownloadIds: [], selectedDownloadId: null });
+  invokeMock.mockReset();
+  invokeMock.mockResolvedValue(undefined);
+  toastMock.success.mockReset();
+  toastMock.error.mockReset();
+  window.localStorage.setItem('i18nextLng', 'en');
 });
 
 describe('ActionsBar', () => {
   it('should show Pause All and Resume All when no selection', () => {
-    renderWithQuery(<ActionsBar />);
+    renderBar();
     expect(screen.getByText('Pause All')).toBeInTheDocument();
     expect(screen.getByText('Resume All')).toBeInTheDocument();
   });
 
   it('should show selection count and actions when items selected', () => {
     useUiStore.setState({ selectedDownloadIds: ['1', '2', '3'] });
-    renderWithQuery(<ActionsBar />);
+    renderBar();
     expect(screen.getByText('3 selected')).toBeInTheDocument();
     expect(screen.getByText('Cancel Selected')).toBeInTheDocument();
     expect(screen.getByText('Clear')).toBeInTheDocument();
@@ -40,7 +63,7 @@
   it('should clear selection when Clear is clicked', async () => {
     const user = userEvent.setup();
     useUiStore.setState({ selectedDownloadIds: ['1', '2'] });
-    renderWithQuery(<ActionsBar />);
+    renderBar();
     await user.click(screen.getByText('Clear'));
     expect(useUiStore.getState().selectedDownloadIds).toEqual([]);
   });
@@ -48,9 +71,78 @@
   it('should use the singular French label when one item is selected', () => {
     window.localStorage.setItem('i18nextLng', 'fr');
     useUiStore.setState({ selectedDownloadIds: ['1'] });
+    renderBar();
+    expect(screen.getByText('1 sélectionné')).toBeInTheDocument();
+  });
+});
 
-    renderWithQuery(<ActionsBar />);
+describe('ActionsBar — clear completed/failed', () => {
+  it('disables "Clear completed" when Completed count is 0', () => {
+    renderBar({ Completed: 0, Error: 3 });
+    expect(
+      screen.getByRole('button', { name: /clear completed/i }),
+    ).toBeDisabled();
+  });
 
-    expect(screen.getByText('1 sélectionné')).toBeInTheDocument();
+  it('disables "Clear failed" when Error count is 0', () => {
+    renderBar({ Completed: 1, Error: 0 });
+    expect(
+      screen.getByRole('button', { name: /clear failed/i }),
+    ).toBeDisabled();
+  });
+
+  it('invokes download_clear_completed with deleteFiles:false and shows success toast', async () => {
+    invokeMock.mockResolvedValueOnce(3);
+    const user = userEvent.setup();
+    renderBar({ Completed: 3, Error: 0 });
+
+    await user.click(screen.getByRole('button', { name: /clear completed/i }));
+    await user.click(await screen.findByRole('button', { name: /^clear$/i }));
+
+    await waitFor(() => {
+      expect(invokeMock).toHaveBeenCalledWith('download_clear_completed', {
+        deleteFiles: false,
+      });
+    });
+    await waitFor(() => {
+      expect(toastMock.success).toHaveBeenCalledWith(
+        expect.stringContaining('3'),
+      );
+    });
+  });
+
+  it('invokes download_clear_failed with deleteFiles:true when checkbox checked', async () => {
+    invokeMock.mockResolvedValueOnce(2);
+    const user = userEvent.setup();
+    renderBar({ Completed: 0, Error: 2 });
+
+    await user.click(screen.getByRole('button', { name: /clear failed/i }));
+    await user.click(
+      await screen.findByRole('checkbox', { name: /also delete files from disk/i }),
+    );
+    await user.click(
+      screen.getByRole('button', { name: /clear and delete files/i }),
+    );
+
+    await waitFor(() => {
+      expect(invokeMock).toHaveBeenCalledWith('download_clear_failed', {
+        deleteFiles: true,
+      });
+    });
+  });
+
+  it('shows error toast when the mutation rejects', async () => {
+    invokeMock.mockRejectedValueOnce(new Error('boom'));
+    const user = userEvent.setup();
+    renderBar({ Completed: 1, Error: 0 });
+
+    await user.click(screen.getByRole('button', { name: /clear completed/i }));
+    await user.click(await screen.findByRole('button', { name: /^clear$/i }));
+
+    await waitFor(() => {
+      expect(toastMock.error).toHaveBeenCalledWith(
+        expect.stringContaining('boom'),
+      );
+    });
+  });
+});
diff --git a/src/views/DownloadsView/__tests__/ClearDownloadsDialog.test.tsx b/src/views/DownloadsView/__tests__/ClearDownloadsDialog.test.tsx
new file mode 100644
index 0000000..48d798f
--- /dev/null
+++ b/src/views/DownloadsView/__tests__/ClearDownloadsDialog.test.tsx
@@ -0,0 +1,91 @@
+import { describe, it, expect, vi, beforeEach } from 'vitest';
+import { render, screen } from '@testing-library/react';
+import userEvent from '@testing-library/user-event';
+import { ClearDownloadsDialog } from '../ClearDownloadsDialog';
+
+function renderDialog(
+  overrides: Partial<Parameters<typeof ClearDownloadsDialog>[0]> = {},
+) {
+  const props = {
+    open: true,
+    onOpenChange: vi.fn(),
+    targetState: 'completed' as const,
+    count: 3,
+    onConfirm: vi.fn().mockResolvedValue(undefined),
+    ...overrides,
+  };
+  render(<ClearDownloadsDialog {...props} />);
+  return props;
+}
+
+beforeEach(() => {
+  window.localStorage.setItem('i18nextLng', 'en');
+});
+
+describe('ClearDownloadsDialog', () => {
+  it('renders the completed title with the provided count', () => {
+    renderDialog({ targetState: 'completed', count: 3 });
+    expect(
+      screen.getByText(/Clear 3 completed downloads\?/i),
+    ).toBeInTheDocument();
+  });
+
+  it('renders the failed title when targetState is error', () => {
+    renderDialog({ targetState: 'error', count: 2 });
+    expect(
+      screen.getByText(/Clear 2 failed downloads\?/i),
+    ).toBeInTheDocument();
+  });
+
+  it('does not show the warning panel by default', () => {
+    renderDialog();
+    expect(screen.queryByText(/Permanent deletion/i)).not.toBeInTheDocument();
+  });
+
+  it('reveals the warning panel when the checkbox is checked', async () => {
+    const user = userEvent.setup();
+    renderDialog();
+    await user.click(
+      screen.getByRole('checkbox', { name: /also delete files from disk/i }),
+    );
+    expect(screen.getByText(/Permanent deletion/i)).toBeInTheDocument();
+  });
+
+  it('primary button label switches when the checkbox is checked', async () => {
+    const user = userEvent.setup();
+    renderDialog();
+    expect(screen.getByRole('button', { name: /^clear$/i })).toBeInTheDocument();
+    await user.click(
+      screen.getByRole('checkbox', { name: /also delete files from disk/i }),
+    );
+    expect(
+      screen.getByRole('button', { name: /clear and delete files/i }),
+    ).toBeInTheDocument();
+  });
+
+  it('calls onConfirm with deleteFiles:false when the box is not checked', async () => {
+    const user = userEvent.setup();
+    const props = renderDialog();
+    await user.click(screen.getByRole('button', { name: /^clear$/i }));
+    expect(props.onConfirm).toHaveBeenCalledWith(false);
+  });
+
+  it('calls onConfirm with deleteFiles:true when the box is checked', async () => {
+    const user = userEvent.setup();
+    const props = renderDialog();
+    await user.click(
+      screen.getByRole('checkbox', { name: /also delete files from disk/i }),
+    );
+    await user.click(
+      screen.getByRole('button', { name: /clear and delete files/i }),
+    );
+    expect(props.onConfirm).toHaveBeenCalledWith(true);
+  });
+
+  it('calls onOpenChange(false) when cancel is clicked', async () => {
+    const user = userEvent.setup();
+    const props = renderDialog();
+    await user.click(screen.getByRole('button', { name: /cancel/i }));
+    expect(props.onOpenChange).toHaveBeenCalledWith(false);
+  });
+});