Compass integration #49
Pull request overview
This pull request integrates support for Compass MAK/DAT cave survey files into the OpenSpeleo Library. The implementation provides bidirectional conversion between Compass format and the OSPL common data format, enabling survey data to be loaded from Compass files, manipulated in OSPL format, and exported back to Compass format.
Changes:
- Added `compass-lib` (v0.0.3) and `utm` (v0.8.1) dependencies for Compass file parsing and UTM coordinate conversion
- Implemented `CompassInterface` with decoding/encoding logic and station name-to-ID mapping
- Modified `Shot` model to allow the `depth` field to be `None` (calculated during GeoJSON propagation)
- Enhanced GeoJSON coordinate propagation to calculate depth from inclination for Compass data
- Added comprehensive test suite for Compass integration with round-trip validation
Reviewed changes
Copilot reviewed 18 out of 19 changed files in this pull request and generated 4 comments.
| File | Description |
|---|---|
| uv.lock | Added compass-lib (0.0.3) and utm (0.8.1) dependencies with their transitive dependencies |
| pyproject.toml | Added compass_lib>=0.0.3,<0.1.0 to project dependencies |
| openspeleo_lib/interfaces/compass/__init__.py | Module initialization exporting CompassInterface |
| openspeleo_lib/interfaces/compass/interface.py | Main CompassInterface class for loading/saving MAK files with loop closure support |
| openspeleo_lib/interfaces/compass/decoding.py | Compass-to-OSPL transformation logic with UTM coordinate conversion |
| openspeleo_lib/interfaces/compass/encoding.py | OSPL-to-Compass transformation logic for round-trip capability |
| openspeleo_lib/interfaces/compass/station_mapper.py | Bidirectional mapping between Compass station names and OSPL integer IDs |
| openspeleo_lib/interfaces/compass/defaults.py | Default values for Compass-to-OSPL conversion |
| openspeleo_lib/interfaces/compass/name_map.py | Documentation of field name mappings (not imported, reference only) |
| openspeleo_lib/models.py | Made Shot.depth field optional (None) and added floating-point tolerance in length_2d calculation |
| openspeleo_lib/geojson.py | Added depth calculation from inclination, enhanced coordinate propagation, added loop closure handling |
| tests/conftest.py | Added Compass file discovery functions and moved decryption to module import time |
| tests/interfaces/compass/*.py | Comprehensive test suite covering interface, decoding, encoding, station mapping, loading, round-trips, and GeoJSON export |
```python
# Small tolerance for floating point comparison (sub-millimeter precision)
EPSILON = 1e-9
```
The `EPSILON` constant is defined locally within the method. Consider moving it to a module-level constant, since it represents a fundamental physical precision (sub-millimeter) that might be reused elsewhere in the codebase. This would improve maintainability if the precision needs to be adjusted globally.
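A minimal sketch of the suggested refactor: `EPSILON` hoisted to module scope and shared by the comparison. The `length_2d` body below is illustrative only, not the library's actual implementation:

```python
import math

# Module-level tolerance for floating point comparison (sub-millimeter
# precision); hoisted so every consumer shares one adjustable value.
EPSILON = 1e-9


def length_2d(length: float, depth_delta: float) -> float:
    """Horizontal projection of a shot (illustrative body, not OSPL's).

    Clamps tiny negative radicands caused by floating-point rounding
    instead of raising a math domain error.
    """
    squared = length**2 - depth_delta**2
    if -EPSILON < squared < 0.0:
        squared = 0.0  # rounding noise on a near-vertical shot
    return math.sqrt(squared)
```

A 3-4-5 shot gives `length_2d(5.0, 3.0) == 4.0`, while a nominally vertical shot whose depth delta overshoots by a few ULPs returns `0.0` instead of raising.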
```python
def discover_compass_with_ospl_json() -> list[pytest.param]:
    """Discover Compass MAK files that have corresponding OSPL JSON baselines.

    Returns:
        List of pytest.param objects with (mak_path, json_path) tuples
    """
    if not PRIVATE_COMPASS_DATA_DIR.exists():
        return []
    params = []
    for mak_file in sorted(PRIVATE_COMPASS_DATA_DIR.glob("*.mak")):
        # Look for .ospl.json baseline files
        json_file = mak_file.with_suffix(".ospl.json")
        if json_file.exists():
            params.append(pytest.param(mak_file, json_file, id=mak_file.stem))
    return params


# =============================================================================
# Pre-computed Parameter Lists (for module-level parametrize decorators)
# =============================================================================

# Decrypt artifacts before discovery (must happen at import time, before parametrize)
_decrypt_artifacts(PRIVATE_ARIANE_DATA_DIR)
_decrypt_artifacts(PRIVATE_COMPASS_DATA_DIR)

# Compass file discovery
ALL_COMPASS_MAK_FILES = discover_compass_mak_files()
COMPASS_WITH_GEOJSON = discover_compass_with_geojson()
COMPASS_WITH_OSPL_JSON = discover_compass_with_ospl_json()
```
The `discover_compass_with_ospl_json` function and the `COMPASS_WITH_OSPL_JSON` variable are defined but never used in any test files. This appears to be infrastructure that was prepared but not yet utilized. Consider either adding tests that use this discovery function or removing it until it's needed, to avoid dead code.
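For reference, a self-contained sketch of the pairing behavior this helper implements, using throwaway temp files (the cave names are made up for illustration): a `.mak` file is kept only when a sibling `.ospl.json` baseline exists.

```python
import tempfile
from pathlib import Path

with tempfile.TemporaryDirectory() as tmp:
    data_dir = Path(tmp)
    # One survey with a baseline, one without.
    (data_dir / "cave_a.mak").write_text("")
    (data_dir / "cave_a.ospl.json").write_text("{}")
    (data_dir / "cave_b.mak").write_text("")  # no baseline -> skipped

    # Same pairing logic as discover_compass_with_ospl_json, minus pytest.
    pairs = [
        (mak, mak.with_suffix(".ospl.json"))
        for mak in sorted(data_dir.glob("*.mak"))
        if mak.with_suffix(".ospl.json").exists()
    ]
    ids = [mak.stem for mak, _ in pairs]

print(ids)  # -> ['cave_a']
```

Wiring this into a parametrized round-trip test, or deleting the helper, would both resolve the dead-code concern.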
```python
from_name = (
    station_names.get(shot.id_start, f"S{shot.id_start}")
    if shot.id_start >= 0
    else f"S{shot.id_stop}O"
)
```
There's a potential naming collision in the synthetic station name generation. When `id_start == -1`, a synthetic origin name `f"S{shot.id_stop}O"` is generated. However, if a user has actually named a station something like `"S1O"`, and there's a shot with `id_start=-1` and `id_stop=1`, both will use `"S1O"` as the `from_name`, causing a collision. Consider a prefix/suffix pattern that is less likely to collide, such as `"_ORIGIN{id_stop}"`, or check the `station_names` dict for potential collisions before generating synthetic names.
Suggested change:

```diff
-from_name = (
-    station_names.get(shot.id_start, f"S{shot.id_start}")
-    if shot.id_start >= 0
-    else f"S{shot.id_stop}O"
-)
+if shot.id_start >= 0:
+    from_name = station_names.get(shot.id_start, f"S{shot.id_start}")
+else:
+    # Generate a synthetic origin station name using a reserved-looking prefix
+    # and ensure it does not collide with any user-defined station names.
+    base_origin_name = f"__ORIGIN_{shot.id_stop}"
+    from_name = base_origin_name
+    existing_names = set(station_names.values())
+    suffix = 1
+    while from_name in existing_names:
+        from_name = f"{base_origin_name}_{suffix}"
+        suffix += 1
```
```python
a.name,
a.latitude,
a.longitude,
a.depth if a.depth is not None else 0.0,
```
The conditional check `a.depth if a.depth is not None else 0.0` is unnecessary here, because `anchor.depth` is guaranteed to be non-None after the initialization block at lines 303-306. By this point, all anchors have had their depth set to `survey.first_start_absolute_elevation` if it was `None`.
Suggested change:

```diff
-a.depth if a.depth is not None else 0.0,
+a.depth,
```