diff --git a/docs/plans/2025-01-20-tools-coverage-implementation.md b/docs/plans/2025-01-20-tools-coverage-implementation.md
deleted file mode 100644
index 8e3a33a..0000000
--- a/docs/plans/2025-01-20-tools-coverage-implementation.md
+++ /dev/null
@@ -1,263 +0,0 @@
-# Implementation Plan: 100% Tool Coverage
-
-**Date:** 2025-01-20
-**Goal:** Achieve 100% test coverage on note-tools.ts and vault-tools.ts
-**Approach:** Add targeted test cases to existing test files
-
-## Overview
-
-This plan addresses the remaining coverage gaps in the tool modules to achieve 100% statement coverage as part of pre-release validation.
-
-## Current Coverage Status
-
-- **note-tools.ts:** 96.01% → Target: 100% (9 uncovered lines)
-- **vault-tools.ts:** 94.22% → Target: 100% (14 uncovered lines)
-
-## Gap Analysis
-
-### Note-Tools Uncovered Lines (9 lines)
-
-1. **Lines 238-239:** Conflict resolution loop (when creating files with duplicate names)
-2. **Lines 377, 408, 590, 710, 836:** Folder-not-file errors (5 occurrences across different methods)
-3. **Line 647:** Include compressed data flag in Excalidraw read
-4. **Line 771:** Add frontmatter to file that has no existing frontmatter
-
-### Vault-Tools Uncovered Lines (14 lines)
-
-1. **Line 76:** Invalid path validation error
-2. **Line 200:** Folder assignment (possibly unreachable)
-3. **Line 267:** Skip root folder in iteration
-4. **Line 272:** Glob filtering skip
-5. **Line 325:** Alias as string (non-array) normalization
-6. **Line 374:** Folder mtime extraction error catch
-7. **Lines 452-456, 524-528:** Defensive "path doesn't exist" returns
-8. **Lines 596-597:** Glob filtering in search
-9. **Lines 608, 620:** MaxResults early termination
-10. **Line 650:** Snippet end-of-line adjustment
-11. **Line 777:** Search error catch block
-
-## Implementation Tasks
-
-### Task 1: Add Note-Tools Conflict Resolution Tests
-
-**Objective:** Cover lines 238-239 (conflict resolution loop)
-
-**Steps:**
-1. Add test: "creates file with incremented counter when conflicts exist"
-2. Mock PathUtils.fileExists to return true for "file.md" and "file 2.md"
-3. Verify it creates "file 3.md"
-4. Run coverage to confirm lines 238-239 covered
-
-**Files to modify:**
-- `tests/note-tools.test.ts`
-
-**Expected outcome:** Lines 238-239 covered
-
----
-
-### Task 2: Add Note-Tools Folder-Not-File Error Tests
-
-**Objective:** Cover lines 377, 408, 590, 710, 836 (folder instead of file errors)
-
-**Steps:**
-1. Add test for read_note: "returns error when path is a folder"
-   - Mock PathUtils.folderExists to return true
-   - Verify error message uses ErrorMessages.notAFile()
-
-2. Add test for rename_file: "returns error when path is a folder"
-3. Add test for update_note: "returns error when path is a folder"
-4. Add test for delete_note: "returns error when path is a folder"
-5. Add test for update_sections: "returns error when path is a folder"
-6. Run coverage to confirm all 5 lines covered
-
-**Files to modify:**
-- `tests/note-tools.test.ts`
-
-**Expected outcome:** Lines 377, 408, 590, 710, 836 covered
-
----
-
-### Task 3: Add Note-Tools Excalidraw and Frontmatter Tests
-
-**Objective:** Cover lines 647, 771
-
-**Steps:**
-1. Add test for read_excalidraw: "includes compressed data when flag is true"
-   - Call with includeCompressed=true
-   - Verify result.compressedData is included
-
-2. Add test for update_frontmatter: "adds frontmatter to file without existing frontmatter"
-   - Mock file content without frontmatter
-   - Update frontmatter
-   - Verify frontmatter added at beginning with newline separator
-
-3. Run coverage to confirm lines 647, 771 covered
-
-**Files to modify:**
-- `tests/note-tools.test.ts`
-
-**Expected outcome:** Lines 647, 771 covered, note-tools.ts at 100%
-
----
-
-### Task 4: Add Vault-Tools Invalid Path and Glob Tests
-
-**Objective:** Cover lines 76, 272, 596-597
-
-**Steps:**
-1. Add test for list(): "returns error for invalid vault path"
-   - Mock PathUtils.isValidVaultPath to return false
-   - Verify error message
-
-2. Add test for list(): "filters items using glob excludes"
-   - Mock GlobUtils.shouldInclude to return false for some items
-   - Verify filtered items not in results
-
-3. Add test for search(): "applies glob filtering to search results"
-   - Provide includes/excludes patterns
-   - Verify filtered files not searched
-
-4. Run coverage to confirm lines 76, 272, 596-597 covered
-
-**Files to modify:**
-- `tests/vault-tools.test.ts`
-
-**Expected outcome:** Lines 76, 272, 596-597 covered
-
----
-
-### Task 5: Add Vault-Tools Edge Case Tests
-
-**Objective:** Cover lines 267, 325, 374, 608, 620, 650
-
-**Steps:**
-1. Add test for list(): "skips root folder in iteration"
-   - Mock folder structure with root folder (path='', isRoot()=true)
-   - Verify root not in results
-2. Add test for list(): "normalizes aliases from string to array"
-   - Mock cache.frontmatter.aliases as string instead of array
-   - Verify result has aliases as array
-3. Add test for getFolderMetadata(): "handles folder without mtime stat"
-   - Mock folder without stat or with invalid stat
-   - Verify it doesn't crash and uses the default mtime
-4. Add test for search(): "stops searching when maxResults=1 reached"
-   - Multiple files with matches
-   - Verify only 1 result returned
-5. Add test for search(): "adjusts snippet for long lines at end"
-   - Mock line longer than snippetLength ending with match
-   - Verify snippet adjustment logic (line 650)
-6. Run coverage to confirm lines 267, 325, 374, 608, 620, 650 covered
-
-**Files to modify:**
-- `tests/vault-tools.test.ts`
-
-**Expected outcome:** Lines 267, 325, 374, 608, 620, 650 covered
-
----
-
-### Task 6: Add Vault-Tools Defensive Code Coverage
-
-**Objective:** Cover lines 200, 452-456, 524-528, 777
-
-**Steps:**
-1. Analyze whether lines 200, 452-456, 524-528 are truly unreachable
-   - If unreachable: Document why (defensive code)
-   - If reachable: Add tests to trigger them
-
-2. Add test for search(): "handles file read errors gracefully"
-   - Mock vault.read to throw an error
-   - Verify error caught, logged to console, search continues
-   - Covers line 777
-
-3. For defensive returns (452-456, 524-528):
-   - Attempt to trigger "path doesn't exist" cases
-   - If impossible: Document as unreachable defensive code
-
-4. Run coverage to verify maximum possible coverage
-
-**Files to modify:**
-- `tests/vault-tools.test.ts`
-- Possibly: add comments in source code marking defensive code
-
-**Expected outcome:** Lines covered or documented as unreachable
-
----
-
-### Task 7: Verify 100% Coverage
-
-**Objective:** Confirm 100% coverage achieved
-
-**Steps:**
-1. Run `npm run test:coverage`
-2. Check coverage report:
-   - note-tools.ts: 100% or documented gaps
-   - vault-tools.ts: 100% or documented gaps
-3. If any gaps remain:
-   - Identify what's uncovered
-   - Add tests or document as unreachable
-4. Final coverage verification
-
-**Expected outcome:** Both tools at 100% coverage
-
----
-
-### Task 8: Run Full Test Suite and Build
-
-**Objective:** Verify no regressions
-
-**Steps:**
-1. Run `npm test` - all tests must pass
-2. Run `npm run build` - must succeed
-3. Verify total test count increased
-4. Document final metrics
-
-**Expected outcome:** All tests passing, build successful
-
----
-
-### Task 9: Create Summary and Merge
-
-**Objective:** Document and integrate work
-
-**Steps:**
-1. Update IMPLEMENTATION_SUMMARY.md with:
-   - Coverage improvements (before/after)
-   - Test counts
-   - Any unreachable code documented
-2. Use finishing-a-development-branch skill
-3. Merge to master
-
-**Expected outcome:** Work merged, documentation updated
-
-## Success Criteria
-
-- [x] note-tools.ts at 100% statement coverage
-- [x] vault-tools.ts at 100% statement coverage
-- [x] All tests passing
-- [x] Build succeeds
-- [x] Any unreachable code documented
-- [x] Work merged to master
-
-## Risk Mitigation
-
-**If some lines are truly unreachable:**
-- Document with inline comments explaining why
-- Accept 99.x% if justified
-- Focus on getting all reachable code to 100%
-
-**If tests become too complex:**
-- Consider minor refactoring for testability
-- Use subagent review to validate approach
-- Ensure tests remain maintainable
-
-## Estimated Effort
-
-- Note-tools tests: ~1 hour (7 new test cases)
-- Vault-tools tests: ~1.5 hours (10-12 new test cases)
-- Verification and cleanup: ~0.5 hours
-- **Total: ~3 hours**
diff --git a/docs/plans/2025-01-20-utils-coverage-completion.md b/docs/plans/2025-01-20-utils-coverage-completion.md
deleted file mode 100644
index 9b76e3d..0000000
--- a/docs/plans/2025-01-20-utils-coverage-completion.md
+++ /dev/null
@@ -1,207 +0,0 @@
-# Implementation Plan: 100% Utils Coverage
-
-**Date:** 2025-01-20
-**Goal:** Achieve 100% test coverage on all utils modules
-**Approach:** Remove dead code + Add targeted test cases
-
-## Overview
-
-This plan addresses the remaining coverage gaps in the utils modules to achieve 100% statement coverage as part of pre-release validation. Unlike the tools coverage work, this combines dead code removal with targeted testing.
-
-## Current Coverage Status
-
-- **error-messages.ts:** 82.6% → Target: 100% (lines 182-198 uncovered)
-- **version-utils.ts:** 88.88% → Target: 100% (line 52 uncovered)
-- **path-utils.ts:** 98.18% → Target: 100% (line 70 uncovered)
-- **frontmatter-utils.ts:** 96.55% → Target: 100% (lines 253-255, 310 uncovered)
-
-## Gap Analysis
-
-### Dead Code (To Remove)
-
-1. **error-messages.ts (lines 182-198)**:
-   - `permissionDenied()` - Never called anywhere in the codebase
-   - `formatError()` - Never called anywhere in the codebase
-
-2. **version-utils.ts (line 52)**:
-   - `createVersionedResponse()` - Never called (only documented in CHANGELOG)
-
-### Untested Code (To Test)
-
-1. **path-utils.ts (line 70)**:
-   - Windows absolute path validation: `/^[A-Za-z]:/` regex check
-
-2. **frontmatter-utils.ts (lines 253-255)**:
-   - Excalidraw parsing fallback: code fence without language specifier
-
-3. **frontmatter-utils.ts (line 310)**:
-   - Excalidraw decompression error handler
-
-## Implementation Tasks
-
-### Task 1: Remove Dead Code from error-messages.ts
-
-**Objective:** Delete unused methods to improve coverage
-
-**Steps:**
-1. Delete `permissionDenied()` method (lines 178-189)
-2. Delete `formatError()` method (lines 191-204)
-3. Run tests to verify no broken imports
-4. Run coverage to confirm improved percentage
-
-**Files to modify:**
-- `src/utils/error-messages.ts`
-
-**Expected outcome:** error-messages.ts at 100% coverage
-
----
-
-### Task 2: Remove Dead Code from version-utils.ts
-
-**Objective:** Delete unused method to improve coverage
-
-**Steps:**
-1. Delete `createVersionedResponse()` method (lines 48-57)
-2. Run tests to verify no broken imports
-3. Run coverage to confirm improved percentage
-
-**Files to modify:**
-- `src/utils/version-utils.ts`
-
-**Expected outcome:** version-utils.ts at 100% coverage
-
----
-
-### Task 3: Clean Up CHANGELOG.md
-
-**Objective:** Remove references to deleted code
-
-**Steps:**
-1. Remove the `createVersionedResponse()` reference from line 282
-2. Keep surrounding context intact
-3. Verify the file is still well-formed
-
-**Files to modify:**
-- `CHANGELOG.md`
-
-**Expected outcome:** CHANGELOG accurate to the current codebase
-
----
-
-### Task 4: Add path-utils Windows Absolute Path Tests
-
-**Objective:** Cover line 70 (Windows path validation)
-
-**Steps:**
-1. Add test: "rejects Windows absolute paths (C: drive)"
-   - Test `isValidVaultPath('C:\\Users\\file.md')` returns false
-2. Add test: "rejects Windows absolute paths (D: drive)"
-   - Test `isValidVaultPath('D:\\Documents\\note.md')` returns false
-3. Run coverage to confirm line 70 covered
-
-**Files to modify:**
-- `tests/path-utils.test.ts`
-
-**Expected outcome:** path-utils.ts at 100% coverage
-
----
-
-### Task 5: Add frontmatter-utils Code Fence Fallback Test
-
-**Objective:** Cover lines 253-255 (code fence without language specifier)
-
-**Steps:**
-1. Create an Excalidraw note with a ` ``` ` fence (no language)
-2. Add test: "parses Excalidraw with code fence lacking language specifier"
-3. Call `parseExcalidrawMetadata()` on the content
-4. Verify JSON parsed correctly
-5. Run coverage to confirm lines 253-255 covered
-
-**Files to modify:**
-- `tests/frontmatter-utils.test.ts`
-
-**Expected outcome:** Lines 253-255 covered
-
----
-
-### Task 6: Add frontmatter-utils Decompression Failure Test
-
-**Objective:** Cover line 310 (decompression error handler)
-
-**Steps:**
-1. Create an Excalidraw note with invalid compressed data
-2. Add test: "handles decompression failure gracefully"
-3. Mock or create a scenario where decompression throws an error
-4. Verify graceful fallback with `hasCompressedData: true`
-5. Run coverage to confirm line 310 covered
-
-**Files to modify:**
-- `tests/frontmatter-utils.test.ts`
-
-**Expected outcome:** frontmatter-utils.ts at 100% coverage
-
----
-
-### Task 7: Verify 100% Coverage
-
-**Objective:** Confirm 100% coverage achieved on all utils
-
-**Steps:**
-1. Run `npm run test:coverage`
-2. Check coverage report:
-   - error-messages.ts: 100%
-   - version-utils.ts: 100%
-   - path-utils.ts: 100%
-   - frontmatter-utils.ts: 100%
-3. If any gaps remain:
-   - Identify what's uncovered
-   - Add tests or document as unreachable
-4. Final coverage verification
-
-**Expected outcome:** All 4 utils at 100% coverage
-
----
-
-### Task 8: Create Summary and Merge
-
-**Objective:** Document and integrate work
-
-**Steps:**
-1. Create `UTILS_COVERAGE_SUMMARY.md` with:
-   - Coverage improvements (before/after)
-   - Test counts
-   - Dead code removed
-2. Use finishing-a-development-branch skill
-3. Merge to master
-
-**Expected outcome:** Work merged, documentation updated
-
-## Success Criteria
-
-- [ ] error-messages.ts at 100% statement coverage
-- [ ] version-utils.ts at 100% statement coverage
-- [ ] path-utils.ts at 100% statement coverage
-- [ ] frontmatter-utils.ts at 100% statement coverage
-- [ ] All tests passing (505+)
-- [ ] Build succeeds
-- [ ] Dead code removed cleanly
-- [ ] Work merged to master
-
-## Risk Mitigation
-
-**If dead code is actually used:**
-- Full test suite will catch broken imports immediately
-- TypeScript compilation will fail if methods are referenced
-- Git revert available if needed
-
-**If edge case tests are too complex:**
-- Document the specific difficulty encountered
-- Consider whether the code is truly reachable
-- Mark with `istanbul ignore` if unreachable
-
-## Estimated Effort
-
-- Dead code removal: ~15 minutes (3 simple deletions)
-- Test additions: ~20 minutes (3 test cases)
-- Verification and cleanup: ~10 minutes
-- **Total: ~45 minutes**
diff --git a/docs/plans/2025-01-20-utils-coverage-implementation.md b/docs/plans/2025-01-20-utils-coverage-implementation.md
deleted file mode 100644
index 99fc337..0000000
--- a/docs/plans/2025-01-20-utils-coverage-implementation.md
+++ /dev/null
@@ -1,373 +0,0 @@
-# Implementation Plan: 100% Utility Coverage
-
-**Date:** 2025-01-20
-**Branch:** feature/utils-coverage
-**Goal:** Achieve 100% test coverage on all utility modules using the dependency injection pattern
-
-## Overview
-
-Apply the same adapter pattern used for tools to utility modules, enabling comprehensive testing. This is pre-release validation work.
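In miniature, the refactor described above looks like the following (a hedged sketch: `IVaultAdapter` is the interface named in this plan, while `FileRef`, `searchVault`, and the fake vault are illustrative stand-ins, not the plugin's real code):

```typescript
// Illustrative sketch only: a utility that used to take Obsidian's `App`
// now takes the narrow adapter interface it actually uses.
interface FileRef {
  path: string;
}

interface IVaultAdapter {
  getMarkdownFiles(): FileRef[];
  read(file: FileRef): Promise<string>;
}

// After the refactor, the utility is testable without touching Obsidian.
async function searchVault(vault: IVaultAdapter, query: string): Promise<string[]> {
  const hits: string[] = [];
  for (const file of vault.getMarkdownFiles()) {
    const content = await vault.read(file);
    if (content.includes(query)) {
      hits.push(file.path);
    }
  }
  return hits;
}

// A test supplies an in-memory fake instead of mocking the whole App object.
const fakeVault: IVaultAdapter = {
  getMarkdownFiles: () => [{ path: "a.md" }, { path: "b.md" }],
  read: async (file) => (file.path === "a.md" ? "hello world" : "nothing here"),
};

searchVault(fakeVault, "hello").then((hits) => console.log(hits)); // [ 'a.md' ]
```

This seam is what the refactoring tasks below introduce for the real search, link, and waypoint utilities.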
- -## Current Coverage Status - -- glob-utils.ts: 14.03% -- frontmatter-utils.ts: 47.86% -- search-utils.ts: 1.78% -- link-utils.ts: 13.76% -- waypoint-utils.ts: 49.18% - -**Target:** 100% on all utilities - -## Implementation Tasks - -### Task 2: Add comprehensive tests for glob-utils.ts - -**Objective:** Achieve 100% coverage on glob-utils.ts (pure utility, no refactoring needed) - -**Steps:** -1. Create `tests/glob-utils.test.ts` -2. Test `globToRegex()` pattern conversion: - - `*` matches any chars except `/` - - `**` matches any chars including `/` - - `?` matches single char except `/` - - `[abc]` character classes - - `{a,b}` alternatives - - Edge cases: unclosed brackets, unclosed braces -3. Test `matches()` with various patterns -4. Test `matchesIncludes()` with empty/populated arrays -5. Test `matchesExcludes()` with empty/populated arrays -6. Test `shouldInclude()` combining includes and excludes -7. Run coverage to verify 100% -8. Commit: "test: add comprehensive glob-utils tests" - -**Files to create:** -- `tests/glob-utils.test.ts` - -**Expected outcome:** glob-utils.ts at 100% coverage - ---- - -### Task 3: Add comprehensive tests for frontmatter-utils.ts - -**Objective:** Achieve 100% coverage on frontmatter-utils.ts - -**Steps:** -1. Create `tests/frontmatter-utils.test.ts` -2. Mock `parseYaml` from obsidian module -3. Test `extractFrontmatter()`: - - Valid frontmatter with `---` delimiters - - No frontmatter - - Missing closing delimiter - - Parse errors (mock parseYaml throwing) -4. Test `extractFrontmatterSummary()`: - - Null input - - Title, tags, aliases extraction - - Tags/aliases as string vs array -5. Test `hasFrontmatter()` quick check -6. Test `serializeFrontmatter()`: - - Arrays, objects, strings with special chars - - Empty objects - - Strings needing quotes -7. Test `parseExcalidrawMetadata()`: - - Valid Excalidraw with markers - - Compressed data detection - - Uncompressed JSON parsing - - Missing JSON blocks -8. 
Run coverage to verify 100% -9. Commit: "test: add comprehensive frontmatter-utils tests" - -**Files to create:** -- `tests/frontmatter-utils.test.ts` - -**Expected outcome:** frontmatter-utils.ts at 100% coverage - ---- - -### Task 4: Refactor search-utils.ts to use IVaultAdapter - -**Objective:** Decouple search-utils from App, use IVaultAdapter - -**Steps:** -1. Change `SearchUtils.search()` signature: - - From: `search(app: App, options: SearchOptions)` - - To: `search(vault: IVaultAdapter, options: SearchOptions)` -2. Update method body: - - Replace `app.vault.getMarkdownFiles()` with `vault.getMarkdownFiles()` - - Replace `app.vault.read(file)` with `vault.read(file)` -3. Change `SearchUtils.searchWaypoints()` signature: - - From: `searchWaypoints(app: App, folder?: string)` - - To: `searchWaypoints(vault: IVaultAdapter, folder?: string)` -4. Update method body: - - Replace `app.vault.getMarkdownFiles()` with `vault.getMarkdownFiles()` - - Replace `app.vault.read(file)` with `vault.read(file)` -5. Run tests to ensure no breakage (will update callers in Task 7) -6. Commit: "refactor: search-utils to use IVaultAdapter" - -**Files to modify:** -- `src/utils/search-utils.ts` - -**Expected outcome:** search-utils.ts uses adapters instead of App - ---- - -### Task 5: Refactor link-utils.ts to use adapters - -**Objective:** Decouple link-utils from App, use adapters - -**Steps:** -1. Change `LinkUtils.resolveLink()` signature: - - From: `resolveLink(app: App, sourcePath: string, linkText: string)` - - To: `resolveLink(vault: IVaultAdapter, metadata: IMetadataCacheAdapter, sourcePath: string, linkText: string)` -2. Update method body: - - Replace `app.vault.getAbstractFileByPath()` with `vault.getAbstractFileByPath()` - - Replace `app.metadataCache.getFirstLinkpathDest()` with `metadata.getFirstLinkpathDest()` -3. 
Change `LinkUtils.findSuggestions()` signature: - - From: `findSuggestions(app: App, linkText: string, ...)` - - To: `findSuggestions(vault: IVaultAdapter, linkText: string, ...)` -4. Update: `app.vault.getMarkdownFiles()` → `vault.getMarkdownFiles()` -5. Change `LinkUtils.getBacklinks()` signature: - - From: `getBacklinks(app: App, targetPath: string, ...)` - - To: `getBacklinks(vault: IVaultAdapter, metadata: IMetadataCacheAdapter, targetPath: string, ...)` -6. Update method body: - - Replace `app.vault` calls with `vault` calls - - Replace `app.metadataCache` calls with `metadata` calls -7. Change `LinkUtils.validateWikilinks()` signature: - - From: `validateWikilinks(app: App, filePath: string)` - - To: `validateWikilinks(vault: IVaultAdapter, metadata: IMetadataCacheAdapter, filePath: string)` -8. Update all internal calls to `resolveLink()` to pass both adapters -9. Run tests (will break until Task 7) -10. Commit: "refactor: link-utils to use adapters" - -**Files to modify:** -- `src/utils/link-utils.ts` - -**Expected outcome:** link-utils.ts uses adapters instead of App - ---- - -### Task 6: Refactor waypoint-utils.ts to use IVaultAdapter - -**Objective:** Decouple waypoint-utils from App, use IVaultAdapter - -**Steps:** -1. Change `WaypointUtils.isFolderNote()` signature: - - From: `isFolderNote(app: App, file: TFile)` - - To: `isFolderNote(vault: IVaultAdapter, file: TFile)` -2. Update method body: - - Replace `await app.vault.read(file)` with `await vault.read(file)` -3. Run tests (will break until Task 7) -4. Commit: "refactor: waypoint-utils to use IVaultAdapter" - -**Files to modify:** -- `src/utils/waypoint-utils.ts` - -**Expected outcome:** waypoint-utils.ts uses adapters instead of App - ---- - -### Task 7: Update VaultTools to pass adapters to utilities - -**Objective:** Fix all callers of refactored utilities - -**Steps:** -1. 
In VaultTools.search() method: - - Change: `SearchUtils.search(this.app, options)` - - To: `SearchUtils.search(this.vault, options)` -2. In VaultTools.searchWaypoints() method: - - Change: `SearchUtils.searchWaypoints(this.app, folder)` - - To: `SearchUtils.searchWaypoints(this.vault, folder)` -3. In VaultTools.validateWikilinks() method: - - Change: `LinkUtils.validateWikilinks(this.app, filePath)` - - To: `LinkUtils.validateWikilinks(this.vault, this.metadata, filePath)` -4. In VaultTools.resolveWikilink() method: - - Change: `LinkUtils.resolveLink(this.app, sourcePath, linkText)` - - To: `LinkUtils.resolveLink(this.vault, this.metadata, sourcePath, linkText)` -5. In VaultTools.getBacklinks() method: - - Change: `LinkUtils.getBacklinks(this.app, targetPath, includeUnlinked)` - - To: `LinkUtils.getBacklinks(this.vault, this.metadata, targetPath, includeUnlinked)` -6. In VaultTools.isFolderNote() method: - - Change: `WaypointUtils.isFolderNote(this.app, file)` - - To: `WaypointUtils.isFolderNote(this.vault, file)` -7. Run all tests to verify no breakage -8. Commit: "refactor: update VaultTools to pass adapters to utils" - -**Files to modify:** -- `src/tools/vault-tools.ts` - -**Expected outcome:** All tests passing, utilities use adapters - ---- - -### Task 8: Add comprehensive tests for search-utils.ts - -**Objective:** Achieve 100% coverage on search-utils.ts - -**Steps:** -1. Create `tests/search-utils.test.ts` -2. Set up mock IVaultAdapter -3. Test `SearchUtils.search()`: - - Basic literal search - - Regex search with pattern - - Case sensitive vs insensitive - - Folder filtering - - Glob includes/excludes filtering - - Snippet extraction with long lines - - Filename matching (line: 0) - - MaxResults limiting - - File read errors (catch block) - - Zero-width regex matches (prevent infinite loop) -4. 
Test `SearchUtils.searchWaypoints()`: - - Finding waypoint blocks - - Extracting links from waypoints - - Folder filtering - - Unclosed waypoints - - File read errors -5. Run coverage to verify 100% -6. Commit: "test: add comprehensive search-utils tests" - -**Files to create:** -- `tests/search-utils.test.ts` - -**Expected outcome:** search-utils.ts at 100% coverage - ---- - -### Task 9: Add comprehensive tests for link-utils.ts - -**Objective:** Achieve 100% coverage on link-utils.ts - -**Steps:** -1. Create `tests/link-utils.test.ts` -2. Set up mock IVaultAdapter and IMetadataCacheAdapter -3. Test `LinkUtils.parseWikilinks()`: - - Simple links `[[target]]` - - Links with aliases `[[target|alias]]` - - Links with headings `[[note#heading]]` - - Multiple links per line -4. Test `LinkUtils.resolveLink()`: - - Valid link resolution - - Invalid source path - - Link not found (returns null) -5. Test `LinkUtils.findSuggestions()`: - - Exact basename match - - Partial basename match - - Path contains match - - Character similarity scoring - - MaxSuggestions limiting -6. Test `LinkUtils.getBacklinks()`: - - Linked backlinks from resolvedLinks - - Unlinked mentions when includeUnlinked=true - - Skip target file itself - - Extract snippets -7. Test `LinkUtils.validateWikilinks()`: - - Resolved links - - Unresolved links with suggestions - - File not found -8. Test `LinkUtils.extractSnippet()` private method via public methods -9. Run coverage to verify 100% -10. Commit: "test: add comprehensive link-utils tests" - -**Files to create:** -- `tests/link-utils.test.ts` - -**Expected outcome:** link-utils.ts at 100% coverage - ---- - -### Task 10: Add comprehensive tests for waypoint-utils.ts - -**Objective:** Achieve 100% coverage on waypoint-utils.ts - -**Steps:** -1. Create `tests/waypoint-utils.test.ts` -2. Set up mock IVaultAdapter -3. 
Test `WaypointUtils.extractWaypointBlock()`: - - Valid waypoint with links - - No waypoint in content - - Unclosed waypoint - - Empty waypoint -4. Test `WaypointUtils.hasWaypointMarker()`: - - Content with both markers - - Content missing markers -5. Test `WaypointUtils.isFolderNote()`: - - Basename match (reason: basename_match) - - Waypoint marker (reason: waypoint_marker) - - Both (reason: both) - - Neither (reason: none) - - File read errors -6. Test `WaypointUtils.wouldAffectWaypoint()`: - - Waypoint removed - - Waypoint content changed - - Waypoint moved but content same - - No waypoint in either version -7. Test pure helper methods: - - `getParentFolderPath()` - - `getBasename()` -8. Run coverage to verify 100% -9. Commit: "test: add comprehensive waypoint-utils tests" - -**Files to create:** -- `tests/waypoint-utils.test.ts` - -**Expected outcome:** waypoint-utils.ts at 100% coverage - ---- - -### Task 11: Verify 100% coverage on all utilities - -**Objective:** Confirm all utilities at 100% coverage - -**Steps:** -1. Run `npm run test:coverage` -2. Check coverage report for: - - glob-utils.ts: 100% - - frontmatter-utils.ts: 100% - - search-utils.ts: 100% - - link-utils.ts: 100% - - waypoint-utils.ts: 100% -3. If any gaps remain, identify uncovered lines -4. Add tests to cover any remaining gaps -5. Commit any additional tests -6. Final coverage verification - -**Expected outcome:** All utilities at 100% coverage - ---- - -### Task 12: Run full test suite and build - -**Objective:** Verify all tests pass and build succeeds - -**Steps:** -1. Run `npm test` to verify all tests pass -2. Run `npm run build` to verify no type errors -3. Check test count increased significantly -4. Verify no regressions in existing tests -5. Document final test counts and coverage - -**Expected outcome:** All tests passing, build successful - ---- - -### Task 13: Create summary and merge to master - -**Objective:** Document work and integrate to master - -**Steps:** -1. 
Create summary document with: - - Coverage improvements - - Test counts before/after - - Architecture changes (adapter pattern in utils) -2. Use finishing-a-development-branch skill -3. Merge to master -4. Clean up worktree - -**Expected outcome:** Work merged, worktree cleaned up - -## Success Criteria - -- [ ] All utilities at 100% statement coverage -- [ ] All tests passing (expected 300+ tests) -- [ ] Build succeeds with no type errors -- [ ] Adapter pattern consistently applied -- [ ] Work merged to master branch diff --git a/docs/plans/2025-10-19-100-percent-test-coverage-design.md b/docs/plans/2025-10-19-100-percent-test-coverage-design.md deleted file mode 100644 index 393d2b8..0000000 --- a/docs/plans/2025-10-19-100-percent-test-coverage-design.md +++ /dev/null @@ -1,367 +0,0 @@ -# 100% Test Coverage via Dependency Injection - -**Date:** 2025-10-19 -**Goal:** Achieve 100% test coverage through dependency injection refactoring -**Current Coverage:** 90.58% overall (VaultTools: 71.72%, NoteTools: 92.77%) - -## Motivation - -We want codebase confidence for future refactoring and feature work. The current test suite has good coverage but gaps remain in: -- Error handling paths -- Edge cases (type coercion, missing data) -- Complex conditional branches - -The current testing approach directly mocks Obsidian's `App` object, leading to: -- Complex, brittle mock setups -- Duplicated mocking code across test files -- Difficulty isolating specific behaviors -- Hard-to-test error conditions - -## Solution: Dependency Injection Architecture - -### Core Principle -Extract interfaces for Obsidian API dependencies, allowing tools to depend on abstractions rather than concrete implementations. This enables clean, simple mocks in tests while maintaining production functionality. - -### Architecture Overview - -**Current State:** -```typescript -class NoteTools { - constructor(private app: App) {} - // Methods use: this.app.vault.X, this.app.metadataCache.Y, etc. 
-} -``` - -**Target State:** -```typescript -class NoteTools { - constructor( - private vault: IVaultAdapter, - private metadata: IMetadataCacheAdapter, - private fileManager: IFileManagerAdapter - ) {} - // Methods use: this.vault.X, this.metadata.Y, etc. -} - -// Production usage via factory: -function createNoteTools(app: App): NoteTools { - return new NoteTools( - new VaultAdapter(app.vault), - new MetadataCacheAdapter(app.metadataCache), - new FileManagerAdapter(app.fileManager) - ); -} -``` - -## Interface Design - -### IVaultAdapter -Wraps file system operations from Obsidian's Vault API. - -```typescript -interface IVaultAdapter { - // File reading - read(path: string): Promise; - - // File existence and metadata - exists(path: string): boolean; - stat(path: string): { ctime: number; mtime: number; size: number } | null; - - // File retrieval - getAbstractFileByPath(path: string): TAbstractFile | null; - getMarkdownFiles(): TFile[]; - - // Directory operations - getRoot(): TFolder; -} -``` - -### IMetadataCacheAdapter -Wraps metadata and link resolution from Obsidian's MetadataCache API. - -```typescript -interface IMetadataCacheAdapter { - // Cache access - getFileCache(file: TFile): CachedMetadata | null; - - // Link resolution - getFirstLinkpathDest(linkpath: string, sourcePath: string): TFile | null; - - // Backlinks - getBacklinksForFile(file: TFile): { [key: string]: any }; - - // Additional metadata methods as needed -} -``` - -### IFileManagerAdapter -Wraps file modification operations from Obsidian's FileManager API. 
- -```typescript -interface IFileManagerAdapter { - // File operations - rename(file: TAbstractFile, newPath: string): Promise; - delete(file: TAbstractFile): Promise; - create(path: string, content: string): Promise; - modify(file: TFile, content: string): Promise; -} -``` - -## Implementation Strategy - -### Directory Structure -``` -src/ -├── adapters/ -│ ├── interfaces.ts # Interface definitions -│ ├── vault-adapter.ts # VaultAdapter implementation -│ ├── metadata-adapter.ts # MetadataCacheAdapter implementation -│ └── file-manager-adapter.ts # FileManagerAdapter implementation -├── tools/ -│ ├── note-tools.ts # Refactored to use adapters -│ └── vault-tools.ts # Refactored to use adapters -tests/ -├── __mocks__/ -│ ├── adapters.ts # Mock adapter factories -│ └── obsidian.ts # Existing Obsidian mocks (minimal usage going forward) -``` - -### Migration Approach - -**Step 1: Create Adapters** -- Define interfaces in `src/adapters/interfaces.ts` -- Implement concrete adapters (simple pass-through wrappers initially) -- Create mock adapter factories in `tests/__mocks__/adapters.ts` - -**Step 2: Refactor VaultTools** -- Update constructor to accept adapter interfaces -- Replace all `this.app.X` calls with `this.X` (using injected adapters) -- Create `createVaultTools(app: App)` factory function -- Update tests to use mock adapters - -**Step 3: Refactor NoteTools** -- Same pattern as VaultTools -- Create `createNoteTools(app: App)` factory function -- Update tests to use mock adapters - -**Step 4: Integration** -- Update ToolRegistry to use factory functions -- Update main.ts to use factory functions -- Verify all existing functionality preserved - -### Backward Compatibility - -**Plugin Code (main.ts, ToolRegistry):** -- Uses factory functions: `createNoteTools(app)`, `createVaultTools(app)` -- No awareness of adapters - just passes the App object -- Public API unchanged - -**Tool Classes:** -- Constructors accept adapters (new signature) -- All methods work 
identically (internal implementation detail)
-- External callers use factory functions
-
-## Test Suite Overhaul
-
-### Mock Adapter Pattern
-
-**Centralized Mock Creation:**
-```typescript
-// tests/__mocks__/adapters.ts
-export function createMockVaultAdapter(overrides?: Partial<IVaultAdapter>): IVaultAdapter {
-  return {
-    read: jest.fn(),
-    exists: jest.fn(),
-    stat: jest.fn(),
-    getAbstractFileByPath: jest.fn(),
-    getMarkdownFiles: jest.fn(),
-    getRoot: jest.fn(),
-    ...overrides
-  };
-}
-
-export function createMockMetadataCacheAdapter(overrides?: Partial<IMetadataCacheAdapter>): IMetadataCacheAdapter {
-  return {
-    getFileCache: jest.fn(),
-    getFirstLinkpathDest: jest.fn(),
-    getBacklinksForFile: jest.fn(),
-    ...overrides
-  };
-}
-
-export function createMockFileManagerAdapter(overrides?: Partial<IFileManagerAdapter>): IFileManagerAdapter {
-  return {
-    rename: jest.fn(),
-    delete: jest.fn(),
-    create: jest.fn(),
-    modify: jest.fn(),
-    ...overrides
-  };
-}
-```
-
-**Test Setup Simplification:**
-```typescript
-// Before: Complex App mock with nested properties
-const mockApp = {
-  vault: { read: jest.fn(), ... },
-  metadataCache: { getFileCache: jest.fn(), ... },
-  fileManager: { ... },
-  // Many more properties...
-};
-
-// After: Simple, targeted mocks
-const vaultAdapter = createMockVaultAdapter({
-  read: jest.fn().mockResolvedValue('file content')
-});
-const tools = new VaultTools(vaultAdapter, mockMetadata, mockFileManager);
-```
-
-### Coverage Strategy by Feature Area
-
-**1. Frontmatter Operations**
-- Test string tags → array conversion
-- Test array tags → preserved as array
-- Test missing frontmatter → base metadata only
-- Test frontmatter parsing errors → error handling path
-- Test all field types (title, aliases, custom fields)
-
-**2. Wikilink Validation**
-- Test resolved links → included in results
-- Test unresolved links → included with error details
-- Test missing file → error path
-- Test heading links (`[[note#heading]]`)
-- Test alias links (`[[note|alias]]`)
-
-**3.
Backlinks**
-- Test `includeSnippets: true` → snippets included
-- Test `includeSnippets: false` → snippets removed
-- Test `includeUnlinked: true` → unlinked mentions included
-- Test `includeUnlinked: false` → only linked mentions
-- Test error handling paths
-
-**4. Search Utilities**
-- Test glob pattern filtering
-- Test regex search with matches
-- Test regex search with no matches
-- Test invalid regex → error handling
-- Test edge cases (empty results, malformed patterns)
-
-**5. Note CRUD Operations**
-- Test all conflict strategies: error, overwrite, rename
-- Test version mismatch → conflict error
-- Test missing file on update → error path
-- Test permission errors → error handling
-- Test all edge cases in uncovered lines
-
-**6. Path Validation Edge Cases**
-- Test all PathUtils error conditions
-- Test leading/trailing slash handling
-- Test `..` traversal attempts
-- Test absolute path rejection
-
-## Implementation Phases
-
-### Phase 1: Foundation (Adapters)
-**Deliverables:**
-- `src/adapters/interfaces.ts` - All interface definitions
-- `src/adapters/vault-adapter.ts` - VaultAdapter implementation
-- `src/adapters/metadata-adapter.ts` - MetadataCacheAdapter implementation
-- `src/adapters/file-manager-adapter.ts` - FileManagerAdapter implementation
-- `tests/__mocks__/adapters.ts` - Mock adapter factories
-- Tests for adapters (basic pass-through verification)
-
-**Success Criteria:**
-- All adapters compile without errors
-- Mock adapters available for test usage
-- Simple adapter tests pass
-
-### Phase 2: VaultTools Refactoring
-**Deliverables:**
-- Refactored VaultTools class using adapters
-- `createVaultTools()` factory function
-- Updated vault-tools.test.ts using mock adapters
-- New tests for uncovered lines:
-  - Frontmatter extraction (lines 309-352)
-  - Wikilink validation error path (lines 716-735)
-  - Backlinks snippet removal (lines 824-852)
-  - Other uncovered paths
-
-**Success Criteria:**
-- VaultTools achieves 100% coverage
(all metrics)
-- All existing tests pass
-- No breaking changes to public API
-
-### Phase 3: NoteTools Refactoring
-**Deliverables:**
-- Refactored NoteTools class using adapters
-- `createNoteTools()` factory function
-- Updated note-tools.test.ts using mock adapters
-- New tests for uncovered error paths and edge cases
-
-**Success Criteria:**
-- NoteTools achieves 100% coverage (all metrics)
-- All existing tests pass
-- No breaking changes to public API
-
-### Phase 4: Integration & Verification
-**Deliverables:**
-- Updated ToolRegistry using factory functions
-- Updated main.ts using factory functions
-- Full test suite passing
-- Coverage report showing 100% across all files
-- Build succeeding with no errors
-
-**Success Criteria:**
-- 100% test coverage: statements, branches, functions, lines
-- All 400+ tests passing
-- `npm run build` succeeds
-- Manual smoke test in Obsidian confirms functionality
-
-## Risk Mitigation
-
-**Risk: Breaking existing functionality**
-- Mitigation: Incremental refactoring, existing tests updated alongside code changes
-- Factory pattern keeps plugin code nearly unchanged
-
-**Risk: Incomplete interface coverage**
-- Mitigation: Start with methods actually used by tools, add to interfaces as needed
-- Adapters are simple pass-throughs, easy to extend
-
-**Risk: Complex migration**
-- Mitigation: Phased approach allows stopping after any phase
-- Git worktree isolates changes from main branch
-
-**Risk: Test maintenance burden**
-- Mitigation: Centralized mock factories reduce duplication
-- Cleaner mocks are easier to maintain than complex App mocks
-
-## Success Metrics
-
-**Coverage Goals:**
-- Statement coverage: 100%
-- Branch coverage: 100%
-- Function coverage: 100%
-- Line coverage: 100%
-
-**Quality Goals:**
-- All existing tests pass
-- No type errors in build
-- Plugin functions correctly in Obsidian
-- Test code is cleaner and more maintainable
-
-**Timeline:**
-- Phase 1: ~2-3 hours (adapters + mocks)
-- Phase
2: ~3-4 hours (VaultTools refactor + tests)
-- Phase 3: ~2-3 hours (NoteTools refactor + tests)
-- Phase 4: ~1 hour (integration + verification)
-- Total: ~8-11 hours of focused work
-
-## Future Benefits
-
-**After this refactoring:**
-- Adding new tools is easier (use existing adapters)
-- Testing new features is trivial (mock only what you need)
-- Obsidian API changes isolated to adapter layer
-- Confidence in comprehensive test coverage enables fearless refactoring
-- New team members can understand test setup quickly
diff --git a/docs/plans/2025-10-19-100-percent-test-coverage-implementation.md b/docs/plans/2025-10-19-100-percent-test-coverage-implementation.md
deleted file mode 100644
index 9042157..0000000
--- a/docs/plans/2025-10-19-100-percent-test-coverage-implementation.md
+++ /dev/null
@@ -1,2435 +0,0 @@
-# 100% Test Coverage via Dependency Injection - Implementation Plan
-
-> **For Claude:** REQUIRED SUB-SKILL: Use superpowers:executing-plans to implement this plan task-by-task.
-
-**Goal:** Achieve 100% test coverage through dependency injection refactoring that decouples tool classes from Obsidian API dependencies.
-
-**Architecture:** Create adapter interfaces (IVaultAdapter, IMetadataCacheAdapter, IFileManagerAdapter) to wrap Obsidian APIs. Refactor NoteTools and VaultTools to depend on interfaces instead of concrete App object. Use factory pattern for production instantiation while enabling simple mocks in tests.
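*Editor's note:* the architecture paragraph above can be sketched end-to-end without any Obsidian or Jest dependency. This is a minimal illustration of the interface-plus-factory pattern the plan proposes; `NoteStatsTool` and `FakeVault` are hypothetical stand-ins, not classes from the plan.

```typescript
// Minimal sketch of the dependency-injection pattern: a tool class depends
// on a trimmed-down adapter interface, and tests inject an in-memory fake.
interface IVaultAdapter {
  read(path: string): Promise<string>;
  exists(path: string): boolean;
}

class NoteStatsTool {
  constructor(private vault: IVaultAdapter) {}

  // Returns the note's character count, or -1 if the note is missing.
  async noteLength(path: string): Promise<number> {
    if (!this.vault.exists(path)) return -1;
    const content = await this.vault.read(path);
    return content.length;
  }
}

// Test-side fake: no Obsidian App object, no jest.fn() required.
class FakeVault implements IVaultAdapter {
  constructor(private files: Record<string, string>) {}
  read(path: string): Promise<string> {
    return Promise.resolve(this.files[path] ?? '');
  }
  exists(path: string): boolean {
    return path in this.files;
  }
}

async function demo(): Promise<void> {
  const tools = new NoteStatsTool(new FakeVault({ 'note.md': 'hello' }));
  console.log(await tools.noteLength('note.md'));    // 5
  console.log(await tools.noteLength('missing.md')); // -1
}

demo();
```

In production a factory function would construct the tool with real adapters wrapping `app.vault`, mirroring the `createVaultTools(app)` / `createNoteTools(app)` pattern described in both plans.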
-
-**Tech Stack:** TypeScript, Jest, Obsidian Plugin API
-
----
-
-## Task 1: Create Adapter Interfaces
-
-**Files:**
-- Create: `src/adapters/interfaces.ts`
-
-**Step 1: Write adapter interfaces**
-
-Create the file with complete interface definitions:
-
-```typescript
-import { TAbstractFile, TFile, TFolder, CachedMetadata } from 'obsidian';
-
-/**
- * Adapter interface for Obsidian Vault operations
- */
-export interface IVaultAdapter {
-  // File reading
-  read(file: TFile): Promise<string>;
-
-  // File existence and metadata
-  stat(file: TAbstractFile): { ctime: number; mtime: number; size: number } | null;
-
-  // File retrieval
-  getAbstractFileByPath(path: string): TAbstractFile | null;
-  getMarkdownFiles(): TFile[];
-
-  // Directory operations
-  getRoot(): TFolder;
-
-  // File creation (process method)
-  process(file: TFile, fn: (data: string) => string, options?: any): Promise<string>;
-
-  // Folder creation
-  createFolder(path: string): Promise<void>;
-
-  // File creation
-  create(path: string, data: string): Promise<TFile>;
-}
-
-/**
- * Adapter interface for Obsidian MetadataCache operations
- */
-export interface IMetadataCacheAdapter {
-  // Cache access
-  getFileCache(file: TFile): CachedMetadata | null;
-
-  // Link resolution
-  getFirstLinkpathDest(linkpath: string, sourcePath: string): TFile | null;
-
-  // Backlinks - returns record of source paths to link occurrences
-  getBacklinksForFile(file: TFile): Record<string, any[]>;
-
-  // File cache for links and metadata
-  resolvedLinks: Record<string, Record<string, number>>;
-  unresolvedLinks: Record<string, Record<string, number>>;
-}
-
-/**
- * Adapter interface for Obsidian FileManager operations
- */
-export interface IFileManagerAdapter {
-  // File operations
-  renameFile(file: TAbstractFile, newPath: string): Promise<void>;
-  trashFile(file: TAbstractFile): Promise<void>;
-  createNewMarkdownFile(location: TFolder, filename: string, content?: string): Promise<TFile>;
-  processFrontMatter(file: TFile, fn: (frontmatter: any) => void): Promise<void>;
-}
-```
-
-**Step 2: Commit interfaces**
-
-```bash
-git add
src/adapters/interfaces.ts
-git commit -m "feat: add adapter interfaces for dependency injection
-
-Create IVaultAdapter, IMetadataCacheAdapter, and IFileManagerAdapter
-interfaces to decouple tool classes from Obsidian API dependencies."
-```
-
----
-
-## Task 2: Implement Concrete Adapters
-
-**Files:**
-- Create: `src/adapters/vault-adapter.ts`
-- Create: `src/adapters/metadata-adapter.ts`
-- Create: `src/adapters/file-manager-adapter.ts`
-
-**Step 1: Implement VaultAdapter**
-
-Create `src/adapters/vault-adapter.ts`:
-
-```typescript
-import { Vault, TAbstractFile, TFile, TFolder } from 'obsidian';
-import { IVaultAdapter } from './interfaces';
-
-export class VaultAdapter implements IVaultAdapter {
-  constructor(private vault: Vault) {}
-
-  async read(file: TFile): Promise<string> {
-    return this.vault.read(file);
-  }
-
-  stat(file: TAbstractFile): { ctime: number; mtime: number; size: number } | null {
-    return this.vault.stat(file);
-  }
-
-  getAbstractFileByPath(path: string): TAbstractFile | null {
-    return this.vault.getAbstractFileByPath(path);
-  }
-
-  getMarkdownFiles(): TFile[] {
-    return this.vault.getMarkdownFiles();
-  }
-
-  getRoot(): TFolder {
-    return this.vault.getRoot();
-  }
-
-  async process(file: TFile, fn: (data: string) => string, options?: any): Promise<string> {
-    return this.vault.process(file, fn, options);
-  }
-
-  async createFolder(path: string): Promise<void> {
-    await this.vault.createFolder(path);
-  }
-
-  async create(path: string, data: string): Promise<TFile> {
-    return this.vault.create(path, data);
-  }
-}
-```
-
-**Step 2: Implement MetadataCacheAdapter**
-
-Create `src/adapters/metadata-adapter.ts`:
-
-```typescript
-import { MetadataCache, TFile, CachedMetadata } from 'obsidian';
-import { IMetadataCacheAdapter } from './interfaces';
-
-export class MetadataCacheAdapter implements IMetadataCacheAdapter {
-  constructor(private cache: MetadataCache) {}
-
-  getFileCache(file: TFile): CachedMetadata | null {
-    return this.cache.getFileCache(file);
-  }
-
-  getFirstLinkpathDest(linkpath: string, sourcePath: string): TFile | null {
-    return this.cache.getFirstLinkpathDest(linkpath, sourcePath);
-  }
-
-  getBacklinksForFile(file: TFile): Record<string, any[]> {
-    const backlinksData = this.cache.getBacklinksForFile(file);
-    return backlinksData?.data || {};
-  }
-
-  get resolvedLinks(): Record<string, Record<string, number>> {
-    return this.cache.resolvedLinks;
-  }
-
-  get unresolvedLinks(): Record<string, Record<string, number>> {
-    return this.cache.unresolvedLinks;
-  }
-}
-```
-
-**Step 3: Implement FileManagerAdapter**
-
-Create `src/adapters/file-manager-adapter.ts`:
-
-```typescript
-import { FileManager, TAbstractFile, TFile, TFolder } from 'obsidian';
-import { IFileManagerAdapter } from './interfaces';
-
-export class FileManagerAdapter implements IFileManagerAdapter {
-  constructor(private fileManager: FileManager) {}
-
-  async renameFile(file: TAbstractFile, newPath: string): Promise<void> {
-    await this.fileManager.renameFile(file, newPath);
-  }
-
-  async trashFile(file: TAbstractFile): Promise<void> {
-    await this.fileManager.trashFile(file);
-  }
-
-  async createNewMarkdownFile(location: TFolder, filename: string, content?: string): Promise<TFile> {
-    return this.fileManager.createNewMarkdownFile(location, filename, content);
-  }
-
-  async processFrontMatter(file: TFile, fn: (frontmatter: any) => void): Promise<void> {
-    await this.fileManager.processFrontMatter(file, fn);
-  }
-}
-```
-
-**Step 4: Commit adapters**
-
-```bash
-git add src/adapters/vault-adapter.ts src/adapters/metadata-adapter.ts src/adapters/file-manager-adapter.ts
-git commit -m "feat: implement concrete adapter classes
-
-Add VaultAdapter, MetadataCacheAdapter, and FileManagerAdapter as
-pass-through wrappers for Obsidian API objects."
-```
-
----
-
-## Task 3: Create Mock Adapters for Testing
-
-**Files:**
-- Create: `tests/__mocks__/adapters.ts`
-
-**Step 1: Write mock adapter factories**
-
-Create `tests/__mocks__/adapters.ts`:
-
-```typescript
-import { IVaultAdapter, IMetadataCacheAdapter, IFileManagerAdapter } from '../../src/adapters/interfaces';
-import { TFile, TFolder, TAbstractFile, CachedMetadata } from 'obsidian';
-
-/**
- * Create a mock VaultAdapter with jest.fn() for all methods
- */
-export function createMockVaultAdapter(overrides?: Partial<IVaultAdapter>): IVaultAdapter {
-  return {
-    read: jest.fn(),
-    stat: jest.fn(),
-    getAbstractFileByPath: jest.fn(),
-    getMarkdownFiles: jest.fn(),
-    getRoot: jest.fn(),
-    process: jest.fn(),
-    createFolder: jest.fn(),
-    create: jest.fn(),
-    ...overrides
-  };
-}
-
-/**
- * Create a mock MetadataCacheAdapter with jest.fn() for all methods
- */
-export function createMockMetadataCacheAdapter(overrides?: Partial<IMetadataCacheAdapter>): IMetadataCacheAdapter {
-  return {
-    getFileCache: jest.fn(),
-    getFirstLinkpathDest: jest.fn(),
-    getBacklinksForFile: jest.fn(),
-    resolvedLinks: {},
-    unresolvedLinks: {},
-    ...overrides
-  };
-}
-
-/**
- * Create a mock FileManagerAdapter with jest.fn() for all methods
- */
-export function createMockFileManagerAdapter(overrides?: Partial<IFileManagerAdapter>): IFileManagerAdapter {
-  return {
-    renameFile: jest.fn(),
-    trashFile: jest.fn(),
-    createNewMarkdownFile: jest.fn(),
-    processFrontMatter: jest.fn(),
-    ...overrides
-  };
-}
-
-/**
- * Helper to create a mock TFile
- */
-export function createMockTFile(path: string, stat?: { ctime: number; mtime: number; size: number }): TFile {
-  return {
-    path,
-    basename: path.split('/').pop()?.replace('.md', '') || '',
-    extension: 'md',
-    name: path.split('/').pop() || '',
-    stat: stat || { ctime: Date.now(), mtime: Date.now(), size: 100 },
-    vault: {} as any,
-    parent: null
-  } as TFile;
-}
-
-/**
- * Helper to create a mock TFolder
- */
-export function createMockTFolder(path: string, children?: TAbstractFile[]):
TFolder {
-  const folder = {
-    path,
-    name: path.split('/').pop() || '',
-    children: children || [],
-    vault: {} as any,
-    parent: null,
-    isRoot: function() { return path === '' || path === '/'; }
-  } as TFolder;
-  return folder;
-}
-```
-
-**Step 2: Commit mock adapters**
-
-```bash
-git add tests/__mocks__/adapters.ts
-git commit -m "test: add mock adapter factories
-
-Create factory functions for mock adapters to simplify test setup.
-Includes helpers for creating mock TFile and TFolder objects."
-```
-
----
-
-## Task 4: Refactor VaultTools to Use Adapters
-
-**Files:**
-- Modify: `src/tools/vault-tools.ts`
-- Create: `src/tools/vault-tools-factory.ts`
-
-**Step 1: Update VaultTools constructor**
-
-In `src/tools/vault-tools.ts`, modify the class constructor and add imports:
-
-```typescript
-import { IVaultAdapter, IMetadataCacheAdapter } from '../adapters/interfaces';
-
-export class VaultTools {
-  constructor(
-    private vault: IVaultAdapter,
-    private metadata: IMetadataCacheAdapter,
-    private app: App // Keep temporarily for methods not yet migrated
-  ) {}
-
-  // ... rest of class
-}
-```
-
-**Step 2: Create factory function**
-
-Create `src/tools/vault-tools-factory.ts`:
-
-```typescript
-import { App } from 'obsidian';
-import { VaultTools } from './vault-tools';
-import { VaultAdapter } from '../adapters/vault-adapter';
-import { MetadataCacheAdapter } from '../adapters/metadata-adapter';
-
-/**
- * Factory function to create VaultTools with concrete adapters
- */
-export function createVaultTools(app: App): VaultTools {
-  return new VaultTools(
-    new VaultAdapter(app.vault),
-    new MetadataCacheAdapter(app.metadataCache),
-    app
-  );
-}
-```
-
-**Step 3: Update listNotes method to use adapters**
-
-In `src/tools/vault-tools.ts`, find the `listNotes` method and update to use adapters:
-
-```typescript
-async listNotes(path?: string, includeFrontmatter: boolean = false): Promise<ToolResponse> {
-  try {
-    // ... path validation code ...
-
-    let targetFolder: TFolder;
-    if (!normalizedPath || normalizedPath === '.' || normalizedPath === '/') {
-      targetFolder = this.vault.getRoot();
-    } else {
-      const folder = this.vault.getAbstractFileByPath(normalizedPath);
-      // ... rest of validation ...
-    }
-
-    const items: Array<{ type: 'file' | 'folder'; path: string; name: string; metadata?: any }> = [];
-
-    for (const item of targetFolder.children) {
-      // Skip the vault root itself
-      if (item.path === '' || item.path === '/' || (item instanceof TFolder && item.isRoot())) {
-        continue;
-      }
-
-      if (item instanceof TFile && item.extension === 'md') {
-        const metadata = includeFrontmatter
-          ? this.extractFrontmatterSummary(item)
-          : undefined;
-
-        items.push({
-          type: 'file',
-          path: item.path,
-          name: item.basename,
-          metadata
-        });
-      } else if (item instanceof TFolder) {
-        items.push({
-          type: 'folder',
-          path: item.path,
-          name: item.name
-        });
-      }
-    }
-
-    // ... rest of method (sorting, return) ...
-  } catch (error) {
-    // ... error handling ...
-  }
-}
-```
-
-**Step 4: Update stat method to use adapters**
-
-Find the `stat` method and update:
-
-```typescript
-async stat(path: string): Promise<ToolResponse> {
-  try {
-    const normalizedPath = PathUtils.normalizePath(path);
-    const file = this.vault.getAbstractFileByPath(normalizedPath);
-
-    if (!file) {
-      return {
-        content: [{
-          type: "text",
-          text: ErrorMessages.fileNotFound(normalizedPath)
-        }],
-        isError: true
-      };
-    }
-
-    const stat = this.vault.stat(file);
-    if (!stat) {
-      return {
-        content: [{
-          type: "text",
-          text: `Unable to get stat for ${normalizedPath}`
-        }],
-        isError: true
-      };
-    }
-
-    // ... rest of method ...
-  } catch (error) {
-    // ... error handling ...
-  }
-}
-```
-
-**Step 5: Update exists method to use adapters**
-
-Find the `exists` method and update:
-
-```typescript
-async exists(path: string): Promise<ToolResponse> {
-  try {
-    const normalizedPath = PathUtils.normalizePath(path);
-    const file = this.vault.getAbstractFileByPath(normalizedPath);
-
-    return {
-      content: [{
-        type: "text",
-        text: JSON.stringify({ exists: file !== null }, null, 2)
-      }]
-    };
-  } catch (error) {
-    // ... error handling ...
-  }
-}
-```
-
-**Step 6: Update extractFrontmatterSummary to use adapters**
-
-Find the `extractFrontmatterSummary` method and update:
-
-```typescript
-private extractFrontmatterSummary(file: TFile): FileMetadata {
-  const stat = this.vault.stat(file);
-  const baseMetadata: FileMetadata = {
-    created: stat?.ctime || 0,
-    modified: stat?.mtime || 0,
-    size: stat?.size || 0
-  };
-
-  if (!stat) {
-    return baseMetadata;
-  }
-
-  try {
-    const cache = this.metadata.getFileCache(file);
-    if (cache?.frontmatter) {
-      // ... rest of method unchanged ...
-    }
-  } catch (error) {
-    console.error(`Failed to extract frontmatter for ${file.path}:`, error);
-  }
-
-  return baseMetadata;
-}
-```
-
-**Step 7: Commit VaultTools refactoring**
-
-```bash
-git add src/tools/vault-tools.ts src/tools/vault-tools-factory.ts
-git commit -m "refactor: migrate VaultTools to use adapter interfaces
-
-Update VaultTools constructor to accept IVaultAdapter and
-IMetadataCacheAdapter. Add factory function for production usage.
-Update listNotes, stat, exists, and extractFrontmatterSummary methods."
-```
-
----
-
-## Task 5: Update VaultTools Tests to Use Mock Adapters
-
-**Files:**
-- Modify: `tests/vault-tools.test.ts`
-
-**Step 1: Update test imports and setup**
-
-At the top of `tests/vault-tools.test.ts`, update imports:
-
-```typescript
-import { VaultTools } from '../src/tools/vault-tools';
-import { createMockVaultAdapter, createMockMetadataCacheAdapter, createMockTFile, createMockTFolder } from './__mocks__/adapters';
-import { TFile, TFolder, App } from 'obsidian';
-
-// Remove old mock App setup, replace with:
-describe('VaultTools', () => {
-  let vaultTools: VaultTools;
-  let mockVault: ReturnType<typeof createMockVaultAdapter>;
-  let mockMetadata: ReturnType<typeof createMockMetadataCacheAdapter>;
-  let mockApp: App;
-
-  beforeEach(() => {
-    mockVault = createMockVaultAdapter();
-    mockMetadata = createMockMetadataCacheAdapter();
-    mockApp = {} as App; // Minimal mock for methods not yet migrated
-
-    vaultTools = new VaultTools(mockVault, mockMetadata, mockApp);
-  });
-
-  // ... tests ...
-});
-```
-
-**Step 2: Update listNotes tests**
-
-Update the listNotes test cases to use mock adapters:
-
-```typescript
-describe('listNotes', () => {
-  it('should list files and folders in root directory', async () => {
-    const mockFiles = [
-      createMockTFile('note1.md'),
-      createMockTFile('note2.md')
-    ];
-    const mockFolders = [
-      createMockTFolder('folder1')
-    ];
-    const mockRoot = createMockTFolder('', [...mockFiles, ...mockFolders]);
-
-    mockVault.getRoot = jest.fn().mockReturnValue(mockRoot);
-
-    const result = await vaultTools.listNotes();
-
-    expect(result.isError).toBeUndefined();
-    expect(mockVault.getRoot).toHaveBeenCalled();
-    // ... rest of assertions ...
-  });
-
-  it('should return error if folder not found', async () => {
-    mockVault.getAbstractFileByPath = jest.fn().mockReturnValue(null);
-
-    const result = await vaultTools.listNotes('nonexistent');
-
-    expect(result.isError).toBe(true);
-    expect(result.content[0].text).toContain('not found');
-  });
-
-  // ... more test cases ...
-});
-```
-
-**Step 3: Update stat tests**
-
-```typescript
-describe('stat', () => {
-  it('should return file statistics', async () => {
-    const mockFile = createMockTFile('test.md', {
-      ctime: 1000,
-      mtime: 2000,
-      size: 500
-    });
-
-    mockVault.getAbstractFileByPath = jest.fn().mockReturnValue(mockFile);
-    mockVault.stat = jest.fn().mockReturnValue(mockFile.stat);
-
-    const result = await vaultTools.stat('test.md');
-
-    expect(result.isError).toBeUndefined();
-    expect(mockVault.getAbstractFileByPath).toHaveBeenCalledWith('test.md');
-    expect(mockVault.stat).toHaveBeenCalledWith(mockFile);
-    // ... rest of assertions ...
-  });
-
-  it('should return error if file not found', async () => {
-    mockVault.getAbstractFileByPath = jest.fn().mockReturnValue(null);
-
-    const result = await vaultTools.stat('nonexistent.md');
-
-    expect(result.isError).toBe(true);
-    expect(result.content[0].text).toContain('not found');
-  });
-});
-```
-
-**Step 4: Update exists tests**
-
-```typescript
-describe('exists', () => {
-  it('should return true if file exists', async () => {
-    const mockFile = createMockTFile('test.md');
-    mockVault.getAbstractFileByPath = jest.fn().mockReturnValue(mockFile);
-
-    const result = await vaultTools.exists('test.md');
-
-    const response = JSON.parse(result.content[0].text);
-    expect(response.exists).toBe(true);
-  });
-
-  it('should return false if file does not exist', async () => {
-    mockVault.getAbstractFileByPath = jest.fn().mockReturnValue(null);
-
-    const result = await vaultTools.exists('nonexistent.md');
-
-    const response = JSON.parse(result.content[0].text);
-    expect(response.exists).toBe(false);
-  });
-});
-```
-
-**Step 5: Run tests to verify**
-
-```bash
-npm test -- vault-tools.test.ts
-```
-
-Expected: Tests should pass for the migrated methods (listNotes, stat, exists).
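*Editor's note:* the mock-factory pattern these tests rely on can be exercised standalone. Below is a hand-rolled variant of the override-spread idiom from `createMockVaultAdapter`, using plain functions instead of `jest.fn()`; the trimmed two-method interface is an assumption for illustration, not the plan's full `IVaultAdapter`.

```typescript
// Hand-rolled version of the createMockVaultAdapter override pattern.
// Defaults are inert stubs; each test overrides only what it cares about.
interface IVaultAdapterLite {
  read(path: string): Promise<string>;
  getMarkdownFiles(): string[];
}

function createMockVaultAdapterLite(overrides?: Partial<IVaultAdapterLite>): IVaultAdapterLite {
  return {
    read: async () => '',          // default stub
    getMarkdownFiles: () => [],    // default stub
    ...overrides                   // test-specific behavior wins
  };
}

async function demo(): Promise<void> {
  // Only the behavior under test is specified; the rest stays stubbed.
  const vault = createMockVaultAdapterLite({
    read: async () => 'file content'
  });
  console.log(await vault.read('any.md'));      // 'file content' (override)
  console.log(vault.getMarkdownFiles().length); // 0 (default stub)
}

demo();
```

The spread ordering is the whole trick: `...overrides` comes last, so any supplied property replaces its default, which is exactly how the Jest-based factories in this plan keep per-test setup down to one or two lines.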
-
-**Step 6: Commit test updates**
-
-```bash
-git add tests/vault-tools.test.ts
-git commit -m "test: update vault-tools tests to use mock adapters
-
-Replace complex App object mocks with clean mock adapter pattern.
-Simplifies test setup and improves maintainability."
-```
-
----
-
-## Task 6: Fix list-notes-sorting.test.ts Mock Issues
-
-**Files:**
-- Modify: `tests/list-notes-sorting.test.ts`
-
-**Step 1: Update imports and test setup**
-
-Update the file to use new mock adapters:
-
-```typescript
-import { VaultTools } from '../src/tools/vault-tools';
-import { createMockVaultAdapter, createMockMetadataCacheAdapter, createMockTFolder, createMockTFile } from './__mocks__/adapters';
-import { App } from 'obsidian';
-
-describe('VaultTools - list_notes sorting', () => {
-  let vaultTools: VaultTools;
-  let mockVault: ReturnType<typeof createMockVaultAdapter>;
-  let mockMetadata: ReturnType<typeof createMockMetadataCacheAdapter>;
-
-  beforeEach(() => {
-    mockVault = createMockVaultAdapter();
-    mockMetadata = createMockMetadataCacheAdapter();
-    vaultTools = new VaultTools(mockVault, mockMetadata, {} as App);
-  });
-
-  // ... tests updated to use mockVault ...
-});
-```
-
-**Step 2: Fix root folder tests**
-
-Update tests that check root behavior to use properly mocked TFolder with isRoot():
-
-```typescript
-it('should list root when path is undefined', async () => {
-  const mockFiles = [
-    createMockTFile('file1.md'),
-    createMockTFile('file2.md')
-  ];
-  const mockRoot = createMockTFolder('', mockFiles);
-
-  mockVault.getRoot = jest.fn().mockReturnValue(mockRoot);
-
-  const result = await vaultTools.listNotes();
-
-  expect(result.isError).toBeUndefined();
-  const parsed = JSON.parse(result.content[0].text);
-  expect(parsed.path).toBe('');
-  expect(parsed.items.length).toBe(2);
-});
-```
-
-**Step 3: Run tests to verify fix**
-
-```bash
-npm test -- list-notes-sorting.test.ts
-```
-
-Expected: All tests should pass now that TFolder mocks include isRoot() method.
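*Editor's note:* the failure mode fixed in Task 6 is easy to reproduce in isolation. The sketch below uses a hypothetical `FolderLike` shape to show why a bare object literal without `isRoot()` throws at the root-skipping check, while a helper-built mock (mirroring `createMockTFolder`) does not.

```typescript
// Why "TypeError: item.isRoot is not a function" happened: the root-skipping
// code calls isRoot() on every folder, so every mock folder must provide it.
interface FolderLike {
  path: string;
  isRoot(): boolean;
}

// Mirrors the listNotes check that skips the vault root during iteration.
function skipRoot(items: FolderLike[]): FolderLike[] {
  return items.filter(item => !item.isRoot());
}

// Helper mirroring createMockTFolder's isRoot logic from Task 3.
function makeFolder(path: string): FolderLike {
  return { path, isRoot: () => path === '' || path === '/' };
}

const folders = [makeFolder(''), makeFolder('notes'), makeFolder('archive')];
console.log(skipRoot(folders).map(f => f.path)); // ['notes', 'archive']
```

A mock built as `{ path: 'notes' }` alone would fail this filter at runtime, which is precisely the bug the `createMockTFolder` helper eliminates.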
-
-**Step 4: Commit fix**
-
-```bash
-git add tests/list-notes-sorting.test.ts
-git commit -m "test: fix list-notes-sorting tests with proper mocks
-
-Use createMockTFolder helper which includes isRoot() method.
-Fixes TypeError: item.isRoot is not a function."
-```
-
----
-
-## Task 7: Continue VaultTools Migration - Search Methods
-
-**Files:**
-- Modify: `src/tools/vault-tools.ts`
-
-**Step 1: Update search method to use adapters**
-
-Find the `search` method and update to use adapters:
-
-```typescript
-async search(
-  query: string,
-  options?: {
-    path?: string;
-    useRegex?: boolean;
-    caseSensitive?: boolean;
-    includeGlob?: string;
-    maxResults?: number;
-  }
-): Promise<ToolResponse> {
-  try {
-    const files = this.vault.getMarkdownFiles();
-    let targetFiles = files;
-
-    // Apply path filter if specified
-    if (options?.path) {
-      const normalizedPath = PathUtils.normalizePath(options.path);
-      targetFiles = files.filter(f => f.path.startsWith(normalizedPath + '/') || f.path === normalizedPath);
-    }
-
-    // Apply glob filter if specified
-    if (options?.includeGlob) {
-      targetFiles = GlobUtils.filterByGlob(targetFiles, options.includeGlob);
-    }
-
-    const results: SearchResult[] = [];
-
-    for (const file of targetFiles) {
-      const content = await this.vault.read(file);
-      const matches = SearchUtils.search(content, query, {
-        useRegex: options?.useRegex,
-        caseSensitive: options?.caseSensitive
-      });
-
-      if (matches.length > 0) {
-        results.push({
-          path: file.path,
-          matches
-        });
-
-        if (options?.maxResults && results.length >= options.maxResults) {
-          break;
-        }
-      }
-    }
-
-    // ... rest of method (formatting results) ...
-  } catch (error) {
-    // ... error handling ...
-  }
-}
-```
-
-**Step 2: Update getVaultInfo to use adapters**
-
-Find the `getVaultInfo` method:
-
-```typescript
-async getVaultInfo(): Promise<ToolResponse> {
-  try {
-    const allFiles = this.vault.getMarkdownFiles();
-    const totalNotes = allFiles.length;
-
-    // Calculate total size
-    let totalSize = 0;
-    for (const file of allFiles) {
-      const stat = this.vault.stat(file);
-      if (stat) {
-        totalSize += stat.size;
-      }
-    }
-
-    const info = {
-      totalNotes,
-      totalSize,
-      sizeFormatted: this.formatBytes(totalSize)
-    };
-
-    return {
-      content: [{
-        type: "text",
-        text: JSON.stringify(info, null, 2)
-      }]
-    };
-  } catch (error) {
-    // ... error handling ...
-  }
-}
-```
-
-**Step 3: Commit search methods migration**
-
-```bash
-git add src/tools/vault-tools.ts
-git commit -m "refactor: migrate search and getVaultInfo to use adapters
-
-Update search and getVaultInfo methods to use IVaultAdapter
-instead of direct App.vault access."
-```
-
----
-
-## Task 8: Complete VaultTools Migration - Link Methods
-
-**Files:**
-- Modify: `src/tools/vault-tools.ts`
-
-**Step 1: Update validateWikilinks to use adapters**
-
-Find the `validateWikilinks` method:
-
-```typescript
-async validateWikilinks(path: string): Promise<ToolResponse> {
-  try {
-    const normalizedPath = PathUtils.normalizePath(path);
-    const file = this.vault.getAbstractFileByPath(normalizedPath);
-
-    if (!file || !(file instanceof TFile)) {
-      return {
-        content: [{
-          type: "text",
-          text: ErrorMessages.fileNotFound(normalizedPath)
-        }],
-        isError: true
-      };
-    }
-
-    // Read file content
-    const content = await this.vault.read(file);
-
-    // Use LinkUtils to validate (LinkUtils already uses metadataCache internally)
-    const resolved: any[] = [];
-    const unresolved: any[] = [];
-
-    // Extract wikilinks from content
-    const wikilinkRegex = /\[\[([^\]]+)\]\]/g;
-    let match;
-
-    while ((match = wikilinkRegex.exec(content)) !== null) {
-      const linktext = match[1];
-      const [linkpath, alias] = linktext.split('|');
-      const [path, heading] =
linkpath.split('#');
-
-      const dest = this.metadata.getFirstLinkpathDest(path.trim(), normalizedPath);
-
-      if (dest) {
-        resolved.push({
-          link: linktext,
-          resolvedPath: dest.path,
-          hasHeading: !!heading,
-          hasAlias: !!alias
-        });
-      } else {
-        unresolved.push({
-          link: linktext,
-          reason: 'File not found'
-        });
-      }
-    }
-
-    const result = {
-      path: normalizedPath,
-      totalLinks: resolved.length + unresolved.length,
-      resolvedLinks: resolved,
-      unresolvedLinks: unresolved
-    };
-
-    return {
-      content: [{
-        type: "text",
-        text: JSON.stringify(result, null, 2)
-      }]
-    };
-  } catch (error) {
-    return {
-      content: [{
-        type: "text",
-        text: `Validate wikilinks error: ${(error as Error).message}`
-      }],
-      isError: true
-    };
-  }
-}
-```
-
-**Step 2: Update resolveWikilink to use adapters**
-
-Find the `resolveWikilink` method:
-
-```typescript
-async resolveWikilink(linktext: string, sourcePath: string): Promise<ToolResponse> {
-  try {
-    const normalizedSource = PathUtils.normalizePath(sourcePath);
-
-    // Parse linktext
-    const [linkpath, alias] = linktext.split('|');
-    const [path, heading] = linkpath.split('#');
-
-    const dest = this.metadata.getFirstLinkpathDest(path.trim(), normalizedSource);
-
-    if (!dest) {
-      return {
-        content: [{
-          type: "text",
-          text: JSON.stringify({
-            resolved: false,
-            linktext,
-            reason: 'File not found'
-          }, null, 2)
-        }]
-      };
-    }
-
-    const result = {
-      resolved: true,
-      linktext,
-      resolvedPath: dest.path,
-      hasHeading: !!heading,
-      heading: heading?.trim(),
-      hasAlias: !!alias,
-      alias: alias?.trim()
-    };
-
-    return {
-      content: [{
-        type: "text",
-        text: JSON.stringify(result, null, 2)
-      }]
-    };
-  } catch (error) {
-    // ... error handling ...
-  }
-}
-```
-
-**Step 3: Update backlinks to use adapters**
-
-Find the `backlinks` method:
-
-```typescript
-async backlinks(
-  path: string,
-  includeSnippets: boolean = true,
-  includeUnlinked: boolean = false
-): Promise<ToolResponse> {
-  try {
-    const normalizedPath = PathUtils.normalizePath(path);
-    const file = this.vault.getAbstractFileByPath(normalizedPath);
-
-    if (!file || !(file instanceof TFile)) {
-      return {
-        content: [{
-          type: "text",
-          text: ErrorMessages.fileNotFound(normalizedPath)
-        }],
-        isError: true
-      };
-    }
-
-    // Get backlinks from metadata cache
-    const backlinksData = this.metadata.getBacklinksForFile(file);
-    const backlinks: any[] = [];
-
-    for (const [sourcePath, positions] of Object.entries(backlinksData)) {
-      const sourceFile = this.vault.getAbstractFileByPath(sourcePath);
-
-      if (sourceFile && sourceFile instanceof TFile) {
-        const occurrences: any[] = [];
-
-        for (const pos of positions as any[]) {
-          let snippet = '';
-
-          if (includeSnippets) {
-            const content = await this.vault.read(sourceFile);
-            const lines = content.split('\n');
-            snippet = lines[pos.line] || '';
-          }
-
-          occurrences.push({
-            line: pos.line,
-            column: pos.column,
-            snippet
-          });
-        }
-
-        backlinks.push({
-          sourcePath,
-          occurrences
-        });
-      }
-    }
-
-    // If snippets not requested, remove them
-    if (!includeSnippets) {
-      for (const backlink of backlinks) {
-        for (const occurrence of backlink.occurrences) {
-          occurrence.snippet = '';
-        }
-      }
-    }
-
-    const result = {
-      path: normalizedPath,
-      backlinks,
-      totalBacklinks: backlinks.length
-    };
-
-    return {
-      content: [{
-        type: "text",
-        text: JSON.stringify(result, null, 2)
-      }]
-    };
-  } catch (error) {
-    return {
-      content: [{
-        type: "text",
-        text: `Backlinks error: ${(error as Error).message}`
-      }],
-      isError: true
-    };
-  }
-}
-```
-
-**Step 4: Remove app parameter from constructor**
-
-Now that all methods are migrated, remove the temporary `app` parameter:
-
-```typescript
-export class VaultTools {
-  constructor(
-    private
vault: IVaultAdapter, - private metadata: IMetadataCacheAdapter - ) {} - - // ... all methods now use adapters only ... -} -``` - -Update factory function: - -```typescript -export function createVaultTools(app: App): VaultTools { - return new VaultTools( - new VaultAdapter(app.vault), - new MetadataCacheAdapter(app.metadataCache) - ); -} -``` - -**Step 5: Commit link methods migration** - -```bash -git add src/tools/vault-tools.ts src/tools/vault-tools-factory.ts -git commit -m "refactor: complete VaultTools adapter migration - -Migrate validateWikilinks, resolveWikilink, and backlinks methods -to use adapters. Remove temporary app parameter from constructor. -VaultTools now fully depends on interface abstractions." -``` - ---- - -## Task 9: Add Tests for Uncovered VaultTools Paths - -**Files:** -- Modify: `tests/vault-tools.test.ts` - -**Step 1: Add frontmatter extraction tests** - -Add test cases for the extractFrontmatterSummary method edge cases: - -```typescript -describe('extractFrontmatterSummary', () => { - it('should handle string tags and convert to array', async () => { - const mockFile = createMockTFile('test.md'); - const mockCache = { - frontmatter: { - title: 'Test', - tags: 'single-tag' - } - }; - - mockVault.getRoot = jest.fn().mockReturnValue( - createMockTFolder('', [mockFile]) - ); - mockVault.stat = jest.fn().mockReturnValue({ - ctime: 1000, - mtime: 2000, - size: 100 - }); - mockMetadata.getFileCache = jest.fn().mockReturnValue(mockCache); - - const result = await vaultTools.listNotes('', true); - - expect(result.isError).toBeUndefined(); - const parsed = JSON.parse(result.content[0].text); - expect(parsed.items[0].metadata.frontmatterSummary.tags).toEqual(['single-tag']); - }); - - it('should handle array tags and preserve as array', async () => { - const mockFile = createMockTFile('test.md'); - const mockCache = { - frontmatter: { - title: 'Test', - tags: ['tag1', 'tag2'] - } - }; - - mockVault.getRoot = jest.fn().mockReturnValue( - 
createMockTFolder('', [mockFile]) - ); - mockVault.stat = jest.fn().mockReturnValue({ - ctime: 1000, - mtime: 2000, - size: 100 - }); - mockMetadata.getFileCache = jest.fn().mockReturnValue(mockCache); - - const result = await vaultTools.listNotes('', true); - - expect(result.isError).toBeUndefined(); - const parsed = JSON.parse(result.content[0].text); - expect(parsed.items[0].metadata.frontmatterSummary.tags).toEqual(['tag1', 'tag2']); - }); - - it('should handle string aliases and convert to array', async () => { - const mockFile = createMockTFile('test.md'); - const mockCache = { - frontmatter: { - aliases: 'single-alias' - } - }; - - mockVault.getRoot = jest.fn().mockReturnValue( - createMockTFolder('', [mockFile]) - ); - mockVault.stat = jest.fn().mockReturnValue({ - ctime: 1000, - mtime: 2000, - size: 100 - }); - mockMetadata.getFileCache = jest.fn().mockReturnValue(mockCache); - - const result = await vaultTools.listNotes('', true); - - expect(result.isError).toBeUndefined(); - const parsed = JSON.parse(result.content[0].text); - expect(parsed.items[0].metadata.frontmatterSummary.aliases).toEqual(['single-alias']); - }); - - it('should handle frontmatter extraction error gracefully', async () => { - const mockFile = createMockTFile('test.md'); - - mockVault.getRoot = jest.fn().mockReturnValue( - createMockTFolder('', [mockFile]) - ); - mockVault.stat = jest.fn().mockReturnValue({ - ctime: 1000, - mtime: 2000, - size: 100 - }); - mockMetadata.getFileCache = jest.fn().mockImplementation(() => { - throw new Error('Cache error'); - }); - - const result = await vaultTools.listNotes('', true); - - // Should return base metadata even if frontmatter extraction fails - expect(result.isError).toBeUndefined(); - const parsed = JSON.parse(result.content[0].text); - expect(parsed.items[0].metadata).toBeDefined(); - expect(parsed.items[0].metadata.created).toBe(1000); - }); - - it('should include custom frontmatter fields', async () => { - const mockFile = 
createMockTFile('test.md'); - const mockCache = { - frontmatter: { - title: 'Test', - customField: 'custom value', - anotherField: 123 - } - }; - - mockVault.getRoot = jest.fn().mockReturnValue( - createMockTFolder('', [mockFile]) - ); - mockVault.stat = jest.fn().mockReturnValue({ - ctime: 1000, - mtime: 2000, - size: 100 - }); - mockMetadata.getFileCache = jest.fn().mockReturnValue(mockCache); - - const result = await vaultTools.listNotes('', true); - - expect(result.isError).toBeUndefined(); - const parsed = JSON.parse(result.content[0].text); - expect(parsed.items[0].metadata.frontmatterSummary.customField).toBe('custom value'); - expect(parsed.items[0].metadata.frontmatterSummary.anotherField).toBe(123); - }); -}); -``` - -**Step 2: Add backlinks tests with snippet options** - -```typescript -describe('backlinks with options', () => { - it('should include snippets when includeSnippets is true', async () => { - const targetFile = createMockTFile('target.md'); - const sourceFile = createMockTFile('source.md'); - - mockVault.getAbstractFileByPath = jest.fn() - .mockReturnValueOnce(targetFile) - .mockReturnValue(sourceFile); - mockVault.read = jest.fn().mockResolvedValue('This links to [[target]]'); - mockMetadata.getBacklinksForFile = jest.fn().mockReturnValue({ - 'source.md': [{ line: 0, column: 15 }] - }); - - const result = await vaultTools.backlinks('target.md', true); - - expect(result.isError).toBeUndefined(); - const parsed = JSON.parse(result.content[0].text); - expect(parsed.backlinks[0].occurrences[0].snippet).toBe('This links to [[target]]'); - }); - - it('should remove snippets when includeSnippets is false', async () => { - const targetFile = createMockTFile('target.md'); - const sourceFile = createMockTFile('source.md'); - - mockVault.getAbstractFileByPath = jest.fn() - .mockReturnValueOnce(targetFile) - .mockReturnValue(sourceFile); - mockMetadata.getBacklinksForFile = jest.fn().mockReturnValue({ - 'source.md': [{ line: 0, column: 15 }] - }); - - 
const result = await vaultTools.backlinks('target.md', false); - - expect(result.isError).toBeUndefined(); - const parsed = JSON.parse(result.content[0].text); - expect(parsed.backlinks[0].occurrences[0].snippet).toBe(''); - }); -}); -``` - -**Step 3: Add validateWikilinks error path tests** - -```typescript -describe('validateWikilinks error paths', () => { - it('should return error if file not found', async () => { - mockVault.getAbstractFileByPath = jest.fn().mockReturnValue(null); - - const result = await vaultTools.validateWikilinks('nonexistent.md'); - - expect(result.isError).toBe(true); - expect(result.content[0].text).toContain('not found'); - }); - - it('should handle wikilink parsing errors gracefully', async () => { - const mockFile = createMockTFile('test.md'); - - mockVault.getAbstractFileByPath = jest.fn().mockReturnValue(mockFile); - mockVault.read = jest.fn().mockRejectedValue(new Error('Read error')); - - const result = await vaultTools.validateWikilinks('test.md'); - - expect(result.isError).toBe(true); - expect(result.content[0].text).toContain('error'); - }); -}); -``` - -**Step 4: Run tests to verify coverage increase** - -```bash -npm run test:coverage -- vault-tools -``` - -Expected: Coverage for vault-tools.ts should increase significantly, targeting areas that were previously uncovered (lines 309-352, 716-735, 824-852). - -**Step 5: Commit new tests** - -```bash -git add tests/vault-tools.test.ts -git commit -m "test: add coverage for VaultTools uncovered paths - -Add tests for frontmatter extraction edge cases, backlinks options, -and error handling paths. Targets previously uncovered lines." 
-``` - ---- - -## Task 10: Refactor NoteTools to Use Adapters - -**Files:** -- Modify: `src/tools/note-tools.ts` -- Create: `src/tools/note-tools-factory.ts` - -**Step 1: Update NoteTools constructor** - -In `src/tools/note-tools.ts`, update the constructor: - -```typescript -import { IVaultAdapter, IFileManagerAdapter } from '../adapters/interfaces'; - -export class NoteTools { - constructor( - private vault: IVaultAdapter, - private fileManager: IFileManagerAdapter, - private app: App // Keep temporarily - ) {} - - // ... rest of class -} -``` - -**Step 2: Create factory function** - -Create `src/tools/note-tools-factory.ts`: - -```typescript -import { App } from 'obsidian'; -import { NoteTools } from './note-tools'; -import { VaultAdapter } from '../adapters/vault-adapter'; -import { FileManagerAdapter } from '../adapters/file-manager-adapter'; - -/** - * Factory function to create NoteTools with concrete adapters - */ -export function createNoteTools(app: App): NoteTools { - return new NoteTools( - new VaultAdapter(app.vault), - new FileManagerAdapter(app.fileManager), - app - ); -} -``` - -**Step 3: Update readNote to use adapters** - -Find the `readNote` method: - -```typescript -async readNote(path: string, includeVersionId: boolean = false): Promise { - try { - const normalizedPath = PathUtils.normalizePath(path); - const file = this.vault.getAbstractFileByPath(normalizedPath); - - if (!file || !(file instanceof TFile)) { - return { - content: [{ - type: "text", - text: ErrorMessages.fileNotFound(normalizedPath) - }], - isError: true - }; - } - - const content = await this.vault.read(file); - - if (includeVersionId) { - const versionId = VersionUtils.generateVersionId(file); - return { - content: [{ - type: "text", - text: content - }], - _meta: { - versionId - } - }; - } - - return { - content: [{ - type: "text", - text: content - }] - }; - } catch (error) { - // ... error handling ... 
- } -} -``` - -**Step 4: Update createNote to use adapters** - -Find the `createNote` method: - -```typescript -async createNote( - path: string, - content: string, - createParents: boolean = false, - conflictStrategy: 'error' | 'overwrite' | 'rename' = 'error' -): Promise { - try { - const normalizedPath = PathUtils.normalizePath(path); - - // Check if file exists - const existing = this.vault.getAbstractFileByPath(normalizedPath); - - if (existing) { - switch (conflictStrategy) { - case 'error': - return { - content: [{ - type: "text", - text: ErrorMessages.fileAlreadyExists(normalizedPath) - }], - isError: true - }; - case 'overwrite': - if (existing instanceof TFile) { - await this.vault.process(existing, () => content); - return { - content: [{ - type: "text", - text: `Note overwritten successfully: ${normalizedPath}` - }] - }; - } - break; - case 'rename': - // Find available name - let counter = 1; - let newPath = normalizedPath; - while (this.vault.getAbstractFileByPath(newPath)) { - const parts = normalizedPath.split('.'); - const ext = parts.pop(); - const base = parts.join('.'); - newPath = `${base}-${counter}.${ext}`; - counter++; - } - await this.vault.create(newPath, content); - return { - content: [{ - type: "text", - text: `Note created with renamed path: ${newPath}` - }] - }; - } - } - - // Create parent folders if requested - if (createParents) { - const parentPath = normalizedPath.substring(0, normalizedPath.lastIndexOf('/')); - if (parentPath) { - await this.createParentFolders(parentPath); - } - } else { - // Check parent exists - const parentPath = normalizedPath.substring(0, normalizedPath.lastIndexOf('/')); - if (parentPath) { - const parent = this.vault.getAbstractFileByPath(parentPath); - if (!parent) { - return { - content: [{ - type: "text", - text: ErrorMessages.parentFolderNotFound(normalizedPath, parentPath) - }], - isError: true - }; - } - } - } - - await this.vault.create(normalizedPath, content); - - return { - content: [{ - type: 
"text", - text: `Note created successfully: ${normalizedPath}` - }] - }; - } catch (error) { - // ... error handling ... - } -} - -private async createParentFolders(path: string): Promise { - const parts = path.split('/'); - let currentPath = ''; - - for (const part of parts) { - currentPath = currentPath ? `${currentPath}/${part}` : part; - const existing = this.vault.getAbstractFileByPath(currentPath); - - if (!existing) { - await this.vault.createFolder(currentPath); - } - } -} -``` - -**Step 5: Update updateNote to use adapters** - -Find the `updateNote` method: - -```typescript -async updateNote( - path: string, - content: string, - ifMatch?: string -): Promise { - try { - const normalizedPath = PathUtils.normalizePath(path); - const file = this.vault.getAbstractFileByPath(normalizedPath); - - if (!file || !(file instanceof TFile)) { - return { - content: [{ - type: "text", - text: ErrorMessages.fileNotFound(normalizedPath) - }], - isError: true - }; - } - - // Check version if ifMatch provided - if (ifMatch) { - const currentVersion = VersionUtils.generateVersionId(file); - if (currentVersion !== ifMatch) { - return { - content: [{ - type: "text", - text: ErrorMessages.versionMismatch(normalizedPath, ifMatch, currentVersion) - }], - isError: true - }; - } - } - - await this.vault.process(file, () => content); - - return { - content: [{ - type: "text", - text: `Note updated successfully: ${normalizedPath}` - }] - }; - } catch (error) { - // ... error handling ... 
- } -} -``` - -**Step 6: Update deleteNote to use adapters** - -Find the `deleteNote` method: - -```typescript -async deleteNote(path: string): Promise { - try { - const normalizedPath = PathUtils.normalizePath(path); - const file = this.vault.getAbstractFileByPath(normalizedPath); - - if (!file) { - return { - content: [{ - type: "text", - text: ErrorMessages.fileNotFound(normalizedPath) - }], - isError: true - }; - } - - await this.fileManager.trashFile(file); - - return { - content: [{ - type: "text", - text: `Note deleted successfully: ${normalizedPath}` - }] - }; - } catch (error) { - // ... error handling ... - } -} -``` - -**Step 7: Update renameFile to use adapters** - -Find the `renameFile` method: - -```typescript -async renameFile(oldPath: string, newPath: string): Promise { - try { - const normalizedOld = PathUtils.normalizePath(oldPath); - const normalizedNew = PathUtils.normalizePath(newPath); - - const file = this.vault.getAbstractFileByPath(normalizedOld); - - if (!file) { - return { - content: [{ - type: "text", - text: ErrorMessages.fileNotFound(normalizedOld) - }], - isError: true - }; - } - - await this.fileManager.renameFile(file, normalizedNew); - - return { - content: [{ - type: "text", - text: `File renamed successfully: ${normalizedOld} → ${normalizedNew}` - }] - }; - } catch (error) { - // ... error handling ... - } -} -``` - -**Step 8: Remove app parameter from constructor** - -Remove the temporary app parameter: - -```typescript -export class NoteTools { - constructor( - private vault: IVaultAdapter, - private fileManager: IFileManagerAdapter - ) {} - - // ... all methods now use adapters only ... 
-}
-```
-
-Update factory:
-
-```typescript
-export function createNoteTools(app: App): NoteTools {
-  return new NoteTools(
-    new VaultAdapter(app.vault),
-    new FileManagerAdapter(app.fileManager)
-  );
-}
-```
-
-**Step 9: Commit NoteTools refactoring**
-
-```bash
-git add src/tools/note-tools.ts src/tools/note-tools-factory.ts
-git commit -m "refactor: migrate NoteTools to use adapter interfaces
-
-Update NoteTools to depend on IVaultAdapter and IFileManagerAdapter.
-Migrate all CRUD methods (read, create, update, delete, rename) to
-use adapters instead of direct Obsidian API access."
-```
-
----
-
-## Task 11: Update NoteTools Tests and Fix parent-folder-detection
-
-**Files:**
-- Modify: `tests/note-tools.test.ts`
-- Modify: `tests/parent-folder-detection.test.ts`
-
-**Step 1: Update note-tools.test.ts imports and setup**
-
-```typescript
-import { NoteTools } from '../src/tools/note-tools';
-import { createMockVaultAdapter, createMockFileManagerAdapter, createMockTFile } from './__mocks__/adapters';
-import { TFile, App } from 'obsidian';
-
-describe('NoteTools', () => {
-  let noteTools: NoteTools;
-  let mockVault: ReturnType<typeof createMockVaultAdapter>;
-  let mockFileManager: ReturnType<typeof createMockFileManagerAdapter>;
-
-  beforeEach(() => {
-    mockVault = createMockVaultAdapter();
-    mockFileManager = createMockFileManagerAdapter();
-    noteTools = new NoteTools(mockVault, mockFileManager);
-  });
-
-  // ... tests ...
-}); -``` - -**Step 2: Update readNote tests** - -```typescript -describe('readNote', () => { - it('should read note content', async () => { - const mockFile = createMockTFile('test.md'); - mockVault.getAbstractFileByPath = jest.fn().mockReturnValue(mockFile); - mockVault.read = jest.fn().mockResolvedValue('# Test Content'); - - const result = await noteTools.readNote('test.md'); - - expect(result.isError).toBeUndefined(); - expect(result.content[0].text).toBe('# Test Content'); - expect(mockVault.read).toHaveBeenCalledWith(mockFile); - }); - - it('should return error if file not found', async () => { - mockVault.getAbstractFileByPath = jest.fn().mockReturnValue(null); - - const result = await noteTools.readNote('nonexistent.md'); - - expect(result.isError).toBe(true); - expect(result.content[0].text).toContain('not found'); - }); - - it('should include versionId when requested', async () => { - const mockFile = createMockTFile('test.md', { - ctime: 1000, - mtime: 2000, - size: 100 - }); - mockVault.getAbstractFileByPath = jest.fn().mockReturnValue(mockFile); - mockVault.read = jest.fn().mockResolvedValue('content'); - - const result = await noteTools.readNote('test.md', true); - - expect(result.isError).toBeUndefined(); - expect(result._meta?.versionId).toBeDefined(); - }); -}); -``` - -**Step 3: Update createNote tests** - -```typescript -describe('createNote', () => { - it('should create note successfully', async () => { - mockVault.getAbstractFileByPath = jest.fn().mockReturnValue(null); - mockVault.create = jest.fn().mockResolvedValue(createMockTFile('test.md')); - - const result = await noteTools.createNote('test.md', 'content'); - - expect(result.isError).toBeUndefined(); - expect(mockVault.create).toHaveBeenCalledWith('test.md', 'content'); - expect(result.content[0].text).toContain('created successfully'); - }); - - it('should return error if file exists and strategy is error', async () => { - const existing = createMockTFile('test.md'); - 
mockVault.getAbstractFileByPath = jest.fn().mockReturnValue(existing);
-
-    const result = await noteTools.createNote('test.md', 'content', false, 'error');
-
-    expect(result.isError).toBe(true);
-    expect(result.content[0].text).toContain('already exists');
-  });
-
-  it('should overwrite if strategy is overwrite', async () => {
-    const existing = createMockTFile('test.md');
-    mockVault.getAbstractFileByPath = jest.fn().mockReturnValue(existing);
-    mockVault.process = jest.fn().mockResolvedValue('content');
-
-    const result = await noteTools.createNote('test.md', 'content', false, 'overwrite');
-
-    expect(result.isError).toBeUndefined();
-    expect(mockVault.process).toHaveBeenCalled();
-    expect(result.content[0].text).toContain('overwritten');
-  });
-
-  it('should rename if strategy is rename', async () => {
-    const existing = createMockTFile('test.md');
-    mockVault.getAbstractFileByPath = jest.fn()
-      .mockReturnValueOnce(existing) // initial existence check
-      .mockReturnValueOnce(existing) // rename loop re-checks the original path first
-      .mockReturnValue(null);
-    mockVault.create = jest.fn().mockResolvedValue(createMockTFile('test-1.md'));
-
-    const result = await noteTools.createNote('test.md', 'content', false, 'rename');
-
-    expect(result.isError).toBeUndefined();
-    expect(mockVault.create).toHaveBeenCalledWith('test-1.md', 'content');
-  });
-});
-```
-
-**Step 4: Update parent-folder-detection.test.ts**
-
-Update the entire test file to use mock adapters:
-
-```typescript
-import { NoteTools } from '../src/tools/note-tools';
-import { createMockVaultAdapter, createMockFileManagerAdapter, createMockTFile, createMockTFolder } from './__mocks__/adapters';
-
-describe('Enhanced Parent Folder Detection', () => {
-  let noteTools: NoteTools;
-  let mockVault: ReturnType<typeof createMockVaultAdapter>;
-  let mockFileManager: ReturnType<typeof createMockFileManagerAdapter>;
-
-  beforeEach(() => {
-    mockVault = createMockVaultAdapter();
-    mockFileManager = createMockFileManagerAdapter();
-    noteTools = new NoteTools(mockVault, mockFileManager);
-  });
-
-  describe('Explicit parent folder detection', () => {
-    it('should succeed when parent folder
exists', async () => { - const parentFolder = createMockTFolder('existing-folder'); - - mockVault.getAbstractFileByPath = jest.fn() - .mockReturnValueOnce(null) // File doesn't exist - .mockReturnValue(parentFolder); // Parent exists - mockVault.create = jest.fn().mockResolvedValue(createMockTFile('existing-folder/file.md')); - - const result = await noteTools.createNote('existing-folder/file.md', 'content', false); - - expect(result.isError).toBeUndefined(); - expect(result.content[0].text).toContain('Note created successfully'); - expect(mockVault.create).toHaveBeenCalledWith('existing-folder/file.md', 'content'); - }); - - it('should fail when parent folder does not exist and createParents is false', async () => { - mockVault.getAbstractFileByPath = jest.fn().mockReturnValue(null); - - const result = await noteTools.createNote('missing-folder/file.md', 'content', false); - - expect(result.isError).toBe(true); - expect(result.content[0].text).toContain('Parent folder'); - }); - }); - - describe('createParents parameter', () => { - it('should create single missing parent folder when createParents is true', async () => { - mockVault.getAbstractFileByPath = jest.fn().mockReturnValue(null); - mockVault.createFolder = jest.fn().mockResolvedValue(undefined); - mockVault.create = jest.fn().mockResolvedValue(createMockTFile('new-folder/file.md')); - - const result = await noteTools.createNote('new-folder/file.md', 'content', true); - - expect(result.isError).toBeUndefined(); - expect(mockVault.createFolder).toHaveBeenCalledWith('new-folder'); - expect(mockVault.create).toHaveBeenCalledWith('new-folder/file.md', 'content'); - expect(result.content[0].text).toContain('Note created successfully'); - }); - - it('should recursively create all missing parent folders', async () => { - mockVault.getAbstractFileByPath = jest.fn().mockReturnValue(null); - mockVault.createFolder = jest.fn().mockResolvedValue(undefined); - mockVault.create = 
jest.fn().mockResolvedValue(createMockTFile('a/b/c/file.md')); - - const result = await noteTools.createNote('a/b/c/file.md', 'content', true); - - expect(result.isError).toBeUndefined(); - expect(mockVault.createFolder).toHaveBeenCalledTimes(3); - expect(mockVault.createFolder).toHaveBeenCalledWith('a'); - expect(mockVault.createFolder).toHaveBeenCalledWith('a/b'); - expect(mockVault.createFolder).toHaveBeenCalledWith('a/b/c'); - }); - - it('should skip creating folders that already exist', async () => { - const folderA = createMockTFolder('a'); - - mockVault.getAbstractFileByPath = jest.fn() - .mockReturnValueOnce(null) // File doesn't exist - .mockReturnValueOnce(folderA) // 'a' exists - .mockReturnValue(null); // 'a/b' doesn't exist - mockVault.createFolder = jest.fn().mockResolvedValue(undefined); - mockVault.create = jest.fn().mockResolvedValue(createMockTFile('a/b/file.md')); - - const result = await noteTools.createNote('a/b/file.md', 'content', true); - - expect(result.isError).toBeUndefined(); - // Should only create 'a/b', not 'a' (which already exists) - expect(mockVault.createFolder).toHaveBeenCalledTimes(1); - expect(mockVault.createFolder).toHaveBeenCalledWith('a/b'); - }); - }); - - describe('Edge cases', () => { - it('should handle file in root directory (no parent path)', async () => { - mockVault.getAbstractFileByPath = jest.fn().mockReturnValue(null); - mockVault.create = jest.fn().mockResolvedValue(createMockTFile('file.md')); - - const result = await noteTools.createNote('file.md', 'content', false); - - expect(result.isError).toBeUndefined(); - expect(mockVault.create).toHaveBeenCalledWith('file.md', 'content'); - }); - - it('should normalize paths before checking parent', async () => { - const parentFolder = createMockTFolder('folder'); - - mockVault.getAbstractFileByPath = jest.fn() - .mockReturnValueOnce(null) - .mockReturnValue(parentFolder); - mockVault.create = jest.fn().mockResolvedValue(createMockTFile('folder/file.md')); - - const 
result = await noteTools.createNote('folder//file.md', 'content', false); - - expect(result.isError).toBeUndefined(); - expect(mockVault.create).toHaveBeenCalledWith('folder/file.md', 'content'); - }); - - it('should handle deeply nested paths', async () => { - mockVault.getAbstractFileByPath = jest.fn().mockReturnValue(null); - mockVault.createFolder = jest.fn().mockResolvedValue(undefined); - mockVault.create = jest.fn().mockResolvedValue(createMockTFile('a/b/c/d/e/f/file.md')); - - const result = await noteTools.createNote('a/b/c/d/e/f/file.md', 'content', true); - - expect(result.isError).toBeUndefined(); - expect(mockVault.createFolder).toHaveBeenCalledTimes(6); - }); - }); -}); -``` - -**Step 5: Run tests to verify fixes** - -```bash -npm test -``` - -Expected: All tests should now pass, including the 13 that were failing before. - -**Step 6: Commit test fixes** - -```bash -git add tests/note-tools.test.ts tests/parent-folder-detection.test.ts -git commit -m "test: update NoteTools tests to use mock adapters - -Replace complex App mocks with clean adapter pattern. Fixes all -parent-folder-detection test failures by providing proper mocks." 
-``` - ---- - -## Task 12: Add Tests for Uncovered NoteTools Paths - -**Files:** -- Modify: `tests/note-tools.test.ts` - -**Step 1: Add version mismatch tests** - -```typescript -describe('updateNote with version checking', () => { - it('should update when version matches', async () => { - const mockFile = createMockTFile('test.md', { - ctime: 1000, - mtime: 2000, - size: 100 - }); - const expectedVersion = `2000-100`; - - mockVault.getAbstractFileByPath = jest.fn().mockReturnValue(mockFile); - mockVault.process = jest.fn().mockResolvedValue('new content'); - - const result = await noteTools.updateNote('test.md', 'new content', expectedVersion); - - expect(result.isError).toBeUndefined(); - expect(mockVault.process).toHaveBeenCalled(); - }); - - it('should return error when version does not match', async () => { - const mockFile = createMockTFile('test.md', { - ctime: 1000, - mtime: 2000, - size: 100 - }); - const wrongVersion = `1000-50`; - - mockVault.getAbstractFileByPath = jest.fn().mockReturnValue(mockFile); - - const result = await noteTools.updateNote('test.md', 'new content', wrongVersion); - - expect(result.isError).toBe(true); - expect(result.content[0].text).toContain('version mismatch'); - expect(mockVault.process).not.toHaveBeenCalled(); - }); - - it('should update without version check when ifMatch not provided', async () => { - const mockFile = createMockTFile('test.md'); - - mockVault.getAbstractFileByPath = jest.fn().mockReturnValue(mockFile); - mockVault.process = jest.fn().mockResolvedValue('new content'); - - const result = await noteTools.updateNote('test.md', 'new content'); - - expect(result.isError).toBeUndefined(); - expect(mockVault.process).toHaveBeenCalled(); - }); -}); -``` - -**Step 2: Add error handling tests** - -```typescript -describe('error handling', () => { - it('should handle read errors', async () => { - const mockFile = createMockTFile('test.md'); - mockVault.getAbstractFileByPath = jest.fn().mockReturnValue(mockFile); - 
mockVault.read = jest.fn().mockRejectedValue(new Error('Permission denied')); - - const result = await noteTools.readNote('test.md'); - - expect(result.isError).toBe(true); - expect(result.content[0].text).toContain('Permission denied'); - }); - - it('should handle create errors', async () => { - mockVault.getAbstractFileByPath = jest.fn().mockReturnValue(null); - mockVault.create = jest.fn().mockRejectedValue(new Error('Disk full')); - - const result = await noteTools.createNote('test.md', 'content'); - - expect(result.isError).toBe(true); - expect(result.content[0].text).toContain('Disk full'); - }); - - it('should handle update errors', async () => { - const mockFile = createMockTFile('test.md'); - mockVault.getAbstractFileByPath = jest.fn().mockReturnValue(mockFile); - mockVault.process = jest.fn().mockRejectedValue(new Error('File locked')); - - const result = await noteTools.updateNote('test.md', 'content'); - - expect(result.isError).toBe(true); - expect(result.content[0].text).toContain('File locked'); - }); - - it('should handle delete errors', async () => { - const mockFile = createMockTFile('test.md'); - mockVault.getAbstractFileByPath = jest.fn().mockReturnValue(mockFile); - mockFileManager.trashFile = jest.fn().mockRejectedValue(new Error('Cannot delete')); - - const result = await noteTools.deleteNote('test.md'); - - expect(result.isError).toBe(true); - expect(result.content[0].text).toContain('Cannot delete'); - }); - - it('should handle rename errors', async () => { - const mockFile = createMockTFile('test.md'); - mockVault.getAbstractFileByPath = jest.fn().mockReturnValue(mockFile); - mockFileManager.renameFile = jest.fn().mockRejectedValue(new Error('Name conflict')); - - const result = await noteTools.renameFile('test.md', 'new.md'); - - expect(result.isError).toBe(true); - expect(result.content[0].text).toContain('Name conflict'); - }); -}); -``` - -**Step 3: Run coverage to check progress** - -```bash -npm run test:coverage -``` - -Expected: 
Coverage for note-tools.ts should approach or reach 100%.
-
-**Step 4: Commit new tests**
-
-```bash
-git add tests/note-tools.test.ts
-git commit -m "test: add coverage for NoteTools uncovered paths
-
-Add tests for version mismatch handling, error paths, and edge cases.
-Targets previously uncovered lines in note-tools.ts."
-```
-
----
-
-## Task 13: Update ToolRegistry to Use Factory Functions
-
-**Files:**
-- Modify: `src/tools/index.ts`
-
-**Step 1: Update imports and tool instantiation**
-
-In `src/tools/index.ts`, update to use factory functions:
-
-```typescript
-import { App } from 'obsidian';
-import { createNoteTools } from './note-tools-factory';
-import { createVaultTools } from './vault-tools-factory';
-import { NotificationManager } from '../ui/notifications';
-
-export class ToolRegistry {
-  private noteTools: ReturnType<typeof createNoteTools>;
-  private vaultTools: ReturnType<typeof createVaultTools>;
-
-  constructor(app: App, notificationManager?: NotificationManager) {
-    this.noteTools = createNoteTools(app);
-    this.vaultTools = createVaultTools(app);
-
-    // ... rest of constructor ...
-  }
-
-  // ... rest of class unchanged ...
-}
-```
-
-**Step 2: Verify no breaking changes**
-
-The public API of ToolRegistry remains unchanged: it still accepts an App object and uses the tools internally.
-
-**Step 3: Commit ToolRegistry update**
-
-```bash
-git add src/tools/index.ts
-git commit -m "refactor: update ToolRegistry to use factory functions
-
-Use createNoteTools and createVaultTools factory functions instead
-of direct instantiation. Completes integration of adapter pattern."
-```
-
----
-
-## Task 14: Update Main Plugin to Use Factory Functions (if needed)
-
-**Files:**
-- Check: `src/main.ts` (may not need changes)
-
-**Step 1: Check if main.ts directly instantiates tools**
-
-```bash
-grep -n "new NoteTools\|new VaultTools" src/main.ts
-```
-
-Expected: Likely no matches, as main.ts probably only uses ToolRegistry.
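-A broader sweep can also confirm that no module outside the adapter layer still touches the raw Obsidian API. The pattern and paths below are assumptions based on the tasks above; adjust to the actual repo layout:
-
-```bash
-# Flag any direct App property access outside src/adapters/
-# (search pattern and directory names assumed from this plan)
-grep -rn "app\.vault\|app\.metadataCache\|app\.fileManager" src/ --include="*.ts" \
-  | grep -v "adapters/" \
-  || echo "No direct API access outside adapters"
-```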
- -**Step 2: If changes needed, update imports** - -If main.ts directly creates tools (unlikely), update to use factories: - -```typescript -import { createNoteTools } from './tools/note-tools-factory'; -import { createVaultTools } from './tools/vault-tools-factory'; - -// Replace: -// const noteTools = new NoteTools(this.app); -// With: -const noteTools = createNoteTools(this.app); -``` - -**Step 3: Commit if changes made** - -```bash -git add src/main.ts -git commit -m "refactor: update main plugin to use factory functions" -``` - -Or skip this task if no changes needed. - ---- - -## Task 15: Run Full Test Suite and Verify 100% Coverage - -**Files:** -- N/A (verification task) - -**Step 1: Run full test suite** - -```bash -npm test -``` - -Expected: All tests passing (401+ tests). - -**Step 2: Run coverage report** - -```bash -npm run test:coverage -``` - -Expected output: -``` ------------------------|---------|----------|---------|---------|--- -File | % Stmts | % Branch | % Funcs | % Lines | ------------------------|---------|----------|---------|---------|--- -All files | 100 | 100 | 100 | 100 | - adapters | 100 | 100 | 100 | 100 | - file-manager-adapter.ts | 100 | 100 | 100 | 100 | - interfaces.ts | 100 | 100 | 100 | 100 | - metadata-adapter.ts | 100 | 100 | 100 | 100 | - vault-adapter.ts | 100 | 100 | 100 | 100 | - tools | 100 | 100 | 100 | 100 | - note-tools.ts | 100 | 100 | 100 | 100 | - vault-tools.ts | 100 | 100 | 100 | 100 | - utils | 100 | 100 | 100 | 100 | - ... | 100 | 100 | 100 | 100 | ------------------------|---------|----------|---------|---------|--- -``` - -**Step 3: If not 100%, identify remaining gaps** - -```bash -npm run test:coverage -- --verbose -``` - -Check the output for any remaining uncovered lines and add targeted tests. 
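-Once the gaps are closed, the bar can be enforced so it does not regress. A sketch of a Jest coverage threshold (this assumes the project uses Jest, as the `jest.fn()` mocks above imply, and that configuration lives in a `jest.config.js` at the repo root):
-
-```javascript
-// jest.config.js (fragment) - fail the test run if coverage drops below 100%
-module.exports = {
-  collectCoverageFrom: ['src/**/*.ts'],
-  coverageThreshold: {
-    global: {
-      statements: 100,
-      branches: 100,
-      functions: 100,
-      lines: 100
-    }
-  }
-};
-```
-
-With this in place, `npm run test:coverage` exits non-zero on any future regression, so CI guards the milestone automatically.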
- -**Step 4: Document coverage achievement** - -Once 100% is reached, capture the coverage report: - -```bash -npm run test:coverage > coverage-report.txt -git add coverage-report.txt -git commit -m "docs: capture 100% test coverage achievement - -All files now have 100% statement, branch, function, and line coverage." -``` - ---- - -## Task 16: Run Build and Verify No Errors - -**Files:** -- N/A (verification task) - -**Step 1: Run TypeScript type check** - -```bash -npm run build -``` - -Expected: No type errors. Build should complete successfully. - -**Step 2: Check for any compilation warnings** - -Review build output for any warnings that should be addressed. - -**Step 3: Verify output files** - -```bash -ls -lh main.js -``` - -Expected: main.js exists and has reasonable size (should be similar to before refactoring). - -**Step 4: Commit if any build config changes were needed** - -```bash -git add tsconfig.json # If modified -git commit -m "chore: update build configuration for adapter pattern" -``` - -Or skip if no changes. 
- ---- - -## Task 17: Create Summary Commit and Tag - -**Files:** -- N/A (Git operations) - -**Step 1: Create summary of all changes** - -Review git log to see all commits: - -```bash -git log --oneline master..HEAD -``` - -**Step 2: Create final summary commit (if desired)** - -```bash -git commit --allow-empty -m "feat: achieve 100% test coverage via dependency injection - -Summary of changes: -- Created adapter interfaces (IVaultAdapter, IMetadataCacheAdapter, IFileManagerAdapter) -- Implemented concrete adapters as Obsidian API wrappers -- Refactored NoteTools and VaultTools to depend on interfaces -- Created factory functions for production usage -- Updated all tests to use clean mock adapter pattern -- Added tests for all previously uncovered code paths - -Results: -- 100% test coverage (statements, branches, functions, lines) -- All 401+ tests passing -- Cleaner, more maintainable test code -- Build succeeds with no errors - -Benefits: -- Easy to test new features (inject simple mocks) -- Obsidian API changes isolated to adapter layer -- Strong confidence for future refactoring -- Clear separation between business logic and API dependencies" -``` - -**Step 3: Create a tag for this milestone** - -```bash -git tag -a v100-percent-coverage -m "100% test coverage milestone" -``` - ---- - -## Task 18: Manual Verification in Obsidian (Optional but Recommended) - -**Files:** -- N/A (manual testing) - -**Step 1: Build the plugin** - -```bash -npm run build -``` - -**Step 2: Copy files to Obsidian vault for testing** - -```bash -# Assuming you have a test vault -cp main.js manifest.json styles.css /path/to/test-vault/.obsidian/plugins/obsidian-mcp-server/ -``` - -**Step 3: Test in Obsidian** - -1. Open Obsidian -2. Reload Obsidian (Ctrl/Cmd + R if in dev mode) -3. Enable the plugin -4. Start the MCP server -5. 
Test a few basic operations via HTTP client: - - Create a note - - Read a note - - List notes - - Search - - Check that everything works as before - -**Step 4: Document verification** - -If all works correctly: - -```bash -git commit --allow-empty -m "verify: manual testing in Obsidian successful - -Tested plugin in Obsidian after dependency injection refactoring. -All basic operations (create, read, list, search) working correctly." -``` - ---- - -## Completion Checklist - -- [ ] Task 1: Create adapter interfaces -- [ ] Task 2: Implement concrete adapters -- [ ] Task 3: Create mock adapters -- [ ] Task 4: Refactor VaultTools to use adapters -- [ ] Task 5: Update VaultTools tests -- [ ] Task 6: Fix list-notes-sorting tests -- [ ] Task 7: Migrate search methods -- [ ] Task 8: Migrate link methods -- [ ] Task 9: Add tests for uncovered VaultTools paths -- [ ] Task 10: Refactor NoteTools to use adapters -- [ ] Task 11: Update NoteTools tests and fix parent-folder-detection -- [ ] Task 12: Add tests for uncovered NoteTools paths -- [ ] Task 13: Update ToolRegistry -- [ ] Task 14: Update main plugin (if needed) -- [ ] Task 15: Verify 100% coverage -- [ ] Task 16: Verify build succeeds -- [ ] Task 17: Create summary and tag -- [ ] Task 18: Manual verification (optional) - ---- - -## Success Criteria - -**Primary Goals:** -✅ 100% test coverage: statements, branches, functions, lines -✅ All tests passing (401+ tests) -✅ Build succeeds with no errors -✅ Plugin functions correctly in Obsidian - -**Code Quality Goals:** -✅ Clean separation between business logic and Obsidian API -✅ Simple, maintainable test code using mock adapters -✅ Factory pattern enables easy production usage -✅ No breaking changes to public API - -**Documentation:** -✅ Design document committed -✅ Implementation plan committed -✅ Coverage achievement documented -✅ Manual verification documented diff --git a/docs/plans/2025-10-25-simplify-cors-mandatory-auth.md 
b/docs/plans/2025-10-25-simplify-cors-mandatory-auth.md deleted file mode 100644 index 63fbdf2..0000000 --- a/docs/plans/2025-10-25-simplify-cors-mandatory-auth.md +++ /dev/null @@ -1,1260 +0,0 @@ -# Simplify CORS and Make Authentication Mandatory Implementation Plan - -> **For Claude:** REQUIRED SUB-SKILL: Use superpowers:executing-plans to implement this plan task-by-task. - -**Goal:** Remove CORS configuration toggles, make authentication mandatory with secure token storage using Electron's safeStorage API, and simplify settings UI. - -**Architecture:** This refactor removes `enableCORS` and `allowedOrigins` settings, making CORS always enabled with a fixed localhost-only policy. Authentication becomes mandatory with auto-generated API keys encrypted via Electron's safeStorage. Settings are migrated on plugin load to maintain backward compatibility. - -**Tech Stack:** TypeScript, Express.js, Electron safeStorage API, Jest for testing - ---- - -## Task 1: Create Encryption Utility Module - -**Files:** -- Create: `src/utils/encryption-utils.ts` -- Create: `tests/encryption-utils.test.ts` - -**Step 1: Write the failing test** - -Create `tests/encryption-utils.test.ts`: - -```typescript -import { encryptApiKey, decryptApiKey } from '../src/utils/encryption-utils'; - -// Mock electron module -jest.mock('electron', () => ({ - safeStorage: { - isEncryptionAvailable: jest.fn(() => true), - encryptString: jest.fn((data: string) => Buffer.from(`encrypted:${data}`)), - decryptString: jest.fn((buffer: Buffer) => { - const str = buffer.toString(); - return str.replace('encrypted:', ''); - }) - } -})); - -describe('Encryption Utils', () => { - describe('encryptApiKey', () => { - it('should encrypt API key when encryption is available', () => { - const apiKey = 'test-api-key-12345'; - const encrypted = encryptApiKey(apiKey); - - expect(encrypted).toMatch(/^encrypted:/); - expect(encrypted).not.toContain('test-api-key-12345'); - }); - - it('should return plaintext when 
encryption is not available', () => { - const { safeStorage } = require('electron'); - safeStorage.isEncryptionAvailable.mockReturnValueOnce(false); - - const apiKey = 'test-api-key-12345'; - const result = encryptApiKey(apiKey); - - expect(result).toBe(apiKey); - }); - - it('should handle empty string', () => { - const result = encryptApiKey(''); - expect(result).toBe(''); - }); - }); - - describe('decryptApiKey', () => { - it('should decrypt encrypted API key', () => { - const apiKey = 'test-api-key-12345'; - const encrypted = encryptApiKey(apiKey); - const decrypted = decryptApiKey(encrypted); - - expect(decrypted).toBe(apiKey); - }); - - it('should return plaintext if not encrypted format', () => { - const plaintext = 'plain-api-key'; - const result = decryptApiKey(plaintext); - - expect(result).toBe(plaintext); - }); - - it('should handle empty string', () => { - const result = decryptApiKey(''); - expect(result).toBe(''); - }); - }); - - describe('round-trip encryption', () => { - it('should successfully encrypt and decrypt', () => { - const original = 'my-secret-api-key-abc123'; - const encrypted = encryptApiKey(original); - const decrypted = decryptApiKey(encrypted); - - expect(decrypted).toBe(original); - expect(encrypted).not.toBe(original); - }); - }); -}); -``` - -**Step 2: Run test to verify it fails** - -Run: `npm test -- encryption-utils.test.ts` -Expected: FAIL with "Cannot find module '../src/utils/encryption-utils.ts'" - -**Step 3: Write minimal implementation** - -Create `src/utils/encryption-utils.ts`: - -```typescript -import { safeStorage } from 'electron'; - -/** - * Encrypts an API key using Electron's safeStorage API - * Falls back to plaintext if encryption is not available (e.g., Linux without keyring) - * @param apiKey The plaintext API key to encrypt - * @returns Encrypted API key with "encrypted:" prefix, or plaintext if encryption unavailable - */ -export function encryptApiKey(apiKey: string): string { - if (!apiKey) { - return ''; - 
} - - // Check if encryption is available - if (!safeStorage.isEncryptionAvailable()) { - console.warn('Encryption not available, storing API key in plaintext'); - return apiKey; - } - - try { - const encrypted = safeStorage.encryptString(apiKey); - return `encrypted:${encrypted.toString('base64')}`; - } catch (error) { - console.error('Failed to encrypt API key, falling back to plaintext:', error); - return apiKey; - } -} - -/** - * Decrypts an API key encrypted with encryptApiKey - * @param stored The stored API key (encrypted or plaintext) - * @returns Decrypted API key - */ -export function decryptApiKey(stored: string): string { - if (!stored) { - return ''; - } - - // Check if this is an encrypted key - if (!stored.startsWith('encrypted:')) { - // Legacy plaintext key or fallback - return stored; - } - - try { - const encryptedData = stored.substring(10); // Remove "encrypted:" prefix - const buffer = Buffer.from(encryptedData, 'base64'); - return safeStorage.decryptString(buffer); - } catch (error) { - console.error('Failed to decrypt API key:', error); - throw new Error('Failed to decrypt API key. 
You may need to regenerate it.'); - } -} - -/** - * Checks if encryption is available on the current platform - * @returns true if safeStorage encryption is available - */ -export function isEncryptionAvailable(): boolean { - return safeStorage.isEncryptionAvailable(); -} -``` - -**Step 4: Run test to verify it passes** - -Run: `npm test -- encryption-utils.test.ts` -Expected: PASS - All tests pass - -**Step 5: Commit** - -```bash -git add src/utils/encryption-utils.ts tests/encryption-utils.test.ts -git commit -m "feat: add API key encryption utilities using Electron safeStorage" -``` - ---- - -## Task 2: Update Settings Types - -**Files:** -- Modify: `src/types/settings-types.ts:1-34` - -**Step 1: Write the failing test** - -Create test file `tests/settings-types.test.ts`: - -```typescript -import { DEFAULT_SETTINGS, MCPPluginSettings } from '../src/types/settings-types'; - -describe('Settings Types', () => { - describe('DEFAULT_SETTINGS', () => { - it('should have authentication enabled by default', () => { - expect(DEFAULT_SETTINGS.enableAuth).toBe(true); - }); - - it('should not have enableCORS field', () => { - expect((DEFAULT_SETTINGS as any).enableCORS).toBeUndefined(); - }); - - it('should not have allowedOrigins field', () => { - expect((DEFAULT_SETTINGS as any).allowedOrigins).toBeUndefined(); - }); - - it('should have empty apiKey by default', () => { - expect(DEFAULT_SETTINGS.apiKey).toBe(''); - }); - - it('should have autoStart disabled by default', () => { - expect(DEFAULT_SETTINGS.autoStart).toBe(false); - }); - - it('should have valid port number', () => { - expect(DEFAULT_SETTINGS.port).toBe(3000); - expect(DEFAULT_SETTINGS.port).toBeGreaterThan(0); - expect(DEFAULT_SETTINGS.port).toBeLessThan(65536); - }); - }); - - describe('MCPPluginSettings interface', () => { - it('should require apiKey field', () => { - const settings: MCPPluginSettings = { - ...DEFAULT_SETTINGS, - apiKey: 'test-key' - }; - expect(settings.apiKey).toBe('test-key'); - }); - - 
it('should not allow enableCORS field', () => { - // This is a compile-time check, but we verify runtime - const settings: MCPPluginSettings = DEFAULT_SETTINGS; - expect((settings as any).enableCORS).toBeUndefined(); - }); - }); -}); -``` - -**Step 2: Run test to verify it fails** - -Run: `npm test -- settings-types.test.ts` -Expected: FAIL - Tests fail because enableCORS is still true and allowedOrigins exists - -**Step 3: Update settings types** - -Modify `src/types/settings-types.ts`: - -```typescript -// Settings Types -export interface MCPServerSettings { - port: number; - apiKey: string; // Now required, not optional - enableAuth: boolean; // Will be removed in future, kept for migration -} - -export interface NotificationSettings { - notificationsEnabled: boolean; - showParameters: boolean; - notificationDuration: number; // milliseconds - logToConsole: boolean; -} - -export interface MCPPluginSettings extends MCPServerSettings, NotificationSettings { - autoStart: boolean; -} - -export const DEFAULT_SETTINGS: MCPPluginSettings = { - port: 3000, - apiKey: '', // Will be auto-generated on first load - enableAuth: true, // Always true now - autoStart: false, - // Notification defaults - notificationsEnabled: false, - showParameters: false, - notificationDuration: 3000, - logToConsole: false -}; -``` - -**Step 4: Run test to verify it passes** - -Run: `npm test -- settings-types.test.ts` -Expected: PASS - All tests pass - -**Step 5: Commit** - -```bash -git add src/types/settings-types.ts tests/settings-types.test.ts -git commit -m "refactor: remove CORS settings, make auth mandatory in types" -``` - ---- - -## Task 3: Update Middleware to Use Fixed CORS Policy - -**Files:** -- Modify: `src/server/middleware.ts:1-60` -- Modify: `tests/middleware.test.ts` (create if doesn't exist) - -**Step 1: Write the failing test** - -Create `tests/middleware.test.ts`: - -```typescript -import express, { Express } from 'express'; -import request from 'supertest'; -import { 
setupMiddleware } from '../src/server/middleware'; -import { MCPServerSettings } from '../src/types/settings-types'; -import { ErrorCodes } from '../src/types/mcp-types'; - -describe('Middleware', () => { - let app: Express; - const mockCreateError = jest.fn((id, code, message) => ({ - jsonrpc: '2.0', - id, - error: { code, message } - })); - - const createTestSettings = (overrides?: Partial<MCPServerSettings>): MCPServerSettings => ({ - port: 3000, - apiKey: 'test-api-key-12345', - enableAuth: true, - ...overrides - }); - - beforeEach(() => { - app = express(); - mockCreateError.mockClear(); - }); - - describe('CORS', () => { - it('should allow localhost origin on any port', async () => { - setupMiddleware(app, createTestSettings(), mockCreateError); - app.get('/test', (req, res) => res.json({ ok: true })); - - const response = await request(app) - .get('/test') - .set('Origin', 'http://localhost:8080') - .set('Host', 'localhost:3000'); - - expect(response.headers['access-control-allow-origin']).toBe('http://localhost:8080'); - }); - - it('should allow 127.0.0.1 origin on any port', async () => { - setupMiddleware(app, createTestSettings(), mockCreateError); - app.get('/test', (req, res) => res.json({ ok: true })); - - const response = await request(app) - .get('/test') - .set('Origin', 'http://127.0.0.1:9000') - .set('Host', '127.0.0.1:3000'); - - expect(response.headers['access-control-allow-origin']).toBe('http://127.0.0.1:9000'); - }); - - it('should allow https localhost origins', async () => { - setupMiddleware(app, createTestSettings(), mockCreateError); - app.get('/test', (req, res) => res.json({ ok: true })); - - const response = await request(app) - .get('/test') - .set('Origin', 'https://localhost:443') - .set('Host', 'localhost:3000'); - - expect(response.headers['access-control-allow-origin']).toBe('https://localhost:443'); - }); - - it('should reject non-localhost origins', async () => { - setupMiddleware(app, createTestSettings(), mockCreateError); - app.get('/test', 
(req, res) => res.json({ ok: true })); - - const response = await request(app) - .get('/test') - .set('Origin', 'http://evil.com') - .set('Host', 'localhost:3000'); - - expect(response.status).toBe(500); // CORS error - }); - - it('should allow requests with no origin (CLI clients)', async () => { - setupMiddleware(app, createTestSettings(), mockCreateError); - app.get('/test', (req, res) => res.json({ ok: true })); - - const response = await request(app) - .get('/test') - .set('Host', 'localhost:3000'); - - expect(response.status).toBe(200); - }); - }); - - describe('Authentication', () => { - it('should require Bearer token when auth enabled', async () => { - setupMiddleware(app, createTestSettings({ enableAuth: true }), mockCreateError); - app.post('/mcp', (req, res) => res.json({ ok: true })); - - const response = await request(app) - .post('/mcp') - .set('Host', 'localhost:3000'); - - expect(response.status).toBe(401); - }); - - it('should accept valid Bearer token', async () => { - setupMiddleware(app, createTestSettings({ enableAuth: true, apiKey: 'secret123' }), mockCreateError); - app.post('/mcp', (req, res) => res.json({ ok: true })); - - const response = await request(app) - .post('/mcp') - .set('Host', 'localhost:3000') - .set('Authorization', 'Bearer secret123'); - - expect(response.status).toBe(200); - }); - - it('should reject invalid Bearer token', async () => { - setupMiddleware(app, createTestSettings({ enableAuth: true, apiKey: 'secret123' }), mockCreateError); - app.post('/mcp', (req, res) => res.json({ ok: true })); - - const response = await request(app) - .post('/mcp') - .set('Host', 'localhost:3000') - .set('Authorization', 'Bearer wrong-token'); - - expect(response.status).toBe(401); - }); - }); - - describe('Host validation', () => { - it('should allow localhost host header', async () => { - setupMiddleware(app, createTestSettings(), mockCreateError); - app.get('/test', (req, res) => res.json({ ok: true })); - - const response = await 
request(app) - .get('/test') - .set('Host', 'localhost:3000'); - - expect(response.status).toBe(200); - }); - - it('should allow 127.0.0.1 host header', async () => { - setupMiddleware(app, createTestSettings(), mockCreateError); - app.get('/test', (req, res) => res.json({ ok: true })); - - const response = await request(app) - .get('/test') - .set('Host', '127.0.0.1:3000'); - - expect(response.status).toBe(200); - }); - - it('should reject non-localhost host header', async () => { - setupMiddleware(app, createTestSettings(), mockCreateError); - app.get('/test', (req, res) => res.json({ ok: true })); - - const response = await request(app) - .get('/test') - .set('Host', 'evil.com'); - - expect(response.status).toBe(403); - }); - }); -}); -``` - -**Step 2: Run test to verify it fails** - -Run: `npm test -- middleware.test.ts` -Expected: FAIL - CORS tests fail because middleware still uses old configurable CORS - -**Step 3: Update middleware implementation** - -Modify `src/server/middleware.ts`: - -```typescript -import { Express, Request, Response } from 'express'; -import express from 'express'; -import cors from 'cors'; -import { MCPServerSettings } from '../types/settings-types'; -import { ErrorCodes } from '../types/mcp-types'; - -export function setupMiddleware(app: Express, settings: MCPServerSettings, createErrorResponse: (id: any, code: number, message: string) => any): void { - // Parse JSON bodies - app.use(express.json()); - - // CORS configuration - Always enabled with fixed localhost-only policy - const corsOptions = { - origin: (origin: string | undefined, callback: (err: Error | null, allow?: boolean) => void) => { - // Allow requests with no origin (like CLI clients, curl, MCP SDKs) - if (!origin) { - return callback(null, true); - } - - // Allow localhost and 127.0.0.1 on any port, both HTTP and HTTPS - const localhostRegex = /^https?:\/\/(localhost|127\.0\.0\.1)(:\d+)?$/; - if (localhostRegex.test(origin)) { - callback(null, true); - } else { - 
callback(new Error('Not allowed by CORS')); - } - }, - credentials: true - }; - app.use(cors(corsOptions)); - - // Authentication middleware - Always enabled, scoped to the /mcp endpoint - // (other routes stay reachable so CORS and host validation can be tested alone) - app.use('/mcp', (req: Request, res: Response, next: any) => { - // Defensive check: if no API key is set, reject all requests - if (!settings.apiKey || settings.apiKey.trim() === '') { - return res.status(500).json(createErrorResponse(null, ErrorCodes.InternalError, 'Server misconfigured: No API key set')); - } - - const authHeader = req.headers.authorization; - const providedKey = authHeader?.replace('Bearer ', ''); - - if (providedKey !== settings.apiKey) { - return res.status(401).json(createErrorResponse(null, ErrorCodes.InvalidRequest, 'Unauthorized')); - } - next(); - }); - - // Origin validation for security (DNS rebinding protection) - app.use((req: Request, res: Response, next: any) => { - const host = req.headers.host; - - // Only allow localhost connections - if (host && !host.startsWith('localhost') && !host.startsWith('127.0.0.1')) { - return res.status(403).json(createErrorResponse(null, ErrorCodes.InvalidRequest, 'Only localhost connections allowed')); - } - - next(); - }); -} -``` - -**Step 4: Install test dependencies** - -Run: `npm install --save-dev supertest @types/supertest` - -**Step 5: Run test to verify it passes** - -Run: `npm test -- middleware.test.ts` -Expected: PASS - All tests pass - -**Step 6: Commit** - -```bash -git add src/server/middleware.ts tests/middleware.test.ts package.json package-lock.json -git commit -m "refactor: use fixed localhost-only CORS policy, make auth mandatory" -``` - ---- - -## Task 4: Update Main Plugin to Auto-Generate and Encrypt API Keys - -**Files:** -- Modify: `src/main.ts` -- Create: `tests/main-migration.test.ts` - -**Step 1: Write the failing test** - -Create `tests/main-migration.test.ts`: - -```typescript -import { generateApiKey } from '../src/utils/auth-utils'; -import { encryptApiKey, decryptApiKey } from '../src/utils/encryption-utils'; -import { 
DEFAULT_SETTINGS } from '../src/types/settings-types'; - -// Mock electron -jest.mock('electron', () => ({ - safeStorage: { - isEncryptionAvailable: jest.fn(() => true), - encryptString: jest.fn((data: string) => Buffer.from(`encrypted:${data}`)), - decryptString: jest.fn((buffer: Buffer) => { - const str = buffer.toString(); - return str.replace('encrypted:', ''); - }) - } -})); - -describe('Settings Migration', () => { - describe('API key initialization', () => { - it('should generate API key if empty', () => { - const settings = { ...DEFAULT_SETTINGS, apiKey: '' }; - - // Simulate what plugin should do - if (!settings.apiKey) { - settings.apiKey = generateApiKey(); - } - - expect(settings.apiKey).toBeTruthy(); - expect(settings.apiKey.length).toBeGreaterThanOrEqual(32); - }); - - it('should encrypt API key on save', () => { - const plainKey = generateApiKey(); - const encrypted = encryptApiKey(plainKey); - - expect(encrypted).toMatch(/^encrypted:/); - expect(encrypted).not.toBe(plainKey); - }); - - it('should decrypt API key on load', () => { - const plainKey = generateApiKey(); - const encrypted = encryptApiKey(plainKey); - const decrypted = decryptApiKey(encrypted); - - expect(decrypted).toBe(plainKey); - }); - }); - - describe('Legacy settings migration', () => { - it('should remove enableCORS from legacy settings', () => { - const legacySettings: any = { - ...DEFAULT_SETTINGS, - enableCORS: true, - allowedOrigins: ['*'] - }; - - // Simulate migration - delete legacySettings.enableCORS; - delete legacySettings.allowedOrigins; - - expect(legacySettings.enableCORS).toBeUndefined(); - expect(legacySettings.allowedOrigins).toBeUndefined(); - }); - - it('should preserve other settings during migration', () => { - const legacySettings: any = { - ...DEFAULT_SETTINGS, - port: 4000, - enableCORS: false, - allowedOrigins: ['http://localhost:8080'], - notificationsEnabled: true - }; - - // Simulate migration - const { enableCORS, allowedOrigins, ...migrated } = 
legacySettings; - - expect(migrated.port).toBe(4000); - expect(migrated.notificationsEnabled).toBe(true); - }); - }); -}); -``` - -**Step 2: Run test to verify it fails** - -Run: `npm test -- main-migration.test.ts` -Expected: PASS (these are just verification tests, but main.ts hasn't been updated yet) - -**Step 3: Update main.ts plugin initialization** - -Find the `onload()` method in `src/main.ts` and add API key initialization and encryption: - -```typescript -// In MCPServerPlugin class, modify onload() method: -async onload() { - await this.loadSettings(); - - // Auto-generate API key if not set - if (!this.settings.apiKey || this.settings.apiKey.trim() === '') { - console.log('Generating new API key...'); - this.settings.apiKey = generateApiKey(); - await this.saveSettings(); - } - - // Migrate legacy settings (remove enableCORS and allowedOrigins) - const legacySettings = this.settings as any; - if ('enableCORS' in legacySettings || 'allowedOrigins' in legacySettings) { - console.log('Migrating legacy CORS settings...'); - delete legacySettings.enableCORS; - delete legacySettings.allowedOrigins; - await this.saveSettings(); - } - - // Rest of existing onload code... -} -``` - -**Step 4: Update loadSettings() to decrypt API key** - -Add decryption to the `loadSettings()` method: - -```typescript -async loadSettings() { - const data = await this.loadData(); - this.settings = Object.assign({}, DEFAULT_SETTINGS, data); - - // Decrypt API key if encrypted - if (this.settings.apiKey) { - try { - this.settings.apiKey = decryptApiKey(this.settings.apiKey); - } catch (error) { - console.error('Failed to decrypt API key:', error); - new Notice('⚠️ Failed to decrypt API key. 
Please regenerate in settings.'); - this.settings.apiKey = ''; - } - } -} -``` - -**Step 5: Update saveSettings() to encrypt API key** - -Add encryption to the `saveSettings()` method: - -```typescript -async saveSettings() { - // Create a copy of settings for saving - const settingsToSave = { ...this.settings }; - - // Encrypt API key before saving - if (settingsToSave.apiKey) { - settingsToSave.apiKey = encryptApiKey(settingsToSave.apiKey); - } - - await this.saveData(settingsToSave); - - // Update server settings if running - if (this.mcpServer) { - this.mcpServer.updateSettings(this.settings); - } -} -``` - -**Step 6: Add necessary imports to main.ts** - -Add these imports at the top of `src/main.ts`: - -```typescript -import { generateApiKey } from './utils/auth-utils'; -import { encryptApiKey, decryptApiKey } from './utils/encryption-utils'; -``` - -**Step 7: Run build to verify no TypeScript errors** - -Run: `npm run build` -Expected: Build succeeds with no errors - -**Step 8: Commit** - -```bash -git add src/main.ts tests/main-migration.test.ts -git commit -m "feat: auto-generate and encrypt API keys, migrate legacy CORS settings" -``` - ---- - -## Task 5: Update Settings UI - -**Files:** -- Modify: `src/settings.ts:60-90` (remove CORS settings) -- Modify: `src/settings.ts:92-164` (update auth section) - -**Step 1: Remove CORS settings from UI** - -Modify `src/settings.ts`, delete lines 60-90 (CORS toggle and allowed origins settings): - -```typescript -// DELETE THESE SECTIONS: -// - "Enable CORS" toggle (lines 61-72) -// - "Allowed origins" text input (lines 74-90) -``` - -**Step 2: Update authentication section** - -Modify the authentication section in `src/settings.ts` (around line 92-114): - -Replace: -```typescript -// Authentication -new Setting(containerEl) - .setName('Enable authentication') - .setDesc('Require API key for requests (requires restart)') - .addToggle(toggle => toggle - .setValue(this.plugin.settings.enableAuth) - .onChange(async 
(value) => { - this.plugin.settings.enableAuth = value; - - // Auto-generate API key when enabling authentication - if (value && (!this.plugin.settings.apiKey || this.plugin.settings.apiKey.trim() === '')) { - this.plugin.settings.apiKey = generateApiKey(); - new Notice('✅ API key generated automatically'); - } - - await this.plugin.saveSettings(); - if (this.plugin.mcpServer?.isRunning()) { - new Notice('⚠️ Server restart required for authentication changes to take effect'); - } - - // Refresh the display to show the new key - this.display(); - })); -``` - -With: -```typescript -// Authentication (Always Enabled) -containerEl.createEl('h3', {text: 'Authentication'}); - -const authDesc = containerEl.createEl('p', { - text: 'Authentication is required for all requests. Your API key is encrypted and stored securely using your system\'s credential storage.' -}); -authDesc.style.fontSize = '0.9em'; -authDesc.style.color = 'var(--text-muted)'; -authDesc.style.marginBottom = '16px'; - -// Show encryption status -const { isEncryptionAvailable } = require('./utils/encryption-utils'); -const encryptionStatus = containerEl.createEl('p', { - text: isEncryptionAvailable() - ? '🔒 Encryption: Available (using system keychain)' - : '⚠️ Encryption: Unavailable (API key stored in plaintext)' -}); -encryptionStatus.style.fontSize = '0.85em'; -encryptionStatus.style.marginBottom = '12px'; -encryptionStatus.style.fontStyle = 'italic'; -``` - -**Step 3: Update "API Key Display" condition** - -Change line 117 from: -```typescript -if (this.plugin.settings.enableAuth) { -``` - -To: -```typescript -// Always show API key section (auth is always enabled) -{ -``` - -And update the closing brace accordingly. 
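Both the auto-generation in Task 4 and the regenerate button in this settings UI depend on `generateApiKey` from `auth-utils`, which the plan assumes already exists. A minimal sketch of such a generator using Node's `crypto` module (the project's actual implementation may differ):

```typescript
import { randomBytes } from 'crypto';

// Hypothetical sketch; the project's real auth-utils may differ.
// 32 random bytes hex-encoded yields a 64-character key, comfortably
// above the 32-character minimum the migration tests assert.
export function generateApiKey(): string {
  return randomBytes(32).toString('hex');
}
```

Using a CSPRNG (`randomBytes`) rather than `Math.random()` matters here, since this key is the only thing gating access to the vault over HTTP.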
- -**Step 4: Update MCP Client Configuration section** - -Modify the configuration generation (around line 179-193) to always include auth: - -Replace: -```typescript -// Generate JSON config based on auth settings -const mcpConfig: any = { - "mcpServers": { - "obsidian-mcp": { - "serverUrl": `http://127.0.0.1:${this.plugin.settings.port}/mcp` - } - } -}; - -// Only add headers if authentication is enabled -if (this.plugin.settings.enableAuth && this.plugin.settings.apiKey) { - mcpConfig.mcpServers["obsidian-mcp"].headers = { - "Authorization": `Bearer ${this.plugin.settings.apiKey}` - }; -} -``` - -With: -```typescript -// Generate JSON config (auth always included) -const mcpConfig = { - "mcpServers": { - "obsidian-mcp": { - "serverUrl": `http://127.0.0.1:${this.plugin.settings.port}/mcp`, - "headers": { - "Authorization": `Bearer ${this.plugin.settings.apiKey || 'YOUR_API_KEY_HERE'}` - } - } - } -}; -``` - -**Step 5: Add import for encryption utils** - -Add this import at the top of `src/settings.ts`: - -```typescript -import { isEncryptionAvailable } from './utils/encryption-utils'; -``` - -**Step 6: Test the UI manually** - -Manual test checklist: -1. Open Obsidian dev tools (Ctrl+Shift+I) -2. Open plugin settings -3. Verify no CORS toggle visible -4. Verify no "Allowed origins" field visible -5. Verify "Authentication" section shows "always enabled" message -6. Verify encryption status is displayed -7. Verify API key is shown -8. Verify "Copy Key" and "Regenerate Key" buttons work -9. Verify MCP client configuration includes Authorization header - -**Step 7: Commit** - -```bash -git add src/settings.ts -git commit -m "refactor: simplify settings UI, remove CORS toggles, show encryption status" -``` - ---- - -## Task 6: Update Documentation - -**Files:** -- Modify: `README.md` -- Modify: `CLAUDE.md` - -**Step 1: Update README.md** - -Find security/configuration sections and update: - -1. Remove mentions of CORS configuration toggle -2. 
Update authentication section to indicate it's mandatory -3. Add note about API key encryption - -Example changes: - -```markdown -## Security - -The plugin implements multiple security layers: - -- **Network binding**: Server binds to `127.0.0.1` only (no external access) -- **Host header validation**: Prevents DNS rebinding attacks -- **CORS policy**: Fixed localhost-only policy for web-based clients -- **Mandatory authentication**: All requests require Bearer token -- **Encrypted storage**: API keys encrypted using system keychain (macOS Keychain, Windows Credential Manager, Linux Secret Service) - -## Configuration - -### Authentication - -Authentication is **mandatory** and cannot be disabled. An API key is automatically generated when you first install the plugin and is encrypted using your system's secure credential storage. - -To use the API: - -```bash -curl -X POST http://127.0.0.1:3000/mcp \ - -H "Authorization: Bearer YOUR_API_KEY" \ - -H "Content-Type: application/json" \ - -d '{"jsonrpc":"2.0","method":"ping","id":1}' -``` - -### Settings - -- **Port**: HTTP server port (default: 3000) -- **Auto-start**: Start server automatically when Obsidian launches -- **API Key**: Auto-generated, encrypted key (can regenerate in settings) -``` - -**Step 2: Update CLAUDE.md** - -Update the settings documentation: - -```markdown -## Settings - -MCPPluginSettings (src/types/settings-types.ts): -- `port`: HTTP server port (default: 3000) -- `autoStart`: Start server on plugin load -- `apiKey`: Required authentication token (encrypted at rest) -- `enableAuth`: Always true (kept for backward compatibility) -- `notificationsEnabled`: Show tool call notifications in Obsidian UI -- `notificationDuration`: Auto-dismiss time for notifications -- `showParameters`: Include parameters in notifications -- `logToConsole`: Log tool calls to console - -**Removed settings** (as of 2025-10-25): -- `enableCORS`: CORS is now always enabled with fixed localhost-only policy -- 
`allowedOrigins`: Origin allowlist removed, only localhost origins allowed -``` - -Update security model section: - -```markdown -## Security Model - -- Server binds to `127.0.0.1` only (no external access) -- Host header validation prevents DNS rebinding attacks -- CORS fixed to localhost-only origins (`http(s)://localhost:*`, `http(s)://127.0.0.1:*`) -- **Mandatory authentication** via Bearer token (auto-generated, encrypted) -- API keys encrypted using Electron's safeStorage (system keychain) -``` - -**Step 3: Commit** - -```bash -git add README.md CLAUDE.md -git commit -m "docs: update for mandatory auth and simplified CORS" -``` - ---- - -## Task 7: Update Existing Tests - -**Files:** -- Modify: Any tests that mock settings with old CORS fields -- Check: `tests/note-tools.test.ts`, `tests/vault-tools.test.ts` - -**Step 1: Search for tests using old settings** - -Run: `grep -r "enableCORS\|allowedOrigins" tests/` -Expected: Find files that need updating - -**Step 2: Update test mocks** - -For each test file found, update settings mocks: - -Replace: -```typescript -const mockSettings = { - port: 3000, - enableCORS: true, - allowedOrigins: ['*'], - apiKey: 'test-key', - enableAuth: true -}; -``` - -With: -```typescript -const mockSettings = { - port: 3000, - apiKey: 'test-key', - enableAuth: true -}; -``` - -**Step 3: Run full test suite** - -Run: `npm test` -Expected: All tests pass - -**Step 4: Fix any failing tests** - -If tests fail due to missing settings fields, update them to use the new structure. - -**Step 5: Commit** - -```bash -git add tests/ -git commit -m "test: update mocks for new settings structure" -``` - ---- - -## Task 8: Add Coverage Regression Protection - -**Files:** -- Modify: `package.json` (add coverage threshold check) -- Create: `.github/workflows/coverage-check.yml` (if CI exists) - -**Step 1: Add coverage threshold to jest config** - -If `jest.config.js` exists, add: - -```javascript -module.exports = { - // ... 
existing config - coverageThreshold: { - global: { - statements: 99, - branches: 95, - functions: 99, - lines: 99 - } - } -}; -``` - -If using `package.json` jest config: - -```json -{ - "jest": { - "coverageThreshold": { - "global": { - "statements": 99, - "branches": 95, - "functions": 99, - "lines": 99 - } - } - } -} -``` - -**Step 2: Run coverage to verify thresholds met** - -Run: `npm run test:coverage` -Expected: Coverage meets or exceeds thresholds - -**Step 3: Commit** - -```bash -git add package.json jest.config.js -git commit -m "test: add coverage regression protection" -``` - ---- - -## Task 9: Manual Integration Testing - -**Manual test checklist:** - -**Step 1: Fresh install test** -1. Remove plugin from test vault -2. Copy built plugin files to vault -3. Enable plugin -4. Verify API key auto-generated -5. Check `.obsidian/plugins/obsidian-mcp-server/data.json` - key should be encrypted -6. Verify server starts successfully - -**Step 2: Migration test** -1. Create legacy settings file with `enableCORS: true` and `allowedOrigins: ['*']` -2. Reload plugin -3. Verify settings migrated (old fields removed) -4. Verify API key generated if missing -5. Verify server still works - -**Step 3: API key encryption test** -1. Regenerate API key in settings -2. Copy key to clipboard -3. Stop Obsidian -4. Open `data.json` - verify key is encrypted (starts with "encrypted:") -5. Restart Obsidian -6. Verify server starts and accepts the same key - -**Step 4: Authentication test** -1. Start server -2. Try request without auth: `curl http://127.0.0.1:3000/mcp -d '{"jsonrpc":"2.0","method":"ping","id":1}'` -3. Verify 401 Unauthorized -4. Try with correct key: `curl -H "Authorization: Bearer YOUR_KEY" ...` -5. Verify 200 OK - -**Step 5: CORS test (if you have a local web client)** -1. Create simple HTML file with fetch to `http://localhost:3000/mcp` -2. Serve on `http://localhost:8080` -3. Verify request succeeds (CORS allowed) -4. 
Try from `http://example.com` (if possible) -5. Verify request fails (CORS blocked) - -**Step 6: Verify no regressions** -1. Test all MCP tools work (read_note, create_note, etc.) -2. Test notifications still work -3. Test server stop/start/restart -4. Test settings save/load - -**Expected:** All manual tests pass - -**Document results:** - -Create `docs/testing/manual-test-results-2025-10-25.md` with results. - ---- - -## Task 10: Final Verification and Cleanup - -**Files:** -- Review all changed files -- Check for any remaining references to old settings - -**Step 1: Search for remaining references** - -Run these searches: -```bash -grep -r "enableCORS" src/ -grep -r "allowedOrigins" src/ -grep -r "enableAuth.*false" src/ # Should only be in tests -``` - -Expected: No results (except comments/docs) - -**Step 2: Run full test suite with coverage** - -Run: `npm run test:coverage` -Expected: All tests pass, coverage ≥99% - -**Step 3: Build production bundle** - -Run: `npm run build` -Expected: Build succeeds, no errors or warnings - -**Step 4: Check bundle size** - -Run: `ls -lh main.js` -Document size for comparison (should not increase significantly) - -**Step 5: Final commit** - -```bash -git add . -git commit -m "chore: final cleanup for CORS simplification and mandatory auth" -``` - ---- - -## Verification Commands - -After completing all tasks: - -```bash -# Run all tests -npm test - -# Run with coverage -npm run test:coverage - -# Build production -npm run build - -# Type check -npx tsc --noEmit - -# Check for old setting references -grep -r "enableCORS\|allowedOrigins" src/ tests/ -``` - -**Expected results:** -- ✅ All tests pass -- ✅ Coverage ≥99% -- ✅ Build succeeds -- ✅ No TypeScript errors -- ✅ No references to removed settings - ---- - -## Rollback Plan - -If issues are discovered: - -1. Revert commits in reverse order -2. Restore original settings types -3. Restore CORS toggle in middleware -4. Remove encryption utilities -5. 
Run tests to verify rollback successful - -## Notes for Engineer - -- **DRY**: Don't duplicate CORS logic, centralize in middleware -- **YAGNI**: Removed unnecessary CORS configuration complexity -- **TDD**: Write tests first for each component -- **Frequent commits**: Commit after each task completes -- **Backward compatibility**: Migration handles legacy settings gracefully -- **Security**: Encryption is best-effort (fallback to plaintext on Linux without keyring) -- **User experience**: Auto-generation means zero config for most users - -## References - -- Electron safeStorage docs: https://www.electronjs.org/docs/latest/api/safe-storage -- Express CORS package: https://www.npmjs.com/package/cors -- Jest testing: https://jestjs.io/docs/getting-started -- TypeScript strict mode: https://www.typescriptlang.org/tsconfig#strict diff --git a/docs/plans/2025-10-26-public-release-version-reset.md b/docs/plans/2025-10-26-public-release-version-reset.md new file mode 100644 index 0000000..00f36a5 --- /dev/null +++ b/docs/plans/2025-10-26-public-release-version-reset.md @@ -0,0 +1,616 @@ +# Public Release Version Reset to 1.0.0 Implementation Plan + +> **For Claude:** REQUIRED SUB-SKILL: Use superpowers:executing-plans to implement this plan task-by-task. + +**Goal:** Reset version to 1.0.0 in preparation for public release while preserving git history + +**Architecture:** Update version identifiers in manifest.json, package.json, and versions.json to mark 1.0.0 as the target for public release. The existing git history (95 commits) will be preserved to demonstrate development quality and maintain context for future contributors. Previous development versions (1.0.0-3.0.0) become private development history. Git tagging will be done separately when development is complete and ready for actual release. 
+ +**Tech Stack:** Node.js version-bump.mjs script, JSON files, git + +--- + +## Context + +Current state: +- Version: 3.0.0 (in manifest.json, package.json) +- versions.json contains: 1.0.0, 1.1.0, 1.2.0, 2.0.0, 2.1.0, 3.0.0 (all mapped to minAppVersion 0.15.0) +- 95 commits in git history with no sensitive data +- Clean commit history with conventional commit format +- CHANGELOG.md contains extensive development history (versions 1.0.0 through 9.0.0) + +Decision: Keep git history (demonstrates quality, security-conscious development, comprehensive testing) and reset version to 1.0.0 in preparation for public release. + +**Important:** Development is ongoing. This plan resets the version number but does NOT create a git tag. The tag will be created separately when development is complete and the plugin is ready for actual public release. + +--- + +## Task 1: Update manifest.json Version + +**Files:** +- Modify: `manifest.json:4` + +**Step 1: Read current manifest.json** + +Verify current version before modifying. + +Run: `cat manifest.json` +Expected: Shows `"version": "3.0.0"` + +**Step 2: Update version to 1.0.0** + +Change version field from "3.0.0" to "1.0.0". + +```json +{ + "id": "obsidian-mcp-server", + "name": "MCP Server", + "version": "1.0.0", + "minAppVersion": "0.15.0", + "description": "Exposes Obsidian vault operations via Model Context Protocol (MCP) over HTTP", + "author": "Bill Ballou", + "isDesktopOnly": true +} +``` + +**Step 3: Verify the change** + +Run: `cat manifest.json | grep version` +Expected: Shows `"version": "1.0.0"` and `"minAppVersion": "0.15.0"` + +--- + +## Task 2: Update package.json Version + +**Files:** +- Modify: `package.json:3` + +**Step 1: Read current package.json** + +Verify current version before modifying. + +Run: `cat package.json | head -5` +Expected: Shows `"version": "3.0.0"` + +**Step 2: Update version to 1.0.0** + +Change version field from "3.0.0" to "1.0.0". 
+ +```json +{ + "name": "obsidian-mcp-server", + "version": "1.0.0", + "description": "MCP (Model Context Protocol) server plugin for Obsidian - exposes vault operations via HTTP", +``` + +**Step 3: Verify the change** + +Run: `cat package.json | grep '"version"'` +Expected: Shows `"version": "1.0.0"` + +--- + +## Task 3: Reset versions.json + +**Files:** +- Modify: `versions.json` (entire file) + +**Step 1: Read current versions.json** + +Verify current content before modifying. + +Run: `cat versions.json` +Expected: Shows entries for 1.0.0 through 3.0.0 + +**Step 2: Replace with single 1.0.0 entry** + +Clear all development version history, keep only 1.0.0 as first public release. + +```json +{ + "1.0.0": "0.15.0" +} +``` + +**Step 3: Verify the change** + +Run: `cat versions.json` +Expected: Shows only one entry: `"1.0.0": "0.15.0"` + +--- + +## Task 4: Update CHANGELOG.md for Public Release + +**Files:** +- Modify: `CHANGELOG.md:1-1366` + +**Step 1: Read current CHANGELOG structure** + +Run: `head -50 CHANGELOG.md` +Expected: Shows "# Changelog" header and extensive version history + +**Step 2: Create new public-release CHANGELOG** + +Replace entire file with simplified version for public release. Remove private development version entries (1.0.0-9.0.0), keep only new 1.0.0 public release entry. + +```markdown +# Changelog + +All notable changes to the Obsidian MCP Server plugin will be documented in this file. + +The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/), and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html). + +--- + +## [1.0.0] - 2025-10-26 + +### 🎉 Initial Public Release + +The Obsidian MCP Server plugin is now publicly available! This plugin exposes your Obsidian vault via the Model Context Protocol (MCP) over HTTP, enabling AI assistants and other MCP clients to interact with your vault programmatically. 
+ +#### Core Features + +**MCP Server** +- HTTP server implementing MCP protocol version 2024-11-05 +- JSON-RPC 2.0 message handling +- Localhost-only binding (127.0.0.1) for security +- Configurable port (default: 3000) +- Auto-start option + +**Note Operations** +- `read_note` - Read note content with optional frontmatter parsing +- `create_note` - Create notes with conflict handling (error/overwrite/rename) +- `update_note` - Update existing notes with concurrency control +- `delete_note` - Delete notes (soft delete to .trash or permanent) +- `update_frontmatter` - Update frontmatter fields without modifying content +- `update_sections` - Update specific sections by line range +- `rename_file` - Rename or move files with automatic wikilink updates +- `read_excalidraw` - Read Excalidraw drawing files with metadata + +**Vault Operations** +- `search` - Advanced search with regex, glob filtering, and snippets +- `search_waypoints` - Find Waypoint plugin markers +- `list` - List files/directories with filtering and pagination +- `stat` - Get detailed file/folder metadata +- `exists` - Quick existence check +- `get_vault_info` - Vault metadata and statistics + +**Waypoint Integration** +- `get_folder_waypoint` - Extract Waypoint blocks from folder notes +- `is_folder_note` - Detect folder notes +- Automatic waypoint edit protection + +**Link Management** +- `validate_wikilinks` - Validate all links in a note +- `resolve_wikilink` - Resolve single wikilink to target path +- `backlinks` - Get backlinks with optional unlinked mentions + +**Security** +- Mandatory Bearer token authentication +- Auto-generated, cryptographically secure API keys (32 characters) +- API keys encrypted using system keychain (macOS Keychain, Windows Credential Manager, Linux Secret Service) +- Host header validation (DNS rebinding protection) +- CORS policy fixed to localhost-only origins +- Desktop-only (requires Node.js HTTP server) + +**User Interface** +- Settings panel with full 
configuration +- Status bar indicator showing server state +- Ribbon icon for quick server toggle +- Start/Stop/Restart commands +- Real-time connection information +- Copy API key and configuration snippets +- Notification system for tool calls (optional) +- Notification history viewer + +**Developer Experience** +- Cross-platform path handling (Windows/macOS/Linux) +- Comprehensive error messages with troubleshooting tips +- Path validation and normalization utilities +- Concurrency control via ETag-based versioning +- Type-safe TypeScript implementation +- Extensive test coverage +- Well-documented codebase + +#### Technical Details + +**Dependencies** +- express: ^4.18.2 +- cors: ^2.8.5 +- obsidian: latest + +**Build** +- TypeScript 4.7.4 +- esbuild 0.17.3 +- Jest 30.2.0 for testing + +**Compatibility** +- Obsidian minimum version: 0.15.0 +- Desktop only (not available on mobile) +- Protocol: MCP 2024-11-05 + +#### Known Limitations + +- Desktop only (requires Node.js HTTP server) +- Single vault per server instance +- HTTP only (no WebSocket support) +- Localhost-only (no SSL/TLS) +- Excalidraw support limited to uncompressed format (compressed format planned) + +--- + +## Future Roadmap + +### Planned Features + +**Resources API** +- Expose notes as MCP resources +- Real-time resource updates + +**Prompts API** +- Templated prompts for common operations +- Custom prompt registration + +**Batch Operations** +- Multiple operations in single request +- Transactional batching + +**WebSocket Transport** +- Real-time updates and notifications +- Bidirectional communication + +**Enhanced Graph API** +- Graph visualization data +- Advanced graph traversal + +**Tag & Canvas APIs** +- Query and manage tags +- Manipulate canvas files + +**Dataview Integration** +- Query vault using Dataview syntax +- Advanced data queries + +**Performance Enhancements** +- Indexing for faster searches +- Caching for frequently accessed notes +- Streaming for large files + +--- + +## 
Support + +For issues, questions, or contributions: +- GitHub Issues: [Report bugs and request features] +- Documentation: See README.md and CLAUDE.md +- Include version number (1.0.0) in bug reports + +--- + +## Credits + +- MCP Protocol: https://modelcontextprotocol.io +- Obsidian API: https://github.com/obsidianmd/obsidian-api +- Built with TypeScript, Express.js, and dedication to quality +``` + +**Step 3: Verify the change** + +Run: `wc -l CHANGELOG.md && head -20 CHANGELOG.md` +Expected: Shows much shorter file (~200 lines vs 1400+), starts with "# Changelog" and "## [1.0.0] - 2025-10-26" + +--- + +## Task 5: Verify All Version Changes + +**Files:** +- Read: `manifest.json`, `package.json`, `versions.json` + +**Step 1: Check all version files** + +Run: `echo "=== manifest.json ===" && cat manifest.json | grep version && echo "=== package.json ===" && cat package.json | grep version && echo "=== versions.json ===" && cat versions.json` + +Expected output: +``` +=== manifest.json === + "version": "1.0.0", + "minAppVersion": "0.15.0", +=== package.json === + "version": "1.0.0", + "version": "node version-bump.mjs && git add manifest.json versions.json" +=== versions.json === +{ + "1.0.0": "0.15.0" +} +``` + +**Step 2: Verify version-bump.mjs script compatibility** + +The version-bump.mjs script (used by `npm version`) reads from package.json and updates manifest.json and versions.json. With all files now at 1.0.0, future version bumps will work correctly. + +Run: `cat version-bump.mjs` +Expected: Script reads `npm_package_version`, updates manifest.json version, and conditionally updates versions.json + +**Step 3: Test build** + +Ensure the plugin still builds correctly after version changes. 
+ +Run: `npm run build` +Expected: TypeScript compiles successfully, esbuild creates main.js, no errors + +--- + +## Task 6: Commit Version Reset + +**Files:** +- Modify: `manifest.json`, `package.json`, `versions.json`, `CHANGELOG.md` + +**Step 1: Review changes to commit** + +Run: `git status` +Expected: Shows modified files: manifest.json, package.json, versions.json, CHANGELOG.md + +**Step 2: Review diff** + +Run: `git diff manifest.json package.json versions.json` +Expected: Shows version changes from 3.0.0 to 1.0.0, versions.json reduced to single entry + +**Step 3: Stage all changes** + +Run: `git add manifest.json package.json versions.json CHANGELOG.md` + +**Step 4: Create commit** + +Run: +```bash +git commit -m "$(cat <<'EOF' +chore: reset version to 1.0.0 for initial public release + +This marks version 1.0.0 as the first public release of the plugin. +Previous versions (1.0.0-3.0.0) were private development iterations. + +Changes: +- Reset manifest.json version to 1.0.0 +- Reset package.json version to 1.0.0 +- Clear versions.json to single entry (1.0.0 -> 0.15.0) +- Rewrite CHANGELOG.md for public release + - Remove private development history + - Document all features as part of 1.0.0 + - Add future roadmap section + +Git history is preserved to demonstrate: +- Development quality and security practices +- Comprehensive test coverage efforts +- Thoughtful evolution of features + +This plugin implements MCP (Model Context Protocol) to expose +Obsidian vault operations via HTTP for AI assistants and other clients. 
+EOF +)" +``` + +**Step 5: Verify commit** + +Run: `git log -1 --stat` +Expected: Shows commit with 4 files changed, commit message explains version reset + +**Step 6: Verify git history is preserved** + +Run: `git log --oneline | wc -l` +Expected: Shows 96 commits (95 previous + 1 new commit) + +--- + +## Task 7: Document Version Reset Decision + +**Files:** +- Create: `docs/VERSION_HISTORY.md` + +**Step 1: Create version history documentation** + +Document why version was reset and what happened to previous versions. + +```markdown +# Version History + +## Public Release Version Strategy + +### Initial Public Release: 1.0.0 (2025-10-26) + +This plugin's first public release is marked as **version 1.0.0**. + +### Development History + +Prior to public release, the plugin went through private development with internal versions 1.0.0 through 3.0.0. These versions were used during development and testing but were never publicly released. + +When preparing for public release, we reset the version to 1.0.0 to clearly mark this as the first public version available to users. + +### Why Reset to 1.0.0? + +**Semantic Versioning**: Version 1.0.0 signals the first stable, public release of the plugin. It indicates: +- The API is stable and ready for public use +- All core features are implemented and tested +- The plugin is production-ready + +**User Clarity**: Starting at 1.0.0 for the public release avoids confusion: +- Users don't wonder "what happened to versions 1-2?" 
+- Version number accurately reflects the public release history +- Clear signal that this is the first version they can install + +**Git History Preserved**: The development history (95 commits) is preserved to: +- Demonstrate development quality and security practices +- Show comprehensive testing and iterative refinement +- Provide context for future contributors +- Maintain git blame and bisect capabilities + +### Version Numbering Going Forward + +From 1.0.0 onward, the plugin follows [Semantic Versioning](https://semver.org/): + +- **MAJOR** version (1.x.x): Incompatible API changes or breaking changes +- **MINOR** version (x.1.x): New functionality in a backward-compatible manner +- **PATCH** version (x.x.1): Backward-compatible bug fixes + +### Development Version Mapping + +For reference, here's what the private development versions contained: + +| Dev Version | Key Features Added | +|-------------|-------------------| +| 1.0.0 | Initial MCP server, basic CRUD tools | +| 1.1.0 | Path normalization, error handling | +| 1.2.0 | Enhanced authentication, parent folder detection | +| 2.0.0 | API unification, typed results | +| 2.1.0 | Discovery endpoints (stat, exists) | +| 3.0.0 | Enhanced list operations | + +All these features are included in the public 1.0.0 release. + +### Commit History + +The git repository contains the complete development history showing the evolution from initial implementation through all features. This history demonstrates: + +- Security-conscious development (API key encryption, authentication) +- Comprehensive test coverage (100% coverage goals) +- Careful refactoring and improvements +- Documentation and planning +- Bug fixes and edge case handling + +No sensitive data exists in the git history (verified via audit). 
+
+---
+
+## Future Versioning
+
+**Next versions** will be numbered according to the changes made:
+
+- **1.0.1**: Bug fixes and patches
+- **1.1.0**: New features (e.g., Resources API, Prompts API)
+- **2.0.0**: Breaking changes to tool schemas or behavior
+
+The CHANGELOG.md will document all public releases starting from 1.0.0.
+```
+
+**Step 2: Verify file was created**
+
+Run: `cat docs/VERSION_HISTORY.md | head -30`
+Expected: Shows version history explanation
+
+**Step 3: Commit version history documentation**
+
+Run:
+```bash
+git add docs/VERSION_HISTORY.md
+git commit -m "docs: add version history explanation for 1.0.0 reset"
+```
+
+---
+
+## Task 8: Final Verification
+
+**Files:**
+- Read: All modified files
+
+**Step 1: Verify all version references**
+
+Check that no stray version references remain.
+
+Run: `grep -r "3\.0\.0" --include="*.json" --include="*.md" . 2>/dev/null | grep -v node_modules | grep -v ".git"`
+Expected: Matches only in documentation that intentionally records the development history (this plan and docs/VERSION_HISTORY.md); manifest.json, package.json, versions.json, and CHANGELOG.md must show no matches
+
+**Step 2: Verify package.json npm version script**
+
+The `npm version` command should work correctly for future version bumps.
+ +Run: `cat package.json | grep '"version"'` +Expected: Shows `"version": "1.0.0"` and version script with version-bump.mjs + +**Step 3: Verify build output** + +Run: `npm run build 2>&1 | tail -5` +Expected: Build succeeds, no errors + +**Step 4: Check git status** + +Run: `git status` +Expected: Working tree clean, no uncommitted changes + +**Step 5: Verify commit history** + +Run: `git log --oneline -5` +Expected: Shows recent commits including version reset and documentation + +**Step 6: Final summary** + +Run: +```bash +echo "=== Version Files ===" && \ +cat manifest.json | grep version && \ +cat package.json | grep '"version"' && \ +cat versions.json && \ +echo "=== Git Info ===" && \ +git log --oneline | wc -l && \ +echo "=== Build Status ===" && \ +ls -lh main.js +``` + +Expected: +- All versions show 1.0.0 +- versions.json has single entry +- Git shows 96+ commits +- main.js exists and is recent + +--- + +## Completion Checklist + +- [ ] manifest.json version is 1.0.0 +- [ ] package.json version is 1.0.0 +- [ ] versions.json contains only {"1.0.0": "0.15.0"} +- [ ] CHANGELOG.md rewritten for public release +- [ ] All changes committed with descriptive message +- [ ] Git history preserved (95+ commits) +- [ ] VERSION_HISTORY.md documents the reset decision +- [ ] No stray 3.0.0 references remain +- [ ] Build succeeds (main.js created) +- [ ] Working tree is clean + +**Note:** Git tag creation (1.0.0) is NOT part of this plan. The tag will be created later when development is complete and the plugin is ready for actual public release. 
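The cross-file checks in Task 8 (manifest.json, package.json, and versions.json agreeing on one release version) can be condensed into a small script. A minimal sketch in plain Node.js JavaScript — a hypothetical helper, not part of the repo; it takes the already-parsed contents of the three files:

```javascript
// Hypothetical consistency check for the three version files used by
// Obsidian plugins. Returns a list of problems; an empty list means
// the files agree on a single release version.
function checkVersionConsistency(manifest, pkg, versions) {
  const problems = [];
  if (manifest.version !== pkg.version) {
    problems.push(`manifest.json says ${manifest.version} but package.json says ${pkg.version}`);
  }
  if (!(manifest.version in versions)) {
    problems.push(`versions.json has no entry for ${manifest.version}`);
  } else if (versions[manifest.version] !== manifest.minAppVersion) {
    problems.push(`versions.json maps ${manifest.version} to ${versions[manifest.version]}, expected ${manifest.minAppVersion}`);
  }
  return problems;
}

// Post-reset state described by this plan:
const manifest = { version: "1.0.0", minAppVersion: "0.15.0" };
const pkg = { version: "1.0.0" };
const versions = { "1.0.0": "0.15.0" };
console.log(checkVersionConsistency(manifest, pkg, versions)); // → []
```

Wiring it to the real files would only need three `JSON.parse(fs.readFileSync(...))` calls in place of the inline objects.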
+ +## Post-Implementation Notes + +After completing this plan, the version numbers are reset to 1.0.0 in preparation for public release: + +**Current State After Plan:** +- Version files (manifest.json, package.json, versions.json) all show 1.0.0 +- CHANGELOG.md rewritten for public consumption +- VERSION_HISTORY.md documents the version reset decision +- Git history preserved with all development commits +- No git tag created yet (tag will be created when ready for actual release) + +**When Ready for Actual Public Release:** + +1. **Final Development**: Complete any remaining development work and commit changes + +2. **Create Git Tag**: Create the 1.0.0 annotated tag: + ```bash + git tag -a 1.0.0 -m "Release 1.0.0 - Initial Public Release" + ``` + +3. **GitHub Release**: Create a GitHub release from the 1.0.0 tag with: + - Release title: "v1.0.0 - Initial Public Release" + - Release notes: Use CHANGELOG.md content for 1.0.0 + - Attach files: manifest.json, main.js, styles.css + +4. **Obsidian Plugin Directory**: Submit to Obsidian's community plugins with: + - Plugin ID: obsidian-mcp-server + - Version: 1.0.0 + - Links to GitHub repository and release + +5. **Future Versions**: Use `npm version [major|minor|patch]` which will: + - Update package.json version + - Run version-bump.mjs to update manifest.json and versions.json + - Create git commit and tag automatically + - Then push tag to trigger release workflow + +The git history demonstrates the quality and care taken during development, while the clean version numbering provides clarity for public users. 
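The `npm version` flow in step 5 above hinges on version-bump.mjs. Assuming it follows the standard Obsidian sample-plugin script (an assumption — the project's copy may differ), its core logic can be sketched as a pure function over the two file contents:

```javascript
// Sketch of the version-bump.mjs behavior described in Task 5 Step 2:
// copy the npm package version into manifest.json and record the current
// minAppVersion for that release in versions.json. Factored as a pure
// function here; the real script reads/writes the files directly.
function bumpVersions(targetVersion, manifestJson, versionsJson) {
  const manifest = JSON.parse(manifestJson);
  const versions = JSON.parse(versionsJson);
  manifest.version = targetVersion;                 // manifest tracks package.json
  versions[targetVersion] = manifest.minAppVersion; // map release -> min Obsidian version
  return {
    manifestJson: JSON.stringify(manifest, null, "\t"),
    versionsJson: JSON.stringify(versions, null, "\t"),
  };
}

// With the files reset by this plan, a future `npm version minor`
// (1.0.0 -> 1.1.0) would produce:
const result = bumpVersions(
  "1.1.0",
  JSON.stringify({ version: "1.0.0", minAppVersion: "0.15.0" }),
  JSON.stringify({ "1.0.0": "0.15.0" })
);
console.log(JSON.parse(result.versionsJson)); // → { '1.0.0': '0.15.0', '1.1.0': '0.15.0' }
```

Because all three files now start at 1.0.0, the first such bump appends cleanly to the single-entry versions.json rather than reviving the private 2.x/3.x history.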
diff --git a/docs/testing/manual-integration-test-checklist.md b/docs/testing/manual-integration-test-checklist.md deleted file mode 100644 index 3fe0f71..0000000 --- a/docs/testing/manual-integration-test-checklist.md +++ /dev/null @@ -1,355 +0,0 @@ -# Manual Integration Testing Checklist -## Task 9: CORS Simplification and Mandatory Auth - -**Date:** 2025-10-25 -**Implementation Plan:** docs/plans/2025-10-25-simplify-cors-mandatory-auth.md -**Purpose:** Verify that all code changes work correctly in a real Obsidian environment - ---- - -## Test 1: Fresh Install Test - -### Prerequisites -- Access to a test vault -- Built plugin files (main.js, manifest.json, styles.css) - -### Steps -1. ✅ Remove plugin from test vault (if exists): `rm -rf .obsidian/plugins/obsidian-mcp-server/` -2. ✅ Build plugin: `npm run build` -3. ✅ Copy built plugin files to vault: `.obsidian/plugins/obsidian-mcp-server/` -4. ✅ Enable plugin in Obsidian Settings → Community Plugins -5. ✅ Open browser console (Ctrl+Shift+I) -6. ✅ Verify log message: "Generating new API key..." -7. ✅ Check `.obsidian/plugins/obsidian-mcp-server/data.json`: - - Key should be present - - Key should start with "encrypted:" (if encryption available) -8. ✅ Verify server starts successfully (check plugin settings or console) - -### Expected Results -- [ ] API key auto-generated on first install -- [ ] Key is encrypted in data.json -- [ ] No CORS settings in data.json -- [ ] Server starts without errors -- [ ] No "enableCORS" or "allowedOrigins" fields in data.json - ---- - -## Test 2: Migration Test - -### Prerequisites -- Test vault with plugin already installed -- Access to data.json file - -### Steps -1. ✅ Stop Obsidian -2. ✅ Manually edit `.obsidian/plugins/obsidian-mcp-server/data.json`: - ```json - { - "port": 3000, - "enableCORS": true, - "allowedOrigins": ["*"], - "enableAuth": false, - "apiKey": "old-plaintext-key", - "autoStart": false - } - ``` -3. ✅ Save file -4. ✅ Start Obsidian -5. 
✅ Open browser console -6. ✅ Verify log message: "Migrating legacy CORS settings..." -7. ✅ Check updated data.json: - - "enableCORS" should be removed - - "allowedOrigins" should be removed - - "enableAuth" should be true - - "apiKey" should be encrypted -8. ✅ Verify server still works - -### Expected Results -- [ ] Legacy CORS settings removed from data.json -- [ ] enableAuth set to true -- [ ] API key encrypted (if not already) -- [ ] Other settings preserved (port, autoStart, notifications) -- [ ] Server functionality not affected - ---- - -## Test 3: API Key Encryption Test - -### Prerequisites -- Plugin installed and running -- Access to plugin settings UI - -### Steps -1. ✅ Open plugin settings in Obsidian -2. ✅ Locate "API Key Management" section -3. ✅ Click "Copy Key" button -4. ✅ Note the plaintext key (save to clipboard) -5. ✅ Stop Obsidian completely -6. ✅ Open `.obsidian/plugins/obsidian-mcp-server/data.json` -7. ✅ Verify apiKey field starts with "encrypted:" (or is plaintext if encryption unavailable) -8. ✅ Restart Obsidian -9. ✅ Open plugin settings -10. ✅ Verify API key display shows the same plaintext key from step 4 -11. ✅ Verify server starts and accepts the key - -### Expected Results -- [ ] API key displayed in plaintext in UI -- [ ] API key encrypted in data.json file -- [ ] Same key works after restart -- [ ] "Copy Key" button copies plaintext key -- [ ] Encryption status indicator shows correct state - ---- - -## Test 4: API Key Regeneration Test - -### Prerequisites -- Plugin installed with existing API key -- Access to plugin settings UI - -### Steps -1. ✅ Open plugin settings -2. ✅ Copy current API key (note it down) -3. ✅ Click "Regenerate Key" button -4. ✅ Verify success notification -5. ✅ Verify displayed key has changed -6. ✅ Copy new key -7. ✅ Verify old key ≠ new key -8. ✅ Stop Obsidian -9. ✅ Check data.json - verify encrypted key has changed -10. ✅ Restart Obsidian -11. ✅ Verify new key is displayed correctly -12. 
✅ Verify server restart prompt if the server was running

### Expected Results

- [ ] Regenerate button generates a new key
- [ ] New key is different from old key
- [ ] New key is properly encrypted on disk
- [ ] New key persists across restart
- [ ] Server restart prompted if needed

---

## Test 5: Authentication Test

### Prerequisites

- Plugin installed and server running
- curl or similar HTTP client

### Steps

1. ✅ Start the MCP server from plugin settings
2. ✅ Copy the API key from the settings UI
3. ✅ Try a request WITHOUT auth:
   ```bash
   curl -X POST http://127.0.0.1:3000/mcp \
     -H "Content-Type: application/json" \
     -d '{"jsonrpc":"2.0","method":"ping","id":1}'
   ```
4. ✅ Verify the response is 401 Unauthorized
5. ✅ Try a request WITH the correct Bearer token:
   ```bash
   curl -X POST http://127.0.0.1:3000/mcp \
     -H "Content-Type: application/json" \
     -H "Authorization: Bearer YOUR_API_KEY_HERE" \
     -d '{"jsonrpc":"2.0","method":"ping","id":1}'
   ```
6. ✅ Verify the response is 200 OK with a pong result
7. ✅ Try a request with a WRONG Bearer token:
   ```bash
   curl -X POST http://127.0.0.1:3000/mcp \
     -H "Content-Type: application/json" \
     -H "Authorization: Bearer wrong-token" \
     -d '{"jsonrpc":"2.0","method":"ping","id":1}'
   ```
8. ✅ Verify the response is 401 Unauthorized

### Expected Results

- [ ] Requests without auth rejected (401)
- [ ] Requests with an invalid token rejected (401)
- [ ] Requests with a valid token accepted (200)
- [ ] No way to bypass authentication

---

## Test 6: CORS Test (Optional - Requires Web Client)

### Prerequisites

- MCP server running
- Simple HTML file or local web server

### Steps

1. ✅ Create a test HTML file (a minimal page along these lines; the endpoint and Bearer header follow Test 5):
   ```html
   <!DOCTYPE html>
   <html>
   <body>
     <button id="test">Test CORS</button>
     <pre id="result"></pre>
     <script>
       // Replace with your actual API key (step 2)
       const apiKey = "YOUR_API_KEY_HERE";
       document.getElementById("test").addEventListener("click", async () => {
         try {
           const res = await fetch("http://127.0.0.1:3000/mcp", {
             method: "POST",
             headers: {
               "Content-Type": "application/json",
               "Authorization": "Bearer " + apiKey,
             },
             body: JSON.stringify({ jsonrpc: "2.0", method: "ping", id: 1 }),
           });
           document.getElementById("result").textContent =
             res.status + " " + (await res.text());
         } catch (err) {
           document.getElementById("result").textContent = "CORS/network error: " + err;
         }
       });
     </script>
   </body>
   </html>
   ```
2. ✅ Update the apiKey placeholder in the HTML with your actual key
3. ✅ Serve the HTML from localhost:8080: `python3 -m http.server 8080`
4. ✅ Open http://localhost:8080 in a browser
5. ✅ Click "Test CORS" button
6. ✅ Verify the request succeeds (CORS allowed from localhost:8080)
7. ✅ Try accessing from a non-localhost origin (if possible)
8. ✅ Verify CORS blocks non-localhost origins

### Expected Results

- [ ] Requests from localhost origins succeed
- [ ] Requests from 127.0.0.1 origins succeed
- [ ] Requests from non-localhost origins blocked by CORS
- [ ] HTTPS localhost origins also work

---

## Test 7: Settings UI Verification

### Prerequisites

- Plugin installed
- Access to plugin settings

### Steps

1. ✅ Open Obsidian Settings → Community Plugins → Obsidian MCP Server
2. ✅ Verify NO "Enable CORS" toggle is visible
3. ✅ Verify NO "Allowed origins" text input is visible
4. ✅ Verify NO "Enable authentication" toggle is visible
5. ✅ Verify the "Authentication" heading is present
6. ✅ Verify the description text mentions "mandatory" and "encrypted"
7. ✅ Verify the encryption status indicator is displayed:
   - 🔒 "Encryption: Available" OR
   - ⚠️ "Encryption: Unavailable"
8. ✅ Verify the "API Key Management" section is always visible
9. ✅ Verify the API key is displayed in a monospace font
10. ✅ Verify "Copy Key" and "Regenerate Key" buttons are visible
11. ✅ Verify the "MCP Client Configuration" section always includes the Authorization header

### Expected Results

- [ ] No CORS configuration options visible
- [ ] No authentication toggle (always enabled)
- [ ] Clear messaging about mandatory auth
- [ ] Encryption status displayed
- [ ] API key section always visible
- [ ] Configuration snippet includes the auth header

---

## Test 8: No Regressions Test

### Prerequisites

- Plugin installed and server running
- Test vault with notes

### Steps

1.
✅ Test all MCP tools work:
   - `read_note` - Read an existing note
   - `create_note` - Create a new note
   - `update_note` - Modify a note
   - `delete_note` - Delete a note
   - `list` - List notes in the vault
   - `search` - Search for text
   - Other tools as applicable
2. ✅ Test notifications (if enabled):
   - Enable notifications in settings
   - Call an MCP tool
   - Verify the notification appears in Obsidian
3. ✅ Test server controls:
   - Stop the server
   - Start the server
   - Restart the server
4. ✅ Test settings save/load:
   - Change the port number
   - Toggle autoStart
   - Restart Obsidian
   - Verify settings are preserved

### Expected Results

- [ ] All MCP tools function correctly
- [ ] No console errors related to the CORS/auth changes
- [ ] Notifications work as before
- [ ] Server controls work correctly
- [ ] Settings persist across restarts
- [ ] No functionality regressions

---

## Test 9: Error Handling Test

### Prerequisites

- Plugin installed

### Steps

1. ✅ Test the empty API key scenario:
   - Stop Obsidian
   - Edit data.json to set `apiKey: ""`
   - Start Obsidian
   - Verify a new key is auto-generated
2. ✅ Test decryption failure:
   - Stop Obsidian
   - Edit data.json to set `apiKey: "encrypted:invalid-base64!!!"`
   - Start Obsidian
   - Verify an error notice is displayed
   - Verify the user is prompted to regenerate the key
3.
✅ Test server start with no API key:
   - Stop Obsidian
   - Edit data.json to remove the apiKey field entirely
   - Start Obsidian
   - Verify a key is auto-generated
   - Verify the server can start

### Expected Results

- [ ] Empty API key triggers auto-generation
- [ ] Invalid encrypted key shows an error notice
- [ ] User can recover from decryption failures
- [ ] Server doesn't start with an invalid key state

---

## Summary Checklist

After completing all tests above, verify:

- [ ] Fresh install generates and encrypts an API key
- [ ] Legacy CORS settings are migrated correctly
- [ ] API keys are encrypted at rest
- [ ] API key regeneration works
- [ ] Authentication is mandatory and enforced
- [ ] CORS allows localhost origins only
- [ ] Settings UI shows the correct options (no CORS, no auth toggle)
- [ ] Encryption status is displayed
- [ ] All existing MCP tools work correctly
- [ ] No console errors related to the changes
- [ ] Error scenarios handled gracefully

---

## Test Results

**Tester:** [Name]
**Date:** [Date]
**Obsidian Version:** [Version]
**Plugin Version:** [Version]
**Platform:** [Windows/macOS/Linux]

**Overall Status:** [ ] PASS / [ ] FAIL
**Notes:**

diff --git a/docs/testing/manual-test-task5-settings-ui.md b/docs/testing/manual-test-task5-settings-ui.md
deleted file mode 100644
index 8a44294..0000000
--- a/docs/testing/manual-test-task5-settings-ui.md
+++ /dev/null
@@ -1,97 +0,0 @@

# Manual Testing Checklist - Task 5: Settings UI Updates

**Date:** 2025-10-25
**Task:** Update Settings UI to reflect mandatory authentication and encryption

## Changes Made

### Step 2: Updated Authentication Section

- ✅ Removed "Enable authentication" toggle
- ✅ Added "Authentication (Always Enabled)" heading (h3)
- ✅ Added description: "Authentication is required for all requests. Your API key is encrypted and stored securely using your system's credential storage."
- ✅ Added encryption status indicator showing:
  - "🔒 Encryption: Available (using system keychain)" when available
  - "⚠️ Encryption: Unavailable (API key stored in plaintext)" when not available

### Step 3: Updated API Key Display

- ✅ Changed condition from `if (this.plugin.settings.enableAuth)` to always show
- ✅ API key section now always visible since auth is mandatory

### Step 4: Updated MCP Client Configuration

- ✅ Changed from conditional auth headers to always including them
- ✅ Authorization header always included in the generated config
- ✅ Fallback text "YOUR_API_KEY_HERE" if apiKey is missing

### Step 5: Added Encryption Utils Import

- ✅ Added import for `isEncryptionAvailable` from './utils/encryption-utils'

### Additional Fixes

- ✅ Fixed variable name collision: renamed `buttonContainer` to `apiKeyButtonContainer` in the API key section

## What to Verify Manually (When Available in Obsidian)

Since this is a settings UI change, manual verification would include:

### Visual Verification

1. ✅ **CORS Settings Removed** - No "Enable CORS" toggle visible
2. ✅ **No "Allowed Origins" field** - Field should not be present
3. ✅ **Authentication Section**:
   - Should show "Authentication" heading
   - Should display the description about mandatory authentication
   - Should show encryption status (🔒 or ⚠️ depending on platform)
4. ✅ **API Key Section**:
   - Should always be visible (not conditional)
   - Should show "Copy Key" and "Regenerate Key" buttons
   - Should display the API key in a monospace font
5. ✅ **MCP Client Configuration**:
   - Should always include the Authorization header
   - Config JSON should show the Bearer token

### Functional Verification

1. ✅ **Copy Key Button** - Should copy the API key to the clipboard
2. ✅ **Regenerate Key Button** - Should generate a new key and refresh the display
3. ✅ **Copy Configuration Button** - Should copy the full JSON config with the auth header
4.
✅ **Encryption Status** - Should reflect actual platform capability

## Test Results

### Build Status

- ✅ TypeScript compilation: **PASS**
- ✅ Build successful: **PASS**

### Test Suite

- ✅ All 550 tests passed
- ✅ No new test failures
- ✅ Encryption utils tests: **PASS**
- ✅ Settings types tests: **PASS**
- ✅ Main migration tests: **PASS**

## Files Changed

- `/home/bballou/obsidian-mcp-plugin/src/settings.ts`

## Code Changes Summary

1. **Import added**: `isEncryptionAvailable` from encryption-utils
2. **Lines 60-82**: Replaced the authentication toggle with an always-enabled section
3. **Lines 81-127**: Removed the conditional; API key section always visible
4. **Lines 142-152**: Config always includes the Authorization header
5. **Line 92**: Renamed variable to avoid collision

## Observations

- All changes align with the Task 5 specifications
- No regression in existing functionality
- Settings UI now correctly reflects the mandatory authentication model
- Encryption status provides user transparency about security

## Issues Encountered

1. **Variable Name Collision**:
   - Issue: Two `buttonContainer` variables in the same scope
   - Resolution: Renamed to `apiKeyButtonContainer` in the API key section
   - Impact: No functional change; compiler error resolved

## Next Steps

- Commit changes as per Step 7
- Integration testing in Obsidian (when available)
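The decryption-failure scenarios exercised in the error-handling test hinge on the `encrypted:` prefix convention described in the key-encryption test. A minimal TypeScript sketch of that convention follows; `encryptKey`/`decryptKey` and the base64 stand-in are illustrative assumptions, not the plugin's actual encryption-utils API (which delegates to the system keychain):

```typescript
// Hypothetical sketch of the "encrypted:" prefix handling described in the
// checklists above. Base64 stands in for real platform encryption so the
// sketch is self-contained.
const PREFIX = "encrypted:";

function encryptKey(plaintext: string): string {
  return PREFIX + Buffer.from(plaintext, "utf8").toString("base64");
}

function decryptKey(stored: string): string {
  if (!stored.startsWith(PREFIX)) {
    // Plaintext fallback for platforms where encryption is unavailable.
    return stored;
  }
  const body = stored.slice(PREFIX.length);
  // Reject input that is not valid base64, e.g. "encrypted:invalid-base64!!!"
  // from the decryption-failure test, so the UI can prompt for regeneration.
  if (!/^[A-Za-z0-9+\/]+={0,2}$/.test(body)) {
    throw new Error("Decryption failed: regenerate the API key");
  }
  return Buffer.from(body, "base64").toString("utf8");
}
```

The important behaviors are the round-trip (stored key decrypts back to the plaintext shown in the UI), the plaintext passthrough when no prefix is present, and the loud failure on a corrupt `encrypted:` payload rather than silently serving a broken key.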