Tests

This directory contains unit and integration tests for the Obsidian MCP Server plugin.

Current Status

The test files currently serve as documentation of expected behavior. To run them as actual tests, you first need to set up a testing framework:

  1. Install Jest and related dependencies:
npm install --save-dev jest @types/jest ts-jest
  2. Create a jest.config.js file in the project root:
module.exports = {
  preset: 'ts-jest',
  testEnvironment: 'node',
  roots: ['<rootDir>/tests'],
  testMatch: ['**/*.test.ts'],
  moduleFileExtensions: ['ts', 'tsx', 'js', 'jsx', 'json', 'node'],
  collectCoverageFrom: [
    'src/**/*.ts',
    '!src/**/*.d.ts',
  ],
};
  3. Add test scripts to package.json:
{
  "scripts": {
    "test": "jest",
    "test:watch": "jest --watch",
    "test:coverage": "jest --coverage"
  }
}
  4. Run the tests:
npm test

Test Files

path-utils.test.ts

Tests for the PathUtils class, covering:

  • Path normalization (cross-platform)
  • Path validation
  • File/folder resolution
  • Path manipulation utilities

Key Test Categories:

  • normalizePath: Tests for handling leading/trailing slashes, backslashes, drive letters
  • isValidVaultPath: Tests for path validation rules
  • Cross-platform: Tests for Windows, macOS, and Linux path handling
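To make the normalizePath test categories concrete, here is a minimal sketch of the kind of cross-platform normalization those tests exercise. This is a hypothetical illustration, not the actual PathUtils implementation, which may handle more cases (such as drive letters):

```typescript
// Hypothetical sketch of cross-platform path normalization -- the
// real PathUtils.normalizePath may differ.
function normalizePath(p: string): string {
  return p
    .replace(/\\/g, '/')       // Windows backslashes -> forward slashes
    .replace(/\/{2,}/g, '/')   // collapse duplicate separators
    .replace(/^\//, '')        // strip leading slash (vault paths are relative)
    .replace(/\/$/, '');       // strip trailing slash
}

console.log(normalizePath('\\Notes\\Daily\\'));  // "Notes/Daily"
```

Each bullet above corresponds to one of these transformations, which is why the tests cover leading/trailing slashes and backslashes separately.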

Mocking Obsidian API

Since these tests run outside of Obsidian, you'll need to mock the Obsidian API:

// Example mock setup
jest.mock('obsidian', () => ({
  App: jest.fn(),
  TFile: jest.fn(),
  TFolder: jest.fn(),
  TAbstractFile: jest.fn(),
  // ... other Obsidian types
}));
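For code that reads from the vault, a hand-rolled in-memory mock is often enough and avoids jest.mock entirely. The shape below is an assumption about what the code under test needs (a file list and path lookup), not the full Obsidian Vault API:

```typescript
// Minimal in-memory vault mock (hypothetical shape -- adapt the
// fields to whatever the code under test actually reads).
interface MockFile {
  path: string;
  stat: { size: number; mtime: number };
}

function createMockVault(files: MockFile[]) {
  return {
    getFiles: () => files,
    getAbstractFileByPath: (path: string) =>
      files.find((f) => f.path === path) ?? null,
  };
}

const vault = createMockVault([
  { path: 'Notes/a.md', stat: { size: 120, mtime: 0 } },
]);
console.log(vault.getFiles().length);  // 1
```

Passing a mock like this into the class under test keeps each test's vault state explicit and independent.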

Running Tests Without Jest

If you prefer not to set up Jest, you can:

  1. Use the test files as documentation of expected behavior
  2. Manually test the functionality through the MCP server
  3. Use TypeScript's type checking to catch errors: npm run build

Future Improvements

  • Set up Jest testing framework
  • Add integration tests with mock Obsidian vault
  • Add tests for error-messages.ts
  • Add tests for tool implementations
  • Add tests for MCP server endpoints
  • Set up CI/CD with automated testing
  • Add code coverage reporting

Test Coverage Goals

  • PathUtils: 100% coverage (critical for cross-platform support)
  • ErrorMessages: 100% coverage (important for user experience)
  • Tool implementations: 80%+ coverage
  • Server/middleware: 70%+ coverage

Writing New Tests

When adding new features, please:

  1. Write tests first (TDD approach recommended)
  2. Test both success and error cases
  3. Test edge cases and boundary conditions
  4. Test cross-platform compatibility where relevant
  5. Add descriptive test names that explain the expected behavior

Example test structure:

describe('FeatureName', () => {
  describe('methodName', () => {
    test('should handle normal case', () => {
      // Arrange
      const input = 'test';
      
      // Act
      const result = method(input);
      
      // Assert
      expect(result).toBe('expected');
    });

    test('should handle error case', () => {
      expect(() => method(null)).toThrow();
    });
  });
});