Add caching layer to improve performance when repeatedly accessing the same OpenAPI specs:

- LRU cache with max 10 entries and 15-minute TTL
- Cache key includes mtime for local files (change detection)
- URL normalization for consistent remote spec caching
- noCache parameter on all tools to bypass cache
- Response includes cached:true/false indicator

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
# Task: Remote Spec Caching
## Related Documents

- Analysis: `docs/analysis/feature-enhancements-analysis.md`
- Branch: `feature/caching` (from `feature/typescript-options`)
## Priority

NORMAL
## Objective
Add an in-memory LRU cache for parsed OpenAPI specs to improve performance when repeatedly accessing the same remote URLs or local files.
## Definition of Done
- Code implemented per specification
- TypeScript compilation CLEAN
- ALL tests passing
- Manual verification with remote spec
- PROOF PROVIDED (cache hit/miss demonstration)
## Scope

### IN SCOPE
- LRU cache with configurable max entries (default 10)
- TTL-based expiration (default 15 minutes)
- Cache key includes mtime for local files
- `noCache` parameter to bypass cache
- Cache invalidation on parse errors
### OUT OF SCOPE
- Persistent (disk) caching
- ETag/304 support for remote URLs
- Cache warming/preloading
- Distributed caching
## Sub-Tasks

### Phase 1: Cache Implementation

#### 1.1 Create cache module
- Details: Implement LRU cache class with TTL support
- Files: `src/lib/cache.ts` (new)
- Testing: Unit test cache hit/miss/expiration

#### 1.2 Define cache entry interface
- Details: Store spec, metadata, timestamp, dereferenced flag
- Files: `src/lib/cache.ts`, `src/lib/types.ts`
- Testing: TypeScript compilation

### Phase 2: Parser Integration

#### 2.1 Add cache to parseSpec
- Details: Check cache before parsing, store result after
- Files: `src/lib/parser.ts`
- Testing: Verify cache hit on repeated calls

#### 2.2 Implement cache key strategy
- Details: URL normalization for remote, path+mtime for local
- Files: `src/lib/cache.ts`
- Testing: Local file change triggers re-parse
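The mtime-based change detection from 2.2 can be sanity-checked with a short standalone script. The temp-file name and the `localKey` helper below are illustrative only, not part of the codebase:

```typescript
import * as fs from 'node:fs';
import * as os from 'node:os';
import * as path from 'node:path';

// Illustrative helper mirroring the local-file branch of getCacheKey:
// the key embeds mtimeMs, so editing the file yields a new key.
function localKey(p: string): string {
  const resolved = path.resolve(p);
  return `${resolved}:${fs.statSync(resolved).mtimeMs}`;
}

const tmp = path.join(os.tmpdir(), `spec-demo-${process.pid}.json`);
fs.writeFileSync(tmp, '{"openapi":"3.0.0"}');
const before = localKey(tmp);

// Bump the mtime, as an editor save would
fs.utimesSync(tmp, new Date(), new Date(Date.now() + 1000));
const after = localKey(tmp);

// before !== after, so the stale cached entry is simply never looked up again
fs.unlinkSync(tmp);
```

Because the key changes rather than the entry being invalidated, stale entries for edited files age out of the LRU naturally.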
### Phase 3: Tool Integration

#### 3.1 Add noCache parameter to tools
- Details: Optional parameter to bypass cache
- Files:
src/tools/parse.ts,src/tools/validate.ts,src/tools/query.ts,src/tools/schema.ts,src/tools/generate.ts - Testing: noCache=true forces fresh parse
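The intended bypass flow can be sketched roughly as follows; the handler shape, the `cached` indicator, and the `parseFresh` stand-in are assumptions for illustration, not the actual tool API:

```typescript
// Sketch: wiring an optional noCache flag into a tool handler.
interface ParseArgs {
  path: string;
  noCache?: boolean;
}

const cache = new Map<string, unknown>();
let parses = 0; // instrumentation for this example only

function parseFresh(specPath: string): unknown {
  parses++;
  return { parsedFrom: specPath }; // stand-in for real parsing
}

function handleParse(args: ParseArgs): { spec: unknown; cached: boolean } {
  // noCache skips the lookup but still refreshes the stored entry
  if (!args.noCache && cache.has(args.path)) {
    return { spec: cache.get(args.path)!, cached: true };
  }
  const spec = parseFresh(args.path);
  cache.set(args.path, spec);
  return { spec, cached: false };
}
```

Note that `noCache` bypasses the read but still writes the fresh result back, so a forced re-parse also repairs a stale entry for subsequent callers.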
## Files to Modify

- `src/lib/cache.ts`: NEW - LRU cache implementation
- `src/lib/types.ts`: Add CacheEntry interface
- `src/lib/parser.ts`: Integrate cache into parseSpec
- `src/tools/*.ts`: Add noCache parameter to all tools
## Risks & Mitigations
| Risk | Impact | Mitigation |
|---|---|---|
| Stale cache for rapidly changing specs | MEDIUM | Short TTL (15 min), noCache param |
| Memory pressure with large specs | MEDIUM | LRU eviction, max 10 entries |
| Local file changes not detected | LOW | Include mtime in cache key |
## Testing Strategy

- Build: `npm run build` - must pass
- Manual verification:
```shell
# First call - should parse fresh
echo '{"jsonrpc":"2.0","id":1,"method":"tools/call","params":{"name":"parse-spec","arguments":{"path":"https://petstore.swagger.io/v2/swagger.json"}}}' | node dist/index.js

# Second call - should hit cache (faster)
echo '{"jsonrpc":"2.0","id":2,"method":"tools/call","params":{"name":"parse-spec","arguments":{"path":"https://petstore.swagger.io/v2/swagger.json"}}}' | node dist/index.js

# With noCache - should bypass
echo '{"jsonrpc":"2.0","id":3,"method":"tools/call","params":{"name":"parse-spec","arguments":{"path":"https://petstore.swagger.io/v2/swagger.json","noCache":true}}}' | node dist/index.js
```
## Implementation Notes

### Cache Key Strategy
```typescript
import * as fs from 'node:fs';
import * as path from 'node:path';

function getCacheKey(specPath: string): string {
  if (specPath.startsWith('http://') || specPath.startsWith('https://')) {
    // Normalize URL: remove trailing slash, sort query params
    return normalizeUrl(specPath);
  }
  // Local file: include mtime for change detection
  const resolved = path.resolve(specPath);
  const stat = fs.statSync(resolved);
  return `${resolved}:${stat.mtimeMs}`;
}
```
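`normalizeUrl` is referenced above but not shown; one possible sketch using the WHATWG URL API (the exact normalization rules are an assumption) could be:

```typescript
// Hypothetical normalizeUrl: canonicalize a remote spec URL so that
// equivalent URLs map to the same cache key.
function normalizeUrl(raw: string): string {
  const url = new URL(raw);
  url.searchParams.sort(); // stable query-parameter order
  url.hash = '';           // fragments never affect the fetched document
  if (url.pathname.length > 1 && url.pathname.endsWith('/')) {
    url.pathname = url.pathname.slice(0, -1); // drop trailing slash
  }
  return url.toString();
}
```

With these rules, `https://example.com/spec/?b=2&a=1` and `https://example.com/spec?a=1&b=2` would share one cache entry.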
### LRU Cache Structure
```typescript
interface CacheEntry {
  spec: OpenAPISpec;
  metadata: ParsedSpec;
  dereferenced: boolean;
  timestamp: number;
}

class SpecCache {
  private cache: Map<string, CacheEntry>;
  private maxSize: number;
  private ttlMs: number;

  constructor(maxSize = 10, ttlMinutes = 15) {
    this.cache = new Map();
    this.maxSize = maxSize;
    this.ttlMs = ttlMinutes * 60 * 1000;
  }

  get(key: string): CacheEntry | undefined {
    const entry = this.cache.get(key);
    if (!entry) return undefined;
    if (Date.now() - entry.timestamp > this.ttlMs) {
      this.cache.delete(key);
      return undefined;
    }
    // Re-insert to move the entry to the end (most recently used)
    this.cache.delete(key);
    this.cache.set(key, entry);
    return entry;
  }

  set(key: string, entry: CacheEntry): void {
    // Delete first so overwriting an existing key never triggers eviction
    this.cache.delete(key);
    if (this.cache.size >= this.maxSize) {
      // Evict the least recently used (first) entry
      const firstKey = this.cache.keys().next().value;
      if (firstKey !== undefined) this.cache.delete(firstKey);
    }
    this.cache.set(key, { ...entry, timestamp: Date.now() });
  }

  invalidate(key?: string): void {
    if (key) {
      this.cache.delete(key);
    } else {
      this.cache.clear();
    }
  }
}
```
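The eviction order can be demonstrated with a trimmed, self-contained variant of the class above (hypothetical `MiniLru` name, `maxSize` of 2 for brevity, no TTL):

```typescript
// Minimal standalone LRU demo mirroring SpecCache's eviction rules.
class MiniLru<V> {
  private map = new Map<string, V>();
  constructor(private maxSize: number) {}

  get(key: string): V | undefined {
    const v = this.map.get(key);
    if (v === undefined) return undefined;
    this.map.delete(key); // re-insert to mark as most recently used
    this.map.set(key, v);
    return v;
  }

  set(key: string, value: V): void {
    this.map.delete(key); // avoid double-counting an existing key
    if (this.map.size >= this.maxSize) {
      const oldest = this.map.keys().next().value;
      if (oldest !== undefined) this.map.delete(oldest);
    }
    this.map.set(key, value);
  }

  has(key: string): boolean {
    return this.map.has(key);
  }
}

const lru = new MiniLru<string>(2);
lru.set('a', 'spec-a');
lru.set('b', 'spec-b');
lru.get('a');           // touch 'a' so 'b' becomes least recently used
lru.set('c', 'spec-c'); // evicts 'b', not 'a'
```

Insertion order in a `Map` doubles as recency order here, which is why `get` deletes and re-inserts the entry rather than keeping a separate recency list.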