Design a cache system with configurable eviction policies.
interface CacheConfig {
  policy: "LRU" | "LFU";
  maxSize: number;
  ttlDays: number;
}

interface Cache<K, V> {
  get(key: K): V | null;
  put(key: K, value: V): void;
  cleanup(): void;
}

class CacheFactory {
  static createCache<K, V>(config: CacheConfig): Cache<K, V>;
}
Requirements:
1. Cache Policies
- LRU implementation
- LFU implementation
- Policy switching support
2. Data Management
- TTL enforcement
- Daily cleanup
- Size limits
3. Thread Safety
- Concurrent access
- Atomic operations
- Lock management
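A minimal sketch of how the LRU half of this might look, assuming TTL is stored per entry as an expiry timestamp and leaving synchronization (the thread-safety requirements) out of scope; the entry shape and the LFU branching shown here are illustrative, not part of the spec.

// Sketch only: single-threaded LRU with per-entry TTL; no locking shown.
type Entry<V> = { value: V; expiresAt: number };

class LruCache<K, V> implements Cache<K, V> {
  // A Map preserves insertion order, which doubles as recency order here.
  private entries = new Map<K, Entry<V>>();

  constructor(private config: CacheConfig) {}

  get(key: K): V | null {
    const entry = this.entries.get(key);
    if (!entry) return null;
    if (Date.now() > entry.expiresAt) {   // TTL enforcement on read
      this.entries.delete(key);
      return null;
    }
    this.entries.delete(key);             // re-insert to mark as most recently used
    this.entries.set(key, entry);
    return entry.value;
  }

  put(key: K, value: V): void {
    if (this.entries.has(key)) this.entries.delete(key);
    if (this.entries.size >= this.config.maxSize) {
      // Size limit: evict the least recently used entry (first in insertion order).
      const oldest = this.entries.keys().next().value as K;
      this.entries.delete(oldest);
    }
    const ttlMs = this.config.ttlDays * 24 * 60 * 60 * 1000;
    this.entries.set(key, { value, expiresAt: Date.now() + ttlMs });
  }

  cleanup(): void {                       // intended to be run by a daily job
    const now = Date.now();
    for (const [key, entry] of this.entries) {
      if (now > entry.expiresAt) this.entries.delete(key);
    }
  }
}

class CacheFactory {
  static createCache<K, V>(config: CacheConfig): Cache<K, V> {
    if (config.policy === "LRU") return new LruCache<K, V>(config);
    // An LFU variant would track access counts instead of recency; omitted here.
    throw new Error(`Policy ${config.policy} not implemented in this sketch`);
  }
}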
Merge overlapping intervals into a non-overlapping set.
function mergeIntervals(intervals: number[][]): number[][]
Examples:
1. Input: [[1,3],[2,6],[8,10],[15,18]]
Output: [[1,6],[8,10],[15,18]]
Explanation: [1,3] and [2,6] overlap -> [1,6]
2. Input: [[1,4],[4,5]]
Output: [[1,5]]
Explanation: Intervals that touch at a boundary ([1,4] and [4,5]) merge into [1,5]
Constraints:
- 1 <= intervals.length <= 10^4
- intervals[i].length == 2
- 0 <= start_i <= end_i <= 10^4
Implementation:
1. Sort intervals by start time
2. Walk the sorted list, extending the current merged interval while the next interval starts at or before its end
3. Handle edge cases (a single interval, fully nested intervals, intervals that only touch at an endpoint)
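A sketch following those three steps; copying before sorting is just to avoid mutating the caller's input.

function mergeIntervals(intervals: number[][]): number[][] {
  if (intervals.length === 0) return [];

  // 1. Sort by start time (on a copy, so the input is left untouched).
  const sorted = intervals.map(i => i.slice()).sort((a, b) => a[0] - b[0]);

  // 2. Merge overlaps: extend the last merged interval while the next one
  //    starts at or before its end (so touching intervals also merge).
  const merged: number[][] = [sorted[0]];
  for (let i = 1; i < sorted.length; i++) {
    const last = merged[merged.length - 1];
    const [start, end] = sorted[i];
    if (start <= last[1]) {
      last[1] = Math.max(last[1], end);  // 3. also covers nested intervals like [1,10],[2,3]
    } else {
      merged.push([start, end]);
    }
  }
  return merged;
}

// mergeIntervals([[1,3],[2,6],[8,10],[15,18]]) -> [[1,6],[8,10],[15,18]]
// mergeIntervals([[1,4],[4,5]])                -> [[1,5]]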
Implement mapLimit, a concurrency limiter that applies an async function to a list of inputs with at most a fixed number running at once.
function mapLimit<T, R>(
  inputs: T[],
  limit: number,
  iteratee: (input: T) => Promise<R>,
  callback: (results: R[]) => void
): void
Example:
const mockFunction = (input: number) =>
  new Promise<string>(resolve =>
    setTimeout(() => resolve(`User${input}`), Math.random() * 5000)
  );

mapLimit(
  [0, 1, 2, 3, 4, 5],
  2,
  mockFunction,
  console.log
);
// Output: ["User0","User1","User2","User3","User4","User5"]
Requirements:
- Max N concurrent functions
- Maintain result order
- Handle errors gracefully
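A sketch of one way to meet these requirements, using a shared cursor and at most `limit` tasks in flight; the error policy shown (leave the slot empty and continue) is an assumption, since the spec only asks that errors be handled gracefully.

function mapLimit<T, R>(
  inputs: T[],
  limit: number,
  iteratee: (input: T) => Promise<R>,
  callback: (results: R[]) => void
): void {
  const results: R[] = new Array(inputs.length);
  let nextIndex = 0;             // shared cursor over the inputs
  let pending = inputs.length;   // tasks not yet finished

  if (inputs.length === 0) {
    callback([]);
    return;
  }

  const runNext = (): void => {
    if (nextIndex >= inputs.length) return;
    const index = nextIndex++;
    iteratee(inputs[index])
      .then(result => { results[index] = result; })  // write into the input's slot to keep order
      .catch(() => { /* assumed policy: leave the slot empty and continue */ })
      .finally(() => {
        pending--;
        if (pending === 0) callback(results);
        else runNext();                              // a slot freed up: start the next task
      });
  };

  // Start at most `limit` workers; each pulls the next input when it finishes.
  for (let i = 0; i < Math.min(limit, inputs.length); i++) runNext();
}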
Implement rand7() using the rand2() API.
// Given:
function rand2(): number // Returns 1 or 2 with equal probability
// Implement:
function rand7(): number // Returns an integer 1-7, uniformly distributed
Examples (n is the number of rand7() calls; each output is one possible random result, with all values in 1-7):
1. Input: n = 1
Output: [2]
2. Input: n = 2
Output: [2,6]
3. Input: n = 3
Output: [3,5,7]
Follow-up Questions:
1. What is the expected number of rand2() calls per rand7() result?
2. How can the number of rand2() calls be minimized?
3. How would you generalize this to randN()?
4. How would you verify uniformity with a chi-squared test?
Implementation Strategy:
1. Combine several rand2() calls into a uniformly distributed value over a larger range
2. Reject values that fall outside 1-7 and redraw (rejection sampling)
3. Optimize the expected number of rand2() calls
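A sketch of that strategy: three rand2() calls produce eight equally likely values, the eighth is rejected and redrawn, so each accepted value is uniform over 1-7 and the expected cost is 3 × 8/7 ≈ 3.43 rand2() calls per result.

declare function rand2(): number; // given API: returns 1 or 2 uniformly at random

function rand7(): number {
  while (true) {
    // Three independent bits give a uniform value in 0..7.
    const bits =
      (rand2() - 1) * 4 +
      (rand2() - 1) * 2 +
      (rand2() - 1);
    if (bits < 7) return bits + 1;  // accept 0..6, mapped to 1..7
    // bits === 7 (probability 1/8): reject and retry so the result stays uniform.
  }
}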
Design an in-memory rate limiter for an API endpoint.
interface RateLimiter {
  canProcess(timestamp: number): boolean;
}

class RequestLimiter implements RateLimiter {
  constructor(requestsPerSecond: number);
  canProcess(timestamp: number): boolean;
}
Example:
maxRequests = 2 per second
12:00:01.100 PASS
12:00:01.200 PASS
12:00:01.300 FAIL
12:00:02.100 PASS
12:00:02.150 FAIL
12:00:02.200 PASS
Follow-up Considerations:
1. Multi-User Support
- Per-user limits
- User identification
- Limit inheritance
2. Request Weighting
- Priority levels
- Weight factors
- Dynamic adjustment
3. Distributed System
- Synchronization
- Consistency
- Failure handling
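A sketch of the single-user case, assuming timestamps are in milliseconds and using a sliding one-second window counted over accepted requests, which is the reading that matches the PASS/FAIL pattern in the example; the follow-ups (per-user limits, weighting, distributed state) would layer on top of this, for example by keeping one window per user key.

class RequestLimiter implements RateLimiter {
  private accepted: number[] = [];  // timestamps (ms) of requests that passed

  constructor(private requestsPerSecond: number) {}

  canProcess(timestamp: number): boolean {
    // Drop accepted requests that fell out of the 1-second sliding window.
    const windowStart = timestamp - 1000;
    while (this.accepted.length > 0 && this.accepted[0] <= windowStart) {
      this.accepted.shift();
    }
    if (this.accepted.length >= this.requestsPerSecond) {
      return false;  // limit reached within the current window
    }
    this.accepted.push(timestamp);
    return true;
  }
}

// With requestsPerSecond = 2, this reproduces the example above:
// 01.100 PASS, 01.200 PASS, 01.300 FAIL, 02.100 PASS, 02.150 FAIL, 02.200 PASS.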