In this tutorial, we cover key concepts and coding challenges you might encounter in a Node.js coding interview, with a focus on high-performance techniques. Each section includes a real-world example with an optimized solution.
1. Asynchronous JavaScript: Callbacks, Promises, and Async/Await
Problem: Handling Multiple API Calls Efficiently
Your task is to fetch user data from multiple APIs and combine the results efficiently.
Solution
Instead of making API calls sequentially, use Promise.all() to run them in parallel.
const fetch = require("node-fetch"); // Node 18+ ships a global fetch, making this import unnecessary

async function fetchUserData(userIds) {
  const apiUrls = userIds.map(id => `https://jsonplaceholder.typicode.com/users/${id}`);
  const userPromises = apiUrls.map(url => fetch(url).then(res => res.json()));
  try {
    const users = await Promise.all(userPromises);
    return users;
  } catch (error) {
    console.error("Error fetching user data:", error);
    throw error; // surface the failure instead of silently returning undefined
  }
}
// Usage
fetchUserData([1, 2, 3]).then(console.log);
Performance Improvement
- Sequential execution: n round trips, so total latency grows linearly with the number of requests.
- Parallel execution: Promise.all() starts all requests at once, so total latency is roughly that of the slowest single request, not the sum.
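To make that difference concrete, here is a self-contained sketch that simulates both strategies. The fetchUser helper is hypothetical; a setTimeout stands in for a ~100 ms network call.

```javascript
// Hypothetical fetchUser: a setTimeout stands in for a ~100 ms network call.
const fetchUser = id =>
  new Promise(resolve => setTimeout(() => resolve({ id }), 100));

// Sequential: each request waits for the previous one (~n * 100 ms total).
async function fetchSequentially(ids) {
  const users = [];
  for (const id of ids) {
    users.push(await fetchUser(id));
  }
  return users;
}

// Parallel: all requests start immediately (~100 ms total, the slowest one).
function fetchInParallel(ids) {
  return Promise.all(ids.map(fetchUser));
}

(async () => {
  let start = Date.now();
  await fetchSequentially([1, 2, 3]);
  console.log(`sequential: ${Date.now() - start} ms`); // ~300 ms

  start = Date.now();
  await fetchInParallel([1, 2, 3]);
  console.log(`parallel: ${Date.now() - start} ms`); // ~100 ms
})();
```

Note that Promise.all() preserves the order of its inputs even though the underlying requests may complete in any order.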
2. Optimized File Handling
Problem: Read and Process a Large File Efficiently
You need to read a large file (e.g., logs, CSV) and process it without blocking the event loop.
Solution
Instead of fs.readFile(), use streams to process data in chunks.
const fs = require("fs");
const readline = require("readline");
async function processLargeFile(filePath) {
  const fileStream = fs.createReadStream(filePath);
  const rl = readline.createInterface({ input: fileStream });
  for await (const line of rl) {
    console.log(`Processing line: ${line}`);
  }
}
// Usage
processLargeFile("largefile.txt");
Performance Improvement
- fs.readFile() loads the entire file into memory (bad for large files).
- Streams process files in chunks (O(1) memory usage), improving efficiency.
3. Handling High-Concurrency Requests (Load Optimization)
Problem: Preventing Overloading in High Traffic APIs
A web server should efficiently handle thousands of concurrent requests.
Solution
Use worker threads to offload the CPU-heavy work. The snippet below spawns a worker per task; a production-grade setup would reuse workers from a pool.
const { Worker } = require("worker_threads");

function runWorker(data) {
  return new Promise((resolve, reject) => {
    const worker = new Worker("./worker.js", { workerData: data });
    worker.on("message", resolve);
    worker.on("error", reject);
    worker.on("exit", code => {
      if (code !== 0) reject(new Error(`Worker stopped with exit code ${code}`));
    });
  });
}

// Worker file (worker.js)
const { parentPort, workerData } = require("worker_threads");
parentPort.postMessage(workerData * 2);

// Usage
(async () => {
  const results = await Promise.all([runWorker(10), runWorker(20)]);
  console.log("Processed results:", results);
})();
Performance Improvement
- Using worker threads offloads CPU-heavy tasks from the main thread.
- Prevents event loop blocking and increases server responsiveness.
4. Optimizing Database Queries
Problem: Slow Query Performance in a High-Traffic System
Your application queries a database frequently, leading to slow responses.
Solution
Use Indexing and Connection Pooling.
Example: Using Indexing for Fast Lookup
CREATE INDEX idx_email ON users(email);
- Without an index: O(n)
- With an index: O(log n) (much faster)
Example: Connection Pooling in PostgreSQL
const { Pool } = require("pg");

const pool = new Pool({
  user: "admin",
  host: "localhost",
  database: "test",
  password: "password",
  max: 10, // Pool size
});

async function getUserByEmail(email) {
  const res = await pool.query("SELECT * FROM users WHERE email = $1", [email]);
  return res.rows[0];
}
Performance Improvement
- Indexing reduces query time.
- Connection pooling minimizes the overhead of opening/closing database connections.
5. Caching for Faster Response Time
Problem: Reduce Load on the Database
If multiple users request the same data, querying the database each time is inefficient.
Solution
Use Redis caching to store frequently accessed data.
const redis = require("redis");

const client = redis.createClient();

async function getUserData(userId) {
  if (!client.isOpen) await client.connect(); // node-redis v4+ is promise-based and must connect first
  const cached = await client.get(`user:${userId}`);
  if (cached) {
    return JSON.parse(cached); // Cache hit: return cached data
  }
  const user = await fetchUserFromDB(userId); // Cache miss: fetch from DB
  await client.setEx(`user:${userId}`, 3600, JSON.stringify(user)); // Store in Redis for 1 hour
  return user;
}
Performance Improvement
- Without caching: every request hits the database.
- With caching: repeated requests are served from Redis, often 10-100x faster than a DB query.
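The same cache-aside pattern can be shown without Redis, using an in-process Map with a TTL. Here fetchUserFromDB is a stand-in for a real database query, and the names are illustrative:

```javascript
const cache = new Map(); // key -> { value, expiresAt }
const TTL_MS = 60 * 60 * 1000; // cache entries for one hour

// Stand-in for a real database query.
async function fetchUserFromDB(userId) {
  return { id: userId, name: `user-${userId}` };
}

async function getUser(userId) {
  const key = `user:${userId}`;
  const hit = cache.get(key);
  if (hit && hit.expiresAt > Date.now()) {
    return hit.value; // cache hit: no DB round trip
  }
  const user = await fetchUserFromDB(userId); // cache miss: query the DB
  cache.set(key, { value: user, expiresAt: Date.now() + TTL_MS });
  return user;
}
```

An in-process cache only helps a single Node.js instance; Redis is what makes the cache shared across many server processes.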
6. Avoiding Memory Leaks in Node.js
Problem: Memory Usage Grows Over Time
A poorly managed application can cause memory leaks, increasing RAM usage.
Solution
- Use WeakMaps instead of Maps for objects that should be garbage collected.
- Monitor memory usage with tools such as heapdump.
Example: Avoiding Memory Leaks with WeakMap
const cache = new WeakMap();

function cacheUser(user) {
  if (!cache.has(user)) {
    cache.set(user, { data: `Cached data for ${user.name}` });
  }
  return cache.get(user);
}
- WeakMap allows garbage collection of unused objects.
- Regular Map holds references, preventing cleanup.
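A small sketch of the Map side of that contrast: the entry survives even after the application drops its last reference to the key.

```javascript
const strongCache = new Map();

function remember(user) {
  strongCache.set(user, { note: `data for ${user.name}` });
}

let user = { name: "Ada" };
remember(user);
user = null; // the application is done with the object...

console.log(strongCache.size); // ...but the Map still pins it in memory: 1
```

A WeakMap deliberately has no size or iteration API, because its entries can vanish whenever the garbage collector reclaims a key.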
7. Rate Limiting to Prevent Abuse
Problem: Preventing Request Flooding
A single client can send too many requests, overloading your server. (Rate limiting mitigates this kind of abuse; a distributed DDoS attack needs additional defenses upstream.)
Solution
Use rate-limiting middleware such as express-rate-limit.
const express = require("express");
const rateLimit = require("express-rate-limit");

const limiter = rateLimit({
  windowMs: 1 * 60 * 1000, // 1 minute
  max: 100, // Limit each IP to 100 requests per minute
});

const app = express();
app.use(limiter);
app.get("/", (req, res) => res.send("Hello!"));
app.listen(3000, () => console.log("Server running..."));
Performance Improvement
- Prevents a single user from overwhelming the system.
- Improves availability by ensuring fair usage.
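Under the hood, fixed-window rate limiting is just a per-client counter per time window. A standalone sketch of that bookkeeping, independent of Express (the window and limit values are illustrative):

```javascript
// Returns a function that answers: is this client still within its limit?
function createLimiter({ windowMs, max }) {
  const hits = new Map(); // ip -> { count, windowStart }
  return ip => {
    const now = Date.now();
    const entry = hits.get(ip);
    if (!entry || now - entry.windowStart >= windowMs) {
      hits.set(ip, { count: 1, windowStart: now }); // new window for this client
      return true;
    }
    entry.count++;
    return entry.count <= max; // false once the limit is exceeded
  };
}

const allow = createLimiter({ windowMs: 60 * 1000, max: 3 });
console.log([1, 2, 3, 4].map(() => allow("1.2.3.4"))); // [true, true, true, false]
```

Production middleware adds details this sketch omits, such as Retry-After headers, shared stores for multi-process deployments, and sliding windows.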
Conclusion
Key Takeaways
Technique | Performance Benefit
---|---
Promise.all() | Runs API calls in parallel (latency of the slowest request, not the sum)
Streams | Reads large files without blocking (O(1) memory usage)
Worker Threads | Handles CPU-intensive tasks without blocking the main thread
Indexing & Pooling | Speeds up database queries (O(log n) lookups)
Redis Caching | Reduces DB load, speeds up responses (often 10-100x)
WeakMap | Prevents memory leaks
Rate Limiting | Protects against abuse
This tutorial prepares you for high-performance Node.js coding interviews by focusing on real-world problems and solutions. Implement these techniques to optimize efficiency, scalability, and reliability in your projects!