async/await is syntactic sugar over Promises, enabling synchronous-looking asynchronous code. async keyword: declares a function that automatically returns a Promise. await keyword: pauses execution until the Promise settles; valid only inside async functions or at ES module top level (Node.js 14.8+). Example: async function loadData() { const data = await readFile('file.txt', 'utf8'); return process(data); } (named loadData to avoid shadowing the global fetch). Error handling: try/catch blocks. Parallel execution: const [a, b] = await Promise.all([fetchA(), fetchB()]). Behind the scenes: desugars to Promise chains built on generator functions. Values returned from async functions are wrapped in a resolved Promise.
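A runnable sketch of these patterns together; fetchUser and fetchPosts are hypothetical stand-ins for real I/O:

```javascript
import { setTimeout as delay } from 'node:timers/promises';

// Hypothetical async steps standing in for real I/O.
async function fetchUser() {
  await delay(10); // simulate latency
  return { id: 1, name: 'Ada' };
}

async function fetchPosts() {
  await delay(10);
  return ['post-1', 'post-2'];
}

async function loadProfile() {
  try {
    // Independent operations run in parallel via Promise.all.
    const [user, posts] = await Promise.all([fetchUser(), fetchPosts()]);
    return { ...user, posts };
  } catch (err) {
    console.error('Failed to load profile:', err);
    throw err; // re-throw so callers can handle it too
  }
}

const profile = await loadProfile();
console.log(profile.name, profile.posts.length); // Ada 2
```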
Node.js FAQ & Answers
35 expert Node.js answers researched from official documentation. Every answer cites authoritative sources you can verify.
process.nextTick() executes callback immediately after current operation, before event loop continues to any phase. setImmediate() executes callback in check phase, after I/O events. Execution order: current operation → process.nextTick() queue → Promise microtasks → event loop phases → setImmediate(). Critical difference: process.nextTick() can starve I/O if used recursively (entire queue must empty), setImmediate() processes one callback per loop iteration. Use setImmediate() for I/O-related callbacks, process.nextTick() only when callback must execute before event loop continues.
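A minimal demonstration of this ordering, scheduled from inside a timer callback so the classic semantics apply (the nextTick queue drains before Promise microtasks, and setImmediate fires later in the check phase):

```javascript
// Records the order in which each scheduling primitive fires.
const order = [];

const done = new Promise((resolve) => {
  setTimeout(() => {
    process.nextTick(() => order.push('nextTick'));      // nextTick queue
    Promise.resolve().then(() => order.push('promise')); // microtask queue
    setImmediate(() => {                                 // check phase
      order.push('setImmediate');
      resolve();
    });
    order.push('sync');                                  // current operation
  }, 0);
});

await done;
console.log(order);
// → [ 'sync', 'nextTick', 'promise', 'setImmediate' ]
```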
Error middleware has 4 parameters, must be last: import { ErrorRequestHandler } from 'express'; const errorHandler: ErrorRequestHandler = (err, req, res, next) => { console.error(err.stack); const status = err.status || 500; const message = process.env.NODE_ENV === 'production' ? 'Internal server error' : err.message; res.status(status).json({ error: message, ...(process.env.NODE_ENV !== 'production' && { stack: err.stack }) }); }; app.use(errorHandler); For async routes, use express-async-errors or wrap: const asyncHandler = (fn) => (req, res, next) => Promise.resolve(fn(req, res, next)).catch(next);. Custom errors: class AppError extends Error { constructor(public status: number, message: string) { super(message); } }.
The event loop is Node's mechanism for non-blocking I/O despite single-threaded JavaScript execution. Six phases: timers (setTimeout/setInterval), pending callbacks (deferred I/O callbacks), idle/prepare (internal), poll (retrieve new I/O events), check (setImmediate callbacks), close callbacks (socket.on('close')). After each callback, Node drains the microtask queues (the process.nextTick() queue first, then Promise microtasks) before continuing. Worker pool (libuv threadpool) handles blocking operations: fs, crypto, DNS, zlib. Never block the event loop with synchronous operations or CPU-intensive tasks without offloading to worker threads.
Methods: 1) Node.js Inspector: node --inspect app.js, then open chrome://inspect (Chrome DevTools), 2) VS Code: add launch.json configuration with type: 'node', program: '${workspaceFolder}/app.js', 3) debugger statement: place in code, execution pauses. Commands: node --inspect (continue), node --inspect-brk (break at start), node --watch --inspect (auto-restart with debugger). Features: breakpoints, watch expressions, call stack, variable inspection. Production: use logging, APM tools (New Relic, Datadog), error tracking (Sentry). Performance: node --prof app.js generates v8.log, process with node --prof-process v8.log.
fs/promises provides Promise-based file operations: import { readFile, writeFile, mkdir, rm, access, readdir } from 'node:fs/promises'; Read: const data = await readFile('file.txt', 'utf8'); Write: await writeFile('file.txt', data, 'utf8'); Create directory: await mkdir('dir', { recursive: true }); Remove: await rm('path', { recursive: true, force: true }); Check exists: await access('file.txt').then(() => true).catch(() => false); Read dir: const files = await readdir('dir'); Use with async/await for clean error handling. Prefer fs/promises over callback-based fs or sync methods (fs.readFileSync blocks the event loop).
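Putting the calls together in a throwaway temp directory so the example is side-effect free:

```javascript
import { mkdtemp, writeFile, readFile, readdir, rm, access } from 'node:fs/promises';
import { join } from 'node:path';
import { tmpdir } from 'node:os';

// Create an isolated temp directory to work in.
const dir = await mkdtemp(join(tmpdir(), 'fsdemo-'));
const file = join(dir, 'notes.txt');

await writeFile(file, 'hello streams', 'utf8');  // create + write
const text = await readFile(file, 'utf8');       // read back
const entries = await readdir(dir);              // list directory contents

// "Does it exist?" via access(): resolves on success, rejects otherwise.
const exists = await access(file).then(() => true).catch(() => false);

await rm(dir, { recursive: true, force: true }); // cleanup
console.log(text, entries, exists); // → hello streams [ 'notes.txt' ] true
```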
Use try/catch blocks for async/await error handling. Pattern: async function() { try { const data = await fetchData(); return process(data); } catch (err) { console.error('Error:', err); throw err; } }. The catch block receives the rejection reason from any awaited Promise. For multiple operations: a single try/catch wraps all await statements. Parallel operations: try { const [a, b] = await Promise.all([opA(), opB()]); } catch (err) { /* first rejection */ }. Global handler: process.on('unhandledRejection', (err) => { /* log and exit */ }); Node.js terminates on unhandled rejections by default.
Node.js 20+ includes stable native test runner (no Jest needed). Run: node --test (finds *.test.js, *.spec.js, test/**/*.js automatically). Basic test: import { test } from 'node:test'; import assert from 'node:assert/strict'; test('adds numbers', () => { assert.equal(1 + 2, 3); }); Async: test('async operation', async () => { const result = await fetchData(); assert.equal(result, expected); }); Features: test.describe() (groups), test.only(), test.skip(), test.todo(), --watch (watch mode), --test-reporter (tap, spec, dot), --experimental-test-coverage (code coverage). Mocking: import { mock } from 'node:test'. Eliminates Jest dependency for simple testing.
Promises represent eventual completion/failure of asynchronous operations. Three states: pending, fulfilled (resolved with value), rejected (failed with error). Creation: new Promise((resolve, reject) => { /* async work */ }). Chaining: promise.then(onFulfilled).catch(onRejected).finally(onFinally). Combinators: Promise.all([p1, p2]) (all succeed or any reject), Promise.allSettled([p1, p2]) (wait for all regardless), Promise.race([p1, p2]) (first to settle), Promise.any([p1, p2]) (first to fulfill). Promises are immutable once settled. Node.js APIs provide Promise versions: fs.promises, timers/promises.
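A compact sketch of the four combinators; ok and fail are illustrative helpers that settle after a short delay:

```javascript
// Helpers: a promise that fulfills, and one that rejects, after ms milliseconds.
const ok = (v, ms) => new Promise((res) => setTimeout(() => res(v), ms));
const fail = (e, ms) => new Promise((_, rej) => setTimeout(() => rej(new Error(e)), ms));

// all: fulfills with every value, rejects as soon as any input rejects.
const allResult = await Promise.all([ok('a', 5), ok('b', 10)]);

// allSettled: always fulfills, with per-promise { status, value | reason }.
const settled = await Promise.allSettled([ok('a', 5), fail('boom', 10)]);

// race: settles with the first promise to settle (fulfilled or rejected).
const raceResult = await Promise.race([ok('fast', 5), ok('slow', 50)]);

// any: fulfills with the first fulfillment, ignoring earlier rejections.
const anyResult = await Promise.any([fail('nope', 5), ok('recovered', 10)]);

console.log(allResult, raceResult, anyResult, settled[1].status);
// → [ 'a', 'b' ] fast recovered rejected
```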
Use streams when: file size >100MB (prevents memory overflow), processing data in chunks (transformation, filtering), real-time data (logs, video/audio streaming), network protocols (HTTP, WebSocket), memory-constrained environments. Benefits: constant memory footprint (independent of data size), faster time-to-first-byte (processing starts immediately), automatic backpressure handling (flow control). Example: fs.createReadStream('huge.csv').pipe(csvParser()).pipe(processTransform).pipe(fs.createWriteStream('output.json')). Avoid streams for small files (<1MB) where overhead outweighs benefits.
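A small in-memory pipeline illustrating the chunk-by-chunk model; the CSV example above follows the same source → transform → sink shape:

```javascript
import { Readable, Transform } from 'node:stream';

// Transform stream that upper-cases each chunk as it flows through.
const upperCase = new Transform({
  transform(chunk, _encoding, callback) {
    callback(null, chunk.toString().toUpperCase());
  },
});

// Source → transform; pipe() handles backpressure automatically.
Readable.from(['hello ', 'streams']).pipe(upperCase);

// Readable streams are async iterables, so the output can be consumed with for await.
let result = '';
for await (const chunk of upperCase) result += chunk;

console.log(result); // → HELLO STREAMS
```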
Access via process.env object: const port = process.env.PORT || 3000; const nodeEnv = process.env.NODE_ENV. Set in shell: NODE_ENV=production node app.js or use .env file with dotenv package: import 'dotenv/config'; (loads .env into process.env). Common variables: NODE_ENV (development/production/test), PORT, DATABASE_URL, API keys. Security: never commit .env to git (add to .gitignore), use secret managers in production (AWS Secrets Manager, HashiCorp Vault). Validation: check required vars at startup: if (!process.env.DATABASE_URL) throw Error('DATABASE_URL required').
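A minimal sketch of the fallback-and-validate pattern; the PORT and DATABASE_URL values are simulated in-process here rather than set in the shell:

```javascript
// Simulate `PORT=8080 node app.js` with DATABASE_URL unset.
process.env.PORT = '8080';
delete process.env.DATABASE_URL;

// env values are always strings; convert and apply a fallback.
const port = Number(process.env.PORT ?? 3000);

// Fail fast at startup on missing required configuration.
function requireEnv(name) {
  const value = process.env[name];
  if (!value) throw new Error(`${name} is required`);
  return value;
}

let startupError = null;
try {
  requireEnv('DATABASE_URL'); // not set → throws at startup
} catch (err) {
  startupError = err.message;
}

console.log(port, startupError); // → 8080 DATABASE_URL is required
```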
Essential practices: 1) Use helmet middleware for security headers (XSS, clickjacking protection), 2) Validate/sanitize all inputs (use validator, express-validator), 3) Rate limiting (express-rate-limit) against brute-force, 4) Use parameterized queries (prevent SQL injection), 5) Keep dependencies updated (npm audit, Snyk), 6) Secure authentication (bcrypt for passwords, Argon2 recommended), 7) HTTPS only, 8) CSRF protection (csurf), 9) Secure session cookies (httpOnly, secure, sameSite), 10) Environment variables for secrets. Install: npm install helmet express-rate-limit. Configuration: app.use(helmet()); app.use(rateLimit({ windowMs: 15 * 60 * 1000, max: 100 })).
Worker Threads: CPU-intensive JavaScript tasks within single process, shared memory (SharedArrayBuffer), fast communication, lower overhead, same event loop context. Cluster: spawn multiple processes sharing server port, utilize all CPU cores for I/O-bound workloads (HTTP servers), complete process isolation, separate memory spaces. Use Worker Threads for: parallel CPU tasks, data processing, maintaining single process. Use cluster for: scaling HTTP servers, fault isolation (crash doesn't affect others), traditional multi-process architecture. Node.js 20+: prefer Worker Threads unless you need process isolation. Can combine both approaches.
Worker Threads enable parallel JavaScript execution in separate V8 isolates within same process. Usage: import { Worker, isMainThread, parentPort } from 'node:worker_threads'; new Worker('./worker.js', { workerData: input }). Communication: worker.postMessage(data), worker.on('message', handler). Features: SharedArrayBuffer (shared memory), Atomics (thread-safe operations), transferable objects (zero-copy). Use for: CPU-intensive JavaScript (image processing, cryptography, data analysis, heavy computation). Don't use for: I/O operations (event loop handles efficiently), external programs (use child_process). Available Node.js 12+ (stable).
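A single-file sketch using eval: true so the worker source can be inlined; real code would point Worker at a separate ./worker.js file instead:

```javascript
import { Worker } from 'node:worker_threads';

// Worker source as a string (CommonJS, which eval'd workers use by default).
const workerSource = `
  const { parentPort, workerData } = require('node:worker_threads');
  // CPU-bound work stays off the main thread.
  const sum = workerData.numbers.reduce((a, b) => a + b, 0);
  parentPort.postMessage(sum);
`;

const result = await new Promise((resolve, reject) => {
  const worker = new Worker(workerSource, {
    eval: true,
    workerData: { numbers: [1, 2, 3, 4] }, // cloned into the worker
  });
  worker.on('message', resolve); // result sent back via postMessage
  worker.on('error', reject);
});

console.log(result); // → 10
```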
Use http module: import http from 'node:http'; const server = http.createServer((req, res) => { res.writeHead(200, { 'Content-Type': 'application/json' }); res.end(JSON.stringify({ message: 'Hello' })); }); server.listen(3000, () => console.log('Server running on port 3000')). Request: req.method, req.url, req.headers. Response: res.writeHead(statusCode, headers), res.write(data), res.end(data). For HTTPS: use https module with TLS certificates. Production: use frameworks (Express, Fastify, Koa) for routing, middleware, and developer experience.
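The same kind of server, exercised end to end with the built-in fetch (Node.js 18+); listening on port 0 asks the OS for a free port:

```javascript
import http from 'node:http';

const server = http.createServer((req, res) => {
  res.writeHead(200, { 'Content-Type': 'application/json' });
  res.end(JSON.stringify({ message: 'Hello', method: req.method, url: req.url }));
});

// Port 0 → OS assigns an available port.
await new Promise((resolve) => server.listen(0, resolve));
const { port } = server.address();

// Hit the server with the global fetch.
const res = await fetch(`http://127.0.0.1:${port}/greet`);
const body = await res.json();

server.close();
console.log(body); // → { message: 'Hello', method: 'GET', url: '/greet' }
```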
Streams process data piece-by-piece without loading entire dataset into memory. Four types: Readable (fs.createReadStream, http.IncomingMessage), Writable (fs.createWriteStream, http.ServerResponse), Duplex (TCP sockets, both read/write), Transform (modify data: zlib, crypto). Core events: 'data' (chunk received), 'end' (complete), 'error' (failure), 'finish' (writing done). Modes: flowing (automatic) vs paused (manual .read()). Piping: readable.pipe(writable) chains streams with automatic backpressure. Use for: large files (>100MB), real-time data, network protocols, memory efficiency.
Node.js 22 LTS excels in production for: RESTful APIs (Express, Fastify, NestJS handle high-traffic microservices), real-time applications (WebSocket chat, live notifications, collaborative tools with native ws module), serverless functions (AWS Lambda, Vercel, Netlify with fast cold starts), data streaming (event-driven architecture for logs, video/audio, IoT), CLI tools and DevOps automation (built-in test runner, watch mode). Major adopters: Netflix, PayPal, LinkedIn, Walmart. Strengths: largest ecosystem (npm 2M+ packages), mature tooling, excellent I/O performance. 2025 position: proven stability vs Bun's speed and Deno's security, remains optimal for enterprise JavaScript backends.
Buffer represents fixed-size binary data allocated outside V8 heap, subclass of Uint8Array. Creation: Buffer.from('string', 'utf8'), Buffer.from([1,2,3]), Buffer.alloc(size) (initialized), Buffer.allocUnsafe(size) (faster, uninitialized). Methods: buf.toString('base64'), buf.write('data'), buf.slice(start, end), buf.compare(other). Encodings: utf8, ascii, base64, hex, binary. Node.js 20+: prefer Uint8Array and TypedArrays for cross-platform code (browser compatibility). Buffer still required for Node.js-specific APIs (fs, crypto). Use .subarray() over .slice() for consistency with TypedArrays.
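A quick tour of the common conversions:

```javascript
const buf = Buffer.from('hello', 'utf8');  // allocate from a string

const b64 = buf.toString('base64');        // encode to base64
const hex = buf.toString('hex');           // encode to hex
const roundTrip = Buffer.from(b64, 'base64').toString('utf8'); // decode back

const zeroed = Buffer.alloc(4);            // initialized to zeros (safe)
const head = buf.subarray(0, 2);           // view, no copy (prefer over slice)

console.log(b64, hex, roundTrip, zeroed[0], head.toString());
// → aGVsbG8= 68656c6c6f hello 0 he
```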
Use jsonwebtoken package: import jwt from 'jsonwebtoken'; const secret = process.env.JWT_SECRET; Sign: const token = jwt.sign({ userId: user.id, role: user.role }, secret, { expiresIn: '1h', algorithm: 'HS256' }); Verify: try { const decoded = jwt.verify(token, secret); req.user = decoded; } catch (err) { return res.status(401).json({ error: 'Invalid token' }); }. Send in Authorization header: 'Bearer ' + token. Middleware: function auth(req, res, next) { const token = req.headers.authorization?.split(' ')[1]; if (!token) return res.status(401).json({ error: 'No token' }); next(); }. Security: use strong secret (32+ bytes), HTTPS only, short expiration.
Use cors middleware: import cors from 'cors'; Simple (allow all): app.use(cors()). Configure: app.use(cors({ origin: 'https://example.com', credentials: true, methods: ['GET', 'POST', 'PUT', 'DELETE'], allowedHeaders: ['Content-Type', 'Authorization'], maxAge: 86400 })). Multiple origins: origin: ['https://app.com', 'https://admin.app.com'] or function. Dynamic: origin: (origin, callback) => { const allowed = ['https://example.com']; callback(null, allowed.includes(origin)); }. Security: whitelist specific origins in production (never use wildcard * with credentials: true). Preflight (OPTIONS) handled automatically.
RBAC pattern: define roles with permissions, check user role in middleware. Implementation: const permissions = { user: ['read'], admin: ['read', 'write', 'delete'] }; function authorize(action: string) { return (req: Request, res: Response, next: NextFunction) => { const userRole = req.user?.role; if (!userRole || !permissions[userRole]?.includes(action)) { return res.status(403).json({ error: 'Forbidden' }); } next(); }; }. Usage: app.delete('/users/:id', authenticate, authorize('delete'), deleteUser). Store user role in JWT payload or session. For complex permissions, use libraries like casbin or accesscontrol.
Middleware are functions executing during request-response cycle with access to req, res, next. Signature: (req: Request, res: Response, next: NextFunction) => void. Execution order: matches registration order. Types: application-level (app.use(middleware)), router-level (router.use(middleware)), route-specific (app.get('/path', middleware, handler)), error-handling (err, req, res, next). Call next() to continue, next(err) to skip to error handler, omit next() to end response. Use for: authentication, logging, parsing (express.json()), validation, CORS.
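A simplified, dependency-free sketch of how the next()/error-handler chaining works internally; illustrative only, as real apps rely on Express's own implementation:

```javascript
// Run middleware in order; next(err) diverts to the first 4-arg error handler.
function runChain(middlewares, req, res) {
  function next(i, err) {
    const mw = middlewares[i];
    if (!mw) return;
    if (err && mw.length === 4) return mw(err, req, res, (e) => next(i + 1, e));
    if (err) return next(i + 1, err);        // skip normal middleware on error
    if (mw.length === 4) return next(i + 1); // skip error handlers otherwise
    mw(req, res, (e) => next(i + 1, e));
  }
  next(0);
}

const log = [];
runChain(
  [
    (req, res, next) => { log.push(`logger: ${req.url}`); next(); },
    (req, res, next) => next(new Error('boom')),                // trigger error path
    (req, res, next) => log.push('skipped'),                    // never runs
    (err, req, res, next) => log.push(`error: ${err.message}`), // 4-arg handler
  ],
  { url: '/users' },
  {}
);

console.log(log); // → [ 'logger: /users', 'error: boom' ]
```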
Node.js supports two module systems: ES Modules (ESM, modern standard) and CommonJS (legacy). Enable ESM: add "type": "module" to package.json, use .mjs extension, or use import/export syntax. CommonJS uses require()/module.exports. Module resolution: node_modules lookup, file extensions (.js, .json, .node, .mjs, .cjs), index files, package.json exports/main fields. Module caching: first load executes and caches, subsequent loads return the cached module. Node.js 22+ can require() ESM modules (experimental at first, enabled by default in later releases; modules using top-level await are excluded). Use ESM for new projects.
cluster module spawns multiple worker processes sharing same server port, utilizing all CPU cores. Master process forks workers, OS distributes incoming connections. Usage: import cluster from 'node:cluster'; import os from 'node:os'; if (cluster.isPrimary) { const numCPUs = os.cpus().length; for (let i = 0; i < numCPUs; i++) cluster.fork(); cluster.on('exit', (worker) => { console.log('Worker died, spawning new'); cluster.fork(); }); } else { /* start HTTP server */ }. Production: use PM2 (handles clustering automatically). Use for: scaling I/O-bound HTTP servers, zero-downtime restarts.
process.exit(code) terminates Node.js process immediately. Exit codes: 0 (success), 1 (failure), 2+ (custom). Warning: doesn't complete pending async operations, skips cleanup. Better alternatives: set process.exitCode = 1 and let process exit naturally, throw error from async context, implement graceful shutdown. Use process.exit() only for: CLI tools with immediate termination, unrecoverable errors. Graceful shutdown: process.on('SIGTERM', async () => { await closeConnections(); process.exitCode = 0; }). Production: always handle SIGTERM/SIGINT for clean shutdown (close database connections, finish in-flight requests).
Scripts define reusable commands in package.json's "scripts" field. Run with: npm run <script-name>; the well-known names start and test can also be run as npm start and npm test.
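A typical scripts block might look like this (script names and commands are illustrative):

```json
{
  "scripts": {
    "start": "node server.js",
    "dev": "node --watch server.js",
    "test": "node --test",
    "lint": "eslint ."
  }
}
```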
npm (Node Package Manager) is Node.js's default package manager with 2M+ packages. Core commands: npm install <package> (add a dependency), npm install (install everything listed in package.json), npm uninstall <package>, npm update, npm audit (check for known vulnerabilities), and npm init (create a package.json).