Suggestion for the cluster module when using "maxConnections" in its worker process #54882

Closed as not planned
@EchoFUN

Description

What is the problem this feature will solve?

When the Node service is under very high load, multiple connections are processed at the same time in one worker (we currently use the cluster module in our project). We set "maxConnections" to limit the connections per worker. But we found that if a new request reaches the "maxConnections" limit, the request is retried on other workers. Could we have an option so that when a new request reaches the limit, we simply drop the request instead of retrying it on other workers? Since the system is under very high load, the other workers are likely also very busy at that moment. Here is an example on "v22.7.0".

const cluster = require('cluster');
const http = require('http');
const process = require('process');

if (cluster.isPrimary) {
  console.log(`Master ${process.pid} is running.\n`);
  // Fork a single worker for this demonstration.
  for (let i = 0; i < 1; i++) {
    cluster.fork();
  }
} else {
  const server = http.createServer((req, res) => {
    res.writeHead(200);
    res.end('hello world\n');
  });

  // With the limit set to 0, every incoming connection exceeds
  // maxConnections, so the primary retries it on another worker
  // instead of dropping it.
  server.maxConnections = 0;

  server.listen(8000, () => {
    console.log(`Worker ${process.pid} started`);
  });
}

What is the feature you are proposing to solve the problem?

For example, add an option "--maxconnections-drop-request" to Node's "Command-line options", applied at startup.

What alternatives have you considered?

No response

Metadata

    Labels

    cluster (Issues and PRs related to the cluster subsystem), feature request (Issues that request new features to be added to Node.js), stale

    Projects

    Status

    Awaiting Triage
