MaxConnectionPoolSize is not respected under heavy load #501

Closed

aaronjwood opened this issue Nov 20, 2019 · 6 comments · Fixed by #544

Comments

aaronjwood commented Nov 20, 2019

The default internal connection pool size for this driver appears to be 100, but under heavy load the driver opens far more connections than that. Here's a basic test to reproduce the behavior:

import Neo4J from 'neo4j-driver';

test('Connection pool', async () => {
    const driver = Neo4J.driver('bolt://localhost:7687', Neo4J.auth.basic('neo4j', 'neo4j'), {
        disableLosslessIntegers: true,
    });

    // Fire 1500 queries concurrently; with the default pool size of 100 the driver
    // should never hold more than 100 open connections at a time.
    const p: Promise<void>[] = [];
    for (let i = 0; i < 1500; i++) {
        p.push((async () => {
            const session = driver.session();
            try {
                await session.run('MATCH (n) RETURN n');
            } finally {
                await session.close();
            }
        })());
    }

    await Promise.all(p);
    await driver.close();
}, 30000);

Watch netstat while you run this test and see that the number of connections goes into the thousands:

watch -n .5 'netstat -an | grep "7687" | grep ESTABLISHED | wc -l'

I've tested this against Neo4j 3.5.3 and 3.5.12 using driver versions 1.7.5 and 4.0.0-beta01.
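
For reference, the limit in the title corresponds to the driver's maxConnectionPoolSize config key, whose documented default is 100. Setting it explicitly doesn't change the outcome above, but here is roughly where the cap lives (a sketch; the only assumption is the documented option name):

const driver = Neo4J.driver('bolt://localhost:7687', Neo4J.auth.basic('neo4j', 'neo4j'), {
    disableLosslessIntegers: true,
    maxConnectionPoolSize: 100, // documented default; lowering it makes the over-allocation even easier to spot
});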

@zhenlineo
Contributor

#500

@zhenlineo
Contributor

Hi @aaronjwood,
Thanks very much for reporting this to us. We are releasing a new 4.0.0-rc1 driver with the fix included. The fix ensures that the maximum active connection count is always less than or equal to MaxConnectionPoolSize.

@aaronjwood
Author

Thanks! Any chance this can be backported to 1.7.x? I don't want to upgrade to the beta driver release.

@zhenlineo
Contributor

Hi @aaronjwood,
The change has also been merged into 1.7, but I don't currently have a date for the next 1.7 patch release. I will let you know once I have more information.

@zhenlineo
Contributor

Hi @aaronjwood,

The 4.0.0 JS driver is now fully released. It supports all 3.3+ servers. The API has changed slightly; you can read more here.

Could you verify whether the problem is solved, and close this issue if it is no longer present? Or let us know if you think more needs to be done.
Cheers,
Zhen

@aaronjwood
Author

We hadn't planned on upgrading to 4.0.0 soon, since the API has changed a bit and swapping it in requires some work on our end. I'll let you know whether 4.0.0 fixes the issue once we upgrade; for now we've layered our own session pool on top of the library to work around it.
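
Roughly the shape of that workaround, as a simplified sketch rather than our exact code: a small limiter caps how many sessions are in flight at once, so the driver is never asked for more connections than the pool is supposed to hold.

class SessionLimiter<S extends { close(): any }> {
    private active = 0;
    private waiters: Array<() => void> = [];

    constructor(private openSession: () => S, private limit = 100) {}

    async withSession<T>(work: (session: S) => Promise<T>): Promise<T> {
        // Wait for a free slot; re-check the count after every wake-up.
        while (this.active >= this.limit) {
            await new Promise<void>(resolve => this.waiters.push(resolve));
        }
        this.active++;
        const session = this.openSession();
        try {
            return await work(session);
        } finally {
            await session.close();
            this.active--;
            this.waiters.shift()?.(); // wake the next waiter, if any
        }
    }
}

// Usage with the driver from the reproduction above:
// const limiter = new SessionLimiter(() => driver.session(), 100);
// await limiter.withSession(s => s.run('MATCH (n) RETURN n'));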

2hdddg linked a pull request Mar 19, 2020 that will close this issue