brianc / node-pg-pool

A connection pool for node-postgres

Performance/structure of app

brandonros opened this issue

I have a lib/db.js file that contains:

var pgp = require('pg-promise')();

var dbModule = module.exports;

dbModule.db = null;

dbModule.getDatabaseConnection = function() {
	if (!dbModule.db) {
		dbModule.db = pgp(...);
	}

	return dbModule.db;
};

Then I require this module in multiple other files that help serve REST API calls:

var dbModule = require('db.js');

return dbModule.getDatabaseConnection().one(sql.text, sql.values);

I don't ever free the connection/close it when I'm not using it, since it's supposed to act as the one main connection for the entire app to use.

Would I see performance benefits if I switched to pg-pool? Unless I am misunderstanding, with my current setup I only have one global connection, not a pool, and all queries happen sequentially, in a blocking fashion? By switching to a pool, multiple connections would get round-robined and there would be less blocking?

If I am wrong, please let me know. Thanks!

Note: I do use transactions.

I'm not familiar with pg-promise, so I can't answer questions related to it; however, you're correct about pg-pool using a pool of connections. You generally don't want to use a single database connection for your entire app, because the PostgreSQL server processes the queries on each open connection one at a time, sequentially. So in a multi-tenant app like a web application, using a pool allows n * m simultaneous queries, where n is the size of the pool and m is the number of processes your app is running within.
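For illustration, here's a minimal sketch of that, assuming the Pool bundled with the pg driver and a DATABASE_URL connection string (both are assumptions, not something from this thread):

const { Pool } = require('pg') // pg bundles pg-pool as pg.Pool

// max = 10: up to 10 queries can be in flight at once per process
const pool = new Pool({
  connectionString: process.env.DATABASE_URL, // assumed config
  max: 10
})

// These don't queue behind each other on a single connection;
// the pool hands each query an idle client.
const lookups = [1, 2, 3].map(id =>
  pool.query('SELECT * FROM users WHERE id = $1', [id])
)
Promise.all(lookups).then(results => {
  console.log(results.map(r => r.rows[0]))
})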

You can still use transactions with pg-pool, no problem; you just need to make sure to run all the queries for a given transaction on the same client.

pg-pool has a pool.query method that checks out an idle client, runs a query, and checks the client back in. You wouldn't want to run a transaction with that method, because you probably won't get the same client every time and bad things ™️ will happen as a result. pool.query is just a convenience function; it's common to do the following for a transaction:

const client = await pool.connect()
try {
  await client.query('BEGIN')
  // both inserts run on the same checked-out client, so they share the transaction
  const result1 = await client.query('INSERT INTO users (name) VALUES ($1) RETURNING id', [user])
  const savedUser = result1.rows[0]
  await client.query('INSERT INTO sessions (user_id) VALUES ($1)', [savedUser.id])
  await client.query('COMMIT')
} catch (e) {
  // undo the partial transaction and surface the error to the caller
  await client.query('ROLLBACK')
  throw e
} finally {
  // always return the client to the pool, even on failure
  client.release()
}

Hope this helps!

I appreciate the information + example. Here is what I came up with:

The goal is to have one function that all routes in the app call to get a client from the pool.

In theory, it should be as simple as:

var Pool = require('pg').Pool; // the pool implementation bundled with the pg driver

var dbModule = module.exports;

var pool;

dbModule.getDatabaseConnection = async function() {
	// lazily create a single shared pool on first use
	if (!pool) {
		pool = new Pool(global.db);
	}

	// check out a client; the caller is responsible for calling client.release()
	var client = await pool.connect();

	return client;
};

Would it be "bad" that every single incoming request now calls the async function pool.connect()?

Would it be "bad" that every single incoming request now calls the async function pool.connect()?

No; that’s how a pool is meant to be used. (The bad part in that example is the global state.)
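A minimal sketch of that suggestion: create the pool once at module load and export it, instead of initializing it lazily through shared mutable state (the connection config here is a placeholder, not taken from this thread):

// lib/db.js
var Pool = require('pg').Pool; // the pool implementation bundled with the pg driver

// created once, when the module is first required
var pool = new Pool({ connectionString: process.env.DATABASE_URL });

module.exports = {
	// one-off statements: checkout, query and release are handled by the pool
	query: function (text, values) {
		return pool.query(text, values);
	},
	// transactions: the caller gets a dedicated client and must call client.release()
	getDatabaseConnection: function () {
		return pool.connect();
	}
};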

@brandonros

since it's supposed to act as the one main connection for the entire app to use.

No, it is not. It uses the global pool to manage the connections.
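As a rough sketch of what that means in practice, queries issued against the same db object can run on different pooled connections concurrently rather than queueing on a single one (the SQL and ids here are only illustrative):

var results = await Promise.all([
	db.one('SELECT * FROM users WHERE id = $1', [1]),
	db.one('SELECT * FROM users WHERE id = $1', [2])
]);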

Would I see performance benefits if I switched to pg-pool? Unless I am misunderstanding, with my current setup I only have one global connection, not a pool, and all queries happen sequentially, in a blocking fashion? By switching to a pool, multiple connections would get round-robined and there would be less blocking? If I am wrong, please let me know. Thanks!

I'm afraid all these assumptions are wrong. There is plenty about this on the library's website and on StackOverflow.

But why are you asking this question here instead of StackOverflow or the library's website? You'd get far better answers there.

B.t.w., pg-promise 6.x uses the latest version of the driver and the pool library.

@vitaly-t

B.t.w., pg-promise 6.x uses the latest version of the driver and the pool library.

How does pg-promise expose the pool library? Is it abstracted away?

I do this once on app init:

var db = pgp(global.db);

then, for each incoming request that involves the database, I run a query against the db variable:

var results = await db.one(sql.text, sql.values);

Is this desired? I'm just trying to figure out why I always have ~10 queries sitting in "ClientRead" for 5s+ in my single-threaded Node.js app that uses pg-promise. I want to make sure I'm not blocking the event loop or saturating one connection when I could be using multiple.

If you say pg-promise does it automatically, that makes my adoration for the library even stronger. That's powerful stuff. However, it feels like magic so I want to be sure?

Might it make sense to include a "how to use pg-promise with REST apps correctly" example in the documentation? I could model around that, since that's what I'm trying to achieve...

How does pg-promise expose the pool library? Is it abstracted away?

Via property $pool.
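For example (a minimal sketch, assuming a pg-promise 6.x Database object, where $pool is the underlying pg-pool instance):

var pgp = require('pg-promise')();
var db = pgp(global.db);

// db.$pool is the underlying pg-pool instance; useful e.g. for a graceful shutdown
db.$pool.end();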

@brandonros
For anything else related to pg-promise, post your questions either on the project's website or on StackOverflow, and I will answer them there. This isn't the right place to discuss it.