brianc / node-pg-copy-streams

COPY FROM / COPY TO for node-postgres. Stream from one database to another, and stuff.

[Question] Example on using copy streams starting with just arrays created in memory

fluxquantum opened this issue

Hello, thank you for creating and supporting this library. May I get some tips on how to leverage this library to create bulk inserts using data from arrays directly to the Postgres db? So far the examples and use cases mostly discuss streaming between dbs or from files to a db. But I am looking to do something where I have an array (retrieved from an API response) and need to write that data as records in a Postgres table without looping and executing an insert statement one row at a time.

Appreciate your time.

Hello,
If you look at the example at https://github.com/brianc/node-pg-copy-streams#pipe-from-a-file-to-table-copyin---copy-from
you will see that ingestion from a file is done via fileStream.pipe(stream).

Here stream is the writable COPY stream connected to the database.

If you want to insert data directly without piping from another stream, you could call

stream.write("col1\tcol2\n") to insert one tuple into a table that has 2 text columns, assuming the COPY operation was started in the tab-separated text format.

Calling stream.end() will terminate the COPY operation.
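Putting those pieces together, here is a minimal sketch of ingesting an in-memory array. The table name my_table and the columns col1/col2 are made up for the example, and the values are assumed not to contain tabs, newlines, or backslashes, which would need escaping in the COPY text format:

```js
const { Client } = require('pg')
const { from: copyFrom } = require('pg-copy-streams')

async function insertRows(rows) {
  const client = new Client() // connection settings come from the environment
  await client.connect()
  try {
    // Start a COPY in the default tab-separated text format.
    const stream = client.query(copyFrom('COPY my_table (col1, col2) FROM STDIN'))
    // Write each array element as one tab-separated line.
    for (const [col1, col2] of rows) {
      stream.write(col1 + '\t' + col2 + '\n')
    }
    // end() terminates the COPY; wait for the server to confirm it.
    await new Promise((resolve, reject) => {
      stream.on('finish', resolve)
      stream.on('error', reject)
      stream.end()
    })
  } finally {
    await client.end()
  }
}

insertRows([['a', 'b'], ['c', 'd']])
```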

You can also look at the tests in https://github.com/brianc/node-pg-copy-streams/tree/master/test; some of them do manual ingestion.

But you need to take into account that data inserted via COPY is only committed when the COPY operation terminates, so it cannot be used directly for long-running ingestion. You need some kind of batching mechanism if you want to ingest data over a long time period.
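One possible batching sketch, reusing the hypothetical insertRows helper from the example above (BATCH_SIZE and the function names are illustrative, not part of the library):

```js
// Flush every BATCH_SIZE records as a separate COPY, so each batch commits independently.
const BATCH_SIZE = 1000
let batch = []

async function onRecord(record) {
  batch.push(record)
  if (batch.length >= BATCH_SIZE) {
    const rows = batch
    batch = []
    await insertRows(rows) // one COPY per batch; data is committed when each COPY ends
  }
}

async function flushRemaining() {
  if (batch.length > 0) {
    await insertRows(batch)
    batch = []
  }
}
```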

I hope this will help you.