IBM / nodejs-idb-connector

A JavaScript (Node.js) library for communicating with Db2 for IBM i, with support for queries, procedures, and much more. Uses traditional callback-style syntax.

Avoid JavaScript parsing of JSON data in Db2 columns

jasonclake opened this issue · comments

Is there a way to tell idb-connector when some columns already contain JSON data? In other words, to avoid JavaScript having to parse the JSON string after the fact.

For example: Let's say you have a table in your database with a list of document types where the actual document is stored as JSON in the INFO_JSON data column.

const { dbconn, dbstmt, SQL_ATTR_AUTOCOMMIT, SQL_FALSE } = require("idb-connector");
function runsqlp(sql, conn) {
    return new Promise((resolve, reject) => {
        const statement = new dbstmt(conn);
        console.log(`SQL: ${sql}`);
        statement.exec(sql, (result, error) => {
            statement.close();
            if (error) {
                console.log(`Error: ${error}`);
                reject(error);
                return;
            }
            console.log(`Result: ${JSON.stringify(result, null, "  ")}`);
            console.log("All done.");
            resolve(result);
        });
    });
}

async function test() {
    const connection = new dbconn();
    connection.debug(true);
    connection.conn("*LOCAL");
    console.log("---------------------------------------------");
    const result = await runsqlp(`
    WITH EXAMPLE(ID, NAME, PHONE) AS(
        SELECT * FROM TABLE(VALUES(1,'EMILY','555-1111'),(2,'JASON','555-2222'))
        )
    ,EXAMPLE_JSON AS(
        SELECT 'ADDRESS_BOOK' AS TYPE, json_arrayagg(
            json_object('ID' value ID, 'NAME' value NAME, 'PHONE' value PHONE)
            )as INFO_JSON 
        FROM EXAMPLE	
    )
    SELECT TYPE, INFO_JSON FROM EXAMPLE_JSON
    `,connection);
    
    // Transform manually, or is there a way to tell idb-connector
    // to treat INFO_JSON as a JSON object?
    result[0].INFO_JSON = JSON.parse(result[0].INFO_JSON);
    
    console.log(`Result desired: ${JSON.stringify(result, null, "  ")}`);
    connection.disconn();
    connection.close();
}
test();

RECEIVED OUTPUT

[
    {
        "TYPE": "ADDRESS_BOOK",
        "INFO_JSON": "[{\"ID\":1,\"NAME\":\"EMILY\",\"PHONE\":\"555-1111\"},{\"ID\":2,\"NAME\":\"JASON\",\"PHONE\":\"555-2222\"}]"
    }
]

DESIRED OUTPUT

[
    {
        "TYPE": "ADDRESS_BOOK",
        "INFO_JSON": [
            {
                "ID": 1,
                "NAME": "EMILY",
                "PHONE": "555-1111"
            },
            {
                "ID": 2,
                "NAME": "JASON",
                "PHONE": "555-2222"
            }
        ]
    }
]

@jasonclake, this problem seems to be mentioned here -> https://www.ibm.com/support/knowledgecenter/en/ssw_ibm_i_72/sqlp/rbafyjsonpublishing.htm

I do not know whether the FORMAT JSON clause described in that article resolves the problem.


@jasonclake Since Db2 has no "JSON" type (the JSON publishing functions merely return a CLOB), there is no way within the existing driver framework to do this.

One possible solution would be to add a flag to idb-connector that tells it to run JSON.parse on the output. Alternatively, a more general solution would be to let users attach a converter function to a column, which idb-connector would call before returning the results. Something like:

        let statement = new dbstmt(conn);
        // hypothetical API: run JSON.parse on column index 2 before returning results
        statement.output_converter(2, JSON.parse);
        statement.exec(sql, (result, error) => { ... });

pyodbc has something similar (`add_output_converter`), though it works on a type-by-type basis (which wouldn't help here, since again there's no JSON type) rather than column-by-column.

Alternatively, you could continue to do the mapping yourself and make it as generic as needed.
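A generic userland version of that mapping might look like the sketch below. Note that `applyConverters` is a hypothetical helper name, not part of the idb-connector API; it simply post-processes the rows that `stmt.exec` hands back, applying a converter (such as JSON.parse) to each named column.

```javascript
// Hypothetical helper: apply per-column converter functions to the rows
// returned by stmt.exec(). `converters` maps a column name to a function,
// e.g. { INFO_JSON: JSON.parse } for columns known to hold JSON text.
function applyConverters(rows, converters) {
    return rows.map((row) => {
        const out = { ...row };
        for (const [column, convert] of Object.entries(converters)) {
            if (column in out && out[column] !== null) {
                out[column] = convert(out[column]);
            }
        }
        return out;
    });
}

// Example with the shape of the result set from this issue:
const rows = [
    {
        TYPE: "ADDRESS_BOOK",
        INFO_JSON: '[{"ID":1,"NAME":"EMILY","PHONE":"555-1111"}]',
    },
];
const parsed = applyConverters(rows, { INFO_JSON: JSON.parse });
console.log(parsed[0].INFO_JSON[0].NAME); // EMILY
```

You could call this from inside `runsqlp` (passing the converter map as an extra argument) so every query in the application gets the same treatment.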