A JavaScript client for Node implementing the SAP HANA Database SQL Command Network Protocol.
- Install
- Getting started
- Establish a database connection
- Direct Statement Execution
- Prepared Statement Execution
- Bulk Insert
- Streaming results
- Transaction handling
- Streaming Large Objects
- Running tests
- Running examples
- Todo
Install from npm:
npm install hdb
or clone from the GitHub repository to run tests and examples locally:
git clone https://github.com/SAP/node-hdb.git
cd node-hdb
npm install
If you do not have access to a SAP HANA server, go to the SAP HANA Developer Center and choose one of the options to get your own trial SAP HANA Server.
This is a very simple example of how to use this module:
var hdb = require('hdb');
var client = hdb.createClient({
  host     : 'hostname',
  port     : 30015,
  user     : 'user',
  password : 'secret'
});
client.on('error', function (err) {
  console.error('Network connection error', err);
});
client.connect(function (err) {
  if (err) {
    return console.error('Connect error', err);
  }
  client.exec('select * from DUMMY', function (err, rows) {
    client.end();
    if (err) {
      return console.error('Execute error:', err);
    }
    console.log('Results:', rows);
  });
});
The first step to establish a database connection is to create a client object. It is recommended to pass all required connect options like host, port, user and password to the createClient function. They will be used as defaults for subsequent connect calls on the created client instance. In case of network connection errors like a connection timeout or a database restart, you should always register an error event handler in order to be able to handle these kinds of problems.
var hdb = require('hdb');
var client = hdb.createClient({
  host     : 'hostname',
  port     : 30015,
  user     : 'user',
  password : 'secret'
});
client.on('error', function (err) {
  console.error('Network connection error', err);
});
console.log(client.readyState); // new
When a client instance is created it does not immediately open a network connection to the database host. Initially the client is in state new. When you call connect for the first time, two things are done internally:
1. A network connection is established and the communication is initialized (protocol and product version exchange). Now the connection is ready for exchanging messages but no user session is established. The client is in state disconnected. This step is skipped if the client is already in state disconnected.
2. The authentication process is initiated. After a successful user authentication a database session is established and the client is in state connected. If authentication fails the client remains in state disconnected.
client.connect(function (err) {
  if (err) {
    return console.error('Error:', err);
  }
  console.log(client.readyState); // connected
});
If user and password are specified they will override the defaults of the client. It is possible to disconnect and reconnect with a different user on the same client instance and the same network connection.
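Switching the user might look like the following sketch; it assumes an already connected client, and the credentials are placeholders:
client.disconnect(function (err) {
  if (err) {
    return console.error('Disconnect error:', err);
  }
  // the user session is closed, the network connection is kept open
  client.connect({
    user     : 'otherUser',   // placeholder
    password : 'otherSecret'  // placeholder
  }, function (err) {
    if (err) {
      return console.error('Connect error:', err);
    }
    console.log('Reconnected as:', client.get('user'));
  });
});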
Details about the different authentication methods can be found in the SAP HANA Security Guide.
Users authenticate themselves with their database user and password.
SAML bearer assertions as well as unsolicited SAML responses that include an unencrypted SAML assertion can be used to authenticate users. SAML assertions and responses must be signed using XML signatures. XML Digital signatures can be created with xml-crypto or xml-dsig.
Instead of user and password you have to provide a SAML assertion.
client.connect({
  assertion: '<Assertion xmlns="urn:oasis:names:tc:SAML:2.0:assertion" ...>...</Assertion>'
}, function (err) {
  if (err) {
    return console.error('Error:', err);
  }
  console.log('User:', client.get('user'));
  console.log('SessionCookie:', client.get('SessionCookie'));
});
After a successful SAML authentication the server returns the database user and a SessionCookie which can be used for reconnect.
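A later reconnect might then pass the cookie back instead of a password. This is only a sketch; the sessionCookie connect option name is an assumption, so check the module's authentication sources for the exact spelling:
client.connect({
  user          : client.get('user'),
  sessionCookie : client.get('SessionCookie') // option name assumed
}, function (err) {
  if (err) {
    return console.error('Reconnect error:', err);
  }
  console.log('Reconnected using the SessionCookie');
});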
A Kerberos authentication provider can be used to authenticate users.
This mechanism is not implemented and I do not plan to implement a Kerberos authentication provider myself. Contributions via pull request are welcome.
To establish an encrypted database connection just pass either key, cert and ca, or a pfx, to createClient.
var client = hdb.createClient({
  host : 'hostname',
  port : 30015,
  key  : fs.readFileSync('client-key.pem'),
  cert : fs.readFileSync('client-cert.pem'),
  ca   : [fs.readFileSync('trusted-cert.pem')],
  ...
});
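Alternatively, a single pfx file should work in place of the separate key, cert and ca options; a minimal sketch (the file name is a placeholder):
var fs = require('fs');
var client = hdb.createClient({
  host     : 'hostname',
  port     : 30015,
  user     : 'user',
  password : 'secret',
  pfx      : fs.readFileSync('client.pfx') // placeholder file name
});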
Direct statement execution is the simplest way to execute SQL statements. The only input parameter is the SQL command to be executed. Generally we return the statement execution results using callbacks. The type of returned result depends on the kind of statement.
In the case of a DDL Statement nothing is returned.
client.exec('create table TEST.NUMBERS (a int, b varchar(16))', function (err) {
  if (err) {
    return console.error('Error:', err);
  }
  console.log('Table TEST.NUMBERS has been created');
});
In the case of a DML Statement the number of affectedRows is returned.
client.exec('insert into TEST.NUMBERS values (1, \'one\')', function (err, affectedRows) {
  if (err) {
    return console.error('Error:', err);
  }
  console.log('Number of affected rows:', affectedRows);
});
The exec function is a convenient way to completely retrieve the result of a query. In this case all selected rows are fetched and returned in the callback. The resultSet is automatically closed and all Lobs are completely read and returned as Buffer objects. If streaming of the results is required you will have to use the execute function. This is described in the section Streaming results.
client.exec('select A, B from TEST.NUMBERS order by A', function (err, rows) {
  if (err) {
    return console.error('Error:', err);
  }
  console.log('Rows:', rows);
});
The default representation of a single row is an Object where the property names are the columnDisplayNames of the resultSetMetadata.
var command = 'select top 1 * from t1';
client.exec(command, function (err, rows) {
  /* rows will be an array like this:
  [{
    ID: 1,
    A: 't1.1.a',
    B: 't1.1.b'
  }]
  */
});
If your SQL statement is a join with overlapping column names, you may want to get separate objects for each table per row. This is possible if you set the option nestTables to true.
var command = 'select top 1 * from t1 join t2 on t1.id = t2.id';
var options = {
  nestTables: true
};
client.exec(command, options, function (err, rows) {
  /* rows will be an array like this now:
  [{
    T1: {
      ID: 1,
      A: 't1.1.a',
      B: 't1.1.b'
    },
    T2: {
      ID: 1,
      A: 't2.1.a',
      B: 't2.1.b'
    }
  }]
  */
});
It is also possible to return all rows as an Array where the order of the column values is exactly the same as in the resultSetMetadata. In this case you have to set the option rowsAsArray to true.
var command = 'select top 1 * from t1 join t2 on t1.id = t2.id';
var options = {
  rowsAsArray: true
};
client.exec(command, options, function (err, rows) {
  /* rows will be an array like this now:
  [[
    1,
    't1.1.a',
    't1.1.b',
    1,
    't2.1.a',
    't2.1.b'
  ]]
  */
});
To prepare a statement, pass the SQL text with ? placeholders to the prepare function; the client returns a statement object which can be executed multiple times.
client.prepare('select * from DUMMY where X = ?', function (err, statement) {
  if (err) {
    return console.error('Error:', err);
  }
  // do something with the statement
  console.log('StatementId', statement.id);
});
The execution of a prepared statement is similar to the direct statement execution on the client. The difference is that the first parameter of the exec function is an array with positional parameters. In the case of named parameters it can also be a parameters object.
statement.exec([1], function (err, rows) {
  if (err) {
    return console.error('Error:', err);
  }
  console.log('Rows:', rows);
});
If you use the execute function instead of the exec function, the resultSet is returned in the callback, as in the direct query execution above.
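For example, executing the prepared statement above with execute might look like this sketch; it assumes the resultSet provides fetch and close, as used in the examples listed below:
statement.execute([1], function (err, rs) {
  if (err) {
    return console.error('Error:', err);
  }
  rs.fetch(function (err, rows) {
    if (err) {
      return console.error('Fetch error:', err);
    }
    console.log('Rows:', rows);
    if (!rs.closed) {
      rs.close();
    }
  });
});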
If you have for example the following stored procedure:
create procedure PROC_DUMMY (in a int, in b int, out c int, out d DUMMY, out e TABLES)
  language sqlscript
  reads sql data as
  begin
    c := :a + :b;
    d = select * from DUMMY;
    e = select * from TABLES;
  end
you can call it via a prepared statement. The second argument is always an object with the scalar parameters. If there are no scalar parameters, an empty object {} will be returned. The following arguments are the resultSets.
client.prepare('call PROC_DUMMY (?, ?, ?, ?, ?)', function (err, statement) {
  if (err) {
    return console.error('Prepare error:', err);
  }
  statement.exec({
    A: 3,
    B: 4
  }, function (err, parameters, dummyRows, tableRows) {
    if (err) {
      return console.error('Exec error:', err);
    }
    console.log('Parameters:', parameters);
    console.log('Dummies:', dummyRows);
    console.log('Tables:', tableRows);
  });
});
To drop the statement simply call
statement.drop(function (err) {
  if (err) {
    return console.error('Drop error:', err);
  }
  console.log('Statement dropped');
});
The callback is optional in this case.
If you want to use datetime types in a prepared statement, be aware that strings like '14.04.2016 12:41:11.215' are not processed by the SAP HANA Database but by the node-hdb module. Therefore you must use the exact required format that would be returned by a selection made with this module. The formats are:
- TIME: '13:32:20'
- DATE: '2016-04-14'
- TIMESTAMP: '2016-04-14T13:32:20.737'
- SECONDDATE: '2016-04-14T13:32:20'
Another possibility is to use the functions TO_DATE, TO_DATS, TO_TIME and TO_TIMESTAMP in your SQL statement to convert your string to a valid datetime type.
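As an illustration, an insert of a TIMESTAMP value might look like the following sketch; the table TEST.TIMES and its columns are made up for this example:
// assumes a hypothetical table: create table TEST.TIMES (ID int, TS timestamp)
client.prepare('insert into TEST.TIMES values (?, ?)', function (err, statement) {
  if (err) {
    return console.error('Prepare error:', err);
  }
  // the TIMESTAMP string must use the exact format shown above
  statement.exec([1, '2016-04-14T13:32:20.737'], function (err, affectedRows) {
    if (err) {
      return console.error('Exec error:', err);
    }
    console.log('Number of affected rows:', affectedRows);
  });
});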
If you want to insert multiple rows with a single execute you just have to provide all parameters as an array of arrays.
client.prepare('insert into TEST.NUMBERS values (?, ?)', function (err, statement) {
  if (err) {
    return console.error('Prepare error:', err);
  }
  statement.exec([[1, 'one'], ['2', 'two'], [3, 'three']], function (err, affectedRows) {
    if (err) {
      return console.error('Exec error:', err);
    }
    console.log('Array of affected rows:', affectedRows);
  });
});
Take a look at the example app9 for further details.
If you use the execute function of the client or statement instead of the exec function, a resultSet object is returned in the callback instead of an array of all rows. The resultSet object allows you to create an object based row stream or an array based stream of rows which can be piped to a writer object. Don't forget to close the resultSet if you use the execute function.
client.execute('select A, B from TEST.NUMBERS order by A', function (err, rs) {
  if (err) {
    return console.error('Error:', err);
  }
  rs.setFetchSize(2048);
  rs.createObjectStream()
    .pipe(new MyWriteStream())
    .on('finish', function () {
      if (!rs.closed) {
        rs.close();
      }
    });
});
Take a look at the example app4 for further details.
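An array based stream can be consumed in a similar way; the following sketch assumes that createArrayStream exists alongside createObjectStream and emits chunks of rows:
client.execute('select A, B from TEST.NUMBERS order by A', function (err, rs) {
  if (err) {
    return console.error('Error:', err);
  }
  rs.createArrayStream()
    .on('data', function (rows) {
      console.log('Chunk of rows:', rows); // emitted chunk format is assumed
    })
    .on('end', function () {
      if (!rs.closed) {
        rs.close();
      }
    });
});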
The default behavior is that each statement is automatically committed. If you want to manually control commit and rollback of a transaction, you can do this by calling setAutoCommit(false) on the client object.
// requires the async module: npm install async
var async = require('async');

function execTransaction(cb) {
  client.setAutoCommit(false);
  async.series([
    client.exec.bind(client, "insert into NUMBERS values (1, 'one')"),
    client.exec.bind(client, "insert into NUMBERS values (2, 'two')")
  ], function (err) {
    if (err) {
      client.rollback(function (rollbackError) {
        if (rollbackError) {
          rollbackError.code = 'EROLLBACK';
          return cb(rollbackError);
        }
        cb(null, false);
      });
    } else {
      client.commit(function (commitError) {
        if (commitError) {
          commitError.code = 'ECOMMIT';
          return cb(commitError);
        }
        cb(null, true);
      });
    }
    client.setAutoCommit(true);
  });
}
execTransaction(function (err, ok) {
  if (err) {
    return console.error('Commit or Rollback error', err);
  }
  if (ok) {
    console.log('Committed');
  } else {
    console.log('Rolled back');
  }
});
Take a look at the example tx1 for further details.
Reading large objects as a stream can be done if you use the execute method of the client or statement. In this case a Lob object is returned for every LOB column. You can call createReadStream or read in order to create a readable stream or to read the LOB completely.
Writing large objects is done automatically. You just have to pass an instance of Readable or a Buffer object as the parameter.
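The following sketch illustrates both directions; the table TEST.IMAGES with a NAME column and a BLOB column IMG is made up for this example:
var fs = require('fs');

// reading: stream a BLOB column into a file
client.execute('select NAME, IMG from TEST.IMAGES', function (err, rs) {
  if (err) {
    return console.error('Error:', err);
  }
  rs.createObjectStream()
    .on('data', function (row) {
      // row.IMG is a Lob object for the BLOB column
      row.IMG.createReadStream().pipe(fs.createWriteStream(row.NAME + '.jpg'));
    })
    .on('end', function () {
      if (!rs.closed) {
        rs.close();
      }
    });
});

// writing: pass a readable stream (or a Buffer) as the parameter value
client.prepare('insert into TEST.IMAGES values (?, ?)', function (err, statement) {
  if (err) {
    return console.error('Prepare error:', err);
  }
  statement.exec(['logo', fs.createReadStream('logo.jpg')], function (err, affectedRows) {
    if (err) {
      return console.error('Exec error:', err);
    }
    console.log('Number of affected rows:', affectedRows);
  });
});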
Take a look at the example app7 for further details.
To run the unit tests for hdb simply run:
make test-unit
To run the unit tests as well as acceptance tests for hdb you have to run:
make test
For the acceptance tests a database connection has to be established. Therefore you need to copy the configuration template config.tpl.json in the test/db folder to config.json and change the connection data to yours. If the config.json file does not exist, a local mock server is started.
Also, for the examples you need a valid config.json in the test/db folder.
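The content of config.json mirrors the connect options used above; a sketch with placeholder values (the exact set of fields should follow config.tpl.json):
{
  "host": "hostname",
  "port": 30015,
  "user": "user",
  "password": "secret"
}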
- app1: Simple query.
- app2: Fetch rows from ResultSet.
- app3: Streaming rows with createObjectStream().
- app4: Pipe rows into a JSON transform stream and to stdout.
- app5: Stream from the filesystem into a db table.
- app6: Stream from a db table into the filesystem.
- app7: Insert a row with a large image into a db table (uses WriteLobRequest and Transaction internally).
- app8: Automatic reconnect when the network connection is lost.
- app9: Insert multiple rows with large images into a db table as one batch.
- app10: Usage example of the query option nestTables.
- call1: Call a stored procedure.
- call2: Call a stored procedure with lob input and output parameters.
- call3: Call a stored procedure with a table as input parameter.
- tx1: Transaction handling (shows how to use commit and rollback).
- csv: Stream a db table into a csv file.
- server: Stream rows into an http response: http://localhost:1337/{schema}/{tablename}?top={top}
To run e.g. the first example:
node examples/app1
- Improve documentation of the client api
- Improve error handling
- ...