Commit: Version 2.0.26

kipliklotrika committed Nov 27, 2018
2 parents ee584cd + 18965dd commit 009f10c
Showing 11 changed files with 214 additions and 56 deletions.
2 changes: 1 addition & 1 deletion modules/service/profile-service.js
@@ -80,7 +80,7 @@ class ProfileService {
const reputation = new BN(profile.reputation, 10);
const withdrawalTimestamp = new BN(profile.withdrawalTimestamp, 10);
const withdrawalAmount = new BN(profile.withdrawalAmount, 10);
const nodeId = new BN(profile.nodeId, 10);
const nodeId = new BN(Utilities.denormalizeHex(profile.nodeId), 16);
return !(stake.eq(zero) && stakeReserved.eq(zero) &&
reputation.eq(zero) && withdrawalTimestamp.eq(zero) &&
withdrawalAmount.eq(zero) && nodeId.eq(zero));
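The profile-service.js change above switches the nodeId parsing base from 10 to 16 after stripping the 0x prefix. A minimal sketch of the intent, assuming bn.js is available and using a local stand-in for Utilities.denormalizeHex (the sample nodeId value is hypothetical):

const BN = require('bn.js');

// Stand-in for Utilities.denormalizeHex(): drop an optional '0x' prefix.
const denormalizeHex = hex => (hex.startsWith('0x') ? hex.slice(2) : hex);

// Hypothetical node ID as stored on the profile: a 0x-prefixed hex string.
const nodeId = '0x00ff';

// Old behaviour: `new BN(nodeId, 10)` treated the hex string as base-10 digits,
// so the zero-comparison in hasProfile was done against a garbled value.
// New behaviour: strip the prefix and parse the remaining digits as base 16.
const parsed = new BN(denormalizeHex(nodeId), 16);
console.log(parsed.toString(10)); // 255
console.log(parsed.isZero());     // false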
2 changes: 1 addition & 1 deletion package-lock.json

Some generated files are not rendered by default.

4 changes: 2 additions & 2 deletions package.json
@@ -1,6 +1,6 @@
{
"name": "origintrail_node",
"version": "2.0.25",
"version": "2.0.26",
"description": "OriginTrail node",
"main": ".eslintrc.js",
"config": {
@@ -22,7 +22,7 @@
"start": "node node_version_check.js && node ot-node.js",
"debug:start": "node --nolazy --inspect-brk ot-node.js",
"bootstrap": "npm run setup:hard",
"lint": "./node_modules/.bin/eslint --quiet migrations/ models/ modules/ seeders/ test/ testnet/ ot-node.js .eslintrc.js isStartHealthy.js node_version_check.js add_db_backup.js check-updates.js",
"lint": "./node_modules/.bin/eslint --quiet migrations/ models/ modules/ seeders/ test/ testnet/ ot-node.js .eslintrc.js node_version_check.js add_db_backup.js check-updates.js",
"arango": "/usr/local/opt/arangodb/sbin/arangod &",
"db_backup": "node add_db_backup.js",
"generate-identity": "node ./scripts/identity-cli.js",
35 changes: 35 additions & 0 deletions test/bdd/features/datalayer.feature
@@ -0,0 +1,35 @@
Feature: ERC725 Profile features
Background: Setup local blockchain and bootstraps
Given the blockchain is set up
And 1 bootstrap is running

Scenario: Check that second gs1 import does not mess up first import's hash value
Given I setup 4 nodes
And I start the nodes
And I use 1st node as DC
And DC imports "importers/xml_examples/Basic/01_Green_to_pink_shipment.xml" as GS1
Given DC initiates the replication
And I wait for 10 seconds
And I remember previous import's fingerprint value
And DC imports "importers/xml_examples/Basic/02_Green_to_pink_shipment.xml" as GS1
And DC initiates the replication
And I wait for 10 seconds
Then checking again first import's root hash should point to remembered value

Scenario: Smoke check data-layer basic endpoints
Given I setup 2 nodes
And I start the nodes
And I use 1st node as DC
And DC imports "importers/xml_examples/Basic/01_Green_to_pink_shipment.xml" as GS1
Given I query DC node locally with path: "identifiers.id", value: "urn:epc:id:sgtin:Batch_1" and opcode: "EQ"
Then response should contain only last imported data set id
Given I query DC node locally for last imported data set id
Then response hash should match last imported data set id

Scenario: Basic dataset integrity with its xml
Given I setup 1 node
And I start the node
And I use 1st node as DC
And DC imports "importers/xml_examples/Basic/01_Green_to_pink_shipment.xml" as GS1
Then imported data is compliant with 01_Green_to_pink_shipment.xml file

8 changes: 8 additions & 0 deletions test/bdd/features/erc725profile.feature
@@ -20,3 +20,11 @@ Feature: ERC725 Profile features
And the 1st node's spend all the Ethers
And I start the node
Then the 1st node should start normally

Scenario: Provide own ERC725 identity and expect node to create profile
Given I setup 1 node
When I manually create ERC725 identity for 1st node
And I use the created ERC725 identity in 1st node
And I start the node
Then the 1st node should have a valid ERC725 identity
And the 1st node should have a valid profile
24 changes: 1 addition & 23 deletions test/bdd/features/network.feature
@@ -17,31 +17,9 @@ Feature: Test basic network features
Then the last import's hash should be the same as one manually calculated
Given DC initiates the replication
And I wait for replications to finish
Then the last root hash should be the same as one manually calculated
Then the last import should be the same on all nodes that replicated data

Scenario: Check that second gs1 import does not mess up first import's hash value
Given I setup 4 nodes
And I start the nodes
And I use 1st node as DC
And DC imports "importers/xml_examples/Basic/01_Green_to_pink_shipment.xml" as GS1
Given DC initiates the replication
And I wait for 10 seconds
And I remember previous import's fingerprint value
And DC imports "importers/xml_examples/Basic/02_Green_to_pink_shipment.xml" as GS1
And DC initiates the replication
And I wait for 10 seconds
Then checking again first import's root hash should point to remembered value

Scenario: Smoke check data-layer basic endpoints
Given I setup 2 nodes
And I start the nodes
And I use 1st node as DC
And DC imports "importers/xml_examples/Basic/01_Green_to_pink_shipment.xml" as GS1
Given I query DC node locally with path: "identifiers.id", value: "urn:epc:id:sgtin:Batch_1" and opcode: "EQ"
Then response should contain only last imported data set id
Given I query DC node locally for last imported data set id
Then response hash should match last imported data set id

Scenario: DC->DH->DV replication + DV network read + DV purchase
Given the replication difficulty is 0
And I setup 5 nodes
29 changes: 29 additions & 0 deletions test/bdd/steps/erc725identity.js
@@ -12,6 +12,35 @@ const BN = require('bn.js');
const utilities = require('./lib/utilities');
const erc725ProfileAbi = require('../../../modules/Blockchain/Ethereum/abi/erc725');

Given(/^I manually create ERC725 identity for (\d+)[st|nd|rd|th]+ node$/, async function (nodeIndex) {
expect(this.state.localBlockchain, 'No blockchain.').to.not.be.undefined;
expect(nodeIndex, 'Invalid index.').to.be.within(0, this.state.nodes.length);

const node = this.state.nodes[nodeIndex - 1];
const nodeWallet = node.options.nodeConfiguration.node_wallet;
const nodeWalletKey = node.options.nodeConfiguration.node_private_key;

const identityContractInstance =
await this.state.localBlockchain.createIdentity(nodeWallet, nodeWalletKey);
expect(identityContractInstance._address).to.not.be.undefined;
this.state.manualStuff.erc725Identity = identityContractInstance._address;
});

When(/^I use the created ERC725 identity in (\d+)[st|nd|rd|th]+ node$/, async function (nodeIndex) {
expect(this.state.localBlockchain, 'No blockchain.').to.not.be.undefined;
expect(this.state.manualStuff.erc725Identity, 'No ERC725 identity.').to.not.be.undefined;
expect(this.state.nodes.length, 'No started nodes.').to.be.greaterThan(0);
expect(this.state.bootstraps.length, 'No bootstrap nodes.').to.be.greaterThan(0);
expect(nodeIndex, 'Invalid index.').to.be.within(0, this.state.nodes.length);

const node = this.state.nodes[nodeIndex - 1];

fs.writeFileSync(
path.join(node.options.configDir, 'erc725_identity.json'),
JSON.stringify({ identity: this.state.manualStuff.erc725Identity }),
);
});

Then(/^the (\d+)[st|nd|rd|th]+ node should have a valid ERC725 identity/, async function (nodeIndex) {
expect(this.state.nodes.length, 'No started nodes.').to.be.greaterThan(0);
expect(this.state.bootstraps.length, 'No bootstrap nodes.').to.be.greaterThan(0);
1 change: 1 addition & 0 deletions test/bdd/steps/hooks.js
@@ -29,6 +29,7 @@ Before(function (testCase, done) {
this.state.localBlockchain = null;
this.state.nodes = [];
this.state.bootstraps = [];
this.state.manualStuff = {};
done();
});

12 changes: 12 additions & 0 deletions test/bdd/steps/lib/local-blockchain.js
@@ -186,6 +186,10 @@ class LocalBlockchain {
this.holdingContractData = `0x${compileResult.contracts['Holding.sol:Holding'].bytecode}`;
this.holdingContractAbi = JSON.parse(compileResult.contracts['Holding.sol:Holding'].interface);
this.holdingContract = new this.web3.eth.Contract(this.holdingContractAbi);

this.identityContractData = `0x${compileResult.contracts['Identity.sol:Identity'].bytecode}`;
this.identityContractAbi = JSON.parse(compileResult.contracts['Identity.sol:Identity'].interface);
this.identityContract = new this.web3.eth.Contract(this.identityContractAbi);
}

async deployContracts() {
@@ -357,6 +361,14 @@
async getBalanceInEthers(wallet) {
return this.web3.eth.getBalance(wallet);
}

async createIdentity(wallet, walletKey) {
const [, identityInstance] = await this.deployContract(
this.web3, this.identityContract, this.identityContractData,
[wallet], wallet,
);
return identityInstance;
}
}

module.exports = LocalBlockchain;
30 changes: 30 additions & 0 deletions test/bdd/steps/lib/utilities.js
@@ -1,5 +1,8 @@
/* eslint-disable max-len */
const sortedStringify = require('sorted-json-stringify');
const { sha3_256 } = require('js-sha3');
const _ = require('lodash');


function calculateImportHash(data) {
return `0x${sha3_256(sortedStringify(data, null, 0))}`;
@@ -31,8 +34,35 @@ function denormalizeHex(number) {
return number;
}

function findVertexIdValue(verticesArray, vertex_type, sender_id, id_type, id_value) {
const response = [];
verticesArray.forEach((element) => {
if (Object.keys(element).toString() === '_key,id_type,id_value,sender_id,vertex_type') {
if (element.vertex_type === vertex_type && element.sender_id === sender_id && element.id_type === id_type && element.id_value === id_value) {
response.push(element);
}
}
});
return response;
}

function findVertexUid(verticesArray, vertex_type, sender_id, uid, data) {
const response = [];
verticesArray.forEach((element) => {
if (Object.keys(element).toString() === '_key,data,sender_id,uid,vertex_type') {
if (element.vertex_type === vertex_type && element.sender_id === sender_id && element.uid === uid && _.isEqual(element.data, data)) {
response.push(element);
}
}
});
return response;
}


module.exports = {
calculateImportHash,
normalizeHex,
denormalizeHex,
findVertexIdValue,
findVertexUid,
};
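As a point of reference for the network.js assertions that follow, a hedged usage sketch of the new findVertexUid helper. The sample vertex is hypothetical but mirrors the Building_Green_V2 location asserted in the compliance step, and the relative require path assumes the snippet sits next to network.js in test/bdd/steps:

const { findVertexUid } = require('./lib/utilities');

// Hypothetical vertex shaped like those returned by /api/import_info;
// key order matters, since the helper compares the joined key list.
const vertices = [{
    _key: 'some-key',
    data: { parent_id: 'urn:epc:id:sgln:Building_Green' },
    sender_id: 'urn:ot:object:actor:id:Company_Green',
    uid: 'urn:epc:id:sgln:Building_Green_V2',
    vertex_type: 'LOCATION',
}];

// A vertex matches only when its keys are exactly {_key, data, sender_id, uid, vertex_type}
// and vertex_type, sender_id, uid and (deep-equal) data all agree.
const hits = findVertexUid(
    vertices, 'LOCATION', 'urn:ot:object:actor:id:Company_Green',
    'urn:epc:id:sgln:Building_Green_V2', { parent_id: 'urn:epc:id:sgln:Building_Green' },
);
console.log(hits.length); // 1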
123 changes: 94 additions & 29 deletions test/bdd/steps/network.js
@@ -12,6 +12,7 @@ const deepExtend = require('deep-extend');

const OtNode = require('./lib/otnode');
const Utilities = require('../../../modules/Utilities');
const ImportUtilities = require('../../../modules/ImportUtilities');
const LocalBlockchain = require('./lib/local-blockchain');
const httpApiHelper = require('./lib/http-api-helper');
const utilities = require('./lib/utilities');
@@ -231,47 +232,111 @@ Given(/^DC imports "([^"]*)" as ([GS1|WOT]+)$/, async function (importFilePath,
this.state.lastImport = importResponse;
});

Then(/^the last import's hash should be the same as one manually calculated$/, function () {
Then(/^the last import's hash should be the same as one manually calculated$/, async function () {
expect(!!this.state.dc, 'DC node not defined. Use other step to define it.').to.be.equal(true);
expect(this.state.nodes.length, 'No started nodes').to.be.greaterThan(0);
expect(this.state.bootstraps.length, 'No bootstrap nodes').to.be.greaterThan(0);
expect(!!this.state.lastImport, 'Last import didn\'t happen. Use other step to do it.').to.be.equal(true);

const { dc } = this.state;
return new Promise((accept, reject) => {
request(
`${dc.state.node_rpc_url}/api/import_info?data_set_id=${this.state.lastImport.data_set_id}`,
{ json: true },
(err, res, body) => {
if (err) {
reject(err);
return;
}

// TODO: Avoid asserting in promise. Manually check.
// expect(body).to.have.keys([
// 'import_hash', 'root_hash', 'import',
// 'transaction', 'data_provider_wallet',
// ]);
if (!body.import || !body.import.vertices || !body.import.edges) {
reject(Error(`Response should contain import: { vertices: ..., edges: ... }\n${JSON.stringify(body)}`));
return;
}
const response = await httpApiHelper.apiImportInfo(dc.state.node_rpc_url, this.state.lastImport.data_set_id);

const calculatedImportHash = utilities.calculateImportHash(body.import);
if (calculatedImportHash !== this.state.lastImport.data_set_id) {
reject(Error(`Calculated hash differs: ${calculatedImportHash} !== ${this.state.lastImport.data_set_id}.`));
return;
}
expect(response, 'response should contain root_hash, import, transaction and data_provider_wallet keys').to.have.keys([
'root_hash', 'import',
'transaction', 'data_provider_wallet',
]);

// TODO: Calculate root hash here and test it against body.root_hash.
accept();
expect(response.import, 'response.import should contain vertices and edges').to.have.keys(['vertices', 'edges']);

const calculatedImportHash = utilities.calculateImportHash(response.import);
expect(calculatedImportHash, `Calculated hash differs: ${calculatedImportHash} !== ${this.state.lastImport.data_set_id}.`).to.be.equal(this.state.lastImport.data_set_id);
});

Then(/^the last root hash should be the same as one manually calculated$/, async function () {
expect(!!this.state.dc, 'DC node not defined. Use other step to define it.').to.be.equal(true);
expect(this.state.nodes.length, 'No started nodes').to.be.greaterThan(0);
expect(this.state.bootstraps.length, 'No bootstrap nodes').to.be.greaterThan(0);
expect(!!this.state.lastImport, 'Last import didn\'t happen. Use other step to do it.').to.be.equal(true);
expect(!!this.state.lastReplication, 'Nothing was replicated. Use other step to do it.').to.be.equal(true);

const { dc } = this.state;

const myFingerprint = await httpApiHelper.apiFingerprint(dc.state.node_rpc_url, this.state.lastImport.data_set_id);
expect(myFingerprint).to.have.keys(['root_hash']);
expect(Utilities.isZeroHash(myFingerprint.root_hash), 'root hash value should not be zero hash').to.be.equal(false);


const myApiImportInfo = await httpApiHelper.apiImportInfo(dc.state.node_rpc_url, this.state.lastImport.data_set_id);
// vertices and edges are already sorted from the response
const myMerkle = await ImportUtilities.merkleStructure(myApiImportInfo.import.vertices.filter(vertex =>
vertex.vertex_type !== 'CLASS'), myApiImportInfo.import.edges);

expect(myFingerprint.root_hash, 'Fingerprint from API endpoint and manually calculated should match').to.be.equal(myMerkle.tree.getRoot());
});

Then(/^imported data is compliant with 01_Green_to_pink_shipment.xml file$/, async function () {
expect(!!this.state.dc, 'DC node not defined. Use other step to define it.').to.be.equal(true);
expect(this.state.nodes.length, 'No started nodes').to.be.greaterThan(0);
expect(this.state.bootstraps.length, 'No bootstrap nodes').to.be.greaterThan(0);
expect(!!this.state.lastImport, 'Last import didn\'t happen. Use other step to do it.').to.be.equal(true);

const { dc } = this.state;
let data;
const myApiImportInfo = await httpApiHelper.apiImportInfo(dc.state.node_rpc_url, this.state.lastImport.data_set_id);

expect(
utilities.findVertexIdValue(myApiImportInfo.import.vertices, 'IDENTIFIER', 'urn:ot:object:actor:id:Company_Green', 'uid', 'urn:ot:object:actor:id:Company_Green:2018-01-01T01:00:00.000-04:00Z-04:00').length,
'There should be at least one such vertex',
).to.be.above(0);
data = {
parent_id: 'urn:epc:id:sgln:Building_Green',
};
expect(
utilities.findVertexUid(myApiImportInfo.import.vertices, 'LOCATION', 'urn:ot:object:actor:id:Company_Green', 'urn:epc:id:sgln:Building_Green_V2', data).length,
'There should be at least one such vertex',
).to.be.above(0);
data = {
category: 'Company',
name: 'Green',
object_class_id: 'Actor',
wallet: '0xBbAaAd7BD40602B78C0649032D2532dEFa23A4C0',
};
expect(
utilities.findVertexUid(myApiImportInfo.import.vertices, 'ACTOR', 'urn:ot:object:actor:id:Company_Green', 'urn:ot:object:actor:id:Company_Green', data).length,
'There should be at least one such vertex',
).to.be.above(0);
data = {
category: 'Beverage',
description: 'Wine Bottle',
object_class_id: 'Product',
};
expect(
utilities.findVertexUid(myApiImportInfo.import.vertices, 'PRODUCT', 'urn:ot:object:actor:id:Company_Green', 'urn:ot:object:product:id:Product_1', data).length,
'There should be at least one such vertex',
).to.be.above(0);
data = {
expirationDate: '2020-31-12T00:01:54Z',
parent_id: 'urn:ot:object:product:id:Product_1',
productId: 'urn:ot:object:product:id:Product_1',
productionDate: '2017-31-12T00:01:54Z',
quantities: {
'urn:ot:object:actor:id:Company_Green:2018-01-01T01:00:00.000-04:00Z-04:00': {
PCS: '5d3381241af6b16260f680059e9042',
},
);
});
},
};
expect(
utilities.findVertexUid(myApiImportInfo.import.vertices, 'BATCH', 'urn:ot:object:actor:id:Company_Green', 'urn:epc:id:sgtin:Batch_1', data).length,
'There should be at least one such vertex',
).to.be.above(0);
expect(
utilities.findVertexIdValue(myApiImportInfo.import.vertices, 'IDENTIFIER', 'urn:ot:object:actor:id:Company_Green', 'uid', 'urn:epc:id:sgln:Building_Green').length,
'There should be at least one such vertex',
).to.be.above(0);
});

Given(/^DC initiates the replication$/, async function () {
Given(/^DC initiates the replication$/, { timeout: 60000 }, async function () {
expect(!!this.state.dc, 'DC node not defined. Use other step to define it.').to.be.equal(true);
expect(!!this.state.lastImport, 'Nothing was imported. Use other step to do it.').to.be.equal(true);
expect(this.state.nodes.length, 'No started nodes').to.be.greaterThan(0);