Issue 654 - gzipping asset bundles #656

Merged · 14 commits · Jan 15, 2024
8 changes: 6 additions & 2 deletions .travis.yml
@@ -32,7 +32,9 @@ matrix:
   - sudo apt-get install -y gnupg mongodb-org=5.0.8 mongodb-org-database=5.0.8 mongodb-org-server=5.0.8 mongodb-org-shell=5.0.8 mongodb-org-tools=5.0.8
   - sudo systemctl daemon-reload && sudo systemctl start mongod && echo $(mongod --version)
   - mkdir testData && cd testData
-  - svn --no-auth-cache export --username $TESTUSER --password $TESTPW https://github.com/3drepo/tests/trunk/cplusplus/bouncer
+  - git clone --filter=blob:none --sparse https://$TESTUSER:[email protected]/3drepo/tests.git .
+  - git sparse-checkout add cplusplus/bouncer
+  - mv cplusplus/bouncer/ .
   - cd ../
   - until nc -z localhost 27017; do echo Waiting for MongoDB; sleep 1; done
   - mongo admin testData/bouncer/createUser.js
@@ -114,7 +116,9 @@ matrix:
   - export CXX="g++"
   - export CC="gcc"
   - mkdir testData && cd testData
-  - svn --no-auth-cache export --username $TESTUSER --password $TESTPW https://github.com/3drepo/tests/trunk/cplusplus/bouncer/ext_libs/focal
+  - git clone --filter=blob:none --sparse https://$TESTUSER:[email protected]/3drepo/tests.git .
+  - git sparse-checkout add cplusplus/bouncer/ext_libs/focal
+  - mv cplusplus/bouncer/ext_libs/focal/ .
   - cd ../
   - echo ============ BOOST INSTALL =============
   - sudo apt-get install libboost-all-dev
2 changes: 1 addition & 1 deletion README.md
@@ -1,7 +1,7 @@
 3drepobouncer [![Build Status](https://travis-ci.org/3drepo/3drepobouncer.svg?branch=master)](https://travis-ci.org/3drepo/3drepobouncer) [![Coverage Status](https://coveralls.io/repos/github/3drepo/3drepobouncer/badge.svg?branch=master)](https://coveralls.io/github/3drepo/3drepobouncer?branch=master)
 =========

-3DRepoBouncer is essentially the refactored 3DRepoCore and (parts of) 3DRepoGUI. It is a C++ library providing 3D Repo Scene Graph definition, repository management and manipulation logic as well as direct MongoDB databse access.
+3DRepoBouncer is essentially the refactored 3DRepoCore and (parts of) 3DRepoGUI. It is a C++ library providing 3D Repo Scene Graph definition, repository management and manipulation logic as well as direct MongoDB database access.

 ### Latest Releases
 We always recommend using the [Latest stable release](https://github.com/3drepo/3drepobouncer/releases). However, to access cutting-edge development versions, check out the [tags](https://github.com/3drepo/3drepobouncer/tags).
61 changes: 57 additions & 4 deletions bouncer/src/repo/core/handler/fileservice/repo_file_manager.cpp
@@ -18,9 +18,15 @@
 #include "repo_file_manager.h"
 #include "../../../lib/repo_exception.h"
 #include "../../model/repo_model_global.h"
+#include "../../model/bson/repo_bson_builder.h"
 #include "../../model/bson/repo_bson_factory.h"
 #include "repo_file_handler_fs.h"
 #include "repo_file_handler_gridfs.h"
+#include <boost/iostreams/filtering_stream.hpp>
+#include <boost/iostreams/filter/gzip.hpp>
+#include <boost/iostreams/copy.hpp>
+#include <boost/interprocess/streams/bufferstream.hpp>
+#include <istream>

 using namespace repo::core::handler::fileservice;

@@ -46,20 +52,67 @@ bool FileManager::uploadFileAndCommit(
 	const std::string &collectionNamePrefix,
 	const std::string &fileName,
 	const std::vector<uint8_t> &bin,
-	const repo::core::model::RepoBSON &metadata)
+	const repo::core::model::RepoBSON &metadata,
+	const Encoding &encoding)
 {
 	bool success = true;
 	auto fileUUID = repo::lib::RepoUUID::createUUID();
-	auto linkName = defaultHandler->uploadFile(databaseName, collectionNamePrefix, fileUUID.toString(), bin);

+	// Create local references that can be reassigned, if we need to do any
+	// further processing...
+
+	const std::vector<uint8_t>* fileContents = &bin;
+	repo::core::model::RepoBSON fileMetadata = metadata;
+
+	switch (encoding)
+	{
+	case Encoding::Gzip:
+	{
+		// Use stringstream as a binary container to hold the compressed data
+		std::ostringstream compressedstream;
+
+		// Bufferstream operates directly over the user-provided array
+		boost::interprocess::bufferstream uncompressed((char*)bin.data(), bin.size());
+
+		// In stream form for filtering_istream
+		std::istream uncompressedStream(uncompressed.rdbuf());
+
+		boost::iostreams::filtering_istream in;
+		in.push(boost::iostreams::gzip_compressor());
+		in.push(uncompressedStream); // For some reason bufferstream is ambiguous between stream and streambuf, so wrap it unambiguously
+
+		boost::iostreams::copy(in, compressedstream);
+
+		auto compresseddata = compressedstream.str();
+		fileContents = new std::vector<uint8_t>(compresseddata.begin(), compresseddata.end());
+
+		repo::core::model::RepoBSONBuilder builder;
+		builder.append("encoding", "gzip");
+		auto bson = builder.obj();
+		fileMetadata = metadata.cloneAndAddFields(&bson);
+	}
+	break;
+	}
+
+	auto linkName = defaultHandler->uploadFile(databaseName, collectionNamePrefix, fileUUID.toString(), *fileContents);
 	if (success = !linkName.empty()) {
 		success = upsertFileRef(
 			databaseName,
 			collectionNamePrefix,
 			cleanFileName(fileName),
 			linkName,
 			defaultHandler->getType(),
-			bin.size(),
-			metadata);
+			fileContents->size(),
+			fileMetadata);
 	}

+	// If we've created a new vector, clean it up. metadata doesn't need to be
+	// deleted because Mongo BSONs have built-in smart pointers allowing them to
+	// be passed by value.
+
+	if (fileContents != &bin)
+	{
+		delete fileContents;
+	}
+
 	return success;
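The compressor path above can be sanity-checked in isolation. Below is a minimal round-trip sketch, not part of this PR; `gzipRoundTrip` and its stream names are illustrative. It compresses a buffer with the same Boost.Iostreams filter and decompresses it again to confirm the bytes survive:

```cpp
#include <boost/iostreams/filtering_stream.hpp>
#include <boost/iostreams/filter/gzip.hpp>
#include <boost/iostreams/copy.hpp>
#include <algorithm>
#include <cassert>
#include <sstream>
#include <string>
#include <vector>

// Illustrative helper: gzip a buffer, gunzip it back, and assert equality.
void gzipRoundTrip(const std::vector<uint8_t>& bin)
{
	// Compress, mirroring the filter chain used in FileManager::uploadFileAndCommit
	std::istringstream uncompressed(std::string(bin.begin(), bin.end()));
	std::ostringstream compressed;
	boost::iostreams::filtering_istream in;
	in.push(boost::iostreams::gzip_compressor());
	in.push(uncompressed);
	boost::iostreams::copy(in, compressed);

	// Decompress again with the matching filter
	std::istringstream compressedIn(compressed.str());
	std::ostringstream restored;
	boost::iostreams::filtering_istream out;
	out.push(boost::iostreams::gzip_decompressor());
	out.push(compressedIn);
	boost::iostreams::copy(out, restored);

	auto result = restored.str();
	assert(result.size() == bin.size() &&
		std::equal(result.begin(), result.end(), bin.begin()));
}
```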
11 changes: 10 additions & 1 deletion bouncer/src/repo/core/handler/fileservice/repo_file_manager.h
@@ -54,6 +54,14 @@ namespace repo {
 				repo::core::handler::AbstractDatabaseHandler *dbHandler
 			);

+			/*
+			 * Possible options for static compression of stored files
+			 */
+			enum Encoding {
+				None = 0,
+				Gzip = 1
+			};
+
 			/**
 			 * Upload file and commit ref entry to database.
 			 */
@@ -62,7 +70,8 @@
 				const std::string &collectionNamePrefix,
 				const std::string &fileName,
 				const std::vector<uint8_t> &bin,
-				const repo::core::model::RepoBSON &metadata = repo::core::model::RepoBSON()
+				const repo::core::model::RepoBSON &metadata = repo::core::model::RepoBSON(),
+				const Encoding &encoding = Encoding::None
 			);

 			/**
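Because `encoding` defaults to `Encoding::None`, existing call sites compile unchanged; callers opt in per file. A hypothetical sketch of both styles (the function and variable names are illustrative, assuming an initialised manager):

```cpp
#include "repo_file_manager.h"

using repo::core::handler::fileservice::FileManager;

// Hypothetical call sites for the extended signature.
void uploadExamples(FileManager* fileManager,
	const std::string& databaseName,
	const std::string& prefix,
	const std::string& fileName,
	const std::vector<uint8_t>& bin)
{
	// Existing callers are unaffected: encoding defaults to Encoding::None.
	fileManager->uploadFileAndCommit(databaseName, prefix, fileName, bin);

	// Opting in to static gzip compression for a specific file:
	fileManager->uploadFileAndCommit(databaseName, prefix, fileName, bin,
		repo::core::model::RepoBSON(), FileManager::Encoding::Gzip);
}
```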
11 changes: 7 additions & 4 deletions bouncer/src/repo/lib/datastructure/repo_structs.h
@@ -25,11 +25,14 @@
 #include "repo_vector.h"
 #include <boost/crc.hpp>

-typedef struct {
-	std::unordered_map<std::string, std::vector<uint8_t>> geoFiles; //files where geometery are stored
-	std::unordered_map<std::string, std::vector<uint8_t>> jsonFiles; //JSON mapping files
+using repo_web_geo_files_t = std::unordered_map<std::string, std::vector<uint8_t>>;
+using repo_web_json_files_t = std::unordered_map<std::string, std::vector<uint8_t>>;
+
+struct repo_web_buffers_t{
+	repo_web_geo_files_t geoFiles; // files where geometry is stored
+	repo_web_json_files_t jsonFiles; // JSON mapping files
 	repo::core::model::RepoUnityAssets unityAssets; //Unity assets list
-}repo_web_buffers_t;
+};

 //This is used to map info for multipart optimization
 typedef struct {
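The aliases above shorten the exporter signatures changed in the files that follow. As a hypothetical illustration only (the file name and byte values are placeholders, and it assumes `RepoUnityAssets` is default-constructible), a caller assembling the renamed struct might look like:

```cpp
#include "repo_structs.h"

// Hypothetical sketch: building repo_web_buffers_t via the new aliases.
repo_web_buffers_t makeWebBuffers()
{
	repo_web_geo_files_t geoFiles;    // geometry bundles keyed by file name
	repo_web_json_files_t jsonFiles;  // JSON mapping files keyed by file name

	geoFiles["model.src.mpc"] = { 0x1f, 0x8b }; // placeholder bytes
	return { geoFiles, jsonFiles, repo::core::model::RepoUnityAssets() };
}
```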
@@ -85,7 +85,7 @@ AssetModelExport::~AssetModelExport()
 repo_web_buffers_t AssetModelExport::getAllFilesExportedAsBuffer() const
 {
 	return {
-		std::unordered_map<std::string, std::vector<uint8_t>>(),
+		repo_web_geo_files_t(),
 		getJSONFilesAsBuffer(),
 		getUnityAssets()
 	};
@@ -564,9 +564,9 @@ repo_web_buffers_t GLTFModelExport::getAllFilesExportedAsBuffer() const
 	return{ getGLTFFilesAsBuffer(), getJSONFilesAsBuffer() };
 }

-std::unordered_map<std::string, std::vector<uint8_t>> GLTFModelExport::getGLTFFilesAsBuffer() const
+repo_web_geo_files_t GLTFModelExport::getGLTFFilesAsBuffer() const
 {
-	std::unordered_map<std::string, std::vector<uint8_t>> files;
+	repo_web_geo_files_t files;
 	//GLTF files
 	for (const auto &pair : trees)
 	{
@@ -236,7 +236,7 @@ namespace repo{
 			* Return the GLTF file as raw bytes buffer
 			* returns an empty vector if the export has failed
 			*/
-			std::unordered_map<std::string, std::vector<uint8_t>> getGLTFFilesAsBuffer() const;
+			repo_web_geo_files_t getGLTFFilesAsBuffer() const;

 			/**
 			* Reindex the given faces base on the given information
@@ -136,9 +136,9 @@ SRCModelExport::~SRCModelExport()
 {
 }

-std::unordered_map<std::string, std::vector<uint8_t>> SRCModelExport::getSRCFilesAsBuffer() const
+repo_web_geo_files_t SRCModelExport::getSRCFilesAsBuffer() const
 {
-	std::unordered_map < std::string, std::vector<uint8_t> > fileBuffers;
+	repo_web_geo_files_t fileBuffers;

 	for (const auto &treePair : trees)
 	{
@@ -94,7 +94,7 @@ namespace repo{
 			* Return the SRC file as raw bytes buffer
 			* returns an empty vector if the export has failed
 			*/
-			std::unordered_map<std::string, std::vector<uint8_t>> getSRCFilesAsBuffer() const;
+			repo_web_geo_files_t getSRCFilesAsBuffer() const;
 		};
 	} //namespace modelconvertor
 } //namespace manipulator
@@ -39,8 +39,14 @@ using Scalar = float
 using Bvh = bvh::Bvh<Scalar>;
 using BvhVector3 = bvh::Vector3<Scalar>;

-static const size_t REPO_MP_MAX_VERTEX_COUNT = 65536;
+// The vertex count is used as a rough approximation of the total geometry size.
+// This figure is empirically set to end up with an average bundle size of 24 Mb.
+static const size_t REPO_MP_MAX_VERTEX_COUNT = 1200000;
+
+// This limit is used to prevent metadata files becoming unwieldy, and the
+// supermesh UV resolution falling below the quantisation noise floor.
+static const size_t REPO_MP_MAX_MESHES_IN_SUPERMESH = 5000;

 static const size_t REPO_BVH_MAX_LEAF_SIZE = 16;
 static const size_t REPO_MODEL_LOW_CLUSTERING_RATIO = 0.2f;
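To make the role of the two caps concrete, here is an illustrative sketch of how a packer would close a supermesh once either limit would be exceeded. This is not the actual BVH-driven clustering in repo_optimizer_multipart.cpp; `SupermeshBin` is a made-up name:

```cpp
// Illustrative only: a bin that respects both supermesh caps.
struct SupermeshBin
{
	size_t vertexCount = 0;
	size_t meshCount = 0;

	// True if a mesh with the given vertex count still fits under both limits.
	bool fits(size_t meshVertices) const
	{
		return vertexCount + meshVertices <= REPO_MP_MAX_VERTEX_COUNT
			&& meshCount < REPO_MP_MAX_MESHES_IN_SUPERMESH;
	}

	void add(size_t meshVertices)
	{
		vertexCount += meshVertices;
		meshCount++;
	}
};
```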
@@ -43,7 +43,7 @@ bool SceneManager::commitWebBuffers(
 	//Upload the files
 	for (const auto &bufferPair : resultBuffers.geoFiles)
 	{
-		if (success &= fileManager->uploadFileAndCommit(databaseName, projectName + "." + geoStashExt, bufferPair.first, bufferPair.second))
+		if (success &= fileManager->uploadFileAndCommit(databaseName, projectName + "." + geoStashExt, bufferPair.first, bufferPair.second, {}, repo::core::handler::fileservice::FileManager::Encoding::Gzip)) // Web geometry files are gzipped by default
 		{
 			repoInfo << "File (" << bufferPair.first << ") added successfully to file storage.";
 		}
@@ -389,7 +389,7 @@ TEST(MultipartOptimizer, TestSingleOversizedMesh)
 	auto nMesh = 3;
 	repo::core::model::RepoNodeSet meshes, trans, dummy;
 	trans.insert(root);
-	meshes.insert(createRandomMesh(65537, false, 3, { rootID }));
+	meshes.insert(createRandomMesh(1200000 + 1, false, 3, { rootID })); // 1200000 comes from the const in repo_optimizer_multipart.cpp

 	repo::core::model::RepoScene* scene = new repo::core::model::RepoScene({}, dummy, meshes, dummy, dummy, dummy, trans);
 	ASSERT_TRUE(scene->hasRoot(DEFAULT_GRAPH));