CSUMB Capstone: Added google/draco compression natively in mbgrd2gltf #1456
base: master
Conversation
@MBARIMike: The file presented cloning issues on Windows 11, both via the CLI and the GitHub Desktop application. The problem was isolated to that one single empty file. As per our discussion, and as mentioned in #1453, the aforementioned …
Testing this PR - getting the new code:
Building it:
Testing man page:
Sweet! Testing execution:
Oops! What did I do wrong?
Omitting the … I was able to create a couple of Draco-compressed .glb files that can be viewed with X3DOM here: https://stoqs.mbari.org/x3d/Monterey25_10x/Monterey25_e10_draco.html

The file size reduction with …

I do see jaggy artifacts in the resulting model, especially on the flatter parts of the terrain, e.g.: (screenshot)

Is this reasonable for what we can expect with Draco compression? We don't see these jaggies in the file created without …: https://stoqs.mbari.org/x3d/Monterey25_10x/Monterey25_e10_lat_rev.html
@MBARIMike: I'll be working on adding other quantization variables to the program and to the glTF structure. I wonder if modifying the NORMAL quantization variable will have some effect on the renderings on the flats. I did expect the edges because of fidelity loss; however, it didn't seem as drastic in the glTF viewers. Stay tuned for updates.
I wonder if the jaggies are being caused by round-off error or a lack of numerical precision. Are all the geometry computations done using double-precision floats until the final step?
@MBARIMike: It's possible. I didn't want to change the base calculations, but I'll make some changes and see if that helps. I added the other quantization variables, and that didn't help the roughness. Increasing the positional quantization to 20 yields a bigger, less compressed, and thus smoother mesh. I have attached a screenshot of the edge of the ~29MB file. In addition, this seems to be the expected behavior: using the default draco_transcoder, the ~3MB .glb thus produced is similarly rough on the flats. I didn't exaggerate the output to showcase this in the following screenshot. The following were the results of … The files thus produced were visually very similar (no difference to the naked eye).
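To make the size-vs-smoothness tradeoff above concrete, here is a minimal sketch of setting quantization bits with the Draco C++ encoder API (the function and its parameter choices are illustrative, not the actual mbgrd2gltf code):

#include "draco/compression/encode.h"
#include "draco/mesh/mesh.h"

// Compress a mesh; fewer position bits -> smaller file but a coarser
// vertex grid, which shows up as the stair-step artifacts on the flats.
draco::EncoderBuffer compress_mesh(const draco::Mesh& mesh, int position_bits) {
    draco::Encoder encoder;
    encoder.SetAttributeQuantization(draco::GeometryAttribute::POSITION,
                                     position_bits);  // e.g. 11 vs. 20, as tested above
    encoder.SetAttributeQuantization(draco::GeometryAttribute::NORMAL, 10);
    draco::EncoderBuffer buffer;
    encoder.EncodeMeshToBuffer(mesh, &buffer);  // returns a draco::Status worth checking
    return buffer;
}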
@StevenHPatrick or @aretinoco … Meanwhile, I'll be tackling the output directory discrepancy today and hope to have it all finished by this coming Wednesday.
Testing latest commits...
Resulting files have been put into X3D files viewable here with a consistent default viewpoint: …

Higher values of …

The man page and help message should give some guidance in this regard:
Also, it seems that the …
Thank you for testing this for us. I'll be using these numbers as part of our analysis to identify an optimal -qp value based on the time spent encoding, the rendered file size, and the loading time of the assets.
The above section of the man page seems to be flipped. It seems the original edits from my end did not make it into the man page. Please expect a fix soon.
I'll work on removing the … option without breaking anything by the end of the week.
One of my teammates will make the requested change according to the previous standards. @MBARIMike: Could you please remind me of the steps or direction to take to make use of the Michigan and Erie .grd files for testing?
The NetCDF metadata differs among the various programs that write .grd files. The metadata for the Monterey25.grd file is:

➜ build git:(capstone-spring2024) ✗ ncdump -h ~/GitHub/stoqsgit/stoqs/loaders/Monterey25.grd
netcdf Monterey25 {
dimensions:
	side = 2 ;
	xysize = 7037572 ;
variables:
	double x_range(side) ;
		x_range:units = "user_x_unit" ;
	double y_range(side) ;
		y_range:units = "user_y_unit" ;
	double z_range(side) ;
		z_range:units = "user_z_unit" ;
	double spacing(side) ;
	int dimension(side) ;
	float z(xysize) ;
		z:scale_factor = 1. ;
		z:add_offset = 0. ;
		z:node_offset = 0 ;

// global attributes:
		:title = "" ;
		:source = "xyz2grd mbm_arc2grd_tmp_ -GMonterey25.grd -H0 -I0.00025269750452443/0.00025269750452443 -R-122.507137008218/-121.78063168271/36.4408483957993/37.058946491866 -N-9999 -ZTLa -V" ;
}

Whereas the metadata for erie_lld.grd is:

➜ build git:(capstone-spring2024) ✗ ncdump -h ~/GitHub/stoqsgit/stoqs/loaders/erie_lld.grd
netcdf erie_lld {
dimensions:
	x = 7201 ;
	y = 2401 ;
variables:
	double x(x) ;
		x:long_name = "x" ;
		x:actual_range = -84., -78. ;
	double y(y) ;
		y:long_name = "y" ;
		y:actual_range = 41., 43. ;
	float z(y, x) ;
		z:long_name = "z" ;
		z:_FillValue = NaNf ;
		z:actual_range = -62.5768966674805, 603.534606933594 ;

// global attributes:
		:Conventions = "COARDS/CF-1.0" ;
		:title = "erie.grd" ;
		:history = "grdmath erie_igld.grd 173.5 SUB = erie_lld.grd" ;
		:GMT_version = "4.5.7 [64-bit]" ;
}

The same information is there, but it needs to be parsed and constructed differently for mbgrd2gltf. I'm adding more try/catch blocks to this section of code (https://github.com/dwcaress/MB-System/blob/1caa49f336a78deedf9f9fcb75adb05d3f35e171/src/mbgrd2gltf/bathymetry.cpp#L50-L67) to be able to read both formats.
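For context, a minimal sketch of that fallback idea, assuming the NetCDF C API and the variable names from the two ncdump listings above (read_scalar and read_x_extents are hypothetical helpers, not the actual bathymetry.cpp code):

#include <netcdf.h>
#include <stdexcept>
#include <string>

// Hypothetical helper: read element `index` of a 1-D NetCDF variable.
static double read_scalar(int ncid, const char* name, size_t index) {
    int varid;
    if (nc_inq_varid(ncid, name, &varid) != NC_NOERR)
        throw std::runtime_error(std::string("missing variable: ") + name);
    double value;
    if (nc_get_var1_double(ncid, varid, &index, &value) != NC_NOERR)
        throw std::runtime_error(std::string("failed to read: ") + name);
    return value;
}

// Try the legacy xyz2grd layout first, then fall back to the
// COARDS/CF-1.0 layout seen in erie_lld.grd.
void read_x_extents(int ncid, double& x_min, double& x_max) {
    try {
        x_min = read_scalar(ncid, "x_range", 0);  // double x_range(side)
        x_max = read_scalar(ncid, "x_range", 1);
    } catch (const std::runtime_error&) {
        int varid;
        double range[2];  // x:actual_range = -84., -78.
        if (nc_inq_varid(ncid, "x", &varid) != NC_NOERR ||
            nc_get_att_double(ncid, varid, "actual_range", range) != NC_NOERR)
            throw;
        x_min = range[0];
        x_max = range[1];
    }
}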
Note that what the GMT grid-writing modules insert has changed many times over different versions, and the same is true for mbgrid and for other sources of GMT-format grids like the GMRT global topography site. I don't think that you can successfully identify a short list of variants for the x, y, and z names and units; instead, you have to code assuming those values are arbitrary. It may be worth trying to identify grids that are in some projected coordinate system versus geographic longitude and latitude, but I don't think that you can expect standardization of the array variable names and units.
Dave
@dwcaress Thanks for your comment; it helps confirm my observations on the variety of .grd file formats. Given that there is no determinative way to discern the format (for example, by switching on the value of the Conventions global attribute), I think it's best to deal (for now) with the arbitrary values by having arbitrary sets of try/catch blocks that can parse the files we care about. The mbgrd2gltf tool converts geographic longitude and latitude to ECEF coordinates before converting the grid into a mesh, so we don't need to convert the data to a projected coordinate system, even though this may add some determinism to the workflow.
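For readers unfamiliar with that step, here is a sketch of the standard WGS-84 geodetic-to-ECEF conversion (textbook formulas, not mbgrd2gltf's exact implementation):

#include <cmath>

struct Ecef { double x, y, z; };

// Standard WGS-84 geodetic -> Earth-Centered Earth-Fixed conversion.
Ecef geodetic_to_ecef(double lon_deg, double lat_deg, double height_m) {
    const double kPi = 3.14159265358979323846;
    const double a   = 6378137.0;         // WGS-84 semi-major axis (m)
    const double e2  = 6.69437999014e-3;  // first eccentricity squared
    const double lon = lon_deg * kPi / 180.0;
    const double lat = lat_deg * kPi / 180.0;
    // Prime-vertical radius of curvature at this latitude
    const double n = a / std::sqrt(1.0 - e2 * std::sin(lat) * std::sin(lat));
    return { (n + height_m) * std::cos(lat) * std::cos(lon),
             (n + height_m) * std::cos(lat) * std::sin(lon),
             (n * (1.0 - e2) + height_m) * std::sin(lat) };
}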
I was wondering if regex pattern matching might be an option for the variable names. Something along the lines of:

import re

try:
    # Regex patterns for the variable-name variants we expect
    side_pattern = re.compile(r"^side$")
    xysize_pattern = re.compile(r"^(xysize|x|y)$")
    range_pattern = re.compile(r"^[a-zA-Z]+_range$")

    # Iterate over variables in the dataset (`variables` would come
    # from the opened NetCDF file) to find matches
    for var_name in variables:
        if side_pattern.match(var_name):
            pass  # assign the side data accordingly
        elif xysize_pattern.match(var_name):
            pass  # assign the x/y size data accordingly
        elif range_pattern.match(var_name):
            pass  # assign the range data accordingly
except Exception as e:
    raise  # report the parsing failure
Thanks @varadpoddar. So far I'm going with a pattern like this: …

With these kinds of changes, I'm able to parse the data into the variables that bathymetry.cpp needs. However, my testing indicates several other issues (upstream of the Draco compression) that are beyond the scope of the work entailed by this PR.
Following up on #1456 (comment), I made the changes shown in this gist and was able to make Draco-compressed meshes of the Michigan and Erie .grd files:
The resulting .glb files load into X3DOM here: … Note that the …
I'm making this comment here to document further work beyond the scope of this PR. Here is a checklist for follow-on improvements: …
@MBARIMike: On it. I apologize for the delay. I am setting a hard deadline for us for these tasks by the end of the day Thursday. If that is later than you expected, I'll try to crank it out by tomorrow. Please let me know.

As for the error, it seems to be because the glTF size, or more importantly the buffer size, exceeds the original capacity of the array, and your approach with the stride seems to be the optimal way to handle this scenario. I'll talk to the team tomorrow; however, I am thinking of something along the lines of …

Waiting to hear back.
Hi @varadpoddar, no need to crank it out tomorrow. The purpose of my checklist is to capture items for follow-on work.
@MBARIMike: Based on some more extensive research, I located multiple causes of runtime errors in the JS frameworks used to visualize the glTF/glb data. One error happens because of irregular placement of buffer data in large files and the Draco decoder's inability to parse it. (Assumption: since a fair chunk of memory is requested in one lump sum because of the big buffer size, I think it reaches past the decoder's initial memory allotment.) One fix that I explored but wasn't able to implement was chunking the buffer data. This would require breaking up the buffer data and assigning the properties (accessors, bufferViews) correctly to the data while it is being Draco-compressed. The second error type involved Uint8Array. As a workaround, and based on some experimentation, setting the stride value to 1 reduces the z size by ~10-20%. The strides or compressions thus force the buffer not to jump out of the array, which leads me to believe that the issue might be there.
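A minimal sketch of the chunking idea mentioned above (a hypothetical helper, not code from this PR): split one large binary buffer into bounded pieces so that no single glTF buffer or allocation exceeds a decoder-friendly limit.

#include <algorithm>
#include <cstdint>
#include <vector>

// Split a large buffer into chunks of at most max_chunk_bytes each.
// Each chunk would still need its own bufferView/accessor entries.
std::vector<std::vector<uint8_t>> chunk_buffer(
    const std::vector<uint8_t>& data, size_t max_chunk_bytes) {
    std::vector<std::vector<uint8_t>> chunks;
    for (size_t offset = 0; offset < data.size(); offset += max_chunk_bytes) {
        const size_t len = std::min(max_chunk_bytes, data.size() - offset);
        chunks.emplace_back(data.begin() + offset, data.begin() + offset + len);
    }
    return chunks;
}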
All changes have been implemented, and the documentation reflects the same. In the spirit of improving the project, I spent some time exploring ways to eliminate the try/catch blocks in favor of some standard way to parse in the netCDF values. Toward this end, I ended up writing a JSON parser that could theoretically perform the task as intended; however, it wasn't published as part of the pull request. The following would be a blueprint of a "Universal" NetCDF value interpreter.
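The blueprint itself is not shown in the thread; purely as a hypothetical illustration of the idea, a table-driven interpreter might map canonical fields to known variable-name aliases, so a new .grd variant only requires a table entry rather than another try/catch block:

#include <map>
#include <string>
#include <vector>

// Hypothetical alias table: canonical field -> candidate NetCDF names.
// A loader would probe each candidate (e.g. with nc_inq_varid) until
// one resolves, instead of hard-coding a single file layout.
const std::map<std::string, std::vector<std::string>> kVariableAliases = {
    { "x", { "x", "lon", "x_range" } },
    { "y", { "y", "lat", "y_range" } },
    { "z", { "z", "elevation", "depth" } },
};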
Time constraints led me to leave this approach behind in favor of the try/catch blocks from the gist. I hope another team can use this approach to gain some traction in the wild, Wild-West-ish world that is netCDF. Regards,
Adds: Draco compression to mbgrd2gltf
Solves: #1426
Tasks performed: