Demo of point cloud input #5
Hi, the mesh in the demo is artificial, very clean, and has thin slices. However, many meshes generated by implicit networks are double-layered and hollow in the middle. I found that the method from this paper does not perform well on these generated meshes. Since point clouds are more common in practice, could you provide a simple demo for point cloud input?
Hi @2019EPWL, thanks for bringing this up. I have not tested this on AI-generated meshes, but if you can link the examples you tested, I am keen to try them out. My first guess is that generated meshes have poor geometry, which can yield depth and normal maps unlike those ControlNet has been trained on. However, good-quality meshes such as those from https://www.kaedim3d.com/ can definitely work well. For point cloud data, we combine the rendered depth map with Canny edge maps, as illustrated in the supplementary of our paper (eq. 13). Note that this only works well with dense point clouds. Will look into adding a demo, thanks for the idea!
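In case it helps in the meantime, here is a minimal sketch of that depth + Canny combination. It assumes you already have a depth map rendered from the point cloud; the function name, Canny thresholds, and the simple weighted blend are illustrative only, so please consult eq. 13 in the supplementary for the exact formulation:

```python
import cv2
import numpy as np

def depth_to_conditioning(depth, edge_weight=0.5):
    """Turn a rendered depth map (float array, 0 = background) into a
    combined depth + Canny conditioning image. Illustrative sketch only;
    the weighting in eq. 13 of the paper may differ."""
    d = depth.astype(np.float32)
    valid = d > 0
    if valid.any():
        # Normalize valid depth values to [0, 1]; background stays 0.
        d[valid] = (d[valid] - d[valid].min()) / (np.ptp(d[valid]) + 1e-8)
    depth_img = (d * 255).astype(np.uint8)

    # Canny edges on the depth image; thresholds here are illustrative.
    edges = cv2.Canny(depth_img, 50, 150)

    # Simple weighted blend into a single conditioning image.
    combined = cv2.addWeighted(depth_img, 1.0 - edge_weight,
                               edges, edge_weight, 0)
    return depth_img, edges, combined
```

The same maps could also be fed to separate depth and Canny ControlNets instead of being blended into one image; the single-image blend above is just the simplest option to try.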
Dear @niladridutt, I would like to ask: do you have any plans to add a point cloud demo?
Hi, thank you for sharing this work. Currently, the .ipynb file only provides a demo for mesh input. Could you advise on how to proceed when the input is a point cloud? I see from render_point_cloud.py that it only returns a depth image, which differs from the mesh rendering.
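For reference, a point cloud can be splatted into such a depth image roughly as below. This is a hypothetical stand-in for what render_point_cloud.py does (the repo's actual camera conventions may differ); its output could then be fed into a depth + Canny combination like the one sketched above:

```python
import numpy as np

def render_depth(points, K, R, t, height=512, width=512):
    """points: (N, 3) world coordinates; K: (3, 3) camera intrinsics;
    R, t: world-to-camera rotation (3, 3) and translation (3,).
    Hypothetical sketch, not the repo's actual implementation."""
    cam = points @ R.T + t               # world -> camera frame
    cam = cam[cam[:, 2] > 0]             # keep points in front of the camera
    uv = cam @ K.T                       # pinhole projection
    u = (uv[:, 0] / uv[:, 2]).astype(int)
    v = (uv[:, 1] / uv[:, 2]).astype(int)
    z = cam[:, 2]
    inside = (u >= 0) & (u < width) & (v >= 0) & (v < height)
    u, v, z = u[inside], v[inside], z[inside]

    depth = np.full((height, width), np.inf)
    np.minimum.at(depth, (v, u), z)      # z-buffer: nearest point wins
    depth[np.isinf(depth)] = 0.0         # mark empty pixels as background
    return depth
```

For a dense cloud this produces a reasonably filled depth map; for sparser clouds you would likely need to splat each point over a small radius or fill holes before Canny edges become useful, which matches the "dense point clouds" caveat above.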