How to do inference for a given 3D point cloud of a room? #17

Open
shubhamwagh opened this issue Dec 7, 2018 · 7 comments


@shubhamwagh

Hello!
Once the model is trained, how is inference done? If I give an input 3D point cloud of a room, can I get a floorplan as output, or do I also need to provide images?

@art-programmer (Owner)

Please refer to https://github.com/art-programmer/FloorNet/blob/master/RecordWriterCustom.py for writing your own data as a tfrecords file. Then you can run inference similarly to what is done in evaluate.py. You don't have to provide images. However, if you don't have images, it may be better to also train the model without images.
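
For reference, a minimal sketch of what writing one record might look like, assuming TF 1.x; the feature names and sizes below are taken from the parse spec that shows up in the error trace later in this thread, so double-check them against RecordWriterCustom.py. The label fields here are dummy placeholders for inference-only use, and the dummy label map dtype/shape is an assumption:

```python
import numpy as np
import tensorflow as tf  # TF 1.x

NUM_POINTS = 50000    # number of points per example the parser expects
NUM_CHANNELS = 6      # channels per point, i.e. 300000 floats in total

def _float_feature(values):
    return tf.train.Feature(float_list=tf.train.FloatList(value=values))

def _int64_feature(values):
    return tf.train.Feature(int64_list=tf.train.Int64List(value=values))

def _bytes_feature(value):
    return tf.train.Feature(bytes_list=tf.train.BytesList(value=[value]))

def write_record(points, out_path):
    """points: float32 array of shape [NUM_POINTS, NUM_CHANNELS]."""
    assert points.shape == (NUM_POINTS, NUM_CHANNELS)
    # Dummy label maps; mirror whatever RecordWriterCustom.py actually writes here.
    dummy_map = np.zeros((256, 256), dtype=np.uint8).tobytes()
    example = tf.train.Example(features=tf.train.Features(feature={
        'points': _float_feature(points.reshape(-1).tolist()),
        'point_indices': _int64_feature([0] * NUM_POINTS),
        'corner': _int64_feature([0] * 900),   # dummy labels for inference
        'num_corners': _int64_feature([0]),
        'flags': _int64_feature([0, 0]),       # meaning defined by the repo
        'icon': _bytes_feature(dummy_map),
        'room': _bytes_feature(dummy_map),
        'image_path': _bytes_feature(b''),
    }))
    writer = tf.python_io.TFRecordWriter(out_path)
    writer.write(example.SerializeToString())
    writer.close()
```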

@shubhamwagh (Author)

Hello!
Thanks for the reply. I am actually trying to use the pre-trained model to run inference on my point cloud dataset (2-3 .pcd files). What I understood is:

  1. I will first convert my point cloud dataset into a tfrecords file.
  2. Then I can run inference similar to evaluate.py, right?

By images I meant: while writing data into the tfrecords file, do I also have to provide the images captured during scanning?

@art-programmer (Owner)

Yes, you are correct. If you have images, it is better to provide them. If not, you can leave zero values in the image_feature field.
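
A minimal illustration of leaving zeros, assuming the image feature is stored as a flat float list of fixed length; the variable name and the length below are placeholders, so use whatever key and size RecordWriterCustom.py actually writes:

```python
import numpy as np
import tensorflow as tf  # TF 1.x

IMAGE_FEATURE_SIZE = 1000  # placeholder length; match RecordWriterCustom.py

# Zero-filled stand-in for the image feature when no images are available.
zero_image_feature = tf.train.Feature(
    float_list=tf.train.FloatList(
        value=np.zeros(IMAGE_FEATURE_SIZE, dtype=np.float32).tolist()))
# Store this under the image-feature key of the tf.train.Example.
```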

@shubhamwagh (Author)

Hi!

So I was trying to use the RecordWriterCustom.py file you pointed out earlier to convert custom point cloud scans to a tfrecords file.

  1. Initially I converted all my ".pcd" files to ".npy" (one possible conversion is sketched after this list).
  2. Kept numChannels = 3 as I am only giving XYZ points.
  3. Ran the RecordWriterCustom.py code, which (after some tweaks) successfully converts the point cloud scans to a ".tfrecords" file. Initially I gave only one point cloud file.
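
For step 1, one possible way to do the .pcd to .npy conversion; open3d is just one option and is not something the repo requires, and only XYZ is kept here, matching numChannels = 3:

```python
import numpy as np
import open3d as o3d  # one possible .pcd reader

def pcd_to_npy(pcd_path, npy_path):
    # Load the scan and keep only the XYZ coordinates.
    pcd = o3d.io.read_point_cloud(pcd_path)
    points = np.asarray(pcd.points, dtype=np.float32)  # shape [N, 3]
    np.save(npy_path, points)

pcd_to_npy('scan.pcd', 'scan.npy')
```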

Now when I use this file to evaluate with

python train.py --task=evaluate --separateIconLoss

I get the following error:

```
WARNING:tensorflow:From /home/shubham/FloorNet/train.py:635: sparse_to_dense (from tensorflow.python.ops.sparse_ops) is deprecated and will be removed in a future version.
Instructions for updating:
Create a tf.sparse.SparseTensor and use tf.sparse.to_dense instead.
2019-02-19 18:14:52.247303: I tensorflow/core/platform/cpu_feature_guard.cc:141] Your CPU supports instructions that this TensorFlow binary was not compiled to use: AVX2 FMA
2019-02-19 18:14:53.037395: W tensorflow/core/framework/op_kernel.cc:1273] OP_REQUIRES failed at example_parsing_ops.cc:240 : Invalid argument: Key: points. Can't parse serialized Example.
2019-02-19 18:14:53.040973: W tensorflow/core/framework/op_kernel.cc:1273] OP_REQUIRES failed at example_parsing_ops.cc:240 : Invalid argument: Key: points. Can't parse serialized Example.
2019-02-19 18:14:53.044373: W tensorflow/core/framework/op_kernel.cc:1273] OP_REQUIRES failed at example_parsing_ops.cc:240 : Invalid argument: Key: points. Can't parse serialized Example.
2019-02-19 18:14:53.047929: W tensorflow/core/framework/op_kernel.cc:1273] OP_REQUIRES failed at example_parsing_ops.cc:240 : Invalid argument: Key: points. Can't parse serialized Example.
2019-02-19 18:14:53.049790: W tensorflow/core/framework/op_kernel.cc:1273] OP_REQUIRES failed at example_parsing_ops.cc:240 : Invalid argument: Key: points. Can't parse serialized Example.
2019-02-19 18:14:53.051721: W tensorflow/core/framework/op_kernel.cc:1273] OP_REQUIRES failed at example_parsing_ops.cc:240 : Invalid argument: Key: points. Can't parse serialized Example.
2019-02-19 18:14:53.053636: W tensorflow/core/framework/op_kernel.cc:1273] OP_REQUIRES failed at example_parsing_ops.cc:240 : Invalid argument: Key: points. Can't parse serialized Example.
2019-02-19 18:14:53.055573: W tensorflow/core/framework/op_kernel.cc:1273] OP_REQUIRES failed at example_parsing_ops.cc:240 : Invalid argument: Key: points. Can't parse serialized Example.
Traceback (most recent call last):
File "/home/shubham/pycharm-community-2018.3.2/helpers/pydev/pydevd.py", line 1741, in
main()
File "/home/shubham/pycharm-community-2018.3.2/helpers/pydev/pydevd.py", line 1735, in main
globals = debugger.run(setup['file'], None, None, is_module)
File "/home/shubham/pycharm-community-2018.3.2/helpers/pydev/pydevd.py", line 1135, in run
pydev_imports.execfile(file, globals, locals) # execute the script
File "/home/shubham/FloorNet/train.py", line 1559, in
evaluate(args)
File "/home/shubham/FloorNet/evaluate.py", line 113, in evaluate
total_loss, losses, dataset, image_flags, gt, pred, debug, inp = sess.run([loss, loss_list, dataset_flag, flags, gt_dict, pred_dict, debug_dict, input_dict])
File "/home/shubham/FloorNet/venv/local/lib/python2.7/site-packages/tensorflow/python/client/session.py", line 929, in run
run_metadata_ptr)
File "/home/shubham/FloorNet/venv/local/lib/python2.7/site-packages/tensorflow/python/client/session.py", line 1152, in _run
feed_dict_tensor, options, run_metadata)
File "/home/shubham/FloorNet/venv/local/lib/python2.7/site-packages/tensorflow/python/client/session.py", line 1328, in _do_run
run_metadata)
File "/home/shubham/FloorNet/venv/local/lib/python2.7/site-packages/tensorflow/python/client/session.py", line 1348, in _do_call
raise type(e)(node_def, op, message)
tensorflow.python.framework.errors_impl.InvalidArgumentError: Key: points. Can't parse serialized Example.
[[{{node ParseSingleExample/ParseSingleExample}} = ParseSingleExample[Tdense=[DT_INT64, DT_INT64, DT_STRING, DT_STRING, DT_INT64, DT_INT64, DT_FLOAT, DT_STRING], dense_keys=["corner", "flags", "icon", "image_path", "num_corners", "point_indices", "points", "room"], dense_shapes=[[900], [2], [], [], [], [50000], [300000], []], num_sparse=0, sparse_keys=[], sparse_types=[]](arg0, ParseSingleExample/Const, ParseSingleExample/Const, ParseSingleExample/Const_2, ParseSingleExample/Const_2, ParseSingleExample/Const, ParseSingleExample/Const, ParseSingleExample/Const_6, ParseSingleExample/Const_2)]]
[[node IteratorGetNext (defined at /home/shubham/FloorNet/evaluate.py:62) = IteratorGetNextoutput_shapes=[[?,2], [?], [?,50000], [?,50000,7], [?,300,3], [?,256,256], [?], [?,256,256]], output_types=[DT_INT64, DT_STRING, DT_INT32, DT_FLOAT, DT_INT32, DT_INT32, DT_INT64, DT_INT32], _device="/job:localhost/replica:0/task:0/device:CPU:0"]]
Backend TkAgg is interactive backend. Turning interactive mode on.

```
I am not able to understand what the exact problem is.
Could you point out what exactly is going wrong, or give any pointers to resolve it?
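
One thing worth checking: the parse spec in the trace expects the points feature to be exactly 300000 floats (50000 points x 6 channels), so a record written with numChannels = 3 would contain only 150000 values and could trigger exactly this "Can't parse serialized Example" error. A small sketch (TF 1.x; the tfrecords path is a placeholder) to see what the record actually contains:

```python
import tensorflow as tf  # TF 1.x

# Print the length of the 'points' feature in each record; the parser
# in the trace above expects exactly 300000 floats per example.
for record in tf.python_io.tf_record_iterator('/path/to/custom.tfrecords'):
    example = tf.train.Example.FromString(record)
    points = example.features.feature['points'].float_list.value
    print(len(points))  # 150000 would mean only 3 channels per point were written
```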

Thanks

@shubhamwagh (Author)

I am still not able to run prediction on my custom point cloud data. Detailed insight on this would be greatly appreciated.

@KirillHiddleston commented Apr 8, 2021

(Quotes @shubhamwagh's earlier comment and error log in full.)
Can you share your RecordWriterCustom.py?

@marcomiglionico94

Was anyone able to solve this? I am getting the exact same problem.
