
Why comparing the distance of one point in the world with the depth of one pixel in the image will not give a reliable result #10

hailuu684 opened this issue May 30, 2022 · 1 comment


@hailuu684

Hello Adib,
This is not an issue; I just wanted to ask you about the occlusion filter, but I could not find your email address, so I am asking here. Sorry for that.
I would like to ask how bad it is when we compare the actual distance to only one pixel in the image, and how comparing the actual distance with all pixels within the bounding box would help solve the problem?
Thank you so much, and I hope to hear from you soon.
Best regards,
Luu Tung Hai

@MukhlasAdib
Owner

Hello @hailuu684 ,

I am really sorry for the very late response; there was no special reason for it. I found that if we use just a single point (the bbox midpoint), the result is very noisy. In particular, I found at the time that the cars' windows were not captured properly in the depth map. As you can see below, the window areas are not measured with the same depth pattern as the car body.

[Image: depth map visualization in which the cars' window areas show a different depth pattern from the car bodies]

So it is very possible that the bbox midpoint alone is not representative enough to be used for the comparison. To solve that, I decided to aggregate the depths of the pixels surrounding the midpoint. This way, we can use a thresholding mechanism so that the noise from the windows is taken into account.
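For illustration, here is a minimal sketch of that aggregation idea in Python. The function name, patch size, and thresholds are assumptions made for this example, not the repository's actual API: the depths in a small patch around the bbox midpoint are compared against the object's true distance, and the object is declared occluded only when fewer than a chosen fraction of the patch pixels agree with that distance.

```python
import numpy as np

def is_occluded(depth_map, bbox_center, actual_distance,
                patch_size=5, visible_ratio=0.3, tolerance=1.0):
    """Decide occlusion by aggregating depth pixels around the bbox midpoint.

    Hypothetical sketch; names and thresholds are illustrative only.
    depth_map       : (H, W) array of metric depths from the depth camera
    bbox_center     : (u, v) pixel coordinates of the bbox midpoint
    actual_distance : true distance to the object, in the same units
    """
    u, v = bbox_center
    half = patch_size // 2
    h, w = depth_map.shape
    # Take a small patch around the midpoint, clipped to the image bounds.
    patch = depth_map[max(v - half, 0):min(v + half + 1, h),
                      max(u - half, 0):min(u + half + 1, w)]
    # A pixel "sees" the object if its measured depth is not much smaller
    # than the object's true distance (i.e., nothing sits in front of it).
    visible = patch >= (actual_distance - tolerance)
    # Thresholding step: the object counts as occluded only when fewer than
    # `visible_ratio` of the patch pixels agree with the true distance, so
    # a few noisy window pixels cannot flip the decision by themselves.
    return float(np.mean(visible)) < visible_ratio
```

With this kind of voting, a handful of noisy readings inside the patch (such as the window areas above) cannot dominate the result, which is the point of aggregating instead of trusting the single midpoint pixel.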

I hope my answer is clear enough.

Best regards,
Adib
