Replies: 1 comment
-
I personally like that it's explicit, because the pole of inaccessibility is usually calculated for visual purposes, and we can define the maximum error we can tolerate depending on our visual requirements (e.g. we know it's OK if it's within a few pixels of the right point). As for errors on degenerate polygons, we should find a way to fix this independently of how precision is set.
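For instance, if you know the current rendering scale, you can derive the tolerance directly from a pixel budget. A rough sketch (`units_per_pixel` and the two-pixel budget are made-up example values, and shapely's `polylabel` stands in for whichever implementation you actually call):

```python
from shapely.geometry import Polygon
from shapely.ops import polylabel

def label_point_for_display(polygon: Polygon, units_per_pixel: float,
                            pixel_tolerance: float = 2.0):
    """Place the label within `pixel_tolerance` pixels of the true
    pole of inaccessibility at the current rendering scale."""
    # Convert the on-screen error budget into coordinate units.
    tolerance = pixel_tolerance * units_per_pixel
    point = polylabel(polygon, tolerance=tolerance)
    return point.x, point.y

# e.g. a square 100 map units across, rendered at 0.5 map units per pixel
square = Polygon([(0, 0), (100, 0), (100, 100), (0, 100)])
print(label_point_for_display(square, units_per_pixel=0.5))
```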
-
See #51 for a potential pitfall in handling very small or degenerate polygons.
In my code I set the precision to one thousandth of the shortest side of the polygon's bounding box. I dislike hard-coding a fixed absolute precision, since that effectively dictates a particular "reasonable" domain of coordinate values, so this relative measurement seemed like a good compromise. However, it still lets a small polygon tank the algorithm: a near-zero bounding-box dimension drives the precision toward zero. I think I'll add a special case to my code that immediately returns the bounding box centroid if either dimension is below some threshold (the right threshold depends on whether your implementation targets general polygons or polygons in a known domain); see the sketch below.
Thoughts? Has anyone else come up with a different solution for automatically determining an appropriate precision?
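Roughly what I have in mind, with shapely's `polylabel` standing in for the underlying implementation (the `MIN_DIMENSION` cutoff is a placeholder; the 1/1000 factor is just the value I settled on):

```python
from shapely.geometry import Polygon
from shapely.ops import polylabel

# Placeholder cutoff below which a polygon is treated as degenerate;
# tune it to your coordinate domain (pixels, metres, degrees, ...).
MIN_DIMENSION = 1e-9

def auto_polylabel(polygon: Polygon):
    minx, miny, maxx, maxy = polygon.bounds
    width, height = maxx - minx, maxy - miny

    # Special case: a near-degenerate polygon would drive the precision
    # toward zero, so return the bounding-box centre instead.
    if width < MIN_DIMENSION or height < MIN_DIMENSION:
        return (minx + maxx) / 2.0, (miny + maxy) / 2.0

    # Relative precision: one thousandth of the shortest bounding-box side.
    precision = min(width, height) / 1000.0
    point = polylabel(polygon, tolerance=precision)
    return point.x, point.y
```

The cutoff check also covers the fully degenerate cases (zero-width or zero-height polygons) where the relative rule would otherwise set the precision to zero.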