Looks like this new locking produces a potential deadlock:
WARNING: ThreadSanitizer: lock-order-inversion (potential deadlock) (pid=186275)
  Cycle in lock order graph: M0 (0x7b4400004e40) => M1 (0x7b3400000bf0) => M0

  Mutex M1 acquired here while holding mutex M0 in thread T7:
    #0 pthread_mutex_lock <null> (slmix+0x27a381) (BuildId: f7fd61d965732ab811ec233f56f3cf075d6b0245)
    #1 mtx_lock /re/src/thread/posix.c:180:10 (slmix+0x5ae13a) (BuildId: f7fd61d965732ab811ec233f56f3cf075d6b0245)
    #2 vidmix_thread /re/rem/vidmix/vidmix.c:220:3 (slmix+0x5efded) (BuildId: f7fd61d965732ab811ec233f56f3cf075d6b0245)
    #3 handler /re/src/thread/thread.c:101:9 (slmix+0x4fe098) (BuildId: f7fd61d965732ab811ec233f56f3cf075d6b0245)
    #4 handler /re/src/thread/posix.c:23:27 (slmix+0x5ad8fc) (BuildId: f7fd61d965732ab811ec233f56f3cf075d6b0245)

  Hint: use TSAN_OPTIONS=second_deadlock_stack=1 to get more informative warning message

  Mutex M0 acquired here while holding mutex M1 in main thread:
    #0 pthread_mutex_lock <null> (slmix+0x27a381) (BuildId: f7fd61d965732ab811ec233f56f3cf075d6b0245)
    #1 mtx_lock /re/src/thread/posix.c:180:10 (slmix+0x5ae13a) (BuildId: f7fd61d965732ab811ec233f56f3cf075d6b0245)
    #2 clear_all /re/rem/vidmix/vidmix.c:69:3 (slmix+0x5eeca8) (BuildId: f7fd61d965732ab811ec233f56f3cf075d6b0245)
    #3 vidmix_source_enable /re/rem/vidmix/vidmix.c:512:2 (slmix+0x5ee7b6) (BuildId: f7fd61d965732ab811ec233f56f3cf075d6b0245)
    #4 disp_enable_h /modules/vmix/vmix.c:119:2 (slmix+0x3e259b) (BuildId: f7fd61d965732ab811ec233f56f3cf075d6b0245)
    #5 slmix_disp_enable /src/mix.c:384:9 (slmix+0x2d3a98) (BuildId: f7fd61d965732ab811ec233f56f3cf075d6b0245)
    #6 slmix_session_video /src/sess.c:458:3 (slmix+0x2d7c28) (BuildId: f7fd61d965732ab811ec233f56f3cf075d6b0245)
    #7 http_req_handler /src/http.c:372:3 (slmix+0x2d0435) (BuildId: f7fd61d965732ab811ec233f56f3cf075d6b0245)
    #8 recv_handler /re/src/http/server.c:281:4 (slmix+0x45a1c6) (BuildId: f7fd61d965732ab811ec233f56f3cf075d6b0245)
    #9 tcp_recv_handler /re/src/tcp/tcp.c:442:3 (slmix+0x4f394e) (BuildId: f7fd61d965732ab811ec233f56f3cf075d6b0245)
    #10 fd_poll /re/src/main/main.c:802:4 (slmix+0x483439) (BuildId: f7fd61d965732ab811ec233f56f3cf075d6b0245)
    #11 re_main /re/src/main/main.c:976:9 (slmix+0x4802ba) (BuildId: f7fd61d965732ab811ec233f56f3cf075d6b0245)
    #12 main /src/main.c:193:2 (slmix+0x2cca00) (BuildId: f7fd61d965732ab811ec233f56f3cf075d6b0245)
Should we revert this commit?
Originally posted by @sreimers in #830 (comment)
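For reference, the report says that one path (`vidmix_thread`) takes M0 and then M1, while the other path (`vidmix_source_enable` -> `clear_all`, driven from the HTTP handler on the main thread) takes M1 and then M0. Below is a minimal, self-contained sketch of that pattern using hypothetical names and plain C11 mutexes, not the actual vidmix code; TSAN flags it with the same lock-order-inversion warning:

```c
/* Sketch of the reported inversion: one thread takes M0 then M1,
 * the other takes M1 then M0. Hypothetical stand-in for the vidmix
 * paths, built with e.g.: gcc -fsanitize=thread -g demo.c -o demo */
#include <stdio.h>
#include <threads.h>

static mtx_t m0; /* stands in for mutex M0 from the report */
static mtx_t m1; /* stands in for mutex M1 from the report */

/* Mixer-thread-like path: M0 -> M1 (as in vidmix_thread) */
static int worker(void *arg)
{
	(void)arg;

	mtx_lock(&m0);
	mtx_lock(&m1);
	puts("worker: holding M0 then M1");
	mtx_unlock(&m1);
	mtx_unlock(&m0);

	return 0;
}

int main(void)
{
	thrd_t th;

	mtx_init(&m0, mtx_plain);
	mtx_init(&m1, mtx_plain);

	thrd_create(&th, worker, NULL);

	/* Main-thread-like path: M1 -> M0 (as in vidmix_source_enable
	 * calling clear_all while a source mutex is held) */
	mtx_lock(&m1);
	mtx_lock(&m0);
	puts("main: holding M1 then M0");
	mtx_unlock(&m0);
	mtx_unlock(&m1);

	thrd_join(th, NULL);

	mtx_destroy(&m0);
	mtx_destroy(&m1);

	return 0;
}
```

Either reverting the locking commit or removing the clear_all() call removes one of the two acquisition orders and therefore breaks the cycle.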
Another approach would be this PR, which removes clear_all(): #861
Version 3.3.0 is planned for Wednesday, 5 July. Should we fix this before the release?
I have no strong opinion here. Please revert the commit if that is the best option, or remove clear_all().