Hello everyone, I am trying to build a solution that streams images directly via websocket to a separate application, thinking it will be more efficient than reading saved images for real-time use. Here is some context and what I tried.
I want to build an external interface for real-time webcam transformations using comfyui workflows.
The standard way of doing it seems to be:
1. Queue a frame.
2. Wait until processing is done, detecting completion by listening to the websocket messages.
3. Read the result images from the save-image path where the workflow's outputs are written.
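For reference, the standard flow above can be sketched roughly as follows (a minimal sketch based on ComfyUI's websockets_api_example.py; the server address, the `/prompt` endpoint, and the "executing" message shape are assumptions from that example, and `ws` stands in for a websocket-client connection):

```python
import json
import uuid
import urllib.request

SERVER = "127.0.0.1:8188"      # assumption: default ComfyUI address
CLIENT_ID = str(uuid.uuid4())  # identifies this client in websocket messages

def queue_prompt(prompt):
    """POST the workflow JSON to /prompt and return the prompt_id."""
    data = json.dumps({"prompt": prompt, "client_id": CLIENT_ID}).encode("utf-8")
    req = urllib.request.Request(f"http://{SERVER}/prompt", data=data)
    return json.loads(urllib.request.urlopen(req).read())["prompt_id"]

def is_workflow_done(message, prompt_id):
    """Return True if a websocket text message signals our workflow finished.

    In the example script, ComfyUI sends an "executing" message whose
    data["node"] is None once the whole prompt has run to completion.
    """
    msg = json.loads(message)
    if msg.get("type") != "executing":
        return False
    data = msg["data"]
    return data["node"] is None and data.get("prompt_id") == prompt_id

def wait_for_completion(ws, prompt_id):
    """Block on ws.recv() until the workflow with prompt_id is done."""
    while True:
        out = ws.recv()
        # text frames are JSON status messages; images would arrive as bytes
        if isinstance(out, str) and is_workflow_done(out, prompt_id):
            return  # results can now be read from the save-image path
```

After `wait_for_completion` returns, the outputs are fetched from disk (or via the server's history endpoint), which is exactly the round-trip the question is trying to avoid.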
Now I would like to receive the resulting images directly via websocket and test whether it is more efficient.
In main.py, I see that this should be possible, since there is code that sends images via websocket: server.send_sync(BinaryEventTypes.UNENCODED_PREVIEW_IMAGE, preview_image, server.client_id)
When queuing images in a loop with websockets_api_example.py, I printed every output received over the websocket, and when images are successfully generated I only ever see strings, never binary data.
Do I need to add a server.send_sync(BinaryEventTypes.PREVIEW_IMAGE, preview_image, server.client_id) somewhere so that I can read the image with out = websocket.recv() on the receiving end?
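If binary frames do start arriving, the receiving side needs to distinguish them from the JSON status strings and strip the event header. The sketch below assumes the framing used in ComfyUI's server code for preview images: a 4-byte big-endian event type, then for preview-image events a 4-byte image-format code (1 = JPEG, 2 = PNG) before the raw image bytes. Treat the exact header layout and event numbers as assumptions to verify against main.py and server.py in your ComfyUI version:

```python
import json
import struct

# Assumption from ComfyUI's server code: event type 1 is a preview image.
PREVIEW_IMAGE_EVENT = 1

def parse_binary_frame(frame: bytes):
    """Split a binary websocket frame into (event_type, payload)."""
    event_type = struct.unpack(">I", frame[:4])[0]
    return event_type, frame[4:]

def extract_preview_image(payload: bytes):
    """For a preview-image payload, return (format_code, image_bytes)."""
    fmt = struct.unpack(">I", payload[:4])[0]  # assumed: 1 = JPEG, 2 = PNG
    return fmt, payload[4:]

def handle_message(out):
    """Dispatch one websocket.recv() result.

    websocket.recv() yields str for JSON status messages and bytes for
    binary events, so isinstance() is enough to tell them apart.
    """
    if isinstance(out, bytes):
        event_type, payload = parse_binary_frame(out)
        if event_type == PREVIEW_IMAGE_EVENT:
            fmt, image_bytes = extract_preview_image(payload)
            return ("image", fmt, image_bytes)
        return ("binary", event_type, payload)
    return ("json", json.loads(out))
```

With a dispatcher like this in the receive loop, the image bytes can be decoded in memory instead of being read back from the save-image path.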