Update gestures.md to reflect current state of gesture recognition algorithm (#75)
mvanderkamp authored Jun 23, 2023
1 parent c7e2ae6 commit 76eec68
Showing 1 changed file (gestures.md) with 23 additions and 39 deletions.
# Per-device
```mermaid
sequenceDiagram
participant cg as ClientGestureLibrary
participant ci as Interactor
participant cc as ClientController
participant sc as ServerController
participant mh as MessageHandler
participant sv as ServerView
participant svg as ServerViewGroup
participant item
cg ->> ci : recognize gesture
ci ->> cc : call handler received from controller
cc ->> sc : emit message with gesture data
sc ->> mh : handle the gesture
mh ->> sv : transform x,y from view coordinates to workspace coordinates
sv ->> mh : (x, y) point
mh ->> item : emit gesture event
note over sc, item: item is selected using the Track gesture,<br>first point down finds an item to lock, or the view
```
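
The view-to-workspace step in the diagram above can be sketched in JavaScript. This is an illustrative sketch only: the `View` class, its constructor options, and `transformPoint` are assumptions for the example, not WAMS's actual `ServerView` API.

```javascript
// Illustrative sketch, not the actual WAMS ServerView API.
// A view is positioned in the workspace by a translation (x, y), a
// rotation, and a scale. Mapping a pointer event from view coordinates
// to workspace coordinates undoes the scale, then the rotation, then
// adds the view's position.
class View {
  constructor({ x = 0, y = 0, rotation = 0, scale = 1 } = {}) {
    this.x = x;
    this.y = y;
    this.rotation = rotation; // radians
    this.scale = scale;
  }

  // View coordinates -> workspace coordinates.
  transformPoint(px, py) {
    const cos = Math.cos(this.rotation);
    const sin = Math.sin(this.rotation);
    const sx = px / this.scale; // undo zoom
    const sy = py / this.scale;
    return {
      x: sx * cos - sy * sin + this.x, // undo rotation, then translate
      y: sx * sin + sy * cos + this.y,
    };
  }
}

// A view panned to workspace position (100, 50), unrotated, zoomed in 2x:
const view = new View({ x: 100, y: 50, scale: 2 });
console.log(view.transformPoint(40, 20)); // { x: 120, y: 60 }
```

The exact order and direction of these operations depend on how the real implementation defines its view transform; the point is that the (x, y) handed to the MessageHandler is re-expressed in workspace coordinates before the gesture event reaches the item.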

# Multi-device
```mermaid
sequenceDiagram
participant cc as ClientController
participant sc as ServerController
participant dv as Device
participant gc as GestureController
participant sg as ServerGestureLibrary
participant mh as MessageHandler
participant sv as ServerView
participant svg as ServerViewGroup
participant item
participant ws as WorkSpace
actor user
cc ->> sc : transmit pointer event with (clientX, clientY)
sc ->> sv : transform (clientX, clientY) to view coordinates
sv ->> sc : (viewX, viewY)
sc ->> user : emit pointer event with (viewX, viewY)
sc ->> dv : transform (clientX, clientY) to device coordinates
dv ->> sc : (deviceX, deviceY)
sc ->> gc : process pointer event
gc ->> mh : recognize gesture with (deviceCentroidX, deviceCentroidY)
mh ->> dv : transform (deviceCentroidX, deviceCentroidY) back to client coordinates
dv ->> mh : (clientCentroidX, clientCentroidY)
mh ->> sv : transform (clientCentroidX, clientCentroidY) to workspace coordinates
sv ->> mh : (viewCentroidX, viewCentroidY)
mh ->> user : emit gesture event with (viewCentroidX, viewCentroidY)
alt if event is pointerdown and the view has no locked item
sc ->> ws : obtain item lock, possibly on the view group
ws ->> sv : set locked item
end
alt if event is pointerup and there are no inputs remaining in the device group
sc ->> sv : release locked item
end
note over sc, item: item is selected using the Track gesture,<br>first point down finds an item to lock, or the view
```

This double transformation is actually correct. The device transformation moves the pointer event to where the device is physically located relative to the other devices, that is: where in the server view group the event takes place. Further transforming from where in the view group the event takes place to where in the workspace it takes place is therefore logical. However, if we are to use this same process for single-device gestures, we need to provide a unique ServerViewGroup for each ServerView when in single-device mode.
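
The double transformation can be sketched as the composition of two point mappings: one that places a device's client coordinates within the server view group (i.e. where the device physically sits relative to the other devices), and one that places group coordinates within the workspace. The helper and the numbers here are hypothetical, and rotation is omitted for brevity.

```javascript
// Hypothetical sketch of the double transformation, not the actual
// WAMS implementation. Each transform is a translation plus a zoom.
function makeTransform({ x = 0, y = 0, scale = 1 }) {
  return (px, py) => ({ x: px / scale + x, y: py / scale + y });
}

// Two phones side by side, each 400px wide, forming one view group.
// Step 1: the device transform positions each phone within the group.
const leftDevice = makeTransform({ x: 0, y: 0 });
const rightDevice = makeTransform({ x: 400, y: 0 });

// Step 2: the group transform positions the whole group in the
// workspace; here it has been panned to (1000, 500).
const group = makeTransform({ x: 1000, y: 500 });

// A touch at client (10, 10) on the right phone:
const inGroup = rightDevice(10, 10);             // { x: 410, y: 10 }
const inWorkspace = group(inGroup.x, inGroup.y); // { x: 1410, y: 510 }
```

This is why single-device mode would need its own ServerViewGroup per ServerView: the first mapping must exist, even if it is the identity, for the second mapping to compose with it.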
