Casually thinking about the Three-Body Problem as it relates to gesture playback: for a pair of touches t0, t1, infer the centroid, determine the angle, toggle the appropriate Shift and Option sequences, move to t0 (mirroring to t1 through the centroid at t0’s angle), and then repeat.
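
The geometry of that mirroring step is just a point reflection. A minimal sketch (the (x, y) tuple representation and the playback step are my own framing, not a library API):

```python
import math

def centroid(t0, t1):
    """Midpoint of two touches, each an (x, y) tuple."""
    return ((t0[0] + t1[0]) / 2, (t0[1] + t1[1]) / 2)

def angle(p, c):
    """Angle of touch p about centroid c, in radians."""
    return math.atan2(p[1] - c[1], p[0] - c[0])

def mirror(p, c):
    """Point-reflect p through c: where the second touch lands
    when the first is dragged with Option held in the Simulator."""
    return (2 * c[0] - p[0], 2 * c[1] - p[1])

# One playback step: fix the centroid, move the cursor to t0,
# and the mirrored touch falls on t1 automatically.
t0, t1 = (100.0, 200.0), (300.0, 400.0)
c = centroid(t0, t1)
assert mirror(t0, c) == t1
```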

#MultitouchPlayback

Also yes, I really did mean “to” the iOS simulator. The only way I’ve found to actually emulate real gesture transitions for complex behavior is scripting CGEvents in macOS to control the simulator. More gesture support than Meta’s reverse-engineered IndigoHID, more than #XCUITest. If you’ve found a better way, please let me know!
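
For a taste of what “scripting CGEvents” means, here’s a hedged sketch driving a mouse drag through pyobjc’s Quartz bindings. The injectable `post` hook is my own design choice (it keeps the macOS-only import lazy and the path logic testable); the drag path and timing are made up:

```python
import time

def _cg_poster():
    """Build a poster that emits real CGEvents (macOS only; needs pyobjc)."""
    import Quartz
    types = {"down": Quartz.kCGEventLeftMouseDown,
             "dragged": Quartz.kCGEventLeftMouseDragged,
             "up": Quartz.kCGEventLeftMouseUp}
    def post(phase, x, y):
        e = Quartz.CGEventCreateMouseEvent(
            None, types[phase], (x, y), Quartz.kCGMouseButtonLeft)
        Quartz.CGEventPost(Quartz.kCGHIDEventTap, e)
    return post

def drag(points, post=None, delay=0.01):
    """Synthesize a mouse drag along `points` ((x, y) tuples).
    `post(phase, x, y)` defaults to posting real CGEvents; pass
    your own callable to dry-run off-macOS."""
    post = post or _cg_poster()
    post("down", *points[0])
    for x, y in points[1:]:
        time.sleep(delay)
        post("dragged", x, y)
    post("up", *points[-1])
```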

#MultitouchPlayback

Also, thanks to @geerlingguy’s tireless efforts illustrating how to administer macOS with Ansible, the whole thing is automated and configurable via a playbook! I can’t wait to release these:
- multitouch-handler: simultaneous pan/zoom/rotate with multi-finger upgrade/downgrade
- multitouch-recorder: drop-in library for recording JSON representations of touch events
- multitouch-playback: play back a JSON recording
- multitouch-vm: ansible x tart
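
Ahead of the release, the playback half looks roughly like this (a sketch only; the JSON schema shown is an assumption, not multitouch-recorder’s actual format, and `post`/`sleep` are hypothetical hooks):

```python
import json, time

def play(recording_json, post, sleep=time.sleep):
    """Replay a JSON touch recording through `post(phase, x, y)`.
    Assumed schema: a list of {"t": seconds, "phase": ..., "x": ..., "y": ...}
    with "t" relative to the start of the recording."""
    prev = 0.0
    for ev in json.loads(recording_json):
        sleep(max(0.0, ev["t"] - prev))  # honor the original timing
        prev = ev["t"]
        post(ev["phase"], ev["x"], ev["y"])
```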

Release date: TBD

#MultitouchPlayback

From my recent work on advanced gesture automation: if you’re using CGEvents etc. to emulate touches to the iOS simulator, your screen has to be on and unlocked… unless you’re using something like Apple Virtualization!

#MultitouchPlayback