Updated the previous spatial interaction design experiment with hand tracking.
It felt very satisfying when a UI panel appeared on the hand that triggered it. It looked like casting a spell or wearing gravity gloves.
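A minimal sketch of how the hand-anchored panel could work, assuming the tracker exposes a palm position and palm normal per frame. All names here are hypothetical, not from any specific XR framework: the panel is simply offset along the palm normal so it hovers above whichever hand triggered it.

```python
import math

def panel_anchor(palm_pos, palm_normal, offset=0.12):
    """Hypothetical helper: place a panel `offset` metres along the
    palm normal, so it appears to hover above the triggering hand."""
    length = math.sqrt(sum(c * c for c in palm_normal))
    unit = tuple(c / length for c in palm_normal)  # normalise the palm normal
    return tuple(p + offset * u for p, u in zip(palm_pos, unit))
```

Re-evaluating this every frame keeps the panel glued to the hand, which is what produces the "spell-casting" effect.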
You could store windows on both hands. That would increase the number of panels available and allow left/right-hand coding to group windows; for example, a right-handed user could keep frequently used UI panels on the left hand.
Another idea popped up while testing the prototype: using gestures such as grab or pinch to minimise windows. If the behaviour is consistent across all apps, or implemented at the OS level, there is no need for a dedicated minimise button (much like on mobile), which would make the UI cleaner.
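The pinch-to-minimise idea could be detected from tracked fingertip positions. Below is a sketch under assumed inputs (thumb-tip and index-tip positions in metres; the names and thresholds are illustrative, not from any real hand-tracking API). It uses hysteresis, with separate start and end distances, so the gesture does not flicker on and off near a single threshold.

```python
import math

# Illustrative thresholds, in metres (assumptions, not measured values):
PINCH_START = 0.015  # fingers closer than this begin a pinch
PINCH_END = 0.03     # fingers further apart than this end a pinch

def _distance(a, b):
    """Euclidean distance between two 3D points given as (x, y, z)."""
    return math.sqrt(sum((ax - bx) ** 2 for ax, bx in zip(a, b)))

class PinchDetector:
    """Detects a pinch gesture from per-frame joint positions,
    with hysteresis between the start and end thresholds."""

    def __init__(self):
        self.pinching = False

    def update(self, thumb_tip, index_tip):
        """Feed the latest joint positions; returns True while pinching."""
        d = _distance(thumb_tip, index_tip)
        if self.pinching:
            if d > PINCH_END:
                self.pinching = False
        elif d < PINCH_START:
            self.pinching = True
        return self.pinching
```

An OS-level gesture layer could run a detector like this per hand and route a completed pinch to "minimise the window under that hand", keeping per-app UI free of extra buttons.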