Meta has released its Interaction SDK Experimental as part of the v37 release this week. You can see the product overview on the, erm… Oculus developer website. (Maybe the Meta memo hasn't reached that department yet.)
The SDK lets developers build hand and controller interactions into their apps through modular, composable interaction components. That modularity is where the flexibility comes from: developers can pull just the pieces they need into an existing architecture, or use the SDK on its own. (A sketch of what that composition might look like appears at the end of this section.)
Some of the capabilities within Interaction SDK Experimental include:
- Grab, Resize & Throw
- Hand Grab
- Pose Detection
- Direct Touch
- Targeting and Selection
The SDK has been developed with feedback from developers such as Odders Lab, ForeVR Games and Miru Studio.
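To make the composability claim a little more concrete, here is a minimal sketch of what a component-based interaction setup can look like. To be clear, every name in it (Grabbable, Throwable, InteractableObject) is a hypothetical stand-in for illustration, not Meta's actual API, and the SDK's real Unity components will differ.

```typescript
// Hypothetical sketch of modular, composable interaction components.
// Names are illustrative assumptions, NOT the Interaction SDK's API.

interface Pose {
  position: [number, number, number];
  rotation: [number, number, number, number]; // quaternion
}

// Each capability is a small, self-contained module.
interface InteractionComponent {
  update(handPose: Pose, deltaSeconds: number): void;
}

class Grabbable implements InteractionComponent {
  held = false;
  update(handPose: Pose): void {
    // Follow the hand while held (grab/release detection elided).
    if (this.held) {
      console.log(`following hand at ${handPose.position}`);
    }
  }
}

class Throwable implements InteractionComponent {
  private lastPosition: [number, number, number] | null = null;
  velocity: [number, number, number] = [0, 0, 0];
  update(handPose: Pose, deltaSeconds: number): void {
    // Estimate a release velocity from successive hand poses.
    if (this.lastPosition) {
      this.velocity = handPose.position.map(
        (p, i) => (p - this.lastPosition![i]) / deltaSeconds,
      ) as [number, number, number];
    }
    this.lastPosition = [...handPose.position];
  }
}

// An object composes only the capabilities it needs.
class InteractableObject {
  constructor(private components: InteractionComponent[]) {}
  update(handPose: Pose, deltaSeconds: number): void {
    for (const c of this.components) c.update(handPose, deltaSeconds);
  }
}

// A ball that can be grabbed and thrown, with no pose detection attached:
const ball = new InteractableObject([new Grabbable(), new Throwable()]);
ball.update({ position: [0, 1, 0.5], rotation: [0, 0, 0, 1] }, 1 / 72);
```

The point of the pattern is the one the announcement makes: an app that only needs grabbing doesn't carry the pose-detection machinery, and each piece can be dropped into an existing architecture independently.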
Tracked keyboard
Also released this week for the development community is the Tracked Keyboard SDK, which lets users bring a physical keyboard with them into VR. Computer vision locates the keyboard and renders a model of it (the supported models are the Logitech K830 and the Apple Magic Keyboard), Bluetooth optionally carries the key input, and a combination of hand tracking and Passthrough renders the user's hands over the keys.
The SDK allows developers to integrate this capability into Unity and Native apps.
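For a sense of how those three inputs fit together each frame, here is a minimal sketch of the flow described above. All of the function names and types are assumptions made for illustration only; the real SDK exposes its own Unity and Native interfaces.

```typescript
// Hypothetical sketch of the three data flows the Tracked Keyboard
// feature combines. Names are illustrative, NOT the SDK's actual API.

type Vec3 = [number, number, number];

interface KeyboardPose {
  position: Vec3;
  visible: boolean;
}

// 1. Computer vision locates the physical keyboard so a model of it
//    can be rendered (stand-in values; a real tracker consumes camera frames).
function locateKeyboard(): KeyboardPose {
  return { position: [0.0, 0.72, -0.35], visible: true };
}

// 2. Bluetooth (optional) delivers the actual key input.
function onKeyInput(key: string): void {
  console.log(`typed: ${key}`);
}

// 3. Hand tracking plus Passthrough show the user's real hands over the keys.
function renderHandsPassthrough(keyboard: KeyboardPose): void {
  if (keyboard.visible) {
    console.log(`compositing passthrough hands near ${keyboard.position}`);
  }
}

// Per-frame flow: track the keyboard, composite the hands, drain input.
function frame(pendingKeys: string[]): void {
  const pose = locateKeyboard();
  renderHandsPassthrough(pose);
  for (const key of pendingKeys) onKeyInput(key);
}

frame(["h", "i"]);
```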