Apple brought depth to its user interfaces with Force Touch, and Wednesday's launch of the iPhone 6s added another layer in the form of 3D Touch. AppleInsider looks at how it works and what you can do with it.
This blog shares news and information related to gesture user interfaces. If you come across anything interesting, feel free to contribute by sending it to me for publication.
Thursday, September 10, 2015
Force Touch gets redefined in the iPhone 6s with 3D Touch
Tuesday, July 21, 2015
New tech bonds fingerprint sensors under Gorilla Glass, could allow button-free iPhones
New tech bonds fingerprint sensors under Gorilla Glass, could allow button-free iPhones: "Security technology firm Sonavation on Tuesday announced a technology allowing ultrasonic fingerprint sensors to be embedded under Corning Gorilla Glass, potentially paving the way for anticipated iPhone designs without home buttons."
Apple invents natural tap-based gesture input for nudging onscreen objects, selecting text
Apple invents natural tap-based gesture input for nudging onscreen objects, selecting text: "a user is able to move an onscreen object left or right with extreme precision, perhaps nudged a pixel at a time, by lightly tapping on the side of an iPhone. Tap gestures on non-touchscreen portions of a device are picked up by an accelerometer or gyroscope and processed naturally, meaning inputs are represented onscreen in an equal and opposite direction. For example, a light tap on the right side of an iPhone would move an object to the left, while a tap on the left would send the object to the right."
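The nudging behavior the patent describes is easy to picture in code. Below is a toy sketch, not Apple's implementation: it assumes a lateral accelerometer reading where a tap on the right side of the device produces a positive spike and a tap on the left a negative one (the sign convention and the threshold value are illustrative assumptions), and returns the equal-and-opposite one-pixel nudge.

```python
def nudge_from_tap(accel_x, threshold=2.0, step_px=1):
    """Return the horizontal nudge (in pixels) for a lateral tap.

    accel_x: lateral acceleration reading; a tap on the right side
    is assumed to produce a positive spike, a tap on the left a
    negative one.

    The onscreen object moves opposite the tap: a right-side tap
    nudges the object left (negative dx), and vice versa.
    """
    if accel_x > threshold:      # spike from a tap on the right side
        return -step_px          # nudge the object one pixel left
    if accel_x < -threshold:     # spike from a tap on the left side
        return step_px           # nudge the object one pixel right
    return 0                     # below threshold: not treated as a tap


# A right-side tap spike nudges one pixel left; a gentle bump does nothing.
print(nudge_from_tap(3.5))   # -1
print(nudge_from_tap(-3.5))  # 1
print(nudge_from_tap(0.5))   # 0
```

In practice such a filter would run on a stream of accelerometer samples and would need debouncing so one physical tap does not register as several nudges, but the core mapping from tap side to opposite-direction movement is just this sign flip.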
Saturday, May 30, 2015
Project Jacquard
Project Jacquard: "Project Jacquard makes it possible to weave touch and gesture interactivity into any textile using standard, industrial looms. Everyday objects such as clothes and furniture can be transformed into interactive surfaces."
Friday, May 29, 2015
ATAP’s ‘Soli’ Radar-Based Gesture Control Could Be The Perfect Wearable Interface | TechCrunch
ATAP’s ‘Soli’ Radar-Based Gesture Control Could Be The Perfect Wearable Interface | TechCrunch: "Google’s ATAP Project Soli is an attempt to harness the power of hand and finger manipulation to make it easier to interact with ever-smaller devices and screens. It posits that instead of using tools, we should use our “hand motion vocabulary” to control devices, even when the devices aren’t present."
