This blog shares news and information related to gesture user interfaces. If you find any interesting news, feel free to contribute by sending it to me for publication.
Thursday, October 31, 2013
Amazon planning 2014 smartphone with advanced 3D gesture & eye tracking input - source
AppleInsider reports:
"Amazon's rumored flagship smartphone is said to include a total of six compact camera modules. Four of them are reportedly VGA cameras, located at the four corners of the device, that will allow for 3D gesture and eye tracking."
Friday, October 25, 2013
Monday, October 21, 2013
WHYRemote
WHYRemote is another company working with gestures. They claim to do a lot of things:
"Mind Control" sounds especially ominous. We should keep an eye on it to see how their product develops. It seems to be in early stages now.
WHYRemote© is a Television Remote using: Voice Control, Hand Gesture Control, Eyes Sight Tracking Control, and Mind Control technologies.
WHYRemote© is based on electroencephalography, embedded voice control technology, and motion detection algorithms with exclusive fingertip tracking capabilities.
The goal is combining Hand Gesture Recognition software and Voice Control technologies with a unique, sophisticated hand shape and motion detection algorithm that works together with a standard 3D camera, microcontrollers, and voice navigator to obtain a television remote that is completely controlled by the user.
With WHYRemote’s innovative and patent pending technology, the world is provided with a diverse hand gesture and voice control solution for a variety of platforms and applications.
"Mind Control" sounds especially ominous. We should keep an eye on it to see how their product develops. It seems to be in early stages now.
Sunday, October 13, 2013
Google applies for patent on gesture-based car controls
Engadget reports that Google has applied for a patent on gesture-based car controls.
I happen to know that researchers at CMU were experimenting with something similar. I wonder if it might constitute prior art.
Friday, October 11, 2013
Ultrasound chip offers gesture control for mobiles
A bit of news from an unusual (for this blog) source: the BBC.
"Ultrasound technology that enables mobiles and tablets to be controlled by gesture could go into production as early as next year. Norwegian start-up Elliptic Labs is in talks with Asian handset manufacturers to get the chip embedded in devices."
Visit the Elliptic Labs website for more info on their technology.
Wednesday, October 9, 2013
Cuttable, Foldable Sensors Can Add Multi-Touch To Any Device
TechCrunch reports, citing the researchers' paper:
"Researchers at the MIT Media Lab and the Max Planck Institutes have created a foldable, cuttable multi-touch sensor that works no matter how you cut it, allowing multi-touch input on nearly any surface."
Tuesday, October 8, 2013
Agawi TouchMark contrasts iPad's fast screen response to laggy Android tablets
AppleInsider reports:
"Cross platform mobile ad vendor Agawi has released test results from its TouchMarks study of various tablets, including Microsoft's Surface, Amazon's Kindle and Amazon tablets by Nvidia, Samsung and Google's Nexus-branded tablet built by Asus."
GreenTouch: data capture, curation, and analysis via a multi-touch tabletop interface
The Research at Google blog is today featuring one of the recipients of the Google App Engine Education Award: GreenTouch. It is interesting to see people working on touch interfaces in education.
Nest Protect features gesture control
Nest just announced their newest product: Nest Protect. One interesting tidbit caught my attention:
"No more frantically swinging towels at the smoke alarm to quiet it down. If there’s a nuisance alarm, just stand under Nest Protect and wave your arm to hush the alert. As you wave, your hand should be 2-8 feet away from the alarm."
A more detailed description of the Nest Wave feature can be found here.
The feature is trivial, but it shows how gesture interfaces are becoming more and more widely used in consumer products. It is also an elegant solution for situations where the user needs to control a device that is out of reach.
Thursday, October 3, 2013
Whole-Home Gesture Recognition Using Wireless Signals
"Whole-Home Gesture Recognition Using Wireless Signals" paper which won MobiCom 2013 best paper award discusses futuristic gesture-based interaction where user gestures are sensed through WiFi signals.
Apple investigating advanced velocity-sensitive touch input methods
AppleInsider reports on a new Apple patent application they have spotted:
Apple's touchscreens can measure not only where you tap, but also how hard you tap — and that velocity sensing functionality may become even more advanced in the future, a new patent application reveals. The filing, entitled "System and Method for Enhancing Touch Input," describes a processing algorithm that would estimate the velocity of a touch input. The sensors within an iOS device or otherwise allow the system to determine velocity, even though the screen may not be pressure sensitive.
The patent abstract provides more technical detail:
Disclosed herein are systems, methods, and non-transitory computer-readable storage media for processing user input. A system configured to practice the method first receives, via a touch screen of a computing device, input from a user. Then the system fetches data associated with the input from at least two sensors other than the touch screen and adjusts an input processing algorithm based on the input and the data to yield an adjusted input processing algorithm. Then the system can process the input according to the adjusted input processing algorithm. The adjusted input processing algorithm can estimate a velocity of the input and/or filter out invalid inputs. The other sensors besides the touch screen can be an accelerometer, a gyroscope, a microphone, a Hall Effect sensor, a compass, an ambient light sensor, a proximity sensor, a camera, and/or a positioning system. The data can relate to the input based on a temporal relationship.
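To make the abstract a bit more concrete, here is a minimal sketch of the idea as I read it: pair a touch event with accelerometer and gyroscope samples that are temporally close to it, estimate the tap "velocity" from the size of the sensor spikes, and reject touches that have no corresponding physical impulse. All names, weights, and thresholds below are hypothetical; this is my own illustration, not Apple's implementation.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class TouchEvent:
    x: float
    y: float
    timestamp: float      # seconds

@dataclass
class SensorSample:
    magnitude: float      # e.g. accelerometer or gyroscope reading magnitude
    timestamp: float      # seconds

def samples_near(samples: List[SensorSample], t: float, window: float = 0.05) -> List[SensorSample]:
    """Keep only samples within +/- `window` seconds of the touch (the 'temporal relationship')."""
    return [s for s in samples if abs(s.timestamp - t) <= window]

def process_touch(touch: TouchEvent,
                  accel: List[SensorSample],
                  gyro: List[SensorSample]) -> Tuple[float, bool]:
    """Estimate tap velocity from two non-touchscreen sensors and flag invalid inputs."""
    a_peak = max((s.magnitude for s in samples_near(accel, touch.timestamp)), default=0.0)
    g_peak = max((s.magnitude for s in samples_near(gyro, touch.timestamp)), default=0.0)

    # A harder tap produces a larger accelerometer spike and a small rotational jolt,
    # so blend the two peaks into a single (made-up) velocity estimate.
    velocity = 0.8 * a_peak + 0.2 * g_peak

    # If neither sensor registered anything around the touch, treat it as invalid,
    # e.g. moisture on the screen or electrical noise rather than a real finger.
    valid = a_peak > 0.05 or g_peak > 0.05
    return velocity, valid

# Example: a tap at t = 1.0 s with a matching accelerometer spike.
touch = TouchEvent(x=120.0, y=340.0, timestamp=1.000)
accel = [SensorSample(0.02, 0.90), SensorSample(0.31, 1.002), SensorSample(0.04, 1.10)]
gyro = [SensorSample(0.01, 0.95), SensorSample(0.12, 1.001)]
velocity, valid = process_touch(touch, accel, gyro)
print(f"estimated tap velocity: {velocity:.2f}, valid touch: {valid}")
```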
Wednesday, October 2, 2013
Google Acquires YC-Backed Flutter, A Gesture Recognition Technology Startup, For Around $40M
TechCrunch reports that Google has acquired YC-backed Flutter, a gesture recognition technology startup, for around $40M. Flutter works on gesture recognition using a webcam.