Tuesday, December 24, 2013

Apple patents accurate touch and hover panel, embedded heart rate monitor

AppleInsider, our reliable Apple watchdog, reports that Apple has patented an accurate touch and hover panel with an embedded heart rate monitor.

The hovering part is described as:

A number of modern computing devices, like Apple's iPhone and iPad, incorporate touch-sensitive panels that enhance and define the user experience by affording unprecedented GUI manipulation and control. Some systems also incorporate what is known as "hover" controls, which allow users to interface with a device without actually touching it. 
Using specialized internal components, these touch sensitive devices can recognize an object hovering above a display panel, like a user's finger or stylus. Once a hover event has been detected, the device may process it as a touch event, handling subsequent actions according to the general rules of traditional touch input. For example, if a hover event occurs over a specific app, that app may be opened.
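The flow the patent describes — detect a hover, then route it through the usual touch-handling rules — could be sketched roughly like this. This is just my illustration of the idea; all class, field, and function names here are hypothetical, not anything from Apple's implementation:

```python
# Minimal sketch of promoting a hover event to a touch event and
# reusing the touch rules. All names are hypothetical.

class Event:
    def __init__(self, kind, x, y):
        self.kind = kind          # "hover" or "touch"
        self.x, self.y = x, y

def handle_event(event, apps):
    """Treat a hover like a touch: find the app icon under the
    (x, y) position and open it."""
    if event.kind == "hover":
        # Promote the hover to a touch and fall through to the
        # general touch-input rules.
        event = Event("touch", event.x, event.y)
    for name, (x0, y0, x1, y1) in apps.items():
        if x0 <= event.x <= x1 and y0 <= event.y <= y1:
            return f"opened {name}"
    return "no target"

# Two app icons with hypothetical screen rectangles.
apps = {"Mail": (0, 0, 50, 50), "Maps": (50, 0, 100, 50)}
print(handle_event(Event("hover", 10, 10), apps))  # hover over Mail opens it
```

The only interesting step is the promotion: once the hover is rewritten as a touch, the rest of the pipeline does not need to know the finger never made contact.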


This capability is not new: modern mobile phones with proximity-sensitive touch screens include the Samsung Galaxy S4 and the Sony Xperia. Sony calls this feature “Floating Touch” while Samsung calls it “AirView.” This patent shows that Apple is considering incorporating the feature into its devices.

A team of researchers from CMU that I was part of recently submitted a paper on a system implementing passive user authentication using proximity information. I will add a link to the paper to this post once it is published.



Wednesday, December 18, 2013

Augmented Reality For Glass » Gesture Interaction with Glass

Folks from the HIT Lab NZ in New Zealand have started to experiment with using Google Glass for gesture recognition. From the brief description in Augmented Reality For Glass » Gesture Interaction with Glass, I understand they execute Computer Vision algorithms on a PC, using the Glass as a camera and display.

This is consistent with the slow frame rates I've seen when running simple OpenCV demos on the Glass. However, this is still very exciting and important as a prototype exploring new gesture interaction scenarios with this novel device.

Wednesday, December 11, 2013

Apple patents tech for making curved touch surfaces, displays

Apple patents tech for making curved touch surfaces, displays:

"The U.S. Patent and Trademark Office on Tuesday awarded Apple a patent that describes a method for efficiently manufacturing curved touch surfaces, suggesting the company may be experimenting with curved iOS device displays."

Tuesday, December 3, 2013

Apple, PrimeSense And Perceptive Computing Or: Why Your Phone Will See In 3D | TechCrunch

Apple, PrimeSense And Perceptive Computing Or: Why Your Phone Will See In 3D | TechCrunch:

Quote:

"To enable something called ‘intent-based computing'. Through a combination of voice, facial recognition, gesture recognition and awareness of signals like depth and 3D space, perceptive computing will allow us to interact with computers in a way"

Monday, November 25, 2013

Massive new Apple patent filing details multi-user support, trackpad controls with Touch ID

Massive new Apple patent filing details multi-user support, trackpad controls with Touch ID.

"Apple's first set of claims revolve around enabling the home button's Touch ID sensor to identify and process gestures, essentially acting as a miniature trackpad for navigating within and switching between applications"

(Via AppleInsider)

Tuesday, November 12, 2013

AppleInsider reports:

With its U.S. Patent No. 8,581,870 for "Touch-sensitive button with two levels," Apple may be on the verge of introducing a new form of input; one that could change the face of iOS.
While a majority of the patent refers to a "button," a notation buried near the end of the document points out that a touch screen operatively coupled to an actuator can be considered a "touch-sensitive depressible button".

Thursday, October 31, 2013

Amazon planning 2014 smartphone with advanced 3D gesture & eye tracking input - source

AppleInsider reports:
Amazon's rumored flagship smartphone is said to include a total of six compact camera modules. Four of them are reportedly VGA cameras, located at the four corners of the device, that will allow for 3D gesture and eye tracking.

Monday, October 21, 2013

WHYRemote

WHYRemote is another company using gestures. They claim to do a lot of things:
WHYRemote© is a Television Remote using: Voice Control, Hand Gesture Control, Eyes Sight Tracking Control, and Mind Control technologies. 
WHYRemote© is based on electroencephalography, embedded voice control technology, and motion detection algorithms with exclusive fingertip tracking capabilities. 
The goal is combining Hand Gesture Recognition software and Voice Control technologies with a unique, sophisticated hand shape and motion detection algorithm that works together with a standard 3D camera, microcontrollers, and voice navigator to obtain a television remote that is completely controlled by the user. 
With WHYRemote’s innovative and patent pending technology, the world is provided with a diverse hand gesture and voice control solution for a variety of platforms and applications.

"Mind Control" sounds especially ominous. We should keep an eye on it to see how their product develops. It seems to be in early stages now.

Sunday, October 13, 2013

Google applies for patent on gesture-based car controls

Engadget reports: Google applies for patent on gesture-based car controls.


I happen to know that researchers at CMU were experimenting with something similar. I wonder if it might constitute prior art.

Friday, October 11, 2013

Ultrasound chip offers gesture control for mobiles

A bit of news from an unusual (for this blog) source, the BBC:
"Ultrasound technology that enables mobiles and tablets to be controlled by gesture could go into production as early as next year. Norwegian start-up Elliptic Labs is in talks with Asian handset manufacturers to get the chip embedded in devices."



Visit the Elliptic Labs web site for more info on their technology.

Wednesday, October 9, 2013

Cuttable, Foldable Sensors Can Add Multi-Touch To Any Device

TechCrunch reports, citing the MIT paper:

"Researchers at the MIT Media Lab and the Max Planck Institutes have created a foldable, cuttable multi-touch sensor that works no matter how you cut it, allowing multi-touch input on nearly any surface."

Tuesday, October 8, 2013

Agawi TouchMark contrasts iPad's fast screen response to laggy Android tablets

AppleInsider reports:

"Cross platform mobile ad vendor Agawi has released test results from its TouchMarks study of various tablets, including Microsoft's Surface, Amazon's Kindle and Android tablets by Nvidia, Samsung and Google's Nexus-branded tablet built by Asus."


GreenTouch: data capture, curation, and analysis via a multi-touch tabletop interface

Research at Google is today featuring one of the recipients of the Google App Engine Education Award: GreenTouch. It is interesting to see people working on using touch interfaces in education.




Nest Protect is featuring gesture control

Nest just announced their newest product: Nest Protect. One interesting tidbit caught my attention:
"No more frantically swinging towels at the smoke alarm to quiet it down. If there’s a nuisance alarm, just stand under Nest Protect and wave your arm to hush the alert. As you wave, your hand should be 2-8 feet away from the alarm."


A more detailed description of the Nest Wave feature can be found here.

The feature is trivial, but it shows how gesture interfaces are becoming more and more widely used in consumer products. It is also an elegant solution to a problem where the user needs to control a device that is out of reach.

Thursday, October 3, 2013

Whole-Home Gesture Recognition Using Wireless Signals

The "Whole-Home Gesture Recognition Using Wireless Signals" paper, which won the MobiCom 2013 Best Paper Award, discusses futuristic gesture-based interaction where user gestures are sensed through WiFi signals.
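The core idea in the paper is that a moving hand shifts the frequency of reflected WiFi signals (the Doppler effect): motion toward the receiver shifts it up, motion away shifts it down. A toy sketch of classifying a gesture from such shifts might look like this — real systems extract tiny (~Hz-scale) shifts from OFDM signals, while this assumes the shifts are already computed, and the threshold is purely illustrative:

```python
# Toy classifier for a WiFi-Doppler gesture system. Assumes the
# per-frame Doppler shifts (in Hz) were already extracted from the
# wireless signal; the 1.0 Hz threshold is an invented example value.

def classify_gesture(doppler_shifts, threshold=1.0):
    """doppler_shifts: list of per-frame frequency shifts in Hz.
    Returns 'push', 'pull', or 'none'."""
    mean_shift = sum(doppler_shifts) / len(doppler_shifts)
    if mean_shift > threshold:
        return "push"   # hand moving toward the receiver
    if mean_shift < -threshold:
        return "pull"   # hand moving away from the receiver
    return "none"       # no consistent motion detected

print(classify_gesture([2.0, 3.5, 2.8]))     # push
print(classify_gesture([-2.1, -3.0, -2.4]))  # pull
```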


Apple investigating advanced velocity-sensitive touch input methods

AppleInsider reports on a new Apple patent application they have spotted:
Apple's touchscreens can measure not only where you tap, but also how hard you tap — and that velocity sensing functionality may become even more advanced in the future, a new patent application reveals. The filing, entitled "System and Method for Enhancing Touch Input," describes a processing algorithm that would estimate the velocity of a touch input. The sensors within an iOS device or otherwise allow the system to determine velocity, even though the screen may not be pressure sensitive.
The patent abstract provides more technical detail:
Disclosed herein are systems, methods, and non-transitory computer-readable storage media for processing user input. A system configured to practice the method first receives, via a touch screen of a computing device, input from a user. Then the system fetches data associated with the input from at least two sensors other than the touch screen and adjusts an input processing algorithm based on the input and the data to yield an adjusted input processing algorithm. Then the system can process the input according the adjusted input processing algorithm. The adjusted input processing algorithm can estimate a velocity of the input and/or filter out invalid inputs. The other sensors besides the touch screen can be an accelerometer, a gyroscope, a microphone, a Hall Effect sensor, a compass, an ambient light sensor, a proximity sensor, a camera, and/or a positioning system. The data can relate to the input based on a temporal relationship.
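A crude way to picture the abstract's idea — pair a touch with readings from a second sensor that are close in time, and use them to estimate velocity and filter invalid inputs — is the sketch below. The pairing window, threshold, and all function names are my invention for illustration, not anything from the patent:

```python
# Hypothetical sketch of sensor-fusion touch processing: use
# accelerometer readings temporally near a touch as a proxy for
# how hard/fast the tap was. Window and threshold values are
# invented example numbers.

def tap_velocity(touch_time, accel_samples, window=0.05):
    """accel_samples: list of (timestamp, magnitude) readings.
    Return the peak accelerometer magnitude within `window`
    seconds of the touch, as a crude tap-velocity estimate."""
    nearby = [m for t, m in accel_samples
              if abs(t - touch_time) <= window]
    return max(nearby) if nearby else 0.0

def is_valid_tap(touch_time, accel_samples, threshold=0.2):
    """Filter out touches with no matching vibration spike
    (e.g. capacitive noise or an accidental graze)."""
    return tap_velocity(touch_time, accel_samples) >= threshold

samples = [(0.98, 0.05), (1.00, 0.9), (1.02, 0.3)]
print(tap_velocity(1.0, samples))   # peak magnitude near the touch
print(is_valid_tap(1.0, samples))   # spike present, so the tap counts
```

Note how this matches the abstract's "temporal relationship" language: the only thing tying a touch to its sensor data is that they happened at nearly the same time.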

Wednesday, October 2, 2013

Google Acquires YC-Backed Flutter, A Gesture Recognition Technology Startup, For Around $40M

TechCrunch reports that Google Acquires YC-Backed Flutter, A Gesture Recognition Technology Startup, For Around $40M. Flutter works on gesture recognition using a webcam.

Tuesday, September 24, 2013

Apple experimenting with multitouch swipe gestures for keyboard in iOS

AppleInsider reports on Apple's U.S. Patent No. 8,542,206 for "Swipe gestures for touch screen keyboards". The patent deals in part with using various swipe-like gestures as keyboard input shortcuts.


Monday, September 23, 2013

New Microsoft Touch Cover features gestures

The new Microsoft Touch Cover 2 was announced today along with the Surface Pro 2. There are early reports that it has some touch gestures added:
Slide two fingers across the number key line, and the Touch Cover will highlight text. Release, and the selected text will be deleted. A spacebar gesture talks to Windows 8.1′s word recommendation system.

We will report more on these once more information becomes available.

Thursday, September 19, 2013

First HP Computer With Embedded Leap Motion Tech Will Ship This Fall For $1,049.99

TechCrunch reports:
"The HP ENVY17 Leap Motion SE is the first shipping computer to build the startup’s tech directly in, and features a new embedded Leap Motion sensor that dramatically reduces size vs. previous embedded designs."

Tuesday, September 17, 2013

Structure Sensor: Capture the World in 3D

This new device from Occipital could be used as a 3D gesture input sensor. Can't wait to see the first applications!

Monday, September 9, 2013

Dryft Wants To Reinvent The Way We Type On Tablets

More news from Disrupt SF 2013 stage:
Swype co-founder Randy Marsden — along with Bridgescale Partners co-founder Rob Chaplinsky — decided to take another stab at reinventing typing on the go with an Android keyboard app called Dryft.
From Dryft's product description:
Dryft provides a true natural no-look typing experience on the screen of the tablet. It allows the user to rest their fingers on the screen and automatically brings the keys to their fingers. Then, it detects the vibration of the user tapping on a key to tell the difference between them resting and typing. It is this combination of finger tracking and touch-tap detection that makes Dryft work; and the patents for both features are already issued! ... Dryft accomplishes its magic by combining not one, but TWO sensors: touch and vibration. That makes it possible for the user to rest their fingers on the screen, the same way they would on a physical keyboard. Then, the keys “drift” to the user’s fingers and forms the keyboard around them. Next, it combines data from the device’s touch and accelerometer sensors and picks up on the vibration caused when a user taps on a key. In this way, Dryft can tell when the user is resting and typing.
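My reading of the description above — a contact counts as a keystroke only if the accelerometer registers a tap vibration at the same moment, otherwise the finger is just resting — could be sketched like this. The timing window and all names are illustrative assumptions, not Dryft's code:

```python
# Sketch of Dryft's described rest-vs-type discrimination: fuse
# touch contacts with accelerometer tap timestamps. The 30 ms
# window is an invented example value.

def classify_contacts(contacts, taps, window=0.03):
    """contacts: list of (timestamp, key) finger contacts.
    taps: list of accelerometer tap-vibration timestamps.
    Returns only the keys that were actually typed."""
    typed = []
    for t, key in contacts:
        if any(abs(t - tap) <= window for tap in taps):
            typed.append(key)   # vibration coincides: a keystroke
        # else: the finger is merely resting on the glass, ignore
    return typed

# Three fingers on the glass, but only one real tap vibration.
contacts = [(0.10, "a"), (0.10, "s"), (0.45, "d")]
taps = [0.45]
print(classify_contacts(contacts, taps))  # only 'd' was typed
```

The design point is that neither sensor alone suffices: touch alone cannot tell resting from typing, and vibration alone cannot tell which key was hit.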

Microsoft Announces Surface 2.0 Event In New York City On September 23

Again, via techcrunch:
Today Microsoft released invites to its forthcoming Surface 2.0 event in New York City. The shindig, taking place in Chelsea on the 23 of September, will show off new Surface hardware.

Sunday, September 8, 2013

Secret Handshake Lets You Pay With Hand Gestures And Leap Motion – No Phone Or Card Required

TechCrunch reports:
One of the TechCrunch Disrupt hacks this year is the brainchild of Matthew Drake, an Atlanta-based employee of advertising firm 22squared. Secret Handshake, Drake’s hack this year, uses the Leap Motion gesture controller, combined with Clover’s shopping cart point-of-sale API, to allow people to pay using only a unique hand gesture or “secret handshake.”

Thursday, September 5, 2013

Haptix: Multitouch Reinvented

An interesting Kickstarter project with a somewhat misleading name, as there is no actual haptic feedback involved.

Evomail Is a Gesture-Based Email client for iOS and now Android

Evomail just announced an Android version. It is interesting to see an app entering such a crowded space as email clients, but with an interesting twist: their UI is heavily gesture-oriented. I guess in the near future we will see more apps like this.

Wednesday, September 4, 2013

Samsung AirView under the hood

If you are interested in the technical details of how Samsung AirView works under the hood, please read my blog post.

Tuesday, September 3, 2013

Apple tech uses specific gestures to unlock apps, device functions

AppleInsider reports, based on a recent Apple patent application, that a "future version of iOS may feature a unique security method that recognizes different gesture inputs to open specific sets of apps, allowing for greater control over user access."