We’ve gotten used to the idea that computers are machines we operate with our hands. Just as we Gen Xers became comfortable with keyboards and mice, today’s millennial generation has learned to text at blazing speed. Each new iteration of technology has required new skills to use it proficiently.
That’s why the new trend towards
no-touch interfaces is so fundamentally different. From Microsoft’s Kinect to Apple’s Siri to
Google’s Project Glass, we’re beginning to expect that computers adapt to us
rather than the other way around.
The basic
pattern recognition technology has been advancing for generations and, thanks
to accelerating returns, we can expect computer interfaces to become almost
indistinguishable from humans in little more than a decade.
Tapped Out? Moving Toward a No-Touch Future
With advances in sensors and cameras, no-touch interfaces and devices will continue to be integrated into daily life. Smartphones such as the Pantech Perception and the upcoming Samsung Galaxy S4 are the latest devices to incorporate touchless features; each enables users to browse through picture galleries or answer a phone call by simply waving a hand over the screen. The Galaxy S4 also has Smart Scroll, which detects the user’s eyes and scrolls web pages based on the angle at which he or she tilts the head.
Many smartphone users are already familiar with no-touch technology thanks to the wide adoption of voice recognition software in wireless devices. Apps like Google Now on Android and Siri on iOS give users hands-free access to endless information. Google Chrome has also added voice recognition to its latest version, enabling features like email dictation. The same technology is being incorporated into automobiles to give drivers a hands-free mobile experience.
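Once speech has been transcribed, the remaining step these assistants perform is mapping an utterance to an action. A minimal sketch of that dispatch step, assuming transcription has already happened upstream (the keywords and action names here are invented for illustration, not any vendor’s actual command set):

```python
# Hypothetical keyword-to-action table; real assistants use far richer
# intent models, but the dispatch idea is the same.
COMMANDS = {
    "call": "start-phone-call",
    "navigate": "open-directions",
    "dictate": "start-dictation",
}

def dispatch(transcript):
    """Map a transcribed utterance to an action via its first keyword match."""
    for word in transcript.lower().split():
        if word in COMMANDS:
            return COMMANDS[word]
    return "no-op"   # nothing recognized: do nothing rather than guess
```

For example, `dispatch("Please call mom")` returns `"start-phone-call"`, while an unrecognized phrase falls through to `"no-op"`.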
Gesture technology is also featured in products like Kinect for Xbox. To bring this functionality to computers, Microsoft created Kinect for Windows, which combines software and sensors. One Kinect for Windows app lets surgeons use gestures to control medical images and scans on computers, eliminating the time lost using unsterilized computers and then having to scrub up again. Intel has developed a gesture-sensing device that uses conventional and infrared cameras, microphones and software to enable computer apps to track a person’s fingers, recognize faces, infer emotions and interpret words spoken in nine languages. However, this is just the beginning: mobile voice interfaces will soon be even more commonplace, letting users talk to a device without touching it first.
TOUCH-LESS TOUCH SCREEN USER INTERFACE
Touch screens were what initially created the great furore, and touch screen displays are now ubiquitous worldwide. But gone are the days when you had to fiddle with a touch screen and end up scratching it: frequently touching a touchscreen display with a pointing device such as a finger can gradually de-sensitize the screen to input and can ultimately lead to its failure. To avoid this, a simple user interface for touchless control of electrically operated equipment is being developed. Elliptic Labs’ innovative technology lets you control gadgets like computers, MP3 players or mobile phones without touching them. Unlike other systems, which depend on distance to the sensor or on sensor selection, this system depends on hand and/or finger motions: a hand wave in a certain direction, a flick of the hand in one area, holding the hand in one area, or pointing with one finger, for example. The device is based on optical pattern recognition using a solid-state optical matrix sensor with a lens to detect hand motions. The sensor is connected to a digital image processor, which interprets the patterns of motion and outputs the results as signals to control fixtures, appliances, machinery, or any device controllable through electrical signals.
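The pipeline above (optical sensor, then an image processor that interprets motion patterns) can be sketched in miniature. This is a toy illustration of the general idea, not Elliptic Labs’ actual algorithm: compare successive frames from the sensor, locate where pixels changed, and classify the direction the changed region drifts as a left or right hand wave.

```python
def changed_pixel_centroid(prev, curr, threshold=40):
    """Return the (row, col) centroid of pixels that changed between two
    grayscale frames (2D lists of brightness values), or None if nothing moved."""
    rows = cols = count = 0
    for r, (prev_row, curr_row) in enumerate(zip(prev, curr)):
        for c, (p, q) in enumerate(zip(prev_row, curr_row)):
            if abs(p - q) > threshold:   # brightness changed: motion here
                rows += r
                cols += c
                count += 1
    if count == 0:
        return None
    return rows / count, cols / count

def classify_swipe(frames, min_shift=2):
    """Classify a sequence of frames as a left or right hand wave by tracking
    how the centroid of motion drifts horizontally over time."""
    centroids = [changed_pixel_centroid(a, b) for a, b in zip(frames, frames[1:])]
    centroids = [c for c in centroids if c is not None]
    if len(centroids) < 2:
        return "none"
    shift = centroids[-1][1] - centroids[0][1]   # net horizontal movement
    if shift > min_shift:
        return "swipe-right"
    if shift < -min_shift:
        return "swipe-left"
    return "none"
```

Feeding this a sequence of frames in which a bright blob (the hand) moves from the left edge toward the right edge yields `"swipe-right"`; the reversed sequence yields `"swipe-left"`. A real device would then emit the corresponding control signal.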
The touchless touch screen sounds like it would be nice and easy, but on closer examination it could be quite a workout. This unique screen is made by TouchKo, White Electronic Designs and Groupe 3D. It resembles the Nintendo Wii without the Wii Controller: your hand doesn’t have to come in contact with the screen at all, because the screen works by detecting your hand movements in front of it. This is a pretty unique and interesting invention, until you break out in a sweat. The technology doesn’t compare to the hologram-like IO2 Technologies Heliodisplay M3, but that’s for anyone who has $18,100 lying around.
You probably won’t see this screen in stores any time soon. Everybody loves a touch screen, and when you get a gadget with one the experience is really exhilarating. When the iPhone was introduced, everyone felt the same. But gradually the exhilaration started fading: using the phone with a fingertip or a stylus left the screen covered in fingerprints and scratches. Even with a screen protector, dirty marks over such a beautiful glossy screen are a strict no-no. The same thing happens with the iPod Touch. Most of the time we have to wipe the screen just to get a clear, unobstructed view of it.
TOUCHLESS MONITOR:
Sure, everybody is doing touchscreen interfaces these days, but this is the first time I’ve seen a monitor that can respond to gestures without actually having to touch the screen. The monitor, based on technology from TouchKo, was recently demonstrated by White Electronic Designs and Tactyl Services at the CeBIT show. Designed for applications where touch may be difficult, such as for doctors who might be wearing surgical gloves, the display features capacitive sensors that can read movements from up to 15 cm away from the screen. Software then translates gestures into screen commands.
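The translation step described above (capacitive readings in, screen commands out) can be illustrated with a toy model; the sensor layout, thresholds and command names here are invented, not the TouchKo implementation. Each edge-mounted sensor reports a proximity value that rises as the hand approaches, and a weighted average of the readings locates the hand along the screen edge:

```python
# Hypothetical positions (cm along the screen edge) of four capacitive pads.
SENSOR_POSITIONS_CM = [0, 10, 20, 30]

def hand_position(readings, min_signal=5):
    """Estimate hand position (cm along the edge) from proximity readings,
    or None if the total signal is too weak (hand beyond sensing range)."""
    total = sum(readings)
    if total < min_signal:
        return None
    return sum(p * r for p, r in zip(SENSOR_POSITIONS_CM, readings)) / total

def to_command(readings):
    """Translate a set of sensor readings into a (made-up) screen command."""
    pos = hand_position(readings)
    if pos is None:
        return "idle"
    return "select-left-panel" if pos < 15 else "select-right-panel"
```

So a hand hovering near the left sensors, e.g. readings `[8, 4, 0, 0]`, maps to `"select-left-panel"`, while weak readings everywhere map to `"idle"`.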
Touchscreen interfaces are great, but all that touching can be a bit of a drag. Enter the wonder kids from Elliptic Labs, who are hard at work implementing a touchless interface. The input method is, well, thin air. The technology detects motion in 3D and requires no special worn sensors for operation. By simply pointing at the screen, users can manipulate the object being displayed in 3D. Details are light on how this actually functions, but what we do know is this:
What is the technology behind it?
It obviously requires a sensor, but the sensor is neither hand-mounted nor present on the screen; it can be placed either on the table or near the screen. The hardware setup is so compact that it can be fitted into a tiny device like an MP3 player or a mobile phone, and it recognizes the position of an object from as far as 5 feet away.
WORKING:
The system is capable of detecting movements in three dimensions without you ever having to put your fingers on the screen. The patented touchless interface doesn’t require you to wear any special sensors on your hand either. You just point at the screen (from as far as 5 feet away), and you can manipulate objects in 3D.
Sensors are mounted around the screen being used; by interacting in the line of sight of these sensors, the user’s motion is detected and interpreted into on-screen movements. What is to stop unintentional gestures from being used as input is not entirely clear, but it looks promising nonetheless. Elliptic Labs says the technology will easily be small enough to be implemented into cell phones and the like.
Touch-less Gives Glimpse of GBUI:
We have seen the futuristic user interfaces of movies like Minority Report and The Matrix Revolutions, where people wave their hands in three dimensions and the computer understands what the user wants, shifting and sorting data with precision. Microsoft's XD Huang demonstrated how his company sees the future of the GUI at ITEXPO this past September, though at the show the example was in two dimensions, not three.
The GBUI as seen in The Matrix
The GBUI as seen in Minority Report
Microsoft demonstrated its vision of the UI at its Redmond headquarters, and it involves lots of gestures that allow you to take applications and forward them to others with simple hand movements. The demos included the concept of software understanding business processes and helping you work: after reading a document, you could just push it off the side of your screen, and the system would know to post it on an intranet and also send a link to a specific group of people.
Touch-less UI:
The basic idea described in the patent is that sensors arrayed around the perimeter of the device would be capable of sensing finger movements in 3D space. The user could use her fingers much as on a touch phone, but without having to touch the screen.
Touch-less SDK:
The Touchless SDK, released by Microsoft Office Labs, is an open-source SDK for .NET applications. It enables developers to create multi-touch applications using a webcam for input: color-based markers defined by the user are tracked, and their information is published through events to clients of the SDK. In a nutshell, the Touchless SDK enables “touch without touching.”
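The Touchless SDK itself is a .NET library; the following is a language-neutral sketch in Python of its core idea as described above: track a user-defined color marker in a webcam frame and publish its position to subscribers through events. The class and method names are my own, not the SDK’s API.

```python
class MarkerTracker:
    def __init__(self, marker_rgb, tolerance=30):
        self.marker = marker_rgb     # the user-defined marker color
        self.tolerance = tolerance   # per-channel color match slack
        self.listeners = []          # callbacks fired on every marker update

    def on_update(self, callback):
        """Subscribe a client callback to marker-position events."""
        self.listeners.append(callback)

    def _matches(self, pixel):
        return all(abs(a - b) <= self.tolerance
                   for a, b in zip(pixel, self.marker))

    def process_frame(self, frame):
        """Find the marker's centroid in a frame (a 2D grid of RGB tuples)
        and publish it to all subscribers; return it, or None if absent."""
        xs, ys = [], []
        for y, row in enumerate(frame):
            for x, pixel in enumerate(row):
                if self._matches(pixel):
                    xs.append(x)
                    ys.append(y)
        if not xs:
            return None
        pos = (sum(xs) / len(xs), sum(ys) / len(ys))
        for listener in self.listeners:
            listener(pos)            # the "event" reaching SDK clients
        return pos
```

A client subscribes once with `on_update`, then every processed frame in which the green (or whatever color) marker appears fires the callback with the marker’s current position, which is all a multi-touch demo needs to drive a cursor.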
Touch-less demo:
The Touchless Demo is an open-source application that anyone with a webcam can use to experience multi-touch; no geekiness required. The demo was created using the Touchless SDK and Windows Forms with C#. There are four fun demos: Snake, where you control a snake with a marker; Defender, an up-to-four-player version of a pong-like game; Map, where you can rotate, zoom, and move a map using two markers; and Draw, where the marker is used to, you guessed it, draw!
Touch wall:
Touch Wall refers to the touch screen hardware setup itself; the corresponding software that runs Touch Wall, built on a standard version of Vista, is called Plex. Touch Wall and Plex are superficially similar to Microsoft Surface, a multi-touch table computer introduced in 2007 that recently became commercially available in select AT&T stores. But Touch Wall is a fundamentally simpler mechanical system and is also significantly cheaper to produce: while Surface retails at around $10,000, the hardware to “turn almost anything into a multi-touch interface” for Touch Wall is just “hundreds of dollars.”
Touch Wall consists of three infrared lasers that scan a surface. A camera notes when something breaks through the laser line and feeds that information back to the Plex software. Early prototypes, say Pratley and Sands, were made simply on a cardboard screen: a projector was used to show the Plex interface on the cardboard, and the system worked fine. It’s also clear that the only real limit on screen size is the projector, meaning that entire walls can easily be turned into a multi-touch user interface. Scrap those whiteboards in the office and make every flat surface into a touch display instead; you might even save some money.
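The sensing step described above can be reduced to a tiny sketch (the real Plex pipeline is not public, so the numbers and mapping here are illustrative only): the camera sees the laser line as a row of brightness samples, a finger breaking the line shows up as a dark run, and the center of that run maps proportionally to a horizontal screen position.

```python
def find_touch(samples, screen_width, dark_threshold=50):
    """Given camera brightness samples along the laser line, return the
    touch's x position on screen, or None if the line is unbroken."""
    dark = [i for i, s in enumerate(samples) if s < dark_threshold]
    if not dark:
        return None                          # nothing is blocking the laser
    center = sum(dark) / len(dark)           # middle of the occluded run
    # Map the sample index proportionally onto screen coordinates.
    return center / (len(samples) - 1) * screen_width
```

With ten samples of a bright line where samples 4 and 5 go dark, a finger is detected halfway along the line, which on a 1920-pixel-wide display maps to x = 960.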
What’s next?
Many personal computers will likely have similar screens in the near future. But touch interfaces are nothing new -- witness ATMs.
How about getting completely out of touch? A startup called LM3Labs says it's working with major computer makers in Japan, Taiwan and the US to incorporate touchless navigation into their laptops. Called Airstrike, the system uses tiny charge-coupled device (CCD) cameras integrated into each side of the keyboard to detect user movements. You can drag windows around or close them, for instance, by pointing and gesturing in midair above the keyboard. You should be able to buy an Airstrike-equipped laptop next year, with high-end stand-alone keyboards to follow.
Any such system is unlikely to replace typing and mousing, but that's not the point: Airstrike aims to give you an occasional quick break from those activities.
CONCLUSION:
Today’s thinking is once again about the user interface, and efforts are being made to better the technology day in and day out. The touchless touch screen user interface can be used effectively in computers, cell phones, webcams and laptops. Maybe a few years down the line, our bodies could be transformed into a virtual mouse, a virtual keyboard and who knows what else: our body itself may be turned into an input device!