Tuesday, December 2, 2014

Exclusive: Giving Stephen Hawking a voice http://goo.gl/pHufxl

Photograph: Marco Grob

Joao Medeiros, Science Editor

This exclusive article was taken from the January 2015 issue of
WIRED magazine. Be the first to read WIRED's articles in print
before they're posted online, and get your hands on loads of
additional content by subscribing online.

Stephen Hawking first met Gordon Moore, the cofounder of Intel,
at a conference in 1997. Moore noticed that Hawking's computer,
which he used to communicate, had an AMD processor and asked him
whether he would prefer a "real computer" with an Intel
microprocessor. Intel has been providing Hawking with customised
PCs and technical support since then, replacing his computer every
two years.

Hawking lost his ability to speak in 1985, when, on a trip to
CERN in Geneva, he caught pneumonia. In the hospital, he was put on
a ventilator. His condition was critical. The doctors asked
Hawking's then-wife, Jane, whether they should turn off the life
support. She vehemently refused. Hawking was flown to Addenbrooke's
Hospital, in Cambridge, where the doctors managed to contain the
infection. To help him breathe, they also performed a tracheotomy,
which involved cutting a hole in his neck and placing a tube into
his windpipe. As a result, Hawking irreversibly lost the ability to
speak.

For a while, Hawking communicated using a spelling card,
patiently indicating letters and forming words with a lift of his
eyebrows. Martin King, a physicist who had been working with
Hawking on a new communication system, contacted a California-based
company called Words Plus, whose computer program Equalizer allowed
the user to select words and commands on a computer using a hand
clicker. King spoke to the CEO of Words Plus, Walter Woltosz, and
asked if the software could help a physics professor in England
with ALS. Woltosz had created an earlier version of Equalizer to
help his mother-in-law, who also suffered from ALS and had lost her
ability to speak and write. "I asked if it was Stephen Hawking, but
he couldn't give me a name without permission," says Woltosz. "He
called me the next day and confirmed it. I said I would donate
whatever was needed." 

Equalizer first ran on an Apple II computer linked to a speech
synthesiser made by a company called Speech Plus. David Mason, the
engineer husband of one of Hawking's nurses, then adapted this
system into a portable version that could be mounted on an arm of
Hawking's wheelchair. With this new system, Hawking was able to
communicate at a rate of 15 words per minute.

However, the nerve that allowed him to move his thumbs kept
degrading. By 2008, Hawking's hand was too weak to use the clicker.
His graduate assistant at the time then devised a switching device
called the "cheek switch". Attached to his glasses, it could
detect, via a low infrared beam, when Hawking tensed his cheek
muscle. Since then, Hawking has achieved the feat of writing
emails, browsing the internet, writing books and speaking using
only one muscle. Nevertheless, his ability to communicate continued
to decline. By 2011, he managed only about one or two words per
minute, so he sent a letter to Moore, saying: "My speech input is
very, very slow these days. Is there any way Intel could help?"

Moore asked Justin Rattner, then Intel's CTO, to look into the
problem. Rattner assembled a team of experts on human-computer
interaction from Intel Labs, which he brought over to Cambridge for
Hawking's 70th birthday conference, "The state of the Universe", on
January 8, 2012. "I brought a group of specialists with me from
Intel Labs," Rattner told the audience. "We're going to be looking
carefully at applying some state-of-the-art computing technology to
improve Stephen's communicating speed. We hope that this team has a
breakthrough and identifies a technique that allows him to
communicate at levels he had a few years ago."

Hawking had been too ill to attend his own birthday party, so he
met the Intel experts some weeks later at his office in the
Department of Applied Mathematics and Theoretical Physics at the
University of Cambridge. The team of five included Horst
Haussecker, the director of the Experience Technology Lab, Lama
Nachman, the director of the Anticipatory Computing Lab and project
head, and Pete Denman, an interaction designer. "Stephen has always
been inspirational to me," says Denman, who also uses a wheelchair.
"After I broke my neck and became paralysed, my mother gave me a
copy of A Brief History of Time, which had just come out.
She told me that people in wheelchairs can still do amazing things.
Looking back, I realise how prophetic that was."

After the Intel team introduced themselves, Haussecker took the
lead, explaining why they were there and what their plans were. He
had been speaking for 20 minutes when, suddenly, Hawking spoke.

"He welcomed us and expressed how happy he was that we were
there," says Denman. "Unbeknown to us, he had been typing all that
time. It took him 20 minutes to write a salutation of about 30
words. It stopped us all in our tracks. It was poignant. We now
realised that this was going to be a much bigger problem than we
thought."

At the time, Hawking's computer interface was a program called
EZ Keys, an upgrade of his previous software, also designed by
Words Plus. It provided him with a keyboard on the screen and a
basic word-prediction algorithm. A cursor automatically scanned
across the keyboard by row or by column, and he could select a
character by moving his cheek to stop the cursor. EZ Keys also
allowed Hawking to control the mouse in Windows and operate other
applications on his computer. He surfed the web with Firefox and
wrote his lectures using Notepad. He also had a webcam that he used
with Skype.
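
To make that interaction concrete, here is a minimal sketch, in
Python, of how a single-switch row-and-column scanning keyboard of
the kind described above can work: a cursor steps through the grid
on a timer, and a single switch press first locks in a row, then a
character. The grid layout, timing and demo input are illustrative
assumptions, not details of the actual EZ Keys software.

    # Minimal sketch of single-switch row/column scanning selection.
    # The keyboard grid and the simulated "switch" are illustrative
    # assumptions, not the actual EZ Keys implementation.

    GRID = [
        list("ABCDEF"),
        list("GHIJKL"),
        list("MNOPQR"),
        list("STUVWX"),
        list("YZ_.,?"),
    ]

    class ScanningKeyboard:
        def __init__(self, grid):
            self.grid = grid
            self.mode = "row"   # scan rows first, then columns within the row
            self.row = 0
            self.col = 0

        def tick(self):
            """Advance the highlight one step (driven by a timer in a real UI)."""
            if self.mode == "row":
                self.row = (self.row + 1) % len(self.grid)
            else:
                self.col = (self.col + 1) % len(self.grid[self.row])

        def press(self):
            """Handle a switch activation (e.g. a cheek twitch stopping the cursor)."""
            if self.mode == "row":
                self.mode = "col"   # row chosen; start scanning its columns
                self.col = 0
                return None
            char = self.grid[self.row][self.col]
            self.mode = "row"       # selection done; return to row scanning
            self.row = 0
            return char

    # Demo: select the letter "O" (row 2, column 2) with two switch presses.
    kb = ScanningKeyboard(GRID)
    for _ in range(2):   # ticks move the highlight from row 0 to row 2
        kb.tick()
    kb.press()           # first press locks in the row
    for _ in range(2):   # ticks move the highlight to column 2
        kb.tick()
    print(kb.press())    # second press selects the character -> O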

The Intel team envisaged an overhaul of Hawking's archaic
system, which would involve introducing new hardware. "Justin was
thinking that we could use technology such as facial-gesture
recognition, gaze tracking and brain-computer interfaces," says
Nachman. "Initially we fed him a lot of these wild ideas and tried
a lot of off-the-shelf technologies." Those attempts, more often
than not, failed. Gaze tracking couldn't lock on to Hawking's gaze,
because of the drooping of his eyelids. Before the Intel project,
Hawking had tested EEG caps that could read his brainwaves and
potentially transmit commands to his computer, but the team
couldn't get a strong enough brain signal. "We would flash letters
on the screen and it would try to select the right letter just by
registering the brain's response," says Jonathan Wood, Hawking's
graduate assistant. "It worked fine with
me, then Stephen tried it and it didn't work well. They weren't
able to get a strong enough signal-to-noise."

"The more we observed him and listened to his concerns, the more
it dawned on us that what he was really asking, in addition to
improving how fast he could communicate, was for new features that
would let him interact better with his computer," says Nachman.
After returning to Intel Labs and after months of research, Denman
prepared a ten-minute video to send to Hawking, delineating which
new user-interface prototypes they wanted to implement and
soliciting his feedback. "We came up with changes we felt would not
drastically change how he used his system, but would still have a
large impact," says Denman. The changes included additions such as
a "back button", which Hawking could use not only to delete
characters but to navigate a step back in his user interface; a
predictive-word algorithm; and next-word navigation, which would
let him choose words one after another rather than typing them.
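
As one plausible reading of that dual-purpose back button (an
assumption for illustration, not Intel's actual design), a single
"back" action can step out of whatever menu is open and only fall
back to deleting a character when there is nothing to navigate out
of:

    # Illustrative sketch of a dual-purpose "back" action: leave the
    # current menu if one is open, otherwise delete the last character.
    # This is an assumption for illustration, not Intel's implementation.

    class Interface:
        def __init__(self):
            self.text = []    # characters typed so far
            self.menus = []   # stack of open menus / UI levels

        def type_char(self, ch):
            self.text.append(ch)

        def open_menu(self, name):
            self.menus.append(name)

        def back(self):
            """Single 'back' action: navigate back in the UI, or undo a character."""
            if self.menus:
                return "closed " + self.menus.pop()
            if self.text:
                return "deleted " + self.text.pop()
            return "nothing to undo"

    ui = Interface()
    ui.type_char("h")
    ui.open_menu("email")
    print(ui.back())   # -> closed email   (steps back in the interface)
    print(ui.back())   # -> deleted h      (deletes the last character)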

The main change, in Denman's view, was a prototype that tackled
the biggest problem that Hawking had with his user interface:
missed key-hits. "Stephen would often hit the wrong key
by hitting the letter adjacent to the one he wanted,"
says Denman. "He would miss the letter, go back, miss the letter
again, go back. It was unbearably slow and he would get
frustrated." That particular problem was compounded by Hawking's
perfectionism. "It's really important for him to have his thoughts
articulated in exactly the right way and for the punctuation to be
absolutely right," says Nachman. "He learned to be patient enough
to still be able to be a perfectionist. He's not somebody who just
wants to get the gist of the message across. He's somebody who
really wants it to be perfect."

To address the missed key-hits, the Intel team added a prototype
that would interpret Hawking's intentions, rather than his actual
input, using an algorithm similar to that used in word processors
and mobile phones. "This is a tough interaction to put your faith
into," the video explained. "When the iPhone first entered the
market, people complained about predictive text but quickly
distrust turned to delight. The problem is that it takes a little
time to get used to and you have to release control to let the
system do the work. The addition of this feature could increase
your speed and let you concentrate on content."
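
The article doesn't spell out the algorithm, but the general idea
of interpreting intent rather than raw input can be sketched
roughly: score candidate dictionary words by how well their letters
match the keys that were actually hit, allowing for adjacent-key
misses, much as phone keyboards do. In this sketch the
keyboard-adjacency table, the tiny dictionary and the scoring
weights are all illustrative assumptions.

    # Rough sketch of "intent over input": pick the dictionary word whose
    # letters best explain the keys actually hit, allowing adjacent-key misses.
    # The adjacency map, dictionary and weights are illustrative assumptions.

    ADJACENT = {   # a few QWERTY neighbourhoods, enough for the demo
        "q": "wa", "w": "qes", "e": "wrd", "r": "etf", "t": "ryg",
        "y": "tuh", "u": "yij", "i": "uok", "o": "ipl", "p": "ol",
        "a": "qsz", "s": "adwx", "d": "sfec", "f": "dgrv", "g": "fhtb",
        "h": "gjyn", "j": "hkum", "k": "jlio", "l": "kop",
        "z": "xa", "x": "zcs", "c": "xvd", "v": "cbf", "b": "vng",
        "n": "bmh", "m": "nj",
    }

    DICTIONARY = ["black", "block", "blank", "hole", "hold", "universe"]

    def letter_score(typed, intended):
        """1.0 for an exact hit, 0.5 for an adjacent key, 0 otherwise."""
        if typed == intended:
            return 1.0
        if intended in ADJACENT.get(typed, ""):
            return 0.5
        return 0.0

    def best_guess(typed_word, dictionary=DICTIONARY):
        """Return the dictionary word that best explains the keys that were hit."""
        candidates = [w for w in dictionary if len(w) == len(typed_word)]
        if not candidates:
            return typed_word   # nothing plausible; keep the raw input
        return max(candidates,
                   key=lambda w: sum(letter_score(t, c)
                                     for t, c in zip(typed_word, w)))

    # Demo: "blsck" (s hit instead of a) is read as the intended "black".
    print(best_guess("blsck"))   # -> black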

The video concluded: "What's your level of excitement or
apprehension?" In June that year, Hawking visited Intel Labs,
where Denman and his team introduced him to the new system,
initially called ASTER (for ASsistive Text EditoR). "Your current
piece of software is a little dated," Denman told him. "Well, it's
very dated, but you're very used to using it, so we've changed the
method by which your next-word prediction works and it can pretty
much pick up the correct word every single time, even if you're
letters away from it."

"This is a big improvement over the previous version," Hawking
replied. "I really like it."

They implemented the new user interface on Hawking's computer.
Denman thought they were on the right path. By September, they
began to get feedback: Hawking wasn't adapting to the new system.
It was too complicated. Prototypes such as the back button, and the
one addressing "missed key-hits", proved confusing and had to be
scrapped. "He's one of the brightest guys in the world but we can't
forget that he hasn't been exposed to modern technology," says
Denman. "He never had the opportunity to use an iPhone. We were
trying to teach the world's most famous and smartest 72-year-old
grandfather to learn this new way of interacting with
technology."

Denman and the rest of the team realised that they had to start
thinking differently about the problem. "We thought we were
designing software in the traditional sense, where you throw out a
huge net and try to catch as many fish as you can," says Denman.
"We didn't realise how much the design would hinge on Stephen. We
had to point a laser to study one individual."

At the end of 2012, the Intel team set up a system that recorded
how Hawking interacted with his computer. They recorded tens of
hours of video that encompassed a range of different situations:
Stephen typing, Stephen typing when tired, Stephen using the mouse,
Stephen trying to get a window at just the right size. "I watched
the footage over and over," says Denman. "Sometimes, I would run
it at four times the speed and still find something new."

By September 2013, now with Wood's assistance, they implemented
another iteration of the user interface on Hawking's computer. "I
thought we had it, I thought we were done," says Denman. However,
by the following month, it became clear that, again, Hawking was
having trouble adapting. "One of his assistants called it 'ASTER
torture'," recalls Denman. "When they said it, Stephen would grin."

It was many more months before the Intel team came up with a
version that pleased Hawking. For instance, Hawking now uses an
adaptive word predictor from London startup SwiftKey, which allows
him to select a word after typing a letter, whereas Hawking's
previous system required him to navigate to the bottom of his user
interface and select a word from a list. "His word-prediction
system was very old," says Nachman. "The new system is much faster
and efficient, but we had to train Stephen to use it. In the
beginning he was complaining about it, and only later I realised
why: he already knew which words his previous systems would
predict. He was used to predicting his own word predictor." Intel
worked with SwiftKey, incorporating many of Hawking's documents
into the system, so that, in some cases, he no longer needs to type
a character before the predictor guesses the word based on context.
"The phrase 'the black hole' doesn't require any typing," says
Nachman. "Selecting 'the' automatically predicts 'black'. Selecting
'black' automatically predicts 'hole'."
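
As a rough illustration of that kind of context-driven prediction,
here is a minimal sketch of a bigram next-word predictor trained on
a few sample sentences; the toy corpus and the ranking are
illustrative stand-ins, and SwiftKey's personalised model, built
from Hawking's own documents, is far more sophisticated.

    # Minimal sketch of context-based next-word prediction using bigram counts.
    # The tiny "corpus" stands in for Hawking's documents; the real model is
    # far more sophisticated and personalised than this.
    from collections import Counter, defaultdict

    corpus = (
        "the black hole emits radiation . "
        "the black hole has an event horizon . "
        "the early universe was hot and dense ."
    )

    # Count how often each word follows each other word.
    bigrams = defaultdict(Counter)
    words = corpus.split()
    for prev, nxt in zip(words, words[1:]):
        bigrams[prev][nxt] += 1

    def predict_next(prev_word, k=3):
        """Return up to k most likely words to follow prev_word."""
        return [w for w, _ in bigrams[prev_word].most_common(k)]

    # Selecting "the" suggests "black"; selecting "black" suggests "hole",
    # so a phrase like "the black hole" needs no typing at all.
    print(predict_next("the"))    # -> ['black', 'early']
    print(predict_next("black"))  # -> ['hole']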

The new version of Hawking's user interface (now called ACAT,
after Assistive Contextually Aware Toolkit) includes contextual
menus that provide Hawking with various shortcuts to speak, search
or email; and a new lecture manager, which gives him control over
the timing of his delivery during talks. It also has a mute button,
a curious feature that allows Hawking to turn off his speech
synthesiser. "Because he operates his switch with his cheek, if
he's eating or travelling, he creates random output," says Wood.
"But there are times when he does like to come up with random
speech. He does it all the time and sometimes it's totally
inappropriate. I remember once he randomly typed 'x x x x', which,
via his speech synthesiser, sounded like 'sex sex sex
sex'."

Wood's office is next to Hawking's. It's more of a
workshop than a study. 
One wall is heaped with
electronic hardware and experimental 
prototypes.
Mounted on the desk is a camera, part of an ongoing project with
Intel. "The idea is to have a camera pointed at Stephen's face to
pick up not just his cheek movements but other facial movements,"
says Wood. "He could move his jaw sideways, up and down, and drive
a mouse and even potentially drive his wheelchair. These are cool
ideas but they won't be coming to completion any time
soon."

Another experimental project, suggested by the manufacturers of
Hawking's wheelchair earlier this year, is a joystick that attaches
to Hawking's chin and allows him to navigate his wheelchair
independently. "It's something that Stephen is very keen on," says
Wood. "The issue was the contact between Stephen's chin and the
joystick. Because he doesn't have neck movement it is difficult to
engage and disengage the joystick." Wood shows WIRED a video of a
recent test trial of this system. In it, you can see Hawking
driving his wheelchair across an empty room, in fits and starts.
"As you can see, he managed to drive it," says Wood. "Well, sort
of."

Wood shows WIRED a little grey box, which contains the only
copy of Hawking's speech synthesiser. It's a CallText 5010, a model
given to Hawking in 1988 when he visited the company that
manufactured it, Speech Plus. The card inside the synthesiser
contains a processor that turns text into speech, a device that was
also used for automated telephone answering systems in the
80s. 

"I'm trying to make a software version of Stephen's voice so
that we don't have to rely on these old hardware cards," says Wood.
To do that, he had to track down the original Speech Plus team. In
1990, Speech Plus was sold to Centigram Communications. Centigram
was acquired by Lernout and Hauspie Speech Products, which was
acquired by ScanSoft in 2001. ScanSoft was bought by Nuance
Communications, a multinational with 35 offices and 1,200
employees. Wood contacted the company. "They had software with Stephen's
voice from 1986," says Wood. "It looks like we may have found it on
a backup tape at Nuance."

Hawking is very attached to his voice: in 1988, when Speech Plus
gave him the new synthesiser, the voice was different, so he asked
them to replace it with the original. His voice had been created in
the early 80s by MIT engineer Dennis Klatt, a pioneer of
text-to-speech algorithms. He invented the DECtalk, one of the
first devices to translate text into speech. He initially made
three voices, from recordings of his wife, daughter and himself.
The female's voice was called "Beautiful Betty", the child's "Kit
the Kid", and the male voice, based on his own, "Perfect Paul".
"Perfect Paul" is Hawking's voice.

Source article: http://www.wired.co.uk/magazine/archive/2015/01/features/giving-hawking-a-voice