iPhone 4S & Siri Personal Assistant: What’s in it for Speech Therapists and People with Disabilities?
Have you ever had trouble enjoying a day away from home? The date was October 14th; my heart ached with longing for home, and this time it wasn’t because I missed my husband or my dogs. It was because I was far away in California while my brand new iPhone 4S sat patiently in Texas, awaiting its techie mother.
When the grueling heartache of the 14th was finally over and the promise of seeing my new iPhone on the 15th was a tangible dream, I rushed home from the airport to find my beautiful, sleek, elegant iPhone 4S sitting on my table, begging me to try out all of its new functions.
Some critics have been leery of the new iPhone being called the iPhone 4S rather than the speculated “iPhone 5,” but the reality is that I do not care what name it was given, because it is a huge upgrade from my previous phone, the iPhone 4: it is faster, and it comes with a personal assistant! As Shakespeare put it, “a rose by any other name would smell as sweet.” I won’t cry about nomenclature decisions when I have a handful of awesomeness at my fingertips, and that awesomeness starts with Siri.
Siri is the name Apple gave to the voice-activated personal assistant on the iPhone 4S; I named mine Jane. For those of you who do not own an iPhone 4S yet: Siri lets you dictate almost anything, and it will do its own research to get you answers. You speak what you want, and Siri transforms your speech into text. Siri is quite impressive, and I can only imagine where this technology is going and all of its future possibilities.
You can watch Apple’s video (in which they show a person with a visual impairment using Siri).
I was wondering, however, how “Jane” (a.k.a. Siri) would respond to people with speech disabilities, such as individuals who stutter or who have cerebral palsy or articulation delays. I decided to test out Siri, and here are my results:
Siri and foreign accents:
I am Brazilian, and I learned English six years ago, so my Portuguese accent is still here, and I don’t think it is going anywhere. Testing Siri with a foreign accent was no obstacle for me! 😉 I have to say I am quite impressed with Siri’s ability to understand my speech (almost as well as my husband does). Siri had an accuracy rate of about 97% with my speech. Impressive! It had the most trouble when I spoke specific proper nouns such as street names and people’s names.
Faking accents: I am also really good at imitating other accents, especially accents that are much more marked than mine. Again, I am impressed! I dictated a complex sentence, and Siri was about 80% accurate. The major issue seems to be vowel recognition, which often turns a word into something completely different.
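For readers curious how figures like 97% or 80% are typically arrived at: word-level accuracy is usually 1 minus the word error rate, i.e., the word edit distance between what you meant to say and what was transcribed, divided by the number of words you meant to say. Here is a minimal Python sketch of that calculation (my own illustration, not anything built into Siri):

```python
def word_accuracy(reference, hypothesis):
    """Word-level accuracy = 1 - WER, where WER is the word edit
    distance divided by the number of reference words."""
    ref, hyp = reference.lower().split(), hypothesis.lower().split()
    # dp[i][j]: minimum edits to turn ref[:i] into hyp[:j]
    dp = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        dp[i][0] = i
    for j in range(len(hyp) + 1):
        dp[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,         # delete a word
                           dp[i][j - 1] + 1,         # insert a word
                           dp[i - 1][j - 1] + cost)  # substitute or match
    return 1 - dp[-1][-1] / len(ref)

# What I meant vs. what got typed in my "r" test further down:
print(word_accuracy("The red rabbit went to play",
                    "The wed web it went to play"))  # 0.5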
The possibilities: I wonder whether Siri could be used for accent reduction by alerting the user when specific vowels or consonants are not pronounced as in a standard English accent, much as Rosetta Stone’s language-learning software does. This would open the door to apps that give instant speech feedback.
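A hypothetical sketch of such feedback, assuming we already have the recognizer’s transcript: compare the target phrase with what was recognized, word by word, and flag the mismatches for practice. (This is purely my own illustration with Python’s standard `difflib`, not anything Siri or Rosetta Stone actually exposes.)

```python
import difflib

def pronunciation_feedback(target, recognized):
    """Flag stretches of words where the recognized transcript
    diverged from the target phrase -- candidates for practice."""
    t, r = target.lower().split(), recognized.lower().split()
    flagged = []
    for op, i1, i2, j1, j2 in difflib.SequenceMatcher(None, t, r).get_opcodes():
        if op != "equal":  # a stretch of words that didn't match
            flagged.append((" ".join(t[i1:i2]), " ".join(r[j1:j2])))
    return flagged

# intended phrase vs. what the recognizer heard
for meant, heard in pronunciation_feedback("the red rabbit went to play",
                                           "the wed web it went to play"):
    print(f"practice: '{meant}' (heard as '{heard}')")
```

A real app would of course need the raw audio, not just the transcript, to tell a mispronunciation from a recognizer error, but the word-level comparison above is the simplest place to start.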
Siri for people with speech impairments:
I tested Siri using a variety of different types of stuttering moments. Here are the results I got from it:
Syllable repetitions: I tried “wh-wh-wh-where are you?” Interestingly, Siri completed each “wh” syllable and turned it into “what”; here is what appeared in my text: “What what what where are you.”
Word repetitions: Siri types everything you say, so if a person repeats the word three times Siri will just accept that as a meaningful repetition.
Prolongations: Siri does much better with prolongations than with syllable repetitions. I prolonged the “I” in “I love you” for three seconds, and Siri was great: it understood the message as “I love you.”
Blocks: Siri responds to blocks simply as pauses, which is great; it did not transcribe any of my attempts to imitate a block.
Interjections: I used the interjection “hum” three times in a sentence; Siri ignored two of them and substituted “him” for the third.
Siri and the “r”: Siri does NOT like the substitution of “w” for “r”; it interprets it as a completely different word. I said the phrase “The red/wed rabbit/wabbit went to play,” and here is what was typed: “The wed web it went to play.”
I tested Siri at the word level for several specific articulation/phonological errors:
Final consonant deletion: I said “helme” (for “helmet”), and Siri typed “helmet.”
Siri does much better at the phrase level than at the word level, I believe because it uses information from the following word to make sense of the phrase. For example:
I spoke “haven’t” without the “t” and got the word “Hey”; then I said “haven’t seen” without the “t,” and Siri compensated for my final consonant deletion well.
Fun with Siri: I wondered how Siri would respond to my dogs’ barking. Well, it interpreted the barking as “where where where where.” I wonder if that is what they are really saying. Maybe Siri is the new dog translator!? I can only wish and hope for that in a future iOS update.
7 thoughts on “iPhone 4S & Siri Personal Assistant: What’s in it for Speech Therapists and People with Disabilities?”
Yay! Thanks for posting this Barbara. I’ve been considering changing to an iPhone and I think I’m won over! I love the idea of Siri.
FYI – this link is not accessible on iPhone/iPad/iPod Touch due to Flash content. 🙁
I was able to access this on my iPad. If you need to view Flash try this app- Puffin
Thanks for the good info about Siri. I wonder if Siri would adapt to someone’s speech over time? So maybe once Siri started to learn that “wabbit” was always “rabbit” for its owner, it would eventually start to correct that error in the text? The possibilities seem endless!
Very interesting- thanks for sharing your experiments!
Yes, I was able to access it on my iPhone 3. And yes, I also wonder about Deb’s comment. Thank you for sharing your experiences.