
Cued Speech – Too Oral for Signers and Too Visual for Oralists?

Many know that deaf/hoh people use spoken, written, and/or signed languages, but very few are aware of cued language. There are many misconceptions about this communication mode.

Why is Cued Speech not well accepted in either signing or oral deaf communities? Since it is based on the phonemes of a spoken language, many signing Deaf people think it is biased toward spoken language. Many strict oralists dismiss any visual languages (signed or cued) – even though Cued Speech fully eliminates the guesswork in speechreading (which by itself conveys only about 30% of spoken information visually) and greatly improves pronunciation skills. Also, Cued Speech is a relatively new communication mode, created in the 1960s by Dr. Cornett – compared to signed and written languages that have been around for centuries.

So what is Cued Speech exactly? Is it meant to replace sign languages? No, cued speech and sign language are not the same. They offer different ways of accessing aural information:

  • Sign language is based on visual concepts with its own grammar different from a spoken language.
  • When using sign language interpreters, it’s a meaning-for-meaning translation from a spoken to a signed language.
  • Since sign language interpretation is meaning for meaning, some words may be lost in translation. Not all words have signs (especially advanced terminology), and one sign may mean different things, which can cause confusion in translation.
  • Cued speech is based on visual phonetics, making it easier for deaf/hoh people to learn visually what sounds are.
  • When using cued speech transliterators, it’s a verbatim transliteration of speech into visual cues. It can be easily used to relay any words, including special terminology (technical, medical, legal, etc.) and even foreign words.
  • Since Cued Speech transliteration is verbatim, it’s possible to see/hear each word said, and even foreign accents.

Cued speech uses 8 handshapes (representing consonants) in 4 locations near the mouth (representing vowels) to supplement speechreading. It can be used with any spoken language (with some minor changes in cues) and is a great tool for learning pronunciation in foreign languages and for showing foreign accents. Unlike captioning and sign language interpreting (which can only note that a speaker has a foreign accent), cued speech transliteration can show exactly how words are pronounced by speakers with foreign accents. Cued Speech has been adapted to more than 60 languages and dialects.

Here is an American English Cued Speech Chart to show what it looks like. There are also charts for other languages.

Cued Speech Chart showing 8 handshapes in 4 locations

You can learn more about Cued Speech from the National Cued Speech Association (NCSA).

Below is a video demonstrating more examples of cued speech and how it is used.

(Video – What Is Cued Speech?)
