Design and the Digital Divide. Alan F. Newell


which is very important in face-to-face communication between speaking people, was not possible. The “Talking Brooch” [Newell, A., 1974a] consisted of a small “rolling” alphanumeric display worn on the lapel and operated via a hand-held keyboard, and was designed to allow non-speaking people to maintain eye contact. I wrote a simulation of this on a PDP-12 (the successor to the PDP-8 as a laboratory computer), and my team subsequently developed a dedicated electronic version. (In the early 1980s, even “small computers” were very large.)

       Encourage serendipity.

      The Talking Brooch was a major factor in my being awarded a Winston Churchill Travel Fellowship in 1976 to visit researchers in the U.S. This was an immensely useful experience, enabling me to visit the major players in the field, and many of the people I met have remained friends and colleagues throughout my career. The Fellowship was excellent in that Fellows were chosen on the basis of their ideas and then given the freedom to plan their travels without having to check back with the Trust. For example, I was advised not to have a full diary, so that I could follow up leads that arose during the tour. Had it not been for this flexibility, I would never have met the New York-based speech pathologist Arlene Kraat. She was not on my original itinerary, but she became a very important mentor and supporter throughout my research activities in this field.

      The Travel Fellowship confirmed my view that research into, and development of, systems to assist people with disabilities was a field I would find satisfying and enjoy. The area of assisting human communication was particularly interesting and held some exciting technological challenges, but it was serendipity that led to the specific projects my team and I pursued.

      Lewis Carter-Jones, a Member of Parliament, was visiting the Department at Southampton, and I demonstrated the Talking Brooch to him. He was a friend of Jack (now Lord) Ashley, a Labour MP who had suddenly become deaf. Carter-Jones suggested that the Brooch could assist Jack in the House of Commons, where he had great difficulty in following debates, and he arranged for me to meet Jack in the House. This led to my team developing a transcription system for machine shorthand, in particular for the Palantype machine that had been developed in the UK. The system provided a verbatim transcript of speech on a display screen. It became the first computer system to be used in the Chamber of the House of Commons, and it led the field in commercially available real-time systems for stenograph transcription.

      My investigations into the needs of deaf people led me to consider television subtitling, and we investigated ways in which “closed captions” could be transmitted. This research was superseded when the UK teletext services Oracle and Ceefax were developed. We therefore refocussed our research on the characteristics required for effective captioning and on developing equipment that would enable captioners to work efficiently. Although the Independent Television authorities supported the former research, they did not see any need for new equipment. So again there was no support from the potential users of such a system, but Andrew Lambourne, a research student at the time, continued the development both as a PhD topic and as a commercial venture [Lambourne et al., 1982a]. As of 2011, he continues to run a successful company marketing this type of equipment.

       Research without stakeholder support can still be valuable.

      The Palantype and subtitling projects were brought together in our research into live subtitling. ITV supported our work, and our system was used for the Charles and Diana royal wedding, whereas the BBC supported Ernest Edmonds at Leicester University. In those early days, although subtitling for deaf people added only one third of one percent to the cost of programmes, this was deemed too high a price to pay (to assist 10% of the audience!). After much lobbying, this view changed, and a large percentage of programmes in the UK are now subtitled, including 100% of the British Broadcasting Corporation’s output, with news being subtitled mainly by stenographers.

      In 1980, I moved to the NCR Chair of Electronics and Microcomputer Systems in Dundee University’s Electrical Engineering and Electronics Department. There I founded a group investigating the uses of microcomputers, with a special interest in disabled people. This again was not a “strategic” decision by the University, whose aim was simply to expand its research and teaching in microcomputers. My research group was not a good fit with the Electrical Engineering and Electronics Department and, in 1986, the group joined Mathematics to form a Department of Mathematics and Computer Science. Later, it became a stand-alone Department of Applied Computing and subsequently the School of Computing. My move from being essentially an electrical/electronic engineering academic to head of an Applied Computing Department was not made entirely for academic reasons, but it proved to be exactly the right move from both teaching and research standpoints.

      A major thrust of the School of Computing at Dundee University is computer systems for areas of high social impact. It has four research groups: Assistive and Healthcare Technologies, Interactive Systems Design, Computational Systems, and Space Technology. There is cross-fertilization between all four groups, but the links between Assistive and Healthcare Technologies and Interactive Systems Design are particularly close. The Assistive and Healthcare Technologies group contains over 30 researchers developing computer and communication technology for older and disabled people, and it has become the largest, and one of the most influential, academic groups in the world in this field [Newell, A., 2004].

      In the 1980s and 1990s, much of the group’s research to support disabled people focussed on non-speaking people and the development of Augmentative and Alternative Communication (AAC) systems [Gregor et al., 1999]. These are computer systems that control a speech synthesiser; the best-known user of such a system is Professor Stephen Hawking of Cambridge University. My academic colleagues Adrian Pickering, John Arnott, Norman Alm, Annalu Waller and Ian Ricketts, plus a large number of Research Students, Assistants and Fellows, worked in this field. This work, which will be described in more detail in later chapters, was aimed at increasing the rate at which non-speaking people could talk, and was based on prediction and the use of conversational models. It is vital that such work be interdisciplinary, and our research group has included a wide variety of disciplines, including psychologists, speech and occupational therapists, linguists, philosophers, nurses, school teachers, and creative designers, as well as computer engineers and human-computer interface (HCI) specialists. I, and other colleagues, also have interdisciplinary academic backgrounds, which have proved particularly helpful in our research. We were concerned with providing systems that allowed non-speaking people to convey their personalities, rather than simply deliver messages, and the work included Iain Murray’s pioneering work on inserting emotion into the output of speech synthesisers.
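      To give a flavour of the kind of prediction such rate-enhancement systems rely on, the short sketch below ranks likely next words using simple bigram frequencies. It is an illustration only: the corpus, names and interface are invented for this example, and the group’s actual systems were considerably more sophisticated, drawing on conversational models as well as prediction.

```python
# Illustrative sketch only: a toy next-word predictor of the kind that
# underpins rate enhancement in AAC systems. The corpus and function
# names are invented for the example.
from collections import Counter, defaultdict

corpus = "how are you today i am fine thank you how was your day".split()

# Count, for each word, which words follow it in the corpus.
bigrams = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    bigrams[prev][nxt] += 1

def predict_next(prev_word, k=3):
    """Return up to k candidate next words, most frequent first."""
    return [word for word, _ in bigrams[prev_word.lower()].most_common(k)]

if __name__ == "__main__":
    print(predict_next("you"))  # e.g. ['today', 'how']
```

      Even a toy model like this shows why prediction helps: if the system’s first guess is usually right, a user can accept a whole word with a single action instead of keying it letter by letter.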

      Our work in the AAC field led many speech therapists internationally to regard us as mavericks, and some were strongly opposed to our ideas, but Arlene Kraat was a constant source of support for our work.

       A maverick: “a person pursuing rebellious, even potentially disruptive, policies or ideas”.

      Thanks to Lynda Booth, a special education teacher, we were the first research group to show that predictive systems can assist people with spelling and language dysfunction, and John Arnott’s research into disambiguation was one of the triggers for the development of the T9 system, which is available in most mobile phones today. Other work the group did during this period included Peter Gregor’s research into computer-supported interviewing, where we worked with child psychiatry units and collaborated with researchers in a secure mental hospital. Peter Gregor also conducted ground-breaking research into support for people with dyslexia.
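      As an illustration of what disambiguation involves (a sketch only, not Arnott’s method or the commercial T9 algorithm): each key on a nine-key phone keypad stands for several letters, so one sequence of key presses matches several possible words, and the software must pick candidates from its lexicon. The keypad mapping below follows the standard phone layout; the lexicon is invented for the example.

```python
# Illustrative sketch of keypad disambiguation in the spirit of T9-style
# predictive text. The lexicon is invented; real systems rank candidates
# by word frequency and context.
KEYPAD = {
    'a': '2', 'b': '2', 'c': '2', 'd': '3', 'e': '3', 'f': '3',
    'g': '4', 'h': '4', 'i': '4', 'j': '5', 'k': '5', 'l': '5',
    'm': '6', 'n': '6', 'o': '6', 'p': '7', 'q': '7', 'r': '7', 's': '7',
    't': '8', 'u': '8', 'v': '8', 'w': '9', 'x': '9', 'y': '9', 'z': '9',
}

LEXICON = ["good", "home", "gone", "hood", "hello", "help"]

def key_sequence(word):
    """Map a word to the digit sequence typed on a nine-key keypad."""
    return "".join(KEYPAD[ch] for ch in word.lower())

def disambiguate(digits, lexicon=LEXICON):
    """Return all lexicon words whose key sequence matches the digits."""
    return [word for word in lexicon if key_sequence(word) == digits]

if __name__ == "__main__":
    print(disambiguate("4663"))  # ['good', 'home', 'gone', 'hood']
```

      The example makes the problem clear: the four key presses 4-6-6-3 match “good”, “home”, “gone” and “hood”, so the real work lies in choosing which candidate to offer first.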

      
