Apr 03, 2014

Angela Nelson

Autism Technology

Over roughly the past two decades, with the rise in the prevalence of autism, an entire industry has grown up around treating and teaching children and adults with autism, as well as easing the challenges and improving the quality of day-to-day life for individuals on the autism spectrum. Within this growing market, the past 10 to 15 years have seen the adaptation of many new technologies to the particular needs of individuals with autism.

The purpose of this article is to examine emerging technologies that teach, support, and enhance the lives of children and adults with autism and their families. To that end, I searched academic publications as well as popular media, and I interviewed a current doctoral student at MIT who is studying autism and technology. My initial expectation was to find a sampling of apps and software addressing various symptoms of autism. I quickly discovered that the range and depth of innovative technology solutions go beyond basic educational issues and address problems I hadn't even considered, particularly in the areas of interaction, social skills, and the teaching and processing of emotions.

Overview of Autism and Technology

Recently, the use of iPad apps to bridge the communication deficits associated with autism has received significant media attention. Six months ago a simple iTunes store search for “autism apps” turned up 243 results. That same search this morning returned 1,054 apps. Apps, however, are just one example of the range of technological advances being applied to autism.

A survey of the scientific literature revealed emerging technologies being applied to extremely broad areas of autism education, including diagnosis and assessment (Xu, 2009), communication skills (Bereznak, 2012), affect and emotion (Picard, 2009), social skills (Robins, 2006; Billard, 2007), visual attention and perception (Klin, 2002; Boraston, 2007; Sasson, 2008), and basic academic skills (Moore, 2000; Passerino, 2008; Pinnington, 2010). A great deal of the work that has been done in autism education, both technology and non-technology based, has focused on therapeutic interventions to make children with autism “normal,” or on measures to minimize the disruption created by autistic behaviors that are perceived as maladaptive. If emerging technologies meet either of these ends, there is certainly a high level of value. An intriguing further possibility, however, is that emerging technologies could capitalize on the strengths and differences that may be associated with autism spectrum “disorders.” If emerging technologies can help individuals with autism lead more independent and productive lives, and bridge the social and communication gaps that separate them from others, then there is something important to be gained.

In this article, I will focus primarily on emerging technology as it applies to basic therapy; social skills, affect, and emotion processing; communication; and managing problem behaviors. I will conclude with a discussion of some of the factors that previously limited technology development in this industry, and explain why that tide is turning to open the door for future innovation.

Transformative Technology for Autism and Interpersonal Skills

iPad for Autism Therapy

It shocks most people to realize that the iPad was released only a few years ago, on April 3, 2010. As I mentioned above, there are already well over a thousand apps for the iPad that are designed or adapted for autism. On the surface, many of these apps are not transformative. Most are simply digital adaptations of existing curricula, or applications adapted from “clunkier” technology. What may be considered transformative, however, is the mobility that is created when you move therapeutic stimulus materials to a digital format.

Consider, for example, the Applied Behavior Analysis (ABA) style of therapy that has been popular in autism for the past several decades. ABA therapy is built on the principles of learning via positive reinforcement of correct responses to stimuli (McEachin, Smith & Lovaas, 1993). In the simplest terms, a student is asked a question, gives a response, and receives either positive or negative reinforcement depending on the accuracy of that response. The unique aspect of ABA therapy as applied to children with autism is its intensity. Students generally sit across a small table, one-on-one with a therapist, who will drill them on the same question, sometimes hundreds of times, to shape the correct response. In this fast-moving, intense environment, therapists need easy access to stimulus materials they can use conveniently in the small table learning space. Generally the therapists use photographic images. For example, a therapist will lay out three pictures of a car, a dog, and a spoon, and then ask the student to “find the spoon.” Until recently, plain printed flashcards served this purpose. Stages Learning Materials publishes the leading set of flashcards designed specifically for this purpose, the Language Builder Picture Cards. Now, however, the pictures can be presented electronically via iPad, iPhone, or any tablet-style computer. Stages Learning Materials recently released the Language Builder app to present ABA therapy on the iPad.

An ABA therapy app does not on its surface present revolutionary content or pedagogy. It offers the same lessons taught in digital form rather than with paper flash cards. However, the portability of mobile devices significantly expands the scope, pace and environment of the learning opportunities. Therapies previously limited to sitting at a table can now be carried out anywhere. Mobility and a less structured setting have the potential to enhance generalization and facilitate transfer.
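The discrete-trial loop described above is simple enough to sketch in a few lines of Python. The card names, feedback phrases, and scoring below are hypothetical illustrations, not taken from any real app:

```python
import random

def run_trial(target, distractors, learner_choice):
    """Present a field of cards and score the learner's response."""
    # A real app would display these shuffled cards on screen.
    field = [target] + list(distractors)
    random.shuffle(field)
    correct = (learner_choice == target)
    feedback = "Great job!" if correct else f"Try again: find the {target}."
    return correct, feedback

def run_session(trials):
    """Tally accuracy over a block of trials, as a therapist would."""
    correct_count = sum(
        1 for target, distractors, choice in trials
        if run_trial(target, distractors, choice)[0]
    )
    return correct_count / len(trials)

# Example: three "find the spoon" style drills, two answered correctly.
trials = [
    ("spoon", ["car", "dog"], "spoon"),
    ("spoon", ["car", "dog"], "dog"),
    ("spoon", ["car", "dog"], "spoon"),
]
print(run_session(trials))
```

The loop is trivially the same whether the stimuli are paper cards or screen images, which is exactly the point: the pedagogy carries over unchanged, and the tablet adds portability and automatic data collection.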

Communication Skills

As described above, autism is a complex syndrome that presents with a wide range of symptoms. Some individuals whom we consider higher functioning, while they may face various social challenges, have typical verbal communication skills. Lower-functioning individuals, however, may have limited or no speech. As such, they are unable to communicate even their most basic needs, to say nothing of their higher-level ideas.

Before the proliferation of mobile devices, creative parents and therapists came up with ways to help non-verbal individuals on the autism spectrum communicate. The most basic tool was a collection of important pictures (food, bathroom, emergency, etc.) placed on a ring that the individual could carry around or even attach to a belt loop. Communication was accomplished by selecting a representative picture from the ring and showing it to someone who could help get the need met.

In the early 1990s a few companies created very rudimentary digital devices that had buttons with pictures on them (DynaVox Mayer-Johnson, 2012). An individual could push a button and the device would say the corresponding word, in effect speaking for the non-verbal person. But these communication devices were static and not adaptable. They were limited by the number of buttons you could fit on a device small enough to carry around (Gea-Megias, 2004).

One of the most valuable current uses of mobile devices for individuals with autism is as an assistive communication device. With iPhones, iPads, and tablet computers, you are not limited by the number of words you can include. The device can store an entire language, and can be customized to meet the needs of, and to grow with, the individual using it. In the speech-language field these devices are referred to as AAC, or Augmentative and Alternative Communication, systems.
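The core data structure behind such a system is simple: a customizable mapping from picture symbols to spoken phrases. Here is a minimal Python sketch of that idea; all names are hypothetical, and a real device would hand the phrase to a text-to-speech engine rather than just returning it:

```python
class SymbolBoard:
    """A toy model of an AAC symbol board: pictures mapped to utterances."""

    def __init__(self):
        self.symbols = {}

    def add_symbol(self, picture, phrase):
        """Customize the board by pairing a picture with a spoken phrase.

        Unlike a fixed-button device, the vocabulary can grow without limit.
        """
        self.symbols[picture] = phrase

    def tap(self, picture):
        """Return the phrase the device would speak for a tapped picture."""
        # A real AAC app would pass this string to a text-to-speech engine.
        return self.symbols.get(picture, "")

board = SymbolBoard()
board.add_symbol("cup.png", "I would like a drink, please.")
board.add_symbol("bathroom.png", "I need to use the bathroom.")
print(board.tap("cup.png"))
```

Because the vocabulary is just data, adding a word is a configuration change rather than a hardware change, which is precisely the advantage tablets hold over the fixed-button devices of the 1990s.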

Critics of AAC worry that by giving non-verbal individuals an electronic device for communication, we will demotivate them from actually learning to speak. However, research has shown that for some individuals, use of an AAC device actually helps to enhance their verbal learning (Schlosser, 2007; Kagohara, 2012).

Affect, Emotion & Social Skills

Individuals with autism experience challenges with affect and emotion primarily in two ways: 1) not being able to correctly discern and draw meaning from the emotional states of the people with whom they interact, and 2) not being able to communicate or react appropriately to their own emotional states. These deficits are major obstacles to both education and social development. Any technology that could help individuals with autism process their own emotions or the emotions of others would increase their opportunities for education, occupation, relationships, and quality of life. I will describe three technologies: the Self-Cam, social signal processing, and glasses that provide digital feedback to the wearer. Then I will summarize how these three technologies, taken together, could offer a transformative tool for individuals with autism by assisting their interpretation of social interactions.

Self-Cam: Researchers in the Affective Computing group at MIT adapted wearable camera technology to promote social-emotional intelligence skills and communication for individuals with autism. By recording their daily interpersonal exchanges for later review, the wearer can better understand the perspective of the people they interact with and gain insight into their own communication skills. The system allows the individual with autism or Asperger’s to study the nonverbal cues that make up typical interpersonal interactions.
A pilot study of the first prototype is currently underway at the Groden Center in Providence, RI, to investigate how the wearable system compares to other types of autism intervention (Goodwin, 2012). Ten young adults with autism or Asperger Syndrome will wear the Self-Cam three times a week for 10 weeks. They will then use a computerized system that analyzes the facial expressions of the people with whom they interacted. The results will be compared to those of a matched group using a currently available DVD program that teaches emotions. The goal is to discern whether access to a “replay” of the day’s interactions gives individuals with autism or Asperger Syndrome a chance to study and better understand interpersonal interactions in a safe, non-real-time setting.
When wearable cameras were first conceived they were bulky and unsightly; now, however, with cameras getting much smaller, the devices can be built into jewelry or sewn into clothing (El Kaliouby, 2006). In this way, the technology is less of a distraction and can be integrated into the wearer’s everyday life.


MindReader API: Most of us unconsciously process subtle changes in the eyes and mouth, movements of the head, voice intonation, and the tightening of various facial muscles, and work them into our interpersonal interactions (Mehrabian, 1968). Individuals with autism or Asperger Syndrome have a more difficult time discerning emotion from facial expressions or other cues; the more complex the emotion being expressed, the greater the impairment in interpretation (Baron-Cohen, 1997). Considerable research is being dedicated to advancing the capability of computers to interpret expressions of human emotion (Sandbach, 2012; Lin, 2009; Robinson, 2009; el Kaliouby, 2005; el Kaliouby, 2004). The goal is that by drawing on the fields of machine learning, pattern recognition, signal processing, machine vision, cognitive science, and psychology, and using feedback on facial expressions, nonverbal aspects of speech, posture, and gestures, computers can learn to interpret, with a high degree of accuracy, the social signals in a human interaction (Robinson, 2009; Brunet, 2012).

MindReader API builds on a program that Rana el Kaliouby (2005) developed as part of her doctoral dissertation. El Kaliouby built a digital model for “mind-reading.” The goal was to create a system that could interpret more complex mental states than simple emotions, such as “agreeing,” “concentrating,” “disagreeing,” “interested,” “unsure,” or “thinking.” To break down the complexity of facial expression into manageable information the computer could interpret, the design focuses on 24 “landmarks” on the face, then examines how each of those landmarks changes shape when we display a particular emotion. The computational model was able to read these mental states with an average of 77.4% accuracy, and a top accuracy of 88.9%. Integrated with video monitoring systems, this type of software could provide valuable real-time feedback to individuals with autism or Asperger Syndrome to help them manage their day-to-day personal interactions.
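To make the landmark idea concrete, here is a minimal Python sketch of the core approach: reduce a face to a vector of landmark measurements and label it with the nearest known expression prototype. El Kaliouby’s actual system used a far more sophisticated model trained on video; the prototypes, numbers, and nearest-match rule below are invented purely for illustration.

```python
import math

# Hypothetical prototypes: each mental state is summarized by the typical
# displacement of a few (of 24) facial landmarks. Values are made up.
PROTOTYPES = {
    "concentrating": [0.1, -0.3, 0.0, 0.2],
    "interested":    [0.5,  0.4, 0.1, 0.0],
    "unsure":        [-0.2, 0.1, 0.6, -0.1],
}

def classify(landmarks):
    """Label a landmark-displacement vector by its nearest prototype."""
    def dist(a, b):
        # Plain Euclidean distance between two feature vectors.
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(PROTOTYPES, key=lambda state: dist(landmarks, PROTOTYPES[state]))

print(classify([0.45, 0.35, 0.05, 0.05]))  # nearest prototype: "interested"
```

Even this toy version shows why accuracy degrades for complex mental states: the prototypes sit closer together in feature space than the caricatured basic emotions do, so small measurement noise can flip the label.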


Augmented Reality Glasses: Microsoft was recently granted a patent for augmented reality glasses that would be able to overlay real-world vision with digital information (Oremus, 2012), and of course we are all becoming familiar with Google Glass. Uses for this technology could include seeing sports statistics while watching a live game, receiving GPS directions with step-by-step guidance to a desired location, song lyrics displayed to go along with the music you are listening to, nutrition content for the food you are eating, or real-time information on consumer products while you shop; really, the sky’s the limit if you let your imagination go. The wearer of augmented reality glasses could obtain desired information on anything the eye perceives. Even your cell phone could move out of your pocket and sit perched on your nose. The patent provides two illustrations. The first shows a picture of the glasses, detailing features like the camera, microphone, speakers, viewing screen/lenses, and power source. The second gives a visual representation of how information might look to the wearer while watching a baseball game. When combined with the previously described innovations, augmented reality glasses could serve as the camera and feedback device to provide information on social interactions.


Synthesis of Products for Affect, Emotion & Social Skills: By incorporating a wearable camera into augmented reality glasses and interfacing with a digital system that interprets social signals, you could create a real-time assistive technology for individuals with autism or Asperger Syndrome that detects and delivers digital information on interactions and helps them process the emotional reactions of others. Incorporating this technology into a fashion accessory would eliminate any social stigma that might be caused by a bulky camera or handheld digital readout. If some of the predictions about augmented reality glasses come true, and the glasses actually serve the function of your cell phone, it would be as easy as downloading an “Emotion Reader” app. In fact, this technology might benefit all of us from time to time!

 

Managing Problem Behaviors

Among the most difficult challenges faced by individuals with autism and their families is the propensity for seemingly unprovoked outbursts of aggression or self-injury. Families and caregivers often report that the individual with autism showed no outward signs of growing discomfort and that the outburst seemed to come out of nowhere. Researchers have found, however, that even in the absence of visible signs of distress, the individual may have an unusually high heart rate or show other non-visible signs of sympathetic arousal (Goodwin et al., 2006). The stress that has been building on the inside may eventually culminate in an outburst that, while seemingly without precipitation, has in fact been building for some time.

Rosalind Picard, MIT Media Lab researcher, faculty member, and founder and director of the Affective Computing Research Group, has pioneered a device that can be worn inconspicuously on the wrist and measures changes in skin conductivity, which often correspond with emotional arousal (Picard, 2006). Ideally, such a device could monitor physiological changes in an individual with autism and alert parents and caregivers to potential behavioral problems so they could change the individual’s environment or otherwise intervene.

During an interview with Micah Eckhardt, a PhD candidate at MIT, I learned that while a wearable “mood sensor” for autism would be a significant benefit to the quality of life of individuals and their families, the device as it stands is not quite sufficient (Eckhardt, Micah, personal communication, October 2, 2012). The measurements of skin conductivity associated with a particular emotion have not proven consistent from one individual to the next, and they have even varied significantly for the same individual from one day to the next. Further, the data do not reveal the quality of the emotion being experienced: it is impossible to tell whether the wearer is happy, sad, or has just stepped into a warmer room. Picard (2006) suggests that with more data we may be able to tease out finer distinctions. I believe a better route would be to investigate ways to incorporate additional sensors into the device that could test pH levels, heart rate, blood pressure, or other biological measures, which, interpreted together, would offer more reliable or predictive information. Clearly this will take significantly more interdisciplinary research.
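The multi-sensor idea suggested above can be sketched in a few lines of Python. Everything here is invented for illustration: the signals, the baselines, and the alert threshold are hypothetical, and each reading is scored against the wearer’s own baseline, which is one plausible way to handle the individual-to-individual inconsistency problem.

```python
def zscore(value, baseline_mean, baseline_std):
    """How far a reading sits from this individual's own baseline."""
    return (value - baseline_mean) / baseline_std

def arousal_index(readings, baselines):
    """Average the per-signal z-scores into one combined index."""
    scores = [zscore(readings[name], *baselines[name]) for name in readings]
    return sum(scores) / len(scores)

def should_alert(readings, baselines, threshold=2.0):
    """Flag combined multi-signal deviation, not any single spike."""
    return arousal_index(readings, baselines) >= threshold

# Per-individual baselines: (mean, standard deviation) for each signal.
baselines = {"skin_conductance": (2.0, 0.5), "heart_rate": (80, 8)}
calm = {"skin_conductance": 2.2, "heart_rate": 84}
agitated = {"skin_conductance": 4.0, "heart_rate": 110}
print(should_alert(calm, baselines), should_alert(agitated, baselines))
```

Requiring several signals to deviate together is one way to reduce false alarms from a single confound, such as the warmer-room problem Eckhardt described; deciding the right signals, weights, and threshold is exactly the interdisciplinary research question.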

Concluding Remarks

My personal experience with this industry includes 15 years running an educational publishing company, Stages Learning Materials, focused primarily on supplemental school supplies and therapy products for autism education. Last year I added a software program to my product offering. The program takes images from one of my most popular products and digitizes them for the PC and Mac platforms. And, as I described above, I am currently producing an app that takes the same basic design, which teaches basic vocabulary and categorization skills, and adapts it to the iPad and Android tablets.

I have been attending conferences in the autism industry for two decades. During that time I have seen amazing growth in the variety and quality of tools available for individuals with autism and their families. However, over the past five years the number and variety of technology companies entering the market has skyrocketed.

In previous years, I would have said that the barriers to innovation in this industry were a lack of information and understanding surrounding autism and Asperger Syndrome, and the perception of a small market. At this point, neither barrier still exists. Psychology, psychiatry, neurology, and general medicine are dedicating significant time and money to autism research, and more is understood every day. And, as mentioned at the outset of this article, the number of individuals with a diagnosis on the autism spectrum is now extremely high. Further, a cascade of state governments has enacted insurance laws mandating coverage for autism treatment and education. As things stand now, autism is a huge market with substantial funding. Research, innovation, and products will follow.

Bibliography

American Psychiatric Association. (2000). Diagnostic and statistical manual of mental disorders: DSM-IV-TR (4th ed., text rev.). Washington, DC: American Psychiatric Association.

Baron-Cohen, S., Wheelwright, S., & Jolliffe, T. (1997). Is there a “language of the eyes”? Evidence from normal adults, and adults with autism or Asperger syndrome. Visual Cognition, 4(3), 311-331. doi: 10.1080/713756761

Bereznak, S., Ayres, K. M., Mechling, L. C., & Alexander, J. L. (2012). Video self-prompting and mobile technology to increase daily living and vocational independence for students with autism spectrum disorders. Journal of Developmental and Physical Disabilities, 24(3) doi: 10.1007/s10882-012-9270-8

Billard, A., Robins, B., Nadel, J., & Dautenhahn, K. (2007). Building robota, a mini-humanoid robot for the rehabilitation of children with autism. Assistive Technology, 19(1), 37-49.

Boraston, Z., & Blakemore, S. (2007). The application of eye-tracking technology in the study of autism. Journal of Physiology-London, 581(3) doi: 10.1113/jphysiol.2007.133587

Brunet, P. M., & Cowie, R. (2012). Towards a conceptual framework of research on social signal processing. Journal on Multimodal User Interfaces, 6(3-4) doi: 10.1007/s12193-012-0092-x

Cavagnaro, A. T. (2007). Autistic spectrum disorders: Changes in the California caseload, an update: June 1987-June 2007. Sacramento, CA: California Health and Human Services Agency.

Centers for Disease Control and Prevention. (2008). Prevalence of autism spectrum disorders — autism and developmental disabilities monitoring network, 14 sites. (Morbidity and Mortality Weekly Report No. SS-03). United States: Centers for Disease Control and Prevention.

DynaVox Mayer-Johnson. (2012). The history of DynaVox mayer-johnson. Retrieved 12/17, 2012, from http://ir.dynavoxtech.com/history.cfm

el Kaliouby, R., & Robinson, P. (2004). Mind reading machines: Automated inference of cognitive mental states from video. Proceedings of the 2004 IEEE International Conference on Systems, Man and Cybernetics, 1, 682-688.

el Kaliouby, R., & Robinson, P. (2005). Generalization of a vision-based computational model of mind-reading. Affective Computing and Intelligent Interaction, Proceedings, 3784

el Kaliouby, R., Picard, R., & Baron-Cohen, S. (2006). Affective computing and autism. Progress in Convergence: Technologies for Human Wellbeing, 1093, 228-248. doi: 10.1196/annals.1382.016

Gea-Megias, M., Medina-Medina, N., Rodriguez-Almendros, M. L., & Rodriguez-Fortiz, M. J. (2004). Sc@ut: Platform for communication in ubiquitous and adaptive environments applied for children with autism. User-Centered Interaction Paradigms for Universal Access in the Information Society, 3196

Goodwin, M. S., Groden, J., Velicer, W. F., Lipsitt, L. P., Baron, M. G., Hofmann, S. G., & Groden, G. (2006). Cardiovascular arousal in individuals with autism. Focus on Autism and Other Developmental Disabilities, 21(2), 100-123.

Goodwin, M., Picard, R., el Kaliouby, R., Teeters, A. & Groden, J. (2012). Wearable platform to foster learning of natural facial expressions and emotions in high-functioning autism and asperger syndrome. Retrieved 12/18, 2012, from http://grodennetwork.org/about/research-presentations.asp#top

Kagohara, D. M., van der Meer, L., Achmadi, D., Green, V. A., O'Reilly, M. F., Lancioni, G. E., . . . Sigafoos, J. (2012). Teaching picture naming to two adolescents with autism spectrum disorders using systematic instruction and speech-generating devices. Research in Autism Spectrum Disorders, 6(3) doi: 10.1016/j.rasd.2012.04.001

Klin, A., Jones, W., Schultz, R., Volkmar, F., & Cohen, D. (2002). Visual fixation patterns during viewing of naturalistic social situations as predictors of social competence in individuals with autism. Archives of General Psychiatry, 59(9) doi: 10.1001/archpsyc.59.9.809

Lin, D., & Pan, D. (2009). Integrating a mixed-feature model and multiclass support vector machine for facial expression recognition. Integrated Computer-Aided Engineering, 16(1), 61-74.

McEachin, J., Smith, T., & Lovaas, O. (1993). Long-term outcome for children with autism who received early intensive behavioral treatment. American Journal on Mental Retardation, 97(4), 359-372.

Mehrabian, A. (1968). Communication without words. Psychology Today, 2(4)

Moore, M., & Calvert, S. (2000). Brief report: Vocabulary acquisition for children with autism: Teacher or computer instruction. Journal of Autism and Developmental Disorders, 30(4) doi: 10.1023/A:1005535602064

Oremus, W. (2012). Forget the smartphone wars, here come the augmented-reality glasses wars. Retrieved 12/19, 2012, from http://www.slate.com/blogs/future_tense/2012/11/28/microsoft_augmented_reality_glasses_patent_rival_to_apple_google_glass.html

Passerino, L. M., & Santarosa, L. A. C. (2008). Autism and digital learning environments: Processes of interaction and mediation. Computers & Education, 51(1) doi: 10.1016/j.compedu.2007.05.015

Picard, R. W. (2009). Future affective technology for autism and emotion communication. Philosophical Transactions of the Royal Society B-Biological Sciences, 364(1535), 3575-3584. doi: 10.1098/rstb.2009.0143

Picard, R. W. (2010). Emotion research by the people, for the people. Emotion Review, 2(3) doi: 10.1177/1754073910364256

Robins, B., Dautenhahn, K., & Dubowski, J. (2006). Does appearance matter in the interaction of children with autism with a humanoid robot? Interaction Studies, 7(3), 479-512.

Robinson, P., & el Kaliouby, R. (2009). Computation of emotions in man and machines. Philosophical Transactions of the Royal Society B-Biological Sciences, 364(1535) doi: 10.1098/rstb.2009.0198

Sandbach, G., Zafeiriou, S., Pantic, M., & Yin, L. (2012). Static and dynamic 3D facial expression recognition: A comprehensive survey. Image and Vision Computing, 30(10) doi: 10.1016/j.imavis.2012.06.005

Sasson, N. J., Turner-Brown, L. M., Holtzclaw, T. N., Lam, K. S. L., & Bodfish, J. W. (2008). Children with autism demonstrate circumscribed attention during passive viewing of complex social and nonsocial picture arrays. Autism Research, 1(1), 31-42. doi: 10.1002/aur.4

Schlosser, R. W., Sigafoos, J., Luiselli, J. K., Angermeier, K., Harasymowyz, U., Schooley, K., & Belfiore, P. J. (2007). Effects of synthetic speech output on requesting and natural speech production in children with autism: A preliminary study. Research in Autism Spectrum Disorders, 1(2), 139-163. doi: 10.1016/j.rasd.2006.10.001

Xu, D., Gilkerson, J., Richards, J., Yapanel, U., & Gray, S. (2009). Child vocalization composition as discriminant information for automatic autism detection. Proceedings of the Annual International Conference of the IEEE Engineering in Medicine and Biology Society, 2009.

 

