Teaching Students in the Age of Autism

03/21/2011  |  MACLEAN GANDER, Ed. D.

Here’s an interesting mystery: between 1993 and 2003, diagnoses of autism spectrum disorder (ASD) in the United States increased by 800 percent. Since then, the number of individuals diagnosed with ASD has continued to grow by about 50 percent per year. What’s behind this epidemic?

The question is far from academic. Talk to any educator at any level, and they will tell you about the great increase in students with ASD and the challenges it has posed in providing appropriate and effective programs and support. At Landmark College, which exclusively serves students with diagnosed learning disorders, about 20 percent of students now come to us with some form of this learning difference.

ASD, sometimes known as Asperger Syndrome in its milder forms, is a condition that primarily affects social cognition and the ability to interact with others in appropriate ways. Children who have it often struggle in peer relationships and social situations, and they may experience stigma and even bullying. Adolescents and young adults with ASD often continue to struggle in school, and not simply for social reasons; they often also have difficulty adjusting to new material, integrating and analyzing information, and working with more complex concepts.

Until recently, one theory behind the rise in ASD was that a mercury-based preservative in childhood immunizations was responsible. This position was never accepted by the Centers for Disease Control and Prevention, but it gathered so much attention that rates of vaccination have decreased and some common childhood maladies are making a comeback. It now appears that at least part of the research on which that theory was based was fraudulent.

Most researchers suggest that the increase in ASD stems from changes in the 1990s in how autism was defined and in the educational services provided to children with ASD. Although severe forms of autism had been recognized since the 1940s, in 1992 the American Psychiatric Association redefined autism as a “spectrum disorder,” meaning that it was possible for someone to be “mildly autistic.” Likewise, in the 1990s, children with autism became eligible for special education services. Changes in classification criteria and social incentives have produced increases in the diagnosis of many different disorders, including a vast increase in the diagnosis of AD/HD between 1990 and 1993.

The mainstream perspective is that ASD is a genetic disorder that has always been with us, but only now is being fully recognized. This scientific consensus seems reasonable on the face of it. But the story is more complicated than that.

For one thing, while it may be true that mild versions of autism have always been around, it’s not clear whether labeling this condition as a disorder has been helpful. Most people my age can remember classmates who might have been quirky, or nerdy, or socially awkward — and who also may have grown up to be scientists, information analysts or technology whizzes. If the change is one of recognition rather than actual prevalence, then we may also want to question when it was that we began to substitute a lexicon of symptoms and disorders for one of personal characteristics and human differences.

In our experience at Landmark, we know that students with ASD often possess extraordinary gifts in certain areas, such as music, technology, or specific subject areas in which they have taken an interest. We also know that the ASD diagnosis covers a huge range of abilities, from students who are severely hampered by their difficulties in reading social cues, to students who have no obvious difficulties apart from the fact that someone has diagnosed them with the disorder.

In addition, we now know that our genetic inheritance generally works in subtle and complex ways. There is no such thing as “a gene for autism,” or for most common disorders that have a genetic component. Instead, some individuals inherit a genetic make-up that makes them vulnerable to developing certain disorders; and in most cases, the environment must also play a key role.

With autism, talk of environmental factors has generally turned toward environmental toxins — perhaps a natural focus, given how pervasive these toxins are in our environment and how well-documented their impact on health has been. There is no question that toxins shape genetic development, and that the prenatal impact of poisons and illnesses can be profound. Even if vaccinations aren’t the culprit, it is still important to ask what else within the chemical soup we all live in may be contributing to the rise of ASD.

But when we talk about the environment within a genetic context, we can’t forget that old dichotomy of nature vs. nurture. For a long time, psychologists and psychiatrists blamed nurture for cognitive disorders. In the 1950s and 1960s, both autism and schizophrenia were ascribed primarily to bad parenting, as in Bruno Bettelheim’s theory of the cold, distant mother, or the “double-bind” theory of family communication associated with Gregory Bateson and R.D. Laing.

These theories were discredited, and the paradigm shifted radically over the next decades, as twin studies and gene mapping demonstrated genetic contributions that had nothing to do with the environment. For example, if one identical twin has schizophrenia, the odds that his or her twin will share that disorder are about 50 percent, even when the twins are raised in separate families because of adoption.

Now, a new paradigm shift is underway. The idea that disorders like ASD are primarily genetic is still widely held, but genetic research over the past decade indicates that this view is too simplistic. In fact, current genetic science suggests that a variety of gene-environment interactions have to be taken into account in most cognitive and physical disorders. The emerging science of epigenetics even suggests that genetic inheritance may be changed by environmental factors in ways that may be passed down to succeeding generations.

So what does this mean for ASD? I’m not certain, but a couple of recent news stories got me thinking. Last fall, Hilary Stout wrote in The New York Times about how parents are using iPhones and similar gadgets as toys to pacify unruly toddlers. It has been a truism for a while that this is the first generation to grow up with the Internet, mobile phones, and similar technology as a natural and unavoidable part of their environment. Now it appears that environment extends to the cradle.

In another recent Times article, Stout wrote about a re-awakened interest in the concept of childhood play. The article described kindergarten environments with walls of computers and desks rather than sandboxes and toys, schools that had eliminated recess, and an approach to childhood play so constantly regimented by adults that the kinds of social skills that kids used to develop on the playground are no longer practiced. It also described a quixotic counter-movement focused on bringing play back to childhood, among other things by holding public events designed to reintroduce parents and children to childhood games like kick the can and capture the flag.

I grew up in Manhattan in the 1960s and 1970s, and by the time I was eight I was spending most of my time in the park with other kids, playing sports, or tag, or other invented games. I had my football stolen a couple of times, but I learned a lot about how to get along with people. Now, in urban areas, few parents would send their kids out alone to play anymore. In fact, parks have often been landscaped in ways that have made it impossible to put together a sandlot ballgame.

In suburban areas, play is regimented in the form of team sports, with the alternatives being lessons in dance, acting, or music, or staying at home alone. For most families everywhere, evenings are taken up mainly with ongoing interactions with technology — each family member isolated in his or her own world of Facebook, IM, texting, and e-mail, or with one of the 500 channels now available on the large-screen TV.

Social skills are not hard-wired genetically. They have to be developed. And the main way that they are acquired is through interaction with others, a process that begins in infancy and that sees its most important developmental periods in childhood and adolescence.

Perhaps it is natural that, in a world in which human interaction is mainly mediated by electronics, and in which communication takes place primarily within isolated or structured contexts, more and more of our children fail to acquire social skills and are instead labeled as having ASD.

It’s a truism that different eras have their characteristic maladies. Hysteria was an affliction of the sexually repressed Victorian age, just as the Cold War ‘50s were an Age of Anxiety, with Miltown the precursor to Valium, Prozac and Xanax.

In our current age, the messiness of play and the dangers (and rewards) of direct peer interaction in childhood have been exchanged for the clean, safe contours of adult supervision.

I don’t mean in any way to diminish the importance of understanding and treating ASD effectively — in fact, it is a major focus in my professional life. There is no question that students who have been labeled as having ASD may benefit from support for the development of social skills, especially in the transition from high school to college, and from home to the residence hall. At Landmark, we provide small support groups that focus on these kinds of social transitional issues, and we also provide individual counseling.

College students who have ASD also benefit from approaches to instruction that take into account some of the key challenges associated with learning differences. Faculty training is one key component, focused on such best teaching practices as making transitions explicit, scaffolding complex conceptual material, and providing support for moving from literal to more abstract ways of understanding and working with information. Simply being aware of the challenges is essential, as is learning to recognize signs that a student may be struggling in these ways.

Landmark is not alone in beginning to recognize and address the challenges of ASD, or in coming to understand the gifts and talents that are often associated with this diagnosis. We have also come to see that characteristics currently classified as disorders may be better understood not as flaws but as differences, part of the rainbow of diversity. It also seems clear that the “epidemic” we presently face is at least partly the outcome of significant changes in the experiences that children and young adults have growing up, from the loss of unregulated childhood play to the extraordinary presence of technology in social interchanges.

As we recognize the importance of this emerging demographic trend, perhaps we should also acknowledge that we have entered the Age of Autism. Doing so might help us lessen the stigma now attached to ASD. It might also cause us to put the BlackBerrys away once in a while and play Scrabble with our children instead.

MacLean Gander has worked with students with various forms of learning differences since 1987. He currently teaches at Landmark College and runs a private consulting service for individuals with learning challenges and organizations that seek to serve them. The views expressed here are his own. For more information call 802-387-4767 or visit www.landmark.edu.