Field of Science

Offense taken: autism, emotion, and packs of neurons

Cupcakes are grand, but they are unlikely to drive significant social change.
Via Wikimedia Commons.
I edit scientific papers for a living, some of which overlap with the social sciences and the humanities. Occasionally, I come across quotes that strike at the beating heart of an embodied issue in my life: Autism. I call it "embodied" because in our family, autism isn't something separate from the person who has it. That's not just a wacky, "neurotypical" philosophical stance. It's scientific. Autism isn't an overlay on an individual. If we are a pack of neurons, as Francis Crick so famously described it, then for autistic people, their pack weaves for them the behaviors that add up to what we call autism. They are Autism, and Autism is who they are.

At this intersection of philosophy--What am I? Who am I?--and science--I am a pack of neurons that takes in stimuli, processes and integrates them, and issues responses that characterize my entire individuality--sits the nebulous human construct we call "emotion." As much as we consider ourselves to be "rational," the fact is that the influence of hormones and other chemical messengers on our pack of neurons modifies and modulates its intakes and outputs in ways that layer over any basic rational framework we may have.

Sure, we can compute. That computes. But those signals that modify inputs have undergone millennia of evolutionary shaping to facilitate survival. And you know what? Evolution doesn't give a rap whether what keeps you alive makes "rational" sense. If imagining that unicorns exist keeps someone getting out of bed every morning and--more important from an evolutionary perspective--perky enough to get back into bed and have sex and reproduce, then evolution doesn't care whether unicorns actually exist.

So emotions, with their hazy and vaguely threatening chemical underpinnings and expressions, have a purpose for us. They are a response, a communication, the walloping limbic core in all of us around which we try to wrap the rational bits we're so proud of. Of course, emotion overwhelms almost everything we do, and it's frankly irrational to think that somehow, being rational is going to get the upper hand.

We communicate and respond with emotions, and thanks to FOXP2 and possibly some other genes, we also designate these feelings using spoken and written symbols called "words." Words, of course, are not only symbols for emotion. They evoke things tangible--like trees or rocks or platypi--and intangible--like love, hate, and offense. Because they carry the heavy load of that walloping limbic core that rules us all, whether we admit it or not, words--words, words, words--can land with a crushing force when we don't use them carefully, especially when they evoke or target emotions. Interesting, is it not, that these symbols, concocted themselves from subsymbols on the page, can land with the softest brush of a feather or the hard blow of a hammer and either way, elicit powerful responses that well up through every human physiological pathway into the inchoate manifestations we call "emotions."

Their purpose, these emotions and these words that symbolize them, is multifold. One of those purposes, as I gleaned recently from a paper I was editing, one that intersected philosophy and neuroscience, is that in human society, emotion is a glue. We group together not because of rational processes and agreements but because of shared, deep emotions. How many times have you watched a conflict online only to find that One Voice that expresses an emotion just like your own, a mutual feeling that sometimes even leads to friendship and continued association? How many first dates end just because one party says, "Oh, I love XYZ" and the other party feels exactly the opposite?

But really, how rational is it to end a potential relationship or start a new one simply because you have the same or opposite reaction--that we call emotion--to the same stimuli? But, of course, that's how it works. When you feel a strong emotion and find others who share it with you, the result is a bond, like a couple of shared electrons filling the same need for one another. When others don't share that emotion with you or purposely make clear that they think it's foolish, the outcome is a strong negative response, one we call "offense." The result is decreasing overlaps in the emotional Venn diagrams of our commonalities and differences. Eventually, the overlap may disappear completely so that we become isolated groups, or factions.

I think that most humans probably understand this relationship between offense and expressing disdain or scorning a deeply held emotional belief. It's one reason that so often, when we know we're about to offend, we precede our offense by saying, "No offense." Why say it unless you're aware that you're about to disparage or dismiss a deeply held emotion that the other person feels?

A way around committing this offense is, of course, to try to respect the other person's viewpoint or emotion. To engage in a little perspective taking: put yourself in their shoes, imagine being them, with their ears, eyes, and pack of neurons. To try to absorb what their lives have been like--what words they've heard that have crushed them, what looks they've received that have suffocated them, what abuse they've experienced for not being "normal," what others have done to them to tap their very spirit dry. If you've done that, then you don't need to predicate anything you have to say with "no offense" because you will know whether saying it will offend. And if it will, then do not say it.

Why am I writing this? Because our family's packs of neurons don't translate into what society considers "typical" packs. We've all struggled. There have been bullies who have--generally, at least, without the intellectually dishonest prelude of "no offense"--left scars both literal and figurative. There is society, the one that thinks my son's Asperger's is a copout for us, his parents, because he was always "picked last for dodgeball" (in our case, presumably, when he was three). There are the very able people out there who don't understand what it means to go through life with daily, incessant, offensive messages from almost every human being they encounter that they suck or can suck it for who they are, for how they are, for what they are.

Because autism, you see, isn't a costume. It isn't a tumor growing inside a person or a malformation that threatens a life. Science says it's a certain kind of pack or packs of neurons, the very core of a human being.

If you have not lived a life like that, one that has been bereft of an emotional glue that groups you with others who feel as you do--then you are privileged indeed. If you are privileged, your humanity demands that you listen to those who are not. When they tell you that words offend or that actions offend--that these things cause them pain--as someone who had the privilege of not living their lives, your humanity obligates you to harken to what they are saying. It obliges you not to demand that they not be offended. I am obligated to listen, to understand, and to act because if I do not, my son will grow into a society that thinks it's OK to belittle him for who he is. Into a world that thinks his autism is somehow a separate skin from his pack of neurons, and that this conception of it makes it OK to mock and deride autism. Him.

Some people don't like conflict. It makes them want to duck away somewhere, out of the volley of words because, yes, words injure and wound and mar. But as I tweeted the other day, big social change doesn't happen over tea and cupcakes and a bit of civil conversation. It's messy, it's emotional, it's painful, and it requires both calm-headed people and fanatics to shift the Overton window of what's within the bounds of social acceptance, of being human.

As much as my emotions overwhelm me, as much as I do take offense that is often intended, I will not shun the conflict. I engage with it, fully. I will always try to put myself in the other person's shoes, taking their perspective and working as hard as I can to be respectful, to not deride, to not use my own emotions against them. Even as I stand by what I think is right, I will work not to offend. If I'm doing my job right as a human being, I've incorporated the other person's perspective sufficiently to understand whether or not I'd offend them. Having done so, I can avoid saying something that deliberately will.

If I do offend, I will apologize. Emotions bind us together, and perhaps one of the strongest emotional glues available for any group is the feeling of resolution. Sincere apologies are the Super Glue of community building. If you bemoan the absence of a community, you might consider starting with saying, "I am sorry." Ali MacGraw had it wrong: Love actually means saying "I am sorry" pretty often, and unless you're vying for the title of Oldest Three-Year-Old, you ought to consider it.

This ability to take a perspective on someone else's life, to feel their feelings, to listen to the pain they're expressing--it's called empathy. Can other animals feel empathy? Maybe. But of all the things we, as humans, pride ourselves on--like our "rational" minds--the one thing that may distinguish our collective pack of neurons from that of most other animals is that we can articulate that understanding. The question is, Can we also be human enough not to ignore it?
-------------------------------------------------------------------------------
Happy to share this upshot of a recent contretemps online over attitudes about autism: Evolution of an apology. See the power of "I am sorry"?

So special: Visionary scientist or quack visionary?

Via Wikimedia Commons
I'm a "special needs" parent. What that means is that I've got children who fall into the category, "special needs," needs that extend beyond what people would consider typical for children. In all honesty, I've yet to meet a child or adult who doesn't have some kind of special need, but I suppose that in the aggregate, my children's needs may be more than typically special.

The word "special" is an odd one because it can mean something rather wonderful--"You're so special to me," or something rather snarky-- "Well, isn't that special," or something rather equivocal-- "special needs child." The original meaning of the word in English, it seems, was "better than ordinary," and indeed, to me, my children are better than ordinary and have led me into my life less ordinary. In the 13th century, it came into being to mean "marked off from others with some distinguishing quality." Like I said, I've yet to meet any human being who wasn't, based on that definition, special in some way.

But some of us burn to be more special than others. I yearn, for example, to be a good writer and recognized as such and to teach well. Those are, perhaps, fairly humdrum and common ambitions. Most people have some desire to be special in some way. Even Neurotribes blogger Steve Silberman, one of the best writers alive, has admitted to feeling less than special when writing about synesthesia, the crosslinking of sensory inputs in the brain that gives color to numbers or taste to sound. As Steve writes, "for a drearily mono-sensory person like me, it's tough to read these accounts without feeling a (sour-apple green) twinge of envy." The greats among us still want to feel "special," you see.

And then there are those among us who desire another form of special, the kind that approaches what Freud might have called a "complex." Within this complex may be a genuine desire to do good in the world, but that desire lies wrapped within layers of self-investment and self-perception as a crusader or seer. One of the most common expressions of that specialness I've seen is the belief that the special person has powers of insight that others do not.

People express this specialness of insight as physicians, as parents, as teachers, as therapists, as polemicists, as politicians, as religious leaders and martyrs. And when someone expresses that specialness of insight, takes it on crusade, and self-identifies as a Special One, a seer, some of us have a similar response. Because of our tendency to want that sort of magical ability in others if not in ourselves--perhaps with a twinge of that "sour-apple green" envy--many of us will follow that crusader and subscribe to what the Seer sees, even if it's really a whole lot of nothing.

Hanging onto a crusader, onto someone who claims specific insight that others don't have, can by association make the hangers-on themselves feel special, feel like visionaries who see what's better than ordinary in the Special One before anyone else does. That fervent need drives these pockets of devotees, and their need to be special--to see what others do not--feeds the growing ego of the focus of their crusade. It's a playground taunt of "I know something you don't know" writ larger and with greater effect.

This need to be special--to have insight into the intangible, to predict the future based on the present's flotsam--this need drives both the followers and the followed. Some of us are happy with what our eyes tell us, with what evidence shows, with waiting until data or verdicts are in. For others, though, that desire to be special in that visionary way brings a feeling of "better than ordinary" that satiates that very human urge.

When we face a mystery, a mystery like autism, for example, that craving to be the Special One, the one who puts a finger on the core of the mystery and exposes it, has led to some of the most wasteful, anti-science crusades in modern medical history. The movement has its visionary martyrs and its visionary followers, the people who are convinced that they see what no one else--not the legions of scientists, doctors, therapists, autistic people, or parents--has seen. If a Special One uses the slightest trace of a clue to construct a signpost, those who yearn to stand out for their special insights will march dutifully down the path it indicates. That road, unfortunately, leads to wasted dollars on a cottage industry of quackery, public health nightmares, deaths from preventable illnesses, erosion of public trust in science and medicine, and divisions within communities that, by all other measures, ought to be united.

A few weeks ago, I devised a checklist of 10 questions to ask when assessing whether or not something passes the "real science" test. I should have included in that checklist that it's important to watch out for anyone who makes unusual claims to insights that others don't have, to knowledge that only they've been able to access, to a pattern that only they've visualized because of their special powers and ways of seeing. These people are not relying on shining light on their evidence or exposing their ideas to the critical and often clarifying insights of others. They're relying on their alleged power as seers--as someone special--to sell you something.

Do real visionaries in science and medicine exist? Yes, they do. But they don't invest their vision with an infusion of how special they themselves are. They turn to the not-so-special but ever-important mechanisms for demonstrating the legitimacy of their vision, to add tangibility and weight to what their insights tell them, to open their ideas to critique instead of making fantastic, unsupported claims.

In doing that, they themselves are special. Why? Because the difference between a visionary scientist and a quack visionary is taking the focus to the evidence instead of to yourself. And that, my friends, is something special.

Men should just shove aspirin up their urethras

Aspirin goes here. 
Can I just say "thank you" to the GOP for reminding me in the last few weeks how very, very little some men think of women? I'd gotten pretty comfortable there, walking here and there with my ovaries and uterus, brazenly exposing my ankles and even sometimes my knees to the light, boldly driving around alone, my head uncovered and my torso uncorseted. Thinking, like a fool, that I am, here in 2012, a fully 100% citizen and human being in this great nation of ours, someone on par with people who have penises and testes, perhaps some hair on their chests. You know, someone whose full control over her body and her mind is never in question, whose choices about when to have sex and with whom are her own, whose choices about when to bear children and when not to are her own, whose right to have a violation of her body considered a criminal offense is a right retained.

Instead, here we are. When I look around at what's happening today, I feel that so little time has passed since I was born more than 40 years ago or, hell, since my great-grandmother was born in 1898. I could enumerate here the way assaults on the person of women, on their wombs and their vaginas and their rights and their personhood, feel here in 2012. But that's being done all over the Web as each little GOP mole pops up, says something straight out of 1850, and then steps aside for the next little mole.

I'd like to play a little game with you instead. In the spirit of adventure, let's pretend that all of the shit people have done to women in just the last week alone is instead being done to men. Here we go.

1. If a man joins the army and is raped, he should just expect that. I mean, damn. He's got an asshole, doesn't he?

2. If a man contributes to the formation of a zygote and then determines for myriad very personal reasons that having a child is not a rational decision at this time and chooses to terminate a pregnancy, he must have a six-inch wand shoved up his ass before the termination for no medically indicated reason. Oh, and then be told that this decision is a "lifestyle choice," because bringing a new human being into the world is a "lifestyle."

3. If a man wants to have birth control so that he doesn't find himself in the above situation, if he works for any entity that has anything whatsoever to do with a Catholic god, he must pay for that birth control himself because religious freedom, not personal freedom, takes precedence here and whatever a religion wants to do, it can. Given the prohibitions against all forms of sodomy, I guess that prostate exams also are out.

4. If a bullying congresswoman uses her power to discuss "religious freedom" in the context of birth control provision, in particular regarding condom use and vasectomies, she will have only women at the congressional hearing to discuss it and force out any man on the docket to speak. Some of the women who are allowed to speak will be nuns.

5. If a man wants to avoid contributing to the formation of a zygote, all he has to do is shove an aspirin up his urethra. That keeps things from coming out of it, you see.

There. Wasn't that fun? That's how it feels to be a woman, here, today, in 2012 in the United States of America. Great, isn't it?

Writing about disability: No science, no disabled point of view? No good


Image via Wikimedia Commons. Originally posted to Flickr.

A flurry of articles has emerged in the last few weeks in which mental health professionals voice opinions about developmental disorders without providing scientific evidence to support them. Opinion is fine, except that these articles deliver it as gospel straight from the expert's mouth while not providing an iota of scientific findings as a basis. Because the opinions relate to a developmental disorder in children, these writings carry not only the great weight of being vague and unsupported, but they also carry the even greater weight of damaging real people with real developmental disorders. 

In these articles--one in the New York Times and authored by a psychiatrist and the other at the Daily Beast and quoting a handful of mental health practitioners--the tone is that people with an Asperger’s diagnosis are just quirky folk who don’t have anything sufficiently disabling to be considered to have a disorder. The misunderstanding of diagnostic criteria or even of what Asperger’s actually is makes both of these pieces worthless in terms of information. The fact that neither of them quotes a person with Asperger’s or the parent of a child with Asperger’s means that all the reader gets from them is the bias of the writer. 

Each piece works hard, using generalizations and misinterpretations, to make sure that the public will perceive any human being walking around right now with an Asperger’s diagnosis as a diagnostic fraud who is undeserving of supports of any kind, who is simply odd or quirky and taking advantage of a "diagnosis du jour." In other words, these articles with their clear bias and their lack of factual information do very real harm to real people who really have a developmental disorder. And that pisses me off because one of those people is my son.

In the Daily Beast article, writer Casey Schwartz provides us with a master class in using generalizations without specifics to back them up. The article opens by saying about Asperger's that "no one has been able to agree on what it is." That's odd because there's this book, a manual really, called The Diagnostic and Statistical Manual of Mental Disorders IV-Text Revision (DSM-IV-TR), that explicitly lays out what Asperger's Disorder is. It lists the criteria that a person must meet for the diagnosis. Why the profession whose manual this is can't agree on it escapes me, but then I don't see any evidence in this article supporting the assertion that "no one has been able to agree on what it is," although it seems to contain evidence that at least two diagnosticians can't agree on what it isn't. 

The writer quotes Lorna Wing as saying that Asperger's kids are "active but odd." I can't tell why that quote is in there. What does it tell you about Asperger's, its alleged overdiagnosis, the diagnostic confusion around it? Nothing.

Then this kicker: "They don't have the language or cognitive impairments seen in autistic disorder." See, this is when doing a little research can help a lot. That statement is simply untrue. The diagnostic criteria for Asperger's specify only that there be no language delay. Children with Asperger's have been identified as having problems with receptive language, the form of language that you hear and then process so that you understand the meaning of the words the other person is saying. You can see, yes, how that deficit might be important in social interaction, how it's not just "quirky" to have a processing problem in the context of interpreting spoken language.

Schwartz then goes on to say that people with Asperger's have a "social handicap." Here, I refer the writer to the National Center on Disability and Journalism's Website, which parses the use of appropriate and inappropriate terms when writing about disability, including the word "handicap." Pro-tip: If you're going to write about disability, don't use the word "handicap." I can't stop there because in the next phrase, she writes, "the inability to relate normally to others." 'Nother pro-tip: Don't talk about normal. There is no such thing as "normal." "Typical" is the appropriate term here if one must be used.

And lo, another generalization: "Many doctors feel that the introduction of Asperger's syndrome enriched clinical thinking..." How many? What is the source for this statement? I don't disagree, but isn't it journalism 101 not to generalize, generally?

Following that introductory natter intended to set the stage for how flotsammy and jetsammy an Asperger's diagnosis is, we then move on to...Nazis. Several grafs devoted to hearsay from one person--hearsay that according to the article itself could not be confirmed--about the possibility that Hans Asperger was a Nazi. What that has to do with diagnosing people with a developmental disorder, I'm not clear. 

The fact is that there is no support whatsoever for the rumor and that Asperger in fact may have been just the opposite. A colleague of his, according to the source linked in the previous sentence, stated that Asperger "had a very clear standpoint against the Nazis." Indeed, in his paper, he argued fervently for the social importance of these "little professors" he'd identified, taking a very strong anti-eugenics stance. This information was not hard to find. Why the hinting at "Asperger was a Nazi"? What does that even have to do with the developmental disorder itself? 

At this point, we are near the end of this article, and what do we have so far? An incorrect characterization of Asperger's, some unsupported generalizations, and the introduction of Nazis for no apparent reason. Has this last invoked Godwin's law? Should we stop there? 

But hark. More generalizations remain. "Many doctors believe Asperger's is significantly overdiagnosed...." In this statement, the writer links to this post at the New York Times, authored by psychiatrist Paul Steinberg. Does he have multiple personality disorder? How is this one guy "many doctors"? Steinberg, in his post, does the same disservice that these "j'accuse diagnosis du jour" screeds always do: he relies on vague accusations and no data and tars everyone who has that diagnosis as a fraud in the public mind. Thanks for that, man.

In his article--which was extraordinarily controversial in the already controversial world of the autism community--Steinberg and his presumed many doctor selves write that "Social disabilities are not at all trivial, but they become cheapened by the ubiquity of the Asperger diagnosis, and they become miscast when put in the autism spectrum." Ah, er, hem. Isn't a key deficit of autism--any form of autism--the social deficit? Social deficits make up the vast majority of criteria related to diagnosing autistic disorder. How are they miscast when included on the autism spectrum?

Steinberg and his many doctor selves continue on, saying, "These men (with Asperger's) are able to compensate more completely than a truly autistic child or adult whose language deficiencies and cognitive deficits can often put him at a level of functioning in the mentally retarded range." I guess the good doctor (a) hasn't gotten the memo that the phrase is now "intellectually disabled" or (b) doesn't know that intellectual disability isn't required for an autism diagnosis. According to the CDC, an average of 41% of people with autism also have an intellectual disability, and that value [ETA: meaning the value of 41%] doesn't include people with Asperger's, as intellectual disability excludes that diagnosis. That means the majority of people with autism do not have intellectual disability--and that's before you even count people with Asperger's. In addition, new data are showing that people with autism may test as intellectually disabled on some tests but not on others and that the mode of testing matters.

In other words, Steinberg, in addition to insulting specific autistic people he names in his article, also seems to lack an understanding of autistic people in general. 

But let's return to the Daily Beast article. That piece closes with what turns into a puff profile of Bryna Siegel, a child development PhD at my postdoc alma mater, the University of California, San Francisco. Siegel has a...reputation in the autism community. In this piece, she's quoted as saying that she undiagnoses 9 out of 10 of the people who come to her clinic with an Asperger's diagnosis. Really? While studies suggest that the overlap between what has been called high-functioning autism and Asperger's has confused the diagnostic issue, they don't show that people are wrongly diagnosed as being on the spectrum. How is it that 9 of 10 people who show up in Siegel's office aren't misdiagnosed in terms of placement on the spectrum but instead just...aren't on it at all? 

If you went to a doctor and found that this doctor overturned 90% of diagnoses of other practitioners in the field...what would your reaction be? Mine is that this rate of undiagnosis implies a crusade or that practitioners in that area really really suck at what they do and that Dr. Overturn is some kind of medical savior, an oasis in a howling wilderness of local misdiagnosis. Oh, thank God she's there to save us. 

Actually, Siegel lists her specialty as "differential diagnosis of autistic spectrum disorders and linking diagnostic assessment and treatment planning." Not autism spectrum disorder, but differential (as in alternative) diagnosis of autism spectrum disorder. Now that's specialized. No wonder her rates of undiagnosis are 9 out of 10. That's her clientele. Using her experience as an example for poor diagnosis is like using cats as an example of the growing preference for catnip.

It's odd that Siegel has this power, especially in light of recent studies showing that people diagnosed with Asperger's may have distinct structural differences in the brain, including differences in white and grey matter distribution. This diagnosis isn't an exercise in Freudian theory. It's not something that any doctor can giveth and then taketh away. It's a developmental difference that you have or you don't, and given that, I'd argue (a) that there's no good reason for mental health professionals to be involved in its diagnosis at all and (b) that the best diagnosticians for it are developmental pediatricians specialized in addressing developmental disorders.

I've got to coin a new law for what comes next. This accusation appears so often in these kinds of articles that there really needs to be a name for it. Siegel says:
“I think part of the proliferation of the Asperger’s diagnosis is that if you say that a kid has oppositional defiant disorder, and especially if you say that about a normally intelligent upper-middle-class kid, parents don’t like to use the word 'oppositional' and they don’t like to use the word 'defiant' and they don’t like to use the word 'disorder.' And ‘Asperger’s’ just sounds so much more neutral. It doesn’t have any connotations … It’s a name, it’s not a descriptive term.”
Wow. I'm going to call this Emily's Law. It's the law that if your child has a developmental disorder and you're middle class, eventually someone will accuse you of being in denial about the real nature of your child's problem, which boils down to either your bad parenting or oppositional defiant disorder (ODD). By the way, if you look at the diagnostic criteria for ODD, you'll wonder how any practitioner who can read would ever conflate it with autism of any kind. My son never showed any of these behaviors before being diagnosed with Asperger's at age 3...or after, for that matter.

Another mental health professional quoted in the piece, Peter Szatmari, says he undiagnoses 50% of the people he sees. So he and Siegel are separated from one another in their diagnostic (undiagnostic?) rates by a full 40 percentage points.

In the Daily Beast beast, Siegel describes the flood of calls that came to her office after Wired magazine published an Asperger's questionnaire. She says that she told her intake coordinator, "If they leave you the number of their secretary to call back, do not call them back." Funny. I don't remember reading anything in the DSM-IV-TR criteria about "having a secretary" as an exclusion criterion. Does she include these people she ignored as part of her "undiagnosis" rates?

And my take-home from that comment--were I to buy it--would be that any hope we have for our son to be successful is a false hope. After all, based on this comment, he either has Asperger’s and never would be successful enough to have an administrative assistant (never mind that people with executive function skills--like, you know, administrative assistants--may be the perfect complement to people like my son) or he doesn’t have Asperger’s and deserves to be blackened with the same dismissive “you’re just quirky and whiny” brush that these pieces seek to tar all Aspies with.

Given that Siegel and Szatmari can't even agree on their undiagnosis rates, it looks to me like these folks need to stop blaming people with a developmental difference for having the temerity to have it and look to their own profession for how badly and inconsistently it is practiced or unpracticed. If members of their profession can't apply criteria consistently, does the fault actually lie in the criteria--or in the profession? And if they're so hopeless at the entire process, at using that doorstop of a manual provided as guidance, why should we trust them to rewrite that manual, to write trustworthy articles about diagnosis, to serve as reliable sources in any way?

But how bad is it in the profession, I ask? Where are the published data showing that 9 out of 10 or 5 out of 10 or any children are misdiagnosed with Asperger’s and shouldn't be diagnosed on the spectrum at all? Indeed, the literature I find points to people with Asperger's as being misdiagnosed with other, non-spectrum disorders when their real diagnosis is Asperger's, or as simply being diagnosed somewhere else on the spectrum. There also are tested scales that help in refining and distinguishing the diagnosis--the DSM-IV criteria aren't the only tool.

Without those data demonstrating the claims, articles like these do only harm. How? Children like my son--who has the Asperger’s diagnosis and who also has receptive language problems, learning differences, stereotyped behaviors, fixations on acorns, patterned grimaces, echolalia, motor delays, and flaps--get packaged in the public mind with these vague accusations of fakery. He is not faking it. His autism has been with him since birth. A professional referred him. We did not “seek a diagnosis.” He was, appropriately I think, considering the disarray among people whose professional designations begin with “psych,” diagnosed by a developmental pediatrician, someone with medical training explicitly related to developmental disorders.

Finally, I close with the observation that neither one of these pieces presuming that Asperger's is nonexistent fakery managed to include insight from people with Asperger's or parents of people with Asperger's. And here's some anecdata: I've met a lot--a lot--of people who are diagnosed with Asperger's. They aren't just "quirky." They have real deficits in motor function, social function, and receptive language function, they have learning disabilities, and they exhibit stereotyped behaviors and unusual fixations. They're clearly people on the spectrum, not just odd or quirky or lovably absent-minded. The learning differences that come with Asperger's are very real. The receptive language problems are not just a quirk. The flapping and the echolalia are not just oddities. Just because there's confusion about placement on the spectrum doesn't mean that a person isn't on it.

You know what I think? (You probably do if you've read this far.) I think that when you write an article about disability and diagnosis, it's ableist and inappropriate not to include, whenever possible, insight from the people who themselves have that disability. And I think that psychiatrists should look to their own house and the biases they bring to it before they start publicly vilifying by association someone who lives in mine.

Will having a child ruin your career? Fate won't tell you

Elizabeth I was a career woman before anyone knew what that was. She opted to express
her urges to shape, rear, mentor, support, and create through her work as a monarch.
Image via Wikimedia Commons, public domain in the US.
I've been thinking about the question of the much-discussed conflict between childbearing and career for women. This post, "On 'forgetting' to have babies," offers a lot for consideration, and I appreciate the honesty of the self-conversation it presents. 

The one thing I see in so many of these conversations with self or nonselves is a certain perception of inevitability about one choice or the other. Of course, if a woman doesn't want to have children, there's no either/or false dichotomy here. But for women who feel torn about the decision, the message they receive is very much a binary choice: have kids=career stalled and sacrifices made; don't have kids=career chugs onward.

We tend to dichotomize the situation and assume certain outcomes that in either case are not guaranteed. If a woman chooses not to have children because she determines that other considerations carry more weight for her and she's self-aware enough to know it...does that guarantee that she's going to have a career? No. Even childless people encounter unforeseen obstacles ranging from sudden absence of funds to shuttering of an agency to obsolescence of the job's focus to personal events that include health, disability, the health of others, marriage, divorce, death, and taxes.

Choosing not to have or adopt a child is no guarantee that in turn, you will have a career. Does it help you control your life a little more? Absolutely. Does not having children mean more personal flexibility should those obstacles arise? Certainly. But not having them doesn't somehow smooth the way to a career of success and accolades, either.

If you elect to have children, I probably don't need to say that the guarantees here are extremely limited. They'll be H. sapiens; that's assured. Beyond that... if you really want to make the Fates laugh, just try to make plans if you're a parent. Does that mean a child or children will derail your own dreams? They may. They may not. Just as choosing the path of career offers no assurances of success, choosing the path of parenthood doesn't ensure career failure or dreams denied, either.

In some senses, this choice can seem binary, but it's less an either/or than it is, at some point in time, irreversible. If you're female and you choose to have children, you can't send them back once you've had them. If you're female and you choose not to have children, after a certain point in time, you can't change your mind and decide you want to do that after all. Either way, you reach a point of commitment that ultimately allows for no changes of heart.

And that's the part that's scary. What if you regret it later? I don't actually know any women who regret having children, but it may be that women who do don't speak out about it. After all... imagine the outcry or just imagine how the children would feel. Best to keep that on the down-low. And I know women who've elected not to have children, and they're very happy with their fulfilling, active lives. Do they have regrets? If so, it may be something they prefer not to discuss openly. And I'd argue that an overall feeling of life satisfaction, whether you're childed or childless, is the best antidote to regret. 

And then there's that whole issue of time, the factor that drives this discussion. After all, if we had all the time in the world, women who find the question a conundrum of some urgency could relax a little. Time, as we perceive it from the human perspective, always seems to demand rapidity of action even as our ability to live only in the moment stymies us. We make these decisions solely with the information we have in-hand, with the feelings and instincts and expectations for ourselves and our lives as we understand them in the now. We have no surety about how the context 5 or 10 or 15 years hence will influence our attitudes, our regrets, or our perspective in hindsight. 

But life itself has a way of unfolding on its own timetable. You may worry that having a child has forced or will force you to table a dream permanently. Yet I know two people--my parents--who achieved in their 50s and 60s goals that seemed elusive for years, including tenure and publishing a book. I can only imagine how impossible either of those seemed to them 20 or 30 years ago when they were up to their ears in children and work and all of the accompanying detritus of family life and had set aside for decades their dreams of scholarship and writing. In the end, time opened up for them, and they went right back to that table and gathered those dreams to themselves. They weren't dreams denied, just dreams considerably delayed.

You may worry that choosing not to have a child will leave you looking back with yearning in a decade or so. I can point to both men and women who've chosen not to have children. They're happy and fulfilled and doing wonderful things in the world. The urge to shape, rear, teach, support, mentor, and create outside of yourself isn't one that we can express only through parenthood. Both women and men have many outlets for that expression, for experiencing the feeling of well-doing that derives from channeling those urges in positive ways.

In the end, that's probably a common goal for most of us, that feeling of well-doing. But we have no way of knowing if we'll achieve it, whether we choose fulfillment in part through parenting, through career, or through both. Indeed, any of our best-laid plans that we carefully predicate on these choices can gang agley--and oft, they do--in any of Fate's unforeseen turnings. As long as you move along your path, working at fulfillment and doing wonderful things for the world in your way, while you still can, that's really the best any of us can do with the moments life places at our disposal. Especially when a subsequent moment can make it clear that life offers no guarantees.

ADHD risk and general anesthesia: What does the study really tell us?

Via Wikimedia Commons. This image is a gross misrepresentation of
the real disability that is ADHD. But it was public domain, so I used it.
Recent headlines inform us that researchers have identified a link between ADHD and general anesthesia. Some have gone several words too far and promised us that "Anesthesia in Toddlers Proved to Be Linked to ADHD Development." Others are more modest if not fully informative, telling readers about a "Possible link between anesthesia exposure and ADHD in young children" and "Anesthesia before age 3 raises child's ADHD risk." 

Having been advised that headlines exist to pique the interest of readers and are often wildly inaccurate about research findings, I won't make too much of them here (but OH MY GOD [one of] THOSE HEADLINES). Instead, let's just look at what this retrospective (meaning it relied on looking back at records) study said. The relevance of the results to the year 2012 comes down to a matter of time and diagnostic consistency.

First of all, like any parent of a child with ADHD, I clicked on the first headline about this study that I encountered. After reading the opening grafs, I immediately calculated how many general anesthesia procedures my nine-year-old with severe ADHD has had, and it's at least four (I lose track with all the children and all the surgeries). FOUR. When you read further into the news articles, you find (in most of them) that it's not all anesthesia or any age of exposure: it's multiple general anesthesias before age 2 (not 3--so much for accuracy in headlines).

A nanosecond more math, and I've reduced his exposures that might be relevant here to two. One for ear tubes, one for lacrimal ducts that were blocked with bone. Each procedure was incredibly brief, and he was under general anesthesia for much less than 30 minutes for each.

Having done all of this heavy math, I went looking for the paper itself. It's a study that was done at the Mayo Clinic, and it's published in...Mayo Clinic Proceedings (full text), which recently fell under the Elsevier publication umbrella [PDF]. To keep things reasonably clear, I've bullet-pointed the main findings of the study below:

  • For children not receiving anesthesia for procedures before age 2, the cumulative incidence of ADHD at age 19 was 7.3% (95% CI, 6.5%-8.1%).
  • For single and multiple (≥2) exposures to anesthesia for procedures, the estimates were 10.7% (95% CI, 6.8%-14.4%) and 17.9% (95% CI, 7.2%-27.4%), respectively (Figure).
  • In unadjusted analysis, exposure significantly increased ADHD risk (Table 3).
  • In analysis adjusted for the covariates of sex, birth weight, and gestational age, multiple (HR, 2.49 [boldface mine]; 95% CI, 1.32-4.71), but not single (HR, 1.35; 95% CI, 0.90-2.02), exposures to anesthetics for procedures increased ADHD risk.
  • Similar results were found using stratified proportional hazards regression with strata defined based on the propensity for receiving anesthesia (multiple exposure HR, 1.95; 95% CI, 1.03-3.71; single exposure HR, 1.18; 95% CI, 0.79-1.77).
  • When analyzed either as a continuous or a categorical variable, the total duration of anesthesia was also associated with ADHD in unadjusted and covariate-adjusted analysis, but this association did not reach statistical significance in propensity-stratified analysis (Table 3).
Translation: The population of children who had one general anesthesia exposure before age 2 did not differ in ADHD rates from the population of children who had none before that age. The children who had 2 or more general anesthesia exposures before age 2 had higher rates of ADHD in their group than the other children. 

Then the researchers adjusted their analysis for factors that might also influence the presence of ADHD (sex, birth weight, and gestational age) and still found an increased risk of ADHD among children who'd had general anesthesia at least twice before age 2. In fact, their risk was about 2.5 times that of unexposed children. There was also a hint that the length of anesthesia was relevant to the outcome in terms of ADHD risk.
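If you're wondering how to read those hazard ratios, here's a minimal sketch (my own illustration, not anything from the study) of the rule of thumb at work: a hazard ratio signals a statistically significant change in risk only when its 95% confidence interval excludes 1.0, because 1.0 means "no difference between the groups."

```python
# Minimal sketch (my illustration, not code from the study): a hazard ratio
# (HR) is statistically significant at the usual p < 0.05 threshold only when
# its 95% confidence interval (CI) excludes 1.0, because HR = 1.0 means
# "no difference in risk between the exposed and unexposed groups."

def significant(ci_low: float, ci_high: float) -> bool:
    """True if a 95% CI for a hazard ratio excludes 1.0."""
    return ci_low > 1.0 or ci_high < 1.0

# Covariate-adjusted values quoted in the bullet points above:
print(significant(1.32, 4.71))  # multiple exposures (HR 2.49) -> True
print(significant(0.90, 2.02))  # single exposure (HR 1.35) -> False
```

That one check is why "multiple, but not single, exposures" is the honest summary of the adjusted analysis: the single-exposure interval straddles 1.0.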

Taking a closer look at the paper, I found something that struck me as odd. The children in this paper were all born between January 1976 and December 1982. For them to have had general anesthesia before age 2, they'd've had it somewhere between 1976 and 1984 or so. In other words, the most recent episode of general anesthesia under consideration in this study occurred ~27 years ago, and the earliest occurred 36 years ago.

Naturally, I thought, "Hmm. Wonder if anesthesia has changed at all in the intervening decades."

It has. The general anesthesia protocol used most often in the children in this study involved halothane inhalation (87.1%) combined with nitrous oxide administration (88.1%). Apparently, halothane is in very limited use these days, replaced by sevoflurane, which evidently has advantages that include more rapid induction of anesthesia and more rapid emergence from it. It also is associated with less post-operative nausea and vomiting and appears to be associated with a reduced incidence of heart episodes compared to halothane.

According to an anesthesiology newsletter from October 2011 (link may go to paywall), "much has changed in the practice of pediatric anesthesia over the past 25 years," including the development of safer anesthetics. Other newly developed anesthetics may help reduce some of the aftermath of pediatric general anesthesia, including "agitation." 

In other words, this paper is about the influence of having 2 or more episodes of general anesthesia before age 2 about 30 years ago. There is no way of knowing what factors associated with anesthesia--if any--are relevant to the findings of this paper, but some of those factors have changed, so there's also no way of knowing how relevant these findings are in the context of current practices.

Then I thought about the length of anesthesia. Luckily, the authors provide an analysis of this. For parents who read that headline and immediately flashed onto the operations for ear tubes their little darlings may have had, I think you can rest easy. Those brief procedures don't seem to be linked to any ADHD-related outcomes. Based on the paper, there's no increased ADHD risk for children who had procedures lasting in total less than 30 minutes, and placement of ear tubes, for example, requires a duration of anesthesia of about 15 minutes. In addition, there doesn't seem to be any significant increase in risk until you get to a total duration of 1.5 hours or more. 

That takes me to the observation that a surgical intervention lasting more than 30 or 45 minutes is generally no minimal intervention. These aren't ear tube operations or any of the more seemingly run-of-the-mill surgeries that children seem to have so frequently today; even infant hernia operations require only a "relatively brief" visit to la-la land. The children who had anesthesia in this cohort also had a higher rate of other health problems compared with children who didn't and were more likely to have low birth weight and to have been born before full term ("lower gestational age"). In other words, the indications for these general-anesthesia-associated surgeries likely outweighed any long-term concerns over risk for ADHD--30 years ago.

One news article about the current study cites "animal and human" studies suggesting an influence of anesthesia on "the developing brain," which to me seems likely. But that article links to another story as support...one that is from the same research group working with the same retrospective data from the same cohort (study here). In their ADHD/anesthesia paper, the authors cite a study linking the inhibition of certain signals in the brain and ADHD in an animal model, but that study didn't use the same anesthetics or administration route used on the children in their study cohort. They also cite a report intended to support their assertion of an association between learning disabilities and anesthesia, but the study they cite found no link "between a single, relatively brief anesthetic exposure in infancy...and reduced academic performance" in the teen years.

Not only have things changed in the world of pediatric anesthesia in the last three or four decades, but they've also changed in the world of diagnostic criteria for learning disorders and ADD/ADHD. The current paper used the Diagnostic and Statistical Manual of Mental Disorders (DSM)-IV for exclusion based on other disorders, combing school records for "any indication of concern regarding learning and behavior" and using DSM-IV criteria as one possible confirmation factor for an ADHD diagnosis. But the authors appear to have accepted an ADHD diagnosis from the period combined with a parental questionnaire as another method of confirming the ADHD diagnosis:
Patients were defined as having research-identified “definite” ADHD if their records included a clinical diagnosis of ADHD and at least 1 form of supporting evidence, including documentation of symptoms that met DSM-IV criteria for ADHD (with 6 or more separate entries in the medical or school records that were consistent with DSM-IV criteria) and positive parent or teacher ADHD questionnaire results.
The children in the study were all diagnosed with ADHD before age 19, so between about 1980 (for diagnoses starting at about age 4) and about 2001 (for children born in 1982), spanning five versions of the DSM (DSM-II until 1980; DSM-III until 1987; DSM-III-R until 1994; DSM-IV until 2000; DSM-IV-TR from 2000). In other words, there was not a consistent diagnostic standard for what constituted ADHD in this study, and the authors used a version of the DSM published in 1994 for exclusion and, in some cases, confirmation.
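If you want to check that arithmetic, here's a minimal back-of-the-envelope sketch (my own illustration; the edition years are the ones listed above) that maps the cohort's possible diagnosis window onto the DSM editions in use:

```python
# Minimal sketch (my own illustration): which DSM editions were current at
# some point during the cohort's possible ADHD-diagnosis window (1980-2001)?

DSM_EDITIONS = [            # (edition, first year in use)
    ("DSM-II", 1968),
    ("DSM-III", 1980),
    ("DSM-III-R", 1987),
    ("DSM-IV", 1994),
    ("DSM-IV-TR", 2000),    # still current as of this writing (2012)
]

def editions_in_window(start_year, end_year):
    """Return editions in use at any point in [start_year, end_year]."""
    in_use = []
    for i, (name, first_year) in enumerate(DSM_EDITIONS):
        # An edition's run ends when the next edition appears; treat the
        # last edition as running through the end of the window.
        last = DSM_EDITIONS[i + 1][1] if i + 1 < len(DSM_EDITIONS) else end_year
        if first_year <= end_year and last >= start_year:
            in_use.append(name)
    return in_use

# Earliest plausible diagnosis ~1980 (a 1976 baby at ~age 4); latest ~2001
# (a 1982 baby just before age 19).
print(editions_in_window(1980, 2001))
# ['DSM-II', 'DSM-III', 'DSM-III-R', 'DSM-IV', 'DSM-IV-TR']
```

Five editions, one cohort: that's the diagnostic inconsistency in a nutshell.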

Anachronism piles on top of anachronism here, and I can't see how any of this is relevant to, say, my nine-year-old son--or anyone else--in 2012.


The news media stories seemed to largely overlook the issue of how pediatric anesthesia practices may have changed in the decades since the cohort in this study had general anesthesia or the issues of diagnostic variability. Does emerging from anesthesia more rapidly, as children do today with sevoflurane, have an effect? What about the improvements in the surgeries themselves that may make them shorter today? Or amelioration of post-operative vomiting and nausea, or post-operative agitation? Things have changed a lot in the 30 years since the children in this study underwent their surgeries, so that leaves me asking just how relevant these findings are to children born--and having surgery--today.

Do women matter in childbirth?

Image in public domain.
The news this past week was full of stories about home birth--they're on the rise in the US, even though rates remain comparatively tiny, and a home birth advocate in Australia died from unknown causes after giving birth at home. I pinned a piece I just wrote for Slate to the rise in home births, hoping in that piece to clarify that not all home births are created equal and that one with an experienced, well-trained certified nurse midwife (CNM) with a hospital-based backup OB and hospital access is the gold standard if home birth is your choice. I also argued that many women in the US don't have this choice or the choice of a hospital that offers an environment conducive to health and bonding between mother and child, which is, in fact, the case. A blogger over at Babble summed up this core argument better than I did in her post about my piece.


The piece opened with a paragraph about my own experiences with the births of our first two children. In my original draft, I'd had that information at about paragraph three and written differently, but for reasons of narrative and word count limitations, it was moved to the top. It opens with a mention of fluorescent lights--not the way I'd originally described it, which was simply, "fluorescent lit"--and continues with a very brief description of that birth and the sequelae. The birth was no picnic--what birth is?--and my husband and I both were not thrilled with aspects like hospital visitors peering in through the open door as I lay there, spread-eagled in stirrups, pushing and covered with the effluvia related to birth. But the aftermath was what left us so upset that to this day, we just don't talk about it with each other.


I described this aftermath briefly in the article--it consisted of the hospital's forcing our son to have 12 blood draws for glucose testing for no medically indicated reason (he was full term, perfect Apgars, feeding well, all readings were normal, our pediatrician was appalled) against our will and without our informed consent. They also aggressively threatened us with separation from our healthy son and with dismissal from the hospital while they retained our son, unless we took him against medical advice. I was probably a hormonal mess--I had just given birth after three days of sleepless prodrome, we were first-time parents--but hearing the click as they ripped into his heel 12 times and listening to him shriek with pain every three hours (during which we anticipated each draw with growing dread) made me feel like I was feeling what he did, and that empathy between us has persisted to this day. When we finally did leave the hospital, within a day or two, I was fighting a raging hospital-acquired infection that required some powerful drugs to treat and interfered with my ability to breastfeed our boy.


It was these effects on my son--not me--that led us to pursue home birth for our second child, born in 2002. I was terrified of the prospect of a home birth, not because of safety issues--the literature I could find at that time indicated good safety profiles for CNM-attended births with an OB backup and hospital access nearby, which is what we had--I was terrified about the pain, about whether I could do it. But I forced myself to do it because I did not want our second son to go through what had happened to our first without medical indication. 


It wasn't because I had some nutty idea about a beautiful or lovely or fluffy birth experience. Birth isn't fluffy. It's hard as hell, and yes, emergencies can be sudden and fatal. We were fully aware of that. A hospital was blocks away. Had it not been, we'd've elected simply to be in a hospital because safety would have tipped the scale that way. When my water broke and I went into labor, I stood in my kitchen and ate some lasagna; then--dilated to about a 5--I went into my bathroom, looked at myself in the mirror, and said out loud to myself: You can do this. You're scared shitless, but you can do this because it is best for your son. That was our conclusion at the time, and that's what we did. Now, when someone asks me about the 1 to 10 pain scale, I know what my 10 is. Was I comfortable? No, not at all. Was he comfortable, peaceful, safe, and with me without separation from the moment he was born? Yes, he was. And only one heel prick, for the metabolic screen.


Our third child was born in 2006, in a different hospital, five years after our first hospital experience. This hospital was new, and they did labor, delivery, and recovery all in the same family-sized room, with family welcome at any time of day or night. They never once separated our son from us, they helped us with breastfeeding--it turned out that because of a motor deficit, he couldn't--and they did do two blood sugar draws, each carefully and clearly explained to us and done with our informed consent. I sent flowers to the staff after we left because they had done everything that was the right thing for our third and final son and they'd had a hard couple of days that had included a neonatal death on the L&D floor. This last birth of ours was the kind I'd choose again were I going to have any more children, which I am not.


The thing is, that last experience is not one that is widely available to women--women of any ethnicity or socioeconomic status. Many women I know have two widely different options where they live: birth in a hospital with a poor reputation for its treatment of birthing women and childbirth, or childbirth with a direct-entry midwife who has neither the level of training of a certified nurse midwife (an RN) nor the OB or hospital backup. To have a better option, you have to be living in just the right place and have just the right kind of insurance. Talk about white privilege. Speaking of which, does anyone really think that the hospital would have kept us there with our first son, against our will, had we not, at that time, had white-glove health insurance?


In the Slate piece I wrote, I argued that women need a spectrum of choices that are best suited to their situations. Stress and anxiety and separation from the mother at birth are not optimal childbirth outcomes and rarely are necessary for a birth that proceeds normally. CNM-attended births are associated with specific benefits in this regard. In the piece, I argued that a national infrastructure of CNMs who are associated with OBs and have hospital access would be a boon to women everywhere, giving them the choices we didn't have in 2001 and that many still don't have today. Whether a woman chooses to give birth at home--and increasing numbers of them do--in a hospital, or, possibly best of all for low-risk pregnancies, in a birthing center, there are safer, stress-reduced options that should be available. I also noted that a recent UK study found that home births with the gold standard I describe above are safe for low-risk women who have already had a child, but not for women giving birth for the first time. Birth in a birthing center, however, was safe for any woman who was low risk. I also cited studies about childbirth factors associated with post-traumatic stress disorder in mothers. This is not simply a theoretical exercise.


In this society, no one has a right to a personal and informed decision, yet everyone has a right to tell someone how to do things. After that piece appeared, some people read the central message--a need for choice so that mother and baby can be healthy, safe, with limited stress and with each other whenever possible--and some others read the first 157 words about our experiences among the 1302 words of the piece and began to attack me. They misread some of what I wrote and accused me of linking my first son's autism with the unnecessary and invasive interventions inflicted on him in his first 36 hours. I did exactly the opposite. They accused me of being a narcissistic princess who did a home birth only to make myself more comfortable and without consideration for my son, writing only for my own benefit (which is odd, as I'm all done with childbirth). Again, the opposite is the case. They accused me of being an entitled white woman with no idea of how poor people live and access medical care...and that one also is untrue. Maybe I am entitled now, although I feel that I've never lost touch with my roots. I've been poor. Very, very poor, very sick, with no access to medical care at all. I'll never forget the clinic doctor who did my lung X-rays for free--I had severe pneumonia--when I didn't have a quarter to my name. I'm not poor now, and I'm glad, but I have experienced poverty up close and personal. The midwives I've had have always had a sliding scale for their already relatively minimal charges and in some cases provided prenatal care to mothers for no money at all.


And some commenters can't let go of the fluorescent lights that open the piece. I'll admit it--I am photophobic, and I do avoid fluorescent lights. But our problems with the hospital--which, by the way, were not unique in our community--were not about the goddamned lights or even about our experience. They were about what happened to our son. Period. Those events were so powerfully embedded in my psyche that when I returned to that hospital five years later to have my thyroid removed, pulling into the parking garage and walking through the doors brought on, unexpectedly, a huge anxiety attack that I had to force my way through just to keep on walking. People can diminish what happened to us as not really traumatic in the grand scheme of things, but we are simply people with our own histories and own triggers and own experiences, and these--not other people's experiences--dictate our reaction to things and the decisions we make. That's so clear to me, that perspective taking--yet it seems to escape a whole lot of people.


The commentary on the Slate piece has been highly informative to me. I have a deep interest in how the Internet allows open discussion and how people take advantage of that discussion, for better and for worse. In this case, rather than address the core arguments of the article itself, many, many of these comments have been personal--deeply personal--attacks on me, my white privilege, my (presumed) socioeconomic status and history, my "narcissism," my whining about fluorescent lights, my alleged linking of my son's autism to those earliest hours of his life. So very few of the comments talk about the core discussion at hand, which is that in this country, women--from every ethnicity and every socioeconomic background--deserve respect and dignity and consideration while they're giving birth, and access to safe care that provides it.


And many do not have it. That is the problem.


------------------------------------------
ETA: And nothing new, of course. I blogged this previously on my parenting blog, as well.


Below are links to a couple of posts that address some of these issues of the complexity of choice and access. If more arise, I'll add to them here.


Meredith at A Mother Is Born has written a nice summary with bullet points of some of the main issues around birth choice. 


Ceridwen, at Babble, has also posted a piece that does a good job of summarizing some of these issues. I link to it in the above post, but wanted to feature it here, too.