Boosting:

Just here for a minute to boost my nonspeaking friend jorja’s blog, starting with this excellent post on what she would change about her group home:

https://tumblr.com/malazansapper/732641056602095616/if-jorja-were-queen-of-group-home?source=share

A few quotes that struck me:

(CN food)

“i dont like hot dogs. i eat them every week.”

“I have a key to my dads house, and my friends house. i dont have a key to my own house tho.”

I encourage my readers to go check out her full post! I think too many of us don’t understand what life in group homes and institutions is actually like.

AAC in my favorite media: as promised, fandom nonsense

Content warnings for this post: eating disorder, anti-trans author.

Spoiler warnings: all media listed.

After all this time you’d think I’d be back for an ~important~ blog post, but instead I’m finally just writing an inconsequential fun one. Welcome to: favorite AAC moments in the shows I watch over and over!

This is sort of a listicle, in that it’s organized by show (in no particular order), with brief(ish) commentary on my favorite manifestations of AAC in each franchise I’ve become attached to. Some of the following media I’ve seen 10x or more, but others only a few times so far, so these won’t be exhaustive lists of every AAC-ish instance in their respective canons. Just stuff I love. And as a lot of this is fantasy/sci-fi, I’ll be stretching the definition of AAC for sure. Enjoy!

Star Trek: I’m sorely tempted to do one item per series on this franchise, but instead I’ll focus on TNG for its wealth of AAC awesomeness. First of all, Riva’s telepathic chorus that voices his words I think counts as AAC (though not sign language, which d/Deaf AAC users have taught me generally shouldn’t be considered AAC – signed languages are full languages in their own right, not “alternative” to anything). Secondly, the Bynars’ primary language for communicating amongst themselves, which doesn’t use mouths, I think could be considered AAC as well. But I think my favorite non-spoken communication method in this series is the bright flashes and responsive humming of a nonbiological life form found in the soil of a planet being terraformed in Season 1… in part because, when finally patched into the Enterprise’s universal translator, the first English words to come from this being are a hilarious address to the biological lifeforms listening: “ugly giant bags of mostly water!” Indeed we are, dear new lifeform, indeed we are. (Honorable mention for this series is, of course, the ubiquitous “Darmok and Jalad at Tanagra” – which, though it technically uses mouthwords, is such an expansive/creative use of language that I think it deserves a mention on this list.)

Sherlock (the one with Cumberbatch): Well, I’ve got to love the constant text messages to Sherlock from Irene Adler – in which both the content of the text and the pointedly intimate sound the phone’s notification makes could be considered AAC. Partly because the main alternative I can think of here is Moriarty forcing innocent victims to act as voicer for his words under threat of bodily explosion if they do not comply. Everyone deserves some means of expressing themself, but that’s just not okay.

X-Files: For the record I am not a fan of the less MSRy seasons, but I’m going to go with the alternate-universe paralyzed Doggett getting to use high tech switch scanning as my fave for this series, because the idea that anyone would have gotten him access to that kind of system that quickly and that long ago is so cool! I wish everyone who needs that level of tech had that kind of access. One runner up is the Red Museum episode – when I was first watching this series as a kid I think I didn’t quite grasp that the typist was channeling walk-ins (and as such the walk-ins were essentially using him as AAC), but I do recall seeing the woman reading the typist’s words aloud and thinking “gosh that would be nice”. (Finally, fifteen years later I found out about AAC for real… *sob face*.) But the second place item for this series is absolutely Mr. X’s last act being to write a clue on Mulder’s floor in his own blood, because that’s definitely the most badass AAC I can think of.

Skins (UK): I’ve only seen this a couple times, and the writing I’m about to point out was likely hallucinated, but as a former anorexic I can’t skip over the “EAT” messages Cassie sees on her phone and plate and etc. Perhaps stomachs can send imagined AAC back to the brain when in desperate need of sustenance.

Buffy: If you’ve seen Buffy you probably already know how I’m going to finish this paragraph, but I’m going to do it anyway… Good lord, “Hush” is pretty much a masterpiece of AAC ingenuity considering it was (as far as I know) written entirely by speakies. Like, it’s amazing how quickly AAC becomes more available and more acceptable when a whole town needs it! When else would you be able to buy necklace whiteboards on the street corner? (I know partly I’m just a sucker for wearable AAC, but that component of this episode really highlights for me that the main reason mouthwords are considered a superior form of communication is that they work well for most people* most of the time. Take that aspect away, and all of a sudden more tools are made available for EVERYONE to communicate in diverse ways.) In any case, I will never forget Giles’ hilarious overhead projector drawings as quintessential on-demand AAC.

*abled people with power, in particular

Leverage: So in this show the team’s constant reliance on earbud comms means that there’s not much AAC use outside of situations where they’re right in front of a mark and can’t speak freely (or in code). Because of that, I was thinking of having my entry for this show just be “every single meaningful facial expression that Sophie has thrown at Parker (and that Parker has of course misunderstood or missed completely)”, but then I remembered San Lorenzo – who else besides Hardison can say they’ve stolen a whole country by means of an email? And I can’t help but adore the ensuing exchange: 

Moreau: “I have the guns! I have the money! I have… the government!”

Nate: “You know what I have? I have a 24-year-old… with a smartphone and a problem with authority.”

Me: *heart eyes face*

Bones: Although technically impressive cryptologically, I am not going to give Pelant the dignity of citing his rearranged human spine as cool AAC. Nothing Pelant has ever done has been anywhere in the realm of “cool”. Hodgins, on the other hand, accidentally proposed to Angela by rearranging glow-in-the-dark shrimp. HODGINS CLEARLY WINS. As an honorable mention, because Booth/Brennan is pretty much my OTP, I’ll include Bones’ handwritten note to him when buried alive, later read aloud on a certain auspicious date. *the good kind of sob face*

Doctor Who: The Ood who use translation spheres are arguably using a type of AAC, but the Doctor’s message to Sally Sparrow written under wallpaper (hooray time travel!) and the stitched leaf with Melody Pond’s name embroidered on it are equally awesome and far less slavery-related AAC moments. But top of my list for this series (admittedly, I’ve only seen Doctors 9 through 11!) has to be the typistless typewriter clacking out “MUMMY MUMMY ARE YOU MY MUMMY?” 

Veronica Mars: Not really on this list, because although I consider it one of “my shows” I’ve only seen it a handful of times and can’t think of any good AAC moments, beyond just, day-to-day text messaging. Can you help?? Please, comment below!

The series that must not be named: Only kind of on this list, because 1) it’s not a TV show, 2) the author is a piece of shit, and 3) it’s mostly just a long series of missed opportunities for AAC! The missed opportunity I want to highlight here is that a certain small giant whose name starts with G could really have benefited from some magical AAC. Heavy stone pictograms that were carved via magic? Untearable letterboard enchanted to speak a speller’s words aloud? There are so many directions this could go in, and if I were still as attached to the series as I was in the past, I might have made a whole post about my ideas for wizarding AAC. As it is, I’ll leave it off here. Go give money to a trans person.

Speechless: Not really on this list, because I’ve only seen it once and it’s pretty much all AAC all the time! If you haven’t seen it yet, I definitely recommend it!

Stranger Things: All-time best for last. There are nearly endless examples of AAC in this series, both supernatural and not. The absolute classic is Joyce figuring out Will can use the Christmas lights for both partner-assisted scanning and a letterboard – in fact, I keep a sticker design of this string light letterboard on the back of my phone case, just in case I ever lose battery and need an alternative form of AAC! But I especially love 011’s quiet improvised gestural AAC in Season 1 – pointing to her tattoo/herself to share her name, using the demogorgon figurine and the D&D board to explain where Will is, motioning as if with a gun to tell Mike that their lives are in danger. Just like Mike is figuring out on the fly how to best interact with a traumatized autistic, 011’s figuring out how to say more than she’s ever been allowed to say before. I know I said Booth/Brennan is kind of an OTP earlier (and mentioned my attachment to MSR), but I’ll take kids saving the world over a story about grownups any day. 011/Mike (and their communication) forever.**

**Sure they can be polyam with Max and Will (respectively) but like… sorry not sorry folks, it’s STILL 011/MIKE FOREVER. And yes, of course Mike is gay, 011 is his agender boyfriend. 

Okay, headcanon ranting over, silly blog post at its end. Are you an AAC user that has favorite AAC moments in one of these shows or another piece of media you love? Please tell me all about them in the comments!

Updates and a poem

It’s been over a year since I wrote a blog post. I won’t apologize, but you can trust that life got away from me and I regret the long absence! I am hopeful that I am in a better place now to write posts more regularly again.

I didn’t actually fall off the face of the earth a year ago, however: I’ve managed some other advocacy type things in the meantime. Here are some links for your perusal followed by some additional life updates:

I co-authored a research study on the experiences of speaking autistic adults who also use AAC.

If you don’t have access to the journal article, you might enjoy our presentation on the study’s results at the 2021 AAC in the Cloud conference.

My friend Sam convinced me to start a podcast. You can listen to AAC Town here and read the AAC Town transcripts here.

I was also a guest on an established podcast called Pigeonhole. Listen to or read our conversation about the stigma around digitized voices.

I appeared in Communication First’s short film “LISTEN” which was released on the same day as Sia’s movie “MUSIC” as a counterpoint that features actual nonspeaking autistics.

I most recently recorded my contribution to a panel at the upcoming Reinventing Quality conference. The topic under discussion is equitable access to healthcare for AAC users and people with intellectual and/or developmental disabilities. I’ll be in the chat while our session plays on August 9th to answer questions. 

I trialed low-gain hearing aids programmed for auditory processing problems and ended up thrilled to purchase a permanent set – expect a joint blog post with my friend soon about our experiences with this technology.

I’ve begun seeing a SLP who specializes in AAC with the goal of using Proloquo2Go more quickly, as well as using it more regularly for written communication such as texts, tweets, and emails. 

After a dearth of updates from myself, I decided to turn the Twitter wizard rock game I invented over to Zoe (of Dots and Lines). This was a stellar decision, because not only do they actually post, they post from a wider variety of bands than I ever did! Unfortunately, they were also one of our best players for Ravenclaw (also my Hogwarts house) as a participant, whereas it turns out I am actually terrible at playing the game I used to run. Thus, we are now perpetually trying to catch up to Slytherin. Anyway, follow Zoe and ping one of us for instructions if you’d like to play!

I suspect this list isn’t exhaustive, but I think those are the main things that have been nagging at me as “I really ought to tell my blog readers about X” over the last year. 

The last thing I’ll share in this update before the promised poem is that I just finished 14 weeks of a partial hospitalization program for my eating disorder relapse, and I’m feeling pretty optimistic about achieving/maintaining long-term recovery from that struggle. (You may know that there’s a frequent overlap between autism and anorexia, and that trans folks are at higher risk for eating disorders as well.) I wasn’t sure I was going to mention it here, and I may never actually put up a whole post about it, but I think it gives context for the poem I’d like to post. Here it is, written a month into treatment:

i struggle to break ground

these roots still infant – 

it’s difficult to build momentum

and push up into the spring air.

above, the shifting spots of sun beam down

and my arms grow green,

almost glow – 

spores on the underside

just slightly rough,

tan, perhaps shrinking,

under hypothetical human touch.

i was born curled

and this season tugging at my tips,

begging me to unfurl,

can seem almost harsh 

in its pull.

is this growth what i want?

was i always meant to span outward, upward,

to change shape and size and form –

not smoothly, perhaps, but steady?

is this springtime meant for me?

maybe every day is my birthday:

digesting not just soil and water

but the sheer sweet light of each dawn.

maybe i am ready to reach up – 

for real –

to grow into myself,

to greet the world open

expressing what i experience

and experiencing what i express.

i must remember this forest

was built to be my home.

i am not the only fern struggling up,

and even the trees’ windy whispers

and the birds’ hopeful songs

are shared with me on purpose.

grow, fiddlehead,

even if your fingers ache

and your base swells so far out.

grow, for these woods 

are only blessed by your green.

and someday

maybe someone

will walk by

and see you

for who you soon will be,

who you already are,

and who you always were.

how will you know?

they will smile.

AAC for your environment

I gave a presentation at AAC in the Cloud a couple weeks ago about backup AAC methods for when your primary communication method isn’t a good option, and decided to turn some of the content into blog posts for folks who aren’t video/slideshow type people. The first post was full of ideas for AAC that is wearable or highly portable, but in this post I want to tell you my ideas for communication supports you can work into your environment so that you’re never left without a way to communicate.

Some of these require that you have a fair amount of say over your own living space and work or school environment, so this is a good opportunity to practice self-advocacy skills around your access to communication supports. Please seek out support from other disabled activists or disability legal organizations if you encounter resistance from the people who control the environment you want to add AAC backups to. You have the right to have access to whatever communication methods you want at all times, not just your primary method when it’s most convenient for speaking people!

It’s worth mentioning that any of you who are not AAC users should feel free to use any of these ideas solely as models too, not putting pressure on the AAC user you’re supporting to use them in return. There is value in consistently demonstrating what symbols are connected to what letters are connected to what sounds, regardless of whether the AAC user ever chooses to use that exact communication support to express themselves. Many of us need visual input in order to understand you well anyway, and we need to see that it’s okay to switch between different communication methods, so you utilizing any of these ideas as a conversation partner is worthwhile whether or not we end up preferring the same methods.

One of the benefits of working backup AAC methods into your living spaces is that they can make communication quicker, because they’re designed for that environment. For example, to say, “stove on high?” in my device I have to navigate through various subfolders with nine button presses, but if I have a chart listing the different settings posted near the stove, all I have to do to confirm with my support staff what I’m supposed to do next is point to the chart’s symbol labeled “high”. A lot of the ideas listed here will sound like direct selection, but almost any of them can be used with partner assisted scanning instead, or head mounted laser pointers for folks who have more control over their head movements than their hands or feet.

Personally, most of my environmental AAC is in my kitchen, because that’s where I most frequently need to communicate with my support staff. We have a letterboard taped to the microwave and a symbols strip taped to the counter. When I lived with roommates, we had a giant whiteboard in the kitchen for announcements and phone messages, which I ended up using as AAC sometimes. Another option is to use magnetic letters or words on your fridge. In the dining room, you could put a custom placemat at the dinner table or high chair, or use a stack of passable communication cards with fringe vocabulary specific to meal time. Cooking and eating are times you might want to set your device aside to avoid mess, so these are settings you might especially want to use backups in.

In your living room, try placing a strip with pictures of favorite characters at the bottom of your television – or taped to the remote control – so you can pick out what movie you want to watch. Another idea is to put pictures of different kinds of Lego structures on the outside of the toy bin so that AAC users can tell their playmates what they want to build next. If you want access to core vocab on the same support, you can add words like watch, play, build, want, next, and all done, to these.

For families and people who share a bedroom, it’s a great idea to keep a giant letterboard with core words on the bedroom wall. You can keep a laser pointer on your bedside table for when you want to chat when you’re supposed to be asleep, like JJ uses in the show Speechless. Another good option for when you’re supposed to be asleep, or just when you can’t cope with bright lights, is a letterboard made from glow in the dark puff paint. An additional option for bedroom AAC is to put symbols for clothing items and different colors on the outside of a closet door or dresser to encourage self determination during dressing. If you have a bookshelf in your room but can’t reach all the shelves, have a strip of book cover images low down for when you want to request a certain story.

Having AAC backups in the bathroom is particularly important for folks who need support people with them there, and it’s another situation where you might want to avoid using your primary communication device if you’re worried about water exposure. So one option is a laminated symbols board in the shower featuring fringe words for bathing. For folks with some handwriting skills, soap crayons in the tub are easy to clean up later. And a small communication board near your toilet or changing table that includes words like stop, no, and do it myself, can encourage bodily autonomy and consent.

Even those of us who usually wear our primary AAC method on a strap or harness might not want to lug it around during playtime in the yard, so here’s a few ideas for outdoor backups. First, you can keep sidewalk chalk on the patio for writing and drawing messages. If you have play equipment, affix several well laminated communication cards to the chains on a swingset using keyrings, or tape symbols for vocabulary like up, down, again, now, and all done, to the side of a slide. If you have a pool or hot tub, use zip ties to wrap a laminated letterboard around a foam pool noodle or inflatable inner tube. 

Maybe a lot of you work in schools or attend school, so let’s talk about environmental accommodations to support communication in that setting. First of all, picture schedules or visual task lists you might already be using can double as AAC, because they’re a handy way to indicate which activity you want to talk about or begin doing. I use ASL based facial expressions while I point to my cooking task list with my support staff to indicate, okay, shall we do this next? It can be a lot less labor intensive to just point to something right there on the wall than to find the right buttons in my device. Yes, of course it’s good to practice knowing where the buttons are in my device too for situations where I’m not standing right next to a certain environmental support, but if multiple button presses can be fatiguing for you, it’s more important to have backup options available that will make communication as easy as possible at times when you’re too overwhelmed to navigate between folders or spell out a full word. Also note that some of us would rather use a visual schedule that incorporates the symbol system we’re also using on our devices or communication boards, so if you’re creating new visual supports like picture schedules, I recommend using screenshots from your device to help design each item.

Here are some other ideas for AAC backups worked into the school environment. First, try to get the school to place a giant board on the playground with core and fringe symbols for anyone to use. In the cafeteria, ask kitchen staff to post pictures of each day’s food options somewhere reachable for AAC users to point to their choices. On a more individual basis, you can tape a core word board or letterboard to your desk to use with your assistant or friends who sit nearby. For AAC users who like to handwrite, having a large whiteboard on one wall and getting preferential seating right next to it can be a communication support. If you like to use a laser pointer on a head mount, ask for a large core words board or letterboard to be placed on the wall opposite your desk. And if you struggle with being able to keep your hand raised, you can place a single battery operated button at your desk that plays a recording of the words, “I have something to say”, in order to get the teacher’s attention.

There are also ways to integrate backup AAC methods into a work or volunteer environment. If you have a desk or work station, you can make a flipbook style sign propped up next to you that features rotatable messages displaying your current ability to engage with coworkers, similar to status badges. Another light tech support would be having a “small talk” communication board on the wall near whatever serves as the metaphorical water cooler at your workplace, the spot people tend to stand around and chat. Then there’s the technological end of workplace backups. Many of us who use voice output devices find it difficult to integrate them into slideshow presentations and video chat meetings. Ask for access to slideshow software that automatically generates preprogrammed voice output each time you pull up the corresponding slide, and access to web conferencing programs that allow simultaneous participation via chat box rather than webcam only. These environmental supports only work if you’re actually allowed to use them, so talk to your HR department about appropriate policies that support your inclusion.

Have you ever been in a car or school bus or on a bike when you needed to say something but there was no way to use your primary AAC method? This is common! A lot of people remove their device from their wheelchair mount during transit, or the other passengers can’t hear their voice output over the road noise, or there’s no way for the driver or a fellow bicyclist to read their usual letterboarding, and all sorts of other barriers. Here are a few ideas. If you’re going to be in a back seat of a car or school bus where there will be other passengers who can give their visual attention, use zip ties or velcro to attach a letterboard to the back of the seat in front of you. If you’re usually the driver or a front seat passenger, affix a few different battery operated buttons to the dashboard that play single voice recordings of your choice for important messages like “I need to go to the bathroom now” or “I’m lost”. If you’re bicycling with a friend, place a bell on your bike and pre-establish a system for using it to signal yes and no to your communication partner’s spoken questions, for example one ring for yes and two for no. And for that matter, Morse code could be an option too if you want to get really fancy with the bell.
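For anyone curious how the fancy bell option could shake out, here’s a tiny illustrative sketch (purely hypothetical – not something from my actual setup, and the table names are made up) of how a communication partner could decode pre-agreed bell signals, from the simple one-ring-yes/two-rings-no system up to full Morse code:

```python
# Hypothetical signal table agreed on in advance: one ring for yes, two for no.
BASIC_SIGNALS = {1: "yes", 2: "no"}

# Getting really fancy: short and long rings as Morse code dots and dashes.
MORSE = {
    ".-": "A", "-...": "B", "-.-.": "C", "-..": "D", ".": "E",
    "..-.": "F", "--.": "G", "....": "H", "..": "I", ".---": "J",
    "-.-": "K", ".-..": "L", "--": "M", "-.": "N", "---": "O",
    ".--.": "P", "--.-": "Q", ".-.": "R", "...": "S", "-": "T",
    "..-": "U", "...-": "V", ".--": "W", "-..-": "X", "-.--": "Y",
    "--..": "Z",
}

def decode_rings(count):
    """Translate a simple ring count using the pre-agreed table."""
    return BASIC_SIGNALS.get(count, "unrecognized")

def decode_morse(letters):
    """Translate a sequence of dot/dash groups, one group per letter."""
    return "".join(MORSE.get(group, "?") for group in letters)

print(decode_rings(1))                         # yes
print(decode_morse([".-.", ".", "...", "-"]))  # REST
```

The point isn’t the code, of course – it’s that any system like this only works because both people established the meanings ahead of time, which is true of every backup method in this post.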

Okay, those are my ideas for environmental communication supports you could use as backup AAC methods or even one of your primary methods! Do you have ideas I didn’t list here? Please comment below!

Wearable and highly portable AAC ideas

I gave a presentation at AAC in the Cloud this week about creating backup AAC methods for when your primary method isn’t a good option. I wanted to turn a couple parts of this presentation into blog posts, and this will be the first one – a list of ideas for wearable and other highly portable AAC (that you could use as a backup or as your primary method). If you’re more of a video/slideshow person feel free to watch the presentation linked above instead (there’s a transcript of my speech in the notes section of the slideshow if you need that to follow along), but if you’re more of a reading-a-blog person this is for you!

Please note that I’m not trying to advertise any specific brands in the links provided below, I just want you to be able to see examples of the possibilities that are out there. If you’re considering purchasing something to use as wearable/portable AAC, do your research in order to choose between the sellers available. If you’re considering crafting something to use as wearable/portable AAC, do your research to see how others have created similar items.

First of all, for partner assisted scanning, you don’t need any physical objects at all, just another person who can provide options auditorily or in sign language. This is a great method of communication for people who are having a really hard time with motor control or physical fatigue, because you can use as little as one small muscle movement to indicate a positive response when your communication partner says the option you want to select. I have friends who frequently use their head or eye gaze to select options when their devices are broken – their support staff might say two options, holding up a fist for each in opposite locations, and the AAC user can turn their head or eyes towards the one they want to select. Another acquaintance of mine can stick out their tongue slightly to select an option or indicate “yes”. With a skilled communication partner these methods can lead to a huge range of self expression, and if you’re a support person for an AAC user I encourage you to learn as much as you can about how you can support robust communication using partner assisted scanning.

Another AAC method that’s completely portable because it doesn’t require any physical objects is expressive sign language, modified signs, or home signs. I don’t recommend using signed English or other gestural systems that are based on spoken grammar, because learning even a little bit of an actual sign language like ASL, including exposure to Deaf cultural norms, will lead to greater potential for connections with sign language communities across the lifespan. It also provides a better chance of access to interpreters who will understand you accurately as a communication support – for example, I often use interpreters for medical appointments. ASL or whatever your local sign language is can be a great backup AAC method. If you have motor skills impairments you might be able to learn modified signs that fit your abilities, and there are still interpreters who are skilled at understanding those. For AAC users who are certain they really only want to be able to communicate with their immediate family or caregivers, home signs or idiosyncratic gestures are another option for a backup method. Sometimes nonspeaking people will develop these kinds of home signs independently, so if you notice someone you know using the same gesture over and over, try to figure out what it means! It’s entirely possible it’s a stim, but it could be communicative.

Those are the nontangible forms of AAC that are probably the most portable of all methods, but there are almost endless objects you can use as wearable communication supports too. Note that a lot of these can also be used for partner assisted scanning with aided visual input, where your support person points to each option as they list it and waits for you to select the one you want. For example, a few months ago when neither my device nor my friend’s was available, I used a letterboard both to direct select my own messages where my friend could see them and to point to each letter in turn so my friend could select the ones that spelled out their messages. It was slow, but it was honestly lovely to be able to have a direct conversation together without needing fancy tech or speaking people to intervene!

Perhaps the most obvious way to wear AAC is to put your communication device on a strap or wheelchair mount! If you use a tablet like me and its case doesn’t feature a built-in handle, I recommend Otterbox’s harness and strap. I typically sling this over my shoulder when on the go so that my device is always nearby.

Additionally, if you almost always have your device on you but want to prepare for when it runs out of batteries, you can keep a small letterboard taped to the back of it or tuck one into its holster. Here is a link to a simple letterboard I designed that you’d just need to shrink or expand to your ideal size before printing. I definitely recommend laminating the printout so that it holds up over time – well, I recommend laminating pretty much everything. Small laminated letterboards are also easy to tuck into a purse or backpack.

(Image description: a small laminated letterboard with some core words and symbols peeks out from the holster of a tablet. The holster has a small neurodiversity symbol patch safety pinned to its handle.)

If you like to handwrite you can wear a miniature notebook and pen, or whiteboard and pen, around your neck or clipped to your shirt or belt. This can be homemade with dollar store materials or a pre-made set purchased online like this small whiteboard designed for nurses.

(Image description: a small green spiral notebook with a faint drawing of the Harry Potter “Deathly Hallows” symbol hangs from a loop of yarn, with pen attached with another small loop and tape.)

Even if your typical style isn’t exactly spooky, a Ouija board shirt or pillow conveniently doubles as a letterboard. Or this is even more niche, but if you’re a fan of the Netflix series Stranger Things, look for merchandise that displays Joyce’s string lights alphabet wall.

(Image description: A close up shot of a sticker with three rows of string lights, each letter of the alphabet scrawled underneath.)

You can attach a luggage tag that is printed with letters or important symbols to a keychain or lanyard. Sometimes there are luggage tags like this pre-made online, or you can print your own designs at home and seal them onto both sides of an old gift card with a sturdy glue or tape, punching a hole for the attachment – here is one I made. I recommend taping over the entire card with packing tape before punching the hole in order to provide a sort of homemade waterproofing. You can fit more than one of these on a single lanyard or keychain if you want to increase the vocabulary you can fit on it.

blog post 4

(Image description: a rounded rectangle card attached to a lanyard with metal clasp features words and symbols for yes, no, stop, go, help, and bathroom.)

Some companies custom print snap bracelets, or if you’re a seamstress you can hand embroider a fabric bracelet with symbols and words for a wearable and washable option. I have made several of these for myself and friends, and they’re really handy especially for quick interactions! You’re welcome to message me on Etsy if you’re interested in purchasing one from me. Another option for a homemade bracelet is to attach a small laminated symbols board to the cuff of an old sock.

blog post 5

(Image description: Two embroidered bracelets with velcro closure feature words and symbols for communication device, bathroom, help, I don’t know, meltdown, meds, need, not, can hear, drink, eat, go, maybe, semiverbal autistic, stop, and thank you.)

Football playbook wrist cuffs offer plastic sleeves you can insert your own letterboard or symbols board into. Here’s a picture of mine. It’s not perfectly waterproof, but it holds up decently since it’s designed for outdoor use.

blog post 6

(Image description: a wrist cuff with plastic sleeve worn on a white-skinned scarred arm unfolds to show a letterboard and a small symbols board with the words can hear, bathroom, hello, help, semiverbal autistic, and thank you.)

If you’re someone who always has your phone in your pocket or on a belt holster, a free or low-cost app like Speech Assistant can be handier for a quick conversation than getting out a bigger device. My phone doesn’t have great volume, but I frequently hold it out for a communication partner to read in circumstances like running errands or talking to neighbors.

A sturdy plastic alphabet stencil or letterboard can be attached to your backpack with a carabiner – here’s an example of my friend’s.

blog post 7

(Image description: a grey alphabet stencil with large capital letters and handle resting flat on a wooden surface.)

There are companies that custom print bandanas you could design with symbols or words tailored to your communication needs. I haven’t tried this myself, but I’m working on embroidering a letterboard that will be similarly foldable, washable, and hardy.

blog post 8

(Image description: a grid of embroidered rectangles on rough fabric with a letter faintly drawn into each, so far only A and B have been filled in with stitches)

You can buy pre-made sets of communication cards on keyrings, but I recommend printing and laminating your own so you can choose the vocabulary that will be most useful to you. This picture is a set I made for when I travel – it’s useful to have something with me while my iPad goes through the X-ray machine, in case I need to say anything to TSA agents. Again, I laminated with packing tape as a cheap alternative to paying for lamination at an office supply store.

blog post 9

(Image description: a stack of small cards hanging on a loop of yarn. Four cards are splayed out with symbols on each, and text only partially visible: 1) a person touching their mouth with text reading “thank you”, 2) a ticket machine with text reading “…checking in”, 3) a flight attendant with text reading “flight attendants… disabled and… communication”?, and 4) a security guard with text reading “extra time and help to get… security. I am autistic, it is… organize my things quickly and… complicated instructions.”)

For blind and low vision AAC users, one wearable option I saw online is a belt with Braille cards for different words. For folks not fluent in Braille, you could use raised line drawings. This would be great for AAC users who primarily use a tactile-based communication device but temporarily don’t have access to it.

Alphabet beads can be strung between spacer beads to make a necklace that’s right there for you when your letterboard isn’t handy. This option works best for folks with good fine motor skills.

blog post 10

(Image description: beads with letters on them strung alphabetically between rainbow-ordered star shaped beads)

A medical alert bracelet is another form of AAC! Holding it out to somebody and pointing is a way to disclose that you are nonspeaking, which can be helpful in emergencies or even in everyday situations where someone doesn’t understand why you’re not responding to them as expected. I also use mine to refer emergency medical staff to the full info sheet I keep in my wallet.

blog post 11

(Image description: A medical alert bracelet on a white-skinned wrist reads “autistic, semiverbal, needs Ipad to communicate fully. Complete medical info in wallet.”)

Status badges can communicate all sorts of things about your present state of mind. You could choose a certain one to wear on your shirt until it’s no longer needed, or keep a set pinned to keychain webbing or a foldable piece of fabric with you at all times so that you can point to relevant ones as needed. One option is using the red, yellow, and green communication badge system that the Autistic Self Advocacy Network recommends, but you can also purchase or make more specific messages indicating to people around you what you’re coping with or what you need. Here are some I’ve made.

blog post 12

(Image description: Colorful round pinback buttons read: I’m stressed out, I am out of spoons, I’m dissociating, my spoons are low, I’m overwhelmed right now, I’m in pain, I need a minute to think, I’d like to just sit quietly, I’m having trouble coping, I’m tired, I’m not doing well today, I’m okay but don’t feel like talking, I need to be alone right now.)

Other buttons and pins are forms of communication too! One example common in queer circles is pronoun pins that tell people around you what gender pronouns you go by. I wear these on my backpack and hat so that I don’t constantly have to find the right folder in my device to tell people they’re referring to me incorrectly. These are also available on my Etsy if they’d be helpful to you.

blog post 13

(Image description: colorful round pinback buttons feature sets of neopronouns – ze/hir/hirs, ve/ver/vis, fae/faer/faers, co/co/cos, xe/xem/xyr, zie/zir/zirs, and ey/eir/eirs.)

Okay, those are my ideas for wearable/portable AAC methods! I almost always have more than one with me as backups for when my device isn’t the best option. Do you wear your AAC everywhere too? What do you use? Feel free to comment below!

Top 5 signs you’re modeling wrong

I normally try not to be as abrasive as this post’s title suggests, and I don’t want to scare you away from using AAC around your kid (or another emerging communicator you know) – it’s so important. Where I’m coming from here is that I suspect you deeply care about this person in your life, and I also suspect you can do better. Caregivers aren’t given nearly enough resources to learn how to support the new AAC users they know. I’m sure, if I were the primary support person in a young AAC user’s life, I would be on a learning curve too! Because I don’t have much formal training around emerging first language skills – and, more to the point, modeling isn’t how I was introduced to AAC – please take these recommendations with a huge grain of salt. If you have the opportunity to learn from people who actually grew up being introduced to AAC via modeling, listen to their point of view over mine. But as an AAC user myself who has spent some time engaging with emerging communicators and their caregivers, I think my thoughts on this might have some ideas you need to hear. It’s really important to say from the outset that it’s difficult and probably unwise to make blanket recommendations for modeling, because every nonspeaking person has such different needs – in fact, many of my points below aren’t meant as “if you’re doing this thing you’re definitely absolutely doing it wrong” so much as “if you’re assuming this is definitely absolutely the best way to model, please seriously question it before proceeding”. Also, as you’ll see below, I sort of take issue with the standard vision of what modeling even means, so understand that I might be using this term in a different way than you are used to. With that understanding, please read on!

1. You’re not modeling. It’s been several hours since you touched a letterboard. You’re not sure whether the talker is actually charged. You’ve been spending much more of your time with the emerging communicator speaking than using AAC. In my opinion, this means you are not modeling enough. For me, self-directedly learning a symbols-based AAC app as an adult who was already fluent in receptive English and expressive written communication took so much dedicated practice and focus – and that was after I’d already spent a significant amount of time organizing and programming my own layout and choice of images. I can’t imagine trying to learn a symbols-based app that was solely designed by someone else at a stage in my life where I was still struggling to understand what was going on around me and what the nature of language even is. If you feel like you’re getting nowhere with modeling your kid’s AAC, the first thing you can do is model more than you’ve been doing.

Or maybe some of you have given up on AAC completely – you started to casually introduce a method or two, but after a few weeks or a few months of the nonspeaking person showing no interest, you set it aside and tried to make the best of the situation. But do you realize how much speech you were exposed to before you began to speak? An average 18-month-old has heard more than 4,000 hours of people speaking to them and around them. Was your AAC user exposed to 4,000 hours of seeing an AAC method (one that’s well suited to their needs) used to them and around them before you gave up? Some AAC users will only need 400, 100, 10 hours – but some might need 5,000, 8,000, 10,000. For nonspeaking people with processing issues or other receptive language impairments, learning to use AAC will be more difficult than it is for an average nondisabled baby learning to talk. Have you put in the time and effort to truly be able to say you gave it a good go and your child just isn’t able to or doesn’t want to communicate? Pick the AAC back up. Model more often and longer than you did the first time around. Don’t give up. Many of us are on different developmental timelines than abled people, but if your nonspeaking child can find and learn a communication method that works for them – whether that’s a few months from now, a few years from now, or as an adult – I can almost guarantee their quality of life is going to be so much better.
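
That 4,000-hours figure is easy to sanity check with some back-of-the-envelope math. The roughly 7.5 waking hours per day of ambient language below is my own assumed number for illustration, not something from the research:

```python
# Rough estimate of language exposure before a typical child starts talking.
# 7.5 hours/day of hearing speech is an assumption for illustration only.
days_in_18_months = 18 * 30  # ~540 days
hours_per_day = 7.5          # assumed ambient speech exposure

total_hours = days_in_18_months * hours_per_day
print(total_hours)  # 4050.0 – right around that "more than 4,000 hours" mark
```

The exact daily number doesn’t matter much – the point is that even modest daily exposure compounds into thousands of hours, and that’s the scale to keep in mind before deciding modeling “didn’t work”.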

2. You’re modeling. But… you just said… Yes, I did. But I want to take apart the preconceptions you might have about modeling. So many people think of modeling AAC as purely teaching, guiding, or coaxing. I challenge you to re-create your understanding of modeling and begin to use it primarily as communicating – as having a conversation. If you truly want your nonspeaking child to be able to communicate using AAC, that is what you need to model. Not:

[on device] Want cookie. Want cookie.

[then, in speech] “Do you want a cookie, Johnny? Say want cookie!” 

Rather, please literally just use the device to say “I’m getting out cookies for snack, Johnny, do you want some?” That’s it. That’s what you’d say to a nondisabled child you expect to learn to speak, right? So use AAC to say that. Besides the actual demonstration of where to find all those words on their device, you are also demonstrating that it is okay and normal and good to communicate using AAC. This will have not just a more immediate impact on the chance that your child will pick up AAC, but a lifelong impact on their self-esteem as a nonspeaking person. I know personally, growing up with only people speaking around me, I deeply internalized the idea that speech was a superior form of communication, and I’m still unlearning that years into my journey as an adult AAC user. By using AAC to truly communicate with your nonspeaking child, you are directly showing them that they are valued and loved and respected, not just a bucket to be filled with communication skills.

3. You’re taking the nonspeaking person’s voice away to model on. Our AAC should be considered an extension of our body that you don’t have a right to touch without our permission. So if it’s at all possible, please make a duplicate of your child’s system to model on so that theirs always remains with them. For high tech AAC users, the ideal form of this would be to have their same app on a separate device – for example, installing the Proloquo app that your child has on an iPad on your own phone for you to model on. Now, as a poor person, I absolutely realize that cost is an issue here. You might not have two devices in your household. The app you use might require you to pay twice to put it on two devices, and that could be prohibitively expensive. This is totally the reality for many families. If that’s the case, there are some other options to look into. Printing out each folder of your child’s high tech app means you can point to the same symbols they’re used to in the same layout they’re used to in a paper binder – or at the least, their home screen on a single hard copy communication board. This leaves their device with them for whatever and whenever they want to communicate, and teaches them that they have agency and autonomy over their own communication. If for some reason there’s really no way to have a printout either, I’d say, at the very least, normalize asking your child before you touch their device. Maybe they don’t have a reliable yes/no indicator yet – but still ask, every single time. Convey that this is something you realize you need consent for, whether or not they are currently able to give it. If you’re not sure what their answer is, go ahead and say that before proceeding: “Okay, I’m not sure what your answer is, so I’m going to go ahead and use your device, but you can take it away anytime.” When they do get to a point where they can express yes or no – in whatever form, even if it’s grabbing the device out of your hands – respect those answers.

4. You’re not using complete sentences. Maybe you noticed the way I phrased the cookie example above. I think that for many nonspeaking people, just like many speaking people, the best way to learn how to form complete sentences is to be regularly exposed to them from day one! This certainly isn’t universal – especially if you have evidence your child has receptive language problems, it might indeed make sense to go with the usual recommendation of modeling just one or two more words per utterance than they already use. And it’s not like it’s unheard of for parents to say things like “you want cookie?” to children learning how to speak. But I truly think that recommending that practice across the board for AAC modeling severely misses the point of “presume competence”. Not being able to form speech sounds reliably doesn’t inherently mean children are incapable of understanding and learning to use full language with the vocabulary and grammar other people are used to, and for many of them there’s no reason to introduce it only gradually. Yes, look for evidence that suggests they might need you to simplify your grammar when you model, but don’t start from that assumption. If you say things to them like “Do you want a cookie, Johnny?” and expect them to understand that, you may as well be modeling the full sentence on their AAC as well – not nonsense sentences like “cookie want?” or “more yes” that you would never say to a speaking person.

5. You’re still intimidated by the AAC system, or don’t feel comfortable using it to express yourself. If you, an adult with decades of experience using language (and tech) in general, are feeling too intimidated to use your child’s talker, how do you think they feel? I really think you need to know their device (or letterboard, etc) inside and out if you want them to ever be able to use it confidently. Yes, this can take a huge amount of time and practice. But what wouldn’t you do for your kid? Think of it like learning a foreign language – you need to set aside time every day to practice, not just with them but on your own. Once they’ve gone to bed, read books “aloud” to yourself using their device. Repeat dialogue from the TV show you have on in the background. Use the search function liberally if you’re not sure where to find a button, and then practice that motor pathway two or three times after locating it to help internalize it for next time you need to find it. Be patient with yourself; this isn’t necessarily going to come quickly. The more time and effort you put into learning how to use your child’s AAC system, the more effective you will be at modeling – not to mention, you’ll have a greater appreciation of just how much exposure to it they really need before you can start expecting them to use it themselves.

Okay, like I said at the outset, there’s probably no one perfect way to model AAC because every nonspeaking person is so different, but I hope this post has challenged you to rethink some of the typical suggestions you might have heard for how modeling Should Always Be Done. (Psst. I might follow up this post with a bonus +5 more signs in a few days, keep your eye out.)

Did you grow up with people modeling your AAC system? What did you like/not like about their approach? Please comment below!

Using visual supports as an autistic adult: a review

[Content warning: dental]

Recently I created a lot more visual supports for myself, including some picture schedules. Although I often favor text, I’ve realized from my experience with AAC that symbols can actually be helpful for me, and now that I’ve seen the benefit of using detailed picture instructions with my support staff to follow recipes, I thought it was worth trying to apply them to other aspects of life too.

Background: I’ve tried and abandoned various planners and to do list systems over the years. Google Calendar is probably the thing I’ve stuck with longest, and if they had a way to integrate pictures I might have stayed high tech. But I love not just visual tools but tangible tools for coping with executive dysfunction, so I wanted to make hard copy supports. Plus, I’m less likely to remember to go into a certain folder on my device to open up a picture schedule compared to having it physically sitting next to me on the couch where I’ll see it no matter what. I was hoping visual supports could help with various aspects of my executive dysfunction: regularly skipping repetitive self care tasks (looking at you, morning meds), stalling every night when it was time to begin my bedtime routine (inevitably tweeting “GO TO BED E*” in inertia-mired desperation), trouble initiating certain irregular tasks (why is it that plugging in and turning on the printer takes so much effort?), and endless other examples. And like many autistics, I just thrive on structure and knowing what to expect. Mental health professionals often think this means I should go back to work or school or get out of the house more to provide external structure, but they miss the point – I can create a structured life for myself without attempting high-spoons activities that I know I can’t sustain without ending up in autistic burnout and/or a bipolar episode. Just having a plan for my quiet life and knowing what to expect around the house on any given day is much more useful than trying to accomplish a heavy load of hard things that wear me down beyond repair.

But visual supports are just for kids, right? a voice nagged at the back of my head. No, no: they’re not. Disabled people grow up! Our needs may change over time, but many of us still appreciate visual text and/or pictures to support our learning, focus, and communication. Some of us prefer photographic imagery over symbols, or we may want to use words only – and the content of our schedules and routines may be very different from a child’s – but that doesn’t mean we don’t need or want visuals. Executive dysfunction doesn’t magically go away when we leave school or move into our own place – in fact, many of us might need this kind of support more as adults due to new work/living environments, increased demands on our cognitive load, decreased interpersonal supports, and/or the built-up effects of autistic burnout. The idea that picture schedules and other visual supports are only meant for children actively discourages disabled adults from accessing tools they need. It’s the fact that I’ve been part of a positive autistic community for a while now, a community that fights the stigma around using any needed supports across the lifespan, that got me to the place where I could ditch the internalized ableism around this and go ahead and create these tools for myself.

Before I go further, I want to take a minute to point out a few situations in which I hope you won’t use visual supports like these. 1) Don’t use visual supports to convince or train a disabled person to do something they don’t want to do, even if it’s what you think is best for them. 2) Don’t use visual supports to convince or train yourself to do something that overall impacts you negatively. (A couple examples to make it clear what I mean: you might not want to brush your teeth but still find the net effect on your well-being positive, whereas you might want to keep the house spotless but find the net effect on your well-being negative – in that case, go for it with the toothbrushing but please don’t use these ideas to get yourself to keep the house spotless, it’s not worth it.)

My process: I’d looked at premade visual schedules and sets of picture communication cards online and considered purchasing, but decided to make my own instead. I was able to customize the available options for each schedule (including many more adult-type tasks than are easy to find in premade sets online), use symbols I’m already familiar with from my AAC app, use typefaces I can read more comfortably, and spend less money on supplies for more total supports. Before starting, I did a giant brainstorm of what kinds of supports would be helpful for me (for example, a “morning routine” checklist) and what items each one would need to contain (for example, “meds” and “wash face”). I let those lists marinate for a few days so I could gradually add items I’d forgotten. Then I screenshotted the relevant buttons for each item from my symbols-based AAC app (Proloquo2Go), in some cases temporarily editing that button’s label to more closely match my intention for the visual support usage. I used those symbols for all my supports except for my kitchen inventory – for that one I used pictures of the actual brands I tend to buy, screenshotted from my local grocery store’s website. I inserted all these images into the Google doc I’d brainstormed items into, and played around with sizing before printing. After cutting out each item I “laminated” them with packing tape, and did the same to the backing pieces of cardboard most of my supports were destined to lie on. I then attached adhesive velcro dots to the back of each item, and placed opposite pieces of velcro on the various backing pieces for each support (or in the case of my shopping list, directly onto my fridge, landlord be damned). I added envelopes to hold loose items not currently in use, and the morning and evening routine boards have loops of yarn at the top so they can hang around my neck until I’ve completed everything.

The leaving-the-house checklist didn’t require so much crafting; I just taped the printed-out list on a single sheet onto the back of my front door.

Images, text descriptions, and notes on individual items:

image2

Image description: a piece of cardboard hanging from yarn labeled “morning routine” has two columns marked “to do” and “done”. “To do” contains a set of empty velcro dots, while “done” contains velcro dots with symbols and words attached to each. Items included are: coffee, dress, wash face, deodorant, glasses, hearing aids, medicine, October [cat], breakfast, brush teeth, and mouthwash.

Notes on morning routine: I keep these in a rough suggested order from top to bottom starting on the left column and continuing on the right, but don’t necessarily complete them in the set order. It’s nice to be able to move them onto the “done” side individually so that I always know what’s left no matter what order I’ve proceeded in.

image3

Image description: a piece of cardboard hanging from a loop of yarn labeled “bedtime routine” has two columns marked “to do” and “done”. “Done” contains a set of empty velcro dots, while “to do” contains velcro dots with symbols and words attached to each. Items included are: tomorrow’s schedule, plug devices in, hearing aids, pajamas, medicine, October [cat], brush teeth, mouthwash, and glasses.

Notes on bedtime routine: As with morning routine. Both are on loops of yarn so that I can wear them around my neck until everything’s complete. This prevents me from having to continually walk back to a section of wall or counter in a certain part of the house between each step, and makes it harder to get distracted and abandon the routine partway through.

image4

Image description: a foldable piece of cardboard has sections labeled “today” and “maybe” containing velcro dots, and an envelope labeled “another day”. Currently visible under “today” are moveable velcro dots attached to pictures and words for walk, sign language, blog, video chat, Twitter chat, recipe, and Etsy. Currently visible under “maybe” are pictures and words for yoga, audio book, wizard rock, and modding.

Notes on day to day schedule: Every evening I pull up my Google calendar as a reference and remove all the possible items from the schedule and envelope, sorting into piles for the following day. After returning any irrelevant items to “another day”, I place the “today” and “maybe” items at the bottom/right of each section so that I can move them to the top/left as they are completed. The gap in between tells me where I’ve left off and makes it easier to sort the following evening because I can see what’s been left undone. In total I made about 25 items that frequently repeat during my average week or month but don’t fit into an every-single-day routine like mornings and evenings. Activities include various carer appointments, visits from my support staff, errands, hobbies, self care, and more.

image5

Image description: a piece of cardboard labeled “PCA time” has a short list of velcro dots. Currently displayed items are: oatmeal, beans, counters/sink, and prep a recipe.

Notes on PCA agenda: Like the day to day schedule, I initially place the agenda for me and my support staff at the bottom of the short list so we can move them up to the top as they are completed. A small envelope (not pictured) below the chart contains other tasks we do frequently but not that day. This visual support that’s tacked to the kitchen wall doubles as AAC, because I can point to it when needed rather than finding the word on my device or signing.

image0

Image description: A white freezer has a line demarking two sections labeled “have plenty” and “need more” with several dozen velcro dots under each. About forty food items and household items are attached across the two categories, displaying a photographic image and large text for each.

Notes on kitchen inventory: This list continues down the fridge, but the photograph above gives you the basic idea. Not currently pictured are additional non-food items I regularly need to restock such as soap and toilet paper. If I’m struggling to think of what to eat, a glance at the “have plenty” side tells me what I own without having to dig through fridge and cupboards, and as I run out of each ingredient I can move it to the “need more” side – the latter of which can then be photographed just before leaving for the store as an instant, bad-handwriting-free shopping list that incorporates pictures.

image1(2)

Image description: a single sheet of paper is attached to a wooden background with masking tape, labeled “Leaving the house? Bring these things!” Below the heading are two columns of symbols and words for the following items: mask, wallet, keys, phone, hat, weather gear, sunscreen, Ipad, speaker, HA batteries, letterboards, chargers, caffeine, food, water, AAC bracelets, and stim toys.

Notes on leaving-the-house checklist: I didn’t bother attaching these items to moveable velcro dots because I don’t necessarily need every one of these items every single time I leave the house. Instead, I put them in rough order (top to bottom on the left and then continuing on the right) of how likely it is I will need each thing for any given time I exit the apartment – for example, I need mask, wallet, and keys just to take out the trash or walk to the corner store, but might not need to bring a backpack with a lot of the latter items unless I’m actually taking a longer adventure that day on transit.

Results so far:

  • I don’t always physically move every single item into the “done” category by the time I reach the end of the routine lists, but they definitely help me not get lost in the middle – and having them hanging around my neck makes it even harder to get lost partway through.
  • It does seem like I’m less likely to stall on bedtime now; at least the frequency of my “GO TO BED E*” tweets has decreased, which I imagine my followers appreciate. I have often been starting my routine ahead of my mental deadline rather than scrolling Twitter endlessly long past when I intended to move on.
  • That said, if I haven’t left the bedtime routine support actually within reaching distance of the couch (where I’m almost always sitting when it’s time to get ready for bed), it does nothing to help with inertia. Turns out standing up and crossing the room to pick up the schedule takes just as much cognitive effort as standing up and crossing the room to begin the actual routine – who would have guessed? – so the less-stalling effect only happens if I’ve left the schedule nearby.
  • The morning routine isn’t as smooth-going as bedtime, which seems to be because some of the things I only do every other day – and that therefore aren’t worked into the morning visual support – have to happen in between morning routine items (i.e. shower before dressing, yoga before putting on hearing aids, etc.). So I might start the morning routine when I first wake up, but then I set it down partway through to accomplish those irregular tasks, and it ends up being an hour or two before I actually complete everything. I’m also liable to get out Animal Crossing halfway through and get distracted by that for a while, oops. But it does seem like I’m more likely to eventually complete all the morning routine tasks than I used to be, so I do think the new support is still helping.
  • For the day to day schedule, I’m finding that if I change my mind and decide to skip something I meant to do under the “today” section, it can be hard to transition on to the next item. After some trial and error it seems like moving the skipped item down to the “maybe” section or even out of sight to the “another day” envelope is a suitable hack to get me over that AUGH-CHANGE-OF-PLANS cognitive barrier. But as always is true for me, it remains much easier for me to not do something I planned for a given day than to add something that I didn’t have in my brain as a possibility the night before. I think this is just a default quirk of my brain that the presence of picture schedules doesn’t seem to impact one way or the other.
  • There’s only so many empty velcro dots on the day to day schedule, so it’s harder to overbook myself spoons-wise!
  • I’m already noticing some important items I’m wanting to add to the day to day schedule, the PCA agenda, and the kitchen inventory, so I’ll probably do another printing-laminating session eventually to fill in the gaps.
  • I hadn’t been sure how many adhesive velcro dots to order, but a set of 250 was enough for this set of supports. I think the “laminating” process took less than one big roll of packing tape, which is much cheaper than actually laminating this much paper at the FedEx store.
  • Cooking isn’t easy for me (although having a support staff and detailed picture instructions have happily moved it up from “impossible” to “not easy”), so I’ve sometimes been forgetting to pay attention to whether I’m getting low on an ingredient as we’re partway through a recipe. But having a support person here means there’s someone to remind me to move the item to the “need more” section of the fridge, so I don’t think I’ve actually completely missed anything yet.

My recommendations:

  1. While prepurchased sets of picture schedules or other visual supports might be convenient to just click “buy” on, individualized homemade supports can be much more useful if you have the time and supplies.
  2. Do use symbols the user is already familiar with, or photographs of the actual items they’re used to.
  3. Consider what typeface you’re using if you’re including text – size, spacing, contrast, dyslexic friendly fonts, etc can all impact how usable a visual support is.
  4. Consider portability of each visual support, or if they’re not portable, exactly where they’re going to be placed in your home. This can significantly affect how easy it is to follow through on each task.
  5. Assume you’ll need to add more items to your supports after a few weeks or months of trialing your original plans. Keep a notepad nearby where you can jot down missing items as you think of them. Like choosing vocabulary for an AAC device, it’s just hard to predict all the details of what you’ll need without actually trying it out for a while.
  6. Most importantly: Involve the user in intent, design, and implementation as much as possible! Don’t reward or punish someone based on whether or not they use the visual support, and don’t trade rewards or stickers or whatever for completed tasks. Visual supports should be optional tools for people who are dissatisfied with the way executive dysfunction affects their own goals, not a method of training a disabled person to do what you want.

Thanks for reading such a detailed post! I hope it was helpful for you or someone you love. If you have something to add based on your own experience with visual supports, please post a comment below.

Communication access and ableism

Adapted from a presentation I gave to college students today.

Content warnings: eugenics, abuse, coronavirus

Thinking about communication accessibility is important because access barriers are a huge part of disabled people’s day to day lives, and it’s exhausting to so often be the only one addressing them. What shapes access barriers is structural and interpersonal ableism. Ableism demands all day every day that we as disabled people conform to the kinds of communication that are most convenient for abled people rather than those which are even slightly accessible to us. Not only that, it demands that the content of what we communicate be docile and submissive; we’re not supposed to stand up for our rights, disagree with mainstream culture, refuse unsolicited help, or self advocate. Let me give some pertinent examples.

Deaf and Hard of Hearing people have had to petition countless local governments around the world lately to add sign language interpretation to their press conferences and health advisories about coronavirus. At the level of government-provided information and mass media, this should not be seen as a bonus, unplanned-for accommodation that might be arranged upon request, but as a baseline accessibility measure required to meet the needs of the public at large. If it’s not implemented from the outset, disabled lives are threatened due to lack of the same information that abled people have free access to. Similarly, it’s disabled-run organizations and individuals that have taken it upon themselves to create cognitively accessible information about Covid – summarizing official briefings in plain language, adding illustrations and glossaries, and creating social stories for fellow autistic people devastated by the sudden change in daily routine. We do all this in part because we know that we are in that much more danger if we get sick. Even if we’re not personally immunocompromised or dealing with underlying conditions that make the virus particularly dangerous to our bodies, we know that disabled lives are seen as less valuable, to the extent that many jurisdictions make decisions about how to ration medical treatment such as ventilators based on the presence of other disabilities and how an outsider estimates the impact of disability on subjective quality of life. This means that disabled people, including people who are labeled unable to communicate and people with intellectual disabilities, can be sacrificed in favor of saving patients without pre-existing disabilities. This is one example of modern day eugenics. So when people fight for communication access, it is not about mere convenience or political correctness. It is a matter of life and death.

The federal government has released guidance on how to enact the Americans with Disabilities Act in regards to communication accommodations – and really, the law should be seen as the bare minimum for making something accessible. That guidance states that in deciding which communication accommodation to provide, primary consideration should be given to the disabled person’s preference, which brings us around to the idea of communicative choice. It should be up to each disabled person to choose the communication methods that work best for us in the moment. This is often not going to look the same as abled communication, and it’s not necessarily even going to look the same from day to day. I switch regularly between AAC that uses spelling versus AAC that uses picture symbols, high tech AAC versus light tech AAC, sign language, plus a bit of speech. I remember noticing one day a few weeks ago that it was only noon and I had already used five forms of AAC that morning. I’m proud of being a multimodal communicator. Have you ever been in the middle of writing an email when the phone rings, and then while you’re talking your roommate comes over to ask something and you hold up a hand to them to mean wait, i’m on the phone? See, you’re a multimodal communicator too, but if you’re abled, people probably don’t consider that anything but normal. Yet if people are used to me speaking, they are surprised I often need AAC. If they’re used to me using my communication device, they are surprised I might prefer a sign language interpreter. When you’re disabled, your methods of communication are under endless scrutiny, skepticism, and ableist expectations.

So what’s important is to support disabled people’s communication by giving us as many options as possible, and to validate whatever method we choose at any given moment. This means giving Deaf and Hard of Hearing children access to sign language and Deaf culture from day one rather than pushing oralism, and giving neurodivergent children access to AAC from the very first indication of a speech delay rather than pushing speech skills. It means adding image descriptions to your social media posts, whether or not you happen to know that a blind person will read them. It means celebrating autistic people’s echolalia, and infodumping about our special interests, and stimming, as important and valid forms of communication. And sometimes it might mean having conversations that aren’t totally comfortable for you. It’s more respectful for you to ditch your self-consciousness and just go ahead and ask us how you can best support our individual communication, rather than being a conversation partner who finishes sentences for us, changes the subject while we’re typing, reads over our shoulders, touches our communication devices, or demands details about our medical history, without having asked whether any of that is okay or helpful for us. Sometimes we will say things that make you feel uncomfortable too. In autistic culture we are often upfront and direct in how we talk about things; this isn’t seen as rude but just as normal for our kind of brain. We might not know how to translate it into the kinds of phrasing that neurotypicals expect, or we may have too much going on at the moment to be able to spend the energy doing so.

Other times disabled people of any neurotype can find themselves forced to be rude in order to be listened to! When you are blind and people grab you without consent, forcing you to walk in the direction they think is helpful – or you use a mobility device and people take it upon themselves to move you around like furniture, despite your protests – these violations of bodily integrity are what is truly rude, and confronting them is a kind of self advocacy that abled people don’t often want to hear. When they’ve invested their ego in the idea that they help the disabled, that they use their superiority over us to make decisions in our best interests, they think us saying “don’t do that” or “I don’t need your help” or “leave me alone” is inappropriate or ungrateful. Let alone if we say those things without speech, at which point they might not consider it communication at all. It might be written off as just a behavior or a symptom, not a message that deserves consideration. Even within forms of AAC, sometimes speaking people privilege the ones that come closest to typical speech. They’d rather I use a device with a QWERTY keyboard and expressive voice output to compose full sentences in proper English, whereas if what I’m able to do that day is just point to hard copy picture symbols silently, a verb here, a noun there – somehow that makes me less of a person. But all communication should be honored, and all people should be respected. Saying no, whether that’s with mouth words or some other way, is an important communicative function. Noncompliance and refusal are valuable social skills that help us set boundaries and protect us from the abuse we as disabled people are so likely to be subjected to.

In summary, we deserve to be our own decision makers about how we communicate and the content of our messages. Existing ableist structures and ideals are not a legitimate reason to withhold information from us, refuse to provide us with communication accommodations, or dismiss our messages. Communication access is a human right that is frequently denied to disabled people, and if you’re abled part of your job as an ally/accomplice is to help us fight for that access.

AAC is not just for requesting! Creative ways to use AAC

I think too often professionals and caregivers of AAC users only focus on using AAC – especially symbols-based systems – for communicative functions such as requesting. That really limits how much can actually be done with these powerful apps! I use my symbols-based AAC (Proloquo2Go) in a much wider variety of ways; here are some examples:

Picture schedule: Why buy a separate set of physical or electronic event/task images when you already have symbols based AAC? AAC users or our support people can create a page to edit each night for the upcoming day, or create folders for the sequences of steps in complicated activities like cooking. Many of us who struggle with transitions and executive dysfunction can benefit from visual supports like this.

Flashcards: By creating a folder of vocabulary we’re trying to learn in a second language, setting the buttons to show image only, and programming the speak field to feature the foreign word, AAC users can use their symbols-based AAC system as a flashcard study app. This is especially useful for learners like me who do best in immersion settings – other flashcard apps and picture dictionaries are always mixing in English, which can make it harder to learn the new language. (Please note that this idea should be used only for self-directed learners, not for forcing emerging communicators to go through drills to prove their competence.)

Navigate meltdowns: Some AAC users who usually use text-to-speech on a QWERTY keyboard may find that they lose this ability during meltdowns and shutdowns – but that they may still be able to use a symbols-based system during these times. One major reason I like having Proloquo2Go as well as Proloquo4Text (which is QWERTY-based) is that I can switch to symbols when necessary. Last year I had to go to class in the midst of this situation, and it turned out my classmate was totally cool with me composing messages via images to say things like “have meltdown hard communicate”. Normally I’d be able to type more complex sentences on QWERTY, like “I just had a meltdown and am still having a hard time communicating”, but in that moment, if it weren’t for a symbols-based program, I wouldn’t have been able to interact at all.

Write poetry: AAC users can select images that evoke a scene and then use the words to write a poem describing the feeling it gives us. I especially enjoy poetry because poets are given more leeway to break the rules of “proper English”, something that AAC users are often discouraged from doing. But communication is about conveying an idea, not about grammar and syntax and spelling and pronunciation! If we can get our message across using a haphazard series of nouns and verbs, that is still valid communication. Poetry is a venue where this kind of creative use of words can be valued.

Write prose: AAC users can write school assignments or even extracurricular fiction by composing their sentences in a symbols-based app and then copying and pasting into a word processor. I have written sections of my novels this way!

Post to social media/emails: Similarly, by composing a message in our symbols-based apps and then copying and pasting to social media or email, AAC users can participate in online communities using the kind of communication we prefer. I sometimes use my symbols-based app to livetweet my favorite TV shows; it’s a great way to share my special interests with others and get more familiar with the app.

Give presentations: AAC users can program our scripts into a series of buttons and practice by running a stopwatch to make sure our words play in the desired amount of time. This could be for a school project, an open mic night, advocating for ourselves at an IEP meeting, or meeting with our senators on disability rights issues. I regularly use my device to give presentations about autism, disability, accessibility, and AAC.

Special interest infodump: Autistic AAC users like me might enjoy utilizing the way symbols-based systems organize categories and folders to store information about our special interests. I have folders full of hundreds of Harry Potter characters, spells, et cetera, so that I can talk about the canon I love with other fans.

Vocal stimming and echolalia: Many autistic people like me use vocal stimming and echolalia to modulate our sensory environment and communicate. This shouldn’t be limited to speaking people; it’s a totally valid way to use AAC! Don’t discourage us from “playing” with our systems – having the freedom to press buttons over and over, to use buttons that repeat phrases from our favorite movies, or to play buttons at random as experimentation can encourage emerging communicators to feel comfortable using AAC.

Prompt speech: This isn’t commonly understood, but some of us can speak words aloud only when they are in front of us visually. So we can use an AAC system to compose what we want to say, and once we have selected the right buttons we may be able to read the screen aloud rather than using our device’s synthesized speech. Please don’t pressure us to do this, and don’t expect us to read a message you composed for us! This is just one more tool that may give us additional agency over our communication.

I hope you got some new ideas from this list that you can try out and share with other AAC users! If you have discovered more creative uses for your own AAC, please add your thoughts in a comment below.

Why do some autistics like watching the same media over and over?

I can’t speak for all autistics, but there are a lot of reasons I watch the same media over and over! I have about 10 long series that I watch on endless loop – I restart one, watch every episode in order, and then restart the next one (I keep a spreadsheet), ad nauseam. Except for me it’s not ad nauseam – it’s the main, maybe only, way I can enjoy media! In this post I’ll go over a few reasons why, in case it gives anyone insight as to why you or your autistic loved one might be doing the same thing I do.

Routine: This might be the obvious answer, but it’s not unimportant. Many of us just thrive on routine. Even if all other elements were neutral, it is inexplicably reassuring and comforting to watch the same shows over and over. In a chaotic world where we may not always know what to expect, coming home to a familiar show can feel like a weighted blanket or a soft stuffed animal.

Prosopagnosia (faceblindness): I am not completely faceblind, but it does take me a huge amount of repeated exposure to any given face before I begin to recognize it reliably. For this reason, (re)watching TV shows that have several seasons with the same main cast of characters keeps me oriented to which character is which. In contrast, watching a two-hour-long movie would just be confusing: it’s very difficult to understand what’s going on when, for the first two-thirds (at least) of the plot, I can’t even tell if I’ve met any given main character yet, let alone what they said or did in previous scenes. Sometimes I recognize an actor by their voice, but unless I’ve seen multiple seasons’ worth of their appearances – ideally over and over – their face is likely to be a mystery to me. Occasionally even actors I am very familiar with are unrecognizable out of context – once, in the middle of a DM conversation about Gillian Anderson, a friend sent me a picture of Anderson. I’ve seen X-Files at least five times through, but this was an out-of-context photo where her hair and outfit were different from what I’m used to. My response to my friend: “who’s that person?” I was baffled as to why she had sent me a random photo of what to me registered as a stranger.

Auditory processing: Captions can help a lot with auditory processing, but so can rewatching media. Captions don’t usually account for background music or sound effects, and even with captions it might take me a few times through any given scene before I’m integrating all of that correctly. Crucially, jump scares and other startling sounds/lights/movements can somewhat be cognitively prepared for if you know what’s coming when. Watching a series from beginning to end on Netflix means I don’t have to turn down the volume for every artificially loud commercial break like I would on a standard television, and I can skip the theme songs if they’re also too loud (or if they’ve recently changed – that bugs the heck out of me).

Understanding the plot: I guess this makes me feel a little silly, but I genuinely don’t understand the plot of many shows the first time through. Every time a new season of Stranger Things comes out, it takes me at least three times through before I start to understand why things happened the way they did. It seems like I just don’t always clue in to the elements the creators expect neurotypicals to automatically notice. I didn’t fully realize how true this was until I watched a couple of shows with audio descriptions. While I wish the audio descriptions were also captioned, what I could catch of them was amazing. They pointed out crucial elements of each scene I was supposed to be attending to but often wasn’t – facial expressions, body language, visual elements that set the backdrop with clues and ingredients of later subplots. A bonus is that audio descriptions often name the character seen emoting on screen, which helps with prosopagnosia. But they’re available for so few shows that, in most cases, it’s only rewatching multiple times that can help me meet these access needs. Repetition helps me grasp each step of the plot and how it’s all connected. I start to figure out characters’ motivations and understand the worldbuilding rules that shape the story.

There are probably many more reasons other autistic people might prefer to rewatch media; these are just the biggest contributing factors for me. What are yours? Comment below!