Wearable and highly portable AAC ideas

I gave a presentation at AAC in the Cloud this week about creating backup AAC methods for when your primary method isn’t a good option. I wanted to turn a couple parts of this presentation into blog posts, and this will be the first one – a list of ideas for wearable and other highly portable AAC (that you could use as a backup or as your primary method). If you’re more of a video/slideshow person feel free to watch the presentation linked above instead (there’s a transcript of my speech in the notes section of the slideshow if you need that to follow along), but if you’re more of a reading-a-blog person this is for you!

Please note that I’m not trying to advertise any specific brands in the links provided below, I just want you to be able to see examples of the possibilities that are out there. If you’re considering purchasing something to use as wearable/portable AAC, do your research in order to choose between the sellers available. If you’re considering crafting something to use as wearable/portable AAC, do your research to see how others have created similar items.

First of all, for partner assisted scanning, you don’t need any physical objects at all, just another person who can provide options auditorily or in sign language. This is a great method of communication for people who are having a really hard time with motor control or physical fatigue, because you can use as little as one small muscle movement to indicate a positive response when your communication partner says the option you want to select. I have friends who frequently use their head or eye gaze to select options when their devices are broken – their support staff might say two options, holding up a fist for each in opposite locations, and the AAC user can turn their heads or eyes towards the one they want to select. Another acquaintance of mine can stick out their tongue slightly to select an option or indicate “yes”. With a skilled communication partner these methods can lead to a huge range of self expression, and if you’re a support person for an AAC user I encourage you to learn as much as you can about how you can support robust communication using partner assisted scanning.

Another AAC method that’s completely portable because it doesn’t require any physical objects is expressive sign language, modified signs, or home signs. I don’t recommend using signed English or other gestural systems that are based on spoken grammar, because learning even a little bit of an actual sign language like ASL, including exposure to Deaf cultural norms, will lead to greater potential for connections with sign language communities across the lifespan. It also provides a better chance of access to interpreters who will understand you accurately – for example, I often use interpreters as a communication support for medical appointments. ASL, or whatever your local sign language is, can be a great backup AAC method. If you have motor skills impairments you might be able to learn modified signs that fit your abilities, and there are still interpreters who are skilled at understanding those. For AAC users who are certain they really only want to be able to communicate with their immediate family or caregivers, home signs or idiosyncratic gestures are another option for a backup method. Sometimes nonspeaking people will develop these kinds of home signs independently, so if you notice someone you know using the same gesture over and over, try to figure out what it means! It’s entirely possible it’s a stim, but it could be communicative.

Those are the nontangible forms of AAC that are probably the most portable of all methods, but there are almost endless objects you can use as wearable communication supports too. Note that a lot of these can also be used for partner assisted scanning with aided visual input, where your support person points to each option as they list it and waits for you to select the one you want. For example, a few months ago when neither my device nor my friend’s was available, I used a letterboard both to directly select what I wanted to say where my friend could see it and to point to each letter in turn so they could select the ones that spelled out their messages. It was slow, but it was honestly lovely to be able to have a direct conversation together without needing fancy tech or speaking people to intervene!

Perhaps the most obvious way to wear AAC is to put your communication device on a strap or wheelchair mount! If you use a tablet like me and its case doesn’t feature a built-in handle, I recommend Otterbox’s harness and strap. I typically sling this over my shoulder when on the go so that my device is always nearby.

Additionally, if you almost always have your device on you but want to prepare for when it runs out of batteries, you can keep a small letterboard taped to the back of it or tuck one into its holster. Here is a link to a simple letterboard I designed that you’d just need to shrink or expand to your ideal size before printing. I definitely recommend laminating the printout so that it holds up over time – well, I recommend laminating pretty much everything. Small laminated letterboards are also easy to tuck into a purse or backpack.


(Image description: a small laminated letterboard with some core words and symbols peeks out from the holster of a tablet. The holster has a small neurodiversity symbol patch safety pinned to its handle.)

If you like to handwrite you can wear a miniature notebook and pen, or whiteboard and pen, around your neck or clipped to your shirt or belt. This can be homemade with dollar store materials or a pre-made set purchased online like this small whiteboard designed for nurses.


(Image description: a small green spiral notebook with a faint drawing of the Harry Potter “Deathly Hallows” symbol hangs from a loop of yarn, with pen attached with another small loop and tape.)

Even if your typical style isn’t exactly spooky, a Ouija board shirt or pillow conveniently doubles as a letterboard. Or, even more niche: if you’re a fan of the Netflix series Stranger Things, look for merchandise that displays Joyce’s string lights alphabet wall.


(Image description: A close up shot of a sticker with three rows of string lights, each letter of the alphabet scrawled underneath.)

You can attach a luggage tag that is printed with letters or important symbols to a keychain or lanyard. Sometimes there are luggage tags like this pre-made online, or you can print your own designs at home and seal them onto both sides of an old gift card with a sturdy glue or tape, punching a hole for the attachment – here is one I made. I recommend taping over the entire card with packing tape before punching the hole in order to provide a sort of homemade waterproofing. You can fit more than one of these on a single lanyard or keychain if you want to increase the vocabulary you can fit on it.


(Image description: a rounded rectangle card attached to a lanyard with metal clasp features words and symbols for yes, no, stop, go, help, and bathroom.)

Some companies custom print snap bracelets, or if you’re a seamstress you can hand embroider a fabric bracelet with symbols and words for a wearable and washable option. I have made several of these for myself and friends, and they’re really handy, especially for quick interactions! You’re welcome to message me on Etsy if you’re interested in purchasing one from me. Another option for a homemade bracelet is to attach a small laminated symbols board to the cuff of an old sock.


(Image description: Two embroidered bracelets with velcro closure feature words and symbols for communication device, bathroom, help, I don’t know, meltdown, meds, need, not, can hear, drink, eat, go, maybe, semiverbal autistic, stop, and thank you.)

Football playbook wrist cuffs offer plastic sleeves you can insert your own letterboard or symbols board into. Here’s a picture of mine. It’s not perfectly waterproof, but it holds up decently as it’s designed for outdoor use.


(Image description: a wrist cuff with plastic sleeve worn on a white-skinned scarred arm unfolds to show a letterboard and a small symbols board with the words can hear, bathroom, hello, help, semiverbal autistic, and thank you.)

If you’re someone who always has your phone in your pocket or on a belt holster, a free or low-cost app like Speech Assistant could be handier for a quick conversation than getting out a bigger device. My phone doesn’t have great volume, but I frequently hold it out for a communication partner to read in circumstances like running errands or talking to neighbors.

A sturdy plastic alphabet stencil or letterboard can be attached to your backpack with a carabiner; here’s an example of my friend’s.


(Image description: a grey alphabet stencil with large capital letters and handle resting flat on a wooden surface.)

There are companies that custom print bandanas you could design with symbols or words tailored to your communication needs. I haven’t tried this, but I’m working on embroidering a letterboard that will be similarly foldable, washable, and hardy.


(Image description: a grid of embroidered rectangles on rough fabric with a letter faintly drawn into each, so far only A and B have been filled in with stitches)

You can buy pre-made sets of communication cards on keyrings, but I recommend printing and laminating your own in order to choose the vocabulary that is going to be most useful to you. The set pictured here is one I made for when I travel; it’s useful to have something with me while my iPad has to go through the X-ray machine, in case I need to say anything to TSA agents. Again, I laminated with packing tape as a cheap alternative to paying for lamination at an office supply store.


(Image description: a stack of small cards hanging on a loop of yarn. Four cards are splayed out with symbols on each, and text only partially visible: 1) a person touching their mouth with text reading “thank you”, 2) a ticket machine with text reading “…checking in”, 3) a flight attendant with text reading “flight attendants… disabled and… communication”?, and 4) a security guard with text reading “extra time and help to get… security. I am autistic, it is… organize my things quickly and… complicated instructions.”)

For blind and low vision AAC users, one wearable option I saw online is a belt with Braille cards for different words. For folks not fluent in Braille, you could use raised line drawings. This would be great for AAC users who primarily use a tactile based communication device but temporarily don’t have access to it.

Alphabet beads can be strung between spacer beads to make a necklace that’s right there for you when your letterboard isn’t handy. This would be best for folks who have good fine motor skills.


(Image description: beads with letters on them strung alphabetically between rainbow-ordered star shaped beads)

A medical alert bracelet is another form of AAC! Holding it out to somebody and pointing is a way to disclose that you are nonspeaking, which can be helpful in emergencies or even in everyday situations where someone doesn’t understand why you’re not responding to them as expected. I also use mine to refer emergency medical staff to the full info sheet I keep in my wallet.


(Image description: A medical alert bracelet on a white-skinned wrist reads “autistic, semiverbal, needs iPad to communicate fully. Complete medical info in wallet.”)

Status badges can communicate all sorts of things about your present state of mind. You could choose a certain one to wear on your shirt until it’s no longer needed, or keep a set pinned to keychain webbing or a foldable piece of fabric with you at all times so that you can point to relevant ones as needed. One option is using the red, yellow, and green communication badge system that the Autistic Self Advocacy Network recommends, but you can also purchase or make more specific messages indicating to people around you what you’re coping with or what you need. Here are some I’ve made.


(Image description: Colorful round pinback buttons read: I’m stressed out, I am out of spoons, I’m dissociating, my spoons are low, I’m overwhelmed right now, I’m in pain, I need a minute to think, I’d like to just sit quietly, I’m having trouble coping, I’m tired, I’m not doing well today, I’m okay but don’t feel like talking, I need to be alone right now.)

Other buttons and pins are forms of communication too! One example common in queer circles is pronoun pins that tell people around you what gender pronouns you go by. I wear these on my backpack and hat so that I don’t constantly have to find the right folder in my device to tell people they’re referring to me incorrectly. These are also available on my Etsy if they’d be helpful to you.


(Image description: colorful round pinback buttons feature sets of neopronouns – ze/hir/hirs, ve/ver/vis, fae/faer/faers, co/co/cos, xe/xem/xyr, zie/zir/zirs, and ey/eir/eirs.)

Okay, those are my ideas for wearable/portable AAC methods! I almost always have more than one with me as backups for when my device isn’t the best option. Do you wear your AAC everywhere too? What do you use? Feel free to comment below!

Top 5 signs you’re modeling wrong

I normally try not to be as abrasive as this post’s title suggests, and I don’t want to scare you away from using AAC around your kid (or another emerging communicator you know) – it’s so important. Where I’m coming from here is that I suspect you deeply care about this person in your life, and I also suspect you can do better. Caregivers aren’t given nearly enough resources to learn how to support the new AAC users they know. I’m sure, if I were the primary support person in a young AAC user’s life, I would be on a learning curve too! Because I don’t have much formal training around emerging first language skills, and because modeling isn’t how I was introduced to AAC, please take these recommendations with a huge grain of salt – if you have the opportunity to learn from people who actually grew up being introduced to AAC via modeling, listen to their point of view over mine. But as an AAC user myself who has spent some time engaging with emerging communicators and their caregivers, I think my thoughts on this might have some ideas you need to hear.

It’s really important to say from the outset that it’s difficult and probably unwise to make blanket recommendations for modeling, because every nonspeaking person has such different needs – in fact, many of my points below are not intended as “if you’re doing this thing you’re definitely absolutely doing it wrong” but rather “if you’re assuming this is definitely absolutely the best way to model, please seriously question it before proceeding”. Also, as you’ll see below, I sort of take issue with the standard vision of what modeling even means, so understand that I might be using this term in a different way than you are used to. With that understanding, please read on!

1. You’re not modeling. It’s been several hours since you touched a letterboard. You’re not sure whether the talker is actually charged. You’ve been spending much more of your time with the emerging communicator speaking than using AAC. In my opinion, this means you are not modeling enough. For me, self-directedly learning a symbols-based AAC app as an adult who was already fluent in receptive English and expressive written communication took so much dedicated practice and focus – and that was after I’d already spent a significant amount of time organizing and programming my own layout and choice of images. I can’t imagine trying to learn a symbols-based app that was solely designed by someone else at a stage in my life where I was still struggling to understand what was going on around me and what the nature of language even is. If you feel like you’re getting nowhere with modeling your kid’s AAC, the first thing you can do is model more than you’ve been doing.

Or maybe some of you have given up on AAC completely – you started to casually introduce a method or two, but after a few weeks or a few months of the nonspeaking person showing no interest, you set it aside and tried to make the best of the situation. But do you realize how much speech you were exposed to before you began to speak? An average 18-month-old has heard more than 4,000 hours of people speaking to them and around them. Was your AAC user exposed to 4,000 hours of seeing an AAC method (one that’s well suited to their needs) used to them and around them before you gave up? Some AAC users will only need 400, or 100, or 10 hours – but some might need 5,000, 8,000, or 10,000. For nonspeaking people with processing issues or other receptive language impairments, learning to use AAC will be more difficult than it is for an average nondisabled baby learning to talk. Have you put in the time and effort to truly be able to say you gave it a good go and your child just isn’t able to or doesn’t want to communicate? Pick the AAC back up. Model more often and longer than you did the first time around. Don’t give up. Many of us are on different developmental timelines than abled people, but if your nonspeaking child can find and learn a communication method that works for them – whether that’s a few months from now, a few years from now, or as an adult – I can almost guarantee their quality of life is going to be so much better.
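For the number-minded, here’s a quick back-of-the-envelope sketch of what that figure works out to per day. The 4,000-hour estimate comes from the paragraph above; the days-per-month average is my own assumption for the arithmetic, not a figure from research.

```python
# Rough arithmetic: what does 4,000 hours of language exposure
# by 18 months of age average out to per day?
# (The days-per-month value is an assumption for illustration.)
DAYS_PER_MONTH = 365.25 / 12          # ~30.44 days
days = 18 * DAYS_PER_MONTH            # ~548 days in 18 months
hours_per_day = 4000 / days           # average exposure per day
print(f"about {hours_per_day:.1f} hours per day")  # about 7.3 hours per day
```

In other words, matching that much exposure would mean AAC woven through most of a child’s waking hours, which is exactly why “model more” is the theme of this whole section.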

2. You’re modeling. But… you just said… Yes, I did. But I want to take apart the preconceptions you might have about modeling. So many people think of modeling AAC as purely teaching, guiding, or coaxing. I challenge you to re-create your understanding of modeling and begin to use it primarily as communicating, as having a conversation. If you truly want your nonspeaking child to be able to communicate using AAC, that is what you need to model. Not:

[on device] Want cookie. Want cookie.

[then, in speech] “Do you want a cookie, Johnny? Say want cookie!” 

Rather, please literally just use the device to say “I’m getting out cookies for snack, Johnny, do you want some?” That’s it. That’s what you’d say to a nondisabled child you expect to learn to speak, right? So use AAC to say that. Besides the actual demonstration of where to find all those words on their device, you are also demonstrating that it is okay and normal and good to communicate using AAC. This will have not just a more immediate impact on the chance that your child will pick up AAC, but a lifelong impact on their self-esteem as a nonspeaking person. I know personally, growing up with only people speaking around me, I deeply internalized the idea that speech was a superior form of communication, and I’m still unlearning that years into my journey as an adult AAC user. By using AAC to truly communicate with your nonspeaking child, you are directly showing them that they are valued and loved and respected, not just a bucket to be filled with communication skills.

3. You’re taking the nonspeaking person’s voice away to model on. Our AAC should be considered an extension of our body that you don’t have a right to touch without our permission. So if it’s at all possible, please make a duplicate of your child’s system to model on so that theirs always remains with them. For high tech AAC users, the most ideal form of this would be to have their same app on a separate device – for example, installing the Proloquo app that your child has on an iPad on your own phone for you to model on. Now, as a poor person, I absolutely realize that cost is an issue here. You might not have two devices in your household. The app you use might require you to pay twice to put it on two devices, and that could be prohibitively expensive. This is totally the reality for many families. If that’s the case, there are some other options to look into. Printing out each folder of your child’s high tech app means you can point to the same symbols they’re used to in the same layout they’re used to in a paper binder – or at the least, their home screen on a single hard copy communication board. This leaves their device with them for whatever and whenever they want to communicate, and teaches them that they have agency and autonomy over their own communication. If for some reason there’s really no way to have a printout either, I’d say, at the very least normalize asking your child before you touch their device. Maybe they don’t have a reliable yes/no indicator yet – but still ask, every single time. Convey that this is something you realize you need consent for, whether or not they are currently able to give it. If you’re not sure what their answer is, go ahead and say that before proceeding. 
“Okay, I’m not sure what your answer is, so I’m going to go ahead and use your device, but you can take it away anytime.” When they do get to a point where they can express yes or no – in whatever form, even if it’s grabbing the device out of your hands – respect those answers.

4. You’re not using complete sentences. Maybe you noticed the way I phrased the cookie example above. I think that for many nonspeaking people, just like many speaking people, the best way to learn how to form complete sentences is to be regularly exposed to them from day one! This certainly isn’t universal – especially if you have evidence your child has receptive language problems, it might indeed make sense to go with the usual recommendation of modeling just one or two more words per utterance than they already use. And it’s not like it’s unheard of for parents to say things like “you want cookie?” to children learning how to speak. But I truly think that recommending that practice across the board for AAC modeling severely misses the point of “presume competence”. Not being able to form speech sounds reliably doesn’t inherently mean children are incapable of understanding and learning to use full language with the vocabulary and grammar other people are used to, and for many of them there’s no reason to introduce it only gradually. Yes, look for evidence that suggests they might need you to simplify your grammar when you model, but don’t start from that assumption. If you speak things to them like “Do you want a cookie, Johnny?” and expect them to understand that, you may as well be modeling the full sentence on their AAC instead/too. Not nonsense sentences like “cookie want?” or “more yes” that you would never say to a speaking person.

5. You’re still intimidated by the AAC system, or don’t feel comfortable using it to express yourself. If you, an adult with decades of experience using language (and tech) in general, are feeling too intimidated to use your child’s talker, how do you think they feel? I really think you need to know their device (or letterboard, etc.) inside and out if you want them to ever be able to use it confidently. Yes, this can take a huge amount of time and practice. But what wouldn’t you do for your kid? Think of it like learning a foreign language – you need to set aside time every day to practice, not just with them but on your own. Once they’ve gone to bed, read books “aloud” to yourself using their device. Repeat dialogue from the TV show you have on in the background. Use the search function liberally if you’re not sure where to find a button, and then practice that motor pathway two or three times after locating it to help internalize it for the next time you need to find it. Be patient with yourself; this isn’t necessarily going to come quickly. The more time and effort you put into learning how to use your child’s AAC system, the more effective you will be at modeling – not to mention, you’ll have a greater appreciation of just how much exposure to it they really need before you can start expecting them to use it themselves.

Okay, like I said at the outset, there’s probably no one perfect way to model AAC because every nonspeaking person is so different, but I hope this post has challenged you to rethink some of the typical suggestions you might have heard for how modeling Should Always Be Done. (Psst. I might follow up this post with a bonus +5 more signs in a few days, keep your eye out.)

Did you grow up with people modeling your AAC system? What did you like/not like about their approach? Please comment below!

Using visual supports as an autistic adult: a review

[Content warning: dental]

Recently I created a lot more visual supports for myself, including some picture schedules. Although I often favor text, I’ve realized from my experience with AAC that symbols can actually be helpful for me, and now that I’ve seen the benefit of using detailed picture instructions with my support staff in order to follow recipes, I thought it was worth trying to apply them to other aspects of life too.

Background: I’ve tried and abandoned various planners and to-do list systems over the years. Google Calendar is probably the thing I’ve stuck with longest, and if they had a way to integrate pictures I might have stayed high tech. But I love not just visual tools but tangible tools for coping with executive dysfunction, so I wanted to make hard copy supports. Plus, I’m less likely to remember to go into a certain folder on my device to open up a picture schedule compared to having it physically sitting next to me on the couch where I’ll see it no matter what. I was hoping visual supports could help with various aspects of my executive dysfunction: regularly skipping repetitive self care tasks (looking at you, morning meds), stalling every night when it was time to begin my bedtime routine (inevitably tweeting “GO TO BED E*” in inertia-mired desperation), trouble initiating certain irregular tasks (why is it that plugging in and turning on the printer takes so much effort?), and endless other examples. And like many autistics, I just thrive on structure and knowing what to expect. Mental health professionals often think this means I should go back to work or school or get out of the house more to provide external structure, but they miss the point – I can create a structured life for myself without attempting high-spoons activities that I know I can’t sustain without ending up in autistic burnout and/or a bipolar episode. Just having a plan for my quiet life and knowing what to expect around the house on any given day is much more useful than trying to accomplish a heavy load of hard things that wear me down beyond repair.

But visual supports are just for kids, right? a voice nagged at the back of my head. No, no: they’re not. Disabled people grow up! Our needs may change over time but many of us still appreciate visual text and/or pictures to support our learning, focus, and communication. Some of us prefer photographic imagery over symbols, or we may want to use words only – and the content of our schedules and routines may be very different than a child’s – but that doesn’t mean we don’t need or want visuals. Executive dysfunction doesn’t magically go away when we leave school or move into our own place – in fact, many of us might need this kind of support more as adults due to new work/living environments, increased demands on our cognitive load, decreased interpersonal supports, and/or the built-up effects of autistic burnout. The idea that picture schedules and other visual supports are only meant for children actively discourages disabled adults from accessing the tools they need. It’s the fact that I’ve been part of a positive autistic community for a while now, a community that fights the stigma around using any needed supports across the lifespan, that got me to the place where I could ditch the internalized ableism around this and go ahead and create these tools for myself.

Before I go further, I want to take a minute to point out a few situations in which I hope you won’t use visual supports like these. 1) Don’t use visual supports to convince or train a disabled person to do something they don’t want to do, even if it’s what you think is best for them. 2) Don’t use visual supports to convince or train yourself to do something that overall impacts you negatively. (A couple examples to make it clear what I mean: you might not want to brush your teeth but still find the net effect on your well-being positive, whereas you might want to keep the house spotless but find the net effect on your well-being negative – in that case, go for it with the toothbrushing but please don’t use these ideas to get yourself to keep the house spotless, it’s not worth it.)

My process: I’d looked at premade visual schedules and sets of picture communication cards online and considered purchasing, but decided to make my own instead. I was able to customize the available options for each schedule (including many more adult-type tasks than are easy to find in premade sets online), use symbols I’m already familiar with from my AAC app, use typefaces I can read more comfortably, and spend less money on supplies for more total supports. Before starting, I did a giant brainstorm of what kinds of supports would be helpful for me (for example, a “morning routine” checklist) and what items each one would need to contain (for example, “meds” and “wash face”). I let those lists marinate for a few days so I could gradually add items I’d forgotten. Then I screenshotted the relevant buttons for each item from my symbols-based AAC app (Proloquo2Go), in some cases temporarily editing that button’s label to more closely match my intention for the visual support usage. I used those symbols for all my supports except for my kitchen inventory – for that one I used pictures of the actual brands I tend to buy, screenshotted from my local grocery store’s website. I inserted all these images into the Google doc I’d brainstormed items into, and played around with sizing before printing. After cutting out each item I “laminated” them with packing tape, and did the same to the backing pieces of cardboard most of my supports were destined to lie on. I then attached adhesive velcro dots to the back of each item, and placed the opposite pieces of velcro on the various backing pieces for each support (or, in the case of my shopping list, directly onto my fridge, landlord be damned). I added envelopes to hold loose items not currently in use, and the morning and evening routine boards have loops of yarn at the top so they can hang around my neck until I’ve completed everything.
The leaving-the-house checklist didn’t require as much crafting; I just taped the printed-out list, on a single sheet, to the back of my front door.

Images, text descriptions, and notes on individual items:


Image description: a piece of cardboard hanging from yarn labeled “morning routine” has two columns marked “to do” and “done”. “To do” contains a set of empty velcro dots, while “done” contains velcro dots with symbols and words attached to each. Items included are: coffee, dress, wash face, deodorant, glasses, hearing aids, medicine, October [cat], breakfast, brush teeth, and mouthwash.

Notes on morning routine: I keep these in a rough suggested order from top to bottom starting on the left column and continuing on the right, but don’t necessarily complete them in the set order. It’s nice to be able to move them onto the “done” side individually so that I always know what’s left no matter what order I’ve proceeded in.


Image description: a piece of cardboard hanging from a loop of yarn labeled “bedtime routine” has two columns marked “to do” and “done”. “Done” contains a set of empty velcro dots, while “to do” contains velcro dots with symbols and words attached to each. Items included are: tomorrow’s schedule, plug devices in, hearing aids, pajamas, medicine, October [cat], brush teeth, mouthwash, and glasses.

Notes on bedtime routine: As with morning routine. Both are on loops of yarn so that I can wear them around my neck until everything’s complete. This prevents me from having to continually walk back to a section of wall or counter in a certain part of the house between each step, and makes it harder to get distracted and abandon the routine partway through.


Image description: a foldable piece of cardboard has sections labeled “today” and “maybe” containing velcro dots, and an envelope labeled “another day”. Currently visible under “today” are moveable velcro dots attached to pictures and words for walk, sign language, blog, video chat, Twitter chat, recipe, and Etsy. Currently visible under “maybe” are pictures and words for yoga, audio book, wizard rock, and modding.

Notes on day to day schedule: Every evening I pull up my Google calendar as a reference and remove all the possible items from the schedule and envelope, sorting them into piles for the following day. After returning any irrelevant items to “another day”, I place the “today” and “maybe” items at the bottom/right of each section so that I can move them to the top/left as they are completed. The gap in between tells me where I’ve left off, and makes the following evening’s sorting easier because I can see what’s been left undone. In total I made about 25 items that repeat frequently during my average week or month but don’t fit into an every-single-day routine like the morning and evening ones. Activities include various carer appointments, visits from my support staff, errands, hobbies, self care, and more.


Image description: a piece of cardboard labeled “PCA time” has a short list of velcro dots. Currently displayed items are: oatmeal, beans, counters/sink, and prep a recipe.

Notes on PCA agenda: Like the day to day schedule, I initially place the agenda items for me and my support staff at the bottom of the short list so we can move them up to the top as they are completed. A small envelope (not pictured) below the chart contains other tasks we do frequently but not that day. This visual support that’s tacked to the kitchen wall doubles as AAC, because I can point to it when needed rather than finding the word on my device or signing.


Image description: A white freezer has a line demarcating two sections labeled “have plenty” and “need more” with several dozen velcro dots under each. About forty food items and household items are attached across the two categories, displaying a photographic image and large text for each.

Notes on kitchen inventory: This list continues down the fridge, but the photograph above gives you the basic idea. Not currently pictured are additional non-food items I regularly need to restock such as soap and toilet paper. If I’m struggling to think of what to eat, a glance at the “have plenty” side tells me what I own without having to dig through fridge and cupboards, and as I run out of each ingredient I can move it to the “need more” side – the latter of which can then be photographed just before leaving for the store as an instant, bad-handwriting-free shopping list that incorporates pictures.


Image description: a single sheet of paper is attached to a wooden background with masking tape, labeled “Leaving the house? Bring these things!” Below the heading are two columns of symbols and words for the following items: mask, wallet, keys, phone, hat, weather gear, sunscreen, iPad, speaker, HA batteries, letterboards, chargers, caffeine, food, water, AAC bracelets, and stim toys.

Notes on leaving-the-house checklist: I didn’t bother attaching these items to moveable velcro dots because I don’t necessarily need every one of these items every single time I leave the house. Instead, I put them in rough order (top to bottom on the left and then continuing on the right) of how likely I am to need each thing any given time I exit the apartment – for example, I need mask, wallet, and keys just to take out the trash or walk to the corner store, but might not need to bring a backpack with a lot of the latter items unless I’m actually taking a longer adventure on transit that day.

Results so far:

  • I don’t always physically move every single item into the “done” category by the end of the routine lists, but the supports definitely help me not get lost in the middle, and having them hanging around my neck makes it even harder to abandon the routine partway through.
  • It does seem like I’m less likely to stall on bedtime now; at least the frequency of my “GO TO BED E*” tweets has decreased, which I imagine my followers appreciate. I have often been starting my routine ahead of my mental deadline rather than scrolling Twitter endlessly long past when I intended to move on.
  • That said, if I haven’t left the bedtime routine support actually within reaching distance of the couch (where I’m almost always sitting when it’s time to get ready for bed), it does nothing to help with inertia. Turns out standing up and crossing the room to pick up the schedule takes just as much cognitive effort as standing up and crossing the room to begin the actual routine – who would have guessed? – so the less-stalling effect only happens if I’ve left the schedule nearby.
  • The morning routine isn’t as smooth-going as bedtime, which seems to be because some of the things I only do every other day – that therefore aren’t worked into the morning visual support – have to happen in between morning routine items. (I.e. shower before dressing, yoga before putting on hearing aids, etc.) So I might start the morning routine when I first wake up but then set it down partway through to accomplish those irregular tasks, and it ends up being an hour or two before I actually complete everything. I’m also liable to get out Animal Crossing halfway through and get distracted by that for a while, oops. But it does seem like I’m more likely to eventually complete all the morning routine tasks than I used to be, so I do think the new support is still helping.
  • For the day to day schedule, I’m finding that if I change my mind and decide to skip something I meant to do under the “today” section, it can be hard to transition on to the next item. After some trial and error it seems like moving the skipped item down to the “maybe” section or even out of sight to the “another day” envelope is a suitable hack to get me over that AUGH-CHANGE-OF-PLANS cognitive barrier. But as always is true for me, it remains much easier for me to not do something I planned for a given day than to add something that I didn’t have in my brain as a possibility the night before. I think this is just a default quirk of my brain that the presence of picture schedules doesn’t seem to impact one way or the other.
  • There are only so many empty velcro dots on the day to day schedule, so it’s harder to overbook myself spoons-wise!
  • I’m already noticing some important items I’m wanting to add to the day to day schedule, the PCA agenda, and the kitchen inventory, so I’ll probably do another printing-laminating session eventually to fill in the gaps.
  • I hadn’t been sure how many adhesive velcro dots to order, but a set of 250 was enough for this set of supports. I think the “laminating” process took less than one big roll of packing tape, which is much cheaper than actually laminating this much paper at the FedEx store.
  • Cooking isn’t easy for me (although having a support staff and detailed picture instructions have happily moved it up from “impossible” to “not easy”), so I’ve sometimes been forgetting to pay attention to whether I’m getting low on an ingredient as we’re partway through a recipe. But having a support person here means there’s someone to remind me to move the item to the “need more” section of the fridge, so I don’t think I’ve actually completely missed anything yet.

My recommendations:

  1. While prepurchased sets of picture schedules or other visual supports might be convenient to just click “buy” on, if you do have the time and supplies, individualized homemade supports might be much more useful.
  2. Do use symbols the user is already familiar with, or photographs of the actual items they’re used to.
  3. Consider what typeface you’re using if you’re including text – size, spacing, contrast, dyslexic friendly fonts, etc can all impact how usable a visual support is.
  4. Consider portability of each visual support, or if they’re not portable, exactly where they’re going to be placed in your home. This can significantly affect how easy it is to follow through on each task.
  5. Assume you’ll need to add more items to your supports after a few weeks or months of trialling your original plans. Keep a notepad nearby where you can jot down missing items as you think of them. Like choosing vocabulary for an AAC device, it’s just hard to predict all the details of what you’ll need without actually trying it out for a while.
  6. Most importantly: Involve the user in intent, design, and implementation as much as possible! Don’t reward or punish someone based on whether or not they use the visual support, and don’t trade rewards or stickers or whatever for completed tasks. Visual supports should be optional tools for people who are dissatisfied with the way executive dysfunction affects their own goals, not a method of training a disabled person to do what you want.

Thanks for reading such a detailed post! I hope it was helpful for you or someone you love. If you have something to add based on your own experience with visual supports, please post a comment below.

AAC is not just for requesting! Creative ways to use AAC

I think too often professionals and caregivers of AAC users only focus on using AAC – especially symbols-based systems – for communicative functions such as requesting. That really limits how much can actually be done with these powerful apps! I use my symbols-based AAC (Proloquo2Go) in a much wider variety of ways; here are some examples:

Picture schedule: Why buy a separate set of physical or electronic event/task images when you already have symbols based AAC? AAC users or our support people can create a page to edit each night for the upcoming day, or create folders for the sequences of steps in complicated activities like cooking. Many of us who struggle with transitions and executive dysfunction can benefit from visual supports like this.

Flashcards: By creating a folder of vocabulary we’re trying to learn in a second language, setting the buttons to show image only, and programming the speak field to feature the foreign word, AAC users can use their symbols-based AAC system as a flashcards studying app. This is especially useful for learners like me who do best in immersion settings – other flashcards and picture dictionaries are always mixing in English, which can make it harder to learn the new language. (Please note that this idea should be used only for self-directed learners, not for forcing emerging communicators to go through drills to prove their competence.)

Navigate meltdowns: Some AAC users who usually use text-to-speech on a QWERTY keyboard may find that they lose this ability during meltdowns and shutdowns – but that they may still be able to use a symbols-based system during these times. One major reason I like having Proloquo2Go as well as Proloquo4Text (which is QWERTY-based) is that I can switch to symbols when necessary. Last year I had to go to class in the midst of this situation, and it turned out my classmate was totally cool with me composing messages via images to say things like “have meltdown hard communicate”. Normally I’d be able to type more complex sentences on QWERTY, like “I just had a meltdown and am still having a hard time communicating”, but in that moment if it weren’t for a symbols-based program I wouldn’t have been able to interact at all.

Write poetry: AAC users can select images that evoke a scene and then use the words to write a poem describing the feeling it gives us. I especially enjoy poetry because poets are given more leeway to break the rules of “proper English”, something that AAC users are often discouraged from doing. But communication is about conveying an idea, not about grammar and syntax and spelling and pronunciation! If we can get our message across using a haphazard series of nouns and verbs, that is still valid communication. Poetry is a venue where this kind of creative use of words can be valued.

Write prose: AAC users can write school assignments or even extracurricular fiction by composing their sentences in a symbols-based app and then copying and pasting into a word processor. I have written sections of my novels this way!

Post to social media/emails: Similarly, by composing a message in our symbol-based apps and then copying and pasting to social media or email, AAC users can participate in online communities using the kind of communication we prefer. I sometimes use my symbols-based app to livetweet my favorite TV shows; it’s a great way to share my special interests with others and get more familiar with the app.

Give presentations: AAC users can program our scripts into a series of buttons and practice by running a stopwatch to make sure our words play in the desired amount of time. This could be for a school project, an open mic night, advocating for ourselves at an IEP meeting, or meeting with our senators on disability rights issues. I regularly use my device to give presentations about autism, disability, accessibility, and AAC.

Special interest infodump: Autistic AAC users like me might enjoy utilizing the way symbols-based systems organize categories and folders to store information about our special interests. I have folders full of hundreds of Harry Potter characters, spells, et cetera, so that I can talk about the canon I love with other fans.

Vocal stimming and echolalia: Many autistic people like me use vocal stimming and echolalia to modulate our sensory environment and communicate. This shouldn’t be limited to speaking people; it’s a totally valid way to use AAC! Don’t discourage us from “playing” with our systems – having the freedom to press buttons over and over, to use buttons that repeat phrases from our favorite movies, or to play buttons at random as experimentation can encourage emerging communicators to feel comfortable using AAC.

Prompt speech: This isn’t commonly understood, but some of us can speak words aloud only when they are in front of us visually. So we can use an AAC system to compose what we want to say, and once we have selected the right buttons we may be able to read the screen aloud rather than using our device’s synthesized speech. Please don’t pressure us to do this, and don’t expect us to read a message you composed for us! This is just one more tool that may give us additional agency over our communication.

I hope you got some new ideas from this list that you can try out and share with other AAC users! If you have discovered more creative uses for your own AAC, please add your thoughts in a comment below.

Why do some autistics like watching the same media over and over?

I can’t speak for all autistics, but there are a lot of reasons I watch the same media over and over! I have about 10 long series that I watch on endless loop – I restart one, watch every episode in order, and then restart the next one (I keep a spreadsheet), ad nauseam. Except for me it’s not ad nauseam – it’s the main, maybe only, way I can enjoy media! In this post I’ll go over a few reasons why, in case it gives anyone insight as to why you or your autistic loved one might be doing the same thing I do.

Routine: This might be the obvious answer, but it’s not unimportant. Many of us just thrive on routine. Even if all other elements were neutral, it is inexplicably reassuring and comforting to watch the same shows over and over. In a chaotic world where we may not always know what to expect, coming home to a familiar show can feel like a weighted blanket or a soft stuffed animal.

Prosopagnosia (faceblindness): I am not completely faceblind, but it does take me a huge amount of repeated exposure to any given face before I begin to recognize it reliably. For this reason, (re)watching TV shows that have several seasons with the same main cast of characters keeps me oriented to which character is which. In contrast, watching a two-hour-long movie would just be confusing: it’s very difficult to understand what’s going on when for the first two-thirds (at least) of the plot I can’t even tell if I’ve met any given main character yet, let alone what they said or did in previous scenes. Sometimes I recognize an actor by their voice, but unless I’ve seen multiple seasons’ worth of their appearances – ideally over and over – their face is likely to be a mystery to me. Occasionally even actors I am very familiar with are unrecognizable out of context – once, in the middle of a DM conversation about Gillian Anderson, a friend sent me a picture of Anderson. I’ve seen X-Files at least five times through, but this was an out-of-context photo where her hair and outfit were different than I’m used to. My response to my friend: “who’s that person?” I was baffled as to why she had sent me a random photo of what to me registered as a stranger.

Auditory processing: Captions can help a lot with auditory processing, but so can rewatching media. Captions don’t usually account for background music or sound effects, and even with captions it might take me a few times through any given scene before I’m integrating all that correctly. Crucially, jump scares and other startling sounds/lights/movements can somewhat be cognitively prepared for if you know what’s coming when. Watching a series from beginning to end on Netflix means I don’t have to turn down the volume for every artificially loudened commercial break like I would on a standard television, and I can skip the theme songs if they’re also too loud (or if they’ve recently changed – that bugs the heck out of me).

Understanding the plot: I guess this makes me feel a little silly, but I genuinely don’t understand the plot of many shows the first time through. Every time a new season of Stranger Things comes out it takes me at least three times through before I start to understand why things happened the way they did. It seems like I just don’t always clue in to the elements the creators expect neurotypicals to automatically notice. I didn’t fully realize how true this was until I watched a couple of shows with audio descriptions. While I wish the audio descriptions were also captioned, what I could catch of them was amazing. They pointed out crucial elements of each scene I was supposed to be attending to but often wasn’t – facial expressions, body language, visual elements that set the backdrop with clues and ingredients of later subplots. A bonus is that audio descriptions often name the character seen emoting on screen, helping with prosopagnosia. But they’re available for so few shows that, in most cases, it’s only rewatching multiple times that can help me meet these access needs. Repetition helps me grasp each step of the plot and how it’s all connected. I start to figure out characters’ motivations and understand the worldbuilding rules that shape the story.

There are probably many more reasons other autistic people might prefer to rewatch media; these are just the biggest contributing factors for me. What are yours? Comment below!


Nerding out on AAC: what is it like to switch grid sizes/layouts?

A frequent conversation in AAC communities for people who use symbols-based apps and their caregivers is: what grid size should I use? Is it okay to change grid size/layout later? Well, in the last week or two I changed my symbols-based app around significantly, and want to talk about why I did so and what it’s been like to transition.

First, some basics: I’ve been using Proloquo2Go for a couple of years now. This highly customizable app (for iOS only, unfortunately) features buttons with picture symbols for full words as well as a QWERTY typing view that can be switched to as needed. Due to my sensory profile (strong fine motor skills, somewhat easily overwhelmed visual processing, and difficulty with hand-eye coordination) I’d come to use Proloquo2Go’s symbols view largely based on motor plan rather than by visually scanning for the words I wanted – that is, I had built a muscle memory of where many words were located on each screen so that my hand would automatically move to the right area, similar to how I type on QWERTY. I had this decently down for common words, albeit with plenty of near-misses on the buttons I was aiming for due to the hand-eye coordination problem, and of course was still relying on visually scanning (which takes me longer) for less frequently used words (“fringe vocabulary”).

So if having built up this muscle memory was how I navigated the app as well as I did, why would I want to switch grid size/layout?

I had toyed with the idea of changing the layout of many of my fringe pages for months, because I’d long since noticed that I almost always habitually went back to the “home” screen for core vocab rather than checking whether any given fringe folder’s template included the core word I was looking for. This meant that the templates used in most fringe folders were just taking up space, pushing lots of fringe words to the second layer of that folder – meaning it took an extra button press to reach those words. If I was going to automatically go back to the home screen to use core vocab anyway, why keep the same words in the fringe folders? But on the other hand, was it worth re-learning where my favorite fringe words sat on the screen once core words were removed?

I’d also noticed that the default templates (and honestly I’m not sure why this seems like a good idea to anybody) both vary which words are available on different kinds of pages and occasionally alter the location of core words compared to the home screen. This means that I couldn’t rely on motor planning to inform me where to find core words when on one of the fringe pages, which is probably what built my habit of returning to home each time for core words in the first place. While using fringe page templates like this might speed up communication for many people because their ability to visually scan for core words on the page they’re already on prevents them from needing to tap back to home, it was just slowing me down by pushing fringe words “further away” (more button presses) from the home screen. But learning how to edit templates somehow seemed like a daunting task, and I wasn’t sure if it was worth it.

The event that prompted me to go ahead and change all this stuff was that I’d offered to build a core board for a Facebook acquaintance to print out and introduce to their little one while they were waiting for the usual April sale on the app. I encouraged them, as is the general recommendation, to request whatever grid size they thought was the most buttons their kiddo was going to be able to handle given the prospective screen size and any visual impairments or other constraints. They chose 8×14, one of the standard options. I remembered considering this grid size when I was initially setting up my app, but at the time I felt visually overwhelmed trying to contemplate navigating anything bigger than 7×11 on my iPad Mini, so I had set up my own user profile as 7×11 and used it ever since. I created a quick new user profile for 8×14 in order to create this other family’s core board, and after editing for a few minutes I realized… this is not overwhelming to me.

I think something about having a couple years to get used to symbols-based/grid-based AAC – and this app specifically – really made a difference in how visually overwhelming a bigger grid size felt. So I quickly did the calculations: an 8×14 grid has 112 buttons versus 77 at 7×11, so 35 more buttons per page? Even 35 more buttons just on the home screen would probably speed up communication. But I just wasn’t sure if it was a good idea to try to get used to a new layout.

And that’s when I remembered the other changes I’d been thinking of making – removing the templates from many fringe pages; editing the standard templates. If I was going to make changes, I really ought to make changes, right? I knew a major reason for the common recommendation of starting with as big a grid size as possible is that changing the layout of what-word-is-where later can be extremely difficult for an AAC user to adjust to, especially those of us that do rely on a motor plan more than visual scanning. Would making these changes be like learning AAC from scratch?

I took a few days’ worth of deep breaths and dived in, figuring, 1) I enjoy fiddling with communication boards no matter what, so even hours of rearranging and editing would probably just register as “ooh fun!”, and 2) if I put in a lot of effort but it ended up impossible to get used to, it’s not like my old user profile would have disappeared. (Kudos to Assistiveware for letting me design multiple user profiles on one app/device!) I followed through on all the changes I was thinking might speed up my communication – bigger grid size, removing templates from many fringe pages, and editing the standard templates to better match the home screen and each other. I also did a lot more color coding and subtle customization like varied outline widths to make certain buttons stand out to me more.

The results? Almost none of my fringe folders necessitate second layers – the words (or subfolders) I need are all on the first screen of each. For my People, Places, and Verbs folders, I pulled my most commonly used words from each of their subfolders out onto the main page. (For example, now I can find the words “friend” and “doctor” under just Home>People rather than Home>People>Friends and Home>People>Healthcare.) My Home screen fits not only more words on it now but more folders, so I don’t have to navigate to the second layer of Home as frequently either. Let’s look at how this affects sentence construction: on my new layout, the 23-word sentence “yesterday I had coffee with my friends [O] and [Z], we practiced more sign language and talked about our plans for next week” requires 49 button presses (an average of 2.1 per word), but on my old layout it requires 58 (2.5 per word). If that sample is representative, it adds up! Writing 2000 words of my novel (I like composing on my symbols app and then copying and pasting into Google Docs) will take 4200 button presses on my new layout instead of 5000. Additionally, four of the necessary buttons for this example sentence have added color coding in the new layout due to being words I use frequently, which I expect will help me navigate to them (and buttons relative to them) more quickly. After editing templates, none of the core words from the home screen are located somewhere different (if present) in other folders, so I can more consistently rely on the new motor plan I’m building.
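For the fellow nerds: that extrapolation can be sketched in a few lines of code. This is purely a hypothetical illustration of the estimate (the function and its names are mine, not part of any AAC app); the press counts of 49 and 58, the sample sentence (23 words by my count), and the 2000-word target come from the paragraph above.

```python
# Hypothetical sketch: extrapolating total button presses for a long
# document from one sample sentence. Sample counts (49 presses on the
# new layout vs. 58 on the old, for a 23-word sentence) are from the
# comparison above.

def estimated_presses(total_words, sample_words, sample_presses):
    """Estimate total presses using the rounded per-word average."""
    per_word = round(sample_presses / sample_words, 1)  # 2.1 or 2.5
    return int(total_words * per_word)

new_layout = estimated_presses(2000, 23, 49)  # 4200 presses
old_layout = estimated_presses(2000, 23, 58)  # 5000 presses
print(old_layout - new_layout)                # 800 presses saved
```

Real counts will of course vary with which particular words a passage uses, so this is only a ballpark – but a ballpark was enough to make the trade-off clear.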

Switching to a new layout is not without difficulty, and if someone has even more trouble with cognitive transitions than I do, it’s possible no reduction in button presses would be worth the learning curve. But for me, this feels like a really positive change. I’ve been making a point to practice the new layout just like I made a point to practice the app when I very first got it. Reading aloud to myself, answering practice prompts in Facebook groups for AAC users, using Proloquo2Go to compose texts and Tweets, and writing other documents using symbols are all helping orient me to the new layout. I kept as many things in the same relative place as possible given the task at hand – like, the home screen doesn’t look like it went through a catastrophic reorganization compared to what it was before; it just has more than it had before. More personalized color coding than the app’s default seems to be slightly improving my ability to visually scan for fringe words’ new locations. Wonderfully, this grid size even leaves me room to grow – many fringe folders now have one or two dozen blank spaces I can add vocab to as I find myself without any given word mid-thought, rather than knowing any additions would just be buried on the second layer.

Proloquo2Go’s default setup does work very efficiently for some people, but its customizability is its true strength – and the fact that the app allows for me to easily make all these changes is a major reason I am loyal to it. What I’d say to anyone considering changing their own grid size or app layout is: talk to other people who have done it (this Facebook group is a good resource), and make lists of what you expect will be the pros and cons… but, when in doubt? If inefficiencies or possible changes have been nagging you for a while, then I’d say just go for it. Make sure you have a backup of your old layout safely tucked away, set aside plenty of time and energy for editing/customization and practice, and see what happens!

If you’re a professional or caregiver considering changing another person’s app – ask them directly what they think. Explain what you understand the pros and cons to be, in language they understand, and phrase follow-up questions in a way they are able to answer (for example multiple choice or yes/no if open-ended is more difficult for them). If they are excited about a reorganization, wait until you’ve basically finished tinkering with the new layout before introducing it to them, and ask them what they think/if they want any changes. After that, do your modeling on the new layout but make sure they have access to the old layout anytime they want to switch back to it – their ability to communicate in the moment should never be frustrated by trying to learn a new layout for the long-term.

Have you switched your AAC (symbols app, letterboard, whatever) to a new grid size or layout before? What was it like? Comment below!

Well, another poem I guess

I know I said I don’t really plan on posting poetry to this blog, but a dear friend asked me to write some, and the content is relevant – so, here you go!

people think i spoke for all those years
even correctly, for some of them.
so how can i explain
that it is only now i have a voice?
true, i eked out words here and there
that were right and true
paper, journals, signs, word processor, instant messenger, emails, texts…
i even spent days in silence –
perhaps a whiteboard note or two,
scratch paper for the barista (soy latte, 16 ounces) –
but i didn’t understand.
how can i tell you
whose brain is connected to your mouth
that my fingers are more sure than my lips?
and that now that i know myself
my words are strengthened, more confident, more proud?
to me, quieter than i wanted for all those years,
being able to press play feels like a privilege.
but what i wish for everyone
growing up unusual
is a human right:
words any way you want them
all the ways you want them
no explanation necessary
at all.
my words my ways
is a promise to myself
and a hope for everyone else atypical
in this world that doesn’t listen.
may we all have loud hands.

[The final line is a deliberate reference to ASAN’s “Loud Hands” anthology, please check it out.]

AAC for autistics 101: part two!

You can read Part 1 of this post here, focusing on assessing autistic people for AAC supports. This second part gives tips on interacting with those of us who already use AAC and helping advocate with us for communication rights.

Whatever AAC supports you and the autistic person have selected, it’s your job to model as constantly and thoroughly as you can! If the only input nonspeaking autistics are getting from people around us is speech, at best we are going to develop self-esteem issues from the implicit message that our communication isn’t normal, and at worst we will never learn to use our AAC supports at all. Please teach our caregivers how to use our AAC supports so we are getting consistent input in a useful modality throughout the day and across every setting.

Along with modeling, please provide us with as much vocabulary as possible as early as possible! The AAC field has historically often fixated on teaching requesting, but every autistic person has the right to all communicative functions. This includes refusal! That is actually a really important thing to introduce along with other vocabulary for self-advocacy; being able to say no is important for our safety and self-determination. As another example, if you think we are not capable of commenting on preferences or sharing opinions, and thus don’t give us the vocabulary, you have made us automatically and ongoingly not capable of commenting on preferences or sharing opinions! And please make sure from the outset that autistic communicators have vocabulary for talking about our special interests and favorite objects; we are more likely to want to learn our AAC method if we can use it for conversations that excite us.

As early and as often as possible, try to teach us how to customize our devices and add vocabulary for ourselves. Don’t see yourselves or our caregivers as the sole moderators of our access to communication. And don’t overstep your bounds in assessing how we use our devices. Many programs have a history or tracking feature that can help you analyze what words we’re using, but looking at this without our consent is a privacy violation.

Some other thoughts on working with AAC users… Never take away our device (or light tech supports)! They should always be within reach. I have had my device taken away and it is such a helpless feeling. Even if we are using our device for what seems to you to be non-communicative, for example echolalia or vocal stimming, and even if it’s getting really really annoying, we still deserve access. You probably sometimes sing along to the radio, which is essentially socially acceptable echolalia/vocal stimming, but no one tapes your mouth shut. Give us the same autonomy. The one exception to this is if the needed objects might get damaged, for example in a swimming pool. If this is the case try to get our consent before just taking it away, and have a backup like a small laminated communication board or at least a system for answering yes and no questions while in that environment.

Ask each AAC user directly how we prefer to interact. Some of us want communication partners to be silent and patient while we type, other people would appreciate you trying to guess the ends of our sentences so that it is less work for us than having to spell everything out. We might like a device set to speak each letter or word as we type it or we might want to compose messages silently and play the whole thought at the end. Some of us want to keep our screens private and express ourselves with speech generation, other people would rather you read along as we type. No matter what an AAC user prefers along these lines, use your relative position of power as a professional to teach peers and caregivers how to interact with us respectfully. Make sure we are invited to our own IEP meetings and other service planning meetings so we can express our own goals for our communication, and make sure people talk directly to us rather than to our caregivers and assistants. I have had doctors ask my support person questions about me while I was sitting right there – it’s really patronizing.

Another thing to consider is helping us use our AAC system to connect to other people and places, not just face to face. We may want to make phone calls – can we use text relay for that, or put our devices on speakerphone? Can our devices access emails and social media so that we can copy and paste the sentences we write using symbols directly into other apps? This is important not just for our social lives but also for self advocacy. For example, we might want to call a hotline to discuss our rights in benefits programs. We deserve to contact our senators and representatives about policies that affect us, and most autistic adults have the right to vote even if we are under guardianship. Think about whether our AAC supports will give us access to these activities that are part of living a full life.

Okay, that’s the text of the presentation I delivered to future SLPs! As I give more presentations like this I will probably refine and add and cut and edit, so maybe someday I’ll update this post, but for now this is a good summary of what I’d say to professionals fairly new to AAC for autistic people. Please feel free to pass it along to anyone you think might be interested!

Are you an autistic who uses AAC? Please let me know what else you would want to tell people at an intro level by commenting below.

AAC for autistics 101

Part One: Assessment

I gave a presentation about AAC in a Speech/Hearing Sciences class on autism last week, and I want to share the text with you as a sort of “Intro to AAC” geared mostly towards students and professionals relatively new to the subject. That said, there’s probably going to be good info in here for families as well as some core advocacy concepts AAC users might be interested in. (Warning, if you’ve read my blog before you may see some duplication of ideas; the goal with this two-part post is just to get all the sort of 101 info from my presentation into a consolidated online resource.) This week I’ll post the portion about assessing autistic people for AAC supports, and then next week will be tips on working with AAC users and helping advocate with us for communication rights.

As for assessment, hopefully you have already heard the phrase “presume competence”. This is a key principle in the AAC field. Another phrase I like that gets at the same thing is, “an absence of evidence is not evidence of absence”. So just because you have a child or adult in front of you who hasn’t yet demonstrated an ability to communicate, that doesn’t mean the ability is not there – it might just mean that they haven’t been offered the right supports yet. Think back to the social model of disability, and until proven otherwise assume that an autistic person’s lack of communication is a fault in the environment rather than a fault in their brain. There is nothing inherently superior about speech compared to other communication methods, so there’s no reason you should hold out for someone to develop speech – just provide AAC modeling early and constantly and see what happens. (Modeling means using the chosen AAC method for your part of the conversation so that we have an example of how to use it, the way speaking children hear speech around them all day to learn from.) The autistic person you’re working with may eventually develop speech or they may not, but with AAC at least they have a chance of communicating in the meantime. That’s a human right for people of every age and every neurotype.

There is a flip side to this, and it’s when you’re assessing an autistic person who does use speech, but perhaps has articulation or pragmatic impairments. Or maybe their speech even seems completely adequate to you – but please don’t assume that it is adequate to them internally! Verbal and semiverbal autistic people might very much benefit from AAC. An autistic friend who studies neuroscience, Alyssa Hillary Zisk, recently published an article on this. For many of us AAC is easier on our overall motor planning demands, or we have better fine motor control than oral motor control, or it is easier to work with visuals than auditory words, or it just saves us energy that would better be directed towards other areas like executive function skills or sensory modulation. We may not be expressing quite what we’re really trying to say when we have to use speech, or we may be unnecessarily exhausted by it. So I would recommend that any autistic person you encounter in your work be considered for AAC supports. And actually, I’ll extend that to anyone presenting to a speech therapist in general, because someone who struggles with communication might not have an accurate diagnosis yet.

How do you decide on an AAC method? Well, do all the formal assessments and checklists you want, but if the person has already demonstrated any ability to understand choices and make decisions, I’d suggest also just showing them the various options and directly asking what they want to try! For people who haven’t yet had the opportunity to demonstrate an ability to understand choices and make decisions, start modeling at least one AAC method, while pairing with speech, and see what the person shows an interest in. If people around them don’t display a bias towards a specific communication method, then the autistic person will show you their preference by trying out the one that works best for them.

In choosing an AAC method to trial, it’s important to consider communicators’ strengths versus impairments in motor skills as well as sensory modalities – alongside what is going to be most effective for their environment. For example, if someone’s family refuses to learn sign language, it might not be a good option. Of course, if it’s what the communicator prefers, you should be pressing their caregivers to support that. Considering cultural competence is also important – for example, a child from an immigrant family might prefer a text-to-speech device that offers an accent that fits their heritage rather than a device that only has white American voices. And gender variance is more common in autistic people than in the general public, so never make an assumption about which gender-coded voice a communicator will prefer.

Okay, there’s my thoughts on assessment, stay tuned for next week’s post on working with AAC users!

Timesuck: disability edition

Content warnings: skinpicking, mention of eating disorder


This might not be interesting to other people, I’m not sure, but I wanted to track and annotate how much time I spend on being disabled each week. See, I’m not sure abled people understand the timesuck that can happen with disabling conditions (and with navigating ableism). For this experiment, I didn’t include optional activities such as disability advocacy – just things that I either can’t avoid at all or that would put my health at risk to skip.

The total week came to 32 hours and 25 minutes, although most of that was the extra sleep I require (this is a combination of bipolar management and meds side effects) and the extra time transit takes compared to if I were able to drive. Here are the week’s totals, broken down:

Extra sleep: 18 hours. I compared to a healthy person’s potential eight hours, although adult average is apparently more like seven. I didn’t track this down to five minute increments since it’s hard to tell exactly when I fall asleep, but instead approximated, often rounding down. Certainly there are tons of abled people who would love to sleep as much as I do, but for me it’s necessary, and not necessarily enjoyable.

Extra transit: 7 hours and 25 minutes. I compared my actual walk/bus/train trips with Google’s estimates of driving times. This is an example of another thing I do that some abled people do as well, it’s just that I literally can’t avoid it due to disability. I am so not capable of driving, even if I had the money for it. It’s part of my experience of being autistic.

Treatment: 5 hours and 30 minutes. This includes appointments, pharmacy, transit for those, taking meds, and symptom tracking. I’m also technically supposed to go to group therapy two hours a week but haven’t been doing that lately, which is probably unwise.

Communication supports: 1 hour. This was time spent preprogramming my device for upcoming activities (see my posts about AAC here). I didn’t count the extra amount of time any given conversation takes when I’m typing, as I had no real way of estimating the difference from conversations where I speak or sign the whole time.

Actively engaging in behavioral symptoms: only 35 minutes, yay! This week it was all devoted to skinpicking. I used to spend 30+ minutes/day on that alone, but have been doing much better on all obsessive-compulsive spectrum symptoms lately. The total amount of time I spend on symptoms can be much higher during bipolar episodes, and almost automatically goes up to 4+ hours/day during acute eating disorder relapses. I decided not to count engaging in special interest stuff under this category even though that’s technically a “symptom” of autism, because for me it feels like mostly a positive autistic trait – it enriches my life rather than takes away from it.

Coordinating benefits/accommodations: 15 minutes. There are definitely weeks this takes an hour or much, much more. Nothing like being your own primary case manager.

Okay, granted, I don’t have the spoons (spoon theory) necessary to actually work for 32 hours/week, but in theory look at all the time I could direct towards productivity if I were able! I would be much less poor. Not to mention life would certainly be more fun if I could spend that time on crafts or special interests. I often resent the amount of sleep especially – can you imagine having 18 more hours a week than you have available right now to do whatever you wanted with? In truth, I can only barely imagine it. I have been living with my disabilities for so long that making this list was somewhat surprising. Advocates do frequently point out that being disabled (and poor, too) is a full-time job, but I wasn’t sure before this data collection whether my own experience added up to that. It certainly got close!

Well, I don’t know if you found this as interesting as I did, heh, but maybe it gave you new insight into one disabled life. If you’re disabled and have ever done a tracking project along these lines, I’d love to hear about it! Feel free to comment below.