Tech Literacy: Misunderstood From the Start

In the 90s and 00s, we didn’t know any better. Technology was moving so quickly that we were teaching our teachers modern innovations that solved their problems – problems they were trying to teach us workarounds for. Manually pressing Enter at the end of every line in Word so it would start a new one? Getting Internet Explorer to work, at all? Little things. Seemingly innocuous things at the time. But those little things added up. There’s an entire generation out there – at least those who participated and took an interest – that had tech literacy and critical thinking instilled in them. Skills that would not only serve them well for the next 20+ years, but would also make them seem like geniuses compared to the generations before and after. People give a few different reasons for why later generations haven’t been quite as tech literate: everyone assumed they were somehow born with the knowledge, they were given phones too early and not monitored by their parents, it’s not taught anymore. The problem is that it was never taught in the first place, and no one ever bothered to realize that. In truth, all of those answers are correct, yet completely incorrect.

What even is “Tech Literacy” anyway?

Tech literacy is not:
– Knowing how to save and find files in a specific operating system’s file system (usually Windows)
– Knowing how to use specifically Microsoft Office
– Adeptly writing a professional email
– Being able to build your own website
– Easily being able to pick a server upon signing up for Mastodon
– “Getting” Linux

Tech literacy is:
– Understanding what files are, how file and folder hierarchies are structured, and having a vague idea of what file extensions/containers are for
– Understanding the basics of writing in a text editor/word processor, formatting text, and why/when to use specific formatting
– Being able to appropriately tailor your tone and presentation online based on time and place
– Understanding why managing your online persona/presence/image is important, and the implications and consequences of “acting out” online for your personal life
– Being able to troubleshoot technical issues, problem-solve when things aren’t immediately intuitive, and not freak out when there’s a step or two you didn’t expect or haven’t done before
– Thinking critically about the tools and services you use, the relationship you have with the creators/managers of those tools/services, and ownership over your data and experiences
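To make that first point concrete – files, folder hierarchies, and what extensions hint at – here’s a tiny Python sketch. All the folder and file names here are made up for illustration:

```python
# Toy illustration of file/folder hierarchy concepts (all names hypothetical).
import tempfile
from pathlib import Path

root = Path(tempfile.mkdtemp())  # pretend this is your "Documents" folder

# Folders nest inside folders; files live inside folders.
(root / "school" / "essays").mkdir(parents=True)
essay = root / "school" / "essays" / "history_final.docx"
essay.write_text("draft")

# The extension (.docx) is a label hinting at the container format --
# it doesn't change what's actually inside the file.
print(essay.suffix)             # ".docx"
print(essay.relative_to(root))  # school/essays/history_final.docx (on POSIX)
```

The point isn’t memorizing `pathlib` – it’s the mental model that paths describe a position in a tree, and that an extension is just a naming convention, which is exactly the kind of concept-over-task understanding this list is about.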

Tech literacy is all of these things and more, and yet none of it was ever overtly taught to my generation any more than it was taught to anyone else. Instead, tech education in academia (and beyond) has always focused exclusively on skills. “How to use a computer” or “How to use Microsoft Office” for the purposes of continuing to complete your schoolwork is about as far as that “education” ever went. As Theodore Roszak discusses in “The Cult of Information: A Neo-Luddite Treatise on High-Tech, Artificial Intelligence, and the True Art of Thinking” – even in the 80s, as computers were rushed into every institution far before they were ready, the entire “culture” around technology first had to pass through a bubble of everyone learning the technology just to be able to use it. Manufacturers like IBM and Apple could sell everyone on the idea that computers would make school and work easier or better, but there was still a bottleneck of learning how to use them. Academia never got past this – because, realistically, how could it? If schools don’t ensure they’ve taught children to use computers well enough to complete assignments on them, they can’t really expect assignments to get completed. No one stops to think about the “why” when innovation and big dollar signs are thrown around – simply that “this is the new thing and we must all use it.”

Our schools (and workplaces) teach “Digital Skills,” not digital or tech literacy. And if you’re not taught, during the formative years of developing a relationship with that technology, to think critically about the role it serves in your life – especially with modern tech like smartphones and smartwatches making interconnectivity, with all of its alerts and harassers and risks, inescapable – then it becomes very difficult to ever learn it.

Maha Bali (2016) explained it this way: “Digital skills focus on what and how. Digital literacy focuses on why, when, who, and for whom.”

The Troubleshooting Cycle

So why, then, did a generation of millennials grow up to have more tech literacy than younger generations? Put simply: because we were on the forefront of net culture in the first place – we were the ones blazing the trails, establishing norms, and figuring it all out. But also because everything kept breaking along the way, and we really had no choice but to figure it out on a deeper level if we expected to keep using our computers. Much like how the generations before us became the Bill Gates, John Carmack, CliffyB, etc. of their era simply because the only way to do anything fun on a computer was to write your own programs and figure out programming on some level – we became master troubleshooters just by trying to do basic things. Computers weren’t great. I have a massive soft spot for DOS, Windows 98, Windows XP, and Red Hat 5 – but things were jank. Shit just didn’t work a lot of the time. And in order to make it work, we had to figure out how it worked.

As far as I’ve been able to glean, computer and tech classes today don’t really differ that much from when I was a kid. The difference is that schools are all running on Chromebooks and iPads now – devices and operating systems that are simple, locked down, intuitive, and remove most of the control a user would even have on a more “actual computer” system. Inherently, systems designed to “just work” and be as intuitive as possible require less teaching and understanding effort in the first place. The smarter the technology, the dumber the user gets to be, so the value of just “showing them how to use it” decreases.

I don’t think older tech education in grade school ever really focused on critical thinking about the bigger-picture questions in big tech – but “big tech” was barely a thing yet. Sure, there were your Apples, Microsofts, and IBMs of the world, but they didn’t dominate our everyday lives as they do now, and so much of our online experience wasn’t yet dictated by major corporations. Older tech education did carry some of the ’90s fear-mongering that had everyone’s parents worried about their kids getting abducted or molested – we were taught not to share our real information, age, where we lived, etc. (things that are all standard for representing yourself online today). But the big questions came up more through the necessity of finding tools that worked for you, figuring out how to fix them or how to ask the right questions to get help fixing them, and avoiding adware and viruses. We learned by example only because tools to lock us in and make sure every step was perfectly matched (like today’s) didn’t exist yet. This also meant that finding creative alternative ways of accomplishing goals was celebrated – meanwhile, modern teaching tools for technology don’t allow for such things in the first place.
That’s a bigger problem I have with so many modern technology implementations in general: lack of user control and troubleshooting opportunities. I tried signing up for the new popular social media app BeReal and... ooh boy. Being app-only with no website is an immediate red flag for me – inexcusable, in my opinion – but it got worse from there. I originally signed up on my iPad so I could screen capture it, and I appear to have given it my work phone number (because I don’t want my real number on public-facing social media sites – that’s stupid and should never be required). Then I tried logging in on my phone with my real number, assuming that’s what I had used. The problem? There is no sign-in option in the first place! You’re always prompted to make a new account upon opening the app for the first time – much like a mobile game that forces you through an hour of soul-draining tutorials despite you having 100+ hours already logged – and it just assumes it’ll detect your info before asking for your username and prompt you to log in. Well, it didn’t do that. And even after multiple uninstall and reinstall attempts, it won’t let me change the phone number assigned to the app after the first time it asked for it. So now I have, quite literally, no recourse for logging into my account on my mobile device to actually use the damn app on my phone. Sure, I could contact email support – but that’s never displayed in the app either. Why can I not try to log in? Why do you hate your fucking users so god damn much that you think it’s acceptable to remove every possible option from them? This kind of “it just works” (until it doesn’t) bullshit philosophy is what actively contributes to making people less tech literate and less capable of troubleshooting their systems. We should not tolerate it.

What’s the fix?

Ultimately, school curriculums are a massive, slow-moving beast, and one person fighting for change isn’t going to do anything. But I have ideas on how the classroom environment for technology needs to change:

1. Stop relying purely on iPads and Chromebooks for in-school technology. They should be leveraged where appropriate, but should never be a child’s only exposure to technology in the school. Instead, provide an environment where children are regularly exposed to different types of computer interfaces (phones, tablets, laptops, desktop PCs, TVs vs. projector setups, etc.) and safely/comfortably encouraged to explore familiar actions and tasks on these new systems. By growing up during the “changing age” of technology, I was exposed to everything from DOS machines to Windows 95/98 to XP/ME/2000 to Mac OS 7 and early X, to even Ubuntu and Red Hat Linux within my schools. Much like how we constantly reframe basic math problems, history lessons, or language skills in new contexts and alternate scenarios, technology should absolutely be handled the same way.

2. Teach concepts rather than purely tasks. As someone who has created tutorials for over a decade now, I know it’s easy to just teach someone how to do X thing, but it’s far more difficult to get them to take the time to learn fundamentals. The longer you wait, though, the more impossible an obstacle that becomes. Ideas like “file management and safe data storage,” “protecting yourself online,” “how to keep in touch with friends and family,” and “how to avoid malware/scams while finding cool new games/apps/etc.” can all be related to things children as young as... 6 years old these days?... are doing every day, while still teaching them crucial technology skills. “File management/safe data storage” is easily related to keeping track of your photos, game saves, TikTok clips, or school assignments. Build scenarios where a student loses access to their homework, or the Blackboard-equivalent school software doesn’t accept the type of file they’ve made, and guide them to fixing it. 3-2-1 backup rules are relatable to children who have faced natural disasters damaging their homes, have had things stolen, lose things often, or even have divorced parents they travel between and don’t always remember to bring everything each time. “Protecting yourself online” is an important understanding for avoiding predators, scams, data leaks and harvesters, etc., and can immediately be related to traditional parental fears of grooming, abduction, and so on – but also to more kid-tangible ideas such as not having embarrassing pictures leak out, or keeping a secret journal without classmates finding it online. (These are, honestly, bad examples, but you get the idea, and I’d rather move on than continue brainstorming. Let’s save that for the book deal or something.) “How to keep in touch with friends/family” is something everyone wants to be able to do, is especially relevant toward the end of the school year and summer break, and can help teach the value of diversifying online presences, keeping your own form of a contact book or unique online identifier, how to manage and navigate an ever-changing social media/communication landscape, and so on. “How to avoid malware/scams” is obvious and still relevant on modern mobile devices, given how frequently my grandmother has managed to load up her phone with adware and other junk. This can be related back to obvious things like downloading Fortnite when it’s no longer on the normal App Store – while teaching crucial skills for evaluating where your downloads come from, the trustworthiness of the site making the recommendation, that kind of thing.

3. Instead of forcing kids to all have their own unique accounts – where every action they take is monitored and consequences directly applied to them regardless of experience level (I realize this is more relevant to the Windows domain-managed days of old than the modern Chromebook era) – have sandbox PCs in tech classrooms to give students room to experiment with different approaches without the consequences of unleashing malware or whatever else onto the system.

4. Build assignments that let students prove their knowledge while also giving them the freedom to explore their unique interests/rabbit holes rather than just completing steps. If they can accomplish something for themselves, they will be more invested, feel more rewarded, and thus be compelled to take that learning experience and continue applying it in their daily life.

5. Generally, just make it fun – and flexible enough to shift with the changing landscape of technology. Speaking from experience, trying to learn technology in what felt like an outdated context, focused on rigid objectives with no relevance to what I wanted to do with technology, was never fun or inspiring. You bore the people who are interested in tech and you fail to reach the people who aren’t.
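The 3-2-1 backup rule mentioned in point 2 – at least 3 copies of your data, on at least 2 different kinds of media, with at least 1 copy offsite – is simple enough to express as a checklist in code. A minimal Python sketch; the example copies and media labels are entirely made up:

```python
# Sketch of checking the 3-2-1 backup rule:
# >= 3 copies total, on >= 2 kinds of media, with >= 1 copy offsite.
# The copies list and its labels are hypothetical examples.

def satisfies_321(copies):
    """Each copy is a dict like {"media": "usb_drive", "offsite": False}."""
    total = len(copies)
    media_kinds = {c["media"] for c in copies}
    offsite_count = sum(1 for c in copies if c["offsite"])
    return total >= 3 and len(media_kinds) >= 2 and offsite_count >= 1

game_saves = [
    {"media": "internal_ssd", "offsite": False},  # the original on the PC
    {"media": "usb_drive", "offsite": False},     # local backup
    {"media": "cloud", "offsite": True},          # offsite backup
]
print(satisfies_321(game_saves))  # True
```

Drop the cloud copy and the check fails twice over – only two copies, and nothing offsite – which is exactly the kind of “why this rule exists” reasoning worth teaching alongside the rule itself.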

My Next Move

2023 will be a year of tech literacy for me. I’ve always carried the higher-level goal of improving my viewers’ tech literacy via my content, but I do often fall into the trap of focusing too much on individual tasks rather than the thinking behind them. It’s tough in a landscape like YouTube, where the most effective way to reach viewers, get views, get paid, and be able to continue doing what I do is making “How to do X” kinds of videos – most people don’t care about the nuance or bigger-picture stuff in the specific moment they’re searching for an answer – but that doesn’t mean I shouldn’t try. I think focusing on bigger projects, where I can walk through my thought processes and reasoning more than individual “how to” steps, can help a lot. I can still supplement those with tutorials on my tutorial side channels and websites so I don’t miss out on that level of traffic and reach, too. There’s a lot of brainstorming involved to make this happen, but it’s important to me, and I’ve been trying to figure out the path for years – so it’s time to commit. I think it’s more important than ever to develop everyone’s tech literacy and to stop being spoon-fed our online lives by mega-corps and political propaganda bots.

— Addie