Lately it seems like all anyone is talking about is coding.
It came up in at least three of the four job interviews I
had to get my current job. Two of the
three people interviewed in a recent Time Money article as examples of long-term unemployment are learning to code to
beef up their resumes. Google routinely
sponsors “Learn to Code!” webinars on its homepage and as events throughout the
year. Every local library offers
learning to code classes for beginners. There are even toys that use Star Wars
as a platform to teach kids to code.
It is a skillset hailed as the cure-all of the modern world,
solving problems from students lacking math skills to unemployment. Resources everywhere suggest learning to code
to boost your desirability as an employee, to keep your mind sharp as you age,
to increase the amount of money you can make.
For someone in IT, it is frankly confusing.
Few of these articles mention computer skills in a broader
sense as necessary – it is possible this goes unspoken, but considering how
many phone calls my help desk receives that are “how do I change the time on my
computer,” I suspect that it’s simply been overlooked.
More often than not, these articles are filled with
statements designed to draw and please audiences. They talk about how desirable coding is as a
skillset, how much money a developer can make.
Sometimes they tell one-in-a-million stories about employers competing
over coders, sending their salaries sky-high.
Most mention, at least indirectly, those most famous programmers who
have made millions with a new product – Mark Zuckerberg comes to mind, along
with basically the entirety of Silicon Valley (the place, not the show).
It’s not that all of these things aren’t true.
Sure, if you learn to code, you could make a shitload of
money. You could end up being pursued
aggressively by employers. You could
invent a new app and run your own company and make millions.
But before you do all that, you have to actually learn to
code.
Almost none of these articles mention just how
time-consuming and, truthfully, hard
coding actually is. Most, instead, point
toward coding boot camps or small start-ups that will walk a new coder through
a lot of small, easy-to-write/understand programs. They’re full of encouraging statements like “Everyone
can code!” or stories about people learning to code in retirement, as if that
somehow makes it easier. Some courses, like those on Udemy or Pluralsight,
offer more comprehensive looks at coding, but even then, they often start with
blindfolds and hand-holding, hiding just how difficult this skill is to acquire.
Everything starts easy – the traditional “Hello, World!” console
program,
a calculator, a rudimentary
coin-flipper, all those little baby programs that make this seem really easy
and possible. Sure, everyone can learn to code when it’s just a few
lines!
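For anyone who hasn’t seen it, the canonical first program really is just a couple of lines – sketched here in Python for illustration, since the language varies from course to course:

```python
# The traditional first program: build a greeting and print it to the console.
message = "Hello, World!"
print(message)
```

That’s the whole thing – which is exactly why it makes coding look deceptively easy.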
But, as with most things, it’s not that simple.
Coding is, essentially, syntax combined with algebra. How you put statements together, how you
declare variables, and how you set your methods to run (all extremely basic
tasks in C#, for example) is all about how you set up the computer to read your
code.
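To make that concrete, here is a sketch of those same basics – declaring variables, defining a method, and setting it to run – written in Python for illustration rather than the C# mentioned above, though the ideas carry over:

```python
# Declaring variables: the syntax has to be exact, or the
# computer simply can't read the code.
greeting = "Hello"
count = 3

# Defining a method (a function) the program can run:
def repeat(text, times):
    return " ".join([text] * times)

# Setting it to run and storing what comes back:
result = repeat(greeting, count)
print(result)
```

Miss a colon or a quotation mark anywhere in that, and the whole thing refuses to run – which is the syntax half of the “syntax combined with algebra” problem.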
If you are like me, and Algebra 1 threw you back in high
school, this seems near-impossible.
There’s a lot out there about how to learn to code, but the
one resource that has always stuck with me is Viking Code School’s article “Why Learning to Code Is So Damn Hard.” It
has a handy graph outlining the process.
That graph alone is enough to scare people away – the 'desert of despair' does not sound particularly welcoming. Hell, listening to Bishop talk about hours or days at work spent running down the smallest of bugs makes me want to cry with frustration on his behalf.
So the notion that everyone should learn this skill baffles
me.
Apparently, I am not alone.
Last May, just as I was leaving teaching, I found myself
briefly unemployed. It didn’t last long
– all told, about three weeks, one of which was just me waiting for my start
date to finally arrive. But like so many
others who find themselves unemployed, I considered that maybe I should learn
to code.
Bishop set me up with a beginner Udemy course he’d used and
I was off and running.
For about an hour.
Then, stumped by the fact that coding is algebra conducted
in Latin and I sucked at both, I started playing on the internet. Within minutes of googling about coding, I
stumbled upon a TechCrunch article titled “Please Don’t Learn to Code.”
It argues that, while coding is a useful skillset and our
current economy does need people able to do it, providing the entire populace
with these skills is wildly unnecessary.
At one point, the author says, “I would no more urge everyone to learn
to program than I would urge everyone to learn to plumb.”
I know there are those who would argue that everyone should learn to plumb. Resist that urge, just for a moment.
Instead, the author argues for better problem solving skills
– after all, someone can code to their heart’s content, but if they don’t
understand the problem they are solving, coding a solution is pointless. You can’t fix a bug if you don’t know why the
bug is there in the first place. Those
skills are the ones that are truly lacking – the ability to figure out what’s
going on, to ask the right questions, to fumble through and find a solution
that works – and all that has to happen long before anyone can whip out a
variable and assign it a value.
Having read this, and having helped students, family
members, and more fix computers for years, and having spent just a smattering
of time in the industry, I can safely say that coding is not for everyone.
Everyone does, however, need computer skills. That’s just the world we live in now – just
about everyone will interact with a computer almost daily, in some form. And those problem solving skills that
TechCrunch wants us all to have are only sharpened by learning how to solve all
kinds of small computer problems.
Instead of coding, what I would argue for is computer
literacy.
I just left teaching – I’m still so close that I write
lesson plans in my head while I brush my teeth and have active shooter
nightmares. I’ll build you a curriculum
right here:
Unit 1: Working a Computer: Hardware
- Parts of a computer: motherboard, processor, hard drive, RAM, and more
- How to build a computer
- Never leave the computer on all the time
- Why pressing the power button to turn it off is a terrible idea
- Don’t just yank out a jump drive
- Fix it vs. Replace it: Pros & Cons
Unit 2: Working a Computer: Software
- Basic Programs: How to set up email, word processing, online storage, etc.
- Strong Passwords
- What a strong password actually IS
- Resetting a password
- Why having a million things open makes things slow
- Changing the date, time, program defaults, and more
- Adjusting the volume
Unit 3: Basic Internet Skills
- What the Internet IS
- Set up a rudimentary network; protect that network
- Resetting a router
- THE Cloud is not A cloud
- How to google properly
- Reliable sources
- Using quotes, +, and more to tailor search results
A creative teacher might even use parts of CollegeHumor's "If Google Were a Guy" videos (NSFW, but very funny).
Unit 4: Basic Security
- How to avoid dangerous websites
- How to protect a home network
- What phishing, ransomware, spyware, and more actually are
- What Not to Click On in email, online, and more
- Never open email from an unknown source
- Never give out passwords or personal info over the internet
- What to do if your email is hacked
Unit 5: Peripherals
- How to install a keyboard, mouse, jump drive, printer
- How to set up wireless anything: Bluetooth speakers, headphones, etc.
- How to install a wireless printer
Somewhere in here, people also need a lesson on “How to Be Nice to Tech Support When You Call Them.” Too many people call my help desk and are already shouting when I pick up the phone. They are ignoring the fact that I have literally All The Power over their computer. Do not fuck with the IT people.
But I digress.
I even have the perfect time in high school: junior year,
maybe opposite Personal Finance, so they have experienced some of these things
but haven't yet totally checked out of school.
The final could be sitting them down in front of a computer that is all
kinds of screwed up and telling them to fix it.
And disabling Google beforehand – Google is any good tech’s best
friend.
#IndustrySecrets
Some of this probably sounds painfully straightforward, but
trust me, it is not. In six months on a
help desk, I have had to explain how to minimize windows, that files are
different sizes and why that matters, how to turn the volume up, and why
leaving your computer on for a month is a terrible idea. People do not understand.
Sure, sometimes, I wonder if anyone would trust the lawyers
of the firm I support with their cases if they knew some of the ridiculously simple
computer issues they can’t fix. And the
fact that someone will call me and brag that they don’t get computers so
they’re keeping me employed does make me want to kill them. I’m just not sure it’s 100% their fault.
So much of our world runs on technology – smartphones,
tablets, laptops, smart watches. It is
easy to assume that, since technology is ubiquitous, everyone knows how to use
it. For years I heard teachers say that
kids should know how to use a laptop because they are always on their
phones.
But using something and using it correctly are not the
same.
After all, no one wants to be like one of my coworkers, who
was trying to code something and accidentally deleted all his system
files. You know, the ones that let the
computer turn on and load an operating system.
Of course without some computer literacy, people probably
don’t quite know what it means to load an OS, so perhaps I’m getting ahead of
myself.
We can't have everyone coding until everyone can actually
use a computer – and let me tell you, almost no one can actually use a
computer.