Glenda Hood does not know what a computer is. - Speaker for the Diodes
Aug. 28th, 2004
08:12 am
I sometimes annoy people by trying to explain too much, especially when explaining anything computer-related to non-geeks. It's a combination of "the teaching instinct" and honest enjoyment of the subject. But it's also because of my own annoyance at the specific types of errors and poor choices that result from folks not understanding just a little bit of the "how".
An argument is often made (and it does have some weight to it, I'll admit up front) that people don't want to have to become "computer experts" or learn a whole lot of background just to use these tools which are, after all, supposed to make their lives simpler, not be a fresh bundle of complexity. And despite my own geekhood -- and more importantly, a personal learning style that works best when I've got info about the underlying principles to use as a framework for what I'm learning -- I can actually sympathize with that. There are times I just want something to work, dammit, when it would be nice for something that "ought to be easy" to really be easy. I don't have to know whether my car is injected or has a carburetor, nor understand how a carburetor works, in order to drive it, right?
On the other hand, understanding a carburetor may make it easier to avoid flooding the thing and then being confused as to why it won't start. Knowing how ABS works and whether my car has it makes it more likely that I'll do the right thing in a panic-braking situation (and not get freaked out by the behaviour of the automation). And even though I don't need to know how to set the timing, I do have to understand what to do when the needle on the fuel gauge dips down to the 'E' mark (or 'R' on old Volkswagens), and how long overdue my next oil change is. And while it's not necessary for operation, it'll reduce surprises and frustration to have an idea how long to expect a battery to last (and why the car needs one). So why is the automobile so often held up as an example of how simple computers "ought to be" and how you don't need to understand technology in order to use it? Because in our culture the knowledge required to operate a car is ubiquitous, not because none is required. Folks don't notice the amount of knowledge required because they never sat down to study it; they gathered it gradually through immersion in a culture where it's everyday knowledge.
And still, we require a test -- and strongly encourage training -- before permitting people to drive these "don't need to learn a bunch of arcane stuff" vehicles on public roads, just to cover the basics of safe operation. (I've been told that some other countries require more understanding of the machinery as well.)
Well, folks, public ignorance of how computers work does create hazards. Some of the hazards are petty annoyances (top posting, untrimmed quotes, HTML email to people who don't use HTML-aware mail readers, 'To' headers five screens long, posting a GIF of a picture of text on the web instead of posting text of the text). Some of the hazards are more significant impediments to productivity and/or security (worms, trojans, zombies, spyware). And some actually threaten our democracy. (Yes, yes, I know it's a republic, not a democracy; what's being threatened is the ability of the people to have their votes counted, the democratic aspect of the republic.)
In one article about electronic voting machines, I ran across:
1. The idea (repeatedly) that having the machine record three copies of the same thing in the same format counts as an "audit trail" even if there's no way to verify that what it recorded in the first place is what the voter had indicated,
2. The idea that since computers, when functioning correctly, programmed correctly, and operated correctly, tabulate things more reliably than a human, there's no point in ever checking the computer's results,
3. A statement from Alfred Gonzales (Santa Clara County) that "There is fool-proof security."
4. And a statement from Florida Secretary of State Glenda Hood that, "The touch-screen machines are not computers."
(And that's just the "misunderstanding-technology" part. I'm also boggled by laws against recounting votes in some places. WTF?!)
Regarding #1, Jesse Durazo, the registrar of voters of Santa Clara County said, "It's a redundant perfection!... It starts with the premise that the information in the system is correct." (Uh, dude? If you can't audit the first step, you don't have an audit trail. And if one copy can be changed, three copies can be changed.)
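To make the "three copies isn't an audit trail" point concrete, here's a minimal sketch (my own illustration, not code from any real voting system) contrasting identical copies with a hash-chained log. If a tamperer can rewrite one copy, they can rewrite all three identically and nothing will notice; chaining each record to a hash of everything before it at least makes after-the-fact alteration detectable:

```python
import hashlib

def chain_append(log, entry):
    """Append an entry linked to the hash of everything before it."""
    prev = log[-1][1] if log else "genesis"
    digest = hashlib.sha256((prev + entry).encode()).hexdigest()
    log.append((entry, digest))

def chain_verify(log):
    """Recompute every link; any altered entry breaks the chain."""
    prev = "genesis"
    for entry, digest in log:
        if hashlib.sha256((prev + entry).encode()).hexdigest() != digest:
            return False
        prev = digest
    return True

# "Redundant" identical copies: change all three the same way and
# no comparison between them can reveal the tampering.
copies = [["A", "B", "A"]] * 3

# A chained log, by contrast, is tamper-evident after the fact.
log = []
for vote in ["A", "B", "A"]:
    chain_append(log, vote)
assert chain_verify(log)

log[1] = ("A", log[1][1])   # quietly alter one recorded vote
assert not chain_verify(log)
```

Note that even the chained version still "starts with the premise that the information in the system is correct" at the moment of capture -- it can't prove the machine recorded what the voter actually indicated, which is exactly why a voter-verified record matters.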
Regarding #2, even if there's no malicious tampering (not yet a good bet), some bugs are subtle. So are some hardware failures. If you hand-check every time then there's not much point to using automation, but we still need to be able to check when things look wonky (or even for random-sample testing in a real-world environment).
Regarding #3, well there ain't such a thing.[*] Security is relative. You make breaches difficult, and you try to make the difficulty greater than the temptation to break in (for an election, the stakes are pretty high); and you make it as easy as possible to detect the break-ins you know are still possible, however unlikely you think you've made them. You tie the expense of the security to the risks (both the degree of incentive to break in and what you stand to lose if someone does so). (Note that I did not limit this statement to computer security. This mistake isn't strictly a failure-to-grok-technology one.)
But #4 is the one that really bugs me. You've got a box that runs a Microsoft operating system, has a touch-screen, collects and tabulates information, and can be reprogrammed without dismantling it and redesigning the hardware, but it's not a computer? If you don't know what a computer is... Look, I can understand someone not being worried about their microwave oven being broken into, or your car's fuel-injection computer being infected with a virus, but to claim that a touch-screen voting machine is miraculously secure because it doesn't have a word processor and a web browser on it (I can't think of why else she'd say it "isn't a computer") is boggling.
There also seems to be an attitude that "the academics" and security experts "don't get it", are behind the times, are mere worrywarts, etc. Folks? Someone who has made a career as a computer expert is not likely to be a Luddite.
Unfortunately there's no guarantee that getting a job as a programmer means you already really understand computers. Some of the failures-to-comprehend-the-how come from the folks who designed the voting machines, or who wrote the code. aliza250 can tell you about having to explain really basic stuff to folks who were already getting paid to be computer programmers. I can tell stories about coders who'd learned cookbook stuff on one type of system and were completely ignorant of any others, and had no idea how different the system they were writing for was from the one they had learned on.[**] And I'm sure every programmer on my friends list (and some of the engineers) can talk about other people's really idiotic code they've had to maintain or correct. Worse, how many folks here saw the problem with the 640K barrier before Microsoft did, or the danger of setting up a system to automatically execute code received over the net before macro-viruses and mail worms became a problem?
I don't expect everyone to have my level of understanding of how all the pieces fit together, much less be the kinds of experts some of my friends are, but I'm annoyed by the people who don't even want to sit through "driver's ed for the net" or learn the computer equivalent of "what a spark plug is" and "how to check the oil", and I'm a little scared of the folks who are trusted to make technology decisions that affect our democracy without learning a damned thing about that technology first. And I'm discouraged by how hard it often is to make non-geeks understand enough of the threat to care. Direct interaction with computers has only relatively recently become as ubiquitous as cars were in 1960, so perhaps eventually more people will understand them well enough to avoid the mistakes experts think are obvious. In the meantime ... *Grrr!*
The more I think about it, the more appropriate it seems that I know someone in the security field named Cassandra.
[*] If you want your computer to be secure, encase it in concrete, unplugged, put the concrete-encased machine in a vault, and post an armed guard outside the vault. And even then, how do you know that guard can't be bribed? In the meantime, your computer isn't being very useful.
[**] Let's start with the fact that they were writing code in COBOL under MS-DOS 2.1 for a system to be deployed under Xenix 3.something, and didn't know that multitasking operating systems existed, didn't know what a multiuser system was, and had never heard of a dumb terminal. (Note that none of these were new or obscure concepts at the time.) If they'd known such things existed, they might have done some research to see what the differences between Xenix and MS-DOS were, but they'd never seen a computer that wasn't an IBM PC and had no idea that Xenix wasn't just a funny spelling of "DOS". They didn't know enough to know what questions to ask.