It's common knowledge that enrollment in computer science programs has declined significantly over the past decade. Although the trend seems to be reversing itself with enrollment increasing over the past few years, computer science educators shouldn't get complacent just yet. There's still much work to be done in order to ensure that computer science departments continue to grow and are able to meet the increasing demand for technically skilled graduates.
I've always felt that a lack of exposure is one of the primary obstacles to computer science enrollment. While I don't have any hard data, my own experience combined with an informal survey of friends and colleagues leads me to believe that a large percentage of North American high schools—possibly even a majority—offer little in the way of comprehensive computing classes. While I don't dispute the importance of math, English and science, I can't help but wonder why computing isn't given an equal footing in the educational sphere, given its ubiquity today.
I've encountered a great number of intelligent and educated people who have no idea whatsoever about what the field of computer science is, or even how computers and software work. I understand that most people don't necessarily need that information. However, it's odd to me that basic computer literacy isn't required of college graduates (let alone high school students) in the same way that basic writing and math skills are. In the interest of meeting ever-increasing economic demand for technically skilled graduates, shouldn't we do more to expose students to computing?
To illustrate my point, here are a few examples:
I used to work retail, and small-talk at the till would frequently turn to my university career. Almost invariably when I told a customer that I was enrolled in a computer science program, he or she would reply "Oh, I always have so much trouble getting computers to work. I think mine has a virus." This was usually followed by "But it's good that you're learning about them; we'll always need people to fix them!"
While discussing my current job with a friend of mine who is in the third year of her degree program, I described my workflow to her. I told her that a bug report or feature request will come in from a customer, I'll gather requirements and implement a solution, and then make adjustments as necessary until the customer is satisfied. She asked, "How many do you get done in a day?" I couldn't help but laugh; there are very few tasks in client-focused software development which are both worth mentioning and completable in one day. Perhaps I'm being unfair here, as that question probably seems reasonable to someone with little knowledge of software development. However, no one would ask an architect, "How many buildings do you design in a day?" Why is software different? I'd argue that it's because most people are never exposed to software development.
In my third year of university I was required to take an English course to fulfill my breadth requirements. While discussing one of my papers with her, the professor complimented me on my writing. She said that I had a strong, persuasive style, and asked if I was a law student. She was shocked when I told her that, as a matter of fact, I was in computer science. She then told me that she was glad I'd taken her course (on Renaissance and early modern literature), as it would ensure that I was somewhat cultured and not simply a "computer nerd." I have a great deal of respect for that professor and I don't believe that her comment had any ill intent, but it again illustrates the fact that computer science is frequently misunderstood and/or not taken seriously.
I'm not entirely sure if these examples will properly communicate the point I'd hoped they would. I wouldn't want anyone to misunderstand the purpose of this post, either; I don't mean to complain about how computer science is perceived. I don't particularly care how people feel about my field of study.
My point is this: it seems to me that a large percentage of people, no matter how well educated they may otherwise be, have no idea how computers work or how software is developed. I feel that the primary reason for this is, in most cases, that they were never exposed to computing in a serious way. Until this is remedied, computer science enrollment will continue to be mediocre and the field of computer science will not reach its full potential. After all, who wants to go into a field he or she knows nothing about?
I've mentioned that I'm surprised at the lack of computer literacy I see in some college graduates. Colleges are not my primary concern, though. If computer science enrollment is to be increased, exposure to computing needs to start much earlier. At the very least, I'd like to see comprehensive computing courses—which teach skills up to and including basic programming—made available in a majority of high schools. Ideally, these classes should not simply be electives; one computing class should be a required part of any high school curriculum. (I don't mean a course in touch typing.) Of course, given the real-world economic and temporal constraints that educators face, I don't expect that any of these things will happen any time soon. One can always hope, though!
On a related note, you may be amused to find out what a browser is.