Thursday, April 16, 2009

Training and Supporting the Technophobic Teacher

Introduction: What is technophobia?
From a purely psychological point of view, is there really any such thing as a clinically diagnosable condition called “technophobia”? After all, a number of articles and blog offerings on the Internet that purport to discuss technophobia (many of which even include that term in their titles) actually turn out to be straightforward discussions of how to effectively provide technical training to just about anybody (Bell, 2006; Fox, 2007; Goldsborough, 2003; Jencius, 2009). But a brief look at the work of some serious practitioners like M. J. Brosnan (2006) and the renowned psychologist team of Rosen and Weil (1990; 1995) should be enough to convince anyone that “technophobia” is more than just an empty hyped-up term.

The work of all three of these researchers spans several decades, so they have had time to rework and refine their research methodologies, as well as undertake longitudinal studies. Brosnan (2006) concludes that 1 in 20 people suffer from an extreme clinical technophobia, which entails measurable involuntary physical manifestations of anxiety similar to those demonstrated with traditional phobias, like fear of spiders. However, neither Brosnan nor the team of Rosen and Weil limits the definition of technophobia to just this more extremely afflicted 5% of the population. Brosnan reports that fully one in three of the subjects he studied suffer from some aspect of technophobia (2006), while Rosen similarly concludes that “approximately one-fourth to one-third of all people can be classified as ‘technophobic’” (1993, p. 28). Technophobia in this significant portion of the population is defined as “an irrational anticipation of fear evoked by the thought of using (or actually using) computers, the effects of which result in avoiding, or minimising, computer usage” (Brosnan, 2006, p. 1081).

Importantly, in addition to effectively defining technophobia and showing the scope of the problem, the work of these serious researchers also blows away our stereotypes regarding who tends to be a technophobe. It is not gender-specific; it is not age-specific; it is not just another form of math anxiety; and it is not just limited to “high strung” individuals who display other kinds of phobias (Rosen, 1993).

While these researchers make the case that technophobia is “a real phenomenon” (Brosnan, 2006, p. 1081) that has similarities to other phobias, I would go further to posit that technophobia has some characteristics that make it decidedly different and potentially more troubling than other phobias. To frame an example, a technophobic teacher is obviously aware of the fact that information technologies have become a vital fixture on the modern educational landscape. They also surely realize that their fear of interacting with these technologies could stand as a major impediment to the advancement of their career, perhaps even as an existential threat to the career itself. This realization could lead to new, quite rational fears of career failure flowing from their technophobia, with each fear exacerbating the other in a self-reinforcing cycle of dread, loathing, anger, and despair. This picture of technophobia is not at all analogous to something like “spider phobia”.

The rigorous researchers mentioned thus far all share the same background: they are psychologists. In the methodologies that they have constructed, they use the tools of the psychologist to fix what they have defined as a psychological problem. In fact, Rosen and Weil mince no words; they flatly state “Before computer education can truly become the fourth ‘r,’ school administrators must become cognizant of the massive numbers of technophobic teachers in their schools and take steps to provide psychological assistance to eliminate this psychologically-based problem [italics mine]” (Rosen and Weil, 1995, p. 28). Indeed, the work of these researchers has shown that purely psychologically oriented stress-reduction therapies (such as those offered to treat just about any kind of phobia) offer the prospect of largely eliminating technophobic tendencies in people. If we were to take this as the final word, then this paper would end here, instructing technophobes to enroll in a program like the ones put in place by Brosnan (2006) or Rosen (1988). However, it could be worthwhile to speculate upon specific causes of technophobia, not from a psychological point of view, but from an educational technology point of view, and then see whether by effectively defining those causes we might be led to remedies that might be available to a technical trainer or mentor, rather than to a psychologist.

Possible Causes of Computer Aversion
So what are some potential causes of technophobia or “computer aversion”? Most sources seem to agree that an individual can develop computer aversion if technology is not introduced to them in an appropriate way (Bell, 2006; Bertozzi & Lee, 2007; Brosnan, 2006; Fox, 2007; Goldsborough, 2003). All of these sources either state or imply that this means that the computer averse individuals were not trained in an appropriate way. Certainly, poor training must play its role in the creation of technophobes, but in all of the sources that I have consulted I have seen no mention of what is, to me, the most obvious “bad introduction” to technology that an individual can receive, which is to be introduced to bad technology! A great deal of software produced in the past and in the present day is simply designed poorly, so as to be confusing, ambiguous, and incredibly frustrating to use. When I use such a product, I tend to think thoughts like “what a complete piece of garbage”, but when a neophyte is given an unworkable computer interface, they potentially have no basis for knowing that this klutzy interface is not representative of the entire world of computer technology. Instead, all that they may perceive is others around them happily, busily, and effectively working with their computers, while they languish in abject confusion. Is it any wonder that a person in this position might tend to assume that they somehow lack a fundamental ability that most others possess? Should we be surprised when they choose to “blame the victim”, i.e., blame themselves?

Another bit of dogma shared by many of the sources that I have consulted in my research is that computer averse individuals seem to believe that if they push a wrong button they will somehow do some kind of damage. What is amazing to me is that these sources are seemingly unanimous in their appraisal that this cannot happen, and that computer averse trainees should be carefully taught this (Harrison, 2000; Goldsborough, 2003; Bell, 2006). But how many of these afflicted individuals actually have had earlier, formative experiences when they “pushed a button” and valuable data or documents that they may have spent hours, or even days, working to carefully create were either unintentionally deleted, overlaid, or misplaced somewhere deep in the file structures of a computer? Okay, so maybe this did not result in some piece of hardware or operating system software being physically damaged, but such a deletion or corruption of data can sometimes do more damage to an individual or institution than if somebody had walked in with a sledgehammer and started smashing every piece of equipment they could find.

An additional “bad introduction” that a person can receive to computers is suggested in the research of Bertozzi and Lee (2007), who found a positive correlation between playing with computers and “feeling comfortable with and competent in terms of computer technology” (Bertozzi and Lee, 2007, p. 200). This correlation strongly suggests that the inverse situation is also true: if computer use is presented to an individual as representing only one onerous task after another to trudge through, it is no surprise that the computer will be avoided, if not feared.

One more root cause of computer aversion is hinted at in the work of Mark Lecher, who offers that “adequate and timely support” should be made available to the recovering technophobe (Lecher, 2004, p. 173). Obviously, proper support services are needed by all technology users, but the absence of adequate support during a person’s early encounters with technology could certainly set up an understandable long-standing aversion to computer usage. This particular brand of technophobia is reflected in this quote from a technology consultant regarding those who are reluctant to adopt interactive whiteboard technology: “A lot [of teachers] still like chalk, and the reason is that they know when they walk into a classroom that it's going to work. With a whiteboard they're never totally certain” (Zind, 2008).

Potential Remedies for Computer Aversion
Given the causes of computer aversion laid out above, the most obvious palliative that comes to mind is simply honesty. One of the most useful treatises that I have run across is by Lucy Harrison, a reference librarian with considerable experience dealing with computerphobic patrons needing to use her library’s computers to search for resources. Like several of the other authors I encountered, she makes the point that falsely idealizing a complex (or poorly-designed) technology will not be of ultimate benefit to the computer averse trainee (Harrison, 2000). In my more than twenty years of training adults in uses of various technologies, I find that computer averse individuals are very relieved to hear someone they perceive as a capable technophile explain to them that they are right to be confused by certain obscure features of a poorly designed piece of software or hardware. And the computer averse users that I have worked with are also especially grateful to get advice on how to avoid loss of data. This topic is not something to be left for the latter part of a training sequence; its inclusion early on can build a user’s confidence to be able to use a technology without “pushing the wrong buttons” and “breaking something”.

Beyond honesty, I find that computer averse individuals appreciate having somebody who will be their staunch advocate when necessary, either when hardware or software is not up to snuff, or when proper support is lacking. In any institution, users of any given technology should only be expected to use it if they have reasonable assurance that they will be supported in the use of the technology and will receive reasonably quick responses and resolutions when and if the technology breaks down. If such an effective support infrastructure is not in place, then a teacher has every right, in some cases perhaps even a responsibility, to refuse to use the technology, and to fall back to more reliable alternatives.

Speaking of proper infrastructure, a review of a recent New York Times article, regarding the failure of one-to-one computing initiatives in some schools (Hu, 2007), brings to mind some basic project management skills that can be brought to bear to help the technologically averse adapt to changes brought about by the introduction of new technologies. New technologies need to be introduced incrementally, if at all possible, to avoid throwing anyone (particularly the computer averse) into the “deep end”. Surely a big, underlying fear among technophobic teachers is that new technologies will usurp the tried-and-true teaching methodologies that they have worked hard (sometimes for decades) to develop. One additional cause of computer aversion that we see in many teachers may come from their past negative experiences of “new technologies” being foisted upon them without first getting proper buy-in, without providing proper training, or without effective integration into curriculum and existing class routines.

But obviously, effective training for the technophobic teacher should not consist just of mutual gripe sessions about the shortcomings of technology, nor solely in lobbying on behalf of the teacher to get proper support and management into place. It primarily needs to focus on listening to the individual, to gain some insights as to why they have aversions toward working with a specific technology. In the words of librarian Lucy Harrison, we “should treat such concerns seriously, and not brush them off by saying, rather arrogantly, how easy it all really is” (Harrison, 2000, p. 35). In my work with any trainee (but particularly the computer averse), I like to find at least one thing that they would really like to be able to accomplish with the technology, and then find a way to very quickly either partially or completely fulfill that desire. To me, the speed of getting there is critical. The idea is to get a quick payoff, to offset perceived negatives with a readily attainable positive. In an interactive whiteboard training that I did with a computer averse teacher at the beginning of this school year, the quick payoff was simply the ability to write and erase with the stylus on the whiteboard. It doesn’t have to be a big thing, but rather something that the technophobe can immediately hold up to themselves and say “wow, I can do this”, preferably being able to add, “and this is really fun!” A little bit of “fun” can bring many barriers tumbling down (Bertozzi & Lee, 2007).

The ultimate goal is to replace dread with enthusiasm, to banish “I’m so stupid when it comes to computers” and instead summon up “I finally figured out how to get this to work”. As Rosen pointed out, technophobia is a psychological problem that can be dealt with by effectively addressing the underlying roots of the aversion. It is when some of those underlying roots lie outside of the individual, and are instead embodied in poorly designed software or training, in inadequate support infrastructures, in poorly planned implementations of new technologies, or simply in an overwhelming amount of information leaving teachers dazed and confused – it is in such situations that the educational technologist, rather than the psychologist, must aggressively step in, take the bull by the horns, and work to set things right.


Bell, M. (2006, November). Encouraging Technophobes: Old Dogs CAN Learn New Tricks. MultiMedia & Internet@Schools, 13(6), 36-38. Retrieved April 7, 2009, from Computers & Applied Sciences Complete database.

Bertozzi, E., & Lee, S. (2007). Not Just Fun and Games: Digital Play, Gender and Attitudes Towards Technology (pp. 179-204). Organization for Research on Women & Communication. Retrieved April 6, 2009, from Communication & Mass Media Complete database.

Brosnan, M., & Thorpe, S. (2006, November). An evaluation of two clinically-derived treatments for technophobia. Computers in Human Behavior, 22(6), 1080-1095. Retrieved April 6, 2009, doi:10.1016/j.chb.2006.02.001

Fox, C. (2007, July). From Technophobes to Tech Believers. T H E Journal, 34(7), 36-37. Retrieved April 6, 2009, from Computers & Applied Sciences Complete database.

Goldsborough, R. (2002, March). Personal Computing: Overcoming Fear of PCs. Link-Up, 19(2), 7. Retrieved April 6, 2009, from Computers & Applied Sciences Complete database.

Goldsborough, R. (2003, July 7). Taming Technology Fears. Community College Week, 15(24), 19-19. Retrieved April 6, 2009, from Education Research Complete database.

Gurcan-Namlu, A. (2002, May). Technophobia and its Factors: A Study On Teacher Candidates. Educational Sciences: Theory & Practice, 2(1), 244. Retrieved April 6, 2009, from Education Research Complete database.

Harrison, L. (2000, January). Stress Relief: Help for the Technophobic Patron from the Reference Desk. Reference Librarian, 33(69/70), 31. Retrieved April 6, 2009, from Education Research Complete database.

Hemby, K. (1998). The Impact of Keyboarding Skill on Computer Anxiety in End Users. Retrieved April 9, 2009.

Hu, W. (2007, May 4). Seeing No Progress, Some Schools Drop Laptops. New York Times website. Retrieved April 6, 2009.

Jencius, M. (2009, January). You say you want a resolution. Counseling Today, 51(7), 26-27. Retrieved April 6, 2009, from Education Research Complete database.

Causes of Technophobia: Why Some People Refuse to Learn About Computers. (2007, June 5). Retrieved April 9, 2009.

Lecher, M. (2004, June 1). Technophobes Teaching with Technology. Association of Small Computer Users in Education (ASCUE), (ERIC Document Reproduction Service No. ED490116) Retrieved April 7, 2009, from ERIC database.

Rosen, L. (1988, January 1). A Model Program for Computerphobia Reduction. California State University. (ERIC Document Reproduction Service No. ED318466) Retrieved April 7, 2009, from ERIC database.

Rosen, L., et al. (1993, March 1). Treating Technophobia: A Longitudinal Evaluation of the Computerphobia Reduction Program. Computers in Human Behavior, 9(1), 27-50. (ERIC Document Reproduction Service No. EJ456183) Retrieved April 7, 2009, from ERIC database.

Rosen, L. D., & Weil, M. M. (1990). Computers, classroom instruction, and the computerphobic university student. Collegiate Microcomputer, 8, 275-283.

Rosen, L., & Weil, M. (1995, March 1). Computer Availability, Computer Experience and Technophobia among Public School Teachers. Computers in Human Behavior, 11(1), 9-31. (ERIC Document Reproduction Service No. EJ496618) Retrieved April 7, 2009, from ERIC database.

Zind, T. (2008, July 17). Dealing With Technophobes. Retrieved April 9, 2009.

Friday, April 10, 2009

How do computers fit into a plan for pre-Kindergarten children?

Another brief paper that I submitted earlier this semester (Spring '09) at George Washington University:

In this week’s studies in usage of technology in pre-Kindergarten settings, we were introduced to a multi-faceted and often confusing discussion (which might be better characterized as “argument” or “controversy”) over the proper role that modern technologies might play in a pre-Kindergarten classroom. To focus more dispassionately on the question at hand, I find it necessary to take up a curriculum-driven approach to usage of educational technologies in this context. This avoids for the moment the passion-inciting question of whether such technologies are healthy or “proper” for usage with 3- and 4-year-old pre-Kindergarten students, and focuses instead on these two questions: (1) What are the prime goals of the average pre-Kindergarten curriculum? and (2) Could the achievement of these goals be assisted by computer technology?

What are the prime goals of the average pre-Kindergarten curriculum?
For starters, the National Institute for Early Education Research [NIEER] offers this consolation to those of us seeking to make sense of widely disparate pre-Kindergarten curricula throughout the U.S.: “Given the multitude of available curriculum models, the confusion regarding which ones are appropriate for 3- and 4-year-olds is understandable” (Frede & Ackerman, 2007). Determined, however, to find a common thread, I looked at the curricula of a couple of Head Start programs as well as the pre-K curriculum of my neighborhood international school (IST, 2008). All the curricula I examined seemed to converge at a point clearly established by the Head Start program in Portland, Oregon: “The goal … is to bring about a greater degree of social competence in children” (Head Start - Portland, 2009). While some pre-K programs go notably further in their curriculum, stipulating that children will “develop beginning reading skills … acquire beginning writing skills,” etc. (Head Start - New Castle County, 2009), the most fundamental focus of all programs seems to be the development of cooperative social behavior.

One thing that is clearly not present in any pre-Kindergarten curriculum that I have examined is a particular focus on “computer skills”. Whereas many K-12 curricula do pointedly include the development of computer skills, such an explicit role for computers within the pre-Kindergarten curriculum is conspicuously absent. This is important to note: K-12 curricula require the usage of computers; pre-Kindergarten curricula do not. This means that the inclusion of computers in pre-K classrooms seems to be decidedly optional from a curricular perspective.

Could the achievement of these goals be assisted by computer technology?
For the sake of brevity, I will presume that an assessment of technologies which aid in development of reading and writing skills will be covered in our upcoming survey of K-12 educational technologies. This allows a focus on the question of whether the purely socialization-oriented goals common to all pre-K programs can be assisted by incorporation of computer technologies.

In summarizing the findings of neuroscientific research, a recent NIEER report concludes that “sensitive interactions with adults do more to promote brain development than any toy, CD, or DVD” (Thompson, 2008). While this certainly favors humans over gadgets in the pre-K classroom, it leaves open the question as to whether gadgets, such as computers, might play an effective supporting role. Fortunately, I found two relatively recent studies (from 2002 and 2003) which focus on computers and socialization of preschoolers.

One study focused on both the peer-to-peer interactions and student-to-teacher interactions around two separate computers, equipped with a variety of age-appropriate educational games, placed for observation side by side in a preschool classroom. One student was placed at each of the two computers, and the main thrust of the study was to see whether this would lead to social isolation or social interaction between the separately-placed students. The researchers give details of some of the numerous peer-to-peer interactions that occurred, many of which fall into the category of “hey look what I’m doing on my computer” type of interactions. Predictably, teachers had to sometimes give assistance to students, and also had to adjudicate arguments between students (Heft & Swaminathan, 2002). In reading through these anecdotal episodes, it occurs to me that these are the kinds of social interactions that take place when any toys are being played with in a preschool classroom; in this case, the toys just happen to be computer-based learning games.

The second socialization-oriented study was set in an inner-city preschool, and looked at the potential for computing technology to have a positive effect in mitigating the disruptive behavior of “at-risk” students, behaviors which included “poor attention to directions, hyperactivity, and aggression”. The researchers studied and compared two groups of young students, with one group working with computerized math games twice a week, and the other group not getting computer access. Their findings showed that the “at-risk” preschool students who found it difficult to sit still in a normal class would be quite attentive throughout a twenty- to twenty-five-minute session of computer-based math games. Not surprisingly, their performance on math assessments was noticeably better than that of their counterparts who were not given access to computers. However, the researchers noted that the disruptive behaviors of the “at-risk” students resumed seconds after they got back into their normal classroom (Laffey et al., 2003).

No overarching conclusions can be derived from a cursory look at two studies, but I think it can be said that there is nothing in either study that would suggest that the usage of computers by preschool students would likely provide notable incremental benefits in the socialization of the children, either in peer-to-peer interactions or student-to-teacher interactions. The first study simply showed that computer learning games offer the same opportunities for social interaction and learning that any other “toy” in the classroom offers, and the second study shows that, while disruptive children are temporarily pacified by computer learning games, their behavioral problems are not permanently ameliorated in any identifiable way.


Frede, E., & Ackerman, D. (2007). Preschool Curriculum Decision Making: Dimensions to Consider. Retrieved March 2, 2009.

Head Start New Castle County [Delaware] (2009). Curriculum. Retrieved March 1, 2009.

Head Start Portland [Oregon] Public Schools (2009). Curriculum – Basic Educational Skills as Defined by Head Start. Retrieved March 1, 2009.

Heft, T., & Swaminathan, S. (2002, March 1). The Effects of Computers on the Social Behavior of Preschoolers. Journal of Research in Childhood Education, 16(2), 162-74. (ERIC Document Reproduction Service No. EJ654377) Retrieved March 3, 2009, from ERIC database.

International School of Tianjin [IST] (2008). IST Curriculum Handbook. Retrieved March 2, 2009.

Laffey, J., Espinosa, L., Moore, J., & Lodree, A. (2003). Supporting Learning and Behavior of At-Risk Young Children: Computers in Urban Education. Journal of Research on Technology in Education, 35(4). Retrieved March 3, 2009.

Thompson, R. (2008). Connecting Neurons, Concepts, and People: Brain Development and its Implications. Retrieved March 1, 2009.

The Tendency to Neglect Questions of Efficiency in Educational Technology Research

Yet another paper that I submitted early this semester (Spring '09) at George Washington University...

In the course of completing many learning activities that take place in an elementary school classroom, there are usually some steps that could be referred to as constituting a kind of “clerical overhead”, duties and actions that fall outside of the explicit realm of students directly engaging in a learning activity. Clerical overhead accrues to both teachers and students, and some of it unquestionably constitutes an irreducible positive educational component in and of itself, such as a teacher grading a set of handwritten student essays (gathering vital formative assessment information) or students tracking their progress with the week’s homework on personal checklists. However, for both teachers and students, some clerical overhead represents something to be gotten out of the way as quickly as possible so that real teaching or learning activities can commence. Examples of such overhead might include: a teacher’s manual grading of multiple choice quizzes, or young students taking an inordinate amount of time to simply go around the room and gather supplies for an activity. For teachers, it is a zero-sum game: the more they can reduce the duration of such overhead tasks, the more time they can spend in high-quality lesson preparation and teaching with students. In my discussions with teachers this year, the most prevalent frustration expressed to me is that excessive administrative and clerical overhead undermines their ability to be effective teachers. While reduction of substantial parts of this overhead may not be addressable by educational computing technologies, some parts of it may well be.

A simple question that is often asked in industrial settings is, “Given that a task might be done with or without a specific technological toolset, what is the incremental time savings accrued through use of the toolset?” Within industry, such a time-oriented productivity analysis tends to be of prime importance and is rarely neglected. But in my very limited survey of educational computing research, I am finding questions such as this only indirectly addressed, if they are addressed at all. For example, a study on usage of handheld student-response devices (used peripherally with a popular interactive whiteboard system) focused on questions of student achievement and attitudes associated with usage of the devices (English, 2006). Besides offering a flawed, “apples and oranges” quantitative comparison to assess student achievement, the study failed to focus on questions of student/teacher productivity in usage of the devices. The comments from teachers that were included in the study hinted that there may have been significant savings of a teacher’s time when multiple-choice tests were administered to the class via the hand-held devices, and then instantly and automatically graded by the associated software installed on the teacher’s computer. How much time might be saved per week by a teacher who is facile with this technology, time that would not have to be spent sitting and grading objective tests, but that could instead be used for higher-level teaching tasks, such as explorations of advanced ways to differentiate lessons or to develop creative trans-disciplinary tie-ins? Not only did this study not consider the question, it also did not recommend that a future study look at the question. Are the authors unaware that such a question regarding possible reduction of “clerical overhead” might be asked, or is such a question deemed appropriate for industrial assembly lines but inappropriate for educational computing research?
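To make the kind of time-oriented productivity analysis I have in mind concrete, here is a back-of-envelope sketch in Python. All of the figures in it are hypothetical placeholders of my own invention, not data from the English (2006) study; the point is only to show how simple the core calculation is once a study actually collects the timing numbers.

```python
# Estimate of weekly "clerical overhead" minutes removed by auto-scored
# student-response devices. All input figures below are hypothetical.

def weekly_minutes_saved(quizzes_per_week, students,
                         manual_sec_per_student, automated_sec_per_quiz):
    """Minutes of grading overhead removed per week by automation."""
    manual_total = quizzes_per_week * students * manual_sec_per_student
    automated_total = quizzes_per_week * automated_sec_per_quiz
    return (manual_total - automated_total) / 60

# Hypothetical scenario: 3 quizzes/week, 25 students, 90 seconds to
# hand-grade each paper, ~60 seconds to review each auto-generated report.
saved = weekly_minutes_saved(3, 25, 90, 60)
print(f"Roughly {saved:.0f} minutes per week freed for higher-level tasks")
```

Nothing here is sophisticated; that is precisely the point. The hard part is not the arithmetic but the fact that studies like the one above never gather the timing inputs in the first place.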

To investigate further, I rather randomly stumbled into the current issue of the Journal of Educational Computing Research and reviewed an article on the hot topic of one-to-one computing in schools (Lei & Zhao, 2008). Since most schools that might be considering an “upgrade” to one-to-one computing likely already have computers (either present in the classroom or in separate computing labs), I expected that some of the study might focus on the incremental benefits that accrue to the students of a school in switching from a less-pervasive computing presence to a more-pervasive one. A part of such an analysis would almost surely focus on efficiencies afforded a student who no longer needs to reserve computer time or wait their turn to access a computer. Quite simply: How much more quickly would a student with their own notebook computer be able to accomplish certain basic computing tasks, as opposed to a student in a shared-computer situation? But once again, this study focused exclusively on questions of effectiveness of learning – certainly not a bad focus, but an incomplete focus, ignoring simple but potentially important questions of efficiency.

As mentioned above, the observations made here are based upon a very small sampling of educational computing research done over the course of my first three active weeks in the GWU ETL program. My intention is to use this focus paper as scaffolding for further investigations in the field of educational computing research, particularly focusing on the question of whether the exclusion of time-oriented productivity analysis is indeed widespread within the field, and whether any authorities in the field are in agreement with me in identifying this as a problem.


English, Lauren J. (2006, September). ACTIVote: Can It Activate Our Students to Learn? Action Research Exchange, 5(2). Retrieved February 4, 2009.

Labaree, David (1998, November). Educational Researchers: Living with a Lesser Form of Knowledge. Educational Researcher, 27(8). Retrieved January 22, 2009.

Lei, J., & Zhao, Y. (2008, January 1). One-to-One Computing: What Does It Bring to Schools? Journal of Educational Computing Research, 39(2), 97-122. (ERIC Document Reproduction Service No. EJ820002) Retrieved February 8, 2009, from ERIC database.

Newby, Timothy (2006). Educational Technology for Teaching and Learning. Upper Saddle River, NJ: Pearson Education, Inc.

Roblyer, M.D. (2006). Integrating Educational Technology in Teaching. Upper Saddle River, NJ: Pearson Education, Inc.

Woolfolk, Anita (2008). Educational Psychology. Boston: Pearson Education, Inc.

Thursday, April 9, 2009

A Brief Look at WCAG 2.0

Here is yet another paper that I just completed for my George Washington University coursework. This one gets into a topic that may, understandably, not be too thrilling for a lot of folks. But, as with some of my other postings, somebody out there on the great wide Web might find this to be of use.

* * * *

"The power of the Web is in its universality. Access by everyone regardless of disability is an essential aspect." – Tim Berners-Lee (Web Accessibility Initiative, 2009)

The World Wide Web Consortium (known by the acronym W3C) is the body that oversees the development and maintenance of the most vital standards and protocols upon which the Web runs, including things as fundamental and ubiquitous as HTML (Kyrnin, 2009). Not least among its tasks is the establishment of rules of the road for ensuring accessibility of the Web to all people, including those with disabilities.

All of its work in this regard is done within the Web Accessibility Initiative (WAI), a group co-sponsored by North American and European governmental bodies and by a number of prominent high-tech organizations (including Microsoft and IBM). Participants in the WAI include a broad range of commercial, governmental, and educational organizations. Importantly, the Protocol and Formats Working Group of the WAI interacts with all other bodies of the W3C, reviewing their protocols and standards with an eye toward accessibility issues (Brewer, 2004).

But the much more focused work of the WAI lies in its development of three explicit sets of standards relating to the accessibility of (1) web content, (2) authoring tools, and (3) web browsers (Brewer, 2004). Fulfilling the need for an update to web content standards, the Web Content Accessibility Guidelines 2.0 (known by the acronym WCAG 2.0) were formally published by the WAI just a few months ago, in December of 2008 (WCAG Overview, 2009). It was arguably about time for some new standards, since WCAG 1.0 had been in effect since May 1999, and was thus a fairly ancient document in Web terms (Brewer, 2004).

As one might expect of the updated standard, WCAG 2.0 was designed to be applicable to a broad range of new types of Web content and applications, many of which did not exist back in 1999. Perhaps just as vital to Web developers, it is also explicitly designed to make compliance with the standard much more easily testable via automated means. Looking at some of the details, there are twelve basic requirements that a WCAG 2.0 compliant web resource should meet. Among these are requirements that: (1) text-based substitutes should be offered for any substantive non-text items presented (e.g., a brief description should accompany an image); (2) functionality should be operable from the keyboard; (3) nothing should be displayed in a flashing manner that might cause seizures; (4) supplemental (simplified) content should be made available as an alternative to difficult-to-understand text; and (5) assistance should be provided in data entry and correction of errors. As can be seen from just this partial list, many of the WCAG 2.0 requirements could be beneficial not just to users with disabilities, but to all Web users. In fact, some of the additional requirements read like generic recommendations on good website design: “provide users enough time to read and use content”; “make Web pages appear and operate in predictable ways” (W3C, How to Meet WCAG 2.0, 2009). Who doesn’t want (and expect) a web site to “operate in predictable ways?”
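The automated-testability goal mentioned above can be illustrated with a toy check. The sketch below is my own illustration, not a real WCAG validator: it scans a snippet of HTML and flags images that lack the text alternative called for by the first requirement in the list.

```python
# Toy illustration of an automated WCAG-style check:
# flag <img> elements that lack a text alternative (an "alt" attribute).
from html.parser import HTMLParser

class MissingAltChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.missing = []  # src values of images without alt text

    def handle_starttag(self, tag, attrs):
        attributes = dict(attrs)
        if tag == "img" and "alt" not in attributes:
            self.missing.append(attributes.get("src", "(no src)"))

def images_missing_alt(html):
    """Return the src of every <img> in the given HTML that has no alt attribute."""
    checker = MissingAltChecker()
    checker.feed(html)
    return checker.missing
```

A real validator would check far more than this one rule, but even this sketch shows why machine-checkable criteria matter: a tool can sweep an entire site for violations that a human reviewer would have to hunt for page by page.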

While the WCAG 2.0 recommendations seem to make sense, it is worthwhile to see if there are any reasons to be critical of the new standards. It turns out that a number of fairly harsh critiques are readily accessible on the Web, but they are mostly written in rather technical terms, by Web developers for Web developers. Most of the critiques seem concerned that the details of WCAG 2.0 are difficult to understand and even more difficult to successfully implement. Joe Clark (2006), who seems to be a respected voice among WCAG 2.0 critics, sums it up this way: “In an effort to be all things to all web content, the fundamentals of WCAG 2 are nearly impossible for a working standards-compliant developer to understand.” Another critic, Vladimir Popov, ruefully laments that the length of the WCAG 2.0 documentation rivals that of Tolstoy’s War and Peace (Popov, 2006). Unfortunately, this is all too reminiscent of my past experiences working with other W3C standards, back when I was developing software products that made use of various parts of the XML, SOAP, and XSLT standards (all maintained by the W3C). In those days, I knew never to touch the lengthy and complex W3C specifications themselves, but instead to rely on websites and books that translated the standards into comprehensible and concise English. So, future implementations of WCAG 2.0 may depend upon a cadre of translators who can successfully give us the equivalent of “WCAG 2.0 for Dummies” books or websites (which will hopefully be WCAG 2.0 compliant).

But if WCAG 2.0 is too complex to implement, what alternatives are there for accomplishing some of its stated goals for improved accessibility? Vladimir Popov, the provider of the Tolstoy analogy, additionally offers some common sense advice on this point: It is much simpler to make significant improvements to a relatively small number of assistive tools and technologies that are in use by people with disabilities, rather than to make large-scale changes to “billions of existing web pages” (Popov, 2006). From a pragmatic, problem-solving point of view, Mr. Popov’s suggestion certainly sounds more doable; in fact, it honestly sounds like the only thing that is doable.

Brewer, J. et al (2004, June). Overview of the Web Accessibility Initiative. Retrieved from the World Wide Web Consortium website:

Clark, J. (2006). To Hell with WCAG2. Retrieved April 9, 2009 from

Kyrnin, J. (2009). What is the W3C? Retrieved April 9, 2009 from the website:

Lawson, B. (2006). WCAG 2.0: when I want a beer, don’t give me a shandy. Retrieved April 9, 2009 from

Popov, V. (2006). Can WCAG 2.0 be simpler? Retrieved April 9, 2009 from

Wikipedia website (2009, March 14). World Wide Web Consortium. Retrieved April 9, 2009 from

W3C website (2009, March). How to Meet WCAG 2.0. Retrieved April 9, 2009 from

W3C website (2009, March). Understanding WCAG 2.0. Retrieved April 9, 2009 from

W3C website (2009, March). Web Accessibility Initiative. Retrieved April 9, 2009 from

W3C website (2009, March). WCAG Overview. Retrieved April 9, 2009 from

Sunday, April 5, 2009

Elementary School Web Content Filtering on a Shoestring

Here is an annotated bibliography that I completed a few weeks ago for one of my George Washington University courses. This is the first annotated bibliography that I've ever written, and at this point I would have to say that I'm not a big fan of the genre. The requirement to come up first with a list of resources, and then try to build a nicely flowing narrative around that list seems very backward and unnatural to me. But, after I finished this annotated bibliography, I realized that there might be some information here that is useful to others, so I am going ahead and posting it...

Why this topic was chosen
This annotated bibliography serves to provide a research foundation for a real-world need that was recently expressed to me by some of the elementary teachers at a medium-sized international school located in northeastern China. They have only recently come to realize that their school has absolutely no content filtering in place for students who access the Internet via the school’s computers. After a number of incidents in which her students came upon objectionable material (particularly images) during web searches, one of the teachers sought confirmation from one of the school’s administrators that the school had no content filters in place. She was informed that no filtering solution had been acquired because such solutions cost several thousand dollars and were thus out of budgetary bounds. When the teachers subsequently approached me for advice, I promised them that I would educate myself in the overall topic of content filtering and see if I could find low- or no-cost content filtering solutions that might be appropriate for the school to consider.

This annotated bibliography begins by presenting resources that give an overview of the need for and the nature of content filtering, and then gradually narrows its focus to present resources pertinent to the needs of this particular elementary school, which has Windows-based desktop computers for students at an approximate 3 to 1 (student to computer) ratio.

An introduction to content filtering: Why it is needed and how it works
SonicWall, Inc. (2004). Whitepaper: Demystifying Internet Content Filtering for Businesses, Schools, and Libraries. Retrieved March 16, 2009, from

While there is no shortage of information on the Internet regarding content filtering, quite a bit of that information comes from corporations that are trying to sell proprietary filtering solutions. Although the white paper cited above does indeed come from such a vendor, it is the best overview of the topic that I have found. While the paper, published in 2004, is slightly dated, the overview it gives appears to still be fully pertinent and accurate in 2009.
The fourth page of the paper presents a concise overview of four issues confronting schools and libraries: (1) protecting children from inappropriate content and predators, (2) “keeping students focused” by limiting the distractions of non-educational websites, (3) dealing with issues of legal liability for potential illegal usage of Internet resources, and (4) maintaining compliance with the Children’s Internet Protection Act (CIPA) of 2001, which requires federally-funded schools to have Internet content filters in place.

The paper then briefly and clearly states how content-filtering works, starting by identifying the two main ways that such filtering is accomplished: (1) blocking of sites and (2) examination of content. The first option, blocking of sites, involves maintaining a list of sites known to have objectionable content and preventing computer users from accessing those sites. A stricter variation on website blockage is to maintain a list of approved sites and allow access only to those sites, with the entire remainder of the Internet being blocked. The second option calls for examination of all content coming from each Internet source, comparing it to a list of keywords indicative of objectionable content and blocking access to any resource that does not pass muster.
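To make the two mechanisms concrete, here is a minimal sketch in Python. The blocklist, keyword list, and function names are invented for illustration; real filtering products use far larger lists and much more sophisticated matching.

```python
# Illustrative sketch of the two filtering approaches described above.
# The site list and keyword list here are made-up examples.

BLOCKED_SITES = {"badsite.example", "casino.example"}
OBJECTIONABLE_KEYWORDS = {"gambling", "adult"}

def blocked_by_site_list(hostname):
    # Approach 1: compare the requested host against a list of known bad sites.
    return hostname.lower() in BLOCKED_SITES

def blocked_by_content_scan(page_text):
    # Approach 2: scan the delivered content for keywords that suggest
    # objectionable material.
    words = page_text.lower().split()
    return any(word in OBJECTIONABLE_KEYWORDS for word in words)

def allow_request(hostname, page_text):
    # A filter may combine both approaches, allowing a request only if
    # neither check trips.
    return not (blocked_by_site_list(hostname) or blocked_by_content_scan(page_text))
```

The stricter “approved sites only” variation mentioned above simply inverts the first check: allow the request only if the hostname appears on an allow-list.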

Finally, the whitepaper outlines several architectural options for content-filtering systems. It identifies the options as: (1) “client solutions” which are installed directly on a user’s computer, and (2) “standalone solutions” or “integrated solutions” which involve setup and maintenance of a centralized server through which all content requests are processed.

Wikipedia (2009). List of content-control software. Retrieved March 16, 2009, from

The SonicWall whitepaper (perhaps for competitive reasons) left out one major alternative architectural approach for a content-filtering solution: that of a “web-based service”. Wikipedia includes this category on its “List of content-control software” web-page. Such an architecture (an example of off-site “cloud computing”) takes advantage of remote servers to process all content-requests, minimizing or eliminating the installation and maintenance of hardware and software at the school itself.

The drawbacks of content filtering
Villano, M. (2008, May 1). What Are We Protecting Them From? T.H.E. Journal, 35(5), 48-54. (ERIC Document Reproduction Service No. EJ797266) Retrieved March 17, 2009, from ERIC database. Also available at:

The overall attitude expressed in this recently published article stands in counterpoint to that of my teacher clients. The author takes the stand that Internet access restrictions can inadvertently “keep vital educational technology out of the classroom”, while the teachers at my school merely want to keep things like pictures of obese people in their underwear out of the classroom. Actually, however, both parties are in agreement that some filtering is appropriate, if only to keep students out of things like pornography and gambling sites.

But the author does bring up what seems a vital point to consider when implementing content filtering: that an overly strict filter might prevent students and teachers from accessing perfectly non-objectionable information that is quite pertinent to their studies. Also mentioned is the fact that sufficiently savvy students can bypass a school’s content filters by utilizing proxy servers, which fool site-blocking filters into thinking that a non-objectionable site is being visited. That kind of knowledge tends to be potently viral: once one student learns how to bypass a filter, in a short time all the students in a school will know.

The article takes a more pragmatic stance when it takes up the subject of blocking access to social networking sites. While it cites the need to avoid the comprehensive blockage of blogs (an increasingly useful educational tool), it acknowledges that events like suicides of children as a result of cyber-bullying provoke understandably strong reactions on the part of authorities, such as shutting down access to social networking sites like MySpace. The timeliness of this article (which is less than a year old) is apparent in its treatment of this very current topic.
The author concludes by proposing that a much better protection is afforded to students when they are educated in safe usage of the Internet, as opposed to when their access to certain Internet resources is simply blocked. He cites schools in Finland as an example of places where no filters are in place, but where students are effectively educated from a very young age in Internet safety.

Case-study of an open-source approach to content filtering
Reddick, T. (2004, April 1). Building and Running a Collaborative Internet Filter Is Akin to a Kansas Barn Raising. Computers in Libraries, 24(4), 10-14. (ERIC Document Reproduction Service No. EJ750477) Retrieved March 17, 2009, from ERIC database. Also available at:

Leaving behind the pro and con arguments regarding content filtering, this article recounts the rather large-scale effort undertaken to bring all of the libraries in Kansas into compliance with the Children’s Internet Protection Act (CIPA), federal legislation enacted in 2001 which requires content filtering in schools and other public institutions that receive a certain kind of widely-distributed federal funding. While the size and scope of the Kansas challenge in no way resembles that of my medium-sized school, there is one aspect that the two situations have in common: the Kansas librarians also had minimal financial resources to throw at the problem. This led the Kansas team to adopt an open-source product called squidGuard, which would be housed on a new server to be purchased.

The squidGuard solution involves blocking and unblocking of specific sites. The Kansas team decided to start with the default blockage list offered by the squidGuard providers, but formed a committee of librarians familiar with CIPA to deal with incoming requests for additional blocking or unblocking of sites.

The chief usefulness of this article for my purposes is that it highlighted how labor-intensive the process of installing and configuring a solution like squidGuard can be. An article like this one, which uses the analogy of a “barn raising” in describing the collaborative implementation of squidGuard, makes it clear that this solution is far from “plug and play”. It also makes it clear that, although an option like squidGuard involves no license fees for the open-source software, the overall solution could come to be relatively expensive in terms of devoted hardware and technically-skilled labor costs.

A no-cost “cloud based” option for content filtering
Colburn, K. (2009, March 5). What is OpenDNS? Retrieved March 17, 2009, from
OpenDNS (2009). Web Content Filtering. Retrieved March 17, 2009, from

Steering away from complex solutions like squidGuard, my search for an easy to maintain and low-cost content-filtering solution led me through several hours of hit-and-miss web-searching. Finally, one of the places that Wikipedia’s “List of content-control software” (referenced above) led me to was the OpenDNS website. Intrigued by the “cloud computing” approach of OpenDNS, and even more intrigued by the fact that the service is offered at no cost, I sought out a nice overview of the OpenDNS service, and I found it in the first web-page cited above.

The article begins with a brief explanation of how the Domain Name System (DNS) of the Internet works to provide your browser with the “real” (numeric) IP address of a web resource after you type in a character-based URL (like “”). The author then goes on to explain that using the OpenDNS service requires (1) making sure that your machine (or central router) is configured to utilize the DNS server of the OpenDNS service and (2) configuring your (or your institution’s) OpenDNS account by logging onto the OpenDNS website to submit and maintain a list of websites to be blocked when the DNS server is accessed from your machine or site.
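Conceptually, a filtering DNS service works by answering lookups for blocked domains with the address of a block page rather than the site’s real address. The sketch below is a toy model of that idea only; the domain names and IP addresses are invented (drawn from the reserved documentation ranges), and this is not how OpenDNS is actually implemented internally.

```python
# Toy model of a filtering DNS service: lookups for blocked domains are
# answered with a block-page address instead of the site's real address.
# All names and addresses here are invented for illustration.

BLOCK_PAGE_IP = "203.0.113.1"  # hypothetical "this site is blocked" page
REAL_ADDRESSES = {
    "school.example": "192.0.2.10",
    "videos.example": "192.0.2.20",
}

def filtered_lookup(domain, blocked_domains):
    """Return the IP a filtering resolver would hand back to the browser."""
    if domain in blocked_domains:
        return BLOCK_PAGE_IP  # browser is steered to the block page
    return REAL_ADDRESSES.get(domain, "0.0.0.0")  # normal resolution
```

Because the filtering decision happens at name-resolution time on a remote server, nothing beyond the DNS setting needs to be installed or maintained on the school’s own machines.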

As the OpenDNS site explains in more detail (at the second web-page cited above), the user can also select from a preset list of over 50 categories of content to be blocked. Of particular importance to an elementary school might be categories like “pornography” and “adult themes”. While I would need to personally experiment with the OpenDNS service before formally recommending it to anyone, at this point it seems to meet the cost requirement (being free of charge) and the low-complexity requirement. It presumably calls only for the school’s network administrator to point the school’s routing system(s) at the DNS server of OpenDNS, and then allows an appropriate non-technical administrator to maintain the school’s OpenDNS filtration settings.

A no-added-cost option for Windows Vista users
Microsoft (2009, January 13). Use Windows Vista’s Parental Controls to keep your kids safe. Retrieved on March 16, 2009, from

With the introduction of Windows Vista Parental Controls, Microsoft continues its long tradition of bundling more and more functionality into its operating systems and putting third-party providers out of business. In this particular case, I am not complaining, because if a more comprehensive solution like OpenDNS does not prove feasible for my cash-strapped school, then the new Vista feature might fulfill the needs of the school for content filtering. It just so happens that the teachers received an announcement recently that the elementary school’s classroom computers will soon be upgraded from Windows XP to Vista.

As the name implies, and as the web-page cited above confirms, the Parental Controls functionality bundled into Windows Vista is intended to allow parents to configure a single computer with their content-filtering preferences. Thus, this option is not optimal for the school, in that it would require IT support personnel with administrative authority to configure each machine separately. This option does, however, have the benefit of entailing no extra software or hardware costs.

The webpage cited above provides a very good pictorial overview of the straightforward setup and maintenance of Parental Controls. This visual overview was particularly useful to me, since my machine is running the “Business” edition of Vista, which does not include Parental Controls. The bundled software offers the administrator the authority to set “allow/block” website filtration options, and it also allows blockage of access to programs locally installed on the computer. In the school’s case, it would likely utilize the “block website” functionality, requiring the manual maintenance of a “blocked websites” list on each machine. While the miracle of “copy and paste” simplifies this potentially onerous task, it could still take several hours of technical staff time to accomplish the configuration of all of the school’s computers.

Instant blocking of specific websites on any Windows XP or Vista machine
Laurie, V. (2008, January 17). Using the Windows Hosts File. Retrieved on March 19, 2009, from

Is there any option available to the teacher who wants to shut down access to a few objectionable sites, but does not want to wait for school-wide policies and potential investments or upgrades to be decided upon and implemented? Fortunately, any Windows machine can be configured by an administrator within a minute or two to block access to specific sites. The web-based reference listed above, “Using the Windows Hosts File”, provides very clear and simple instructions. To block access to a given website, a user with administrative privileges must simply edit the “hosts” file on the Windows machine and add the following two lines:
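As an illustration (using youtube.com, the site mentioned in the next paragraph; the exact entries depend on the domain being blocked), the two lines map both forms of the site’s hostname to the local machine, so requests never reach the real site:

```
127.0.0.1    youtube.com
127.0.0.1    www.youtube.com
```

On Windows XP and Vista the hosts file is located at C:\Windows\System32\drivers\etc\hosts.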

In tests I did on one of my Windows machines to block access to YouTube, a reboot was not even required for the blockage to instantly go into effect for all browsers on the machine.

The references cited above offer proof that low-cost or no-cost options are indeed available to a school that has a need to implement web content-filtering within the constraints of a very limited budget.

Saturday, April 4, 2009

A Very Brief Look at Netbooks and Their Potential Place in K-12 Education

Here's a quick overview of netbooks that I wrote up last week for one of my GWU courses:

What is a netbook?
The sudden popularity of netbooks (lightweight, low-processing-power, low-storage-capacity, and LOW PRICED portable computers) is something of a surprise to those of us who have watched the notebook computer market over the last decade, as the trend toward ever more high-powered and high-capacity machines seemed inexorable. Instead, the new inexorable trend seems to be the demand for these little netbooks, with most major computer manufacturers getting into the game of providing their versions of these gadgets, many for under $500. And the price tag can drop to as little as $200 (update, April 2, 2009: the NY Times now reports netbook prices below $100) if you don’t mind running Linux on your netbook instead of Windows.

The chief rationale for the average person to use a netbook is that a more powerfully equipped computer is simply not needed for things like e-mailing and web-surfing, which account for the majority of a lot of people’s computing activities. With web-based services like Google Apps now offering remotely-based versions of traditional “office” applications (like word processing and spreadsheets), there is even greater incentive for a user to dump a heavier notebook computer and instead tote along a small, lightweight netbook, which needs to be equipped with little more than a good web browser to do its job (Wikipedia, 2009; Ars Technica, 2009).

Given that much of the students’ computing activities at my K-12 international school take place solely within the browsers of the traditional Dell desktop computers in the computer labs, I am intrigued by the possibility that these cumbersome and expensive machines might be replaced with a fleet of inexpensive netbooks on carts. This tantalizing prospect is what led me to this week’s research topic.

The current push for netbooks in education
The biggest noise on the Internet regarding prospects for netbooks in schools is currently coming from vendors, each vying with the others to gain a toehold in what is assumed will be a rapidly expanding market. Education Week reported last May that a number of computer makers have initiated vigorous marketing campaigns targeted at primary and secondary schools. One of the vendors, Hewlett Packard, claimed to have involved educators in the design of one of its netbook offerings, resulting in enhanced multi-media features and some beefing up to make it better able to withstand rough handling (Trotter, 2008).

Speaking of design features, blogger Christopher Dawson makes the interesting point that the slightly smaller keyboards of most netbooks, considered by some to be a problematic trait, are actually “perfectly appropriate for little hands” (Dawson, 2008). Dawson is a self-professed fan of MacBooks, but he pragmatically points out that he could purchase 75 netbooks for the price of 30 MacBooks, making it tougher to justify keeping the Macs around. In this time of tight budgets, we can bet that netbooks will be getting serious consideration from primary and secondary schools in their upcoming rounds of computer acquisitions.

With that said, hardware acquisition decisions cannot be made in a vacuum, and must take into consideration the existing curriculum, which may currently be tied to locally installed software that cannot run on a new netbook (particularly one equipped with a potentially incompatible operating system like Linux, or with a slightly retooled version of Google’s new Android operating system). Bringing in new netbooks with a different operating system might also require some retraining of faculty and staff. But for a school that does the proper planning upfront, netbooks look like they could prove to be a very worthwhile investment.

Ars Technica website (2009, January 11). Five reasons to seriously consider buying a netbook. Retrieved March 29, 2009 from:

Dawson, Christopher (2008, December 7). I can buy a lot of netbooks for $30k. Retrieved March 29, 2009 from:

Dawson, Christopher (2009, March 18). Realistic netbook expectations. Retrieved March 29, 2009 from:

Lenovo website (2008, October 28). Lenovo Brings “e” Education Netbook PC to the Classroom and Campus. Retrieved March 29, 2009 from:

School Buyers Online website (2009, February 10). Schools Embrace Acer’s K-12 Seed Unit Program. Retrieved March 29, 2009 from:

Trotter, Andrew (2008, April 23). Companies Targeting Low-Cost ‘Netbooks’ Directly at Education. Retrieved March 29, 2009 from Education Week website:

Wikipedia website (2009). Netbook. Retrieved March 29, 2009 from: