How Do We Protect Our Students From Online Hate?
June 4, 2019
By Charles Ronco
During my planning period one Friday in March, I checked the news on my school computer and saw a headline about the New Zealand shooter flashing a hand signal in court. This hand signal purportedly was a symbol of white supremacy, and since I’ve been trained by my school resource officer (SRO) to be aware of gang iconography and the subtle clues indicating membership in or support of criminal groups, I was curious.
I visited Google Images and typed in some search terms in an attempt to find the hand signal, so I could identify it should I see it. Clicking on one image sent me, disturbingly, to a neo-Nazi, white supremacist, Holocaust-denial website called “The Daily Stormer.” The article I landed on discussed how the New Zealand shooter could have “increased his score.”
I am never shocked by the depravity on the internet. There will always be those who peddle hate, lies, and filth. But I was looking at this website on my school computer.
Still curious, I went to several other hate group websites and found all of them freely available to students. I grew ever more concerned as I saw articles on some of these pages about the “golden age of Adolf Hitler” and the “Jew media.” Innocent young people can pull up this filth in our schools.
I contacted the chair of my school board and asked that he instruct the Information Technology Department to filter these websites, and he assured me that he would do so. I was also given the opportunity to submit a list of websites that should be blocked, but although I know a few, compiling a comprehensive list would be a Herculean task. Automated bots already scour the internet to categorize and filter out unwanted websites, and I realized that those bots also need to be tasked with filtering out hate group sites.
A new business item passed at this year’s Representative Assembly will inform other state affiliates and VEA’s locals that these websites may be available to our students. While that’s a step in the right direction, I still didn’t feel satisfied when I got home.
Additional research led me to the Children’s Internet Protection Act (CIPA). Passed in 2000, it makes filtering harmful online content a condition of certain federal funding for school systems and libraries. The law stipulates that obscene material (as defined by Miller v. California, 1973), child pornography, and “material harmful to minors” be filtered.
That last term is defined within CIPA as: “Any picture, image, graphic image file, or other visual depiction that – (i) taken as a whole and with respect to minors, appeals to a prurient interest in nudity, sex, or excretion; (ii) depicts, describes, or represents, in a patently offensive way with respect to what is suitable for minors, an actual or simulated sexual act or sexual contact, actual or simulated normal or perverted sexual acts, or a lewd exhibition of the genitals; and (iii) taken as a whole, lacks serious literary, artistic, political, or scientific value as to minors.”
Notably absent are hate speech, hate groups, and Holocaust denial.
As CIPA is a federal law, I contacted the offices of my congressman, Gerry Connolly, and Senator Mark Warner and told them about the issue. This needs to be addressed at the federal level, and it needs support from educators to make sure that something is done quickly.
I’m looking forward to adding this to the NEA’s legislative agenda and putting the full force of our Union behind the online protection of our students.
Ronco, a member of the Prince William Education Association, is a math teacher at Stonewall Jackson High School.
Virginia has seen the problems inherent in school internet access coming for a couple of decades: In 2000, the General Assembly passed a law requiring school divisions to develop acceptable use policies, including internet guidelines for students and teachers. The following year, state and federal laws allowed schools to install filtering software to protect against students accessing inappropriate and potentially harmful material.
But, as any educator can tell you, nothing is foolproof and young people can be very, very resourceful.
That’s why, even with filters in place, monitoring your students’ computer use remains essential. As the Virginia Department of Education puts it, teachers and library media specialists must watch students like they would on a field trip. Computer labs can be set up to make supervision easier.
Some other bits of advice from VDOE:
While there are instances in which the line between real research and questionable online activity can get a bit blurry, most educators know that letting young people have unfettered internet access is an almost-certain recipe for trouble.
“Good content filtering protects kids,” says Paul Kirill, VEA’s Director of Technology and Data, “and should not be difficult to set up. Every school should have it.”
Educators should be familiar with their school division’s filter, Kirill adds, both to protect students and themselves and to help in planning assignments. “Teachers need to be well-informed about what should and shouldn’t get through,” he says. “There should also be a process for when an educator learns of an objectionable site that’s gotten past the student filter. Who do you send that information to? Who then says, ‘You’re right, this should be blocked’ and directs it to happen?”
Groups are another easy-to-create component of a decent filtering system, Kirill notes. Teachers should be able to access information unavailable to students, such as hate group sites, if they’re going to be teaching about such groups or other controversial topics.
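To illustrate the kind of group-based rule Kirill describes, here is a minimal sketch in Python of how a filter might decide to block a site for a student account while allowing it for a staff account. The category labels, group names, and example domains are hypothetical; real filtering products rely on vendor-maintained site categorization databases rather than hand-built lists like these.

```python
# Hypothetical sketch of a group-aware content-filter decision.
# Categories, groups, and domains below are illustrative only.

# Example category lookup: domain -> category label
SITE_CATEGORIES = {
    "example-hate-site.test": "hate_group",
    "example-news-site.test": "news",
}

# Categories blocked for each user group. Staff researching extremism
# for a lesson may be allowed through where students are not.
BLOCKED_FOR_GROUP = {
    "student": {"hate_group", "adult_content", "gambling"},
    "staff": {"adult_content"},
}

def is_blocked(domain: str, group: str) -> bool:
    """Return True if the filter should block this domain for this group."""
    category = SITE_CATEGORIES.get(domain, "uncategorized")
    return category in BLOCKED_FOR_GROUP.get(group, set())

if __name__ == "__main__":
    print(is_blocked("example-hate-site.test", "student"))  # True: blocked for students
    print(is_blocked("example-hate-site.test", "staff"))    # False: allowed for staff
```

The point of the design is simply that the block decision depends on both the site’s category and the requester’s group, so a teacher’s legitimate research on a controversial topic isn’t collateral damage of the protections set for students.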