We don't know what we don't know we don't know
I've managed to annoy a few folk with a post on filtering, more due to the tone of the argument, I hope, than its content. Mea culpa - too much caffeine, not enough thought before pressing publish? I was off the mark with tone, but I hope the message of the post is not lost on those reading it.
Update: Doug makes a convincing argument that there is a certain expediency in even having whitelisting.
Update II: AB sees the real argument being about mobile internet and (the lack of potential in) filtering.
While my choice of language was wrong, I can't let the notion of whitelisting sites escape me. I feel the basic argument stands: whitelisting means little to nothing for those who can't access the thing that's blocked in the first place.
In an abundant world, we don't have time to unfilter
Jim points out what I half-guessed would be the case: that schools in Highland, as in many other Local Authorities throughout the UK, can ask for sites to be unfiltered, or whitelisted, by Websense. This sounds great: a devolved system designed to give the teachers what they want. The problem lies here: you don't know what you don't know you don't know. Namely: if a teacher is to ask for a site to be whitelisted they have to have been able to see it in the first place. Even the slightest barrier to entry - coming up against a filter - is enough for that website to be forgotten in the click of a 'Back' arrow, and onto the next site.
The expectation that we must learn out of school
So, Local Authorities around the world rely on teachers (and presumably students) to research material in their own time, at home, on a personally paid-for internet connection, instead of being able to spontaneously access material in school. Most do this without thinking twice, though whether they should have to do this exclusively at home is another question. These teachers then face a lead time before someone in the IT department whitelists, or unblocks, that page; no matter how quick an institutional IT department is, it will always be slower than a Google result appearing. (Note that I don't even consider students calling IT to have something unblocked.)
How can you know you don't want something you've never tasted?
I reckon that whitelisting should ideally be a shared responsibility. There are sites which teachers will know they want whitelisted, and can ask for well in advance. I've seen whole schools set out at the beginning of the year the main sites that they will require to get through their work, based on the blocked frustrations of the previous year. Highland and others cater admirably for this group. There are many sites, or individual web pages, though, that teachers and students will not be able to ask about, since they cannot view them in the first place, whether they arrive by random search or by recommendation. That, or the page is needed now, for the essay due next week, or for the 30 kids sitting in my room this minute.
So, as well as whitelisting by school, which is quite common, there might also be a need for foresight from the superb ICT Teams around the country, who will know where to look for, and how to unlock, the sites within those categories which, frankly, are still far too broad to carry any meaning in and of themselves. East Lothian did this very successfully two years ago, as David explains in his comment on Alan's blog.
But here's the crunch: Local Authorities do share the task of whitelisting, but, for whatever reasons, the same genres of sites, the same distinctive blogs, Flickr accounts and social platforms are blocked. The only reason I can come up with (since Local Authorities are clearly not Machiavellian) is time. Or the lack of it.
What's the answer(s)?
I don't believe this kind of joint whitelisting is a huge task, particularly if LAs could unite on the task and share the labour. I dare say there would be enough bloggers willing to offer their ideas based on what's currently blocked, but the desire to harness this would have to be seen to come from Local Authorities, or otherwise seem a fruitless task to the teachers involved.
Whatever Local Authorities choose to do, there is a clear need to 'do something'.
- In East Lothian, the answer was to own the platform, which led to eduBuzz, and to make the filtering software see the difference between blogs, Flickr and Blip.TV on the one hand, and Bebo and Facebook on the other. Several other Authorities have followed suit successfully.
- In others, the answer may be more frequent whitelisting, though this tiresome task will become more of a drain on resources.
- Elsewhere, such as in the schools I saw in New Zealand, the politik might be to filter after the fact, and use the Acceptable Use Policy for what it was designed: to pull up those who abuse the freedom the net (should) offer.
If you are a Local Authority IT manager, or if you have some ideas about where whitelisting and blacklisting should sit, then please do join the discussion.
It might be worth sharing my experience as an ICT co-ordinator in a Manchester primary school. A few years back we subscribed to a private broadband provider (Inty) because the LA hadn't got their act together at the time. The filters were fairly liberal, but user control was excellent - we could block any site and ban any user very quickly. To back this up we used a piece of software called Policy Master, which monitored every web page pulled up, every email sent and every keystroke on every workstation. We knew, and had evidence of, every dodgy search term entered into the system, and used the Acceptable Use Policy to inform parents whenever we felt there was a deliberate attempt to access inappropriate material.
The crucial fact was that the children knew that their web activity was monitored, and as a result they did not attempt to abuse the system. In 3 years of daily monitoring I can only recall one instance of a deliberate attempt to access porn. (Note: the monitoring was shared by three trained individuals and took, on average, less than five minutes per day. Overwhelmingly the policy violations were accidental - a dodgy word on a page of a legitimate set of search results, etc. And we recorded any deliberate violations in a log with the follow-up action noted.)
Most primary schools have no idea what children type into web searches (let's face it, we all looked up naughty words in the dictionary in our day); they don't monitor web activity at even a basic level; and they couldn't nail the culprit even if they did, as most still don't use secure individual logins.
The unfortunate truth that you've mentioned above is that schools believe they don't have the time to monitor internet use effectively, and prefer to let the LA take responsibility in case they let something slip through. The experience I had in Manchester showed convincingly that if you put adequate monitoring systems in place then, not only can you filter after the fact (most of the time "silently" - the kids could never understand why their favourite violent games sites kept getting blocked), but you could also implement an effective AUP, and you could let teachers use the web for its intended purpose.
Just one word of caution - our monitoring systems showed far more violations by staff than by children: accessing flight booking sites during lessons, use of dating agencies and personal email, etc. (it was before the time of Facebook). Cyberslacking is, I believe, the correct term. Make sure that the staff know the AUP too!
Posted by: John Sutton | May 04, 2008 at 10:37 PM
@John: Bet they weren't in the same league as this guy...
Posted by: David Gilmour | May 04, 2008 at 11:23 PM
I have probably made this worse rather than better. We have to move on.
I have placed my views on my blog rather than clutter up these comments.
Posted by: Doug Dickinson | May 05, 2008 at 09:38 AM