The cold hard truth is, if kids want to find it, they’ll find a way to find it. They can’t be completely sheltered without locking them in their rooms, which probably does more damage than letting them see what they wanted to see in the first place.
It’s not really the search engine’s responsibility to block it, nor the government’s to regulate it or outlaw it, but maybe it’s not too much to expect that I’m not BS’ed about it.
Vint Cerf’s post at the Google Blog smells of public relations. Nothing against Mr. Cerf; he helped develop the greatest human invention since the printing press. But his post, “Google’s (and parents’) role in keeping kids safe online,” just plain stinks.
It’s the same stink that emanates from unconstitutional legislation proposed in the name of protecting children – because nobody’ll vote against protecting the kids.
The post is an affirmation of everything Google’s done in this regard, including (laudable) partnerships with Common Sense Media, i-Safe, iKeepSafe, NetFamilyNews, the Family Online Safety Institute, and the National Center for Missing and Exploited Children.
That alone would have made me pat them on the back and move on, deferentially ignoring that it smelled of PR, or maybe even an attempt to bolster a case against some kind of impending regulation or controversy.
But it was these two paragraphs that set my BS detectors on alert:
“We’ve invested in developing family safety tools that empower parents to limit what online content their children can discover. Our SafeSearch filter, which users can adjust to block explicit content from their search results, is an example of this type of technology.
On YouTube, where we host user-generated content, we aim to offer a community for free expression that is suitable for children and protects them from exploitation. Our work to keep YouTube safe for children includes clear policies about what is and is not acceptable on the site; robust mechanisms to enforce these policies, such as easy tools for users to police the content by flagging inappropriate videos; innovative product features that enable safe behavior; and YouTube safety tips.”
And this is where I ask, “Who’s he kidding?” A chimp could turn off the SafeSearch filter and search at will (if he could spell). And on YouTube, it’s not much different. Anything approaching inappropriate is at most protected by an age confirmation button and a warning that the video may not be suitable for minors.
Are you over 18? A simple “yes I am” is all that’s needed. An honor system for anonymous young hornballs. And their mechanisms may be “robust,” but they’re not always speedy. It’s unclear whether the hardcore porn I found without looking was taken down because I reported it here, or whether other users or YouTube found it first.
Really, I’m okay with the SafeSearch filter. I think it’s a good thing. And I think it’s good YouTube at least makes an effort to police inappropriate content. But pretending that Google’s “robust mechanisms” are airtight and that kids don’t circumvent its filters on a regular basis isn’t just silly, it’s insulting.