Thursday, September 19, 2024

Web Founder Still Creatively Vague

The World Wide Web turned 20 years old last Friday, and its creator, Tim Berners-Lee, says its potential has barely been tapped. His next vision, one he's been talking about for years, is the Semantic Web, which on the surface seems about as easy as herding cats. But don't let the specifics bog down a perfectly good concept with just the right amount of vagueness to drive it forward.

This Reuters report on Berners-Lee's big plans isn't notable for how it defines the Semantic Web. In fact, it is exceedingly vague, to the point that it seems rather obvious the writers wrote down, as best they could, the simplest version of what Berners-Lee told them. There's no shame in that. Berners-Lee called me on the carpet a couple of years ago because I didn't quite get it, either. That's why he's at MIT and I'm not.

[Image: Tim Berners-Lee]

The Reuters writers focused instead on things more tangible to their readers, such as his warnings about governments and corporations snooping on Web users by building individual profiles from the data those users supply. But this was my favorite part:

When Berners-Lee wrote his proposal in March 1989, his boss at CERN, the world’s biggest particle physics laboratory, scrawled “vague, but exciting” on the memo.

I found that interesting in light of a couple of things. Recently, futurist and cyberpunk author Bruce Sterling gave a controversial speech in New Zealand about the future of the Web. In it, he delivered what appeared to be a blistering criticism of Web 2.0, Tim O'Reilly's term, which gave rise to many of the applications we see today: user-generated content, AJAX, social media, on and on.

At Web 2.0’s inception, critics jumped on it as an ill-defined marketing buzzword and claimed it wasn’t really different from Berners-Lee’s Web 1.0; it just looked different. Sterling summed it up a few weeks ago by saying Web 2.0 was made of “useful, sound ideas that were creatively vague.” He followed that up with much vaguer tales of the “Transition Web” and turtles upon turtles upon turtles.

Vagueness, it seems, is incredibly useful. One needs specifics when trying to sell something, especially in a corporate environment. Try selling an idea to your boss without data to back it up. But the beauty of the Web is that it is created by a kind of collective intelligence that can run merely on a concept, with definitions to come later, or perhaps never. In the meantime, look at all the cool, useful stuff that got made.

Another example of a creatively vague concept: Twitter. At this very moment the general public is asking the same questions we on the cutting edge were asking when it debuted two years ago: What's the point? Why would anybody use this? Why does anybody care? On the surface it sounds stupid. People send 140-character updates about what they're having for breakfast to a bunch of other people who apparently give a crap.

But then it became a useful tool for journalism, for Congress, for celebrities, for marketers, and, very recently and very suddenly, for real-time search, and now Google is showing a twinge of concern.

It turns out that creatively vague is very powerful, even world-changing. I think that's because concepts can't be weighed down by justification. Instead the collective interprets what they could mean, and radical innovation ensues.

So what did Sterling mean by a "Transition Web"? Hard to say exactly, but on the surface a culture-based, unmonetizable Web free of business models (Sterling calls Web 2.0 itself a business model) seems much more utopian than Berners-Lee's cat-herding Semantic Web.

[Image: Thomas D. Wason]

Defining the Semantic Web is difficult. You can run a define: query in Google and it will bring back several definitions that sound vaguely similar but offer no simple explanation—except for one, another favorite: the gleam in Tim Berners-Lee's eye for a unified Web without metadata. Thanks for that, Thomas D. Wason, PhD.

[Image: Google define: results]

Berners-Lee told me I had it backwards in 2007; I had based my understanding on a common misconception, one that led a Semantic Web developer from Berkeley to declare it dead on arrival. The Semantic Web, said Berners-Lee, wasn't so much about getting humans to adhere to a common language (fat chance!) when identifying data so that machines could better understand it. It was more about getting machines to understand data in more human ways (um, I think), and then integrating that data in a way that is intuitively accessible and contextual.

Here’s how he defined it himself:

“The semantic web is about data integration. Most of the data is in existing databases. Much of it is currently exported in HTML and can be easily exported also in RDF using a tool like D2R Server. Data comes from many sources. Calendars. Scientific measurements. Applications such as calendars, financial programs, and so on.

“Yes, it is possible to write data into online media, but that (a) is very effort-intensive and (b) only covers a fraction of all the things data is about. I’m not holding my breath for that.”
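To make the data-integration idea a little more concrete, here is a minimal sketch in Python using the rdflib library (my choice for illustration; Berners-Lee mentions RDF and D2R Server, not rdflib, and the event URIs and ex: vocabulary below are invented). Two hypothetical exports, one from a calendar and one from a set of scientific measurements, describe the same event; once both are loaded into a single RDF graph, one query can span data that started out in separate databases.

    # Toy illustration of "data integration" in the Semantic Web sense:
    # two independent sources describe the same thing using shared URIs,
    # and once both are loaded into one RDF graph a single query spans them.
    # (Hypothetical vocabulary; in practice a tool like D2R Server would
    # expose an existing database as RDF.)
    from rdflib import Graph

    # Source 1: a calendar export
    calendar_data = """
    @prefix ex: <http://example.org/vocab#> .
    <http://example.org/event/42> ex:title "Lab review meeting" ;
                                  ex:date  "2009-03-20" .
    """

    # Source 2: a scientific-measurements export referring to the same event URI
    measurements_data = """
    @prefix ex: <http://example.org/vocab#> .
    <http://example.org/event/42> ex:recordedTemperatureC 21.5 .
    """

    g = Graph()
    g.parse(data=calendar_data, format="turtle")
    g.parse(data=measurements_data, format="turtle")

    # One SPARQL query now sees both sources at once
    results = g.query("""
        PREFIX ex: <http://example.org/vocab#>
        SELECT ?title ?date ?temp WHERE {
            ?event ex:title ?title ;
                   ex:date  ?date ;
                   ex:recordedTemperatureC ?temp .
        }
    """)
    for title, date, temp in results:
        print(f"{title} on {date}: {temp} °C")

Running it prints the meeting title, date, and temperature together, even though no single source contained all three facts.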

And that means? I don't know, but my guess is we'll know it when we see it, and so long as enough vagueness remains, innovation could be limitless. All I really know is that if something the creator of the World Wide Web is doing sounds "vague, but exciting," we should pay attention.

 
