Over at Silicon Valley Watcher, Tom Foremski wonders if search is damaged in ways we rarely recognise.
With all the regular trumpeting of complex search algorithms, he asks, how come the engines still need people to do the first layer of interpretation? Google's mathematics may filter information, but he points to tagging, nofollow, robots.txt, pings, blogs and linking as examples of people creating the content and metadata that search engines rely on, rather than it happening the other way around.
If the search engines are so great at doing what they do, then how come we have to do all of the above?
I resent the fact that I have to create all this content describing my content; the search engines should be creating this "metadata."
I just want to write stuff, and leave it up to the search engines to find it, classify it, index it, and do all the other things their mythology suggests that they do.
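The robots.txt file he mentions is a neat illustration of the inversion: before a crawler ever applies its clever algorithms to a page, it reads a plain-text file a human wrote telling it where it may go. Here's a minimal sketch in Python using the standard library's urllib.robotparser; the URLs and user agent are just illustrative:

```python
# Before any "intelligent" ranking happens, a crawler defers to
# human-authored crawl rules published at /robots.txt.
from urllib.robotparser import RobotFileParser

parser = RobotFileParser("https://example.com/robots.txt")
parser.read()  # fetch and parse the human-written rules

# The engine's first act of interpretation is asking our permission:
if parser.can_fetch("Googlebot", "https://example.com/private/notes.html"):
    print("Crawler may fetch the page")
else:
    print("The human-written metadata says: stay out")
```

The machine isn't deciding anything here; it's obeying instructions we wrote for it, which is exactly Foremski's point.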
Really, then, he's not arguing that search is broken, but that it never worked the way we wanted it to in the first place.
I've often pondered why we have to spend so much time telling machines what we're doing before they can decide what to do with it. Mostly I think about it because I'm lazy: I like creating, but I don't like organising and categorising the things I create. So that's what I want my software to do: step in and understand everything I implicitly recognise, without me having to underline it.
Essentially, though, this is about the next wave of artificial intelligence: understanding what we want, not just interpreting what we explicitly mark up.