The Future of Search is Social

Over the past ten years we have witnessed an evolution in web search. The first-generation search engines like AltaVista, Excite, and Yahoo all indexed the web and gave back results primarily based on the words that were on a web page. If you searched for “lemurs”, these engines would look for pages that had the word “lemur” on them, and return those to you.
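To make that concrete, here is a minimal sketch of how a keyword-only engine might rank pages: score each page purely by how often the query terms appear in its text. The pages, URLs, and query below are invented for illustration; the real engines of that era were far more elaborate, but the basic signal was the same.

```python
import re
from collections import Counter

# Toy "index": two made-up pages about lemurs.
pages = {
    "lemur-facts.example.com": (
        "Lemurs are primates found only in Madagascar. "
        "Most lemur species are endangered."
    ),
    "discountlemurs.example.com": (
        "Lemur lemur lemur! Buy a lemur today. "
        "Lemur sale! Cheap lemur deals on every lemur."
    ),
}

def tokenize(text):
    """Lowercase the text and split it into plain words."""
    return re.findall(r"[a-z]+", text.lower())

def keyword_score(text, query):
    """First-generation ranking: count how often the query terms appear."""
    counts = Counter(tokenize(text))
    return sum(counts[term] for term in tokenize(query))

for url, text in pages.items():
    print(url, keyword_score(text, "lemur"))
```

Notice that the page stuffed with the word “lemur” wins easily, which is exactly the weakness described next.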

This was all well and good until the spammers came. It wasn’t long before the spammers figured out that if they stuffed a page full of the phrase “lemur”, the search engines would send people who searched for lemurs to that page. So, if these spammers happened to be selling lemurs, they could use this method to drive a lot of people to their store even if that store had no information about lemurs or the lemurs they sold weren’t very good. They could get traffic if they just had the word “lemur” on their page enough times (this example is simplified for illustrative purposes).

Then along came second-generation search: Google. Google’s search was smarter because it looked at pages that linked to pages. If you owned a site about lemurs, Google would scan your site and know that it was about lemurs, but it would also look at other sites, and if they linked to your site with the word “lemurs” in the link, Google would figure that your site about lemurs was pretty important, so it should show up high in a Google search for lemurs.
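Here is an equally rough sketch of that second-generation idea: a page’s score for a query depends on who links to it and what the anchor text says, not just on its own words. The link graph is invented for illustration, and real systems like PageRank also weight each linking page by its own authority rather than counting all links equally.

```python
# Toy link graph: (linking site, linked-to site, anchor text).
links = [
    ("zoo-blog.example.com", "lemur-facts.example.com", "a great lemur resource"),
    ("primate-news.example.com", "lemur-facts.example.com", "lemurs explained"),
    ("discountlemurs.example.com", "discountlemurs.example.com", "lemur lemur lemur"),
]

def link_score(site, query):
    """Count inbound links from other sites whose anchor text mentions the query."""
    return sum(
        1
        for src, dst, anchor in links
        if dst == site and src != dst and query.lower() in anchor.lower()
    )

for site in ("lemur-facts.example.com", "discountlemurs.example.com"):
    print(site, link_score(site, "lemur"))
```

The keyword-stuffed page now scores zero, because nobody independent links to it, which is why the spammers had to change tactics.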

Again, this was fine and dandy until the spammers figured it out. The early spammers just set up a lot of cheap sites with links to their main site, and built authority in Google’s index that way. Over time, they had to get smarter, so they set up link exchanges among reputable sites, or started buying text links on reputable sites.

Then the spammers set up companies like PayPerPost.com that pay people to write something (anything) about lemurs and link to discountlemurs.com in their blogs. Any human would read these blogs and dismiss them as marketing hooey, but to Google’s algorithm they look perfectly valid, and they lend authority to discountlemurs.com. Of course, these ersatz bloggers are just shills writing marketing copy for a living – their blogs don’t get much traffic (or they’d have real advertisers), but because their posts look real, Google’s algorithm is fooled into thinking the sites they point to (discountlemurs.com) have some authority.

What people outside of the Search industry don’t realize is that Search is hitting a brick wall. The second-generation algorithms, including Google’s, are constantly struggling to stay one step ahead of the spammers. Just read through a few of the Search industry sites – Webmasterworld.com, SearchEngineRoundup, SearchEngineWatch, SearchEngineLand – and you’ll see the trends soon enough. Google has to update its algorithm all the time to combat spammers, and it’s hard to say who’s winning.

My bet is on the spammers for one simple reason: people are still smarter than computers. If someone can program a search engine to give authority to web pages that match certain criteria, someone else can figure out how to simulate those criteria. Only recently have computers started to beat the chess masters, and that’s a game with simple rules; the search optimization industry has no rules, so it will be a long time before computers can surpass human judgment when it comes to determining which sites are really important.

This is why I believe it is just a matter of time until Search becomes social, and Google Search starts to fade away. Some may argue that Google’s weighting of links from other sites is already social because those links are placed by people, but we’ve seen that this linking can easily be gamed or automated. A truly social search is one that takes user trends and preferences and uses them to tailor its results. It can look at user click behavior, or it can look at the ratings that users give to various sites – or, even better, a combination of the two. It is true that this can be gamed too, but with the proper requirements and safeguards in place, abuse should be relatively easy to detect. If done correctly, it will be too costly for marketers to pay enough people to promote a site, or for someone to create enough bogus accounts to sway the indexing in their favor. And if a site somehow does promote itself artificially, the masses will vote it down immediately and promote the sites they think are truly important instead. This is the only way for Search to evolve.
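As a rough illustration of the kind of signal I mean (not a real ranking formula – the data, thresholds, and weights are all invented), here is how click behavior and user ratings could be blended, with a simple safeguard that ignores votes from brand-new accounts:

```python
from dataclasses import dataclass

@dataclass
class Vote:
    account_age_days: int  # crude stand-in for an account-trust safeguard
    value: int             # +1 or -1

def social_score(clicks, impressions, votes,
                 min_account_age=30, click_weight=0.5):
    """Blend click-through rate with votes from accounts that pass a trust check."""
    ctr = clicks / impressions if impressions else 0.0
    trusted = [v.value for v in votes if v.account_age_days >= min_account_age]
    vote_score = sum(trusted) / len(trusted) if trusted else 0.0
    return click_weight * ctr + (1 - click_weight) * vote_score

# A genuinely useful site: real readers click it, long-standing accounts vote it up.
print(social_score(clicks=80, impressions=100,
                   votes=[Vote(400, +1), Vote(900, +1), Vote(2, +1)]))   # 0.9

# A promoted site: few clicks, and its upvotes come from freshly created accounts.
print(social_score(clicks=5, impressions=100,
                   votes=[Vote(1, +1), Vote(3, +1), Vote(500, -1)]))     # -0.475
```

Even a crude trust check like this pushes the cost of gaming the signal onto the attacker, which is the whole point.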

We’re already seeing versions of social search in smaller applications like Digg, Flickr, Delicious, Yelp, and other sites. It’s just a matter of time until someone adapts these techniques to Search itself and unseats Google’s algorithm. Ask.com sure isn’t going to do it – who will?
