Ever wonder how a search engine like Google sifts through the web pages on the Internet to produce the best possible results for its searchers?
This week, engineers, product managers and executives at Google will be meeting to determine how they can make their search engine smarter. As you know from our recent post on the topic, Google has made many changes over the course of its 10+ year history.
Google has become synonymous with search, commanding around 2/3 of search traffic. But that hasn't made the company slow down as it strives to "organize the world's information," as its mission statement says. None of the upstarts such as Facebook, Twitter and Yelp poses a threat to Google on its own. Going forward, though, search will not simply be dominated by Google but will instead incorporate a combination of services.
The biggest threat to Google, however, is Bing, Microsoft's revamped search engine. Microsoft is trying to fill in the places where it feels Google's algorithm falls short, namely health, reference and shopping.
While Bing is gaining market share, Google is still miles ahead in the core task of dissecting a search and returning relevant results. Using contextual signals, Google has mastered the ability to figure out what a searcher is actually looking for.
To do this, Google culls data from all of its searchers, looking at the terms people type in as well as the follow-up queries they enter when the first set of results isn't satisfactory.
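To make that idea concrete, here is a minimal sketch, in Python, of how one might mine a search log for quick query reformulations, the kind of "the first results missed" signal described above. The log data, the two-minute cutoff and the function name are all made up for illustration; this is a sketch of the general technique, not Google's actual pipeline.

```python
from collections import defaultdict
from datetime import datetime, timedelta

# Hypothetical search log: (user_id, query, timestamp) tuples.
SEARCH_LOG = [
    ("u1", "jaguar",       datetime(2010, 5, 1, 9, 0, 0)),
    ("u1", "jaguar car",   datetime(2010, 5, 1, 9, 0, 25)),
    ("u2", "python",       datetime(2010, 5, 1, 9, 5, 0)),
    ("u2", "python snake", datetime(2010, 5, 1, 9, 5, 40)),
    ("u3", "jaguar",       datetime(2010, 5, 1, 9, 10, 0)),
]

# Assumed cutoff: a reworded query within two minutes counts as a refinement.
REFINEMENT_WINDOW = timedelta(minutes=2)

def count_refinements(log):
    """Count how often each query is quickly followed by a different query
    from the same user -- a rough proxy for unsatisfactory results."""
    by_user = defaultdict(list)
    for user, query, ts in sorted(log, key=lambda row: row[2]):
        by_user[user].append((query, ts))

    refinements = defaultdict(int)
    for searches in by_user.values():
        for (q1, t1), (q2, t2) in zip(searches, searches[1:]):
            if q2 != q1 and t2 - t1 <= REFINEMENT_WINDOW:
                refinements[q1] += 1
    return refinements

if __name__ == "__main__":
    for query, count in count_refinements(SEARCH_LOG).items():
        print(f"{query!r} was quickly refined {count} time(s)")
```

Queries that are frequently and quickly reworded are candidates for ranking improvements, which is the spirit of the signal this post describes.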
And most incredibly, Google makes these changes under the radar. Searchers have no idea that their searches are constantly being dissected and that the company is always trying new things to improve its algorithm.
Explore this topic further in this article from Wired magazine, which outlines some of Google's internal processes. Knowledge like this can be tremendously helpful in optimizing your website for search engines.