Search Engine Algorithms


Search engines are the key to finding specific information on the vast expanse of the World Wide Web. Without sophisticated search engines, it would be virtually impossible to locate anything on the Web without knowing a specific URL.

Every search engine wants to provide relevant and accurate search results, and this is accomplished primarily through something called an algorithm.

An algorithm is a set of mathematical rules that assigns values to specific factors such as keyword density, page title, meta tags, link popularity, emphasized text, and whatever else the search engine deems important. For each search, these values are totaled for every web page, and the resulting scores are used to rank the pages by relevance to the search query.
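The weighted-scoring idea above can be sketched in a few lines. The factor names and weights below are illustrative assumptions, not the algorithm of any real search engine:

```python
# A minimal sketch of weighted page scoring. The factors (keyword
# density, title match, link popularity) and their weights are
# hypothetical examples, not a real engine's formula.

def score_page(page, query_terms, weights=None):
    """Total the weighted values of a few ranking factors for one page."""
    if weights is None:
        weights = {"keyword_density": 3.0,   # assumed weights
                   "title_match": 5.0,
                   "link_popularity": 0.5}
    words = page["body"].lower().split()
    hits = sum(words.count(term) for term in query_terms)
    density = hits / len(words) if words else 0.0
    title_match = sum(term in page["title"].lower() for term in query_terms)
    return (weights["keyword_density"] * density
            + weights["title_match"] * title_match
            + weights["link_popularity"] * page.get("inbound_links", 0))

# Rank a tiny corpus for the query "python":
pages = [
    {"title": "Python tutorial", "body": "learn python fast", "inbound_links": 4},
    {"title": "Cooking basics", "body": "boil water first", "inbound_links": 9},
]
ranked = sorted(pages, key=lambda p: score_page(p, ["python"]), reverse=True)
# The page matching the query in both title and body ranks first.
```

Real engines combine hundreds of such signals, but the principle is the same: each factor contributes a weighted value, and the sum orders the results.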

The exact algorithms used by search engines are kept secret, and they are frequently changed to improve the search results. Keeping them a secret helps to prevent unwanted manipulation of websites, which may lead to poor or irrelevant search results.

There are basically three types of search engines: those powered by robots (called crawlers, ants, or spiders), those powered by human submissions, and those that are a hybrid of the two.

Crawler-based search engines use automated software agents (called crawlers) that visit a website, read the information on the site itself, read the site's meta tags, and follow the site's outgoing links, indexing the linked websites as well. The crawler returns all of that information to a central repository, where the data is indexed. The crawler periodically revisits the sites to check for information that has changed; how often this happens is determined by the administrators of the search engine.
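The visit-index-follow loop described above can be illustrated with a toy crawl over an in-memory "web". The site graph and its URLs are invented for the example; a real crawler fetches pages over HTTP and respects conventions like robots.txt:

```python
# A toy sketch of crawler-based indexing: visit a page, record its
# words in an inverted index, then follow its links. The WEB graph
# below is a hypothetical stand-in for real HTTP fetching.
from collections import deque

WEB = {  # url -> (page text, outgoing links)
    "a.example": ("search engines crawl the web", ["b.example"]),
    "b.example": ("crawlers follow links", ["a.example", "c.example"]),
    "c.example": ("the index maps words to pages", []),
}

def crawl(start_url):
    """Breadth-first crawl from start_url, building an inverted index."""
    index, seen = {}, set()
    queue = deque([start_url])
    while queue:
        url = queue.popleft()
        if url in seen or url not in WEB:
            continue  # skip already-visited or unreachable pages
        seen.add(url)
        text, links = WEB[url]
        for word in text.split():
            index.setdefault(word, set()).add(url)  # word -> pages containing it
        queue.extend(links)  # follow links so linked sites are indexed too
    return index

index = crawl("a.example")
```

Answering a query then reduces to looking up each query term in the inverted index, which is why the central repository is built this way.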

Human-powered search engines rely on humans to submit information that is subsequently indexed and catalogued. Only information that is submitted is put into the index.

With billions of addressable documents publicly accessible, Internet search engines remain fundamental to information seeking on the Web. The scale of these engines, both in content and in access, makes their algorithms, architectures, and implementations challenging.

Search engines are important and provide a valuable service to the Internet. Their algorithms are in a constant state of change and improvement, which in itself creates a further challenge for search engine optimization companies to stay well informed and educated.