Search engines are the key to finding specific information on the Internet.
There are basically three types of search engines:
- Crawler-based
- Human-based
- A combination of the above two.
Crawler-based search engines find information across hundreds of millions of web pages. To do this, they use automated software programs called crawlers or spiders to build a list of the words found on websites and send that information back to a central repository. The process by which a spider builds its list is called web crawling. A spider typically begins its journey at a popular site, indexing the words on its pages and following every link it finds within the site.
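The sketch below shows that crawling loop in miniature, using only Python's standard library. The seed URL, the page limit, and the in-memory dictionary standing in for the central repository are assumptions made purely for illustration; a real spider would also respect robots.txt, crawl delays, and operate at a vastly larger scale.

```python
# A minimal web-crawling sketch using only Python's standard library.
# The seed URL, page limit, and in-memory "repository" are illustrative
# assumptions; real spiders also respect robots.txt, rate limits, etc.
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen


class PageParser(HTMLParser):
    """Collects the words and outgoing links found on one page."""

    def __init__(self):
        super().__init__()
        self.words = []
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

    def handle_data(self, data):
        self.words.extend(data.split())


def crawl(seed_url, max_pages=10):
    """Breadth-first crawl: index each page's words, then follow its links."""
    repository = {}           # the "central repository": url -> list of words
    queue = deque([seed_url])
    seen = {seed_url}

    while queue and len(repository) < max_pages:
        url = queue.popleft()
        try:
            html = urlopen(url, timeout=5).read().decode("utf-8", errors="ignore")
        except OSError:
            continue  # skip pages that fail to load

        parser = PageParser()
        parser.feed(html)
        repository[url] = parser.words

        # Follow every link found within the page.
        for link in parser.links:
            absolute = urljoin(url, link)
            if absolute.startswith("http") and absolute not in seen:
                seen.add(absolute)
                queue.append(absolute)

    return repository


if __name__ == "__main__":
    # "https://example.com" is just a placeholder seed site.
    pages = crawl("https://example.com", max_pages=3)
    for url, words in pages.items():
        print(url, "->", len(words), "words indexed")
```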
Human-based search engines rely on human submissions, such as web directories.
In both cases, a query is answered by searching the giant databases the engines have indexed, and these databases are updated at regular intervals.
The same query will give different results on different search engines because their indexes are not the same and they do not all use the same ranking algorithms.
Web spiders are built to index every significant word on a page while filtering out spam. Some spiders index only the most frequently used words, but all spiders keep track of the words in titles, subtitles, meta tags, image alt attributes, and the web page content itself.
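As a rough illustration of how a spider might record where each word was found, the sketch below parses a small sample page and tallies words by location, giving higher weight to the title, headings, meta tags, and image alt text than to the body. The specific weights and the sample HTML are assumptions for illustration, not how any particular engine scores pages.

```python
# A sketch of per-word indexing that notes where on the page each word
# appears. The weights and the sample HTML below are assumptions.
from collections import Counter
from html.parser import HTMLParser


class IndexingParser(HTMLParser):
    """Tallies words by the part of the page they appear in."""

    # Assumed weights: words in prominent locations count for more.
    WEIGHTS = {"title": 5, "h1": 4, "h2": 3, "meta": 3, "alt": 3, "body": 1}

    def __init__(self):
        super().__init__()
        self.index = Counter()
        self._current = "body"

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag in ("title", "h1", "h2"):
            self._current = tag
        elif tag == "meta" and attrs.get("content"):
            self._add(attrs["content"], "meta")
        elif tag == "img" and attrs.get("alt"):
            self._add(attrs["alt"], "alt")

    def handle_endtag(self, tag):
        if tag in ("title", "h1", "h2"):
            self._current = "body"

    def handle_data(self, data):
        self._add(data, self._current)

    def _add(self, text, section):
        weight = self.WEIGHTS[section]
        for word in text.lower().split():
            self.index[word] += weight


if __name__ == "__main__":
    sample = """
    <html><head><title>Solar Panels</title>
    <meta name="keywords" content="solar energy panels"></head>
    <body><h1>Why solar?</h1><img src="roof.jpg" alt="rooftop solar panels">
    <p>Solar panels convert sunlight into electricity.</p></body></html>
    """
    parser = IndexingParser()
    parser.feed(sample)
    print(parser.index.most_common(5))
```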