Search engines use programs called spiders that visit web pages to determine what your site's content is about and to find other links to scan at a later date.
1. Spiders, or web crawlers, scan the content of web pages.
2. They send the scanned content back to the search engine, where algorithms break it down and analyze it.
3. If the spiders encounter links to other pages or websites, those links are stored.
4. Eventually other spiders crawl the linked-to pages.
5. Therefore, the more links your website has pointing to it from other pages and sites, the more frequently it is visited and crawled.
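To make the fetch-store-follow cycle above concrete, here is a minimal sketch of a crawler in Python. It is an illustrative assumption, not how any real search engine is implemented: the function name `crawl`, the page limit, and the use of Python's standard library are all choices made for the example.

```python
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen

class LinkParser(HTMLParser):
    """Collects href values from <a> tags on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(seed_url, max_pages=10):
    """Breadth-first crawl: fetch a page, store its links, visit them later."""
    queue = deque([seed_url])   # links waiting to be crawled
    visited = set()             # pages already scanned
    while queue and len(visited) < max_pages:
        url = queue.popleft()
        if url in visited:
            continue
        try:
            html = urlopen(url, timeout=5).read().decode("utf-8", errors="ignore")
        except Exception:
            continue            # skip pages that fail to load
        visited.add(url)
        # In a real engine, the page content would be sent off here
        # to be broken down and analyzed for the index.
        parser = LinkParser()
        parser.feed(html)
        for link in parser.links:
            queue.append(urljoin(url, link))   # store links to crawl later
    return visited
```

The queue is the key detail for the point about links: every link discovered on one page becomes another entry waiting to be crawled, which is why pages with many inbound links get revisited more often.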