The worst mistake you can make as a website owner is not checking that search engines can successfully crawl your site and its web pages.

If your website is not crawlable, your web pages won’t appear in Google’s search results.

There are a number of reasons why this happens, including:

1. Using software that auto-generates URLs full of query strings and special characters, e.g. www.website.com/?website/page%/0965744/content. Keep URLs as clean and as easy to follow as possible.
2. Running an e-commerce site with tens of thousands of product pages. You will very likely need to submit your products across several sitemap files, rather than just one.
3. Running a site with hundreds of thousands of web pages. You will need to submit a number of organised sitemaps, ideally grouped under a sitemap index, for Google to list everything properly (see the sitemap index example after this list).
4. If you use a CMS such as WordPress, ensure you have not ticked the ‘discourage search engines from indexing this site’ (or similarly worded) option.
5. Your XML sitemap is not properly formatted (see the sitemap example after this list).
6. Your robots.txt file is not properly formatted (see the robots.txt example after this list).
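
To give you a rough idea of what ‘properly formatted’ means, here is a minimal XML sitemap that follows the sitemaps.org protocol. The URLs and dates below are placeholders, so swap in your own pages:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want crawled -->
  <url>
    <loc>https://www.website.com/page-one/</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
  <url>
    <loc>https://www.website.com/page-two/</loc>
    <lastmod>2024-01-20</lastmod>
  </url>
</urlset>
```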
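
For very large sites (points 2 and 3), a single sitemap file is capped by the protocol at 50,000 URLs, so you split your pages across several sitemaps and list them all in one sitemap index. The file names here are just examples:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Each entry points to one of your individual sitemap files -->
  <sitemap>
    <loc>https://www.website.com/sitemap-products-1.xml</loc>
  </sitemap>
  <sitemap>
    <loc>https://www.website.com/sitemap-products-2.xml</loc>
  </sitemap>
</sitemapindex>
```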
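
And for point 6, your robots.txt file sits at the root of your domain (www.website.com/robots.txt). A crawl-friendly version like the one below allows all crawlers and points them at your sitemap; the main thing to check is that yours does not contain a blanket ‘Disallow: /’, which blocks the entire site:

```
# Allow all crawlers to access the whole site
User-agent: *
Disallow:

# Point crawlers at your sitemap
Sitemap: https://www.website.com/sitemap.xml
```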

If you are not sure whether your website is crawlable, it’s well worth asking a Google SEO specialist to run a few tests. This can normally be done within an hour. Putting things right might take an hour or two, depending on how good your web developer is at addressing these types of issues.