
How does Google see my site? 6 secrets that you should know for your business!

SEO

Though in this day and age it’s essential that you focus your marketing efforts on providing a powerful user experience for your customers, it’s still key that you understand just how Google sees your business’ website. By doing this you’ll craft your site, pages and content to suit the needs of Google, which in turn will lead to higher rankings and greater visibility to your potential customers.

So, let’s take a little look at how exactly Google sees your business’ website.

WANT TO ENHANCE YOUR BUSINESS’ SEO STRATEGY? THEN DOWNLOAD OUR FREE ULTIMATE GUIDE TODAY!
Finding your site.

When you create a website, Google will first need to discover it. Googlebot systematically crawls the web, finding new sites, pages and content and gathering the information it finds, so that those sites can be indexed and ranked according to their subject matter.

It can take a while, however, for Googlebot to discover your business’ website and its new content, but there are a couple of ways you can help speed up that process:

  • Create a Sitemap – A sitemap is created specifically for search engines, though your more technically aware site users may choose to navigate through it. Simply put, a sitemap tells Google exactly where all your pages are and how you would like a user and Google to navigate through them (there’s a minimal example just after this list). This is an essential tool for any website, and any web developer worth their salt will ensure it is created and provided to you. If, however, you don’t currently have a sitemap you can use sites like xml-sitemaps.com to easily generate one, or if your site runs on WordPress then the Google Sitemap Generator plugin is a quick and effective tool.
  • Google Search Console (formerly Webmaster Tools) is a powerful tool, and should definitely be something you sign your site up for. First, it lets you confirm that your site is actually being indexed; you can then submit your sitemap if it hasn’t already been picked up, and when you create new pages for your website you can request that Googlebot comes and crawls them.
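
To give you an idea of what Google actually receives, here’s a minimal sketch of an XML sitemap containing a single made-up page (the URL and date are placeholders for your own):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/services/</loc>
    <lastmod>2024-01-01</lastmod>
  </url>
</urlset>

Each page you want crawled gets its own <url> entry, and the optional <lastmod> date gives Google a hint as to when the page last changed.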
Viewing your site.

Ensuring you have a robots.txt file is a necessity.

By using a robots.txt file on your website you can tell Google exactly which areas of your site it should look at and which aren’t for its eyes. This can be really useful for pages you don’t want Google to index, such as gated content pages or user account pages.

Your robots.txt file can be as complex or as simple as you like, but all that’s really required is a plain text file, named robots.txt, uploaded to the root of your web server. When Googlebot comes to crawl your website it will look at this file first to see what it is and isn’t allowed to crawl.

The following text is what you would use in your robots.txt file to say “Yes, crawl everything!”

User-agent: *

Disallow:

And if you wanted to prevent Google from crawling and indexing any areas of your site then you’d simply list them after ‘Disallow:’.
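
For example, to keep Googlebot out of a couple of hypothetical areas (the folder names here are purely illustrative) while leaving the rest of the site open to crawling, your robots.txt would look something like this:

User-agent: *
Disallow: /members/
Disallow: /checkout/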

If you want to check whether your site currently has a robots.txt file, all you need to do is type in your URL followed by /robots.txt (e.g. www.yourdomain.com/robots.txt), and if you have one you’ll see a page like the image below:

[Image: an example robots.txt file]

Your page title.

After viewing your robots.txt file, Googlebot will then move on to look at your website’s page title. This tag can be identified by <title> in your pages’ HTML code. Many top search engine experts, such as Neil Patel, Rand Fishkin and our very own SEO team, call it “the single most important SEO element”.

And the Googlebot will look for a number of things in your title tag:

  • Firstly, it will read all of your title tag, though only the first 65-70 characters matter (variations occur because characters differ in pixel width), as this is all that will show on SERPs.
  • It will look at all of your pages’ title tags, expecting every one of them to be unique.
  • Finally, it will identify the keywords in your page title.
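
To make that concrete, here’s a made-up title tag for a hypothetical London plumbing firm, including the target keyword and sitting comfortably inside that 65-70 character window:

<head>
  <title>Boiler Repairs London | Example Plumbing Co.</title>
</head>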

 

Checking your descriptions.

Next, the bot will look at your page’s meta description tag (<meta name="description">), and there are various lines of thought as to whether Google actually uses your meta description as a ranking factor. But there is no argument that what you include in your description is key for your users, as it tells them on SERPs what your website and its pages are about.

A couple of things to think about:

  • Write your description for humans, not search engines.
  • Keep it at 160 characters or less.
  • Include the keywords you’re targeting with that page.
  • And finally, remember this is probably the only bit of free advertising space you’ll ever receive, so make sure you craft something powerful.
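
Pulling those points together, a hypothetical meta description for that same imaginary plumbing page might look like this in your HTML, written for people, under 160 characters and including the target keyword:

<meta name="description" content="Fast, affordable boiler repairs in London. Our Gas Safe engineers are available 7 days a week. Call today for a free quote.">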
Images?

Though images and videos are a powerful tool to grab your users’ attention and break up those bulky and cumbersome blocks of text, Google can’t yet crawl them and identify what you are showing your users.

So, to get around this and tell Google exactly what your images and videos are, you need to use alt tags.

You can insert your alt tags either directly into the HTML code of your site, adding alt="a description of your image" inside your <img> tag, or most CMS systems will let you simply add the alt text when you upload the image.
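
In practice, the HTML for a hypothetical image might look like this (the filename and description are placeholders):

<img src="boiler-repair-london.jpg" alt="Engineer carrying out a boiler repair in a London home">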


So, make sure you add descriptive, keyword-sensitive alt tags to every image on your business’ site.

Looking at your content.

Finally, the Googlebot will come to what I personally think is the most important aspect of any site (but then I am a content writer):

Your Content

Here Google will crawl and index every single word on your pages, and the more content you have the better, not only to help Google rank your site, but to help you add value to your users’ experience.

However, always remember to ensure that quality remains the top priority over quantity.

Google also likes to see fresh content, and every time you publish a new blog post you’ll raise the Googlebots’ heads and bring them back to crawl your site for that new content.

By taking all of these points into consideration when you update your business’ website, be it a simple new blog post, or a complete facelift, you can ensure that you not only provide a great user experience but also ensure that Google can find all the information it requires to rank your business on page one.

But if you want to make sure your site is being properly crawled by Google, use its free and powerful tool, Google Search Console.
