Outside of spending your whole budget on Google Ads to draw in new customers, there are other options to explore. No one would blame you for being fed up with PPC: relying on it exclusively to drive traffic to your website has some fairly serious downsides – depending on the competition it can become expensive (driving down ROI), and traffic can disappear as soon as your monthly budget is reached.
If budget is a restrictive factor in your marketing, focus on driving traffic through organic search by applying Search Engine Optimisation (SEO).
Google is at the vanguard of helping small and large businesses get in front of potential customers, and provides a fairly comprehensive set of tools to help improve their websites’ ranking and subsequent results.
The tool we’ll be looking at today is called Google Search Console.
The tool has been around for a long while – it was previously known as Google Webmaster Tools (part of Google Webmaster Central). In 2015, Google rebranded it as Google Search Console, but the purpose has stayed the same – the ability to review and optimise your website.
One of the most valuable things about Google Search Console (GSC) is that it’s absolutely, unequivocally free. It’s made by Google themselves, so the advice comes straight from the source – Google HQ.
Without further ado, here is how you can use Google Search Console to maximise your SEO results.
How to add your website to Google Search Console
Let’s kick things off by signing up for your free account with Google Search Console. After signing up, you’ll need to verify that you actually own the site you’re going to review, and there are a few ways to do this. Note: there isn’t currently a way to use this tool to look at your competitors’ sites, though there are third-party tools that can.
After you’ve signed up with the link above, you can start by clicking on ‘add property’ on the left-hand dropdown.
From here, you just need to enter your site’s URL. Note that the entry is strict, which means the https:// and http:// versions of your site are treated as two different properties.
Moving on, you’ll need to verify ownership of the site. Google provides a number of different ways of doing this.
The simplest method is to add an HTML file to your server. The file is provided by Google Search Console and simply needs to sit in your site’s root directory. Alternatively, you can add a meta tag to your homepage, edit your DNS settings, or connect your Google Analytics or Google Tag Manager account.
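As an illustration, the meta tag method involves pasting a Google-supplied tag into the head of your homepage. The token below is a placeholder – Search Console generates a unique value for you when you choose this verification method:

```html
<!-- Paste inside the <head> of your homepage before clicking 'Verify'.
     The content value is a placeholder: Google Search Console generates
     a unique token for your property. -->
<meta name="google-site-verification" content="YOUR-UNIQUE-TOKEN" />
```

Leave the tag in place after verification – removing it can cause your site to become unverified again.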
If you’ve carried out the above process correctly, your site will start to generate data, and that data will be shown in Google Search Console. It may not appear straight away – give it a few hours or a day or two. Rest assured, it will start rolling in.
Once data starts coming in, you can use a few different tools to navigate it – Overview, Performance and URL Inspection will each give you an understanding of how your site is performing.
Overview gives you a rough summary of everything from which keywords you are ranking for to how much traffic you are getting.
You’ll also see whether Googlebot is hitting any crawl errors when going through your website, the number of sites linking to yours, and how many of your pages Google has successfully managed to index.
On the Performance tab, you can see a more detailed breakdown of your site’s performance on Google.
With URL inspection you can explore a specific URL. Type it into the search bar at the top of the dashboard and it will show a quick report on how Google sees the URL.
Google isn’t perfect and can make mistakes, so it’s worth helping it along by presenting the information it needs to show your website more frequently for relevant search terms. When configuring your website to help Google understand what your business does, there are a number of different areas you should become familiar with.
On your website, some pages may contain information that you do not want Google to catalogue. These could be pages reserved for certain customers’ private use, RSS feeds, or data you simply don’t want outsiders privy to.
On the Coverage tab you can review a basic report of pages on your site.
To make it easy to read, the report is divided into a few categories: pages with technical errors, pages with warnings, pages excluded (for example by robots.txt), and pages indexed successfully. The number of valid and excluded pages depends on what you’d like to present on Google and what you’d like to keep private (off Google).
Creating a robots.txt file can help you block pages from being crawled and indexed by Google (and other search engines). It’s a powerful tool for keeping pages out of search results but, if incorrectly configured, can cause a few headaches.
For truly sensitive areas you may want to consider password protecting all relevant directories or pages. Just because a page isn’t indexed on a search engine, it doesn’t mean it isn’t accessible by other means.
With a robots.txt generator, you can create a robots.txt file and validate it before uploading it to your server.
Checking your robots.txt before you upload it is wise, because the last thing you want is to make a mistake and tell search engines not to crawl your whole website.
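If you’re comfortable with a little scripting, you can also sanity-check a draft robots.txt yourself before uploading it. This is a minimal sketch using Python’s standard-library `urllib.robotparser`; the rules and paths below are placeholders – substitute your own private directories:

```python
from urllib.robotparser import RobotFileParser

# A draft robots.txt as it would appear before uploading to your server's
# root directory. The /private/ and /feeds/ paths are placeholder examples.
robots_txt = """\
User-agent: *
Disallow: /private/
Disallow: /feeds/
Allow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Confirm the rules behave as intended: blocked paths refuse crawling,
# everything else remains fetchable.
print(parser.can_fetch("*", "https://example.com/private/report.html"))  # False
print(parser.can_fetch("*", "https://example.com/index.html"))           # True
```

If a URL you expected to be public comes back `False`, fix the rules before the file ever reaches your server – far cheaper than waiting for pages to drop out of the index.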
If you do accidentally misconfigure your robots.txt file and see Google indexing pages that you don’t want indexed, you can request removal through this section. However, you really want to make sure your robots.txt covers all bases rather than rely on a tweak here.
Sitemaps! Next up is sitemaps. A sitemap is basically a ‘table of contents’ for your site that helps Google find every page on your website and understand its structure.
Submitting a sitemap will help Google understand what pages you have and which of them are publicly available – and how to access them.
The world won’t end if you don’t submit a sitemap, but your responsibility as a website administrator is to make Google’s job as simple as possible. Not submitting one may mean some of your pages aren’t indexed and ranked, so it’s best practice to use a sitemap.
Sitemaps must be submitted in XML format, and an individual sitemap file can’t contain more than 50,000 URLs or be larger than 50MB uncompressed. If you exceed either limit, you’ll need to split your sitemap across multiple files and submit all of them.
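For reference, a minimal sitemap listing a single page looks like this – the URL and date below are placeholders for your own pages:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- One <url> entry per page; <loc> is required, <lastmod> is optional. -->
    <loc>https://www.example.com/</loc>
    <lastmod>2020-01-01</lastmod>
  </url>
</urlset>
```

Each additional page gets its own `<url>` block inside the same `<urlset>`.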
If you aren’t a technical bod, you can go to XML Sitemaps and create a sitemap with a few clicks of the mouse. Simply point the tool at your website’s URL and tap ‘start’.
Once your sitemap has been uploaded, Google will tell you how many of your pages have been indexed. It can take a number of crawls before Google sees and indexes them all – but it will eventually get around to them.
The goal here is to get as many pages indexed as possible.
If pages aren’t being indexed, there’s usually a reason – the content isn’t unique, the title tags and meta descriptions are too generic, or not enough websites are linking to your internal pages.
At the time of writing, Enhancements contains a single report: ‘Mobile Usability’. Ideally, you’ll want every page on your website to work well on mobile with zero errors. Any errors here may affect how your website is shown in search results on both desktop and mobile devices, so both need to work well.
There are a few more options at the bottom of the menu bar. Let’s dive into those next.
Manual Actions gives Google the means to manually flag elements of your site as spam. Here is what Google has to say on Manual Actions:
“Google’s algorithms can detect the vast majority of spam and demote it automatically; for the rest, we use human reviewers to manually review pages and flag them if they violate the guidelines.”
If your website looks a tad spammy, you’ll get a notification from Google highlighting an aspect of your website which requires your attention. It’s unusual to have any notifications under this category unless you’re doing something fishy OR you run a very complex website.
Really, the only item you should expect to see on this page is ‘No issues detected’ – until there is an issue. When there is, you’ll need to jump on it quickly to avoid any long-lasting penalties.
Under the Links tab, there is a wealth of data about where your site is receiving links from, what those links say and where they point.
It’s divided up into two sections – Internal links and External ones.
When we talk about internal links, we mean links from within your website, e.g. where one page contains a link to another page (both on your site). Linking pages together is a great way of telling Google how important those pages are to you – if you link to a quotation page from every page on your site, for example, Google will understand that the quotation page is important to your business and will be more likely to rank it higher in search results.
If you don’t link to your internal pages, they will not get as much attention and they won’t place as well in search results.
When we talk about external links, we mean links from other websites to yours. These links are more valuable and harder to come by. External links are a strong indicator to Google that the pages being linked to are authoritative – the assumption being that they must be good quality if other businesses are referring their visitors to them. Google treats external links as a top ranking factor, so the more backlinks you can collect from good-quality sources, the higher your pages will rank.

External links are one of the best ways of increasing your overall rankings on Google, so collecting them is extremely important. You can do this by simply writing great content which appeals to your potential customers, and by increasing exposure through publishing it on multiple platforms. The more other businesses see your pages, the higher the chance of them linking back.
Crucially, these tabs give you an insight into your website’s link profile and expose the areas where more work is needed to increase your website’s rank in the future.
Now, if you’re looking to grow the reach of your website and you don’t want to put all of your eggs in one basket, then pursuing both a paid strategy (PPC) and an organic strategy (SEO) is a perfect fit.
Remember though, you need to follow the rules and avoid shortcuts – we want to make your website as clear as possible to Google and avoid any risk of receiving manual penalties.
If you’re just getting started, the best way is to follow Google’s rules and adhere to their instructions. Start paying close attention to your Google Search Console and it will present you with plenty of valuable options for growing your organic reach through Search Engine Optimisation.
Within just a few clicks, you can start reviewing your website’s online profile and expose ways of improving it by following Google’s instructions and playing by the rules.
Give it a go – Sign up for free over at Google Search Console.