HOW TO GET YOUR WEBSITE TO THE TOP OF GOOGLE IN 2016

Remember that Google is an answer machine. It wants to give the best answer it can for any given question it is asked by using all of the resources and information it has available.

Try a few different Google searches — a local service, a recipe, a direct question — and notice how differently the results are presented for each.

So we can see that results are displayed according to the type of search we carry out. Google is showing us what it “knows” about the content of the Internet.

Google knows which are the best websites to show because, as it crawls the Internet collecting information about websites for its index, it also does something else: it uses a very complicated algorithm to analyse the signals surrounding each website.

There are hundreds of factors it considers but to break all of those things down into an easy overview we could say the important things are:
TRUST, AUTHORITY & RELEVANCY

POINT 1: TRUST

Google spends a lot of time and money trying to stop spammers from abusing weaknesses within the Google algorithm. Every now and then it changes the algorithm to keep up with these exploits, and one way it does this is by considering whether your website is trustworthy or not.

Google has a program called a spider (also known as Googlebot) which crawls websites to gather information.

Not only does it collect information regarding how well designed your website is for a user experience, it also tries to understand what your website is about.
If you mention another website and you link out to that website from your own site then Google will also pay a visit to that external page. It will consider your acknowledgement as a vote of confidence.
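To make the idea concrete, here is a toy sketch of the link-following step: a parser that reads a page's HTML and picks out links pointing to other domains. This is purely illustrative — the class name, the sample page and the domain are my own inventions, not anything Google uses — but it shows what "finding the outbound links on a page" means mechanically.

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class OutboundLinkParser(HTMLParser):
    """Collects href targets that point to a different domain than the page's own."""

    def __init__(self, own_domain):
        super().__init__()
        self.own_domain = own_domain
        self.outbound = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href", "")
        domain = urlparse(href).netloc
        # A link counts as outbound when it leads to another domain entirely.
        if domain and domain != self.own_domain:
            self.outbound.append(href)

# Hypothetical page on www.example.com linking out once and internally once.
page = """
<p>We recommend <a href="https://www.example.org/guide">this guide</a>
and our own <a href="/about">about page</a>.</p>
"""
parser = OutboundLinkParser("www.example.com")
parser.feed(page)
print(parser.outbound)  # ['https://www.example.org/guide']
```

A real crawler would then fetch each of those outbound URLs in turn, which is exactly the "pay a visit to that external page" behaviour described above.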

The same is true of links to your website from social platforms such as Facebook and Twitter.

User Experience

Google Analytics and Webmaster Tools (now called Search Console) are provided by Google themselves so we can see what Google sees regarding our website.
These tools give us an indication of any technical issues it finds, how visitors find you and what they do when they are on your website.
Google takes all of these signals into consideration. Poorly coded or designed websites are graded lower than slick ones that cater to and adjust for the user experience.

Social Signals

Google considers these as OFFPAGE SIGNALS. A trustworthy website will not only naturally link out to its own social pages but will also receive both visitors and links from those social platforms.
Spammy or untrustworthy websites tend not to have these signals.

Real Business Signals

A real business or brand will usually follow the guidelines given by Google and register a Google+ page. It is thought that Google gives high credence to people who spend time fully completing their Google+ page and interacting with similar communities and people on its own products, such as Google+ and the YouTube community.

It is natural that a business would also link out to its own website and other social accounts such as Facebook and LinkedIn, and would receive traffic (visitors) and backlinks from them to its own website; this is due to natural interactions and promotions on those platforms.
In addition to this a local business will normally want to be listed in online directories to try and entice customers. These are usually relevant to the local region or the market in which the business operates.

Spammy or untrustworthy sites don’t do any of this and don’t have these collective signals.

POINT 2: AUTHORITY

If you were ill and went down your road knocking on all of the doors asking everybody what they thought was wrong with you, everybody would give their opinion (or report you to the police). If one of those people were a doctor and you had to take advice from just one of them, who would you listen to? Obviously, the doctor.

When Google is asked a question it does the same thing.

Google collates a lot of information and organises it into relevant topics. It then assesses which of this information is likely to be true and authoritative.

It does this by comparing all of the signals it knows about a website (or webpage) with other factors, such as the kinds of websites that mention it and perhaps link to it, and how trustworthy and topically relevant those linking sites are.

It will then decide how authoritative YOU are based on all of this information; it is also worth remembering that Google carries out this exact process for every website that links to you. The better they are, the better you are.
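The idea that a site's authority depends on the authority of the sites linking to it is, at its root, what the PageRank algorithm formalises. Here is a deliberately simplified sketch — real ranking uses far more signals, and the graph, names and parameters below are toy examples of mine — showing how scores flow along links until the most-linked-to node ("the doctor" from the analogy above) comes out on top.

```python
def pagerank(links, damping=0.85, iterations=50):
    """Toy PageRank: each page repeatedly shares its score equally
    among the pages it links to, so well-linked pages accumulate authority."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new = {p: (1 - damping) / len(pages) for p in pages}
        for p, outs in links.items():
            if not outs:
                # A page with no outbound links spreads its score over everyone.
                for q in pages:
                    new[q] += damping * rank[p] / len(pages)
            else:
                for q in outs:
                    new[q] += damping * rank[p] / len(outs)
        rank = new
    return rank

# "doctor" receives links from both neighbours, so it earns the highest score.
graph = {"doctor": [], "neighbour1": ["doctor"], "neighbour2": ["doctor", "neighbour1"]}
scores = pagerank(graph)
print(max(scores, key=scores.get))  # doctor
```

Note that a link from "neighbour2" is worth less to each target than the single link from "neighbour1", because its score is split between two targets — the same intuition as a vote from a focused, trusted site counting for more.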

POINT 3: RELEVANCE

Google will always try to provide its users with the most relevant answers to the questions it is asked, and has now reported that rather than considering a single 500-word article as an authority on a topic, it looks at how topically focused websites are as a whole.

An Information page Vs Topical Chapters

Think of this like chapters within a text book, let’s say a book on DIY.

One would expect to find topically relevant chapters in this book such as how to build a garden wall or how to hang wallpaper.

Now this would be a fairly relevant book but imagine if our text book was specifically all about how to hang wallpaper.

Chapter 1 could talk about the type of tools required for wallpapering, chapter 2 could talk about safety when wallpapering, and so on. If these were websites, Google would consider these topical-relevance factors when deciding which of our two examples to present to its user; of course, it would also look at all of the other signals we have already mentioned.
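The book analogy can be sketched as a very crude coverage score: what fraction of a site's pages actually touch the topic? This is nothing like Google's real relevance model — just an illustration, with made-up page titles standing in for our two "books".

```python
def topical_coverage(site_pages, topic_terms):
    """Fraction of a site's pages that mention at least one topic term —
    a crude stand-in for 'how topically focused is this site as a whole?'."""
    terms = {t.lower() for t in topic_terms}
    hits = sum(1 for page in site_pages if terms & set(page.lower().split()))
    return hits / len(site_pages)

# Two hypothetical "books": a general DIY site and a wallpaper-only site.
diy_book = ["how to build a garden wall", "how to hang wallpaper"]
wallpaper_book = ["tools required for hanging wallpaper",
                  "safety when hanging wallpaper",
                  "choosing wallpaper paste"]
print(topical_coverage(diy_book, ["wallpaper"]))        # 0.5
print(topical_coverage(wallpaper_book, ["wallpaper"]))  # 1.0
```

The wallpaper-only site scores higher for a wallpaper query even though the general DIY site also has a relevant page — the same conclusion the chapter analogy reaches.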

POINT 4: Look at the Guys at #1

If you could replicate absolutely everything that a website had done to get to the number 1 position, it stands to reason that you would get there too.
So, in order to rank at the top of Google, that is pretty much what you need to do — BUT then go further and do it better. Understanding how it was achieved is only the first step.

Footprints

Practically everything carried out online is traceable, it leaves a footprint. Spotting the footprints left by the website at #1 and then using those same footprints to go one better than them is how you will beat them.

Outsource it

Hire an SEO company that is technically capable of reverse engineering how this was achieved. Let them do the same analysis for your own website, compare the two, and then make the adjustments required.

Of course, you could do this yourself, but remember there are hundreds of things to consider, and these people have spent years getting to know those factors inside and out. Your time could be spent doing what you do best (whatever that is) while the Search Engine Optimisation experts do what they do best.
