Benedict, the very best SEO Expert in the UK
Google's 200 variables that its algorithm checks using the Googlebot to index YOUR website

Do you want a copy of Google's algorithm?

Apr 2017 (list of variables included in the algorithm)

To best serve our clients we need a better understanding of Google's algorithm. In the past we have tested individual ideas and concepts regarding optimisation to give an idea of what works and what does not. However, this approach is unscientific.

Whilst we used to publish our findings and try to educate the other Search Agencies who use our white label services, we soon realised that the UK Search Arena is not ready for our version of SEO.

What Benedict has decided to do is recreate Google's algorithm from scratch... it is more difficult than we thought; however, we have our own beta version, which is proving accurate 90% of the time.

The first thing we need to do is copy the Internet. Whilst we have access to over 50 servers, and the storage capacity runs into hundreds of terabytes, we're not Google. All we can do is make a copy of part of the Internet, for example selecting UK websites, defined by SLD (.co.uk etc.) and by geographic IP.
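The selection step above can be sketched as a simple filter. This is a minimal illustration, assuming a host counts as "UK" when its domain ends in a UK suffix or its IP falls inside a known UK address range; the suffix list and the IP range below are illustrative placeholders, not real crawl configuration.

```python
import ipaddress

# Placeholder values for illustration only: not a real list of UK
# suffixes or UK address allocations.
UK_SUFFIXES = (".co.uk", ".org.uk", ".ac.uk", ".uk")
UK_IP_RANGES = [ipaddress.ip_network("81.0.0.0/8")]  # placeholder range

def is_uk_site(hostname: str, ip: str) -> bool:
    """Decide whether a crawled host belongs in the UK test bed."""
    if hostname.endswith(UK_SUFFIXES):
        return True
    addr = ipaddress.ip_address(ip)
    return any(addr in net for net in UK_IP_RANGES)
```

A real crawler would combine this with a geo-IP database rather than a hand-written range list.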

We have 1,000,000 almost complete websites in our database (incomplete in that images and scripts are removed, the pages compressed, etc.) as our test bed. We try to avoid websites like bbc.co.uk, with its 60 million pages, and Wikipedia, instead indexing the more manageable sites.
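The stripping-and-compressing step could look something like the following rough sketch: remove script, style and image markup, then gzip-compress what remains before storing it. A real pipeline would use a proper HTML parser; the regexes here are purely illustrative.

```python
import gzip
import re

def strip_page(html: str) -> bytes:
    """Strip scripts, styles and images, then compress for storage."""
    # Drop <script>...</script> and <style>...</style> blocks entirely.
    html = re.sub(r"(?is)<(script|style)[^>]*>.*?</\1>", "", html)
    # Drop <img> tags (images are not stored in the test bed).
    html = re.sub(r"(?is)<img[^>]*>", "", html)
    return gzip.compress(html.encode("utf-8"))
```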

As we are experienced in search, especially optimising for Google, and unpaid search in general, we already know the vast majority of the 200 to 300 variables that are included in Google's algorithm. From previous research we have a good idea of the weights that Google puts on particular variables, which variables are currently deemed the most important etc.

We decided to score variables on a scale between -100 and 100. The negative range is important so we can set up a system where, for example, if H1 is used excessively or a website is guilty of keyword stuffing, then over-optimising the website carries clear scoring detractions.
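A hypothetical illustration of that -100 to 100 idea: a variable earns positive points up to an optimal level of use, then turns negative when over-optimised. The thresholds below are invented for illustration; they are not Google's (or Benedict's) actual weights.

```python
def score_h1_usage(h1_count: int) -> int:
    """Score H1 usage on a -100..100 scale, penalising over-optimisation."""
    if h1_count == 0:
        return -10                          # missing H1: mild detraction
    if h1_count == 1:
        return 20                           # one H1 per page scores best
    # Each extra H1 erodes the score; excessive use goes firmly negative.
    return max(-100, 20 - 15 * (h1_count - 1))
```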

This has proved quite an eye-opener, as we had not factored over-zealous optimisation to be penalised quite as highly in our own client work. Another point of note is that the list of on-site "trust factors" has grown in relevance.

After copying the UK Internet and writing the algorithm for the first time, we have begun to create our own search engine which, far from looking to replace Google as an alternative in UK search, looks to emulate Google's current results.

Of course the problems begin to arise around data storage. When we index some e-commerce sites, the amount of data can become huge even with all of our page stripping. Where are we going to draw the line? Because if our test data is too small, then our algorithm is inaccurate.

If you're reading this page with interest and want to get involved, then please call Benedict on 020 8405 6418, preferably later in a working day, when we have time to talk about these things. It is one of the disadvantages of UK Internet search research that we almost find ourselves in isolation, and often have to resort to our American cousins for information and assistance.

The project is at the stage where we have written the code necessary to continue copying the UK Internet; the bots have already copied nearly 100 TB of data. What we are working on now is the algorithm. "Surrey cupcakes" is our test data choice, as it combines websites that fit all our criteria and allows us to insert our own test sites.

One of the advantages of having nine years' experience of SEO is that Benedict Sykes can list off variables, including many of the best-practice SEO techniques and, more importantly, many that are not currently utilised by your average digital agency.

However, one can imagine the difficulties that Larry Page and Sergey Brin faced when writing the algorithm: when creating any list of rules, how best to make the algorithm as accurate as possible whilst presenting the user with exactly the information they are seeking.

Fortunately, with endless dictionaries, lists of synonyms, access to endless search engine backlink checkers, and of course our own data on the 3,000-odd websites we have created, we can imitate off-page metrics (backlinks, blog posts, Facebook likes etc.), so we are in with a chance.

It is proving to be an exercise of endless tweaking. We have managed to emulate Google search results with 50% accuracy. Whilst we can get the first two or three pages of a Google search very similar, we fall down when search queries use more generic expressions.
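One simple way such an accuracy figure could be measured is overlap@k: the fraction of Google's top-k results that also appear in our engine's top-k for the same query. The metric name and the choice of k are our assumptions here, not Benedict's stated method.

```python
def overlap_at_k(google_results: list[str], our_results: list[str], k: int = 10) -> float:
    """Fraction of Google's top-k URLs that our engine also returns in its top-k."""
    google_top = set(google_results[:k])
    our_top = set(our_results[:k])
    return len(google_top & our_top) / k
```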

What slows the process, of course, is when websites appear in a Google search that we have not identified or copied.

What we hope to end up with is, of course, the complete algorithm. We realise that at the end of every research period Google will no doubt make changes.

However, that is technology, and we believe that as more and more of our algorithm proves accurate, we will be able to take a website currently listed on page 10 of Google for a keyword or key term and know, not through hunches or past research, not through ideas and concepts, but EXACTLY what that site is lacking in terms of advanced optimisation and off-page metrics in order to rank number one in Google for that search term.
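The gap analysis described above could be sketched as follows: score the current number-one site and the target site on each algorithm variable, then list the variables where the target trails, largest gap first. The variable names and scores here are invented examples, not real algorithm data.

```python
def ranking_gaps(top_site: dict[str, int], target: dict[str, int]) -> list[tuple[str, int]]:
    """Return (variable, deficit) pairs where the target scores below the #1 site."""
    gaps = [(var, score - target.get(var, 0)) for var, score in top_site.items()]
    # Keep only genuine deficits, largest first.
    return sorted((g for g in gaps if g[1] > 0), key=lambda g: -g[1])
```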

Fortunately we can recreate THE MAJORITY of external metrics: backlinks, social media etc. In order to negate the external factors that Google can take into consideration (as it has a copy of the entire Internet), we have built some 3,000 websites to work as a substitute for organic backlinks.

Why are we doing this? We already know how to translate digital data into Google-friendly data. We already know the balance in content optimisation, site structure, internal and external use of anchor text, and the creation of effective, invaluable backlinks. What we are really focusing on is efficiency.

Are we a successful SEO company because we are better at identifying and implementing higher-scoring algorithm variables than other companies, or is it the case that the work we currently do contains more positives than negatives? Perhaps the answer lies in the fact you FOUND this site and are reading this page?

"I think we are striving for the most efficient use of resources: where can we improve on what we are doing well and being rewarded for by the Googlebot, and where can we weed out work which is either algorithmically inefficient or just plain wrong?"

Free initial consultation

Everyone likes something FREE

One possible place to start is to telephone and ask for some feedback on your current SEO. The first hour is absolutely FREE. There is no charge and no obligation.

What we often produce afterwards is an analysis of: a) where you are now, b) how you got there and c) what improvements can be made moving forward. See example report or click here for more about where to make a START.

 

Latest Review

SEO reviews from Benedict

"From the very start we knew that Benedict was the right SEO company for us. Not only did they avoid all jargon, they quietly got on with the work and we watched our two websites appear from nowhere to page 1 on Google. As a business, we have been going for over a 100 years and now Benedict has taken our company to new levels. Their ideas and their execution are faultless and they always work more hours than they are contracted to. They take great pride in their work and are a joy to work with."

Moores Glass Works PLC 2018

Sunday Times Guardian Newspaper

 

About Benedict

Benedict is now a head of digital marketing business. Formed in 2003, it has grown into a creative team of 5 true digital experts who know exactly how to move your website up Google's rankings, led by Mr Sykes, a recognised authority on SEO.

Based in offices in London, Benedict now consists of an unbeatable team of 5 creatives, including blogger outreachers, content writers and seeders, and web designers and coders, who work under the direction of a single early Google adopter.

Contact

Stuart Road | Wimbledon | London | SW19 8DJ | Tel: 020 8405 6418 | sales@benedict.co.uk

Copyright 2018 Website Promotion

We research Search

We research search: we have spent 14 years reading, coding and publishing websites to gain a better understanding of Google, both past and present. More.

In the past we have tested individual ideas and concepts regarding optimisation to give an idea of what works and what does not. However, this approach is slow and unscientific. What we are looking for are definitive rules we can test on Google. See 10 new things found in Google's algorithm.