**Recently, Google announced that it is working on “artificial intelligence” with its own algorithm. This algorithm is said to be the second most important ranking factor. While many positioners have been occupied with questions about the essence of this mechanism, I asked myself: “Okay, but what is the first ranking factor?”**

In my opinion, based on six years of experience in SEO, the first ranking factor is **TrustRank.** It consists of a set of website characteristics which Google uses to determine its level of confidence in a website (TrustRank understood here as a score known only to Google, in contrast to its imitators, such as the TrustRank applied by the Majestic search engine).

Let’s say that TrustRank is like a point in a multidimensional data space, where the number of dimensions equals the number of factors the algorithm uses to determine the size of TrustRank. Assuming such factors exist and that each of them influences TrustRank to a different degree, the size of TrustRank could be expressed as:

TR = a_1x_1 + a_2x_2 + a_3x_3 + a_4x_4 + a_5x_5 + … + a_zx_z

**Where:**

- **TR** – the value of TrustRank
- **x_n** – factor number “n”
- **a_n** – the share of factor “x_n” in the total size of the final result TR
- **z** – a finite, natural number

Factors “x” can include: the TR of linking domains; the TR of the web page on which a backlink is placed; the quality of the server, manifested, for example, in a unique IP or its neighborhood; etc.
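The weighted-sum reading of the formula above can be sketched in a few lines of Python. All factor names, values, and shares below are invented for illustration; Google’s real factors and weights are unknown.

```python
# A minimal sketch of the weighted-sum TrustRank model described above.
# All factor values x_n and shares a_n are invented for illustration.

def trust_rank(factors, shares):
    """Combine factor values x_n with their shares a_n into one TR score."""
    assert len(factors) == len(shares)
    return sum(a * x for a, x in zip(shares, factors))

# Three illustrative factors: TR of linking domains, TR of the linking
# page, and server quality (unique IP / neighborhood), each scaled to 0..1.
x = [0.8, 0.6, 0.9]
a = [0.5, 0.3, 0.2]  # shares a_n, chosen here to sum to 1

print(round(trust_rank(x, a), 2))  # 0.5*0.8 + 0.3*0.6 + 0.2*0.9 = 0.76
```

The point of the sketch is only the structure: each factor contributes in proportion to its share, so two websites can reach the same TR through very different mixes of factors.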

I’ve always had the impression that changes of position in the search ranking, measured across a large number of phrases, occur in leaps and bounds. My feeling was confirmed by positioners I know who, in their talks about the issue, though rather in general terms, used words such as “increases”, “drops”, and “stagnation”.

And it is not about a given phrase moving one or two places in that ”hierarchy”, but about something more significant: an alteration of our website’s position in it (a global change).

Although this article is something of a guessing game, or even a fantasy, I would like to share my hypothesis with the wider public. I treat the following digressions as developments of the sentence: “If I were Google, I would do this.”

In the remainder of the article, when I use the phrase “a position change”, I am talking about changes of a global nature for the website.

What are the relationships between these factors in a multidimensional space? One does not need to be an expert to say that they certainly are not linear. In fact, they can be seen as so-called ”percolation thresholds” (I borrowed this physical term from graph theory; in this case it means the boundary between two different types of structure), because what we observe is more like a distinguishing border area, in other words: the point beyond which a change of position in the search engine takes place. Since presenting a multidimensional algorithm in a geometrical layout would be difficult, let’s focus on two dimensions.

**Let’s assume that we take into consideration only two factors:**

- **X1** – the quality of backlinks pointing to the website
- **X2** – the number of backlinks pointing to the website
- **PP** – the set of percolation points, as a function of x1 and x2, at which the change occurs
- **W** – the field of values at which a web page gets a higher position
- **N** – the field of values at which a web page gets a lower position
- **L1, L2, L3, L4** – websites with specific values of x1 and x2, all other determining values unchanged
- where the values of x1 for the sizes L have the property x1(L3) < x1(L2) < x1(L4) < x1(L1)
- where the values of x2 for the sizes L have the property x2(L2) < x2(L1) < x2(L4) < x2(L3)

*This coordinate system shows that, to get a higher ranking on Google, a website has to move from field N (where L2 is located) to field W; that is, to achieve such link quality (x1) and quantity (x2) that its value point falls within the range of W (point L4). This raises interesting conclusions. It is not possible to increase position with an adequate number of links that lack sufficient quality (as with L3). Likewise, with high-quality backlinks but an insufficient number of them, access to a higher level is not possible (as with L1).*
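The two-field picture can be mimicked with a toy classifier. The shape of the boundary, the minimum thresholds, and the coordinates of L1–L4 are all assumptions made up for this sketch; the only constraint respected is the ordering of x1 and x2 given above.

```python
# Toy model of the two-factor space: field W (promotion) vs. field N.
# Thresholds and the coordinates of L1..L4 are invented; only the
# ordering of x1 and x2 from the article is preserved.

def field(x1, x2, q_min=0.4, n_min=0.4, threshold=1.5):
    """Return 'W' if the point lies beyond the percolation boundary, else 'N'.

    The boundary requires a minimum quality (q_min), a minimum
    quantity (n_min), and a combined score above the threshold.
    """
    if x1 < q_min or x2 < n_min:
        return "N"
    return "W" if x1 + x2 >= threshold else "N"

# Sample pages: x1(L3) < x1(L2) < x1(L4) < x1(L1),
#               x2(L2) < x2(L1) < x2(L4) < x2(L3)
pages = {"L1": (1.2, 0.2), "L2": (0.5, 0.1), "L3": (0.2, 1.2), "L4": (0.9, 0.8)}

for name, (x1, x2) in pages.items():
    print(name, field(x1, x2))  # only L4 lands in W
```

With these invented numbers, L3 fails on quality, L1 fails on quantity, and only L4, which balances both, crosses into W, which is exactly the qualitative behavior the chart describes.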

**If I were Google and the chart above were true, then thanks to it the following phenomena could be explained:**

- the addition of a further variable (e.g. x3) changes the positions of points L1, L2, L3, and L4 in the system and the shape of the percolation curve
- a change in the slope of the percolation curve reflects a change in the impact of individual factors on ranked positions in search results
- the ratio of x1 to x2 required for a change of position does not develop linearly
- a decline in the quality of the maintained linking domains may cause drops in position, even with an increase in the number of backlinks (although not proportionally)
- a decrease in the number of backlinks can cause a drop in position, even with an increase in the quality of the remaining ones (although not proportionally)
- there is a measure of backlink quality below which it is not possible to achieve a higher position
- there is a quantity of backlinks below which a higher position cannot be reached

Google’s level of trust in a website is reflected in ranking positions, which can change abruptly.

The shape of the percolation curve is not arbitrary (physics knows models of it), and what is much more interesting is how Google might maneuver the criteria for matching a page to its algorithm in order to obtain these discrete changes of position.

**It is no secret that Google has, in recent years, significantly raised the quality criteria that links to our website must meet for increases in position to be taken into account.** For many positioners this was a novelty. Thinking in “old patterns”, many of them, after losing their current positions, continued their efforts to gain further links without worrying about their quality.

If I were Google and used the percolation curve, I would not allow an easy increase in position for new links without adequate quality.

It would be enough to shift the angle and slope of the percolation curve to remove pages which will not meet the quality criteria the next time.

**Suppose that we consider a case** in which we take into account only the previous variables x1 (quality of backlinks) and x2 (number of backlinks), which in the right proportions determine a website’s status. Over time, Google decides to put emphasis on an algorithm which takes into account a slightly different ratio of these variables for a change of position to occur, the result of which is a shift from PP1 to PP2. This situation is illustrated in the chart below:

In the chart, **functions S1 and S2 denote the change in the relationship between the quantity and quality of backlinks to a website over time**. As you can see, page S1 coped with the changes in the algorithm because it consistently pursued its “link building” policy. Page S2, despite a good starting point and a position similar to S1’s, did not cope with the new orientation of the percolation curve and does not reach the position change; it continued to pursue its “link building” strategy, but on a bad assumption.
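The shift from PP1 to PP2 can be imitated by re-weighting the same linear boundary. The weights, threshold, and the coordinates of S1 and S2 below are invented for the sketch; they only reproduce the qualitative story of the chart.

```python
# Sketch of the PP1 -> PP2 shift: the same linear boundary, re-weighted so
# that quality (x1) comes to dominate quantity (x2). All numbers invented.

def promoted(x1, x2, a_quality, a_quantity, threshold=1.0):
    """True if the weighted mix of quality and quantity crosses the boundary."""
    return a_quality * x1 + a_quantity * x2 >= threshold

# S1 kept raising link quality; S2 kept piling on links of modest quality.
s1 = (1.2, 0.9)  # (x1 quality, x2 quantity)
s2 = (0.4, 1.8)

pp1 = dict(a_quality=0.5, a_quantity=0.5)  # old curve: both factors equal
pp2 = dict(a_quality=0.8, a_quantity=0.2)  # new curve: quality dominates

for name, page in [("S1", s1), ("S2", s2)]:
    print(name, "PP1:", promoted(*page, **pp1), "PP2:", promoted(*page, **pp2))
# S1 crosses the boundary under both curves; S2 only under the old one.
```

Under these assumed numbers, S2 was on the right side of PP1 but falls short of PP2 despite its large link count, which is the “bad assumption” the chart illustrates.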

There are many consequences of the foregoing, and even more possibilities for geometric transformations. **If I were Google, I would manage my own algorithm by moving it in a multidimensional space.** I wonder what Google’s response would be, and whether the employees responsible for changes in the algorithm share this opinion.

I’m really curious: what do you think about it?

