
January 11, 2014

Google's Penguin Brings Arctic Weather To PageRank

Joshua Berg - 10:34 AM

What are Penguin & Manual Actions doing to Google PageRank? - New PR Analyzed Part 2.




▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁

[Update: 8. Has second generation Penguin gone upstream to deny link value?]
[Update: 9. Can you Disavow Penguin if you haven't received manual penalty?]

1. The PageRank TB Update, December 6, 2013 Analyzed.
2. The Toolbar PR directly affects search rankings... NOT!
3. Using the PR 1213 for analyzing link removal & disavows.
4. PageRank penalties - as visible indicator of lost trust in a domain.
5. Long before Penguin, we had manual action PageRank penalties.
6. Is the PR Toolbar beyond redemption as a reputation metric?
7. Recovering from Penguin, or Manual Actions with Eric Enge.

▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔









Penguin has brought Arctic weather to PageRank for quite some time now, putting the freeze on many formerly prosperous industrial fishing ventures (a.k.a. link schemes) which had overrun many local habitats. While this environmental phenomenon has been disastrous for some, others rejoice at the balance the happy feet bring back to the habitat's resources.

In Part 1 of this series, A Bird of Another Feather, I asked the question, "Will PageRank again become a reputable symbol of authority?" as it clearly has not been for quite some time. So here I would like to examine the details of what Google is doing, for better or worse, to change that.

  • How much have the PR Toolbar ranks changed over almost a year?
  • How does this affect, or does it not affect actual search ranking?
  • How well is Google's internal PageRank algorithm reflected in the PR Toolbar?
  • Is there any real-life use we can get out of the TBPR data?
  • Are there any better authority ranking tools than the PR Toolbar?

The first part of this series, A Bird of Another Feather, may have ruffled just a few feathers, but it conveniently provided an interesting prelude to the topic below: the disputatious nature of PageRank. While some of the related statements I opened this series with might have seemed a little controversial by themselves, here you will read about the bigger context.


It's link cleaning & not link scheming that we need to take a closer look at.






1. The PageRank TB Update, December 6, 2013 Analyzed. 

▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔








Let's start with how much the PR Toolbar ranks have changed since the last PR update, which we'll distinguish as PageRank 0213, for its 02/2013 update date. Overall the PR average has gone down considerably, for reasons I detailed in Part 1. First there's the periodic reduction of the base number for the PR Toolbar web wide, but there's also been a massive volume of links targeted by Google Penguin that have either been severely downgraded, or removed from Google's searchable index & thus also from passing PageRank.

Many of the sites Google Penguin has targeted have been "directories" of some sort, or at least similar enough in that they had high volumes of links emanating from them, which previously also passed PageRank. Many had little user value within the site itself, serving mainly as referrers, or information curators, doing the job that Google would rather be doing.

In the earlier web, directory websites ranked quite well in search, resulting in their proliferation, not only as stand-alone sites, but also as directories built into, or even actively fed into, many other websites. The conditions & purposes of many of these directory-type sites went downhill with the industrial link fishing era, & ultimately Google has filtered out much of this content, including many sites deemed bad neighborhoods. This is one of the reasons you don't want to find your backlinks listed in too many of these areas, as that can give bad signals.

For analysis purposes, I would like to show you some specifics on what the changes have looked like statistically. I was fortunate enough to have a fair database of random website toolbar PageRanks, due to bad link removal & Google Penguin recovery efforts I have worked on for clients during this past year.



So let's compare some of the changes between PageRank 0213 and the newest 1213. These statistics are based on 1,511 websites that are fairly random in nature, authority level & quality, and unsorted by topic.



You're welcome to view the full details of Joshua's PageRank 0213 Vs. 1213 Analyzed here.

Note: One should assume a reasonable margin of error in this data, but not so much as would, in my opinion, significantly change the results; the numbers have not been skewed in any particular direction. Several sites could have moved, or may no longer exist. Sub-websites, of which there were quite a few in the list, were removed, but showed similar patterns in the downgrading of "directory" sections.





  • 1,511    -    TOTAL sites tracked, random in nature
  •    814    -    Numerically did not change
  •    103    -    Gained TBPR
  •    406    -    Reduced TBPR
  •    188    -    Downgraded to "n/a" (de-indexed, or penalized)

For an approximate 14% total PR reduction amongst all tracked sites.
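As a quick sanity check on the distribution above, the category counts can be turned into percentages. This is a minimal sketch; the counts are the ones listed above, and the rounding is my own:

```python
# Breakdown of the 1,511 tracked sites from the TBPR 0213 vs. 1213 comparison.
counts = {
    "unchanged": 814,
    "gained": 103,
    "reduced": 406,
    "n/a (de-indexed or penalized)": 188,
}

total = sum(counts.values())  # 1,511 sites in total
assert total == 1511

for label, n in counts.items():
    print(f"{label:>30}: {n:>4} ({100 * n / total:.1f}%)")
```

Roughly 39% of the sites either lost visible TBPR or dropped to "n/a", while under 7% gained, which is consistent with the overall downward pressure described above.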




If you consider that this was over almost a year, during which trillions of new pages & links were created, it is quite a huge reduction of overall PR. This drop has been noticed & confirmed by others as well, such as this article in SearchEngineWatch: Surprise! Google Updates PageRank...


“We looked at hundreds of sites and 90 percent dropped,” says Dave Naylor of Bronco. “We’ve not seen many gain PageRank in big leaps this time. We saw PR6s drop to PR1s, but not many PR1s rise to a PR6.”


On to the topic of how we can actually use this information. But before I continue, I'll make clear a few important details & give a few disclaimers for those who need them...





2. The Toolbar PR directly affects search rankings... NOT!

▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔







Google's "internal version of PageRank" Vs. Toolbar PageRank:


In this particular series, when I talk about PageRank I am often referring to the PageRank Toolbar (also referred to as TBPR). Although I will usually specify which one I am referring to, if you don't know the difference, it is important to make that distinction. From Part 1 of this series I'll repeat Matt Cutts' own explanation of it, in case you didn't catch it the first time:

"We have very fine-grained notions of PageRank within Google. Outside of Google, PageRank is truncated to 10 levels that are visible in the Google Toolbar. And we typically say, take our opinion about the PageRank or the reputation of each page."

And from Matt Cutts' 2013 Pubcon talk:

"We have our own internal version of PageRank, it's always updating, it's continuous and continual, and every single day we have new PageRanks. Then there's also an export that says, "OK, given our internal PageRanks, export that to the Google Toolbar. And normally it runs once every 3 months, or so, maybe every 3, or 4 months."

So now you've heard me say it again specifically, but anyone who's been following my writing on this subject already knows it's been explained extensively.




The PageRank Toolbar does NOT affect search ranking at all:


Yes, it's true. As only an "export" of Google's "own internal version of PageRank," it is safe to say the PR Toolbar does not affect your ranking in search any more than a calendar affects your aging.

So if you think your PR 5 website will always rank better than your competitors' PR 4, or PR 3, then you may need to broaden your understanding of Google's search algorithms overall.

There are a whole range of factors that are more important than simply chalking up x amount of links from x number of PR x websites. This kind of direct link counting to increase ranking is very 2004; algorithms can now see much more behind link footprints & understand patterns far better than they ever did before.

Many of these have grown in importance for quite some time now, probably none more than the complete rebuild known as Google Hummingbird, which recently brought Semantic Search to the fore, & the accompanying Knowledge Graph, which is growing exponentially. Then there's the progression of social signals (coming from authorities in significant numbers), which means learning from collective preferences en masse using social media.

The increased understanding & processing power of the modern algorithms even applies to discussions specifically about link building. For example, I will refer to the exceptional research in this MOZ article, a step ahead of its time, by +Russ Jones, June 22, 2011: The Wikipedia Model.

With some key concepts that include...

  • Building the ideal link model - looking at Wikipedia's natural link profile.
  • Link proximity - the obvious one to look at. How paid, or spam, links tend to lump together.
  • Source link depth analysis - are too many homepage links link graph manipulation?
  • Domain links per page analysis - finding patterns of link manipulation.
  • Running multiple links analysis - analysis vs. competitors.
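One of those checks, domain links per page analysis, is simple to sketch: count how many links a referring domain sends per unique page, since unnaturally high ratios tend to stand out. The data and the threshold below are entirely my own illustration, not figures from the article:

```python
from collections import defaultdict

def links_per_page(backlinks):
    """backlinks: list of (source_domain, source_page_url) tuples.

    Returns, per referring domain, the average number of links it
    sends per unique page.
    """
    pages = defaultdict(set)
    links = defaultdict(int)
    for domain, page in backlinks:
        pages[domain].add(page)
        links[domain] += 1
    return {d: links[d] / len(pages[d]) for d in links}

backlinks = [
    ("example-blog.com", "/post-1"),
    ("example-blog.com", "/post-2"),
    # A link-farm style page carrying several links to the same target:
    ("spammy-directory.net", "/page-a"),
    ("spammy-directory.net", "/page-a"),
    ("spammy-directory.net", "/page-a"),
]

for domain, ratio in links_per_page(backlinks).items():
    flag = "  <- check for manipulation" if ratio > 2 else ""
    print(f"{domain}: {ratio:.1f} links/page{flag}")
```

In a real audit the threshold would be set relative to competitors' profiles, as the "running multiple links analysis" point above suggests, rather than a fixed number.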





Beware The Perturbable PageRank:


Of the many SEO topics to write about, PageRank is one of the most controversial & not without cause. Some of the reasons the PageRank topic tends to either attract, or agitate include:

  • The topic of PageRank has been mostly associated with the arguably unpopular PR Toolbar.
  • The quest for high TBPR has been one of the most abused of the SEO manipulation schemes.
  • Many who've been around in this business long enough have been burned by, or on, this topic.
  • It is disheartening to see TBPR drop on your websites after months of diligent work.
  • We've all lost clients who were more concerned about seeing TBPR go up, than real SEO.
  • The visible 1-10 PR scale is too coarse to effectively distinguish between billions of sites & themes.
  • The high volume of links to mega sites means small sites seldom show TBPR gains over drops.
  • Focus on TBPR has been a hallmark of cheap SEOs & wannabe consultants.
  • I'm sure we could go on with this list, but lastly I'll say that after almost a year of no TBPR updates, the uninvited TBPR 1213 appeared with almost across-the-board rank reductions, arguably making it one of the most unpopular updates ever.




There is nothing new under the sun:


"You're misleading readers because you left out the basics." While writing in-depth about new analysis, or ideas, I'll often leave off explaining many already well known basics. One can assume that with topics as broad as this, there will be related aspects, or basiscs not included.


"We've heard it all before, this is totally basic." Sometimes writing about basics elicits the "we've heard it all before," or "I already said that" responses. And that's fine, maybe somebody else hasn't read this & this is for them.

This author does not seek the holy grail of palatable concepts. I write to provide my analysis & perspective on topics I have great interest in, and I am happy to know & anticipate that even in the same gaggle there will always be a variety of differing views.


Birds of a feather never always flock together.




3. Using the PR 1213 for analyzing link removal & disavows.

▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔



Valuable insight into Google's Penguin cleanups has been provided through another PR update. This I see as one of the biggest practical values in Google continuing to release PR updates, because this data is especially useful as a confirmatory signal in the link removal & cleanup processes.


I believe Google has ingeniously devised a means by which the PR Toolbar can assist in cleaning up a lot of the mess it contributed to in the first place. After all, it was the attractive TBPR numbers that drew the link fishers in the first place, & by turning the tables on low quality content, those who are attuned to the quality of their interweb associations may start falling over themselves to get away.


This is no secret, Matt Cutts has specifically referenced the downgrading of PageRank on link scheming websites on numerous occasions. It is a pattern we don't see decreasing & if anything, I believe we'll see it increase considerably in the near future. It is not hard to observe that there is still monumental room for improvement on spammy link removals & downgrades amongst billions of websites. If anything, I'm of the personal opinion that the cleanup is still just getting started.



I believe Google has ingeniously devised a means by which the PR Toolbar can assist in cleaning up a lot of the mess it contributed to in the first place.



The value the TBPR updates have brought me, when working with clients to decide which types of links, or which neighborhoods, I need to request backlink removals and/or disavows from, is that they have been especially useful in confirming some of those final decisions. That is what you will see in this analysis from one of my active Google Penguin cleanup processes: PageRank 0213 Vs. 1213 Analyzed here.

In this process, it is not so much the particular rank of any one site that is most useful; it is the direction it is moving & especially whether there are any abnormal, or especially drastic, observable downgrades. This is also why, for the most part, it is only as useful as the TBPR updates that are provided.


That said, I absolutely would not rely solely on the TBPR of any site in deciding whether or not to disavow it, or request backlink removals from it. There are a whole range of symptoms, from a variety of sources, that we would typically look at & can do so with a variety of tools for this process.

As imperfect as the TBPR is as a standalone signal out of all these below, it is the only one provided specifically & explicitly by Google that is known to also be affected by both Penguin & manual action reviews.

Now here is a list of the many signals we would want to consider, in which I've highlighted the ones specifically affected by TBPR in green. While I have edited, or explained some of these signals, the use of these is not at all my own idea, rather these are used by a majority of tools & entities in this process.



  1. If the backlink is from a very weak domain - could just be a new domain (check against age), or if penalized.
  2. On a page with multiple weak, or penalized backlinks - check for possibility of being a bad link network.
  3. Link from a page with no PageRank, but with incoming links - only useful relative to recent TBPR update.
  4. Domain is quite aged, but homepage still has no PageRank - is not a good sign absent other positives.
  5. Domain has been removed from Google index - could be penalized, or robots.txt, Meta issues, check age.
  6. On a page that clearly doesn't rank reasonably well for its own non-generic title.
  7. Link is coming from a spammy automated directory type page. - usually many links to low quality sites.
  8. Domain or its theme is known to be harmful - may have malware reports, or a history of malicious behavior.
  9. Domain's theme is suspicious - i.e. hacking, blackhat, porn, gambling, potential bad neighborhoods.
  10. Domain has the same Registrant as known link networks - useful only if not a common proxy registrant.
  11. Domain has the same IP address as known link networks.
  12. Domain has the same DNS of known link networks - but also check against shared hosting.
  13. Domain has many incoming links from profiles matching other link networks.
  14. Domain's link growth has steadily dropped - may be an owner abandoned domain.
  15. Domain's site footprint matches common linking domains - check for possible link networks.
  16. Google Analytics code matches known link networks - check against other negative patterns.
  17. Google AdSense ID matches known link networks - strong indicator, check against other patterns.
  18. Link is sitewide - mainly in known negative patterns such as footers where paid links would usually be.
  19. Links from article directories - especially if known to be low quality, with the exception of very few editorialized.
  20. Links from pages, or sets of pages, with abnormally high quantities of outbound links.
  21. Links coming from link voting type directories - except for the few reputable, editorialized, well managed.
  22. Links from banned sites that have lost all rankings - check for reasons and other negative patterns.
  23. URL matches a pattern of common spammy, or automated URL links.
  24. If URL comes up on known blacklists - usually neighborhoods to be avoided.
  25. Hidden image links also found on the same page - check for unnatural image links on the page.
  26. Excessive keyword rich anchor text - it's very bad to have too high a ratio of these in your backlinks.
  27. Keyword anchor text links from forums known to be spammy, outdated & hacked.
  28. Placement of anchor text is suspicious - matches negative patterns of paid, or spam networks.
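No single signal on this list is decisive on its own; in practice they get combined into a rough triage score. Here is a minimal sketch of that idea; the weights, names & threshold are entirely my own illustration, not values used by any real tool:

```python
# Hypothetical weights for a few of the signals above (numbering follows
# the list): higher total = stronger candidate for removal/disavow.
SIGNAL_WEIGHTS = {
    "weak_or_penalized_domain": 2,       # signal 1
    "deindexed_domain": 3,               # signal 5
    "spammy_directory_page": 2,          # signal 7
    "shared_registrant_with_network": 2, # signal 10
    "sitewide_footer_link": 2,           # signal 18
    "keyword_rich_anchor": 1,            # signal 26
}

DISAVOW_THRESHOLD = 4  # arbitrary illustration

def triage(backlink_signals):
    """backlink_signals: set of signal names observed for one backlink.

    Returns the combined score and whether it crosses the (illustrative)
    disavow threshold.
    """
    score = sum(SIGNAL_WEIGHTS.get(s, 0) for s in backlink_signals)
    return score, score >= DISAVOW_THRESHOLD

score, disavow = triage({"spammy_directory_page", "sitewide_footer_link"})
print(score, disavow)  # -> 4 True
```

The point of weighting rather than hard rules is exactly the one made above: a single weak signal (say, one keyword-rich anchor) should not trigger a disavow, while several corroborating signals together should.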




4. PageRank penalties - as visible indicator of lost trust in a domain.

▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔



What happens when trust is lost in a domain for violation of Google's Guidelines on Link schemes?



"Any links intended to manipulate PageRank or a site's ranking in search may be considered part of a link scheme and a violation of Google’s Webmaster Guidelines."


Matt Cutts has described a number of types of manual action penalties for this, applied in varying degrees & for various lengths of time, depending on the severity of the infraction. They may include a percentage reduction of the site's PageRank, a ban on the site's ability to pass PageRank to other sites &, in some cases, functional removal from the active index. These manual actions taken against sites for attempting to artificially increase their, or other sites', PageRank appear to mostly result in actions against the site's PageRank.


"If we see that happening multiple times, then the actions that we take get more and more severe. So we're more willing to take stronger actions whenever we see repeat violations."
How can a site recover from a period of spamming links?


I find this particular PageRank topic especially interesting because Matt Cutts is describing a specific usage of the PR Toolbar and/or PageRank itself, as a trust indicator outside of how we usually think of PR as a specific (19, 101, 555, 3000) link counting type algorithm.

On numerous occasions Matt Cutts has mentioned reductions of PageRank authority for selling & even buying links. Note that Matt Cutts does not make a distinction here between the penalty being applied to Google's internal PageRank, which would also affect the site's search ranking position & be reflected in the TBPR, or exclusively to the visible TBPR. In that regard, for this topic of PageRank penalties, I am referring simply to PageRank, which at some unknown level will be reflected in the TBPR.


Watching this video it is not hard to conclude that this...

Manual action   =   PageRank penalties







“Normally what happens is when we find a site that’s selling links, we say, ‘Okay, this is a link seller.’ It’s PageRank goes down by thirty percent, forty percent, fifty percent as a visible indicator that we’ve lost trust in that domain, and it typically loses its ability to send PageRank going forward.”


It appears that what is being described here is a manual reduction of PageRank by 30-50%. That number is probably not arbitrary, anything less for most sites may not become visible in the TBPR.

The next question might be: is this applied as an actual internal PageRank reduction, designed both to reduce search ranking position & to be reflected in the TBPR as a deterrent, or is it only a manually applied TBPR reduction? In the latter case the domain would also need to have a search ranking penalty applied separately, which in my opinion is not the case.
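The claim that anything less than a 30% cut may not become visible follows from the toolbar's coarse scale. If you assume, purely for illustration, that toolbar values sit on a logarithmic scale (the base here is my own hypothetical), a percentage cut in the internal score translates to only a fraction of a toolbar step:

```python
import math

def toolbar_steps_lost(fraction_cut, base=8.0):
    """How many 0-10 toolbar steps a given internal-score cut is worth,
    assuming (hypothetically) one toolbar step per `base`-fold change."""
    return -math.log(1 - fraction_cut, base)

for cut in (0.10, 0.30, 0.50):
    print(f"{cut:.0%} internal cut ~= {toolbar_steps_lost(cut):.2f} toolbar steps")
```

Under this assumption even a 50% cut is only about a third of one visible step, so a penalty much smaller than the 30-50% range Matt describes would often not show up in the toolbar at all.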


So there are two types of manual action PageRank link penalties Matt mentions here:


#1. A percentage reduction of the offending website's own PageRank authority.
"It’s PageRank goes down by thirty percent, forty percent, fifty percent as a visible indicator that we’ve lost trust in that domain."

#2. A ban on the offending site's overall ability to pass PageRank to other sites.
 "and it typically loses its ability to send PageRank going forward.”




The first is known to be reevaluated by Google after it expires, either in 30 days, weeks, or months, depending on the severity of the offense. Regarding the second Matt says, "we don't trust it anymore," but does not refer to a time period in which that might expire.


A common misconception about PR is that it exclusively counts links, regardless of where they originated & in what quantities. A thousand PR 5 backlinks from the same site, do not count as a thousand PR 5 backlinks.
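The point about a thousand links from one site falls straight out of the original PageRank formula: each page divides its vote among the pages it links to, so repeated links from the same small pool of pages add very little. A minimal power-iteration sketch (the toy graph is my own):

```python
def pagerank(links, d=0.85, iterations=50):
    """links: {page: [pages it links to]}; returns PageRank scores.

    Classic formula: PR(p) = (1-d)/N + d * sum(PR(q)/outdegree(q))
    over every page q linking to p. A page's vote is split by its
    outdegree, so a link from q is worth PR(q)/outdegree(q), not PR(q),
    and duplicate links from the same page are counted once here.
    """
    pages = list(links)
    n = len(pages)
    pr = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new = {}
        for p in pages:
            inbound = sum(pr[q] / len(links[q]) for q in pages if p in links[q])
            new[p] = (1 - d) / n + d * inbound
        pr = new
    return pr

# "target" is linked from two distinct pages; linking to it a thousand
# times from "a" alone would not change a's single divided vote.
links = {"a": ["target"], "b": ["target"], "target": ["a"]}
scores = pagerank(links)
print(max(scores, key=scores.get))  # -> target
```

This is why link counting alone was never the whole story: where the links come from, and how many other links those pages carry, matters as much as the raw count.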



5. Long before Penguin, we had manual action PageRank penalties.

▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔






Google manually applying PageRank penalties for link selling has been around since long before Google Penguin began taking over a lot of the cleanup process algorithmically, & has been documented for well over 10 years. Yale's LawMeme reported on the unusual case, PageRank by Judicial Decree? SearchKing Sues Google, in October 2002.

The lawsuit alleged that changes made by Google to its algorithms reduced the PageRank ratings Google assigned to pages within the "SearchKing Network" and that this reduction was an illegal interference with SearchKing's business. In addition to asking for damages, SearchKing filed a motion with its complaint asking for a preliminary injunction to force Google to restore its PageRanks. 

In a rather interesting opinion, U.S. District Court Judge Vicki Miles-LaGrange dismissed the case on the grounds that Google's formula for calculating the popularity of a Web page, or "PageRank," constitutes opinion protected by the First Amendment. Search King vs. Google Technology, decided May 27, 2003.



"The court simply finds there is no conceivable way to prove that the relative significance assigned to a given Web site is false. Accordingly, the court concludes Google's PageRanks are entitled to full constitutional protection."




+Danny Sullivan also wrote this excellent explanation on PageRank penalties & some controversy at the time, in October of 2007. Official: Selling Paid Links Can Hurt Your PageRank Or Rankings On Google.

If spotted, in most cases all Google would do is prevent links from a site or pages in a site from passing PageRank. Now that’s changing. If you sell links, Google might indeed penalize your site plus drop the PageRank score that shows for it.

He goes on to describe how the PageRank penalty comes from a human review, which Google now calls a Manual Action.

Google stressed, by the way, that the current set of PageRank decreases is not assigned completely automatically; the majority of these decreases happened after a human review. That should help prevent false matches from happening so easily.




6. Is the PR Toolbar beyond redemption as a reputation metric?

▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔




While some writers have chosen to ignore the topic of PageRank almost altogether, Google representatives still talk about it frequently. You only have to watch the last year of Matt Cutts' videos to see he has referred to it more regularly than to many other equally important topics. Probably because, as head of Google's Webspam Team, this is a major area to police.

The Webspam Team has long worked to purge authority passing link schemes in a concerted effort to refine the quality of PageRank & ultimately to improve the end user's search results.

But has all this effort improved the perception of PageRank as Google's reputation metric, as Matt Cutts has repeatedly framed it?



"PageRank is Google's opinion about how reputable a page is." Matt Cutts



I can hear a chorus of the best & brightest chuckling under their breath at this one. And I would not disagree with them. In my opinion there's still a monumental way to go on this & I wouldn't hold my breath waiting to see that negative impression dissipate anytime soon. In the long term, though, I believe Google would like to see it become that reputable symbol again & I wishfully hope that it could.




The irreconcilable limitations of the PR Toolbar?


In my opinion, for the PR Toolbar to become a reputable symbol of authority, there are big limitations to be overcome. Can they be, will they be, or is it a cause too far gone? I for one can't answer that, but here are some limitations as I see them.

  • The TBPR as a 1-10 metric is far too limited, but that also has its upsides.
  • At the speed the Internet grows, most sites seldom show a change.
  • Without an official explanation besides a number, the interpretations will & do vary widely.
  • Schemes will doubtless continue to be in abundance & the TBPR data will be used for them.
  • When data is abused by some, this damages the metric as a reputation signal for all.
  • The links used to calculate PR will doubtfully ever reach a pure level as trust citations.



Are there any better authority ranking tools than the PR Toolbar?


Thankfully there are third party alternatives that work very well for evaluating link metrics & patterns. One of the most useful (in my opinion) & widely used of these ranking systems is MOZ Rank, widely known through its popular tool Open Site Explorer; see What is MOZ Rank here. Without doubt MOZ Rank has built a loyal following & has been around long enough, and worked well enough, to acquire an enviable reputation.






7. Recovering from Penguin, or Manual Actions with Eric Enge.

▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔



Backlink cleanup & recovery is a perfect way to end this story, & Eric Enge, with whom I had the pleasure of doing a Google+ Hangout On Air show two months ago, is especially knowledgeable on this topic.

So I'm including the hour-long interview below, but to save time I've transcribed just the key excerpts which are most pertinent to this topic. The interview is specifically about the backlink cleanup & disavow processes; it does not include coverage of the TBPR topic.


Google Panda & Penguin Discovery & Recovery - REALSMO Hangouts,

Excerpts of my recent interview with +Eric Enge on Nov 14, 2013


Eric Enge:
"Google is not trying to get every single site properly ranked, that's not really their objective. They're managing hundreds of millions, or billions of websites. They're just trying to make their overall service improved. So they will implement algorithms & see that the overall service improves."

"For Google it's still a good algorithm change, because their overall search quality went up."

8:35 - Difference between Panda & Penguin:

Eric Enge:
"Panda is really more focused on content type stuff. And Penguin is more focused on link type stuff."

"So with Penguin the big things are article directories, directories (too many of them that is, cheap ones), too much rich anchor text pointing to your pages."

Joshua Berg:
"That diagnosis is very helpful to understand the first thing you need to concentrate on, but overall I think if you understand the direction Google is going, that you'll be moving that way on all of these things that will help your website, whether it's Panda, or Penguin, or the EMD Update, or Google Hummingbird."

So moving us forward, getting rid of all the link schemes & the old fashioned SEO keyword stuffing & low quality content problems, that were red flags.








28:48 - Before Penguin it was manual reviews:

Joshua Berg:
Let's talk about Google Penguin. What were the Internet-shattering consequences that came along with Penguin, & the tidal wave of change along with that? What was that, Eric?

Eric Enge:
Yeah, that was a big shock, you're absolutely right. It was April 24th, 2012 & a lot of people got hit.

What people don't always remember is, prior to that, Google started to do a lot more unnatural links messages & warnings in Webmaster Tools & this was kind of the forewarning to the whole thing. And of course those are still happening a lot today, people are still getting those messages.

What those were about, those were cases where some manual review was done by Google & they flagged that somebody was doing link building practices they didn't like. But Penguin was the first time they successfully automated a lot of this link detection. And technically speaking in Google's mind, Penguin isn't a penalty, it's just an algorithm, it's a ranking algorithm.


28:48 - Directory backlinks dragging a site down:

Eric Enge:
Remember you were talking about some people getting hit that really didn't deserve it? I'll just tell a story of one site, a good friend of mine running a site. It was in fairly early stages & it got hit by the Penguin algorithm & he was shocked, because he didn't do as far as he knew, any bad link building stuff, it was all very organic.

But I did some digging & I was fortunate enough to get a little bit of Matt Cutts time at a conference on it & I got him to comment on it. Turned out that this site had 6 links, only 6, from article directories & that was the sole cause of the site being hit by Penguin. And the amazing thing about this, yes he submitted some content to article directories, no money changed hands.

Remember I said that Google is not trying to rank every site perfectly? And so they implement algorithms that improve their search results overall & what they found is that if they looked at all the sites that did any article directory work & they lowered rankings for them, search quality went up. And that's probably because a lot of those sites did a lot of other bad link building practices also, but in the case of my friend's site, all he did was submit 6 articles to article directories & he got hit.

I actually helped him out with it, we removed those links & it took a while, but the site recovered fully from just removing the 6 article directory links.


33:20 - Make a sincere effort to get links removed, especially with manual actions:


Eric Enge:
All of this happened before Disavow was around, because we didn't have Disavow when Penguin first came out. So he attempted to get the links removed & in fact, we weren't able to get the site to recover fully, until Disavow was available, for exactly the reason you're talking about, that it was difficult to get the links removed.

This brings up another very good point about Penguin, Google really wants you to make a sincere effort to get the links removed. Particularly in the case of the manual penalties, it does happen that people will recover just through Disavow, but in my experience the success is much higher, even if you get 10, or 20% actually removed & Disavow the rest.

Certainly in the case of a manual penalty, an unnatural links penalty in Webmaster Tools, it makes a big difference, because there's a human being on the other end who receives your Reconsideration Request.


"Sometimes people think that Disavow is the be all and end all, the panacea that's going to cure all their ills. We do want, if you've been doing some bad SEO and you're trying to cure it, in an ideal world you would actually clean up as many links as you can off the actual web." - Matt Cutts

You know they wake up every day trying to protect Google's honor & the quality of their search results. You did something they didn't like, so you're kind of looked at as a cheater to begin with. They may have had a fight with their spouse that morning, they may be in a bad mood. And now they get your reconsideration request & they see you slopped a few links into a Disavow file, didn't put much work into it. And they might not grant your request.

But if you put in that extra effort, we find that we get a much higher recovery rate.

35:28 - Reconsideration requests do not apply to Penguin:

I should clarify since I brought up this notion of reconsideration requests. Reconsideration requests do not apply to Penguin. Penguin is an algorithm, a reconsideration request doesn't help you there. With Penguin, you have to try to Disavow, or remove the links & then you wait.

36:26 - You do not get a manual action notice for Penguin:

Eric Enge:
Webmaster Tools unnatural links messages mean you get a manual penalty, and reconsideration requests do go with that, but that's not Penguin. You don't get a message with Penguin, it just happens. You wake up & you have a bad day, because your traffic is way down.

Then you look it up like I described earlier: you go to the Google algorithm update history & you see that an algo update rolled out on the very same date your traffic dropped. With Penguin, all you can do is take care of the bad links & wait.
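That diagnostic step, lining up your traffic drop against the public update history, can be sketched in a few lines. A minimal sketch (my own illustration, not a Google tool), using the confirmed Penguin dates listed later in this article and assuming you already know the date your traffic fell:

```python
from datetime import date

# Confirmed Penguin update dates, from the list in this article.
PENGUIN_UPDATES = {
    date(2012, 4, 24): "Penguin 1.0",
    date(2012, 5, 26): "Penguin 1.1",
    date(2012, 10, 5): "Penguin 1.2",
    date(2013, 5, 22): "Penguin 2.0",
    date(2013, 10, 4): "Penguin 2.1",
}

def match_drop_to_update(drop_date, tolerance_days=2):
    """Return the name of a known update within tolerance_days of a traffic drop, else None."""
    for update_date, name in PENGUIN_UPDATES.items():
        if abs((drop_date - update_date).days) <= tolerance_days:
            return name
    return None

print(match_drop_to_update(date(2013, 10, 5)))  # → Penguin 2.1
print(match_drop_to_update(date(2013, 1, 1)))   # → None
```

A small tolerance window is used because analytics tools report daily totals and rollouts take a day or two to show up in your traffic.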


38:36 - The directory links:
Eric Enge:
I have a rule of thumb that I give people about traditional directories. I'm not talking about article directories & to be clear, I'm not talking about local business directories either. I'm talking about Yahoo, DMOZ, BestOfTheWeb, Business.com. A few of those directories are OK, but there's a lot of people who've done a lot of link building by going to hundreds of these directories.

And the rule of thumb I give people is: no more than seven. It's an arbitrary rule, and you could argue with me & say, "maybe it's 10 for this business & 5 for this business." But to keep it simple, just don't ever get more than 7 directory links. Even if you could argue that a particular directory, which would be the 8th one, might not be bad, you'd probably spend a lot of time finding it, and it's not worth the effort at that point. The incremental value after Yahoo & DMOZ, and things like that, starts to decline really rapidly.


40:04 - Comment spam & forum links:

Eric Enge:
The other thing that's worth "commenting on" (some pun intended) is forum comments & blog comments & dropping links in there, or profiles on forums. This is also bad news. And like you said, Joshua, we would spend a minute in Webmaster Tools & you start seeing there are all the comment links. And it's like, no, no, no, you have a problem.

Joshua Berg:
And if they used XRumer, then it's probably impossible to get rid of them. We call that comment spam as well, right.

Eric Enge:
Yeah, the comment spam is another clear thing.

40:04 - Manual actions generally include things Penguin can't find:

Eric Enge:
It's kind of interesting to talk about the difference between Penguin & manual penalties.

Penguin is a thing they can do algorithmically, & then the penalties in Webmaster Tools, which are the manual ones, generally speaking include things that the algorithm can't find. They may have been sparked by some sort of human review; maybe your competitor complained about your site & said you were doing all kinds of spammy things & getting away with it.

Maybe there's a different algorithm that flags a potential problem, but they're not confident enough in the algorithm output to say they should attach a Penguin ranking adjustment to it. So they just say, "OK, this is a possible trouble site," & send it on to a team for human review.


44:05 - Disavowing entire domain if you find a bad link:

Eric Enge:
Key tip: if all your reports show just one bad link from a domain, when you get to the disavow process I would still disavow all links from that domain. Because the tools we have available to us (Webmaster Tools, Majestic SEO, Ahrefs, Open Site Explorer) cumulatively still don't tell you the whole link story. You're only getting a partial map of all the links: they may say you have only one link from a given domain, you may judge it's bad & you may discount only that one link. And it turns out there's another link from the same domain that's also bad, which didn't come through in any of the reports you got.

So that's why it's so important that when you're gonna disavow a link from a domain, you should probably just disavow the entire domain, so you don't have that problem delay your recovery.
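For reference, the disavow file itself is just a plain-text (UTF-8) list uploaded through Webmaster Tools: one URL or domain per line, with `#` lines treated as comments. A minimal sketch applying Eric's whole-domain tip (the domain names here are made up for illustration):

```text
# Requested link removal Jan 2, 2014; no reply from site owners.
# Per the tip above, disavow the whole domain rather than a single URL.
domain:spammy-directory-example.com
domain:xrumer-forum-example.net

# A single URL can still be listed when only that one page is a problem.
http://blog-example.org/2012/04/paid-links-post.html
```

The `domain:` prefix is what catches the extra links your reports never surfaced; a bare URL line disavows only that exact page.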







45:16 - After recovery, a site does not return to its previous ranking:

Eric Enge:
The other thing is, when we recover from a link penalty, you aren't gonna go all the way back to where you were before...


The following is also a good follow-up on this cleanup process:
Eric Enge's The Digital Marketing Excellence Show, with Glenn Gabe & Jenny Halasz
Google Penguin, Diagnosis, Recovery, and Pitfalls Oct 24, 2013



As explained, it is important to start building good links as well. So to end with, here's a good video from Matt Cutts on some useful ways to build organic links, of course starting with good content.


What are some effective techniques for building links?









8. Has second generation Penguin gone upstream to deny link value?

▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔



After releasing this article 2 days ago, I was thinking again about how so many sites' PageRank had dropped over the last year, and yet I don't recall seeing such drastic TBPR reductions with earlier Penguin releases.



Following are all 5 Penguin Updates specifically confirmed by Google:

  1. Google Penguin 1.0    Apr 24, 2012 (impacted around 3.1% of queries)
  2. Google Penguin 1.1    May 26, 2012 (impacted less than 0.1%)
  3. Google Penguin 1.2    Oct 05, 2012 (impacted around 0.3% of queries)
  4. Google Penguin 2.0    May 22, 2013 (impacted 2.3% of queries)
  5. Google Penguin 2.1    Oct 04, 2013 (impacted around 1.0% of queries)



Note: Some industry influencers refer to them as versions by order of their release, because prior to 2.0 Google had never given a Penguin update an actual version number. Danny Sullivan explains in his Penguin 5, With The Penguin 2.1 Spam-Filtering Algorithm, Is Now Live (Oct 4, 2013) that...



"when Penguin 4 arrived, Google really wanted to stress that it was using what it deemed to be a major, next-generation change in how Penguin works. So, Google called it Penguin 2."



Were these exceptional reductions just a direct result of the increased effectiveness of what Google has referred to as their 2nd generation of Penguin algorithms?

Was going after more top-tier sites with these reductions part of the more aggressive & effective techniques that Matt Cutts was referring to on May 13, 2013?

Barry Schwartz described the situation at the time in Google: More Sophisticated Link Analysis & Link Devaluing In Works May 15, 2013. And here's what Matt Cutts specifically said...



We're also looking at some ways to go upstream to deny the value to link spammers, some people who spam links in various ways. We've got some nice ideas on trying to make sure that that becomes less effective, and so we expect that that will roll out over the next few months as well.



However, Matt Cutts did not say that this going "upstream to deny the value to link spammers" was going to become part of the second-generation Penguin specifically. Apparently he has commented that it was not part of Penguin 2.0, but I still wondered if it was attached to the more aggressive Penguin 2.1, released 4 months later.


So just yesterday, at 19:26 in the Google Webmaster Central office-hours hangout, I asked Google representative John Mueller, Webmaster Trends Analyst, who frequently hosts these valuable & informative hangouts, if he knew whether this upstream devaluation had already come out with a Penguin update.

By the way, I highly recommend watching his Google Webmaster hangouts. I try to join them often; he's answered lots of our questions & to me they've been surprisingly candid & informative.







Joshua Berg:

On May 13, Matt Cutts said, "We're looking at some ways to go upstream to deny value to link spammers... We expect that will roll out over the next few months."

Is this one out yet? Was it some of the PR manual action penalties, or with a Penguin Update?

John Mueller:
I took a quick look at where that came from & I think it came from one of his videos. And I think 2 weeks or so after that video he actually did a blog post announcing Penguin 2.0 and referring to that video specifically. So I imagine that's what he was pointing at there.



I don't know if it was that head scratch, or all the "I thinks," but while he confirmed he had the same impression of those videos as I did, it still wasn't exactly the definitive answer I was looking for. But then, Google reps' answers relating specifically to algorithms often aren't. It can't hurt to ask, right? :D




9. Can you Disavow Penguin if you haven't received manual penalty?

▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔



In case this isn't obvious already, I'd like to confirm, source & clarify this last point. If you can disavow links to address Manual Actions, which are specifically known to include a human-review PageRank penalty, then you should also be able to effectively disavow for Penguin-related rank reductions, if these also include (as I've described) algorithmic PageRank reductions.



And I can't think of a more detailed & comprehensive Disavow Tool experiment than Disavowed: Secrets of Google's Most Mysterious Tool, by +Cyrus Shepard on May 28, 2013.

Here are a few excerpts where he asks the question...



Can You Use Disavow for Penguin?

"Can you use the Disavow Tool if you haven't received a manually penalty? For example, will it work for Penguin?"


"Google representatives, including Matt Cutts, have gone on record to say the Disavow Tool could be used to help if you’ve been hit by Penguin (an algorithmic action), but also suggests that this applies to links that also violate Google’s Quality Guidelines."

"Penguin and Google’s Unnatural Link Warnings often go hand in hand. So if you were hit by one, you are often hit by the other. Conversely, certain SEOs have claimed benefits from using the disavow on sites that were not penalized."



Apparently some were not convinced, believing the manually applied lowering of PageRank had nothing to do with the Penguin algorithm, so the disavow tool would not be useful for it. But a few weeks later Matt Cutts cleared that up with the following...






On the topic of Cyrus Shepard's above-referenced experiment with the Disavow Tool & Penguin, he also asked the question in If You Disavow Links Are They Gone Forever?


Because the initial drop happened during a Penguin update, many SEOs theorized that rankings wouldn’t return until the next Penguin update.

So I waited. And I waited.

...we finally saw an update on October 4, 2013

Since then, SEOs around the industry have been asking me if the site has recovered.

Aside from a slight bump from publishing a new post, organic search traffic never recovered after the latest Penguin update.

Some folks have asked if it’s possible if I wasn’t just hit by a Penguin update. While it is entirely possible, I consider it unlikely for the following reasons:

  • The link profile to this site was high quality
  • No known negative SEO
  • No unnatural link notices in Google Webmaster Tools
  • Other agencies have reported similar drops after preemptively using the Disavow Tool

If the site was simply hit by Penguin, it doesn’t answer the question of what the hell happened to all those disavowed links? Why didn’t my rankings drop until Penguin 2.0 when I disavowed all 35,000 links found in Google Webmaster Tools?

The evidence seems to suggest that once you disavow a link, it stays disavowed forever.



While I do not know if anyone has yet covered this, in my personal opinion (and that's all it is), I believe what Cyrus Shepard saw & proved here is that by using the disavow tool on all the aforementioned links...


  • Effectively, he permanently disavowed all PageRank authority from those 35,000 links.
  • Google doesn't apply or process disavow files immediately (at least up till Penguin 2.0).
  • Site changes have always taken time to affect ranking, or they'd be too easy to test & manipulate.
  • Ranks will be affected for better or for worse, when Google applies the disavows, or updates.
  • New authority can be built from then on, but will not return from those specific disavows.



