Key topics from Webmaster Central Office-hours Hangouts with John Mueller.
1. For all our algorithms we try to have a sliding scale for how they react to a site.
2. But wait, can't TRUST be turned Off through a Manual Action?
+John Mueller's regular Webmaster Central Office-hours Hangouts have been helpful & informative as usual.
This week I'd like to share some points from his Hangouts on a topic that has recently been the subject of discussion: whether certain of Google's algorithmic penalties operate on a sliding scale, or are simply all "On, or Off".
Apparently interest in this topic started 2 weeks ago with a tweet from Matt Cutts...
"@chipnicodemus I'd recommend continuing to clean backlinks though. You still have a very mild case of Penguin."
That topic was continued here at Search Engine Watch on April 8, 2014, in "A Very Mild Case of Google Penguin?", as well as on quite a number of other SEO blogs.
"We do pretty much for all of our algorithms, we try to have like a sliding scale between how they react to a site. And it's never the case that something is either 'on,' or 'off,' because there's always a lot of room in-between." - John Mueller
1. For all algorithms we have a sliding scale for how they react.
*28:10 Submitted Question:
[*Event video is below where you can find the dialogue at noted time codes.]
Matt said this week that a site had a mild case of Penguin. How many grades, or levels, of Penguin severity are there, & is that the same for Panda?
28:20 John Mueller:
We do this pretty much for all of our algorithms: we try to have a sliding scale for how they react to a site. And it's never the case that something is either "on," or "off," because there's always a lot of room in-between.
So for most of our algorithms, we kind of have a sliding scale, where we wouldn't even say they're different levels, it's just different strengths of the problem that they're seeing there. And to some extent that makes sense, because it's not something that you'd always have to fix.
Sometimes when the Panda algorithm sees that there's something kind of low-quality-ish about a website & reacts slightly to that, then that's not something that the webmaster would even see in most cases.
So it's something that might just look like normal changes in ranking, where we're trying to match your site better to queries that are actually relevant to your site & not show it to queries where it's not relevant.
That's also a reason why we currently don't show much information about these algorithms in Webmaster Tools: essentially they're a part of our search quality & our ranking algorithms, and they just try to match the relevant results to the queries that people are bringing.
So having a mild case of Penguin is something where, maybe if you can find something to clean that up, that's always a great idea. But it might also be something that's just lingering from years & years back.
Wait, did he just say, "lingering from years & years back"? Interesting.
And similarly with Panda, when it comes to the content quality that we look at there, it might be that the algorithm is picking up on some things where essentially we're seeing that, in this particular area, it's maybe not as relevant as we thought of in the past. So we're adjusting the rankings slightly.
That doesn't mean that your traffic will always go down; it might even be that your site starts ranking for more relevant queries, instead of the less relevant queries it was ranking for before.
So it's really kind of a complex situation, in the sense that a lot of our algorithms can't be mapped one-to-one back to actions that the webmaster can actually take on their side.
I then asked him if Panda had been softened up a bit recently, but that couldn't be confirmed.
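To picture the distinction John is drawing, here's a toy sketch, entirely my own illustration and not anything Google has published; the function names and numbers are hypothetical, contrasting an on/off penalty with a sliding-scale adjustment:

```python
# Purely illustrative toy model - NOT Google's actual code.
# Contrasts a binary "on/off" penalty with a sliding-scale adjustment.

def on_off_penalty(score: float, penalized: bool) -> float:
    """Binary: the penalty either applies in full, or not at all."""
    return 0.0 if penalized else score

def sliding_scale_penalty(score: float, problem_strength: float) -> float:
    """Continuous: the stronger the detected problem (0.0 to 1.0),
    the more the score is dampened - never simply on or off."""
    return score * (1.0 - problem_strength)

# A mild case barely moves the score; a severe one nearly removes it.
print(sliding_scale_penalty(100.0, 0.125))  # mild case   -> 87.5
print(sliding_scale_penalty(100.0, 0.75))   # severe case -> 25.0
print(on_off_penalty(100.0, True))          # binary      -> 0.0
```

On a model like this, Mueller's "mild case of Penguin" would correspond to a small problem strength, which from the outside may look like nothing more than ordinary ranking fluctuation.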
2. But wait, can't TRUST be turned Off through a Manual Action?
The above question got me thinking: "Are there any penalties that can be either all on, or all off?" That's because I remembered something Matt Cutts previously talked about, which I covered a few months ago in PageRank penalties - as visible indicator of lost trust in a domain.
In that particular video Matt also described a sliding scale type of penalty for a Manual Action, but he secondarily described what sounds very much like an all On, or Off switch for domain Trust. Interestingly enough, he also does not say there is a way for that trust to return.
So there are two types of manual action PageRank link penalties Matt mentions here:
#1. A percentage reduction of the offending website's own PageRank authority.
"Its PageRank goes down by thirty percent, forty percent, fifty percent as a visible indicator that we’ve lost trust in that domain."
#2. A ban on the offending site's overall ability to pass PageRank to other sites.
"...and it typically loses its ability to send PageRank going forward."
In my opinion, the reasoning behind this could be simple enough. Once you've lost trust in why a particular website links out to other websites, why wouldn't you just turn that off? If anything, there are plenty of other links out there, so it's not as though those links are still needed for Google's algorithms to function well.
Conversely, how would you measure that breach of trust on a sliding scale?
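Taken together, the two manual-action penalties Matt describes could be sketched like this. Again a hypothetical toy model of my own; the function, parameter names, and numbers are illustrative only:

```python
# Hypothetical sketch - not Google's implementation - of the two
# manual-action link penalties described above: a percentage PageRank
# reduction (a sliding value), plus an on/off loss of the ability
# to pass PageRank to other sites.

def apply_manual_action(pagerank: float,
                        reduction_pct: int,
                        revoke_outbound_trust: bool):
    """Return (reduced PageRank, whether outbound links still pass PageRank)."""
    reduced = pagerank * (100 - reduction_pct) / 100  # e.g. 30/40/50% cuts
    passes_pagerank = not revoke_outbound_trust       # binary: on, or off
    return reduced, passes_pagerank

# A 30% visible reduction, with outbound trust switched off entirely:
rank, passes = apply_manual_action(6.0, 30, True)
print(rank, passes)  # 4.2 False
```

Note how even in this toy version the first penalty is naturally graded while the second is a boolean flag, which mirrors the article's sliding-scale versus on/off distinction.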
Now one would hope that these penalties are on a timed reset, where eventually that loss of trust goes away. It has previously been mentioned that no penalty is totally permanent, as they are set up to eventually reset, though over quite extended time intervals; for the most egregious cases, you can be sure this would be a very long time.
So later on in the Webmaster Central Office-hours Hangout, during a follow-up discussion, I asked John Mueller about the sliding scale & the Manual Action Trust factor...
[Note: This question began at 34:18 with "Is Penguin always a penalty that can be fixed? Can links out cause a Penguin penalty?" However, my specific question in this discussion doesn't come until later, at 42:34.]
36:50 John Mueller:
With regards to unnatural outbound links that's something where we start feeling a little bit worried about trusting everything on your website, if we discover that your site has a lot of unnatural outbound links.
In the sense that we might say, "Well, we found all of these outbound links, what about the other outbound links? Should we be able to trust those or not"? That's one thing that we look at there.
With regards to Penguin, the main effect will be on the site that's on the other end. So if this other site is supporting itself primarily through all of these unnatural outbound links, then that's something our webspam team might pick up on & try to take action on, to make sure it's not being supported by these unnatural links.
[Other discussion continues on why more algorithmic updates aren't reported & then...]
42:34 Joshua Berg:
Is that losing trust in outbound links also a sliding scale thing, or would that ever be just On, or Off: "We don't trust this site's outbound links"?
42:50 John Mueller:
I can see both variations of that, I mean...
For Manual Actions it's essentially just On, or Off, because someone from the Webspam Team takes a look at that & says, "Oh, we can't trust these outbound links." And then we essentially turn that off for a Manual Action.
But I could see our algorithms being a little bit more granular in that regard. We try to be as granular as possible, but especially when it comes to manual actions, we can't check all of the links & say, "Oh, on this individual URL, this link shouldn't be counted & everything else is OK."
There's so much content on the web, there's no way we could do it on that level, manually.