Google's John Mueller Hangout



Google's John Mueller is a pro at speaking for Google. Nobody can not-answer a question quite like John. Matt Cutts was great too, but I have to give it to John. He's so good at it that his next job could easily be White House Press Secretary.

Here is my take on John Mueller's recent Hangout. I'm paraphrasing a lot.

Q. Which would you recommend for our blog: a subdomain or a subdirectory? And does one have a greater SEO benefit?

A. They're essentially the same; it depends on your infrastructure and what's easier for you. Whichever you want to do is fine.

My take: This is not an answer (or at least not an original one), but here is Matt Cutts providing a more in-depth one: https://www.youtube.com/watch?v=_MswMYk05tk The question could have been answered by laying out any benefits or drawbacks of each option. When someone says things are "essentially" the same, that implies there are still differences, and I believe those differences were the core of the question.


Q. We have multiple ccTLDs; should we link them all together?

A. First, make sure you're using hreflang annotations. If the content is the same then you can link between the individual pages, and that's a good way to do it. You can link from each country/language page to the equivalent content for the other countries/languages, or you can have a country-picker; either works.

My take: I think this is actually a good answer, with Google providing some specific direction that linking between the pages directly can be beneficial. Perhaps that helps by providing a type of supplemental content, or just content that is useful for users of a different language/country. It also takes away the question of which option to choose, since it's up to the developer to pick what's best. He does, however, go on to say that it helps them when you link directly from page to page, especially if you have the same content for different languages/countries. So that could be a subtle gem to consider.
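
A quick reference on what those hreflang annotations look like: each country/language version carries link elements pointing at every other version (itself included), and the references must be reciprocal. Here's a minimal sketch that generates the tags; the domains and locale codes are made-up placeholders, not anything from the hangout.

```python
# Minimal sketch: generate reciprocal hreflang <link> tags for one piece of
# content that lives on several ccTLDs. Domains and locale codes are hypothetical.
ALTERNATES = {
    "en-gb": "https://www.example.co.uk/widgets/",
    "de-de": "https://www.example.de/widgets/",
    "fr-fr": "https://www.example.fr/widgets/",
}

def hreflang_tags(alternates, x_default=None):
    """Return the tags every version of the page should carry in its <head>;
    each page lists all versions, including itself."""
    tags = [
        f'<link rel="alternate" hreflang="{lang}" href="{url}" />'
        for lang, url in sorted(alternates.items())
    ]
    if x_default:
        tags.append(f'<link rel="alternate" hreflang="x-default" href="{x_default}" />')
    return "\n".join(tags)

print(hreflang_tags(ALTERNATES, x_default="https://www.example.com/widgets/"))
```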


Q. Why is Google slow to digest, treat, consider 301 redirects?

A. There are two parts here that are kind of mixed together and hard to look at separately. We follow 301s immediately; we'll try to follow them and index the content under the new URL. There are things that make it "look" slow, such as a site-wide 301 across a lot of pages, or pages that are infrequently crawled; it can take us weeks or months to get to them. Traditionally those pages don't change at all, so we don't expect them to change and crawl them less often.

The second thing is that people who do a "site:" search are asking for a URL on that domain and we'll show it to them. Even though it has been 301'd to a new domain, we keep the old one and will show it when you search that domain.

If you're changing/migrating a site, use the WMT Change of Address tool and set up a new WMT property for the new domain. Don't rely on "site:".

My take: The part about Google crawling pages less often when it doesn't expect them to change is a key comment here. It proves beyond doubt that it's important to have content that updates frequently, and that it's good to have all of your content pages update in some manner.

Did he just say that even though you 301'd your content over, they'll still show the old URL if someone searches for it? This is ridiculous of Google to do. There are reasons webmasters want a page 301'd, and if someone finds that old URL and links to it instead of the new URL, the link benefit is diminished: a link to the new URL would pass 100% of its value, while a link through the 301 might pass 85%, or whatever amount is really transferred via a 301. This is another example of Google doing what it wants instead of what the webmaster wants with their own site.


I also think Mueller could have helped a bit more by explaining what webmasters can do to push their 301'd content into the index, such as "hey, if you see we're not getting to the content, just update your sitemap in WMT." That would have been nice. Or how about a bulk-URL Fetch tool?

Even though he gives good advice on a site migration, what if we're just updating our URL nomenclature or our hierarchical structure? Same domain, just different URLs. According to him they'll index both, but does Google look at any of this as dupe content? Will my old pages show up in SERPs if the query is more attuned to the former page than the new one? Mueller's response raised more questions than it answered.
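
If you'd rather not guess from "site:" results whether a migration's redirects are working, a quick script can confirm them directly. A minimal sketch using the requests library; the old/new URLs are placeholders you'd swap for your own mapping.

```python
# Minimal sketch: verify that old URLs return a 301 pointing at the expected
# new URLs after a migration. URLs are placeholders; requires `requests`.
import requests

REDIRECT_MAP = {
    "http://old-domain.example/page-a": "https://new-domain.example/page-a",
    "http://old-domain.example/page-b": "https://new-domain.example/page-b",
}

for old_url, expected in REDIRECT_MAP.items():
    # Don't follow the redirect; we want to inspect the first response.
    # Some servers handle HEAD oddly; switch to requests.get if needed.
    resp = requests.head(old_url, allow_redirects=False, timeout=10)
    status = resp.status_code
    location = resp.headers.get("Location", "")
    ok = status == 301 and location == expected
    print(f"{old_url} -> {status} {location} {'OK' if ok else 'CHECK'}")
```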


Q. Is one Penguin update enough to recover from a penalty?

A. Essentially yes... but... it can take time for us to process all of the link-based data we have on your site, so a future update may take that newly processed data into consideration and adversely affect your site's rankings.

My take: Straight enough question; straight enough answer.


Q. Our site had a lot of bad links and we disavowed most, even good links. We saw a rise in November but then, soon after, dropped back down. Why?

A. I'd have to look at your site specifically, but again this shows that it can take time to process all of the link data.

My take: Come on, Mueller! Google took the disavow into account and boosted the guy's site, then turned around and dropped it again. Was this because of new data, or could there have been new factors in the next update that look at the links differently? I smell a Google misdirection here. I think he could have been a bit more forthcoming, even in general terms, about why, even though he doesn't know the specifics of this guy's situation/site.


Q. With Penguin; will we see gradual increases or just one big one?

A. As you improve your site you should see gradual increases but following an update you may see a large one.

My take: To me, he's saying that you don't have to wait for an update to have link-related improvements considered by Penguin. This would be a change from the year that people had to wait for Penguin 3 to see any benefit. It could mean that they're doing constant refreshes with occasional bigger updates. This partial confirmation is very welcome.


Q. Do Google algos target sites site-wide or on a per-page basis?

A. We try to be granular, but Penguin affects you on a site-wide basis. If your blog (the example given) is on a different domain then we'll look at it differently. If the content is good then we may have signals boosting it while Penguin is dropping it.

My take: Which is it, granular or site-wide? OK, Penguin works on a site-wide level, but you could have answered the question directly with a yes or no as to whether it can affect you on a granular level.


Q. Does removing metadata from images affect ranking?

A. Not necessarily. For image search we take a lot into account from the context and some from the metadata, but that's more just icing on the cake; it gives us extra info like the resolution and date of creation. This is something where, if you want to remove it, it's your decision. You don't need to, but you're free to, and it won't really have a big effect on search.

My take: John... things like "not necessarily," "icing on the cake," and "big effect on search" mean that it will have some effect, so the real answer to the question is perhaps "you can remove it if you have an issue with it, but it does affect rankings, albeit to a small degree." He also specifically addressed image search but didn't answer how the metadata affects rankings in regular search. Was that an avoidance of the topic? If it has some effect in image search then we can likely assume that the added data can slightly affect rankings in regular search. Well, it's worth testing at least. We'll get a clearer answer from testing than we got from Google.
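
If you do decide to strip image metadata, it's a small job. A minimal sketch using the Pillow library, which rebuilds the image from raw pixel data so EXIF and similar metadata aren't carried over; the file paths are placeholders.

```python
# Minimal sketch: strip EXIF/other metadata from a JPEG by rebuilding it from
# raw pixel data (requires Pillow: pip install Pillow). Paths are placeholders.
from PIL import Image

def strip_metadata(src_path: str, dst_path: str) -> None:
    with Image.open(src_path) as img:
        clean = Image.new(img.mode, img.size)
        clean.putdata(list(img.getdata()))  # copy pixels only, not metadata
        clean.save(dst_path, quality=95)

strip_metadata("photo-with-exif.jpg", "photo-clean.jpg")
```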


Q. If I have a one-page website and want to target 50 keywords, how do I target them? And if Google looks at the Title tag as a main factor, how does that play into it?

A. You can create a single-page site, that's your choice, but you should look at it from a marketing perspective. If you're targeting diverse topics then it's really hard for us to say "this site is relevant on the topic of shoes and the topic of cellphones." If you're targeting multiple topics or increasing your content then you may want to consider a multi-page approach.

My take: John, you didn't speak to the scenario where they have topically related keywords that they're trying to target on the same page. You only answered as if they were vastly different topics. It would be nice to have an answer for the similar-topic scenario, or would that cause you to actually give something away?


Q. If I have multiple domains targeting the same content in different languages should I verify both in WMT?

A. Yes. Even if they're not the same translated content but just in different languages, use WMT. Also use hreflang markup, and do it on a page-by-page basis so that the same content in one language references the same content in the other language.

WMT also gives you stats on which parts are implemented correctly and which are not.

My take: Cool
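
Since the WMT hreflang report mostly flags missing return links, it's worth checking reciprocity yourself before Google does. A minimal sketch, assuming the annotations sit in the HTML head and the URLs match exactly; the domains are placeholders, and a real audit would want a proper HTML parser rather than a regex.

```python
# Minimal sketch: check that hreflang annotations are reciprocal, i.e. every
# page an annotation points to also points back. The regex assumes the
# rel/hreflang/href attribute order; URLs are placeholders.
import re
from functools import lru_cache

import requests

LINK_RE = re.compile(
    r'<link[^>]+rel=["\']alternate["\'][^>]+hreflang=["\']([^"\']+)["\'][^>]+href=["\']([^"\']+)["\']',
    re.IGNORECASE,
)

@lru_cache(maxsize=None)
def hreflang_targets(url):
    html = requests.get(url, timeout=10).text
    return {href for _lang, href in LINK_RE.findall(html)}

pages = ["https://www.example.com/", "https://www.example.de/", "https://www.example.fr/"]
for page in pages:
    for alternate in hreflang_targets(page):
        if alternate == page:
            continue
        if page not in hreflang_targets(alternate):
            print(f"Missing return link: {alternate} does not reference {page}")
```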


Q. If a site is in English, sells products internationally, and ranks in many English-speaking countries, should we add hreflang markup and send each country to a specific page tailored for it?

A. You can, but you don't need to. If you have content that is country-targeted then you can use hreflang markup, and we recommend that, but if the content is barely different then it might be better to stay with one page that's "very strong". "Focus on the product that still ranks globally".

My take: One page can be stronger than the same content split into different pages, so it looks like duplication of content is still possibly seen and considered here by Google, or perhaps the links to a single page are worth more than diluting them among several pages. He doesn't really dig into that, but international SEOs should weigh these factors when making the call; perhaps test it out. He says you shouldn't split it if the content is mostly the same, which shows the pages can be perceived as dupe content, so tread carefully and make sure the content is different enough and that the difference is country-related.


Q. In the case of different country-specific pages with hreflang, is changing the currency alone enough?

A. Yes, that's something you could do, especially if you're using those pages in Google Shopping, but you could also do this with URL parameters. In general, though, if you have a really strong page that is valid for various countries then I'd try to keep that.

My take: He said it numerous times: if it's a strong page, keep it as one. This means that one page, with all of its benefits, is better than multiple pages unless you have to change it or it's specific enough to each country that you should go with different pages. This could be a nice hidden tip from Google. Some testing would be good on this.


Q. If we are geo-targeting a site and it's primarily for the USA, do we need to set that in WMT?

A. We can usually figure it out but if you want to explicitly set it we'll explicitly follow it.

My take: If you explicitly set it, you may explicitly be blocking it from other countries' search results. He already said that one page/site can rank well in different countries' search engines, so why limit yourself?


Q. My Toolbar PageRank has not updated, why?

A. We don't update it anymore. You shouldn't worry about it; instead you can look at WMT for query and click data, and if you see impressions but no clicks for certain terms you can review your Title and Meta Description to see if they're optimized. If your Titles and Descriptions don't match the topic, optimizing them is a good step.

My take: Did he just mention the Meta Description as something to optimize? Wait, I thought Google said they don't use it for ranking anymore. He could be mentioning it because it can help with conversions (click-throughs), which he does mention, so he's likely referring to conversion/CTR. Either way it's good advice, of course. It's what we used to do with keyword referral data way back when.
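
To act on that impressions-without-clicks advice at scale, the query data can be exported and filtered rather than eyeballed. A minimal sketch, assuming a CSV export with Query, Impressions, and Clicks columns; adjust the column names and thresholds to whatever your export actually contains.

```python
# Minimal sketch: flag queries with plenty of impressions but a weak CTR as
# candidates for Title/Description rewrites. Column names are assumptions.
import csv

MIN_IMPRESSIONS = 100
MAX_CTR = 0.01  # flag anything at or under 1% CTR

with open("search_queries.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        impressions = int(row["Impressions"])
        clicks = int(row["Clicks"])
        if impressions >= MIN_IMPRESSIONS:
            ctr = clicks / impressions
            if ctr <= MAX_CTR:
                print(f'{row["Query"]}: {clicks}/{impressions} = {ctr:.1%} CTR')
```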


Q. Are there any plans to extend the date range on queries data to more than 3 months?

A. We are considering that for the next update but it's not guaranteed.

My take: Won't that be nice.


Q. WMT queries data is missing the most recent 2-3 days and this skews the data because those days are included in the last 30 days; why?

A. It shows you that you need to work harder. It takes us time to process the data. Sometimes we have partial-day data.

My take: Not funny, John... maybe Google should work harder to get the data in, or simply restructure the date range so it runs 30 days back from the most recently available date. Ya think?


Q. A lot of referral spam appears on my site, do I need to worry about that?

A. No, we see it. You can ignore it or block it on your server. It has no effect on web search.

My take: Cool!
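
For anyone who does want to block the crawler variety on their server (much referral spam never actually hits your server and only pollutes Analytics), here's a minimal sketch of a Referer check, using Flask purely as an example framework; the spam domain list is made up.

```python
# Minimal sketch: reject requests whose Referer matches a known spam domain.
# Flask is used only as an example framework; domains are placeholders.
from flask import Flask, abort, request

app = Flask(__name__)
SPAM_REFERRERS = {"spammy-referrals.example", "free-traffic.example"}

@app.before_request
def block_referral_spam():
    referrer = request.referrer or ""
    if any(domain in referrer for domain in SPAM_REFERRERS):
        abort(403)  # refuse the request; these hits never reach your content

@app.route("/")
def index():
    return "Hello"
```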


Q. If I disavow am I confessing?

A. If you're aware of a problem then disavow... it's not something that we take as an admission. It's technical only.

My take: Disavow is a great tool and if there's no backlash then use it.
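
For anyone who hasn't used it, the disavow file format is simple: one full URL or domain: entry per line, with # for comments. A minimal sketch that writes one; the domains and URL are placeholders.

```python
# Minimal sketch: write a disavow file in the format the tool accepts --
# one URL or "domain:" entry per line, "#" for comments. Entries are placeholders.
bad_domains = ["spam-directory.example", "link-farm.example"]
bad_urls = ["http://someblog.example/comment-spam-page.html"]

with open("disavow.txt", "w", encoding="utf-8") as f:
    f.write("# Disavow file for example.com\n")
    for domain in bad_domains:
        f.write(f"domain:{domain}\n")
    for url in bad_urls:
        f.write(f"{url}\n")
```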


Q. If we have an article on someone's site and they link to us in the footer of their other pages, should we disavow all of those links except the one in the main article we have on their site?

A. That would be too much work and if it's really a natural link then just leave it, that's fine. If you're contacting sites and trying to get footer links everywhere then that's different.

My take: So footer links, if natural, are good. Google has stated that it can tell whether footer links are natural and whether links in general are part of the content (having editorial value) or are boilerplate repeated across the site's footers. He does tell us here, though, that valid links are good even in footers, so that's a nice clarification. Thanks, John.


Q. Do I need the rel-canonical tag?

A. No you don't have to have it but we recommend it.

My take: If Google recommends it then it's worth considering as it can potentially save one from a Google problem.


Q. If we have a services-targeted website and it's low on content, should we build content?

A. What are your users looking for, how are they searching? It's not good to create artificial content if they're not looking for it. Artificially creating content doesn't make sense. Some sites with thin content rank well.

My take: "It doesn't make sense" doesn't mean that creating content won't rank or build-up the site's Panda quality rating. He says that some sites with thin content rank well. Doesn't Panda slap people that have thin websites? Which is it? He does answer a follow-up question about the same topic with advice that you shouldn't create fluffy content that's not quality but the first question asked, essentially, if there's a benefit to building out the content. He doesn't answer the question. He does, however, offer that if people get to your site and see fluff content and leave which sort of indirectly provides support for the idea that Google looks at bounce-rate as a ranking factor. He also says that people won't share that content which indirectly supports the Social as a signal importance.


Q. Is PageRank important, or should I not consider it in my SEO equation or strategies?

A. Don't consider it because we don't update it.

My take: Toolbar PageRank doesn't update (another thing Google is taking away from us), but that doesn't mean the things that made PR important have changed. PR was based on links, citations, etc. These still matter greatly, so Mueller's comment is a bit misleading. He could have phrased it more realistically: the PR number doesn't matter, but the things it measured still do.


Q. Do links from SEO reporting sites and others affect us? Do you ignore them?

A. We try to ignore them, especially links from auto-generated content, so we try not to put much weight on them. They're something that we (Google) have to live with. Of course, if you're running one of these sites, make sure you have value on them.

My take: They're OK, so we shouldn't have to worry about a Penguin slap because of them. If Google has to live with them then they can't really downgrade you for them (Penguin-slap you), so they are beneficial, to some degree. This is good to know, John! Thanks. It's also interesting information for owners of these sites. Do I smell a self-confessed Penguin-proof link-building technique here? It's possible, but again, caveat: it's dangerous, so if you do go this route, beware!


Q. Does the HTTPS boost only apply to web search, or does it apply to image search too?

A. Only for web search at this point. It's a really small ranking factor at the moment, so you're not likely to see any big jump by going to HTTPS. Go ahead and switch.

My take: Any jump is a jump, and you did say "at this point." It might benefit webmasters to start considering the switch, especially large sites, which have a better chance of seeing a small jump than a small site does. This will matter more if Google weights it more heavily in the future, so if you start now and do it slowly and smoothly you won't have to rush later.


Q. Is the disappearance of search query data in Google Analytics related to a beta?

A. I wasn't aware of that disappearance, but you can still get that data from WMT, and Analytics will be switching over to the new format as well once that's finalized. If you signed up for the preview version you should have it, but we're going to be doing more testing for other people on the list.

My take: New format, new data, that would be good. What's coming?


Q. Why do I see different data between Analytics sessions and WMT clicks?

A. If someone doesn't have JavaScript enabled it won't show. Multiple clicks by the same person can be counted as one session in GA.

My take: Standardize! How about providing comparable data so we can benefit from it? Why not include clicks in GA? Google has a myriad of subtle ways to make optimizing for them more and more difficult.


Q. If I have pagination on my site and it's generating hundreds of pages of content with the same Title and Description will that affect my site with Panda?

A. Panda looks at the overall quality of a website and its content, and Titles and meta tags are not a sign of quality. They're something more for the search engines, something we show in search and not something the user actually sees on the website. So just having duplicate Titles and meta tags isn't really a quality problem, because we look at the overall quality of a website and not at the individual tags. They don't affect how a user sees your content.

If you have a lot of dupes it might be worth fixing but it won't affect the quality of your website.

My take: Bull-dung! Did you really just spit that out, John? Users don't see the Title? It's the main thing on the SERPs! It possibly doesn't play into the Panda quality score, but the Title in relation to the content is important, and every SEO knows how important it is to optimize Titles. Are you saying it's not a Panda thing but another algo's (Hummingbird's)? You can't sit there and make people believe that the Title isn't really a signal, or that having a massive number of dupe Titles is not a quality issue. Titles should be tailored to the content, so either dupe Titles across a site are tailored to the same content and the content is being duped too (big Panda no-no), or you have different content with the same Titles and that's somehow not a Panda issue? That makes for a quality site in Google's eyes? That's just wrong, John!
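
Whatever Panda does or doesn't count, duplicate Titles across paginated pages are easy to spot for yourself. A minimal sketch that fetches a handful of URLs and reports any Title shared by more than one page; the URL pattern is a placeholder, and the regex is only a rough stand-in for a real HTML parser.

```python
# Minimal sketch: report <title> values shared by more than one URL.
# URLs are placeholders; requires `requests`.
import re
from collections import defaultdict

import requests

TITLE_RE = re.compile(r"<title[^>]*>(.*?)</title>", re.IGNORECASE | re.DOTALL)

urls = [f"https://www.example.com/category/page/{n}" for n in range(1, 6)]

titles = defaultdict(list)
for url in urls:
    html = requests.get(url, timeout=10).text
    match = TITLE_RE.search(html)
    title = match.group(1).strip() if match else "(no title)"
    titles[title].append(url)

for title, pages in titles.items():
    if len(pages) > 1:
        print(f'Duplicate title "{title}" on {len(pages)} pages:')
        for page in pages:
            print(f"  {page}")
```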


Q. If Panda considers quality does it take links into consideration like Penguin does?

A. We try to focus on the stuff on your site directly (with Panda) and not off-site. We try to keep our algos so that they don't overlap. It's something we try to avoid. We try to look at quality as we index it.

My take: "We try to" means that that's your main goal but there is overlap. Let's take onsite links. Are they Panda, Penguin or both? Well here Matt Cutts answers this very question showing that Penguin deals with internal links. You can't, however, say that internal links and anchor text don't affect the quality of a website or its content. Such links can be considered relevant supplemental content which increases the overall quality thus finding itself in Panda's domain. I think your answer is not genuine and you've left out some of the very important cross-overs that do happen. You could have given a more straight-forward answer instead of obfuscating by using "we try to".


Q. I've completed a site and have content out on hub pages; should I move it to my main site?

A. In general, if you're moving from a shared environment to your own domain, I'd try to move that content over so you have all of your content there. Whether you want to keep that content out there as well is up to you, but having it on your own domain tells Google that this is a separate entity we should take seriously on its own.

My take: Really, John? Did you just say that we can have original content on other sites, keep it there, and also put it on our own main site without it being duplicate content? That it won't downgrade the value of that content on our site since it wasn't originally posted there? You could have answered the question more specifically instead of modifying it and repeating how migrating from a shared space to your own is a good idea. The question wasn't about that; it was about having a new main site and bringing in varied content from the other sites where it currently resides. Either you're saying there won't be a dupe issue, or you're telling people to pull that content in and risk dupe issues, which would be bad advice. Which is it?


Q. I have a client who was hit with a negative SEO attack. You usually tell us not to worry about that, but the number of links this competitor is adding is pretty high; should I disavow them?

A. My general advice is if you see this you can use the disavow tool. If you suspect that something crazy is happening and Google is getting confused you can email me and we'll take a look. But disavow should work.

My take: Cool!


Q. Should I build backlinks?

A. Don't focus on link-building but focus on quality content. We do use links as part of our algorithm.

My take: Links carry a bit more weight than the small impact he implied. I think Mr. Mueller was seriously downplaying the value of links with his comment that one shouldn't "focus" on links. Obviously links are dangerous (i.e. Penguin), and what's OK today may not be tomorrow. It's also good to generate links naturally via link bait, PR, outreach, and great content. However, links still matter greatly, and people will still mess with Google by link-building.


Q. If we have a toxic domain that you've told us not to redirect to our new one, could we 301 it and then take it down?

A. No. If something like that happens, we'll see it. It's not a combination of websites; it's something different that we can see.

My take: You're not answering him because you've told him one thing, but it's different when applied to someone else. Still, it's good to know that you can see negative SEO.


Q. Will different algorithms be shown as specific data in WMT, especially search quality data?

A. No and I don't see that changing anytime soon.

My take: Google won't give any data (beyond what they already give) that allows you to reverse-engineer them. They'll sooner remove data, so don't ask for real, hard-core data from them. Try whistling Dixie instead; at least you'll enjoy a tune.


Well - that's all folks! Let me know your thoughts and your takes on this.

Cheers