Google "mini" Penguin Refresh?

Hi folks... I know the title of this post seems strange since Penguin has been rolled into the core Hummingbird algorithm, but we've seen a great rise for a specific site from which we cleaned out a ton of comment spam.

The site took a long dive after a glitch let spam comments through, and the owners didn't follow up to delete them for a while, so they accumulated. Google took notice and Penguin knocked them down.

After we found and fixed the issue we waited and waited, and then, after about two months (June 18th), we saw a considerable rise in rankings across the site.

Now, Penguin is supposed to refresh automatically, but we saw a couple months' delay. Since this was primarily a link issue (although comment spam affects site quality as well), it seems that either Penguin still takes some time to refresh, or sites may be subjected to sand-boxing or put on a temporary watch-list to see if they keep their noses clean, as it were.

Let us know if you have seen anything in your stats.

Feel free to submit comments (unless it's spam! :) )

The Team @ Meteorsite (Los Angeles SEO Since 2002!)

New SEO Software

As a New Year's 2017 surprise, here's a little hint: we'll soon be launching new SEO software that will actually give you back the Google (not provided) keyword referral data.

Stay tuned as we get ready to launch.

If you want to be on the beta-test list just shoot us an email and we'll send you an invite once we go live.

Get the lost keyword referral data back!

Postcards From Matt Cutts


Check out our Meteorsite SEO Google+ page!

SEO for Google's Hummingbird Algorithm Update June 17th, 2015 - Quality





Hi folks,

It looks like Google made a significant update to their core Hummingbird algorithm last week (June 17th, 2015). Let's take a look at Google's Hummingbird algorithm and what the impact of the update seems to have been.

Talk of this update abounds, with even a confirmation from Google, but little mention is made of Hummingbird. To me it's a clear sign of a core update, as we've been seeing the SERPs shuffle to reveal more topically-targeted pages in the results, pages more closely related to the specific terms searched.

We've even seen some sites have their homepage swapped out for a more closely relevant subpage. To me this marks a clear sign of the fast and focused targeting Google promoted at Hummingbird's launch.

What we've been able to see from this update is a focus on content, supplemental content and media. It may be time for companies to invest in laser-focused content built around the main keyterms you believe will drive your traffic.

We've also seen freshness of content appear to be a factor.

Follow That Trend

Google's update also coincided with an update to Google Trends.

I agree with the author that what is trending may affect the SERPs but how "real-time" that will be is something we'll see as time goes by.

One thing you should definitely do is to use Google Trends for keyword research (if you're not already).
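If you want to script that research, here's a minimal sketch using the unofficial pytrends library (a third-party package, not a Google product; install with "pip install pytrends"). The keywords, timeframe and geo values are placeholders:

    # Minimal sketch: pull Google Trends interest data for candidate keyterms
    # via the unofficial pytrends library. All keyword/timeframe/geo values
    # below are placeholder assumptions, not recommendations.
    from pytrends.request import TrendReq

    pytrends = TrendReq(hl="en-US", tz=360)

    # Compare interest in a few candidate keyterms over the last 12 months in the US.
    keywords = ["seo services", "search engine optimization", "seo company"]
    pytrends.build_payload(kw_list=keywords, timeframe="today 12-m", geo="US")

    interest = pytrends.interest_over_time()  # returns a pandas DataFrame
    # Drop the isPartial flag column and rank the terms by average interest.
    print(interest.drop(columns=["isPartial"]).mean().sort_values(ascending=False))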


Quality is King

Since this is another core update dealing with what we believe to be semantic targeting we should look at how you can target your site better for semantic search.

You can find an in-depth description here: How to do Semantic SEO for Google's Hummingbird Algorithm

Since Google is constantly upgrading their "Quality" algorithm, we should look at what they mean by "quality".


From the Horse's Mouth

Here Google gives some hints and we're going to dive a bit deeper into each hint to provide possible strategies or clarifications.

Would you trust the information presented in this article?

Trust is a key factor, but what does it mean? Is the article well researched? Does it provide detailed information on the topic? When writing, go in-depth and try to cite authorities. Present the content in a unique manner; don't just quote the same old stuff everyone else does.

Is this article written by an expert or enthusiast who knows the topic well, or is it more shallow in nature?

Is the writer just a writer, or someone who really knows the topic? Many companies hire writers who don't know their market or products. This is why it is vital to include the client in the process; they're the expert in their field. Get the client to jot down the main points of the article and then provide those to the writer to research and elaborate on. Putting the client's name on the article also gives it that much more weight.

Does the site have duplicate, overlapping, or redundant articles on the same or similar topics with slightly different keyword variations?

This one is very straight-forward: don't write the same article twice and just change a few things. If this is happening, combine the articles and use semantic variations of terms in the one article. If the two articles can be differentiated, then you should attempt to do so as much as possible. For example, if you have two products that serve the same purpose, writing about each one uniquely may be difficult. Focus on the differences between the two products as much as the similarities: the brand difference, the price difference, even color options, etc.

Would you be comfortable giving your credit card information to this site?

This is primarily about security. Do you have an SSL certificate? Do you have other security certificates, a Better Business Bureau link, or Yelp or Google+ reviews on your site? Is your contact information clearly there, along with a current copyright date? Do you have a terms of use policy or privacy policy clearly linked in the footer or elsewhere? These are some of the key things that can give a visitor a feeling of security, and all of them can be seen (or be seen to be missing) by Google.

Does this article have spelling, stylistic, or factual errors?

Straight-forward. Fix problems, typos, etc.

Are the topics driven by genuine interests of readers of the site, or does the site generate content by attempting to guess what might rank well in search engines?

Well, they say don't write for keywords, but that's just Google trying to stop people from actively optimizing. You can definitely write for keywords because they are what your market is searching. Remember, Google allows you to do this for AdWords, so they're being hypocritical. The thing is to make sure you're using keywords that are relevant to what you do. Don't create content around keywords that won't drive you qualified traffic, and don't dilute your site's relevance by targeting irrelevant keywords.

Does the article provide original content or information, original reporting, original research, or original analysis?

Don't copy and paraphrase. Take down the information, do the research, then write an article that is unique, in-depth, informative, answers questions, gives new and interesting information.

Does the page provide substantial value when compared to other pages in search results?

GO LOOK AT THOSE OTHER PAGES IN THE SEARCH RESULTS! Get out there, look at the competition, read their content, and ask yourself if they have a better offering than you do. Would you pick them over you? If so, list out why. Have they done more research? Do they offer more information, such as supplemental content like a video or PDF manuals, social sharing features, commenting or other user-generated content, or some helpful apps on the page?

How much quality control is done on content?

Is it good or bad? Does it look like someone is actually editing it or making sure it's solid content or is it just there for keywords or to fill space?

Does the article describe both sides of a story?

Be balanced. You can even be controversial. Ask questions and then answer them. Talk about both/multiple sides to a topic.

Is the site a recognized authority on its topic?

Remember it takes an authority to make an authority. Do people in this field link to this content? Is the author an obvious specialist (being made obvious by his bio which states that he is a graduate in this field) or is this person or website part of an association? Does this person write in authoritative journals on the topic? Build authority by contributing online with others in your field.

Is the content mass-produced by or outsourced to a large number of creators, or spread across a large network of sites, so that individual pages or sites don't get as much attention or care?

Be careful with syndication. Also, don't robo-create content (EVER!!!). Syndication can have a blow-back effect by stealing the thunder of the content on your own site. Make sure that you produce unique content on your site, and that you date that content and put in the author's name. This way, if it's picked up or syndicated, Google knows you had it first. Link to it from social media outlets and more.

Was the article edited well, or does it appear sloppy or hastily produced?

Straight-forward.

For a health related query, would you trust information from this site?

This can speak to whether there is a medical professional directly associated with or behind the content, or whether the content appears to be there just to sell ads rather than being real quality content. Does it stand up to comparison with large medical/health websites? Does it copy from them? Does it cite sources or refer to other sources for further information?

Would you recognize this site as an authoritative source when mentioned by name?

This is all about branding. Build your brand in association with your topic and get people out there on social media or traditional media to use the two in conjunction. Hire a PR firm and get a press release out or get on some guest blogs or a magazine or newspaper article or op-ed.

Does this article provide a complete or comprehensive description of the topic?

In-depth articles are very powerful. Google seems to like around 750 words, and that's just to start. You can split up large pieces of content into smaller ones, but those smaller ones should each be around the 750-word count. Break up your in-depth content logically (if you want to generate more ad-views), but don't just break up content for the sake of it. You can have a couple thousand words on one page, but if you can separate based on topic/subtopic then you should. This hones the topical focus to a laser-sharp point for semantic search.

Does this article contain insightful analysis or interesting information that is beyond obvious?

Is it DUH! content, or did the writer actually do some research? Research and write... it's the way to go! Be original!

Is this the sort of page you'd want to bookmark, share with a friend, or recommend?

Is it good or is it junk? Does it provide regularly updated content that makes one want to come back, or something awesome like a seriously funny infographic or video?

Does this article have an excessive amount of ads that distract from or interfere with the main content?

Don't inundate the visitor with ads. Place ads strategically, and don't just produce content for ad-views; you'll get better rankings from having better content with fewer ads, which means more ad-views in the long run.

Would you expect to see this article in a printed magazine, encyclopedia or book?

This speaks to quality in the sense of is the article professionally written? Review the way articles in magazines and newspapers are written. Copy the format. Link to references and cite your sources.

Are the articles short, unsubstantial, or otherwise lacking in helpful specifics?

Short, dull and uninformed content is bad... build it out, research it, write more than 750 words and integrate related images.

Are the pages produced with great care and attention to detail vs. less attention to detail?

"Attention to detail" is a subjective term. This can mean the details of the content or the details of grammar, wording, and even if the topic was fully explored or the initial questions were answered in the content.

Would users complain when they see pages from this site?

If it's hard to read, full of ads, not well researched or written, just there for spam, or otherwise induces an epileptic-fit reaction from too many things going on, then the short answer is yes. So don't do that.



There's my take on quality according to Google. Research your market and terms; target your terms specifically; research your topic; write like a pro; include multi-media and supplemental content; show you're an authority; share it socially; write more rather than less; show your site is safe and secure; be original and informative; and don't duplicate your content. Build your brand and your own personal authority with others in your field.

Google Hummingbird Update June 17th, 2015 - Message

Hello all,

We are currently working on this new Google algorithm update (June 17th) and already have some great stuff to share.

We'll be presenting our findings tomorrow and Friday in the form of blog post and infographic.

There are a lot of great things for whitehat SEOs in this latest update. Google's really doing things right on this one.

Check back in tomorrow and we'd love to have your comments and thoughts.

best,

Meteorsite :)

Google Voice Speaks of Bad SEO

Google uses bad SEO for their own Google Voice website




It's interesting to look at the above SERP result from Google for their Google Voice service.

They seem to be doing things they advise against. To start with, they're using duplicate Titles (two of them say simply "Google Voice"): the main one (shown as A) and the first sitelink (B).

(A) goes to their main page, as one would expect, while (B) goes to google.com/googlevoice/about.html. One would think the logical title for the about page would be "About Google Voice", but that honor is saved for https://support.google.com/voice/answer/115061?hl=en, a support page (D), which has "About Google Voice" as its heading.

In fact (B) doesn't even go to an about page but instead goes to a features page. Why don't they put that in the title?

They're not exactly being descriptive or even targeted. Does this provide a good "user experience"?

But this is only what appears in their sitelinks. If you do a site:google.com/voice search then you'll see that there are tons and tons of pages that only have "Google Voice" as the title of the page. These include pages on "general settings", "billing", "account signin" and "account recovery".

Avoid repeated or boilerplate titles. It's important to have distinct, descriptive titles for each page on your site. Titling every page on a commerce site "Cheap products for sale", for example, makes it impossible for users to distinguish one page from another.

Source: https://support.google.com/webmasters/answer/35624?hl=en
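If you want to audit your own site for this, here's a rough sketch using the requests and BeautifulSoup libraries. The URL list is a placeholder; in practice you'd pull it from your sitemap:

    # Sketch: flag duplicate <title> tags across a set of your own pages.
    # The example.com URLs are hypothetical placeholders.
    from collections import defaultdict

    import requests
    from bs4 import BeautifulSoup

    urls = [
        "https://www.example.com/",
        "https://www.example.com/about",
        "https://www.example.com/features",
    ]

    titles = defaultdict(list)
    for url in urls:
        html = requests.get(url, timeout=10).text
        title_tag = BeautifulSoup(html, "html.parser").title
        title = title_tag.get_text(strip=True) if title_tag else "(missing title)"
        titles[title].append(url)

    # Any title shared by more than one URL is a candidate for rewriting.
    for title, pages in titles.items():
        if len(pages) > 1:
            print(f"Duplicate title {title!r} on {len(pages)} pages: {pages}")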

As for the descriptions we see two (E & G) that repeat the keyword in both the title and description.

If I'm not mistaken Google has stated that one shouldn't use the same keywords in both the Meta Description and Title tags.


It's easy to see that Google doesn't abide well by their own rules. It's also interesting to see how large-scale duplication of Titles doesn't seem to trigger a penalty. At least not for Google.

Meteorsite Launches New Google+ Page

Hello SEOs and lovers of things SEO!

I'm really very happy to announce our new Google+ page, located here: Meteorsite SEO on Google+

We hope that you'll join us for new insights and new tips, techniques and strategies for advanced SEO, Semantic SEO, SMO and more.

My Take on the recent Hangout with John Mueller

Google's John Mueller hangout



Google's John Mueller is a pro at speaking for Google. He can not-answer a question like nobody else. Matt Cutts was great too, but I have to give it to John. His next job could easily be White House Press Secretary; he's that good.

Here is my take on John Mueller's recent Hangout. I'm paraphrasing a lot.

Q. Which would you recommend for our blog; subdomain or subdirectory and does one have greater benefit to SEO?

A. They're essentially the same; it depends on your infrastructure and what's easier for you. If you want to do either, it's fine. This is not an answer (or at least not an original one), but here is Matt Cutts providing a more in-depth one: https://www.youtube.com/watch?v=_MswMYk05tk The question could have been answered by listing any benefits or drawbacks of each. When someone says things are "essentially" the same, that implies there are still differences, and I believe those were the core of the question.


Q. We have multiple ccTLDs; should we link them all together?

A. Firstly, make sure you're using HREFLang annotations. If the content is the same, then you can link between the individual pages; that's a good way. You can link from a country/language page to the like content for another country/language, or you can have a country-picker function; either works.

My take: I think this is actually a good answer, where Google provides some specific direction: linking between the pages directly can be beneficial. Perhaps that helps by providing a type of supplemental content, or just content that is helpful for users of a different language/country. It also removes the question of which option to choose, as it's up to the developer to pick what's best. He does, however, go on to say that it helps Google when you link directly from page to page, especially if you have the same content for different languages/countries. So that could be a subtle gem to consider.
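For those unfamiliar with the markup, here's a minimal sketch of generating those HREFLang annotations. The domains and locale codes are hypothetical, and each page variant should carry the full set, including a self-reference:

    # Sketch: generate hreflang link tags for the same page across several
    # ccTLD sites. Domains and locale codes below are hypothetical examples.
    variants = {
        "en-us": "https://www.example.com/widgets",
        "en-gb": "https://www.example.co.uk/widgets",
        "de-de": "https://www.example.de/widgets",
    }

    def hreflang_tags(variants, default_url):
        tags = [
            f'<link rel="alternate" hreflang="{lang}" href="{url}" />'
            for lang, url in variants.items()
        ]
        # x-default tells Google which version to show users matching no locale.
        tags.append(f'<link rel="alternate" hreflang="x-default" href="{default_url}" />')
        return "\n".join(tags)

    # Emit the block that would go in the <head> of every variant page.
    print(hreflang_tags(variants, variants["en-us"]))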


Q. Why is Google slow to digest, treat, consider 301 redirects?

A. There are two parts here that are kind of mixed together and hard to look at separately. We follow 301s immediately. We'll try to follow them and index the content under the new URL. There are things that make it "look" slow, such as if you have a site-wide 301 system and you have a lot of pages, or the pages are infrequently crawled; then it can take us weeks or months to crawl them. Traditionally such pages don't change at all, and we don't expect them to change.

The second thing is that people who do a "site:" search are asking for a URL on that domain and we'll show it to them. Even though it has been 301'd to a new domain, we keep the old one and will show it when you search that domain.

If you're changing/migrating a site, then use the WMT change-of-address tool and set up a new WMT instance for the new domain. Don't rely on "site:".

My take: The part about pages changing and Google expecting them to change in order for them to be considered for crawling is a key comment here. It proves beyond doubt that it's important to have content that updates frequently and that it's good to have all of your content pages update frequently in some manner.

Did he just say that even though you 301'd your content over, if someone searches for the old URL they'll show it? This is ridiculous of Google to do, because there are reasons webmasters want a page 301'd, and if someone finds that old URL and links to it instead of the new URL, the link benefit is diminished: a link to the new URL would pass 100% of its value, while a link through a 301 might pass 85%, or whatever the real amount transferred via a 301 is. This is another example of Google doing what they want instead of what the webmaster wants with their own site.


I also think Mueller could have provided a bit more help by telling us what webmasters can do to push their 301'd content into the index, such as "hey, if you see we're not getting to the content, just update your sitemap in WMT". That would have been nice. Or how about a bulk-URL Fetch tool?

Even though he gives good advice on a site migration, what about if we're just updating our URL nomenclature or our hierarchical structure? Same domain, just different URLs. According to him they'll index both, but does Google look at any of this as dupe content? Will my old pages show up in SERPs if the query is more attuned to the former page than the new one? Mueller's response raised more questions than it answered.
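On a practical note, when you do migrate, it's worth scripting a check that your old URLs really do return a 301 to the intended targets. A minimal sketch with the requests library, where the URL pairs are placeholders:

    # Sketch: verify that old URLs 301 to their intended new targets after a
    # migration. The domains here are hypothetical placeholders.
    import requests

    redirects = {
        "http://old-domain.example/page-1": "https://new-domain.example/page-1",
        "http://old-domain.example/page-2": "https://new-domain.example/page-2",
    }

    for old_url, expected in redirects.items():
        resp = requests.get(old_url, allow_redirects=True, timeout=10)
        # resp.history holds the redirect hops; the first one should be a 301.
        first_hop = resp.history[0].status_code if resp.history else None
        ok = first_hop == 301 and resp.url == expected
        print(f"{old_url}: first hop {first_hop}, landed on {resp.url} -> {'OK' if ok else 'CHECK'}")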


Q. Is one Penguin update enough to recover from a penalty?

A. Essentially yes... but... it can take time for us to process all of the link-based data we have on your site so that a future update may take that newly processed data into consideration and adversely affect your site's rankings.

My take: Straight enough question; straight enough answer.


Q. Our site had a lot of bad links and we disavowed most, even good links. We saw a rise in November but then, soon after, dropped back down. Why?

A. I'd have to look at your site specifically, but again, this shows that it can take time to process all of the link data.

My take: Come on Mueller! Google took the disavow into account and boosted the guy's site and then turned around and dropped it again. Was this because of new data or could there have been new things considered in the next update that look at the links differently? I smell a Google misdirection here. I think he could have been a bit more forthcoming, even generally, about why, even though he doesn't know the specifics of this guy's situation/site.


Q. With Penguin; will we see gradual increases or just one big one?

A. As you improve your site you should see gradual increases but following an update you may see a large one.

My take: To me, he's saying that you don't have to wait for an update to have link-related improvements considered by Penguin. This would be a change from the year that people had to wait for Penguin 3 to see any benefit. It could mean that they're doing constant refreshes with occasional larger updates. This partial confirmation is very welcome.


Q. Do Google algos target site-wide or on a per-page-basis?

A. We try to go granularly but Penguin affects you on a site-wide basis. If your blog (the example given) is on a different domain then we'll look at it differently. If the content is good then we may have signals that are boosting it while Penguin is dropping it.

My take: Which is it; granular or site-wide? OK, Penguin works on a site-wide level, but you could have answered the question directly by saying yes or no as to whether it can affect you on a granular level.


Q. Does removing META data from images affect ranking?

A. Not necessarily. For image search we take into account a lot from the context and some from the META data, but that's more just icing on the cake, as we find out more info like the resolution and date of creation. This is something where, if you want to remove it, it's your decision; you don't need to, but you're free to, and it won't really have a big effect on search.

My take: John... things like "not necessarily", "icing on the cake" and "big effect on search" mean that it will have some effect, thus the real answer to the question is perhaps "you can remove it if you have an issue with it, but it does affect rankings, to a small degree". He also specifically addressed image search but didn't answer the question about how the META data affects rankings in regular search. Was that an avoidance of the topic? If it has some effect in image search, then we can likely assume the added data can slightly affect rankings in regular search as well. Well, it's worth testing at least. We'll get a clearer answer from testing than we got from Google.


Q. If I have a one-page website and want to target 50 keywords, how do I target them? And if Google looks at the Title tag as a main factor, how does that play into it?

A. You can create a single-page site; that's your choice. But you should look at it from a marketing perspective, and if you're targeting diverse topics, then it's really hard for us to say "this site is relevant on the topic of shoes and the topic of cellphones". If you're targeting multiple topics or increasing your content, then you may want to consider a multi-page approach.

My take: John, you didn't speak to the scenario where they have topically-related keywords that they're trying to target on the same page. You only answered as if they're vastly different topics. It would be nice to have an answer for the similar-topic scenario or would that cause you to actually give something away?


Q. If I have multiple domains targeting the same content in different languages should I verify both in WMT?

A. Yes. Even if they're not targeting the same translated content but are in different languages, then yes, use WMT. Also use HREFLang markup. Do this on a page-by-page basis so the same content in one language references the same content in another language.

WMT also gives you stats on which parts are implemented correctly and which aren't.

My take: Cool


Q. If a site is in English and sells products internationally and ranks in many English-speaking countries should we add HREFLang markup and send to each country to a specific page tailored for that country?

A. You can but you don't need to. If you have content, though, that is country-targeted then you can use the HREFLang markup and we recommend that but if the content is barely different then it might be better to stay with one page that's "very strong". "Focus on the product that still ranks globally".

My take: One page can be stronger than splitting it into different pages, so it looks like duplication of content may still be seen and considered here by Google; or perhaps the links to that single page are better concentrated than diluted among different pages. He doesn't really dig into that, but international SEOs should consider these points when making this call. Perhaps test it out. He says you shouldn't split if the content is mostly the same, so the variants can evidently be perceived as dupe content; tread carefully and make sure the content is different enough, and that the difference is country-related.


Q. In the case of different country-specific pages with HREFLang, can having the currency change be enough?

A. Yes, that's something you could do especially if you're using those pages in Google Shopping but you could also do this with URL parameters. But in general if you have a really strong page that is valid for varied countries then I'd try to keep that.

My take: He said it numerous times; that if it's a strong page then keep it as one. This means that one page with all of its benefits is better than multiple pages unless you have to change it or it's specific enough to each country that you should go with different pages. This could be a nice hidden tip from Google here. Some testing would be good on this.


Q. If we are geo-targeting a site and it's primarily for the USA then do we need to set that in WMT?

A. We can usually figure it out but if you want to explicitly set it we'll explicitly follow it.

My take: If you explicitly set it, then you may explicitly be blocking it from other countries' searches. He already said that one page/site can rank well in different countries' search engines, so why limit yourself?


Q. My Toolbar PageRank has not updated, why?

A. We don't update it anymore. You should not worry about this but you can look at WMT for queries and click data and if you see that you have impressions but not clicks for certain terms you can review your Title and Meta Description to see if they're optimized. If your Titles and Descriptions don't match the topic then you can optimize them and it's a good step.

My take: Did he just mention Meta Description as something to optimize? Wait, I thought Google said they don't consider it anymore. He could be mentioning that because it can help with conversions (click-throughs) which he does mention so he's likely referring to conversion/CTR. Either way it's good advice of course. It's what we used to do with keyword referral data way back when.


Q. Are there any plans to update the date range on queries data to more than 3 months?

A. We are considering that for the next update but it's not guaranteed.

My take: Won't that be nice.


Q. WMT queries data is missing the most recent 2-3 days and this skews the data because it's included in the last 30-days; why?

A. It shows you that you need to work harder. It takes us time to process the data. Sometimes we have partial-day data.

My take: Not funny John... maybe Google should work harder to get the data in or simply restructure the data range so it goes 30-days from the most recently available date. Ya think?


Q. A lot of referral spam appears on my site, do I need to worry about that?

A. No, we see it. You can ignore or block on your server. Has no effect on websearch.

My take: Cool!


Q. If I disavow am I confessing?

A. If you're aware of a problem then disavow... it's not something that we take as an admission. It's technical only.

My take: Disavow is a great tool and if there's no backlash then use it.


Q. If we have an article on someone's site and they link to us in the footer of other pages should we disavow all of those links except the main article we have on their site?

A. That would be too much work and if it's really a natural link then just leave it, that's fine. If you're contacting sites and trying to get footer links everywhere then that's different.

My take: So natural footer links are fine. Google has stated that they can evaluate whether footer links are natural, and whether links in general are part of the content (having editorial value) or are stock-standard across the site's footers. He does tell us here, though, that valid links are good even in footers, so that's a nice clarification. Thanks, John.


Q. Do I need the rel-canonical tag?

A. No you don't have to have it but we recommend it.

My take: If Google recommends it then it's worth considering as it can potentially save one from a Google problem.


Q. If we have a services-targeted website and it's low on content, should we build content?

A. What are your users looking for? How are they searching? It's not good to create artificial content if they're not looking for it; artificially creating content doesn't make sense. Some sites with thin content rank well.

My take: "It doesn't make sense" doesn't mean that creating content won't rank or build-up the site's Panda quality rating. He says that some sites with thin content rank well. Doesn't Panda slap people that have thin websites? Which is it? He does answer a follow-up question about the same topic with advice that you shouldn't create fluffy content that's not quality but the first question asked, essentially, if there's a benefit to building out the content. He doesn't answer the question. He does, however, offer that if people get to your site and see fluff content and leave which sort of indirectly provides support for the idea that Google looks at bounce-rate as a ranking factor. He also says that people won't share that content which indirectly supports the Social as a signal importance.


Q. Is PageRank important, or should I not consider it in my SEO equation or strategies?

A. Don't consider it, because we don't update it.

My take: Toolbar PageRank doesn't update (another thing Google is taking away from us), but that doesn't mean the things that made PR important have changed. PR was based on links, citations, etc. These still matter greatly, so Mueller's comment is a bit misleading. He could have phrased it more realistically: the PR number doesn't matter, but the things it considered do.


Q. Do links from SEO reporting sites and others affect us? Do you ignore them?

A. We try to ignore them, especially links from auto-generated content, so we try not to put much weight on them. They're something that we (Google) have to live with. Of course, if you're running one of these sites, make sure you have value on them.

My take: They're OK, so we shouldn't have to worry about a Penguin slap because of them. If Google has to live with them, then they can't really downgrade you for them (Penguin-slap you), so they are beneficial, to some degree. This is good to know, John! Thanks. It's also interesting information for owners of these sites. Do I smell a self-confessed Penguin-proof link-building technique here? It's possible, but again, caveat: it's dangerous, so if you do go this route, beware!


Q. Does https only apply for websearch or does it apply for image search?

A. Only for websearch at this point. It's a really small ranking factor at this moment, so you're not likely to see any big jump by going to https. Go ahead and switch.

My take: Any jump is a jump, and you did say "at this time". It might benefit webmasters to start considering the switch, especially large sites, which have a better chance of seeing a small jump than small sites do. This will be more important when Google weights it more heavily in the future, so if you start now and do it slowly and smoothly, you won't have to rush later.


Q. Is the disappearance of search query data in Google Analytics related to a beta?

A. I wasn't aware of that disappearance but you can still get that data from WMT and Analytics will be switching over to the new format as well once that's finalized. If you signed up for preview version you should have that but we're going to be doing more testing for other people on the list.

My take: New format, new data, that would be good. What's coming?


Q. Why do I see different data between Analytics sessions and WMT clicks?

A. If someone doesn't have JavaScript enabled, it won't show. Multiple clicks by the same person can be counted as one session in GA.

My take: Standardize! How about providing comparable data so we can benefit from it? Why not include clicks in GA? Google has a myriad of subtle ways to make optimizing for them more and more difficult.


Q. If I have pagination on my site and it's generating hundreds of pages of content with the same Title and Description will that affect my site with Panda?

A. Panda looks at the overall quality of a website and its content, and Titles and Meta tags are not a sign of quality. They're something more for the search engines, something we show in search and not something the user actually sees on the website. So having duplicate Titles and Meta tags isn't really a quality problem, because we look at the overall quality of a website and not at the individual tags. They don't affect how a user sees your content.

If you have a lot of dupes it might be worth fixing but it won't affect the quality of your website.

My take: Bull-dung! Did you really just spit that out, John? Users don't see the Title? It's the main thing on the SERPs! It possibly doesn't play into quality, but the Title in relation to the content is important, and every SEO knows how important it is to optimize Titles. Are you saying it's not a Panda thing but another algo's (Hummingbird's)? You can't sit there and make people believe that the Title isn't really a signal, or that having a massive amount of dupe Titles is not a quality issue. Titles should be tailored to the content, so either dupe Titles across a site are tailored to the same content and the content is being duped too (big Panda no-no), or you have different content with the same Titles, and that's not a Panda issue? That makes for a quality site in Google's eyes? That's just wrong, John!


Q. If Panda considers quality does it take links into consideration like Penguin does?

A. We try to focus on the stuff on your site directly (with Panda) and not off-site. We try to keep our algos so that they don't overlap. It's something we try to avoid. We try to look at quality as we index it.

My take: "We try to" means that that's your main goal but there is overlap. Let's take onsite links. Are they Panda, Penguin or both? Well here Matt Cutts answers this very question showing that Penguin deals with internal links. You can't, however, say that internal links and anchor text don't affect the quality of a website or its content. Such links can be considered relevant supplemental content which increases the overall quality thus finding itself in Panda's domain. I think your answer is not genuine and you've left out some of the very important cross-overs that do happen. You could have given a more straight-forward answer instead of obfuscating by using "we try to".


Q. I've completed a site and have content out on hub pages; should I move it to my main site?

A. In general, if you're moving from a shared environment to your own, I'd try to move that content to your own domain so you have all of your content there. If you want to keep that content out there as well, in general it's up to you. Having it on your own domain tells Google that this is a separate entity that we should take seriously on its own.

My take: Really, John? Did you just say that we can have original content on other sites, keep it there, and also put it on our own main site, and that won't be duplicate content? That won't downgrade the value of that content on our site, since it wasn't originally posted there? You could have answered the actual question instead of modifying it and repeating that migrating from a shared space to your own is a good idea. The question wasn't about that; it was about having a new main site and bringing in varied content from the other sites where it resides. Either you're saying there won't be a dupe issue, or you're telling people to pull that content in and risk dupe issues, which would be bad advice. Which is it?


Q. I have a client who got a negative SEO attack. You usually tell us not to worry about that but the number of links this competitor is adding is pretty high, should I disavow them?

A. My general advice is if you see this you can use the disavow tool. If you suspect that something crazy is happening and Google is getting confused you can email me and we'll take a look. But disavow should work.

My take: Cool!


Q. Should I build backlinks?

A. Don't focus on link-building but focus on quality content. We do use links as part of our algorithm.

My take: Links have a bit more than the small impact you implied. I think Mr. Mueller was seriously down-playing the value of links with his comment that one shouldn't "focus" on links. Obviously links are dangerous (i.e. Penguin), and what's OK today may not be tomorrow. It's also good to generate links naturally via link-bait, PR, outreach and great content. However, links still matter greatly, and people will still mess with Google by link-building.


Q. If we have a toxic domain that you've told us not to redirect to our new one, could we 301 it and take it down?

A. No. If this happens, we'll see it. It's not a combination of websites; it's something different that we can see.

My take: You're not answering him because you've told him one thing, but it's different when applied to someone else. Still, it's good to know that you can see negative SEO.


Q. Will different algorithms be shown as specific data in WMT, especially search quality data?

A. No and I don't see that changing anytime soon.

My take: Google won't give any data (above what they give) that allows you to reverse engineer them. They'll sooner remove data but don't ask for real, hard-core data from them. Try whistling Dixie instead; at least you'll enjoy a tune.


Well - that's all folks! Let me know your thoughts and your takes on this.

Cheers

Mechanical Hummingbird

My take on Semantic Search and Hummingbird.


Semantic Search and Google's Hummingbird Algorithm; Mechanical Hummingbird art by JossHorror

A hummingbird is a beautiful, lively and ever-so-loveable creature that gives one a feeling of animation and personality.



What I think people get wrong is that an algorithm is absolutely none of these things. I'm not saying people actually believe the Hummingbird algo to be loveable and lively, but they see it as caring about things it actually cannot care about.

An algorithm is a cold and calculating recipe for pulling data based on certain pre-defined elements; even though it can be very expansive and take in a massive number of such elements before spitting out results, it doesn't actually "consider" anything at all.

Looking anew at Hummingbird and Semantic Search in this rather cold and calculating manner will help SEOs to cater their sites to it and hence rank better, in my humble opinion.

To figure out what the algorithm is programmed to feature requires that we look at the actual facts.

As David Amerland (likely the leading name in Semantic Search) writes here:


"When complex analysis and mapping takes place, it inevitably reveals usage patterns that can signal intent. Google makes full use of this capability. Hummingbird can now broaden the search horizon by suggestion links and information that are related to a search but were not asked for. A request for local museums, for instance, may lead to data being shown about artists whose works are exhibited there or a search for a local restaurant can also bring up reviews, recipes and food styles".


"The point is that from a practical perspective those seeking to be found need to focus on the creation of content that is as information-rich as possible. Keywords on their own, now ... simply ain't gonna cut it".



I agree with him especially on creating content that answers questions but how do keywords play into it?



The cold algorithm still requires keywords in order to determine anything (without words you cannot have a language-based search engine); this is axiomatic. Hummingbird must seek to determine user intent via the keywords, phrasing and other information (e.g. previous searches by the very same person, results clicked on and not bounced from, the same aspects from others who've searched Google, plus the authority of the content and of the writer who presents those low-bounce results which appear to answer the question sought).



As for David's explanation of why 100% Not Provided was announced to coincide with the launch of Hummingbird (during Google's 15th birthday), I think it is only partially right (IMO).

David writes:



"By removing keyword reporting in Google Analytics Google has removed the "what keyword am I going to build content, around, today?" strategy. Content will always contain keywords, but it should be created to answer specific, potential, end user questions rather than surface a page because of a keyword".



I think there is an additional, and even more important, reason for the convenient timing of the two launches. I believe that the long-tail keyword data previously available from Analytics could have allowed a more in-depth reverse engineering of Hummingbird, by tracking the difference between the keywords, terms, phrases and question-style queries that began generating traffic and the more general keywords that provided traffic (ergo rankings) prior to the launch of Not Provided.


Seeing how Hummingbird specifically changed the landscape (i.e. which keywords now do and do not perform, as well as which pages rose and fell) could allow clever SEOs to lift the curtain on the algo. Google was being the mama bear again, protecting its cub, the algorithm, from SEOs' poking sticks.



With the vast majority of keywords falling into the Not Provided designation, we require other tactics. This means we look at what Hummingbird does, coldly and mechanically, so to say.



If it is able to glean user-intent (to some degree) and it looks at topics to a greater degree - then it is important for the content writer or SEO to determine what questions and desires their potential market is searching on and how they can best answer them (best meaning in comparison to others answering them).



Also it's important to look at other features that bolster the relevance of your content to the question or desire. This means looking into supplemental content and things that can increase the information gained. Supplemental content can be links to other topically related content or things that assist the user such as a mortgage calculator on a real estate site or a translation feature on an international site.



Authority is becoming more and more important. What matters is how those considered authoritative engage with your content contributions to the topic (e.g. linking to, citing, and even mentioning you, your name, your site, your brand, etc.), and how those not deemed authoritative do the same.



Building authority is primarily based on citation (see Google's original PageRank paper entitled The PageRank Citation Ranking: Bringing Order to the Web).



It shows you that citations are how Google connects the dots between people, sites, brands, etc. Links are citations, but so are mentions.



Branding is ever more important and research has shown that established brands rank better simply because they're established brands. Branding is a clearly defined thing... it requires you to connect your name or mark to the topic and keywords so that they become closely associated with each other via usage. Google did this so well that their brand became a verb synonymous with searching online.



Building authority and branding can go hand in hand so getting people to use your name or mark in association with the terms and topics helps on very many levels. Sometimes optimizing for semantic search can be very difficult.



You can use this topical "find the question or desire, then answer it" approach more easily on subpages of content. But if you're a service provider, for example, you may not be able to describe on your homepage just how you're a Los Angeles Plumber that is more a Los Angeles Plumber than the next Los Angeles Plumber. So one must look at the qualifications Google finds most relevant to people seeking one.



Location in this case is going to be important, as will reviews and social and authoritative signals. But when it comes to the content and the terms used, it will require some reverse engineering of your ranking competition as well as those who rank below you. Find what appears to work and what appears not to and begin an approach based on that data.



For SEOs it still comes down to analysis, theories, sleeve-rolling, creation, trial and error, results, analytics analysis, conclusions and a repetition of that very process. Remember, in the end, you're not fighting against a loveable little bird; you're fighting against a cold, calculating algorithm.



Well that's my two cents – I'd love to hear yours.





Fantastic "Mechanical Hummingbird" art by JossHorror

Matt Cutts on Internal Linking

Matt Cutts on Internal Linking SEO - Image from MattCutts.com

In today’s Post-Penguin world we find ourselves focusing on Onsite SEO more and more and one important aspect of which is Internal Linking.  Since links pass both PageRank and Relevance it is important to make sure that your Internal Links are optimized to conduct PR throughout your site and to optimize the anchortext of those links so that they’re targeted to the topic of the content and that of the destination page.

Many SEOs may be hesitant to delve deeper into Internal Linking because they’re not sure what is allowed and what the differences are between Internal Links and External Backlinks.
There are certainly some big distinctions between the two.

Here we review what Matt Cutts has to say and add some of our own interpretations:

Exact-Match Anchortext

Is EM anchortext OK? Well, in a word... yes. EM anchortext is allowed in internal links without fear of a Penguin slap. Many sites, especially dynamic, database-driven sites, place links automatically, and it is intuitive to name the link in a manner that expresses the topic of the link's destination page.
The big caveat is abuse or spamming. Use EM anchortext links in a way that is natural.

http://youtu.be/6ybpXU0ckKQ

Hyphens-or_Underscores

There has been a lot of concern about whether to use hyphens or underscores in the URLs you use in internal links. Here Matt talks about which they prefer and why. In summary, they prefer hyphens overall, but if you want to be specific: hyphens are treated as separators, while underscores conglomerate terms. So if you're doing a page for "Search Engine Optimization" you could use underscores (search_engine_optimization), because it is a single term, while if you're doing a page for the 2015 Honda Accord you'd use hyphens (2015-honda-accord).

http://youtu.be/AQcSFsQyct8
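As a practical aside, here's a tiny sketch of a slug helper following that convention. The separator parameter is our own device for switching between the two styles, not anything Google specifies:

    # Sketch: build URL slugs, with hyphens as the default word separator.
    import re

    def slugify(text, separator="-"):
        text = text.lower().strip()
        text = re.sub(r"[^a-z0-9]+", separator, text)  # collapse non-alphanumerics
        return text.strip(separator)

    print(slugify("2015 Honda Accord"))                 # 2015-honda-accord
    print(slugify("Search Engine Optimization", "_"))   # search_engine_optimization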

Do Image Links Matter?  Yes they do!

Image search can generate some decent traffic, and without the topic clearly stated in text in the image itself, the search engines require certain data, found in and around the image, to determine its context and topic. The naming of the image and its location (the directory names) can help to define the image's topic, especially if the Alt attribute doesn't specifically do so. Google can also look at the source of the link, which gives an opportunity to provide additional keywords to aid them.

As with any method you have to avoid spamming, so don't create an image at "website.com/keyword/keyword/keyword/keyword.jpg" or you risk the wrath of Penguin.

http://youtu.be/h2SWuUobbr0
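A quick way to spot images missing that data on your own pages is a rough audit script with requests and BeautifulSoup; the URL here is a placeholder:

    # Sketch: list images on a page that are missing descriptive alt text.
    # The example.com URL is a hypothetical placeholder.
    import requests
    from bs4 import BeautifulSoup

    html = requests.get("https://www.example.com/", timeout=10).text
    soup = BeautifulSoup(html, "html.parser")

    for img in soup.find_all("img"):
        alt = (img.get("alt") or "").strip()
        if not alt:
            print(f"Missing alt text: {img.get('src')}")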

Follow or NoFollow

The question of using NoFollow attributes on internal links has been discussed over and over. Many use this for PR sculpting, while others simply don't want certain pages indexed. Matt says to just leave the NoFollow off all internal links, but if you listen carefully to what he says, you can see that PR sculpting can provide some benefit; he just doesn't like to openly discuss it. What he does stress, and rightly so, is to include the NoFollow on user-generated content such as reviews and comments. Unless you can personally vouch for each link placed on your site by a user, you should add the NoFollow so you don't get penalized for other people's link practices.

http://youtu.be/86GHCVRReJs
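As an illustration, here's a sketch of adding rel="nofollow" to links inside a comments block before rendering, using BeautifulSoup. The "comments" class name is an assumption about your markup; in practice you'd do this in your templating or comment-rendering layer:

    # Sketch: mark every link inside user-generated content as nofollow.
    from bs4 import BeautifulSoup

    html = """
    <div class="comments">
      <p>Great post! Check out <a href="https://spammy.example/">my site</a>.</p>
    </div>
    """

    soup = BeautifulSoup(html, "html.parser")
    for a in soup.select("div.comments a[href]"):
        a["rel"] = "nofollow"  # don't vouch for links you didn't place yourself

    print(soup)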

Hansel & Gretel SEO

Breadcrumb menus are important and beneficial. Matt states it best: "Have a set of delimited links that accurately reflect your site's hierarchy". I've seen many sites that use pages in their breadcrumbs that don't naturally exist in their site's hierarchy, because they either don't have a certain category page or they don't want that page to be hit primarily. It's best to create a solid hierarchical page structure and stick to it. Using EM in your breadcrumb is also acceptable and advised.

http://youtu.be/-LH5eyufqH0
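One simple way to guarantee the breadcrumb mirrors the real hierarchy is to derive it straight from the URL path. A minimal sketch; labels here are naively derived from slugs, and a real site would map slugs to proper page titles:

    # Sketch: build a breadcrumb trail directly from the URL path so it
    # always reflects the actual site hierarchy.
    def breadcrumbs(path):
        crumbs, href = ['<a href="/">Home</a>'], ""
        for segment in path.strip("/").split("/"):
            href += "/" + segment
            label = segment.replace("-", " ").title()
            crumbs.append(f'<a href="{href}">{label}</a>')
        return " &gt; ".join(crumbs)

    # Produces links for: Home > Services > Seo > Los Angeles
    print(breadcrumbs("/services/seo/los-angeles"))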

Multiple Links to the Same Page

Multiple links to the same page with different Anchortext?  This is a great one.   What if you have multiple links to a page from another page?  Do they transfer PageRank or Relevance?  Yes they do!
Multiple links transfer multiple amounts of PageRank. This means that if you have a header, footer, image and text link to your homepage, then it should transfer four links' worth of PR to that page, determined by those four links' percentage of the total links on the page.

If you listen closely, though, Matt refers to the original PageRank paper submitted by Google's founders. It's a sneaky way to say: that was how we did it then, but it may not be how we do it now. By answering this way, though, I think it is still in a lot of ways valid. I believe they've taken this into consideration and likely downgrade some of the PR passed when duplicated. Their ability to do so is illustrated in the next part of the question, pertaining to the transference of relevance.

While link extraction takes the anchor text keywords and connects them with the destination page, it doesn't necessarily do so for all of the links. This means Google clearly has the ability to pick and choose in an automated fashion, so I wouldn't doubt that they're doing that with PR transference as well. With the transference of keywords/relevance, though, it is good to make sure the link has the right keywords, ones that truly describe the content of the destination page, and that the link sits within relevant editorial content on the source page.

http://youtu.be/5AsLWIuNNMU
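To make the arithmetic concrete, here's a rough sketch following the original PageRank paper's model, which, as Matt hints, may not match Google's current policy:

    # Sketch: share-of-PageRank arithmetic per the classic model, where a
    # page's outbound PR is split evenly across its outgoing links, so 4
    # links to the homepage out of 40 total pass 4/40 of the damped PR.
    def pr_passed(page_rank, total_links_on_page, links_to_target, damping=0.85):
        share = links_to_target / total_links_on_page
        return page_rank * damping * share  # the classic d * PR/C term

    # e.g. a page with PR 1.0 and 40 outgoing links, 4 of which point home:
    print(pr_passed(1.0, 40, 4))  # 0.085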

How many links on a page is OK?

Here Matt tells us that it used to be around 100 links per page, but this has changed to allow for more. The number of links was based on the size of the document (in KB), but they've since increased the amount of data they accept, read, cache, etc. They also realize it can be common to have aggregators of links, directories, etc., so they have no stated limit. There is the ever-present caveat of "do it naturally". Again Matt says that PR is divided by the total outlinks, but once again he quotes the original PageRank paper, not current policy. Though the caveat tells us not to spam links, Matt says that 200 to 400 links isn't bad; however, the content length should somewhat match.

http://youtu.be/QHG6BkmzDEM

Non-Link References (Mentions) as a Signal

Here Matt says that mere mentions aren't necessarily considered a signal because they could be abused. On this I call "shenanigans"! That would stand in stark contrast to a number of different aspects of Google's approach to branding and to their focus on linguistic semantics. The mention of a domain or brand would naturally increase that company's relevance to its keywords if the brand or domain is consistently connected to those keywords within content; in other words, using something that Matt admits is an actual Google signal: proximity. If the brand or domain is connected by proximity to the keywords on a myriad of sites, then Google would determine that the brand or domain is relevant to the keywords. This may not help as far as Penguin goes, but it would definitely help as far as Hummingbird goes. In fact this predates Hummingbird considerably, because Google's old Wonder Wheel showed that Google did connect mentions as part of its graph. They got rid of it (probably in part for this reason), but it can still be seen, to some extent, in the Keyword Planner's Adgroup Ideas section. It is likely part of their brand algo as well. Either way, here's what Matt has to say:

http://youtu.be/nlByfISlj5w

Links in Footers versus Editorial Content

Footer links might not carry the same weight as links in editorial content; Google reserves the right to downgrade links in non-editorial content like footers, but it's not guaranteed. This has been shown, IMO, to be the case. Footers have grown in size, and many links are to be found in many a footer these days, most of them dynamically generated. It is not a bad idea, by any stretch, to avoid links in footers; since content is king, it is still advised to have a good amount of editorial content in which you can include topically relevant links pointing to further topically related editorial content. That's Penguin food right there!

http://youtu.be/D0fgh5RIHdE

Tag Clouds

Tag clouds can be beneficial, but they are really just a list of links, so the same rules apply: don't over-stuff them. A small number likely won't hurt, but PR will flow out through those links as well. In this case it may be good to have a tag cloud with truly topically related links that point to great content. Don't have these links go to dynamically generated pages that have no content (especially on platforms that automate this, like WordPress). Consider creating a custom tag cloud so you can control the links and the content being linked to.

Matt doesn’t have them on his blog.

http://youtu.be/bYPX_ZmhLqg

So?

So internal linking is obviously important: it transfers PageRank and relevance/keywords, and it can be done within content or in menus, so it can help with building and fine-tuning the relevance of your site and with transferring the PR that new content pages need. It can also help, especially in menus, to increase conversions by directing people to where they need to go (or rather, where you want them to go).

I recommend that every SEO review their current internal linking structure to see if there are ways to optimize it to better reflect your topic or direct your visitors. This will also give you a great opportunity to leverage existing content by reviewing it and rewriting it based on new editorial content optimization techniques for Panda and Hummingbird. Don't hesitate to use mentions on your own site or on other sites. PR sculpting can also be valuable, especially on end-pages, those pages at the bottom of your site's hierarchy that only link upwards or sideways. Do some testing and let others know your results.

Best of luck!
