Like those of many search engine marketing companies, some of our clients have been negatively affected by the Panda/Penguin updates (read: they disappeared from the SERPs). I have carried out my own research and have read many reports and analyses of the Panda/Penguin algorithm, here and elsewhere. I’m increasingly seeing references to LocalRank and Hilltop as the most likely factors behind the new algorithm, and those articles and reports back up my own research.
I’d like to make some practical suggestions, or at least offer things to try out, for those affected by Panda/Penguin.
Disclaimer: These suggestions are essentially for trial purposes and I do not claim to hold the definitive answer.
Becoming an authority
LocalRank is essentially all about the interconnectivity of the pre-Panda/pre-Penguin SERPs for a given keyword. It is believed to be based on the top 1,000 pages of the old SERPs, although I suspect the pool is likely smaller. What this means is that the results of the top pre-Florida SERPs (the old algorithm) are put through a new filter, which then forms the basis for the current SERPs. The LocalRank is calculated, in layman’s terms, from the number of incoming links a website has from websites that previously ranked well in the old SERPs. In other words: are websites on your theme/keyword that previously ranked well linking to you?
A score is then calculated from the old-algorithm SERPs (OldScore) combined with LocalRank (LRScore) to determine the Panda/Penguin SERPs (it is not clear whether the two are multiplied or added).
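To make the OldScore + LRScore idea concrete, here is a minimal Python sketch of a LocalRank-style re-ranking pass. Everything here is an assumption for illustration: the function names, the additive combination (as noted, the actual combination is unknown), and the idea that the pool is simply the old top results.

```python
def rerank(old_results, links_from, top_n=1000):
    """Hypothetical LocalRank-style re-ranking sketch.

    old_results: list of (url, old_score), sorted by the old algorithm.
    links_from:  dict mapping url -> set of urls that link to it.
    """
    # The pool of candidates: the top results of the OLD algorithm.
    pool = {url for url, _ in old_results[:top_n]}
    reranked = []
    for url, old_score in old_results[:top_n]:
        # LocalRank idea: count inbound links that come from other
        # members of the same pool, i.e. sites that already ranked
        # well for this theme under the old algorithm.
        lr_score = len(links_from.get(url, set()) & (pool - {url}))
        # The real combination is unknown (multiplied or added?);
        # assume a simple additive combination here.
        reranked.append((url, old_score + lr_score))
    reranked.sort(key=lambda t: t[1], reverse=True)
    return reranked
```

With this toy model, a site with a slightly lower old score but links from two other pool members can overtake the old number one.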
Evidence that theming/LocalRank/Hilltop is factored into the current algorithm is pretty strong, imo. There are the odd exceptions, but they may well apply to ‘weak’ themes (weak meaning not commonly searched-for terms).
Portals, large websites, subdomains of authority sites, high-PR sites (PR 7+), resource/info sites and professional spam sites are rising to the top.
On ‘weak’ themes there is a correlation between the number of internal pages indexed and the fact that the internal pages that rank well have the exact phrase as the file name.
As mentioned, I’m seeing a great many websites at the top of the current SERPs that have lots of incoming links not only from similarly themed sites but even from their competitors. For the most part the top sites are indeed authorities on their subject. Those that aren’t authorities on their theme, yet still rank well, seem either to have a large marketing budget or to:
- Purchase text links on similarly themed sites
- Spam blogs
- Run multiple domains (on varying C-block IP numbers)
Of the various theories out there, what I personally don’t buy is:
a. That there is an over-optimization filter. Many sites at the top of Florida/Austin keywords have high keyword densities, H1 tags, etc.
b. The Google AdWords trigger theory, i.e. that Google have specifically targeted keywords based on AdWords price. I believe Google has enough search data to know what is a “strong” theme (many searches) and what is a “weak” theme (terms with few searches).
c. That the quality of the content is still playing a major role as a signal.
So how do you get to be one of the authority sites?
Well, of course, the usual way is to create great content that webmasters will feel is worthy of linking to. Not any old link will suffice: links are needed from sites that are themselves regarded as authority sites. For “weak” themes I still see good results from high link popularity even where the links are not thematic. Hence for weak themes you will likely see link-farm spam rising to the top.
Here are some suggestions for becoming an authority site:
- Get a link from the major directories. This supports the theming theory believed to be in place, as the category page holding your link is highly likely to have many more occurrences of your keyword/theme and to be very thematic (unless it’s a regional category). I am seeing a lot of Google Directory category links visible at the very top of Austin results.
- Make sure your website content contains all the theme keywords. You can check this with SEMrush.
- More effort should be put into finding thematic portals and quality directories where site submissions are accepted for review (in Google, try ‘keyword directory’, ‘keyword +submit site’, etc.).
- As links from websites of the same theme and links from authority sites appear to be important, you might consider trading links with a competitor. You might think Webby’s lost his marbles with that suggestion, but trading a link with a competitor that is ranking well in the current SERPs may bring a large boost for your own site. Competitors are not easy to persuade, but with a well-formulated email it is feasible.
- Large sites have, by default, lots of content. More content is likely to mean more chance of one or more of your sub-pages getting linked to. It is also logical that Google sees a large site as evidence of more effort by the webmaster and deems it more of an authority than a smaller site. This suggests increasing the size of your website to compare with the number of indexed pages of the top sites in the current SERPs (‘site:www.domain.com’ in Google). Think glossary/lexicon, FAQ, forum and detailed product descriptions as potential areas for more pages.
- Create something unique such as an on/offline tool or perhaps a research article. Basically something of worth that none of your competitors have on their sites.
- If applicable to your theme, create a forum. This has two advantages: if it is a search-engine-friendly forum you will get many more pages indexed (I have thirty-odd thousand pages indexed, mostly coming from my German-language SEM forum), and posts in a forum regularly get linked to from other forums or, if it’s a particularly good post, perhaps directly from an authority site.
- If you are knowledgeable in your field, write articles and place them on your website (don’t syndicate immediately). If the article is of worth, authority sites may well link to it. A little public relations (the other PR) can help here to get an article published on a major authority site; be sure to get that backward link, though.
- Google have a new patent for distinguishing duplicate pages and near-duplicate pages. The important bit is ‘near’-duplicate pages. If you have several domains and the content is not completely duplicated but very similar, consider a new layout and rewording paragraphs to make the sites more distinct.
- Google can identify cross-linking across multiple domains on the same IP C-block. A lot of the cross-linking merchants have lost their rankings because they hosted their multiple domains on the same C-block and none of the domains were themselves authorities. Another reason why professional spammers are still in the SERPs is that their multiple domains are on completely different IP blocks and they have the budget to buy their links (no names, but I’m sure you know examples of this yourselves).
Having multiple domains is not, however, in itself spam. Don’t kick the proverbial out of cross-linking, but do review your hosting arrangements: are your domains on the same IP C-block?
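The theme-keyword suggestion above can be sketched as a simple coverage check. This is not SEMrush’s API, just a rough stand-in: case-insensitive whole-phrase counts over the page text, where a zero flags a theme keyword your content is missing.

```python
import re

def theme_coverage(page_text, theme_keywords):
    """Count whole-word/whole-phrase occurrences of each theme keyword.

    Word boundaries (\\b) avoid false positives such as counting
    'car' inside 'carpet'. Keywords that never appear map to 0.
    """
    text = page_text.lower()
    return {
        kw: len(re.findall(r"\b" + re.escape(kw.lower()) + r"\b", text))
        for kw in theme_keywords
    }
```

Running it over a page draft quickly shows which theme terms are absent and which are already covered.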
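The near-duplicate point above can be illustrated with a classic shingling check: pages that are ‘very similar’ rather than identical still share most of their overlapping word windows. This is only a generic sketch of the idea, not the method from Google’s patent.

```python
def shingles(text, k=5):
    """Return the set of k-word shingles (overlapping word windows)."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def similarity(text_a, text_b, k=5):
    """Jaccard similarity of the two shingle sets.

    1.0 = identical text, 0.0 = no shared windows; a reworded page
    that only swaps the odd word still scores well above zero.
    """
    a, b = shingles(text_a, k), shingles(text_b, k)
    return len(a & b) / len(a | b) if a | b else 1.0
```

The practical takeaway matches the advice above: swapping a single word barely moves the score, so genuinely rewording paragraphs is what makes two domains look distinct.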
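Checking your own hosting for a shared C-block is straightforward. A minimal sketch follows; the IPs are passed in directly so the example needs no network access, though in practice you would resolve each domain first (e.g. with `socket.gethostbyname`).

```python
def c_block(ip):
    """Return the C-block (first three octets) of a dotted-quad IPv4."""
    return ".".join(ip.split(".")[:3])

def shared_c_blocks(domain_ips):
    """Group domains by C-block and keep only blocks hosting 2+ domains.

    domain_ips: dict mapping domain -> resolved IPv4 address string.
    """
    groups = {}
    for domain, ip in domain_ips.items():
        groups.setdefault(c_block(ip), []).append(domain)
    return {cb: ds for cb, ds in groups.items() if len(ds) > 1}
```

Any non-empty result means some of your domains share a C-block and their cross-links could be discounted as described above.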
For mom-and-pop small online businesses, getting a high ranking just got much harder. Most cannot afford to purchase text links from similarly themed websites, and they cannot compete with the professional spammers who run dozens of separate domains on separate IP blocks, as such hosting is unaffordable. They can only rank well for minor (weak) terms, as some posts here have already highlighted.
On-page optimization is still a factor, but much less than it used to be, imo. What smaller websites that don’t have link-purchase budgets can do, imo, is increase the size of their website, as mentioned in the large-sites suggestion above. Perhaps also split ‘scrolly’ pages into two or three smaller pages.
On a side note, I’ve researched the number of backward links and the number of indexed pages for many Florida top-10 SERPs. There seems to be some correlation between the top results and:
a. Many backward links + many indexed pages
b. Fewer backward links but many indexed pages
c. Many backward links but few indexed pages
I believe it is the combination of these two factors which determines an authority page, especially links from sites of the same theme. It is as if there is some kind of threshold where sites are filtered in or out. I don’t know what the threshold is, but I imagine it is based on the strength of the theme and the competition.
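Purely as a toy illustration of the speculated threshold: an additive model is one simple form that fits all three observed patterns (a, b and c) above, since either factor alone or a mix can clear the same bar. The function and numbers are invented, not a known Google formula.

```python
def clears_authority_filter(backlinks, indexed_pages, threshold):
    """Hypothetical additive authority filter.

    A site passes via many backlinks (patterns a/c), many indexed
    pages (patterns a/b), or a mix of both. The threshold itself
    would presumably vary with theme strength and competition.
    """
    return backlinks + indexed_pages >= threshold
```

The point of the sketch is only that a single combined score explains why sites strong on just one of the two factors can still appear in the top results.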
So, imo, new sites with low budgets, or those hit by Austin/Florida, need to increase the number of pages in their sites and get something unique on their site that makes it worth linking to. Really, this is what Google have been saying all along.
However, more and more 3-4-word search terms no longer surface highly relevant ‘smaller’ websites, due to the new algorithm placing far too much emphasis on authority.
The results often show authority sites coming top for 3-4-keyword phrases with the phrase, or even just half of it, occurring only once or twice. This hardly makes the page relevant, and it is not good. It means that if you ranked well before and have now lost your ranking, it doesn’t matter how relevant your pages are or how good your on-page optimization is: if you aren’t an authority, or at least becoming one, you have no chance of getting found unless it really is a very niche keyword phrase. You might want to do some synonym/thesaurus checks to find relevant terms that do not include a ‘strong’ keyword. That way you probably have more chance of getting found.
I can’t tell you for sure whether LocalRank or Hilltop is in use. It is most likely a combination of the two, plus some further filtering we don’t know about, which might explain some anomalies. Then again, I might also be completely wrong.
Anyway, I just thought I’d at least provide some practical ideas to try out. For those devastated by Panda/Penguin, some of my suggestions might be worth trying. It certainly can’t hurt your site.