This month, my interview is with Dixon Jones, an SEO who has been involved with Internet marketing for as long as I have, as founder and director of the search engine marketing agency Receptional, which was launched in 1999.
Since I last met Dixon at Search Engine Strategies London, when I asked him for an interview, he has also joined the link analysis services company Majestic SEO as a director, so it seemed natural to find out why and how we should best use link analysis tools, if we don't already. As well as the discussion about marketing, I recommend Q4, which looks specifically at the stages in link analysis for SEO.
By the way, if you don't follow his advice and thoughts already, I can also recommend his SEO-focused blog or Twitter feed.
Why use a link analysis service?
Q1. Why is a specialist paid service like Majestic SEO or SEOmoz Linkscape needed for link analysis? What do you get over and beyond the Yahoo! "linkdomain:" command?
The linkdomain: command is mostly useful in Yahoo's Site Explorer. It returns a list of websites that link to the website you are analysing. This is great if you want to go through each of these, one by one, to put the links into context, but the data comes back at the domain level and really isn't conducive to any kind of structured analysis. It doesn't list anchor text, it doesn't specify the exact page being linked to, it doesn't give any qualitative information about the nature of the inbound link and - on top of that - Yahoo limits you to a thousand links.
Features to look for in a link analysis tool
Q2. OK, I'm sold! Which key features should you look for when you're reviewing and selecting a link analysis tool? By the way, do you find these tools are mainly used by SEO agencies or client-side SEO specialists?
Who says this is just for SEOs at all? How about using it in conjunction with a whois lookup (or the inbuilt IP lookup) to identify collections of websites owned or controlled by the same person? I would imagine that it would be great for assessing market share when layered with other data... or you can use it to define your competitors' affiliates.
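As a rough illustration of that IP-grouping idea, here is a minimal Python sketch, assuming you have already exported a list of linking domains together with their IP addresses (the domain names and IPs below are made up for the example):

```python
from collections import defaultdict

def group_by_owner(domain_ips):
    """Group linking domains by IP address, keeping only clusters
    of two or more domains that may be under common control."""
    clusters = defaultdict(list)
    for domain, ip in domain_ips:
        clusters[ip].append(domain)
    return {ip: doms for ip, doms in clusters.items() if len(doms) > 1}

# Hypothetical export of (linking domain, IP) pairs
backlink_domains = [
    ("blog-a.example", "203.0.113.7"),
    ("blog-b.example", "203.0.113.7"),
    ("news-site.example", "198.51.100.4"),
]
print(group_by_owner(backlink_domains))
# {'203.0.113.7': ['blog-a.example', 'blog-b.example']}
```

Two or more linking sites resolving to the same IP is only a hint, of course - shared hosting puts unrelated sites on one IP - but it is a cheap first pass before digging into whois records.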
The more sophisticated the research, the more you'll relish a feature-rich system. The drawback is that feature-rich also means there is a learning curve. Majestic SEO keeps track of deleted links, for example. We know that other search engines (like Microsoft's Bing) do the same, so we need to keep that data.
In the first instance, the SEO needs to be able to filter down to the links that carry link juice for search engines and also needs to be able to see anchor text to understand context. Beyond that, needs diversify dramatically. One engine's "link juice" might not count in another engine's algorithm and, even more importantly, Google is constantly tweaking its own algorithm, so a competent SEO would be a little concerned about a tool that makes all the decisions about what constitutes a "valid link". Look for a tool that provides as much data as possible, but then also provides many ways to filter that data, so you can pull out what you feel is relevant.
Majestic SEO vs Linkscape
Q3. Where do you think the Majestic SEO tool scores compared to its main rival, the SEOmoz Linkscape tool?
Do you think Linkscape is a rival? If you do then you are not alone! But since controversy makes for better reading than agreement, I think that Linkscape has great reporting tools based on an insubstantial dataset. Majestic has the dataset but not, currently, all the other pretty bells and whistles in the SEOmoz toolbox. It's easier to build good tools based on sound data than to build up the data in the first place. We expect that other organisations will build applications under licence and we will continue to develop our own web-based interface to the data, which all SEOs can use right now.
But if you want to pitch us against each other... Linkscape publicly reports 54 billion URLs analysed over 230 million domains. (I think they are now 30% larger than the public reports.) By contrast, Majestic SEO has publicly admitted to "697 billion URLs": http://www.majesticseo.com/research/competitors-analysis.php as of a few months ago. If you want to compare us to Yahoo, for that matter, we think we are beating them as well.
Steps in link analysis
Q4. Summarise the steps (or main reports) you would recommend following for link analysis when auditing a site for the first time. Which linking issues should users watch out for?
I need to write a small book on this, so excuse the brevity:
Step 1: Eliminate the bad data
The first challenge is to make sure you eliminate links which are clearly of no value. Some tools try to do this for you, but in taking the tool's word for it, you are giving up on your own understanding of what constitutes a worthwhile link. I would suggest that you only want:
- Links that are there when the crawler re-spidered: They should ideally not redirect, but if they do then you'll be looking for 301 redirects only. (I'm not sure Google is giving 301s the same credence that they did a year ago, by the way.)
- You need to ascertain the strength of the inbound page: SEOMoz uses "Mozrank"; we use "ACRank" which is based on the number of pages linking INTO the page that you are getting linked from. Yahoo simply lists the more important pages first.
- You should consider whether you want to include or eliminate non-text links: for example image links, framed pages and possibly other "signals".
This will give you a much better subset to work with. When comparing with other sites, make sure you retain the same definition - at least within the confines of the analysis you are working on!
I also ONLY count one link per domain. Not doing this can really hamper any analysis. One website with your domain listed in the blogroll can destroy any meaningful level of analysis. Others might consider limiting measurements to one per IP address or even one per Class C subnet.
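That de-duplication step can be sketched in a few lines of Python, assuming each exported link is a dict with "domain" and "ip" fields (an assumption about your export format) and the list is pre-sorted so the strongest link comes first:

```python
def one_link_per_key(links, key="domain"):
    """Keep only the first link per domain, or per Class C
    subnet when key='class_c'. Sort by strength beforehand so
    the strongest link from each domain is the one retained."""
    seen, kept = set(), []
    for link in links:
        if key == "class_c":
            k = ".".join(link["ip"].split(".")[:3])  # first three octets
        else:
            k = link["domain"]
        if k not in seen:
            seen.add(k)
            kept.append(link)
    return kept

links = [
    {"domain": "blog.example", "ip": "203.0.113.7", "anchor": "seo tools"},
    {"domain": "blog.example", "ip": "203.0.113.7", "anchor": "home"},
    {"domain": "news.example", "ip": "203.0.113.9", "anchor": "seo tools"},
]
print(len(one_link_per_key(links)))             # 2 (one per domain)
print(len(one_link_per_key(links, "class_c")))  # 1 (same /24 subnet)
```

The blogroll example in the text is exactly what this guards against: one site linking from every page would otherwise swamp the counts.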
Step 2: Search the data in CONTEXT
You can now start to filter in context as well. Almost everyone ranks number 1 for SOMETHING on Google - but presumably you already have keyword analysis down to a tee, so you need to look at why you are (or rather are not) ranking for a given keyword or keyphrase. This is the hardest part for an agency, and understanding it is still in its infancy. Context generally means the anchor text in the inbound link, where available. But you should also consider link equity for a certain keyword, because links that "contain" a given keyword have some validity, as well as those with an "exact match" in the anchor text. Downloading all the data into CSV files MIGHT be best, because Excel is every SEO's best friend.
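Splitting anchors into "exact match" and "contains" buckets is easy to do once the data is out of the tool; here is a minimal Python sketch (the "anchor" field name is an assumption about your export format):

```python
def split_by_anchor(links, keyword):
    """Split inbound links into exact-match and contains-match
    buckets for a given keyword (case-insensitive)."""
    kw = keyword.lower()
    exact = [l for l in links if l["anchor"].lower() == kw]
    contains = [l for l in links
                if kw in l["anchor"].lower() and l["anchor"].lower() != kw]
    return exact, contains

links = [
    {"anchor": "link analysis"},
    {"anchor": "best link analysis tools"},
    {"anchor": "click here"},
]
exact, contains = split_by_anchor(links, "Link Analysis")
print(len(exact), len(contains))  # 1 1
```

The same two-bucket split works just as well as a pair of filter columns in Excel, which may be more comfortable if CSV is your working format anyway.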
Step 3: Repeat for your competitors
I really cannot tell you how important this element is. Google Webmaster Tools gives you quite a bit of data about your OWN site's links - but it's the power of your competitor's links around a given phrase in comparison to yours that will make or break your search engine ranking.
Now you should have... for any given search phrase and target landing page... the number of valid links to that page by search phrase, structured in order of importance - and mapped against a similar profile for your competitor's target page. Then, with luck, you'll know exactly what is stopping you from beating your competitor.
Regular link analysis
Q5. And finally, for regular weekly or monthly reviews for a well optimised site, how would the review process differ?
There are three distinct and obvious tasks to carry out periodically.
1. You should be tracking deleted links. (What links disappeared since last month? Why did they disappear, and can you get them back?)
2. Look at what NEW links your competitor generated, to try to emulate his or her successes.
3. Review the new links that you acquired. Can you improve the anchor text or choice of landing page from these finds? Can you do anything to promote the page that is giving you the link? (Like Twittering it?)
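All three periodic checks boil down to comparing two snapshots of your (or a competitor's) inbound links; a minimal Python sketch, assuming each monthly snapshot is a set of linking URLs:

```python
def compare_snapshots(last_month, this_month):
    """Return (deleted, new) linking URLs between two monthly
    snapshots, each a set of inbound link URLs."""
    deleted = last_month - this_month
    new = this_month - last_month
    return deleted, new

# Hypothetical monthly exports
march = {"http://a.example/post", "http://b.example/blogroll"}
april = {"http://b.example/blogroll", "http://c.example/review"}
deleted, new = compare_snapshots(march, april)
print(sorted(deleted))  # ['http://a.example/post']
print(sorted(new))      # ['http://c.example/review']
```

Run it against your own snapshots for tasks 1 and 3, and against a competitor's snapshots for task 2.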
Thanks Dixon - I love the clear description of steps - that's best practice in best practice descriptions!