Welcome to my 4th annual What’s in my SEO Toolbox post. If you’ve read this post in the past, you’ll note that there’s a lot of consistency with respect to my choices. There are two reasons for that. First, these are damn good tools and their developers keep improving them year after year. Second, as much as I love testing the newest “thing” on the market, I get attached to my tools and I’m reluctant to change.
My tools fall into several categories: website crawling, log file analysis, keyword research, analytics, competitive research, link development, website performance and usability analysis. With so many tools available in each category, it can be hard to pick just one. So most of the time, I don’t. New this year, I added a couple of items that don’t really qualify as tools but they belong in my toolbox because, without them, I couldn’t do my job correctly. Also new this year is the way this post is written. With so many of these tools being multi-functional, I decided to create a handy little chart:
So without further ado, here is my SEO toolbox for 2020.
|Screaming Frog Spider||✓||✓||✓||✓|
|Screaming Frog Log File Analyzer||✓|
|Bing Webmaster Tools||✓||✓||✓|
|Google Search Console||✓||✓||✓|
|Google Ads Editor|
Screaming Frog Web Crawler
Now at version 12.4, Screaming Frog's Web Crawler just keeps getting better and better. For a basic SEO audit, the first thing I do is input the top-level domain, connect Google Analytics, Google Search Console, and Majestic, and hit Submit. Screaming Frog will happily crawl every link it finds and report back its findings. (I could also add Moz or Ahrefs data if I wanted to.) For other projects, I can switch to list mode and paste in a list of URLs from my clipboard, or check the accuracy of an XML sitemap by pasting in the URL of the sitemap and starting a crawl. Oh, and I can also crawl SERP results to see who I'm competing against. Cool stuff.
Once I have my crawl data, I can find broken links, audit redirects, analyze page titles and metadata, discover duplicate content, review robots.txt files, create sitemaps, extract data with XPath, visualize site architecture, and create word clouds of on-page text or the anchor text in incoming links. With custom extraction, I can look for specific text strings in the visible text or in the code. All of that data can be exported into an Excel file or a CSV for additional analysis. And I can produce reports. Not just a few. A lot.
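Once the crawl data is exported, even a quick script can surface issues. Here's a minimal sketch (in Python) of the kind of duplicate-title check you might run against a CSV export; the `Address` and `Title 1` column names are assumptions based on typical Screaming Frog exports, so verify them against your own file:

```python
import csv
from collections import defaultdict

def find_duplicate_titles(path):
    """Group URLs by page title so duplicated titles stand out."""
    titles = defaultdict(list)
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            title = row.get("Title 1", "").strip()
            if title:
                titles[title].append(row.get("Address", ""))
    # Keep only titles shared by more than one URL
    return {t: urls for t, urls in titles.items() if len(urls) > 1}
```

The same pattern works for duplicate meta descriptions or H1s: just swap the column name.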
The paid version is about $195 USD per year for a single user, and it’s worth every penny. Download Screaming Frog Web Crawler here.
Screaming Frog Log File Analyzer
Log file analysis is (in my opinion) a vastly underused tool in the SEO’s toolbox. It’s hard to come by – not every hosting company will collect this information for you by default. One reason? Since the log file records every interaction, every request for a file, it’s a big file and takes up a lot of storage space. But the biggest reason is that most web developers don’t know enough to ask the hosting companies for them.
The Screaming Frog Log File Analyzer is the perfect companion tool for their web crawler. By itself, it does an amazing job of analyzing log file data in a way that even a novice SEO can understand. But when you combine it with data exported from Screaming Frog's Web Crawler, you can get a list of every URL that is – or is not – included in the log file. You can also see how often each URL is crawled, and that's a good indicator of whether or not the search engines think it's important.
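If you're curious what a log file analyzer is doing under the hood, a few lines of Python can already answer "which URLs does Googlebot request most often?". This is a simplified sketch that assumes a standard Apache/NGINX combined log format and matches the user-agent string naively (a real tool would also verify the bot by reverse DNS):

```python
import re
from collections import Counter

# Combined log format: host ident user [time] "METHOD path HTTP/x" status ...
LOG_LINE = re.compile(r'\S+ \S+ \S+ \[[^\]]+\] "(?:GET|POST|HEAD) (\S+) [^"]*" \d{3}')

def googlebot_hits(lines):
    """Count how often each URL appears in log lines that mention Googlebot."""
    hits = Counter()
    for line in lines:
        if "Googlebot" not in line:
            continue  # skip requests from browsers and other bots
        m = LOG_LINE.search(line)
        if m:
            hits[m.group(1)] += 1
    return hits
```

Sorting that counter by value gives you a crude crawl-priority list: the URLs Googlebot keeps coming back to.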
There’s a free version and a paid version, but like the web crawler, at about $125 USD this is a steal. Download Screaming Frog Log Analyzer here.
Market Samurai

Market Samurai is a hidden gem of a tool. I'm not a fan of the hard-sell copy on their home page ("Download Your FREE Copy of Market Samurai and Laser-Target High-Traffic, High-Profit, Low-Competition Markets With Devastating Accuracy"). It kind of reminds me of those high-pressure 30-minute infomercials on late-night TV. But having said that, it would be a crime if you let that turn you off from a really handy little desktop application.
Market Samurai is a great tool for doing keyword research and getting data on who I'm competing with for visibility on page one of Google. The website says it does a lot more (and it does), but I find these two features the most useful.
You can get your copy of Market Samurai for $149 and it’s worth every dollar. You just have to ignore the sales pitch.
Majestic

Majestic is to links what Google is to search. It's the biggest. Majestic's Fresh index has crawled 453,548,169,678 URLs. That's over 453 billion unique URLs. It has found a bit more than twice that many. Majestic's Historic index, which goes back to October 2013, has over eight trillion – that's 8,485,618,151,026 – links in it. So yeah, it's pretty huge.
What can you do with all that data? Just about anything you want. What I use it for is understanding a web site’s link profile (topicality) and the amount of authority it has and is able to pass. I also use it for research when I’m building out a link development campaign. With that much data, it’s really easy to research who you want to send a link request out to and who you want to avoid.
Majestic’s pricing starts at $49.99 a month (for a minimum quarterly commitment) and $79.99 a month if you want to go month-to-month. Both of those plans are for single-user access. Multi-user access plans start at $169.99 a month.
SEMRush

SEMRush has been in my toolbox for years and unless they do something stupid, it's going to stay there. For me, it's like Screaming Frog – it's one of those tools I have to have in order to be able to do my job effectively.
SEMRush is generally the first tool I run to when I'm talking to a potential client. A quick search and I can get an idea of what their traffic looks like (paid vs. organic) and who they compete against on the SERPs. From there I can do a backlink audit, analyze their content, perform a cursory site audit and see what they're doing on social media. And that's without having access to their site. Once I've signed them, the data is more accurate and more actionable.
SEMRush is another must-have tool. Start with the Pro package ($99 a month) and go from there.
SpyFu

Personally, I don't do a lot of paid search work – there are people on staff who do that – but when I dive into the pool, SpyFu is a lifesaver.
SpyFu is a competitive intelligence tool for search marketers. You can type in a competitor's domain to see all of the keywords it ranks for (including the content that ranks), the ads it buys on Google, and an estimate of how it compares to its competitors in the marketplace. You can also type in a keyword to find the domains that buy it (which is kinda cool), the domains that rank for the keyword, and how that has changed over time.
Another thing I like is that SpyFu combines an amazing amount of competitive data into an easy-to-use interface. You have access to almost everything your competitors are doing in terms of PPC, keywords and SEO. It can identify competitors in your industry as well as recommend top AdWords keywords to buy.
With deep insights into both SEO and PPC, SpyFu is a solid tool for people who are starting a campaign and those who are improving what is already in motion. Pricing is pretty reasonable at $39 a month for a Basic package.
Keyword Keg

I just stumbled on Keyword Keg about a month ago, but it's already become indispensable. Like every third-party tool, it pulls its search volume data from Google's AdWords API, so the volume numbers aren't actually based on the number of organic searches. Ah well – you can't have everything.
On the plus side, there are 5 different keyword tools to choose from. There's the Find Keywords tool, where you can put in up to 30 seed keywords and pull data from 11 different APIs; there's the Import Keywords tool, where you can drop in your Google Search Console (or Bing Webmaster Tools) keywords; and there's the Related Keywords tool, the People Also Ask For tool and the Merge Keywords tool (which is way easier than using an Excel spreadsheet with a concatenate function). The onscreen reports are brilliant. The downloadable reports lose the "prettiness" of the onscreen reports, but the data is solid. You just have to format everything yourself.
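For anyone curious what a keyword-merge tool actually automates, it's the cross-product concatenation you'd otherwise grind out in Excel. A minimal sketch in Python, using `itertools.product` (the word lists here are invented examples):

```python
from itertools import product

def merge_keywords(*word_lists):
    """Combine one word from each list into a phrase,
    e.g. modifiers x services x cities."""
    return [" ".join(words) for words in product(*word_lists)]
```

So `merge_keywords(["best", "cheap"], ["plumber"], ["boston", "denver"])` yields four long-tail phrases, and the output grows multiplicatively with each list you add.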
Still, this has become a must-have tool for me. Keyword Keg pricing starts at $40 USD a month and goes up from there.
BrightLocal

If your business serves a local market – whether you have a retail location or you're selling services in a specific geographic market – BrightLocal is worth looking at.
BrightLocal integrates a bunch of tools and functions you’ll need to be successful in a local SEO campaign into a single, reliable platform. It tracks organic traffic, your appearance in map results, rankings for specific keywords, competitors – even on-site SEO and off-site SEO. It integrates with your Google Analytics, Google My Business, Facebook, and Twitter accounts. It tracks your citations and even offers a citation builder utility for an extra cost.
There’s a 14-day free trial, and after that pricing starts at $29 a month. If you want Facebook and Twitter integration it’s $49 a month and if you’re an Agency, you really need the $79 a month package. Still, it’s a good deal.
WebPagetest
Whenever I'm doing a speed test on a website, I like to get multiple data points, since (at least in my opinion) there are too many variables to work from just one. So there are three in my toolbox.
A former client who worked at AOL turned me on to this tool. WebPagetest was originally developed by AOL for internal use and was open-sourced in 2008 under a BSD license. The platform is under active development on GitHub and is also packaged up periodically and made available for download if you would like to run your own instance. For me, the online version is fine (although sometimes the queue gets quite long).
What I like about this tool is that it's free – we LOVE free – and it's got solid data. It runs every test 3 times so that you can spot any fluctuations or variations in the data. Test output is provided in the form of highly detailed charts and downloadable files.
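The point of running every test three times is that a single run can be an outlier. A tiny sketch of the kind of summary worth looking at across repeated runs (the numbers are hypothetical load times in milliseconds):

```python
import statistics

def summarize_runs(load_times_ms):
    """Report the median and the spread (max minus min) of repeated runs.

    The median resists a single outlier run; a large spread suggests
    the test environment was noisy and the result shouldn't be trusted."""
    return {
        "median": statistics.median(load_times_ms),
        "spread": max(load_times_ms) - min(load_times_ms),
    }
```

If the spread is a large fraction of the median, re-run the test before drawing conclusions.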
And did I mention it’s free? Click here to try it yourself.
Google Analytics / Google Search Console
I put Google Analytics and Google Search Console together since you really can’t have one without the other. Google Search Console tells you what happens before someone gets to your website: how they found you, where they’re coming from (geographically), what pages they’re looking at, CTR, impressions, etc. Google Analytics provides insight into what happens after they’ve gotten to your website.
It would take a much longer article than this to get into all of the information provided and how you can use it. Suffice it to say that no SEO can do his or her job without these tools. Yes, there are other analytics platforms out there in the marketplace, but as long as you're not bumping your head on the limits Google has put in place for the free version of Analytics, there is no better solution.
Bing Webmaster Tools
Bing Webmaster Tools is the red-headed stepchild of SEO. That's a shame, because it's a really, really good tool. In some respects, it's actually better than Google Search Console.
First off, setup is really simple. If you already have access to a website via Google Search Console you can use that to verify your access for Bing Webmaster Tools. That’s a whole lot easier than the old process.
Second, when you log into Bing Webmaster Tools, there's a dashboard that shows you all of the sites you've got connected, and you can quickly see the improvements (or not) in your key metrics: the number of clicks from search, the number of appearances in search, the number of pages crawled and the number of pages indexed.
Getting additional detail just requires clicking on that particular property. The site dashboard has a chart that shows clicks, appearances in search, pages crawled and crawl errors. Below that are your top 3 pages for traffic along with some metrics; below that are the top 3 keywords for your site with their metrics; and below that are some SEO reports that show where you might want to make changes to better optimize the website. There are also some great tools for diagnostics, validating structured markup and keyword research. What's interesting about the keyword research is that it's not tied to advertising – it's actual organic search volume.
Pingdom

Pingdom is site monitoring on steroids. If you've got questions, they've got answers.
The product allows you to test website uptime, page speed, user transactions and servers, and offers real user monitoring – all of which come with alerts. The web-based application has a really clean, understandable user interface. Charts and data galore. What I use it for is page speed monitoring, because that's such a critical factor for Google. Pingdom allows me to find poorly performing code and third-party assets that can negatively impact the performance of my clients' websites.
If you want ongoing monitoring, Pingdom has a 14-day trial available, after which it costs $45 a month. If you’re happy with the occasional one-off report, you can do that for free.
GTmetrix

Much as I love Pingdom, there are times when I don't need all of their services. Or their cost. For those times, I use GTmetrix.
To be clear, GTmetrix doesn't do everything that Pingdom does. But when I'm looking for quick insights into why a website might be running slow, it's awesome. Its report page neatly summarizes a page's performance based on key indicators of page load speed: it analyzes the page against the Google PageSpeed and Yahoo! YSlow rulesets, reports the page load time, total page size and total number of requests, and shows the page's performance relative to the average of all sites analyzed on GTmetrix. Kinda cool.

All that for free. If that's not enough, there's always a GTmetrix PRO version you can get.
DeepCrawl

DeepCrawl does everything you could want from a crawler. Want to crawl just the top-level domain? Check. How about all of the subdomains? Yup. Compare your crawl data to your sitemaps, your Google Analytics data, and your Google Search Console data? Check, check and check. What about backlinks? Absolutely. You can import backlink data from Google Search Console or Majestic or just about any tool that exports into a .csv format. Do you have 10 sites you need to crawl on the 1st of every month? No problem. Just set up the scheduling and push the Save button. The magic happens without you having to watch anything, and you'll get an email and a browser alert when it's done. DeepCrawl even handles server log files.
So why is this in my toolbox as well as Screaming Frog? Screaming Frog is a Java-based desktop application, and it doesn't handle really large websites – like a million pages – really well. Not unless you have lots and lots of memory. That's where DeepCrawl shines – running out of memory is someone else's problem.