3) Google: This is pretty straightforward, but it’s the main reason I like it. I search for my main seed keyword in Google and use the keywords that Google itself highlights in bold on the search results, plus the “Searches related to” section at the bottom, to get keyword variations or LSI. That’s basically Google telling you what the topic is about. No need for a thousand other tools. I use these to optimize the on-page SEO of my target pages as well.
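If you want to pull keyword variations programmatically instead of reading them off the results page, here is a rough Python sketch using Google’s autocomplete endpoint. Note this is an unofficial, unsupported endpoint (the URL and function names here are my own), so it may change or stop working at any time:

```python
import json
from urllib.parse import urlencode
from urllib.request import urlopen

# Unofficial autocomplete endpoint -- widely used, but NOT a supported API.
SUGGEST_ENDPOINT = "https://suggestqueries.google.com/complete/search"

def suggest_url(seed: str) -> str:
    """Build the autocomplete request URL for a seed keyword."""
    return SUGGEST_ENDPOINT + "?" + urlencode({"client": "firefox", "q": seed})

def parse_suggestions(raw: str) -> list[str]:
    """The endpoint returns JSON shaped like [query, [suggestion, ...]]."""
    payload = json.loads(raw)
    return payload[1]

def fetch_suggestions(seed: str) -> list[str]:
    """Fetch live suggestions for a seed keyword (requires network access)."""
    with urlopen(suggest_url(seed)) as resp:
        return parse_suggestions(resp.read().decode("utf-8"))
```

Feeding each suggestion back in as a new seed (the classic “alphabet soup” trick) quickly builds out a variation list.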
I recently decided to go with Ahrefs after using SpyFu for a couple of years and trialing SECockpit. I was a Moz client for a while too, about a year ago. I found SpyFu data to be sketchy (or just plain wrong) fairly often, and Moz, I don’t know, just didn’t seem like they were really into supporting what I wanted to know. SECockpit was achingly slow for a trickle of data. Ahrefs isn’t nearly so graph-y as SpyFu, but they are so blazing fast and the data is so deep. I enjoy it a great deal, even if it is spendy.
When you purchase something from this website, I may receive an affiliate commission. The articles and pages on this site are my opinions and are not representative of the companies that create these products. My reviews are based on my own experience and research. I never recommend poor-quality products or create false reviews to make sales. It is my intention to explain products or services so you can make an informed decision about which ones suit your needs best.
You can improve your site speed in a number of ways, but the overall goal should be to test your site from different geo-locations using a tool like Pingdom and then attend to the issues it surfaces. You could go with a CDN provider like Cloudflare, or install caching plugins that speed up your site by reducing database queries and therefore the server load. Choosing the right hosting company is a critical decision based on many factors, including your CMS, expected site traffic, and your goals for the site, among others.
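Before (or alongside) an external speed test, you can sanity-check a page’s response headers for the most common caching and compression gaps. A minimal Python sketch, assuming you have already fetched the headers as a dict (the checks and messages here are illustrative, not an exhaustive audit):

```python
def caching_issues(headers: dict[str, str]) -> list[str]:
    """Flag common speed-related gaps in a page's HTTP response headers."""
    issues = []
    h = {k.lower(): v for k, v in headers.items()}  # header names are case-insensitive
    if "cache-control" not in h:
        issues.append("no Cache-Control header: browsers re-fetch on every visit")
    if h.get("content-encoding") not in ("gzip", "br"):
        issues.append("response not compressed (no gzip/brotli)")
    if "etag" not in h and "last-modified" not in h:
        issues.append("no ETag/Last-Modified: conditional requests are impossible")
    return issues
```

Run it against the headers from `urllib.request.urlopen(url).headers` and fix whatever it flags before re-testing in Pingdom.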
In Chapter 2, we learned about SERP features. That background is going to help us understand how searchers want to consume information for a particular keyword. The format in which Google chooses to display search results depends on intent, and every query has a unique one. Google describes these intents in their Quality Rater Guidelines as either “know” (find information), “do” (accomplish a goal), “website” (find a specific website), or “visit-in-person” (visit a local business).
There are myriad search algorithm updates, erratic market trends, and increases in competition, among other things, which is all the more reason for you to always be on the move. With the help of the different tools that are just a Google search away, all of this can be done in a snap. If you commit to these practices, SEO ranking will feel like a light feather on your workload.
You can also block certain files or folders from the public with passwords, or from certain bots. For example, if you are still setting up a site and don't want it accessed, you can block it. This is very useful when building your Private Blog Network, because you can block tools like Ahrefs and Majestic from crawling your PBN site and thereby hide any backlinks to your main money site from being discovered by your competitors (and therefore hide your PBN entirely). You can read up on Private Blog Networks and how to build them in my PBN guide.
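The bot-blocking part is typically done in robots.txt. A minimal example using the documented user-agent names for the Ahrefs and Majestic crawlers; note this only works because these particular bots choose to honor robots.txt:

```
# robots.txt -- ask the major backlink crawlers to skip this site entirely.
User-agent: AhrefsBot
Disallow: /

User-agent: MJ12bot
Disallow: /
```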
XML sitemaps are especially useful because they list your site’s most important pages, allowing the search engine to crawl them all and improve its understanding of your website’s structure. Webmasters use the XML sitemap to highlight the pages on their sites that are available for crawling. This XML file lists URLs together with additional metadata about each of these links.
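A minimal sitemap in the standard sitemaps.org format looks like this (the URLs and dates are placeholders; only `<loc>` is required, the other tags are optional hints):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2024-01-15</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>https://example.com/about/</loc>
    <lastmod>2023-11-02</lastmod>
  </url>
</urlset>
```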
As for duplicate content, Google gets confused when you create and publish articles with similar content, and this eventually leads to indexation issues. Keyword cannibalization happens when a site owner tries to rank for a particular keyword with several different pages. When this happens, Google won’t rank multiple pages; it will focus on the best one, leaving the others effectively invisible in the search results.
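Spotting cannibalization is mostly a grouping exercise: list each page with the keyword it targets, then look for keywords claimed by more than one page. A quick Python sketch of that check (the data shape is my own assumption, not any particular tool’s export format):

```python
from collections import defaultdict

def find_cannibalized_keywords(page_keywords: dict[str, str]) -> dict[str, list[str]]:
    """Map each target keyword to the pages targeting it; return only the
    keywords targeted by more than one page (potential cannibalization)."""
    by_keyword: defaultdict[str, list[str]] = defaultdict(list)
    for url, keyword in page_keywords.items():
        by_keyword[keyword.strip().lower()].append(url)  # normalize casing
    return {kw: urls for kw, urls in by_keyword.items() if len(urls) > 1}
```

Any keyword that comes back with two or more URLs is a candidate for consolidating into a single, stronger page.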
It’s important that you set up your social channels, interlink them, engage with your users on social with the right content, and drive traffic to your site through these channels. Racking up fake signals and fake followers who do not engage or visit your site through those channels is easily detected by Google as false, and it does not help your rankings.
After diagnosing your site through the different facets of the search engine (Google), it’s time for you to check your website as an entity. The tool we’ve always used to check our sites’ onsite status is Screaming Frog, and it has held up as the websites we handle have grown larger over the months. You set the parameters, and it’s even capable of crawling/compiling outbound links to let you know if you have broken links. Here’s what the overview looks like:
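The broken-link part of such an audit boils down to two steps: extract the links from each page, then flag the ones that return error status codes. A rough, self-contained Python sketch of that idea (this is my own illustration, not how Screaming Frog is actually implemented):

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect href values from <a> tags on a page."""
    def __init__(self):
        super().__init__()
        self.links: list[str] = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def extract_links(html: str) -> list[str]:
    """Pull every anchor href out of an HTML document."""
    parser = LinkExtractor()
    parser.feed(html)
    return parser.links

def broken_links(links: list[str], status_by_url: dict[str, int]) -> list[str]:
    """Given the HTTP status you fetched for each link, keep the 4xx/5xx ones."""
    return [url for url in links if status_by_url.get(url, 0) >= 400]
```

In practice you would fetch each extracted URL (e.g. with `urllib.request`) to fill in `status_by_url` before running the second function.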
If the pages you’ve created don’t rank for the keywords you’ve selected, you should re-evaluate your content strategy and adjust. If your page isn’t generating organic traffic, focus on less competitive keywords. Unfortunately, this is pretty common in reality. The good thing is, at this stage you’ve collected a lot of actual keyword data. Adjust your keyword strategy and use this data to your advantage.
Jaaxy is an online keyword finder owned by Kyle Loudoun and Carson Lim that promises to help you find low-competition keywords that will help you improve your rank in the search engines. Other Jaaxy features include alphabet soup, which allows you to brainstorm for keywords; saved list, which allows you to save your list of keywords so that you can view them later; and search analysis, which lets you search what is already on search engines such as Yahoo, Google, and Bing. Jaaxy offers a free trial as you get started, and you can also choose between the pro version and the enterprise version if you like how it works.
The number and quality of all the backlinks pointing to your website determine the overall authority of your domain. The external links pointing to a specific page help that page rank in the search engine results pages (SERPs). The relevance and quality of an external link are very important factors when you want to measure the impact/value of a link. To find out more about quality links, have a look at this article on the official Google Webmaster Central Blog: https://webmasters.googleblog.com/2010/06/quality-links-to-your-site.html
Your local listing in Google is your “Google My Business” page. To improve the ranking of this listing in the local search results, make sure that your business information is up to date in Google My Business, within all of the Internet Yellow Pages websites (IYPs) and anywhere else where your business name, address and phone number (NAP) appears. An instance of your NAP is called a “citation.”
It depends on what you need to do… If you just need to look up search volumes, then KWFinder is a winner and cheap. Check out my other blog post on SEMrush that describes the 6 different ways I use it: https://flizo.com/semrush-review/ If you don’t need any of those features, then I would go with KWFinder. If you need some of those features, then I would go with SEMrush, as you can look up search volume in both.
Thanks so much for offering this helpful tool. It is very useful. In case you want feedback, I think it would be great if you could please also consider including another column to display the linked page (i.e., the actual page that the backlink goes to on the domain). When selecting “All pages on this domain” it is difficult to know which page each backlink is going to on the domain. Thanks for your consideration.
I have used Jaaxy for my website and I think it is a fantastic program, highly recommended. I used it as a keyword research tool for my website, but I was not aware of the other areas it could be used for that you have listed in this article. With the latest updated features and the fact that you can try it for free, what do you have to lose? Thanks for the great article.
You can also indicate which pages don't need to be crawled or are not important. You can call on Googlebot to crawl and index your site from inside Google Search Console. However, do note that although Google "looks" at your sitemap, Google is more interested in doing a raw crawl of your site, jumping from one link to another to spider all the pages into its database. By doing that, it also forms a link map of your site in its own index, which tells it which pages on your site are the most important (they are the ones with the most, and the most prominent, links).
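The idea that pages with more (and more prominent) inbound links count as more important is the intuition behind PageRank. A tiny power-iteration sketch in Python, purely to illustrate the principle (Google's real ranking is vastly more complex):

```python
def pagerank(links: dict[str, list[str]],
             damping: float = 0.85, iters: int = 50) -> dict[str, float]:
    """Toy PageRank over a link map: {page: [pages it links to]}.
    Pages that accumulate more inbound link weight end up with higher scores."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}  # start with equal scores
    for _ in range(iters):
        new = {p: (1 - damping) / len(pages) for p in pages}
        for page, outgoing in links.items():
            if not outgoing:  # dangling page: spread its rank evenly
                for p in pages:
                    new[p] += damping * rank[page] / len(pages)
            else:  # each outbound link passes an equal share of this page's rank
                for target in outgoing:
                    new[target] += damping * rank[page] / len(outgoing)
        rank = new
    return rank
```

On a site where every inner page links back to the homepage, the homepage comes out with the highest score, which is exactly the "most prominent links" effect described above.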
What this does is give you an idea of how realistic it is for you to target keywords with high commercial value. You want to go after keywords with some volume, because they’ll have a better return in terms of traffic. But you don’t necessarily want to go after the most competitive keywords, because you’re less likely to be able to rank for them. You’re looking for a sweet spot.
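That sweet-spot filter is easy to express in code once you have keyword data exported from any research tool. A Python sketch, where the volume and difficulty thresholds are illustrative choices of mine, not industry standards:

```python
def keyword_sweet_spot(keywords: list[dict],
                       min_volume: int = 500,
                       max_difficulty: int = 40) -> list[dict]:
    """Keep keywords with enough monthly search volume to be worth the traffic,
    but low enough competition that ranking is realistic."""
    hits = [kw for kw in keywords
            if kw["volume"] >= min_volume and kw["difficulty"] <= max_difficulty]
    # Highest-volume, lowest-difficulty opportunities first.
    return sorted(hits, key=lambda kw: (-kw["volume"], kw["difficulty"]))
```

Head terms fail the difficulty cap, near-zero-volume terms fail the volume floor, and what remains is your target list.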