Everyone is upset about the Google Keyword Planner Tool - It's gonna be ok
At first I was very upset about this change Google made. You can't really blame them, but it still hurts those of us who can't afford the AdWords minimum spend required to see the full search data. There is a positive side to this, though. Or rather, there is still value to be taken from the tool.
After a few weeks of using this new version of the keyword tool, I started noticing patterns and similarities. To start with, the keyword tool was always just an approximation of how and what people were searching for. It was never 100% dead-on about how many searches a term received; the system rounded up or down, and we'll never know by how much. It is no different now, just less specific. That is ALL it is. All that has changed is how specific the info is. A LOT of information can still be gathered from the keyword tool.
You're going to need to create yourself a system. The KEY to my system is that I always discount the reported value by a specific percentage and assume the low end.
Example: Where to buy snakes in florida
Search Volume = 10 - 100 or 100 - 1k (100 - 1,000)
I simply round down by 75%. If a search term, like the one above, has a max of 100 searches, I assume it has only 25 searches. My model involves much more, but no matter how your model works, this is the key principle. I have started to see success by sticking to this: the same success, or close to it, that I saw prior to the change. When you see 10 - 100, do not assume 100. Assume it's 25, or even 15.
There are also ways to reverse engineer the SERPs, but I simply wanted to share a fundamental principle in my model of keyword research: round down by 75% or more.
TL;DR - If you assume the search volume for each keyword is the max number (e.g. given 100 - 1,000, you bet on it being 1,000), most of the time you will fail. You leave yourself no room for loss or failure. If you assume a lower percentage of the search volume from the start and stick with that assumption as a rule of thumb, you can only get higher-than-expected results. Basically, if you treat each keyword's search volume as only a quarter (or less) of its max, you will do better than if you always assume the max. Assuming the lowest value could also be a great rule of thumb and a key principle for your model. The lower the number you assume and prepare for, the better off you will be. And don't rock back and forth; don't think you have the Force and can guess that this one is the max. Just create a rule and stick to it.
This is my take, and so far it has produced consistent results within a predicted range of failures.
Regards,
DeveloperDan