Ranking with Usage Metrics - 2020 findings

May 13th, 2020
It's been a long time since I posted something, but I found something very interesting and I'd like to share it with you.
Keep in mind though that what I'm about to share worked in the adult niche - for other niches I suggest testing and, if you're kind enough, sharing your results with the community here.

Let's dig in.
For those that are new, let's first explain usage metrics. Usage metrics include:
- Time spent on site/page
- Number of pages browsed
- Bounce rate
- Click-throughs from SERPs
- Diversity of branded search
- Direct traffic
Those are the most important usage metrics.
Google finds them extremely important, and if Google likes, Google gets =))
Note: In this post I'll mostly focus on PandaBot (https://www.pandabot.net/), which also has a free basic plan. So it won't cost you anything - which is good for low budgets. I'll share tips on how to make the most of it later in the article. I've known about PandaBot for about 6 years and I used it on and off with questionable results, but this year I think I learned how to use it "right". Please note that in those 6 years I got zero penalties for using this (neither algorithmic nor manual), and I used it on about a dozen sites in various niches. But as I said, the results were questionable, as I didn't know then what I know now.
This all started after reading an update on the article "How To Rank Your Website With Fake Usage Metrics Using PandaBot" found on @MatthewWoodward's site. Link here: https://www.matthewwoodward.co.uk/tutorials/rank-website-fake-usage-metrics-no-backlinks-required/ I really enjoy Matthew's articles as he provides a shit-ton of information and he doesn't present himself as a "whitehat"-only dude. Actually, I think it's safe to say that I wouldn't be the SEO I am today without his series on how to use GSA. BTW Matthew, thanks for all the good stuff you shared over the years! If I ever find myself in your adoptive country, I hope we'll meet so I can buy you a couple of beers!
Note: if you want to find out how to use PandaBot, read the article above, as I will not explain everything in detail; I'll just go into the most important stuff and the results.
So, I set up a couple of PandaBot campaigns to rank two short tails, one with ~14k searches per month, the second with ~45k searches. I was ranking in the top 20 for both of them.
To make things right, I divided the monthly searches to find the daily amount of searches, THEN I estimated the CTR for my positions (so position #1 gets about 30% CTR while position #10 gets about 1% CTR).
Note #1: This is important to do, because if you send too much traffic it will be a signal to Google that you're trying to manipulate SERPs, and we all know what that means...
Note #2: Keep in mind that PandaBot's minimum number of visits per day is 5, so for low-search-volume longtails PandaBot won't work. I'll tell you what to do about those later in the post.
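The calculation above is simple enough to sketch. The snippet below is a hypothetical helper, not part of PandaBot: the CTR-by-position table is an assumption based on the rough figures the post mentions (30% at #1, 1% at #10), and the numbers in between are illustrative.

```python
# Hypothetical helper: monthly searches -> daily searches -> expected clicks
# at a given SERP position. The CTR table is an assumption, not PandaBot data.
CTR_BY_POSITION = {
    1: 0.30, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05,
    6: 0.04, 7: 0.03, 8: 0.025, 9: 0.02, 10: 0.01,
}

def daily_visit_target(monthly_searches: int, position: int) -> int:
    """Estimate how many daily clicks a given position would get naturally."""
    daily_searches = monthly_searches / 30
    ctr = CTR_BY_POSITION.get(position, 0.005)  # below top 10: assume ~0.5%
    return round(daily_searches * ctr)

# e.g. the 45k-searches keyword at position 10:
# 45000/30 * 0.01 = 15 visits/day
print(daily_visit_target(45000, 10))
```

That per-day number is what you'd feed into the campaign as the visit target, so the bot traffic stays in line with what Google would expect for that position.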
Initially, I created a campaign for each keyword and waited for the results. The campaigns were set with the "random number of pages" setting. I saw positive SERP movement within 2-3 days. But then I dropped back to the original SERP position. So I did some A/B testing after re-reading the article, and paying close attention I found JUST ONE SENTENCE THAT CHANGED THE OUTCOME OF THE CAMPAIGN. Something along the lines of "and I added a lot of longtail variations".
Since I wanted to rank more on the KW with 45k searches, what I did is the following:
- gathered all longtail variations of the 45k shorttail and created "random number of pages" campaigns (making sure I followed the rule in note #1);
- the shorttail with 14k searches I left as is.
One week later I was at #9 and then #8 for the shorttail with 45k searches. And what about the 14k-searches shorttail? No significant change.
Conclusion: It works!
BUT... here's what happened next.

When I hit the #1 page, I didn't take into account these facts:
- a different position means a different CTR;
- campaigns for the longtails and shorttails leave a footprint of usage metrics.
So, one week later I was back at the initial position :(
Hence, I went back to the "drawing board" and re-adjusted each campaign's settings for both the short tail and the long tails. Here's how:
First off, given that I had fallen back to page #2, I saw a big reason to change the number of daily users for the 45k-searches KW. Since I had calculated a CTR for page #2, I knew that dialing down would be a signal to Google that something was unnatural, so I actually dialed it up a notch, BUT I also changed the priority of the campaign (priority determines which campaign runs first and more on PandaBot, so it's a good way to ensure that even if you set 5 visits per day you'll only get 2-3). If you see the broken logic in that, here's my "Vulcan" way of thinking: with Panda you get increments of 10 in the number of daily visits up to 100 visits, after which the increments are 25. Since I wanted to keep things random, by increasing the number of visits but decreasing campaign priority I would achieve just that.
I also changed the following on all campaigns:
- max time usage/day (randomly either added or subtracted time)
- max visitors per day (I upped the number for longtails ranking on page #1 and also upped campaign priority; for those ranking on page #2 or #3 I upped the number but lowered campaign priority, e.g. if it was priority 2 I changed it to 3 or 4, as those PandaBot hours are hard to get)
- max concurrent visitors (randomly, on each campaign, for both longtails and shorttails)
- max time to wait between tasks
- duration deviation
- average visit duration
- minimum number of pages to be visited
- maximum number of pages to be visited
All of the above was changed randomly, with the exception of the short tail, where:
- I upped the average visit duration
- I set the minimum number of pages to 2 and the maximum to 10 (previously I had 3 min - 6 max), then made daily changes to them
- campaign priority was changed back to 1 (highest priority) after positive SERP movement was seen
RESULTS? Ranking on page #1 again for the 45k-searches KW, and positive improvements on the longtails too.
BUT (yes, I know, a lot of BUTs in this article) I decided to CHANGE ON A DAILY BASIS the following campaign settings for all keywords:
- max concurrent visitors (either up or down)
- max time to wait between tasks
- duration deviation
- average visit duration
- minimum number of pages
- maximum number of pages

Looks like insanity, but trust me, it's worth the time. It takes about 20 mins per day (as I have only about 20 keywords that I'm working on). Why do I do this? To truly keep things looking natural to Google.
From my test I realized the following:
- even usage metrics can leave a footprint, and Google is really good at detecting footprints :(
- it is extremely important to check the analytics data, decide on the minimum metrics that you already have on the page you want to rank for the keyword, and randomize stuff - sometimes go to -10% of the usage metrics, sometimes go to +10%, +20% or even more.
And it makes sense: normal traffic patterns for each of your keywords have daily variations. Not every day will each new visitor stay 1-2 mins on your page; not every day will he/she look at a minimum of 3-4 pages, etc. These daily variations need to be taken into account, as analytics data only provides averages - plus, you don't really know the keyword they landed from, so... keeping it random seems to work.
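The daily "keep it random" routine described above boils down to jittering each setting by a small percentage. Here's a minimal sketch: the setting names mirror the PandaBot options listed in the post, but the dict and the jitter helper are hypothetical illustrations, not a PandaBot API (PandaBot is a GUI tool; you'd apply these numbers by hand).

```python
import random

def jitter(value: float, max_pct: float = 0.20) -> float:
    """Nudge a setting up or down by up to max_pct (e.g. -10%, +20%)."""
    factor = 1 + random.uniform(-max_pct, max_pct)
    return round(value * factor, 1)

def randomize_campaign(settings: dict) -> dict:
    """Return a freshly jittered copy of the campaign settings."""
    return {name: jitter(value) for name, value in settings.items()}

# Example values are illustrative, not recommendations.
campaign = {
    "max_concurrent_visitors": 3,
    "max_wait_between_tasks_s": 120,
    "duration_deviation_s": 30,
    "avg_visit_duration_s": 90,
    "min_pages": 2,
    "max_pages": 10,
}
new_settings = randomize_campaign(campaign)  # run once per day per campaign
```

The point of the bounded range is exactly what the post says: averages with daily variation, never the same numbers two days in a row.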

Additional tips
How to maximize the time you get on PandaBot?
What I did is the following:
- run PandaBot 24/7 on a VPS that I normally use for scraping
- run PandaBot 24/7 on my travel laptop using a proxy (5 dedicated private proxies cost me $10 per month from buyproxies.org, which I recommend, as I've been using them for the last 8 years and had no problems)
- run PandaBot on my main computer (w/o proxy)
- purchase a premium subscription ($19.99 per month if I'm not mistaken), as I get 120% on all traffic on my primary computer and 50% (I think) on the secondary ones
- additionally, as a premium member you can buy 1000 hours for $299 (if you have the budget; I don't)
How to get visits for the low-search-volume KWs (10-30 searches per month) safely?
For this I use BearSiteVisitorPro (costs $99 USD, but it's worth it as you can run multiple instances on a single computer). Keep in mind that you need proxies for that one - I use Scrapebox to find them.
BearSiteVisitorPro can also be used for referral and direct traffic. This is a good way to save some of the PandaBot hours.
Why not only use BearSiteVisitorPro?
PandaBot emulates scrolling, while Bear does not. Bear only emulates typing for organic traffic, with no scrolling. Plus, it takes a toll on the processor and RAM. I prefer to stick to what I know works best.
So, if Bear doesn't emulate scrolling, why do I use it? Well, I test my proxies on a domain that has had an "under construction" page for about 1 year now. Basically, when you test proxies with SB using a custom string test, it's just like giving a site a ton of visits with a 100% bounce rate. To my surprise, I saw the site ranking for a shorttail (a variation of the domain name, an EMD) that it didn't rank for before. Let me know if you guys think a tutorial on how to use Bear would be interesting (unfortunately, the documentation is lacking and support isn't present).
Hence the conclusion: visits matter, even if they are direct and 100% bounce.
Another interesting finding: on the "under construction" site I use to test proxies (I started using it about 1 month ago for testing proxies) I have no analytics, but Google still gets the usage metrics data. This further confirms to me that this article is right: https://www.matthewwoodward.co.uk/tutorials/new-google-ranking-factors-2020/ (and BTW, it's still a source of information to apply in 2020).

TL;DR
OK, if you made it this far, please keep in mind that I found this working in the adult niche. The article on Matthew's site shows that it was last updated in 2020, so if you try it in other niches, let me know how that works.
Also, remember that you still need link building. Although, the 45k-searches-per-month keyword that I ranked on page #1 doesn't have any OBL links to it. None whatsoever. The only thing it does have is good internal linking. BTW, did you know that Google said that they don't care about internal linking and you can abuse it as much as you want? If you find that to be true, let me know.

Hope you guys find this a good source of information. If you do, please give it a "thumbs up" so I know I didn't spend 1.5h writing this without helping anybody :)
Note: having trouble editing the post. I noticed a confusing point in the changes I made to the campaigns.
So, the campaign for the 45k-searches KW was initially dialed down to a lower priority (when I dropped from page 1). But then I said the campaign remained at priority 1. Here's what I did: when I hit page #1 again, the campaign priority was set to #1 and it hasn't changed since. For a couple of days after changing to priority 1 it bounced back and forth between positions 9-10-11-12, but now it has "settled" into positions 8-10 (still some daily fluctuations, but that's normal).
Absolutely brilliant... are there any other variations of PandaBot that have this random chaos theory built in?
Indeed, it is quite a bit of work. Then again, it's not Panda that's to blame but Google becoming smarter and smarter. As I said, prior to this I didn't do such "intensive" manual labor, but the results weren't consistent. Some sites ranked and stayed on page #1 with only 1-2 campaigns with a random number of pages; some didn't. But, personally, I spend more time doing log analysis on a daily basis (say 3h) than on Panda. For me, it's become a "morning routine" while I drink my coffee. 20 mins and I'm done.
Of course, this might be only in this niche. Previously I had used Panda in other niches and didn't have to go through all this trouble. Might also be the competition on the KW... This is why I asked others that manipulate usage metrics to share their findings too. Nevertheless, I find this strategy working; ranking #1 on that KW makes me money, so... I'm happy :)
The TL;DR doesn't explain what the post was about :D
Thanks for the great post!
None whatsoever. The only thing it does have is good internal linking. BTW, did you know that Google said that they don't care about internal linking and you can abuse it as much as you want? If you found that to be true, let me know.
Thanks a lot for the share. I didn't realize that internal linking wasn't important; may I know when Google said that? These guys made it seem like it was the bomb: https://www.authorityhacker.com/site-architecture/
@affbull I said that internal linking IS important. And that Google said that you can pretty much abuse it as much as you want. I don't remember the post, but a quick Google search should help you.
Thanks for the great post!
I got tired =))
Cheers mate, hope it helps everyone rank better - and why not, bank too. As I said, I'm really curious to see if other members have used similar tactics in other niches and how that worked for them.
Nice long post - bookmarked!
Damn, OP, great share.
Good to see quality content showing up after a long time.
If the PandaBot traffic just visits and scrolls, but doesn't convert to the thank-you landing page, won't G see that in analytics and decide your site isn't providing the type of value that a top-ranking site should - leaving a footprint, in essence?
Whoa, I had no idea that anything like "How To Rank Your Website With Fake Usage Metrics Using PandaBot" could exist!
Bookmarked this page for a later read. Seems very interesting!
@The Curator It depends on how many visits lead to a conversion and how many visits you send per day with PandaBot.
Oh, and another thing: don't double your traffic with PandaBot or any other bot overnight - this applies to everybody. Not only will it seem fishy to the big G, but you also might not be able to keep the numbers running for long enough. Use small increments of 10%-20% at first, then you can increase and tweak along the way. Keep in mind that reaching page #1 will also mean you'll get more visitors, so you will also need to increase the number of PandaBot visits accordingly until you settle into a position (settling means being in the same position +/- 1-2 places for at least 2 weeks).
So, to answer your question:
a) Don't go all in with Panda; start with a 10%-20% increase for at least 1-2 weeks before increasing again. I had a client who also knew a little SEO; I told him about Panda, he bought 10k hours (paid I think about 1,000 USD or something similar), set up a shit-ton of campaigns, site visits exploded overnight, SERPs exploded a couple of days later, and within a week he didn't have any more hours. Panda stopped. Everything dropped in the SERPs lower than it was when it began.
b) Carefully calculate the number of visits (uniques) it takes for a conversion. Keep in mind that Panda sometimes sends traffic from the same IP more than once.
c) Apply a strategy where your visitor numbers increase by 20% max in the first week, monitor things on your site and in the SERPs, and change campaigns accordingly.
d) To be extra sure (or if the site has daily conversions), create "fixed number of pages" campaigns as described below.

Let's take an example. Let's say you have 100 visitors per day to your site. The extra visitors coming from the bot would sum up to 20 per day - so about 20% extra visits. There is no big problem if you get 20% extra visits and those visits don't convert, especially if you are not on page #1. Being on page #1 almost always means there will be some conversions naturally, IF the keyword in itself has buying intent or is one that converts naturally. Should the keyword be an informational one, G won't expect a great deal of conversions from it.
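The ramp-up rule from points (a) and (c) above can be sketched as a schedule. This is a hypothetical helper, not anything PandaBot provides: it uses the post's own example (100 organic visitors/day, bot traffic starting at 20% of that) and caps each step's growth at the same percentage.

```python
# Rough sketch of the ramp-up rule: never grow bot traffic by more than
# ~20% per step. The function and its defaults are illustrative assumptions.
def ramp_schedule(baseline: int, weekly_growth: float = 0.20, weeks: int = 4) -> list:
    """Daily bot-visit targets per week, growing by weekly_growth per step."""
    targets = []
    current = baseline * weekly_growth  # start at e.g. 20% of organic traffic
    for _ in range(weeks):
        targets.append(round(current))
        current *= 1 + weekly_growth
    return targets

# 100 organic visits/day -> bot visits of 20, 24, 29, 35 over four weeks
print(ramp_schedule(100))
```

The idea is simply that each week's increase stays small relative to the traffic the site already has, which is what the client in point (a) failed to do.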
If you want to keep it extra safe, OR if you are already on page #1 and trying to reach the #1 position - which means showing G that your site converts better than the competitors' - then you can always set up a couple of "fixed number of pages" PandaBot campaigns where you set up your conversion funnel, aka send them to the thank-you page or the order-confirmed page.
Of course, you need to make sure that they don't get the highest priority, and I would change some of the funnel pages on a daily basis, just to be sure - this in case your site has daily conversions and an increase in daily conversions is expected if you rank #1 or on page #1 for your chosen keywords.
There is the other case where your site doesn't normally get daily conversions. In this case, simply turn the "fixed number of pages" campaign ON for a day, make sure you get your visit to the thank-you page, then turn it off. Wait a couple of days and turn it on again - having already made subtle changes to the sequence of pages to be visited, plus metrics like time on page, interval between visits, etc.
Basically, you can do a lot with Panda - and you could do the exact same thing with BearPro (but as I said, because it doesn't have scrolling, I don't use it that much for organic traffic). The idea is to know your user behavior exactly, and to tweak the bot to do what you want it to do. Does it take time? Yes, of course. Is it a tedious process? You bet! Does it work? For me it works great. Looking forward to other people sharing their results in other niches as well :)
@affbull I said that internal linking IS important. And that Google said that you can pretty much abuse it as much as you want. I don't remember the post, but a quick Google search should help you.
Misread your post. Thanks a lot!
That is such a great post!
I have started implementing it - just have one question:
Do you create new campaigns every day, or just edit the campaign on a daily basis once it's run?