
(Mostly) Automated Broken Backlink Prospecting

What you’ll need: a broken URL that has many backlinks pointing to it. The more the better, though technically this will work at any scale.

  1. Take your link, plug it into Ahrefs Site Explorer. Set it to “URL Only.” Navigate to Backlinks. 
  2. Filter by Recent, Live, English (I assume), and DoFollow
  3. Export to Excel
  4. Open up the CSV
  5. Filter by DR >= 20 (or other value, up to you)
  6. Copy and paste those results into a Google Sheet
  7. Delete columns so you just have Referring Page URL, Referring Page Title, and Link URL (the broken link URL)
  8. Add columns for Our Link, First Name, Last Name, Root Domain
  9. In Root Domain, paste this formula and extend it to all rows (if you’d rather script steps 4-9, see the sketch after this list):
    =REGEXEXTRACT(A2,"^(?:https?:\/\/)?(?:www\.)?([^\/]+)")
  10. Now download the sheet as a CSV, delete everything but the Referring Page URL, and create an MTurk task to find the First Name and Last Name for each article. I set the task at $0.10 and got 400 done in about 30 minutes. You can also use URL Profiler’s Content Analysis > Readability function, which can extract author names, before sending the remaining URLs to MTurk.
  11. Once you have the First Name and Last Name data, paste it into your Google sheet. Now you should have URL, Title, Broken Link, First Name, Last Name, and Root Domain. Add another column called Company.
  12. Add the Hunter.io Google Sheets add-on, then use it to populate all the emails.
  13. Ta-da! You should have a ton of email addresses you can reach out to directly to get the broken link swapped for yours.
  14. Want more? Take all the URLs that MTurk didn’t get First/Last names for, plug them into Hunter, and look for editorial or info email addresses. Never hurts to try.
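
If you’d rather script steps 4-9 than do them by hand, here’s a minimal sketch in Python with pandas. The column names ("Domain rating", "Referring page URL", "Referring page title", "Link URL") are assumptions based on a typical Ahrefs export, so check them against your actual file.

import re
import pandas as pd

# Step 4: open the Ahrefs export.
df = pd.read_csv("ahrefs_backlinks_export.csv")

# Step 5: keep rows with DR >= 20 (tune the threshold to taste).
df = df[df["Domain rating"] >= 20]

# Step 7: keep only the columns we care about.
df = df[["Referring page URL", "Referring page title", "Link URL"]]

# Step 8: empty columns to fill in later.
for col in ["Our Link", "First Name", "Last Name"]:
    df[col] = ""

# Step 9: the same regex as the Sheets formula, applied per URL.
def root_domain(url):
    m = re.match(r"^(?:https?://)?(?:www\.)?([^/]+)", str(url))
    return m.group(1) if m else ""

df["Root Domain"] = df["Referring page URL"].map(root_domain)

# Import prospects.csv into your Google Sheet (step 6) and carry on
# from step 10.
df.to_csv("prospects.csv", index=False)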

Trying to figure out if PBNs are bad or not so bad?

PBNs work. But the best way is to build your own. As noted, the economics of it are increasingly difficult, but they do help with money keywords. I’m not doing any right now, but I have within the past 1.5 years.
Consider this: $10 for an expired domain, $12/yr. for a cheap web host, plus you need to put up unique content or do a website rebuild (cost $12).
$10 + $12 + $12 = $34 per PBN site, if you’re doing it super-duper cheap. Better domains cost more, better hosting costs more. And if you use a VA or a service to build it for you, that costs more too.
Seems like most guys doing links at scale would rather pay $100+ for a guest post on a site with high DR and real organic search traffic. You’re one and done, and the link is permanent. Better than building out thousands of PBNs, which require ongoing upkeep.
The only people that seem to still be relying on PBNs are greyhat SEOs in the local SEO space, because local SEO doesn’t really warrant a huge content marketing campaign. Plus with many clients you only need 10-30 links to rank them #1 for their keywords. PBNs will make that easy.
But even then there are disadvantages to PBNs.
Personally I’ve never done PBNs at scale, but using them here and there they have helped quite a bit.
The key to not getting penalized is to not leave a “footprint”. This means creating websites that look and feel different (different content, WP themes, CMSes, website layouts, etc.). The hardest thing is scaling web hosting.
You have to get hosting for your sites on different servers. So keeping track of a bunch of sites on different hosts is a pain.
If all of your websites look and feel the same and are all on the same web host, that is how you get penalized.
Or if they’re all on the same Google Search Console or Google Analytics account. Stuff like that makes a pattern easy for Google to find.

PBNs these days resemble real sites, to the point that some (not all) actually are. One PBN provider I know decided to start a niche-relevant PBN network. By the time he had 10 websites, all of them were ranking for search terms and had actually useful content.
That is not to say that 1-hour PBN builds don’t pass juice anymore, but the guy has been building PBNs for years. If he doesn’t do quick builds anymore, I’m not going to bother either.
After you build a PBN you also have to toxicity-test it. This takes about a month. Basically, you create a post and link to a site ranking on page two for a low-competition keyword, then see what happens to both sites.
I’m not saying “don’t bother”, but you definitely need to do a lot of research. The number of footprints alone is staggering: around 60-70 you need to be aware of.
For myself, I’ve decided to only ever use PBNs if I’m in a niche where they are ubiquitous. Otherwise, it’s just not worth it for me.


How To Get Keyword Difficulty For A Bunch Of Keywords?

Each tool uses its own formula so you’re always going to see some differences.

Some put weight (seemingly too much) on the individual pages ranking, and will call a term easy even if it’s a bunch of pages on really high-authority sites.

(So a ‘weak’ page on NerdWallet, for example – still hard to beat given how massive they are in the finance space.)

I’d recommend checking them out, and sticking with the one that works for you.

I’d define “works for you” as some decent level of consistency: “With equal page optimization, I rank highest for stuff it grades as easy, lowest for hard, and in the mid group for medium.”

If you find one that aligns with a bunch of your intentional work like that, it’s a good sign that you can use its “easy” grades to go scoop up more easy wins.
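
One rough way to check that alignment: export your terms with the tool’s difficulty grade and your actual position, and see whether your average rank gets worse as the grade climbs. A minimal sketch in Python, assuming a hypothetical rankings.csv with keyword, difficulty (easy/medium/hard per the tool), and my_rank columns:

import pandas as pd

df = pd.read_csv("rankings.csv")

# If the grades mean anything for you, average rank should climb
# with difficulty: easy < medium < hard.
summary = (
    df.groupby("difficulty")["my_rank"]
      .agg(["mean", "median", "count"])
      .reindex(["easy", "medium", "hard"])
)
print(summary)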

What I’d likely do here is grab a bunch of terms, toss ’em into the keyword tool at Ahrefs, and use the “traffic share by domain” report.

If you take a look at the top sites and feel like you can compete with them in terms of the amount of content and links, go for it.


What is the best rank tracking software on the market?

SERPWoo – tracking overall SERPs/markets. I don’t try to use this for tracking a big list of specific keywords for my site, though you could do that.
Accuranker – my specific list of terms I’m interested in – piped into GDS for quick views of head terms/overall trends.
Ahrefs – more of a backup. The rank tracking comes with the plan, so… why not.
Serprobot.com is surprisingly good even if it may be fugly; it works, goes beyond the top 100, and it’s super cheap at $5/300 keywords.

It isn’t pretty to look at, but if you’re testing out a new batch or want the lay of the land, it is surprisingly good for the price.

There is also nothing fancy like SERP features and all that mumbo jumbo.

But you can also point it at a competitor and see how they’re doing quite quickly. It may not work for day-to-day, long-term professional checks, but for a shoestring budget and quick checks it’s unbeatable.


Site Depth & Inner Link Onsite SEO 101

One of the things that I see come up quite a lot within the SEO industry is questions around URL structure and site architecture. In particular, people often ask how they should structure their URLs for the best possible results (from a ranking perspective).
There’s a common misconception that simply having shorter URLs will help you rank better. This isn’t the case. You could have an extremely long URL and it still has a strong potential to rank, assuming everything else is equal.

Where this misconception stems from is the idea that having a page URL live as close to the domain root as possible (i.e. domain.com/page-url vs domain.com/folder/page-url) will pass more authority to the URL. This is not the case. I hear the words PageRank, URL structure and architecture often all in the same sentence as if they’re one thing but, while they’re related, they’re not the same.

Click depth is an important concept to understand and it relates to the number of clicks it takes to reach a page from the homepage (taking the shortest path, that is). So, if a page has a link from the homepage, it’s got the shortest possible click depth, which is good if you want to make it as easy as possible to crawl, and (assuming it’s also linked to from other pages) push as much authority to it as possible.

If you have a page that sits 10 layers deep in your site architecture, that means it takes 10 clicks to finally reach a page on your site when starting at the homepage (again, assuming you reach it in the shortest number of clicks possible). Now, you could reduce that click depth to be just one layer deep by adding a homepage link to it, or you could put it 2 layers deep by adding a link from a page that is directly linked to by the homepage (see how this works?). Each of these reductions in click depth will help with that page’s ability to be crawled and likely ranked (as it is getting more internal links from pages near the top of your site’s architecture, which are usually the most authoritative).

Here’s where the mix-up happens in relation to URL structure… if you have the following URL: domain.com/folder/folder/folder/page-url and you were worried that it was sitting quite deep in your architecture, so you decide that the way to solve this is by changing the URL to domain.com/page-url and then 301 redirect the old page to the new, well, this is wrong. Also, if you assumed that the original page sits deep in the architecture purely because it lives off of a few subdirectories, well, this is also wrong.

I could have the following URL, domain.com/folder/folder/folder/folder/folder/folder/page-url and if it has a single internal link from the homepage, it sits at the top of the site architecture. It’s nothing to do with the URL.
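
To make the click depth idea concrete, here’s a minimal sketch in Python: a breadth-first search from the homepage over the internal link graph, where the graph is a hypothetical dict of page -> pages it links to.

from collections import deque

def click_depths(links, home="/"):
    depths = {home: 0}
    queue = deque([home])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depths:  # first visit = shortest path
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

# The "deep" URL sits at click depth 1 because the homepage links to
# it, regardless of how many folders appear in its path.
site = {
    "/": ["/folder/folder/folder/page-url", "/about"],
    "/about": ["/contact"],
}
print(click_depths(site))
# {'/': 0, '/folder/folder/folder/page-url': 1, '/about': 1, '/contact': 2}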
So, why do we care about URL length and structure? There are lots of reasons, but if I had to distill them down into a few buckets, it would be as follows:
1. Content Grouping for Analysis: having a load of related content live within one subdirectory can be really useful if you want to do a bulk analysis of all the content together within your analytics platform. For example, if all my product pages live in /products/ then I can easily do a regex query to grab analytics data on all URLs with this subdirectory in the URL (see the sketch after this list).
2. Content Grouping for Crawling: this is more for larger sites, but it can also be useful to group large amounts of related content together in one subdirectory (or subdomain) so that you are indicating to Google et al. that all of this content is related and could be crawled in a similar way. Google often uses site sections to help shape its crawl behavior, so this can be useful. It’s also just much cleaner this way, and you could build out site-section sitemaps etc. much more easily.
3. Shorter URLs are more Readable: having a URL that you need to scroll to the right in your browser for half an hour just to read is not ideal. If you want someone to remember a URL, it’s much easier for it to be shorter. Shorter URLs are typically (not always) shared more – this is often why vanity URLs are used.
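
As a tiny illustration of the content-grouping point, here’s a sketch in Python that buckets a hypothetical list of URLs by top-level subdirectory with a regex, the same idea as a /products/ filter in your analytics platform:

import re
from collections import Counter

urls = [
    "https://domain.com/products/blue-widget",
    "https://domain.com/products/red-widget",
    "https://domain.com/blog/widget-buying-guide",
    "https://domain.com/about",
]

def section(url):
    # Capture the first path segment, if the URL has one.
    m = re.search(r"https?://[^/]+/([^/]+)/", url)
    return m.group(1) if m else "(root)"

print(Counter(section(u) for u in urls))
# Counter({'products': 2, 'blog': 1, '(root)': 1})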
Anyway, I hope this helps. I wanted to write up something more comprehensive to dig into this a little so it can be a reference point for any future questions. Also, if you have Qs, drop them in the thread and I’ll do my best to answer them.


Growing Your Affiliate Site Into A Brand

My personal definition of turning an affiliate site into a brand is taking the affiliate income and:

Investing in diversifying the traffic sources (email, YouTube, social, paid, etc.) so you’re less reliant on search engine traffic

Diversifying the monetisation sources so affiliate commissions are not your only source of income, e.g. investing in launching your own store with physical or digital products.

The space is definitely starting to mature and it’s going to get harder and harder to flip sites. Eventually the returns won’t be there.

You’re much better off building a brand, diversifying revenue streams, diversifying traffic sources, and creating a community/email list if possible – aka building a real, defensible business, not just an affiliate scheme riding the current wave. I’m not familiar with a specific community around turning affiliate sites into something more, but the site Indie Hackers has a lot of cool stories about digital founders that may spark some ideas.


7 “leaked” Google UX Playbooks

Here are the 7 “leaked” Google UX Playbooks I have found so far that may be of interest to the group (fairly basic but a reminder that good UX is part of SEO):

https://drive.google.com/open?id=1I97OCHKFr2WHJ7jfbpvrUkPVgE45_Wpm


Auditing Sites With Archive.org

Hey all, I wanted to share a quick tip for anyone auditing sites or working through site migrations, etc.

I’ve found that in some cases historic URLs can be lost, and with them opportunities (external sites linking to 404s, for example). So what I like to do is use this tool to get a broad view of all URLs that have at some point been crawled by archive.org:

https://web.archive.org/cdx/search/cdx?url=dom&matchType=domain&fl=original&collapse=urlkey (replace dom with the website domain)

This then gives a huge list of URLs. There may be some crap in there, but for the most part it’s super useful. I then take these URLs and whack them into Screaming Frog’s list mode to check their status. You’d be surprised at how often there are some quick wins in here.

You can also de-dupe against any other URL lists you might have, which again is really useful.
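
A minimal sketch of the whole tip in Python, using the CDX endpoint above with the same parameters (swap example.com for the domain you’re auditing):

import requests

resp = requests.get(
    "https://web.archive.org/cdx/search/cdx",
    params={
        "url": "example.com",
        "matchType": "domain",
        "fl": "original",
        "collapse": "urlkey",
    },
    timeout=60,
)
resp.raise_for_status()

# One URL per line; de-dupe and sort before handing off.
urls = sorted(set(resp.text.splitlines()))
print(len(urls), "historic URLs")

# Save for Screaming Frog's list mode, or to de-dupe against any
# other URL lists you have.
with open("archive_urls.txt", "w") as f:
    f.write("\n".join(urls))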


Personal Review On WordPress SEO Plugins

So I’ve been testing out a variety of different WordPress SEO plugins. If you’re new to WP SEO, you won’t go wrong with choosing Yoast SEO. But if you’re fed up with their ongoing journey to completely take over your WordPress dashboard, or still pissed about their attachments, here are the alternatives I’ve been looking at:
1. All in One SEO (free/paid)
Offers a lot of options for a free plugin – it’s lighter than Yoast and doesn’t want to take over my admin screen. Does what you’d expect: robots.txt, canonicals, sitemaps. Bonus for free WooCommerce integration. Sadly you need to upgrade to Pro if you want access to SEO options for taxonomies.
2. The SEO Framework (free)
Bloat-free: one big list of options, no messing around. Free support for WooCommerce and bbPress. Really impressed so far – https://wordpress.org/plugins/autodescription/
3. Rank Math (paid)
Really good options for migrating from Yoast. The main issue so far is the handling of taxonomies: you can only choose to index/noindex Categories and Tags as a whole, not individual categories or tags. If this were fixed, it would be a bigger step towards me recommending this to more advanced SEOs.


[GET] Search Console Explorer Sheet

Tool Features

1. Get inspiration for writing Title Tags. Simply enter a primary focus keyword and get a SERP preview of the top-ranking and paid listings, along with their title tags graded.
2. Enter a current or proposed title tag and get an instant CTR grade.
3. Get custom recommendations on what to improve based on previous studies to increase CTR.
4. It’s all free.

https://docs.google.com/spreadsheets/u/1/d/1MrpGs824Ub6xgkmj5PegfXw7aJdqcUNbwWmSc1e2Kgg/copy