Advanced Tiered Link Building Tutorial Part 5 – Indexation & Organization

Episode Guide

This is the fifth video to be released in my Ultimate Guide To Tiered Link Building video tutorial series. If you missed the previous parts then check out the episode guide on the right.

This is the longest video in the series so far weighing in at nearly 28 minutes! Thanks to all of those that have watched the video, left feedback and been in touch to say thanks so far.

Important Update: Please read this.

What You Will Learn

  1. How to export & organize all of your links
  2. How to get every link indexed including the character spun text
  3. How to send search spiders through all of your tiers


Video Transcript

Hi guys, welcome to part 5 of my ultimate guide to tiered link building. In the last video we set up tiers 2 and 3 to post forever; in this video I'm going to show you how to export and organise all of your links.

You'll also learn how to get every single link you have created indexed on autopilot and, most importantly, how to send the search spiders ripping through the millions of unique paths that lead to your site.

Now, if you remember from the last video, we created this link tracker spreadsheet and I added some additional tabs here. What we need to do is populate these tabs with the tier 2 and 3 links we have created, plus the tier 4 links in the case of Link Wheel Bandit. Once we've got that list together we can move on to getting them all indexed.

So first of all let's jump over into Link Wheel Bandit. Exporting the report from Link Wheel Bandit is really easy: just select Tools, Generate Report, and that gives us this window here where we can select the link wheel we want to export the report from. You can select date ranges; I always go from the beginning of time to make sure we get all of the links. In Options we can select which tiers we are going to export. Remember, this is actually tier 2, this is tier 3 and this is tier 4, because we built the link wheels out to our tier 1 links.

So you will need to do these in order: tier 1 posted article URLs first, saved as Tier 1. It's going to say there's no data for my options because I didn't actually build this link wheel out, but yours will export a list of URLs. Then go through and generate a report for each of the remaining tiers in the same way: tier 2, generate report, then tier 3, generate report. Again, yours will actually export some data here.

So if you come to the Link Wheel Bandit reports folder on your C drive you'll find the text files of the reports; you should have three here: tier 1, tier 2 and tier 3. What we need to do is open each one up, select it all and come to our Excel spreadsheet. Remember, tier 1 in Link Wheel Bandit is actually tier 2, so paste those in there, then do the same for the tier 2 text file, which goes on the tier 3 tab, and the tier 3 text file, which goes on the tier 4 tab.

So I'm just going to paste in some dummy content here to show you how it all pieces together. Just before I show you how to export the URLs from GSA, I wanted to show you the value of Captcha Sniper since my last tutorial. Last time the total solved was around 100,000 and since then it's done another 86,000 captcha solves for me. That's an immense number; solving those with a paid captcha service would have cost about the same as Captcha Sniper itself, and that's in just a week.

So again, I highly recommend Captcha Sniper. It is essential for posting en masse with GSA, so make sure you pick it up; it will save you a lot of money in the long run. Let's just minimise that.

Exporting our tier 2 links first: here is the project we created last time. I didn't let it run for very long, just enough to build a few links so I can export them and show you how this works.

So if we just right click on the project, Show URLs, Verified, then in this window click and press Ctrl+A to select all, right click, Copy URL, come to our spreadsheet and paste them into the tier 2 tab here.

Then we just need to come back into GSA and find our tier 3 project and repeat the same process: right click, Show URLs, Verified, select all, right click, Copy URLs, then go to the tier 3 tab of our spreadsheet and drop them in there. At this point you want to save a copy of all of your exported URLs.

Now we just need to export all of these URLs into their own individual text files. To do that, select all of the tier 1 links, come into your text editor, paste them and save as tier1.txt. Then go to the tier 2 tab, select them all and save as tier2.txt, then the tier 3 links as tier3.txt, and finally the tier 4 links. Again, these are just samples; I've tweaked all the URLs so none of these are actually live, and your list will be an awful lot longer. So copy these across and save as tier4.txt.
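If you'd rather script this export step than copy and paste by hand, here's a rough Python sketch that writes each tier's URL list to its own text file. The tier names and URLs below are made up for illustration:

```python
import tempfile
from pathlib import Path

def export_tiers(tiers, out_dir="."):
    """Write each tier's URL list to its own text file, one URL per line.

    `tiers` maps a file name stem (e.g. "tier2") to a list of URLs;
    returns the paths of the files written.
    """
    paths = []
    for name, urls in tiers.items():
        path = Path(out_dir) / f"{name}.txt"
        path.write_text("\n".join(urls) + "\n")
        paths.append(path)
    return paths

# Dummy data, mirroring the spreadsheet tabs:
with tempfile.TemporaryDirectory() as out:
    files = export_tiers({
        "tier2": ["http://example-web20.com/post-1", "http://example-web20.com/post-2"],
        "tier3": ["http://example-wiki.com/page-1"],
    }, out_dir=out)
    print([f.name for f in files])  # ['tier2.txt', 'tier3.txt']
```

One file per tier keeps the later Scrapebox cross-referencing step simple, since each check takes exactly two lists.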

Now we can cross-reference all of these text files against each other in Scrapebox to make sure all of the links are still alive and none of them have been deleted, before we move on to the next step of getting them all indexed and boosted even further.

So in Scrapebox the first thing we are going to do is make sure all of our tier 2 links are still live and linking to our tier 1 links. To do that we are going to use the Check Links function: in the websites field here we load our list of tier 1 URLs, in the blog lists field we load our list of tier 2 URLs, and then click on Check Links.

This is a list of our tier 2 links here, and it will check each of them to see if they link to any of our tier 1 links. Now, this tier 2 list is only 62 URLs, but your tier 2 list is going to be much bigger. Once you've got that loaded in, just click Start. I'm going to pause the video and resume it when it's finished.
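Under the hood, a link check like this boils down to fetching each lower-tier page and looking for an href pointing at any upper-tier URL. Here's a minimal Python sketch of just the matching step; the fetching is omitted, and every URL below is invented:

```python
import re

def pages_linking_to(pages, targets):
    """Return the page URLs (keys of `pages`) whose HTML contains a link
    to at least one of the `targets` URLs.

    `pages` maps page URL -> already-fetched HTML. This is only the
    matching half of what a check-links run does.
    """
    wanted = {t.rstrip("/") for t in targets}
    found = []
    for page_url, html in pages.items():
        hrefs = re.findall(r'href=["\']([^"\']+)["\']', html, flags=re.I)
        if any(h.rstrip("/") in wanted for h in hrefs):
            found.append(page_url)
    return found

# Dummy tier 2 pages checked against a dummy tier 1 URL:
tier1 = ["http://tier1-example.com/article"]
tier2_pages = {
    "http://t2-a.example.com": '<a href="http://tier1-example.com/article">anchor</a>',
    "http://t2-b.example.com": '<a href="http://unrelated.example.com">anchor</a>',
}
print(pages_linking_to(tier2_pages, tier1))  # only the first page survives
```

Pages that come back empty or without the expected href are exactly the dead links the export-links-found step filters out.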

Ok, that's finished, so we just need to go to Export Links, Export Links Found, call this one tier 2 found and press Save. Then we can close this. Next, in this field here we open our tier 2 links, and this time down here we open tier 3. Click Check Links again; this is a list of our tier 3 links, and it's going to check each of them to see if they link to any of our 62 tier 2 links down here. So again, click Start; I'll be back when it's done.

Ok, that's done, so again Export Links, Export Links Found, this one's tier 3 found, click Save and close. We just need to do it once more: this time a list of our tier 3 links in here and a list of our tier 4 links in here, click Check Links, click Start, and I'll resume the video when that's done.

Ok, that's done. You can see it only found 7 entries here; this is because I didn't build out the full campaign with all of the tiers in Link Wheel Bandit and GSA, I just let it build a couple of URLs before I stopped it, purely for demonstration purposes. That's why you're not seeing all the URLs; you'll have a much bigger list when you're doing this final check. Anyway, Export Links, Export Links Found, and that is tier 4 found.

If we come back over to our link tracker spreadsheet we can go through and delete everything out of tiers 2, 3 and 4. Then we open the tier 2 found, tier 3 found and tier 4 found files we have just created and copy those into the relevant tabs.

Once you've done that, what you will have is a nice organised spreadsheet of all your live links, split into tiers. This will be really handy moving forward, so make sure you get everything organised like this. Now we can start indexing everything and sending the search spiders through all of these links to eventually find our site.

Just before we move on to the next step I wanted to show you a piece of software called Inspyder Backlink Monitor. I've only just started playing with it properly and I don't use it as part of my main process yet, but it's a really nice way of managing a tiered link building campaign. The process I've just shown you in Excel works, but it is a bit of a ball ache and takes time to put together. With Inspyder Backlink Monitor you can just set up a project and paste all of your backlinks into this one box; that's all of your tier 1, 2, 3 and 4 links, with no need to separate them out in any way, just one big long list of all the links you've built in your campaign.

Once you've done that you can just hit Go here and it goes out and checks various things for each page: if the link is still alive, how many outbound links there are, if the link is nofollow, the domain PageRank, the page PageRank, whether or not it's indexed in Google, the anchor text and the IP address. When the run is complete, if you come over to this link hierarchy tab you can see it all laid out. Earlier on, I think in video 3, I created bookmark links to Google.co.uk and the rest of the links to Yourdomain.com as examples. You can see here, if we double click, this is our money site URL, these are all tier 1 links, and this is a tier 2 link pointing to a tier 1 link. There's another example there, and if we open this one up, again we've got a tier 2 link pointing to our tier 1 link. And here we go: here we've got a tier 2 link, a tier 3 link and all the additional tiers under that.

If I'd built this campaign out fully there would be a lot more here for you to see, but this is a really nice way of organising your tiered link building campaign. It automatically puts the link hierarchy and structure together so you can easily see exactly what's going on: the PR, the links, whether things are indexed or not. Then, as you build new links for your campaign, all you need to do is paste them into this box, click Go again, and it will update the data for your backlinks and recreate the link hierarchy.

There's also the selection tool here so you can select things based on criteria. You can say you just want tier 1 links that are not indexed, or just tier 3 links that are not indexed, and you can see this supports up to 10 tiers of links, which is far beyond anything I'm ever going to build. I just wanted to show you this tool; it's not something I'm using heavily right now, as I've only started testing it properly this last week or two, but it might be a better solution for some people out there.

So let's take a look at how we are going to index everything. It is really important that we get all of the tiered links we have created indexed in Google. We also need to make sure we send the search spiders on a path of discovery through the tiers; as the spiders move through the tiers, they are going to discover millions of unique and relevant paths to find your money site.

The way we are going to do that is by building more backlinks, obviously; this is tiered link building, and the best way to index anything is to build more links to it. But there's a clever way to go about it: we are only going to build links on pages that are already indexed in Google and already get regular spider visits. This means we don't end up in an endless loop of building links to index our links, and then more links to index those. If we only build links on pages that are already indexed by Google and get regular spider visits, then everything underneath them will be indexed automatically, and we don't need to worry about indexing any more.

So there are two ways we can do this: one is auto approve blog comments, which we can achieve quite easily with software like Scrapebox, and the second is guestbook submissions. Personally I use Xrumer for these, just because it's much faster, but I appreciate that not everyone is going to be able to afford the cost of Xrumer, nor the server you'll need to run it on. So I'm going to show you how to do it with the Scrapebox learning mode poster and also with GSA Search Engine Ranker, so you've got two ways to go about the guestbook submissions.

Just before we jump into Scrapebox we need to make a master list of our tier 2, 3 and 4 links. To do that, select all of your tier 2 links and bring them to a new tab here, then all of your tier 3 links, and then the tier 4 links (which is actually the tier 3 report from Link Wheel Bandit, if you remember) and drag those here. Rename that tab Master List and give it a save. Next we need to go and get a big long list of related keywords, so fire up Scrapebox.
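Building the master list is just a merge-and-deduplicate over the tier files. If you'd rather script it than drag tabs around in Excel, a Python sketch might look like this (the URLs are dummies):

```python
def master_list(*tier_lists):
    """Combine the tier 2, 3 and 4 URL lists into one deduplicated
    master list, preserving the order of first appearance."""
    seen = set()
    merged = []
    for urls in tier_lists:
        for url in urls:
            if url not in seen:
                seen.add(url)
                merged.append(url)
    return merged

tier2 = ["http://t2.example.com/a"]
tier3 = ["http://t3.example.com/b", "http://t2.example.com/a"]  # contains a duplicate
print(master_list(tier2, tier3))  # the duplicate appears only once
```

Deduplicating here matters because the same URL can turn up in more than one export, and you don't want to waste posting runs hitting it twice.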

Once Scrapebox is open, go to Scrape, Keyword Scraper, enter a few of your root keywords into here and hit Scrape; I'm just going to pause the video while that completes. Ok, that's completed, so hit Ok and Remove Duplicates. We actually need a really, really long list of keywords, so take those scraped keywords, add them to the search box here and hit Scrape again, and that will find even more keywords for us. I'm just going to pause the video.

Ok, that's done. Remove duplicate keywords, Transfer Results To Main Keyword List, close, and then save that as keywords. One thing I should have done before with the master list from Excel: take a copy of the master list we created and save that as a text file as well, called websites.

Then we need to generate a list of names and e-mail addresses, and we can do that up here. I'm just going to generate a couple of thousand names. We are not actually going to use the names for posting, though; we are going to use the keywords we just saved, so we can skip the names file and just come across here, generate some e-mails and save those.
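If you ever need a batch of throwaway e-mail addresses outside of Scrapebox's generator, a quick Python sketch does the same job. The domains here are placeholders; swap in whatever you use:

```python
import random
import string

def generate_emails(n, domains=("example.com", "example.org"), rng=random):
    """Generate n random-looking e-mail addresses for form filling.

    Each address is 8 random lowercase letters plus a two-digit number
    at one of the given (placeholder) domains.
    """
    emails = []
    for _ in range(n):
        user = "".join(rng.choice(string.ascii_lowercase) for _ in range(8))
        emails.append(f"{user}{rng.randint(10, 99)}@{rng.choice(domains)}")
    return emails

for email in generate_emails(3):
    print(email)
```

Random addresses are enough here because nothing ever needs to be confirmed at these inboxes; they only have to look plausible to the comment form.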

So close that, and then we just need to set up the actual comment posting project. Make sure you're using proxies; this is important. In this example I'm not going to bother, but you should be using proxies here. Setting the project up: for names we are actually going to select our keywords file; for e-mail, our e-mails file; for websites, our websites file, which is the master list of our tier 2, 3 and 4 links. For your comments file, I have a pre-prepared spun comments file; I'll show you what that looks like quickly. This is something you should prepare in advance, but it's just a list of generic comments that are suitable for any site. Finally, your blog list, the target list of sites we are going to post to, and this is a huge auto approve list that I've built over time. If you need to create your own auto approve list then have a look around some forums; you can usually find some lists going about. Spend some time collecting lots of different ones and merge them into one list.
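A quick aside on that spun comments file: it's written in spintax, where {a|b|c} means "pick one of a, b or c", so every posted comment comes out slightly different. Just to show the mechanics, a minimal resolver might look like this:

```python
import random
import re

def spin(text, rng=random):
    """Resolve spintax like "{Great|Nice} {post|article}!" by repeatedly
    replacing the innermost {a|b|c} group with one random option until
    no groups remain. Innermost-first handles nested spintax too.
    """
    pattern = re.compile(r"\{([^{}]*)\}")
    while True:
        m = pattern.search(text)
        if m is None:
            return text
        text = text[:m.start()] + rng.choice(m.group(1).split("|")) + text[m.end():]

print(spin("{Great|Nice|Awesome} {post|article}, thanks for sharing!"))
```

Tools like Scrapebox resolve this format for you at posting time; the sketch is only here so you can see why one spun template yields many distinct comments.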

As you can see, mine's over 180MB in size; I think there are over 3 million entries in it at the moment. Just go around the web, collect your own list and make sure you remove any targets that constantly fail. So, Open. That might take a while to load because of the size of my list, so I'll pause while it's having a think. Once that's loaded, all you need to do is press Start Posting and it will go out and start submitting a load of blog comments to your target URLs. If I just press Ok there, we can see it's working through the list already. We can leave this running in the background now, building lots of auto approve blog comment links to our tier 2, 3 and 4 properties, and next we can move on to guestbook submissions.

With guestbook submissions I normally use Xrumer, but it's pretty expensive and you really need a dedicated server, or at least a VPS, to run it, so it's probably going to be out of most people's price range. So we are going to look at two different ways to post to guestbooks without a huge cost. The first is the Scrapebox learning mode; you can install it from the Show Available Add-ons toolbox, and once you've got it installed just open it up and come across to the learning tab here.

Now, you can teach Scrapebox pretty much any platform you want with the learning mode, and it's really easy to do. I'm going to teach it the Lazarus guestbook platform here, and I've already got a list of Lazarus guestbooks as you can see. So go to Load URLs From File, select the list of Lazarus URLs and open it up. Here it says detected forms, name = book; that looks like a guestbook form to us, so click Select. Then it's just a case of clicking in these fields, and it will come up with a list of variables to assign: in the name field we want username, Ok. For the e-mail field, useremail, Ok. Location isn't a required field, so we can skip that. Homepage, userurl, Ok. Your message, comment, Ok.

For captchas, this is a text based question captcha, so we can just click here. It says select the part before the question that will identify the captcha, which it has already done for us there, then select the part after the question, which it has also done for us. So just click Ok, and in here we have got a text captcha result; press Ok. Once you're happy with those, we can move on to the next URL up here: it has detected the form, so click Select, and as you can see it has already filled out these fields for us based on what we filled out for the last one.

So let's take a look at one more and check they are all the same. Oh, that one's got an image based captcha, so we'll skip it; you can't mix image and text based captchas in the learning mode, you have to separate them out. Let's try this one instead. That's an image based one too. Here we go, another text based captcha, so select username, useremail, userurl and comment. It's already looking pretty well trained to me; let's double check all of this is filled out, and that's it. We've taught Scrapebox the Lazarus platform; you can go through and teach it lots of other platforms, but once you've done that you just need to save the learned form data as the Lazarus posting config, hit Save, and then let's quickly look at how we post out to all of these guestbooks.

Submitting to the guestbooks is really easy. Come across to the poster here; the definition file is the posting config we have just saved, so open that up and then fill out the rest of these quickly: name, e-mail, comments, and Load URL List, which is our list of guestbooks we want to target. That's pretty much it. We taught it the Lazarus guestbook platform, exported that config, loaded it all into the poster here, and then you've just got to click on Start Posting and it will go out and submit your link to all of the guestbooks.

This is a slow and sluggish way to achieve guestbook posting, though, as you'll have to teach Scrapebox each guestbook platform one by one, which is going to take a while, and it's a bit of a ball ache managing different lists for different platforms. There's a much easier way to do it with GSA Search Engine Ranker if you've got it, but if not, you can do it with Scrapebox as above. So let me show you how to do it with GSA, because it's much easier.

Open GSA and click New, then untick everything here except Guestbook; we want that ticked, and you can see how many guestbook platforms are supported there, quite a few. Let's fill out this form quickly: our target URLs are our tier 2, 3 and 4 links, and we are going to import our master text list of those URLs, which is here.

Our keywords is a comma separated list of keywords, and anchor text is a spun list of our anchor text. Tick these 3 boxes here. It comes with some default text inserted, but I like to use my own custom written and hand spun comments, so grab those and chuck them in there, and also replace the German one. The guestbook titles you can leave at default.

Come across to Options. The only thing you want to untick is this, so we only post guestbook links; otherwise we end up getting some of the other types coming through. Right click, Check All With English Language, and click Ok. It will ask for a name, so I'm going to call it "video tutorial guestbook spam" and hit Ok.

As you can see, our project has been created. Now, what GSA will actually do is go out and scrape its own target list of guestbooks before it submits to them. What we can do instead is use Scrapebox to scrape a huge list of all the different types of guestbooks and then import that URL list into GSA, so this project only has to focus on submitting our URLs; it's just a really nice way to speed things up. Underneath this video on my blog you will find a list of guestbooks that I have previously posted to; you can download that for free and import it as a target list here. But what you should really do is use Scrapebox to scrape your own targets. I've also included a list of footprints under this video that you can use for scraping, so let's head over to Scrapebox and I'll show you how to use those.

In Scrapebox, to complete the guestbook scrape you need two files: one is the list of footprints for all of the guestbooks supported in GSA, which you can get underneath this video on my blog, and the other is the merge list of common words, which I made available in video 2, I believe. With those two files in place, Import From File loads our footprints list, and then if we click the Merge button we can merge that with our merge list, which creates the list of queries to go out and scrape for. You will need to use proxies here to complete the scraping process; I'm not going to for this example, I'm just doing a very brief harvest to see what we can get, so I'll pause the video while that completes. We've got a couple of thousand results in now, so I'm going to stop harvesting, remove duplicates and export that URL list as GSA Guestbook Targets.
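The merge step is simple to reason about: every footprint is crossed with every keyword to form the harvest queries, which is why the query list grows so fast. Here's a simplified Python sketch of that expansion (Scrapebox's merge can also substitute a %KW% token inside each footprint; this version just appends, and the footprints and keywords are examples):

```python
def merge_footprints(footprints, keywords):
    """Cross every platform footprint with every keyword to produce
    the search queries a harvester would run (footprint + space + keyword)."""
    return [f"{fp} {kw}" for fp in footprints for kw in keywords]

footprints = ['"powered by lazarus guestbook"', "inurl:guestbook.php"]
keywords = ["dog training", "puppy care"]
queries = merge_footprints(footprints, keywords)
print(len(queries))  # 2 footprints x 2 keywords = 4 queries
```

With the footprint file and the nearly 10,000 keywords mentioned below, this product is why the full scrape takes so long and returns such a huge list.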

Obviously that will take a lot longer to run for real, as there are nearly 10,000 keywords here and you're going to end up with a huge list, but once you've got a list of guestbooks exported you can head back over to GSA.

Once you're in GSA, to import that list of guestbooks just right click on the project, Import Target URLs From File, and select the target list. GSA will automatically parse the list and submit to them all as soon as you press Start.

So let’s quickly sum up what we have done in this video.

First of all, we learnt how to extract and organise all of our tiered backlinks with Scrapebox and a spreadsheet. Don't forget to take a look at Inspyder Backlink Monitor as well.

Then I showed you how to set up an auto approve comments blast with Scrapebox.

Next we looked at two different ways to post to guestbooks: one with the Scrapebox learning mode and the other with GSA Search Engine Ranker.

The end result is that you now know how to index all of your tiered links and, most importantly, get the attention of the search spiders.

All of the comment and guestbook links have been placed on pages already indexed in Google, so now all we have to do is wait for Google to find our links and send its spiders through all of our tiers.

In the next video I’m going to reveal my personal advanced tier 1 tactics to help throw you up the rankings in no time at all.

You're also going to learn how to automate natural social signals in just a few clicks.

And I’m going to show you how to buy and use new or aged domains in your campaign to create the perfect tier 1 link.

Well that wraps up part 5 of the ultimate guide to tiered link building tutorial.

If you have enjoyed it and want to see more like this then please subscribe to my blog at www.matthewwoodward.co.uk

See ya!

Resources In The Video

GSA Search Engine Ranker FREE TRIAL – An absolutely essential piece of software for any link builder.

Link Wheel Bandit FREE TRIAL – Setup thousands of web 2.0 sites in a few clicks.

Captcha Sniper – Solves captchas automatically, saves huge amounts of money on Captcha fees.

Inspyder Backlink Monitor FREE TRIAL & Exclusive Discount – The only piece of software capable of tracking a multi tiered link campaign.

Update: The Inspyder team got in touch to offer a $10 discount for my readers, this gets added on automatically at checkout.

Xrumer FREE TRIAL – My personal choice for posting to Guestbooks (I show you how to do it with Scrapebox/GSA in the tutorial).

88 Responses

  1. Preston King


    gonna peep the vid but had to be the first to post…everybody else asleep or are you all building links?

  2. Preston King

    video ain't showing dog, only a black box.

    • Matthew Woodward
      August 28th, 2012 at 4:22 pm

      Really? Its working here in IE, firefox & chrome can you drop me a screenshot or something and some more details?

    • Michael Cox
      August 29th, 2012 at 4:22 am

      Working fine for me, another great video Matt, will be buying some more tools soon through your links i have been following all the way and nearly got my tier 1 first batch complete.

    • Matthew Woodward
      August 29th, 2012 at 10:17 pm

      Thanks Michael let me know how progress goes or if you need any pointers you've got my email address

  3. John D'oh

    Will this be your last vid? :(

    • Matthew Woodward
      August 29th, 2012 at 10:18 pm

      No there will be one more in this series and then I'm going to cover some other stuff, like how I semi automate guestposting and so other tidbits. Got a surprise or two lined up over the next month as well ;)

  4. Rex Stan

    you are the man matt..im always checking everyday for new videos.. please can you do a video on using xrumer as you mentioned in the video for guestposting to index backlinks..i have xrumer and you will be surprised how many people have it but can't use it..please do a video and you will be surprised how far the video will go cuz there's not many detailed video on how to use xrumer..


    • Matthew Woodward
      August 29th, 2012 at 10:16 pm

      If you just subscribe to the e-mail list you will get notified of them before anywhere else :)

      There are a few decent xrumer tutorials out there if you look around, loads on youtube mate

    • Michael Cox
      August 30th, 2012 at 6:17 pm

      I have xrumer but haven't kept up my subscription used it back in the day for direct linking my money site with forum profiles, there are some good xrumer tutorials on BHW – http://www.blackhatworld.com/blackhat-seo/black-hat-seo-tools/372089-must-read-xrumer-tutorial-list.html

    • Michael Cox
      August 30th, 2012 at 6:18 pm

      Is there no way to live link on these FB comments Matt?

    • Matthew Woodward
      August 31st, 2012 at 9:06 am

      Tbh I thought they did live link :S

    • Tom Thoma
      September 12th, 2012 at 10:54 pm

      Matt, as Michael has mentioned forum profiles above, could you clarify if it is acceptable to still use these, I have used large numbers of profiles pointing to videos pointing to my articles on my website and this has worked well prior to penguin, I can switch to web2.0 as suggested, but wanted to know if the use of profiles is still acceptable and how to use them now post penguin?

    • Tom Thoma
      September 13th, 2012 at 11:48 am

      Hey Matt, as Michael mentioned forum profile backlinks, thought to ask your opinion of them post penguin, and whether they can be in used in one of the layers and what volume, but ran into problems with these recently with our site getting slapped ?

    • Matthew Woodward
      September 13th, 2012 at 2:27 pm


      Personally I dont use them anymore

  5. Anonymous

    Is it me or does video start breaking up 5 mins in?

    • Matthew Woodward
      August 31st, 2012 at 9:07 am

      Seems ok this end

  6. Anonymous

    Also Matt, why don't you allow normal WP comments on your blog, I have some detailed questions on tiered backlinking I would like to ask, i.e would you carry out this one campaign for only 1 url on your money site, or would you link tier1 to many url's on your money site, sorry but you didn't make this clear from the first vid. By the way great tutorial you are a natural teacher and thanks for all the effort you put into this, I have all the tools, apart from CS and LWB, which I will get from you link when I decide to go for them.

    • Matthew Woodward
      August 31st, 2012 at 9:06 am

      I would for the most part only link to one page on the money site but mixing it up here and there wont hurt.

      I like the Facebook comment system better, looks pretty n stuff :)

      Thanks for your kind words I've never tried to teach before and this is new territory for me, glad it has helped you out :)

  7. Anonymous

    So Guestbook psoting will be in the next video (I show you how to do it with Scrapebox/GSA in the tutorial)?

    • Matthew Woodward
      September 10th, 2012 at 7:01 am

      Its in this video ^^

  8. Hardvard Andrew

    do you index all link in 1 time? not with drip feed?

    • Matthew Woodward
      September 10th, 2012 at 7:02 am

      I just index the lowers the tiers which will auto index the upper tiers over a period of time

  9. Luke Ilechuku

    Matt, do you normally use a VPS? Is there a VPS service you could recommend?

    • Matthew Woodward
      September 13th, 2012 at 2:26 pm

      Hi Luke,

      I self host everything but had good experience with https://www.matthewwoodward.co.uk/get/PowerUpHosting in the past – they look expensive but look at the discount codes under the prices.

      They are on 1gbps line as well ^^

    • Luke Ilechuku
      September 14th, 2012 at 1:56 pm

      Matt, many thanks for that. I think I saw that site recently in the PPP WSO so I've been looking at that one. Didn't know about the discount codes though; will check if those are for life.

      When you say you self host, how does that work? Is it a spare PC in your home office? That's the other possible option I'm looking at.

    • Matthew Woodward
      September 17th, 2012 at 9:54 pm

      Yeah its just a couple of machines in the office that I remote into, nothing special at all but I'm on a 24 meg line which performs consistently which helps although I think I'm going to need a VPS or two soon looking at the project I've just mapped out

  10. Spencer Padway

    When do you do the guestbook/comment indexing? Since linkwheel bandit never ends, how long do you wait before going through with that?

    • Matthew Woodward
      September 15th, 2012 at 10:37 am

      Just leave LWB running for a week or two, you can click on the projects to see how many of the tiers it has built out.

      • Ray
        October 1st, 2013 at 12:45 am

        Great info. Do you just let Scrapebox continuously make comments on blogs (since you said that you have 3+ million entries on your master list)? I mean, don’t stop until there are 3+ million comments to your Tiers 2, 3, and 4? Is this correct? Thanks.

        • Matthew Woodward
          October 4th, 2013 at 9:20 am

          Pretty much :) Might even do a 2nd run lol

  11. Rex Stan

    Thanks for all your tutorials. In response to your reply to Luke. What do you mean by you self host..can you plese shed more light on that. Also regarding xrumer, I have searched all through the internet and non seem to deal with guestbook posting. Would you still be making a video on using Xrumer for guestbook posting (your tutorials are the best)?

  12. Rex Stan

    Thanks for the link, I have watched the videos and I've got the hang of it now. Regarding self hosting, do you mind sharing all that you use? I'd prefer self-hosting to paying for a server to host my software. Please Matt, share what sort of computers you use, the web connection/broadband service you use (I'm equally here in the UK) and anything else that will help.


    • Matthew Woodward
      October 2nd, 2012 at 11:48 am

      Hi Rex,

      I just use a couple of old machines – I think one is something like a dual core x2 with a couple of gigs of RAM and the other is an old Core 2 Duo with 4GB of RAM, on a 24 meg line. Although recently I have started to use VPSs again as I'm running my home based machines/bandwidth at near 100% capacity.

    • Rex Stan
      October 3rd, 2012 at 7:28 pm

      Thanks for the heads up. I really enjoy coming back to your blog. A couple of questions though: I've got a couple of machines myself (2 to be precise) and I have upped their RAM. What internet provider/package would you recommend in the UK (BT, Sky or Virgin) that would allow me to pump all the threads I can without any complaints or getting banned? Also, do you make use of proxies (public or private) and VPNs to protect your IP?


    • Matthew Woodward
      October 4th, 2012 at 4:50 pm

      Hi Rex,

      I'm with Be Broadband, who used to offer an unlimited 24 meg package but don't anymore, although my friend just got his installed and is running near to that anyway :S

      I use 50x semi dedicated private proxies from https://www.matthewwoodward.co.uk/get/BuyProxies

    • Rex Stan
      October 9th, 2012 at 9:52 pm

      Thanks mate, I really do appreciate you being open with the resources you make use of. I was able to find some broadband providers on MoneySupermarket that offer 24 meg. Gonna try one and see what happens.


    • Matthew Woodward
      October 14th, 2012 at 3:41 pm

      No worries :)

  13. Kobiruzzaman Jibon
  14. Anonymous


  15. Jack Willis

    Hi Matt,

    What is your opinions on using indexing programs/services like Lindexed?

    • Matthew Woodward
      November 2nd, 2012 at 9:45 am

      Waste of time & money – just build more links to them

  16. Milon Hossain


  17. Wakonda Marketing

    Hi Matt, thank you for having this around mate! I'm sure we could benefit from all of it.

    • Matthew Woodward
      November 21st, 2012 at 10:08 am

      No worries, hope it helps you out!

  18. Marcus Ochoa

    Thanks for the video Matthew! Where can I find someone that I can pay to do all of this for me if I don't have the time? Do you have a service that you offer?

    • Matthew Woodward
      November 21st, 2012 at 10:10 am

      I don't offer a service sorry, and only work on a consultancy basis. I could put together a strategy for you covering the setup of the system, as well as any required technical/human resources to implement it, so you still have full control over what happens without doing any of the work.

  19. Anonymous

    Hi Matt, you are really the man! Thanks so much for the in-depth tutorials. One thing though: I have not been able to access or download any of the footprints and many other resources you mentioned in the video. I have shared the video by going to share and embed, but still can't get the link to download the mentioned resources. Please help me, I need them, in particular the image spinning template, the auto approve list and the Scrapebox footprints. Once again, thanks so much!

    • Matthew Woodward
      November 30th, 2012 at 11:09 am

      Thanks for your kind words

      Drop me an email and I'll send them all over to you – not sure why it's not working for you

  20. Brent

    Thanks mate! This was probably the best tutorial I have seen (and I have seen many over the years).

    One question. Why did you have the LWB post an article every hour? Isn’t that like overkill? And wouldn’t you run out of unique articles at some point? I saw you only loaded 52 in LWB. At the rate of every hour, that would run out pretty darn fast!

    Also, for the target URLs, does LWB create separate pyramids for each target URL? If you load 10 target URLs, does that mean 10 separate pyramids?

    Thanks again for putting in the time and effort to teach the rest of us. Now take down the videos, so it remains a secret! :)

    • Matthew Woodward
      January 17th, 2013 at 8:54 am


      Because the link structures it makes are massive and if it wasn’t posting every hour it would take months. The 52 articles I loaded are all super high spun content.

      No it just creates one monster pyramid across all of the URLs

      I’m just playing with LWB 2 now – very impressive so far!

      • Brent
        January 17th, 2013 at 10:46 pm

        Yeah, I’m playing around with it as well.

        Maybe it's a question that should be directed at the developers, but does posting one article every hour mean one to each property? And is there a way we can export all the built links?

        Not to worry, I will mention you when I contact the developers with my questions. You have pushed their product better than they have!

        • Matthew Woodward
          January 18th, 2013 at 7:59 am


          Yes I believe that is the case. You can export reports in the File menu.

          Haha the dev is a cool guy – but he is a dev and not a marketer :)

  21. choikh

    Hi Matt,
    Can you share how you find the auto approve blog lists you use in Scrapebox?

    • Matthew Woodward
      May 26th, 2013 at 10:51 am

      A quick and easy way is to just take them from forums, merge them all together, do an outbound link check and remove any that are majorly spammed

      • choikh
        May 26th, 2013 at 11:20 am

        What do you mean by an outbound link check?
        Can I do this in ScrapeBox?

        • Matthew Woodward
          May 27th, 2013 at 8:26 am


          Yes, look in the addons section for the link checker tool. Basically, if it's got 4,000 outbound links, don't use it :P
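
An illustrative Python sketch of that outbound-link sanity check (Scrapebox's link checker addon does this for you; the parser, the 4,000-link threshold and the example URLs below are just assumptions to show the idea):

```python
# Sketch of the outbound-link check: count <a href> tags on a page and
# skip any page that carries too many outbound links.
from html.parser import HTMLParser

class LinkCounter(HTMLParser):
    """Counts anchor tags that carry an href attribute."""
    def __init__(self):
        super().__init__()
        self.count = 0

    def handle_starttag(self, tag, attrs):
        # attrs is a list of (name, value) pairs for the tag
        if tag == "a" and any(name == "href" for name, _ in attrs):
            self.count += 1

def too_spammy(html, max_outbound=4000):
    """Return True if the page carries more outbound links than we tolerate."""
    counter = LinkCounter()
    counter.feed(html)
    return counter.count > max_outbound

# A page with only two outbound links passes the check
page = '<p><a href="http://a.example">x</a> <a href="http://b.example">y</a></p>'
```

In practice you would fetch each URL from the auto approve list, run it through the check and keep only the pages under the threshold.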

  22. Chris

    1.) Just curious whether you purposely left out adding the “Target URLs” in Scrapebox when teaching posting on the ‘Lazarus’ platform?

    2.) You're awesome, btw. I bought CS from you and will be purchasing UD, LWB and Inspyder from you. SB and GSA I already have. Also, I should've held out on Backlink Beast as they don't have a drip feed option and don't post to nearly as many sites as UD does.

    • Matthew Woodward
      June 18th, 2013 at 8:35 am


      1) Can you please expand on that a bit?

      2) Thanks =D

  23. nicholgm

    Hi Matt, you didn't really go into detail about where you got the Lazarus list from. Could you please elaborate?

    • Matthew Woodward
      July 12th, 2013 at 9:51 am


      Just search ‘autoapprove list’ in Google, filter for discussions – eat your heart out!

  24. Andrew Edmonds

    Curious. When you check that tier 2 links point to tier 1 and create a tier 2 “found” list, shouldn't you use that new list when checking tier 3 links, instead of your original tier 2 list? Otherwise you may end up spending time indexing tier 3 links that point to tier 2 pages which no longer point to tier 1.

    Does that make sense?
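
Andrew's point can be sketched in a few lines of Python: verify tier 2 against tier 1 first, then check tier 3 against the verified tier 2 list rather than the original one (all URLs below are made-up examples):

```python
# Sketch of re-checking tiers top-down, so you never waste time indexing
# a tier 3 link whose tier 2 parent no longer points at tier 1.

def verified_links(candidates, live_targets):
    """Keep only the links whose target is still in the live set."""
    return {link for link, target in candidates.items() if target in live_targets}

tier1 = {"http://money-site.example/"}

# link -> the URL it points at
tier2 = {
    "http://web20-a.example/post": "http://money-site.example/",
    "http://web20-b.example/post": "http://other-site.example/",     # dropped our link
}
tier3 = {
    "http://blog-x.example/comment": "http://web20-a.example/post",
    "http://blog-y.example/comment": "http://web20-b.example/post",  # dead tier 2 parent
}

tier2_verified = verified_links(tier2, tier1)
# Check tier 3 against the *verified* tier 2 list, not the original
tier3_verified = verified_links(tier3, tier2_verified)
```

Only blog-x survives the second pass, since blog-y points at a tier 2 page that no longer links to tier 1, which is exactly the waste Andrew describes.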

  25. watekungsik

    Hi Matt,

    How long would you run the guestbook project in GSA? And can I update the new tier 2 & 3 URLs in the same project?

    • Matthew Woodward
      September 16th, 2013 at 10:09 am

      Until everything is indexed :)

  26. Waseem

    Is there an alternative to Link Wheel Bandit?

    • Matthew Woodward
      September 18th, 2013 at 11:54 am

      No, it's the only tool of its kind, but you can do without it

  27. Erik Heyl

    Hi Matt, I’ve been trying to decide between Captcha Sniper and Captcha Breaker (of GSA fame). I’m curious: Besides price what made you pick CS over CB?

    • Matthew Woodward
      October 7th, 2013 at 7:59 am

      CB didn’t exist at the time ^^

  28. Ray

    Hi Matt. I'm doing your Scrapebox blog comments right now (purchased with your link, BTW). I have 2.5 million blogs to comment on, and it seems like 10% is going through, so that once it gets done I'll conceivably have 250,000 comments. My question is, do I have to do the guestbook submission IN ADDITION to my comment blasts, or is it optional, i.e. you can do one but not the other? Thanks.

    • Matthew Woodward
      October 8th, 2013 at 10:22 am


      Scrapebox doesn’t have an affiliate program but thank you anyway :)

      The guestbook stuff is optional but blast away with anything on those lower tiers!

      • Ray
        October 8th, 2013 at 3:17 pm

        Thanks for the info. Scrapebox doesn’t have an affiliate link? Haha. Don’t I feel like a pandering a-hole. But seriously I did buy something with your link. Captcha Sniper maybe? Forgot what it was. Anyways, keep up the good work!

        • Matthew Woodward
          October 9th, 2013 at 10:19 am

          Haha I don’t care if you bought through my link or not, as long as its helped you out f*** everything else :)

  29. Nitin

    Hi Matthew! I am a great follower of yours, and I must say you are one of the few whom I can call the “G** of SEO”. As I have yet to understand: do link wheels work as well as link pyramids? Can I get penalised for using link wheels? Can you do a tutorial on link wheels using web 2.0s? Just the basics, and the advantages as well as the disadvantages, not an actual tutorial using a software.

    • Matthew Woodward
      October 23rd, 2013 at 1:46 pm

      Thanks very much :)

      No you should stay away from link wheels – they worked a lot better a few years back :)

      • Nitin
        October 24th, 2013 at 2:27 am

        Okay, I shall stay away from it! Thanks Matthew!

        • Matthew Woodward
          October 24th, 2013 at 8:15 am

          No worries :)

  30. Chris

    Hi Matt, loving the videos. I have a few questions: do I need all the products you use? I mean, is there not one piece of software that does it all?

    • Matthew Woodward
      November 12th, 2013 at 9:06 am


      You can get away without using LWB

  31. Sal

    Hello Matthew, you're kick a**, man!!! I needed to learn about link building desperately and you have provided me with a lot of knowledge, thank you very very much. One question I would like to ask: once you have submitted your first round of links (tier 1, 2, 3) and you rinse and repeat, does that mean that you submit to the same sites again in Ultimate Demon?

    • Matthew Woodward
      December 1st, 2013 at 1:08 pm

      Thanks Sal :)

      No you must use different sites

  32. John

    Hi Matt

    1. Is tiered link building the same as a link wheel?

    2. Should I link all tier 1 pages with one-way links?

    • Matthew Woodward
      April 18th, 2014 at 9:51 am


      1) No – use Google image search to spot the difference

      2) not sure what you mean

  33. Ken

    Hi Matt,
    I am now watching your tutorials, and the Link Wheel Bandit link seems to be broken? How can I get the software?

    Thanks for your great tutorials. I also subscribed to your Premium tutorials.


    • Matthew Woodward
      October 1st, 2014 at 9:14 pm

      Yes it is dead now – you can skip it!

      • Ken
        October 2nd, 2014 at 2:17 am

        Hi Matt,

        I subscribed to your Premium tutorials, but in the member panel 2 videos show the error “Video file not found”:
        Riding Other Peoples Traffic Waves With Twitter – Rank Cracker
        Content Creation & Promotion Made Easy – Rank Cracker

        Can you get it back?

        • Matthew Woodward
          October 14th, 2014 at 10:43 pm

          Refresh the pages :)
