Advanced Tiered Link Building Tutorial Part 2 – Preparation

This is the second video in my Ultimate Guide To Tiered Link Building video tutorial series. If you missed part 1 then be sure to check out the episode guide on the right.

Thanks for all the great feedback so far! Please share this video series with others if you find it useful. It takes hours to make each video so taking a few seconds to share it is a great way of saying thank you.

Important Update: Please read this.

What You Will Learn

  1. How to create a perfect tier 1 link profile
  2. How to prepare for the campaign
  3. How to spin content the right way
  4. How to setup a campaign over my shoulder
  5. How to schedule your tier 1 link campaigns

Resources In The Video

Essential Tools

TheBestSpinner (download the trial) – It really is The Best Spinner.

Ultimate Demon ($50 Discount) – perfect for creating a solid tier 1 link profile.

Sourcing Content

99centarticles.com – Good quality articles for the money – perfect for link building!

Exclusive reader discount – Save 10% with coupon code MATTHEWWOODWARD

SEOGenerals.com – Fantastic backend for managing your orders and projects.
Update: Since publication the SEO Generals service is not what it once was and the coupon has now expired. Please use 99CentArticles who offer a great service and pricing.

Fiverr videos – Can’t get them any cheaper than that!

Other Resources

Google Adwords Keyword tool – Find related keywords and tags.

Scrapebox – The swiss army knife of the SEO world.

BuyProxies.org – The semi-dedicated ones from BuyProxies are superior to the SquidProxies ones I used to use.

456 Responses

  1. David Sadows

    Aditya, I think Matthew is sharing some quality info here, don't you? I don't mind throwing him an affiliate sale or two for his trouble.

    • Matthew Woodward
      August 4th, 2012 at 2:44 pm

      I think the comment was deleted but thank you very much for the support.

      These videos take an awful lot of time to make, never mind how much time and money I invest in testing various factors!

      Don't like the affiliate links? Don't watch the videos :)

      • penny
        September 22nd, 2013 at 1:43 pm

        I am going to buy all the products that you listed with your affiliate links. The amount of information here is invaluable.

        • Matthew Woodward
          September 22nd, 2013 at 4:48 pm

          That is very kind of you thank you!

      • Tony Hayes
        February 1st, 2014 at 11:35 pm

        People want the content to learn how to make money online and then some complain when you lead by example.
        Look at what Matthew is doing here guys…
        High quality content with useful and actionable information, compelling headlines and call to action.
        Don’t knock it, learn from it!

        • Matthew Woodward
          February 2nd, 2014 at 7:17 pm

          You can’t win them all :)

  2. Brax Caberte Bustillo


  3. Peter Webb


    I have now found a resource that finally explains tiered link building in a simple way and how to integrate the tools together to make it happen in a Google safe way. Wish I had this before I bought UD. I would certainly have purchased through your affiliate link.

    Can't wait for the next videos. Will you be covering the new version of UD with link trees?

    Many thanks for what you are doing.


    • Matthew Woodward
      August 8th, 2012 at 8:02 am

      Hi Peter,

      Thanks for the kind words :)

      The link trees feature looks interesting, but it won't be something I use personally – I have a much more efficient and streamlined way of doing things. It will, however, allow you to create multi-tiered structures within Ultimate Demon easily.


      P.s. video 3 is up ^^

  4. Nick Roberts

    Great videos, do you offer this as a service Matthew? I don't have SB at the moment or ultimate demon. I am just building my tier 1 with a blogger blog and was going to redirect that to my main domain once I had it ranking. it's on page 2 now.

    • Matthew Woodward
      August 19th, 2012 at 1:09 pm


      I don't offer it as a service, sorry, and if I did most people wouldn't pay the price I would ask. I may take on a couple of private clients, but don't expect pricing to be what you see in forums etc.

      Just scale out what you're doing with automation :)

  5. Michael Cox

    When scraping your own sites do you worry about PR at page or domain level at all, or just add a good mixture of all PR?

    • Matthew Woodward
      August 21st, 2012 at 7:58 am

      Not when I'm scraping but when I submit tier 1 links I like to ensure they are only on domains with PR

    • Michael Cox
      August 21st, 2012 at 10:37 am

      Yeah, thought that would be the case. Hey, do you know you rank no. 13 in the UK for 'buy seo'? Not sure how that's happened lol…

    • Matthew Woodward
      August 21st, 2012 at 10:42 am

      Haha I didn't know that, how did you find that? Maybe Matt Cutts isn't talking out his arse after all :P

    • Michael Cox
      August 22nd, 2012 at 9:08 pm

      I saw you were getting a few genuine social signals on the videos with the content lock, so I was checking your site out on SEMrush to see what KWs you were ranking for.


    • Matthew Woodward
      August 23rd, 2012 at 8:31 pm

      Ahh that's pretty sweet, thanks – will be keeping an eye on that one.

      I don't suppose you know of a tool that can report the amount of tweets, likes and +1s for a list of URLs in bulk, do you?

    • Michael Cox
      August 25th, 2012 at 2:15 pm

      Not really – SEOmoz does it for sites you have added as projects, but that's about it. Sure there will be a tool somewhere.

    • Matthew Woodward
      August 28th, 2012 at 4:35 pm

      SEO Tools for Excel ^^

    • Tom Thoma
      August 29th, 2012 at 10:58 pm

      MatthewWoodward.co.uk – Daniel Tan's (SEOPressor) Social Metrics Pro plugin collects stats on posts and pages for Twitter, Facebook, Google+, Pinterest, StumbleUpon, Digg and LinkedIn, and you can export them as CSV/XLS.

    • Michael Cox
      August 30th, 2012 at 5:55 pm

      Oh yeah, from SEO Gadget – have you used it yet?

    • Matthew Woodward
      August 31st, 2012 at 9:14 am

      I did a quick test to look up social values for URLs, seems to do the job will be using it to do the monthly report on the blog

    • Matthew Woodward
      September 10th, 2012 at 7:09 am

      Checking out Social Metrics Pro now, thanks for the heads up!

  6. Tom Thoma

    Hi Matt, Great Videos, Keep it Up!

    I am new to the 3 tier linking, but need to clarify a point.

    I normally have about 20 unique keyword-specific articles on my sites and follow up with links from article directories, blogs, social networks, etc.

    In this section you mention 3 x 500 word articles on a keyword – does that mean you are linking to all three articles on the website, or are you mass spinning these back to one original copy on the website?


    • Matthew Woodward
      September 10th, 2012 at 7:08 am

      Hi Tom,

      The 3x 500 word articles are what is used to create the backlinks and are not posted on your own website.

  7. Tom Thoma

    Hi Matt, Great Videos, Keep it Up!

    I am new to the 3 tier linking, but need to clarify a point.

    I normally have about 20 unique keyword-specific articles on my sites and follow up with links from article directories, blogs, social networks, etc.

    In this section you mention 3 x 500 word articles on a keyword – does that mean you are linking to all three articles on the website, or are you mass spinning these back to one original copy on the website?


    • Matthew Woodward
      August 29th, 2012 at 10:24 pm


      The 3x 500 word articles I talk about preparing are to be used in the link building campaign and not for the site itself.

      You can use this entire process to rank 1 page for a range of keywords and longtails if you mix the anchor text up correctly.

  8. Anonymous

    Matt, could you clarify a few points.
    1) The 3 x 500, once spun, will be placed on web 2.0 sites – does this mean you will have 3 posts on each web 2.0 and the final post will have your link pointing back to your money site?
    2) In the video you post your link in the first post to the web 2.0 – isn't it customary to create your profile and wait a few days before posting content, to prevent your account getting deleted by moderators?
    3) Would you use the same 3 x 500 articles spun to submit to article directories?
    Thanks for your help in advance – I feel I've got most of it down, just need a few clarifications.

    • Matthew Woodward
      September 3rd, 2012 at 8:40 am

      1) Sorry – I use one of the articles on the web 2.0s, 1 for the wikis and the other one for the ADs & press releases

      2) Don't worry about this (we will build out proper web 2.0 sites by hand in the last video). Ultimate Demon mostly posts to Elgg and Jcow web 2.0 platforms where this isn't a problem.

      3) See answer 1 ^^

      Hope that clears things up

  9. Callum Ward

    Brilliant vids! You normally pay for this kind of tuition ;) thank you!

    • Matthew Woodward
      September 10th, 2012 at 7:06 am

      You can send me some money if you want :P

  10. Nathan Williams

    Hey Matt,

    I'm having a hard time coming up with the funds for UD. What do you think about manually posting to about 15-20 WEB 2.0 properties…like the "big" ones (wordpress/blogger/etc) for Tier 1 and then using GSA to make the Tier 2 and 3? Would 15-20 be enough? Or would those "big" ones know what I'm doing and shut me down?

    Appreciate your input!

    • Matthew Woodward
      September 17th, 2012 at 9:44 pm

      Hi Nathan,

      Yes that would be a sensible approach to take – but a better approach would be to watch some of these tutorials on SENuke X https://www.matthewwoodward.co.uk/reviews/the-best-senuke-x-reviews-tutorials/ and then work out how you will plan to use it.

      Spend some time preparing your content and watching the videos so you're familiar with the tool before you even use it. Then take out the 14 day trial, load your campaigns in and fire them up straight away, using the theory of what you have learnt in these videos to guide you.


      • dr cliff
        April 11th, 2014 at 5:06 am

        For beauty supplements, isn't GSA just as good? Why is SENuke or Ultimate Demon better – exactly why? And which GSA should I use, Ranker or Submitter, if I am broke and missed the SENuke trial?

        • Matthew Woodward
          April 12th, 2014 at 12:17 pm

          I don’t know what any of that has to do with beauty supplements

  11. Luke Ilechuku

    My goodness. This is really, extremely good content. I would have paid for this and been happy with my purchase. Thanks!

    • Matthew Woodward
      September 13th, 2012 at 2:43 pm

      Glad you enjoyed it :) Good content should be free not a WSO :P

  12. Paul Rone-Clarke

    Decent list. A bit old (over 50% no longer work) but still plenty of power.

    • Matthew Woodward
      September 14th, 2012 at 11:45 am

      Well the list is less than a month old and all confirmed working when I published it. Try the import again but tick 'my site is stored in a subfolder' and you'll get more successes :)

      Will have a check over the list again at the weekend and update if necessary

    • Paul Rone-Clarke
      September 14th, 2012 at 2:40 pm

      Hi – No real need, my lists are very much larger anyway (I have two servers scraping 24/7) – but I appreciate the gesture [scritty from BHW]

    • Matthew Woodward
      September 14th, 2012 at 3:46 pm

      And those lists are on their way to my inbox right now right? =D

      Welcome to the blog by the way :) Will probably catch you over at BHW more often than not!

  13. Rich Sale

    Hey Matt, Any experience with Magic Submitter and how it compares to UD?

    I have been using MS for my tier1 links for a while no complaints except the ongoing monthly cost.
    I see that UD has a one-time payment option.
    So just curious to see whether I should swap to UD.

    • Matthew Woodward
      September 17th, 2012 at 9:46 pm

      Hi Rich,

      Sorry I haven't got any hands on experience with Magic Submitter but I don't see why you can't use it for tier 1 effectively.

      I just see UD as a one time investment without having to worry about anything else, but a few people have asked me to tackle magic submitter so I might start having a play with it and seeing what it can really do.

      It's personal preference really! But one-time payment > monthly payment

      • eichie abdul
        November 15th, 2014 at 9:16 am

        Hey Matthew, you are so, so awesome. My God, since I found you online yesterday I have been shouting your name and telling all my friends about you. I am a total newbie to SEO, I don't know a thing. However I have a good heart to make an impact among students, so I started a blog: http://www.opensourceafrica.com/genius – can you please kindly look at it and tell me what SEO keyword I should target so I know which way to start in the link building process? Thanks so much already.

        • Matthew Woodward
          November 19th, 2014 at 5:22 pm


          That is very kind of you thanks.

          What is the purpose of your blog? It wasn’t clear when I looked

  14. Andrew Thomson

    Great stuff! – Have the downloads been removed? It would be good to be able to get them. I have not used scrapebox before – do you need to put the footprints in to make sure it finds the right link type?

    • Matthew Woodward
      September 17th, 2012 at 9:42 pm

      No, they are still there – you're just not looking hard enough :)

      Yes the footprints are what you use to find that a site is a certain platform, try some of the searches manually in Google to understand how it works.

    • Andrew Thomson
      September 20th, 2012 at 4:47 pm

      Ok figured it out now – I should have read it more carefully! That thing with the social buttons is pretty clever.

      How does Ultimate Demon tell the difference between the url website (platform) types?
      Does it work it out itself or do you need to submit them as separate pre-sorted lists – based on your footprint criteria?

    • Matthew Woodward
      October 2nd, 2012 at 10:41 am

      You would be surprised how many people miss them – I wonder what other important info people skip over when reading.

      With Ultimate Demon you just paste the list of harvested URLs into one box and click start – no need to separate or sort.

      The only additional filtering I do, once UD has added the sites to the database, is to export them all, check the domains are indexed and remove any that aren't via the mass deletion tool.

  15. Anonymous

    hi Matt, again many thanks for the amazing tutorials. I just set up my web 2.0s – I have to admit it did take some time, hopefully I will get a bit faster with practice. Just wanted to ask what you thought of SEO Generals? I just received 3 articles and they were really bad quality. Have you had any similar experiences?

    thx bro


    • Matthew Woodward
      October 2nd, 2012 at 10:39 am


      The quality of the articles is often down to the quality of instruction you provide – I am very specific about what I want. But push back on them and they will get it sorted for you.

      With a bit of practice you can set these campaigns up in no time at all – it becomes a mindless process after a while :)

  16. Trevor Bandura

    Quick question about the articles then spinning them.

    Could I use my original articles from my main blog, spin them like you suggested then use those spun articles to create all the tier one web properties?

    • Matthew Woodward
      September 21st, 2012 at 11:25 am

      You could but it is something I would avoid doing personally.

      For the sake of a few dollars to get a new article written, I don't see why you would take a shortcut like that which increases the risk of getting caught.

  17. Sally Cline

    Hi Matt.

    First, this is a great set of videos you've made! I just had a quick question regarding the content and preparation. You had mentioned…

    3 x 500 word articles. – Are these spun articles?

    Then you went on to mention 2 alternative sentences for each sentence in each article. This confuses me a little bit. Would you mind explaining this so I can set up my content appropriately? Would it be…

    3 unique 500 word articles…then write 2 alternative sentences for each of these 500 word articles and then spin them?

    Thanks in advance!

    • Sally Cline
      September 20th, 2012 at 8:39 pm

      Sorry for the double post, but I'm not sure where to use all 3 articles. Or if it's just 1 mass spun article. I watched video 3 and it looks like you used just 1 of the spun articles for the web 2 and article submission directories. Would we just spin it using {Article 1 + Spun content w/ 2 alternative sentences} {article 2 with spun content + 2 alternative sentences} {article 3 w etc}. Appreciate your help Matt!

    • Matthew Woodward
      September 21st, 2012 at 11:29 am


      Yes, they are spun to the specification in the video – for each article write 2 alternative sentences for each original sentence, then spin all of the words and phrases where it makes sense to do so.

      In answer to your second post, that's something I forgot to explain ^^ I use 1 for web 2.0s, 1 for wikis and 1 for ADs/PRs

      • Chris
        August 20th, 2013 at 4:32 pm

        Just so I’m clear, are you saying here that you use 3 articles prepared for spinning, and use the first of those articles to build links using Web 2.0s, the second of those articles to build links using Wikis, and the third of those articles to build links using article directories and “PRs”?

        Is this correct? You use a different article for Web 2.0, Wiki and article directories/PRs?

        And what are “PRs”?

        Thanks in advance.

        • Matthew Woodward
          September 4th, 2013 at 12:56 am


          Yes that is right :)

          PR’s are press releases!

    • Sally Cline
      September 22nd, 2012 at 12:07 am

      Thanks Matt! You're the best! Keep up your great work. I assure you it's helping us newbies out!
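
The sentence-and-word spinning described in this thread is standard spintax: variations wrapped in `{a|b|c}` groups, which can be nested. As a rough illustration (not TheBestSpinner's actual engine), here is a minimal Python sketch that expands one random variation from a nested spintax string:

```python
import random
import re

def spin(text, rng=random):
    """Expand nested spintax like {red|{dark|light} blue} into one variation."""
    # Repeatedly resolve the innermost {a|b|c} group until none remain.
    pattern = re.compile(r"\{([^{}]*)\}")
    while True:
        match = pattern.search(text)
        if match is None:
            return text
        choice = rng.choice(match.group(1).split("|"))
        text = text[:match.start()] + choice + text[match.end():]

print(spin("{Hello|Hi} world, this is a {simple|{very|quite} basic} test."))
```

Running it repeatedly on the same spun source produces the different article variations that get submitted across the tier 1 sites.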

  18. Tom Thoma

    Hey Matt, I am looking at SEOGenerals to get my articles written, however there are several options either,

    Junior, Senior, Expert,

    as well as whether they should be,

    LSI articles or Regular,

    your advice welcome?


    • Matthew Woodward
      October 2nd, 2012 at 10:34 am

      Hi Tom,

      This is a new feature they launched last week that I haven't used yet, but go with whichever one offers 500 word articles @ $4.70 a pop

  19. Susan Banga

    very advanced strategies.

    • Matthew Woodward
      October 2nd, 2012 at 10:38 am

      Thanks Susan!

  20. Rob Terrio

    Hey Matt, maybe you can shed some light on this question/problem and can help some fellow subscribers here. Basically what I'm doing is exactly what you have done here. I first acquired a list of keywords via scrapebox (30 keywords). Then I imported those keywords, the merge list, and the footprints.

    I then ended up with around 182,000 keywords/queries. I ended up getting 8 high quality private proxies as well. I imported those proxies and then I hit the harvest button. The problem I'm having is that the cpu usage is fluctuating around 0-2% and it repeatedly stops harvesting at around 9,000 results.

    When I stop harvesting I keep getting a message saying 90% of the URL's were removed because they're all similar. I found this post (http://knockoutbox.com/optimize-your-network-for-scrapebox/) on optimizing my network settings etc. Any suggestions?

    P.S. I changed my power settings from "power saver" to "high performance" and still get very low CPU usage in scrapebox. Sorry the post is so long just being as thorough as possible.
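
The keyword-and-footprint merge Rob describes is a simple cross product; a hypothetical sketch of how a short keyword list balloons into six-figure query counts (the keywords and footprints below are generic examples, not Matthew's lists):

```python
# Cross-merge keywords with platform footprints, the way Scrapebox's
# merge feature multiplies a keyword list into search queries.
keywords = ["car insurance", "fishing bait"]        # example seed keywords
footprints = ['"powered by elgg"', 'inurl:wiki']    # generic example footprints

queries = [f"{fp} {kw}" for fp in footprints for kw in keywords]
print(len(queries))   # 4 -- len(footprints) * len(keywords)
print(queries[0])
```

With 30 keywords merged against a few thousand footprint and merge-list lines, the product quickly reaches the ~182,000 queries mentioned above, which is why the harvest takes so long.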

    • Matthew Woodward
      October 2nd, 2012 at 10:38 am


      When you say it 'stops harvesting' does that mean the Scrapebox window just pops up showing you the scrape is complete and which keywords were completed and which weren't?

      If so the problem is all your proxies are getting banned in Google at which point Scrapebox cannot continue to scrape and ends the process. 8 proxies to scrape 182,000 queries is more than ambitious unless you have threads set to less than 10 or something (which would take forever)

      The URL's are getting removed because you have options > automatically remove duplicate domains ticked.

      I dont see why you would want to try and increase CPU load? If anything you should be looking to reduce it.

    • Rob Terrio
      October 4th, 2012 at 6:27 am

      Scrapebox doesn't automatically show any kind of complete signal – it just stops harvesting, like a freeze-up I guess you could say. I figured proxies would speed up the time the actual harvesting takes, rather than a whole 20-some-odd hours like you explain in the video here. How many proxies do you suggest? I figured more CPU output would speed up the whole harvesting process.

    • Matthew Woodward
      October 4th, 2012 at 4:49 pm

      Who is your proxy source? Not SquidProxies by any chance?
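
To see why 8 proxies against 182,000 queries is "more than ambitious", a back-of-envelope estimate helps. The per-query delay below is an illustrative assumption, not a Scrapebox setting:

```python
# Back-of-envelope estimate of a Scrapebox harvest (all numbers are
# assumptions, not Scrapebox internals): each proxy can safely issue
# roughly one Google query per `delay_seconds` before being temp-banned.

def harvest_hours(queries, proxies, delay_seconds=20):
    """Rough wall-clock hours to push `queries` search queries through
    `proxies` rotating proxies at one query per proxy per `delay_seconds`."""
    queries_per_hour = proxies * (3600 / delay_seconds)
    return queries / queries_per_hour

# Rob's setup from the comment: 182,000 queries on 8 proxies.
print(round(harvest_hours(182_000, 8), 1))   # ~126.4 hours
# The 50 semi-dedicated proxies Matthew mentions elsewhere:
print(round(harvest_hours(182_000, 50), 1))  # ~20.2 hours
```

Even at an optimistic 20-second spacing the 8-proxy harvest runs for days, while 50 proxies bring it down to roughly the "20 some odd hours" figure quoted from the video.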

  21. Argo Blue LLC

    2 questions: how many private proxies do you use?
    What are your thoughts on not pinging the links and letting them be found naturally?

    • Matthew Woodward
      October 14th, 2012 at 3:36 pm


      I use 50 semi-dedicated, but if you only have a few sites then 10 will be fine.

      Pinging is a waste of time and has been for a long time

  22. Sally Cline

    Hi Matt,

    I was wondering if you would recommend spinning the words in the Titles. I have created 15 alternative titles and think it may be better to spin the words in there as well. If you disagree, can you please let me know why? I appreciate your help!

    • Matthew Woodward
      October 14th, 2012 at 3:41 pm

      Yes, you should spin the words. When writing the titles, try to include at least 8 words so you can give them a good spinning.

  23. Shaun Mengella

    Hi Matthew
    Awesome content! Is there a way I can use SENuke instead of UD?

    What part of the NW are you in? I am in Lancashire. :-)

    • Matthew Woodward
      October 14th, 2012 at 3:27 pm


      Yes, you can apply the theory taught in the videos with SENuke without a problem, although it is a more expensive solution.

      I'm in Cheshire

  24. Patrick Brady

    Hey Matt,

    Just a quick question. When word spinning your content (after sentence spinning) how many synonyms are you adding? The reason I ask is that I've sentence spun (3 versions of every sentence) and word spun everything in TBS for my article, and TBS still only says 23% unique at the bottom. Not sure if that's an issue. Thanks for your time.

    • Matthew Woodward
      October 18th, 2012 at 4:37 pm

      As many as I can where it makes sense to do so.

      The unique counter in TBS is awful and makes no sense at all – use the generate and compare function on the publish tab to get a true look at uniqueness.

    • Patrick Brady
      October 18th, 2012 at 5:50 pm

      Sounds good, Matt. Thanks very much for the reply.
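
The generate-and-compare idea – produce two spun versions and measure how much they overlap – can be approximated with word n-gram shingles. This is a hypothetical sketch, not how TBS computes its counter:

```python
def shingles(text, n=2):
    """Word n-grams of a text, lowercased (bigrams by default)."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def uniqueness(a, b, n=2):
    """Percentage of `a`'s word n-grams that do not appear in `b` --
    a rough stand-in for a generate-and-compare uniqueness check."""
    sa, sb = shingles(a, n), shingles(b, n)
    if not sa:
        return 0.0
    return 100.0 * len(sa - sb) / len(sa)

v1 = "the quick brown fox jumps over the lazy dog"
v2 = "the speedy brown fox leaps over the sleepy dog"
print(round(uniqueness(v1, v2), 1))  # 75.0
```

Two generated variations scoring high on this kind of measure is a much better sign than whatever the in-editor counter reports.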

  25. John Kostrzynski

    Hi Matt

    This is actually the best link building guide I've ever read, and it's free. I actually do the same, just on a smaller scale (until I get all the tools xD), and I never get slapped with those G updates. Thanks for the share, and success!

    • Matthew Woodward
      October 26th, 2012 at 7:16 pm

      Hi John,

      I'm glad you enjoyed it :) Who needs WSO's ^^

  26. Mark Scott

    Hi Matthew, jerry attrikk here again.

    I have 2 questions on the tiered link building videos (part 2 in particular)….

    1. You kindly detail the process of finding your own list of target sites for UD. This is superb information, thank you. However it caused me to wonder about something. You say to merge a list of related keywords, which I understand and have done so. However I will be using UD for various projects, which span several very different niches. What I don't understand is that if I do it for one niche, lets say car insurance, then won't my sites list be full of car insurance article directories, bookmarking sites etc? If so, what happens when I come to run a link tree or campaign for a website about fishing bait for example? Should I scrape again, and add those scraped target sites? If so, won't I be submitting hundreds of spun articles to hundreds of sites which won't accept the content, and won't that get my email or IP banned?
    Hope I explained this well enough, maybe you can explain where my thinking is going wrong, as I am sure you must be able to run multiple campaigns to different niches in one installation of UD.

    2. On your videos you set up a gmail address, which I also did. However on some of my campaigns, already run, I noticed a lot of failures on article directories citing the reason for refusal as something like "Free email addresses are not allowed". Do you get this problem and if so do you no longer use free email addresses? I have many parked domains, I suppose I could just turn on pop3 for some of them and let UD access them. Would you do that, or do you just use free email addresses and accept the failures?

    Thanks again for such a brilliant video tutorial, if I had money I would pay you a lot of money to sit with me for a few days and show me first hand how you use UD! (Idea perhaps?)

    • Matthew Woodward
      October 26th, 2012 at 7:14 pm

      Hi Mark,

      1. I think you're overthinking it – I just merge in those keywords to find more sites that I can target. As long as the pages your links are coming from have relevant content that makes sense on them, you're good to go.

      2. Yes, you can reuse those domains and set up catch-all email addresses, so you can just make them up on the spot without actually creating accounts for each one. See the UD documentation for more details on how to use them with UD.

      I have a few consultation clients, can always use 1 more :P

  27. Mark Scott

    Matthew, I have followed your instructions to the letter, for scraping for target sites and importing into UD. However out of several thousand sites, only about 6 were successfully imported into the site detector. Is this normal?
    About 80% had "NOT FOUND" and the rest of the failures had server errors either 500 or 403 forbidden.

    • Andrew Smith
      November 10th, 2012 at 9:34 pm

      Similar question here Matt – Seeing a really low success rate with my export from SB as UD runs through the sites and detects what platform they are.

      Just wondering if this is down to the footprints as would expect a pretty high hit rate if the footprints are specifically digging up sites for the platforms UD supports?

      How have you found this and any tips on improving.

      Also, can't remember if you mention this, but glad I've been running anti virus on my VPS as obviously UD is hitting many sites and AVG is catching the viruses… Just hope it gets them all or my VPS is gonna need rebuilding frequently!

    • roman
      February 12th, 2013 at 5:05 am

      Yes, I'm also interested in a tip

  28. Shaun Mengella

    Hi Mathew
    I'm having a little trouble inputting my private proxies into Ultimate Demon – don't really know how, TBH. Could you possibly show me an example?

    • Matthew Woodward
      November 2nd, 2012 at 9:52 am

      Click Global Proxies > Paste into big box > Test proxy servers > Save

      They need to be in this format: xxx.xxx.xxx.xxx:xxxx OR, if you have a username/pass: user:pass@xxx.xxx.xxx.xxx:xxxx

      • Tom
        February 14th, 2013 at 10:02 pm


        Having purchased Ultimate Demon and ScrapeBox, I am trying to set up the proxies and have a general question. At BuyProxies they offer, for $10/m, the 2-subnet semi-dedicated 10 proxies, but which location should be specified – USA, Europe, or USA & Europe? I am not sure if it makes a difference which is chosen, but thought to ask before purchasing these.

        • Matthew Woodward
          February 18th, 2013 at 11:04 am


          Any location will be fine :) I tend to buy the ones closest to me though.

      • Tom
        March 4th, 2013 at 2:58 pm

        Hi Matt, I have a bit of a problem with the UD proxies as well. I have used the format that you mentioned above, but as I am using an email directly through HostGator (SSL/TLS), the second part of that fails – see the format below

        user:pass@xxx.xxx.xxx.xxx:xxxx
        [email protected]”:[email protected]:xxxx

        This is the format used in the UD video, as recommended – Gmail and Hotmail do not seem to be the best way to distribute articles, so I am trying to use a cPanel email.

        Any ideas welcome….


        • Matthew Woodward
          March 5th, 2013 at 10:39 am


          I think you're getting confused.

          The user and pass is for the PROXY login, IF your proxies require it.

          It has nothing to do with your email address – this is a separate login.
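
The two proxy formats quoted in this thread (`ip:port`, and `user:pass@ip:port` when the proxy requires authentication) are easy to sanity-check before pasting a list into any tool. A small illustrative Python parser – the regex is my assumption, not UD's actual validator:

```python
import re

# The two shapes from the reply above:
#   ip:port                e.g. 203.0.113.7:8080
#   user:pass@ip:port      when the proxy requires authentication
PROXY_RE = re.compile(
    r"^(?:(?P<user>[^:@\s]+):(?P<password>[^:@\s]+)@)?"
    r"(?P<ip>\d{1,3}(?:\.\d{1,3}){3}):(?P<port>\d{1,5})$"
)

def parse_proxy(line):
    """Return a dict of proxy fields, or None if the line is malformed."""
    m = PROXY_RE.match(line.strip())
    return m.groupdict() if m else None

print(parse_proxy("203.0.113.7:8080"))
print(parse_proxy("alice:s3cret@203.0.113.7:8080"))
print(parse_proxy("not a proxy"))
```

Anything the parser rejects – such as an email address pasted in place of the proxy username – is the kind of line that makes the import fail.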

  29. Harjinder Gill

    Hey Matt,

    Awesome videos. I really like how you have broken down the whole backlinking process. Your videos have helped me to improve my existing backlinking strategy :)

    Quick question for you: Do you outsource the spinning part(it's really boring :))? If yes, could you please share it with us?

    • Matthew Woodward
      November 6th, 2012 at 6:18 pm


      Glad they have helped you out!

      Yes, I outsource the spinning – I employ people directly, solely for that task.

  30. Jeffie Hylands

    Hey Matt getting everything set up right now and transferring the seo software to a VPS. I am checking out the proxies, what package do you use for your seo services? I am sure you do way more than me but just want to make sure I am covered when scraping content. Thanks buddy

    • Matthew Woodward
      November 7th, 2012 at 9:53 am

      I have 50 semi-dedicateds; if you're only doing 1 or 2 sites you'll be fine with 10 though.

  31. Alexhs Alex

    Awesome! Thanks for giving all this info for free. By the way, will you make a tutorial explaining how to make SENuke templates? Or do you know any good link that explains how to do this? I googled but didn't find anything good.

    • Matthew Woodward
      December 4th, 2012 at 4:14 pm


      No worries – say thanks by sharing my tutorials on forums and things :)

      Do you mean the wizard linking templates?

      Don't use the wizard – it's awful, very very basic, limits you to 3 accounts and lacks a lot of control. Learn to use it without the wizard, otherwise you're wasting your money.

    • Alexhs Alex
      December 6th, 2012 at 11:30 pm

      Thank you very much! My problem is that I don't know where tier backlinks should point to, e.g. should web 2.0 profiles link to the articles or the money site or something else? I wanted to know if there is a detailed guide on what to link to what.

  32. Atif Rehman

    Where is the list from your scraping?

    • Matthew Woodward
      December 7th, 2012 at 1:28 pm

      what do you mean?

  33. Tom Thoma

    Hi Matt,

    I purchased Scrapebox and was shocked at the huge list of keywords it has uncovered. Having spun my articles now at SEO Generals, I am about to purchase Ultimate Demon and have a couple of questions.

    Normally I use Market Samurai to identify keywords and select 20 for the website articles. So with the list that Scrapebox comes back with, does that mean we use the Scrapebox keywords linking back to the articles on the website, OR can the Scrapebox keywords be used for the website articles as well?

    The reason I ask is that many of the 1000 keywords that Scrapebox has come back with are very long tail keywords with very few searches, so I'm not sure how it all fits together?


  34. Terry

    Hey, why can't I see your lists?
    I went over to the blog, liked + Google+'d + tweeted, and nothing happened…
    Where can I see them?

    Great videos by the way :)

  35. Kris

    Hey Matt,

    I’ve been in this business a while and still found the easy way you lay all this out very helpful. One question about spinning, I saw above that you outsource yours.

    From what I have found, deep spinning services run around $30 per 500 words plus the article cost. Once you have all that content you still have to spin in links, images and video. You also have to generate all of your titles, descriptions, tags, keywords, etc.

    Seems to me that the content generation side of this either costs well over $100 or takes several days to really do right yourself. Even if you buy the articles you likely have to do the descriptions etc. yourself, which combined with the image/video/link insertion, takes several hours. Once you have all of the content it looks to me as though it will take an hour + to set up the campaign in UD.

    I guess in the end my question is, does that all sound about right? Obviously if you can get a good oDesk worker to do this at $2-$5/hr you could lower your costs.

    The videos do a great job of showing how to set up and run the campaign but it looks like most of the work/cost is on the content side and that is a very involved process. I would think this is a $399+ service just for tier 1, lol.

    I am looking to run this whole process on probably around a post a week on just one of my sites, let alone other ventures. Seems to me you really need full time staff if you are working on much more than 1 simple site?

    I am getting ready to dive in and set up systems to get this done, just wanted to make sure I am not missing something that would make the content side a lot easier.

    Thanks not only for these videos but for this whole blog; it's a great case study!

  36. Lizzie Rubinow

    Great video tuts on Ultimate Demon. So far I've watched part 2 and will continue with the next video.

    Some questions here:
    Can I use a VPN instead of a proxy?
    How many links in tier 1 for a medium competition keyword?
    How do I check deindexed sites using Scrapebox? Do you have video tuts?

    I’d like to have your list but can’t download it even after tweeting it twice. How do I download your list? Are you going to email it?


    • roman
      February 12th, 2013 at 5:07 am

      Would be interesting to know that – or does it come in a later video?

  37. Diaz


    We cannot download the list after liking on Facebook, why?


    • Matthew Woodward
      December 31st, 2012 at 1:37 pm


      Sorry had issues with the script it is fixed now :)

      • Gordon McLennan
        January 1st, 2013 at 1:09 pm

        Still no links for me either Matthew….

        • Matthew Woodward
          January 2nd, 2013 at 2:45 pm


          Just retested and it's definitely working. Please clear your caches; you will also need to reshare for it to work. If you completed the share when it was broken then the links won't appear.


  38. alex

    I tweeted the link but I cannot find the Merge Lists & Footprints links

    • Matthew Woodward
      January 2nd, 2013 at 2:45 pm


      Just retested and it's definitely working. Please clear your caches; you will also need to reshare for it to work. If you completed the share when it was broken then the links won't appear.


  39. Tom

    Are there still script issues with the downloads? After “like” I see no change?

    Really great stuff by the way! I'd been struggling with UD for several days until I stumbled upon your videos.

    • Matthew Woodward
      January 2nd, 2013 at 2:46 pm


      Just retested and it's definitely working. Please clear your caches; you will also need to reshare for it to work. If you completed the share when it was broken then the links won't appear.


  40. maggie

    This design is spectacular! You certainly know how to keep a reader amused. Between your wit and your videos, I was almost moved to start my own blog (well, almost… HaHa!) Excellent job. I really enjoyed what you had to say, and more than that, how you presented it. Too cool!

    • Matthew Woodward
      January 2nd, 2013 at 5:19 pm

      Thanks very much glad you enjoyed them :)

      What's stopping you from starting your own blog!

  41. Al

    Hi Matthew, great videos – really straightforward.
    Tweeted (3x) to get access to the Personal Targets, Merge Lists, Footprints & Videos
    but I'm not seeing a place to download.
    Guessing I’m missing something obvious…
    Any help would be greatly appreciated.

    • Matthew Woodward
      January 2nd, 2013 at 5:22 pm

      Apologies for the problems, I really buggered up the script during speed optimisation.

      I have sent you an email with the direct link!

      • Al
        January 2nd, 2013 at 9:10 pm

        Got 'em and just finished the vids – really awesome – cleared up a ton.
        One question – I think I missed something obvious or had a brain freeze…
        Why the 3 articles?
        Is each for a different tier?

        Sidenote: I’m a bit paranoid about swiping images, but was thinking one could just take the same image file, make multiple copies of it, then add it to Name Mangler (free, I think) software that will bulk change the file names to your keywords but retain the original extension if you choose, then jump back to what you’ve laid out.
        2 cents (pence) from across the pond I suppose…

        Anyway, thanks so much!

        • Matthew Woodward
          January 3rd, 2013 at 8:42 am

          Sorry the 3 articles are for the various tasks, typically I tend to use 1 article for the web 2.0s, 1 article for the Wikis then the other article for everything else but mix it up a bit as you go :)

          I wouldn’t use the same image with a different file name, given you can search Google by images and not just words I’m guessing they can detect duplicate images easier than they can detect duplicate text based content.

          Cheers man!

  42. Vance

    Hey Matty. Great tutorials mate. Just a quick question:

    I’m using The Best Spinner, spinning 5 paragraphs and 5 sentences and getting over 90% unique (without photos/vids) when doing a publish and compare with 50 articles. Is there any need to do a word spin if I’m getting that high a uniqueness rate with paragraph and sentence spinning already?
    I won’t be using any spun content on my 1st tier or connected to my money site.

    • Matthew Woodward
      January 3rd, 2013 at 9:02 am

      Hi Vance,

      Personally I would go the whole hog – even if I was only going to use it those 50 times I would still go through and do the words. While that score is an indicator of uniqueness, it isn’t a stretch to say that Google may pick up on duplicate sentences across 50 articles.

      Remember you should be building out your campaigns for the future, not just what works in the here and now!
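      Matthew's concern about duplicate sentences surviving a spin can be sanity-checked programmatically. A rough sketch (my own illustration, not part of his workflow): compare two spun outputs with a word-shingle Jaccard score, where a high score means many identical word sequences remain.

```python
# Rough uniqueness check between two spun articles using 5-word
# shingle (n-gram) overlap; a high Jaccard score means many
# identical word sequences survived the spin.
def shingles(text, n=5):
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def overlap(a, b, n=5):
    sa, sb = shingles(a, n), shingles(b, n)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

# Two hypothetical spins of the same sentence:
spin1 = "the quick brown fox jumps over the lazy dog near the river bank"
spin2 = "a fast brown fox leaps over the lazy dog near the river bank"
score = overlap(spin1, spin2)
```

      A score close to 1.0 would mean the spin is barely changing anything; word-level spinning pushes it down further than sentence spinning alone.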

  43. Marc-André Larivière

    Hi Matthew,

    First of all, nice videos! One of my resolutions for 2013 is to get more involved in IM and SEO in general. I found your blog yesterday through BHW and THANKS for all the information. I really enjoy your transparency.

    Secondly, like other people, I have shared your post on Twitter/Facebook and got no link…

    Thirdly, I wish you a happy new year and all the best!

    • Matthew Woodward
      January 4th, 2013 at 2:08 pm

      Nice to see a few more BHW’ers over here :)

      Glad you like the blog; let me apologise for the problems with the share script. It works fine when I'm logged in as admin – my host is looking into it now!

      I have emailed you the resources in the meantime :)

      Good luck with your ventures throughout the new year!

  44. Davey

    Matthew, great tutorial series (found it on BHW)! I’m on my second time through. I’ve got at least 3 large projects I will be applying these techniques to. Now for the cash outlay to purchase the rest of the tools I don’t already have… I’ll be purchasing through your links.

    How are you viewing social signals in the mix these days? Seems the Big G is emphasizing them more and more.

    I also tweeted your video out but did not see a download button.


    • Matthew Woodward
      January 6th, 2013 at 1:13 pm

      That's very kind of you, thank you, and I’m glad you’ve enjoyed the series!

      Social signals are actually covered in video 6 :) They are supplementary tactics that you can use to bolster your efforts but I do think the future of link building will be social based. Essentially a ‘link’ is one person saying they approve of something else which is exactly what social sharing is by definition.

      So I think we will be moving to a more socially driven search in the coming year or two. I’ve been giving my key tier 1 properties social signals for a while now in anticipation of that shift.

      The script is now fixed but I emailed you the resources anyway :)

  45. royalmice

    The 7daysale coupon does not work for SEOgenerals anymore – do you maybe have another coupon code?


    • Matthew Woodward
      January 7th, 2013 at 8:33 am

      Ahhh yes, they have actually upgraded the site since then – they now offer different levels of writing, but at the same time reduced their prices to match what the coupon used to give ^^

  46. Will McDonald

    Thank you! You must have spent so many hours fine tuning these tiered link campaigns, and then you just share them step by step… who does that?

    I do have a quick question, and not to sound greedy, but the targets, footprints, and merge lists – are they still relevant? Especially after your videos, with more people using them?

    • Matthew Woodward
      January 22nd, 2013 at 9:01 am


      Well my link building process has evolved over the years. I usually have 1 main process along with 3-4 variations which I use to refine the main process further; it’s constantly evolving :)

      You can use the footprints, merge list and your own list of keywords to create your own target list. But I guarantee my personal target list has been used fewer times than any that get posted on forums etc.

  47. Ayumi


    I used your two files and a list of keywords I made up. I got a list of 300k keywords, Scrapebox says. I clicked harvest like in your video, removed duplicate URLs and saved as a text file… but looking at the URL list, they look very poor and low PR. Would it be better just to download a high PR 4-8 list of article directories, web 2.0s, bookmarking sites etc., or have I missed the point? Looking forward to your answer. Love the series – I think you need to do one on the new link wheel model as well… Thanks Matt, keep up the good work!

    • Matthew Woodward
      January 24th, 2013 at 9:05 am


      Just import the list into UD and see what you get from there – you will always get more lower PR sites than higher PR sites simply because there are fewer high PR sites in the world.

      It would not be better just to download an existing list that has been downloaded and used by every man and his dog – you could do that AS WELL AS scraping your own target list, but not instead of.

      I’ll be doing some things with the new LWB don’t worry :) Can’t say what yet though ;)
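      The "scrape your own list" step boils down to combining footprints with keywords. A minimal sketch of what the Scrapebox merge step produces (the footprints and keywords here are hypothetical examples, not Matthew's actual lists):

```python
# What the Scrapebox "merge" step does, in plain Python: every
# footprint is combined with every keyword to build the search
# queries that then get harvested.
from itertools import product

footprints = ['"powered by wordpress"', 'inurl:wiki', '"post a comment"']
keywords = ["dog training", "puppy crate training"]

queries = [f"{fp} {kw}" for fp, kw in product(footprints, keywords)]
# 3 footprints x 2 keywords -> 6 queries
```

      This is why your scraped list differs from anyone else's: a different keyword list produces a completely different query set, and therefore different targets.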

  48. Jacob King

    Hey Matthew,

    Just a thought.

    Couldn’t you just check the Google index status in Scrapebox before adding to Ultimate Demon?

    And save the step of exporting from UD and using mass deletion…

    • Matthew Woodward
      January 29th, 2013 at 11:36 am


      Really the choice is: do an index check on your entire scrape, which can often run into the millions of URLs, or do an index check on just the URLs Ultimate Demon can recognise & post to.

      One method is more time/resource efficient than the other :P

  49. Matthew Woodward

    I can’t replicate that here – they are just YouTube videos =\

  50. Lex

    Hey Matt,
    Just a heads up – I don’t know why this is happening, but Scrapebox is misreporting indexed links as not indexed. I would manually check some of the output links reported as de-indexed before you remove them from your database.

    I’ve got a ticket in with SB support now, but searches online show people complaining about this as far back as 2011.

    • Matthew Woodward
      February 5th, 2013 at 9:55 am


      This could be happening for a number of reasons – what happens if you take a sample of 15 URLs and do the index check without proxies?

      • Lex
        February 7th, 2013 at 4:23 am

        Hi Matt,

        The outcome is identical without proxies.

        It turns out that Scrapebox uses the ‘info:’ operator instead of the ‘site:’ operator, and not all indexed sites return a result using info:.

        They say ‘info:’ has always been a better indicator than ‘site:’, but the fact that ‘info:’ ignores indexed sites that ‘site:’ clearly shows as Google indexed suggests otherwise to me.

        Am I missing something?

        • Matthew Woodward
          February 7th, 2013 at 9:04 am

          Well unless you want to go and check each one manually that is pretty much your only option.

          I know that https://www.matthewwoodward.co.uk/experiments/backlink-checkers-compared-ahrefs-majestic-seo-seomoz-raven-tools-seo-spyglass/ uses its own unique way to check indexed status which seems pretty reliable, but I’m not allowed to say how they do that :(

          I think when you're checking this many links en masse you're going to get issues like this.

          The only thing you could do in Scrapebox is use the site: command itself to return the list of URLs, then do a lookup in Excel to determine indexed status.
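          The Excel lookup Matthew describes is just a set-membership check: a harvested URL is marked indexed if it appears in the results returned by the site: queries. A sketch (the domain names are made up for illustration):

```python
# Mark each harvested URL as indexed or not by checking it against
# the set of URLs the site: queries returned - the programmatic
# equivalent of a VLOOKUP in Excel.
harvested = [
    "http://example-blog.com/post-1",
    "http://another-site.org/page",
    "http://third-site.net/article",
]
indexed_results = {
    "http://example-blog.com/post-1",
    "http://third-site.net/article",
}

status = {url: (url in indexed_results) for url in harvested}
not_indexed = [url for url, ok in status.items() if not ok]
```

          Using a set for the lookup side keeps this fast even across hundreds of thousands of URLs.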

  51. Yohan

    Hi Matthew,
    What proxy package do you use / would you recommend to use in scrapebox for creating my target lists?
    Thanks so much!

    • Matthew Woodward
      February 7th, 2013 at 8:48 pm


      Either 10 or 20 semi dedicated proxies would do you fine.

      • Yohan
        February 7th, 2013 at 9:33 pm

        Awesome, thanks so much!

        • Matthew Woodward
          February 7th, 2013 at 11:37 pm

          No worries

  52. Tom

    Hi All,

    Has anyone had any problems buying Ultimate Demon? I enter my PayPal details and a screen pops up saying please return to merchant – “Invalid Order Action” is what is flagged.

    Anyone else having this problem?


    • Matthew Woodward
      February 11th, 2013 at 9:31 am


      Try a different payment method? Sounds like a Paypal issue.

      • Tom
        February 16th, 2013 at 11:10 pm

        Tried repeatedly on PayPal, still not working, I don’t have a credit card either, only debit.

        Frustrating to say the least.

        • Tom
          February 16th, 2013 at 11:13 pm

          Solved, had a proxy left on and PayPal didn’t like the IP.

          • Matthew Woodward
            February 18th, 2013 at 11:09 am

            Wahahah, worn that t-shirt a few times myself =D

        • Matthew Woodward
          February 18th, 2013 at 11:09 am

          You can use a debit card on the normal checkout

  53. Drew

    Matthew, you are awesome. A lot of what you offer in the way of tutorials is better than paid memberships I have participated in. It’s amazing! I know you plug a few products, but I believe your promotion to be of a helping nature, well founded in your experience. I was on the fence about UD but I know I will now purchase it, as well as ScrapeBox, through your link. I just hope there are some tutorials offered by the developers of those programs to help tie everything together. Will be spending a lot of time with you here my friend. I may need to watch some of these a couple of times for it to completely register, lol. Appreciate it more than you know. THANKS!!!!

    • Matthew Woodward
      February 21st, 2013 at 1:31 pm


      Thank you very much, glad you're enjoying the tutorials. I had to make the decision of releasing them as a product/WSO or just giving them away for free – I actually felt like putting a price on them and associating them with WSOs would devalue them, so I just gave them away for free =D

      My tutorials will tie it all together for you, no worries – and thank you for buying through my links, much appreciated!

  54. Tom

    Hi Matt,

    Using Scrapebox for the first time following your video and I have a small problem.

    I produced a list of keywords in Scrapebox as suggested in video 2, then put this into Scrapebox again to produce a bigger list, which came back with over 1,000,000. However, as I am now harvesting, Scrapebox has been running for 2 days and is still going. Should I abort and start this again using the first list of keywords Scrapebox generated?

    • Tom
      February 22nd, 2013 at 12:20 am

      Update: when I mentioned 1,000,000 above, this was the total queries after the merge files were added as indicated in the video. Perhaps I should have used the original list of 30 scraped keywords rather than the 163 that came back after scraping the list a second time?

      Your advice welcome…

      • Matthew Woodward
        February 22nd, 2013 at 10:00 am

        Hi Tom,

        To be honest the more you can scrape the better

    • Matthew Woodward
      February 22nd, 2013 at 10:00 am

      Hi Tom,

      My recommendation is once it harvests 500,000 URLs or so, stop the harvest. In the window that pops up in the bottom right corner you can choose to keep the uncompleted keywords.

      Then export that list and kick off the harvest again.

      Hope that helps
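      The same stop-and-resume idea can be applied up front by splitting the merged query list into batches before harvesting, so no single run grows unmanageable. A sketch of the batching step (the batch size of 4 here is arbitrary; in practice you'd pick something like 500,000):

```python
# Split a long query list into fixed-size batches so each harvest
# run stays manageable: harvest a batch, export the results, then
# kick off the next batch.
def batches(items, size):
    return [items[i:i + size] for i in range(0, len(items), size)]

queries = [f"keyword {i}" for i in range(10)]
chunks = batches(queries, 4)
# -> 3 batches of sizes 4, 4 and 2
```

      Each batch then corresponds to one harvest-export cycle.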

  55. Sam

    Hi Matt,

    I’ve been looking for the link to download the sample files (Personal Targets, Merge Lists, Footprints & Videos), but I can’t seem to find it anywhere. Am I missing something?


    • Matthew Woodward
      February 25th, 2013 at 4:50 pm


      Yes, you're not using your eyes ^^

  56. Tinozito

    Twitter @tinozito

    Hi Matt, I've been retweeting your vids, I really appreciate the priceless content!!

    My question is: I harvested a list of 1,000,000+ URLs with Scrapebox using your footprint and merge list and my keyword; after removing duplicates I still end up with a big list of ~200,000 URLs…

    Then in UD, I am supposed to add these URLs in the site detector like you mention at the end of the video (UD tells me it is recommended to keep it below 10,000 items)… so I tried with 10,000 items, it took 45 minutes and stopped at 99% with only 10-15 successes detected…

    So what do you recommend – should I paste in my huge list of 200,000 URLs to get the most of it?

    • Matthew Woodward
      March 7th, 2013 at 12:27 pm


      Yes, ignore what UD says and just whack the whole list in. Then run the list again with the 'site is in a subfolder' tickbox ticked :)

      • tinozito
        March 14th, 2013 at 8:47 pm

        Hi Matt,

        Thanks for your answer, I really appreciate that we can actually communicate with you…

        I scraped probably 2 million URLs using your footprints and merge list. I had 2 lists of 300k URLs to check with the UD site detector, and I checked those lists with the tickbox as well… Then at the end I got a target list of only 1,800 sites!!! (most of them FORUMS)

        So I ran an index check in Scrapebox – only 3 are indexed…

        So is something wrong with the scraping part…?

        Can I use your target list? (But what would be the point of scraping your own list of targets then…) I'm kinda frustrated right now :/

        Thanks matt

        • Matthew Woodward
          March 15th, 2013 at 11:48 am


          No, sometimes you’ll get lots of new sites, sometimes not many and sometimes none – that's just how it is. Scraping your site list and refining it is a continuous process that should be repeated monthly at a minimum.

  57. Ian

    Thank you Matt for these videos. I am on the third round of watching them and each time I seem to pick up another point that I need to address. :)

    I am now ready to get going with my first campaign. I have downloaded your target list but am unsure which sections to upload them into within UD. You have broken them down into what appears to be a sub-categorisation of the way UD splits them up.

    Please can you indicate which type of site each of your folders relate to.


    • Matthew Woodward
      March 13th, 2013 at 11:13 am

      Hi Ian,

      Hahaha yeah it's a lot to take in at once :)

      Just paste the whole list in, no need to split it up – I just did that for reference purposes!

  58. Ayte

    Matt, can you please make a list scraping series? I tried to scrape those footprints. I trimmed URLs to root, removed duplicate URLs and entered them into Ultimate Demon. I got fewer than 500 sites (5k unique domains) to go, most with very low PR.

    • Matthew Woodward
      March 13th, 2013 at 11:17 am


      Yeah that sounds about right to me – just merge in more words and keep scraping. It's a never-ending process.

      I don't think it warrants a dedicated tutorial as it's already covered in detail, but I suppose it is focused on the one task of scraping.

  59. Steven Hughes

    Hey Matthew – Thanks for the tweet. I can appreciate the time and effort put into the videos.

    This is why most people are not fond of SEO. It’s centered around beating the system, especially these advanced methods. It’s centered around beating Google.

    The problem is Google is getting smart, and their search technology is evolving. There are some clues out there that links are becoming less important to SERPs. I wonder if in the not too distant future all of this work that you suggest will be worthless.

    Spinning, scraping, and link building are not foreign to Google. They’ve learned that great content is not at the end of this rainbow. These techniques are tried and true, and very powerful over time. However, anyone using these techniques needs to be careful, and realize that the roof can cave in at any moment.

    • Matthew Woodward
      March 14th, 2013 at 9:29 am


      No worries :)

      Yes, Google is at long last getting smarter – a lot of the updates in the past few years haven't really changed anything (use HQ content, diversify keywords: standard advice for years) but with the introduction of AuthorRank and social signals this year it will be an interesting time :)

      But that is why the tiered system is so good: as long as you ensure you have the ability to remove any tier 1 links, you can instantly disconnect your site from any links Google may decide are troublesome in the future that aren't now.

      Its all about risk management :)

  60. hydride

    Another user has already asked this question, but it was never answered, so what are your thoughts about using VPN over proxy?

    • Matthew Woodward
      March 14th, 2013 at 9:38 am


      Get proxies, because the software you are using will be able to rotate IP addresses as and when it needs to, which you can't do with a VPN (it's either on a timer or manual).
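      The rotation Matthew describes is simple round-robin over a proxy pool: the tool switches to the next proxy on each request rather than waiting for a VPN timer. A minimal sketch (the proxy addresses are made up):

```python
# Round-robin proxy rotation: each request takes the next proxy in
# the pool, wrapping back to the first after the last one. This is
# what link-building tools do per request, and a VPN cannot.
from itertools import cycle

proxies = ["10.0.0.1:8080", "10.0.0.2:8080", "10.0.0.3:8080"]
rotation = cycle(proxies)

def next_proxy():
    return next(rotation)

used = [next_proxy() for _ in range(5)]
# wraps around after the third request
```

      In a real tool each submission or scrape request would call `next_proxy()` before connecting.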

      • hydride
        March 14th, 2013 at 11:14 pm


  61. Arran

    Matt… I followed your videos through and now I have a two-week wait. So I've started on a second site… and followed videos 1-3 all the way through again… when I go to each project in UD, the sites it has in the database that I scraped for each project are exactly the same according to UD… is this right?

    I assumed with the different keywords in Scrapebox it would have found different sites?

    • Matthew Woodward
      March 14th, 2013 at 9:40 am


      The site list in UD is a ‘master site list’ that you use across all projects. You can add sites just to specific projects if you want, but that's not how I run my ship :)

      • Arran
        March 14th, 2013 at 12:20 pm

        Nice, thanks for the quick reply.

        Using this method, and I know it’s not easy to predict, but how long do you think it would be before a new site started ranking for easy and medium keywords?

        • Matthew Woodward
          March 15th, 2013 at 11:46 am


          I wouldn’t build links to a new site

          • Lucas
            November 25th, 2013 at 2:13 am

            Matthew, when you say you “wouldn’t build links to a new site”, do you mean to say that your tiered link building method should not be used on a newly created website/domain? Do we have to wait for a site to be of a certain age or have a certain amount of existing authority or traffic before we can use this method on it? Please elaborate as I would really like to know whether I can use this on a new website.

          • Matthew Woodward
            November 25th, 2013 at 8:44 am


            I focus on marketing in the early stages of a site rather than SEO.

  62. JJ

    I have some questions, things I need to clear up:

    – Am I supposed to rewrite 15 titles for all three articles so I have 45 titles in total?
    – And then spin all 3 articles using your Best Spinner tutorial?

    I also wanted to know: what are your opinions on article writing?

    • Matthew Woodward
      March 15th, 2013 at 11:46 am


      1) Yes
      2) Yes

      About article writing? Outsource it ^^
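      The 15 alternative titles per article are typically collapsed into one spintax string, which spinning tools then resolve to a random variant per submission. A minimal spintax builder and resolver sketch (my illustration of the general format, not any specific tool's implementation; the titles are hypothetical):

```python
# Build a spintax string from alternative titles and resolve it to
# one random variant, the way spinning tools do per submission.
import random
import re

def to_spintax(alternatives):
    return "{" + "|".join(alternatives) + "}"

def spin(text, rng=random):
    # Repeatedly resolve the innermost {a|b|c} group until no
    # spintax braces remain.
    pattern = re.compile(r"\{([^{}]*)\}")
    while True:
        m = pattern.search(text)
        if not m:
            return text
        choice = rng.choice(m.group(1).split("|"))
        text = text[:m.start()] + choice + text[m.end():]

titles = ["Dog Training Tips", "How To Train Your Dog", "Simple Dog Training"]
spintax = to_spintax(titles)
title = spin(spintax)  # one of the three titles
```

      The same format nests for sentence and word level spins, e.g. `{Train {your|the} dog|Dog training made easy}`.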

      • JJ
        March 16th, 2013 at 2:24 am

        I meant to say I wanted to know your opinion of iwriter.com and what you think about using it?

        • Matthew Woodward
          March 18th, 2013 at 9:15 am


          I haven’t used them sorry =\

        • hydride
          March 26th, 2013 at 9:14 pm

          I’ve used them. The only thing I like about iWriter is the fast turnaround. The quality is unique too, but don’t expect any life-changing content. I used them for quite a while but stopped because the quality isn’t what I wanted.

  63. Alex Pavlenko

    It seems that the link for Scrapebox does not work. I have tried for a second day to purchase the software but it would not load the page. Any ideas would be appreciated.

    • Matthew Woodward
      March 18th, 2013 at 9:13 am


      Sorry that is out of my control, give the SB team a shout and they’ll help you out!

  64. Skip

    Hi Matthew,

    Thanks for all of your hard work with the videos and the transcripts are a NICE TOUCH.

    Can you tell me where the link is to your ping server list or,

    “Secondly we need to setup the ping servers, again paste your list into here. If you don’t have a list you can download mine underneath this video.”

    can you provide instructions how to get this list?

    Many thanks,

    • Matthew Woodward
      March 19th, 2013 at 9:13 am

      Hi Skip,

      Glad you like the transcripts, but I despise creating them with every brain cell I possess ^^

      What you're looking for is underneath the video – you're not looking hard enough :)

  65. Daz

    Hi Matt

    In your video you said write 3 articles. Do you mean write 3 articles then 2 alternative sentences for each article,

    or do you mean write 1 article and 2 alternative sentences for each sentence, which would basically make 3 articles if they were not spun together?


    • Matthew Woodward
      April 6th, 2013 at 4:29 pm


      3 separate articles and each sentence in each article has 2 alternatives :)

      Check out the advanced spinning tutorial for much more detail

  66. Gideon

    Hey Matthew,

    Thanks for all the awesome tutorials. I tweeted and liked the video, but no links to download your lists. I scanned this page up and down to make sure I didn’t miss it, but I still don’t see it :(
    Could you help me out?

    • Matthew Woodward
      April 6th, 2013 at 4:33 pm

      Hi Gideon,

      I’m away from home until the 8th at the moment but drop me an email with what you need and I’ll send them to you directly when I return.

      Not sure why it didn't work – it's on the list to test when I get back, thanks.

  67. Diego

    Hi Matthew, Thank you so much for the tutorials, they are really a gold mine.

    I have some questions about the tutorial:

    1 – I saw in a comment that you don't recommend this approach for a new site. When is the right time to use this technique? What do you have to do before?

    2 – Could this help to recover a site with a Penguin penalty?

    3 – I run some sites in Spanish. Do you know if writing the articles in English and changing the keywords for the Spanish ones I want to rank for would work?


  68. Jason

    Hi Matthew, I love the tutorials. I am only starting out trying SEO – I usually outsource it to someone else. I am going to try to put your tutorials into action with a site I am working on at the minute. It is an OpenCart site. I have one very basic question before I start:
    should I submit the site to Google through Webmaster Tools, or just jump straight into what you are doing in the tutorials?

    • Matthew Woodward
      April 8th, 2013 at 3:50 pm


      Yes you should!

  69. Geo

    Hi Matt,

    YOU rock.
    I have many questions, but here is one:

    Preparing for the campaign you said I need 3 articles, and then I must write 2 alternatives for each sentence of each of the 3 articles… but why not just buy 9 unique articles? It's faster. Or maybe I'm not getting this right…?


  70. CBETZ

    Hey Matt, I’m having a hell of a time getting this list of targets imported into my Ultimate Demon software. On the first run only 14 stuck, and 23,000 could not be downloaded or were not found. Do you have any tips here? Am I doing something wrong?



    • Matthew Woodward
      April 9th, 2013 at 8:20 am


      Re-run it again with the 'my site is stored in a subfolder' box ticked and you're done! I assume you have selected all from the drop down?

  71. Andy

    For the social bookmark, web directory and video titles, do we need 3 titles and 3 descriptions for each, or one for each? Also, can you give an example of what makes a good title and description for each one?

    • Matthew Woodward
      April 12th, 2013 at 9:32 am


      Just one set will do, but don’t be scared to do more – the more the better!

      Just look at some bookmarking sites

  72. Tj

    Wow, thanks!

    I've been looking for this info for a long while. I didn't know for sure what a tiered link was, and besides that I'd lost track of what to do with all those SEO tools.

    Thanks for the great quality content you made!

    • Matthew Woodward
      April 13th, 2013 at 8:24 pm

      No worries, glad it has helped you out!

  73. Wildcat

    I’ve shared the video but I don’t see the link to download the merge list etc?

    • Matthew Woodward
      April 26th, 2013 at 7:32 am


      Just tried it with Twitter & Facebook – the share buttons get replaced with the links.

      • Wildcat
        April 26th, 2013 at 10:12 am

        Thanks Matthew,

        It was an extension in Chrome that was blocking this..

        • Matthew Woodward
          April 26th, 2013 at 10:42 am


          Can you tell me which extension please?

          • Wildcat
            April 26th, 2013 at 11:01 am


            Sure no problem. It was “Do Not Track Me” which is produced by Abine and they also have an extension called “Mask Me”. When I disabled “Do not track me” the links worked.

            Hope this helps if anyone else runs into the same issue.

            Thanks again for a very informative video.

          • Matthew Woodward
            April 26th, 2013 at 12:15 pm


            Ahhh yes that blocks cookies and the plugin uses cookies to know if you shared or not :)

  74. Walter

    Hey Matt,

    Do you know if putting the keyword in the title of the backlink source (creating a spun article with target keyword in the title) helps with ranking for that keyword?


    • Matthew Woodward
      May 13th, 2013 at 7:18 am


      It will help a little, yes, but more so in Bing.

  75. Matt

    Hey Matt

    Great tutorials by the way. Just a quick question – The Best Spinner site seems to be down at the moment. Are there any recommended alternatives I can spin the articles with?



  76. David Berger

    Can’t find the merge files anywhere?
    Please help.
    Great vids!!!

    • Matthew Woodward
      May 24th, 2013 at 8:50 am

      Have you tried the age-old technology of reading?

  77. Chris

    Great tutorial! Very well put together, and I would agree with the previous comments that there isn’t anyone, anywhere on the web, publishing content as invaluable as this. My question for you Matthew is …

    When uploading your target sites from Scrapebox into UD, what confirms that the sites being uploaded are valid web 2.0s, article directories, social bookmarks, etc.? And if UD has no problem sorting them out, should we use semantically related keywords in Scrapebox to scrape for additional target sites that weren’t picked up by the original set of keywords?

    • Matthew Woodward
      May 24th, 2013 at 8:43 am

      Hi Chris,

      Thanks very much!

      The UD site detector takes care of checking they are valid and what type they are.

      Yes you can use related keywords if you wish! Never stop scraping, always more to be found!

  78. Moody

    Wow, what a site. This video tutorial is superb. I know you said it took you 8 hours to create it, but let me tell you that you have easily saved me a month or more. I have been spending the week hunting through poorly written, garbage descriptions and advertisements to figure out some Multi-Tier marketing best practices.

    Your site rocks, your videos rock, and I will be viewing all of them tonight.

    • Matthew Woodward
      May 26th, 2013 at 10:54 am

      Thanks – glad it has helped you out!

  79. Moody

    Hi, like I said earlier, fantastic resource. I am excited about using scrapebox with the keywords and footprints you created, only I cannot locate them. It said on this page that if you tweet, then you will get those resources. I did that but for some reason I am unable to download these. Please help.

    • Matthew Woodward
      May 26th, 2013 at 10:55 am

      Drop me an email and I’ll get them to you :)

  80. Nick Thomson

    Hey Matthew, killer vids so far my dude, I’m on #3 now. Quick question.

    What determines the amount of private proxies I should need? I have the general concept of them, and I read that a 20,000 URL blast a day with 10 would be good, but can you give me the rundown on what determines the amount I would need?

    Thanks again man!

    • Matthew Woodward
      June 1st, 2013 at 3:29 pm


      Just get 10 semi dedicated from BuyProxies and you’ll be fine! When blasting comments, scrape a bunch of public ones as well.

  81. David


    How to download your personal target lists? I can’t find any download link on this page.



    • Matthew Woodward
      June 3rd, 2013 at 6:13 pm

      Read harder :)

  82. SylvainZ

    Awesome tutorial Matthew,

    I’m working on my backlink strategy and your tutorial helps me a lot.
    But I’m French and I would like to know if writing a post on sites like EzineArticles in English has the same effect, or do I need to find some other French sites?
    I can find them but they have less authority.


    • Matthew Woodward
      June 10th, 2013 at 11:15 am


      I don’t have much hands-on experience here, but from what I can gather it doesn’t make much difference now. I wouldn’t be surprised if it did in the future though.

  83. Tim

    Hi Matthew, I am new to this so please bear with me. Whilst going through your tiered link videos I lost track a little. What threw me was how to do tier 1 in SEnuke, because the SEnuke video is an addition and not core to the tiered linking videos. Once you have 3 unique articles – is it these that I should spin by following your expert spinning video and have them all linking direct to the money site? Is that safe in your opinion?
    And then link wheel tiers 2 and 3 as per your video series?

    • Matthew Woodward
      June 13th, 2013 at 9:03 am

      Yes that is right – just substitute video 3 in the series with the nuke tutorial – but then see the advanced spinning tutorial for a more detailed look at how to prepare the content as seen in video 2.

  84. JJ

    When you are setting up your Tier 1 campaign, is there a maximum number of links you aim for overall for your money site? Or is the number of Tier 1 links irrelevant?

    • Matthew Woodward
      June 18th, 2013 at 8:44 am


      This is covered in video 3 :)

  85. Nicolas

    Hi Matthew,
    In video 1 we get the huge list of sites from Scrapebox. Is it this list you paste into UD in video 2? I’m not so sure, because those won’t be tier 1 quality links – or did I miss a beat?

    • Matthew Woodward
      June 24th, 2013 at 9:21 am


      You mean when we are scraping our target list to import into UD?

  86. Chua

    Hi Matthew,
    May I know how many proxies I need for tiered linking? There are a few options to pick from on BuyProxies.org.

    • Matthew Woodward
      June 24th, 2013 at 9:19 am


      10 semi-dedicated ones will do you, sir!

  87. Kiril

    I was just wondering: I prepared my Tier 1 with SEnuke XCr social network links, some PDFs and some high PR wikis, then created a campaign in GSA Search Engine Ranker later that night. 7 hours later I have submitted around 400 Tier 2 links with GSA, but I only have around 20 LIVE links at the moment – is that normal?

    What is your experience ?

    Thanks and Regards

    • Matthew Woodward
      June 24th, 2013 at 9:19 am


      That’s normal, it only verifies the links every x hours or so.

      You shouldn’t be building tier 1 and 2 in the same day though.

  88. at

    I am extremely impressed with your writing skills and also with the layout
    on your weblog. Is this a paid theme or did you customize it yourself?

    Either way keep up the nice quality writing, it’s rare to see a great blog like this one today.

    • Matthew Woodward
      June 27th, 2013 at 11:42 am

      Thanks =D

  89. Keith

    Hey Matt,

    Thanks for all of the awesome videos. Extreme newbie here, just starting out. One step is confusing me: after you scrape your URLs and export them as text, you import them into UD. You then export them again back to Scrapebox to check the index status of the URLs. Why would you not just check the index status of the URLs straight away after you’ve scraped them in Scrapebox?

    • Matthew Woodward
      July 12th, 2013 at 9:53 am


      Because when you do the first load into UD, only a very small percentage of them will be added to the main UD database. So instead of doing an index check for ~50,000 URLs, you only need to check a few hundred.
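The trimming idea Matthew describes can be sketched in a few lines of Python – keep one URL per domain so only that shortlist needs an index check. All URLs below are illustrative.

```python
from urllib.parse import urlparse

# Keep the first URL seen for each domain, so the index check runs
# over a short list of domains rather than every harvested URL.
harvested = [
    "http://example.com/page1",
    "http://example.com/page2",
    "http://other.org/submit",
]
seen, shortlist = set(), []
for url in harvested:
    domain = urlparse(url).netloc
    if domain not in seen:
        seen.add(domain)
        shortlist.append(url)

print(shortlist)  # ['http://example.com/page1', 'http://other.org/submit']
```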

  90. Flash Memory Usb

    Hi there! I could have sworn I’ve been to this website before, but after browsing through some of the posts I realized it’s new to me.
    Anyway, I’m definitely happy I found it and I’ll be bookmarking and checking back often!

    • Matthew Woodward
      July 12th, 2013 at 9:48 am

      Thanks – let me know if you have any questions!

  91. Gerry D

    Hey Matthew, Gerry here from Ireland. I’ve been following you for a while now, and I can tell from your writing and your tutorials that you have vast experience and a no-BS approach. Is there another way to contact you other than here? I gave your vids a share as I thought it was the least I could do. Also, as these were made quite a while back, is this still a viable method now?

    • Matthew Woodward
      July 12th, 2013 at 9:46 am

      Hi Gerry!

      Thanks for the share – just hit up the contact page :)

  92. hydride

    What about relevancy of the scraped sites? The PRs? What if someone scrapes a whole bunch of negative PR sites?

    • Matthew Woodward
      July 15th, 2013 at 7:36 am


      I take care of this in the tutorials :)

  93. Darcy

    Hey Matthew!

    I can’t wait to put all of this into action and see if I can make it work for me!

    I wanted to know if you could tell us how you add those cool little ‘fly in’ effects on your video?! I love that – is it a part of the editing software you use?

    Thanks :)

    • Matthew Woodward
      July 15th, 2013 at 7:34 am


      Good luck :)

      I use Adobe After Effects for those!

  94. Jay

    Hi Matthew,

    How can I get the resource material (personal targets, merge list, footprints)? I subscribed and FB liked (I don’t use Google Plus).

    Thanks for the great tutorial!

    • Matthew Woodward
      July 22nd, 2013 at 8:28 am


      It’s working here, not sure what went wrong – sent you an email with them!

  95. Daniela

    Hi Matthew:

    I just love your tutorial videos and all of your blog in general. I still have some noobie doubts though, sorry if you already answered them:

    1. When you say “scrape your own target lists for automated software”, what makes the difference between Pligg, Jcow and all of those platforms? Aren’t many people using the same footprints? So how could I avoid adding my content and links on platforms that are not so “known” (spammed)? :)

    2. What is the best way to find new platforms for our tiered link building? I mean, Pligg, Jcow, Drupal etc. are famous platforms, so is it better to post on not-so-popular platforms to avoid posting where everybody else is doing it?

    Both questions are very similar, but I’m about to start my tiered link building and I don’t want to do it the wrong way.

    Thx for all the incredible info that you share with us. God bless you

    • Matthew Woodward
      July 22nd, 2013 at 7:56 am


      1) You would need something custom to post to platforms that the software doesn’t support natively

      2) Just browse the internet, get involved with things as you normally would, and as you’re surfing keep an eye out for possible link opps.

  96. geekmom

    I absolutely love these videos and once I finish the set, I will be writing something up on my site about them.

    • Matthew Woodward
      July 24th, 2013 at 3:01 pm

      That would be very kind of you thank you!

  97. Jed Hanlin

    Thank you for spending the time to help others in this crazy confusing business. I had a question – hope I didn’t miss it somewhere, but here goes.
    So when building the initial tier 1 level, should I direct that towards http://mysite.c0m, or should I build out a separate set of tier 1s for every page, i.e. mysite.com/how-to-skin-a-cat etc., or should I mix them all up? Hope that makes sense.

    • Matthew Woodward
      July 31st, 2013 at 1:08 pm


      Each run of the campaign can target 1 URL, so you can choose whether that URL is your homepage, an inner page or whatever :)

      • Ray
        August 23rd, 2013 at 3:41 pm

        Would each of the campaigns need 3 more articles with their various titles, descriptions, etc., or can you use the original 3 articles? If I had a website with 5 URLs I want to target, would I then need 15 articles or just the 3 original ones? Thanks.

        • Matthew Woodward
          September 3rd, 2013 at 9:59 pm

          Yup, each campaign needs a new set of everything

  98. Martin Harris

    Hi Matt,

    I’m an agency SEO and the majority of the work I do for my clients is white and grey hat. I’ve decided to buy Ultimate Demon, not for any of my current clients but for a side project I’m doing.

    I haven’t done any major black hat campaigns before, so I’m actually looking forward to it!

    However I’ve had a problem buying Ultimate Demon through MyCommerce. Long story short, they had an issue processing the order and I’ve had to pay twice! I’ve supposedly got a refund/cancelled transaction, but if you get double commission you owe me a pint! Haha

    • Matthew Woodward
      August 13th, 2013 at 4:02 pm


      Hahaha I will let you know – thanks very much!

      I encourage you to build a site and try to get it penalised, you’ll learn a whole bunch!

  99. Holger

    Hi Matt,

    it’s me once again. I have got a few questions about web 2.0 sites.

    1. In this tutorial you built them with UD or Licorne, but on the other hand you stress they should be hand built. What is the best way in your opinion then?

    2. Granted you have built a few web 2.0 sites manually, can you easily add those URLs to a T1 link campaign of the kind you set up in this tutorial, or do you have to create a different task/campaign for existing web 2.0 accounts? I recently bought Licorne and therefore I’m mainly interested in how to manage this in Licorne if possible.

    3. Referring a bit to question 2: can you generally use existing T1 accounts or sites (created by UD/Licorne or just manually) to create new content and backlinks on these properties with Licorne and UD, and build them out? This question may sound a little bit stupid, but I didn’t find this issue covered in the tutorial videos yet. You only recommend repeating the process and creating completely new T1 accounts (if I understood that right).

    Best regards,

    • Matthew Woodward
      August 13th, 2013 at 4:01 pm


      1) They are very different things. The ones that are built with UD are not comparable to the hand-built ones. What UD calls a web 2.0 site isn’t always a web 2.0 site with a dedicated subdomain like we create by hand.

      2) Yes just add them to the target list in GSA

      3) I always create new accounts on each run

      • Holger
        September 3rd, 2013 at 9:19 am

        To point 1):

        a) What about Licorne in this matter? Is Licorne able to create real web 2.0 sites as you described above (with a dedicated subdomain)?

        b) BTW, is UD able to build this kind of site too?

        c) And finally: could you please give us an example of the other kind of web 2.0 sites (the ones without a dedicated subdomain), just for a better understanding?



        • Matthew Woodward
          September 3rd, 2013 at 7:24 pm


          1) Yes

          2) Yes

          3) I don’t use them, but anything that gives you domain.com/username

  100. Neil

    Hey Matt, I could really do with your personal targets, merge lists, footprints & videos, but I don’t have a FB, Twitter or G+ account :( Can you sort me out please, as these downloads will be a great starter for me?

    Great vid series as well, very informative and well explained.


    • Matthew Woodward
      August 13th, 2013 at 3:57 pm

      You know it’s 2013, right?

      Drop me an email :P

  101. Ray

    Quick question re the sets. In your video, you showed the following requirements for 1 article:

    1 set of bookmark titles/descriptions
    1 set of web directory titles/descriptions
    1 set of related tags and keywords

    Do these translate to this:

    15 spun bookmark titles/descriptions
    15 web directory titles/descriptions
    15 sets of related tags and keywords

    Or to this:

    45 spun bookmark titles/descriptions
    45 web directory titles/descriptions
    45 sets of related tags and keywords


    • Matthew Woodward
      August 17th, 2013 at 8:41 am

      The requirements are 1 set per campaign. No translation required, just as it says.

      • Ray
        August 23rd, 2013 at 4:49 pm

        Thanks for the info. One last question: do I need to spin these titles and descriptions?

        • Matthew Woodward
          August 23rd, 2013 at 4:56 pm

          Yes indeedy :)

  102. Mridu

    Hey Matthew, I have downloaded all of your videos and they’re great. But I have one question: when I asked at Warrior Forum, people were saying that article submission direct to the money site is dead now. So can we use article submission in Tier 1? Or does it have a negative impact on rankings?

    • Matthew Woodward
      September 3rd, 2013 at 7:40 pm


      It is safe to use as long as you have a diverse profile of links

  103. Chris

    Hi Matthew, this is a fantastic tutorial series. Thank you so much.

    Quick request – about these things:

    1x set of bookmarks titles / descriptions
    1x set of web directory titles / descriptions
    1x video with titles / descriptions

    I get what you mean by a “set” and that we need to “spin” all the titles and descriptions.

    But I want to be sure I know what those things are supposed to look like. Could you please give me an example of each of those three types of set that you would consider to be of suitable quality?

    I have tried looking at these on various other websites, but they seem to look different on some sites to others, and I would like examples of ones that are done like you do in this process so that I can feel certain I’m doing them right.

    I would really appreciate you giving me an example of these three things, or just posting links to three pages that have ones that are done right.

    Thank you very much for your time.

    • Matthew Woodward
      September 3rd, 2013 at 7:38 pm


      Just do a manual submission to a social bookmark site and write a title/description like you normally would to get an idea of what it should look like. Take a look at inbound.org etc.

      In terms of quality, it should read as perfect English every time and not look like generic spun crap.

  104. Sterling

    Hi Matthew, another excellent video. I’ve watched it several times over the last couple of months.

    I’m doing something wrong on the scrape part after merging your files with keywords. I’m getting results like youtube videos and blogspot in the URL list.

    Most of the matches don’t seem to match the inurl criteria of the various merged search terms.

    Here is an example search term:

    this inurl:footer_page/1 “About Us” my keyword here

    Is something wrong with the keyword merging? Or do I have to uncheck some of the search engines to scrape? Should I just stick to bing and google?

    thank you sir!

    • Matthew Woodward
      September 2nd, 2013 at 7:09 pm


      Yeah, you get that sometimes, but I don’t worry about it – the import into Ultimate Demon takes care of that.

      • Sterling
        September 2nd, 2013 at 8:05 pm

        Thanks Matthew. It looks like only Google supports the “inurl:” operator, so the list is much cleaner and the scrape much faster when I only use Google with those kinds of footprints.

        Wish SEnuke’s import was as slick as UD’s.

        • Matthew Woodward
          September 3rd, 2013 at 7:28 pm

          Oops, missed that, sorry – but yes, that’s a Google-only operator.

          Yeah, Nuke’s import sucks ass in comparison.
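For anyone scripting this step, the footprint/keyword merge and the Google-only split can be sketched like so. The footprints and keywords below are illustrative, not Matthew’s actual lists.

```python
# Merge footprints with keywords into search queries, splitting out
# the ones that rely on the Google-only "inurl:" operator.
footprints = ['powered by pligg', 'inurl:footer_page/1 "About Us"']
keywords = ["dog training", "cat grooming"]

google_only, any_engine = [], []
for fp in footprints:
    for kw in keywords:
        query = f"{fp} {kw}"
        (google_only if "inurl:" in fp else any_engine).append(query)

print(len(google_only), len(any_engine))  # 2 2
```

Queries in `any_engine` can be scraped across all engines, while `google_only` queries should go to Google alone.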

  105. Chris

    Hey Matt,
    great set of tutorials, thanks.

    Just a quick question about the 3 base articles. Do you have these written in a specific way? What I mean is, does the article you use for a press release have to be a certain style, i.e. read like an actual press release?
    Or do you just have three articles written about a niche related topic?



    • Matthew Woodward
      September 8th, 2013 at 2:51 pm

      Hi Chris,

      I just keep it simple with niche related topics

  106. Holger

    Hi Matt,

    I wonder how to manage the target URLs the right way in terms of targets already used and targets not used yet. I will try to exemplify this: say I scraped 1000 target sites (bookmark sites) but only really posted to 200 of them in a bookmarking campaign (because I limited this T1 campaign to 200 operations). When I use the same target list (1000) again later on, UD, Licorne and so on will target those 200 sites again. How do you avoid this problem in your daily work? Do you scrape completely new sets of target sites for each run, or do you delete the used ones? In other words: how often do you use a list of target sites for your link building campaigns within one project?


    • Matthew Woodward
      September 8th, 2013 at 2:44 pm


      I try to use each site only once per campaign, and you can do that by just editing the project and adding more sites/updating the schedule.
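One way to script the “use each site once” rule is a simple set difference between the full target list and the sites already posted to – a minimal sketch with illustrative URLs:

```python
# Drop already-used targets from the next run's list.
all_targets = {"http://a.com", "http://b.com", "http://c.com"}
already_used = {"http://a.com"}

fresh_targets = sorted(all_targets - already_used)
print(fresh_targets)  # ['http://b.com', 'http://c.com']
```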

  107. Matt B.

    Hey there fellow Matthew.

    I’m curious what you would say if I offered to buy you a pint and asked how you would change your tier one campaign if you intended to use it to diversify anchor text. Would you substitute domain-type anchor text for some of the keywords scraped via Scrapebox?

    That being said, seriously I’m buying you a pint some day.

    -Matt B.

    • Matthew Woodward
      September 11th, 2013 at 11:10 am

      No need for a pint but just change it up/be random. Or drink a bottle of vodka before you start work, no way you will leave a footprint or any pattern then ^^

  108. Alvin

    Thanks Matthew for sharing your link building strategies. I have a concern: if we build tier 3 spammy backlinks, a future Google algorithm update might take a toll on them.

    I recently came across Backlink Beast. I’d like to know your thoughts on it. Between Backlink Beast and SEnuke XCr, which one is better?

    • Matthew Woodward
      September 11th, 2013 at 11:06 am

      You could say that about any approach to link building. That is why I put the guidelines in place in the first video so you can disconnect the tiers as and when you want/need to.

      Not heard of Backlink Beast and probably won’t use it/look at it.

  109. Lee

    Hi Matthew, great videos.

    I am having problems with scrapebox trying to scrape my own list.

    It keeps crashing. I can use it for everything else, but when I follow your procedure for scraping lists it just crashes when I stop harvesting. Any ideas?

    • Matthew Woodward
      September 11th, 2013 at 11:05 am

      Hit up Scrapebox support :)

  110. Matt B

    Greetings Matthew,

    In scraping my list I ended up with over 2,000,000 keywords using the footprints and the merge list file, which basically killed my tired old PC.

    Would you recommend starting out with fewer keywords, or perhaps using just the footprints without the merge list file?

    Thanks in advance,

    -Matt B

    • Matthew Woodward
      September 11th, 2013 at 10:33 am


      Well, Scrapebox can only handle 1,000,000 at a time in the final results window, so just keep an eye on it and perhaps stop it at 500,000 or so.

      You could use fewer keywords, yes, but I just stop it manually, export, then carry on with the ones it hadn’t done yet.
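The stop/export/carry-on routine amounts to splitting the merged keyword list into fixed-size batches and scraping one batch at a time. A minimal Python sketch – small numbers here for clarity, whereas in practice the batch size would be something like 500,000:

```python
def chunk(items, size):
    """Yield successive fixed-size slices of a list."""
    for i in range(0, len(items), size):
        yield items[i:i + size]

# `keywords` stands in for the merged footprint/keyword list;
# each batch would be exported and scraped separately.
keywords = [f"kw{n}" for n in range(12)]
batches = list(chunk(keywords, 5))
print([len(b) for b in batches])  # [5, 5, 2]
```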

      • Matt B
        September 13th, 2013 at 8:55 pm

        Thanks for the response Matthew.

        I just watched the Senuke video and was wondering how you separated your list by link type in order to import it into Senuke?

        Thanks in advance,


        • Matthew Woodward
          September 14th, 2013 at 3:54 pm


          By doing separate scrapes one by one to begin with – pain in the arse, right?

  111. Holger

    Hi Matt,

    I want to scrape for article directories in the German language which are supported by Licorne AIO. I know you prefer UD, but maybe there’s no difference. Can you give an example of footprints to scrape these targets? Thanks in advance.


    • Matthew Woodward
      September 14th, 2013 at 4:04 pm


      You could add an inurl:.de to the search to return only German domains?
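An alternative is filtering the harvested list after the scrape rather than at search time – a minimal sketch (illustrative URLs) that keeps only .de domains:

```python
from urllib.parse import urlparse

# Keep only URLs whose domain ends in .de, as a post-scrape filter.
harvested = [
    "http://artikel-verzeichnis.de/submit",
    "http://example.com/articles",
    "http://webkatalog.de/neu",
]
german = [u for u in harvested if urlparse(u).netloc.endswith(".de")]
print(german)  # ['http://artikel-verzeichnis.de/submit', 'http://webkatalog.de/neu']
```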

      • Holger
        September 16th, 2013 at 10:29 am

        I did so, but there are no article directories in the results that Licorne supports. Instead I discovered that many German article directories run on WordPress. I’m just trying to find out if you can submit to them using Licorne and have posted this as a question inside the Licorne forums. By the way, do you know if submitting to WordPress sites running as article directories works with Licorne and UD (I don’t mean blog comments)?


        • Matthew Woodward
          September 20th, 2013 at 10:00 am


          I’m pretty sure Licorne does support the WordPress-based article dirs; I know UD does 100%!

  112. blueflame

    Hi Matthew

    Firstly, a BIG thank you for your tutorials – they’ve really motivated me to get on with it and explore everything.

    Please help, I’m so confused. I keep getting my proxies blocked by Google – it’s driving me mad! And making me poor!!

    I’ve just tried using Scrapebox to harvest 4120 keywords with 4 shared private proxies, and all 4 got banned straight after the harvest, which probably took about 10 minutes. I’ve even deselected multi-threading. They still seem to be okay if I test them in the proxy manager or use the keyword scraper.

    Any ideas on what I might be doing wrong?



    • Matthew Woodward
      September 18th, 2013 at 11:56 am


      No worries :)

      You need more proxies. I’m using semi-dedicated ones from https://www.matthewwoodward.co.uk/get/buyproxies/ which can go all day long.

      • blueflame
        September 18th, 2013 at 3:57 pm

        Thanks Matthew

        Ironically I did use them earlier today – I’ve just bought 30 of the semi-dedicated ones, so hopefully it’ll be a lot better.

        I spoke to the owner without realizing it. He was so helpful and nice, and said exactly the same thing as you.

        Thanks again, can finally move on with the training :-)

        • Matthew Woodward
          September 20th, 2013 at 9:54 am

          Best proxy company ever in my experience as well ^^

  113. Holger

    Hi Matt,

    I just began scraping with different keywords for my current niche site project. Since I have other projects in different niches, I wonder if I should scrape new targets for each project or if I can use this target list for all my projects. What is your opinion on this issue, and if I may ask: do you manage different target lists for your projects? I think it could make sense for certain platforms, but article directories for example are good for almost every link building campaign because they mostly have lots of categories. Please tell me if I’m wrong or not in your opinion. Replies from others would be nice too.

    Thanks in advance,


    • Matthew Woodward
      September 20th, 2013 at 9:50 am


      You can use it for all projects but you should never stop adding sites/maintaining the quality of your list.

  114. Mike

    This is pretty good info – not new, but it’s pretty good... but I think you have to be a 20-something crazy young guy to actually spend so much time doing this mind-numbing stuff – SEO.

    Life is too short :) I do spend some time doing SEO of sorts, but mostly pay for services that I have found to work. I just don’t have the time, or the desire, to ever do this kind of stuff again...

    And actually, SEO is NOT easy and it IS complicated – despite what SEO courses/guys selling SEO courses will tell you. So if you don’t have the time, resources, knowledge etc. to spend testing and keeping up on things... well, good luck. But to each their own.



    • Matthew Woodward
      October 4th, 2013 at 9:20 am

      Well it was published 14 months ago :)

      Whoever you pay/hire to do your SEO has more control over the success of your business than you do.

      Be careful with that!

      • Mike
        October 4th, 2013 at 3:57 pm

        Sure, and it is good info – I have learned much of it as you have... not a lot has changed in the basics. One thing I won’t be doing is sending 1000s of spammy links to any tier, no matter if it’s the “3rd tier”; I believe there may be a few footprints that G can follow sooner or later.

        I actually do spend some time doing my own SEO :)... but for some parts, I find people to do certain tasks, not the whole enchilada, so I keep some control.

        And building a network of sorts is a good idea for your own use. :)


        • Matthew Woodward
          October 7th, 2013 at 9:17 am


          As long as you have control of the first tier that is irrelevant because you can always disconnect from the spam in the click of a button ;)

          Owning your first tier is the future though I agree!

          • Ray
            October 12th, 2013 at 3:04 pm

            Hi Matt. Great info! But when you write “click of a button,” do you delete just the backlink on, say, the blog post or do you delete the entire blog post? Thanks.

          • Matthew Woodward
            October 14th, 2013 at 9:26 am

            Either, as long as its removed :)

  115. Rony

    I’ve just finished watching all the tutorials – where does the Fiverr-created video fit in?

    • Matthew Woodward
      October 5th, 2013 at 8:51 am


      In the video project for tier 1 in UD.

  116. Tom

    Hi Matt,

    I am currently following your tiered link tutorial and have nearly completed spinning the articles.

    SEO is new to me, but this tutorial is helping me learn fast, for which I thank you.

    I am, however, a little unsure about preparing the social bookmarking and web directory titles and descriptions.

    Are these titles and descriptions targeted towards the articles, i.e. is the set of titles in reference to each of the 3 article subjects, or do you point directories/bookmarks to your money site with the title and link referring there?

    Thanks in advance



    • Matthew Woodward
      October 5th, 2013 at 8:34 am


      You point them to your money site, so the title/description need to be relevant to that.

      • Tom Owens
        October 5th, 2013 at 3:27 pm

        Cheers for the advice. I’ve been doing a bit of research into web directories today and noticed some have caps of 50 or so characters while others can be paragraphs long.

        What would be a good word count to shoot for when writing the descriptions for the directories and bookmarks?

        Thanks again

        • Matthew Woodward
          October 6th, 2013 at 9:39 am

          Anywhere between 2-4 sentences will do!

  117. Rony

    I’m having a bit of a nightmare with scrapebox proxies – wanted to get your opinion.

    I have purchased 10 dedicated proxies from BuyProxies, and also tried 10 semi-dedicated from Squid. I’m using only 1 connection to harvest, but it’s going extremely slowly, e.g. an average of 1 URL per second, sometimes none. The proxies test fine for Google, but then seem to lock up when harvesting?

    When I try with public proxies it goes fine, and when I tested with my own IP address (no proxy) it went super fast. Problem is, it’s tough maintaining the public proxies, so I was really hoping to use the privates I bought!

    Have you had any issues like this with private/semi-private proxies?

    • Matthew Woodward
      October 7th, 2013 at 9:11 am


      Well, I have found that SquidProxies SUCK! I had the issues you are describing. Oddly, I moved to BuyProxies and had no issues.

      When you say it ‘locks up’, what do you mean?

      • Rony
        October 7th, 2013 at 10:04 am

        One thing I have learnt is that the Scrapebox proxy tester is not really accurate when it comes to Google. It checks if the proxy can access Google, but doesn’t check if it’s been captcha blocked, so it can sometimes say a proxy is G-passed when it’s not. Using Google Proxy Checker (BHW forum) has helped assess these more accurately.

        Here is my testing from several proxy providers:

        Purchased 50 dedicated from Squid: had to contact support to get the proxies changed for Google/Scrapebox, then got them working to a degree. 20 were already captcha blocked; 30 are currently working with 2 connections.

        10 dedicated from BuyProxies: these seem to be captcha blocked at the beginning. Giving them the benefit of the doubt and will test when they are unblocked.

        Proxy-hub: proxies not working at all in Scrapebox – support said they should be fine but they are not. Support slow to respond.

        SSL-private-proxies: so far so good.

        • Matthew Woodward
          October 8th, 2013 at 10:27 am

          With buyproxies drop support a line and tell them what you are intending to use the proxies for – they can change things up so they work better for scraping Google.

  118. JJ

    Matt, I’m having a problem filtering out de-indexed domains.

    I’m using 50 proxies from buyproxies.com to carry out this task with Scrapebox, with 2 connections set. However, most of my proxies get IP banned after an hour.

    Have you ever had this issue before?

    • Matthew Woodward
      October 8th, 2013 at 10:17 am


      Just add a 1s delay :)
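Editor's note (not part of the original thread): the “1s delay” fix is plain rate limiting between requests. A minimal Python sketch, where `check` is a hypothetical stand-in for whatever index/alive test your tool runs against each domain:

```python
import time

def throttled_check(urls, check, delay=1.0):
    """Run `check` against each URL with a pause between requests,
    so the proxies are not hammered and IP banned."""
    results = {}
    for i, url in enumerate(urls):
        if i:
            time.sleep(delay)  # the "1s delay" between consecutive requests
        results[url] = check(url)
    return results

# usage: results = throttled_check(domain_list, check=my_index_check, delay=1.0)
```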

  119. Marck

    A very useful article and site thank you.
    I have one question:
    I followed your first video step by step; the only problem is that at the end the URL count is 0. Thank you very much.

    • Matthew Woodward
      October 11th, 2013 at 8:30 am

      The URL count from where, sorry?

  120. Ken Wild

    Hey Matt,

    Thanks for the video tutorials. I have two questions:

    1. I get a list of over 2000 target sites after harvesting in Scrapebox, which I paste into UD, however I only get about 8 successful sites detected. When I then tick the subfolder option I get zero sites detected. Am I doing anything wrong? (I did change the dropdown list to All.)

    2. Also there is a huge list of sites already pre-loaded in UD. Is it safe to create links to all of them as well as the target sites? Or do you have any tips you can share on this?

    • Matthew Woodward
      October 18th, 2013 at 11:21 am


      1) That’s about right, you just have to keep working through the lists

      2) Yes you can use them, but just be aware every UD user is using them

  121. Matt B

    Hi Matthew,

    I’ve gone through your tutorial several times now. You mention the importance of having the ability to remove tier one links, but I’ve been unable to find a place in the tutorial where you show how to do that.

    Any guidance is appreciated.


    • Matthew Woodward
      October 18th, 2013 at 11:18 am


      Ahhh that’s just a case of exporting all of the logins and then manually working through them

  122. Agnes90

    Hey, I am wondering why you deleted Ultimate Demon? Do you think it’s no longer working?

    • Matthew Woodward
      October 18th, 2013 at 3:16 pm
      • Agnes90
        October 19th, 2013 at 10:00 am

        Hi, yes I read that post.
        I noticed under this video tutorial, in the “essential tools” part, Ultimate Demon is struck through, which confused me.

        • Matthew Woodward
          October 22nd, 2013 at 9:29 am

          I struck through the $50 discount while the $80 discount was available – back to normal now though :)

  123. hans

    Hi Matt, 2 questions:
    1: when merging your ‘merge-list’ and ‘footprints-raw’ list with my keywords I get an error message every time after finishing or aborting, and Scrapebox shuts down. Do you have an idea why this happens? Scraping related websites, for example, works fine. It’s the first time I have seen this error.

    2. you say the following 2 things:
    -write yourself – 1 title with a 3 sentence description for every type (so 3 sets in total, so you end up with 3 titles with 3×3 sentence descriptions in total)
    -you should write 15 alternative titles for everything including your articles, bookmarks so on…

    So, do I need to write 15 titles or 3 titles?

    • Matthew Woodward
      October 22nd, 2013 at 9:10 am


      1) Not sure sorry, try getting in touch with scrapebox support

      2) 3 titles, and each title has 15 variations
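Editor's note (not part of the original thread): those title variations are normally written as spintax – `{option A|option B|option C}` groups that tools like The Best Spinner resolve at submission time. A rough Python sketch of how such a spinner picks one variation (an illustration, not the actual TBS implementation):

```python
import random
import re

def spin(text, rng=random):
    """Resolve spintax: replace each {a|b|c} group with one randomly
    chosen option. Innermost groups are resolved first, so nesting works."""
    pattern = re.compile(r"\{([^{}]*)\}")
    while True:
        match = pattern.search(text)
        if match is None:
            return text
        choice = rng.choice(match.group(1).split("|"))
        text = text[:match.start()] + choice + text[match.end():]

title = "{Ultimate|Complete|Definitive} Guide To {Tiered|Layered} Link Building"
print(spin(title))  # one of 3 x 2 = 6 possible titles
```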

  124. Hans

    question 4: how many urls per second do you scrape on average with your 50 semi dedicated proxies?

    • Matthew Woodward
      October 23rd, 2013 at 1:49 pm

      It’s not really something I pay attention to – it also depends on how powerful the machine is, how much bandwidth is available and how many threads you’re running.

    • frank
      October 25th, 2013 at 10:57 pm

      50 BuyProxies US semi-dedicated proxies. I am currently scraping Google at 14 avg URLs/sec using 3 connections @ 162MB; Yahoo at 39 avg URLs/sec using 3 connections @ 435MB; Bing at 21 avg URLs/sec using 3 connections @ 625MB.

      • Matthew Woodward
        October 28th, 2013 at 8:44 am

        What Frank said ^^

  125. Hans

    Hi Matt, thanks for your reply. I made this comment yesterday too – I think it disappeared because I made 2 comments, not sure. But these were the questions:

    1) You said in the tutorial you have to make a 3 sentence description for every title. So how many descriptions do I need to make exactly? You said 3×3 sentence descriptions in the tutorial, but I’m not so sure about this anymore, because you also first said only 3 titles. But I can’t imagine we have to do 45 descriptions.

    So are 3×3 descriptions, with 2 alternative sentences for every sentence, and the sentences also spun, enough? Or how many do you do exactly? Hopefully you can explain it in a very clear way.

    2) I’m using SquidProxies, but my proxies got banned after an hour. I also read here in the comments that you don’t advocate SquidProxies, because you have had problems with them too.

    So I think it’s best to buy 30 semi-dedicated BuyProxies proxies. But which settings in Scrapebox are best for this, so I will not get banned when scraping 24/7? These are my settings now for 25 private SquidProxies:

    maximum connections settings: http://imgur.com/yRcBnZf
    you also talked about a delay of 1 second. Do you mean the ‘adjust RND delay range’ with this? Screenshot: http://imgur.com/wiTiWLC – and is this set right?

    Are these the best settings for scraping 24/7 with 30 semi-dedicated BuyProxies proxies?

    • Matthew Woodward
      November 11th, 2013 at 10:26 am


      1) For every title, you need 10-15 unique titles. Don’t get titles confused with the article body.

      2) Yeah, SquidProxies suck in my experience – try the cheaper semi-dedicated ones from https://www.matthewwoodward.co.uk/get/buyproxies/

      Really getting the settings right is just a case of tweaking them – those settings look ok though!

  126. Fabin

    Hey Matt,

    when I merge my keywords with your footprints should I use multi-threaded harvester or not?

    What effect do both option have?

    Thanks a lot!

    • Matthew Woodward
      October 24th, 2013 at 8:18 am

      Yes use the multi threaded harvester, that just means it does more things at once :)
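Editor's note (not part of the original thread): “multi-threaded” just means several search queries are processed concurrently instead of one after another. A small illustrative sketch in Python, where `search` is a placeholder for a real query runner:

```python
from concurrent.futures import ThreadPoolExecutor

def harvest(queries, search, workers=10):
    """Run `search` for many queries in parallel threads and
    flatten the per-query URL lists into one result list."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        batches = pool.map(search, queries)  # results come back in input order
        return [url for batch in batches for url in batch]
```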

  127. Chris

    Hi Matt, great stuff. I’m confused about something: you talk about “pasting your personal set of footprints into the footprints.ini file in the ScrapeBox configuration folder” so that it will be available in a drop down menu, and yet in the video we only see you pressing the button in ScrapeBox to merge in the raw footprints and then pressing it again to merge in the list of common words.

    If we can just do it that way, can we just do that and refrain from modifying the footprints.ini file and still get the same results? Or do we need to make that modification to the footprints.ini file in order for us to use your footprints?

    • Matthew Woodward
      October 28th, 2013 at 8:33 am

      Yes, you get the same results, although I prefer to keep things organised so that in the future everything I need is in Scrapebox rather than referencing external files.

  128. winner

    Hey Matt, I am not sure how to thank you for the tutorials.. it’s mind blowing. This write-up inspired me to try and do SEO across our 5 eCommerce sites in-house instead of spending hundreds of dollars every month contracting it out. I also started buying up old domains with intentions of starting a PBN .. thank you.
    If you’re ever in Texas, drinks are on me!

    I also used your link to sign up for the monthly UD subscription.. will likely buy the full version next month.

    After I scraped 78k keywords and merged them with your footprint and common word files, I got over 2 million target URLs, and when I loaded them in Scrapebox it kept crashing. I ended up using only my keywords merged with your footprints (skipped the common words) and this yielded 200k target URLs.
    I loaded those in UD and been running the site detector since 1pm ET Saturday – 49hrs ago.

    These are the current stats
    32,601 domains in queue
    Success: 395
    Undetected: 19,302
    Duplicates: 32
    Connection errors: 894
    Estimated time left: 71 hrs

    I am using 50 semi-dedicated proxies from buyproxies.
    This means it checked 20k domains in 48 hrs... still have over 30k to go.

    1. Does it usually take this long?
    2. Apparently it loaded only about 50k – is there an easier way to know which 150k domains were skipped?
    3. I am yet to uncheck the folder option to run the detection a 2nd time. Is this necessary at this point? I do not have another week to run the same 50k domains.

    By the way, I am running a quad core 2.0GHz, 8GB machine on a 10MB fiber connection.
    Thanks again.

    • Matthew Woodward
      October 29th, 2013 at 12:28 pm


      Haha thanks – I was in Texas not so long ago but don’t have any plans to return or pass through at the moment!

      Yes, it will take a long time. You can also increase the number of threads and disable the proxies for the import process.

      It loads the list in the order you paste it, so it will be the last 150k that were skipped.
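Editor's note (not part of the original thread): when a tool chokes on a 2-million-line paste, the usual workaround is to split the target list into smaller batches and import them one at a time. A minimal Python sketch of that batching:

```python
def chunk(items, size):
    """Split a big target list into batches small enough to
    paste/import without crashing the tool."""
    return [items[i:i + size] for i in range(0, len(items), size)]

# e.g. turn 200k targets into 50k batches and import them one by one:
# for batch in chunk(targets, 50_000): ...
```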

      • winner
        October 29th, 2013 at 6:33 pm

        ohh ok .. thanks a lot. Will try that once this is done in a couple of days.

        • Matthew Woodward
          November 11th, 2013 at 10:23 am

          No worrys :)

  129. Ken Wild

    Hey Mathew,

    When I load your footprints file and merge file with the 800 keywords in Scrapebox, I get a total of 7,296,000 keywords. But when I press the harvest button to get a list of related target sites, it keeps crashing.

    I have emailed Scrapebox support regarding this. However, I wanted your thoughts – have you ever encountered anything similar, or do you have any possible solution?


    • Matthew Woodward
      November 5th, 2013 at 10:32 am


      Upgrade your computer or use fewer keywords :)

  130. Andrew

    Thanks for these amazingly helpful tutorials Matthew. I have a question about sourcing a video gig from Fiverr which you mention here. What kind of video are we looking for? If the money site in question is for an affiliate product, can this video be a testimonial? Let me know if I’m on the right track, thanks!

    • Matthew Woodward
      November 5th, 2013 at 9:53 am


      Whatever is relevant to your promotion.

  131. Ken Wild

    Hey Mat,

    In the UD software, after I add the sites from the site detector to the site list, you say to select all the sites and then run a Google index check in Scrapebox. But in UD there are already around 100 sites that were preloaded with Ultimate Demon. Is it okay to keep those as part of your tier one, or do you recommend deleting them and only using the target URLs from the site detector?


    • Matthew Woodward
      November 18th, 2013 at 10:41 am


      If those domains are deindexed, remove them! The source is irrelevant

  132. Jeff

    Hi Matthew, are your methods still working? Especially with Google’s constant algorithm changes and all.

    • Matthew Woodward
      November 21st, 2013 at 10:50 am

      As with any approach to link building you need to diversify and mix things up. If you combine this with replicating competitor links, building a private network, creating good content and driving social signals, you are winning!

  133. Holger

    Hi Matt, could you please explain what kind of social signals are the best in your opinion?

    • Matthew Woodward
      November 22nd, 2013 at 9:07 am

      Best in terms of what goal?

      • Holger
        November 22nd, 2013 at 9:13 am

        To get better ranking of course. Do you rather mean the big players (Twitter, Facebook, Google etc.) with high PR or others as well? What about 0 PR sites?

        • Matthew Woodward
          November 22nd, 2013 at 9:27 am

          Well social signals bring a range of value, increasing ranking is just 1 thing they can be used for. But yes Google+, Facebook, Twitter, Pinterest

          • Holger
            November 22nd, 2013 at 9:36 am

            I currently use Licorne AIO but I have some difficulties with it. The same thing with Scrapebox. But back to the topic now: do you know if you can create Google, FB, Twitter and Pinterest accounts and submissions with Licorne? If not, is it possible to do that with UD?

          • Matthew Woodward
            November 24th, 2013 at 11:28 am

            No you can’t =\

  134. Martin

    Hey Matthew,

    Am I going completely bonkers?! I can’t seem to find your list of targets…

    • Matthew Woodward
      November 24th, 2013 at 11:24 am

      Yes you are :)

  135. mark

    I can’t believe this is all in one place. What an amazing gift.
    Thank you!!

    • Matthew Woodward
      November 27th, 2013 at 3:21 pm

      No worrys :)

  136. myancey

    Hi Matt,

    I went thru your entire tutorial and, man, it’s superb! The detailed instruction is awesome – a little fast but it’s on vid so…

    I’m def impressed. Thank you.

    Unfortunately it is over-my-head in some areas and def out of my budget. I calculated over $700 worth of tools.

    I am so new to backlinking and struggling to set up my Link building platform now. I have found that many of the Web 2 properties that allow do-follow are sites where I have to create a site (mini site I guess).

    I def was not expecting that! I thought that I could just put my article up and be done. No one seems to teach the best way to QUICKLY set these mini sites up.

    I don’t want to spend days trying to set up all of these sites.

    I thought you said you had a post about that but I haven’t found it yet.

    Long story short – I can’t employ your method right now b/c it is so out of my league right now – mostly financially and in other areas I’m just befuddled – like proxies and all that.

    I learn quickly but this stuff has slowed me down quite a bit.

    Any assist or suggestions would be appreciated.

    Thanks again!


  137. Mike

    Hi Matt,

    are your footprints and merge list still up to date or did you make any changes?

  138. Kaelos

    Any good tutorials you’d recommend on how to spin manually, by hand?

  139. Andy

    Spending $25 just to rank 1 keyword doesn’t seem economical. What if you are running an authority site trying to rank 100s of keywords?

    • Matthew Woodward
      December 24th, 2013 at 1:20 pm

      Spend $2,500?

  140. Trevor

    Why do you now use 99centarticles? The prices and the whole “feel” of the place are all wrong. I feel like I’m being scammed from the home page on. The site must have been designed before computers were invented!

    There are no 99 cent articles, that’s for sure.

    • Matthew Woodward
      January 9th, 2014 at 11:24 pm

      You used to be able to order 100 word articles for 99 cents, not sure if that’s still the case!

  141. Alex

    You mention a footprint file under the video – but I have searched high and low and don’t see it! Is there a direct link?


    • Matthew Woodward
      January 17th, 2014 at 11:57 am

      Did you read while you were searching?

  142. Chua

    Hi Matthew,
    Thanks for the tutorial. I managed to follow the steps and used the footprints provided by you, using Scrapebox to scrape more than 14,000 links. As I only have GSA SER at the moment, can I import them into GSA? Are any additional steps needed, like filtering out unusable links, or can GSA work it out? Have a good day.

    Thanks again

    • Matthew Woodward
      January 28th, 2014 at 10:29 am

      Yes, you can import them in and I’m pretty sure the import will respect your GSA project settings, but I would check that with the dev.

  143. Mike

    Hey Matt,
    Thanks for the videos, they have been very helpful in breaking everything down. I have a basic question regarding the 3 x 500 word articles when implementing them in UD. Currently I have one of them spun properly in TBS (by the way, it is mind numbing at times).

    My question is: when submitting them in UD, are you just submitting one of them to the 100 web 2.0 directories in your video, or are you somehow incorporating all three at the same time? I may have missed that in your video.

    Also, I have yet to pick up UD – so far I just have TBS and Scrapebox, getting everything set up before I pick it up. Do you just put in your article in spun format and UD does all the spinning like TBS does?

    Also, after you launch the article once in UD to the 100 directories, is the article then useless to re-use, as it could be duplicated if submitted again later? Or does UD remember what it has spun previously and re-spin it differently if used down the line? Hope you can answer some of these questions and thanks again for your videos.

    • Matthew Woodward
      February 3rd, 2014 at 10:06 am

      1 for web 2, 1 for wiki, 1 for PR/AD

      Yes an article can only be used so many times, its up to you to work out how many times that is and keep control.

  144. Kerh

    Hello Matthew,

    Thank you very much for making the videos. Definitely a great help to newbie like me.

    I have two questions.

    1) 3 x 500 word articles about your keyword or topic.
    Do I need to write 2 alternative sentences for each sentence of these articles for spinning?

    2) 1 x video with titles/descriptions.
    How long is the video required?

    Thank you.

    • Matthew Woodward
      February 4th, 2014 at 9:51 am


      1) Yes

      2) Whatever is ‘right’ for your niche/what you are trying to communicate

      • kerh
        March 24th, 2014 at 1:45 pm

        Hello Matthew,

        I am concerned SEO Generals or 99CentArticles may copy from other sources to produce the articles. Have you ever encountered a problem where their articles failed Copyscape?

        • Matthew Woodward
          March 24th, 2014 at 5:14 pm

          Yes, that happens sometimes – you have to keep a leash on quality

  145. Sheikh Ovais

    Hi Matt,

    You’ve recommended 99centarticles in your tutorials most of the time and it seems that you’re quite satisfied with their quality.

    My question is: can I use their service to have (1000 word) articles written for my own (primary) site, not for the purpose of link building? Assuming that I’ll tweak the articles to add more detail and value, should I go to them for 10 x 1000-word articles for my site?

    Thanks a lot.

    • Matthew Woodward
      March 6th, 2014 at 8:47 pm


      I ALWAYS write my own content on the front end. By the time you have tweaked/added detail to it you might as well have written it.

  146. KimMass

    Hi Matthew Woodward
    I’ve built about 10 web 2.0 sites (3-5 articles per site, 1 to 2 articles with backlinks to my money site). Google has indexed these sites, but when I check the backlinks of my money site, I do not see any backlinks. Why is this?
    Thanks a lot

    • Matthew Woodward
      March 21st, 2014 at 7:40 am

      If it’s indexed, it’s good to go.

  147. Kim

    In your example in video 2 (with the raw footprints), when you scrape, do you also include those websites without PR (showing as N/A)?
    And when commenting/scraping, does it need to be niche related?

    And do you go over the websites manually to see if they are related?

    • Matthew Woodward
      March 31st, 2014 at 8:34 am

      1) Yes
      2) No
      3) No

  148. Kim

    Did you delete my blog comment? :(

    • Matthew Woodward
      March 31st, 2014 at 8:34 am

      Comments must be manually approved before publishing – I assume you’re not familiar with WordPress?

      • Kim
        March 31st, 2014 at 11:22 pm

        No, I’m not familiar with manual approval in “WordPress” yet. But thanks for taking time on every question/comment. Even though I’m not always the one asking, I’m still reading to learn from the other questions!

  149. Fareed

    Hi Matthew,

    Thank you for all this wonderful knowledge you are sharing with us. I have a few questions:
    1- The link to Linkwheelbandit doesn’t seem to work. Is it no longer on the market?
    2- Is there any particular reason why you don’t use GSA for tier 1?
    3- What are the tools, services or software (hidden costs) involved for someone who buys your premium tutorial?
    I already own GSA, Scrapebox, Kontent Machine, WAC, and The Best Spinner.



    • Matthew Woodward
      April 12th, 2014 at 12:23 pm


      1) No its not

      2) Answered a million times above

      3) None – there are free and premium ways to do things

  150. Jared Jammer

    Hello, Mr. Woodward, and thanks for all of the info you provide free of charge. It’s much appreciated.

    If you don’t mind, I have one simple question.

    After I scrape a list of URLs via Scrapebox, and before I add them to Ultimate Demon’s site detector, should I remove duplicate domains or just duplicate URLs?

    Thanks in advance for your help!

    • Matthew Woodward
      May 11th, 2014 at 1:30 am


      Yeah you can remove both :)
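Editor's note (not part of the original thread): Scrapebox has built-in buttons for both of these, but the logic is simple enough to sketch in Python – deduplicate exact URLs, or keep only the first URL seen per domain:

```python
from urllib.parse import urlparse

def dedupe(urls, by_domain=True):
    """Remove duplicates from a scraped URL list, preserving order.
    With by_domain=True only the first URL per domain is kept."""
    seen, kept = set(), []
    for url in urls:
        key = urlparse(url).netloc.lower() if by_domain else url
        if key not in seen:
            seen.add(key)
            kept.append(url)
    return kept
```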

  151. Peter

    SEO Generals – A word of warning to all.

    At 2 Jun 2014 10:40:29 BST I paid $54 to SEO Generals AKA GlobalDynamics Solutions Inc to write 3 express / expert 500 word articles. I have the PayPal receipt of this as well.

    Since then nothing – I have sent emails via their contact page (5 times) NO RESPONSE.

    I have emailed the email address on their website [email protected] – again NO RESPONSE.

    I have now given up and written off a waste of $54. If anyone else has had a similar problem then reply here – and if anyone is thinking of using them, DO NOT BOTHER.


  152. Sam


    I have got an issue with Ultimate Demon while adding the target list .xls file – around 599 sites didn’t get added. Do you have any other workaround?

    Appreciate your help in advance.


    • Matthew Woodward
      June 26th, 2014 at 7:00 pm

      Sites come and go, re-run the list with the subdirectory box ticked/unticked

      • Sam
        June 26th, 2014 at 7:59 pm

        Hi Matthew,

        I have tried over 20-30 times so far but no luck. Still the same. Tried switching off the proxies as well. No luck at all. I don’t know where to proceed now.


  153. Daniel

    Matthew, I’m sure this is a dumb question but I’m a noob, and want to cover my bases. Just to clarify, when you say “hand-spun”, are you referring to using The Best Spinner, or literally spinning all of my titles, descriptions, and sentences manually, by hand. I’m sitting here imagining doing everything by hand and am like, Damn, that’s going to be a lot of work! Thanks in advance for not laughing at me too hard.

    • Matthew Woodward
      June 26th, 2014 at 6:59 pm

      Using TheBestSpinner but doing it manually with the software rather than using the auto replace buttons.

      • Ryan
        July 8th, 2014 at 6:13 am

        Hello Matt,

        Loving your articles. Just to clarify: you said we have to write 2 alternative sentences for all 3 original articles.

        Can I use TheBestSpinner to do this? or do you suggest to write it by hand?


        • Matthew Woodward
          July 8th, 2014 at 10:04 pm

          Yes I use The Best Spinner for this

  154. keith

    How do I download your personal targets? I clicked on your social share buttons (Google Plus and Facebook) and got nothing in return.

    • Matthew Woodward
      July 2nd, 2014 at 1:21 am

      Please drop me a mail and I will send it over to you

  155. Sam

    Hi Matthew,

    Thank you once again for the great tutorial.

    Small Confusion:
    At 09:44 in video 2 you mention “Add these to your footprints.ini file in the Configuration folder for future use”.

    Do you mean to copy the text from footprints.ini or footprints-raw.txt (from your list) to the footprints of Scrapebox?

    Thank you in advance.


    • Matthew Woodward
      July 2nd, 2014 at 1:15 am

      Move the footprints.ini file into the folder

      • Sam
        July 4th, 2014 at 7:37 am

        You are a star. Thank you. Kicking off the first campaign tomorrow.

        • Matthew Woodward
          July 6th, 2014 at 11:13 pm

          Let us know how it goes!

  156. Moshe Dahari

    By the way, great job on reorganising your navigation bar – I like the categories, it’s easier to find things.

    Question: I noticed when you scraped for URLs in Scrapebox you used a broad match. Do you think it’s better to scrape your keyword without quotes? e.g. “article dashboard” dog training vs “article dashboard” “dog training”?

    • Matthew Woodward
      July 6th, 2014 at 11:06 pm

      Haha nice spot thanks – waiting for more data to decide if it was a good move or not!

      You can use the quotes for tighter/more accurate results if you want to but I don’t personally.
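Editor's note (not part of the original thread): the merge step being discussed is just a cross product of footprints and keywords, with or without quoting the keyword for an exact-match query. A Python sketch of what the merged query list looks like:

```python
def merge_queries(footprints, keywords, quote_keyword=False):
    """Combine every footprint with every keyword into one search
    query per pair; quoting the keyword gives exact-match scraping."""
    fmt = '{fp} "{kw}"' if quote_keyword else "{fp} {kw}"
    return [fmt.format(fp=fp, kw=kw) for fp in footprints for kw in keywords]

print(merge_queries(['"article dashboard"'], ["dog training"], quote_keyword=True))
```

Note the list size is len(footprints) × len(keywords), which is how 800 keywords can balloon into millions of queries.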

  157. Dave

    Hi Matthew,

    I came across your tutorial after a short Google search and I am finding it very helpful. You are doing a very good job for the community.

    I am a bit confused: I have scraped my target list as you suggested using Scrapebox.

    In UD you added a list of sites to the Site Detector – is that the same list I scraped as the “target list”, or are “target list” and “site list” two different things?

    Many thanks

    • Matthew Woodward
      July 6th, 2014 at 11:05 pm

      Thanks for your kind words, yes they are the same thing :)

      • Dave
        July 6th, 2014 at 11:55 pm

        Thank you for the reply.

        One more question (sorry)

        I want to start tiered link building for my site plus a few other sites.
        Once I import the target list for 1 website (www.xyz.com) and start the link building,

        if I then want to start a 2nd project for (www.abc.com), do I again do the scraping and paste that target list into UD?

        Then there will be a mixed list for site 1 and site 2 in UD.

        Is this good? Or am I not understanding and overthinking?

        Or do I delete the list for site 1 and only then paste the target list for site 2 and kick off the link building?

        Sorry for too many questions


        • Matthew Woodward
          July 7th, 2014 at 3:52 pm

          You should always be looking to scrape and import new targets regardless of how many projects you are running

  158. Anthony

    Hi Matt,

    Firstly, thanks for the video series. You should be commended for the effort put in here.

    Can I ask how many private proxies you would advise owning for running the first tier?


    • Matthew Woodward
      July 15th, 2014 at 3:06 pm

      Thanks Anthony, I would recommend 10 semi dedicated from BuyProxies

  159. Freddy James

    Hi Matthew,

    Just wanted to say thank you for these excellent videos. I have just finished a MNS and am about to spin some articles to start the 1st tier backlinking process with SENuke as per your instructions.

    I hope to be rising up the SERPS in the next few weeks.

    Thanks again.

    ps – I have made sure to click your affiliate links for any software I am going to purchase as a thanks for your efforts.

    • Matthew Woodward
      August 4th, 2014 at 12:09 pm

      Thats very kind of you – good luck with your efforts =D

  160. networkwithdon

    I have a question about tier one sites if you please.

    So let’s say I take on an attorney. I then create a PBN for tier ones. I then have content created for the tier ones and post the new content linking to the money site. Then tier 2s and 3s.

    When I repeat the process again and again, adding content to my tier ones linking to the money site, does it look weird for all the tier ones to be linking to just one site? Do you add other content to the tier ones linking to sites that aren’t your client’s money site, so the site doesn’t just have links to one domain?

    Please fill me in on this. This is really the one thing in the process I am unclear on.

    Thanks so much for all your help.

    • Matthew Woodward
      August 12th, 2014 at 8:30 pm

      Never have more than 1 link from a tier 1 site to a money site

  161. med


    Is it worth putting backlinks on a site in a different language than mine? For example, if my site is in Russian, is it worth putting a backlink on an English site if my anchor text is in Russian?


    • Matthew Woodward
      August 9th, 2014 at 3:26 am


      Yes that works but I would mix things up!

  162. Damon


    Great video.

    You recommend Article Demon. Won’t Scrapebox do the same thing? Maybe it’s not as flashy… but just wondering if I need Article Demon?


    • Matthew Woodward
      August 12th, 2014 at 7:45 pm

      I recommend Ultimate Demon here and Scrapebox is a very different tool!

  163. Cyrus

    So… I didn’t know much about tiers 1, 2, 3 and so on. Thanks a lot man. I think most bloggers (especially me) are very afraid of doing all that work. I will invest my time in doing what I’ve learnt in the videos. You should have a donate button somewhere :)

    • Matthew Woodward
      September 4th, 2014 at 2:41 pm

      If you are a blogger just stick to blogging!

  164. Sourav Das

    Should the tier 1 links be built for only the homepage or all the pages of my website?

    • Matthew Woodward
      October 5th, 2014 at 9:13 pm

      Depends what you are trying to rank

  165. Erik Emanuelli

    Is Tiered Link Building still working today, Matthew?
    I mean, even after the latest news, PR is going to disappear, etc.

    • Matthew Woodward
      November 6th, 2014 at 4:06 pm

      Yes it is still effective!

  166. Edwin

    Hi Matthew, I’m really satisfied with Ultimate Demon. It’s a great SEO tool for tier 1. Does the footprints-raw file that you share contain all the footprints for all platforms in Ultimate Demon?

    • Matthew Woodward
      June 23rd, 2015 at 8:20 am

      It sure does!

  167. samim

    The Best Spinner link isn’t working

    • Matthew Woodward
      September 21st, 2015 at 9:46 am

      Hmm, try it again – it’s working here

  168. Alex

    Hi Matthew,

    I really love all your hard work in explaining SEO link building strategies on this page. I haven’t finished watching all your videos yet, but thank you very much for all the information and effort you put into them.

    I have some questions which I hope you could clear up for me, if you wouldn’t mind:
    – Your targets-list.xls file that is downloadable via a link under the video on this page: why does it contain so many dead target links? I tried opening 8 links randomly from this file and they are all dead – is it because this .xls file was generated a while ago?
    – I would like to use your link building technique for SEOing my site. I read that link building is much less effective nowadays compared to several years ago – is it still worthwhile going down this path, or would you recommend something else?
    – My site will be in a foreign language (Indonesian), so I certainly need to prepare my content in Indonesian. Do you think the software you recommend here (The Best Spinner, Ultimate Demon, etc.) would have no problem helping me in my situation?
    – Is there a rule of thumb for how “good” content should look? I mean, do we need to stuff the keywords we are targeting into the content? And if so, generally how many times could we fit the keywords into an article (say 500 words)?

    Thank you very much, Matthew.

    • Matthew Woodward
      January 27th, 2016 at 4:36 pm

      Hey Alex,

      In answer to your questions-

      1) Probably because lots of people have downloaded it, used it and abused it which will be the case with any public list

      2) Which SEO approach you use largely depends on your specific circumstances as there is not a one size fits all strategy

      3) I have no experience with foreign language sorry

      4) I wouldn’t worry about the keyword density but it should be on topic/related

  169. alan

    Does this work in other languages such as Spanish?

    • Matthew Woodward
      May 16th, 2016 at 8:32 am

      Yes although you would need to use something like SEOContentMachine for the Spanish content

      • Alan
        May 20th, 2016 at 11:44 am

        Thanks Matthew, you’re the best!!!

        • Matthew Woodward
          May 24th, 2016 at 10:55 am

          Hahaha, you cheeky one :P

          • Alan
            May 24th, 2016 at 2:36 pm


  170. adi

    Thanks Matthew, I used your video tutorial for my site’s link building and it works for me

    • Matthew Woodward
      February 26th, 2017 at 7:48 am

      Glad to hear it Adi!

  171. Joaquin Rios

    Thanks for the tutorial Matthew, …like always. You are great!!

    • Matthew Woodward
      August 8th, 2017 at 1:30 pm

      No worrys Joaquin!

  172. shanaul saikh

    Thank you, very helpful. Looking forward to more videos. This method works for my site’s link building.

    • Matthew Woodward
      July 11th, 2017 at 2:09 pm

      No worrys!

  173. hit me

    Good post. I learn something totally new and challenging
    on websites I stumble upon every day. It will always
    be exciting to read articles from other writers and use something
    from other web sites.

    • Matthew Woodward
      April 2nd, 2018 at 4:13 pm

      Thanks for reading!

  174. dengan benar

    I got this web site from my friend who informed me regarding
    this web site and now this time I am browsing this website and reading very informative articles at this time.

    • Matthew Woodward
      May 11th, 2018 at 11:46 am

      Hey, pleased you find them informative, thanks and also thanks to your friend for the recommendation

  175. sanat

    It’s difficult to find well-informed people about
    this subject, however, you sound like you know what you’re talking about!

    • Matthew Woodward
      June 8th, 2018 at 10:41 am

      No problem :)
