The Ultimate Guide To Tiered Link Building Part 2

This is the second video in my Ultimate Guide To Tiered Link Building video tutorial series. If you missed part 1 then be sure to check out the episode guide on the right.

Thanks for all the great feedback so far! Please share this video series with others if you find it useful. It takes hours to make each video so taking a few seconds to share it is a great way of saying thank you.

What You Will Learn

  1. How to create a perfect tier 1 link profile
  2. How to prepare for the campaign
  3. How to spin content the right way
  4. How to setup a campaign over my shoulder
  5. How to schedule your tier 1 link campaigns

Resources In The Video

Essential Tools

TheBestSpinner (download the trial) – It really is The Best Spinner.

Ultimate Demon ($50 Discount) – perfect for creating a solid tier 1 link profile.

Sourcing Content

99centarticles.com – Good quality articles for the money – perfect for link building!

SEOGenerals.com – Fantastic backend for managing your orders and projects.
Update: Since publication the SEO Generals service is not what it once was and the coupon has now expired. Please use 99CentArticles who offer a great service and pricing.

Fiverr videos – Can’t get them any cheaper than that!

Other Resources

Google AdWords Keyword Tool – Find related keywords and tags.

Scrapebox – The swiss army knife of the SEO world.

BuyProxies.org – The semi-dedicated proxies from BuyProxies are superior to the SquidProxies ones I used to use.


392 Responses

8.2.2012

Aditya, I think Matthew is sharing some quality info here, don't you? I don't mind throwing him an affiliate sale or two for his trouble.

Reply

Matthew Woodward Reply:

I think the comment was deleted but thank you very much for the support.

These videos take an awful lot of time to make, never mind how much time and money I invest in testing various factors!

Don't like the affiliate links? Don't watch the videos :)

Reply

penny Reply:

I am going to buy all the products that you listed with your affiliate links. The amount of information here is invaluable.

Reply

Matthew Woodward Reply:

That is very kind of you thank you!

Reply

Tony Hayes Reply:

Agreed!
People want the content to learn how to make money online and then some complain when you lead by example.
Look at what Matthew is doing here guys…
High quality content with useful and actionable information, compelling headlines and call to action.
Don’t knock it, learn from it!

Reply

Matthew Woodward Reply:

You can’t win them all :)

Reply

superb:)

Reply

8.7.2012

Matthew,

I have now found a resource that finally explains tiered link building in a simple way and how to integrate the tools together to make it happen in a Google safe way. Wish I had this before I bought UD. I would certainly have purchased through your affiliate link.

Can't wait for the next videos. Will you be covering the new version of UD with link trees?

Many thanks for what you are doing.

Peter

Reply

Matthew Woodward Reply:

Hi Peter,

Thanks for the kind words :)

The link trees feature looks interesting but it won't be something I use personally. I have a much more efficient and streamlined way of doing things, but it will allow you to create multi-tiered structures within Ultimate Demon easily.

Cheers

P.s. video 3 is up ^^

Reply

8.19.2012

Great videos, do you offer this as a service Matthew? I don't have SB at the moment or Ultimate Demon. I am just building my tier 1 with a Blogger blog and was going to redirect that to my main domain once I had it ranking. It's on page 2 now.

Reply

Matthew Woodward Reply:

Hi,

I don't offer it as a service sorry, and if I did most people wouldn't pay the price I would ask. I may take on a couple of private clients but don't expect pricing to be what you see in forums etc.

Just scale out what you're doing with automation :)

Reply

8.20.2012

When scraping your own sites do you worry about PR at page or domain level at all, or just add a good mixture of all PR?

Reply

Matthew Woodward Reply:

Not when I'm scraping, but when I submit tier 1 links I like to ensure they are only on domains with PR

Reply

Michael Cox Reply:

Yeah, thought that would be the case. Hey, do you know you rank no. 13 in the UK for 'buy seo'? Not sure how that's happened lol…

Reply

Matthew Woodward Reply:

Haha I didn't know that, how did you find that? Maybe Matt Cutts isn't talking out his arse after all :P

Reply

Michael Cox Reply:

I saw you were getting a few genuine social signals on the videos with the content lock, so I was checking your site out on SEMrush to see what keywords you were ranking for.

http://www.semrush.com/uk/info/matthewwoodward.co.uk

Reply

Matthew Woodward Reply:

Ahh that's pretty sweet, thanks, will be keeping an eye on that one.

I don't suppose you know of a tool that can report the amount of tweets, likes and +1s for a list of URLs in bulk do you?

Reply

Michael Cox Reply:

Not really. SEOmoz does it for sites you have added as projects but that's about it; I'm sure there will be a tool somewhere.

Reply

Matthew Woodward Reply:

SEO Tools for Excel ^^

Reply

Tom Thoma Reply:

MatthewWoodward.co.uk – Daniel Tan's (SEOPressor) Social Metrics Pro plugin collects stats on posts and pages for Twitter, Facebook, Google+, Pinterest, StumbleUpon, Digg and LinkedIn, and you can export them as CSV/XLS.

Reply

Michael Cox Reply:

Oh yeah, from SEO Gadget. Have you used it yet?

Reply

Matthew Woodward Reply:

I did a quick test to look up social values for URLs, seems to do the job. Will be using it to do the monthly report on the blog

Reply

Matthew Woodward Reply:

Checking out Social Metrics Pro now, thanks for the heads up!

Reply

8.27.2012

Hi Matt, Great Videos, Keep it Up!

I am new to the 3 tier linking, but need to clarify a point.

I normally have about 20 unique keyword-specific articles on my sites and follow up with links from article directories, blogs, social networks, etc.

In this section you mention 3 x 500 word articles on a keyword. Does that mean you are linking to all three articles on the website, or are you mass spinning these back to one original copy on the website?

Tom

Reply

Matthew Woodward Reply:

Hi Tom,

The 3x 500 word articles are what is used to create the backlinks and are not posted on your own website.

Reply


Matthew Woodward Reply:

Hi,

The 3x 500 word articles I talk about preparing are to be used in the link building campaign and not for the site itself.

You can use this entire process to rank 1 page for a range of keywords and longtails if you mix the anchor text up correctly.

Reply
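The advice above about mixing the anchor text up correctly can be sketched as weighted random selection over an anchor pool. This is a minimal illustration only; the anchor texts and ratios below are invented for the example and are not a recommendation from the videos.

```python
import random

# Hypothetical anchor text pool with illustrative percentage weights.
anchors = {
    "blue widgets": 20,          # exact match
    "cheap blue widgets": 15,    # longtail variation
    "best widgets online": 15,   # related phrase
    "example.com": 20,           # naked URL / brand
    "click here": 15,            # generic
    "this site": 15,             # generic
}

def pick_anchor(pool: dict) -> str:
    """Pick one anchor text at random, weighted by the pool percentages."""
    texts = list(pool)
    weights = list(pool.values())
    return random.choices(texts, weights=weights, k=1)[0]

# One anchor per tier 1 link to be built:
links = [pick_anchor(anchors) for _ in range(10)]
print(links)
```

Because each pick is independent, a long campaign naturally converges on the chosen ratios without any two link profiles looking identical.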

Anonymous
8.31.2012

Matt, could you clarify a few points.
1) The 3 x 500, once spun, will be placed on web 2.0 sites. Does this mean you will have 3 posts on each web 2.0 and the final post will have your link pointing back to your money site?
2) In the video you post your link in the first post to the web 2.0. Isn't it customary to create your profile and wait a few days before posting content, to prevent your account getting deleted by moderators?
3) Would you use the same 3 x 500 articles spun to submit to article directories?
Thanks for your help in advance, I feel I've got most of it down, just need a few clarifications.

Reply

Matthew Woodward Reply:

1) Sorry, I use one of the articles on the web 2.0s, 1 for the Wikis and the other one for the AD's & Press releases

2) Don't worry about this (we will build out proper web 2.0 sites by hand in the last video). UltimateDemon mostly posts to Elgg and Jcow web 2.0 platforms where this isn't a problem.

3) See answer 1 ^^

Hope that clears things up

Reply

Callum Ward
9.1.2012

Brilliant vids! You normally pay for this kinda tuition ;) thank you!

Reply

Matthew Woodward Reply:

You can send me some money if you want :P

Reply

Nathan Williams
9.11.2012

Hey Matt,

I'm having a hard time coming up with the funds for UD. What do you think about manually posting to about 15-20 WEB 2.0 properties…like the "big" ones (wordpress/blogger/etc) for Tier 1 and then using GSA to make the Tier 2 and 3? Would 15-20 be enough? Or would those "big" ones know what I'm doing and shut me down?

Appreciate your input!

Reply

Matthew Woodward Reply:

Hi Nathan,

Yes that would be a sensible approach to take – but a better approach would be to watch some of these tutorials on SENuke X http://www.matthewwoodward.co.uk/reviews/the-best-senuke-x-reviews-tutorials/ and then work out how you plan to use it.

Spend some time preparing your content and watching the videos so you're familiar with the tool before you even use it. Then take out the 14 day trial, load your campaigns in and fire them up straight away, using the theory of what you have learnt in these videos to guide you.

Voila!

Reply

dr cliff Reply:

For beauty supplements isn't GSA just as good? Why is SENuke or Ultimate Demon better, exactly? And which GSA, Ranker or Submitter, should I use if I am broke and missed the SENuke trial?

Reply

Matthew Woodward Reply:

I don’t know what any of that has to do with beauty supplements

Reply

Luke Ilechuku
9.12.2012

My goodness. This is really, extremely good content. I would have paid for this and been happy with my purchase. Thanks!

Reply

Matthew Woodward Reply:

Glad you enjoyed it :) Good content should be free not a WSO :P

Reply

Decent list. A bit old (over 50% no longer work) but still plenty of power.

Reply

Matthew Woodward Reply:

Well the list is less than a month old and all confirmed working when I published it. Try the import again but tick 'my site is stored in a subfolder' and you'll get more successes :)

Will have a check over the list again at the weekend and update if necessary

Reply

Paul Rone-Clarke Reply:

Hi – No real need, my lists are very much larger anyway (I have two servers scraping 24/7) – but I appreciate the gesture [scritty from BHW]

Reply

Matthew Woodward Reply:

And those lists are on their way to my inbox right now right? =D

Welcome to the blog by the way :) Will probably catch you over at BHW more often than not!

Reply

9.15.2012

Hey Matt, Any experience with Magic Submitter and how it compares to UD?

I have been using MS for my tier1 links for a while no complaints except the ongoing monthly cost.
I see that UD has a one-time payment option.
So just curious to see whether I should swap to UD.
Rich

Reply

Matthew Woodward Reply:

Hi Rich,

Sorry I haven't got any hands on experience with Magic Submitter but I don't see why you can't use it for tier 1 effectively.

I just see UD as a one time investment without having to worry about anything else, but a few people have asked me to tackle magic submitter so I might start having a play with it and seeing what it can really do.

Its personal preference really! But one time payment > monthly payment

Reply

Andrew Thomson
9.17.2012

Great stuff! – Have the downloads been removed? It would be good to be able to get them. I have not used scrapebox before – do you need to put the footprints in to make sure it finds the right link type?

Reply

Matthew Woodward Reply:

No, they are still there, you're just not looking hard enough :)

Yes, the footprints are what you use to identify that a site is a certain platform; try some of the searches manually in Google to understand how it works.

Reply

Andrew Thomson Reply:

Ok figured it out now – I should have read it more carefully! That thing with the social buttons is pretty clever.

How does Ultimate Demon tell the difference between the URL website (platform) types?
Does it work it out itself, or do you need to submit them as separate pre-sorted lists based on your footprint criteria?

Reply

Matthew Woodward Reply:

You would be surprised how many people miss them – I wonder what other important info people skip over when reading.

With Ultimate Demon you just paste the list of harvested URLs into one box and click start, no need to separate or sort.

The only additional filtering I do, once UD has added the sites to the database, is to export them all, check the domains are indexed and remove any that aren't via the mass deletion tool.

Reply
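The export-and-filter step described above (dropping harvested sites whose root domains aren't indexed) can be sketched as a simple filter with a pluggable checker. The `is_indexed` callable is a placeholder you would supply yourself, e.g. something that runs a `site:domain` query through your proxies; it is an assumption for illustration, not a built-in feature of UD or Scrapebox.

```python
from urllib.parse import urlparse

def filter_indexed(urls, is_indexed):
    """Keep only URLs whose root domain passes the supplied index check.

    `is_indexed` is a callable taking a domain and returning True when
    it looks indexed. The result is cached per domain so a big harvested
    list is only queried once per unique domain.
    """
    cache = {}
    kept = []
    for url in urls:
        domain = urlparse(url).netloc
        if domain not in cache:
            cache[domain] = is_indexed(domain)
        if cache[domain]:
            kept.append(url)
    return kept

# Demo with a stubbed checker (a real one would run a site: query):
demo = ["http://a.com/post1", "http://a.com/post2", "http://b.com/wiki"]
print(filter_indexed(demo, lambda d: d != "b.com"))  # drops the b.com URL
```

The remaining URLs would then go back into UD, with the dropped domains removed via the mass deletion tool as described above.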

Anonymous
9.20.2012

Hi Matt, again many thanks for the amazing tutorials. I just set up my web 2.0s; I have to admit it did take some time, hopefully I will get a bit faster with practice. Just wanted to ask what you thought of SEO Generals? I just received 3 articles and they were really bad quality. Have you had any similar experiences?

thx bro

gerard

Reply

Matthew Woodward Reply:

Hi,

The quality of the articles is often down to the quality of instruction you provide – I am very specific about what I want. But push back on them and they will get it sorted for you.

With a bit of practice you can set these campaigns up in no time at all – it becomes a mindless process after a while :)

Reply

9.20.2012

Quick question about the articles then spinning them.

Could I use my original articles from my main blog, spin them like you suggested then use those spun articles to create all the tier one web properties?

Reply

Matthew Woodward Reply:

You could but it is something I would avoid doing personally.

For the sake of a few dollars to get a new article written, I don't see why you would take a shortcut like that which increases the risk of getting caught.

Reply

Sally Cline
9.20.2012

Hi Matt.

First, this is a great set of videos you've made! I just had a quick question regarding the content and preparation. You had mentioned…

3 x 500 word articles. – Are these spun articles?

Then you went on to mention 2 alternative sentences for each sentence in each article. This confuses me a little bit. Would you mind explaining this so I can set up my content appropriately? Would it be…

3 unique 500 word articles…then write 2 alternative sentences for each of these 500 word articles and then spin them?

Thanks in advance!

Reply

Sally Cline Reply:

Sorry for the double post, but I'm not sure where to use all 3 articles. Or if it's just 1 mass spun article. I watched video 3 and it looks like you used just 1 of the spun articles for the web 2 and article submission directories. Would we just spin it using {Article 1 + Spun content w/ 2 alternative sentences} {article 2 with spun content + 2 alternative sentences} {article 3 w etc}. Appreciate your help Matt!

Reply

Matthew Woodward Reply:

Hi,

Yes, they are spun to the specification in the video – for each article write 2 alternative sentences for each original sentence, then spin all of the words/phrases where it makes sense to do so.

In answer to your second post, that's something I forgot to explain ^^ I use 1 for web 2.0s, 1 for Wikis and 1 for AD's/PR's

Reply

Chris Reply:

Just so I’m clear, are you saying here that you use 3 articles prepared for spinning, and use the first of those articles to build links using Web 2.0s, the second of those articles to build links using Wikis, and the third of those articles to build links using article directories and “PRs”?

Is this correct? You use a different article for Web 2.0, Wiki and article directories/PRs?

And what are “PRs”?

Thanks in advance.

Reply

Matthew Woodward Reply:

Hi,

Yes that is right :)

PR’s are press releases!

Reply

Sally Cline Reply:

Thanks Matt! You're the best! Keep up your great work. I assure you it's helping us newbies out!

Reply
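For readers unfamiliar with the spintax notation the spinning discussion above relies on, here is a minimal sketch of how a `{option1|option2}` string expands to one random variant, resolving nested groups from the inside out. The example article is invented; it just mirrors the "2 alternative sentences per sentence, then spin words/phrases" spec from the video.

```python
import random
import re

def spin(text: str) -> str:
    """Expand one random variant from spintax like {a|b|c}.

    Nested braces are handled by repeatedly resolving the innermost
    group until no spintax remains.
    """
    pattern = re.compile(r"\{([^{}]*)\}")
    while True:
        match = pattern.search(text)
        if not match:
            return text
        choice = random.choice(match.group(1).split("|"))
        text = text[:match.start()] + choice + text[match.end():]

# One original sentence plus two alternatives, with word-level spins inside:
article = ("{Link building takes {time|patience}."
           "|Building links is {slow|gradual} work."
           "|Ranking a site {requires|demands} steady link building.}")
print(spin(article))
```

Each call produces a different variant, which is exactly what tools like TheBestSpinner generate at scale from the same nested syntax.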

9.26.2012

Hey Matt, I am looking at SEOGenerals to get my articles written, however there are several options either,

Junior, Senior, Expert,

as well as whether they should be,

LSI articles or Regular,

your advice welcome?

Tom

Reply

Matthew Woodward Reply:

Hi Tom,

This is a new feature they launched last week that I haven't used yet, but go with whichever one offers 500 word articles @ $4.70 a pop

Reply

Susan Banga
9.29.2012

very advanced strategies.

Reply

Matthew Woodward Reply:

Thanks Susan!

Reply

Rob Terrio
10.2.2012

Hey Matt, maybe you can shed some light on this question/problem and can help some fellow subscribers here. Basically what I'm doing is exactly what you have done here. I first acquired a list of keywords via scrapebox (30 keywords). Then I imported those keywords, the merge list, and the footprints.

I then ended up with around 182,000 keywords/queries. I ended up getting 8 high quality private proxies as well. I imported those proxies and then I hit the harvest button. The problem I'm having is that the cpu usage is fluctuating around 0-2% and it repeatedly stops harvesting at around 9,000 results.

When I stop harvesting I keep getting a message saying 90% of the URL's were removed because they're all similar. I found this post (http://knockoutbox.com/optimize-your-network-for-scrapebox/) on optimizing my network settings etc. Any suggestions?

P.S. I changed my power settings from "power saver" to "high performance" and still get very low CPU usage in scrapebox. Sorry the post is so long just being as thorough as possible.

Reply

Matthew Woodward Reply:

Hi,

When you say it 'stops harvesting' does that mean the Scrapebox window just pops up showing you the scrape is complete and which keywords were completed and which weren't?

If so, the problem is all your proxies are getting banned in Google, at which point Scrapebox cannot continue to scrape and ends the process. 8 proxies to scrape 182,000 queries is more than ambitious unless you have threads set to less than 10 or something (which would take forever)

The URLs are getting removed because you have options > automatically remove duplicate domains ticked.

I don't see why you would want to try and increase CPU load? If anything you should be looking to reduce it.

Reply

Rob Terrio Reply:

Scrapebox doesn't automatically show any kind of complete signal, it just stops harvesting, like a freeze-up I guess you could say. I figured proxies would speed up the time the actual harvesting takes to complete, rather than a whole 20-some-odd hours like you explain in the video here. How many proxies do you suggest? I figured CPU output would speed up the whole harvesting process.

Reply

Matthew Woodward Reply:

Who is your proxy source? Not SquidProxies by any chance?

Reply
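To see why 30 seed keywords ballooned into 182,000 queries in the exchange above, note that this style of harvesting crosses every footprint with every keyword, so the query count is footprints × keywords. A small sketch of that merge (the footprints and keywords here are illustrative examples, not taken from the downloads):

```python
from itertools import product

# Illustrative footprints and keyword phrases only.
footprints = [
    '"powered by elgg"',
    '"powered by jcow"',
    'inurl:wiki "edit this page"',
]
keywords = ["car insurance", "cheap car insurance", "fishing bait"]

# Cross every footprint with every keyword to build the search queries.
queries = [f"{fp} {kw}" for fp, kw in product(footprints, keywords)]
print(len(queries))   # footprints x keywords = 3 x 3 = 9 queries
print(queries[0])
```

With a merge list expanding 30 seeds into thousands of phrases and a few dozen footprints, a six-figure query count is easy to hit, which is why a handful of proxies gets banned long before the scrape finishes.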

10.4.2012

2 questions: how many private proxies do you use?
What are your thoughts on not pinging the links and letting them be found naturally?

Reply

Matthew Woodward Reply:

Hi,

I use 50 semi-dedicated but if you only have a few sites then 10 will be fine.

Pinging is a waste of time and has been for a long time

Reply

Sally Cline
10.11.2012

Hi Matt,

I was wondering if you would recommend spinning the words in the Titles. I have created 15 alternative titles and think it may be better to spin the words in there as well. If you disagree, can you please let me know why? I appreciate your help!

Reply

Matthew Woodward Reply:

Yes you should spin the words, when writing the titles try to include at least 8 words so you can give them a good spinning.

Reply

10.13.2012

Hi Matthew
Awesome content! Is there a way I can use SEnuke instead of UD?

P.S.
What part of the NW are you in? I am in Lancashire :-)

Reply

Matthew Woodward Reply:

Hi,

Yes, you can apply the theory taught in the videos with SENuke without a problem, although it is a more expensive solution.

I'm in Cheshire

Reply

Patrick Brady
10.18.2012

Hey Matt,

Just a quick question. When word spinning your content (after sentence spinning) how many synonyms are you adding? The reason I ask is that I've sentence spun (3 versions of every sentence) and word spun everything in TBS for my article, and TBS still only says 23% unique at the bottom. Not sure if that's an issue. Thanks for your time.

Reply

Matthew Woodward Reply:

As many as I can where it makes sense to do so.

The unique counter in TBS is awful and makes no sense at all at any time – use the generate and compare function on the publish tab to get a true look at uniqueness.

Reply

Patrick Brady Reply:

Sounds good, Matt. Thanks very much for the reply.

Reply
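The generate-and-compare idea recommended above (trusting real output comparisons over TBS's unique counter) can be approximated outside TBS as well. A rough sketch, not TBS's actual algorithm, that scores two generated spins for word-level similarity, where lower means more unique relative to each other:

```python
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Rough word-level similarity between two generated spins (0..1)."""
    return SequenceMatcher(None, a.split(), b.split()).ratio()

# Two hypothetical outputs from the same spun article:
spin_a = "link building takes patience and steady effort over many months"
spin_b = "building links demands persistence plus consistent work across months"

score = similarity(spin_a, spin_b)
print(f"{score:.0%} similar")  # lower = more unique versus each other
```

Generating a batch of spins and checking every pair this way gives a far more honest picture of uniqueness than a single percentage counter.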

10.22.2012

Hi Matt

This is actually the best link building guide I've ever read, and it's free. I actually do the same, just at a smaller scale (until I get all the tools xD), and I never get slapped with those G updates. Thanks for the share, and success!

Reply

Matthew Woodward Reply:

Hi John,

I'm glad you enjoyed it :) Who needs WSO's ^^

Reply

10.25.2012

Hi Matthew, jerry attrikk here again.

I have 2 questions on the tiered link building videos (part 2 in particular)….

1. You kindly detail the process of finding your own list of target sites for UD. This is superb information, thank you. However it caused me to wonder about something. You say to merge a list of related keywords, which I understand and have done. However I will be using UD for various projects, which span several very different niches. What I don't understand is that if I do it for one niche, let's say car insurance, then won't my sites list be full of car insurance article directories, bookmarking sites etc? If so, what happens when I come to run a link tree or campaign for a website about fishing bait for example? Should I scrape again, and add those scraped target sites? If so, won't I be submitting hundreds of spun articles to hundreds of sites which won't accept the content, and won't that get my email or IP banned?
Hope I explained this well enough, maybe you can explain where my thinking is going wrong, as I am sure you must be able to run multiple campaigns to different niches in one installation of UD.

2. On your videos you set up a gmail address, which I also did. However on some of my campaigns, already run, I noticed a lot of failures on article directories citing the reason for refusal as something like "Free email addresses are not allowed". Do you get this problem and if so do you no longer use free email addresses? I have many parked domains, I suppose I could just turn on pop3 for some of them and let UD access them. Would you do that, or do you just use free email addresses and accept the failures?

Thanks again for such a brilliant video tutorial, if I had money I would pay you a lot of money to sit with me for a few days and show me first hand how you use UD! (Idea perhaps?)

Reply

Matthew Woodward Reply:

Hi Mark,

1. I think you're overthinking it – I just merge in those keywords to find more sites that I can target. As long as the pages your links are coming from have relevant content that makes sense on them, you're good to go.

2. Yes, you can reuse those domains and set up catch-all email addresses so you can just make them up on the spot without actually creating accounts for each one. See the UD documentation for more details on how to use it with UD.

I have a few consultation clients, can always use 1 more :P

Reply

10.25.2012

Matthew, I have followed your instructions to the letter, for scraping for target sites and importing into UD. However out of several thousand sites, only about 6 were successfully imported into the site detector. Is this normal?
About 80% had "NOT FOUND" and the rest of the failures had server errors either 500 or 403 forbidden.

Reply

Andrew Smith Reply:

Similar question here Matt – Seeing a really low success rate with my export from SB as UD runs through the sites and detects what platform they are.

Just wondering if this is down to the footprints, as I would expect a pretty high hit rate if the footprints are specifically digging up sites for the platforms UD supports?

How have you found this, and any tips on improving it?

Also, can't remember if you mention this, but glad I've been running anti virus on my VPS as obviously UD is hitting many sites and AVG is catching the viruses… Just hope it gets them all or my VPS is gonna need rebuilding frequently!

Reply

roman Reply:

Yes, I'm also interested in a tip

Reply

10.31.2012

Hi Matthew
I'm having a little trouble inputting my private proxies into Ultimate Demon, don't really know how TBH. Could you possibly show me an example?
Thanks
Shaun

Reply

Matthew Woodward Reply:

Click Global Proxies > Paste into big box > Test proxy servers > Save

They need to be in this format xxx.xxx.xxx.xxx:xxxx OR if you have a username/pass user:pass@xxx.xxx.xxx.xxx:xxxx

Reply

Tom Reply:

Matt

Having purchased Ultimate Demon and ScrapeBox I am trying to set up the proxies and have a general question. I am in BuyProxies and they offer 10 semi-dedicated proxies across 2 subnets for $10/m, but which location should be specified: USA, Europe, or USA & Europe? I am not sure if it makes a difference which is chosen, but thought to ask before purchasing.

Reply

Matthew Woodward Reply:

Hi,

Any location will be fine :) I tend to buy the ones closest to me though.

Reply

Tom Reply:

Hi Matt, have a bit of a problem with the UD proxies as well. I have used the format that you mentioned above, but as I am using an email directly through HostGator (SSL/TLS) the second part of that fails, see the format below

user:pass@xxx.xxx.xxx.xxx:xxxx
“info@websitename.com”:pass@xxx.xxx.xxx.xxx:xxxx

This is the format used on the UD video, as it was recommended; Gmail and Hotmail do not seem to be the best way to distribute articles so I am trying to use a cPanel email.

Any ideas welcome….

Tom

Reply

Matthew Woodward Reply:

Hi,

I think you're getting confused.

The user and pass is for the PROXY login, IF your proxies require it.

It has nothing to do with your email address – this is a separate login.

Reply
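The two accepted proxy formats from the reply above can be checked mechanically before pasting a list into UD. A quick hypothetical validator that accepts `ip:port` and `user:pass@ip:port`, and rejects lines where an email address has been mistaken for the proxy login, which is exactly the confusion in this thread:

```python
import re

# Matches the two formats from the reply above:
#   xxx.xxx.xxx.xxx:xxxx            (open proxy)
#   user:pass@xxx.xxx.xxx.xxx:xxxx  (authenticated proxy)
PROXY_RE = re.compile(
    r"^(?:(?P<user>[^:@\s]+):(?P<password>[^@\s]+)@)?"
    r"(?P<ip>\d{1,3}(?:\.\d{1,3}){3}):(?P<port>\d{1,5})$"
)

def parse_proxy(line: str):
    """Return a dict of proxy parts, or None if the line is malformed."""
    match = PROXY_RE.match(line.strip())
    return match.groupdict() if match else None

print(parse_proxy("203.0.113.7:8080"))
print(parse_proxy("bob:secret@203.0.113.7:8080"))
print(parse_proxy("info@site.com:pass@203.0.113.7:8080"))  # rejected: email, not a proxy login
```

Running a proxy list through something like this before importing it catches the email-as-login mistake immediately instead of after a failed campaign.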

Harjinder Gill
11.2.2012

Hey Matt,

Awesome videos. I really like how you have broken down the whole backlinking process. Your videos have helped me to improve my existing backlinking strategy :)

Quick question for you: do you outsource the spinning part (it's really boring :))? If yes, could you please share it with us?

Reply

Matthew Woodward Reply:

Hi,

Glad they have helped you out!

Yes, I outsource the spinning; I employ people directly solely for that task.

Reply

11.6.2012

Hey Matt getting everything set up right now and transferring the seo software to a VPS. I am checking out the proxies, what package do you use for your seo services? I am sure you do way more than me but just want to make sure I am covered when scraping content. Thanks buddy

Reply

Matthew Woodward Reply:

I have 50 semi-dedicated proxies; if you're only doing 1 or 2 sites you'll be fine with 10 though.

Reply

Alexhs Alex
11.30.2012

Awesome! Thanks for giving all this info for free. By the way, will you make a tutorial to explain how to make SENuke templates? Or do you know any good link that will explain how to do this? I googled but didn't find anything good.

Reply

Matthew Woodward Reply:

Hi,

No worries – say thanks by sharing my tutorials on forums and things :)

Do you mean the wizard linking templates?

Don't use the wizard, it's awful: very very basic, limits you to 3 accounts, lacks a lot of control. Learn to use it without the wizard, otherwise you're wasting your money.

Reply

Alexhs Alex Reply:

Thank you very much! My problem is that I don't know where tier backlinks should point to, e.g. should web 2.0 profiles link to the articles, the money site or something else? I wanted to know if there is a detailed guide on what to link to what

Reply

12.6.2012

where the list of your scrping?

Reply

Matthew Woodward Reply:

what do you mean?

Reply

12.19.2012

Hi Matt,

I purchased Scrapebox and was shocked at the huge list of keywords it has uncovered, and having spun my articles now at SEO Generals I am about to purchase Ultimate Demon and have a couple of questions.

Normally I use Market Samurai to identify keywords and select 20 for the website articles. So with the list that Scrapebox comes back with, does that mean we use the Scrapebox keywords linking back to the articles on the website, OR can the Scrapebox keywords be used for the website articles as well?

The reason I ask is that many of the 1000 keywords that Scrapebox has come back with are very long tail keywords with very few searches, so I am not sure how it all fits together?

Tom

Reply

12.26.2012

Hey, why can’t I see your lists?
I went over to the blog, liked, +1’d and tweeted, and nothing happened…
Where can I see them?

Great videos by the way :)
Thanks!!

Reply

Kris
12.27.2012

Hey Matt,

I’ve been in this business a while and still found the easy way you lay all this out very helpful. One question about spinning, I saw above that you outsource yours.

From what I have found, deep spinning services run around $30 per 500 words + the article cost. Once you have all that content you still have to spin in links, images and video. You also have to generate all of your titles, descriptions, tags, keywords, etc.

Seems to me that the content generation side of this either costs well over $100 or takes several days to really do right yourself. Even if you buy the articles you likely have to do the descriptions etc. yourself, which combined with the image/video/link insertion, takes several hours. Once you have all of the content it looks to me as though it will take an hour + to set up the campaign in UD.

I guess in the end my question is, does that all sound about right? Obviously if you can get a good oDesk worker to do this at $2-$5/hr you could lower your costs.

The videos do a great job of showing how to set up and run the campaign but it looks like most of the work/cost is on the content side and that is a very involved process. I would think this is a $399+ service just for tier 1, lol.

I am looking to run this whole process on probably around a post a week on just one of my sites, let alone other ventures. Seems to me you really need full time staff if you are working on much more than 1 simple site?

I am getting ready to dive in and set up systems to get this done, just wanted to make sure I am not missing something that would make the content side a lot easier.

Thanks not only for these videos but for this whole blog, it's a great case study!

Reply

12.28.2012

Great video tuts on Ultimate Demon. So far I have watched part 2 and will continue with the next video.

Some questions here:
Can I use VPN instead of proxy?
How many links in tier-1 for medium competition keyword?
How to check deindex sites using ScrapeBox? Doyou have video tuts?

I’d like to have your list but can’t download it even after tweeting it twice. How to download your list? Are you going to email it?

Thanks

Reply

roman Reply:

Would be interesting to know that, or does it come in a later video?

Reply

Diaz
12.31.2012

Hello,

We cannot download the list after liking on Facebook, why?

Thanks

Reply

Matthew Woodward Reply:

Hi,

Sorry had issues with the script it is fixed now :)

Reply

Gordon McLennan Reply:

Still no links for me either Matthew….

Reply

Matthew Woodward Reply:

Hi,

Just retested and definitely working. Please clear your caches; also, you will need to reshare for it to work. If you completed the share when it was broken then the links won't appear.

Thanks

Reply

alex
1.1.2013

I tweeted the link but I cannot find the Merge Lists, Footprints & links.
Weird

Reply

Matthew Woodward Reply:

Hi,

Just retested and definitely working. Please clear your caches; also, you will need to reshare for it to work. If you completed the share when it was broken then the links won't appear.

Thanks

Reply

1.1.2013

Are there still script issues with the downloads? After “like” I see no change?

Really great stuff by the way! Been struggling with UD for several days until I stumbled upon you videos.

Reply

Matthew Woodward Reply:

Hi,

Just retested and it's definitely working. Please clear your caches; you will also need to reshare for it to work. If you completed the share when it was broken then the links won't appear.

Thanks

Reply

1.2.2013

This design is spectacular! You certainly know how to keep a reader amused. Between your wit and your videos, I was almost moved to start my own blog (well, almost… HaHa!) Excellent job. I really enjoyed what you had to say, and more than that, how you presented it. Too cool!

Reply

Matthew Woodward Reply:

Thanks very much glad you enjoyed them :)

What's stopping you from starting your own blog?!

Reply

Al
1.2.2013

Hi Matthew, great videos – really straightforward.
Tweeted (3x) to get access to the Personal Targets, Merge Lists, Footprints & Videos
But not seeing a place to download –
Guessing I’m missing something obvious…
Any help would be greatly appreciated.
Thnx,
A

Reply

Matthew Woodward Reply:

Apologies for the problems, I really buggered up the script during speed optimisation.

I have sent you an email with the direct link!

Reply

Al Reply:

Got 'em and just finished the vids – really awesome – cleared up a ton.
One question – I think I missed something obvious or had a brain freeze…
Why the 3 articles?
Is each for a different tier?

Sidenote: I'm a bit paranoid about swiping images, but I was thinking one could take the same image file, make multiple copies of it, then add them to Name Mangler (free, I think) – software that will bulk change the file names to your keywords but retain the original extension if you choose – then jump back to what you've laid out.
2 cents(pence) from across the pond I suppose…

Anyway, thanks so much!
A

Reply

Matthew Woodward Reply:

Sorry, the 3 articles are for the various tasks. Typically I tend to use 1 article for the web 2.0s, 1 article for the wikis, then the other article for everything else, but mix it up a bit as you go :)

I wouldn't use the same image with a different file name; given you can search Google by images and not just words, I'm guessing they can detect duplicate images more easily than they can detect duplicate text-based content.

Cheers man!

Reply

1.3.2013

Hey Matty. Great tutorials mate. Just a quick question:

I'm using The Best Spinner, spinning 5 paragraphs and 5 sentences and getting over 90% unique (without photos/vids) when doing a publish and compare with 50 articles. Is there any need to do a word spin if I'm getting that high a uniqueness rate with paragraph and sentence spinning already?
I won't be using any spun content on my 1st tier or connected to my money site.

Reply

Matthew Woodward Reply:

Hi Vance,

Personally I would go the whole hog – even if I was only going to use it those 50 times I would still go through and do the words. While that percentage is an indicator of uniqueness, it isn't a stretch to say that Google may pick up on duplicate sentences across 50 articles.

Remember you should be building out your campaigns for the future, not just what works in the here and now!

Reply
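For anyone unsure what paragraph/sentence/word spinning actually produces: it's nested spintax, i.e. `{option|option}` groups inside other groups. This isn't from the videos – just a minimal resolver sketch to show how one variation gets picked per group (the sample sentence is made up):

```python
import random
import re

def spin(text, rng=random):
    """Resolve spintax like {a|b|c} by picking one option per group.
    Innermost groups are matched first (the pattern can't contain
    braces), so nested paragraph/sentence/word spins work too."""
    pattern = re.compile(r"\{([^{}]*)\}")
    while True:
        match = pattern.search(text)
        if match is None:
            return text
        choice = rng.choice(match.group(1).split("|"))
        text = text[:match.start()] + choice + text[match.end():]

# A hypothetical sentence spun at word level - each call yields one variation.
spun = "{Tiered|Layered} link building {works|performs} well."
print(spin(spun))
```

Each extra alternative multiplies the number of possible variations, which is why word-level spinning on top of sentence spinning pushes uniqueness so much higher.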

Marc-André Larivière
1.3.2013

Hi Matthew,

First of all, nice videos! One of my resolutions for 2013 is to get more involved in IM and SEO in general. I found your blog yesterday through BHW and THANKS for all the information. I really enjoy your transparency.

Secondly, like other people, I have shared your post over Twitter/Facebook and no link…

Thirdly, I wish you a happy new year and all the best!

Reply

Matthew Woodward Reply:

Nice to see a few more BHW’ers over here :)

Glad you like the blog; let me apologise for the problems with the share script. It works fine when I'm logged in as admin, my host is looking into it now!

I have emailed you the resources in the meantime :)

Good luck with your ventures throughout the new year!

Reply

Davey
1.5.2013

Matthew, great tutorial series (found it on BHW)! I'm on my second time through. I've got at least 3 large projects I will be applying these techniques to. Now for the cash outlay to purchase the rest of the tools I don't already have… I'll be purchasing through your links.

How are you viewing social signals in the mix these days? Seems the Big G is emphasizing them more and more.

I also tweeted your video out but did not see a download button.

Thanks.

Reply

Matthew Woodward Reply:

That's very kind of you, thank you, and I'm glad you've enjoyed the series!

Social signals are actually covered in video 6 :) They are supplementary tactics that you can use to bolster your efforts but I do think the future of link building will be social based. Essentially a ‘link’ is one person saying they approve of something else which is exactly what social sharing is by definition.

So I think we will be moving to a more socially driven search in the coming year or two. I’ve been giving my key tier 1 properties social signals for a while now in anticipation of that shift.

The script is now fixed but I emailed you the resources anyway :)

Reply

royalmice
1.7.2013

The 7daysale coupon does not work for SEOGenerals anymore – do you maybe have another coupon code?

Thanks

Reply

Matthew Woodward Reply:

Ahhh yes, they have actually upgraded the site since then – they now offer different levels of writing, but at the same time reduced their prices to match what the code used to give ^^

Reply

1.22.2013

Thank you! You must have spent so many hours fine tuning these tiered link campaigns, and then you just share them step by step… who does that?!

I do have a quick question, and not to sound greedy, but for the targets, footprints, and merge lists – are these still relevant? Especially after your videos with more people using them?

Reply

Matthew Woodward Reply:

Hi,

Well, my link building process has evolved over the years. I usually have 1 main process along with 3-4 variations which I use to refine the main process further; it's constantly evolving :)

You can use the footprints, merge list and your own list of keywords to create your own target list. But I guarantee my personal target list has been used fewer times than any that get posted on forums etc.

Reply
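The "create your own target list" step above is essentially a cross product: every footprint is paired with every keyword to form the queries Scrapebox harvests. A minimal sketch, with made-up example data (the real lists come from the downloadable footprint/merge files and your own keyword research):

```python
from itertools import product

# Hypothetical stand-ins for the footprint file and your keyword list.
footprints = ['"powered by wordpress"', "inurl:wiki"]
keywords = ["dog training", "puppy obedience"]

# Each search query is a footprint paired with a keyword, ready to paste
# into Scrapebox's keyword box for harvesting.
queries = [f"{fp} {kw}" for fp, kw in product(footprints, keywords)]
print(len(queries))  # 2 footprints x 2 keywords = 4 queries
```

This is also why query counts balloon so quickly: a few hundred footprints merged with a modest keyword list easily produces hundreds of thousands of queries.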

Ayumi
1.23.2013

Hi,

I used your two files and a list of keywords I made up. I got a list of 300k keywords, Scrapebox says. I clicked on harvest like in your video, removed duplicate URLs and saved as a text file… Looking at the URL list, the results seem very poor and low PR. Would it be better just to download a high 4-8 PR list of article directories, web 2.0s, bookmarking sites etc., or have I missed the point? Look forward to your answer. Love the series – think you need to do one on the new Link Wheel model as well… Thanks Matt, keep up the good work!

Reply

Matthew Woodward Reply:

Hi,

Just import the list into UD and see what you get from there – you will always get more lower PR sites than higher PR sites, simply because there are fewer high PR sites in the world.

It would not be better just to download an existing list that has been downloaded and used by every man and his dog – you could do that AS WELL AS scraping your own target list, but not instead of.

I'll be doing some things with the new LWB don't worry :) Can't say what yet though ;)

Reply

1.29.2013

Hey Matthew,

Just a thought.

Couldn’t you just check the Google index status in Scrapebox before adding to Ultimate Demon?

And save the step of exporting from UD and using mass deletion…

Reply

Matthew Woodward Reply:

Hi,

Really the choice is: do an index check on your entire scrape, which can often run into millions of URLs, or do an index check on just the URLs Ultimate Demon can recognise & post to.

One method is more time/resource efficient than the other :P

Reply

I can’t replicate that here – they are just YouTube videos =\

Reply

2.1.2013

Hey Matt,
Just a heads up – I don't know why this is happening, but Scrapebox is misreporting indexed links as not indexed. I would manually check some of the output links reported as de-indexed before you remove them from your database.

I've got a ticket in with SB support now, but searches online show people complaining about this as far back as 2011.

Reply

Matthew Woodward Reply:

Hi,

This could be happening for a number of reasons, what happens if you take a sample of 15 URLS and do the index check without proxies?

Reply

Lex Reply:

Hi Matt,

The outcome is identical without proxies.

It turns out that Scrapebox uses the ‘info’ operator instead of the ‘site’ operator and not all indexed sites return a result using the info.

They say ‘info’ has always been a better indicator than ‘site’ but the fact that ‘info’ ignores indexed sites that ‘site’ clearly shows as Google indexed, suggests otherwise to me.

Am I missing something?

Reply

Matthew Woodward Reply:

Well unless you want to go and check each one manually that is pretty much your only option.

I know that http://www.matthewwoodward.co.uk/reviews/inspyder-backlink-checker-tool-review/ uses its own unique way to check indexed status which seems pretty reliable but I’m not allowed to say how they do that :(

I think when you're checking this many links en masse you're going to get issues like this.

The only thing you could do in Scrapebox is use the site: command itself to return the list of URLs, then do a lookup in Excel to determine indexed status.

Reply
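The Excel lookup described above (matching your link list against the URLs returned by site: searches) can also be scripted. A rough sketch with hypothetical URLs, assuming you have both lists exported as plain URL lists; it normalizes case and trailing slashes so near-identical URLs still match:

```python
from urllib.parse import urlparse

def normalize(url):
    """Lowercase the host and strip trailing slashes so that
    http://Example.com/page/ matches http://example.com/page."""
    p = urlparse(url)
    return f"{p.scheme}://{p.netloc.lower()}{p.path.rstrip('/')}"

def indexed_status(targets, site_results):
    """Mark each target URL as indexed if it appears in the
    harvested site: search results (the Excel-lookup step)."""
    indexed = {normalize(u) for u in site_results}
    return {u: normalize(u) in indexed for u in targets}

# Hypothetical data standing in for two Scrapebox exports.
targets = ["http://example.com/page", "http://example.com/gone"]
harvested = ["http://EXAMPLE.com/page/"]
print(indexed_status(targets, harvested))
```

Anything flagged False is worth a manual spot check before deleting, for exactly the reason raised in the comment: bulk index checks can misreport.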

Yohan
2.7.2013

Hi Matthew,
What proxy package do you use / would you recommend to use in scrapebox for creating my target lists?
Thanks so much!
Yohan

Reply

Matthew Woodward Reply:

Hi,

Either 10 or 20 semi dedicated proxies would do you fine.

Reply

Yohan Reply:

Awesome, thanks so much!

Reply

Matthew Woodward Reply:

No worries

Reply

Tom
2.10.2013

Hi All,

Has anyone had any problems buying Ultimate Demon, I enter in my paypal details and a screen pops up saying please return to merchant, “Invalid Order Action” is what is flagged.

Anyone else having this problem?

Tom

Reply

Matthew Woodward Reply:

Hi,

Try a different payment method? Sounds like a Paypal issue.

Reply

Tom Reply:

Tried repeatedly on PayPal, still not working, I don’t have a credit card either, only debit.

Frustrating to say the least.

Reply

Tom Reply:

Solved, had a proxy left on and PayPal didn’t like the IP.

Reply

Matthew Woodward Reply:

Hahaha, worn that t-shirt a few times myself =D

Reply

Matthew Woodward Reply:

You can use a debit card on the normal checkout

Reply

2.20.2013

Matthew, you are awesome. A lot of what you offer in the way of tutorials is better than paid memberships I have participated in. It's amazing! I know you plug a few products, but I believe your promotion to be of a helping nature, well founded in your experience. I was on the fence about UD but I know I will now purchase it, as well as Scrapebox, through your links. I just hope there are some tutorials offered by the developers of those programs to help tie everything together. Will be spending a lot of time with you here my friend. I may need to watch some of these a couple of times for it to completely register, lol. Appreciate it more than you know. THANKS!!!!

Reply

Matthew Woodward Reply:

Hi,

Thank you very much, glad you're enjoying the tutorials. I had to make the decision of releasing them as a product/WSO or just giving them away for free; I actually felt like putting a price on them and associating them with WSOs would devalue them, so I just gave them away for free =D

My tutorials will tie it all together for you, no worries – and thank you for buying through my links, much appreciated!

Reply

Tom
2.21.2013

Matt Hi

Using Scrapebox for the first time following your video and have a small problem.

I produced a list of keywords in Scrapebox as suggested in video 2, then put this into Scrapebox again to produce a bigger list, which came back with over 1,000,000. However, the harvest has now been running for 2 days and is still going – should I abort and start again using the first list of keywords Scrapebox generated?

Reply

Tom Reply:

Update: when I mentioned 1,000,000 above, this was the total queries after the merge files were added as indicated in the video. Perhaps I should have used the original list of 30 scraped keywords rather than the 163 that came back after scraping the list a second time?

Your advice welcome…

Reply

Matthew Woodward Reply:

Hi Tom,

To be honest the more you can scrape the better

Reply

Matthew Woodward Reply:

Hi Tom,

My recommendation is once it harvests 500,000 URLs or so, stop the harvest. In the window that pops up in the bottom right corner you can choose to keep the uncompleted keywords.

Then export that list and kick off the harvest again.

Hope that helps

Reply

Sam
2.25.2013

Hi Matt,

I’ve been looking for the link to download the sample files (Personal Targets, Merge Lists, Footprints & Videos), but I can’t seem to find it anywhere. Am I missing something?

Thanks,
Sam

Reply

Matthew Woodward Reply:

Hi,

Yes, you're not using your eyes ^^

Reply

3.6.2013

Twitter @tinozito

Hi Matt, I've been retweeting your vids – I really appreciate the priceless content!!

My question is: I harvested a list of 1,000,000+ URLs with Scrapebox using your footprints, merge list and my keywords. Even after removing duplicates I still end up with a big list of ~200,000 URLs…

Then in UD I am supposed to add these URLs to the site detector like you mention at the end of the video (UD recommends keeping it below 10,000 items)… so I tried with 10,000 items; it took about 45 minutes and stopped at 99% with only 10-15 successes detected…

So what do you recommend – should I paste in my huge list of 200,000 URLs to get the most out of it?

Reply

Matthew Woodward Reply:

Hi,

Yes, ignore what UD says and just whack the whole list in. Then run the list again with the 'site is in a subfolder' tickbox ticked :)

Reply

tinozito Reply:

Hi Matt,

Thanks for your answer, I really appreciate that we can actually communicate with you…

I scraped probably 2 million URLs using your footprints and merge list. I had 2 lists of 300k URLs to check with the UD site detector, and I checked those lists with the tickbox as well… Then at the end I got a target list of only 1800 sites!!! (most of them forums)

So I ran an index check in Scrapebox – only 3 are indexed…

So is something wrong with the scraping part…?

Can I use your target list? (But then what would be the point of scraping your own list of targets…) I'm kinda frustrated right now :/

Thanks matt

Reply

Matthew Woodward Reply:

Hi,

No, sometimes you'll get lots of new sites, sometimes not many and sometimes none – that's just how it is. Scraping your site list and refining it is a continuous process that should be repeated monthly at a minimum.

Reply

3.12.2013

Thank you Matt for these videos. I am on the third round of watching them and each time I seem to pick up another point that I need to address. :)

I am now ready to get going with my first campaign. I have downloaded your target list but am unsure as to what sections to upload them into within UD. You have broken them down in what appears to be a sub-criteria of the way UD splits them up.

Please can you indicate which type of site each of your folders relate to.

Ian

Reply

Matthew Woodward Reply:

Hi Ian,

Hahaha yeah its a lot to take in at once :)

Just paste the whole list in, no need to split it up – I just did that for reference purposes!

Reply

Ayte
3.12.2013

Matt, can you please make a list scraping series? I tried to scrape those footprints, trimmed the URLs to root, removed duplicate URLs and entered them into Ultimate Demon. I got fewer than 500 sites (5k unique domains) to go, most with very low PR.

Reply

Matthew Woodward Reply:

Hi,

Yeah, that sounds about right to me – just merge in more words and keep scraping. It's a never ending process.

I don't think it warrants a dedicated tutorial as it's already covered in detail, but I suppose it is focused on the one task of scraping.

Reply
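The trim-to-root and de-duplicate step mentioned above is a single button in Scrapebox, but for anyone post-processing an exported URL list by hand, a rough equivalent looks like this (the URLs are hypothetical):

```python
from urllib.parse import urlparse

def trim_to_root(urls):
    """Trim each harvested URL to its root (scheme://domain/) and
    de-duplicate, keeping first-seen order - the clean-up done in
    Scrapebox before loading the list into Ultimate Demon."""
    seen, roots = set(), []
    for url in urls:
        parsed = urlparse(url)
        root = f"{parsed.scheme}://{parsed.netloc}/"
        if root not in seen:
            seen.add(root)
            roots.append(root)
    return roots

print(trim_to_root([
    "http://example.com/blog/post-1",
    "http://example.com/about",
    "https://other.example.org/wiki/page",
]))
```

The large drop from raw URLs to unique domains reported in the comment is expected: most harvests contain many pages per site, so de-duplicating to roots shrinks the list dramatically before UD's site detector filters it further.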

3.13.2013

Hey Matthew – Thanks for the tweet. I can appreciate the time and effort put into the videos.

This is why most people are not fond of SEO. It’s centered around beating the system, especially these advanced methods. It’s centered around beating Google.

The problem is Google is getting smart, and their search technology is evolving. There are some clues out there that links are becoming less important to SERPs. I wonder if in the not too distant future all of this work that you suggest will be worthless.

Spinning, scraping, and link building are not foreign to Google. They've learned that great content is not at the end of this rainbow. These techniques are tried and true, very powerful over time. However, anyone using these techniques needs to be careful and realize that the roof can cave in at any moment.

Reply

Matthew Woodward Reply:

Hi,

No worries :)

Yes, Google is at long last getting smarter – a lot of the updates in the past few years haven't really changed anything (use HQ content, diversify keywords; standard advice for years), but with the introduction of AuthorRank and social signals this year it will be an interesting time :)

But that is why the tiered system is so good: as long as you ensure you have the ability to remove any tier 1 links, you can instantly disconnect your site from any links Google may decide are troublesome in the future that aren't now.

It's all about risk management :)

Reply

hydride
3.14.2013

Another user has already asked this question, but it was never answered, so what are your thoughts about using VPN over proxy?

Reply

Matthew Woodward Reply:

Hi,

Get proxies, because the software you are using will be able to rotate IP addresses as and when it needs to, which you can't do with a VPN (it's either on a timer or manual).

Reply

hydride Reply:

Thanks!

Reply

Arran
3.14.2013

Matt… I followed your videos through and now I have a two week wait. So I've started on a second site… and followed videos 1-3 all the way through again… when I go to each project in UD, the sites it's got in the database that I scraped for each project are exactly the same according to UD… is this right?

I assumed with the different keywords in Scrapebox it would have found different sites?

Reply

Matthew Woodward Reply:

Hi,

The site list in UD is a 'master site list' that you use across all projects. You can add sites just to specific projects if you want, but that's not how I run my ship :)

Reply

Arran Reply:

Nice, thanks for the quick reply.

Using this method, and I know it’s not easy to predict, but how long do you think it would be before a new site started ranking for easy and medium keywords?

Reply

Matthew Woodward Reply:

Hi,

I wouldn’t build links to a new site

Reply

Lucas Reply:

Matthew, when you say you “wouldn’t build links to a new site”, do you mean to say that your tiered link building method should not be used on a newly created website/domain? Do we have to wait for a site to be of a certain age or have a certain amount of existing authority or traffic before we can use this method on it? Please elaborate as I would really like to know whether I can use this on a new website.

Reply

Matthew Woodward Reply:

Hi,

I focus on marketing in the early stages of a site rather than SEO.

Reply

JJ
3.14.2013

I have some questions, things I need to clear up:

- Am I supposed to rewrite 15 titles for all three articles so I have 45 titles in total?
- And then spin all 3 articles using your Best Spinner tutorial?

I also wanted to know your opinion on article writing.

Reply

Matthew Woodward Reply:

Hi,

1) Yes
2) Yes

About article writing? Outsource it ^^

Reply

JJ Reply:

I meant to say I wanted to know your opinion of iwriter.com, and whether you'd recommend using it?

Reply

Matthew Woodward Reply:

Hi,

I haven’t used them sorry =\

Reply

hydride Reply:

I've used them. The only thing I like about iWriter is the fast turnaround. The content is unique too, but don't expect anything life changing. I did use them for quite a while but stopped because the quality isn't what I wanted.

Reply

3.15.2013

It seems that the link for Scrapebox does not work. I have tried for a second day to purchase the software but it would not load the page. Any ideas would be appreciated.

Reply

Matthew Woodward Reply:

Hi,

Sorry that is out of my control, give the SB team a shout and they’ll help you out!

Reply

Skip
3.18.2013

Hi Matthew,

Thanks for all of your hard work with the videos and the transcripts are a NICE TOUCH.

Can you tell me where the link is to your ping server list or,

“Secondly we need to setup the ping servers, again paste your list into here. If you don’t have a list you can download mine underneath this video.”

can you provide instructions how to get this list?

Many thanks,
Skip

Reply

Matthew Woodward Reply:

Hi Skip,

Glad you like the transcripts, but I despise creating them with every brain cell I possess ^^

What you're looking for is underneath the video – you're not looking hard enough :)

Reply

Daz
3.31.2013

Hi Matt

In your video you said write 3 articles. Do you mean write 3 articles and then 2 alternative sentences for each article,

or do you mean write 1 article and 2 alternatives for each sentence, which would basically make 3 articles if they were not spun together?

Thanks

Reply

Matthew Woodward Reply:

Hi,

3 separate articles and each sentence in each article has 2 alternatives :)

Check out the advanced spinning tutorial for much more detail

Reply

Gideon
4.1.2013

Hey Matthew,

Thanks for all the awesome tutorials. I tweeted and liked the video, but no links to download your lists. I scanned this page up and down to make sure I didn’t miss it, but I still don’t see it :(
Could you help me out?

Reply

Matthew Woodward Reply:

Hi Gideon,

I’m away from home until the 8th at the moment but drop me an email with what you need and I’ll send them to you directly when I return.

Not sure why it didn't work; it's on the list to test when I get back, thanks.

Reply

Diego
4.4.2013

Hi Matthew, Thank you so much for the tutorials, they are really a gold mine.

I have some questions about the tutorial:

1 – I see you answered in a comment that you don't recommend this approach for a new site. When is the right time to use this technique? What do you have to do before?

2 – Could this help to recover a site with a Penguin penalty?

3 – I run some sites in Spanish. Do you know if writing the articles in English and changing the keywords to the Spanish ones I want to rank for would work?

Thanks!

Reply

Jason
4.5.2013

Hi Matthew, I love the tutorials. I am only starting out trying SEO; I usually outsource it to someone else. I am going to try to put your tutorials into action with a site I am working on at the minute. It is an OpenCart site. I have one very basic question before I start:
Should I submit the site to Google through Webmaster Tools or just jump straight into what you are doing in the tutorials?

Reply

Matthew Woodward Reply:

Hi,

Yes you should!

Reply

4.8.2013

Hi Matt,

YOU rock.
I have many questions, but here is one:

Preparing for the campaign, you said I need 3 articles, and then I must write 2 alternatives for each sentence of each of the 3 articles… but why not actually buy 9 unique articles? It's faster. Or maybe I'm not getting this right…?

thanks

Reply

CBETZ
4.8.2013

Hey Matt, I'm having a hell of a time getting these lists of targets imported into my Ultimate Demon software. On the first run only 14 stuck, and 23,000 could not be downloaded or were not found. Do you have any tips here? Am I doing something wrong?

Thanks!

Carl

Reply

Matthew Woodward Reply:

Hi,

Re-run it again with the 'my site is stored in a subfolder' box ticked and you're done! I assume you have selected all from the drop down?

Reply

Andy
4.11.2013

For the social bookmark, web directory and video titles, do we need 3 titles and 3 descriptions for each, or one for each? Also, can you give an example of what makes a good title and description for each one?

Reply

Matthew Woodward Reply:

Hi,

Just one set will do, but don’t be scared to do more – the more the better!

Just look at some bookmarking sites

Reply

Tj
4.12.2013

wauw thanks!

I've been looking for a long while for this info. I didn't know for sure what a tiered link was, and besides that I'd lost track of what to do with all those SEO tools.

Thanks for the great quality content you've made!

Reply

Matthew Woodward Reply:

No worries, glad it has helped you out!

Reply

Wildcat
4.25.2013

I’ve shared the video but I don’t see the link to download the merge list etc?

Reply

Matthew Woodward Reply:

Hi,

Just tried it with Twitter & Facebook – the share buttons get replaced with the links.

Reply

Wildcat Reply:

Thanks Matthew,

It was an extension in Chrome that was blocking this..

Reply

Matthew Woodward Reply:

Hi,

Can you tell me which extension please?

Reply

Wildcat Reply:

Hi,

Sure no problem. It was “Do Not Track Me” which is produced by Abine and they also have an extension called “Mask Me”. When I disabled “Do not track me” the links worked.

Hope this helps if anyone else runs into the same issue.

Thanks again for a very informative video.

Reply

Matthew Woodward Reply:

Hi,

Ahhh yes that blocks cookies and the plugin uses cookies to know if you shared or not :)

Reply

Walter
5.12.2013

Hey Matt,

Do you know if putting the keyword in the title of the backlink source (creating a spun article with target keyword in the title) helps with ranking for that keyword?

Thanks.

Reply

Matthew Woodward Reply:

Hi,

It will help a little yes but more so in Bing

Reply

5.19.2013

Hey Matt

Great tutorials by the way. Just a quick question – The Best Spinner site seems to be down at the moment. Are there any recommended alternatives I can spin the articles with?

Thanks

Matt

Reply

Matthew Woodward Reply:

Hi,

Looks ok from here? Check it with http://www.downforeveryoneorjustme.com/

The Best Spinner is actually the best one ^^

Reply

5.21.2013

Can’t find merge files any where?
Please help.
Great vid’s!!!

Reply

Matthew Woodward Reply:

Have you tried the age old technology of reading?

Reply

Chris
5.22.2013

Great tutorial! Very well put together, and I would agree with the previous comments that there isn't anyone, anywhere on the web, publishing content as invaluable as this. My question for you Matthew is…

When uploading your target sites from Scrapebox into UD, what's confirming that the sites being uploaded are valid web 2.0s, article directories, social bookmarks, etc.? And if UD has no problem sorting them out, should we use semantically related keywords to further scrape for additional target sites in Scrapebox, for overlooked target sites that weren't picked up from the original set of keywords?

Reply

Matthew Woodward Reply:

Hi Chris,

Thanks very much!

The UD site detector takes care of checking they are valid and what type they are.

Yes you can use related keywords if you wish! Never stop scraping, always more to be found!

Reply

Moody
5.24.2013

Wow, what a site. This video tutorial is superb. I know you said it took you 8 hours to create it, but let me tell you that you have easily saved me a month or more. I have been spending the week hunting through poorly written, garbage descriptions and advertisements to figure out some Multi-Tier marketing best practices.

Your site rocks, your videos rock, and I will be viewing all of them tonight.

Reply

Matthew Woodward Reply:

Thanks – glad it has helped you out!

Reply

Moody
5.24.2013

Hi, like I said earlier, fantastic resource. I am excited about using Scrapebox with the keywords and footprints you created, only I cannot locate them. It said on this page that if you tweet you will get those resources. I did that, but for some reason I am unable to download them. Please help.

Reply

Matthew Woodward Reply:

Drop me an email and I’ll get them to you :)

Reply

Nick Thomson
6.1.2013

Hey Matthew killer vids so far my dude I’m on #3 now. Quick question.

What determines the amount of private proxies I should need? I have the general concept of them and was reading that a 20,000 URL blast a day with 10 would be good, but can you give me the rundown on what determines the amount I would need?

Thanks again man!

Reply

Matthew Woodward Reply:

Hi,

Just get 10 semi dedicated from BuyProxies and you’ll be fine! When blasting comments, scrape a bunch of public ones as well.

Reply

6.3.2013

Hi,

How to download your personal target lists? I can’t find any download link on this page.

Regards,

David

Reply

Matthew Woodward Reply:

Read harder :)

Reply

SylvainZ
6.6.2013

Awesome tutorial Matthew,

I'm working on my backlink strategy and your tutorials help me a lot.
But I'm French and I would like to know if writing a post on sites like EzineArticles in English has the same effect, or do I need to find some other French sites?
I can find them but they have less authority.

Sylvain

Reply

Matthew Woodward Reply:

Hi,

I don't have much hands-on experience here, but from what I can gather it doesn't make much difference now – though I wouldn't be surprised if it did in the future.

Reply

Tim
6.12.2013

Hi Matthew, I am new to this so please bear with me. While going through your tiered link videos I lost track a little. What threw me was how to do tier 1 in SEnuke, because the SEnuke video is an addition and not core to the tiered linking videos. Once you have 3 unique articles, is it these that I should spin by following your expert spinning video, and have them all linking direct to the money site? Is that safe in your opinion?
And then link wheel tiers 2 and 3 as per your video series?

Reply

Matthew Woodward Reply:

Yes that is right – just substitute video 3 in the series with the nuke tutorial – but then see the advanced spinning tutorial for a more detailed look at how to prepare the content as seen in video 2.

Reply

JJ
6.14.2013

When you are setting up your tier 1 campaign, is there a maximum number of links you aim for overall for your money site? Or is the number of tier 1 links irrelevant?

Reply

Matthew Woodward Reply:

Hi,

This is covered in video 3 :)

Reply

Nicolas
6.20.2013

Hi Matthew,
In video 1 we get the huge list of sites from Scrapebox. Is it this list you paste into UD in video 2? Not so sure, because those won't be tier 1 quality links – or did I miss a beat?
Thanks
Nicolas

Reply

Matthew Woodward Reply:

Hi,

You mean when we are scraping our target list to import into UD?

Reply

6.21.2013

Hi Matthew,
May I know how many proxies I need for tiered linking? There are a few options to pick from at buyproxies.org.

Reply

Matthew Woodward Reply:

Hi,

10 semi dedicateds will do you sir!

Reply

6.21.2013

I was just wondering: I prepared my tier 1 with SEnuke XCr social network links, some PDFs and some high PR wikis, then created a campaign in GSA Search Engine Ranker later that night. 7 hours later I have submitted around 400 tier 2 links with GSA, but only around 20 are LIVE at the moment – is that normal?

What is your experience ?

Thanks and Regards

Reply

Matthew Woodward Reply:

Hi,

That's normal, it only verifies the links every x hours or so.

You shouldn't be building tiers 1 and 2 in the same day though.

Reply

6.26.2013

I am extremely impressed with your writing skills and also with the layout of your weblog. Is this a paid theme or did you customize it yourself?

Either way keep up the nice quality writing, it's rare to see a great blog like this one today.

Reply

Matthew Woodward Reply:

Thanks =D

Reply

7.6.2013

Hey Matt,

Thanks for all of the awesome videos. Extreme newbie here just starting out. One step is confusing me. After you scrape your URLs and export them as text you import them into UD. You then export them back to Scrapebox to check the index status of the URLs. Why would you not just check the index status of the URLs straight after you've scraped them in Scrapebox?

Reply

Matthew Woodward Reply:

Hi,

Because when you do the first load into UD, only a very small percentage of them will be added to the main UD database. So instead of doing an index check for ~50,000 URLs, you only need to check a few hundred.

Reply

Flash Memory Usb
7.10.2013

Hi there! I could have sworn I’ve been to this website before but after browsing through some of the post I realized it’s new to me.
Anyways, I’m definitely happy I found it and I’ll be book-marking and checking
back often!

Reply

Matthew Woodward Reply:

Thanks – let me know if you have any questions!

Reply

7.11.2013

Hey Matthew, Gerry here from Ireland. I've been following you for a while now, and I can tell from your writing and your tutorials that you have vast experience and a no-BS approach. Is there another way to contact you other than here? I gave your vids a share as I thought it was the least I could do. Also, as these were made quite a while back, is this still a viable method now?

Reply

Matthew Woodward Reply:

Hi Gerry!

Thanks for the share – just hit up the contact page :)

Reply

hydride
7.14.2013

What about relevancy of the scraped sites? The PRs? What if someone scrapes a whole bunch of negative PR sites?

Reply

Matthew Woodward Reply:

Hi,

I take care of this in the tutorials :)

Reply

Darcy
7.14.2013

Hey Matthew!

I can’t wait to put all of this into action and see if I can make it work for me!

I wanted to know if you could tell us how you add those cool little ‘fly in’ effects on your video?! I love that – is it a part of the editing software you use?

Thanks :)

Reply

Matthew Woodward Reply:

Hi,

Good luck :)

I use adobe after effects for those!

Reply

7.17.2013

Hi Matthew,

How can I get the resource material (personal targets, merge list, footprints). I subscribed and fb liked (I don’t use google plus),

Thanks for the great tutorial!

Reply

Matthew Woodward Reply:

Hi,

Working here not sure what went wrong, sent you an email with them!

Reply

Daniela
7.20.2013

Hi Matthew:

I just love your tutorial videos and your blog in general. I still have some noobie doubts though, sorry if you already answered them:

1. When you say "scrape your own target lists for automated software", what makes the difference between Pligg, Jcow and all of those platforms? Aren't many people using the same footprints? So how could I avoid adding my content and links on platforms that are too "known" (spammed)? :)

2. What is the best way to find new platforms for building our tiered link building? I mean, Pligg, Jcow, Drupal etc. are famous platforms, so is it better to post on less popular platforms to avoid posting where everybody else is doing it?

Both questions are very similar, but I'm about to start my tiered link building and I don't want to do it the wrong way.

Thx for all the incredible info that you share with us. God bless you

Reply

Matthew Woodward Reply:

Hi,

1) You would need something custom to post to platforms that the software doesn’t support natively

2) Just browse the internet, get involved with things as you normally would, and as you're surfing keep an eye out for possible link opps.

Reply

7.23.2013

I absolutely love these videos and once I finish the set, I will be writing something up on my site about them.

Reply

Matthew Woodward Reply:

That would be very kind of you thank you!

Reply

Jed Hanlin
7.30.2013

Thank you for spending the time to help others in this crazy, confusing business. I have a question; hope I didn't miss it somewhere, but here goes.
So when building the initial tier 1 level, should I direct it towards http://mysite.com, or should I build out a separate set of tier 1s for every page, i.e. mysite.com/how-to-skin-a-cat etc., or should I mix them all up? Hope that makes sense.

Reply

Matthew Woodward Reply:

Hi,

Each run of the campaign can target 1 URL, so you can choose whether that URL is your homepage, an inner page or whatever :)

Reply

Ray Reply:

Would each of the campaigns need 3 more articles and their various titles, descriptions etc., or can you use the original 3 articles? If I had a website with 5 URLs I wanted to target, would I then need 15 articles or just the 3 original ones? Thanks.

Reply

Matthew Woodward Reply:

Yup, each campaign needs a new set of everything

Reply

8.7.2013

Hi Matt,

I'm an agency SEO and the majority of the work I do for my clients is white and grey hat. I've decided to buy Ultimate Demon, not for any of my current clients, but for a side project I'm doing.

I haven't done any major black hat campaigns before, so I'm actually looking forward to it!

However, I've had a problem buying Ultimate Demon through MyCommerce. Long story short, they had an issue processing the order and I've had to pay twice! I've supposedly got a refund/cancelled transaction, but if you get double commission you owe me a pint! Haha

Reply

Matthew Woodward Reply:

Hi,

Hahaha I will let you know – thanks very much!

I encourage you to build a site and try to get it penalised, you’ll learn a whole bunch!

Reply

8.8.2013

Hi Matt,

it's me once again. I have a few questions about web 2.0 sites.

1. In this tutorial you built them with UD or Licorne, but on the other hand you stress they should be hand built. What is the best way in your opinion then?

2. Granted you have built a few web 2.0 sites manually, can you easily add those URLs to a T1 link campaign of the kind you set up in this tutorial, or do you have to create a different task/campaign for existing web 2.0 accounts? I recently bought Licorne and therefore I'm mainly interested in how to manage this in Licorne if possible.

3. Referring a bit to question 2: can you generally use existing T1 accounts or sites (created by UD/Licorne or just manually) to create new content and backlinks on those properties with Licorne and UD, and build them out? This question may sound a little bit stupid, but I didn't find this covered in the tutorial videos yet. You only recommend repeating the process and creating completely new T1 accounts (if I understood that right).

Best regards,
Holger

Reply

Matthew Woodward Reply:

Hi,

1) They are very different things. The ones that are built with UD are not comparable to the hand-built ones. What UD calls a web 2.0 site isn't always a web 2.0 site with a dedicated subdomain like we create by hand.

2) Yes just add them to the target list in GSA

3) I always create new accounts on each run

Reply

Holger Reply:

To point 1):

a) What about Licorne in this matter? Is Licorne able to create the real web 2.0 sites you described above (with a dedicated subdomain)?

b) BTW, is UD able to build this kind of site too?

c) And finally: could you please give us an example of the other web 2.0 sites (the ones without a dedicated subdomain), just for better understanding?

Thanks,

Holger

Reply

Matthew Woodward Reply:

Hi,

1) Yes

2) Yes

3) I don't use them, but anything that gives you domain.com/username

Reply

8.8.2013

Hey Matt, I could really do with your personal targets, merge lists, footprints & videos, but I don't have a FB, Twitter or G+ account :( Can you sort me out please, as these downloads will be a great starter for me.

Great vid series as well, very informative and well explained.

regards

Reply

Matthew Woodward Reply:

You know its 2013 right?

Drop me an email :P

Reply

8.15.2013

Quick question re the sets. In your video, you showed the following requirements for 1 article:

1 set of bookmark titles/descriptions
1 set of web directory titles/descriptions
1 set of related tags and keywords

Do these translate to this:

15 spun bookmark titles/descriptions
15 web directory titles/descriptions
15 sets of related tags and keywords

Or

45 spun bookmark titles/descriptions
45 web directory titles/descriptions
45 sets of related tags and keywords

Thanks.

Reply

Matthew Woodward Reply:

The requirements are 1 set per campaign. No translation required, just as it says.

Reply

Ray Reply:

Thanks for the info. One last question: do I need to spin these titles and descriptions?

Reply

Matthew Woodward Reply:

Yes indeedy :)

Reply

8.28.2013

Hey Matthew, I have downloaded all of your videos and they're great. But I have one question: when I asked at WarriorForum, people said that article submission direct to the money site is dead now. So can we use article submission in Tier 1? Or does it have a negative impact on rankings?

Reply

Matthew Woodward Reply:

Hi,

It is safe to use as long as you have a diverse profile of links

Reply

Chris
8.28.2013

Hi Mathew, this is a fantastic tutorial series. Thank you so much.

Quick request – about these things:

1x set of bookmarks titles / descriptions
1x set of web directory titles / descriptions
1x video with titles / descriptions

I get what you mean by a “set” and that we need to “spin” all the titles and descriptions.

But I want to be sure I know what those things are supposed to look like. Could you please give me an example of each of those three types of set that you would consider to be of suitable quality?

I have tried looking at these on various other websites, but they seem to look different on some sites to others, and I would like examples of ones that are done like you do in this process so that I can feel certain I’m doing them right.

I would really appreciate you giving me an example of these three things, or just posting links to three pages that have ones that are done right.

Thank you very much for your time.

Reply

Matthew Woodward Reply:

Hi,

Just do a manual submission to a social bookmark site and write a title/desc like you normally would to get an idea of what it should look like. Take a look at inbound.org etc.

In terms of quality it should read as perfect English every time and not look like generic spun crap.

Reply

Sterling
8.31.2013

Hi Matthew, another excellent video. I’ve watched it several times over the last couple of months.

I’m doing something wrong on the scrape part after merging your files with keywords. I’m getting results like youtube videos and blogspot in the URL list.

Most of the matches don’t seem to match the inurl criteria of the various merged search terms.

Here is an example search term:

this inurl:footer_page/1 “About Us” my keyword here

Is something wrong with the keyword merging? Or do I have to uncheck some of the search engines to scrape? Should I just stick to bing and google?

thank you sir!

Reply

Matthew Woodward Reply:

Hi,

Yeah, you get that sometimes, but I don't worry about it; the import into Ultimate Demon takes care of that.

Reply

Sterling Reply:

Thanks Matthew. It looks like only Google supports the "inurl:" operator, so the list is much cleaner and the scrape much faster when I only use Google with those kinds of footprints.

Wish SEnuke's import was as slick as UD's.

Reply

Matthew Woodward Reply:

Ooops, missed that, sorry, but yes that's a Google-only operator.

Yeah, Nuke's import sucks ass in comparison.

Reply
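As an aside, the Google-only operator issue Sterling ran into can be handled by splitting a footprint list before scraping, so operator footprints only go to Google. A minimal Python sketch of the idea (the operator list and function are illustrative assumptions, not a Scrapebox feature):

```python
# Split footprints so search-operator footprints are only sent to Google,
# since other engines don't support operators like "inurl:".
# The operator tuple below is an assumption for illustration.
GOOGLE_ONLY_OPERATORS = ("inurl:", "allinurl:", "allintitle:")

def split_footprints(footprints):
    """Return (google_only, any_engine) lists of footprint strings."""
    google_only, any_engine = [], []
    for fp in footprints:
        if any(op in fp for op in GOOGLE_ONLY_OPERATORS):
            google_only.append(fp)
        else:
            any_engine.append(fp)
    return google_only, any_engine

google, rest = split_footprints([
    'inurl:footer_page "About Us"',
    '"Powered by Pligg"',
])
```

Scrape the `google` list with Google selected only, and the `rest` list with every engine enabled.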

9.4.2013

Hey Matt,
great set of tutorials, thanks.

Just a quick question about the 3 base articles. Do you have these written in a specific way? What I mean is, does the article you use for a press release have to be a certain style, i.e. read like an actual press release?
Or do you just have three articles written about a niche related topic?

Thanks,

Chris

Reply

Matthew Woodward Reply:

Hi Chris,

I just keep it simple with niche related topics

Reply

9.5.2013

Hi Matt,

I wonder how to manage target URLs the right way in terms of targets already used and targets I haven't used yet. Let me exemplify: say I scraped 1,000 target sites (bookmark sites) but only really posted to 200 of them in a bookmarking campaign (because I limited this T1 campaign to 200 operations). When I use the same target list (1,000) again later on, UD, Licorne and so on will target those 200 sites again. How do you avoid this problem in your daily work? Do you scrape completely new sets of target sites for each run, or do you delete the used ones? In other words: how often do you use a list of target sites for your link building campaigns within one project?

Holger

Reply

Matthew Woodward Reply:

Hi,

I try to use each site only once per campaign, and you can do that by just editing the project and adding more sites/updating the schedule.

Reply

9.8.2013

Hey there fellow Matthew.

I'm curious what you would say if I offered to buy you a pint and asked how you would change your tier one campaign if you intended to use it to diversify anchor text? Would you substitute domain-type anchor text for some of the keywords scraped via Scrapebox?

That being said, seriously I’m buying you a pint some day.

-Matt B.

Reply

Matthew Woodward Reply:

No need for a pint but just change it up/be random. Or drink a bottle of vodka before you start work, no way you will leave a footprint or any pattern then ^^

Reply

9.9.2013

Thanks Matthew for sharing your link building strategies. I have a concern: if we build tier 3 spammy backlinks, a future Google algorithm update might take a toll on them.

I recently came across Backlink Beast. I'd like to know your thoughts on it. Between Backlink Beast and SEnuke XCr, which one is better?

Reply

Matthew Woodward Reply:

You could say that about any approach to link building. That is why I put the guidelines in place in the first video so you can disconnect the tiers as and when you want/need to.

Not heard of Backlink Beast and probably won't use it/look at it.

Reply

9.10.2013

Hi Matthew, great videos.

I am having problems with Scrapebox trying to scrape my own list.

It keeps crashing. I can use it for everything else, but when I follow your procedure for scraping lists it just crashes when I stop it harvesting. Any ideas?

Reply

Matthew Woodward Reply:

Hit up Scrapebox support :)

Reply

9.10.2013

Greetings Matthew,

In scraping my list I ended up with over 2,000,000+ keywords using the footprints and the merge list file, which basically killed my tired old PC.

Would you recommend starting out with fewer keywords, or perhaps using just the footprints without the merge list file?

Thanks in advance,

-Matt B

Reply

Matthew Woodward Reply:

Hi,

Well, Scrapebox can only handle 1,000,000 at a time in the final results window, so just keep an eye on it and perhaps stop it at 500,000 or so.

You could use fewer keywords, yes, but I just stop it manually, export, then carry on with the ones it hadn't done yet.

Reply
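The counts explode because the footprint/keyword merge is a simple cross product: every footprint is paired with every keyword. A rough Python sketch of the same idea (not Scrapebox's actual code), streaming results lazily instead of holding millions of strings in memory:

```python
from itertools import islice, product

def merge(footprints, keywords):
    """Lazily yield every footprint + keyword combination (a cross product)."""
    for fp, kw in product(footprints, keywords):
        yield f"{fp} {kw}"

# Hypothetical example data; real lists come from the downloadable files.
footprints = ['"Powered by Pligg"', 'inurl:submit']
keywords = ["dog training", "seo tools"]

# Take only a manageable slice rather than materialising the full list,
# mirroring the "stop at 500,000 or so" advice above.
batch = list(islice(merge(footprints, keywords), 3))
```

With 146 footprints and 50,000 keywords the full product is 7.3 million queries, which is why slicing (or using fewer keywords) matters on an older machine.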

Matt B Reply:

Thanks for the response Matthew.

I just watched the Senuke video and was wondering how you separated your list by link type in order to import it into Senuke?

Thanks in advance,

-Matt

Reply

Matthew Woodward Reply:

Hi,

By doing separate scrapes one by one to begin with. Pain in the arse, right?

Reply

9.12.2013

Hi Matt,

I want to scrape for article directories in the German language which are supported by Licorne AIO. I know you prefer UD, but maybe there's no difference. Can you give an example of footprints to scrape these targets? Thanks in advance.

Holger

Reply

Matthew Woodward Reply:

Hi,

You could add an inurl:.de to the search to return only German domains?

Reply

Holger Reply:

I did so, but there are no article directories in the results that Licorne supports. Instead I discovered that many German article directories run on WordPress. I'm just trying to find out if you can submit to them using Licorne and have posted this as a question in the Licorne forums. By the way, do you know if submitting to WordPress sites running as article directories works with Licorne and UD (I don't mean blog comments)?

Holger

Reply

Matthew Woodward Reply:

Hi,

I'm pretty sure Licorne does support the WordPress-based article dirs; I know UD does 100%!

Reply

9.17.2013

Hi Matthew

Firstly, a BIG thank you for your tutorials; they've really motivated me to get on with it and explore everything.

Please help, I'm so confused. I keep getting my proxies blocked by Google, it's driving me mad! And making me poor!!

I've just tried using Scrapebox to harvest 4,120 keywords with 4 shared private proxies, and all 4 got banned straight after the harvest, which probably took about 10 minutes. I've even deselected multi-threading. They still seem to be okay if I test them in the proxy manager or use the keyword scraper.

Any ideas on what I might be doing wrong?

Thanks

Kam

Reply

Matthew Woodward Reply:

Hi,

No worries :)

You need more proxies. I'm using semi-dedicateds from http://www.matthewwoodward.co.uk/get/buyproxies/ which can go all day long.

Reply

blueflame Reply:

Thanks Matthew

Ironically I did use them earlier today. I've just bought 30 of the semi-dedicated ones, so hopefully things will be a lot better.

I spoke to the owner without realising it; he was so helpful and nice, and said exactly the same thing as you.

Thanks again, can finally move on with the training :-)

Reply

Matthew Woodward Reply:

Best proxy company ever in my experience as well ^^

Reply

9.19.2013

Hi Matt,

I just began scraping with different keywords for my current niche site project. Since I have other projects in different niches, I wonder if I should scrape new targets for each project or if I can use this target list for all my projects. What is your opinion on this issue, and if I am allowed to ask: do you manage different target lists for your projects? I think it could make sense for certain platforms, but article directories for example are good for almost every link building campaign because they mostly have lots of categories. Please tell me if I'm wrong or not in your opinion. Replies from others would be nice too.

Thanks in advance,

Holger

Reply

Matthew Woodward Reply:

Hi,

You can use it for all projects but you should never stop adding sites/maintaining the quality of your list.

Reply

10.1.2013

This is pretty good info, not new, but it's pretty good... but I think you have to be a 20-something crazy young guy to actually spend so much time doing this mind-numbing stuff called SEO.

Life is too short :) I do spend some time doing SEO of sorts, but mostly I pay for services that I have found to work. I just don't have the time, or the desire, to ever do this kind of stuff again...

And actually, SEO is NOT easy and it IS complicated, despite what the guys selling SEO courses will tell you. So if you don't have the time, resources, knowledge etc. to spend testing and keeping up on things... well, good luck. But to each their own.

cheers,

Mike

Reply

Matthew Woodward Reply:

Well, it was published 14 months ago :)

Whoever you pay/hire to do your SEO has more control over the success of your business than you do.

Be careful with that!

Reply

Mike Reply:

Sure, and it is good info, and I have learned much of it as you have... not a lot has changed in the basics. One thing I won't be doing is sending 1000s of spammy links to any tier, no matter if it's the "3rd tier"; I believe there may be a few footprints that G can follow sooner or later.

I actually do spend some time doing my own SEO :)... but for some parts, I find people to do certain tasks, not the whole enchilada, so I keep some control.

And building a network of sorts is a good idea for your own use. :)

Reply

Matthew Woodward Reply:

Hi,

As long as you have control of the first tier, that is irrelevant because you can always disconnect from the spam at the click of a button ;)

Owning your first tier is the future though I agree!

Reply

Ray Reply:

Hi Matt. Great info! But when you write "click of a button", do you delete just the backlink on, say, the blog post, or do you delete the entire blog post? Thanks.

Reply

Matthew Woodward Reply:

Either, as long as its removed :)

Reply

10.4.2013

Have just finished watching all the tutorials. Where does the Fiverr-created video fit in?

Reply

Matthew Woodward Reply:

Hi,

In the video project for tier 1 in UD.

Reply

10.4.2013

Hi Matt,

I am currently following your tiered link tutorial and have nearly finished spinning the articles.

SEO is new to me, but this tutorial is helping me learn fast, for which I thank you.

I am, however, a little unsure about preparing the social bookmarking and web directory titles and descriptions.

Are these titles and descriptions targeted towards the articles, i.e. is each set of titles in reference to each of the 3 article subjects, or do you point directories/bookmarks at your money site with the title and link referring there?

Thanks in advance

Cheers

Tom

Reply

Matthew Woodward Reply:

Hi,

You point them to your money site so the title/desc need to be relevant to that.

Reply

Tom Owens Reply:

Cheers for the advice. I've been doing a bit of research into web directories today and noticed some have caps of 50 or so characters while others can be paragraphs long.

What would be a good word count to shoot for when writing the descriptions for the directories and bookmarks?

Thanks again

Reply

Matthew Woodward Reply:

Anywhere between 2-4 sentences will do!

Reply

10.6.2013

I'm having a bit of a nightmare with Scrapebox proxies and wanted to get your opinion.

I have purchased 10 dedicated proxies from BuyProxies, and also tried 10 semi-dedicated from Squid. I'm using only 1 connection to harvest but it's going extremely slowly, e.g. on average 1 URL per second, sometimes none. The proxies test fine for Google, but then seem to lock up when harvesting?

When I try with public proxies it goes fine, and when I tested with my own IP address (no proxy) it went super fast. The problem is it's tough maintaining the public proxies, so I was really hoping to use the privates I bought!

Have you had any issues like this with private/semi-private proxies?

Reply

Matthew Woodward Reply:

Hi,

Well, I have found that SquidProxies SUCK! and I had the issues you are describing. Oddly, I moved to BuyProxies and had no issues.

When you say it "locks up", what do you mean?

Reply

Rony Reply:

One thing I have learnt is that the Scrapebox proxy tester is not really accurate when it comes to Google. It checks if the proxy can access Google, but doesn't check if it's been captcha blocked. So it can sometimes say a proxy is G-passed when it's not. Using Google Proxy Checker (BHW forum) has helped assess these more accurately.

Here is my testing from several proxy providers:

Purchased 50 dedicated from Squid: had to contact support to get the proxies changed for Google/Scrapebox, then got them working to a degree. 20 were already captcha blocked and 30 are currently working with 2 connections.

10 dedicated from BuyProxies: these seem to be captcha blocked from the beginning. Giving them the benefit of the doubt and will test when they are unblocked.

Proxy-hub: proxies not working at all in Scrapebox. Support said they should be fine but they are not. Support slow to respond.

SSL-private-proxies: so far so good

Reply

Matthew Woodward Reply:

With BuyProxies, drop support a line and tell them what you intend to use the proxies for; they can change things up so they work better for scraping Google.

Reply

JJ
10.8.2013

Matt, I'm having a problem filtering out de-indexed domains.

I'm using 50 proxies from buyproxies.org to carry out this task with Scrapebox, with 2 connections set. However most of my proxies get IP banned after an hour.

Have you ever had this issue before?

Reply

Matthew Woodward Reply:

Hi,

Just add a 1s delay :)

Reply
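The 1s delay is simple rate limiting: pausing between requests so each proxy IP stays under Google's query threshold. Sketched in Python, where `check` stands in for whatever per-URL check you run (an illustrative placeholder, not a Scrapebox API):

```python
import time

def check_with_delay(urls, check, delay=1.0):
    """Run check(url) for each URL, sleeping `delay` seconds between
    requests so the proxy IPs don't trip the search engine's rate limits."""
    results = {}
    for url in urls:
        results[url] = check(url)
        time.sleep(delay)
    return results
```

With 50 proxies rotating, a 1-second delay per request keeps any single IP well below one query per second.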

10.10.2013

A very useful article and site, thank you.
I have one question:
I followed your first video step by step; the only problem is that at the end the URL count is 0. Thank you very much.
Marck

Reply

Matthew Woodward Reply:

The URL count from where, sorry?

Reply

Ken Wild
10.16.2013

Hey Matt,

Thanks for the video tutorials. I have two questions:

1. I get a list of over 2,000 target sites after harvesting in Scrapebox, which I paste into UD; however, I only get about 8 successful sites detected. When I then tick the subfolder option I get zero sites detected. Am I doing anything wrong? (I did change the dropdown list to All.)

2. Also, there is a huge list of sites already pre-loaded in UD. Is it safe to create links on all of them as well as the target sites? Or do you have any tips you can share on this?

Reply

Matthew Woodward Reply:

Hi,

1) That's about right, you just have to keep working through the lists

2) Yes you can use them, but just be aware that every UD user is using them

Reply

10.16.2013

Hi Matthew,

I've gone through your tutorial several times now. You mention the importance of having the ability to remove tier one links, but I've been unable to find a place in the tutorial where you show how to do that.

Any guidance is appreciated.

-Matt

Reply

Matthew Woodward Reply:

Hi,

Ahhh, that's just a case of exporting all of the logins and then manually working through them.

Reply

10.18.2013

Hey, I am wondering why you crossed out Ultimate Demon? Do you think it's no longer working?

Reply

Matthew Woodward Reply:

Eh?? Have you seen http://www.matthewwoodward.co.uk/reviews/ultimate-demon-review-tutorial/ ?

Reply

Agnes90 Reply:

Hi, yes I read that post.
I noticed under this video tutorial, in the "essential tools" part, Ultimate Demon is struck through, so I got a bit confused.
:-)

Reply

Matthew Woodward Reply:

I struck through the $50 discount while the $80 discount was available; it's back to normal now though :)

Reply

hans
10.21.2013

Hi Matt, 2 questions:
1: When merging your 'merge-list' and 'footprints-raw' lists with my keywords, I get an error message every time after finishing or aborting, and Scrapebox shuts down. Do you have an idea why this happens? Scraping related websites, for example, works fine. It's the first time I've seen this error.

2. You say the following 2 things:
-write yourself 1 title with a 3-sentence description for every type (so 3 sets in total, so you end up with 3 titles with 3x3-sentence descriptions in total)
-you should write 15 alternative titles for everything including your articles, bookmarks and so on...

So, do I need to write 15 titles or 3 titles?

Reply

Matthew Woodward Reply:

Hi,

1) Not sure sorry, try getting in touch with scrapebox support

2) 3 titles, and each title has 15 variations

Reply
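For the "3 titles, 15 variations each" step, variations are typically stored in spintax, the `{option1|option2}` format tools like TheBestSpinner export. A minimal spintax expander, just to illustrate the format (this is a toy sketch, not TheBestSpinner's actual algorithm):

```python
import random
import re

def spin(text, rng=random):
    """Replace each {a|b|c} group with one randomly chosen option.
    Resolves innermost groups first, so nested spintax also works."""
    pattern = re.compile(r"\{([^{}]*)\}")
    while pattern.search(text):
        text = pattern.sub(lambda m: rng.choice(m.group(1).split("|")),
                           text, count=1)
    return text

# Hypothetical title with 3 x 2 = 6 possible variations.
title = "{Ultimate|Complete|Definitive} Guide to {Tiered|Layered} Link Building"
variant = spin(title)
```

Each call to `spin` yields one readable title, which is how a single spintax string stands in for the 10-15 unique titles per campaign.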

Hans
10.22.2013

Question 4: how many URLs per second do you scrape on average with your 50 semi-dedicated proxies?

Reply

Matthew Woodward Reply:

It's not really something I pay attention to; it also depends on how powerful the machine is, how much bandwidth is available and how many threads you're running.

Reply

frank Reply:

50 BuyProxies US semi-dedicated proxies. I am currently scraping Google at 14 avg URL/sec using 3 connections @ 162mb, Yahoo at 39 avg URL/sec using 3 connections @ 435mb, and Bing at 21 avg URL/sec using 3 connections @ 625mb.

Reply

Matthew Woodward Reply:

What Frank said ^^

Reply

Hans
10.23.2013

Hi Matt, thanks for your reply. I also made this comment yesterday; I think it disappeared because I made 2 comments. Not sure about it. But these were the questions:

1) You said in the tutorial that you have to make a 3-sentence description for every title. So how many descriptions do I need to make exactly? You said 3x3-sentence descriptions in the tutorial, but I'm not so sure about this anymore, because you also first said only 3 titles. But I can't imagine we have to do 45 descriptions.

So are 3x3 descriptions, with 2 alternative sentences for every sentence and the sentences also spun, enough? Or how many do you do exactly? Hopefully you can explain it in a very clear way.

2) I'm using SquidProxies, but my proxies got banned after an hour. I also read here in the comments that you don't advocate SquidProxies, because you have had problems with them too.

So I think it's best to buy 30 semi-dedicated BuyProxies proxies. But which settings in Scrapebox are best for this, so I will not get banned when scraping 24/7? These are my settings now for 25 private SquidProxies:

Maximum connections settings: http://imgur.com/yRcBnZf
You also talked about a delay of 1 second. Do you mean the 'adjust RND delay range' with this? Screenshot: http://imgur.com/wiTiWLC and is this set right?

Are these the best settings for scraping 24/7 with 30 semi-dedicated BuyProxies proxies?

Reply

Matthew Woodward Reply:

Hi,

1) For every title, you need 10-15 unique titles. Don't get titles confused with the article body.

2) Yeah, SquidProxies suck in my experience; the cheaper semi-dedicated ones from http://www.matthewwoodward.co.uk/get/buyproxies/ are better.

Really, getting the settings right is just a case of tweaking them; those settings look ok though!

Reply

10.23.2013

Hey Matt,

when I merge my keywords with your footprints, should I use the multi-threaded harvester or not?

What effect do both option have?

Thanks a lot!

Reply

Matthew Woodward Reply:

Yes, use the multi-threaded harvester; that just means it does more things at once :)

Reply
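"Multi-threaded" here just means queries run concurrently instead of one after another. The idea in Python, where `harvest_one` is a hypothetical stand-in for a single search-engine query (not Scrapebox's actual implementation):

```python
from concurrent.futures import ThreadPoolExecutor

def harvest(queries, harvest_one, threads=10):
    """Run harvest_one(query) across a thread pool and flatten the
    per-query URL lists into one result list, preserving query order."""
    with ThreadPoolExecutor(max_workers=threads) as pool:
        results = pool.map(harvest_one, queries)
    return [url for batch in results for url in batch]
```

Since each query spends most of its time waiting on the network, 10 threads can harvest close to 10x faster than a single connection, which is also why thread count has to be balanced against how many proxies you have.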

10.26.2013

Hi Matt, great stuff. I’m confused about something: you talk about “pasting your personal set of footprints into the footprints.ini file in the ScrapeBox configuration folder” so that it will be available in a drop down menu, and yet in the video we only see you pressing the button in ScrapeBox to merge in the raw footprints and then pressing it again to merge in the list of common words.

If we can just do it that way, can we just do that and refrain from modifying the footprints.ini file and still get the same results? Or do we need to make that modification to the footprints.ini file in order for us to use your footprints?

Reply

Matthew Woodward Reply:

Yes, you get the same results, although I prefer to have things organised better, so that in the future everything I need is in Scrapebox rather than referencing external files.

Reply

winner
10.28.2013

Hey Matt, I am not sure how to thank you for the tutorials... they're mind blowing. This write-up inspired me to try to do SEO across our 5 eCommerce sites in-house instead of spending hundreds of dollars every month contracting it out. I also started buying up old domains with the intention of starting a PBN... thank you.
If you're ever in Texas, drinks are on me!

I also used your link to sign up for the monthly UD subscription... will likely buy the full version next month.

After I scraped 78k keywords and merged them with your footprint and common word files, I got over 2 million target URLs, and when I loaded them in Scrapebox it kept crashing. I ended up using only my keywords merged with your footprints (skipped the common words), which yielded 200k target URLs.
I loaded those into UD and have been running the site detector since 1pm ET Saturday, 49 hrs ago.

These are the current stats
32601 domains on queue.
Success 395
undetected 19302
Duplicate 32.
Connection error 894
Estimated time left 71 hrs.

I am using 50 semi-dedicated proxies from buyproxies.
This means it checked 20k domains in 48hrs.. still have over 30K to go..

1. Does it usually take this long?
2. Apparently it only loaded about 50k. Is there an easier way to know which 150k domains were skipped?
3. I have yet to uncheck the folder stuff to run the detection a 2nd time. Is this necessary at this point? I do not have another week to run these same 50k domains.

By the way, I am running a quad core 2.0GHz, 8GB machine on a 10MB fiber connection.
Thanks again.

Reply

Matthew Woodward Reply:

Hi,

Haha thanks – I was in Texas not so long ago but don't have any plans to return or pass through at the moment!

Yes, it will take a long time; you can also increase the number of threads and disable the proxies for the import process.

It loads the list in the order you paste it, so it will be the last 150k that were skipped.

Reply

winner Reply:

ohh ok .. thanks a lot. Will try that once this is done in a couple of days.

Reply

Matthew Woodward Reply:

No worries :)

Reply

Ken Wild
11.4.2013

Hey Mathew,

When I load your footprints file and merge file against the 800 keywords in Scrapebox, I get a total of 7,296,000 keywords. But when I press the harvest button to get a list of related target sites, it keeps crashing.

I have emailed Scrapebox support regarding this. However, I wanted your thoughts: have you ever encountered anything similar, or do you have any possible solution?

Thanks,
Wild..

Reply

Matthew Woodward Reply:

Hi,

Upgrade your computer or use fewer keywords :)

Reply
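For context on why that harvest crashes: merging a keyword list against a footprint file multiplies the two lists together (800 keywords × ~9,120 footprints ≈ 7.3M queries). One way to act on the "use fewer keywords" advice without losing coverage is to split the merged list into batches that Scrapebox can work through one at a time. A rough sketch – the sample keywords, footprints, output file names, and batch size are all placeholders, not anything from the tutorial:

```python
# Sketch: merge keywords with footprints, then split the result into
# batches small enough to harvest without crashing the tool.
from itertools import product

keywords = ["best dog food", "dog training"]                 # stand-ins for your 800 keywords
footprints = ['inurl:register', '"powered by wordpress"']    # stand-ins for the footprint file

# Cartesian product: every footprint paired with every keyword
queries = [f"{fp} {kw}" for fp, kw in product(footprints, keywords)]

BATCH = 50_000  # harvest in chunks instead of one 7M-line monster
batches = [queries[i:i + BATCH] for i in range(0, len(queries), BATCH)]

for n, batch in enumerate(batches, 1):
    with open(f"scrapebox_batch_{n}.txt", "w") as f:
        f.write("\n".join(batch))
```

Each `scrapebox_batch_N.txt` can then be pasted into the harvester as a separate run.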

11.4.2013

Thanks for these amazingly helpful tutorials Matthew. I have a question about sourcing a video gig from Fiverr which you mention here. What kind of video are we looking for? If the money site in question is for an affiliate product, can this video be a testimonial? Let me know if I’m on the right track, thanks!

Reply

Matthew Woodward Reply:

Hi,

Whatever is relevant to your promotion.

Reply

Ken Wild
11.17.2013

Hey Mat,

In the UD software, after I add the sites from the site detector to the site list, you say to select all the sites and then run a Google index check in Scrapebox. But UD already comes preloaded with around 100 sites. Is it okay to keep those as part of tier one, or do you recommend deleting them and only using the target URLs from the site detector?

Thanks,
Ken

Reply

Matthew Woodward Reply:

Hi,

If those domains are deindexed, remove them! The source is irrelevant.

Reply

11.21.2013

Hi Matthew, are your methods still working? Especially with Google's constant algorithm changes and all.

Reply

Matthew Woodward Reply:

As with any approach to link building you need to diversify and mix things up. If you combine this with replicating competitor links, building a private network, creating good content and driving social signals, you are winning!

Reply

11.21.2013

Hi Matt, could you please explain what kind of social signals are the best in your opinion?

Reply

Matthew Woodward Reply:

Best in terms of what goal?

Reply

Holger Reply:

To get better rankings, of course. Do you mean the big players (Twitter, Facebook, Google etc.) with high PR, or others as well? What about PR 0 sites?

Reply

Matthew Woodward Reply:

Well, social signals bring a range of value – increasing rankings is just one thing they can be used for. But yes: Google+, Facebook, Twitter, Pinterest.

Reply

Holger Reply:

I currently use Licorne AIO but I have some difficulties with it. The same thing with Scrapebox. But back to the topic: do you know if you can create Google, FB, Twitter and Pinterest accounts and submissions with Licorne? And if not, is it possible to do that with UD?

Reply

Matthew Woodward Reply:

No you can’t =\

Reply

Martin
11.22.2013

Hey Matthew,

Am I going completely bonkers?! I can’t seem to find your list of targets…

Reply

Matthew Woodward Reply:

Yes you are :)

Reply

mark
11.27.2013

I can’t believe this is all in one place. What an amazing gift.
Thank you!!

Reply

Matthew Woodward Reply:

No worries :)

Reply

myancey
12.1.2013

Hi Matt,

I went through your entire tutorial and, man, it’s superb! The detailed instruction is awesome – a little fast, but it’s on video so…

I’m def impressed. Thank you.

Unfortunately it is over my head in some areas and definitely out of my budget – I calculated over $700 worth of tools.

I am new to backlinking and struggling to set up my link building platform. I have found that many of the Web 2.0 properties that allow do-follow links are sites where I have to create a site (a mini site, I guess).

I def was not expecting that! I thought that I could just put my article up and be done. No one seems to teach the best way to QUICKLY set these mini sites up.

I don’t want to spend days trying to set up all of these sites.

I thought you said you had a post about that but I haven’t found it yet.

Long story short – I can’t employ your method right now because it is out of my league, mostly financially; in other areas I’m just befuddled, like proxies and all that.

I learn quickly but this stuff has slowed me down quite a bit.

Any assist or suggestions would be appreciated.

Thanks again!

Melodye

Reply

Matthew Woodward Reply:

Hi,

If you’re on a low budget you could take a look at http://www.matthewwoodward.co.uk/tutorials/backlink-competitor-analysis/

Reply

Mike
12.5.2013

Hi Matt,

Are your footprints and merge list still up to date, or did you make any changes?

Reply

Matthew Woodward Reply:

Hi,

There are more over at http://www.matthewwoodward.co.uk/reviews/ultimate-demon-review-tutorial/ !

Reply

Tim Reply:

Hey Matthew, not sure if you have a partial set here and additional ones at the link above; can you make a single updated footprint file for each piece of software that you use? UD and GSA. Thanks

Reply

Matthew Woodward Reply:

Hi,

UD from http://www.matthewwoodward.co.uk/reviews/ultimate-demon-review-tutorial/ – GSA later in the series

Reply

Kaelos
12.20.2013

Any good tutorials you’d recommend on how to spin manually, by hand?

Reply

Matthew Woodward Reply:

Hi,

Like this one http://www.matthewwoodward.co.uk/tutorials/advanced-article-spinning-techniques-with-the-best-spinner/

Reply

Kaelos Reply:

Nice – thanks!

Reply

12.24.2013

Spending $25 just to rank one keyword doesn’t seem economical. What if you are running an authority site trying to rank 100s of keywords?

Reply

Matthew Woodward Reply:

Spend $2,500?

Reply

Trevor
1.9.2014

Why do you recommend 99centarticles? The prices and the whole “feel” of the place are all wrong. I feel like I’m being scammed from the home page on. The site must have been designed before computers were invented!

There are no 99 cent articles, that’s for sure.

Reply

Matthew Woodward Reply:

You used to be able to order 100 word articles for 99 cents – not sure if that’s still the case!

Reply

1.12.2014

You mention a footprint file under the video – but I have searched high and low and don’t see it! Is there a direct link?

Thanks

Reply

Matthew Woodward Reply:

Did you read while you were searching?

Reply

Chua
1.28.2014

Hi Matthew,
Thanks for the tutorial. I managed to follow the steps and used the footprints you provided, using Scrapebox to scrape more than 14,000 links. As I only have GSA SER at the moment, can I import them into GSA? Are there any additional steps, like filtering out unusable links, or can GSA work that out itself? Have a good day.

Thanks again

Reply

Matthew Woodward Reply:

Yes, you can import them, and I’m pretty sure the import will respect your GSA project settings, but I would check that with the dev.

Reply
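On filtering before import: GSA SER will discard targets it can't post to anyway, but trimming the scraped list first keeps projects lean. A generic pre-import clean-up (de-duplicate by domain, drop malformed entries) might look like this – the URL list is purely illustrative, and this is ordinary list hygiene rather than a documented GSA requirement:

```python
# Sketch: trim a scraped URL list before importing it into a link tool -
# keep one URL per unique domain and drop anything that isn't a URL.
from urllib.parse import urlparse

scraped = [
    "http://example.com/blog/post-1",
    "http://example.com/blog/post-2",   # same domain - dropped
    "https://another-site.org/forum",
    "not a url",                        # malformed - dropped
]

seen = set()
clean = []
for url in scraped:
    host = urlparse(url).netloc.lower()
    if not host or host in seen:
        continue
    seen.add(host)
    clean.append(url)

print(clean)  # one URL per unique domain
```

The deduped list can then be saved and imported as the project's target URLs.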

2.2.2014

Hey Matt,
Thanks for the videos, they have been very helpful in breaking everything down. I have a basic question regarding the 3 x 500 word articles when implementing them in UD. Currently I have one of them spun properly in TBS (by the way, it is mind numbing at times).

When submitting them in UD, are you just submitting one of them to the 100 web 2.0 directories in your video, or are you incorporating all three at the same time somehow? I may have missed that in your video.

Also, I have yet to pick up UD – so far I just have TBS and Scrapebox, getting everything set before I buy it. Do you just put your article in spun format into UD and it does all the spinning like TBS does?

Finally, after you launch the article once in UD to the 100 directories, is it then useless to re-use, since it could produce duplicates if submitted again later? Or does UD remember what it has spun previously and re-spin it differently if used down the line? Hope you can answer some of these questions, and thanks again for your videos.

Reply

Matthew Woodward Reply:

1 for web 2.0, 1 for wiki, 1 for PR/AD.

Yes, an article can only be used so many times – it’s up to you to work out how many times that is and keep control.

Reply

2.4.2014

Hello Matthew,

Thank you very much for making the videos. Definitely a great help to newbie like me.

I have two questions.

1) 3 x 500 word articles about your keyword or topic.
Do I need to write 2 alternative sentences for each sentence of these articles for spinning?

2) 1 x video with titles/descriptions.
How long does the video need to be?

Thank you.

Reply

Matthew Woodward Reply:

Hi,

1) Yes

2) Whatever is ‘right’ for your niche/what you are trying to communicate

Reply
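To make "2 alternative sentences per sentence" concrete: writing each sentence as a `{original|alternative 1|alternative 2}` group is the standard spintax form that spinning tools accept. A tiny expander shows how quickly the variant count multiplies – this handles flat, un-nested spintax only, and the sample sentences are made up:

```python
# Expand flat {a|b|c} spintax and count the unique versions it produces.
import itertools
import re

spintax = ("{Dogs need a balanced diet.|A balanced diet matters for dogs.|"
           "Good nutrition keeps dogs healthy.} "
           "{Feed twice daily.|Two meals a day works well.|Split food into two meals.}")

# re.split with a capture group keeps the {...} contents at odd indices
parts = re.split(r"\{([^{}]*)\}", spintax)
options = [seg.split("|") if i % 2 else [seg] for i, seg in enumerate(parts)]
variants = ["".join(choice) for choice in itertools.product(*options)]

print(len(variants))  # 3 options x 3 options = 9 unique versions
```

Even two sentences with three options each yield nine unique combinations, which is why a fully spun 500-word article can safely feed many submissions.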

kerh Reply:

Hello Matthew,

I am concerned SEO Generals or 99CentArticles may have copied from other sources to produce the articles. Have you ever encountered a problem where their articles failed Copyscape?

Reply

Matthew Woodward Reply:

Yes, that happens sometimes – you have to keep a leash on quality.

Reply

Sheikh Ovais
3.6.2014

Hi Matt,

You’ve recommended 99centarticles in your tutorials most of the time and it seems that you’re quite satisfied with their quality.

My question is: can I use their service to have 1000-word articles written for my own (primary) site, not for the purpose of link building? Assuming that I’ll tweak the articles to add more detail and value, should I go with them for 10 x 1000-word articles for my site?

Thanks a lot.

Reply

Matthew Woodward Reply:

Hi,

I ALWAYS write my own content on the front end. By the time you have tweaked/added detail to it you might as well have written it.

Reply

KimMass
3.21.2014

Hi Matthew Woodward
I’ve built about 10 web 2.0 properties (3-5 articles per site, 1 to 2 articles with backlinks to the money site). Google has indexed these sites, but when I check the backlinks of my money site, I do not see any backlinks. Why is this?
thank a lot

Reply

Matthew Woodward Reply:

If it’s indexed, it’s good to go.

Reply

3.30.2014

In your example in video 2 (with the raw footprints), when you scrape, do you also include websites without PR (e.g. N/A)?
And when commenting/scraping, do the sites need to be niche related?

And do you go over the websites manually to see if they are related?

Reply

Matthew Woodward Reply:

1) Yes
2) No
3) No

Reply

Kim
3.30.2014

Did you delete my blog comment? :(

Reply

Matthew Woodward Reply:

Comments must be manually approved before publishing – I assume you’re not familiar with WordPress?

Reply

Kim Reply:

No, I’m not familiar with manual approval in WordPress yet. But thanks for taking time on every question/comment. Even though I’m not always the one asking, I’m still reading to learn from other people’s questions!

Reply

4.9.2014

Hi Matthew,

Thank you for all this wonderful knowledge you are sharing with us. I have few questions:
1- The link to Linkwheelbandit doesn’t seem to work. Is it no longer on the market?
2- Is there any particular reason why you don’t use GSA for tier 1?
3- What tools, services or software (hidden costs) are involved for someone who buys your premium tutorial?
I already own GSA, Scrapebox, KontentMachine, WAC, and The Best Spinner.

Thanks

Fareed

Reply

Matthew Woodward Reply:

Hi,

1) No, it’s not

2) Answered a million times above

3) None – there are free and premium ways to do things

Reply
