My experiment to determine what Google favors more - Unique Content or Quality Backlinks

For a while now I've been wondering what Google favors more: unique content or quality backlinks.

I really wanted to know what the "magic ratio" is. In other words, should I spend 50% of my time writing new content and 50% of my time building backlinks? Or is the optimal ratio something more like 80/20?

I'm kind of a numbers nerd, so this led me to do the following experiment (I've added a rough sketch for logging the daily numbers right after the list):
  • A single niche topic
  • Primary keyword is in subdomain name
  • Two subdomains each with an SEO-pimped WordPress install
  • Site A will concentrate on content
  • Site B will concentrate on backlinks
  • Time period is 1 month
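To keep the two sites comparable I'll record the same numbers for both every day. Here's a minimal sketch of the logger I have in mind (Python; the file name, field names, and example numbers are just my own placeholders, not anything official):

    import csv
    import datetime
    import os

    LOG = "experiment_log.csv"  # placeholder file name
    FIELDS = ["date", "site", "posts_added", "links_built", "serp_rank"]

    def log_day(site, posts_added, links_built, serp_rank):
        """Append one day's hand-collected numbers for one site."""
        write_header = not os.path.exists(LOG)
        with open(LOG, "a", newline="") as f:
            writer = csv.DictWriter(f, fieldnames=FIELDS)
            if write_header:
                writer.writeheader()
            writer.writerow({
                "date": datetime.date.today().isoformat(),
                "site": site,
                "posts_added": posts_added,
                "links_built": links_built,
                "serp_rank": serp_rank,
            })

    # Example: Site A concentrates on content, Site B on backlinks.
    log_day("A", posts_added=2, links_built=0, serp_rank=47)
    log_day("B", posts_added=0, links_built=10, serp_rank=52)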
My questions for the Warriors -
  • Is anyone aware of this being done before? If it has I'd love to see the metrics and results so I can tailor my experiment to provide completely new insights.
  • Any suggestions for things like # of words per content post, # of posts per day, # of backlinks per day...basically anything that will make the experiment more "legit"?
  • I read that ClipMarks.com will get you indexed in a couple hours but from my experience that's far from the truth (more like days).
I'm writing all about it on my blog, so feel free to follow along -

  • I like the challenge, but most warriors already know this: Google cares $%#@ about unique content. They only care about satisfying the searcher and a good "user experience" for everyone. They couldn't care less if you stole the article from another website; that's not their problem. The only thing they need to do is satisfy the searcher's needs.

    So back links will always carry much more weight than any unique content, because Google sees a back link as a finger pointing and saying "that is popular!". And as we all know, the more popular a site or article is, the better the chance of satisfying the user (Google is all about statistics), and therefore the article will get good search engine rankings.

    In the beginning I also thought, let's impress Google with unique content! Well, I tried it and wasted more than 100 hours (Martial Arts - Nova Martial Arts); Google gave me 10 visitors a day for three months. Without back links you are doomed.

    BUT remember, if your site does not have any nice content, visitors won't stay, and Google knows this. So make sure your visitors stay after a search; otherwise the back links won't help in the long term. Google will quickly throw your site out of the SERPs if they notice you don't satisfy the searcher (just like they increase the bids in AdWords). But how you do that, Google doesn't really care, as long as you do it.

    Hope this helps
    • [2] replies
    • Hey Saidar, thanks for your input.

      I'm not entirely sure about your assumption (but I guess we're gonna find out! ). From my experience if I stop adding content my Google ranking (for my primary keywords) keeps dropping...until I add more content. This is especially true if the keyword phrase (in quotes) has lots of competition.

      But like every topic on WF, there's always people on either side of the fence. I don't think I've ever posed a question that had 100% consensus...haha.

      So hopefully this experiment will help!
      • [1] reply
    • Completely off topic, but worth mentioning. Saidar, that is quite an impressive and informative martial arts site. Great job!
  • Yup, I did a mini-experiment and stopped working on a site altogether (no new content, no new backlinks). I noticed the Google ranking gradually get lower and lower. Then I added more content, just one post a day, and it came back up.

    But you're right, there's so many other parameters in play that it's hard to determine exactly what was going on.
    • [1] reply
    • Oh, you forgot to say you tested a blog. A blog can "die" if you don't update it regularly, but static sites are something entirely different. From how I've heard people talk on this forum, it looks to me like blogs are run through an entirely different algorithm than static sites. But that is just my opinion; I can't prove it.
  • Well Khtm,

    From my experience, to get indexed as fast as possible you just need to post your page to social media websites like digg.com and so on with a dofollow link.

    Do it on 3 or 4 of them and you will get indexed quickly, sometimes in as little as an hour.

    Also, if you have a blog, I have noticed this: be a member of forums like Warrior or DP, have a backlink to your main page, and be an active poster.

    Now any new stories will be on that page and will get indexed quickly.

    So that is what has worked for me.

    Thanks,
    Brian P
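
    A related trick from the same era, since we're on quick indexing: ping the blog update services directly with the standard weblogUpdates XML-RPC call (it's what WordPress fires off whenever you publish a post). A minimal sketch against Ping-O-Matic's public endpoint; the blog name and URL below are placeholders, and whether a ping alone still speeds up indexing is anyone's guess:

        import xmlrpc.client

        def ping(blog_name, blog_url):
            """Send a standard weblogUpdates ping to Ping-O-Matic."""
            server = xmlrpc.client.ServerProxy("http://rpc.pingomatic.com/")
            # Response is a struct along the lines of:
            # {'flerror': False, 'message': 'Thanks for the ping.'}
            return server.weblogUpdates.ping(blog_name, blog_url)

        # Placeholder name/URL -- swap in your own blog.
        print(ping("My Blog", "http://example.com/"))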
    • [1] reply
    • You can get your site indexed very quickly by putting a link to it in your signature here and responding to a fresh thread in the main forum.
  • Use a program like WordPress Direct to automate the content creation and focus on backlinks
  • Yeah, backlinks really are the only way to rank high on pages. I mean, content is content. If you have good backlinks with those keywords in your content then you will rank high. It's really a rat race in my mind.

    I have had success with SEO, and that is how I got to the first page for certain celebrity names. If anyone knows an easier way, I am all ears.

    Thanks
  • I've set up both sites and am now just waiting for Google to index them. I tried to give them similar content, rewritten to be unique, but I'm a bit worried that Google may completely ignore one of the two because their structure is exactly the same. Oh, and not to mention they share the same domain.

    Guess time will tell!

    A - Jump Higher Exercises
    B - Jump Higher Exercises
    • [2] replies
    • Perhaps it depends on what the niche is, but I have found that unique content -- posted regularly -- knocks the ball out of the park. If it is interesting, other blogs link to you, which in turn boosts the inbound links.

      Personally, I'd much rather spend my time writing good content than doing the yeomanly work of getting back links.
    • Ok just an update -

      About 2 days after I tried to get both sites indexed by Google, site "b" was but site "a" wasn't.

      Now a couple weeks later site "a" is but site "b" isn't. So basically Google knows that they are similar and randomly decides to switch which one is indexed? Frickin' weird! :confused:

      Any ideas?

      I'm guessing it may be because they're both subdomains of the same domain, so maybe my experiment was flawed before I ever started. Maybe I should have purchased two distinct domains?

      Site A index query
      Site B index query
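
      In case anyone wants to automate this check, here's the rough script I could use instead of running the site: queries by hand. Fair warning: scraping Google is against their ToS and the result markup changes, so the "did not match any documents" test is a fragile heuristic, and the domains below are placeholders:

          import urllib.parse
          import urllib.request

          def looks_indexed(domain):
              """Heuristic: does a site: query return any results?"""
              url = ("https://www.google.com/search?q="
                     + urllib.parse.quote("site:" + domain))
              req = urllib.request.Request(
                  url, headers={"User-Agent": "Mozilla/5.0"})
              html = urllib.request.urlopen(req).read().decode("utf-8", "ignore")
              return "did not match any documents" not in html

          for domain in ["site-a.example.com", "site-b.example.com"]:
              print(domain, "indexed?", looks_indexed(domain))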

  • The on-page ranking algorithm has very little to do with unique content, and more to do with site relevance, bounce rate, and a few other metrics no one can be too sure about.

    Think about it for a moment - Google has to come up with mathematical algorithms to rank billions of sites automatically. They can't decide what content is "good" or "bad", they can't specify a certain number of "words" that must be met, nor measure how "unique" content is. They have to go off algorithms instead, which greatly diminishes the value of unique, good quality content.

    Backlinks are something that Google can easily incorporate into their algorithms. If you have a thousand relevant, high authority backlinks, with a crappy dupe content site, you can be certain it's going to outrank a site that has a thousand pages of the most state of the art, unique content ever written (this is barring people linking to that site of course, due to its good content, but we're talking strictly about algorithms here).

    Now the long term effect may be slightly different. Whilst not proven, I strongly believe Google will look at the click-through rate, and the bounce rate of a site, in order to get an idea as to whether it's satisfying the user experience. If the bounce rate is too high, and the click through rate too low, then the site may have a negative metric applied to it. Even still, the "crappy" site will continue to outrank your "unique content site" since it will be nowhere to be seen. Google has not yet had the chance to determine whether the unique content site is any good or not, only that it contains a lot of pages of content. Furthermore, even with a high bounce rate, the poor quality site is going to continue to outrank the good quality site with a few high authority backlinks, even if the good content site has a low bounce rate. This is until the good content site has some authority backlinks as well.
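
    Purely as an illustration of that reasoning (a made-up toy, definitely not Google's formula): if you fold click-through rate and bounce rate into a single "satisfaction" signal, you can see how a high-bounce, low-click page might pick up a negative metric. The weights are invented:

        def satisfaction_score(ctr, bounce_rate):
            """Toy signal: ctr and bounce_rate are fractions in [0, 1]."""
            # Invented weights -- purely illustrative.
            return 0.6 * ctr + 0.4 * (1.0 - bounce_rate)

        # A page that gets clicked and keeps its visitors scores well...
        print(satisfaction_score(ctr=0.30, bounce_rate=0.35))  # ~0.44
        # ...while a low-click, high-bounce page scores poorly.
        print(satisfaction_score(ctr=0.05, bounce_rate=0.90))  # ~0.07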

    So to answer your question, there's no doubt about it - backlinks are much, much more important than content, although content is ultimately what sells. So have the content you need to attract your subscribers, or to sell your products, but then focus your efforts primarily on obtaining backlinks.

    One last thing I'll debunk is the idea that a site loses rankings if it hasn't been updated for a while, and hence the presumption that constantly updating a site is necessary to maintain rankings. Quite simply, this isn't the case; however, there are many reasons for this, and I can't sum them all up in this post. Suffice to say, this "illusion" is caused by two main things - first of all, if the update frequency of your site stops, the Googlebot will crawl your site less often, and a metric may be applied that causes you to temporarily lose rankings. The second thing is that static content may take a while to acquire trust and gain rankings, and this is why fresh, updated content ranks quicker - you have the Googlebot constantly crawling it, whereas you don't with static content. If you keep acquiring a natural progression of inlinks to your static content pages, however, your rankings should start to rise again to where they were. Sometimes this can take many months.

    One last note I'll add is that a manual review destroys everything I've said. It's a combination of good content to pass a manual review, along with a good link building strategy, that will lead to long term rankings.
    • [ 2 ] Thanks
    • [1] reply
    • Hi Steven,

      Actually, Google has been on the record about this issue. They try to determine whether the content "needs" to be updated. For example, an article about Genghis Khan probably doesn't need to be updated, as it's historically "set", while an article about a contemporary pop star like Britney Spears may need to be updated often to maintain its ranking.

      Another example: the basic strategy for blackjack has been around for 40+ years and doesn't need to be updated. However, blackjack playing conditions at Las Vegas casinos are a topic that would need to be updated at least a few times a year to stay accurate.

      How Google determines this, I'm not sure, but they do try. This may explain why some pages need to be updated or lose their ranking while others don't.
      • [1] reply
  • ^ Thanks for the very detailed and informative post, Steven. You seem to really know your stuff.

    I had no idea that "manual reviews" even happened. So you're saying some dude in Google-land actually manually reviews indexed sites?

    And I'm curious about bounce rate. Couldn't Google only know about this if we have Analytics or Webmaster Tools code in our HTML? So how is it fair that sites using Google's own services could possibly be penalized over sites that don't? This is something I've been thinking about for a while. It almost makes me want to steer clear of using any Google tracking code.
  • Actually I found this in the Analytics FAQ

    "Your website data will not be used to affect your natural search results, ad quality score or ad placement. Aggregate data across many customers will be used to improve our products and services."

    So apparently my paranoia was unfounded. Well, at least according to Google's public statement.
  • Not a problem khtm =)

    And if only Google told us the truth about everything. When I say there's no evidence to back up bounce rates, there isn't, but from my experience they do incorporate this into their algorithms. I also believe they use a number of other controversial things (such as whois data).

    And yes, if a site is automatically "flagged for review" by Google, they will manually review it, and could even deindex the site if it's spam. This is what puts the "spanner in the works" of the good old days of algorithm manipulation.
  • Why not just add both? Problem solved
  • Not sure what ya mean, I've been trying equally as hard to get both sites indexed, but Google is too smart. Two weeks later only one of the two sites is indexed.
    • [1] reply
    • Google only said they don't use Google Analytics to influence your rankings. They didn't say they don't use data from the millions and millions of people that have the Google Bar installed.

      If anyone else has noticed recently, if you have the Google Bar installed and open a new tab in IE, Google shows you a page filled with recent sites you've visited and those in your "favorites". Solid proof Google knows what sites are in your favorites.

      I wrote about "People Rank" years ago. There's a ton of things SEs can use that comes from real people data, especially if an SE tool bar is installed...

      Right now if you have the Google Bar installed they can tell if the site is bookmarked, how long you stayed on a page, did you click through the site or did you click back quickly to the SERPs, then click to another site and stay longer. They can even tell how far you scrolled down a page and how fast.

      All of this can be applied to the SERPs, and probably should be. Yahoo did a test a few years ago and determined human activity produced far better results than algos did.

      It's been proven Google can collect this info. Yahoo has proven this info is effective. I guess the only question left is, "Why doesn't Google use this info in their rankings?"

      Read this page I wrote a few years ago about "People Rank" and other on-page factors you may not have thought about. Again, this page is a few years old. It's even more likely now that Google uses some of these methods than it was when I first wrote the report.

      SEO and SEM Kurt Melvin's Big Page of Search Engine Optimization Strategies.
  • I'm curious how this turns out and what the ratio turns out to be.
    • [1] reply
    • Yeah me too, except I've hit this major roadblock where Google will only index one of the two sites. So I'm not sure how to proceed.

      Does anyone have any suggestions?

      I'm thinking maybe I should move site "b" to another domain I own and see if that helps.

  • @mikeong88

    Both domains share the first three octets of the web server IP, but the full IPs aren't identical. Do you think this would be enough of a difference?
  • ^ Sorry, I thought you were asking whether both domains would have the same IP if I moved one of the sites to another domain I own (because they are with the same host).

    Of course they're the same right now, as they are subdomains of the same domain.

    Ok I guess my next step will be to move one of the sites to another domain.
  • I've just seen interesting results with a new site. It has 15 pages so far and I gave it one link from a PR3 page of another site I have. I got 85 daily visitors after 2 weeks.

    So the content is starting to rank well but the site has only 1 incoming link. Perhaps that gives you some ideas on how Google works.
  • Ok, I've moved Site B to another domain. I even renamed the Site A subdomain so it doesn't contain the letter "a" anymore.

    Now let's see if Google finally indexes both of 'em

    Site A - Jump Higher Exercises
    Site B - Jump Higher Exercises
