
Saturday, 9 June 2012

Google's Good Writing Content Filter

by: Joel Walsh
The web pages actually at the top of Google have only one thing clearly in common: good writing. Don't let the usual SEO sacred cows and bugbears, such as PageRank, frames, and JavaScript, distract you from the importance of good content.

I was recently struck by the fact that the top-ranking web pages on Google are consistently much better written than the vast majority of what one reads on the web. Yet traditional SEO wisdom has little to say about good writing. Does Google, the world's wealthiest media company, really only display web pages that meet arcane technical criteria? Does Google, like so many website owners, really get so caught up in the process of the algorithm that it misses the whole point?

Apparently not.

Most Common On-the-Page Website Content Success Factors

Whatever the technical mechanism, Google is doing a pretty good job of identifying websites with good content and rewarding them with high rankings.

I looked at Google's top five pages for the five most searched-on keywords, as identified by WordTracker on June 27, 2005. Typically, the top five pages receive an overwhelming majority of the traffic delivered by Google.

The web pages that contained written content (a small but significant portion were image galleries) all shared the following features:

Updating: frequent updating of content, at least once every few weeks and often once a week or more.

Spelling and grammar: few or no errors. No page had more than three misspelled words or four grammatical errors. Note: spelling and grammar errors were identified with Microsoft Word's spelling and grammar checker, then ruling out words flagged as misspellings that are actually proper names or new words simply not in the dictionary. Does Google use SpellCheck? I can already hear the scoffing on the other side of this computer screen. Before you dismiss the idea completely, keep in mind that no one really knows what the 100 factors in Google's algorithm are. But whether the mechanism is SpellCheck, or a better shot at link popularity thanks to greater credibility, or something else entirely, the results remain the same.

Paragraphs: primarily brief (1-4 sentences), with few or no long blocks of text.

Lists: both bulleted and numbered lists form a large part of the text.

Sentence length: mostly brief (10 words or fewer). Medium-length and long sentences are sprinkled throughout the text rather than clumped together.

Contextual relevance: text contains numerous terms related to the keyword, as well as stem variations of the keyword. The page may contain the keyword itself only a few times or not at all.
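Most of these features are easy to measure on your own pages. The following is a minimal Python sketch, not anything Google is known to run: the function name is mine, it assumes the NLTK library is installed for stemming, and the sentence splitting is deliberately crude.

```python
import re
from nltk.stem import PorterStemmer  # assumption: nltk is installed

def content_report(text, keyword):
    """Rough tally of the on-page features listed above (hypothetical helper)."""
    stemmer = PorterStemmer()
    paragraphs = [p for p in re.split(r"\n\s*\n", text) if p.strip()]
    sentences = [s for s in re.split(r"[.!?]+\s", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text.lower())

    # Stem variations: words sharing the keyword's stem ("optimize", "optimizing", ...)
    key_stem = stemmer.stem(keyword.lower())
    stem_hits = sum(1 for w in words if stemmer.stem(w) == key_stem)

    return {
        "paragraphs": len(paragraphs),
        "avg_sentence_words": round(len(words) / max(len(sentences), 1), 1),
        "short_sentences": sum(1 for s in sentences if len(s.split()) <= 10),
        "keyword_stem_hits": stem_hits,
    }

if __name__ == "__main__":
    sample = "Optimizing content is simple. Write short sentences.\n\nUpdate often."
    print(content_report(sample, "optimize"))
```

Run against a page that ranks well, a script like this tends to show short sentences, short paragraphs, and a handful of stem variations of the keyword rather than heavy repetition of the exact phrase.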

SEO "Do's" and "Don'ts"

A hard look at the results slaughters a number of SEO bugbears and sacred cows.

PageRank. The median PageRank was 4. One page had a PageRank of 0. Of course, this might simply be yet another demonstration that the little PageRank number you get in your browser window is not what Google's algo is using. But if you're one of those people who attaches an overriding value to that little number, this is food for thought.

Frames. The top two web pages listed for the most searched-on keyword employ frames. Frames may still be a bad web design idea from a usability standpoint, and they may ruin your search engine rankings if your site's linking system depends on them. But there are worse ways you could shoot yourself in the foot.

JavaScript-formatted internal links. Most of the websites use JavaScript for their internal page links. Again, that's not the best web design practice, but there are worse things you could do.
Keyword optimization. Except for two pages, keyword optimization was conspicuous by its absence. In more than half the web pages, the keyword did not appear more than three times, meaning a very low density. Many of the pages did not contain the keyword at all. That may just demonstrate the power of anchor text in inbound links. It also may demonstrate that Google takes a site's entire content into account when categorizing it and deciding what page to display.
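"Very low density" is easy to check for yourself. Here is a minimal sketch of a density count; the function name and the example text are mine, and the substring count is only a rough approximation for multi-word phrases.

```python
import re

def keyword_density(text, keyword):
    """Count keyword occurrences and return density as a percent of total words."""
    words = re.findall(r"[A-Za-z']+", text.lower())
    hits = text.lower().count(keyword.lower())          # crude substring count
    density = 100.0 * hits / max(len(words), 1)
    return hits, round(density, 2)

# On a very short page, even two or three occurrences make a high density;
# spread over several hundred words, the same count works out well under 1%.
print(keyword_density("Cheap flights to cheap destinations. Book cheap flights now.",
                      "cheap flights"))
```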

Sub-headings. On most pages, sub-headings were either absent or in the form of images rather than text. That's a very bad design practice, and particularly cruel to blind users. But again, Google is more forgiving.

Links: most of the web pages contained ten or more links; many contained over 30, in defiance of the SEO bugbears about "link popularity bleeding." Moreover, nearly all the pages contained a significant number of non-relevant links, and on many pages the non-relevant links outnumbered the relevant ones. Of course, it's not clear what benefit the website owners hope to get from placing irrelevant links on their pages; it's a proven way of lowering conversion rates and losing visitors. But Google doesn't seem to care whether your website makes money.
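Counting links the way this survey did takes only a few lines. Below is a sketch using Python's standard-library HTML parser; the class name is mine, and "relevant" here is just a naive check for the keyword in the link URL, not anything a search engine is known to do.

```python
from html.parser import HTMLParser

class LinkCounter(HTMLParser):
    """Tally <a href> links on a page; crude relevance check by keyword in the URL."""
    def __init__(self, keyword):
        super().__init__()
        self.keyword = keyword.lower()
        self.total = 0
        self.relevant = 0

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href", "") or ""
            self.total += 1
            if self.keyword in href.lower():
                self.relevant += 1

page = '<a href="/cheap-flights/paris">Paris</a> <a href="/about-us">About</a>'
counter = LinkCounter("cheap-flights")
counter.feed(page)
print(counter.total, "links,", counter.relevant, "relevant")
```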

Originality: a significant number of pages contained content copied from other websites. In all cases, the copied material was professionally written and apparently distributed on a free-reprint basis. Note: the reprint content did not consist of content feeds. However, no website consisted solely of free-reprint content; there was always at least a significant portion of original content, usually the majority of the page.

Recommendations

Make sure a professional writer, or at least someone who can tell good writing from bad, is creating your site's content, particularly in the case of a search-engine optimization campaign. If you are an SEO, make sure you get a pro to do the content. A shocking number of SEOs write incredibly badly. I've even had clients whose websites got fewer conversions or page views after their SEOs got through with them, even when they got a sharp uptick in unique visitors. Most visitors simply hit the "back" button when confronted with the unpalatable text, so the increased traffic is just wasted bandwidth.

If you write your own content, make sure that it passes through the hands of a skilled copyeditor or writer before going online.

Update your content often. It's important both to add new pages and update existing pages. If you can't afford original content, use free-reprint content.

Distribute your content to other websites on a free-reprint basis. This will help your website get links in exchange for the right to publish the content. It will also help spread your message and enhance your visibility. Fears of a "duplicate content penalty" for free-reprint content (as opposed to duplication of content within a single website) are unjustified.

In short, if you have a mature website that is already indexed and getting traffic, you should consider making sure the bulk of your investment in your website is devoted to its content, rather than graphic design, old-school search-engine optimization, or linking campaigns.

Tuesday, 5 June 2012

Link Popularity: Distribute content, not just links.

by: Robert Raught
You've spent many hours trying to increase your online traffic with your linking campaign. You've sent out 200 e-mails pleading with other web sites to trade links with your site. Many of your e-mails bounce back.

The requests that find their targets get rejected for numerous reasons. For example, your Google PageRank is too low, or your links pages are dynamic and not static, etc., blah, blah, ad nauseam. Out of those 200 requests, you wind up getting 25 reciprocal links, if you are lucky.

So, you say to yourself, "Great, now I have 25 more links!" But are these links really worth it? Do they generate any traffic?

There are many reasons why your links won't even get counted or indexed by the search engines. If your link sits on a page among 100 other links, or the page is irrelevant to your subject matter, it probably won't hold much weight with most search engines. It's also rumored that Google is changing its algorithm to discount reciprocal links altogether.

So, what can you do to get your links indexed and noticed? Write your own content, distribute it to article directories or trade it with other related websites!

Here are 8 tips on increasing your online traffic with distributed content.


1. Try to write about popular topics. The more popular your subject, the more people will download your article and include it on their websites, and the more links you'll have pointing back to your site.

2. Try not to use any promotional jargon or sales pitches in your articles. If you do, many webmasters will not want to include your article on their sites.

3. Use plain English. Don't try to get too technical. Read it back to yourself and make sure you don't get tongue-tied while reading it.

4. If possible, work your site's main keyword phrases into your articles. If your site is about online marketing, write articles about online marketing.

5. Make sure you include an "About the Author" section at the bottom. Keep it fairly short, and always include a link back to your site in an anchor tag. Once again, include your keywords in the link text (a sketch of such a resource box follows these tips).

6. Proofread your article carefully. I see so many articles out there with misspellings. It just makes you look bad. After you spell check, have a friend or co-worker read it to double check for errors.

7. When you're finished with your article, submit it to popular article directories like goarticles.com, articlefactory.com, amazines.com and imparticles.com. For a fee, there are even services out there that will submit your articles to the top directories for you.

8. Make sure you publish your articles on your own website too; more content equals more traffic. Don't worry about getting penalized by search engines for having duplicate content. You only get penalized if the content is duplicated on your own domain, not if it's duplicated on other websites.
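To make tip 5 concrete, here is a minimal Python sketch that assembles an "About the Author" resource box with the keyword phrase as the anchor text. The function, its parameters, and the example values are all hypothetical; the point is simply that the link text carries your keyword phrase rather than "click here."

```python
def author_box(author, site_url, keyword_phrase, blurb):
    """Build a short resource box whose backlink uses the keyword as anchor text."""
    link = f'<a href="{site_url}">{keyword_phrase}</a>'
    return f"<p><strong>About the Author:</strong> {author}. {blurb} Visit {link}.</p>"

print(author_box(
    author="Jane Doe",                               # hypothetical author
    site_url="http://www.example.com",               # placeholder URL
    keyword_phrase="online marketing tips",          # keyword-rich anchor text
    blurb="Jane writes about building web traffic.",
))
```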

So there you have it. Distributed content allows you to make every link count, by creating targeted links that directly contribute to your search engine rankings and by delivering targeted traffic on its own. And besides, it might even make you famous!