Useless things (nothing to lose, but nothing to gain)

   Meta Keywords
    Some time ago, the <meta name="keywords" content=""> tag was used to tell search engines which keywords were relevant to a particular page. Modern search engines download your pages and extract the relevant keywords from the content itself, so the keywords meta tag is no longer used for ranking. Just forget it; it is of no use for SEO.

    Keyword density
    One of the most overrated ranking factors is keyword density. What is keyword density, and why has this myth lived for so long? The keyword density of a particular word on a page is calculated as follows:

    KD = Keyword_Count / Total_Words * 100%


    That is, if a page has 150 words and the word "SEO" appears 24 times on that page, the keyword density is: 24 / 150 * 100% = 16%
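
    As an illustration, here is a minimal sketch of that formula in Python (the keyword_density function and the sample text are made up for this example):

    def keyword_density(text, keyword):
        """Keyword density = keyword count / total words * 100, as defined above."""
        words = text.lower().split()
        if not words:
            return 0.0
        return words.count(keyword.lower()) / len(words) * 100

    # The worked example from the text: 24 occurrences of "SEO" in 150 words -> 16%.
    sample = ("SEO " * 24 + "filler " * 126).strip()
    print(round(keyword_density(sample, "SEO"), 1))  # 16.0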

    So why is this value useless? Because search engines have evolved and no longer take keyword density into account: it can be manipulated far too easily. There are thousands of factors that search engines consider when calculating the ranking of a page, so why would they rely on a method as simplistic (not to say primitive) as counting how many times a word appears in the text of a page? You may hear that a keyword density of 6% is the best rate, or that it should be kept between 7% and 10%, or that search engines like a keyword density between 3% and 7%, and plenty of other nonsense. The truth is...

    Search engines like pages that are written in natural language. Write for humans, not for search engines! A page can have any keyword density, from 0% (the keyword does not appear on the page) to 100% (a page consisting of a single word), and still rank well.

    Of course you can still keep an eye on the keyword density of your pages, but note that there is no single "good" value for this factor. Any value will work if your text is written with a human reader in mind. Why would anyone keep checking keyword density if it does not count for anything? Because it is a quick and dirty way to estimate the topic of a page. Just do not overstate its importance: it is just a number, nothing more, and has no value for SEO.

    Another interesting question: why is this myth still alive, and why do so many people still talk about keyword density as an important ranking factor? Perhaps because keyword density is easy to understand and easy to change if necessary. You can see it with your own eyes and quickly decide whether your site is "good" or "bad". Well, it only seems that way; it really is not. Keyword density is good for nothing, remember?

    Dynamic URLs vs. Static URLs
    Whether you believe me or not makes no difference: in terms of SEO value they are exactly the same. The days when search engines had trouble indexing websites with dynamic URLs are, fortunately, gone.

    www.site.com vs. site.com
    There is no difference. If you want your site to be reachable either way, add something like this to your .htaccess:

    RewriteEngine on
    # 301-redirect the bare domain to the www version
    RewriteCond %{HTTP_HOST} ^dominio\.com$ [NC]
    RewriteRule ^(.*)$ http://www.dominio.com/$1 [R=301,L]

    Underscores vs. hyphens in URLs
    Again, there is no difference from an SEO standpoint. You can use underscores, use hyphens, or even use no separator at all; it neither helps nor harms your position in the SERPs.

    Subfolders
    Is it better to have a file at /widget-red-small-cheap.php instead of /widgets/red/small/cheap/index.php? Will it hurt your rankings if you bury content deep inside subfolders? The answer is no: it does not hurt your rankings, and search engines could not care less how deep in the folder tree your files sit. What matters is how many clicks it takes to reach that file from the home page.

    If you can reach a file with a single click, it is actually more important and carries greater weight than any file located 5 clicks away from the index page. The home page usually has plenty of link weight to share, so the pages it links to directly are obviously more important than the others (simply because they get more link juice).

    W3C Validation
    W3C stands for the World Wide Web Consortium, an international consortium where member organizations, full-time staff and the public work together to develop Web standards. Basically, these are the people who gave us HTML, CSS, SOAP, XML and other web technologies.

    Validation is the process of checking whether a page or site meets the W3C standards. You can run a free validation of any website with the W3C's online validator. Note that validation mostly reports fairly trivial issues such as unclosed tags, undefined tags or invalid attribute values. It also checks encoding issues, compliance with the declared DOCTYPE, obsolete tags and attributes, and more.
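
    If you want to script such a check, the sketch below posts a small snippet of HTML to the W3C Nu HTML Checker and prints the reported problems. The endpoint, its out=json parameter and the shape of the response are assumptions based on the checker's public interface and may change; the requests library is a third-party dependency.

    import requests  # third-party: pip install requests

    # Assumed endpoint of the W3C Nu HTML Checker; out=json requests machine-readable results.
    VALIDATOR = "https://validator.w3.org/nu/?out=json"
    html = "<!DOCTYPE html><html><head><title>Test</title></head><body><p>Unclosed<div></body></html>"
    resp = requests.post(
        VALIDATOR,
        data=html.encode("utf-8"),
        headers={"Content-Type": "text/html; charset=utf-8"},
    )
    for msg in resp.json().get("messages", []):
        print(msg.get("type"), "-", msg.get("message"))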

    Why is validation useful? A 100% valid website should be displayed correctly (and identically!) in all browsers that follow the standards. Unfortunately, in real life some browsers do not adhere strictly to the W3C standards, so cross-browser problems of one kind or another are not a rare thing on the web. Still, this does not diminish the value of the W3C standards.

    From an SEO point of view, validation does not seem that important. Run google.com through the validator and you will see a handful of warnings and errors on their own site. This example clearly shows that Google does not care about W3C validation, at least not enough to give valid websites a nice boost in the rankings or to penalize invalid ones. It simply does not care. The recommended strategy for W3C validation is this: make sure your site works in, and is accessible to, all common browsers, and do not bother validating purely for SEO purposes; if you are not experiencing any cross-browser problems, it is fine as it is.


3.1.4. Things that affect your ranking

    Keyword Abuse
    Google clearly defines this term. Once more: write for humans. Repeating keywords all over a page can trigger Google's spam filter, which will result in a big loss of positions or even a total ban of your website. Writing naturally, and optimizing a little where necessary, is the best way to use keywords today.

    Hidden Text / Invisible Links
    First, let's see what Google says about hidden text. Obviously, Google does not like it, and if your site uses this technique it can be excluded from the Google index. You may wonder: how can Google tell whether I use hidden text or not? OK, I could set "display: none" in my external CSS file and block access to that CSS with my robots.txt file. Would Google then realize that the page contains hidden text? Yes and no. This can work in the short term, but the disguise will fail sooner or later. It is also said that Googlebot does not always strictly follow the instructions in robots.txt, and it can actually read and interpret JavaScript and CSS without any problem; once it does, the consequences for your website and your rankings will be disastrous.

    Redirects
    As an SEO method, this is about as bad as it gets. Redirect (doorway) pages are landing pages created with the sole aim of ranking well for a certain keyword. Their content has no value; their only purpose is to attract visitors from the SERPs and redirect them to another, non-doorway page, which incidentally is usually irrelevant to the visitor's original query.

    Splogs
    "Splog" comes from the English "spam blog", a blog of junk content, and it is the modern version of the old and evil redirect pages. The technique worked as follows: someone created thousands of posts on a free service like blogspot.com, linked them together and got some external links through blog comment spam and other blackhat methods (see below). The splog itself did not contain any particular information; its content was always automatically generated, keyword-stuffed articles. Even so, thanks to the large number of inbound links, these splogs achieved very good rankings in the SERPs, displacing many legitimate blogs. Later on, Google implemented filters against the flood of splogs, and today any splog gets banned fairly quickly.

    If you have a blog, do not make it look like spam. Focus instead on writing interesting, quality content. That is actually far more effective.

    Cloaking
    In some particular cases it is not that bad, but it is still a blackhat technique. The method consists of detecting whether a visitor is a human or a search engine spider and deciding which content to show accordingly. Humans see one version of the site, while the bots are served pages saturated with keywords.

    Duplicate content
    While it is a taboo subject for many webmasters, duplicate content is not as dangerous as people say. There are two types of content that can be called duplicate. The first case is when a site has several ways to reach the same page, for example:

    http://www.algunsitio.com/
    http://algunsitio.com/
    http://algunsitio.com/index.php
    http://www.algunsitio.com/index.php?sessionid=4567
    etc.

    All four options refer to the same page, but technically they are treated as different URLs with the same content. Google can easily resolve these cases of duplicate content, and they do not carry any penalty for you.
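
    Just to make the idea concrete, here is a hedged sketch of how those four variants can be collapsed to a single canonical form. The canonicalize helper and the rules it applies are purely illustrative; they are not how Google actually does it.

    from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

    def canonicalize(url):
        """Reduce common URL variants of the same page to one canonical form."""
        parts = urlparse(url)
        host = parts.netloc.lower()
        if host.startswith("www."):
            host = host[4:]                     # treat www and non-www as the same host
        path = parts.path or "/"
        if path.endswith("/index.php"):
            path = path[:-len("index.php")]     # /index.php usually serves the same page as /
        # drop session parameters that do not change the content
        query = [(k, v) for k, v in parse_qsl(parts.query) if k.lower() != "sessionid"]
        return urlunparse(("http", host, path, "", urlencode(query), ""))

    variants = [
        "http://www.algunsitio.com/",
        "http://algunsitio.com/",
        "http://algunsitio.com/index.php",
        "http://www.algunsitio.com/index.php?sessionid=4567",
    ]
    print({canonicalize(u) for u in variants})  # all four collapse to one URL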

    The other type is duplicate content across different domain names. Content is considered duplicate when it adds no value to the original. This means that if you simply copy and paste an article onto your site, it is duplicate content. If you copy and paste an article and add some commentary, or review it from your own point of view, it is not duplicate content. The key point here is some kind of added value: if a site adds value to the original information, it is not a duplicate.

    Two points are worth noting here. First, if someone copies your text and publishes it elsewhere, it is highly unlikely that you will be penalized for it. Google tracks the age of each page and tends to treat the oldest one, which in this case is your site, as the source of the original text. Second, you can borrow material from other sites without significant risk of a duplicate content penalty simply by rewriting the text in your own words. There are ways to produce "unique" random text using Markov chains, synonym finders and other methods, but I would not recommend them, because the result tends to look spammy and unnatural, and that really can hurt your Google rankings. Write for human beings. Write it yourself.

    Frames
    Even though it is not a blackhat SEO practice in itself, frame technology can affect your rankings, because search engines dislike frames: they break the basic concept of the web, one unique page for each unique URL. With frames, a single page can load and display the content of many other URLs, which makes crawling and indexing very difficult. Avoid IFRAME and related tags unless you really have to use them, and if you do, provide an alternative way to index the content of each frame, either with direct links or by using the NOFRAMES tag with fallback content visible to search engines.

    JavaScript and Flash
    Google can read both JavaScript and Flash (well, parts of the text, anyway), but it is not recommended to build your site solely on these two technologies. There should always be a way for a visitor (whether human or bot) to reach the content of the site through simple text links. Do not rely solely on JavaScript or Flash navigation; that will kill your SEO prospects instantly.

3.1.5. Summary of on-page factors

    Well, if you have read the above carefully, you can probably guess the summary. Content is king, but only quality content. Do not try to trick the search engines: it only works in the short term, and it is always just a matter of time before your rankings sink to the bottom of the list for good. Providing high-quality, relevant content that is interesting to you and to your visitors is the key to success with on-page ranking factors and (paradoxically!) half of the way to success with off-page ranking factors.
Continued in the next post tomorrow >>>>
