META Tags Are A Waste of Time

I read a post recently in a forum that argued quite strongly that META tags are a waste of time.

The argument was that they don’t serve a useful purpose any more, with the exception of the Title and Description tags.

I disagreed quite strongly with that position.

Here’s why:  

META tags are part of the agreed standards for building web pages and sites.

They’re clearly defined and used for managing the way the search engines view and treat your site.

For example, the ‘robots’ META tag gives the spiders instructions on whether to index a page or not, and whether or not to follow outgoing links.

If there is no instruction, the robots will index the page and follow links.  But if you don’t want people to find a page via the search engines – for example, if it’s a download page – telling the robots not to index it is one of the ways of protecting it.
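
For example, a download page you wanted to keep out of the search results might carry this tag in its HEAD section (the robots tag itself is standard; the scenario is just an illustration):

    <head>
      <!-- Tell the spiders not to index this page or follow its links -->
      <meta name="robots" content="noindex, nofollow">
    </head>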

One of the arguments against using them (in that forum post) was that people have started spamming the META tags, as a result of which the search engines now ignore them.

Yes, you can spam keywords into META tags – but the search engines recognise this and ignore them.  The spammy keywords, that is, not the META tags themselves.

I do agree that the Title and Description tags are ones you should always use – they’re currently used in the natural search results and displayed to searchers. So a good title and description can grab attention and entice people to click through to your site.
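
To illustrate, a page’s Title and Description might look like this in the HEAD section (the wording here is purely an example):

    <head>
      <!-- The title becomes the clickable headline in the search results -->
      <title>META Tags Explained</title>
      <!-- The description is usually shown as the snippet beneath that headline -->
      <meta name="description" content="Why META tags are still worth adding to every page you build.">
    </head>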

But all the META tags are there for a reason, so it’s both sensible and in keeping with agreed standards to use them.

Different search engines place different emphasis on different META tags now than they did a couple of years ago.  But, as we know, search engines are forever changing their algorithms, and that also applies to how they view and use META tags.

So who knows how they’ll use them in two years’ time? I suspect not even the search engines do (yet).

There are a lot of META tag generators available online.  You simply fill in the answers to the questions they ask and they will generate the code for you to paste into the HEAD section of your page.
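
The output from one of these generators is typically a block along these lines, ready to paste between the <head> and </head> tags (every value shown here is just a placeholder):

    <meta name="description" content="A short summary of what the page is about.">
    <meta name="keywords" content="meta tags, search engines, seo">
    <meta name="author" content="Your Name">
    <meta http-equiv="Content-Type" content="text/html; charset=utf-8">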

So there’s no reason at all not to use META tags – and every reason to do so.

They are part of the agreed standards for building pages and sites, and it’s so easy to generate them that they should always be included in any and all pages.

After all – if they’re all included it won’t matter how much or how often the search engines change the way they view and use them in the future.

Whatever they do, your META tags will be in place, which will increase your site’s resilience to any changes they make.

Comments on this entry are closed.

  • Codesucker 29 May, 2009, 9:47 am

    Great Article. I believe that search engines such as Yahoo and Google no longer care about META tags, not even the keywords and description!

    I believe SEs have gotten to the point where they don’t need or trust the actual page developer’s description of the page anymore. Instead, ranking is simply awarded in terms of backlinks and relevancy.

    Anyway, I still tell people to clearly define their keywords and descriptions; it helps the content writer stay on his/her toes about the keywords they should be using when writing the article. Submission sites usually pull the description META tag from the page and use it as the default description of the content.

    I am against the use of unnecessary META tags, like using the robots tag and saying follow,index. That’s just a waste of code. The ‘code-to-content’ ratio of your pages must stay low so the search spiders don’t have to sift through too much to get to the content on your page. All CSS definitions and JavaScript files should go in separate files – you need to keep the code short and clean.
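
    For example, something like this in the HEAD keeps the markup lean (the file names are just placeholders):

        <!-- Pull styles and scripts in from external files rather than inlining them -->
        <link rel="stylesheet" type="text/css" href="styles.css">
        <script type="text/javascript" src="scripts.js"></script>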

    • WealthyDragon 29 May, 2009, 11:38 am

      Hi there,

      I certainly agree with you on the importance of backlinks and relevancy as ways of ranking well in the search results. I also agree that CSS definitions should be kept in a separate file (a single separate file) – but this is more for site performance than search results placings.

      However, I would have to disagree with your view of the description tag.

      The descriptions I place in my description tag fields consistently appear in the description field of the search results, on all pages of all my sites. The only exceptions are where the search term that the searcher has entered appears somewhere in my content.

      On those occasions both Google and Yahoo extract that particular phrase and display it in the description field.

      Also, in my view, the title field is important because it’s what’s displayed to searchers in the search results. Without a good Title tag (as well as a good description) people will be less likely to click through. The title and description tags, and the content of the page, need to be consistent for best results.

      I do agree that there’s no real need to add the Robots tag with index,follow as its content, because they will index and follow by default.

      But if you don’t want a page indexed then using the robots tag is one of the ways you can prevent that. I used the example above of a download page – if this is for a paid product you (or at least I) wouldn’t want someone to stumble across it because it had been indexed and turned up in a set of search results.

      On my blogs I don’t want my admin pages indexed – so those are other pages where I use ‘noindex,nofollow’ in the robots tag.

      So until the standards are changed to exclude the need for META tags I’ll continue to use them fully, for the reasons I discussed above.

      Cheers,

      Martin.

  • Codesucker 30 May, 2009, 11:22 am

    Thanks for the response. Good point about the description tag – its most important function is providing the little blurb about your site next to your link in the search results, as long as the query doesn’t hit the nail on the head with keywords. Submission sites also pull it out of the HTML as a default description of the content.

    Either way, we both agree that the description tag is still useful and should be defined.

    I love the point you made in the article about using noindex for download pages!

    • WealthyDragon 30 May, 2009, 12:23 pm

      Thank you for adding to the article! 🙂

      Your contributions have improved the quality of info for other readers and given me an opportunity to clarify different points.

      All good!

      Cheers,

      Martin.

  • atul chatterjee 30 May, 2009, 5:00 pm

    There is a website, webtrafficonline.com, which comes up as #2 on Google.com for the keyword phrase ‘web traffic’ – a highly competitive phrase. It had precisely 8 articles on it, none of which were keyword optimised.
    I wrote those articles, so I have watched this site’s climb. Understandably it has very little traffic and a high bounce rate. But in the light of this I tend to agree about the role of tags and descriptors.

    • WealthyDragon 31 May, 2009, 9:41 am

      Atul, hi,

      I tried to check it out but you must have taken the page down recently – it’s just parked with Go Daddy and has a whole bunch of links on it now.

      Cheers,

      Martin.

  • Bill Beavers 1 June, 2009, 9:47 pm

    I suppose you would have to be a “Guru” to understand a guru’s thought process. All I will say, not being an SEO guru, is simply that I totally agree with the post. Nobody knows for certain. If I were paying big bucks for an SEO specialist, he’d better fill in the META tags. One never really knows which way the search engines will go tomorrow.

    • WealthyDragon 1 June, 2009, 10:29 pm

      Hi Bill,

      Yes – that is an important point.

      The best sites are resilient to whatever changes the search engines make in their algorithms and approaches – and that’s the only real way to develop a sustainable online business. You don’t want to have to work on your site every time the algorithms change.

      And one way of making your site resilient to Search Engine changes is to follow the agreed standards when you’re building it. (It’s not the only way, but it’s a good start).

      Cheers,

      Martin.