SEO Dubai Trends Series
Welcome to the next post in our SEO Dubai Trends series, where our focus is on keeping you abreast of the latest trends in search engine optimization. If you are from Dubai, this one is especially relevant, because we wrote it with you in mind! If you are from elsewhere, do still read on. This post gives you two best practices around crawling and indexing that you can adopt while designing your website, or after.
All of us know that crawlers are our friends: they visit websites and pages to discover new and updated content, so that search engines can surface it on the results page. Indexing is their way of building a bibliography of sorts, from which the engine pulls information. So how do we work with crawlers and indexers in a smart manner so as to get the best search engine optimization for a page? Read on!
1. Relevant and appropriate use of Meta Tags and Redirection for Crawlers
Crawlers and bots have to be directed in a judicious manner; this is the foundation of a good SEO job. robots.txt, the meta robots tag, and the nofollow attribute are the main tools you need to pay attention to if your intention is to control bots. The robots.txt file lives at yoursite.com/robots.txt. It tells crawlers which pages on your site they are allowed to crawl and which they are not. A tip here: do not rely on robots.txt alone to keep pages out of search results, and do not list sensitive paths in it just to be specific. A disallowed URL can still be indexed if other sites link to it, and spelling out private paths in robots.txt actually advertises them to anyone who reads the file!
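As a minimal sketch, a robots.txt file might look like the following (the paths and sitemap URL are hypothetical examples, not recommendations for your site):

```text
# robots.txt — served from yoursite.com/robots.txt
# "*" means these rules apply to all crawlers
User-agent: *
Disallow: /checkout/
Disallow: /search-results/

# Optionally point crawlers at your sitemap
Sitemap: https://yoursite.com/sitemap.xml
```

Remember: Disallow tells well-behaved crawlers not to fetch those URLs, but it is not a guarantee the URLs stay out of the index.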
The meta robots tag is a better option for index control because it resides in the head of each individual page, so what may or may not be indexed is told to the crawlers at the page level. Hence, more control for you! A meta robots "noindex" can really help when the content on a page is not ready or is incomplete. Then there is the nofollow attribute, which tells search engines that you do not vouch for a linked page, and that PageRank and link equity should not be passed through that specific link. Use it cautiously on links you cannot stand behind.
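As a minimal sketch, here is how both directives look in HTML (the page and link targets are hypothetical):

```html
<head>
  <!-- Keep this draft page out of the index, but still let the crawler follow its links -->
  <meta name="robots" content="noindex, follow">
</head>

<body>
  <!-- Do not pass link equity through this particular link -->
  <a href="https://example.com/unverified-page" rel="nofollow">An unverified source</a>
</body>
```

The meta tag governs the whole page, while rel="nofollow" governs a single link; the two are used together all the time.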
2. Site Architecture and Content
Site architecture will really help crawling and indexing if it is done with some basic common sense! Usually your homepage ends up being the primary landing page, so arrange the other pages in a manner that is both relevant and sensible. Your category pages should follow the home page, then the sub-category pages, and finally the detail pages.
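A hypothetical URL hierarchy following this home → category → sub-category → detail structure might look like:

```text
yoursite.com/                          <- home page
yoursite.com/services/                 <- category page
yoursite.com/services/seo/             <- sub-category page
yoursite.com/services/seo/local-seo    <- detail page
```

A shallow, predictable hierarchy like this means crawlers can reach every detail page within a few clicks of the homepage.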
The content should flow with the page architecture and blend with the page; it should never seem out of context. Smart site architecture like this pays off many times over in delivering great search engine optimization for the page.
Use these two best practices and feature at the top of any search engine!