Trending this month

Showing posts with label SEO. Show all posts

December 16, 2016

5 seo best practices for blogging the perfect post

"5 seo best practices for blogging the perfect post"


The power of a blog post: a blog post can help market your website by creating content that Google and other search engines can crawl and index. The infographic below shows how to create the perfect blog post, with the top 5 SEO best practices for blogging.

December 15, 2016

3 ways to get the most juice out of your site maps

The making of the sitemap

Traditional site maps are static HTML files that outline the first- and second-level structure of a website. The original purpose of the site map was to make items on the website easy to find. Over time, sitemaps also became a useful tool to help search engines find content and index all the parts of the site you wanted indexed. Today it is recommended that every webmaster have an XML sitemap, which provides an easy-to-read dump of links for the spiders to index. At best a site map is a table of contents; at worst it is just an index for your site. A good site map must fulfill the following 5 criteria:

1) Show a quick, easy-to-follow overview of your website
2) Provide a pathway for the search engine spiders to follow
3) Provide a text link to every page on your website
4) Quickly show visitors what information they will find on which pages
5) Utilize keyword phrases to help the site rank well across search engines

Here are some of the best practices to get more juice out of your site maps

1) Your sitemap should be linked from the homepage. Linking it this way gives the search engine an easy way to find the sitemap and then follow it all the way through the site. If it is linked only from other pages, the spider might hit a dead end along the way and quit crawling your website.
2) Small sites can place every page on their sitemap; bigger sites should not. You would not want the search engines to see a never-ending list of links and assume it is a link farm. Use nested sitemaps if you have many pages to cover: a nested sitemap contains only your top-level pages on the main sitemap and includes links to more specific sitemaps.
3) Some SEO experts believe you should have no more than 25 to 40 links on your sitemap. This also ensures that your sitemap remains easily readable by human visitors.
4) The anchor text (the words that are clicked on) of each link in your site map should ideally contain a keyword as far as possible. Also make sure that the anchor text links on your site map point to the appropriate pages.
5) After creating a sitemap, go back and check that all your links are correct. A broken link on your site map is the last thing you need and makes for a terrible user experience. All the pages shown on your site map should contain a link back to the sitemap.

6) If you have very extensive content with a huge number of pages, try to create a sitemap for each silo. The master sitemap would not contain all the pages of the website, but would lead search engines and users to the appropriate sitemap, just like the rest of your site.
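The nested-sitemap approach from points 2) and 6) can be sketched in a few lines of Python. The domain and file names here are hypothetical, and `xml.etree.ElementTree` is used only to keep the XML well-formed; this is a sketch, not a full sitemap-protocol implementation.

```python
from xml.etree import ElementTree as ET

def build_sitemap_index(sitemap_urls):
    # A sitemap index lists the per-silo sitemaps, not individual pages
    root = ET.Element("sitemapindex",
                      xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for url in sitemap_urls:
        entry = ET.SubElement(root, "sitemap")
        ET.SubElement(entry, "loc").text = url
    return ET.tostring(root, encoding="unicode")

# Hypothetical per-silo sitemaps for a large site
index_xml = build_sitemap_index([
    "https://www.example.com/sitemap-products.xml",
    "https://www.example.com/sitemap-blog.xml",
])
```

Each per-silo sitemap listed in the index is then a normal urlset file covering only that section of the site.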

The site map must also:
begin with an opening <urlset> tag (which encapsulates the file and references the current protocol standard) and end with a closing </urlset> tag
include a <url> entry as a parent XML tag for each URL; the remaining tags are children of this tag
include a <loc> child entry for each <url> parent tag; the URL must begin with a protocol such as http:// and end with a trailing slash if your web server requires it, and this value must be less than 2,048 characters
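The three requirements above (a urlset root, one url parent per entry, and a protocol-prefixed loc under 2,048 characters) can be checked mechanically. This is a minimal sketch with a hypothetical URL:

```python
from xml.etree import ElementTree as ET

def build_sitemap(urls):
    # <urlset> root tag referencing the current protocol standard
    root = ET.Element("urlset",
                      xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for u in urls:
        # each <loc> must start with a protocol and stay under 2048 chars
        if not u.startswith(("http://", "https://")):
            raise ValueError("loc must begin with a protocol: " + u)
        if len(u) >= 2048:
            raise ValueError("loc must be less than 2048 characters: " + u)
        url_tag = ET.SubElement(root, "url")  # one <url> parent per entry
        ET.SubElement(url_tag, "loc").text = u
    return ET.tostring(root, encoding="unicode")

sitemap_xml = build_sitemap(["https://www.example.com/"])
```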

December 1, 2016

how to create local landing pages: top 3 tips for ranking well on search engines

"how to create local landing pages: top 3 tips for ranking well on search engines"

3 tips for creating local landing pages that are optimized for search engines. Local search is big business: there are 3.2 million print yellow pages advertisers in the U.S. today, generating just under $15 billion in annual revenue. According to recent search estimates, 1 in 3 searches is about a place, and BIA/Kelsey forecasts overall U.S. local advertising revenue to reach $148.8B in 2017. There are right ways and wrong ways to create local landing pages that rank well in search engines, and not all of them are technically the right way to rank on Google.

One wrong method is to stuff a bunch of city names with zip codes into a text block and hope to rank for those terms. A second wrong method is to generate a hundred copies of a web page and then use find-and-replace to substitute a different city name on each landing page. That approach creates thin, mostly duplicated content that is unlikely to bring serious, targeted traffic; worse, the lack of real content might attract a penalty from Google's Panda algorithm and badly damage your site's ranking.

This kind of find-and-replace method does not convince search engines that you are seriously trying to rank for the given keywords. So what is the right method for creating local landing pages that will bring you quality traffic? Here are some best practices for creating landing pages that rank well in location search.

In the title of the landing page, as well as in the body text and keywords, mention the locality of the region you are planning to target. Make sure each page has the name of the city and the service you are offering to consumers, and use H1 and H2 tags in the body text.
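As a rough illustration of that rule, the snippet below assembles a hypothetical title, H1 and H2 that each carry the city and the service; the company name and wording are made up.

```python
def local_page_head(service, city):
    # Hypothetical names; both title and headings carry city + service
    title = f"{service} in {city} | Example Co"
    h1 = f"<h1>{service} in {city}</h1>"
    h2 = f"<h2>Why choose our {service.lower()} in {city}?</h2>"
    return title, h1, h2

title, h1, h2 = local_page_head("Car Repair", "Palo Alto")
```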

Include conversational modifiers: when users search for a location they often use words like "nearby", "in between", and "neighborhood". These words signal interest in location-specific search results, which are becoming more routine thanks to voice-enabled search on smartphones. Sprinkling conversational modifiers near your keywords might make sense; for example, users may search for "car repair centers between Palo Alto and Sacramento". On your landing page, talk about things related to the neighborhood. Instead of using merely the name of the location, use the names of famous landmarks near the place, a well-known coffee shop, a nearby Starbucks, or a local KFC restaurant, so users can locate the place easily if they don't know the exact address. Likewise, if you are establishing a business in Los Angeles, you can mention Hollywood or Southern California in the text to make the page unique.

Think creatively about creating content that differs from other local landing pages. For example, you can create local customer testimonials, with videos or text and images of what local customers have to say about your services. Talk about local events your business has tied up with, co-produced, co-branded or sponsored, or show images of past events you participated in. Publish a one-page city guide that gives a snapshot of the place: its history, climate, key landmarks, events, culture, local guides, and other places of interest visitors might like to see.

October 20, 2016

a dynamic website using a content management system: must-have seo features

A dynamic website is a website built using a template and a CMS. A content management system gives you control over how you create your web pages, pulling information from various sources in a database. That means the web pages do not exist until something builds them for you. For example, suppose you run an online shopping website with more than 10,000 products. You are not going to build 10,000 web pages, one per product, manually. Instead you use a content management system to build the pages dynamically, on the fly.
A CMS generates the page a search spider crawls by taking the information in your database and plugging it into a template web page, so the CMS creates the tags, content and code that are ultimately seen by the search engines. One important thing to remember: it is essential to have a search-marketing-friendly CMS when creating a dynamic website. Any CMS that gives you complete access to individual pages at a granular level, letting you create tags, meta tags, keywords and titles, is considered search friendly. These features also give you control over the entire content of the web pages: you should be able to change the title tags and H1 tags and edit keywords and descriptions.
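A toy sketch of that flow: a hypothetical product record stands in for the database row, and a format string stands in for the CMS template that produces the tags the spider finally sees.

```python
# Hypothetical CMS template: the database record is plugged into the
# template, producing the title, meta and heading tags the spider crawls.
TEMPLATE = (
    "<title>{name} | Example Store</title>\n"
    '<meta name="description" content="{description}">\n'
    "<h1>{name}</h1>\n"
    "<p>{description} Price: ${price}.</p>"
)

def render_product_page(record):
    return TEMPLATE.format(**record)

page = render_product_page({
    "name": "Blue Widget",
    "description": "A sturdy blue widget.",
    "price": 19,
})
```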

In short, any CMS that does not allow you to control SEO at the individual page level is not SEO friendly. You can use WordPress, which is open source, or a commercial option such as Pixelsilk. For search, keep these things in mind when deciding on a CMS: 1) it should let you customise HTML templates; 2) it should allow you to produce unique title tags; 3) you should be able to add meta descriptions and keywords on your own; 4) it should let you produce and change H1, H2 and H3 tags at will; 5) it should let you categorise and create content by groups.

October 14, 2016

google confirms sites hit by google penguin 4 years ago are recovering

Webmasters who had been hit by Google Penguin can now rejoice. A recent Moz post shows that sites hit by Google Penguin are seeing increased traffic from Google after a long two-year wait, following the latest Penguin update that rolled out in late September and early October 2016. This roll-out was unusual in many ways, and it only now seems to be settling down. In the past couple of weeks, we've seen many reports of recoveries from previous Penguin demotions. On October 7th, Gary Illyes from Google said that the Penguin roll-out was in the "final stage" (presumably, the removal of demotions) and would take a "few more days". As of this writing, it's been five more days.
Google representatives are confirming details about the new Penguin both publicly and privately, and algorithm flux matches the general timeline. Perhaps more importantly, we're seeing many anecdotal reports of Penguin recoveries, as shown in the image above.

Services that track websites hit by Google Penguin have also reported that sites hit after the Penguin update are slowly showing signs of recovery.

August 4, 2016

why using flash content is seo unfriendly : 5 reasons

Although Google has indexed Flash content and followed its links since 2008, the fact remains that Google, or for that matter any search engine, cannot read much of what is written inside Flash files. We take a look at 5 reasons why using Flash content on your website is considered less than fully SEO friendly. The most crucial problems with Flash are the missing SEO features: anchor texts, H1-H2 tags, bold and strong tags, alt image attributes, and even page titles are not simple to implement in Flash. Developing Flash with SEO factors in mind is more difficult than doing it in HTML.

1) Different content is not on different URLs: this is the same problem you have with Ajax-based pages. You may have unique frames, or movies within movies, that appear to be completely unique sections of the Flash site, but there is no way to link to these individual elements.


2) The text breakdown is not clean: Google can index the output files in the .swf to see words and phrases, but in Flash a lot of the text is not clean HTML such as <h1> or <p> tags. It is jumbled into half phrases for graphical effects, and the output is often in the wrong order. Worse still are text effects that often require breaking words apart into individual letters to animate them.

3) Flash gets embedded: a lot of Flash content is only linked to other Flash content, wrapped inside shell Flash pages. This chain of links, where no other internal or external URLs reference the interior content, leads to documents with very low PageRank, as the link juice fails to get transferred. Even if such a page manages to stay in the main index, it probably won't rank for anything.

"why Flash should be avoided "

4) Flash does not earn external links like HTML: an all-Flash site may get a large number of links to the homepage, but the interior pages always suffer. When webmasters link to embeddable Flash, they normally point to the HTML host page rather than any interior page within the Flash.

3 reasons why creating microsites is a cardinal sin in seo

Search algorithms specifically favour large, authoritative websites that have a depth of information. This means that, on any given day, the bigger site, the one with the most and deepest content, has the highest probability of appearing in search results, as opposed to small, thinly spread-out content in the form of a microsite. All search engines give more weight to big websites with deep and broad pages full of content. This negates the rationale for creating new content as a microsite, as it is less helpful than adding the content to your existing site.
" microsite vs one website : whats the best way forward in seo"

No matter how well you create your content, you can never replicate the content that is on your main website. As your main site is already seen and indexed by the search engines as a website with a huge reservoir of quality content, trying to create another site with similar or only slightly unique content will not get you much traction. A multiple-site strategy also splits the benefits of links: a single quality link pointing to a page on the main domain positively influences the entire domain and all the pages associated with it. Because of this, it makes more sense to have every good link you earn point to the same domain, to help boost the rank and value of the pages on it. A new site will also take time for search engines to spider and rank in search results, which hurts the visibility of a brand on the web.

Additionally, it will take some time for a new site to appear in Google results. A clear-cut content strategy is a must before creating a new microsite, and the difficulty of developing content for such a site is itself a reason not to build one. Having content or keyword-targeted pages on other domains that don't benefit your main site in terms of content or backlinks is an exercise in futility.

3 instances when you should not be using a .com as a top level domain

WHEN NOT TO USE .COM AS YOUR TOP-LEVEL DOMAIN: most web URLs have a .com TLD, but in certain circumstances it is better to avoid using .com. We take a look at a few special cases where a TLD other than .com makes sense.

1) When you already own the brand .com and wish to redirect it to a .tv, .org or .biz, possibly for marketing, branding or geographic reasons, or to venture into corporate social activity as a CSR initiative, any socially relevant service, or addressing the bottom of the pyramid either through a corporate initiative or by tying up with an NGO.
2) You can use .gov, .mil or .edu domains if you belong to the corresponding category (for the appropriate organizations and associations).

3) When you are serving a specific geography or market and are completely focussed on the local market, without any intention of venturing into other markets for a long time to come. In such cases country extensions (.fr, .de) are a great way to get good search rankings.
4) When your organization is a non-profit devoted to a social cause, like an NGO, a charity, helping war veterans, or collecting donations. Under these circumstances you can use a .org extension.

The world wide web is made up of 92% .com domains as the TLD. Though the top-level domain hardly matters in terms of search, the URL is an important part of online branding as well as social media.

July 14, 2016

deciding the optimum url length for search engine benefits

Selecting the most appropriate url formats for search engines: 

How does a search engine decide the optimum length of a URL that is considered search friendly?

While it is true that search engines give a fair amount of importance to the URL, they are also quite adept at interpreting long URLs with numerous hyphens, and at detecting the extent to which webmasters use them for spamming. For example, some overexcited SEO webmasters may stuff a string of keywords into the URL itself. This only signals to the search robots, with the sound of trumpets, that the entire post and URL should be discounted as spam.


" the art of creating search engine friendly url's for optimization"
So how do you tell the search engines about your content through its URL, and ensure you do it just the right way?
1) Describe your content: if a user can make an accurate guess about your content just by looking at the address bar, you have done your job
2) Use static URLs: dynamic URLs are harder to index for both search engines and users
3) Descriptors, not numbers: never use a random set of numbers like 1234 when you can use a brand or name
4) Use lowercase: although URLs can accept both upper and lower case, always desist from using uppercase
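Rules 1) through 4) can be folded into a small slug helper. This is a sketch, and the blog URL used is hypothetical.

```python
import re

def make_seo_url(base, title):
    # Descriptive, static, lowercase: words from the title, hyphenated,
    # no query strings and no random numbers
    slug = re.sub(r"[^a-z0-9]+", "-", title.lower()).strip("-")
    return f"{base}/{slug}"

url = make_seo_url("https://www.example.com/blog",
                   "Deciding the Optimum URL Length")
```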

June 2, 2016

optimizing your site for 5 things that search engines cannot see

How do you identify problems with your site that search engines fail to see, and that lead to search spiders missing your site when indexing?
Problem 1: Here is a simple scenario. Your webmaster is working on a site on a staging server. You don't want search engines to see these pages, as they are duplicate versions of your real pages, so the staging copy is kept as "NoIndex". This is something that search engines cannot see.
Normally, when you move the site from the staging server to the live server, you should remove the NoIndex tags. If you forget to remove NoIndex, you will see a phenomenal drop in traffic.

Problem 2: Some webmasters implement a robots.txt on the staging server that prohibits crawling of the site. If this file gets copied over when the site is switched to the live server, the consequences will be just as bad as in the NoIndex example.
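A pre-launch check for both staging leftovers (the NoIndex tag from Problem 1 and the blocking robots.txt from Problem 2) might look like this sketch; the regex is deliberately simple and would need hardening for real HTML.

```python
import re

def staging_leftovers(html, robots_txt):
    # Flag a leftover noindex meta tag or a robots.txt blocking the site
    problems = []
    if re.search(r'<meta[^>]+name=["\']robots["\'][^>]+noindex', html, re.I):
        problems.append("noindex meta tag still present")
    for line in robots_txt.splitlines():
        if line.strip().lower() == "disallow: /":
            problems.append("robots.txt blocks the entire site")
    return problems

issues = staging_leftovers(
    '<meta name="robots" content="noindex,nofollow">',
    "User-agent: *\nDisallow: /",
)
```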

Problem 3: A key difference between a person using a browser and a search engine spider is that the person can manually type a URL into the browser window and retrieve the page the URL points to. Search engine crawlers lack this capability. Instead, they are forced to rely on links they find on web pages to find other pages. This is one reason why inbound links are so crucial.

Problem 4: Another technical problem occurs when search crawlers encounter an object or file type that is not a simple text document. Search engines are designed to index text and are highly optimized to perform search and retrieval operations on text, but they don't do very well with nontextual data.

Problem 5: The best way of detecting these issues and taking appropriate action is to use analytics software to find pages on your site that get page views but no referring search traffic. Though this is not conclusive on its own, it does provide a clue about what is going wrong on your site. The reverse is also true: if you see content on your site getting search referrals even though you don't want or expect it, you may want to hide that content.
Auditing what you have missed in search optimisation: another data point you can use to find out whether search engines can see your content is to check how much of it is being picked up. For example, if you have a site with 1,000 pages and good inbound links, and after 3 months you see only 20 pages indexed, that is a strong clue that there is a problem.
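The 1,000-pages/20-indexed audit above reduces to a simple ratio; the 50% threshold below is an arbitrary illustration, not a published guideline.

```python
def index_coverage(total_pages, indexed_pages, threshold=0.5):
    # Arbitrary 50% threshold: far fewer indexed pages than total
    # suggests something is blocking the crawlers
    ratio = indexed_pages / total_pages
    return ratio, ratio < threshold

ratio, looks_broken = index_coverage(1000, 20)
# 20 of 1,000 pages indexed is 2% coverage, a strong sign of a problem
```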

May 31, 2016

deciding when to add nofollow links vs follow links

How do you decide when to use rel=nofollow and when not to? While this has been explained in depth in my earlier post on the importance of rel=nofollow and follow links, one of the best ways to decide is to understand and analyse which kinds of sites or links influence your posts; in short, compare which is relevant to your website and which is not. Suppose you read an article about new SEO strategies for 2016, and separately you find a blog template which you like and want to use. Which one would you mark rel=nofollow?

In 2005, the 3 major search engines (Google, Yahoo and Microsoft) agreed to support an initiative to reduce the effectiveness of automated link spam. Unlike the meta robots version of NoFollow, the new directive was employed as an attribute within an <a> link tag, indicating that the linking site does not vouch for the quality of the linked page.
Read the full post below

You can read more about rel=nofollow links and rel=follow links here

If you are running an SEO blog, obviously the first one is relevant, as it gives you information which is topical, relevant and contextual to what your site is about, while the blog template is not. Follow links are used for something that you liked and wish to pass link juice to: you are voting for that site and telling the search spiders that the site is trustworthy. Use nofollow for the second one: you are telling the spiders that you don't wish to pass on link juice to that site.
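A sketch of that decision applied mechanically: links to domains you vouch for are left alone, everything else gets rel="nofollow". The regex-based rewriting is for illustration only; a real implementation should use an HTML parser.

```python
import re

def nofollow_untrusted(html, trusted_domains):
    # Add rel="nofollow" to links whose href is not on a trusted domain
    def fix(match):
        tag, href = match.group(0), match.group(1)
        if any(d in href for d in trusted_domains) or "rel=" in tag:
            return tag
        return tag[:-1] + ' rel="nofollow">'
    return re.sub(r'<a\s+href="([^"]+)"[^>]*>', fix, html)

out = nofollow_untrusted('<a href="https://spammy.example/page">x</a>',
                         ["trusted.example"])
```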

May 3, 2016

losing out on your quality score: the best landing page infographic

" 5 tips for a perfect landing page"

5 ways to ensure SEO benefits when deciding your landing page

For any SEO optimisation, the landing page is key. In organic search the landing page per se is not as crucial as in paid search, but irrespective of the nature of search, a well-designed landing page helps spiders look at your content in a structured, granular way. This helps you increase your quality score in paid search and helps your ranking in organic SEO. The infographic above shows how a landing page ought to be designed and the key elements to be incorporated.

"how to ensure SEO benefits while deciding your paid search landing page"

The 12 jewels to be incorporated in your landing page 

April 12, 2016

how to use blended search to rank well across local search

BLENDED SEARCH RESULTS: 5 ways it impacts your organic search results

The introduction of Google's "Snack Pack" results was perhaps the biggest local search shakeup since Google Pigeon rolled out back in July of 2014 with its parameters for local search rank. Of late, however, Google has trimmed the 7-listing snack pack down to a 3-listing snack pack, which is now the most crucial real estate for local search.

Blended search results means that when Google ranks your site, it tries to use blended factors to find out how effective you are in terms of vertical search. Blended results such as local, shopping, images, real-time, videos and news are taken into account and have begun to play a crucial role in addition to the standard search ranking factors.
Here is a lowdown on the blended search factors that might help you rank well in local search and get your site indexed much faster.


  1. The business has a listing in the local indexes, including Google Places, Yahoo Local and Bing Local
  2. The actual business location has the same address as mentioned on the website, in a search-query-friendly format
  3. The business has a local phone number for the searched location
  4. The website contains the local address and phone number for the query
  5. The website has many positive, superior ratings from users
  6. Social signals: how your business is shared across social media, and how many clicks your local business ads generate across Facebook, Twitter, Klout and LinkedIn, among others

"5 ways how you can rank better in local search"

  1. Videos that describe the local institution, or something about the products and services, which are tagged
  2. The local listings (Google Local) have to be claimed by the local owners and verified as their business

  1. Foursquare check-ins and mobile click-through rates
  2. Of late, behavioral and/or mobile signals are said to make up 9.5% of the algorithm across localized organic results
  3. High numerical ratings on Google Local, i.e. ratings by Google users
  4. The listing title and the description should have at least 3 keywords in order to surface across blended-search SERPs
  5. The business should be listed across third-party data providers and yellow pages
  6. Geo-tagged images on photo-sharing sites like Flickr, Dropbox and Panoramio, along with a caption that ties the geotagged images to the local business

April 11, 2016

struggling with organic search : 5 seo parameters which you need to improve

5 Reasons why you are not succeeding in Organic Search  

While most of us know that Google's algorithm determines the ranking of a site from many signals, externally (that is, in off-page optimization) the most important are:
1) the web page text
2) the linking sites
3) the number and quality of the sites linking back to you
4) the anchor text
But let's find out what most search engines care about. Here is a quick lowdown on the most important factors affecting your organic search results.

" search engine ranking on Google"

5 Factors to keep in mind for organic search

INBOUND ANCHOR TEXT: when asking webmasters for a link, ask them to include your anchor text. The anchor text should be relevant to the basic theme of the website. Say you have a website on hotels and you are approaching a travel site for a link, and your target keyword is "cheap hotels with great facilities": use this phrase as the anchor text when you ask webmasters to link to your site or to an internal page.

SITE AUTHORITY: how other sites see your website. Are you considered the best website for cheap hotels? For example, how many backlinks does your nearest competitor have compared to you?
" seo list of periodic tables to rank in the first page"

periodic table of  elements and factors for better search engine ranking

VISIBLE HTML ON YOUR PAGE: avoid dynamic links, JavaScript and Flash. Most of your website needs to be in HTML, as the search spiders are more comfortable picking up HTML content.

AGE OF THE DOMAIN: the earlier your website was registered and online, the more relevant it is considered, compared with a website that started recently.

NATURE OF INBOUND LINKS: quality, relevance and quantity are the most important factors that decide the organic side of SEO.

CONTENT PRIMACY: do you have unique content that does not already exist on any other site, or have you copied content and added a few things of your own? This may take some time, but the more original content you write, the more the search engines view your site as fresh and unique.

SPEED: the faster your site's loading time, the better it is for both search engines and users. Slow speed harms your website in two ways: the bounce rate increases, and as Google's crawler comes with a specific bandwidth budget, it becomes harder for the bots to cover your whole website.
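One crude way to keep an eye on load time is simply to time the render. This sketch times an arbitrary function and makes no claim about how Google itself measures speed.

```python
import time

def time_page_render(render_fn):
    # Crude timing of a page-render function; slow pages raise bounce
    # rates and eat into the crawler's bandwidth budget
    start = time.perf_counter()
    render_fn()
    return time.perf_counter() - start

elapsed = time_page_render(lambda: "<html>...</html>")
```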

DISCLAIMER: these are some of the most important SEO parameters; there are more than 100 parameters that search engines use to rank websites.

April 4, 2016

planning navigation for non-human visitors and search bots

SITE MAPS TO BE FOLLOWED BY SEARCH ENGINE SPIDERS: 5 ways to ensure bots visit your web site

The most crucial fact you have to understand is that a search engine does not see your site the same way a human visitor does. By creating robot-friendly navigation you encourage the search bots to visit your site more often, which in turn increases the visibility of your site. Increased visits by search bots ensure your content gets indexed faster. When Google's spiders visit your site, the cache shows when they last visited; if Google's cache shows a very recent visit, it means your content is being picked up faster, and this improves your visibility in the SERPs. To perform better in search engine listings, your most important content should be in HTML text format.
"an ideal link structure for robots"

Things not to do: avoid Flash files, Java applets, and other non-text content, and put more emphasis on HTML pages. This is because search bots often ignore or devalue such content.

see the entire post below

How do you do this?
1) By providing search engine robots with links to navigate through your website
2) By pointing search engine robots to dynamic or hard-to-read pages that might not be accessible otherwise
3) By providing a possible link to the landing page
4) By ensuring your 404 error page has a link pointing to a page (preferably the home page)
5) By serving ready-to-use content if a page is not available or a URL is broken
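Point 4) in particular is easy to sketch: a hypothetical 404 handler that always offers a link back to the home page so neither bots nor users hit a dead end.

```python
def handle_404(path):
    # A helpful 404 always links back to the home page: no dead ends
    body = (
        "<h1>Page not found</h1>"
        f"<p>Sorry, {path} does not exist.</p>"
        '<p><a href="/">Return to the home page</a></p>'
    )
    return 404, body

status, body = handle_404("/old-page")
```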

April 1, 2016

VC exits from technology start ups

2015 TECH EXITS BY VCs: there were over 3,400 exits in 2015, a 14% increase from 2014. But exits by quarter are trending downward, with just 717 exits in Q4'15, the lowest quarterly total since Q1'14.

VC exits from top 10 biggest Technology start ups

"VC exits from top 10 biggest Technology start ups"

"VC activity, including startup exits, numbered 3,400 last year"

Global tech exits including M&A, IPO trends, and much more

March 29, 2016

dynamic serving vs responsive design : customising your design for mobile users

As we explained in the last responsive mobile design post, there are 3 ways to optimize your website for mobile: 1) creating a responsive website, 2) dynamic serving, and 3) creating a separate mobile website.

In this post we look at how to use dynamic serving for mobile users, and under what circumstances this design is used.

Dynamic serving is a server-side development approach that detects which type of device your visitors are using to view your website. It optimizes the website content based on the device your users are on, and the server responds accordingly.

Like responsive design, dynamic serving uses a single set of URLs for all content regardless of how the content is seen, whether on a desktop, laptop, tablet or mobile. But that is where the similarity ends: in dynamic serving the URL remains the same, yet the content delivered to a mobile device is not always the same as on desktop.

This is because dynamic serving is a server-side approach that alters the content code (HTML, CSS and PHP) based on the device that is asking for it, before the content is delivered to the browser. This allows the server to alter the content of the page without altering the URL of the page.
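A minimal sketch of that server-side switch: the user-agent tokens checked below are illustrative, and the Vary: User-Agent header is included because dynamic serving conventionally signals to caches and crawlers that the response differs by device.

```python
def serve_page(user_agent):
    # Same URL, different HTML depending on the requesting device;
    # Vary: User-Agent marks the response as device-specific
    is_mobile = any(tok in user_agent
                    for tok in ("Mobile", "Android", "iPhone"))
    body = "<h1>Mobile view</h1>" if is_mobile else "<h1>Desktop view</h1>"
    headers = {"Content-Type": "text/html", "Vary": "User-Agent"}
    return headers, body

headers, body = serve_page("Mozilla/5.0 (iPhone; CPU iPhone OS 9_0)")
```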


1) When your website needs to include complex mobile-friendly functionality such as multi-page forms and interactive dashboards. Dynamic serving allows you to serve the best experience based on user circumstances.

2) When your website needs to serve 2 different device markets very differently. One example is when iPhone users take a different path to conversion compared to Android users. Dynamic serving is also used when you want your webpages to render differently for tablet users and smartphone users.

3) When your visitors largely use different keywords to reach your website via desktop search and mobile search. Dynamic serving allows you to alter the way the content is rendered on a page-by-page basis.

4) Dynamic serving is the best solution if you want to optimize specific pages for high-volume keyword phrases without changing the desktop version, and if you want your desktop visitors and mobile visitors to convert in different ways.

March 26, 2016

science of writing meta tag descriptions: 5 best practices for seo

5 best practices in writing the perfect SEO meta tag description

The significance of the meta description: while the perceived (and real) importance of meta data in search has declined, the attribute still plays a significant role in SEO. The meta description has three primary uses.

The meta description's role in search:
1) to describe the content of the page accurately and succinctly
2) to serve as a short advertisement that entices searchers to click through to your page
3) to display the targeted words, not for ranking purposes, but to indicate the page's content to searchers

5 rules to write the best meta descriptions

1) Watch the number of characters: descriptions should be succinct and compact. Keep the length to about 160 characters for Google and up to 200 characters for Bing; in practice, an average of 165 characters, including spaces, is a safe target.

2) Test, refine, repair and rephrase: just like an ad, which undergoes a plethora of tests, ensure that the description actually fits the content or the theme of the page in question. Ideally, each web page should have its own unique description.

3) Include relevant keywords: it is very important to have the right keywords in the meta description tag, because search engines display the matched terms in boldface, which can make a huge difference in the visibility and click-through rate of the page in question.

4) Include tagged facts in the description: you can include structured, tagged facts beyond the standard information. News or blog posts can list the author, date of publication, or byline information. What about a product page? Product information such as price, age range, manufacturer and features often lies scattered throughout a page, and a good meta description can bring all this data together. For example, a meta description for an Ian Fleming title could surface the author, format and price directly in the search snippet.

5) Understand the user psychology of search disparity: take into account user differences in search behaviour. For example, an organic search user will not see your website the same way as a user who comes from PPC search. While creating the meta description it is important to keep this basic fact in mind. Users looking for information and users looking to shop online are clearly two different sets of consumers, and you need to create descriptions based on which kind of user is your target segment.

A final caveat about universal descriptions: some search marketers are of the view that you should not always write a description. Conventional logic holds that it is wiser to write a good meta description yourself, to maximise the chances of it being used in the SERPs, rather than letting the engines build one. However, this isn't always true.

If the page in question is targeting one, two or three heavily searched terms or keyword phrases, go with a meta description that targets users performing those searches. However, if you are targeting long-tail traffic with hundreds of articles or blog entries, it can sometimes be wiser to let the search engines extract the relevant text themselves. The reason is simple: when the search engines show a page in the SERPs, they display the keywords and the surrounding phrases that the user searched for. If you try to force a meta description, you can end up creating one that is not appropriate for the search phrase your page gets matched to.
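The length and tagged-facts rules above can be sketched in code. Here is a minimal Python example, assuming a hypothetical product record and the commonly cited 160-character Google display limit, that joins structured facts into one description, trims it at a word boundary, and emits the tag:

```python
import html

GOOGLE_LIMIT = 160  # commonly cited display limit, not an official guarantee

def truncate_at_word(text: str, limit: int = GOOGLE_LIMIT) -> str:
    """Trim the description to the limit without cutting a word in half."""
    if len(text) <= limit:
        return text
    # Leave room for the trailing ellipsis, then cut at the last space.
    cut = text.rfind(" ", 0, limit - 3)
    return text[:cut if cut > 0 else limit - 3].rstrip(" ,;") + "..."

def meta_description(*facts: str) -> str:
    """Join tagged facts (title, author, price...) into a single meta tag."""
    description = truncate_at_word(" | ".join(f for f in facts if f))
    return '<meta name="description" content="%s">' % html.escape(description, quote=True)
```

For a hypothetical book page, `meta_description("Casino Royale by Ian Fleming", "Paperback")` yields one length-safe, HTML-escaped snippet ready to paste into the page head; the facts shown are placeholders, not real listing data.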