
Showing posts with label search engine. Show all posts

December 1, 2016

how to create local landing pages: top 3 tips for ranking well on search engines

"how to create local landing pages: top 3 tips for ranking well on search engines"

3 tips for creating local landing pages that are optimized for search engines: Local search is big business. In the US alone there are 3.2 million print yellow pages advertisers today, generating just under $15 billion in annual revenues. According to recent search estimates, 1 in 3 searches is about a place, and BIA/Kelsey forecasts overall U.S. local advertising revenues to reach $148.8B in 2017. There are right ways and wrong ways to create local landing pages that rank well in search engines.

One wrong method is to fill a text block with a bunch of city names and zip codes and hope to rank for those terms. A second wrong method is to generate hundreds of copies of a web page and use find-and-replace to substitute a different city name on each landing page. That approach creates thin, mostly duplicated content that is unlikely to bring serious, targeted traffic; worse, the lack of real content might attract a penalty from Google's Panda algorithm and badly damage your site's rankings.

This kind of find-and-replace content does not convince search engines that you are seriously trying to rank for those keywords. So what is the right method for creating local landing pages that bring you quality traffic? Here are some best practices for ranking well in local search.

In the title of the landing page, as well as in the body text and keywords, make sure you mention the locality you are targeting. Each page should name both the city and the service you offer to consumers. Also use H1 and H2 tags in the body text.
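The per-city approach above can be sketched in a few lines of Python. This is a minimal illustration, not production code: the service name, cities, and neighborhood details are invented, and the point is simply that each page gets a title and H1 unique to its locality rather than a find-and-replace copy.

```python
# Sketch: build a unique, locality-specific title and H1 for each landing page.
# The city list, service name, and neighborhood details are hypothetical examples.

def local_page_head(service, city, detail):
    """Return a (title, h1) pair that names both the service and the locality."""
    title = f"{service} in {city} | {detail}"
    h1 = f"{service} in {city}"
    return title, h1

pages = [
    ("Car Repair", "Palo Alto", "Near University Avenue"),
    ("Car Repair", "Sacramento", "Serving Midtown and East Sacramento"),
]

for service, city, detail in pages:
    title, h1 = local_page_head(service, city, detail)
    print(title)
    print(h1)
```

The `detail` field is what keeps each page from being a mere city-name substitution: it carries the neighborhood-level content the tips below recommend.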

Include conversational modifiers: When users search for a location they often use words like "nearby", "in between", and "neighborhood". These words signal location-specific queries, which are becoming routine thanks to voice-enabled search on smartphones. Sprinkling conversational modifiers near your keywords can make sense; for example, users may search for "car repair centers between Palo Alto and Sacramento". On your landing page, talk about the neighborhood: instead of using merely the name of the location, mention famous landmarks near the place, a well-known coffee shop, a nearby Starbucks, or a local KFC restaurant, so users can locate the place even if they don't know the exact address. Likewise, if you are establishing a business in Los Angeles, you can add Hollywood or Southern California to the text on a given page, making it unique.

Think creatively about content that differentiates your page from other local landing pages. For example, create local customer testimonials, with videos or text and images of what local customers have to say about your services. Talk about local events your business has tied up with, co-produced, co-branded, or sponsored, and include images of past events you participated in. Publish a one-page city guide giving a snapshot of the place: its history, climate, key landmarks, events, culture, local guides, and other places of interest visitors might like to see.

March 25, 2016

5 best practices in title tag construction for search engines

For keyword optimization, title tags are the most critical element for search engine relevance. The title tag lives in the <head> section of the HTML document and is the only piece of meta information about a page that reliably influences relevancy and ranking. The following rules represent best practices for title tag construction. One thing to keep in mind is that the title tag of any given page has to correspond to that page's content.

1) Incorporate keyword phrases: An obvious step is to use in the title tag whatever your keyword research shows to be the most valuable keywords for capturing searches.

2) Place your keywords at the beginning of the title tag: This provides the most search engine benefit. If you do this and also wish to include your brand name in the title tag, place the brand at the end. There is a trade-off here between SEO benefit and branding benefit that you should weigh explicitly before deciding. Well-established, well-known brands may want their name at the start of the title tag, as it can increase click-through rates (CTR).

3) Limit your title to 65 characters, including spaces: Content in the title tag should not exceed 65-70 characters for almost all search engines; anything beyond that gets cut off in the SERPs.
4) Focus the title on click-through and conversion rate: The title tag is very similar to the title you would write for a paid search ad. However, this is hard to measure and improve, because the stats are not easily or readily available. If the market you serve has relatively stable search volumes, you can run tests to see whether you can improve your CTR.

5) Target the searcher's intent: When writing titles and descriptions, keep in mind what users are doing when they reach your site, whether through search engines or referrals. Study your top 10 landing pages and try to find out why users are coming to your site.

If the searcher's intent is research, tweak your title to be more descriptive. If users are purchasing online and looking for discounts on your site, your title should clearly mention that those offers are available, for example: "digital cameras now at 40% less: axax (name of website): the top-selling digital cams online".
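The 65-character rule above is easy to automate. Here is a minimal Python sketch (the sample titles are invented) that previews how a too-long title would be cut off in the SERPs:

```python
# Sketch: validate title-tag length against the ~65-character SERP cutoff.
# The limit and sample titles are illustrative, not an official Google value.

MAX_TITLE = 65

def check_title(title, limit=MAX_TITLE):
    """Return (title as a SERP would show it, whether it fits the limit)."""
    if len(title) <= limit:
        return title, True
    return title[:limit].rstrip() + "...", False

shown, fits = check_title(
    "digital cameras now at 40% less: axax: the top selling digi cams online"
)
print(fits)   # this title is longer than 65 characters, so it would be cut off
print(shown)
```

Running a check like this over every page's title catches overruns before the search engines do.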

February 28, 2016

5 kinds of linking strategies that can affect your seo rankings

Among the most crucial factors affecting your website traffic is your linking strategy. At times we go overboard to get links and focus on quantity rather than quality. This hurts your traffic and can change your SERP position. So how do you judge the quality of your links? Here are five ways:
1) Is the industry you are targeting complementary to yours, or is it a competitor?
2) How many links does the website have? Are the links merely buried in a links page? A page with more than 100 links is unlikely to pass on much value through any single link.

3) Relevance and authority: Links related to the same topic are given more weight than random links to unrelated pages. Think of the relevance of each link as being evaluated in the context of a specific user search query. For example, for the search term "new cars in Arizona", if the publisher has received a link from the Arizona chamber of commerce, the search engine infers that the link is relevant and trustworthy because the linking site is about Arizona. Once you decide on the industry you are targeting, your link-building focus should be razor sharp on getting links from within that industry.

4) Nofollow links: Google recently indicated that the nofollow attribute is not as significant as it was a couple of years ago. Still, when you use a nofollow meta tag on a page, the search engine will crawl the page and place it in its index, but all links on the page (both external and internal) will be prevented from passing link juice to other pages.

5) Anchor text: Avoid links whose anchor text reads "our links", "click here to know more", "read more", or "check out the full post here". Be very specific about the anchor text with which your site is linked; the best policy is to use the title of your website or web page as the anchor text. The impact of anchor text is more powerful than you might think. For example, if a link points to a page with minimal search-friendly content (a Flash site, for example), the search engine will look for signals to learn what the page is about; in that case, inbound anchor text becomes the primary driver in determining the relevance of the page.
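A quick way to audit your own anchor text and nofollow usage is to parse the links on a page. Below is a rough sketch using Python's standard-library html.parser; the sample HTML is invented for illustration.

```python
# Sketch: list each link's href, anchor text, and whether it carries
# rel="nofollow". The sample HTML snippet is made up for illustration.
from html.parser import HTMLParser

class LinkAudit(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []          # (href, anchor_text, nofollow)
        self._current = None     # the <a> tag currently being collected

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            a = dict(attrs)
            nofollow = "nofollow" in (a.get("rel") or "")
            self._current = [a.get("href", ""), "", nofollow]

    def handle_data(self, data):
        if self._current is not None:
            self._current[1] += data

    def handle_endtag(self, tag):
        if tag == "a" and self._current is not None:
            href, text, nofollow = self._current
            self.links.append((href, text.strip(), nofollow))
            self._current = None

html = '''
<p><a href="/arizona-cars">new cars in Arizona</a>
<a rel="nofollow" href="/ads">click here</a></p>
'''
audit = LinkAudit()
audit.feed(html)
for href, text, nofollow in audit.links:
    print(href, repr(text), "nofollow" if nofollow else "followed")
```

Vague anchors like "click here" show up immediately in an audit like this, making them easy to replace with specific, descriptive text.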

February 25, 2016

how search engines use historical and temporal link data for ranking

5 ways search engines use historical and temporal data to determine SERP ranking

While we know that links are the basic bedrock of ranking a site, counted as votes for the site, the concept was actually based on citation: an established thesis was often cited during research on another topic, which meant the original thesis earned a vote for further research. But how do search crawlers actually use these links as information?

Index inclusion: Search engines need to decide what kind of information and pages to include in their index. They do this by discovering web pages as they crawl links; if a site has more links, the crawler follows them to more pages, and the process continues as the crawler jumps from one link to another.

In short, links are used to index most of the world wide web; through them, search spiders collect relevant information and store it in the index. The second way search engines discover web pages is through XML sitemaps.

One caveat: search engines do not include links to web pages they consider low value, because cluttering the index with low-quality pages degrades search results for users.

Crawl rate and frequency: The search spider crawls a portion of the world wide web every day. How do search engines decide which sites to visit, where to begin, and where to end?
Google has publicly stated that PageRank is an indicator of the order in which it crawls. According to Google, it starts its crawl in reverse PageRank order: it visits the PR 10 sites first, followed by PR 9, and so on. A page with higher PageRank also gets crawled faster and deeper.

Ranking: Everything else being equal, the site with the highest number of backlinks will be ranked first.

Source independence: A link from your own site to another site you run is not an editorial vote for your site. A link from an independent third-party site is seen as an actual link, a vote for your site.

Among the temporal factors used for ranking are:

1) When did the link first appear?
2) When did the link disappear or become broken?
3) How long has the link been there? A link that has persisted longer is a stronger ranking signal.
4) How quickly were links added? Link growth should be organic and gradual.
5) The context of the page and the links pointing to it.
6) Page placement: a link within the body content is considered more powerful, and has more impact on ranking, than a link in a list at the bottom of the page.

February 21, 2016

5 tips on how to make your content management system seo friendly

5 ways to ensure SEO benefits when choosing your content management system

While looking to publish a website, many webmasters wonder whether the choice of CMS plays a role in SEO, and how to fine-tune the CMS to make it SEO friendly.
The truth is that the CMS does play a huge role in SEO. The top 3 CMSs happen to be Joomla, Drupal, and WordPress, of which WordPress has the largest marketshare.
Let's take a look at the basic things you need to keep in mind while deciding on a CMS, and how to ensure your CMS functionality supports your search visibility.

TITLE TAG CUSTOMIZATION: A search engine friendly CMS must let you customize title tags per URL, not only at the page level but also via rules for particular sets of pages. Sites running on Blogger and WordPress often use the date as the URL, e.g. xyz.wordpress/post/21-02-2016. This is SEO unfriendly and should be avoided; replace the date in the URL with the post title. The biggest issue many CMSs face is the inability to customize title tags independently of the URL or the theme of the post.
For example, if you have a site on cameras at www.a1cameras4 and your CMS only allows titles that always start with your domain name, followed by a colon, followed by the article you post, you are on the brink of SEO disaster. If the title "A1 cameras for you:" repeats for every URL and post, you are treading dangerously. You should be able to customize each URL with its own title and meta tags.
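Replacing date-based URLs with post-title slugs, as recommended above, is a small transformation. A hedged Python sketch (the domain and post title are made-up examples):

```python
# Sketch: turn a post title into an SEO-friendly URL slug, replacing the
# date-based URLs discussed above. The domain is a hypothetical example.
import re

def slugify(title):
    """Lowercase the title, keep alphanumeric runs, join them with hyphens."""
    words = re.findall(r"[a-z0-9]+", title.lower())
    return "-".join(words)

title = "Top 10 Cameras for 2016"
print(f"https://www.a1cameras4.example/{slugify(title)}")
```

A CMS with good URL controls does this for you; if yours cannot, that is a warning sign.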

PAGINATION CONTROLS: Pagination can be the bane of website search rankings, so controlling it by including more items per page and using more contextually relevant anchor text is recommended. Instead of "next" or "previous page" at the bottom, use titles like "more eCommerce news" or "latest trends in online marketing".

301 FUNCTIONALITY: Many CMSs lack this critical feature, which plays a crucial role in redirecting content when necessary. A 301 permanent redirect tells search crawlers to treat the non-www version and the www version as the same URL, passing the benefits and link juice to a single URL. A 301 redirect is also used when you move to a new domain or a newer version of a page and wish to pass on the search benefits, preserving the value the older version accumulated. This also helps you dodge keyword cannibalization.

IMAGE HANDLING: Alt attributes are a must-have feature; the alt text serves as anchor text when you use an image link. (Remember that in terms of search preference, text links are more advisable than image links.) If you do use image links, ensure the CMS supports alt attributes, which help search engines understand the relevance of your image content. Images in CMS navigational elements should preferably use CSS image replacement rather than relying on alt attributes alone.

STATIC CACHING OPTIONS: Static caching is a must for the CMS you are considering. Many CMSs offer caching options, which make perfect sense if a page consistently receives high traffic from social or news portals. A bulky CMS often makes extraneous database connections, which can increase load and overwhelm the server if caching is not in place. An overloaded, slow site can cost you potential inbound links.

MULTILEVEL CATEGORIZATION STRUCTURE: If your CMS does not allow you to nest subcategories within categories, and further subcategories within those, rethink your CMS options. This limited functionality prevents you from building a rich site structure and internal hierarchical linking.

META NOINDEX FOR LOW VALUE PAGES: Even if you use rel="nofollow" on internal links, other sites might still link to those pages, and low-value pages might rank ahead of the pages you intend to optimize. Check whether your CMS lets you apply a noindex meta tag to low-value pages such as "about us", "contact us", or FAQs.
This is a better way to handle low-value pages that you do not want to show up in the SERPs.
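One way a CMS can expose this is a simple rule mapping low-value paths to a noindex meta robots tag. A Python sketch (the path list is a hypothetical example):

```python
# Sketch: emit a noindex meta robots tag for low-value pages. The set of
# paths below is a made-up example of pages to keep out of the SERPs.

LOW_VALUE_PATHS = {"/about-us", "/contact-us", "/faq"}

def robots_meta(path):
    """Return the meta robots tag to render in <head> for a given page path."""
    if path in LOW_VALUE_PATHS:
        return '<meta name="robots" content="noindex, follow">'
    return '<meta name="robots" content="index, follow">'

print(robots_meta("/faq"))
print(robots_meta("/products/cameras"))
```

Note the choice of "noindex, follow": the page stays out of the index, but crawlers still follow its links, so internal link value keeps flowing.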

February 18, 2016

5 best practices on creating a spiderable link structure for search crawlers

Links are the bedrock of the world wide web. Search engines rely on links to rank websites, and search algorithms depend heavily on the link graph created by human editors.
The quality of a site, and ultimately its chances of appearing in the SERPs, is determined to a large extent by the search spiders that crawl these sites, picking up signals about who links to them. Each link is treated as a citation and a positive signal for the site being linked to.

This means your website needs to be search friendly enough for crawlers to spider it. However, many site owners obfuscate their site navigation, and in turn their link structure, to such an extent that crawlers cannot find the links, limiting spider accessibility and hurting SERP rankings.

Described below are some best practices for creating a crawlable link structure. Each of these factors affects a crawler's ability to spider your site.

1) Links in submission-required forms: Search spiders cannot submit forms, so content or links accessible only via a form are invisible to search engines.

2) Links in hard-to-parse JavaScript: If you use JavaScript for links, you may find that search engines either do not crawl them or give very little weight to the embedded links. (In June 2014, Google announced enhanced crawling of JavaScript and CSS. To review how your site may render, go to Google Search Console > Crawl > Fetch as Google; you need to log in to Google Webmaster Tools.)

3) Links in Flash, Java, and other plugins: Links embedded inside Flash, Java, and other plugins are invisible to search engines. In theory the engines are making progress in detecting links within Flash, but don't rely too heavily on this.

4) Links in PowerPoint and PDF files: These are no different from Flash, Java, and other plugins. Search engines sometimes report links seen in PDFs and PowerPoints, but it is not yet clear how much they count.

5) Avoid linking to pages blocked by meta robots, robots.txt, or rel="nofollow": If your link points to a blocked page, it is almost a dead link. These factors prevent crawlers from passing PageRank to the linked pages and stop the link from serving as a citation for other websites.

6) Links on pages with hundreds of links: Google's guidelines on linking (according to Matt Cutts) state that its crawler may stop spidering a page with more than 100 links, though this is just an indicative number. Limiting a page to 100-200 links will ensure that its crawlability is not affected.

February 14, 2016

5 ways to check for duplicate content on your website

DUPLICATE CONTENT: Duplicate content is one of the major reasons sites that have otherwise done a good job with off-page and on-page optimization still falter.


1) Among the first things to do to ensure your site does not have duplicate content is to make sure that the non-www versions of your pages 301 redirect to the www versions, or vice versa.
This is often known as a canonical redirect. Also ensure you don't have https: pages that duplicate your http: pages.

2) The easiest check is to take a unique string from each of the major content pages on the site and search for it in Google, enclosed in double quotes, for example: "keywords that have the most CTR on Google". This will tell you whether your site has duplicate content pages.

3) You can also use operators such as allinurl: and intitle:. For example, if your URLs have distinctive components ("1968 mustang blue" or "10974567"), you can search for them with the inurl: operator and see whether they return one page or more than one.

4) Another duplicate content task is to verify that each piece of content is available at only one URL. This trips up commercial sites more than others. If the same content is reachable at different URLs or in multiple ways, it forces search engines and users to choose which is the canonical version, which to link to, and which to disregard.
Since each search spider has a limited crawl budget and limited time, duplicate content slows the spider down, as it cannot tell which version is the original.

5) Duplicate content checker tools: You can also use free or paid tools available in the market to check for duplicate content, but don't rely on them too much; they are merely indicative, not exhaustive. See the post on the top 10 duplicate content checkers.
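The quoted-string search in tip 2 can be approximated locally by fingerprinting each page's normalized text; identical fingerprints flag find-and-replace duplicates. A rough Python sketch over invented page snippets:

```python
# Sketch: flag pages whose normalized body text is identical, a crude
# duplicate-content check. The page URLs and snippets are made up.
import hashlib
import re
from collections import defaultdict

def fingerprint(text):
    """Hash the text with case, punctuation, and whitespace normalized away."""
    normalized = " ".join(re.findall(r"\w+", text.lower()))
    return hashlib.sha256(normalized.encode()).hexdigest()

pages = {
    "/dallas": "Great car repair.   Serving Dallas since 1999.",
    "/austin": "great car repair serving dallas since 1999",   # find-and-replace leftover
    "/plano":  "Brake specialists serving Plano since 2004.",
}

groups = defaultdict(list)
for url, body in pages.items():
    groups[fingerprint(body)].append(url)

for urls in groups.values():
    if len(urls) > 1:
        print("duplicates:", urls)
```

Exact-hash matching only catches identical text; the commercial checkers mentioned above additionally catch near-duplicates, which is why a manual review still matters.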

10 advanced Bing search operators


Along with Google, Bing also supports a host of quick search operators and hacks to derive the most relevant results, surfacing SERP results that are difficult to find with conventional queries.
Here are the top 10 Bing operators to make your search easier and find sites you probably cannot find even in vertical or visual search engines.

Operator: linkfromdomain: finds all pages the given domain links to
SEO application: find the most relevant and popular sites your competitor links to

Operator: contains:
SEO application: find pages linking to a specific document type (pdf, xls)
Example: contains:pdf seo

Operator: ip: IP-restricted search; shows sites sharing a common IP
SEO application: helps you find whether your inbound links come from sites sharing the same IP address
Example: ip:

Operator: inbody:
SEO application: find pages with the most relevant optimized body text
Example: inbody:online marketing

Operator: location:/loc: location-specific search
SEO application: helpful for local search
Example: seo loc:Austin

Operator: feed:
SEO application: narrows search results to feeds containing the keyword
Example: feed:seo

Operator: hasfeed:
SEO application: narrows search results to pages linking to feeds that contain the keyword
Example: hasfeed:seo

March 22, 2015

Chinese search giant Sogou plans a $3 billion IPO this year

Sogou Inc., the Chinese search engine controlled by, Inc., is planning a U.S. initial public offering at a valuation of more than $3 billion. Sogou, founded on 9 August 2010, is a Sohu subsidiary and the owner and developer of the Sogou search engine.

The Beijing-based company, whose name means "Search Dog" in Chinese, could have an IPO as early as the second half of this year.

Sogou has been on the radar of many investors and companies for a while; in 2013 Tencent invested over US$448 million to acquire a 36.5% stake in Sogou.

Spinning off China’s third-largest search site would provide a boost to majority owner Sohu, the Internet portal operator whose shares have lost more than half their value since an April 2011 peak. It would also provide funds for Sogou to gain ground against industry leader Baidu Inc. with the country’s 649 million Internet users.

Currently the search marketshare in China is squarely led by Baidu, with 74%.


November 18, 2011

Google vs Bing vs Yahoo: Marketshare Comparison by Industry

How big do you think Google is compared to competitors Bing and Yahoo? The above infographic by Search Engine Land shows the state of the big 3 search giants by comparing them across 3 parameters:
(1) Historical marketshare
(2) Present marketshare
(3) Industry verticals (finance, healthcare, B2B, and retail)

According to Search Engine Journal, Bing has been climbing the charts mostly at Yahoo's expense. It is possible for a search engine to see its share of searches drop while its actual search volume rises. Also, month-to-month changes often mean little; you want to see a trend emerge over time.

That’s why the big “Bing may just come out on top” quote on the chart — from this Mashable article last April — should be ignored. Mashable assumed continued Bing growth and Google losses for one particular month, projecting that Bing would overtake Google in January 2012. That’s nowhere near happening.