

May 3, 2016

Losing out on your quality score: the best landing page infographic



" 5 tips for a perfect landing page"


5 ways to ensure SEO benefits while deciding on your landing page


For any SEO optimisation, the landing page is key. In organic search the landing page per se is not as crucial as it is in paid search, but irrespective of the nature of the search, a well-designed landing page helps spiders look at your content in a structured and granular way. This helps you increase your quality score in paid search and helps your ranking in organic SEO. The infographic above illustrates how a landing page ought to be designed and the key elements to be incorporated.





"how to ensure SEO benefits while deciding your paid search landing page"

The 12 jewels to be incorporated in your landing page 





April 12, 2016

how to use blended search to rank well across local search



BLENDED SEARCH RESULTS: 5 ways they impact your organic search results


The introduction of Google's "Snack Pack" results was perhaps the biggest local search shakeup since Google Pigeon rolled out back in July 2014, which weighed about 7 parameters for local search rank. Of late, however, Google has trimmed the 7-result snack pack down to a 3-result pack, and these results are considered the most crucial ranking element for local search.


Blended search results means that when Google crawls and ranks your site, it tries to use blended factors to find out how effective you are in terms of vertical search. Blended results such as local, shopping, images, videos and news are taken into account and have begun to play a crucial role in addition to the standard ranking factors.
Here is a lowdown on blended search results that might help you rank better, especially in local search, and get your site indexed much faster.



HOW BLENDED SEARCH HELPS TO RANK ACROSS LOCAL SEARCH


  1. The business has a listing in the local indexes, including Google Places, Yahoo Local and Bing Local.
  2. The actual business location has the same address as the one mentioned on the website, presented in a search-engine-query-friendly way.
  3. The business has a local phone number for the searched location.
  4. The website contains the local address and phone number relevant to the query.
  5. The website has positive, superior ratings from users.
  6. Social signals: how your business is shared across social media, and how many clicks your local business listings generate across Facebook, Twitter, Klout and LinkedIn, among others.

"5 ways how you can rank better in Local search"


  1. Videos that describe the local business, or its products and services, and are properly tagged.
  2. The local listing (Google Local) has to be claimed by the local owner and verified as their business.










  1. Foursquare check-ins and mobile click-through rates.
  2. Of late, behavioral and/or mobile signals make up about 9.5% of the algorithm for localized organic results.
  3. High numerical ratings on Google Local (ratings by Google users).
  4. The listing title and description should contain at least 3 keywords in order to surface in blended search SERPs.
  5. The business should be listed with third-party data providers and the Yellow Pages.
  6. Geo-tagged images on photo-uploading sites like Flickr, Dropbox and Panoramio, along with captions that explain the geotagged image and the local business.





March 24, 2016

how metasearch engines differ from general search engines



how metasearch engines work: how they differ from traditional search engines

Metasearch engines are search aggregators: they extract search results from different search engines and present the user with the top-ranked sites. Metasearch engines do not maintain databases of their own; rather, they extract and combine results from several other search engines and display them in a standard SERP format.

One of the biggest advantages of metasearch engines is that they combine the best search results from many search engines and display the best results for the user's query.
The more results that can be seen in one place, the better it is for users. Two of the biggest metasearch engines are Dogpile and DuckDuckGo.



While Dogpile pulls results from Yahoo and Google into one place, DuckDuckGo gives you an overview of search results without creating a personalised feed based on your searches. This creates more of a filtering approach to all the indexed data gathered from the other search engine sources. Ads appear at the top and bottom, with organic results, called "web results", appearing in the middle. DuckDuckGo's biggest proposition is that it is the only well-known search engine that does not track user behavior.



After pulling the necessary data from the known search engines, the metasearch engine retains the top rankings from each separate engine and presents the user with the top-ranked sites. This is different from building an index and running a ranking algorithm of its own to find the best sites for a keyword.
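
To make that aggregation step concrete, here is a minimal Python sketch; the engine names and result lists are invented for illustration and no live queries are made. It merges already-fetched result lists, keeps each URL's best rank across engines, and prints a blended top list.

  # Minimal metasearch-style aggregation sketch (hypothetical data, no live queries).
  # Each engine contributes an ordered result list; we keep each URL's best rank
  # and de-duplicate, roughly what a metasearch SERP does with upstream results.
  from collections import defaultdict

  engine_results = {
      "engine_a": ["http://site1.example", "http://site2.example", "http://site3.example"],
      "engine_b": ["http://site2.example", "http://site4.example", "http://site1.example"],
  }

  best_rank = defaultdict(lambda: float("inf"))
  for engine, urls in engine_results.items():
      for rank, url in enumerate(urls, start=1):
          best_rank[url] = min(best_rank[url], rank)

  for position, url in enumerate(sorted(best_rank, key=best_rank.get), start=1):
      print(position, url)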


February 28, 2016

5 kinds of linking strategies that can affect your SEO rankings

Among the most crucial factors that can affect your website traffic is your linking strategy. At times we go overboard to get links and focus on quantity rather than quality. This hurts your traffic and can change your SERP position. So how do you judge the quality of your links? Here are five things to look at:
1) Is the industry you are targeting complementary, or is it a competitor?
2) How many links does the website have? Are your links merely buried in their links pages? If a page has more than 100 links, you can be assured it will not pass on much link value.



3) Relevance and authority: links related to the same topic are given more weight than random links to unrelated pages. Think of the relevance of each link as being evaluated in the context of a specific user search query. For example, for the search term "new cars in Arizona", if the publisher has received a link from the Arizona Chamber of Commerce, the search engine infers that the link is relevant and trustworthy because the linking site is about Arizona. Once you decide on the industry you are targeting, your focus on link building should be razor sharp on getting links from within that industry.



4) Nofollow links: Google has recently said that the "nofollow" attribute is not as relevant and important as it was a couple of years ago. Even so, when you use a nofollow meta tag on a page, the search engine will still crawl the page and place it in its index; all links (both external and internal) on the page are simply prevented from passing link juice to other pages.
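
As a quick way to see which links on a page carry the nofollow attribute, here is a small sketch using only Python's standard library; the sample markup is invented for illustration.

  # Sketch: list anchors on a page and flag the ones marked rel="nofollow".
  # The sample HTML below is hypothetical.
  from html.parser import HTMLParser

  class NofollowScanner(HTMLParser):
      def handle_starttag(self, tag, attrs):
          if tag != "a":
              return
          attrs = dict(attrs)
          rel = (attrs.get("rel") or "").lower()
          flag = "nofollow" if "nofollow" in rel else "followed"
          print(flag, attrs.get("href", ""))

  sample = '<a href="/about">About</a> <a rel="nofollow" href="http://example.com">Ad</a>'
  NofollowScanner().feed(sample)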

5) Anchor text: avoid links where the anchor text says "our links", "click here to know more", "read more" or "check out the full post here". Be very specific about the anchor text with which your site is linked. The best policy is to use the title of your website or webpage as the anchor text. The impact of anchor text is more powerful than you think: if you link to a page with minimal search-friendly content (a Flash site, for example), the search engine will look for other signals to work out what the page is about, and in that case inbound anchor text becomes the primary driver in determining the relevance of the page.


February 21, 2016

why should you stop commenting on blogs

Many site owners use spam tactics, creating bots that crawl the web looking for open forums and blogs where it is easy to add a comment and get a backlink to their website. These bots leave behind automated comments bordering on spam, which most of the time are not related to the content of the site in question.
The great majority of these links are neutralised by the rel=nofollow tag or deleted by the blog software's content management system; however, the spammers do not care, as they operate at huge scale.



Reason number 2 why you should avoid commenting across blogs and forums: most of the time the comments have no relevance to the topic. If you do have to comment, make sure you stick to the context and stay relevant. For example, sites like Quora and SEO forums like webmastertools have user-generated content that is editorially very sound. While blog and forum owners might use the rel=nofollow tag, as we saw in the previous post, search engines can still judge the quality of the post and the site being linked to; the nofollow tag only stops the passing of link juice to the linked page, not its indexing.

Link farms and the number of links: due to the nature of blogs and forums, just about anyone can leave a comment, irrespective of relevance or topicality. Over time, forums and blogs acquire a huge number of links, and search engines end up viewing those links as irrelevant or ignoring them altogether. Worse, a webmaster who owns many related sites might ask for a link back in exchange for links from 10 other sites he owns. This way all the links are interlinked, and your site might become part of a "link farm", which Google penalises very heavily.

February 16, 2016

3 steps to get your site back after a google penalty

While getting penalised by Google is bad enough, getting your site back into Google after you have been flagged and penalized can be a daunting experience.
But you don't have many reasons to worry if you follow the steps outlined below.

What you need
1) Access to Google Webmaster Tools
2) Access to a site that checks domain authority in bulk ("Bulkdachecker")
3) Google Drive
4) An Excel sheet to map your progress

Penalties are of 2 types:
1) Manual penalty: due to unnatural linking
2) Algorithmic penalty: due to a conflict between your site's SEO and Google's ranking algorithm

Today we will cover only the manual penalty.

MANUAL PENALTY: A manual penalty refers to a penalty handed out by a real human after the site in question has been flagged by Google's spiders. This does not happen on the whims and fancies of the spiders; it is based on both spider flags and human review.

HOW THE PENALTY IS INFLICTED: Any website online that can be found by a search engine is constantly being evaluated on who links to it and whom it recommends. I know "content is king" has been a favourite of webmasters for the last 15 years, and it continues to reverberate across search engine forums and search gurus alike. But the fact of the matter is that your content alone does not dictate how well your site ranks on Google. In fact, it can't; Google has made sure of that because of all the gaming and content manipulation that has happened in the past.

Google looks at your website almost like a credit report, checking who links to whom, how important each link is and who is referring you. To cut a long story short: if you've come to a party, Google just wants to know who brought you to the party.

A manual penalty is also called an unnatural link penalty, as it is caused purely by unnatural and artificial links that project your site in a certain way: links that are irrelevant, links meant to manipulate and artificially pass link juice, links that are broken, and bought links intended to manipulate PageRank.

A domain authority checker helps you identify which of your links come from low-authority domains so you can remove them.

This is the kind of email you will receive if you are guilty of unnatural linking:





HOW DO YOU KNOW IF IT'S A PENALTY
1) If your site has been losing a lot of backlinks fast compared to earlier, it's time to check your link building.

2) If your site has indeed been flagged by Google and this has been validated by the search team in California, you will get a mail from Google asking you to take a fresh look at your unnatural or artificial linking, and asking you to resubmit a request to index your site once you have removed the links in question.

Step 1) Solving a manual penalty: go to Webmaster Tools, click on Search Traffic and then on who links to your site. Download the entire list as a CSV.

Step 2) Now go to a site called Bulkdachecker. This tool will identify which of the links to your site have the lowest domain authority. To do this, first export all of your site's links as an Excel file or CSV.

Step 3) Now upload your links one by one, or upload the entire list via Excel or CSV. Once you have entered the links and clicked on check, it shows you all the links with their domain authority. Go through each link and remove all the links that have a DA (domain authority) of 25 or below.
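
If you would rather do the filtering yourself, here is a minimal sketch that reads the exported backlinks CSV and writes out the links at or below a DA threshold of 25. The file name and the column headers ("link", "domain_authority") are assumptions; adjust them to match whatever your export actually contains.

  # Sketch: split an exported backlinks CSV by domain authority.
  # Assumed file name and columns ("link", "domain_authority") -- rename to match your export.
  import csv

  DA_THRESHOLD = 25
  to_remove = []

  with open("backlinks.csv", newline="") as f:
      for row in csv.DictReader(f):
          if int(row["domain_authority"]) <= DA_THRESHOLD:
              to_remove.append(row["link"])

  with open("links_to_remove.csv", "w", newline="") as f:
      writer = csv.writer(f)
      writer.writerow(["link"])
      writer.writerows([link] for link in to_remove)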




February 15, 2016

worldwide indexed pages by google stand at 50 billion vs Bing's 5 billion


"number of pages indexed by google vs Bing"

Bing vs Google’s Indexed Pages

The size of the World Wide Web: Google index vs Bing index. The chart shows the number of indexed pages on the World Wide Web over the last 5 years, 3 years and 6 months. The chart above shows how many pages have been indexed by Google over the last 2 years: so far Google has indexed over 50 billion webpages between 2014 and 15 February 2016.

The saffron colour shows the number of pages indexed by Bing since 2014, which stands at 5 billion as of 16 February 2016. The Dutch indexed web contains at least 241.41 million pages (as of Monday, 15 February 2016).




Number of Pages Indexed by google in the last 3 months


Tool to check worldwide indexed pages by search engines in real time

5 linking tricks that search engines hate

The 5 linking tricks that search engines hate: before you create your SEO strategy, you need to know which linking guidelines Google prefers and which linking practices Google considers very low quality. We take a look at the best practices for creating organic links, how to identify quality links, and how to tell whether you have unknowingly linked to a spam website.
Scroll down to find out the best linking practices and which linking tricks search engines hate.

WHICH SITES ARE LINK SPAM OR LINK FARMS: One of the most common and biggest mistakes a site owner commits is linking to spam sites or link farms. You might do this unknowingly, but the repercussions are huge, and you might have to kiss your SEO goodbye for the next six months to a year.
Many webmasters falsely think that any link is good for their search rankings. However, this thinking is flawed and completely wrong, as the linking guidelines below explain.

1) What are link farms: link farms happen when you link out to sites whose owners have a dozen other sites and interlink them. Suppose a webmaster asks you to link back to his sites while he does the same; you now fall into a vicious cycle of interlinked sites that just link to each other.
This is a problem because that webmaster might have 10 sites originating from the same IP, which is easily picked up by Google. Any search engine, including Google, can easily work out whether these sites offer something of value to readers or have been created just to pass on link juice in order to rank better. By associating with such a site, you run the risk of being penalised or downgraded.
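
One quick sanity check on a prospective link partner's network of sites is to resolve their domains and see how many share an IP address. The sketch below uses only Python's standard library; the domain list is a placeholder.

  # Sketch: group domains by resolved IP to spot interlinked sites on one host.
  # Replace the placeholder domain list with the sites offering you links.
  import socket
  from collections import defaultdict

  domains = ["example.com", "example.net", "example.org"]

  by_ip = defaultdict(list)
  for domain in domains:
      try:
          by_ip[socket.gethostbyname(domain)].append(domain)
      except socket.gaierror:
          print("Could not resolve", domain)

  for ip, hosted in by_ip.items():
      if len(hosted) > 1:
          print(ip, "hosts", hosted)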

2) Not having relevant links: every site is judged on 1) relevance, 2) topicality and 3) who links to you. A site on pet food having links to a movie review does not make sense. Concentrate on your niche topic and create relevant content; the relevant backlinks will follow.

3) How to research high-quality and relevant links: the best way to get good links is to find your competitors' links and solicit links from those sites. Finding them is easy: use the operator in Google search, link:competitorlink.com (replacing competitorlink.com with your competitor's domain).
The reason this gives you non-spam links is that Google has already penalised the spam sites and will not show them, as it has stopped indexing them.

4) Checking the quality of your backlinks and finding out whether your site has been marked as spam: there are many tools and sites that will help you assess the quality of your backlinks and check whether your site has been linking to spam.

Backlink sites/tools

1) Yahoo Site Explorer (one of the best)
2) Monitorbacklinks
3) Majestic SEO tool
4) Ahrefs
5) Smallseotool
6) Checkyourlinkpopularity

Tools to check spam sites: there are many sites that tell you whether your site is linking to spam pages, or report on the quality of your pages and links. (However, do not take them too seriously; they are prone to error, as they can only make an estimated guess.)

1) Hmtweb
2) Trueurl
3) Spamometer

February 14, 2016

5 ways to check duplicate content in your website

DUPLICATE CONTENT: duplicate content is one of the major reasons why sites that have otherwise done a good job with off-page and on-page optimization often falter.

HOW TO CHECK FOR DUPLICATE CONTENT 


1) Among the first things you must do to ensure your site does not have duplicate content is to make sure that the non-www versions of your pages 301-redirect to the www versions, or vice versa.
This is often known as a canonical redirect. Also ensure that you don't have https: pages that are duplicates of your http: pages.
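
A quick way to verify the canonical redirect is to request the non-www host and inspect the status code and Location header. The sketch below uses only Python's standard library, a hypothetical domain, and covers only the plain-HTTP case.

  # Sketch: check that the non-www host 301-redirects to the www version.
  # The domain is hypothetical; only plain HTTP is checked here.
  import http.client

  def redirect_target(host, path="/"):
      conn = http.client.HTTPConnection(host, timeout=10)
      conn.request("HEAD", path)
      resp = conn.getresponse()
      status, location = resp.status, resp.getheader("Location", "")
      conn.close()
      return status, location

  status, location = redirect_target("example.com")
  if status == 301 and location.startswith("http://www.example.com"):
      print("Canonical redirect in place ->", location)
  else:
      print("Unexpected response:", status, location)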


2) The easiest way to check is to take a unique string from each of your major content pages and search for it in Google. Make sure you enclose the string in double quotes, for example "keywords that have the most CTR on Google". This will tell you whether your site has duplicate content pages.

3) You can also use operators such as allinurl and intitle. For example, if you have URLs for pages that have distinct components to them, such as "1968 mustang blue" or "10974567", you can search for these with the inurl operator and see whether they return one page or more than one.


4) Another duplicate content task to perform is to check whether each piece of content is available at only one URL. This probably trips up more commercial sites than others. If the same content is reachable at different URLs or in multiple ways, it forces the search engines and users to choose which is the canonical version, which to link to and which to disregard.
As each search spider comes with a predefined crawl budget and limited time, duplicate content slows the spider down, as it is not sure which version is the original, genuine content.
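
To complement the manual checks above, here is a rough sketch that fetches a handful of URLs, crudely strips markup, and hashes the remaining text so identical copies group together. The URLs are hypothetical stand-ins for one article reachable at several addresses.

  # Sketch: flag URLs that serve essentially identical text.
  # The URL list is hypothetical; the tag stripping is deliberately crude.
  import hashlib
  import re
  from collections import defaultdict
  from urllib.request import urlopen

  def normalized_hash(html):
      text = re.sub(r"<[^>]+>", " ", html)               # drop tags
      text = re.sub(r"\s+", " ", text).strip().lower()   # collapse whitespace
      return hashlib.sha256(text.encode("utf-8")).hexdigest()

  urls = [
      "http://www.example.com/article",
      "http://example.com/article",
      "http://www.example.com/article?ref=home",
  ]

  groups = defaultdict(list)
  for url in urls:
      html = urlopen(url, timeout=10).read().decode("utf-8", errors="ignore")
      groups[normalized_hash(html)].append(url)

  for digest, dupes in groups.items():
      if len(dupes) > 1:
          print("Same content served at:", dupes)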

5) Tools for duplicate content checking: you can also use some of the free or paid tools available in the market to check for duplicate content. However, don't rely on them too much; they are merely indicative, not exhaustive. See the post on the top 10 duplicate content checkers.

February 8, 2016

when to use a subdomain vs a domain




When to use a subdomain instead of a domain 


  • The debate between a domain and a subdomain has been around for years. What most people do not understand is that creating a subdomain has little significance unless you are creating a totally different product line, or your company has launched a new product with totally new content and would like to use a catchy subdomain.
  • One good reason for using a subdomain is that it can look more authoritative to users. However, take into account that it is much less expensive to use a subfolder and have slightly less panache than it is to educate users through branding and advertising.
  • A subfolder will work 99% of the time. Keeping content on a single root domain and a single subdomain gives the maximum SEO benefit, as the engine passes on all the positive metrics, including the backlinks and the PageRank earned by the site.
  • Subdomains are definitely not worth the time, and not a popular choice, if SEO is a prime concern. A subdomain may inherit the ranking benefits of the root domain it is hosted underneath, but it does not always do so. Subdomains may be used when keyword usage in the domain name is of critical importance: for example, if you own usedtoyotatrucks.com, you can pull in quality search traffic for the specific term "used toyota trucks" with a microsite.

How to display different content to search engines and visitors





A variety of strategies are used to segment content delivery. The idea is to serve content that is not meant for search engines in an un-spiderable format (placing text in images, Flash files or plugins).

However, don't use these formats for the purpose of cloaking; rather, use them only if they bring substantial benefit to users. If you want to show the search engines content you don't want visitors to see, you can use CSS formatting (preferably not display:none), as the engines may have filters to detect this.
Keep in mind that search engines are very wary of webmasters using such tactics. Use cloaking only if it brings substantial user benefits.

 Tactics to show  different content for search engines and users

Robots.txt file: this file is located at the root level of your domain (www.domain.com/robots.txt) and can be used to 1) prevent crawlers from accessing non-public parts of your website, 2) block search engines from accessing indexing scripts, utilities or other types of code, and 3) enable auto-discovery of XML sitemaps.

The robots.txt file must reside in the root directory, and the file name must be in lower case. Any other format is not valid for search engines.

Syntax of the robots.txt file: the basic syntax of robots.txt is fairly simple. You specify a robot name, such as Googlebot, and then specify an action. Some of the major actions you can specify are:

Disallow: use this to tell the named robot not to crawl certain parts of your website.
Noindex: use this to tell the bots not to include a page in the SERPs (this might be used when you wish to hide duplicate content pages on your site).

Here's an example of a robots.txt file:

  User-agent: Googlebot
  Disallow:

  User-agent: msnbot
  # Block msnbot from the tmp and logs directories
  Disallow: /tmp/
  Disallow: /logs/

The hash symbol (#) may be used for comments within a robots.txt file; everything after the # on that line will be ignored.

One additional problem webmasters run into is when they have SSL installed, so that pages may be served via both HTTP and HTTPS.
The search engine will not interpret the robots.txt file at http://www.domain.com/robots.txt as guiding its crawl behaviour on https://www.domain.com.
For this you need to create an additional robots.txt file for the HTTPS site. So if you want to allow crawling of all pages served from your HTTPS server as well, you would need to implement the following:

For HTTP (http://www.domain.com/robots.txt):

  User-agent: *
  Disallow:

For HTTPS (https://www.domain.com/robots.txt):

  User-agent: *
  Disallow:
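
Once the robots.txt files are in place, you can sanity-check them with Python's built-in robots.txt parser; the domain and paths below are hypothetical.

  # Sketch: verify what a given crawler is allowed to fetch, per your robots.txt.
  # The domain and URLs are hypothetical.
  from urllib import robotparser

  rp = robotparser.RobotFileParser()
  rp.set_url("http://www.example.com/robots.txt")
  rp.read()

  print(rp.can_fetch("Googlebot", "http://www.example.com/index.html"))
  print(rp.can_fetch("msnbot", "http://www.example.com/tmp/report.html"))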

February 7, 2016

10 tools for measuring the social media authority of your brand




10 tools for measuring your brand's social sentiment metrics


1) Postrank: great for tracking RSS feed content performance in the social media world.

2) Bitly.com: one of the best tools for tracking URLs and click-through content, along with the number of clicks by country.

3) Radian6: one of the best-known paid social monitoring tools, providing exhaustive research on social signals. Radian6 is mostly geared towards the enterprise market: sentiment analysis, competitor tracking and the number of click-throughs, with a breakdown by social media site.

4) Klout: a site mostly geared towards measuring the metrics of your social media expertise. It can be used for measuring your personal social standing as well as your brand's social media metrics. The score is extracted from a list of sites including Facebook, LinkedIn, Tumblr, Twitter and many other social properties to which you grant access.

5) Backtype: a great tool for measuring social media metrics, which was acquired by Twitter in 2011.

6) Socialmention: enables Google-like alerts, along with sentiment analysis from blogs, websites, forums and social media sites, and offers several areas in which you can measure your social media presence.

7) Converseon: an impressive social media tool, mostly geared towards the enterprise market like Radian6. One of the highlights by which this tool stands out is its human-reviewed sentiment and social media analysis and classification.

8) Pagelever: specially focused on tracking Facebook interactions, including paid Facebook advertising and brand pages. Pagelever provides more depth and data than the native Insights functionality.

9) Twittercounter: tracks Twitter, including the number of retweets, mentions and click-throughs.

10) Socialbakers: a relatively new entrant. It is a paid tool that tracks the social authority of your brand across many social sites, along with cool analytics on follower overlap and OTS (opportunities to see).

11) Simplymeasured: offers social reputation and social sentiment reports via Excel and PDF, including data streams from Twitter, Facebook and LinkedIn.