



April 4, 2016

planning navigation for non-human visitors and search bots



SITE MAPS FOR SEARCH ENGINE SPIDERS: 5 ways to ensure bots visit your website


The most crucial fact to understand is that a search engine does not see your site the same way a human visitor does. By creating robot-friendly navigation you encourage search bots to visit your site more often, which in turn increases the visibility of your site. Frequent visits by search bots ensure your content gets indexed faster. (When Google's spiders visit your site, the cache shows when they last visited; if Google's cache shows a very recent visit, your content is being picked up faster, and this improves your visibility in the SERPs.) To perform better in search engine listings, your most important content should be in HTML text format.
"an ideal link structure for robots"

Things not to do: avoid putting critical content in images, Flash files, Java applets and other non-text formats, and put more emphasis on HTML pages. This is because search bots often ignore or devalue such content.
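As an illustration, here is a minimal sketch of plain HTML text navigation that any spider can follow (the URLs and labels are hypothetical):

  <ul>
    <li><a href="/seo-basics/">SEO basics</a></li>
    <li><a href="/link-building/">Link building</a></li>
    <li><a href="/mobile-seo/">Mobile SEO</a></li>
  </ul>
  <!-- Plain text links like these are readable by every search bot,
       unlike navigation rendered inside Flash or Java applets. -->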










How do you achieve this?
1) By providing search engine robots with links to navigate through your website.
2) By pointing search engine robots to dynamic or hard-to-read pages that might not be accessible otherwise.
3) By providing a possible link to a landing page.
4) By ensuring your 404 error page has a link pointing to another page, preferably the home page (see the sketch below).
5) By using ready-made content when a page is not available or a URL is broken.
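For point 4, a minimal sketch of a custom 404 page that gives both users and bots somewhere to go (the page names are hypothetical):

  <!DOCTYPE html>
  <html>
  <head><title>Page not found</title></head>
  <body>
    <h1>Sorry, this page does not exist</h1>
    <!-- Always point the visitor (and the crawler) somewhere useful -->
    <p><a href="/">Return to the home page</a> or browse the
       <a href="/sitemap.html">site map</a>.</p>
  </body>
  </html>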




March 29, 2016

dynamic serving vs responsive design: customising your design for mobile users


As we explained in the last responsive mobile design post, there are 3 ways to optimize your mobile website: 1) creating a responsive website, 2) dynamic serving, and 3) creating a mobile website.

CHOOSING THE RIGHT MOBILE DESIGN APPROACH: WHEN TO USE DYNAMIC SERVING FOR YOUR MOBILE USERS
 
In this post we look at how to use dynamic serving for mobile users and under what circumstances this approach is used.

Dynamic serving is a server-side development approach that detects which type of device a visitor is using to view your website. It optimizes the website content for that specific device, and the server responds accordingly.

Like responsive design, dynamic serving uses a single set of URLs for all content regardless of how the content is viewed, whether on a desktop, laptop or mobile device. However, that is where the similarity ends. In dynamic serving the URL remains the same, but the content delivered to a mobile device is not always the same as on desktop.

This is because dynamic serving is a server-side approach that alters the content code (HTML, CSS and PHP) based on the device that is asking for it, before the content is delivered to the browser. This allows the server to alter the content of the page without altering the URL of the page, as illustrated below.
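As a rough illustration (the markup is hypothetical), the same URL answers a desktop browser and a smartphone with different HTML. The server inspects the User-Agent request header, and should also send a Vary: User-Agent response header so that caches and crawlers know the content differs by device:

  <!-- Response to a desktop User-Agent for http://example.com/products -->
  <link rel="stylesheet" href="/css/desktop.css">
  <div id="full-dashboard"> ... </div>

  <!-- Response to a mobile User-Agent for the SAME URL -->
  <link rel="stylesheet" href="/css/mobile.css">
  <div id="compact-dashboard"> ... </div>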

 CIRCUMSTANCES WHEN YOU SHOULD USE DYNAMIC SERVING


1) When your website needs to include complex mobile-friendly functionality such as multi-page forms and interactive dashboards. Dynamic serving allows you to serve the best experience based on user circumstances.

2) When your website needs to serve 2 different device markets very differently. One example is that iPhone users take a different path to conversion compared to Android users. Dynamic serving is also used when you want your webpages to render differently for tablet users and smartphone users.

3) When your visitors largely use different keywords to access your website via desktop search and mobile search. Dynamic serving allows you to alter the way content is rendered on a page-by-page basis.

4) When you want to optimize specific pages for high-volume keyword phrases without changing the desktop copy, and when you want your desktop visitors and mobile visitors to convert in different ways. Dynamic serving is the best solution for this.

March 26, 2016

mobile first strategy: when to choose responsive web design


MOBILE FIRST STRATEGY

How do you choose your mobile content delivery platform for your mobile users? Traditionally marketers have had 3 ways of doing this: 1) responsive web design, 2) dynamic serving, 3) a mobile website.


Because a desktop monitor and a smartphone differ in size, designing for mobile implies doing one of these 3 things: 1) build a responsive website that dynamically adjusts content from desktop format to mobile format,


2) use dynamic serving to make the mobile experience device-specific and control how the mobile content is delivered and viewed page by page, or 3) create a separate mobile website specifically made for your mobile users.

Planning your mobile content delivery platform: when to use responsive design, dynamic serving and a mobile site

Option 1: Responsive design. Responsive design is a web design technique that uses CSS and a series of coded rules to dynamically change and adjust the appearance of your desktop content so that it fits within the screen-size parameters of different mobile devices. Currently, responsive design is the mobile-friendly configuration recommended by both Google and Bing. Responsive design works on the client side, using CSS (and sometimes JavaScript) to alter the way pages appear in the mobile or desktop browser after the server has already delivered the page. The sketch below illustrates how this works.
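A minimal sketch of the idea (the class name and breakpoint are hypothetical): the same HTML is sent to every device, and a CSS media query reflows it on small screens:

  <meta name="viewport" content="width=device-width, initial-scale=1">
  <style>
    .content { width: 960px; }          /* default desktop layout */
    @media (max-width: 480px) {         /* applied only on small screens */
      .content { width: 100%; }
    }
  </style>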

You can use responsive design for your mobile users if your situation fits the three criteria listed below. 1) When you want a coherent and integrated desktop and mobile experience.

"responsive design,dynamic serving or mobile website"

2) When you have developed a strong desktop website and have built up a reputation in terms of page strength, inbound links, unique search-optimized content, link equity, trust and industry authority, and you want to pass these factors on to the mobile version of the page. In such cases responsive mobile sites benefit from shared indexing with desktop sites.





3) When you want your mobile users to follow similar page navigation, data flow and conversion paths as they do in your desktop version, or when you have limited development resources, which can make maintaining a custom mobile experience with custom content seem out of the question. The responsive web concept is also the best option when you want to manage the maintenance and search engine indexing of just one version of the site, with the SEO benefit passing on to the mobile version as well.

science of writing meta tag descriptions: 5 best practices for seo

"5 best practices in writing the perfect SEO meta tag description"


The significance of the meta description: while the perceived (and real) importance of meta data across search has declined, the attribute still plays a significant role in SEO. The meta description has 3 primary uses.

META DESCRIPTION TAGS AND SEARCH ENGINES: 5 BEST PRACTICES
The meta description's role in search:
1) To describe the content of the page accurately and succinctly.
2) To serve as a short advertisement that entices searchers to click on your page in the search results.
3) To display the targeted keywords, not for ranking purposes, but to indicate the page's content to searchers.

5 rules to write the best meta descriptions


1) Number of characters: descriptions should be succinct and compact. Keep the length to 160 characters for Google and up to 200 characters for Bing; a safe average is around 165 characters, including spaces (see the example at the end of this section).



2) Test, refine, repair and rephrase: just like an ad, which undergoes a plethora of tests, ensure that the description actually fits the content or the theme of the page in question. Ideally, each web page should have its own description.



3) Include relevant keywords: it is very important to have the right keywords in the meta description tag, because search engines boldface the terms that match the query, which can make a huge difference in the visibility and click-through rate of the listing.

4) Include tagged facts in the description: you can include structured facts beyond the standard information. News or blog posts can list the author, date of publication or byline information. What about a product page? Product information such as price, age, manufacturer and features often lies scattered throughout a page; a good meta description can bring all this data together.


5) Understand the user psychology of search: take into account differences in search behaviour. For example, an organic search user will not see your website the same way as a user who comes from PPC search. While creating the meta description it is important to keep this basic fact in mind: users looking for information and users looking to shop online are clearly two different sets of consumers, and you need to create descriptions based on which kind of user is your target segment.


A note on universal descriptions: some search marketers are of the view that you should not always write a description. Conventional logic holds that it is wiser to write a good meta description to maximise the chances of it being used in the SERPs, rather than letting the engines build one. However, this isn't always true.

If the page in question is targeting one, two or three heavily searched terms or keyword phrases, go with a meta description that targets users performing those searches. However, if you are targeting long-tail traffic with hundreds of articles or blog entries, it can sometimes be wiser to let the search engines extract the relevant text themselves. The reason is simple: when the search engines show a page in the SERPs, they display the keywords and the surrounding phrases that the user searched for. If you try to force a meta description, you can end up creating one that is not appropriate for the search phrase your page gets matched to.
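To make rule 1 concrete, here is a hypothetical description that stays within the recommended length and front-loads the target keywords:

  <meta name="description"
        content="Compare the top 10 digital cameras of 2016: prices, specs
                 and expert reviews to help you choose the right model.">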

March 23, 2016

5 most popular kinds of search engine spam






How to spot search engine spam: 5 tell-tale signs of spam impacting search engines



Search engine spam refers to attempts to deceive search engines into overriding existing search engine best practices and placing emphasis on a given set of criteria that, under ideal conditions, does not deserve to be ranked at all.

In this post we discuss the most popular kinds of search engine spam and how to recognize them. Never try to use them, no matter how tempted you are, as this will only result in your site being blacklisted.


HIDDEN TEXT AND LINKS: white text or links on a white background render the text invisible to users unless it is highlighted with the mouse. Spammers fill this invisible text with relevant keywords or hyperlinks that the spiders can read and count as relevant.

TEXT AND LINKS HIDDEN BY A LAYER: one of the tricks most used by black-hat SEO webmasters is to use CSS to hide spiderable content under a page layer, where it is not visible to the naked eye even when the page is highlighted.
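A contrived sketch of both patterns, shown here only so you can recognize them (never use this):

  <!-- Hidden text: white keywords on a white background -->
  <p style="color:#ffffff; background-color:#ffffff;">
    cheap cameras best cameras buy cameras online
  </p>

  <!-- Text hidden by a layer: content pushed out of view with CSS -->
  <div style="position:absolute; left:-9999px;">
    <a href="http://example.com/">keyword-stuffed link</a>
  </div>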





DOORWAY PAGES: doorway pages are web pages made to meet the specific algorithmic requirements of various search engines and are not meant to be shown to ordinary users. In short, doorway pages do not earn their ranking but deceive the search engines into ranking them by design; their main intention is to spam the search engine index so that the page appears high in the SERPs. However, when a user clicks on one, they are automatically redirected to another site or to another page within the same site.

UNCLICKABLE LINKS: creating a link that has only a single 1x1-pixel image as its anchor, that uses the period at the end of a sentence as its anchor, or that has no visible anchor at all. For users there is nothing to click, but the search engine can still follow the link.


CLOAKING: in cloaking, the content shown to search engines and the version shown to users' browsers are different. Spammers may cloak by IP address (information used to find out where a computer or server is located) or by the user agent (the HTTP header describing whether a person or a search robot is requesting the page). When a visitor is identified as a search spider, a server-side script delivers a different version of the web page, with content different from what the searching user's browser can see.


February 21, 2016

5 tips on how to make your content management system friendly for seo



5 ways to ensure SEO benefits while deciding on your content management system


While looking to publish a website, many webmasters might wonder whether the selection of CMS plays a role in SEO and how to fine-tune a CMS to make it SEO-friendly.
The truth is that the CMS does play a huge role in SEO. The top 3 CMS happen to be Joomla, Drupal and WordPress, of which WordPress has the largest market share.
Let's take a look at the basic things you need to keep in mind while deciding on a CMS, and how to ensure your CMS functionality supports your search visibility.


TITLE TAG CUSTOMIZATION: a search-engine-friendly CMS has to let you customise title tags not only at a page level but also through rules for particular sets of pages. Sites that run on Blogger and WordPress often use the date in the URL, e.g. xyz.wordpress.com/post/21-02-2016. This is SEO-unfriendly and should be avoided; replace the date in the URL with the post title. One of the biggest issues with a CMS is the inability to customise the title tag to match the URL or the theme of the post.
For example, if you have a site on cameras with the URL www.a1cameras4you.com, and your CMS only allows titles that always start with your domain name followed by a colon, followed by the article title, you are on the brink of SEO disaster.

On that site, a post on the top 10 cameras might have the URL a1cameras4you.com/top-10-cameras. If your CMS only lets you create titles that start with your website name, so that "A1 Cameras For You" repeats for every URL and post, you are treading dangerously. You should be able to give each URL a customized title and meta tags, as in the sketch below.
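For instance, a minimal sketch of what the CMS should let you output for that post (the titles and description are hypothetical):

  <!-- Bad: the CMS forces every title to start with the domain name -->
  <title>A1 Cameras For You : top 10 cameras</title>

  <!-- Good: a fully customised, keyword-led title and description -->
  <title>Top 10 Cameras of 2016 - Reviews and Prices</title>
  <meta name="description" content="Our picks of the 10 best cameras this year.">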

PAGINATION CONTROLS: pagination can be the bane of website search rankings, so controlling it by including more items per page and more contextually relevant anchor text is recommended. Instead of "next" or "previous page" at the bottom of a listing, you can use titles like "more eCommerce news" or "latest trends in online marketing".
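For example (the URLs are hypothetical), descriptive anchor text gives crawlers context that a bare "Next" link does not:

  <!-- Instead of: <a href="/news?page=2">Next</a> -->
  <a href="/ecommerce-news/page/2">More eCommerce news</a>
  <a href="/online-marketing/page/2">Latest trends in online marketing</a>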

301 FUNCTIONALITY: many CMS lack this critical feature, which plays a crucial role in redirecting content when necessary. A 301 (permanent) redirect tells search crawlers to treat the non-www and www versions of a URL as the same, passing the benefits and link juice to a single URL. A 301 redirect is also used when you move to a new domain or a newer version of a page and wish to pass the search benefits to the new one, thereby preserving the search equity of the older version. This also helps you avoid keyword cannibalization.

IMAGE HANDLING AND ALT ATTRIBUTES: the alt attribute is a must-have feature; it serves as the anchor text when you use an image link. (Remember that in terms of search preference, text links are more advisable than image links.) If you do use image links, ensure that the CMS supports the alt attribute, which helps search engines understand the relevance and content of your image, as in the sketch below. Images in CMS navigational elements should preferably use CSS image replacement rather than relying on alt attributes alone.
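A quick sketch (the file names are hypothetical) of an image link whose alt attribute acts as the anchor text for the link:

  <a href="/top-10-cameras">
    <!-- The alt text tells the crawler what this linked image is about -->
    <img src="/images/dslr-comparison.jpg" alt="Top 10 DSLR cameras compared">
  </a>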

STATIC CACHING OPTIONS: static caching is a must-have for the CMS you are considering. Many CMS offer caching options, which make perfect sense if a page receives consistently high traffic from social or news portals. A bulky CMS often makes extraneous database connections, which can increase load and overwhelm the server if caching is not in place; the resulting downtime can also cost you potential inbound links.


MULTILEVEL CATEGORIZATION STRUCTURE: if your CMS does not allow you to nest subcategories into categories, and further subcategories within those, rethink your CMS options. This limited functionality will prevent you from building a proper site structure and internal hierarchical linking structure.

META NOINDEX FOR LOW-VALUE PAGES: even if you use rel=nofollow on your internal links, other sites might still link to you, and some low-value pages might rank ahead of the pages you intend to optimize. Check whether your CMS allows you to use noindex on low-value pages such as "about us", "contact us" or FAQs. This is a better way to handle low-value pages that you do not want to show up in the SERPs; a sketch follows below.
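A minimal sketch: placing this tag in the head of a low-value page keeps it out of the index while still letting the crawler follow its links:

  <!-- e.g. on the /contact-us or /faq page -->
  <meta name="robots" content="noindex, follow">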

February 20, 2016

5 facts about the rel=nofollow attribute to keep in mind before optimizing your search

In 2005 the major search engines (Google, Yahoo! and Microsoft) agreed to support an initiative to reduce the effectiveness of automated link spam.
Unlike the meta robots version of nofollow, this new directive is employed as an attribute within an anchor (<a>) tag to indicate that the linking site does not vouch for the quality of the linked page.


In short, the rel=nofollow attribute was intended to tell search spiders not to pass link juice to the third-party page being linked to. Originally this served to stop automated links appearing on blogs as comments, in forums and on other user-generated content sites where links were liberally splashed around to fool the search engines into crawling them and passing on the usual search benefits.


In due course it was seen that many website owners used content from other sites but applied rel=nofollow to stop link juice flowing to the linked page. Google's guidelines, however, say that only paid links, or links attained through dubious methods, should carry the rel=nofollow attribute. Google also says that when linking to a site that is editorially good, you should not be using rel=nofollow.


Please note that although the rel=nofollow attribute stops search crawlers from passing on linking benefits, it does not stop the linked page from being indexed (despite the lack of semantic logic).
You can implement a nofollow link as follows: <a href="http://www.onlinemarketing-trends.com/" rel="nofollow">anchor text</a>


In 2009, Matt Cutts wrote a post suggesting that the link juice associated with a nofollowed link is discarded rather than reallocated. In theory you can still use rel=nofollow as many times as you want, but using it on internal links no longer brings the kind of benefit webmasters and SEOs once got from it.

One word of caution: using nofollow across external links too many times can get a site flagged as over-optimized. The rule of thumb given here: out of 10 posts, use nofollow on about 7 of them; and for posts that draw on third-party content, do not use rel=nofollow on sites that are editorially seen as very strong.

February 15, 2016

the highest cost per click keyword list in bing

THE MOST EXPENSIVE KEYWORDS IN BING

"The biggest cost per click keywords in Bing"
Lawyers, attorneys and structured settlements are among the top 10 most expensive keywords in Bing.

February 14, 2016

the 5 biggest server and hosting issues that affect search results

Thankfully, only a handful of hosting and server issues impact search optimization. However, when overlooked they spiral into big problems, so it is worthwhile to understand the hosting and server problems that can negate the effect of a well-optimized site.

Here are the biggest web hosting and server issues that impact search visibility:

1) Server timeouts: if a search engine makes a request that isn't served within the bot's limited time, your pages might not be included in the Google index at all, and will without doubt rank poorly for your given keywords (as no indexable content is found).

2) Slow response times: response times are extremely important for your search visibility, as slow responses give crawlers a tough time indexing your pages. If the response is slow, it is less likely the spider will continue to wait for your site to load; crawlers visit your site for a limited time, and failure to serve your pages during that time will negatively impact your site's SEO.

3) Shared IP addresses: the basic concerns are speed and the potential of being associated with other sites originating from the same IP, which can imply that you are part of link spam or a spammy neighborhood.




"the 5 biggest server and hosting issues  that affect search results"
How hosting and server issues affect search
4) Bot redirection and handling: some system administrators and webmasters go overboard with protection and restrict access to files if a single visitor makes more than a certain number of requests in a given time frame. This only serves to limit the spider's ability to crawl, and will affect your search ranking.

5) Server geography: though this is not a major issue, search engines do consider the location of the server in determining whether a site's content is relevant from a local search perspective. According to Google, around 40% of searches have a local component.

February 7, 2016

10 duplicate content checker tools to prevent seo penalty



Here are some tools that will help you filter out duplicate content issues before search engines crawl your site, see it as duplicate and penalise it. This list contains both paid and free tools.

1. Duplichecker: this free plagiarism checker allows you to conduct text searches, upload .docx or text files, and run URL searches.

2. Siteliner: a free duplicate content checker for scanning entire websites. Simply paste your site's URL in the box and it will scan for duplicate content. The Siteliner premium service is very affordable (each page scanned costs only 1c).

3. PlagSpotter: you can sign up for a no-cost 7-day trial to enjoy a plethora of useful features, including plagiarism monitoring, unlimited searches, batch searches, full site scans and much more. PlagSpotter's paid version is extremely affordable.

4. Moz: a handy paid tool for understanding which content, keywords or phrases are duplicated.

5. Copyscape: one of the most well-known free duplicate content checkers. Copyscape offers a free URL search, with duplicate content results coming in just a few seconds.

6. Wordtracker: this filters duplicate phrases and lets you export them into a PDF file. It also helps you identify a URL that is stealing content from your site.

7. Webconfs: this tool tells you how many pages and which URLs carry duplicate content, and which sites have copied content from yours. Once you type a URL into the box, it determines the percentage similarity between two pages: http://www.webconfs.com/similar-page-checker.php

 Free Tools 
 1. Duplichecker
2. Plagiarisma
3. Plagium

top 5 free keyword research tools which you are yet to know




"the top 10 free resources for keyword research tools"
1a) WORDSTREAM KEYWORD SUGGESTION TOOL: a basic tool for generating keyword suggestions along with search-volume metrics.

1b) WORDSTREAM KEYWORD NICHE FINDER: useful for building keyword lists for new niche content you want to optimize.

2) QUINTURA: provides interactive keyword data in the form of a cloud. Alongside the tag cloud sits a traditional SERP page. This visual tool can surface great keywords and phrases you might have overlooked; just watch the tag cloud automatically change shape, with the most relevant keywords growing bigger while the long tail grows smaller.

3) GOOGLE SUGGEST: you might already be familiar with this tool, along with Google's keyword research tool.

4) SOOVLE: shows real-time search terms as you type, ordered by keyword popularity. It is a one-stop destination that polls Amazon, Bing, Wikipedia and Answers.com, pulling out relevant keywords that most sites tend to overlook, refreshed dynamically each time you pause while typing. This tool helps you tap into 7 top sources at once.

5) YOUTUBE SUGGEST: if you are interested in finding search terms for video optimization, this is the place to go. As you start typing keywords, it instantly shows the most popular ones.

6) SEMRUSH: one of the most popular tools for narrowing your keyword set; it extracts the most popular keywords and shows you long-tail keywords as well.

7) KEYWORD SPY: one of the best tools for researching your competitors' keywords, including how much they have been spending on paid as well as organic search, besides ranking your competitors' keywords alongside your own. It has both a paid and a free version.

8) SPYFU: a competitor of Keyword Spy and an excellent resource for understanding your competitors' SEO and where they rank. It lets you check popular keywords and provides insights into how much your competitors are spending on paid ads.


May 28, 2015

inbound links in SEO link-building efforts: 5 metrics to measure link worthiness

"link building: the value of links in seo efforts"

Not all links are created equal. The value of an inbound link depends on several factors: 1) the value of the page from which the link is offered; 2) the number of outbound links on the originating page; 3) the PageRank of the originating page; 4) whether the link passes through a 301 (permanent) or a 302 (temporary) redirect; 5) whether the link is nofollow or follow.

April 11, 2012

Link Building for "Link Unfriendly Industry": Top 5 Tips

Link Evaluation Survey 2012 infographic, by Orange Line SEO

David Klein recently published an infographic highlighting how to benchmark your links for search optimization. Off-page and on-page factors are both important; link building belongs to off-page optimization, and David's 3 linkability tests will help you know whether your link policy makes sense technically.

Here are some metrics and data points to refer to if you are not sure about a link's prospects:

  • The strength of a link
      despite the shifts in link building practices, it is vital to assess a link from a strength point of view. This means utilising one or several of your preferred metrics; for us these are PageRank, mozRank and domain authority.
  • The quality of a site
     just what is a high-quality website? We assess a website based on the overall user experience and the apparent policies of the website being evaluated. This includes things like a suitable number of adverts relative to content, well-written and well-constructed pages, and ensuring the website doesn’t appear to be overtly flogging its sidebars or footers.
  • The relevance of that website
     some argue that the relevant link is a myth, but we are still keen advocates of links making sense from a user's perspective, and that is the key to relevance. Obviously, in some industries it is near-on impossible to get links from directly relevant websites; still, the link should be relevant to the industry. If not directly, try the entire industry ecosystem, back end as well as front end: your suppliers, your vendors and your OEMs' distribution channels, if you are in a "link-unfriendly" industry.

June 27, 2011

Global Search vs Country Search: Search Marketshare









May 23, 2011

The Search Market Growth in Latin America

According to comScore data, in March 2011 Latin Americans conducted a total of 18.5 billion searches, an increase of 21% from the previous year. Brazil, which accounted for the largest volume of search queries at nearly 6 billion, also had the strongest growth rate at 34%. Mexico ranked as the second largest search market with nearly 3.2 billion queries conducted in March 2011 (up 23%), followed by Colombia with almost 2.9 billion searches (up 28%).

May 17, 2011

Link Building Commandments

KliKKi is a leading search marketing agency in the Nordic region providing SEO services to numerous multinational brands both locally and globally.

April 21, 2011

Top 10 Paid Search Properties where US Brands Spend the Most


         
Paid search is among the most frequently used online marketing tactics, and it is where US brands spend the most.

The Latest annual findings from Search Engine Marketing Professional Organization (SEMPO) and Econsultancy indicate paid search marketers are increasingly turning to PPC advertising on social media channels to complement traditional search engine placements.

More than half (52%) of companies worldwide vouched for the “moderate” or “huge” impact social media has had on their search engine marketing programs within the last year.

Add that to the growing number of social media channels offering a PPC advertising model, and it’s no wonder 47% of North American companies are running PPC campaigns on Facebook, and more than a quarter (27%) are doing so on LinkedIn. In addition 18% of companies are PPC advertising on YouTube, and 15% on Twitter.

Although these percentages are dwarfed by those of North American companies advertising on Google and Bing/Yahoo!, major search engines aside, it’s clear companies prefer PPC advertising on social media channels to smaller engines like AOL and Business.com.

Image source: Elliance / Dyseo blog