


Showing posts with label SEO.

June 2, 2016

optimizing your site: 5 things that search engines cannot see

How do you identify problems with your site that search engines fail to see, problems which lead to search spiders skipping your pages during indexing?
Problem 1: Consider a simple scenario: your webmaster is working on the site on a staging server. You don't want search engines to see these pages, because they are duplicate versions of your live pages, so the staging copy is normally kept tagged as "NoIndex". When you move the site from the staging server to the live server, you should remove the NoIndex tags. If you forget to remove them, you will see a phenomenal drop in traffic.
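A quick way to audit for this is a minimal sketch using Python's standard html.parser; the sample HTML below is hypothetical:

```python
from html.parser import HTMLParser

class NoIndexScanner(HTMLParser):
    """Collects robots meta directives found in a page's markup."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "robots":
            self.directives.append(attrs.get("content", "").lower())

def has_noindex(html):
    scanner = NoIndexScanner()
    scanner.feed(html)
    return any("noindex" in d for d in scanner.directives)

# Hypothetical page accidentally copied over from a staging server
staging_html = '<html><head><meta name="robots" content="noindex, nofollow"></head></html>'
print(has_noindex(staging_html))  # True -- the NoIndex tag was never removed
```

Running a check like this over your live pages after a deploy catches the forgotten tag before the traffic drop does.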


Problem 2: Some webmasters implement a robots.txt file that prohibits crawling of the site on the staging server. If this file gets copied over when the site is switched to the live server, the consequences are just as bad as in the NoIndex example.
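You can test a robots.txt file offline before it ever reaches the live server. A minimal sketch with Python's urllib.robotparser; the rules shown are a hypothetical staging configuration:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt accidentally copied from the staging server
staging_robots = """\
User-agent: *
Disallow: /
"""

rp = RobotFileParser()
rp.parse(staging_robots.splitlines())

# Every crawler is locked out of the whole site
print(rp.can_fetch("Googlebot", "https://www.example.com/"))  # False
```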

Problem 3: A key difference between a person using a browser and a search engine spider is that the person can manually type a URL into the browser window and retrieve the page it points to. Search engine crawlers lack this capability: they rely on links they find on web pages to discover other pages. This is one of the reasons why inbound links are so crucial.


Problem 4: Another technical problem occurs when search crawlers encounter an object or file type that is not a simple text document. Search engines are designed to index text and are highly optimized to perform search and retrieval operations on text, but they do not do well with non-textual data.

Problem 5: The best way to detect these issues and take appropriate action is to use analytics software to find pages on your site that get page views but no referring search traffic. This is not conclusive on its own, but it does provide a clue about what is going wrong on your site. The reverse is also true: if content on your site is getting search referrals even though you don't want or expect it, you may want to hide that content.
Auditing what you have missed in search optimisation: Another data point you can use is whether your content is being picked up by search engines at all. For example, if you have a site with 1,000 pages and good inbound links, and after 3 months only 20 pages are indexed, that is a clear sign something is wrong.


May 31, 2016

deciding when to add nofollow links vs follow links

How do you decide when to use rel="nofollow" and when not to? While this was explained in depth in my earlier post on the importance of nofollow and follow links, one of the best ways to decide is to analyse which kinds of sites or links are relevant to your posts; in short, compare which link is relevant to your website and which is not. Suppose you read an article about new SEO strategies for 2016, and separately you find a blog template you like and want to adopt: which one would you mark with rel="nofollow"?

In 2005, the three major search engines (Google, Yahoo!, and MSN) agreed to support an initiative to reduce the effectiveness of automated link spam. Unlike the meta robots version of NoFollow, the new directive was employed as an attribute within an anchor or link tag, to indicate that the linking site "does not vouch for the quality of the linked page".
Read the full post below



You can read more about rel=nofollow links and rel=follow links here.



If you are running an SEO blog, the first one is obviously relevant: it gives you information that is topical, relevant and contextual to what your site is about, while the blog template is not. Follow links are used for something you liked and whose site you wish to pass the link juice to; you are voting for that site and telling the search spiders it is trustworthy. Use nofollow for the second one: you are telling the spiders that you don't wish to pass link juice to that site.
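As a sketch of how this decision might be automated, here is a hypothetical helper that decides which rel value an outgoing link should get; the domain names and trusted list are assumptions, not real recommendations:

```python
from urllib.parse import urlparse

MY_DOMAIN = "myseoblog.example"  # hypothetical domain for this blog

def link_rel(href, trusted_domains=("searchengineland.com",)):
    """Decide the rel attribute: follow links we vouch for, nofollow the rest."""
    host = urlparse(href).netloc.lower()
    if host == "" or host.endswith(MY_DOMAIN):
        return "follow"          # internal link: pass the link juice
    if any(host.endswith(d) for d in trusted_domains):
        return "follow"          # topical, relevant site we vouch for
    return "nofollow"            # do not vouch: block the link juice

print(link_rel("/2016/05/rel-nofollow.html"))                     # follow
print(link_rel("https://searchengineland.com/some-article"))      # follow
print(link_rel("http://free-blog-templates.example/nice-theme"))  # nofollow
```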

May 3, 2016

losing out on your quality score: the best landing page infographic



" 5 tips for a perfect landing page"


5 ways to ensure SEO benefits while deciding your landing page


For any SEO optimisation, the landing page is key. In organic search the landing page per se is not as crucial as in paid search, but irrespective of the nature of search, a well designed landing page helps spiders look at your content in a structured and granular way. This helps increase your quality score in paid search and helps your ranking in organic SEO. The infographic above illustrates how a landing page ought to be designed and the key elements to incorporate.





"how to ensure SEO benefits while deciding your paid search landing page"

The 12 jewels to be incorporated in your landing page 





April 12, 2016

how to use blended search to rank well across local search



BLENDED SEARCH RESULTS: 5 ways they impact your organic search results


The introduction of Google's "Snack Pack" results was perhaps the biggest local search shakeup since Google Pigeon rolled out back in July 2014. Of late, Google has trimmed the 7-result snack pack down to a 3-result pack, which is now considered the most crucial ranking element for local search.


Blended search results means that when Google ranks your site, it uses blended factors to find out how effective you are in terms of vertical search. Blended results such as local, shopping, images, real-time results, videos and news are taken into account, and have begun to play a crucial role in addition to the standard ranking factors.
Here is a lowdown on blended search results that might help you rank in local search and get your site indexed faster.



HOW BLENDED SEARCH HELPS TO RANK ACROSS LOCAL SEARCH


  1. The business has a listing in the local indexes, including Google Places, Yahoo Local and Bing Local
  2. The actual business location has the same address as mentioned on the website, in a search-query-friendly format
  3. The business has a local phone number for the searched location
  4. The website contains the local address and phone number for the query
  5. The website has positive, superior ratings from users
  6. Social signals: how your business is shared across social media, and how many clicks your local business listing generates across Facebook, Twitter, Klout and LinkedIn, among others

"5 ways how you can rank better in Local search"


  1. Videos that describe the local institution, or its products and services, and are tagged accordingly
  2. The local listings (Google Local) have to be claimed by the local owners and verified as their business










  1. Foursquare check-ins and mobile click-through rates
  2. Of late, behavioral and/or mobile signals are said to make up 9.5% of the algorithm across localized organic results
  3. High numerical ratings on Google Local / ratings by Google users
  4. The listing title and description should contain at least 3 keywords in order to surface across blended-search SERPs
  5. The business should be listed across third-party data providers and yellow pages
  6. Geo-tagged images on photo-sharing sites like Flickr, Dropbox and Panoramio, along with a caption that explains the geotagged image and the local business
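Several of the signals above boil down to NAP (name, address, phone) consistency between your listing and your pages. A minimal sketch of such a check in Python; the business details are hypothetical:

```python
import re

def nap_consistent(html, business):
    """Check that a page shows the same name, address and phone (NAP)
    as the local listing -- a key local/blended ranking signal."""
    text = re.sub(r"<[^>]+>", " ", html)           # crude tag strip
    digits = re.sub(r"\D", "", business["phone"])  # compare phone by digits only
    page_digits = re.sub(r"\D", "", text)
    return (business["name"] in text
            and business["address"] in text
            and digits in page_digits)

# Hypothetical business listing
biz = {"name": "Arizona Pizza House",
       "address": "12 Main St, Phoenix",
       "phone": "(602) 555-0142"}

page = "<footer>Arizona Pizza House, 12 Main St, Phoenix - call 602-555-0142</footer>"
print(nap_consistent(page, biz))  # True
```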





April 11, 2016

struggling with organic search : 5 seo parameters which you need to improve



5 Reasons why you are not succeeding in Organic Search  


While most of us know that Google's search algorithm determines the ranking of a site, the most important external (off-page) factors are:
1) the web page text
2) the linking sites
3) the number and quality of the sites linking back to you
4) the anchor text
However, let's find out what most search engines care about. Here is a quick lowdown on the most important factors affecting your organic search results.

" search engine ranking on Google"

5 Factors to keep in mind for organic search

INBOUND ANCHOR TEXT: when asking webmasters for a link, ask them to include your anchor text. The anchor text should be relevant to the basic theme of the website. Let's say you have a website on hotels and you are approaching a travel site for a link, and your target keyword is "cheap hotels with great facilities": use this as the anchor text when asking webmasters to link to your site, or to an internal page.



SITE AUTHORITY: how other sites see your website. Are you considered the best website for cheap hotels? For example, how many backlinks does your nearest competitor have compared to you?
" seo list of periodic tables to rank in the first page"

periodic table of  elements and factors for better search engine ranking

VISIBLE HTML ON YOUR PAGE: avoid dynamic links, JavaScript and Flash. Most of your website needs to be in HTML, as search spiders are more comfortable picking up HTML content.

AGE OF THE DOMAIN: the earlier your website was registered and brought online, the more authoritative it is considered compared with a website that started recently.




NATURE OF INBOUND LINKS: quality, relevance and quantity are the most important factors that decide the organic value of your links.

CONTENT PRIMACY: do you have unique content that does not exist on any other site, or have you copied content and added a few things of your own? Building original content takes time, but the more you write, the more the search engine views your site as a source of fresh, unique content.

SPEED: the faster your site's loading time, the better it is for both search engines and users. A slower site harms your website in two ways: bounce rate increases, and since crawlers work within a specific bandwidth budget, it becomes harder for the bots to crawl your website.
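A rough way to keep an eye on loading time is to time your fetches against a budget. A minimal sketch; the 2-second threshold is an assumption you would tune to your audience:

```python
import time

SLOW_THRESHOLD = 2.0  # seconds -- an assumed budget, not a universal rule

def measure(fetch):
    """Time a page-fetch callable and flag it if it blows the budget."""
    start = time.perf_counter()
    fetch()
    elapsed = time.perf_counter() - start
    return elapsed, elapsed > SLOW_THRESHOLD

# Stand-in for a real fetch such as urllib.request.urlopen(url).read()
elapsed, too_slow = measure(lambda: time.sleep(0.05))
print(f"{elapsed:.2f}s, slow page: {too_slow}")
```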

DISCLAIMER: these are some of the most important SEO parameters; there are more than 100 parameters which search engines use to rank websites.



April 4, 2016

planning navigation for non-human visitors and search bots



SITE MAPS TO BE FOLLOWED BY SEARCH ENGINE SPIDERS: 5 ways to ensure bots visit your web site


The most crucial fact you have to understand is that a search engine does not see your site the same way a human visitor does. By creating robot-friendly navigation you encourage search bots to visit your site more often, which in turn increases the visibility of your site. Frequent bot visits ensure your content gets indexed faster; Google's cache shows when its spiders last visited, and if the cache shows a very recent visit, your content is being refreshed faster, which improves your visibility in the SERPs. To perform better in search engine listings, your most important content should be in HTML text format.
"an ideal link structure for robots"

Things not to do: avoid Flash files, Java applets and other non-text content, and put the emphasis on HTML pages instead. Search bots often ignore or devalue these formats.

see the entire post below









How do you do this?
1) By providing search engine robots with links to navigate through your website
2) By pointing search engine robots to dynamic or hard-to-read pages that might not be accessible otherwise
3) By providing a possible link to the landing page
4) By ensuring your 404 error page has a link pointing to another page (preferably the home page)
5) By serving ready-to-use content when a page is not available or a URL is broken
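Point 1 is usually delivered through an XML sitemap, which hands bots a link to every page you care about. A minimal sketch that renders one in Python; the page list is hypothetical:

```python
from xml.sax.saxutils import escape

def build_sitemap(urls):
    """Render a minimal sitemap.xml so bots can reach every page by a link."""
    entries = "\n".join(
        f"  <url><loc>{escape(u)}</loc></url>" for u in urls)
    return ('<?xml version="1.0" encoding="UTF-8"?>\n'
            '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
            f"{entries}\n</urlset>")

# Hypothetical page list, including a dynamic page bots might otherwise miss
pages = ["https://www.example.com/",
         "https://www.example.com/products?id=7"]
print(build_sitemap(pages))
```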




April 1, 2016

VC exits from technology start ups


2015 TECH EXITS BY VCs: there were over 3,400 exits in 2015, a 14% increase from 2014. But exits by quarter are trending downward, with just 717 exits in Q4'15 – the lowest quarterly total since Q1'14.

VC exits from top 10 biggest Technology start ups

"VC exits from top 10 biggest Technology start ups"





"VC activity including start-up exits numbered 3,400 last year"


Global tech exits including M&A, IPO trends, and much more












March 29, 2016

dynamic serving vs responsive design : customising your design for mobile users


As we explained in the last responsive mobile design post, there are 3 ways to optimize your mobile website: 1) creating a responsive website, 2) dynamic serving, and 3) creating a separate mobile website.

CHOOSING THE RIGHT MOBILE DESIGN APPROACH: WHEN TO USE DYNAMIC SERVING FOR YOUR MOBILE USERS
 
In this post we look at how to use dynamic serving for mobile users, and under what circumstances this design approach is used.

Dynamic serving is a server-side development approach that detects which type of device your visitors are using to view your website. It optimizes the website content based on the visitor's device, and the server responds accordingly.

Like responsive design, dynamic serving uses one single set of URLs for all content regardless of how the content is viewed, whether on a desktop, laptop or mobile device. However, that is where the similarity ends: in dynamic serving the URL remains the same, but the content delivered to a mobile device is not always the same as on desktop.

This is because dynamic serving is a server-side approach that alters the content code (HTML and CSS) based on the device that is asking for it, before the content is delivered to the browser. This allows the server to alter the content of the page without altering the URL of the page.
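A minimal sketch of that server-side decision in Python; the device tokens are a simplification (real sites use a device-detection library), and sending a Vary: User-Agent header tells caches the response differs by device:

```python
def render_for(user_agent, desktop_html, mobile_html):
    """Server-side device detection: same URL, different markup.
    The token list is a simplification of a real device library."""
    mobile_tokens = ("iphone", "android", "mobile")
    ua = user_agent.lower()
    body = mobile_html if any(t in ua for t in mobile_tokens) else desktop_html
    # Vary tells caches that the response differs per User-Agent
    headers = {"Vary": "User-Agent", "Content-Type": "text/html"}
    return headers, body

headers, body = render_for(
    "Mozilla/5.0 (iPhone; CPU iPhone OS 9_3 like Mac OS X)",
    desktop_html="<div class='wide-dashboard'>...</div>",
    mobile_html="<div class='single-column'>...</div>")
print(body)  # the single-column mobile markup
```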

 CIRCUMSTANCES WHEN YOU SHOULD USE DYNAMIC SERVING


1) When your website needs to include complex mobile-friendly functionality such as multi-page forms and interactive dashboards. Dynamic serving allows you to serve the best experience based on user circumstances.

2) When your website needs to serve two different device markets very differently. One example is when iPhone users take a different path to conversion compared to Android users. Dynamic serving is also used when you want your webpages to render differently for tablet users and smartphone users.

3) When your visitors largely use different keywords to access your website via desktop search and mobile search. Dynamic serving allows you to alter the way the content is rendered on a page-by-page basis.

4) Dynamic serving is the best solution if you want to optimize specific pages for high-volume keyword phrases without changing the desktop version, and if you want your desktop visitors and mobile visitors to convert in different ways.

March 26, 2016

science of writing meta tag descriptions : 5 best practices for seo

"5 best practices in writing the perfect SEO meta tag description"


The significance of the meta description: while the perceived (and real) importance of meta data in search has depreciated, the attribute still plays a significant role in SEO. The meta description has 3 primary uses.

META DESCRIPTION TAGS AND SEARCH ENGINES: BEST PRACTICES
The meta description's role in search:
1) To describe the content of the page accurately and succinctly
2) To serve as a short advertisement enticing searchers to click on your page in the search results
3) To display the targeted words, not for ranking purposes but to indicate the page's content to searchers

6 rules to write the best meta descriptions


1) Watch the character count: descriptions should be succinct and compact. Keep the length to about 160 characters for Google and up to 200 characters for Bing; an average of 165 characters, including spaces, is a safe target.



2) Test, refine and rephrase: just like an ad, which undergoes a plethora of tests, ensure that the description actually fits the content or theme of the page in question. Each web page should ideally have its own description.



3) Include relevant keywords: it is very important to have the right keywords in the meta description tag, because the boldface the search engines apply to matched terms can make a huge difference to the visibility and click-through rate of the page in question.

4) Ensure tagged facts in the description: you can include other structured facts in the description apart from the standard information. News or blog posts can list the author, date of publication, or byline information. What about a product page? Product information like price, age, manufacturer and features lies scattered throughout a page, and a good meta description can bring all this data together.


5) Understand the user psychology of search disparity: take into account user differences in search behaviour. For example, an organic search user will not see your website the same way as a user who comes from PPC search. While creating the meta description, it is important to keep this basic fact in mind. Users looking for information and users looking to shop online are clearly two different sets of consumers, and you need to create descriptions based on which kind of user is your target segment.


6) Employ universal descriptions wisely: some search marketers are of the view that you should not always write a description. Conventional logic holds that it is wiser to write a good meta description to maximise the chances of it being used in the SERPs, rather than letting the engines build one themselves. However, this isn't always true.

If the page in question is targeting one, two or three heavily searched terms or keyword phrases, go with a meta description that targets users performing those searches. However, if you are targeting long-tail traffic with hundreds of articles or blog entries, it can sometimes be wiser to let the search engines extract the relevant text themselves. The reason is simple: when the search engines show a page in the SERPs, they display the keywords and surrounding phrases the user searched for. If you try to force a meta description, you can end up creating one that is not appropriate for the search phrase your page gets matched to.
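The length rule above is easy to audit in bulk. A minimal sketch using Python's standard html.parser; the 160-character limit follows the Google figure given earlier:

```python
from html.parser import HTMLParser

class DescriptionCheck(HTMLParser):
    """Pulls the meta description so its length can be audited."""
    def __init__(self):
        super().__init__()
        self.description = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "description":
            self.description = attrs.get("content", "")

def audit_description(html, limit=160):  # ~160 chars for Google, per the post
    checker = DescriptionCheck()
    checker.feed(html)
    if checker.description is None:
        return "missing: the engine will build its own snippet"
    if len(checker.description) > limit:
        return f"too long: {len(checker.description)} chars, will be cut off"
    return "ok"

page = '<head><meta name="description" content="Cheap hotels with great facilities in Phoenix."></head>'
print(audit_description(page))  # ok
```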

March 25, 2016

5 best practices in title tags construction for search engines

For keyword optimization, title tags are the most critical elements for search engine relevance. The title tag lives in the <head> section of the HTML document and is the only piece of meta information about a page that influences relevancy and ranking. The following rules represent best practices for title tag construction. One thing to keep in mind is that the title tag of any given page has to correspond to that page's content.


1) Incorporate keyword phrases: an obvious thing to do is to use, in the title tag, whatever your keyword research shows to be the most valuable keywords for capturing searches.

2) Place your keywords at the beginning of the title tag. This provides the most search engine benefit. If you do this and also wish to include your brand name in the title tag, place it at the end. There is a trade-off here between SEO benefit and branding benefit that you should consider explicitly before deciding: well-established and well-known brands might want their name at the start of the title tag, as it may increase click-through rates (CTR).



3) Limit your title to 65 characters including spaces: content in the title tag should not exceed 65-70 characters for almost all search engines; anything beyond roughly 65 characters gets cut off in the SERPs.
4) Focus the title on click-through and conversion rate: the title tag is remarkably similar to the title you would write for a paid search ad, but it is harder to measure and improve because the stats are not as readily available. If the market you serve has relatively stable search volumes, you can do some testing and see if you can improve your CTR.

5) Target the searcher's intent: while writing titles and descriptions, keep in mind how users reach your site, whether by search engines or referrals. Study the top 10 landing pages and try to find out why users are coming to your site.




If the searcher's intent is research, you need to tweak your title, and the structure may need to be more descriptive. However, if users are purchasing online and looking for discounts on your site, your title should clearly mention that those options are available, for example: "Digital cameras now at 40% less: axax (name of website): the top-selling digicams online".
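The length and keyword-placement rules above can be checked mechanically. A minimal sketch; the 65-character cutoff and the "front half" test are simplifications of the rules, not an exact SERP model:

```python
import re

def audit_title(html, keyword, limit=65):
    """Audit a title tag: present, keyword near the front, within the
    ~65-character SERP cutoff."""
    m = re.search(r"<title>(.*?)</title>", html, re.IGNORECASE | re.DOTALL)
    if not m:
        return ["no title tag"]
    title = m.group(1).strip()
    problems = []
    if len(title) > limit:
        problems.append(f"too long ({len(title)} chars): SERPs truncate it")
    pos = title.lower().find(keyword.lower())
    if pos == -1:
        problems.append("keyword missing")
    elif pos > len(title) // 2:
        problems.append("keyword buried at the end of the title")
    return problems or ["ok"]

page = "<head><title>Digital cameras now at 40% less | Axax</title></head>"
print(audit_title(page, "digital cameras"))  # ['ok']
```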

6 ways of optimizing domains for search engines

6 ways of optimizing domain names and urls 

1) Brainstorm 5-10 keywords: before you set up a website, it is important to create a few candidate URLs and brainstorm the keywords you wish to target via your website. Once you have this list, you can start to pair the keywords or add prefixes and suffixes to create good domain names. For example, if you are about to start a website or blog in the mortgage domain, you might start with keywords such as mortgage, finance, home loan and house payment.


2) Make the domain unique: a domain that is easily confused with a popular website already owned by someone else is a recipe for disaster. One of the popular ways webmasters try to leverage the popularity of existing key phrases or URLs is to register domain names that are simply plural, hyphenated or misspelt versions of already established domains.


However, this seldom helps: assuming the site in question is really big, its strength means a misspelt keyword will eventually lead users to the original site rather than yours, as its domain authority will always be higher than yours.





3) Make the URL easy to say and easy to type: any brand that is hard to read or write has already lost round one. Use a name that is easily recognizable by what it means to users, along with an imagery. Make your URL easy to pronounce, share and pass around: word of mouth remains the fastest way of bridging the distance between two people.


4) Keep the domain short and sweet: domain names should ideally be short, unless you cater to a very niche industry. Avoid repetition of letters and numbers, and do not create a URL that looks more like a password!

5) Not all names sound familiar: consumers do not react to all names in the same way. Some names create positive vibes among users while others do not. This is not due to any inherent bias, as even users are unaware of it, but neuromarketers have some answers to why this happens: human beings, including consumers, love familiarity more than the unknown; the unknown has an element of fear, while the known has none.
This is one of the reasons why domain names like AutoTrader, Realty or WebMD sound familiar, as the user can guess the theme of the site just by hearing the name, as opposed to zillow.com or monster.com.

6) Reject hyphens and numbers: both hyphens and numbers make it hard to convey the domain name verbally. Also avoid Roman numerals and mixed upper/lower-case combinations, or anything case-sensitive.
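As a rough illustration, the hyphen, number and length rules above can be turned into a checker. A sketch with hypothetical thresholds (the 15-character budget is an assumption, not a standard):

```python
import re

def domain_problems(name):
    """Flag a candidate domain against the rules above (illustrative checks)."""
    problems = []
    if "-" in name:
        problems.append("contains a hyphen: hard to convey verbally")
    if re.search(r"\d", name):
        problems.append("contains digits: hard to convey verbally")
    if name != name.lower():
        problems.append("mixed case: domains should not rely on casing")
    label = name.split(".")[0]
    if len(label) > 15:  # assumed "short and sweet" budget
        problems.append("too long to type and remember")
    return problems or ["ok"]

print(domain_problems("home-loans4u.example"))  # two problems flagged
print(domain_problems("autotrader.example"))    # ['ok']
```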

March 24, 2016

how google hummingbird was the precursor to semantic search revolution



HOW GOOGLE HUMMINGBIRD UPDATE STARTED THE SEMANTIC SEARCH REVOLUTION 


September 16th, 2013 was a watershed day in the history of search marketing: Google dropped its Hummingbird bombshell, announcing that the Hummingbird update was live and that its search algorithm had undergone a dramatic makeover.
When Hummingbird flew in, the old way of just matching up strings of characters in a search query went out of the window. With the Hummingbird update, built on its Knowledge Graph, Google changed its core prowess from strings to things, and in the process changed itself from a search engine into a knowledge engine.


The need for semantic search: all search engines compete among themselves to keep their search results relevant and in tune with users' actual intent. While a user's input devices might multiply 10 or 20 times, their intent does not change. For any search engine, the true test begins with user intent, as it tries to understand the query itself and not just the keywords in the query. Once a semantic search engine understands context and relevance, it will return results and interpret the meanings of words the same way humans do.

For example, given the search query "what is the height of Barack Obama", a conventional search engine will try to bring up pages that have stories on Obama, the height of something (which might or might not have any connection to Obama), or news on American presidents. This is typical machine-to-machine matching in progress.

However, bring in the human element, where one user asks another "what's the height of Barack Obama?", and it is amply clear that we want to know the height of the American president. This conversation is now aided by semantics.

In terms of semantic search, the Hummingbird update takes a small but significant step towards bridging the gap between machine-led matching and human-style understanding. The best illustration of how Hummingbird manages to bridge the semantic gap is the example below.

A search for "pizza hut calories per slice" used to list an answer, Google said, but not one from Pizza Hut. Now, after the Hummingbird update, it lists the answer directly from Pizza Hut itself, Google says.



Google Hummingbird has been the precursor to the first semantic search revolution. Semantic search technology seeks to extract entities from its databases as answers, and displays personalised results based on the individual's browsing habits.

Hummingbird and voice search: Hummingbird paved the way for voice search, the ability to search by speaking your query into a search engine on a smartphone. The ability of the search engine to understand spoken queries by user intent and the contextual meanings used in spoken conversation is a perfect application of semantic technology in search engines.

Imagine being able to talk to your computer the same way you talk to your friend:
"Hey, I saw this awesome Oscar-nominated movie Bridge of Spies. Why don't you watch it too? Do you want me to find out where it's playing in NJ?"
Semantic search is revolutionizing how people search, because they can interact not merely with keywords but in conversational language. With mobile penetration breaking all records, voice-based search is set to grow in numbers that will defy all expectations.


how metasearch engines differ from general search engines



how metasearch engines work: how different are they compared to traditional search engines?

Metasearch engines are search engine aggregators: they extract search results from different search engines and present the user with the top-ranked sites. Metasearch engines do not maintain databases of their own; rather, they extract and combine results from several other search engines and display them in a standard SERP format.

One of the biggest advantages that metasearch engines have is that they combine the best search results across many search engines and display the best results for the user's keyword query. The more results that can be seen in one place, the better it is for users. Two of the biggest metasearch engines are Dogpile and DuckDuckGo.



While Dogpile pulls results from Yahoo and Google into one place, DuckDuckGo gives you an overview of search results without creating a personalised feed of your search history. This creates a more filtered view of all the indexed data gathered from the other search engine sources. Ads appear at the top and bottom, with organic results, called "web results", appearing in the middle. The biggest proposition DuckDuckGo offers is that it is the only well-known search engine that does not track user behavior.



After pulling the necessary data from known search engines, the metasearch engine retains the top rankings from the separate engines and presents the user with the top-ranked sites. This is different from running an algorithm over a self-built index to find the best sites for a keyword.
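One simple way to blend several ranked lists, as a metasearch engine does, is positional (Borda-style) scoring: a URL earns more points the higher each engine ranks it. This is an illustrative sketch, not any real engine's algorithm, and the result lists are hypothetical:

```python
def metasearch(rankings, top=3):
    """Blend ranked result lists from several engines with Borda-style
    scoring: position 1 of n earns n points, position 2 earns n-1, etc."""
    scores = {}
    for results in rankings.values():
        n = len(results)
        for pos, url in enumerate(results):
            scores[url] = scores.get(url, 0) + (n - pos)
    # Highest combined score first; ties broken alphabetically
    return sorted(scores, key=lambda u: (-scores[u], u))[:top]

rankings = {
    "engineA": ["a.example", "b.example", "c.example"],
    "engineB": ["b.example", "a.example", "d.example"],
}
print(metasearch(rankings))  # ['a.example', 'b.example', 'c.example']
```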


March 23, 2016

5 most popular kinds of search engine spam






How to spot search engine spam: 5 tell-tale signs of spam impacting search engines



Search engine spam refers to attempts to deceive search engines by overriding existing search engine best practices and laying emphasis on a given set of criteria which, under ideal conditions, does not deserve to be ranked at all.

In this post we discuss the most popular kinds of search engine spam and how to recognize them. Never try to use them, no matter how tempted you are, as this will only result in your site being blacklisted.


HIDDEN LINKS: white text or links on a white background render the text invisible to users unless it is highlighted with the mouse. Spammers stuff such areas with keywords or hyperlinks that the spiders can read and count as relevant.

TEXT LINKS HIDDEN BY A LAYER: one of the tricks most used by black-hat SEO webmasters is to use CSS to hide spiderable content under a page layer, where it is not visible to the naked eye even by highlighting the page.





DOORWAY PAGES: doorway pages are web pages made to meet specific algorithmic requirements of various search engines, and are not meant to be shown to ordinary users. In short, doorway pages do not earn their ranking but deceive the search engines into ranking them by design; their main intention is to spam the search engine index so that they appear high in the SERPs. When a user clicks on one, however, they are automatically redirected to another site or to another page within the same site.

UNCLICKABLE LINKS: creating a link that has only a single 1x1-pixel image as an anchor, that uses the period of a sentence as an anchor, or that has no anchor text at all. For users there is nothing to click, but the search engine can still follow the link.


CLOAKING: in cloaking, the content shown to search engines and the version shown to users' browsers are different. Spammers may cloak by IP address (information used to determine where your computer or server is located) or by user agent (the HTTP header describing whether a person or a search robot is requesting the page). When a visitor is identified as a search spider, a server-side script delivers a different version of the web page, whose content differs from what the human searcher's browser would show.
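One rough way to test a page for cloaking is to fetch the same URL once with a browser User-Agent and once with a crawler User-Agent, then compare what comes back. A minimal sketch of the comparison step; the snapshots are hypothetical:

```python
import hashlib

def looks_cloaked(browser_html, spider_html):
    """Compare the page served to a browser UA with the page served to a
    crawler UA; materially different content is a cloaking signal."""
    # Normalize whitespace so trivial formatting differences don't trigger
    fp = lambda s: hashlib.sha256(" ".join(s.split()).encode()).hexdigest()
    return fp(browser_html) != fp(spider_html)

# Hypothetical snapshots of the same URL fetched with two User-Agents
as_browser = "<h1>Buy cheap widgets</h1>"
as_googlebot = "<h1>Buy cheap widgets</h1><div>keyword keyword keyword</div>"
print(looks_cloaked(as_browser, as_googlebot))  # True
```

Legitimate dynamic serving varies layout per device, not substance per crawler, which is why the comparison normalizes formatting before hashing.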


February 28, 2016

5 kinds of linking strategies that can affect your seo rankings

Among the most crucial changes that might affect your website traffic is your linking strategy. At times we go overboard to get links and focus on quantity rather than quality; this affects your traffic and can change your SERP position. So how do you judge the quality of your links? Here are 5 things to look at:
1) Is the site you are targeting complementary to your industry, or is it your competitor?
2) How many links does the website have? Are the links merely buried in a links page? If a page has more than 100 links, you can be assured the link will not pass on much value.



3) Relevance and authority: links related to the same topic are given more weight than random links to unrelated pages. Think of the relevance of each link as being evaluated in the context of a specific user search query. For example, for the search term "new cars in Arizona", if the publisher has received a link from the Arizona Chamber of Commerce, the search engine infers that the link is relevant and trustworthy, since the site is about Arizona. Once you decide on the industry you are targeting, your focus on link building should be razor-sharp on getting links from within that industry.



4) Nofollow links: Google recently claimed that the nofollow attribute is no longer as relevant and important as it was a couple of years ago. Even so, when you use a nofollow meta tag on a page, the search engine will still crawl the page and place it in its index; however, all links (both external and internal) on the page will be prevented from passing link juice to other pages.

5) Anchor text: avoid links where the anchor text merely says "our links", "click here to know more", "read more" or "check out the full post here". Be very specific about the anchor text with which your site is linked. The best policy is to use the title of your website or webpage as the anchor text. The impact of anchor text is more powerful than you think: if you link to a page with minimal search-friendly content (a Flash site, for example), the search engine will look for signals to figure out what the page is about, and in that case inbound anchor text becomes the primary driver in determining the relevance of the page.
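Point 2 above, spotting pages with 100+ links, is easy to script. A minimal sketch using Python's standard html.parser; the sample page is synthetic:

```python
from html.parser import HTMLParser

class LinkCounter(HTMLParser):
    """Counts <a href> links on a page; the post treats 100+ as a red flag."""
    def __init__(self):
        super().__init__()
        self.count = 0

    def handle_starttag(self, tag, attrs):
        if tag == "a" and dict(attrs).get("href"):
            self.count += 1

def too_many_links(html, limit=100):
    counter = LinkCounter()
    counter.feed(html)
    return counter.count, counter.count > limit

# Synthetic "links page" with 150 outbound links
links_page = "".join(f'<a href="/p{i}">link</a>' for i in range(150))
count, flagged = too_many_links(links_page)
print(count, flagged)  # 150 True
```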