Showing posts with label Search Marketing.

October 20, 2016

a dynamic website built on a content management system: must-have seo features


 
A dynamic website is a website built using templates and a CMS (content management system). A CMS gives you control over how each web page is created, pulling information from a database and other sources. That means the web pages do not exist until the system builds them. For example, suppose you run an online shopping website that sells more than 10,000 products. You are not going to build 10,000 web pages manually, one for each product. Instead you use a content management system to build the pages dynamically, on the fly.
The CMS generates the page that a search spider crawls by taking the information in your database and plugging it into a template, so the CMS creates the tags, content and code that the search engines ultimately see. One important thing to remember is that it is essential to choose a search-marketing-friendly CMS when building a dynamic website. Any CMS that gives you access to individual pages at a granular level, letting you create titles, meta tags and keywords, can be considered search friendly. These features also give you control over the entire content of each page: you should be able to change the title tag and H1 tags and edit keywords and descriptions.


In short, any CMS that does not let you control SEO at the individual page level is not SEO friendly. WordPress and Pixelsilk are two examples of CMSs that do. When deciding on a CMS for search, keep these things in mind:
1) It should let you customise HTML templates
2) It should let you produce unique title tags
3) You should be able to add meta descriptions and keywords on your own
4) You should be able to produce and change H1, H2 and H3 tags at will
5) It should let you categorise and group content
A minimal template sketch follows.
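To make the checklist concrete, here is a hypothetical sketch of the kind of page template an SEO-friendly CMS should expose, with the title, meta description, keywords and H1 all editable per page from the admin. The placeholder field names are assumptions, not any particular CMS's syntax.

<head>
  <!-- each of these values should be editable for every individual page -->
  <title>{{ page.seo_title }}</title>
  <meta name="description" content="{{ page.meta_description }}">
  <meta name="keywords" content="{{ page.meta_keywords }}">
</head>
<body>
  <h1>{{ page.h1_heading }}</h1>
  {{ page.body_content }}
</body>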

October 19, 2016

how many form fields are ideal for cta landing page forms


 
Best practices for creating CTA forms on your landing page: forms are your best method of collecting user information and customer data points. The amount of data you ask for (the number of form fields in the form) should correspond directly to how much upfront screening you want to do, that is, the information you need to qualify a lead as hot, warm or moderate.
Most visitors expect to receive value in exchange for their personal data, so they give their details based on the strength of your proposition. When a visitor perceives an offer as more valuable, they are willing to release more information. Conversely, the lower the perceived value, the less personal information a user will be willing to part with.
A customer may be willing to hand over their credit card for a one-month trial or a subscription offer, but they would not part with the same credit card number for an ebook. Your basic aim should be to elevate the offer by promising something they have not been exposed to before. Whether you hand out freebies or capture only an email id and phone number depends on your product.


Ultimately it is a wise strategy to keep your form set to capture minimal information such as an email id and mobile number, because form fatigue tends to grip online users and over time they will part with only a little information before they move elsewhere. I usually recommend a form with no more than 3 fields:
1) First and last name
2) Email ID
3) Phone number
If you have more than these 3 fields, mark the ones that must be completed as mandatory and leave the rest optional. For business-to-business companies you might also want to include business email, title and number of employees. A sketch of such a form follows.
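As a rough sketch of what such a three-field form might look like in plain HTML, with the mandatory fields marked using the required attribute. The action URL and field names are hypothetical.

<form action="/lead-capture" method="post">
  <!-- mandatory fields -->
  <label>First and last name <input type="text" name="full_name" required></label>
  <label>Email ID <input type="email" name="email" required></label>
  <!-- optional field -->
  <label>Phone number <input type="tel" name="phone"></label>
  <button type="submit">Get the offer</button>
</form>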

October 14, 2016

google confirms sites hit by google penguin 4 years ago are recovering


 
Webmasters who had been hit by Google Penguin can now rejoice. A recent Moz post shows that sites hit by Google Penguin are seeing increased traffic from Google after a long two-year wait, following the latest Penguin update that rolled out in late September and early October 2016. This roll-out was unusual in many ways, and it only now seems to be settling down. In the past couple of weeks we have seen many reports of recoveries from previous Penguin demotions. On October 7th, Gary Illyes from Google said that the Penguin roll-out was in the "final stage" (presumably, the removal of demotions) and would take a "few more days". As of this writing, it has been five more days.
Google representatives are confirming details about the new Penguin both publicly and privately, and algorithm flux matches the general timeline. Perhaps more importantly, we are seeing many anecdotal reports of Penguin recoveries, as shown in the image above.


Websites that track sites hit by Google Penguin have also reported that the kind of hit those sites took after the Penguin update is slowly showing signs of recovery.

August 4, 2016

why using flash content is seo unfriendly : 5 reasons

Although Google has been indexing Flash content and following links inside it since 2008, the fact remains that Google, or for that matter any search engine, cannot reliably read what is written within Flash files. We take a look at 5 reasons why using Flash content on your website is considered not fully SEO friendly. The most crucial problem with Flash is the missing SEO features: anchor text, H1-H2 tags, bold and strong tags, alt image attributes and even page titles are not simple to implement in Flash. Developing Flash with SEO factors in mind is more difficult than doing it in HTML.

1) Different content is not on different URLs: this is the same problem you have with Ajax-based pages. You may have unique frames, or movies within movies, that appear to be completely distinct sections of the Flash site, but there is no way to link to these individual elements.



 

2) The text breakdown is not clean: Google can index the output files in the .swf to see words and phrases, but in Flash a lot of the text is not wrapped cleanly in tags such as <h1> or <p>. It is jumbled into half phrases for graphical effects, and the output is often in the wrong order. Worse still are text effects that require breaking words apart into individual letters in order to animate them.

3) Flash gets embedded: a lot of Flash content is only linked to from other Flash content, wrapped inside shell Flash pages. This chain of links, where no other internal or external URLs reference the interior content, leads to documents with very low PageRank, as link juice fails to get transferred. Even if such a page manages to stay in the main index, it probably won't rank for anything.


"why Flash should be avoided "




4) Flash does not earn external links like HTML: an all-Flash site may get a large number of links to the homepage, but the interior pages always suffer. When webmasters link to embeddable Flash, they normally point to the HTML host page rather than to any interior pages within the Flash.
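One commonly suggested mitigation, not specific to this post, is to wrap the Flash movie in an object element and provide crawlable HTML text and links as fallback content. The file names below are purely illustrative.

<object type="application/x-shockwave-flash" data="/media/promo.swf" width="640" height="360">
  <!-- fallback HTML that search spiders and non-Flash devices can read -->
  <h2>Spring promotion</h2>
  <p>A text summary of what the movie shows, with normal crawlable links such as
     <a href="/products/">our product pages</a> that can pass link equity.</p>
</object>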

3 reasons why creating microsites is a cardinal sin in seo

Search algorithms favour large, authoritative websites with a depth of information. On any given day, the bigger site, the one with the deepest and broadest content, has the highest probability of appearing in search results, as opposed to thin content spread out across a small microsite. All search engines give more weight to big websites with deep and broad pages full of content. This negates the reason for creating new content as a microsite, as it is less helpful than adding the same content to your existing site.
" microsite vs one website : whats the best way forward in seo"

 
No matter how well you create the content, you can never replicate the content that is on your main website. Your main website is already seen and indexed by the search engines as a huge reservoir of quality content, so trying to create another site with similar, or only slightly unique, content will not get you much traction. A multiple-site strategy also splits the benefits of links: a single quality link pointing to a page on the main domain positively influences the entire domain and all the pages associated with it. Because of this, it makes more sense to have every good new link point to the same domain, to help boost the rank and value of the pages on it. A new site will also take time for search engines to spider and rank in search results, which impacts the visibility of the brand on the web.


Additionally, a new site will take some time to appear in Google results. A clear-cut content strategy is a must before creating a microsite, notwithstanding the reasons not to develop one, and developing content for such a site is often quite difficult. Having keyword-targeted pages on other domains that do not benefit the main site in terms of content or backlinks is an exercise in futility.

July 14, 2016

deciding the optimum url length for search engine benefits

Selecting the most appropriate url formats for search engines: 

How does a search engine decide the optimum length of a URL, and what is considered search friendly?

While it is true that search engines give a fair amount of importance to the URL, they are also quite adept at understanding and interpreting long URLs with numerous hyphens, and at spotting the extent to which webmasters use them for spamming (for example, an over-excited SEO webmaster might use something like buy-this-awesome-product-iasshumiort-html as an actual URL). That only sends a signal to the search robots, sounding the trumpets for them to discount the entire post or URL as spam.





" the art of creating search engine friendly url's for optimization"
So how do you signal to the search engines what your content is about through the URL, and get its parameters just right?
1) Describe your content: if a user can make an accurate guess about your content just by looking at the address bar, you have done your job
2) Use static URLs: dynamic URLs are harder to index for both search engines and users
3) Descriptors, not numbers: never use a random set of numbers like 1234 when you can use a brand or product name instead
4) Use lowercase: although URLs can accept both upper and lower case, always desist from using uppercase
A quick example follows.
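Putting those four rules together, a hypothetical example of the kind of URL to aim for versus the kind to avoid (example.com is a placeholder domain):

Aim for : https://www.example.com/cameras/compact-digital-cameras
Avoid   : https://www.example.com/index.php?catid=1234&prod=9876
Avoid   : https://www.example.com/Cameras/Compact-Digital-Cameras   (mixed case)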


June 2, 2016

optimizing your site for 5 things that search engines cannot see

How do you identify problems with your site that search engines fail to see, and that lead to search spiders missing or not indexing your pages?
Problem 1: here is a simple scenario. Your webmaster is working on the site on a staging server. You don't want the search engines to see these pages, because they are duplicate versions of your live pages, so the staging copy is kept as "NoIndex". This is something the search engines cannot see past.
Normally, when you move the site from the staging server to the live server, you should remove the NoIndex tags. If you forget to remove them, you will see a phenomenal drop in traffic.
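The tag in question is the meta robots directive. A staging page would typically carry something like the line below in its head section, and this is exactly what must come out when the page goes live.

<!-- keep this on the staging server only; remove it when the site goes live -->
<meta name="robots" content="noindex, nofollow">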


Problem 2: some webmasters implement a robots.txt file that prohibits crawling of the site on the staging server. If this file gets copied over when the site is switched to the live server, the consequences are just as bad as in the NoIndex example.
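A staging robots.txt that blocks all crawling usually looks like the two directives below; if this file is copied to the live server by mistake, every crawler is told to stay out of the entire site.

# staging-only robots.txt -- do not copy to the production server
User-agent: *
Disallow: /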

Problem 3: a key difference between a person using a browser and a search engine spider is that the person can manually type a URL into the browser window and retrieve the page it points to. Search engine crawlers lack this capability; they are forced to rely on links they find on web pages to discover other pages. This is one of the reasons why inbound links are so crucial.


Problem 4: another technical problem arises when search crawlers encounter an object or file type that is not a simple text document. Search engines are designed to index text and are highly optimized to perform search and retrieval operations on text, but they don't do very well with non-textual data.

Problem 5: the best way to detect these issues and take appropriate action is to use analytics software to find pages on your site that get page views but no referring search traffic. This by itself is not conclusive, but it does provide a clue about what is going wrong. The reverse is also true: if you see content on your site getting search referrals even though you don't want or expect it, you may want to hide that content.
Auditing what you have missed in search optimisation: another data point you can use is whether your content is being picked up by the search engines at all. For example, if you have a site with 1,000 pages and good inbound links, and after 3 months you see only 20 pages indexed, that is enough of a clue that there is a problem.


May 31, 2016

deciding on when to add no follow links vs follow links

How do you decide when to use rel=nofollow and when not to? While this has been explained in depth in my earlier post on the importance of rel=nofollow and follow links, one of the best ways to decide is to understand and analyse which kinds of sites or links are relevant to your site and which are not. Suppose you read an article about new SEO strategies for 2016, and separately you find a blog template you like and want to link to: which one would you mark as a rel=nofollow link?

In 2005 the major search engines, Google, Yahoo and MSN (now Bing), agreed to support an initiative to reduce the effectiveness of automated spam. Unlike the meta robots version of NoFollow, the new directive was employed as an attribute within an <a> (link) tag to indicate that the linking site does not vouch for the quality of the linked page.
Read the full post below.



You can read more about rel=nofollow links and rel=follow links here.



If you are running an SEO blog, the first link is obviously the relevant one, as it points to information that is topical, relevant and contextual to what your site is about, while the blog template is not. The analogy is simple: follow links are used for something you liked and want to pass link juice to; you are voting for that site and telling the search spiders it is trustworthy. Use nofollow for the second one, telling the spiders that you do not wish to pass link juice on to that site. A quick sketch of both follows.
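A minimal sketch of the two cases in HTML; the URLs are hypothetical.

<!-- a link you vouch for: normal link, passes link juice -->
<a href="https://example.com/new-seo-strategies-2016">New SEO strategies for 2016</a>

<!-- a link you do not want to vouch for: add rel="nofollow" -->
<a href="https://example.com/free-blog-template" rel="nofollow">Blog template</a>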

May 25, 2016

5 best practices in creating and marketing your video



5  Best Practices to create and market your video online 


These are the best practices in creating a video

Placement: embed the video on a page whose content is related to it, rather than on just any page.

Descriptive text: describe what your video is about. Keep your description to about 250 to 350 words and make sure you use your keywords liberally in the text.

Saving the video: save the video file inside the relevant silo directory, rather than in a central video directory.


Play: use an HTML player to ensure your video is compatible with different browsers and devices. Don't set the video to play automatically; not all users appreciate a video that starts without them clicking on it. Let users start the video themselves, as in the sketch below.
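A simple sketch of an HTML5 player that works across modern browsers and does not auto-play; the file path is a placeholder.

<video src="/videos/product-demo.mp4" controls preload="metadata" width="640">
  <!-- no autoplay attribute: the user decides when to press play -->
  Your browser does not support HTML5 video.
</video>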

Size: ensure your video can adjust its size automatically if the user's screen size shrinks.

Quality: render your video at a file size that fits the user. Tech-savvy users and metro cities have faster connections and can handle large files, but ideally the size should be smaller for smooth mobile viewing. Find a balance between good quality and fast viewing and download speed.

Length: short videos are easier to download and more convenient to watch. Your video ideally should not exceed 3-5 minutes in length. Remember that user attention decreases with every minute on the web, so create videos that convey the actual marketing message or crucial information right at the start.


Posting: in addition to posting your video on your site, you can upload it to other online video sharing sites such as YouTube, Metacafe and Vimeo and link it back to your site. Alternatively you can upload the actual video to YouTube and add the embed code to your site or blog.

Schema markup: wherever the video is embedded on your site, adding schema markup to your HTML code can make it easier for search engines to find your video, as sketched below.
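One way to do that is schema.org's VideoObject vocabulary in JSON-LD; all of the values below are placeholders.

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "VideoObject",
  "name": "Product demo",
  "description": "A short keyword-rich summary of what the video shows.",
  "thumbnailUrl": "https://www.example.com/thumbs/product-demo.jpg",
  "uploadDate": "2016-05-25",
  "duration": "PT3M20S",
  "contentUrl": "https://www.example.com/videos/product-demo.mp4"
}
</script>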

XML sitemap: inform the search engine spiders where they can find your video with an XML video sitemap; a sample entry is sketched below.
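A single entry in a video XML sitemap, using Google's video sitemap namespace, might look roughly like this; the URLs and text are placeholders.

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:video="http://www.google.com/schemas/sitemap-video/1.1">
  <url>
    <loc>https://www.example.com/videos/product-demo</loc>
    <video:video>
      <video:title>Product demo</video:title>
      <video:description>A short keyword-rich summary of the video.</video:description>
      <video:thumbnail_loc>https://www.example.com/thumbs/product-demo.jpg</video:thumbnail_loc>
      <video:content_loc>https://www.example.com/videos/product-demo.mp4</video:content_loc>
    </video:video>
  </url>
</urlset>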





May 3, 2016

losing out on your quality score: the best landing page infographic



" 5 tips for a perfect landing page"


5 ways to ensure SEO benefits while deciding your landing page


For any SEO optimisation, the landing page is key. In organic search the landing page per se is not as crucial as it is in paid search, but irrespective of the nature of the search, a well-designed landing page helps spiders look at your content in a structured and granular way. This helps you increase your quality score in paid search and helps your ranking in organic SEO. The infographic above illustrates how a landing page ought to be designed and the key elements to incorporate.





"how to ensure SEO benefits while deciding your paid search landing page"

The 12 jewels to be incorporated in your landing page 





April 12, 2016

how to use blended search to rank well across local search



BLENDED SEARCH RESULTS: 5 ways it impacts your organic search results


The introduction of Google’s “Snack Pack” results was perhaps the biggest local search shakeup since Google Pigeon rolled out back in July 2014. The local pack originally showed about 7 listings; of late, Google has trimmed the 7-pack down to a 3-pack, and those three listings are now the most crucial real estate for local search rank.


Blended search results means that when Google ranks your site, it tries to use blended factors to find out how effective you are in terms of vertical search. Blended results such as local, shopping, images, real-time, videos and news are taken into account and have begun to play a crucial role in addition to the standard search ranking factors.
Here is a lowdown on blended search results that might help you rank better in local search and get your site indexed faster.



HOW BLENDED SEARCH HELPS TO RANK ACROSS LOCAL SEARCH


  1. The business has a listing in the local indexes, including Google Places, Yahoo Local and Bing local listings
  2. The actual business location has the same address as mentioned on the website, in a search-engine-query-friendly format
  3. The business has a local phone number for the searched location
  4. The website contains the local address and phone number for the query
  5. The website has more positive and superior ratings from users
  6. Social signals: how your business is shared across social media, and how many clicks your local business generates across Facebook, Twitter, Klout and LinkedIn, among others

"5 ways you can rank better in local search"


  1. Videos that describe the local institution, or something about the products and services, and that are tagged
  2. The local listings (Google Local) have to be claimed by the local owners and verified as their business










  1. Foursquare check-ins and mobile click-through rates
  2. Of late, behavioural and/or mobile signals are estimated to make up around 9.5% of the algorithm for localized organic results
  3. High numerical ratings on Google Local and ratings by Google users
  4. The listing title and description should contain at least 3 keywords in order to surface across blended search SERPs
  5. The business should be listed across third-party data providers and the Yellow Pages
  6. Geo-tagged images on photo uploading sites like Flickr, Dropbox and Panoramio, along with a caption that explains the geo-tagged image and the local business (a markup sketch follows below)
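One way, not covered in the post itself but consistent with the factors above, to make the name, address and phone number on your site unambiguous for the engines is schema.org LocalBusiness markup. A hypothetical sketch, with placeholder business details:

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Cafe",
  "telephone": "+1-212-555-0100",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Example Street",
    "addressLocality": "New York",
    "addressRegion": "NY",
    "postalCode": "10001"
  }
}
</script>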





April 4, 2016

planning a navigation for non-human visitors and search bots



SITE MAP TO BE FOLLOWED BY SEARCH ENGINE SPIDERS: 5 ways to ensure bots visit your website


The most crucial fact to understand is that a search engine does not see your site the same way a human visitor does. By creating robot-friendly navigation you encourage the search bots to visit your site more often, which in turn increases the visibility of your site. More frequent visits by search bots mean your content gets indexed faster and your cache stays fresh (when Google's spiders visit your site, the cache shows when they last came; if Google's cache shows a very recent visit, your content is being refreshed faster, which improves your visibility in the SERPs). To perform better in search engine listings, your most important content should be in HTML text format rather than locked inside images.
"an ideal link structure for robots"

Things not to do: avoid Flash files, Java applets and other non-text content, and place more emphasis on HTML pages, because search bots often ignore or devalue these formats.

see the entire post below









How do you get this done?
1) By providing search engine robots with links to navigate through your website
2) By pointing search engine robots to dynamic or hard-to-read pages that might not be accessible otherwise
3) By providing a clear link to the landing page
4) By ensuring your 404 error page has a link pointing to another page (preferably the home page)
5) By serving ready-to-use content when a page is not available or a URL is broken
(Points 1 and 4 are sketched below.)
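As a small sketch of points 1 and 4: plain HTML text links are the kind of navigation every bot can follow, and a 404 page should always hand both users and spiders a way back. The paths are placeholders.

<!-- plain, crawlable text navigation -->
<nav>
  <a href="/">Home</a>
  <a href="/products/">Products</a>
  <a href="/blog/">Blog</a>
</nav>

<!-- on the 404 page: keep bots and visitors moving -->
<p>Sorry, that page is no longer here. Return to the <a href="/">home page</a>.</p>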




March 29, 2016

dynamic serving vs responsive design : customising your design for mobile users


As we explained in the last post on responsive mobile design, there are 3 ways to optimize your website for mobile: 1) creating a responsive website, 2) dynamic serving and 3) creating a separate mobile website.

CHOOSING THE RIGHT MOBILE DESIGN APPROACH: WHEN TO USE DYNAMIC SERVING FOR YOUR MOBILE USERS
 
In this post we look at how to use dynamic serving for mobile users and under what circumstances this approach makes sense.

Dynamic serving is a server-side development approach that detects which type of device your visitors are using to view your website. It optimizes the website content for that specific device, and the server responds accordingly.

Like responsive design, dynamic serving uses a single set of URLs for all content regardless of how the content is viewed, whether on a desktop PC, a laptop or a mobile device. However, that is where the similarity ends: in dynamic serving the URL stays the same, but the content delivered to a mobile device is not always the same as on desktop.

This is because dynamic serving is a server-side approach that alters the content code (HTML, CSS and PHP) based on the device requesting it, before the content is delivered to the browser. This allows the server to alter the content of the page without altering the URL of the page.
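Since the post names HTML, CSS and PHP as the pieces the server alters, here is a very rough PHP sketch of the idea: detect the device from the User-Agent, send a Vary: User-Agent header so caches and crawlers know the response differs by device, and include a different template. The template file names and the user-agent pattern are assumptions, not a production-ready detector.

<?php
// minimal sketch of dynamic serving: same URL, different markup per device
header('Vary: User-Agent');  // signal that the response varies by device

$ua = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';
$isMobile = (bool) preg_match('/Mobile|Android|iPhone|iPad/i', $ua);

if ($isMobile) {
    include 'templates/page-mobile.php';   // lighter, touch-friendly markup
} else {
    include 'templates/page-desktop.php';  // full desktop markup
}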

 CIRCUMSTANCES WHEN YOU SHOULD USE DYNAMIC SERVING


1) When your website needs to include complex mobile-friendly functionality such as multi-page forms and interactive dashboards. Dynamic serving allows you to serve the best experience based on the user's circumstances.

2) When your website needs to serve two different device markets very differently. One example is when iPhone users take a different path to conversion than Android users. Dynamic serving is also used when you want your web pages to render differently for tablet users and smartphone users.

3) When your visitors largely use different keywords to reach your website via desktop search and mobile search. Dynamic serving allows you to alter the way the content is rendered on a page-by-page basis.

4) Dynamic serving is the best solution if you want to optimize specific pages for high-volume keyword phrases without changing the desktop version, and if you want your desktop visitors and mobile visitors to convert in different ways.

March 26, 2016

mobile first strategy :when to choose responsive web design


MOBILE FIRST STRATEGY :

How do you choose the content delivery platform for your mobile users? Traditionally marketers have had 3 ways of doing this: 1) responsive web design, 2) dynamic serving, 3) a separate mobile website.


Because a desktop monitor and a smartphone are different in size, designing for mobile implies that you have to do one of these 3 things: 1) build a responsive design website that dynamically adjusts content from desktop format to mobile format,


2) use dynamic serving to make the mobile experience device specific and control how the mobile content is delivered and viewed page by page, or 3) create a separate mobile website made specifically for your mobile users.

Planning your mobile content delivery platform: when to use responsive design, dynamic serving and mobile site

Option 1: responsive design. Responsive design is a web design technique that uses CSS and a series of coded rules to dynamically change and adjust the appearance of your desktop content so that it fits within the screen-size parameters of different mobile devices. Currently, responsive design is the mobile-friendly configuration recommended by both Google and Bing. Responsive design relies on client-side rules to alter the way pages appear in mobile or desktop browsers after the server has already delivered the page; a rough sketch of such a rule follows.
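A minimal sketch: a viewport tag plus one CSS media query that reflows a hypothetical product grid from four columns on desktop to a single column on small screens. The class names are placeholders.

<meta name="viewport" content="width=device-width, initial-scale=1">
<style>
  .product-grid .card { width: 25%; float: left; }    /* four columns on desktop */

  @media (max-width: 600px) {
    .product-grid .card { width: 100%; float: none; } /* single column on phones */
  }
</style>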

You can use responsive design for your mobile users if it fulfils the three criteria listed below. 1) When you want a coherent, integrated desktop and mobile experience.

"responsive design,dynamic serving or mobile website"

2) When you have developed a strong desktop website and have built up a reputation in terms of page strength, inbound links, unique content optimized for search engines, link equity, trust and industry authority, and you want to pass these factors on to the mobile version of the page. In these cases responsive mobile sites benefit from shared indexing with desktop sites. (For the third criterion, see below.)





3) When you want your mobile users to follow the same page navigation, data flow and conversion paths as they do in your desktop version, or when you have limited development resources that make maintaining a custom mobile experience with custom content out of the question. The responsive web concept is also the best option when you want to manage the maintenance and search engine indexing of just one version of the site, with the SEO benefit passing on to the mobile version of the site as well.

science of writing meta tag descriptions : 5 best practices for seo

"5 best practices in writing the perfect SEO meta tag description"


The significance of the meta description: while the perceived (and real) importance of meta data in search has depreciated, the attribute still plays a significant role in SEO. The meta description has 3 primary uses:

META DESCRIPTION TAG SEARCH ENGINES: 5 BEST PRACTICES 
Meta Descriptions Role in Search
1) To describe the content of the page accurately and succinctly
2) To serve as a short advertisement that entices searchers to click through to your page from the search engine
3) To display the targeted words, not for ranking purposes but to indicate the page's content to searchers

5 rules to write the best meta descriptions


1) Number of characters: descriptions should be succinct and compact.
2) Keep the length to about 160 characters for Google and up to 200 characters for Bing; in practice, aim for an average of around 165 characters, including spaces.



3) Test, refine, repair and rephrase: just like an ad, which undergoes a plethora of tests, make sure the description actually fits the content or theme of the page in question. Each web page should ideally have its own unique description.



4) Include relevant keywords: it is very important to have the right keywords in the meta description tag, because the boldfacing that search engines apply to matched terms can make a huge difference to the visibility and click-through rate of the listing.

5) Ensure tagged facts in the description: you can include other structured, tagged facts in the description apart from the standard information. News or blog posts can list the author, date of publication or byline information. What about a product page? How do you incorporate product information like price, age, manufacturer and features, which lies scattered throughout the page? A good meta description can bring all this data together; a description for an Ian Fleming title, for instance, could pull the format, price and availability into one line.
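The post's original example is not reproduced here, but a purely hypothetical tagged product description along those lines might read:

<!-- hypothetical product-page description pulling scattered facts into one line -->
<meta name="description"
      content="Casino Royale by Ian Fleming: paperback, $9.99, in stock and ships within 24 hours. Read reviews and a sample chapter.">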


6) Understand the user psychology of search disparity: take into account user differences in search behaviour. An organic search user will not see your website the same way as a user who comes from PPC search, and it is important to keep this basic fact in mind while writing the meta description. Users looking for information and users looking to shop online are clearly two different sets of consumers, and you need to write descriptions based on which kind of user is your target segment.


7) Employing universal descriptions: some search marketers take the view that you should not always write a description. Conventional logic holds that it is wiser to write a good meta description to maximise the chances of it being used in the SERPs, rather than letting the engines build one for you. However, this isn't always true.

If the page in question targets one, two or three heavily searched terms or keyword phrases, go with a meta description that targets users performing those searches. However, if you are targeting long-tail traffic with hundreds of articles or blog entries, it can sometimes be wiser to let the search engines extract the relevant text themselves. The reason is simple: when the search engines show a page in the SERPs, they display the keywords and surrounding phrases that the user searched for. If you try to force a meta description, you can end up creating one that is not appropriate for the search phrase your page gets matched to.

March 25, 2016

5 best practices in title tags construction for search engines

For keyword optimization, title tags are the most critical element for search engine relevance. The title tag sits in the <head> section of the HTML document and is one of the few pieces of meta information about a page that influence relevance and ranking.

The following rules represent best practices for title tag construction. One thing to keep in mind is that the title tag of any given page has to correspond to that page's content.


1) Incorporate keyword phrases: an obvious thing to do is to use in the title tag whatever your keyword research shows to be the most valuable keywords for capturing searches.

2) Place your keywords at the beginning of the title tag: this provides the most search engine benefit. If you do this and also wish to include your brand name in the title tag, place it at the end. There is a trade-off here between SEO benefit and branding benefit that you should weigh explicitly before deciding. Well-established, well-known brands might want their name at the start of the title tag, as it may increase click-through rates (CTR).



3) Limit your title to around 65 characters including spaces: for almost all search engines, content in the title tag beyond roughly 65-70 characters gets cut off in the SERPs.
4) Focus the title on click-through and conversion rate: the title tag is remarkably similar to the title you would write for a paid search ad. This is hard to measure and improve, because the stats are not easily available, but if the market you serve has relatively stable search volumes, you can do some testing and see if you can improve your CTR.

5) Target the searcher's intent: while writing titles and descriptions, keep in mind what users are doing to reach your site, whether via search engines or referrals. Study your top 10 landing pages and try to find out why users are coming to your site.




If the searcher's intent is research, you may need to tweak your title and make its structure more descriptive. However, if users are purchasing online and looking for discounts on your site, your title should clearly mention that these options are available, for example "digital cameras now at 40% less: axax (name of website): the top selling best digi cams online".
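Turning that example into an actual title tag, keeping the keyword phrase first, the brand at the end and the whole thing under roughly 65 characters (the site name is adapted from the post's own placeholder):

<title>Digital cameras now at 40% off | BestDigiCams</title>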

5 ways of optimizing domains for search engines

6 ways of optimizing domain names and urls 

1) Brainstorm 5-10 keywords: before you set up a website, it is important to create a few candidate URLs and brainstorm the keywords you wish to target through your website. Once you have this list, you can start to pair the keywords or add prefixes and suffixes to create good domain names. For example, if you are about to start a website or blog in the mortgage domain, you might start with keywords such as mortgage, finance, home loan and house payment.


2) Make the domain unique: creating a domain that is easily confused with a popular website someone else already owns is a recipe for disaster. One of the ways webmasters often try to leverage the popularity of existing key phrases or URLs is to register domain names that are simply plural, hyphenated or misspelt versions of already established domains.


However, this seldom helps: if the site in question is really big, its strength means a misspelt keyword will eventually lead users to the original site rather than yours, because its domain authority will always be higher than yours.





3) Make the URL easy to say and easy to type: any brand that is hard to read or write has already lost round one. Use a name that is easily recognizable for what it means to users, along with some imagery.

Make your URL easy to pronounce, share and pass around: word of mouth remains the fastest way of bridging the distance between two people.


4) Keep the domain short and sweet: domain names ideally should not exceed 10 words, unless you cater to a very niche industry. Avoid repetition of alphabets and numbers, and do not create a URL that looks more like a password!

5) Not all names sound familiar: consumers do not react to all names in the same way. Some names create positive vibes among users while some do not. This is not due to any inherent bias, as even users are unaware of it, but neuromarketers have some answers for why it happens: human beings, including consumers, love familiarity more than the unknown. The unknown has an element of fear, while the known has none.
This is one of the reasons why domain names like AutoTrader, Realty and WebMD sound familiar: the user can guess the theme of the site just by hearing the name, as opposed to zillow.com or monster.com.

6) Reject hyphens and numbers: both hyphens and numbers make it hard to convey the domain name verbally. Also avoid Roman numerals and mixed upper- and lower-case combinations, or anything case sensitive.