
Showing posts with label SEO. Show all posts

January 7, 2018

google issues seo guidelines for voice search

Consumers are increasingly turning to voice for search. While Amazon Alexa and Cortana-enabled products use results from Bing, Google Assistant and Apple Siri are powered by Google. Brands and publishers have realised that they need to be on top of SERPs for voice search as well. Following the best-practice guidelines Google published recently will be increasingly important for your content to surface in voice searches.

The Google Research blog recently published search quality rater guidelines (the instructions contractors use to evaluate Google's search results) specifically for the Google Assistant and voice search. They are similar to the web search quality guidelines, but differ in that there is no screen to look at when evaluating results; instead, raters evaluate the spoken responses from the Google Assistant.

It is important to note that comScore predicts 50% of all searches will be voice searches by 2020, and Gartner forecasts that 30% of web browsing will be done by voice the same year. Voice adoption is growing quickly, and discovery through voice SEO will become critical.

According to Search Engine Land, "The Google Assistant needs its own guidelines in place, as many of its interactions utilize what is called 'eyes-free technology,' when there is no screen as part of the experience." Google has designed machine learning algorithms to try to make voice responses and answers "grammatical, fluent and concise." Google says it asks raters to make sure answers are satisfactory across several dimensions. Here are some of the voice search and voice-assisted search guidelines issued by Google.

Information Satisfaction: the content of the answer should meet the information needs of the user. 

Length: when a displayed answer is too long, users can quickly scan it visually and locate the relevant information. For voice answers, that is not possible, so it is much more important to provide a helpful amount of information, not too much and not too little. Some of Google's previous work on identifying the most relevant fragments of answers is currently in use here.

Formulation: it is much easier to understand a badly formulated written answer than an ungrammatical spoken answer, so more care has to be taken to ensure grammatical correctness.

Elocution: spoken answers must have proper pronunciation and prosody. Improvements in text-to-speech generation, such as WaveNet and Tacotron 2, are quickly reducing the gap with human performance.

November 28, 2017

how niche websites can benefit from google's rankbrain

The rise of AI and deep learning in Google search: the image above shows how RankBrain uses vectors to understand countries and their capitals through AI and deep learning.

Bloomberg News recently broke the story on a new artificial intelligence program from Google called RankBrain: "For the past few months, a 'very large fraction' of the millions of queries a second that people type into the company's search engine has been interpreted by an artificial intelligence system, nicknamed RankBrain."

Put simply, RankBrain is an artificial intelligence (AI) program used to help process Google search queries. It uses AI to embed vast amounts of written language into mathematical entities, called vectors, that the computer can understand. If RankBrain sees a word or phrase it isn't familiar with, it can guess which words or phrases might have a similar meaning and filter the results accordingly, making it more effective at handling never-before-seen search queries. At present, RankBrain is mainly used to interpret searches and find pages that might not contain the exact words that were searched for, and to handle searches that mean different things in different geographies.

To cite an example, for the query "how many spoons of sugar will fill a cup", RankBrain will show different search results based on things like the country the search was made from. This is because each country has different standards of measurement: a normal cup in Australia is a different size from one in Austria.

It must be understood that RankBrain is not a new algorithm. RankBrain is only one part of Google's overall search "algorithm", alongside Hummingbird, Penguin and Panda, which is used to sort through the billions of pages Google knows about and find the ones deemed most relevant for particular queries.

In particular, we know RankBrain is part of the overall Hummingbird algorithm because the Bloomberg article makes clear that RankBrain doesn't handle all searches, as only the overall algorithm would. Hummingbird also contains other parts with names familiar to those in the SEO space: Panda, Penguin and Payday, designed to fight spam, and Pigeon, designed to improve local results.

The basic idea behind using artificial intelligence and assigning mathematical vectors to search queries is that present computers and algorithms do not understand natural human language well enough, which forces people to do a lot of the heavy lifting when they search for something unique, use keywords Google has never seen, or type a phrase it cannot understand; for example, speaking "searchese" to find information online, or slogging through lengthy forms to book a trip. Ideally, computers should understand natural language better, so people can interact with them more easily and get any kind of semantic information without having to sweat it out.

While state-of-the-art technology is still miles away from this goal, Google's RankBrain has started making significant progress using the latest machine learning and natural language processing techniques. Deep learning has markedly improved speech recognition and image classification: for example, studies have shown that computers can learn to recognize cats (and many other objects) just by observing a large number of images, without being told explicitly what a cat looks like. Google now applies neural networks to understanding words by having them "read" vast quantities of text on the web. Google says it is scaling this approach to datasets thousands of times larger than was previously possible, and has seen a dramatic improvement in performance.

To promote research on how machine learning can apply to natural language problems, Google has published an open source toolkit called word2vec that aims to learn the meaning behind words. Word2vec uses distributed representations of text to capture similarities among concepts. 

For example, it understands that Paris and France are related in the same way Berlin and Germany are (capital and country), and not in the way Madrid and Italy are. The chart referenced above shows how well it can learn the concept of capital cities just by reading lots of news articles, with no human intervention.
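The capital-and-country analogy above can be reproduced with plain vector arithmetic. The 2-D vectors below are hand-made toys for illustration (real word2vec embeddings are learned from text and have hundreds of dimensions), but the operation, nearest neighbour of b - a + c, is the standard word-analogy trick:

```python
from math import sqrt

# Hand-picked 2-D "word vectors", invented for illustration only.
vectors = {
    "paris":   (1.0, 5.0),
    "france":  (1.0, 1.0),
    "berlin":  (4.0, 5.1),
    "germany": (4.0, 1.0),
    "madrid":  (7.0, 5.0),
    "spain":   (7.0, 1.0),
}

def cosine(u, v):
    """Cosine similarity between two vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    return dot / (sqrt(sum(a * a for a in u)) * sqrt(sum(b * b for b in v)))

def analogy(a, b, c):
    """Solve a : b :: c : ? via the vector b - a + c, then return the
    nearest remaining word by cosine similarity."""
    target = tuple(vb - va + vc for va, vb, vc in
                   zip(vectors[a], vectors[b], vectors[c]))
    candidates = (w for w in vectors if w not in (a, b, c))
    return max(candidates, key=lambda w: cosine(vectors[w], target))

print(analogy("france", "paris", "germany"))  # nearest word: berlin
```

With these toy vectors, "france is to paris as germany is to ?" lands on "berlin", because the capital-minus-country offset is roughly the same for both pairs.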

So how does RankBrain affect your SEO? For starters, it's important to step back and understand why RankBrain was deployed: the sheer scale of search volumes today.
Google processes three billion searches per day. In 2007, Google said that 20 to 25% of those queries had never been seen before. By 2013 that number had come down to 15 percent, and as of 2015 it still hovered around the same level. But 15% of three billion is still a huge number of queries never entered by any human searcher: 450 million per day. Among those keywords, some are complex, multi-word queries, also called "long-tail" queries. RankBrain is designed to better interpret those queries and effectively translate them, behind the scenes, to find the most relevant pages for the searcher. So if you run a very niche website targeting rare long-tail keywords, you might see an increase in traffic as RankBrain serves those users by treating their queries as vectors, integrating state-of-the-art artificial intelligence and deep learning.
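The 450 million figure is just the two numbers above multiplied together:

```python
daily_searches = 3_000_000_000   # queries Google processes per day
never_seen_share = 0.15          # share never searched before (2013-2015)

never_seen_daily = int(daily_searches * never_seen_share)
print(f"{never_seen_daily:,} never-before-seen queries per day")
```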

November 23, 2017

new to seo? free tools to diagnose your website


 How to self-diagnose a website if you're new to SEO:

Here are some of the top online tools for self-diagnosing a website, especially if you're new to search engine optimization and online marketing.

1) Storegrader (https://ecommerce.shopify/grader): If you run and maintain an e-commerce website, Shopify's store grader is a good place to start with your marketing efforts, site usability and performance, including content management and SEO.
2) Alexa: for a quick snapshot of your website's performance, type in your website URL. The basic free version provides an estimated ranking for your country and the world based on a variety of factors. While Alexa has its limitations, it is a starting point for understanding the performance of your website. It's also quick, and you can delve into deeper insight with the paid version, which has more accurate data.

3) Moz Rank Tracker: Moz has been a leader in SEO for many years, and its tools provide an altogether different insight into your website. Moz ranks your website from zero (lowest) to 9.99 (highest), based primarily on your link juice, i.e. the number of backlinks to your website, measured both qualitatively and quantitatively. Like the Richter scale for earthquakes, the rankings are logarithmic.

4) HubSpot Marketing Grader: HubSpot's transition from a website grader to a marketing grader is indicative of the increasing demand for tools that measure more than just a website. HubSpot's grader runs a deep data dive, provides actionable intelligence and helps you identify gaps in performance. Marketing Grader includes your Alexa and Moz ranks and much more. The tool is comprehensive and robust, providing a thorough diagnostic examination of your website.

5) Nibbler: Nibbler has a website performance grader that provides a score on a ten-point scale covering accessibility, experience, marketing and technology, with sub-categories broken down into individual scores too. Nibbler also looks at social media connections and grades your mobile website. Like HubSpot's Marketing Grader, it provides rich insights in a customized and interesting dashboard.

6) Woorank: Woorank grades your social media, SEO, conversions and your mobile website. Delivering quick results, Woorank outlines critical areas that need immediate attention as well as areas where you are performing well. The action points are listed under each graded initiative in an easy-to-read, actionable list.
7) QuickSprout: QuickSprout's website analyzer grades your SEO with a letter grade while measuring and displaying your mobile site. The SEO breakdown is quite detailed, with easy-to-read tables clearly displaying your results. The analyzer breaks your factors into high, medium and low priority so you know where to focus.
8) W3C validator: the World Wide Web Consortium has developed its own validator, which is geared more towards internet technology and coding than marketing. Although it does not have fancy dashboards and easy-to-read action points, it does flag potential and existing problems in a website.

May 17, 2017

30+ seo tools which can help you rank on the first page



Getting good rankings and organic traffic for your website would be impossible without proper SEO tools and techniques. There is a plethora of free and paid SEO tools available online that can be highly beneficial for your website rankings. Here is a compilation of SEO tools that will help you analyse your website and study and map your competitors.

However, more often than not you need more than one SEO tool to satisfy your need for increasing website traffic. Let's have a closer look at the comprehensive list of top SEO tools.

Top SEO tools: complete list of free and paid SEO tools and services

1. Anchor Text Over-Optimization Tool: identifies anchor text diversity and highlights areas at risk of anchor text over-optimization.

2. KeywordSpy: helps you research profitable keyword lists.

3. Screaming Frog: a small spider tool that crawls websites' links, images, CSS, scripts and apps from an SEO perspective.

4. HubSpot: a single hub to collect, translate and route your customer data.

5. SpyFu: helps you download your competitors' profitable keywords.

6. SimilarWeb: provides competitors' traffic figures, organic keywords and engagement metrics, giving traffic insights for any website or app.

7. Seer Toolbox: a tool suite for analytics and link research to gain traffic.

8. SEOgadget Links API: unlocks data from tool providers such as Majestic SEO, Moz and Grepwords within Excel.

9. Majestic SEO: helps with link research, competitive intelligence and link building.

10. Majestic SEO API: an API (application program interface) that lets your own software interact with Majestic's link research data.

11. XML Sitemap Inspector: validates your XML sitemap, makes it error-free and pings all major search engines.

12. Mozscape API: puts a powerful web crawler and an index of over 162 billion URLs in the palm of your hand.

13. Title and Description Optimization Tool: optimizes title tags and meta description tags.

14. Open Site Explorer: helps with link research (defining, gathering and analyzing data) and link building.

15. Bing Webmaster Tools: uncovers keyword data and drives traffic to your site.

16. BuzzStream Tools Suite: helps you find link opportunities, conduct link research and automate link-building tasks.

17. GTmetrix: analyzes the performance and speed of your website.

18. Meta SEO Inspector: a Google Chrome extension used to inspect metadata on any site with one click.

19. Google Webmaster Tools: tracks your site's search performance.

20. MergeWords: combines sets of words easily and automatically.

21. WordPress Theme and Plugins Detector: detects the theme and plugins used on a WordPress site and displays information about them.

22. Lynx: a plain-text browser that shows you how a crawler or search engine sees a page.

23. SEO Extension: provides SEO metrics for a specific page, along with other useful tools such as an SEO audit.

24. WebRank SEO: browser extensions for Google Chrome and Mozilla Firefox that show website stats directly in your browser.

25. NerdyData: a search engine for source code.

26. Internet Marketing Ninjas SEO Tools: tools to find broken links and redirects, plus a site crawl tool.

27. SEO Toolbar: pulls in many useful marketing data points to give a holistic view of the competitive landscape directly in the search results.

28. SEO Tools for Excel: analytics, social, on-page and off-page SEO tools.

29. SEOgadget Tools: a content strategy generator, a Highcharts generator and infographic embed tools.

30. SEOquake: obtains information about any website across a wide range of parameters, such as PageRank, Google index and Alexa rank; available for Chrome and Opera Mini.

31. Virante SEO Tools: content checking and optimization tools.

32. WordStream Free Keyword Tools: an SEO keyword tool for ongoing, organized keyword research, with keyword discovery, keyword grouping, long-tail keyword and SEO content creation tools.

33. SEMrush: helps analyse your domain, backlinks and search rankings, apart from helping you run competitor research.

March 8, 2017

5 steps of optimizing for product search results on google and bing

Optimizing for product search across Google and Bing: for shopping sites that own products, a top ranking in Google product search is essential to visibility and therefore revenue. This is because the top 3 Google product search results more often than not make it into the main Google search listings. Here are 5 steps for optimizing your products for Google product searches.

• The first step in optimizing for product searches is putting together feeds for your products and submitting them to the Google Merchant Center. Here you can upload products in bulk and learn the specifics of formatting the feed. To be included in Google's product center you need to upload real, physical, tangible products.

• In your feed, populate as many fields as possible with data: brand, category, color, price, condition, warranties and more. These additional fields help Google product search match your products to potential customers' searches.

• Update your feed as often as possible. Some major e-retail sites update their feeds on a daily basis. Your feed needs to be as authentic and accurate as possible, and you should be able to edit it multiple times a day if required.

• Always include the product photo in listings. If you don't have a photo, the product won't be seen. Here are some tips for product images: Google converts images into 90 x 90 pixel display thumbnails, so it makes sense to use a square picture to take full advantage of the space. Higher-contrast pictures make for a better view and are easier to read as thumbnails.

• Change your feed whenever a product is added, existing products change or new products are announced. If a product is out of stock, quickly remove it from your online inventory and replace it with "limited stock" or "currently not available."

• Use descriptions and titles the normal way you would for any search keyword you are targeting. Use proper meta tags and make sure the description matches the product category. Use long-tail keywords for your products, as there are a hundred ways your consumers might phrase a search for a particular product. One smart approach is going through Amazon's "users who bought this also bought" feature, which gives you an idea of the product category and the vertical users are interested in searching.

• Try to use phrases and match keywords based on higher search volumes and a high chance of conversion. Keep mapping your products against your competitors' and see which phrases are associated with a particular product. You can use the Google inventory keyword tool for this.

• For larger eCommerce sites listing thousands of products, where hand-researching each one is not possible, pick out the unique attributes of each product that users may type while searching.
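As a rough sketch of the bulk-upload step above, here is how a tab-separated feed might be assembled with Python's csv module. The field names and sample products are illustrative only; check the current Merchant Center documentation for the exact attribute names it requires:

```python
import csv
import io

# Hypothetical product catalogue rows, standing in for a database query.
products = [
    {"id": "SKU-001", "title": "Trail Running Shoe", "brand": "Acme",
     "condition": "new", "color": "blue", "price": "79.99 USD",
     "image_link": "https://example.com/img/sku-001.jpg"},
    {"id": "SKU-002", "title": "Insulated Water Bottle", "brand": "Acme",
     "condition": "new", "color": "steel", "price": "24.99 USD",
     "image_link": "https://example.com/img/sku-002.jpg"},
]

FIELDS = ["id", "title", "brand", "condition", "color", "price", "image_link"]

def build_feed(items):
    """Render a tab-separated feed: one header row, then one row per product."""
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=FIELDS, delimiter="\t")
    writer.writeheader()
    for item in items:
        writer.writerow(item)
    return out.getvalue()

print(build_feed(products))
```

Generating the feed from the same database that powers the site is what makes the frequent (even daily) updates recommended above practical.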

Seller and product ratings are important and play a very big role in Google product rankings. Manage your seller ratings across contributor sources like Shopzilla, Dealtime, NexTag, PriceGrabber and ResellerRatings. You can also get your products rated on Epinions.

Some other factors that impact ranking across product searches are: a) the perceived authority of the domain; b) the website's rankings for keywords in web search; c) whether products are deemed adult, as some of your products might get filtered out by SafeSearch; d) users specifying Google Checkout items; and e) the number of users who have added your Google product items to their individual shopping lists within product search or placed them on a shared wishlist. The Google Merchant Center does accept other types of items too, like flights, hotels, car rentals, travel packages and real estate, but these items will not appear in product searches.

March 1, 2017

3 structural decisions before targeting keywords post google algo changes

Keyword research is perhaps the most crucial element of search engine optimization. Selecting and targeting keywords based on what users type is the holy grail of getting traction and traffic to your website. How do your keywords match up with your website's hierarchy? Ultimately, the logical structure of your pages should match the way users think about products and services like yours. It's important to define your target audiences based on your industry: are you in the B2B or B2C space, are you in the business of providing leads or of generating good, unique content, and what are your site's core strengths?

Do you believe in attacking your competitors openly, or are your products more niche, appealing to a certain segment of the product cycle? The one thing you should avoid is being everything to everyone. You need a very specific target in mind before you start optimizing for keywords. Here are a few things to keep in mind for keyword targeting after Google's 2017 algorithm changes.

1) Cross-linking relevance: linking between articles that are relevant and complementary can be very powerful, as it adds a lot of value to your content. People looking for content will always find it useful to have more choices of relevant content on similar topics. Amazon's "Frequently bought together" and "Users who searched for this also searched for" features are brilliant ways to group products into categories and items.

Amazon has arguably established the gold standard for the SEO industry, showing how cross-linking relevant content can increase both search rankings and web traffic. In Amazon's system all of this is rendered automatically and dynamically, so it requires little day-to-day effort on Amazon's part. The "customers who bought" data is part of Amazon's internal database, and the "tags customers associate" data is provided directly by users themselves.
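The "customers who bought this also bought" grouping described above can be approximated with simple pair co-occurrence counting over order history. The orders below are made up for illustration, and real systems use far more sophisticated recommenders, but the core idea is just this:

```python
from collections import Counter
from itertools import combinations

# Hypothetical order history: each order is the set of product ids bought together.
orders = [
    {"camera", "sd_card", "tripod"},
    {"camera", "sd_card"},
    {"camera", "camera_bag"},
    {"tripod", "camera_bag"},
]

def pair_counts(order_history):
    """Count how often each pair of products appears in the same order."""
    counts = Counter()
    for order in order_history:
        for a, b in combinations(sorted(order), 2):
            counts[(a, b)] += 1
    return counts

def top_companions(product, order_history, n=3):
    """Products most frequently bought together with `product`."""
    companions = Counter()
    for (a, b), c in pair_counts(order_history).items():
        if a == product:
            companions[b] += c
        elif b == product:
            companions[a] += c
    return [p for p, _ in companions.most_common(n)]

print(top_companions("camera", orders))
```

Because the counts are derived automatically from the order database, the cross-links regenerate themselves as purchasing behaviour changes, which is exactly why the approach costs Amazon so little ongoing effort.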

2) Using the correct anchor text: anchor text is one of the most underrated elements in the entire on-page optimization chain. All your content should use keyword-rich anchor text in internal links. Avoid words like "More" and "Click here to know more", as they devalue the content of your links. Make sure the technical and creative teams understand this, and take time to prepare an anchor text strategy for your website beforehand.

3) Using breadcrumb navigation: breadcrumb navigation shows users which part of the website they are in and, at every point, how to reach different areas of the site. Ensure that the user is never more than 2 to 3 clicks away from the homepage. A best practice is to make the breadcrumb and menu navigation keyword-rich, as that helps both search engines and users.

4) Minimizing link depth: search engines and users look to site architecture for clues about which pages on the website are most important. A key factor is the number of links it takes to reach a page. The golden rule of linking is that any webpage more than 4 clicks away from the homepage is not that influential. The deeper the links, the more time search engine spiders take to index the website. As each crawl has a fixed bandwidth, spiders aren't going to spend extra time searching for content hidden 4 to 5 clicks deep from the main page. In fact, a search engine might ignore such a page entirely, as the spiders may never find it, no matter how good the content is.


The standard SEO advice is to keep the site architecture as flat as possible to minimize clicks from the homepage to important content. Don't go off the deep end either, with never-ending links from the homepage: too many links on a page are not good for search engines. A standard recommendation is not to exceed 100 links from a web page, as the PageRank juice is divided among all the links originating from that page. The bottom line is to plan the website structure to be as flat as possible without compromising the user experience.
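Click depth is easy to measure on your own site if you have a crawl of its internal links. This sketch uses a toy link graph (the URLs are placeholders) and a breadth-first search, the standard way to compute the minimum number of clicks from the homepage to each page:

```python
from collections import deque

# Toy internal-link graph: page -> pages it links to.
site_links = {
    "/": ["/products", "/blog", "/about"],
    "/products": ["/products/shoes"],
    "/products/shoes": ["/products/shoes/trail-runner"],
    "/blog": ["/blog/seo-tips"],
    "/about": [],
    "/blog/seo-tips": [],
    "/products/shoes/trail-runner": [],
}

def click_depth(links, home="/"):
    """Breadth-first search from the homepage: minimum clicks to each page."""
    depth = {home: 0}
    queue = deque([home])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depth:
                depth[target] = depth[page] + 1
                queue.append(target)
    return depth

depths = click_depth(site_links)
print(depths)
print("pages deeper than 3 clicks:", [p for p, d in depths.items() if d > 3])
```

Running this over a real crawl quickly surfaces pages that violate the 4-click rule, and pages missing from the result entirely are orphans no spider can reach by following links.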

December 16, 2016

5 seo best practices for blogging the perfect post



The power of a blog post: a blog post can help market your website by creating content that is spiderable and indexable by Google and other search engines. This infographic shows how to create the perfect blog post, covering the top 5 SEO best practices for blogging.

December 15, 2016

how to get the most juice out of your site maps

The making of the sitemap

Traditional site maps are static HTML files that outline the first- and second-level structure of a website. The original purpose of the site map was to make it easy to find items on the website. Over time, sitemaps also became a useful tool to help search engines find content and index all the parts of the site you wanted indexed. Today it's recommended that every webmaster have an XML sitemap, which provides an easy-to-read dump of links for spiders to index. At best a site map is a table of contents; at worst it's just an index for your site. A good site map must fulfill the following 5 criteria:

1) show a quick, easy-to-follow overview of your website
2) provide a pathway for search engine spiders to follow
3) provide a text link to every page on your website
4) quickly show visitors what information they will find on which pages
5) utilize keyword phrases to help rank well across search engines

Here are some of the best practices to get more juice out of your site maps

1) Your sitemap should be linked from the homepage. Linking it this way gives the search engine an easy way to find the sitemap and then follow it all the way through the site. If it's linked only from other pages, the spider might hit a dead end along the way and quit following your website.
2) Small sites can place every page on their site map; bigger sites should not. You don't want search engines to see a never-ending list of links and assume it's a link farm. Use nested sitemaps if you have many pages to cover: a nested sitemap contains only your top-level pages and includes links to more specific sitemaps.
3) Some SEO experts believe you should have no more than 25 to 40 links on your sitemap. This also keeps your sitemap readable for human visitors.
4) The anchor text (the words that are clicked on) of each link in your site map should ideally contain a keyword where possible. Also make sure each anchor text link points to the appropriate page.
5) After creating a sitemap, go back and check that all your links are correct. A broken link on your site map is the last thing you need and makes for a terrible user experience. All the pages listed on your site map should link back to the sitemap.

6) If you have very extensive content with a huge number of pages, try to create a sitemap for each silo. The master sitemap would not contain every page of the website, but would lead search engines and users to the appropriate sub-sitemap, just like the rest of your site.

An XML site map must also:
begin with an opening <urlset> tag (which encapsulates the file and references the current protocol standard) and end with a closing </urlset> tag
include a parent <url> tag for each entry (the remaining tags are children of this tag)
include a <loc> child entry for each <url> parent tag; the URL must begin with a protocol such as http:// and end with a trailing slash if your web server requires it, and this value must be less than 2,048 characters
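The requirements above can be sketched with Python's standard library. The URLs and the helper name are placeholders; the <urlset>/<url>/<loc> structure and the 2,048-character limit follow the sitemap protocol described above:

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(urls):
    """Wrap each URL in a <url><loc>...</loc></url> entry inside <urlset>."""
    urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
    for u in urls:
        if len(u) >= 2048:
            raise ValueError("sitemap <loc> values must be under 2048 characters")
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = u
    # Serialize with an XML declaration, as sitemap files are expected to have one.
    return ET.tostring(urlset, encoding="unicode", xml_declaration=True)

print(build_sitemap(["https://example.com/", "https://example.com/blog/"]))
```

Because ElementTree closes every tag it opens, generating the file this way (rather than concatenating strings) guarantees the well-formed opening and closing <urlset> structure the protocol demands.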

December 1, 2016

how to create local landing pages: top 3 tips for ranking well on search engines


3 tips for creating local landing pages that are optimized for search engines: local search is big business. In the US alone there are 3.2 million print yellow pages advertisers generating just under $15 billion in annual revenues. According to recent search estimates, 1 in 3 searches is about a place, and BIA/Kelsey forecasts overall U.S. local advertising revenues to reach $148.8B in 2017. There are right ways and wrong ways to create local landing pages that rank well in search engines, and not all of them are technically the right way to rank on Google.

One wrong method is to create content by stuffing a bunch of city names and zip codes into a text block and hoping to rank for those terms. A second wrong method is to generate hundreds of copies of a web page and use find-and-replace to substitute a different city name on each landing page. That approach creates thin, mostly duplicated content that is unlikely to bring serious, targeted traffic; worse, the lack of real content might attract a penalty from Google's Panda algorithm and significantly damage your site's ranking.

This kind of find-and-replace approach does not convince search engines that you are seriously trying to rank for these keywords. So what is the right way to create local landing pages that bring quality traffic? Here are some best practices for creating landing pages that rank well in location searches.

In the title of the landing page, as well as in the body text and keywords, mention the locality of the region you are targeting. Make sure each page has the name of the city and the service you are offering to consumers, and use H1 and H2 tags in the body text.

Include conversational modifiers: when users search for a location they often use words like "nearby", "in between" and "neighborhood". These words signal interest in location-specific search queries, which are now becoming routine thanks to voice-enabled search on smartphones. Sprinkling conversational modifiers near your keywords might make sense; for example, users may search for "car repair centers between Palo Alto and Sacramento". In your landing page, talk about things related to the neighborhood. Instead of using merely the name of the location, use the names of famous landmarks near the place, a well-known coffee shop, a nearby Starbucks or a local KFC, so users can locate the place easily if they don't know the exact address. And if you're establishing a business in Los Angeles, you can add "Hollywood" or "Southern California" to the page text to make it unique.
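A quick way to draft conversational long-tail candidates like the example above is to enumerate service, modifier and place combinations. The inputs below are placeholders for your own business; treat the output as a brainstorming list to prune, not as keywords to stuff:

```python
from itertools import product

# Hypothetical inputs: substitute your own services, modifiers and localities.
services = ["car repair", "oil change"]
modifiers = ["near", "in", "between"]
places = ["Palo Alto", "Menlo Park"]

def local_phrases(services, modifiers, places):
    """Generate long-tail, conversational local keyword candidates."""
    return [f"{s} {m} {p}" for s, m, p in product(services, modifiers, places)]

for phrase in local_phrases(services, modifiers, places):
    print(phrase)
```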

Think creatively about creating content that differs from other local landing pages. For example, you can publish local customer testimonials, with videos or images of what local customers have to say about your services. Talk about local events your business has tied up with, co-produced, co-branded or sponsored, or include images of past events you participated in. Publish a one-page city guide giving a snapshot of the place: its history, climate, key landmarks, events, culture, local guides and other places of interest users might like to visit.

October 20, 2016

a dynamic website using a content management system: must-have seo features

A dynamic website is a website built using a template and a CMS. A content management system gives you control over how you create web pages, pulling in information from a database. That means the web pages do not exist until someone builds them. For example, suppose you run an online shopping website selling more than 10,000 products. You are not going to build 10,000 web pages manually, one for each product; instead you use a content management system to build the pages dynamically, on the fly.
The CMS generates the page a search spider crawls by taking the information in your database and plugging it into a template web page, so the CMS creates the tags, content and code that is ultimately seen by the search engines. One important thing to remember is that it is essential to have a search-marketing-friendly CMS when creating a dynamic website. Any CMS that gives you full access to individual pages at a granular level, letting you create tags, meta tags, keywords and titles, is considered search friendly. These features also give you control over the entire content of the web pages: you should be able to change title tags and H1 tags and edit keywords and descriptions.

In short, any CMS that does not allow you to control SEO at the individual page level is not seo friendly. You can use WordPress, which is open source, or another SEO-friendly CMS such as Pixelsilk. For search, keep these things in mind when deciding on a CMS: 1) it should let you customise HTML templates; 2) it should allow you to produce unique title tags; 3) you should be able to add meta descriptions and keywords on your own; 4) you should be able to produce and change H1, H2 and H3 tags at will; 5) it should let you categorise and create content by groups.
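The idea of a CMS building pages "on the fly" from a database can be sketched in a few lines of Python. This is only an illustration of the mechanism, not any particular CMS's code; the template, field names and product data below are all invented:

```python
from string import Template

# A minimal, hypothetical page template. Note the slots for a unique
# <title>, meta description and <h1>, the features the checklist above
# says a search-friendly CMS must let you control per page.
PAGE = Template(
    "<html><head>"
    "<title>$title</title>"
    '<meta name="description" content="$description">'
    "</head><body><h1>$heading</h1><p>$body</p></body></html>"
)

def render_product_page(product: dict) -> str:
    """Build one product page on the fly, the way a CMS would."""
    return PAGE.substitute(
        title=f"{product['name']} | Example Store",  # unique title tag
        description=product["summary"][:155],        # unique meta description
        heading=product["name"],                     # controllable H1
        body=product["summary"],
    )

# One database row becomes one crawlable page, with no manual authoring.
page = render_product_page(
    {"name": "Blue Widget", "summary": "A durable widget for everyday use."}
)
```

With 10,000 rows in the database, the same template yields 10,000 distinct crawlable pages, each with its own title and description, which is exactly the granular control the checklist asks for.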

October 14, 2016

google confirms sites hit by google penguin 4 years ago are recovering

Webmasters who had been hit by Google Penguin can now rejoice. A recent Moz post shows that sites hit by Google Penguin are seeing increased traffic from Google after a long two-year wait, following the latest Penguin update that rolled out in late September and early October 2016. This roll-out was unusual in many ways, and only now does it seem to be settling down. In the past couple of weeks, we've seen many reports of recoveries from previous Penguin demotions. On October 7th, Gary Illyes from Google said that the Penguin roll-out was in the "final stage" (presumably, the removal of demotions) and would take a "few more days". As of this writing, it's been five more days.
Google representatives are confirming details about the new Penguin both publicly and privately, and algorithm flux matches the general timeline. Perhaps more importantly, we're seeing many anecdotal reports of Penguin recoveries.

Sites that track websites hit by Google Penguin have also reported that pages demoted by the Penguin update are slowly showing signs of recovery.

August 4, 2016

why using flash content is seo unfriendly: 4 reasons

Although Google has indexed Flash content and followed links within it since 2008, the fact remains that Google, or for that matter any search engine, cannot reliably read what is written inside Flash files. We take a look at 4 reasons why using Flash content on your website is considered not fully seo friendly. The most crucial problems with Flash are the missing SEO features: anchor texts, H1-H2 tags, bold and strong tags, alt image attributes and even page titles are not simple to implement in Flash. Developing Flash with seo factors in mind is more difficult than doing it in HTML.

1) Different content is not on different urls: this is the same problem you have with Ajax-based pages. You may have unique frames, or movies within movies, that appear to be completely distinct sections of the Flash site, but there is no way to link to these individual elements.


2) The text breakdown is not clean: Google can index the output files in the .swf and see words and phrases, but in Flash much of the text is not wrapped in clean markup such as <h1> or <p> tags. It is jumbled into half phrases for graphical effects, and the output is often in the wrong order. Worse still are text effects that require breaking words apart into individual letters to animate them.

3) Flash gets embedded: a lot of Flash content is only linked to from other Flash content, wrapped inside shell Flash pages. This chain of links, where no other internal or external urls reference the interior content, leads to documents with very low PageRank, as link juice fails to get transferred. Even if such a page manages to stay in the main index, it probably won't rank for anything.

"why Flash should be avoided "

4) Flash does not earn external links like HTML: an all-Flash site may get a large number of links to the homepage, but the interior pages always suffer. When webmasters link to embeddable Flash, they normally point to the HTML host page rather than to any interior pages within the Flash.

3 reasons why creating microsites is a cardinal sin in seo

Search algorithms favour large, authoritative websites with a depth of information. On any given day, the bigger site, the url with the most and deepest content, has the highest probability of appearing in search results, as opposed to the small, thinly spread-out content of a microsite. All search engines give more weight to big websites with deep and broad pages full of content. This negates the point of creating new content as a microsite, as it is less helpful than adding the same content to your existing site.
" microsite vs one website : whats the best way forward in seo"

No matter how well you create your content, you can never replicate the content that is on your main website. As your website is already seen and indexed by the search engines as a site with a huge reservoir of quality content, trying to create another site with similar or only a little unique content will not get you much traction. A multiple-site strategy also splits link benefits: a single quality link pointing to a page on the main domain positively influences the entire domain and all the pages associated with it. Because of this, it makes more sense to have any good new link point to the same domain, to help boost the rank and value of the pages on it. A new site will also take time for search engines to spider and rank in search results, which impacts the visibility of a brand on the web.

Additionally, it will take some time for a new site to appear in Google results. A clear-cut content strategy is a must when creating a new microsite, the reasons not to develop one notwithstanding. Developing content for such a site is often quite difficult, and having content or keyword-targeted pages on other domains that don't benefit from your main site's content or backlinks is an exercise in futility.

4 instances when you should not be using a .com as a top level domain

When not to use .com as your top-level domain: most web urls have .com as the TLD, but in certain circumstances it is better to avoid it. We take a look at a few special cases where a TLD other than .com makes sense.

1) When you already own the brand .com and wish to redirect it to a .tv, .org or .biz, possibly for marketing, branding or geographic reasons, or when venturing into corporate social activity such as a CSR initiative, a socially relevant service, or addressing the bottom of the pyramid, either through a corporate initiative or a tie-up with an NGO.
2) You can use .gov, .mil or .edu domains if you belong to the corresponding category (the appropriate organizations and associations).

3) When you are serving a specific geography or market and are completely focussed on the local market, with no intention of venturing into other markets for a long time to come. In such cases country extensions (.fr, .de) are a great way to get good search rankings.
4) When your organization is a non-profit engaged in a social cause, like an NGO, a charity, helping war veterans or collecting donations. Under these circumstances you can use a .org extension.

Around 92% of the world wide web uses .com as the TLD. Though the top-level domain hardly matters in terms of search, the url is an important part of online branding as well as social media.

July 14, 2016

deciding the optimum url length for search engine benefits

Selecting the most appropriate url formats for search engines: 

How does a search engine decide the optimum url length that is considered search friendly?

While it's true that search engines give a fair amount of importance to the url, they are also quite adept at interpreting long urls with numerous hyphens, and at detecting the extent to which webmasters use them for spamming; for example, some seo webmasters get overexcited and stuff keywords into the url itself. This only signals the search robots to discount the entire post/url as spam.


" the art of creating search engine friendly url's for optimization"
So how do you show the search engines about the content and its url parameters by ensuring you do this just the right way ?
1) Describe your  content :  if a user can  make an accurate guess about your content  by looking at the address bar, you have done your job
2) Use static URLs: dynamic url's are harder to index for both search and users
3) Descriptors not numbers:  never use  a random 1234 numbers or a set of numbers..when you can use brand/name etc
4)use lowercase: Although urls can accept both upper and lower always desist from using Uppercase
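The four rules above can be condensed into a small slug-building helper. This is just a sketch, not any particular CMS's implementation; the function name and the example title are invented for illustration:

```python
import re

def seo_slug(title: str) -> str:
    """Turn a page title into a descriptive, static-looking url segment:
    lowercase only (rule 4), with hyphen-separated descriptive words
    rather than numeric ids (rules 1 and 3)."""
    slug = title.lower()                      # rule 4: lowercase only
    slug = re.sub(r"[^a-z0-9]+", "-", slug)   # collapse non-alphanumerics to hyphens
    return slug.strip("-")                    # no leading/trailing hyphens

# A reader can guess the content from the address bar (rule 1),
# and there is no dynamic query string such as ?p=1234 (rules 2 and 3).
print("/blog/" + seo_slug("Deciding the Optimum URL Length"))
# -> /blog/deciding-the-optimum-url-length
```

A url built this way describes the content, stays static, uses words instead of numbers, and is entirely lowercase, which is all four rules in one string.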