Tuesday, July 28, 2009

See Your Site With the Eyes of a Spider

Making efforts to optimize a site is great but what counts is how search engines see your efforts. While even the most careful optimization does not guarantee top positions in search results, if your site does not follow basic SEO truths, then it is more than certain that this site will not score well with search engines. One way to check in advance how your SEO efforts are seen by search engines is to use a search engine simulator.

Spiders Explained

Basically, all search engine spiders function on the same principle – they crawl the Web and index pages, which are stored in a database and later processed by various algorithms that determine the ranking, relevancy, etc. of the collected pages. While the ranking and relevancy algorithms differ widely among search engines, the way they index sites is more or less uniform, and it is very important that you know what spiders are interested in and what they neglect.

Search engine spiders are robots and they do not read your pages the way a human does. Instead, they tend to see only particular stuff and are blind to many extras (Flash, JavaScript) that are intended for humans. Since spiders determine whether humans will find your site, it is worth considering what spiders like and what they don't.

Flash, JavaScript, Image Text or Frames?!

Flash, JavaScript and image text are NOT visible to search engines. Frames are a real disaster in terms of SEO ranking. All of them might be great in terms of design and usability, but for search engines they are absolutely wrong. An incredible mistake one can make is a Flash intro page (frames or no frames, this will hardly make the situation worse) with the keywords buried in the animation. Check a page with Flash and images (and preferably no text or inbound or outbound hyperlinks) with the Search Engine Spider Simulator tool and you will see that to search engines this page appears almost blank.

Running your site through this simulator will show you more than the fact that Flash and JavaScript are not SEO favorites. In a way, spiders are like text browsers and they don't see anything that is not a piece of text. So having an image with text in it means nothing to a spider and it will ignore it. A workaround (recommended as an SEO best practice) is to include a meaningful description of the image in the ALT attribute of the tag, but be careful not to use too many keywords in it because you risk penalties for keyword stuffing. The ALT attribute is especially essential when you use images rather than text for links. You can use ALT text for describing what a Flash movie is about but again, be careful not to cross the line between optimization and over-optimization.
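
To get a feel for what such a simulator does, here is a minimal sketch in Python (standard library only; the sample markup is made up for illustration) that extracts what a text-only crawler would see from a page – the visible text plus the ALT attributes of images – while throwing away everything inside script and style tags:

# A minimal "spider's-eye view": visible text plus ALT attributes, no scripts.
from html.parser import HTMLParser

class SpiderView(HTMLParser):
    def __init__(self):
        super().__init__()
        self.chunks = []      # the text a text-only crawler would see
        self.skip_depth = 0   # > 0 while inside <script> or <style>

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self.skip_depth += 1
        elif tag == "img":
            # The image pixels are invisible to a spider; the ALT text is not.
            alt = dict(attrs).get("alt", "")
            if alt:
                self.chunks.append(alt)

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self.skip_depth:
            self.skip_depth -= 1

    def handle_data(self, data):
        if not self.skip_depth and data.strip():
            self.chunks.append(data.strip())

page = """<html><body><script>var fancy = "invisible to spiders";</script>
<h1>Cat Food Store</h1>
<img src="logo.gif" alt="Premium cat food">
<p>We sell premium cat food.</p></body></html>"""

viewer = SpiderView()
viewer.feed(page)
print(" ".join(viewer.chunks))   # Cat Food Store Premium cat food We sell premium cat food.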

Are Your Hyperlinks Spiderable?

The search engine spider simulator can also be of great help when you are trying to figure out whether your hyperlinks lead to the right place. For instance, link exchange websites often put fake links to your site with JavaScript (using mouseover events and similar tricks to make the link look genuine), but this is not a link that search engines will see and follow. Since the spider simulator will not display such links, you'll know that something about the link is wrong.
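
If you want to run such a check yourself, the sketch below (standard-library Python again, with made-up example markup) collects only the links a crawler can actually follow – plain href values – and flags javascript: pseudo-links:

# Separate the links a spider can follow from javascript: pseudo-links.
from html.parser import HTMLParser

class LinkCheck(HTMLParser):
    def __init__(self):
        super().__init__()
        self.spiderable, self.fake = [], []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href", "")
        if not href or href.lower().startswith("javascript:"):
            self.fake.append(href or "(no href at all)")
        else:
            self.spiderable.append(href)

page = '''<a href="http://example.com/partners.html">Genuine link</a>
<a href="javascript:void(0)" onclick="window.open('http://example.com')">Fake link</a>'''

checker = LinkCheck()
checker.feed(page)
print("Spiderable: ", checker.spiderable)    # ['http://example.com/partners.html']
print("Not followed:", checker.fake)         # ['javascript:void(0)']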

Optimizing for Yahoo!

Back in the dawn of the Internet, Yahoo! was the most popular search engine. When Google arrived, its indisputably precise search results made it the preferred search engine. However, Google is not the only search engine and it is estimated that about 20-25% of searches are conducted on Yahoo! Another major player on the market is MSN, which means that SEO professionals cannot afford to optimize only for Google but need to take into account the specifics of the other two engines (Yahoo! and MSN) as well.

Optimizing for three search engines at the same time is not an easy task. There were times when the SEO community was inclined to think that the Yahoo! algorithm was deliberately made just the opposite of the Google algorithm, because pages that ranked high in Google did not do so well in Yahoo! and vice versa. Attempts to optimize a site to appeal to both search engines usually led to being kicked out of the top of both of them.

There is no doubt that the algorithms of the two search engines are different. But since both are constantly changing, neither is made publicly available by its authors, and the details about how each algorithm functions are obtained by speculation based on trial-and-error tests for particular keywords, it is not possible to say for certain what exactly is different. What is more, having in mind the frequency with which algorithms are changed, it would not be possible to react to every slight change even if the algorithms' details were known officially. But knowing some basic differences between the two does help to get better rankings. The Yahoo vs Google tool gives a nice visual representation of the differences in positioning between Yahoo! and Google.

The Yahoo! Algorithm - Differences With Google

Like all search engines, Yahoo! spiders the pages on the Web, indexes them in its database and later performs various mathematical operations to produce the pages with the search results. Yahoo! Slurp (the Yahoo! spider) is the second most active crawler on the Web. Yahoo! Slurp is no different from the other bots, and if your page is missing important elements of the SEO mix that make it spiderable, then it hardly matters which algorithm is used, because you will never get to a top position. (You may want to try the Search Engine Spider Simulator and check which of your pages are spiderable.)

Yahoo! Slurp might be even more active than Googlebot, because occasionally there are more pages in the Yahoo! index than in Google's. Another alleged difference between Yahoo! and Google is the sandbox (putting sites “on hold” for some time before they appear in search results). Google's sandbox is deeper, so if you have made recent changes to your site, you might have to wait a month or two (shorter for Yahoo! and longer for Google) until these changes are reflected in the search results.

With new major changes in the Google algorithm under way (the so-called “BigDaddy” infrastructure, expected to be fully launched in March-April 2006), it's hard to tell if the same SEO tactics will still work on Google in two months' time. One of the supposed changes is a decrease in the weight of links. If this happens, a major difference between Yahoo! and Google will be eliminated, because as of today Google places more importance on factors such as backlinks, while Yahoo! sticks more to on-page factors, like keyword density in the title, the URL, and the headings.

Of all the differences between Yahoo! and Google, the way keywords in the title and in the URL are treated is the most important. If you have the keyword in these two places, then you can expect a top 10 place in Yahoo!. But beware – a title and a URL cannot be unlimited in length, so you can realistically place no more than 3 or 4 keywords there. Also, it matters whether the keyword in the title and in the URL is in its basic form or a derivative – e.g. when searching for “cat”, URLs with “catwalk” will also be displayed in Yahoo!, but most likely in the second 100 results, while URLs with just “cat” are quite near the top.

Since Yahoo! is first a directory for submissions and then a search engine (with Google it's just the opposite), a site which has the keyword in the category it is listed under stands a better chance of appearing at the beginning of the search results. With Google this is not that important. For Yahoo!, keywords in filenames also score well, while for Google this is not a factor of exceptional importance.

But the major difference is keyword density. The higher the density, the higher the positioning with Yahoo!. But beware – some of the keyword-rich sites that do well on Yahoo! can easily fall into the keyword-stuffed category for Google, so if you attempt to score well on Yahoo! (with a keyword density above 7-8%), you risk being banned by Google!
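
Keyword density itself is simple arithmetic – the number of occurrences of the keyword divided by the total number of words on the page. A quick Python sketch for single-word keywords (the 7% threshold is only the rule of thumb mentioned above, not an official limit):

# Keyword density = keyword occurrences / total words, as a percentage.
import re

def keyword_density(text, keyword):
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    return 100.0 * words.count(keyword.lower()) / len(words)

text = "Cat food for cats. Our cat food is the cat food cats love."
density = keyword_density(text, "cat")
print(f"{density:.1f}%")        # 3 of 13 words are "cat" -> 23.1%
if density > 7.0:
    print("Fine for Yahoo!, but Google may read this as keyword stuffing.")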

Yahoo! WebRank

Following Google's example, Yahoo! introduced a Web toolbar that collects anonymous statistics about which sites users browse, thus deriving an aggregated value (from 0 to 10) of how popular a given site is. The higher the value, the more popular the site is and the more valuable the backlinks from it are.

Although WebRank and positioning in the search results are not directly correlated, there is a dependency between them – sites with a high WebRank tend to rank higher than comparable sites with a lower WebRank, and the WebRanks of the top 20-30 results for a given keyword are most often above 5.00 on average.

The practical value of WebRank as a measure of success is often discussed in SEO communities, and the general opinion is that it is not the most relevant metric. However, one of the benefits of WebRank is that it alerts Yahoo! Slurp that a new page has appeared, thus inviting it to spider the page if it is not already in the Yahoo! Search index.

When the Yahoo! toolbar was launched in 2004, it had an icon that showed the WebRank of the page currently open in the browser. This feature has since been removed, but there are still tools on the Web that allow you to check the WebRank of a particular page. For instance, this tool allows you to check the WebRanks of a whole bunch of pages at a time.

Robots.txt File for Website

Robots.txt

It is great when search engines frequently visit your site and index your content, but often there are cases when indexing parts of your online content is not what you want. For instance, if you have two versions of a page (one for viewing in the browser and one for printing), you'd rather have the printing version excluded from crawling, otherwise you risk a duplicate content penalty. Also, if you happen to have sensitive data on your site that you do not want the world to see, you will prefer that search engines do not index these pages (although in this case the only sure way of not indexing sensitive data is to keep it offline on a separate machine). Additionally, if you want to save some bandwidth by excluding images, stylesheets and JavaScript from indexing, you also need a way to tell spiders to keep away from these items.

One way to tell search engines which files and folders on your Web site to avoid is with the use of the Robots metatag. But since not all search engines read metatags, the Robots metatag can simply go unnoticed. A better way to inform search engines about your wishes is to use a robots.txt file.

What Is Robots.txt?

Robots.txt is a text (not HTML) file you put on your site to tell search robots which pages you would like them not to visit. Robots.txt is by no means mandatory for search engines, but generally search engines obey what they are asked not to do. It is important to clarify that robots.txt is not a way of preventing search engines from crawling your site (i.e. it is not a firewall, or a kind of password protection); putting up a robots.txt file is something like putting a note “Please, do not enter” on an unlocked door – you cannot prevent thieves from coming in, but the good guys will not open the door and enter. That is why we say that if you have really sensitive data, it is too naïve to rely on robots.txt to protect it from being indexed and displayed in search results.

The location of robots.txt is very important. It must be in the main directory because otherwise user agents (search engines) will not be able to find it – they do not search the whole site for a file named robots.txt. Instead, they look first in the main directory (i.e. http://mydomain.com/robots.txt) and if they don't find it there, they simply assume that this site does not have a robots.txt file and therefore they index everything they find along the way. So, if you don't put robots.txt in the right place, do not be surprised that search engines index your whole site.
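
Python's standard library encodes the same expectation – a robots.txt is fetched from one fixed location at the root of the domain. A small sketch (mydomain.com is the same placeholder as above; a missing file simply means “crawl everything”):

# Crawlers look for robots.txt only at the root of the domain.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.set_url("http://mydomain.com/robots.txt")   # the one and only location checked
rp.read()   # a 404 here makes the parser allow everything, just like a real spider

print(rp.can_fetch("Googlebot", "http://mydomain.com/temp/draft.html"))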

The concept and structure of robots.txt was developed more than a decade ago, and if you are interested in learning more about it, visit http://www.robotstxt.org/ or go straight to the Standard for Robot Exclusion, because in this article we will deal only with the most important aspects of a robots.txt file. Next we will continue with the structure of a robots.txt file.

Structure of a Robots.txt File

The structure of a robots.txt file is pretty simple (and barely flexible) – it is simply a list of user agents and disallowed files and directories. Basically, the syntax is as follows:

User-agent:
Disallow:

“User-agent:” names the search engine crawler a record applies to, and “Disallow:” lists the files and directories to be excluded from indexing. In addition to “User-agent:” and “Disallow:” entries, you can include comment lines – just put the # sign at the beginning of the line:

# All user agents are disallowed to see the /temp directory.
User-agent: *
Disallow: /temp/

The Traps of a Robots.txt File

When you start making complicated files – i.e. you decide to allow different user agents access to different directories – problems can start if you do not pay special attention to the traps of a robots.txt file. Common mistakes include typos and contradicting directives. Typos include misspelled user agents and directories, missing colons after User-agent and Disallow, etc. Typos can be tricky to find, but in some cases validation tools help.

The more serious problem is with logical errors. For instance:

User-agent: *
Disallow: /temp/

User-agent: Googlebot
Disallow: /images/
Disallow: /temp/
Disallow: /cgi-bin/

The above example is from a robots.txt that first has a record allowing all agents to access everything on the site except the /temp directory, followed by a second record with more restrictive terms for Googlebot. According to the robots exclusion standard, a crawler should obey the record that names it explicitly and fall back to the catch-all “*” record only if no such record exists, so a well-behaved Googlebot will apply its own record here and stay out of /images/ and /cgi-bin/ as well. However, not every crawler implements the standard correctly – a sloppy bot may simply stop at the first record that seems to apply to it – so to stay on the safe side, place the records for specific user agents before the catch-all “*” record. You see, the structure of a robots.txt file is simple, yet serious mistakes can still be made easily.
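
You can watch a standards-compliant parser resolve the example above with Python's urllib.robotparser – Googlebot gets its own, more restrictive record, while every other robot falls back to the catch-all:

# How a compliant parser resolves the example robots.txt above.
from urllib.robotparser import RobotFileParser

rules = """User-agent: *
Disallow: /temp/

User-agent: Googlebot
Disallow: /images/
Disallow: /temp/
Disallow: /cgi-bin/
""".splitlines()

rp = RobotFileParser()
rp.parse(rules)

print(rp.can_fetch("Googlebot", "/images/pic.gif"))     # False - its own record applies
print(rp.can_fetch("SomeOtherBot", "/images/pic.gif"))  # True - only /temp/ is off-limits
print(rp.can_fetch("SomeOtherBot", "/temp/x.html"))     # False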

Tools to Generate and Validate a Robots.txt File

Having in mind the simple syntax of a robots.txt file, you can always read through it yourself to see if everything is OK, but it is much easier to use a validator, like this one: http://tool.motoricerca.info/robots-checker.phtml. These tools report common mistakes like missing slashes or colons, which, if not detected, compromise your efforts. For instance, if you have typed:

User agent: *
Disallow: /temp/

this is wrong because the hyphen between “User” and “agent” is missing and the syntax is incorrect.

In cases when you have a complex robots.txt file – i.e. you give different instructions to different user agents, or you have a long list of directories and subdirectories to exclude – writing the file manually can be a real pain. But do not worry – there are tools that will generate the file for you. What is more, there are visual tools that allow you to point and select which files and folders are to be excluded. And even if you do not feel like buying a graphical tool for robots.txt generation, there are online tools to assist you. For instance, the Server-Side Robots Generator offers a dropdown list of user agents and a text box for you to list the files you don't want indexed. Honestly, it is not much of a help unless you want to set specific rules for different search engines, because in any case it is up to you to type the list of directories, but it is more than nothing.

Jumping Over the Google Sandbox

It's never easy for newcomers to enter a market, and there are barriers of different kinds. For newcomers to the world of search engines, the barrier is called a sandbox – your site stays there until it gets mature enough to be allowed into the Top Positions club. Although there is no direct confirmation of the existence of a sandbox, Google employees have implied it, and SEO experts have seen in practice that new sites, no matter how well optimized, don't rank high on Google, while on MSN and Yahoo! they catch on quickly. For Google, the jailing in the sandbox for new sites with new domains is 6 months on average, although it can vary from less than a month to over 8 months.

Sandbox and Aging Delay

While it might be considered unfair to stop new sites by artificial means like keeping them at the bottom of search results, there is a fair amount of reasoning behind why search engines, and above all Google, have resorted to such measures. With blackhat practices like bulk buying of links, creation of duplicate content, or simply keyword stuffing to get to the coveted top, it is no surprise that Google chose to penalize new sites which overnight get tons of backlinks, or which are used as a source of backlinks to support an older site (possibly owned by the same company). Needless to say, when such fake sites are indexed and admitted to top positions, this deteriorates search results, so Google had to take measures to ensure that such practices will not be tolerated. The sandbox effect works like a probation period for new sites, and by making the farming of fake sites a long-term, rather than a short-term, payoff for site owners, it is supposed to decrease the practice's use.

Sandbox and aging delay are similar in meaning, and many SEO experts use the terms interchangeably. Aging delay is more self-explanatory – sites are “delayed” till they come of age. Well, unlike in legislation, with search engines this age is not defined and it differs from case to case. There are cases when several sites were launched on the same day and were indexed within a week of each other, but the aging delay for each of them expired in different months. As you see, the sandbox is something beyond your control and you cannot avoid it, but there are still steps you can take to minimize the damage for new sites with new domains.

Minimizing Sandbox Damages

While the Google sandbox is not something you can control, there are certain steps you can take in order to make the sandbox effect less destructive for your new site. As with many aspects of SEO, there are ethical and unethical tips and tricks, and the unethical tricks can get you additional penalties or a complete ban from Google, so think twice before resorting to them. The unethical approaches will not be discussed in this article because they do not comply with our policy.

Before we delve into more detail about particular techniques to minimize sandbox damage, it is necessary to clarify the general rule: you cannot fight the sandbox. The only thing you can do is to adapt to it and patiently wait for time to pass. Any attempts to fool Google – from writing melodramatic letters to Google to using “sandbox tools” to bypass the filter – can only make your situation worse. Still, there are many initiatives you can take while in the sandbox, for example:

  • Actively gather content and good links – as time passes, relevant and fresh content and good links will take you to the top. When getting links, keep in mind that they need to be from trusted sources – like DMOZ, CNN, Fortune 500 sites, or other reputable places. Also, links from .edu, .gov, and .mil domains might help, because these domains are usually exempt from the sandbox filter. Don't get 500 links a month – this will kill your site! Instead, build links slowly and steadily.

  • Plan ahead – contrary to the general practice of launching a site only when it is absolutely complete, launch a couple of pages as soon as you have them. This will start the clock, and time will run in parallel with your site development efforts.

  • Buy old or expired domains – the sandbox effect is more serious for new sites on new domains, so if you buy old or expired domains and launch your new site there, you'll experience fewer problems.

  • Host on a well-established host – another solution is to host your new site on a subdomain of a well-established host (however, free hosts are generally not a good idea in terms of SEO ranking). The sandbox effect is not so severe for new subdomains (unless the domain itself is blacklisted). You can also host the main site on a subdomain and host just some content, linked with the main site, on a separate domain. You can also use redirects from the subdomained site to the new one, although the effect of this practice is questionable, because it can also be viewed as an attempt to fool Google.

  • Concentrate on less popular keywords – the fact that your site is sandboxed does not mean that it is not indexed by Google at all. On the contrary, you could top the search results from the very beginning! Does that look like a contradiction with the rest of this article? Not at all! You could top the results for less popular keywords – surely better than nothing. And while you wait to reach the top for the most lucrative keywords, you may discover that even less popular keywords are enough to keep the ball rolling, so you may want to do some optimization for them.

  • Rely more on non-Google ways to increase traffic – it is worth remembering that Google is not the only search engine or marketing tool out there. So if you plan your SEO efforts to include other search engines, which either have no sandbox at all or where the period of stay is relatively short, this will also minimize the damage of the sandbox effect.

Choosing SEO as Your Career

It's always better to know in advance what you can expect from a career in SEO.

Some Good Reasons to Choose SEO as Your Career

1. High demand for SEO services

There was a time when SEO was not a separate profession – Web masters performed some basic SEO for the sites they managed, and that was all. But as sites began to grow and make money, it became more reasonable to hire a dedicated SEO specialist than to have the Web master do it. The demand for good SEO experts is high and constantly on the rise.

2. A LOT of people have made a successful SEO career

There is plenty of living proof that SEO is a viable business. The list is too long to be quoted here, but some of the names include Rob from Blackwood Productions, Jill Whalen from High Rankings, Rand Fishkin from SEOmoz and many others.

3. Search Engine Optimizers Make Good Money!

SEO is a profession that can be practiced while working for a company or as a solo practitioner. There are many job boards like Dice and Craigslist that publish SEO job advertisements. It is worth noting that the compensation for SEO employees is equal to or even higher than that of developers, designers and marketers. Salaries over $80K per annum are not an exception for SEO jobs.
As a solo SEO practitioner you can make even more money. Almost all freelance sites have sections for SEO services, and offers for $50 an hour or more are quite common. If you are still not confident that you can work on your own, you can start with an SEO job, learn a bit and then start your own company.
If you already feel confident that you know a lot about SEO, you can take this quiz and see how you score. Well, don't get depressed if you didn't pass - here is a great checklist that will teach you a lot, even if you are already familiar with SEO.

4. Web design alone MAY NOT be enough

Many companies offer turn-key solutions that include Web design, Web development AND SEO optimization. In fact, many clients expect that when they hire somebody to make their site, the site will be SEO friendly, so if you are good both as a designer and as an SEO expert, you will be a truly valuable professional.
On the other hand, many companies deal with SEO only, because they feel that this way they can concentrate their efforts on their major strength, so you can consider this possibility as well.

5. Logical step ahead if you come from marketing or advertising

The Web has changed the way companies do business, so to some extent today's marketers and advertisers need to have at least some SEO knowledge if they want to be successful. SEO is also a great career for linguists.

6. Lots of Learning

For somebody who comes from design, development or web administration, SEO might not look technical enough, and you might feel that you will be downgrading if you move to SEO. Don't worry so much – you can learn a LOT from SEO, so if you are a talented techie, you are not downgrading; you are actually upgrading your skill set.

7. SEO is already recognized as a career

Finally, if you need some more proof that SEO is a great career, have a look at the available courses and exams for SEO practitioners. They might not be a Cisco certification, but they still help to institutionalize the SEO profession.

Some Ugly Aspects of SEO

1. Dependent on search engines

It is true that in any career there are many things that are outside of your control, but for SEO this is rule number one. Search engines frequently change their algorithms, and what is worse, these changes are not made public, so even the greatest SEO gurus admit that they make a lot of educated guesses about how things work. It is very discouraging to make everything perfect and then to learn that, due to a change in the algorithm, your site dropped 100 positions. But the worst part is that you need to communicate this to clients, who are not satisfied with their sinking rankings.

2. No fixed rules

Probably this will change over time, but for now the rule is that there are no rules – or at least no written ones. You can work very hard, follow everything that looks like a rule, and still success does not come. Currently you can't even take a search engine to court for the damage done to your business, because search engines are not obliged to rank highly sites that have made efforts to get optimized.

3. Rapid changes in rankings

But even if you somehow manage to get to the top for a particular keyword, keeping the position requires constant effort. Well, many other businesses are like that, so this is hardly a reason to complain – except when an angry customer starts shouting at you that this week their rankings are sinking and of course this is all your fault.

4. SEO requires Patience

SEO professionals and customers both need to understand that SEO takes constant effort and time. It could take months to move ahead in the rankings, or to build dozens of links. Additionally, if you stop optimizing for some time, most likely you will experience a considerable drop in rankings. You need lots of motivation and patience not to give up when things are not going your way.

5. Black hat SEO

Black hat SEO is probably one of the biggest concerns for the would-be SEO practitioner. Fraud and unfair competition are present in any industry, and those who are good and ethical suffer from it, but black hat SEO is still pretty widespread. It is true that search engines penalize black hat practices, yet black hat SEO remains a major concern for the industry.

So, let's hope that by telling you about the pros and cons of choosing SEO as your career we have helped you make an informed decision about your future.

Bing Optimization

Ever since Microsoft launched its Bing search engine, it has drawn a lot of interest (and speculation) from the SEO community. On one hand, this is quite logical, because Bing is intended to be one more heavyweight player and is expected to cut some share from Google. On the other hand, this is hardly the first time a new heavyweight player has come to the ring, so maybe the expectations that Bing will put an end to Google's monopoly are groundless. Still, Bing is quite different (in a positive way) from the other search engines and this is its major strength.

Is Bing Really That Different?

The first impression you get when you go to Bing.com is that it is different – the background makes it cute but sure, there have been many other cases of search engines with tons of graphical frills to disguise their irrelevant search algorithms. However, when you type a search term, the results you get are a pleasant surprise because they are relevant.

It is this relevance of search results that worries SEO experts. The results you get when you search with Bing are relevant, yet they are very different from Google's. In fact, no matter whether you search with Google or with Bing (or go to Bingle, where you can compare the result sets side by side), you get relevant results, and the two sets are very different from one another.

One of the most important things SEO experts are curious to know about Bing is its algorithm. Obviously, Bing's algorithm is different from Google's, because when the search term is the same but the set of results is different, a difference in the algorithm is the obvious answer. The real question is exactly what is different between the two algorithms, and whether the difference is so drastic that it makes it mandatory to reoptimize a site for Bing.

What Do I Need to Do In Order to Optimize My Site for Bing?

Wait. This is the first thing you need to do. Right now it is too early to say what steps (if any) are required in order to optimize your site for Bing.

Additionally, no matter how promising Bing looks, it is still too early to predict whether it will become a real competitor to Google or one more failed attempt to dethrone it. Let's see how users react – will they start Binging more or will they stick to Google? When it becomes clear that Bing will be able to make it, then it will make sense to optimize for it as well. So for now the best you can do is wait.

Which Factors Make a Site Rank Well With Bing

As you probably guess, the exact algorithm of Bing is not publicly available, and because of that there is a lot of speculation about what weighs more for Bing (in comparison to Google) and what weighs less. Many SEO experts test different search queries, analyze the results, and based on that try to figure out which of the known SEO tactics work with Bing. For instance, these tests are quite interesting.

Some SEO experts even think that Bing is actually Live Search in new clothes (i.e. user interface), while others say that there are noticeable differences between Live Search and Bing. But there is no doubt that for now Bing is a significant improvement over Live Search in terms of relevance of search results.

Bing is hardly the first case where there is no agreement in the SEO community about the intricacies of an algorithm, but to summarize, here are some factors which are (or at least are strongly believed to be) of importance where Bing optimization is concerned:

  • Backlinks are of less importance. If you compare the first 10 results in Bing and Google, it is noticeable that, all else being equal, the winners in Bing have fewer backlinks than the winners in Google. It is unclear if nofollow matters with Bing.

  • Inbound anchor text matters more. The quantity of quality inbound links might be of less importance for Bing, but the anchor text certainly matters more. Actually, since anchor text is one of the measurements of the quality of inbound links, this isn't much of a difference. Get quality anchor text and you will do well in both Bing and Google.

  • Link spamming won't do much for you on Bing. Since the quantity of backlinks (even if they are of supreme quality) seems to be of less importance to Bing, link spamming will be even less effective than with Google.

  • On-page factors matter more than with Google. This is one of the most controversial points. Many SEO experts disagree, but many also think that on-page factors matter more with Bing than with Google. Still, it has nothing to do with the '90s, when on-page factors were decisive.

  • Bing pays more attention to the authority of the site. If this is true, it is bad news for bloggers and small sites, because it means that search results are distorted in favor of older sites and/or sites of authoritative organizations. Age of domain is also very important with Bing – even more so than with Google.

  • PR matters less. When you perform a search for a competitive keyword and you see a couple of PR2 or even PR1 sites among the top 10 results, this might make you wonder. On Google this is hardly possible but on Bing it looks quite normal.

  • Fresh content matters less. Bing looks a bit conservative – or maybe it just can't index sites that quickly – but it seems that fresh content is not as vital as with Google. This is related to the age-of-domain specifics, and as a result you will see ancient pages ranking high (though those ancient pages are relevant to the search query).

  • Bing is more Flash-friendly. Optimizing a Flash site for Google is a bit of an SEO nightmare. It is too early to say, but it looks like Bing is more Flash-friendly, which is good news for all sites where Flash is (still) heavily employed.

For now it is too early to say which factors are of primary importance with Bing. But the fact that its search results are relevant means that its algorithm is really precise. Then again, maybe the relevant results in Bing are due to the fact that webmasters were taken by surprise and haven't had the time to optimize for Bing yet. As a result, the content is authentic, with no SEO gimmicks and artificial pumping. We'll see if this stays so in the future, when webmasters learn how to optimize for Bing as well!

SEO Careers during a Recession

I don't know if many people became SEO experts because they planned ahead and figured that SEO careers are relatively stable in the long run compared to other business areas, or whether the reasons to make a career in SEO were completely different, but my feeling is that SEO experts are lucky now. Why? Because while the recession makes many industries writhe in pain, many SEO professionals are in top financial shape and full of optimism for the future.

It would be an exaggeration to say that the SEO industry doesn't feel the recession at all. But when compared to industries such as automobiles, newspapers, banking, real estate, etc., SEO looks like a coveted island of financial security. This doesn't mean that there is no drop in volumes or that everybody in SEO is working for top dollar, but as a whole the SEO industry and the individuals who make their living in SEO are doing much better than many other employees and entrepreneurs.

What Can You Expect from Your SEO Career During a Recession?

The question of what expectations are realistic is fundamental. I bet there are people in SEO who are not very happy with their current situation and blame the recession for it. Well, if most of your clients were from troubled industries (cars, real estate, financial services, etc.), then you do have a reason to complain, and in such cases you should be happy if you can pay the bills. What you can do (if you haven't already done it) is look for new customers from other industries.

Another factor that influences your expectations about your SEO career during the recession is your position on the career ladder. It makes a big difference whether you work for a company or are your own boss. Being an employee has always been a more vulnerable position, so if you expect job security, this is easier to achieve as an independent SEO contractor. Mass layoffs might not be common for SEO companies, but hired workers are never immune to them.

Additionally, your skill level also affects how your SEO career will be influenced by the recession. The recession is not the right time for novices to enter SEO. Many people from other industries rush to SEO as a life belt; when these people don't have the right skills and expertise but expect rivers of gold, this inevitably leads to disappointment.

What Makes SEO Careers Recession-Proof?

So, if you are a seasoned SEO practitioner and you don't dream of rivers of gold, you can feel safe with SEO, because unlike careers in many other industries, SEO careers are relatively recession-proof. Here are some of the reasons why:

  • The SEO market is an established market. If you remember the previous recession at the beginning of the century, when the IT industry was among the most heavily stricken, you might be a bit skeptical that it won't be the same story now. But it is not the same. SEO is not a new service anymore, and the SEO market itself is more established than it was a couple of years ago. This is what makes the present recession different from the previous one – the difference is fundamental and it can't be neglected.

  • SEO is one of the last expenses companies cut. SEO has already become a necessity for companies of any size. Unlike hardware or cars, not to mention entertainment and business trips, SEO expenses are usually not that big, and they help a company stay afloat. That is why, when a company decides to make cuts in the budget, SEO expenses are usually not among the things that get the largest cut (or any cut at all).

  • SEO has great ROI. The Return On Investment (ROI) for money spent on SEO is much higher than the ROI for other types of investments. SEO brings companies money and this is what makes it such a great investment. Stop SEO and the money stops coming as well.

  • Many clients start aggressive SEO campaigns in an attempt to get better results fast. During a recession SEO is even more important. That is why many clients decide that an aggressive SEO campaign will help them get more clients and as a result these clients double their pre-recession budgets.

  • SEO is cheaper than PPC. SEO is just one of the many ways for a site to get traffic. However, it is also one of the most effective ways to drive tons of traffic. For instance, if you consider PPC, the cost advantages of SEO are obvious. PPC is very expensive and as a rule, ranking high in organic search results even for competitive keywords is cheaper than PPC.

  • Cheaper than traditional promotion methods. Traditional promotion methods (i.e. offline marketing) are still an option, but their costs are higher even than PPC and the other forms of online promotion. Besides, many companies have given up offline marketing completely and have turned to SEO as their major way to promote their business and attract new clients.

  • SEO is a recurring expense. Many businesses build their business model around memberships and other forms of recurring payments. For you, recurring payments mean presold campaigns – i.e. more or less you know that if a client is happy with a campaign you did for them, they will return. Acquiring recurring clients is very beneficial, because you have fewer expenses in comparison to acquiring clients one by one.

The outlook for SEO careers during times of recession is pretty positive. As we already mentioned, it is possible to experience drops in volumes, or to see some of your clients go down the bankruptcy road, but as a whole SEO offers more stability than many other careers. If you manage to take advantage of the above-mentioned recession-proof specifics of SEO and you are a real professional, you won't get to feel the recession in all its bitterness.

Optimization, Over-Optimization or SEO Overkill?

The fight to top search engines' results knows no limits – neither ethical nor technical. There are often reports of sites that have been temporarily or permanently excluded from Google and the other search engines because of malpractice and the use of “black hat” SEO techniques. The reaction of search engines is easy to understand – with so many tricks and cheats that SEO experts include in their arsenal, the relevancy of returned results is seriously compromised, to the point where search engines start to deliver completely irrelevant and manipulated search results. And even if search engines do not discover your scams right away, your competitors might report you.

Keyword Density or Keyword Stuffing?

Sometimes SEO experts go too far in their desire to push their clients' sites to top positions and resort to questionable practices, like keyword stuffing. Keyword stuffing is considered an unethical practice because what you actually do is use the keyword in question throughout the text suspiciously often. Having in mind that the recommended keyword density is from 3 to 7%, anything above this – say, 10% density – starts to look very much like keyword stuffing, and it is likely that it will not go unnoticed by search engines. A text with 10% keyword density can hardly make sense when read by a human. Some time ago Google implemented the so-called “Florida Update” and essentially imposed a penalty for pages that are keyword-stuffed and over-optimized in general.

Generally, keyword density in the title, the headings, and the first paragraphs matters more. Needless to say, you should be especially careful not to stuff these areas. Try the Keyword Density Cloud tool to check if your keyword density is within the acceptable limits, especially in the above-mentioned places. If you have a high density percentage for a frequently used keyword, then consider replacing some of the occurrences of the keyword with synonyms. Also, words in bold and/or italic are generally considered important by search engines, but if every occurrence of the target keywords is in bold and italic, this also looks unnatural and in the best case it will not push your page up.
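
Since the title and the headings are where stuffing is punished first, the density check can be pointed at those regions separately. A rough Python sketch (title and headings are lumped into one region for simplicity; the thresholds remain rules of thumb):

# Check keyword density separately in the title/headings and in the body text.
from html.parser import HTMLParser
import re

class RegionText(HTMLParser):
    def __init__(self):
        super().__init__()
        self.region = "body"
        self.texts = {"title/headings": [], "body": []}

    def handle_starttag(self, tag, attrs):
        if tag in ("title", "h1", "h2", "h3"):
            self.region = "title/headings"

    def handle_endtag(self, tag):
        if tag in ("title", "h1", "h2", "h3"):
            self.region = "body"

    def handle_data(self, data):
        self.texts[self.region].append(data)

def density(text, keyword):
    words = re.findall(r"[a-z0-9']+", text.lower())
    return 100.0 * words.count(keyword.lower()) / max(len(words), 1)

page = ("<title>Cat food, cat toys, cat beds</title>"
        "<body><h1>Cat food</h1><p>We review cat food brands.</p></body>")
parser = RegionText()
parser.feed(page)
for region, parts in parser.texts.items():
    print(region, f"{density(' '.join(parts), 'cat'):.0f}%")   # 50% vs 20%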

Doorway Pages and Hidden Text

Another common keyword scam is doorway pages. Before Google introduced the PageRank algorithm, doorways were a common practice, and there were times when they were not even considered an illegitimate optimization technique. A doorway page is a page that is made especially for the search engines, that has no meaning for humans, and that is used to get high positions in search engines and to trick users into coming to the site. Although keywords are still very important, today keywords alone have less effect in determining the position of a site in search results, so doorway pages do not bring so much traffic to a site – but if you use them, don't ask why Google punished you.

Very similar to doorway pages was a scam called hidden text. This is text which is invisible to humans (e.g. the text color is the same as the page background) but is included in the HTML source of the page, trying to fool search engines that the particular page is keyword-rich. Needless to say, neither doorway pages nor hidden text can be qualified as optimization techniques; they are more manipulation than anything else.

Duplicate Content

It is a basic SEO rule that content is king. But not duplicate content. In terms of Google, duplicate content means text that is the same as the text on a different page on the SAME site (or on a sister site, or on a site that is heavily linked to the site in question, such that it can be presumed that the two sites are related) – i.e. when you copy and paste the same paragraphs from one page on your site to another, you might expect to see your site's rank drop. Most SEO experts believe that syndicated content is not treated as duplicate content, and there are many examples of this; if syndicated content were duplicate content, the sites of news agencies would have been the first to drop out of search results. Still, it does not hurt to check from time to time whether your site shares duplicate content with another, at least because somebody might be illegally copying your content without your knowledge. The guys from Blackwood Productions have told me about cases like this, so as incredible as it might seem, content theft is not that uncommon. The Similar Page Checker tool will help you see if you have grounds to worry about duplicate content.
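
Under the hood, tools of this kind boil down to a similarity measure between two texts. Here is a toy version of the idea (Jaccard similarity over word 3-grams; real checkers are more elaborate):

# Toy duplicate-content check: Jaccard similarity of word 3-grams (shingles).
import re

def shingles(text, n=3):
    words = re.findall(r"[a-z0-9']+", text.lower())
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def similarity(a, b):
    sa, sb = shingles(a), shingles(b)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

page1 = "Content is king, but duplicate content can sink your rankings."
page2 = "Content is king, but duplicate content will sink your site."
print(f"{similarity(page1, page2):.0%}")   # 33% of shingles shared - worth a closer look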

Link Spam

Links are another major SEO tool and, like the other SEO tools, they can be used or misused. While backlinks are certainly important (for Yahoo! the quantity of backlinks matters, while for Google it is more important which sites the backlinks come from), getting tons of backlinks from a link farm or a blacklisted site is begging to be penalized. Also, if outbound links (links from your site to other sites) considerably outnumber your inbound links (links from other sites to your site), then you have put too much effort into creating useless links, because this will not improve your ranking. You can use the Domain Stats Tool to see the number of backlinks (inbound links) to your site and the Site Link Analyzer to see how many outbound links you have.

Using keywords in links (the anchor text), domain names, and folder and file names does boost your search engine rankings, but again, precise measure marks the boundary between topping the search results and being kicked out of them. For instance, suppose you are optimizing for the keyword “cat” – a frequently chosen keyword, and as with all popular keywords and phrases, competition is fierce. You might see no alternative for reaching the top other than getting a domain name like http://www.cat-cats-kittens-kitty.com, which no doubt is packed with keywords to the maximum, but it is, first, difficult to remember, and second, if the content does not correspond to the plenitude of cats in the domain name, you will never top the search results.

Although file and folder names are less important than domain names, now and then (but definitely not all the time) you can include “cat” (and synonyms) in them and in the anchor text of links. This counts well, provided that the anchors are not artificially stuffed (for instance, if you use “cat_cats_kitten” as the anchor for internal site links, this anchor certainly is stuffed). While you have no control over the third parties that link to you and the anchors they use, it is up to you to perform periodic checks on what anchor text other sites use to link to you. A handy tool for this task is the Backlink Anchor Text Analysis, where you enter the URL and get a listing of the sites that link to you and the anchor text they use.
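
The same kind of check is easy to script for a single page. Given the HTML of a page that links to you, the sketch below lists the anchor text of every link to your domain (standard-library Python; example.com stands in for your site):

# List the anchor texts of links pointing to a given domain.
from html.parser import HTMLParser

class AnchorText(HTMLParser):
    def __init__(self, domain):
        super().__init__()
        self.domain, self.href, self.buf, self.found = domain, None, [], []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href", "")
            if self.domain in href:
                self.href, self.buf = href, []

    def handle_data(self, data):
        if self.href is not None:
            self.buf.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self.href is not None:
            self.found.append((self.href, "".join(self.buf).strip()))
            self.href = None

page = '<p>Read this <a href="http://example.com/guide.html">cat care guide</a> today.</p>'
parser = AnchorText("example.com")
parser.feed(page)
print(parser.found)   # [('http://example.com/guide.html', 'cat care guide')]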

Finally, to Google and the other search engines it makes no difference whether a site is intentionally over-optimized to cheat them or the over-optimization is the result of good intentions, so no matter what your motives are, always keep to reasonable practices and remember not to overstep the line.

Top 10 SEO Mistakes

1. Targeting the wrong keywords

This is a mistake many people make, and what is worse – even experienced SEO experts make it. People choose keywords that in their mind are descriptive of their website, but the average user may just not search for them. For instance, if you have a relationship site, you might discover that “relationship guide” does not work for you, even though it has the “relationship” keyword, while “dating advice” works like a charm. Choosing the right keywords can make or break your SEO campaign. Even if you are very resourceful, you can't think of all the great keywords on your own, but a good keyword suggestion tool – for instance, the Website Keyword Suggestion tool – will help you find keywords that are good for your site.

2. Ignoring the Title tag

Leaving the <title> tag empty is also very common. This is one of the most important places to have a keyword, because not only does it help you in optimization, but the text in your <title> tag shows in the search results as your page title.

3. A Flash website without an HTML alternative

Flash might be attractive, but not to search engines and not to all users. If you really insist that your site be Flash-based and you want search engines to love it, provide an HTML version. Here are some more tips for optimizing Flash sites. Search engines don't like Flash sites for a reason – a spider can't read Flash content and therefore can't index it.

4. JavaScript Menus

Using JavaScript for navigation is not bad, as long as you understand that search engines do not read JavaScript and build your web pages accordingly. So if you have JavaScript menus you can't do without, you should consider building a sitemap (or putting the links in a noscript tag) so that all your links will be crawlable.

5. Lack of consistency and maintenance

Our friend Rob from Blackwood Productions often encounters clients who believe that once you optimize a site, it is done forever. If you want to be successful, you need to permanently optimize your site, keep an eye on the competition and on changes in the ranking algorithms of search engines.

6. Concentrating too much on meta tags

A lot of people seem to think SEO is about getting your meta keywords and description correct! In fact, meta tags are becoming (if not already) a thing of the past. You can create your meta keywords and descriptions, but don't expect to rank well only because of this.

7. Using only Images for Headings

Many people think that an image looks better than text for headings and menus. Yes, an image can make your site look more distinctive, but in terms of SEO, images for headings and menus are a big mistake, because h1, h2, etc. tags and menu links are important SEO items. If you are afraid that your h1, h2, etc. tags look horrible, try modifying them in a stylesheet or consider this approach: http://www.stopdesign.com/articles/replace_text.

8. Ignoring URLs

Many people underestimate how important a good URL is. Dynamic page names are still very frequent, and no keywords in the URL is more the rule than the exception. Yes, it is possible to rank high even without keywords in the URL, but all else being equal, if you have keywords in the URL (the domain itself, or file names which are part of the URL), this gives you an additional advantage over your competitors. Keywords in URLs are more important for MSN and Yahoo!, but even with Google their relative weight is high, so there is no excuse for having keywordless URLs.

9. Backlink spamming

It is a common delusion that more backlinks are ALWAYS better, and because of this, webmasters resort to link farms, forum/newsgroup spam, etc., which ultimately could lead to getting their site banned. In fact, what you need are quality backlinks. Here is some more information on The Importance of Backlinks.

10. Lack of keywords in the content

Once you focus on your keywords, modify your content and put the keywords wherever it makes sense. It is even better to make them bold or highlight them.

Wednesday, July 15, 2009

Find State & Telecom Operator of any Mobile Number SERIES in INDIA...

If someone is calling you from a number you have never seen and you want to know his location (from where he/she is calling you), or if you just want to know which mobile operator's number your friend is using, then here is the way to find out. The list below contains the complete mobile codes and prefixes of all private and government mobile operators in India. Now you can easily search, find and get information about the state, city, and telecom service provider from any mobile number. If you want a PDF version of the mobile codes list, send a mail to me at knightkams@gmail.com and I'll provide it via mail.


How to get info about the location of a caller from a cellphone/mobile number?

India is divided into various cellular zones such that within each zone a call is treated as a local call, while across zones it becomes a long-distance call. A cellular zone (or cellular circle) is normally an entire state, with a few exceptions like Mumbai (which is a separate zone), Goa (which is part of the Maharashtra zone) or Uttar Pradesh (divided into multiple zones). All mobile numbers in India have the prefix 9 (this includes pager services, but the use of pagers is on the decline). Each zone is allowed to have multiple private operators (earlier it was 2 private + BSNL, subsequently it was changed to 3 private + BSNL in GSM 900/1800, and now it also includes 2 private + BSNL in CDMA). All cellphone numbers are 10 digits long, (normally) split up as OO-AA-NNNNNN, where OO is the operator code, AA is the zone code assigned to the operator, and NNNNNN is the subscriber number.
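
In other words, the first four digits are enough to identify the operator and the circle. A tiny Python sketch of the lookup, seeded with a few sample prefixes taken from the tables below (extend the dictionary with the full list as needed):

# Look up operator and telecom circle from the first four digits of a number.
# Only a handful of sample prefixes from the tables below; extend as needed.
PREFIXES = {
    "9810": ("Airtel", "Delhi"),
    "9820": ("Vodafone (Hutch)", "Mumbai"),
    "9412": ("BSNL CellOne", "UP (West)"),
    "9300": ("Reliance", "Madhya Pradesh"),
    "9003": ("Airtel", "Tamil Nadu"),   # from the new 90 series
}

def identify(number):
    digits = "".join(ch for ch in number if ch.isdigit())[-10:]  # keep the last 10 digits
    return PREFIXES.get(digits[:4], ("unknown operator", "unknown circle"))

print(identify("+91 98101 23456"))   # ('Airtel', 'Delhi')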


Indian Mobile Phone Numbering system

* 92-xx-yyyyyy - TATA Indicom Numbers

* 93-xx-yyyyyy - Reliance Mobile Numbers

* 94-xx-yyyyyy - BSNL CellOne Numbers

* 96-xx-yyyyyy - Various operators except Reliance, TATA and BSNL

* 97-xx-yyyyyy - Various operators except Reliance, TATA and BSNL

* 98-xx-yyyyyy - Various operators except Reliance, TATA and BSNL

* 99-xx-yyyyyy - Various operators except Reliance, TATA and BSNL


Tata Ind.........Reliance.......BSNL.........97 Shared.........98 Shared........99 Shared
9200 MP 9300 MP 9400 –– 9700 a AP 9800 A WB 9900 A KA
9201 MP 9301 MP 9401 AS 9701 A AP 9801 A BR 9901 A KA
9202 MP 9302 MP 9402 NE 9702 I MU 9802 D HR 9902 A KA
9203 MP 9303 MP 9403 MH 9703 V AP 9803 D PB 9903 A KO
9204 –– 9304 BR 9404 –– 9704 A AP 9804 D KO 9904 I GJ
9205 –– 9305 UE 9405 –– 9705 I AP 9805 A HP 9905 R BR
9206 –– 9306 JK 9406 MP 9706 E AS 9806 D MP 9906 A JK
9207 –– 9307 UE 9407 MP 9707 R AS 9807 D UE 9907 R MP
9208 –– 9308 BR 9408 –– 9708 I BR 9808 D UW 9908 A AP
9209 –– 9309 RJ 9409 –– 9709 E BR 9809 D KL 9909 V GJ
9210 DL 9310 DL 9410 UW 9710 a CH 9810 A DL 9910 A DL
9211 DL 9311 DL 9411 UW 9711 V DL 9811 V DL 9911 I DL
9212 DL 9312 DL 9412 UW 9712 V GJ 9812 I HR 9912 I AP
9213 DL 9313 DL 9413 RJ 9713 - –– 9813 V HR 9913 V GJ
9214 RJ 9314 RJ 9414 RJ 9714 I GJ 9814 S PB 9914 S PB
9215 HR 9315 HR 9415 UE 9715 a TN 9815 A PB 9915 A PB
9216 PB 9316 PB 9416 HR 9716 a DL 9816 A HP 9916 V KA
9217 PB 9317 PB 9417 PB 9717 A DL 9817 R HP 9917 I UW
9218 HP 9318 HP 9418 HP 9718 I DL 9818 A DL 9918 V UE
9219 UW 9319 UW 9419 JK 9719 V UW 9819 V MU 9919 V UE
9220 MU 9320 MU 9420 MH 9720 V UW 9820 V MU 9920 V MU
9221 MU 9321 MU 9421 MH 9721 V UE 9821 B MU 9921 I MH
9222 MU 9322 MU 9422 MH 9722 a GJ 9822 I MH 9922 I MH
9223 MU 9323 MU 9423 MH 9723 I GJ 9823 V MH 9923 V MH
9224 MU 9324 MU 9424 MP 9724 A GJ 9824 I GJ 9924 I GJ
9225 MH 9325 MH 9425 MP 9725 A GJ 9825 V GJ 9925 V GJ
9226 MH 9326 MH 9426 GJ 9726 V GJ 9826 I MP 9926 I MP
9227 GJ 9327 GJ 9427 GJ 9727 V GJ 9827 R MP 9927 I UW
9228 GJ 9328 GJ 9428 GJ 9728 I HR 9828 V RJ 9928 A RJ
9229 MP 9329 MP 9429 GJ 9729 A HR 9829 A RJ 9929 A RJ
9230 KO 9330 KO 9430 BR 9730 A MH 9830 V KO 9930 V MU
9231 KO 9331 KO 9431 BR 9731 A KA 9831 A KO 9931 A BR
9232 WB 9332 WB 9432 KO 9732 V WB 9832 R WB 9932 A WB
9233 WB 9333 WB 9433 KO 9733 V WB 9833 V MU 9933 A WB
9234 BR 9334 BR 9434 WB 9734 V WB 9834 A MP 9934 A BR
9235 UE 9335 UE 9435 AS 9735 V WB 9835 R BR 9935 A UE
9236 UE 9336 UE 9436 NE 9736 E HP 9836 V KO 9936 A UE
9237 OR 9337 OR 9437 OR 9737 - –– 9837 I UW 9937 A OR
9238 OR 9338 OR 9438 OR 9738 a KA 9838 V UE 9938 A OR
9239 KO 9339 KO 9439 –– 9739 V KA 9839 V UE 9939 A BR
9240 CH 9340 CH 9440 AP 9740 A KA 9840 A CH 9940 A CH
9241 KA 9341 KA 9441 AP 9741 A KA 9841 a CH 9941 a CH
9242 KA 9342 KA 9442 TN 9742 V KA 9842 a TN 9942 a TN
9243 KA 9343 KA 9443 TN 9743 S KA 9843 V TN 9943 V TN
9244 TN 9344 TN 9444 CH 9744 I KL 9844 S KA 9944 A TN
9245 TN 9345 TN 9445 CH 9745 V KL 9845 A KA 9945 A KA
9246 AP 9346 AP 9446 KL 9746 A KL 9846 V KL 9946 V KL
9247 AP 9347 AP 9447 KL 9747 I KL 9847 I KL 9947 I KL
9248 AP 9348 AP 9448 KA 9748 A KO 9848 I AP 9948 I AP
9249 KL 9349 KL 9449 KA 9749 R WB 9849 A AP 9949 A AP
9250 DL 9350 DL 9450 UE 9750 a TN 9850 I MH 9950 A RJ
9251 RJ 9351 RJ 9451 UE 9751 V TN 9851 D WB 9951 I AP
9252 RJ 9352 RJ 9452 UE 9752 A MP 9852 D BR 9952 A TN
9253 HR 9353 HR 9453 UE 9753 I MP 9853 D OR 9953 V DL
9254 HR 9354 HR 9454 UE 9754 I MP 9854 D AS 9954 A AS
9255 HR 9355 HR 9455 –– 9755 A MP 9855 S PB 9955 A BR
9256 PB 9356 PB 9456 UW 9756 I UW 9856 D NE 9956 A UE
9257 PB 9357 PB 9457 UW 9757 M MU 9857 D HP 9957 A AS
9258 UW 9358 UW 9458 –– 9758 V UW 9858 D JK 9958 A DL
9259 UW 9359 UW 9459 –– 9759 V UW 9859 D AS 9959 A AP
9260 TN 9360 TN 9460 RJ 9760 A UW 9860 A MH 9960 A MH
9261 TN 9361 TN 9461 RJ 9761 V UW 9861 R OR 9961 I KL
9262 TN 9362 TN 9462 –– 9762 a MH 9862 A NE 9962 V CH
9263 TN 9363 TN 9463 PB 9763 I MH 9863 R NE 9963 A AP
9264 TN 9364 TN 9464 PB 9764 V MH 9864 R AS 9964 S KA
9265 TN 9365 TN 9465 –– 9765 V MH 9865 a TN 9965 a TN
9266 TN 9366 TN 9466 HR 9766 A MH 9866 A AP 9966 V AP
9267 TN 9367 TN 9467 –– 9767 I MH 9867 A MU 9967 A MU
9268 –– 9368 –– 9468 –– 9768 a MU 9868 M DL 9968 M DL
9269 –– 9369 –– 9469 JK 9769 V MU 9869 M MU 9969 M MU
9270 MH 9370 MH 9470 BR 9770 R MP 9870 B MU 9970 A MH
9271 MH 9371 MH 9471 –– 9771 A BR 9871 A DL 9971 A DL
9272 MH 9372 MH 9472 –– 9772 V RJ 9872 A PB 9972 A KA
9273 MH 9373 MH 9473 –– 9773 B MU 9873 V DL 9973 A BR
9274 GJ 9374 GJ 9474 WB 9774 E NE 9874 V KO 9974 A GJ
9275 GJ 9375 GJ 9475 –– 9775 V WB 9875 r RJ 9975 A MH
9276 GJ 9376 GJ 9476 –– 9776 E OR 9876 A PB 9976 a TN
9277 GJ 9377 GJ 9477 –– 9777 A OR 9877 F PB 9977 I MP
9278 –– 9378 –– 9478 –– 9778 R OR 9878 A PB 9978 V GJ
9279 –– 9379 –– 9479 –– 9779 A PB 9879 V GJ 9979 V GJ
9280 CH 9380 CH 9480 KA 9780 V PB 9880 A KA 9980 A KA
9281 CH 9381 CH 9481 –– 9781 S PB 9881 I MH 9981 A MP
9282 CH 9382 CH 9482 –– 9782 a RJ 9882 I HP 9982 V RJ
9283 CH 9383 CH 9483 –– 9783 V RJ 9883 R KO 9983 V RJ
9284 CH 9384 CH 9484 –– 9784 A RJ 9884 V CH 9984 V UE
9285 CH 9385 CH 9485 –– 9785 I RJ 9885 V AP 9985 V AP
9286 –– 9386 –– 9486 TN 9786 V TN 9886 V KA 9986 V KA
9287 KL 9387 KL 9487 –– 9787 V TN 9887 I RJ 9987 A MU
9288 KL 9388 KL 9488 –– 9788 a TN 9888 V PB 9988 V PB
9289 –– 9389 –– 9489 –– 9789 A TN 9889 I UE 9989 A AP
9290 AP 9390 AP 9490 AP 9790 A TN 9890 A MH 9990 I DL
9291 AP 9391 AP 9491 AP 9791 A TN 9891 I DL 9991 V HR
9292 AP 9392 AP 9492 –– 9792 V UE 9892 A MU 9992 I HR
9293 AP 9393 AP 9493 –– 9793 A UE 9893 A MP 9993 A MP
9294 AP 9394 AP 9494 –– 9794 A UE 9894 A TN 9994 A TN
9295 AP 9395 AP 9495 KL 9795 I UE 9895 A KL 9995 A KL
9296 AP 9396 AP 9496 –– 9796 E JK 9896 A HR 9996 A HR
9297 AP 9397 AP 9497 –– 9797 A JK 9897 A UW 9997 A UW
9298 AP 9398 AP 9498 –– 9798 R BR 9898 A GJ 9998 A GJ
9299 AP 9399 AP 9499 –– 9799 A RJ 9899 V DL 9999 V DL

The new 90 series..
9000 Airtel Andhra Pradesh
9001 Airtel Rajasthan
9002 Airtel West Bengal
9003 Airtel Tamil Nadu
9004 Airtel Mumbai
9005 Airtel UP(east)
9006 Airtel Bihar Jharkhand
9007 Airtel Kolkata
9008 Airtel Karnataka
9009 Idea Madhya Pradesh
9010 Idea Andhra Pradesh
9011 Idea Maharashtra
9012 Idea UP(west)
9013 Mtnl Dolphin Delhi
9014 Reliance Gsm Andhra Pradesh
9015 Reliance Gsm Delhi
9016 Reliance Gsm Gujarat
9017 Reliance Gsm Haryana
9018 Reliance Gsm Jammu Kashmir
9019 Reliance Gsm Karnataka
9020 Reliance Gsm Kerala
9021 Reliance Gsm Maharashtra
9022 Reliance Gsm Mumbai
9023 Reliance Gsm Punjab
9024 Reliance Gsm Rajasthan
9025 Reliance Gsm Tamil Nadu
9026 Reliance Gsm UP(east)
9027 Reliance Gsm UP(west)
9028 Tata Gsm Maharashtra
9029 Tata Gsm Mumbai
9030 Tata Gsm Andhra Pradesh
9031 Tata Gsm Bihar Jharkhand
9032 Tata Gsm Delhi
9033 Tata Gsm Gujarat
9034 Tata Gsm Haryana
9035 Tata Gsm Himachal Pradesh
9036 Tata Gsm Karnataka
9037 Tata Gsm Kerala
9038 Tata Gsm Kolkata
9039 Tata Gsm Madhya Pradesh
9040 Tata Gsm Orissa
9041 Tata Gsm Punjab
9042 Tata Gsm Rajasthan
9043 Tata Gsm Tamil Nadu
9044 Tata Gsm UP(east)
9045 Tata Gsm UP(west)
9046 Tata Gsm West Bengal
9047 Vodafone(Hutch) Tamil Nadu
9048 Vodafone(Hutch) Kerala
9049 Vodafone(Hutch) Maharashtra
9050 Vodafone(Hutch) Haryana
9051 Vodafone(Hutch) Kolkata
9052 Vodafone(Hutch) Andhra Pradesh
9053 Unitech Haryana
9054 Unitech Himachal Pradesh
9055 Unitech Jammu Kashmir
9056 Unitech Punjab
9057 Unitech Rajasthan
9058 Unitech UP(west)
9059 Unitech Andhra Pradesh
9060 Unitech Karnataka
9061 Unitech Kerala
9062 Unitech Kolkata
9063 Datacom Andhra Pradesh
9064 Datacom Assam
9065 Datacom Bihar Jharkhand
9066 Datacom Delhi
9067 Datacom Gujarat
9068 Datacom Haryana
9069 Datacom Himachal Pradesh
9070 Datacom Jammu Kashmir
9071 Datacom Karnataka
9072 Datacom Kerala
9073 Datacom Kolkata
9074 Datacom Madhya Pradesh
9075 Datacom Maharashtra
9076 Datacom Mumbai
9077 Datacom North East
9078 Datacom Orissa
9079 Datacom Rajasthan
9080 Datacom Tamil Nadu
9081 Datacom UP(east)
9082 Datacom UP(west)
9083 Datacom West Bengal
9084 Unitech Delhi
9085 Idea Assam
9086 Idea Jammu Kashmir
9087 Idea Karnataka
9088 Idea Kolkata
9089 Idea North East
9090 Idea Orissa
9091 Idea Punjab
9092 Idea Tamil Nadu
9093 Idea West Bengal
9094 Aircel Chennai
9095 Aircel Tamil Nadu
9096 Airtel Maharashtra
9097 Aircel Bihar Jharkhand
9098 Reliance Gsm Madhya Pradesh
9099 Vodafone(Hutch) Gujarat


Abbreviations

Telecom Circles
AP - ANDHRA PRADESH
AS - ASSAM
BR - BIHAR & JHARKHAND
CH - CHENNAI
DL - DELHI
GJ - GUJARAT
HP - HIMACHAL PRADESH
HR - HARYANA
JK - JAMMU & KASHMIR
KL - KERALA
KA - KARNATAKA
KO - KOLKATA
MH - MAHARASHTRA
MP - MP & CHHATTISGARH
MU - MUMBAI
NE - NORTH EAST
OR - ORISSA
PB - PUNJAB
RJ - RAJASTHAN
TN - TAMIL NADU
UE - UP (EAST)
UW - UP (WEST) & UTTARANCHAL
WB - WEST BENGAL & ANDAMAN

Cellular Operators
A - AIRTEL
a - AIRCEL LTD.
B - BPL MOBILE
D - DISHNET WIRELESS LTD.
E - ESSAR SPACETEL
F - HFCL CONNECT
V - VODAFONE (HUTCH)
I - IDEA
M - DOLPHIN (MTNL)
R - RELIANCE TELECOM PVT. LTD.
r - RAINBOW (SHYAM) (CDMA)
RIM - RELIANCE INDIA (CDMA)
S - SPICE COMMUNICATIONS
CELLONE- BSNL
TATA INDICOM- TATA