Dynamic URLs vs. Static URLs (SEO Friendly URLs)

Article Source : http://googlewebmastercentral.blogspot.com/

Which can Googlebot read better, static or dynamic URLs?

We’ve come across many webmasters who believed that static or static-looking URLs were an advantage for indexing and ranking their sites. This belief rests on the presumption that search engines have issues with crawling and analyzing URLs that include session IDs or source trackers. In fact, we at Google have made significant progress in both areas. While static URLs might have a slight advantage in clickthrough rates because users can easily read them, the decision to use a database-driven website does not imply a significant disadvantage in indexing and ranking. Providing search engines with dynamic URLs should be favored over hiding parameters to make them look static.

Myth: “Dynamic URLs cannot be crawled.”
Fact: We can crawl dynamic URLs and interpret the different parameters. We might have problems crawling and ranking your dynamic URLs if you try to make them look static and, in the process, hide parameters which offer Googlebot valuable information. One recommendation is to avoid reformatting a dynamic URL to make it look static. It’s always advisable to use static content with static URLs where possible, but in cases where you decide to use dynamic content, you should give us the chance to analyze your URL structure and not remove information by hiding parameters and making them look static.

Myth: “Dynamic URLs are okay if you use fewer than three parameters.”
Fact: There is no limit on the number of parameters, but a good rule of thumb is to keep your URLs short (this applies to all URLs, whether static or dynamic). You may be able to remove some parameters which aren’t essential for Googlebot and offer your users a nice-looking dynamic URL. If you are not able to figure out which parameters to remove, we’d advise you to serve us all the parameters in your dynamic URL and our system will figure out which ones do not matter. Hiding your parameters keeps us from analyzing your URLs properly, and we won’t be able to recognize the parameters as such, which could cause a loss of valuable information.

Following are some questions we thought you might have at this point.

Does that mean I should avoid rewriting dynamic URLs at all?
That’s our recommendation, unless your rewrites are limited to removing unnecessary parameters, or you are very diligent in removing all parameters that could cause problems. If you transform your dynamic URL to make it look static you should be aware that we might not be able to interpret the information correctly in all cases. If you want to serve a static equivalent of your site, you might want to consider transforming the underlying content by serving a replacement which is truly static. One example would be to generate files for all the paths and make them accessible somewhere on your site. However, if you’re using URL rewriting (rather than making a copy of the content) to produce static-looking URLs from a dynamic site, you could be doing harm rather than good. Feel free to serve us your standard dynamic URL and we will automatically find the parameters which are unnecessary.

Can you give me an example?
If you have a dynamic URL in the standard format, like foo?key1=value1&key2=value2, we recommend that you leave the URL unchanged, and Google will determine which parameters can be removed; or you could remove unnecessary parameters for your users. Be careful that you only remove parameters which do not matter. Here’s an example of a URL with a couple of parameters:

  • www.example.com/article/bin/answer.foo?language=en&answer=3&sid=98971298178906&query=URL

  • language=en – indicates the language of the article
  • answer=3 – the article has the number 3
  • sid=98971298178906 – the session ID number is 98971298178906
  • query=URL – the query with which the article was found is [URL]

Not all of these parameters offer additional information. So rewriting the URL to www.example.com/article/bin/answer.foo?language=en&answer=3 probably would not cause any problems as all irrelevant parameters are removed.

The following are some examples of static-looking URLs which may cause more crawling problems than serving the dynamic URL without rewriting:

  • www.example.com/article/bin/answer.foo/en/3/98971298178906/URL
  • www.example.com/article/bin/answer.foo/language=en/answer=3/
  • www.example.com/article/bin/answer.foo/language/en/answer/3/
  • www.example.com/article/bin/answer.foo/en,3,98971298178906,URL

Rewriting your dynamic URL to one of these examples could cause us to crawl the same piece of content needlessly via many different URLs with varying values for session IDs (sid) and query. These formats make it difficult for us to understand that “URL” and “98971298178906” have nothing to do with the actual content which is returned via this URL. However, here’s an example of a rewrite where all irrelevant parameters have been removed:

  • www.example.com/article/bin/answer.foo/en/3

Although we are able to process this URL correctly, we would still discourage you from using this rewrite as it is hard to maintain and needs to be updated as soon as a new parameter is added to the original dynamic URL. Failure to do this would again result in a static-looking URL which is hiding parameters. So the best solution is often to keep your dynamic URLs as they are. Or, if you remove irrelevant parameters, bear in mind to leave the URL dynamic, as the following example of a rewritten URL shows:

  • www.example.com/article/bin/answer.foo?language=en&answer=3
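As a rough sketch of this kind of cleanup, the snippet below strips the session-ID and tracking parameters from the example URL while leaving the URL dynamic. The set of "irrelevant" parameters here is an assumption taken from the example above; which parameters are actually safe to drop depends on your own application.

```python
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

# Assumed irrelevant for this illustration (sid and query, as in the
# example above); your own application may differ.
IRRELEVANT = {"sid", "query"}

def strip_irrelevant_params(url):
    """Drop session/tracking parameters but keep the URL dynamic."""
    parts = urlparse(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in IRRELEVANT]
    return urlunparse(parts._replace(query=urlencode(kept)))

url = ("http://www.example.com/article/bin/answer.foo"
       "?language=en&answer=3&sid=98971298178906&query=URL")
print(strip_irrelevant_params(url))
# http://www.example.com/article/bin/answer.foo?language=en&answer=3
```

Note that the result still uses the standard key=value query format, so the remaining parameters stay visible to crawlers.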

For Best SEO Services Please Visit SEOServicesDelhi.com

Posted in Uncategorized

New SEO Project Limo321.com !

orlando limo

Orlando limo Service

BACKSTAGE LIMOUSINE ORLANDO offers limo hire in Florida. If you live in Florida and want a limo, contact us and we will be pleased to give you a limousine quote for your occasion. We are an Orlando limo hire company providing limo hire services in the Orlando area. We offer limousine hire in Orlando at very competitive prices and will cater to whatever your occasion requires. Our franchised network ensures that you receive the very best: a local service, high standards, ethical business practices, and modern vehicles. We are able to provide a special personal touch and attention to your personal requests.

Visit site : http://www.limo321.com

Posted in Uncategorized

Adding a site to Google!

Article Source Google Webmaster Tools

Inclusion in Google’s search results is free and easy; you don’t even need to submit your site to Google. Google is a fully automated search engine that uses software known as “spiders” to crawl the web on a regular basis and find sites to add to our index. In fact, the vast majority of sites listed in our results aren’t manually submitted for inclusion, but found and added automatically when our spiders crawl the web.

To determine whether your site is currently included in Google’s index, just perform a search for your site’s URL. For example, a search for [ site:www.google.com ] returns the following results: http://www.google.com/search?hl=en&q=site%3Awww.google.com+
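As a small illustrative sketch, the Python snippet below builds that kind of search URL for an arbitrary domain; the base URL mirrors the example result above, and the function name is our own.

```python
from urllib.parse import quote_plus

def site_query_url(domain):
    """Build the Google search URL for a [site:domain] check."""
    # quote_plus percent-encodes the colon ("site:" -> "site%3A")
    return "http://www.google.com/search?hl=en&q=" + quote_plus("site:" + domain)

print(site_query_url("www.google.com"))
# http://www.google.com/search?hl=en&q=site%3Awww.google.com
```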

Although Google crawls billions of pages, it’s inevitable that some sites will be missed. When our spiders miss a site, it’s frequently for one of the following reasons:

  • The site isn’t well connected through multiple links to other sites on the web.
  • The site launched after Google’s most recent crawl was completed.
  • The design of the site makes it difficult for Google to effectively crawl its content.
  • The site was temporarily unavailable when we tried to crawl it or we received an error when we tried to crawl it. You can use Google Webmaster Tools to see if we received errors when trying to crawl your site.

Our intent is to represent the content of the internet fairly and accurately. To help make this goal a reality, we offer guidelines as well as tips for building a crawler-friendly site. While there’s no guarantee that our spiders will find a particular site, following these guidelines should increase your site’s chances of showing up in our search results.

Consider creating and submitting a detailed Sitemap of your pages. Sitemaps are an easy way for you to submit all your URLs to the Google index and get detailed reports about the visibility of your pages on Google. With Sitemaps, you can automatically keep us informed of all of your current pages and any updates you make to those pages. Please note that submitting a Sitemap doesn’t guarantee that all pages of your site will be crawled or included in our search results.
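As an illustration of what a Sitemap submission contains, here is a minimal Python sketch that emits a Sitemap file in the standard sitemaps.org 0.9 format. The URL and lastmod date are placeholders; note that ampersands in dynamic URLs must be entity-escaped inside the XML.

```python
from xml.sax.saxutils import escape

def make_sitemap(urls, lastmod="2008-09-22"):
    """Emit a minimal Sitemap in the sitemaps.org 0.9 format."""
    lines = ['<?xml version="1.0" encoding="UTF-8"?>',
             '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
    for loc in urls:
        lines += ["  <url>",
                  "    <loc>%s</loc>" % escape(loc),  # '&' becomes '&amp;'
                  "    <lastmod>%s</lastmod>" % lastmod,
                  "  </url>"]
    lines.append("</urlset>")
    return "\n".join(lines)

print(make_sitemap(["http://www.example.com/article/bin/answer.foo?language=en&answer=3"]))
```

The resulting file would then be submitted through Google Webmaster Tools.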

For Search engine Submission Service visit http://www.seoservicesdelhi.com/

Posted in Uncategorized

SEO Services Delhi offers Affordable SEO in India

WebTech Search Marketing, founded in mid-2006, is now one of the leading SEO services companies in India. Based in Delhi, it provides all types of SEO services and solutions for every need. We provide SEO services at very low prices and also offer a guaranteed SEO service in Delhi.

Our main areas of service are:

  • Organic SEO
  • Link Building
  • PPC management
  • Search engine Submission
  • Directory Submission
  • Social Media bookmarking
  • Viral Marketing

Our SEO services start at just $399 for six months.

To learn more, request an SEO quote, or get a free website analysis, contact us at info@seoservicesdelhi.com or visit SEOServicesDelhi.com

Posted in Uncategorized

Google invalid click policy, combating invalid clicks fraud !

Source Google Inside Adsense Blog

Invalid clicks are clicks generated by prohibited methods. Examples of invalid clicks may include repeated manual clicking or the use of robots, automated clicking tools, or other deceptive software. Invalid clicks are sometimes intended to artificially and/or maliciously drive up an advertiser’s clicks and/or a publisher’s earnings. Common scenarios include:

  • Manual clicks intended to increase your advertising costs or to increase profits for website owners hosting your ads.
  • Clicks by automated tools, robots, or other deceptive software.

We closely monitor these and other scenarios to help protect advertisers from receiving invalid clicks.

Invalid clicks are clicks for which we decide not to charge our AdWords advertisers, since they may artificially drive up advertiser cost or publisher revenue. These include extraneous clicks without any value to the advertiser, such as the second click of a double-click. They also include many other types of clicks that we’ve determined aren’t motivated by genuine user interest.

“Invalid clicks” are often confused with “clicking on your own ads”. However, we’d like to stress that invalid clicks are generally any clicks that artificially inflate advertiser cost or publisher revenue, regardless of their source.

Click fraud is a subset of invalid clicks that are generated with malicious or fraudulent intent — in other words, clicks that are intended to drive up advertiser cost or publisher revenue artificially. Sources for these clicks include, but are not limited to:

  • A publisher clicking on his own ads, or encouraging clicks on his ads
  • Users or family members clicking to support the site / publisher
  • Third-party programs with user incentives, such as paid-to-click services and click-exchanges
  • Automated clicking tools, robots, or other deceptive software

The same principles above apply to ad impressions and conversions as well. Some sources of invalid impressions include, but are not limited to:

  • Excessive page refreshes, generated either manually or automatically
  • Third-party programs with user incentives, such as paid-to-surf or auto-surf programs
  • Third-party programs for purchasing fixed amounts of traffic, e.g. “$10 for 1,000 page views”

As a reminder, any method that artificially generates clicks, impressions, or conversions is strictly prohibited by our program policies. You can also find more information about these topics in our Invalid Clicks FAQ and our Ad Traffic Quality Resource Center.

The security of Google AdWords advertisers is important to Google. Our proprietary technology analyzes clicks and impressions in an attempt to determine whether they fit a pattern of use that may artificially drive up an advertiser’s clicks.

The goals of our system are to automatically identify clicks generated by unethical users and automated robots, and to filter out these clicks before they ever reach your reports. However, if we believe you’ve been charged for invalid clicks in the past two months, we’ll apply a credit to your account.

Google has three powerful tools for protecting AdWords advertisers from invalid clicks:

Detection and filtering techniques: Each click on an AdWords ad is examined by our system. Google looks at numerous data points for each click, including the IP address, the time of the click, any duplicate clicks, and various other click patterns. Our system then analyzes these factors to try to isolate and filter out potentially invalid clicks.
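To illustrate the general idea of duplicate-click filtering, here is a toy Python sketch only: it is not Google’s actual system, and the two-second window, the data layout, and the function name are all our own assumptions. It discards repeat clicks from the same IP on the same ad within a short window (e.g. the second click of a double-click).

```python
# Arbitrary assumption: clicks closer together than this are duplicates.
WINDOW_SECONDS = 2.0

def filter_clicks(clicks):
    """clicks: list of (ip, ad_id, timestamp) tuples, sorted by timestamp.
    Returns only the clicks treated as valid."""
    last_seen = {}
    valid = []
    for ip, ad_id, ts in clicks:
        key = (ip, ad_id)
        if key in last_seen and ts - last_seen[key] < WINDOW_SECONDS:
            continue  # discard as an invalid duplicate click
        last_seen[key] = ts
        valid.append((ip, ad_id, ts))
    return valid

clicks = [("1.2.3.4", "ad1", 0.0), ("1.2.3.4", "ad1", 0.4),  # double-click
          ("1.2.3.4", "ad1", 10.0), ("5.6.7.8", "ad1", 0.5)]
print(len(filter_clicks(clicks)))  # 3
```

A production system would of course weigh many more signals than IP and timing, as the paragraph above notes.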

Advanced monitoring techniques: Google uses a number of unique and innovative techniques for managing invalid click activity. We can’t disclose details about the software, except to say that we’re constantly working to expand and improve our technology.

The Google Team: In addition to our automated click protection techniques, we have a team that uses specialized tools and techniques to examine individual instances of invalid clicks. When our system detects potentially invalid clicks, a member of this team examines the affected account to glean important data about the source of the potentially invalid clicks.

Contact us for Affordable PPC Management Services in India : SEOServicesDelhi.com

Posted in Uncategorized