Friday, October 26, 2007

PageRank update in all Google data centers

Over the last couple of days I noticed different PageRank values at different Google data centers, and this morning, when I reached the office and checked my websites as usual, I saw a PR update on all my new domains. My blog got PR 2. – 0 to 2 – 0 to 2 – 0 to 1 – 2 to 3

I visited many websites today and found that this PR update was not kind to previously well-ranked sites. I checked the JupiterOnlineMedia network websites and other well-ranked sites, and most of them saw their PR go down.

Google recently changed its algorithm and has become very strict about paid linking. In Google Webmaster Tools they introduced a paid-links report form where you can report any website that is selling or buying links. Paid linking is mostly used to manipulate search engine rankings. I visited one site that is also selling links and found that its PageRank dropped from 7 to 4. This shows Google is very strict about paid linking, and in the future Google may penalize websites involved in it.

Monday, October 15, 2007

Improve website ranking with Meta Description

Hi All,

Major search engines give importance to the meta description tag. Google likes to display the meta description in search results because it believes the description gives a good idea of the page's content. While creating a meta description for your web pages, make sure it is not just a long string of keywords, because that can be counted as spam. A well-written meta description will help improve your click-through rate.

How to create a good meta description for your web pages?
If you want to optimize each and every page of your site, it is necessary to create a unique meta description for every page. Write an accurate description that clearly describes each page. You can give priority to important pages, such as your home page, and write their descriptions by hand, while for less important pages you can use programmatically generated meta descriptions. It is quite difficult to create a unique description tag for every page of a database-driven website; in that case you need to generate the tags programmatically, but make sure your descriptions do not violate the search engines' webmaster guidelines. Dynamically generated descriptions must not be spammy, and they must be easy to read and easy to understand.
Always write clear facts in your descriptions and make them genuinely descriptive.
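To make the programmatic approach concrete, here is a minimal sketch of generating unique meta descriptions from database records. The product names, summaries, and the 155-character limit are my own illustrative assumptions, not part of any particular site.

```python
# Sketch: build a unique, readable meta description per page from
# database fields, trimmed so it is not a long keyword string.
# Product data and field choices here are hypothetical examples.

def make_meta_description(name, summary, max_len=155):
    """Combine page-specific fields and trim at a word boundary."""
    text = f"{name}: {summary}"
    if len(text) > max_len:
        # Cut at the last full word so the snippet stays readable.
        text = text[:max_len].rsplit(" ", 1)[0].rstrip(",;") + "..."
    return text

products = [
    ("Blue Widget", "A lightweight widget for home and office use, available in three sizes."),
    ("Red Gadget", "A durable gadget with a two-year warranty and free shipping."),
]

for name, summary in products:
    desc = make_meta_description(name, summary)
    print(f'<meta name="description" content="{desc}">')
```

Because each description is built from that page's own fields, every page gets a distinct, human-readable tag instead of one site-wide string.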

Bhavesh Goswami.

Thursday, October 4, 2007

Duplicate Content caused by URL Parameters

It is very important to find out how search engines detect duplicate content on your site. Keeping a website clean and effective is easy when it has 5 to 50 pages, but this discussion really matters for large database-driven websites. If your website is product-based, it may have thousands of pages. Webmasters and site owners need to pay attention to page duplication, and in most cases duplication starts with URL parameters such as session IDs or tracking IDs.

How do tracking IDs and session IDs cause duplication?
Tracking IDs and session IDs are used to carry and store user information, and this is the main source of duplication. When user information is passed through a session ID in the URL, the same content appears at many different URLs, creating near-identical pages for different users.

How will it affect search engine ranking?

Search engines try to crawl as many unique pages as possible, but with session IDs or tracking IDs the same page is generated many times, and this becomes a major source of duplicate content.
Also, URLs with session IDs look user-unfriendly, which decreases the chances of users selecting your listing.

How to protect your website from the duplicate URL problem?
Here are some hints for protecting your website from duplication. You can ask your developer to rewrite URLs using .htaccess or another good URL-rewriting tool. Try to keep URLs as short as possible. If your website has many copies of the same page with different session IDs, a permanent (301) redirect to the canonical URL will help you a lot.
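The core of the fix is mapping every session-ID variant of a page back to one canonical URL. Here is a minimal Python sketch of that mapping; the parameter names (sessionid, sid, trackingid) and the example URLs are assumptions for illustration, not from any real site.

```python
# Sketch: collapse session/tracking parameters so every variant of a
# page maps to one canonical URL. Parameter names are illustrative;
# use whatever your application actually appends.
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

TRACKING_PARAMS = {"sessionid", "sid", "trackingid"}

def canonical_url(url):
    """Drop known session/tracking parameters and keep the rest."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query)
            if k.lower() not in TRACKING_PARAMS]
    return urlunsplit((parts.scheme, parts.netloc, parts.path,
                       urlencode(kept), ""))

# Two "different" URLs for the same product page...
a = canonical_url("http://example.com/product.php?id=42&sessionid=abc123")
b = canonical_url("http://example.com/product.php?id=42&sessionid=xyz789")
print(a == b)  # both collapse to http://example.com/product.php?id=42
```

In practice you would enforce this mapping with a 301 permanent redirect from the session-ID URL to the canonical one, so search engines consolidate everything onto a single address.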

How do I know if Google finds duplicate pages on my website?
Day by day, Google and other major search engines try to give more accurate results and also try to help webmasters optimize their websites. Google is ahead in this race; they have clearly stated:
1. When we detect duplicate content, such as through variations caused by URL parameters, we group the duplicate URLs into one cluster.

2. We select what we think is the "best" URL to represent the cluster in search results.

3. We then consolidate properties of the URLs in the cluster, such as link popularity, to the representative URL.

Bhavesh Goswami,
SEO & Web Promotion Expert.