Howdy...
Posted 17th January 2012 at 07:00 PM by fractal0107
For a long time Google's algorithms have remained a mystery to us all. Well, not exactly all of us, since Larry Page, Sergey Brin, and a handful of techies at Google know the 'secrets'. What are these algorithms? What do they do? How do they work?
To understand the importance of Google's algorithms, you must first understand what happens when a term is entered into Google's search engine. Google's robots (crawlers) constantly scan the web and enter the pages they find into an index. When you search for a term, the algorithms analyze that index and sort the matching pages: the page with the most relevant information is ranked first, and subsequent pages are ranked according to the relevancy of the information they contain. All this, as you know, happens in a fraction of a second.
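To make that index-then-rank flow concrete, here is a toy sketch in Python. It is emphatically not Google's algorithm; the page contents and the single-term scoring are made up purely for illustration.

```python
# A toy sketch of the flow described above: build an index ahead of time,
# then score and sort pages at query time. NOT Google's algorithm.
from collections import Counter

# Hypothetical crawled pages (in reality, the crawlers fill this index).
pages = {
    "example.com/furniture": "buy a couch sofa and other furniture for your living room",
    "example.com/games":     "latest video game reviews and game news",
    "example.com/sofas":     "couch and sofa sale quality furniture couch",
}

# The index is built once, before any query arrives.
index = {url: Counter(text.split()) for url, text in pages.items()}

def search(term):
    """Rank indexed pages by how often they mention the term."""
    scores = {url: counts[term] for url, counts in index.items() if counts[term] > 0}
    return sorted(scores, key=scores.get, reverse=True)

print(search("couch"))   # ['example.com/sofas', 'example.com/furniture']
```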
As you can imagine, Google's algorithms do an unfathomably massive job of analyzing millions of web pages and ranking them according to the information they contain. Considering that Google gets several hundred million queries every day, you can imagine the kind of work put into these algorithms.
Apart from the developers, no one really knows how Google's algorithms work. It is, however, well known that the search engine uses a number of contextual signals (more than 200) to rank web pages that contain relevant information. These signals include anchor text, web page title, location, external link popularity, diversity of link sources, and many, many more. Using these signals, the algorithms are able to rank web pages in the right order.
Google changes its algorithms continuously, making sure people receive the most relevant results when they search the web for information. In 2007 alone, Google tweaked its algorithms more than 400 times.
Interestingly, there is absolutely no human involvement with respect to the results generated by Google's algorithms. If a Google query for some reason returned irrelevant items, the results could not be manually reordered. Google's engineers would have to analyze the algorithms, identify the weaknesses causing the problem, and find a solution. This explains why the developers spend so much time tweaking the algorithms: it helps to ensure that we receive the best results on a consistent basis.
Someone once described Google's algorithms as 'a work in progress'. It is possibly the best description, as these algorithms are dynamic in every sense of the word. As you are reading this article, a developer at Google is probably tweaking an algorithm. This goes a long way toward explaining why Google's competitors have such a tough time catching up.
By nature, lotteries are set up to be random. That is, each number has an equal chance of being drawn. Yet there are many people who believe that there are lottery algorithms that can help you win the big jackpot. Is this for real? Can a lottery algorithm help you win a jackpot that is purportedly random?
First off, the definition of an algorithm is "a set of instructions for solving a problem." In this case, the problem to be solved is how to win the lotto jackpot. If there is an algorithm for this, then that same algorithm should apply to other random events, such as a coin toss. You know that if you flip a coin, there is a 50-50 chance as to which side it will land on. So, said algorithm should be able to predict how many times that coin would land on heads if, say, it were flipped 100 times. But that sounds preposterous, doesn't it? So why should we believe it when it comes to lotteries?
There are many theories as to how to predict future lottery numbers. The most popular are the hot-number theory and the cold-number theory. I won't explain the gist of the two theories, but I can assure you that they are just the way they sound. Think back to the coin toss. Would you bet that in the future the coin would land on heads a lot more times than tails just because heads have been hot in the past few tosses? No, of course not. Why? Because coin tosses are random, just as lottery balls coming out of the machine are random. No algorithm can predict future lottery numbers.
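If you want to see the coin-toss argument play out, the short simulation below (Python standard library only) checks what happens on the flip immediately after a "hot" streak of three heads; the answer stays stubbornly around 50%.

```python
# Sanity-checking the argument above: a streak of heads tells you nothing
# about the next flip. Standard library only; the exact number will vary.
import random

random.seed(1)
flips = [random.choice("HT") for _ in range(100_000)]

after_streak = [flips[i] for i in range(3, len(flips))
                if flips[i - 3:i] == ["H", "H", "H"]]

heads_rate = after_streak.count("H") / len(after_streak)
print(f"P(heads | three heads just occurred) ~ {heads_rate:.3f}")  # about 0.5
```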
Encryption algorithms are commonly used in computer communications, including FTP transfers. Usually they are used to provide secure transfers. If an algorithm is used in a transfer, the file is first translated into seemingly meaningless ciphertext and then transferred in this form; the receiving computer uses a key to translate the ciphertext back into its original form. So if the message or file is intercepted before it reaches the receiving computer, it is in an unusable (encrypted) form.
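As a minimal sketch of that encrypt-before-transfer idea, the snippet below uses the third-party Python `cryptography` package (an assumption on my part, installed with `pip install cryptography`); its Fernet recipe is AES-based and stands in here for "an encryption algorithm" in general.

```python
# Minimal sketch of encrypting a file before sending it and decrypting it on
# the receiving side. Assumes the `cryptography` package is installed.
from cryptography.fernet import Fernet

key = Fernet.generate_key()          # shared secret: both sides need this key
cipher = Fernet(key)

plaintext = b"contents of the file to transfer..."
token = cipher.encrypt(plaintext)    # seemingly meaningless ciphertext to send

# If intercepted in transit, `token` is unusable without the key.
# The receiving computer uses the same key to restore the original form:
assert cipher.decrypt(token) == plaintext
```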
Here are some commonly used algorithms:
DES/3DES or TripleDES
This is the Data Encryption Standard, an encryption algorithm first adopted by the U.S. Government in the late 1970s. It is commonly used in ATMs (to encrypt PINs) and in UNIX password encryption. Triple DES, or 3DES, has replaced the original as a more secure method of encryption: it applies the DES cipher three times, using a different key for at least one of the passes.
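Here is one way 3DES might be exercised from Python, using the third-party pycryptodome package (an assumption; `pip install pycryptodome`); the retry loop follows that library's documented pattern for generating a valid Triple DES key.

```python
# A hedged 3DES sketch with pycryptodome. adjust_key_parity() rejects keys
# that degenerate to single DES, so we retry until we get a proper TDES key.
from Crypto.Cipher import DES3
from Crypto.Random import get_random_bytes

while True:
    try:
        key = DES3.adjust_key_parity(get_random_bytes(24))  # three 8-byte keys
        break
    except ValueError:
        pass                                # degenerate key, draw again

cipher = DES3.new(key, DES3.MODE_CFB)
ciphertext = cipher.encrypt(b"1234")        # e.g. a PIN, as in the ATM example
print(cipher.iv.hex(), ciphertext.hex())    # the IV travels with the ciphertext
```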
Blowfish
Blowfish is a symmetric block cipher that is unpatented and free to use. It was developed by Bruce Schneier and introduced in 1993.
AES
The Advanced Encryption Standard uses the Rijndael block cipher, approved by the National Institute of Standards and Technology (NIST). Rijndael was designed by cryptographers Joan Daemen and Vincent Rijmen; it was selected in 2000 and subsequently replaced DES as the U.S. Government encryption standard.
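The `cryptography` package mentioned earlier also exposes AES directly; the sketch below uses its AES-GCM recipe (authenticated encryption), with a freshly generated 256-bit key and a 96-bit nonce, both illustrative choices.

```python
# A minimal AES example via the `cryptography` package's AES-GCM recipe.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)   # AES-256 key
aesgcm = AESGCM(key)
nonce = os.urandom(12)                      # 96-bit nonce; never reuse with a key

ciphertext = aesgcm.encrypt(nonce, b"attack at dawn", None)
assert aesgcm.decrypt(nonce, ciphertext, None) == b"attack at dawn"
```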
Twofish
Twofish is a block cipher designed by Counterpane Labs. It was one of the five Advanced Encryption Standard (AES) finalists and is unpatented and open source.
IDEA
This encryption algorithm was used in Pretty Good Privacy (PGP) Version 2 and is an optional algorithm in OpenPGP. IDEA features 64-bit blocks with a 128-bit key.
MD5
MD5 was developed by Professor Ronald Rivest and is used to create digital signatures. It is a one-way hash function intended for 32-bit machines, and it succeeded the MD4 algorithm.
SHA-1
SHA-1 is a hashing algorithm similar to MD5, yet SHA-1 may replace MD5 since it offers more security.
HMAC
HMAC is a keyed-hash message authentication code: it combines a secret key with an underlying hash function such as MD5 or SHA-1, and the resulting combinations are referred to as HMAC-MD5 and HMAC-SHA1.
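The three hash-related entries above (MD5, SHA-1, and HMAC) can all be demonstrated with Python's standard library alone; the message and key below are placeholders.

```python
# MD5, SHA-1, and HMAC-SHA1 with the standard library only (hashlib, hmac).
import hashlib
import hmac

message = b"transfer me securely"

print(hashlib.md5(message).hexdigest())    # MD5: 128-bit one-way digest
print(hashlib.sha1(message).hexdigest())   # SHA-1: 160-bit digest, similar role

secret = b"shared-key"                     # HMAC mixes a secret key into the hash
print(hmac.new(secret, message, hashlib.sha1).hexdigest())   # HMAC-SHA1
```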
RSA Security
RC4 - RC4 is a variable key-size stream cipher based on the use of a random permutation.
RC5 - This is a parameterized algorithm with a variable block size, key size, and number of rounds.
RC6 - This is an evolution of RC5; it is also a parameterized algorithm with a variable block size, key size, and number of rounds. It adds integer multiplication and uses four working registers.
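Of the three RSA ciphers above, RC4 is the easiest to show in code; pycryptodome ships it as ARC4 (again an assumption about the installed library). It appears here purely for illustration, since RC4 is considered broken today.

```python
# RC4 (ARC4) stream cipher sketch with pycryptodome -- illustration only.
from Crypto.Cipher import ARC4
from Crypto.Random import get_random_bytes

key = get_random_bytes(16)                      # variable key size, per the description
ciphertext = ARC4.new(key).encrypt(b"stream cipher demo")
plaintext = ARC4.new(key).decrypt(ciphertext)   # same keystream restores the data
assert plaintext == b"stream cipher demo"
```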
This article will go into much more depth on different tactics to use to further optimize your site for higher rankings in Google's search engine.
Types of Optimization
As explained in the previous article, there are myriad things that need to be done in order to properly and completely optimize your site to take advantage of Google's algorithm. Here is a quick list of the things we will be learning to optimize, each of which will be explained in further depth shortly.
Things we will be going over in this article will include: Text Links and Optimization, Content Optimization, Domain Registration, Whois Information, Click Through Rates, PR (Page Rank), Traffic, Frequency of Updates, and IP address.
Text Links and Optimization
Although link building was briefly explained in the first article, new ways of getting good links appear almost daily, and a deeper knowledge of getting links to your site from related ones is always important. As said before, try to stay away from doorway pages and portals, as well as FFA (free-for-all) link pages, as these will hurt your rankings on the major search engines more than help them. Be careful who you request links from when a site is not within your specific category; at the very least, make sure it has a section for your general area on its links page, such as a "web services" section if you are a web hosting company.
Links from completely unrelated sites on a page with links from all different categories from all over the internet will not benefit your site much, if at all. This being said, don't worry if a site adds a link to your own without a request simply because they like your site or services or product. The added traffic is always a plus, and if you don't already have a lot of incoming links, every little bit helps.
Content Optimization
Be sure your content is all related to the subject your site is based on. Don't have articles on the latest video games if you sell furniture, as the keywords are much less likely to get picked up, and most people looking to buy furniture online don't want to be bogged down searching through hundreds of pages of video game reviews while trying to find a couch that will match their living room set.
Free content sites are always helpful: although the content is duplicate, the added articles and guides will attract more visitors, raising your rating through the quality of your traffic. Be sure to always include an article exactly as you received it, and at the very least include the actual author's name and a link to their site. Stay away from tactics such as simply copying another site's front page for an article, even if you plan to credit the author and link to where you got the information. This can be seen not only as duplicate content, but also as a technique used by spammers to build a huge collection of "content" to boost their rankings. Once the same information has been seen enough times by the Google spiders, it is considered "duplicate content" and disregarded.
The most important thing to remember when trying to optimize your content is that writing your own articles and information will provide your site with free, unique content. This attracts browsers and at the same time impresses the Google spiders. Having unique content will even make other webmasters much more apt to link to your site as an "authority" in the field.
Domain Registration
Another thing checked by the Google algorithm is how long your domain is registered for. Many spam sites register domain names for only one year, leaving each one to die so they can move on to an untainted domain name and attempt to spam their way into the top rankings on Google. Keeping your domain registered for five or more years, although it may cost a bit more initially, will help you considerably in Google's search rankings.
Whois Information
Your Whois information, as stated in the previous article, is also taken into account when calculating your rankings on the search engine. Is your physical address a real place, or was it falsified? Is your contact information up to date, including your phone number and mailing address? Is your name associated with spam sites from the past? All this information is recorded and worked into the algorithm. Just be sure to have your actual information used when registering a domain name, and make sure your contact information is accurate. Another tip would be to use your full name when registering a domain name, including middle name as well as any titles, such as Junior or the Third, as this will help to prevent confusion between yourself and other webmasters.
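A quick way to see what your Whois record actually says is to query it yourself. The sketch below shells out to the standard `whois` command-line tool (assumed to be installed, as on most Unix-like systems) and prints the registration dates and registrant lines; field names vary by registry.

```python
# Print the registration dates and registrant lines from a domain's Whois
# record, using the system `whois` tool. Field names vary by registry.
import subprocess

def whois_summary(domain):
    result = subprocess.run(["whois", domain], capture_output=True, text=True)
    interesting = ("Creation Date", "Registry Expiry Date", "Registrant")
    for line in result.stdout.splitlines():
        if any(field in line for field in interesting):
            print(line.strip())

whois_summary("example.com")
```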
Click Through Rates (CTR)
Click-through rates on your site are now observed by Google as well. When your site appears in the Google search results, how often is it clicked instead of the other nine sites on the same page? This is tracked with many things in mind, including seasonal click-through rates and current trends among searchers. In this system, if you own an online ski shop, your site will obviously rank higher on certain keywords during the winter than during the summer. Likewise, if you promote a beach resort online, your site is more likely to rank higher during the summer months and spring break than during the winter or fall.
PR (Google's Page Rank)
PR, or Google's PageRank system, is a way of tracking how popular your site is. It is determined primarily by the number and quality of related sites linking to your own; traffic helps indirectly by attracting those links. The complexities of this system have not been fully uncovered, but the main thing to know is that the more related links and the more high-quality traffic you have coming to your site, the higher your PR will be. With links from high-PR sites, your own PR will skyrocket, bringing you ever higher in Google's search results.
The algorithm also takes your PR into account when working out rankings. The higher the PR your site has (on all of its pages), and the longer your site holds a high PR, the more likely you are to reach the top rankings with Google. Even with a PageRank of 6 (out of 10), the results can be astounding, and your ranking within the search results will continue to grow.
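The published PageRank idea is an iterative calculation over the link graph: a page is important if important pages link to it. The sketch below is the textbook power-iteration version on a tiny made-up site graph, not Google's production system.

```python
# Textbook-style PageRank sketch: power iteration over a tiny, made-up graph.
links = {                      # hypothetical site graph: page -> pages it links to
    "home": ["products", "blog"],
    "products": ["home"],
    "blog": ["home", "products"],
}
damping = 0.85
rank = {page: 1 / len(links) for page in links}

for _ in range(50):            # iterate until the scores settle
    new_rank = {}
    for page in links:
        incoming = sum(rank[p] / len(out) for p, out in links.items() if page in out)
        new_rank[page] = (1 - damping) / len(links) + damping * incoming
    rank = new_rank

for page, score in sorted(rank.items(), key=lambda kv: -kv[1]):
    print(f"{page:10s} {score:.3f}")   # "home" ends up with the highest score
```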
Traffic
Traffic to and from your site is also monitored, including how long your visitors stay. This allows Google to gauge the amount of "real" traffic you get in any given month, as well as the type of visitors you receive and the overall popularity of your site. For younger sites that only get a few hits a day at most, this will not affect your rankings as much as it does the larger sites with hundreds or thousands of visitors a day. Instead of focusing on the amount of traffic you get to your site, focus on the quality of the traffic you receive. With more visitors who stay longer, actually view your site, and purchase your product or sign up as users, you should be able to get better rankings than some of the sites that have three times the volume of traffic but one tenth the time spent by that traffic on your site.
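To put a number on that last comparison, here is a back-of-the-envelope calculation with made-up figures: site B has three times the visitors but a tenth of the time-on-site, and still comes out behind on total engagement.

```python
# Hypothetical figures illustrating "quality over quantity" of traffic.
site_a = {"daily_visitors": 300, "avg_minutes_on_site": 5.0}
site_b = {"daily_visitors": 900, "avg_minutes_on_site": 0.5}

def engagement(site):
    """Total visitor-minutes per day -- one crude proxy for traffic quality."""
    return site["daily_visitors"] * site["avg_minutes_on_site"]

print(engagement(site_a))   # 1500.0 visitor-minutes
print(engagement(site_b))   #  450.0 visitor-minutes: more traffic, less quality
```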
Quality is king, while quantity is simply an added benefit on today's internet. Work on your quality and then you can focus on the quantity.
Update Frequency
With their new algorithm, Google's spiders also check how frequently your site is updated. In essence, the more your site is updated with fresh and unique content, the higher it will rank in the Google search engine. If you take a month-long vacation, leaving your site to gather dust until you come back, be prepared for a drop in your rankings. If you update your site daily, on the other hand, you can expect to be ranked much higher (provided you have fresh, related content) than the sites that only update once a week.
Basically, as the old adage says, "You reap what you sow." In other words, the more work you put into something, the better the results will be. In general, this remains true not only for Google, but for the other search engines and directories as well.
IP Address
Even the IP address is considered in the Google algorithm, which can be a good or bad thing depending on the other websites currently sharing your IP. If you are on an IP address dedicated to your site, you have little to worry about, but if your IP is shared between multiple websites, you could have a real problem. IP addresses are recorded and compared between your site and others on the web. If, for example, your site happens to sit on the same server as a spammer who shares your IP, not only will Google remove the spammer from its rankings, it could also remove your site simply for sharing an IP with the spam site.
Obviously, there are some simple solutions to this problem, including talking to your hosting company to get a dedicated IP address specifically for your own site. This might cost a bit more per month or year, depending on your billing schedule, but the benefits will be numerous. If you simply don't have the cash flow to pay the extra money per month, you can search through the sites that share your IP and report spammers to your hosting company. In nearly every case they will be ready and willing to get rid of spammers on their server.
Again, the basic idea is to keep your site reputable in its promotion practices. Stay away from anything that could possibly classify your site as a Spam site, and keep good, relevant links and content coming.