Google's PageRank has been around for years, and in the opinion of many e-business owners, it can make or break a site. Lately, with Google's fingers in every pie, it seems important to remind everyone that there is more to a website than just PageRank. PageRank refers to the algorithm that Google uses to rank a website in its search engine. Named for Larry Page, one of Google's founders, PageRank has come to mean so much to webmasters and SEOs that it dictates how we market a website. But let me coin a few terms of my own (or borrow them from others, perhaps). And while some of these concepts are themselves included in the PageRank algorithm, it is often helpful to be reminded that there are many factors a webmaster should concentrate on, not just one overwhelming aspect. It bothers me that the PageRank indicator on Google's toolbar claims to "measure the IMPORTANCE" of a page; important to them, sometimes, especially in light of the release of the Google Toolbar for Firefox on July 7th. But a lack of PageRank does not mean that your site is not important. So how do you let the search engines, Google in particular, know that? This article is a collection of the terms that describe the behavior of search engines today.
Currently there is a patent application at the US Patent and Trademark Office from Monika Henzinger, published on June 30, 2005, describing a way of determining a document's "freshness," which we will call FreshRank, as it was coined by Michael Martinez on the Cre8asite Forums. The abstract notes that one problem is that the "last modified since" attribute is not always correct, and even if a webmaster has figured out how to change the date, that alone does not fool Google. What Google looks for is actually modified content. How Google determines the age of a document is still something of a secret. Monika is trying to patent a more explicit measure of freshness, since not all search engines use the "last modified since" attribute anyway, and search engines need a more reliable way of detecting genuinely updated content. I, for one, happen to agree.
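To see why comparing content beats trusting a self-reported date, here is a minimal sketch. This is my own illustration of the general idea, not Henzinger's patented method: a crawler fingerprints the content it fetches and compares fingerprints across visits, ignoring the "last modified" date entirely.

```python
import hashlib

def content_fingerprint(html: str) -> str:
    """Hash the fetched content; a fuller implementation would first
    strip boilerplate (dates, hit counters) so cosmetic churn is ignored."""
    return hashlib.sha256(html.encode("utf-8")).hexdigest()

def is_really_modified(old_html: str, new_html: str) -> bool:
    # Compare fingerprints instead of trusting the Last-Modified header,
    # which a webmaster can set to any date they like.
    return content_fingerprint(old_html) != content_fingerprint(new_html)

# A page whose "last modified" date changed but whose content did not
# is correctly treated as stale:
print(is_really_modified("<p>Hello</p>", "<p>Hello</p>"))         # False
print(is_really_modified("<p>Hello</p>", "<p>Hello, world</p>"))  # True
```

The point of the sketch is simply that freshness is a property of the content itself, not of any header a webmaster controls.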
This is a term trademarked by Google. Google has this to say about TrustRank: "Web spam pages use various techniques to achieve higher-than-deserved rankings in a search engine's results. While human experts can identify spam, it is too expensive to manually evaluate a large number of pages. We propose techniques to semi-automatically separate reputable, good pages from spam. We first select a small set of seed pages to be evaluated by an expert. Once we manually identify the reputable seed pages, we use the link structure of the web to discover other pages that are likely to be good. In this paper we discuss possible ways to implement the seed selection and the discovery of good pages. We present results of experiments run on the World Wide Web indexed by AltaVista and evaluate the performance of our techniques. Our results show that we can effectively filter out spam from a significant fraction of the web, based on a good seed set of less than 200 sites."
The paper they are referring to is a twelve-page paper on TrustRank, found at Stanford University's site. In essence, TrustRank is a way to cut down on spam and filter out content that is not relevant to the searcher, in order to bring them the results they really want. While there is currently no way for you to implement techniques that include TrustRank, what it means to you as a website owner is simple: do not spam the search engines! Webster's defines spam in terms of email, but the term has come to mean any unwanted information or propaganda delivered through deceptive measures on the part of the sender. To a search engine, spam is hyperlinked pages intent on misleading the engine. So as long as you are actually trying to keep your nose clean, you should be okay, right? Generally this is true, but you should still be careful. The best policy is to develop your website for your visitors, not for the search engines; you'll do well every time.
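The core idea of the paper, propagating trust outward from a small hand-verified seed set along the web's link structure, can be sketched roughly as follows. This is a simplified illustration with made-up page names, not the paper's exact algorithm (which also covers seed selection and trust attenuation in more detail):

```python
def trust_rank(links, seeds, iterations=20, damping=0.85):
    """Propagate trust from an expert-verified seed set along outlinks.

    links: dict mapping each page to the list of pages it links to.
    seeds: set of pages a human expert has marked reputable.
    """
    pages = set(links) | {p for targets in links.values() for p in targets}
    # Seeds split the initial trust equally; every other page starts at zero.
    base = {p: (1.0 / len(seeds) if p in seeds else 0.0) for p in pages}
    trust = dict(base)
    for _ in range(iterations):
        nxt = {p: (1 - damping) * base[p] for p in pages}
        for page, targets in links.items():
            if targets:
                # A page passes a damped share of its trust to each outlink.
                share = damping * trust[page] / len(targets)
                for t in targets:
                    nxt[t] += share
        trust = nxt
    return trust

links = {"seed": ["good"], "good": [], "spam": ["spam2"], "spam2": []}
trust = trust_rank(links, seeds={"seed"})
# Pages reachable from the seed earn trust; the spam cluster stays at zero.
```

Notice that a spam cluster linking furiously among itself earns nothing, because no trusted page ever links into it; that is the whole defense.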
This is a term that I'm simply making up. If you contact the Small Business Administration for a packet on starting a business, you get a 26-page questionnaire about your niche product. There are many aspects to NicheRank. Ideally, offering a product that no one has ever heard of before is the best niche to be in. Not everyone can have that brand-new invention that everyone just has to have, however. Offering a product in a crowded market is still possible, but you have to find your niche within it. What is it about your product that is different from all the rest? Why should someone buy from you instead of one of your competitors? Whether it's price, freebies with purchase, unbeatable customer service, or selection, you are not just selling a product; you are selling your company. And this is what makes up NicheRank: how effective will you be at penetrating a field and then stealing some of that market share? I cannot remember what movie it was in, but the saying "You gotta have a gimmick" is still tried and true. So what's your gimmick? And how does it apply to search engines?
In an article by Scottie Claiborne, she talks about the infomercials you see on TV. One in particular, the "Chocolate Dream," was nothing more than an age-old concept: the double boiler. But this established product was given a new spin by its makers, which made it seem like something you just could not live without. I love watching those shows on HGTV that turn old things into new things. They are still the same basic widgets, but they get a makeover that turns them into something different. I'm not a garage-saler by nature, as I just do not have the time, but I like seeing what possibilities those old items hold for becoming new ones. So what's your "Chocolate Dream"?
Search engines get excited over new things, even if yours is an "old thing" that merely seems new. In fact, search engines penalize duplicate content, and may not rank those sites at all. So you have to make your product different from the rest out there, even if it IS the same thing. And that's NicheRank.
When visitors use a search engine, they use words to find what they are looking for. In turn, a website uses words to connect with those visitors. So how does the search engine know which words are relevant and which are not? By comparing them to all the other words on the site. Keyword meta tags are still used to tell the search engines at a glance what a site is about, but thanks to a bunch of unscrupulous web designers and SEOs, meta tags are not as important as they used to be, because it is too easy to manipulate search results using keywords alone. Keyword meta tags in relation to the actual words on a page determine how important those keywords are in a search; this is "keyword density." Simply put, keyword density is how often those keywords occur in the actual content of a webpage. But it's not just about individual pages; it's about keyword density across an entire site. If a page has nothing to do with what the website is about, then most likely it will not get listed.
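As a rough illustration of the idea (the engines' actual relevance formulas are unpublished and far more involved), keyword density is simply the keyword's share of the words on a page:

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Percentage of the page's words taken up by the given keyword."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    return 100.0 * words.count(keyword.lower()) / len(words)

page = "Fresh coffee beans. Our coffee is roasted daily, and coffee lovers agree."
print(round(keyword_density(page, "coffee"), 1))  # 25.0 (3 of 12 words)
```

A page stuffed to 25% density like this toy example would read unnaturally to a visitor, which is exactly why density works best as a sanity check rather than a target to maximize.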
If you've done any reading at all on search engine optimization, then chances are you've come across link popularity and its importance in optimization. PageRank initially measured how important a website was by its back links, which a search engine perceives as votes for a site. But you cannot do any kind of search on link popularity without coming across a dozen or so ads telling you that you can simply buy your back links, and buying them seems to be pretty popular these days. This is not to be confused with hiring an SEO; using an SEO consultant to market all aspects of a website is probably a far better long-term investment than buying a bunch of links. We live in a world where we've gotten used to getting what we want RIGHT NOW. What happened to "patience is a virtue"? These days, patience is a nostalgic concept, almost as antiquated as dime-store sodas.
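The "links as votes" idea can be sketched with the classic power-iteration form of PageRank from Page and Brin's original paper. This is a simplified illustration with made-up page names; Google's production algorithm has long since grown far beyond it:

```python
def pagerank(links, iterations=30, damping=0.85):
    """Each page splits its score evenly among the pages it links to
    (its "votes"); a damping term models a surfer jumping at random.

    links: dict mapping each page to the list of pages it links to.
    """
    pages = set(links) | {p for out in links.values() for p in out}
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        nxt = {p: (1 - damping) / n for p in pages}
        for page, out in links.items():
            if out:
                share = damping * rank[page] / len(out)
                for target in out:
                    nxt[target] += share
            else:
                # Dangling page: spread its rank evenly over everyone.
                for target in pages:
                    nxt[target] += damping * rank[page] / n
        rank = nxt
    return rank

links = {"a": ["c"], "b": ["c"], "c": ["a"]}
ranks = pagerank(links)
# "c" collects votes from both "a" and "b", so it ranks highest.
```

This is why bought links "work" at all: each one is another vote. It is also why the rest of this article keeps returning to relevance, because votes are precisely what the newer signals are designed to discount.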
There are changes in the works to help curb the back-link controversy, with seven new search exclusions that place a hold delay on newer sites that launch with seemingly huge numbers of back links before the site has even had a chance to take off. John Scott of the Cre8asite Forums offered an opinion based on a source who used to work with a current Google employee:
"The probation does not apply to new sites. It applies to links. When the algorithm was deployed certain older links were grandfathered in. After that, links will be (are being) given partial credit, and be essentially on 'probation.'"
"It applies to links, not sites. And the age of the link is not the only factor. The IP range of the links and other considerations are made, and the person who I discussed this with said that Krishna Bharat is at Google primarily to develop and implement this new algorithm. It is supposed to radically change the way links are evaluated."
Spammers have traditionally harvested links the way my neighbor harvests soybeans: more is more. But it's the relevant links that achieve the well-placed rankings. Spammers artificially inflate search engine rankings through sheer numbers of links, most of which do not relate to the site's content whatsoever. Gone are the days of link farms; I tried to search for one the other day and could not find a single one. Now it is the day of the "directories." When compiling links for a client, we get a lot of requests for link exchanges. A site that relates to search engine optimization, for example, should stick to links to other sites that have to do with the Internet and search engines. So why on earth would it want to exchange links with Bahamas Vacation Packages? It seems that the way to get around the non-relevant-links problem these days is to create a directory with many different categories relating to just about anything. I cannot stress enough that these things are going to be examined by the search engines before too long, and it is best just not to partake. It is very difficult for a new webmaster to pass up a link exchange, especially when they are racing so hard to get those inbound links. But it just is not worth it.
I've known several companies that have rushed out to register a domain name just in case they need it in five or so years. This might seem like a good idea now, but in five years, will it matter? No one knows. AgeRank is a fairly new idea that has occurred to webmasters regarding what matters to search engines. There are a lot of sites out there that exist for the sole purpose of spamming, and with Google's recent stance on spamming (just look at the infamous Traffic Power fiasco), search engines are taking into account the longevity of a domain along with all the other factors. However, all the webmasters out there with brand-new sites can take heart: it is theorized that AgeRank makes up only about 1% of the PageRank algorithm. After all, it should not matter too much that a website has been around for years; it should not automatically outrank a new site, especially if the new site has more relevance to what the searcher is looking for. Google's number one goal is to bring the visitor the most relevant search results.
There is, however, a patent application in the works addressing what Barry Schwartz of the SEO Round Table calls the "sandbox effect," because it's a place where the new sites can all play nicely, away from the real sites, until they've had a chance to prove themselves. In an explanation of why new sites rank well at first and then drop into obscurity, Schwartz said, "The only pattern I see from the threads is that these are new sites. I see a wide range of back links reported, a wide range of styles of on-page optimization. Only pattern is the site was launched after December." The patent will be just another weapon in the arsenal against spammers. Those spammers ruin it for the rest of us, don't they? But it seems that it could work for new sites as well, contradicting the idea of a damaging sandbox effect. Google cites two current problems with the way search engines work: FreshRank and LinkRank. The sandbox effect will simply delay PageRank for a site, so as to give PageRank authenticity again, instead of being something any old Joe with enough cash can simply purchase. For the small business owner, this can mean something important. For the spammer, however, it could be devastating.
Google's PageRank is not going anywhere – at least not yet. The PageRank blackout in late May 2005 had some webmasters celebrating that PageRank had finally bitten the dust, though most of them worried it was not coming back, especially those who had bought theirs. But it did come back, of course. So while PageRank, in my opinion, wreaks far too much havoc on the mental state of any webmaster or SEO, and has far too much importance in the minds of some, it's probably around for good. But bringing back the original purpose of PageRank can only mean good things for those who are truly deserving.
There is a rumor that there are going to be some major implementations of these changes sometime this summer. Even though it is just a rumor, it is one that makes me cringe a bit. Humans do not like change; we are creatures of habit, me especially. But change I will, because that is what this industry is all about. I believe some of the changes will have to do with the concepts outlined in the many patent applications, and link changes will be the first to come. I'd bet my firstborn on it.
My goal in writing this article today is to remind you of the importance of doing what you should do best: focusing on your website, what it has to say, and what it means to your visitors, instead of what it may mean to search engines. This is not a game in which the competitor with the most money wins. When you have that down, everything else will fall into place.