Tuesday, September 30, 2008

PPC vs. Organic SEO

Why do netizens and internet newbies alike turn to Google to search for products, services, images, information, news, or anything else under the proverbial internet superhighway sun? Well, Google’s search technology has proven smarter (in the minds of the average web surfer) than the rest at returning relevant, valuable, and fast results for any given search query - which would explain the almost 50% market share that Google now commands in online search.

Google co-founders Larry Page and Sergey Brin bet the farm on the quality of their search, and it has paid off marvelously in stock options and an almost inconceivable market capitalization ($142.58 billion as of April 1, 2007) for a company that has been in existence for less than a decade. Google’s immense market cap is partially a product of its revenue stream, but where does Google actually generate the bulk of its proceeds?

Gmail, website analytics, image search, web search: all can be used for free (provided you are among the over 1 billion humans who have access to an internet connection and a PC). The answer is advertising. And not just any old type of advertising - Google generates up to 95% of its revenue from what is known as Pay Per Click advertising, specifically the Google Adwords and Adsense suite of programs.

Google Adwords enables businesses, young and old alike, to deliver their advertising message to the masses. Adwords works like a standard auction. Businesses bid on keywords that are relevant to their industry. The highest bidders who also maintain a high advertisement Click Through Rate (CTR) appear toward the top of the page when a user conducts a search. Lower bidders get the lower-valued placement real estate as web surfers’ eyes scroll down the browser window. Google monitors the ads, but it is the business owner who chooses what amount he or she is willing to pay per click. Once your ad is clicked by a user, your business is charged an amount relating to your bid price. You, the business owner, essentially choose where your advertisement is displayed and which keywords or keyword phrases trigger it.
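To make the auction mechanics concrete, here is a minimal sketch in Python (my own simplification with made-up bidders; Google’s real ranking formula also folds in quality factors it does not publish):

    # Simplified model of a PPC ad auction: position is decided by
    # bid * click-through rate, so a high CTR can beat a higher bid.
    def rank_ads(ads):
        """ads: list of (advertiser, max_bid_dollars, ctr) tuples."""
        return sorted(ads, key=lambda ad: ad[1] * ad[2], reverse=True)

    bidders = [("Acme Phones", 1.50, 0.020),
               ("PhoneCo",     2.00, 0.010),
               ("BudgetTel",   0.75, 0.050)]

    for position, (name, bid, ctr) in enumerate(rank_ads(bidders), start=1):
        print(f"{position}. {name} (bid ${bid:.2f} x CTR {ctr:.1%})")

Note how BudgetTel, the lowest bidder, takes the top spot on the strength of its click-through rate - position is bought with a combination of money and ad relevance, not money alone.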
Who Controls What I See When I Conduct A Google Search?
Now, this brings up my initial, yet critical point. I will say it again. Businesses (and individuals) have the power to place their unique message or advertisement on Google’s search engine results pages (SERPs). What is the big deal about that? Well, internet surfers tend to have difficulty discerning between advertisements and actual search engine listing results. Try it for yourself. Go to Google and conduct a search for “AT&T cordless phone.” The first three listings you see are not listings at all - they are Google Adwords results. Google typically shades these “sponsored” results to help differentiate them from the actual organic or natural search engine listings, but they are still shown in a similar format. In the last few weeks, however, even this advertisement shading has slowly been supplanted by a crystal-clear background, which makes the advertisements and listings almost indistinguishable. Besides the shading and the conspicuously grayed-out and small “sponsored results” text, the two “listings” and “advertisements” look very similar. In other words, most web surfers could confuse an advertisement for an actual and factual ultra-democratic Google-certified search result. The more web surfers that click on Google Adwords advertisements, the more money Google generates, keeping the board of directors and stockholders sleeping soundly, tucked in Egyptian cotton.

Isn’t that directly contrary to Google’s mantra that its “mission is to organize the world’s information and make it universally accessible and useful?” The answer is yes, but don’t tear down the walls of the kingdom just yet. Most of the businesses that advertise on Google are legitimate - the point is that when you click on a PPC link, the quality of the site cannot be vouched for by Google. When you click on the first organic search engine listing from a Google query, you are implicitly getting Google’s stamp of approval that this site is the best result as per your associated search based on their algorithm or rules. The push vs. pull reasoning that I have read in various search engine forums and articles may be completely off base. If you don’t know that you are clicking on an advertisement then you are not being pushed to take an action. So, what does all this mean?
Educated Web Surfers Or Buyers Know The Difference Between A Pay Per Click Advertisement And A Natural Search Engine Listing.
I have found that educated internet users tend to utilize the organic search engine listings more than the pay per click results (both Adwords and Yahoo’s Overture - now Yahoo Search Marketing). They do so because they understand that the sponsored results are all businesses that have chosen to be listed among the SERPs. Organic results are also different because a lower percentage of them are actual businesses selling goods or services. Google and Yahoo organic results give precedence to valuable web pages - pages that give information, tools, or news to web surfers. There is no guarantee that your search for “AT&T cordless phone” would bring up anything more than a website with a schematic of transistors and speed dial features. Educated buyers comprehend that the first few organic results that are in fact businesses providing what they need will most likely be the most reputable companies around. This is indeed more consistent with the Google corporate philosophy. As we have just seen, Pay Per Click (PPC) advertisements and natural results do put forth two conflicting messages - click on the listing because it is relevant, or click on the advertisement because Google has to pay its utility bills.
What Does All This Have To Do With My Business?
I’ve looked at hundreds of Overture and Adwords reports; I usually see an average PPC advertisement click-through rate of between 1 and 4% (although numbers vary dramatically between industries). I also study statistics (often from the analytical tool called Google Analytics) that show me the number of visitors a site gets from both PPC and natural search engine listings. My findings show that organic listings on average drive more traffic than PPC ads. This isn’t always the case, but it is the trend I have seen when analyzing the data. If educated surfers are more likely to click on natural search engine listings, it also seems logical to think that website conversion rates would be higher. My look at the numbers does suggest that this is true more often than not. Does Google publicize the click-through rate or conversion rate for typical organic search queries? No way - that might encourage businesses to utilize natural Search Engine Optimization (SEO) companies as opposed to PPC advertising programs. That wouldn’t be so good for Page, Brin, or Dr. Eric Schmidt (Google CEO).
Does That Make Google Adwords or Yahoo Search Marketing Definitively Bad?
Now, I own stocks, bonds, CDs - and an occasional mutual fund - my financial portfolio is diversified. Diversifying your business’s advertising is something that I also advocate. If PPC advertising is reaping a profit for you, then by all means, continue to utilize this valuable service. What I am suggesting is that you don’t put all your eggs in one basket. If a search engine rule or algorithm changes, your organic search engine rankings may drop, costing you revenue and potential profit. If a new player enters your industry and decides to outbid your PPC advertisements, you will also be pushed down in the search engine shuffle - again affecting your business’s bottom line. I don’t recommend having an undiversified financial portfolio, nor do I recommend obtaining all your search engine traffic from the same source. Experiment with Overture or Adwords, and talk to an ethical search engine optimization firm to find out if organic optimization makes sense for your organization.

--------
by Brian Ortiz, CEO of SEOMatrix: Ethical Search Engine Optimization.

Monday, September 29, 2008

The Newbie Guide to Online Marketing Terms

In all forms of business, and even hobbies, the people who have been involved in a particular activity for a while will start to use their own slang terms. When someone new comes along, these slang terms can be very confusing. As all businesses depend on customers, you need to be careful and educate yourself on these terms so you don’t lose any sales.

It recently came to my attention that online marketing has more than its share of slang for the newbie to learn. Since marketing is all about getting and keeping customers, I could see that seasoned online marketers may start to have an image problem. Online marketing slang was pointed out to me by an associate who was interested in an Internet business I was marketing. While I was talking to him he asked for an explanation of what the heck I was talking about and said “Speak English Boy!”

That conversation allowed the light to click on for me and I decided to write this short guide to online marketing slang. This should help anyone who is new to the online marketing arena understand some basic terms.

1. Online Marketing - The selling of any product or service on the Internet. Marketing is just the act of promoting and selling something. Selling on the Internet costs much less than running a real-world, bricks and mortar business. It’s this low cost of entry which makes starting your own online business attractive.

2. Joint Ventures - A joint venture is a partnership. Simply put, two or more people work together to increase their sales, or to complete a product or service. Sometimes shortened to JV, joint ventures are often very profitable.

3. Subscriber List - This is sometimes known as an email list, or an ezine list (ezine is short for electronic magazine). It is a list of people who have given permission for the marketer to send them emails, which may contain advertising. This is also known as an opt-in list, because subscribers opt in to receive the emails. This prevents people from complaining that an email is Spam, as there will be a record of them agreeing to receive the emails. Another snippet of slang is double opt-in, which just means that after filling out a form, the potential subscriber will receive an email asking them to verify that they agree to accept future emails before they are actually added to the email list (see the sketch after this list).

4. Viral Marketing - This little phrase describes a way of increasing your business, or website reach, by using a “viral technique.” In real life a virus spreads by people contacting one another, and computer viruses spread by computers contacting other computers. Some clever Internet gurus realized they could use the power of people to send each other information, and then watched that information spread in the same way as a virus. Giving away a free gift which contains a link to your business, and allowing others to give it away too, will work like a virus spreading the word as people send it on to their friends. This works as long as your freebie is something people would want to pass along in the first place.

5. Niche Marketing - This is the act of selling to a particular group of people. There are lots of untapped niches (small groups interested in a specific subject) who are willing to pay for information. Online, it’s easy to target these small markets by monitoring search engine results and then finding a specific word or phrase which is being searched for but is not near the top of the list of popular searches. An example of a niche market is poodle owners: they are dog owners, but they will specifically search for information about poodles. These markets can be profitable because they are not targeted directly by many people, so if you have a product tailored to a niche market you may have little or no competition.

6. EBook - Although it is becoming more common, this term still causes some confusion. An ebook is an electronic book; it can be a .pdf file, a plain text file or an executable file (one which is actually a self-contained program to run on your computer). Ebooks sell well online, as there is no shipping, and delivery is instant.

7. MLM - These are the initials of one of the least understood, and most maligned, of business models: Multi Level Marketing. Sometimes known as network marketing or referral marketing, MLM is an ideal business model for online marketers because of the ease of getting out the message about a new business and the low cost of advertising online. The problem with MLM is that it is easy to mistake for a pyramid scheme (often also called a Ponzi scheme). Many unscrupulous people will sell pyramid schemes as an MLM opportunity, when what they are selling is basically just a money game. The big difference is that a genuine Multi-Level Marketing program has a worthwhile product that you are selling, and money is made by referring other members to sell the same product. A pyramid scheme will usually have a worthless product or no product at all, and pays current members with money from new members joining. There are many legit MLM companies out there where millions have been made.
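Since double opt-in (item 3 above) trips up a lot of newcomers, here is a bare-bones Python sketch of that flow. The names are hypothetical, and a real mailing-list system would store this state in a database and send actual email:

    import secrets

    pending = {}         # confirmation token -> email awaiting verification
    subscribers = set()  # the confirmed, double opted-in list

    def signup(email):
        """Step 1: form submitted. Send a confirmation link; don't subscribe yet."""
        token = secrets.token_urlsafe(16)
        pending[token] = email
        print(f"Mail to {email}: visit /confirm?token={token} to join the list")

    def confirm(token):
        """Step 2: the confirmation link was clicked. Now add the subscriber."""
        email = pending.pop(token, None)
        if email:
            subscribers.add(email)

The record of the confirmation click is exactly what protects the marketer from Spam complaints later.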

If you are new to the world of online marketing, I hope these short explanations will help you understand some of the terms the seasoned online marketer says or writes. If you’re a seasoned marketer, maybe you will try to introduce your business to the newbie with less of the jargon we seasoned marketers take for granted.

----------
by Chad William Hershey of ManifestYourFortune, founder of his own home-based business.

Sunday, September 28, 2008

Is Ask.com Deceiving UK Searchers?

Ask.com has gotten a lot of press (and backlash) over the guerrilla Information Revolution marketing campaign that’s been running in the United Kingdom lately. It’s a 6-week campaign masked as a grassroots effort to bring awareness to Google’s monopolization of search and encourage searchers to try out other engines. Google currently enjoys 75 percent of UK search market share.

It’s gotten quite a bit of coverage, but in case you missed it, over the past few weeks, Ask.com has been employing both guerrilla and traditional media tactics to get their message out and point searchers to InformationRevolution.org to learn about the movement, receive a free T-shirt and leave comments on the site’s message board.

All in all, a campaign to raise awareness of all the other engines out there that are overlooked because searchers are “sleep searching” and blindly using Google is a positive thing, right? Google’s a great engine, but it doesn’t have to be your everything. Unless they’ve given you a three-stone ring, it’s okay to try out other suitors. The site lets users easily experiment with a new engine and search on either Google, Yahoo, Ask.com or Live.

The problem with Ask’s campaign is that they don’t clearly state that Ask.com, a Google competitor, is the one behind it. Kinda sketchy. And now they’re catching a lot of flak for it, including today’s very public hand slap from the Wall Street Journal. You can’t blame users for reacting the way they have. It’s natural that when they discover what they thought was a naturally forming, underground campaign based around information is actually a cleverly disguised attempt to promote Ask.com, they’re going to be less than impressed. Ask has made them feel tricked, deceived, and lots of other non-warm-and-fuzzy things, probably not the aftertaste Ask was hoping to leave.

It’s a shame because the message Ask was trying to get out is an important one. Living in a world where one engine “controls” all the information is a dangerous idea and, frankly, gives Google an enormous amount of power. Unfortunately, that message is now totally lost on an audience who feels like they were duped by a company just trying to promote their own product. Right now, Ask.com has lost its ability to leverage this in a positive light. And it is users who will miss out because Ask really is a great engine. It may never replace Google as your steady, but for some queries, Ask really does provide better, more relevant results.

To be fair, the InformationRevolution.org site does contain an Ask.com logo in the bottom right-hand corner, but it’s small and not overly noticeable. And if you don’t immediately recognize the photo of Apostolos Gerasoulis (Team Eli!), you’ll probably miss the affiliation completely. I really think this was an oversight on Ask’s part. Now, instead of being a powerful branding campaign, it came off as a jealous anti-Google smear campaign, especially when searching on Ask.com for “google” used to bring up a Smart Answer accusing searchers of being puppets. (Fortunately, this has since been taken down.) Not cool.

I also don’t think Ask.com intended to deceive users. I actually think the extent of the backlash they’ve received has been somewhat unwarranted. They made a mistake, but they weren’t intentionally being malicious. I have to wonder: if Google had launched a similar campaign, or launched a movement to take away some of Microsoft’s presence in the office suite world, would we be talking about this right now? Probably not. We just would have laughed. But Ask isn’t Google, and that’s the problem.

The ad agency responsible for Ask’s campaign said they felt Ask.com had “nothing to lose” by running the campaign since Google is so much bigger. I think the fallout has proved that statement to be 100 percent false. By appearing deceptive, they’ve lost a lot of potential users, broken the trust of existing users, and probably tainted their overall impression in the United Kingdom. When you’re struggling for market share, you can’t afford to alienate people. It’s that “make new friends, but keep the old” lesson we talked about yesterday. Ask.com now has some major reputation management work to do in the UK.

Without a doubt, Ask.com made a mistake by not advertising they were the ones behind the campaign. If they wanted to raise awareness, then they’ve done that. People are talking about it. But if they wanted to promote Ask.com as an alternative to Google, I think they did themselves a great disservice by hiding, or at least not promoting, the fact that they were the so-called Information Revolution. They got a revolution, just not the one they were going for.

----------
Author: Lisa Barone, Sr. Writer at Bruce Clay Inc.

Saturday, September 27, 2008

Advanced Link Building: Hosted Content, The Quest for the Perfect Link

Ask Google: search engines love links. Of course, they love some links more than others. For example, a simple link exchange (reciprocal link) doesn’t have as much value to search engines, so it doesn’t receive the same weight as a non-reciprocal (one-way) link – the theory being that a one-way, inbound link is a recommendation from a site owner to visit the linked site. The link itself is a testament to the quality of the site being referred.
Article Syndication
In recent years, many sites have employed article syndication to develop links. These site owners write (or have written) articles of interest to a particular audience. The site owners then offer these articles to other relevant sites free in exchange for a link back to the originator of the content in the “about the author” section of the article. In this way, a single site owner can submit dozens of articles for syndication, receiving an inbound link from each article in return for the free use of the content. They can also watch other sites post the content virally to keep their sites fresh, as well.
Sites need fresh content, so many will happily display your article and provide a link to your site. It’s a tried and true link-building tactic. However, search engines are programmed to seek out the most natural, and therefore most valuable, links they can find.
Articles are typically syndicated through sites like goarticles.com and ezinearticles.com. The standard format for the display of the article is: headline, article body, followed by a small blurb about the author with a link back to the author’s site. Since those links appear in the body of the page, they carry more value in comparison to most purchased or reciprocal links, which often appear at the bottom of a page column, or in the footer surrounded by lots of other links – somewhat effective, but not necessarily the best way to acquire inbound links.
In addition, syndication leads to duplication when a single article appears on 10 sites all at the same time. This diminishes the quality of the text and the back link to the author’s site. It’s still more valuable than a plain link exchange, but search engines are placing less emphasis on syndicated content. So, what’s a site owner to do?
Hosted Web Content
It goes by many different names: content swapping, advertorials, pre-sell pages and hosted content – all basically the same idea.
The way hosted content works is that you, the author, pay a site owner to display your article. However, now, instead of the back links to your site coming at the end of the article, you embed those links in the body of the text, surrounded by your target keywords and content that is actually useful to the reader. In the “eyes” of a search engine, this is among the highest valued back links.
Hosted content is basically renting a page on another site with links to your site embedded in the main body of the article. The web site that hosts the content receives payment from the author plus fresh content, the author gets a valuable back link and visitors to the hosting site get useful content.
This strategy isn’t new. It’s simply doing what search engines want us to do – produce content that’s useful, beneficial and appears on quality sites. Not only does a quality piece of content receive more visibility when hosted on an authoritative site, it also delivers increased benefit to the author, and the page may even rank on its own for target key phrases. When a major site hosts your content, you benefit from its page rank as well as from strong testimonials and referrals. Whether or not site owners should monetize their sites by allowing approved authors to post content is the same debate as whether or not links should be bought and sold. However, publishing high quality, unique and useful content, rather than just creating inflated link popularity with diminishing returns, is, in comparison, a tested SEO tactic.
Designing a Hosted Content Page
You’re paying for the placement of this content so you want it to be good. In the eternal quest for successful link bait, you also want the content to be ranked by search engines because it provides real value to the reader and is hosted on an authoritative site.
Design the hosted content page using standard SEO conventions: a keyword-savvy title, headers, subheads and a keyword density of less than 5%. Any higher and search engines may consider the content to be “spammy” regardless of where the content appears.
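That 5% ceiling is easy to sanity-check before you hand over a piece. Below is a rough Python sketch (my own illustration; the word and phrase matching is naive, and real engines weigh density far more subtly):

    import re

    def keyword_density(text, phrase):
        """Words contributed by the phrase, as a fraction of all words."""
        words = re.findall(r"[\w']+", text.lower())
        occurrences = len(re.findall(re.escape(phrase.lower()), text.lower()))
        return occurrences * len(phrase.split()) / max(len(words), 1)

    copy = ("Hosted content works best when the article reads naturally. "
            "Mention hosted content where it fits, not in every sentence.")
    print(f"Keyword density: {keyword_density(copy, 'hosted content'):.1%}")

Anything at or above the 5% mark is worth rewriting before you pay for placement.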
Now comes the most important part. As you write the article, carefully place links to topically relevant pages on your own site within the body of the article’s text. These are high value links that will improve your SEO. However, it’s also important to place your articles on sites that are topically related to your piece (and probably already rank for related topics). The authority of the site hosting your content, the relevance of the site (topically speaking) and that back link make your site look stronger as far as search engines are concerned. Remember that the quality of the content to which you link also matters. Link to strong pages (those with quality back links) on your site, as well. Your article should reference other authoritative, relevant articles so that search engines see that your piece was written to offer real value to readers.
It’s Not Quantity, It’s Quality
It’s no longer simply a matter of how many links point to a site. There are many cases in which sites with 50 quality links outrank sites with hundreds of links. It’s not the quantity, it’s the quality of the links that improves ranking in the SERPs.
Editorial links (links in hosted content) are more “natural” from a search engine’s perspective and, therefore, more valuable because the article has, at most, two or three targeted links pointing to your site’s pages. Just like quality link bait, which is unique, original and useful content, quality hosted content on respected sites will also naturally develop its own back links - the ultimate validation and the desired outcome of placing quality content. Finally, because these links are found on pages optimized with your keywords, search engines will consider them extremely relevant to the subject at hand.
Start Your Hosted Content Campaign Today
It’s being done every day, successfully building small sites into larger ones, providing free advertising for the thought-leader/author, delivering less duplicate content to search engines and more new content (plus revenue) to the hosting site and, perhaps most importantly, delivering useful, relevant information to readers – exactly what search engines want to rank in the first place. As with any link-building technique, hosted content can be abused, but topically authoritative sites are not going to accept content that does not meet their high standards – so everyone wins when the goals are white hat.
Start searching for websites that might be interested in hosting your next article, or start looking for a site owner interested in content swapping. Create content that’s unique, useful and well-written and you may find that you won’t even have to pay a site owner to share your content with their readers – exactly how it should be.

------
by Frederick Townes, owner of W3-EDGE.

Friday, September 26, 2008

Protecting Your Search Engine Rankings

Your website’s ranking on search engines is a vital element of your overall marketing campaign, and there are ways to improve your link popularity through legitimate methods. Unfortunately, the Internet is populated by bands of dishonest webmasters seeking to improve their link popularity by faking out search engines.

The good news is that search engines have figured this out, and are now on guard for “spam” pages and sites that have increased their rankings by artificial methods. When a search engine tracks down such a site, that site is demoted in ranking or completely removed from the search engine’s index.
The bad news is that some high quality, completely above-board sites are being mistaken for these web page criminals. Your page may be in danger of being caught up in the “spam” net and tossed from a search engine’s index, even though you have done nothing to deserve such harsh treatment. But there are things you can do - and things you should be sure NOT to do - which will prevent this kind of misperception.
Link popularity is mostly based on the quality of the sites you are linked to. Google pioneered this criterion for assigning website ranking, and virtually all search engines on the Internet now use it. There are legitimate ways to go about increasing your link popularity, but at the same time, you must be scrupulously careful about which sites you choose to link to. Google frequently imposes penalties on sites that have linked to other sites solely for the purpose of artificially boosting their link popularity. It has actually labeled these linked sites “bad neighborhoods.”
You can raise a toast to the fact that you cannot be penalized when a bad neighborhood links to your site; a penalty occurs only when you are the one sending out the link to a bad neighborhood. But you must check, and double-check, all the links that are active on your links page to make sure you haven’t linked to a bad neighborhood.
The first thing to check is whether or not the pages you have linked to have been penalized. The most direct way to do this is to download the Google toolbar at http://toolbar.google.com. You will then see that most pages are given a “PageRank” which is represented by a sliding green scale on the Google toolbar.
Do not link to any site that shows no green at all on the scale. This is especially important when the scale is completely gray. It is more than likely that these pages have been penalized. If you are linked to these pages, you may catch their penalty, and like the flu, it may be difficult to recover from the infection.
There is no need to be afraid of linking to sites whose scale shows only a tiny sliver of green. These sites have not been penalized, and their links may grow in value and popularity. However, do make sure that you closely monitor these kinds of links to ascertain that they do not sustain a penalty at some point after you have linked to them from your links page.
Another evil trick that illicit webmasters use to artificially boost their link popularity is the use of hidden text. Search engines usually use the words on web pages as a factor in forming their rankings, which means that if the text on your page contains your keywords, you have more of an opportunity to increase your search engine ranking than a page whose text does not include keywords.
Some webmasters have gotten around this formula by hiding their keywords in such a way that they are invisible to any visitors to their site. For example, they have used the keywords but made them the same color as the background color of the page, such as a plethora of white keywords on a white background. You cannot see these words with the human eye - but the eye of a search engine spider can spot them easily! A spider is the program search engines use to index web pages, and when it sees these invisible words, it counts them toward that page’s ranking.
Webmasters may be brilliant and sometimes devious, but search engines have figured these tricks out. As soon as a search engine perceives the use of hidden text - splat! - the page is penalized.
The downside of this is that sometimes the spider is a bit overzealous and will penalize a page by mistake. For example, if the background color of your page is gray, and you have placed gray text inside a black box, the spider will only take note of the gray text and assume you are employing hidden text. To avoid any risk of false penalty, simply direct your webmaster not to assign the same color to text as the background color of the page - ever!
Another potential problem that can result in a penalty is called “keyword stuffing.” It is important to have your keywords appear in the text on your page, but sometimes you can go a little overboard in your enthusiasm to please those spiders. A search engine uses what is called “keyphrase density” to determine if a site is trying to artificially boost its ranking. This is the ratio of keywords to the rest of the words on the page. Search engines assign a limit to the number of times you can use a keyword before deciding you have overdone it and penalizing your site.
This ratio is quite high, so it is difficult to surpass without sounding as if you are stuttering - unless your keyword is part of your company name. If this is the case, it is easy for keyword density to soar. So, if your keyword is “renters insurance,” be sure you don’t use this phrase in every sentence. Carefully edit the text on your site so that the copy flows naturally and the keyword is not repeated incessantly. A good rule of thumb is your keyword should never appear in more than half the sentences on the page.
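That rule of thumb can be checked mechanically as well. Here is a quick Python sketch (naive sentence splitting; treat it as a rough editing aid, not a guarantee against penalties):

    import re

    def keyword_sentence_fraction(text, phrase):
        """Fraction of sentences that contain the keyword phrase."""
        sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
        hits = sum(phrase.lower() in s.lower() for s in sentences)
        return hits / max(len(sentences), 1)

    copy = ("Renters insurance protects your belongings. A good policy is cheap. "
            "Compare renters insurance quotes today.")
    if keyword_sentence_fraction(copy, "renters insurance") > 0.5:
        print("Keyword appears in over half the sentences - edit the copy.")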
The final potential risk factor is known as “cloaking.” To those of you who are diligent Trekkies, this concept should be easy to understand. For the rest of you: cloaking is when the server directs a visitor to one page and a search engine spider to a different page. The page the spider sees is “cloaked” because it is invisible to regular traffic, and deliberately set up to raise the site’s search engine ranking. A cloaked page tries to feed the spider everything it needs to rocket that page’s ranking to the top of the list.
It is natural that search engines have responded to this act of deception with extreme enmity, imposing steep penalties on these sites. The problem on your end is that sometimes pages are cloaked for legitimate reasons, such as preventing the theft of code, often referred to as “pagejacking.” This kind of shielding is unnecessary these days due to the use of “off page” elements, such as link popularity, that cannot be stolen.
To be on the safe side, be sure that your webmaster is aware that absolutely no cloaking is acceptable. Make sure the webmaster understands that cloaking of any kind will put your website at great risk.
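One simple way to spot-check your own site for accidental cloaking is to request a page twice, once identifying as a browser and once as a spider, and compare what comes back. A minimal Python sketch (example.com is a placeholder; dynamic elements such as ads or timestamps can cause harmless differences, so treat a mismatch as a prompt to investigate, not as proof):

    import urllib.request

    def fetch(url, user_agent):
        """Fetch a page while presenting the given User-Agent string."""
        req = urllib.request.Request(url, headers={"User-Agent": user_agent})
        with urllib.request.urlopen(req) as resp:
            return resp.read()

    url = "http://www.example.com/"
    browser_view = fetch(url, "Mozilla/5.0 (Windows NT 10.0; Win64; x64)")
    spider_view = fetch(url, "Googlebot/2.1 (+http://www.google.com/bot.html)")

    if browser_view != spider_view:
        print("Visitors and spiders see different content - possible cloaking.")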
Just as you must be diligent in increasing your link popularity and your ranking, you must be equally diligent to avoid being unfairly penalized. So be sure to monitor your site closely and avoid any appearance of artificially boosting your rankings.

-----
by Vern Black, owner of Pinnacle Publishing.

Thursday, September 25, 2008

Retailers Count on Search Driving Sales

A new study by Internet Retailer shows retail sites will be investing more on organic SEO and paid search. Major findings include:

Search engine marketing drives over 50 percent of sales for 30.2 percent of the survey respondents. This amounts to about 80 retailers out of a total sample of 245. The 50 percent figure coincides with earlier studies.
You won’t see any reductions in paid search advertising as 82.8 percent of respondents said they would not reduce overall PPC spending this year.
57.4 percent of respondents said search engine marketing performs better, or somewhat better, than other marketing media, including email, affiliate marketing and direct mail. Only 12.7 percent said it performs worse.
While more retailers prefer PPC to SEO for driving traffic (39.2 percent), another 34.7 percent indicated a preference for SEO over PPC. A savvy 26.1 percent use both SEO and PPC equally.

Retailers believe SEO delivers better conversions (46.1 percent), while a lesser number (37.3 percent) believe PPC is better, and 16.6 percent think SEO and PPC perform equally well.

Organic SEO
Optimizing their SEO campaigns is a high priority for most retailers. One reason is the rising cost of PPC. Retailers will improve their natural rankings on Google, Yahoo and other search engines as follows:
· 80.9 percent plan to rewrite keyword descriptions on the home page and product pages to achieve higher rankings.
· 67.9 percent will include the actual phrases commonly used in search queries on product pages.
· 58.1 percent will include common product keywords in image file names and in image display captions.
· 61.8 percent plan to improve overall site navigation.
This makes sense since 70 percent of traffic is generated by organic links while the other 30 percent comes from PPC traffic.
Paid Search
Most retailers will spend more rather than less on paid search.
· 58.5 percent will spend more on paid search this year.
· 32.8 percent will stick to current budgets.
· 8.7 percent will cut back.
What search engines deliver the best ROI?
· 79 percent said Google produces the best ROI.
· 13 percent prefer Yahoo.
· 4 percent like Microsoft Live Search.
· 2 percent count on AOL.
Source: Internet Retailer

-------

By Paul Bruemmer

Wednesday, September 24, 2008

Mobile Search Site Creation and Optimization

The following is coverage of the Search Engine Strategies (SES) New York presentation called “Mobile Search Optimization” by Cindy Krum of Blue Moon Works, Gregory Markel, President of Infuse Creative LLC and Rachel Pasqua, Director of Mobile Marketing at iCrossing.

This presentation provided a fascinating glimpse into the young realm of mobile site creation, compliance and optimization. I have a lot of information to work with here, so to make this article a little more digestible I have broken it into two parts: the first covers site creation and the second covers site optimization.
Mobile Website Design & Creation

During this presentation two very different lines of thought were noted regarding the best method for creating a mobile website, one from Cindy Krum and the other from Rachel Pasqua.

> Cindy Krum’s Presentation
Cindy Krum felt strongly that an existing website should pull double duty as both the wired and the mobile version by using CSS to provide an alternative, mobile-friendly version shown only to mobile users.
Cindy provided some great tips on how to create a hybrid mobile/wired website:
1. Ensure your website is 100% W3C XHTML compliant because mobile browsers are completely unforgiving when it comes to improper coding.
2. Follow strict XHTML accessibility guidelines to provide the best quality product for wired users, mobile users, and those who require accessibility (e.g. the blind). She also noted that by following accessibility requirements, any images that do not show up on the mobile browser will be defined in text format – a nice backup.
3. Avoid unnecessary code to minimize download times.
4. Ensure the site uses CSS to control content – this is critical to ensure the mobile version can have reorganized placement of content. (i.e. the menu might be at the bottom vs. the top)
5. Use external CSS files to provide maximum flexibility such as the ability to specify a different style sheet for each mobile browser.
6. Use the LINK element to attach style sheets because it is a much friendlier format for mobile browsers.
7. Use multiple style sheets. The minimum would be a style sheet called “screen” for regular wired visitors and a second style sheet (provided below the first) called “handheld”.
8. Use “display: none” to hide elements in either rendering. This is useful if you have page elements you do not want to appear to mobile users or vice versa. This method of hiding content is part of what makes Cindy’s hybrid approach feasible: using a single website for both viewing technologies (handheld and wired).
9. Use HTTP User-Agent headers, HTTP Accept headers, and UAProf to identify the mobile device being used to access the content (see the sketch after this list).
10. Use the appropriate MIME type: “text/html” or “application/xhtml+xml”.
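As a rough illustration of point 9, the sketch below guesses whether a request comes from a handheld device by inspecting those headers. It is written in Python with a made-up, incomplete list of device hints; production code of the era relied on maintained device databases such as WURFL:

    MOBILE_UA_HINTS = ("nokia", "blackberry", "windows ce", "palm", "symbian")

    def is_mobile(headers):
        """Crude handheld detection from HTTP request headers."""
        ua = headers.get("User-Agent", "").lower()
        accept = headers.get("Accept", "").lower()
        return (any(hint in ua for hint in MOBILE_UA_HINTS)
                or "vnd.wap" in accept           # WAP-capable Accept header
                or "X-Wap-Profile" in headers)   # UAProf device profile URL

    request_headers = {"User-Agent": "NokiaN95/21.0.016", "Accept": "text/html"}
    if is_mobile(request_headers):
        print('Serve the "handheld" style sheet (or the mobile subdomain).')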
> Rachel Pasqua’s Presentation
At the opposite end of the spectrum was Rachel Pasqua, who firmly stated that offering your current website to users, reformatted or not, would likely provide a less than desirable user experience. She went on to explain that mobile users should see an entirely different, more time-efficient version of your website because such users are task oriented. Rachel put her thoughts into excellent perspective when she stated that mobile search is “not surf media, it’s search media”. She also went on to state that iCrossing decided to proceed with the subdomain concept rather than a separate domain such as a .mobi. In this case their mobile site is located at mobile.icrossing.com; a sensible concept that retained the branding of the top-level domain name without having to rebrand a new one (i.e. going with the .mobi version).
Rachel had some interesting metrics and tips to share with the group, gathered at iCrossing using focus groups and other research (sorry, I don’t know the source, but I hear the report is due to be released soon). Here are a few tidbits that I caught on paper:
1. Mobile searchers tend to utilize the same search engine they use when they are on their PC.
2. Only 10% of the estimated 234 million US wireless subscribers are active users of mobile search.
3. Searchers are task oriented; they tend to want to get their information and get out. Mobile surfing is extremely uncommon.
My Take on Hybrid Sites Versus A Separate Mobile Website

Of the two approaches, I felt myself more strongly drawn to the concept of a separate mobile site. Why? I think the maintenance of a hybrid website is bound to be far more difficult, because design updates will require designers to think in both realms, which is likely to make updates laborious for the average business owner.
> Gregory Markel’s Presentation

Gregory Markel of Infuse Creative LLC, dropped a very intriguing bombshell at the beginning of his discussion when he noted that Google’s Voice Local Search just might take the world of mobile search in an entirely different direction. According to Gregory, his friends and network of mobile enthusiasts have been impressed by the results of using 800-GOOG-411 and conducting a free voice search; the results have been extremely relevant and Google immediately connects the user to their preferred result by phone. After this bombshell had sunk in, he went on to discuss many of the points already mentioned by Cindy but he had a few highlights definitely worth mentioning including this valuable tip: get into Google local for your area so that you can be found on Google’s Voice Local Search, it is free and easy to do. (Note, I wrote an article on how to do this a few months back called: Have Your Company Listed Free in Google Maps). Unfortunately, Google Voice Local Search is experimental and only available in the United States.

Highlights from Markel:
1. Mobile search adoption has been slower in the US than expected, at only 19%.
2. An excellent source of mobile statistics is the self-described authority on mobile metrics, MMetrics.com.
3. When users conduct searches, they are more likely to search using 2 or at most 3 words.
4. Nokia has decided to try to simplify the process of searching by integrating it into its future line of cell phones.
5. Mobile devices require ultimate simplicity to ensure compatibility across the vast number of proprietary mobile browsers available.

-------
By Ross Dunn

Tuesday, September 23, 2008

Estimating the Real Click Fraud Rate

The controversy surrounding click fraud comes up every year, but it reached a fever pitch during December’s Search Engine Strategies conference in Chicago when participants voiced concerns over experiencing fraudulent click rates ranging from 20 to 40 percent, threatening the entire paid search industry.

At the time, Google’s Business Product Manager for Trust and Safety, Shuman Ghosemajumder, tried to calm advertisers’ fears, explaining that Google was currently “…examining ways to make its fraud-fighting efforts more transparent without revealing crucial information that might help swindlers elude detection.” Ghosemajumder did, however, express concerns over revealing too much information, fearful it would give away algorithm secrets to competitors.

Paid Search Revenues Continue to Rise

While the major search providers have always insisted the click fraud rate is a gross overestimation, a 2005 Outsell survey found that click fraud was a $1.3 billion problem for publishers. At the time, many advertiser respondents (27 percent) said they planned to cut back and/or eliminate paid search campaigns in 2006.

Outsell respondents may have intended to cut down on paid search, but they certainly didn’t follow through. SEMPO’s year-end search marketing report showed that North American advertisers spent $8 billion on paid placement programs in 2006, amounting to 86 percent of 2006’s total search spend ($9.4 billion). Seventy-one percent of SEMPO respondents said they used paid search campaigns, illustrating that there were not many defectors.

Despite advertisers’ insistent claims that the search engines don’t do enough to eliminate click fraud, paid search revenues continue to fill the coffers of Google, Yahoo, Microsoft and many second and third tier search engines. Additionally, there is a huge gap in the professed prevalence of click fraud between the search providers on one side and the advertisers and click fraud advocates on the other.

Google Click Fraud Estimates

The rate of click fraud changes depending on whose numbers you believe. Click fraud detection agencies put the click fraud rate at around 14 percent, while others believe at least 20 percent of all clicks are fraudulent.

Late last month, Google issued a statement on the Inside AdWords blog insisting that invalid clicks consistently remain under 10 percent, typically in the single digits, and that virtually all malicious activity is caught by Google’s filters. Ghosemajumder claimed that the clicks found through advertiser-initiated investigations account for just 0.02 percent of all clicks. All other estimates, he said, are grossly overstated.

Alchemist Media President Jessie Stricchiola takes issue with Google’s assertion that it refunds advertisers promptly for fraudulent clicks, stating that “Google has been the most stubborn and the least willing to cooperate with advertisers”.

Google Click Fraud Filters

In February, Google outlined the three-layer filtration process it uses to combat and eliminate click fraud. It described the system, which uses both proactive and reactive filters, as follows:

1. Proactive Filters: Automated algorithms analyze and filter out invalid clicks in real-time without billing advertisers for these false clicks. This accounts for the vast majority of invalid click detection.

2. Proactive Offline Analysis: Post billing, Google uses automated and manual analysis to identify fraudulent clicks that somehow made it through the first layer of filtration. Special attention is paid to clicks occurring on the AdSense network. This is done proactively and without any involvement from advertisers. When false clicks are found, advertisers’ accounts are immediately credited via Click Quality Adjustments.

3. Reactive Investigations: Investigations take place when an advertiser approaches Google concerned about suspicious activity on their account. Each complaint is investigated, though Google says refunds are relatively rare. Google claims that the vast majority of fraudulent clicks, more than 99 percent, are found and thrown out within the first two stages of filtration. The third stage only includes the 0.02 percent of clicks where advertisers are affected by undetected cases of click fraud.

Click Fraud Detection Agency Estimates

In April 2006, The Click Fraud Index reported an industry-wide average click fraud rate of 13.7 percent. The click fraud rate was broken down as follows:

  • Tier 1 search providers — 12.1 percent
  • Tier 2 search providers — 21.3 percent
  • Tier 3 search providers — 29.8 percent

Some of the newer click fraud prevention firms like Click Assurance and ClickLab offer algorithm-based programs to limit bad clicks. These programs estimate the statistical likelihood of a click being fraudulent based on behavioral variables and IP address.

Gap in Prevalence of Click Fraud

As noted above, Google admits to a click fraud rate of less than 10 percent, while advertisers and click fraud detection agencies believe it is more like 14 to 20 percent. Ghosemajumder explained this gap by saying that many advertisers and click fraud detection agencies are looking at the wrong signals, mistakenly classifying valid clicks as fraudulent. Additionally, he believes many advertisers request refunds for clicks already thrown out during the first two layers of the filtration system.

For example, misclassification might occur when counting reloads of an advertiser’s landing page. Say the customer clicks through to the landing page, views a product page, and then hits the back button, returning to the same landing page several times. Without proper tagging, that one click and five page reloads could be misclassified as 6 clicks from the same visitor. Google argues that there are hundreds of different signals that must be monitored to detect click fraud, signals that are a closely guarded company secret known only to the Google click quality team.
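A toy Python example of the misclassification Ghosemajumder describes (the log format is hypothetical): without tagging, every landing-page load looks like a fresh paid click, while grouping loads by the click ID carried on the tagged URL collapses them into billable events.

    # Each landing-page hit: (time, visitor_ip, click_id from the tagged URL).
    hits = [
        ("12:00:01", "10.0.0.7", "click-abc"),  # the original ad click
        ("12:00:45", "10.0.0.7", "click-abc"),  # back button -> page reload
        ("12:01:10", "10.0.0.7", "click-abc"),  # another reload
        ("12:05:00", "10.0.0.9", "click-xyz"),  # a different paid click
    ]

    naive_count = len(hits)                                # 4 without tagging
    billable = len({click_id for _, _, click_id in hits})  # 2 real clicks
    print(f"naive: {naive_count}, billable: {billable}")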

A Solution for Click Fraud

Like other experts, we believe the only solution to click fraud is for independent auditors to evaluate the system using accurate data provided by the search engines and advertisers themselves. It is the only way to get a neutral calculation — the current click fraud detection agencies may not be entirely neutral, and certainly the search providers are not neutral. We need an independent agency that has no incentive to increase or decrease the click fraud rate. One solution could be to use a technology firm like Fair Isaac, which is currently conducting click fraud research for SEMPO.

One thing is certain: until advertisers are willing to provide campaign info, and the search engines are willing to share click fraud data, we’ll never know the actual prevalence of click fraud or how much advertisers are losing as a result.

-----
by Nick Guastella, SEM Analyst and PPC expert at Bruce Clay, Inc.

Monday, September 22, 2008

XML Sitemap Submission Made Easier

Getting Google and other major search engines to spider your XML Sitemap just got a little bit easier. Although submitting through a search engine’s submission interface such as Google Webmaster Tools can offer additional valuable information, if you simply can’t be bothered there is now an easier way.

Vanessa Fox posted a few recent announcements regarding sitemaps.org in the Webmaster Central Blog, including a point on XML sitemaps.
Now you can add a simple line to your robots.txt file that will tell the search engines that support XML sitemaps where your file is. No need to create an account. Simply upload your XML sitemap and add a line containing the full path to your sitemap to your robots.txt file:
Sitemap: http://www.yoursite.com/sitemap.xml
It is as simple as that: your sitemap will be spidered and you need not do anything else. That said, using the supplied interface, such as Google Webmaster Tools, will allow you to ensure that your sitemap was spidered correctly and without errors, but it is nice to see an alternative is available.
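If you don’t yet have a sitemap.xml to upload, a minimal one is easy to generate. Here is a short Python sketch (the URLs are placeholders; see sitemaps.org for optional elements such as lastmod and changefreq):

    from xml.sax.saxutils import escape

    urls = [
        "http://www.yoursite.com/",
        "http://www.yoursite.com/about.html",
        "http://www.yoursite.com/products.html",
    ]

    entries = "\n".join(f"  <url><loc>{escape(u)}</loc></url>" for u in urls)
    with open("sitemap.xml", "w") as f:
        f.write('<?xml version="1.0" encoding="UTF-8"?>\n'
                '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
                f"{entries}\n"
                "</urlset>\n")

    # Then add one line to robots.txt:
    # Sitemap: http://www.yoursite.com/sitemap.xml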

-------

By Scott Van Achte

Sunday, September 21, 2008

SES NY2007 Session Coverage From SERoundtable

A big thank you to Debra Mastaler, Chris Boggs, Greg Meyers, Ben Pfeiffer, Lisa Barone, Li Evans, Rob Kerry, Carolyn Shelby and Tamar Weinberg for helping me out with this awesome coverage.

Here are the fifty-plus sessions we covered at SES NY 2007.

  1. In House: Big SEO
  2. Video Search Optimization
  3. Compare & Contrast: Ad Program Strategies
  4. Podcast and Audio Search Optimization
  5. Benchmarking an SEM Campaign
  6. Online Video Advertising
  7. Advertising in Social Media
  8. Mobile Search Optimization
  9. Where Are Your Spending Your Client’s Money?
  10. Advanced Paid Search Techniques
  11. Ads In A Quality Score World
  12. In House Big PPC
  13. Keynote Conversation with Steve Berkowitz
  14. Sitemaps & URL Submission
  15. Domaining & Address Bar-Driven Traffic
  16. Link Building Basics
  17. Introduction to Search Marketing
  18. Web Analytics & Measuring Success
  19. Duplicate Content & Multiple Site Issues
  20. Converting Visitors Into Buyers
  21. Getting Traffic From Contextual Ads
  22. Writing for Search Engines
  23. Meet the Search Ad Networks
  24. SEO Through Blogs & Feeds
  25. Putting Search Into the Marketing Mix
  26. Earning Money From Contextual Ads
  27. Fun with Dynamic Websites
  28. Landing Page Testing & Tuning
  29. Search and Branding
  30. Robots.txt Summit
  31. Successful Site Architecture
  32. B2B Tactics
  33. Social Search Overview
  34. Creating Compelling Ads
  35. SMO - Social Media Optimization
  36. Images and Search Engines
  37. Search Behavior Research Update
  38. Social Bookmark Strategies
  39. Shopping Search Tactics
  40. Organic Listings Forum
  41. Microsoft adCenter: Today and Tomorrow
  42. Auditing Paid Listings & Click Fraud Issues
  43. Evening Forum with Danny Sullivan
  44. Search & Regulated Industries
  45. Wikipedia & SEO
  46. Linking Strategies
  47. Usability and SEO: Two Wins for the Price of One
  48. Link Baiting and Viral Success
  49. SEM For Non-Profits and Charities
  50. SEM Agencies: Working With Ad Agencies
  51. CSS, AJAX, Web 2.0 & Search Engines
-----
Organized by Barry Schwartz (aka RustyBrick)

Saturday, September 20, 2008

Black Hat SEO Meets Web 2.0

I’m here at the Web 2.0 Expo in San Francisco. Today (actually yesterday now that it’s after midnight) I sat in on the SEO Workshop being presented by Todd Friesen (”The Oilman”) and Greg Boser (”WebGuerrilla”).

This session turned out to be a lot of fun. It was reminiscent of their “SEO Rock Stars” radio show on Webmaster Radio. I cracked up when Todd plugged their show and explained the name of the show by saying: “Yes, we really are that arrogant.”
Much more of the session than I was expecting turned to “black hat” or “gray hat” SEO tactics — things that are outside the search engine guidelines. Both Todd and Greg believe in being pragmatic about SEO. Greg analogized SEO to speeding: nobody goes the speed limit; just don’t drive recklessly, swerving in and out of traffic until you get in a wreck. He “hates the guidelines” and longs for the good ol’ days when the engines didn’t have such idealistic guidelines and if you went too far, you were simply “torched forever and you’re gone.” Greg’s guiding principle: “I don’t want to upset my mother” — i.e. he gauges whether he’s gone too far based on whether she’d be unhappy with the search results because they’re useless. They say the search engine guidelines are simply that: guidelines.
Hmm. I’m not sure I’d take that tack with an audience of webheads from Web 2.0 startups. An audience of SEOs at SES (Search Engine Strategies) is one thing, because they can be discerning about how far to take various bits of advice. But Web 2.0 geeks? They aren’t savvy enough about SEO to know when they are playing with fire.
Greg and Todd made compelling arguments for playing with conditional redirects (serving different destination URLs to Googlebot than to humans). But the 4 major engines had specifically warned against doing that in the session “Search Engine Q&A on Links” last Friday at Search Engine Strategies in NYC. So you’d better REALLY know what you’re doing if you’re going to play with that stuff!
Todd and Greg also testified to the benefits of cloaking. One of their client’s sites was cloaked for 3 years — a new ecommerce platform was purchased and launched by the client, but the old HTML-based site was served up selectively to the bots because it ranked so well. I’m sure that was worth a pile of money to his client. But boy, if you get that wrong (like not keeping up to date with the ever-changing list of IP addresses associated with Googlebot), things could go very badly.
Todd made the point that if you are a big brand or company, you’re not going to get kicked out of Google. Or at worst, maybe it’ll be for a day. He cited eBay and the NY Times as examples of sites that won’t get banned despite operating outside the Google Guidelines by serving millions of “related search” results and/or cloaking. Todd also cited Yahoo’s new spam-reporting feature within Yahoo Site Explorer as evidence that the engines (and Yahoo in particular) aren’t very good at webspam detection: “they’re asking you to report the spam because they can’t find it themselves.” I was ROFLMAO when I heard that one. I doubt Yahoo would find that as funny though. ;-)
Speaking of Yahoo… Greg told the audience that he has consulted for Yahoo. Wow! That surprised me; impressed me too. Nice one, Greg!
Speaking of Yahoo again… Todd loves Yahoo’s paid inclusion program (Search Submit Pro). That’s because with it, Yahoo “gives you a complete pass on the off-page factors”. Yahoo tells everyone publicly that Search Submit Pro doesn’t improve your rankings, but it’s not true according to Todd. Although you shouldn’t expect a rankings lift with big money terms like “credit report” or “data recovery,” it will lift your rankings on most non-ultracompetitive search terms. Nonetheless, I’d exercise caution with this one. That’s because, as Todd admits, when you stop paying all your listings in the SERPs “will mysteriously vanish and it’s really hard to get them back in for free.” A bit of evil advice from Todd (I think he was only half-serious! ;-) … “Submit your competitor and then turn it off”. I was again rolling on the floor laughing at that one! I’ve heard of “Google bowling” your competitors into “Supplemental hell”, but now I guess you might call this one “Yahoo bowling!”
Attendees got a little dose of conspiracy theory. Todd and Greg don’t trust Google and their “don’t be evil” mantra — and they weren’t afraid to spread a little hysteria into the audience. Google Analytics is one service they are very suspicious of. They go so far as to advise you not to use it: “giving your ROI data to the company you’re buying your ads from — that’s just asinine.” Personally, I trust Google. So much so that I’m on their waiting list for the Google Implant (beta). ;-)
Greg weighed in about the Open Directory, referring to it as “a dilapidated piece of crap.” And the ODP editors? — “they take bribes all the time.” I’d love to know who the editors are that I can bribe and how much it’ll cost!
Their recommended SEO tools were useful. For example: TouchGraph. They gave a very cool little demo of the tool in action. Todd said: “It’s like Diggswarm, but actually useful. Is Kevin Rose in the room? He’s going to kick my ass.” LOL!
Some of the other free link research tools recommended:
* Yahoo’s Site Explorer

* Netconcepts’ Link Checker

* WeBuildPages’ Neat-o Tool

* SEO for Firefox

* Tattler
Hmm. That’s funny: this list of tools (except for the last one) looked mighty familiar. So did some of the previous slides. I couldn’t resist… I got the microphone during Q&A and alluded to the slides, saying “Thanks for the plug for Netconcepts on that Tools slide.” Well, you won’t believe what happened next! They admitted to the audience that they had lifted one of my decks from our website and then customized it! OMG!!
They praised me for making such excellent PowerPoints available online. Umm, thanks… I think! Geez!
Actually I’m not mad. Perhaps I should be, but they fessed up and poked fun at themselves for it — in front of 300 people! — so no harm done.
Just don’t do it again, k? Or if you do, at least keep my Netconcepts logo on the slides!
What other nuggets did they share? Todd shared that Endeca can be a secret weapon for SEO when you know how to make it work for you. I concur with that statement, totally. For example, with our (Netconcepts’) client Northern Tool, we were able to blow past previous search traffic records by deploying our GravityStream technology on top of Endeca Guided Navigation. Todd said that one of their (Range’s) biggest SEO successes was with Endeca. Todd gave a word of caution though: “Out of the box, Endeca is just a piece of junk for SEO”. The good news though is that, if you know what to ask for, “they will customize.”
They also recommended Google’s “Domain Park” program if you are a domainer with a bunch of domains you want to monetize.
On the flip side, they advised advertisers to opt out of Google’s contextual advertising network so that their AdWords campaigns don’t end up on those same domain parking pages, because that junk traffic doesn’t convert.
What was the most interesting revelation of the day? That Todd was a Viagra webspammer before working for Range. I knew he was a reformed black hat SEO, but I never knew what industry. Greg joked that in Todd’s space, “PPC” stood for “Pills, Porn and Casinos” rather than “Pay Per Click.” Todd regaled us with tales of “churn and burn” SEO going after Viagra rankings — he could make $10k in a week, and all it cost him was a day of his time and $7 for a domain name — then the site would get banned and he’d move on to using another domain name. “Viagra” would have to be the most competitive keyword in SEO bar none (well, with the probable exception of “sex”). Todd *must* have been a “rock star” at SEO (but on the Dark Side of the Force of course! ;-) to make such a good living from it. So the name of his Webmaster Radio show really IS fitting and merited. I highly recommend downloading some audio recordings of previous SEO Rockstars episodes; even Matt Cutts has been known to listen to their show!
Rock on!

----------
Stephan Spencer is the founder and president of interactive agency Netconcepts.

Friday, September 19, 2008

Gerrymandering Google Search Results

According to Wikipedia, “Gerrymandering is a form of redistricting in which electoral district or constituency boundaries are manipulated for an electoral advantage.”

Most states in the US redraw their electoral districts at least every decade. In 36 of the states, the state legislature is responsible for drawing the new districts — in principle to account for shifts in population — and the governor is required to approve those changes.

With nearly every redistricting, the party that is out of power cries foul. The argument is always that the political party in power adjusted the lines to improve its ability to win elections and control the electoral power structure.

The entire process of redistricting is a good analogy for the Google algorithm changes. The people in power (Google) redraw the lines, and the party out-of-power (the webmasters who dropped in ranking) cry foul, threatening lawsuits and other actions.

Something Right And Something Wrong

Of course, the same webmasters screaming about having dropped in Google’s search results did not have a problem when Google allowed them to climb in the search results previously.

The truly funny thing about it is that when people were on their way up in the Google search results, they told all of their friends that “they” did something right. When they are on their way down, “Google” did something wrong.

Shaky Foundations For Search Engine Optimization (SEO)

PROBLEM: If the website dropped out of Google’s top 1000 results, then I am willing to bet that in nearly every situation of this type, the webmaster had done something wrong in the last few years, and Google has just now caught on to this infraction.

For example, in years past, link farms were utilized by webmasters and accepted by Google. Then Google began to consider link farms to be “bad neighborhoods.”

Link farms are the kind of thing that can be easily recognized by human and search engine alike. So, the link farms were tagged as “bad neighborhoods,” their value was discounted to zero, and the websites linked to from link farms were docked points for their participation in the link farms.

It was a bad practice that the webmaster had participated in, and Google finally got around to penalizing the websites whose rankings were supported by these shaky foundations.

SOLUTION: Try to put yourself into the shoes of the experts at Google. If you wanted to improve the Google search results, what would YOU do? Before you answer this question, remember that you work for Google and not for yourself. What would you do? What would you do differently?

“White hat” and “black hat” SEO methods are an analogy for Search Engine Optimization techniques that refers back to the old cowboy classics, where the “good guy” always wore a “white hat” and the “bad guy” always wore a “black hat”.

While you are working for Google, take a look at the SEO technique you are thinking about using for your own website. If you deem an SEO technique to be “gray hat” (something in between ethical SEO and unethical SEO), then chances are that someday that technique will be deemed “black hat” by the Google engineers. And if the SEO technique is “black hat,” it will likely be targeted for discounting by Google at some time in the future.

If you delve into “gray hat” or “black hat” SEO, you may be able to rocket to the top of the Google search results quickly, but you will not be able to hold that position for long. Google will eventually get wise to your gimmick.

If you can “white hat” your website to the top of the search results, your future ranking will likely hold its ground for many years to come.

Your Competition Out-Played You

PROBLEM: If a website dropped a few positions in the Google search results, I am willing to bet that in nearly every situation, one of the website’s major competitors outperformed it in the search results.

There are certain things that you can do correctly to elevate your own listings in the search engine results. And because you have the capability to do these things, your competitors have that same capability.

Well guess what? Your competitor did what you should have done. They set up marketing systems to ensure that they could climb in the search results, and they climbed enough to pass you by.

SOLUTION: Do those things that you know you should be doing. And more importantly, do them consistently. Your competitor wants your placement in the search results, and they are going to do what it takes for them to get there.

So, it is imperative for you to keep moving forward in the process of building your ranking in the search engines. When you stand still, resting in the comfort zone of where you reside today, you are bound to be overtaken by someone who wants to be where you are a little bit more than you do.

Stir That Pot

PROBLEM: If a website dropped a few pages in the search results, the drop may have been caused by a combination of the two above-mentioned problems: shaky SEO foundations and hungry competitors.

SOLUTION: Keep your eye on the big picture. You know what you have done to strengthen your website in the past. Keep learning and keep growing.

If you find that something you had done in the past may affect your website negatively in the future, implement “white hat” SEO techniques to strengthen your position, and make plans to eliminate your reliance on “black hat” SEO systems before Google does it for you.

Don’t Make The Mess And You Won’t Have To Clean It Up

Some may think my approach to housecleaning is an indication of laziness. I consider it to be an indication of good time management. I prefer to reserve my time for things that are more enjoyable.

My theory towards housework — and I try to teach this to my kids often — is that if you don’t want to clean up the mess, don’t make the mess in the first place. If the house is not messy, you don’t have to clean it. So, don’t make the house messy.

I utilize the same approach with my websites. If I do it “white hat” and above reproach the first time, then I won’t have to go in and fix my rankings every time Google changes its algorithms.

Stop Standing Still

Do remember that for almost every search term you compete to rank for, there are probably a million other web pages competing for the same keyword phrase. Getting to the top of the search engine results is a process, not a destination. You must keep competing to maintain your spot at the top of the search results.

If you want to get to the top of the search results, you will have to take positive steps to get there. And, if you want to maintain your top position in the search engines, you need to keep doing the things that put you in that top spot in the first place.

-----------
Bill Platt has been involved in article marketing and link building since 1999.

Thursday, September 18, 2008

How to Make Boring Businesses Exciting

Wouldn’t it be nice if everyone got as excited about your company as you are?

Unfortunately some businesses just aren’t very sexy; in fact, some businesses are downright boring. As a consequence, companies that sell commodity products and routine services tend to rely on presentations that load up on features, specifications, and statistics that may be relevant to anal-retentive types, but are hardly compelling to the vast majority of your audience.
There is no reason why every company can’t deliver an exciting image to its audience, one that generates the kind of buzz and excitement usually associated with companies like Apple, Victoria’s Secret, Benetton, Absolut Vodka, and Sony.
It may seem impossible to produce a whole lot of steam for things like sandpaper, accounting services, and facial tissue, but thanks to the Web and its extraordinary ability to deliver multimedia content, even the most mundane offerings can get hearts racing and the blogosphere blogging.
Emotional Experiences Connect
Let’s take facial tissue as an example; it is one of the most common, boring everyday products you can imagine. There is just not much you can do to sell this stuff other than telling people yours is softer and cheaper than the other guy’s, but the other guy is saying the same thing; the result: consumers buy whatever is on sale. But wait, the clever fellows at Kimberly-Clark instituted a brilliant website campaign for their facial tissue, called “Kleenex - let it out.”
The campaign zeros in on the emotional experience associated with why people use facial tissue: to wipe away tears of joy or sadness, or maybe to clean up cute little runny noses - in each case the result of some moving event.
Tapping into this emotional association is key to the Kleenex campaign, and key to your new thinking on how to make your boring stuff exciting.
Video - The Best Way To Tell A Story
The Kleenex campaign features prominent videos of articulate people telling their personal stories, all resulting in the need to use a facial tissue.
A pregnant woman discusses the emotional impact of having a child and, as her eyes begin to tear, the interviewer hands her a Kleenex. A second video features another well-spoken woman talking about her return to New Orleans after the devastation of Hurricane Katrina. Again, as the woman becomes emotional and begins to cry, the interviewer hands her a tissue. Nothing more needs to be said; this is very powerful storytelling that connects to the audience and delivers an image of the brand as caring and sensitive - exactly the kind of impression the company wants to portray.

Even companies that aren’t exactly dead-from-the-neck-up boring can benefit from this approach. The Home Depot ran a series of advertisements in which a husband shows his wife the power tools he wants. Rather than try to convince his wife, and by association all the wives in the audience, that he needs another expensive toy, the husband points to each tool and states, “this is your new shelving unit” and “this one is your new kitchen” - a far more dramatic and effective way to make the case for a new purchase.
You can deliver the same kind of powerful marketing messages for your own company by presenting Web-based videos that follow a few very simple guidelines.
Six Steps To Turn Boring Into Exciting
1. Use People to Sell People
There is no substitute for people. Human beings are capable of communicating with an enormous degree of nuance and subtlety, using voice, expression, body language, and gesture; no animation, avatar, or artificial substitute can take the place of a real person for communicating meaningful, memorable marketing messages.
With relatively easy-to-use production tools anyone can create a video, but not necessarily one that delivers the message or image that your company wants to present. We have seen far too many poor-quality efforts, both on the Web and even on local television, where company presidents with bad haircuts and ill-fitting suits utter nonsense-riddled scripts in zombie-like performances, exposing themselves to audiences expecting more, much more.
Skilled performers communicate in very subtle ways with an audience, and only the well-practiced professional has the experience and capability to deliver the intended experience. Saving money by doing it yourself or with amateurs can result in delivering an unintended message that undermines the impression and image you are trying to create.
2. Perception Is Reality - Use Scripted Professionals
You will notice that I described the women in the facial tissue videos as articulate. Now, I cannot tell you whether these women were actors, or whether their powerful presentations were scripted, but if I had to guess, I would say these very effective videos were about as carefully produced and constructed as the latest episode of ‘Survivor’; that by no means makes them any less effective.
The point here is that perception is reality, and the professional filmmaker knows how to tell a story and communicate a message; and that is not the same thing as being able to turn on a video camera.
3. Tell A Memorable Story
When we talk about companies telling their story, it is important to distinguish between the company’s history and the emotional experiences generated by the product or service.
Company histories can make for interesting videos and can produce a sense of trust associated with being in business for a considerable length of time, but that sort of presentation does not speak to the underlying emotional and psychological factors that actually trigger a sale.
It is difficult but imperative that businesses understand that marketing is not about you, or even the product or service, it’s about the audience.
Like the Kleenex videos and the Home Depot commercials, every product and service that is purchased from your company represents an experience, a story that relates to your audience’s aspirations and needs. It is the audience’s story that demonstrates credibility, clarifies purpose, penetrates memory, and makes the message compelling.
4. Create An Emotional Experience
The vast majority of decisions we make are colored by the emotional relevance associated with those decisions. No doubt rational factors figure into our decision-making process, but the pivotal factors that attract the use of one product over another are emotional.
If you’re not connecting to your audience on an emotional level, then you are left with a commodity that can only be sold on price and features, and unless you’re a monopoly, there will always be some competitor willing to offer your customers more for less.
When presenting your product or service it is important to tap into an emotional element that your audience can relate to as its key purchasing decision factor. When people purchase boring accounting services and software, what they’re really buying is an improved life style for their families. It really doesn’t matter what you sell, if you look hard enough, you can find the emotional benefit that should be the central element of your marketing message.
5. Create a Believable Relevant Personality
Part of the process of connecting with your audience is creating an appropriate personality for your company. Many corporations today believe in the cult of management personality, but this is a dangerous game. Your company needs a personality of its own, one that is distinctive, that will stand alone, and that is not dependent on senior management’s ego and self-promotion.
Web-video marketing campaigns provide a vehicle that allows companies to create appropriate personalities that engage, inform, and entertain audiences in ways that establish identity and create the basis for a prosperous business relationship.
Clever marketing can create a corporate personality but it is imperative that you follow through and deliver that personality in all aspects of your relationship with your audience. Producing a campaign that promises one thing and a website, staff, and product that delivers another is one of the easiest ways to alienate customers.
6. Deliver A Critical Hot Button Moment
Web-video presentations need to focus on single issues that are driven home by the addition of a hot button moment or punch line. Remember, you are telling your audience a story that needs a beginning, a middle, and an end. That story should build to a climax and deliver the point in a single memorable moment.
The era of point-form slide presentations is dead, along with the laundry-list-of-benefits approach.
Creativity, personality, and the ability to communicate and develop an emotional connection with your audience through Web-video campaigns have the potential to turn even the most lackluster company into a vibrant and exciting marketing business.

-------------
Jerry Bader is Senior Partner at MRPwebmedia.

Wednesday, September 17, 2008

Web Promotion via Social Bookmarking

Social bookmarking (along with other free web promotion techniques) is a very popular practice among bloggers and affiliate marketers. Adding social bookmark links to your blog or web site makes it easy for readers to save and share your content. In a social bookmarking system, users store lists of Internet resources that they find useful. These lists are accessible either to the public or to a specific network, and other people with similar interests can view the links by category, by tag, or even at random.
You must have seen “share this” or “bookmark this” after an article or a blog post. This is a common way to encourage users to social bookmark your articles, and it greatly helps any website acquire good positions in search engines for relevant keywords. Social bookmarking has more recently inspired the more successful social news movement, which includes websites like Del.icio.us, Reddit, StumbleUpon, Digg and the famous Netscape portal.
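As an illustration, those “bookmark this” links are usually nothing more than anchors pointing at each service’s submission page, with your post’s URL and title filled in. Here is a hypothetical snippet for a made-up post at www.example.com; the submission URL formats are the ones these services publicized at the time, so verify each against the service’s own instructions before relying on them:

    <!-- social bookmarking links for a hypothetical blog post -->
    <a href="http://del.icio.us/post?url=http://www.example.com/my-post&amp;title=My+Post">Bookmark on del.icio.us</a>
    <a href="http://digg.com/submit?url=http://www.example.com/my-post&amp;title=My+Post">Digg this</a>
    <a href="http://reddit.com/submit?url=http://www.example.com/my-post&amp;title=My+Post">Submit to reddit</a>

Most blogging platforms also offer plugins that generate these links automatically for every post.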
These days, social bookmarking has become an indicator of any website’s popularity. There are many ways to learn how many people have bookmarked your blog or website on these social bookmarking websites. SocialMeter.com is one of them.
SocialMeter helps you measure the popularity of your website, blog or articles on various social bookmarking websites and blog search engines. SocialMeter scans the major social websites to analyze a webpage’s social popularity. Currently it scans Bloglines Links, Del.icio.us, Google Links, Rojo Links, Shadows Links, Technorati Links, and Yahoo! Links. The process is very simple: you just type in your blog URL and SocialMeter will fetch the details of all stories on social websites that link to your blog. So I hope you will definitely bookmark this article.

---------
Daniel Russo is a successful affiliate marketer.

Tuesday, September 16, 2008

Don’t Risk Your Rankings With Unethical SEO

The worst-kept secret in Internet Marketing is the advantage of a well-wrought search engine optimization campaign. Companies are quickly adopting SEO into their marketing plans and looking for someone they can trust to optimize their Web sites in order to achieve top rankings. However, not all SEO practitioners are created equal, so be sure you know what you’re getting before you sign a contract.

With over 100 million Web sites on the Internet, it is more important than ever to achieve high rankings for visibility. While reputable SEO companies use ethical SEO practices, there are others who will try to trick the engines into high rankings by using questionable techniques. These techniques are known as spam and can get you penalized or banned from the search engines.
Many times, an SEO provider will promise quick, first-page rankings and fail to notify the client that they use spam techniques to achieve those rankings. To avoid falling into this trap, you must be aware of unethical SEO techniques and guard against the companies who use them.

Consequences of Spam

While search engines may have different rules for detecting spam, in the end, the results are the same - you lose your rankings and can be taken out of the index. It is not easy to recover from a ban, so it is important to know the techniques that must be avoided.
Many sites may be unwitting victims of spamming techniques used by aggressive SEO vendors. Whether it happens inadvertently or not, the search engines will penalize or ban the site, temporarily or permanently. If a significant part of your online business model depends on search engine traffic, you could be in trouble. When infractions are serious, it can take many months to convince the search engine that you have corrected the problem and deserve a second chance.

SEO Techniques to Avoid
Search engine optimization practitioners are often divided into two camps: the so-called Black Hats, who use questionable techniques to trick the search engines into ranking them highly, and the White Hats, who prefer to work within the search engines’ guidelines in order to achieve lasting success. A number of Black Hat vendors have turned to White Hat techniques over the past year; however, there are still many unethical vendors soliciting business under the banner of SEO.
Below are some basic spam techniques to avoid. Obviously this list isn’t exhaustive but it will give you a good idea of the types of things that the search engines find to be unacceptable. Familiarize yourself with Google’s Webmaster Guidelines and make sure you know if what is being done to your Web site is in agreement with those rules.
Keyword Stuffing: This practice involves the repetitive use of the same keyword phrase over and over in your Meta tags, Comment tags, ALT tags or in the copy on your pages. When determining the appropriate keyword density for your page content, plan to repeat your targeted keywords no more than six or seven times within 200 words of text on one page. You can use the keyword density analyzer available on our free SEO tools page to determine whether you or your SEO is using a particular keyword too often, or try the quick sketch below.
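To make the density arithmetic concrete, here is a minimal Python sketch — my own quick illustration, not the analyzer on our tools page — that reports how often a phrase occurs per 200 words of copy:

    def keyword_density(text, phrase):
        """Occurrences of `phrase` per 200 words of `text` (simple substring count)."""
        words = text.lower().split()
        if not words:
            return 0.0
        occurrences = text.lower().count(phrase.lower())
        return occurrences * 200.0 / len(words)

    # Hypothetical page copy; aim for a result of six or seven at most
    page_copy = "Red widgets are great. Buy red widgets today, because red widgets last."
    print(keyword_density(page_copy, "red widgets"))

If the number comes back much above six or seven, the copy is probably stuffed.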
Hidden Text or Links: This practice involves inserting hidden text or links that are readable by search engine spiders but cannot be seen by your site’s human visitors. This can be accomplished several ways, the easiest of which is by using white links or white text containing relevant keywords on a Web page with a white background. Your site visitors won’t be able to see this text and will not know it is there, but the search engines will. All search engines consider hidden links or hidden text to be spam and will penalize the page, if not the entire site, for it.
Cloaking: This involves using a software program to serve search spiders a group of pages created specifically to trick them, while human users are redirected to a different set of pages. The user never sees the spam pages, having been sent to the real site. Cloaking can have legitimate uses; some sites use it to redirect based on location. For example, if a site sells a product that is illegal to market in a particular area, it can direct users from that area to a different page where the illegal products are not mentioned. By and large, however, cloaking is used to deceive the search engines and must be considered a spam technique.
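For illustration only, a crude cloak is often nothing more than a user-agent test. This hypothetical Python sketch shows the pattern — shown so you can recognize it, not deploy it — which is exactly what the engines’ spam teams look for:

    def page_for_visitor(user_agent):
        """Serve spiders one page and humans another: the spam
        pattern described above."""
        spiders = ("googlebot", "slurp", "msnbot", "teoma")
        if any(bot in user_agent.lower() for bot in spiders):
            return "<html>keyword-stuffed page served to spiders</html>"
        return "<html>the real page served to humans</html>"

    print(page_for_visitor("Mozilla/5.0 (compatible; Googlebot/2.1)"))
    print(page_for_visitor("Mozilla/5.0 (Windows; U; MSIE 7.0)"))

Note that more sophisticated cloaks key off IP addresses rather than user agents, which makes them harder to detect but also riskier to maintain.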
Doorway Pages: Also known as Gateway or Bridge pages, these are low-quality Web pages that exist only to pass visitors to the main Web site without providing value of their own. Doorway pages are used to achieve high rankings for a particular key phrase while leading the user to a different page irrelevant to the search query. These pages frequently auto refresh to a second Web site. Be sure that every page on your site that you want indexed can be accessed by at least two internal links and that the page provides value to the user.
Mirror Sites and Duplicate Content: This involves the creation of several sites with identical content, placed on multiple servers under different domain names. These sites link to one another and are constructed for the purpose of achieving multiple rankings for identical keywords using the same content. Search engines suppress duplicate content because it holds no value for their users. An optimization company that suggests you interlink all the Web sites you own is doing you a great disservice. These incestuous link rings are a clear violation of the search engines’ spam guidelines.
Link Farms: Google’s quality guidelines suggest that pages contain fewer than 100 links. Link farms typically consist of one page with hundreds of links to sites in different categories that are unrelated to your site’s content. Such pages contain poor-quality content that is useless to visitors. Reciprocal links from these pages hold no value for you at all and could potentially associate you with bad neighborhoods. Avoid these inbound links at all costs, because they can result in serious penalties and banning.
Ask any prospective search engine optimization company about their best practices and be sure you know what they’re doing to your pages. Beware of spammers who claim to be legitimate search engine optimization experts. Realize that no company can guarantee results and if a company claims a special relationship with Google or any other search engine, run the other way.
Unethical SEO techniques can bring you high rankings; however, visibility is short lived. When you use spamming techniques, your site may benefit briefly from high rankings that last for days, weeks or months. Once the search engines discover the use of spamming techniques, they will penalize or ban your site from their indexes. If you are removed from a search engine index, it can be difficult and time consuming to be reinstated. You might even have to start over with a new domain name. So beware of unethical SEO techniques and ask any prospective vendors if they adhere to a Code of Ethics and/or a Code of Conduct. Once achieved legitimately, organic links can last indefinitely. That’s why it’s important to acquire your search engine rankings fairly and maintain them ethically.

---------
Susan Esparza is a writer with Bruce Clay, Inc.

Monday, September 15, 2008

Use Robots.txt, Save the World

Robots.txt helps the search engines learn all about your website

There is a growing interest in the little-known file that every website should have in its root directory: robots.txt
It’s a very simple text file; you can learn all about it at the robotstxt.org website.
Why should you use it? Here are some good reasons to consider.
Controlled access to your content

With a robots.txt file you can “ask” the search engines to “keep out” of certain areas of your website. A typical area you might like to exclude is your images folder: if you aren’t a photographer or painter and your images are for your website’s use only, chances are good that you don’t want them indexed and showing up on image search engines for people to download or hotlink.
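For example, a minimal robots.txt that asks all compliant crawlers to stay out of an images folder (assuming your images live under /images/) looks like this:

    # keep all compliant robots out of the images folder
    User-agent: *
    Disallow: /images/

    # additionally, keep Google's image crawler out of the whole site
    User-agent: Googlebot-Image
    Disallow: /

Remember this is a request, not access control: it only works with crawlers that choose to honor it.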
Unfortunately, grabbers and similar software (such as email harvesting applications) will not read your robots.txt file, disregarding any indication you provide there. But that’s life, isn’t it: there is always someone being disrespectful, to say the least…
You can keep search engines away from content you wish to keep out of sight, but remember that your robots.txt file is also subject to the attention of hackers seeking sensitive targets you might inadvertently list: you could be keeping out the robots while inviting in the hackers. Keep this in mind.

The growing importance of robots.txt

At SES New York, a robots.txt summit was held in which the major search engines (Ask, Google, Microsoft, Yahoo!) participated, sharing interesting information about this file. Here are some numbers.
According to Keith Hogan from Ask:

i) Less than 35% of websites have a robots.txt file
ii) The majority of robots.txt files are copied from others found online
iii) On many occasions robots.txt files are provided by your web hosting service

It looks like the majority of webmasters aren’t familiar with this file. That is going to matter more and more as the size of the web continues to grow: spidering is a costly effort that search engines try to optimize, and websites that use robots.txt to make crawling efficient will be rewarded.

During the summit, all of the search engines announced that they will identify (or autodiscover) sitemaps via the robots.txt file. In essence, search engines are now able to discover your sitemap via a line in the following format:

Sitemap: <sitemap_location>, where <sitemap_location> is the complete URL of your Sitemap Index File (or your sitemap file, if you don’t have an index file).
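For example, for a hypothetical site at www.example.com with a sitemap index file in its root directory, the line would read:

    Sitemap: http://www.example.com/sitemap_index.xml

Per the sitemaps.org protocol, this directive is independent of any User-agent record, so it can be placed anywhere in the file.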
Staying compliant with Google’s guidelines

Robots.txt can also help prevent you from being penalized or banned by Google. In a move to eliminate indexed search results pages, because “web search results don’t add value to users,” Google has recently added the following line to its webmaster guidelines:

- Use robots.txt to prevent crawling of search results pages or other auto-generated pages that don’t add much value for users coming from search engines.

How to implement a robots.txt file

If your website doesn’t have a sitemap and you do not have any areas to exclude, place an empty robots.txt file in your root directory. By doing so you are explicitly allowing full spidering of your entire site.
Carefully review the robots exclusion protocol available at robotstxt.org. If you must exclude numerous areas of your website, build your file step by step and monitor spider behaviour with a log analyser tool.
Test your robots.txt file with a few online tools, and keep in mind that every spider has its own behaviour and spidering criteria.
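One quick self-test before reaching for online tools: Python ships with a robots.txt parser that will tell you whether a given spider may fetch a given path. A minimal sketch, assuming a hypothetical site at www.example.com and the modern Python 3 module name (older versions of Python called the module simply robotparser):

    from urllib.robotparser import RobotFileParser

    rp = RobotFileParser()
    rp.set_url("http://www.example.com/robots.txt")  # hypothetical site
    rp.read()  # fetches and parses the live file

    # Ask the same questions a polite spider would ask
    print(rp.can_fetch("Googlebot", "http://www.example.com/images/logo.png"))
    print(rp.can_fetch("*", "http://www.example.com/index.html"))

Because every spider interprets edge cases slightly differently, treat this as a sanity check rather than a guarantee.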

Avoid useless spidering traffic

When your website grows to a significant size and achieves high visibility, spidering increases to hundreds (if not thousands) of hits per day and will put your server and bandwidth to the test.
Recently I was called on to examine a blog burdened by very unusual and extremely heavy spidering activity: the log file I examined reported in excess of 8 GB of invisible (spider) traffic over a one-month period. Given the small number of daily visitors (fewer than 200) and the small size of the blog (fewer than 100 posts), something was clearly wrong with the architecture.
It took just a few minutes to identify the problem: There was no robots.txt file.
Each request for robots.txt was redirected to the home page of the blog, triggering a complete download of the home page of approximately 250 KB. At 8 GB a month, that works out to over 30,000 unnecessary hits on the home page. The spidering frenzy ceased when an empty robots.txt file was created and uploaded to the server. Traffic is now down from 8 GB to 500 MB.

Keep the spiders informed, help save the world

The web is growing by leaps and bounds, and the use of a robots.txt file helps the search engines allocate their resources effectively; it is also a tangible sign of respect and courtesy. If you don’t have a robots.txt file on your website, set one up now. Use it to inform the crawlers about how your site is organized and how often it changes. I think we should all do our part to avoid wasting resources, saving energy and helping to save the world.

----

By Sante J. Achille