SES San Francisco. Live Reporting Starts August 12

Technology Report will be live blogging SES San Francisco, one of the world's oldest online marketing conferences and arguably the top event in the field*. Most of that coverage will be here at Technology Report, but I will also have posts at JoeDuck and our Retire USA Retirement Blog when the topics may be of interest to seniors and retired folks.

Technology Report was actually started with my California technology buddy John Ghysels to cover SES plus other conference events in Silicon Valley like Mashup Camp and the Startup Camps (I miss those!). I have not been to SES for several years, though, and I'm looking forward to reporting from there this year and giving extra attention to the topic of Search Engine Optimization, or "SEO", something always near and dear to me as somebody who supervises so many websites.

Social media has shaken things up quite a bit, both online and in the SEO field. In my view Google has become much more conservative in ranking websites. Google now assumes, quite correctly, that most new websites are spam or very low quality, and therefore looks for "big signals" before allowing new sites to rank well for valuable terms. This adds yet another burden to new websites that deserve to rank well, especially those seeking valuable niche markets.

For more about the upcoming SES Conference, visit their website, review the SES Conference Agenda Online, or go directly to the following conference highlights (these link directly to the SES official site):


SES Conference first-timers will want to note that this conference is fast and furious, and you can't possibly take in everything over the course of three days. I'd recommend you review the materials carefully before the show, pick a few topics and speakers you'll want to hear, and be sure to get to those talks. Ask a question or introduce yourself to the speakers afterwards. (Tip: you'll certainly want to have Google's Avinash Kaushik on that list.)

SES Parties: Sadly, the incredible "Google Dance", held at the close of SES on the Google campus in Mountain View, is no longer a feature of the SES conference. The burst of the dot-com bubble meant downsizing of both internet companies and their parties, and the Google Dance was the best of them. Still, you'll want to keep your eyes open for party opportunities with the many exhibitors at the conference.

* There are basically three major online strategy conferences: SES, SMX, and PubCon. The history of these events is quite fun and interesting.

SEO Insights from two top experts

Two of the sharpest tools in the Search Engine Optimization ("SEO") shed talk about the evolution of SEO in an April 2009 interview of Ralph Tegtmeier by Aaron Wall at Aaron's excellent blog, "SEO Book".

Although I don't endorse some of the SEO tactics they discuss, it's important that everybody have a better understanding of what very advanced folks are doing to adapt to the many changes in search over the past several years. Also very interesting is the discussion about SEO morality. I'm not as critical of Google as Ralph is in this interview, but I do agree that Google's dominance has severely distorted the way the internet would ideally assign rankings to sites. The best example of this, in my view, is the overzealous use of "nofollow" tags. Older, inferior, heavily SEO'd content is trumping fresh, high-quality new content because the incoming links to that new content are too often nofollowed, coming in from Twitter, Facebook, WordPress, Flickr, and other major sites that automatically nofollow links.
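To make the mechanism concrete: a nofollowed link carries rel="nofollow" in its anchor tag, which tells search engines not to pass ranking credit through it. Here is a minimal sketch, using only Python's standard-library HTML parser and a made-up page snippet, of how one might audit which outbound links on a page are nofollowed:

```python
from html.parser import HTMLParser

class NofollowAuditor(HTMLParser):
    """Collects each outbound link and whether it carries rel="nofollow"."""
    def __init__(self):
        super().__init__()
        self.links = []  # list of (href, is_nofollow) pairs

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attr = dict(attrs)
        href = attr.get("href")
        if href is None:
            return
        # rel is a space-separated token list, e.g. rel="nofollow noopener"
        rel_tokens = attr.get("rel", "").lower().split()
        self.links.append((href, "nofollow" in rel_tokens))

# Hypothetical page fragment for illustration only
page = '''
<p><a href="https://example.com/fresh-post" rel="nofollow">tweeted link</a>
<a href="https://example.com/old-page">editorial link</a></p>
'''

auditor = NofollowAuditor()
auditor.feed(page)
for href, nofollow in auditor.links:
    print(href, "nofollow" if nofollow else "followed")
```

In this toy example the freshly tweeted link passes no ranking credit while the plain editorial link does, which is exactly the asymmetry described above.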

Google would say this is necessary to avoid the kind of manipulation that doesn't serve users, but I remain skeptical that this approach has done more good than harm, and certainly Google's "very low transparency" working philosophy has stunted the growth of quality content. I know this for a fact because my own decisions in developing content have changed greatly over the years, knowing that quality is often not rewarded, and site downrankings are so confusing that they lead one to abandon sites rather than improve them.

Social media gives us a wonderful opportunity to use human input to screen out junk, and I think better use of this by Google would open up newer, better sites that currently fall well under the radar.

Online Marketing: Beware of Bad Statistics

One of the cornerstones of good internet marketing is knowing your statistics, and you’d think with all the elaborate, inexpensive and free measurement and analytical tools everybody would have a great sense of how their sites stack up to the competition.

But you'd be wrong.

In fact, even many large companies struggle with high-quality analysis, even as the tools get better and the measures s-l-o-w-l-y reach some level of standardization. For most small companies, metrics are, literally, more misses than "hits": webmasters routinely misinterpret or misrepresent website "hits" as viable traffic, when hits are often simply a count of the total files downloaded from the site. A graphics- or data-intensive website can generate hundreds of hits from a single visitor.
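The hits-versus-visits gap is easy to see in a raw server log. The sketch below uses a handful of invented Apache-style log lines (all IPs and paths are purely illustrative) to show how a single visitor loading one page can generate several "hits", one per file:

```python
import re

# Hypothetical access-log lines: one visitor loads a page plus its assets,
# a second visitor loads just the page.
log_lines = [
    '203.0.113.5 - - [12/Aug/2010:10:00:01] "GET /index.html HTTP/1.1" 200 5120',
    '203.0.113.5 - - [12/Aug/2010:10:00:01] "GET /style.css HTTP/1.1" 200 880',
    '203.0.113.5 - - [12/Aug/2010:10:00:02] "GET /logo.png HTTP/1.1" 200 4301',
    '203.0.113.5 - - [12/Aug/2010:10:00:02] "GET /photo1.jpg HTTP/1.1" 200 99182',
    '198.51.100.7 - - [12/Aug/2010:10:05:44] "GET /index.html HTTP/1.1" 200 5120',
]

# Every requested file counts as a "hit"
hits = len(log_lines)

# Filter out supporting assets to approximate actual page views
ASSET = re.compile(r"\.(css|js|png|jpe?g|gif|ico)\b", re.IGNORECASE)
page_views = sum(1 for line in log_lines if not ASSET.search(line))

# Very crude visitor count: unique client IPs (first field of each line)
visitors = len({line.split()[0] for line in log_lines})

print(hits, page_views, visitors)  # 5 hits, but only 2 page views from 2 visitors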

Even when the analysis is good, the reporting is often opportunistic or manipulative, and it's often done by the same team that is accountable for the results. This is a common problem throughout the business metrics field. Executives are well advised to have any business-critical measurements independently audited by unbiased parties.

Consider learning and using analysis packages like Google Analytics – a brilliantly robust and free tool provided by Google to anyone.

A while back Peter Norvig, one of the top search experts over at Google (and a leading world authority on Artificial Intelligence), published a little study indicating how unreliable the Alexa metrics were with regard to website traffic. (Thanks to Matt Cutts for pointing out Peter's paper.)

The results demonstrate that Alexa was off by a factor of 50x (i.e., an error of roughly five thousand percent!) when comparing Matt Cutts' and Peter's site traffic.
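As a sanity check on that arithmetic, here is the factor-to-percent conversion with purely illustrative figures (the numbers below are assumptions, not the actual traffic from the study):

```python
# Assumed figures for illustration only: an external estimate vs. the
# site's own log-file count of daily visits.
estimated_visits = 50_000
actual_visits = 1_000

factor = estimated_visits / actual_visits                      # how many times too high
percent_error = (estimated_visits - actual_visits) / actual_visits * 100

print(factor, percent_error)  # 50.0 4900.0
```

So an estimate that is 50x too high corresponds to an error of 4,900 percent, i.e., roughly the "five thousand percent" figure quoted above.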

Although this is just an anecdotal snapshot indicating the problem, and perhaps Alexa is better now, I’d also noted many problems with comparisons of Alexa to sites where I knew the real traffic.   50x seems to be a spectacular level of error for sites read mostly by technology sector folks.   It even suggests that Alexa may be a questionable comparison tool unless there is abundant other data to support the comparison, in which case you probably don’t need Alexa anyway.

Of course, the very expensive statistics services don't fare all that well either. A larger and excellent comparison study by Rand Fishkin over at SEOMOZ collected data from several prominent technology sites, including Matt Cutts' blog, and concluded that no metrics were reasonably in line with the actual log files. Rand notes that he examined only about 25 blogs, so the sample was somewhat small and targeted, but he concludes:

Based on the evidence we’ve gathered here, it’s safe to say that no external metric, traffic prediction service or ranking system available on the web today provides any accuracy when compared with real numbers.

It's interesting how problematic it's been to accurately compare what is arguably the most important aspect of internet traffic: simple site visits and pageviews. Hopefully, as data becomes more widely circulated and more studies like these are done, we may be able to create some tools that allow quick comparisons. Google Analytics is coming into widespread use, but Fishkin told me at a conference that even that "internal metrics" tool seemed to have several problems when compared with the log files he reviewed. My own experience with Analytics has not been extensive, but the data seems to line up with my log stats, and I'd continue to recommend this excellent analytics package.