21 June 2007

Google Website Optimizer Tips

Having really put Google Website Optimizer to the test at this point, I have decided to publish some possibly useful tips for people trying to push the limits of this free tool. Here I will list a group of things that took some serious trial and error to get right. It may save some time for those willing to read.

1. The importance of testing is pretty obvious, but how do you really run a full-page A/B test? I know Matt Belkin at Omniture will tell you never to test more than one element as an A/B without considering the possibility that what you are really running is a multivariate test. However, we have some pages with identical content that we decided to test in a new layout and format. When I was asked whether it could be tested, I hastily said, "Sure, I can do that...," believing it would be as simple as taking the code, tagging it, and letting it go. That was not the case.

2. Make the fancy page the original and use the original as the variation. As you may or may not know, GWO places some limits on what can be passed in and out of the 'utmx_' sections you define during set-up. The variations cannot contain active scripts for things like drop-downs or tabs. So, in a case like ours, where the new layout is the proposed next phase of transition for a number of our pages and the test design doesn't allow for it, you have two choices: make it work, or... make it work. Instead of taking the original, splicing it up with a tag, and passing the new page in and out of it (our site uses four PHP includes which build a template around the static pages), we moved the new content into the original slot, where all the links point, and made the less complicated original version the variation that could be passed in and out of the tags. BINGO... works like a charm.
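For anyone who hasn't tagged a page yet, the swap looks roughly like this. The markup follows GWO's standard page-section markers, but the section name and element IDs here are hypothetical, a sketch rather than our production code:

```html
<!-- The fancy new layout lives in the page itself as the default content
     of the GWO section, so its active scripts (tabs, drop-downs) run
     normally. Only the simpler legacy markup is uploaded as the
     script-free variation that gets swapped in and out. -->
<script>utmx_section("Page Body")</script>
  <!-- new layout with active scripts renders here by default -->
  <div id="new-layout">...</div>
</noscript>
```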

3. Split traffic with a redirect. Our pages work on a template, as I stated above. For that reason, when we used a redirect to split traffic, it worked, but only after loading the page includes before the redirect occurred. That does not mean the idea should be abandoned so easily. If your page does not rely on includes or any server code that has to be called before the body, you can create a variation which, when passed into a variable given priority in the code, calls the redirect and splits the traffic to a new page on the site. For me, maintaining fluidity was more important than making the redirect work, so I saw no value in it here, but I'm sure someone can use that tidbit.
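If your pages don't depend on includes, the split itself is trivial. A minimal client-side sketch (the function name and URL are mine, not GWO's):

```javascript
// Minimal sketch of a 50/50 redirect split. Passing the random number in
// as an argument keeps the decision testable; in a live page you would
// call it with Math.random().
function chooseVariation(redirectUrl, rnd) {
  // rnd is in [0, 1); the lower half stays on the original page.
  return rnd < 0.5 ? null : redirectUrl;
}

// In the page head, before any content renders, something like:
// var target = chooseVariation('/new-layout.php', Math.random());
// if (target) { window.location.replace(target); }
```

Using `location.replace` rather than setting `location.href` keeps the interim page out of the browser history, so the back button behaves sanely for the redirected half.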

I hope these relatively simple tips are helpful to you if you are interested in getting the most from Google Website Optimizer. If you have questions, suggestions, or comments, please feel free to voice them. A note to those interested, however: I will be migrating the blog to the CableOrganizer.com site over the next week. Any articles you have read here will be available there from this point on. When the migration is complete, I will publish the link here, in the Yahoo! web group, and through my Technorati profile.

16 June 2007

Call to Action: Hierarchical Cruciality in Colors & Their Intensity

Throughout my recent experiments and tests, a notion occurred to me for which I have found it difficult to uncover any existing, definitive, supported best practices. This notion, which for lack of formal acknowledgement I will call hierarchical cruciality, holds that people respond to calls to action based on their position within a path to conversion, with a correlative level of intensity in the calls. To state it more clearly: button colors and their intensity should be chosen based on their linear point within a funnel to conversion. My theory is that not only are color and intensity important, but that path-relative color and intensity tuned to the right frequency may, at the outcome of my experiment, lead to substantial increases in feature-specific use as well as conversion.

I've recently read a blog by Jonathan Mendez, I believe it's called Optimize and Prophesize, located at http://www.optimizeandprophesize.com, which I found insightful in my quest to uncover current best practices on this topic. In it, Mendez said his multivariate tests had shown that using a single button or color for buttons and calls to action across a site was flawed. His testing, he asserted, revealed that people in different paths react differently to the same stimuli. He listed several factors, with respect to buttons and colors specifically, that lent themselves to the optimization of certain funnels. I thought this was brilliant. Although a little intensive to implement, I think his findings are probably very accurate.

In the case of CableOrganizer.com, pathing reports consistently yield statistical data telling me that paths are as unique as the individuals who click through them. To assume that each of these individual paths and significant funnels reacts to the same cues is naive. However, being able to predict the behaviors and adjust to them with the appropriate cues is inherently very difficult. So the case becomes a matter of testing the most significant and valuable path and optimizing it with the best variations for a maximized outcome.

As a subcomponent of my theory, I would like to state that orange is the 'new red'. Red is a very action-evoking color, don't get me wrong. However, I would assert that while red is valuable in its ability to get the user's attention, it is not a singularly powerful cue. Further, as many have noted, it carries negative connotations which repel as often as they entice. Red is the color of debt; imagine that thought as you plan to use your high-interest credit card. It is also the color of danger and several hundred other apprehension-eliciting sentiments. Orange, a vibrant alternative, has less volatile implications and can be very visible in digital media. Of course, any good analyst at this point should be saying: prove it.

CableOrganizer.com recently implemented an on-site, action-prompted mini-cart. The packaged button for this cart was a red image with size 10 font that said "checkout". It was nearly invisible and somewhat repulsive when the cart appeared after adding an item. I immediately thought back to our usability tests and the functional invisibility of our shopping cart. We had previously used a yellow button; that stunk. Then we used this red one, which was also visibility-impaired. So I created a sizeable orange version. The outcome? Suffice it to say that, having collected custom link data for more than a few days, more than 40% of the people exposed to the button click on it. My estimation is that, once significant data has been collected on this particular instance, this change will have improved conversion by as much as 10%. In our context that means going from 2.7% conversion to 2.97%, but over the course of a year or more, that is significant.
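The arithmetic behind that estimate is just a relative lift applied to the baseline rate. A quick sketch (the helper name is mine):

```javascript
// A relative lift applied to a baseline conversion rate: a 10% relative
// improvement on 2.7% yields 2.97%, as described above.
function applyRelativeLift(baseRatePct, liftPct) {
  return baseRatePct * (1 + liftPct / 100);
}

var improved = applyRelativeLift(2.7, 10); // ~2.97
```

Note this is a relative lift, a 10% improvement of the rate itself, not 10 percentage points; the difference matters when quoting results.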

My goal now is to uncover a comprehensive color-based behavior mapping that could be implemented on our site, meeting the de facto policy of CableOrganizer.com with a sophisticated method of presenting action elements that are ideal for their position within the funnel while enhancing user experience. Doing so should yield significant positive gains and a cross-section of business-to-business customer behaviors that could be applied across the industry in best-practices models. Further, it may provide insights into human and user behaviors online that would be applicable across cultures and industries.

I would like to ask anyone with interest in this particular study to submit any research, comments, observations, or experiences through this blog or via email, to aid my preparation of elements and help me make all possible considerations. While this is work-related, it may demand a significant amount of time that my employment may or may not accommodate. If anyone is willing to provide elements or ideas pro bono, please also submit those by these means.

12 June 2007

Web Analytics an Hour a Day: Preliminary Review

On Monday I received my copy of Avinash Kaushik's "Web Analytics: An Hour a Day," published by Sybex. At first glance, it seemed like any other book on analytics you would buy off a shelf in the Geeks and Dweebs section of the local Barnes & Noble. When I opened it, the first thing that caught my eye was a small box indicating that all the profits and proceeds of the book are being donated to two separate medical causes. Aside from my firm belief that the author is as dedicated to the practice of web analytics as anyone I know, I am sincerely touched by his strength of character in taking the initiative to provide this much-needed source of revenue to respected medicine. Sláinte, Avinash...

Avinash seems to have taken the time to pin down the key points of the evolving use of analytics as it applies generally to business. He's built a few systems of thought around rudimentary ideas which nearly anyone can grasp, and developed more complex modes to help coax out new ideas. One idea I particularly enjoyed was his explanation of the Trinity, commencing on page 15. Using this theme, Kaushik ties together integrated analyses of Behaviors, Experiences, and Outcomes and shows how they influence each other.

The central foundation of the Trinity Avinash discusses is the production and extraction of actionable insights and metrics. For us, this is a simple identification of the low-hanging fruit: the items needing the most work to perform their duty on the page, whether for navigation or otherwise. This might be as simple as looking at how clearly defined the navigation is, or how visible our "View Cart" button is.

Based on the Trinity, then, we would find a way to quantify the behavior exhibited by people who had access to that element or were exposed to it. This takes on the chore of asking: of everyone who had the need or opportunity to interact through this provided means, who did or did not use it, and why? In the case of our "View Cart" button, we actually built a custom link in our Omniture suite and pair the stats on collected link executions against our log files, which tell us how many times the item was presented. We can even go so far as to monetize the button and compare it to a proposed improvement over similar timelines. When the outcome of changes made based on the behavior can be clearly correlated, it gets placed in the "experiences" area of the Trinity. These aren't epistles which remain unchanged until the end of time, but operational relationships which exist for any site in its current and evolving state.
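The pairing described above boils down to a simple ratio. A hypothetical helper for the comparison (the name and the example numbers are mine, not Omniture's):

```javascript
// Click-through rate for an element like the "View Cart" button:
// custom-link executions (from the analytics suite) divided by
// impressions (from the server logs).
function clickThroughRate(clicks, impressions) {
  if (impressions <= 0) return 0; // avoid dividing by zero
  return clicks / impressions;
}

// e.g. 432 recorded clicks against 5400 presentations:
// clickThroughRate(432, 5400) -> 0.08
```

Running the same calculation for the proposed variant over a comparable timeline gives the two numbers you actually compare.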

This is only the first of the sections I've read. I enjoyed it very much and am looking forward to reading and reporting on a great deal more. If you have the means, or the gusto to ask the boss, get out and pick up the book. I'm sure you can read a great deal of similar insight and news on Avinash's blog, Occam's Razor (http://www.kaushik.net/avinash/). I'll continue reading and reporting on areas I think will benefit the useful content of my blog, and pass along to people in the practice some of what I got out of the publication.

If anyone wants to open discussion on what I've mentioned here, or take some time to open up new threads, please feel free to do so here. I will respond when time allows and it does not interfere with my ability to perform the duties responsible for my primary source of income.

06 June 2007

Upside Downside of Taguchi Tests Revealed in Progress

Okay, so there are a couple good things and a small number of bad things which I find are happening in the use of Google Website Optimizer for the Taguchi tests.

First: WEBSITE OPTIMIZER NOTE: Your collection code for the goal DOES have to be after the analytics tag, however, it DOES NOT have to be before the 'body' tag.
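A rough sketch of what that ordering looks like on the goal page. The account ID and goal path below are placeholders, and the snippet is simplified from what GWO actually generates; the point is only the ordering: the analytics tag comes first, the conversion call after it, and both may sit inside the body rather than before it.

```html
<body>
  <!-- analytics tag first -->
  <script src="http://www.google-analytics.com/urchin.js" type="text/javascript"></script>
  <!-- conversion tracking after the analytics tag -->
  <script type="text/javascript">
    _uacct = "UA-XXXXXXX-X";           /* placeholder account ID */
    urchinTracker("/0000000000/goal"); /* placeholder goal path */
  </script>
  ...page content...
</body>
```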



Tests are progressing very quickly. I actually had to shut down the test, copy it, and restart twice since the original post, because an update to our site from a saved version deleted some tags. It made for a frustrating couple of days until I could clear up the data. However, once the tests get out there, they run very quickly with limited traffic exposure, and you can begin to see projections, albeit with a wide confidence interval, within a couple of hours.

The picture of element value is much clearer. When we built the test variations, we purposely built in a few 'dogs' which we knew would be terrible for conversion in each of the element variations. These included purposely bad spacing, terrible wording, image problems, and excess text. I mean, we really tried to see how bad we could make something. In the variations with clearly bad elements, there is corresponding poor performance. This is wonderful as far as I'm concerned, because it validates our hypotheses about conversion-driving elements. Paul Holstein actually put out a post to the Yahoo! group on these, which is available at: http://tech.groups.yahoo.com/group/webanalytics/message/9383
We're pretty sure the list is worth printing and having on hand prior to testing.


Editability is much more complicated. With the boons of quick results and dynamic testability come the burdens of human development. Errors not discovered prior to launching the test create the possibility that the test will not give a clear picture. When things DO go wrong, whatever progress was made is wiped out, and you have to copy, then restart, the test with the corrections. With single-element, full-factorial tests, you can still make changes to elements that are not tied directly into the HTML (say, an image needs to be clearer or its link needs adjustment).

Element strength is combination strength. With Taguchi tests, the combination is the element. This means the test is geared toward optimizing the best possible combination, not the strength of a single varied element. While this falls under the CON section, it's only a conditional con, as it makes for great follow-up testing with your traditional uses of the Website Optimizer.