Personally, the minute I see a page with double-underlined links I a) downgrade my expectations of that page’s content, b) skirt round the links at all costs when mousing around, and c) curtail my visit to the site. So you could say yes, my awareness was raised – but for all the wrong reasons.
In the scenario presented, the very choice of link wording – ‘skin condition’ – is designed to appeal to those for whom that may be a concern. After all, if you have a baby-soft glow from head to toe like me, why would you care to click? The second paragraph focuses on abnormally dry skin, not ‘winter dry skin’ – so that page would (in search) draw in people with skin problems. So if you have clicked on the link, you are more likely to be in a group for whom purchasing the product is a possibility: “Good news, everyone!” (< my Hubert J. Farnsworth impression).
In short, there is far too little data on the nature of the user group, the context in which the tests were run, and the polarity of the awareness (+/-).
But there are some nice green upward-pointing arrows :)
IN-TEXT IMPROVED COMPETITIVE POSITION
Saying that the brand position improved because the control group was at x% and the test group was at x% is inaccurate:
– they should have interviewed the control and test groups about their feelings on the brands BEFORE the test, and then again after;
– just because the control group didn’t see the ads doesn’t mean they had the same initial feelings about the brands as the test group. It’s possible that the test group had a higher opinion of Vaseline to begin with.
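That pre/post idea can be sketched as a simple difference-in-differences comparison. Every number below is invented for illustration – the study reports no such data – but it shows why the *change* in each group matters more than the final levels:

```python
# Hypothetical difference-in-differences sketch: compare the *change* in
# brand opinion (say, on a 1-10 scale) in each group, rather than the
# post-test levels alone. All numbers are invented for illustration.

def mean(xs):
    return sum(xs) / len(xs)

# Pre- and post-exposure opinion scores for the same respondents
test_pre,    test_post    = [6, 7, 5, 6, 7], [7, 8, 6, 7, 8]
control_pre, control_post = [5, 6, 5, 5, 6], [5, 6, 5, 6, 6]

test_change    = mean(test_post) - mean(test_pre)        # how the exposed group moved
control_change = mean(control_post) - mean(control_pre)  # background drift
ad_effect = test_change - control_change                 # what the ad itself added
```

If the test group simply started out fonder of the brand, a post-only comparison would credit that head start to the ad; subtracting the control group’s drift removes it.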
If you accidentally move your cursor over the double-underlined link, you trigger a pop-up with a video in it that automatically plays, with loud audio.
Bleurgh.
With this effectiveness comes irritation, and if it reaches a certain level, users will leave the site and never come back. This is the real issue. It would be easy to run an experiment that shows the short-term impact on awareness. But the long-term effects are the big question (as Ethan said). It is also obvious what most big publishers think the answer is: very few run in-text ads.
A scientific approach means taking a cold, dispassionate view of the question at hand to understand what’s happening, removing as much bias as possible – acting more like a judge seeking the truth rather than a lawyer gathering evidence. This should help avoid a methodology geared to finding desired results.
Then again, this is clearly a piece of marketing, so in this scenario they’ve probably done a reasonable job!
Simulated! So they contrived a made-up test and then used it to pretend they had collected real, meaningful numbers.
Some comments:
– First, it’s not clear whether the study was conducted in a lab setting or not; if people were invited to a market research session in a dedicated facility, it is likely they paid more attention to the page text than they normally would have;
– What about the size of the groups?
– What exactly were the conditions of the two groups? It isn’t explained; e.g. was “being exposed to the ad” the only relevant difference?
– No measure of the confidence in the results is given;
– Does “who saw the ad” refer to the whole exposed group or just to a fraction of it?
– “Users who clicked were most engaged”: is that really informative? It seems obvious to expect people who click through to be more interested;
– “Users who clicked were most engaged”: I suppose the percentages are computed on groups of different sizes (users who clicked vs. the control group); again, there is no measure of the statistical significance of the comparison;
– About share of mind: again, is the 6% uplift really significant from a statistical point of view, or was it just obtained by chance?
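To make the “just by chance?” question concrete: with the group sizes in hand, a two-proportion z-test would answer it. The post gives no sample sizes, so the numbers below are invented purely to show the mechanics:

```python
# Minimal two-proportion z-test sketch. The group sizes and counts here
# are hypothetical -- the study reports neither.
from math import sqrt, erf

def two_proportion_z(success_a, n_a, success_b, n_b):
    """Two-sided z-test for the difference between two proportions."""
    p_a, p_b = success_a / n_a, success_b / n_b
    p_pool = (success_a + success_b) / (n_a + n_b)   # pooled proportion
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # two-sided p-value via the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# e.g. a 6-point uplift: 31% of 200 exposed users vs. 25% of 200 controls
z, p = two_proportion_z(62, 200, 50, 200)
```

With groups of 200, that 6-point gap comes out non-significant at the usual 5% level; the same gap with much larger groups would be. This is exactly why the missing sample sizes matter.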
About the suggestions:
– I cannot say that Vibrant fails in its communication, if their purpose is to convince people that in-text ads work. But if we adopt the perspective of critical thinking, then we should add information like the sample size, the conditions of the experiments, and at least a confidence interval.
But again: I suppose the people at Vibrant, aiming to prove they are right, preferred a more straightforward message to overloading the audience with this “technical” information.
thanks for the post!
cheers
carlo
I’m also wondering where they got the “9x” from.