What specifically is the lie here? That they report average view time without trivial views? That seems like a more useful metric.
You also can't really directly compare these simple metrics across sites as they don't get at customer lifetime value. With these simple metrics you run into problems where the average video view may be higher on site A, but the average spend for users is higher on site B, and simply following the metric will lead you to an ineffective campaign. Whose responsibility is it to effectively manage an advertising campaign?
While I agree that more oversight and transparency is welcome here, I think jumping out and calling it fraud is exactly the kind of reaction that makes me feel this is overblown.
Brick and mortar local retailer here. I spend over $100,000 annually on advertising in my local market. That is all digital at this point. Facebook video is a significant chunk.
The lie is about the effectiveness of video on Facebook.
I directed a fair amount of additional ad spend to Facebook precisely because Facebook users on average seemed to be giving a lot of attention to videos on the platform, even if they were promoted. It actually seemed like there was finally something that could work to replace the kind of lift we would see from TV advertising 5-10 years ago.
This calls that into question (which may be putting it mildly). It also calls into question all of the other metrics Facebook provides advertisers.
Does the additional ad spend convert to an acceptable lift in sales? If yes, why do you care about video engagement? If no, why maintain such spend even if the engagement numbers are genuinely high?
How could anyone possibly know? You maintain spend because Facebook claims it's providing value. There's no way to actually know this with any certainty, and Facebook manipulating their figures doesn't add clarity in the least. That's why this whole issue is important — these people are spending millions on ads because of the numbers Facebook reports. Then people are spending millions more on stocks because of the profit generated by aforementioned hand waving.
This is a bias of online thinking. Brick and mortar retail doesn't really know the conversion rate on ad spend. The old adage is that you are wasting half of your advertising spend, but the problem is that you never know which half.
So, we make decisions based on imperfect information. When it was TV or Radio ads, there were 3rd parties that measured the market and reach of the station. With Facebook, Youtube, etc we have to take their word for it.
You don't measure the conversion in case of a brick and mortar vendor. But you can still measure proxy values for effectiveness: an uptick in sales volume or number of clients after starting a campaign, for example.
But we aren't just running one form of advertising at a time, so what do we attribute the uptick to?
Also, our marketing doesn't exist in a vacuum. Weather, competitors, a traffic jam down the street, variations in employee performance, etc all impact the performance as well.
It's not like we can just A/B test Facebook video ads vs Pandora Ads and pick the winner.
So, we shift our budget based on the available information.
And the average video view length is part of the information that we used to make decisions.
He said that he used the average view length on FB as a reason to choose that channel. If he had known that they didn't factor in <3-second views, he would perhaps not have used that channel.
See my reply to the comment above. With local retail, you can't just measure the conversion rate.
It is not about the specific number of seconds. It is about the implied engagement and focus of the audience. It is about a reported difference in the behavior of Facebook users vs other online properties.
The one thing that would make it misleading is if they reported the average time excluding views under 3 seconds, but reported impressions including them. From the article it is not clear whether this is the case.
I think your comment is the key one. I actually don't see a problem at all with what they've reported IF they also separately excluded views under 3 seconds from their total count of views (i.e. the denominator). If they included them in view count, I would think a lawsuit would be warranted.
The dimensions to break out reporting columns for various view duration percentages have always existed. Any advertiser worth their salt is looking at the breakdown for stats on meaningful views, because outside of cases where your message is delivered in a couple of seconds, you typically only care about longer views and would just look at those numbers.
So the question is how deeply this polluted other metrics.
As far as I understand it, what happens is that views under 3 seconds aren't counted toward the average. So if you have 999,999 views of roughly 0.5 seconds each and 1 view of 3 seconds, the reported average is 3 seconds / 1 view = an average view time of 3 seconds, while the true average is roughly 500,000 seconds / 1,000,000 views = 0.5 seconds.
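The skew described above is easy to demonstrate. Here's a small Python sketch with made-up numbers (the 0.5-second figure is an illustrative assumption, not data from the article), showing how dropping sub-3-second views from the average inflates it:

```python
# Made-up view durations: 999,999 half-second views plus one 3-second view.
views = [0.5] * 999_999 + [3.0]

# Reported average: views under 3 seconds are excluded before averaging.
counted = [v for v in views if v >= 3.0]
reported_avg = sum(counted) / len(counted)

# True average: every view counts.
true_avg = sum(views) / len(views)

print(f"reported: {reported_avg:.1f}s, true: {true_avg:.1f}s")
# reported: 3.0s, true: 0.5s
```

The averages differ by a factor of six even though almost no one watched the video, which is exactly why the metric misled people comparing channels.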