Can There Be AV Product Shoot-Outs?

I had a flashback last week, prompted by the AV industry’s own Gary Kayye. In a blog post, Gary drew attention to online videos by a company we’ll call AV Manufacturer 1, comparing one of its switchers to those from two others. One of the others, which we’ll call AV Manufacturer 2, took issue with the comparisons and issued a response.

What should come as no surprise is the fact that two competitors don’t agree on whose switcher is better. That’s the way it should be. Competition yields better products for all. What initiated the flashback was the level of technical detail with which each side argued its case. (Whose case was more valid is a subject for other blogs.) It got Gary thinking about the days of the Projector Shoot-out at the InfoComm Show. It got me thinking of my days testing computer equipment and whether we’d ever see third-party testing like that in AV.

I take no sides in the current matter, but the techie in me admires the endeavor of attempting to determine, empirically, if one product performs better than another, whether it’s a switcher or some other AV product. I’d like to know if one switcher (or other AV product) performs better than another, and another, and another. And knowing that, I’d factor it into my purchasing decisions, though it would never be my only criterion (service, support, software, and features matter too). I’m probably preaching to the choir here.

My first job out of college was with a prominent computer magazine. We tested stuff all the time. My first job out of grad school was with another prominent computer magazine. Same thing. Readers appreciated it, but it was not without a significant behind-the-scenes investment in time and resources (plus angst). To do a product shoot-out, you need to establish a couple of things: first, that there are meaningful differences between products; and second, that people care. Even then, true product shoot-outs, conducted by interested third parties, should meet a very high bar.

Tests need to be defensible. You can’t design a test in a vacuum. You should do it in concert with the most prominent companies that might be subject to the test. In the case of a switcher shoot-out, you need to at least offer switcher makers input on how the test(s) should be designed. You can’t please everybody all the time, but you need to try.

Tests need to be transparent. If you can’t (or don’t want to) get participation from other parties, you need to at least be detailed about your methodology.

Tests need to be repeatable. If you test a product one time, you should be able to test it another time and get the same or similar results. If slight variability is expected, you should test every product the same number of times and average the results. (The fact that you’re doing so, and everything else for that matter, should be part of your publicly available methodology.)
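To make the averaging point concrete, here’s a minimal sketch in Python of what “same number of times, then average” looks like in a test harness. The run_trial workload, product names, and trial count are hypothetical stand-ins, not any real switcher test:

```python
import statistics
import time

def run_trial(workload):
    """Hypothetical stand-in for one pass of a real test harness
    (e.g., measuring switching latency). Times a dummy workload."""
    start = time.perf_counter()
    for item in workload:
        _ = item * 2  # placeholder work; a real harness would exercise the device
    return time.perf_counter() - start

def benchmark(product_name, workload, trials=5):
    """Run the identical trial a fixed number of times and average the results."""
    times = [run_trial(workload) for _ in range(trials)]
    return {
        "product": product_name,
        "trials": trials,
        "mean_s": statistics.mean(times),
        "stdev_s": statistics.stdev(times),
    }

# Every product gets the same workload and the same trial count,
# and the trial count itself belongs in the published methodology.
workload = list(range(100_000))
for name in ("Switcher A", "Switcher B"):
    r = benchmark(name, workload)
    print(f"{r['product']}: mean {r['mean_s']:.5f}s, "
          f"stdev {r['stdev_s']:.5f}s over {r['trials']} trials")
```

Reporting the spread alongside the mean, as above, also lets readers judge whether a difference between two products is bigger than the run-to-run noise.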

Tests themselves need to be publicly available. If you’ve written software to test products and collect results, you need to offer the code. If you test product A and get certain results, the manufacturer of product A needs access to the tests and should be able to reproduce your results under the same circumstances in which you tested.

And here’s where things start to go off the rails. If none of the above is in place, you open yourself to skepticism. Shoot-outs by manufacturers are not necessarily invalid. Some are very well done, whether in AV or another consumer product category. But few would argue that they reflect cooperation with their competition, for example.

And all of the above takes resources, which is why it’s not done very often. I spent several years “benchmarking,” as we called it, personal computers using tests that took years to develop. And we saw it all: companies reverse-engineering the test, submitting test systems designed specifically to generate top scores, and so on. It was an ongoing (and expensive) effort, but the methodology was sound and earned respect.

I also spent years swapping video cards in and out of computers and testing their graphics performance. Same thing. And in those years, I rubbed elbows with other people doing similar tests: consultants, researchers, entrepreneurs, individuals with an avid interest. And yes, manufacturers. Guess who has great resources to test products? The people who make them. Many of them test products as part of their QA process. Sometimes, their tests are very good and do rise to most of the levels above.

Testing products is easy. Testing them well is hard. And if test results are for public consumption, then testing well is a responsibility. Consumer Reports, a nonprofit organization, sets a high standard, but it can’t help much with pro AV systems.

Over time, the performance differences between certain computer products became harder to discern. And people cared less that computer 1 scored X and computer 2 scored Y. (To say nothing of the fact that, apparently, people have come to care less about computers, lured away by smartphones and tablets.) This doesn’t mean good performance benchmark tests have gone away. They haven’t. But there are certainly other ways to test technology products. And really, most buyers have their own “test” criteria (their internal compass). “Does it cost less than $X? Does it have Y features? Is it capable of doing Z, which I really need it to do? Then it passes my test.”

There’s part of me that does want to know if one AV product can do its job measurably faster or better than another, especially for applications where performance is important. How do you satisfy that curiosity? Would shoot-outs be of business value? Do you conduct them yourselves, in your own labs or warehouses? Done properly, they can be a service. But they take effort.

About Brad Grimes

Brad Grimes is the Director of Communications for InfoComm International and the former editor of Pro AV magazine. He has been writing about technology for more than 25 years.

3 Responses to “Can There Be AV Product Shoot-Outs?”

  1. It’s interesting to me that these shoot-outs were performed by the magazines. The current crop of magazines seems more interested in glorifying the latest install by company x and in running editorial content paid for by company y than in actual product information or technology innovation. I would love to be involved in such shoot-outs; if you want to do it and need me as a manufacturer’s member, please let me know.

    • In all fairness, the testing I remember so fondly started when the publishing industry was much stronger and you couldn’t just surf the web for product information. In fact, the first magazine I worked on no longer publishes a print edition. As I said, good product testing requires resources and support from all parties involved (including advertisers).

  2. Nice article, Brad. In essence, you’ve reminded us about the importance of employing The Scientific Method in such product comparisons. I don’t know why so many AV people think we and the industry are exempt from this type of thing.