“Proof of Concept” Season is Coming
Three signs your cybersecurity vendor might be gaming the system
For those of you who attended the RSA Conference in April, I am sure the bombardment of vendor emails, phone calls, and LinkedIn meeting requests is underway. While I’d bet many of the vendors begging for meetings offer products or services that are not on your radar for 2023-2024, there are probably a handful that you would like to put to the test to see if they can deliver better results than what you are currently using. For those vendors, after the compulsory introductory meeting, some technical discussion, or even a customer reference call, you will be offered a Proof of Concept (sometimes also called a Proof of Value). This time-honored tradition allows you to “test the product for yourself.”
During the PoC, the vendor attempts to show you how their product stops more attacks, detects more phishing emails, spots more malicious websites, and the like to validate their marketing claims. The intent behind offering a PoC makes perfect sense and should help buyers avoid products with oversold capabilities. But unfortunately, all too often, products that seemed to perform well during a PoC don't deliver similar results once fully deployed in the buyer's environment.
How can this be? Over my almost 20 years of building, selling, and marketing cybersecurity products, I have seen just about every way vendors try to game these PoCs. Here are three signs that your PoC might be less than above board.
“We provide all the data in our environment, so you don’t have to worry about it.”
The fact is that thoroughly testing cybersecurity products that rely on some sort of external threat, or on a vast amount of internal data for training machine learning models, can be cumbersome. Since no security practitioner would, or should, agree to test an unknown product in their production environment, they will need a reasonably robust testing environment to put this new product through its paces. Given this requirement, it will be tempting to accept a vendor's offer to conduct the test using their data in the vendor's environment. If you think about it for a minute, this is like asking students to write the questions for their own final exam. Don't you think testing in the vendor's environment with their data might skew the results in their favor? Of course. While no one wants their PoC to take longer than an NHL hockey season, you'll need to provide your own data to vet a product properly. Some vendors may offer tools that simulate attacks, which is reasonable so long as you, the potential customer, can choose whether or not to use them. The best way to keep the PoC manageable is to select at most two or three target use cases and provide the necessary data for those use cases alone. Ideally, the products you test will integrate easily with the tools in your environment that generate that data.
“What version are we testing? It’s pretty much our GA product. You can’t tell the difference.”
When I entered the cybersecurity industry and prepared for a PoC, we asked potential customers to stand up a server meeting our minimum requirements. Then, I would manually install the product on the machine so the PoC participants could see what version of the product we were going to use for the testing.
In today's world, where SaaS is the standard, knowing the version of the product you are testing can feel like a trip down a wormhole. Sadly, I have heard horror stories from practitioners who ran a PoC with a vendor, saw outstanding results, and entered into a contract to purchase the product. Fast forward a month or two, and the practitioners and management are beyond frustrated. The product installed in their environment looks nothing like what they tested. Features are missing, integrations they relied on are nowhere to be found, and the story from the vendor is, "That version should be coming out soon." In some instances, I see no issue with using an unreleased product version for a PoC, so long as the vendor is transparent with the prospective client. Unfortunately, when a client feels like a vendor is trying to hide something from them, a customer/vendor relationship that should be collaborative can instantly become combative.
“We have never missed a threat during a PoC.”
In 1941, Ted Williams, also known as The Splendid Splinter, had a magical season at the plate, finishing the year with the Boston Red Sox with a staggering .406 batting average and an on-base percentage of .553. Many baseball historians argue Ted Williams was the purest hitter ever to play the game. To date, no batter in the AL or NL has eclipsed the .400 average for a season. So, what does Ted Williams have to do with a blog about cybersecurity PoCs? The point is that nothing, whether a cybersecurity product or the best batter ever to play the game, is always perfect. Can you test a product against a specific set of threat vectors and have it identify them all? Absolutely. Could it keep that up against new threats for 2, 5, 10, or even 100 consecutive days? Unlikely. But as sure as I know that Ted Williams struck out 27 times during that 1941 season, there will come a day when your shiny new cybersecurity product misses a threat you thought it would detect.
During a PoC, if the raw results from the product test aren't available immediately, or you notice people from the vendor working on the project whom you have never met, be warned. You are likely witnessing a sales team calling in help from developers, threat intel researchers, or software engineers, trying to figure out why the product is missing threats or why a feature isn't working. Again, can software defects surface during a PoC? Absolutely. When they do, will you find developers and engineers debugging code live? For sure. The key here is transparency. Ethical vendors will be upfront with you if and when a defect surfaces. They will also explain why any threat was missed during the PoC. Remember, when running a product PoC, you should compare the results against what you are currently using and the other products you are testing, not against a mythical system that detects 100% of threats. You are looking for significantly better outcomes, not perfection.
PoC for the People
With so many products on the market claiming similar capabilities, benefits, and results, it is nearly impossible to determine what will work best for you without running a PoC. That is why I am a fan of the PoC: when run fairly, it gives competing products a chance to go head-to-head in the real world.
Let the best product win.