Cybersecurity tool vendors understand that, historically, the success of a Security Operations Center (SOC) has depended on two operational factors rather than on the underlying technology:
- The number of cybersecurity experts involved.
- The level of experience and domain expertise of those individuals.
Organizations that bet early on machine-learning-based cybersecurity tools have learned that, despite vendor claims, deployment always requires additional, costly machine-learning teams to perpetually train, tune, and maintain the tool’s contextual models and rules, and to perform the data transformation, aggregation, and enrichment the tool needs to operate.
This experience has bred growing consumer skepticism, yet the cybersecurity market is still awash in hype and unsupported marketing literature promising “1-click automated remediation” of advanced cybersecurity threats.
Several vendors have reverted to fanciful claims built on disproven techniques to lure inexperienced customers into multi-year contracts. A common example is the false claim that a legacy tool will provide automated isolation, quarantine, and remediation of advanced threats more effectively than industry-leading technologies such as next-gen firewalls and endpoint protection solutions. This is ironic, given that endpoint and firewall providers have decades of experience, demonstrated success, and practical, detailed customer examples supporting their approach and technology.
Perhaps a more interesting, and often ignored or intentionally hidden, limitation of legacy cyber tool vendors is their lack of visibility and context into the data sources and formats required to make even a manual decision about the appropriate remediation action: relevant cloud data, log data, end-user data, and endpoint data, to name a few. It is a truly challenging and complicated problem.