Field Perspective in SOAR Projects: The Role of Automation
SOAR projects tend to follow a similar pattern across many organizations: the system is deployed, integrations are completed, and playbooks are written. Technically, everything appears to be functioning. However, the expected transformation on the operational side does not materialize. Analysts continue to work manually, and although automation exists within the system, it does not become central to the process.
This situation is often perceived as a technical shortcoming. However, field experience suggests otherwise. In SOAR projects, the issue is rarely the presence of automation itself; rather, it lies in how automation is positioned, the expectations set at the outset, and how it is integrated into operations.
Based on our field experience from projects we have conducted at Natica, this article examines the most common challenges and misconceptions encountered in SOAR implementations.
Incorrect Starting Point
SOAR projects sometimes begin with a flawed assumption: that automation will directly reduce the need for human resources, and the project is shaped around this expectation.
“Automation requires fewer analysts.”
This assumption does not hold true in practice. SOAR is not designed to replace analysts, but to reduce repetitive tasks and allow analysts to focus on more critical activities.
“SOAR does not take over the analyst’s role and is not a tool for downsizing. Its primary function is to redirect existing analyst capacity toward higher-value work.”
When this distinction is not properly understood, projects are evaluated against incorrect success criteria, leading to a mismatch between expectations and reality even if the technical implementation is sound.
Process Issues
The success of SOAR is directly tied to process maturity. Automation cannot succeed in an environment where processes are not clearly defined.
“When I ask, ‘What do you do when a phishing email arrives?’ and each analyst gives a different answer, it’s an early warning sign for a SOAR project.”
This indicates the absence of a standardized action model within the organization. SOAR does not create new processes; it accelerates existing ones.
“SOAR accelerates a well-defined action plan but cannot invent a process that does not exist.”
Therefore, the foundation of automation is not technology, but clearly defined processes.
Lack of Context
One of the most critical breaking points in SOAR projects is the gap between technical correctness and operational correctness. A playbook may function correctly according to rules, yet still produce incorrect outcomes due to lack of context.
In scenarios such as automated IP blocking or account locking, the system may correctly process signals and trigger actions. However, those actions do not always hit the intended target.
“The test team’s IP or account gets blocked by the playbook.”
Here, the issue is not technology, but the absence of contextual awareness in the model.
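One common mitigation is to give the playbook an explicit context check before any destructive action. The sketch below is a minimal, hypothetical example: the `ALLOWLIST` contents and the playbook step names are illustrative, not taken from any specific SOAR product.

```python
import ipaddress

# Hypothetical context source: networks that must never be auto-blocked
# (test-team ranges, VPN egress, critical infrastructure).
ALLOWLIST = [
    ipaddress.ip_network("10.20.0.0/16"),    # internal test-team range (example)
    ipaddress.ip_network("203.0.113.0/24"),  # VPN egress (example range)
]

def should_block(ip: str) -> bool:
    """Return True only if the IP sits outside every protected network."""
    addr = ipaddress.ip_address(ip)
    return not any(addr in net for net in ALLOWLIST)

def block_playbook_step(ip: str) -> str:
    """Context-aware version of an auto-block step: protected targets
    are escalated to a human instead of being blocked automatically."""
    if should_block(ip):
        return f"BLOCK {ip}"            # safe to hand to the firewall integration
    return f"ESCALATE {ip} to analyst"  # context says: human decision needed
```

The point is not the list itself but where the decision lives: the contextual exception is encoded in the playbook, not left to each analyst's memory.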
Lack of Trust
Technical accuracy alone is not sufficient in SOAR projects. For automation to be truly utilized, analysts must trust it.
“If a playbook takes an incorrect or incomplete action even once, analysts may stop relying on it and revert to manual validation.”
Over time, this behavior shifts from exception to standard practice. Even a single mistake can result in automation remaining in the system but being excluded from actual operations. For this reason, successful SOAR projects begin not with full automation, but with controlled adoption.
Noise Problem
SOAR does not eliminate existing problems; in many cases, it accelerates them. This is especially evident in environments with poor data quality.
“The more false positives, the more unnecessary playbook triggers.”
Instead of reducing workload, this increases it and leads to alert fatigue.
“A real incident can be overlooked within the noise generated by false positive-driven playbooks.”
Automation cannot be considered independently of data quality and must be built on top of a solid SIEM foundation.
“SOAR should be built on top of SIEM quality—not beneath it.”
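A simple defensive pattern against this amplification is a deduplication gate in front of the playbook trigger, so the same noisy indicator cannot fire the same response repeatedly. The sketch below is an assumption-laden illustration (the class name and cooldown value are invented for this example), not a feature of any particular SOAR platform:

```python
import time

class TriggerGate:
    """Suppress repeated playbook triggers for the same indicator
    within a cooldown window, so noisy alerts don't multiply work."""

    def __init__(self, cooldown_seconds=3600):
        self.cooldown = cooldown_seconds
        self.last_fired = {}  # indicator -> timestamp of last trigger

    def allow(self, indicator, now=None):
        """Return True if the playbook should fire for this indicator."""
        now = time.time() if now is None else now
        last = self.last_fired.get(indicator)
        if last is not None and now - last < self.cooldown:
            return False  # duplicate within the window: drop it
        self.last_fired[indicator] = now
        return True
```

A gate like this treats the symptom, not the cause; the underlying SIEM tuning still has to happen, which is what the quote above is pointing at.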
Limits of Automation
Not every SOC process is suitable for automation. Scenarios with high uncertainty reveal the limits of automation.
“Zero-day response is the clearest example—there is no predefined flow because the scenario has not been encountered before.”
Similarly, processes dependent on context and organizational dynamics are resistant to automation.
“Insider threat investigations and actions involving critical assets are challenging because they require organization-specific context, intuition, and sometimes political judgment.”
Automation delivers the highest value in repetitive, clearly defined processes.
Integration Complexity
As SOAR projects grow, the number of integrations increases, creating additional dependencies.
“Tokens expire, APIs change, services go down—playbooks can fail.”
Over time, the system becomes more complex, and tracking dependencies becomes difficult. Without proper testing, errors are often only discovered during real incidents.
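These failure modes argue for playbook steps that fail loudly rather than silently. As a rough sketch (the exception names and the `action`/`refresh_token` callables are hypothetical stand-ins for whatever a given integration provides), a wrapper might refresh expired credentials, back off on transient errors, and raise on final failure so the outage is visible:

```python
import time

class TokenExpired(Exception):
    """Hypothetical error raised when an integration's API token lapses."""

def call_with_retry(action, refresh_token, retries=3, backoff=2.0):
    """Run an integration call with basic resilience: refresh credentials
    on token expiry, back off on transient connection errors, and raise
    on final failure so the playbook fails loudly, not silently."""
    for attempt in range(retries):
        try:
            return action()
        except TokenExpired:
            refresh_token()  # expired credentials: refresh and retry
        except ConnectionError:
            if attempt == retries - 1:
                raise  # surface the outage for alerting
            time.sleep(backoff * (attempt + 1))
    raise RuntimeError("integration call failed after retries")
```

Wrappers like this do not remove the dependency problem, but they turn "discovered during a real incident" into "discovered by monitoring."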
Measuring Value
To determine whether SOAR truly delivers value, speed metrics alone are insufficient.
“Would the analyst have skipped this step if the playbook didn’t exist?”
This question reveals the true contribution of automation.
The Right Starting Approach
Successful SOAR projects do not begin with large, comprehensive deployments, but with controlled and targeted initiatives.
“I don’t touch any tools at first.”
The initial focus is not on tools, but on understanding existing operations.
“The real issue is identifying and reducing noise.”
Then, the right starting scenario is selected:
“Choose the highest-volume, most clearly defined, and lowest-risk scenario.”
This approach enables controlled growth and measurable value creation.
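The selection criteria in the quote above can be made explicit, even crudely. The sketch below is a deliberately simple scoring illustration with invented numbers, purely to show how volume, clarity, and risk trade off; real assessments would come from the team, not a script:

```python
def scenario_score(volume, clarity, risk):
    """Rank candidate starting scenarios: prefer high volume,
    clearly defined steps, and low operational risk.
    All inputs are 0-10 assessments made by the team."""
    return volume + clarity + (10 - risk)

# Illustrative assessments only, echoing the article's examples.
candidates = {
    "phishing triage": scenario_score(volume=9, clarity=8, risk=2),
    "zero-day response": scenario_score(volume=2, clarity=1, risk=9),
    "insider threat": scenario_score(volume=3, clarity=2, risk=8),
}
best = max(candidates, key=candidates.get)
```

Under these example numbers, repetitive, well-defined, low-risk work (phishing triage) wins, while high-uncertainty scenarios score lowest, matching the limits discussed earlier.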
Operational Reality
In SOAR projects, the key determinant is not technical accuracy, but operational relevance. A system simply working is not enough; it must actually be used.
The most common situation observed in the field is clear: automation exists, but processes continue manually.
This is not a malfunction—it is a signal. It indicates unclear processes, missing context, or lack of trust.
Therefore, the real question in SOAR projects is not “Is the automation working?” but “Is the automation actually being used?”
At Natica, our approach is to treat SOAR not merely as playbook development, but as an integral part of operations. Because in practice, the real differentiator is not the existence of automation, but whether it is truly adopted and used. Our focus is not only on technology, but on building processes, data quality, and trust together.