
We all too often see Splunk’s Enterprise Security platform being used inefficiently, and the main culprit is the data it is supplied with. Every possible data point is ingested with the justification of ‘improving security posture’; however, this frequently achieves the opposite, causing bloat, inefficient searches and substantial compute costs.
A common misconception is that security posture can only be improved at the expense of platform efficiency. In reality, the two go hand in hand: when the tool is used well, they support each other and improve in sync. As such, we will discuss how to shift this paradigm and reinforce security posture through quality data and an efficient platform.
Throughout this article, we will discuss how you can extract maximum value from Enterprise Security, both financially and in terms of security maturity. We will focus primarily on the technical elements of the tool, but we will also briefly cover how the platform’s users can adjust their behaviours to utilise it fully.
Detective Use Case Framework Tagging
The main use of Enterprise Security is creating, analysing and managing security detections and their results. To achieve this, every use case deployed within Enterprise Security should be tagged with its relevant frameworks, such as MITRE ATT&CK or the Cyber Kill Chain. This is an easy task, a quick win for improving system effectiveness, and a pivotal step in improving an organisation’s overall security posture.
This tagging also enables a business to analyse gaps in its current detection coverage against the framework it has chosen to adhere to, allowing rapid improvement of security posture by highlighting weak points in the organisation’s defences.
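As a hedged sketch of what this looks like under the hood (the stanza name, annotation keys and values here are illustrative and may differ between Enterprise Security versions, so verify against your own deployment), a correlation search carries its framework tags as annotations in savedsearches.conf:

  # savedsearches.conf - illustrative correlation search stanza
  [Hypothetical - Brute Force Access Attempts - Rule]
  action.correlationsearch.enabled = 1
  action.correlationsearch.label = Brute Force Access Attempts
  # Tag the detection against the frameworks your organisation adheres to
  action.correlationsearch.annotations = {"mitre_attack": ["T1110"], "kill_chain_phases": ["Exploitation"]}

The same tags can be applied through the correlation search editor in the Enterprise Security UI; once populated, the annotations can be reported on to analyse coverage against your chosen framework.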
Evading Redundant Data
Another common issue with Enterprise Security is querying the wrong data: this includes querying too much data, as well as data that is no longer relevant. The most common cause is data that has not been sufficiently segmented into indexes or data models. This forces your searches to scan through hundreds, if not thousands, of events that are not pertinent to the query at hand. For example, storing all endpoint logs in a single index may force a Windows-specific search to analyse Linux logs as well, which bloats search time and increases your skip ratio.
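As a minimal sketch of the difference (the index names below are hypothetical, and the final example assumes your authentication data is CIM-mapped and accelerated into the Authentication data model), compare how much data each search has to read. A Windows-specific search against a mixed endpoint index scans every event, including Linux, before filtering:

  index=endpoint sourcetype=WinEventLog EventCode=4625

The same search against a dedicated Windows index reads far fewer events:

  index=wineventlog EventCode=4625

And where the data sits in an accelerated data model, tstats avoids touching raw events entirely:

  | tstats count from datamodel=Authentication where Authentication.action="failure" by Authentication.src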
Data pipelining is another technique that not only reduces ingest into, and therefore the cost of, Enterprise Security, but also improves search efficiency by only allowing security-critical logs into the platform. A key example is the Windows event log, which can generate hundreds of events per device per day. Using a pipelining tool, you can drop events whose event codes are not relevant to your security use cases and instead route them to a long-term storage solution, from which they can be recalled into Splunk if needed for threat hunting or advanced analysis.
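Dedicated pipelining tools each have their own syntax, but the filtering logic can be sketched with Splunk’s native index-time transforms. As an illustrative example only (the sourcetype, event codes and regex are assumptions to adapt to your own environment), everything is first routed to the nullQueue and then only the event codes you care about are kept:

  # props.conf
  [WinEventLog:Security]
  TRANSFORMS-filter = drop_all_events, keep_security_relevant

  # transforms.conf
  [drop_all_events]
  REGEX = .
  DEST_KEY = queue
  FORMAT = nullQueue

  [keep_security_relevant]
  # Illustrative event codes: logons, logon failures, process creation, account creation
  REGEX = EventCode=(4624|4625|4688|4720)
  DEST_KEY = queue
  FORMAT = indexQueue

Note that events dropped this way at the indexer are gone entirely; the long-term archive the paragraph above describes would need to be fed separately, which is where a pipelining tool that can route one copy to cheap storage earns its keep.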
Utilising Technical Add-ons (TAs)
When data is ingested into Splunk, technical add-ons determine how it is processed. If you ingest data without a technical add-on where one is available, you place a heavy computational cost on the indexers, which are left effectively guessing how the data should be parsed. This can cause significant performance issues in your Enterprise Security deployment, increasing the time searches take to complete, which in turn impacts your Mean Time to Detection and weakens your security posture.
TAs also provide field extractions that improve the insight both your searches and your analysts can gain from the data within your platform. Checking whether a TA exists for your data on Splunkbase is quick and easy, and if you are using a common application or piece of hardware, there is a high chance a TA already exists that you can simply install and configure on your instance.
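As a hedged sketch (the monitor path and index are hypothetical, and while cisco:asa is the sourcetype the Splunk Add-on for Cisco ASA is built around, always check the TA’s documentation for your version), the key step is simply assigning the sourcetype the TA expects so that its parsing rules and field extractions apply automatically:

  # inputs.conf on the forwarder
  [monitor:///var/log/cisco/asa.log]
  index = network
  sourcetype = cisco:asa

With the TA installed where its settings apply (search heads, plus indexers or heavy forwarders for index-time configuration), common fields become available without any custom extraction work, for example:

  index=network sourcetype=cisco:asa | stats count by action, src, dest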
Conclusion
There are many techniques for maximising the value your data adds in Enterprise Security, some of which we have discussed here. The points raised in this blog highlight the role that data quality and strategy play in building a mature, resilient security posture. For further tips on how to maximise the value of your data, read our previous blog post: Premium Products – Understanding Your Data.