In previous blogs we have explored the data lifecycle and the data value chain, both conceptual frameworks that describe how data moves from source to destination. This blog builds on the initial touch points in that journey, using the first three links in the data value chain to explain why a refined, clearly defined data source journey is integral to monitoring success.
Data generation within your organisation encompasses all the ways logs, metrics, and traces are produced across your data estate. Whether or not it is collected, this first phase defines what monitoring is possible in your environment and offers the first opportunity to implement a data engineering strategy. Conceptually, this first decision point is an exciting space in which to define your data approach, but organisations are often left with a vendor-defined structure for their data, as tooling dictates what is and isn't collected and, more importantly, how.


In this landscape, data becomes siloed, locked into the tools and agents deployed to collect it. Bespoke collection mechanisms tie data to their associated ingestion tooling, meaning you can only view data within the vendor ecosystem used to collect it. This creates a culture of tunnel vision, where insights are capped by tooling and users experience fatigue and friction when trying to use downstream platforms. It also sets a limiting tone for future scale and purpose, as vendor costs drive decisions over business value: optimising your landscape becomes a question of cost rather than of the value that could be leveraged.

The solution to this vendor-first approach centres on an open dialogue around business value, utility, and strategic vision. Autonomy over tooling begins with the creation of a unified framework and a developed understanding of the goals downstream tooling is trying to achieve. Start by understanding what data is generated within your landscape, the frequency at which it is available, and the utility it provides to your broader monitoring vision. From there you can develop a strategic vision for collection, implementing mechanisms based on source rather than destination requirements. Ingestion then becomes trivial to manage, routing autonomy becomes available, and costs are easily communicated. This lays the foundation for successful logging and opens up a wider conversation around processing and storage to leverage business value, a topic we will explore further in upcoming blogs.
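To make the source-first idea concrete, one widely used vendor-neutral pattern is the OpenTelemetry Collector, where collection is defined per source and routing to destinations is a separate, swappable concern. The sketch below is illustrative only: the log path, collection interval, and exporter endpoints are placeholder assumptions, not a recommendation of specific tooling or destinations.

```yaml
# Illustrative OpenTelemetry Collector config (placeholder paths and endpoints).
# Collection is defined by the source (receivers); destinations (exporters)
# can be added or swapped without changing how data is gathered.
receivers:
  filelog:
    include: [/var/log/app/*.log]       # placeholder application log path
  hostmetrics:
    collection_interval: 60s            # frequency driven by source utility
    scrapers:
      cpu:
      memory:

processors:
  batch:                                # batch before export to manage cost

exporters:
  otlphttp/analytics:
    endpoint: https://analytics.example.com:4318   # placeholder destination
  otlphttp/siem:
    endpoint: https://siem.example.com:4318        # placeholder destination

service:
  pipelines:
    logs:
      receivers: [filelog]
      processors: [batch]
      exporters: [otlphttp/analytics, otlphttp/siem]  # one source, many routes
    metrics:
      receivers: [hostmetrics]
      processors: [batch]
      exporters: [otlphttp/analytics]
```

Because routing lives in the pipeline definitions rather than inside a vendor agent, adding or removing a destination is a small, explicit change, which keeps both autonomy and the cost conversation in your hands.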
