This post is written from the perspective of the Elastic Observability design team. It’s aimed at developers and SREs who work with logs and ingest pipelines, and it explains how design decisions shaped the Processing experience in Streams.
The Design Problem in Log Processing
We rarely talk about how projects actually begin.
How do you design something that doesn't fully exist yet?
How do you align AI capabilities, system constraints, and real user pains into one coherent experience?
Streams gave us that challenge.
Logs are one of the richest signals in observability - but also one of the messiest. Streams is an agentic AI-powered solution that rethinks how teams work with logs to enable fast incident investigation and resolution.
Streams uses AI to partition and parse raw logs, extract relevant fields, reduce schema management overhead, and surface significant events like critical errors and anomalies.
This led us to make logs investigation-ready from the start, rather than forcing the Site Reliability Engineer to fight their data. But to enable that experience, we had to carefully rethink a core concept and step in the process - Processing.
Designing Processing UX in Elastic Streams
Logs are powerful, but only if they are structured correctly. Today, a user onboarding logs via Elastic Agent with a custom integration would extract something as simple as an IP field by:
- Writing GROK patterns
- Creating pipelines
- Managing mappings
- Testing transformations
- Iterating repeatedly
What sounds simple requires 20+ steps — and deep expertise most teams shouldn’t need. Our goal: make this dramatically simpler.
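For a sense of what just one of those steps looks like, here is roughly what a user hand-authors today to pull a client IP out of a raw log line: an ingest pipeline with a grok processor, created via PUT _ingest/pipeline/nginx-access-ip (pipeline and field names here are illustrative, not a prescribed setup):

```json
{
  "description": "Extract the client IP from raw log lines (illustrative)",
  "processors": [
    {
      "grok": {
        "field": "message",
        "patterns": ["%{IP:client_ip} %{GREEDYDATA:rest}"]
      }
    }
  ]
}
```

And this is only the extraction step — mappings, testing, and iteration still follow.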
Our early design question was:
“Can we reduce this experience to 2 meaningful steps instead of 20 technical ones?”
That question shaped how we approached the Stream UX.
The Foundation
Before we jumped into designing the UI in Kibana, we defined a core mental model.
A Stream is a collection of documents stored together that share:
- Retention
- Configuration
- Mappings
- Processing rules
- Lifecycle behaviour
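One way to picture that mental model is as a single object whose properties every document in the Stream shares. A minimal sketch (hypothetical names, not the actual Streams API):

```python
from dataclasses import dataclass, field

# Illustrative sketch of the Stream mental model: a collection of
# documents that share retention, mappings, processing rules, and
# lifecycle behaviour. Names are hypothetical, not the Streams API.
@dataclass
class Stream:
    name: str
    retention_days: int                                    # Retention
    mappings: dict = field(default_factory=dict)           # Field mappings
    processing_rules: list = field(default_factory=list)   # Ordered steps
    lifecycle: str = "hot"                                 # Lifecycle behaviour

# Every document routed to this Stream is governed by the same config:
logs = Stream(name="logs.nginx.access", retention_days=30)
logs.processing_rules.append({"grok": "%{IP:client_ip} %{GREEDYDATA:rest}"})
```

Making these shared properties explicit is what lets the rest of the design reason about a Stream as one consistent unit.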
The key design principle:
“A Stream should contain data that behaves consistently.”
Why Does Data Consistency Matter?
We started with an example to test our thinking. Take Nginx access and error logs.
Access logs describe request/response events, while error logs describe diagnostic events - two very different shapes of data.
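For instance (illustrative lines in nginx’s default formats):

```text
# access log: one structured request/response record per line
192.168.1.10 - - [10/Oct/2025:13:55:36 +0000] "GET /index.html HTTP/1.1" 200 612 "-" "curl/8.5.0"

# error log: a timestamped diagnostic message with a different shape
2025/10/10 13:55:36 [error] 1234#0: *5 open() "/usr/share/nginx/html/missing.html" failed (2: No such file or directory), client: 192.168.1.10
```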
If both live in the same Stream, that might cause:
- Processing logic conflicts
- Field divergence
- Mapping conflicts
- Fundamentally harder investigations
That insight clarified something critical:
“Processing isn’t just about extracting fields. It’s about protecting consistency.”
Making Complexity Manageable
The ingest ecosystem isn’t small, simple, or hypothetical. Elasticsearch currently supports over 40 different ingest processors, and real pipelines use dozens of them - from common ones like grok, set, and rename to specialized long-tail processors. The UI had to support both high-frequency actions and long-tail edge cases without losing structure, and we had to make sure our interface could handle all of these types.
We introduced a clear, nested structure for pipeline steps, so users could create, reorder, edit, or remove individual or grouped steps with confidence. The nested drag-and-drop capability was also added as a pattern in our EUI library.
This gave us the context and foundation to integrate those concepts into a single model that could serve everything in Streams.
Page Archetypes
Processing is powerful - and risky. Changing a parsing condition or step might affect:
- Field availability
- Search behaviour
- Alerts
- AI Insights
- Investigations
So we asked ourselves: how do we make something this powerful and important safe for the user? The answer led to a core page archetype:
Create > Preview > Confirm
This wasn’t a UI pattern added later; it emerged directly from our concept work and from understanding what users would have to deal with.
To support this archetype and core idea, we also introduced a split-screen structure.
Left: Build
This is where users would:
- Add processing steps
- Define conditions
- Apply rules
- Leverage AI suggestions, either for whole-pipeline creation or for individual steps like a GROK processor
It remained focused, intentional and structured.
Right: Preview
This is where users would:
- See real-life log samples
- See extracted fields in context
- Get immediate feedback on changes, with insights into the matched and unmatched percentage of documents
- Open an optional drilldown side panel on the right
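The matched/unmatched metric is simple to picture: run the candidate pattern over a sample of documents and report the hit rate. A minimal sketch, using a plain regex as a stand-in for a real GROK pattern (sample lines are made up):

```python
import re

# Stand-in for a GROK pattern like "%{IP:client_ip} %{WORD:method}":
# a regex that captures a client IP followed by an HTTP method.
pattern = re.compile(r"^(?P<client_ip>\d{1,3}(?:\.\d{1,3}){3}) (?P<method>[A-Z]+)")

samples = [
    "192.168.1.10 GET /index.html 200",
    "10.0.0.7 POST /login 302",
    "malformed line without an ip",
    "172.16.0.3 GET /health 200",
]

# The preview reports how many sampled documents the pattern parses.
matched = [s for s in samples if pattern.match(s)]
pct = 100 * len(matched) / len(samples)
print(f"{len(matched)}/{len(samples)} documents matched ({pct:.0f}%)")
```

Surfacing this number next to every edit is what lets users experiment without guessing whether a change silently dropped part of their data.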
The preview panel became the anchor of confidence. This was not about visual symmetry, but about reinforcing experimentation, giving control over errors, and reducing mistakes. Knowing that users might want to switch focus between interaction and detailed preview, we made both panels resizable, unlocking more flexibility and control across use cases.
AI Automation
Streams is agentic and AI powered. That added another layer of complexity for the design, but also another opportunity to unlock even more power and insights from users' log data.
AI introduced a new tension: how do you accelerate processing without turning it into a black box?
We established a few guardrails:
- Clear, concise suggestions
- Visible impact through matched document metrics
- Inspectability
- Alignment with the Create → Preview → Confirm model
Processing UX became the bridge between automation and the human in the loop. Log data is one of the most powerful investigation signals, and every design decision reinforced that belief.
What We Learned
Designing for the future does not start with screens. It starts with:
- Edge case testing
- Clear mental models
- Strong and guiding principles
- Behavioral consistency
- Scalable and stress-tested archetypes
We knew that for users to unlock insightful discoveries from their logs, they would need to process and manage their data effectively. We were shaping their entire observability foundation.
Processing is about trust, control, and scalable data management.
Trust enables investigation speed.
Investigation speed enables resilience.
Learn more
Sign up for an Elastic trial at cloud.elastic.co and try Elastic’s Serverless offering, which lets you play with all of the Streams functionality. Want to know more about Streams? Check out the links below:
Read about Reimagining streams
Read about Retention management
Look at the Streams website
Check the Streams documentation
