Read-only MCP for Kibana-backed log investigation

Faster Kibana investigations for agents and teams

Give agents structured access to source discovery, field inspection, and focused queries without relying on brittle UI automation.

Pick the install path that fits your workflow.

Best for: direct machine install and quick guided setup

npx -y @havesomecode/kibana-mcp-server setup
  • Read-only by design
  • Guided setup in one command
  • Saved machine profile for later threads
  • Works with the repo-local Codex plugin

See a real investigation flow

From source discovery to structured query output

This walkthrough follows one investigation from source selection to structured results.

01

Discover the sources you can trust

Start from named sources instead of hunting through raw Kibana index structure.

02

Inspect fields before tightening the query

When schema endpoints are available, agents can validate exact-match fields before pushing into grouped or filtered analysis.

03

Run focused queries that return structured results

Ask for hits, counts, histograms, terms, stats, or grouped top hits without custom glue for every team.
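To make the shape of such a request concrete, here is a sketch of what a grouped terms-style query could look like as an MCP tool call. Everything below is illustrative: the tool name `query_logs`, the parameter names, and the field values are assumptions for this sketch, not the server's actual API surface.

```json
{
  "tool": "query_logs",
  "arguments": {
    "source": "payments-api",
    "mode": "terms",
    "field": "http.response.status_code",
    "filter": { "service.name": "checkout" },
    "time_range": "now-1h"
  }
}
```

The point is the workflow, not the exact names: the agent names a discovered source, picks an output mode (hits, counts, histogram, terms, stats, grouped top hits), and gets back structured JSON it can reason about directly.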

Why it exists

Kibana works for humans. Agents need structure.

Kibana is powerful, but repeatable agent workflows break down when every investigation depends on UI state, tribal knowledge, or one-off wrappers. This MCP adds a structured path on top of the log sources you already expose.

Read-only by design

No writes to Kibana, Elasticsearch, or data views.

Real install paths

Use `npx` directly or the repo-local Codex plugin workflow.

Maintained release path

Install from the package, inspect tagged releases, or review the repo directly.

Built for real investigation workflows

Capabilities that matter during an investigation

Discover sources

Start from logical sources, not raw index details

Give teams and agents a bounded catalog of sources with names, tags, and field hints instead of making every investigation start from manual digging.

Inspect fields

Check exact-match fields before tightening filters

`describe_fields` helps when the deployment exposes schema metadata. Without it, the rest of the query workflow still works.

Run tighter queries

Move from broad search to grouped, filtered, or statistical output

Return structured results agents can reason about without manual UI parsing or custom glue for every team.

Reuse saved setup

Install once, keep a default machine profile for later threads

Guided setup saves machine-level profile state so new threads do not need a manual configure step just to get back to investigation work.

Install

Choose the path that matches your workflow

Both paths work. Pick the one that matches how you work today.

npx

Best for direct machine setup

npx -y @havesomecode/kibana-mcp-server setup

Run guided setup once. Later threads can reuse the default profile.
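If your MCP client is configured through a JSON file, registering the server typically looks like the sketch below. The `kibana` key name is arbitrary, and the exact config file location and schema depend on your client; treat this as a conventional MCP stdio-server entry, not the project's documented config.

```json
{
  "mcpServers": {
    "kibana": {
      "command": "npx",
      "args": ["-y", "@havesomecode/kibana-mcp-server"]
    }
  }
}
```

With an entry like this, the client launches the server over stdio on demand, and the saved default profile from guided setup is picked up without further flags.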

Repo + Codex

Best for repo-local plugin use in Codex

git clone https://github.com/Havesomecode/kibana-mcp-server.git
cd kibana-mcp-server
npm install
npm run build
npm run setup

Install the local plugin, run setup once, and let Codex threads reuse the saved default profile.

Environment notes

What to expect in your environment

The main workflow works across deployments. Some features depend on schema metadata being exposed.

What it handles well

Read-only source discovery, field inspection, exact-field filtering, and structured queries over Kibana-backed logs.

What it will not do

It does not write to Kibana or Elasticsearch, and it does not try to turn log investigation into a broader automation product.

What depends on your setup

Schema-aware features depend on the deployment. If metadata endpoints are blocked, those features may be limited or unavailable.

Ready to use it?

Install it or inspect the repo