author    Siavash Safi <siavash.safi@gmail.com>  2017-10-14 14:23:42 +0200
committer Ben Kochie <superq@gmail.com>  2017-10-14 14:23:42 +0200
commit    f3a70226025bf81e55eaf9d7386f9f7b3a267694 (patch)
tree      1a721e7563babb1980089a31c2bf6557ab678f8e /README.md
parent    8f9edf87b5e826fa6a4dc43883d2cb6311d2e1ee (diff)
Add `collect[]` parameter (#699)

* Add `collect[]` parameter
* Add TODO comment about staticcheck ignored
* Restore promhttp.HandlerOpts
* Log a warning and return HTTP error instead of failing
* Check collector existence and status, cleanups
* Fix warnings and error messages
* Don't panic, return error if collector registration failed
* Update README
Diffstat (limited to 'README.md')
 README.md | 33 +++++++++++++++++++++++++++++++++
 1 file changed, 33 insertions(+), 0 deletions(-)
diff --git a/README.md b/README.md
index 04053a1..d4ef98a 100644
--- a/README.md
+++ b/README.md
@@ -107,6 +107,39 @@ echo 'role{role="application_server"} 1' > /path/to/directory/role.prom.$$
 mv /path/to/directory/role.prom.$$ /path/to/directory/role.prom
 ```
 
+### Filtering enabled collectors
+
+The node_exporter will expose all metrics from enabled collectors by default, but it can be passed an optional list of collectors to filter metrics. The `collect[]` parameter accepts values matching enabled collector names.
+
+This can be useful for specifying different scrape intervals for different collectors in Prometheus:
+
+```yaml
+scrape_configs:
+  - job_name: 'node resources'
+    scrape_interval: 15s
+    static_configs:
+      - targets:
+        - '192.168.1.2:9100'
+    params:
+      collect[]:
+        - cpu
+        - meminfo
+        - diskstats
+        - netdev
+        - netstat
+
+  - job_name: 'node storage'
+    scrape_interval: 1m
+    static_configs:
+      - targets:
+        - '192.168.1.2:9100'
+    params:
+      collect[]:
+        - filefd
+        - filesystem
+        - xfs
+```
+
 ## Building and running
 
 Prerequisites:
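
Editor's note on the scrape config in this diff: when Prometheus applies the `params` block, each collector name becomes a repeated, URL-encoded `collect[]` query parameter on the exporter's `/metrics` URL. A minimal sketch of the resulting request URL, using the collector names and example target address from the 'node resources' job above:

```python
from urllib.parse import urlencode

# Collector names taken from the 'node resources' job in the example config.
collectors = ["cpu", "meminfo", "diskstats", "netdev", "netstat"]

# Each name becomes a repeated collect[] query parameter; urlencode
# percent-encodes the square brackets as %5B and %5D.
params = [("collect[]", name) for name in collectors]
query = urlencode(params)

# Example target address from the scrape config above.
url = f"http://192.168.1.2:9100/metrics?{query}"
print(url)
```

A scrape of this URL returns only metrics from the listed collectors, which is what allows the two jobs in the config to poll the same exporter at different intervals without duplicating series.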