
mssql_exporter's Introduction

mssql_exporter

MSSQL Exporter for Prometheus

Quickstart docker-compose

docker-compose up

docker-compose.yml

version: '3'
services:
  mssql_exporter:
    image: "danieloliver/mssql_exporter:latest"
    ports:
      - "80:80"
    depends_on:
      - sqlserver.dev
    environment:
      - PROMETHEUS_MSSQL_DataSource=Server=tcp:sqlserver.dev,1433;Initial Catalog=master;Persist Security Info=False;User ID=sa;Password=yourStrong(!)Password;MultipleActiveResultSets=False;Encrypt=False;TrustServerCertificate=True;Connection Timeout=10;
      - PROMETHEUS_MSSQL_ConfigFile=metrics.json
      - PROMETHEUS_MSSQL_ServerPath=metrics
      - PROMETHEUS_MSSQL_ServerPort=80
      - PROMETHEUS_MSSQL_AddExporterMetrics=false
      - PROMETHEUS_MSSQL_Serilog__MinimumLevel=Information
      - |
        PROMETHEUS_MSSQL_ConfigText=
        {
          "Queries": [
            {
              "Name": "mssql_process_status",
              "Query": "SELECT status, COUNT(*) count FROM sys.sysprocesses GROUP BY status",
              "Description": "Counts the number of processes per status",
              "Usage": "GaugesWithLabels",
              "Columns": [
                {
                  "Name": "status",
                  "Label": "status",
                  "Usage": "GaugeLabel",
                  "Order": 0
                },
                {
                  "Name": "count",
                  "Label": "count",
                  "Usage": "Gauge"
                }
              ]
            },
            {
              "Name": "mssql_process_connections",
              "Query": "SELECT ISNULL(DB_NAME(dbid), 'other') as dbname, COUNT(dbid) as connections FROM sys.sysprocesses WHERE dbid > 0 GROUP BY dbid",
              "Description": "Counts the number of connections per db",
              "Usage": "GaugesWithLabels",
              "Columns": [
                {
                  "Name": "dbname",
                  "Label": "dbname",
                  "Usage": "GaugeLabel",
                  "Order": 0
                },
                {
                  "Name": "connections",
                  "Label": "count",
                  "Usage": "Gauge"
                }
              ]
            },
            {
              "Name": "mssql_deadlocks",
              "Query": "SELECT cntr_value FROM sys.dm_os_performance_counters where counter_name = 'Number of Deadlocks/sec' AND instance_name = '_Total'",
              "Description": "Number of lock requests per second that resulted in a deadlock since last restart",
              "Columns": [
                {
                  "Name": "cntr_value",
                  "Label": "mssql_deadlocks",
                  "Usage": "Gauge",
                  "DefaultValue": 0
                }
              ]
            }
          ],
          "MillisecondTimeout": 4000
        }
  sqlserver.dev:
    image: "mcr.microsoft.com/mssql/server:2017-latest"
    ports:
      - "1433:1433"
    environment:
      - ACCEPT_EULA=Y
      - SA_PASSWORD=yourStrong(!)Password
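
Once both containers are up, the exporter endpoint can be checked from the host (assuming the default port mapping above):

curl http://localhost/metrics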

Quickstart binary

  1. Download the binary for your platform from the latest release.

  2. Create a file named "metrics.json" with the following contents:

{
  "Queries": [
    {
      "Name": "mssql_process_status",
      "Query": "SELECT status, COUNT(*) count FROM sys.sysprocesses GROUP BY status",
      "Description": "Counts the number of processes per status",
      "Usage": "GaugesWithLabels",
      "Columns": [
        {
          "Name": "status",
          "Label": "status",
          "Usage": "GaugeLabel",
          "Order": 0
        },
        {
          "Name": "count",
          "Label": "count",
          "Usage": "Gauge"
        }
      ]
    },
    {
      "Name": "mssql_process_connections",
      "Query": "SELECT ISNULL(DB_NAME(dbid), 'other') as dbname, COUNT(dbid) as connections FROM sys.sysprocesses WHERE dbid > 0 GROUP BY dbid",
      "Description": "Counts the number of connections per db",
      "Usage": "GaugesWithLabels",
      "Columns": [
        {
          "Name": "dbname",
          "Label": "dbname",
          "Usage": "GaugeLabel",
          "Order": 0
        },
        {
          "Name": "connections",
          "Label": "count",
          "Usage": "Gauge"
        }
      ]
    },
    {
      "Name": "mssql_deadlocks",
      "Query": "SELECT cntr_value FROM sys.dm_os_performance_counters where counter_name = 'Number of Deadlocks/sec' AND instance_name = '_Total'",
      "Description": "Number of lock requests per second that resulted in a deadlock since last restart",
      "Columns": [
        {
          "Name": "cntr_value",
          "Label": "mssql_deadlocks",
          "Usage": "Gauge",
          "DefaultValue": 0
        }
      ]
    }
  ],
  "MillisecondTimeout": 4000
}
  3. Run mssql_exporter:
./mssql_exporter serve -ConfigFile "metrics.json" -DataSource "Server=tcp:{ YOUR DATABASE HERE },1433;Initial Catalog={ YOUR INITIAL CATALOG HERE };Persist Security Info=False;User ID={ USER ID HERE };Password={ PASSWORD HERE };MultipleActiveResultSets=False;Encrypt=True;TrustServerCertificate=False;Connection Timeout=8;"

or

.\mssql_exporter.exe serve -ConfigFile "metrics.json" -DataSource "Server=tcp:{ YOUR DATABASE HERE },1433;Initial Catalog={ YOUR INITIAL CATALOG HERE };Persist Security Info=False;User ID={ USER ID HERE };Password={ PASSWORD HERE };MultipleActiveResultSets=False;Encrypt=True;TrustServerCertificate=False;Connection Timeout=8;"
  4. Open http://localhost/metrics

The content should look like this:

# HELP mssql_process_status Counts the number of processes per status
# TYPE mssql_process_status gauge
mssql_process_status{status="runnable"} 2
mssql_process_status{status="sleeping"} 19
mssql_process_status{status="background"} 24
# HELP mssql_process_connections Counts the number of connections per db
# TYPE mssql_process_connections gauge
mssql_process_connections{dbname="master"} 29
mssql_process_connections{dbname="tempdb"} 1
# HELP mssql_timeouts Number of queries timing out.
# TYPE mssql_timeouts gauge
mssql_timeouts 0
# HELP mssql_exceptions Number of queries throwing exceptions.
# TYPE mssql_exceptions gauge
mssql_exceptions 0
# HELP mssql_deadlocks mssql_deadlocks
# TYPE mssql_deadlocks gauge
mssql_deadlocks 0
# HELP mssql_up mssql_up
# TYPE mssql_up gauge
mssql_up 1

Note

  • The mssql_up gauge is "1" if the database is reachable and "0" if the connection to the database fails.
  • The mssql_exceptions gauge is "0" if all queries run successfully; otherwise it is the number of queries that threw exceptions.
  • The mssql_timeouts gauge is "0" if all queries complete within the configured timeout; otherwise it is the number of queries that did not.
  5. Add a Prometheus scrape target (assuming the exporter runs on the same machine).
global:
  scrape_interval:     15s # Set the scrape interval to every 15 seconds. Default is every 1 minute.
  evaluation_interval: 15s # Evaluate rules every 15 seconds. The default is every 1 minute.
  
scrape_configs:
  - job_name: 'netcore-prometheus'
    # metrics_path defaults to '/metrics'
    static_configs:
    - targets: ['localhost']
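
With the scrape target in place, the mssql_up gauge can drive basic availability alerting. Below is a sketch of a Prometheus alerting rule; the group name, labels, and annotation text are illustrative and not part of this project:

groups:
  - name: mssql_exporter.rules
    rules:
      - alert: MssqlDown
        expr: mssql_up == 0
        for: 5m
        labels:
          severity: critical
        annotations:
          summary: "SQL Server is unreachable according to mssql_exporter"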

Command Line Options

Commands
   help
   serve
      -DataSource (Connection String)
      -ConfigFile (metrics.json)
      -ServerPath (/metrics)
      -ServerPort (80)
      -AddExporterMetrics (false)
      -ConfigText ()

Or environment variables (see the example after the option list below):
      PROMETHEUS_MSSQL_DataSource
      PROMETHEUS_MSSQL_ConfigFile
      PROMETHEUS_MSSQL_ServerPath
      PROMETHEUS_MSSQL_ServerPort
      PROMETHEUS_MSSQL_AddExporterMetrics
      PROMETHEUS_MSSQL_ConfigText
      PROMETHEUS_MSSQL_Serilog__MinimumLevel
  • DataSource
    • Default: empty
    • SQL Server .NET connection string
  • ConfigFile
    • Default: "metrics.json"
    • The path to the configuration file as shown in "metrics.json" above.
  • ServerPath
    • Default: "metrics"
    • The URL path on which the exporter serves metrics.
  • ServerPort
    • Default: 80
  • AddExporterMetrics
    • Default: false
    • Options:
      • true
      • false
  • ConfigText
    • Default: empty
    • Optionally provide the configuration JSON directly; when set, the ConfigFile is not read.
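
For example, the same settings can be supplied through environment variables instead of CLI flags. A minimal sketch for a Linux shell; the port, paths, and credentials below are illustrative placeholders:

export PROMETHEUS_MSSQL_ConfigFile="metrics.json"
export PROMETHEUS_MSSQL_ServerPath="metrics"
export PROMETHEUS_MSSQL_ServerPort=9399
export PROMETHEUS_MSSQL_DataSource="Server=tcp:localhost,1433;Initial Catalog=master;User ID=sa;Password={ PASSWORD HERE };Encrypt=False;TrustServerCertificate=True;Connection Timeout=10;"
./mssql_exporter serve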

Run as a Windows service

You can install the exporter as a Windows service with the following command:

sc create mssql_exporter binPath= "%full_path_to_mssql_exporter.exe%"
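
Once created, the service can be started, stopped, or removed with the standard Windows service commands:

sc start mssql_exporter
sc stop mssql_exporter
sc delete mssql_exporter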

Changing logging configuration

Logging is configured using Serilog Settings Configuration.

Editing "config.json" changes the logging behaviour. The console sink is the default, and "Serilog.Sinks.File" is also installed; any further sinks would have to be added to the project file's dependencies.

Debug Run and Docker

  1. Run SQL Server in Docker:
docker run -e 'ACCEPT_EULA=Y' -e "SA_PASSWORD=yourStrong(!)Password" --net=host -p 1433:1433 -d --rm --name sqlserverdev mcr.microsoft.com/mssql/server:2017-latest
  2. Run the exporter from the "src/server" directory:
dotnet run -- serve -ConfigFile "../../metrics.json" -DataSource "Server=tcp:localhost,1433;Initial Catalog=master;Persist Security Info=False;User ID=sa;Password=yourStrong(!)Password;MultipleActiveResultSets=False;Encrypt=False;TrustServerCertificate=True;Connection Timeout=8;"
dotnet run -- serve -ConfigFile "../../metrics.json" -DataSource 'Server=tcp:localhost,1433;Initial Catalog=master;Persist Security Info=False;User ID=sa;Password=yourStrong(!)Password;MultipleActiveResultSets=False;Encrypt=False;TrustServerCertificate=True;Connection Timeout=8;'

OR

  1. Use docker-compose:
docker-compose up

mssql_exporter's People

Contributors

danieloliver, olsh


mssql_exporter's Issues

How can I get more queries in Prometheus?

I need more than this:

# HELP mssql_up mssql_up
# TYPE mssql_up gauge
mssql_up 1
# HELP mssql_exceptions Number of queries throwing exceptions.
# TYPE mssql_exceptions gauge
mssql_exceptions 0
# HELP mssql_process_status Counts the number of processes per status
# TYPE mssql_process_status gauge
mssql_process_status{status="runnable"} 1
mssql_process_status{status="suspended"} 1
mssql_process_status{status="background"} 86
mssql_process_status{status="sleeping"} 28
# HELP mssql_timeouts Number of queries timing out.
# TYPE mssql_timeouts gauge
mssql_timeouts 0
# HELP mssql_deadlocks mssql_deadlocks
# TYPE mssql_deadlocks gauge
mssql_deadlocks 0

mssql_deadlocks should be a counter

According to the description for the Number of Deadlocks/sec, that metric should be a counter. However, it's specified as a gauge in README.md and in src/server/metrics.json.

How to get the query data correctly

I have the following query, but it is not giving me what I need; I need each of the connections and who is making them:
{
  "Queries": [
    {
      "Name": "conexiones_abiertas",
      "Query": "SELECT count(*) n,s.host_name a FROM sys.dm_exec_sessions s INNER JOIN sys.dm_exec_connections e ON s.session_id = e.session_id group by s.host_name order by n",
      "Description": "number of open connections per process",
      "Usage": "GaugesWithLabels",
      "Columns": [
        {
          "Name": "n",
          "Label": "status",
          "Usage": "GaugeLabel",
          "Order": 0
        },
        {
          "Name": "a",
          "Label": "count",
          "Usage": "Gauge"
        }
      ]
    }
  ],
  "MillisecondTimeout": 4000
}

Timeout per Query?

Hi, I have a question since documentation is sparse:

in the example there is a

"MillisecondTimeout": 4000

for the whole metrics.json file. Is this a total budget for scraping all queries, or does it apply to each of them?
Is it also possible to define a timeout on a per-query basis?

There may be queries that are fast (and should fail fast) and others that take longer than the default timeout.

Or should you start more than one exporter process with different configurations?

Thanks in advance, BR Mario

Connection failing

Hi,

I have the exporter running on the Windows database server. When I run a curl command I get the following response:
# HELP mssql_exceptions Number of queries throwing exceptions.
# TYPE mssql_exceptions gauge
mssql_exceptions 4
# HELP mssql_deadlocks mssql_deadlocks
# TYPE mssql_deadlocks gauge
mssql_deadlocks 0
# HELP mssql_up mssql_up
# TYPE mssql_up gauge
mssql_up 0
# HELP mssql_timeouts Number of queries timing out.
# TYPE mssql_timeouts gauge
mssql_timeouts 0
# HELP mssql_process_status Counts the number of processes per status
# TYPE mssql_process_status gauge

I used the proposed command to start the exporter:
mssql_exporter.exe serve -ConfigFile "test.json" -DataSource "Server=tcp:{ YOUR DATABASE HERE },1433;Initial Catalog={ YOUR INITIAL CATALOG HERE };Persist Security Info=False;User ID={ USER ID HERE };Password={ PASSWORD HERE };MultipleActiveResultSets=False;Encrypt=True;TrustServerCertificate=False;Connection Timeout=30;"

It reports the database is down, while it is actually up. How can I debug/analyse why the connection to the DB is failing?

Kind regards,
Patrick

Unable to fetch metrics when a column contains a string with spaces

{
  "Name": "mssql_process_description",
  "Query": "select S5_ScheduleType,S5_ScheduleDescription from stmScheduleTask where s5_scheduleType in ('DCC','UMT','UMI') order by s5_scheduleType",
  "Description": "mssql_process_description details",
  "Usage": "GaugesWithLabels",
  "Columns": [
    {
      "Name": "S5_ScheduleType",
      "Label": "S5_ScheduleType",
      "Usage": "GaugeLabel",
      "Order": 0
    },
    {
      "Name": "S5_ScheduleDescription",
      "Label": "S5_ScheduleDescription",
      "Usage": "Gauge"
    }
  ]
}

S5_ScheduleDescription is a varchar column containing spaces, for example "the content provider with cargo".

The exporter is unable to fetch and display it, while the same setup works for other string and number columns in the database table.
@DanielOliver: please help?

Connect failed

Thank you for your work on this exporter!
I am working with your product and I have the following problem with it; I would appreciate any help.

Trying to start it:

/home/monitoring/monitoring/bin/mssql_exporter/mssql_exporter \
serve -ServerPort 18080 -ServerPath /metrics -AddExporterMetrics True \
-ConfigFile "/home/monitoring/monitoring/conf/mssql_exporter/mssql_exporter.json" \
-DataSource "Server=tcp:10.20.2.103,1433;Initial Catalog=master;Persist Security Info=False;User ID=monitoring;Password=monitoring;MultipleActiveResultSets=False;Encrypt=True;TrustServerCertificate=False;Connection Timeout=30;"

Console output:

Hosting environment: Production
Content root path: /home/monitoring/monitoring
Now listening on: http://[::]:18080
Application started. Press Ctrl+C to shut down.
info: Microsoft.AspNetCore.Hosting.Internal.WebHost[1]
      Request starting HTTP/1.1 GET http://172.30.168.33:18080/metrics
info: Microsoft.AspNetCore.Hosting.Internal.WebHost[2]
      Request finished in 885.6413ms 200 text/plain; version=0.0.4
info: Microsoft.AspNetCore.Hosting.Internal.WebHost[1]
      Request starting HTTP/1.1 GET http://172.30.168.33:18080/metrics
info: Microsoft.AspNetCore.Hosting.Internal.WebHost[2]
      Request finished in 0.9955ms 200 text/plain; version=0.0.4

Exporter HTTP output:

# HELP mssql_timeouts Number of queries timing out.
# TYPE mssql_timeouts gauge
mssql_timeouts 0
# HELP mssql_deadlocks mssql_deadlocks
# TYPE mssql_deadlocks gauge
mssql_deadlocks 0
# HELP mssql_up mssql_up
# TYPE mssql_up gauge
mssql_up 0
# HELP mssql_exceptions Number of queries throwing exceptions.
# TYPE mssql_exceptions gauge
mssql_exceptions 3
# HELP mssql_process_status Counts the number of processes per status
# TYPE mssql_process_status gauge

The problem is that when the exporter connects to MSSQL, I get the following result:

mssql_up 0

How can I turn on debug mode?

Return value of '1' for informational metric

Description of problem

In some cases, it's useful to have a metric that's informational only, and always returns a value of 1. For example, in the blog post Exposing the software version to Prometheus, the author uses the following example to expose application build information in Prometheus:

prometheus_build_info{branch="HEAD",goversion="go1.6.2",
    revision="16d70a8b6bd90f0ff793813405e84d5724a9ba65",version="1.0.1"} 1

I have a similar use case with this MSSQL exporter. I want to run a PromQL query which returns metrics for MSSQL queries that are causing high CPU usage. I want to return something like:

mssql_high_cpu_queries{query="<the-query>", server="<server-hostname>", spid="<process-id>"} 1

Is there a way to always return 1 for the value?
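
One possible workaround with the documented Gauge/GaugeLabel column usages would be to select a literal 1 as the gauge value and expose the interesting strings as labels. A sketch (untested against this exporter; the metric name and query are illustrative):

{
  "Name": "mssql_build_info",
  "Query": "SELECT CAST(SERVERPROPERTY('ProductVersion') AS varchar(32)) AS version, 1 AS info_value",
  "Description": "SQL Server version exposed as a label on a constant value of 1",
  "Usage": "GaugesWithLabels",
  "Columns": [
    { "Name": "version", "Label": "version", "Usage": "GaugeLabel", "Order": 0 },
    { "Name": "info_value", "Label": "value", "Usage": "Gauge" }
  ]
}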

Change with Prometheus 2.4

prometheus: level=warn ts=2019-04-15T10:17:05.077190687Z caller=scrape.go:867 component="scrape manager" scrape_pool=node target=http://10.100.200.4:80/metrics msg="append failed" err="invalid metric type "GAUGE""
prometheus: level=warn ts=2019-04-15T09:34:06.79543191Z caller=scrape.go:867 component="scrape manager" scrape_pool=node target=http://10.100.200.4:80/metrics msg="append failed" err="invalid metric type "COUNTER""

prometheus/prometheus#4602

Starting with version 2.4, Prometheus no longer accepts metric type names in capital letters (such as "GAUGE" or "COUNTER").

System.Globalization.Invariant and SqlClient issue

When

<RuntimeHostConfigurationOption Include="System.Globalization.Invariant" Value="true" /> 

Then SqlConnection.Open() throws System.InvalidOperationException: Internal connection fatal error.

According to the documentation, this option "enables you to remove application dependencies on globalization data and globalization behavior." Apart from smaller file size, I don't know what the benefits of that are, but SqlConnection doesn't work with it.

Please make it false.

Run query each X seconds

Hi,
I would like to know how frequently the queries run by default.
Is there a way to configure them to run once a day, for example?
Also, could I set each query to run on a different schedule?
The only thing I saw relates to the timeout.

Thanks a lot!
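
For context, exporters of this kind typically execute their queries when Prometheus scrapes the /metrics endpoint (not confirmed for this project here), so the effective frequency would normally be controlled per job on the Prometheus side; very long intervals such as once a day do not work well with Prometheus staleness handling. A sketch with a hypothetical job:

scrape_configs:
  - job_name: 'mssql-slow-queries'
    scrape_interval: 5m
    static_configs:
      - targets: ['localhost:80']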

metrics.json demo

{
  "Queries": [
    {
      "Name": "mssql_process_status",
      "Query": "SELECT status, COUNT(*) count FROM sys.sysprocesses GROUP BY status",
      "Description": "Counts the number of processes per status",
      "Usage": "GaugesWithLabels",
      "Columns": [
        {
          "Name": "status",
          "Label": "status",
          "Usage": "GaugeLabel",
          "Order": 0
        },
        {
          "Name": "count",
          "Label": "count",
          "Usage": "Gauge"
        }
      ]
    },
    {
      "Name": "mssql_deadlocks",
      "Query": "SELECT cntr_value FROM sys.dm_os_performance_counters where counter_name = 'Number of Deadlocks/sec' AND instance_name = '_Total'",
      "Description": "Number of lock requests per second that resulted in a deadlock since last restart",
      "Columns": [
        {
          "Name": "cntr_value",
          "Label": "mssql_deadlocks",
          "Usage": "Gauge",
          "DefaultValue": 0
        }
      ]
    },
    {
      "Name": "mssql_kill_connection_errors",
      "Query": "SELECT cntr_value FROM sys.dm_os_performance_counters where counter_name = 'Errors/sec' AND instance_name = 'Kill Connection Errors'",
      "Description": "Number of kill connection errors/sec since last restart",
      "Columns": [
        {
          "Name": "cntr_value",
          "Label": "mssql_kill_connection_errors",
          "Usage": "Gauge",
          "DefaultValue": 0
        }
      ]
    },
    {
      "Name": "mssql_batch_requests",
      "Query": "SELECT TOP 1 cntr_value FROM sys.dm_os_performance_counters where counter_name = 'Batch Requests/sec'",
      "Description": "Number of Transact-SQL command batches received per second. This statistic is affected by all constraints (such as I/O, number of users, cachesize, complexity of requests, and so on). High batch requests mean good throughput",
      "Columns": [
        {
          "Name": "cntr_value",
          "Label": "mssql_batch_requests",
          "Usage": "Gauge",
          "DefaultValue": 0
        }
      ]
    },
    {
      "Name": "mssql_page_fault_count",
      "Query": "SELECT page_fault_count from sys.dm_os_process_memory",
      "Description": "Number of page faults since last restart",
      "Columns": [
        {
          "Name": "page_fault_count",
          "Label": "mssql_page_fault_count",
          "Usage": "Gauge",
          "DefaultValue": 0
        }
      ]
    },
    {
      "Name": "mssql_memory_utilization_percentage",
      "Query": "SELECT memory_utilization_percentage from sys.dm_os_process_memory",
      "Description": "Memory utilization percentage of the SQL Server process",
      "Columns": [
        {
          "Name": "memory_utilization_percentage",
          "Label": "mssql_memory_utilization_percentage",
          "Usage": "Gauge",
          "DefaultValue": 0
        }
      ]
    },
    {
      "Name": "mssql_total_physical_memory_kb",
      "Query": "SELECT total_physical_memory_kb from sys.dm_os_sys_memory",
      "Description": "Total physical memory in KB",
      "Columns": [
        {
          "Name": "total_physical_memory_kb",
          "Label": "mssql_total_physical_memory_kb",
          "Usage": "Gauge",
          "DefaultValue": 0
        }
      ]
    },
    {
      "Name": "mssql_available_physical_memory_kb",
      "Query": "SELECT available_physical_memory_kb from sys.dm_os_sys_memory",
      "Description": "Available physical memory in KB",
      "Columns": [
        {
          "Name": "available_physical_memory_kb",
          "Label": "mssql_available_physical_memory_kb",
          "Usage": "Gauge",
          "DefaultValue": 0
        }
      ]
    },
    {
      "Name": "mssql_total_page_file_kb",
      "Query": "SELECT total_page_file_kb from sys.dm_os_sys_memory",
      "Description": "Total page file in KB",
      "Columns": [
        {
          "Name": "total_page_file_kb",
          "Label": "mssql_total_page_file_kb",
          "Usage": "Gauge",
          "DefaultValue": 0
        }
      ]
    },
    {
      "Name": "mssql_available_page_file_kb",
      "Query": "SELECT available_page_file_kb from sys.dm_os_sys_memory",
      "Description": "Available page file in KB",
      "Columns": [
        {
          "Name": "available_page_file_kb",
          "Label": "mssql_available_page_file_kb",
          "Usage": "Gauge",
          "DefaultValue": 0
        }
      ]
    },
    {
      "Name": "mssql_connections",
      "Query": "SELECT DB_NAME(sP.dbid) db_name,COUNT(sP.spid) count FROM sys.sysprocesses sP GROUP BY DB_NAME(sP.dbid)",
      "Description": "Number of active connections",
      "Usage": "GaugesWithLabels",
      "Columns": [
        {
          "Name": "db_name",
          "Label": "db_name",
          "Usage": "GaugeLabel",
          "Order": 0
        },
        {
          "Name": "count",
          "Label": "count",
          "Usage": "Gauge"
        }
      ]
    },
    {
      "Name": "mssql_user_errors",
      "Query": "SELECT cntr_value FROM sys.dm_os_performance_counters where counter_name = 'Errors/sec' AND instance_name = 'User Errors'",
      "Description": "Number of user errors/sec since last restart",
      "Columns": [
        {
          "Name": "cntr_value",
          "Label": "mssql_user_errors",
          "Usage": "Gauge",
          "DefaultValue": 0
        }
      ]
    },
    {
      "Name": "mssql_io_read_stall",
      "Query": "SELECT cast(DB_Name(a.database_id) as varchar) as name,max(io_stall_read_ms) as readIO FROM sys.dm_io_virtual_file_stats(null, null) a INNER JOIN sys.master_files b ON a.database_id = b.database_id and a.file_id = b.file_id group by a.database_id",
      "Description": "Wait time (ms) of stall since last restart",
      "Usage": "GaugesWithLabels",
      "Columns": [
        {
          "Name": "name",
          "Label": "name",
          "Usage": "GaugeLabel",
          "Order": 0
        },
        {
          "Name": "readIO",
          "Label": "readIO",
          "Usage": "Gauge"
        }
      ]
    },
    {
      "Name": "mssql_io_write_stall",
      "Query": "SELECT cast(DB_Name(a.database_id) as varchar) as name,max(io_stall_write_ms) as writeIO FROM sys.dm_io_virtual_file_stats(null, null) a INNER JOIN sys.master_files b ON a.database_id = b.database_id and a.file_id = b.file_id group by a.database_id",
      "Description": "Wait time (ms) of stall since last restart",
      "Usage": "GaugesWithLabels",
      "Columns": [
        {
          "Name": "name",
          "Label": "name",
          "Usage": "GaugeLabel",
          "Order": 0
        },
        {
          "Name": "writeIO",
          "Label": "writeIO",
          "Usage": "Gauge"
        }
      ]
    },
    {
      "Name": "mssql_io_queued_read_stall",
      "Query": "SELECT cast(DB_Name(a.database_id) as varchar) as name,max(io_stall_queued_read_ms) as readQueuedIO FROM sys.dm_io_virtual_file_stats(null, null) a INNER JOIN sys.master_files b ON a.database_id = b.database_id and a.file_id = b.file_id group by a.database_id",
      "Description": "Wait time (ms) of stall since last restart",
      "Usage": "GaugesWithLabels",
      "Columns": [
        {
          "Name": "name",
          "Label": "name",
          "Usage": "GaugeLabel",
          "Order": 0
        },
        {
          "Name": "readQueuedIO",
          "Label": "readQueuedIO",
          "Usage": "Gauge"
        }
      ]
    },
    {
      "Name": "mssql_io_queued_write_stall",
      "Query": "SELECT cast(DB_Name(a.database_id) as varchar) as name,max(io_stall_queued_write_ms) as writeQueuedIO FROM sys.dm_io_virtual_file_stats(null, null) a INNER JOIN sys.master_files b ON a.database_id = b.database_id and a.file_id = b.file_id group by a.database_id",
      "Description": "Wait time (ms) of stall since last restart",
      "Usage": "GaugesWithLabels",
      "Columns": [
        {
          "Name": "name",
          "Label": "name",
          "Usage": "GaugeLabel",
          "Order": 0
        },
        {
          "Name": "writeQueuedIO",
          "Label": "writeQueuedIO",
          "Usage": "Gauge"
        }
      ]
    },
    {
      "Name": "mssql_io_stall_total",
      "Query": "SELECT cast(DB_Name(a.database_id) as varchar) as name,max(io_stall) as totalStall FROM sys.dm_io_virtual_file_stats(null, null) a INNER JOIN sys.master_files b ON a.database_id = b.database_id and a.file_id = b.file_id group by a.database_id",
      "Description": "Wait time (ms) of stall since last restart",
      "Usage": "GaugesWithLabels",
      "Columns": [
        {
          "Name": "name",
          "Label": "name",
          "Usage": "GaugeLabel",
          "Order": 0
        },
        {
          "Name": "totalStall",
          "Label": "totalStall",
          "Usage": "Gauge"
        }
      ]
    }
  ],
  "MillisecondTimeout": 4000
}

connection timeout

When running with 70 metrics during debugging, an error appears:


Content root path: C:\mssql_exporter
Now listening on: http://[::]:83
Application started. Press Ctrl+C to shut down.
[18:59:04 DBG] Connection id "0HM7GBT8HPG0E" accepted.
[18:59:04 DBG] Connection id "0HM7GBT8HPG0E" started.
[18:59:04 INF] Request starting HTTP/1.1 GET http://localhost:83/metrics
[18:59:04 DBG] Wildcard detected, all requests with hosts will be allowed.
[18:59:34 ERR] Query fastpayments_transactions_total failed
System.Data.SqlClient.SqlException (0x80131904): Timeout expired.  The timeout period elapsed prior to completion of the operation or the server is not responding.
 ---> System.ComponentModel.Win32Exception (258): The wait operation timed out.
   at System.Data.SqlClient.SqlConnection.OnError(SqlException exception, Boolean breakConnection, Action`1 wrapCloseInAction)
   at System.Data.SqlClient.SqlInternalConnection.OnError(SqlException exception, Boolean breakConnection, Action`1 wrapCloseInAction)
   at System.Data.SqlClient.TdsParser.ThrowExceptionAndWarning(TdsParserStateObject stateObj, Boolean callerHasConnectionLock, Boolean asyncClose)
   at System.Data.SqlClient.TdsParser.TryRun(RunBehavior runBehavior, SqlCommand cmdHandler, SqlDataReader dataStream, BulkCopySimpleResultSet bulkCopyHandler, TdsParserStateObject stateObj, Boolean& dataReady)
   at System.Data.SqlClient.SqlDataReader.TryConsumeMetaData()
   at System.Data.SqlClient.SqlDataReader.get_MetaData()
   at System.Data.SqlClient.SqlCommand.FinishExecuteReader(SqlDataReader ds, RunBehavior runBehavior, String resetOptionsString)
   at System.Data.SqlClient.SqlCommand.RunExecuteReaderTds(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, Boolean async, Int32 timeout, Task& task, Boolean asyncWrite, SqlDataReader ds)
   at System.Data.SqlClient.SqlCommand.RunExecuteReader(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, TaskCompletionSource`1 completion, Int32 timeout, Task& task, Boolean asyncWrite, String method)
   at System.Data.SqlClient.SqlCommand.RunExecuteReader(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, String method)
   at System.Data.SqlClient.SqlCommand.ExecuteReader(CommandBehavior behavior)
   at System.Data.SqlClient.SqlCommand.ExecuteDbDataReader(CommandBehavior behavior)
   at System.Data.Common.DbCommand.System.Data.IDbCommand.ExecuteReader(CommandBehavior behavior)
   at System.Data.Common.DbDataAdapter.FillInternal(DataSet dataset, DataTable[] datatables, Int32 startRecord, Int32 maxRecords, String srcTable, IDbCommand command, CommandBehavior behavior)
   at System.Data.Common.DbDataAdapter.Fill(DataSet dataSet, Int32 startRecord, Int32 maxRecords, String srcTable, IDbCommand command, CommandBehavior behavior)
   at System.Data.Common.DbDataAdapter.Fill(DataSet dataSet)
   at mssql_exporter.core.QueryExtensions.<>c__DisplayClass0_0.<MeasureWithConnection>b__0() in /home/vsts/work/1/s/src/core/QueryExtensions.cs:line 39
ClientConnectionId:1541ed4e-b3c8-4009-8693-c3f9f246c257
Error Number:-2,State:0,Class:11

Application launch:

mssql_exporter.exe serve -ConfigFile "metrics.json" -DataSource "Server=tcp:localhost,1433;Initial Catalog=dbname;Persist Security Info=False;User ID=userdb;Password=blablabla;MultipleActiveResultSets=False;Encrypt=False;TrustServerCertificate=False;Connection Timeout=600;" -ServerPort 83 -LogLevel Debug

MillisecondTimeout parameter: 60000

Why is there a timeout?
It feels like the effective timeout is 30 seconds.

Prometheus Error - invalid metric type "GAUGE"

I have Prometheus set up in a test environment connecting to a few servers for various exporters, and they are working fine. I have node exporter, wmi exporter, and mysql exporter, and I just added the mssql exporter. For the mssql exporter, I am getting this error in Prometheus: invalid metric type "GAUGE".

Thanks,
Anthony

Customizing the label names and extracting the values for the same

Hi,

We are trying to create metrics for a Windows service using your code.
We are not able to fetch the values for the corresponding labels; only the labels are displayed, including where the values should be.
Please find the screenshot attached.
[screenshot]

Kindly let me know how to set the value for a particular column.

Regards,
Rashmi

Collation Problem

Hi Daniel,
I cannot connect to SQL instances with a different collation;
it only works with the collation SQL_Latin1_General_CP1_CI_AS.
