openlit / openlit

OpenLIT is an open-source LLM Observability tool built on OpenTelemetry. 📈🔥 Monitor GPU performance, LLM traces with input and output metadata, and metrics like cost, tokens, and user interactions along with complete APM for LLM Apps. 🖥️

Home Page: https://docs.openlit.io

License: Apache License 2.0

Dockerfile 0.12% JavaScript 0.19% TypeScript 27.62% CSS 0.01% Shell 0.18% Python 71.88%
observability llmops ai-observability openai metrics anthropic clickhouse genai grafana langchain

openlit's Introduction

OpenLIT Logo

OTel-native Observability and Evals for LLMs & GPUs

Documentation | Quickstart | Python SDK

OpenLIT License Downloads GitHub Last Commit GitHub Contributors

Slack Discord X

OpenLIT Banner

OpenLIT is an OpenTelemetry-native tool designed to help developers gain insights into the performance of their LLM applications in production. It automatically collects LLM input and output metadata, and monitors GPU performance for self-hosted LLMs.

OpenLIT makes integrating observability into GenAI projects effortless with just a single line of code. Whether you're working with popular LLM providers such as OpenAI and HuggingFace, or leveraging vector databases like ChromaDB, OpenLIT ensures your applications are monitored seamlessly, providing critical insights including GPU performance stats for self-hosted LLMs to improve performance and reliability.

This project proudly follows the Semantic Conventions of the OpenTelemetry community, consistently updating to align with the latest standards in observability.
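For illustration, spans produced under these conventions carry attributes along the lines of the sketch below; the attribute names come from the OpenTelemetry GenAI semantic conventions, and the exact set OpenLIT emits may differ.

# Illustrative attributes on an LLM span, per the OpenTelemetry GenAI semantic
# conventions; the values are made up and the exact keys OpenLIT emits may differ.
example_span_attributes = {
    "gen_ai.system": "openai",
    "gen_ai.request.model": "gpt-4o-mini",
    "gen_ai.usage.input_tokens": 42,
    "gen_ai.usage.output_tokens": 128,
}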

What is LIT?

LIT stands for Learning Interpretability Tool, a visual and interactive tool designed for understanding AI models and visualizing data. The term LIT was introduced by Google.

⚡ Features

  • Advanced Monitoring of LLM and VectorDB Performance: OpenLIT offers automatic instrumentation that generates traces and metrics, providing insights into the performance and costs of your LLM and VectorDB usage. This helps you analyze how your applications perform in different environments, such as production, enabling you to optimize resource usage and scale efficiently.
  • Cost Tracking for Custom and Fine-Tuned Models: OpenLIT enables you to tailor cost tracking for specific models by using a custom JSON file. This feature allows for precise budgeting and ensures that cost estimations are perfectly aligned with your project needs (a minimal sketch follows this list).
  • OpenTelemetry-native & vendor-neutral SDKs: OpenLIT is built with native support for OpenTelemetry, making it blend seamlessly with your projects. This vendor-neutral approach reduces barriers to integration, making OpenLIT an intuitive part of your software stack rather than an additional complexity.
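As a sketch of the cost-tracking feature above: write your model prices to a JSON file and point the SDK at it. The file schema and the pricing_json argument shown here are assumptions for illustration; check the OpenLIT documentation for the exact format.

import json

import openlit

# Hypothetical pricing entry for a fine-tuned model (USD per 1K tokens);
# the real schema is defined in the OpenLIT docs and may differ from this sketch.
custom_pricing = {
    "chat": {
        "my-finetuned-gpt": {"promptPrice": 0.0005, "completionPrice": 0.0015}
    }
}

with open("pricing.json", "w") as f:
    json.dump(custom_pricing, f)

# pricing_json is assumed here to accept a path (or URL) to the custom pricing file.
openlit.init(pricing_json="pricing.json")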

🚀 Getting Started

flowchart TB;
    subgraph " "
        direction LR;
        subgraph " "
            direction LR;
            OpenLIT_SDK[OpenLIT SDK] -->|Sends Traces & Metrics| OTC[OpenTelemetry Collector];
            OTC -->|Stores Data| ClickHouseDB[ClickHouse];
        end
        subgraph " "
            direction RL;
            OpenLIT_UI[OpenLIT UI] -->|Pulls Data| ClickHouseDB;
        end
    end

Step 1: Deploy OpenLIT Stack

  1. Git Clone OpenLIT Repository

    git clone git@github.com:openlit/openlit.git
  2. Start Docker Compose

    docker-compose up -d

Step 2: Install OpenLIT SDK

Open your command line or terminal and run:

pip install openlit

Step 3: Initialize OpenLIT in your Application

Integrating OpenLIT into LLM applications is straightforward. Start monitoring for your LLM Application with just two lines of code:

import openlit

openlit.init()
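For context, here is a minimal sketch of an instrumented application, assuming the official openai Python package (v1+) and an OPENAI_API_KEY in your environment; the OpenAI call is just an example workload.

import openlit
from openai import OpenAI

openlit.init()  # instrumentation for supported libraries is enabled automatically

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# This call is picked up by OpenLIT's auto-instrumentation: prompt/response
# metadata, token counts, and cost show up in your traces and metrics.
response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Say hello to OpenLIT"}],
)
print(response.choices[0].message.content)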

To forward telemetry data to an HTTP OTLP endpoint, such as the OpenTelemetry Collector, set the otlp_endpoint parameter with the desired endpoint. Alternatively, you can configure the endpoint by setting the OTEL_EXPORTER_OTLP_ENDPOINT environment variable as recommended in the OpenTelemetry documentation.

💡 Info: If you don't provide the otlp_endpoint function argument or set the OTEL_EXPORTER_OTLP_ENDPOINT environment variable, the OpenLIT SDK directs traces to your console, which can be useful during development.

To send telemetry to OpenTelemetry backends requiring authentication, set the otlp_headers parameter with its desired value. Alternatively, you can configure the headers by setting the OTEL_EXPORTER_OTLP_HEADERS environment variable as recommended in the OpenTelemetry documentation.

Example


Initialize using Function Arguments

Add the following two lines to your application code:

import openlit

openlit.init(
  otlp_endpoint="http://127.0.0.1:4318", 
)


Initialize using Environment Variables

Add the following two lines to your application code:

import openlit

openlit.init()

Then, configure your OTLP endpoint using an environment variable:

export OTEL_EXPORTER_OTLP_ENDPOINT="http://127.0.0.1:4318"
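For backends that require authentication (see the otlp_headers note above), here is a minimal sketch; the Authorization value is a placeholder, and whether otlp_headers expects a dict or a "key=value" string may vary, so treat this as illustrative.

import openlit

openlit.init(
    otlp_endpoint="http://127.0.0.1:4318",
    # Placeholder credentials; the accepted type for otlp_headers (dict vs. string)
    # is an assumption here, so check the OpenLIT docs for the exact format.
    otlp_headers={"Authorization": "Basic <base64-credentials>"},
)

Alternatively, the standard OpenTelemetry environment variable takes comma-separated key=value pairs, for example export OTEL_EXPORTER_OTLP_HEADERS="Authorization=Basic <base64-credentials>".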

Step 4: Visualize and Optimize!

With the LLM Observability data now being collected and sent to OpenLIT, the next step is to visualize and analyze this data to gain insights into your LLM application's performance and behavior, and to identify areas of improvement.

Just head over to the OpenLIT UI at 127.0.0.1:3000 in your browser to start exploring. You can log in using the default credentials.

🌱 Contributing

Whether it's big or small, we love contributions 💚. Check out our Contribution guide to get started.

Unsure where to start? Here are a few ways to get involved:

  • Join our Slack or Discord community to discuss ideas, share feedback, and connect with both our team and the wider OpenLIT community.

Your input helps us grow and improve, and we're here to support you every step of the way.

💚 Community & Support

Connect with the OpenLIT community and maintainers for support, discussions, and updates:

  • 🌟 If you like OpenLIT, leave a star on our GitHub
  • 🌍 Join our Slack or Discord community for live interactions and questions.
  • 🐞 Report bugs on our GitHub Issues to help us improve OpenLIT.
  • 𝕏 Follow us on X for the latest updates and news.

License

OpenLIT is available under the Apache-2.0 license.

Visualize! Analyze! Optimize!

Join us on this voyage to reshape the future of AI Observability. Share your thoughts, suggest features, and explore contributions. Engage with us on GitHub and be part of OpenLIT's community-led innovation.

openlit's People

Contributors

amanagarwal041, cherishcai, dependabot[bot], fpreiss, murnitur, patcher9, patcher99, pratiksinghchauhan


openlit's Issues

Feat: Add support for Azure OpenAI monitoring

๐Ÿš€ What's the Problem?

Doku currently supports OpenAI monitoring, along with Cohere and Anthropic. LLM applications using Azure OpenAI don't have any monitoring support.

๐Ÿ’ก Your Dream Solution

Same level of support for monitoring Azure OpenAI based LLM Applications

๐Ÿค” Seen anything similar?

NA

๐Ÿ–ผ๏ธ Pictures or Drawings

NA

๐Ÿ‘ Want to Help Make It Happen?

  • Yes, I'd like to volunteer and help out with this!

Feat: Add more panels to the Dashboard page

๐Ÿš€ What's the Problem?

The LLM Monitoring dashboard currently only has a few panels which don't give much information

๐Ÿ’ก Your Dream Solution

More panels, like those in the Grafana dashboard we've built for the export functionality:
https://docs.dokulabs.com/0.0.1/integrations/grafanaoss

๐Ÿค” Seen anything similar?

1
2

๐Ÿ–ผ๏ธ Pictures or Drawings

Shared above

๐Ÿ‘ Want to Help Make It Happen?

  • Yes, I'd like to volunteer and help out with this!

Feat: Add support for Mistral AI monitoring

๐Ÿš€ What's the Problem?

Doku currently supports OpenAI monitoring, along with Cohere and Anthropic. LLM applications using Mistral AI don't have any monitoring support.

๐Ÿ’ก Your Dream Solution

Same level of support for monitoring Mistral AI based LLM Applications

๐Ÿค” Seen anything similar?

NA

๐Ÿ–ผ๏ธ Pictures or Drawings

NA

๐Ÿ‘ Want to Help Make It Happen?

  • Yes, I'd like to volunteer and help out with this!

Fix: Request Page load time, Date and Time for each request and sub-text for

๐Ÿ› What's Going Wrong?

  • The load time for the Requests page is slow. The SQL query itself returns quickly, so the rendering of the UI elements is probably causing the load-time issues
  • Set the default time range to 24h for both pages (Dashboard and Requests), and ideally have the time interval carried over between the two pages (same on visiting)
  • Add the time along with the date on each LLM request log panel
  • Add sub-text, like we do for total tokens, cost, etc., in the LLM request log panel

๐Ÿ•ต๏ธ Steps to Reproduce

Go to the UI and check out the Requests page

๐ŸŽฏ What Did You Expect?

Explained above

๐Ÿ“ธ Any Screenshots?

NA

๐Ÿ’ป Your Setup

  • Doku Version: [e.g., 0.0.1]
  • DokuMetry SDK: [Python or NodeJS]
  • DokuMetry SDK Version: [e.g., 0.0.3]
  • Deployment Method: [Helm, Linux, Windows, Docker]

๐Ÿ“ Additional Notes

NA

๐Ÿ‘ Want to Help Make It Happen?

  • Yes, I'd like to volunteer and help out with this!

Add Getting Started Steps in the UI

๐Ÿš€ What's the Problem?

When we log into the Doku UI for the first time (when we haven't created an API key or sent any data), the UI should show some instructions. The instructions should be such that all values are already added in the provided snippet.

๐Ÿ’ก Your Dream Solution

NA

๐Ÿค” Seen anything similar?

NA

๐Ÿ–ผ๏ธ Pictures or Drawings

NA

๐Ÿ‘ Want to Help Make It Happen?

  • Yes, I'd like to volunteer and help out with this!

Fix: loading states for different pages

๐Ÿ› What's Going Wrong?

There are no loading states for :

  • Login/logout Form
  • Dashboard page
  • All Request page

๐Ÿ•ต๏ธ Steps to Reproduce

To see the bug, what should we do?

  1. Start at '...'
  2. Click '...'
  3. Look for '...'
  4. Oops, there's the issue!

๐ŸŽฏ What Did You Expect?

Describe what you thought would happen.

๐Ÿ“ธ Any Screenshots?

Pictures can say a thousand words and can be super helpful!

๐Ÿ’ป Your Setup

  • Doku Version: [e.g., 0.0.1]
  • DokuMetry SDK: [Python or NodeJS]
  • DokuMetry SDK Version: [e.g., 0.0.3]
  • Deployment Method: [Helm, Linux, Windows, Docker]

๐Ÿ“ Additional Notes

Got more to say? Tell us here.

๐Ÿ‘ Want to Help Make It Happen?

  • Yes, I'd like to volunteer and help out with this!

Feat: Build a LLM Monitoring dashboard for DataDog

๐Ÿš€ What's the Problem?

While Doku is capable of exporting metrics to DataDog, we currently do not provide any premade dashboards. This absence of initial setup tools forces users to spend additional time and effort crafting their own dashboards to monitor their data effectively. This poses a challenge for users who may not be familiar with best practices for dashboard creation in DataDog, potentially leading to suboptimal monitoring setups.

๐Ÿ’ก Your Dream Solution

The ideal solution would be the development of a comprehensive LLM monitoring dashboard for DataDog tailored specifically for Doku's exported metrics. The goal is to allow users to monitor LLM Applications with:

  • Predefined Dashboard: A prebuilt dashboard to display LLM Monitoring data processed by Doku in a clear, insightful manner. These dashboards would highlight key performance indicators and provide users with an immediate overview of their system's health.

๐Ÿค” Seen anything similar?

NA

๐Ÿ–ผ๏ธ Pictures or Drawings

NA

๐Ÿ‘ Want to Help Make It Happen?

  • Yes, I'd like to volunteer and help out with this!

Feat: add theming colors to whole project

๐Ÿš€ What's the Problem?

Currently, our project's user interface lacks a cohesive color scheme that aligns with our logo.

๐Ÿ’ก Your Dream Solution

The ideal solution would involve integrating a theming system throughout our application that is both dynamic and versatile, allowing for our primary, secondary, and any additional branding colors to be consistently applied across all UI components. This theming system would:

  • Reflect Our Branding: Seamlessly integrate our logo's color palette (orange) throughout the application's interface, creating a visually cohesive experience.
  • Support Dark/Light Modes: Include both light and dark mode support, ensuring colors adapt well to both themes without compromising readability or visual appeal.
  • Customizable Components: Ensure that UI components such as buttons, headers, backgrounds, and text elements can dynamically adapt to the defined theme.

๐Ÿค” Seen anything similar?

NA

๐Ÿ–ผ๏ธ Pictures or Drawings

NA

๐Ÿ‘ Want to Help Make It Happen?

  • Yes, I'd like to volunteer and help out with this!

Build a dashboard/Mixin for Grafana Cloud

๐Ÿš€ What's the Problem?

While Doku is capable of exporting metrics to Grafana Cloud, we currently do not provide any premade dashboards or alert configurations for our users. This absence of initial setup tools forces users to spend additional time and effort crafting their own dashboards to monitor their data effectively. This poses a challenge for users who may not be familiar with best practices for dashboard creation in Grafana, potentially leading to suboptimal monitoring setups.

๐Ÿ’ก Your Dream Solution

The ideal solution would be the development of a comprehensive monitoring mixin for Grafana Cloud tailored specifically for Doku's exported metrics. This mixin would include:

  • Predefined Dashboards: Prebuilt dashboards to display LLM Monitoring data processed by Doku in a clear, insightful manner. These dashboards would highlight key performance indicators and provide users with an immediate overview of their system's health.

๐Ÿค” Seen anything similar?

๐Ÿ–ผ๏ธ Pictures or Drawings

NA

๐Ÿ‘ Want to Help Make It Happen?

  • Yes, I'd like to volunteer and help out with this!

[Bug]: Time interval should be carried forward between the dashboard and request page

๐Ÿ› What's Going Wrong?

When we update the time filter on dashboard and switch to request page, the filter value goes back to default.

๐Ÿ•ต๏ธ Steps to Reproduce

To see the bug, what should we do?

  1. Go to dashboard page
  2. Change time filter
  3. Go to request page via navigation
  4. The filter goes back to default

๐ŸŽฏ What Did You Expect?

It should ideally be carried on the navigation

๐Ÿ“ธ Any Screenshots?

Pictures can say a thousand words and can be super helpful!

๐Ÿ’ป Your Setup

  • Doku Version: [e.g., 0.0.1]
  • DokuMetry SDK: [Python or NodeJS]
  • DokuMetry SDK Version: [e.g., 0.0.3]
  • Deployment Method: [Helm, Linux, Windows, Docker]

๐Ÿ“ Additional Notes

Got more to say? Tell us here.

๐Ÿ‘ Want to Help Make It Happen?

  • Yes, I'd like to volunteer and help out with this!

[FEAT]: Add `GET` for `/api/connections` endpoint in Ingester

๐Ÿš€ What's the Problem?

Currently, there's no straightforward way for users to verify or view the configuration details of the observability platform they have set Doku up to send LLM observability data to. This lack of transparency can lead to confusion or misconfiguration issues, as users cannot easily confirm the current setup.

๐Ÿ’ก Your Dream Solution

I envision a robust GET endpoint that users can call to retrieve the current configuration details of their observability platform. This endpoint would ideally return information such as the platform name (e.g., Grafana, NewRelic), API keys, the URLs for metrics and logs endpoints, any relevant usernames, and timestamps showing when the configuration was last updated.

๐Ÿค” Seen anything similar?

NA

๐Ÿ–ผ๏ธ Pictures or Drawings

No specific diagrams provided, but imagine a simple JSON structure depicting a clear, hierarchical organization of configuration details like

{
    "connections": {
        "apiKey": "xxx",
        "created_at": "2024-03-17T18:12:54Z",
        "logsUrl": "https://logxxx.com",
        "logsUsername": "123",
        "metricsUrl": "https://metrics.com",
        "metricsUsername": "123",
        "platform": "grafana"
    },
    "status": 200
}
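To make the proposal concrete, a hypothetical client call against the proposed endpoint could look like the sketch below; the Ingester URL is a placeholder and this endpoint does not exist yet.

import requests

INGESTER_URL = "http://localhost:8080"  # placeholder; use your Doku Ingester address

# Hypothetical GET against the proposed /api/connections endpoint.
resp = requests.get(f"{INGESTER_URL}/api/connections", timeout=10)
resp.raise_for_status()
config = resp.json()
print(config["connections"]["platform"])  # e.g. "grafana", per the sample payload above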

๐Ÿ‘ Want to Help Make It Happen?

  • Yes, I'd like to volunteer and help out with this!

Add a page for Tracking all requests

๐Ÿš€ What's the Problem?

Add a page for tracking all requests made to OpenAI, Anthropic, and Cohere

๐Ÿ’ก Your Dream Solution

Users should be able to filter by Platform, Category, Application and Environment

๐Ÿค” Seen anything similar?

NA

๐Ÿ–ผ๏ธ Pictures or Drawings

NA

๐Ÿ‘ Want to Help Make It Happen?

  • Yes, I'd like to volunteer and help out with this!

[Feat]: Reduce client docker image size using alpine

๐Ÿš€ What's the Problem?

The Doku client image is currently a whopping 2.14 GB in size. Reduce the size of this image using Alpine base images

๐Ÿ’ก Your Dream Solution

Reduce the image size to under 200 MB

๐Ÿค” Seen anything similar?

Share other things you've tried or seen that are close to what you're thinking.

๐Ÿ–ผ๏ธ Pictures or Drawings

Share any picture or drawing that shows your idea if you have.

๐Ÿ‘ Want to Help Make It Happen?

  • Yes, I'd like to volunteer and help out with this!

[Bug]: Minor UI fixes

๐Ÿ› What's Going Wrong?

  • Logo on the Home page is not a PNG
  • At the bottom, the Slack invite link seems to have expired; update it to use our Slack workspace URL?
  • Auth pages: make the corners a bit more rounded
  • Fix the Doku logo in the dashboard sidebar
  • On the Profile page, maybe add the email, since that's what the user used to sign in?
  • On the Requests page, when I open an Audio Speech request, it shows 00 between the cost and the source language

image

  • Why does the Requests page show the page as 1 of 0, 2 of 0? That should probably be fixed

image

  • On the Requests page, when opening an image request, it doesn't show the image
    image

Doku Ingester

๐Ÿ•ต๏ธ Steps to Reproduce

To see the bug, what should we do?

  1. Start at '...'
  2. Click '...'
  3. Look for '...'
  4. Oops, there's the issue!

๐ŸŽฏ What Did You Expect?

Describe what you thought would happen.

๐Ÿ“ธ Any Screenshots?

Pictures can say a thousand words and can be super helpful!

๐Ÿ’ป Your Setup

  • Doku Version: [e.g., 0.0.1]
  • DokuMetry SDK: [Python or NodeJS]
  • DokuMetry SDK Version: [e.g., 0.0.3]
  • Deployment Method: [Helm, Linux, Windows, Docker]

๐Ÿ“ Additional Notes

Got more to say? Tell us here.

๐Ÿ‘ Want to Help Make It Happen?

  • Yes, I'd like to volunteer and help out with this!

Add LLM Monitoring Data retention policy to `DOKU_LLM_DATA` Table

๐Ÿš€ What's the Problem?

The LLM Monitoring data stored in the DOKU_LLM_DATA table currently has no retention period, which can lead to storage issues

๐Ÿ’ก Your Dream Solution

Add data retention policy which can be user configured with a default of 6 months.

๐Ÿค” Seen anything similar?

https://docs.timescale.com/use-timescale/latest/data-retention/create-a-retention-policy/
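For reference, a minimal sketch of the TimescaleDB approach linked above, assuming psycopg2, that DOKU_LLM_DATA is a hypertable, and placeholder connection details:

import psycopg2

RETENTION = "6 months"  # user-configurable, defaulting to 6 months per this issue

conn = psycopg2.connect("dbname=doku user=doku")  # placeholder connection string
with conn, conn.cursor() as cur:
    # add_retention_policy is TimescaleDB's built-in policy API; it schedules
    # dropping chunks older than the given interval.
    cur.execute(
        "SELECT add_retention_policy('DOKU_LLM_DATA', INTERVAL %s);",
        (RETENTION,),
    )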

๐Ÿ–ผ๏ธ Pictures or Drawings

NA

๐Ÿ‘ Want to Help Make It Happen?

  • Yes, I'd like to volunteer and help out with this!

[Bug]: Check load times for the request page, it seems to be a little slow.

๐Ÿ› What's Going Wrong?

Explain the bug. Tell us what's happening that shouldn't be.

๐Ÿ•ต๏ธ Steps to Reproduce

To see the bug, what should we do?

  1. Start at '...'
  2. Click '...'
  3. Look for '...'
  4. Oops, there's the issue!

๐ŸŽฏ What Did You Expect?

Describe what you thought would happen.

๐Ÿ“ธ Any Screenshots?

Pictures can say a thousand words and can be super helpful!

๐Ÿ’ป Your Setup

  • Doku Version: [e.g., 0.0.1]
  • DokuMetry SDK: [Python or NodeJS]
  • DokuMetry SDK Version: [e.g., 0.0.3]
  • Deployment Method: [Helm, Linux, Windows, Docker]

๐Ÿ“ Additional Notes

Got more to say? Tell us here.

๐Ÿ‘ Want to Help Make It Happen?

  • Yes, I'd like to volunteer and help out with this!

Create apis for different stats

๐Ÿš€ What's the Problem?

  • Finding Total Requests
  • Finding Request per time
  • Finding Avg Request Duration
  • Finding Total Cost
  • Finding Avg Cost per request
  • Finding Top models
  • Finding Distribution by generation categories

๐Ÿ’ก Your Dream Solution

Tell us what you wish for, with as much detail as you like.

๐Ÿค” Seen anything similar?

Share other things you've tried or seen that are close to what you're thinking.

๐Ÿ–ผ๏ธ Pictures or Drawings

Share any picture or drawing that shows your idea if you have.

๐Ÿ‘ Want to Help Make It Happen?

  • Yes, I'd like to volunteer and help out with this!

Feat: Add temporary home page

๐Ÿš€ What's the Problem?

Tell us what's not working right or what's getting in your way.

๐Ÿ’ก Your Dream Solution

Tell us what you wish for, with as much detail as you like.

๐Ÿค” Seen anything similar?

Share other things you've tried or seen that are close to what you're thinking.

๐Ÿ–ผ๏ธ Pictures or Drawings

Share any picture or drawing that shows your idea if you have.

๐Ÿ‘ Want to Help Make It Happen?

  • Yes, I'd like to volunteer and help out with this!

Support for `Async` OpenAI Python usage

๐Ÿš€ What's the Problem?

The current DokuMetry Python SDK doesn't support Async OpenAI

๐Ÿ’ก Your Dream Solution

Similar setup for Async OpenAI as we have for normal OpenAI usage

๐Ÿค” Seen anything similar?

NA

๐Ÿ–ผ๏ธ Pictures or Drawings

NA

๐Ÿ‘ Want to Help Make It Happen?

  • Yes, I'd like to volunteer and help out with this!

[Feat]: Add Monitoring Integration for Pinecone

๐Ÿš€ What's the Problem?

With Doku, we can easily monitor our LLM applications, but other parts of the LLM stack, like vector DBs, are currently not supported

๐Ÿ’ก Your Dream Solution

Support monitoring of vector DBs like Pinecone to begin with

๐Ÿค” Seen anything similar?

Pinecone already emits standard Prometheus-style metrics which can be used as-is by the Doku Ingester

๐Ÿ–ผ๏ธ Pictures or Drawings

NA

๐Ÿ‘ Want to Help Make It Happen?

  • Yes, I'd like to volunteer and help out with this!

Create user login, register and logout

๐Ÿš€ What's the Problem?

Tell us what's not working right or what's getting in your way.

๐Ÿ’ก Your Dream Solution

Tell us what you wish for, with as much detail as you like.

๐Ÿค” Seen anything similar?

Share other things you've tried or seen that are close to what you're thinking.

๐Ÿ–ผ๏ธ Pictures or Drawings

Share any picture or drawing that shows your idea if you have.

๐Ÿ‘ Want to Help Make It Happen?

  • Yes, I'd like to volunteer and help out with this!

Add a page for API Keys in the UI

๐Ÿš€ What's the Problem?

Add a page in the UI for creating and Deleting API Keys

๐Ÿ’ก Your Dream Solution

The API Key should be visible only for the first time, and should be masked in the UI

๐Ÿค” Seen anything similar?

OpenAI UI has a good page for managing API Keys

๐Ÿ–ผ๏ธ Pictures or Drawings

๐Ÿ‘ Want to Help Make It Happen?

  • Yes, I'd like to volunteer and help out with this!

Fix redirection for auth and create common auth components

๐Ÿ› What's Going Wrong?

Explain the bug. Tell us what's happening that shouldn't be.

๐Ÿ•ต๏ธ Steps to Reproduce

To see the bug, what should we do?

  1. Start at '...'
  2. Click '...'
  3. Look for '...'
  4. Oops, there's the issue!

๐ŸŽฏ What Did You Expect?

Describe what you thought would happen.

๐Ÿ“ธ Any Screenshots?

Pictures can say a thousand words and can be super helpful!

๐Ÿ’ป Your Setup

  • Doku Version: [e.g., 0.0.1]
  • DokuMetry SDK: [Python or NodeJS]
  • DokuMetry SDK Version: [e.g., 0.0.3]
  • Deployment Method: [Helm, Linux, Windows, Docker]

๐Ÿ“ Additional Notes

Got more to say? Tell us here.

๐Ÿ‘ Want to Help Make It Happen?

  • Yes, I'd like to volunteer and help out with this!

Feat: Add `platform`, `generation` and `job` labels to LLM Observability data exported to Grafana Cloud

๐Ÿ› What's Going Wrong?

When sending LLM Monitoring data to Grafana Cloud, we don't extract the platform and generation type from the endpoint key, which makes it difficult to build dashboards and aggregate data by provider and generation

๐Ÿ•ต๏ธ Steps to Reproduce

To see the bug, what should we do?

  1. Export monitoring data from LLM Applications to Grafana Cloud via Doku Ingester

๐ŸŽฏ What Did You Expect?

A single label to aggregate all metrics, plus the ability to drill down into metrics by platform and generation

๐Ÿ“ธ Any Screenshots?

NA

๐Ÿ’ป Your Setup

  • Doku Version: 0.0.1
  • DokuMetry SDK: Python/NodeJS
  • DokuMetry SDK Version: 0.0.3
  • Deployment Method: Helm

๐Ÿ“ Additional Notes

NA

๐Ÿ‘ Want to Help Make It Happen?

  • Yes, I'd like to volunteer and help out with this!

Update: Fix `id` in `DOKU_DATA` table to be a UUID instead of `BIGSERIAL`

๐Ÿ› What's Going Wrong?

The id field is currently a BIGSERIAL, which can lead to security issues (sequential IDs are predictable and enumerable). Convert it to a UUID.

๐Ÿ•ต๏ธ Steps to Reproduce

NA

๐ŸŽฏ What Did You Expect?

id field to be UUID

๐Ÿ“ธ Any Screenshots?

NA

๐Ÿ’ป Your Setup

  • Doku Version: 0.0.1
  • DokuMetry SDK: Python
  • DokuMetry SDK Version: 0.0.3
  • Deployment Method: Helm

๐Ÿ“ Additional Notes

NA

๐Ÿ‘ Want to Help Make It Happen?

  • Yes, I'd like to volunteer and help out with this!

[Feat]: Implement state management tool app wide

๐Ÿš€ What's the Problem?

Recommended tool : zustand

๐Ÿ’ก Your Dream Solution

Tell us what you wish for, with as much detail as you like.

๐Ÿค” Seen anything similar?

Share other things you've tried or seen that are close to what you're thinking.

๐Ÿ–ผ๏ธ Pictures or Drawings

Share any picture or drawing that shows your idea if you have.

๐Ÿ‘ Want to Help Make It Happen?

  • Yes, I'd like to volunteer and help out with this!

[Feat]: Add export Connection for OpenObserve

๐Ÿš€ What's the Problem?

Add a Connection for exporting LLM Monitoring metrics and logs to OpenObserve

๐Ÿ’ก Your Dream Solution

Similar to the current setup for DataDog, etc.: auto-export all LLM Observability data from the Ingester

๐Ÿค” Seen anything similar?

NA

๐Ÿ–ผ๏ธ Pictures or Drawings

NA

๐Ÿ‘ Want to Help Make It Happen?

  • Yes, I'd like to volunteer and help out with this!

[Feat]: Support Tracing for RAG based Applications

๐Ÿš€ What's the Problem?

Currently, for direct use of OpenAI and similar models, we are able to gather metrics. But when using a RAG-based approach in our LLM applications, we don't get a full overview of the steps taken by the model

๐Ÿ’ก Your Dream Solution

A flag in the DokuMetry SDK for collecting OpenTelemetry-based traces and spans

๐Ÿค” Seen anything similar?

OpenTelemetry libraries already support tracing and these can be visualised easily in any of the observability tools.

๐Ÿ–ผ๏ธ Pictures or Drawings

NA

๐Ÿ‘ Want to Help Make It Happen?

  • Yes, I'd like to volunteer and help out with this!

Add retries to Database connection in Ingester

๐Ÿ› What's Going Wrong?

When the database is not running or is currently unavailable, the initialization in the Ingester fails and the whole process restarts

๐Ÿ•ต๏ธ Steps to Reproduce

To see the bug, what should we do?

  1. Install Ingester without TimescaleDB access

๐ŸŽฏ What Did You Expect?

The Ingester should attempt at least 5 retries before exiting the whole process

๐Ÿ“ธ Any Screenshots?

NA

๐Ÿ’ป Your Setup

  • Doku Version: 0.0.1
  • DokuMetry SDK: NA
  • DokuMetry SDK Version: NA
  • Deployment Method: Helm

๐Ÿ“ Additional Notes

NA

๐Ÿ‘ Want to Help Make It Happen?

  • Yes, I'd like to volunteer and help out with this!

Feat: Add `platform`, `generation` and `job.Name` labels to LLM Observability data exported to DataDog and New Relic

๐Ÿ› What's Going Wrong?

When sending LLM Monitoring data to DataDog and New Relic, we don't extract the platform and generation type from the endpoint key, which makes it difficult to build dashboards and aggregate data by provider and generation

๐Ÿ•ต๏ธ Steps to Reproduce

To see the bug, what should we do?

  1. Export monitoring data from LLM Applications to DataDog and New Relic via Doku Ingester

๐ŸŽฏ What Did You Expect?

A single label to aggregate all metrics, plus the ability to drill down into metrics by platform and generation

๐Ÿ“ธ Any Screenshots?

NA

๐Ÿ’ป Your Setup

  • Doku Version: 0.0.3
  • DokuMetry SDK: Python/NodeJS
  • DokuMetry SDK Version: 0.0.4
  • Deployment Method: Helm

๐Ÿ“ Additional Notes

NA

๐Ÿ‘ Want to Help Make It Happen?

  • Yes, I'd like to volunteer and help out with this!

[Feat]: Add docker image to build client UI

๐Ÿš€ What's the Problem?

Tell us what's not working right or what's getting in your way.

๐Ÿ’ก Your Dream Solution

Tell us what you wish for, with as much detail as you like.

๐Ÿค” Seen anything similar?

Share other things you've tried or seen that are close to what you're thinking.

๐Ÿ–ผ๏ธ Pictures or Drawings

Share any picture or drawing that shows your idea if you have.

๐Ÿ‘ Want to Help Make It Happen?

  • Yes, I'd like to volunteer and help out with this!

[Feat]: Move to sqlite for doku UI client

๐Ÿš€ What's the Problem?

The problem right now is that if a user installs Doku, they end up running four services: the Ingester, the UI client, ClickHouse, and Postgres. We want to keep resource usage limited, so we will move the UI client DB to SQLite.

[Feat]: Create a page to add database configuration

๐Ÿš€ What's the Problem?

Provide a mechanism to let users add their databases for the metrics

๐Ÿ’ก Your Dream Solution

Tell us what you wish for, with as much detail as you like.

๐Ÿค” Seen anything similar?

Share other things you've tried or seen that are close to what you're thinking.

๐Ÿ–ผ๏ธ Pictures or Drawings

Share any picture or drawing that shows your idea if you have.

๐Ÿ‘ Want to Help Make It Happen?

  • Yes, I'd like to volunteer and help out with this!

[Feat]: Reduce Doku Ingester Image size using Multi Stage Docker build and alpine image

๐Ÿš€ What's the Problem?

The Doku Ingester image is currently a whopping 1.2 GB in size. Reduce the size of this image using multi-stage builds and Alpine images

๐Ÿ’ก Your Dream Solution

Image size under 100 MiB

๐Ÿค” Seen anything similar?

NA

๐Ÿ–ผ๏ธ Pictures or Drawings

NA

๐Ÿ‘ Want to Help Make It Happen?

  • Yes, I'd like to volunteer and help out with this!

Feat: Update Anthropic usage monitoring in `DokuMetry`

๐Ÿš€ What's the Problem?

Anthropic has released an update to their APIs and SDKs which needs corresponding updates in DokuMetry

๐Ÿ’ก Your Dream Solution

Support messages with sync and async in DokuMetry

๐Ÿค” Seen anything similar?

NA

๐Ÿ–ผ๏ธ Pictures or Drawings

NA

๐Ÿ‘ Want to Help Make It Happen?

  • Yes, I'd like to volunteer and help out with this!

[Feat]: Add caching layer to fetch active db configs and other settings

๐Ÿš€ What's the Problem?

Recommended tool : lru-cache

๐Ÿ’ก Your Dream Solution

Tell us what you wish for, with as much detail as you like.

๐Ÿค” Seen anything similar?

Share other things you've tried or seen that are close to what you're thinking.

๐Ÿ–ผ๏ธ Pictures or Drawings

Share any picture or drawing that shows your idea if you have.

๐Ÿ‘ Want to Help Make It Happen?

  • Yes, I'd like to volunteer and help out with this!

Build a dashboard/Mixin for Grafana OSS

๐Ÿš€ What's the Problem?

Tell us what's not working right or what's getting in your way.

๐Ÿ’ก Your Dream Solution

Tell us what you wish for, with as much detail as you like.

๐Ÿค” Seen anything similar?

Share other things you've tried or seen that are close to what you're thinking.

๐Ÿ–ผ๏ธ Pictures or Drawings

Share any picture or drawing that shows your idea if you have.

๐Ÿ‘ Want to Help Make It Happen?

  • Yes, I'd like to volunteer and help out with this!

[Feat]: Add connections page for exporting functionality through ingester

๐Ÿš€ What's the Problem?

Through this page, users can add connections to export the LLM data to their respective platforms mentioned below:

  • grafana
  • newrelic
  • datadog
  • dynatrace
  • signoz

๐Ÿ’ก Your Dream Solution

Tell us what you wish for, with as much detail as you like.

๐Ÿค” Seen anything similar?

Share other things you've tried or seen that are close to what you're thinking.

๐Ÿ–ผ๏ธ Pictures or Drawings

Share any picture or drawing that shows your idea if you have.

๐Ÿ‘ Want to Help Make It Happen?

  • Yes, I'd like to volunteer and help out with this!

[Feat]: Add Monitoring Integration for LangChain framework

๐Ÿš€ What's the Problem?

Currently we support LLM monitoring integration when directly using the official SDKs. LangChain is a popular choice when building RAG-based applications, but Doku doesn't currently support monitoring it

๐Ÿ’ก Your Dream Solution

Similar to the OpenAI monitoring setup, build an LLM monitoring integration for LangChain

๐Ÿค” Seen anything similar?

NA

๐Ÿ–ผ๏ธ Pictures or Drawings

NA

๐Ÿ‘ Want to Help Make It Happen?

  • Yes, I'd like to volunteer and help out with this!

Track LLM Request ID as `llmReqId`

๐Ÿš€ What's the Problem?

dokumetry SDKs currently do not track LLM Request ID and the ingester also does not store it in the DOKU_LLM_DATA table

๐Ÿ’ก Your Dream Solution

Track LLM Request ID as LLM_REQ_ID

๐Ÿค” Seen anything similar?

NA

๐Ÿ–ผ๏ธ Pictures or Drawings

NA

๐Ÿ‘ Want to Help Make It Happen?

  • Yes, I'd like to volunteer and help out with this!

[Feat]: Add Evaluation and Tuning Playground

๐Ÿš€ What's the Problem?

Currently we can't quickly experiment with and compare different models side by side on their responses and related metrics like token usage, cost, and latency

๐Ÿ’ก Your Dream Solution

Add a playground where users can compare different models from different providers side by side on the same set of prompts.

๐Ÿค” Seen anything similar?

OpenAI Playground but think of it for multiple model providers and for testing a single prompt rather than the whole chat

๐Ÿ–ผ๏ธ Pictures or Drawings

NA

๐Ÿ‘ Want to Help Make It Happen?

  • Yes, I'd like to volunteer and help out with this!

Feat: Add `created_on` fields to the `DOKU_APIKEYS` table

๐Ÿš€ What's the Problem?

The created_on field is not stored when creating API Keys

๐Ÿ’ก Your Dream Solution

created_on field in the table

๐Ÿค” Seen anything similar?

NA

๐Ÿ–ผ๏ธ Pictures or Drawings

NA

๐Ÿ‘ Want to Help Make It Happen?

  • Yes, I'd like to volunteer and help out with this!

[Feat]: Track OpenAI Usage with No-code

๐Ÿš€ What's the Problem?

OpenAI has a hidden API endpoint; we can make a curl request to get the token usage, the number of requests, and some basic details that are also shown on the OpenAI Usage page. This is a sample curl request to get this information:

curl -X GET 'https://api.openai.com/v1/usage?date=2024-03-02' \
-H "Authorization: Bearer sk-xxxxxxxx" \
-H "Openai-Organization: org-xxxxxx"

๐Ÿ’ก Your Dream Solution

  • Users who are not looking for per-request information, prompts, and inner details (essentially those using LLM tools where they only have to pass an OpenAI API key) can pass the keys and orgs they want details for, and the Ingester runs a cron job every 24 hours to fetch the usage for the day and add it to the database.

๐Ÿค” Seen anything similar?

llm.report has a similar functionality

๐Ÿ–ผ๏ธ Pictures or Drawings

NA

๐Ÿ‘ Want to Help Make It Happen?

  • Yes, I'd like to volunteer and help out with this!

Feat: Build a LLM Monitoring dashboard for New Relic

๐Ÿš€ What's the Problem?

While Doku is capable of exporting metrics to New Relic, we currently do not provide any premade dashboards. This absence of initial setup tools forces users to spend additional time and effort crafting their own dashboards to monitor their data effectively. This poses a challenge for users who may not be familiar with best practices for dashboard creation in New Relic, potentially leading to suboptimal monitoring setups.

๐Ÿ’ก Your Dream Solution

The ideal solution would be the development of a comprehensive LLM monitoring dashboard for New Relic tailored specifically for Doku's exported metrics. The goal is to allow users to monitor LLM Applications with:

  • Predefined Dashboard: A prebuilt dashboard to display LLM Monitoring data processed by Doku in a clear, insightful manner. These dashboards would highlight key performance indicators and provide users with an immediate overview of their system's health.

๐Ÿค” Seen anything similar?

NA

๐Ÿ–ผ๏ธ Pictures or Drawings

NA

๐Ÿ‘ Want to Help Make It Happen?

  • Yes, I'd like to volunteer and help out with this!

[Feat]: Add init db seed in case of local clickhouse installation

๐Ÿš€ What's the Problem?

Add an init DB seed for creating a user with the local ClickHouse DB config. Update the client README accordingly

๐Ÿ’ก Your Dream Solution

Tell us what you wish for, with as much detail as you like.

๐Ÿค” Seen anything similar?

Share other things you've tried or seen that are close to what you're thinking.

๐Ÿ–ผ๏ธ Pictures or Drawings

Share any picture or drawing that shows your idea if you have.

๐Ÿ‘ Want to Help Make It Happen?

  • Yes, I'd like to volunteer and help out with this!

Feat: Add a view to display request details on click of any request in all request page

๐Ÿš€ What's the Problem?

There is no way to display all the details of a request

๐Ÿ’ก Your Dream Solution

Display the data in a similar kind of popover:
Screenshot 2024-02-07 at 2 42 14 AM

๐Ÿค” Seen anything similar?

Share other things you've tried or seen that are close to what you're thinking.

๐Ÿ–ผ๏ธ Pictures or Drawings

Share any picture or drawing that shows your idea if you have.

๐Ÿ‘ Want to Help Make It Happen?

  • Yes, I'd like to volunteer and help out with this!

[Feat]: Getting started page

๐Ÿš€ What's the Problem?

Define the steps to introduce dokumetry into the user's backend, and determine whether the database is connected or not

๐Ÿ’ก Your Dream Solution

Tell us what you wish for, with as much detail as you like.

๐Ÿค” Seen anything similar?

Share other things you've tried or seen that are close to what you're thinking.

๐Ÿ–ผ๏ธ Pictures or Drawings

Share any picture or drawing that shows your idea if you have.

๐Ÿ‘ Want to Help Make It Happen?

  • Yes, I'd like to volunteer and help out with this!

[Feature]: Integrate Metrics APIs for dashboard

๐Ÿš€ What's the Problem?

Integrate the APIs mentioned in issue #7 to display the UI on the dashboard screen

๐Ÿ’ก Your Dream Solution

Tell us what you wish for, with as much detail as you like.

๐Ÿค” Seen anything similar?

Share other things you've tried or seen that are close to what you're thinking.

๐Ÿ–ผ๏ธ Pictures or Drawings

Share any picture or drawing that shows your idea if you have.

๐Ÿ‘ Want to Help Make It Happen?

  • Yes, I'd like to volunteer and help out with this!

[Feat]: Add Alerting support

๐Ÿš€ What's the Problem?

In our current setup, with Doku monitoring OpenAI API calls, we lack the ability to proactively monitor and react to critical metrics such as usage spikes, approaching request-limit thresholds, and other relevant statistics that are pivotal for maintaining operational stability and cost management. Without real-time alerts, teams are often reactive rather than proactive, leading to disrupted services or unnecessary expenses.

๐Ÿ’ก Your Dream Solution

A robust, configurable alerting system within LLM Observability that allows users to:

  1. Pre-built Alerts: Alerts should be prebuilt into Doku with proper default thresholds
  2. Set Custom Thresholds: Users should be able to define custom thresholds for various metrics (e.g., number of requests, usage spikes) beyond which alerts would be triggered.
  3. Multi-Channel Alerts: Alerts should be dispatchable through multiple channels, including email, SMS, Slack, or webhooks, to ensure prompt notifications. We can start with email and Slack maybe?
  4. Comprehensive Dashboard: An integrated dashboard that displays current metrics, historical data, and alert configuration settings for easy monitoring and management.

๐Ÿค” Seen anything similar?

Similar to Prometheus AlertManager where user can silence alerts, send to different channels etc.

๐Ÿ–ผ๏ธ Pictures or Drawings

Share any picture or drawing that shows your idea if you have.

๐Ÿ‘ Want to Help Make It Happen?

  • Yes, I'd like to volunteer and help out with this!
