
Comments (10)

mintuhouse commented on August 24, 2024

btw, there is a pending PR reverting the change linked above
(not sure if it will get merged, though)

The above PR won't be merged. Can this be fixed?

from dd-trace-py.

theseus905 commented on August 24, 2024

Will a fix for this be included in v2.9.0?


Yun-Kim commented on August 24, 2024

Thanks for raising the issue, and apologies for the late response.

We're working on getting a fix out ASAP. In the meantime, as a workaround, I recommend pinning your version of langchain-openai below 0.1.2 (langchain-openai<0.1.2).
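If it helps, the pin can be expressed as a requirements constraint (a minimal sketch assuming a pip/requirements.txt setup; the version bound is the one given above):

```text
# requirements.txt — temporary workaround until the ddtrace fix ships
langchain-openai<0.1.2
```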


emmettbutler commented on August 24, 2024

@Yun-Kim


mintuhouse commented on August 24, 2024

Adding another stack trace (not using async):

ERROR:    Exception in ASGI application
  + Exception Group Traceback (most recent call last):
  |   File "/.cache/venv/lib/python3.12/site-packages/starlette/_utils.py", line 87, in collapse_excgroups
  |     yield
  |   File "/.cache/venv/lib/python3.12/site-packages/starlette/middleware/base.py", line 190, in __call__
  |     async with anyio.create_task_group() as task_group:
  |   File "/.cache/venv/lib/python3.12/site-packages/anyio/_backends/_asyncio.py", line 678, in __aexit__
  |     raise BaseExceptionGroup(
  | ExceptionGroup: unhandled errors in a TaskGroup (1 sub-exception)
  +-+---------------- 1 ----------------
    | Traceback (most recent call last):
    |   File "/.cache/venv/lib/python3.12/site-packages/uvicorn/protocols/http/httptools_impl.py", line 426, in run_asgi
    |     result = await app(  # type: ignore[func-returns-value]
    |              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    |   File "/.cache/venv/lib/python3.12/site-packages/uvicorn/middleware/proxy_headers.py", line 84, in __call__
    |     return await self.app(scope, receive, send)
    |            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    |   File "/.cache/venv/lib/python3.12/site-packages/fastapi/applications.py", line 1054, in __call__
    |     await super().__call__(scope, receive, send)
    |   File "/.cache/venv/lib/python3.12/site-packages/starlette/applications.py", line 123, in __call__
    |     await self.middleware_stack(scope, receive, send)
    |   File "/.cache/venv/lib/python3.12/site-packages/ddtrace/contrib/asgi/middleware.py", line 290, in __call__
    |     return await self.app(scope, receive, wrapped_send)
    |            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    |   File "/.cache/venv/lib/python3.12/site-packages/starlette/middleware/errors.py", line 186, in __call__
    |     raise exc
    |   File "/.cache/venv/lib/python3.12/site-packages/starlette/middleware/errors.py", line 164, in __call__
    |     await self.app(scope, receive, _send)
    |   File "/.cache/venv/lib/python3.12/site-packages/asgi_correlation_id/middleware.py", line 90, in __call__
    |     await self.app(scope, receive, handle_outgoing_request)
    |   File "/.cache/venv/lib/python3.12/site-packages/starlette/middleware/base.py", line 189, in __call__
    |     with collapse_excgroups():
    |   File "/usr/local/lib/python3.12/contextlib.py", line 158, in __exit__
    |     self.gen.throw(value)
    |   File "/.cache/venv/lib/python3.12/site-packages/starlette/_utils.py", line 93, in collapse_excgroups
    |     raise exc
    |   File "/.cache/venv/lib/python3.12/site-packages/starlette/middleware/base.py", line 191, in __call__
    |     response = await self.dispatch_func(request, call_next)
    |                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    |   File "/mlai/src/config/logger.py", line 28, in log_request
    |     return await call_next(request)
    |            ^^^^^^^^^^^^^^^^^^^^^^^^
    |   File "/.cache/venv/lib/python3.12/site-packages/starlette/middleware/base.py", line 165, in call_next
    |     raise app_exc
    |   File "/.cache/venv/lib/python3.12/site-packages/starlette/middleware/base.py", line 151, in coro
    |     await self.app(scope, receive_or_disconnect, send_no_error)
    |   File "/.cache/venv/lib/python3.12/site-packages/starlette/middleware/trustedhost.py", line 53, in __call__
    |     await self.app(scope, receive, send)
    |   File "/.cache/venv/lib/python3.12/site-packages/starlette/middleware/base.py", line 189, in __call__
    |     with collapse_excgroups():
    |   File "/usr/local/lib/python3.12/contextlib.py", line 158, in __exit__
    |     self.gen.throw(value)
    |   File "/.cache/venv/lib/python3.12/site-packages/starlette/_utils.py", line 93, in collapse_excgroups
    |     raise exc
    |   File "/.cache/venv/lib/python3.12/site-packages/starlette/middleware/base.py", line 191, in __call__
    |     response = await self.dispatch_func(request, call_next)
    |                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    |   File "/mlai/src/middleware/authentication.py", line 41, in authentication
    |     response = await call_next(request)
    |                ^^^^^^^^^^^^^^^^^^^^^^^^
    |   File "/.cache/venv/lib/python3.12/site-packages/starlette/middleware/base.py", line 165, in call_next
    |     raise app_exc
    |   File "/.cache/venv/lib/python3.12/site-packages/starlette/middleware/base.py", line 151, in coro
    |     await self.app(scope, receive_or_disconnect, send_no_error)
    |   File "/.cache/venv/lib/python3.12/site-packages/starlette/middleware/exceptions.py", line 65, in __call__
    |     await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send)
    |   File "/.cache/venv/lib/python3.12/site-packages/starlette/_exception_handler.py", line 64, in wrapped_app
    |     raise exc
    |   File "/.cache/venv/lib/python3.12/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
    |     await app(scope, receive, sender)
    |   File "/.cache/venv/lib/python3.12/site-packages/starlette/routing.py", line 756, in __call__
    |     await self.middleware_stack(scope, receive, send)
    |   File "/.cache/venv/lib/python3.12/site-packages/starlette/routing.py", line 776, in app
    |     await route.handle(scope, receive, send)
    |   File "/.cache/venv/lib/python3.12/site-packages/starlette/routing.py", line 297, in handle
    |     await self.app(scope, receive, send)
    |   File "/.cache/venv/lib/python3.12/site-packages/starlette/routing.py", line 77, in app
    |     await wrap_app_handling_exceptions(app, request)(scope, receive, send)
    |   File "/.cache/venv/lib/python3.12/site-packages/starlette/_exception_handler.py", line 64, in wrapped_app
    |     raise exc
    |   File "/.cache/venv/lib/python3.12/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
    |     await app(scope, receive, sender)
    |   File "/.cache/venv/lib/python3.12/site-packages/starlette/routing.py", line 72, in app
    |     response = await func(request)
    |                ^^^^^^^^^^^^^^^^^^^
    |   File "/.cache/venv/lib/python3.12/site-packages/fastapi/routing.py", line 278, in app
    |     raw_response = await run_endpoint_function(
    |                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    |   File "/.cache/venv/lib/python3.12/site-packages/fastapi/routing.py", line 191, in run_endpoint_function
    |     return await dependant.call(**values)
    |            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    |   File "/mlai/main.py", line 103, in decide_and_act
    |     return router.route(body)
    |            ^^^^^^^^^^^^^^^^^^
    |   File "/mlai/src/router.py", line 102, in route
    |     agent_response = self.agent_executor.invoke({"input": json.dumps(input_for_agent)})
    |                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    |   File "/.cache/venv/lib/python3.12/site-packages/ddtrace/contrib/trace_utils.py", line 334, in wrapper
    |     return func(mod, pin, wrapped, instance, args, kwargs)
    |            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    |   File "/.cache/venv/lib/python3.12/site-packages/ddtrace/contrib/langchain/patch.py", line 612, in traced_chain_call
    |     final_outputs = func(*args, **kwargs)
    |                     ^^^^^^^^^^^^^^^^^^^^^
    |   File "/.cache/venv/lib/python3.12/site-packages/langchain/chains/base.py", line 163, in invoke
    |     raise e
    |   File "/.cache/venv/lib/python3.12/site-packages/langchain/chains/base.py", line 153, in invoke
    |     self._call(inputs, run_manager=run_manager)
    |   File "/.cache/venv/lib/python3.12/site-packages/langchain/agents/agent.py", line 1432, in _call
    |     next_step_output = self._take_next_step(
    |                        ^^^^^^^^^^^^^^^^^^^^^
    |   File "/.cache/venv/lib/python3.12/site-packages/langchain/agents/agent.py", line 1138, in _take_next_step
    |     [
    |   File "/.cache/venv/lib/python3.12/site-packages/langchain/agents/agent.py", line 1166, in _iter_next_step
    |     output = self.agent.plan(
    |              ^^^^^^^^^^^^^^^^
    |   File "/.cache/venv/lib/python3.12/site-packages/langchain/agents/agent.py", line 514, in plan
    |     for chunk in self.runnable.stream(inputs, config={"callbacks": callbacks}):
    |   File "/.cache/venv/lib/python3.12/site-packages/langchain_core/runnables/base.py", line 2875, in stream
    |     yield from self.transform(iter([input]), config, **kwargs)
    |   File "/.cache/venv/lib/python3.12/site-packages/langchain_core/runnables/base.py", line 2862, in transform
    |     yield from self._transform_stream_with_config(
    |   File "/.cache/venv/lib/python3.12/site-packages/langchain_core/runnables/base.py", line 1881, in _transform_stream_with_config
    |     chunk: Output = context.run(next, iterator)  # type: ignore
    |                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^
    |   File "/.cache/venv/lib/python3.12/site-packages/langchain_core/runnables/base.py", line 2826, in _transform
    |     for output in final_pipeline:
    |   File "/.cache/venv/lib/python3.12/site-packages/langchain_core/runnables/base.py", line 1282, in transform
    |     for ichunk in input:
    |   File "/.cache/venv/lib/python3.12/site-packages/langchain_core/runnables/base.py", line 4736, in transform
    |     yield from self.bound.transform(
    |   File "/.cache/venv/lib/python3.12/site-packages/langchain_core/runnables/base.py", line 1300, in transform
    |     yield from self.stream(final, config, **kwargs)
    |   File "/.cache/venv/lib/python3.12/site-packages/langchain_core/language_models/chat_models.py", line 249, in stream
    |     raise e
    |   File "/.cache/venv/lib/python3.12/site-packages/langchain_core/language_models/chat_models.py", line 229, in stream
    |     for chunk in self._stream(messages, stop=stop, **kwargs):
    |   File "/.cache/venv/lib/python3.12/site-packages/langchain_openai/chat_models/base.py", line 480, in _stream
    |     with self.client.create(messages=message_dicts, **params) as response:
    | TypeError: 'generator' object does not support the context manager protocol
    +------------------------------------


mintuhouse commented on August 24, 2024

Tagging @Yun-Kim for visibility, since I see you in most OpenAI-related commits.

Please re-tag if you know who to ask.


dariatsenter commented on August 24, 2024

Confirming this is happening for:

langchain = "^0.1.16"
ddtrace = "^2.3.1"


mattflo commented on August 24, 2024

Thanks for getting a fix for this @Yun-Kim!

Just to point out: I ran into this with langchain, but it reproduces with just openai and ddtrace:

ddtrace                                 2.8.5          Datadog APM client...
langchain-openai                        0.1.2          An integration pac...
openai                                  1.30.1         The official Pytho...

The following test illustrates the issue using openai without langchain:

import openai
import pytest

@pytest.mark.asyncio
async def test_openai_client():
    client = openai.AsyncOpenAI().chat.completions

    # this exhibits the issue only when run under ddtrace
    # TypeError: 'async_generator' object does not support the asynchronous context manager protocol
    response = await client.create(
        messages=[
            {
                "role": "user",
                "content": "what is the meaning of life?",
            }
        ],
        model="gpt-3.5-turbo-0125",
        stream=True,
    )

    # type of response:
    # without ddtrace: <class 'openai.AsyncStream'>
    # with ddtrace: <class 'async_generator'>
    print(type(response))

    async with response as resp:
        async for chunk in resp:
            print(chunk)

Failure output:

>       async with response as resp:
E       TypeError: 'async_generator' object does not support the asynchronous context manager protocol


Yun-Kim commented on August 24, 2024

Yes, thanks for clarifying @mattflo. This is a bug in our OpenAI integration, which didn't account for the fact that OpenAI made Stream/AsyncStream objects usable as context managers starting in openai>=1.6.0. We will get this fix released for 2.8 and 2.9 in the coming weeks; in the meantime, I would recommend pinning your version of openai to a previous version.


amanrocks11 commented on August 24, 2024

Hi, is this fixed? If I pin openai<1.6.0, it is not compatible with langchain-openai (0.1.1).

