Comments (19)
Hello @TIDIALLO, can you please provide all of your repro steps so we can understand when you get that error?
from chat-with-your-data-solution-accelerator.
Hi, I'm also hitting a 500 when calling /api/conversation/custom, with {"error":"Invalid connection string"}.
The environment variables and key look good.
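When the API returns a 500 with a JSON body like the one above, it helps to surface the server's error message directly instead of just the status code. A minimal sketch (the helper name is illustrative and not part of the accelerator; the JSON error shape is taken from the response quoted above):

```python
import json

def extract_api_error(status_code: int, body: str):
    """Return the server's error message from a failed API response, if any.

    The /api/conversation/custom endpoint returns a JSON body like
    {"error": "Invalid connection string"} on failure.
    """
    if status_code == 200:
        return None
    try:
        return json.loads(body).get("error", body)
    except ValueError:
        # Body was not JSON; return it verbatim so nothing is lost.
        return body

print(extract_api_error(500, '{"error": "Invalid connection string"}'))
# prints: Invalid connection string
```

Logging this message alongside the status code usually points straight at the misconfigured setting.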
Hi @TIDIALLO, I have done the whole setup in Azure, but I did not use the Bicep templates; instead I wrote all the necessary components in Terraform.
Hi @gmndrg, I resumed the implementation on the Azure side, following all the steps, and everything went well up to deployment. The problem is that when I start testing the chat, it generates the error; it does not even answer a greeting (for example, "hi").
Have you uploaded any files so you can start asking questions about your content? What kind of files have you uploaded to the index via the admin portal?
Yes, I loaded the ones from the doc folder into a container in my storage account via the admin interface (which I found cool, with Streamlit).
Unfortunately, this looks related to one (or some) of the files you uploaded. Unless you share your documents so we can check what the issue may be (which is not recommended unless they are public), there is not much guidance we can provide.
@krohm, one of your environment variables for a connection string must be incorrect, since that is a product error, not a repo error. Please double-check. Also check whether you have any firewalls or other configuration outside the repo config that is preventing the keys from being validated.
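One quick sanity check for a suspect value is whether it even parses as an Azure-style connection string (semicolon-delimited `Key=Value` pairs). A small sketch, not part of the accelerator, with a placeholder Application Insights value:

```python
def parse_connection_string(value: str) -> dict:
    """Split an Azure-style connection string ("Key=Value;Key=Value;...")
    into a dict, raising ValueError if any segment is malformed."""
    pairs = {}
    for segment in value.strip().rstrip(";").split(";"):
        key, sep, val = segment.partition("=")
        if not sep or not key or not val:
            raise ValueError(f"Malformed segment: {segment!r}")
        pairs[key.strip()] = val.strip()
    return pairs

# Placeholder value: an Application Insights connection string should
# yield at least an InstrumentationKey entry.
cs = ("InstrumentationKey=00000000-0000-0000-0000-000000000000;"
      "IngestionEndpoint=https://example.applicationinsights.azure.com/")
print(parse_connection_string(cs)["InstrumentationKey"])
# prints: 00000000-0000-0000-0000-000000000000
```

If this raises, the value was likely truncated or pasted with stray characters; if it parses but the service still rejects it, the key itself or a firewall rule is the more likely culprit.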
Hi, it was the APPINSIGHTS_CONNECTION_STRING setting that was missing ... :-(
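A missing app setting like this one is easy to catch with a fail-fast check at startup. A minimal sketch (the required-settings list here contains only the variable named in this thread; a real deployment needs more):

```python
import os

REQUIRED_SETTINGS = [
    # Name taken from this thread; extend with whatever your deployment needs.
    "APPINSIGHTS_CONNECTION_STRING",
]

def missing_settings(env, required=REQUIRED_SETTINGS) -> list:
    """Return the names of required settings that are absent or empty."""
    return [name for name in required if not env.get(name)]

gaps = missing_settings(os.environ)
if gaps:
    print("Missing app settings:", ", ".join(gaps))
```

Running this during app startup (and logging the result) turns a vague 500 into an explicit list of what to fix.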
Hi @krohm, did you do the implementation locally or directly in Azure?
Thanks in advance.
This is the error that I get:
As mentioned above, without all the repro steps we unfortunately cannot reproduce this, since it looks like an error specific to your deployment. We can only recommend redeploying the whole solution from scratch, uploading a doc, and trying to repro; once you hit the error, you may be able to identify whether it is related to an unsupported doc or something similar.
Okay,
I'll start the solution again and see if it all goes well; it's weird.
Thank you.
@gmndrg, is there a possibility of having an inspection of the code on the Azure side?
Hello @TIDIALLO, custom code review is unfortunately not part of the support. We try to help with documentation and general guidance accordingly.
This is the error that I get when deploying locally:
Please review this post to see if the suggestions help: https://learn.microsoft.com/en-us/answers/questions/1401622/azure-openai-issue-with-endpoint-connection-errno. If you still face issues after checking DNS, and you're not using a proxy, please open a support case with the Azure OpenAI team for assistance: https://learn.microsoft.com/en-us/azure/azure-portal/supportability/how-to-create-azure-support-request. Thanks.
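The DNS check suggested in that post can be done without any Azure SDK, just the standard library resolver. A small sketch; `my-resource.openai.azure.com` is a placeholder for your actual Azure OpenAI endpoint host:

```python
import socket

def resolves(hostname: str) -> bool:
    """Return True if the host resolves via the system resolver."""
    try:
        socket.getaddrinfo(hostname, 443)
        return True
    except socket.gaierror:
        return False

# Placeholder: substitute your own Azure OpenAI endpoint host.
host = "my-resource.openai.azure.com"
print(host, "resolves" if resolves(host) else "does NOT resolve")
```

If the host does not resolve from the machine running the app (but does from elsewhere), the issue is local DNS or a proxy, not the accelerator.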
Hi team, thank you for building this.
After deploying I get:
```
Traceback (most recent call last):
  File "/usr/local/src/myscripts/admin/pages/02_Explore_Data.py", line 38, in <module>
    search_client = vector_store_helper.get_vector_store().client
  File "/usr/local/src/myscripts/admin/../utilities/helpers/AzureSearchHelper.py", line 33, in get_vector_store
    vector_search_dimensions=len(llm_helper.get_embedding_model().embed_query("Text")),
  File "/usr/local/lib/python3.9/site-packages/langchain/embeddings/openai.py", line 516, in embed_query
    return self.embed_documents([text])[0]
  File "/usr/local/lib/python3.9/site-packages/langchain/embeddings/openai.py", line 488, in embed_documents
    return self._get_len_safe_embeddings(texts, engine=self.deployment)
  File "/usr/local/lib/python3.9/site-packages/langchain/embeddings/openai.py", line 374, in _get_len_safe_embeddings
    response = embed_with_retry(
  File "/usr/local/lib/python3.9/site-packages/langchain/embeddings/openai.py", line 107, in embed_with_retry
    return _embed_with_retry(**kwargs)
  File "/usr/local/lib/python3.9/site-packages/tenacity/__init__.py", line 289, in wrapped_f
    return self(f, *args, **kw)
  File "/usr/local/lib/python3.9/site-packages/tenacity/__init__.py", line 379, in __call__
    do = self.iter(retry_state=retry_state)
  File "/usr/local/lib/python3.9/site-packages/tenacity/__init__.py", line 314, in iter
    return fut.result()
  File "/usr/local/lib/python3.9/concurrent/futures/_base.py", line 439, in result
    return self.__get_result()
  File "/usr/local/lib/python3.9/concurrent/futures/_base.py", line 391, in __get_result
    raise self._exception
  File "/usr/local/lib/python3.9/site-packages/tenacity/__init__.py", line 382, in __call__
    result = fn(*args, **kwargs)
  File "/usr/local/lib/python3.9/site-packages/langchain/embeddings/openai.py", line 104, in _embed_with_retry
    response = embeddings.client.create(**kwargs)
  File "/usr/local/lib/python3.9/site-packages/openai/api_resources/embedding.py", line 33, in create
    response = super().create(*args, **kwargs)
  File "/usr/local/lib/python3.9/site-packages/openai/api_resources/abstract/engine_api_resource.py", line 153, in create
    response, _, api_key = requestor.request(
  File "/usr/local/lib/python3.9/site-packages/openai/api_requestor.py", line 298, in request
    resp, got_stream = self._interpret_response(result, stream)
  File "/usr/local/lib/python3.9/site-packages/openai/api_requestor.py", line 700, in _interpret_response
    self._interpret_response_line(
  File "/usr/local/lib/python3.9/site-packages/openai/api_requestor.py", line 763, in _interpret_response_line
    raise self.handle_error_response(
openai.error.InvalidRequestError: The API deployment for this resource does not exist. If you created the deployment within the last 5 minutes, please wait a moment and try again.
```
I seem to face similar issues when using the playground and the "chat with your own data" capabilities there.
My Azure OpenAI resource name (not the endpoint, right?) seems to be OK, and so does the key.
Any suggestions, or is something broken because the Azure AI Search API has changed?
I can see that no indexer has been created.
I used this super simple PDF:
https://dagrs.berkeley.edu/sites/default/files/2020-01/sample.pdf
kind regards,
Daniel
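The `InvalidRequestError` at the bottom of that traceback means the *deployment name* (not the model name) configured for embeddings does not exist on the resource. One way to check is to build the REST URL the SDK is effectively calling and hit it with `curl`; a 404/DeploymentNotFound confirms a naming mismatch. A sketch with placeholder names (the resource and deployment below are illustrative, and `2023-05-15` is one of the stable Azure OpenAI API versions):

```python
def embeddings_url(resource: str, deployment: str,
                   api_version: str = "2023-05-15") -> str:
    """Build the Azure OpenAI embeddings REST endpoint for a deployment.

    A 404 / DeploymentNotFound response from this URL means the deployment
    name does not exist on the resource, matching the traceback above.
    """
    return (f"https://{resource}.openai.azure.com/openai/deployments/"
            f"{deployment}/embeddings?api-version={api_version}")

# Placeholder names; substitute your resource and embedding deployment.
print(embeddings_url("my-openai-resource", "text-embedding-ada-002"))
```

POST to the printed URL with an `api-key` header and a `{"input": "test"}` body: a 200 means the deployment name is right, and the problem lies elsewhere.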
@daniheck-msft If you're facing the same issues with chat with your data as well, it might be related to your Azure OpenAI deployment. Please open a support ticket in the portal to engage the Azure OpenAI team: https://learn.microsoft.com/en-us/azure/azure-portal/supportability/how-to-create-azure-support-request