azure-samples / contoso-chat

This sample shows the full end-to-end process of creating a RAG application with Prompt Flow and Azure AI Studio. It includes GPT-3.5 Turbo LLM application code, evaluations, deployment automation with the Azure Developer CLI (azd), GitHub Actions for evaluation and deployment, and intent mapping for routing multiple LLM tasks.

License: MIT License

Python 2.36% Jupyter Notebook 48.88% Dockerfile 0.35% Bicep 45.71% Shell 2.09% PowerShell 0.61%

contoso-chat's Introduction

name: Contoso Chat Retail with Azure AI Studio and Promptflow
description: A retail copilot that answers customer queries with responses grounded in retailer's product and customer data.
languages: python, bicep, azdeveloper, prompty
products: azure-openai, azure-cognitive-search, azure, azure-cosmos-db
page_type: sample
urlFragment: contoso-chat

Contoso Chat Retail with Azure AI Studio and Promptflow

This sample creates a customer support chat agent for an online retailer called Contoso Outdoors. The solution uses a retrieval-augmented generation pattern to ground responses in the company's product and customer data. Customers can ask questions about the retailer's product catalog, and also get recommendations based on their prior purchases.

Open in GitHub Codespaces Open in Dev Containers


About This Sample

In this sample we build, evaluate and deploy a customer support chat AI for Contoso Outdoors, a fictitious retailer who sells hiking and camping equipment. The implementation uses a Retrieval Augmented Generation (RAG) architecture to implement a retail copilot solution that responds to customer queries with answers grounded in the company's product catalog and customer purchase history.

The sample uses Azure AI Search to create and manage search indexes for product catalog data, Azure Cosmos DB to store and manage customer purchase history data, and Azure OpenAI to deploy and manage the core models required for our RAG-based architecture.
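
For orientation, the sketch below shows (in heavily simplified form) how those three services could fit together at query time. It is not the sample's actual implementation - that lives in the flow code - and it uses key-based clients and placeholder environment variables for brevity, even though the deployed template uses Managed Identity (see Security below); the index name, vector field, and model deployment names are assumptions.

import os

from azure.core.credentials import AzureKeyCredential
from azure.cosmos import CosmosClient
from azure.search.documents import SearchClient
from azure.search.documents.models import VectorizedQuery
from openai import AzureOpenAI

# Clients for the three services used by the RAG architecture (env var names are illustrative).
aoai = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-02-01",  # placeholder API version
)
search = SearchClient(
    endpoint=os.environ["AZURE_SEARCH_ENDPOINT"],
    index_name="contoso-products",  # assumed index name
    credential=AzureKeyCredential(os.environ["AZURE_SEARCH_KEY"]),
)
cosmos = CosmosClient(os.environ["COSMOS_ENDPOINT"], credential=os.environ["COSMOS_KEY"])


def answer(question: str, customer_id: str) -> str:
    # 1. Embed the question and retrieve matching products from Azure AI Search.
    embedding = aoai.embeddings.create(
        model="text-embedding-ada-002", input=question
    ).data[0].embedding
    hits = search.search(
        search_text=question,
        vector_queries=[VectorizedQuery(vector=embedding, fields="contentVector", k_nearest_neighbors=3)],
    )
    product_context = "\n".join(hit["content"] for hit in hits)  # assumed document field name

    # 2. Look up the customer's purchase history in Azure Cosmos DB.
    container = cosmos.get_database_client("contoso-outdoor").get_container_client("customers")
    customer = container.read_item(item=customer_id, partition_key=customer_id)

    # 3. Ask the chat model for an answer grounded in the retrieved context.
    chat = aoai.chat.completions.create(
        model="gpt-35-turbo",  # chat model deployment name
        messages=[
            {"role": "system", "content": f"Answer using only this context.\nProducts:\n{product_context}\nCustomer:\n{customer}"},
            {"role": "user", "content": question},
        ],
    )
    return chat.choices[0].message.content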

By exploring and deploying this sample, you will learn to build, evaluate, and deploy a RAG-based copilot application code-first on the Azure AI platform.

Table of Contents

  1. Features
  2. Getting Started
  3. Azure Deployment
  4. Local Development
  5. Guidance
  6. Troubleshooting
  7. Resources
  8. Contributing
  9. Trademarks

Features

The project comes with:

  • Sample model configurations, chat and evaluation prompts for a RAG-based copilot app.
  • Prompty assets to simplify prompt creation & iteration for this copilot scenario.
  • Sample product and customer data for the retail copilot scenario.
  • Sample application code for copilot chat and evaluation workflows.
  • Sample azd-template configuration for managing the application on Azure.
  • Managed Identity configuration as a best practice for managing sensitive credentials.

This is also a signature sample for demonstrating new capabilities in the Azure AI platform. Expect regular updates to showcase cutting-edge features and best practices for generative AI development.

Architecture Diagram

The Contoso Chat application implements a retrieval augmented generation pattern to ground the model responses in your data. The architecture diagram below illustrates the key components and services used for implementation and highlights the use of Azure Managed Identity to reduce developer complexity in managing sensitive credentials.

Architecture Diagram

Demo Video

🌟 | Watch for a video update showing how easy it is to go from code to cloud using this template and the Azure Developer CLI for deploying your copilot application.

Versions

This has been the signature sample used to showcase end-to-end development of a copilot application code-first on the Azure AI platform. It has been actively used for training developer audiences and industry partners at key events including Microsoft AI Tour and Microsoft Build. Use the links below to reference specific versions of the sample corresponding to a related workshop or event session.

Version Description
v0 : #cc2e808 Microsoft AI Tour 2023-24 (dag-flow, jinja template) - Skillable Lab
v1 : msbuild-lab322 Microsoft Build 2024 (dag-flow, jinja template) - Skillable Lab
v2 : main Latest version (flex-flow, prompty asset) - Azure AI Template

Getting Started

Pre-Requisites

You will need an active Azure subscription with access to the Azure OpenAI service, a GitHub account (for the Codespaces option), and the developer tools covered by the setup options below.

Setup Environment

You have three options for getting started with this template:

  • GitHub Codespaces - Cloud-hosted dev container (pre-built environment)
  • VS Code Dev Containers - Locally-hosted dev container (pre-built environment)
  • Manual Setup - Local environment setup (for advanced users)

We recommend using GitHub Codespaces for the fastest start with least effort. However, we have provided instructions for all three options below.

1. GitHub Codespaces

  1. Click the button to launch this repository in GitHub Codespaces.

    Open in GitHub Codespaces

  2. This opens a new browser tab; setup takes a few minutes to complete. Once ready, you should see a Visual Studio Code editor in your browser tab with a terminal open.

  3. Sign into your Azure account from the VS Code terminal

    azd auth login --use-device-code

2. VS Code Dev Containers

This option opens the project in your local VS Code using the Dev Containers extension. It is a useful alternative if your GitHub Codespaces quota is low or you need to work offline.

  1. Start Docker Desktop (install it if not already installed)

  2. Open the project by clicking the button below:

    Open in Dev Containers

  3. Once ready, the tab will refresh to show a Visual Studio Code editor with a terminal open.

  4. Sign into your Azure account from the VS Code terminal

    azd auth login

3. Manual Setup (Local)

  • Verify you have Python 3 installed on your machine.

  • Install dependencies with pip install -r requirements.txt

  • Install Azure Developer CLI

    • Windows: winget install microsoft.azd
    • Linux: curl -fsSL https://aka.ms/install-azd.sh | bash
    • macOS: brew tap azure/azd && brew install azd
  • Sign into your Azure account from the VS Code terminal

    azd auth login

Azure Deployment

  1. Use the same terminal where you previously authenticated with Azure.

  2. Provision and deploy your application to Azure. You will need to specify a valid subscription, deployment location, and environment name.

    azd up
  3. This step will take some time to complete.

    • Visit the Azure Portal to monitor progress.
    • Look for a new resource group matching the environment name
    • Click Deployments to track the status of the provisioning process
  4. Once provisioning completes, monitor progress for app deployment.

    • Visit the Azure AI Studio
    • Look for an AI Project associated with the above resource group
    • Click Deployments to track the status of the application deployment
  5. Once deployment completes, test the deployed endpoint from Azure AI Studio

    • Click the newly-created chat-deployment-xx endpoint listed
    • In the details page, click the Test tab for a built-in testing sandbox
    • In the Input box, enter a new query in this format and submit it:
      {"question": "Tell me about hiking shoes", "customerId": "2", "chat_history": []}
      
    • If successful, the response will be printed in the area below this prompt.

You can find your deployed retail copilot's Endpoint and Primary Key information on the deployment details page in the last step. Use them to configure your preferred front-end application (e.g., web app) to support a customer support chat UI capability that interacts with the deployed copilot in real time.
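
If you prefer to test from code rather than the Studio sandbox, the sketch below calls the endpoint with the Python requests library using the same payload format. The scoring URL shape and Bearer-key header follow the usual conventions for Azure ML managed online endpoints; replace the placeholders with the Endpoint and Primary Key values from the deployment details page.

import requests

# Placeholders: copy these values from the deployment details page in Azure AI Studio.
endpoint_url = "https://<your-endpoint>.<region>.inference.ml.azure.com/score"
primary_key = "<your-primary-key>"

payload = {
    "question": "Tell me about hiking shoes",
    "customerId": "2",
    "chat_history": [],
}

response = requests.post(
    endpoint_url,
    json=payload,
    headers={
        "Authorization": f"Bearer {primary_key}",
        "Content-Type": "application/json",
    },
)
response.raise_for_status()
print(response.json())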

Local Development

Exploring the Prompty Asset

This sample contains an example chat.prompty asset that you can explore to understand this new capability. The file has the following components:

  1. A frontmatter section that defines the following attributes:
    • name of the application
    • description of the application functionality
    • authors of the application (one per line)
    • model description (with these parameters)
      • api type of endpoint (can be chat or completion)
      • configuration parameters including
        • type of connection (azure_openai or openai)
        • environment variables (e.g., azure_deployment for chat model)
      • parameters (max_tokens, temperature, response_format)
    • inputs - each with type and optional default value
    • outputs - specifying a type (e.g., string)
    • sample - an example of the inputs (e.g., for testing)
  2. A system context (defining the agent persona and behavior)
    • #Safety section enforcing responsible AI requirements
    • #Documentation section with template for filling product documentation
    • #Previous Orders section with template for filling relevant history
    • #Customer Context section with template for filling customer details
    • question section to embed user query
    • Instructions section to reference related product recommendations

This specific prompty takes 3 inputs: a customer object, a documentation object (that could be chat history) and a question string that represents the user query. You can now load, execute, and trace individual prompty assets for a more granular prompt engineering solution.
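
As an illustrative sketch (assuming the promptflow package with prompty support is installed, and that the asset path matches your checkout), you could load and execute the asset directly from Python; the input names below mirror the three inputs described above, and the example values are purely illustrative.

from promptflow.core import Prompty

# Load the prompty asset (adjust the path to match the repository layout).
chat_prompty = Prompty.load(source="contoso_chat/chat.prompty")

# Execute it with the three inputs described above (values are illustrative only).
result = chat_prompty(
    question="Tell me about hiking shoes",
    customer={"firstName": "Jane", "orders": []},  # shape of the customer object is an assumption
    documentation={},                              # e.g. retrieved docs or chat history
)
print(result)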

Testing the Application Flow

This sample uses a flex-flow feature that lets you "create LLM apps using a Python class or function as the entry point" - making it easier to test and run them using a code-first experience.

  • This sample implements a function-based flow
  • The entry point is the get_response function in chat_request.py

You can now test the flow in different ways:

  • Run it directly, like any Python script
  • Convert it to a flow, then use pf flow test --flow ...
  • Start a UI to chat with the flow using pf flow test --flow ... --ui
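
For the first option, a minimal sketch of calling the entry point directly is shown below; the parameter names mirror the endpoint payload shown earlier, so check the actual signature of get_response in chat_request.py before relying on them.

# Run from the folder containing chat_request.py (e.g. the flow directory).
from chat_request import get_response

response = get_response(
    customerId="2",
    question="Tell me about hiking shoes",
    chat_history=[],
)
print(response)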

🌟 | Watch this space for more testing guidance.

Guidance

Region Availability

This template uses gpt-35-turbo for chat completion, gpt-4 for chat evaluation and text-embedding-ada-002 for vectorization. These models may not be available in all Azure regions. Check for up-to-date region availability and select a region accordingly.

This template uses the Semantic Ranker feature of Azure AI Search which may be available only in certain regions. Check for up-to-date region availability and select a region accordingly.

  • We recommend using sweden-central for the OpenAI Models
  • We recommend using eastus for the Azure AI Search Resource

Note

The default azd deployment uses a single location for all resources within the application's resource group. We set the default Azure AI Search location to eastus (in the infra/ configuration), so you can use the default location setting to optimize for model availability and capacity in your chosen region.

Costs

Pricing for services may vary by region and usage, so exact costs cannot be estimated in advance. You can estimate the cost of this project's architecture with the Azure pricing calculator, using these services:

  • Azure OpenAI - Standard tier, GPT-4, GPT-35-turbo and Ada models. See Pricing
  • Azure AI Search - Basic tier, Semantic Ranker enabled. See Pricing
  • Azure Cosmos DB for NoSQL - Serverless, Free Tier. See Pricing

Security

This template uses Managed Identity for authentication with key Azure services, including Azure OpenAI, Azure AI Search, and Azure Cosmos DB. Applications can use managed identities to obtain Microsoft Entra tokens without handling any credentials, which removes the burden of managing secrets and reduces complexity for developers.
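
As a rough sketch of what this looks like in application code (not necessarily the template's exact wiring), the snippet below constructs clients for the three services with DefaultAzureCredential instead of keys; the endpoint values and index name are placeholders.

from azure.identity import DefaultAzureCredential, get_bearer_token_provider
from azure.cosmos import CosmosClient
from azure.search.documents import SearchClient
from openai import AzureOpenAI

credential = DefaultAzureCredential()  # resolves to the managed identity when running in Azure

# Azure Cosmos DB: role-based (RBAC) data access, no account keys.
cosmos = CosmosClient("https://<cosmos-account>.documents.azure.com:443/", credential=credential)

# Azure AI Search: role-based access instead of an admin/query key.
search = SearchClient(
    endpoint="https://<search-service>.search.windows.net",
    index_name="contoso-products",  # assumed index name
    credential=credential,
)

# Azure OpenAI: exchange the identity for a Microsoft Entra token on each request.
token_provider = get_bearer_token_provider(credential, "https://cognitiveservices.azure.com/.default")
aoai = AzureOpenAI(
    azure_endpoint="https://<openai-resource>.openai.azure.com",
    azure_ad_token_provider=token_provider,
    api_version="2024-02-01",  # placeholder API version
)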

Additionally, we have added a GitHub Action that scans the infrastructure-as-code files and generates a report of any detected issues. To follow best practices, we recommend that anyone creating solutions based on our templates enable the GitHub secret scanning setting in their repo.

Resources

Troubleshooting

Have questions or issues to report? Please open a new issue after first verifying that the same question or issue has not already been reported. If it has, please add any additional comments you may have to the existing issue.

Contributing

This project welcomes contributions and suggestions. Most contributions require you to agree to a Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us the rights to use your contribution. For details, visit https://cla.opensource.microsoft.com.

When you submit a pull request, a CLA bot will automatically determine whether you need to provide a CLA and decorate the PR appropriately (e.g., status check, comment). Simply follow the instructions provided by the bot. You will only need to do this once across all repos using our CLA.

This project has adopted the Microsoft Open Source Code of Conduct. For more information see the Code of Conduct FAQ or contact [email protected] with any additional questions or comments.

Trademarks

This project may contain trademarks or logos for projects, products, or services. Authorized use of Microsoft trademarks or logos is subject to and must follow Microsoft's Trademark & Brand Guidelines. Use of Microsoft trademarks or logos in modified versions of this project must not cause confusion or imply Microsoft sponsorship. Any use of third-party trademarks or logos is subject to those third parties' policies.

contoso-chat's People

Contributors

amynic, anderl80, cassiebreviu, jongio, microsoft-github-operations[bot], microsoft-github-policy-service[bot], microsoftopensource, nitya, pamelafox, revodavid, sethjuarez, tonybaloney


contoso-chat's Issues

The README is missing the step to create a custom connection for Cosmos DB

Required to successfully run evaluate-chat-prompt-flow.ipynb

Creating Custom Connection (contoso-cosmos)

  1. Visit https://ml.azure.com
  2. Under Recent Workspaces, click the project (contoso-chat-aiproj)
  3. Select Prompt flow (sidebar), then Connections (tab)
  4. Click Create and select Custom from the dropdown
  5. Name: contoso-cosmos
  6. Provider: Custom (default)
  7. Key-value pairs: add 4 entries (get the values from the corresponding env vars in .env)
    • key: key, value: "COSMOS_KEY", check "is secret"
    • key: endpoint, value: "COSMOS_ENDPOINT"
    • key: containerId, value: customers
    • key: databaseId, value: contoso-outdoor
  8. Click Save to complete the step.
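
If you prefer to script the local equivalent of this connection instead of clicking through the portal, a minimal sketch with the promptflow SDK is shown below. It creates a local pf connection only (the portal steps above are still what the cloud workspace needs), and the SDK calls are an assumption based on the promptflow client API rather than anything this issue prescribes.

import os

from promptflow.client import PFClient
from promptflow.entities import CustomConnection

# Mirror the key-value pairs listed above, pulling the values from .env / the environment.
connection = CustomConnection(
    name="contoso-cosmos",
    secrets={"key": os.environ["COSMOS_KEY"]},  # marked "is secret" in the portal
    configs={
        "endpoint": os.environ["COSMOS_ENDPOINT"],
        "containerId": "customers",
        "databaseId": "contoso-outdoor",
    },
)

pf = PFClient()
pf.connections.create_or_update(connection)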

Error `Environment '{}' is not registered` when deploying Prompt Flow in Azure AI Studio and ML Studio

I tried deploying to 1 instance and to 3 and got the same error every time. I also tried selecting different instance types, and tried deploying in ML Studio instead of AI Studio.

Environment promptflow_a80*************************ab is not registered

There's a link in the log pointing here.

But I haven't figured out how to "register" the environment.
I can see the environment it refers to in ML Studio -> contoso-chat-aiproj -> Environments. It says "build failed". If I click the "Rebuild" button there, it rebuilds without error. But when I try to deploy the Prompt Flow, I always get the same error.

Here's the error message from ml.azure.com (I redacted part of my environment name in the two spots it's shown):

Execution failed. User process 'python' exited with status code 1. Please check log file 'user_logs/std_log.txt' for error details. Error: Traceback (most recent call last):
  File "prepare.py", line 26, in <module>
    build_local_environment(args.name, args.version, args.platform)
  File "prepare.py", line 15, in build_local_environment
    environment.build_local(workspace, platform, useDocker=True, pushImageToWorkspaceAcr=True)
  File "/azureml-envs/image-build/lib/python3.8/site-packages/azureml/core/environment.py", line 1549, in build_local
    raise UserErrorException("Environment '{}' is not registered".format(self.name))
azureml.exceptions._azureml_exception.UserErrorException: UserErrorException:
	Message: Environment 'promptflow_a80*************************ab' is not registered
	InnerException None
	ErrorResponse 
{
    "error": {
        "code": "UserError",
        "message": "Environment 'promptflow_a80*************************ab' is not registered"
    }
}

Also these two warnings (redacted by Azure, not me):

AzureMLCompute job failed
ExecutionFailed: [REDACTED]
	exit_codes: 1
	Appinsights Reachable: Some(true)

Serverless job failed.

Here's what I see in the ML Studio notifications: (screenshot)

Also, am I correct in assuming that when I deploy the Prompt Flow I do not have to use 3 instances? I've tried with 1 and with 3, and I get the same error with the exact same "not registered" environment in both cases.

Thank you!

Test Template

  • Testing works following the Readme, Codespaces is the priority
  • Make sure azd works in Codespaces (flag to fix if not)
  • There is a verification step in the README after the resources are deployed, with a description of what the customer needs to do next
  • Ensure managed identity is implemented and working
  • If local doesn't work, we need a note in the README (make this explicit)

Nice to have (but document if not working)

  • runs locally

Must have approval from at least two testers.

$RANDOM does not exist in `sh`

I tried setting up the project using azd up and I noticed that the endpointName var is not set correctly. This is because azd hooks only run in sh or pwsh and $RANDOM does not exist in sh.

Code reference

endpointName="contoso-chat-$RANDOM"

As a workaround, I've added my own random_number generator which does something similar.

random_number=$(od -An -N2 -i /dev/urandom | awk '{print $1}')

Acceptance criteria checklist (DoD)

The following checklist must be complete before a template is published.

Repository Management

Source code structure and conventions

  • GitHub Actions (This refers to .github/workflows/azure-dev.yml or custom workflow to run on a GitHub runner) is in place
  • DevContainer (/.devcontainer folder where applicable) configuration is in place
  • Infrastructure as code is in place (/infra folder where applicable, manifest files or code generators in the case of Aspire and similar)
  • Azure services configuration (/azure.yml file) is in place
  • Minimum coverage tests are in place

Functional requirements

  • azd up successfully provisions and deploys a functional app
  • GitHub Actions run tasks without errors
  • DevContainer has been tested locally and runs
  • Codespaces run [locally and in browser]
  • All tests pass

In the absence of e2e tests,

  • The application has been manually tested to work as per the requirement

Security requirements

When a selected service doesn't support Managed Identity, the corresponding issue must have been reported, and the security considerations section in the README should clearly explain the alternatives.

  • Azure Key Vault is a preferred alternative

The following items are not strictly enforced but may prevent the template from being added to the gallery.

Project code follows standard structure, per language. Please check one.

  • Yes, follows standards
  • No, doesn't follow standards

Code follows recommended style guide

  • Yes, follows style guide
  • No, doesn't follow style guide

Batch run fails

In step "7. Evaluating prompt flow results", the cell with the contents pf_azure_client.stream(base_run) fails with an error. The error message in the Notebook is:

(Run status is 'NotStarted', continue streaming...)
(Run status is 'NotStarted', continue streaming...)
(Run status is 'NotStarted', continue streaming...)
(Run status is 'NotStarted', continue streaming...)
(Run status is 'NotStarted', continue streaming...)
(Run status is 'NotStarted', continue streaming...)
(Run status is 'NotStarted', continue streaming...)
(Run status is 'NotStarted', continue streaming...)
The run '2024_02_02_235111chat_base_run' is in status 'NotStarted' for 5 minutes, streaming is stopped.Please make sure you are using the latest runtime.
======= Run Summary =======
Run name: "2024_02_02_235111chat_base_run"
Run status: "NotStarted"
Start time: "None"
Duration: "None"
Run url: [REDACTED]

Checking the Run URL in AI Studio, the following error appears in promptflow-automatic.log:

Error       Failed to provision Automatic runtime, id: 4bdd38a060a1f0887395eafff8895f61bbc62444bc318e0f, error code: ValidationError/ValidationError, message: [docker.io/sethjuarez/contoso-store:v20240124.1] is not a valid name for an ACR image with tag specified.

InvalidRunStatusError When Running "evaluate-chat-prompt-flow.ipynb" notebook

I get the following error: InvalidRunStatusError: The input for batch run is incorrect. Input from key 'run.outputs' is an empty list, which means we cannot generate a single line input for the flow run. Please rectify the input and try again.

when I run this cell pf_azure_client.stream(eval_run_variant)

Can anyone tell me what I am missing?


Some default options are $expensive$ for a low-throughput example program

  1. Default for Cosmos DB is Provisioned throughput. Serverless should be much cheaper for this sample program.
  2. Default for Prompt Flow deployment is 3 instances. Would 1 work? What about different instance types?

Question
If I create a new Cosmos DB using Serverless, can I swap it out with my current provisioned DB? Can I just create a new connection for it in Azure and update .env in the Codespace?
Or I could start over with the updated repo and edit the bicep file section for Cosmos DB, but that seems like a lot more work.

Fill in the 'About' section for SEO

To improve SEO, update the About text from "No description, website, or topics provided." to something more meaningful.

Nicely written sample, you might want to make it easier for people to find it.

No module named 'promptflow.azure'

When running the "evaluate-chat-prompt-flow.ipynb" notebook in codespaces I get the following error..
Has something changed in this cell? I forked the latest version of the repo...
image

Soft delete of Key Vault prevents redeployment

If you run the provision.sh script once, delete the created resource group, and then attempt to run it a second time you will get an error message like this:

[{"code":"ConflictError","message":"A vault with the same name already exists in deleted state. You need to either recover or purge existing key vault. Follow this link https://go.microsoft.com/fwlink/?linkid=2149745 for more information on soft delete."}]}}

To redeploy, you will first need to purge the deleted keyvault with a command like:

az keyvault purge --name kv-contosooipsaukqmp24o --location swedencentral

or search for "Key Vaults" in the Azure Portal and then click on "Manage Deleted Vaults" to purge it.

Provision script does not create Prompt Flow connections

Section 4.5 of README.md suggests that the prompt flow connections should have been created by ./provision.sh, but the output of pf connection list is empty.

The solution is as suggested, to open connections/create-connections.ipynb and run the notebook, which creates the required connections.

Notebook for testing History from an Endpoint deployment

I was wondering whether there is example code in one of the Jupyter notebooks in this repo (or another repo) showing how to test a prompt flow endpoint after deploying it - for example, a sample notebook that includes chat history with multiple follow-up questions.

Thanks in advance!

404 when 'Run All' local PromptFlow

I filled the .env file with the correct key and endpoint values from the Azure Portal.
But when I run the local Prompt Flow with 'Run All', it fails at the 'text embedding' step.
I think it is an OpenAI endpoint problem.

I'm in MS AI Tour Seoul 2024 now
Thank you, staff!

Models gpt-3.5-turbo, gpt-4, and text-embedding-ada-002 not available?

I saw them yesterday, I'm sure. But now I don't see them. I don't know if it's an issue with the Sweden Central region or something else.

EDIT: After getting no hint of the problem anywhere else, I decided, like a cornered wild animal, to frantically click on every button I could find on ai.azure.com. I came across the "Explore" tab, where I found the "Model Catalog". There, I searched for gpt-35-turbo and found this:

(screenshot: gpt-35-turbo found in the Model Catalog)

I shall now continue clicking haphazardly until I can figure out whether it's my subscription or the region that is preventing me from using this resource. Unless anyone has any tips. Thanks!

InsufficientQuota Error

Hi,

I haven't been able to provision the required resources for this project. I was approved to use OpenAI, but I wasn't able to increase the token quota even though I've submitted 2 requests to do so.

Below are the 2 requests I sent and the most recent error code I've received. The original error was for gpt-35-turbo, the current error is for GPT4.

@mroopram ➜ /workspaces/contoso-chat-mr (cc2e808) $ ./provision.sh
Running provisioning using this subscription:
{
  "name": "Azure subscription 1",
  "subscriptionId": "a36d451c-ca02-4072-b151-d92c51e3a9ed"
}
If that is not the correct subscription, please run 'az account set --subscription "<SUBSCRIPTION-NAME>"'
Creating resource group contchat-rg in swedencentral...
Provisioning resources in resource group contchat-rg...
ERROR: {"code": "InvalidTemplateDeployment", "message": "The template deployment 'contchat' is not valid according to the validation procedure. The tracking id is '70e2f569-cf28-4458-909e-7caaaaf19381'. See inner errors for details."}

Inner Errors: 
{"code": "InsufficientQuota", "message": "This operation require 10 new capacity in quota Tokens Per Minute (thousands) - GPT-4, which is bigger than the current available capacity 1. The current quota usage is 0 and the quota limit is 1 for quota Tokens Per Minute (thousands) - GPT-4."}
ERROR: Failed to provision resources. Please check the error message above.

(screenshots: QuotaRequest1, QuotaRequest2)

Deployment template validation failed: 'The provided value for the template parameter 'environmentName' is not valid. Length of the value should be greater than or equal to '1'

Fresh install, running ./provision.sh:

ERROR: {"code": "InvalidTemplate", "message": "Deployment template validation failed: 'The provided value for the template parameter 'environmentName' is not valid. Length of the value should be greater than or equal to '1'. Please see https://aka.ms/arm-syntax-parameters for usage details.'.", "additionalInfo": [{"type": "TemplateViolation", "info": {"lineNumber": 14, "linePosition": 20, "path": "properties.template.parameters.environmentName.minLength"}}]}
ERROR: Failed to provision resources. Please check the error message above.

I also had to change provision.sh to overcome Issue 79: #79

Azure Developer CLI Template Guidelines


These requirements are for samples that are using azd for deployment.

Code

Infra Folder

  • Bicep Format - Bicep files should be formatted with bicep format. Can possibly be automated with pre-commit or GitHub actions.

  • Core Modules - main.bicep should reference modules from core, copied from azure-dev.

    cp -r ../azure-dev/templates/common/infra/bicep/core/* infra/core/.

  • Dashboard - Resources should include a dashboard so that azd monitor works, either by referencing the monitoring.bicep module or creating a dashboard separately. See main.bicep

  • Monitoring - Application code should include either OpenCensus or OpenTelemetry so that the monitor is populated. See todo/app.py.

  • Managed Identity - Application must use Managed Identity instead of keys wherever possible.

Elsewhere

  • AZD Telemetry - azure.yaml should include metadata for telemetry including version number. See azure.yaml

  • Service Source - In azure.yaml, the project property for each service should point at a subfolder, not at root (.). Typically the subfolder is labeled src but that may vary. See azure.yaml

  • azd Pipeline Config - .github/workflows should include azure-dev.yaml to support azd pipeline config.

  • Devcontainer - .devcontainer should contain a devcontainer.json and Dockerfile/docker-compose.yaml in order to create a full local dev environment. Start with azure-dev versions and modify as needed. See docker-compose.yaml for example that includes a local database service.

  • security-devops-action - The application must run microsoft/security-devops-action. Example

  • Hook Scripts - all hook scripts (pre-/post-provisioning and deployment scripts) shall include both sh and pwsh versions.

README.md

  • Short Description - a short description must be included. Example

  • Prerequisites - a Prerequisites section should be included. Example

  • Architecture - an architecture diagram and description must be included. Example

  • 'Open In __' buttons - must use same "Open in " buttons as the TODO samples. Example

  • Cost Estimation - a Cost Estimation section should be included. Example

Publicizing

Bug in `create-azure-search.ipynb` building the endpoint

The .env file suggests providing only the name of the created AI Search service - which is also needed by other parts of the code. Building the endpoint URL in the mentioned notebook then fails. I suggest changing cell 3 in this notebook to the following (as is already done in the other cells):

def delete_index(search_service: str, search_index: str, search_api_key: str):
    print(f"Deleting index {search_index} in {search_service}...")
    response = requests.delete(
        f"https://{search_service}.search.windows.net/indexes/{search_index}",
        headers={"api-key": search_api_key},
    )
    print(response.status_code)
    return response.status_code

ERROR: failed running post hooks: 'postprovision' hook failed with exit code: '1', Path: 'infra\hooks\postprovision.sh'. : exit code: 1

When I run "azd up", I see that all services have been created successfully. However, I receive an error message afterwards. I have searched for a solution, but have not been successful. Can anyone provide me with any recommendations?
please help me.

(-) Skipped: Didn't find new changes.
<3>WSL (10) ERROR: CreateProcessParseCommon:711: Failed to translate D:\Workspace\contoso-chat
<3>WSL (10) ERROR: CreateProcessParseCommon:757: getpwuid(0) failed 2
<3>WSL (10) ERROR: UtilTranslatePathList:2866: Failed to translate d:\Workspace\contoso-chat.venv\Scripts
<3>WSL (10) ERROR: UtilTranslatePathList:2866: Failed to translate C:\Program Files\Microsoft SDKs\Azure\CLI2\wbin
<3>WSL (10) ERROR: UtilTranslatePathList:2866: Failed to translate C:\Program Files\Java\jdk1.8.0_351\bin
<3>WSL (10) ERROR: UtilTranslatePathList:2866: Failed to translate C:\Program Files\Common Files\Oracle\Java\javapath
<3>WSL (10) ERROR: UtilTranslatePathList:2866: Failed to translate C:\Program Files (x86)\Common Files\Oracle\Java\javapath
<3>WSL (10) ERROR: UtilTranslatePathList:2866: Failed to translate C:\WINDOWS\system32
<3>WSL (10) ERROR: UtilTranslatePathList:2866: Failed to translate C:\WINDOWS
<3>WSL (10) ERROR: UtilTranslatePathList:2866: Failed to translate C:\WINDOWS\System32\Wbem
<3>WSL (10) ERROR: UtilTranslatePathList:2866: Failed to translate C:\WINDOWS\System32\WindowsPowerShell\v1.0
<3>WSL (10) ERROR: UtilTranslatePathList:2866: Failed to translate C:\WINDOWS\System32\OpenSSH
<3>WSL (10) ERROR: UtilTranslatePathList:2866: Failed to translate D:\download\apache-maven-3.8.7-bin\apache-maven-3.8.7\bin
<3>WSL (10) ERROR: UtilTranslatePathList:2866: Failed to translate C:\Program Files\PostgreSQL\13\bin
<3>WSL (10) ERROR: UtilTranslatePathList:2866: Failed to translate C:\Program Files\Git\cmd
<3>WSL (10) ERROR: UtilTranslatePathList:2866: Failed to translate C:\Users\dbhoang\AppData\Roaming\nvm
<3>WSL (10) ERROR: UtilTranslatePathList:2866: Failed to translate C:\Program Files\nodejs
<3>WSL (10) ERROR: UtilTranslatePathList:2866: Failed to translate C:\Program Files\PuTTY
<3>WSL (10) ERROR: UtilTranslatePathList:2866: Failed to translate C:\Program Files\Java\scripts
<3>WSL (10) ERROR: UtilTranslatePathList:2866: Failed to translate C:\Program Files\MySQL\MySQL Server 8.0\bin
<3>WSL (10) ERROR: UtilTranslatePathList:2866: Failed to translate C:\Users\dbhoang\Documents\cdk
<3>WSL (10) ERROR: UtilTranslatePathList:2866: Failed to translate C:\Program Files\RedHat\java-1.8.0-openjdk-1.8.0.372-1\bin
<3>WSL (10) ERROR: UtilTranslatePathList:2866: Failed to translate C:\Program Files\RedHat\java-1.8.0-openjdk-1.8.0.372-1\jre\bin
<3>WSL (10) ERROR: UtilTranslatePathList:2866: Failed to translate C:\Program Files\Git\usr\bin
<3>WSL (10) ERROR: UtilTranslatePathList:2866: Failed to translate C:\Users\dbhoang\AppData\Local\Programs\Python\Python39
<3>WSL (10) ERROR: UtilTranslatePathList:2866: Failed to translate C:\Users\dbhoang\AppData\Local\Programs\Python\Python39\Scripts
<3>WSL (10) ERROR: UtilTranslatePathList:2866: Failed to translate C:\Users\dbhoang\AppData\Local\Programs\Python\Python311
<3>WSL (10) ERROR: UtilTranslatePathList:2866: Failed to translate C:\Users\dbhoang\AppData\Local\Programs\Python\Python311\Scripts
<3>WSL (10) ERROR: UtilTranslatePathList:2866: Failed to translate C:\Program Files\Amazon\AWSCLIV2
<3>WSL (10) ERROR: UtilTranslatePathList:2866: Failed to translate C:\Users\dbhoang\Downloads\sqlite3
<3>WSL (10) ERROR: UtilTranslatePathList:2866: Failed to translate C:\Users\dbhoang\Downloads\sqlite
<3>WSL (10) ERROR: UtilTranslatePathList:2866: Failed to translate C:\Program Files\Docker\Docker\resources\bin
<3>WSL (10) ERROR: UtilTranslatePathList:2866: Failed to translate C:\Program Files\dotnet
<3>WSL (10) ERROR: UtilTranslatePathList:2866: Failed to translate C:\Users\dbhoang\AppData\Local\anaconda3
<3>WSL (10) ERROR: UtilTranslatePathList:2866: Failed to translate C:\Program Files\MongoDB\mongosh-2.2.2-win32-x64\bin
<3>WSL (10) ERROR: UtilTranslatePathList:2866: Failed to translate D:\file\Release-24.02.0-0.zip\poppler-24.02.0\Library\bin
<3>WSL (10) ERROR: UtilTranslatePathList:2866: Failed to translate C:\Gradle\gradle-7.0.2\bin
<3>WSL (10) ERROR: UtilTranslatePathList:2866: Failed to translate C:\Program Files (x86)\Microsoft SQL Server\160\Tools\Binn
<3>WSL (10) ERROR: UtilTranslatePathList:2866: Failed to translate C:\Program Files\Microsoft SQL Server\160\Tools\Binn
<3>WSL (10) ERROR: UtilTranslatePathList:2866: Failed to translate C:\Program Files\Microsoft SQL Server\Client SDK\ODBC\170\Tools\Binn
<3>WSL (10) ERROR: UtilTranslatePathList:2866: Failed to translate C:\Program Files\Microsoft SQL Server\160\DTS\Binn
<3>WSL (10) ERROR: UtilTranslatePathList:2866: Failed to translate C:\Program Files (x86)\Microsoft SQL Server\160\DTS\Binn
<3>WSL (10) ERROR: UtilTranslatePathList:2866: Failed to translate C:\Users\dbhoang\scoop\shims
<3>WSL (10) ERROR: UtilTranslatePathList:2866: Failed to translate C:\Users\dbhoang\AppData\Local\Programs\Python\Python39\Scripts
<3>WSL (10) ERROR: UtilTranslatePathList:2866: Failed to translate C:\Users\dbhoang\AppData\Local\Programs\Python\Python39
<3>WSL (10) ERROR: UtilTranslatePathList:2866: Failed to translate C:\Program Files\MySQL\MySQL Shell 8.0\bin
<3>WSL (10) ERROR: UtilTranslatePathList:2866: Failed to translate C:\Users\dbhoang\AppData\Local\Microsoft\WindowsApps
<3>WSL (10) ERROR: UtilTranslatePathList:2866: Failed to translate C:\Users\dbhoang\AppData\Roaming\npm
<3>WSL (10) ERROR: UtilTranslatePathList:2866: Failed to translate C:\Users\dbhoang\AppData\Roaming\nvm
<3>WSL (10) ERROR: UtilTranslatePathList:2866: Failed to translate C:\Program Files\nodejs
<3>WSL (10) ERROR: UtilTranslatePathList:2866: Failed to translate C:\Users\dbhoang\AppData\Local\Programs\Microsoft VS Code\bin
<3>WSL (10) ERROR: UtilTranslatePathList:2866: Failed to translate C:\Program Files\RedHat\Podman
<3>WSL (10) ERROR: UtilTranslatePathList:2866: Failed to translate C:\Users\dbhoang\AppData\Local\Programs\mongosh
<3>WSL (10) ERROR: UtilTranslatePathList:2866: Failed to translate C:\Users\dbhoang\AppData\Local\Programs\Ollama
<3>WSL (10) ERROR: UtilTranslatePathList:2866: Failed to translate C:\Program Files\JetBrains\IntelliJ IDEA 2024.1\bin
<3>WSL (10) ERROR: UtilTranslatePathList:2866: Failed to translate C:\Users\dbhoang\AppData\Local\Programs\Azure Dev CLI
Processing fstab with mount -a failed.
Failed to mount C:, see dmesg for more details.
Failed to mount D:, see dmesg for more details.

<3>WSL (10) ERROR: CreateProcessEntryCommon:334: getpwuid(0) failed 2
<3>WSL (10) ERROR: CreateProcessEntryCommon:505: execvpe /bin/bash failed 2
<3>WSL (10) ERROR: CreateProcessEntryCommon:508: Create process not expected to return

ERROR: failed running post hooks: 'postprovision' hook failed with exit code: '1', Path: 'infra\hooks\postprovision.sh'. : exit code: 1

ERROR: error executing step command 'provision': failed running post hooks: 'postprovision' hook failed with exit code: '1', Path: 'infra\hooks\postprovision.sh'. : exit code: 1

Connection contoso-cosmos not found

Hi
I am trying to deploy the prompt flow to Azure AI Studio, but when I run a test it indicates that the contoso-cosmos connection was not found, even though the connection has been created. I attach the screenshots. Any idea how this can be solved?

(Screenshot: prompt flow)

(Screenshot: Cosmos connection)

flow.dag.yaml Visual Editor Cannot Find question_embedding tool

I'm running this locally in VS Code using a venv. When I try to run the flow through the visual editor, it fails with AttributeError: 'EntryPoints' object has no attribute 'get'

In the visual editor, I see that the question_embedding node shows an error: "Can't find tool promptflow.tools.embedding.embedding
The package may not be installed in your Python environment."

I can see an embeddings.py file in my venv folder:
image

Here's a screen shot of the error in the PF visual editor in VS Code

Error when running cell 36 in evaluate-chat-prompt-flow.ipynb

The error I get is: "InvalidRunStatusError: The input for batch run is incorrect. Input from key 'run.outputs' is an empty list, which means we cannot generate a single line input for the flow run. Please rectify the input and try again"

The log shows it running successfully until it starts the AML run step...

Error during deployment

I get the error below when using push_and_deploy_pf.ipynb to deploy, at the 'pf_azure_client.flows.create_or_update' step. I also notice the log message is truncated, so I don't see the full extent of what happened. Thoughts?

ResourceNotFoundError: (UserError) Please make sure that you are passing valid secret names and that the keyvault https://ainorthcentus/
Code: UserError
Message: Please make sure that you are passing valid secret names and that the keyvault https://ainorthcentus/

Error in connection type

I get this error when running evaluate-chat-prompt-flow.ipynb; it points to the connection definition in AI Studio. Could a sample be provided for comparison?

...
Reason: NotSupported ConnectionType 0 for connection contoso-cosmos. 
Error message: (UserError) NotSupported ConnectionType 0 for connection contoso-cosmos.
Code: UserError

What happened with ./provision.sh?

Apparently the file ./provision.sh has been moved or replaced. How can I create azure resources now?

P.S. I was planning a pull request that includes a provision.ps1 so you don't need WSL to create Azure resources. Would that still make sense now?
