In this Code Pattern, we will use fictional sales data to create a logistic regression model using Azure Machine Learning Studio. We will use Watson OpenScale to bind the ML model deployed in the Azure cloud, create a subscription, and perform payload and feedback logging.
When the reader has completed this Code Pattern, they will understand how to:
- Prepare data, train a model, and deploy using Azure Machine Learning Studio
- Score the model using sample scoring records and the scoring endpoint
- Set up the Watson OpenScale Data Mart
- Bind the Azure model to the Watson OpenScale Data Mart
- Add subscriptions to the Data Mart
- Enable payload logging and performance monitoring for both subscribed assets
- Use the Data Mart to access table data via the subscription
- The developer creates a Jupyter Notebook.
- The Jupyter Notebook is connected to a PostgreSQL database, which is used to store Watson OpenScale data.
- An ML model is created using Azure ML Studio, using data from GoSales_Tx, and then it is deployed to the cloud.
- The notebook uses Watson OpenScale to log the payload and monitor performance.
- An IBM Cloud account
- IBM Cloud CLI
- An account on Azure Machine Learning Studio
- Clone the repository
- Create a Databases for PostgreSQL DB
- Create a Watson OpenScale service
- Run the notebook
```bash
git clone https://github.com/IBM/monitor-sagemaker-ml-with-ai-openscale
cd monitor-sagemaker-ml-with-ai-openscale
```
- Using the IBM Cloud Dashboard catalog, search for PostgreSQL and choose the **Databases for PostgreSQL** service.
- Wait a couple of minutes for the database to be provisioned.
- Click the **Service credentials** tab on the left and then click **New credential +** to create the service credentials. Copy them or leave the tab open to use later in the notebook.
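The notebook consumes these credentials as a Python dict. A minimal sketch of their general shape, with placeholder values (the exact fields can vary by service version, so always copy the real credential JSON from the Service credentials tab):

```python
# Hypothetical placeholder credential illustrating the general shape of a
# Databases for PostgreSQL service credential; a real one contains actual
# hostnames, ports, and passwords.
POSTGRES_CREDENTIALS = {
    "connection": {
        "postgres": {
            "composed": [
                "postgres://admin:PASSWORD@HOST:32727/ibmclouddb?sslmode=verify-full"
            ],
            "database": "ibmclouddb",
            "hosts": [{"hostname": "HOST", "port": 32727}],
        }
    }
}

# The connection string the notebook would use lives under "composed":
uri = POSTGRES_CREDENTIALS["connection"]["postgres"]["composed"][0]
print(uri)
```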
- Using the IBM Cloud Dashboard, create a Watson OpenScale service.
- You will get the Watson OpenScale instance GUID when you run the notebook, using the IBM Cloud CLI.
- Create an experiment in Azure ML Studio using the diagram in the notebook. (You can search for each module in the palette by name.)
- When you get to the **Train Model** module, select the **Product Line** column as the label.
- Run the experiment to train the model.
- Create (deploy) the web service (choose the **new** web service, NOT **classic**).
- Follow the instructions for **ACTION: Get Watson OpenScale instance_guid and apikey** using the IBM Cloud CLI.

How to get an API key using the IBM Cloud CLI:

```bash
ibmcloud login --sso
ibmcloud iam api-key-create 'my_key'
```

How to get your Watson OpenScale instance GUID:

```bash
ibmcloud resource service-instance <WatsonOpenScale_instance_name>
```
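The GUID and API key from those commands end up in the notebook's credentials cell. A hedged sketch of what that dict typically looks like; the `url` value and the exact key names are assumptions, so follow the notebook's own cell:

```python
# Placeholder values; substitute the output of `ibmcloud iam api-key-create`
# and `ibmcloud resource service-instance`. The endpoint URL shown here is
# an assumed default, not taken from this Code Pattern.
WATSON_OS_CREDENTIALS = {
    "instance_guid": "guid-from-ibmcloud-cli",
    "apikey": "api-key-from-ibmcloud-cli",
    "url": "https://api.aiopenscale.cloud.ibm.com",
}

print(sorted(WATSON_OS_CREDENTIALS))
```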
- Enter the `instance_guid` and `apikey` in the next cell for the `WATSON_OS_CREDENTIALS`.
- In the cell after that, enter the `POSTGRES_CREDENTIALS` using the value for the PostgreSQL credentials from Step #2.
- In the cell after *2.1 Bind Azure machine learning engine*, enter the `client_id`, `client_secret`, `subscription_id`, and `tenant` for the `AZURE_ENGINE_CREDENTIALS`.
NOTE: Setting up Azure Active Directory for the AZURE_ENGINE_CREDENTIALS is beyond the scope of this document. See Azure documentation for help with this.
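The four values named above go into a single dict. A sketch with placeholder values (the key names come from the list above; the values must come from your own Azure Active Directory app registration):

```python
# Hypothetical placeholders; obtain real values from your Azure Active
# Directory app registration (see the Azure documentation, per the note above).
AZURE_ENGINE_CREDENTIALS = {
    "client_id": "azure-ad-application-id",
    "client_secret": "azure-ad-application-secret",
    "subscription_id": "azure-subscription-id",
    "tenant": "azure-ad-tenant-id",
}

print(sorted(AZURE_ENGINE_CREDENTIALS))
```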
- After running *3.1 Add subscriptions* you will get a `source_uid` to enter in the cell that follows.
- Move your cursor to each code cell and run the code in it. Read the comments for each cell to understand what the code is doing. **Important:** while the code in a cell is still running, the label to its left shows `In [*]:`. Do not continue to the next cell until the code has finished running.
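Scoring the deployed model (one of the objectives above) amounts to an authenticated POST to the web service's scoring endpoint. A minimal sketch of building that request; the endpoint URL, API key, input wrapper, and column names here are all assumptions for illustration, so copy the real values and schema from your service's Consume page in Azure ML Studio:

```python
def build_scoring_request(scoring_url, api_key, records):
    """Assemble the URL, headers, and JSON body for a scoring call to an
    Azure ML Studio web service (request/response format)."""
    headers = {
        "Content-Type": "application/json",
        # Azure ML Studio web services authenticate with a bearer API key.
        "Authorization": "Bearer " + api_key,
    }
    # The body wraps input records under "Inputs"; the column names must
    # match the input schema of your trained model.
    body = {"Inputs": {"input1": records}, "GlobalParameters": {}}
    return scoring_url, headers, body

# Hypothetical endpoint, key, and GoSales-style record:
url, headers, body = build_scoring_request(
    "https://example.services.azureml.net/workspaces/WS/services/SVC/execute",
    "FAKE_API_KEY",
    [{"GENDER": "M", "AGE": 27, "MARITAL_STATUS": "Single", "PROFESSION": "Professional"}],
)
# To actually score (requires the third-party `requests` package):
#   response = requests.post(url, headers=headers, json=body)
```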
See the example notebook with output.
This code pattern is licensed under the Apache License, Version 2. Separate third-party code objects invoked within this code pattern are licensed by their respective providers pursuant to their own separate licenses. Contributions are subject to the Developer Certificate of Origin, Version 1.1 and the Apache License, Version 2.