Azure ML Deployment using ACI
Components – Directory Structure
1. echo_score.py is a plain Python (.py) script, while train.ipynb is a notebook file.
2. Data is loaded manually into the data folder. The recommended approach, however, is to first create a datastore, then a data asset, and then reference that data asset in the code (see the sketch below).
3. train.ipynb has to run on the Python 3.10 SDK v2 kernel, since there were issues installing the scikit-learn library on earlier SDK versions.
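A minimal sketch of the datastore/data-asset route, using the same azureml.core (v1) classes imported elsewhere in this deck (in SDK v2 the equivalent is a data asset created through MLClient). The datastore name "workspaceblobstore" is the workspace default; the CSV path and dataset name are placeholders:
from azureml.core import Workspace, Datastore, Dataset
# from_config() resolves the current workspace on an Azure ML compute instance.
workspace = Workspace.from_config()
# "workspaceblobstore" is the default blob datastore every workspace has.
datastore = Datastore.get(workspace, "workspaceblobstore")
# Build a tabular dataset from a file inside the datastore (placeholder path).
dataset = Dataset.Tabular.from_delimited_files(path=(datastore, "taxi/taxi_data.csv"))
# Register it so train.ipynb can load it by name instead of reading local files.
dataset = dataset.register(workspace, name="taxi_trips", create_new_version=True)
# Inside the training code:
df = dataset.to_pandas_dataframe()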
Train File – Define Workspace
# Define workspace variables
from azureml.core import Workspace, Dataset
import numpy as np
import pandas as pd
secret_subscription_id_value = 'XXXXXXXXXXX'
resource_group = 'XXXXXXXXXXX'
workspace_name = 'XXXXXXXXXXX'
workspace = Workspace(secret_subscription_id_value, resource_group, workspace_name)
1. The ID values can be stored in Key Vault; however, that code did not work in this setup (a hedged sketch of the intended approach follows).
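A hedged sketch of the Key Vault approach, assuming the notebook runs on a compute instance where Workspace.from_config() resolves the workspace; the secret names are hypothetical and would have to be stored beforehand with set_secret():
from azureml.core import Workspace
# On an Azure ML compute instance, from_config() picks up the current workspace.
workspace = Workspace.from_config()
# Read previously stored secrets from the workspace's default Key Vault.
keyvault = workspace.get_default_keyvault()
subscription_id = keyvault.get_secret("subscription-id")  # hypothetical secret name
resource_group = keyvault.get_secret("resource-group")    # hypothetical secret name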
Train File – Save Model Artifacts
# Save the model and its artifacts
import joblib
joblib.dump(kmeans, "../model_artifacts/kmeans.joblib")
joblib.dump(enc, "../model_artifacts/enc.bin", compress=True)
joblib.dump(model_rf, "../model_artifacts/taxi_demand_prediction.joblib")
1. At the end of training, the model artifacts are stored on the Azure ML compute instance. They are later referenced in the scoring file.
Train File – Register Model
# Register model
from azureml.core.model import Model
model = Model.register(workspace,
                       model_name="taxi_demand_prediction",
                       model_path="../model_artifacts/")
1. When you register the model, all the artifacts stored on the Azure ML compute instance are registered as a folder in the model registry (see the view on the right).
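If the notebook is restarted later, the registered model can be fetched back by name instead of re-registering; a small sketch (the download directory is arbitrary):
from azureml.core.model import Model
# Look up the latest registered version by name.
model = Model(workspace, name="taxi_demand_prediction")
print(model.name, model.version)
# Optionally pull the registered folder down locally to inspect the artifacts.
model.download(target_dir="./downloaded_model", exist_ok=True)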
Train File – Environment Setup
# Environment setup
from azureml.core import Environment
from azureml.core.model import InferenceConfig
env = Environment(name="taxi_demand_prediction")
python_packages = ['azure-ml-api-sdk', 'numpy', 'pandas', 'seaborn', 'matplotlib',
                   'scipy', 'scikit-learn', 'joblib', 'requests']
for package in python_packages:
    env.python.conda_dependencies.add_pip_package(package)
1. While setting up the environment for inference, the first library, azure-ml-api-sdk, had to be installed as well.
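A hedged, more reproducible variant of the same environment that builds the dependency list in one call and pins versions (the version numbers are placeholders, not the ones actually used here):
from azureml.core import Environment
from azureml.core.conda_dependencies import CondaDependencies
env = Environment(name="taxi_demand_prediction")
# Pinning versions keeps the inference image reproducible across rebuilds.
deps = CondaDependencies.create(
    python_version="3.10",          # placeholder; match the training kernel
    pip_packages=[
        "azure-ml-api-sdk",
        "numpy==1.23.5",            # placeholder versions
        "pandas==1.5.3",
        "scikit-learn==1.2.2",
        "joblib",
        "requests",
    ],
)
env.python.conda_dependencies = deps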
Train File – Inference Config
# Inference configuration setup
inference_config = InferenceConfig(
    environment=env,
    source_directory="../source_dir",
    entry_script="echo_score.py",
)
Train File – Local Deployment
# Local Deployment
from azureml.core.webservice import LocalWebservice
from azureml.core.webservice import AciWebservice
deployment_config = LocalWebservice.deploy_configuration(port=6789)
service = Model.deploy(
    workspace,
    "taxidemandprediction",
    [model],
    inference_config,
    deployment_config,
    overwrite=True,
)
service.wait_for_deployment(show_output=True)
print(service.get_logs())
1. There should be no underscore in the deployment name; hence the name is "taxidemandprediction".
Train File – Local Testing
# Local Testing
import requests
import json
uri = service.scoring_uri
requests.get("http://localhost:6789")  # quick check that the local container is responding
headers = {"Content-Type": "application/json"}
data = {
    "pickup_latitude": "-73.980492",
    "pickup_longitude": "40.777981",
    "tpep_pickup_datetime": "2020-02-13 23:40:00",
}
data = json.dumps(data)
response = requests.post(uri, data=data, headers=headers)
print(response.json())
service.get_logs()  # Get logs
Train File – Remote ACI Deployment
# Remote Deployment
deployment_config = AciWebservice.deploy_configuration(
    cpu_cores=0.5, memory_gb=1, auth_enabled=True
)
service = Model.deploy(
    workspace,
    "taxidemandprediction",
    [model],
    inference_config,
    deployment_config,
    overwrite=True,
)
service.wait_for_deployment(show_output=True)
print(service.get_logs())
1. ACI (Azure Container Instances) is a serverless offering from Azure for low-cost, real-time deployments. The main alternative is Kubernetes (AKS); see the sketch below.
2. For batch inference, Azure ML compute itself can be used.
3. We can also "bring your own container" and use Azure only for deployment.
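For the Kubernetes alternative mentioned in point 1, a hedged sketch of the equivalent deployment against an AKS cluster already attached to the workspace ("aks-cluster" is a placeholder name; creating or attaching the cluster is not shown):
from azureml.core.compute import AksCompute
from azureml.core.webservice import AksWebservice
# Reference an AKS inference cluster already attached to the workspace.
aks_target = AksCompute(workspace, "aks-cluster")  # placeholder cluster name
aks_config = AksWebservice.deploy_configuration(
    cpu_cores=1, memory_gb=2, auth_enabled=True
)
service = Model.deploy(
    workspace,
    "taxidemandprediction-aks",
    [model],
    inference_config,
    aks_config,
    deployment_target=aks_target,
    overwrite=True,
)
service.wait_for_deployment(show_output=True)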
Train File – ACI Remote Testing
import requests
import json
from azureml.core import Webservice
service = Webservice(workspace=workspace, name="taxidemandprediction")
scoring_uri = service.scoring_uri
# If the service is authenticated, set the key or token
key, _ = service.get_keys()
# Set the appropriate headers
headers = {"Content-Type": "application/json"}
headers["Authorization"] = f"Bearer {key}"
# Make the request and display the response and logs
data = {"pickup_latitude":"-
73.980492","pickup_longitude":"40.777981","tpep_pickup_datetime":"2020-02-13 23:40:00"}
data = json.dumps(data)
resp = requests.post(scoring_uri, data=data, headers=headers)
print(resp.text)
Other issues faced
1. The model path given in the scoring file refers to the path inside the model registry. The path was originally "taxi_demand_prediction/model_artifacts/kmeans.joblib", and we changed it to "model_artifacts/kmeans.joblib"; similar changes were made to the other paths in the scoring file (see the sketch after this list).
2. Always do local testing on the compute instance before deploying to ACI. The logs are more detailed during local testing, which makes debugging easier.
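A minimal sketch of how echo_score.py's init() might resolve those corrected paths; one common pattern joins them onto the AZUREML_MODEL_DIR environment variable that Azure ML sets inside the deployed container (the run() body shown is only a placeholder, not the actual scoring logic):
import json
import os
import joblib
def init():
    global kmeans, enc, model_rf
    # AZUREML_MODEL_DIR points at the root of the registered model folder in the
    # container, so the artifact paths are relative: "model_artifacts/<file>".
    model_dir = os.getenv("AZUREML_MODEL_DIR", ".")
    kmeans = joblib.load(os.path.join(model_dir, "model_artifacts/kmeans.joblib"))
    enc = joblib.load(os.path.join(model_dir, "model_artifacts/enc.bin"))
    model_rf = joblib.load(os.path.join(model_dir, "model_artifacts/taxi_demand_prediction.joblib"))
def run(raw_data):
    # Placeholder: parse the JSON payload and echo it back.
    payload = json.loads(raw_data)
    return {"received": payload}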
Key views – Models
Key views – Endpoints
1. The endpoint also provides a REST API for the deployment that can be queried from outside the Azure setup.
Key views – Compute used for Training