How to Create an Azure Notebooks Project and Deploy a Summarization Service - Experiment #102

In this post, we will learn, step by step, how to create an Azure Notebooks project for our Experiment #102 and implement a text summarization service by writing a few Python scripts and running them in Jupyter.

1.     Create an Azure Notebooks Project

First, we access the Azure Notebooks service at https://notebooks.azure.com/ and create a new project from “My Projects”.


2.     Install Machine Learning Libraries

Before creating our own functions, we must install a set of Python libraries that includes the Azure Machine Learning SDK (azureml-sdk), ONNX Runtime (onnxruntime) and the Natural Language Toolkit (nltk). We also install gensim, which provides the extractive summarizer we will use later.

The “init” notebook (or Python script) looks like this:

# Install the required packages (run in a notebook cell)
!pip install --upgrade azureml-sdk[notebooks]
!pip install onnxruntime
# gensim (used later for summarization) removed its summarization module in 4.0, so stay on 3.x
!pip install "gensim<4"

# Download the NLTK data used for sentence tokenization
import nltk
nltk.download('all')


After that, we upload the file to our Azure Notebooks project, open it and run it from Jupyter to install all the libraries. This step may take a few minutes to complete.
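Optionally, once the install cell has finished, a quick check in a new cell confirms that the libraries are available (the exact versions will vary):

import azureml.core
import onnxruntime
import nltk
import gensim

# Print the installed versions to confirm the environment is ready
print('azureml-sdk: ', azureml.core.VERSION)
print('onnxruntime: ', onnxruntime.__version__)
print('nltk: ', nltk.__version__)
print('gensim: ', gensim.__version__)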


3.     Test a Simple Service Usage Example

For our example, we will create a new “.ipynb” file that imports some of the libraries we installed before.

import nltk
import re
import unicodedata
import numpy as np
from gensim.summarization import summarize


Then, we need to create two functions: one to normalize the text and another to summarize it.

def normalize_text(text):
    # Replace line breaks with spaces and split the text into clean sentences
    text = re.sub('\n', ' ', text)
    text = text.strip()
    sentences = nltk.sent_tokenize(text)
    sentences = [sentence.strip() for sentence in sentences]
    return sentences

def summarize_text(text, summary_ratio=None, word_count=30):
    # Rejoin the normalized sentences and run gensim's extractive summarizer
    sentences = normalize_text(text)
    cleaned_text = ' '.join(sentences)
    summary = summarize(cleaned_text, split=True, ratio=summary_ratio, word_count=word_count)
    return summary


Now, we can pass a sample text to our summarize_text function, run the cell and see the summarization in action.
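A quick test cell could look like the following sketch (the sample text is only an illustration):

sample_text = """
Azure Notebooks lets us run Jupyter notebooks in the cloud without installing anything locally.
In this experiment we use the notebooks to build a small text summarization service.
The summarization service normalizes the input text, splits it into sentences and returns a short extractive summary.
Later we will package the same summarization code and deploy it as a web service in an Azure Container Instance.
"""

# Print the short summary produced with the default word_count of 30
print(summarize_text(sample_text))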


4.     Create a Machine Learning Workspace

Meanwhile, in our Azure subscription, we need to create an Azure Machine Learning service workspace resource to obtain the configuration parameters (subscription ID, resource group, workspace name and region) that we will use in the next step.


5.     Connect to Workspace and Prepare the Model

Now, we will set up a “.ipynb” file to connect to the Azure Machine Learning workspace and prepare the model to be deployed to an Azure Container Instance.

First, we must provide the Workspace connection parameters and import the needed libraries.

# Workspace connection parameters (replace them with your own values)
subscription_id = "xxx-xxx-xxx-xxx-xxx"
resource_group = "Machine_Learning_Experiment"
workspace_name = "Experiment_Workspace"
workspace_region = "westeurope"

import azureml.core
print('azureml.core.VERSION: ', azureml.core.VERSION)

from azureml.core import Workspace

# Get a reference to the workspace (exist_ok = True reuses it if it already exists)
ws = Workspace.create(
    name = workspace_name,
    subscription_id = subscription_id,
    resource_group = resource_group,
    location = workspace_region,
    exist_ok = True)

# Save the connection details to .azureml/config.json for later use
ws.write_config()
print('Workspace configuration succeeded')
!cat .azureml/config.json
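Once the configuration has been written, other notebooks can reconnect to the same workspace without repeating the parameters, for example:

from azureml.core import Workspace

# Reload the workspace details from the saved .azureml/config.json
ws = Workspace.from_config()
print(ws.name, ws.location, ws.resource_group)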


Then, we can define the scoring script, package it with the Azure Machine Learning SDK and deploy it as a web service in an Azure Container Instance. The first cell writes the scoring logic to summarizer_service.py:

%%writefile summarizer_service.py

import re
import nltk
import unicodedata
from gensim.summarization import summarize, keywords

def clean_and_parse_document(document):
    # In Python 3 all text is str; reject anything that is not a string
    if not isinstance(document, str):
        raise ValueError("Document is not a string.")
    # Normalize unicode characters to their closest ASCII representation
    document = unicodedata.normalize('NFKD', document).encode('ascii', 'ignore').decode('ascii')
    document = document.strip()
    sentences = nltk.sent_tokenize(document)
    sentences = [sentence.strip() for sentence in sentences]
    return sentences

def summarize_text(text, summary_ratio = None, word_count = 30):
    sentences = clean_and_parse_document(text)
    cleaned_text = ' '.join(sentences)
    summary = summarize(cleaned_text, split = True, ratio = summary_ratio, word_count = word_count)
    return summary 

def init():  
    nltk.download('all')
    return

def run(input_str):
    # Called by the service for each scoring request
    try:
        return summarize_text(input_str)
    except Exception as e:
        return str(e)

In the next cell, we describe the Conda environment for the web service, listing the pip packages that the scoring script needs, and save it to a YAML file:

from azureml.core.conda_dependencies import CondaDependencies

myacienv = CondaDependencies.create(pip_packages = ['gensim','nltk'])

with open("mydeployenv.yml","w") as f:
    f.write(myacienv.serialize_to_string())

Next, we create the deployment configuration for the Azure Container Instance and describe the container image that will host the scoring script:

from azureml.core.webservice import AciWebservice, Webservice

aci_config = AciWebservice.deploy_configuration(
    cpu_cores = 1, 
    memory_gb = 1, 
    tags = {'name':'Summarization'}, 
    description = 'Summarizes text.')
service_name = "summarizer"
runtime = "python"
driver_file = "summarizer_service.py"
conda_file = "mydeployenv.yml"

from azureml.core.image import ContainerImage

image_config = ContainerImage.image_configuration(execution_script = driver_file,
                                                  runtime = runtime,
                                                  conda_file = conda_file)

Finally, we deploy the image as a web service to an Azure Container Instance and wait for the deployment to complete:

webservice = Webservice.deploy(
  workspace = ws, 
  name = service_name, 
  model_paths = [],
  deployment_config = aci_config,
  image_config = image_config, 
  )

webservice.wait_for_deployment(show_output = True)
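If the deployment takes longer than expected or fails, the state of the service and the container logs can help diagnose the problem:

# Check the current state of the service (for example Healthy or Unhealthy)
print(webservice.state)

# Show the logs of the container that runs the scoring script
print(webservice.get_logs())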


After that, we can define a sample text to test the service.

example = """
I would like to thank you for a wonderful stay at the Hotel California.   
The room we stayed in was very nice and had plenty of room for the whole family and the beds were especially comfortable.  
The kids loved going to the Kids Club and the swimming pools all day which gave us some time to relax and have a few cocktails before eating at the Bistro restaurant. The food was great and the kids menu was priced very reasonably.
A big thank you to all the staff who couldn't do enough for us and were polite and friendly throughout our stay. 
We had a great family holiday and can't wait to book again for next year. 
"""
result = webservice.run(input_data = example)
print(result)

# The scoring URI can be used to call the service over HTTP
print(webservice.scoring_uri)


6.     Deploy and Test the Web Service

Finally, we can upload this file and run the project to deploy and test the model.
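Once the service reports a healthy state, it can also be called over plain HTTP using the scoring URI printed earlier. The exact payload format depends on how the SDK wraps the request body, so the snippet below is only a sketch and may need small adjustments:

import json
import requests

# Call the deployed ACI endpoint directly over HTTP
headers = {'Content-Type': 'application/json'}
response = requests.post(webservice.scoring_uri, data = json.dumps(example), headers = headers)

print(response.status_code)
print(response.text)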

