Introduction
Media files stored through Microsoft’s Azure cloud platform can be easily integrated with Dolby.io to create a pipeline for media enhancement and analysis, allowing Azure users to enrich and understand their audio, all from the cloud. In this guide, we will explore how to integrate Dolby.io’s Media Processing APIs with Azure Blob Storage to help users enhance their audio in a simple and scalable way.
What Do We Need to Get Started?
Before we get started, there are five parameters you need to have ready for the integration with Azure:
- Azure Account Name: The name of your Azure account.
- Azure Container Name: The name of the Container where your file is located.
- Azure Blob Name: The name of the Blob where your file is stored.
- Azure Primary Access Key: The Primary Access Key for your Azure storage account.
- Dolby.io API Key: The Dolby.io Media Processing API key found on your Dolby.io dashboard.
These parameters direct Dolby.io to the location of your cloud-stored file and provide the authorization needed to access your privately stored media.
Getting Started with Python
To begin building this integration, we first need to install the azure-storage-blob package, version 12.8.1. It is important to note that installing the azure-storage-blob Python package differs from installing the base Azure SDK, so make sure to install azure-storage-blob==12.8.1 specifically, as shown in the code below. Once installed, we import Python's datetime, requests, and time modules, along with the generate_blob_sas and BlobSasPermissions helpers from azure.storage.blob.
#!pip install azure-storage-blob==12.8.1
from datetime import datetime, timedelta #For setting the token validity duration
import requests #For using the Dolby.io REST API
import time #For tracking progress of our media processing job
# Relevant Azure tools
from azure.storage.blob import (
    generate_blob_sas,
    BlobSasPermissions
)
Next, we define our input parameters. We specify both an input and an output for our Blob files. The input represents the name of the stored file on the server and the output represents the name of the enhanced file that will be placed back onto the server.
# AZURE
AZURE_ACC_NAME = 'your-account-name'
AZURE_PRIMARY_KEY = 'your-account-key'
AZURE_CONTAINER = 'your-container-name'
AZURE_BLOB_INPUT = 'your-unenhanced-file'
AZURE_BLOB_OUTPUT = 'name-of-enhanced-output'
We also need to define some Dolby.io parameters, including the server and the function applied to your files. In this case, we pick enhance, then define our Dolby.io Media Processing API key.
# DOLBY
server_url = "https://api.dolby.com"
url = server_url + "/media/enhance"
api_key = "your Dolby.io Media API Key"
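If you prefer not to hardcode the key, one option, shown as a minimal sketch below, is to read it from an environment variable. DOLBYIO_API_KEY is an illustrative name, not one required by the API.
import os  # Standard library; no extra install required

# Use the environment variable when set, otherwise keep the placeholder above
api_key = os.environ.get("DOLBYIO_API_KEY", api_key)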
With all our variables defined, we can now create the Shared Access Signatures (SAS) that the Dolby.io API will use to find the files. To do this, we use the generate_blob_sas function in conjunction with BlobSasPermissions and the datetime utilities imported earlier.
# Read-only SAS for the input blob
input_sas_blob = generate_blob_sas(account_name=AZURE_ACC_NAME,
                                   container_name=AZURE_CONTAINER,
                                   blob_name=AZURE_BLOB_INPUT,
                                   account_key=AZURE_PRIMARY_KEY,
                                   permission=BlobSasPermissions(read=True),
                                   expiry=datetime.utcnow() + timedelta(hours=1))

# Read/write/create SAS for the enhanced output blob
output_sas_blob = generate_blob_sas(account_name=AZURE_ACC_NAME,
                                    container_name=AZURE_CONTAINER,
                                    blob_name=AZURE_BLOB_OUTPUT,
                                    account_key=AZURE_PRIMARY_KEY,
                                    permission=BlobSasPermissions(read=True, write=True, create=True),
                                    expiry=datetime.utcnow() + timedelta(hours=1))
Note that in the code sample above we define both an input and an output SAS blob. For the input SAS, we only need to give the signature the ability to read files; for the output SAS, we need to give it the ability to create a file on the Azure server and then write to that file. We also specify how long we want our signatures to remain valid. In this case, the links are valid for one hour; for larger jobs, we may need to increase this window of validity, as sketched below.
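For instance, a larger job might warrant a 24-hour window. Below is a minimal sketch of computing a longer expiry to pass as the expiry argument to generate_blob_sas; the exact duration is an assumption you should size to your own jobs.
# Illustrative: a 24-hour validity window for larger jobs
long_expiry = datetime.utcnow() + timedelta(hours=24)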
With our SAS tokens created, we now need to format the tokens into URLs. Again, we create two URLs: one for the input and one for the output.
input_sas = 'https://' + AZURE_ACC_NAME + '.blob.core.windows.net/' + AZURE_CONTAINER + '/' + AZURE_BLOB_INPUT + '?' + input_sas_blob
output_sas = 'https://' + AZURE_ACC_NAME + '.blob.core.windows.net/' + AZURE_CONTAINER + '/' + AZURE_BLOB_OUTPUT + '?' + output_sas_blob
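Before submitting the job, it can be worth sanity-checking the input URL. The optional sketch below, not part of the original flow, issues a HEAD request, which should return status 200 if the SAS token and blob name are correct.
# Optional sanity check: HEAD the input SAS URL before starting the job
check = requests.head(input_sas)
print(check.status_code)  # Expect 200 for a readable input blob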
Using both SAS URLs, we plug the values into the Dolby.io API and initiate the media processing job. The job's unique identifier is captured in the job_id variable, which we can use to track progress.
body = {
    "input": input_sas,
    "output": output_sas
}
headers = {
    "x-api-key": api_key,
    "Content-Type": "application/json",
    "Accept": "application/json"
}
response = requests.post(url, json=body, headers=headers)
response.raise_for_status()
job_id = response.json()["job_id"]
Note how the input and output fields of the body are assigned their corresponding SAS URLs.
Our job has now begun. To track its progress, we can create a loop that reports the job status.
while True:
    headers = {
        "x-api-key": api_key,
        "Content-Type": "application/json",
        "Accept": "application/json"
    }
    params = {"job_id": job_id}
    response = requests.get(url, params=params, headers=headers)
    response.raise_for_status()
    status = response.json()["status"]
    print(response.json())
    if status == "Success" or status == "Failed":
        break
    time.sleep(20)  # Poll every 20 seconds
print(status)
Once the job has completed, the loop will exit and our enhanced file will be visible in Azure Blob Storage. Alternatively, instead of polling in a loop as in the example above, Dolby.io offers webhooks and callbacks that can be used to notify users of a job's completion.
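For example, the Dolby.io documentation describes an on_complete webhook object that can be included in the original enhance request so your service is notified when processing finishes. A minimal sketch, assuming that object; the callback URL is a placeholder.
# Sketch: ask Dolby.io to call a webhook instead of polling for status
body = {
    "input": input_sas,
    "output": output_sas,
    "on_complete": {
        "url": "https://example.com/dolbyio-callback"  # Placeholder endpoint
    }
}
response = requests.post(url, json=body, headers=headers)
response.raise_for_status()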
Conclusion
In summary, once we have created a valid Azure SAS, the process for connecting Azure to Dolby.io is simple, allowing for seamless integration between the two services. If you are interested in learning more about integrating Azure with Dolby.io, or in exploring examples in other programming languages, check out our documentation here.