
AWS Lambda Integration with Cloud Volumes Service APIs

July 27, 2019

Topics: AWS | 4 minute read

NetApp® Cloud Volumes Service for AWS offers a fully managed file service for workloads hosted in Amazon Web Services (AWS), and its capabilities are far reaching. In addition to managing cloud volumes from the intuitive, web-based Cloud Volumes Service UI, administrators can also use the Cloud Volumes API to automate provisioning and management operations.

Using the Cloud Volumes Service API, you can extend Cloud Volumes Service to other AWS automation services, such as AWS Lambda. AWS Lambda is one of the building blocks of IT automation, and it can be used to run code in a serverless way. Read on for a step-by-step guide to integrating Cloud Volumes Service APIs with AWS Lambda to provision persistent storage for your cloud workloads.

Using Cloud Volumes Service APIs to Automate Volume Provisioning and Management

When getting started with Cloud Volumes Service, you normally use the Cloud Volumes Service UI to provision storage, create NetApp Snapshot™ copies, and carry out other important tasks. In large, rapidly changing enterprises, however, managing these operations manually won’t scale to meet your organization’s needs. The Cloud Volumes Service APIs cater to those automation needs by allowing you to make REST API calls to rapidly provision and manage cloud volumes.

You can see the APIs available for automation by using the API documentation link on the Storage page after you log in to the Cloud Volumes Service UI.

APIs that can be used for automation are also listed on this page, along with supported actions, such as GET, POST, PUT, and DELETE. These functions can be integrated into your automation logic when you create provisioning scripts and tools.
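As a sketch of how those actions plug into a script, the helper below builds the URL and authentication headers for a Cloud Volumes Service API call. The header names (api-key, secret-key) match the sample Lambda code later in this post; the base URL shown in the example call is a placeholder, not a real endpoint.

```python
# Hypothetical helper: assembles the request URL and auth headers for a
# Cloud Volumes Service API call. The same parts work for GET, POST,
# PUT, and DELETE requests against the documented endpoints.
def cvs_request_parts(base_url, api_key, secret_key, path):
    url = base_url.rstrip('/') + path
    headers = {
        'content-type': 'application/json',
        'api-key': api_key,
        'secret-key': secret_key
    }
    return url, headers

# Example: GET on /FileSystems lists volumes; a POST to the same
# endpoint creates one. The base URL below is a placeholder.
url, headers = cvs_request_parts(
    "https://cv.example.netapp.com/v1", "my-key", "my-secret", "/FileSystems")
```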

Cloud Volumes Service API Integration with AWS Lambda: Sample Configuration

AWS Lambda enables a serverless architecture in which you can write code in any of the supported languages (including Node.js, Python, Ruby, .NET, and Java) and execute it in response to a configured trigger. The code will be executed in the Lambda runtime, and Lambda will take care of the memory, CPU, network, and other resources required. Lambda can be used to execute provisioning scripts, like those you write to make API calls to Cloud Volumes Service APIs for storage provisioning and management.

The following example demonstrates how to build an AWS Lambda function that’s integrated with the Cloud Volumes Service API for automation. The goal of this automation is to provision a persistent volume from a serverless application built with Amazon API Gateway, AWS Lambda, the NetApp Cloud Volumes Service API, and Python. With it, a persistent volume can be provisioned simply by sending an SMS message.


To work on an automation tool that uses the Cloud Volumes Service API, you first need to obtain the API’s URL, the API key, and the secret key for making the API call. You can get these values from the Cloud Volumes Service UI by going to the Storage tab and selecting API Access.
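Rather than hardcoding these values in the function body (as the sample code below does for simplicity), you can supply them through Lambda environment variables set in the function’s configuration. A minimal sketch, using variable names that follow this post’s convention:

```python
import os

def load_cvs_credentials():
    # Reads the Cloud Volumes Service API endpoint and keys from Lambda
    # environment variables. The variable names here are this post's
    # convention; set them in the Lambda function's configuration.
    return {
        "base_url": os.environ.get("CVAPI_BASEURL", ""),
        "api_key": os.environ.get("CVAPI_APIKEY", ""),
        "secret_key": os.environ.get("CVAPI_SECRETKEY", ""),
    }
```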

Sample Code

Use the following Python function code to get all the volumes from a specific account in a specific region. This code will be used in step 5 of the configuration. (You can also refer to this code on GitHub.)   

from botocore.vendored import requests  # on newer Lambda runtimes, package the requests library instead
import logging


logger = logging.getLogger()

# Values from the Cloud Volumes Service UI (Storage tab > API Access)
CVAPI_BASEURL = "enter your Cloud Volumes Service API URL here"
CVAPI_APIKEY = "enter your Cloud Volumes Service API key here"
CVAPI_SECRETKEY = "enter your Cloud Volumes Service secret key here"

HEADERS = {
    'content-type': 'application/json',
    'api-key': CVAPI_APIKEY,
    'secret-key': CVAPI_SECRETKEY
}

filesystemURL = CVAPI_BASEURL + "/FileSystems"

def lambda_handler(event, context):
    # List every volume (file system) in the account and region
    getResult = requests.get(url=filesystemURL, headers=HEADERS)
    print("get File system success, the response code : ", getResult.status_code)
    fileSystemsData = getResult.json()
    for i in fileSystemsData:
        fileSystemId = i['fileSystemId']
        name = i['name']
        print("FileSystemId : ", fileSystemId, " = VolumeName : ", name)


1. In the AWS console, go to the AWS Lambda service.

2. Click the Create a Function button.

AWS Lambda

3. Choose Author from Scratch, and either select the options shown here or create a role with Amazon CloudWatch log permissions.

Author from scratch

4. Click Create Function.

5. Go to the function; then copy the sample code shown earlier and paste it into the Function Code editor. Fill in your Cloud Volumes Service API URL, API key, and secret key (obtained earlier from the API Access page) in the code.

6. Configure the trigger of your choice. 

7. Click Configure Test Event, and set options according to the following screenshot. Because this configuration is used only for test purposes, create a dummy test event.

Configuration test event

8. Click Test.

9. The response should fetch all the volumes configured in the Cloud Volumes Service account. The following graphic illustrates the end-to-end architecture of this provisioning process.

End-to-end architecture of the provisioning process

The architecture in the illustration uses webhooks, which are user-defined HTTP callbacks that you can trigger by sending an SMS. A small web application accepts these HTTP requests and sends them to Amazon API Gateway. The requests are integrated with the Lambda function that we created earlier for provisioning cloud volumes. To learn more, check out this demo of the provisioning process:
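To sketch the front end of that flow: with an API Gateway proxy integration, the webhook’s HTTP body arrives in the Lambda handler as event["body"], from which the desired volume name can be extracted before calling the provisioning API. The payload field and command format below are illustrative assumptions, not the exact message your SMS provider sends.

```python
import json

def parse_provision_request(event):
    # Hypothetical parser for an API Gateway (proxy) event whose body
    # carries the SMS text, e.g. {"message": "create volume demo-vol01"}.
    body = json.loads(event.get("body") or "{}")
    message = body.get("message", "")
    parts = message.split()
    # Expect a command of the form: "create volume <name>"
    if len(parts) == 3 and parts[:2] == ["create", "volume"]:
        return parts[2]
    return None
```

The returned name would then feed a POST to the Cloud Volumes Service file systems endpoint, reusing the headers from the sample code above.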


Integration—Plain and Simple

Cloud Volumes Service offers an easy path for migrating NAS-dependent applications to the cloud and is backed by trusted NetApp storage technology, providing the flexibility required by mission-critical applications. By taking advantage of the Cloud Volumes Service API, you can easily extend this powerful service to your existing automation solutions, such as AWS Lambda.

Are You Ready to Give It a Try?

Try out Cloud Volumes Service today or request a demo to get a firsthand look!
