Of all the critical aspects of cloud-based file services (security, compliance, access control, flexibility, availability), high performance may be the most demanding to deliver.
In this blog, we’ll examine the results of a File Server Capacity Tool (FSCT) benchmark test that rates the performance of NetApp® Cloud Volumes Service for AWS. Because the product caters to multiple enterprise cloud-based file share use cases, the results should help customers select the right solution based on their individual performance requirements.
Cloud Volumes Service uses industry-leading NetApp® ONTAP technology to deliver a secure, flexible, and high-performance cloud-based NAS file service in AWS. Several of the service's features and advantages, including data protection, security integration, and flexible pricing, are covered later in this post.
To benchmark the performance of Cloud Volumes Service for AWS, we measured how a home directory workload performs at peak usage. To do that, we used the File Server Capacity Tool (FSCT).
Home directories have traditionally been a quintessential component of every organization's file- and folder-sharing structure. In large organizations, the data in these file shares is often accessed simultaneously by thousands of users for their day-to-day work. The underlying storage system or service that hosts the home directories and file shares should be capable of handling storage read/write requests from these users—especially during peak office hours—without any performance degradation.
The File Server Capacity Tool (FSCT) can be used to simulate a peak usage scenario for a home directory: multiple SMB requests from clients are generated to put a load on the storage system hosting the users' home directories (known as the "home folder workload" in FSCT terminology), and its performance is measured. The FSCT controller is configured to initiate the test and collect performance data using perfmon counters, and a client computer generates the SMB requests for the stress test. The Windows Server perfmon counters give insight into storage performance and throughput while the test is in progress, and this data is stored on the FSCT controller once the test completes.
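If you want to look at the collected counters outside of FSCT's own reports, a minimal sketch like the one below can summarize a perfmon CSV export. The file name, the counter substring, and the assumption that the export has a timestamp in its first column are ours for illustration; actual counter paths vary by server and locale.

```python
# Minimal sketch: summarize one counter from a perfmon CSV export collected during a test run.
# Assumptions (not from FSCT documentation): the CSV was exported by perfmon/relog,
# the first column is the timestamp, and counter_substring matches a column that
# actually exists in your export.
import csv
import statistics

def summarize_counter(csv_path: str, counter_substring: str) -> None:
    with open(csv_path, newline="", encoding="utf-8-sig") as f:
        reader = csv.reader(f)
        header = next(reader)
        # Find the first column whose counter path contains the requested substring.
        col = next((i for i, name in enumerate(header)
                    if counter_substring.lower() in name.lower()), None)
        if col is None:
            raise ValueError(f"No counter column matches {counter_substring!r}")
        values = []
        for row in reader:
            try:
                values.append(float(row[col]))
            except (ValueError, IndexError):
                continue  # skip blank samples that perfmon sometimes writes
    if values:
        print(f"{header[col]}: samples={len(values)} "
              f"avg={statistics.mean(values):.1f} max={max(values):.1f}")

# Example call (hypothetical file and counter name):
# summarize_counter("fsct_run.csv", "Bytes/sec")
```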
We conducted the benchmark using the FSCT home folder workload, with the target home directories hosted on Cloud Volumes Service for AWS. The test simulated a load of more than 12,000 concurrent users accessing their home directories, representative of a typical enterprise-class deployment. The user load was systematically increased to test the maximum capabilities of Cloud Volumes Service for AWS.
Workload metadata operations are as crucial as data read/write operations, and as such metadata retrieval speeds have a great impact on performance. In FSCT, performance is measured in terms of users supported without “overload,” or the point at which the input to the system exceeds its processing capability. It also considers the following:
The following chart shows the FSCT users-versus-throughput results for NetApp Cloud Volumes Service in our test:
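For anyone reproducing a similar test, the sketch below shows one way to estimate the overload point from users-versus-throughput samples, that is, the last load level at which added users still produce a meaningful throughput gain. The numbers in it are hypothetical placeholders, not our measured results.

```python
# Minimal sketch: estimate the "overload" point from (users, throughput) samples,
# i.e., the first load step where throughput no longer scales with added users.
# The sample values below are illustrative only.
samples = [
    (2000, 210.0),    # (concurrent users, MB/s) - hypothetical values
    (4000, 400.0),
    (8000, 760.0),
    (12000, 1100.0),
    (14000, 1120.0),  # throughput flattens: adding users no longer adds throughput
]

def find_overload(points, min_gain=0.05):
    """Return the last user count before throughput gains fall below min_gain (5%)."""
    for (u1, t1), (u2, t2) in zip(points, points[1:]):
        if (t2 - t1) / t1 < min_gain:
            return u1  # last load level handled without overload
    return points[-1][0]

print("Estimated max users without overload:", find_overload(samples))
```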
The results of the FSCT benchmark test can lead to the following conclusions:
Weekday and weekend cost calculation:
260 weekdays = $0.0455 per user * 5
105 weekend days = $0.023 per user * 2
Average yearly cost per user = $0.034
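As a rough illustration of the blended-rate arithmetic behind figures like those above, here is a minimal sketch. It assumes the weekday and weekend numbers are per-user, per-day rates, which is our interpretation rather than something stated in the pricing model, so treat its output as illustrative only.

```python
# Minimal sketch of blended weekday/weekend cost arithmetic.
# Assumption: the rates above are per user, per day (illustrative interpretation).
WEEKDAY_RATE = 0.0455        # $ per user per weekday
WEEKEND_RATE = 0.023         # $ per user per weekend day
WEEKDAYS_PER_YEAR = 260
WEEKEND_DAYS_PER_YEAR = 105

yearly_cost_per_user = (WEEKDAYS_PER_YEAR * WEEKDAY_RATE
                        + WEEKEND_DAYS_PER_YEAR * WEEKEND_RATE)
blended_daily_rate = yearly_cost_per_user / (WEEKDAYS_PER_YEAR + WEEKEND_DAYS_PER_YEAR)

print(f"Yearly cost per user: ${yearly_cost_per_user:.2f}")
print(f"Blended daily rate per user: ${blended_daily_rate:.4f}")
```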
Our FSCT benchmark test results show that NetApp Cloud Volumes Service for AWS delivers strong performance during peak utilization hours for a home directory workload. Cloud Volumes Service also provides flexibility for customers in terms of data protection capabilities, security, pricing, and more.
For instance, Cloud Volumes Service offers multiple disaster recovery (DR) options for your mission-critical data: volume-level restore using Snapshot backups and file-level restore through the Cloud Backup Service (in beta release).
When it comes to security, Cloud Volumes Service can be integrated with AWS Managed Microsoft AD or with a standalone Microsoft AD configured by the user.
Cloud Volumes Service also has a flexible pricing model. Pricing is based on the performance tier selected (Standard, Premium, or Extreme), and the tier can be changed on the fly to match the target use case. In addition, all the data-management features of Cloud Volumes Service are accessible from a single UI, with minimal setup and configuration overhead.
From these findings, Cloud Volumes Service for AWS emerges as a best-fit solution not just for home directories, but also for enterprise-grade cloud file share use cases such as data analytics, application migration, content management, and more.
Learn more about Cloud Volumes Service for AWS or take it for a spin.