October 25, 2017
When NetApp decided to create the Data Fabric concept, and to align all its production capabilities behind that idea, a quite subtle change happened in the company.
Subtle, but important. NetApp began the inevitable shift to delivering technology, not just arrays.
For a hundred reasons – economics, op-ex, ease of deployment, scale, and so on – customers have made their desire for choice in the technology space manifestly obvious. Simply having a hardware array, even if it is a “hot” item with a cool bezel and GUI, doesn’t cut it.
Customers want software-defined options, a choice of scale-up and scale-down, and cloud deployment options, and they want them available in a variety of financial models:
- Licensing only
- Service delivery
- Capital expense
And here’s the rub – they want the same products delivered in all those ways at once. Silos are OUT.
The natural way to provide this capability to the customer base (and customers-to-be) is to abstract the array – that is, deliver the technology that’s inside the array, but make it available in lots of different wrappers.
So ONTAP becomes abstracted from the FAS and AFF hardware, able to run in hyperscalers via Cloud Volumes ONTAP (formerly ONTAP Cloud) and on third-party hardware via ONTAP Select. SolidFire’s Element OS runs on its engineered hardware and also on Cisco’s hardware as part of the FlexPod SF. AltaVault runs on virtualized instances in addition to its engineered hardware. And so on.
But there is a deeper push to this abstraction of technology. In addition to providing software-defined whole products for purchase or license to end-users, individual elements of technology can be driven into services provided by other companies.
In this scenario, a service or a product might be offered to a customer or end user, and NetApp technology would be powering it, but it wouldn’t be obvious. The technology would operate behind the scenes, as it were. The actual hardware or software isn’t advertised, marketed, or even known to the end user. The customer simply interacts with an SLA and through an interface. The sausage-making is hidden.
All these thoughts came into sharp focus at NetApp’s Insight conference in early October, when NetApp and Microsoft announced that Azure would begin offering enterprise-class NFS services through its customer console early next year. A preview sign-up is already underway.
It’s hard to overstate how big and important this announcement really is. We’re not talking about Cloud Volumes ONTAP running as a pay-go license in Azure; that particular solution has been available in the marketplace for quite a while.
No, this is the next step – a full abstraction of the best enterprise-class NFS solution in the world, offered as a service by Microsoft natively. It’s an SLA and an interface, if you will, and nary a mention need be made of what hardware or software is behind it.
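Because the service surfaces as plain NFS, the “interface” is nothing more exotic than a standard mount. As a sketch – the endpoint address and export name below are hypothetical illustrations, not Azure specifics – a Linux client might consume such a volume with an /etc/fstab entry like:

```
# /etc/fstab – hypothetical NFS endpoint and export name.
# The backing hardware and software are invisible to the client.
10.0.0.4:/myvolume  /mnt/data  nfs  rw,vers=4.1  0  0
```

Nothing in that line reveals what sits behind the export – which is exactly the point of the abstraction.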
The important thing is that NetApp technology powers it. And Microsoft has chosen NetApp to be able to drive that enterprise-class technology into Azure, providing it to all customers directly from their user console.
Other services will come, obviously. But at this moment the implications for NetApp are in motion and moving fast: abstract the technology, offer it in a number of packages, and provide real value. Consumption will follow, sometimes in radical and unusual ways.
Thank goodness for that.
Want to get started? Try out Cloud Volumes ONTAP today with a 30-day free trial.