NetApp Data Fabric: A la Hybrid Cloud! – An update from NetApp Insight 2018


History

For those of you who have genuinely been following NetApp as a storage company over the years, you may already know that NetApp, contrary to popular belief, has always been a software company at its core. Unlike most of their competitors back in the day, such as EMC or even HPE, who were focused primarily on raw hardware capabilities and purpose-built storage offerings specific to each use case, NetApp always had a single storage solution (the FAS platform) with fit-for-purpose hardware. Their real strength, however, was the piece of software they developed on top (Data ONTAP), which offered so many different data services that the competition would often need two or three separate solutions to achieve the same. That software driven innovation kept them punching well above their weight, in the same league as their much bigger competitors.

Over the last few years, however, NetApp did expand their storage offerings to include some additional purpose-built storage solutions, out of necessity, to address many niche customer use cases. They built the E-Series for raw performance use cases with minimal data services and the EF-Series for extreme all-flash performance, and acquired SolidFire, which was also a very software driven, scalable storage solution built on commodity hardware. The key to most of these offerings was still the software defined storage and data management capabilities of each platform, and the integration of all of them through software technologies such as SnapMirror and SnapVault to move data seamlessly between these various platforms.

In an increasingly software defined world (Public and Private clouds are all powered primarily through software), this model of leading with software defined data storage and data management services, as it turned out, opened up many additional possibilities for NetApp beyond just these data center solutions.

NetApp Data Fabric

NetApp Data Fabric is the extension of ONTAP and those various other software centric storage capabilities beyond customer data centers, into other compute platforms such as public clouds and 3rd party co-location facilities, a vision NetApp set out a while ago.

The idea is that customers can seamlessly move data across all of these infrastructure platforms as and when needed, without having to modify (think “convert”) the data. At its core, NetApp’s Data Fabric aims to address the data mobility problem caused by platform locking of data, by providing a common layer of core NetApp technologies to host data across all those tiers in a similar manner. In addition, it aims to provide a common set of tools that can be used to manage that data, on any platform, throughout its lifetime: from the initial creation of data at the Edge, to processing the data at the Core (DC) and / or on various cloud platforms, through to long term and archival storage at the Core and / or on public cloud platforms. In a way, this provides customers with platform neutrality when it comes to their data which, let’s admit it, is the lifeblood of most digital (that means all) businesses of today.

New NetApp Data Fabric

Insight 2018 showcased how NetApp has managed to extend the initial scope of their Data Fabric vision beyond the Hybrid Cloud to new platforms such as Edge locations too, connecting customer data across Edge to Core (DC) to Cloud platforms and providing data portability. In addition, NetApp also launched a number of new data services to help manage and monitor that data as it moves from one pillar to another across the Data Fabric. NetApp CEO George Kurian described this new Data Fabric as a way of “Simplifying and integrating orchestration of data services across the Hybrid Cloud providing data visibility, protection and control amongst other features”. In a way, it’s very similar to VMware’s “Any App, Any Device, Any Cloud” vision, but in the case of NetApp, the focus is all about the data and data services.

The new NetApp Data Fabric consists of the following key data storage components at each of its pillars.

NetApp Hybrid Cloud Data Storage
  • Private data center
    • NetApp FAS / SolidFire / E / EF / StorageGRID series storage platforms & the AltaVault backup appliance. Most of these components now integrate directly with public cloud platforms.
  • Public Cloud
    • NetApp Cloud Volumes – SaaS solution that provides file services (NFS & SMB) on the cloud, using a NetApp FAS xxxx SAN/NAS array running Data ONTAP that is tightly integrated with the native cloud platform.
    • Azure NetApp Files – PaaS solution running on physical NetApp FAS storage in Azure DCs. Directly integrated into Azure Resource Manager for native storage provisioning and management.
    • Cloud Volumes ONTAP – A NetApp ONTAP virtual appliance that runs the same ONTAP code in the cloud. Can be used for production workloads, DR, file shares and DB storage, same as on-premises. Includes cloud tiering and Trident container support (see the Kubernetes-side sketch after this list) as well as SnapLock for WORM data retention.
  • Co-Lo (adjacent to public clouds)
    • NetApp Private Storage – A dedicated, physical NetApp FAS (ONTAP) or FlexArray storage solution owned by the customer, physically adjacent to the major cloud platform infrastructures. The storage unit is hosted in an Equinix data center with direct, low latency 10GbE links to the Azure, AWS and GCP cloud back ends. Workloads such as VMs and applications deployed in the native cloud platform can consume data directly over this low latency link.
  • Edge locations
    • NetApp HCI – Recently repositioned as a “Hybrid Cloud Infrastructure” rather than a “Hyper-Converged Infrastructure”, this solution provides a native NetApp compute + storage solution that is tightly integrated with some of the key data services & monitoring and management solutions from the Data Fabric (described below).
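To make the Trident integration mentioned above a little more concrete, below is a minimal sketch (not NetApp reference code) of how a containerised workload might request persistent storage from a Trident-provisioned storage class, using the official Kubernetes Python client. The storage class name ("ontap-nas"), namespace and claim name are assumptions for illustration only; they depend entirely on how Trident is configured against the underlying ONTAP / Cloud Volumes ONTAP back end.

# Minimal sketch: requesting a persistent volume from an assumed
# Trident-backed storage class via the official Kubernetes Python client.
from kubernetes import client, config

config.load_kube_config()  # or config.load_incluster_config() when running inside a pod
core_v1 = client.CoreV1Api()

pvc = client.V1PersistentVolumeClaim(
    metadata=client.V1ObjectMeta(name="demo-claim"),  # hypothetical claim name
    spec=client.V1PersistentVolumeClaimSpec(
        access_modes=["ReadWriteOnce"],
        storage_class_name="ontap-nas",  # assumed name of a Trident-managed class
        resources=client.V1ResourceRequirements(requests={"storage": "10Gi"}),
    ),
)

# Trident, acting as the dynamic provisioner for that storage class, would then
# carve the backing volume out of the ONTAP / Cloud Volumes ONTAP system it manages.
core_v1.create_namespaced_persistent_volume_claim(namespace="default", body=pvc)

The point of the sketch is simply that the same claim could work whether the class is backed by on-premises ONTAP or by Cloud Volumes ONTAP in a public cloud, which is what makes the Data Fabric idea interesting from a container workload’s perspective.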

Data Fabric + NetApp Cloud Services

While the core storage infrastructure components of the Data Fabric enable data mobility without the need to transform data at each hop, customers still need tools to be able to provision, manage and monitor that data on each pillar of the Data Fabric. Furthermore, customers would also need to use these tools to manage data on non-NetApp platforms that are linked to the Data Fabric storage pillars described above (such as native cloud platforms).

Insight 2018 (US) revealed the launch of some of these brand new data services and tools from NetApp, most of which are actually SaaS solutions hosted and managed by NetApp themselves on a cloud platform. While some of these services are fully live and GA, not all of them are live just yet, but customers can trial them all for free today.

Given below is a full list of the announced NetApp Cloud services, which fall into two categories. By design, these are tightly integrated with all the data storage pillars of the NetApp Data Fabric as well as 3rd party storage and compute platforms such as AWS, Azure and 3rd party data center components.

NetApp Hybrid Cloud Data Services (New)

  • NetApp OnCommand Cloud Manager – Deploy and manage Cloud Volumes ONTAP as well as discover and provision on-premises ONTAP clusters. Available as SaaS or as on-premises software.
  • NetApp Cloud Sync – A NetApp SaaS offering that enables easier, automated data migration & synchronisation between NetApp and non-NetApp storage platforms across the hybrid cloud. Currently supports syncing data across AWS (S3, EFS), Azure (Blob), GCP (Storage buckets), IBM (Object Storage) and NetApp StorageGRID. A minimal object-store copy sketch follows these lists.
  • NetApp Cloud Secure – A NetApp SaaS security tool that aims to identify malicious data access across all Hybrid Cloud storage solutions. Connects to various storage back ends via a data collector and supports NetApp Cloud Volumes, ONTAP, StorageGRID, Microsoft OneDrive, AWS, Google G Suite, HPE Command View, Dropbox, Box, Workplace and Office 365 as end points to be monitored. Not live yet and more details here.
  • NetApp Cloud Tiering – Based on ONTAP FabricPool, enables direct tiering of infrequently used data from an ONTAP solution (on premises or in the cloud) seamlessly to Azure Blob, AWS S3 and IBM Cloud Object Storage. Not a live solution just yet, but a technical preview is available.
  • NetApp SaaS Backup – A NetApp SaaS backup solution for backing up Office 365 (Exchange Online, SharePoint Online, OneDrive for Business, MS Teams and O365 Groups) as well as Salesforce data. Formerly known as NetApp Cloud Control. Can back up data to native storage or to Azure Blob or AWS S3. Additional info here.
  • NetApp Cloud Backup – Another NetApp SaaS offering, purpose built for backing up NetApp Cloud Volumes (described above).

NetApp Cloud Management & Monitoring (New)

  • NetApp Kubernetes Service – New NetApp SaaS offering to provide enterprise Kubernetes as a service. Built around NetApp’s acquisition of StackPointCloud. Integrated with other NetApp Data Fabric components (NetApp’s own solutions) as well as public cloud platforms (Azure, AWS and GCP) to enable container orchestration across the board. Integrates with NetApp Trident for persistent storage volumes.
  • NetApp Cloud Insights – Another NetApp SaaS offering, built around Active IQ, that provides a single monitoring tool for visibility across the hybrid cloud and Data Fabric components. Uses AI & ML for predictive analytics, proactive failure prevention and dynamic topology mapping, and can also be used for resource rightsizing and troubleshooting with infrastructure correlation capabilities.
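As referenced in the Cloud Sync entry above, here is a minimal, illustrative sketch of the kind of object-store to object-store copy that a service like Cloud Sync automates. To be clear, this is not the Cloud Sync service or its API, just plain boto3 against S3, and the bucket names and prefix are hypothetical.

# Illustrative only: a plain boto3 S3-to-S3 copy, NOT the NetApp Cloud Sync API.
# Bucket names and prefix are hypothetical placeholders.
import boto3

s3 = boto3.client("s3")
SOURCE_BUCKET = "on-prem-exported-data"   # hypothetical source bucket
TARGET_BUCKET = "cloud-sync-target"       # hypothetical target bucket
PREFIX = "projects/"

paginator = s3.get_paginator("list_objects_v2")
for page in paginator.paginate(Bucket=SOURCE_BUCKET, Prefix=PREFIX):
    for obj in page.get("Contents", []):
        # Server-side copy; object data does not flow through the client machine.
        s3.copy_object(
            Bucket=TARGET_BUCKET,
            Key=obj["Key"],
            CopySource={"Bucket": SOURCE_BUCKET, "Key": obj["Key"]},
        )

The value of a managed service such as Cloud Sync is, of course, that it handles things like scheduling, incremental synchronisation and the various NFS / SMB / object endpoint permutations that a simple script like this does not.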

My thoughts

In the world of Hybrid Cloud, customer data, from VMs to file data, can now be stored in various different ways across data centers, Edge locations and public cloud platforms, all underpinned by different sets of technologies. This presents an inevitable problem for customers where their data requires transformation each time it gets moved or copied from one pillar to another (known as platform locking of data). It also means it is difficult to seamlessly move that data across those platforms during its lifetime, should you want to benefit from every pillar of the Hybrid Cloud and the different advantages inherent to each. NetApp’s new strategy, powered by a common software layer to store, move and manage customer data seamlessly across all these platforms, can resonate well with customers. By continuing to focus on the customer’s data, NetApp are focusing on the most important asset that organisations of today, and most definitely the organisations of tomorrow, have. So enabling their customers to avoid unnecessary hurdles when moving this asset from one platform to another is only going to go down well with enterprise customers.

This strategy is very similar to VMware’s, for example (Any App, Any Device, Any Cloud), which aims to address the same problem, albeit from a more application centric perspective. To their credit, NetApp is the only “legacy storage vendor” with this all-encompassing strategy of a common data storage layer across the full hybrid cloud spectrum, whereas most of their competition are still focused on their data centre solutions, with limited or minor integration to cloud through extended backup and DR capabilities at best.

Only time will tell how successful this strategy will be for NetApp, and I suspect most of that success or failure will rely on continued execution: building additional data and data management services and positioning them to address various Hybrid Cloud use cases. But the initial feedback from customers appears to be positive, which is good to see. A focus on software innovation has always provided NetApp with an edge over their competitors, and continuing on that strategy, especially in an increasingly software defined world, is only bound to bring good things in my view.

Slide credit to NetApp & Tech Field Day!

Continuation of Any Cloud, Any Device & Any App strategy – An update from VMworld 2018 Europe

The beginning

As an avid technologist, I’ve always had a thing for disruptive technologies, especially those that are not just cool tech but also provide genuine business benefits. Some of these benefits are obvious at first, but some are often not even anticipated until after a technology innovation has been achieved.

VMware’s inception, through the emulation of x86 computing components within software, was one of those moments where the power of software driven computing started a whole new shift in the IT industry. In an age of hardware centric IT, this software defined computing technology paved the way for genuine cost savings through the consolidation of multiple servers into a handful of servers instead. For me, back then a lowly server engineer, my introduction to this technology was one of those “goose bump” moments, especially when I thought about where this innovation could take us going forward once it was extended beyond just computing.

Fast forward about 12 years, and software defined capabilities had extended beyond compute into storage and networking too, paving the way for brand new possibilities such as cloud computing. Recognising the commoditisation of this software defined approach by various other vendors, VMware strategically changed direction to focus on building tools and solutions that give customers the choice to run any application, on any cloud platform, accessible by any end user device (PC & mobile). This strategy was launched back in 2015 and I’ve blogged about it here.

Continuation of a solid strategy

Following on from vSphere, vSAN and NSX as the pillars of the core software defined data center (SDDC), the last couple of years showed how this vision from VMware was coming into reality through the launch of various new solutions as well as the modernisation of existing ones. IBM Cloud (based on the SDDC) and VMware Cloud on AWS (based on the SDDC) were launched to harness cloud computing capabilities for customers without having to re-platform their workloads, saving transformation costs. Along with over 2,000 VMware Cloud Provider partner platforms (built on the SDDC), all of which run these very same technologies underneath their cloud platforms, this common architecture enabled customers to move their workloads from on-premises to any of these platforms relatively easily. The introduction of technologies such as VMware HCX last year made it even easier, with one click migration of these workloads as well as the ability to move a running workload onto a cloud platform with zero downtime (Cloud Motion).

In addition to the core infrastructure components, the existing infrastructure management and monitoring toolset deployed on-premises (the vRealize suite) was also revamped over the last few years so that it can manage and monitor environments across all these cloud platforms. The vRealize suite is now one of the best Cloud Management Platforms, able to provision workloads on-premises and on native cloud platforms such as AWS and Azure, providing a single pane of glass.

NSX capabilities were also extended to cloud platforms to effectively bring them closer to on-premises data centers via network adjacency, providing customers easy migration and fallback choices while maintaining networking integrity across both platforms. With these updates, the vision of “Any Cloud” became more of a reality, though most of the use cases were limited to IaaS capabilities across the cloud platforms.

During the last year, VMware also launched a number of fully managed, born in the cloud SaaS applications under the category of VMware Cloud Services (v1.0), aimed at extending these “Any Cloud” capabilities to cover non-IaaS platforms. These SaaS offerings enabled the ability to provision, manage and run cloud native workloads on non-vSphere based cloud platforms such as Azure and native AWS. They extended the “Any Cloud” capabilities right into various PaaS platforms too, enabling better value for customers. A list of these new solutions and updates was included in my previous post here.

The last few years also showed us how VMware intended to achieve the “Any Device” vision through the Workspace ONE platform & AirWatch. Incremental feature upgrades ensured support for a wide array of end user computing and mobile devices to consume various enterprise IT services in a consistent, secure manner, regardless of where the applications and the data are hosted (on-premises or cloud). These updates include support for key non-vSphere based cloud platforms and even competing technologies such as Citrix, providing customers plenty of choice to use any device to access applications hosted via all the major avenues such as Mobile / PC / VDI / Citrix / Microsoft RDS.

The “Any App” vision of enabling customers to deploy and run any application was all about providing support for traditional (VM) based apps, micro-services based apps (containers) and SaaS apps. A partnership with Google was formed and new products such as PKS were launched to provision, manage and run container workloads via an enterprise grade Kubernetes platform, both on premises as well as on cloud platforms, making the Any App strategy also a reality.

Update in 2018!

2018’s VMworld (Europe) messaging was very much an incremental continuation of this same multi-platform, multi-app and multi-device strategy, adding additional capabilities for core use cases. Some of the new updates also showed how VMware are adding new use cases such as Edge computing and IoT solutions into the mix.

Some of the key updates to note from VMworld 2018 include,

  • Heptio acquisition:    To strengthen VMware’s Kubernetes platform offerings (complements the on-premises focused PKS as well as VKE, the SaaS offering for managed Kubernetes)
  • VMware Cloud PKS:    PKS as a Service (managed by VMware) on AWS with support coming for VMware Cloud on AWS, Azure, GCP and vSphere
  • Project Dimension:    Fully managed VMware Cloud Foundation solution for on-premises with Hybrid Cloud control plane. Beta announced!
  • Launch of VCF 3.5:    Latest version of Cloud Foundation with incremental updates and cloud integration via HCX.
  • CloudHealth in VCS:    Integration of the recently acquired CloudHealth into the VMware Cloud Services (SaaS) portfolio, extending cloud platform cost monitoring and resource management as a SaaS offering with better cloud scalability than vROps
  • Pulse IoT center aaS:    IoT Infrastructure management solution previously available as an on-premises solution now available as a service. Beta announced!
  • New SaaS solutions:    Additional solutions were announced, such as Cloud Assembly (vRA as a Service), Service Broker & Code Stream, to enhance DevOps app delivery & management.
  • VMware Blockchain:    An enterprise blockchain service, positioned as inherently more secure than public blockchains, integrated with underlying VMware tools and technologies for enterprises to consume.

Amongst these, there were also other minor incremental updates to existing tools and solutions: vRealize Suite 2018, Log Intelligence, Wavefront updates to provide application telemetry data (similar to AppDynamics) from container based deployments, vSphere & vSAN incremental updates, the availability of vSphere Platinum edition (with AppDefense bundled in) that learns good app behaviour, locks that state in and adapts security based on changes to the application, adaptive micro-segmentation via the integration of NSX & AppDefense, increased geo availability of VMware Cloud on AWS (Ireland, Tokyo, N. California, Ohio, GovCloud West) and the availability of AWS RDS on vSphere on premises, to name a few.

In addition to the above, and building on the previously established Any Cloud, Any Device & Any App strategy, VMware are also embracing different target markets such as Telco clouds by offering industry specific solutions through the use of their VeloCloud technologies, in preparation for the 5G revolution that is imminent in the industry. Large telcos such as Vodafone are helping VMware co-engineer and test these solutions to ensure their business relevance.

So all in all, there weren’t any attention grabbing headline announcements at this year’s VMworld; the focus was rather on providing evidence of the execution of the strategy set back in 2015/2016. VMware’s increasing pivot to cloud based solutions is becoming more and more obvious, as almost all the net new products and solutions announced at the 2017 and 2018 VMworlds are SaaS offerings managed by VMware. This is a powerful message and customers seem to take note too, if the record breaking 12,000 attendees of VMworld 2018 Europe are anything to go by.

As I mentioned at the beginning of this post, as these technology updates and new innovations continue, no doubt additional use cases will be realised, and associated business requirements previously not envisioned will be established. In an age of rapid technological advancement that often drives new business requirements retrospectively, I like how VMware are pushing ahead with a coherent technology strategy focused on providing customers the choice to benefit from innovations across these technology platforms.