VMworld 2017 US – VMware Strategy & My Thoughts

This is a quick post to summarise the key announcements from the VMworld 2017 US event and share my thoughts and insights on VMware's strategy and direction, the way I see it.

Key Announcements

A number of announcements were made during the week on products and solutions; below is a high-level recap.

  • Announced the launch of VMware Cloud Services, which consists of 2 main components
    • VMware Cloud on AWS (VMC)
      • Consists of VMware vSphere + vSAN + NSX
      • Running on AWS data centers (bare metal)
      • A complete Public Cloud platform consisting of VMware Software Defined Data Center components
      • Available as a
    • A complete hybrid cloud infrastructure security, management, monitoring & automation solution made available through a Software as a Service (SaaS) platform
      • Works natively with VMware Cloud on AWS
      • Also works with legacy, on-premises VMware data centers
      • Also works with native AWS, Azure and Google public cloud platforms
  • Next-generation network virtualisation solution based on NSX-T (aka NSX Multi-Hypervisor)
    • Version 2.0 announced
    • Supports vSphere & KVM
    • Likely going to be strategically more important to VMware than NSX-v (the vSphere-specific NSX commonly used by vSphere customers today). Think of what ESXi was to VMware in the early days, when ESX was still around!

  • Next version of vRealize Network Insight (version 3.5) released
    • Various cloud platform integrations
    • Additional on-premises 3rd party integrations (Check Point FW, HP OneView, Brocade MLX)
    • Support for additional NSX component integration (IPFIX, Edge dashboard, NSX-v DFW PCI dashboard)

  • VMware AppDefense
    • A brand new application security solution, available via a VMware Cloud Services subscription

  • VMware Pivotal Container Service (PKS), a joint collaboration between VMware, Pivotal & Google (Kubernetes)
    • Kubernetes support across the full VMware stack including NSX & vSAN
    • Support for serverless capabilities using Functions as a Service (similar to AWS Lambda or Azure Functions)
    • Enabling persistent storage for stateful applications via the vSphere Cloud Provider, which provides access to vSphere storage backed by vSAN or traditional SAN and NAS storage
    • Automation and governance via vRealize Automation, and provisioning of service provider clouds with vCloud Director
    • Monitoring and troubleshooting of virtual infrastructure via VMware vRealize Operations
    • Metrics monitoring of containerized applications via Wavefront.
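
The Functions-as-a-Service capability mentioned above follows the same basic model as AWS Lambda: named functions are registered with a platform and invoked on demand with an event payload. The sketch below is purely illustrative of that model; the class and function names are my own invention, not a VMware/PKS API.

```python
# Illustrative sketch of the Functions-as-a-Service model (similar to
# AWS Lambda): functions are registered by name and invoked on demand
# with an event payload. All names are hypothetical, not a PKS API.

class FunctionRegistry:
    def __init__(self):
        self._functions = {}

    def register(self, name):
        # Decorator that records a handler under the given name.
        def decorator(fn):
            self._functions[name] = fn
            return fn
        return decorator

    def invoke(self, name, event):
        if name not in self._functions:
            raise KeyError(f"no function registered as {name!r}")
        return self._functions[name](event)

faas = FunctionRegistry()

@faas.register("thumbnail")
def make_thumbnail(event):
    # A real handler would transform the payload; here we just echo it.
    return {"status": "ok", "source": event["image"]}

print(faas.invoke("thumbnail", {"image": "photo.png"}))
```

The point of the model is that the caller never provisions a VM or container; the platform maps the function name to a handler and runs it only for the duration of the invocation.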

  • Workspace One enhancements and updates
    • Single UEM platform for Windows, macOS, Chrome OS, iOS and Android
    • Integration with unique 3rd party endpoint platform APIs
    • Offers cloud-based peer-to-peer software distribution to deploy large apps at scale
    • Support for managing Chrome devices
    • Provides customers the ability to enforce & manage O365 security and DLP policies alongside all of their applications and devices
    • Workspace One Intelligence to provide insights and automation to enhance user experience (GA Q4 FY18)
  • VMware Integrated OpenStack 4.0 announced
    • OpenStack Ocata integration
    • Additional features include
      • Containerized apps alongside traditional apps in production on OpenStack
      • vRealize Automation integration to enable OpenStack users to use vRealize Automation-based policies and to consume OpenStack components within vRealize Automation blueprints
      • Increased scale and isolation for OpenStack clouds enabled through new multi-VMware vCenter support
    • New pricing & packaging tier (no longer free)
  • VMware Skyline
    • A new proactive support offering aligned to global support services
    • Available to Premier support customers (North America initially)
    • Requires an appliance deployment on-premises
    • Quicker time to incident resolution

Cross Cloud Architecture Strategy & My Thoughts

VMware announced the Cross Cloud Architecture (CCA) back at VMworld 2016, where they set out the vision of providing customers the capability to run & manage any application, on any cloud, using any device. This was ambitious and was seen as the first step towards VMware recognising that running vSphere on-premises should no longer be their main focus, and that they want to provide customers with choice.

These platform options were to be:

  • Continue to run vSphere on-premises if that is what you want to do
  • OR, run the same vSphere-based SDDC stack in the cloud, which can be spun up in minutes in a fully automated way (IaaS)
  • OR, run the workloads that used to run on a VMware SDDC platform on a native public cloud platform such as AWS, Azure, Google Cloud or IBM Cloud

During that VMworld, VMware also demoed the clever use of NSX to bridge these various private and public cloud platforms by extending networks across all of them. VMworld 2017 has shown the additional steps VMware have taken to make this cross-cloud architecture even more of a reality. VMware Cloud on AWS (VMC) now lets you spin up a complete VMware Software Defined Data Center, running vSphere on vSAN connected by NSX, through a simple web page, much like the way the native Azure and AWS infrastructure platforms let you provision VM-based infrastructure on demand. Based on some initial articles, this could even be cheaper than running vSphere on-premises, which is great news for customers. When you factor in the rest of the Total Cost of Ownership, such as no longer needing to maintain on-premises skills to set up and manage the infrastructure platform, VMC is likely going to be extremely interesting to most customers. Most importantly, most customers will NOT need to go through costly re-architecting of their monolithic application estate to fit a native cloud IaaS platform, which simplifies cloud migration of that application stack. And if that is not enough, you can also carry on managing & securing those workloads using the same VMware management and security toolset, even in the cloud.
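
The TCO argument above is easy to sketch as back-of-the-envelope arithmetic. Every figure below is illustrative only (not real VMware or AWS pricing); the point is simply that hardware refresh and platform-admin effort belong in the on-premises total, not just the software licences.

```python
# Back-of-the-envelope TCO comparison of the kind described above.
# All numbers are ILLUSTRATIVE, not actual VMware/AWS pricing.

def on_prem_tco(years, hw_capex, annual_licences, annual_admin_cost):
    # Hardware bought once for the period, plus yearly licences and
    # the cost of in-house platform administration.
    return hw_capex + years * (annual_licences + annual_admin_cost)

def vmc_tco(years, annual_subscription):
    # The subscription bundles hardware, vSphere/vSAN/NSX and
    # platform operations, so there is nothing else to add.
    return years * annual_subscription

years = 3
on_prem = on_prem_tco(years, hw_capex=300_000,
                      annual_licences=80_000, annual_admin_cost=120_000)
vmc = vmc_tco(years, annual_subscription=250_000)
print(on_prem, vmc)  # compare totals over the period
```

With these made-up inputs the hosted option comes out cheaper over three years, which mirrors the claim in the early articles; with different inputs it may not, so any real comparison needs your own numbers.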

When you then consider the announcement of the VMware Cloud Services (VCS) offering as a SaaS solution, it enables integrating a complete VMware hybrid cloud management toolset into various platforms and workloads, irrespective of where they reside. VCS enables the discovery, monitoring, management and securing of those workloads across different platforms, all through a single pane of glass, which is a pretty powerful message that no other public cloud provider can claim to match in such a heterogeneous manner. This holistic management and security platform allows customers to provision, manage and secure any workload (monolithic or microservices based) on any platform (vSphere on-premises, VMC on AWS, native AWS, native Azure, native Google Cloud), accessed from any device (workstation, laptop, tablet or mobile). That to me is the true Cross Cloud vision becoming a reality, and my guess is that once the platform matures and its capabilities increase, it is going to be very popular amongst almost all customers.

In addition to these CCA capabilities, VMware appear to be shifting their focus from the infrastructure layer (read "virtual machine") to the actual application layer, concentrating more on enabling application transformation and application security, which is great to see. Like many others, VMware are embracing containers, not only as a better application architecture but also as the best way to decouple the application from the underlying infrastructure, using containers as a shipping mechanism to move applications out to the public cloud (& back). The announced integrations between their infrastructure stack and the container ecosystem, such as Kubernetes, testify to this and will likely be welcomed by customers; I'd expect such integration to continue to improve across all of VMware's SDDC infrastructure stack. With VMware solutions, you can now deploy container-based applications on on-premises vSphere using VIC or Photon, or on VMC or a native public cloud platform; store them on vSAN with volume plugins, on-premises or in the cloud; extend the network to the container instance via NSX (on-premises or in the cloud); extend visibility into the container instance via vRNI and vROps (on-premises or in the cloud); and automate provisioning or, most importantly, the migration of these container apps across on-premises and public cloud platforms as you see fit.

NSX Cloud, for example, will let you extend the unique capabilities of software-defined networking, such as micro-segmentation, security groups and overlay network extension, beyond private data centers to native public cloud platforms such as AWS & Azure (roadmap), which enriches the capabilities of the public cloud platform and increases the security available within the network.
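
The core idea of micro-segmentation is that firewall rules are evaluated against logical security groups that workloads belong to, not against IP ranges, so the policy follows the workload to whichever platform it lands on. The sketch below illustrates that evaluation model only; the group names and rule format are hypothetical, not the NSX API.

```python
# Minimal sketch of the micro-segmentation idea: rules reference
# logical security groups rather than network addresses, with a
# default-deny (zero trust) posture. Names are hypothetical.

GROUPS = {
    "web": {"web-01", "web-02"},
    "db":  {"db-01"},
}

# (source group, destination group, port) tuples that are allowed.
ALLOW = {("web", "db", 3306)}

def group_of(vm):
    for name, members in GROUPS.items():
        if vm in members:
            return name
    return None

def is_allowed(src_vm, dst_vm, port):
    # Only explicitly allowed flows pass; everything else is dropped.
    return (group_of(src_vm), group_of(dst_vm), port) in ALLOW

print(is_allowed("web-01", "db-01", 3306))  # True
print(is_allowed("db-01", "web-01", 22))    # False
```

Because the rule names groups instead of subnets, moving `web-01` from a private data center to a public cloud would not require rewriting the policy, which is exactly the portability benefit described above.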

My Thoughts

All in all, it was a great VMworld where VMware genuinely showcased their Hybrid Cloud and Cross Cloud Architecture strategy. As a technologist who has been working with VMware for a while, it was pretty obvious to me that a software-centric organisation like VMware, similar to the likes of Microsoft, was always going to embrace change, especially change driven by software such as the public cloud. However, many people, especially sales people in the industry I work in, as well as some customers, were starting to worry about the future of VMware and their relevance in the increasingly cloudy world ahead. This VMworld has shown all of them that VMware has a very credible working strategy to embrace software-defined cloud adoption and to empower customers to do the same, without any tie-in to a specific cloud platform. The soaring, all-time-high VMware share price suggests that analysts and industry experts agree.

If I was a customer, I would want nothing more!

Keen to get your thoughts – please share them via the comments below.

Other Minor VMworld 2017 (Vegas) Announcements

  • New VMware & HPE partnership for DaaS
    • Includes Workspace ONE in HPE DaaS
    • Includes Unified Endpoint Management through AirWatch
  • Dell EMC to offer data protection for VMC (VMware Cloud on AWS)
    • Includes Data Domain & the data protection app suite
    • Self-service capability
  • VCF related announcements
    • CenturyLink, Fujitsu & Rackspace to offer VCF + Services
    • New HCI and CI platforms (VxRack SDDC, HDS UCP-RS, Fujitsu PRIMEFLEX, QCT QxStack)
    • New VCF HW partners
      • Cisco
      • HDS
      • Fujitsu
      • Lenovo
  • vCloud Director v9 announced
    • GA Q3 FY18
  • New vSphere scale-out edition
    • Aimed at Big data and HPC workloads
    • Attractive price point
    • Big data specific features and resource optimisation within vSphere
    • Includes vDS
  • VMware Validated Design (VVD) 4.1 released
    • Include a new optional consolidated DC architecture for small deployments
  • New VMware and Fujitsu partnerships
    • Fujitsu Cloud Services to deliver VMware Cloud Services
  • DXC Technology partnership
    • Managed Cloud service with VMC
    • Workload portability between VMC, DXC DCs and customer’s own DCs
  • Re-announced VMware Pulse IoT Center with further integration into the VMware solutions stack to manage IoT components

Cheers

Chan

Apple WWDC 2017 – Artificial Intelligence, Virtual Reality & Mixed Reality

Introduction

As a technologist, I like to stay close to key new developments & trends in the world of digital technology to understand how they can help users address common day-to-day problems more efficiently. Digital disruption and the technologies behind it, such as Artificial Intelligence (AI), IoT, Virtual Reality (VR), Augmented Reality (AR) & Mixed Reality (MR), are hot topics as they have the potential to significantly reshape how consumers will consume products and services going forward. I am a keen follower of these disruptive technologies because, in my view, the potential impact they can have on traditional businesses in an increasingly digital, connected world is huge.

What I heard today from Apple, the largest tech vendor on the planet, about how they intend to use various AI, VR and AR technologies in their next product upgrades across the iPhone, iPad, Apple Watch, App Store, Mac, etc. made me want to summarise those announcements and add my thoughts on how Apple will potentially lead the way to mass adoption of such digital technologies by the organisations of tomorrow.

Apple’s WWDC 2017 announcements

I've been an Apple fan since the first iPhone launch, as they have been the prime example of a tech vendor that utilises cutting-edge technologies to provide elegant solutions to day-to-day requirements in a simple and effective manner with a rich user experience. I practically live on my iPhone every day for work and non-work activities, and also appreciate their other ecosystem products such as the MacBook, Apple Watch, Apple TV and the iPad. This is typically not because they are so technologically advanced, but simply because they provide a simple, seamless user experience that increases my productivity in day-to-day activities.

So naturally I was keen to find out about the latest announcements from Apple's World Wide Developer Conference, held earlier today in San Jose. Having listened to the event and the announcements, I was excited by the new product and software upgrades, but more than that, I was super excited about a couple of related technology integrations Apple are coming out with, mixing AI, VR & AR into their product offerings to provide an even better user experience.

Now before I go any further, I want to highlight that this is NOT a summary of their new product announcements. What interested me was not so much the new Apple products, but how Apple, as a pioneer in using cutting-edge technologies to create positive user experiences like no other technology vendor on the planet, is going to use these potentially revolutionary digital technologies. This is relevant to every single business out there that manufactures a product or provides a service or solution to its customers, as anyone can look to incorporate the same capabilities in a similar, or even more creative and innovative, manner than Apple to deliver a similarly positive user experience.

Use of Artificial Intelligence

Today Apple announced the increased use of various AI technologies across future Apple products, as summarised below.

  • Increased use of Artificial Intelligence technologies by the personal assistant "Siri", to provide a more positive & more personalised user experience
    • In the upcoming watchOS 4 for the Apple Watch, AI technologies such as machine learning are going to power the new Siri watch face, so that Siri can provide you with dynamic updates that are specifically relevant to you and what you do (context awareness)
    • The new iOS 11 will include a new voice for Siri, which uses Deep Learning technologies (AI) behind the scenes to offer a more natural and expressive voice that sounds less machine and more human.
    • Siri will also use machine learning on each device ("on-device learning") to understand what is most relevant to you based on what you do on your device, so that Siri can make more personalised interactions. In other words, Siri is becoming more context aware thanks to machine learning, providing a truly personal assistant service unique to each user, including predictive tips based on what you are likely to want to do or use next.
    • Siri will use machine learning to automatically memorise new words from the content you read (i.e. News), so these words are automatically included in the dictionary & predictive text if you want to type them
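
The "learn new words from what you read" behaviour described above can be pictured very simply: unseen words from read content are added to a per-user dictionary and then surface as predictive-text completions. The sketch below is purely illustrative of that flow; it is of course not how iOS actually implements it.

```python
# Illustrative sketch of on-device word learning for predictive text:
# new words seen in read content join the user's vocabulary and are
# then offered as completions. Not Apple's implementation.

import re

class UserDictionary:
    def __init__(self, base_vocabulary):
        self.vocabulary = set(base_vocabulary)

    def learn_from(self, text):
        # Add every word in the read content to the vocabulary.
        for word in re.findall(r"[a-z]+", text.lower()):
            self.vocabulary.add(word)

    def completions(self, prefix):
        # Offer all known words starting with the typed prefix.
        return sorted(w for w in self.vocabulary
                      if w.startswith(prefix.lower()))

d = UserDictionary({"hello", "help"})
d.learn_from("Reading about heliports in the news")
print(d.completions("hel"))  # ['heliports', 'hello', 'help']
```

A real implementation would also rank completions by usage frequency and recency, which is where the machine learning Apple describes comes in.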

  • Use of machine learning in iOS 11 within the Photos app to enable various new capabilities to make life easier with your photos
    • The next version of the Apple Mac OS, code-named High Sierra, will support additional features in the Photos app, including advanced face recognition capabilities that utilise AI technologies such as advanced convolutional neural networks to let you group / filter your photos based on who is actually in them
    • Machine learning capabilities will also be used to automatically understand the context of each photo, based on its content, to identify photos from occasions such as sporting events or weddings and automatically group them into events / memories
    • Using computer vision capabilities to create seamless loops from Live Photos
    • Use of machine learning to activate palm rejection on the iPad when writing with the Apple Pencil
    • Most machine learning capabilities are now available to 3rd party programmers via iOS APIs such as the Vision API (enabling iOS app developers to harness machine learning for face tracking, face detection, landmarks, text detection, rectangle detection, barcode detection, object tracking and image registration) and the Natural Language API (providing language identification, tokenisation, lemmatisation, part-of-speech tagging and named entity recognition)
    • Introduction of a machine learning model converter, so that 3rd party ML models can be converted to native iOS 11 Core ML functions.
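
The event grouping described above relies on machine learning over photo content; a much simpler stand-in is shown here, which groups photos into "events" whenever the gap between timestamps exceeds a threshold. It is purely illustrative of the grouping output, not Apple's method.

```python
# Toy stand-in for photo "event" grouping: cluster timestamps into
# events whenever the gap between consecutive photos exceeds a
# threshold. Illustrative only; Apple's grouping is ML-driven.

def group_into_events(timestamps_hours, max_gap_hours=3):
    events, current = [], []
    for t in sorted(timestamps_hours):
        # Start a new event when the gap since the last photo is large.
        if current and t - current[-1] > max_gap_hours:
            events.append(current)
            current = []
        current.append(t)
    if current:
        events.append(current)
    return events

# Photos taken at 9-10am and again at 6-7pm form two events.
print(group_into_events([9.0, 9.5, 10.0, 18.0, 18.5, 19.0]))
```

The content-aware version Apple describes would additionally use the photo's subject matter (a pitch, a wedding dress) to label each group, not just its timing.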

  • Use of machine learning to improve graphics in iOS 11
    • macOS High Sierra updates will include Metal 2 (the Apple API that gives app developers near-direct access to GPU capabilities), which will now integrate machine learning into graphics processing to provide advanced capabilities such as Metal performance shaders, recurrent neural network kernels, binary convolution, dilated convolution, L-2 norm pooling, dilated pooling, etc. (https://developer.apple.com/metal/)
    • The newly announced iMac Pro graphics, powered by AMD Radeon Vega, can provide up to 22 teraflops of half-precision compute power, which is specifically relevant for machine learning related content development

Use of Virtual Reality & Augmented Reality

  • Announcement of the Metal API for Virtual Reality, to be used by developers – this includes Virtual Reality integration with the macOS High Sierra Metal 2 API to enable features such as a VR-optimised display pipeline for video editing in VR, plus related updates such as viewport arrays, system trace stereo timelines, GPU queue priorities and frame debugger stereoscopic visualisation.
  • Availability of ARKit for iOS 11 to create Augmented Reality straight from the iPhone, using its camera and built-in machine learning to identify content in live video, in real time.

Use of IoT capabilities

  • Apple Watch integration for bi-directional information synchronisation between the Apple Watch and ordinary gym equipment, so that your Apple Watch will act as an IoT gateway to typical gym equipment such as a treadmill or cross trainer: the watch provides more accurate measurements, and the gym equipment adjusts the workout based on those readings.
  • watchOS 4 will also provide Core Bluetooth connectivity to other devices, such as various healthcare tools, opening up connectivity to those devices through the Apple Watch
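
The watch-as-gateway idea above is a simple feedback loop: sensor readings from the watch are relayed to the equipment, which adjusts the workout in response. The sketch below illustrates only that loop; device names, thresholds and fields are all hypothetical.

```python
# Illustrative sketch of the watch-as-IoT-gateway feedback loop.
# Device names and the heart-rate threshold are hypothetical.

class Treadmill:
    def __init__(self):
        self.incline = 1.0

    def receive(self, reading):
        # Ease off when the runner's heart rate climbs too high.
        if reading["heart_rate"] > 170:
            self.incline = max(0.0, self.incline - 0.5)

class WatchGateway:
    def __init__(self, equipment):
        self.equipment = equipment

    def relay(self, heart_rate):
        # Forward the watch's measurement to the paired equipment.
        self.equipment.receive({"heart_rate": heart_rate})

treadmill = Treadmill()
WatchGateway(treadmill).relay(heart_rate=175)
print(treadmill.incline)  # 0.5
```

The "bi-directional" part of the announcement means the equipment's own readings would also flow back to the watch's workout tracking, the reverse of the single direction sketched here.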

My thoughts

The scope for using digital technologies such as AI, AR & VR in a typical corporate or enterprise environment to create a better product / service / solution offering, as Apple has done, is immense and is often only limited by one's level of creativity & imagination. Many organisations around the world, from other tech or product vendors to Independent Software Vendors to an ordinary organisation like a high street shop or a supermarket, can all benefit from the creative application of new digital technologies such as Artificial Intelligence, Augmented Reality and the Internet of Things in their product / service / solution offerings, providing their customers with a richer user experience as well as exciting new solutions. AI and AR are hot topics in the industry: some organisations are already evaluating them, while others already benefit from these technologies being made easily accessible to the enterprise through public cloud platforms (for example, the Microsoft Cortana Analytics and Azure Machine Learning capabilities available on Microsoft Azure). But there are also a large number of organisations who are not yet seriously investigating how these technologies could make their business more innovative, differentiated or, at the very least, more efficient.

If you belong to the latter group, I would highly encourage you to start thinking about how these technologies can be adopted by your business creatively. This applies to any organisation of any size in the increasingly digitally connected world of today. If you have a trusted IT partner, I'd encourage you to talk to them too, as the chances are they will have more collective experience in helping similar businesses adopt such technologies, which is more beneficial than trying to get there on your own, especially if you are new to it all.

Digital disruption is here to stay, and Apple have just shown how the advanced technologies that come out of it can be used to create better products / solutions for everyday customers. Pretty soon, AI / AR / IoT backed capabilities will become the norm rather than the exception, and how will your business compete if you are not adequately prepared to embrace them?

Keen to get your thoughts?

You can watch the recorded version of the Apple WWDC 2017 event here.

Image credit goes to #Apple

#Apple #WWDC #2017 #AI #DeepLearning #MachineLearning #ComputerVision #IoT #AR #AugmentedReality #VR #VirtualReality #DigitalDisruption #Azure #Microsoft

The impact of digital revolution on software licensing – Or is that the other way around?

I happened to come across the post below which, after reading, got me thinking about a few things, so I thought it would be a good idea to write a quick post and get everyone else's thoughts too.

http://diginomica.com/2017/02/20/sap-v-diageo-important-ruling-customers-indirect-access-issues/

The article was effectively about a court battle between SAP (an enterprise software vendor, by their own admission "the market leader in enterprise application software") & Diageo (the drinks manufacturing giant), where SAP sued Diageo to secure additional licensing revenue for indirect use of the data produced by SAP software. If you didn't read the full article via the above link, what that essentially meant was that when the data SAP generates (for its legal, fee-paying customer, Diageo) is accessed by a 3rd party, presumably to provide Diageo with a service, SAP must be paid additional licensing revenue for that indirect usage, and that this is the responsibility of Diageo.

In this case, SAP's ability to claim additional licensing revenue from Diageo was in its contract, which is why the judge ruled in SAP's favour (according to the article). While admitting that I am NOT a legal eagle, this raises a question in my mind: if this is in fact the final verdict on the case (and I'm sure it will be challenged in the appeals courts, etc.), is this approach fair? Especially in a world facing a massive digital revolution, where everything from a small electronic device, to a large multi-node machine, to a piece of software is connected through digital technologies to relay data from one to another, with the intention of processing and re-processing that data as it is passed through each piece of software (disparate consumption)?

In my line of work (IT), almost all IT systems are interconnected, typically so that one system can consume the data produced by another, and the number of hops involved in this interconnection chain can range from a couple of systems to a few dozen, depending on how digital each customer's environment is. In a pre "Digital Enterprise" world, all of these connection hops (i.e. IT systems) typically belonged to one department, one business unit or, at worst, one organisation, and were therefore licensed for use by that department / business unit / organisation (covering all of its users).

But the digital revolution currently sweeping across all industries will push such interconnectivity of software systems beyond a single organisation, as multiple organisations collaborate through data sharing, often in real time and across various platforms, in order to create a truly digital enterprise. Some of these types of digital integration are already commonplace, especially amongst finance sector customers. Such digital connectivity of software platforms across organisations will now likely be relevant to many other organisations that previously considered their business operations mutually exclusive. So I guess my question is: if the software underpinning those key digital connections takes the same attitude to licensing as SAP did in the instance above, what would be the implications for true digital connectivity across multiple software platforms? Are we fully aware of the exact small print of each and every piece of software we use within our business, to fully understand how each one defines its permitted usage, who is classed as a user, and where we can and cannot connect it to other systems? How do we know precisely that we are not violating such draconian licensing terms during this multi-platform, API-driven digital interconnectivity?
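
The "chain of hops" question above is really a reachability problem: given which systems feed data into which others, a breadth-first traversal finds every downstream system that directly or indirectly consumes data originating in a licensed system, which is exactly the population an indirect-access clause might apply to. The system names below are hypothetical.

```python
# Sketch of the indirect-access question: which systems (directly or
# indirectly) consume data originating in a licensed system? System
# names and the feed graph are hypothetical.

from collections import deque

FEEDS = {                      # system -> systems it sends data to
    "SAP":            ["warehouse"],
    "warehouse":      ["reporting", "partner-portal"],
    "reporting":      [],
    "partner-portal": ["3rd-party-crm"],
    "3rd-party-crm":  [],
}

def indirect_consumers(origin):
    # Breadth-first traversal of the data-feed graph from the origin.
    seen, queue = set(), deque(FEEDS.get(origin, []))
    while queue:
        system = queue.popleft()
        if system not in seen:
            seen.add(system)
            queue.extend(FEEDS.get(system, []))
    return seen

print(sorted(indirect_consumers("SAP")))
```

Note how the 3rd-party CRM is reached even though it never talks to the origin system directly, which is precisely the 2nd- and 3rd-level consumption the Diageo case was about.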

What do you think? I'm just curious to get people's views, as I'm sure there's no right or wrong answer. Do you think such draconian licensing rules are wrong, or would you argue that, given a dwindling market for Independent Software Vendors (courtesy of the public cloud), they should be allowed to benefit not just from direct but also from indirect (2nd and 3rd level) interaction with the data initially produced by their software? If so, do you have sufficient licensing expertise in house to ensure you are not violating the software licensing agreements you often sign up to without reading? And if you don't have in-house knowledge, do you have a trusted partner who can advise on such licensing matters in an increasingly complex, digitally interconnected world, including cloud platforms (PaaS, SaaS), and help you achieve a truly digitally connected enterprise without paying over the odds for software licences?

Keen to see what others think!

Cheers

Chan

#DigitalEnterprise #Connected #IoT #DigitalRevolution #Licensing

VVDs, Project Ice, vRNI & NSX – Summary Of My Breakout Sessions From Day 1 at VMworld 2016 US


A quick post to summarise the sessions I attended on day 1 at @VMworld 2016 and a few interesting things I noted. First up are the 3 sessions I had planned to attend, plus the additional session I managed to walk into.

Breakout Session 1 – Software Defined Networking in VMware Validated Designs

  • Session ID: SDDC7578R
  • Presenter: Mike Brown – SDDC Integration Architect (VMware)

This was a quick look at VMware Validated Designs (VVDs) in general and the NSX design elements within the SDDC stack design in the VVD. If you are new to VVDs and are typically involved in designing solutions using the VMware software stack, they are genuinely worth reading up on, and you should try to replicate the same design principles (within your solution design constraints) where possible. The idea is that this enables customers to deploy robust solutions that have been pre-validated by experts at VMware, ensuring the highest level of cross-solution integrity for the availability and agility required in a private cloud deployment. Based on typical VMware PSO best practices, the design guide (reference architecture document) lists each design decision applicable to each solution component, along with the justification for that decision (through an explanation) as well as its implication. An example is given below.

[Image: example NSX design decision from the VVD reference architecture]
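
The design-decision format described above (a decision ID, the decision itself, its justification and its implication) maps naturally onto a small record type. The field contents below are a paraphrased example in the spirit of the VVD format, not text copied from a VVD document, and the decision ID is illustrative.

```python
# Small record type mirroring the VVD design-decision format:
# decision ID, decision, justification and implication. The example
# content is paraphrased and the ID is illustrative.

from dataclasses import dataclass

@dataclass
class DesignDecision:
    decision_id: str
    decision: str
    justification: str
    implication: str

d = DesignDecision(
    decision_id="SDDC-VI-SDN-001",
    decision="Deploy an NSX Manager paired with each vCenter Server",
    justification="NSX Manager has a 1:1 relationship with vCenter",
    implication="Additional NSX licensing is required per instance",
)
print(d.decision_id)
```

Capturing decisions in a structured form like this is also handy when adapting a VVD to a smaller deployment, since each implication can be reviewed against your own cost constraints.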

I first found out about VVDs during VMworld 2015 and mentioned them in my VMworld 2015 blog post here. At the time, despite the announcement of availability, not much content was actually available in the form of design documents, but it has now come a long way. The current set of VVD documents discusses every design, planning, deployment and operational aspect of the following VMware products & versions, integrated as a single solution stack based on VMware PSO best practices. It is based on a multi-site (2 sites) production solution that customers can replicate in order to build similar private cloud solutions in their own environments. This documentation set fills a great big hole that VMware have had for a long time: while their product documentation covers the design and deployment detail of individual products, no such documentation was available for integrating multiple products, and with VVDs, there now is. In a way they are similar to the CVD documents (Cisco Validated Designs) that have been in use for the likes of FlexPod for VMware, etc.

[Image: VVD products & versions covered – part 1]

[Image: VVD products & versions covered – part 2]

VVDs generally cover the entire solution in the following 4 stages. Note that not all the content is fully available yet, but the key design documents (reference architecture docs) are available to download now.

  1. Reference Architecture guide
    1. Architecture Overview
    2. Detailed Design
  2. Planning and Preparation Guide
  3. Deployment Guide
    1. Deployment guide for region A (primary site) is now available
  4. Operation Guide
    1. Monitoring and alerting guide
    2. backup and restore guide
    3. Operation verification guide

If you want to find out more about VVDs, have a look at the following links. Just keep in mind that the current VVD documents are based on a fairly large, no-cost-barred type of design; if you are looking at much smaller deployments, you will need to exercise caution and common sense to adapt some of the recommended design decisions to fit within the applicable cost constraints. (For example, the current NSX design includes deploying 2 NSX Managers, one integrated with the management cluster vCenter and the other with the compute cluster vCenter, meaning you need NSX licences on the management cluster too. This may be overkill for most; typically, you'd only deploy a single NSX Manager integrated with the compute cluster.)

As for the VMworld session itself, the presenter went over all the NSX-related design decisions and explained them, which was a bit of a waste of time for me, as most people would be able to read the document and understand most of those themselves. As a result I decided to leave the session early, but I have downloaded the VVD documents to read thoroughly at leisure. 🙂

Breakout Session 2 – vRA, API, CI, Oh My!

  • Session ID: DEVOP7674
  • Presenters

[Image: vRA plugin for Jenkins]

As I managed to leave the previous session early, I was able to walk straight into this session, which had just started next door. Kris and Ryan were talking about DevOps best practices with vRealize Automation and vRealize Code Stream, focusing on how developers using agile development who want to invoke infrastructure services can use these products and invoke their capabilities through code rather than through the GUI. One of the key focus areas was the vRA plugin for Jenkins; if you are a DevOps person or a developer, this session content would be great value. If you can gain access to the slides or the session recordings after VMworld (or are planning to attend VMworld 2016 Europe), I'd highly encourage you to watch this session.

Breakout Session 3 – Secure and extend your data center to the cloud using NSX: A perspective for service providers and end users

  • Session ID: HBC7830
  • Presenters
    • Thomas Hobika – Director, Americas Service Provider Solutions Engineering & Field Enablement, vCAN, vCloud Provider Software Business Unit (VMware)
    • John White – Vice President of Product Strategy (Expedient)

Hosted Firewall Failover

This session, presented by Thomas, was about using NSX and other products (i.e. Zerto) to enable push-button disaster recovery for VMware solutions, and John was supposed to talk about their involvement in designing this solution. To be honest, I didn't find the session content that relevant to the listed topic, so I left fairly early to go to the blogger desks and write up my earlier blog posts from the day, which I thought was a better use of my time. If you would like more information on the content covered within this session, I'd look here.

 

Breakout Session 4 – Practical NSX Distributed Firewall Policy Creation

  • Session ID: SEC7568
  • Presenters
    • Ron Fuller – Staff Systems Engineer (VMware)
    • Joseph Luboimirski – Lead virtualisation administrator (University of Michigan)

A fairly useful session focusing on the NSX distributed firewall capability and how to effectively create a zero-trust security policy on the distributed firewall using various tools. Ron talked about the different options available, including manual modelling based on existing firewall rules, and why that could potentially be inefficient and would not allow customers to benefit from the versatility available through the NSX platform. He then mentioned other approaches, such as analysing traffic with vRealize Network Insight (the Arkin solution), which uses automated collection of IPFIX & NetFlow information from the virtual Distributed Switches to capture traffic, and how that captured data could potentially be exported and manipulated to form the basis of the new firewall rules. He also mentioned using vRealize Infrastructure Navigator (vIN) to map out process and port utilisation, as well as using the Flow Monitoring capability to capture existing communication channels as the basis of the distributed firewall design. The session also covered how to use vRealize Log Insight to capture syslogs.
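To make the flow-capture approach concrete, here is a minimal sketch of the general idea of turning observed flows into least-privilege allow rules. To be clear, this is my own illustration, not anything shown in the session: the group names, ports, and rule schema are made up, and real NSX DFW rules carry far more attributes than this.

```python
from collections import defaultdict
from typing import Dict, List, Set, Tuple

# An observed flow exported from a flow collector (hypothetical sample data):
# (source_group, destination_group, destination_port)
Flow = Tuple[str, str, int]


def flows_to_allow_rules(flows: List[Flow]) -> List[dict]:
    """Collapse observed flows into distinct allow rules, one per
    (source, destination) pair, listing only the ports actually seen.
    Anything not generated here falls through to a default-deny rule,
    which is the essence of a zero-trust policy."""
    ports: Dict[Tuple[str, str], Set[int]] = defaultdict(set)
    for src, dst, port in flows:
        ports[(src, dst)].add(port)

    rules = []
    for (src, dst), seen in sorted(ports.items()):
        rules.append({
            "source": src,
            "destination": dst,
            "ports": sorted(seen),
            "action": "allow",
        })
    # Final catch-all implements the zero-trust default
    rules.append({"source": "any", "destination": "any",
                  "ports": [], "action": "deny"})
    return rules


if __name__ == "__main__":
    observed = [
        ("web-tier", "app-tier", 8443),
        ("web-tier", "app-tier", 8443),  # duplicates collapse into one rule
        ("app-tier", "db-tier", 3306),
    ]
    for rule in flows_to_allow_rules(observed):
        print(rule)
```

The point of the exercise is the direction of travel: you derive the policy from what the workloads actually do, rather than hand-modelling rules from legacy firewall configs.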

All in all, a good session that was worth attending, and I would keep an eye out for it, especially if you are using or thinking about using NSX for advanced security (using the DFW) in your organisation's network. vRealize Network Insight really caught my eye, as the additional monitoring and analytics available through this platform, as well as the graphical visualisation of network activity, appear to be truly remarkable (which explains why VMware integrated it into the Cross-Cloud Services SaaS platform, as per this morning's announcement), and I cannot wait to get my hands on this tool to get into the nitty-gritty.

If you are considering a large or complex deployment of NSX, I would seriously encourage you to explore the additional features and capabilities that the vRNI solution offers, though it's important to note that it is licensed separately from NSX at present.


Outside of these breakout sessions and the blogging time in between, I managed to walk around the VM Village to see what's out there, and I was really interested in the Internet of Things area, where VMware was showcasing their IoT-related solutions currently in R&D. VMware are actively developing a heterogeneous IoT platform monitoring solution (internal code name: Project Ice). The current phase of the project is about partnering with relevant IoT device vendors to develop a common monitoring platform to monitor and manage the various IoT devices being manufactured by different vendors in different areas. If you have a customer looking at IoT projects, there are opportunities available now within Project Ice to sign up with VMware as a beta tester and co-develop and co-test the Ice platform to monitor these devices.

An example of this is what VMware has been doing with Coca-Cola to monitor various IoT sensors deployed in drinks vending machines, and a demo was available at the booth for all to see.


Below is a screenshot of the Project Ice monitoring screen that was monitoring the IoT sensors of this vending machine.

The solution relies on an open-source, vendor-neutral SDK called LIOTA (Little IoT Agent) to build a vendor-neutral agent that monitors each IoT sensor or device and relays the information back to the Ice monitoring platform. I would keep an eye on this, as the use cases for such a solution are endless and can be applied on many fronts (automobiles, ships, trucks, airplanes, as well as general consumer devices). One could argue that the IoT sensor vendors themselves should be responsible for developing these monitoring agents and platforms, but most of these device vendors do not have the knowledge or the resources to build such intelligent back-end platforms, which is where VMware can fill that gap through a partnership.
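To illustrate the agent pattern LIOTA enables, here is a minimal Python sketch of a vendor-neutral agent loop. This is purely my own illustration of the concept, not the actual LIOTA API: the class names, the `relay` callback, and the sample sensors are all invented for the example.

```python
import random
import time
from dataclasses import dataclass
from typing import Callable, Dict, List


@dataclass
class MetricSample:
    """A single reading relayed from an IoT device to the monitoring platform."""
    device_id: str
    metric: str
    value: float
    timestamp: float


class VendorNeutralAgent:
    """Polls vendor-specific sensor callables through one common interface,
    mirroring the LIOTA idea of a thin, device-side agent that normalises
    readings before relaying them to a back-end monitoring platform."""

    def __init__(self, device_id: str, relay: Callable[[MetricSample], None]):
        self.device_id = device_id
        self.relay = relay  # in a real agent, an HTTP/MQTT publisher to the back end
        self.sensors: Dict[str, Callable[[], float]] = {}

    def register_sensor(self, name: str, read_fn: Callable[[], float]) -> None:
        """Each vendor supplies only a read function; the agent handles the rest."""
        self.sensors[name] = read_fn

    def poll_once(self) -> List[MetricSample]:
        """Read every registered sensor once and relay each sample."""
        samples = []
        for name, read_fn in self.sensors.items():
            sample = MetricSample(self.device_id, name, read_fn(), time.time())
            self.relay(sample)
            samples.append(sample)
        return samples


if __name__ == "__main__":
    # Hypothetical vending-machine sensors, echoing the Coca-Cola demo
    received: List[MetricSample] = []
    agent = VendorNeutralAgent("vending-machine-42", received.append)
    agent.register_sensor("temperature_c", lambda: 4.0 + random.random())
    agent.register_sensor("stock_level", lambda: 120.0)
    agent.poll_once()
    for s in received:
        print(f"{s.device_id} {s.metric}={s.value:.1f}")
```

The design point is that the device vendor only has to expose a read function per sensor; the common agent and back end handle transport, normalisation, and monitoring, which is exactly the gap the post describes VMware filling.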

If you are into IoT solutions, this is definitely one to keep your eyes on for further developments and product releases. The solution is not publicly available as of yet, though having spoken to the product manager (Avanti Kenjalkar), they are expecting a big announcement within a couple of months, which is really exciting.

Some additional details can be found in the links below.

Cheers

Chan

#vRNI #vIN #VVD #DevOps #PushButtonDR #Arkin #ProjectIce #IOT #LIOTA