Jan 17

What is your Data Amplification Ratio (DAR)?

It’s time to think differently about how you measure the efficiency of your Data and the technology that underpins it!

For many years we Storage Vendors have continuously challenged each other for supremacy in the amount of Storage efficiency that we could give you, starting with SnapShots as a way of reducing backup overheads, then moving on to Deduplication, Compression, Compaction and Cloning. There have been some incredible innovations here, and we will continue to see incremental improvements as these techniques mature and new ones are developed.
As with all things, though, at some point you reach diminishing returns and you need to start considering Data efficiency much more broadly. In a world where you’re working with larger and larger numbers of applications, and combining your own resources with those of applications running in SaaS providers or Hyperscalers, your ability to have your Data available across all of these simply, with the smallest footprint and available to as many apps as you may possibly need, becomes a very important measurement.

I call this your Data Amplification Ratio (DAR).
Think about it this way: when you consider efficiency there are what I call ‘below the data’ capabilities, of which all the ones I mentioned before are examples: how do I pack the greatest amount of Data into the smallest amount of space on whatever technology I’m using to store it? But when you think about your Data Amplification Ratio it’s the ‘above the data’ efficiencies that become more important.
Cloning was probably the first of the technologies that started this crossover. Sure, Cloning enables you to efficiently store multiple virtual copies of data with almost zero overhead; that’s efficiency, but that’s not the real value. The real value is what you can now use these virtual copies for: from one full copy of Data you can create clones for Test / Dev / Backup / Disaster Recovery. This has been very ‘Sys Adminy’, if such a term exists; it creates value for the Admins by accelerating what they need to do whilst reducing the cost of doing it, but it also starts to help Application Developers, who can now use it to accelerate Test / Dev workflows. This is the first step toward effectively understanding, implementing and realising the value of ‘Data Amplification’.

So how do you measure your DAR?
Well, let’s say that you have Data stored to support an IoT project: a large amount of data flowing into your Data Collection platform in real time. This is your ‘Base Data set’, so how can we amplify it? If your platform is open (Hybrid rather than just block), then other Apps can easily be pointed to this data: your SAP system, your Hadoop platform, Spark and so on. If your storage platform is open enough for these different Apps to connect, then simply by using Snapshots and Clones you can have all of them amplifying a single core set of data. As you bring new applications into the environment to create new opportunities, these also feed off, and therefore continue to amplify, the value of your core set of Data.
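DAR isn’t a formal industry metric, so as a rough back-of-an-envelope illustration only, here’s one way you might model it: the logical capacity presented to every consuming workload, divided by the physical footprint you actually store. The function name, the per-clone overhead figure and the example numbers are all my own assumptions for the sketch, not a NetApp definition.

```python
def data_amplification_ratio(base_data_tb, consumers, clone_overhead_pct=2.0):
    """Rough DAR estimate for a single base dataset.

    base_data_tb       -- physical size of the base dataset in TB
    consumers          -- number of workloads served via snapshots/clones
    clone_overhead_pct -- assumed physical overhead per virtual copy (%)
    """
    # Each consuming workload sees what looks like a full copy of the data...
    logical_tb = base_data_tb * consumers
    # ...but physically you store one copy plus a small delta per clone.
    physical_tb = base_data_tb * (1 + consumers * clone_overhead_pct / 100)
    return logical_tb / physical_tb

# One 50 TB IoT dataset feeding SAP, Hadoop, Spark and a test/dev copy:
print(round(data_amplification_ratio(50, consumers=4), 2))
```

The exact numbers matter far less than the shape of the curve: every additional workload you point at the same core dataset raises the ratio, while the physical footprint barely moves.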
The future for most organisations is Hybrid, a mixture of on-premises capabilities and those delivered by Cloud providers, so how does this fit into Data Amplification? Here are a couple of NetApp examples…
1. If my IoT data is feeding into my on-premises Data platform, but I am able to simply replicate all or parts of it straight into AWS, where I have all of the same Data capabilities, SnapShots, Clones etc., then how many Applications in the AWS environment are now available to amplify this data further? That’s right, hundreds, if not thousands, all amplifying based off a highly efficient core set of data. This is what our ONTAP and Cloud ONTAP software delivers.

2. What about synchronising my on-premises data to the Cloud, using any of the hundreds or thousands of tools there to interrogate it, manipulate it and basically extract more and more value from it, then replicating the results back again? This is what our Cloud Sync software enables for you.
Your ‘Data Amplification Ratio’ is not something that’s easily measured, and you know what? Measuring it is less important than being aware of it and considering it when you think about how you’ll build your Data Fabric for the future.
The siloed approaches of the last generation will continue to become more efficient, but just making them more efficient simply isn’t the answer to the problems that you’re trying to solve for the future. Don’t be misled by a vendor trying to get you to fixate on efficiency as your biggest challenge and opportunity; it’s important, but keep in context what the real difference in capacity or cost actually is. You have to consider a Data Fabric that allows you to be efficient in the way you store and manage your data, but that also enables you to start to Amplify the value of your Data both on premises and across the Hybrid Cloud.
Storage Efficiency is useful and we can and will keep making improvements, but ‘Data Amplification’ is where real value and innovation lies.

Nov 16

NetApp Insight 2017 – Farewell Berlin

Four years ago we began the first chapter of our story in Berlin, today we close the book and the Insight train rolls on to its next destination, and the first chapter for a new book in Barcelona.

NetApp Insight is over for another year, the main stage equipment is already back on the trucks, Insight Central has closed and our partners are taking everything apart ready for their journeys home to the many different parts of the world that they joined us from.

My flight home isn’t till late this evening, so I find myself sat alone in the lounge at my hotel. At the end of an event like this, where so much effort has been put in by so many people, it’s always an unusual moment for me. There’s an energy, a buzz, a sense of excitement and anticipation while the event is happening, and when that’s over it’s hard to describe: there’s the happiness of thinking back on some of the special moments with friends and colleagues, a little sadness that it’s over, and time for reflection. As the piano player tinkles the ivories in the background in this wonderful city, I find myself reminiscing.

I was at one of the first Insight events 13 years ago, when they were called Fall Classic. Two hundred of us gathered in ‘The Bunker’ at the end of Java Drive in Sunnyvale and spent three days networking ‘Filers’, configuring switches that Brocade had been good enough to lend us and connecting up Tape Drives kindly lent to us by StorageTek. It was a very technical event.

Since that first event, the momentum really picked up and we’ve had to keep moving in order to deal with the increasing number of attendees and the increasing number of breakout sessions as more and more capabilities were being rolled into the products. We moved to the Crowne Plaza in San Jose, then on to San Francisco and Los Angeles, then the decision was made that we had to split the event for EMEA and the US.

Insight EMEA landed in Prague and the momentum carried on through Athens, Rome, Dublin and Berlin, the event expanding from fewer than one thousand people in the first year to over four thousand this year, of which nearly a third came from our customers; we made the decision to invite customers a few years back and they’ve become a bigger and bigger part of the event ever since.

NetApp is now 25 years old and, like a child growing up, it’s gone through many stages in order to become what it is today. It was born as the ‘Filer’ company, establishing the category of Network Attached Storage and bringing capabilities to it that no one had ever seen before, becoming a billion-dollar business in the process. When I joined, they were simple times: we had three products, small, medium and large, and we supported Unix and / or Windows. We had to fight to prove ourselves, but the technology was good, very good. Once we’d managed to convince people to give us a try, the job of the SE was pretty simple.

How much data do you currently have?

How much do you think you’ll have in the future?

And will you be using Windows, Unix or both?

The theme for the event this year, which is also our purpose, is ‘Change the world with Data’, more specifically to empower our customers to change the world with Data. Pretty lofty, eh? But think about it this way… Data defines everything! Autonomous vehicles make decisions based on their ability to collect and analyse data, Finance companies make trades based on the rapid analysis of data, and the plane that I’ll fly home on later will fly and land itself based on Data.

Existing companies are looking at how they evolve for the future, how they build new business models from data and every week a new company is born because of data. Data is an incredible resource and everyone is looking at what they have, what they need and how they can extract the maximum amount of value from it, and we are helping them to do this.

This requires change. Companies need Data Visionaries, people who can understand the value of data to existing business processes, but who can also find entirely new business opportunities. It requires existing roles to adapt to new technologies and new capabilities. Traditional roles such as the Storage Admin will change, that’s inevitable; if that’s currently what you do then you must consider what you could be doing in the future and where the new areas are in which you could add value.

NetApp now has a significant portfolio of capabilities that we can bring together to help companies establish a Data Fabric, enabling them to have their data in the right place at the right time with the right resources available so that they can extract the maximum value from it. That could be physical systems or it could be in the Cloud with our many partners or the hyperscalers such as Amazon, Microsoft, IBM or Google.

I spent a lot of time at our Insight briefing centre this year and I think the one comment that came up most frequently was ‘We didn’t realise NetApp could enable us to do that!‘ and this is why Insight is so important for everyone that attends. For our employees it shows them that they have to change the way they talk, we are a Data company and to truly help people to understand the incredible value we can offer we must start by talking about the Data. For our partners, it’s about how they enrich this data conversation even further by bringing more capabilities to the data, be that Analytics, Cognitive capabilities, Machine Learning and a wide variety of others. For our customers, it’s to ensure that they get to hear about the complete scope of what we can help them do today and our plans for the future.

Our Data Fabric journey started in Berlin, it brought with it a renewed energy, a new passion within our employees and our partners and it was important for us to have our customers join us to be part of this.

We arrived in this city as a Storage Company and now, four years later we leave with a clear purpose ‘Empower our customers to change the world with Data’ and that’s an exciting opportunity.

Thank you to everyone, employees, partners, customers and the hospitality of the people of Berlin for the wonderful four years that we’ve spent together.

Auf Wiedersehen

Sep 29

The future is a fabric – look at Sonos for a clue

I like music, always have, but I’m no audiophile; few people really are. I do, though, like listening to a wide variety of loud music and VERY loud films.

Buying the components

Growing up I always had separates in my music system, so I could choose the best Amp, the best CD Player and the most suitable speakers for my environment. It was expensive and very manual, but I would argue that each of the components made a difference: if I wanted it louder I could change the Amp; for a bigger room I could change or add speakers. I could scale each component, always ensuring I got the ‘best of breed’ most appropriate for my needs. (Buy it, build it with best of breed components and manage it yourself manually.) This is the traditional way that we’ve built IT infrastructures, but it has a lot of limitations.

I needed something a little more portable for the kitchen, so I also bought a Bose SoundDock. It was good, very good, everything in one neat little box, and it sat in my kitchen for many years. It was also very limited though: I had to physically connect my iPod to it, and I couldn’t simply add more units if I wanted to extend the system around my house and play music in different rooms. I ended up with music silos, much like the technology silos in many companies’ data centres.


When I moved into my new house there was a big separates system in place, albeit with little automation panels added to the walls of all of the main rooms, so now I could have my ‘best of breed’ components but with a level of automation that I didn’t have before. (Buy it, have best of breed components with a level of Automation; I think of this like a Converged Infrastructure.) Pretty simple to run and to manage, still best of breed, and for my home theatre system this is what I still run; for me this is a workload that really benefits from this approach.

Enter Sonos


The Hyper Converged Infrastructure (HCI) of the Music world. Aside from my home theatre, I ripped out the system described above as soon as I moved in. I wanted to be able to manage everything from my iPhone and use the Sonos Software as the virtualisation layer to integrate iTunes, Spotify, Rdio and many, many more music streaming applications, and I was prepared to compromise to get this. I was happy to use the Hyperconverged Sonos systems as the building blocks for my music. Are they the best speakers? Probably not. Do they have the best amplifiers in them? Probably not. Are they cheaper? Definitely not. But the convenience of being able to add Sonos nodes of different sizes in different rooms that all function as a single cluster, being able to run many different music streaming apps on the Sonos virtualisation layer, and the ability to run all of this through a simple and elegant iPhone app that manages everything for me meant I was happy to compromise. And if anything stops working then I simply get in touch with Sonos to get it sorted out. Sonos, for me, is the HCI of the music world. (This post is in no way funded or sponsored by Sonos.)

The real benefit is not the hardware, albeit it’s pretty good; it’s the Fabric. I still have my best of breed system for my home cinema, in fact I just upgraded it, and I was happy to pay more in order to get the advantages that it offers. When I watch a film I want to hear the details of what’s happening all around me, I want the sofa to shake and my teeth to rattle when there’s an explosion, but I also want it to be part of my Sonos Fabric so that I can stream music to it and control it through a single clean interface. How long will I continue to buy this way? I don’t know. I’ve just upgraded so I won’t be changing any time soon, but when the time comes I’ll be open to new approaches, sound bars and subs; for now, in my opinion, these are not as good, however technology moves fast.

For me this is where Generation 2 of HCI is positioned; it’s where IT is moving to, and people love it. Aside from my Cinema system, don’t spend time telling me that your speakers have a higher output or better tweeters because I DON’T CARE, or that your Amp has more input and output connections because, again, I DON’T CARE. This is just my maybe odd view of the world, and I know that no analogy stands up to too much scrutiny, but I think it’s a viewpoint worth considering in the discussion around the evolution of IT, albeit one from outside the world of IT infrastructure.

When I think about IT imperatives, the areas that most companies I meet with are focused on, Modernise, build a Next Generation Data Centre and harness the Hybrid Cloud, they tie in very well to the above. I continue to modernise my cinema system with best of breed components and am always looking for new features that will improve it, and I’ve taken a Next Generation approach for the rest of my house with my Sonos system, which has given me a fabric that allows me to store my music in the Cloud or stream from MaaS (Music as a Service) providers across all elements of my environment.

I started by saying the future is a fabric; however, with NetApp this fabric for your Data Centre exists today and is just getting better and better with every software release, every technology innovation and the increasing strength of our partnerships with Amazon, Microsoft and others. Is it time you stopped buying a product and started building a fabric?

Jul 25

Tell me a Story

From the earliest years of our lives we are entertained by stories, from the books that our parents read to us as children to the wonderful films created by the entertainment industry. They engage us, enthral us and often leave a lasting impression on us.

The best advice I can give you if you’re thinking about a talk or presentation is ‘Tell me a story’.

There’s a basic structure to every good story. Whenever I’m preparing for a keynote, a meeting with a potential client or working with groups inside my company, I’m always thinking about this structure and how I’ll build the content for what I need to say and the feeling that I want to create around it.

‘Once upon a time’
Every good story has a clear beginning, we’re setting the scene, establishing the normal state of events. This is about making a connection with people and where they currently are.

‘The Villain’
Now it’s about disruption, the villain in this context doesn’t have to be a person, it’s a disruption to the life we know. It could just be change, something that’s happening or is going to happen that will change the peaceful existence that we know into something worse. In the case of the IT industry it could be a new regulation or something far more sinister such as Ransomware.

‘The Hero’
Our Knight in shining armour, new technology or a new approach to doing something. Again the hero isn’t necessarily a person, it’s something that allows us to deal with the villain, to slay the dragon.

‘Happily ever after’
All good stories end with the world being a better place than it was at the beginning, the villain is gone, the hero victorious and people can go back to living their lives in peace and comfort.

There are many other elements that can go into developing a powerful story, a story that people want to listen to, but I believe the ones above are essential.

From my experience in the Tech industry we often start the wrong way when we think about the content we’re developing: we think about the products that we want to talk about, then we go off and find slides that we like that describe these products in the best detail. Once we have our 50 or 60 slide uber deck we then start to whittle it down to our favourite slides, usually applying the one-slide-per-minute rule. 30 minutes of content? That’ll be 30 slides then.
When you see these types of decks presented it’s like someone reading an instruction manual to you: full of good information but typically very dull to listen to.

Then, once you do have your content, often the next biggest mistake is that you don’t practise. How on earth can you think that you’ll get through 30 slides in 30 minutes if you haven’t actually tried?

At an event a few months ago, the speaker before me had a 30-minute speaking slot and had prepared 114 slides! As someone in the audience I felt physically exhausted by the end of his talk; he managed 70 slides before he was pulled off stage. I remember almost nothing of what he talked about. What a wasted opportunity.

Here are my 5 recommendations, most learnt through trial and error:

  1. Start with the story. I have a slide template and Word document with the four sections I described above, and using this I start to build my story for each of the sections. There’s an added benefit to this approach: you can begin to move away from relying on slides altogether. If you’ve done the ‘Once upon a time’ section then you know the next step is the ‘Villain’; once you’ve done this, it’s the ‘Hero’. Over time you can reduce the number of slides to the point that you could tell the whole story without them if you wanted to.
  2. Get your timing right. For a keynote you have a fixed window and you have to develop your talk to this. But if you’re presenting a solution then you should decide how long you need to present for, not have this dictated to you by someone who is trying to fit you into an agenda. If you truly believe you can cover your content in 20 minutes then don’t agree to a 45 minute slot on an agenda.
  3. Begin with a thank you, don’t finish with it; you’re not a comedian and it’s not a performance. Finish with what you want to happen next: what do you want people to do because they listened to you? This is the reason you spoke in the first place, so don’t lose sight of it.
  4. Find a critic, someone you respect who will be honest with you, and practise your talk in front of them. Do not pick someone who is likely just to placate you. I’ve made this mistake; I chose my critic poorly and went to an event feeling like I had a strong talk. It wasn’t pretty.
  5. Be prepared. In my years of speaking at events I’ve found that if I do these things then I’m usually well prepared and I enjoy giving my talk. I’m not thinking about what is on the next slide because I know exactly what’s coming. I can relax and drop in anecdotes or be more descriptive where I want to be. I can even get a sense for how the audience is responding and adjust if necessary.

The Mirror Effect

In acting circles they say that how you feel on stage is often reflected in your audience: if you’re stressed, they are probably stressed; if you’re uncomfortable then they are often uncomfortable as well. Practise and enjoy your talk; if you are relaxed and enjoying delivering it then there’s a very high chance that your audience is enjoying listening to it.

Jun 06

Data Visionaries Wanted!

I saw this slide the other day, presented by one of our Chief Architects, in which he discussed how the relative importance of Infrastructure, Applications and Data has changed over time and how this will continue to change into the future. It made me realise very clearly that as the bubbles grow and shrink, so does the importance of the conversation.

Generational Shift in Data Value and Management Over Time

While the phases are not distinct, this process and trend are absolutely correct.

Infrastructure Phase

Back in the days of Infrastructure it was great to discuss Infrastructure, and there were plenty of people willing to hear about your sprocket and how it was better than the next person’s widget: ours is faster, larger, has more ports, more features.

As this phase declined, it took with it many companies that didn’t make the transition to the new conversation, companies we assumed would be around for a long time.

But I guess when you’ve had a solid business for years, accepting change and trying to get the people inside your organisation to change is difficult. Some companies made the transition and changed, but many didn’t.

Application Phase

The next phase was about applications, how they created better value for our organisations and did it significantly faster. Some of these applications we wanted to run ourselves, while others were much better delivered to us as a service. This allowed us to focus on how to run and manage our business and to find new revenue opportunities. And for the stuff you keep? Oh yes, you’ll need something to run it on.

Data Phase
As we move forward, the discussion is increasingly about Data. What data do we have? What new data should we collect? Where can we collect data from? How do we use data to create new value?

We are still discussing the applications that will enable us to do this, and our infrastructure conversation now includes the cloud. But as the conversation has shifted, our needs have shifted as well. We need solutions that are simpler. We need to find ways to run things with far fewer people. Because it all comes back to focusing our efforts on creating new ways to deliver compelling services to our customers.

So how does this play out?

Let’s take something simple such as navigation. I’m sure many of you reading this will remember having your road atlas in the car, one of the tools of the trade for the road warrior.

Then we saw the move toward digital navigation: websites that would plot the optimum route for us, which we would print out and leave lying on the floor of our cars for weeks.

As web and mobile navigation developed, we saw the rise of companies providing little custom GPS units that we stuck to the windscreens of our cars. They were selling infrastructure and applications, and look at what’s happened to their business models. Why would you buy a GPS when it’s now either built into your car or you can simply install an app on your phone?

The companies that could provide the maps with the best and quickest updates had a jump on their competitors.

Step in the Data Visionaries. This movement has led to applications being delivered in entirely new ways, with data coming not from hundreds of sources but in real time from hundreds of thousands of people using the application.

If you’re in a traffic jam, tell the app and everyone is now aware of it. Accident? Tell the app. The number of applications required to support this model is huge, and the quantity of data being provided in real time is staggering.

This truly is a significant move toward what it means to be digital. And in order for your company to make this move you have to simplify and modernise what you have. You also have to consider how far into the Cloud you want to go. The Cloud provides the means to get the resources you need to be able to progress.

And you have to consider how to build a Next Generation Data Centre that uses technology designed to support thousands of applications and capable of collecting millions of data points, which enables you to focus on the business outcomes you want to create from data.

It’s this future that has driven the evolution of NetApp.

Data Visionaries are wanted and needed to support Digital Transformation and to enable the full potential of what companies need to become for the future.

Apr 11

Why the Future of Healthcare is Being Driven by Data

Healthcare organisations are facing unprecedented demand for high quality, affordable clinical services. This demand is driven by the ageing population, rapid advances in medical technology and pharmacology, and a better informed and more litigious population. Organisations globally are under immense cost pressure as consumers demand demonstrable value for their money. Fortunately, NetApp offers a number of solutions with the sole goal of delivering improvements in the delivery of patient care.
