The following is a recording and full transcript from the webinar, “12 Architectural Requirements for Protecting Business Data in the Cloud”. You can download the full slide deck on SlideShare.


Full Transcript: 12 Architectural Requirements for Protecting Business Data in the Cloud

Taran Soodan:             Hello everyone, and welcome to a SoftNAS webinar today on the 12 architectural requirements for protecting business data in the cloud. My name is Taran Soodan, and along with me, I have our presenter today, Eric Olson, the VP of Engineering for SoftNAS.

Eric, do you want to go ahead and take a second to say hi to everyone?

Eric Olson:             Good afternoon everyone. Thank you for joining.

Taran:             Awesome. Thanks for that, Eric. Before we begin today’s webinar, we do want to cover a couple of housekeeping items. Just as a reminder, today’s webinar audio can come through either your computer or your telephone.

If you want to go ahead and dial in to today’s webinar, the information is available for you on the top right of the GoToWebinar control panel. Also, we will have a Q&A session at the end of today’s webinar.

If you have any questions that come up regarding some of the content that we are talking about or questions about how SoftNAS works, just go ahead and post your questions in the questions pane and we will go ahead and answer them at the end of today’s webinar.

Today’s session is being recorded. For those of you who want to be able to watch the webinar recording on-demand or have access to the slides, we’ll share links with you after today’s webinar in a couple of hours.

Moving on to the agenda for today’s webinar, basically, we’re going to be talking about those 12 architectural requirements for protecting your data. We will cover some best practices, some lessons learned from what we’ve done for our customers in the cloud.

Finally, we will tell you a little bit about our product SoftNAS Cloud NAS and how it works. Then we will close it off with a Q&A to answer any questions that might pop up.

With that, I’ll go ahead and hand it over to Eric to go ahead and walk you through those architectural requirements. It’s all you, Eric.

Eric:              Thanks, Taran. Good afternoon everyone. Good morning to those of you on the West Coast. Thanks for joining us. I am here to talk to you about the 12 architectural requirements for protecting your data in the cloud.

I’d like to start with the first one, and these are in no particular order. However, I would like to stress this is probably one of the most important of these requirements and that would be high availability.

It’s important to understand that not all HA solutions are created equal. The key aspect of high availability is making sure that your data is always accessible, which requires some type of replication from point A to point B.

You also need to ensure, depending upon which public cloud infrastructure you may be running on, that your high availability solution properly supports the redundancy that’s available on the platform.

Take something like Amazon Web Services, which offers different availability zones. You would want to find an HA solution that can run in different zones and provide availability across those availability zones, which is to say, you want all your data stored in more than one zone.

You would want to ensure that you get greater uptime than a single compute instance can provide. The high availability available today from SoftNAS, for example, allows you to deploy instances in separate availability zones on a public cloud infrastructure.

It allows you to choose different storage types and to replicate that data between the two nodes. In the case of an incident that requires a failover, your data should be available to you again within 30 to 60 seconds.
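To make the failover mechanics concrete, here is a minimal sketch of the kind of heartbeat monitoring an HA pair relies on. The endpoint address and the promote_secondary() hook are hypothetical placeholders, not SoftNAS’s actual mechanism:

```python
# Minimal HA heartbeat sketch. The endpoint and promote_secondary() are
# hypothetical placeholders, not any vendor's actual implementation.
import socket
import time

PRIMARY = ("10.0.1.10", 2049)   # hypothetical primary node's NFS port
CHECKS_BEFORE_FAILOVER = 3      # tolerate brief network blips
INTERVAL_SECONDS = 10

def primary_is_healthy(addr, timeout=5):
    """Return True if the primary accepts a TCP connection."""
    try:
        with socket.create_connection(addr, timeout=timeout):
            return True
    except OSError:
        return False

def promote_secondary():
    # Placeholder: a real HA controller would re-point a virtual IP or
    # Elastic IP at the secondary node in the other availability zone.
    print("Failing over to the secondary node...")

failures = 0
while True:
    failures = 0 if primary_is_healthy(PRIMARY) else failures + 1
    if failures >= CHECKS_BEFORE_FAILOVER:
        promote_secondary()
        break
    time.sleep(INTERVAL_SECONDS)
```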

You also want to ensure that whatever HA solution you’re looking at avoids what we like to call the split-brain scenario, where the two nodes diverge: data ends up on one node but not on the other, or, after an HA takeover, newer data written to the target node conflicts with what the original source still holds.

Whatever high availability solution you find, ensure it meets the requirement of preventing split-brain between nodes.

The next piece that we want to cover is around data protection. I want to stress that when we talk about data protection, there are multiple ways to interpret that requirement.

We are looking at data protection from the storage architecture standpoint. You want to find a solution that supports snapshots and rollbacks. We look at snapshots as a form of insurance: you buy them and hope you never need them.

I want to point out that snapshots do not take the place of a backup, either. You want to find a solution that can replicate your data, whether that’s replicating from on-premises to a cloud environment, between different regions within a public cloud, or even from one public cloud platform to another, so that you always have a copy of the data in a second environment.

You want to ensure that the solution provides RAID mirroring, and that it handles your data efficiently, with features like compression, deduplication, and so on.

A copy-on-write file system is a key aspect of avoiding data integrity risks, as is support for things like Windows Previous Versions for rollbacks. These are all key aspects of a solution that provides proper data protection.
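As a hedged illustration of snapshots as insurance, here is a sketch on a ZFS-backed system (ZFS is one common copy-on-write file system; the pool and dataset names are hypothetical):

```python
# Snapshot "insurance" sketch on a ZFS-backed system. The dataset name is
# hypothetical; requires the zfs CLI and appropriate privileges.
import subprocess

DATASET = "tank/business-data"

def zfs(*args):
    """Run a zfs command, raising if it fails."""
    subprocess.run(["zfs", *args], check=True)

# Take a point-in-time snapshot before a risky change.
zfs("snapshot", f"{DATASET}@before-upgrade")

# ...perform the change; if it goes wrong, roll the dataset back:
zfs("rollback", f"{DATASET}@before-upgrade")

# Replication to a second environment is typically a send/receive stream,
# e.g. `zfs send tank/business-data@before-upgrade | ssh dr zfs recv ...`
```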

Data security and access control are always top of mind with everyone these days, so you want a solution that supports encryption. You want to ensure that the data is encrypted not only at rest but during all aspects of data transmission: data-at-rest and data-in-flight.

You also need the ability to provide proper authentication and authorization, whether that’s integration with LDAP for NFS permissions, for example, or leveraging Active Directory for Windows environments.

You want to ensure that whatever solution you find can support the native IAM roles or service principal roles available on the different public cloud platforms.

You want to ensure that you’re using firewalls and limiting who can gain access to what, to limit the amount of exposure you have.
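As one concrete, hedged AWS example of those principles, the boto3 sketch below writes an object encrypted at rest with KMS; boto3 talks to S3 over HTTPS by default, covering data-in-flight, and it picks up an attached IAM role’s credentials automatically, so no keys live in the code. The bucket, key, and file names are hypothetical:

```python
# Sketch: encryption at rest and in flight for data pushed to S3.
# Names are hypothetical; assumes an attached IAM role grants s3:PutObject.
import boto3

s3 = boto3.client("s3")  # credentials come from the instance's IAM role

with open("ledger.db", "rb") as body:          # hypothetical local file
    s3.put_object(
        Bucket="example-business-data",        # hypothetical bucket
        Key="backups/2024/ledger.db",
        Body=body,
        ServerSideEncryption="aws:kms",        # encrypt at rest with KMS
    )
```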

Performance is always at the top of everyone’s mind. If you take a look at a solution, you want to ensure that it uses dedicated storage infrastructure so that all applications can have the performance throughput that’s required.

Look for no burst limitations. You will find that some cloud platform and vendor solutions use a throttling mechanism that only delivers full performance in bursts. You need a solution that can give you guaranteed, predictable performance.

You want a solution where the performance you see when you start it up on day one with 100,000 files is the same on day 300 when there are 5 million files. It’s got to be predictable, and it can’t change.

You have to ensure that you look at what your actual storage throughput and IOPS requirements are before you deploy a solution. This is a key point.

A lot of people look to deploy a solution without really understanding what their performance requirements are. Sometimes we see people undersize the solution, but just as often we see people oversize it. It’s something to really take the time to understand.
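As a quick worked example of that sizing exercise, you can translate an observed workload into IOPS and throughput before choosing storage; every number below is purely illustrative:

```python
# Back-of-the-envelope sizing: all numbers are illustrative examples.
avg_io_size_kb = 64          # average request size observed on the old NAS
peak_ops_per_second = 1200   # peak operations per second observed

required_iops = peak_ops_per_second
required_throughput_mbps = peak_ops_per_second * avg_io_size_kb / 1024

print(f"Plan for ~{required_iops} IOPS and "
      f"~{required_throughput_mbps:.0f} MB/s at peak")
# -> Plan for ~1200 IOPS and ~75 MB/s at peak
```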

You want a solution that’s very flexible from a usability standpoint: something that can run on multiple cloud platforms, where you can find a good balance of cost versus performance; broad support for protocols like CIFS, NFS, iSCSI, and AFP; some type of automation with the cloud integration; and the ability to support automation via scripts, APIs, command lines, all of these types of things.

You want something that’s software-defined, and something that allows you to create clones of your data so that you can test against usable production data in a development environment; this is a key aspect that we’ve found.

If you have that functionality, it really allows you to test out what your real performance is going to look like prior to going into production.
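For instance, on a copy-on-write system such as ZFS, a writable clone of production data is near-instant and initially consumes almost no extra space; the sketch below uses hypothetical dataset names:

```python
# Cloning production data for dev/test on a ZFS-backed system.
# Dataset names are hypothetical; requires the zfs CLI and privileges.
import subprocess

def zfs(*args):
    subprocess.run(["zfs", *args], check=True)

zfs("snapshot", "tank/prod@nightly")            # freeze a point in time
zfs("clone", "tank/prod@nightly", "tank/dev")   # writable dev copy
# Mount tank/dev in the development environment and benchmark against
# realistic production data without touching tank/prod itself.
```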

You need a solution that’s very adaptable: the ability to support all of the available instance and VM types on different platforms, whether you want to use high-memory instances or your requirements mandate some type of ephemeral storage for your application.

Whatever that instance or VM may be, you want a solution that can work with it, and something that will support all the storage types that are available: block storage, such as EBS on Amazon or Premium Storage on Microsoft Azure, as well as object storage.

You want to be able to leverage lower-cost object storage like Azure Blob or Amazon S3, so that specific data that doesn’t have the same throughput and IOPS requirements as everything else can sit on that lower-cost storage.

This goes back to my point about understanding what your throughput and IOPS requirements are, so that you can select the proper storage to run your infrastructure.
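One concrete way to do that tiering on AWS is an S3 lifecycle rule that moves colder objects to a cheaper storage class; this boto3 sketch uses a hypothetical bucket, prefix, and age threshold:

```python
# Sketch: tiering colder objects to cheaper S3 storage. Bucket name,
# prefix, and the 30-day threshold are hypothetical examples.
import boto3

s3 = boto3.client("s3")

s3.put_bucket_lifecycle_configuration(
    Bucket="example-business-data",
    LifecycleConfiguration={
        "Rules": [{
            "ID": "tier-cold-data",
            "Status": "Enabled",
            "Filter": {"Prefix": "archive/"},
            "Transitions": [
                # Objects untouched for 30 days move to Infrequent Access.
                {"Days": 30, "StorageClass": "STANDARD_IA"},
            ],
        }]
    },
)
```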

You want something that can support both an on-premises and a cloud environment, or a hybrid cloud environment: multiple-cloud support, and the ability to adapt to the requirements as they change.

You need to find a solution that can expand as you grow. If you have a large storage appetite and your need for storage grows, you want to be able to extend it on the fly.

This is going to be one of the huge benefits you’ll find in a software-defined solution run on cloud infrastructure. There is no more rip-and-replace to extend storage; you can just attach more disks and extend your usable data sets.
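For example, on a ZFS-backed deployment, extending capacity after attaching a new block device (such as a fresh EBS volume) is a single command; the pool and device names below are hypothetical, and you should verify the device before adding it:

```python
# Sketch: extend a ZFS pool on the fly with a newly attached disk.
# Pool and device names are hypothetical; verify the device first,
# because adding a disk to a pool is not easily reversible.
import subprocess

subprocess.run(["zpool", "add", "tank", "/dev/xvdf"], check=True)
subprocess.run(["zpool", "list", "tank"], check=True)  # confirm capacity
```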

This ties into dynamic storage capacity and the maximum number of files and directories a solution can support. We’ve seen certain solutions where, once they get to a million files, performance starts to degrade.

You need something that can handle billions of files and petabytes’ worth of data, so that you know what you deploy today will meet your data needs five years from now.

You need a solution with a support safety net that’s available 24/7, 365, with different levels of support, so that you can access it through multiple channels.

You would probably want to find a solution that offers design support and a free trial or proof-of-concept version. Find out what guarantees, warranties, and SLAs the different solutions can provide to you.

Look for monitoring integration: integration with things like Amazon CloudWatch, integration with Azure monitoring and reporting, uptime reporting, and all of your audit log and system log integration.

Make sure that whatever solution you find can handle all of your troubleshooting needs. And the SLA, which I covered: how will a vendor stand behind their offer? What’s their guarantee?

Get a documented guarantee from each vendor that spells out exactly what’s covered, what’s not covered, and, if there is a failure, how it is handled from the vendor’s perspective.

You need to make sure that whatever solution you choose to deploy is enterprise-ready. You need something that can scale to billions of files, because we’re long past millions of files.

We are dealing with customers that have billions of files and petabytes of data.

It should be highly resilient. It should support a broad range of applications and workloads. It should help you meet your DR requirements in the cloud and also give you some reporting and analytics on the data that you have deployed and in place.

Is the solution cloud-native? Was the solution built from the ground up to reside in the public cloud, or is it a solution that was converted to run in a public cloud? How easy is it to move legacy applications onto the solution in the cloud?

You should outline your cloud platform requirements. Honestly take the time to outline what your costs and your company’s requirements for the public cloud are. Are you doing this to save money, or to get better performance?

Maybe you’re closing a data center. Maybe your existing hardware NAS is up for a maintenance renewal or it’s requiring a hardware upgrade because it is no longer supported. Whatever those reasons, they are very important to understand.

Look for a solution that has positive product reviews. If you look in the Amazon Web Services Marketplace for any type of solution out there, the one thing about Amazon is that it’s really great for reviews.

Whether that’s deploying a software solution out of the Marketplace or going and buying a novel on Amazon.com, check out all the reviews. Look at third-party testing and benchmark results.

Run your own tests and benchmarks; that’s what I would encourage you to do. Look at different industry analysts and customer and partner testimonials, and find out if you have a trustworthy vendor.

I’d like to talk to you for just a few seconds now about SoftNAS and our product, SoftNAS Cloud NAS. What SoftNAS offers is a fully featured enterprise cloud NAS for primary data storage.

It allows you to take your existing applications that may be residing on-premises and need legacy protocol support like NFS, CIFS, or iSCSI, and move them over to a public cloud environment.

The solution allows you to leverage all of the underlying storage of the public cloud infrastructure. Whether that would be object storage or block storage, you can use both. You can mix and match.

We offer a solution that can run not only on VMware on-premise but it can also run on public cloud environments such as AWS and Microsoft Azure. We offer a full high availability cross-zone solution on Amazon and a full high availability cross-network in Microsoft Azure.

We support all types of storage on all platforms. Whether that’s hot or cool Blob storage on Azure, or magnetic EBS volumes or S3 on Amazon, we allow you to create pools on top of it and essentially give you file-server-like access to these particular storage mediums.

At this point, I’d like to go ahead and take a pause. If you have any questions, please feel free to put them into that chat. I’ll go ahead and turn it over to Taran here to wrap things up and to begin our Q&A session.

Hopefully, you found this part of the webinar to be useful and good information. Taran, over to you.

Taran:             Awesome. Thank you for that, Eric. What we would like to let everyone know is that at SoftNAS, we partner with a lot of cloud technology companies, such as Amazon Web Services, Microsoft Azure, and VMware, and cloud consulting companies like 2ndWatch, Relis, and other well-known cloud providers.

Eric, could you move on to the next slide, please. Just to give you a sense of who is using SoftNAS, we have companies of all sizes using our product, whether they are small businesses or large enterprises like Nike, Netflix, Samsung, Deloitte, Symantec, and Raytheon. A lot of well-known, large companies are using SoftNAS.

If you have any concerns about whether or not SoftNAS is able to meet your needs, just know that some of the largest companies in the world that had these legacy on-premises systems have shifted to the cloud and they are using SoftNAS for that.

The next slide, please, Eric. Before we move on to the Q&A, we just want to let everyone know that you are able to try SoftNAS free for 30 days on AWS, Azure, or even on-premises through VMware vSphere.

We’ve got some links you’re able to click here on the right. If you want to try SoftNAS on AWS, just go to softnas.com/trynow and you will be able to download SoftNAS and choose which platform you want to use it on.

Moving on to our Q&A, it looks like we’ve got a bunch of questions here so let’s just go ahead and get these knocked out. The first question that we have here is, “Is SoftNAS a physical device? How does it work on AWS?”

Eric:              SoftNAS is a software-based solution. On VMware, for example, the solution is packaged up as an OVA and deployed as a VM. On Amazon, it is available via the AWS Marketplace as an AMI, or Amazon Machine Image. On Azure, the solution is available via the Azure Marketplace as well.

Taran:             Thanks for that Eric. The next question here, “What are some of the use-cases for SoftNAS in the cloud?”

Eric:              That’s a very broad question, so I’ll try to cover a couple of them fairly simply just to give you an idea. One of them would be, let’s say that you had a bunch of [inaudible 19:21] that require access to an NFS share, so being able to provide NFS share access.

The same thing applies if you require CIFS access. We also see a lot of our customers deploy SoftNAS as a target for backups in the cloud. That’s another great use-case.

We also see a lot of customers that deploy us to take existing applications and SaaSify them, so to speak: turn them into Software-as-a-Service. Those particular use-cases are fairly common.

Taran:             Thanks, Eric. The next question that we have here is, “What file protocols are supported?”

Eric:             We support CIFS. We support NFS version 3 and 4. We support iSCSI. We also support AFP.
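As a quick illustration of how clients typically consume those protocols, here is a sketch of standard Linux mount commands for NFS and CIFS, wrapped in Python; the server address, share names, and mount points are hypothetical, and the commands require root:

```python
# Sketch: how Linux clients typically mount NFS and CIFS/SMB shares.
# Addresses, shares, and mount points are hypothetical; run as root.
import subprocess

# Mount an NFS export:
subprocess.run(
    ["mount", "-t", "nfs", "10.0.1.10:/export/data", "/mnt/data"],
    check=True,
)

# Mount a CIFS/SMB share (prompts for the password):
subprocess.run(
    ["mount", "-t", "cifs", "//10.0.1.10/data", "/mnt/smbdata",
     "-o", "username=svc_user"],
    check=True,
)
```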

Taran:             The next question that we have here is, “Will I be able to get access to the slides?” The answer to that is yes you will. After we’re done here within the next hour or two, we’re going to send you an email with a link to the slides on SlideShare along with a recording of this webinar on YouTube.

The final question that we have here is, “I have about 15 TB of storage I need to move from my legacy NAS. Do you recommend moving all the data at once or should it be done in phases?”

Eric:              Without knowing more about your particular use-case and the problem you are trying to solve, I would have to defer on that. For the person who asked that question, if you’d follow up and contact us at sales@softnas.com, I’d be happy to further discuss your particular application and use-case.

Taran:             Thanks for that, Eric. Everyone, that’s all the questions that we have for today’s webinar. Just as a reminder, you are able to go to softnas.com and try our product for free for 30 days.

After this webinar is over, there is a quick survey. We’d like to ask that everyone fill it out so that we know how to improve future webinars that we do for our customers and everyone else who is interested in learning more about SoftNAS.

With that, we want to go ahead and thank everyone for joining today’s webinar and we look forward to seeing you again in the future.