Cloud storage is a cloud computing model that stores data on the Internet through a cloud computing provider who manages and operates data storage as a service. It’s delivered on demand with just-in-time capacity and costs, and eliminates buying and managing your own data storage infrastructure. This gives you agility, global scale and durability, with “anytime, anywhere” data access.
Cloud storage is purchased from a third party cloud vendor who owns and operates data storage capacity and delivers it over the Internet in a pay-as-you-go model. These cloud storage vendors manage capacity, security and durability to make data accessible to your applications all around the world.
Home cloud storage, on the other hand, is a lesser-known way to store all your digital files safely from the comfort of your home. With the rise of services such as Google Drive, Google Photos, OneDrive, iCloud, and Amazon's cloud services, to name just a few, another segment of cloud storage solutions has developed.
Applications access cloud storage through traditional storage protocols or directly via an API. Many vendors offer complementary services designed to help collect, manage, secure and analyze data at massive scale.
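For example, here is a minimal sketch of API access using the AWS SDK for Python (boto3); the bucket name and object key are placeholders, and configured AWS credentials are assumed rather than shown:

```python
import boto3

s3 = boto3.client("s3")

# Upload a small text object to a hypothetical bucket.
s3.put_object(
    Bucket="example-bucket",
    Key="reports/2024/summary.txt",
    Body=b"quarterly summary goes here",
)

# Read it back and print the contents.
response = s3.get_object(Bucket="example-bucket", Key="reports/2024/summary.txt")
print(response["Body"].read().decode("utf-8"))
```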
Storing data in the cloud lets IT departments transform how storage is purchased, deployed, and managed.
Ensuring your company's critical data is safe, secure, and available when needed is essential. There are several fundamental requirements when considering storing data in the cloud.
Durability. Data should be redundantly stored, ideally across multiple facilities and multiple devices in each facility. Natural disasters, human error, or mechanical faults should not result in data loss.
Availability. All data should be available when needed, but there is a difference between production data and archives. The ideal cloud storage will deliver the right balance of retrieval times and cost.
Security. All data is ideally encrypted, both at rest and in transit. Permissions and access controls should work just as well in the cloud as they do for on premises storage.
There are three types of cloud data storage: object storage, file storage, and block storage. Each offers its own advantages and has its own use cases.
Object storage. Data is stored as discrete objects, each carrying metadata and a unique identifier in a flat address space, which makes it well suited to unstructured data at very large scale (for example, Amazon S3).
File storage. Data is organized in a hierarchy of files and folders and accessed over shared file protocols such as NFS or SMB, a natural fit for content repositories, development environments, and home directories (for example, Amazon EFS and Amazon FSx).
Block storage. Data is stored in fixed-size blocks presented to a server as raw volumes, providing the low latency that databases and other transactional workloads require (for example, Amazon EBS).
Backup and recovery is a critical part of ensuring data is protected and accessible, but keeping up with increasing capacity requirements can be a constant challenge. Cloud storage brings low cost, high durability, and extreme scale to backup and recovery solutions. Embedded data management policies like Amazon S3 Object Lifecycle Management can automatically migrate data to lower-cost tiers based on frequency or timing settings, and archival vaults can be created to help comply with legal or regulatory requirements. These benefits allow for tremendous scale possibilities within industries such as financial services, healthcare, and media that produce high volumes of data with long-term retention needs.
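As a hedged illustration of a lifecycle policy, the boto3 sketch below transitions objects under a hypothetical backups/ prefix to colder storage classes and eventually expires them; the bucket name, prefix, and retention periods are placeholder assumptions:

```python
import boto3

s3 = boto3.client("s3")

s3.put_bucket_lifecycle_configuration(
    Bucket="example-backup-bucket",
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "archive-old-backups",
                "Filter": {"Prefix": "backups/"},
                "Status": "Enabled",
                # Move objects to infrequent access after 30 days,
                # to Glacier after 90 days, and delete them after ~7 years.
                "Transitions": [
                    {"Days": 30, "StorageClass": "STANDARD_IA"},
                    {"Days": 90, "StorageClass": "GLACIER"},
                ],
                "Expiration": {"Days": 2555},
            }
        ]
    },
)
```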
White Paper: Backup, Archive and Restore Architectures in AWS
White Paper: Best Practices for Backup and Recovery on-prem to AWS
Learn more about Backup to the Cloud.
Software test and development environments often require separate, independent, and duplicate storage environments to be built out, managed, and decommissioned. In addition to the time required, the up-front capital costs can be extensive.
Some of the largest and most valuable companies in the world have created applications in record time by leveraging the flexibility, performance, and low cost of cloud storage. Even the simplest static websites can be improved for an amazingly low cost. Developers all over the world are turning to pay-as-you-go storage options that remove management and scale headaches.
White paper: Building Static Websites
Learn more about Software Development.
The availability, durability, and cost benefits of cloud storage can be very compelling to business owners, but traditional IT functional owners like storage, backup, networking, security, and compliance administrators may have concerns around the realities of transferring large amounts of data to the cloud. Cloud data migration services such as AWS Import/Export Snowball can simplify migrating storage into the cloud by addressing high network costs, long transfer times, and security concerns.
Storing data in the cloud can raise concerns about regulation and compliance, especially if this data is already stored in compliant storage systems. Cloud data compliance controls like Amazon Glacier Vault Lock are designed to ensure that you can easily deploy and enforce compliance controls on individual data vaults via a lockable policy. You can specify controls such as Write Once Read Many (WORM) to lock the data from future edits. Using audit log products like AWS CloudTrail can help you ensure compliance and governance objectives for your cloud-based storage and archival systems are being met.
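The boto3 sketch below shows roughly how a Vault Lock policy might be applied; the vault name, region, account ID, and the 365-day retention window are placeholder assumptions rather than anything prescribed here:

```python
import json
import boto3

glacier = boto3.client("glacier")

# A WORM-style policy that denies archive deletion for 365 days.
lock_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "deny-deletes-for-one-year",
            "Principal": "*",
            "Effect": "Deny",
            "Action": "glacier:DeleteArchive",
            "Resource": "arn:aws:glacier:us-east-1:123456789012:vaults/compliance-vault",
            "Condition": {
                "NumericLessThan": {"glacier:ArchiveAgeInDays": "365"}
            },
        }
    ],
}

# Start the lock (it stays in an in-progress state for 24 hours),
# then complete it to make the policy immutable. "-" means the caller's own account.
response = glacier.initiate_vault_lock(
    accountId="-",
    vaultName="compliance-vault",
    policy={"Policy": json.dumps(lock_policy)},
)
glacier.complete_vault_lock(
    accountId="-",
    vaultName="compliance-vault",
    lockId=response["lockId"],
)
```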
Learn more about Amazon Glacier's Vault Lock.
White Papers: AWS Compliance
AWS Compliance Overview
Traditional on-premises storage solutions can be inconsistent in their cost, performance, and scalability — especially over time. Big data projects demand large-scale, affordable, highly available, and secure storage pools that are commonly referred to as data lakes.
Data lakes built on object storage keep information in its native form, and include rich metadata that allows selective extraction and use for analysis. Cloud-based data lakes can sit at the center of all kinds of data warehousing, processing, big data, and analytical engines, such as Amazon Redshift, Amazon RDS, Amazon EMR, and Amazon DynamoDB, to help you accomplish your next project in less time and with more relevance.
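As a small illustration of keeping data in its native form with rich metadata, the boto3 sketch below uploads a raw file to a hypothetical data-lake bucket with user-defined metadata and reads that metadata back; all names are placeholders:

```python
import boto3

s3 = boto3.client("s3")

# Store a raw data file in its native form and tag it with descriptive metadata
# so downstream analytics jobs can find and filter it. Bucket, key, and the local
# file name are placeholders.
with open("2024-06-01.json", "rb") as raw_file:
    s3.put_object(
        Bucket="example-data-lake",
        Key="raw/clickstream/2024-06-01.json",
        Body=raw_file,
        Metadata={"source": "web-frontend", "schema-version": "3"},
    )

# A catalog crawler or analytics tool can read the metadata back without
# downloading the object itself.
head = s3.head_object(Bucket="example-data-lake", Key="raw/clickstream/2024-06-01.json")
print(head["Metadata"])
```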
Cloud storage is a critical component of cloud computing, holding the information used by applications. Big data analytics, data warehouses, Internet of Things, databases, and backup and archive applications all rely on some form of data storage architecture. Cloud storage is typically more reliable, scalable, and secure than traditional on-premises storage systems.
AWS offers a complete range of cloud storage services to support both application and archival compliance requirements. Select from object, file, and block storage services as well as cloud data migration options to start designing the foundation of your cloud IT environment.
Learn more in an IDC whitepaper that evaluates the AWS storage portfolio and analyzes the Total Cost of Ownership for AWS cloud storage.
If You Need: | Consider Using: |
---|---|
Persistent local storage for Amazon EC2, for relational and NoSQL databases, data warehousing, enterprise applications, Big Data processing, or backup and recovery | Amazon Elastic Block Store (Amazon EBS) |
A simple, scalable, elastic file system for Linux-based workloads for use with AWS Cloud services and on-premises resources. It is built to scale on demand to petabytes without disrupting applications, growing and shrinking automatically as you add and remove files, so your applications have the storage they need – when they need it. | Amazon Elastic File System (Amazon EFS) |
A fully managed file system that is optimized for compute-intensive workloads, such as high performance computing, machine learning, and media data processing workflows, and is seamlessly integrated with Amazon S3 | Amazon FSx for Lustre |
A fully managed native Microsoft Windows file system built on Windows Server so you can easily move your Windows-based applications that require file storage to AWS, including full support for the SMB protocol and Windows NTFS, Active Directory (AD) integration, and Distributed File System (DFS). | Amazon FSx for Windows File Server |
A scalable, durable platform to make data accessible from any Internet location, for user-generated content, active archive, serverless computing, Big Data storage or backup and recovery | Amazon Simple Storage Service (Amazon S3) |
Highly affordable long-term storage classes that can replace tape for archive and regulatory compliance | Amazon S3 Glacier & Amazon S3 Glacier Deep Archive |
A hybrid storage cloud augmenting your on-premises environment with Amazon cloud storage, for bursting, tiering or migration | AWS Storage Gateway |
A portfolio of services to help simplify and accelerate moving data of all types and sizes into and out of the AWS cloud | Cloud Data Migration Services |
A fully managed backup service that makes it easy to centralize and automate the backup of data across AWS services in the cloud as well as on premises using the AWS Storage Gateway. | AWS Backup |
Amazon Elastic Block Store (Amazon EBS) provides highly available, consistent, low-latency block storage for Amazon EC2. It helps you tune applications with the right storage capacity, performance and cost.
EBS is designed for workloads that require persistent storage accessible by single EC2 instances. Typical use cases include relational and NoSQL databases (like Microsoft SQL Server and MySQL or Cassandra and MongoDB), Big Data analytics engines (like the Hadoop/HDFS ecosystem and Amazon EMR), stream and log processing applications (like Kafka and Splunk), and data warehousing applications (like Vertica and Teradata).
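As a rough sketch of the EBS workflow, the boto3 snippet below creates a gp3 volume and attaches it to a single EC2 instance; the Availability Zone, instance ID, device name, and performance settings are placeholder assumptions:

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Create a 100 GiB gp3 volume in the same Availability Zone as the target instance.
volume = ec2.create_volume(
    AvailabilityZone="us-east-1a",
    Size=100,
    VolumeType="gp3",
    Iops=3000,        # provisioned IOPS for the volume
    Throughput=125,   # MiB/s of provisioned throughput
)

# Wait until the volume is ready, then attach it to a single EC2 instance.
ec2.get_waiter("volume_available").wait(VolumeIds=[volume["VolumeId"]])
ec2.attach_volume(
    VolumeId=volume["VolumeId"],
    InstanceId="i-0123456789abcdef0",
    Device="/dev/sdf",
)
```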
For more information visit the Amazon EBS page.
Amazon Elastic File System (Amazon EFS) provides a simple, scalable, elastic file system for Linux-based workloads for use with AWS Cloud services and on-premises resources. It is built to scale on demand to petabytes without disrupting applications, growing and shrinking automatically as you add and remove files, so your applications have the storage they need – when they need it. It is designed to provide massively parallel shared access to thousands of Amazon EC2 instances, enabling your applications to achieve high levels of aggregate throughput and IOPS with consistent low latencies. Amazon EFS is well suited to a broad spectrum of use cases, including lift-and-shift enterprise applications, big data analytics, web serving and content management, application development and testing, media and entertainment workflows, database backups, and container storage.
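A minimal sketch of provisioning EFS with boto3 follows; the creation token, subnet, and security group IDs are placeholders, and the NFS mount command in the trailing comment assumes a Linux client:

```python
import boto3

efs = boto3.client("efs")

# Create an elastic, POSIX-compliant file system; it grows and shrinks with the
# files you add and remove, so no size is specified up front.
fs = efs.create_file_system(
    CreationToken="shared-content-fs",      # idempotency token (placeholder)
    PerformanceMode="generalPurpose",
    Encrypted=True,
)

# Expose the file system inside a VPC subnet so EC2 instances can NFS-mount it.
efs.create_mount_target(
    FileSystemId=fs["FileSystemId"],
    SubnetId="subnet-0123456789abcdef0",
    SecurityGroups=["sg-0123456789abcdef0"],
)

# Instances then mount it over NFSv4, for example:
#   sudo mount -t nfs4 <mount-target-ip>:/ /mnt/efs
```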
For more information, visit the Amazon EFS page.
Amazon Simple Storage Service (Amazon S3) is object storage designed to store and access any type of data over the Internet.
It is secure, 99.999999999% durable, and scales past tens of trillions of objects. S3 is used for backup and recovery, tiered archive, user-generated content (like photos, videos, music and files), data lakes for Big Data analytics and data warehouse platforms, or as a foundation for serverless computing design.
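For instance, user-generated content is often served through presigned URLs; the boto3 sketch below generates one, with the bucket, key, and expiry chosen purely for illustration:

```python
import boto3

s3 = boto3.client("s3")

# Generate a time-limited URL that lets a user download a photo directly from S3
# without holding AWS credentials. Bucket and key are placeholders.
url = s3.generate_presigned_url(
    "get_object",
    Params={"Bucket": "example-media-bucket", "Key": "photos/user-42/beach.jpg"},
    ExpiresIn=3600,  # the link stays valid for one hour
)
print(url)
```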
For more information, visit the Amazon S3 page.
Amazon FSx for Lustre is a fully managed file system that is optimized for compute-intensive workloads, such as high performance computing, machine learning, and media data processing workflows. With Amazon FSx, you can launch and run a Lustre file system that can process massive data sets at up to hundreds of gigabytes per second of throughput, millions of IOPS, and sub-millisecond latencies.
Amazon FSx for Lustre is seamlessly integrated with Amazon S3, making it easy to link your long-term data sets with your high performance file systems to run compute-intensive workloads.
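A hedged sketch of that S3 linkage with boto3 is shown below; the storage capacity, subnet ID, and bucket paths are placeholder assumptions:

```python
import boto3

fsx = boto3.client("fsx")

# Create a Lustre file system whose contents are lazily loaded from an S3 bucket,
# so compute jobs see the data set as ordinary files.
fsx.create_file_system(
    FileSystemType="LUSTRE",
    StorageCapacity=1200,                      # GiB (the Lustre minimum)
    SubnetIds=["subnet-0123456789abcdef0"],
    LustreConfiguration={
        "ImportPath": "s3://example-dataset-bucket",          # read inputs from S3
        "ExportPath": "s3://example-dataset-bucket/results",  # write results back
    },
)
```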
For more information, visit the Amazon FSx for Lustre page.
Amazon FSx for Windows File Server provides a fully managed native Microsoft Windows file system so you can easily move your Windows-based applications that require file storage to AWS. Built on Windows Server, Amazon FSx provides shared file storage with the compatibility and features that your Windows-based applications rely on, including full support for the SMB protocol and Windows NTFS, Active Directory (AD) integration, and Distributed File System (DFS).
Amazon FSx uses SSD storage to provide the fast performance your Windows applications and users expect, with high levels of throughput and IOPS, and consistent sub-millisecond latencies. This compatibility and performance is particularly important when moving workloads that require Windows shared file storage, like CRM, ERP, and .NET applications, as well as home directories.
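The boto3 sketch below shows roughly how such a file system might be created and joined to a managed Active Directory; the directory ID, subnet, capacity, and throughput values are placeholders:

```python
import boto3

fsx = boto3.client("fsx")

# Create an SMB file share joined to an AWS Managed Microsoft AD directory so
# existing Windows clients can map it as a network drive.
fsx.create_file_system(
    FileSystemType="WINDOWS",
    StorageCapacity=300,                        # GiB of SSD storage
    SubnetIds=["subnet-0123456789abcdef0"],
    WindowsConfiguration={
        "ActiveDirectoryId": "d-0123456789",    # managed AD used for authentication
        "ThroughputCapacity": 32,               # MB/s
        "DeploymentType": "SINGLE_AZ_2",
    },
)
```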
For more information, visit the Amazon FSx for Windows File Server page.
Amazon S3 Glacier and S3 Glacier Deep Archive are secure, durable, and extremely low-cost Amazon S3 cloud storage classes for data archiving and long-term backup.
S3 Glacier and S3 Glacier Deep Archive are solutions for customers who want low-cost storage for infrequently accessed data. They can replace tape for media and entertainment applications, and assist with compliance in highly regulated industries such as healthcare, life sciences, and financial services.
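As an illustration, the boto3 sketch below writes an object directly to the Deep Archive storage class and later requests a temporary restore; the bucket, key, local file, and retrieval tier are placeholder assumptions:

```python
import boto3

s3 = boto3.client("s3")

# Write an object directly into the Deep Archive storage class.
with open("ledger.csv", "rb") as archive_file:
    s3.put_object(
        Bucket="example-archive-bucket",
        Key="compliance/2019/ledger.csv",
        Body=archive_file,
        StorageClass="DEEP_ARCHIVE",
    )

# Archived objects are not immediately readable; request a temporary restore first.
s3.restore_object(
    Bucket="example-archive-bucket",
    Key="compliance/2019/ledger.csv",
    RestoreRequest={
        "Days": 7,                                 # keep the restored copy for a week
        "GlacierJobParameters": {"Tier": "Bulk"},  # lowest-cost, slowest retrieval tier
    },
)
```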
For more information visit the Amazon S3 Glacier and S3 Glacier Deep Archive page.
AWS Backup is a fully managed backup service that makes it easy to centralize and automate the backup of data across AWS services in the cloud as well as on premises using the AWS Storage Gateway. Using AWS Backup, you can centrally configure backup policies and monitor backup activity for AWS resources, such as Amazon EBS volumes, Amazon RDS databases, Amazon DynamoDB tables, Amazon EFS file systems, and AWS Storage Gateway volumes.
AWS Backup provides a fully managed, policy-based backup solution, simplifying your backup management, enabling you to meet your business and regulatory backup compliance requirements.
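A minimal sketch of a policy-based plan with boto3 follows; the plan name, schedule, vault, IAM role, and resource ARN are placeholder assumptions:

```python
import boto3

backup = boto3.client("backup")

# Define a daily backup rule with a 35-day retention period.
plan = backup.create_backup_plan(
    BackupPlan={
        "BackupPlanName": "daily-backups",
        "Rules": [
            {
                "RuleName": "daily-at-5am-utc",
                "TargetBackupVaultName": "Default",
                "ScheduleExpression": "cron(0 5 * * ? *)",
                "Lifecycle": {"DeleteAfterDays": 35},
            }
        ],
    }
)

# Assign resources (here, a single EBS volume ARN) to the plan.
backup.create_backup_selection(
    BackupPlanId=plan["BackupPlanId"],
    BackupSelection={
        "SelectionName": "ebs-volumes",
        "IamRoleArn": "arn:aws:iam::123456789012:role/service-role/AWSBackupDefaultServiceRole",
        "Resources": ["arn:aws:ec2:us-east-1:123456789012:volume/vol-0123456789abcdef0"],
    },
)
```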
For more information visit the AWS Backup page.
The AWS Storage Gateway is a software appliance that seamlessly links your on-premises environment to Amazon cloud storage.
It offers local storage with highly optimized connectivity to AWS Cloud storage, and helps with migration, bursting and storage tiering use cases. Replace tape automation without disrupting existing processes, supplement on-premises workloads with storage capacity on demand, or augment existing on-premises storage investments with a cloud tier.
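As a hedged example of the file gateway use case, the boto3 sketch below creates an NFS file share on an already activated gateway backed by an S3 bucket; every ARN and the storage class choice are placeholder assumptions:

```python
import boto3

sgw = boto3.client("storagegateway")

# Create an NFS file share on an activated file gateway so on-premises servers
# write to a local mount while the data lands in an S3 bucket.
sgw.create_nfs_file_share(
    ClientToken="nfs-share-001",
    GatewayARN="arn:aws:storagegateway:us-east-1:123456789012:gateway/sgw-12A3456B",
    Role="arn:aws:iam::123456789012:role/StorageGatewayS3Access",
    LocationARN="arn:aws:s3:::example-hybrid-bucket",
    DefaultStorageClass="S3_STANDARD_IA",   # tier the cloud copy to infrequent access
)
```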
For more information visit the AWS Storage Gateway page.
Amazon offers a portfolio of data transfer services to migrate data into and out of the AWS cloud.
These services help you do things like securely and quickly move multi-petabyte archives, accelerate network transfers with existing infrastructure, and capture continuous streaming data from multiple sources.
For more information visit the Cloud Data Migration page.
AWS Marketplace sellers offer hundreds of industry-leading products that are equivalent or identical to, or that integrate with, existing storage products in your on-premises environments. These offerings complement the existing AWS services to enable you to deploy a comprehensive storage architecture and a more seamless experience across your cloud and on-premises environments.
AWS Storage Competency Partners leverage AWS solutions including Amazon EBS and Amazon S3 to provide secure and efficient storage solutions for running primary workloads in the cloud or extending an on-premises solution to create a hybrid architecture.
Learn how AWS and APN partners have helped organizations like EidosMedia and HUSCO implement efficient and cost-effective primary storage solutions.