March 21, 2018
Topics: Cloud Volumes ONTAP, Cloud Storage, Data Protection, AWS | Advanced | 7 minute read
Securing public cloud infrastructure is now a top priority for every organization that leverages cloud services to host their applications or data. Remember that while the cloud provider is responsible for keeping their infrastructure secure, ensuring the security of your data, including everything you keep in Amazon S3 storage, is up to you.
In this blog, we’ll look at how organizations that use Amazon S3 can avoid cloud security challenges, protect their data, and take advantage of NetApp cloud solutions for an added layer of protection.
Amazon S3 Security Checklist
With managed storage services such as Amazon S3, the customer is responsible for managing the S3 resources and the data uploaded to and stored in them. Under the shared responsibility model, AWS manages the network, physical security, OS patching, and availability of the service, while the user is responsible for controlling access to Amazon S3 buckets and objects. A mistake at that level puts your data at risk.
The following checklist will help you learn the best practices and tasks to protect Amazon S3 configurations.
1. Restrict Access to Amazon S3 Resources
A core cloud security consideration is restricting access to your Amazon S3 resources so that no one who isn’t authorized ever gets to see that data. To restrict access to Amazon S3 resources you can use IAM policies, bucket policies, and ACLs. Here is how to do each of those things:
- Restrict access to S3 using IAM Policies: With IAM policies you can allow or deny actions that will be performed on Amazon S3 resources. This is the ideal solution for cases where you want to limit access for a particular user or group.
- Set bucket policies: Amazon S3 bucket policies work like IAM policies, but they are resource-based policies attached at the bucket level. A single bucket policy can grant different levels of access to different objects within the same bucket.
- Use Amazon CloudFront to serve private content: You can block direct access via Amazon S3 URLs and instead use Amazon CloudFront to serve Amazon S3 content while keeping the buckets private.
- Limit access using signed URLs: Using CloudFront’s Origin Access Identity, applications can be set up to distribute signed URLs to authenticated users so that only those users are able to access Amazon S3 content.
- Limit access using signed cookies: Signed cookies let you authenticate users by sending a Set-Cookie header to the viewer, granting those users access to content served via Amazon CloudFront.
- Manage Amazon S3 ACLs: Access Control Lists (ACLs) predate IAM policies. They work similarly to bucket policies, but an ACL grants access to a specific resource, such as a bucket or an object. ACLs have some limitations: an ACL is not as expressive an access tool as a bucket policy, which can grant or deny access at the individual API level, something ACLs can’t do. On the other hand, ACLs scale better for a large number of grants; since bucket policies are limited to 20 KB in size, a growing set of permissions may work better as ACLs. An ACL can be applied at the bucket level as well as at the object level.
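To make the bucket-policy idea concrete, here is a minimal sketch of a resource-based policy document. The bucket name, account ID, and user name are hypothetical; in practice you would pass the serialized JSON to boto3's `put_bucket_policy` or the AWS CLI.

```python
import json

BUCKET = "example-bucket"  # hypothetical bucket name

# Allow one IAM user to read objects, and deny any request
# that does not use HTTPS (a common baseline restriction).
bucket_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "AllowReadForOneUser",
            "Effect": "Allow",
            "Principal": {"AWS": "arn:aws:iam::123456789012:user/analyst"},
            "Action": ["s3:GetObject"],
            "Resource": f"arn:aws:s3:::{BUCKET}/*",
        },
        {
            "Sid": "DenyInsecureTransport",
            "Effect": "Deny",
            "Principal": "*",
            "Action": "s3:*",
            "Resource": [
                f"arn:aws:s3:::{BUCKET}",
                f"arn:aws:s3:::{BUCKET}/*",
            ],
            "Condition": {"Bool": {"aws:SecureTransport": "false"}},
        },
    ],
}

# Serialize for put_bucket_policy / `aws s3api put-bucket-policy`.
policy_json = json.dumps(bucket_policy)
```

Note that the `Deny` statement wins over any `Allow`, which is why explicit denies are a safe way to enforce transport requirements bucket-wide.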
2. Bucket and Object Deletion Restrictions
Access restriction is good for protecting Amazon S3 configurations, but it can’t guarantee full security. There will always be cases of human error where an object is deleted by mistake.
To rule out such situations, Amazon S3 users can put various restrictions on the ability to delete buckets or objects. There are two ways to do this:
- Enable versioning: To guard against accidental deletion of objects, enable versioning on your buckets. With versioning enabled, a delete only adds a delete marker; if a user updates or deletes an object by mistake, a saved version of that object can be used to restore the original.
- Enable MFA Delete on critical buckets: With MFA Delete enabled, the user must supply an MFA code before they are allowed to permanently delete an object version or change the bucket’s versioning state.
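The two settings above are applied with a single versioning configuration. Below is a sketch of that payload, with the boto3 call shown as a comment because it requires live credentials; the bucket name and MFA serial are hypothetical.

```python
# Request payload for S3 versioning with MFA Delete, as passed to
# boto3's put_bucket_versioning. MFA Delete can only be enabled by
# the root account, and only on a versioned bucket.
versioning_config = {
    "Status": "Enabled",     # keep prior versions of every object
    "MFADelete": "Enabled",  # require an MFA code for permanent deletes
}

# Not executed here (needs credentials); shape of the call:
# import boto3
# s3 = boto3.client("s3")
# s3.put_bucket_versioning(
#     Bucket="example-bucket",
#     MFA="arn:aws:iam::123456789012:mfa/root 123456",  # serial + code
#     VersioningConfiguration=versioning_config,
# )
```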
3. Monitor AWS and Your Amazon S3 Resources
A good way to understand how to strengthen security settings is by logging actions performed on Amazon S3 configurations, buckets, and objects. Services such as AWS CloudTrail and Amazon S3 server access logging can help you monitor AWS.
- Enabling API event logging: AWS CloudTrail logs API-level events, which helps you keep track of all resource-level changes. For Amazon S3, AWS CloudTrail can keep a record of API activity for any changes to buckets and objects. You can also route AWS CloudTrail logs to Amazon CloudWatch Events to get a better view of resource-level changes.
- Storing access logs: Amazon S3 server access logging records information about every access request that is made, which can help you track and understand traffic. This logging is disabled by default; once enabled, you designate a bucket as the log delivery location where access logs are stored.
- Trusted Advisor: Trusted Advisor is a paid service that can check permissions for Amazon S3 buckets and alert you to any with open access.
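Server access logs are space-delimited text records, so a small parser goes a long way when auditing traffic. The sketch below tokenizes one log line (the record here is made up, following the documented field order: bucket owner, bucket, timestamp, remote IP, requester, request ID, operation, key, request URI, status, and so on).

```python
import re

# A fabricated sample record in the S3 server access log format.
line = ('79a59df900b949e5 example-bucket [21/Mar/2018:10:00:00 +0000] '
        '192.0.2.3 arn:aws:iam::123456789012:user/analyst 3E57427F33A59F07 '
        'REST.GET.OBJECT report.csv "GET /example-bucket/report.csv HTTP/1.1" '
        '200 - 4096 4096 12 11 "-" "aws-cli/2.0" -')

# Tokenize: bracketed timestamps, quoted strings, or bare fields.
fields = re.findall(r'\[[^\]]*\]|"[^"]*"|\S+', line)

record = {
    "bucket": fields[1],
    "time": fields[2].strip("[]"),
    "remote_ip": fields[3],
    "requester": fields[4],
    "operation": fields[6],
    "key": fields[7],
    "status": int(fields[9]),
}
```

Aggregating records like this by `remote_ip` or `requester` is a quick way to spot unexpected access patterns before they become incidents.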
4. Know What to Avoid
Don’t store all your data in a single bucket: As a best practice, never store all your data in one bucket, since any mishap there can compromise everything at once. Instead, use separate buckets, which limits the blast radius of a misconfiguration. If a website served from one bucket needs to load files stored in another, use cross-origin resource sharing (CORS) to grant access across domains.
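A CORS grant is just a short configuration attached to the bucket that serves the shared files. Here is a minimal sketch, assuming a hypothetical site origin; you would pass this to boto3's `put_bucket_cors` as the `CORSConfiguration` argument.

```python
# Allow read-only cross-origin requests from one trusted site.
cors_config = {
    "CORSRules": [
        {
            "AllowedOrigins": ["https://app.example.com"],  # hypothetical origin
            "AllowedMethods": ["GET", "HEAD"],              # read-only access
            "AllowedHeaders": ["Authorization"],
            "MaxAgeSeconds": 3000,                          # browser preflight cache
        }
    ]
}

# Not executed here (needs credentials); shape of the call:
# import boto3
# boto3.client("s3").put_bucket_cors(
#     Bucket="shared-assets-bucket", CORSConfiguration=cors_config)
```

Keeping `AllowedOrigins` to an explicit list, rather than `"*"`, preserves the access limits that motivated splitting the data into separate buckets.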
Avoid getting locked out! Amazon S3 offers three types of server-side encryption (SSE):
1) SSE-S3 is Server-Side Encryption with Amazon S3-managed encryption keys, which provides strong multi-factor encryption for Amazon S3 objects: each object is encrypted, and the object key itself is safeguarded by encrypting it with a master key that Amazon S3 rotates regularly.
2) SSE-KMS uses AWS Key Management Service (KMS), a scalable cloud-based key management system that makes it easy to manage and rotate keys using IAM policies and rotation policies.
3) SSE-C is server-side encryption using customer-provided keys, which lets you encrypt data without sharing management of your key with AWS. Note that AWS does not store your key, only a hash of it; if you lose the key, you lose access to the object.
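The three modes map to different request parameters on upload. The sketch below shows the parameter sets as you would pass them to boto3's `put_object` (the KMS key alias is hypothetical, and the SSE-C key here is randomly generated for illustration; in real use you must durably retain it yourself, or the object is unrecoverable).

```python
import base64
import hashlib
import os

# SSE-S3: Amazon S3 manages the keys entirely.
sse_s3 = {"ServerSideEncryption": "AES256"}

# SSE-KMS: keys managed in AWS KMS under your policies.
sse_kms = {
    "ServerSideEncryption": "aws:kms",
    "SSEKMSKeyId": "alias/my-app-key",  # hypothetical key alias
}

# SSE-C: you supply the key; AWS keeps only its MD5 for validation.
customer_key = os.urandom(32)  # 256-bit key; retain this yourself!
sse_c = {
    "SSECustomerAlgorithm": "AES256",
    "SSECustomerKey": base64.b64encode(customer_key).decode(),
    "SSECustomerKeyMD5": base64.b64encode(
        hashlib.md5(customer_key).digest()).decode(),
}

# Not executed here (needs credentials); shape of the call:
# boto3.client("s3").put_object(
#     Bucket="example-bucket", Key="report.csv", Body=data, **sse_s3)
```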
Don’t use naked-source URLs in websites and web applications: This is something most development and security teams miss. An attacker can discover the source URL of a bucket simply by scanning your website for content loaded directly from Amazon S3. Always front Amazon S3 URLs with custom CNAMEs to avoid exposing your bucket names.
5. NetApp Tools for Protecting Amazon S3
As security is a shared responsibility, enterprises will want to use tools beyond those provided by AWS to reduce the manual effort of maintaining security configurations and to protect their data on Amazon S3. NetApp understands the risks of running in the public cloud and offers various tools that help maintain a strong security posture.
In an AWS migration, NetApp’s Cloud Sync can safely migrate copies of your NFS, CIFS, or Amazon EFS data to Amazon S3 with fast and secure data transfers that update incrementally once the baseline is set.
Cloud Sync is also a handy way to create a secondary copy of your data on Amazon S3 for archiving, analytics, testing, and more.
Since backup is always a necessary way to protect your data, it is important to find a way to store those backups efficiently or else costs will add up. For that, Cloud Volumes ONTAP (formerly ONTAP Cloud) provides powerful storage efficiencies, such as data tiering, thin provisioning, deduplication, compaction, and compression, all of which can work together to significantly cut the amount of storage needed for backup copies. Working with Cloud Volumes ONTAP, your data is protected by NetApp’s Role-based Access Control (RBAC) and NetApp Storage Encryption (NSE).
Data and files stored in cloud storage need to be protected and constantly monitored. Since cloud security breaches still take place despite the best efforts, customers using public cloud platforms must be diligent about data security in cloud computing and about protecting their Amazon S3 configurations.
Customers should use the security configurations outlined above within Amazon S3 and leverage additional tools such as Cloud Volumes ONTAP and Cloud Sync.