Best AWS Serverless Security Practices
A research report by Global Market Insights Inc indicates that the global serverless architecture market is projected to exceed 90 billion dollars by the close of 2032.
This growth projection is not surprising, considering the escalating need for swift and streamlined software development and deployment methods.
With this increasing popularity, enterprise applications that rely on serverless computing must, as a matter of urgency, be protected from ever-evolving cyber threats.
AWS is one of the most popular providers of serverless technology, if not the top one, and a great many businesses and developers already build on it. Since Becloud works extensively with AWS services, we would like to leverage our knowledge and share some valuable perspectives on the best practices for AWS serverless security, beginning with brief definitions for the sake of clarity.
What is serverless computing?
This is a computing model in which the cloud provider manages the infrastructure and automatically allocates resources as needed. Users do not have to manage servers or other infrastructure components. Instead of deploying and managing servers, developers only write and implement code in the form of functions triggered by specific events or requests.
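To make the model concrete, here is a minimal sketch of such a function in Python, assuming an S3 upload is the triggering event; the handler name and event shape are illustrative rather than prescriptive.

```python
# handler.py - a minimal AWS Lambda function triggered by an event.
# The event shape assumes an S3 "object created" trigger; adjust for your own event source.
import json


def lambda_handler(event, context):
    # Lambda passes the triggering event as a dict; here we read the uploaded object's key.
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        print(f"New object uploaded: s3://{bucket}/{key}")

    # The return value only matters for synchronous invocations (e.g. via API Gateway).
    return {"statusCode": 200, "body": json.dumps({"status": "processed"})}
```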
Serverless computing is popular in part due to the following benefits:
- Reduced costs - Users pay only for actual computing time
- Scalability - Serverless architectures automatically scale up or down with demand, which helps enterprises manage sudden spikes in traffic on their applications.
- Improved developer experience and productivity - Rather than spending time managing infrastructure, developers can focus on writing code, which saves time and lets them become more productive and better at their craft.
What is serverless security?
Serverless security refers to the measures, policies, and practices that are implemented to protect applications, data, and infrastructure in serverless computing environments.
Traditionally, most apps face security risks such as cross-site scripting, database injection, and sensitive data exposure, among many others. Security measures for these attacks often include installing firewalls and using server-based protections such as SSL/TLS certificates.
However, this approach does not work well for serverless apps, whose security centers on client-side permissions and on protecting code and behavior. This means that the code must be written securely, and access to client-side data should be strictly controlled.
As a result, serverless security acts as an additional protection layer that secures the app’s code, data, and functions, putting the application in a stronger security position.
Some of the most daunting serverless security risks include:
- Increased attack surface - Serverless apps rely on inputs from various sources, such as cloud storage and interconnected IoT devices. The links used to fetch data can become attack vectors if any of these independent points is compromised by a vulnerability.
- Misconfigurations - Insecure settings in the services offered by the provider may cause security issues. Misconfigurations, such as poorly chosen timeout settings between the server and host, can leave serverless applications prone to cyber attacks, as hackers will target the functions’ weak points.
- Broken authentication - This can be caused by mishandling a function’s security in serverless apps. Because serverless applications are stateless, a single compromised function can expose the other functions that make up the app as well.
AWS security model for serverless computing
Writing for the AWS Security Blog, Brian McNamara, a Solutions Architect at AWS with a focus on serverless, says that "Security and compliance are shared responsibilities between AWS and you, the customer".
This simply means that from a broader perspective, AWS utilizes a shared responsibility security framework that outlines the responsibilities of both AWS and its customers. The model suggests that the security “of” the cloud is AWS’ responsibility, while security “in” the cloud is the customer’s responsibility.
Overall, AWS takes a multi-layered approach to security for its serverless functions, providing a range of security features to protect user code and data.
AWS serverless security best practices
As previously stated, AWS serverless is a highly sought-after framework, which makes it a prime target for cyber criminals constantly searching for vulnerabilities in applications running on the platform. Keeping them out should be prioritized just like every other security responsibility.
Although it is impossible to address every aspect of AWS serverless computing security, our extensive experience working with this framework has allowed us to confidently recommend these top best practices for AWS serverless application security.
1. Compliance and governance
AWS provides a range of compliance and governance tools, such as compliance certifications which include SOC 2 and ISO 27001.
SOC 2 is a standard that specifies how to handle customer data based on the Trust Services Criteria: privacy, confidentiality, security, availability, and processing integrity. Following these guidelines earns your customers’ trust since they know their data is handled securely. ISO 27001 sets standards for how organizations ought to handle information security in terms of people, processes, and technology. It covers three dimensions of information security: confidentiality, integrity, and availability.
Governance tools include AWS Config, which allows you to define policies and enforce them, for example by driving functions such as authentication checks that flag users who violate usage policies.
AWS Config provides a set of pre-defined rules that you can use to monitor and enforce compliance with industry standards and best practices. These rules cover a variety of areas such as security, identity and access management, network configuration, and resource usage. A monitoring dashboard tracks configuration changes and compares them to the established rules.
You can also create custom rules in AWS Config using AWS Lambda functions. For example, you can create a custom rule to check if instances in your environment are properly tagged, or to ensure that encryption is enabled for all S3 buckets. You can also set rules for things like an unencrypted API cache, where an alert will be triggered and necessary actions taken to rectify the issue.
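As a rough illustration, a custom AWS Config rule is simply a Lambda function that evaluates a resource and reports a compliance result back to AWS Config. The sketch below assumes the rule is triggered by configuration changes to S3 buckets and that the presence of an "owner" tag is the compliance criterion; both are placeholders for your own logic, and the function’s execution role needs permission to call config:PutEvaluations.

```python
# custom_config_rule.py - sketch of a custom AWS Config rule backed by Lambda.
# Assumes configuration-change triggers on S3 buckets and an "owner" tag as the
# illustrative compliance criterion.
import json

import boto3

config = boto3.client("config")


def lambda_handler(event, context):
    # AWS Config passes the triggering details as a JSON string.
    invoking_event = json.loads(event["invokingEvent"])
    configuration_item = invoking_event["configurationItem"]

    tags = configuration_item.get("tags") or {}
    compliant = "owner" in tags  # placeholder check: require an "owner" tag

    # Report the result back to AWS Config so it appears in compliance reports.
    config.put_evaluations(
        Evaluations=[
            {
                "ComplianceResourceType": configuration_item["resourceType"],
                "ComplianceResourceId": configuration_item["resourceId"],
                "ComplianceType": "COMPLIANT" if compliant else "NON_COMPLIANT",
                "OrderingTimestamp": configuration_item["configurationItemCaptureTime"],
            }
        ],
        ResultToken=event["resultToken"],
    )
```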
Once a rule is defined, AWS Config continuously monitors your environment and generates compliance reports. You can use these reports to identify non-compliant resources and take action to remediate issues.
AWS Config can also be used in conjunction with other AWS services such as AWS CloudTrail, AWS CloudFormation, and AWS Identity and Access Management (IAM) to provide a comprehensive governance and compliance solution.
2. Code auditing
In the context of AWS Serverless, code auditing involves reviewing and analyzing the code used in serverless applications to identify potential vulnerabilities, errors, and performance issues.
To implement code auditing in AWS Serverless, you need to establish a set of guidelines and standards for code review, version control, and deployment. These guidelines should be communicated to developers and enforced through tools and processes.
You can use code analysis tools such as Amazon CodeGuru Reviewer, along with third-party tools, to automatically identify issues and recommend improvements in code quality and security. These tools can analyze code for issues such as resource leaks, code complexity, and insecure coding practices.
To ensure that code is deployed to production without issues, you can implement Continuous Integration and Continuous Deployment (CI/CD) pipelines. These pipelines automate the process of building, testing, and deploying code changes to production, ensuring that changes are deployed not just quickly but reliably.
3. Safeguard against DoS
This measure protects your network’s boundaries against Distributed Denial of Service (DDoS) attacks on external-facing APIs. AWS Shield is a managed DDoS protection service that, when coupled with the Amazon CloudFront CDN, shields your serverless functions against common layer 3 and layer 4 attacks. It provides automatic detection and mitigation of these attacks, allowing serverless applications to remain available to legitimate users even during an attack.
Another effective way to protect against DoS attacks is to implement rate-limiting mechanisms. Rate limiting controls the number of requests sent to a service, which helps prevent overload situations that could lead to service unavailability. AWS provides several services that can be used to implement rate limiting, including Amazon API Gateway and AWS WAF.
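As a rough sketch of rate limiting in practice, throttling and quota limits can be attached to an API Gateway usage plan with boto3. The API ID, stage name, and the specific limits below are placeholder assumptions to be tuned to your own traffic profile.

```python
# create_usage_plan.py - sketch of applying rate limiting to a REST API via a usage plan.
# The API ID, stage name, and limits are illustrative placeholders.
import boto3

apigateway = boto3.client("apigateway")

response = apigateway.create_usage_plan(
    name="standard-clients",
    description="Throttle requests to protect the backend from overload",
    apiStages=[{"apiId": "a1b2c3d4e5", "stage": "prod"}],
    throttle={
        "rateLimit": 100.0,   # steady-state requests per second
        "burstLimit": 200,    # maximum request burst
    },
    quota={
        "limit": 100000,      # total requests allowed per period
        "period": "MONTH",
    },
)
print("Created usage plan:", response["id"])
```

Tying the usage plan to per-client API keys then lets each caller be throttled individually.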
You can also apply the principle of timing out functions. AWS Lambda caps a function’s execution time at a maximum of 15 minutes (900 seconds), while the default timeout is just 3 seconds. This ensures that functions do not run indefinitely, which could create security risks or performance issues. It is recommended to set the timeout to the minimum amount of time required for the function to complete its task. If a function exceeds the set limit, it is automatically terminated and an error message is generated.
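A minimal sketch of tightening a function’s timeout with boto3 follows; the function name and the 10-second limit are assumptions for illustration.

```python
# set_timeout.py - sketch of restricting a Lambda function's execution time.
# The function name and 10-second limit are illustrative assumptions.
import boto3

lambda_client = boto3.client("lambda")

# Set the timeout (in seconds) to the minimum your workload actually needs.
lambda_client.update_function_configuration(
    FunctionName="order-processing",
    Timeout=10,
)
```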
In addition to these measures, it is also important to regularly monitor serverless applications for signs of DoS attacks. Monitoring can help detect and mitigate DoS attacks before they cause significant damage to the application.
4. Authentication and authorization measures
Authentication involves verifying the identity of a user or system attempting to access an application. This process typically involves validating the user's credentials, such as a username and password, or using multi-factor authentication methods.
Authorization involves determining whether a user or system is allowed to access a particular resource or perform a specific action within the application. This involves defining access controls and permissions for various resources and users.
Robust authentication and authorization mechanisms will effectively control and manage access to your serverless ecosystem, reducing the risk of unauthorized access, data breaches, and other security incidents.
The Amazon API Gateway supports multiple methods to control fine-grained user access.
You can integrate a variety of identity and access mechanisms, such as AWS Identity and Access Management (IAM), which can be configured with roles and policies for different user groups and with sessions established via secure tokens. IAM also enables you to issue temporary credentials that expire after a configurable period of time.
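For example, temporary credentials can be issued by assuming an IAM role through AWS STS. In the sketch below, the role ARN, session name, and one-hour duration are placeholder assumptions.

```python
# temporary_credentials.py - sketch of issuing short-lived credentials via AWS STS.
# The role ARN, session name, and duration are illustrative assumptions.
import boto3

sts = boto3.client("sts")

response = sts.assume_role(
    RoleArn="arn:aws:iam::123456789012:role/ReadOnlyReportingRole",
    RoleSessionName="reporting-session",
    DurationSeconds=3600,  # credentials expire after one hour
)

credentials = response["Credentials"]
# Build a scoped-down session from the temporary credentials.
session = boto3.Session(
    aws_access_key_id=credentials["AccessKeyId"],
    aws_secret_access_key=credentials["SecretAccessKey"],
    aws_session_token=credentials["SessionToken"],
)
```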
Whenever you can, apply the principle of least privilege. This is the practice of limiting user access rights to only the minimum level required to perform their job function. In other words, users should have access to only the resources and data that are necessary to complete their tasks.
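As a sketch of least privilege in practice, the policy below grants read access to a single DynamoDB table and nothing else; the table ARN, action list, and policy name are illustrative assumptions rather than a recommendation for your workload.

```python
# least_privilege_policy.py - sketch of a narrowly scoped IAM policy for a function role.
# The table ARN and policy name are illustrative assumptions.
import json

import boto3

iam = boto3.client("iam")

policy_document = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            # Only the read actions the function actually needs, on one table.
            "Action": ["dynamodb:GetItem", "dynamodb:Query"],
            "Resource": "arn:aws:dynamodb:us-east-1:123456789012:table/orders",
        }
    ],
}

iam.create_policy(
    PolicyName="orders-read-only",
    PolicyDocument=json.dumps(policy_document),
)
```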
5. Configuration management, monitoring & logging
Configuration management involves ensuring that the configuration settings of your serverless application are appropriately set up to prevent vulnerabilities and maintain security.
Monitoring involves observing the behavior and performance of the application to detect and respond to security incidents or abnormal activity.
Logging involves recording events and data related to the application to enable troubleshooting and forensic analysis in the event of a security incident.
Together, these practices can help improve the security of your serverless application by enabling you to quickly detect and respond to security incidents, as well as ensure that your application is configured and operating securely.
You will want to log every activity, whether user actions or data being fetched and stored through API calls, for post-analysis and processing. You can use monitoring and logging tools such as AWS CloudTrail or Amazon CloudWatch to monitor your serverless functions and to identify and report unwanted behavior such as the following (a minimal alarm sketch follows this list):
- Wrong credentials
- Unauthorized access to resources
- Excessive invocation of functions or multiple requests
- Unusually long execution times
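As a minimal sketch of this kind of monitoring, the alarm below fires when a single function is invoked far more often than expected within a five-minute window; the function name, threshold, and SNS topic are placeholder assumptions.

```python
# invocation_alarm.py - sketch of a CloudWatch alarm for excessive function invocations.
# The function name, threshold, and SNS topic ARN are illustrative assumptions.
import boto3

cloudwatch = boto3.client("cloudwatch")

cloudwatch.put_metric_alarm(
    AlarmName="order-processing-excessive-invocations",
    Namespace="AWS/Lambda",
    MetricName="Invocations",
    Dimensions=[{"Name": "FunctionName", "Value": "order-processing"}],
    Statistic="Sum",
    Period=300,                # evaluate over 5-minute windows
    EvaluationPeriods=1,
    Threshold=10000,           # invocations per window considered abnormal
    ComparisonOperator="GreaterThanThreshold",
    AlarmActions=["arn:aws:sns:us-east-1:123456789012:security-alerts"],
)
```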
While these tools allow you to log every component’s activity in detail, good practice and basic ethics suggest that you refrain from logging sensitive data such as passwords or data that users have not agreed to share.
Security misconfigurations can create vulnerabilities at the data endpoint of one serverless function, which can, in turn, spread an attack to the rest of the application’s functions because of their stateless nature. This calls for regular auditing of security configurations.
6. Code isolation
This refers to the practice of separating untrusted code from trusted code to prevent potential security threats. This is an important security measure to protect serverless applications against attacks from malicious code that may be introduced through third-party libraries or plugins.
Untrusted code can be introduced in several ways, such as through the use of third-party libraries or open-source packages. These code packages can contain vulnerabilities or malicious code that could compromise the security of the application. To mitigate these risks, it is recommended to isolate any untrusted code and run it in a separate environment.
In AWS serverless, one approach to isolating untrusted code is to rely on containerized, isolated execution environments. By running each function in its own isolated environment, AWS Lambda keeps any untrusted code separated from the trusted code, reducing the risk of potential security threats.
Another approach is to use a sandbox environment that provides a secure and isolated execution environment for untrusted code. This can be achieved through the use of AWS Elastic Beanstalk or AWS CodeBuild, which provide a secure environment for building and deploying code.
7. Data encryption and integrity
As we know, encryption involves converting sensitive data into an unreadable format using encryption algorithms so that unauthorized individuals cannot read or access the data.
Data integrity involves ensuring that the data in your application is not modified or tampered with during transmission or storage.
By implementing encryption and data integrity measures, you prevent attackers who intercept your application’s traffic from reading or tampering with its data.
The AWS Encryption SDK is a client-side encryption library that makes it easy for developers to encrypt and decrypt data. It uses envelope encryption, in which data is encrypted with a data encryption key, and that data key is in turn encrypted under a wrapping key.
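A minimal sketch of envelope encryption with the AWS Encryption SDK for Python is shown below, assuming you already own a KMS key to act as the wrapping key; the key ARN is a placeholder.

```python
# envelope_encryption.py - sketch of envelope encryption with the AWS Encryption SDK.
# The KMS key ARN is a placeholder; supply the ARN of a key you control.
import aws_encryption_sdk

client = aws_encryption_sdk.EncryptionSDKClient()

# The KMS key acts as the wrapping key; the SDK generates a data key under it.
key_provider = aws_encryption_sdk.StrictAwsKmsMasterKeyProvider(
    key_ids=["arn:aws:kms:us-east-1:123456789012:key/EXAMPLE-KEY-ID"]
)

plaintext = b"customer record"

ciphertext, encrypt_header = client.encrypt(source=plaintext, key_provider=key_provider)
decrypted, decrypt_header = client.decrypt(source=ciphertext, key_provider=key_provider)

assert decrypted == plaintext
```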
Data at rest, such as the data stored in your databases, can be encrypted with keys managed by the AWS Key Management Service (KMS). Data in transit, such as data submitted by users through forms, should likewise be encrypted and validated for integrity and authentication purposes.
It is also important to specify required parameters for forms, so that data submitted to the server by a user can be validated before it is processed.
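A simple way to enforce required parameters at the function boundary is to validate the payload before acting on it. The sketch below assumes an API Gateway proxy event, and the required field names are illustrative.

```python
# validate_form.py - sketch of validating required form parameters inside a Lambda handler.
# The required field names are illustrative assumptions.
import json

REQUIRED_FIELDS = {"email", "order_id"}


def lambda_handler(event, context):
    try:
        payload = json.loads(event.get("body") or "{}")
    except json.JSONDecodeError:
        return {"statusCode": 400, "body": json.dumps({"error": "invalid JSON"})}

    missing = REQUIRED_FIELDS - payload.keys()
    if missing:
        # Reject the request before any further processing.
        return {
            "statusCode": 400,
            "body": json.dumps({"error": f"missing fields: {sorted(missing)}"}),
        }

    return {"statusCode": 200, "body": json.dumps({"status": "accepted"})}
```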
Next steps
If you have multiple serverless applications with many users, or even one large application that is critical to your business, you may reach a point where managing the security component becomes a hefty task. Yet you must always guarantee optimal security, not just for the app but also for those who use it, whether they are customers, partners, or stakeholders.
If this is where you currently find yourself, it is advisable to consider transitioning to trusted professional AWS serverless security services. This is especially critical when your organization lacks the expertise or resources to do so in-house.
The right provider should help you customize the various best practices we have discussed here and tailor them to your unique environment. This includes ensuring that all your organization’s applications conform to the AWS Well-Architected Framework policies and guidelines.
Not only that, your entire app development life cycle must put security first, and AWS serverless security ought to be a core ingredient across all serverless-based projects.