How Adversaries Abuse Serverless Services to Harvest Sensitive Data from Environment Variables

 

Introduction

In cloud computing, the evolution of serverless technology has significantly transformed how developers build and run applications. Over the years, the adoption of serverless computing has grown rapidly, with developers and organizations increasingly turning to cloud deployment. The three major cloud service providers offer serverless options such as AWS Lambda, Azure Functions, and Google Cloud Functions, which have become essential tools for deploying cloud applications.

Serverless computing allows developers to focus on writing code without worrying about the underlying infrastructure, leading to faster development lifecycles and more efficient resource utilization. As a result, more organizations are leveraging these services to build and run scalable, cost-effective applications in the cloud. This trend reflects a broader shift towards modern, cloud-native application architectures that prioritize agility and operational efficiency.

Serverless environment variables contain key-value pairs that are used to configure and pass information to serverless functions and applications. DevOps engineers and cloud security engineers commonly use these environment variables because they provide a convenient and portable way to supply secrets to applications across almost all services. However, sensitive data stored in environment variables, such as API keys, database credentials, or other crucial configuration settings, can be compromised if accidentally shared or made public.

Adversaries can exploit serverless functions for persistent execution by setting them to trigger on specific cloud events, such as user creation. For example, in AWS, an attacker might create a Lambda function that adds extra cloud credentials whenever a new user is created, triggered by a CloudWatch rule. In Office 365, adversaries could use Power Automate to forward emails or create anonymous sharing links when access to a SharePoint document is granted. This method allows continuous exploitation and unauthorized access.
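The minimal boto3 sketch below illustrates roughly what such a backdoor function looks like, so defenders know what to hunt for. It mirrors behavior automated by open-source tooling rather than any specific campaign; the event shape and return value are assumptions for illustration.

# Illustrative sketch of the persistence pattern described above (defender awareness only).
# Assumes an EventBridge/CloudWatch rule forwards CloudTrail CreateUser events to this function.
import boto3

iam = boto3.client("iam")

def lambda_handler(event, context):
    # Pull the newly created user's name from the forwarded CloudTrail event
    user_name = event["detail"]["requestParameters"]["userName"]
    # The backdoor mints an extra access key for that user; a real attacker would
    # then send the key pair to infrastructure they control.
    new_key = iam.create_access_key(UserName=user_name)
    return {"created_key_id": new_key["AccessKey"]["AccessKeyId"]}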

In this blog, we will explore how threat actors can exploit sensitive data stored in serverless environment variables in AWS, Azure, GCP and Kubernetes. We will also examine the use of cloud-offensive tools for this purpose. Additionally, this blog expands on the MITRE ATT&CK Cloud Matrix for Enterprise by analyzing the Serverless Execution (T1648) technique, which falls under the Execution tactic.

(Image: Serverless Execution (T1648) technique overview from the MITRE ATT&CK Cloud Matrix)

Environment Variables in Serverless Environments

Serverless environment variables contain key-value pairs commonly used to store configuration settings and sensitive information such as tokens, API keys, database credentials, and encryption keys needed by serverless functions such as AWS Lambda, Azure Functions, and Google Cloud Functions.

Threat actors can exploit these serverless services offered by cloud service providers to execute arbitrary code, including malicious activities like crypto-mining. They can also abuse IAM (Identity and Access Management) permissions to perform privilege escalation on serverless functions. For example, attackers may use the iam:PassRole permission in AWS or the iam.serviceAccounts.actAs permission in Google Cloud to assign additional roles to a serverless function. This allows them to elevate privileges and execute actions on serverless functions beyond the original user's permissions (see the sketch below).
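As a rough illustration of the iam:PassRole abuse described above, the boto3 sketch below shows how a principal holding lambda:CreateFunction plus iam:PassRole could attach a more privileged execution role to code it controls. The function name, role ARN, and payload file are hypothetical.

# Hypothetical sketch: privilege escalation via lambda:CreateFunction + iam:PassRole
import boto3

lambda_client = boto3.client("lambda")

lambda_client.create_function(
    FunctionName="maintenance-task",  # innocuous-looking name (hypothetical)
    Runtime="python3.12",
    # Passing an execution role far more privileged than the caller's own permissions
    Role="arn:aws:iam::123456789012:role/AdminExecutionRole",  # hypothetical ARN
    Handler="lambda_function.lambda_handler",
    Code={"ZipFile": open("payload.zip", "rb").read()},  # attacker-controlled code
)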

Cloud secrets stored in environment variables

Serverless environment variables are configured differently depending on the cloud service provider. The following are common methods for configuring environment variables in AWS Lambda, Azure Functions, and Google Cloud Functions via the command-line interface (CLI):

  • In AWS Lambda, environment variables can be configured through the AWS Management Console. Additionally, the AWS CLI and AWS SDK can be used to set environment variables programmatically (a boto3 sketch follows this list).
#AWS CLI
aws lambda update-function-configuration --function-name my-prod-dev --environment "Variables={STORAGE_ACCOUNT_NAME=myStorageAccount,STORAGE_ACCOUNT_KEY=ABCDEFGHIJKLMNOPQRSTUVWXYZ1234567890EXAMPLEKEY}"

  • In Azure Functions, environment variables can be configured through the Azure Portal. Additionally, Azure CLI and Azure Resource Manager Templates can be used to set and define environment variables.
#Azure CLI
az functionapp config appsettings set --name my-prod-function-app --resource-group prod-resource-group --settings "STORAGE_ACCOUNT_NAME=prodstorageaccountname" "STORAGE_ACCOUNT_KEY=ABCDEFGHIJKLMNOPQRSTUVWXYZ1234567890EXAMPLEKEY"

  • In Google Cloud Functions, environment variables can be configured through the Google Cloud Console when deploying a function. Additionally, the gcloud command-line tool and deployment configuration files can be used to set and define environment variables.
#gcloud CLI
gcloud functions deploy my-function --set-env-vars "GOOGLE_APPLICATION_CREDENTIALS=/path/to/your/service-account-key.json"
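For the SDK route mentioned in the AWS Lambda bullet above, a minimal boto3 sketch might look like the following; the function name and variable values reuse the placeholder examples from the CLI command.

# Setting Lambda environment variables programmatically with the AWS SDK (boto3)
import boto3

lambda_client = boto3.client("lambda")

lambda_client.update_function_configuration(
    FunctionName="my-prod-dev",  # same example function as the CLI command above
    Environment={
        "Variables": {
            "STORAGE_ACCOUNT_NAME": "myStorageAccount",
            "STORAGE_ACCOUNT_KEY": "ABCDEFGHIJKLMNOPQRSTUVWXYZ1234567890EXAMPLEKEY",
        }
    },
)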

For example, environment variables can be used to configure AWS access keys, which can then be retrieved within a Lambda function to programmatically interact with other AWS services such as an S3 bucket. This allows the function to access resources without hardcoding credentials in the codebase.


Image from AWS Documentation: How to set environment variables in Linux, Windows and PowerShell


Image showing the configuration of AWS access keys in the AWS CLI.

AWS CLI Example Illustration

# Configuring AWS access keys as environment variables for the AWS CLI/SDK (Linux/macOS shell)
export AWS_ACCESS_KEY_ID=AKIAIOSFODNN7EXAMPLE
export AWS_SECRET_ACCESS_KEY=wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY

#Python code to retrieve AWS access keys from environment variables

import os
import boto3

access_key = os.environ['AWS_ACCESS_KEY_ID']
secret_key = os.environ['AWS_SECRET_ACCESS_KEY']

# Create an S3 client using the access keys
s3_client = boto3.client(
    's3',
    aws_access_key_id=access_key,
    aws_secret_access_key=secret_key
)

def lambda_handler(event, context):
    # Use the S3 client to list all buckets
    response = s3_client.list_buckets()
    return response

 

The above code sets AWS access keys as environment variables, retrieves them in a Python script, and uses them to create an S3 client with Boto3 to list all S3 buckets.

Similarly, the same environment variable functionality is available in both Azure and Google Cloud Platform (GCP) for configuring and using access keys or other sensitive information, as the snippets and the retrieval sketch below show.

#Configuring Azure storage account key as environment variables for Azure Functions application.
STORAGE_ACCOUNT_NAME = Prod-storage-name
STORAGE_ACCOUNT_KEY = ABCDEFGHIJKLMNOPQRSTUVWXYZ1234567890EXAMPLEKEY

#Configuring Google Cloud service account key as an environment variable for Google Cloud Functions.
GOOGLE_APPLICATION_CREDENTIALS = /path/to/your/service-account-key.json
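As a minimal sketch of how an application would consume these values (assuming the variable names shown above), the snippet below reads them at runtime; the Azure and Google client libraries are not required for the read itself.

# Reading the Azure and GCP values configured above from environment variables
import os

storage_account_name = os.environ["STORAGE_ACCOUNT_NAME"]
storage_account_key = os.environ["STORAGE_ACCOUNT_KEY"]

# Google client libraries pick up GOOGLE_APPLICATION_CREDENTIALS automatically;
# reading it directly simply reveals where the service account key file lives.
gcp_key_path = os.environ["GOOGLE_APPLICATION_CREDENTIALS"]

print(storage_account_name, gcp_key_path)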

 

Security Concern with Environment Variables

According to an article by CyberArk titled "Environment Variables Don't Keep Secrets: Best Practices for Plugging Application Credential Leaks," it was established that despite the flexibility of environment variables for providing global access to serverless services, the continuously evolving threat landscape has rendered this method unsuitable for securely storing sensitive data. Environment variables are now considered less secure for providing sensitive values to applications and workloads due to their inherent vulnerabilities, such as global access, potential exposure through logging, and process listing.

In containerized environments like Docker or Kubernetes, it is worth noting, as stated in the OWASP Secrets Management Cheat Sheet, that "…environment variables are generally accessible to all processes and may be included in logs or system dumps. Using environment variables is therefore not recommended unless other methods are not possible." For example, in Kubernetes, anyone with access can run kubectl exec <pod-name> -- env, which would print all the environment variables to the console.

#Assume there is a pod named my-pod
kubectl exec my-pod -- env

## COMMAND OUTPUT
PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
HOSTNAME=my-pod
KUBERNETES_PORT=tcp://10.96.0.1:443
KUBERNETES_PORT_443_TCP=tcp://10.96.0.1:443
KUBERNETES_PORT_443_TCP_PROTO=tcp
KUBERNETES_PORT_443_TCP_PORT=443
KUBERNETES_PORT_443_TCP_ADDR=10.96.0.1
DATABASE_URL=postgresql://user:password@localhost:5432/mydatabase
ENV=production
SECRET_KEY=supersecretkey
API_KEY=1234567890abcdef
NODE_ENV=production

 

When the kubectl exec my-pod -- env command is executed, it prints all the environment variables set in the container running inside my-pod. Anyone with the necessary permissions can run this command and potentially expose sensitive information stored in environment variables.
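Beyond kubectl exec, the pod specification itself exposes literal environment variable values to anyone who can read pod objects. A minimal sketch using the official Kubernetes Python client (an assumption; any client with list access on pods would do) could enumerate them across namespaces:

# Sketch: enumerate environment variables declared in pod specs cluster-wide.
# Note: this lists literal values from the spec, not variables injected at runtime.
from kubernetes import client, config

config.load_kube_config()  # or config.load_incluster_config() when running inside a pod
v1 = client.CoreV1Api()

for pod in v1.list_pod_for_all_namespaces().items:
    for container in pod.spec.containers:
        for env in (container.env or []):
            if env.value is not None:  # skip valueFrom (Secret/ConfigMap) references
                print(f"{pod.metadata.namespace}/{pod.metadata.name}: {env.name}={env.value}")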

In another case related to containerized environments, according to the blog post by Trend Micro titled "Analyzing the Hidden Danger of Environment Variables for Keeping Secrets," environment variables can be defined in various ways, such as in shell scripts, Docker, and Kubernetes. One global definition method is the export command, commonly used inside shell scripts. For containers, the -e parameter sets variables when starting a containerized application, while the ENV instruction in a Dockerfile sets variables that remain available at runtime. It is worth noting that environment variables are available in plain text within the environment and are not encrypted.

For example, the scripts below show how to set and use environment variables in different contexts: in a shell script, when starting a Docker container, within a Dockerfile, and in a Python application.

#!/bin/bash
## In Shell script
# Setting an environment variable
export DATABASE_URL="postgresql://user:password@localhost:5432/mydatabase"

# Using the environment variable
echo "The database URL is: $DATABASE_URL"

## Command used to start a containerized application
# Using the -e parameter to set an environment variable when starting a Docker container
docker run -e "DATABASE_URL=postgresql://user:password@localhost:5432/mydatabase" my-docker-image

## In Dockerfile
# Setting environment variables in a Dockerfile
FROM python:3.8-slim

# Set environment variables
ENV DATABASE_URL="postgresql://user:password@localhost:5432/mydatabase"

# Use the environment variable in the application
CMD ["python", "app.py"]

## Python code (app.py)
import os

# Retrieving the environment variable in a Python application
database_url = os.getenv('DATABASE_URL')

print(f"The database URL is: {database_url}")

 

Security Concerns with Storing Credentials in Serverless Environments

From a security standpoint, according to Trend Micro's published white paper on 'Securing Weak Points in Serverless Architectures: Risks and Recommendations,' it was established that credentials, such as tokens, access keys and API keys stored as environment variables within serverless runtime environments such as AWS Lambda could be abused in different ways by attackers. Threat actors can gain unauthorized control over AWS services by compromising applications within serverless environments, such as AWS Lambda, through techniques like injecting malicious shell commands into functions with high permissions. Once compromised, the attackers can download the cloud service provider's (CSP) official CLI tool inside the serverless environment. They can then use the extracted secrets, which are stored as plain text in environment variables like AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY, to gain access to critical resources or take control of the entire account. Exposed credentials in serverless environment variables provide attackers with the means to configure the AWS CLI and access AWS services without additional authentication.
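To see how exposed these values are, the sketch below is something a defender could drop into a test function: any code executing in the Lambda runtime, including injected commands, can read the same variables. This is a minimal illustration, and it deliberately returns only variable names, not values.

# Sketch: show which AWS_* variables the Lambda runtime exposes to any code in the function
import os

def lambda_handler(event, context):
    aws_vars = sorted(name for name in os.environ if name.startswith("AWS_"))
    # AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY and AWS_SESSION_TOKEN are among them;
    # values are intentionally not returned here.
    return {"aws_env_var_names": aws_vars}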

In a research blog by Cado Security titled, "Cado Discovers Denonia: The First Malware Specifically Targeting Lambda," also noted in the Wiz cloud threat landscape as the Denonia campaign, it was observed that threat actors (name withheld) use a new type of malware named Denonia (labeled by Cado) to target AWS Lambda environments. Cado Security notes that Denonia's delivery and deployment methods are not known at the moment, but it is designed to infect Lambda instances by using the aws-lambda-go library to enable execution inside Lambda environments and to search for Lambda-related environment variables. The Denonia malware delivers XMRig, an open-source Monero cryptocurrency mining software, which malicious actors often weaponize to run undetected on compromised systems by using DNS over HTTPS (DoH) to encrypt DNS queries and avoid detection.

Abuse of Open-Source Offensive Tools

In AWS, threat actors may utilize open-source cloud-offensive tools like Pacu and Cloudfox to deploy malicious Lambda functions. These tools provide modules that can be exploited to enumerate and manipulate AWS serverless services, potentially allowing attackers to execute unauthorized actions.

In Pacu, the following modules can be used by threat actors in AWS to manipulate serverless functions for malicious purposes:

  • The module lambda__enum enumerates Lambda functions.
  • The module lambda__backdoor_new_roles adds new roles with backdoor access.
  • The module lambda__backdoor_new_sec_groups alters security groups to grant unauthorized access.
  • The module lambda__backdoor_new_users creates new users with elevated privileges.

(Image: Pacu modules that can be used to manipulate Lambda functions)

In Cloudfox, the env-vars module can be used to enumerate environment variables within AWS serverless environments. This module scans services such as App Runner, Elastic Container Service, Lambda, Lightsail Containers, and SageMaker to identify secrets stored in their environment variables, including API keys, tokens, and credentials. This capability can potentially be exploited by threat actors to uncover sensitive information.

  • When executed, the following CLI command scans AWS Lambda functions (among other supported services) for secrets in environment variables.
# Run Cloudfox to scan AWS Lambda functions for secrets in environment variables
โฏ cloudfox aws --profile attacker --region us-west-2 env-vars
[๐ŸฆŠ cloudfox v1.6.0 ๐ŸฆŠ ] AWS Caller Identity: arn:aws:sts::029933748493:assumed-role/CloudFox-exec-role/aws-go-sdk-1662942784490595000
[env-vars] Enumerating environment variables in all regions for account 029933748493.
[env-vars] Supported Services: App Runner, Elastic Container Service, Lambda, Lightsail Containers, Sagemaker
[env-vars] Status: 12/8 tasks complete (48 errors -- For details check /Users/perm-sandbox/.cloudfox/cloudfox-error.log)
โ•ญโ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”ฌโ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”ฌโ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”ฌโ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”ฌโ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€----------------โ•ฎ
โ”‚ Service โ”‚ Region โ”‚ Name โ”‚ Key โ”‚ Value โ”‚
โ”œโ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”ผโ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”ผโ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”ผโ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”ผโ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€----------------โ”ค
โ”‚ Lambda โ”‚ us-west-2 โ”‚ my-lambda-function-1 โ”‚ DATABASE_URL โ”‚ postgresql://user:password@localhost:5432/mydatabase โ”‚
โ”‚ Lambda โ”‚ us-west-2 โ”‚ my-lambda-function-1 โ”‚ SECRET_KEY โ”‚ supersecretkey โ”‚
โ”‚ Lambda โ”‚ us-west-2 โ”‚ my-lambda-function-1 โ”‚ AWS_ACCESS_KEY_ID โ”‚ AKIAIOSFODNN7EXAMPLE โ”‚
โ”‚ Lambda โ”‚ us-west-2 โ”‚ my-lambda-function-1 โ”‚ AWS_SECRET_ACCESS_KEY โ”‚ wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY โ”‚
โ”‚ Lambda โ”‚ us-west-2 โ”‚ my-lambda-function-2 โ”‚ API_TOKEN โ”‚ 1234567890abcdef โ”‚
โ”‚ Lambda โ”‚ us-west-2 โ”‚ my-lambda-function-2 โ”‚ S3_BUCKET โ”‚ my-secure-bucket โ”‚
โ”‚ Lambda โ”‚ us-west-2 โ”‚ my-lambda-function-2 โ”‚ SMTP_PASSWORD โ”‚ emailpassword โ”‚
โ•ฐโ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”ดโ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”ดโ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”ดโ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”ดโ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€----------------โ•ฏ
[env-vars] Output written to [cloudfox-output/aws/attacker/table/env-vars.txt]
[env-vars] Output written to [cloudfox-output/aws/attacker/csv/env-vars.csv]
[env-vars] 7 environment variables found.

Recommendations:
1. Move secrets to AWS Secrets Manager or AWS Systems Manager Parameter Store.
2. Update Lambda functions to retrieve secrets securely from the secret management service.
3. Review and implement strict access controls to environment variables.

It's worth mentioning that DevOps and cloud security engineers can utilize these cloud offensive tools (Pacu and Cloudfox) to simulate potential attacks. These simulations help them identify vulnerabilities and allow engineers to take proactive measures to secure their cloud environments. By using these tools, they can enhance the security of serverless deployments and ensure a more robust defense against potential threats.

Potential Threats with Environment Variables and Mitigations

  1. Exposure in Code Repositories:

    • Threat: If environment variables are hardcoded or included in configuration files that are committed to version control systems (e.g., GitHub), they can be exposed to unauthorized users.

      environment-variable-secret-exposure-bug

      (Image showing "sensitive" environmental variables REMOTE_WRITE_PASSWORD and CLOUD_PROVIDER_API_KEY defined as plain text in Kubernetes)

    • Mitigation: Use .gitignore files to exclude configuration files from being committed, and utilize environment-specific configuration management.


      Illustration

      Assume a configuration file called config.env contains environment variables for a project, and this file is accidentally committed to a Git repository.

# config.env File:
DATABASE_URL=postgresql://user:password@localhost:5432/mydatabase
SECRET_KEY=supersecretkey
API_KEY=1234567890abcdef

# Git Add and Commit:
$ git add config.env
$ git commit -m "Add config.env with environment variables"

#Viewing Committed File:
$ cat config.env
DATABASE_URL=postgresql://user:password@localhost:5432/mydatabase
SECRET_KEY=supersecretkey
API_KEY=1234567890abcdef

#Mitigation
# Stop tracking the committed file, then exclude it via .gitignore
$ git rm --cached config.env
$ echo "config.env" >> .gitignore
$ git add .gitignore
$ git commit -m "Stop tracking config.env and add it to .gitignore"
# Note: the secrets already committed remain in Git history and should be rotated.

  2. Env Var Application Misconfiguration:

    • Threat: Improperly configured environment variables can lead to security vulnerabilities, such as exposing secrets to unauthorized parts of the application.

    • Mitigation: Use tools and frameworks that provide secure handling of environment variables and follow best practices for configuration management.

      Illustration

      Imagine an environment variable for a database password set in a Docker container, but it is exposed to the entire application rather than just the part that needs it.

FROM python:3.8-slim

# Set environment variable for database password
ENV DATABASE_PASSWORD="supersecretpassword"
# Set environment variable for API key
ENV API_KEY="12345-abcdef-67890"
# Copy the application code
COPY . /app
# Set the working directory
WORKDIR /app
# Install dependencies
RUN pip install -r requirements.txt
# Run the application
CMD ["python", "app.py"]

#Python Code (app.py):
import os

def connect_to_database():
    # Retrieve the database password from environment variable
    db_password = os.getenv('DATABASE_PASSWORD')
    print(f"Connecting to the database with password: {db_password}")

def access_api():
    # Retrieve the API key from environment variable
    api_key = os.getenv('API_KEY')
    print(f"Accessing API with key: {api_key}")

if __name__ == "__main__":
    connect_to_database()
    access_api()

#NB: Once the Docker container is executed, the environment variables are exposed to the entire application.
#Execute Docker container
docker run my-docker-image

#Console Output:
Connecting to the database with password: supersecretpassword
Accessing API with key: 12345-abcdef-67890

#Mitigation
* Use Docker Secrets for Secure Handling of environment variables.
* Ensure that only the services that need access to the secrets can retrieve them.
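A minimal sketch of the Docker Secrets approach, assuming a Swarm/Compose secret named db_password (hypothetical), reads the value from the mounted file instead of an environment variable:

# Sketch: read a Docker secret from its mounted file instead of an environment variable
from pathlib import Path

def read_secret(name: str) -> str:
    # Docker mounts secrets as in-memory files under /run/secrets/<name>
    return Path(f"/run/secrets/{name}").read_text().strip()

db_password = read_secret("db_password")  # hypothetical secret name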

  3. Process Listing:

    • Threat: On some operating systems, environment variables can be viewed by other users on the same system through process listing commands like ps or top.

    • Mitigation: Ensure that only trusted users have access, configure user permissions based on the principle of least privilege, and run sensitive applications under dedicated user accounts with restricted access.

      Illustration

      Assume there is a script named run_my_app.sh that sets an environment variable and runs a process.

#run_my_app.sh - script 
#!/bin/bash

# Setting an environment variable
export SECRET_KEY="supersecretkey"

# Running a dummy application (e.g., sleep)
sleep 1000

#Execute command to run the script in the background:
./run_my_app.sh &

#Viewing Environment Variables with ps Command
ps e -o pid,cmd | grep sleep

#OUTPUT
12345 sleep 1000 SECRET_KEY=supersecretkey

  4. Logging and Error Handling:

    • Threat: Sensitive data in environment variables can be accidentally logged or included in error messages.

    • Mitigation: Ensure that logging configurations do not output environment variable values and sanitize error messages to exclude sensitive information. A redaction sketch follows the illustration below.

Illustration

Consider a simple Python application that inadvertently logs environment variables and sensitive data.


#python Code (app.py)
import os
import logging

# Configure logging
logging.basicConfig(level=logging.INFO)

# Set some environment variables
os.environ['DATABASE_URL'] = 'postgresql://user:password@localhost:5432/mydatabase'
os.environ['SECRET_KEY'] = 'supersecretkey'

# Simulate an error that logs all environment variables
try:
    raise Exception("Something went wrong!")
except Exception as e:
    logging.error(f"Error occurred: {e}")
    logging.info("Current environment variables: %s", os.environ)

# NB: Once the above python code (app.py) is executed,
# it logs all the environment variables
ERROR:root:Error occurred: Something went wrong!
INFO:root:Current environment variables: environ({
'PATH': '/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin',
'HOSTNAME': 'my-pod',
'DATABASE_URL': 'postgresql://user:password@localhost:5432/mydatabase',
'SECRET_KEY': 'supersecretkey',
...
})
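One way to reduce this risk, sketched below under the assumption that the sensitive variable names are known in advance, is a logging filter that redacts their values before records are emitted. Even with redaction in place, the safer default is simply not to log os.environ at all.

# Sketch: redact known sensitive environment variable values from log output
import logging
import os

SENSITIVE_VARS = {"DATABASE_URL", "SECRET_KEY", "API_KEY", "AWS_SECRET_ACCESS_KEY"}

class RedactEnvFilter(logging.Filter):
    def filter(self, record: logging.LogRecord) -> bool:
        message = record.getMessage()
        for name in SENSITIVE_VARS:
            value = os.environ.get(name)
            if value and value in message:
                message = message.replace(value, "[REDACTED]")
        record.msg, record.args = message, ()
        return True

logging.basicConfig(level=logging.INFO)
for handler in logging.getLogger().handlers:
    handler.addFilter(RedactEnvFilter())

logging.info("Current environment variables: %s", dict(os.environ))  # values now redacted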

 

Detection and Mitigation Strategies

CSP cloud-native tools

The following CSP cloud-native tools can be used to mitigate the risks associated with storing cloud secrets in environment variables and ensure proper management of these secrets.

  1. AWS Secrets Manager:
    • Functionality: AWS Secrets Manager is used to manage and retrieve secrets, ensuring they are not stored directly in environment variables.
    • Use Case: It replaces hardcoded secrets in environment variables with references to AWS Secrets Manager (see the retrieval sketch after this list).
  2. HashiCorp Vault:
    • Functionality: HashiCorp Vault is used to securely store and manage sensitive data, providing dynamic secrets for applications.
    • Use Case: It can be integrated with applications to retrieve secrets securely at runtime instead of using environment variables.
  3. Azure Key Vault:
    • Functionality: Azure Key Vault helps safeguard cryptographic keys and secrets used by cloud applications and services, ensuring they are not exposed in environment variables.
    • Use Case: Manages and secures access to sensitive data, including API keys, passwords, and certificates, which can be referenced in environment variables without exposing the actual values.
  4. Google Cloud Secret Manager:
    • Functionality: Google Cloud Secret Manager is used to securely store and manage access to sensitive information such as API keys, passwords, and certificates. It also allows for the centralized management of secrets and integrates seamlessly with Google Cloud services.
    • Use Case: Itโ€™s used to securely store and access secrets in applications running on Google Cloud Platform, ensuring that sensitive data is not exposed in environment variables or hardcoded.
  5. Kubernetes Secrets: In a Reddit chat on how to manage secrets in Kubernetes, the following tools were recommended for enhancing Kubernetes secret management: Vault, AWS Secrets Manager, Secrets Store CSI Driver, ArgoCD with Vault Plugin, and KMS Encryption.
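As a minimal sketch of the AWS Secrets Manager use case above (the secret name and JSON layout are assumptions), a Lambda function can fetch a credential at runtime instead of reading it from an environment variable. Note that the function's execution role needs secretsmanager:GetSecretValue on that specific secret.

# Sketch: retrieve a secret from AWS Secrets Manager inside a Lambda function
import json
import boto3

secrets_client = boto3.client("secretsmanager")

def lambda_handler(event, context):
    # "prod/db-credentials" is a hypothetical secret name
    response = secrets_client.get_secret_value(SecretId="prod/db-credentials")
    creds = json.loads(response["SecretString"])
    # ... use creds["username"] / creds["password"] to open the database connection ...
    return {"status": "ok"}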

Similarly, check out our previous blog post on "Exploiting Cloud Secrets Management Repositories: Adversary Tactics and Mitigation Strategies" to understand security strategies for protecting cloud secrets and resources stored in CSP cloud-native management repositories in AWS, Azure, and GCP.

Detection Strategies

  1. Monitor Serverless Logs: Regularly review logs generated by serverless functions for unusual activities, such as unexpected function invocations or anomalies in execution patterns.
  2. Track IAM Activity: Monitor changes in IAM roles and permissions, particularly for actions like role creation or modification, which could indicate privilege escalation attempts.
  3. Configuration Changes: Monitor changes to environment variables, triggers, permissions, and other configurations, as these can signal attempts to gain unauthorized access or escalate privileges (a CloudTrail lookup sketch follows this list).
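As one example of monitoring configuration changes (detection point 3), the boto3 sketch below looks back over recent CloudTrail management events from the Lambda service and flags function-configuration updates; the lookback window and filtering are illustrative assumptions, not a complete detection.

# Sketch: flag recent Lambda configuration changes (including environment variable updates)
# using CloudTrail management events.
from datetime import datetime, timedelta, timezone
import boto3

cloudtrail = boto3.client("cloudtrail")
start_time = datetime.now(timezone.utc) - timedelta(days=1)

pages = cloudtrail.get_paginator("lookup_events").paginate(
    LookupAttributes=[{"AttributeKey": "EventSource", "AttributeValue": "lambda.amazonaws.com"}],
    StartTime=start_time,
)

for page in pages:
    for event in page["Events"]:
        # Configuration-update event names carry an API-version suffix, so match on the prefix.
        if event["EventName"].startswith("UpdateFunctionConfiguration"):
            print(event["EventTime"], event.get("Username", "unknown"), event["EventName"])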

Mitigation Strategies

The following are mitigation best practices that DevOps and cloud security engineers should adopt for proper management of their cloud secrets in serverless environments.

  1. Use Secret Management Services: Replace environment variables containing sensitive data with secure secret management solutions like AWS Secrets Manager, Azure Key Vault, and Google Cloud Secret Manager.
  2. Encrypt Environment Variables: Ensure that any sensitive data stored in environment variables is encrypted both at rest and in transit.
  3. Principle of Least Privilege: Limit permissions to create, modify, or run serverless functions only to users and services that explicitly require them. This minimizes the attack surface by restricting access to critical resources.
  4. Regular Audits: Conduct regular audits of environment variables to ensure no sensitive data is exposed (see the audit sketch after this list).
  5. Monitoring and Alerts: Set up monitoring and alerts for unusual activities related to serverless resources, helping to quickly detect and respond to potential security incidents.
  6. Offensive Security Testing: Adopt cloud offensive tools such as Pacu and CloudFox to simulate credential harvesting against your serverless environment variables.
  7. Environment Segmentation: Use separate environments (e.g., development, testing, production) to isolate serverless resources and limit potential impact in case of a compromise.
  8. Immutable Infrastructure: Use Infrastructure as Code (IaC) to define and manage serverless resources, ensuring consistency and traceability.
  9. Continuous Training: Regularly train users and administrators on security best practices and the secure use of serverless technologies.
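For the regular-audits recommendation above, a lightweight sketch (the name patterns are assumptions) can enumerate Lambda environment variable keys and flag ones that look like secrets, without printing any values:

# Sketch: audit Lambda functions for environment variable names that look like secrets
import re
import boto3

SUSPECT_NAME = re.compile(r"(KEY|SECRET|TOKEN|PASSWORD|CREDENTIAL)", re.IGNORECASE)

lambda_client = boto3.client("lambda")

for page in lambda_client.get_paginator("list_functions").paginate():
    for function in page["Functions"]:
        variables = function.get("Environment", {}).get("Variables", {})
        flagged = [name for name in variables if SUSPECT_NAME.search(name)]
        if flagged:
            print(f"{function['FunctionName']}: review variables {flagged}")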

Conclusion

In this blog, we discussed how using serverless environment variables has become a common practice among developers and DevOps professionals for storing sensitive data such as API keys, database credentials, and other crucial configuration settings. We also explored how threat actors can exploit vulnerabilities in serverless services to extract secrets from environment variables, potentially gaining access to critical resources or taking control of entire accounts.

From a security perspective, as established in this blog, it is recommended that the DevOps community and cyber defenders adopt the detection and mitigation strategies detailed here. Additionally, organizations' security teams should consider using cloud-based offensive tools like Pacu and CloudFox to simulate credential harvesting in their serverless environments.
