In cloud computing, the evolution of serverless technology has significantly transformed how developers build and run applications. Over the years, the adoption of serverless computing has grown rapidly, with developers and organizations increasingly turning to cloud deployment. The three major cloud service providers offer serverless options such as AWS Lambda, Azure Functions, and Google Cloud Functions, which have become essential tools for deploying cloud applications.
Serverless computing allows developers to focus on writing code without worrying about the underlying infrastructure, leading to faster development lifecycles and more efficient resource utilization. As a result, more organizations are leveraging these services to build and run scalable, cost-effective applications in the cloud. This trend reflects a broader shift towards modern, cloud-native application architectures that prioritize agility and operational efficiency.
Serverless environment variables contain key-value pairs that are used to configure and pass information to serverless functions and applications. DevOps engineers and cloud security engineers commonly use these environment variables because they provide a convenient and portable way to supply secrets to applications across almost all services. However, sensitive data stored in environment variables, such as API keys, database credentials, or other crucial configuration settings, can be compromised if accidentally shared or made public.
Adversaries can exploit serverless functions for persistent execution by setting them to trigger on specific cloud events, such as user creation. For example, in AWS, an attacker might create a Lambda function that adds extra cloud credentials whenever a new user is created, triggered by a CloudWatch rule. In Office 365, adversaries could use Power Automate to forward emails or create anonymous sharing links when access to a SharePoint document is granted. This method allows continuous exploitation and unauthorized access.
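As a rough sketch of how such an event trigger could be wired up with the AWS CLI (the function name, rule name, and account ID below are hypothetical, and CloudTrail is assumed to be delivering IAM events to EventBridge/CloudWatch Events):
#Create a rule that fires whenever a new IAM user is created
aws events put-rule --name persist-on-user-creation --event-pattern '{"source":["aws.iam"],"detail-type":["AWS API Call via CloudTrail"],"detail":{"eventSource":["iam.amazonaws.com"],"eventName":["CreateUser"]}}'
#Point the rule at the attacker-controlled Lambda function
aws events put-targets --rule persist-on-user-creation --targets 'Id=1,Arn=arn:aws:lambda:us-east-1:123456789012:function:add-backdoor-keys'
#Allow EventBridge to invoke the function
aws lambda add-permission --function-name add-backdoor-keys --statement-id eventbridge-invoke --action lambda:InvokeFunction --principal events.amazonaws.com --source-arn arn:aws:events:us-east-1:123456789012:rule/persist-on-user-creation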
In this blog, we will explore how threat actors can exploit sensitive data stored in serverless environment variables in AWS, Azure, GCP and Kubernetes. We will also examine the use of cloud-offensive tools for this purpose. Additionally, this blog expands on the MITRE ATT&CK Cloud Matrix for Enterprise by analyzing the Serverless Execution (T1648) technique, which falls under the Execution tactic.
Serverless environment variables contain key-value pairs commonly used to store configuration settings and sensitive information, such as tokens, API keys, database credentials, and encryption keys, needed by serverless functions such as AWS Lambda, Azure Functions, and Google Cloud Functions.
Threat actors can exploit these serverless services offered by cloud service providers to execute arbitrary code, including malicious activities like crypto-mining. They can also abuse IAM (Identity and Access Management) permissions to perform privilege escalation on serverless functions. For example, attackers may use the iam:PassRole permission in AWS or the iam.serviceAccounts.actAs permission in Google Cloud to assign additional roles to a serverless function. This allows them to elevate privileges and execute actions on serverless functions beyond the original user's permissions.
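As a minimal sketch of the iam:PassRole abuse described above (the role ARN, account ID, and function name are hypothetical), an attacker who also holds lambda:CreateFunction could pass a high-privilege role to a new function:
#Pass an administrator role to a newly created Lambda function
aws lambda create-function --function-name privesc-function --runtime python3.12 --handler index.handler --role arn:aws:iam::123456789012:role/AdminRole --zip-file fileb://payload.zip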
Serverless environment variables are configured differently depending on the cloud service provider. However, the following are common formats and methods for configuring environment variables in AWS Lambda, Azure Functions, and Google Cloud Functions via the command-line interface (CLI):
#AWS CLI
aws lambda update-function-configuration --function-name my-prod-dev --environment "Variables={STORAGE_ACCOUNT_NAME=myStorageAccount,STORAGE_ACCOUNT_KEY=ABCDEFGHIJKLMNOPQRSTUVWXYZ1234567890EXAMPLEKEY}"
#Azure CLI
az functionapp config appsettings set --name my-prod-function-app --resource-group prod-resource-group --settings "STORAGE_ACCOUNT_NAME=prodstorageaccountname" "STORAGE_ACCOUNT_KEY=ABCDEFGHIJKLMNOPQRSTUVWXYZ1234567890EXAMPLEKEY"
#gcloud CLI
gcloud functions deploy my-function --set-env-vars "GOOGLE_APPLICATION_CREDENTIALS=/path/to/your/service-account-key.json"
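Once set, these variables can be read back in plain text by anyone with the corresponding read permissions. The commands below sketch this, reusing the example names from above:
#AWS CLI: anyone with lambda:GetFunctionConfiguration can view the variables
aws lambda get-function-configuration --function-name my-prod-dev --query 'Environment.Variables'
#Azure CLI: app settings are listed with their values
az functionapp config appsettings list --name my-prod-function-app --resource-group prod-resource-group
#gcloud CLI: the function description includes its environment variables
gcloud functions describe my-function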
For example, when using the AWS CLI, environment variables can be used to configure AWS access keys, which can then be retrieved within a Lambda function to programmatically interact with other AWS services, such as an S3 bucket. This allows the function to access resources without hardcoding credentials in the codebase.
Image from AWS Documentation: How to set environment variables in Linux, Windows and PowerShell
Image showing the configuration of AWS access keys in the AWS CLI.
#Configuring environment variables for the AWS CLI (set here via export in the shell)
export AWS_ACCESS_KEY_ID=AKIAIOSFODNN7EXAMPLE
export AWS_SECRET_ACCESS_KEY=wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY
#Python code to retrieve AWS access keys from environment variables
import os
import boto3

# Read the access keys from the environment
access_key = os.environ['AWS_ACCESS_KEY_ID']
secret_key = os.environ['AWS_SECRET_ACCESS_KEY']

# Create an S3 client using the access keys
s3_client = boto3.client(
    's3',
    aws_access_key_id=access_key,
    aws_secret_access_key=secret_key
)

def lambda_handler(event, context):
    # Use the S3 client to list all buckets
    response = s3_client.list_buckets()
    return response
The above code sets AWS access keys as environment variables, retrieves them in a Python script, and uses them to create an S3 client with Boto3 to list all S3 buckets.
Similarly, the above environment variable functionality is possible in both Azure and Google Cloud Platform (GCP) for configuring and using access keys or other sensitive information through environment variables.
#Configuring Azure storage account key as environment variables for Azure Functions application.
STORAGE_ACCOUNT_NAME = Prod-storage-name
STORAGE_ACCOUNT_KEY = ABCDEFGHIJKLMNOPQRSTUVWXYZ1234567890EXAMPLEKEY
#Configuring Google Cloud service account key as an environment variable for Google Cloud Functions.
GOOGLE_APPLICATION_CREDENTIALS = /path/to/your/service-account-key.json
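As a minimal Python sketch (assuming the azure-storage-blob and google-cloud-storage client libraries are installed), the variables above can be consumed in the same way as the AWS example:
import os
from azure.storage.blob import BlobServiceClient
from google.cloud import storage

# Azure: build a Blob service client from the account name and key in environment variables
account_name = os.environ['STORAGE_ACCOUNT_NAME']
account_key = os.environ['STORAGE_ACCOUNT_KEY']
blob_service = BlobServiceClient(account_url=f"https://{account_name}.blob.core.windows.net", credential=account_key)
print([container.name for container in blob_service.list_containers()])

# GCP: the client library automatically reads GOOGLE_APPLICATION_CREDENTIALS
gcs_client = storage.Client()
print([bucket.name for bucket in gcs_client.list_buckets()])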
According to an article by CyberArk titled "Environment Variables Don’t Keep Secrets: Best Practices for Plugging Application Credential Leaks," the flexibility that makes environment variables convenient for supplying values to serverless services also makes them unsuitable, in today's threat landscape, for securely storing sensitive data. Environment variables are now considered a weak mechanism for providing sensitive values to applications and workloads because of inherent exposure paths such as global access within the process, leakage through logging, and process listing.
In containerized environments like Docker or Kubernetes, it is worth noting, as stated in the OWASP Secrets Management Cheat Sheet, that “…environment variables are generally accessible to all processes and may be included in logs or system dumps. Using environment variables is therefore not recommended unless other methods are not possible.” For example, in Kubernetes, anyone with access can run kubectl exec <pod-name> -- env, which prints all the environment variables to the console.
#Assume there is a pod named my-pod
kubectl exec my-pod -- env
## COMMAND OUTPUT
PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
HOSTNAME=my-pod
KUBERNETES_PORT=tcp://10.96.0.1:443
KUBERNETES_PORT_443_TCP=tcp://10.96.0.1:443
KUBERNETES_PORT_443_TCP_PROTO=tcp
KUBERNETES_PORT_443_TCP_PORT=443
KUBERNETES_PORT_443_TCP_ADDR=10.96.0.1
DATABASE_URL=postgresql://user:password@localhost:5432/mydatabase
ENV=production
SECRET_KEY=supersecretkey
API_KEY=1234567890abcdef
NODE_ENV=production
When the kubectl exec my-pod -- env command is executed, it prints all the environment variables set in the container running inside my-pod. Anyone with the necessary permissions can run this command and potentially expose sensitive information stored in environment variables.
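A safer pattern in Kubernetes is to store such values as Secrets and mount them into the pod as files rather than injecting them through env. A minimal sketch (names are hypothetical):
#Create a Kubernetes Secret instead of a plain environment variable
kubectl create secret generic db-credentials --from-literal=DATABASE_URL='postgresql://user:password@localhost:5432/mydatabase'
#Mounted as a volume, the value no longer appears in 'kubectl exec my-pod -- env' output;
#note that Secret values are only base64-encoded, so RBAC access to Secrets must still be restricted
kubectl describe secret db-credentials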
In another case related to containerized environments, according to the blog post by Trend Micro titled “Analyzing the Hidden Danger of Environment Variables for Keeping Secrets,” environment variables can be defined in various ways, such as in shell scripts, Docker, and Kubernetes. One global definition method is the export command, which is commonly used inside shell scripts. Variables can also be passed with the -e parameter when starting a containerized application, or set during a container build, where the ENV instruction in a Dockerfile indicates that the variable will be set at runtime. It is worth noting that environment variables are available in plain text within the environment and are not encrypted.
For example, the scripts below show how to set and use environment variables in different contexts: in a shell script, when starting a Docker container, within a Dockerfile, and in a Python application.
#!/bin/bash
## In Shell script
# Setting an environment variable
export DATABASE_URL="postgresql://user:password@localhost:5432/mydatabase"
# Using the environment variable
echo "The database URL is: $DATABASE_URL"
## Command used to start a containerized application
# Using the -e parameter to set an environment variable when starting a Docker container
docker run -e "DATABASE_URL=postgresql://user:password@localhost:5432/mydatabase" my-docker-image
## In Dockerfile
# Setting environment variables in a Dockerfile
FROM python:3.8-slim
# Set environment variables
ENV DATABASE_URL="postgresql://user:password@localhost:5432/mydatabase"
# Use the environment variable in the application
CMD ["python", "app.py"]
## Python code (app.py)
import os
# Retrieving the environment variable in a Python application
database_url = os.getenv('DATABASE_URL')
print(f"The database URL is: {database_url}")
From a security standpoint, Trend Micro's white paper "Securing Weak Points in Serverless Architectures: Risks and Recommendations" established that credentials such as tokens, access keys, and API keys stored as environment variables within serverless runtime environments such as AWS Lambda can be abused in different ways by attackers. Threat actors can gain unauthorized control over AWS services by compromising applications within serverless environments, such as AWS Lambda, through techniques like injecting malicious shell commands into functions with high permissions. Once a function is compromised, the attackers can download the cloud service provider’s (CSP) official CLI tool inside the serverless environment. They can then use the extracted secrets, which are stored as plain text in environment variables like AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY, to gain access to critical resources or take control of the entire account. Exposed credentials in serverless environment variables provide attackers with the means to configure the AWS CLI and access AWS services without additional authentication.
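In practice, the harvested variables can be fed straight into the AWS CLI. A minimal sketch using the example keys from earlier (inside Lambda, the execution-role credentials also include AWS_SESSION_TOKEN, which would need to be exported as well):
#Configure the CLI with the harvested credentials
export AWS_ACCESS_KEY_ID=AKIAIOSFODNN7EXAMPLE
export AWS_SECRET_ACCESS_KEY=wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY
#Confirm which identity the keys map to, then start enumerating resources
aws sts get-caller-identity
aws s3 ls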
In a research blog by Cado Security titled "Cado Discovers Denonia: The First Malware Specifically Targeting Lambda," also noted in the Wiz cloud threat landscape as the Denonia campaign, it was observed that threat actors (name withheld) use a new type of malware named Denonia (labeled by Cado) to target AWS Lambda environments. Cado Security notes that Denonia's delivery and deployment methods are not known at the moment, but it is designed to infect Lambda instances by using the aws-lambda-go library to enable execution inside Lambda environments and to search for Lambda-related environment variables. The Denonia malware delivers XMRig, an open-source Monero cryptocurrency mining software, which malicious actors often weaponize to run undetected on compromised systems by using DNS over HTTPS (DoH) to encrypt DNS queries and avoid detection.
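Denonia itself is a Go binary, but the environment check described above is simple to illustrate. The following is a minimal Python sketch of looking for Lambda-related environment variables (these are reserved variables that the Lambda runtime sets automatically):
import os
# Reserved variables set automatically by the Lambda runtime; their presence
# tells code (malicious or otherwise) that it is running inside a Lambda environment
lambda_markers = ("AWS_LAMBDA_FUNCTION_NAME", "AWS_LAMBDA_RUNTIME_API", "LAMBDA_TASK_ROOT")
in_lambda = any(marker in os.environ for marker in lambda_markers)
print(f"Running inside AWS Lambda: {in_lambda}")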
In AWS, threat actors may utilize open-source cloud-offensive tools like Pacu and Cloudfox to deploy malicious Lambda functions. These tools provide modules that can be exploited to enumerate and manipulate AWS serverless services, potentially allowing attackers to execute unauthorized actions.
In Pacu, the following modules can be used by threat actors in AWS to manipulate serverless services for malicious purposes:
* lambda__enum enumerates Lambda functions.
* lambda__backdoor_new_roles adds new roles with backdoor access.
* lambda__backdoor_new_sec_groups alters security groups to grant unauthorized access.
* lambda__backdoor_new_users creates new users with elevated privileges.
In Cloudfox, the env-vars module can be used to enumerate environment variables within AWS serverless environments. This module scans services such as App Runner, Elastic Container Service, Lambda, Lightsail Containers, and SageMaker to identify secrets stored in their environment variables, including API keys, tokens, and credentials. This capability can potentially be exploited by threat actors to uncover sensitive information.
# Run Cloudfox to scan AWS Lambda functions for secrets in environment variables
❯ cloudfox aws --profile attacker --region us-west-2 env-vars
[🦊 cloudfox v1.6.0 🦊 ] AWS Caller Identity: arn:aws:sts::029933748493:assumed-role/CloudFox-exec-role/aws-go-sdk-1662942784490595000
[env-vars] Enumerating environment variables in all regions for account 029933748493.
[env-vars] Supported Services: App Runner, Elastic Container Service, Lambda, Lightsail Containers, Sagemaker
[env-vars] Status: 12/8 tasks complete (48 errors -- For details check /Users/perm-sandbox/.cloudfox/cloudfox-error.log)
╭────────────┬───────────┬───────────────────────┬───────────────────────┬──────────────────────────────────────----------------╮
│ Service │ Region │ Name │ Key │ Value │
├────────────┼───────────┼───────────────────────┼───────────────────────┼──────────────────────────────────────----------------┤
│ Lambda │ us-west-2 │ my-lambda-function-1 │ DATABASE_URL │ postgresql://user:password@localhost:5432/mydatabase │
│ Lambda │ us-west-2 │ my-lambda-function-1 │ SECRET_KEY │ supersecretkey │
│ Lambda │ us-west-2 │ my-lambda-function-1 │ AWS_ACCESS_KEY_ID │ AKIAIOSFODNN7EXAMPLE │
│ Lambda │ us-west-2 │ my-lambda-function-1 │ AWS_SECRET_ACCESS_KEY │ wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY │
│ Lambda │ us-west-2 │ my-lambda-function-2 │ API_TOKEN │ 1234567890abcdef │
│ Lambda │ us-west-2 │ my-lambda-function-2 │ S3_BUCKET │ my-secure-bucket │
│ Lambda │ us-west-2 │ my-lambda-function-2 │ SMTP_PASSWORD │ emailpassword │
╰────────────┴───────────┴───────────────────────┴───────────────────────┴──────────────────────────────────────----------------╯
[env-vars] Output written to [cloudfox-output/aws/attacker/table/env-vars.txt]
[env-vars] Output written to [cloudfox-output/aws/attacker/csv/env-vars.csv]
[env-vars] 7 environment variables found.
Recommendations:
1. Move secrets to AWS Secrets Manager or AWS Systems Manager Parameter Store.
2. Update Lambda functions to retrieve secrets securely from the secret management service (see the sketch after this list).
3. Review and implement strict access controls to environment variables.
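Following recommendation 2, the snippet below is a minimal Lambda sketch that pulls credentials from AWS Secrets Manager at runtime instead of environment variables. The secret name prod/db-credentials is hypothetical, and the function's execution role is assumed to have secretsmanager:GetSecretValue on it.
import json
import boto3

# Client is created once per execution environment and reused across invocations
secrets_client = boto3.client('secretsmanager')

def lambda_handler(event, context):
    # Fetch the secret at runtime rather than reading it from os.environ
    response = secrets_client.get_secret_value(SecretId='prod/db-credentials')
    credentials = json.loads(response['SecretString'])
    # credentials['username'] / credentials['password'] can now be used to connect
    return {"status": "secret retrieved without exposing it in environment variables"}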
It's worth mentioning that DevOps and cloud security engineers can utilize these cloud-offensive tools (Pacu and Cloudfox) to simulate potential attacks. These simulations help them identify vulnerabilities and allow engineers to take proactive measures to secure their cloud environments. By using these tools, they can enhance the security of serverless deployments and ensure a more robust defense against potential threats.
1. Exposure in Code Repositories:
Threat: If environment variables are hardcoded or included in configuration files that are committed to version control systems (e.g., GitHub), they can be exposed to unauthorized users.
(Image showing "sensitive" environment variables REMOTE_WRITE_PASSWORD and CLOUD_PROVIDER_API_KEY defined as plain text in Kubernetes)
Mitigation: Use .gitignore files to exclude configuration files from being committed, and utilize environment-specific configuration management.
Assume a configuration file called config.env contains environment variables for a project, and this file is accidentally committed to a Git repository.
# config.env File:
DATABASE_URL=postgresql://user:password@localhost:5432/mydatabase
SECRET_KEY=supersecretkey
API_KEY=1234567890abcdef
# Git Add and Commit:
$ git add config.env
$ git commit -m "Add config.env with environment variables"
#Viewing Committed File:
$ cat config.env
DATABASE_URL=postgresql://user:password@localhost:5432/mydatabase
SECRET_KEY=supersecretkey
API_KEY=1234567890abcdef
#Mitigation
# Create a .gitignore file and add config.env to it
$ echo "config.env" >> .gitignore
# Stop tracking the already-committed file (it remains in Git history, so rotate the exposed secrets)
$ git rm --cached config.env
$ git add .gitignore
$ git commit -m "Stop tracking config.env and add it to .gitignore"
2. Env Var Application Misconfiguration:
Threat: Improperly configured environment variables can lead to security vulnerabilities, such as exposing secrets to unauthorized parts of the application.
Mitigation: Use tools and frameworks that provide secure handling of environment variables and follow best practices for configuration management.
Imagine an environment variable for a database password set in a Docker container, but it is exposed to the entire application rather than just the part that needs it.
FROM python:3.8-slim
# Set environment variable for database password
ENV DATABASE_PASSWORD="supersecretpassword"
# Set environment variable for API key
ENV API_KEY="12345-abcdef-67890"
# Copy the application code
COPY . /app
# Set the working directory
WORKDIR /app
# Install dependencies
RUN pip install -r requirements.txt
# Run the application
CMD ["python", "app.py"]
#Python Code (app.py):
import os

def connect_to_database():
    # Retrieve the database password from the environment variable
    db_password = os.getenv('DATABASE_PASSWORD')
    print(f"Connecting to the database with password: {db_password}")

def access_api():
    # Retrieve the API key from the environment variable
    api_key = os.getenv('API_KEY')
    print(f"Accessing API with key: {api_key}")

if __name__ == "__main__":
    connect_to_database()
    access_api()
#NB: Once the Docker container is executed, the environment variables are exposed to the entire application.
#Execute Docker container
docker run my-docker-image
#Console Output:
Connecting to the database with password: supersecretpassword
Accessing API with key: 12345-abcdef-67890
#Mitigation
* Use Docker Secrets for secure handling of sensitive values instead of plain environment variables (see the sketch below).
* Ensure that only the services that need access to the secrets can retrieve them.
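A minimal Docker Secrets sketch of this mitigation (Swarm mode is assumed, and the image and secret names are hypothetical):
#Create the secret from stdin instead of baking it into the image or its environment
echo "supersecretpassword" | docker secret create db_password -
#Attach the secret to a service; Docker mounts it as a file at /run/secrets/db_password
docker service create --name my-app --secret db_password my-docker-image
Inside the container, app.py would then read the password from /run/secrets/db_password instead of calling os.getenv('DATABASE_PASSWORD').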
3. Process Listing:
Threat: On some operating systems, environment variables can be viewed by other users on the same system through process listing commands like ps or top.
Mitigation: Ensure that only trusted users have access, configure user permissions based on the principle of least privilege, and run sensitive applications under dedicated user accounts with restricted access.
Assume there is a script named run_my_app.sh that sets an environment variable and runs a process.
#run_my_app.sh - script
#!/bin/bash
# Setting an environment variable
export SECRET_KEY="supersecretkey"
# Running a dummy application (e.g., sleep)
sleep 1000
#Execute command to run the script in the background:
./run_my_app.sh &
#Viewing Environment Variables with ps Command
ps e -o pid,cmd | grep sleep
#OUTPUT
12345 sleep 1000 SECRET_KEY=supersecretkey
4. Logging and Error Handling:
Threat: Sensitive data in environment variables can be accidentally logged or included in error messages.
Mitigation: Ensure that logging configurations do not output environment variable values, and sanitize error messages to exclude sensitive information.
Consider a simple Python application that inadvertently logs environment variables and sensitive data.
#python Code (app.py)
import os
import logging
# Configure logging
logging.basicConfig(level=logging.INFO)
# Set some environment variables
os.environ['DATABASE_URL'] = 'postgresql://user:password@localhost:5432/mydatabase'
os.environ['SECRET_KEY'] = 'supersecretkey'
# Simulate an error that logs all environment variables
try:
    raise Exception("Something went wrong!")
except Exception as e:
    logging.error(f"Error occurred: {e}")
    logging.info("Current environment variables: %s", os.environ)
# NB: Once the above python code (app.py) is executed,
# it logs all the environment variables
ERROR:root:Error occurred: Something went wrong!
INFO:root:Current environment variables: environ({
'PATH': '/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin',
'HOSTNAME': 'my-pod',
'DATABASE_URL': 'postgresql://user:password@localhost:5432/mydatabase',
'SECRET_KEY': 'supersecretkey',
...
})
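As a sketch of the mitigation (the marker list below is an assumption about which variable names count as sensitive), the environment can be masked before anything reaches the log:
import logging
import os

SENSITIVE_MARKERS = ("KEY", "SECRET", "PASSWORD", "TOKEN", "URL")

def redacted_environ():
    # Return a copy of the environment with sensitive-looking values masked
    return {
        name: "***REDACTED***" if any(marker in name.upper() for marker in SENSITIVE_MARKERS) else value
        for name, value in os.environ.items()
    }

logging.basicConfig(level=logging.INFO)
try:
    raise Exception("Something went wrong!")
except Exception as e:
    logging.error("Error occurred: %s", e)
    logging.info("Current environment variables: %s", redacted_environ())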
CSP cloud-native secret management tools, such as AWS Secrets Manager, AWS Systems Manager Parameter Store, Azure Key Vault, and Google Cloud Secret Manager, can be used to mitigate the risks associated with storing cloud secrets in environment variables and to ensure proper management of these secrets.
Similarly, check out our previous blog post on "Exploiting Cloud Secrets Management Repositories: Adversary Tactics and Mitigation Strategies" to understand security strategies for protecting cloud secrets and resources stored in CSP cloud-native management repositories in AWS, Azure, and GCP.
Beyond these tools, DevOps and cloud security engineers should also adopt the mitigation best practices discussed throughout this blog for proper management of their cloud secrets in serverless environments.
In this blog, we discussed how using serverless environment variables has become a common practice among developers and DevOps professionals for storing sensitive data such as API keys, database credentials, and other crucial configuration settings. We also explored how threat actors can exploit vulnerabilities in serverless services to extract secrets from environment variables, potentially gaining access to critical resources or taking control of entire accounts.
From a security perspective, as established in this blog, it is recommended that the DevOps community and cyber defenders adopt the detection and mitigation strategies detailed here. Additionally, organizations' security teams should consider using cloud-based offensive tools like Pacu and CloudFox to simulate credential harvesting in their serverless environments.