
Introducing YetiHunter: An open-source tool to detect and hunt for suspicious activity in Snowflake

Summary

On May 30, 2024, Snowflake confirmed that many of its customers were affected by an attacker leveraging compromised non-human identity (NHI) credentials to perform data theft. In their notice, Snowflake included some indicators and suggested hunts. Our good friends at Mandiant also provided insight into the attacker (UNC5537) and additional hunting opportunities in their recent blog post.

To summarize the high level facts of the case:
  • An attacker gained access to over 150 credentials, none protected by MFA, belonging to client organizations of Snowflake
  • Mandiant believes the attacker obtained these credentials through a variety of infostealers
  • The attacker leveraged these credentials to perform data theft for the purpose of extortion
  • The earliest evidence of the attacker's unauthorized access to a victim organization was on April 14, 2024

To see if your Snowflake instance has been compromised, Permiso has released YetiHunter, an open-source tool combining several indicators of compromise published by the community. Reach out if you need any assistance in scanning your Snowflake environment.

Snowflake Hunting with YetiHunter

In order to ease the burden of hunting through Snowflake data for indicators, the Permiso team has created YetiHunter. YetiHunter combines the indicators published by Snowflake, Mandiant, DataDog, and Permiso into one easy-to-run triage script.
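As a sketch of how a triage script can execute hunts like these, the helper below uses the DB-API-style cursor interface exposed by snowflake-connector-python. This is not YetiHunter's actual implementation, and the connection parameters shown in the comments are placeholders.

```python
# Minimal sketch of running a hunt query against Snowflake. This is NOT
# YetiHunter's implementation -- just an illustration of the DB-API-style
# interface that snowflake-connector-python exposes.

def run_hunt(conn, sql):
    """Execute one hunt query and return all matching rows."""
    cur = conn.cursor()
    try:
        cur.execute(sql)
        return cur.fetchall()
    finally:
        cur.close()

# In practice the connection would come from the connector, e.g.:
#   import snowflake.connector
#   conn = snowflake.connector.connect(
#       user="analyst@example.com",    # placeholder values
#       account="myorg-myaccount",
#       warehouse="COMPUTE_WH")
```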

YetiHunter performs the following queries:

select_all_without_where

SELECT * queries that do not contain a WHERE clause. Attackers often use simple SELECT statements without additional filtering via a WHERE clause.

SELECT query_id, user_name, query_text
FROM snowflake.account_usage.query_history
WHERE query_text ILIKE 'SELECT *%'
AND query_text NOT ILIKE '%WHERE%';
copy_into_select_all

COPY INTO and SELECT * in a single query. Attackers often combine these two commands to export all records to a temporary location.

SELECT query_id, user_name, query_text
FROM snowflake.account_usage.query_history
WHERE query_text ILIKE 'COPY INTO%'
AND query_text ILIKE '%SELECT *%';
show_tables

Detects SHOW TABLES queries. SHOW TABLES is a common recon command that attackers may use to understand the environment they are in.

SELECT query_id, user_name, query_text
FROM snowflake.account_usage.query_history
WHERE query_text ILIKE 'SHOW TABLES%';
create_temp_storage

Creating temp storage. Attackers will often create a temporary storage location (usually with a generic name like "test") as a staging location.

SELECT query_id, user_name, query_text
FROM snowflake.account_usage.query_history
WHERE query_text ILIKE 'CREATE TEMP%';
10_largest_queries

Top 10 largest queries by rows_produced. Review queries that have produced a large number of rows.

SELECT query_id, user_name, query_text, rows_produced
FROM snowflake.account_usage.query_history
WHERE rows_produced > 2000
ORDER BY rows_produced DESC
LIMIT 10;
dbeaver_used

DBeaver is a common utility used by administrators and attackers alike to interact with many database types, including Snowflake.

SELECT
created_on,
user_name,
authentication_method,
PARSE_JSON(client_environment):APPLICATION::STRING AS client_application,
PARSE_JSON(client_environment):OS::STRING AS client_os,
PARSE_JSON(client_environment):OS_VERSION::STRING AS client_os_version,
session_id
FROM
snowflake.account_usage.sessions
WHERE
PARSE_JSON(client_environment):APPLICATION = 'DBeaver_DBeaverUltimate'
ORDER BY created_on;
accountadmin_changes

Review recent ACCOUNTADMIN role grants.

SELECT user_name || ' granted the ' || role_name || ' role on ' || end_time || ' [' || query_text || ']' AS grants
FROM snowflake.account_usage.query_history
WHERE execution_status = 'SUCCESS'
AND query_type = 'GRANT'
AND query_text ILIKE '%grant%accountadmin%to%'
ORDER BY end_time DESC;
impactful_modifications

Lists many impactful modifications that occurred in Snowflake; review for any suspicious changes.

SELECT
start_time,
user_name,
role_name,
query_type,
query_text
FROM snowflake.account_usage.query_history
WHERE execution_status = 'SUCCESS'
AND query_type NOT IN ('SELECT', 'SHOW', 'DESCRIBE')
AND (query_text ILIKE '%create role%'
OR query_text ILIKE '%manage grants%'
OR query_text ILIKE '%create integration%'
OR query_text ILIKE '%alter integration%'
OR query_text ILIKE '%create share%'
OR query_text ILIKE '%create account%'
OR query_text ILIKE '%monitor usage%'
OR query_text ILIKE '%ownership%'
OR query_text ILIKE '%drop table%'
OR query_text ILIKE '%drop database%'
OR query_text ILIKE '%create stage%'
OR query_text ILIKE '%drop stage%'
OR query_text ILIKE '%alter stage%'
OR query_text ILIKE '%create user%'
OR query_text ILIKE '%alter user%'
OR query_text ILIKE '%drop user%'
OR query_text ILIKE '%create network policy%'
OR query_text ILIKE '%alter network policy%'
OR query_text ILIKE '%drop network policy%'
OR query_text ILIKE '%copy%'
)
ORDER BY end_time DESC;
least_common_applications_used

Applications in Snowflake are much like user agents. This query shows the applications used in your environment, ordered by frequency; review the least common entries for any abnormal applications.

SELECT
COUNT(*) AS client_app_count,
PARSE_JSON(client_environment):APPLICATION::STRING AS client_application,
PARSE_JSON(client_environment):OS::STRING AS client_os,
PARSE_JSON(client_environment):OS_VERSION::STRING AS client_os_version
FROM
snowflake.account_usage.sessions sessions
WHERE
1 = 1
AND sessions.created_on >= '2024-04-01'
GROUP BY
ALL
ORDER BY
1 DESC;
malicious_ips

Checks logins against all of the known malicious IPs that Snowflake and Mandiant have released in association with the UNC5537 campaign. The IP list is read from queries/known_bad_ips (see below).
frostbite_used

FROSTBITE is Mandiant's name for one of the applications known to be used by UNC5537. This query looks for its usage.

SELECT
created_on,
user_name,
authentication_method,
PARSE_JSON(client_environment):APPLICATION::STRING AS client_application,
PARSE_JSON(client_environment):OS::STRING AS client_os,
PARSE_JSON(client_environment):OS_VERSION::STRING AS client_os_version,
session_id
FROM
snowflake.account_usage.sessions
WHERE
PARSE_JSON(client_environment):APPLICATION = 'rapeflake'
ORDER BY created_on;
copy_http

Identifies any usage of COPY INTO with "http" in the query_text.

SELECT *
FROM snowflake.account_usage.query_history
WHERE CONTAINS(query_text, 'COPY INTO') AND CONTAINS(query_text, 'http');
get_file

After copying data into temp storage, attackers will often use the GET command to retrieve that file. This looks for GET and "file:" in a single query.

SELECT *
FROM snowflake.account_usage.query_history
WHERE query_text ILIKE 'GET%' AND query_text ILIKE '%file:%';
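The queries above scan all of query_history. Since the earliest observed attacker access in this campaign was April 14, 2024, it can help to scope hunts to that window. Below is a small, hypothetical helper (not part of YetiHunter) that appends a start_time filter to any of the hunts; it assumes the hunt already ends in a WHERE clause, as all of the queries above do.

```python
# Sketch: narrow an account_usage.query_history hunt to the campaign window.
# The 2024-04-14 date is the earliest observed attacker access; adjust it
# for your own investigation. This only builds SQL strings -- executing
# them still requires a Snowflake connection.

CAMPAIGN_START = "2024-04-14"

def scope_to_window(hunt_sql: str, start_date: str = CAMPAIGN_START) -> str:
    """Append a start_time filter to a hunt query that already has a WHERE clause."""
    base = hunt_sql.rstrip().rstrip(";")
    return f"{base}\nAND start_time >= '{start_date}';"

show_tables_hunt = (
    "SELECT query_id, user_name, query_text\n"
    "FROM snowflake.account_usage.query_history\n"
    "WHERE query_text ILIKE 'SHOW TABLES%'"
)
print(scope_to_window(show_tables_hunt))
```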

YetiHunter Details

https://github.com/Permiso-io-tools/YetiHunter 

Installation

The only installation step is installing the required libraries:

python3 -m venv ./venv
source venv/bin/activate
python3 -m pip install -r requirements.txt

If you prefer to run in Docker, see the installation instructions in our README file.

Then, just run the tool by running yetihunter.py:

python3 yetihunter.py -h
usage: yetihunter [-h] [-u USER] [-p PASSWORD] [-w WAREHOUSE] [-a ACCOUNT] [-d DATABASE] [-s SCHEMA] [-am {SSO,USERPASS}] [-cf CONFIG_FILE_PATH] [-gcf] [-o OUTPUT_DIRECTORY]

Permiso Security has created a tool to query snowflake environments for evidence of compromise, based on indicators from Permiso and the community.

options:
-h, --help show this help message and exit

-u USER, --user USER The Snowflake user to authenticate as. Even if the authentication is SSO, the email of the identity authenticating should be added.

-p PASSWORD, --password PASSWORD
This field is not required if the authentication method is SSO, but it is required for USERPASS. If needed and the field is empty, a prompt will pop up to ask for the password.

-w WAREHOUSE, --warehouse WAREHOUSE
The Snowflake warehouse to connect to.

-a ACCOUNT, --account ACCOUNT
The Snowflake Account to connect to. It should be in the format of <organization>-<account>

-d DATABASE, --database DATABASE
The database name to connect to.

-s SCHEMA, --schema SCHEMA
The name of the schema inside the database. This can be empty.

-am {SSO,USERPASS}, --authentication-method {SSO,USERPASS}
The authentication method to use.

-cf CONFIG_FILE_PATH, --config-file-path CONFIG_FILE_PATH
If a config file is used, this option will have the path of the config file. No other flag is needed besides this.

-gcf, --generate-config-file
If set, the tool will create an empty config file that can be filled in by the user. Fields with a null value can be left as null as they are not required.

-o OUTPUT_DIRECTORY, --output-directory OUTPUT_DIRECTORY
The directory inside ./output/ where the query output files will be saved.

 

YetiHunter Usage

Authenticating

YetiHunter supports two forms of authentication (-am {SSO,USERPASS}, --authentication-method {SSO,USERPASS}):

  • Username + password
  • SSO

Authentication and database information can be passed via command line parameters or stored in a config. To make this process seamless, YetiHunter includes a config generator:

-cf CONFIG_FILE_PATH, --config-file-path CONFIG_FILE_PATH
                        If a config file is used, this option will have the path of the config file. No other flag is needed besides this.

-gcf, --generate-config-file
                        If set, the tool will create an empty config file that can be filled in by the user. Fields with a null value can be left as null as they are not required.
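For illustration, a config might carry the same fields as the CLI flags above. The layout below is hypothetical: the actual file produced by --generate-config-file may differ, so generate one with -gcf and compare before relying on this shape.

```python
# Hypothetical config layout mirroring YetiHunter's CLI flags. The real
# schema written by --generate-config-file may differ; treat every field
# name here as an assumption. Non-required fields can stay null (None).
import json

config = {
    "user": "analyst@example.com",   # email, even for SSO auth
    "password": None,                # omit for SSO
    "warehouse": "COMPUTE_WH",
    "account": "myorg-myaccount",    # <organization>-<account> format
    "database": "SNOWFLAKE",
    "schema": None,                  # can be empty
    "authentication_method": "SSO",  # SSO or USERPASS
    "output_directory": "hunt_2024_06",
}
print(json.dumps(config, indent=2))
```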

 

YetiHunter Queries

The rest of the execution is handled by the tool itself, based on the queries provided. YetiHunter keeps its list of queries in queries/queries.json, which the user can extend or update.


At the end of the execution, a CSV file is created with the output of each query.

YetiHunter Custom Queries

As mentioned, YetiHunter stores its queries in queries/queries.json. Each query has a name, a description, and the query itself. Queries can be added, removed, or modified according to the user's needs, and they will automatically be executed by the tool.
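As a sketch of what extending the list could look like: the entry below assumes a flat name/description/query schema inferred from the description above, so check queries/queries.json for the actual layout before copying it. The query itself and its name are illustrative, not part of YetiHunter.

```python
# Sketch of adding a custom hunt to a queries.json-style file. The schema
# (a list of objects with name/description/query fields) is an assumption
# based on the fields described above -- verify against the real file.
import json
import os
import tempfile

custom_query = {
    "name": "copy_into_external_stage",  # hypothetical hunt name
    "description": "COPY INTO pointing at a stage, a possible exfil step.",
    "query": (
        "SELECT query_id, user_name, query_text "
        "FROM snowflake.account_usage.query_history "
        "WHERE query_text ILIKE 'COPY INTO @%';"
    ),
}

# Write and re-read the list, as the tool would on its next run.
path = os.path.join(tempfile.mkdtemp(), "queries.json")
with open(path, "w") as f:
    json.dump([custom_query], f, indent=2)

with open(path) as f:
    for q in json.load(f):
        print(q["name"], "->", q["description"])
```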

YetiHunter Custom Malicious IPs

A list of known malicious IPs is provided to users in queries/known_bad_ips. Users can add new IPs to the list or swap in their own list, and the tool will continue to run. The list is used by a rule that checks for logins coming from these IPs.
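A rule like that can be thought of as turning the IP list into a login-history hunt. The sketch below is not YetiHunter's implementation; the column names follow snowflake.account_usage.login_history, and the sample IPs are documentation-range placeholders, not real indicators.

```python
# Sketch: build a login-history hunt from an IP list such as
# queries/known_bad_ips. Column names follow the documented
# snowflake.account_usage.login_history view; the one-IP-per-line
# file format is an assumption.

def build_ip_hunt(ips):
    """Return a SQL hunt matching logins from any of the given IPs."""
    quoted = ", ".join(f"'{ip}'" for ip in ips)
    return (
        "SELECT event_timestamp, user_name, client_ip\n"
        "FROM snowflake.account_usage.login_history\n"
        f"WHERE client_ip IN ({quoted})\n"
        "ORDER BY event_timestamp DESC;"
    )

# Placeholder IPs from the documentation ranges, NOT real indicators.
sample_ips = ["203.0.113.7", "198.51.100.22"]
print(build_ip_hunt(sample_ips))
```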

When there is a hit, a file called login_from_malicious_ips.csv will be created with the list of identities and their login info.

Permiso Product

For Permiso clients with Snowflake integrated, the following detections provide coverage of indicators related to this and other attacks on Snowflake:

  • p0_SNOW_APP_DBEAVER
  • p0_SNOW_APP_FROSTBITE
  • p0_SNOW_SELECTALL_NOWHERE
  • p0_SNOW_ROWSPRODUCED_LARGE
  • p0_SNOW_EXECUTIONTIME_LARGE
  • p0_SNOW_TEMPDIR_CREATED
  • p0_SNOW_COPYINTO_TEMPDIR
  • p0_SNOW_GET_TEMPDIR
  • p0_SNOW_UNC5537_NBIS
  • p0_SNOW_ACCESS_RESPROXY
  • p0_SNOW_ACCESS_VPN
  • p0_SNOW_ACCESS_TOR

 

 
