Summary
This document applies to CDP Public Cloud and Data Services.
Last Updated:
12/22/2021, 2:26:43 PM
Symptoms
The Apache Security team has released a security advisory for CVE-2021-44228, which affects Apache Log4j 2. A malicious user could exploit this vulnerability to run arbitrary code as the user or service account running the affected software. Products using Log4j versions 2.0 through 2.14.1 are affected; Log4j 1.x is not affected. Cloudera is making short-term workarounds available for affected software and is preparing new releases containing fixes for this CVE.
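For a quick first-pass check, affected jars can be spotted from the shell: zip archives store entry names uncompressed, so a plain grep for JndiLookup.class over a jar's bytes is enough for triage. The helper below is a hypothetical sketch, not part of Cloudera's tooling; the patcher script described under Instructions performs the authoritative scan and fix.

```shell
# Hypothetical triage helper (not part of Cloudera's tooling): list jars
# under a directory whose entry table mentions JndiLookup.class.
scan_for_jndilookup() {
  find "$1" -type f -name '*.jar' 2>/dev/null | while read -r jar; do
    # Zip entry names are stored uncompressed, so grep can see them
    # even in a binary archive.
    if grep -q 'JndiLookup.class' "$jar"; then
      echo "possible hit: $jar"
    fi
  done
}
# Example: scan_for_jndilookup /opt/cloudera
```

A hit here only means the class is present; the patcher script is still the supported way to remove it.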
Instructions
Short Term Resolution:
The fix provided in this section is applicable to Data Lake and Data Hub (Cloudera Runtime) only. See the Long Term Resolution section for resolution applicable to CDP Public Cloud Data Services.
NOTE: If you perform a repair operation on a Data Hub or Data Lake after applying the Short Term Resolution, you must re-apply the Short Term Resolution on the repaired nodes or apply the Long Term Resolution.
NOTE: If you perform a scale-up operation (including auto-scaling) on a Data Hub, you must re-apply the Short Term Resolution on the new nodes or apply the Long Term Resolution.
Script Download Location
Download the scripts from the repository at https://github.com/cloudera/cloudera-scripts-for-log4j
You must run the following script on all affected cluster nodes.
Script: run_log4j_patcher.sh [cdp]
Function: This script scans a directory for jar files and removes JndiLookup.class from the ones it finds
- Stop all running jobs in the production cluster before executing the script
- Navigate to Cloudera Manager > YARN > Configuration and ensure that yarn.nodemanager.delete.debug-delay-sec is set to 0
If the value is not zero, you must restart the YARN service after setting the value to 0
- Navigate to Cloudera Manager > YARN > Configuration and search for yarn.nodemanager.local-dirs to get the configured Node Manager Local Directory path
- Remove filecache and usercache folder located inside the folders that are specified in yarn.nodemanager.local-dirs
- Download all files from the GitHub repo and copy to all nodes of your cluster
- Run the script as root on ALL nodes of your Data Hub and Data Lake clusters
- The script takes one mandatory argument (cdh|cdp|hdp)
- The script takes two optional arguments: a base directory to scan and a backup directory. The defaults are /opt/cloudera and /opt/cloudera/log4shell-backup, respectively
- Ensure that the last line of the script output is ‘Finished’ to verify that the run completed successfully. The script will fail if any command exits unsuccessfully
- Restart Cloudera Manager Server, all clusters, and all running jobs and queries
Usage: run_log4j_patcher.sh (subcommand) [options]
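The core in-place fix the steps above describe can be sketched for a single jar as follows. This is an illustration only (the function name is hypothetical): run_log4j_patcher.sh does considerably more, including scanning entire directory trees, so use the real script on cluster nodes.

```shell
# Illustrative sketch of the per-jar operation; strip_jndilookup is a
# hypothetical name, not part of the Cloudera scripts.
# Usage: strip_jndilookup JAR BACKUP_DIR
strip_jndilookup() {
  jar="$1"; backup_dir="$2"
  mkdir -p "$backup_dir"
  # Keep the original with a .backup extension so Java will not load it.
  cp -p "$jar" "$backup_dir/$(basename "$jar").backup"
  # Delete the vulnerable class in place; the exit code is ignored so a
  # jar that never contained the class is not treated as an error.
  zip -q -d "$jar" '*JndiLookup.class' || true
}
# Example: strip_jndilookup ./log4j-core-2.14.1.jar /opt/cloudera/log4shell-backup
```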
Rollback Procedure
Vulnerable files are fixed in place by the script. A backup of each original file is kept in the backup directory (default: /opt/cloudera/log4shell-backup, or the directory specified with the -b option). To roll back, remove the .backup extension from these files and copy them to their original locations.
Similarly, changed files on HDFS are backed up to /tmp/hdfs_tar_files.<date>, and the same restore procedure applies. While the .backup extension should prevent the backed-up files from being loaded by Java, they should still be considered vulnerable.
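Under the assumption that the backup tree mirrors the original absolute paths under the backup root (an assumption of this sketch; verify your actual backup layout before running anything like it), the rollback could look like:

```shell
# Hypothetical rollback helper: copy every .backup file back to its
# original location, dropping the .backup extension. Assumes the backup
# tree mirrors the original absolute paths under BACKUP_ROOT.
# Usage: restore_backups BACKUP_ROOT RESTORE_ROOT
restore_backups() {
  backup_root="$1"; restore_root="$2"
  find "$backup_root" -type f -name '*.backup' | while read -r f; do
    rel="${f#"$backup_root"/}"            # path relative to the backup root
    dest="$restore_root/${rel%.backup}"   # original name, without .backup
    mkdir -p "$(dirname "$dest")"
    cp -p "$f" "$dest"
  done
}
# Example: restore_backups /opt/cloudera/log4shell-backup /
```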
Long Term Resolution:
For Data Lake and Data Hub (Cloudera Runtime), Cloudera will provide hotfixes for versions 7.2.7 and above. If you are running an older version of Cloudera Runtime, you can use the short-term resolution until it’s possible to upgrade to a newer version.
Cloudera will provide hotfix releases for all Data Services: Cloudera Data Warehouse (CDW), Cloudera Machine Learning (CML), Cloudera Data Engineering (CDE), CDP Operational Database (COD), and Cloudera DataFlow (CDF).
List of CDP Public Cloud and Other Products and the Applicable Resolution
| Product | Short-Term Resolution | Long-Term Resolution | Release Notes |
| --- | --- | --- | --- |
| Data Lakes, Data Hub, and all Data Hub templates (powered by Cloudera Runtime) | Yes | Yes (for versions >= 7.2.7) | |
| Cloudera Manager (including Backup Disaster Recovery (BDR) and Replication Manager) | Yes | Yes (for versions >= 7.2.7) | |
| Cloudera Data Warehouse (CDW) | Yes | Yes | Note |
| Cloudera Machine Learning (CML) | No | Yes | Note |
| Cloudera Data Engineering (CDE) | No | Yes | Note |
| CDP Operational Database (COD) | Not Impacted | Not Impacted | Not Applicable |
| Cloudera DataFlow (CDF) | No | Yes | Note |
| Cloudera Streaming Analytics (CSA) | No | Yes | |
| Cloudera Data Visualization (CDV) | No | Yes | Note |
| Data Catalog | No | Yes | |
| Replication Manager | No | Yes (part of Data Lake and Data Hub upgrade) | |
| Workload Manager | Not Impacted | Not Impacted | Not Applicable |
List of Cloudera Drivers
| Product | Affected Versions |
| --- | --- |
| Hortonworks JDBC Driver for Apache Hive | Only version 2.6.12 |
| Cloudera JDBC Driver for Apache Hive | Only versions 2.6.13 through 2.6.15 |
| Cloudera JDBC Data Connector for Impala | Only versions 2.6.18 through 2.6.24 |
| Cloudera ODBC Driver for Apache Hive | Not Impacted |
| Cloudera ODBC Data Connector for Impala | Not Impacted |
| Hortonworks ODBC Driver for Apache Hive | Not Impacted |
| Phoenix ODBC Driver | Not Impacted |
Please check back regularly; this article will be updated as new information becomes available.