Resolution for TSB 2021-545 - Critical vulnerability in log4j2 CVE-2021-44228 for CDH, HDP, HDF, and CDP Private Cloud and Data Services

By Sangeeta Badiger posted Thu December 16, 2021 07:43 PM

  

Summary

This document applies to CDH, HDP, HDF, and CDP Private Cloud and Data Services.

Symptoms

The Apache Security team has released a security advisory for CVE-2021-44228, which affects Apache Log4j2. A malicious user could exploit this vulnerability to run arbitrary code as the user or service account running the affected software. Software products using Log4j versions 2.0 through 2.14.1 are affected; Log4j 1.x is not affected. Cloudera is making short-term workarounds available for affected software and is in the process of creating new releases containing fixes for this CVE.
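
As a quick, non-exhaustive check for the presence of Log4j 2.x core jars on a node, you can search the installation tree; the path below assumes a parcel-based install under /opt/cloudera and is illustrative only:

# Illustrative check: list Log4j 2.x core jars under a parcel-based install
find /opt/cloudera -name 'log4j-core-2*.jar' 2>/dev/null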

Instructions

Short Term Resolution

Script Download Location

Download all files from the GitHub repo here - https://github.com/cloudera/cloudera-scripts-for-log4j
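
For example, the repository can be cloned with git (or downloaded as an archive from the GitHub page):

git clone https://github.com/cloudera/cloudera-scripts-for-log4j.git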

You must run the following script on all affected cluster nodes.

NOTE: If you add nodes after applying the Short Term Resolution, you must re-apply it on the new nodes.

Script: run_log4j_patcher.sh [cdp|cdh|hdp|hdf]

Function: The run_log4j_patcher.sh script scans a directory for jar files and removes JndiLookup.class from the ones it finds. Do not run any other scripts in the downloaded directory; they are called by run_log4j_patcher.sh automatically.
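
Conceptually, the patching step amounts to deleting the vulnerable class from each affected jar. The commands below are an illustrative sketch of that idea against a hypothetical jar name, not a replacement for running the script:

# Check whether a jar still contains the vulnerable class
unzip -l example-library.jar | grep JndiLookup.class
# Remove the class in place (what the patcher effectively does for each affected jar)
zip -q -d example-library.jar 'org/apache/logging/log4j/core/lookup/JndiLookup.class'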

  1. Stop all running jobs in the production cluster before executing the script
  2. Navigate to Cloudera Manager > YARN > Configuration and ensure that yarn.nodemanager.delete.debug-delay-sec is set to 0
    If the value is not zero, you must restart the YARN service after setting the value to 0
  3. Navigate to Cloudera Manager > YARN > Configuration and search for yarn.nodemanager.local-dirs to get the configured Node Manager Local Directory path 
  4. Remove the filecache and usercache folders located inside each directory specified in yarn.nodemanager.local-dirs
  5. Download all files from the GitHub repo and copy to all nodes of your cluster
  6. Run the script as root on ALL nodes of your cluster (see the example invocation after this list)
    1. The script takes one mandatory argument (cdh|cdp|hdp|hdf)
    2. The script takes two optional arguments: a base directory to scan and a backup directory. The defaults are /opt/cloudera and /opt/cloudera/log4shell-backup, respectively; these defaults work for CM/CDH 6 and CDP 7. A different default directory is used for HDP.
  7. Ensure that the last line of the script output indicates ‘Finished’ to verify that the job has completed successfully. The script will fail if a command exits unsuccessfully. 
  8. Restart Cloudera Manager Server, all clusters, and all running jobs and queries.
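
For example, on a single CDP node the sequence might look like the following; the NodeManager local-dirs path and the script location are illustrative and must be replaced with the values from your environment:

# Clear the YARN NodeManager caches (path taken from yarn.nodemanager.local-dirs; illustrative value shown)
rm -rf /data/yarn/nm/filecache /data/yarn/nm/usercache
# Run the patcher as root with the default directories on a CDP cluster node
cd /root/cloudera-scripts-for-log4j
./run_log4j_patcher.sh cdp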

Usage: $PROG (subcommand) [options]

Subcommands:

  • help          Prints this message
  • cdh           Scan a CDH cluster node
  • cdp           Scan a CDP cluster node
  • hdp           Scan a HDP cluster node
  • hdf            Scan a HDF cluster node

Options:
-t <targetdir> Override target directory (default: distro-specific)
-b <backupdir> Override backup directory (default: /opt/cloudera/log4shell-backup)

Environment Variables:
The SKIP_* environment variables should only be used if you are running the script again and want to skip phases that have already completed (an example re-run is shown after this usage block).
SKIP_JAR     If non-empty, skips scanning and patching .jar files
SKIP_TGZ     If non-empty, skips scanning and patching .tar.gz files
SKIP_HDFS    If non-empty, skips scanning and patching .tar.gz files in HDFS
RUN_SCAN     If non-empty, runs a final scan for missed vulnerable files. This can take several hours.
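
For example, a second pass on an HDP node that skips the already-completed jar phase and overrides both directories might look like this; the target and backup paths are illustrative:

SKIP_JAR=1 ./run_log4j_patcher.sh hdp -t /usr/hdp -b /var/backups/log4shell-backup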

NOTE: CDH/CDP Parcels: The script removes the affected class from all CDH/CDP parcels already installed under /opt/cloudera. This script needs to be re-run after new parcels are installed or after upgrading to versions of CDH/CDP that do not include the long-term fix.

Removing affected classes from Oozie Shared Libraries (CDH & CDP):

The vulnerability also affects client libraries uploaded to HDFS by Cloudera Manager. The script takes care of the Tez and MapReduce libraries; however, the Oozie libraries must be updated manually. The following section only applies to CDH and CDP releases.

Follow the instructions below to secure the Oozie shared libraries:

  1. Execute run_log4j_patcher.sh on the affected cluster.
  2. Navigate to Cloudera Manager > Oozie > Actions > “Install Oozie ShareLib” to re-upload the Oozie libraries to HDFS from Cloudera Manager.
    IMPORTANT: Ensure that the Oozie service is running prior to executing the command.
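
To confirm that a fresh sharelib was uploaded, you can list the Oozie share library in HDFS; the path below is the common default and may differ in your deployment:

hdfs dfs -ls /user/oozie/share/lib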

 

Removing affected classes from Oozie Shared Libraries (HDP):

Run these commands to update Oozie share lib:

su oozie
kinit oozie
/usr/hdp/current/oozie-server/bin/oozie-setup.sh sharelib create -fs hdfs://ns1
oozie admin -oozie http(s)://<oozie-host/loadbalancer>:11(000|443)/oozie -sharelibupdate
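
To verify the update, you can list the active sharelib afterwards using the same host and port placeholders as above:

oozie admin -oozie http(s)://<oozie-host/loadbalancer>:11(000|443)/oozie -shareliblist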

Rollback Procedure

Vulnerable files are fixed in place as part of the script execution. A backup of the original files is kept in the backup directory (default: /opt/cloudera/log4shell-backup) or in the directory specified with the -b option. To roll back, remove the .backup extension from the backed-up files and copy them back to their original locations.

Similarly, files changed in HDFS are backed up to /tmp/hdfs_tar_files.<date>, and the same procedure applies to them. While the .backup extension should prevent the backed-up files from being loaded by Java, they should still be considered vulnerable.
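
As an illustration only (the jar name and the layout under the backup directory are hypothetical; use the actual entries you find there), restoring a single backed-up file would look like:

cp /opt/cloudera/log4shell-backup/<path>/some-library.jar.backup /opt/cloudera/<path>/some-library.jar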

Known Limitations

CDH clusters using packages rather than parcels are not yet supported with this short-term fix.

List of CDH, HDP, HDF, and CDP Private Cloud Products and the Applicable Resolution



Product: Short Term Resolution

CDH, HDP, and HDF
  • Hortonworks Data Platform (HDP): Yes
  • Ambari: Yes
  • AM2CM Tool: Yes
  • SmartSense: Not Impacted
  • Data Platform Search: Yes
  • Cloudera Cybersecurity Platform: Research Ongoing
  • Cloudera Enterprise: Yes
  • Cloudera Manager (Including Backup Disaster Recovery (BDR) and Cloudera Navigator): Yes
  • Cloudera Data Science Workbench (CDSW): Yes
  • Hortonworks Data Flow (HDF): Yes
  • Streams Messaging Manager (SMM) for HDF and HDP: Yes
  • Streams Replication Manager (SRM) for HDF and HDP: Yes
  • Cloudera Edge Management (CEM): Yes
  • Hortonworks DataPlane Platform: Not Impacted
  • Data Lifecycle Manager (DLM): Not Impacted
  • Data Steward Studio (DSS): Not Impacted
  • Data Analytics Studio (DAS): Yes
  • Arcadia Enterprise: Yes

CDP Private Cloud
  • CDP Private Cloud Base: Yes
  • Cloudera Manager (Including Backup Disaster Recovery (BDR) and Replication Manager): Yes
  • Cloudera Data Warehouse (CDW): Yes
  • Cloudera Machine Learning (CML): Yes
  • Cloudera Data Engineering (CDE): Yes
  • Management Console: Yes
  • Workload XM: Not Impacted
  • Cloudera Flow Management (CFM): Research Ongoing
  • Cloudera Streaming Analytics (CSA): Research Ongoing
  • Cloudera Edge Management (CEM): Not Impacted
  • Cloudera Stream Processing (CSP): Not Impacted
  • CDS 3 Powered by Apache Spark: Research Ongoing
  • CDS 3.2 for GPUs: Research Ongoing

 
This is a reproduction of the KB article below from my.cloudera.com:
https://my.cloudera.com/knowledge/Resolution-for-TSB-2021-545---Critical-vulnerability-in-log4j2?id=332012

Please refer back to the original my.cloudera.com article, as it will be updated continuously as new information becomes available.

