hdfs: distcp to cloud storage

Using DistCp with Amazon S3

S3 credentials can be provided in a configuration file (for example, core-site.xml):

<property>
    <name>fs.s3a.access.key</name>
    <value>...</value>
</property>
<property>
    <name>fs.s3a.secret.key</name>
    <value>...</value>
</property>

hadoop distcp -Dfs.s3a.access.key=myAccessKey -Dfs.s3a.secret.key=mySecretKey hdfs://MyNameservice-id/user/hdfs/mydata s3a://myBucket/mydata_backup
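If the access and secret keys are already set in core-site.xml, the -D overrides can be omitted. As a rough sketch (reusing the same placeholder bucket, keys, and nameservice as above, so adjust to your environment), it can help to list the bucket first to confirm the S3A credentials work, and to use -update for incremental re-runs:

# Sanity-check S3A connectivity by listing the target bucket
hadoop fs -Dfs.s3a.access.key=myAccessKey -Dfs.s3a.secret.key=mySecretKey -ls s3a://myBucket/

# Incremental re-run: -update copies only missing or changed files; CRC checksums
# cannot be compared between HDFS and S3, so the CRC check is skipped
hadoop distcp -Dfs.s3a.access.key=myAccessKey -Dfs.s3a.secret.key=mySecretKey -update -skipcrccheck hdfs://MyNameservice-id/user/hdfs/mydata s3a://myBucket/mydata_backup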

 

Using DistCp with Microsoft Azure (WASB)

Configure connectivity to Azure by setting the following property in core-site.xml.

<property>
  <name>fs.azure.account.key.youraccount.blob.core.windows.net</name>
  <value>your_access_key</value>
</property>

hadoop distcp wasb://<sample_container>@<sample_account>.blob.core.windows.net/ hdfs://hdfs_destination_path
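As a quick sanity check (reusing the placeholder container and account names from the command above), the account key in core-site.xml can be verified by listing the container with the hadoop fs shell before running the copy, and -update can be used for incremental re-runs:

# Confirm the WASB account key is picked up by listing the container root
hadoop fs -ls wasb://<sample_container>@<sample_account>.blob.core.windows.net/

# Re-run the copy incrementally; -update only transfers files missing or changed at the destination
hadoop distcp -update wasb://<sample_container>@<sample_account>.blob.core.windows.net/ hdfs://hdfs_destination_path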

Fix Under-replicated blocks in HDFS manually

https://community.hortonworks.com/articles/4427/fix-under-replicated-blocks-in-hdfs-manually.html

Short Description:

Quick instructions for fixing under-replicated blocks in HDFS manually

Article

To fix under-replicated blocks in HDFS, use the quick steps below; a short verification sketch follows the list.

### Fix under-replicated blocks ###

  1. su <$hdfs_user>
  2. bash-4.1$ hdfs fsck / | grep 'Under replicated' | awk -F':' '{print $1}' >> /tmp/under_replicated_files
  3. bash-4.1$ for hdfsfile in `cat /tmp/under_replicated_files`; do echo "Fixing $hdfsfile :" ; hadoop fs -setrep 3 $hdfsfile; done
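A short verification sketch, assuming the target replication factor is 3 as in the setrep command above (check dfs.replication if unsure):

# Confirm the cluster's default replication factor
hdfs getconf -confKey dfs.replication

# After setrep has finished, fsck should report no per-file 'Under replicated' lines
hdfs fsck / | grep 'Under replicated' | wc -l

# The fsck summary should also show "Under-replicated blocks: 0"
hdfs fsck / | grep -i 'under-replicated'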

How to deploy custom jar files in Apache Hive (Hortonworks HDP)

The following activity needs to be performed on all HiveServer2, Hive Metastore, and Hive client nodes; a consolidated command sketch follows the list.

  1. Create the folder "/usr/hdp/2.5.4.0-121/hive/auxlib" if it does not already exist.
  2. Copy the custom-built jar, e.g. "customserde.jar", into this folder.
  3. Restart the Hive service.
  4. Verify with "ps -ef | grep hive | grep customserde". The Hive process should have loaded this jar, with its path listed under "--hiveconf hive.aux.jars.path=".
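A minimal shell sketch of steps 1, 2, and 4 on a single node, assuming the same HDP 2.5.4.0-121 auxlib path and jar name as above; the source path /tmp/customserde.jar is only an example, and the restart in step 3 is normally done through Ambari:

# 1. Create the auxlib folder if it does not exist
mkdir -p /usr/hdp/2.5.4.0-121/hive/auxlib

# 2. Copy the custom-built jar into the folder (the source path here is just an example)
cp /tmp/customserde.jar /usr/hdp/2.5.4.0-121/hive/auxlib/

# 3. Restart the Hive services (HiveServer2 / Metastore), e.g. from Ambari

# 4. Verify the running Hive process picked up the jar via hive.aux.jars.path
ps -ef | grep hive | grep customserde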