Unix and Hadoop

SCP to a non-standard port


scp -P 5050 asd.tar.gz user@192.168.1.15:/home/user


Tomcat Webapps Location


/usr/share/tomcat/webapps



Get IP address



ifconfig eth0 | awk '/inet /{print $2}'
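On newer boxes where ifconfig is missing, the ip command (assuming the iproute2 tools are installed) reports the address in CIDR form:

ip -4 addr show eth0 | awk '/inet /{print $2}'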

Get Tomcat logs


tail -f /usr/share/apache-tomcat-7.0.30/logs/catalina.out

tail -f -n 10 /usr/share/apache-tomcat-7.0.30/logs/catalina.out

Running Hadoop Jobs


set mapred.job.queue.name=dev    (inside the Hive shell)

hadoop jar acs.jar -D mapreduce.reduce.speculative=false -D mapreduce.map.speculative=false -D mapred.job.queue.name=dev -D mapreduce.task.io.sort.factor=256 -D file.pattern=.*20110110.* -D mapred.reduce.slowstart.completed.maps=0.9 -D mapred.reduce.tasks=10 /Input /Output

hadoop jar test.jar -D mapred.job.queue.name=dev -D mapred.textoutputformat.separator=,
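The same -D options work with a streaming job; a minimal sketch, assuming the path to the streaming jar and using cat/wc as pass-through mapper and reducer:

hadoop jar hadoop-streaming.jar -D mapred.job.queue.name=dev -input /Input -output /Output -mapper /bin/cat -reducer /usr/bin/wc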



GlassFish webapps folder location



/usr/share/glassfish4/glassfish/domains/domain1/applications




Killing MySQL processes



ps -ef | grep '[m]ysql' | awk '{system("kill -9 " $2)}'

(the [m]ysql pattern stops grep from matching its own process)
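A simpler alternative, assuming the process is named mysqld, is pkill:

pkill -9 mysqld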




Starting MySQL



/etc/init.d/mysqld start
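On systemd-based distributions the equivalent is (the unit may be named mysql rather than mysqld, depending on the distro):

systemctl start mysqld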


Mounting Windows Folder on Linux


mount -t cifs -o username=admin,password=passme //129.29.126.22/folder1 /mnt/cifs
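mount.cifs also accepts a credentials file, which keeps the password out of the shell history; a sketch, assuming /root/.smbcred contains username= and password= lines:

mount -t cifs -o credentials=/root/.smbcred //129.29.126.22/folder1 /mnt/cifs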



Redirecting SQL output to CSV


mysql accessDB -p < k2.sql | sed 's/\t/,/g' > out11.csv
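If the MySQL user has the FILE privilege, SELECT ... INTO OUTFILE writes the CSV on the server side instead; a sketch against a hypothetical table t1:

mysql accessDB -p -e "SELECT * FROM t1 INTO OUTFILE '/tmp/out11.csv' FIELDS TERMINATED BY ','"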


WebHDFS - Get home directory


curl -i "http://<HOST>:<PORT>/webhdfs/v1/?op=GETHOMEDIRECTORY"
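Other WebHDFS operations use the same URL pattern, e.g. listing a directory:

curl -i "http://<HOST>:<PORT>/webhdfs/v1/user?op=LISTSTATUS"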

Find Command


find / -name 'hadoop-site.xml' 2>/dev/null
locate libxml2
find / -name libxml2 -exec ls -l {} \;
Split the files in the current directory into subfolders of 60 files each:

find . -maxdepth 1 -type f -print0 | xargs -r -0 -P0 -n 60 sh -c 'mkdir "part$$"; mv "$@" "part$$"' xx


Perl - setting a proxy (inside the cpan shell)


o conf http_proxy http://10.5.32.50:2328
o conf proxy_user user
o conf proxy_pass
o conf commit


Perl - installing packages (inside the cpan shell)


install Array::Iterator
install XML::Simple
install XML::Parser
install XML::SAX
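The same packages can be installed non-interactively from the shell with the cpan client:

cpan XML::Simple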


R Install


rpm -ivh http://mirror.chpc.utah.edu/pub/epel/6/x86_64/epel-release-6-8.noarch.rpm
yum install R R-devel
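With R in place, packages can be installed from the shell too; ggplot2 here is only an example package:

Rscript -e 'install.packages("ggplot2", repos="https://cran.r-project.org")'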



Netstat checks


netstat -anp |grep 80
netstat -anp |grep 10000
netstat -tulpn | grep :80
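On newer systems where netstat is deprecated, ss takes the same flags:

ss -tulpn | grep :80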




Setting an HTTP proxy on a Unix box



export http_proxy="http://1.2.2.5:328"


Count number of files


ls | wc -l
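ls | wc -l counts directories as well; to count only regular files:

find . -maxdepth 1 -type f | wc -l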

Disk and memory space


df -h
free -h

Change User


sudo -i -u hdfs
su - hdfs




Change ownership



hadoop fs -chown -R user1:root /user/k2




Copying folders to HDFS, Hadoop archives, and distcp



for a in *; do hadoop fs -copyFromLocal "$a" /user/JZ2; done
for a in *; do hadoop archive -archiveName "files$a.har" -p "file:///data/MJData/HTTP/J$a" /user/new_1; done
hadoop distcp file:///data3/big_data/HTTP/J /k3
hadoop fs -put J /user
hadoop archive -archiveName files.har -p file:///data/big_data/HTTP/JZ /user/new_1
hadoop distcp file:///home/Hadoop-Streaming-0.122420/examples/codes/input22/* /user/in1
Related: the -copyFromLocal option, CombineFileInputFormat, and SequenceFile (both useful for packing many small files; see the HAR example below)
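A created archive can be read back through the har:// scheme; a quick check against the files.har built above:

hadoop fs -ls har:///user/new_1/files.har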




Remove a file on HDFS



hadoop fs -rmr /user/input/part*

(on newer releases: hadoop fs -rm -r /user/input/part*)


Copying Key for Passwordless access


cat /root/.ssh/id_rsa.pub | ssh user@IP 'cat >> .ssh/authorized_keys'
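ssh-copy-id does the same in one step and also fixes permissions on the remote .ssh directory:

ssh-copy-id user@IP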


Tarring and untarring a file


tar -zcvf jz1.tar.gz /data/M/HTin
tar -zxvf jz1.tar.gz



SCP a file to another machine


scp -r jz1.tar.gz root@IP:/data/HTTP/j

To use Java in Firefox, install the IcedTea plugin.

sudo apt-get install icedtea-plugin


