Hadoop Configuration Automation, Part 1: SSH Automation
This post draws on the article "SSH无密码登录-多节点自动化部署SHELL篇" (SSH passwordless login: automated multi-node deployment, shell edition).
Test environment: Ubuntu 12.04.2 Server 64-bit, expect version 5.45, GNU bash version 4.2.24(1)-release (x86_64-pc-linux-gnu).
Note: the cluster produced by this automated setup has one NameNode, one SecondaryNameNode, and one JobTracker, with all three processes on the same machine; the DataNodes and TaskTrackers run on the other slave machines. (Adjust the shell scripts if you need a different layout.)
How do you automate a Hadoop configuration? Quite a few pieces are involved. Assume the cluster machines start from a clean state, all share the same username and password, have the expect tool installed, and already list every machine's hostname and IP in /etc/hosts. The work then breaks down into three steps: (1) automated SSH deployment and configuration; (2) automated JDK deployment and configuration; (3) automated Hadoop configuration.
(1) Automated SSH deployment. The user first creates a slaves.host file on the namenode listing the hostnames of all the slave machines. A script then generates the id_rsa.pub file on the namenode, builds the authorized_keys file from it, and distributes that file to the slaves, which completes the SSH setup.
(2) JDK configuration. This mainly means unpacking a jdk.tar.gz archive, editing the .bashrc file, and then distributing both the .bashrc file and the unpacked JDK directory to the slaves; that completes the JDK setup. A rough sketch follows below.
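The JDK script itself is not part of this first post; purely as a minimal sketch of the idea, it might look like the following. The archive name jdk.tar.gz, the unpacked directory name jdk1.7.0, and the reuse of slaves.host are assumptions here, and it relies on the passwordless SSH set up in step (1):

#!/bin/bash
# sketch only: unpack the jdk on the namenode, point .bashrc at it,
# then push both to every slave listed in slaves.host
# (jdk.tar.gz and the jdk1.7.0 directory name are assumed)
tar -zxf jdk.tar.gz -C $HOME
# the quoted heredoc keeps $HOME/$PATH literal in .bashrc
cat >> $HOME/.bashrc << 'EOF'
export JAVA_HOME=$HOME/jdk1.7.0
export PATH=$JAVA_HOME/bin:$PATH
EOF
for slave in `cat slaves.host`
do
scp -r $HOME/jdk1.7.0 $slave:$HOME/
scp $HOME/.bashrc $slave:$HOME/
done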
(3) Automated Hadoop configuration. This is mostly about the files under the conf directory: first download and unpack the Hadoop tarball, adjust the usual settings in conf, then fill in the relevant .xml and env files based on the JDK path on the namenode, the namenode's hostname, and the slave hostnames, and finally distribute the modified Hadoop directory to each slave. A sketch of this step also appears below.
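Step (3) is covered in a later post; as a sketch of the idea only, and assuming a hadoop-1.x style conf/ layout plus template config files containing a NAMENODE placeholder (both assumptions, not part of this post's script), it could look like:

#!/bin/bash
# sketch only: fill in the conf files on the namenode and ship the tree out
# (the hadoop-1.0.4 path and the NAMENODE placeholder are assumed)
HADOOP_HOME=$HOME/hadoop-1.0.4
# point fs.default.name / mapred.job.tracker at this machine
sed -i "s/NAMENODE/$HOSTNAME/g" $HADOOP_HOME/conf/core-site.xml
sed -i "s/NAMENODE/$HOSTNAME/g" $HADOOP_HOME/conf/mapred-site.xml
# tell hadoop-env.sh where the jdk lives
echo "export JAVA_HOME=$HOME/jdk1.7.0" >> $HADOOP_HOME/conf/hadoop-env.sh
# masters holds the secondarynamenode host, slaves the datanodes/tasktrackers
echo $HOSTNAME > $HADOOP_HOME/conf/masters
cp slaves.host $HADOOP_HOME/conf/slaves
for slave in `cat slaves.host`
do
scp -r $HADOOP_HOME $slave:$HOME/
done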
Here, to start with, is the shell script for part one, the SSH setup:
#!/bin/bash
# auto generate the ssh key and distribute the authorized_keys file to the slave machines
# the script should run on the namenode machine
if [ $# -lt 2 ]; then
cat << HELP

generate_ssh_v1 -- generate ssh keys for logging in without typing a password;
this script should run on the namenode machine, and the user should edit
the slaves.host file first

USAGE: ./generate_ssh_v1 user password

EXAMPLE: ./generate_ssh_v1 hadoop1 1234

HELP
exit 0
fi

user=$1
ip=$HOSTNAME
pass=$2

# start from a clean ~/.ssh so the yes/no host prompt is predictable
rm -rf ~/.ssh

echo ''
echo "####################################################"
echo "   generate the rsa public key on $HOSTNAME ..."
echo "####################################################"
expect -c "
set timeout 3
spawn ssh $user@$ip
expect \"yes/no\"
send -- \"yes\r\"
expect \"password:\"
send -- \"$pass\r\"
expect \"$\"
send \"ssh-keygen -t rsa -P '' -f $HOME/.ssh/id_rsa\r\"
expect \"$\"
send \"ssh-copy-id -i $HOME/.ssh/id_rsa.pub $HOSTNAME\r\"
expect \"password\"
send -- \"$pass\r\"
expect eof
"

echo ''
echo "####################################################"
echo "   copy the namenode's authorized_keys to slaves ..."
echo "####################################################"
# first wipe and recreate ~/.ssh on every slave
for slave in `cat slaves.host`
do
expect -c "
set timeout 3
spawn ssh $user@$slave
expect \"yes/no\"
send -- \"yes\r\"
expect \"password\"
send -- \"$pass\r\"
expect \"$\"
send \"rm -rf $HOME/.ssh\r\"
expect \"$\"
send \"mkdir $HOME/.ssh\r\"
expect \"$\"
expect eof
"
done

# then push the namenode's authorized_keys to each slave
for slave in `cat slaves.host`
do
expect -c "
set timeout 3
spawn scp $HOME/.ssh/authorized_keys $user@$slave:$HOME/.ssh/
expect \"password\"
send -- \"$pass\r\"
expect eof
"
done

/etc/hosts:
192.168.128.138 hadoop
192.168.128.130 ubuntu
slaves.host:
hadoop
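After the script finishes, a quick check like the following (not part of the original script; -o BatchMode=yes makes ssh fail rather than prompt if a key is missing) confirms passwordless login to the namenode itself and to every slave:

for h in $HOSTNAME `cat slaves.host`
do
ssh -o BatchMode=yes $h hostname
done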
Test run:
hadoop1@ubuntu:~$ ./generate_ssh_v1 hadoop1 1234

####################################################
   generate the rsa public key on ubuntu ...
####################################################
spawn ssh hadoop1@ubuntu
The authenticity of host 'ubuntu (192.168.128.130)' can't be established.
ECDSA key fingerprint is 53:c7:7a:dc:3b:bc:34:00:4a:6d:18:1c:5e:87:e7:e8.
Are you sure you want to continue connecting (yes/no)? yes
Warning: Permanently added 'ubuntu,192.168.128.130' (ECDSA) to the list of known hosts.
hadoop1@ubuntu's password:
Welcome to Ubuntu 12.04.2 LTS (GNU/Linux 3.5.0-23-generic x86_64)

 * Documentation:  https://help.ubuntu.com/

Last login: Mon Sep 23 15:22:03 2013 from ubuntu
ssh-keygen -t rsa -P '' -f /home/hadoop1/.ssh/id_rsa
hadoop1@ubuntu:~$ ssh-keygen -t rsa -P '' -f /home/hadoop1/.ssh/id_rsa
Generating public/private rsa key pair.
Your identification has been saved in /home/hadoop1/.ssh/id_rsa.
Your public key has been saved in /home/hadoop1/.ssh/id_rsa.pub.
The key fingerprint is:
e1:5e:20:9d:4e:11:f8:dc:05:35:08:83:5d:ce:99:ed hadoop1@ubuntu
The key's randomart image is:
+--[ RSA 2048]----+
|        +=+o+o   |
|       o..*.+..  |
|       .o*.=..   |
|        =oo..    |
|       S . E     |
|        . .      |
|         .       |
|                 |
|                 |
+-----------------+
hadoop1@ubuntu:~$ ssh-copy-id -i /home/hadoop1/.ssh/id_rsa.pub ubuntu
hadoop1@ubuntu's password:
Now try logging into the machine, with "ssh 'ubuntu'", and check in:

  ~/.ssh/authorized_keys

to make sure we haven't added extra keys that you weren't expecting.

hadoop1@ubuntu:~$
####################################################
   copy the namenode's authorized_keys to slaves ...
####################################################
spawn ssh hadoop1@hadoop
The authenticity of host 'hadoop (192.168.128.138)' can't be established.
ECDSA key fingerprint is 10:8f:d1:8e:63:0a:af:1e:fb:d9:a8:bb:9a:39:ab:46.
Are you sure you want to continue connecting (yes/no)? yes
Warning: Permanently added 'hadoop,192.168.128.138' (ECDSA) to the list of known hosts.
hadoop1@hadoop's password:
Welcome to Ubuntu 12.04.2 LTS (GNU/Linux 3.5.0-23-generic i686)

 * Documentation:  https://help.ubuntu.com/

  System information as of Tue Aug 6 20:11:49 CST 2013

  System load:  0.1               Processes:           76
  Usage of /:   24.8% of 7.12GB   Users logged in:     2
  Memory usage: 34%               IP address for eth0: 192.168.128.138
  Swap usage:   0%

  Graph this data and manage this system at https://landscape.canonical.com/

85 packages can be updated.
45 updates are security updates.

Last login: Tue Aug 6 20:11:16 2013 from 192.168.128.130
hadoop1@hadoop:~$ rm -rf /home/hadoop1/.ssh
hadoop1@hadoop:~$ mkdir /home/hadoop1/.ssh
hadoop1@hadoop:~$ spawn scp /home/hadoop1/.ssh/authorized_keys hadoop1@hadoop:/home/hadoop1/.ssh/
hadoop1@hadoop's password:
authorized_keys                               100%  396     0.4KB/s   00:00
hadoop1@ubuntu:~$ ssh ubuntu
Welcome to Ubuntu 12.04.2 LTS (GNU/Linux 3.5.0-23-generic x86_64)

 * Documentation:  https://help.ubuntu.com/

Last login: Mon Sep 23 16:13:39 2013 from ubuntu
hadoop1@ubuntu:~$ exit
logout
Connection to ubuntu closed.
hadoop1@ubuntu:~$ ssh hadoop
Welcome to Ubuntu 12.04.2 LTS (GNU/Linux 3.5.0-23-generic i686)

 * Documentation:  https://help.ubuntu.com/

  System information as of Tue Aug 6 20:12:17 CST 2013

  System load:  0.12              Processes:           76
  Usage of /:   24.8% of 7.12GB   Users logged in:     2
  Memory usage: 34%               IP address for eth0: 192.168.128.138
  Swap usage:   0%

  Graph this data and manage this system at https://landscape.canonical.com/

85 packages can be updated.
45 updates are security updates.
Last login: Tue Aug 6 20:11:50 2013 from 192.168.128.130
hadoop1@hadoop:~$ exit
logout
Connection to hadoop closed.

Summary: when I first wrote the script, I put shell commands directly after spawn, and the expect statements that followed never matched anything.
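To make that pitfall concrete, here is a hedged before/after fragment (reusing the variables from the script above): inside expect -c, a bare shell command is parsed as Tcl source rather than typed into the spawned session, so every command has to be wrapped in send, each preceded by an expect for the prompt:

# wrong: the keygen line is Tcl source here, nothing is typed into the session
expect -c "
spawn ssh $user@$ip
ssh-keygen -t rsa -P '' -f $HOME/.ssh/id_rsa
"

# right: wait for the prompt, then send the command as keystrokes
expect -c "
spawn ssh $user@$ip
expect \"password:\"
send -- \"$pass\r\"
expect \"$\"
send \"ssh-keygen -t rsa -P '' -f $HOME/.ssh/id_rsa\r\"
expect eof
"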
Share, grow, be happy.
When reposting, please cite the blog address: http://blog.csdn.net/fansy1990