
NFS Sharing, DNS Configuration, and AWK Usage

2014-09-20 23:18
Check whether nfs and rpcbind are installed:

#rpm -qa|grep nfs
#rpm -qa|grep rpcbind

Set nfs and rpcbind to start on boot:
#chkconfig nfs on
#chkconfig rpcbind on

#Start the rpcbind and nfs services
#service rpcbind start
#service nfs start

#Check the status of the rpcbind and nfs services
#service rpcbind status
#service nfs status

Command to install rpcbind and nfs-utils (if not already present):
#yum -y install rpcbind nfs-utils

#Configure the shared directory
#vi /etc/exports
Add the following line:
/home/hadoop *(rw,sync,no_root_squash)
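
For reference, the options in that export line mean roughly the following (the annotations are mine, not part of the original file):

# /etc/exports format: <directory> <client>(<options>)
#   *               any client may mount the export
#   rw              allow both reads and writes
#   sync            reply only after writes reach stable storage
#   no_root_squash  remote root is NOT mapped to nobody (convenient for a
#                   trusted cluster, but unsafe on untrusted networks)
/home/hadoop *(rw,sync,no_root_squash)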

Restart the rpcbind and nfs services:
#service rpcbind restart
#service nfs restart

From another server (192.168.1.58), check that the export is visible:
#showmount -e 192.168.1.57

Mount the shared directory:
#mkdir /nfs_share
#mount -t nfs 192.168.1.58:/home/hadoop/ /nfs_share/
#mount

Mount the NFS directory automatically at boot:
#vi /etc/fstab
Add the following line:
192.168.1.58:/home/hadoop /nfs_share nfs defaults 1 1
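
The six fstab fields, annotated (annotations mine; note that many guides use 0 0 for the last two fields on network filesystems, since fsck cannot check NFS):

# <device>                  <mountpoint> <fstype> <options> <dump> <fsck order>
192.168.1.58:/home/hadoop   /nfs_share   nfs      defaults  1      1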

Create a symlink to the shared SSH authorized_keys file:
#ln -s /nfs_share/.ssh/authorized_keys ./.ssh/authorized_keys
Test that passwordless login works:
ssh 192.168.1.58

DNS Server Configuration
#rpm -qa|grep bind
#rpm -qa|grep bind-utils
#rpm -qa|grep bind-chroot

#yum -y install bind
#yum -y install bind-utils
#yum -y install bind-chroot

#rpm -qa|grep '^bind'
#If bind-chroot turns out not to be installed, run again:
yum -y install bind-chroot

#vi /etc/named.conf
listen-on port 53 {any;};
allow-query {any;};
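
For context, those two directives live inside the options block of /etc/named.conf; a sketch of the relevant stanza is below (the other directives shown are typical CentOS defaults and may differ on your system):

options {
        listen-on port 53 { any; };
        allow-query     { any; };
        directory       "/var/named";
        recursion       yes;
};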

#vi /etc/named.rfc1912.zones

zone "hadoop.com" IN {
        type master;
        file "named.hadoop.com";
        allow-update { none; };
};

zone "0.168.192.in-addr.arpa" IN {
        type master;
        file "named.192.168.0.zone";
        allow-update { none; };
};

cd /var/named/
cp -p named.localhost named.hadoop.com
vi named.hadoop.com
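The original post elides the zone file contents. A minimal forward zone sketch is below, assuming hostnames (namenode57) and the 192.168.1.x addresses used elsewhere in this article; adjust names and addresses to your own hosts:

$TTL 1D
@       IN SOA  hadoop.com. root.hadoop.com. (
                                        0       ; serial
                                        1D      ; refresh
                                        1H      ; retry
                                        1W      ; expire
                                        3H )    ; minimum
        IN NS   hadoop.com.
        IN A    192.168.1.60
namenode57      IN A    192.168.1.57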

cp -p named.localhost named.192.168.0.zone
vi named.192.168.0.zone
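
A matching reverse zone sketch (hypothetical PTR record; note that the zone as declared covers 192.168.0.x while the servers in this article are on 192.168.1.x, so in a real deployment the zone name and addresses would need to be aligned):

$TTL 1D
@       IN SOA  hadoop.com. root.hadoop.com. (
                                        0       ; serial
                                        1D      ; refresh
                                        1H      ; retry
                                        1W      ; expire
                                        3H )    ; minimum
        IN NS   hadoop.com.
60      IN PTR  namenode.hadoop.com.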

Client configuration:

#vi /etc/sysconfig/network-scripts/ifcfg-eth0

DNS1=192.168.1.60

#Reload the DNS configuration, or restart the service
#service named reload
#service named restart
#service named stop
#service named start

Set named to start on boot:
#chkconfig named on

Verify name resolution:

nslookup 192.168.1.60

Batch copy with AWK:

cat list.txt | awk -F ':' '{print "scp hadoop.tar.gz hadoop@" $1 ":/app/hadoop"}'

scp hadoop.tar.gz hadoop@192.168.1.60:/app/hadoop
scp hadoop.tar.gz hadoop@192.168.1.59:/app/hadoop
scp hadoop.tar.gz hadoop@192.168.1.58:/app/hadoop

cat list.txt | awk -F ':' '{print "scp hadoop.tar.gz hadoop@" $1 ":/app/hadoop"}' > listscp.sh

[hadoop@namenode57 ~]$ cat listscp.sh
scp hadoop.tar.gz hadoop@192.168.1.60:/app/hadoop
scp hadoop.tar.gz hadoop@192.168.1.59:/app/hadoop
scp hadoop.tar.gz hadoop@192.168.1.58:/app/hadoop
[hadoop@namenode57 ~]$ chmod 775 listscp.sh
[hadoop@namenode57 ~]$ ./listscp.sh
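
The pipeline above can be sanity-checked end to end without the real host list; a minimal sketch, assuming a hypothetical list.txt whose colon-separated first field is the host IP:

```shell
# Create a sample list.txt (hypothetical contents; first field is the IP)
printf '192.168.1.60:node60\n192.168.1.59:node59\n' > list.txt

# Generate one scp command per host, exactly as in the article
awk -F ':' '{print "scp hadoop.tar.gz hadoop@" $1 ":/app/hadoop"}' list.txt > listscp.sh

cat listscp.sh
```

This emits one scp line per entry in list.txt; since -F ':' splits each record on colons, $1 is the IP regardless of what follows the first colon.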