
Explaining DistributedFileSystem with a file-upload example

2013-08-21 19:53
import java.io.BufferedInputStream;
import java.io.FileInputStream;
import java.io.FileNotFoundException;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;
import org.apache.hadoop.util.Progressable;

public class UploadAndDown {

    public static void main(String[] args) {
        UploadAndDown uploadAndDown = new UploadAndDown();
        try {
            // Upload the local file local.txt to HDFS as cloud.txt
            uploadAndDown.upLoadToCloud("local.txt", "cloud.txt");
            // Download cloud.txt from HDFS to the local file cloudTolocal.txt
            uploadAndDown.downFromCloud("cloudTolocal.txt", "cloud.txt");
        } catch (FileNotFoundException e) {
            e.printStackTrace();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    private void upLoadToCloud(String srcFileName, String cloudFileName)
            throws FileNotFoundException, IOException {
        // Location of the local source file
        String LOCAL_SRC = "/home/linuxidc/hbase2/bin/" + srcFileName;
        // Destination of the file on HDFS
        String CLOUD_DEST = "hdfs://localhost:9000/user/linuxidc/" + cloudFileName;
        InputStream in = new BufferedInputStream(new FileInputStream(LOCAL_SRC));
        // Obtain a Configuration object
        Configuration conf = new Configuration();
        // Get the file system for the HDFS URI
        FileSystem fs = FileSystem.get(URI.create(CLOUD_DEST), conf);
        // Output stream; the Progressable callback is invoked as data is written to HDFS
        OutputStream out = fs.create(new Path(CLOUD_DEST), new Progressable() {
            @Override
            public void progress() {
                System.out.println("Upload progress reported by HDFS");
            }
        });
        // Pipe the input stream into the output stream
        // (4th argument true: close both streams when the copy finishes)
        IOUtils.copyBytes(in, out, 1024, true);
    }

    private void downFromCloud(String srcFileName, String cloudFileName)
            throws FileNotFoundException, IOException {
        // The file on HDFS
        String CLOUD_DESC = "hdfs://localhost:9000/user/linuxidc/" + cloudFileName;
        // The local file to download to
        String LOCAL_SRC = "/home/linuxidc/hbase2/bin/" + srcFileName;
        // Obtain a Configuration object
        Configuration conf = new Configuration();
        // Instantiate a file system for the HDFS URI
        FileSystem fs = FileSystem.get(URI.create(CLOUD_DESC), conf);
        // Input stream for reading from HDFS
        FSDataInputStream HDFS_IN = fs.open(new Path(CLOUD_DESC));
        // Output stream for writing the local file
        OutputStream OutToLOCAL = new FileOutputStream(LOCAL_SRC);
        // Copy the contents of the HDFS input stream to OutToLOCAL via IOUtils.copyBytes
        IOUtils.copyBytes(HDFS_IN, OutToLOCAL, 1024, true);
    }

}


  The code above is an example of uploading a file from a client machine to the Hadoop file system (HDFS) and downloading it back; a more concise variant is sketched below.
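For comparison, the same round trip can be written with FileSystem's copyFromLocalFile and copyToLocalFile convenience methods instead of wiring streams together by hand. The sketch below is not from the original post; it assumes the same local paths and the hdfs://localhost:9000 namenode used in the example above.

import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class CopyWithFileSystem {

    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(URI.create("hdfs://localhost:9000"), conf);

        // Upload: copy the local file into HDFS
        // (the two-argument form overwrites the destination if it already exists)
        fs.copyFromLocalFile(new Path("/home/linuxidc/hbase2/bin/local.txt"),
                new Path("hdfs://localhost:9000/user/linuxidc/cloud.txt"));

        // Download: copy the HDFS file back to the local file system
        fs.copyToLocalFile(new Path("hdfs://localhost:9000/user/linuxidc/cloud.txt"),
                new Path("/home/linuxidc/hbase2/bin/cloudTolocal.txt"));

        fs.close();
    }
}

Because the two-argument copyFromLocalFile overwrites an existing destination, rerunning the program simply replaces cloud.txt rather than failing.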