Reading and Displaying HDF Files in IDL
2014-01-06 15:35
A method I came across; I don't know who wrote it, so I cannot credit the source. My apologies to the author!
Reading HDF/netCDF Data using IDL
In this seminar we will explore the commands available for exploring HDF and netCDF data sets and retrieving data using IDL. Then we will look at procedures for automatically printing all relevant information from a file (using a MODIS file for HDF-EOS and
a GOES file for netCDF), and a second procedure for extracting data once you know what you are looking for.
HDF/netCDF files typically contain several datasets. The entire file and each dataset within it are accompanied by a number of descriptive fields called attributes. Importing your desired dataset from an unfamiliar file into IDL requires the following steps:
Open the file and assign it a file ID.
Find the number of file attributes and datasets.
Read the file attributes.
Select a dataset and assign it a dataset ID.
Find the number of dataset attributes.
Read dataset attributes.
Import the dataset.
Once you are completely familiar with the file, steps 2, 3, 5, and 6 may be omitted. Required syntax will be designated by bold script, user-supplied variables by italics. The user-supplied variables may be displayed using the print command. I will include examples as though you were typing them directly in the IDL environment.
--------------------------------------------------------------------------------
Reading Data in HDF
1. Open the file and assign it a file ID
fileID = HDF_SD_Start('filename', /read)
If you have assigned the actual filename to a variable (file = 'my_file.hdf'), you do not need the quotes. When you are completely through with the file you should close it using the HDF_SD_End, fileID command.
2. Find the number of file attributes and datasets
HDF_SD_FileInfo, FileID, num_datasets, num_attributes
3. Read the file attributes
HDF_SD_AttrInfo, FileID, attribute_index, name = attr_name, data = attr_data
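As a sketch, steps 1 through 3 can be combined into a loop that dumps every global attribute; the filename 'my_file.hdf' is a placeholder:

```idl
; Open an HDF file (placeholder name) and print every global attribute
fileID = HDF_SD_Start('my_file.hdf', /read)
HDF_SD_FileInfo, fileID, num_datasets, num_attributes
for attndx = 0, num_attributes - 1 do begin
  HDF_SD_AttrInfo, fileID, attndx, name = attr_name, data = attr_data
  print, attr_name, ': ', attr_data
endfor
HDF_SD_End, fileID
```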
4. Select a dataset and assign it a dataset ID
If you already know the name of the dataset, then you may use
dataset_index = HDF_SD_NameToIndex(fileID, dataset_name)
Then you use this index to assign a dataset ID:
datasetID = HDF_SD_Select(fileID, dataset_index)
If you don't yet know the exact name of the dataset, you'll have to explore the datasets one by one (steps 5 and 6). The datasets are zero-indexed; for example, EOS data starts with the geolocation data, so that longitude is index = 0, latitude is index = 1, etc.
5. Find the number of dataset attributes
HDF_SD_GetInfo, datasetID,name = dataset_name, natts = num_attributes, $
ndim=num_dims, dims =dimvector
The keyword variables assigned with equals signs are optional.
6. Read the dataset attributes
HDF_SD_AttrInfo, datasetID, attribute_index, name = attr_name, data = attr_data
Note that this is the same syntax used to read the global file information. You will want to get the scale factor and offset from the attribute data to convert from the integerized data to the true data (see below).
7. Import the selected dataset
HDF_SD_GetData, datasetID, data_variable, $
Start = [x, y, z], Count=[xdim,ydim,zdim], Stride=[xjump,yjump,zjump]
Where the Start, Count, and Stride keywords are optional.
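For instance, to read every other point from a two-dimensional dataset (the 2048 x 1024 size here is assumed for illustration):

```idl
; Read every other element of an assumed 2048 x 1024 2-D dataset
HDF_SD_GetData, datasetID, data_variable, $
  Start = [0, 0], Count = [1024, 512], Stride = [2, 2]
help, data_variable   ; should report a 1024 x 512 array
```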
If the data is in integerized form, you will need to convert it to the true values:
true_data = scale*(filedata - offset)
Now you're done!
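The seven steps can be strung together into a minimal sketch. The filename, the dataset name 'sst', and the attribute names 'scale_factor' and 'add_offset' are assumptions for illustration; use the names your own file reports:

```idl
; Minimal end-to-end HDF read; all names below are assumptions
fileID = HDF_SD_Start('my_file.hdf', /read)
dataset_index = HDF_SD_NameToIndex(fileID, 'sst')
datasetID = HDF_SD_Select(fileID, dataset_index)
HDF_SD_GetInfo, datasetID, natts = num_attributes
scale = 1.0 & offset = 0.0
for attndx = 0, num_attributes - 1 do begin
  HDF_SD_AttrInfo, datasetID, attndx, name = attr_name, data = attr_data
  if attr_name eq 'scale_factor' then scale = attr_data[0]
  if attr_name eq 'add_offset' then offset = attr_data[0]
endfor
HDF_SD_GetData, datasetID, filedata
true_data = scale * (filedata - offset)   ; convert integerized data
HDF_SD_EndAccess, datasetID
HDF_SD_End, fileID
```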
Procedures for Exploring and Importing data from HDF-EOS Files
Both sample files discussed below may be copied from the MetoGrads directory into a newly created directory 'hdfsem' by the following UNIX commands:
mkdir hdfsem
cp ~gcm/hdf_netcdf/*.pro hdfsem/
The modis_sds.pro procedure will loop through all the attributes for each data set in an hdf file, and print the information out to a txt file of your choice. When you run the program, you will be prompted to select the file of your choice from a directory.
There is a cloud file in the MetoGrads directory. To print information about it into the file 'cloud_info.txt' issue the following command from the IDL command line:
modis_sds,'cloud_info.txt'
When the file selection window appears, edit the Filter box (top of the window) to read '/homes/metogra/gcm/hdf_netcdf/*.hdf' and hit the 'filter' button at the bottom of the window. You'll see an HDF file on the right side. Select it with the mouse and
hit the 'okay' button at the bottom of the screen. IDL will tell you the information has been saved in 'cloud_info.txt'. You can open this file to find what the variables are named so that you can use the exact character string to extract the data.
The modread.pro procedure will read a specified variable from an HDF file, convert it from integerized form, and also provide information like the variable dimensions and the fill value (which it replaces with the IDL fill value !Values.F_NaN). Let's look
at the 'Cloud_Top_Temperature' variable. The procedure requires a file name, but instead of spelling the whole thing out, let's pick it out of a list and store it in a variable called 'filename':
filename = dialog_pickfile(filter='*.hdf')
Now we can read the data and put it in the variable Tcld using the following command:
modread, filename, Tcld, 'Cloud_Top_Temperature', dims, fillvalue
Now that the variable is stored in Tcld, you can explore it using IDL.
--------------------------------------------------------------------------------
Reading Data in netCDF
Much of this will look similar to the HDF methodology, so some of the commentary is reduced. Those who want a fuller explanation may refer to similar sections above.
1. Open the file and assign it a file ID
fileID = ncdf_open('filename', /nowrite)
When you are completely through with the file you should close it using the ncdf_close, fileID command.
2. Find the number of file attributes and datasets (or variables). The information will be contained in the structure variable that we have named 'fileinq_struct', but you may give it any name you wish so long as you use the proper record names.
fileinq_struct=ncdf_inquire(fileID)
nvars = fileinq_struct.nvars
natts = fileinq_struct.natts
3. Read the file (global) attributes
global_attname=ncdf_attname(fileID,attndx,/global)
ncdf_attget,fileID,global_attname,value,/global
4. Use the variable index to get the name, dimensions, and number of attributes
varinq_struct=ncdf_varinq(fileID,varndx)
variable_name = varinq_struct.name
dimensions = varinq_struct.dims
numatts = varinq_struct.natts
Note that there is no direct analog of HDF's NameToIndex function for use with ncdf_varinq, so if you don't know the variable name you'll have to explore the variables one by one by index to find the one you want (once you do know the exact name, ncdf_varid in step 6 will look it up for you).
5. Read the variable attributes
First get the name of the attribute by index
attname=ncdf_attname(fileID,varndx,attndx)
Now read the attribute
ncdf_attget,fileID,varndx,attname,value
Note how this uses the same command as for getting global attributes, but the variable index must be included when the /global switch is not set.
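As a sketch, steps 2 through 5 can be combined into a loop that lists every variable and its attributes, assuming the file is already open as fileID (step 1):

```idl
; List every variable and its attributes in an open netCDF file
fileinq_struct = ncdf_inquire(fileID)
for varndx = 0, fileinq_struct.nvars - 1 do begin
  varinq_struct = ncdf_varinq(fileID, varndx)
  print, 'Variable: ', varinq_struct.name
  for attndx = 0, varinq_struct.natts - 1 do begin
    attname = ncdf_attname(fileID, varndx, attndx)
    ncdf_attget, fileID, varndx, attname, value
    print, '  ', attname, ' = ', value
  endfor
endfor
```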
6. Get an ID for the variable
varID=ncdf_varid(fileID,varname)
7. Import the selected dataset
ncdf_varget,fileID,varID,variable
If the data is in integerized form, you will need to convert it to the true values using the scale factor and offset that (hopefully) is stored in the attributes for the variable.
Now you're done!
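Putting the seven netCDF steps together, a minimal sketch (the filename 'my_file.nc', the variable name 'temperature', and the attribute names 'scale_factor'/'add_offset' are assumptions; check your own file for the actual names):

```idl
; Minimal end-to-end netCDF read; all names below are assumptions
fileID = ncdf_open('my_file.nc', /nowrite)
varID = ncdf_varid(fileID, 'temperature')
ncdf_varget, fileID, varID, filedata
scale = 1.0 & offset = 0.0
varinq_struct = ncdf_varinq(fileID, varID)
for attndx = 0, varinq_struct.natts - 1 do begin
  attname = ncdf_attname(fileID, varID, attndx)
  ncdf_attget, fileID, varID, attname, value
  if attname eq 'scale_factor' then scale = value[0]
  if attname eq 'add_offset' then offset = value[0]
endfor
true_data = scale * filedata + offset
ncdf_close, fileID
```

Note that the unpacking convention usually documented for netCDF data is true = scale*packed + offset, which differs from the HDF formula given earlier; consult your file's documentation if in doubt.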
IDL Procedures for Exploring and Reading netCDF Data
You will need to copy the goes data file to your directory, as well as the IDL procedures 'ncdfshow.pro' and 'ncdfread.pro'.
The ncdfshow.pro procedure will find the number of variables, the number of attributes per variable, and loop through them in order to write them into a text file that you specify. To run it from the IDL command line type:
ncdfshow,'fileinfo.txt'
Where you should typically provide something more descriptive than 'fileinfo' to write the result into. You will be prompted to select a file, so select the GOES data file provided. After it runs, open fileinfo.txt and see what information is in the file.
Now that you know the name of the variable you want to look at, you can run the ncdfread.pro procedure to get the data in raw form. You will want to put the actual filename into a variable that's a little easier to handle first.
filename = dialog_pickfile()
ncdfread,filename,'variable_name',data_variable,dims
Where 'variable_name' is a string that must match exactly the name you found in 'fileinfo.txt', data_variable is where the data ends up, and dims is a vector of the array dimensions.
Now that you have the data in an IDL variable, you may need to use any offset and scale information found in the attributes to scale it to a physically meaningful quantity.
--------------------------------------------------------------------------------
Sample IDL Program: Reading and Displaying a MODIS SST HDF File
pro readhdf
; Set some constants
FILE_NAME = "E:/sssbak/MODSST.hdf"
SDS_NAME = "sst"
X_LENGTH = 1354
Y_LENGTH = 4856

; Open the file and initialize the SD interface
sd_id = HDF_SD_START(FILE_NAME, /read)
; Find the index of the SDS to read using its name
sds_index = HDF_SD_NAMETOINDEX(sd_id, SDS_NAME)
; Select it
sds_id = HDF_SD_SELECT(sd_id, sds_index)

; Set the data subset limits. Reading starts from the first element, so "start"
; is [0,0]; X_LENGTH elements will be read along the X axis and Y_LENGTH along Y.
start = INTARR(2)      ; the start position of the data to be read
start[0] = 0
start[1] = 0
edges = INTARR(2)      ; the number of elements to read in each direction
edges[0] = X_LENGTH
edges[1] = Y_LENGTH
; Read the data -- note that you do not need to allocate the data array yourself
HDF_SD_GETDATA, sds_id, data
dims = size(data)
;HDF_SD_GETDATA, sds_id, data, start = start, count = edges

; Optionally print the values. Note that HDF_SD_GETDATA swaps the HDF index
; convention [Z,Y,X] to [X,Y,Z], which is more efficient in IDL; if you prefer
; the usual HDF convention, set the NOREVERSE keyword when calling HDF_SD_GETDATA.
;FOR i = 0, (X_LENGTH - 1) DO BEGIN      ; crosses X axis
;  FOR j = 0, (Y_LENGTH - 1) DO BEGIN    ; crosses Y axis
;    PRINT, FORMAT='(I," ",$)', data[i,j]
;  ENDFOR
;  PRINT, ""
;ENDFOR

; End access to the SDS
HDF_SD_ENDACCESS, sds_id
; Close the HDF file
HDF_SD_END, sd_id
print, dims[1]   ; 1354 -- that is, X_LENGTH and Y_LENGTH can be read from size()
print, dims[2]   ; 4856
print, dims[3]   ; for a 2-D array this is the type code ...
print, dims[4]   ; ... and this is the total element count

;imagedata = BYTSCL(data)
imagedata = BYTSCL(CONGRID(data, 800, 600))   ; shrink the image
;print, imagedata
loadct, 15   ; different numbers load different color tables
window, 0, xsize=800, ysize=600, retain=2
tv, imagedata
end