I attach a picture which illustrates my problem.
When I open HDF files and want to get a picture, or see whether a picture covers the Atlantic region, I have to double-click on each line, click on the "Spatial Subset" tab, and check the coverage.
If the coverage is correct, I make the display. If not, I double-click on the next one.
Is there a possibility to do this import with a local server (like I do with AREA files)?
Thank you again.
Best Regards
françois
Error when opening HDF file in McIDAS
- francoisborel
- Posts: 75
- Joined: Sat Oct 22, 2016 6:21 pm
Re: Error when opening HDF file in McIDAS
Hello -
There are no local servers for working with this Nimbus satellite data.
As for finding the geographical extents of each data file, you could run something like the following in the Jython Shell (change file_dir to point at the directory with your data).
Code:
# McIDAS-V Jython Shell script
import os

# define variables
file_dir = "C:/Users/rcarp/Data/francois/"
file_list = os.listdir(file_dir)
num_files = len(file_list)

# loop through all of the files in the file_dir directory
# getDomainSet is a JPythonMethod that returns dimension lengths
# and ranges. In this case, latitude and longitude.
# The output is the file name, followed by what is returned
# from getDomainSet()
for n in range(0, num_files):
    data = loadGrid(filename=file_dir+file_list[n], field="HRIR-Temp")
    dataDomainSet = getDomainSet(data)
    print file_list[n]+":"
    print dataDomainSet
getDomainSet() is a JPythonMethod that extracts coordinate information about the data. The output will look something like:
Code: Select all
NmHRIR1H.19660926_13-51-17_1788_008.hdf:
visad.Gridded2DSet: Dimension = 2 Length = 95472
Dimension 0: Length = 432 Range = 100.22484 to 177.18561
Dimension 1: Length = 221 Range = -47.494244 to -20.164627
NmHRIR1H.19660926_15-12-39_1789_003.hdf:
visad.Gridded2DSet: Dimension = 2 Length = 94176
Dimension 0: Length = 432 Range = 89.001335 to 179.3158
Dimension 1: Length = 218 Range = 29.303286 to 59.02645
NmHRIR1H.19660926_15-17-34_1789_004.hdf:
visad.Gridded2DSet: Dimension = 2 Length = 95040
Dimension 0: Length = 432 Range = 94.73797 to 166.9683
Dimension 1: Length = 220 Range = 16.643381 to 43.157455
Where "Length" is the total number of data points, "Dimension 0" is longitude, and "Dimension 1" is latitude. The Range of Dimension 0 and Dimension 1 would be the lon/lat bounds of each data file. You can use this output to figure out if each file contains your desired lat/lon location. You can find more information about inspecting these data objects pulled in via loadGrid() in the Advanced Scripting - User Defined Functions scripting tutorial on the Documentation page.
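Once you have the printed ranges, deciding whether a file touches your region of interest is a simple interval check. Here is a plain-Python sketch (covers_point is a hypothetical helper, not part of McIDAS-V; the numbers are taken from the second file in the sample output above):

```python
def covers_point(lon_range, lat_range, lon, lat):
    """True if the point (lon, lat) falls inside the file's domain ranges."""
    lon_min, lon_max = lon_range
    lat_min, lat_max = lat_range
    return lon_min <= lon <= lon_max and lat_min <= lat <= lat_max

# Dimension 0 (longitude) and Dimension 1 (latitude) ranges from the
# sample getDomainSet() output above
print(covers_point((89.001335, 179.3158), (29.303286, 59.02645), 140.0, 45.0))  # -> True
```

To find your Atlantic files, you would test a lon/lat point of interest against each file's Dimension 0 and Dimension 1 ranges and only display the files that match.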
- Bob
Re: Error when opening HDF file in McIDAS
Hi francoisborel,
In addition to Bob's solution above, I wanted to add that if you load multiple files then click through each of them in the Field Selector's Data Sources section and also click the desired field in its Fields section, it will show the approximate domain of the image. In other words, you don't have to open each image's Properties window and click its Spatial Subset tab just to see the approximate domain.
For example, here are screenshots showing my Data Explorer window when I click each of the three "3 files" entries in the Data Sources section.
Re: Error when opening HDF file in McIDAS
Hello Bob, hello Barry,
Thank you both very much for your help.
For Barry: Yes, I saw that, but there are several mouse clicks to do for each file and I have downloaded more than a thousand files, so I was looking for another way. Thank you anyway!
For Bob: I understood from the documentation that I have to go to Tools -> Formulas -> Jython Shell. I pasted your code at the bottom of the window (after changing the directory) and clicked Evaluate.
But I get an error. The system tells me that the directory does not exist (although it is correct). I saw that it appends a "/1" to the directory and I don't know why.
I attached two pictures: before_run.jpg (when I paste the command lines) and after_run.jpg (after clicking Evaluate).
Thank you for letting me work with such interesting things.
Cheers
françois
Re: Error when opening HDF file in McIDAS
Hello -
I believe the reason you are seeing the error is that your "C:/refonte_videos/1964_1978/24/" directory contains not only files but also sub-directories. From the error message, it looks like you have a directory in C:/refonte_videos/1964_1978/24 called "1". os.listdir() returns both file and directory names, and the script passes each entry to loadGrid(), which fails on a directory. Therefore, with the current state of the script, you will have to point file_dir at a directory that contains only the HDF files and no sub-directories. If you explore the Python os documentation, there are likely ways to modify the script to look through multiple directories.
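One way to make the script tolerate sub-directories is to filter the os.listdir() output with os.path.isfile(). A minimal plain-Python sketch (list_data_files is a hypothetical helper, not part of McIDAS-V; in the Jython Shell you would loop over its result with loadGrid() as before):

```python
import os

def list_data_files(file_dir):
    """Return only the regular files in file_dir, skipping sub-directories."""
    return sorted(name for name in os.listdir(file_dir)
                  if os.path.isfile(os.path.join(file_dir, name)))
```

With this filter in place, a stray folder like "1" would simply be ignored instead of causing an error.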
As another note, I simplified the script I gave you earlier:
Code:
# McIDAS-V Jython Shell script
import os

# define variables
file_dir = "C:/Users/rcarp/Data/francois/"
file_list = os.listdir(file_dir)

# loop through all of the files in the file_dir directory
# getDomainSet is a JPythonMethod that returns dimension lengths
# and ranges. In this case, latitude and longitude.
# The output is the file name, followed by what is returned
# from getDomainSet()
for x in file_list:
    data = loadGrid(filename=file_dir+x, field="HRIR-Temp")
    dataDomainSet = getDomainSet(data)
    print str(x)+":"
    print dataDomainSet
The change from earlier is how we loop through the file names in the directory. Instead of looping through the file-number indices (num_files), we loop through the actual file-name strings in the file_list list. This gives the same output as the earlier script; it is just a bit cleaner.
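The equivalence of the two loop styles can be checked in plain Python (file_list here is just a stand-in list of names, not real data):

```python
file_list = ["NmHRIR1H.a.hdf", "NmHRIR1H.b.hdf"]

# first script's style: loop over indices into the list
by_index = []
for n in range(0, len(file_list)):
    by_index.append(file_list[n])

# simplified script's style: loop over the names directly
by_name = []
for x in file_list:
    by_name.append(x)

print(by_index == by_name)  # -> True
```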
Thanks -
Bob
Re: Error when opening HDF file in McIDAS
Hello Bob
Yes, it works!! I just deleted the "1" folder and used your first script. I will use your second script to try to understand how all this works...
Thank you very much. I now have everything I need to make my pictures with Nimbus... Fantastic!
Thank you.
Have a nice weekend
françois