Move sampling code to a python file from transient notebook for other validation approaches #23
dbekaert merged 20 commits into nisar-solid:main from kanglcn:main
Conversation
…ia --dayless and --daymore options
@@ -46,19 +46,25 @@
"NISAR must be able to constrain displacements in two look directions (left-looking on both ascending and descending tracks) for the active area of 70% of target sites with an accuracy that scales with baseline distance L between any two locations within a scene. This pertains to all directly measured or inferred 12-day interferograms over the duration of the mission. Here, accuracy is calculated using L in kilometers but with the units removed, at or better than 100 m resolution. The 12-day time scale corresponds to a repeat NISAR pass on each of the ascending and descending satellite tracks, from which interferograms can be made and displacements estimated. The NISAR mission has compiled a list of 2000 global targets covering areas of known or potential transient deformation related to the processes specified in the requirement (NISAR Handbook Appendix H, 2018). These targets include all Earth's active volcanoes above sea-level, areas of rapid glacial mass changes, selected deforming reservoirs of water, oil, gas, CO2 and steam and landslide-prone areas near major population centers, as well as sites where selected disaster-related events have occurred. \n",
The Secular and Coseismic requirements mention a three-year interval, but Transient does not. If Sentinel-1 acquisitions are complete, then we will get 30 cycles per year. Since the sampling scheme of the Transient notebook only uses pairs without common dates, that would leave only 15 pairs per year. It seems we need at least two years to get a reasonable sample for the statistics. Three years is good, too. The download should not be very large if we only fetch the 12-day pairs for the three years.
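The pair count above follows from the no-common-date constraint: each 12-day pair consumes two acquisition dates, so a complete 6-day record yields roughly half as many independent pairs as acquisitions. A minimal sketch of such a greedy pairing (the function name and logic are illustrative, not the notebook's actual code):

```python
from datetime import date, timedelta

def independent_12day_pairs(dates):
    """Greedily select 12-day pairs so that no two pairs share an
    acquisition date. Hypothetical helper for illustration only."""
    dateset = set(dates)
    used = set()
    pairs = []
    for d in sorted(dates):
        d2 = d + timedelta(days=12)
        if d not in used and d2 not in used and d2 in dateset:
            pairs.append((d, d2))
            used.update((d, d2))
    return pairs

# 6-day repeat (combined S1A + S1B) over ~2 months: 11 acquisitions
acqs = [date(2021, 1, 1) + timedelta(days=6 * i) for i in range(11)]
print(len(independent_12day_pairs(acqs)))  # roughly half the acquisitions pair up
```

With gaps in the record (as in the 34-products case discussed below), the greedy count drops further, since a missing date breaks both the pair it belonged to and blocks its 12-day partner.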
I made the change as it is Eric's requirement. Eric says:
but the NISAR primary mission is still defined as 3 years, so we will need to validate NISAR data before 3 years are passed. I think we should use 3 years or less for the notebooks.
I used only one year for the Coseismic requirement notebook.
I believe that the track 144 over the Central Valley has both S1A and S1B acquisitions, so that means that we can get 30 independent pairs with 12-day lengths (15 S1A-S1A pairs and 15 S1B-S1B pairs). Is 30 independent pairs enough for the Transient statistics?
I only got 34 products downloaded, and 18 independent pairs are left. I don't know why this is less than your estimate. 18 pairs may not be enough.
Sorry, I forgot to remove several code cells that I used to test the pickle file interface for Adrian. I will make a new commit to remove them.
The *.ipynb file naming is currently not consistent between different requirements. I like the one from Adrian as
I thought that the first part of the notebook name
Shorter names sound even better.
Yeah, I agree. Let's make them consistent with a short name.
…samp_pair from remove_trend to deramp; modify gitignore; modify transient_base notebook to make it more clear and readable; add more descriptions to sampling.py.
All, I have made an update according to your comments. Please check if it solves the problems you mentioned. Thanks for your effort. @yunjunz @EJFielding @dbekaert
Thank you @kanglcn for the updates. I left a few more minor comments, hopefully for the last round. Could you also check the style issues flagged by Codacy?
Thanks @yunjunz. I will look into them. Thanks for pointing out the Codacy problem. I haven't used it before, but I think this shouldn't be a big problem.
I have updated according to @yunjunz's suggestions. Please check.
Thanks, @yunjunz, for helping me fix more Codacy issues. I couldn't find them even with their hints.
I have moved the sampling code to `ATBD/solid_utils/sampling.py`. It includes several functions to calculate local coordinates for products, perform the sampling, and pair samples up. Detailed usage is documented in the docstrings. Also, I decreased the number of products downloaded by using the `--daysless` and `--daysmore` options in `ariaDownload.py`, and I am now using three years of products.
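Since the requirement's accuracy scales with the baseline distance L in kilometers, a local-coordinate step like the one described above is needed to turn product lat/lon grids into distances. A minimal sketch of such a conversion, assuming a simple flat-earth (equirectangular) approximation; the function name and approach are illustrative and may differ from what `sampling.py` actually implements:

```python
import numpy as np

def to_local_km(lat, lon, ref_lat, ref_lon):
    """Convert geographic coordinates (degrees) to local east/north
    distances in km from a reference point, via a flat-earth
    (equirectangular) approximation. Hypothetical sketch only."""
    km_per_deg_lat = 111.32  # mean km per degree of latitude
    east = (np.asarray(lon) - ref_lon) * km_per_deg_lat * np.cos(np.radians(ref_lat))
    north = (np.asarray(lat) - ref_lat) * km_per_deg_lat
    return east, north

# Point ~0.5 deg north and east of a Central Valley reference
east, north = to_local_km(37.0, -120.0, 36.5, -120.5)
```

The baseline distance between two sample points then follows from `np.hypot` on the east/north differences.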