1 2009 CAA Peer Review: Selected Recommendations
CAA 2009 Peer Review, Jesus College, Cambridge, UK, 23-24 March 2009

2 General Recommendations

- Very few, if any, results are presented on instrument cross-calibration.
- Cluster is about measuring parameters at four points and how well one can calculate gradients from them (a minimal gradient sketch follows this slide).
- Show graphic examples of caveats (PEACE): to guide the user in understanding the many instrument caveats, a set of example plots showing the effects of penetrating radiation, photoelectron contamination, etc. on the data products (2D, 3D and moments) would be useful. There is some discussion of how these effects are mitigated (the section on removal of penetrating radiation is TBW), but more detail on the correction methods would be good. Photoelectrons are removed using the spacecraft potential from EFW; how reliable is that? (An illustrative sketch of the idea follows this slide.) What do the effects of EDI operation on PEACE data look like in the data?
- The User Guide in particular needs to be tailored more to the general user. It should avoid instrument-specific terminology and use language oriented towards the products in the CAA; instrument specifics are only confusing or meaningless to a general user.
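To make the four-point gradient point above concrete, the sketch below shows the usual linear estimate of a gradient from four simultaneous measurements. The function name, array layout and the assumption of linear variation across the tetrahedron are illustrative; this is not the CAA's operational gradient/curlometer code.

```python
import numpy as np

def four_point_gradient(positions, values):
    """Estimate the spatial gradient of a scalar quantity measured at the
    four Cluster spacecraft, assuming it varies linearly across the
    tetrahedron.  positions: (4, 3) spacecraft positions, values: (4,)
    measured quantity.  A minimal sketch only."""
    dr = positions[1:] - positions[0]   # (3, 3) separations from spacecraft 1
    dv = values[1:] - values[0]         # (3,) measured differences
    return np.linalg.solve(dr, dv)      # solve dr @ grad = dv for the gradient
```

The linear solve requires a non-degenerate tetrahedron (three linearly independent separation vectors), which is one reason tetrahedron geometry matters for gradient quality.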
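As a rough illustration of the photoelectron question raised above, the following sketch simply removes energy bins below the EFW spacecraft potential. The function name, the safety margin and the array layout are assumptions for illustration, not the PEACE team's actual correction.

```python
import numpy as np

def strip_photoelectrons(energies_eV, counts, sc_potential_V, margin_eV=1.0):
    """Drop energy bins below the spacecraft potential (plus a small margin),
    where spacecraft photoelectrons dominate.  energies_eV: (n_e,) bin
    centres, counts: (..., n_e) count array, sc_potential_V: EFW spacecraft
    potential in volts.  Illustrative only."""
    keep = energies_eV > (sc_potential_V + margin_eV)
    return energies_eV[keep], counts[..., keep]
```

The reliability question in the recommendation is exactly about how well the EFW potential marks that cutoff in practice.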

3 Examples of recommendations, 1

CIS:
- A quality assessment is needed of which instrument (HIA / CODIF) and which sub-instrument (low / high sensitivity side) is best for different plasma regions. The information in UG part 6.1 should be explained in more detail.
- CIS modes: could they be provided in the data files? Is it possible to add a flag for mode mismatch (wrong mode in a given region)? (A hypothetical sketch of such a flag follows this slide.)

DWP:
- Many users will be unfamiliar with the COR_FX and COR_ST products; the UG could be significantly enhanced with some examples of the use of these data products.

EDI:
- We recommend including a more detailed explanation of the various status/quality flags:
  - It is not straightforward to figure out what the status flags of MPD files actually mean. The values for the first flag are given in the UG, while the remaining six are explained in neither the UG nor the ICD.
  - The "winner" and "loser" quality flags need more explanation in the text, since this information is necessary for users to select between datasets.
  - What exactly do the good/caution/bad quality flags mean for the E-field, e.g. in terms of quality for publication?
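A hypothetical sketch of the mode-mismatch flag suggested above. The region names, mode identifiers and flag values are placeholders and would have to be defined by the CIS team from the actual telemetered modes.

```python
# Placeholder region-to-mode table; real CIS mode identifiers and the regions
# they are intended for would come from the instrument team.
EXPECTED_MODE = {
    "solar_wind": "SOLAR_WIND",
    "magnetosheath": "MAGNETOSPHERIC",
    "magnetosphere": "MAGNETOSPHERIC",
}

def mode_mismatch_flag(region, telemetered_mode):
    """Return 1 when the telemetered mode does not match the mode expected
    for the region the spacecraft is in, else 0."""
    return 0 if EXPECTED_MODE.get(region) == telemetered_mode else 1
```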

4 Examples of recommendations, 2

EFW:
- We recommend that the EFW team aim to improve the determination of the ISR2 offset and amplitude correction factor for the electric field. Statistically it is fine, but plenty of smaller-scale variations due to changes of environment can take place, and it is not clear whether any effort goes into removing faster, localized offsets.

FGM:
- The user guide (UG) should include a figure showing how to identify data intervals with range changes, interference and spin modulation (a frequency-domain plot may be required for the latter).
- We suggest adding quality flags to each data record, as done by many of the other instrument teams. This flag should take into account any factor that influences the quality of the data, e.g. range changes, offsets, spin modulation, etc. The user guide should contain a description of this quality flag. (A hypothetical encoding is sketched after this slide.)

PEACE:
- PEACE requires magnetic field data at energy-sweep-step timing, which requires additional processing of the lower-time-resolution magnetic field data. How is this done, and are there associated caveats? (A minimal interpolation sketch follows this slide.)
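A hypothetical encoding of the per-record FGM quality flag recommended above. Which factors are included and the bit layout are illustrative, not an agreed CAA convention.

```python
def fgm_quality_flag(range_change, interference, spin_modulation, offset_uncertain):
    """Hypothetical per-record quality flag: 0 means no known issue; each
    set bit marks one factor degrading the record.  Illustrative only."""
    return (int(range_change)
            | int(interference) << 1
            | int(spin_modulation) << 2
            | int(offset_uncertain) << 3)
```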
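A minimal sketch of the kind of processing implied by the PEACE item above, assuming simple linear interpolation of FGM vectors onto the sweep-step times. Whether PEACE actually processes the field this way, and with which caveats, is exactly what the recommendation asks the team to document.

```python
import numpy as np

def b_at_sweep_steps(t_steps, t_fgm, b_fgm):
    """Interpolate lower-cadence FGM field vectors onto PEACE energy-sweep
    step times, component by component.  t_steps: (n,) sweep-step times,
    t_fgm: (m,) FGM times, b_fgm: (m, 3) field vectors.  Linear
    interpolation sketch only; data gaps and range changes are ignored."""
    return np.column_stack(
        [np.interp(t_steps, t_fgm, b_fgm[:, i]) for i in range(3)]
    )
```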

5 Examples of recommendations, 3

RAPID:
- After the instrument description (section 2), immediately describe the key science measurements and data sets (section 5). After that, add a section covering the general usability of the instrument:
  - Regions where the instrument operates well
  - Regions where there are issues (e.g. the radiation belts)
  - The effect of the "donut" IIMS efficiency on the usability of the instrument in different regions
  - The effect of the very low IIMS count rates, due to low efficiencies
  - Effects of solar particle events on the instrument
- Dividing the 3D distributions into three partial distributions makes it more difficult for the novice user to get started. We recommend creating a full 3D distribution at 3-spin resolution that can be used more easily (a sketch of the idea follows this slide). It is good that the partial distributions are there for cases where time aliasing is a problem, but we recommend that the combined distribution be put first as the default 3D product and described first in the document, with the partial distributions recommended only for the more advanced user.
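A minimal sketch of the combined 3-spin product suggested above, assuming each partial distribution is delivered on the full bin grid with unmeasured bins set to NaN. How the bins are actually divided between the three spins is an assumption here, not taken from the RAPID documentation.

```python
import numpy as np

def combine_partial_distributions(partials):
    """Merge three single-spin partial distributions into one full 3D
    distribution at 3-spin resolution.  Assumes each partial is an array of
    identical shape with NaN in the bins that spin did not cover."""
    stacked = np.stack(partials)          # (3, n_energy, n_polar, n_azimuth)
    return np.nanmean(stacked, axis=0)    # NaN only where no spin covered a bin
```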

6 Examples of recommendations, 4

STAFF:
- Improve the usability of STAFF-SA.
- There are no documented quality flags in STAFF-SA (WHISPER on, interference, ...).
- The 5x5 spectral matrix is too crude a piece of information to provide a publication-level data set. PRASSADCO is well-known software for producing these key parameters; for instance, it could be run at the CAA to provide the key parameters on demand.

WHISPER:
- Quality of data, caveats:
  - The two flags ("quality" and "uncertainty") are often not indicative of how reliable the density estimate is. The team should estimate the quality of the density estimates based on the actual uncertainty of the estimate (width of the peak/cut-off, clarity of the spectrogram, etc.).
  - Over 80% of density data points are based on the EFW potential. For these data points, uncertainty is set to 0 and quality to -1. This has to be clearly stated in the UG so that users can exploit the quality flags properly; users might otherwise be misled by these values and consider all the data invalid. (A short sketch of how to separate these points follows this slide.)
  - Rename the "Quality" of the density estimates to "contrast". The name is misleading because it is not a proper quality flag; the density data is always reliable.
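A short sketch of how a user could apply the flag convention described above to separate EFW-potential-based density points from peak/cut-off-based ones. The flag values (uncertainty == 0, quality == -1) are taken from the slide; the array names are illustrative.

```python
import numpy as np

def split_whisper_density(density, quality, uncertainty):
    """Separate WHISPER density points derived from the EFW potential from
    those derived from the plasma-frequency peak/cut-off, using the flag
    convention described above.  Illustrative helper, not a CAA tool."""
    efw_based = (uncertainty == 0) & (quality == -1)
    return density[efw_based], density[~efw_based]
```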

