CAA 2009 Peer Review: Selected Recommendations
Jesus College, Cambridge, UK, 23-24 March 2009

General Recommendations

 Very few, if any, results on instrument cross-calibration were presented.
 Cluster is about measuring parameters at four points and how well gradients can be calculated from them.
 Show graphic examples of caveats (PEACE): to guide the user in understanding many of the instrument caveats, a set of example plots showing the effects of penetrating radiation, photoelectron contamination, etc. on the data products (2D, 3D and moments) would be useful. There is some discussion of how these effects are mitigated (the section on removal of penetrating radiation is TBW), but more detail would be welcome on the methods used for correcting the data. Photoelectrons are removed using the spacecraft potential from EFW – how reliable is that? (A threshold-based removal is sketched below.) Effects of EDI operation on PEACE data – what do they look like in the data?
 The User Guide in particular needs to be tailored more to the general user. It should avoid heavy use of instrument-specific terminology and orient its language towards the products in the CAA – instrument specifics are only confusing or meaningless to a general user.
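
To make the photoelectron caveat concrete, here is a minimal sketch of threshold-based photoelectron masking using the EFW spacecraft potential. It illustrates the general idea only and is not the PEACE team's actual algorithm; the margin factor and array layout are assumptions.

```python
import numpy as np

def mask_photoelectrons(energy_eV, counts, sc_potential_V, margin=1.2):
    """Blank energy bins likely dominated by spacecraft photoelectrons.

    energy_eV      : 1-D array of electron energy-bin centres [eV]
    counts         : 2-D array (time x energy) of measured counts
    sc_potential_V : 1-D array of EFW spacecraft potential [V], same time axis
    margin         : safety factor above e*Vsc (assumed value, not from the UG)
    """
    threshold = margin * sc_potential_V[:, None]   # eV per time step (E = e*Vsc)
    photo = energy_eV[None, :] < threshold         # True where photoelectrons dominate
    return np.where(photo, np.nan, counts), photo
```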

Examples of recommendations, 1

CIS:
 Need for a quality assessment of which instrument (HIA / CODIF) and which sub-instrument (low/high sensitivity side) is the best for different plasma regions. The information in UG part 6.1 should be explained in more detail.
 CIS modes: could they be provided in the data files? Is it possible to add a flag for mode mismatch (wrong mode in a given region)? (A possible mismatch check is sketched below.)

DWP:
 Many users will be unfamiliar with the COR_FX and COR_ST products; the UG could be significantly enhanced with some examples of the use of these data products.

EDI:
 We recommend including a more detailed explanation of the various status/quality flags:
   It is not straightforward to figure out what the status flags of the MPD files actually mean. The values for the first flag are given in the UG, while the remaining six are explained neither in the UG nor in the ICD.
   The winner and loser quality flags need more explanation in the text, since this information is necessary for users to select between datasets.
   What exactly do the good/caution/bad quality flags mean for the E-field, e.g. in terms of suitability for publication?
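
As an illustration of the requested mode-mismatch flag, the sketch below checks the telemetered CIS mode against a region-dependent set of expected modes. The region labels and mode numbers are invented placeholders; the real CIS mode tables would have to be substituted.

```python
# Hypothetical mapping of plasma region -> CIS modes considered appropriate there.
# Region names and mode numbers are illustrative only, not the real CIS tables.
EXPECTED_MODES = {
    "solar_wind":    {8, 9, 10},
    "magnetosheath": {0, 1, 2, 3},
    "magnetosphere": {0, 1, 2},
}

def mode_mismatch_flag(region: str, mode: int) -> int:
    """Return 1 if the telemetered mode is not expected for the region, else 0."""
    return int(mode not in EXPECTED_MODES.get(region, set()))

# Example: a solar-wind interval recorded in a magnetospheric mode is flagged.
flag = mode_mismatch_flag("solar_wind", 2)   # -> 1 (mismatch)
```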

Examples of recommendations, 2

EFW:
 We recommend that the EFW team aim to improve the quality of the ISR2 offset and amplitude correction factor determination for the electric field. Statistically it is fine, but plenty of smaller-scale variations due to changes of environment can take place, and it is not clear whether any effort is put into removing faster, more localized offsets.

FGM:
 The user guide (UG) should include a figure which shows how to identify data intervals with range changes, interference and spin modulation (a frequency-domain plot may be required for the latter).
 We suggest adding a quality flag to each data record, as done by many of the other instrument teams. This flag should take into account any factor that influences the quality of the data, e.g. range changes, offsets, spin modulation. The user guide should contain a description of this quality flag. (A possible bit-mask layout is sketched below.)

PEACE:
 PEACE requires magnetic field data at energy-sweep-step time resolution, which requires additional processing of the lower-time-resolution magnetic field data. How is this done, and are there associated caveats?
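
One possible realisation of the suggested per-record FGM quality flag is a simple bit mask, with one bit per caveat, so that several conditions can be reported at once. The bit assignments below are an assumption for illustration, not an existing CAA convention.

```python
# Assumed bit layout for a per-record quality flag (illustrative only)
RANGE_CHANGE     = 1 << 0   # record lies inside a range-change interval
INTERFERENCE     = 1 << 1   # known spacecraft interference present
SPIN_MODULATION  = 1 << 2   # residual spin tone above tolerance
OFFSET_UNCERTAIN = 1 << 3   # offset determination degraded

def quality_flag(range_change: bool, interference: bool,
                 spin_tone: bool, offset_bad: bool) -> int:
    """Combine boolean caveat indicators into one integer per data record."""
    flag = 0
    if range_change:
        flag |= RANGE_CHANGE
    if interference:
        flag |= INTERFERENCE
    if spin_tone:
        flag |= SPIN_MODULATION
    if offset_bad:
        flag |= OFFSET_UNCERTAIN
    return flag

# Example: a record with a range change and residual spin modulation -> flag = 5
```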

Examples of recommendations, 3

RAPID:
 After the instrument description (section 2), immediately describe the key science measurements and data sets (section 5). After that, add a section that covers the general usability of the instrument:
   Regions where the instrument operates well
   Regions where there are issues (e.g. the radiation belts)
   The effect of the "donut" IIMS efficiency on the usability of the instrument in different regions
   The effect of the very low IIMS count rates, due to low efficiencies
   Effects of solar particle events on the instrument
 Dividing the 3D distributions up into three partial distributions makes it more difficult for the novice user to get started. We recommend creating a full 3D distribution at 3-spin resolution that can be used more easily. It is good that the partial distributions are there for cases when time aliasing is a problem, but we recommend that the combined distribution be put first as the default 3D product. It should be described first in the document, with the partial distributions recommended only for the more advanced user. (A possible merge of the partials is sketched below.)
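
The sketch below shows one way the three complementary partial distributions could be merged into a single 3-spin product, assuming each partial carries valid values only in its own angular subset and the fill value elsewhere. The array layout and fill value are assumptions; the real CAA file structure may differ.

```python
import numpy as np

def combine_partials(p1, p2, p3, fill=-1.0e31):
    """Merge three complementary partial 3-D distributions (one per spin)
    into a single 3-spin distribution, averaging wherever they overlap."""
    stack = np.stack([p1, p2, p3]).astype(float)
    stack[stack == fill] = np.nan                 # mark unfilled angular bins
    valid = ~np.isnan(stack)
    n = valid.sum(axis=0)                         # how many partials cover each bin
    total = np.where(valid, stack, 0.0).sum(axis=0)
    return np.where(n > 0, total / np.maximum(n, 1), fill)
```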

Examples of recommendations, 4

STAFF:
 Improve the usability of STAFF-SA.
   There are no documented quality flags in STAFF-SA (WHISPER on, interference, ...).
   The 5x5 spectral matrix on its own is too crude to serve as a publication-level data set; derived wave parameters are needed. Prassadco is a known software package for producing these key parameters; for instance, it could be run at the CAA to provide them on demand.

WHISPER:
 Quality of data, caveats:
   The two flags ("quality" and "uncertainty") are often not indicative of how reliable the density estimate is. The team should estimate the quality of the density estimates based on the actual uncertainty of the estimate (width of the peak/cut-off, clarity of the spectrogram, etc.). (An illustrative metric is sketched below.)
   Over 80% of the density data points are based on the EFW potential. For these data points, the uncertainty is set to 0 and the quality to -1. This feature of the data has to be clearly stated in the UG so that users can exploit the quality flags properly (users might be misled by these values and consider all the data invalid).
   Rename the "Quality" of the density estimates to "contrast". The name is misleading because it is not a proper quality flag; the density data are always reliable.
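
As an illustration of a quality measure tied to the actual uncertainty of the identification, the toy metric below scores a plasma-frequency peak by its prominence and width in a WHISPER spectrum: a sharp, prominent resonance scores near 1, a broad or weak one near 0. The threshold and the scoring formula are invented for illustration and are not the WHISPER team's algorithm.

```python
import numpy as np
from scipy.signal import find_peaks, peak_widths

def density_quality(freq_kHz, power_dB, min_prominence_dB=3.0):
    """Toy quality score in [0, 1] for a plasma-frequency identification."""
    peaks, props = find_peaks(power_dB, prominence=min_prominence_dB)
    if peaks.size == 0:
        return 0.0                                    # no credible resonance found
    best = int(np.argmax(props["prominences"]))
    prom = props["prominences"][best]                 # peak height above surroundings [dB]
    width_bins = peak_widths(power_dB, peaks[[best]], rel_height=0.5)[0][0]
    width_kHz = width_bins * np.median(np.diff(freq_kHz))
    # Sharp and prominent -> close to 1; broad or weak -> close to 0
    # (an arbitrary combination, for illustration only).
    return float(prom / (prom + width_kHz))
```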