
1 01 January 2016

2 Introduction
This presentation covers the following topics:
- DRRS-N Course Background
- Assessments
- Reporting Rules
- DRRS-N Accounts
- CNAF Readiness Brief
- Training
- Keys to Success

3 Background – How We Manage Readiness
[Diagram: readiness management flow]
- Joint Staff: GFM, OPLAN/CONPLAN
- OPNAV N43: overall sponsor for Navy readiness ($FHP, $Ranges, $Ordnance)
- FFC/CPF (OPNAV/Fleet): allocates resources across TYCOMs
- CNAP/CNAL (TYCOM): sets R+ months, allocates FHP and TAD, manages aircraft maintenance; owns the MAP
- Warfighters: CSG/CVW/SQDN/DET requirements; FRTP training events, FRTP profile resources, T&R matrices, C entitlements
Talking point: Readiness reporting is the result of balancing requirements (GFM, MAP, ROC/POE, FRTP, T&R matrices) against resources (R+ month allocation via Current Readiness standards).
MAP: Master Aviation Plan
- Commander, Naval Air Forces schedule for all squadrons
- Produced to meet GFM requirements while supporting force recapitalization (Type/Model/Series [T/M/S] transition plans)
- Future: moving to a common, integrated NAE process to:
  - Meet operational demand and recapitalization requirements
  - Provide schedule transparency across the Enterprise
  - Institutionalize resource responsiveness
  - Evaluate collective fiscal requirements

4 Background – CVW Funding Profile
Example of a 27-month CVW funding profile. These profiles are notional; the MAP is the planning tool. The O-FRP profile (36-month model) is still under construction. There may be another deployment during R+25 to R+27, during which funding returns to 80%. During the post-deployment sustainment phase there is a requirement for a SUSTEX every 120 days, which requires an increase in funding in the form of Afloat MESH.

5 Background – CVW Funding Profile (OFRP)
OFRP = Optimized Fleet Response Plan. This is a notional profile; the 27-month funding profile is currently in use. There may be another deployment during R+25 to R+36, during which funding returns to 80%. During the post-deployment sustainment phase there is a requirement for a SUSTEX every 120 days, which requires an increase in funding in the form of Afloat MESH.

6 Background – Current Readiness Standards
Standards are published in CNAFINST 3510 (series); instructions are on the CNAF SharePoint site.
[Graphic: Fleet Response Plan profile mapped to the Current Readiness standard]

7 Background – Current Readiness Standards
Training hours and simulator fidelity standards come from the Capabilities-Based Training and Readiness Matrices (CBTRM). Driven by the R+ month, the Current Readiness (CR) tables set the demand signal for:
- Aircraft on the flight line
- Parts support
- Mission systems
- Aircrew
- Manning Fit/Fill
SHARP data (via the Current Readiness Assessment (CRA)) is married in ADW with the R+ month profile as set by the CR tables and pushed to DRRS-N.
[Graphic: daily AMSRR data and SHARP feed the Current Readiness standard]

8 Background – Mission Essential Tasks
- METs are tasks considered essential to accomplishing and supporting missions and requirements assigned by a Joint or Naval Commander
- The Mission Essential Task List (METL) is based on the unit's ROC/POE and maps tasks to capability areas
- Capability Areas = Warfare Mission Areas from the ROC/POE; squadrons report only on Primary Mission Areas from their ROC/POE
- Each MET contains a set of conditions and standards
- The METL is common to all units within a Responsible Organization (RespOrg)
Notes: Assessment of Mission Essential Tasks (METs) is the basis for the DRRS and DRRS-N readiness systems. Each unit type has a METL based on Warfare Mission Areas in the approved ROC/POE. METs come from the Universal Naval Task List (UNTL), a document common to the Navy, Marine Corps, and Coast Guard. For squadrons and detachments, the Capability Areas in the METL reflect the Primary Mission Areas in the ROC/POE. The MET (also referred to as an NTA, a Navy Tactical-level Task) is a task with conditions and standards; the NTA itself is unclassified, but when conditions and standards are applied the MET becomes classified. The METL is specific to a T/M/S and its required mission sets; in some cases a T/M/S may have more than one METL. The METL group is called a RespOrg (Responsible Organization). METLs are developed by the Type Wing and approved via the COC by US Fleet Forces Command.
Examples:
1. The VFA METL is specific to the F/A-18C; VFA UDP and VFA E/F are separate METLs.
2. The H-60S community also has several METLs due to its various mission sets (HSC Expeditionary, HSC CVW, HSC NSW, and HSC FRS).

9 Background – DRRS-N
- DRRS-N is the tool Navy organizations use to report MET readiness data to DRRS-S
- DRRS-N has two software applications: DRRS-N Ashore (SIPRNET) and DRRS-N Afloat (shipboard)
- Navy policy guidance: NTRP, OPNAV, and CNAP/CNAL instructions
Notes: As discussed on the previous slide, DRRS-N is the tool Navy units use to report readiness to DRRS. Its two software applications capture unit- and group-level readiness data: (1) DRRS-N Ashore, used by units when ashore, and (2) DRRS-N Afloat, used by units embarked on ships. These are discussed in more depth over the next few slides. The Navy policy documents on DRRS-N are: (1) the NTRP, which provides assessment rules and describes the metrics behind the PESTO automated data; (2) the OPNAV instruction, which provides basic guidance on how the Navy uses DRRS-N; and (3) the CNAP/CNAL instruction, which is the Air Forces' policy on DRRS-N reporting for all CNAF units. Note: all references are in the CNAF Readiness Reference Tool.

10 Background – DRRS-N Sample

11 Background – DRRS-N Assessments
DRRS-N contains two assessments:
1. Computed Assessment (automated)
   - PESTO pillars (Personnel, Equipment, Supply, Training, Ordnance)
   - Pre-calculated data from authoritative data sources
   - Squadrons see only the Personnel, Equipment, and Training pillars; detachments see only the Equipment and Training pillars
2. Commander's Assessment (subjective)
   - The Commander shall assess the Core, the Capability Areas, all METs, and the legacy SORTSREPNV C-rates (OARS)
   - PESTO pillar data is provided to assist the Commander in making an assessment
   - The Commander's Core, Capability Area, and MET assessments are the only data that flow beyond Navy lifelines up to DRRS-S
Notes: Just cover what is shown on the slide.

12 Background – DRRS-N Data Sources
DRRS-N compiles unit resource values, called Figures of Merit (FOMs), from authoritative data sources. These values and associated data are available to the Commander as a guide for making unit assessments.
Notes: The graphic on this page illustrates the data sources that feed squadron FOM data to the DRRS-N Personnel, Equipment, and Training pillars. The originating data sources are unclassified sources that squadrons have used for years; no new data is required to feed the DRRS-N automated pillars. The squadron is not responsible for moving data to the SIPRNET; that is done by support personnel in the background.
The PESTO data sources all have four things in common:
1. An authoritative data source
2. An approved metric to calculate the FOM score
3. An approved mapping that ties resources to an NTA
4. A data flow process to get the FOM data to DRRS-N
Acronyms defined:
- TFMMS – Total Force Manpower Management System
- FLTMPS – Fleet Training Management and Planning System
- AMSRR – Aircraft Material Supply Readiness Report
- SHARP – Sierra Hotel Aviation Readiness Program
- PFOM – Personnel Figure of Merit
- ADW – Aviation Data Warehouse
- AMFOM – Aviation Figure of Merit
- NTIMS – Navy Training Information Management System
DRRS-N provides Ashore and Afloat applications so units can report readiness from home or while embarked on Navy ships. The DRRS-N server is hosted by DISA (the Defense Information Systems Agency); when you use DRRS-N Ashore, you access that server directly. DRRS-N Ashore also includes Business Intelligence tools that are not available in DRRS-N Afloat. DRRS-N Afloat resides on ARRS servers on each Navy ship; when using DRRS-N Afloat, you access a server on your own ship. Every 4 minutes (optimally), the Afloat server communicates with the Ashore server. Your draft assessment stays on DRRS-N Afloat until you hit the submit button, at which time it is sent to the Ashore server.
Also note that DRRS-N Ashore and Afloat require separate accounts. DRRS-N Ashore accounts are requested online from the DRRS-N homepage; DRRS-N Afloat accounts are requested from the CVN OPS Admin Officer, or from the OPS Officer on all other ships.
Note: The parent commands of detachments display the P pillar for the entire squadron; individual detachments do not have their own P pillar.

13 Background – PESTO Metrics
PESTO pillar data depends on inputs, data sources, update frequencies, and metrics. The table below summarizes them for the Personnel (P), Equipment (E), and Training (T) pillars.
- Personnel — Inputs: NEC FIT, Rating FIT, Officer FIT, COB, BA. Data sources: TFMMS, FLTMPS. Frequency: bi-monthly. Metric: PFOM = (Rs − Gs) / Rs. Metric owner: CNAP N1 / Type Wings.
- Equipment — Inputs: aircraft assigned, aircraft RBA, systems assigned, systems RBA. Data sources: AMSRR, ADW. Frequency: daily. Metric: based on CNAF point and map tables. Metric owner: CNAL N42.
- Training — Inputs: skilled crews (Pf), squadron requirements (Ef). Data sources: SHARP, NTIMS. Frequency: every 15 days, or as updated by the squadron (CRA). Metric: TFOM = Pf × Ef. Metric owner: CNAP N40.
Notes: This slide provides an overview of the metrics and data sources that feed the FOM data. These metrics are approved at the TYCOM level and support the overall Navy metrics formulas required to feed DRRS-N. The slide is mostly for reference, but do highlight the Frequency column: two of the data sources update bi-monthly, while the E pillar updates daily.
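The two published formulas above can be sketched in code. This is an illustrative reading only, not the official TYCOM implementation; the function and variable names are assumptions chosen for clarity.

```python
# Sketch of the two FOM formulas from the PESTO metrics table.
# PFOM = (Rs - Gs) / Rs and TFOM = Pf x Ef; everything else here is
# an illustrative assumption, not official DRRS-N code.

def pfom(required: int, gaps: int) -> float:
    """Personnel Figure of Merit, where Rs = billets required
    and Gs = unfilled gaps against that requirement."""
    if required <= 0:
        raise ValueError("Rs must be positive")
    return (required - gaps) / required

def tfom(pf: float, ef: float) -> float:
    """Training Figure of Merit, where Pf = skilled-crew factor
    and Ef = squadron event/requirement factor."""
    return pf * ef

print(pfom(100, 15))          # 0.85
print(round(tfom(0.9, 0.8), 2))  # 0.72
```

A squadron 15 billets short of a 100-billet requirement would score a 0.85 P pillar under this reading, and a unit at 90% crew skill against 80% event completion a 0.72 T pillar.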

14 Assessments – Types
The Commander assesses the following with a Y, Q, or N:
- Each MET
- Each Capability Area
- The Core
In addition, the Commander shall update the OARS (Organization and Resource Status) data, which is discussed later.
Notes: Four assessments must be made before a unit submits its DRRS-N data. (1) Each MET must be assessed; the Commander's assessments do not need to match the computed score. (2) The preponderance of MET assessments drives the Capability Area assessment. (3) The preponderance of Capability Area assessments drives the Core (overall) assessment; concentrate your comments in the Core assessment block. (4) The OARS data must also be updated every time the unit submits; it is discussed in greater detail later. Note: a gray box anywhere in the Commander Assessment column means that the MET or Capability Area has never been assessed; simply select the gray box and make an assessment.
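The preponderance roll-up described in the notes (METs drive the Capability Area, Capability Areas drive the Core) can be sketched as follows. This is one plausible reading of the briefing's wording, not the actual DRRS-N software logic, and the Commander may override any computed value; the MET names and values are hypothetical.

```python
from collections import Counter

# Illustrative "preponderance" roll-up: the most frequent Y/Q/N value
# among the children suggests the parent assessment. Not official
# DRRS-N logic; the Commander gets the final vote.

def preponderance(assessments):
    """Return the most frequent value among 'Y', 'Q', 'N' assessments."""
    if not assessments:
        raise ValueError("no assessments to roll up")
    return Counter(assessments).most_common(1)[0][0]

# Hypothetical MET assessments within one Capability Area:
met_assessments = {"NTA-1": "Y", "NTA-2": "Y", "NTA-3": "Q"}
capability_area = preponderance(list(met_assessments.values()))
print(capability_area)  # Y
```

With two Yes and one Qualified Yes, the suggested Capability Area value is Y; the same function would then roll Capability Areas up to the Core.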

15 Commander’s Assessments
Assessments – Values The following values are used to assess the Core assessment, Capability Areas, and METs: Commander’s Assessments The commander’s assessment has three possible values. The following slides will cover each of these. When a Commander assesses a MET, he is technically basing his assessment on the conditions and standards attached to the MET. Until we have an automated method to capture quantitative data on the conditions and standards, CNAF will use the completion of T&R events as the measure of performance of a MET.

16 Assessments – Values The following values are used to assess the Core assessment, Capability Areas, and METs: Y – Yes (Green) The unit can successfully perform an assigned Capability or MET to the prescribed standards and conditions. A “Yes” assessment should reflect demonstrated performance in training and operations.

17 Assessments – Values
The following values are used to assess the Core, the Capability Areas, and the METs:
Q – Qualified Yes (Yellow): Used when data may not readily support a "Yes" assessment. The unit is expected to accomplish an assigned Capability or MET to the prescribed standards under most conditions; this performance, however, has not been observed or demonstrated in operations or training. Units with this rating can still be employed for those tasks.
Notes: A Qualified Yes is still a yes assessment. When selecting Qualified Yes, the Commander shall provide comments that articulate the portions of the mission that the unit is not capable of performing and any associated risks the unit could face if tasked with that mission or Capability Area. A common example is a unit that is proficient in a given mission during the day but has not completed all night requirements; the comment should address the projected time to attain night proficiency and any resources required. Yellow is not bad when it is explained properly in comments. Remember that DRRS-N was designed to articulate which tasks and capabilities you can and cannot perform, and why.

18 Assessments – Values
The following values are used to assess the Core, the Capability Areas, and the METs:
N – No (Red): The unit cannot perform an assigned Capability or MET to the prescribed standards and conditions at this time. Supporting explanations are mandatory for any "N" assessment.
Notes: A No assessment means that the unit cannot perform the MET under any of its conditions or standards.

19 Assessments – TYCOM Guidance
- The Core shall not be assessed as green if either MOB or CCC is assessed less than green
- The Commander's assessment shall not be green for any Capability Area if the unit fails to complete the full training requirements for any phase of the FRTP
- Always assess to the MCO standard and keep comments pointed and operationally focused
- "MCO ready" is defined as a unit's ability to deliver its full design capability as delineated in the ROC/POE; it should not be measured against the requirements of a specific OPLAN
- Unit Commanding Officers should not report "Y/Green" across the board simply because their unit is green for its current FRTP phase (PCTEF in OARS addresses readiness for current tasking)
Source: CNAP/CNALINST DRRS-N Supplemental Guidance

20 Assessments – Commander's Remarks
The Core assessment block contains specific sections for your comments:
- Overall section: describe the unit's current readiness in relation to the wartime mission and the ability to conduct current tasking
- PESTO sections: describe specific readiness issues and operational impacts
Comments should focus on current and future capability and issues. Projections of readiness changes should be based on events, not just the next DRRS-N assessment.
Minimum comments should include:
- Current FRTP (R+) month/phase
- Next major FRTP milestone
- Expected deployment date
- Commander's top readiness concerns
- POC info

21 Assessments – Commander's Remarks
Personnel remarks:
- Squadrons comment on IAs, TADs, and other issues that cause personnel shortages
- Detachments comment on losses/swap-outs and the squadron's plan to fill gaps
Equipment remarks:
- Squadrons comment on long-term aircraft maintenance issues and equipment shortages
- Detachments comment on shorter-term NMC/PMC issues
- Avoid focusing on perishable, extremely short-term issues such as today's FCF flight; your get-well date drives your DRRS-N update
Training remarks:
- Use the SHARP skills report to highlight the skills that are keeping your TFOM scores low
- Pf data does not appear in the T-pillar drill-down, so you must have good comments
General remarks/tips:
- Use plain language; this is not a message, so use sentence case
- DRRS-N is about MCO capability, not what your unit is currently tasked with
- Do not use SORTS terminology in DRRS-N comments
- Don't just write "Readiness as per FRTP phase"
- There is no need to delay an assessment due to PESTO pillars that are not populated or that contain erroneous data

22 Assessments – Commander's Remarks
Good example:
Overall: Currently resourced at R+10. Next FRTP event: C2X in July. Expected deployment date: 15 Oct. Major readiness concern is the allocation of (specific ordnance), which will cap T-pillar readiness at 80 percent for multiple NTAs. NTAs affected are (1), (2), and (3), which affect the STW and SUW capability areas. Expect capability areas (except STW and SUW) to change to GREEN after the completion of C2X at the end of July. However, we expect to gain two CAT 1 aircrew and lose three experienced aircrew right before deployment. Time to train these aircrew and build ACTC quals will continue to be our limiting factor and primary area of focus. POC: LT That Guy, (123)
Bad example:
Overall: Currently resourced at R+10. Squadron is YELLOW overall. Submitted due to change of command 12 June. Squadron shows REDs and YELLOWs in most NTAs; however, I will continue to assess the squadron as YELLOW overall. Primary focus is safety, maintaining readiness and assets, and building aircrew qualifications. POC: CDR This Guy

23 Assessments – Commander's Remarks
The SHARP "Crew METs w/ Individual Skills" report is a useful tool for writing Training comments. It helps the squadron determine the status of skilled crews (Pf) for any given NTA.
Notes: One of the best sources of training comments is the SHARP Crew METs w/ Individual Skills report. Since the DRRS-N T pillar contains no detailed data about why a unit shows a given Pf, the unit can use this report to highlight the skill sets from the T&R that are lacking. The report clearly shows which skills are red and, when compared against the T&R matrix, can show where and why you are short skilled crews. It is found in SHARP under the "Reports" tab, in the sub-menus "CBR Analysis Tools" and "Crew METs w/ Individual Stats".

24 Assessments – OARS Data Overview
- The OARS data allows the unit to report the SORTSREPNV data sets required by the Joint Staff through the DRRS-N application
- The OARS data is the minimum data required to generate a SORTSREPNV report
- When a unit submits a DRRS-N assessment, the OARS data is updated on the DRRS-N server, which in turn generates a SORTSREPNV report for processing up line to the Joint Staff
- The DRRS-N-generated SORTSREPNV message is not transmitted via the Naval Message System and is visible only via the Navy Readiness Reporting Enterprise Business Intelligence Tool in DRRS-N
- This data is still used by higher-echelon staffs, so units must make every effort to ensure that the OARS data accurately reflects the unit's current readiness
Note: Guidance for completing the OARS page is contained in Section 5 of the CNAP/CNAL instruction.

25 Assessments – OARS Data Calculations
Everyone with a DRRS-N account can view your OARS data. To update the data on the OARS page, select the Edit buttons or click on the C-Rate boxes.
Notes: The DRRS-N software update in April 2014 changed the approved assessment view to include the OARS page data. Note: since the CO name is no longer a data field, the unit does not have to update OARS at a change of command.

26 Assessments – CARR (CVW units only)
- The Carrier Air Wing Readiness Report (CARR) allows the CAG to report readiness issues for the air wing; it is reported in DRRS-N by the CAG only
- The CARR is a roll-up of all CVW squadrons and VRC detachments by Capability Area
- The CAG can see all squadrons and detachments under their OPCON in the roll-up view, and assesses the CVW Capability Areas based on the squadron and detachment DRRS-N assessments
Note: Because the CARR is a roll-up of other units, it does not have an OARS page.

27 Reporting Rules – Frequency
Submit assessments in DRRS-N as follows:
- Within 24 hours of a significant change in readiness
- Within 30 days of the last assessment
- Following completion of major FRTP milestones, including TSTA, ARP, Air Wing Fallon, and C2X
- When otherwise directed
Notes: An updated assessment must be submitted within 30 days of the last assessment even if there is no change in readiness. A significant change in readiness is defined as a change in any Capability Area assessment (Yes, Qualified Yes, or No) from the previously reported Commander's Assessment, as determined by the unit commander. (1) Your ISIC may also direct a DRRS-N assessment based on COCOM or Fleet Commander reporting requirements. (2) Submission is not required for changes of command.
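The two timing rules above (24 hours after a significant change, 30 days after the last assessment, whichever comes first) can be sketched as a small due-date calculation. This is an illustrative sketch only; the function name and dates are assumptions, and milestone- or ISIC-directed submissions are not modeled.

```python
from datetime import datetime, timedelta
from typing import Optional

# Sketch of the two timing rules on this slide: an assessment is due
# no later than 30 days after the last one, and no later than 24 hours
# after a significant change in readiness. Illustrative only.

def next_submission_due(last_assessment: datetime,
                        significant_change: Optional[datetime] = None) -> datetime:
    """Return the latest date/time by which a new DRRS-N assessment is due."""
    due = last_assessment + timedelta(days=30)
    if significant_change is not None:
        due = min(due, significant_change + timedelta(hours=24))
    return due

last = datetime(2016, 1, 1)
print(next_submission_due(last))                         # 2016-01-31 00:00:00
print(next_submission_due(last, datetime(2016, 1, 10)))  # 2016-01-11 00:00:00
```

A unit that last reported on 1 January owes a routine update by 31 January, but a significant change on 10 January pulls the deadline in to 11 January.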

28 Reporting Rules – Readiness Expectations
- The Fleet Response Training Plan (FRTP) standards are designed to incrementally prepare a unit for its full wartime readiness
- During an FRTP cycle, readiness expectations increase progressively from "No" (Red) in the maintenance phase to "Yes" (Green) at the beginning of a unit's employability window
- Units should strive to maintain a "Yes" (Green) throughout their employability window unless their mission forces a degrade in certain Capability Areas (single-mission focus, non-wartime tasking)
- Not all units follow this FRTP model (HM, VQ(T), VPU, and FDNF)
Notes: Readiness expectations are based on your T/M/S FRTP cycle (maintenance, basic, intermediate, employability). In the notional FRTP cycle, a unit is expected to be: Red in the maintenance phase (due to lack of resources: aircraft, flight hours, ranges, ordnance, etc.); Red/Yellow in the basic and intermediate phases (red increasing to yellow as the unit completes major FRTP milestones such as ARP and TSTA); and Green in sustainment for units that have completed C2X and Fallon, or phase transition (non-CVW units). Mobility (MOB) is designed to be green sooner than sustainment, since METs in that Capability Area have fewer requirements. In DRRS-N, you assess your ability to perform Capabilities and METs against MCO standards; your assessments show the chain of command whether you are meeting those expectations. Note: Currently, MOB is the only Capability Area intended to be green prior to the employability phase.

29 Reporting Rules – Readiness Expectations (HSC (EXP) only)
- HSC (EXP) detachments train to one of four tailored Capability-Based Training and Readiness Matrices based on their deployment requirements
- Each matrix requires the detachment to report readiness on specific Capability Areas
- Capability Areas in DRRS-N that are not listed below should be assessed as RED, with a comment stating that the detachment does not train to that Capability Area in its current assignment
Required Capability Areas by detachment T&R matrix:
- Non-Armed Helo (crew requirement: SOF): MOB, LOG, FSO, MOS, NCO, CCC, NSW
- Armed Helo (crew requirements: SOF/ASU): MOB, LOG, FSO, MOS, CCC, NSW, STW, ASU
- CLF/HADR: MOB, LOG, FSO, MOS, NCO, CCC
- Naples HSC only

30 DRRS-N Accounts
DRRS-N Ashore:
- Requested via the SIPRNET
- Accounts are managed by USFFC
- At a minimum, the CO, XO, and OPS should have Commander permissions
- Help Desk: (DSN 836)
DRRS-N Afloat:
- Primarily where you will make DRRS-N assessments when embarked
- The OPS department on the ship (OPS ADMIN on the CVN) creates accounts for all embarked units
- Help Desk: (DSN 836)
Tips:
- DRRS-N accounts are tied to the individual; no group-access accounts are permitted
- Remember to sync all data when you first use your DRRS-N Afloat account
- Sync PESTO data only if the data has not refreshed
- Land-based VRC detachments use DRRS-N Ashore to report

31 DRRS-N Accounts – Permissions
Account permission levels for DRRS-N Ashore and Afloat have the same functionality but different names. CNAF recommends that the CO/OIC, XO, and OPS-O all have the Commander/Releaser permission level; the CO has the discretion to assign any permission level to personnel in their command.
DRRS-N Ashore URL (SIPRNET):
Permission levels (Ashore / Afloat):
- View / Browser: Can only view approved assessments. Cannot make changes to assessments. Cannot see the OARS data.
- Unit User / Drafter: Can view and change the assessments of capabilities and NMETs and save assessment changes, but cannot submit them.
- Unit Commander / Releaser: Can view and change the assessments of capabilities and NMETs, save assessment changes, and submit them. Can also select the reporting method.
- N/A / Releaser and ADMIN
Notes: CNAL/CNAP recommends that the CO, XO, and OPS all have a DRRS-N account with Commander permissions. The CO ultimately decides who can submit DRRS-N for the unit.

32 CNAF Readiness Brief (CVN/CVW)
Both CNAP and CNAL present a bi-monthly readiness brief to CNAL N00 using DRRS-N readiness data. The CVN/CVW pages are organized by FRTP phase. All Air Wings are displayed on the brief. The CVW CARR assessment is displayed on the far left. Unit DRRS-N comments are displayed below.

33 CNAF Readiness Brief (non-CVW)
The non-CVW pages are organized by community, then by FRTP phase. Detachments in the maintenance phase are not displayed on the brief. Detachments assigned to a CSG or ESG are displayed on the same page as their deployment group. Parent commands are displayed along with their detachments. Detachments do not have a P pillar.

34 Training Plan – Methods
Three avenues for DRRS-N training:
1. On-line and embedded training: the CNAF Readiness Reference Tool and the help functions in DRRS-N
2. Pipeline training: the PXO course and the HSM/HSC OIC course
3. CVW DRRS-N training during the FRTP: waterfront briefs, fleet concentration sites (at least annually), and as requested by units

35 Keys to Success
- The CO/OIC gets 100% of the vote in the Commander's assessment, regardless of the PESTO pillars
- The unit needs to understand its FRTP cycle and its readiness expectations
- Remarks should focus on readiness issues for your squadron/detachment: do not use SORTS terminology, ensure all remark fields are continually updated, and concentrate your remarks in the Core comment block
- Evaluate all METs, Capability Areas, the Core, and OARS on every assessment; OARS data should match the current DRRS-N Capability Area assessments
- Manage the input systems (AMSRR, SHARP, etc.): flight logging in SHARP really matters, and the AMSRR must contain numbers for all aircraft and equipment fields, even if the number is zero
- Take advantage of the TYCOM and Type Wing experts

36 CNAF DRRS-N Pillar Leads
- CDR Nate "Dude" Ogle – DRRS-N Program Manager
- Mr. John Olanowski – P pillar
- CDR Robert Palmore – E pillar
- CDR Timothy "Tonto" Young – T pillar
Contractor support leads:
- CNAL support: Mr. Chris Soler, Mr. John Bryson
- CNAP support: Ms. Mary Erausquin, Mr. Tim Leonard, Mr. Jim McDonald, Mr. Dan Scholtes
- Lead SHARP support rep: Mr. Mark Burgunder (SHARP Aviation Support Programs)
