Air Force Satellite Control Network Interoperability


1 Air Force Satellite Control Network Interoperability
Space Internet Workshop #4, June 2004
John Pietras, Global Science and Technology, Inc.

2 Contents
- Background of the AFSCN Interoperability Project
- CCSDS Space Link Extension: candidate for AFSCN interoperability
- Summary of Interoperability Project Phase 3 (completed in Summer 2003)
- Transition from legacy- to standards-based infrastructure
- Overview of Phase 4 activities (Spring-Summer 2004)
- Interoperability with other US space organizations

3 AFSCN Interoperability Project Background
- Project is managed by the AF Space and Missile Systems Command Satellite and Launch Control SPO as part of the AF Satellite Control Network (AFSCN) modernization program
- Space vehicles to be supported will use legacy AFSCN radio frequency (RF) and modulation standards for another decade
- Goals
  - Adopt/define services that will facilitate interoperation among US government satellite ground control networks
  - Base services on packet-switched network technology to continue the AFSCN's migration from circuit-switched technology
- Approach
  - Adopt existing space data standards where available and appropriate
  - Adapt standards where necessary
  - Feed enhancements back to the standards community for broader acceptance
  - Feed results into Satellite Control Network Contract (SCNC) Architecture development
- Multi-phase study and demo project started in 2001
  - Phase 1: Standards assessments and lab tests
  - Phase 2: Field tests with AF R&D assets
  - Phase 3: Field tests with commercial and civil agency ground stations
  - Phase 4: Develop standards profile for national infrastructure, field test vendor implementations

4 CCSDS Space Link Extension – Candidate for AFSCN Interoperability
- Consultative Committee for Space Data Systems (CCSDS) has developed standards for the exchange of space vehicle (SV) command and telemetry data between mission ground facilities (e.g., satellite operations centers) and TT&C ground stations
  - Dubbed Space Link Extension (SLE) because they extend the space link protocols across the terrestrial WAN
  - Originally developed to transport CCSDS-defined space link data units
- Adoption of SLE by the international civil space community makes SLE attractive as an interoperability standard for the AFSCN
- Adaptations to CCSDS SLE services were prototyped and demonstrated in Interoperability Phases 1 & 2
  - SLE Return All Frames (RAF) service adapted to support the AFSCN's time-correlated streaming telemetry, in contrast to discrete CCSDS space link data units (a sketch of this adaptation follows below)
  - SLE Forward Communication Link Transmission Unit (FCLTU) service adapted to support time-critical streaming command data
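A minimal sketch of the streaming-telemetry adaptation idea, assuming one plausible approach (the block size, names, and timestamp handling here are illustrative, not the project's actual design): the continuous serial stream is sliced into blocks, each annotated with a ground-receipt time so the SOC can reconstruct the AFSCN time-data correlation.

```python
import time
from dataclasses import dataclass

BLOCK_SIZE = 1024  # bytes per block; illustrative, not from the project

@dataclass
class AnnotatedBlock:
    """A slice of the serial telemetry stream plus its receipt time."""
    t_receipt: float  # time the first byte of the block arrived
    payload: bytes

def blocks_from_stream(stream):
    """Slice a file-like serial telemetry stream into timestamped blocks.

    In the real service the timestamp would come from station time
    (e.g., IRIG-derived), not the host clock used here.
    """
    while True:
        t = time.time()
        chunk = stream.read(BLOCK_SIZE)
        if not chunk:
            return
        yield AnnotatedBlock(t_receipt=t, payload=chunk)
```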

5 CCSDS Space Link Extension Services
- Standard services for exchanging space link data between the ground termination of the space link and remote users
  - An evolution of mission-unique/provider-unique services
- Formally assumes that CCSDS link protocols are used on the space-ground link
  - The AFSCN project has loosened even this assumption
- Appropriate for missions that do not use Internet-interoperable protocols
[Figure: domain of Space Link Extension. CCSDS (or legacy) space link data structures are tunneled through the IP WAN between the ground station's CCSDS (or legacy) space link interface, the control center, and a return link data processing facility. A minimal framing sketch follows below.]
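The "tunneling" of space link data structures through the IP WAN can be illustrated with a minimal framing sketch (the 4-byte length prefix is an illustrative choice; actual SLE protocol data units are ASN.1-encoded and carry far more metadata):

```python
import socket
import struct

def send_data_unit(sock: socket.socket, unit: bytes) -> None:
    """Tunnel one space link data unit over TCP, length-prefixed."""
    sock.sendall(struct.pack("!I", len(unit)) + unit)

def recv_data_unit(sock: socket.socket) -> bytes:
    """Receive one length-prefixed data unit from the tunnel."""
    (length,) = struct.unpack("!I", _recv_exactly(sock, 4))
    return _recv_exactly(sock, length)

def _recv_exactly(sock: socket.socket, n: int) -> bytes:
    """Read exactly n bytes, since TCP recv may return partial data."""
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("tunnel closed mid-unit")
        buf += chunk
    return buf
```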

6 Phase 3 Service Architecture
[Figure: Phase 3 service architecture. Three ground stations served the CERES SOC over an Internet VPN: the Johns Hopkins University Applied Physics Lab (JHU APL) station (Laurel, MD) supporting TACO 1, with dual connectivity (Internet and dedicated T1); the NOAA Command and Data Acquisition (CDA) station (Wallops Flight Facility, VA) supporting TACO 2; and the NASA Experimental 5.4 m station (Wallops Flight Facility, VA) supporting TACO 3, with Internet connectivity only. Each station provided some combination of FCLTU (cmd), RAF (tlm/echo/trk), and ioNET (cmd/tlm/echo) services, in GST and Avtec implementations, to the COBRA SV C2 system at the CERES SOC.]

Abbreviations: COBRA = COTS-based Research Architecture; FCLTU = Forward Communication Link Transmission Unit (CCSDS SLE); ioNET = Avtec multiplexer/demultiplexer system; RAF = Return All Frames (CCSDS SLE); TACO = Test And CheckOut SV.

This diagram shows the various configurations of services as they were provided by the three ground stations that were used. All services from all stations were provided across the Internet. In addition, a dedicated T1 link was used between the JHU APL site and the CERES SOC to support testing using the 1 Mbps downlink channel generated by one of the TACO vehicles. All inter-facility data transmissions were conducted over a VPN formed by using IPsec routers to access the Internet and T1 connections (not explicitly shown).

Two telemetry data transfer services were tested in Phase 3. The CCSDS-standard RAF service was used, but the input and output had to be processed in a non-RAF-standard way to accommodate streaming telemetry and AFSCN time-data correlation requirements. The Avtec ioNET, which implements a proprietary multiplexing algorithm, was also tested. The fundamental difference between the approaches: conveying timestamps and syncing output to a separately generated IRIG signal, versus digitizing the analog IRIG signal at the ground station and conveying it across the WAN (a rough bandwidth comparison appears below).

Two command data transfer services were tested in Phase 3. The CCSDS-standard FCLTU service was used, but the input and output had to be processed in a non-FCLTU-standard way to accommodate streaming commands and AFSCN time-critical command requirements. The Avtec ioNET was also tested.

For command echo, two data transfer services were again tested. Because CCSDS does not have, and is not planning to develop, a standard command echo service, we pressed the SLE RAF service into use, with input/output modifications to allow the service to handle the streamed data. Again, the Avtec ioNET was the other service tested.

Finally, CCSDS is only in the early stages of developing a standard for the delivery of tracking data from the ground station to a user of tracking data (such as the SOC), but the work to date has been based on the SLE data transfer service model. We chose to use the RAF service as the basis for our Phase 3 tracking data transfer capability, in anticipation of a future CCSDS SLE Tracking Data Transfer Service standard.
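Rough arithmetic makes the bandwidth difference between the two time-correlation approaches concrete. The 170 kbps figure for the digitized IRIG-B signal comes from the Phase 3 results (slide 8); the timestamp sizing is an illustrative assumption:

```python
# Approach 1: digitize the analog IRIG-B signal at the ground station
# and carry it across the WAN alongside the telemetry.
irig_b_digitized_bps = 170_000  # Phase 3 figure; constant overhead

# Approach 2: timestamp each telemetry block and regenerate IRIG at the
# SOC. Overhead scales with block rate rather than running continuously.
timestamp_bits = 64      # illustrative: one 64-bit time code per block
blocks_per_second = 125  # illustrative: 1 kB blocks at a ~1 Mbps downlink
timestamp_bps = timestamp_bits * blocks_per_second

print(f"digitized IRIG-B: {irig_b_digitized_bps / 1000:.0f} kbps")
print(f"timestamps:       {timestamp_bps / 1000:.0f} kbps")  # 8 kbps
```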

7 Phase 3 Service Management Interfaces
[Figure: service management topology. The CERES SOC ran a client and SLE-SM provider application and a service status display; Service Request and Response XML files were exchanged as attachments over the Internet with the DataLynx Sched/Mgmt system at the DataLynx Ops Center (Columbia, MD) and with the NASA WFF Sched/Mgmt system(s); 1/sec service status messages flowed from the JHU APL station to CERES.]

- None of the Phase 3 TT&C networks supported the AFSCN SIS-509 resource management interface, so we used the emerging XML-based CCSDS Service Management Service Request standard as the basis for a scheduling prototype
- Service requests (schedule requests) were exchanged between CERES and the DataLynx Operations Center (which served as the control agent for the APL station) and the NASA WFF
  - Scheduling of the NOAA Wallops CDA station
- Service monitoring from the APL station was accomplished by reporting selected status parameters in 1/sec XML-formatted messages that were sent to CERES over a TCP connection and displayed on the service management workstation there (a sketch of this pattern follows below)
- None of the ground stations supported real-time user control of station resources, so no such capability was provided
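A minimal sketch of the 1/sec status-reporting pattern (the XML element and parameter names are invented for illustration; the prototype's actual message schema is not reproduced here):

```python
import socket
import time
from xml.etree import ElementTree as ET

def report_status(host: str, port: int, read_status) -> None:
    """Send one XML-formatted station status message per second over TCP.

    read_status is a caller-supplied callable returning a dict of
    parameter name -> value; all element names here are illustrative.
    """
    with socket.create_connection((host, port)) as sock:
        while True:
            root = ET.Element("stationStatus", time=f"{time.time():.3f}")
            for name, value in read_status().items():
                ET.SubElement(root, "param", name=name).text = str(value)
            sock.sendall(ET.tostring(root) + b"\n")
            time.sleep(1.0)
```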

8 Summary of Phase 3 Results*
Telemetry
- Use of TCP and data buffering (~2-4 seconds additional delay) provided reliable delivery of the serial telemetry stream (see the playout-buffer sketch below)
- Time-data correlation (TDC) of the SLE RAF implementation was accurate to within several tens of milliseconds, but not the required 1 msec
- ioNET TDC was accurate to within 1 msec, but its 170 kbps digitized IRIG-B signal was deemed overly consumptive of bandwidth
Command
- Time-critical commanding was successful using both the FCLTU- and ioNET-based implementations
Command echo
- Command echo was successful using both the RAF- and ioNET-based implementations
Service Management
- Contacts scheduled using SLE standard-format schedule requests/responses
- Ad hoc methods needed for tracking data and ground station status
* Full results are documented in the SCNC Interoperability Phase 3 Project Report, Honeywell Technology Solutions, Inc., 28 October 2003, prepared by Lance Williams
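A minimal sketch of the buffering idea behind the telemetry delivery result, assuming a fixed playout delay (the depth and block handling are illustrative): output is deliberately held for a few seconds so TCP retransmissions and network jitter are absorbed before the serial stream is re-emitted at a steady rate.

```python
import queue
import time

PLAYOUT_DELAY_S = 3.0  # within the ~2-4 s range reported in Phase 3

def playout(in_q: queue.Queue, write_block) -> None:
    """Re-emit (receipt_time, block) pairs PLAYOUT_DELAY_S after receipt.

    TCP already guarantees delivery and ordering; the fixed delay gives
    retransmitted segments time to arrive so the output never stalls.
    """
    while True:
        t_receipt, block = in_q.get()
        wait = (t_receipt + PLAYOUT_DELAY_S) - time.time()
        if wait > 0:
            time.sleep(wait)
        write_block(block)
```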

9 Transition from Legacy to Standards-Based Infrastructure (1 of 3)
Three reference points for a transition architecture:
- RF & modulation interface with the SV (currently SIS-502D): remains in place as long as compliant SVs are flying
- New standards-based interoperable interface between ground station and SOC: to be put in place ASAP
- Telemetry, command, and command echo interface as seen by the user C2 system (currently SIS-508E): can be replaced or eliminated as SOC technology and designs evolve
[Figure: ground station (Range Segment) linked to the SOC across the Communication Segment, showing the SIS-502 interface at the SV side, the new interoperable interface between ground station and SOC, and the SIS-508 interface into the current AFSCN C2 system.]

10 Transition from Legacy to Standards-Based Infrastructure (2 of 3)
Three sets of "standards" to enable interoperability for AFSCN-client SVs using SLE transfer services:
- SLE service production specifications that define the processing required to transform the data transferred by the SLE transfer service to/from RF (currently SIS-502D)
- SLE transfer service specifications that provide the interoperable interface
- SLE service adaptation specifications that define the required transformations between SLE and legacy user interfaces (currently SIS-508E); a sketch of the adaptation idea follows below
[Figure: in the ground station, SGLS/SLE service production feeds an SLE transfer service (TS) entity; a peer SLE TS entity at the SOC feeds an SIS-508/SLE adaptation in front of the current AFSCN C2 system.]
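A minimal sketch of the adaptation concept, assuming invented interface shapes on both sides (neither the transfer unit nor the legacy delivery callable reflects the actual SIS-508 or SLE definitions): the adaptation consumes SLE transfer units on the network side and presents the legacy-style stream the C2 system expects.

```python
from dataclasses import dataclass

@dataclass
class SleTransferUnit:
    """Simplified stand-in for an SLE RAF transfer unit."""
    earth_receive_time: float
    data: bytes

class LegacyAdapter:
    """Present a legacy-style interface on top of an SLE transfer service.

    emit_legacy is whatever callable delivers bytes (and timing) to the
    C2 system in its SIS-508-era form; both sides are illustrative.
    """

    def __init__(self, emit_legacy):
        self.emit_legacy = emit_legacy

    def on_transfer_unit(self, unit: SleTransferUnit) -> None:
        # Strip the SLE framing and hand the raw stream, plus its
        # timing annotation, to the legacy C2 interface.
        self.emit_legacy(unit.data, unit.earth_receive_time)
```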

11 Transition from Legacy to Standards-Based Infrastructure (3 of 3)
- Adaptations eventually disappear as SOC designs natively incorporate SLE transfer services (a packet-mode C2 system eliminates the adaptation)
- SLE service productions evolve with SV evolution (USB augments SGLS)
[Figure 1: ground station with SGLS/SLE service production and an SLE TS entity; at the SOC, an SLE TS entity feeds an AFSCN packet-mode C2 system directly, with no adaptation.]
[Figure 2: under an updated SIS-502, SGLS & USB/SLE service production at the ground station feeds the SLE TS entity; the SOC side is unchanged, with its SLE TS entity feeding the AFSCN packet-mode C2 system.]

12 Phase 4 Implementation of SLE Services
Telemetry
- Vendor-supported implementation of prototype modifications of the RAF SLE service to transfer and time-correlate a serial stream of telemetry data
Command
- Vendor-supported implementation of prototype modifications of the FCLTU SLE service to transfer AFSCN SGLS-formatted command data
- The SGLS command FCLTU service supports two modes (see the sketch below):
  - Discrete commands (inter-command idle added at the ground station)
  - Streaming commands (all symbols, including inter-command idle, supplied by the user), also usable to transfer discrete commands
Command Echo
- Vendor-supported implementation of prototype modifications of the RAF SLE service to transfer a serial stream of command echo symbols
Service Management
- Continue exploring use of SLE scheduling and configuration standards
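A minimal sketch of the two FCLTU command modes (the idle symbol value, gap length, and framing are illustrative assumptions; actual SGLS symbol handling is defined by the AFSCN standards):

```python
IDLE_SYMBOL = b"\x00"  # illustrative stand-in for the SGLS idle pattern

def discrete_mode(commands, idle_gap: int = 16) -> bytes:
    """Discrete mode: the user sends bare commands and the ground
    station inserts the inter-command idle between them."""
    out = bytearray()
    for cmd in commands:
        out += cmd
        out += IDLE_SYMBOL * idle_gap  # added at the ground station
    return bytes(out)

def streaming_mode(symbol_stream: bytes) -> bytes:
    """Streaming mode: the user supplies every symbol, idle included,
    and the ground station radiates the stream as received."""
    return symbol_stream
```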

13 Establishing Interoperability With Other US Space Organizations
Telemetry
- Develop new CCSDS Return Telemetry (RT) SLE service
  - Supports AFSCN serial telemetry
  - Supports civil space needs not met by the current RAF SLE service
  - CCSDS Birds of a Feather (BOF) group formed
- Adopt the time-correlated telemetry adaptation in 508-legacy AFSCN SOCs
Command
- Adopt SGLS command production for FCLTU as a US national interoperability capability
  - USTAG13 BOF group formed
- Adopt the SGLS command adaptation in 508-legacy AFSCN SOCs
Command echo
- Adopt SGLS command echo production for the RT service as a US interoperability capability
- Adopt the SGLS command echo adaptation in 508-legacy AFSCN SOCs

14 Acknowledgements
- U.S. Air Force Space and Missile Systems Center, Satellite and Launch Control SPO (SMC/RN)
  - AJ Ashby, 1Lt, USAF, project officer
  - Carl Sunshine, The Aerospace Corporation, technical lead
- Satellite Control Network Contract
  - Lance Williams, Interoperability project lead
- JHU APL ground station
- AF Center for Research Support (CERES)
- NASA WFF and NOAA Wallops CDA ground stations
- Avtec Systems
- General Dynamics Advanced Information Systems

15 Phase 4 Participants
- U.S. Air Force Space and Missile Systems Center, Satellite and Launch Control SPO (SMC/RN)
- Satellite Control Network Contract
- The Aerospace Corporation
- GST, Inc.
- AF Center for Research Support (CERES)
- Avtec Systems
- NOAA WFF
- GD-AIS
- L3Com RTLogic
- JPL, NASA

