Discussion and conclusion The OGC SOS describes a global standard for storing and recalling sensor data and the associated metadata. The standard covers.

Presentation transcript:

Discussion and conclusion
The OGC SOS describes a global standard for storing and recalling sensor data and the associated metadata. The standard covers information about sensors, such as manufacturer, operational range and sensitivities, through SensorML, as well as the data acquisition points using features of interest. As the software follows the OGC SWE standard, the system can interface and share data with many existing online tools and information infrastructures. The SOS was used to capture data from the Shale Hills experimental watershed, part of the critical zone observatory network. As well as capturing real-time data from approximately 50 sensors, all available historic data was imported through the API. The website provided all the functionality required by users in early testing, such as simple querying and downloadable content. The modelling system was tested with small example models running on a local server, which completed as expected. The message brokering system allowed rapid passing of command messages without being clogged by the resulting data files. This could be improved by developing a queuing system for requesting model runs. It is hoped to expand the system to handle data from other critical zone observatories.

Data modelling
As most of the results captured from the Shale Hills watershed are hydrological data, the system was expanded to allow modelling through both interfaces, the API and the website. The system can run models over data recalled for a specified time period using Octave [5], a free open-source equivalent to MATLAB. The web interface allows uploading of models, updating of existing models (while maintaining older versions) and reviewing of results, as shown in Fig. 3. When a model is executed, a webpage is generated allowing the user to view the results. If the modelling fails, download links are provided to repeat the process locally (allowing debugging). The execution of the models is handled via a message brokering system.
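The command-message pattern described above, where the broker carries only lightweight XML while output files travel directly between server and client, can be sketched as follows. The element names (runRequest, modelId, runStatus and so on) are illustrative assumptions, not the actual message schema used by the system:

```python
import xml.etree.ElementTree as ET

def build_run_request(model_id, start, end):
    """Build a lightweight XML command message asking a remote compute
    node to execute a model over a time window.
    (Element names are illustrative, not the system's actual schema.)"""
    root = ET.Element("runRequest")
    ET.SubElement(root, "modelId").text = model_id
    ET.SubElement(root, "startTime").text = start
    ET.SubElement(root, "endTime").text = end
    return ET.tostring(root, encoding="unicode")

def parse_status_reply(xml_text):
    """Parse the status message returned by the remote computer.
    Output files are transferred directly between server and client,
    so the broker only ever carries small messages like these."""
    root = ET.fromstring(xml_text)
    return {child.tag: child.text for child in root}

request = build_run_request(
    "hydro-01", "2010-01-01T00:00:00Z", "2010-02-01T00:00:00Z")
reply = parse_status_reply(
    "<runStatus><modelId>hydro-01</modelId><state>completed</state>"
    "<resultUrl>http://example.org/results/42</resultUrl></runStatus>")
```

Keeping bulky result files out of the broker is the design point: the reply carries only a pointer to the results, which the client then fetches directly.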
When a user selects a model to be run, a lightweight XML message is sent via a broker to a remote computer, which can be a compute cluster. The execution of the model is carried out on the remote computer and another lightweight XML message containing status information is returned. The output files are transferred directly between the server and the client, reducing the volume of data passing through the message broker, as shown in Fig. 4.

Data acquisition
The web service developed was based on the 52º North [3] implementation of the SOS; its database model and core functionality were used as a basis for development. The web service was coded in PHP in two parts: a machine-to-machine API and a human-to-machine web interface. The machine-to-machine API follows a similar approach to the 52º North solution, allowing it to interact with existing tools. This API uses a remote procedure call interface to request data, which is returned as XML. The web interface, shown in Fig. 2, uses a structured approach allowing for a RESTful interface. Users first select a feature of interest, which then displays all the sensors within that feature of interest as well as a Simile Timeplot [4] of all the sensor data for a given time period. A single sensor can then be selected to show the sensor metadata as well as a plot of its data for a given time period. A data recall page is also available, allowing the user to select a time period and the sensors they wish to retrieve data for. For each Simile Timeplot a CSV download is also available so that data can be processed locally.

Introduction
The Shale Hills critical zone laboratory in Pennsylvania, shown in Fig. 1, was one of three critical zone observatories funded by the U.S. National Science Foundation [1]. The goal of these observatories was to study the complex processes occurring on the Earth's surface, including research into hydrology, geomorphology and biogeochemical systems.
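The per-plot CSV export mentioned in the data acquisition section amounts to a small serialisation step over recalled observations. A minimal sketch follows; the field names (timestamp, sensor, value) are illustrative assumptions about the recalled record structure:

```python
import csv
import io

def observations_to_csv(observations):
    """Serialise recalled observations to CSV so users can process the
    data locally. Each observation is assumed to be a dict with
    timestamp, sensor and value keys (illustrative field names)."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=["timestamp", "sensor", "value"])
    writer.writeheader()
    for obs in observations:
        writer.writerow(obs)
    return buf.getvalue()

sample = [
    {"timestamp": "2010-06-01T12:00:00Z", "sensor": "soil-moisture-03",
     "value": "0.27"},
    {"timestamp": "2010-06-01T12:15:00Z", "sensor": "soil-moisture-03",
     "value": "0.26"},
]
csv_text = observations_to_csv(sample)
```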
To allow sharing of this data between researchers, a standardised approach to storing the data is required. The volume of data and the need for automation mean that the data must be accessible through machine-to-machine interactions as well as a human-to-machine interface. In this work an implementation of the Open Geospatial Consortium (OGC) Sensor Observation Service (SOS) has been developed to capture, store and view the Shale Hills critical zone observatory sensor data.

Critical Zone Observatories and Sensor Repositories
Stephen Wilson & Jeremy Frey*
School of Chemistry, University of Southampton, Highfield, Southampton, SO17 1BJ, UK; sw1703@soton.ac.uk

Fig 3. Screenshots from the Shale Hills critical zone SOS data modelling service
Fig 4. The message and data flow system for execution of models via the SOS
Fig 1. The Shale Hills critical zone observatory. Src: http://www.rthnet.psu.edu/

Sensor Observation Service (SOS) overview
The Sensor Observation Service is part of the OGC Sensor Web Enablement (SWE) standard [2], aimed at allowing integration of sensor webs with existing information infrastructure. The SOS standard describes a web service interface for storing, filtering and retrieving sensor data and metadata in real time, for both fixed and dynamic sensors. There are three core operations the SOS must provide: getCapabilities (retrieving all operations and allowed values of the SOS), describeSensor (returning the metadata of a given sensor as SensorML) and getObservation (used to recall data), along with a number of non-mandatory functions. The SOS uses the concept of an observation as an event that produces a result, the result being an estimate of the observed phenomenon. Each event is classified by a time stamp, a feature of interest, the observed phenomenon and the procedure (sensor). Data is recalled from the SOS through observation offerings, which are non-overlapping groups of related observations.

References
1. S. P. Anderson, R. C. Bales, and C. J. Duffy.
Critical zone observatories: Building a network to advance interdisciplinary study of Earth surface processes, February 2008.
2. Sensor Web Enablement WG. OGC. [web page] http://www.opengeospatial.org/projects/groups/sensorweb [Accessed 17th September].
3. Welcome to 52North. 52North.org. [web page] http://52north.org/ [Accessed 17th September].
4. Simile | Timeplot. Simile. [web page] http://simile.mit.edu/timeplot/ [Accessed 17th September].
5. Octave. [web page] http://www.gnu.org/software/octave/ [Accessed 17th September].

Fig 2. Screenshots from the Shale Hills critical zone SOS data acquisition service

Acknowledgments
This project is funded by the EPSRC and the Worldwide Universities Network. The author would like to thank Chris Duffy for making critical zone data available, Brian Bills for providing hosting, and Karl Mueller & William Brouwer for supporting the WUN trip.

Background
The data acquisition system and the use of the Sensor Observation Service were originally developed for use in the chemistry laboratory. This system has been running within the School of Chemistry at the University of Southampton for a number of years. The software was designed in such a way that it would scale to other sensor deployments (as has now been proven). During this time the captured data has been used to determine faults in equipment as well as to identify possible causes of failed experiments.
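The observation concept from the SOS overview, an event classified by time stamp, feature of interest, observed phenomenon and procedure, can be sketched as a simple data structure. Grouping by feature of interest is shown as one illustrative way to form non-overlapping offerings; the standard leaves the grouping choice to the service provider:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Observation:
    """An SOS observation: an event producing a result, the result
    being an estimate of the observed phenomenon."""
    timestamp: str
    feature_of_interest: str
    observed_phenomenon: str
    procedure: str  # the sensor that produced the estimate
    result: float

def group_into_offerings(observations):
    """Group observations into offerings keyed by feature of interest
    (an illustrative non-overlapping grouping, not mandated by SOS)."""
    offerings = {}
    for obs in observations:
        offerings.setdefault(obs.feature_of_interest, []).append(obs)
    return offerings

obs = [
    Observation("2010-06-01T12:00:00Z", "shale-hills",
                "soil_moisture", "sensor-03", 0.27),
    Observation("2010-06-01T12:00:00Z", "shale-hills",
                "air_temperature", "sensor-11", 18.4),
]
offerings = group_into_offerings(obs)
```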

