Semantic Sensor Web

Millions of sensors around the globe currently collect avalanches of data about our environment. The rapid development and deployment of sensor technology involves many different types of sensors, both remote and in situ, with diverse capabilities in range, modality, and maneuverability. It is possible today to utilize networks of multiple sensors to detect and identify objects of interest up close or from a great distance. The lack of integration and communication between these networks, however, often leaves this avalanche of data stovepiped and intensifies the existing problem of too much data and not enough knowledge. To alleviate this glut, we propose that sensor data be annotated with semantic metadata to provide contextual information essential for situational awareness. In particular, we present an approach to annotating sensor data with spatial, temporal, and thematic semantic metadata. This technique builds on current standardization efforts within the W3C and Open Geospatial Consortium (OGC) and extends them with Semantic Web technologies to provide enhanced descriptions of and access to sensor data.
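As an illustration of the idea, the sketch below (stdlib-only Python; the URIs and property names are hypothetical stand-ins for the O&M and WGS84 vocabularies, not the project's actual annotation scheme) attaches spatial, temporal, and thematic metadata to a single raw reading as RDF-style triples:

```python
from datetime import datetime, timezone

# Hypothetical namespace prefixes; a real deployment would use the
# OGC/W3C vocabularies (O&M, SensorML, WGS84) instead.
OM = "http://example.org/om#"
GEO = "http://www.w3.org/2003/01/geo/wgs84_pos#"

def annotate(sensor_id, value, unit, lat, lon, phenomenon, when):
    """Return (subject, predicate, object) triples carrying spatial,
    temporal, and thematic semantic metadata for one raw reading."""
    obs = f"http://example.org/obs/{sensor_id}/{when.timestamp():.0f}"
    return [
        (obs, OM + "procedure", f"http://example.org/sensor/{sensor_id}"),
        (obs, OM + "result", f"{value} {unit}"),
        (obs, OM + "observedProperty", phenomenon),    # thematic metadata
        (obs, GEO + "lat", str(lat)),                  # spatial metadata
        (obs, GEO + "long", str(lon)),
        (obs, OM + "samplingTime", when.isoformat()),  # temporal metadata
    ]

triples = annotate("ws-42", -12.0, "Cel", 39.78, -84.06,
                   "http://example.org/property/AirTemperature",
                   datetime(2008, 3, 8, 12, 0, tzinfo=timezone.utc))
for t in triples:
    print(t)
```

The point of the annotation is that a downstream consumer can now ask "what was observed, where, and when" without knowing anything about the originating sensor network.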

Research Topics

Semantic Modeling and Annotation of Sensor Data

Ontologies and other semantic technologies can be key enabling technologies for sensor networks because they will improve semantic interoperability and integration, as well as facilitate reasoning, classification, and other types of assurance and automation not included in the OGC standards. A semantic sensor network will allow the network, its sensors, and the resulting data to be organised, installed and managed, queried, understood, and controlled through high-level specifications. Ontologies for sensors will provide a framework for describing sensors. These ontologies will allow classification of and reasoning on the capabilities and measurements of sensors and the provenance of measurements, and may allow reasoning about individual sensors as well as about the connection of a number of sensors as a macroinstrument. The sensor ontologies will, to some degree, reflect the OGC standards and, given ontologies that can encode sensor descriptions, understanding how to map between the ontologies and the OGC models is an important consideration. Semantic annotation of sensor descriptions, and of the services that support sensor data exchange and sensor network management, will serve a purpose similar to that of semantic annotation of Web services. This research is conducted through the W3C Semantic Sensor Network Incubator Group (SSN-XG) activity.

Semantic Sensor Observation Service

Sensor Observation Service (SOS) is a Web service specification defined by the Open Geospatial Consortium (OGC) Sensor Web Enablement (SWE) group in order to standardize the way sensors and sensor data are discovered and accessed on the Web. This standard goes a long way toward providing interoperability between repositories of heterogeneous sensor data and the applications that use this data. Many of these applications, however, are ill-equipped to handle raw sensor data as provided by SOS and require actionable knowledge of the environment in order to be practically useful. There are two approaches to dealing with this obstacle: make the applications smarter or make the data smarter. We propose the latter option and accomplish it by leveraging semantic technologies to provide and apply a more meaningful representation of sensor data. More specifically, we are modeling the domain of sensors and sensor observations in a suite of ontologies, adding semantic annotations to the sensor data, using the ontology models to reason over sensor observations, and extending an open-source SOS implementation with our semantic knowledge base. This semantically enabled SOS, or SemSOS, provides the ability to query high-level knowledge of the environment as well as low-level raw sensor data.
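One way to picture the "make the data smarter" approach is the translation of an SOS request into a query over the ontological knowledge base. The sketch below translates a greatly simplified GetObservation request into SPARQL; the om: vocabulary and the request fields are illustrative assumptions, not the actual SemSOS mapping:

```python
def sos_to_sparql(offering, observed_property, begin, end):
    """Translate a (simplified) SOS GetObservation request into a SPARQL
    query over an O&M-style ontology. Vocabulary URIs are illustrative."""
    return f"""
PREFIX om: <http://example.org/om#>
PREFIX xsd: <http://www.w3.org/2001/XMLSchema#>
SELECT ?obs ?result ?time WHERE {{
  ?obs om:offering <{offering}> ;
       om:observedProperty <{observed_property}> ;
       om:result ?result ;
       om:samplingTime ?time .
  FILTER (?time >= "{begin}"^^xsd:dateTime && ?time <= "{end}"^^xsd:dateTime)
}}""".strip()

q = sos_to_sparql("http://example.org/offering/weather",
                  "http://example.org/property/WindSpeed",
                  "2009-05-18T00:00:00Z", "2009-05-22T00:00:00Z")
print(q)
```

Because the query runs against the knowledge base rather than raw records, the same interface can also return inferred high-level facts, not just the stored readings.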

Perception and Analysis of Sensor Data

Currently, there are many sensors collecting information about our environment, leading to an overwhelming number of observations that must be analyzed and explained in order to achieve situation awareness. As perceptual beings, we are also constantly inundated with sensory data; yet we are able to make sense out of our environment with relative ease. This is due, in part, to the bi-directional information flow between our sensory organs and analytical brain. Drawing inspiration from cognitive models of perception, we can improve machine perception by allowing communication from processes that analyze observations to processes that generate observations. Such a perceptual system provides effective utilization of resources by decreasing the cost and number of observations needed for achieving situation awareness.

Trust on Semantic Sensor Web

Trust and confidence are becoming key issues in diverse applications such as e-commerce, social networks, the semantic sensor web, semantic Web information retrieval systems, etc. Both humans and machines use some form of trust to make informed and reliable decisions before acting. In this work, we briefly review existing work on trust networks, pointing out some of its drawbacks. We then propose a local framework to explore two different kinds of trust among agents, called referral trust and functional trust, that are modelled using local partial orders to enable qualitative trust personalization. The proposed approach formalizes reasoning with trust, distinguishing between direct and inferred trust. It is also capable of dealing with general trust networks containing cycles.
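A minimal sketch of the referral/functional distinction (toy data; this does not reproduce the paper's partial-order formalism): functional trust propagates only along referral edges, and a visited set keeps cycles from causing infinite regress.

```python
# Toy trust network: 'referral' edges mean "I trust X's recommendations";
# 'functional' edges mean "I trust X to perform the task itself".
REFERRAL = {"alice": {"bob"}, "bob": {"carol", "alice"}}   # note the cycle
FUNCTIONAL = {"carol": {"dave"}, "alice": {"eve"}}

def trusts(source, target):
    """Does `source` hold direct or inferred functional trust in `target`?
    Functional trust is propagated along referral chains; the `seen` set
    prevents the alice <-> bob cycle from looping forever."""
    seen = set()
    frontier = [source]
    while frontier:
        agent = frontier.pop()
        if agent in seen:
            continue
        seen.add(agent)
        if target in FUNCTIONAL.get(agent, set()):
            return True  # an agent we (transitively) take referrals from
        frontier.extend(REFERRAL.get(agent, set()))
    return False

print(trusts("alice", "dave"))  # inferred: alice -> bob -> carol -> dave
print(trusts("carol", "eve"))   # no referral path from carol reaches eve
```

Replacing the boolean edges with locally ordered trust values is what enables the qualitative personalization described above.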

Analysis of Streaming Sensor Data

Sensors are increasingly being deployed for continuous monitoring of physical phenomena, resulting in an avalanche of sensor data. Current sensor data streams provide summaries (e.g., min, max, avg) of how phenomena change over time; however, such summaries are of little value to decision makers attempting to attain insight or an intuitive awareness of the situation. Feature-streams, on the other hand, provide a higher level of abstraction over the sensor data and yield actionable knowledge useful to the decision maker. This work presents an approach to generating feature-streams in real time. This is accomplished by applying ontological domain knowledge to integrate multiple, multimodal, heterogeneous low-level sensor data streams and infer the existence of real-world events such as Blizzard and RainStorm. The generated feature-streams are publicly accessible on the Linked Open Data (LOD) Cloud.
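A toy version of this inference step (the thresholds and feature definitions are invented for illustration; the real system derives them from ontological domain knowledge): fuse synchronized multimodal readings and emit named features rather than per-stream summaries.

```python
# Illustrative feature definitions over fused multimodal readings.
RULES = {
    "Blizzard":  lambda o: (o["temp_c"] <= -7 and o["wind_kmh"] >= 56
                            and o["visibility_km"] <= 0.4),
    "RainStorm": lambda o: o["precip_mmh"] >= 7.6 and o["wind_kmh"] >= 40,
}

def feature_stream(observations):
    """Turn a stream of fused low-level readings into a stream of
    high-level features (events), the actionable abstraction."""
    for obs in observations:
        features = [name for name, rule in RULES.items() if rule(obs)]
        yield obs["time"], features

readings = [
    {"time": "06:00", "temp_c": -9, "wind_kmh": 62,
     "visibility_km": 0.2, "precip_mmh": 0.0},
    {"time": "07:00", "temp_c": 4, "wind_kmh": 20,
     "visibility_km": 8.0, "precip_mmh": 0.5},
]
for t, fs in feature_stream(readings):
    print(t, fs)
```

Note that the summarization is thematic (several streams fused into one named event per time step) rather than temporal (min/max/avg of a single stream).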

Publications

2012

  • The SSN Ontology of the W3C Semantic Sensor Network Incubator Group
    • Michael Compton, Payam Barnaghi, Luis Bermudez, Raul Garcia-Castro, Oscar Corcho, Simon Cox, John Graybeal, Manfred Hauswirth, Cory Henson, Arthur Herzog, Vincent Huang, Krzysztof Janowicz, W. David Kelsey, Danh Le Phuoc, Laurent Lefort, Myriam Leggieri, Holger Neuhaus, Andriy Nikolov, Kevin Page, Alexandre Passant, Amit Sheth, Kerry Taylor
    • Journal of Web Semantics, 2012

2011

  • Demonstration: Real-Time Semantic Analysis of Sensor Streams (prototype, demo)
    • Harshal Patni, Cory Henson, Michael Cooney, Amit Sheth, and Krishnaprasad Thirunarayan
    • in Proceedings of the 4th International Workshop on Semantic Sensor Networks (SSN 2011), co-located with the 10th International Semantic Web Conference (ISWC 2011), 23-27 October 2011, Bonn, Germany.
  • Semantic Sensor Network XG Final Report
    • Laurent Lefort, Cory Henson, Kerry Taylor, Payam Barnaghi, Michael Compton, Oscar Corcho, Raul García Castro, John Graybeal, Arthur Herzog, Krzysztof Janowicz, Holger Neuhaus, Andriy Nikolov, and Kevin Page
    • W3C Incubator Group Report

2010

  • Provenance Aware Linked Sensor Data
    • Harshal Patni, Satya S. Sahoo, Cory Henson, and Amit Sheth
    • in Proceedings of the 2nd Workshop on Trust and Privacy on the Social and Semantic Web, co-located with ESWC 2010, Heraklion, Greece, May 30 - June 3, 2010.
  • Linked Sensor Data
    • Harshal Patni, Cory Henson, and Amit Sheth
    • in Proceedings of the 2010 International Symposium on Collaborative Technologies and Systems (CTS 2010), Chicago, IL, May 17-21, 2010.

2009

  • A Survey of the Semantic Specification of Sensors
    • Michael Compton, Cory Henson, Laurent Lefort, Holger Neuhaus, and Amit Sheth
    • in Proceedings of the 2nd International Workshop on Semantic Sensor Networks, co-located with the 8th International Semantic Web Conference (ISWC 2009), 25-29 October 2009, Washington, DC.
  • SemSOS: Semantic Sensor Observation Service
    • Cory Henson, Josh Pschorr, Amit Sheth, and Krishnaprasad Thirunarayan
    • in Proceedings of the 2009 International Symposium on Collaborative Technologies and Systems (CTS 2009), Baltimore, MD, May 18-22, 2009.

2008 (and earlier)

  • Semantic Sensor Web
    • Amit Sheth, Cory Henson, and Satya Sahoo
    • IEEE Internet Computing, vol. 12, no. 4, July/August 2008, pp. 78-83.
  • Video on the Semantic Sensor Web
    • Cory Henson, Amit Sheth, Prateek Jain, Josh Pschorr, and Terry Rapoch
    • W3C Video on the Web Workshop, December 12-13, 2007, San Jose, CA, and Brussels, Belgium.

Prototypes, Demos, and Tools

  • SemMOB - Dynamic Sensor Registration and Semantic Processing for ad-hoc MOBile Environments: demo
    SemMOB enables dynamic registration of sensors via mobile devices, search, and near real-time inference over sensor observations in ad-hoc mobile environments (e.g., fire fighting). We demonstrate SemMOB in the context of an emergency response use case that requires automatic and dynamic registration of sensor devices and annotation of sensor observations, decoding of latitude-longitude information into human-sensible place names, fusion and abstraction of sensor values using background knowledge, and their visualization in a mash-up.
  • SECURE - Semantics Empowered Rescue Environment: demo
    This demo is a Semantic Web research effort towards a Physical-Cyber-Social system that uses background knowledge on the web, and an ontology of perception, to reason over the sensor observations generated by a mobile robot.
  • Real Time Feature Streams: demo
    Real Time Feature Streams focuses on reasoning over low-level raw sensor data streams to detect higher-level abstractions called features (concepts that represent real-world entities such as Blizzard or RainStorm) in real time. The feature-streams are added to the Linked Open Data (LOD) Cloud. Main contributions include: (1) integration of multiple, multimodal, heterogeneous “low-level” sensor data streams to generate “high-level” feature streams; (2) summarization across the thematic dimension, involving multiple data streams and the use of background knowledge, as opposed to summarization of single streams across the temporal dimension (e.g., min, max, average).
  • Trusted Perception Cycle: demo
    This demo visualizes the perception cycle (abductive inference) and the reputation values computed for weather stations over a period of six days. The various features inferred from raw sensor data using the perception cycle are depicted with different bar colors, and the height of each bar represents the reputation value of the corresponding weather station. The demo shows all the inferred features and the way in which the reputation computation converges. Main contributions include: (1) development and formalization of the perception cycle; (2) implementation of a reputation system that uses the Beta probability density function to compute trust values.
  • Sensor Discovery On Linked Data: demo
    This application provides the ability to search for sensors by location name rather than by location coordinates. Main contributions include: (1) a large sensor description dataset (> 20,000 weather stations) with links to "nearby" locations described in the GeoNames dataset; (2) a large sensor observation dataset (> 1.7 billion triples) with links to the sensor dataset; (3) both datasets openly available on the LOD Cloud; (4) an extension of SemSOS (described above) to semantically annotate the SOS GetCapabilities document with model references to GeoNames locations.
  • Semantic Sensor Observation Service (MesoWest): demo
    SemSOS is an extension of SOS that allows SOS queries and access to an ontological knowledge base. Main contributions include: (1) a sensor/observation ontology based on Observations and Measurements (O&M); (2) semantic annotation of O&M and SensorML (SML) documents; (3) mappings and translation scripts to convert O&M and SML into RDF (and vice versa); (4) rule-based reasoning to infer events from low-level sensor data; (5) query translation from the SOS format into SPARQL.
  • Semantic Sensor Observation Service (Buckeye Traffic): demo
  • Video on the Semantic Sensor Web: demo

Related Material

  • Training video
  • Computing for Human Experience
  • Physical-Cyber-Social Computing

    Previous Projects

    • Trusted Semantic Sensor Web to Support Decision Making over Massive Amounts of Sensor and Social Data
      • Social networks are increasingly used by humans, both civilians and military personnel, to report observations related to a vast variety of events. The use of mobile devices and smartphones has further accelerated the rate at which such social data is shared through social networks. This is complemented by a regular stream of observations reported by machine sensors at an ever-growing pace, already exceeding petascale. It has consequently become impossible for humans to derive insights or make decisions just by accessing and searching such observational data. Instead, what is necessary is integrated access to a variety of multimodal sensor and social data centered on events, and their analysis such that humans are presented with highly relevant information at a level of abstraction that lends itself to decision making. Current efforts in the semantic sensor web and semantic social web are showing promise in achieving this capability. However, it is critical that the trustworthiness of observational data, as well as of reported information at higher levels of abstraction, be an integral part of any system that is of value to military decision makers.
    • TSSW - Trusted Semantic Sensor Web
      • Trust relationships occur naturally in many diverse contexts such as e-commerce, social interactions, social networks, ad hoc mobile networks, distributed systems, decision-support systems, the (semantic) sensor web, emergency response scenarios, etc. As the connections and interactions between humans and/or machines (collectively called agents) evolve, and as the agents providing content and services become increasingly removed from the agents that consume them, miscreants attempt to corrupt, subvert, or attack the existing infrastructure. This in turn calls for support for robust trust inference (e.g., gleaning, aggregation, propagation) and update (also called trust management). Because Web, social networking, and sensor information often provide complementary and overlapping information about an activity or event that is critical for overall situational awareness, there is a unique need to understand and develop techniques for managing trust that span all of these information channels. Currently, we are pursuing research on trust and trustworthiness issues in interpersonal, social, and sensor networks, with the aim of unifying and integrating them to exploit their complementary strengths.
    • SIDFOT - Sensors Integration for Data Fusion in Operations and Training
      • Social and sensor data are increasingly being used in the continuous monitoring of events such as disasters (natural or man-made), political unrest, etc. Collecting data from multimodal sources provides a holistic view of an event, since the sources of information may complement each other, consequently providing better situational awareness. Representing sensors (machines or humans) and their observations (quantitative or qualitative) helps us annotate the raw data for further analysis and integration of sensor observations. This project focuses on the research issues involved in the representation, modeling, and annotation of sensors and observations using OGC standards along with Semantic Web technologies for access, discovery, search, and integration of sensors and their observations.
    • Architectures for Secure Semantic Sensor Networks for Multi-Layered Sensing
    • Contact Information: Pramod Anantharam