Archive for 'Spatial Infrastructure'

So what do you know about machine-to-machine (M2M) technologies?  Not so much, you say.  But I suspect you may have heard of Smart Grids.  These are a form of ‘smart infrastructure’ that employs M2M technology.

A basic definition: M2M is the suite of technologies that supports wired or wireless communication between machines, allowing them to exchange information such as location, temperature, device status, etc.
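To make the definition concrete, here is a minimal sketch of what one such machine-to-machine exchange might look like: a device reading serialized for transport and parsed on the receiving side.  The field names, device ID, and message format are all hypothetical, chosen only to illustrate the kinds of information (location, temperature, status) mentioned above.

```python
import json

def encode_reading(device_id, lat, lon, temperature_c, status):
    """Serialize one device reading as a JSON message for transport."""
    return json.dumps({
        "device_id": device_id,
        "location": {"lat": lat, "lon": lon},
        "temperature_c": temperature_c,
        "status": status,
    })

def decode_reading(message):
    """Parse a received message back into a dict on the central side."""
    return json.loads(message)

# A single round trip: device encodes, central system decodes.
msg = encode_reading("meter-0042", 45.42, -75.69, 21.5, "OK")
reading = decode_reading(msg)
print(reading["device_id"], reading["status"])
```

Note that the location travels with every reading, which is what opens the door to the geospatial opportunities discussed below.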

Simple M2M Architecture (courtesy ETSI)

Smart Grids are an example of an M2M technology implementation in which infrastructure within a utility grid is instrumented with sensors that can monitor a wide variety of information such as energy consumption, power outages, etc.  These sensors are tied into a communication network that feeds the sensed information to a central system where events can be monitored and various responses effected.
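The central-system side of that loop can be sketched very simply: scan incoming sensor readings and flag the ones that look like events needing a response.  The sensor names, fields, and threshold logic here are all illustrative assumptions, not a real utility's scheme.

```python
# Hypothetical readings received from grid sensors; field names are illustrative.
readings = [
    {"sensor": "xfmr-01", "lat": 45.42, "lon": -75.69, "kwh": 12.3, "online": True},
    {"sensor": "xfmr-02", "lat": 45.44, "lon": -75.71, "kwh": 0.0,  "online": False},
    {"sensor": "xfmr-03", "lat": 45.40, "lon": -75.65, "kwh": 9.8,  "online": True},
]

def detect_events(readings):
    """Flag sensors that look like outages so a response can be dispatched."""
    events = []
    for r in readings:
        if not r["online"] or r["kwh"] == 0.0:
            events.append({"sensor": r["sensor"],
                           "lat": r["lat"], "lon": r["lon"],
                           "event": "possible outage"})
    return events

for e in detect_events(readings):
    print(e)
```

Because each flagged event carries its coordinates, the same record can drive a map display or a dispatch decision downstream.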

M2M technologies are emerging as a significant technology growth area with applications being realized in many areas.

From a geospatial perspective, M2M presents many opportunities.  At CeBIT Australia in August 2010 David Hocking, CEO of the Spatial Industries Business Association, boldly stated “No smart infrastructure is smart unless it’s geo-enabled.  Spatial data is the glue for Smart Infrastructure.”

Whether the centrality of geospatial information is as great as that or not, there are clear opportunities for the inclusion of geospatial technologies within the M2M technology mix for many applications.

Many of the classic benefits of geographic information systems can be exploited when M2M systems account for and collect geospatial data.  These include:

Asset visualization – the ability to view network devices being monitored in an appropriately scaled spatial context is beneficial in quickly assessing network status, problems, patterns, etc.

Asset management – geoanalytical tools can be used to understand the dynamic nature of network assets by assessing qualities like grouping, patterns, movement, etc.

Dashboard views – spatially structured dashboard views of key network parameters can provide both alerts and contextual information about network asset behavior.

Correlation with ancillary information – one of the strengths of GIS technologies is the ability to correlate various types of data on the basis of location.  For instance it might be relevant in an M2M utility grid application to be able to visualize the network against weather information to better understand causes of network disruption.

M2M network performance analysis – spatial measurement tools can be utilized to visualize and better understand network behavior.  These can be summative (evaluating past performance) or real time in nature.

Resource allocation and deployment – network management, whether for regular maintenance or emergency response, can be better planned and coordinated with a spatial reference.

While the integration of geospatial technologies with M2M infrastructure has the potential to add value to the overall network, there are issues that need to be addressed in order to achieve maximum benefit.  These include:

To what degree does network monitoring and analysis need to be dynamic?

Issues around data management:

Where is it stored?

Who owns the data?

Is the data complete?

Do the various data layers have compatible levels of accuracy?

Has the data been cleaned and structured so that it can be interfaced with other data layers?

How is the data maintained?

System connectivity:

How will systems talk to each other?

Can data from disparate systems be reliably accessed?

While there may be challenges to integrating M2M and geospatial technologies, there are also clear benefits that can enhance the value of M2M networks.

So how do you go about creating a dialogue between technical experts and those who may be able to benefit from the technology in question, whether they realize it or not?

Location Intelligence Conference 2009

The Location Intelligence Conference bridges this gap and creates that dialogue.   LI 2009 has come and gone, in the process offering a wide-ranging presentation of location technology trends and their application.

While not the opening talk, Jeff Christensen’s (Rhiza Labs) presentation entitled “Designing Simple Tools for Powerful Analysis” framed the discussion when he reminded those in attendance that ultimately data is used to tell stories and make decisions.  Regardless of complexity of the solution or the technology applied, the end goal is the same.

From a technology perspective, LI 2009 opened with Steve Coast, founder of OpenStreetMap, describing the phenomenal growth of the crowd-sourced alternative to Navteq and TeleAtlas street network data.  As much as anything, OpenStreetMap is a reminder that new paradigms can lead to technology advances and amazing new applications.

Cloud computing was a recurring theme throughout the conference, with various speakers offering their expert opinion and hands-on experience.  Discussion touched on cloud computing concepts, benefits, technical challenges and trends.  A particularly interesting presentation by Mark Sundt described the development of Appistry’s CloudIQ, which provides a cloud solution for clients who want the benefits of cloud services but want to deploy them in their own data centres.

On the application side, presentations covered the spectrum of location technology application: from the complex, with John Bennett (Hunt Energy IQ) describing Hunt Energy IQ’s work to integrate a range of sensors into green intelligent buildings where energy costs can be calculated and managed in real time, to the role of IP location information at the foundation of Examiner.com (Dave Shafer, Co-Founder and COO), which allows the site to provide users with hyperlocal news content.

Three days of presentations, panel discussions and individual conversations provided a basket load of information on location technologies and their application.

My takeaways from LI 2009:

  1. Confirmation that location intelligence technologies continue to evolve and offer opportunity for new applications, and
  2. Successful applications of location intelligence technology consistently exhibit clear understanding and very specific use of technology regardless of the simplicity or complexity of the technology.

Spatial context is a part of our decision making, but spatial information technology may not be.

The recent announcement by YourStreet.com that they were discontinuing the use of maps in their hyperlocal news service is a reminder that there is nothing sacred about the application of spatial information tools in a business context.  That is sometimes hard to imagine – at least for those of us living with spatial data and technology day in and day out.

The reason cited by Directions Magazine was a financial one – maintaining the service was too costly.  Assuming that is true, what does one make of it?

  • Technology used to communicate spatial context has a value associated with it
  • At some point the value of spatial information technology may not justify the cost
  • If that point is reached, the technology in question will be dropped or will atrophy

Should that surprise us? Not really, since that is pretty much the way life goes.

In the case of YourStreet, has the importance of spatial information disappeared? I would argue that it has not, given that the premise for their business revolves around local (read: spatially relevant) news.  Instead, YourStreet has simply determined they will not use online mapping tools as a spatial reference system to help their users.  They have deemed that a descriptive spatial reference system (i.e., a user defines the spatial context for news of a particular location in his or her request) is adequate for their users’ needs.
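A descriptive spatial reference system of that kind is, mechanically, very simple: the user names a place, and stories tagged with that place are returned, with no map or coordinates involved.  The sketch below is purely illustrative and assumes nothing about how YourStreet actually implemented their service; the story data and field names are invented.

```python
# Hypothetical news items tagged with place names rather than coordinates.
stories = [
    {"title": "Road closure on Main St", "place": "Springfield"},
    {"title": "New park opens",          "place": "Shelbyville"},
    {"title": "School board meets",      "place": "Springfield"},
]

def local_news(stories, place):
    """A descriptive spatial reference: the user names a place, no map needed."""
    return [s for s in stories if s["place"].lower() == place.lower()]

for s in local_news(stories, "Springfield"):
    print(s["title"])
```

The spatial context is fully intact here; it is just expressed as a name rather than through mapping technology, which is precisely the distinction the next paragraph draws.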

We need to be clear that spatial context is not the same as spatial information technology.  The former can be achieved in a variety of ways.  Technology may aid in providing spatial context but it needs to be evaluated within a cost/benefit framework appropriate to the business or organization in question.

Spatial Data Infrastructures (SDI) have been around since the mid-1980s, when the Australian Land Information Council was first created.  While somewhat of a generalization, SDI promotion and use have largely been confined to those with a high degree of knowledge of spatial data and technology.

Recent coverage of government transparency and data access in the United States got me thinking about the role spatial information tools and infrastructure play in enabling a broader community of people and organizations to access and understand the value of spatial information.

The concept behind an SDI is that it provides a mechanism for greater accessibility to spatial information with resulting economic (to both the data providers and users) and social benefits to countries or regions implementing SDIs.  Today SDIs have been or are being implemented in well over 100 countries.

SDI Architecture (modified from Rajabifard et al, 2002)

In their early formulation the focus of SDIs was largely on database creation but in recent years the focus has been more towards user community needs with emphasis on processes for data access, use and dissemination.  This shift can be seen clearly in the refocused mandate of the Canadian GeoConnections program which now concentrates much of its effort (and funding) on encouraging user communities to take advantage of the Canadian SDI (Canadian Geospatial Data Infrastructure).
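Much of that data access happens through standard OGC web service interfaces.  As an illustration of how thin the access layer can be, here is a sketch that assembles a WMS 1.1.1 GetMap request; the endpoint URL and layer names are hypothetical, though the parameter names follow the WMS specification.

```python
from urllib.parse import urlencode

# Hypothetical SDI endpoint; parameter names follow the OGC WMS 1.1.1 GetMap request.
base = "https://example-sdi.gov/wms"
params = {
    "SERVICE": "WMS",
    "VERSION": "1.1.1",
    "REQUEST": "GetMap",
    "LAYERS": "roads,hydrology",
    "SRS": "EPSG:4326",
    "BBOX": "-76.0,45.0,-75.5,45.5",   # minx,miny,maxx,maxy in lon/lat
    "WIDTH": "800",
    "HEIGHT": "600",
    "FORMAT": "image/png",
}
url = base + "?" + urlencode(params)
print(url)
```

Fetching that URL from a live WMS server would return a rendered map image, which is exactly the kind of low-barrier access that user-community-focused SDI programs are trying to encourage.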

Despite the shift in focus, considerable work remains to be done in order to facilitate greater access to data.  Surveys have shown that SDI usage is still predominantly in the world of research and government where knowledge of spatial concepts is on average higher and technology infrastructure greater than in the general public as a whole.

However, the emergence of tools such as Google Earth and other “geo-browser” tools and the associated interest in spatial information among a broader range of users has not gone unnoticed by SDI policy experts and researchers. Today there is considerable discussion about the evolution of SDIs and what can be learned from the Web 2.0 world of spatial information.  The challenge will be to draw the best from both worlds to create an environment where the value of spatial information can be realized more simply and by a broader group of people.

I see at least two threads emerging in this discussion: one around the challenge of evolving SDIs so that the value of past investments can be accessed by a broader group in a user-friendly manner, and the second around helping users benefit fully from all that spatial information has to offer.

What form will user implementations take, how will they be sustained, and what benefits will users realize?  It will be interesting to see what the future holds.