Rabkin’s ROI has a great post on the impact new technology has on our “baseline experience”.  Barry Rabkin points out that when new technology is seen to add value to our lives or improve the way we go about our business, a new baseline of expectation is established against which we measure future technology.

His post is a good reminder that disruptive technologies are always changing market expectations.  While we may believe our product is the disruptive one, we can never lose sight of the reality that our competitors are also responding to changing market conditions.

For companies in the spatial technology world, this change is all too evident, particularly as organizations such as Microsoft and Google continually raise the bar of consumer expectation with respect to access to spatial information, ease of use, and so on.  What was pretty heady stuff a few years ago has now fallen below the threshold of user expectation.

I’m not sure the saying “a rising tide lifts all boats” is entirely true in a competitive marketplace, but it is clear that even when the bar is being steadily raised by some large market players, opportunity remains for others.  In the spatial technology field we continue to see many companies thrive (not all, to be sure) and new companies emerge, set to introduce their own technologically disruptive products and services into the market.

The challenge remains to innovate around sound technology with a sharp eye on what the market requires.  Recognize that market expectations are always changing, remember that your competition is probably gauging the market as closely as you are, anticipate what they will do, and have a plan to deal with it.

So how do you go about creating a dialogue between technical experts and those who may be able to benefit from the technology in question, whether they realize it or not?

Location Intelligence Conference 2009

The Location Intelligence Conference bridges this gap and creates that dialogue.  LI 2009 has come and gone, in the process offering a wide-ranging presentation of location technology trends and their application.

While not the opening talk, Jeff Christensen’s (Rhiza Labs) presentation, “Designing Simple Tools for Powerful Analysis”, framed the discussion when he reminded those in attendance that ultimately data is used to tell stories and make decisions.  Regardless of the complexity of the solution or the technology applied, the end goal is the same.

From a technology perspective, LI 2009 opened with Steve Coast, founder of OpenStreetMap, describing the phenomenal growth of the crowdsourced alternative to Navteq and TeleAtlas street network data.  As much as anything, OpenStreetMap is a reminder that new paradigms can lead to technology advances and amazing new applications.

Cloud computing was a recurring theme throughout the conference, with various speakers offering their expert opinions and hands-on experience.  Discussion touched on cloud computing concepts, benefits, technical challenges and trends.  A particularly interesting presentation by Mark Sundt described the development of Appistry’s CloudIQ, which provides a cloud solution for clients who want the benefits of cloud services but want to deploy them in their own data centres.

On the application side, presentations covered the spectrum of location technology applications: from the complex, with John Bennett (Hunt Energy IQ) describing Hunt Energy IQ’s work integrating a range of sensors to develop green, intelligent buildings where energy costs can be calculated and managed in real time, to the role of IP location information at the foundation of Examiner.com (Dave Shafer, Co-Founder and COO), allowing them to provide users with hyperlocal news content.

Three days of presentations, panel discussions and individual conversations provided a basket load of information on location technologies and their application.

My takeaways from LI 2009:

  1. Confirmation that location intelligence technologies continue to evolve and offer opportunity for new applications, and
  2. Successful applications of location intelligence technology consistently exhibit a clear understanding of the problem and a very specific use of technology, regardless of its simplicity or complexity.

Yesterday I watched an inspiring interview with Jessica Jackley, co-founder of Kiva.org.  The interview was part of an event called The Leadership Summit, an annual faith-based leadership event that includes presentations from recognized leaders from all walks of life.

While there were many things about the interview with Ms. Jackley that I found fascinating and inspiring, one thing that stood out in particular was how clearly Kiva understood the importance of defining its mission statement and how that statement then permeated the organization.  Kiva’s mission is “to connect people through lending for the sake of alleviating poverty”. The organization links individual credit lenders (some lending as little as $25) with entrepreneurs in developing countries through in-country microfinance organizations.  When the interviewer asked how Kiva managed to grow (Kiva has helped raise tens of millions of dollars in capital since its formation in 2005) while maintaining a relatively flat organizational structure, Ms. Jackley pointed to Kiva’s mission statement as providing the guidance and direction for a growing organization.  She asserts that Kiva’s mission statement provides an important mechanism for self-governance and reduces (not eliminates) the need for organizational structure and bureaucracy, allowing for great creativity and productivity within the organization.

So, a mission statement can be more than a line item or paragraph in your business plan: it can be a guiding light that enables an organization to operate efficiently and to flourish.

But My GPS Told Me To….

It seems I am hearing a lot about the problems with personal navigation technology these days. A recent Twitter post by @mapserving pointed to a BBC news article describing another case of GPS misadventure. And only a few days ago, my wife and I, attending a reception, listened to an extended dinner-table discussion of the problems of in-car navigation. Some of the accounts are humorous, but sometimes the consequences may be serious.

So, as someone promoting the benefits of spatial technology, what does one make of these types of reports?

With the growth in location-aware applications and services upon us, we need to remind ourselves, first, that technology in itself is probably not the complete solution to any user’s needs.

In the case of GPS navigation there are many potential sources for error including the following:

  • Outdated map data – recent street additions
  • Incorrect or incomplete data
  • Inaccurate geocoding
  • Poor routing models
  • GPS satellite system responsiveness and accuracy
  • Interpretation of user queries
  • Operator error
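To make one of these error sources concrete, here is a minimal Python sketch of the "inaccurate geocoding" case, measuring how far an interpolated geocode can land from a true address position using the haversine great-circle formula. The coordinates are hypothetical, chosen only for illustration, not drawn from any real geocoder or map dataset:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two lat/lon points."""
    r = 6371000.0  # mean Earth radius in metres
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

# Hypothetical example: the true location of a street address versus
# the point a geocoder returned by interpolating along the street segment.
true_position = (45.4215, -75.6972)
geocoded = (45.4230, -75.6940)

error_m = haversine_m(*true_position, *geocoded)
print(f"Geocoding error: {error_m:.0f} m")
```

A gap of a few hundred metres like this is enough to route a driver to the wrong block, even when every other part of the navigation stack works perfectly.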

Take a look at the manuals that accompany your GPS device. If these issues are addressed at all, it is not in an overt way. And even if they were spelled out more prominently, would it make a difference? My sense is that in today’s technological world, there is a tendency among all of us to focus on the benefits of technology while losing sight of its limitations.

There is a fine balance between promoting new technology and ensuring that users are aware of its limitations or the need for other information, common sense, etc. One of the challenges for those providing technology-based products and services is to minimize the limitations of the technology during the user experience. This can be accomplished by:

  • Understanding the use case – this will change as products and service uptake moves from early adopters to mainstream users;
  • Ensuring you have thought through and are able to provide a complete solution to the user – are things like documentation and training necessary, and how should they be implemented to be effective; and
  • Constantly working to solve technical limitations or providing workarounds.

These are pretty fundamental, and there are probably others, but we need to keep at least these three in the foreground as we work to advance the use of spatial information and technologies.

In the world of technology, it seems the focus is usually on the early stages of the product lifecycle curve.  While we dream (and hopefully plan) for the point when our product moves beyond the realm of the visionary and early adopters, much of our effort and challenge remains with those early stages.

On June 22, 2009, Kodak announced the end of the line had come for its Kodachrome film products – after a 74-year run.  Amazing.

Kodachrome Announcement

In today’s technological world, it is hard to imagine a product life cycle of 74 years.  But Kodak succeeded with Kodachrome for many years.

Marketed to both professional photographers and those of us more in the snapshot category, the Kodachrome film products were known for their vibrant colors, fine grain, sharpness and archival qualities.  For many, it was the film of choice.  The product line even made its way into a popular song recorded by Paul Simon in 1973.

In making the decision to discontinue the product line, Kodak indicated that Kodachrome-generated revenues had declined to a fraction of the company’s total revenue.  The product’s life cycle peaked in the 1950s and ’60s, but as Kodak’s business changed in response to the disruption of digital photography, sales declined.

The reality is that the end of the cycle comes for all products.  Often in the technology world, the cycle is disrupted by new innovation and the tail of the curve is cut short.  But for some – like Kodachrome – the cycle is long and successful.

For the nostalgic, Kodak has compiled a gallery of iconic images shot with Kodachrome film.

“Kodachrome
They give us those nice bright colors
They give us the greens of summers
Makes you think all the world’s a sunny day, Oh yeah
I got a Nikon camera
I love to take a photograph
So mama don’t take my Kodachrome away”—Paul Simon, 1973

When you combine an appropriate organizational structure, defined roles and responsibilities, and sound processes properly linked to a mission or business model, an organization can be confident that it has a governance structure able to guide its operations.

Put another way, the key elements of a governance model are:

  • Build on the corporate-level mandate
  • Define authority
  • Establish and enforce rules of operation
  • Manage change
  • Measure results and optimize

So how is this relevant to an organization’s implementation of web-based mapping applications?

In the rapidly evolving world of technology the only thing that seems certain about the future is that it will be different from today and the degree of difference is proportional to the time scale.  I would suggest this picture applies to the current state of web-based mapping technology.

For an organization considering or already engaged in the development of a web mapping application, the challenge of making choices today that remain valid tomorrow can be daunting – and particularly so if the organization does not see its strengths in the world of technology.

Is it just me, or do the terms governance and technical innovation seem at opposite ends of the cool spectrum?

All too often, inadequate attention is paid to constructing an application-appropriate governance structure to ensure the long term sustainability and evolution of web-based mapping applications.  My observation is that even though web mapping is a relatively young area of endeavour, many applications have a tendency to flag or grow stale over time.

The areas an appropriate governance model will touch on include:

  • Application alignment with corporate goals
    • Definition and refinement of application objectives
    • Budgeting/resource procurement
  • Definition of performance criteria
  • Application lifecycle management
    • Management of the initial service/application functionality
    • Data management
    • Application enhancements
    • Internal staff resource management
    • User training
  • Monitoring of application services performance and effectiveness
    • Application use
    • Service uptime/downtime or underperformance
    • Benefits to user organizations
    • Benefits to information users

The objective should be to strike a balance: a sufficient level of governance to provide direction, without it becoming overbearing and bureaucratic.

As Kim Guenther has stated “… governance structures are most noticeable in their absence and seem invisible when working effectively.”

Spatial context is part of our decision making, but spatial information technology may not be.

The recent announcement by YourStreet.com that they were discontinuing the use of maps in their hyperlocal news service is a reminder that there is nothing sacred about the application of spatial information tools in a business context.  That is sometimes hard for us to imagine – at least those of us living with spatial data and technology day in and day out.

The reason cited by Directions Magazine was a financial one – maintaining the service was too costly.  Assuming that is true, what does one make of it?

  • Technology used to communicate spatial context has a value associated with it
  • At some point the value of spatial information technology may not justify the cost
  • If that point is reached, the technology in question will be dropped or will atrophy

Should that surprise us? Not really since it pretty much is the way life goes.

In the case of YourStreet, has the importance of spatial information disappeared? I would argue that it has not, given that the premise for their business revolves around local (read: spatially relevant) news.  Instead, YourStreet has simply determined it will not use online mapping tools as a spatial reference system to help its users.  It has deemed that a descriptive spatial reference system (i.e., a user defines the spatial context for news of a particular location in his or her request) is adequate for its users’ needs.

We need to be clear that spatial context is not the same as spatial information technology.  The former can be achieved in a variety of ways.  Technology may aid in providing spatial context but it needs to be evaluated within a cost/benefit framework appropriate to the business or organization in question.

Spatial Data Infrastructures (SDIs) have been around since the mid-1980s, when the Australian Land Information Council was first created.  While somewhat of a generalization, SDI promotion and use has largely been confined to those with a high degree of knowledge of spatial data and technology.

Recent coverage of government transparency and data access in the United States got me thinking about the role spatial information tools and infrastructure play in enabling a broader community of people and organizations to access and understand the value of spatial information.

The concept behind an SDI is that it provides a mechanism for greater accessibility to spatial information with resulting economic (to both the data providers and users) and social benefits to countries or regions implementing SDIs.  Today SDIs have been or are being implemented in well over 100 countries.

[Figure: SDI architecture, modified from Rajabifard et al., 2002]

In their early formulation, the focus of SDIs was largely on database creation, but in recent years the focus has shifted toward user community needs, with emphasis on processes for data access, use and dissemination.  This shift can be seen clearly in the refocused mandate of the Canadian GeoConnections program, which now concentrates much of its effort (and funding) on encouraging user communities to take advantage of the Canadian SDI (the Canadian Geospatial Data Infrastructure).

Despite the shift in focus, considerable work remains to be done to facilitate greater access to data.  Surveys have shown that SDI usage is still predominantly in the worlds of research and government, where knowledge of spatial concepts is on average higher and technology infrastructure greater than among the general public.

However, the emergence of tools such as Google Earth and other “geo-browser” tools and the associated interest in spatial information among a broader range of users has not gone unnoticed by SDI policy experts and researchers. Today there is considerable discussion about the evolution of SDIs and what can be learned from the Web 2.0 world of spatial information.  The challenge will be to draw the best from both worlds to create an environment where the value of spatial information can be realized more simply and by a broader group of people.

I see at least two threads emerging in this discussion: one around the challenges of evolving SDIs to bring all the value of past investments to a point where a broader group can access it in a user friendly manner and the second around helping users to fully benefit from all spatial information has to offer.

What form will user implementations take, how will they be sustained, and what benefits will users realize?  It will be interesting to see what the future holds.

Time and again, technology companies are reminded that success requires a business defined around business objectives, not just technology. This is no less true in the world of spatial information, where companies frequently fall into the trap of promoting sophisticated technology rather than identifying true market needs for which they can create products and services from their technological strengths.

I came across a recent announcement about a new service called Go iLawn. In my mind this is a perfect example of a service developed to address a very specific market segment with a perceived need.  Go iLawn is targeted at the landscape services market. The company that developed the service has its roots in the spatial information world and is utilizing high-resolution imagery, cadastral data and a user-friendly web interface to provide landscape companies with a service that allows them to view the yards of individual customers. With the tools provided by Go iLawn, they can calculate lawn area, yard dimensions, and locations of trees, shrubs, etc. – all necessary for a landscape company to produce a cost estimate for its customers. Go iLawn allows them to respond to customers more quickly and to reduce the cost of preparing quotes and providing services – ultimately leading to a more profitable business.

If you check out the Go iLawn website, you won’t find a lot of information about spatial information technologies – even the GIS company behind the service is referenced only in the copyright notice and the “Contact Us” section. The site is completely focused on customer needs and the benefits Go iLawn provides – exactly where the emphasis should be placed.