Data Continuity, Data Accessibility

In an age when data creation is growing exponentially and the conversation about big data analytics nears hype proportions, I think the question of data continuity and accessibility becomes increasingly important.

Missing Data

A recent post in Nature by Elizabeth Gibney and Richard Van Noorden, regarding the loss of raw data associated with published research articles, caught my attention.

The authors summarized the work of a group of scientists who wanted to understand how accessible research data remains as a function of the date when results were published. The investigators endeavoured to access the raw data associated with 516 published research articles ranging from 2 to 22 years old, considering factors such as active author email addresses and access to the data itself. They found that the odds of the associated data being accessible fell by 17% per year, and that within 20 years of an article’s publication up to 80% of the associated raw data can be lost, along with any possibility of future researchers utilizing it. The authors conclude by advocating for public archiving of data at the time of publication to ensure future accessibility.
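Since the study reports the decline in terms of odds rather than probability, the implied loss over time depends on the starting point. A quick back-of-the-envelope sketch (assuming, purely for illustration, even odds of retrieving data at publication time – that starting value is my assumption, not a figure from the study) shows how a 17% annual decline in odds compounds:

```python
def probability_after(years, initial_odds=1.0, annual_odds_decline=0.17):
    """Probability that data is still retrievable after `years` years.

    The study reports a 17% annual fall in the *odds* of retrieval;
    odds = p / (1 - p), so p = odds / (1 + odds). The initial_odds of
    1.0 (a 50% chance at publication) is an illustrative assumption.
    """
    odds = initial_odds * (1 - annual_odds_decline) ** years
    return odds / (1 + odds)

for years in (2, 10, 20):
    print(f"{years:>2} years: {probability_after(years):.1%}")
```

The exact trajectory depends entirely on the assumed initial odds; the point is simply that a steady decline in odds compounds quickly over a 20-year horizon.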

First Nimbus Image

This somewhat disconcerting article was offset by a more positive story by Sid Perkins, published on Science’s website, describing work being undertaken at the University of Colorado Snow and Ice Data Center to make available archived Nimbus satellite imagery dating back to the mid-1960s. This involves digitizing analogue data, mosaicking the resulting digital images and then adding them to an accessible data archive. To date more than 250,000 images have been made available, adding considerably to a time-series record of value in assessing issues such as high-latitude sea ice variability and tropical and mid-latitude weather variability. The extension of the data record to 50-plus years is truly impressive.

While it seems there are many questions around data compatibility (for another discussion), efforts to establish processes to ensure research data continuity and accessibility are to be commended and should be valued, even as new data is being generated at tremendous rates.

References

Gibney, E. and R. Van Noorden. 2013. Scientists losing data at a rapid rate. Nature. doi:10.1038/nature.2013.14416

Vines, T. H. et al. 2014. The availability of research data declines rapidly with article age. Curr. Biol. doi:10.1016/j.cub.2013.11.014

Nimbus data rescue: recovering the past to understand the future. 2014. http://cires.colorado.edu/news/press/2014/nimbus.html

Perkins, S. 2014. Long-lost satellite data reveal new insights to climate change. Science. http://news.sciencemag.org/climate/2014/09/long-lost-satellite-data-reveal-new-insights-climate-change

While I don’t consider myself a luddite when it comes to the administrative details of my business, it is true that I didn’t get into business for the sake of bookkeeping, new phone systems and the like. But all of these business support details are a necessary aspect of keeping a business running and serving customers.

For a small business owner who wears several hats, attention to the administrative aspects of the business can sometimes be lost among all the other responsibilities. And the associated costs can creep upward with little notice.

This past year we took steps to address a number of recurring monthly costs. All of the services we looked at have value – in the right context. In our case, requirements had changed over time but the services were never intentionally evaluated.

So here are the changes we made:

Web conferencing – $50/mo. This service had been set up a couple of years back because of the need for regular conference calling between a member of my team and a particular client. With the project completed, we were using the service very infrequently, so we cancelled it and moved to an on-demand service through our telecommunications services provider.

Land telephone line – $60/mo. Speaking of telecommunications, late in 2014 we made the decision to cut the landline. This was a shared service for my family and our business, which has its office in the house. There was seldom a conflict, since the landline was hardly ever used for either personal or business purposes, but it did represent a recurring cost for both. The replacement – Ooma’s internet-based service, which provides both personal and business lines at substantially reduced cost.

Fax service – $12/mo. Does anyone fax anymore? I figured we hadn’t sent or received a fax in well over a year so this one was ditched without much thought.

Government tender delivery service – $20/mo. This was an office/marketing expense we weren’t receiving much benefit from. The original purpose was to access Government of Canada (primarily) tenders, but in the past year the government created its own portal that allows free access to the same data. Since we seldom pursue opportunities from the other agencies the former service still covers, it was not worthwhile to continue paying the monthly subscription fee.

Accounting software. We are in the process of converting from a desktop accounting package to a web-based service. From what we can tell, we will have all the functionality we need and save about $50/month.

This past year we addressed several cost issues. I believe we have reduced costs without negatively impacting our business, and in some cases we have gained in terms of service. While some of you may see these as relatively inconsequential costs, I believe they are significant for a small to medium-sized business.

So I have resolved to stay on top of our administrative business services in 2015 – hopefully ensuring cost creep doesn’t recur and maybe even finding additional savings. On the radar for this year are mobile phone charges and banking fees (including wire transfer fees).

If you have suggestions or past lessons you have learned, I’d love to hear from you.

Map Your Fishing Products

Cardinalus’ business is focused on enhancing its clients’ business value through the incorporation of location technologies.

For recreation, I enjoy flyfishing.  So when I saw how Cheeky Fly Fishing was using maps to help explain the range of fishing reels they offer, I was intrigued.

Cheeky offers a number of fishing reels to suit different fishing environments.  Those environments are location dependent so they use a clever map to illustrate where the reels are best suited – from the mountain streams and ponds, to larger rivers and eventually the open ocean.  By sliding the mouse over the various product labels different regions of the map are highlighted to illustrate the preferred zone for each reel.

Cheeky Reels

The application is effective but certainly not advanced in terms of data handling and representation. However, it caused me to think about other applications companies might consider to more effectively illustrate their product and service offerings.

A few quickly came to mind:

  • a product manufacturer with a diverse product offering and an extensive distribution network could use a map to illustrate both the location of its distributors and which parts of its product line each distributor offers;
  • a retail business with multiple stores within a city or region could leverage real-time inventory management and mapping to illustrate which products are available in its various store outlets.
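As a rough sketch of the second idea: the data behind such a map could be as simple as per-store records combining a location with an inventory snapshot, after which "which outlets have this product in stock?" is a one-line filter. The store names, coordinates and stock counts below are all invented for illustration:

```python
# Hypothetical store records: a location plus a live inventory snapshot.
STORES = [
    {"name": "Downtown", "lat": 45.421, "lon": -75.697,
     "stock": {"reel-a": 3, "reel-b": 0}},
    {"name": "Westboro", "lat": 45.393, "lon": -75.757,
     "stock": {"reel-a": 0, "reel-b": 5}},
]

def stores_with_product(product, stores=STORES):
    """Names of stores that currently hold at least one unit of `product`."""
    return [s["name"] for s in stores if s["stock"].get(product, 0) > 0]
```

Feeding the matching names and coordinates to any web mapping API then produces the customer-facing view.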

Those are just a start.  The point is that spatial representation of your company’s product and service data can benefit your customer’s experience.  And that’s a good thing.

Last week was a big week in the web and mobile mapping world on at least two fronts.

First, Amazon announced plans to release a mapping API to support the Kindle Fire HD.  That alone would have been significant news, but to some extent it was overshadowed by Apple’s release of iOS 6.  The fact that Apple’s new OS would include a home-built mapping application replacing Google Maps was not news.  The plans had been known for a while, but this was the first time Apple’s users had a chance to test drive the application.  And what a ride it got.  I can’t imagine how many blog posts have been published about Apple’s new mapping application.  Apple has been greeted with comments about lack of functionality, aesthetic differences, as well as out-and-out errors.  There has been plenty to comment on.  Websites have been set up to post user-discovered flaws, and users have chimed in.

In the years since Google released Google Maps, users and developers have come to expect well-designed, functional mapping applications.  Mapping applications have become a part of our everyday lives.  We depend on applications built on them with the expectation that certain functionality and information will be present.  Functionality such as geographical search and navigation is now embedded in our personal and our business lives.

For the most part, we take web and mobile mapping capability for granted.  Our expectation is that the applications will be there, they will work and they will work well.  That’s why the response to Apple’s product has been so immediate and vociferous.  The fact that building and maintaining a mapping application is extremely complex is lost on most people.

Circling back to Amazon.  What does Apple’s experience mean for their foray into the world of mapping?  Their API is now in beta, accessible to Kindle application developers.  Presumably they are providing Amazon feedback that will allow them to address the sorts of issues Apple has experienced.  But the task is not trivial.  Leaders in the mapping world have invested substantial time and effort to provide seamless data and a complete set of mapping tools.  Drew Olanoff provides a great TechCrunch summary of some of the inner workings of map database creation.

At a higher level, I think an important question is: to what end should hardware developers be investing in proprietary core mapping capability?  The cost to application developers of having to deal with different mapping platforms is not insignificant, nor is the ongoing support and enhancement effort.  To what extent can a consistent user experience be maintained across platforms?  Does it matter if there are differences?

And what about third-party providers positioning themselves as independent brokers?  At least one – deCarta – is convinced there is an opportunity to be had.

It will be interesting to see how Apple responds to the community response to their mapping application.  How will Amazon fare?   What strategy will application developers take to accommodate the differences they face?

No doubt there is more to come in the world of mapping APIs.

Going Off Location – Gowalla

Today it was announced that Gowalla was going off location.

Gowalla was acquired by Facebook last December, and clearly the new owners have plans for the technology that don’t include a standalone application.  Facebook will leverage much of the technology and knowledge in its own location services.  Gowalla was one of the first location-sharing services, along with Foursquare.

GoGeomatics is leading the charge in bringing together the geomatics community in Canada.  While the company’s initial focus has been to serve as a bridge between people and job opportunities in the geomatics sector, they are doing much more to promote the Canadian geomatics industry.  Their most recent articles have shone a light on women in the industry, providing valuable insights into the careers, challenges and opportunities facing women in our business.  Check out their new website – they are developing an important platform for people and companies in the Canadian geomatics sector.

Opportunity: Geomatics Commercialization

For many geomatics companies the challenges of bringing new innovation to market can seem insurmountable.  If you are an SME and don’t know who Tecterra is, you should find out.  Tecterra is a Canadian commercialization support program with a mission to help Canadian companies bring innovative geomatics solutions to market.

Small and medium-sized geomatics technology companies based in the Ottawa area should consider attending a one-day event being organized by Tecterra at the end of February that will focus on Investing in Geomatics Innovation.

The event will include several interesting and informative guest speakers (Ed Parsons of Google and General Rick Hillier, Former Chief of the Defence Staff of the Canadian Forces) as well as representatives from Tecterra who will provide information about the various investment and support programs they offer.

The exact date is February 29.  There is no charge for the event but I understand seating is limited so request your invitation early.

Yesterday Chris Brogan published an interesting blog post sharing his take on the state of location-based services (LBS) and what it will take to get them to the next level.  As a noted and respected voice in the areas of new media communication and social networking, Chris points out what he considers to be the current limitations of LBS technology and identifies some things he thinks would add value to the LBS offering for consumers.  For the record, I would agree that LBS is in its infancy and that its value to the average consumer is pretty limited.  Recent studies have shown that the uptake of LBS applications is limited to a small, keen segment of the population, though others suggest it is growing.  Having said that, I also believe there is great potential for growth in LBS application development.

Key value adds today:

  • Proximity.  Identify your location to businesses and receive real-time updates on information such as local traffic and weather.
  • Navigation.  Plan your route and obtain real-time directions.

Some new interesting developments:

  • Geofencing.  An extension of proximity capability to define a region of interest around your current location or some fixed point.  Applications might be to monitor the movement of a known object (like your kids or a pet?) or to identify businesses within some distance of your current location (barbers within three blocks).  In his post Chris Brogan refers to this as an identity register.
  • M2M.  Machine-to-machine technologies are emerging in a wide array of B2B markets; it will be interesting to see how effectively these can be extended to the consumer market.
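The "barbers within three blocks" case reduces to a point-in-radius test against each candidate location.  Here is a minimal sketch using the haversine great-circle distance; the coordinates and the 250 m stand-in for "three blocks" are illustrative, not real data:

```python
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_M = 6_371_000

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two lat/lon points."""
    dlat = radians(lat2 - lat1)
    dlon = radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * EARTH_RADIUS_M * asin(sqrt(a))

def within_fence(center, places, radius_m):
    """Places falling inside a circular geofence around `center`."""
    return [p for p in places
            if haversine_m(center[0], center[1], p[1], p[2]) <= radius_m]

# Hypothetical businesses near a user standing at (45.4215, -75.6972).
barbers = [("ClipJoint", 45.4222, -75.6965), ("FarAway", 45.4500, -75.7500)]
nearby = within_fence((45.4215, -75.6972), barbers, radius_m=250)
```

A fence around a fixed point works the same way; monitoring a moving object simply means re-running the test as new positions arrive.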

Some things that would take LBS to the next level:

Chris Brogan also mentioned temporary groups and commerce capability as important enhancements to the LBS experience.

From my perspective, I see analytics as another important enhancement, from both a business and a consumer standpoint.

Challenging issues:

LBS applications are dependent on content.  To the extent that it is available, applications will flourish or remain marginal.  For instance, if I want to know the barbers within a three-block radius of my current location, how many of the existing barbers are actually discoverable?  Obviously those that are will benefit from the application, but if I perceive the information presented to me is incomplete, my confidence in the LBS application will lag.

The other side of the content coin is information privacy.  An issue not limited to the world of LBS applications, the question of protecting information a user considers private (such as current or past location) is an important one. The idea of temporary groups may be one way of addressing privacy concerns.

Those are a few of my thoughts.   Let me know what you think.

So what do you know about machine to machine (M2M) technologies?  Not so much you say.  But I suspect you may have heard of Smart Grids.  These are a form of ‘smart infrastructure’ that employs M2M technology.

A basic definition of M2M is the suite of technologies that support wired or wireless communication between machines allowing for the exchange of information about such factors as location, temperature, device status, etc.
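In concrete terms, that exchange usually takes the form of small structured telemetry messages.  Here is a minimal sketch of one such message; the field names are illustrative, not drawn from any particular M2M standard:

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class SensorReading:
    """One telemetry message from a field device."""
    device_id: str
    lat: float
    lon: float
    temperature_c: float
    status: str  # e.g. "ok", "fault", "offline"

    def to_wire(self) -> str:
        """Serialize the reading for transmission over the network."""
        return json.dumps(asdict(self))

reading = SensorReading("meter-0042", 45.42, -75.69, 21.5, "ok")
payload = reading.to_wire()
```

On the receiving side, `json.loads(payload)` recovers the fields for monitoring or storage.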

Simple M2M Architecture (courtesy ETSI)

Smart Grids are an example of an M2M technology implementation in which infrastructure within a utility grid is instrumented with sensors that can monitor a wide variety of information, such as energy consumption and power outages.  These sensors are connected to a communication network that allows the sensed information to be fed to a central system where events can be monitored and various responses effected.

M2M technologies are emerging as a significant technology growth area with applications being realized in many areas.

From a geospatial perspective, M2M presents many opportunities.  At CeBit Australia in August 2010 David Hocking, CEO of the Spatial Industries Business Association boldly stated “No smart infrastructure is smart unless it’s geo-enabled.  Spatial data is the glue for Smart Infrastructure.”

Whether the centrality of geospatial information is as great as that or not, there are clear opportunities for the inclusion of geospatial technologies within the M2M technology mix for many applications.

Many of the classic benefits of geographic information systems can be exploited when M2M systems account for and collect geospatial data.  These include:

Asset visualization – the ability to view network devices being monitored in an appropriately scaled spatial context is beneficial in quickly assessing network status, problems, patterns, etc.

Asset management – geoanalytical tools can be utilized to understand the dynamic nature of the network’s assets by assessing qualities like grouping, patterns and movement.

Dashboards – spatially structured dashboard views of key network parameters can provide both alerts and contextual information about network asset behavior.

Correlation with ancillary information – one of the strengths of GIS technologies is the ability to correlate various types of data on the basis of location.  For instance it might be relevant in an M2M utility grid application to be able to visualize the network against weather information to better understand causes of network disruption.

M2M network performance analysis – spatial measurement tools can be utilized to visualize and better understand network behavior.  These can be summative (evaluating past performance) or real time in nature.

Resource allocation and deployment – network management planning – whether for regular maintenance or emergency response, can be better planned and coordinated with a spatial reference.
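The correlation benefit in particular lends itself to a simple illustration: location becomes the join key between otherwise unrelated datasets.  A minimal sketch that buckets outage events and weather observations into shared coarse grid cells (all readings and identifiers below are invented):

```python
def grid_cell(lat, lon, cell_deg=0.5):
    """Snap a coordinate to a coarse grid cell usable as a join key."""
    return (round(lat / cell_deg) * cell_deg, round(lon / cell_deg) * cell_deg)

# Hypothetical datasets that share nothing but location.
outages = [{"asset": "xfmr-17", "lat": 45.40, "lon": -75.70}]
weather = {grid_cell(45.45, -75.65): {"wind_kmh": 95, "condition": "storm"}}

def outages_with_weather(outages, weather):
    """Attach the co-located weather observation (if any) to each outage."""
    return [dict(o, weather=weather.get(grid_cell(o["lat"], o["lon"])))
            for o in outages]

enriched = outages_with_weather(outages, weather)
```

A real system would use proper spatial indexing rather than a fixed grid, but the principle – joining disparate layers on location – is the same.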

While the integration of geospatial technologies with M2M infrastructure has the potential to add value to the overall network, there are issues that need to be addressed in order to achieve maximum benefit.  These include:

To what degree does network monitoring and analysis need to be dynamic in nature?

Issues around data management:

Where is it stored?

Who owns the data?

Is the data complete?

Do the various data layers have compatible levels of accuracy?

Has the data been cleaned and structured so that it can be interfaced with other data layers?

How is the data maintained?

System connectivity:

How will systems talk to each other?

Can data from disparate systems be reliably accessed?

While there may be challenges to integrating M2M and geospatial technologies, there are also clear benefits that can enhance the value of M2M networks.

What Direction Is Your Company Headed?

It has been said that it is the direction you set for yourself and not your intention that will determine where you end up.  I believe this to be true –  both at a personal and at a business level.

The path we set for our business is essential to where the company will be 5, 10 or 20 years from now.  While we cannot fully anticipate the future, we must define the path and make the critical decisions about what needs to be done along it.

Several years ago I was providing some advice to a company that was in the process of revamping its business plan – determined to take the company to the next level.  They had arrived at a point where, partly through their own hard work and partly because of outside circumstance, they saw an opportunity for the company to expand and achieve significant growth.  I recall a lunch meeting with the founder and his long-time partner – both of whom had invested time and money in getting the company to where it was.  At that meeting the founder confirmed the intentions for growth, and then he made a very significant statement – “I believe the company would be better served if I handed the president role to someone more skilled at the functions required of that position.”  He would continue with the company in a role more suited to his skill set and interests.  It was a bold statement, and one I knew was appropriate based on my knowledge of the company and the business plan they were in the process of fine tuning.  The founder made the statement understanding that for his company to grow, it would take more than just intention.  It was necessary to take a very specific step – bring into the company the expertise that would be required to achieve the growth they all desired and would benefit from.

All too often the scenario is completely different – one where an individual or group of individuals comes up with an idea for a business.  They become the business founders and owners.  They are the knowledge centre, the energy centre and the major stakeholders.  Sometimes the need for change – such as bringing new skill sets into the company – is not easily recognized, nor easily executed.  Sometimes the need is something else, but whatever it is, the step must be taken to achieve the end result – not always an easy decision, but absolutely necessary.