A Forgotten Art in the Era of ‘Smart’ Technology: Data Management

Updated: Jul 2, 2018

by John Young


Originally Published on LinkedIn on August 18, 2017


OK, yes, I’m trying to be a bit provocative, but the topic of this blog has been building in my mind for over a year now. It has been especially provoked by the surge of interest in “data science” and all the buzz around “smart”, or what I will call “smart era”, technologies. I’ll also say up front that the audience for this blog includes the community of professionals, owners, and IT folks responsible for installation or campus asset and facility management (AM-FM), but I believe my discourse here applies to just about any organization managing mission-critical assets – assets that help an organization deliver on its performance goals.


I’ll start by paraphrasing a few words from Thomas L. Friedman’s recent book “Thank You for Being Late: An Optimist’s Guide to Thriving in an Age of Accelerations” and suggest that it’s time for a “pause” in most organizations’ approach to data management. You know, a small breather with time to step back and consider 1) the volume of data one’s systems are collecting and 2) whether or not one has systems capable of keeping up with the demands of advancing technology. This won’t be a sermon on designing and integrating systems to support big data’s “four V’s” – volume, velocity, variety, and veracity – or opinions on the current state of technologies as they travel along the “technology hype curve”, but rather a simpler step back to consider the benefits of a strategic data management plan and the real risks of not having one. At a high level, I’m digging into the importance of responsible adoption of innovative technology and its proper balance with one’s data management strategy – effectively, responsible business transformation. And regardless of the assets being managed, I should point out my ardent belief that one’s data management strategy should – at its core – be location-based.


While I have only read a couple of Friedman’s books, I found “Thank You for Being Late” to be the perfect catalyst for this blog. The chart of technological change versus human adaptability he presents on page 34 is quite revealing (shown below). It depicts the biggest technological leap since the Second Industrial Revolution over 120 years ago, when electrification was introduced to industry. Back then, it took 30 years – an entire workforce generation – to fully realize the productivity enhancements of the technology. Today, the expected time to realize productivity gains is much shorter … but is that expectation practical?


Taken from Thomas L. Friedman’s book, “Thank You for Being Late: An Optimist's Guide to Thriving in the Age of Accelerations”, 2016; presenting thoughts of Eric “Astro” Teller, the ‘Captain of Moonshots’ at Google X Research.

There are a number of key takeaways from this chart. Most noteworthy is that growth in technology has surpassed humans’ ability to adapt to and process all the inputs [read: data] being generated; and further, that the only way to catch up is through improved education and smarter governing models.


For this blog, I’ll concentrate on the “we are here” point along the technology curve. I have replicated Friedman’s chart below and added a highlighted area I will refer to as the “data usefulness gap”. I have also taken the liberty of collectively calling the “time” to the right of the line intersection the “smart era”. It’s this space, and the future of this growing gap, that causes me to pause and suggest that it’s time to get back to data management fundamentals. It’s this gap that causes me concern for stewards of AM-FM data, and for how to help them strike a balance with business transformation goals.


Foreshadowing: shrinking, or at least controlling the expansion of, the data usefulness gap will be a function of evolving data management best practice and the existence of historic data fed into artificial intelligence or cognitive learning tools – tools whose purpose is fast assimilation, predictive analysis, and the delivery of probability- and confidence-weighted answers or information choices [think: IBM Watson-like]. Future productivity will be determined by these choices. AM-FM practitioners will increasingly be challenged to manage structured-data enterprise systems with the capabilities to support unstructured data analysis.


The current “supernova”, or major catalyst of productivity, described by Friedman is the “computer-internet-mobile-broadband” economy. While the current supernova started a little less than 20 years ago, it has only been in the past decade that smart era technology has kicked into high gear. The question not yet answered is: can the current workforce generation make use of the fantastic technologies hitting the market quickly enough to realize the desired productivity gains? Or will it take another decade? And are there key foundational, even bridge, technologies that can help reduce the time required? In my mind, the answer is yes, and shortly I will describe what I believe is a key bridge technology needed to take advantage of all the new smart era tools.


So what are some examples of technologies capturing all this data on the other, “smart era”, side of the intersection? Which ones are recognized as current or future productivity enhancement or business transformation tools? I won’t claim to be an expert on this topic, but in conversations with some of the brightest AM-FM technology minds in the business – like Mike Parkin, Director of Applied Technology, Innovation and Research at the Massachusetts Institute of Technology – the following technologies stand out as front-runners:


  • Internet of Things (IoT) / Industrial Internet of Things (IIoT) solutions, with sensors being placed on just about every type of asset, both outdoors and indoors. These solutions generate large volumes of data, and thankfully the options for analyzing and storing that data continue to improve in step.

  • Augmented, Virtual, and Mixed Reality (AR/VR/MR), providing the ability to do before/after site planning analysis and look-before-you-dig field operations. These will be the tools of choice for the upcoming generation of college graduates. It’s a little early, but dramatic improvements in smartphone and browser-based 3D visualization already make this technology a reasonable option today.

  • Laser scanning or Lidar & Immersive (360) photography – Lidar scanning, Matterport photography, and image or photo-tagging assets for change/clash detection and picture-based search and discovery.

  • UAV or drone data collection for quick-turn aerial imagery, volumetric analysis, and asset inspections.
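To make the IoT item above a bit more concrete, here is a minimal sketch of what a location-tagged sensor reading and a simple maintenance-threshold check might look like. The asset IDs, field names, and vibration threshold are illustrative assumptions, not any particular product’s schema – the point is that every reading carries a location, so it can later be joined against a GIS layer.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

# Hypothetical reading from an IoT sensor mounted on a facility asset.
@dataclass
class SensorReading:
    asset_id: str
    lat: float          # WGS84 latitude
    lon: float          # WGS84 longitude
    value: float        # e.g., vibration in mm/s
    timestamp: datetime

def flag_out_of_range(readings, threshold):
    """Return IDs of assets whose latest reading exceeds a maintenance threshold."""
    latest = {}
    for r in sorted(readings, key=lambda r: r.timestamp):
        latest[r.asset_id] = r  # later readings overwrite earlier ones
    return [a for a, r in latest.items() if r.value > threshold]

# Example: two assets, one trending above its (assumed) vibration threshold.
now = datetime.now(timezone.utc)
readings = [
    SensorReading("PUMP-01", 41.88, -87.63, 2.1, now),
    SensorReading("PUMP-02", 41.89, -87.64, 7.4, now),
]
print(flag_out_of_range(readings, threshold=4.5))  # ['PUMP-02']
```

Even this toy example hints at the storage question raised below: the readings only become useful for trend detection if they have a durable, location-aware home.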


While this list is not exhaustive, it includes technologies that can be quickly deployed to gather data for analysis and the delivery of actionable information for immediate and short-term (inside six months) decisions. What is less well understood, and trickier, is making sure this data has a proper home for future weighting against historic data for 1-, 3-, 5- … 25(?)-year pattern/trend detection, planning, and prediction. Put simply, can this data be used for predictions that drive future productivity gains? The answer is yes, but presently only in certain industries; it will take some time before this is possible for AM-FM practitioners. Part of the answer to effective use of these data collection technologies is choosing the right bridge technology, one capable of ingesting, storing, managing, integrating, and sharing the data. For more on this answer, and in the interest of not spoiling the book, I’ll suggest readers check out “Thank You for Being Late”. I’d be remiss if I did not mention the digital design format BIM, or building information models, and BIM-producing technologies. While an important part of the equation, I am intentionally not including BIM here, as the topic will be covered more fully in my next blog, described at the end of this one.


Regardless, the spate of data being collected should challenge organizations to step back and evaluate how their AM-FM systems are set up to handle the growing “data usefulness gap” while also protecting themselves (mitigating risk) and responsibly minimizing total cost of ownership.


So what are the risks of not having a location-based strategic data management plan? What are the risks of not elevating the human adaptability curve? I would argue the three most important preventable risks, in no particular order, are:


  1. Damage to infrastructure and facility assets

  2. Increases in total cost of ownership for infrastructure and facility assets

  3. Injury to organization staff, contractors, visitors, or reputation


To better understand these risks, let’s consider a couple of scenarios. In the first, Organization A maintains its infrastructure and facility asset information in a collection of as-built paper drawings, digital design files (e.g., CAD), spreadsheets, and PDF files. Organization B uses contemporary relational databases to store asset attribute information, along with an integrated combination of enterprise systems: enterprise asset management (EAM), building automation system (BAS), and geographic information system (GIS). Both organizations decide to set up location- or proximity-based preventive maintenance alerts for a set of mission-critical assets. For example, when a maintenance operator gets within three meters of a critical surveillance asset, his or her smartphone receives an alert if the asset’s preventive maintenance due date is within one month. In this scenario, Organization B will be in a much better position to add this capability quickly. Organization A can also make it happen, but the path to a similar capability will be longer, more costly, and much harder to maintain. It will put Organization A squarely in the middle of the aforementioned “data usefulness gap”, with difficult decisions to make regarding future storage of asset maintenance record data. In my opinion, Organization A also carries unnecessary risk in a world where technology makes significant leaps every six months; i.e., the gap will continue to widen at an alarming rate, making catch-up and maintenance ever more difficult, more expensive, and more time-consuming.
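The proximity-based alert in this scenario can be sketched in a few lines. This is an illustration of the logic only: the three-meter radius and one-month window come from the scenario, while the haversine distance helper, the coordinates, and the function names are my own assumptions, not any organization’s actual implementation.

```python
import math
from datetime import date, timedelta

def distance_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters (haversine formula)."""
    r = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def pm_alert(operator_pos, asset_pos, pm_due, today, radius_m=3.0, window_days=30):
    """Alert when the operator is within radius_m of the asset and the asset's
    preventive-maintenance due date falls inside the look-ahead window."""
    close = distance_m(*operator_pos, *asset_pos) <= radius_m
    due_soon = today <= pm_due <= today + timedelta(days=window_days)
    return close and due_soon

# Operator about 1.3 m from the asset; PM due in two weeks -> alert fires.
today = date(2017, 8, 18)
print(pm_alert((41.880001, -87.630001), (41.880010, -87.630010),
               pm_due=date(2017, 9, 1), today=today))  # True
```

The logic itself is trivial; the hard part, and the point of the scenario, is that Organization B can feed it live asset locations and maintenance records from its integrated systems, while Organization A first has to dig those out of drawings, spreadsheets, and PDFs.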


Let’s consider another scenario, where the implications of a widening data usefulness gap could introduce even greater risk. In this likely scenario, government legislation or regulation requires asset maintenance, repair response, or emergency response within a set interval of time. If we take Organizations A and B from the previous scenario and assume both maintain their current asset management systems as-is at the time the new regulations take effect, which organization will be at greater risk of non-compliance?


I’ll suggest that most organizations will not want to be in a place where a slow or inaccurate response, or an improperly maintained asset, fails and results in loss of productivity, loss of performance, or, worse, human injury. “If only we had better, easier access to the data” and “if only our data had been organized for efficient analysis and presentation as actionable information” are preventable scenarios with consequences most organizations will want to avoid. Not all scenarios are doom and gloom, though. Here are a few smart installation examples pertinent both today and in the not-so-distant future: indoor E911 response, real-time or current-state building occupancy, space charge-back optimization, capital project location analysis, smart asset location analysis (placement of sensors, cameras, etc.), proximity-based preventive maintenance alerts, AR/VR for in-field asset inspection or “look before you dig” analysis, and fiber network and cybersecurity control point location analysis.


With the steep rise of technology advancement in the smart era, it is only a matter of time before these types of regulations are adopted. When that happens, I suggest it’s better to be starting from the state of Organization B. As for a real-world parallel, one need look no further than the GAO report dated May 25, 2016, “Information Technology: Federal Agencies Need to Address Aging Legacy Systems”, to understand the magnitude of the cost and risk of waiting to modernize one’s data management strategies.

What are we doing about it at Patrick Engineering & Geospatial Services (Patrick)?


Above all of my responsibilities, I view data science and strategic AM-FM technology implementation planning as the most important. Central to this responsibility is helping an organization develop a strategy for data management. The saying I heard over 20 years ago, “[data] garbage going in, garbage coming out”, still applies today, but now at a hyper-accelerated pace. At Patrick, I am working with our team of professionals to build solutions and service offerings that help owners and operators implement integrated AM-FM-GIS solutions. We apply best-of-breed strategies, technologies, and methods that help organizations achieve business transformation goals balanced against the growing set of smart era requirements. At the core of this best practice is improved hierarchical, location-based organization of data. In this regard, we work closely with our partner Esri to follow ArcGIS platform architecture and design best practices. Regardless of asset type – land, infrastructure, or buildings – and the variety of AM-FM systems in use, we provide data management, system design, integration, and implementation services that help installation, campus, and plant owners/operators deliver productivity goals today on a solution foundation that can adapt to and support future advances in technology.


Esri and the ArcGIS platform provide information [read: aggregated data] models for many asset categories. They provide templates for hierarchical, location-based organization of asset data – from tax parcels and land holdings to electric and water utilities to building indoor facilities. They also support a wide variety of industry standards used to define and classify infrastructure and facility assets as well as hooks or connections to other enterprise systems.
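To illustrate what “hierarchical, location-based organization” means in practice, here is a minimal sketch of a site > building > floor > asset containment hierarchy. The class and field names are illustrative assumptions of mine, not Esri’s actual information model; the point is that every asset resolves to a place by walking the hierarchy.

```python
from dataclasses import dataclass, field

@dataclass
class Asset:
    asset_id: str
    category: str           # e.g., "HVAC", "surveillance"

@dataclass
class Floor:
    name: str
    assets: list = field(default_factory=list)

@dataclass
class Building:
    name: str
    footprint_wkt: str      # footprint geometry as well-known text
    floors: list = field(default_factory=list)

@dataclass
class Site:
    name: str
    buildings: list = field(default_factory=list)

def locate(site, asset_id):
    """Walk the containment hierarchy to find where an asset lives."""
    for b in site.buildings:
        for f in b.floors:
            for a in f.assets:
                if a.asset_id == asset_id:
                    return (site.name, b.name, f.name)
    return None

campus = Site("Main Campus", buildings=[
    Building("Hall A", "POLYGON ((0 0, 1 0, 1 1, 0 0))", floors=[
        Floor("Floor 2", assets=[Asset("CAM-17", "surveillance")]),
    ]),
])
print(locate(campus, "CAM-17"))  # ('Main Campus', 'Hall A', 'Floor 2')
```

A real deployment would store this hierarchy in a geodatabase with true geometries rather than in-memory objects, but the design idea is the same: location is the organizing key, from land parcel down to individual indoor asset.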

Interestingly, ArcGIS platform technologies have evolved over almost 50 years to become a primary data store and management system for much of the planet’s natural and built environment. This is particularly true for land and infrastructure assets. Along this journey, Esri has consistently kept in step with, or even slightly ahead of, information technology evolution. Increasingly over the past decade, Esri and its customer organizations (over 350,000 globally) have started adding buildings and building indoor environments to the list of managed assets. I’ve been there with them on every step of that journey.


My next blog will take a deep dive into the drivers [read: consumerization and urbanization] behind this increasing trend and why the ArcGIS platform and its Geodatabase may be one of THE key bridge technologies that holds the “data usefulness gap” in check.


To learn more about the ideas, technologies, and services described here and in my previous articles, please contact me.

