Beyond Mapping III

Topic 21: Human Dimensions of GIS

 

Map Analysis book with companion CD-ROM for hands-on exercises and further reading

 

An Experiential GIS  discusses a participatory GIS experience 

An Understanding GIS  describes the translation of mapped data to spatial information for decision-making

Dreams and Nightmares Are Born of Frustration  identifies concerns with cost/benefit analysis of GIS

GIS Is Never Having to Say You Are Sorry  discusses several human considerations in implementing GIS 

Note: The processing and figures discussed in this topic were derived using MapCalc™ software.  See www.innovativegis.com to download a free MapCalc Learner version with tutorial materials for classroom instruction and self-learning of map analysis concepts and procedures.

 

<Click here> right-click to download and print a printer-friendly version of this topic (.pdf).

(Back to the Table of Contents)
______________________________


Don’t Forget the Human Factor: An Experiential GIS  

(GeoWorld, July 1996)

(return to top of Topic)

 

It is often said that "experience is what you get when you don't get what you want."  The corollary to this universal truth is "learn from others' mistakes, so you won't have to make them all yourself."  As GIS moved from its infancy in the early 1970's to its present maturity, the school of hard knocks coughed up an ample supply of instructive bad examples.  We might not know what is best for all GIS environments, nor have a universal formula for assured success, but the growing layers of scar tissue in the GIS community clearly point to the paths not to follow.

 

Given this line of reasoning, let me describe an early experience in the application of GIS to land use planning.  It was a class project for a graduate course in GIS at Yale University in the spring of 1980.  The saga pits a naive and somewhat dim-witted assistant professor, backed by a covey of bright students, against an enraged portion of the populace of Guilford, Connecticut, a picturesque town along Long Island Sound.  But I am getting ahead of myself.  The early stages of the project were typically blissful, with focused energy on database development within the tender arms of academia.  The students feverishly encoded twenty data layers for the nearly 70-square-mile town, including the usual set from standard map sheets, augmented with special town maps such as zoning, sensitive soils, and land use.  This in itself was a great learning experience, given the pre-Paleolithic tools of the time.

 

Where we went wrong was an attempt to address a "real world" problem.  The town had recently completed its Comprehensive Plan of Development and Conservation as a requirement of the Coastal Wetlands Act.  It was the result of several years' effort by citizen groups and town officials.  The plan consisted of twenty-one policy statements, such as "protect inland wetlands ...from contamination and other modifications," "preserve farmlands," and "encourage development near or within existing developed areas."

 

Since all twenty-one of the statements had a spatial component, it seemed natural to map the conceptual model embodied in the plan.  Using a three-tier ranking scheme of suitable, less suitable and unsuitable, each policy statement was interpreted into a map of suitability for development.  For example, the policy to "preserve farmland" used the town's land use map to identify farmland and then assigned those areas as less suitable.  Similarly, the policy statement to "protect inland wetlands" caused these areas on the sensitive soil map to be designated as unsuitable.  In contrast, the areas near or within existing development indicated on the land use map were identified as suitable for development.  Following the plan's organization, the statements were grouped into four submodels of Water and Sewage, Growth, Preservation, and Natural Land Use, then combined into one overall suitability map.
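As a minimal sketch of the map algebra involved, the snippet below interprets a few hypothetical policy layers into three-tier ratings and combines them by keeping the most restrictive rating at each location.  The layer contents, the three policies shown and the most-restrictive combination rule are all illustrative assumptions; the actual Guilford model interpreted twenty-one statements grouped into four submodels.

```python
import numpy as np

# Hypothetical 3x3 grids standing in for encoded data layers (1 = present, 0 = absent);
# the real Guilford database held twenty layers for the whole town.
farmland  = np.array([[0, 1, 1], [0, 0, 1], [0, 0, 0]])
wetlands  = np.array([[1, 0, 0], [1, 0, 0], [0, 0, 0]])
developed = np.array([[0, 0, 0], [0, 1, 0], [1, 1, 1]])

SUITABLE, LESS_SUITABLE, UNSUITABLE = 3, 2, 1

# Interpret each policy statement as a three-tier suitability rating map.
preserve_farmland = np.where(farmland == 1, LESS_SUITABLE, SUITABLE)
protect_wetlands  = np.where(wetlands == 1, UNSUITABLE, SUITABLE)
encourage_infill  = np.where(developed == 1, SUITABLE, LESS_SUITABLE)

# Combine the policy maps into one overall suitability map by keeping the most
# restrictive (minimum) rating at each location -- an assumed combination rule.
overall = np.minimum.reduce([preserve_farmland, protect_wetlands, encourage_infill])
print(overall)
```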

 

Near the end of the term, enthusiasm was high and success seemed imminent.  That was until we hosted a town meeting at the local high school to present the results.  Students served refreshments and proudly stood by their computer-generated maps draping the walls.  As fledgling GIS technocrats, they were eager to enlighten the audience as to the importance of the technology and the elegance of the map analysis process.  However, the congregation seemed bored by the techno-babble and focused their collective attention on the final map of suitability.  Once they located their property (you know, the parcel they were holding to pay for Sonny's college tuition), they did one of two things-- 1) profusely thanked the students for an undoubtedly thorough job and promptly departed to relieve the baby-sitter, or 2) locked the last student in the reception line in animated debate and, once pried loose, sat down in seething hostility.  In less than a half-hour we had distilled our audience to a residue of enraged citizens holding "unsuitable" property.  We left about midnight and had to sneak back in the morning before basketball practice to recover what maps we could from the walls.

 

So what went wrong?  We had done our homework.  We had developed an accurate database.  We had conscientiously translated their policy statements into maps and integrated them as implied by their plan.  We thought we had done it all... and we had, from a GIS-centric perspective.  What we had missed is GIS's wildcard-- the human factor.  The textual rendering of the comprehensive plan was comfortably innocuous as it lacked threatening spatial specificity.  It seemed natural to outline a set of amorphous goals, then proceed with incremental planning whenever a developer proposes a specific parcel.  If contention arises, there are always planning variances, exceptions, mitigation, and the ultimate recourse of lawyers and judges.  This is the way things had always been done... the natural law of land use planning.  The idea of an actual map of the spatial ramifications of a comprehensive plan is akin to poking a stick into a den of rattlesnakes.  Any seasoned planner knows you plan, then move on before you implement... it's dangerous out there.

 

Being a slow learner and somewhat bent on self-flagellation, I decided to extend the project the following year.  First, the students refined both the database and the model, then determined the most limiting policy goals by systematically relaxing criteria in successive runs (sensitivity analysis).  Armed with this insight, we solicited the help of the three town commissions instrumental in the plan's development: the Economic Development Commission, the Planning and Zoning Commission and the Conservation Commission.  At working meetings, policy-rating questions were posed to each group and their hierarchical orderings of the policy statements were used for subsequent model runs.
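The sensitivity analysis itself is little more than a loop over repeated model runs.  A sketch, assuming the most-restrictive combination rule from the previous snippet and hypothetical rating maps in place of the real Guilford layers:

```python
import numpy as np

SUITABLE, LESS_SUITABLE, UNSUITABLE = 3, 2, 1
rng = np.random.default_rng(0)

# Hypothetical three-tier rating maps, one per policy statement, standing in
# for the refined model (values drawn at random purely for illustration).
policies = {f"policy {i + 1}": rng.integers(1, 4, size=(50, 50)) for i in range(5)}

def overall(rating_maps):
    """Most-restrictive (minimum) combination, as in the previous sketch."""
    return np.minimum.reduce(list(rating_maps))

base_share = np.mean(overall(policies.values()) == UNSUITABLE)

# Relax (drop) one policy per run; the largest shift flags the most limiting policy.
for name in policies:
    kept = [m for p, m in policies.items() if p != name]
    share = np.mean(overall(kept) == UNSUITABLE)
    print(f"relaxing {name}: unsuitable share {base_share:.0%} -> {share:.0%}")
```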

 

The results were three maps of overall suitability, expressing alternative interpretations of the plan.  For example, the Conservation Commission's interpretation of "protect inland wetlands" was emphatic.  Since it's damp just about everywhere, 83% of the town was deemed unsuitable for development.  The Economic Development Commission, on the other hand, believed sound engineering protects wetlands, thereby lowering the wetland policy's rating, which resulted in only 21% being unsuitable.  By simply subtracting the two maps, the locations of agreement and contention were easily identified.  The comparison map and the three alternative interpretations by the commissions were published in the local paper... "healthy a priori discussion ensued."  Most importantly, we minimized GIS student casualties.
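The comparison map boils down to a cell-by-cell subtraction.  A minimal sketch with made-up ratings (1 = unsuitable, 2 = less suitable, 3 = suitable) standing in for the commissions' actual interpretations:

```python
import numpy as np

# Hypothetical overall suitability maps for two commissions' interpretations.
conservation = np.array([[1, 1, 2], [1, 2, 3], [2, 3, 3]])
economic     = np.array([[3, 2, 2], [1, 3, 3], [3, 3, 3]])

# Subtracting the maps locates agreement (0) and contention (non-zero).
difference = economic - conservation
print(difference)

# Share of the town each interpretation deems unsuitable for development.
for name, ratings in [("Conservation", conservation), ("Economic Development", economic)]:
    print(f"{name}: {np.mean(ratings == 1):.0%} unsuitable")
```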

 

The Guilford experience has forever altered my perspective of what GIS is (and isn't).  Yes, it's hardware and software.  It's a database.  And GIS models.  But, in actuality, it is the domain of the end-user and those impacted.  Neither GIS Jerk nor Jock can "solve" someone else's concern with rapid geo-query and a palette of 64,000 colors draped over a 3-dimensional plot.  In real-world applications, GIS acts as a communication tool in understanding the important factors, their interactions and various interpretations of both.

_______________________

For more on this "watershed" experience, see Assessing Spatial Impacts of Land Use Plans, by Berry and Berry, 1988, in Journal of Environmental Management, 27:1-9; and Analysis of Spatial Ramifications of the Comprehensive Plan of a Small Town, by Berry et al., 1981, in the proceedings of the 41st Symposium, American Congress on Surveying and Mapping.

 

 

Developing an Understanding GIS 

(GeoWorld, August 1996)

(return to top of Topic)

 

Effective GIS applications have little to do with data and everything to do with understanding, creativity and perspective.  It is a common observation of the Information Age that the amount of knowledge doubles every 14 months or so.  It is believed that, with the advent of the information superhighway, this doubling will only accelerate.  But does more information directly translate into better decisions?  Does the Internet enhance information exchange or overwhelm it?  Does the quality of information correlate with the quantity of information?  Does the rapid boil of information improve or scorch the broth of decisions?

 

GIS technology is a prime contributor to the landslide of information, as we feverishly release terabytes of mapped data on an unsuspecting (and seemingly ungrateful) public.  From a GIS-centric perspective, we are doing a bang-up job.  Lest I sound like a malcontent, let me challenge that observation.  My perspective might not meet the critical eye of a good philosopher, but that's not the objective.  The thoughts simply explore the effects of information rapid transit on our changing perceptions of the world around us.

 

First, let's split hairs on some important words borrowed from the philosophers-- data, information, knowledge, and wisdom.  You often hear them interchangeably, but they are distinct from one another in some subtle and not-so-subtle ways.

 

The first is data, the "factoids" of our Information Age.  Data are bits of information, typically, but not exclusively, in numeric form, such as cardinal numbers, percentages, statistics, etc.  It is exceedingly obvious that data are increasing at an incredible rate.  Coupled with the barrage of data is a requirement for the literate citizen of the future to have a firm understanding of averages, percentages and, to a certain extent, statistics.  More and more, these types of data dominate the media and are the primary means used to characterize public opinion, report trends and persuade specific actions.

 

The second term, information, is closely related to data.  The difference is that we tend to view information as more word-based and/or graphic than numeric.  Information is data with explanation.  Most of what is taught in school is information.  Because it includes all that is chronicled, the amount of information available to the average citizen substantially increases each day.  The power of technology to link us to information is phenomenal.  As proof, simply "surf" the exploding number of "home pages" on the Internet.

 

The philosophers' third category is knowledge, which can be viewed as information within a context.  Data and information that are used to explain a phenomenon become knowledge.  Knowledge probably does not double nearly as fast, but that has more to do with the learner and processing techniques than with what is available.  In other words, knowledge is data and information once we can process and apply it.

 

The last category, wisdom, certainly does not double at a rapid rate.  It is the application of all three previous categories, plus some intangible additions.  Wisdom is rare and timeless, and is important because it is rare and timeless.  We seldom encounter new wisdom in the popular media, nor do we expect a deluge of newly derived wisdom to spring forth from our computer monitors each time we log on.

 

Knowledge and wisdom, like gold, must be aggressively processed from tons of near-worthless overburden.  Simply increasing data and information does not assure increasing amounts of the knowledge and wisdom we need to solve pressing problems.  Increasing the processing "throughput" through efficiency gains and new approaches might.

 

OK, how does this philosophical diatribe relate to GIS technology?  What is our role within the framework?  What do we deliver-- data, information, knowledge or wisdom?  Actually, if GIS is appropriately presented, nurtured and applied, we can affect all four.  That is, provided we recognize technology's role as an additional link that the philosophers failed to note.

 

Understanding sits at the juncture between information and knowledge.  Understanding involves the honest dialogue among various interpretations of data and information in an attempt to reach common knowledge and wisdom.  Note that understanding is not a "thing," but a process.  It's how concrete facts are translated into the slippery slope of beliefs.  It involves the clash of values, tempered by judgment based on the exchange of experience.  Technology, and in particular GIS, has a vital role to play in this process.  We need not only to deliver spatial data and information, but also a methodology for translating them into knowledge and wisdom.

 

Our earliest encounters with GIS viewed maps as "images," with automated cartography providing rapid updating and redrafting of traditional map products.  The field quickly progressed from computer mapping to spatial database management by focusing on the derivation and organization of mapped data.  This phase provides efficient storage and retrieval of vast amounts of land-based data in both tabular and graphic form.  From this view, GIS acts like a "cash register" to record transactions on the landscape.  More recently, GIS is viewed as a "toolbox" of map analysis operations in which entire maps are treated as variables and related within a specific context.  It is the GIS toolbox that translates mapped data into spatial information.

 

Tomorrow's GIS builds on the cognitive basis, as well as the spatial databases and analytical operations of the technology.  This new view pushes GIS beyond data mapping, management and modeling, to spatial reasoning and dialogue focusing on the communication of ideas.  In a sense, GIS extends the toolbox to a "sandbox," in which alternative perspectives are constructed, discussed and common knowledge and wisdom flows.  

 

This step needs to fully engage the end-user in GIS itself, not just its encoded and derived products.  It requires a democratization of GIS that goes beyond graphical user interfaces and attractive icons.  It requires the GIS priesthood and technocrats to relish the opportunity to explain concepts in layman's terms and to provide access to the conceptual expressions of geographic space through intuitive means divorced from macro code.

 

I hope we consider the importance of knowledge and wisdom in the Information Age, and eagerly grasp the opportunity GIS has to contribute to their derivation.  I fear that GIS "factlets" masquerading as knowledge will mask the importance of wisdom.  I fear that our all-consuming focus on maps and "home pages" on the Internet will distract from assimilating the significance embedded in spatial information and communicating the ideas it spawns.  GIS has an opportunity to empower people with new decision-making tools, not simply entrap them in a new technology and an avalanche of data.  What we have accomplished is necessary, but not sufficient, for effective GIS solutions.

 

Like the automobile and indoor plumbing, GIS won't be an important technology until it fades into the fabric of society and is taken for granted.  It must become second nature for both accessing information and translating it into knowledge... we must refocus its emphasis beyond mapping to that of spatial reasoning.

 

 

Both Dreams and Nightmares are Born of Frustration 

(GeoWorld, May 1992)  

(return to top of Topic)

 

The dream is that GIS can do anything... the reality is that it isn't easy.  With increasing fervor, technologists and users alike define and redefine the "unlimited" potential of GIS technology.  These dreams are, at least in part, an expression of our hopes, as well as our science.  When considering whether GIS is for you, often your biggest challenge is to carefully separate what you hear into two distinct piles-- the quixotic dream and the pragmatic reality.

 

Your first step in this process is establishing "where you are coming from."  GIS means different things to different people.  At least four distinct perspectives flavor both our expectations and our realities-- economic, organizational, visionary and emotional.  The economic perspective is usually based on labor and time-savings considerations.  Standard cost/benefit analysis is particularly appropriate in distilling the dreams from reality.  A careful audit of your organization's current mapping and spatial data handling procedures establishes a reference to estimate the savings in moving "from pen to plotter and from file drawer to keyboard."  If the savings are greater than the expenditures, you are economically irrational (foolish) if you don't implement GIS immediately.

 

There, that's easy.  There is nothing to it.  Just call in the accountants and they will identify the numbers to plug into the Cost/Benefit equation.  The reality is that even a strictly economic perspective is not that easy.  The comfortable feeling of quantifying the evaluation process is quickly lost to the pliable nature of the "yardsticks" used to measure the costs and benefits. 

 

The time-span used in the analysis is critical.  If it is too short, the stream of benefits is artificially truncated.  The high front-end costs, combined with the confusion and frustration of implementing a new system, will far outweigh the benefits.  It's like a bare-knuckle battle between Sylvester Stallone and a tiger cub.  If it is delayed a few years, the outcome will likely be different.  If you had used a two-week cost recovery period for word processing, would you have ever dropped your pencil? 

 

So what time period should be used?  That's a judgment call-- your judgment call.  Like lying with statistics, you can choose the time period that ensures the answer you want.  In general, a long-term position favors the adoption of GIS.
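A toy calculation makes the point.  The figures below are entirely hypothetical-- heavy front-end costs, modest recurring costs, and a benefit stream that grows as applications are discovered-- but they show how the chosen horizon steers the verdict:

```python
# Hypothetical figures (in $1,000s) purely to illustrate the time-span effect.
startup_cost = 400      # hardware/software, database development, initial training
annual_cost = 50        # maintenance, administration, continuing training
first_year_benefit = 40
benefit_growth = 1.3    # benefits compound as new applications are discovered

def net_benefit(years):
    """Total benefits minus total costs over the chosen time span (no discounting)."""
    benefits = sum(first_year_benefit * benefit_growth ** t for t in range(years))
    costs = startup_cost + annual_cost * years
    return benefits - costs

for horizon in (2, 5, 10):
    print(f"{horizon:2d}-year horizon: net benefit = {net_benefit(horizon):7.0f}")
# A two-year horizon shows a resounding loss; a ten-year horizon favors adoption.
```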

 

Just as important (and "mushy") is how you identify and quantify the variables of the cost/benefit equation.  Four cost considerations quickly surface-- hardware/software, database development/administration, training and application models.  The hardware figures are the easiest to quantify through a litany of parameters including megahertz, gigabytes, RAM, SIMMs and MIPS (DIPs, DRIPs and SLIPs).  The software specifications are a bit more difficult, yet factors such as point-in-polygon, buffering, coordinate accuracy and transfer formats can be used.

 

Although relatively easy to quantify, these figures are fleeting and set you up for a bad case of "buyer's remorse."  About the time you finally push through your procurement and take first delivery, your system is out of date.  It's like that pocket calculator.  Within a couple of months, the same expenditure gets you five more keys at half the price.  The difficulty in nailing down the hardware/software cost component isn't in the definitions, it is keeping your footing in the quicksand of technology.  Like shooting ducks, you had better have a good lead on your target.  For large, bureaucratic organizations, it may be prudent to just set a budgetary figure for the "best available technology" and postpone the specifications to the moment of purchase.  That may seem preposterous, but it may be more realistic.

 

Database development, maintenance and management is not only a larger expense than hardware and software, it is even trickier and more slippery to estimate.  Rarely does a simple inventory of your current map and file cabinets, multiplied by an estimate of encoding costs, produce an acceptable cost figure.  The differences between the digital and the paper map make such a mechanical approach too simplistic.  It's prudent to launch an Information Needs Assessment (INA) to determine database contents, structure, policy and costs (a later issue will focus on this process).

 

Even if you do get a good handle on the database you must develop, you're not out of the woods yet.  How you obtain these data is slippery turf.  Manual encoding, scanning or purchasing are your basic options.  Not so long ago, in-house manual encoding was your only option.  More recently the scales have been tipping toward scanning and purchasing, as a room full of digitizer folks is a major cost and a distraction from normal business activities.  Also, many of the maps you might encode have time-bombs ticking within them.  For example, if you encode (in-house or by contract) a soils map, it will become invalid once the Soil Conservation Service's "authoritative" version is released.  It's back to shooting ducks-- you had better get your data requirements in line and lead them, or you will just be pumping pellets into the air.

 

The costs of training your people to use GIS can easily outstrip the combined costs of hardware/software and database development.  Early successes in using GIS were often more a function of the zealots using it than the technology itself.  Like The Little Engine That Could, GIS could do a lot.  The pitfalls that accompany any new technology are overcome by innovative "work-arounds" of committed users.  In a wholesale adoption, however, the user community is expanded to include "I don't think I can" and "I am damned if I will" outlooks.

 

One reaction to this reality is to form a GIS division.  On the surface it is a plausible alternative.  All you have to do is train a small cadre of experts.  There, that's both efficient and effective.  But it rarely works for two reasons. 

 

First, the GIS product produced is just that-- a GIS product, not the direct expression of the end-user.  In the late 1970's I had an opportunity to observe a large timber company's centralized implementation of GIS.  Most of the field personnel merely dismissed the "computer jerk's" forest management maps handed to them through the glass windows of the computer center.  "What do they know about the @#*^! forest anyway?" was the rallying cry.  If the maps were used at all, they became the center of attention for the short period it took to locate that "one" forest stand in the middle of a lake.  This meant wasted effort on both the GIS and user sides, a situation that could have been helped with sufficient investment in training.

 

If costs of training are identified at all, they are usually associated with vocational instruction on system operations.  But GIS is a challenging new way of thinking, as well as a new sequence of buttons to push.  The mechanics of translating what you currently do with maps into a GIS is straightforward.  In fact, colorful icons and mouse clicking can make it almost fun.  However, most of the potential GIS applications within an organization are yet to be discovered.

 

The development of application models is the other reason for failure of a centralized approach.  How the new technology leads to new ways of doing things is the least understood cost (and benefit) of GIS technology.  It's like your son or daughter dumping the tin of tinker toys on the floor.  The mechanics of how the pieces fit together is fairly simple.  What ought to be built with the individual pieces is the difficult part.  The tinker toy makers (viz., the GIS experts) can supply some ideas, but they certainly do not cover all of the possibilities.  Vocational training develops an awareness of the GIS "owner's manual" description of the pieces and parts, but beware-- "some assembly is required" before you are up and running.

 

The creative assembly is entirely up to your people.  If you ignore or skimp on training and application model development, you will incur opportunity costs at the minimum.  More likely, you will generate a backlash of confusion and apprehension that quickly outweighs the benefits you identified.  A couple of strategically placed anti-GIS terrorists will wreak havoc with even your best-laid plans.

 

A strict economic perspective is the first step in scoping GIS technology.  Identification (and ultimately quantification) of the costs and benefits sets the stage.  However, organizational, visionary and emotional perspectives are needed to complete the picture-- whether a dream or a nightmare.  That gives us something to discuss in the next issue.

 

 

GIS Is Never Having to Say You Are Sorry 

(GeoWorld, June 1992)

(return to top of Topic)

 

Most organizations begin what seems to be a thousand-mile journey to GIS implementation with an economic cost/benefit analysis.  At first glance, the seductive appearance of a rigorous, quantitative analysis is quickly lost to the pliable nature of the "yardsticks" used to measure the costs and benefits.  At best, a cost/benefit analysis sets the stage for further investigation into the full impact of implementing a GIS.  Even the most favorable C/B ratio should be further scrutinized in terms of the organizational and human impacts of GIS.  Whether real or imagined, the perceived threats of GIS technology form the actual minefield that you must traverse.

 

The organizational structure (both formal and informal) is an important concern, as it is the direct expression of the "corporate character"-- the most basic element of any organization.  If extensive individual latitude and autonomy best describes the current character, GIS will likely have a rocky road to implementation.  Within this environment, data often are viewed as the medium of exchange for power brokers at all levels.  Simply stated, "…if you must pass through me to get to important data in my map cabinets and file drawers, then I am as important as the data I keep."

 

However, if GIS places my data in some central repository accessible to all by a single mouse click, my corporate worth has been severely devalued.  The result, as viewed by some, is an electronic end-run around the current data gatekeepers and a direct assault on the existing organizational structure.  It may be a benefit to the organization to have a corporate data base, but to many it represents a personal loss of influence.  If your implementation plan ignores this reality, you'll be sorry.

 

Another concern that may run amok with the corporate character is the imposition of data standards.  In many organizations, mapping standards are either non-existent or merely address geographic registration and data exchange formats.  But this is just the tip of the chilling iceberg of standards.  The ability to export a map from one GIS package and swallow it in another is basic and rapidly becoming a non-issue.  Likewise, the ability to convert projections, rectify and register maps is commonplace (although not necessarily easy).  The confusion and frustration isn't in the locational (where) set of standards, but in the informational (what) set.

 

A corporate database consists of three levels of maps based on their degree of abstraction-- base, derived and interpreted.  Base maps are usually physical data we collect, such as roads, water and ownership boundaries.  They have minimal abstraction and, as much as possible, represent a scale model with all of the detail of a flattened model train set.  Definitions and procedures for mapping most of these data are in place... but not all.

 

Consider a map of cover type.  Is Forest/Non-Forest a sufficient standard?  Or should the Forest class be further divided into Conifer and Deciduous?  And the Conifer, in turn, subdivided into Pine, Fir and Hemlock?  What about age and stocking classes?  Should you identify a lone pine tree in the middle of a meadow as a Conifer Stand?  Two, three, four, five trees— what does it take to form a forest stand?  Ask a forester, ecologist and recreation scientist and you'll get at least three different responses.  Or maybe four or five different definitions depending on how different applications decipher the landscape.  You'll be sorry if you don't tackle these questions before you implement GIS. 

 

For example, a wildfire had the audacity to burn across the boundary of two National Forests.  Maps of cover type were encoded for both Forests, but they couldn't be edge-matched.  One Forest had six classes of age and stocking for Douglas fir, the other had eight.  The GIS was able to account for location adjustments during encoding, but not the differences in informational content.  A common classification standard for cover type had to be established and encoded.  The struggle over whose classification scheme was best eclipsed the mundane tasks of reconstructing and encoding a compatible cover type map.  The challenges to human and organizational interests run much deeper than those encountered at the digitizing tablet.
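Mechanically, reconciling the two schemes is nothing more than a crosswalk table applied to each map before mosaicking-- the hard part is agreeing on the common classes.  The class codes and names below are invented for illustration; the article does not give the Forests' actual schemes.

```python
# Hypothetical crosswalks from each Forest's Douglas fir age/stocking codes
# to an agreed common standard (all codes and class names are made up).
forest_a_to_common = {1: "young-sparse", 2: "young-dense", 3: "mature-sparse",
                      4: "mature-dense", 5: "old-sparse", 6: "old-dense"}
forest_b_to_common = {1: "young-sparse", 2: "young-sparse", 3: "young-dense",
                      4: "mature-sparse", 5: "mature-dense", 6: "mature-dense",
                      7: "old-sparse", 8: "old-dense"}

def reclassify(cover_map, crosswalk):
    """Translate one Forest's cover type codes into the common classification."""
    return [[crosswalk[code] for code in row] for row in cover_map]

# Once both maps speak the common standard, edge-matching is a mechanical mosaic.
print(reclassify([[2, 3], [5, 6]], forest_a_to_common))
print(reclassify([[3, 4], [7, 8]], forest_b_to_common))
```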

 

Vested interests in the definitions of map categories go beyond base data.  Derived maps, such as slope, visual exposure and proximity to roads, are physical things.  However, the data are too difficult to collect, so we use the computer to calculate them.  Even something as simple as slope calculation has several algorithms, each with its pros and cons.  For something as complex as visual exposure, there is a quagmire of assumptions, approaches and procedures.  Which will you entrench in your system?  Rest assured that the choice won't be made by consensus, nor will the dissenting voices be reserved.
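To make the slope point concrete, here is a sketch comparing two common neighborhood estimates on a small, hypothetical elevation grid.  Both are standard map-algebra techniques, not necessarily the ones any particular package entrenches, and the elevations and cell size are made up.

```python
import numpy as np

# Hypothetical 5x5 elevation grid (meters); cell size 30 m.
elev = np.array([[100, 101, 103, 106, 110],
                 [101, 102, 104, 107, 111],
                 [103, 104, 106, 109, 113],
                 [106, 107, 109, 112, 116],
                 [110, 111, 113, 116, 120]], dtype=float)
cell = 30.0

# Algorithm 1: simple central differences (2 neighbors per axis) for interior cells.
dz_dx = (elev[1:-1, 2:] - elev[1:-1, :-2]) / (2 * cell)
dz_dy = (elev[2:, 1:-1] - elev[:-2, 1:-1]) / (2 * cell)
slope_simple = np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))

# Algorithm 2: Horn's 3x3 weighted method (an 8-neighbor scheme used by many packages).
z = elev
gx = ((z[:-2, 2:] + 2 * z[1:-1, 2:] + z[2:, 2:]) -
      (z[:-2, :-2] + 2 * z[1:-1, :-2] + z[2:, :-2])) / (8 * cell)
gy = ((z[2:, :-2] + 2 * z[2:, 1:-1] + z[2:, 2:]) -
      (z[:-2, :-2] + 2 * z[:-2, 1:-1] + z[:-2, 2:])) / (8 * cell)
slope_horn = np.degrees(np.arctan(np.hypot(gx, gy)))

# The two estimates differ slightly cell by cell -- which one gets "entrenched"
# in the corporate database is exactly the kind of choice the text describes.
print(np.round(slope_simple - slope_horn, 3))
```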

 

Even more volatile are the assumptions embedded in interpreted maps.  These data are the most abstract, as they are conceptual renderings of expert opinion.  Taunts of "my elk habitat model is better than yours" reverberate through the halls whenever two wildlife ecologists are cornered in the same room.  It is naive to assume that an elk model will edge-match across two forests, much less an entire region.  And certainly not across the paradigm chasm of two experts. 

 

So whose derived and interpreted maps capture the standards in the corporate data base?  The question of standards runs a lot deeper than just geographic registration and encoding effort.  It involves organizational and individual perceptions, reputations and vested interests.  You'll be sorry if your implementation plan ignores these elements.  Sure, they will get sorted out later-- after you and the GIS system fail.

 

A GIS implementation strategy has to go beyond simply scoping system design to nurturing a receptive environment.  This passes the baton from the system engineers and GIS specialists to the sociologists and human relations professionals.  As this column continually reminds (possibly to the point of being shrill), GIS is not just automating what you do, but changing how you do things.  Sensitivity to the full impact of these changes, human as well as procedural, is paramount.

 

Figure 1.  Institutional and Individual Threats and Responses.

 

Figure 1 outlines some of the threats and responses that need to be addressed.  The outline is designed to stimulate discussion in a workshop setting, but hopefully it will trip some thoughts in your mind as well.  As you look over the outline, try some "free associations" with the points.  Conjure up some of your own threats and possible coping responses.  It is a lot of fun at the workshops and sparks a broader perspective on GIS implementation.  At minimum, the exercise should encourage you to go beyond a focus on the mechanics of GIS technology to its institutional and human implications... if you don't, you'll be sorry.

_____________________

 

 

(return to top of Topic)

 (Back to the Table of Contents)