
Selected Readings on the Human Side of Information Technology
Author(s)/Editor(s): Edward J. Szewczak (Canisius College, USA)
Copyright: ©2009
DOI: 10.4018/978-1-60566-088-2
ISBN13: 9781605660882
ISBN10: 1605660884
EISBN13: 9781605660899


Description

Selected Readings on the Human Side of Information Technology supplements course instruction and student research with quality articles focused on key issues concerning the behavioral and social aspects of information technology. Containing over 30 chapters from authors across the globe, these selected readings in areas such as user behavior, human-computer interaction, and social computing depict the most relevant and important areas of classroom discussion within the categories of Fundamental Concepts and Theories; Development and Design Methodologies; Tools and Technologies; Application and Utilization; Critical Issues; and Emerging Trends.



Table of Contents


Preface

I’m always trying to stave off technology. I can’t help it. I have this deep-seated feeling that technology somehow is anti-human, anti-biological, anti-relationship. When am I going to realize that it doesn’t matter? I can’t stop it… Now maybe we staver-offers just have to quit wanting so much control. Couldn’t we just allow new ideas and things into our lives? And let the folks who enjoy them – even ourselves – embrace them? Even the word sounds more graceful: embrace.
-- Mary Scalzi, My View, The Buffalo News (February 12, 2008)

Mary Scalzi’s perspective on technology intrigues me. Humans have a love-hate relationship with machines. Machines are useful. They can do some things better than humans can. But there is something about machines that humans do not entirely trust. Charlie Chaplin’s antics in Modern Times were played out against a background of dead seriousness. When Frederick Winslow Taylor fathered scientific management – an attempt to apply the scientific method to the management of workers, thereby rendering the worker more machine-like – workers protested that Taylorism was dehumanizing. Of course, Modern Times and Taylorism took place in the world of low technology. Are the same things true about humans and machines in the world of high technology? The quote above suggests that perhaps they are, at least for some humans.

In late 1993 I was on the phone with Mehdi Khosrowpour, the founder and head of the Information Resources Management Association (IRMA), who during the course of the conversation asked me to suggest a new track for the Association’s annual conference, which would be meeting for only the fifth time in San Antonio, Texas, the following spring. Not wanting to appear daft, I blurted “the human side of IT.” The phrase stuck and is still being used today at IRMA conferences.

When IGI Global asked me to contribute this prologue, I was somewhat hesitant at first to commit to the project. My hesitancy was based on what I believe is a natural uneasiness about revisiting in memory events that happened some fifteen years ago. After all, memory is fallible and, like recalling history, subject to invention. Yet at the same time the prospect of recalling why I coined the phrase in the first place, on the fly as it were, intrigued me. What influences were at work in the recesses of my mind at the time? After some reflection, I believe there were three – why information systems fail, the Minnesota experiments, and human factors.

As a student of MIS in the 1980s, I encountered research efforts that impressed upon me the importance of focusing on the role of people in systems. Henry Lucas’ study of why information systems fail (Lucas, 1975) is a case in point. In particular, his finding that human behavior is at least as important as technical excellence to the success of an information system left a lasting impression on me. It is easy to be dazzled by the stream of technological innovations in hardware and software. However, systems are built for people. It is how people react to technology that is of fundamental importance to systems’ success. Today the concept of the “operational feasibility” of a systems idea is included in all textbooks on systems analysis and design. To be assured that a systems idea has a fighting chance of succeeding, the people for whom the system is being developed must ultimately accept the system and use it in the manner it was developed to be used.

The “Minnesota Experiments” (Dickson, Senn & Chervany, 1977) were another influence on my thinking about people and systems. These experiments were conducted to study the effect of IS characteristics on decision activity. Decision effectiveness was seen as dependent on the decision maker, the decision environment, and the characteristics of the IS. One aspect of these experiments appealed to me greatly at the time – the consideration of individual differences, especially the concepts of psychological type and cognitive style. That individuals may gather information differently and then use that information in different ways opened up possibilities for studying people and systems that seemed limitless. My earlier college course work in psychology and sociology suddenly seemed truly meaningful and useful. I even thought briefly about incorporating psychological types into my dissertation work – until I realized that my dissertation committee would not be amused. In any event, individual differences and decision making continue to interest me to this day.

“Human factors” as a field of study began to address the need to understand how people are affected by and respond to IT. A good definition of human factors is provided by Beard & Peterson (1988, pp. 12-13): “Human factors is the scientific study of the interaction between people, machines, and their work environment. The knowledge gained from this study is used to create systems and work environments which help to make people more productive and more satisfied with their work life.” To more clearly define major research themes on human factors, Beard and Peterson (1988) divided the existing research into five categories: human-machine interaction (how people and computers communicate); interface specification tools (formal techniques for the design of the focus of interaction between people and computers); information presentation (the way data are displayed to the user, including graphic, numeric, alphanumeric, tabular, textual, audible, and tactile forms); system user documentation (producing documentation in a form and style suitable to the expert or novice system user); and end-user involvement (methods used to involve users directly in the various stages of system development).

These five fundamental categories include many different topics and issues relevant to the human side of IT. These topics and issues are often featured as tracks at professional conferences or form the focus of special issues of scholarly journals. Prior to my telephone conversation with Mehdi Khosrowpour, I had noted that all of the tracks in the earlier IRMA conferences had a technological focus and that a new track emphasizing the human aspects of systems was sorely needed. I can’t recall whether I was asked to suggest a list of topics that would be appropriate for the track, but if I did, I’m sure the list was short, general in nature, and woefully inadequate to the task. In any event, the Human Side of IT track was created for the 5th Information Resources Management Association International Conference. If the number of paper contributors is any indication of success, the new track focused on people and IT was a big hit. It was evident that many researchers had an interest in the human side of IT and wanted an outlet for their work. I’m still not sure what the boundaries of the human side of IT may be. I do know that they are not technological.

The new-found popularity of the Human Side of IT track led to the publication of three books of readings: Szewczak, E. & Khosrowpour, M. (Eds.), The Human Side of Information Technology Management; Szewczak, E. & Snodgrass, C. (Eds.), Managing the Human Side of Information Technology: Challenges and Solutions; and Szewczak, E. & Snodgrass, C. (Eds.), Human Factors in Information Systems. Contributors to these books of readings canvassed a wide range of topics, including but not limited to human-centered methods in IS, user satisfaction in IS, IT assimilation, computer anxiety, the impact of office automation on user health and stress, multimedia computing in support of knowledge work and group collaboration, attitudes toward and use of computer-mediated communication, student personality traits and expert systems, cultural diversity and group DSS, IT and privacy, IT and leadership, the cultural characteristics of IT professionals, motivation for using IT, individual differences and computer attitudes, and ethics and IT. This wide range of topics is indicative of the many avenues of research afforded to scholars interested in the human side of IT. It should be noted that the continual introduction of new technologies will provide ample opportunities for researchers to further explore the human side of IT and to add to the list of relevant topics.

Professional conferences are an ideal outlet for researchers to present their work on a wide range of different, often only distantly related, topics. But more focused efforts in the field of human factors in particular have yielded results of both theoretical and practical importance. The work of Ben Shneiderman at the University of Maryland is a notable case in point. He has compiled an impressive collection of results from many different researchers (including himself) focusing on the dimensions of human-computer interaction that are relevant to researchers and practitioners in many fields of endeavor, including computer science, psychology, library and information science, business and information systems, educational technology, communication arts and media studies, and technical writing and graphic design. According to Shneiderman (1998, p. 10), “[e]ffective systems generate positive feelings of success, competence, mastery, and clarity in the user community. The users are not encumbered by the computer and can predict what will happen in response to each of their actions. When an interactive system is well designed, the interface almost disappears, enabling users to concentrate on their work, exploration, or pleasure. Creating an environment in which tasks are carried out almost effortlessly and users are ‘in the flow’ requires a great deal of hard work from the designer.”

What is involved in the “great deal of hard work from the designer”? To begin with, the designer must recognize the human diversity involved in the environment in which the system is to be used. Are the users novices or first-time users, knowledgeable intermittent users, or expert frequent users? What tasks will the users be expected to perform? For example, in a medical clinic, will users be expected to perform queries by patient, update data, perform queries across patients, add relations, and/or evaluate the system? The designer must then consider the various interaction styles available. Is direct manipulation using a mouse appropriate, or do users feel better entering commands directly using a keyboard? Will menu selection be acceptable, or will form fillin be more desirable? However these questions are answered, the designer should adhere to the eight golden rules of interface design, adapted from Shneiderman (1998, pp. 74-75); a brief sketch illustrating two of the rules follows the list:

Rule #1: Strive for consistency. Consistent sequences of actions should be required in similar situations; identical terminology should be used in prompts, menus, and help screens; and consistent color, layout, capitalization, fonts, and so on should be employed throughout. Exceptions, such as no echoing of passwords or confirmation of the delete command, should be comprehensible and limited in number.

Rule #2: Enable frequent users to use shortcuts. As the frequency of use increases, so does the user’s desire to reduce the number of interactions and to increase the pace of interaction. Abbreviations, special keys, hidden commands, and macro facilities are appreciated by frequent knowledgeable users. Short response times and fast display rates are other attractions for frequent users.

Rule #3: Offer informative feedback. For every user action, there should be system feedback. For frequent and minor actions, the response can be modest, whereas for infrequent and major actions, the response should be more substantial. Visual presentation of the objects of interest provides a convenient environment for showing changes explicitly, for example, using a direct manipulation interface.

Rule #4: Design dialogs to yield closure. Sequences of actions should be organized into groups with a beginning, middle, and end. The informative feedback at the completion of a group of actions gives users the satisfaction of accomplishment, a sense of relief, the signal to drop contingency plans and options from their minds, and an indication that the way is clear to prepare for the next group of actions.

Rule #5: Offer error prevention and simple error handling. Design the system such that users cannot make a serious error; for example, prefer menu selection to form fillin and do not allow alphabetic characters in numeric entry fields. If users make an error, the system should detect the error and offer simple, constructive, and specific instructions for recovery. Erroneous actions should leave the system state unchanged, or the system should give instructions about restoring the state.

Rule #6: Permit easy reversal of actions. Actions should be reversible. This feature relieves anxiety, since the user knows that errors can be undone, thus encouraging exploration of unfamiliar options.

Rule #7: Support internal locus of control. Experienced users strongly desire the sense that they are in charge of the system and that the system responds to their actions. Surprising system actions, tedious sequences of data entries, inability or difficulty in obtaining necessary information, and inability to produce the action desired all build anxiety and dissatisfaction.

Rule #8: Reduce short-term memory load. The limitation of human information processing in short-term memory requires that displays be kept simple, multiple page displays be consolidated, window-motion frequency be reduced, and sufficient training time be allotted for codes, mnemonics, and sequences of actions. Where appropriate, online access to command-syntax forms, abbreviations, codes, and other information should be provided.
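
To make Rule #5 and Rule #6 concrete, here is a minimal sketch in TypeScript of one way a designer might combine error prevention (rejecting non-numeric input before it changes anything) with easy reversal (an undo stack). The NumericField and UndoStack classes are hypothetical illustrations, not something taken from Shneiderman’s text.

```typescript
// A minimal sketch of Rules #5 and #6: prevent errors at the point of entry
// and keep every accepted change reversible. Class names are illustrative.

class UndoStack<T> {
  private history: T[] = [];

  // Record the current state before a change so it can be restored later (Rule #6).
  push(state: T): void {
    this.history.push(state);
  }

  // Return the most recent prior state, or undefined if there is nothing to undo.
  undo(): T | undefined {
    return this.history.pop();
  }
}

class NumericField {
  private value = 0;
  private undoStack = new UndoStack<number>();

  // Rule #5: reject non-numeric input with a specific, constructive message
  // and leave the field's state unchanged.
  enter(raw: string): string {
    if (!/^-?\d+(\.\d+)?$/.test(raw.trim())) {
      return `"${raw}" is not a number; please enter digits only.`;
    }
    this.undoStack.push(this.value); // remember the state before changing it
    this.value = Number(raw);
    return `Value set to ${this.value}.`; // Rule #3: informative feedback
  }

  // Rule #6: a single, predictable way to reverse the last accepted action.
  undo(): string {
    const previous = this.undoStack.undo();
    if (previous === undefined) {
      return "Nothing to undo.";
    }
    this.value = previous;
    return `Value restored to ${this.value}.`;
  }
}

// Example: an invalid entry changes nothing; a valid one can be undone.
const field = new NumericField();
console.log(field.enter("abc")); // constructive error message, state unchanged
console.log(field.enter("42"));  // "Value set to 42."
console.log(field.undo());       // "Value restored to 0."
```

The point of the sketch is that the invalid entry never alters the field’s state, so there is nothing for the user to repair, while every accepted change remains reversible through one predictable undo action.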

As straightforward and “obvious” as these guidelines may seem today, they are the result of years of careful study and research in the area of human factors. They have also been implemented in modern interface technologies, especially graphical user interface (GUI) technologies. As impressive as these research results are, as an educator I find that the idea that people are at least as important as technology in systems is a hard sell to IS students. After all, IS programs at colleges and universities are fundamentally focused on various aspects of technology – hardware, software, database management systems, telecommunications, Web site development, Internet security, etc. IS graduates need to be technologically proficient in order to compete successfully in the job market. Even in the area of systems analysis and design there are many tools and techniques that are fundamentally technical that IS students need to master.

Still, when I teach systems analysis and design, I cannot resist relating an experience that speaks to the role of people in systems. When I was a Ph.D. student, one of my jobs was to work with the MBA Program Director (“PD”) on the redesign of the MBA admissions program. The program used a database management system called System 1022 through a COBOL interface. “PD” wanted me to rewrite the COBOL interface in the language of System 1022. For me this was an enjoyable technical exercise that involved taking 30-plus pages of COBOL and reducing the program to just 7 pages of System 1022 code with system enhancements that were transparent to the system user. “PD” was delighted with the results. But delight soon turned to misery when “PD” tried to convince people in the MBA office to change the way they did things. “PD” learned firsthand the power of “resistance to change” over the course of the 12 months it took to convince people that the new system was actually a good thing for all concerned and that the old COBOL interface was really a bad thing. (Thankfully, I was not involved with the people issue.) Also, “PD” probably learned a few things on the job about the importance of “user involvement” during systems development.

Yet most students readily respond to topics related to the human side of IT. For example, the topic of personal information privacy and IT has great interest for me. It has been my observation that most students are very much aware of the information privacy issue and that most do care about keeping their personal information private. Take, for instance, the topic of “cookies.” I explain that cookies are small text files that are surreptitiously stored on the students’ computer hard disks when they visit websites on the Internet and that can be used to track which websites and pages they have visited. I further explain why the cookies are created and stored: namely, so businesses can use the information in the cookies to develop “profiles” of the students for use in, among other possible things, creating marketing pitches in the form of things like pop-up ads. When I’m finished, most students want to know how to get rid of the cookies, so I explain to the best of my ability the many approaches to managing cookies that are available today, including the use of anti-spyware programs and even Microsoft’s Internet Options settings under the browser’s Tools menu.
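
Because the mechanics are simple enough to show directly, here is a minimal sketch in TypeScript (assuming it runs in a browser page) of how a site’s script can set, read, and delete a cookie through the standard document.cookie interface. The cookie name lastVisitedPage is a hypothetical example; in practice, the profiling cookies discussed above are often set by third-party advertising servers rather than by the page’s own script.

```typescript
// A minimal sketch (browser environment assumed) of setting, reading, and
// deleting a cookie via the standard document.cookie interface.
// The cookie name "lastVisitedPage" is a hypothetical example.

// Setting a cookie: the browser stores it and sends it back with every later
// request to the same site, which is what makes building a profile possible.
function rememberPage(page: string): void {
  const expires = new Date(Date.now() + 30 * 24 * 60 * 60 * 1000); // 30 days from now
  document.cookie =
    `lastVisitedPage=${encodeURIComponent(page)}; ` +
    `expires=${expires.toUTCString()}; path=/`;
}

// Reading cookies back: document.cookie exposes this site's cookies as one
// "name=value; name=value" string.
function readLastPage(): string | null {
  const entry = document.cookie
    .split("; ")
    .find((c) => c.startsWith("lastVisitedPage="));
  return entry
    ? decodeURIComponent(entry.substring("lastVisitedPage=".length))
    : null;
}

// "Getting rid of" a cookie: expire it in the past, which is essentially what
// browser privacy settings and cleanup tools do on the user's behalf.
function forgetPage(): void {
  document.cookie =
    "lastVisitedPage=; expires=Thu, 01 Jan 1970 00:00:00 GMT; path=/";
}
```

The sketch also hints at why cookie collection is so invisible to users: a single assignment to document.cookie is all it takes, and nothing on the screen changes when it happens.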

But here is my concern – do students actually act on my suggestions? If not, why not? Now I know that my concerns about safeguarding my personal information privacy when using the Internet (as well as in the low-tech world) affect the way I approach using IT in a negative way. I do not give away my personal information to any website, even if it means some kind of reward (say, free shipping) will be withheld from me. I’m afraid of being a victim of identity theft. I worry about nameless, faceless people in cyberspace who are ready and waiting to do me harm, the nature of which I can only conceive in a dark moment of reflection. I do not join social networks in cyberspace for fear that my personal information will somehow be compromised. I will not use Google’s Gmail because I’m aware of the possibility that anything I send may be saved in Google’s server farms for a very long time to come. This is my response to Internet technology. But what about my students? I know that they do not always follow my prescriptions for doing well in my courses. So why would they listen to my concerns about sharing personal information with websites?

I think that one key avenue for further research in the human side of IT is studying how Internet users respond to IT in the face of threats to their personal information privacy. We know, for example, that when Facebook.com added a feature that made it easier for users to keep abreast of their friends by tracking users’ activities on the website and then communicating these activities to all the people in their social networks, hundreds of thousands of Facebook.com users expressed outrage at what was perceived as an unwarranted use of their personal information. How will the human-computer interface be affected by personal and social issues such as concerns about personal information privacy? Does the human need to belong to social networks overcome concerns about safeguarding personal information? Or does concern about protecting personal information militate against joining social networks? These are just a few of the interesting questions that further research into the human side of IT can help to answer. As technology evolves, so will people’s response to it. And along with the response will come the need to understand it.

In any event, I wonder what Mary Scalzi would think about all this.


Reviews and Testimonials

This book addresses the need to understand how people are affected by and respond to IT.

– Edward J. Szewczak, Canisius College, USA

The planning and execution of electronic resources within libraries is analyzed in this references volume.

– Book News Inc. (December 2008)

Author's/Editor's Biography

Edward Szewczak (Ed.)
Edward J. Szewczak is Professor of Information Systems at Canisius College. He has co-edited a number of scholarly readings texts for Idea Group Publishing, including Human Factors in Information Systems (with Coral Snodgrass), Managing the Human Side of Information Technology: Challenges and Solutions (with Coral Snodgrass), Measuring Information Technology Investment Payoff (with Mo Mahmood), The Human Side of Information Technology Management (with Mehdi Khosrow-Pour), and Management Impacts of Information Technology: Perspectives on Organizational Change and Growth (with Coral Snodgrass and Mehdi Khosrow-Pour). He is currently serving as an Associate Editor of The Information Resources Management Journal.

