Linguistic Responsibility and Security

Dec. 19, 2013
Communication -- technologically and physically -- often is the gating issue when it comes to making us more secure

In the past half century, and certainly in the years since 9/11, the security industry has led the way in becoming strategic, smart and secure, as this year's ASIS show slogan reiterated.  The country now has the Department of Homeland Security (DHS), an agency dedicated to making us smarter and more secure than we once were.  Private industry has taken up the cause through new technology; advancements since 9/11 have eclipsed those that came before.  Examples of newly implemented technologies include smart cards with biometrics, video analytics, cloud computing and big data.  Service providers have broken new ground in the integration of security systems and in the application of Physical Security Information Management (PSIM) systems.  Security forces are now equipped with advanced technology tools and communication systems.  Put all this together and you have a nation that has become smarter and safer thanks to the contributions of the security industry, both private and public.

Communication -- technological and physical -- is often the gating issue when it comes to making us more secure.  In the industry's attempt to communicate better, communications technology has been fixed and upgraded since 9/11 and the Hurricane Katrina disaster, when both physical and technological communications systems failed their purpose.  For physical communication, that is, the words and thoughts we exchange with each other, the question remains: are we better in what we say and how we say it? Have we progressed as much in our physical communications as in our technological communications?  The outcome of a security or emergency management operation turns on the ability, or lack thereof, to use technology effectively in our communications, as well as to communicate effectively with each other in person.

Linguistic responsibility has long been a concern in the U.S., especially in times of war.  "Loose lips sink ships" was an important slogan during WWII designed to make Americans aware that "careless talk costs lives." The basic idea of being careful about what we say and how we say it is part of social psychology's discourse on linguistics, reflected in the propositions of the Sapir-Whorf hypothesis, which serves as a timeless reminder that language matters.

Social psychologists contend that language has the potential to help us understand human cognition. However, language can also be ambiguous and create uncertainty.  This tension between cognition and uncertainty is an important concept in linguistics because language carries unintended implications as well as what it explicitly states.

For example, as part of a continuing education program for attorneys, Dr. Joe McGahan recently gave a talk, based on an episode of Sesame Street, entitled "Elephants or Pachyderms: The Effects of Subtle Linguistic Variations on Reasoning." He noted that it is more appropriate to call 10 elephants "10 elephants" than "10 pachyderms." Similarly, if someone were to communicate how to be secure and safe, instead of how to be more secure and safer, the audience might infer that it is unsafe. This subtlety in language could create a disruptive paradigm in which someone assumes they are not safe, rather than taking actions that could ensure their safety.

To avoid confusion and miscommunication, perhaps the smarter approach is to think of safety as a continuous variable. This allows people to discuss being safe in terms of degrees or probabilities. For example, what is the probability of a person being safe while walking on a particular city street versus a different street in the same city?  That comparison of being "safer" might lead to a "smarter" choice of action or conversation, as opposed to a debate on whether that same person is safe or unsafe.
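The street comparison above can be sketched in a few lines of code. This is a minimal illustration only: the street names, incident counts and trip counts are all hypothetical numbers invented for the example, not real data.

```python
# Treating "safety" as a continuous variable rather than a safe/unsafe label.
# All figures below are hypothetical, for illustration only.
incidents_per_year = {"Street A": 12, "Street B": 3}   # hypothetical incident counts
trips_per_year = 10_000                                # hypothetical exposure

def incident_probability(street):
    """Estimated probability of an incident on a single trip down this street."""
    return incidents_per_year[street] / trips_per_year

# Instead of asking "is Street A safe?", we compare degrees of safety:
safer_street = min(incidents_per_year, key=incident_probability)
print(safer_street)  # Street B is the safer choice under these numbers
```

Neither street is declared "safe" or "unsafe"; the comparison yields only a relative judgment, which is exactly the point of the continuous-variable framing.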

Nominal and ordinal data are two types of data categories, and the difference lies in what the data can express. Ordinal data consists of categories with an inherent order or ranking, such as scores on a scale. Nominal data is a set of information organized purely by category or name, with no ordering; a nominal data set is also known as a categorical data set.

There is no indication of value within a nominal data set. The color red, for example, comes in many shades, yet as a label it is just one category.  Considering "safety" as a spectrum rather than a dichotomy is much like looking at the many shades of red.  Thinking of safety as nominal data implies there is only safe or unsafe.  But looking at safety as ordinal data incorporates a scale of values that can express varying degrees of safety.
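The nominal/ordinal distinction can be made concrete with a short sketch. The label names below are invented for illustration; the point is only that an ordinal scale supports comparisons a nominal one cannot.

```python
# Nominal: unordered categories. "safe" vs. "unsafe" carry no ranking --
# a set is a natural representation, since sets have no order.
nominal_labels = {"safe", "unsafe"}

# Ordinal: ordered categories. Each level ranks above the one before it.
# These five labels are hypothetical, chosen just to show a spectrum.
ordinal_scale = ["very unsafe", "unsafe", "moderately safe", "safe", "very safe"]
rank = {label: position for position, label in enumerate(ordinal_scale)}

# Ordinal data supports comparisons that nominal data cannot express:
assert rank["safe"] > rank["moderately safe"]
assert rank["moderately safe"] > rank["unsafe"]
```

With the nominal set, the only possible question is "which category?"; with the ordinal scale, one can also ask "how much safer?", which is the degree-of-safety framing the article argues for.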

Just as language has ambiguous tendencies, colors can as well. Colors often serve as category labels, as in the defense readiness condition (DEFCON) levels. There the colors function as ordinal categories, ranging from blue to white, with each level corresponding to a higher threat. Yet even this use of colors can fail to communicate the actual threat and the appropriate response.

If a particular threat falls between levels on the color map, where would it be categorized? A spectrum of threat colors, rather than rigid blocked categories, might be a better approach. This could alleviate many ambiguities and accommodate situations that fall into the "gray area" rather than being "black and white."

The importance of language -- what we say and how we say it -- can be further illustrated by the personal experience of Chad Lambert, a former U.S. Marine and now a social psychology student at the University of Louisiana (Monroe). Lambert was deployed to the Al Anbar Province of Iraq and the Helmand Province of Afghanistan in support of Operation Iraqi Freedom and Operation Enduring Freedom, respectively.

The mission in Iraq was to hold their Area of Operation (AO) and provide security to the local population as local security forces transitioned into a more dominant role. In Afghanistan, Lambert's team was tasked with clearing the AO of hostile forces and securing a foothold in an area with no previous Coalition Forces presence. The level of security required for each of these missions, along with the individual security required in the day-to-day activity of his unit, depended upon the conditions set out in its Operational Risk Management and Operational Risk Assessment criteria.

Based upon his experience, Lambert understood that an individual -- or an organization -- could never be completely "safe," nor ever defenseless. He and his peers were constantly reminded about the dangers of complacency and were instructed to be mindful of the next threat. Death was the ultimate penalty for any lapse in vigilance.

In their 1986 research on correlations, "The Psychometrics of Everyday Life," Kunda and Nisbett argued that accurate correlation estimates can only be achieved if two criteria are met: the subjects must be highly familiar with the data in question, and the data must lend itself to being clearly unitized and interpreted. Without these two criteria being met, the researchers found, people were subject to "extreme inaccuracy" in the correlations they made.

In Lambert's case, the communication was known and understood. This allowed him to correctly assess, compare and correlate his security level with general safety.  He understood that "absolute safety" was not possible, but that there were steps he could take to be smarter and become safer and more secure. Lambert realized that safety was a continuous variable rather than a dichotomous one. He could never be truly safe, because the threats that existed were constantly evolving; therefore, his security assessments had to evolve constantly to meet them. Lambert also understood that he was never without defenses, whether or not they were adequate to stop the threat.

Morgan Sneed, a retired staff sergeant assigned to HQ PACAF as a combat correspondent during multiple deployments throughout Asia, also discovered that thinking in dichotomies or using absolutes like “safe” could actually be quite dangerous.

Morgan tells of the time in 2007 when he was responsible for "securing" polling sites for the Iraqi national elections. He began work in the Area of Responsibility (AOR) two weeks prior to the election. The mission was to make the polling site "more secure" for the nationals. The soldiers began assessing potential threats and security vulnerabilities, and roving patrols and counter-sniper positions were established.

While conceding that no precaution would prevent enemy attacks on their position, the security plan called for Sneed's force to minimize the threat and optimize their response. For Sneed, it was a given that combat offered no safe situations, but he also realized there were varying degrees of being safe. Now, as a student of social psychology, that realization has been reinforced. If he had allowed his men to resign themselves to worst-case scenarios, he knew complacency would set in, with potentially fatal consequences. At each team meeting, Morgan communicated to his troops in a way that helped them understand not only how to be more secure but how to be smarter about their own security.

In the security solutions arena, very few providers attempt to claim that any one product, or even a combination of products, will make you safe. Vendors have learned that it is counterproductive to mislead potential users about technology capabilities. Technology providers can't afford to deal in absolutes, so using the language of safer and less safe helps them avoid misrepresenting what their products can really do for the end user.

The assumption is that language matters, and this holds especially true in the security industry. 

The commitment should be to use valid, appropriate language responsibly, with the understanding that language validity is also a continuous variable. While technology is evolving quickly to help us become smarter, more strategic and safer, the security industry may be negligent in its careless use of words.

The bottom line is that words do matter. They can have a profound impact on how people behave, react and deal with security issues. The goal of the security industry should be to improve its use of language to ensure technology advancements mirror sound application practices.

About the authors and contributors:

Dr. Joe McGahan is the Professor of Social Psychology and co-director of The Social Science Research Lab at the University of Louisiana (Monroe, LA). His expertise is in the fields of social psychology, cognition, perception, social identity and statistical reasoning.  Dr. McGahan also leads the Chautauqua Nexus and is part of the Marketing Revelation team.

Dick Salzman, CPP, is the Chief Principal of Marketing Revelation, a full-service marketing group that serves the security and technology sectors. Dick has an MBA and 30 years' experience in the marketing, branding and business development of security and advanced technology products and services.

Morgan Sneed is a Psychology major at the University of Louisiana (Monroe, LA).  He is a retired SSGT of the USAF, Combat Correspondent HQ PACAF.

Chad Lambert is a Psychology and Criminal Justice major at the University of Louisiana (Monroe, LA). He is a retired Infantry Rifleman and Infantry Team Leader with Easy Co., 2nd Battalion, 2nd Marine Regiment.