
LESSONS LEARNED AND MISCONCEPTIONS: EIGHTEEN YEARS OF SUBMARINE DECISION RESEARCH

The application of Human Factors research to submarining is not new. The Combat Systems Department at NUWC (Naval Undersea Warfare Center Division Newport) has been investigating the best way to present information to submariners for more than two decades. In fact, the first use of the term “Human Factors” was in a WWII report from the National Research Council on “The Human Factors in Submarines” (Panel on Psychology and Physiology, 1949). We know that today there are many more choices for how, what, and why to display information, and therefore many more human factors! This paper is intended to show how the science of Human Factors and Cognitive Engineering can provide answers that are both broader and more useful than just guidance on font size and style.

One of the first things that I did when I joined the group at NUWC was to take one of the excellent courses at the U.S. Submarine School, Naval Submarine Base, New London. That was only an introduction. I have spent the last 18 years interviewing submariners, observing in attack centers, running the analysis of Concept of Operations Experiments (COOPEXs), and collecting and analyzing experimental data in a large number of research projects. The objective of this research is to provide guidance for the development and implementation of technology to support human decision making. The prerequisite for improving support tools is understanding the relationship between information (content and structure) and human performance, so that is where I have focused my efforts. Below I give examples of research and results from my own work and that of colleagues at NUWC and elsewhere. However, my main goal in writing this paper is not to describe specific research results. It is to demonstrate that there are decades of research and experience that can be brought to bear on the problems of supporting submariners (and others) at every level, from the most junior operator to the most senior decision maker.

Expertise

Expertise is an important research area because NUWC designs decision support systems for experts and novices alike. Therefore, the research asks how expertise develops and how we can better support the expert (and pre-expert) at work. This research has implications for training and for the design of systems to be used by the full range of users, from novices to experts.

One of the hallmarks of expertise is the ability to respond appropriately to difficult situations. Among the most difficult are situations in which a particular signal can arise from a number of causes. For a rather obvious example, a low contact bearing rate can be due to range (distant) or geometry (bow null or parallel relative motion geometry). Navy training and doctrine teach the potential Officer of the Deck (OOD) to think about the dangerous situation of a contact close aboard on own ship’s track. However, non-experts often assume that if the contact is not on own ship’s bow, it is distant. Failure to test for other, lower-probability alternatives (e.g., parallel geometry) is called a confirmation bias and is common in all but the best experts.
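To make the ambiguity concrete, the standard relative-motion relationship (stated here only as an illustration, not as any particular fire control algorithm) ties the observed bearing rate to both range and geometry:

```latex
\dot{B} = \frac{v_{\perp}}{R}
```

where \(\dot{B}\) is the bearing rate, \(v_{\perp}\) is the component of relative velocity across the line of sight, and \(R\) is the range. A small \(\dot{B}\) is therefore consistent with two very different situations: a large \(R\) (a distant contact) or a \(v_{\perp}\) near zero (bow null or parallel relative motion at any range, including close aboard). The signal alone cannot distinguish them, which is why testing the alternatives matters.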

One of the ways that experts accommodate the dynamic submarine problem is by continually sampling the full range of available information. They move from plots to Fire Control consoles to sonar to weapons. They look at the target of interest and at other contacts. In this way they avoid the hazard of tunnel vision and are often the first to recognize a change in the situation. This argues for flexibility in distributing and displaying information. It also argues for including contacts other than the contact of interest on displays, whenever possible.

Another consistent finding is that experts often look at raw data as a way of confirming information that has been analyzed either by automation or by more junior individuals. Forcing the expert to depend solely on analyses conducted by automation or by those with less experience places a limit on his ability to apply his experience to the problem.

Perhaps surprisingly, experts are more variable than non-experts. This may be, in part, because they know more different ways to accomplish the same goals. Experts are also more variable because they are sensitive to local variation. For example, the experienced Approach Officer (AO) or OOD knows that he can trust what Joe tells him, but that the Fire Control Tech (FT) has been up for 22 hours, there is a problem with some piece of equipment, and the Contact Evaluation Plot (CEP) plotter is standing his first watch at that station. Thus, the expert can respond appropriately, even to previously unforeseen events. To accommodate this variability (flexibility), experts need to be able to view things in a variety of ways. This translates into a flexible variety of displays to support many ways of accomplishing the goal. In fact, flexibility supports more than just experts; it supports human variability in general.

The findings above do not imply that the AO/OOD should be doing all of the jobs, nor that his displays should be the same as those of the watchstanders who are. They argue only that he should not be limited to a couple of displays, a few fixed views, or only highly processed information. They argue for access. Likewise, the AO is not the only expert in the crew, nor is his the only perspective. The experienced FT or Sonar Tech (ST) is also an expert at his job. The jobs are different, but the description of expertise cuts across specialties and even domains.

User-Centered Design

The concept of User-Centered Design is strongly supported by the Human Factors literature. It provides a set of methods that facilitate designing systems that advance the goals of the user and support the tasks people are actually doing, or need to do, to accomplish those goals. What is the difference between User-Centered Design and other design processes? User-Centered Design begins with the needs of the user. At this point in our technological history there are far more choices of how to proceed and what to build than we can afford (in time or money) to produce. User-Centered Design supplements and supports the military idea of requirements-driven acquisition by assuring that the acquired technology actually meets the requirements when the entire human-machine system is considered.

One Example. In many domains, databases have made the transition from books of paper look-up tables to electronic format. The user has a specific task to accomplish (it could be creating an air tasking order, a mine clearance plan, or a transit plan for one platform or an entire battlegroup).

One possible implementation is for the interface to mimic the earlier table look-up process. Thus, it would require the user to inspect potential solutions, data cell by data cell. Many tools have been built in just this way! Alternatively, the interface could have the user input the inflexible data (platform data, dates, constraints, etc.) and would output graphical (geo-referenced, if appropriate), color-coded information. If, for example, the task were route planning, the tool could output all routes that satisfy the constraints, color coding segments for risk (safety, time delay, etc.), recommended speed, or other factors. When no route satisfies all requirements, the output could show the best compromises and potential alternatives, again color coded for additional information. Such an interface would not require any additional information, just an understanding of the user’s goals and a common query capability.
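Here is a minimal sketch of that interface logic, under stated assumptions: the route data, the 0-to-1 risk scale, and the color thresholds are all hypothetical, and a real planning tool would draw the color-coded segments on a geo-referenced display rather than print them.

```python
from dataclasses import dataclass

@dataclass
class Segment:
    name: str
    transit_hours: float
    risk: float  # hypothetical scale: 0 (safe) to 1 (hazardous)

@dataclass
class Route:
    name: str
    segments: list

def color_for_risk(risk: float) -> str:
    """Map a segment's risk score to a display color (illustrative thresholds)."""
    if risk < 0.3:
        return "green"
    if risk < 0.7:
        return "yellow"
    return "red"

def evaluate_routes(routes, max_hours):
    """Return every route meeting the time constraint, segments color-coded.

    When no route satisfies the constraint, return the least-bad compromises
    instead, so the planner still sees usable alternatives (as argued above).
    """
    total = lambda r: sum(s.transit_hours for s in r.segments)
    satisfying = [r for r in routes if total(r) <= max_hours]
    candidates = satisfying or sorted(routes, key=total)[:2]
    return [(r.name, [(s.name, color_for_risk(s.risk)) for s in r.segments])
            for r in candidates]

# Hypothetical example: two candidate transit routes.
routes = [
    Route("coastal", [Segment("A-B", 10, 0.2), Segment("B-C", 14, 0.8)]),
    Route("offshore", [Segment("A-D", 12, 0.1), Segment("D-C", 9, 0.4)]),
]
for name, coded in evaluate_routes(routes, max_hours=24):
    print(name, coded)
```

The design point is that the tool, not the user, walks the table: the user states the goal and constraints once and reads the answer off the colors.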

Design for Reduced Training. Another use of User-Centered Design is to provide a tool that is as self-explanatory and easy to learn as possible. In this way the new tool does not add a new training requirement. Training is important, but we must not confuse task training with equipment training. Most Navy tasks are complex and difficult. Task training should not be complicated by tools that are difficult to operate, hide or scatter the essential information that needs to be integrated, or actually hinder the user (operator or senior decision maker). Yet, for years we have been told that any design problem can be fixed by “training the operator.”

Training is expensive. It takes time and space, and it uses equipment and people that could be put to better use actually working the problem. How do we design for reduced training? One way is to design equipment to support the user by taking advantage of his strengths. A good example of a new tool idea that is built upon the way the human system works is the set of sonar displays being designed by Ray Rowland at Naval Undersea Warfare Center Division Newport. These displays capitalize on animation and on the fact that the human eye is optimized as an edge detector to facilitate target detection across different frequencies. They do not require much user training because they capitalize on the user’s natural strengths. They even reduce content training because they facilitate the perception of key features.
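These are not Rowland’s displays; the sketch below is only a hedged illustration of the underlying principle, using synthetic data and a simple gradient filter to show how emphasizing edges makes a weak narrowband line stand out from broadband noise, the kind of feature the eye’s own edge sensitivity picks up on a display.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic frequency-time "waterfall" (hypothetical data): rows are time
# steps, columns are frequency bins, all broadband noise...
waterfall = rng.normal(0.0, 1.0, size=(200, 128))
# ...plus one weak narrowband tonal: a faint vertical line at bin 64.
waterfall[:, 64] += 1.5

# A simple horizontal gradient emphasizes vertical edges, the display
# analogue of the edge sensitivity the text attributes to the human eye.
edges = np.abs(np.diff(waterfall, axis=1))

# Collapsing the time axis shows the tonal's edges rising above the noise.
profile = edges.mean(axis=0)
print("strongest edge at frequency bin:", int(profile.argmax()))
```

The tonal is invisible in any single noisy row, yet its edge dominates once the display makes edges, rather than raw levels, the salient feature.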

Bottom Line. It is time to stop trying to fix mistakes with training. It is far cheaper to design a system right, and test it to be sure it is designed for operability, than it is to train every user for the life of the system or pay for the consequences of a single catastrophic accident!

Uncertainty

Uncertainty is a well recognized problem among submariners. It exists in other domains, but in submarines it is the major source of difficulty. The problem is that we don’t know how to communicate (or analyze) the degree, source, or even the possibility of uncertainty. Even if we could mathematically describe the numerous uncertainties, that does not mean that the decision maker can use that information appropriately. The most common human response to uncertainty is to delay taking action, but often, in the military, that is not an option. In fact, delay can increase uncertainty in a dynamic problem. The solution that is good at time t can quickly fall apart by time t+1.

Lessons learned from research are sparse but provide some guidance. For example, if no information is given on uncertainty, people will search for it. Verbal and numeric information are about equally informative. However, spatial and dynamic representations of uncertainty are often better than verbal or numeric ones. Again, the intuitive solution is not usually the right one. Representing uncertainty is truly an area where the hard work is just beginning. It will build on efforts to model the physical and statistical phenomena.
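As one hedged sketch of what a spatial (rather than verbal or numeric) representation could look like, the code below converts a numeric position uncertainty, expressed as a hypothetical 2x2 covariance, into the boundary of an ellipse that a display could draw around a contact’s estimated position. The covariance values and the two-sigma level are assumptions chosen for illustration.

```python
import numpy as np

def uncertainty_ellipse(cov, n_sigma=2.0, n_points=60):
    """Boundary points of an n-sigma uncertainty ellipse from a 2x2 covariance.

    Standard result for a Gaussian: the eigenvectors give the ellipse's
    orientation and the square roots of the eigenvalues give its semi-axes.
    """
    eigvals, eigvecs = np.linalg.eigh(cov)
    angles = np.linspace(0.0, 2.0 * np.pi, n_points)
    unit_circle = np.stack([np.cos(angles), np.sin(angles)])
    return eigvecs @ (n_sigma * np.sqrt(eigvals)[:, None] * unit_circle)

# Hypothetical solution uncertainty: large along range, small in cross-range.
cov = np.array([[9.0, 2.0],
                [2.0, 1.0]])
pts = uncertainty_ellipse(cov)
print("ellipse extent (x, y):",
      pts[0].max() - pts[0].min(), pts[1].max() - pts[1].min())
```

An operator reading the ellipse sees at a glance that the solution is far looser in one direction than the other, something a single numeric error figure hides.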

Opinion and Data

I am trained as a researcher and have many years of experience working for the Submarine Force. I have worked on many projects, including COOPEXs for combat control and ship control. I have always been impressed with the effort and enthusiasm of the submariners who have participated in these COOPEXs and in all of my experiments. On the whole they have been knowledgeable and innovative. However, no matter how knowledgeable a practitioner is, there is no substitute for data. For example, in one experiment I tested the effects of a new kind of information on performance. At the end of the data collection session, I asked each OOD if he thought the new information would be useful and how he thought it might help. Interestingly, the data showed that the new information improved solution accuracy but had no effect on time-of-fire. However, the OODs thought that this new information would improve time-of-fire but not solution accuracy. Their experienced and professional, but subjective, judgment was exactly the opposite of the data on their experienced and professional performance! That is the reason why testing involves more than just asking, even when the answer comes from an experienced professional.
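To be clear about method rather than result, here is a minimal sketch (with simulated numbers, not the experiment’s data) of the kind of paired comparison that settles such questions: the same operators measured with and without the new information, on both outcome measures.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated (NOT the actual experiment's) per-operator scores, baseline vs.
# with the new information: solution error shrinks, time-of-fire does not.
n = 12
err_base = rng.normal(1000.0, 150.0, n)        # solution error, yards
err_new = err_base - rng.normal(200.0, 60.0, n)
tof_base = rng.normal(600.0, 90.0, n)          # time of fire, seconds
tof_new = tof_base + rng.normal(0.0, 40.0, n)

def paired_summary(label, before, after):
    """Paired t statistic: mean per-operator change over its standard error."""
    d = after - before
    t = d.mean() / (d.std(ddof=1) / np.sqrt(len(d)))
    print(f"{label}: mean change = {d.mean():.1f}, t = {t:.2f}")

paired_summary("solution error", err_base, err_new)
paired_summary("time of fire", tof_base, tof_new)
```

Opinion enters only in choosing what to measure; the verdict on what the information actually improved comes from the measurements themselves.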

Every component of the submarine system is thoroughly tested to see if it performs as expected. Engineers know that even when something should work in theory and in the model, it might fail for any number of reasons when placed in the real environment or combined with other systems. That is the reason why systems are tested and certified. The only component that is not tested is the user interface, yet that is the piece that communicates the state of the world (or system) to the decision maker and returns his intended actions to the system and hence to the environment. It is the most critical part of the entire system, and the only one that is not tested. Data, not opinion, and stress testing, not just theory, are required to certify reliable user interfaces, just as they are for software and hardware!

Lessons Learned

None of the above means don’t listen to the operator. He has generations of experience in his insights. Rather, it means use the voice of experience as guidance for where to look, but do not fail to test that guidance as well. On the other hand, there is an entire field that specializes in Human Factors, applied cognitive engineering, and user-centered design. Not everyone is a psychologist because they are human, an expert at human cognition because they think, or a Human-Computer Interaction (HCI) specialist because they use a computer. The HCI/applied cognitive psychologist has at her or his disposal test, analysis, and application methods to support the design and evaluation of systems that will better support and even significantly improve the performance of our Submarine Force.

Perhaps the most important of the lessons learned, and the most difficult to implement, is that systems need to be designed to meet the needs of the users (regardless of level). Although often accepted in principle, this requirement is not usually followed. Years ago, systems were designed to do the possible. However, with today’s fast computers and virtually unlimited memory, there are few limits on the possible, except for those imposed by development time and money. Hence, there is a greater-than-ever need for guidance in selecting where to focus our efforts. I suggest turning the design strategy upside down and driving design choices from the perspective of users’ needs, not developers’ possibilities! To make this radical change in design strategy requires that we know what the user needs. This brings us full circle, to the methods and results of the science of Human Factors and Cognitive Engineering. Let’s use this science to build a better submarine!
