1 Critical Thinking Processes
Modeling Expert Approaches
According to a case study by Ertmer et al. (2008), expert instructional designers use the following critical thinking processes: activate a frame of reference, highlight an issue, form a mental map, and articulate a problem statement. Articulating the problem in a situation clearly and concisely should be addressed first (Ertmer et al., 2008). The result is the problem statement, defined as narrowing the problem space. However, the problem might not always narrow to a single, concise statement.
In the Ertmer et al. case study, expert instructional designers approached the problem using their frame of reference, which refers to their experience and body of knowledge. Ertmer et al. suggested that novice instructional designers learn from experts who talk aloud through the process, using it as a cognitive model. In the absence of expert advice, novices can utilize expert models of decision-making processes such as Harless’ (1973) smart questions for front-end analysis, Lee and Owens’ (2000) Rapid Analysis Model, or Mager and Pipe’s (1999) road map for solving performance problems, as well as the standards of the human performance technology field.
Defining Problems
Understanding the nature and structure of a problem will aid an analyst in figuring out how best to solve it. Jonassen (2000) created a typology of problem-solving (P-S) for ID purposes, which can serve as a job aid. He identified the following problem types: logical, story, troubleshooting, algorithmic, rule-using, decision-making, diagnosis-solution, strategic performance, case analysis, design, and dilemmas. His research described each type of problem’s resolution process. For example, if a problem presents limited variables that can be controlled through manipulation, then an analyst would know, by referring to Jonassen’s typology chart, that they have a logical problem. In Jonassen’s description of their structuredness, logical problems are those whose solutions are drawn from logic.
Jonassen’s typology chart provides the learning activity, inputs, success criteria, contexts, structuredness, and level of abstractness of problems. For example, the learning activity of a story problem is easy to surmise—figure out the necessary framework, essential elements, and appropriate language to create a good story. A story problem could be used for a marketing sales story, a storyboard for online gaming, a mission statement, or other script-related issues that arise in organizations.
Jonassen (2004) stated that troubleshooting is one of the most common problems encountered in the professional world. Therefore, it would behoove ID novices to understand how to troubleshoot problems efficiently and effectively. Jonassen provided the following cognitive processes for troubleshooting: identify the fault state and related system, construct a mental model of the problem, diagnose the problem, implement the solution by replacing the defective part or subsystem, and record the results in a fault database. The fault database serves to diagnose new problems based on previously solved ones so that solutions can be reused and adapted accordingly. Capturing your troubleshooting efforts in a fault database will save time.
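To make the fault database concrete, here is a minimal Python sketch, assuming a simple record structure and a symptom-overlap matching rule; the fields and sample faults are invented for illustration and are not part of Jonassen’s model.

```python
# A minimal fault database: each record pairs observed symptoms with the
# diagnosed cause and the solution that worked (illustrative entries only).
fault_db = [
    {"symptoms": {"no_audio", "video_plays"},
     "cause": "muted output channel", "solution": "unmute and retest"},
    {"symptoms": {"no_audio", "no_video"},
     "cause": "disconnected cable", "solution": "reseat the cable"},
]

def diagnose(observed: set) -> dict | None:
    """Return the stored fault whose symptoms best overlap the observed ones."""
    best = max(fault_db, key=lambda f: len(f["symptoms"] & observed), default=None)
    return best if best and best["symptoms"] & observed else None

def record(symptoms: set, cause: str, solution: str) -> None:
    """Capture a newly solved fault so future diagnoses can reuse it."""
    fault_db.append({"symptoms": symptoms, "cause": cause, "solution": solution})

match = diagnose({"no_audio", "video_plays"})
print(match["cause"] if match else "new fault: troubleshoot, then record it")
```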
Algorithm problems are formulas, steps, or tools that bring about precise answers in a short amount of time (e.g., backward design, listing problems, organizational charts). Heuristics are engaged when algorithms are insufficient for solving the problem (e.g., brainstorming, drawing an analogy, identifying subgoals, self-explanation, using visual imagery). Heuristics are mental shortcuts based on experience, intuition, or simplification of larger problems. Most P-S involves a combination of algorithmic and heuristic thinking. Of note, many of the aforementioned strategies can have both heuristic and algorithmic elements depending on their application (Anthropic, 2024).
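To illustrate the distinction, the hypothetical sketch below contrasts an exhaustive (algorithmic) search for exact change with a greedy heuristic; the denominations are invented, and neither function comes from the sources cited above.

```python
from itertools import combinations_with_replacement

coins = [1, 3, 4]   # hypothetical denominations
target = 6

def exact_change(coins, target, max_coins=10):
    """Algorithmic: exhaustively search for the fewest coins (guaranteed)."""
    for n in range(1, max_coins + 1):
        for combo in combinations_with_replacement(coins, n):
            if sum(combo) == target:
                return combo        # the first hit at size n is optimal
    return None

def greedy_change(coins, target):
    """Heuristic: always grab the largest coin that fits (fast, not guaranteed)."""
    result = []
    for coin in sorted(coins, reverse=True):
        while target >= coin:
            target -= coin
            result.append(coin)
    return tuple(result)

print(exact_change(coins, target))   # (3, 3): two coins, the precise answer
print(greedy_change(coins, target))  # (4, 1, 1): the shortcut needs three
```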
Rule-using problems refer to situations where you must apply a known set of rules (e.g., statistical, procedural) or principles to solve them. There is a clear goal, but the pathway to the solution may involve different sets of options.
Decision-making problems require transferring learned knowledge or skills to the problem at hand or to unanswered questions. Any problem has three components: the goal, the givens, and the operations (Ormrod, 2020). P-S involves restructuring and insight. Kohler (1925) suggested that P-S involves mentally combining and recombining various elements until an organizational structure that solves the problem emerges. Here are some practical decision-making theories and techniques (a sketch of the goal-givens-operations framing follows this list):
- Trial and error, which is only effective if you have a lot of time and resources. This P-S technique is attributed to early scholars such as Charles Darwin, Herbert Spencer, Alexander Bain, and Edward Thorndike (Anthropic, 2024).
- The four mental stages of P-S are preparation, incubation, illumination, and verification (Wallas, 1926).
- Do not let irrelevant thoughts crowd your working memory (Swanson et al., 2008).
- Encoding of the problem: determine which aspect is relevant and which is not to find a solution (Ormrod, 2020).
- Do not get locked into a mental set of approaching and encoding a problem. Think outside the box (Gardner, 1978).
- Do not allow functional fixedness to occur (Birch & Rabinowitz, 1951). For example, use an object in a new, originally unintended way.
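As noted above, here is a minimal sketch of the goal-givens-operations framing (Ormrod, 2020), with a trial-and-error search over the operations; the numeric problem and the breadth-first strategy are illustrative assumptions, not a prescribed method.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Problem:
    goal: int                   # the desired end state
    givens: int                 # the starting state
    operations: list[Callable]  # the allowable moves

# Hypothetical example: reach 10 from 3 by doubling or adding 1.
problem = Problem(goal=10, givens=3,
                  operations=[lambda x: x * 2, lambda x: x + 1])

def solve(p: Problem, limit: int = 5):
    """Trial and error: try every sequence of operations, level by level."""
    frontier = [(p.givens, [])]
    for _ in range(limit):
        next_frontier = []
        for state, path in frontier:
            for op in p.operations:
                new = op(state)
                if new == p.goal:
                    return path + [new]
                next_frontier.append((new, path + [new]))
        frontier = next_frontier
    return None

print(solve(problem))  # [4, 5, 10]: add 1, add 1, then double
```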
Diagnosis-solution problems require you to “select and evaluate treatment options and monitor” (Jonassen, 2000). For instance: understand the issue, devise a plan, carry out the plan, and reflect on it (Polya, 1945).
Tactical strategies can resolve strategic performance problems. These strategies are based on a thorough understanding of the situational context in which they can be applied. Solving these will require trial and error, as no situation is the same.
Case analysis problems are complex and require time to identify possible solutions. These are generally specific scenarios involving ill-structured problems that could have multiple outcomes. One way to address this would be to mentally restructure the problem situation until insight is achieved (Kohler, 1925).
Design problems occur when trying to create an artifact from loosely defined goals with little input. They require defining the problem the design is intended to solve. Reference different use cases for similar designs.
Dilemmas are complicated by opposing opinions and situations with no clear pathway to resolution without compromise. Solving this type of problem requires clearly defining it (i.e., problem statement) and delineating possible options.
P-S depends upon the nature of the topic, as different content requires different ways of thinking. Bruning et al. (2011) referred to these as thinking frames, such as how one would think about scientific inquiry and research models. Thinking critically about the type of problem, the best approach, and any previous encounters (fault database) will help novice analysts analyze issues effectively and efficiently. Knowing that issues can be categorized into 11 problem types is a useful parameter for initially understanding an issue.
Front-end Analysis
Once the problem type is identified, an instructional designer proceeds to analyze the presenting problem, which is called front-end analysis (FEA). The term is interchangeable with PA. FEA is a performance improvement approach to solving problems in ID. Harless (1973) recognized that many organizations want to make fiscally sound decisions to improve performance, but they do not want to spend the money or the time on FEA. He defended FEA as a way to achieve human potential in fiscally sound ways. To this end, he provided 13 smart questions for the FEA process. Harless deemed them an intelligent way to analyze a situation before deciding upon a specific solution. FEA aims to avoid costly mistakes such as unnecessary training when proper informational feedback, a job aid, or task realignment could resolve the issue.
Smart Questions
These questions help determine whether the problem is performance-based and whether it requires training (Harless, 1973, pp. 240-244):
- Do we have a problem?
- Do we have a performance problem?
- How will we know when the problem is solved?
- What is the performance problem?
- Should we allocate resources to solve it?
- What are the probable causes of the problem?
- What evidence bears on each possibility?
- What is the probable cause?
- What general solution type is indicated?
- What are the alternative subclasses of solution?
- What are the costs, effects, and developmental times of each solution?
- What are the constraints?
- What are the overall goals?
The first two questions are critical, as not all problems are performance-based. Harless described the relevant analyses to determine if there is a performance problem as follows: (1) form a hypothesis of the nonperformance cause of the problem, (2) test the hypothesis, and (3) observe a situation demonstrating a mastery performance. An overlooked process is establishing mastery of objectives for solutions: “How will we know when the problem is solved?” Harless noted the importance of asking this question early in the process. His questions serve as a guide for the analyst to form a hypothesis about the problem, determine the probable cause(s), and identify constraints to resolving the issue(s). These questions are not mutually exclusive, nor are they locked in sequence, because the ID process is nonlinear. Harless’ smart questions and FEA process are not copyrighted or trademarked and are, therefore, free to use on the job.
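One way to operationalize the smart questions is as a simple checklist job aid, as in the minimal sketch below; the answer-tracking structure and the sample answer are assumptions for illustration, not part of Harless’ (1973) process.

```python
# Harless' (1973) smart questions as a reusable checklist: record an answer
# (with evidence) for each before committing to a solution.
SMART_QUESTIONS = [
    "Do we have a problem?",
    "Do we have a performance problem?",
    "How will we know when the problem is solved?",
    "What is the performance problem?",
    "Should we allocate resources to solve it?",
    "What are the probable causes of the problem?",
    "What evidence bears on each possibility?",
    "What is the probable cause?",
    "What general solution type is indicated?",
    "What are the alternative subclasses of solution?",
    "What are the costs, effects, and developmental times of each solution?",
    "What are the constraints?",
    "What are the overall goals?",
]

def open_items(answers: dict) -> list:
    """List the questions still lacking an answer; the order is flexible."""
    return [q for q in SMART_QUESTIONS if not answers.get(q)]

# Hypothetical answer to the first question.
answers = {"Do we have a problem?": "Yes: product returns rose this quarter."}
print(len(open_items(answers)))  # 12 questions still to work through
```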
Using a systematic process helps with adequately thinking through the complexity of the context and content of problem-solving. For example, some ID approaches ask questions hierarchically. Harless’ (1973) first question for front-end analysis may often be overlooked: “Do we have a problem?” An analyst must use critical thinking to avoid making assumptions about a situation. Functional fixedness, the inability to view common objects differently, inhibits critical thinking; combat it with divergent and creative thinking. Is it a problem or an opportunity? Does it require training or non-training solutions? Is it an ill-structured or well-structured problem?
FEA Pitfalls
A risk inherent in planning and implementing the FEA process is losing sight of reflective thinking. To illustrate, Table 1 depicts common work-related pitfalls in the FEA planning process, such as having a false sense of safety, groupthink, and failing to recognize a participant’s rank or political power (IPDET, 2007). Each pitfall is aligned with the critical thinking processes proposed by Dick et al. (2009).
Table 1
How to Avoid Pitfalls by Engaging in Critical Thinking
| FEA Pitfalls (IPDET, 2007) | Countering Critical Thinking Processes (Dick et al., 2009) |
|---|---|
| Over Planning | Suspending judgment until all pertinent information has been heard, being open-minded, seeking root causes |
| Thinking you have it all figured out | Suspending judgment until all pertinent information has been heard, being objective, being open-minded, viewing a problem from multiple perspectives, changing a conclusion in the face of compelling information |
| Being Gung Ho | Being objective, being open-minded, viewing a problem from multiple perspectives |
| Overreliance on Frameworks | Being objective, being open-minded, viewing a problem from multiple perspectives, changing a conclusion in the face of compelling information |
| Overreliance on Truisms | Being objective, being open-minded, seeking root causes |
| Groupthink | Suspending judgment until all pertinent information has been heard, being open-minded, seeking root causes, viewing a problem from multiple perspectives, listening to contrary views, changing a conclusion in the face of compelling information |
| Ignoring Power | Being open-minded, listening to contrary views, changing a conclusion in the face of compelling information |
| Illogical Thinking | Being objective, being open-minded |
Another pitfall is trying to solve problems in seclusion and not adhering to the standards of the field. To illustrate, Schneider (2009) urged analysts not to fall into the career pitfall of “Dumbledore” wizardry for FEA. (Dumbledore is the headmaster of the Hogwarts School in the Harry Potter series.) He said that the consultant-as-magician was, unfortunately, a widespread phenomenon. Schneider claimed this attitude exists because many analysts endeavor to be wizards and solve problems magically (quickly, behind closed doors). He acknowledged that human performance technology (HPT) models are boring and take longer to produce results than wizardry. Schneider warned that some clients might view the time it takes to do a systematic and thorough analysis as delaying the outcome. In the industry, this is called analysis paralysis.
Analysis paralysis is not a pitfall; however, not collaborating with the client at every step of the analysis would be. For example, Schneider highlighted how troubleshooting with a client could lead to learning at each stage of the analytical process for the consultant and the client. This is especially critical when things learned in the process are negative. Minor issues can be addressed right away and serve as team-building opportunities. He urged analysts to adhere to the International Society for Performance Improvement (ISPI) Performance Technologist Standard Number 4: Work in Partnerships with Clients and Stakeholders. Schneider stated that each analysis phase teaches something new to both sides involved.
Rapid Analysis Techniques
Rapid Analysis Model
There are numerous reasons why an organization might want to get to a root cause quickly for military, health, safety, or employment issues. For example, when organizations face dire dilemmas, they seek quick solutions to their problems. A faster approach to FEA is the Rapid Analysis Model (RAM) developed by Lee and Owens (2000), which is a prescriptive list of activities, time parameters, predetermined operations, and anticipated outcomes established through a series of validations. The activities and time parameters prescribed for an analyst include questioning (9%), listening (50%), observation (40%), and reporting (1%). RAM follows the anthropological approach to research, which includes triangulation of data gathering to obtain a preponderance of evidence.
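The prescribed time parameters translate directly into a project schedule. Here is a minimal sketch, assuming a hypothetical 10-day analysis; only the percentage weights come from Lee and Owens (2000).

```python
# RAM activity weights per Lee and Owens (2000); the 10-day project
# length is a hypothetical figure for illustration.
allocations = {"questioning": 0.09, "listening": 0.50,
               "observation": 0.40, "reporting": 0.01}
project_days = 10

for activity, share in allocations.items():
    print(f"{activity:>12}: {share * project_days:.1f} days")
# questioning: 0.9, listening: 5.0, observation: 4.0, reporting: 0.1
```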
Triangulation of Data. Triangulation refers to gathering multiple forms of evidence (interviews, documents, observations) from diverse perspectives such as administration, workers, and other stakeholders. Lee and Owens (2000) warned that relying on data solely from management would cripple the RAM process and lead to an incomplete analysis. They proposed the 25-foot rule: the worker within 25 feet of the job knows the most about it. They prescribed asking the same five questions of different stakeholders to discover gaps and reconcile them. This results in system-wide checks and balances that can be conducted in a few days.
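Here is a minimal sketch of the gap-discovery step, assuming invented stakeholder responses; posing identical questions to different groups follows Lee and Owens (2000), but the comparison logic is an illustrative simplification.

```python
# The same question posed to three stakeholder groups (responses invented).
responses = {
    "What slows the order process down?": {
        "management": "workers skip the checklist",
        "workers": "the checklist system times out mid-entry",
        "customers": "orders arrive late",
    },
}

# Flag questions where stakeholder accounts diverge: these are the gaps
# the analyst must reconcile through follow-up interviews and observation.
for question, answers in responses.items():
    if len(set(answers.values())) > 1:
        print(f"GAP on: {question}")
        for group, answer in answers.items():
            print(f"  {group}: {answer}")
```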
Active Listening. Lee and Owens (2000) stated that actively categorizing participant needs during the listening process of interviews helps delineate root causes and provides direction for addressing the problem. Their listening categories align with the Dick et al. (2009) typology of needs assessments: normative, felt, expressed/demand, comparative, and anticipated/future needs. For instance, if the needs assessment responses pertain overwhelmingly to an anticipated need instead of a normative one (industry standard) or a demand, then you can assume that the current problem is a future one and proceed with your solution accordingly. An analyst must follow interviews with observations to verify comments. Lee and Owens suggested observing average and exemplary performers to gather baseline data to determine the performance gap.
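As a rough illustration of categorizing needs while listening, the sketch below tallies invented interview excerpts against the Dick et al. (2009) categories; in practice, the tagging is the analyst’s judgment, not an automated step.

```python
from collections import Counter

# Interview excerpts pre-tagged with a need category (all invented).
responses = [
    ("We'll need this skill when the new system launches", "anticipated"),
    ("Everyone keeps asking for refresher sessions", "expressed"),
    ("Our error rate is above the industry standard", "normative"),
    ("Our sister plant outperforms us on this task", "comparative"),
    ("We expect demand to shift within a year", "anticipated"),
]

tally = Counter(category for _, category in responses)
dominant, count = tally.most_common(1)[0]
print(f"Dominant need: {dominant} ({count} of {len(responses)} responses)")
# Dominant need: anticipated (2 of 5 responses)
```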
Timesaving Strategies. The reporting component should include a thorough analysis and recommendations at three levels: systemic, performance, and training. Systemic issues affect the entire system, process, or product. These must be addressed first to avoid wasting time addressing performance areas or training that could be affected by system-wide issues. This is known as systems thinking. As the name implies, the RAM process has several timesaving strategies such as understanding the appropriate time allocations for conducting a problem analysis (e.g., questioning is only 9%), triangulating data as a measure of checks and balances, actively listening and categorizing needs during interviews, and the aforementioned multilevel reporting to understand the bigger issue. Remember that all these timesaving strategies hinge on uncovering the actual root cause of a problem.
Root Cause Analysis
A root cause is the fundamental breakdown or trigger of a process that, when resolved, prevents a recurrence of the problem. Remember that issues can be positive, too, so we need to look for triggers that superficially appear helpful, such as rewards, but cause problems with productivity. It is essential to look at all aspects of a presenting issue, not just the negative ones. Digging deep for the root cause of a problem forces the analyst to go beyond surface issues and avoid preparing interventions for superficial ones. RCA is also a way to invite criticism and avoid groupthink.
To determine the appropriate direction of your effort, you must flesh out the root cause through deep analysis. Superficial questioning and data gathering will not suffice, as there may be multiple causes for the presenting problems. Willmore (2016) warned not to push the RCA to causes outside the workplace, like the economy or human nature. He stated that human error is not a root cause; there is a preceding cause to any human mistake. Harless held the same viewpoint: one of his FEA goals was to isolate the root cause of the performance problem rather than its symptoms or effects. You should ask: Why does Jim keep making the same mistake? Has he been adequately guided (training, job aid, informational feedback, or reassignment)? It is imperative to find the root cause of the problem and not fall into the blame game. Human error is, therefore, a nonexample of a root cause.
To be a successful analyst, build a resource library of problem-solving models, RCA tools, and tested interventions. There are ID handbooks with interventions that apply to various types of root causes where one can find practical solutions that are prearranged categorically to save time (Langdon et al., 1999; Rossett & Gautier-Downes, 1991; Stolovitch & Keeps, Eds., 1992). Intervention models and tools are called job aids. They can be as simple as a chart, diagram, or questioning process. Even the collection of errors and their subsequent causes and solutions in the fault database is considered a job aid.
RCA Frameworks
RCA originated in the nuclear branch of the United States Navy (Dew, 2008), which was concerned with the design, operation, maintenance, and fueling of naval nuclear reactors. Under the leadership of Admiral Hyman Rickover, the Navy set a high standard of performance for operational systems and personnel. The methodologies were developed through collaboration between nuclear Navy personnel and staff at the Atomic Energy Commission. RCA is a derivative of Failure Mode Effect Analysis (FMEA), initiated for reliability engineering by the U.S. military in 1949 to determine the effect of system and equipment failures; FMEA has since been used by NASA, U.S. manufacturing, and the U.S. auto industry (Dunn & Renner, n.d.). The ramifications of RCA are paramount in certain cases, as its origin makes clear.
Six-box Model. If you are unsure which category a performance problem falls into, use Sanders and Thiagarajan’s six-box model, developed specifically for the Association for Talent Development, to unveil the root cause(s) (Willmore, 2016). It lists common organizational performance problems. The six performance improvement factors are structure/process, resources, information, knowledge/skills, motives, and wellness. It serves as a job aid for categorizing workplace problems. Start with a chart and then build a concept map from the data collected to help visualize the presenting problem.
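A minimal sketch of the six-box model used as a categorization job aid; the six factors come from Sanders and Thiagarajan, but the keyword lists and the sample finding are illustrative assumptions.

```python
# The six performance improvement factors, each with assumed keywords
# for sorting analysis findings into boxes.
SIX_BOXES = {
    "structure/process": ["workflow", "approval", "handoff"],
    "resources": ["tools", "budget", "staffing"],
    "information": ["feedback", "instructions", "data"],
    "knowledge/skills": ["training", "know-how", "practice"],
    "motives": ["incentive", "recognition", "reward"],
    "wellness": ["fatigue", "stress", "safety"],
}

def categorize(finding: str) -> str:
    """Place a finding in the first box whose keywords it mentions."""
    text = finding.lower()
    for box, keywords in SIX_BOXES.items():
        if any(word in text for word in keywords):
            return box
    return "uncategorized"

print(categorize("Staff receive no feedback on error rates"))  # information
```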
Fishbone Diagram. Another RCA tool is Ishikawa’s (1968) fishbone diagram. The head of the fish signifies the problem, the main bones represent major categories of causes, and the minor bones stemming from them identify possible causes. Utilize brainstorming techniques, based on the triangulation of data gathered from various stakeholders, to create the diagram. Ishikawa’s fishbone diagram will aid a novice analyst in forming a mental map of the problem that can be shared and discussed with others. As aforementioned, expert problem analysts create mental maps of the presenting problem during FEA (Ertmer et al., 2008). The fishbone diagram does not prescribe a set of categories, so Willmore suggested combining Sanders and Thiagarajan’s six-box model categories with Ishikawa’s fishbone format. Figure 1 is an example of a fishbone diagram created by Lang (2008) with different categories.
Figure 1
Blank Fishbone Diagram to illustrate the Causes of a Problem
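A fishbone diagram can also be drafted as a simple data structure before it is drawn. The sketch below pairs the six-box categories with the fishbone format, per Willmore’s (2016) suggestion; the problem and causes are invented examples.

```python
# A fishbone as a nested mapping: the problem is the head, major categories
# are the main bones, and hypothesized causes are the minor bones.
fishbone = {
    "problem": "High rework rate on assembly line 3",
    "bones": {
        "structure/process": ["unclear handoff between shifts"],
        "resources": ["worn calibration tools"],
        "information": ["specs updated without notice"],
        "knowledge/skills": ["new hires untrained on revision B"],
        "motives": ["piece-rate pay rewards speed over accuracy"],
        "wellness": ["mandatory overtime causing fatigue"],
    },
}

print(fishbone["problem"])
for bone, causes in fishbone["bones"].items():
    print(f"  {bone}:")
    for cause in causes:
        print(f"    - {cause}")
```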
Why Tree. Another root cause tool comes from the founder of the Toyota Industries Company, Sakichi Toyoda, who developed the Why Tree. The method was first used in the Toyota manufacturing process in 1958 (Bright Hub PM, n.d.). It consists of up to five why questions, each probing deeper into the problem: for each answer, you ask why again until you uncover the root cause. Responses are mapped out according to different reasons. Willmore (2016) stated that not all the Why Tree branches need to extend five levels, as the root causes may surface after two or three questions. There are three benefits to using this process. First, the different branching reasons that stem from a problem statement can lead to more than one root cause and various interventions. Second, it creates a mental map that synthesizes the presenting problem and can be shared with teams for analysis. Third, it aids novice analysts in digging deeper to uncover root causes and avoid hasty conclusions.
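A minimal sketch of a Why Tree as a recursive structure, assuming an invented scenario; the five-level cap follows the method, and the two branches here surface two distinct root causes, illustrating the first benefit above.

```python
# Each node holds a statement and the "why" answers branching from it.
why_tree = {
    "statement": "Shipments are late",
    "whys": [
        {"statement": "Orders sit in the packing queue",
         "whys": [{"statement": "The packing station is understaffed",
                   "whys": []}]},
        {"statement": "Labels print with wrong addresses",
         "whys": [{"statement": "The address database is not synced nightly",
                   "whys": []}]},
    ],
}

def root_causes(node: dict, depth: int = 1, limit: int = 5) -> list:
    """Walk each branch until it bottoms out or hits the five-why limit."""
    if not node["whys"] or depth > limit:
        return [node["statement"]]
    causes = []
    for child in node["whys"]:
        causes.extend(root_causes(child, depth + 1, limit))
    return causes

print(root_causes(why_tree))
# ['The packing station is understaffed',
#  'The address database is not synced nightly']
```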