Critical Thinking Skills: A Process for Better Problem Solving and Decision Making

Category: Decision Making, Problem Solving
ID: W-1015

FREE

About HRDQ-U Virtual Seminars

  • Live Seminars offer 80+ scheduled, open-enrollment, instructor-led training classes for your employees, featuring real-time interaction with our expert trainers and engaging professional development sessions you can join on any device. Enroll your learners and let them develop critical soft skills from their home or office.
  • Recorded Seminars offer an archived streaming video of our popular live 3-hour seminars that provide the same level of in-depth training for you or your individual employees to enjoy on their own schedule. Each Recorded Seminar is available for repeat or resumed viewing for 90 days, so you can study at your own pace.

 

About HRDQ-U Webinars

  • Live Webinars are FREE, and offered each week on Wednesdays. These sessions are designed for consultants, trainers, coaches, and managers providing training to their teams – anyone interested in organizational, team, and individual learning. Each 1-hour webinar features thought leaders presenting new ideas, advice, insights, and how-to on topics from leadership to teams, communication to diversity, and much more.
  • Recorded Webinars are typically available for viewing shortly after the original broadcast as an archived streaming video for you to view on-demand at any time. Recordings are not always free, so be sure to join us for the Live Webinar if your schedule permits.

 


The United States Department of Labor has identified critical thinking as the foundation for key skills such as problem solving, decision making, creativity, and strategic planning, just to name a few. And executives from major corporations rank critical thinking as the #1 workplace competency. Still, only 28% of those with a four-year college education are rated as excellent critical thinkers.

We’ll help you put critical thinking skills on your training docket when you join us for the free webinar, Critical Thinking Skills: A Process for Better Problem Solving and Decision Making, where you’ll be introduced to the HRDQ product Critical Thinking Fundamentals. This training workshop is designed to help individuals learn and apply higher-level problem-solving skills in a low-risk environment through a number of interactive activities.

If your organization wants to remain competitive in today’s environment, you need to invest more time in the training and development of critical thinking skills.

Participants Will Learn

  • Establish a common understanding of what critical thinking is.
  • Identify the foundational attitudes and skills of critical thinking.
  • Apply a four-step process for using critical thinking on the job.
  • Recognize and avoid common critical thinking mistakes.


Who Should Attend

  • A training or HR professional who delivers training.
  • An independent training consultant.
  • A manager who delivers or purchases training as part of their role.

Sara Lindmont: Hi, everyone. Welcome to today’s webinar, Critical Thinking Skills: A
Process for Better Problem Solving and Decision Making, hosted by
HRDQ-U and presented by Rick Lepsinger. My name is Sara, and I will
moderate today’s webinar. The webinar will last about an hour, so if you
have any questions, go ahead and type them into the Questions panel
in your GoToWebinar control panel. You can open that up, type in there,
hit submit, and those questions will come through to us. We’ll either
answer them as we go or at the end. If we run out of time, we’ll answer
them afterwards by email, so you will definitely hear from us.
Sara Lindmont: Today’s webinar content is from our newest program titled Critical
Thinking Fundamentals. If you’re interested in delivering this training
within your organization, please contact HRDQ. Our presenter today is
Rick Lepsinger, President of OnPoint Consulting. Rick’s career has
focused on helping organizations and leaders identify and develop
leaders, work better virtually, enhance cross-functional team
performance, and get from strategy to execution faster. He conducts
numerous seminars and workshops on succession management, leading
from a distance, leading cross-functional teams, and enhancing
execution. Rick has written numerous articles and is the author or coauthor of several books, including his most recent Closing the Execution
Gap: How Great Leaders and Their Companies Get Results.
Sara Lindmont: Welcome, Rick, and thank you for joining us today.
Rick Lepsinger: Sara, thanks very much. I’m glad to be here. And, everyone, welcome.
Thank you for joining us today. As Sara said, today’s topic is around
critical thinking, which is all about enhancing judgment and the quality
of your decisions overall. We’re gonna take time to cover four primary
objectives. We’re gonna talk a little bit about what we mean by critical
thinking, what it actually is, what it actually looks like in practice. We’ll
talk about some of the basic skills, the attitudes, the characteristics of
effective critical thinkers. We’ll walk through a four-step process for
critical thinking, and really the importance of that is that critical thinking
is a skill you can learn. Some people may be born with a tendency to be
more critical in their thinking, but all of us can learn to do this, and to do
this effectively. I will also talk a little bit about some of the common
critical thinking mistakes and what you can do to avoid them overall.
Rick Lepsinger: Let’s start with a definition of what we mean by critical
thinking. Fundamentally, critical thinking is the process by
which you go about evaluating the information that you’re receiving.
You’re evaluating it in terms of its truthfulness and its value overall, but
you’re doing this in a systematic way, in a way that is directed and
efficient. It really is thinking about the information
you’re getting in terms of how accurate it is, and whether it is of use to me in
general.
Rick Lepsinger: In terms of the benefits, I think we spoke a little bit before about how critical
thinking is kind of a baseline or fundamental skill or competence that
impacts any number of work-related activities, as you can see
listed here. It can certainly improve communication, and it can help in your
written communication as well. When it comes to making decisions on
the job about choosing new markets, handling a crisis, deciding who to
hire, critical thinking is sort of fundamental to being able to do those
things well. The challenge for many of us in a very busy, stressful world
is to take the time, right? And to avoid old habits and biases that may
negatively impact our ability to really think through the situation and
the information that we’re getting.
Rick Lepsinger: When you think about the people who are the most effective critical
thinkers, there are a number of characteristics that many of them share.
It’s interesting because some of these things are sort of, you might think
of them as almost mutually exclusive, but these are people who are
inquisitive and open-minded. They’re objective and analytical, and
they’re also reflective and reasonable.
Rick Lepsinger: Let’s take a look at each one of those in a little bit more detail. In terms
of being inquisitive and open-minded, this is really approaching a
situation without any preconceived ideas. This can be very challenging.
One of the things I would suggest is that it’s not that all of a sudden we
become neutral, or all of a sudden so open-minded that
we just don’t have preconceived notions. The idea here is that with
awareness you can check yourself. We all naturally have some
preconceived idea about a situation or about a choice or an option, or
about an individual.
Rick Lepsinger: The idea is to not let that preconceived idea dominate, to really be able
to check yourself and say, “Is there other information? Is there another
way to look at that?” That’s really what we mean by being curious. It’s
not that you don’t have the preconceived idea. It’s that you move past
it. You’re aware of the fact that you do, and you move past it. One way
to do that may be to check in with other people,
trusted advisors, people who are subject matter experts, to give you
another perspective on a particular situation.
Rick Lepsinger: On the objective and analytical side, this is about minimizing the impact
of emotion on your decision making. We can talk about it a little bit later, but
it’s not that you eliminate emotion, because it’s a component of the
decision making process, but you also wanna be as objective as possible.
In terms of being analytic, it’s being able to assess the data that’s in
front of you, and to determine, of all the information you’re
getting, which is most important and which is most useful. The key here
is being able to separate facts from opinion. That becomes very
challenging in today’s world. What is a fact? We used to know what that
was, but for some reason how a fact differs from
someone’s opinion is not as clear as it used to be. The main difference is that facts can be
proved or disproved. Facts have some kind of backing information, and they’re
objective and unbiased. An opinion is a point of view
that I have, which I may or may not be able to support with
information.
Rick Lepsinger: The whole idea here is to avoid taking opinions and presenting them as
if in fact they were facts. Now the key here, again, is not so much what
you expect other people to do, but it’s really on the receiving side, it’s as
you’re listening, as you’re reading, to really say, “What’s the basis of
this? What’s behind this overall?” There are a couple of key questions
that you can use to help you separate facts from opinion overall, to
really think about the key issue or the problem you’re trying to resolve.
What information do you have about the issue? What assumptions do
you have, or ideas that support your strategy or plan? Is there evidence
to support these assumptions? What might be gaps in reasoning? The idea is
to pose these questions to yourself or to the team to really
help you think it through, rather than just taking the information as
given and assuming it will be true and accurate.
Rick Lepsinger: Another technique that gives
you a fresh point of view on an issue is reframing. There are four different
ways to reframe an issue or a problem. One is paraphrasing. Your initial
view of the issue is, how can we reduce our shipping delays? If you were
going to reframe it using paraphrasing, you might say, “How can we
ensure customers receive their orders on time?” Looking at it through a
different lens helps you start to think through and separate opinion
from fact going forward.
Rick Lepsinger: The second is a 180-degree turnaround. If initially you’re thinking about, how can
we encourage employees to follow the new procedure? If you flipped it
around 180 degrees, you might say, “What might we be doing to
discourage employees from following the new procedure?” That puts a
different lens on it. The third is to broaden it, or expand it. If initially
you’re thinking, should we expand our product line in China, you might
reframe it by saying, “How can we achieve increased financial success in
China?” The fourth is to redirect it. If initially you’re posing the question, “How can
we increase revenue?” you could reframe it by saying, “How can we decrease
our costs?” Again, the idea here is to put a different lens on things, to get
a different point of view.
Rick Lepsinger: Let’s see how you do in terms of your ability to determine which is a fact and
which is an assumption. We’ll be doing a number of these during the
course of our time together to give you an opportunity to test your
critical thinking skills. Let’s take a look at the first scenario around
recognizing assumptions.
Rick Lepsinger: When product A was launched in India, TV commercials proved to be
the most effective marketing tool. That’s why, to support the launch of the
product in our market, we should allocate most of the budget to TV
commercials. Which of the following is a fact or an assumption? Number
one: The two markets have similar consumer media preferences. Is that
a fact or is that an assumption? Number two: TV commercials that
supported product A’s launch in India had the highest ROI among media
channels.
Rick Lepsinger: Just use the information that you’ve been presented in this little mini-case without adding too much to it. Sara’s putting up the polling
question right now. Which one is a fact? Pick the one you think is a fact.
We’ll keep it open for a few moments.
Rick Lepsinger: Sara, you can close it when we get about half or so, depending on how
fast they’re coming in.
Rick Lepsinger: Okay, so most people picked the second one, TV commercials that
supported the product launch in India, as the fact. Sara, if you
close that down, we’ll take a look at the right answer. In fact, that is the
correct answer. The first one, the two markets have similar consumer
media preferences, is an assumption that’s being made. There’s no
information in the little scenario that indicates that they had the same
preferences, but you do have supporting information
that says it had the best return on investment overall.
Rick Lepsinger: Let’s take a look at a second scenario. Here it is: “I do not support
telecommuting in our region. When we tried to implement this a few
years ago, the initiative failed because of technology. People could not
access the internet when away from the office, and the narrow
bandwidth made it impossible to hold virtual meetings.”
Rick Lepsinger: Which of the following is a fact? The issues that prevented the
successful rollout of telecommuting previously have not been resolved,
or access to necessary technology has a great impact on the
effectiveness of people working remotely.
Rick Lepsinger: Sara, if you’d put up the poll. Which one of these statements is a fact?
Sara, you can close it whenever … Oh, there we go. Okay. Let’s take a
look at the results.
Rick Lepsinger: Again, people selected number two, access to technology has a great
impact. And most people picked that as the fact overall. Let’s take a look
at the … And in fact, that is the correct answer. We do know, based
on this scenario, that access is critical. We do not have any real data that
says the issues have not been resolved. This particular person who’s
speaking is assuming that the issues have not been resolved, but there’s
no information in this particular case that would indicate that.
Rick Lepsinger: The third is around being reflective and reasonable. Here we’re
talking about personal bias, and avoiding personal bias and applying
common sense. Again, the key here on personal bias, it’s not that you
won’t have bias and preferences. The challenge is to recognize it, and
recognize when it’s affecting your choices and your decisions. When we
think about some mistakes that individuals make, there’s a number of
them. They have to do with bias, with hidden assumptions, with
misinterpreting statistics, jumping to conclusions, rationalizing, and
emotional thinking. Let’s take a look at each one of those in a little bit
more detail.
Rick Lepsinger: Bias is about being attached to a particular belief, right? And being closed
off to information that runs counter to it, right? We have the
idea of a confirmation bias, where you’re only looking for information that
supports your conclusion. You draw the conclusion, then look for
information that supports it, and do not attend to other data. Again, it’s
almost human nature to do it; the key is the awareness that it’s
happening, recognizing it so you can stop it from happening.
Rick Lepsinger: Hidden assumptions are about reaching a conclusion without supporting
evidence. It’s just, “I believe this. There may not be any data around it,
but I believe it overall.” Misinterpreting statistics, I think, is
particularly interesting, especially against this notion of, what’s a fact? I
thought it was a fact, but what is a fact? There’s this idea of
misinterpreting or misusing statistics overall.
Rick Lepsinger: If you take a look at these two items: “Taking this drug reduces the risk of
cancer by 50%.” “Taking this drug reduces the risk of cancer from 6% to
3%.” Which one would be more convincing to you? For most people,
“taking this drug reduces the risk of cancer by 50%” seems much
better than reducing the risk of cancer from 6% to 3%. However, it’s the
same statistic. It’s just being presented or said in a different way. You
could in fact take something that’s truthful, that’s factual, but it might
not necessarily be statistically important or significant in terms of the
conversation overall. I think, again, in many ways people manipulate
data to put the best spin on it. Part of critical thinking is to really dig in
to understand the implication of that data, and what the meaning is
overall.
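To make the arithmetic behind that drug example concrete, here is a minimal sketch (an illustration added for this write-up, not part of the original webinar) showing how the same underlying numbers yield both the "50%" relative framing and the "6% to 3%" absolute framing.

```python
# Hypothetical numbers matching the example above: 6% baseline risk, 3% with the drug.
baseline_risk = 0.06   # risk of cancer without the drug
treated_risk = 0.03    # risk of cancer with the drug

absolute_reduction = baseline_risk - treated_risk        # "from 6% to 3%"
relative_reduction = absolute_reduction / baseline_risk  # "reduces the risk by 50%"

print(f"Absolute risk reduction: {absolute_reduction:.1%}")  # 3.0%
print(f"Relative risk reduction: {relative_reduction:.1%}")  # 50.0%
```

Both statements describe the same data; the critical thinking step is asking which framing answers the question you actually care about.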
Rick Lepsinger: The other is around jumping to conclusions. This is about going with first
impressions, right? Your gut reaction. This is a little bit about the
emotional side as well. This is how I feel about it. This is what I believe.
This is my experience, so it must be true overall. You’re just trying to be
aware, because this can be a result of stress or time pressure, which
causes us to move to what we think is the easiest conclusion. The idea
of rationalization is basically reaching a conclusion and then looking for
evidence, right? We have an opinion, and then we look for data that
supports it overall.
Rick Lepsinger: Emotional thinking is the idea of reacting emotionally, and it could
be because of the language the other person is using, the situation, or
stakes that are high, rather than being more objective. I would say to
you in these situations, it’s not so much to eliminate feelings, to
eliminate your gut reaction, but what you wanna do is make sure you
have a balance. You want both your right brain and your left brain to
sort of be operating together, so that you have kind of the best of both
worlds. You have the objective analytic side, and you have the
subjective sort of gut feeling. Then you can sort of put those two things
together.
Rick Lepsinger: Moving to critical thinking skills overall, there are three of them:
the ability to reason, the ability to predict, and the ability to evaluate.
Once again, these are all learned skills. Some of us have more of a
tendency toward them, but these are things we can all learn and use to enhance our
abilities. The ability to reason is about applying
systematic, logical reasoning to an issue overall. The key here is being
able to find mistakes and inconsistencies in logic and reasoning, and to take
the time to actually look for them, right? You wanna find and
surface hidden assumptions. Thinking about cause
and effect, and avoiding confusing correlation with causation, is part of that analytic thinking.
Rick Lepsinger: For example, say online training participants receive higher scores,
so the conclusion would be that online training is more effective. However,
the problem with that reasoning is that there are probably multiple
factors that might contribute to that outcome. Even though there
might be a relationship between those two, it may not necessarily be
causal. A lot of times when you read studies, they talk about how people who
don’t smoke are smarter, or people who eat kumquats live longer. Well,
there may be a relationship between those two factors, but it doesn’t
necessarily mean that one causes the other. That’s where we need to be
more discerning, more critical, more focused in terms of our thinking.
Rick Lepsinger: There are a couple of questions you can use to help you recognize the quality of
the information and the evidence that you have, and to
recognize the effects of your emotions when you’re evaluating
arguments. Think about the pros and cons. Really step back and
say, “What are my biases?” Be explicit about exploring that.
Think about the impact of the decision on other people. Think about
who would agree or disagree with the proposed solution, and think
about key points, perspectives that you need to keep in mind as you’re
evaluating these particular options. Part of this is not so much to let
critical thinking just happen, it really is the application of some key
questions overall.
Rick Lepsinger: The other idea, sorry, let’s just go back, is to use this idea of the devil’s
advocate. You really do wanna have what-if questions. We frequently
try to suppress people who are asking what-if questions because they
tend to slow things down. We see it as not being a team player, and we
really wanna get on to the next task overall. When people raise,
“Well, what if it doesn’t? What if it does? What if it could?” we really
aren’t open to that, but in point of fact, we should be doing the
opposite. We should be encouraging this kind of questioning, this
raising of issues, to get us to look at the other side of
things. This idea of evaluating information overall, I think, relates
to our cognitive biases and how we go about assessing the
quality and value of information.
Rick Lepsinger: I have a video that I’d like to share with you. It’s a TED Talk. I think you’ll
find it very, very interesting. It’s really about our tendency to be
biased, to look for patterns, to make assumptions, and not to really be
particularly critical about the information we’re looking at.
Rick Lepsinger: Sara, if you could start that video for us please.
Michael Shermer: I am Michael Shermer, the director of the Skeptics Society, publisher of
Skeptic magazine. We investigate claims of the paranormal, pseudoscience, and fringe groups and cults, and claims of all kinds between
science and pseudo-science and non-science and junk science, voodoo
science, pathological science, bad science, non-science, and plain old
nonsense. And unless you’ve been on Mars recently, you know there’s a
lot of that out there. Some people call us debunkers, which is kind of a
negative term. But let’s face it, there’s a lot of bunk. We are like the
bunko squads of the police department out there flushing out … Well,
we’re sort of like the Ralph Naders of bad ideas, trying to replace bad
ideas with good ideas.
Michael Shermer: I’ll show you an example of a bad idea. I brought this with me. This was
given to us by NBC Dateline to test. It’s produced by the Quadro
Corporation of West Virginia. It’s called the Quadro 2000 Dowser Rod.
This was being sold to high school administrators for $900 apiece. It’s a
piece of plastic with a Radio Shack antenna attached to it. You could
dowse for all sorts of things, but this particular one was built to dowse
for marijuana in students’ lockers.
Michael Shermer: The way it works is you go down the hallway, and you see if it tilts
toward a particular locker, and then you open the locker. It looks
something like this. I’ll show you. Well, it has kind of a right-leaning bias.
Well, this is science, so we’ll do a controlled experiment. It’ll go this way
for sure.
Michael Shermer: Sir, do you want to empty your pockets, please, sir?
Michael Shermer: So the question was, can it actually find marijuana in students’ lockers?
And the answer is, if you open enough of them, yes.
Michael Shermer: But in science, we have to keep track of the misses, not just the hits.
And that’s probably the key lesson to my short talk here is that this is
how psychics work, astrologers, tarot card readers and so on. People
remember the hits and forget the misses. In science, we keep the whole
database, and look to see if the number of hits somehow stands out
from the total number that you would expect by chance.
Michael Shermer: In this case, we tested it.
Michael Shermer: We had two opaque boxes. One with government-approved THC
marijuana, and one with nothing, and it got it 50% of the time, which is
exactly what you’d expect with a coin-flip model. That’s just a fun little
example here of the sorts of things we do.
Michael Shermer: Skeptic is the quarterly publication. Each one has a particular theme.
This one is on the future of intelligence. Are people getting smarter or
dumber? I have an opinion of this myself because of the business I’m in,
but in fact, people, it turns out, are getting smarter. Three IQ points per
10 years, going up. Sort of an interesting thing.
Michael Shermer: With science, don’t think of skepticism as a thing, or even science as a
thing. Are science and religion compatible? It’s like, is science and
plumbing compatible? They’re just two different things. Science is not a
thing. It’s a verb. It’s a way of thinking about things. It’s a way of looking
for natural explanations for all phenomena.
Michael Shermer: I mean, what’s more likely, that extraterrestrial intelligences or multidimensional beings travel across the vast distances of interstellar space
to leave a crop circle in Farmer Bob’s field in Puckerbrush, Kansas to
promote skeptic.com, our web page? Or is it more likely that a reader of
Skeptic did this with Photoshop? And in all cases we have to ask, what’s
the more likely explanation? Before we say something is out of this
world, we should first make sure that it’s not in this world. What’s more
likely, that Arnold had a little extraterrestrial help in his run for the
governorship, or that the World Weekly News makes stuff up?
Michael Shermer: Part of that, the same theme is expressed nicely here in this Sidney
Harris cartoon. For those of you in the back, it says here: “Then a
miracle occurs. I think you need to be more explicit here in step two.”
This single slide completely dismantles the intelligent design arguments.
There’s nothing more to it than that. You can say a miracle occurs, it’s
just that it doesn’t explain anything. It doesn’t offer anything. There’s
nothing to test. It’s the end of the conversation for intelligent design
creationists.
Michael Shermer: Whereas, and it’s true, scientists sometimes throw terms out as
linguistic place fillers, dark energy or dark matter, something like that,
until we figure out what it is, we’ll just call it this. It’s the beginning of
the causal chain for science. For intelligent design creationists, it’s the
end of the chain. Again, we can ask this, what’s more likely? Are UFOs
alien spaceships, or perceptual cognitive mistakes, or even fakes?
Michael Shermer: This is a UFO shot from my house in Altadena, California, looking down
over Pasadena. If it looks a lot like a Buick hubcap, it’s because it is. You
don’t even need Photoshop. You don’t need high-tech equipment, you
don’t need computers. This was shot with a throwaway Kodak
Instamatic camera. You just have somebody off on the side with a
hubcap ready to go. Camera’s ready, that’s it.
Michael Shermer: Although it’s possible that most of these things are fake or illusions or so
on, and that some of them are real, it’s more likely that all of them are
fake, like the crop circles. On a more serious note, in all of science we’re
looking for a balance between data and theory. In the case of Galileo, he
had two problems when he turned his telescope to Saturn. First of all,
there was no theory of planetary rings. Second of all, his data was grainy
and fuzzy, and he couldn’t quite make out what it was he was looking at.
He wrote, “I have observed that the furthest planet has
three bodies.” And this is what he ended up concluding that he saw. So
without a theory of planetary rings and with only grainy data, you can’t
have a good theory. It wasn’t solved until 1655.
Michael Shermer: This is Christiaan Huygens’s book in which he cataloged all the mistakes
that people made in trying to figure out what was going on with Saturn.
It wasn’t till Huygens had two things, he had a good theory of planetary
rings and how the solar system operated, and then he had better
telescopic, more fine-grain data in which he could figure out that as the
Earth is going around faster, according to Kepler’s Laws, than Saturn,
then we catch up with it. And we see the angles of the rings at different
angles, there. And that, in fact, turns out to be true.
Michael Shermer: The problem with having a theory is that your theory may be loaded
with cognitive biases. One of the problems of explaining why people
believe weird things is that we have, on a simple level, and then
I’ll go to more serious ones, a tendency to see faces. This
is the face on Mars, from 1976, when there was a whole movement to get
NASA to photograph that area because people thought this was
monumental architecture made by Martians.
Michael Shermer: Well, it turns out, here’s the close-up of it from 2001. If you squint, you
can still see the face. And when you’re squinting, what you’re doing is
you’re turning that from fine-grain to coarse-grain, so you’re reducing
the quality of your data. And if I didn’t tell you what to look for, you’d
still see the face, because we’re programmed by evolution to see faces.
Faces are important for us socially. Of course, happy faces, faces of all
kinds are easy to see. You see the happy face on Mars, there. If
astronomers were frogs, perhaps they’d see Kermit the Frog. Do you see
him there? Little froggy legs. Or if geologists were elephants?
Michael Shermer: Religious iconography. Discovered by a Tennessee baker in 1996. He
charged five bucks a head to come see the nun bun till he got a cease-and-desist from Mother Teresa’s lawyer. Here’s Our Lady of Guadalupe
and Our Lady of Watsonville, just down the street, or is it up the street
from here? Tree bark is particularly good because it’s nice and grainy,
branchy, black-and-white splotchy, and you get the pattern-seeking;
humans are pattern-seeking animals. Here’s the Virgin Mary on the side
of a glass window in Sao Paulo. Here the Virgin Mary made her
appearance on a cheese sandwich, which I got to actually hold in a Las
Vegas casino, of course, this being America.
Michael Shermer: This casino paid $28,500 on eBay for the cheese sandwich. But who
does it really look like? The Virgin Mary? It has that sort of puckered lips,
1940s-era look. Virgin Mary in Clearwater, Florida. I actually went to see
this one. There were a lot of people there. The faithful came in
their wheelchairs and on crutches, and so on. We went down and
investigated. Just to give you a sense of the size, that’s Dawkins, me, and The Amazing
Randi, next to this two-, two-and-a-half-story-sized image. All these
candles, so many thousands of candles people had lit in tribute to this.
We walked around the backside, just to see what was going on. It turns
out wherever there’s a sprinkler head and a palm tree, you get the
effect. Here’s the Virgin Mary on the backside, which they started to
wipe off. I guess you can only have one miracle per building. Is it really a
miracle of Mary, or is it a miracle of Marge?
Michael Shermer: I’m going to finish up with another example of this, with auditory
illusions. There’s this film, White Noise, with Michael Keaton, about the
dead talking back to us. By the way, the whole business of talking to the
dead it’s not that big a deal. Anybody can do it, turns out. It’s getting the
dead to talk back that’s the really hard part. In this case, supposedly,
these messages are hidden in electronic phenomena. There’s a
ReverseSpeech.com web page from which I downloaded this stuff. This is
the most famous one of all of these. Here’s the
forward version of the very famous song.
Michael Shermer: (singing)
Michael Shermer: Boy, couldn’t you just listen to that all day? All right, here it is
backwards, and see if you can hear the hidden messages that are
supposedly in there.
Michael Shermer: (singing)
Michael Shermer: What’d you get?
Speaker 4: Satan.
Michael Shermer: Satan. Okay, at least we got Satan. Now, I’ll prime your auditory part of
your brain to tell you what you’re supposed to hear, and then hear it
again.
Michael Shermer: (singing)
Michael Shermer: You can’t miss it when I tell you what’s there.
Michael Shermer: I’m going to just end with a positive, a nice little story. The Skeptics Society is a
nonprofit educational organization. We’re always looking for little good
things that people do. In England, there’s a pop singer, one of the top
popular singers in England today, Katie Melua. She wrote a beautiful
song. It was in the top five in 2005, called, Nine Million Bicycles in
Beijing. It’s a love story. She’s sort of the Norah Jones of the UK. It’s about
how much she loves her guy, compared to nine million bicycles, and
so forth. And she has this one passage here.
Michael Shermer: (singing)
Michael Shermer: Well, that’s nice. At least she got it close. In America, it would be,
“We’re 6,000 light years from the edge.”
Michael Shermer: But my friend, Simon Singh, the particle physicist now turned science
educator, he wrote the book The Big Bang, and so on, uses every chance
he gets to promote good science. And so he wrote an op-ed piece in The
Guardian about Katie’s song, in which he said, “Well, we know exactly
how far from the edge. It’s 13.7 billion light years, and it’s not a guess.
We know within precise error bars there how close it is. We can say,
although not absolutely true, that it’s pretty close to being true.”
Michael Shermer: And, to his credit, Katie called him up after this op-ed piece came out,
and said, “I’m so embarrassed. I was a member of the astronomy club. I
should’ve known better.” And she re-cut the song. I will end with the
new version.
Michael Shermer: (singing)
Michael Shermer: How cool is that?
Rick Lepsinger: … one way to talk a little bit about the role of cognitive biases, how they
impact our judgment, the way in which we look at information, and the
importance of critical thinking, the importance of being skeptical and
really looking at information overall.
Rick Lepsinger: I’m gonna skip these next couple of slides, at least I’m gonna try to.
Okay, all right. The next area is around the ability to predict. This is
about identifying consequences, especially looking at the unintended
consequences, or the potentially negative consequences of your action
overall. There’s a little delay in the … The idea here is to think about the
outcomes, not just the ones you want, but rather to think a little bit about the
possible potential problems that might be out there overall. That is
really the focus. We tend to focus on what we want, rather than what
might be or what could happen.
Rick Lepsinger: The last is the ability to evaluate overall. This is around assessing the
merit of a conclusion that you’re making overall. What you’re trying to
do is think about, does it follow from the data I have? Is the conclusion
fair and reasonable? And based on the information I have, does it make
sense overall?
Rick Lepsinger: Here are a couple of questions that you can ask yourself to basically help
you determine if your conclusions follow from the information that
you have overall. After evaluating the facts, what’s the best possible
outcome? What specific evidence do I have that’s driving my
conclusion? Is there new evidence that I should be thinking about?
What does your common sense say overall? What’s the timeline? What
opportunities does your conclusion provide? And by being specific, and
asking yourself these questions, you can make it part of your thought
process overall.
Rick Lepsinger: The important thing to remember, I think, around critical thinking is that
decisions should not be seen as final. You really need to be open to new
facts, new information that may give you a different point of view on
things, and may cause you to make a few adjustments overall. What
you’re trying to do is not avoid or ignore new information just because
you’ve made a commitment to a particular point of view.
Rick Lepsinger: We’ll just move ahead a little bit on this. In terms of some of the
obstacles around critical thinking. One of them is a general lack of
awareness. In general, that’s what we’ve been talking about: being
introspective, understanding your own biases, understanding what
some of the key questions are, being a little bit skeptical. That awareness is a key
component of being a critical thinker, and the lack of it is a key obstacle. The others are a poor
decision making culture and the lack of time overall. Let’s take a look at
each one of those.
Rick Lepsinger: In terms of the lack of awareness. One solution is to follow a linear
process. To use a structured process around collecting information,
assessing the information, and drawing a conclusion, which will help in
terms of addressing that awareness piece. Overall, you want to take a
look at the results and basically ask, is it reasonable? What are the
implications? What decisions and actions will you take going
forward?
Rick Lepsinger: A poor decision making culture here is around a culture that rewards
speed and fast decisions. When people do critical thinking self
assessments, one of the things that causes problems is not that they’re
not good critical thinkers. Many people are excellent critical thinkers,
but when you rush the test, or when you get interrupted during the test,
it tends to have a negative impact on your score, on the outcome. This
idea of speed does not always work in your favor overall, right?
Rick Lepsinger: What you’re trying to do also, and this goes back to that devil’s
advocate, that what-if question. Avoid groupthink, where the culture
of the organization is to go along with the group. That can be the death
of critical thinking, where people don’t express an alternate point of
view, don’t share additional information, don’t ask difficult questions
overall.
Rick Lepsinger: The lack of time, I think, is also interesting. Again, it goes back to the
speed thing. It causes suboptimal decisions to be made because you’re
not taking time to explore the relevant information. You’re also working
with incomplete information and higher stress levels. The key question
here is, what information is most relevant, and how do you know that
you have the right information? A structured process can help with that
overall, and a critical thinking model.
Rick Lepsinger: The idea here is to start by examining the issue, set out your decision
criteria, your goal, then gather that information related to your criteria,
and use that to reach a conclusion overall, right? You wanna ask a wide
range of questions when you’re examining the issue overall, but the key
is to set your goal or criteria to be clear about what it is you’ll use to
make that decision overall, right? To be able to define what a successful
outcome would look like before you start to look at information. Now
you’re at the data gathering stage. You wanna collect a wide range of
data, but you also can’t go on an unlimited search for information as
well, right? The key here is to be able to determine what data
will help you make the best decision, and how you know when you have enough
information.
Rick Lepsinger: This chart shows the results of a study that was done where they took a
look at horse handicappers, race handicappers, people who bet on
horses. You can see the red line is the degree of confidence they had,
and the blue line is their performance. You can see that as they were
given more information their confidence increased dramatically, but
their performance did not improve at all. As a matter of fact, there’s sort
of an inflection point where you can see that their performance actually
started to decrease as the information increased. More information is
not necessarily better. What you’re looking for is the right information,
right?
Rick Lepsinger: How do you know when you have the right information? The answer to
that question is a systematic process and a set of decision criteria. The
decision criteria answer two key questions: What will the best
alternative look like? And what are the characteristics of that alternative
that will accomplish our objective in the best way?
Rick Lepsinger: Here’s an example of what decision criteria might look like. This
example is around buying a house, which I think is something many of
you have done. Here on the left, what you’re laying out is the criteria
that basically define the best house for you. In theory, this is what you
would give to the realtor, and you’d say to them, “Only show me houses
that meet these criteria.” Then you’d take a look at two or three or more
alternatives. Now you’re able to compare each alternative to the
criteria to see which alternative meets the criteria the best. The
importance of this, or the usefulness of this, is you’re defining what
information is most relevant, what information is most useful.
Rick Lepsinger: Now your search for information casts a more targeted kind of net:
rather than just looking at everything out there that’s possible, you’re
defining what’s relevant and important. If you can’t fill in one of these
little matrix squares, your search for information is also more targeted
in general because you’re focused on a particular criterion. In addition,
your assessment of the alternatives is more objective because you’re
not comparing alternatives to each other, you’re comparing alternatives
to the decision criteria. Which one meets that criteria the best? It takes
the personal preference out of the equation, and it takes personality out
of the equation, and allows you to be much more objective.
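As one way to picture that criteria-versus-alternatives matrix, here is a minimal sketch (the criteria, weights, and scores are hypothetical and invented for illustration; the webinar presents this as a slide, not code) of scoring each alternative against the decision criteria rather than against the other alternatives.

```python
# Hypothetical decision-criteria matrix for the house-buying example.
# Criteria, weights, and scores are invented for illustration only.
criteria_weights = {
    "within budget": 3,
    "commute under 30 minutes": 2,
    "at least 3 bedrooms": 2,
    "good school district": 1,
}

# How well each alternative meets each criterion (0 = not at all, 5 = fully).
alternatives = {
    "House A": {"within budget": 5, "commute under 30 minutes": 2,
                "at least 3 bedrooms": 4, "good school district": 3},
    "House B": {"within budget": 3, "commute under 30 minutes": 5,
                "at least 3 bedrooms": 5, "good school district": 4},
}

def weighted_score(scores: dict) -> int:
    """Compare an alternative to the criteria, not to the other alternatives."""
    return sum(criteria_weights[c] * scores[c] for c in criteria_weights)

for name, scores in alternatives.items():
    print(name, weighted_score(scores))
```

Because every alternative is scored against the same criteria set before looking at houses, personal preference and personality stay out of the comparison, which is the point Rick makes above.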
Rick Lepsinger: When it comes time to reaching a conclusion, you wanna make sure that
there’s a logical flow between the information you have and the
conclusion that you are drawing overall. You’re also kind of discarding
those that don’t meet your criteria, or may have adverse consequences.
Your decision criteria get at the positive side, but you also
wanna do a potential problem analysis to look at the potential downside.
Rick Lepsinger: All right, so let’s take a look at kind of some general best practices, and
then we can open it up for a few questions if you would like. The key
here is take time to understand and frame the issue. Speed here is not
necessarily your friend. Even when you think you know what the
situation is, or you’ve seen it before, that in fact could be one of the
biases. “I’ve seen this before. I know exactly what it is. I’ll do what I did
before.” Take time to really understand, frame the issue to understand
the problem. Awareness of your biases makes a big difference. It’s not
that you can eliminate your biases. It’s not that you can all of a sudden
not have them. The key is to be aware. Because those biases actually
help you sort through the enormous amount of information that you get
every day. The idea here is to be aware of the bias, and not let it have a
negative impact on the way you’re looking at the information, and the
way you’re drawing conclusions, right?
Rick Lepsinger: Along with that is to check assumptions. Make sure that opinions are
backed up with data. Make sure that the facts you’re looking at are
presented in the most useful and relevant manner, that the information
is not being manipulated, or that people are selectively disclosing
information, right? You want to be diligent about collecting data and
asking questions. A big part of critical thinking is asking questions
overall. Again, timeframes cause us to not do that because it extends
the decision making process, but in fact it’s really just investing time on
the frontend to make a better quality decision so you spend less time on
the backend trying to fix it overall.
Rick Lepsinger: The other thing is that you should work hard to find evidence that
refutes your conclusion. Find information that’s disconfirming. Many of
us work hard to find information that supports our point of view. I
would suggest that that’s self defeating. Look for information that is
actually the opposite of what you would like to do, that refutes it. If in
fact you try to knock down your idea, and it still holds up, you have
much more confidence that it’s the right way to go. So don’t avoid or
minimize that. Embrace contradictory information to really make sure
your idea holds up overall. Focus on objectively testing the logic of your
conclusion. Does it follow from the data that you have overall? Try to
remain open-minded, not too attached to your particular preference or
opinion or conclusion, and be open to other points of view and other
perspectives.
Rick Lepsinger: All right, so just in general, characteristics of critical thinkers. It’s about
being inquisitive, being objective, being reflective. These are things that, if
you’re aware, you can in fact make happen overall. Some of those
things, those characteristics, are not always skill driven. On the skill side,
though, it’s really about reasoning, being able to anticipate or look
forward at potential problems or consequences, and then being able to
evaluate information in an objective manner.
Rick Lepsinger: The obstacles we talked about, the lack of awareness, a culture that
doesn’t support critical thinking, and the lack of time. Those things we
actually can control to some degree. And we want to avoid some of those
mistakes, right? The cognitive biases, rationalizations, and letting
emotional thinking dominate your approach.
Rick Lepsinger: All right, so we have some time for a couple of questions, but I’ll turn it
over to Sara at this point.
Sara Lindmont: Wonderful. Thank you so much, Rick. We do have some time for
questions, so go ahead and type those up, send those in. While we’re
waiting for those, I just want to introduce everyone to HRDQ. We
publish research-based, experiential learning products that you can
deliver in your organization. Today’s session was based on our Critical
Thinking Fundamentals program, and that is a piece that you can deliver yourself,
or if you want one of our expert trainers like Rick to deliver it for you,
we also provide those services.
Sara Lindmont: Good, I can see some questions are coming in. Let’s go ahead and we’re
gonna get started there. Our first question is about emotion. What role
do emotions play in critical thinking and decision making? Are they
always a problem?
Rick Lepsinger: No, I don’t think emotions are always a problem because, again,
emotions can be based on sort of your gut reaction. Your gut reaction is
not necessarily based on nothing, right? It’s based on your experience,
previous experience, your knowledge, which really adds a lot of value.
You may not be able to articulate it clearly, but the emotional side isn’t
always bad. Now, the emotional side, like anger or stress, can definitely
be a problem. The idea there is to sort of take time, sort of like count to
10, and give time for your rational brain to kick in, but you really do wanna
have a balance of both the rational, analytic side and sort of the
intuitive, gut-sense on things because both of them add value. Again,
the challenge is to keep them in balance and not let one dominate the
other overall.
Sara Lindmont: Great, thank you. Our next question is about the culture of the organization.
How can you change a poor decision making culture?
Rick Lepsinger: Yeah, and that can become a little bit more difficult. Part of it is how you define
culture. A lot of it depends on where you are in the organization, but
culture is really kind of the aggregate of individual behavior. And
behavior kind of reflects values and norms overall. It starts with
controlling what you can control as to the best of your ability. For those
of you that actually lead teams, you actually can start to apply some of
these critical thinking concepts to your team overall. Now, again, you
may not be able to impact the entire organization, but you can impact
the culture of your team, the culture of your department overall.
Rick Lepsinger: If you’re just an individual contributor, where you don’t necessarily have
authority or control over how the department or function or
organization is working, the idea is just to start to model it, right? To ask
some of these key questions, to pose alternatives to people. To do it in a
way that’s constructive, that doesn’t necessarily eat up a lot of time, but
that helps people see the value overall. There are a number of ways to
make it happen, but the more senior you are in the organization, the
more, the broader kind of span you can impact by the application of
some of these concepts and tools. It’s really just setting up expectations,
making sure that the tools are available, that people understand what
they are, and that you’re reinforcing their use.
Sara Lindmont: Good, and Rick, I have a question here for you myself.
How do I know that I am good at critical thinking? How
would I have a sense of that?
Rick Lepsinger: Well, there’s a couple of things. One, I mean, it starts with being self-reflective,
and really thinking about, are you questioning assumptions?
Are you using a systematic process? Are you evaluating information?
You could just sort of do a self-checklist. You can also get feedback from
others by debriefing the decision making process. Take a look at
decisions that have been made and say, “How did we go about that
overall?” Debrief it in the context of some of the tools we talked about
today.
Rick Lepsinger: In addition to that, you could also evaluate your critical thinking using
different self-assessments. There are two of them out there. One
of them is the Watson-Glaser Critical Thinking Appraisal, and the other is the
Hogan Business Reasoning Inventory. Both of them will give you some
insight into your ability to think critically. They give you a whole range of
scenarios, and they ask you to draw conclusions based on the scenarios
that you’re looking at. Those are two self-assessments that are publicly available,
from which you’d be able to get some read on your
critical thinking skills.
Sara Lindmont: Good, great. Our last question here, I think, that we’re gonna have time
for is around hiring. Can you talk a little bit about critical thinking in
relation to bias and emotions when you’re making hiring decisions?
Rick Lepsinger: Yeah, right, that’s actually a good question, and that’s one of the places
where it plays out a great deal. One has to do with the time we put in to
really look at the candidate overall, and the kinds of questions we’re
asking. In the hiring process, frequently we ask sort of softball
questions, easy questions. We don’t really challenge people. And we
may not even have a competency model. The competency model would be
your decision criteria. And to say, “How well does this person line up
against these competencies that I’m using to make the hiring decision?”
Rick Lepsinger: That’s one starting point. Do you have the competencies that you’re
looking for, and are you asking questions to better get at whether they
have the competency or not? The other is around biases. If somebody
walked in and we just liked them, or they went to the same school we went
to, we just look for data that supports why we should hire them, right?
You’re trying to avoid that kind of a bias, that confirmation kind of a
bias, or the rationalization bias overall. Or we just don’t really
care. We’ve gotta fill the spot. We’ll take the first warm body that looks
like they can answer my questions and seems reasonably personable. We’ll
just get them into the seat right away.
Rick Lepsinger: The problem with that is that you end up getting into a cycle of having
to continually rehire because you make a bad hiring choice, the person
doesn’t work out, and rather than … the speed there works against you
because rather than finding somebody who will stay with the
organization and perform at a high level, we find someone who really is
not a good fit, they turn over, and then we’re back into the hiring cycle
again.
Rick Lepsinger: Here, it’s about taking the time and using a competency model as your decision
criteria. You might even think about using objective testing, because in
an interview, people know how to answer your questions and give a good
interview, but if you use leadership style questionnaires and
assessments, they tend to be more objective, and you get a better read
on people’s preferences, their inclination and their style, and the extent
to which they’re a good fit, not just the way they’ve been coached to
answer a question in the interview overall.
Sara Lindmont: Good, thank you so much, Rick. It’s always wonderful to hear your
expertise.
Rick Lepsinger: Thanks, thanks very much. I’m glad we were able to do this, and again,
everyone, thank you very much for joining us. Again, this idea of critical
thinking, it’s not that there’s more fake news in the world. It’s just that
there’s less critical thinking. The idea is we really do need to be more
skeptical, more critical, more objective, to probe a little bit, be more
open-minded when you’re hearing different points of view. I think all of
that’s really about enhancing judgment and the quality of your
decisions. Thanks very much for joining us today.
Sara Lindmont: Good, and if we did not get a chance to answer your question, we will
email you shortly with that answer. Go ahead if you have any additional
questions, go ahead and type those in. We’ll stay on the line here for a
second, so I can capture those. We look forward to having you
participate in our next webinar.

Presenter


Rick Lepsinger is President of OnPoint Consulting. Rick’s career has focused on helping organizations and leaders identify and develop leaders, work better virtually, enhance cross-functional team performance, and get from strategy to execution faster. He conducts numerous seminars and workshops on succession management, leading from a distance, leading cross-functional teams, and enhancing execution. Rick has written numerous articles and is the author or co-author of several books.


About HRDQ-U Webinars

HRDQ-U is a free learning community for trainers and facilitators, coaches and consultants, organization development professionals, managers, supervisors and leaders; really anyone who shares a passion for soft-skills training and performance improvement. We bring exciting content to you through webinars from subject matter experts and thought leaders to help you explore new ideas, gain industry insight, and improve people skills in your workplace.