
(TOPS-3E:NU) Test of Problem Solving - Elementary, Third Edition Normative Update



From $42.00 to $431.00

Includes 25 Examiner Record Booklets; Picture Book; Picture Sequence Cards; Manual


About This Product

The TOPS-3E: NU focuses on a student’s linguistic ability to think and reason. Language competence is the verbal indicator of how a student’s language skills affect his or her ability to think, reason, problem solve, infer, classify, associate, predict, determine causes, sequence, and understand directions. The test focuses on a broad range of language-based thinking skills, including clarifying, analyzing, generating solutions, evaluating, and showing affective thinking.

While other tests may assess students’ thinking skills by tapping mathematical, spatial, or nonverbal potential, the TOPS-3E: NU measures discrete skills that form the foundation of language-based thinking, reasoning, and problem-solving ability.

Although the skills tested by the TOPS-3E: NU are necessary for developing social competence, it is not primarily a test of pragmatic or social language skills. Rather, it should be part of a battery of tests and observations used to assess pragmatic competence.

New features:

  • Characteristics of the normative sample were stratified by age relative to region, gender, ethnicity, and socioeconomic factors.
  • The Total Score was renamed the Problem Solving Index and calculated as a standard score with a mean of 100 and a standard deviation of 15.
  • Each item on the test was evaluated using both conventional item analysis to choose “good” items and differential item analysis to find and eliminate potentially biased items.
  • The index score was thoroughly examined for floor and ceiling effects.
  • The test was subjected to diagnostic accuracy analyses, particularly rigorous techniques involving the computation of the receiver operating characteristic/area under the curve (ROC/AUC) statistic.
  • The Manual was reorganized and rewritten to provide more detailed information on the administration, interpretation, and statistical characteristics of the test.
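The renamed Problem Solving Index is reported on a standard-score metric. As a quick illustration of what a mean of 100 and a standard deviation of 15 mean in practice, here is a minimal sketch; the `standard_score` helper and the normative numbers are hypothetical, not PAR's actual norm tables:

```python
# Illustrative sketch of the standard-score metric described above (mean 100,
# SD 15). The normative mean and SD below are invented; real conversions use
# the TOPS-3E: NU norm tables for the student's age.
def standard_score(raw, norm_mean, norm_sd, mean=100, sd=15):
    """Scale a raw score onto a mean-100, SD-15 standard-score metric."""
    z = (raw - norm_mean) / norm_sd      # distance from the norm-group mean
    return round(mean + sd * z)

# Hypothetical age band where raw scores average 30 with an SD of 6:
print(standard_score(30, 30, 6))   # at the mean -> 100
print(standard_score(36, 30, 6))   # one SD above -> 115
```

This is the same metric used by most standardized ability tests, which makes index scores directly comparable across instruments.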

PAR

TOPS-3E: NU

Test of Problem Solving–3 Elementary: Normative Update, by Linda Bowers, MA, SLP; Rosemary Huisingh, MA, SLP; and Carolyn LoGiudice, MA, CCC-SLP. Measures discrete skills that form the foundation of language-based thinking, reasoning, and problem solving.

The TOPS-3E: NU assesses a school-age child’s ability to integrate semantic and linguistic knowledge with reasoning ability by way of picture stimuli and verbal responses.


Features and benefits

  • Characteristics of the normative sample were stratified by age relative to region, gender, ethnicity, and socioeconomic factors; these and other critical variables match those reported for the school-age population in the ProQuest Statistical Abstract of the United States, 2016 (ProQuest, 2016).
  • The TOPS-3E: NU has three components: an Examiner’s Manual, Examiner Record Booklets, and a Picture Book. The Examiner’s Manual includes a comprehensive discussion of the test’s theoretical and research-based foundation, item development, standardization, administration and scoring procedures, norm tables, and guidelines for using and interpreting the test’s results. The Examiner Record Booklet provides space to record responses and transform the raw score to an age equivalent, percentile rank, and the Problem Solving Index. The test kit also includes a Picture Book, which includes the picture stimuli for the test items.

Includes the TOPS-3E: NU Examiner's Manual, TOPS-3E: NU Picture Book, Picture Sequence Cards, and 25 Examiner Record Booklets.


Technical information

Reliability and validity studies were conducted with individuals who have typical language ability and individuals who had been previously diagnosed with a language impairment or received other special education services. The average coefficient alpha is .82 for the Problem Solving Index. Studies were conducted to examine the ability of the test to differentiate students who receive special education services or have language impairments from those who do not. The results demonstrate that a Problem Solving Index cutoff score of 90 resulted in a sensitivity of .75, a specificity of .85, and a ROC/AUC of .74 for differentiating students who receive special education services; and a cutoff score of 92 resulted in a sensitivity of .69, a specificity of .89, and a ROC/AUC of .73 for differentiating students who have a language impairment. Validity of the test composites was demonstrated by correlations to the Universal Nonverbal Intelligence Test–Group Abilities Test (UNIT-GAT; Bracken & McCallum, in development). The coefficient for the Analogic Reasoning subtest was .73, and the coefficient for the Quantitative subtest was .89, both very large.
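To make the cutoff-score statistics above concrete, here is a minimal sketch of how sensitivity, specificity, and the ROC/AUC statistic are computed from two groups' scores. All index scores below are invented for illustration; they are not the TOPS-3E: NU study data.

```python
# Sketch of how a cutoff score yields sensitivity, specificity, and ROC/AUC.
# Scores below are hypothetical, not the TOPS-3E: NU normative data.
def sensitivity_specificity(impaired_scores, typical_scores, cutoff):
    """Scores at or below the cutoff are flagged as impaired."""
    tp = sum(s <= cutoff for s in impaired_scores)   # impaired and flagged
    tn = sum(s > cutoff for s in typical_scores)     # typical and not flagged
    return tp / len(impaired_scores), tn / len(typical_scores)

impaired = [72, 80, 85, 88, 95]      # hypothetical Problem Solving Index scores
typical = [90, 96, 101, 108, 115]

sens, spec = sensitivity_specificity(impaired, typical, cutoff=90)

# ROC/AUC: the probability that a randomly chosen impaired student scores
# below a randomly chosen typical student (ties count as half).
pairs = [(i, t) for i in impaired for t in typical]
auc = sum((i < t) + 0.5 * (i == t) for i, t in pairs) / len(pairs)

print(sens, spec, auc)   # 0.8 0.8 0.96
```

Raising the cutoff trades specificity for sensitivity, which is why the study reports different cutoffs (90 vs. 92) for the two diagnostic questions.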


Self-Assessment • 20 min read

How Good Is Your Problem Solving?

Use a systematic approach.

By the Mind Tools Content Team


Good problem solving skills are fundamentally important if you're going to be successful in your career.

But problems are something that we don't particularly like. They're time-consuming, they muscle their way into already packed schedules, and they force us to think about an uncertain future. And they never seem to go away!

That's why, when faced with problems, most of us try to eliminate them as quickly as possible. But have you ever chosen the easiest or most obvious solution – and then realized that you have entirely missed a much better solution? Or have you found yourself fixing just the symptoms of a problem, only for the situation to get much worse?

To be an effective problem-solver, you need to be systematic and logical in your approach. This quiz helps you assess your current approach to problem solving. By improving this, you'll make better overall decisions. And as you increase your confidence with solving problems, you'll be less likely to rush to the first solution – which may not necessarily be the best one.

Once you've completed the quiz, we'll direct you to tools and resources that can help you make the most of your problem-solving skills.

How Good Are You at Solving Problems?

Instructions

For each statement, click the button in the column that best describes you. Please answer questions as you actually are (rather than how you think you should be), and don't worry if some questions seem to score in the 'wrong direction'. When you are finished, please click the 'Calculate My Total' button at the bottom of the test.

Answering these questions should have helped you recognize the key steps associated with effective problem solving.

This quiz is based on Dr Min Basadur's Simplexity Thinking problem-solving model. This eight-step process follows the circular pattern shown below, within which current problems are solved and new problems are identified on an ongoing basis. This assessment has not been validated and is intended for illustrative purposes only.

Below, we outline the tools and strategies you can use for each stage of the problem-solving process. Enjoy exploring these stages!

Step 1: Find the Problem (Questions 7, 12)

Some problems are very obvious; others are not so easily identified. As part of an effective problem-solving process, you need to look actively for problems – even when things seem to be running fine. Proactive problem solving helps you avoid emergencies and allows you to be calm and in control when issues arise.

These techniques can help you do this:

PEST Analysis helps you pick up changes to your environment that you should be paying attention to. Make sure too that you're watching changes in customer needs and market dynamics, and that you're monitoring trends that are relevant to your industry.

Risk Analysis helps you identify significant business risks.

Failure Modes and Effects Analysis helps you identify possible points of failure in your business process, so that you can fix these before problems arise.

After Action Reviews help you scan recent performance to identify things that can be done better in the future.

Where you have several problems to solve, our articles on Prioritization and Pareto Analysis help you think about which ones you should focus on first.
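The Pareto Analysis mentioned above boils down to simple arithmetic: rank problems by impact and track the cumulative share to find the "vital few" worth tackling first. Here is a minimal sketch; the problem names and counts are invented for illustration:

```python
# A minimal Pareto Analysis sketch: rank problems by impact and track the
# cumulative share. Problem names and counts are invented.
problems = {
    "late deliveries": 45,
    "data entry errors": 30,
    "unclear specs": 15,
    "printer jams": 7,
    "other": 3,
}

total = sum(problems.values())
cumulative = 0
for name, count in sorted(problems.items(), key=lambda kv: -kv[1]):
    cumulative += count
    print(f"{name}: {count} ({100 * cumulative / total:.0f}% cumulative)")
# Here the top two problems account for 75% of the impact -> tackle those first.
```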

Step 2: Find the Facts (Questions 10, 14)

After identifying a potential problem, you need information. What factors contribute to the problem? Who is involved with it? What solutions have been tried before? What do others think about the problem?

If you move forward to find a solution too quickly, you risk relying on imperfect information that's based on assumptions and limited perspectives, so make sure that you research the problem thoroughly.

Step 3: Define the Problem (Questions 3, 9)

Now that you understand the problem, define it clearly and completely. Writing a clear problem definition forces you to establish specific boundaries for the problem. This keeps the scope from growing too large, and it helps you stay focused on the main issues.

A great tool to use at this stage is CATWOE. With this process, you analyze potential problems by looking at them from six perspectives: those of its Customers; Actors (people within the organization); the Transformation, or business process; the World-view, or top-down view of what's going on; the Owner; and the wider organizational Environment. By looking at a situation from these perspectives, you can open your mind and come to a much sharper and more comprehensive definition of the problem.

Cause and Effect Analysis is another good tool to use here, as it helps you think about the many different factors that can contribute to a problem. This helps you separate the symptoms of a problem from its fundamental causes.

Step 4: Find Ideas (Questions 4, 13)

With a clear problem definition, start generating ideas for a solution. The key here is to be flexible in the way you approach a problem. You want to be able to see it from as many perspectives as possible. Looking for patterns or common elements in different parts of the problem can sometimes help. You can also use metaphors and analogies to help analyze the problem, discover similarities to other issues, and think of solutions based on those similarities.

Traditional brainstorming and reverse brainstorming are very useful here. By taking the time to generate a range of creative solutions to the problem, you'll significantly increase the likelihood that you'll find the best possible solution, not just a semi-adequate one. Where appropriate, involve people with different viewpoints to expand the volume of ideas generated.

Tip: Don't evaluate your ideas until step 5. If you do, this will limit your creativity at too early a stage.

Step 5: Select and Evaluate (Questions 6, 15)

After finding ideas, you'll have many options that must be evaluated. It's tempting at this stage to charge in and start discarding ideas immediately. However, if you do this without first determining the criteria for a good solution, you risk rejecting an alternative that has real potential.

Decide what elements are needed for a realistic and practical solution, and think about the criteria you'll use to choose between potential solutions.

Paired Comparison Analysis, Decision Matrix Analysis, and Risk Analysis are useful techniques here, as are many of the specialist resources available within our Decision-Making section. Enjoy exploring these!
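The weighted scoring behind a Decision Matrix Analysis can be sketched in a few lines: score each option against each criterion, weight the criteria, and pick the highest total. The options, criteria, weights, and scores below are all invented for illustration:

```python
# A minimal Decision Matrix Analysis sketch: score each option against
# weighted criteria and pick the highest total. All values are invented.
weights = {"cost": 3, "speed": 2, "risk reduction": 4}

options = {
    "outsource": {"cost": 4, "speed": 5, "risk reduction": 2},
    "hire a contractor": {"cost": 2, "speed": 3, "risk reduction": 4},
    "automate in-house": {"cost": 3, "speed": 2, "risk reduction": 5},
}

totals = {name: sum(weights[c] * score for c, score in scores.items())
          for name, scores in options.items()}
best = max(totals, key=totals.get)

print(totals)   # {'outsource': 30, 'hire a contractor': 28, 'automate in-house': 33}
print(best)     # automate in-house
```

Settling on the weights before scoring the options is what keeps you from rationalizing toward a favorite answer.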

Step 6: Plan (Questions 1, 16)

You might think that choosing a solution is the end of a problem-solving process. In fact, it's simply the start of the next phase in problem solving: implementation. This involves lots of planning and preparation. If you haven't already developed a full Risk Analysis in the evaluation phase, do so now. It's important to know what to be prepared for as you begin to roll out your proposed solution.

The type of planning that you need to do depends on the size of the implementation project that you need to set up. For small projects, all you'll often need are Action Plans that outline who will do what, when, and how. Larger projects need more sophisticated approaches – you'll find out more about these in the article What is Project Management? And for projects that affect many other people, you'll need to think about Change Management as well.

Here, it can be useful to conduct an Impact Analysis to help you identify potential resistance, as well as alert you to problems you may not have anticipated. Force Field Analysis will also help you uncover the various pressures for and against your proposed solution. Once you've done the detailed planning, it can also be useful at this stage to make a final Go/No-Go Decision, making sure that it's actually worth going ahead with the selected option.

Step 7: Sell the Idea (Questions 5, 8)

As part of the planning process, you must convince other stakeholders that your solution is the best one. You'll likely meet with resistance, so before you try to “sell” your idea, make sure you've considered all the consequences.

As you begin communicating your plan, listen to what people say, and make changes as necessary. The better the overall solution meets everyone's needs, the greater its positive impact will be! For more tips on selling your idea, read our article on Creating a Value Proposition and use our Sell Your Idea Skillbook.

Step 8: Act (Questions 2, 11)

Finally, once you've convinced your key stakeholders that your proposed solution is worth running with, you can move on to the implementation stage. This is the exciting and rewarding part of problem solving, which makes the whole process seem worthwhile.

This action stage is an end, but it's also a beginning: once you've completed your implementation, it's time to move into the next cycle of problem solving by returning to the scanning stage. By doing this, you'll continue improving your organization as you move into the future.

Problem solving is an exceptionally important workplace skill.

Being a competent and confident problem solver will create many opportunities for you. By using a well-developed model like Simplexity Thinking for solving problems, you can approach the process systematically, and be comfortable that the decisions you make are solid.

Given the unpredictable nature of problems, it's very reassuring to know that, by following a structured plan, you've done everything you can to resolve the problem to the best of your ability.

This quiz is just one of many Mind Tools quizzes that can help you to evaluate your abilities in a wide range of important career skills.

If you want to reproduce this quiz, you can purchase downloadable copies in our Store.



TOPS-3 ELEMENTARY / COMPLETE KIT

Test of Problem Solving 3 (TOPS-3)

  • Ages 6 to 12
  • Grades 1 - 7
  • Testing Time 35 minutes
  • Administration Individual
  • Product Code 34140 ( MR #059004 )

* Qualifications required to purchase this item.

*DISCONTINUED (*NEW EDITION in Alternatives below)

The test comprises six subtests:

  • The student gives a logical explanation about a situation, combining what he knows or can see with previous experiences and background information. The ability to infer is critical for success in the classroom, academics, and social development.
  • The student determines and explains logical, everyday sequences of events. This skill is critical to academic performance and requires understanding the situation, determining the logical sequence of events, and expressing it clearly.
  • The student is asked to explain why something would not occur or why one shouldn't take a given action in a situation. Responses reveal how well your student notices, attends to, understands, and expresses an appropriate response.
  • The student must recognize the problem, think of alternative solutions, evaluate the options, and state an appropriate solution that will work well. This subtest also includes how to avoid specific problems.
  • The student must anticipate what will happen in the future, drawing from past experiences to reflect on the future. This skill is an academic as well as a life skill.
  • The student must give a logical reason for a given aspect of the situation in the paragraph. To be successful, the student must see the relationship between the action and the outcome.


TOPS-3 Elementary Test Forms (20)


Alternatives and related items


Test of Problem Solving–Elementary: Normative Update (TOPS-3E: NU)

COMPLETE KIT

  • Copyright 2005

BrainFacts.org

Test Your Problem-Solving Skills


Test of Problem Solving 2: Adolescent (TOPS-2:A)

  • understanding/comprehension
  • interpretation
  • self-regulation
  • explanation
  • inference/insight
  • decision-making
  • intent/purpose
  • problem solving
  • acknowledgment
  • Subtest A: Making Inferences: The student is asked to give a logical explanation about a situation, combining what he knows or can see with previous experience/background information. Students who do well on this subtest make plausible inferences, predictions, or interpretations.
  • Subtest B: Determining Solutions: The student is asked to provide a logical solution for some aspect of a situation presented in a passage.
  • Subtest C: Problem Solving: This subtest requires a student to recognize the problem, think of alternative solutions, evaluate the options, and state an appropriate solution for a given situation. It also includes stating how to avoid specific problems.
  • Subtest D: Interpreting Perspectives: A student who does well on this subtest will evaluate other points of view in order to make a conclusion.
  • Subtest E: Transferring Insights: The student is asked to compare analogous situations by using information stated in the passage.

Front Psychol

Intelligence and Creativity in Problem Solving: The Importance of Test Features in Cognition Research


This paper discusses the importance of three features of psychometric tests for cognition research: construct definition, problem space, and knowledge domain. Definition of constructs, e.g., intelligence or creativity, forms the theoretical basis for test construction. Problem space, being well or ill-defined, is determined by the cognitive abilities considered to belong to the constructs, e.g., convergent thinking to intelligence, divergent thinking to creativity. Knowledge domain and the possibilities it offers cognition are reflected in test results. We argue that (a) comparing results of tests with different problem spaces is more informative when cognition operates in both tests on an identical knowledge domain, and (b) intertwining of abilities related to both constructs can only be expected in tests developed to instigate such a process. Test features should guarantee that abilities can contribute to self-generated and goal-directed processes bringing forth solutions that are both new and applicable. We propose and discuss a test example that was developed to address these issues.

The definition of the construct a test is to measure is most important in test construction and application, because cognitive processes reflect the possibilities a task offers. For instance, a test constructed to assess intelligence will operationalize the definition of this construct, which is, in short, finding the correct answer. The definition of a construct also becomes important when selecting tests for the confirmation of a specific hypothesis. One can only find confirmation for a hypothesis if the chosen task instigates the necessary cognitive operations. For instance, in trying to confirm the assumed intertwining of certain cognitive abilities (e.g., convergent and divergent thinking), tasks should be applied that have been shown to yield the necessary cognitive process.

The second test feature, problem space, determines the degrees of freedom cognition has at its disposal in solving a problem. For instance, cognition will go through a wider search path when problem constraints are less well defined and, consequently, the data will differ accordingly.

The third test feature, knowledge domain, is important when comparing results from two different tests. When tests differ in problem space, it is not advisable that they also differ in knowledge domain. For instance, when studying the differences in cognitive abilities between tests constructed to assess convergent thinking (a mostly well-defined problem space) and divergent thinking (a mostly ill-defined problem space), in general test practice both tests also differ in knowledge domain. Hence, the data will reflect cognition operating not only in different problem spaces but also on different knowledge domains, which makes the interpretation of results ambiguous.

The proposed approach to test development and application holds the promise of, first, studying cognitive abilities in different problem spaces while they operate on an identical knowledge domain. Although cognition’s operations have been studied extensively and superbly in both contexts separately, they have rarely been studied in test situations where one or the other test feature is controlled for. The proposed approach also presents a unique method for studying thinking processes in which cognitive abilities intertwine. On the basis of defined abilities, tasks can be developed that have a higher probability of yielding the hypothesized results.

The construct of intelligence is defined as the ability to produce the single best (or correct) answer to a clearly defined question, such as a proof to a theorem (Simon, 1973). It may also be seen as a domain-general ability (g-factor; Spearman, 1904; Cattell, 1967) that has much in common with metacognitive functions, such as metacognitive knowledge, metacognitive monitoring, and metacognitive control (Saraç et al., 2014).

The construct of creativity, in contrast, is defined as the ability to innovate and move beyond what is already known (Wertheimer, 1945/1968; Ghiselin, 1952/1985; Vernon, 1970). In other words, it emphasizes the aspect of innovation. This involves the ability to consider things from an uncommon perspective, transcend the old order (Ghiselin, 1952/1985; Chi, 1997; Ward, 2007), and explore loosely associated ideas (Guilford, 1950; Mednick, 1962; Koestler, 1964; Gentner, 1983; Boden, 1990; Christensen, 2007). Creativity could also be defined as the ability to generate a solution to problems with ill-defined problem spaces (Wertheimer, 1945/1968; Getzels and Csikszentmihalyi, 1976). In this sense it involves the ability to identify problematic aspects of a given situation (Ghiselin, 1952/1985) and, in a wider sense, the ability to define completely new problems (Getzels, 1975, 1987).

Guilford (1956) introduced the constructs of convergent thinking and divergent thinking abilities. Both thinking abilities are important because they allow us insights into human problem solving. On the basis of their definitions, convergent and divergent thinking help us to structurally study human cognitive operations in different situations and over different developmental stages. Convergent thinking is defined as the ability to apply conventional and logical search, recognition, and decision-making strategies to stored information in order to produce an already known answer (Cropley, 2006). Divergent thinking, by contrast, is defined as the ability to produce new approaches and original ideas by forming unexpected combinations from available information and by applying such abilities as semantic flexibility and fluency of association, ideation, and transformation (Guilford, 1959, as cited in Cropley, 2006, p. 1). Divergent thinking brings forth answers that may never have existed before and are often novel, unusual, or surprising (Cropley, 2006).

Guilford (1967) introduced convergent and divergent thinking as part of a set of five operations that apply, in his Structure of Intellect model (SOI model), to six products and four kinds of content, producing 120 different factors of cognitive abilities. With the SOI model Guilford wanted to give the construct of intelligence a comprehensive model. He wanted the model to include all aspects of intelligence, many of which had been seriously neglected in traditional intelligence testing because of a persistent adherence to the belief in Spearman’s g (Guilford, 1967, p. vii). Hence, Guilford envisaged cognition to embrace, among other abilities, both convergent and divergent thinking. After these new constructs were introduced and defined, tests for convergent and divergent thinking emerged. Despite the fact that Guilford reported significant loadings of tests for divergent production on tests constructed to measure convergent production (Guilford, 1967, p. 155), over the years the two modes of thinking came to be treated as separate entities, with convergent thinking tests associated with intelligence and divergent thinking tests with creativity (Cropley, 2006; Shye and Yuhas, 2004). Even intelligence tests that assess aspects of intelligence that supposedly reflect creative abilities do not actually measure creativity (Kaufman, 2015).

The idea that both convergent and divergent thinking are important for solving problems, and that intelligence helps in the creative process, is not really new. In the literature we find models of the creative process that assign certain stages to convergent and divergent thinking: the stages of purposeful preparation at the start and those of critical verification at the end of the process, respectively (Wallas, 1926; Webb Young, 1939/2003). In this view, divergent thinking enables the generation of new ideas, whereas the exploratory activities of convergent thinking enable the conversion of ideas into something new and appropriate (Cropley and Cropley, 2008).

We argue that studying the abilities of divergent and convergent thinking in isolation does not suffice to give us complete insight into all aspects of human problem solving, its constituent abilities, and the structure of its processes. Processes that, through a sequence of thoughts and actions, lead to novel and adaptive productions (Lubart, 2001) place higher demands on cognition, for understanding the situation at hand and planning a path to a possible solution, than the abilities involved in less complex situations (Jaušovec, 1999). Processes that yield self-generated and goal-directed thought are the most complex cognitive processes that can be studied (Beaty et al., 2016). The creative cognition literature is moving toward the view that, especially in processes that yield original and appropriate solutions within a specific context, convergent and divergent abilities intertwine (Cropley, 2006; Ward, 2007; Gabora, 2010).

The approach of intertwining cognitive abilities has also been developed within cognitive neuroscience, by focusing on the intertwining of brain networks (Beaty et al., 2016). In this approach divergent thinking relates to the default brain network, which operates in a defocused or associative mode of thought, yielding spontaneous and self-generated cognition (Beaty et al., 2015). Convergent thinking relates to the executive control network, which operates in focused or analytic modes of thought, yielding updating, shifting, and inhibition (Benedek et al., 2014). Defocused attention theory (Mendelssohn, 1976) states that less creative individuals operate with a more focused attention than creative individuals do. On this account, attending to two things at the same time might result in one analogy, while attending to four things might yield six analogies (Martindale, 1999).
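The arithmetic behind Martindale's example is combinatorial: attending to n elements allows n·(n−1)/2 pairwise comparisons, so two attended items permit one analogy and four permit six. A minimal sketch (the function name is ours, purely illustrative):

```python
from math import comb

def possible_analogies(n_attended):
    """Number of pairwise analogies among n simultaneously attended items:
    the number of unordered pairs, C(n, 2) = n * (n - 1) / 2."""
    return comb(n_attended, 2)

# 2 attended items -> 1 analogy; 4 attended items -> 6 analogies,
# matching the example from Martindale (1999).
two = possible_analogies(2)
four = possible_analogies(4)
```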

In the process of shifting back and forth along the spectrum between associative and analytic modes of thinking, the fruits of associative thought become ingredients for analytic thought processes, and vice versa ( Gabora, 2010 ). In this process, mental imagery is involved as one sensory aspect of the human ability to gather and process information ( Jung and Haier, 2013 ). Mental imagery is fed by scenes in the environment that provide crucial visual clues for creative problem solving and actuates the need for sketching ( Verstijnen et al., 2001 ).

Creative problem solving processes often involve an interactive relationship between imagining, sketching, and evaluating the result of the sketch ( van Leeuwen et al., 1999 ). This interactive process evolves within a type of imagery called “visual reasoning” where forms and shapes are manipulated in order to specify the configurations and properties of the design entities ( Goldschmidt, 2013 ). The originality of inventions is predicted by the application of visualization, whereas their practicality is predicted by the vividness of imagery ( Palmiero et al., 2015 ). Imaginative thought processes emerge from our conceptual knowledge of the world that is represented in our semantic memory system. In constrained divergent thinking, the neural correlates of this semantic memory system partially overlap with those of the creative cognition system ( Abraham and Bubic, 2015 ).

Studies of convergent and divergent thinking abilities have yielded innumerable valuable insights into the cognitive and neurological aspects involved, e.g., reaction times, strategies, brain areas, mental representations, and short- and long-term memory components. Studies on the relationship between the two constructs suggest that it is unlikely that individuals employ similar cognitive strategies when solving more convergent as opposed to more divergent thinking tasks (Jaušovec, 2000). However, to arrive at a quality formulation the creative process cannot do without the application of both convergent and divergent thinking abilities (e.g., Kaufmann, 2003; Runco, 2003; Sternberg, 2005; Dietrich, 2007; Cropley and Cropley, 2008; Silvia et al., 2013; Jung, 2014).

When our aim is to study the networks addressed by the intertwining convergent and divergent thinking processes that are considered to operate when new, original, and yet appropriate solutions are generated, traditional thinking tests such as intelligence tests and creativity tests are not appropriate: each elicits processes related to the definition of only one of the two constructs.

Creative Reasoning Task

Following the new insights gained in cognition research, we need tasks that are developed with the aim of instigating precisely the kind of thinking processes we are looking for. Tasks should also provide a method for scoring the contributions of convergent and divergent thinking independently. As one possible solution we present the Creative Reasoning Task (CRT; Jaarsveld, 2007; Jaarsveld et al., 2010, 2012, 2013).

The CRT presents participants with an empty 3 × 3 matrix and asks them to fill it out, making it as original and complex as possible, by creating components and the relationships that connect them. The created matrix should, in principle, be solvable by another person. The creation of components is entirely free, as is the generation of the relationships that connect them into a completed pattern. Created matrices receive two subscores: Relations, which scores the logical complexity of a matrix and is therefore considered a measure of convergent thinking, and Components and Specifications, which scores originality, fluency, and flexibility and is therefore considered an indication of divergent thinking (for a more detailed description of the scoring method, see Appendix 1 in Supplementary Material).

Psychometric studies with the CRT showed, firstly, that convergent and divergent thinking abilities apply within this task and can be assessed independently. The CRT subscore Relations correlated with the Standard Progressive Matrices test (SPM), and the CRT subscore Components and Specifications correlated with a standard creativity test (TCT–DP, Test of Creative Thinking–Drawing Production; Urban and Jellen, 1995; Jaarsveld et al., 2010, 2012, 2013). The studies further showed that, although a correlation was observed between the intelligence and creativity test scores, no correlation was observed between the CRT subscores relating to intelligent and creative performance (Jaarsveld et al., 2012, 2013; for further details on the CRT's objectivity, validity, and reliability, see Appendix 2 in Supplementary Material).

Reasoning in creative thinking can be defined as the involvement of executive/convergent abilities in the inhibition of ideas and the updating of information ( Benedek et al., 2014 ). Jung (2014) describes a dichotomy for cognitive abilities with at one end the dedicated system that relies on explicit and conscious knowledge and at the other end the improvisational system that relies more upon implicit or unconscious knowledge systems. The link between explicit and implicit systems can actually be traced back to Kris’ psychoanalytic approach to creativity dating from the 1950s. The implicit system refers to Kris’ primary process of adaptive regression, where unmodulated thoughts intrude into consciousness; the explicit system refers to the secondary process, where the reworking and transformation of primary process material takes place through reality-oriented and ego-controlled thinking ( Sternberg and Lubart, 1999 ). The interaction between explicit and implicit systems can be seen to form the basis of creative reasoning, i.e., the cognitive ability to solve problems in an effective and adaptive way. This interaction evolved as a cognitive mechanism when human survival depended on finding effective solutions to both common and novel problem situations ( Gabora and Kaufman, 2010 ). Creative reasoning solves that minority of problems that are unforeseen and yet of high adaptability ( Jung, 2014 ).

Hence, common tests are insufficient when it comes to solving problems that are unforeseen and yet demand high adaptability: they either present problems that are unforeseen and measure certain abilities contained in the construct of creativity, or they address adaptability and measure certain abilities contained in the construct of intelligence. The CRT presents participants with a problem they could not have foreseen; the form is blank and offers no stimuli, whereas all other tests, even creativity tests, present participants with some kind of stimulus. The CRT also addresses adaptability: inventing from scratch a coherent structure that can be solved by another person, much like creating a crossword puzzle. Problems that are unforeseen and demand high adaptability are solved by the application of abilities from both constructs.

Neuroscience of Creative Cognition

Studies in neuroscience showed that cognition operating in ill-defined problem space not only applies divergent thinking but also benefits from additional convergent operations ( Gabora, 2010 ; Jung, 2014 ). Understanding creative cognition may be advanced when we study the flow of information among brain areas ( Jung et al., 2010 ).

In a cognitive neuroscience study with the CRT we focused on the cognitive process evolving within this task. Participants performed the CRT while electroencephalography (EEG) alpha activity was recorded. EEG alpha synchronization in frontal areas is understood as an indication of top-down control (Cooper et al., 2003). When observed in frontal areas for divergent and convergent thinking tasks, it may not reflect a brain state specific to creative cognition but could be attributed to the high processing demands typically involved in creative thinking (Benedek et al., 2011). Top-down control relates to volitionally focusing attention on task demands (Buschman and Miller, 2007). That this control plays a role in tasks with an ill-defined problem space was shown when EEG alpha synchronization was stronger for individuals engaged in creative ideation tasks than in intelligence-related tasks (Fink et al., 2007, 2009; Fink and Benedek, 2014). This activation was also found for the CRT: task-related alpha synchronization showed that convergent thinking was integrated in the divergent thinking processes. Analyses of the stages of the CRT process showed that this alpha synchronization was especially visible at prefrontal and frontal sites at the start of the creative process, when information processing was most demanding due to the multiplicity of ideas, and again at the end of the process, due to the narrowing down of alternatives (Jaarsveld et al., 2015).

A functional magnetic resonance imaging (fMRI) study (Beaty et al., 2015) with a creativity task in which cognition had to meet specific constraints showed the networks involved: the default mode network, which drives toward abstraction and metaphorical thinking, and the executive control network, which drives toward certainty (Jung, 2014). Control involves not only the maintenance of patterns of activity that represent goals and the means to achieve them (Miller and Cohen, 2001), but also their voluntary suppression when no longer needed, as well as flexible shifting between different goals and mental sets (Abraham and Windmann, 2007). Attention can be focused volitionally by top-down signals derived from task demands and automatically by bottom-up signals from salient stimuli (Buschman and Miller, 2007). The intertwining of top-down and bottom-up attention processes in creative cognition ensures a broadening of attention in free associative thinking (Abraham and Windmann, 2007).

These studies support and enhance the findings of creative cognition research in showing that the generation of original and applicable ideas involves an intertwining between different abilities, networks, and attention processes.

Problem Space

A problem space is an abstract representation, in the mind of the problem solver, of the encountered problem and of the solution asked for (Simon and Newell, 1971; Simon, 1973; Hayes and Flowers, 1986; Kulkarni and Simon, 1988; Runco, 2007). The space that comes with a certain problem can, according to the constraints formulated for the solution, be labeled well-defined or ill-defined (Simon and Newell, 1971). Accordingly, the original problems are labeled closed and open problems, respectively (Jaušovec, 2000).

A problem space contains all possible states that are accessible to the problem solver from the initial state, through iterative application of transformation rules, to the goal state (Newell and Simon, 1972; Anderson, 1983). The initial state presents the problem solver with a task description that defines the requirements a solution has to meet. The goal state represents the solution. The proposed solution is the product of applying transformation rules (algorithms and heuristics) to a series of successive intermediate solutions, and of iterative evaluations of preceding solutions and decisions based on these evaluations (Boden, 1990; Gabora, 2002; Jaarsveld and van Leeuwen, 2005; Goldschmidt, 2014). Whether all possible states need to be passed through depends on the problem space being well- or ill-defined, and this, in turn, depends on the character of the task description.
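The classical picture of a well-defined problem space (initial state, transformation rules, goal state) can be sketched as a simple search procedure. The sketch below is our own illustration, not an implementation from the cited models; the particular states, rules, and goal test are invented for the example:

```python
from collections import deque

def solve(initial_state, transformation_rules, is_goal, max_depth=20):
    """Breadth-first traversal of a problem space: visit the states
    reachable from the initial state via iterative application of the
    transformation rules until a goal state is found."""
    frontier = deque([(initial_state, [initial_state])])
    visited = {initial_state}
    while frontier:
        state, path = frontier.popleft()
        if is_goal(state):
            return path  # the sequence of intermediate solutions
        if len(path) > max_depth:
            continue
        for rule in transformation_rules:
            nxt = rule(state)
            if nxt not in visited:
                visited.add(nxt)
                frontier.append((nxt, path + [nxt]))
    return None

# A toy well-defined problem: reach the goal state 11 from the initial
# state 1 using the transformation rules "add 1" and "double".
path = solve(1, [lambda s: s + 1, lambda s: s * 2], lambda s: s == 11)
```

Because every rule and the goal test are explicit, the whole space can be searched mechanically; in an ill-defined space, by contrast, the rules and the goal test themselves must first be constructed by the problem solver.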

When a task description clearly states the requirements a solution has to meet, the inferences made will show few idiosyncratic aspects and will adhere to the task constraints. As a result, fewer alternative paths are open to the problem solver, and the search for a solution evolves in a well-defined space. Vice versa, when task or problem descriptions are fuzzy and underspecified, the problem solver's inferences are more idiosyncratic; the resulting process will evolve within an ill-defined space and will contain more generative-evaluative cycles in which new goals are set and the cycle is repeated (Dennett, 1978, as cited in Gabora, 2002, p. 126).

Tasks that evolve in well-defined problem space include traditional intelligence tests (e.g., the Wechsler Adult Intelligence Scale, WAIS; and the SPM, Raven, 1938/1998). These tests consist of different types of questions, each testing a different component of intelligence. They are used in testing practice to assess reasoning abilities in diverse domains, such as the abstract, logical, spatial, verbal, numerical, and mathematical domains. They have clearly stated task descriptions, and each item has one and only one correct solution, which has to be generated from memory or chosen from a set of alternatives, as in multiple-choice formats. Tests can be constructed to assess crystallized or fluid intelligence. Crystallized intelligence represents abilities acquired through learning, practice, and exposure to education, while fluid intelligence represents a more basic capacity that is valuable for reasoning and problem solving in contexts not necessarily related to school education (Carroll, 1982).

Tasks that evolve in ill-defined problem space include standard creativity tests. These tests ask for a multitude of ideas to be generated in association with a given item or situation (e.g., "think of as many titles for this story"); therefore, they are also labeled divergent thinking tests. Although they assess originality, fluency, flexibility of responses, and elaboration, they are not constructed to score appropriateness or applicability. Divergent thinking tests assess one limited aspect of what makes an individual creative. Creativity also depends on variables such as affect and intuition; therefore, divergent thinking can only be considered an indication of an individual's creative potential (Runco, 2008). More precisely, divergent thinking explains just under half of the variance in adult creative potential, which is more than three times the contribution of intelligence (Plucker, 1999, p. 103). Creative achievement, by contrast, is commonly assessed by means of self-reports such as biographical questionnaires in which participants indicate their achievements across various domains (e.g., literature, music, or theater).

Studies with the CRT showed that problem space differentially affects the processing of complex information and the comprehension of relationships between components. Problem space did not affect the ability to process complex information: this ability showed equal performance in well- and ill-defined problem spaces (Jaarsveld et al., 2012, 2013). However, problem space did affect the comprehension of relationships, as shown by the different frequencies of relationships solved and created (Jaarsveld et al., 2010, 2012). Problem space also affected the neurological activity displayed when individuals solve open or closed problems (Jaušovec, 2000).

Problem space further affected trends across grade levels of primary school children for relationships solved in well-defined and applied in ill-defined problem space. Only one of the 12 relationships defined in the CRT, namely Combination, showed an increase with grade for both types of problem space (Jaarsveld et al., 2013). In the same study, cognitive development in the CRT showed in shifts of preference for certain relationships. These shifts seem to correspond to Piaget's developmental stages (Piaget et al., 1977; Siegler, 1998), which are in evidence in the CRT but not in the SPM (Jaarsveld et al., 2013).

Design Problems

Design problems represent a subcategory of problems with an ill-defined problem space. In contrast to divergent thinking tasks, which ask for the generation of a multitude of ideas, in design tasks interim ideas are nurtured and incrementally developed until they are appropriate for the task; ideas are rarely discarded and replaced with new ones (Goel and Pirolli, 1992). The CRT can be considered a design problem because it yields (a) one possible solution and (b) an iterative thinking process that involves the realization of a vague initial idea. In the CRT a created matrix, itself a closed problem, is created within an ill-defined problem space. Design problems can be found, e.g., in engineering, industrial design, advertising, software design, and architecture (Sakar and Chakrabarti, 2013); they can, however, also be found in the arts, e.g., poetry, sculpting, and dance choreography.

These complex problems are partly determined by unalterable needs, requirements, and intentions, but the major part of the design problem is undetermined (Dorst, 2004). This author points out that, besides containing an original and a functional value, these types of problems contain an aesthetic value. He further states that the interpretation of the design problem and the creation and selection of possible suitable solutions can only be decided during the design process, on the basis of proposals made by the designer.

In design problems the generation stage may be considered a divergent thinking process, not in the sense that it moves in multiple directions or generates multiple possibilities, as in divergent thinking tests, but in the sense that it unrolls by considering an initially vague idea from different perspectives until it comes into focus and requires further processing to become viable. These processes can be characterized by a set of invariant features (Goel and Pirolli, 1992), e.g., structuring, iteration, and coherence.

Structuring of the initial situation is required in design processes before solving can commence. The problem contains little structured and clear information about its initial state and about the requirements of its solution. Therefore, design problems allow or even require re-interpretation of transformation rules; for instance, rearranging the location of furniture in a room according to a set of desirable outcomes. Here one uncovers implicit requirements that introduce a set of new transformations and/or eliminate existing ones ( Barsalou, 1992 ; Goel and Pirolli, 1992 ) or, when conflicting requirements arise, one creates alternatives and/or introduces new trade-offs between the conflicting constraints ( Yamamoto et al., 2000 ; Dorst, 2011 ).

A second aspect of design processes is their iterative character. After structuring and planning, a vague idea emerges, which is the result of the merging of memory items. A vague idea is a cognitive structure that, halfway through the creative process, is still ill-defined and can therefore be said to exist in a state of potentiality (Gabora and Saab, 2011). Design processes unroll in an iterative way through the inspection and adjustment of the generated ideas (Goldschmidt, 2014). New meanings are created and realized while the creative mind imposes its own order and meaning on the sensory data and, through creative production, furthers its own understanding of the world (Arnheim, 1962/1974, as cited in Grube and Davis, 1988, pp. 263–264).

A third aspect of design processes is coherence. Coherence theories characterize coherence in, for instance, philosophical problems and psychological processes in terms of the maximal satisfaction of multiple constraints, and compute coherence using, among others, connectionist algorithms (Thagard and Verbeurgt, 1998). Another measure of coherence is characterized as continuity in design processes. This measure was developed for a design task (Jaarsveld and van Leeuwen, 2005) and was calculated from the occurrence of a given pair of objects in a sketch, expressed as a percentage of all the sketches in a series. In a series of sketches, participants designed a logo for a new soft drink. Design series strong in coherence also received a high score for their final design, as assessed by professionals in various domains. This indicates that participants with a high score for the creative quality of their final sketch seemed better at assessing their design activity in relation to the continuity of the process and, thereby, at navigating the ill-defined space of a design problem (Jaarsveld and van Leeuwen, 2005). In design problems the quality of cognitive production depends, in part, on the ability to reflect on one's own creative behavior (Boden, 1996) and to monitor how far along in the solving process one is (Gabora, 2002). Hence, design problems are especially suited to studying more complex problem solving processes.
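The continuity measure described above can be sketched computationally, assuming that "occurrence of a given pair of objects" means co-occurrence of both objects in the same sketch; the function name and the logo objects below are invented for illustration and are not taken from Jaarsveld and van Leeuwen (2005):

```python
from itertools import combinations

def pair_coherence(sketches):
    """For every pair of objects appearing anywhere in a design series,
    compute the percentage of sketches in which both objects co-occur."""
    objects = sorted(set().union(*sketches))
    scores = {}
    for a, b in combinations(objects, 2):
        co_occurrences = sum(1 for sketch in sketches if a in sketch and b in sketch)
        scores[(a, b)] = 100.0 * co_occurrences / len(sketches)
    return scores

# A hypothetical series of four logo sketches, each represented as the
# set of objects it depicts.
series = [{"bottle", "sun"}, {"bottle", "sun", "wave"},
          {"bottle", "wave"}, {"bottle", "sun"}]
coherence = pair_coherence(series)
# ("bottle", "sun") co-occurs in 3 of 4 sketches -> 75.0
```

On this measure, a series in which the same object pairs recur across many sketches scores high in continuity, matching the intuition that coherent design processes keep developing the same core configuration rather than replacing it.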

Knowledge Domain

A knowledge domain represents a discipline or field of study organized by general principles, e.g., the domains of the various arts and sciences. It contains accumulated knowledge that can be divided into diverse content domains, together with the relevant algorithms and heuristics. We also speak of knowledge domains when referring to, e.g., visuo-spatial and verbal domains. This latter differentiation may refer to the method by which performance in a certain knowledge domain is assessed, e.g., a visuo-spatial physics task that assesses the content domain of the workings of mass and weight of objects.

In comparing test results, we should keep in mind that, apart from reflecting cognitive processes evolving in different problem spaces, the results also arise from cognition operating on different knowledge domains. We argue that the still contradictory and inconclusive discussion about the relationship between intelligence and creativity (Silvia, 2008) should involve the issue of knowledge domain.

Intelligence tests contain items that pertain to, e.g., verbal, abstract, mechanical, and spatial reasoning abilities, while their content mostly operates on knowledge domains related to the contents of school curricula. Items of creativity tests, by contrast, pertain to more idiosyncratic knowledge domains, their contents relating to associations between stored personal experiences (Karmiloff-Smith, 1992). The influence of knowledge domain on the relationships between different test scores was already noted by Guilford (1956, p. 169), who expected a higher correlation between scores on a typical intelligence test and a divergent thinking test than between scores on two divergent thinking tests, because the former pair operated on identical information and the latter pair on different information.

Studies with the CRT showed that when knowledge domain is controlled for, the development of intelligence operating in ill-defined problem space does not compare to that of traditional intelligence but develops more similarly to the development of creativity ( Welter et al., in press ).

Relationship Intelligence and Creativity

The Threshold theory (Guilford, 1967) predicts a relationship between intelligence and creativity up to approximately an intelligence quotient (IQ) level of 120, but not beyond (Lubart, 2003; Runco, 2007). The Threshold theory was corroborated when creative potential was found to be related to intelligence up to certain IQ levels; however, the theory was refuted when the focus lay on achievement in creative domains: creative achievement benefited from higher intelligence even at fairly high levels of intellectual ability (Jauk et al., 2013).

Distinguishing between the subtypes of general intelligence known as fluid and crystallized intelligence (Cattell, 1967), Sligh et al. (2005) observed an inverse threshold effect with fluid IQ: a correlation with creativity test scores in the high-IQ group but not in the average-IQ group. Creative achievement was also shown to be affected by fluid intelligence (Beaty et al., 2014). Intelligence defined as fluid IQ, verbal fluency, and strategic abilities showed a higher correlation with creativity scores (Silvia, 2008) than intelligence defined as crystallized intelligence. Creativity tests that involve convergent thinking (e.g., the Remote Associates Test; Mednick, 1962) showed higher correlations with intelligence than those involving only divergent thinking (e.g., the Alternate Uses Test; Guilford et al., 1978).

That the Remote Associates Test also involves convergent thinking follows from its instructions: presented with a stimulus word (e.g., table), one is asked to produce the first word one thinks of (e.g., chair). The word pair table–chair is a common association; more remote is the pair table–plate, and quite remote is table–shark. According to Mednick's theory, (a) all cognitive work is done essentially by combining or associating ideas, and (b) individuals with more commonplace associations have an advantage in well-defined problem spaces, because the class of relevant associations is already implicit in the statement of the problem (Eysenck, 2003).

To circumvent the problem of tests differing in knowledge domain, one can derive from a single task both a more divergent and a more convergent thinking task by asking, on the one hand, for the generation of original responses and, on the other hand, for more common responses (Jauk et al., 2012). By changing the instruction of a task from convergent to divergent, one changes the constraints the solution has to meet and, thereby, the freedom of operation that cognition has (Razumnikova et al., 2009; Limb, 2010; Jauk et al., 2012). However, asking for more common responses still constitutes a divergent thinking task, because it instigates a generative and ideational process.

Indeed, studying the relationship between intelligence and creativity with knowledge domain controlled for yielded results different from those defined in the Threshold theory. A study in which knowledge domain was controlled for showed, firstly, that intelligence is not a predictor of the development of creativity (Welter et al., 2016) and, secondly, that the relationship between intelligence and creativity test scores as defined under the Threshold theory was observed only in a small subset of primary school children, namely female children in Grade 4 (Welter et al., 2016). We state that relating the results of operations performed by cognitive abilities in well-defined and in ill-defined problem spaces can only be informative when it is ensured that the cognitive processes also operate on an identical knowledge domain.

Intertwining of Cognitive Abilities

Eysenck (2003) observed that there is little justification for considering the constructs of divergent and convergent thinking in categorical terms in which one construct excludes the other. In processes that yield original and appropriate solutions, convergent and divergent thinking both operate on the same large knowledge base, and the underlying cognitive processes are not entirely dissimilar (Eysenck, 2003, pp. 110–111).

Divergent thinking is especially effective when it is coupled with convergent thinking (Runco, 2003; Gabora and Ranjan, 2013). A design problem study (Jaarsveld and van Leeuwen, 2005) showed that divergent production was active throughout the design process, as new meanings were continuously added to the evolving structure (Akin, 1986), and that convergent production became increasingly important toward the end of the process, as earlier productions were wrapped up and integrated into the final design. These findings are in line with the assumptions of Wertheimer (1945/1968), who stated that thinking within ill-defined problem space is characterized by two points of focus: one is to work on the parts, the other to make the central idea clearer.

Parallel to the discussion about the intertwining of convergent and divergent thinking abilities in processes that evolve in ill-defined problem space, we find the discussion about how intelligence may facilitate creative thought. This showed when top-down cognitive control advanced divergent processing in the generation of original ideas and a certain measure of cognitive inhibition advanced the fluency of idea generation (Nusbaum and Silvia, 2011). Fluid intelligence and broad retrieval ability, considered as intelligence factors in a structural equation study, both contributed to the production of creative ideas in a metaphor generation task (Beaty and Silvia, 2013). The notion that creative thought involves top-down, executive processes was supported by a latent variable analysis in which inhibition primarily promoted the fluency of ideas, and intelligence promoted their originality (Benedek et al., 2012).

Definitions of the Constructs Intelligence and Creativity

The various definitions of the constructs of intelligence and creativity show a problematic overlap, which stems from the enormous endeavor to agree unanimously on a valid description of each construct. Spearman (1927), after having attended many symposia aimed at defining intelligence, stated that "in truth, 'intelligence' has become a mere vocal sound, a word with so many meanings that finally it has none" (p. 14).

Intelligence is expressed in terms of adaptive, goal-directed behavior; and the subset of such behavior that is labeled “intelligent” seems to be determined in large part by cultural or societal norms ( Sternberg and Salter, 1982 ). The development of the IQ measure is discussed by Carroll (1982) : “Binet (around 1905) realized that intelligent behavior or mental ability can be ranged along a scale. Not much later, Stern (around 1912) noticed that, as chronological age increased, variation in mental age changes proportionally. He developed the IQ ratio, whose standard deviation would be approximately constant over chronological age if mental age was divided by chronological age. With the development of multiple-factor-analyses (Thurstone, around 1935) it could be shown that intelligence is not a simple unitary trait because at least seven somewhat independent factors of mental ability were identified.”
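Stern's ratio IQ, as described in the passage above, is simply mental age divided by chronological age, conventionally scaled by 100 so that average performance yields a score of 100. A minimal sketch (the example ages are invented):

```python
def ratio_iq(mental_age, chronological_age):
    """Stern's ratio IQ: mental age divided by chronological age,
    scaled by 100 so that performing at one's own age level gives 100."""
    return 100.0 * mental_age / chronological_age

# A 10-year-old performing at the level of a typical 12-year-old:
iq = ratio_iq(mental_age=12, chronological_age=10)  # 120.0
```

Stern's observation that variation in mental age grows proportionally with chronological age is exactly what makes this ratio roughly constant in standard deviation across ages; modern deviation IQs replace the ratio with a position in the age-group distribution.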

Creativity is defined as a combined manifestation of novelty and usefulness (Jung et al., 2010). Although creativity is often identified with divergent thinking, and performance on divergent thinking tasks predicts, e.g., the quantity of creative achievements (Torrance, 1988, as cited in Beaty et al., 2014) and the quality of creative performance (Beaty et al., 2013), it cannot be identified with divergent thinking alone.

Divergent thinking often leads to highly original ideas that are honed into appropriate ideas by evaluative processes of critical thinking and by valuative and appreciative considerations (Runco, 2008). Divergent thinking tests should therefore be considered estimates of creative problem-solving potential rather than of actual creativity (Runco, 1991). Moreover, divergent thinking is not specific enough to help us understand which mental processes, or cognitive abilities, yield creative thoughts (Dietrich, 2007).

Although current definitions of intelligence and creativity try to assign each construct a unique set of cognitive abilities, analyses show that definitions vary in the degree to which each includes abilities generally considered to belong to the other construct (Runco, 2003; Jaarsveld et al., 2012). Abilities considered to belong to the construct of intelligence, such as hypothesis testing, inhibition of alternative responses, and creating mental images of new actions or plans, are also considered to be involved in creative thinking (Fuster, 1997, as cited in Colom et al., 2009, p. 215). The ability to evaluate, for instance, which is considered to belong to the construct of intelligence and assesses the match between a proposed solution and task constraints, has long been considered to play a role in creative processes that goes beyond the mere generation of a series of ideas, as in creativity tasks (Wallas, 1926, as cited in Gabora, 2002, p. 1; Boden, 1990).

The Geneplore model (Finke et al., 1992) explicitly models this idea: stages in which objects are merely generated are followed by phases in which an object’s utility is explored and estimated. The generation phase brings forth preinventive objects, imaginary objects that are generated without any constraints in mind. In exploration, these objects are evaluated for their possible functionalities. In anticipating the functional characteristics of generated ideas, convergent thinking is needed to apprehend the situation, make evaluations (Kozbelt, 2008), and consider the consequences of a chosen solution (Goel and Pirolli, 1992). Convergent reasoning in creativity tasks invokes criteria of functionality and appropriateness (Halpern, 2003; Kaufmann, 2003), goal directedness and adaptive behavior (Sternberg, 1982), as well as the abilities of planning and attention. Convergent thinking stages may even require divergent thinking subprocesses to identify restrictions on proposed new ideas and suggest requisite revision strategies (Mumford et al., 2007). Hence, evaluation, although considered to belong to the construct of intelligence, is also functional in creative processes.

In contrast, the ability of flexibility, which is considered to belong to the construct of creativity and denotes an openness of mind that ensures the generation of ideas from different domains, showed, as a factor component of latent divergent thinking, a relationship with intelligence (Silvia, 2008). Flexibility was also found to play an important role in intelligent behavior, where it enables us to do novel things smartly in new situations (Colunga and Smith, 2008). These authors studied children’s generalizations of novel nouns and concluded that if we are to understand human intelligence, we must understand the processes that produce inventiveness; they propose to include the construct of flexibility within that of intelligence. The definitions of the constructs we set out to measure thus affect test construction and the resulting data. The overlap between definitions discussed above yields a test diversity that makes it impossible to interpret the different findings across studies with any confidence (Arden et al., 2010). Kim (2005) likewise concluded that, because of differences in tests and administration methods, the observed correlation between intelligence and creativity was negligible. As the various definitions of the constructs of intelligence and creativity show problematic overlap, we propose to circumvent the discussion about which cognitive abilities belong to which construct, and to consider both constructs as being involved in one design process. This approach allows us to study the contribution of the various defined abilities to this process, without one construct excluding the other.

Reasoning Abilities

The CRT is a psychometric tool built on an alternative construct of human cognitive functioning, creative reasoning: a thinking process understood as the cooperation between cognitive abilities related to intelligent and creative thinking.

In generating relationships for a matrix, reasoning, and more specifically the ability of rule invention, is applied. Rule invention can be considered an extension of the sequence of rule learning, rule inference, and rule application, implying that creativity is an extension of intelligence (Shye and Goldzweig, 1999). According to this model, we could expect different results between a task assessing rule learning and rule inference and a task assessing rule application. In two studies, rule learning and rule inference were assessed with the RPM, and rule application was assessed with the CRT. From Grades 1 to 4, the frequencies of relationships applied did not correlate with those solved (Jaarsveld et al., 2010, 2012). Performance on the CRT thus offers an insight into cognitive abilities operating on relationships among components that differs from the insight based on performance, within the same knowledge domain, on a matrix solving task. Hence, reasoning abilities lead to different performances when applied to closed versus open problems.

We assume that reasoning abilities are reflected more clearly when one formulates a matrix from scratch; in the process of thinking and drawing one has, so to speak, to solve one’s own matrix. In doing so, one explains to oneself the relationship(s) realized so far and what one would like to attain. Drawing is thinking a problem through aloud, and it aids the designer’s thinking processes by providing some “talk-back” (Cross and Clayburn Cross, 1996). Explanatory activity enhances learning through increased depth of processing (Siegler, 2005). An analysis of explanations of worked examples in physics problems showed that they clarify and specify the conditions and consequences of actions and explicate tacit knowledge, thereby enhancing and completing an individual’s understanding of principles relevant to the task (Chi and VanLehn, 1991). A constraint of the CRT is that the matrix must, in principle, be solvable by another person. Therefore, in a kind of inner explanatory discussion, the designer makes observations of progress and uses evaluations and decisions to answer this constraint. Because of this, open problems in which certain constraints have to be met constitute a powerful mechanism for promoting understanding and conceptual advancement (Chi and VanLehn, 1991; Mestre, 2002; Siegler, 2005).

Convergent and divergent thinking processes have been studied with a variety of intelligence and creativity tests, respectively. Relationships between performances on these tests have been demonstrated, and a large number of research questions have been addressed. However, the fact that intelligence and creativity tests vary in the definition of their construct, in their problem space, and in their knowledge domain poses methodological problems for the validity of comparisons of test results. When we want to focus on one cognitive process, e.g., intelligent thinking, and on its different performances in well- or ill-defined problem situations, we need pairs of tasks that are constructed along identical definitions of the construct to be assessed and that are identical in their knowledge domain but differ in the description of their constraints.

One such possible pair, the Progressive Matrices Test and the CRT, was suggested here. The CRT was developed on the basis of creative reasoning, a construct that assumes the intertwining of intelligence- and creativity-related abilities in the search for original and applicable solutions. Matched with the Matrices test, results indicated that, besides similarities, intelligent thinking also showed considerable differences between the two problem spaces. Hence, with knowledge domain controlled and only differences in problem space remaining, comparison of the data yielded new results on the operations of intelligence. Data gathered from intelligence and creativity tests, whether performance scores or physiological measurements based on, e.g., EEG and fMRI methods, reflect cognitive processes performing on a certain test, which was constructed on the basis of a certain definition of the construct it was meant to measure. Data also reflect processes evolving within a certain problem space and cognitive abilities operating on a certain knowledge domain.

Data can reveal brain networks involved in the performance of certain tasks, e.g., traditional intelligence and creativity tests, but data will always be related to the characteristics of the task. Characteristics such as problem space and knowledge domain originate in the construction of the task, and the construction, in its turn, is affected by the definition of the construct the task is meant to measure.

Here we present the CRT as one possible solution to the described problems in cognition research. However, for research on relationships among test scores, other pairs of tests are imaginable, e.g., pairs of tasks operating on the same domain where one task has a well-defined problem space and the other an ill-defined one. It is conceivable that pairs of tests could operate, besides on the domain of mathematics, on content of, e.g., visuo-spatial, verbal, and musical domains. Pairs of tests have also been constructed by changing the instruction of a task, with instructions instigating a more convergent or a more divergent mode of response (Razumnikova et al., 2009; Limb, 2010; Jauk et al., 2012; Beaty et al., 2013).

The CRT involves the creation of components and their relationships for a 3 × 3 matrix. Matrices created in the CRT are original in the sense that they all bear individual markers, and they are applicable in the sense that they can, in principle, be solved by another person. We showed that the CRT instigates a real design process: creators’ cognitive abilities are wrapped up in a process that should produce a closed problem within an ill-defined problem space.

For research on the relationship between convergent and divergent thinking, we need pairs of tests that differ in their problem spaces but are identical in the knowledge domain on which cognition operates. The RPM and the CRT provide such a pair. For research on the intertwining of convergent and divergent thinking, we need tasks that measure more than tests assessing each construct alone; we need tasks developed on a definition of intertwined cognitive abilities, and the CRT is one such test.

Hence, we hope to have sufficiently discussed and demonstrated the importance of three test features for research questions in creative cognition research: construct definition, problem space, and knowledge domain.

Author Contributions

All authors listed have made a substantial, direct, and intellectual contribution to the work and approved it for publication.

Conflict of Interest Statement

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Supplementary Material

The Supplementary Material for this article can be found online at: http://journal.frontiersin.org/article/10.3389/fpsyg.2017.00134/full#supplementary-material

References

  • Abraham A., Bubic A. (2015). Semantic memory as the root of imagination. Front. Psychol. 6:325. doi: 10.3389/fpsyg.2015.00325
  • Abraham A., Windmann S. (2007). Creative cognition: the diverse operations and the prospect of applying a cognitive neuroscience perspective. Methods 42, 38–48. doi: 10.1016/j.ymeth.2006.12.007
  • Akin O. (1986). Psychology of Architectural Design. London: Pion.
  • Anderson J. R. (1983). The Architecture of Cognition. Cambridge, MA: Harvard University Press.
  • Arden R., Chavez R. S., Grazioplene R., Jung R. E. (2010). Neuroimaging creativity: a psychometric view. Behav. Brain Res. 214, 143–156. doi: 10.1016/j.bbr.2010.05.015
  • Arnheim R. (1962/1974). Picasso’s Guernica. Berkeley: University of California Press.
  • Barsalou L. W. (1992). Cognitive Psychology: An Overview for Cognitive Scientists. Hillsdale, NJ: LEA.
  • Beaty R. E., Benedek M., Silvia P. J., Schacter D. L. (2016). Creative cognition and brain network dynamics. Trends Cogn. Sci. 20, 87–95. doi: 10.1016/j.tics.2015.10.004
  • Beaty R. E., Kaufman S. B., Benedek M., Jung R. E., Kenett Y. N., Jauk E., et al. (2015). Personality and complex brain networks: the role of openness to experience in default network efficiency. Hum. Brain Mapp. 37, 773–777. doi: 10.1002/hbm.23065
  • Beaty R. E., Nusbaum E. C., Silvia P. J. (2014). Does insight problem solving predict real-world creativity? Psychol. Aesthet. Creat. Arts 8, 287–292. doi: 10.1037/a0035727
  • Beaty R. E., Silvia P. J. (2013). Metaphorically speaking: cognitive abilities and the production of figurative language. Mem. Cognit. 41, 255–267. doi: 10.3758/s13421-012-0258-5
  • Beaty R. E., Smeekens B. A., Silvia P. J., Hodges D. A., Kane M. J. (2013). A first look at the role of domain-general cognitive and creative abilities in jazz improvisation. Psychomusicology 23, 262–268. doi: 10.1037/a0034968
  • Benedek M., Bergner S., Konen T., Fink A., Neubauer A. C. (2011). EEG alpha synchronization is related to top-down processing in convergent and divergent thinking. Neuropsychologia 49, 3505–3511. doi: 10.1016/j.neuropsychologia.2011.09.004
  • Benedek M., Franz F., Heene M., Neubauer A. C. (2012). Differential effects of cognitive inhibition and intelligence on creativity. Pers. Individ. Dif. 53, 480–485. doi: 10.1016/j.paid.2012.04.014
  • Benedek M., Jauk E., Sommer M., Arendasy M., Neubauer A. C. (2014). Intelligence, creativity, and cognitive control: the common and differential involvement of executive functions in intelligence and creativity. Intelligence 46, 73–83. doi: 10.1016/j.intell.2014.05.007
  • Boden M. A. (1990). The Creative Mind: Myths and Mechanisms. London: Abacus.
  • Boden M. A. (1996). Artificial Intelligence. New York, NY: Academic.
  • Buschman T. J., Miller E. K. (2007). Top-down versus bottom-up control of attention in the prefrontal and posterior parietal cortices. Science 315, 1860–1862. doi: 10.1126/science.1138071
  • Carroll J. B. (1982). “The measurement of intelligence,” in Handbook of Human Intelligence, ed. Sternberg R. J. (New York, NY: Cambridge University Press), 29–120.
  • Cattell R. B. (1967). The theory of fluid and crystallized general intelligence checked at the 5-6 year-old level. Br. J. Educ. Psychol. 37, 209–224. doi: 10.1111/j.2044-8279.1967.tb01930.x
  • Chi M. T. H. (1997). “Creativity: shifting across ontological categories flexibly,” in Creative Thought: An Investigation of Conceptual Structures and Processes, eds Ward T., Smith S., Vaid J. (Washington, DC: American Psychological Association), 209–234.
  • Chi M. T. H., VanLehn K. A. (1991). The content of physics self-explanations. J. Learn. Sci. 1, 69–105. doi: 10.1207/s15327809jls0101_4
  • Christensen B. T. (2007). The relationship of analogical distance to analogical function and preinventive structure: the case of engineering design. Mem. Cogn. 35, 29–38. doi: 10.3758/BF03195939
  • Colom R., Haier R. J., Head K., Álvarez-Linera J., Quiroga M. A., Shih P. C., et al. (2009). Gray matter correlates of fluid, crystallized, and spatial intelligence: testing the P-FIT model. Intelligence 37, 124–135. doi: 10.1016/j.intell.2008.07.007
  • Colunga E., Smith L. B. (2008). Flexibility and variability: essential to human cognition and the study of human cognition. New Ideas Psychol. 26, 158–192. doi: 10.1016/j.newideapsych.2007.07.012
  • Cooper N. R., Croft R. J., Dominey S. J. J., Burgess A. P., Gruzelier J. H. (2003). Paradox lost? Exploring the role of alpha oscillations during externally vs. internally directed attention and the implications for idling and inhibition hypotheses. Int. J. Psychophysiol. 47, 65–74. doi: 10.1016/S0167-8760(02)00107-1
  • Cropley A. (2006). In praise of convergent thinking. Creat. Res. J. 18, 391–404. doi: 10.1207/s15326934crj1803_13
  • Cropley A., Cropley D. (2008). Resolving the paradoxes of creativity: an extended phase model. Camb. J. Educ. 38, 355–373. doi: 10.1080/03057640802286871
  • Cross N., Clayburn Cross A. (1996). Winning by design: the methods of Gordon Murray, racing car designer. Des. Stud. 17, 91–107. doi: 10.1016/0142-694X(95)00027-O
  • Dennett D. (1978). Brainstorms: Philosophical Essays on Mind and Psychology. Montgomery, VT: Bradford Books.
  • Dietrich A. (2007). Who’s afraid of a cognitive neuroscience of creativity? Methods 42, 22–27. doi: 10.1016/j.ymeth.2006.12.009
  • Dorst K. (2004). The problem of design problems: problem solving and design expertise. J. Design Res. 4. doi: 10.1504/JDR.2004.009841
  • Dorst K. (2011). The core of ‘design thinking’ and its application. Des. Stud. 32, 521–532. doi: 10.1016/j.destud.2011.07.006
  • Eysenck H. J. (2003). “Creativity, personality and the convergent-divergent continuum,” in Critical Creative Processes, ed. Runco M. A. (Cresskill, NJ: Hampton Press), 95–114.
  • Fink A., Benedek M. (2014). EEG alpha power and creative ideation. Neurosci. Biobehav. Rev. 44, 111–123. doi: 10.1016/j.neubiorev.2012.12.002
  • Fink A., Benedek M., Grabner R. H., Staudt B., Neubauer A. C. (2007). Creativity meets neuroscience: experimental tasks for the neuroscientific study of creative thinking. Methods 42, 68–76. doi: 10.1016/j.ymeth.2006.12.001
  • Fink A., Grabner R. H., Benedek M., Reishofer G., Hauswirth V., Fally M., et al. (2009). The creative brain: investigation of brain activity during creative problem solving by means of EEG and fMRI. Hum. Brain Mapp. 30, 734–748. doi: 10.1002/hbm.20538
  • Finke R. A., Ward T. B., Smith S. M. (1992). Creative Cognition: Theory, Research, and Applications. Cambridge, MA: MIT Press.
  • Fuster J. M. (1997). Network memory. Trends Neurosci. 20, 451–459. doi: 10.1016/S0166-2236(97)01128-4
  • Gabora L. (2002). “Cognitive mechanisms underlying the creative process,” in Proceedings of the Fourth International Conference on Creativity and Cognition, eds Hewett T., Kavanagh T. (Loughborough: Loughborough University), 126–133.
  • Gabora L. (2010). Revenge of the ‘neurds’: characterizing creative thought in terms of the structure and dynamics of human memory. Creat. Res. J. 22, 1–13. doi: 10.1080/10400410903579494
  • Gabora L., Kaufman S. B. (2010). “Evolutionary approaches to creativity,” in The Cambridge Handbook of Creativity, eds Kaufman J. S., Sternberg R. J. (Cambridge: Cambridge University Press), 279–300.
  • Gabora L., Ranjan A. (2013). “How insight emerges in a distributed, content-addressable memory,” in The Neuroscience of Creativity, eds Bristol A., Vartanian O., Kaufman J. (Cambridge: MIT Press), 19–43.
  • Gabora L., Saab A. (2011). “Creative inference and states of potentiality in analogy problem solving,” in Proceedings of the Annual Meeting of the Cognitive Science Society, Boston, MA, 3506–3511.
  • Gentner D. (1983). Structure mapping: a theoretical framework for analogy. Cogn. Sci. 7, 155–170. doi: 10.1207/s15516709cog0702_3
  • Getzels J. W. (1975). Problem finding and the inventiveness of solutions. J. Creat. Behav. 9, 12–18. doi: 10.1002/j.2162-6057.1975.tb00552.x
  • Getzels J. W. (1987). “Creativity, intelligence, and problem finding: retrospect and prospect,” in Frontiers of Creativity Research: Beyond the Basics, ed. Isaksen S. G. (Buffalo, NY: Bearly Limited), 88–102.
  • Getzels J. W., Csikszentmihalyi M. (1976). The Creative Vision: A Longitudinal Study of Problem Finding in Art. New York, NY: Wiley.
  • Ghiselin B. (ed.) (1952/1985). The Creative Process. Los Angeles: University of California.
  • Goel V., Pirolli P. (1992). The structure of design problem spaces. Cogn. Sci. 16, 395–429. doi: 10.1207/s15516709cog1603_3
  • Goldschmidt G. (2013). “A micro view of design reasoning: two-way shifts between embodiment and rationale,” in Creativity and Rationale: Enhancing Human Experience by Design, Human-Computer Interaction Series, ed. Carroll J. M. (London: Springer Verlag).
  • Goldschmidt G. (2014). Linkography: Unfolding the Design Process. Cambridge, MA: MIT Press.
  • Gruber H. E., Davis S. N. (1988). “Inching our way up Mount Olympus: the evolving-systems approach to creative thinking,” in The Nature of Creativity, ed. Sternberg R. J. (New York, NY: Cambridge University Press), 243–270.
  • Guilford J. P. (1950). Creativity. Am. Psychol. 5, 444–454. doi: 10.1037/h0063487
  • Guilford J. P. (1956). The structure of intellect model. Psychol. Bull. 53, 267–293. doi: 10.1037/h0040755
  • Guilford J. P. (1959). “Traits of creativity,” in Creativity and its Cultivation, ed. Anderson H. H. (New York: Harper), 142–161.
  • Guilford J. P. (1967). The Nature of Human Intelligence. New York, NY: McGraw-Hill.
  • Guilford J. P., Christensen P. R., Merrifield P. R., Wilson R. C. (1978). Alternate Uses: Manual of Instructions and Interpretation. Orange, CA: Sheridan Psychological Services.
  • Halpern D. F. (2003). “Thinking critically about creative thinking,” in Critical Creative Processes, ed. Runco M. A. (Cresskill, NJ: Hampton Press), 189–208.
  • Hayes J. R., Flowers L. S. (1986). Writing research and the writer. Am. Psychol. 41, 1106–1113. doi: 10.1037/0003-066X.41.10.1106
  • Jaarsveld S. (2007). Creative Cognition: New Perspectives on Creative Thinking. Kaiserslautern: University of Kaiserslautern Press.
  • Jaarsveld S., Fink A., Rinner M., Schwab D., Benedek M., Lachmann T. (2015). Intelligence in creative processes: an EEG study. Intelligence 49, 171–178.
  • Jaarsveld S., Lachmann T., Hamel R., van Leeuwen C. (2010). Solving and creating Raven Progressive Matrices: reasoning in well and ill defined problem spaces. Creat. Res. J. 22, 304–319. doi: 10.1080/10400419.2010.503541
  • Jaarsveld S., Lachmann T., van Leeuwen C. (2012). Creative reasoning across developmental levels: convergence and divergence in problem creation. Intelligence 40, 172–188. doi: 10.1016/j.intell.2012.01.002
  • Jaarsveld S., Lachmann T., van Leeuwen C. (2013). “The impact of problem space on reasoning: solving versus creating matrices,” in Proceedings of the 35th Annual Conference of the Cognitive Science Society, eds Knauff M., Pauen M., Sebanz N., Wachsmuth I. (Austin, TX: Cognitive Science Society), 2632–2638.
  • Jaarsveld S., van Leeuwen C. (2005). Sketches from a design process: creative cognition inferred from intermediate products. Cogn. Sci. 29, 79–101. doi: 10.1207/s15516709cog2901_4
  • Jauk E., Benedek M., Dunst B., Neubauer A. C. (2013). The relationship between intelligence and creativity: new support for the threshold hypothesis by means of empirical breakpoint detection. Intelligence 41, 212–221. doi: 10.1016/j.intell.2013.03.003
  • Jauk E., Benedek M., Neubauer A. C. (2012). Tackling creativity at its roots: evidence for different patterns of EEG alpha activity related to convergent and divergent modes of task processing. Int. J. Psychophysiol. 84, 219–225. doi: 10.1016/j.ijpsycho.2012.02.012
  • Jaušovec N. (1999). “Brain biology and brain functioning,” in Encyclopedia of Creativity, eds Runco M. A., Pritzker S. R. (San Diego, CA: Academic Press), 203–212.
  • Jaušovec N. (2000). Differences in cognitive processes between gifted, intelligent, creative, and average individuals while solving complex problems: an EEG study. Intelligence 28, 213–237. doi: 10.1016/S0160-2896(00)00037-4
  • Jung R. E. (2014). Evolution, creativity, intelligence, and madness: “here be dragons”. Front. Psychol. 5:784. doi: 10.3389/fpsyg.2014.00784
  • Jung R. E., Haier R. J. (2013). “Creativity and intelligence,” in Neuroscience of Creativity, eds Vartanian O., Bristol A. S., Kaufman J. C. (Cambridge, MA: MIT Press), 233–254.
  • Jung R. E., Segall J. M., Bockholt H. J., Flores R. A., Smith S. M., Chavez R. S., et al. (2010). Neuroanatomy of creativity. Hum. Brain Mapp. 31, 398–409. doi: 10.1002/hbm.20874
  • Karmiloff-Smith A. (1992). Beyond Modularity: A Developmental Perspective on Cognitive Science. Cambridge, MA: MIT Press.
  • Kaufman J. C. (2015). Why creativity isn’t in IQ tests, why it matters, and why it won’t change anytime soon probably. J. Intell. 3, 59–72. doi: 10.3390/jintelligence3030059
  • Kaufmann G. (2003). What to measure? A new look at the concept of creativity. Scand. J. Educ. Res. 47, 235–251. doi: 10.1080/00313830308604
  • Kim K. H. (2005). Can only intelligent people be creative? J. Second. Gift. Educ. 16, 57–66.
  • Koestler A. (1964). The Act of Creation. London: Penguin.
  • Kozbelt A. (2008). Hierarchical linear modeling of creative artists’ problem solving behaviors. J. Creat. Behav. 42, 181–200. doi: 10.1002/j.2162-6057.2008.tb01294.x
  • Kulkarni D., Simon H. A. (1988). The processes of scientific discovery: the strategy of experimentation. Cogn. Sci. 12, 139–175.
  • Limb C. J. (2010). Your Brain on Improv. Available at: http://www.ted.com/talks/charles_limb_your_brain_on_improv
  • Lubart T. I. (2001). Models of the creative process: past, present and future. Creat. Res. J. 13, 295–308. doi: 10.1207/S15326934CRJ1334_07
  • Lubart T. I. (2003). Psychologie de la Créativité. Cursus. Psychologie. Paris: Armand Colin.
  • Martindale C. (1999). “Biological basis of creativity,” in Handbook of Creativity, ed. Sternberg R. J. (New York, NY: Cambridge University Press), 137–152.
  • Mednick S. A. (1962). The associative basis of the creative process. Psychol. Rev. 69, 220–232. doi: 10.1037/h0048850
  • Mendelsohn G. A. (1976). Associational and attentional processes in creative performance. J. Pers. 44, 341–369. doi: 10.1111/j.1467-6494.1976.tb00127.x
  • Mestre J. P. (2002). Probing adults’ conceptual understanding and transfer of learning via problem posing. Appl. Dev. Psychol. 23, 9–50. doi: 10.1016/S0193-3973(01)00101-0
  • Miller E. K., Cohen J. D. (2001). An integrative theory of prefrontal cortex function. Annu. Rev. Neurosci. 24, 167–202. doi: 10.1146/annurev.neuro.24.1.167
  • Mumford M. D., Hunter S. T., Eubanks D. L., Bedell K. E., Murphy S. T. (2007). Developing leaders for creative efforts: a domain-based approach to leadership development. Hum. Res. Manag. Rev. 17, 402–417. doi: 10.1016/j.hrmr.2007.08.002
  • Newell A., Simon H. A. (1972). “The theory of human problem solving,” in Human Problem Solving, eds Newell A., Simon H. (Englewood Cliffs, NJ: Prentice Hall), 787–868.
  • Nusbaum E. C., Silvia P. J. (2011). Are intelligence and creativity really so different? Intelligence 39, 36–40. doi: 10.1016/j.intell.2010.11.002
  • Palmiero M., Nori R., Aloisi V., Ferrara M., Piccardi L. (2015). Domain-specificity of creativity: a study on the relationship between visual creativity and visual mental imagery. Front. Psychol. 6:1870. doi: 10.3389/fpsyg.2015.01870
  • Piaget J., Montangero J., Billeter J. (1977). “La formation des correlats,” in Recherches sur L’abstraction Reflechissante I, ed. Piaget J. (Paris: Presses Universitaires de France), 115–129.
  • Plucker J. (1999). Is the proof in the pudding? Reanalyses of Torrance’s (1958 to present) longitudinal study data. Creat. Res. J. 12, 103–114. doi: 10.1207/s15326934crj1202_3
  • Raven J. C. (1938/1998). Standard Progressive Matrices, Sets A, B, C, D & E. Oxford: Oxford Psychologists Press.
  • Razumnikova O. M., Volf N. V., Tarasova I. V. (2009). Strategy and results: sex differences in electrographic correlates of verbal and figural creativity. Hum. Physiol. 35, 285–294. doi: 10.1134/S0362119709030049
  • Runco M. A. (1991). The evaluative, valuative, and divergent thinking of children. J. Creat. Behav. 25, 311–319.
  • Runco M. A. (2003). “Idea evaluation, divergent thinking, and creativity,” in Critical Creative Processes, ed. Runco M. A. (Cresskill, NJ: Hampton Press), 69–94.
  • Runco M. A. (2007). Creativity, Theories and Themes: Research, Development, and Practice. New York, NY: Elsevier.
  • Runco M. A. (2008). Commentary: divergent thinking is not synonymous with creativity. Psychol. Aesthet. Creat. Arts 2, 93–96. doi: 10.1037/1931-3896.2.2.93
  • Sarkar P., Chakrabarti A. (2013). Support for protocol analyses in design research. Des. Issues 29, 70–81. doi: 10.1162/DESI_a_00231
  • Saraç S., Önder A., Karakelle S. (2014). The relations among general intelligence, metacognition and text learning performance. Educ. Sci. 39, 40–53.
  • Shye S., Goldzweig G. (1999). Creativity as an extension of intelligence: faceted definition and structural hypotheses. Megamot 40, 31–53.
  • Shye S., Yuhas I. (2004). Creativity in Problem Solving. Technical Report. doi: 10.13140/2.1.1940.0643
  • Siegler R. S. (1998). Children’s Thinking, 3rd Edn. Upper Saddle River, NJ: Prentice Hall, 28–50.
  • Siegler R. S. (2005). Children’s learning. Am. Psychol. 60, 769–778. doi: 10.1037/0003-066X.60.8.769
  • Silvia P. J. (2008). Creativity and intelligence revisited: a reanalysis of Wallach and Kogan (1965). Creat. Res. J. 20, 34–39. doi: 10.1080/10400410701841807
  • Silvia P. J., Beaty R. E., Nusbaum E. C. (2013). Verbal fluency and creativity: general and specific contributions of broad retrieval ability (Gr) factors to divergent thinking. Intelligence 41, 328–340. doi: 10.1016/j.intell.2013.05.004
  • Simon H. A. (1973). The structure of ill structured problems. Artif. Intell. 4, 181–201. doi: 10.1016/0004-3702(73)90011-8
  • Simon H. A., Newell A. (1971). Human problem solving: the state of the theory in 1970. Am. Psychol. 26, 145–159. doi: 10.1037/h0030806
  • Sligh A. C., Conners F. A., Roskos-Ewoldsen B. (2005). Relation of creativity to fluid and crystallized intelligence. J. Creat. Behav. 39, 123–136. doi: 10.1002/j.2162-6057.2005.tb01254.x
  • Spearman C. (1904). ‘General intelligence,’ objectively determined and measured. Am. J. Psychol. 15 201–293. 10.2307/1412107 [ CrossRef ] [ Google Scholar ]
  • Spearman C. (1927). The Abilities of Man London: Macmillan. [ Google Scholar ]
  • Sternberg R. J. (1982). “Conceptions of intelligence,” in Handbook of Human Intelligence , ed. Sternberg R. J. (New York, NY: Cambridge University Press; ), 3–28. [ Google Scholar ]
  • Sternberg R. J. (2005). “The WICS model of giftedness,” in Conceptions of Giftedness , 2nd Edn, eds Sternberg R. J., Davidson J. E. (New York, NY: Cambridge University Press; ), 237–243. [ Google Scholar ]
  • Sternberg R. J., Lubart T. I. (1999). “The concept of creativity: Prospects and paradigms,” in Handbook of Creativity , ed. Sternberg R. J. (New York, NY: Cambridge University Press; ), 3–15. [ Google Scholar ]
  • Sternberg R. J., Salter W. (1982). “The nature of intelligence and its measurements,” in Handbook of Human Intelligence , ed. Sternberg R. J. (New York, NY: Cambridge University Press; ), 3–24. [ Google Scholar ]
  • Thagard P., Verbeurgt K. (1998). Coherence as constraint satisfaction. Cogn. Sci. 22 l–24. 10.1207/s15516709cog2201_1 [ CrossRef ] [ Google Scholar ]
  • Torrance E. P. (1988). “The nature of creativity as manifest in its testing,” in The Nature of Creativity: Contemporary Psychological Perspectives , ed. Sternberg R. J. (New York, NY: Cambridge University Press; ), 43–75. [ Google Scholar ]
  • Urban K. K., Jellen H. G. (1995). Test of Creative Thinking – Drawing Production Frankfurt: Swets Test Services. [ Google Scholar ]
  • van Leeuwen C., Verstijnen I. M., Hekkert P. (1999). “Common unconscious dynamics underlie uncommon conscious effect: a case study in the iterative nature of perception and creation,” in Modeling Consciousness Across the Disciplines , ed. Jordan J. S. (Lanham, MD: University Press of America; ), 179–218. [ Google Scholar ]
  • Vernon P. E. (ed.) (1970). Creativity London: Penguin. [ Google Scholar ]
  • Verstijnen I. M., Heylighen A., Wagemans J., Neuckermans H. (2001). “Sketching, analogies, and creativity,” in Visual and Spatial Reasoning in Design, II. Key Centre of Design Computing and Cognition , eds Gero J. S., Tversky B., Purcell T. (Sydney, NSW: University of Sydney; ). [ Google Scholar ]
  • Wallas G. (1926). The Art of Thought New York, NY: Harcourt, Brace & World. [ Google Scholar ]
  • Ward T. B. (2007). Creative cognition as a window on creativity. Methods 42 28–37. 10.1016/j.ymeth.2006.12.002 [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Webb Young J. (1939/2003). A Technique for Producing Ideas New York, NY: McGraw-Hill. [ Google Scholar ]
  • Welter M. M., Jaarsveld S., Lachmann T. Problem space matters: development of creativity and intelligence in primary school children. Creat. Res. J. (in press) [ Google Scholar ]
  • Welter M. M., Jaarsveld S., van Leeuwen C., Lachmann T. (2016). Intelligence and creativity; over the threshold together? Creat. Res. J. 28 212–218. 10.1080/10400419.2016.1162564 [ CrossRef ] [ Google Scholar ]
  • Wertheimer M. (1945/1968). Productive Thinking (Enlarged Edition) London: Tavistock. [ Google Scholar ]
  • Yamamoto Y., Nakakoji K., Takada S. (2000). Hand on representations in two dimensional spaces for early stages of design. Knowl. Based Syst. 13 357–384. 10.1016/S0950-7051(00)00078-2 [ CrossRef ] [ Google Scholar ]

TOPS-2:A: Test of Problem Solving 2: Adolescent

Linda Bowers • Rosemary Huisingh • Carolyn LoGiudice

  • Product Number: 34130
  • Test Level: B
  • ISBN: 978-0-760-60712-1
  • Format: KIT
  • Weight: 4 lbs. 8 oz.

Description

Diagnose Language-Based Thinking Deficits

Ages: 12-0 through 17-11

Testing Time: 40 minutes

While other tests may assess thinking skills by tapping mathematical, spatial, or nonverbal potential, the TOPS 2: Adolescent assesses critical thinking abilities based on the student's language strategies using logic and experience.

Based on the research of Richard Paul, the TOPS 2: Adolescent emphasizes the integrative disposition of critical thinking by focusing on these cognitive processes:

  • understanding/comprehension
  • interpretation
  • self-regulation
  • explanation
  • inference/insight
  • decision-making
  • intent/purpose
  • problem solving
  • acknowledgment

The test comprises five subtests (18 written passages) that assess a student's performance of these skills. The subtests require the student to pay careful attention to, process, and think about what he or she hears and reads; think about problems with a purpose in mind; resist the urge to be impulsive; and express answers verbally.

  • Subtest A: Making Inferences. The student is asked to give a logical explanation of a situation, combining what he or she knows or can see with previous experience and background information. Students who do well on this subtest make plausible inferences, predictions, or interpretations.
  • Subtest B: Determining Solutions. The student is asked to provide a logical solution for some aspect of a situation presented in a passage.
  • Subtest C: Problem Solving. This subtest requires a student to recognize the problem, think of alternative solutions, evaluate the options, and state an appropriate solution for a given situation. It also includes stating how to avoid specific problems.
  • Subtest D: Interpreting Perspectives. A student who does well on this subtest evaluates other points of view in order to reach a conclusion.
  • Subtest E: Transferring Insights. The student is asked to compare analogous situations by using information stated in the passage.

COMPLETE TEST INCLUDES: Examiner's Manual, Reading Passages Book, and 20 Test Forms. (©2007)

Please contact [email protected] for revised normative tables.


Effective problem statements have these 5 components



We’ve all encountered problems on the job. After all, that’s what much of work is about: solving meaningful problems to improve something.

Developing a problem statement that provides a brief description of an issue you want to solve is an important early step in problem-solving .

It sounds deceptively simple. But creating an effective problem statement isn’t that easy, even for a genius like Albert Einstein. Given one hour to work on a problem, he’d spend 55 minutes thinking about the problem and five minutes finding solutions. (Or so the story goes.)

Einstein was probably exaggerating to make a point. But considering his success in solving complex problems, we think he was on to something. 

As humans, we’re wired to jump past the problem and go directly to the solution stage. In emergencies, this behavior can be lifesaving, as in leaping out of the way of a speeding car. But when dealing with longer-range issues in the workplace, this can lead to bad decisions or half-baked solutions. 

That’s where problem statements come in handy. They help to meaningfully outline objectives to reach effective solutions. Knowing how to develop a great problem statement is also a valuable tool for honing your management skills .

But what exactly is a problem statement, when should you use one, and how do you go about writing one? In this article, we'll answer those questions and give you some tips for writing effective problem statements. Then you'll be ready to take on more challenges large and small.

What is a problem statement?

First, let’s start by defining a problem statement. 

A problem statement is a short, clear explanation of an issue or challenge that sums up what you want to change. It helps you, team members, and other stakeholders to focus on the problem, why it’s important, and who it impacts. 

A good problem statement should create awareness and stimulate creative thinking . It should not identify a solution or create a bias toward a specific strategy.

Taking time to work on a problem statement is a great way to short-circuit the tendency to rush to solutions. It helps to make sure you’re focusing on the right problem and have a well-informed understanding of the root causes. The process can also help you take a more proactive than reactive approach to problem-solving . This can help position you and your team to avoid getting stuck in constant fire-fighting mode. That way, you can take advantage of more growth opportunities.  

When to use a problem statement

The best time to create a problem statement is before you start thinking of solutions. If you catch yourself or your team rushing to the solution stage when you’re first discussing a problem, hit the brakes. Go back and work on the statement of the problem to make sure everyone understands and agrees on what the real problem is. 

Here are some common situations where writing problem statements might come in handy: 

  • Writing an executive summary for a project proposal or research project
  • Collaborating   on a cross-functional project with several team members
  • Defining the customer issue that a proposed product or service aims to solve
  • Using design thinking to improve user experience
  • Tackling a problem that previous actions failed to solve 

problem-statement-colleagues-solving-at-laptop

How to identify a problem statement

Like the unseen body of an iceberg, the root cause of a specific problem isn’t always obvious. So when developing a problem statement, how do you go about identifying the true, underlying problem?

These two steps will help you uncover the root cause of a problem :

  • Collect information from the research and previous experience with the problem
  • Talk to multiple stakeholders who are impacted by the problem

People often perceive problems differently. Interviewing stakeholders will help you understand the problem from diverse points of view. It can also help you develop some case studies to illustrate the problem. 

Combining these insights with research data will help you identify root causes more accurately. In turn, this methodology will help you craft a problem statement that will lead to more viable solutions. 

What are problem statements used for?

You can use problem statements for a variety of purposes. For an organization, it might be solving customer and employee issues. For the government, it could be improving public health. For individuals, it can mean enhancing their own personal well-being . Generally, problem statements can be used to:

  • Identify opportunities for improvement
  • Focus on the right problems or issues to launch more successful initiatives – a common challenge in leadership
  • Help you communicate a problem to others who need to be involved in finding a solution
  • Serve as the basis for developing an action plan or goals that need to be accomplished to help solve the problem
  • Stimulate thinking outside the box  and other types of creative brainstorming techniques

3 examples of problem statements

When you want to be sure you understand a concept or tool, it helps to see an example. There can also be some differences in opinion about what a problem statement should look like. For instance, some frameworks include a proposed solution as part of the problem statement. But if the goal is to stimulate fresh ideas, it’s better not to suggest a solution within the problem statement. 

In our experience, an effective problem statement is brief, preferably one sentence. It’s also specific and descriptive without being prescriptive. 

Here are three problem statement examples. While these examples represent three types of problems or goals, keep in mind that there can be many other types of problem statements.        

Example Problem Statement 1: The Status Quo Problem Statement

Example: 

The average customer service on-hold time for Example company exceeds five minutes during both its busy and slow seasons.

This can be used to describe a current pain point within an organization that may need to be addressed. Note that the statement specifies that the issue occurs during the company’s slow time as well as the busy season. This is helpful in performing the root cause analysis and determining how this problem can be solved. 

The average customer service on-hold time for Example company exceeds five minutes during both its busy and slow seasons. The company is currently understaffed and customer service representatives are overwhelmed.

Background:

Example company is facing a significant challenge in managing their customer service on-hold times. In the past, the company had been known for its efficient and timely customer service, but due to a combination of factors, including understaffing and increased customer demand, the on-hold times have exceeded five minutes consistently. This has resulted in frustration and dissatisfaction among customers, negatively impacting the company's reputation and customer loyalty.

Reducing the on-hold times for customer service callers is crucial for Example company. Prolonged waiting times have a detrimental effect on customer satisfaction and loyalty, leading to potential customer churn and loss of revenue. Additionally, the company's declining reputation in terms of customer service can have a lasting impact on its competitive position in the market. Addressing this problem is of utmost importance to improve customer experience and maintain a positive brand image.

Objectives:

The primary objective of this project is to reduce the on-hold times for customer service callers at Example company. The specific objectives include:

  • Analyzing the current customer service workflow and identifying bottlenecks contributing to increased on-hold times.
  • Assessing the staffing levels and resource allocation to determine the extent of understaffing and its impact on customer service.
  • Developing strategies and implementing measures to optimize the customer service workflow and reduce on-hold times.
  • Monitoring and evaluating the effectiveness of the implemented measures through key performance indicators (KPIs) such as average on-hold time, customer satisfaction ratings, and customer feedback.
  • Establishing a sustainable approach to maintain reduced on-hold times, taking into account both busy and slow seasons, through proper resource planning, training, and process improvements.

Example Problem Statement 2: The Destination Problem Statement

Leaders at Example company want to increase net revenue for its premium product line of widgets by 5% for the next fiscal year. 

This approach can be used to describe where an organization wants to be in the future. This type of problem statement is useful for launching initiatives to help an organization achieve its desired state. 

Like creating SMART goals , you want to be as specific as possible. Note that the statement specifies “net revenue” instead of “gross revenue." This will help keep options open for potential actions. It also makes it clear that merely increasing sales is not an acceptable solution if higher marketing costs offset the net gains. 

Leaders at Example company aim to increase net revenue for its premium product line of widgets by 5% for the next fiscal year. However, the company currently lacks the necessary teams to tackle this objective effectively. To achieve this growth target, the company needs to expand its marketing and PR teams, as well as its product development teams, to prepare for scaling. 

Example company faces the challenge of generating a 5% increase in net revenue for its premium product line of widgets in the upcoming fiscal year. Currently, the company lacks the required workforce to drive this growth. Without adequate staff in the marketing, PR, and product development departments, the company's ability to effectively promote, position, and innovate its premium product line will be hindered. To achieve this kind of growth, it is essential that Example company expands teams, enhances capabilities, and strategically taps into the existing pool of loyal customers.

Increasing net revenue for the premium product line is crucial for Example company's overall business success. Failure to achieve the targeted growth rate can lead to missed revenue opportunities and stagnation in the market. By expanding the marketing and PR teams, Example company can strengthen its brand presence, effectively communicate the value proposition of its premium product line, and attract new customers.

Additionally, expanding the product development teams will enable the company to introduce new features and innovations, further enticing existing and potential customers. Therefore, addressing the workforce shortage and investing in the necessary resources are vital for achieving the revenue growth objective.

The primary objective of this project is to increase net revenue for Example company's premium product line of widgets by 5% in the next fiscal year. The specific objectives include:

  • Assessing the current workforce and identifying the gaps in the marketing, PR, and product development teams.
  • Expanding the marketing and PR teams by hiring skilled professionals who can effectively promote the premium product line and engage with the target audience.
  • Strengthening the product development teams by recruiting qualified individuals who can drive innovation, enhance product features, and meet customer demands.
  • Developing a comprehensive marketing and PR strategy to effectively communicate the value proposition of the premium product line and attract new customers.
  • Leveraging the existing base of loyal customers to increase repeat purchases, referrals, and brand advocacy.
  • Allocating sufficient resources, both time and manpower, to support the expansion and scaling efforts required to achieve the ambitious revenue growth target.
  • Monitoring and analyzing key performance indicators (KPIs) such as net revenue, customer acquisition, customer retention, and customer satisfaction to measure the success of the growth initiatives.
  • Establishing a sustainable plan to maintain the increased revenue growth beyond the next fiscal year by implementing strategies for continuous improvement and adaptation to market dynamics.

Example Problem Statement 3: The Stakeholder Problem Statement

In the last three quarterly employee engagement surveys, less than 30% of employees at Example company stated that they feel valued by the company. This represents a 20% decline compared to the same period in the year prior. 

This strategy can be used to describe how a specific stakeholder group views the organization. It can be useful for exploring issues and potential solutions that impact specific groups of people. 

Note the statement makes it clear that the issue has been present in multiple surveys and it's significantly worse than the previous year. When researching root causes, the HR team will want to zero in on factors that changed since the previous year.

In the last three quarterly employee engagement surveys, less than 30% of employees at the Example company stated that they feel valued by the company. This indicates a significant decline of 20% compared to the same period in the previous year.

The company aspires to bring the share of employees who feel undervalued to under 10%. However, achieving this goal would require filling specialized roles and implementing substantial cultural changes within the organization.

Example company is facing a pressing issue regarding employee engagement and perceived value within the company. Over the past year, there has been a notable decline in the percentage of employees who feel valued. This decline is evident in the results of the quarterly employee engagement surveys, which consistently show less than 30% of employees reporting a sense of value by the company.

This decline of 20% compared to the previous year's data signifies a concerning trend. To address this problem effectively, Example company needs to undertake significant measures that go beyond superficial changes and necessitate filling specialized roles and transforming the company culture.

Employee engagement and a sense of value are crucial for organizational success. When employees feel valued, they tend to be more productive, committed, and motivated. Conversely, a lack of perceived value can lead to decreased morale, increased turnover rates, and diminished overall performance.

By addressing the decline in employees feeling valued, Example company can improve employee satisfaction, retention, and ultimately, overall productivity. Bringing the share of employees who feel undervalued to under 10% is essential to restore a positive work environment and build a culture of appreciation and respect.

The primary objective of this project is to increase the percentage of employees who feel valued by Example company, reducing the share who feel undervalued to under 10%. The specific objectives include:

  • Conducting a comprehensive analysis of the factors contributing to the decline in employees feeling valued, including organizational policies, communication practices, leadership styles, and cultural norms.
  • Identifying and filling specialized roles, such as employee engagement specialists or culture change agents, who can provide expertise and guidance in fostering a culture of value and appreciation.
  • Developing a holistic employee engagement strategy that encompasses various initiatives, including training programs, recognition programs, feedback mechanisms, and communication channels, to enhance employee value perception.
  • Implementing cultural changes within the organization that align with the values of appreciation, respect, and recognition, while fostering an environment where employees feel valued.
  • Communicating the importance of employee value and engagement throughout all levels of the organization, including leadership teams, managers, and supervisors, to ensure consistent messaging and support.
  • Monitoring progress through regular employee surveys, feedback sessions, and key performance indicators (KPIs) related to employee satisfaction, turnover rates, and overall engagement levels.
  • Providing ongoing support, resources, and training to managers and supervisors to enable them to effectively recognize and appreciate their teams and foster a culture of value within their respective departments.
  • Establishing a sustainable framework for maintaining high employee value perception in the long term, including regular evaluation and adaptation of employee engagement initiatives to address evolving needs and expectations.


What are the 5 components of a problem statement?

In developing a problem statement, it helps to think like a journalist by focusing on the five Ws: who, what, when, where, and why or how. Keep in mind that every statement may not explicitly include each component. But asking these questions is a good way to make sure you’re covering the key elements:

  • Who: Who are the stakeholders that are affected by the problem?
  • What: What is the current state, desired state, or unmet need? 
  • When: When is the issue occurring or what is the timeframe involved?
  • Where: Where is the problem occurring? For example, is it in a specific department, location, or region?
  • Why: Why is this important or worth solving? How is the problem impacting your customers, employees, other stakeholders, or the organization? What is the magnitude of the problem? How large is the gap between the current and desired state? 
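As a rough illustration, the five Ws above can be captured in a small template. This is a hypothetical sketch of our own; the `ProblemStatement` class and its fields are not part of any framework named in this article:

```python
from dataclasses import dataclass

@dataclass
class ProblemStatement:
    """The five Ws of a problem statement (hypothetical template)."""
    who: str    # stakeholders affected by the problem
    what: str   # current state, desired state, or unmet need
    when: str   # timeframe in which the issue occurs
    where: str  # department, location, or region
    why: str    # impact, and why the problem is worth solving

    def summary(self) -> str:
        # Assemble the components into a one-sentence statement.
        return (f"{self.what} affects {self.who} {self.where} "
                f"{self.when}, which matters because {self.why}.")

stmt = ProblemStatement(
    who="customer service callers",
    what="Average on-hold time exceeding five minutes",
    when="during both busy and slow seasons",
    where="at Example company",
    why="long waits erode satisfaction and loyalty",
)
print(stmt.summary())
```

A template like this is only a checklist, not a formula: the point is that a draft statement missing one of the five Ws is usually a sign the problem isn't fully understood yet.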

How do you write a problem statement?

There are many frameworks designed to help people write a problem statement. One example is outlined in the book The Conclusion Trap: Four Steps to Better Decisions by Daniel Markovitz, a faculty member at the Lean Enterprise Institute who draws on many case studies from his work as a business consultant.

To simplify the process, we’ve broken it down into three steps:

1. Gather data and observe

Use data from research and reports, as well as facts from direct observation to answer the five Ws: who, what, when, where, and why. 

Whenever possible, get out in the field and talk directly with stakeholders impacted by the problem. Get a firsthand look at the work environment and equipment. This may mean spending time on the production floor asking employees questions about their work and challenges. Or taking customer service calls to learn more about customer pain points and problems your employees may be grappling with.    

2. Frame the problem properly  

A well-framed problem will help you avoid cognitive bias and open avenues for discussion. It will also encourage the exploration of more options.

A good way to test a problem statement for bias is to ask whether it presupposes a cause, implies a particular solution, or assigns blame.

3. Keep asking why (and check in on the progress)

When it comes to problem-solving, stay curious. Lean on your growth mindset to keep asking why — and check in on the progress. 

Asking why until you’re satisfied that you’ve uncovered the root cause of the problem will help you avoid ineffective band-aid solutions.
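This "keep asking why" step is often formalized as the 5 Whys technique. Here is a minimal sketch; the `ask` callable and the toy cause map are hypothetical stand-ins for what interviews, data, and direct observation would actually supply:

```python
def five_whys(problem, ask, max_depth=5):
    """Walk a chain of 'why?' questions toward a root cause.

    `ask` is any callable that, given a statement, returns the reason
    behind it, or None when no deeper cause is known.
    """
    chain = [problem]
    for _ in range(max_depth):  # convention: about five iterations
        cause = ask(chain[-1])
        if cause is None:
            break
        chain.append(cause)
    return chain

# Toy cause map standing in for what stakeholders might tell you.
causes = {
    "on-hold times exceed five minutes": "too few agents per shift",
    "too few agents per shift": "hiring budget was frozen last year",
}
chain = five_whys("on-hold times exceed five minutes", causes.get)
# chain[-1] is the deepest cause reached -- a candidate root cause
```

The loop stops when no deeper cause can be found, which mirrors the practical rule: stop asking why only once the answer points to something you can actually act on.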

Refining your problem statements

When solving any sort of problem, there’s likely a slew of questions that might arise for you. In order to holistically understand the root cause of the problem at hand, your workforce needs to stay curious. 

An effective problem statement creates the space you and your team need to explore, gain insight, and get buy-in before taking action.

If you have embarked on a proposed solution, it’s also important to understand that solutions are malleable. There may be no single best solution. Solutions can change and adapt as external factors change, too. It’s more important than ever that organizations stay agile . This means that iterative check-ins are critical to solving tough problems. By keeping a good pulse on your course of action, you’ll be better equipped to pivot when the time comes to change. 

BetterUp can help. With access to virtual coaching , your people can get personalized support to help solve tough problems of the future.

Madeline Miles

Madeline is a writer, communicator, and storyteller who is passionate about using words to help drive positive change. She holds a bachelor's in English Creative Writing and Communication Studies and lives in Denver, Colorado. In her spare time, she's usually somewhere outside (preferably in the mountains) — and enjoys poetry and fiction.


3100 E 5th Street, Suite 350 Austin, TX 78702

  • Platform Overview
  • Integrations
  • Powered by AI
  • BetterUp Lead
  • BetterUp Manage™
  • BetterUp Care™
  • Sales Performance
  • Diversity & Inclusion
  • Case Studies
  • Why BetterUp?
  • About Coaching
  • Find your Coach
  • Career Coaching
  • Communication Coaching
  • Life Coaching
  • News and Press
  • Leadership Team
  • Become a BetterUp Coach
  • BetterUp Labs
  • Center for Purpose & Performance
  • Leadership Training
  • Business Coaching
  • Contact Support
  • Contact Sales
  • Privacy Policy
  • Acceptable Use Policy
  • Trust & Security
  • Cookie Preferences

How to assess reasoning skills


Identifying individuals with excellent reasoning skills is a common goal when making new hires. The ability of your employees to analyze information, think critically, and draw logical conclusions is crucial in today’s dynamic professional landscape. 

Pre-employment assessments offer great value by effectively assessing these essential capabilities. 

TestGorilla’s assessments objectively gauge a candidate’s ability to solve problems, evaluate arguments, and draw logical inferences. By leveraging these assessments, you can secure candidates with the cognitive skills necessary for analytical thinking and decision-making.

Table of contents

  • What is a reasoning skills assessment?
  • Why are reasoning skills important?
  • What skills and traits do employees with good reasoning have?
  • Tests for evaluating reasoning skills
  • How TestGorilla can help you find candidates with reasoning skills

What is a reasoning skills assessment?

A reasoning skills assessment is a valuable tool that can provide insights into a candidate’s ability to analyze information, think critically, and make logical deductions.

This assessment aims to evaluate an individual’s cognitive skills related to problem-solving, decision-making, and analytical thinking.

There are several types of cognitive ability tests that can aid in assessing reasoning. During a reasoning skills assessment, candidates are presented with various scenarios, questions, or problems that require them to apply logical thinking and problem-solving techniques. 

It can involve evaluating arguments, identifying patterns, making inferences, or solving puzzles. 

Assessments often use standardized tests or exercises that measure different aspects of reasoning. They’re designed to objectively evaluate a candidate’s cognitive abilities rather than simply relying on qualifications or experience. 

Using a reasoning skills assessment, you can make more informed decisions about a candidate’s aptitude for sound reasoning, problem-solving, and decision-making.

Why are reasoning skills important?

Effective problem-solving

Employees with solid reasoning skills can tackle complex problems with clarity and efficiency. They can analyze information, identify patterns, and make logical connections, enabling them to devise smart ways to meet challenges. 

Their problem-solving ability enhances productivity, streamlines work processes, and drives continuous organizational improvement. This is why you need analytic skills testing in your hiring process if you want to find the best candidates.

Quality decision-making

Reasoning skills contribute to effective decision-making. Employees who think critically and can logically evaluate information are more likely to make informed decisions based on evidence and careful analysis. 

Their ability to weigh options, consider potential outcomes, and anticipate risks helps mitigate errors.

Adaptability and flexibility

Individuals who can think critically and analyze situations from different angles are better equipped to embrace new challenges, adjust their approach, and find new strategies. 

Their adaptability fosters resilience, enabling them to thrive in fast-paced industries and contribute to organizational growth and success.

Enhanced innovation

Reasoning skills are at the core of innovative thinking. Employees who excel in reasoning can identify gaps, find opportunities, and connect seemingly unrelated ideas or concepts. 

Their ability to analyze data, draw logical conclusions, and come up with creative new tactics drives innovation. Hiring individuals with superb reasoning skills encourages the development of groundbreaking new ideas.

Effective risk management

Employees with exemplary reasoning abilities can evaluate potential risks, weigh their impact, and consider mitigation strategies. 

Their ability to anticipate challenges and make calculated decisions reduces the likelihood of costly errors or setbacks, contributing to effective risk management within your organization.

Continued learning and growth

People with great reasoning skills tend to be lifelong learners. They have a natural curiosity and a desire to expand their knowledge and skills. 

Their ability to think critically and adapt enables them to embrace new information, learn from experiences, and grow professionally. 

Effective communication and collaboration

Employees with reasoning skills can think critically and express their ideas clearly. They can engage in meaningful discussions, contribute valuable insights, and articulate their viewpoints. 

They can also understand and respect diverse perspectives, leading to enhanced teamwork, collaboration, and the generation of new, exciting courses of action through collective intelligence.

What skills and traits do employees with good reasoning have?

Critical thinking

Individuals with good reasoning skills demonstrate strong critical thinking abilities. They can analyze information objectively, evaluate arguments, and identify logical inconsistencies. 

Their critical thinking skills enable them to approach problems and challenges with a logical and rational mindset, enabling them to make sound decisions and solve complex issues effectively.

Problem-solving aptitude

Excellent reasoning skills often go hand in hand with exceptional problem-solving aptitude. Candidates who excel in reasoning can break down complex problems into manageable components, identify patterns, and come up with innovative new strategies. 

They exhibit a natural curiosity, a willingness to explore different approaches, and the ability to think outside the box, enabling them to overcome obstacles and find creative resolutions.

Analytical thinking

A key trait of individuals with good reasoning skills is their ability to think analytically. They can dissect complex information, identify key components, and draw connections between various data points. 

With their analytical thinking skills, they can examine data objectively, discern trends or patterns, and make informed decisions based on evidence and logical deductions.

Logical reasoning

Strong reasoning skills are often indicative of individuals who possess logical reasoning abilities. They can follow sequences, identify cause-and-effect relationships, and draw conclusions based on deductive or inductive reasoning. 

Their logical reasoning skills enable them to evaluate options, anticipate potential outcomes, and choose the most appropriate course of action.

Flexible thinking

Employees with good reasoning skills often exhibit cognitive flexibility. They can adapt their thinking and approach to different situations, incorporating new information and adjusting their perspectives as needed. 

Their cognitive flexibility lets them consider multiple viewpoints, explore alternative options, and navigate complex challenges with an open mind. They re-evaluate assumptions and revise their thinking based on new insights or evidence.

Communication skills

For reasoning skills to be effective in the workplace, communication is key. It’s important that employees can articulate their thoughts clearly, present logical arguments, and express complex ideas in a concise manner. 

The ability to communicate effectively helps to convey the reasoning process, engage in meaningful discussions, and collaborate with others, fostering better teamwork and understanding within the organization. 

Workplace communication tests can evaluate candidates’ ability to communicate at work.

Curiosity

Individuals with good reasoning skills demonstrate a natural curiosity and a thirst for continuous learning.

They have a genuine interest in expanding their knowledge, exploring new ideas, and seeking out information to enhance their understanding. 

Their curiosity drives them to stay updated on industry trends, engage in self-improvement, and continuously develop their reasoning abilities.

Tests for evaluating reasoning skills

When it comes to assessing a candidate’s reasoning skills, it’s important to delve deeper than surface-level observations. Understanding their critical thinking, problem-solving, and decision-making abilities is crucial. That’s where TestGorilla can lend a hand.

Our extensive test library is a treasure trove of options to suit your needs. You can mix and match tests to create an assessment that aligns perfectly with your company’s requirements. 

Whether you’re searching for top-notch analysts or logical thinkers who thrive in challenging situations, our tests can help you discover exceptional candidates with the cognitive skills to excel.

Here are some of our most popular tests for assessing reasoning skills:

Critical Thinking test

At TestGorilla, we understand the significance of this test in evaluating a candidate’s ability to analyze information, make logical connections, and approach problems from multiple perspectives.

By incorporating the Critical Thinking test into your reasoning skills assessment, you gain valuable insights into an individual’s cognitive abilities and capacity to think critically in real-world scenarios. 

This test goes beyond simple memorization or rote learning; it assesses how candidates can apply their knowledge, reason through complex situations, and arrive at sound conclusions.

Verbal Reasoning test

Our Verbal Reasoning test is essential because it assesses language comprehension, critical thinking, and problem-solving abilities. It evaluates an individual’s capacity to understand written information and draw logical conclusions. 

This test also indirectly measures language proficiency and communication skills. Verbal reasoning tests are widely used because they predict academic and occupational success, and they provide a fair and accessible assessment tool for individuals from diverse backgrounds.

Spatial Reasoning test

TestGorilla’s Spatial Reasoning test assesses a candidate’s capacity to perceive and understand spatial relationships, shapes, and patterns. 

This skill is particularly relevant in fields such as engineering, architecture, design, and logistics, where professionals often encounter complex spatial problems. 

The Spatial Reasoning test also assesses a candidate’s capacity to mentally visualize and manipulate objects in space. These abilities are essential for tasks that involve spatial planning, such as interpreting maps, organizing physical spaces, or understanding 3D models. 

Candidates who perform well in spatial reasoning tests demonstrate a heightened ability to think ahead, anticipate outcomes, and develop effective strategies based on spatial information. 

Numerical Reasoning test

The Numerical Reasoning test provides valuable insights into a job candidate’s reasoning skills, particularly in terms of quantitative analysis, problem-solving, and logical thinking. 

By assessing a candidate’s proficiency in interpreting numerical data and making accurate deductions, this test assists you in identifying those who possess the numerical acumen necessary for roles involving financial analysis, data-driven decision-making, and problem-solving using quantitative methods.

Mechanical Reasoning test

While not all job roles require mechanical reasoning, this test is pertinent for positions involving machinery, engineering, or technical operations, providing crucial insights into a candidate’s reasoning abilities in these areas.

The Mechanical Reasoning test evaluates a candidate’s understanding of mechanical principles and ability to apply that knowledge to solve problems. 

This test presents candidates with scenarios and questions that require them to analyze mechanical systems, interpret diagrams, and make logical deductions.

Problem Solving test

Problem-solving tests evaluate a candidate’s aptitude for analyzing issues from different perspectives, breaking them down into manageable components, and applying logical reasoning to reach effective resolutions. 

The Problem Solving test measures a candidate’s ability to think critically, make sound judgments, and adapt their problem-solving approach as necessary. 

Strong problem-solving skills are not limited to specific industries or job roles; they are highly transferable and valuable across various fields, including business, technology, healthcare, and customer service.

Attention to Detail (Textual) test

TestGorilla’s Attention to Detail (Textual) test offers valuable insights into a job candidate’s reasoning skills, particularly in assessing their ability to analyze and comprehend written information with precision and accuracy.

In most professional settings, the ability to pay close attention to detail is paramount. The Attention to Detail (Textual) test assesses a candidate’s proficiency in reading, comprehending, and scrutinizing written information, ensuring accuracy and completeness.

Big 5 (OCEAN) test

Reasoning skills are not solely dependent on cognitive abilities but are also influenced by an individual’s personality traits. 

The Big 5 (OCEAN) test assesses a candidate’s personality dimensions, providing a deeper understanding of their approach to challenges, level of openness to new ideas, organizational skills, propensity for collaboration, and emotional stability. 

For example, candidates with a high score in Conscientiousness demonstrate meticulous attention to detail and a structured approach to problem-solving, while those who get a high score in Openness exhibit creativity and a willingness to explore new ways of moving forward. 

By considering these traits alongside reasoning skills, you can gain a comprehensive understanding of a candidate’s potential to excel in tasks requiring critical thinking and reasoning.

Understanding Instructions test

The Understanding Instructions test plays a useful role in evaluating a job candidate’s reasoning skills, specifically their ability to accurately understand and execute tasks based on given instructions.

This test focuses on assessing an individual’s attention to detail, critical thinking, and capacity to analyze and interpret instructions. 

It offers valuable insights into a candidate’s logical reasoning, problem-solving skills, and potential for success in roles that require close adherence to guidelines.

How TestGorilla can help you find candidates with reasoning skills

If you’re looking to identify candidates with exceptional reasoning skills, TestGorilla is here to support your hiring journey. With our extensive range of scientifically designed tests, we provide you with a powerful tool to assess and evaluate critical thinking and problem-solving abilities.

By incorporating TestGorilla’s assessments into your hiring process, you’ll gain valuable insights into each candidate’s capacity to analyze, strategize, and make informed decisions, setting the stage for building a team of exceptional talent.

At TestGorilla, we understand that finding individuals who can think critically and adapt to complex challenges is crucial for your company’s success. Our tests are carefully crafted to gauge candidates’ logical reasoning, analytical skills, and cognitive abilities, giving you a comprehensive understanding of their reasoning prowess. 

By relying on TestGorilla’s innovative assessment platform, you can confidently identify top-tier candidates who will contribute fresh perspectives, creativity, and ingenuity to your organization.

Let us help you identify candidates with the critical thinking, problem-solving, and decision-making abilities your company needs to thrive.

Sign up for TestGorilla’s free plan today and experience the power of our reasoning skills assessments firsthand.


Evidence-Based Blog and Materials for SLPs!


Review of Social Language Development Test Adolescent: What SLPs Need to Know


Basic overview

Release date: 2010
Age range: 12-18
Authors: Linda Bowers, Rosemary Huisingh, Carolyn LoGiudice
Publisher: Linguisystems (PRO-ED as of 2014)

The Social Language Development Test: Adolescent (SLDT-A) assesses adolescent students’ social language competence. The test addresses the student’s ability to take on someone else’s perspective, make correct inferences, interpret social language, state and justify logical solutions to social problems, engage in appropriate social interactions, and interpret ironic statements.

The Making Inferences subtest of the SLDT-A assesses students’ ability to infer what someone in a picture is thinking, as well as to state which visual cues aided him/her in making that inference.

The first question asks the student to pretend to be a person in the photo and then to tell what the person is thinking by responding with a direct quote. The quote must be relevant to the person’s situation and the emotional expression portrayed in the photo. The second question asks the student to identify the relevant visual clues that he/she used to make the inference.

Targeted Skills include:

  • detection of nonverbal and context clues
  • assuming the perspective of a specific person
  • inferring what the person is thinking and expressing the person’s thought
  • stating the visual cues that aided with response production

A score of 1 or 0 is assigned to each response, based on relevancy and quality. However, in contrast to the SLDT-E, the student must give a correct response to both questions to achieve a score of 1.
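The conjunctive rule described above (credit only when both responses are correct) can be sketched in a few lines. This is purely an illustration; the function names and data layout are hypothetical, not taken from the test manual:

```python
def score_item(inference_correct: bool, cues_correct: bool) -> int:
    """Return 1 only when BOTH the inference response and the
    visual-cue response are judged acceptable; otherwise 0."""
    return 1 if (inference_correct and cues_correct) else 0

def subtest_raw_score(items):
    """Sum item scores (each a pair of judgments) to a raw subtest score."""
    return sum(score_item(a, b) for a, b in items)

# Example: three items; only the first has both parts correct.
print(subtest_raw_score([(True, True), (True, False), (False, False)]))  # prints 1
```

The point of the rule is that a partially correct item earns no credit: a relevant quote paired with irrelevant visual cues still scores 0.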

Errors can result from limited use of direct quotes (needed for correct responses to indicate empathy/attention to task), from poor interpretation of provided visual clues (attending to irrelevant visuals), and from vague, imprecise, or associated responses.

The Interpreting Social Language subtest of the SLDT-A assesses students’ ability to demonstrate actions (including gestures and postures), tell a reason or use for an action, think and talk about language and interpret figurative language including idioms.

A score of 1 or 0 is assigned to each response, based on relevancy and quality. The student must give a correct response to both questions to achieve a score of 1.

Targeted Skills:

  • Ability to demonstrate actions such as gestures and postures
  • Ability to explain appropriate reasons or use for actions
  • Ability to think and talk about language
  •  Ability to interpret figurative language (e.g., idioms)

Errors can result from vague, imprecise (off-target), or associated responses, as well as from a lack of response. Errors can also result from a lack of knowledge of the correct nonverbal gestures to convey the meaning of messages. Finally, errors can result from literal interpretations of idiomatic expressions.

The Problem Solving subtest of the SLDT-A assesses students’ ability to offer a logical solution to a problem and explain why that would be a good way to solve the problem.

To receive a score of 1, the student has to provide an appropriate solution with relevant justification. A score of 0 is given if the response to either question is incorrect or inappropriate.

Targeted Skills:

  • Taking perspectives of other people in various social situations
  • Attending to and correctly interpreting social cues
  • Quickly and efficiently determining best outcomes
  • Coming up with effective solutions to social problems
  • Effective conflict negotiation

Errors can result from illogical or irrelevant responses, restatement of the problem, rude solutions, or poor solution justifications.

The Social Interaction subtest of the SLDT-A assesses students’ ability to socially interact with others.

A score of 1 is given for an appropriate response that supports the situation. A score of 0 is given for negative, unsupportive, or passive responses, as well as for ignoring the situation or doing nothing.

Targeted Skills:

  • Provision of appropriate, supportive responses
  • Knowing when to ignore the situation

Errors can result from inappropriate responses that are negative, unsupportive, or illogical.

The Interpreting Ironic Statements subtest of the SLDT-A assesses students’ ability to recognize sarcasm and interpret ironic statements.

To get a score of 1, the student must give a response that shows s/he understands that the speaker is being sarcastic and is saying the opposite of what s/he means.  A score of 0 is given if the response is literal and ignores the irony of the situation.

Errors can result from consistent provision of literal idiom meanings, indicating a lack of understanding of the speaker’s intentions, as well as from “missing” the context of the situation. Errors can also result from the student identifying that the speaker is being sarcastic but being unable to explain the reason behind the speaker’s sarcasm (elaboration).

For example, one student was presented with a story of a brother and a sister who extensively labored over a complicated recipe. When their mother asked them how it came out, the sister responded to their mother’s query: “Oh, it was a piece of cake.” The student was then asked: “What did she mean?” Instead of responding that the girl was being sarcastic because the recipe was very difficult, the student responded: “easy.” When presented with a story of a boy who refused to help his sister fold laundry under the pretext that he was “digesting his food,” and who was then told by her, “Yeah, I can see you have your hands full,” the student was asked: “What did she mean?” The student provided a literal response and stated: “he was busy.”

Goal-setting

The following goals can be generated based on the performance on this test:

  • Long Term Goals: Student will improve social pragmatic language skills in order to effectively communicate with a variety of listeners/speakers in all social and academic contexts
  • Short Term Goals
  • Student will improve his/her ability to  make inferences based on social scenarios
  • Student will improve his/her interpretation of facial expressions, body language, and gestures
  • Student will improve his/her ability to interpret social language (demonstrate appropriate gestures and postures, use appropriate reasons for actions, interpret figurative language)
  • Student will improve his/her ability to provide multiple interpretations of presented social situations
  • Student will improve his/her ability to improve social interactions with peers and staff (provide appropriate supportive responses; ignore situations when doing nothing is the best option, etc)
  • Student will improve his/her ability to  interpret abstract language (e.g., understand common idioms, understand speaker’s beliefs, judge speaker’s attitude, recognize sarcasm, interpret irony, etc)

Caution

A word of caution regarding testing eligibility: 

I would not administer this test to the following adolescent populations:

  • Students with social pragmatic impairments secondary to intellectual disabilities (IQ <70)
  • Students with severe forms of Autism Spectrum Disorders
  • Students with severe language impairment and limited vocabulary inventories
  • English Language Learners (ELL) with suspected social pragmatic deficits 
  • Students from low SES backgrounds with suspected pragmatic deficits 

I would also not administer this test to Culturally and Linguistically Diverse (CLD) students due to significantly increased potential for linguistic and cultural bias, which may result in test answers being marked incorrect for the following reasons:

  • Lack of relevant vocabulary knowledge will affect performance
  • How many such students would know the meaning of the word “sneer”?
  • How many can actually show it?
  • An entire subtest is devoted to idioms
  • Individual vs. cooperative culture differences

What I like about this test: 

  • I like the fact that, unlike the CELF-5:M, the test is composed of open-ended questions instead of an orally/visually based multiple-choice format, as this is far more authentic in its representation of real-world experiences
  • I really like how select subtests (e.g., Making Inferences) require a correct response to both questions in order for the student to achieve credit on the item

Overall, when you carefully review what’s available in the area of assessment of social pragmatic abilities of adolescents, this is an important test to have in your assessment toolkit, as it provides very useful information for setting social pragmatic language treatment goals.

Have YOU purchased the SLDT-A yet? If so, how do you like using it? Post your comments, impressions, and questions below.

Helpful Resources Related to Social Pragmatic Language Overview, Assessment, and Remediation:

  • The Checklists Bundle
  • Narrative Assessment and Treatment Bundle
  • Social Pragmatic Assessment and Treatment Bundle
  • Psychiatric Disorders Bundle
  • Fetal Alcohol Spectrum Disorders Assessment and Treatment Bundle
  • Social Pragmatic Deficits Checklist for Preschool Children
  • Social Pragmatic Deficits Checklist for School Aged Children
  • Behavior Management Strategies for Speech Language Pathologists
  • Social Pragmatic Language Activity Pack

Disclaimer: The views expressed in this post are the personal opinion of the author. The author is not affiliated with PRO-ED or Linguisystems in any way and was not provided by them with any complimentary products or compensation for the review of this product. 

3 thoughts on “ Review of Social Language Development Test Adolescent: What SLPs Need to Know ”


A very informative review! I totally agree with the author’s point of view, especially the part regarding EL and CLD students! Thank you so much for sharing this review!

[…] Social Language Development Test Adolescent (up to 18 years of age) […]

[…] Skills (Suggestion: administer portions of Social Language Development Test Elementary, Social Language Development Test Adolescent, Informal Social Thinking Dynamic Assessment […]



The AI-Powered Talent Assessment Tool – See Testlify in action

Problem Solving test

Overview of the Problem Solving test

This test evaluates the candidate’s problem-solving skills across various business functionalities.

Skills measured

  • Client Management
  • Decision Making
  • Customer Escalation
  • Stakeholder Management

Available in

Cognitive Ability

About the Problem Solving test


The capacity to solve problems under pressure is a crucial skill to look for in any candidate. Problems inevitably arise in business, and how a candidate analyzes and resolves them is what sets them apart from other applicants. Problem-solving tests are an excellent tool for assessing a candidate’s analytical skills when they are presented with complex situations.

This test presents candidates with issues that might arise in different business areas and asks how they would respond in each situation. The questions require the candidate to think from the point of view of a specified role and gauge how well they resolve the problem with everyone’s best interests in mind.

When recruiting for positions in management, operations, and logistics, assessing problem-solving skills is essential, as these roles often involve situations that demand quick reasoning to resolve an issue without harming the company’s reputation. Candidates who pass this test are good at thinking critically, managing clients and stakeholders, handling customers, and making sound decisions. The test can identify genuine talent with the ability to create short-term and long-term solutions to problems that might keep a company from achieving its goals.

Relevant for

  • Project leadership
  • Team leadership
  • Project management

Recruiting for Problem Solving

For candidates with good problem-solving abilities, the traditional method of assessing their skills by asking about previous experiences can be inaccurate, and recruiters can lose promising talent. Additionally, cognitive bias may creep in when soft skills are evaluated during personal interviews. An objective problem-solving test is designed to assess, without bias, skills such as:


Client management

Customer assistance or client management often involves addressing a critical problem or grievance, and it is a key area determining the success of a business. It requires careful consideration of an issue’s cause and effect and of the consequences of the actions taken to resolve it.

Decision making

Making appropriate situational decisions with the given resources is a vital problem-solving skill. A candidate should make suitable choices after identifying and carefully considering the problem, thinking through the possible solutions and the repercussions of the issue.

Customer escalation

Problem-solving skills include the ability to identify situations requiring escalation and to take appropriate action. In addition to providing solutions, knowing when to guide and redirect customers to more experienced agents who can help them better is a necessary skill.

Stakeholder management

Managing stakeholders requires knowledge of the interests of the different parties involved and careful coordination to minimize damage to the overall business. Satisfied stakeholders are critical to the long-term sustainability of any project, and strong problem-solving skills are required to manage them.

The Problem Solving test is created by a subject-matter expert

Testlify’s skill tests are designed by experienced SMEs (subject matter experts). We evaluate these experts on specific metrics such as expertise, capability, and market reputation. Before publication, each skill test is peer-reviewed by other experts and then calibrated using insights from a significant number of test-takers who are well-versed in that skill area. Our built-in feedback systems and algorithms enable our SMEs to refine our tests continually.


Key features

  • White label
  • Typing test
  • ATS integrations
  • Custom questions
  • Live coding tests
  • Multilingual support
  • Psychometric tests

Why choose Testlify?

Elevate your recruitment process with Testlify, the finest talent assessment tool. With a diverse test library boasting 1000+ tests and features such as custom questions, typing tests, live coding challenges, Google Suite questions, and psychometric tests, finding the perfect candidate is effortless.

Enjoy seamless ATS integrations, white-label features, and multilingual support, all in one platform. Simplify candidate skill evaluation and make informed hiring decisions with Testlify.

Solve your skill assessment needs with ease

1400+ premium tests

Choose from a test library of 1200+ tests for different roles and skills.

100+ ATS integrations

Sample reports

  • View sample score card
  • View SMART personality test report
  • View DISC personality test report
  • View Culture fit test report
  • Use this test


Top five hard skills interview questions for Problem Solving

Here are the top five hard-skill interview questions tailored specifically for Problem Solving. These questions are designed to assess candidates’ expertise and suitability for the role, along with skill assessments.

1. Can you walk me through your process for solving complex problems?

Why this matters

Problem-solving is a critical skill in any business or technical role, and a skilled problem solver should have a well-defined process for approaching complex problems.

What to listen for

Listen for the candidate to describe a structured approach to problem-solving, including how they define the problem, gather information, analyze data, generate potential solutions, evaluate those solutions, and implement the best one. Look for examples of how the candidate has used this process to solve complex problems in the past.

2. How do you handle situations where you don’t have all the information you need to solve a problem?

Why this matters

Problem-solving often involves incomplete or ambiguous information, and a skilled problem solver should be able to handle those situations effectively.

What to listen for

Listen for the candidate to describe their approach to handling incomplete or ambiguous information, including how they identify gaps in their knowledge, how they gather additional information, and how they make assumptions and test those assumptions. Look for examples of how the candidate has successfully solved problems with incomplete information.

3. Can you describe a particularly challenging problem you solved and how you approached it?

Why this matters

This question allows the candidate to showcase their problem-solving abilities and provides insight into their problem-solving process.

What to listen for

Listen for the candidate to describe a particularly challenging problem they solved, and how they approached it. Look for examples of how the candidate defined the problem, identified potential solutions, evaluated those solutions, and implemented the best one. Also, listen for how the candidate communicated their solution to stakeholders and how they measured the success of their solution.

4. How do you prioritize and manage multiple competing problems or projects?

Why this matters

Effective problem solvers should be able to prioritize and manage multiple competing priorities to maximize productivity and efficiency.

What to listen for

Listen for the candidate to describe their process for prioritizing and managing multiple competing problems or projects. Look for examples of how they’ve managed complex projects and how they’ve dealt with competing demands for their time and attention. Also, listen for how the candidate balances short-term and long-term priorities.

5. Can you give an example of a problem you encountered that required you to think outside the box to find a solution?

Why this matters

Innovative thinking and creativity can be valuable assets in problem-solving, and this question helps assess those skills.

What to listen for

Listen for the candidate to describe a problem that required them to think outside the box to find a solution. Look for examples of how they generated unique and creative solutions, and how they tested and refined those solutions. Also, listen for how the candidate communicated their solution to stakeholders and how they evaluated the success of their solution.

Frequently Asked Questions for Problem Solving

What is a problem-solving assessment?

A problem-solving assessment is a process used to evaluate an individual’s ability to identify and solve problems in a systematic and logical manner. It may be conducted as part of a job application process, in order to determine whether a candidate has the necessary skills and experience to perform a particular role.

Why is problem solving important?

  • It helps to improve decision-making: Problem-solving involves gathering and analyzing information, evaluating options, and making decisions based on that information. This process helps ensure that decisions are based on sound reasoning and evidence rather than made impulsively or on assumptions.
  • It drives progress toward goals: Effective problem-solving is crucial for achieving both personal and organizational goals. By identifying and addressing obstacles and challenges, individuals and organizations can move forward and make progress toward their objectives.

Frequently asked questions (FAQs)

Want to know more about Testlify? Here are answers to the most commonly asked questions about our company.

Can I try a sample test before attempting the actual test?

Yes, Testlify offers a free trial for you to try out our platform and get a hands-on experience of our talent assessment tests. Sign up for our free trial and see how our platform can simplify your recruitment process.

How can I select the tests I want from the Test Library?

To select the tests you want from the Test Library, go to the Test Library page and browse tests by category, such as role-specific tests, language tests, programming tests, software skills tests, cognitive ability tests, situational judgment tests, and more. You can also search for specific tests by name.

What are ready-to-go tests?

Ready-to-go tests are pre-built assessments that are ready for immediate use, without the need for customization. Testlify offers a wide range of ready-to-go tests across different categories, such as language tests (22 tests), programming tests (57 tests), software skills tests (101 tests), cognitive ability tests (245 tests), situational judgment tests (12 tests), and more.

Can you integrate with our existing ATS?

Yes, Testlify offers seamless integration with many popular Applicant Tracking Systems (ATS). We have integrations with ATS platforms such as Lever, BambooHR, Greenhouse, JazzHR, and more. If you have a specific ATS that you would like to integrate with Testlify, please contact our support team for more information.

What are the basic technical requirements needed to take your tests?

Testlify is a web-based platform, so all you need is a computer or mobile device with a stable internet connection and a web browser. For optimal performance, we recommend using the latest version of the web browser you’re using. Testlify’s tests are designed to be accessible and user-friendly, with clear instructions and intuitive interfaces.

Are your tests valid and reliable?

Yes, our tests are created by industry subject matter experts and go through an extensive QA process by I/O psychologists and industry experts to ensure that the tests have good reliability and validity and give accurate results.

Hire with Facts, not Fiction.

Resumes don’t tell you everything! Testlify gives you the insights you need to hire the right people with skills assessments that are accurate, automated, and unbiased.


©2024 Testlify All Rights Reserved


Test library request

These are upcoming tests. If you wish to prioritize a test request, we can curate it for you at an additional cost.

Frontend Developer (Enhanced)

Frontend Developer (Enhanced) specializes in advanced UI/UX design, proficient in latest web technologies for optimal interactive experiences.

GBV – Gender Based Violence

The GBV test evaluates candidates’ awareness and approach to gender-based violence, ensuring hires promote a safe, respectful workplace, thereby enhancing organizational culture and commitment.

Global Mindset

The Global Mindset Test identifies candidates adept at navigating cross-cultural interactions, enhancing hiring by pinpointing individuals with adaptability, communication, and strategic thinking.

Digital Literacy

The Digital Literacy test evaluates candidates’ digital skills, including computer literacy, cybersecurity awareness, and software proficiency, aiding in hiring decisions for tech-driven workplaces.

Business Acumen

The Business Acumen test evaluates candidates’ strategic and operational understanding, aiding in hiring individuals with the ability to drive business success and make informed decisions.

Learning Agility

The Learning Agility Test assesses a candidate’s ability to learn from experiences and apply knowledge in unfamiliar situations, enhancing hiring by identifying fast, adaptable learners.

Self Management

The self-management test assesses candidates’ ability to effectively organize, motivate, and regulate themselves, enhancing hiring decisions by identifying individuals with high autonomy.

Entrepreneurial Thinking

The entrepreneurial thinking test identifies candidates with the mindset to innovate and lead. It benefits hiring by pinpointing individuals capable of driving growth and managing risks.

Risk Assessment

The risk assessment test evaluates candidates’ ability to identify, evaluate, and mitigate risks, enhancing hiring decisions by ensuring prospective hires possess critical risk management skills.

Compliance and Governance

The Compliance and Governance test identifies candidates proficient in regulatory standards and ethical practices, enhancing hiring accuracy and safeguarding organizational integrity.

Visionary Leadership

The Visionary Leadership test identifies future-focused leaders, assesses strategic thinking and innovation, enhances hiring decisions, and drives organizational growth.

Social Responsibility

The social responsibility test evaluates candidates’ ethical awareness and commitment to community and environmental issues. It aids in selecting candidates aligned with corporate values and more.

Operational Excellence

The Operational Excellence test assesses candidates’ proficiency in optimizing processes and driving efficiency. It aids in hiring by identifying candidates with the skills needed for roles.

Google Android

This Google Android test evaluates proficiency in Android app development, covering key aspects like UI, functionality, and Android system knowledge.

Extensible markup language XML

XML tests in the hiring process enable structured data exchange, aiding in candidate assessment. Benefits include easy data manipulation, compatibility, and organized candidate profiles.

Nginx

The Nginx test library provides a comprehensive set of test cases designed to validate and ensure the reliable performance of Nginx in various scenarios.

Redux

The Redux assessment tests are designed to evaluate a candidate’s proficiency in working with Redux, a popular JavaScript library used for managing application state.

TypeScript Developer

The TypeScript Developer test assesses a candidate’s proficiency in TypeScript programming skills, ensuring adherence to industry best practices.

Digital Marketing SaaS Products

The Digital Marketing SaaS Products test assesses candidates’ ability to strategize, implement, and optimize online marketing efforts for software solutions.

Azure

The Azure test is designed to evaluate a candidate’s proficiency with Microsoft’s cloud platform, focusing on their ability to build, deploy, and manage applications and services.

Sales Aptitude SaaS Products

The Sales Aptitude SaaS Products test is designed to evaluate a candidate’s potential in sales roles, focusing on skills and traits essential for success.

Monitoring and Evaluation Officer

Monitoring and Evaluation Officer test screens for key skills like project management, analysis, communication, and ethics, streamlining recruitment for roles demanding a broad range of competencies.

WordPress Security

The WordPress Security test evaluates candidates’ proficiency in safeguarding WordPress websites against cyber threats, helping you identify candidates skilled in implementing robust security and maintaining user trust.

WordPress Plugin Development

The WordPress Plugin Development test assesses candidates’ proficiency in creating custom plugins for WordPress websites, enabling employers to spot skilled developers capable of enhancing website functionality.

WordPress Theme Customization

The WordPress Theme Customization test evaluates candidates’ ability to tailor WordPress themes, ensuring visually appealing and functional websites.

General WordPress Development

The General WordPress Development test evaluates candidates’ expertise in WordPress, including theme and plugin development, and PHP proficiency.

WordPress Hosting Management

The WordPress Hosting Management test pinpoints experts in server optimization, security, and data protection for WordPress, aiding efficient hiring for reliable web hosting management.

WordPress Ultimo

The WordPress Ultimo test assesses multisite management skills, essential for web development, digital marketing, and content roles in industries needing scalable web solutions.

WordPress Multisite

Assess WordPress Multisite skills: network optimization, scalability, efficient site management. Find experts in multisite operation.

WordPress Performance Optimization

The WordPress Performance Optimization test screens for expertise in speeding up websites, essential for SEO and user satisfaction, streamlining the hiring of high-performance WordPress professionals.


Color Ball Sort : Puzzle 12+

Higgs Technology Co., Limited. Designed for iPad.

  • 5.0 • 1 Rating


Description

Dive into the captivating world of Color Ball Sort – the ultimate brain-teasing color sorting game! Immerse yourself in a challenging and addictive puzzle experience that will test your strategic thinking and problem-solving skills.

Key Features:

  • Mind-Blowing Puzzles: Tackle an array of mind-bending puzzles designed to engage and entertain players of all ages. Sort colorful balls with precision to unravel intricate patterns and complete each level.
  • Intuitive Gameplay: Simple drag-and-drop controls make playing a breeze. Just tap and drag the balls to their designated tubes, but beware – one wrong move could set you back!
  • Strategic Thinking: Exercise your brain as you strategize the best way to sort the balls efficiently. With every move, you'll encounter new challenges that will keep you hooked for hours.
  • Beautiful Graphics: Enjoy stunning visuals and vibrant colors that enhance the overall gaming experience. The sleek and minimalist design adds a touch of elegance to the game.
  • Unlock New Levels: Progress through a variety of levels, each more challenging than the last. Unlock new puzzles and test your skills in increasingly complex scenarios.
  • Boost Your IQ: Color Ball Sort is not just a game – it's a workout for your brain! Enhance your logical reasoning and problem-solving abilities while having fun.
  • No Time Limits: Take your time to strategize and solve puzzles at your own pace. No stressful timers – just pure, relaxing gameplay.

How to Play: Drag and drop the balls into the tubes to match the colors. Plan your moves carefully to avoid getting stuck. Enjoy the satisfaction of completing each level as you progress through the game.

Download Color Ball Sort now and unleash the genius within you! Embark on a journey of color, strategy, and endless fun. Can you conquer the sorting challenge? Find out today!
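The sorting mechanic described above follows a simple legality rule: a ball may be dropped onto a tube only if the tube has room and is either empty or shows the same color on top. As an illustrative sketch only (assumed rules and names, not the app's actual code), the move rule and win condition can be expressed in a few lines:

```python
# Illustrative sketch of a color-sort puzzle's rules (assumptions, not
# the app's real implementation). The top of a tube is the list's end.

CAPACITY = 2  # assumed tube size for this small example; games often use 4

def can_move(src: list[str], dst: list[str]) -> bool:
    """True if the top ball of `src` may legally drop onto `dst`."""
    if not src or len(dst) >= CAPACITY:
        return False
    return not dst or dst[-1] == src[-1]

def move(src: list[str], dst: list[str]) -> None:
    """Apply a legal move in place."""
    assert can_move(src, dst)
    dst.append(src.pop())

def solved(tubes: list[list[str]]) -> bool:
    """Every tube is empty or completely filled with a single color."""
    return all(not t or (len(t) == CAPACITY and len(set(t)) == 1)
               for t in tubes)

# A tiny instance: two mixed tubes plus one empty spare.
tubes = [["red", "blue"], ["blue", "red"], []]
```

Sorting this instance takes three moves: blue from tube 0 to the spare, red from tube 1 onto tube 0, then blue from tube 1 onto the spare, after which `solved(tubes)` returns True.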

Version 1.0.4

1. Add more levels. 2. Fix bugs.


App Privacy

The developer, Higgs Technology Co., Limited, indicated that the app’s privacy practices may include handling of data as described below. For more information, see the developer’s privacy policy.

Data Used to Track You

The following data may be used to track you across apps and websites owned by other companies:

  • Identifiers
  • Diagnostics

Data Linked to You

The following data may be collected and linked to your identity:

Privacy practices may vary, for example, based on the features you use or your age.


IMAGES

  1. What Is Problem-Solving? Steps, Processes, Exercises to do it Right

  2. math problem solving checklist

  3. Problem Solving Practice Test

  4. Advanced Problem Solving Test: Questions and Answers

  5. what is the 5 step problem solving model

  6. 7 Steps to Improve Your Problem Solving Skills

VIDEO

  1. Canadian Force Aptitude Test (CFAT) math word problem solving

  2. Canadian Force Aptitude Test (CFAT) math word problem solving

  3. math word problem Canadian force aptitude test (CFAT)

  4. Canadian force aptitude test problem solving (CFAT)

  5. math fractions and decimal operations

  6. Canadian force aptitude test problem solving (CFAT)

COMMENTS

  1. (TOPS-3:E) Test of Problem Solving-3:Elementary

    While other tests may assess students' thinking skills by tapping mathematical, spatial, or nonverbal potential, the TOPS 3 Elementary measures discrete skills that form the foundation of language-based thinking, reasoning, and problem-solving abilities. The test is composed of 18 situations that examine six thinking tasks.

  2. (TOPS-3E:NU) Test of Problem Solving

    The TOPS-3E: NU focuses on a student's linguistic ability to think and reason. Language competence is the verbal indicator of how a student's language skills affect his or her ability to think, reason, problem solve, infer, classify, associate, predict, determine causes, sequence, and understand directions. The test focuses on a broad range ...

  3. Test of Problem Solving

    Measure Discreet Skills that Form the Foundation of Language-Based Thinking, Reasoning, and Problem-Solving. The TOPS-3E: NU assesses a school-age child's ability to integrate semantic and linguistic knowledge with reasoning ability by way of picture stimuli and verbal responses. The TOPS-3E: NU focuses on students' linguistic ability to ...

  4. What is Problem Solving? Steps, Process & Techniques

    Finding a suitable solution for issues can be accomplished by following the basic four-step problem-solving process and methodology outlined below. Step 1, define the problem: differentiate fact from opinion, specify underlying causes, consult each faction involved for information, and state the problem specifically.

  5. TOPS-3E: NU: Test Of Problem Solving-Third Edition Elementary

    Description of the Test. The TOPS-3E: NU has three components: an Examiner's Manual, Examiner Record Booklets, and a Picture Book. ... Booklet provides space to record responses and transform the raw score to an age equivalent, percentile rank, and the Problem Solving Index. The test kit also includes a Picture Book, which includes the picture ...

  6. How Good Is Your Problem Solving?

    Problem solving is an exceptionally important workplace skill. Being a competent and confident problem solver will create many opportunities for you. By using a well-developed model like Simplexity Thinking for solving problems, you can approach the process systematically, and be comfortable that the decisions you make are solid.

  7. Test of Problem Solving 3 (TOPS-3)

    While other tests may assess students' thinking skills by tapping mathematical, spatial, or nonverbal potential, the TOPS 3 Elementary measures discrete skills that form the foundation of language-based thinking, reasoning, and problem-solving abilities. The test is composed of 18 situations that examine six thinking tasks.

  8. Test of Problem Solving 2: Adolescent

    Overview. The Test of Problem Solving 2 - Adolescent (TOPS-2:A-Adolescent; Bowers, Huisingh, & LoGiudice, 2007) is a norm-referenced instrument that assesses critical thinking abilities based on student language strategies using logic and experience. It is for adolescents ages 12 years through 17 years, 11 months.

  9. Test Your Problem-Solving Skills

    Test Your Problem-Solving Skills. Complete the patterns by filling in the appropriate shape.

  10. Test of Problem Solving 2: Adolescent (TOPS-2:A)

    TOPS 2: Adolescent uses a natural context of problem-solving situations related to adolescent experiences and assesses five different decision-making skill areas critical to academic, problem solving, and social success. The test is comprised of five subtests (18 written passages) that assess a student's performance of these skills.

  11. A Test of the Test of Problem Solving (TOPS)

    In recent years, a number of new speech and language tests for children have appeared. Of interest is whether these new tests help to define language disorders in such a way as to identify children with problems, delineate their problems, and provide useful intervention strategies. The focus of this paper is the Test of Problem Solving (1984).

  12. Intelligence and Creativity in Problem Solving: The Importance of Test

    Vice versa, when task or problem descriptions are fuzzy and under specified, the problem solver's inferences are more idiosyncratic; the resulting process will evolve within an ill-defined space and will contain more generative-evaluative cycles in which new goals are set, and the cycle is repeated (Dennett, 1978, as cited in Gabora, 2002, p ...

  13. What Are Problem-Solving Skills? Definition and Examples

    Problem-Solving Skills Definition. Problem-solving skills are the ability to identify problems, brainstorm and analyze answers, and implement the best solutions. An employee with good problem-solving skills is both a self-starter and a collaborative teammate; they are proactive in understanding the root of a problem and work with others to ...

  14. TOPS-2:A: Test of Problem Solving 2: Adolescent

    Description. Diagnose Language-Based Thinking Deficits. Ages: 12-0 through 17-11 Testing Time: 40 minutes While other tests may assess thinking skills by tapping mathematical, spatial, or nonverbal potential, the TOPS 2: Adolescent assesses critical thinking abilities based on the student's language strategies using logic and experience.. The TOPS 2: Adolescent uses a natural context of ...

  15. How to Write a Problem Statement (With 3 Examples)

    Example Problem Statement 1: The Status Quo Problem Statement. Example: The average customer service on-hold time for Example company exceeds five minutes during both its busy and slow seasons. This can be used to describe a current pain point within an organization that may need to be addressed.

  16. How to assess reasoning skills

    When it comes to assessing a candidate's reasoning skills, it's important to delve deeper beyond surface-level observations. Understanding their critical thinking, problem-solving, and decision-making abilities is crucial. That's where TestGorilla can lend a hand. Our extensive test library is a treasure trove of options to suit your needs.

  17. PDF The Wechsler Individual Achievement Test- 3 Edition

    The Wechsler Individual Achievement Test, Third Edition (WIAT-III) is a source of information about an individual's academic skills and problem-solving abilities that can be used to guide appropriate intervention. It is a comprehensive yet flexible measurement tool useful for achievement skills assessment, learning disability diagnosis ...

  18. How to Test Problem-Solving and Decision-Making Skills

    One of the most common and effective methods to evaluate a candidate's problem-solving and decision-making skills is to ask them scenario-based questions. These are hypothetical situations that ...

  19. Why you need pre-employment problem-solving tests

    Pre-employment problem solving tests are designed to measure a candidate's ability to solve complex problems through teamwork, logic, reasoning and cooperation with their peers. Finding the ideal candidate for a role goes beyond determining if they have enough experience to perform the job's tasks — the real goal is to find someone who ...

  20. Review of Social Language Development Test Adolescent: What SLPs Need

    The Problem Solving subtest of the SLDT-A assesses students' ability to offer a logical solution to a problem and explain why that would be a good way to solve the problem. To receive a score of 1, the student has to provide an appropriate solution with relevant justification.

  21. Problem Solving test

    The Problem Solving test is created by a subject-matter expert. Testlify's skill tests are designed by experienced SMEs (subject matter experts). We evaluate these experts based on specific metrics such as expertise, capability, and their market reputation. Prior to being published, each skill test is peer-reviewed by other experts and then ...

  22. Module 1

    Military Problem Solving Process (MPSP) The ________ was expanded and altered so it could apply to operational problems. Incorrect - Mission analysis Process. Military Problem Solving Process (MPSP) As a leader which technique or process would you use to plan for tactical military operations? Military Decision Making Process (MDMP) Study with ...

  23. General Psychology Chapter 8 Inquizitive

    1.) Flexibility and novelty in thinking. 2.) High overall intelligence. 3.) The ability to reason, to make decisions, to make sense of events, and to adapt to environmental challenges. 4.) Knowledge and use of knowledge. Identify each characteristic as relating to either decision making or problem solving.

  24. ‎Color Ball Sort : Puzzle on the App Store

    ‎Dive into the captivating world of Color Ball Sort - the ultimate brain-teasing color sorting game! Immerse yourself in a challenging and addictive puzzle experience that will test your strategic thinking and problem-solving skills. Key Features: Mind-Blowing Puzzles: Tackle an array of mind-bendi…