Feeding Two Wolves: How Balancing Creativity and Focus Fuels Innovation and Mastery

Updated 9/29/25 for clarity and development of key concepts.

There’s an old parable about two wolves living inside us. One represents our darker nature, the other our better instincts. The lesson: the wolf you feed is the one that thrives. But when it comes to innovation and mastery, I’ve found the real challenge isn’t choosing which wolf to feed. It’s learning to feed both.

In my work, those two wolves represent divergent and convergent thinking. Divergent thinking explores possibilities without constraint. Convergent thinking brings discipline and execution. Most of us naturally favor one over the other. That imbalance shows up in how we work, how teams function, and ultimately, in what gets built.

The Atelier Method and Innovation

Several years after art school, I returned to study master draftsmanship through the traditional atelier model, specifically the Grand Central Atelier methods. This classical approach emphasizes precision, observation from life, and technical rigor. It might seem disconnected from digital strategy work, but the parallels are telling.

My painting American Music (inspired by the Violent Femmes song) began with pure exploration. I didn’t know what the piece would become. I started with the song’s energy and let the concept reveal itself through sketching and experimentation. That’s divergent thinking: generating possibilities, following intuition, staying open to what emerges.

But once the concept crystallized, everything shifted. The next months were convergent work: hours of precise execution refining tones, adjusting shadows, capturing subtle shifts in light. Every brushstroke required focus and technical control. The creative freedom that sparked the idea gave way to disciplined execution.

This same pattern plays out in every innovation project. Early stages require divergent thinking: brainstorming, prototyping, exploring what’s possible without immediately judging feasibility. Then comes the shift to user research, data analysis, strategic planning. The wild ideas get refined into something that can actually ship.

Why Balance Is Hard

Most people get stuck on one side. Some teams generate endless ideas but never execute. Others jump to execution before exploring alternatives. They optimize the first solution they find rather than the best one.

I work with teams to move fluidly between these modes. Early in a project, we deliberately create space for divergent thinking. We encourage ideas that might seem unfocused but reveal valuable directions. Then we shift intentionally to convergent work, refining those insights into actionable strategy.

When it works, the two modes compound. Disciplined execution often sparks new creative insights. Exploring edge cases during implementation reveals better approaches. The wolves feed each other.

The Research Limitation

This balance matters because of a fundamental constraint most companies miss: you cannot research your way to breakthrough innovation.

Research is invaluable for understanding user needs and validating market fit. But research is inherently backward-looking. It tells you what people know they want based on what they’ve experienced. Groundbreaking products emerge from creative leaps beyond what users can articulate, combined with disciplined execution that makes those visions real.

Apple didn’t research their way to the iPhone. They made a creative bet about how people would want to interact with technology, then executed with obsessive focus on details. The research came later, validating and refining the core vision.

This is why companies staffed with smart people doing rigorous research still produce incremental products. They’re feeding only the convergent wolf, the one that optimizes known problems with proven methods. The divergent wolf that generates genuinely new possibilities is starving.

Progress requires both: research and data to ground ideas in reality, plus creative vision to push beyond what’s currently known or proven.

Mastery Works the Same Way

This pattern extends beyond product development to any pursuit of mastery. In classical realism, you don’t simply copy what you see. You interpret, bringing both technical precision and creative understanding to the work.

The process moves between modes constantly. You start with creative freedom, exploring how to approach the subject. Then you focus intensely on technical execution: mixing exact colors, capturing precise values. But during that focused work, you often discover new possibilities. A technique you practiced for realism suddenly suggests an unexpected creative direction.

That’s where divergent and convergent thinking integrate naturally. Not as opposing forces, but as complementary capabilities that strengthen each other through practice.

Application

If you recognize yourself getting stuck in one mode:

Too much divergence (endless ideation, nothing ships):

  • Set concrete decision points: “We explore until X date, then commit”
  • Define what “good enough” looks like before you start
  • Build in forcing functions that require convergence

Too much convergence (optimizing the first solution):

  • Deliberately schedule divergent thinking time early in projects
  • Use creative constraints that force novel approaches
  • Reward exploration, not just execution

For teams:

  • Make the mode shifts explicit: “We’re in divergent mode until Friday”
  • Don’t judge divergent ideas by convergent criteria (and vice versa)
  • Staff projects with people who can operate in both modes, not just their preferred one

The goal isn’t perfect balance. It’s fluidity—knowing when to explore, when to focus, and how to let each mode inform the other.


Want to learn more about divergent and convergent thinking in technology innovation? Check out the following materials:

The Synergy of Diverge and Converge in Design Thinking (Voltage Control): How divergent and convergent thinking are essential in the design thinking process, offering practical tips on how to integrate these approaches into innovative projects.

Unleash the Unexpected for Radical Innovation (MIT Sloan Management Review): Explores how radical innovation often emerges from unexpected ideas, highlighting the importance of environments that encourage divergent thinking and creativity. (May require subscription)

The Design of Everyday Things (Don Norman): A classic UX book, essential for understanding design thinking, usability, and balancing creativity with practical application in product design.


About the Author
Dorothy is a digital strategist and researcher who helps companies turn big ideas into real-world innovations. Outside of work, she is applying this same balance of creativity and focus to her current pursuit of master draftsmanship.

Moving from Vision to Design: User-Centered Methods for New Product Definition

A 30-minute seminar by Dorothy M. Danforth for the IEEE Computer Society Leading Professional Seminar Series.

https://www.youtube.com/watch?v=JnymRBYY1vQ&feature=youtu.be

It’s a common scenario: A company is planning a new product or significant redesign. There have been various discussions about how the product should have a “great user experience” and “focus on the user.” But there are also conflicting ideas on what a great experience might entail, along with competing priorities for what the product absolutely must do to be successful in the marketplace.

Where to begin? How do you break through the confusion and move toward a clarified product vision? Whether they are large established corporations or lean start-ups, organizations struggle to progress from early ideation into clear requirements and a tangible design phase. This webinar will explore ways to leverage user experience design methods in the very early stages of the product life cycle.

This session covers the following:

  • An overview of practical user research and design planning methods useful for early stage products and redesigns
  • Strategies for leveraging these methods to refine a product’s vision and ensure features are tied to user goals
  • Examples of how keeping a focused eye on user needs can help resolve conflicting priorities and promote product team alignment

Conducting a Solid UX Competitive Analysis

“Competition brings out the best in products
and the worst in people.” –  David Sarnoff

Most people are familiar with the concept of a competitive analysis; it’s a fairly standard business term describing the identification and evaluation of your competition in the marketplace. In the case of UXD, a competitive analysis is used to evaluate how a given product’s competition stacks up against usability standards and the overall user experience. A comparative analysis is a term I’ve often used to describe the review of applications or websites that are not in direct competition with a product, but may have similar processes or interface elements worth reviewing.

Often, when a competitive review is conducted, the applications or websites are reviewed against a set of fairly standard usability principles (or heuristics) such as layout consistency, grouping of common tasks, and link affordance. Sometimes, however, the criteria can be more broadly defined to include highlights of interesting interaction models, notable functionality, and other items that might be useful in the context of the product being designed or the goals of a specific release.

The Expert Review
Competitive reviews can be done in conjunction with an “expert” review, which is a usability evaluation of your own existing product. If doing both, it’s helpful to start with the competitive review and then conduct the expert review using the same criteria. Completing the competitive review first allows you to judge your own product relative to the competition.

Why Conduct a Competitive Analysis?

  • Understand how the major competition in your space is handling usability
  • Understand where your product stands relative to its competition
  • Generate ideas for solving various usability issues
  • Get an idea of what it might take to gain a competitive edge via usability/UX

When is a Competitive Analysis Most Useful?

Competitive analysis is best done during the early planning and requirements gathering stages. It can be conducted independently of a specific project cycle, or, with more focused criteria, it can help define the goals for a specific release. It is especially useful:

  • If a thorough competitive review has never been conducted
  • When a new product or a major, game-changing rebuild is being considered
  • Annually or biannually, to keep an eye on trends in your industry and on the web (such as changes in how social networking sites are integrated)

Limitations of a Competitive Analysis

  • A competitive analysis can help you understand what it will take to come up to par with your competitors; however, it cannot show you how to innovate and lead.
  • Insights can be limited by the knowledge level and/or evaluation abilities of the reviewer.
  • They can be time-consuming to conduct and need to be redone on a regular basis.

How to Conduct a UXD Competitive Analysis

  1. Select your competition. I recommend targeting no fewer than five, but no more than ten, of your top competitors. The longer the competitive list, the more difficult it will be to do a sufficiently thorough investigation. There is also a point of diminishing returns, where little is happening in the space that hasn’t already been brought to light by a previous competitor.
     Consider this…

     When selecting a list of competitors, instead of just asking “Who does what we do?”, think about users’ alternatives. Ask “Who or what is most likely to keep users from using our software or visiting our website?” While not normally thought of as “competition,” alternatives to an operations support system could be an Excel spreadsheet or printed paper forms.

  2. Define the assessment criteria. It is important to define your criteria before you get started so you are consistent in what you look for during your reviews. Take a moment to consider some of the issues and goals your organization is dealing with, along with any hypotheses you uncovered during your audit, and incorporate these into the criteria where possible. Criteria should be specific and assessable. Here is a short excerpt of the criteria used in an e-commerce site evaluation:
     i. Customer reviews and ratings?
     ii. Can you wish list your items to refer back to later?
     iii. In-store pickup option?
     iv. Product availability indication?
     v. Can you compare items against each other?
     vi. What general shopping features are available?
  3. Create a Spreadsheet. Put your full assessment criteria list in the first column, and the names of your competition along the top header row. Be sure to leave a general comments line at the bottom of your criteria list; you will use this for notes during individual reviews. Some of the evaluation might be relative (e.g., rating the quality of imagery relative to other sites on a 1–10 scale), so it is particularly helpful to work from a single spreadsheet as you move through your reviews. A minimal scripted version of this scaffold appears after this list.
  4. Gather your materials. Collect the competitive site URLs, software, or devices you will be reviewing so they are readily available. A browser with tabbed browsing works well for web reviews. For mobile applications, simulators can often be downloaded that let you run the device software on your computer.
  5. Start Reviewing. One at a time, go down the criteria list while working through the application and enter your responses. It can be helpful to use dual screens, with the application in one view and the spreadsheet in the other. Take your time and try to be as observant as possible; you are looking for both good and bad points to report. As you review, write down notes on what you liked, what annoyed you, and any interesting widgets you see, and take screen captures of interesting or relevant screens.
  6. Prepare the Analysis. Create an outline of the review document, including a summary area and a section for each individual review. Paste the assessment results and your notes from the spreadsheet into the document and use them as a starting point for writing the report. You may need to grab additional screen captures of specific items referenced in your evaluation.
  7. Summarize your Insights. With the reviews done, you can look back at what data pops out as most relevant. Some of the criteria results can be translated into summary charts and graphs to better illustrate the information; a short charting sketch follows this list.
  8. Schedule a Read Out. Take time to present your findings: set up a read-out for your colleagues. You may want to create a high-level PowerPoint presentation of some of the more interesting points from your review. After conducting the read-out, publish your documentation and let people know where you’ve placed the information.
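
To make the spreadsheet step concrete, here is a minimal sketch in Python using pandas. The criteria are taken from the e-commerce excerpt above, while the competitor names and the file name competitive_review.csv are illustrative assumptions; substitute your own criteria from step 2.

    # Sketch: scaffold the review spreadsheet (criteria as rows, competitors as columns).
    # The criteria, competitor names, and file name are illustrative placeholders.
    import pandas as pd

    criteria = [
        "Customer reviews and ratings?",
        "Can you wish list your items to refer back to later?",
        "In-store pickup option?",
        "Product availability indication?",
        "Can you compare items against each other?",
        "What general shopping features are available?",
        "General comments",  # notes row at the bottom, per step 3
    ]
    competitors = ["Our System", "Company A", "Company B", "Company C", "Company D"]

    # One empty cell per criterion/competitor pair, to be filled in during the reviews.
    sheet = pd.DataFrame("", index=criteria, columns=competitors)
    sheet.index.name = "Assessment Criteria"
    sheet.to_csv("competitive_review.csv")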

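For the summary step, here is a small sketch of turning criteria results into a chart, assuming yes/no criteria were recorded as “Yes” or “No” in the scaffold above (the file name carries over from that sketch):

    # Sketch: chart how many reviewed systems offer each feature.
    # Assumes yes/no criteria were entered as "Yes"/"No" in competitive_review.csv.
    import pandas as pd
    import matplotlib.pyplot as plt

    sheet = pd.read_csv("competitive_review.csv", index_col="Assessment Criteria")
    feature_counts = (sheet == "Yes").sum(axis=1)  # systems offering each feature

    feature_counts.plot(kind="barh", title="Feature coverage across reviewed systems")
    plt.xlabel("Number of systems")
    plt.tight_layout()
    plt.savefig("feature_coverage.png")
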
Competitive Assessment Rubric

If you don’t have time for a full written competitive analysis, you can evaluate your competition with an assessment rubric. Because it results in a clear ranking, the rubric is a good “at a glance” way of communicating a product’s relative strengths and weaknesses to clients. Some things to note about this evaluation method:

  • It is not a scientific analysis; it’s a shortcut for communicating your assessment of the systems reviewed. Like judging a talent competition, subjective ratings (e.g., “eight out of ten”) are inherently imprecise. However, if you use consistent, pre-defined criteria, you should end up with a realistic representation of your comparative rankings.
  • When creating the assessment criteria, it is important to select attributes that are roughly equivalent in value. In the example below, “Template Layouts” and “Browsing & Navigation” were equally important to the overall effectiveness of the sites reviewed.

The following rubric was created to evaluate mobile phone activation sites, but this approach can be adapted to create ranking metrics for any application.

1. First, create the criteria and rating system by which you will evaluate each system.

Marketing & Commerce Integration
  1 – Poor: The option to “Buy Now” is not offered in the marketing pages, or users are sent to a third-party site.
  2 – Average: The option to “Buy Now” is available from the marketing pages, but there are some usability issues with layout and transition.
  3 – Excellent: The marketing and commerce sites are well integrated and provide users with an almost seamless transition.

Template Layouts (Commerce)
  1 – Poor: The basic layout is not consistent from page to page, and/or the activity areas within the layout are not clearly grouped by type of user task.
  2 – Average: The layout is mostly consistent from page to page and major activity areas are grouped by task type. Some areas with information-heavy content or more complex user tasks deviate from the established layout paradigms.
  3 – Excellent: The site shows a high level of continuity, both in page-to-page transitions and task-type groupings. Information-heavy content and complex user tasks are well thought out and intuitive relative to the site’s established layout paradigms.

Browsing & Navigation
  1 – Poor: The site lacks a cohesive Information Architecture. Information is not in a clear top-down hierarchy. There are numerous “orphan” or pop-up pages that do not fit within the site structure. Similar content is duplicated in multiple areas or is presented in multiple navigational contexts.
  2 – Average: The site has a structured Information Architecture. Secondary and tertiary navigation items are related to parent elements, but there may be multiple menus unrelated to the broader structure. There may be orphan pages of detail or less relevant information.
  3 – Excellent: The Information Architecture is highly cohesive. Information is structured with a clear understanding of user goals. Everything has a logical place within the architecture; secondary menus are incorporated into the site structure or clearly transitioned.

Terminology & Labeling (Commerce)
  1 – Poor: Terminology and labeling is inconsistent, confusing, or inaccurate. Different terms are used to represent the same concept. Some terms may not adhere to a common understanding of their meanings.
  2 – Average: Terminology and labeling is consistent but could be more intuitive. Some unnecessary industry-specific terminology or uncommon terms are used.
  3 – Excellent: Terminology and naming is both intuitive and consistent. Only necessary industry-specific terminology is used, in context, with help references.

2. Next, evaluate the competition along with your own system, scoring the results. In this example, the criteria measured:

  • Marketing & Commerce Integration: how well the site handles the user’s transition from browsing product information to checkout once a purchase decision is made.
  • Template Layouts: how clear and consistent the overall layout is and how well the layout elements translate to different areas of the site.
  • Browsing & Navigation: relative clarity and consistency of the information architecture and overall ease of browsing.
  • Terminology & Labeling: relative clarity and consistency of language use and element naming.

 


                 Marketing &     Template     Browsing &     Terminology      Total
                 Commerce        Layouts      Navigation     & Labeling
                 Integration
  Our System     2 (Average)     1 (Poor)     2 (Average)    3 (Excellent)      8
  Company A      3               3            2              3                 11
  Company B      2               2            2              2                  8
  Company C      1               3            1              2                  7
  Company D      2               3            2              3                 10


3. Once your table is complete, you can sort on the totals to see your rough rankings, as in the sketch below.
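
If the rubric lives in a script rather than a spreadsheet, the totals and rankings fall out directly. Here is a minimal sketch using the example scores from the table above (pandas again, as in the earlier sketches):

    # Sketch: total and rank rubric scores (1 = Poor, 2 = Average, 3 = Excellent).
    # The scores are the example values from the table above.
    import pandas as pd

    scores = pd.DataFrame(
        {
            "Marketing & Commerce Integration": [2, 3, 2, 1, 2],
            "Template Layouts": [1, 3, 2, 3, 3],
            "Browsing & Navigation": [2, 2, 2, 1, 2],
            "Terminology & Labeling": [3, 3, 2, 2, 3],
        },
        index=["Our System", "Company A", "Company B", "Company C", "Company D"],
    )

    scores["Total"] = scores.sum(axis=1)
    print(scores.sort_values("Total", ascending=False))  # rough rankings, best first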