Feeding Two Wolves: How Balancing Creativity and Focus Fuels Innovation and Mastery

You’ve probably heard the parable of the two wolves—one representing our darker instincts, the other our better nature. The idea is that the wolf you feed is the one that thrives. But in the context of innovation, I’ve come to believe the real challenge isn’t choosing which wolf to feed, but learning how to nurture both.

One wolf represents divergent thinking—the mind that runs wild with unrestrained, endless possibilities. The other is convergent thinking—the disciplined, focused mind that brings clarity and execution to those ideas. I navigate this tension daily, both in my work as a strategist and in my personal artistic journey.

Many years after attending art school, I decided to pursue master draftsmanship through the traditional atelier model, specifically the Grand Central Atelier (GCA) methods. For those unfamiliar, this fine-art approach emphasizes precision, observation from life, and classical technique—a practice that might seem far removed from digital strategy but, in truth, has clear parallels to it.

Much like the early stages of an innovation project, my artwork begins with a period of divergent thinking. I allow myself to explore the subject freely, not knowing exactly where it will lead. This was evident in my project American Music (inspired by the Violent Femmes song of the same name). The concept for the painting revealed itself to me slowly, much like finding a melody within the noise. At this stage, the possibilities are limitless.

Then comes a necessary shift—the Convergent Wolf steps in. Once I’ve found my direction, I move into a precise, focused mode, where hours of labor go into perfecting the details. In art, this means refining tones, adjusting shadows, and capturing the nuances that bring a piece to life. In my professional work, this shift happens when concepts move from imagination to execution—when user research, data analysis, and strategic planning turn abstract notions into real-world outcomes.

This is an inherent struggle for most of us. We either live unrestrained in the divergent mode, endlessly exploring ideas without bringing them to fruition, or we cling to the predictability of convergent thinking, trying to control ideas before they’ve had a chance to evolve. But through both my professional and artistic practices, I’ve learned that true innovation only happens when you encourage both wolves.

In every project I lead, whether crafting a product roadmap or refining a strategic vision, the tension between creativity and pragmatism is always present. I work with teams to embrace this dynamic—encouraging unbounded ideas from the wild Divergent Wolf, which may seem unfocused at first but reveal valuable insights into where things could go. Then, we shift to the wise Convergent Wolf, refining those ideas together into something actionable and concrete. When we get it right, the two modes feed off each other, creating something greater than either could alone.

Balancing divergent and convergent thinking isn’t just a business strategy—it’s essential for achieving meaningful outcomes in any field. Without divergence, you miss the creative breakthroughs that lead to innovation, and without convergence, those ideas will never see the light of day. This is also why companies can’t ‘research’ their way into innovative products—while research helps to ground ideas in reality, it’s the creative leaps and visionary thinking beyond the data that produce the most valuable results. Successful products emerge when companies balance research with creativity, execution, and the willingness to explore what’s possible beyond what’s known.

Research is invaluable for understanding user needs and market trends, but it’s inherently backward-looking. Groundbreaking innovation comes from generating new possibilities through divergent thinking—creative leaps that can’t always be predicted by data. Progress requires a balance between research, creativity, and disciplined execution, fueled by human intuition and vision to push boundaries and create transformative products.

This same balance applies to mastering any craft, whether in business or art. Masters don’t simply copy what they see; they interpret, with deep understanding and intention. The journey begins with the spark of an idea and the freedom to explore, but it soon transitions into discipline and focus. Each brushstroke, each adjustment to tone, represents hours of dedicated work, yet all of it stems from that initial creative freedom. Sometimes, homing in on the details sparks unexpected discoveries—where the Divergent and Convergent Wolves meet once again, collaborating harmoniously in new ways.

So, which wolf do I feed? Both. Because in the end, it’s not about choosing between creativity and execution—it’s about the dance between them. And that’s where real innovation—and mastery—happens.


Like to learn more about divergent and convergent thinking in technology innovation? Check out the following materials:

The Synergy of Diverge and Converge in Design Thinking (Voltage Control): How divergent and convergent thinking are essential in the design thinking process, offering practical tips on how to integrate these approaches into innovative projects.

Unleash the Unexpected for Radical Innovation (MIT Sloan Management Review): Explores how radical innovation often emerges from unexpected ideas, highlighting the importance of environments that encourage divergent thinking and creativity. (May require subscription)

The Design of Everyday Things (Don Norman): A classic UX book, essential for understanding design thinking, usability, and balancing creativity with practical application in product design.


About the Author
Dorothy is a digital strategist and researcher who helps companies turn big ideas into real-world innovations. Outside of work, she is applying this same balance of creativity and focus to her current pursuit of master draftsmanship.

Moving from Vision to Design: User-Centered Methods for New Product Definition

Seminar by Dorothy M. Danforth for the IEEE Computer Society Leading Professional Seminar Series (30 minutes).

https://www.youtube.com/watch?v=JnymRBYY1vQ&feature=youtu.be

It’s a common scenario: a company is planning a new product or a significant redesign. There have been various discussions about how the product should have a “great user experience” and “focus on the user.” But there are also conflicting ideas about what a great experience might entail, along with competing priorities for what the product absolutely must do to succeed in the marketplace.

Where to begin? How do you break through the confusion and move toward a clarified product vision? Whether in a large established corporation or a lean start-up, organizations struggle to progress from early ideation to clear requirements and a tangible design phase. This webinar explores ways to leverage user experience design methods in the very early stages of the product life cycle.

This session covers the following:

  • An overview of practical user research and design planning methods useful for early-stage products and redesigns
  • Strategies for leveraging these methods to refine a product’s vision and ensure features are tied to user goals
  • Examples of how keeping a focused eye on user needs can help resolve conflicting priorities and promote product team alignment

Conducting a Solid UX Competitive Analysis

“Competition brings out the best in products and the worst in people.” – David Sarnoff

Most people are familiar with the concept of a competitive analysis; it’s a fairly standard business term for identifying and evaluating your competition in the marketplace. In user experience design (UXD), a competitive analysis is used to evaluate how a given product’s competition stacks up against usability standards and overall user experience. A comparative analysis is a term I’ve often used to describe the review of applications or websites that are not in direct competition with a product but may have similar processes or interface elements worth reviewing.

Often, when a competitive review is conducted, the applications or websites are reviewed against a set of fairly standard usability principles (or heuristics) such as layout consistency, grouping of common tasks, and link affordance. Sometimes, however, the criteria can be defined more broadly to include highlights of interesting interaction models, notable functionality, or other items that might be useful in the context of the product being designed or the goals of a specific release.

The Expert Review
Competitive reviews can be done in conjunction with an “expert” review, which is a usability evaluation of your existing product. If doing both, it’s helpful to start with the competitive review and then conduct the expert review using the same criteria. Completing the competitive review first lets you judge your own product relative to its competition.

Why Conduct a Competitive Analysis?

  • Understand how the major competition in your space is handling usability
  • Understand where your product stands relative to its competition
  • Generate ideas for how to solve various usability issues
  • Get an idea of what it might take to gain a competitive edge via usability/UX

When is a Competitive Analysis Most Useful?

Competitive analysis is best done during the early planning and requirements-gathering stages. It can be conducted independently of a specific project cycle, or, with more focused criteria, it can help shape the goals of a specific release. It is particularly useful:

  • If a thorough competitive review has never been conducted.
  • When a new product or a major, game-changing rebuild is being considered.
  • Annually or bi-annually, to keep an eye on trends in your industry and on the web (such as changes in how social networking sites are integrated).

Limitations of a Competitive Analysis

  • A competitive analysis can help you understand what it will take to come up to par with your competitors; however, it cannot show you how to innovate and lead.
  • Insights can be limited by the knowledge level and/or evaluation abilities of the reviewer.
  • It can be time-consuming to conduct and needs to be redone on a regular basis.

How to Conduct a UXD Competitive Analysis

  1. Select your competition. I recommend targeting at least five, but no more than ten, of your top competitors. The longer the list, the more difficult it will be to do a sufficiently thorough investigation. There is also a point of diminishing returns, beyond which each additional competitor reveals little that hasn’t already been brought to light by the others.
     Consider this…

     When selecting your list of competitors, instead of just asking, “Who does what we do?”, think about users’ alternatives. Ask, “Who or what is most likely to keep users from using our software or visiting our website?” While not normally thought of as “competition,” the alternatives to an operations support system could be an Excel spreadsheet or printed paper forms.

  2. Define the assessment criteria. It is important to define your criteria before you get started, so that you are consistent in what you look for during your reviews. Take a moment to consider the issues and goals your organization is dealing with, along with any hypotheses you uncovered during your audit, and incorporate these into the criteria where possible. Criteria should be specific and assessable. Here is a short excerpt of the criteria used in an e-commerce site evaluation:
     i.    Customer reviews and ratings?
     ii.   Can you wish-list items to refer back to later?
     iii.  In-store pickup option?
     iv.   Product availability indication?
     v.    Can you compare items against each other?
     vi.   What general shopping features are available?
  3. Create a spreadsheet. Put your full assessment criteria list in the first column and the names of your competitors along the top header row. Be sure to leave a general comments line at the bottom of the criteria list; you will use this for notes during the individual reviews. Some of the evaluation might be relative (e.g., rate the quality of imagery relative to other sites, 1–10), so it is particularly helpful to work through all of your reviews in a single spreadsheet (a scripted version of this setup is sketched just after this list).
  4. Gather your materials. Collect the competitor site URLs, software, or devices you will be reviewing so they are readily available. A browser with tabbed browsing works well for web reviews. For mobile applications, you can often download simulators that let you run the device software on your computer.
  5. Start reviewing. One at a time, work down the criteria list while looking through the application and enter your responses. It can be helpful to use two screens, with the application on one and the spreadsheet on the other. Take your time and be as observant as possible; you are looking for both good and bad points to report. As you review, write down notes on what you liked, what annoyed you, and any interesting widgets you see, and take screen captures of interesting or relevant screens.
  6. Prepare the analysis. Create an outline of the review document, including a summary area and a section for each individual review. Paste the assessment results and your notes from the spreadsheet into the document and use them as a starting point for writing the report. You may need to grab additional screen captures of specific items that will appear in your evaluation.
  7. Summarize your insights. Now that the reviews are done, look back at the data to see what pops out as most relevant. Some of the criteria results can be translated into summary charts and graphs to better illustrate the information.
  8. Schedule a read-out. Take time to present your findings: set up a read-out for your colleagues. You may want to create a high-level PowerPoint presentation of some of the more interesting points from your review. Afterwards, publish your documentation and let people know where to find it.
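
If you prefer to script the spreadsheet setup rather than build it by hand, here is a minimal sketch in Python. It is illustrative only: the criteria list reuses the e-commerce excerpt from step 2, while the competitor names and the competitive_review.csv filename are placeholders to substitute with your own.

```python
import csv

# Placeholder values -- substitute the criteria and competitor list
# you defined in steps 1 and 2.
criteria = [
    "Customer reviews and ratings?",
    "Can you wish-list items to refer back to later?",
    "In-store pickup option?",
    "Product availability indication?",
    "Can you compare items against each other?",
    "What general shopping features are available?",
]
competitors = ["Company A", "Company B", "Company C"]

with open("competitive_review.csv", "w", newline="") as f:
    writer = csv.writer(f)
    # Header row: a criteria column followed by one column per competitor.
    writer.writerow(["Criteria"] + competitors)
    # One row per criterion, with empty cells to fill in during each review.
    for criterion in criteria:
        writer.writerow([criterion] + [""] * len(competitors))
    # General comments line at the bottom, as recommended in step 3.
    writer.writerow(["General comments"] + [""] * len(competitors))
```

Opening the resulting CSV in Excel or Google Sheets gives you the single working spreadsheet described above.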

Competitive Assessment Rubric

If you don’t have time for a full written competitive analysis, you can evaluate your competition with an assessment rubric. Because it results in a clear ranking, the rubric is a good “at a glance” way of communicating a product’s relative strengths and weaknesses to clients. Some things to note about this evaluation method:

  • It is not a scientific analysis; it’s a shortcut for communicating your assessment of the systems reviewed. As in judging a talent competition, subjective ratings (e.g., “eight out of ten”) are inherently imprecise. However, if you use consistent, pre-defined criteria, you should end up with a realistic representation of your comparative rankings.
  • When creating the assessment criteria, it is important to select attributes that are roughly equivalent in value. In the example below, “Template Layouts” and “Browsing & Navigation” were equally important to the overall effectiveness of the sites reviewed.

The following rubric was created to evaluate mobile phone activation sites, but this approach can be adapted to create ranking metrics for any application.

1. First, create the criteria and rating system by which you will evaluate each system.

Marketing & Commerce Integration
  1 – Poor: The option to “Buy Now” is not offered in the marketing pages, or users are sent to a third-party site.
  2 – Average: The option to “Buy Now” is available from the marketing pages, but there are some usability issues with layout and transition.
  3 – Excellent: The marketing and commerce sites are well integrated and provide users with an almost seamless transition.

Template Layouts (Commerce)
  1 – Poor: The basic layout is not consistent from page to page, and/or the activity areas within the layout are not clearly grouped by type of user task.
  2 – Average: The layout is mostly consistent from page to page, and major activity areas are grouped by task type. Some areas with information-heavy content or more complex user tasks deviate from the established layout paradigms.
  3 – Excellent: The site shows a high level of continuity, both in page-to-page transitions and in task-type groupings. Information-heavy content and complex user tasks are well thought out and intuitive relative to the site’s established layout paradigms.

Browsing & Navigation
  1 – Poor: The site lacks a cohesive information architecture. Information is not in a clear top-down hierarchy. There are numerous “orphan” or pop-up pages that do not fit within the site structure. Similar content is duplicated in multiple areas or is presented in multiple navigational contexts.
  2 – Average: The site has a structured information architecture. Secondary and tertiary navigation items are related to parent elements, but there may be multiple menus unrelated to the broader structure. There may be orphan pages of detailed or less relevant information.
  3 – Excellent: The information architecture is highly cohesive. Information is structured with a clear understanding of user goals. Everything has a logical place within the architecture; secondary menus are incorporated into the site structure or clearly transitioned.

Terminology & Labeling (Commerce)
  1 – Poor: Terminology and labeling are inconsistent, confusing, or inaccurate. Different terms are used to represent the same concept. Some terms may not adhere to a common understanding of their meanings.
  2 – Average: Terminology and labeling are consistent but could be more intuitive. Some unnecessary industry-specific terminology or uncommon terms are used.
  3 – Excellent: Terminology and naming are both intuitive and consistent. Only necessary industry-specific terminology is used, in context, with help references.

2. Next, evaluate the competition along with your own system, scoring the results.

  • Marketing & Commerce Integration: how well the site handles the user’s transition from browsing product information to making an online purchase decision.
  • Template Layouts: how clear and consistent the overall layout is and how well the layout elements translate to different areas of the site.
  • Browsing & Navigation: relative clarity and consistency of the information architecture and overall ease of browsing.
  • Terminology & Labeling: relative clarity and consistency of the language use and element naming.

                 Marketing &   Template   Browsing &    Terminology     Total
                 Commerce      Layouts    Navigation    & Labeling
                 Integration

Our System       2 (Average)   1 (Poor)   2 (Average)   3 (Excellent)     8
Company A        3             3          2             3                11
Company B        2             2          2             2                 8
Company C        1             3          1             2                 7
Company D        2             3          2             3                10
Company E        3
Company F
Company G
Company H

3. Once your table is complete, you can sort on the totals to see your rough rankings.
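
If you are comparing many systems or repeating the analysis regularly, the tallying and sorting are easy to script. Here is a minimal sketch in Python; the dictionary layout and names are illustrative rather than part of the method, and the ratings are the completed rows from the table above.

```python
# Rubric ratings (1 = Poor, 2 = Average, 3 = Excellent) per criterion, in the
# order: Marketing & Commerce Integration, Template Layouts,
# Browsing & Navigation, Terminology & Labeling.
scores = {
    "Our System": [2, 1, 2, 3],
    "Company A":  [3, 3, 2, 3],
    "Company B":  [2, 2, 2, 2],
    "Company C":  [1, 3, 1, 2],
    "Company D":  [2, 3, 2, 3],
}

# Total each system's ratings and sort from strongest to weakest.
rankings = sorted(
    ((name, sum(ratings)) for name, ratings in scores.items()),
    key=lambda pair: pair[1],
    reverse=True,
)

for name, total in rankings:
    print(f"{name}: {total}")
# Prints: Company A: 11, Company D: 10, Our System: 8, Company B: 8, Company C: 7
```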