Measuring Design Thinking on the Innovation and Partner Ecosystem Level

Are we an organization capable of innovating and collaborating with design thinking? And are our partners, too?

The transition from the culture change program level to the innovation ecosystem level is fluid as well. On this level, we ask ourselves: are we an innovative organization? Do we consistently generate new business opportunities, product/service innovations, and improvements within our innovation portfolio? Have we established an internal innovation system that does a good job of supporting our innovators? Are we embedded in an external ecosystem of partners, coopetitors, and suppliers that makes all of its actors better off through knowledge exchange, co-creation, and network effects?

In politics, innovation (eco)systems are viewed from an even higher zoom level: there, they are often understood in line with Friedrich List’s and Bengt-Åke Lundvall’s notion of production, education, and economic development systems in society at regional, sectoral, or national levels, where they denote the flow of technology and information among the people, enterprises, and institutions of a nation. But for this article, let’s stick to the point of view of a single organization and look at some example metrics:

Innovation Ecosystem Activity Metrics

Improved activity metrics across all levels covered in the previous articles, plus:

  • # and quality of partnerships and collaborations
  • # of requests for co-creation and collaboration from relevant external parties
  • % of smart failures celebrated and reflected on in public
  • % of unstructured time given to employees
  • Ratio of having to outsource innovation work vs. being able to do it in-house
    (e.g. coaching, prototyping, design, user research, coding, etc.; see the sketch after this list)
  • etc.
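
To make the in-house vs. outsourced ratio mentioned above tangible, here is a minimal Python sketch that computes it from a hypothetical log of innovation work items. The data format, the category names, and the figures are illustrative assumptions on my part, not a prescribed tool or standard.

```python
from collections import Counter

# Hypothetical log of innovation work items: (capability, sourcing),
# where "internal" means done by our own people and "external" means outsourced.
work_items = [
    ("user research", "internal"),
    ("prototyping", "external"),
    ("coaching", "external"),
    ("design", "internal"),
    ("coding", "internal"),
]

def in_house_ratio(items):
    """Ratio of in-house to outsourced innovation work items."""
    counts = Counter(sourcing for _, sourcing in items)
    internal, external = counts["internal"], counts["external"]
    return internal / external if external else float("inf")

print(f"In-house vs. outsourced ratio: {in_house_ratio(work_items):.2f}")  # 1.50

# A per-capability breakdown shows where outsourcing is still necessary.
print(Counter(capability for capability, sourcing in work_items if sourcing == "external"))
```
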
Innovation Ecosystem Output Metrics

Improved output metrics across the project and portfolio/funnel levels, plus:

  • Changes in psychological safety across the whole organization
  • Increase in employee satisfaction and productivity
  • Positive media and analyst mentions
  • Easier recruiting and retaining of talent
  • Upward trend in classic KPIs for UI, UX, CX, and brand perception across all product/service lines
  • Customer satisfaction, e.g. via NPS or Customer Centricity Score
  • Improved organizational readiness scores for innovation, e.g. via ‘readiness assessments’
  • Innovation contribution, e.g. via % of revenue from products/services created in the last five years (see the sketch after this list)
  • etc.
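
Two of the output metrics above rest on simple, widely used formulas: NPS is the percentage of promoters (scores 9–10) minus the percentage of detractors (scores 0–6), and innovation contribution can be calculated as the share of total revenue coming from products or services launched within the last five years. The short Python sketch below illustrates both; the survey scores, product names, and revenue figures are made up purely for illustration.

```python
def net_promoter_score(scores):
    """NPS = % promoters (9-10) minus % detractors (0-6) on a 0-10 scale."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100 * (promoters - detractors) / len(scores)

def innovation_contribution(revenue_by_product, launch_year, current_year, window=5):
    """Share of total revenue from products launched within the last `window` years."""
    total = sum(revenue_by_product.values())
    recent = sum(
        rev for product, rev in revenue_by_product.items()
        if current_year - launch_year[product] < window
    )
    return recent / total

# Illustrative data only
survey = [10, 9, 8, 7, 6, 10, 3, 9, 8, 10]
revenue = {"legacy suite": 6_000_000, "new app": 1_500_000, "new service": 500_000}
launched = {"legacy suite": 2010, "new app": 2021, "new service": 2023}

print(f"NPS: {net_promoter_score(survey):.0f}")                                  # 30
print(f"Innovation contribution: {innovation_contribution(revenue, launched, 2024):.0%}")  # 25%
```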

As I have tried to show in this article series, metrics become more abstract with each level. The higher-order levels are the responsibility of executives and top management; the lower-order levels are taken care of by coaches, team leaders, and the teams themselves. Yet in reality, things can get complicated. New ways of working might, for example, require you and your project sponsors to manage and measure on different levels simultaneously. We strongly advise you to ask yourself on which level(s) you actually plan to intervene, so that you measure the right things and can set expectations with all co-creators accordingly. The box below illustrates a scenario that is not at all uncommon:

Your first design thinking project:

Let’s assume you start your project with the goal of designing a public-private partnership business model together with business partners in another country. None of them has ever worked with agile methodologies. In addition, the on-site team you were able to assemble lacks experience in the hard skills needed, typically user research and prototyping. So, while the project is running, you realize that both your team and the partners first need methods and »mindset« training, as it turns out they are not yet entirely comfortable working in a design thinking mode. And even once your team and partners are ready to roll, you might often find yourself having to explain or even justify this new way of working to internal colleagues in their organization and your own. This costs additional time, which is well worth investing, but it is deducted from the project processing time and thus influences how much you can realistically achieve. If your project sponsor now only gets your outcome metrics reported, they might think you have been »lazy« or »unsuccessful«.

Design thinking projects in non-agile organizations with low innovation mastery involuntarily have to address non-project-related issues and intervene on other, higher-order levels to make progress.

So, do yourself a favor: clearly differentiate your project-level metrics from the others and communicate them separately. If you have to do additional awareness and communication work with all kinds of stakeholders on top of your project, measure and report those efforts as they arise. It would be unfair if you were measured by the project results alone. Especially in first projects, it is not only about project success but also about how you have influenced and changed assumptions, thinking, and behavior on the other levels.
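
One lightweight way to keep this separation honest is to tag every metric you report with the level it belongs to and to group your reporting by those tags, so that project outcomes and the additional enabling work (training, stakeholder communication, justification) never collapse into a single number. Here is a minimal Python sketch of that idea; the level names and metrics are hypothetical examples, not a fixed taxonomy.

```python
from collections import defaultdict

# Each reported metric is tagged with the level it belongs to (hypothetical examples).
metrics = [
    {"level": "project", "name": "validated prototype iterations", "value": 4},
    {"level": "project", "name": "partner interviews conducted", "value": 12},
    {"level": "culture/enablement", "name": "method trainings delivered", "value": 3},
    {"level": "culture/enablement", "name": "hours spent explaining the way of working", "value": 18},
]

def report_by_level(items):
    """Group metrics by level so sponsors see project results and enabling work separately."""
    grouped = defaultdict(list)
    for m in items:
        grouped[m["level"]].append((m["name"], m["value"]))
    return dict(grouped)

for level, entries in report_by_level(metrics).items():
    print(f"\n{level.upper()}")
    for name, value in entries:
        print(f"  - {name}: {value}")
```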

Conclusion

You see, measuring innovation in general, and design thinking in particular, isn’t a trivial task you can accomplish with the three-step silver-bullet method some managers wish for. But with openness and the willingness to create an operating system for innovation that really changes the underlying management and performance systems, it can be done and measured. And the good news for impatient managers is: if you measure teams with the right innovation metrics, you will not only treat them more fairly, you will also expose more quickly the behaviors that are unhelpful or even toxic to innovation, such as social loafing, managing up without doing the actual work, polishing PPTs without content, and so on. And because real innovation work is hard, and being measured against the metrics mentioned in this article series makes it impossible to hide such behaviors, you can quickly sort out innovation tourists and other unhelpful individuals your teams don’t need in their projects.

Unless we determine what shall be measured and what the yardstick of measurement in an area will be, the area itself will not be seen.

Peter Drucker

I personally believe that the next few years will separate the wheat from the chaff and show which organizations are really serious about professionalizing their innovation management. That professionalization also means not burning out teams and not measuring them against standards that are either unfair or too lax. In any case, it is worth keeping an eye on the driving forces around the topic of innovation accounting, which have advanced the field enormously in recent years. Esther Gons’ and Dan Toma’s excellent book “Innovation Accounting” is highly recommended for those who want to go deeper into the topic. Kromatic’s Tristan Kromer and the folks at Strategyzer also regularly publish practical articles on the matter. It remains an exciting space to watch! What are your thoughts on this?


References

Sonalkar, N., Mabogunje, A., & Leifer, L. (2013). Developing a visual representation to characterize moment-to-moment concept generation in design teams. International Journal of Design Creativity and Innovation, 1(2), 93–108. https://doi.org/10.1080/21650349.2013.773117
Toma, D., & Gons, E. (2022). Innovation Accounting (1st edition). BIS Publishers.
Drucker, P. (1959). People and Performance. Harvard Business School Press.
Lundvall, B.-Å. (2016). Innovation System Research and Policy: Where it came from and where it might go. https://www.semanticscholar.org/paper/Innovation-System-Research-and-Policy-Where-it-came-Lundvall/8651c99aab602a59a893df750169738f50ce8ea6
Porter, M. E., & Nohria, N. (2018, July 1). What Do CEOs Actually Do? Harvard Business Review. https://hbr.org/2018/07/what-do-ceos-actually-do
Schmiedgen, J., Spille, L., Köppen, E., Rhinow, H., & Meinel, C. (2016). Measuring the Impact of Design Thinking. In H. Plattner, C. Meinel, & L. Leifer (Eds.), Design Thinking Research: Making Design Thinking Foundational (pp. 157–170). Springer International Publishing. https://doi.org/10.1007/978-3-319-19641-1_11
Taft, R., & Rossiter, J. R. (1966). The Remote Associates Test: Divergent or Convergent Thinking? Psychological Reports, 19(3_suppl), 1313–1314. https://doi.org/10.2466/pr0.1966.19.3f.1313
Nijstad, B., De Dreu, C., Rietzschel, E., & Baas, M. (2010). The Dual Pathway to Creativity Model: Creative Ideation as a Function of Flexibility and Persistence. European Review of Social Psychology, 21, 34–77. https://doi.org/10.1080/10463281003765323
Viki, T., Toma, D., & Gons, E. (2017). The Corporate Startup: How Established Companies Can Develop Successful Innovation Ecosystems (1st edition). Vakmedianet Management bv.
Spool, J. M. (2020, June 3). Why UX Outcomes Make Better Goals Than Business Outcomes. Medium. https://medium.com/creating-a-ux-strategy-playbook/why-ux-outcomes-make-better-goals-than-business-outcomes-3bacaae23510
Menning, A., Rhinow, A., & Nicolai, C. (2016). The Topic Markup Scheme and the Knowledge Handling Notation: Complementary Instruments to Measure Knowledge Creation in Design Conversations (pp. 291–307). https://doi.org/10.1007/978-3-319-40382-3_16
McGann, M., Blomkamp, E., & Lewis, J. M. (2018). The rise of public sector innovation labs: experiments in design thinking for policy. Policy Sciences, 51(3), 249–267. https://doi.org/10.1007/s11077-018-9315-7
Amplitude. (n.d.). The North Star Playbook. Amplitude. Retrieved November 4, 2020, from https://amplitude.com/north-star
Edmondson, A. (1999). Psychological Safety and Learning Behavior in Work Teams. Administrative Science Quarterly, 44(2), 350–383. https://doi.org/10.2307/2666999
Schmiedgen, J., Rhinow, H., Köppen, E., & Meinel, C. (2016). Parts without a whole?: The current state of Design Thinking practice in organizations. Universitätsverlag Potsdam.
Dweck, C. S. (2006). Mindset: The New Psychology of Success (Illustrated edition). Random House.
Polaine, A., Løvlie, L., & Reason, B. (2013). Measuring Services. In Service Design: From Insight to Implementation (1st edition, pp. 151–169). Rosenfeld Media.
Mulgan, G. (2019). Measuring our impact. Nesta. https://www.nesta.org.uk/feature/measuring-our-impact/
frog Design. (2020). The Business Value of Design - frog. https://info2.frogdesign.com/en/business-value-of-design
Edmondson, A. C. (2011). Strategies of learning from failure. Harvard Business Review, 89(4), 48–55, 137.
Mayer, S., Haskamp, T., & de Paula, D. (2021, January 5). Measuring what Counts: An Exploratory Study about the Key Challenges of Measuring Design Thinking Activities in Digital Innovation Units.
Royalty, A., Chen, H., Roth, B., & Sheppard, S. (2019). Measuring Design Thinking Practice in Context. In C. Meinel & L. Leifer (Eds.), Design Thinking Research : Looking Further: Design Thinking Beyond Solution-Fixation (pp. 61–73). Springer International Publishing. https://doi.org/10.1007/978-3-319-97082-0_4
Friedl, J. (2021, January 3). Measuring design quality with heuristics. Medium. https://jasminefriedl.medium.com/measuring-design-quality-with-heuristics-44857efa514
Watson, R. T., & Saunders, C. (2005). Managing insight velocity: The design of problem solving meetings. Business Horizons, 48(4), 285–295. https://doi.org/10.1016/j.bushor.2004.06.001
Dosi, C., Rosati, F., & Vignoli, M. (2018). Measuring Design Thinking Mindset. 1991–2002. https://doi.org/10.21278/idc.2018.0493
Croll, A., & Yoskovitz, B. (2013). Lean Analytics: Use Data to Build a Better Startup Faster. O’Reilly Media.
Kromer, T. (n.d.). Kromatic Innovation Ecosystem Booklet. Retrieved January 8, 2021, from https://kromatic.com/innovation-resources/kromatic-innovation-ecosystem-booklet
Mastrogiacomo, S., Osterwalder, A., Smith, A., & Papadakos, T. (2021). High-Impact Tools for Teams: 5 Tools to Align Team Members, Build Trust, and Get Results Fast (1st edition). Wiley.
Downe, L. (2020). Good Services: How to Design Services That Work (1st edition). BIS Publishers.
Liedtka, J. (2017). Evaluating the Impact of Design Thinking in Action. Academy of Management Proceedings, 2017(1), 10264. https://doi.org/10.5465/AMBPP.2017.177
Royalty, A., & Roth, B. (2016). Developing Design Thinking Metrics as a Driver of Creative Innovation. In H. Plattner, C. Meinel, & L. Leifer (Eds.), Design Thinking Research: Making Design Thinking Foundational (pp. 171–183). Springer International Publishing. https://doi.org/10.1007/978-3-319-19641-1_12
Obeng, E. (1995). All Change!: The Project Leader’s Secret Handbook (New Ed edition). Financial Times Prent.
Whicher, A., Raulik‐Murphy, G., & Cawood, G. (2011). Evaluating Design: Understanding the Return on Investment. Design Management Review, 22(2), 44–52. https://doi.org/10.1111/j.1948-7169.2011.00125.x
Boylston, S. (n.d.). Designing with Society: A Capabilities Approach to Design, Systems Thinking and Social Innovation. Routledge & CRC Press. Retrieved December 17, 2020, from https://www.routledge.com/Designing-with-Society-A-Capabilities-Approach-to-Design-Systems-Thinking/Boylston/p/book/9781138554337
Mabogunje, A., Sonalkar, N., & Leifer, L. (2016). Design Thinking: A New Foundational Science for Engineering. International Journal of Engineering Education, 32, 1540–1556.
Richtnér, A., Brattström, A., Frishammar, J., Björk, J., & Magnusson, M. (2018). Creating Better Innovation Measurement Practices. In When Innovation Moves at Digital Speed: Strategies and Tactics to Provoke, Sustain, and Defend Innovation in Today’s Unsettled Markets (Chapter 12). The MIT Press. https://doi.org/10.7551/mitpress/11858.001.0001
