A Framework for Creative Visualization-Opportunities Workshops
Applied visualization researchers often work closely with domain collaborators to explore new and useful applications of visualization. The early stages of collaborations are typically time consuming for all stakeholders as researchers piece together an understanding of domain challenges from disparate discussions and meetings. A number of recent projects, however, report on the use of creative visualization-opportunities (CVO) workshops to accelerate the early stages of applied work, eliciting a wealth of requirements in a few days of focused work. Yet, there is no established guidance for how to use such workshops effectively. In this paper, we present the results of a 2-year collaboration in which we analyzed the use of 17 workshops in 10 visualization contexts. The paper's primary contribution is a framework for CVO workshops that: 1) identifies a process model for using workshops; 2) describes a structure of what happens within effective workshops; 3) recommends 25 actionable guidelines for future workshops; and 4) presents an example workshop and workshop methods. The creation of this framework exemplifies the use of critical reflection to learn about visualization in practice from diverse studies and experience.
Two key challenges in the early stages of applied visualization research are to find pressing domain problems and to translate them into interesting visualization opportunities. Researchers often discover such problems through a lengthy process of interviews and observations with domain collaborators that can sometimes take months [39, 56, 80]. A number of recent projects, however, report on the use of workshops to characterize domain problems in just a few days of focused work [16, 17, 18, 35, 64, 88]. More specifically, these workshops are creative visualization-opportunities workshops (CVO workshops), in which researchers and their collaborators explore opportunities for visualization in a domain. When used effectively, such workshops reduce the time and effort needed for the early stages of applied visualization work, as noted by one participant: “The interpersonal leveling and intense revisiting of concepts made more progress in a day than we make in a year of lab meetings …[the workshop] created consensus by exposing shared user needs”.
The CVO workshops reported in the literature were derived and adapted from software requirements workshops and creative problem-solving workshops to account for the specific needs of visualization design. These adaptations were necessary because existing workshop guidance does not appropriately emphasize three characteristics fundamental to applied visualization, which we term visualization specifics: the visualization mindset of researchers and collaborators characterized by a symbiotic collaboration and a deep and changing understanding of domain challenges and relevant visualizations; the connection to visualization methodologies that include process and design decision models [62, 80]; and the use of visualization methods within workshops to focus on data analysis challenges and visualization opportunities.
The successful use of CVO workshops resulted from an ad hoc process in which researchers modified existing workshop guidance to meet the needs of their specific projects and reported the results in varying levels of detail. For example, Goodwin et al. provide rich details, but with a focus on their experience using a series of workshops in a collaboration with energy analysts. In contrast, Kerzner et al. summarize their workshop with neuroscientists in one sentence even though it profoundly influenced their research. Thus, there is currently no structured guidance about how to design, run, and analyze CVO workshops. Researchers who are interested in using such workshops must adapt and refine disparate workshop descriptions.
In this paper, we — a group of visualization and creativity researchers who have been involved with every CVO workshop reported in the literature — reflect on our collective experience and offer guidance about how and why to use CVO workshops in applied visualization. More specifically, this paper results from a 2-year international collaboration in which we applied a methodology of critically reflective practice  to perform meta-analysis of our collective experience and research outputs from conducting 17 workshops in 10 visualization contexts [16, 18, 17, 34, 35, 42, 64, 68, 69, 88], combined with a review of the workshop literature from the domains of design [3, 14, 38, 72], software engineering [27, 31, 32, 33, 47, 48, 50], and creative problem-solving [13, 19, 21, 59, 66].
This paper’s primary contribution is a framework for CVO workshops. The framework consists of: 1) a process model that identifies actions before, during, and after workshops; 2) a structure that describes what happens in the beginning, in the middle, and at the end of effective workshops; 3) a set of 25 actionable guidelines for future workshops; and 4) an example workshop and example methods for future workshops. To further enhance the actionability of the framework, we provide in Supplemental Materials (http://bit.ly/CVOWorkshops/) documents with expanded details of the example workshop, additional example methods, and 25 pitfalls we have encountered when planning, running, and analyzing CVO workshops.
We tentatively offer a secondary contribution: this work exemplifies critically reflective practice that enables us to draw upon multiple diverse studies to generate new knowledge about visualization in practice. Towards this secondary contribution we include, in Supplemental Materials, an audit trail [10, 41] of artifacts that shows how our thinking evolved over the 2-year collaboration.
In this paper, we first summarize the motivation for creating this framework and describe related work in Sec. 1 and 2. Next, we describe our workshop experience and reflective analysis methods in Sec. 3 and 4. Then, we introduce the framework in Sec. 5 – 9. After that, we discuss implications and limitations of the work in Sec. 10. We conclude with future work in Sec. 11.
1 Motivation and Background
In our experience, CVO workshops provide tremendous value to applied visualization stakeholders — researchers and the domain specialists with whom they collaborate. CVO workshops provide time for focused thinking about a collaboration, which allows stakeholders to share expertise and explore visualization opportunities. In feedback, one participant reported the workshop was “a good way to stop thinking about technical issues and try to see the big picture”.
CVO workshops can also help researchers understand analysis pipelines, work productively within organizational constraints, and efficiently use limited meeting time. As another participant said: “The structured format helped us to keep on topic and to use the short time wisely. It also helped us rapidly focus on what were the most critical needs going forward. At first I was a little hesitant, but it was spot-on and wise to implement”.
Furthermore, CVO workshops can build trust, rapport, and a feeling of co-ownership among project stakeholders. Researchers and collaborators can leave workshops feeling inspired and excited to continue a project, as reported by one participant: “I enjoyed seeing all of the information visualization ideas …very stimulating for how these might be useful in my work”.
For these reasons, we believe that CVO workshops have saved us significant time in problem characterization and task analysis compared to traditional visualization design approaches that involve one-on-one interviews and observations. What may have taken several months, we accomplished with several days of workshop preparation, execution, and analysis. In this paper we draw upon 10 years of experience using and refining workshops to propose a framework that enables others to use CVO workshops in the future.
CVO workshops are based on workshops used for software requirements and creative problem-solving. Software requirements workshops elicit specifications for large-scale systems that can be used in requirements engineering and agile development. There are many documented uses of such workshops [31, 48, 49, 50], but they do not appropriately emphasize the mindset of visualization researchers or a focus on data and analysis.
More generally, creative problem-solving workshops are used to identify and solve problems in a number of domains — many frameworks exist for such workshops [1, 13, 19, 20, 38]. Meta-analysis of these frameworks reveals common workshop characteristics that include: promoting trust and risk taking, exploring a broad space of ideas, providing time for focused work, emphasizing both problem finding and solving, and eliciting group creativity from the cross-pollination of ideas.
Existing workshop guidance, however, does not completely describe CVO workshops. The key distinguishing feature of CVO workshops is the explicit focus on visualization, which implies three visualization specifics for effective workshops and workshop guidance:
Workshops should promote a visualization mindset — the set of beliefs and attitudes held by project stakeholders, including an evolving understanding about domain challenges and visualization [54, 80] — that fosters and benefits an exploratory and visual approach to dealing with data while promoting trust and rapport among these stakeholders;
Workshops should connect to visualization methodologies, including process and design decision models [62, 80], so that workshop outputs feed into the downstream design process;
Workshops should use visualization methods that explicitly focus on data visualization and analysis by exploring visualization opportunities with the appropriate information location and task clarity.
This paper is, in part, about adopting and adapting creative problem-solving workshops to account for these visualization specifics.
2 Related Work
Workshops are commonly used in a number of fields, such as business [20, 21, 83] and education [2, 8]. Guidance from these fields, however, does not emphasize the role of workshops in a design process, which is central to applied visualization. Therefore, we focus this section on workshops as visualization design methods.
CVO workshops can be framed as a method for user-centered design, participatory design, or co-design because they involve users directly in the design process — we draw on work from these fields that has characterized design methods. Sanders et al., for example, characterize methods by their role in the design process. Biskjaer et al. analyze methods based on concrete, conceptual, and design space aspects. Vines et al. propose ways of thinking about how users are involved in design. Dove describes a framework for using data visualization in participatory workshops. A number of books also survey existing design methods [9, 38] and practices [36, 40, 74]. These resources are valuable for understanding design methods but do not account for visualization specifics such as methodologies that emphasize the critical role of data early in the design process.
CVO workshops can also be framed within existing visualization design process and decision models [51, 56, 62, 80, 85]. More specifically, CVO workshops focus on eliciting opportunities for visualization software from collaborators. They support the understand and ideate design activities or fulfill the winnow, cast, and discover stages of the design study methodology’s nine-stage framework.
A number of additional methods can be used in the early stages of applied work. Sakai and Aerts, for example, describe the use of card sorting for problem characterization. McKenna et al. summarize the use of qualitative coding, personas, and data sketches in collaboration with security analysts. Koh et al. describe workshops that demonstrate a wide range of visualizations to domain collaborators, a method that we have adapted for use in CVO workshops as described in Sec. 7.4. Roberts et al. describe a method for exploring and developing visualization ideas through structured sketching. This paper is about how to use these design methods, and others, within structured CVO workshops.
Visualization education workshops are also relevant to CVO workshops. Huron et al. describe data physicalization workshops for constructive visualization with novices. He et al. describe workshops for students to think about the relationships between domain problems and visualization designs. In contrast, we frame CVO workshops as a method for experienced researchers to pursue domain problem characterization. Nevertheless, we see opportunities for participatory methods, such as constructive visualization and sketching, to be integrated into CVO workshops.
Table 1: The eight applied collaborations in which we used workshops.

| ID | Year | Domain | Goal | Workshops | Outcome | Primary | Supporting |
| P1 | 2009 | Cartography | “Reimagining the legend as an exploratory visualization interface” | 3 | Paper | JD | * |
| P2 | 2012 | Smart Homes | Deliver insights into the role of smart homes and new business potential | 4 | Paper | SG | JD, SJ, * |
| P3 | 2012 | Human terrain | “Develop [visualization] techniques that are meaningful in HTA” | 3 | Paper | JD | * |
| P4 | 2015 | Neuroscience | Explore problem-driven multivariate graph visualization | 1 | Paper | EK | MM, * |
| P5 | 2015 | Constraint prog. | Design performance profiling methods for constraint programmers | 1 | Paper | SG | * |
| P6 | 2017 | Psychiatry | Support visual analysis of determining or associated factors of suicide | 1 | Paper | * | EK, * |
| P7 | 2017 | Genealogy | Discover opportunities to support visual genealogy analysis | 1 | — | * | EK, MM, * |
| P8 | 2017 | Biology | Support phylogenetic analysis with visualization software | 1 | In-progress | * | EK, MM, * |
Table 2: The eight CVO workshops analyzed in this paper (v = visualization researcher, c = domain collaborator, p = professional facilitator).

| ID | Theme | Facilitators | Participants | Duration (hours) |
| P1 | Explore possibilities for enhancing legends with visualizations | 1v | 3v / 5c | 6 |
| P2 | Identify future opportunities for utilising smart home data/technologies | 2v / 1p | 0v / 5c | 6 |
| P3 | Identify novel visual approaches most suitable for HTA | 1v / 1p | 7v / 6c | 9 |
| P4 | Explore shared user needs for visualization in retinal connectomics | 4v | 0v / 9c | 7 |
| P5 | Identify analysis and vis. opportunities for improved profiling of cons. prog. | 2v / 1c | 0v / 10c | 7 |
| P6 | Understand the main tasks of psychiatric researchers | 2v | 1v / 6c | 3 |
| P7 | Explore opportunities for a design study with genealogists | 1v | 3v / 7c | 3 |
| P8 | Explore opportunities for funded collaboration between vis. and bio. | 1v / 1c | 2v / 12c | 7×2 |
3 Workshop Experience and Terminology
To develop the CVO workshop framework proposed in this paper, we gathered researchers who used workshops on 3 continents over the past 10 years. Our collective experience includes 17 workshops in 10 contexts: 15 workshops in 8 applied collaborations, summarized in Table 1 and Table 2; and 2 participatory workshops at IEEE VIS that focused on creating visualizations for domain specialists [68, 69].
The ways in which we use workshops have evolved over 10 years. In three of our projects, we used a series of workshops to explore opportunities, develop and iterate on prototypes, and evaluate the resulting visualizations in collaborations with cartographers, energy analysts, and defense analysts. In three additional projects, we used a single workshop to jump-start applied collaborations with neuroscientists, constraint programmers, and psychiatrists. Recently, we used two workshops to explore opportunities for funded collaboration with genealogists and biologists.
In our meta-analysis, we focused on the workshops used in the early stages of applied work or as the first in a series of workshops. To describe these workshops, we developed the term CVO workshops because they aim to deliberately and explicitly foster creativity while exploring opportunities for applied visualization collaborations.
Focusing on CVO workshops, our experience includes the eight workshops in Table 2. Since we analyzed more data than appeared in any resulting publications, including artifacts and experiential knowledge, we refer to workshops and their projects by identifiers, e.g., P1 refers to our collaboration with cartographers. In projects where we used more than one workshop [P1–P3], the identifier corresponds to the first workshop in the series, unless otherwise specified.
To describe our experience, we developed terminology for the role of researchers involved in each project. The primary researcher is responsible for deciding to use a CVO workshop, executing it, and integrating its results into a collaboration. Alternatively, supporting researchers provide guidance and support to the primary researcher. We have been involved with projects as both primary and supporting researchers (see Table 1).
We also adopt terminology to describe CVO workshops. Workshops are composed of methods — specific, repeatable, and modular activities. The methods are designed around a theme that identifies the workshop’s central topic or purpose. The facilitators plan and guide the workshop, and the participants carry out the workshop methods. Typically the facilitators are visualization researchers and the participants are domain collaborators, but visualization researchers can participate [1, 1], and collaborators can facilitate [1, 1]. We adopted and refined this vocabulary during our reflective analysis.
4 Research Process
The contributions in this paper arise from reflection — the analysis of experiences to generate insights [5, 78]. More specifically, we applied a methodology of critically reflective practice, summarized by Thompson and Thompson as “synthesizing experience, reflection, self-awareness and critical thinking to modify or change approaches to practice.”
We analyzed our collective experience and our CVO workshop data, which consisted of documentation, artifacts, participant feedback, and research outputs. The analysis methods that we used can be described through three metaphorical lenses of critically reflective practice:
The lens of our collective experience — we explored and articulated our experiential knowledge through interviews, discussions, card sorting, affinity diagramming, observation listing, and observations-to-insights. We codified our experience, individually and collectively, in both written and diagram form. We iteratively and critically examined our ideas in light of workshop documentation and artifacts.
The lens of existing research — we compared our experiential knowledge with the workshop literature from design, software engineering, and creative problem-solving, using it to question and refine our emerging ideas.
The lens of our learners (i.e., readers) — in addition to intertwining our analysis with additional workshops, we shared drafts of the framework with visualization researchers, and we used their feedback to make the framework more actionable and consistent.
Our reflective analysis, conducted over two years, was messy and iterative. It included periods of focused analysis and writing, followed by reflection on what we had written, which spurred additional analysis and rewriting. Throughout this time, we generated diverse artifacts, including models for thinking about how to use workshops, written reflections on which methods were valuable to workshop success, and collaborative writing about the value of workshops. This paper’s Supplemental Material contains a timeline of significant events in our reflective analysis and 30 supporting documents that show how our ideas evolved into the following framework.
5 Fundamentals of the Framework
The framework proposed in this paper describes how and why to use CVO workshops. We use the term framework because what we have created provides an interpretive understanding and approach to practice instead of causal or predictive knowledge. The framework is a thinking tool to navigate the process of planning, running, and analyzing a workshop, but we note that it cannot resolve every question about workshops because the answers will vary with local experience, preference, and context. In this section, we describe a set of factors that contribute to workshop effectiveness, as well as introduce the workshop process model and structure. We intend for the framework to be complemented by existing workshop resources from outside of visualization [1, 8, 20, 21].
5.1 Tactics for Effective Workshops
Reflecting on our experience and reviewing the relevant literature [63, 66, 76, 77, 81] enabled us to identify several key factors that contribute to the effectiveness of workshops: focusing on the topic of visualization, data, and analysis, while fostering, maintaining, and potentially varying the levels of agency, collegiality, trust, interest, and challenge associated with each. We term these factors TACTICs for effective workshops:
(T)opic — the space of ideas relevant to data, visualization, and domain challenges in the context of the workshop theme.
(A)gency — the sense of stakeholder ownership in the workshop, the workshop outcomes, and the research collaboration.
(C)ollegiality — the degree to which communication and collaboration occur among stakeholders.
(T)rust — the confidence that stakeholders have in each other, the workshop, the design process, and the researchers’ expertise.
(I)nterest — the amount of attention, energy, and engagement that stakeholders devote to workshop methods.
(C)hallenge — the stakeholders’ barrier to entry for, and likelihood of success in, workshop methods.
The TACTICs are not independent, consistent, or measurable. The extent to which they are fostered depends upon the context in which they are used, including various characteristics of the workshop that are often unknown in advance, although perhaps detectable by facilitators. Yet, selecting methods to maintain appropriate levels of agency, interest, and trust — while varying levels of challenge and approaching the topic from different perspectives — likely helps workshops to have a positive influence on the mindset of stakeholders and to generate ideas that advance the project's methodology. Hence, we refer to the TACTICs throughout this framework.
5.2 Process Model and Structure
The framework proposes two models for describing how to use CVO workshops: a process model and a workshop structure. The models were adapted from the extensive literature that describes how to use workshops outside of visualization [1, 8, 13, 15, 20, 21, 66].
The process model shown in Fig. 1 (left) consists of three stages that describe the actions of using CVO workshops:
Before: define & design. Define the workshop theme and design workshop methods, creating a flexible workshop plan.
During: execute & adapt. Perform the workshop plan, adapting it to participants’ reactions in light of the TACTICs, generating workshop output as a set of artifacts and documentation.
After: analyze & act. Make sense of the workshop output and use it in the downstream design process.
Nested within the process is the CVO workshop structure — Fig. 1 (right) — that identifies key aspects of the methods used in the beginning, middle, and end of workshops:
Opening. Establish shared context and interest while promoting trust, agency, and collegiality.
Core. Promote creative thinking about the topic, potentially varying challenge to maintain interest.
Closing. Provide time for reflection on the topic and promote continued collegiality in the collaboration.
The process model and structure are closely connected as shown by the orange box in Fig. 1. As part of the workshop process, we design and execute a workshop plan. This plan follows the workshop structure because it organizes methods into the opening, core, and closing. In other words, the process is about how we use a workshop; the structure is about how methods are organized within a workshop.
We use the process model and structure to organize the following four sections of this paper. In these sections, we use paragraph-level headings to summarize 25 actionable workshop guidelines. Additionally, in Supplemental Materials we include a complementary set of 25 pitfalls that are positioned against these guidelines and the TACTICs to further enhance the actionability of the framework.
6 Before the Workshop: Define & Design
Creating an effective CVO workshop is a design problem: there is no single correct workshop, the ideal workshop depends on its intended outcomes, and the space of possible workshops is practically infinite. Accordingly, workshop design is an iterative process of defining a goal, testing solutions, evaluating their effectiveness, and improving ideas. The framework we have developed here is part of this process. In this section, we introduce four guidelines — summarized in paragraph-level headings — for workshop design.
Define the theme.
Just as design starts with defining a problem, creating a CVO workshop starts with defining its purpose, typically by articulating a concise theme. An effective theme piques interest in the workshop through a clear indication of the topic. It encourages a mindset of mutual learning among stakeholders. It also focuses on opportunities that exhibit the appropriate task clarity and information location of the design study methodology. Examples from our work emphasize visualization opportunities (e.g., “enhancing legends with visualizations”), domain challenges (e.g., “identify analysis and visualization opportunities for improved profiling of constraint programs”), or broader areas of mutual interest (e.g., “explore opportunities for a funded collaboration with phylogenetic analysts”).
Although we can improve the theme as our understanding of the domain evolves, posing a theme early can ground the design process and identify promising participants.
Recruit diverse and creative participants.
We recruit participants who have relevant knowledge and diverse perspectives about the topic. We also consider their openness to challenge and potential collegiality.
Examples of effective participants include a mix of frontline analysts, management, and support staff; practitioners, teachers, and students; or junior and senior analysts. We recommend that participants attend the workshop in person because remote participation proved distracting in one workshop. Recruiting fellow tool builders as participants should be approached with caution because their perspectives may distract from the topic — this happened in our workshop that did not result in active collaboration.
Design within constraints.
Identifying constraints can help winnow the possibilities for the workshop. Based on our experience, the following questions are particularly useful for workshop design:
Who will use the workshop results? Identifying the primary researcher early in the process is important because he or she will be responsible for the workshop and ultimately use its results. In a workshop where we did not clearly identify the primary researcher, the results went unused.
Who will help to facilitate the workshop? We have facilitated our workshops as the primary researcher, with the assistance of supporting researchers or professional workshop facilitators. Domain collaborators can also be effective facilitators, especially if the domain vocabulary is complex and time is limited [1, 1].
Where will the workshop be run? Three factors are particularly important for determining the workshop venue: a mutually convenient location, a high quality projector for visualization examples, and ample space to complete the methods. We have had success with workshops at offsite locations [1, 1], our workplaces, and our collaborators’ workplaces [1 – 1].
Pilot the methods and materials.
Piloting methods can ensure that the workshop will generate ideas relevant to the topic while maintaining appropriate levels of interest and challenge. We have piloted methods to evaluate how understandable they are [1, 1], to test whether they create results that can be used to advance visualization design methodologies [1, 1], to find mistakes in method prompts [1, 1, 1, 1], and to ensure that the materials are effective — e.g., sticky notes are the correct size and visualizations are readable on the projector.
7 Workshop Structure and Methods
This section describes guidelines for the methods used in the three phases of the CVO workshop structure (described in Sec. 5.2) — the opening, core, and closing. It concludes with a summary of an example workshop and resources for additional workshop methods.
7.1 Workshop Opening
The workshop opening communicates the goals and guidelines for participants, but it can be more than that. It can foster agency by encouraging self-expression and idea generation. It can encourage collegiality and trust by promoting open communication, acknowledging expertise, and establishing a safe co-owned environment. It can also garner interest by showing that the workshop will be useful and enjoyable. Two guidelines contribute to an effective opening.
Set the stage — engage.
CVO workshops typically open with a short introduction that reiterates the theme and establishes shared context for participants and facilitators. We have introduced workshops as “guided activities that are meant to help us understand: what would you like to do with visualization?”. We have also used graphics that summarize the goals of our project, potentially priming participants to engage with the topic of visualization.
The opening can establish principles for creativity [1, 66], potentially fostering trust and collegiality. We used the following principles in one of our workshops: 1) all ideas are valid, express and record them; 2) let everyone have their say; 3) be supportive of others; 4) instead of criticizing, create additional ideas; 5) think ‘possibility’ — not implementation; 6) speak in headlines and follow with detail; and 7) switch off all electronic devices.
Introduction presentations should be kept short to maintain interest. Passive methods, such as lectures and presentations, can discourage participation at the outset. For example, we started one workshop with a presentation on the current state of analysis tools. This presentation encouraged participants to passively listen rather than actively explore, establishing a passive mindset that we had to overcome in subsequent methods. An effective opening engages participants.
We use methods that encourage self-expression to support interpersonal leveling and to act on the creativity principles — all ideas are valid and be supportive of others. Such interpersonal methods help to establish an atmosphere of trust and collegiality among participants and facilitators. They can also provide participants with a sense of agency.
We have used interpersonal methods that ask participants to sketch ideas while suspending judgment or to introduce themselves through analogies as a potential primer for creativity (see analogy introduction in Sec. 7.4). Overall, we use interpersonal methods in the opening to engage participants and facilitators, preparing them for the workshop core.
7.2 Workshop Core
In the workshop core, we harness the active and engaged mindset of participants by encouraging them to explore a wide ideaspace before selecting the more promising ideas. The methods in the core potentially generate hundreds of sticky notes, sketches, and other artifacts. Analysis of our experience and relevant literature leads us to suggest five guidelines for an effective core.
Elicit visualization opportunities.
We select workshop methods relevant to the topic, asking participants about their current analysis challenges, limitations of existing tools, characteristics of their data, or the ways in which they would like to use visualization. This can be achieved by adding a visualization twist to existing design and workshop methods.
In one workshop, for example, we used a method that “developed user stories, considered relevant datasets, discussed alternative scenarios and sketched solutions” with our domain collaborators. In retrospect, this method added a visualization twist to a more general workshop method, user stories.
Explore, then focus.
We organize the core to first generate ideas using divergent methods that expand the ideaspace, and then to evaluate ideas using convergent methods that winnow it. Using divergent methods early in the core allows us to consider many possibilities while also promoting agency and maintaining interest. Convergent methods can then narrow the ideaspace to the more promising ideas.
Classifying methods as either divergent or convergent risks oversimplification as individual methods often include both divergent and convergent aspects. Consider our use of brainstorming during one workshop: we asked participants to record “problems and successes associated with the current clients on sticky notes” (divergent) and then to share the more interesting ideas (convergent). We classify this method as divergent because it creates ideas, despite the convergent discussion. In contrast, a convergent method may only involve grouping sticky notes from previous methods. Overall, in line with existing workshop guidance [1, 13, 21, 66], we judge methods by their intended impact on the ideaspace and organize the core with phases of divergent and convergent methods.
Create physical and visual artifacts.
We select methods by how they encourage participants to write, draw, or otherwise externalize their ideas. Externalizing ideas creates artifacts for us to analyze after the workshop. It also aids creative thinking, because expressing an idea forces the creator to elaborate it, and it promotes idea sharing that encourages collegiality.
We consider the artifact materials to be important. Sticky notes are particularly useful because they enable participants to group or rank ideas and potentially to discover emergent concepts in the ideaspace. We have used sticky notes in almost all of our workshops, often using their color to encode information about which method generated an idea, and their positions to relate, differentiate, or rank ideas. This can help establish consensus. It can also aid post-workshop analysis by recording how ideas evolved and were valued throughout the workshop. Additional materials effective for externalizing ideas include handouts with structured prompts, butcher paper, and poster boards. Using whiteboards is tempting, but ideas are easily lost if the boards are erased.
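To loosely illustrate this encoding scheme, the sketch below tabulates a handful of transcribed sticky notes, mapping color to the generating method and a recorded rank to a note's final position; the color scheme, note text, method names, and ranks are all invented for illustration, not prescribed by the framework.

```python
from collections import defaultdict

# Hypothetical transcription of sticky notes: color encodes the generating
# method, and 'rank' records each note's position after a convergent
# ranking discussion. All values are illustrative.
COLOR_TO_METHOD = {
    "yellow": "wishful thinking",
    "pink": "brainstorming",
    "green": "visualization analogies",
}

notes = [
    {"color": "yellow", "text": "see neuron connectivity", "rank": 1},
    {"color": "pink", "text": "limitations of the current tool", "rank": 3},
    {"color": "yellow", "text": "know the quality of the data", "rank": 2},
]

# Summarize which method produced the most highly ranked idea, relating
# ideas back to the methods that generated them.
best_rank = defaultdict(lambda: float("inf"))
for note in notes:
    method = COLOR_TO_METHOD[note["color"]]
    best_rank[method] = min(best_rank[method], note["rank"])

for method, rank in sorted(best_rank.items(), key=lambda kv: kv[1]):
    print(f"{method}: best idea ranked {rank}")
```

Such a tabulation is one way that color and position can survive transcription and remain available during post-workshop analysis.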
We also consider the form of ideas to be important. Effective methods create artifacts relevant to the theme and topic of visualization. This can be achieved through the use of visual language (see wishful thinking in Sec. 7.4) and by encouraging participants to sketch or draw, such as in storyboarding [1, 1, 1]. We see many opportunities to create visual artifacts using existing methods, such as sketching with data, constructive visualizations, or parallel prototyping approaches.
Balance activity with rest.
Because continuously generating or discussing ideas can be tiring for participants, we structure workshop methods to provide a balance between activity and rest. Specifically, we incorporate passive methods that provide time for incubation, the conscious and unconscious combination of ideas.
Passive methods can include short breaks with food and coffee, informal discussions over meals, or methods where participants listen to presentations. When using methods that present ideas, asking participants to record their thoughts and reactions can promote interest and maintain a feeling of agency. We have typically used passive methods in full-day workshops [1, 1, 1, 1], but we rely on breaks between methods for shorter workshops.
Mix it up.
We consider the relationships among methods to be important as we strive to balance exploration with focus and activity with rest, while also using many materials for externalizing ideas. Considering methods that vary these factors can provide different levels of challenge because, for example, methods that require drawing ideas may be more difficult than discussing ideas. Using a variety of methods may also maintain interest because participants may become bored if too much time is spent on a specific idea.
We avoid potentially jarring transitions between methods to preserve participant interest. Convergent discussions can be used to conclude individual methods by highlighting the interesting, exciting, or influential ideas. These discussions can promote collegiality by encouraging communication of ideas, agency by validating participants’ contributions, and interest in the ideas generated. Convergent discussions also highlight potentially important ideas for researchers to focus on after the workshop.
Convergent methods can also conclude the workshop core by grouping or ranking key ideas. We have used storyboarding to encourage the synthesis of ideas into a single narrative [1, 1, 1]. We have also asked participants to rank ideas, providing cues for analyzing the workshop results. Convergent methods provide a sense of validation, potentially helping to build trust among researchers and collaborators as we transition to the closing.
7.3 Workshop Closing
The workshop closing sets the tone for continued collaboration. It is an opportunity to promote collegiality by reflecting on the shared creative experience. It allows for analysis that can potentially identify the more interesting visualization opportunities. The following two guidelines apply to effective closings.
Encourage reflection for validation.
We use discussions at the end of workshops to encourage reflection, potentially providing validation to participants and generating information valuable for workshop analysis. We encourage participants to reflect on how their ideas have evolved by asking, “What do you know now that you did not know this morning?” or “What will you do differently tomorrow, given what you have learned today?”. Responses to these questions can provide validation for the time committed to the workshop. One participant, for example, reported, “I was surprised by how much overlap there was with the challenges I face in my own work and those faced by others”.
Promote continued collaboration.
We conclude the workshop by identifying the next steps of action — continuing the methodology of the collaboration. We can explain how the ideas will be used to move the collaboration forward, often with design methods as we describe in Sec. 9.
We can also ask participants for feedback about the workshop to learn more about their perceptions of visualization and to evaluate the effectiveness of workshop methods — encouraging the visualization mindset. E-mailing online surveys immediately after a workshop is effective for gathering feedback [1, 1].
7.4 Example Workshop & Methods
To illustrate the workshop structure, we include an example workshop, shown in Fig. 2. We selected this example because it has proven effective in three of our projects [1, 1, 1]. Here, we describe three methods of this workshop that we have also used successfully in additional workshops [1, 1], and we refer to the Supplemental Material for descriptions of the remaining five methods. We emphasize that this is a starting place for thinking about workshops, and encourage that methods be adopted and adapted for local context.
To explain the workshop methods, we refer to their process — the steps of execution. This process description abstracts and simplifies the methods because, during their execution, we adapt the process based on participant reactions and our own judgment of the TACTICs.
Analogy Introduction
We have used this active, interpersonal, and potentially divergent method in the workshop opening. A process of this method, shown in Fig. 2 (right, top), starts with a facilitator posing the analogy introduction prompt, e.g., “If you were to describe yourself as an animal, what would you be and why?”. The facilitators and participants then respond to the prompt in turn — expressing themselves creatively.
Because everyone responds to the eccentric prompt, this method supports interpersonal leveling that helps to develop trust and collegiality among stakeholders. Using analogy can prime participants to think creatively.
This method is simple to execute, and participants report that it has a profound impact on the workshop because of the leveling that occurs. The method helps to establish trust and to reinforce that all ideas should be accepted and explored.
A more topical alternative requires more preparation. We have asked participants to come to the workshop with an image that represents their feelings about the project. Participants have created realistic images, clip-art, and sketches to present and discuss. A visual analogy introduction can help establish the topic of visualization early in the workshop.
Wishful Thinking
We have used this divergent, active method early in the workshop core. It is based on creativity methods to generate aspirations. We tailored these methods to visualization by prompting participants with a domain scenario and asking questions: “What would you like to know? What would you like to do? What would you like to see?”
One process of this method is shown in Fig. 2 (right, middle). First, we introduce the prompt and participants answer the know/do/see questions individually on sticky notes. Next, participants share ideas in a large group to encourage collegiality and cross-pollination of ideas. Then, participants form small groups and try to build on their responses by selecting interesting ideas, assuming that they have been completed, and responding to the know/do/see questions again — increasing the challenge. Finally, we lead a convergent discussion to highlight interesting ideas and to transition to the next method.
We encourage participants to record answers to the know/do/see questions on different color sticky notes because each prompt provides information that is useful at different points in the design process. Participants describe envisaged insights that they would like to know and analysis tasks that they would like to do. Asking what participants would like to see is often more of a challenge, but it ensures that the topic of visualization is established early.
We tailor the prompt to the workshop theme and project goals. For example, we asked energy analysts about long-term goals for their project — “aspirations for the Smart Home programme…” They generated forward-thinking ideas, e.g., to better understand the value of the data. In contrast, we asked neuroscientists about their current analysis needs — “suppose you are analyzing a connectome…” They created shorter-term ideas, e.g., to see neuron connectivity.
Visualization Analogies
We have used this divergent, initially passive method later in the workshop core because it promotes incubation while allowing participants to specify visualization requirements by example. Similar to analogy-based creativity methods and the visualization awareness method, we present a curated collection of visualizations and ask participants to individually record analogies to their domain and to specify aspects of the visualizations that they like or dislike. We have used this method repeatedly, iteratively improving its process by reflecting on what worked in a number of our workshops [1 – 1, 1].
One process of this method is shown in Fig. 2 (right, bottom). First, we provide participants with paper handouts that contain a representative image of each visualization — we have encouraged participants to annotate the handouts, externalizing their ideas [1, 1, 1]. Next, we present the curated visualizations on a projector and ask participants to think independently about how each visualization could apply to their domain and record their ideas. Then, we discuss these visualizations and analogies in a large group.
We curate the example visualizations to increase interest and establish participants’ trust in our visualization expertise. We have used visualizations that we created (to show authority and credibility); those that we did not create (for diversity and to show knowledge of the field); older examples (to show depth of knowledge); challenging examples (to stretch thinking); playful examples (to support engagement and creativity); closely related examples (to make analogies less of a challenge); and unrelated examples (to promote more challenging divergent thinking).
The discussions during this method have expanded the workshop ideaspace in surprising ways, including “What does it mean for legends to move?”, “What does it mean for energy to flow?”, and “What does it mean for neurons to rhyme?”. Although this method is primarily passive, participants report that it is engaging and inspiring to see the possibilities of visualization and think about how such visualizations apply to their domain.
Additional Methods & Resources
We introduce the example workshop and methods as starting points for future workshops. Yet, the workshop design space is practically infinite, and workshop design should be approached with creativity in mind.
To help researchers navigate the design space, our Supplemental Material contains a list of 15 example methods that we have used or would consider using in future workshops. For these methods, we describe their process, their influence on the workshop ideaspace, their level of activity, and their potential impact on the TACTICs for effective workshops.
We have also found other resources particularly useful while designing workshops. These include books [1, 20, 21, 25, 38, 58] and research papers [55, 56, 71]. Although these resources target a range of domains outside of visualization, we tailor the workshop methods such that they encourage a visualization mindset and focus on the topic of visualization opportunities.
8 During the Workshop: Execute & Adapt
Continuing the CVO workshop process model shown in Fig. 1, we execute the workshop plan. This section proposes five guidelines for workshop execution.
Prepare to execute.
We prepare for the workshop in three ways: resolving details, reviewing how to facilitate effectively, and checking the venue. We encourage researchers to prepare for future workshops in the same ways.
We prepare by resolving many details, such as inviting participants, reserving the venue, ordering snacks for breaks, and making arrangements for lunch. Brooks-Harris and Stock-Ward summarize many practical details that should be considered in preparing for execution. Our additional advice is to promote the visualization mindset in workshop preparation and execution.
We prepare by reviewing principles of effective facilitation, such as acting professionally, demonstrating acceptance, providing encouragement, and using humor [1, 8, 20, 21, 83]. We also assess our knowledge of the domain because, as facilitators, we will need to lead discussions. Effectively leading discussions can increase collegiality and trust between stakeholders as participants can feel that their ideas are valued and understood. In cases where we lacked domain knowledge, we recruited collaborators to serve as facilitators [1, 1].
We also prepare by checking the venue for necessary supplies, such as a high-quality projector, an Internet connection (if needed), and ample space for group activity. Within the venue, we arrange the furniture to promote a feeling of co-ownership and to encourage agency — a semi-circle seating arrangement works well for this. A mistake in one of our workshops was to have a facilitator use a podium, which implied a hierarchy between facilitators and participants, hindering collegiality.
Remove distractions.
Workshops provide a time to step away from normal responsibilities and to focus on the topic. Accordingly, participants and facilitators should focus on the workshop without distractions, such as leaving for a meeting.
Communicating with people outside of the workshop — e.g., through e-mail — commonly distracts participants and facilitators. It should be discouraged in the workshop opening (e.g., switch off all electronic devices). Such principles, however, should be justified to participants in the workshop opening. Also, facilitators should lead by example, or they risk eroding trust and collegiality.
Guide participants, gently.
While starting execution, the workshop opening can establish an atmosphere in which participants take initiative in completing methods. It is, however, sometimes necessary to redirect the participants in order to stay focused on the topic. Conversations that deviate from the workshop theme should be redirected. In one workshop, participants were allowed to discuss ideas more freely, and they reported in feedback that, “We had a tendency to get distracted [during discussions].” In a later workshop, we more confidently guided discussions, and participants reported “We were guided and kept from going too far off track …this was very effective.”
However, guiding participants requires judgment to determine whether a conversation is likely to be fruitful. It also requires us to be sensitive to the TACTICs — e.g., how would redirecting this conversation influence collegiality or agency? Redirection can be jolting and can contradict some of the guidelines (e.g., all ideas are valid). We can prepare participants for redirection with another guideline during the workshop opening: Facilitators may keep you on track gently, so please be sensitive to their guidance.
As we guide participants to stay on topic, it is important to be flexible in facilitation. For example, we may spend more time than initially planned on fruitful methods or cut short methods that bore participants.
Following this guideline can also blur the distinction between participants and facilitators. In one workshop, participants proposed a method that was more useful than what was planned. Thus, they became facilitators for this part of the workshop, which reinforced agency and maintained the interest of all stakeholders in the project. In the future, we may explore ways to plan this type of interaction, perhaps encouraging participants to create their own methods.
Adapt to the unexpected.
As we guide the workshop, we interpret group dynamics and adapt methods to the changing situation. We can be forced to adapt for many reasons, such as a failing method (nobody feels like an animal this morning; sticky notes don’t stick); a loss of interest (there is no energy; the room is too hot; we had a tough away day yesterday); a lack of agency (some participants dominate some tasks); or an equipment failure (the projector does not work; there is no WiFi connection to present online demos). Designing the workshop with alternative methods in mind — perhaps with varying degrees of challenge — can ensure that workshop time is used effectively.
Record ideas collectively.
Remember: conversations are ephemeral and anything not written down will likely be forgotten. We therefore encourage facilitators and participants to document ideas with context for later analysis. Selecting methods to create physical artifacts can help with recording ideas. As described in Sec. 7, externalizing ideas on sticky notes and structured prompts has been effective in our workshops and addresses the visualization mindset.
We are uncertain about the use of audio recording to capture workshop ideas. Although it can be useful for shorter workshops, it can require considerable time to transcribe before analysis. Also, recording audio effectively can be challenging as participants move around during the workshop.
It can be useful to ensure that facilitators know that they are expected to help document ideas. A pilot workshop can help with this. In at least one of our projects, a pilot workshop may have reduced the note-taking pressure on the primary researcher by setting clear expectations that all facilitators should help take notes.
9 After the Workshop: Analyze & Act
After the CVO workshop, we analyze its output and use the results of that analysis to influence the ongoing collaboration. Here, we describe five guidelines for this analysis and action.
Allocate time for analysis — soon.
Effective CVO workshops generate rich and inspiring artifacts that can include hundreds of sticky notes, posters, sketches, and other documents. The exact output depends on the methods used in the workshop. Piloting methods can help prepare researchers for the analysis. Regardless, making sense of this output is labor intensive, often requiring more time than the workshop itself. Thus, it is important that we allocate time for analysis, ideally within a day of the workshop, so that we can analyze the workshop output while the experiences are still fresh in our memory.
Create a corpus.
We usually start analysis by creating a digital corpus of the CVO workshop output. We type or photograph the artifacts, organizing ideas into digital documents or spreadsheets. Through this process, we become familiar with key ideas contained in the artifacts. The corpus also preserves and organizes the artifacts, potentially allowing us to enlist diverse stakeholders — such as facilitators and collaborators — in analysis. This can help in clarifying ambiguous ideas or adding context to seemingly incomplete ideas.
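As a minimal sketch of such a corpus, the snippet below writes one record per artifact to a CSV file; the field names, file name, and example entries are hypothetical, and a shared spreadsheet serves the same purpose.

```python
import csv

# Hypothetical digital corpus: one record per workshop artifact. The
# fields are illustrative, not prescribed by the framework.
artifacts = [
    {"id": 1, "method": "wishful thinking", "material": "sticky note",
     "text": "understand the value of the data", "theme": ""},
    {"id": 2, "method": "visualization analogies", "material": "handout",
     "text": "what does it mean for neurons to rhyme?", "theme": ""},
]

with open("workshop_corpus.csv", "w", newline="") as f:
    writer = csv.DictWriter(
        f, fieldnames=["id", "method", "material", "text", "theme"])
    writer.writeheader()
    writer.writerows(artifacts)

# Collaborators can later fill in the empty 'theme' column while grouping
# ideas during analysis.
```

Keeping the generating method and material with each record preserves the context that the physical artifacts carried.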
Analyze with an open mind.
Because the ideas in the workshop output will vary among projects, there are many ways to analyze this corpus of artifacts. We have used qualitative analysis methods — open coding, mindmapping, and other less formal processes — to group artifacts into common themes or tasks [1, 1 – 1]. Quantitative analysis methods should be approached with caution as the frequency of an idea provides little information about its potential importance.
We have ranked the themes and tasks that we discovered in analysis according to various criteria, including novelty, ease of development, potential impact on the domain, and relevance to the project [1, 1– 1]. In other cases [1, 1], workshop methods generated specific requirements, tasks, or scenarios that could be edited for clarity and directly integrated into the design process.
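One simple way to operationalize this kind of ranking is to score each discovered theme against the criteria named above; in the sketch below, the themes, 1-5 scores, and equal weighting of criteria are all hypothetical, and in practice our rankings emerged from discussion rather than arithmetic.

```python
# Score each hypothetical theme against four criteria (novelty, ease of
# development, domain impact, project relevance) on a 1-5 scale, then
# rank themes by total score. Equal weighting is an assumption.
themes = {
    "graph connectivity":    [4, 3, 5, 5],
    "data quality overview": [2, 5, 3, 4],
    "temporal comparison":   [3, 2, 4, 3],
}

ranked = sorted(themes.items(), key=lambda kv: sum(kv[1]), reverse=True)
for theme, scores in ranked:
    print(f"{theme}: total score {sum(scores)}")
```

Even a rough scoring like this can make the trade-offs among candidate themes explicit for discussion with collaborators.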
We encourage approaching analysis with an open mind because of the many ways to make sense of the workshop data, including some approaches that we may not yet have considered.
Embrace results in the visualization design process.
Similarly, CVO workshop results can be integrated into visualization methodologies and processes in many ways. We have, for example, run additional workshops that explored the possibilities for visualization designs [1, 1]. We have applied traditional user-centered design methods, such as interviews and contextual inquiry, to better understand collaborators’ tasks that emerged from the workshop. We have created prototypes of varying fidelity, from sketches to functioning software [1 – 1], and we have identified key aims in proposals for funded collaboration.
In all of these cases, our actions were based on the reasons why we ran the workshops, and the workshop results profoundly influenced the direction of our collaborations. For example, in our collaboration with neuroscientists, the workshop helped us focus on graph connectivity, a topic that we were able to explore with technology probes and prototypes of increasing fidelity, ultimately resulting in new visualization tools and techniques.
Revisit, reflect, and report on the workshop.
The CVO workshop output is a trove of information that can be revisited throughout (and even beyond) the project. It can be used to document how ideas evolve throughout applied collaborations. It can also be used to evaluate and validate design decisions by demonstrating that any resulting software fulfills analysis needs that were identified in the workshop data [1 – 1]. Revisiting workshop output repeatedly throughout a project can continually inspire new ideas.
In our experience creating this paper, revisiting output from our own workshops allowed us to analyze how and why to use CVO workshops. We encourage researchers to reflect and report on their experiences using CVO workshops, the ways in which workshops influence collaborations, and ideas for future workshops. We hope that this framework provides a starting point for research into these topics.
10 Discussion
This section discusses implications and limitations of CVO workshops and the research methodology of critically reflective practice.
10.1 Limitations of CVO Workshops
Our experience across diverse domains — from cartography to neuroscience — provides evidence that CVO workshops are a valuable and general method for fostering the visualization mindset while creating artifacts that advance visualization methodologies. We argue that they achieve these goals through the use of methods that appropriately emphasize the topic of visualization opportunities while accounting for (inter)personal factors, including agency, collegiality, challenge, interest, and trust.
Yet, workshops may not be appropriate in some scenarios. Because using workshops requires researchers to ask interesting questions and potentially lead discussions about their collaborators’ domain, we caution against using workshops as the first method in a project. Traditional user-centered approaches should be used first to learn domain vocabulary and explore the feasibility of collaboration. In the project that did not result in ongoing collaboration, we lacked the domain knowledge needed to effectively design the workshop. Also, our collaborators were too busy to meet with us before the workshop, which should have been a warning about the nature of the project. Accordingly, we recommend that researchers evaluate the preconditions of design studies in projects where they are considering workshops.
We also recognize that workshops may not be well received by all of the stakeholders. In a full-day workshop, one participant reported that “Overall, it was good, but a bit long and slightly repetitive.” Similarly, after another full-day workshop, one participant said “There was too much time spent expanding and not enough focus …discussions were too shallow and nonspecific.” Nevertheless, both workshops were generally well received by stakeholders as they allowed us to explore a broad space of visualization opportunities. We can, however, improve future workshops by ensuring that the methods are closely related to the topic and that we facilitate workshops in a way that provides appropriate agency to all of the stakeholders.
More generally, whether workshops can enhance creativity is an open question [63, 77]. Creativity is a complex phenomenon studied from many perspectives, including design, psychology, sociology, and biology. The results of several controlled experiments indicate that group-based methods can reduce creativity [4, 60]. Yet, critics of these studies argue that they rely on contrived metrics and lack ecological validity [23, 53]. Experimentally testing the relationship between workshops and creativity is beyond the scope of this paper. Instead, we focus on understanding and communicating how we use CVO workshops in applied collaborations.
10.2 Critically Reflective Practice
Throughout this project, we wrestled with a fundamental question: how can we rigorously learn from our diverse, collective experience? We first examined measurable attributes of workshops, such as their length, number of participants, and quantity of ideas generated. However, our workshops were conducted over 10 years in applied settings with no experimental controls. More importantly, it is difficult, if not impossible, to measure how ideas influence collaborations. Quantitative analysis, we decided, would not produce useful knowledge about how to use CVO workshops.
We also considered qualitative research methodologies and methods, such as grounded theory and thematic analysis. These approaches focus on extracting meaning from externalized data, but the most meaningful and useful information about workshops resided in our collective, experiential knowledge. We therefore abandoned analysis methods that ignore (or seek to suppress) the role of experience in knowledge generation.
We found critically reflective practice to be an appropriate approach, providing a methodology to learn from the analysis of experience, documentation, and existing theory, while allowing for the use of additional analysis methods [7, 84]. Due to the nature of reflection, however, the framework is not exhaustive, predictive, or objective. Nevertheless, it is consistent with our experience, grounded in existing theory, and, we argue, useful for future visualization research.
Yet, the use of reflective practice may raise questions about the validity of this work. After all, can the framework be validated without experimental data? We emphasize our choice of the term framework because we intend for it to be evaluated by whether it provides an interpretive understanding of CVO workshops. Our position is that it achieves this goal because it enabled us to learn from our experience using workshops on 3 continents over the past 10 years. For example, we used the framework to identify and organize 25 pitfalls to avoid in future workshops — they are described in the Supplemental Material. This framework, however, is only a snapshot of our current understanding of CVO workshops, which will continue to evolve with additional research, practice, and reflection.
Given that this work results from the subjective analysis of our experience, we recognize that there could also be questions about its trustworthiness. Therefore, to increase the trustworthiness of our results, we provide an audit trail [10, 41] of our work that contains a timeline of our analysis and our experience as well as diverse artifacts, including comparative analysis of our workshops, presentations outlining the framework, early written drafts of our framework, and structured written reflection to elicit ideas from all of this paper’s coauthors. This audit trail, in Supplemental Material, summarizes and includes 30 of the reflective artifacts, culled from the original set to protect the privacy of internal discussions and confidential materials from our domain collaborators.
In future reflective projects, we plan to establish guidelines that encourage transparency of reflective artifacts through mechanisms to flag documents as on- or off-the-record. Because our research and meta-analysis would have been impossible without well-preserved documentation, we hope that the audit trail inspires future thinking on how to document and preserve the decisions in visualization collaborations. We put forth both the audit trail and our documented use of critically reflective practice as secondary contributions.
11 Conclusion and Future Work
This paper contributes a framework for using workshops in the early stages of applied visualization research. The framework consists of two models for CVO workshops — a process model and a workshop structure. The framework also includes 25 actionable guidelines for future workshops and a validated example workshop.
We support the framework with Supplemental Material that includes extended details about the example workshop, 15 additional example workshop methods, 25 pitfalls to avoid in future workshops, and an analysis timeline and audit trail documenting how we developed the framework during a 2-year reflective collaboration. We hope that this framework inspires others to use and report on CVO workshops in applied visualization research.
Further thinking on the framework reveals opportunities for developing CVO workshop methods that emphasize the visualization mindset. For example, inspired by the Dear Data project, we could ask participants to create graphics that reveal something about their daily life in the week before the workshop. The Dear Data Postcard Kit offers guidance and materials for creating data visualizations about personal experiences, which could be adopted in CVO workshops.
We also hope to better understand the role of data in CVO workshops. Visualization methodologies stress the importance of using real data early in collaborative projects [43, 80]. Our workshops, however, have focused participants on their perceptions of data rather than on real data because working with data is time consuming and unpredictable. In some projects, we incorporated data into the design process by using a series of workshops spaced over weeks or months, providing time for developers to design prototypes between workshops [1 – 1]. This development between workshops was expensive in terms of time and effort. But time moves on, and we may be able to reliably use data in workshops with new technologies and techniques, e.g., visualization design tools, declarative visualization languages, constructive visualization, and sketching.
Additionally, this paper focused on workshops that elicit visualization opportunities in the early stages of applied work. Exploring how the framework could be influenced by, and extended for, workshops at other stages of applied work (including the creation and analysis of prototypes, the exploration of data, and the deployment, training, and use of completed systems) may open up further opportunities for using creativity in visualization design and research.
We are grateful to the participants, facilitators, and fellow researchers in all of our workshops. We thank the following people for their feedback and contributions to this work: the anonymous reviewers, Graham Dove, Tim Dwyer, Peter Hoghton, Christine Pickett, David Rogers, Francesca Samsel, members of the Vis Design Lab at the University of Utah, and members of the giCentre at City, University of London. This work was supported in part by NSF Grant IIS-1350896.
-  Creative Problem-Solving Resource Guide. Creative Education Foundation, Scituate, MA, USA, 2015.
-  L. W. Anderson, D. R. Krathwohl, P. W. Airasian, K. A. Cruikshank, R. E. Mayer, P. R. Pintrich, J. Raths, and M. C. Wittrock. A Taxonomy for Learning, Teaching, and Assessing: A Revision of Bloom’s Taxonomy of Educational Objectives, Abridged Edition. Pearson, 2000.
-  M. M. Biskjaer, P. Dalsgaard, and K. Halskov. Understanding creativity methods in design. In Proc. Conf. Designing Interactive Syst., pages 839–851. ACM SIGCHI, 2017.
-  T. J. Bouchard. Personality, problem-solving procedure, and performance in small groups. J. Appl. Psychology, 53(1):1–29, 1969.
-  D. Boud, R. Keogh, and D. Walker. Reflection: Turning Experience into Learning. Routledge Taylor and Francis Group, London, UK, 1985.
-  V. Braun and V. Clarke. Using thematic analysis in psychology. Qualitative Res. Psychology, 3(2):77–101, 2006.
-  S. Brookfield. Critically reflective practice. J. of Continuing Edu. in the Health Professions, 18(4):197–205, 1998.
-  J. E. Brooks-Harris and S. R. Stock-Ward. Workshops: Designing and Facilitating Experiential Learning. SAGE Publications, Inc, Thousand Oaks, CA, USA, 1999.
-  B. Buxton. Sketching User Experiences: Getting the Design Right and the Right Design. Morgan Kaufmann, San Francisco, CA, USA, 2010.
-  M. Carcary. The research audit trail: Enhancing trustworthiness in qualitative inquiry. The Electron. J. of Bus. Res. Methods, 7(1), 2009.
-  J. Corbin and A. Strauss. Grounded theory research: Procedures, canons, and evaluative criteria. Qualitative Sociology, 13(1):3–21, 1990.
-  M. Crotty. The Foundations of Social Research. SAGE Publications, Inc, London, UK, 1998.
-  E. de Bono. Lateral Thinking for Management. Pelican Books, Middlesex, UK, 1983.
-  G. Dove and S. Jones. Using data to stimulate creative thinking in the design of new products and services. In Proc. Conf. Designing Interactive Syst., pages 443–452. ACM SIGCHI, 2014.
-  G. Dove, S. J. Abildgaard, M. M. Biskjaer, N. B. Hansen, B. T. Christensen, and K. Halskov. Grouping notes through nodes: The functions of post-it notes in design team cognition. In Des. Thinking Res. Symp., Copenhagen Business School, 2016.
-  J. Dykes, J. Wood, and A. Slingsby. Rethinking map legends with visualization. IEEE Trans. Vis. Comput. Graphics, 16(6):890–899, 2010.
-  S. Goodwin, J. Dykes, S. Jones, I. Dillingham, G. Dove, D. Allison, A. Kachkaev, A. Slingsby, and J. Wood. Creative user-centered design for energy analysts and modelers. IEEE Trans. Vis. Comput. Graphics, 19(12):2516–2525, 2013.
-  S. Goodwin, C. Mears, T. Dwyer, M. Garcia de la Banda, G. Tack, and M. Wallace. What do constraint programming users want to see? Exploring the role of visualisation in profiling of models and search. IEEE Trans. Vis. Comput. Graphics, 23(1):281–290, 2016.
-  W. J. J. Gordon. Synectics: The Development of Creative Capacity. Harper and Row, New York, NY, USA, 1961.
-  D. Gray, J. Macanufo, and S. Brown. Gamestorming: A Playbook for Innovators, Rulebreakers, and Changemakers. O’Reilly Media, Sebastopol, CA, USA, 2010.
-  P. Hamilton. The Workshop Book: How to Design and Lead Successful Workshops. FT Press, Upper Saddle River, NJ, USA, 2016.
-  S. He and E. Adar. VizItCards: A card-based toolkit for infovis design education. IEEE Trans. Vis. Comput. Graphics, 23(1):561–570, 2017.
-  T. Hewett, M. Czerwinski, M. Terry, J. Nunamaker, L. Candy, B. Kules, and E. Sylvan. Creativity support tool evaluation methods and metrics. In NSF Workshop Report on Creativity Support Tools, 2005.
-  M. J. Hicks. Problem Solving and Decision Making: Hard, Soft, and Creative Approaches. Thomson Learning, London, UK, 2004.
-  L. Hohmann. Innovation Games: Creating Breakthrough Products Through Collaborative Play. Addison-Wesley, Boston, MA, USA, 2007.
-  B. Hollis and N. Maiden. Extending agile processes with creativity techniques. IEEE Software, 30(5):78–84, 2013.
-  J. Horkoff, N. Maiden, and J. Lockerbie. Creativity and goal modeling for software requirements engineering. In Proc. Conf. Creativity and Cognition, pages 165–168. ACM, 2015.
-  S. Huron, S. Carpendale, J. Boy, and J. D. Fekete. Using VisKit: A manual for running a constructive visualization workshop. In Pedagogy of Data Vis. Workshop at IEEE Vis, 2016.
-  S. Huron, S. Carpendale, A. Thudt, A. Tang, and M. Mauerer. Constructive visualization. In Proc. Conf. Designing Interactive Syst., pages 433–442. ACM SIGCHI, 2014.
-  Y. Jabareen. Building a conceptual framework: Philosophy, definitions, and procedure. Intern. J. of Qualitative Methods, 8(4):49–62, 2008.
-  S. Jones, P. Lynch, N. Maiden, and S. Lindstaedt. Use and influence of creative ideas and requirements for a work-integrated learning system. In Int. Requirements Eng. Conf., pages 289–294. IEEE, 2008.
-  S. Jones and N. Maiden. RESCUE: An integrated method for specifying requirements for complex socio-technical systems. In J. L. Mate and A. Silva, editors, Requirements Engineering for Sociotechnical Systems, pages 245–265. Information Resources Press, Arlington, VA, USA, 2005.
-  S. Jones, N. Maiden, and K. Karlsen. Creativity in the specification of large-scale socio-technical systems. In Conf. Creative Inventions, Innovations and Everyday Des. HCI, 2007.
-  E. Kerzner, A. Lex, and M. Meyer. Utah population database workshop (workshop, University of Utah). unpublished, 2017.
-  E. Kerzner, A. Lex, T. Urness, C. L. Sigulinsky, B. W. Jones, R. E. Marc, and M. Meyer. Graffinity: Visualizing connectivity in large graphs. Comput. Graph. Forum, 34(3):251–260, 2017.
-  J. Knapp, J. Zeratsky, and B. Kowitz. Sprint: How to Solve Big Problems and Test New Ideas in Just Five Days. Simon & Schuster, New York, NY, USA, 2016.
-  L. C. Koh, A. Slingsby, J. Dykes, and T. S. Kam. Developing and applying a user-centered model for the design and implementation of information visualization tools. In Proc. Int. Conf. Inform. Vis., pages 90–95. IEEE, 2011.
-  V. Kumar and V. LaConte. 101 Design Methods: A Structured Approach to Driving Innovation in Your Organization. Wiley, San Francisco, CA, USA, 2012.
-  H. Lam, E. Bertini, P. Isenberg, and C. Plaisant. Empirical studies in information visualization: Seven scenarios. IEEE Trans. Vis. Comput. Graphics, 18(9):1520–1536, 2012.
-  B. Laurel, editor. Design Research: Methods and Perspectives. MIT Press, Cambridge, MA, USA, 2003.
-  Y. S. Lincoln and E. Guba. Naturalistic Inquiry. SAGE Publications, Inc, Thousand Oaks, CA, USA, 1985.
-  C. Lisle, E. Kerzner, A. Lex, and M. Meyer. Arbor summit workshop (workshop, University of Utah). unpublished, 2017.
-  D. Lloyd and J. Dykes. Human-centered approaches in geovisualization design: Investigating multiple methods through a long-term case study. IEEE Trans. Vis. Comput. Graphics, 17(12):2498–2507, 2011.
-  T. I. Lubart. Creativity across cultures. In R. J. Sternberg, editor, Handbook of Creativity, pages 339–350. Cambridge University Press, Cambridge, UK, 1999.
-  G. Lupi and S. Posavec. Dear Data: The Story of a Friendship in Fifty-Two Postcards. Penguin, London, UK, 2016.
-  G. Lupi and S. Posavec. Dear Data Postcard Kit: For Two Friends to Draw and Share (Postcards). Princeton Architectural Press, New York City, NY, USA, 2017.
-  N. Maiden, S. Jones, K. Karlsen, R. Neill, K. Zachos, and A. Milne. Requirements engineering as creative problem solving: A research agenda for idea finding. In Int. Requirements Eng. Conf., pages 57–66. IEEE, 2010.
-  N. Maiden, S. Manning, S. Robertson, and J. Greenwood. Integrating creativity workshops into structured requirements processes. In Proc. Conf. Designing Interactive Syst., pages 113–122. ACM SIGCHI, 2004.
-  N. Maiden, C. Ncube, and S. Robertson. Can requirements be creative? Experiences with an enhanced air space management system. In Int. Conf. Software Eng., pages 632–641. IEEE, 2007.
-  N. Maiden and S. Robertson. Developing use cases and scenarios in the requirements process. In Proc. Intern. Conf. Software Eng., pages 561–570. ACM, 2005.
-  G. E. Marai. Activity-centered domain characterization for problem-driven scientific visualization. IEEE Trans. Vis. Comput. Graphics, 24(1):913–922, 2018.
-  C. Martindale. Biological bases of creativity. In R. J. Sternberg, editor, Handbook of Creativity, pages 137–152. Cambridge University Press, Cambridge, UK, 1999.
-  R. Mayer. Fifty years of creativity research. In R. J. Sternberg, editor, Handbook of Creativity, pages 449–460. Cambridge University Press, Cambridge, UK, 1999.
-  N. McCurdy, J. Dykes, and M. Meyer. Action design research and visualization design. In Proc. Workshop on Beyond Time and Errors on Novel Evaluation Methods for Vis. (BELIV), pages 10–18. ACM, 2016.
-  E. McFadzean. The creativity continuum: Towards a classification of creative problem solving techniques. J. of Creativity and Innovation Manage., 7(3):131–139, 1998.
-  S. McKenna, D. Mazur, J. Agutter, and M. Meyer. Design activity framework for visualization design. IEEE Trans. Vis. Comput. Graphics, 20(12):2191–2200, 2014.
-  S. McKenna, D. Staheli, and M. Meyer. Unlocking user-centered design methods for building cyber security visualizations. In IEEE Symp. Vis. for Cyber Security (VizSec), 2015.
-  M. Michalko. Thinkertoys: A Handbook for Creative-Thinking Techniques. Ten Speed Press, Emeryville, CA, USA, 2006.
-  W. C. Miller. The Creative Edge: Fostering Innovation Where You Work. Basic Books, New York City, NY, USA, 1989.
-  B. Mullen, C. Johnson, and E. Salas. Productivity loss in brainstorming groups: A meta-analytic integration. Basic and Appl. Social Psychology, 12(1):3–23, 1991.
-  M. Muller and S. Kuhn. Participatory design. Commun. ACM, 36(6):24–28, 1993.
-  T. Munzner. A nested model for visualization design and validation. IEEE Trans. Vis. Comput. Graphics, 15(6):921–928, 2009.
-  R. S. Nickerson. Enhancing creativity. In R. J. Sternberg, editor, Handbook of Creativity, pages 392–430. Cambridge University Press, Cambridge, UK, 1999.
-  C. Nobre, N. Gehlenborg, H. Coon, and A. Lex. Lineage: Visualizing multivariate clinical data in genealogy graphs. IEEE Trans. Vis. Comput. Graphics, to be published, 2018.
-  D. A. Norman and S. W. Draper. User Centered System Design; New Perspectives on Human-Computer Interaction. L. Erlbaum Associates Inc, Hillsdale, NJ, USA, 1986.
-  A. Osborn. Applied Imagination: Principles and Procedures of Creative Problem Solving. Charles Scribner’s Sons, New York, NY, USA, 1953.
-  J. C. Roberts, C. Headleand, and P. D. Ritsos. Sketching designs using the five design-sheet methodology. IEEE Trans. Vis. Comput. Graphics, 22(1):419–428, 2016.
-  D. H. Rogers, C. Aragon, D. Keefe, E. Kerzner, N. McCurdy, M. Meyer, and F. Samsel. Discovery Jam. In IEEE Vis (Workshops), 2016.
-  D. H. Rogers, F. Samsel, C. Aragon, D. F. Keefe, N. McCurdy, E. Kerzner, and M. Meyer. Discovery Jam. In IEEE Vis (Workshops), 2017.
-  R. Sakai and J. Aerts. Card sorting techniques for domain characterization in problem-driven visualization research. In Eurographics Conf. Vis. (Short Papers). Eurographics, 2015.
-  E. B.-N. Sanders. Information, inspiration, and co-creation. In Conf. European Academy of Des., 2005.
-  E. B.-N. Sanders, E. Brandt, and T. Binder. A framework for organizing the tools and techniques of participatory design. In Proc. Participatory Des. Conf., pages 195–198, 2010.
-  E. B.-N. Sanders and P. J. Stappers. Co-creation and the new landscapes of design. CoDesign: Int. J. of CoCreation in Des. and the Arts, 4(1):5–18, 2008.
-  L. Sanders and P. J. Stappers. Convivial Toolbox: Generative Research for the Front End of Design. BIS Publishers, Amsterdam, The Netherlands, 2013.
-  A. Satyanarayan, D. Moritz, K. Wongsuphasawat, and J. Heer. Vega-Lite: A grammar of interactive graphics. IEEE Trans. Vis. Comput. Graphics, 23(1):341–350, 2017.
-  K. R. Sawyer. Group Creativity: Music, Theater, Collaboration. Lawrence Erlbaum Associates, Mahwah, NJ, USA, 2003.
-  K. R. Sawyer. Explaining Creativity - the Science of Human Innovation. Oxford University Press, New York, NY, USA, 2006.
-  D. A. Schon. The Reflective Practitioner. Basic Books, New York City, NY, USA, 1988.
-  M. Sedlmair, P. Isenberg, D. Baur, and A. Butz. Evaluating information visualization in large companies: Challenges, experiences and recommendations. In Proc. Workshop on Beyond Time and Errors on Novel Evaluation Methods for Vis. (BELIV), pages 79–86. ACM, 2010.
-  M. Sedlmair, M. Meyer, and T. Munzner. Design study methodology: Reflections from the trenches and the stacks. IEEE Trans. Vis. Comput. Graphics, 18(12):2431–2440, 2012.
-  B. Shneiderman, G. Fischer, M. Czerwinski, and B. Myers. NSF Workshop Report on Creativity Support Tools. National Science Foundation, 2005.
-  B. Shneiderman and C. Plaisant. Strategies for evaluating information visualization tools. In Proc. Workshop on Beyond Time and Errors on Novel Evaluation Methods for Vis. (BELIV), pages 1–7. ACM, 2006.
-  R. B. Stanfield. The Workshop Book: From Individual Creativity to Group Action. New Society Publishers, Gabriola Island, BC, Canada, 2002.
-  S. Thompson and N. Thompson. The Critically Reflective Practitioner. Palgrave Macmillan, New York, NY, USA, 2008.
-  M. Tory and T. Moller. Human factors in visualization research. IEEE Trans. Vis. Comput. Graphics, 10(1):72–82, 2004.
-  J. Vines, R. Clarke, and P. Wright. Configuring participation: On how we involve people in design. In Proc. SIGCHI Conf. Human Factors in Comput. Syst. (CHI), 2013.
-  R. S. Vosko. Where we learn shapes our learning. New Directions for Adult and Continuing Edu., 50(Summer):23–32, 1991.
-  R. Walker, A. Slingsby, J. Dykes, K. Xu, J. Wood, P. H. Nguyen, D. Stephens, B. L. W. Wong, and Y. Zheng. An extensible framework for provenance in human terrain visual analytics. IEEE Trans. Vis. Comput. Graphics, 19(12):2139–2148, 2013.
-  J. Walny, S. Huron, and S. Carpendale. An exploratory study of data sketching for visual representation. Comput. Graph. Forum, 34(3):231–240, 2015.
-  K. Wongsuphasawat, D. Moritz, J. Mackinlay, B. Howe, and J. Heer. Voyager: Exploratory analysis via faceted browsing of visualization recommendations. IEEE Trans. Vis. Comput. Graphics, 22(1):649–658, 2016.