• KNAER-RECRAE Highlight:


    The Knowledge Network for Student Well-Being sponsored the 2017 Ontario Healthy Schools Coalition Conference held April 4-5 in Toronto; the theme was Ontario’s Well-being Strategy for Education, and what it will mean for the province’s schools.

    The Knowledge Network for Student Well-Being was supported by six partner organizations: the Hamilton-Wentworth District School Board, the Offord Centre for Child Studies at McMaster University, School Mental Health ASSIST, PREVnet, the Social Planning Network of Ontario, and the Ontario Healthy Schools Coalition.

    Don Buchanan, a Knowledge Mobilization Officer and the Knowledge Network Facilitator with E-Best (Evidence-Based Education and Services Team at the HWDSB), presented the successful proposal for the Knowledge Network for Student Well-Being. It included details on developing successful networks, some current research on knowledge networks, and an outline of the work the network will be doing over the next four years. Watch Don’s presentation.

    In addition to Don’s presentation, six other presentations from the conference are available to view.


    The KNAER team engaged with many new people and colleagues!


    Over 40 people signed up to receive our KNAER newsletter for breaking news and events. You can sign up here as well via our newsletter sign-up!

    Can't wait for the next conference!

  • KNAER-RECRAE Highlight:

    Partnering with Education Professionals to Mobilize Knowledge: Exciting developments in Australia and the UK


    In this blog, Professor Sharples shares with the KNAER network takeaways from his recent work with Evidence for Learning in Australia, as well as groundbreaking work at the Education Endowment Foundation in the UK. Similar to knowledge mobilization initiatives in Ontario, Jonathan observes a shift in both countries towards developing meaningful partnerships between practitioners and researchers as a key strategy for increasing the mobilization and impact of research-informed practices in schools and classrooms. Jonathan adapted this blog from a recent post he contributed to the Evidence for Learning website.

    Towards the end of last year I had the pleasure of spending some time in Australia, working with colleagues at Evidence for Learning, a new venture that aims to take an evidence-based approach to ‘helping great practice become common practice’.

    …the momentum for evidence-based reform has been increasingly carried by a broad alliance of people – policy makers, practitioners, intermediaries…

    My overriding impression from being there (apart from not wanting to come back to London in Winter!) was that it feels like an exciting time to be teaching in Australia, as evidence-informed practice emerges as a more central feature of the education landscape. The developments remind me of one of the key shifts we have seen in the UK over the last ten years, where the momentum for evidence-based reform has been increasingly carried by a broad alliance of people – policy makers, practitioners, intermediaries – rather than being a predominately research-driven agenda.

    This shift may also ring true for many in the Canadian Knowledge Mobilization community, and I think it provides a fantastic opportunity. In Australia, I was struck by the sophisticated research knowledge of the teachers and headteachers I met, and their skills in interpreting and applying evidence to practice. Most impressive was the clarity of purpose in terms of using research, in combination with their professional expertise, to improve pupils’ learning and make education more equitable. Whilst this sounds obvious, it is not always the case. Although well intentioned, sometimes we get so focused on the means – generating and using research – that we can lose track of the end, which is, of course, to improve the quality of teaching and learning in schools.

    I was struck by the sophisticated research knowledge of the teachers and headteachers I met, and their skills in interpreting and applying evidence to practice. Most impressive was the clarity of purpose in terms of using research, in combination with their professional expertise, to improve pupils’ learning and make education more equitable.

    Related to this, I was struck by the sharp focus on implementation. I really like the phrase ‘the practitioner is the intervention’. It emphasises that no matter how good evidence-based practices and programmes are on paper, what really matters is how they manifest themselves in the day-to-day work of teachers in the classroom. Again this sounds obvious, but too often we focus on encouraging an engagement with evidence at the expense of providing the necessary time, commitment and support to translate a conceptual understanding into practical behaviours. In this respect I was lucky to see some great examples of disciplined and skilled implementation: schools picking a key priority area, say formative assessment or mastery learning, and setting out a long-term commitment to applying research in a way that works for their context, sometimes at the expense of other potential priorities.

    I was lucky to see some great examples of disciplined and skilled implementation: schools picking a key priority area…and setting out a long-term commitment to applying research in a way that works for their context, sometimes at the expense of other priorities.

    One of the most exciting aspects of the Education Endowment Foundation, where I am based in the UK, is how it is emerging as a genuine partnership between research and practice, with research producers and users collaborating to establish ‘what works’, and applying that knowledge in complex school environments. For example, a national network of Research Schools is emerging, which are creating opportunities and capacity for evidence-informed practice in their region, by communicating research findings, providing training and professional development, and encouraging disciplined, ‘bottom-up’ innovation.

    The opportunity, I think, is to make these leading schools as central as possible to our endeavour, as those individuals that are able to bridge the research-practice divide are, I believe, our greatest asset. The opportunity is for all stakeholders – researchers, practitioners and intermediaries – to work out what they bring to the party, whether that’s in driving innovation, capturing impacts, translating evidence, or integrating it into practice. Working out these coordinated, but differentiated, roles isn’t easy, although when it happens I think that’s when things really start to motor.


    Professor Sharples is a Senior Researcher at the Education Endowment Foundation, seconded from the Institute of Education at University College London, where he is exploring schools’ use of research evidence. Jonathan works with schools and policy makers across the sector to promote evidence-informed practice, and spread knowledge of ‘what works’ in teaching and learning. He writes evidence-based guidance for schools and works with practitioners to scale-up effective practices.

    Jonathan previously worked at The Institute for the Future of the Mind at the University of Oxford, where he was looking at how insights from brain-science research can support teachers’ expertise and professional development. Prior to this he worked as a secondary school science teacher in Sydney. He is the author of Evidence for the Frontline, a report published by the Alliance for Useful Evidence that outlines the elements of a functioning evidence system.


  • Knowledge Mobilization Connection:

    Have We Practiced What We've Preached?


    In July 2016, Sandra Nutley wrote a blog for the KNAER entitled, Using research to shape knowledge mobilisation practice. Nutley commented in this blog that while there is a growing number of knowledge mobilization initiatives dedicated to facilitating and enhancing research use, she noted the irony that “many of these initiatives struggle to demonstrate that their own knowledge mobilisation practices are themselves research-informed and in line with the best available research on how to enhance research use”. Nutley was referring to several tenets that emerged from a 2016 literature review that she conducted with Huw Davies and Alison Powell exploring the latest thinking and empirical evidence on best practices. Even though Nutley indicated that KNAER was the exception to the current practice, we, the KNAER Secretariat, reflected on this irony and asked ourselves, can we demonstrate that we followed these tenets from the Nutley and colleagues’ literature review? Can we demonstrate that our own decisions surrounding the original KNAER were evidence-informed, and have we continued to do so for the renewed KNAER? Essentially, have we “practiced what we’ve preached” about using research to inform practice?

    Tenet #1 Bringing Researchers and Research Users Together

    The original KNAER was active from 2010 to 2014 and supported 44 knowledge mobilization projects. Each project was selected through an adjudication process that began with a call for proposals with specific application guidelines. Under these guidelines, each project was expected to include several partners or partnerships, and projects were expected to collaborate with at least one academic researcher. Analysis of our final reports indicated that the majority of projects were connected to researchers through universities, colleges, health organizations, and/or research departments at school boards. Projects were also expected to connect with practitioners: many demonstrated this by working with school boards and teacher and principal associations as brokers to educators in the field.


    KNAER Phase I Partnerships


    Tenet #2 Acknowledging the Importance of Context

    We understood context in this case to mean the context of the Ontario public education sector. The KNAER acknowledged the importance of the Ontario public education context in a number of ways. The original KNAER initiative selected knowledge mobilization projects that focused on the (then) four ministry priority areas: teaching and learning, transitioning, equity, and engagement. This was specifically one of the criteria used in adjudicating the more than 100 proposal submissions. As is evident in Tenet One, all knowledge mobilization projects included partnerships with some combination of provincial and local intermediaries, such as the Learning Disabilities Association of Ontario and the Peel District School Board. These intermediaries work closely with educators in the field and are aware of the current educational trends and challenges in their particular areas. Lastly, even though all projects fell within at least one of the then four priority areas, the actual educational issues being addressed varied considerably. This diverse range of topics (see below) was driven by local concerns and needs.


    • Aboriginal Education
    • Arts Education
    • Classroom Management
    • Early Childhood Education
    • Education in the North
    • English Language Learners
    • Equity and Inclusion
    • French-language Education
    • Knowledge Mobilization
    • Leadership
    • Mathematics Education
    • Mental Health
    • Multi-modal learning
    • Physical Health
    • Science Education
    • Special Education
    • Stakeholder Engagement
    • Student Identity


    Tenet #3 Being Aware of the Needs of Research Users

    During the first two years of KNAER (2010-2012), further consultation was done with educators, researchers, intermediary groups, and parents, and it became clear that while the successful project leads were well-versed in knowledge mobilization, knowledge mobilization overall was not well understood throughout the education sector. For this reason, the KNAER began supporting professional learning around knowledge mobilization. Specifically, we created resources on building an effective knowledge mobilization plan, such as tips for knowledge mobilization planning and for writing a short research summary for your primary audience. We also repurposed the KNAER website and created a toolkit that compiled the resources generated from the KNAER. This toolkit mainly concentrated on resources applicable to the content focus of the various knowledge mobilization projects. For example, a mathematics-focused project by Shelley Yearley, Trish Steele, and Cathy Bruce and their partners, entitled Exploring Learning and Differentiated Instruction for the Difficult to Learn Topic of Grade 6 Fractions Using Teacher-Coach-Research-Developer Networking, has several useful resources in the toolkit, including a literature review, an information sheet, and its own toolkit.


    Another example is the Our Kids Network: Taking Research to Practice project, which included a university partner (Charles Sturt University), a provincial network (Our Kids Network), a school board (Halton District School Board), several health agencies (Halton Region Children’s Services and Department of Health, ErinOakKids Centre for Treatment and Development), the Halton Police, and several community partners (Halton Children’s Aid Society, ROCK Reach Out Centre for Kids, Halton Multicultural Council). The OKN project focused on building capacity to utilize research, strengthening cross-sector partnering in taking research to practice, sustaining engagement with internal and external stakeholders, and responding more effectively to issues facing children, youth, and their families in the Halton region. The project used the strengths, resources, and knowledge of its network and partners to accomplish this. Their resources in the KNAER toolkit include a report, online video, toolkit and virtual community of practice, and website.


    Tenet #4 Drawing on a Range of Types of Knowledge, Not Just Research-Based Knowledge

    A number of different types of knowledge can be found within the 44 KNAER projects. Because the majority of project topics were driven by local practitioners and communities connected with the education sector, practitioner and community knowledge was used substantially in efforts to inform practice. An example of a community-based project is the Kimaaciihtoomin E-Anishinaabe-Kikinoo’amaageyak (Beginning To Teach In An Indigenous Way) project by Jean-Paul Restoule and his partners, including the Toronto District School Board and its Aboriginal Education Center. This project focused on integrating Aboriginal perspectives into the classroom and has many resources available in the toolkit, including several presentations, articles, a toolkit, online videos, and website.

    Tenet #5 Testing and Evaluating Interventions

    We interpret this tenet, testing and evaluating interventions, to mean asking whether our efforts produced any of our intended outcomes. It was difficult to conduct a systematic evaluation of the full KNAER initiative because each knowledge mobilization project focused on a different educational issue, approached it in a different manner, and included different partners. However, each project did attempt to report its impact and degree of influence and to record outputs and outcomes, and we tried to provide a transparent narrative of the full initiative in the final report. An external evaluator evaluated the full initiative and concluded that KNAER was a “trailblazing” initiative (McGuire, Zorie, & Frank, 2014, p. 9). In addition, we conducted a review of the utility of KNAER, a literature review, and interviews with knowledge mobilization experts, and held planning sessions with various stakeholders to create our final report. The KNAER final report is our culminating internal evaluation and lessons learned about KNAER (2010-2014). This final report was (and is) the basis for the renewed KNAER initiative.

    Tenet #6 Feeding Knowledge from Evaluation Back into Future Practice

    Last, but certainly not least, Nutley stated that another effective practice reported in the literature was to utilize knowledge gleaned from evaluations to inform future practice. As mentioned in Tenet Five, we produced an extensive final report on the original KNAER (2010-2014) in which we considered the findings from the external evaluation, our own literature review, expert interviews, virtual discussions, and planning sessions to develop a systems approach model that was proposed to the Ontario Ministry of Education for consideration in redesigning KNAER. The Ministry accepted this model and has utilized it as the framework for the renewed KNAER, which was launched in fall 2016. To read more about our lessons learned and recommendations for the renewed KNAER, please see our final report.


    Let’s return to the question we asked ourselves earlier in the blog: Did we practice what we preached? As demonstrated in this blog, we do think that, to varying degrees, we have practiced what we’ve preached. Perhaps the more relevant question is how we will continue to take our lessons learned and apply them to new challenges that arise with the revised systems model as we move forward. How will we know the outcome from applying this new knowledge and understanding?

  • KNAER-RECRAE Highlight:

    Read The Hamilton Spectator article on launch of KNAER Well-being Network

    Mac, school board to lead student well-being networks

    Hamilton will be leading the effort to equip Ontario schools with new resources to improve student well-being.

    The Ministry of Education has announced that the Hamilton public school board and McMaster University's Offord Centre for Child Studies have been chosen to spur the building of "knowledge networks" addressing in an integrated way...[continue reading article]

  • Knowledge Mobilization Connection:




    Over the last two decades there has been a steady growth of high-quality research in education and human services. Yet we have heard researchers express frustration that policymakers and practitioners don't use (and sometimes misuse) research findings. Conversely, policymakers and practitioners suggest that research is frequently not relevant to their work, or that it's not easily accessible or understood. This is not surprising, as evidence-based practice and policy has traditionally been about producing evidence and disseminating research to users—an approach that Vivian Tseng has characterized as a one-way street.

    But just producing rigorous research and bringing evidence to practitioners and policymakers won't get it used. Users need ongoing engagement around research. They need opportunities to talk about the research and interact with others to apply findings.  Researchers need to focus less on dissemination and more on dialogue. Enter research-practice partnerships.

    Long-term collaborations between researchers and decision makers—research-practice partnerships (RPPs)—shift the dynamic between research producers and users by creating two-way streets. Instead of asking how researchers can produce better work for practitioners, partnerships ask how researchers and practitioners can jointly define research questions. Rather than asking how researchers can better disseminate research to practitioners, partnerships strive for mutual understanding and shared commitments from the beginning. Successful partnerships enable researchers to develop stronger knowledge of practitioners' challenges, their contexts, and the opportunities and limitations for using research. And they allow practitioners to develop greater trust in the research and deeper investment in its production and use.


    The Principles Behind Successful Partnerships

    While there are many different kinds of RPPs, they are guided by key principles that make them different from other types of collaborations.

    Mutualism brings researchers and practitioners to the table to develop an agenda that is mutually beneficial.  By collaborating throughout the process, both researchers and practitioners get their needs met.

    Commitment to long-term collaboration means that the partnership can foster iterative work to understand and address key problems of practice. There is an ongoing cycle of learning and doing.

    Trusting Relationships are the key to effective partnerships. To sustain a long-term collaboration, partners need to believe that they can rely on each other to come through on agreements and to understand and even anticipate each other's needs and interests. Trust enables partners to weather bad news, disagreements, unfulfilled expectations, and changes in leadership.


    How do Partnerships Work?

    Building an RPP is hard work. They are complex organisms, with structures, processes, and roles that evolve as partnerships mature and adapt. However they form, we have observed five elements that seem to come together in successful partnerships.
    (See Diagram 1)

    1. All RPPs must determine their structure. While not all RPPs have formal documents, many develop charters, MOUs, and operating principles to record their shared goals, clarify stakeholder representation and roles, and spell out governance and operational issues.

    2. RPPs must develop a shared commitment, which includes defining the research agenda. The research agenda is a focal point for activities within a research–practice partnership. In RPPs, research agendas are shaped around problems of practice, policy, and implementation, and are co-shaped by partners to fit practice priorities.

    3. Partnerships need to develop their processes, routines, and “ground rules” for producing and using research evidence.  Many RPPs have a “no surprise” rule, wherein the agency partners have an opportunity to review a research report before it is released to the public.

    4. Capacity building may involve one or all partners in the collaboration. It can focus on building the agency's capacity to use research, bolstering researchers' capacity to conduct and communicate useful research, or supporting the capacity of the partnership itself through staffing.

    5. It's critical that the funding portfolio for RPPs covers partnership infrastructure as well as projects.

    As all of these elements come together, a partnership identity often emerges where research and practice partners develop a shared sense of what their partnership is and what it does.

    The 25-year relationship between the University of Chicago Consortium on School Research (CCSR) and Chicago Public Schools illustrates how a school system can learn from research evidence to drive improvements over time. In the late 1990s, CCSR researchers sought to measure freshman progress toward graduation and developed an "on-track indicator" to help schools identify freshmen who were unlikely to graduate in four years. Their research challenged prevailing assumptions and stimulated new ideas for combating dropout. But the work didn't stop there. The district and schools put systems in place to use the indicator data to intervene with students while their research partners experimented with user-friendly ways to present the data. As schools tried out different strategies for bringing students on track, CCSR systematically studied those strategies to determine which ones worked and why. By 2013 that sustained work had yielded an on-time graduation rate of 82 percent, up from 57 percent just six years earlier. Educators, using data and research, learned how to better support student success.

    Let's turn to a relatively new partnership between the research and program offices within one agency: the Office of Family Assistance's (OFA) healthy marriage programs and the Office of Planning, Research and Evaluation (OPRE) within the U.S. Department of Health and Human Services, Administration for Children and Families (ACF). In the early 2000s, OPRE launched long-term studies to evaluate whether marriage and relationship education programs might improve outcomes for children in low-income families. By the time the evaluation results were available, about ten years later, the context within ACF had changed and the program office had new questions not envisioned when the long-term studies were planned. A misalignment had emerged between what research could offer and what current policy needed. To make the research more relevant, closer collaboration between the research and program offices needed to be established.

    That's exactly what happened when, in 2010, a partnership was structured between OPRE and OFA's healthy marriage programs. Lauren Supplee, formerly of OPRE, and her OFA counterparts sat down and discussed the needs of the program. Since they had a history of working together, they already had a trusting relationship. Subsequently, they were able to structure a partnership that would be responsive to OFA staff. One way that this took shape was that OPRE researchers invested time in building relationships with OFA across all levels of staff. By working together, ideas for new research projects are generated throughout the year. To build the research agenda, the OPRE and OFA staffs have ongoing conversations about program operations as well as specific structured meetings around annual research planning. The overall goal for these discussions is to raise questions from the field that research may be able to support. The research portfolio now includes foundational projects to describe services and participants, impact studies to determine the efficacy of components or whole programs, and research capacity building for the field, allowing important questions to be addressed simultaneously. Through shared goals, shared commitment, open communication, capacity building, and infrastructure, an effective RPP was born.


    How Can We Ensure the Next Generation of Partnerships?

    RPPs represent a sea change in the way we think about research, practice, and the uses of evidence. They break down walls between researchers and practitioners by creating two-way streets of engagement. Researchers and practitioners, alike, must remain open to learning about and adapting to different perspectives. Taking the long view on research and practice improvement reaps tremendous benefits for youth and families.

    There are important questions that researchers need to ask themselves before entering into an RPP. Can they adopt new ways of working in order to produce more timely and useful research? Are they open to taking on different approaches to research? Finally, are they willing to acquire skills that are not traditional for researchers, including communication with broader audiences and imagining research agendas from a practice perspective?

    Practitioners, too, will need to work in new ways. They may need to ask themselves: Are their organizations flexible enough to foster the conditions that will allow for more effective and systematic use of research evidence? At the same time, policymakers will need to address the bureaucratic barriers to research-program collaborations within agencies and between public agencies and external research partners.

    Private and public funders have a role to play as well in preserving the progress that partnerships have made. While funding for projects is easier to come by, some foundations are not as willing to invest in operating costs and infrastructure development. But, RPPs need resources for outreach, communications, and relationship building if they are to be sustained for the long haul. What would be ideal is a reliable combination of funds: government support for operating costs and specific large-scale research studies, private support for infrastructure maintenance and behind-the-scenes internal "R&D" activities, local foundation support for context specific research and evaluation studies and infrastructure, and university support for faculty time and partnership space.

    Research-practice partnerships are not the path for the faint of heart. But acknowledging and addressing the challenges inherent in the work can bring about opportunities to close the notorious gaps between research and practice—to build two-way streets that improve work on both sides of the divide. RPPs allow researchers and practitioners to build joint, strategic research agendas, to embed data and research in ongoing work, to build knowledge from one project to the next, and to integrate lessons learned into practice and policy. When mutual trust forges confidence in the research, we can collectively bring about more effective services and enhance outcomes for children and youth.

    Diagram 1



    About the authors

    Vivian Tseng is the Vice President, Programs at the William T. Grant Foundation. She leads the Foundation’s grantmaking programs and its initiatives to connect research, policy, and practice to improve child and youth outcomes. In 2009, she launched the Foundation’s initiative on the use of research evidence in policy and practice. She also designed the Foundation’s support for research-practice partnerships, including a learning community of research-practice partnerships across the country. Tseng has longstanding interests in mentoring young researchers and strengthening the career pipeline for scholars of color. Under her leadership, the William T. Grant Scholars Program has deepened its support for early-career researchers and established a grants program to support mentoring for junior researchers of color. She serves on the Boards of the Forum for Youth Investment, Asian Americans and Pacific Islanders in Philanthropy, and Evidence and Policy. She was previously on the faculty in Psychology and Asian American studies at CSUN. Her studies of racial, cultural, and immigration influences on child development have been published in Child Development, and her research on improving social settings and promoting social change has appeared in the American Journal of Community Psychology. She received her Ph.D. from NYU and her B.A. from UCLA.

    John Q. Easton is Vice President, Programs at the Spencer Foundation in Chicago. At Spencer, he developed and leads a new grant program for Researcher-Practitioner Partnerships. From June of 2009 through August 2014 he was director of the Institute of Education Sciences in the U.S. Department of Education. Prior to his government service, Easton was executive director of the University of Chicago Consortium on School Research. He was affiliated with the consortium from its inception in 1990, becoming its deputy director in 1997 and executive director in 2002. Easton served a term on the National Assessment Governing Board, which sets policies for the National Assessment of Educational Progress (NAEP). He is a member of the Illinois Employment Security Advisory Board, the Illinois Longitudinal Data System Technical Advisory Committee, and the Chicago Public Schools’ School Quality Report Card Steering Committee.

    Lauren H. Supplee is a program area director for early childhood research. Dr. Supplee has devoted her professional career to working on research and evaluation with the goal of applying the knowledge to policy and practice. She is committed to conducting research and evaluation that can contribute to program improvement and improved outcomes for children and families. Her research has focused on evidence-based policy, social-emotional development in early childhood, parenting, prevention and intervention programs for at-risk children, and implementation research. Prior to joining Child Trends, Lauren worked for the federal Administration for Children and Families in the Office of Planning, Research, and Evaluation for ten years, with the last four of those as the director of the Division of Family Strengthening. She began her career as a research associate at the University of Pittsburgh. Lauren received her Ph.D. from Indiana University in educational psychology with a specialization in family-focused early intervention services.

  • KNAER-RECRAE Highlight:


    The Consortium for the Study of Leadership and Ethics in Education (CSLEE) annual conference is upon us!

    This year’s conference is being hosted by one of our partner institutions – Western University – and KNAER is a very active participant.

    KNAER will be present at CSLEE in a number of capacities:

    • Our social media coordinators will be live tweeting and providing social media updates throughout the conference - Follow us (@KNAER_RECRAE) on Twitter and like us (KnaerRecrae) on Facebook! And don't forget to follow the full CSLEE twitter conversation at #CSLEE2016

    • We will have a poster on display during the conference about knowledge mobilization in Ontario (check it out for some helpful tips, there will be a handy takeaway sheet as well)

    • We will be hosting a networking session that lets you create connections in a snap and get some useful advice on building lasting connections – learn the tricks of speed networking with us! It takes place on Thursday, October 20 from 3:30-4:30pm in the Juniper room (we’ll have tasty treats for you to enjoy). Don't forget to bring business cards!

    • Join us at our hands-on workshop on social media on Friday, October 21 from 1:30-3:00pm in the Cherry room, and learn how to utilize these digital tools to mobilize your research and expand your network (bring your laptop or tablet or phone or any other digital device that lets you connect – you’ll be glad you did!)

    • We confront the idea of values in knowledge mobilization on Saturday, October 22 from 10:15-11:30am in the Juniper room as part of a three paper panel. Come hear our co-directors speak about this important (and not often addressed) issue.

    • Come meet the KNAER team at any of our presentations or stop by our KNAER table for some swag, resources, and good conversation!

    The full conference program and more information can be found on the CSLEE conference website.

    Hope to see you there!

    The KNAER Team

  • Knowledge Mobilization Connection:


    Systems Thinking as Part of a Knowledge Translation (KT) Approach:

    How Our NeuroDevNet Team Used Systems Thinking to Improve Our Production of Research Summaries

    by Anneliese Poetz

    Anneliese Poetz completed her PhD in Social Science at McMaster University, which generated a systems-based model for Knowledge Translation. Anneliese has experience writing plain language research summaries for policymakers, parents and teachers at the Canadian Language and Literacy Research Network and, in her most recent work for the National Collaborating Centre for Infectious Diseases, she facilitated national stakeholder consultations and developed stakeholder- and evidence-informed products to improve public health practice. For more on Anneliese and her work click here.


    I recently had the pleasure of presenting at the Canadian Knowledge Mobilization Forum (#CKF16) conference in Toronto, Ontario. My 7-minute presentation, entitled “Systems and Processes for Knowledge Translation”, focused on one example of how I use systems thinking to inform my work in Knowledge Translation (KT).


    Several years ago, I met a business analyst who informed me that what I was doing in my job in the field of KT was essentially what a business analyst does: use stakeholder input to inform the design (and/or re-design) of products and processes. When you think about it, everything we do in KT is either a product or a process. The products I worked on included evidence-informed tools for health care practitioners to apply to their work, and guides for researchers to help them “do” KT. One of the processes that we needed to improve was for our production of clear language summaries called ResearchSnapshots.


    Wondering what the difference is between Knowledge Mobilization, Knowledge Translation, and other like terms? Visit Gary Myers' KMbeing blog post and join the conversation on KMb: Definitions & Terminology.  


    If you look at slide #6 in the above presentation, you will see a framework that outlines the key concepts in business analysis, according to the Business Analysis Body of Knowledge (BABOK v.3). These are: 1) Need, 2) Change, 3) Stakeholder, 4) Solution, 5) Value, and 6) Context.

    Each of these is of equal importance, and all must be represented. One of the things we did wrong with our initial process for the ResearchSnapshots was that we transferred the existing process (and writing staff) used by York’s KMb Unit without consideration of the differences in context within NeuroDevNet.

    One of the ‘tools’ within the field of Business Analysis is a methodology called root cause analysis. We conducted a root cause analysis in order to pinpoint the root of the problem and create a targeted solution. We discovered that the writers used by York’s KMb Unit, although highly skilled in clear language writing, had social science expertise but were being asked to summarize research papers that were basic science and clinical science based. The researchers complained that they had to rewrite most if not all of the content, which was a lot of work for them. Pinpointing this root cause let us design a targeted solution and, ultimately, achieve customer satisfaction (buy-in) among the researchers for the new process.

    What we did to improve the process was to first identify all the stakeholders directly and indirectly affected by the process. Then we gathered information about their needs (often in the form of the complaints we’d received from researchers) with respect to the process, which were then transformed into ‘requirements’. These requirements informed the re-design of the new process.

    The process had to be easy for researchers, and create value for the Network. Since the projects within the Network were so diverse and often specialized, it would have been too difficult (and maybe impossible) to find writers who were content experts. So, the new process begins with the researcher nominating a paper produced as a result of one of their NeuroDevNet-funded projects, along with one of their trainees (students) who is an expert in the content area. We then provide training and support toward the production of a clear language summary of their paper that is ready for final review and sign-off by the researcher. In this way, it is easy for researchers because they only have to make minimal edits to the draft, and it creates value for the Network not only because of the clear language summary that is produced but also because of the transferable skills that the trainee acquires.

    Let’s break down how this method reflects ‘systems thinking’:


    1) A system is composed of parts. The first thing we did was map out the stakeholders and where they were situated within the system (see slide #9).

    2) All the parts of a system must be related (directly or indirectly). We mapped out the stakeholders as related, directly or indirectly, to the customer service issue (or ‘incident’).

    3) A system has a boundary, and the boundary of a system is a decision made by an observer or a group of observers. The ‘system’ was what facilitated the execution of the process for creating clear language summaries (ResearchSnapshots). In other words, the boundary of the system was the affiliation of researchers as part of NeuroDevNet, and research papers to be summarized were those produced as part of NeuroDevNet funded research projects.

    4) A system can be nested within another system, a system can overlap with another system. The ‘system’ for producing ResearchSnapshots within the KT Core with one researcher is nested within the larger ‘system’ of the NeuroDevNet pan-Canadian Network of researchers and projects.

    5) A system is bounded in time, but may be intermittently operational. A system is bounded in space, though the parts are not necessarily co-located. We engage with researchers to co-create ResearchSnapshots at the time that we receive a service request, usually after a researcher has published a new peer-reviewed paper. These requests are sporadic, depending on the frequency and pace of publications arising from pan-Canadian NeuroDevNet-funded projects.

    6) A system receives input from, and sends output into, the wider environment. We receive requests, but we will also offer services if we see an opportunity. Once the ResearchSnapshots are finalized, they are made available on the NeuroDevNet website.

    7) A system consists of processes that transform inputs into outputs. The process for clear language writing of ResearchSnapshots is one of the processes within the KT Core; it transforms inputs (peer-reviewed publications, clear language summary drafts in Word) into outputs (a finalized clear language summary, formatted onto the ResearchSnapshot .pdf template and formatted for accessibility).

    8) A system is autonomous in fulfilling its purpose. A car is not a system. A car with a driver is a system. Similarly, the KT Core as a department within NeuroDevNet is not a system. The KT Core with a Lead, Manager and Assistant, is a system.


    As a systems thinker, remember that a system is dynamic and complex, and that information flows among the different elements that compose a system. For example, information flows among the KT Core Lead, Manager and Assistant. A system is a community situated within an environment. For example, the KT Core is a system situated within NeuroDevNet, and as a result, information also flows more broadly between the KT Core and NeuroDevNet’s community of researchers. Information flows from and to the surrounding environment, for example, the KT Core posts its finalized ResearchSnapshots publicly on the NeuroDevNet website.

    The field of Business Analysis has identified (and published in the BABOK) a common-sense framework and practical methodologies, which I believe can advance the field of KT towards more meaningful and useful products and processes that are responsive to the systems in which they are intended to be used.

  • Knowledge Mobilization Connection:



    By Anne Bergen* & Elizabeth Shantz**

    *Director, Knowledge to Action Consulting;
    **Knowledge Mobilization and Training Manager, Canadian Water Network

    This blog post gives an overview of helpful practices in research impact evaluation, with case study examples from the Canadian Water Network’s recent evaluation work.


    • Understand why you are evaluating: your evaluation goals will impact how and what you evaluate.
    • Map out your evaluation criteria: inputs, outputs, outcomes, and assumptions. A logic model can be a helpful tool for this.
    • Build relationships with end-users and plan for evaluation at the beginning of research projects or programmes.
    • Be flexible and pragmatic. Revisit and adapt your evaluation plan.


    Research impact evaluation is an exercise in understanding how a project or body of work has contributed to change: in people, places, policy, and practice. That means that research impact evaluation goes beyond counting the outputs of research (e.g., publications, reports, presentations) to tracking and assessing uptake, outcomes, and impact.

    The question of what should be considered “research impact” is complex. Counting outputs is relatively easy, but outputs aren’t evidence of impact. It’s much more difficult to draw a link between research activities and changes in individuals, organizations, or systems.

    As research impact tends to be indirect, it’s helpful to think about research activities as contributing to change rather than directly causing it. That is, individual changes in knowledge, attitudes, and understanding may influence decisions and behaviour. However, these changes in decisions and behaviour occur through the uptake of multiple pieces of information, assimilation with existing knowledge and experience, and the larger context. Similarly, changes in “policy and practice” involve the uptake and use of multiple pieces of evidence, as well as external pressures.


    Case Study Example: Canadian Water Network

    At Canadian Water Network (CWN), our goal has been to fund research that addresses real-world water challenges that affect public health, the environment and our economy. However, demonstrating links between research and decisions that affect these factors has been difficult, as policy and practice decisions take a long time to change, and changes generally result from a number of influences rather than from a single project.

    In 2012, CWN launched an evaluation to better understand the contribution of the last ten years of CWN research to these impacts. We used a 3-stage process:

    1. Analysis of existing reports and documents
    2. Interviews with project researchers
    3. Interviews with research users to learn about the impact of each project in their organization.



    Keep a close eye on your evaluation goals

    When planning to evaluate research impact, questions about the scope of evaluation should be defined as early in the process as possible. Revisit these goals often.

    What kind of impact do you expect? In what timeframe? For whom? Under what conditions? Based on what evidence?

    What aspects of your research do you want to evaluate? What are you going to do with the resulting information?


    Case Study Example: Canadian Water Network

    We had a number of goals for CWN’s evaluation project:
    1. Identify and share success stories
    2. Identify the elements of impactful research to inform future projects 
    3. Determine whether user-driven programs led to more uptake and impact than researcher-driven programs 
    4. Establish a comprehensive database summarizing inputs, outputs and outcomes for all projects


    Build a logic model to explain your research impact pathways

    Creating a logic model or theory of change defines the links between research activities, outputs, and desired outcomes, as well as the expected end-users or target populations for change. A logic model makes it easier to map contributions of research activities and outputs to uptake, use, and impacts.


    Logic Model Key Components

    End-users/ target populations

    Who are the audiences/targets of change of your research activities and outputs? These are the people and groups who can tell you about research uptake, use, and impact.


    Activities & outputs

    What are your research and knowledge translation activities that might impact your end-users? Think about in-person activities like meetings, workshops, and collaborations, as well as written and visual outputs.

    Outcomes & impacts

    What are the expected outcomes and impacts of your research and knowledge translation activities? What are the changes in people’s knowledge, attitudes, skills, and actions? What are the community- and systems-level impacts?


    Assumptions & contexts

    What needs to be in place for activities to lead to desired outcomes? What context is necessary for the activities to have impact? When do you expect these changes to occur?


    CWN Logic Model

    Adapted from the University of Wisconsin-Extension evaluation logic model template 

    Case Study Example: Canadian Water Network

    CWN’s logic model, adapted from the University of Wisconsin-Extension logic model, outlines the expected pathways between research inputs, activities, outputs or products, and short-, medium- and long-term outcomes. These categories formed the framework within which we gathered and interpreted data.


    CWN’s logic model from inputs to outputs to impacts, shown here, is a helpful starting example for mapping out your research impact pathways. Other examples include Morton’s (2015a) matrix of outcomes and indicators for research impact and Phipps et al.’s (2016) logic model for knowledge mobilization, which spells out benefits for researchers and end-users.



    Ideally, you’ll start planning for research impact evaluation by defining goals and creating a logic model at the beginning of your research project. Research impact may take years or decades, but trying to reach out to end-users long after the fact to learn about uptake and use is unlikely to be helpful.

    Building relationships and involving end-users throughout the research process will make it easier to follow up over time, and will build more opportunities for impact into the research process. In addition, end-user involvement and collaboration will amplify your research impact.

    Research impact takes time. Don’t expect to see changes in policy or practice right away. Consider also that although it will likely be easier to identify contributions from broader programmes of research than from single projects, those impacts will take longer to emerge. Since it’s rarely possible to wait decades to evaluate impact, focus on identifying shorter-term indicators of potential future impacts.


    Case Study Example: Canadian Water Network

    CWN gathers evaluation information throughout a project, including information on outputs, short-term outcomes that have occurred, and medium- or long-term outcomes that are expected to occur later. This process helped us to forecast potential future impacts and highlighted areas where we could follow up during this evaluation to learn whether expected impacts actually occurred.

    Building strong relationships with researchers and end users is critical, and we believe relationships were a key factor in the high response rates to our interview requests (74% of researchers and 68% of end users) – even for projects that were 10+ years old!



    Once you have evaluation goals and a logic model in place, keep revisiting and revising the goals and logic model. The outcomes you expected in Year One may not be the outcomes that are emerging in Year Five. In particular, your logic model for research impact should be updated about annually. The goals of your evaluation may shift, depending on changing pressures from funders and other stakeholders.

    Case Study Example: Canadian Water Network

    As our evaluation proceeded, we updated and changed some of the outcome categories in the logic model due to emerging impacts. Our priorities also shifted from identifying the elements of impactful projects and programs to identifying and sharing success stories. Having a large number of goals for the evaluation may have been a complicating factor – the more evaluation goals you have, the more difficult it is to achieve them all!



    Remember that an evaluation cannot capture every single outcome or impact, and that research impacts cannot necessarily be directly compared across projects or programmes. Finding a balance between doing research and evaluating research can be challenging. Those seeking to understand research impact should be mindful that evaluation has costs, as well as benefits, for researchers and end-users. Working to minimize the burdens of data collection and reporting will make ongoing research impact evaluation more sustainable.

    Carefully designed and implemented evaluation can help you learn how to better mobilize knowledge and create research impact. Build in time and space to learn from the evaluation results, and make evaluation part of a regular reflective practice.

    Case Study Example: Canadian Water Network

    We originally intended to complete CWN’s evaluation in-house within 6-12 months, but found that it took almost 2 years, 2 interns, and the assistance of Knowledge to Action Consulting to gather all of the comprehensive data we were interested in. Database development was the most time-consuming aspect of the evaluation.

    Looking ahead, we plan to continue successive evaluation work. By strategically collecting data, building relationships and scaling back our evaluation plan, we hope to benefit from a more streamlined process.


    Note: This blog post was adapted from the authors’ recent previous work, including presentations to Alberta SPOR unit, OMAFRA, and CKF16.


    About the authors

    Anne is a consultant who helps people and organizations transform knowledge into action. She gained mixed methods research expertise through her PhD training in Applied Social Psychology, and believes that a common understanding of problems and solutions can be built through engaged research and collaborative action. Anne uses approaches grounded in social science theory to develop evaluation frameworks for collecting rigorous, meaningful, and actionable data. She has worked with diverse stakeholders (academic, community groups, non-profit, government) to create knowledge mobilization activities and strategies for projects, programmes, and organizations.

    Elizabeth has been a knowledge mobilizer at Canadian Water Network since 2010. Her background in industrial/organizational psychology and experience as a community-engaged researcher inform her work at CWN; she is responsible for supporting the development and implementation of knowledge mobilization strategies in CWN programs to effectively bring together researchers and decision makers. She also evaluates the impact of CWN research on policy and practice, develops plain language products to share the results of CWN research, and designs tools and training programs to build knowledge mobilization capacity.



    Better Evaluation

    Economic and Social Research Council (2011). Branching Out: New Directions in Impact Evaluation from the ESRC’s Evaluation Committee. Appendix 1 – Conceptual Framework for Impact Evaluation.

    Lavis, J. N., Robertson, D., Woodside, J. M., McLeod, C. B., & Abelson, J. (2003). How can research organizations more effectively transfer research knowledge to decision makers? Milbank Quarterly, 81, 221-248.

    Morris, Z. S., Wooding, S., & Grant, J. (2011). The answer is 17 years, what is the question: understanding time lags in translational research. Journal of the Royal Society of Medicine, 104, 510-520.

    Morton, S. (2015a). Progressing research impact assessment: A ‘contributions’ approach. Research Evaluation, 24, 405-419.

    Morton, S. (2015b). Creating research impact: the roles of research users in interactive research mobilisation. Evidence & Policy: A Journal of Research, Debate and Practice, 11, 35-55.

    National Collaborating Centre for Methods and Tools (2012). Evaluating knowledge translation interventions: A systematic review. Hamilton, ON: McMaster University.

    Phipps, D.J., Cummings, J., Pepler, D., Craig, W., & Cardinal, S. (2016). The Co-Produced Pathway to Impact describes Knowledge Mobilization Processes. Journal of Community Engagement and Scholarship, 9(1), 31-40.
  • Knowledge Mobilization Connection:

    Tangled Up in Networks

    By Heather Bullock

    We live in an increasingly networked world. Advances in technology over the past 15 years have enabled collaboration and networked activities in ways we couldn’t imagine just a few short years ago. Despite this, there has been relatively little research on networks, how they function, and their potential. Personally, I find networks both fascinating and mystifying. The “network” seems like a nebulous concept, but also a very real one, especially when I think about my “work network” or my “network of friends and family”. For me, in the past, networks were just things that existed; I gave little thought to how they are structured or function or how you might use networks to achieve goals, outside of the odd suggestion like “use your network to find a job after your graduate work”. But I gradually became more interested in networks. I would love to say curiosity alone drove me to find out more about networks, but in reality, about 10 years ago, people in my workplace began discussing using networks for certain purposes. So, I started to do some digging and here are some things I found out about networks:

    There is a whole language related to describing and understanding networks, including interesting terms like “actor”, “nodes” and “structural holes”

    Networks are about the connections among ‘actors’ within a specific type of knowledge and/or practice

    Actors can be individuals, groups or organizations

    Networks are constantly changing and have life cycles, similar to living things

    Networks can be social, work-related or interest-driven

    You can make maps of networks (and sometimes, they are quite beautiful)


    This told me something about WHAT networks are, but little about WHY they exist and do what they do, so more digging was needed. It turns out there are many advantages to networks. Some advantages that I find most interesting include:

    Building connections beyond one’s individual or organizational experience

    Allowing greater ease of movement beyond professional, disciplinary, and organizational boundaries

    Encouraging shared learning, rapid diffusion of new knowledge, cross-fertilization of diverse ideas, efficient problem solving and enhanced group ownership

    Now I started to get even more interested. Aren’t these advantages aligned with some of the goals of knowledge mobilization? If networks are a structure that can achieve these goals, how might I use them in my work? Intentionally creating a network was a novel idea for me. But where to start? As far as I was aware, there was no course on network building I could take. Back to the literature I went.

    It turns out that a few researchers have studied networks in detail over the years, and many of the earlier helpful contributions come from two groups of people. One group consists of Provan and Milward, out of the University of Arizona. Along with some colleagues, they have concluded that there are several factors to consider when building networks and setting them up for success. Some of the factors that predict effectiveness are: the level of trust among members, the number of participants or “actors”, the level of consensus around the goals of the network, and the level of skill and experience with other networks that the actors bring. These all also predict what model of network governance you should consider. The second group includes Robeson, Wenger, Garcia & Dorohovic, Huerta, and others, whose thinking has helped shape the following keys to network success:

    Establish clear purpose and goals

    Address hierarchy of needs

    Include a culture of trust in stated core values

    Fulfill specific role functions such as effective leadership, sponsorship, knowledge brokerage and community membership

    Maintain a flexible infrastructure

    Establish supportive processes

    Balance homogeneity and heterogeneity

    Secure adequate resources

    Demonstrate value

    The big question for KMb is: how do we maximize the value of networks to support moving evidence to action? Evidence Exchange Network (EENet) provides a case example. EENet is a knowledge exchange network that brings together diverse stakeholders, including researchers, policymakers, service providers, system planners, persons with lived experience, and family members. Our goal is to make Ontario’s mental health and addictions system more evidence-informed—no mean feat! But that’s why we take multiple approaches. We translate evidence – which we interpret broadly – into usable and accessible forms: Research Snapshots, for example. But we don’t simply disseminate findings through our network; we also help shape the knowledge that’s being created. Our Creating Together initiative brought together provincial partners to help set research priorities. Our Persons with Lived Experience (PWLE) and Family Members Advisory Panel – a mouthful, I know! – has informed projects under the Drug Treatment Funding Program.

    We are especially excited about our Communities of Interest (CoI) initiative. We view CoIs as forums for knowledge exchange and collaborative knowledge creation on topics related to mental health and addictions. This format allows for progress toward more specific knowledge mobilization goals with the support of the network’s knowledge brokers. This year, nine CoIs are busily working toward their goals.

    All of these efforts – Creating Together, the PWLE and Family Members Advisory Panel, the CoIs – are almost like mini-networks within EENet. That’s been our conception for a while: we are a network of networks. It may sound confusing, but our aim is not to duplicate existing networks but, rather, to link up with them and help increase the spread of evidence.

    One challenge we continue to face is how to evaluate a network approach to knowledge mobilization, especially given the multi-faceted nature of the activities taking place within the structure. Our answer has been to develop a theory of change and use a multi-pronged approach that allows us to examine the network as a whole as well as specific activities within the network. One of the most interesting evaluative approaches we have employed to date is a social network analysis of where Ontario mental health and addictions stakeholders are going for their evidence and how EENet fits within that landscape. Hopefully we will be publishing that soon.
    Some questions to leave you with:

    What networks do you belong to?

    Are there networks out there that can help you fulfill a KMb goal? (It is always easier to plug in to an existing structure than to build one from scratch like we did.)

    How can technology help networks? Is the technological platform the network? Or is it a tool for the network?

    What are some of the downsides of networks?

    In the meantime, here are some resources about networks that I have found useful:

    A new(er) paper on Inter-organizational networks:

    One of my “go to” resources for networks in a KMb context: 

    To learn more about Evidence Exchange Network:

    Note: An earlier version of this blog originally appeared as part of a course on “Knowledge Mobilization and Evidence-based Practice” at Renison College, University of Waterloo in 2014.

    About me:

    Heather Bullock, MSc, is a PhD candidate in the Health Policy program at McMaster University in Ontario, Canada, and is part of the McMaster Health Forum's Impact Lab. Heather is a 2016 Trudeau Scholar. She has an extensive background in health care policy and knowledge mobilization, having held progressive leadership positions. She is on leave from her position as Director of Knowledge Exchange at CAMH, Canada’s largest mental health and addictions teaching hospital. In this role, she developed and led an innovative knowledge mobilization initiative, Evidence Exchange Network, which aims to make Ontario’s mental health and addictions system more evidence-informed. She also helped build a program that supports implementation efforts in Ontario’s mental health and addictions system. Heather has also worked for the Government of Ontario as a Research Transfer Advisor, where she connected people within government with the best available research evidence to support evidence-informed policy-making. She instructs several courses and modules on knowledge exchange, including a graduate-level course at McMaster University and, previously, at the University of Toronto.

    Heather’s research interests lie in how large jurisdictions implement evidence-informed policy directions in mental health systems. Her dissertation explores how developed countries structure their implementation efforts, as well as the process of policy implementation in Ontario’s mental health and addictions system. Heather serves in an advisory capacity for several provincial, national, and international initiatives, such as the International Knowledge Exchange Network for Mental Health and the Provincial Centre of Excellence for Child and Youth Mental Health. Her expertise is sought globally, and she has conducted training in knowledge mobilization and implementation for the Government of Saskatchewan, the University of Toronto, the University of Ottawa, the Government of Ireland, and the Swedish Government, among others. She has Master’s-level training in behavioural ecology and evolutionary psychology from Queen’s University.

  • Knowledge Mobilization Connection:

    Five tips for getting knowledge into action 

    by Sarah Morton

    Sarah Morton of the University of Edinburgh works at the interface between social research, policy, and practice in a range of leadership roles. She is Co-Director of the Centre for Research on Families and Relationships, where she leads a Knowledge Exchange team. She is a Director of What Works Scotland, leading the evidence-to-action stream that aims to increase the ways local authorities can use evidence to develop public services. She is also Director (Knowledge Exchange and Research Impact) for the Usher Institute of Population Health and Informatics. She has worked as an Impact Analyst within the University, for the ESRC, and with wider projects. Sarah’s research has investigated the process of assessing the impact of research on policy and practice. She is an Associate Editor of the journal Evidence and Policy. Here Sarah reflects on her experience and knowledge of the literature to offer five tips for getting knowledge into action.

    I think we are at a good time in the knowledge mobilisation field. We have built a body of research that explains a lot about what helps and hinders knowledge getting used to aid decision-making in policy and practice (see Oliver 2014). We are developing rewards and incentives to help academics get research out of the academy, and we are at least paying lip service to the idea that policy and practice should be informed by the latest evidence (KNAER is a good example).

    Despite this, there are still many pitfalls along the way, and it is often easier to identify where things went wrong rather than success stories. The five tips I have identified below will not surprise many readers of this blog, and yet they are often the pitfalls that I see in practice. I’d really welcome any comments from readers about whether these are issues that you see in your practice, if I have missed something major, or if you disagree completely!
    Five tips for getting knowledge into action:
    Plan ahead
    Any good project or process involves careful planning, but how often is evidence use included in the plan? If researchers want their research to have impact, well-planned user engagement and KMb strategies have been shown to be effective. On the policy or practice side, valuing evidence, showing leadership, and embedding evidence into organisational practices are all key.

    So what would a planned evidence-use process look like? For those from policy or practice, it might consider how evidence will frame any project or development, how it will be considered and built on, what will be done when people don’t agree on what the evidence says, and how evidence will be accessed, analysed, and interpreted. For research teams and partners, it would consider who will be engaged and involved, what methods are best for engaging stakeholders, and how the research might contribute to change. This needs to move beyond simple ideas of making research accessible, into more complex and process-focussed projects. (I have written about this much more extensively elsewhere.)
    Get the right people round the table
    In our evidence-to-action projects, About Families and the Evidence Request Bank, we learnt a lot about who is involved in evidence-use processes. Like others taking a systems-thinking approach (e.g. Best and Holmes 2010), we believe that it is essential to include a range of key actors in any knowledge mobilisation process. This includes considering the skills mix of any team in accessing, interpreting, and animating evidence of different types. Any systems change also needs to include the perspectives of all key players within the system. Depending on the size of the change project, these views might be represented in person or through consultation of various kinds.
    Have the conversation
    Often the starting point for evidence-use projects is the evidence itself, but there are a variety of discussions and framings that are essential for evidence to action. What is evidence needed for? What kinds of evidence might be useful? How will they be interpreted? How will evidence inform change processes? Who needs to be involved? Research doesn’t speak for itself, so relationships are key to evidence to action. Effective facilitation of knowledge mobilisation needs an ongoing sense of open dialogue, regular revisiting of planned aims, interrogation of context, and keeping the conversation going about the usefulness and relevance of evidence. We worked with Research Impact and the NCCPE to develop a manifesto for partnership research that can help frame some of this conversation.
    Focus on the process
    Using evidence is not a one-off event, but an ongoing process. If people feel they have ‘done’ knowledge mobilisation then they are missing a trick. Using and reusing evidence, checking as programmes develop, and building up more evidence as events unfold are all essential parts of successful knowledge mobilisation. An ongoing focus on the processes can open up new opportunities, ensure ground is not lost, help address conflict and tension, and assess changing contexts and their implications for KMb. Overall a focus on processes helps to ensure knowledge mobilisation continues to be as effective and relevant as it can be.
    Learn, evaluate, review
    I said at the opening of this blog that we are in a good place in terms of understanding barriers and facilitators to knowledge mobilisation (although a recent review is opening up this conversation). We are in a less clear place about which strategies and methods are most effective in which circumstances. As a community of knowledge mobilisers, we need to develop evaluation methods, reflect more deeply, and write up what we find out. My own approach to this has been published elsewhere. Every project needs a learning, review, and evaluation process, even if on a simple team scale. As the field matures, this will be essential in honing the craft, creating training programmes, and developing the most effective strategies.

    So those are my five tips for getting knowledge into action. How do they resonate with your own experience? What might you add? What resources do you use? I look forward to continuing the conversation.

    Best, A. and B. Holmes (2010). "Systems thinking, knowledge and action: towards better models and methods." Evidence & Policy: A Journal of Research, Debate and Practice 6: 145-159.