Briana L. Rigau, University of Illinois at Chicago
Jodi Dooling-Litfin and Elizabeth Scully, Rocky Mountain Human Services, Denver, Colorado
Catherine Killian and Gail Fisher, University of Illinois at Chicago
Beth McManus, Colorado School of Public Health, Aurora Colorado
Mary A. Khetani, University of Illinois at Chicago
This article outlines a quality improvement (QI) initiative to build organizational capacity for engagement in research and evidence-based practice in early intervention (EI). The authors describe the implementation of online learning modules, a writing group simulation, and the launch of a writing group fellowship with service coordinators and their supervisors in a community EI program. Findings are promising and suggest that this QI initiative can improve practitioner capacity for research engagement in EI, strengthen research culture within EI, and advance EI practice and scholarship.
Federal and state policies require early intervention (EI) programs to collect statewide child and family outcomes for the purposes of demonstrating EI accountability and efficacy (Bruder, 2010). In addition, the Individuals With Disabilities Education Improvement Act (IDEA) of 2004 requires EI to provide services that are evidence-based. However, EI programs experience challenges with collecting valid and reliable statewide outcomes data and incorporating evidence-based practices into EI care (Bruder, 2010).
One way that EI programs can address these challenges is through mutually beneﬁcial research partnerships that help to demonstrate EI efficacy and improve EI care quality. However, consistent with Bruder (2010), we have learned ﬁrsthand that there is a lack of consistent opportunities across EI programs for practitioners (e.g., service coordinators, supervisors, direct service providers) to build the knowledge and skills needed for meaningful engagement in research.
Since 2014, we have partnered with two EI programs on a research project to help them improve their delivery of family-centered care. One of the aims of this project was to examine the feasibility, acceptability, and value of implementing electronic patient-reported outcomes (e-PROs) into the workflows of service coordinators (Rigau et al., 2018). While we had successfully piloted a standard research protocol for conducting this project at a neighboring, university-affiliated EI program (Khetani et al., 2018), we were unsuccessful in using the same approach with service coordinators at Rocky Mountain Human Services (RMHS), a larger and non-university-affiliated EI program. In fact, only 5 of the 22 service coordinators successfully recruited families (Rigau et al., 2018). While the university-affiliated EI program had extensive research experience, RMHS had little to no prior research experience.
In addition, service providers rather than service coordinators recruited families during the pilot study (Khetani et al., 2018). Differing levels of research experience likely contributed to the difficulties RMHS service coordinators had with research engagement. RMHS service coordinators cited unmet training needs as a key barrier to their ability to enroll families in the project (Rigau et al., 2018). Although service coordinators at RMHS undergo training and orientation related to their job duties and the EI system, they currently receive no on-the-job research process or engagement training (J. Dooling-Litﬁn, personal communication, August 2017).
Training in Research Engagement
On the basis of our published research describing this experience (Rigau et al., 2018), we co-created a three-part training with the RMHS service coordinators and supervisors to address this unmet training need for research engagement (see Figure 1). This three-part training included:
- Part 1 (mandatory): three self-paced and online learning modules,
- Part 2 (optional): a face-to-face writing group simulation (an activity that involves service coordinators reviewing and contributing to a research product, i.e., a manuscript), and
- Part 3 (optional): an annually renewed writing group fellowship that supports ongoing participation in authorship opportunities on research products.
Participants in the training were: (1) employed at RMHS from January 2019 through project implementation and (2) identiﬁed as a service coordinator or a supervisor to service coordinators. The target numbers for participation were 10 for the learning modules and 6–10 for the writing group simulation and writing group fellowship. To our knowledge, this is the ﬁrst co-designed professional development opportunity to build EI practitioner capacity for health services research (HSR) engagement within an EI program. In the remaining sections, we outline what we co-created, what we discovered, and what we learned to inform future efforts.
Part 1: Online Learning Modules
A series of three self-paced online learning modules was co-designed by academic and community partners to address the following topics: (a) conducting research to build the evidence base of EI practice, (b) practitioner involvement in research to generate evidence for practice, and (c) practitioners’ research utilization and engagement to support their practice. To support participants’ learning during module completion, each module contained short video cases, audio and visual displays of information, and interactive questions. Participants were given access to the online platform to complete the learning modules over 2 weeks.
RMHS EI program leadership and supervisors inﬂuenced learning module design by encouraging service coordinators to volunteer to pilot learning modules, supporting the volunteers to complete the training individually and during paid time, and allotting 45–75 minutes for learning module completion.
Figure 1. Overview of a Three-Part Training to Build Early Intervention Practitioner Capacity for Health Services Research Engagement at Rocky Mountain Human Services
The Module 1 learning outcome was “Identify why research and evidence-based practice are important to the ﬁeld of early intervention.” Its scope centered on the importance of conducting research to build the evidence base of EI practice. Participants received information regarding the United States’ Office of Management and Budget’s evaluation of EI, wherein EI efficacy was not demonstrated (Bruder, 2010); the many beneﬁts of collecting child and family outcomes data for EI outcomes research; and how collecting child and family outcomes data relates to utilizing evidence-based practices to improve EI efficacy.
The Module 2 learning outcome was “Identify reasons that service coordinators should be involved in research efforts, as well as formulate at least one clinically relevant curiosity.” Module 2 centered on the reasons practitioners should use research evidence and engage in research processes, emphasizing the beneﬁts of being part of an EI program that participates in research. This learning module outlined a model commonly used in occupational therapy to facilitate practitioner engagement in research (Hammel et al., 2015). HSR was also introduced as a type of research that is amenable to practitioner engagement. Indeed, this module aligns with national efforts to create infrastructure for collaborative research with health systems to encourage providers’ evidence-based clinical decision making (National Institutes of Health, 2019), and to our knowledge is the ﬁrst of its kind in EI.
The learning outcome for Module 3 was “Identify at least one way you can be involved in research efforts, as well as translate your identiﬁed clinically relevant curiosity from Module 2 into a PICO question.” This module focused on resources and techniques that practitioners can use to engage in research efforts, including information on accessing and searching databases online. For example, the PICO Framework (Ebell, 1999; Richardson, Wilson, Nishikawa, & Hayward, 1995) was outlined as a technique that participants can use to develop questions that can be answered through research. The module also referenced RMHS’s prior efforts to support evidence-based practice (e.g., a journal club) and research engagement (e.g., the HSR project in which RMHS is currently involved). The third module also outlined participants’ available options to help design research projects, as well as interpret and disseminate research ﬁndings. The pilot study (Khetani et al., 2018) was included as an example because EI providers had given feedback during the pilot study that informed protocol revisions for the current phase of work being undertaken at RMHS. The writing group concept was introduced to participants as a novel way to engage in research study design, as well as the interpretation and dissemination of research ﬁndings, within their workﬂows.
Learning Module Format
Prior to beginning the modules, participants were presented with a message highlighting their role in co-creating the learning modules and the instructions necessary for completing them. Participants were asked to share information (e.g., educational background) and then complete 6 pre-module questions (2 for each module) prior to accessing Module 1. After participants completed Module 1, they received access to the post-module questions for Module 1. Participants were required to answer 2 of the 3 post-module questions correctly to move on to Module 2. Participants progressed through Modules 2 and 3, as well as the post-module questions for Modules 2 and 3, in the same manner. Following the Module 3 post-module questions, participants were asked to provide feedback on elements of the learning modules via survey and interview to inform optimizations to the modules.
Part 2: Writing Group Simulation
Participants interested in the writing group fellowship completed a simulation activity to practice, with an experienced trainer, contributing as a member of the writing group. Participants signed up for a time to complete the writing group simulation during a pre-determined week-long period. RMHS EI program leadership and supervisor input informed the simulation’s design: it was completed in dyads or triads, occurred on-site during paid time, lasted 1 hour, and concluded with a 15-minute orientation to the writing group fellowship.
The trainer began each simulation by offering to answer participant questions about the learning modules and by asking guided questions to reinforce learning of module content. Next, the trainer introduced a manuscript draft and charged the participants to contribute to it in the following ways: (a) provide feedback on the logic and ﬂow of the introduction, (b) draft a statement of the study aim for inclusion in the abstract, (c) identify 3–5 Medical Subject Heading terms that applied to the study, (d) identify 1–2 key results that should be discussed, and (e) discuss the relevance of study ﬁndings to clinical and organizational practices. Participants were also interviewed to collect feedback to improve the writing group simulation and then issued a $10 gift card.
Part 3: Writing Group Fellowship
Upon completing Parts 1 and 2, participants officially joined the writing group fellowship. During the writing group simulation, participants were issued the writing group’s policies and procedures and asked for their input on a group name. The writing group fellowship was launched in March 2019, with members expected to meet monthly, or more often as deadlines approach (e.g., every week for a month to meet a manuscript resubmission deadline). Members will attend fellowship meetings during staff paid time.
Participant Engagement in the Training
Of the 25 individuals approached, 13 (3 supervisors and 10 service coordinators; 52%) volunteered to complete the learning modules. More than half of the participants had been employed at RMHS for 5 years or less (61.6%) and had a graduate degree (53.9%). Participants represented diverse educational backgrounds, most commonly social work.
All 13 participants completed the learning modules within the allotted 2-week time frame. Nine participants completed the learning modules in a single sitting, whereas four participants required 2–3 sittings. Mean completion time was 61.4 minutes (range = 36–102 minutes).
The average pre-module question score was 4.62 out of 6 (range = 3.33–5.67). The average post-module question score ranged from 2.6 to 2.8 out of 3. The average change in score from pre- to post-module completion ranged from −0.08 to 0.62. Most participants completed the interactive question in Module 1, and all participants completed the interactive questions in Modules 2 and 3.
All participants completed ratings for elements of the modules. No negative ratings were received; a small number of neutral ratings were noted, but the majority of ratings were positive (see Table 1). More than 90% of participants (12/13) provided narrative feedback for the learning modules, resulting in 7 learning module changes and the re-recording of select audio clips to match in-text revisions.
A total of 10 participants (3 supervisors and 7 service coordinators; 76.9%) successfully completed the writing group simulation in 75–115 minutes. The primary reason for non-participation in the writing group simulation was lack of time. Nine of the participants completed the writing group simulation in-person, on-site, and as a dyad or triad during the predetermined week-long time frame. One service coordinator completed the writing group simulation individually via video conference because of scheduling conﬂicts during the predetermined week-long time frame. All 10 participants who completed the writing group simulation joined the writing group fellowship.
Participant simulation engagement was measured by the average in-text and discussion-based contributions. The average number of manuscript mark-ups was 2.67 (range = 0–5), and the average number of unique comments made during discussions about the manuscript was 22.8 (range = 20–27). All participants reported the simulation was helpful as is (i.e., no modiﬁcations necessary) and that fellowship policies and procedures were adequate (i.e., no changes needed).
To our knowledge, this is the ﬁrst co-designed professional development opportunity to build EI practitioner capacity for research engagement within an EI program. This approach exempliﬁes a Scholarship of Practice (SOP) philosophy, wherein practitioners engage in research in partnership with academic partners, advancing knowledge and practice (Hammel et al., 2015; Kielhofner, 2005). Findings were overwhelmingly positive and provide rationale for implementation scale up to the broader EI workforce at RMHS and other EI programs.
Table 1. Participants’ Ratings of Learning Module Elements
| Learning Module Element | Response | % of Participants |
| --- | --- | --- |
| Information in the learning modules | Helpful or very helpful | 100 |
| Post-module questions | Helpful or very helpful | 92 |
| Interactive questions | Helpful or very helpful | 100 |
| Videos | Helpful or very helpful | 92 |
| Audio clips | Helpful or very helpful | 85 |
| Images | Appealing or very appealing | 78 |
| Navigation | Easy or very easy | 77 |
| Overall satisfaction | Satisfied or very satisfied | 100 |
Part 1: Learning Modules
All participants completed the learning modules within the allotted 2 weeks. Moreover, the range in completion time was modest and participants unanimously reported that they were satisﬁed with the learning modules and that the content of the learning modules was helpful (see Table 1). Taken together, this suggests participants beneﬁtted from the learning modules’ self-paced nature.
Pre-module question scores indicated that all participants had some baseline knowledge about the content presented. This result is not surprising given all participants had at least a baccalaureate degree and more than half had a graduate degree. However, ﬁndings also indicate knowledge gaps among participants, with the most prevalent being related to the importance of EI research and evidence-based practice.
Findings also suggested that participants increased their knowledge about the content presented in Modules 1 and 3 after completing them. For Module 2, the average change from pre- to post-module score remained stable, which could suggest relatively high baseline knowledge about the reasons service coordinators should be involved in research efforts. However, the current Module 2 pre–post module questions, as written, may not adequately measure participants’ learning, which should be examined in the future.
Participants’ completion of the learning modules’ embedded interactive questions suggested persistent participant engagement throughout learning module completion. Most participants successfully created a PICO question, while some were missing one or more relevant PICO question components. Some participants reported difficulty and confusion with the PICO question activities.
Overall, participants reported high levels of acceptability related to the learning modules. Participants with varying education, educational backgrounds, and lengths of employment at RMHS unanimously reported learning module satisfaction and helpfulness (see Table 1). When prompted, participants described the learning modules as “informative” and “straightforward.” One participant stated that the learning modules “…would be helpful for service coordinators who were new and more experienced.” Another stated the learning modules gave “…good information to help service coordinators feel empowered in their role.”
Suggestions for Improvement
We incorporated the majority of the participants’ suggestions for improvement into the learning modules. Improvements were made to the introductory instructions, a drag-and-drop post-module question, the activity for turning a clinical curiosity into a PICO question, and the slide structure (combining shorter slides with longer ones). These changes were designed to improve navigation ease, clarity, and learning module structure, as well as to enhance future users’ learning. One participant suggested removing Video #1 from the learning modules entirely; this change was not incorporated because, based on the feedback received from participants overall, removing Video #1 would not improve most future users’ experience with the learning modules.
Parts 2 and 3: Writing Group Simulation and Fellowship
Data supported the feasibility of, and participant engagement in, the writing group simulation and fellowship experience.
Ten of the 13 participants (76.9%) successfully completed the writing group simulation and joined the writing group fellowship, suggesting the feasibility of both. Because one participant was unable to complete the writing group simulation in-person within the pre-determined time frame, some individuals may beneﬁt from greater scheduling ﬂexibility in order to participate. Our ﬁndings also suggested simulation feasibility with varying group sizes and contexts (i.e., in-person and virtual). Yet, writing group simulation completion time was 25 minutes longer when conducted virtually, suggesting virtual meetings can be feasible if participants can dedicate more than 90 minutes to the activity.
Results indicated that participants were engaged throughout the completion of the writing group simulation, as the majority of participants made mark-ups to the manuscript provided to them, and all participants made multiple comments during the discussion-based portions of the writing group simulation.
Future Success and Sustainability of the Training
To ensure that implementation of the training within the broader RMHS EI workforce and uptake at other EI programs is successful and sustainable, we made the following recommendations.
Part 1: Learning Modules
Moodle, a web platform that is free for up to 50 users (Moodle, 2019), was used for this project. We recommend this type of web platform for programs seeking to co-design research engagement activities. Indeed, the majority of participants highly rated the platform’s ease of navigation (see Table 1). Continuing to host the learning modules at no cost will support their long-term sustainability, as funding for EI programs has decreased in recent years (Khetani et al., 2018).
Learning Module Format
It is also recommended that learning modules are offered in a self-paced format. Providing participants with the ﬂexibility to complete the learning modules at their own pace and over the course of multiple sessions contributed to the learning modules’ feasibility.
Module 2 Pre–Post Module Questions
RMHS and other EI program leadership should review the Module 2 pre–post module questions to ensure they adequately measure users’ learning. It is possible that Module 2 pre–post questions were written so that participants were able to infer the correct answers with little to no knowledge of the content presented. Accurate data collection and review of this data will increase the likelihood the learning modules will be successful at RMHS and other EI programs (Greenhalgh, Robert, Macfarlane, Bate, & Kyriakidou, 2004).
Ongoing learning module review and revision is critical to adapt the modules as the EI ﬁeld and practice change. Time constraints (i.e., a lack of resources) will be a barrier to this ongoing quality assurance for the learning modules (Suarez-Balcazar, Mirza, & Hansen, 2015). However, RMHS and other EI programs can partner with academic institutions to assist with these activities. Academic partners can offer research and evidence-based resources to maintain the learning modules in exchange for contributions of EI knowledge and experiences from RMHS and other EI programs, thereby creating a mutually beneﬁcial research partnership (Suarez-Balcazar et al., 2005).
Parts 2 and 3: Writing Group Simulation and Fellowship
We recommend that the writing group simulation be ﬂexible in order to include in-person and virtual meeting options. Providing this ﬂexibility enabled one participant who could not attend in person to complete the writing group simulation, thus improving the feasibility of participation. The feedback from the participant who engaged virtually suggested no difference in the quality of the experience when completed virtually.
The writing group simulations should be conducted in dyads and/or triads. Related to group size, participants reported, “If the writing group simulation had been done with the trainer one-on-one, there would have been less opportunity for brainstorming throughout the activity,” and “If the writing group simulation had occurred in a larger group, I may not have said as much.”
We recommend RMHS and other EI programs establish a writing group name for internal use at the ﬁrst writing group fellowship meeting. Doing so will ensure that members of the writing group fellowship have sufficient time to determine their name. Establishing a writing group name for internal use may promote group members’ ownership and investment in the writing group, which could facilitate the writing group fellowship’s sustainability.
As the writing group transitions from being facilitated by academic partners to being facilitated by RMHS, challenges in oversight could threaten the fellowship’s sustainability (Suarez-Balcazar et al., 2015). To mitigate this concern, a writing group fellowship leader should be established as soon as possible. Ensuring an established key contact for the writing group will support effective communication between academic and community partners, and thus promote the sustainability of the writing group fellowship (Suarez-Balcazar et al., 2015). In addition, the writing group fellowship leader should track members’ attendance and contributions, holding members accountable for contributing to the ongoing academic–community partnership. In this way, an EI program can promote shared accountability between itself and its academic partners for ongoing research projects, thereby increasing the sustainability of the writing group fellowship at RMHS (Suarez-Balcazar et al., 2005). The writing group fellowship leader also serves as the trainer for the writing group simulation.
Strong leadership engagement is critical to ensure the training’s successful program-wide implementation (Consolidated Framework for Implementation Research, 2018; Greenhalgh et al., 2004; Kimber, Barwick, & Fearing, 2012). In our project, EI program leadership and supervisors have been engaged in all aspects of training implementation. We also recommend engaging organizations’ executive leadership, as involving all levels of leadership supports successful implementation (Kimber et al., 2012; Prosci Inc., 2017). It is essential that, throughout the training’s implementation, leadership demonstrate their support for, advocacy of, and commitment to the training (Greenhalgh et al., 2004).
To further bolster the training’s success, we also recommend that communication about the training occurs prior to and throughout the training’s implementation (Kimber et al., 2012; Wanner, 2014). For the training to be successful, communication efforts will need to include the training’s need, purpose, change processes, anticipated staff impacts and beneﬁts, expectations, support strategies, and anticipated outcomes (Kash, Spaulding, Johnson, & Gamm, 2014; Kimber et al., 2012; Prosci Inc., 2017).
Ongoing Reﬂection and Evaluation
To further support the successful implementation of the training, EI program leadership should continue to reﬂect on and evaluate the implementation of the training (Bertram, Blase, & Fixsen, 2015; Consolidated Framework for Implementation Research, 2018; Greenhalgh et al., 2004). This evaluation can be done by collecting data regarding the factors and outcomes related to the training’s implementation (Bertram et al., 2015; Greenhalgh et al., 2004; Kash et al., 2014; Lachman, Runnacles, & Dudley, 2015; Prosci Inc., 2017). Beneﬁcially, this ongoing quality assurance will inform decisions regarding the training’s implementation and its progress toward improving practitioner-engaged research and sustaining an SOP within the EI program (Bertram et al., 2015; Prosci Inc., 2017). In addition, EI program leadership should collect ongoing staff feedback on the training to improve this professional development opportunity (Bertram et al., 2015).
Co-Designing the Professional Development Opportunity
Training development should occur in partnership with EI program leadership and staff. Resistance to change, such as the implementation of this training, may occur if staff are not consulted regarding the change (University of Wisconsin-Milwaukee, 2018). Soliciting staff feedback can also serve as a mechanism for staff consultation regarding the co-design of the training, which could minimize resistance (University of Wisconsin-Milwaukee, 2018) and maximize success.
RMHS is at the forefront of efforts to build research culture within EI. By undertaking the QI initiative described throughout this article, RMHS has been able to prepare a subset of service coordinators and their supervisors to meaningfully engage in research as part of their employment. Not only are these service coordinators and supervisors more ready to engage in research, but RMHS is also better able to contribute to HSR efforts within the EI ﬁeld and has improved its ability to provide evidence-based, high-quality EI services to families and children. Findings are promising and indicate that implementing a QI initiative as described has the potential to improve practitioner capacity for research engagement and enhance the SOP within a large, urban EI program. Findings also indicate that scaling up the training to other EI programs may be beneﬁcial for building research culture within EI. This article can be used to inform such scale-up efforts.
This project was funded by the National Institutes of Health, RMHS, and the Department of Occupational Therapy at the University of Illinois at Chicago (UIC). We thank Megan Win from the Department of Occupational Therapy at UIC, as well as Dianna Bosak, Vera Kaelin, Jessica Jarvis, and Andrea Gurga from the Children’s Participation in Environment Research Lab at UIC, for assistance with creating videos and visuals for the training. We also thank the following service coordinators and supervisors at RMHS for participating in and co-creating the training as described in this article: Lindsay Kuznicki, Amanda Pedrow, Haley Carle, Damaris Castillo, Wintre Brown, Angie Carey, Amy Jatsko, Natalie Lyle, Laura Sciarcon, Jamie Bane, Lisa Banuelos, Shannon Banks, and Kat Moser.
Briana L. Rigau, OTD, OTR/L, recently graduated from the University of Illinois at Chicago with a master of science in occupational therapy. She has been a team member of the Children’s Participation in Environment Research Lab at UIC since August 2016. Briana has presented and published on topics related to her work at the ZERO TO THREE National Training Institute. In addition, Briana is a pediatric occupational therapist who has been working in early intervention since October 2018.
Jodi Dooling-Litfin, PhD, is the director of Developmental and Behavioral Health at Rocky Mountain Human Services. She is a clinical psychologist. Dr. Dooling-Litﬁn began her career as a psychologist at a children’s hospital, where part of her role was to consult with the local early intervention program. She has since held clinical and administrative roles in early childhood, mental health, and human services programs. She has presented and published on topics related to her work. Throughout her career, she has participated on local, regional, and national committees related to early childhood, behavioral health, and leadership.
Elizabeth Scully is the Early Intervention Program operations manager at Rocky Mountain Human Services. Elizabeth Scully received her bachelor’s degree in social welfare and has worked at Rocky Mountain Human Services for 12 years. Her extensive experience and expertise in early intervention are relevant to this project.
Catherine Killian, OTD, MEd, OTR/L, is a clinical assistant professor and academic ﬁeldwork coordinator at the University of Illinois at Chicago, overseeing clinical ﬁeldwork experiences. She has expertise as a health care manager and a clinician in acute care, sub-acute care, skilled nursing, acute rehabilitation, outpatient, and home health settings. Dr. Killian is a member of the Illinois Occupational Therapy Licensure Board and the Chicago Area Council of Occupational Therapy Directors. She has appointments on two American Occupational Therapy Association committees. Her scholarly work focuses on leadership skill development, ﬁeldwork education, performance improvement, and program development.
Gail Fisher, PhD, OTR/L, FAOTA, is a clinical professor at the University of Illinois at Chicago Department of Occupational Therapy. Her health services research experience and familiarity with the Scholarship of Practice model are relevant to this project. Her scholarship focuses on how policies affect therapy practice and access to therapy.
Beth McManus, PT, MPH, ScD, is a child health services researcher whose research focuses on systems of care for very young children with special health care needs. Beth is particularly interested in access to, effectiveness, and cost-effectiveness of Part C early intervention services for infants and toddlers with developmental delays and disabilities. Beth is an associate professor in the Department of Health Systems, Management and Policy at Colorado School of Public Health. Beth also maintains an active clinical practice as a pediatric physical therapist in the Neonatal Intensive Care Unit at Children’s Hospital Colorado.
Mary A. Khetani, ScD, OTR/L, is a pediatric occupational therapist and rehabilitation scientist. Dr. Khetani joined the University of Illinois at Chicago Department of Occupational Therapy in September 2015. She holds an affiliate research appointment at the CanChild Centre for Childhood Disability Research in Canada. Dr. Khetani directs the Children’s Participation in Environment Research Lab (www.cperl.ahslabs.uic.edu), which contributes to interdisciplinary and multisite translational research for advancing family-centered and participation-focused pediatric (re)habilitation. Dr. Khetani was a member of the ZERO TO THREE Fellows class of 2007–2009.
Rigau, B. L., Dooling-Litfin, J., Scully, E., Killian, C., Fisher, G., McManus, B., & Khetani, M. A. (2019). Building organizational capacity for research in early intervention. ZERO TO THREE Journal, 39(6), 54–61.
Bertram, R. M., Blase, K. A., & Fixsen, D. L. (2015). Improving programs and outcomes: Implementation frameworks and organization change. Research on Social Work Practice, 25(4), 477–487.
Bruder, M. B. (2010). Early childhood intervention: A promise to children and families for their future. Exceptional Children, 76(3), 339–355.
Consolidated Framework for Implementation Research. (2018). Consolidated framework for implementation research constructs [PDF file].
Ebell, M. H. (1999). Information at the point of care: Answering clinical questions. Journal of the American Board of Family Practice, 12(3), 225–235.
Greenhalgh, T., Robert, G., Macfarlane, F., Bate, P., & Kyriakidou, O. (2004). Diffusion of innovations in service organizations: Systematic review and recommendations. The Milbank Quarterly, 82(4), 581–629.
Hammel, J., Magasi, S., Mirza, M. P., Fischer, H., Preissner, K., Peterson, E., & Suarez-Balcazar, Y. (2015). A scholarship of practice revisited: Creating community-engaged occupational therapy practitioners, educators, and scholars. Occupational Therapy in Health Care, 29(4), 352–369.
Individuals with Disabilities Education Improvement Act of 2004, Pub. L. No. 108-446, 20 U.S.C. § 1400 et seq. (2004).
Kash, B. A., Spaulding, A., Johnson, C. E., & Gamm, L. (2014). Success factors for strategic change initiatives: A qualitative study of healthcare administrators’ perspectives. Journal of Healthcare Management, 59(1), 65–81.
Khetani, M. A., McManus, B. M., Arestad, K., Richardson, Z., Charlifue-Smith, R., Rosenberg, C., & Rigau, B. (2018). Technology-based functional assessment in early intervention: A pilot study. BioMed Central Pilot and Feasibility Studies, 4, 1–9.
Kimber, M., Barwick, M., & Fearing, G. (2012). Becoming an evidence-based service provider: Staff perceptions and experiences of organizational change. Journal of Behavioral Health Services & Research, 39(3), 314–332. https://doi.org/10.1007/s11414-012-9276-0
Lachman, P., Runnacles, J., & Dudley, J. (2015). Equipped: Overcoming barriers to change to improve quality of care (theories of change). Archives of Disease in Childhood: Education and Practice Edition, 100, 13–18.
Moodle. (2019). Pricing. https://moodle.com/pricing/
National Institutes of Health. (2019). About NIH Collaboratory. https://rethinkingclinicaltrials.org/about-nih-collaboratory
Prosci Inc. (2017). Five tips for managing resistance.
Richardson, W. S., Wilson, M. C., Nishikawa, J., & Hayward, R. S. (1995). The well built clinical question: A key to evidence-based decisions. ACP Journal Club, 123(3), A12–3.
Rigau, B. L., Scully, E. A., Dooling-Litfin, J. K., Murphy, N. J., McManus, B. M., & Khetani, M. A. (2018). Community engagement to pilot electronic patient reported outcomes (e-PROs) in early intervention: Lessons learned. Journal of Clinical and Translational Science, 2(1), 20–26.
Suarez-Balcazar, Y., Hammel, J., Helfrich, C., Thomas, J., Wilson, T., & Head-Bull, D. (2005). A model of university-community partnerships for occupational therapy scholarship and practice. Occupational Therapy in Health Care, 19(1–2), 47–70.
University of Wisconsin-Milwaukee. (2018). Resistance to change.
Wanner, M. F. (2014). Implement strategy by strategic initiatives with effective program and change management. Paper presented at PMI® Global Congress 2014—EMEA, Dubai, United Arab Emirates. Newtown Square, PA: Project Management Institute.