Enhancing Child Care Quality by Director Training and Collegial Mentoring

Gillian Doherty
Doherty Associates
Tammy McCormick Ferguson
Early Childhood Community Development Centre
Glory Ressler
Canadian Mothercraft Society
Jonathan Lomotey
Novell Community Development Solutions

Abstract

Although considerable evidence confirms that a director with good leadership and administrative skills is vital for developing and sustaining a high-quality child care program, many directors assume the role with little management experience or training. This paper reports on a training program in Canada that combined a formal curriculum to increase director administrative knowledge and skills with a mentoring component emphasizing peer support and collegial learning, delivered in a way that enabled participants to continue working full time. The participants in each of the 28 locations across the province formed study groups with facilitators and held monthly three-hour meetings for a year. During the meetings the participants and their facilitators discussed specific aspects of center administration and their implications for practice. Between meetings participants engaged in self-reflection, shared reflective journals, and worked as a group or in pairs on assignments such as doing a critical analysis of a case study. Graduates showed significant improvement in their administrative practice as measured by the Program Administration Scale (PAS; t[57] = 4.31, p < .001) and in the global classroom quality in their centers as measured by the Early Childhood Environment Rating Scale–Revised (ECERS-R; t[57] = 3.32, p < .01). Eighteen months after graduation, members of all study groups reported ongoing contact with each other (e.g., seeking and receiving assistance and working on joint projects). These reports confirm the program’s success in developing local, ongoing director support networks.

Introduction

Most research to identify predictors of good quality center-based child care has focused on adult-child ratio, group size, teacher general education, and teacher early childhood training. Similarly, much of the discussion about enhancing child care centers “has been about the professional development of teachers and not about those who supervise them” (Ryan, Whitebook, Kipnis, & Sakai, 2011). Yet several studies report that directors’ leadership skills and administrative practices have a significant impact on classroom quality (Bloom, Rafanello, Eisenberg, & Talan, 1998; Bloom & Sheerer, 1992; Lower & Cassidy, 2007; McCormick Center for Early Childhood Leadership, 2010a; Phillips, Howes, & Whitebook, 1991). This study reports the results from a director administrative training program, Mentoring Pairs for Child Care (MPCC). Development, implementation, and evaluation of MPCC were funded by the government of Ontario, Canada, as part of a strategy to enhance the quality of the province’s child care centers.

Administrative Practices and Program Quality

Effective program administration requires both leadership and management knowledge and skills (Talan & Bloom, 2004). Skilled leaders are able to articulate a vision of quality child care to guide their programs, inspire others to work toward that vision, set goals, develop a course of action, and create the organizational climate and structures needed to support the vision (Carter, 2012; Muijs, Aubrey, Harris, & Briggs, 2004). Good managers establish and maintain effective financial, personnel, and monitoring systems and practices; make optimum use of human and financial resources; create and sustain a supportive workplace; supervise staff; and coordinate with children’s families and other organizations working with the child or family (Bloom & Bella, 2005; Muijs et al., 2004; Ryan et al., 2011). Several studies report an association between administrative and program quality (Bloom et al., 1998; Bloom & Sheerer, 1992; Lower & Cassidy, 2007; McCormick Center for Early Childhood Leadership, 2010a; Phillips et al., 1991). However, these studies used a variety of methods to measure administrative quality.

The release of the Program Administration Scale (PAS), described later in this study, provided the child care community with its first objective, standardized, and validated tool to measure the quality of a center’s administrative practices (Talan & Bloom, 2004). To date it has been used to evaluate administrative quality in two published studies. The largest, which involved 452 classrooms in 138 American centers, found that administrative quality accounted for 26% of the variance in classroom global quality (t[138] = 3.62, p < .0001) after controlling for the impact of length of day, number of teachers, annual turnover rate, child enrollment, and the lead teacher’s qualifications (McCormick Center for Early Childhood Leadership, 2010a, p. 1). The other study also found a significant relationship (r[54] = .291, p = .031) between administrative and classroom quality (Lower & Cassidy, 2007, pp. 195–196).

The Current Study

Project Rationale and Research Questions

Effective child care center administration requires a body of administrative knowledge and skills that goes beyond that obtained through basic early childhood teacher training. Nevertheless, a survey of 564 directors in 25 American states reports that one-third had not completed any college coursework in administration (McCormick Center for Early Childhood Leadership, 2010b, p. 2). A similar situation exists in Canada, which influenced the Ontario government’s decision to fund MPCC’s development and evaluation.

The study’s three research questions were: (a) To what extent did graduation from MPCC enhance directors’ administrative practices? (b) To what extent did graduation from MPCC enhance the classroom global quality in directors’ centers? (c) Did the MPCC study groups result in local director support networks that continued after graduation?

Program Design

The parameters for MPCC were established by the funder’s stipulation that the program (a) use an administrative curriculum that was accepted by all the community colleges on the project’s Advisory Committee, (b) establish mentoring relationships among participants, and (c) be accessible to directors regardless of their location. Its design was informed by and built upon two other initiatives to enhance child care center quality. One was the director administrative training model developed by Paula Jorde Bloom and her colleagues at the McCormick Center for Early Childhood Leadership at National-Louis University and used in its two director administration training programs (Bloom et al., 1998; Bloom & Sheerer, 1992). The other was the Partners in Practice Mentoring Model, which was successfully field-tested with experienced directors mentoring newer directors (E. E. Ferguson, T. M. Ferguson, Singleton, & Soave, 1999). MPCC’s design also included unique components to address the funder’s specific requirements.

The McCormick Center Director Training Model

The McCormick training model is grounded in adult learning theory and a social systems approach to organizational change (Bloom & Sheerer, 1992). Consistent with adult learning theory, the instructor’s role is expanded beyond conveying information to include encouraging discussion among participants as well as their active involvement in designing learning experiences that meet their needs. Each training participant is provided a mentor hired by the program. Using the social systems approach, mentors guide their mentees through examining their center’s social system, identifying issues that need to be addressed, analyzing the potential impact of different alternatives to organizational change, and developing a quality enhancement plan. The McCormick model provided MPCC’s developers with an example of effective incorporation of adult learning theory, a social systems approach to understanding organizational change, and mentoring in a program that also includes a formal curriculum.

The Partners in Practice (PIP) Mentoring Model

The PIP Mentoring Model is based on two principles: (a) excellence in early childhood education requires reflective practitioners and (b) effective mentoring requires valuing the knowledge, skills, and uniqueness that each member brings to the relationship; open dialogue; a nonjudgmental approach; and honest, respectful, and supportive feedback (E. E. Ferguson et al., 1999). PIP uses strategies and materials that serve as the focus of mentoring discussions, including journaling, conferencing, site visits, and analyzing a situation in a center through a cause-and-effect diagram. The PIP model provided MPCC with a vision of effective mentoring relationships and various materials and strategies to focus discussions between mentors and mentees.

Multiple Training Locations

MPCC’s design also was influenced by the funder’s stipulation that the program be accessible to directors regardless of their location. Although Ontario is Canada’s most populous province with several large urban areas (and associated suburban and rural areas), it is also characterized by sparsely populated northern regions and southern rural areas. Provisions for this uneven population distribution were built into the program design so that MPCC could be delivered locally throughout the province.

Method

MPCC’s Unique Program Components

Although MPCC is informed by the McCormick and PIP models, two of its components are unique: its use of occupational standards for center administrators as the foundation for its curriculum and its emphasis on collegial mentoring instead of matching participants with mentors hired by the program.

The Occupational Standards for Child Care Administrators. The Occupational Standards for Child Care Administrators document, which was developed through extensive consultation with the child care sector across Canada (Child Care Human Resources Sector Council, 2006), served as the foundation for the MPCC curriculum. It addresses six key director responsibilities: child development and care, human resources, financial, facilities, family and community relations, and governance. Each area of responsibility is subdivided into related tasks that must be completed for competency in that area. In turn, each task includes subtasks, each of which has a list of essential knowledge and required skills and abilities. For example, the two tasks identified for child development and care are (1) develops and implements children’s programs and (2) creates child-centered environments. The first task has five subtasks: (a) develops and implements a philosophy of early childhood education and care, (b) applies pedagogical guidelines, (c) provides programming support to staff, (d) develops philosophies and practices for meeting children’s needs, and (e) evaluates programs. For the first subtask, one component of required knowledge is awareness of the environmental and cultural influences on child development; among the required skills is the ability to create a philosophy by consulting and involving stakeholders, such as center staff and children’s families.

To create the MPCC curriculum, the standards document was supplemented with written materials and required activities related to leadership and administration theory and practice and to developing a center’s mentoring culture. In 2013 the Child Care Human Resources Sector Council updated the original 2006 standards used by MPCC and expanded them to address additional key areas. The updated standards are available to the general public and may be downloaded free of charge.

Collegial Mentoring. Unlike the usual practice in training programs whereby each participant is matched with a nonparticipant mentor hired by the program, MPCC participants were matched to form mentor-mentee pairs.

Participants

Selection. The 28 areas of Ontario selected for MPCC training included 75% of the province’s child care centers (N = 4,505) and represented a cross section of sparsely populated northern areas, southern rural areas, and urban communities of various sizes with their surrounding rural or suburban areas. Each director in these areas was informed about the availability of this free college credit course through individual letters that included a brochure and a website address for additional information. The program was also promoted by the government licensing officials in each area and by local community colleges. Of the 405 applications received, 403 met the requirement of being a current director or assistant director, and all 403 were accepted. Three hundred and forty of the 403 participants completed the one-year program, reflecting an attrition rate of 15.6%.

Mentor-mentee pairs. Participants with less than five years of director experience were matched with more experienced directors from their geographic areas. The matching process took into account such factors as length and type of participants’ child care–related training, occupational standards areas in which mentors felt most competent and on which mentees most wanted to work, and the mentors’ teaching-style preference and the mentees’ learning-style preference.

Initial Orientation and Training

Trainers. MPCC facilitators were college instructors with child care education and/or management experience and were hired to be responsible for program delivery in a specific area. Before the program started, they attended two days of group orientation. This orientation was provided by the sponsoring organization’s executive director and the staff person responsible for MPCC’s implementation in all 28 participating areas. All participating directors, whether mentor or mentee, received 21 hours of preparation across three days at the start of the program. This included an orientation to MPCC, introduction to key concepts (e.g., leadership, mentoring, reflective practice, career development, and systems thinking), and review of the training methods and tools they would be using. Directors designated as mentors received an additional seven hours of preparation that included training in mentoring techniques and facilitating reflective practice. Both participant orientations were provided by the facilitators for the group of participants they would be working with.

Study Group Meetings and Related Activities. The participants and the program facilitator in each area formed study groups, ranging in size from 6 to 22 participants, which held monthly three-hour meetings from February to December. Instead of formal presentations of the curriculum content by the facilitator or a guest instructor, MPCC participants were expected to come to each meeting having reviewed the occupational standards (Child Care Human Resources Sector Council, 2006) and other assigned material that would be discussed. The facilitator’s role was to assist participants in analyzing what they had reviewed, discussing the implications for practice, sharing their ideas and experiences, and integrating new insights on management and leadership into their daily practice.

Each facilitator received a facilitator manual; its content and use were reviewed and discussed during their initial training. The manual included PowerPoint slides for introducing the study group process at the first meeting, templates for the four facilitation processes (case studies, cause-and-effect process and diagram, critical review of a case study or article, and key informant presentation), a list of supplemental readings, and a navigation sheet describing how to use the MPCC website for a variety of activities (e.g., a group forum, mentee-mentor exchange and discussion of reflective journals, or two or more participants working together on a project). Basic assignments and content were uniform across groups, but, consistent with adult learning theory, each group was free to choose which process it would use to complete each assignment. Each mentoring pair was required to submit a written copy of the assignment work and to make a presentation to their facilitator and study group on the knowledge they had gained and/or the changes they had made as a result of an assignment. Participants also were required to contribute to a community presentation at the end of the program, showcasing outcomes and outlining how they had developed a philosophy of quality child care to inform their practice. At the end of the training, the group facilitator reviewed each participant’s presentations and materials and assigned an overall pass/fail mark on the basis of the individual’s presentation(s), attendance, and completion of other MPCC components (e.g., maintaining a reflective journal, completing self-assessments, and developing a professional development plan).

Participants reported spending an additional five to eight hours between meetings each month reviewing the material for the next meeting and completing between-meeting assignments. Assignments included participants working on projects with mentoring partners or other members of their study group, making a site visit to another participant’s center, and gathering information for a presentation to the group on such topics as pedagogical guidelines. Mentoring pairs were expected to exchange and discuss their reflective journals on a regular basis throughout the training.

Subsample Selection and Subject Characteristics

Before the start of the program, a subsample of 10 of the 28 participating areas was selected from various parts of the province for on-site pre- and posttraining observations of centers. Participation in MPCC was voluntary, as was agreement to the on-site evaluation observations, both for the center and for the lead teacher in the room to be observed. The centers observed constituted 14% of the entire sample.

Pre-MPCC observations and interviews were conducted in December 2009 and early January 2010. At that time, 8 of the 71 centers that had agreed to be observed were not available because of scheduling difficulties, an influenza outbreak, or director illness or absence for another reason; this left a sample of 63 centers. Post-MPCC observations and interviews were conducted in January and February 2011. Six of the 63 centers with completed pre-MPCC observations were not available for the post-MPCC observation for a variety of reasons: three directors had withdrawn from MPCC, the board of directors in one center withdrew permission to be observed, one director had changed centers, and one was ill during the available timeframe. This left a final sample of 57 centers.

Fifty-four of the 57 directors observed after MPCC had at least a two-year college early childhood diploma. Eight also had a bachelor’s degree in or related to early childhood education; a ninth had a master’s degree. Nine directors had completed a certificate in administration. Twenty-five of the 57 directors had less than five years of director experience.

Evaluation Instruments

Program Administration Scale. Administrative practice was evaluated using the Program Administration Scale (Talan & Bloom, 2004). The PAS has 25 items arranged in 10 subscales: human resources development, personnel cost and allocation, center operations, child assessment, fiscal management, program planning and evaluation, family partnerships, marketing and public relations, technology, and staff qualifications. The final subscale was not used in this study because of difficulties converting American terminology for teacher training to that used in Canada. Administering the PAS involves a tour of the internal and external areas used for programming or by parents or staff; an interview with the director about policies, procedures, and administrative practices; and a document review to confirm the information obtained from the director. Each PAS item is ranked using a seven-point scale with 1 representing inadequate, 3 minimal, 5 good, and 7 excellent. The reliability of the PAS scale was assessed using Cronbach’s alpha. The scale had high internal consistency both at pre-test (α = .86) and posttest (α = .87). Pearson’s r correlations among the nine subscales ranged between −.05 and .63 with a median of .35 at pretest and between .04 and .67 with a median of .39 at posttest. These statistics are comparable to those reported in the normative sample (Talan & Bloom, 2004, pp. 71–72). In 2011, Talan and Bloom updated this scale.
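As an illustration of the reliability analysis reported above (the same computation applies to the ECERS-R reliabilities below), the following is a minimal sketch of Cronbach's alpha. The item count, variable names, and simulated scores are hypothetical and are not taken from the study's data.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_centers x n_items) score matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of total scores).
    """
    k = items.shape[1]                          # number of items
    item_vars = items.var(axis=0, ddof=1)       # per-item variances
    total_var = items.sum(axis=1).var(ddof=1)   # variance of each center's total
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Hypothetical data: a latent quality score plus item noise, discretized
# onto the 1-7 rating scale, for 57 centers and 21 items.
rng = np.random.default_rng(0)
latent = rng.normal(4.0, 1.0, size=(57, 1))
items = np.clip(np.rint(latent + rng.normal(0.0, 1.0, size=(57, 21))), 1, 7)
print(f"alpha = {cronbach_alpha(items):.2f}")
```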

Classroom global quality. The Early Childhood Environment Rating Scale – Revised (ECERS-R) was used to measure global classroom quality (Harms, Clifford, & Cryer, 1998). The 43 ECERS-R descriptors are arranged in seven subscales: space and furnishings, personal care routines, language-reasoning, activities, interactions, program structure, and parents and staff. Administration of the ECERS-R involves observation in a single classroom supplemented by an interview with the lead teacher regarding items that could not be observed, such as children’s nap time. Each descriptor is ranked using a seven-point scale with 1 representing inadequate, 3 minimal, 5 good, and 7 excellent. The reliabilities of the total ECERS-R scale and its subscales were assessed using Cronbach’s alpha. The total scale displayed high internal consistency at both pretest (α = .91) and posttest (α = .94). The internal consistency of the subscales ranged between .67 and .86. These statistics are comparable to those reported in the earlier normative sample (Harms et al., 1998, p. 3).

The Evaluation Approach

The evaluation addressed the three research questions stated above. Answering the first question required maximizing the number of centers whose administrative practices were evaluated. The available funds therefore permitted assessment of global quality in only one classroom per center, and only with a single-group pre/post evaluation design rather than a comparison or control group.

Data Collection

Each pre- and postprogram observation required a full day in the center, during which the ECERS-R data were collected in the morning and the PAS data in the afternoon. All 10 data collectors had at least a two-year college diploma in early childhood education and nine or more years of subsequent experience in the child care field. Each signed a data collector confidentiality agreement before their first observation and attended three days of training on the PAS and two days of training on the ECERS-R, during which inter-rater agreement levels were established. At that time they were 85.1% for the PAS (range 81% to 100%) and 90% for the ECERS-R (range 88.0% to 97.4%). Inter-rater agreement levels were established again just before the posttraining observations; all data collectors continued to have levels of 80% or better on both instruments. Data from the observation scoring sheets were transferred into the Statistical Package for the Social Sciences (SPSS) and checked to ensure data entry accuracy.
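The inter-rater agreement figures above are percentages of items on which two observers' scores coincide. The study does not state whether agreement was counted as exact matches or as scores within one point (a common convention for seven-point observation scales), so the sketch below, with hypothetical scores, shows both:

```python
import numpy as np

def percent_agreement(rater_a, rater_b, tolerance=0):
    """Percentage of items on which two raters differ by at most `tolerance` points."""
    a, b = np.asarray(rater_a), np.asarray(rater_b)
    return float(np.mean(np.abs(a - b) <= tolerance) * 100)

# Hypothetical item scores from a joint training observation (1-7 scale).
trainer = [5, 4, 6, 3, 5, 7, 4, 5, 6, 2]
trainee = [5, 4, 5, 3, 5, 7, 4, 6, 6, 2]

print(percent_agreement(trainer, trainee))               # exact agreement: 80.0
print(percent_agreement(trainer, trainee, tolerance=1))  # within one point: 100.0
```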

Findings

Effectiveness of the Training Model

MPCC effectiveness was measured by the extent to which graduation from the program enhanced directors’ administrative practices, enhanced global classroom quality in the directors’ centers, and led to the creation of local director support networks that continued after graduation.

Administrative practices. As indicated in Table 1, the study found significantly higher post-MPCC scores on the total PAS and on seven of its subscales. The effect size for the total score was d = 0.57, p < .001, which, using Cohen’s (1988) suggested interpretation, indicates that MPCC had a moderate impact on administration quality. This impact level is good given that post-MPCC data had to be collected within two months after program completion to meet the funder’s requirements, leaving little time for directors to adapt what they had learned. The most significant PAS changes occurred in those areas where directors have the most control: human resources development, use of technology, and marketing and public relations. Change was not significant for two subscales: child assessment and personnel cost and allocation, which includes salaries and benefits. Administrative change in these areas often requires prior approval from the operator and/or additional funds.
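Although the paper does not state which effect-size formula was used, the reported values are consistent with the standard conversion from a paired-samples t statistic, d = t/√n; using the reported totals as a check:

```latex
d_{\text{PAS}} = \frac{t}{\sqrt{n}} = \frac{4.31}{\sqrt{57}} \approx 0.57,
\qquad
d_{\text{ECERS-R}} = \frac{3.32}{\sqrt{57}} \approx 0.44
```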

Table 1
Changes in Administrative Practice (PAS)

| Variable | Pretest mean (n = 57) | Posttest mean (n = 57) | t-value | Cohen's d |
|---|---|---|---|---|
| Total PAS score | 2.8 | 3.3 | 4.31*** | 0.57 |
| Human resources development | 2.4 | 3.1 | 4.07*** | 0.54 |
| Personnel cost and allocation | 2.1 | 2.3 | 1.21 | |
| Center operations | 3.2 | 3.5 | 2.27* | 0.30 |
| Child assessment | 2.9 | 3.2 | 1.27 | |
| Fiscal management | 3.3 | 3.8 | 2.61* | 0.35 |
| Program planning and evaluation | 2.1 | 2.6 | 2.35* | 0.31 |
| Family partnership | 2.8 | 3.3 | 2.54* | 0.34 |
| Marketing and public relations | 3.2 | 4.1 | 4.96*** | 0.66 |
| Technology | 4.1 | 4.5 | 2.81** | 0.37 |

* p < .05. ** p < .01. *** p < .001.
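As a complement to Table 1, here is a minimal sketch of how such a pre/post comparison could be reproduced outside SPSS. It assumes matched pre- and posttest total scores per center; the simulated data are hypothetical and will not reproduce the study's exact values.

```python
import numpy as np
from scipy import stats

# Hypothetical matched total PAS scores for 57 centers (seven-point scale).
rng = np.random.default_rng(1)
pre = np.clip(rng.normal(2.8, 0.9, size=57), 1, 7)
post = np.clip(pre + rng.normal(0.5, 0.8, size=57), 1, 7)

t_stat, p_value = stats.ttest_rel(post, pre)   # paired-samples t-test
cohens_d = t_stat / np.sqrt(len(pre))          # effect size for a paired design

print(f"t({len(pre) - 1}) = {t_stat:.2f}, p = {p_value:.4f}, d = {cohens_d:.2f}")
```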

Classroom global quality. As illustrated in Table 2, director graduation from MPCC was associated with significant enhancement of global classroom quality on the total ECERS-R scale (t[57] = 3.32, p < .01) and four of its subscales. The effect size for the total ECERS-R was d = 0.44, p < .01, which, while lower than that for the PAS, still indicates that MPCC had a moderate impact on classroom quality. The lower effect size found for the ECERS-R is to be expected, as the link from director administrative training to classroom quality is indirect, via the training’s impact on the director’s leadership skills and the center’s administrative practices. As a result, enhancement of classroom quality occurs only after the effects of enhanced director leadership and center administrative practices trickle down to the teaching staff (Guskey, 1998). The largest changes occurred in the ways space and furnishings were used, personal care routines, provision of activities to support child development, and provisions for meeting staff needs. There was little change in adult-child interaction, which, with a pre-MPCC score of 6.0, was already good.

Table 2
Changes in Global Classroom Quality (ECERS-R)

| Variable | Pretest mean (n = 57) | Posttest mean (n = 57) | t-value | Cohen's d |
|---|---|---|---|---|
| Total ECERS-R score | 4.6 | 4.9 | 3.32** | 0.44 |
| Space and furnishings | 4.7 | 5.0 | 2.42* | 0.32 |
| Personal care routines | 4.3 | 4.6 | 2.38* | 0.32 |
| Language-reasoning | 4.6 | 4.6 | 1.64 | |
| Activities | 3.5 | 3.8 | 2.77** | 0.37 |
| Interaction | 6.0 | 6.2 | 1.52 | |
| Program structure | 5.4 | 5.6 | 1.80 | |
| Parents and staff | 5.2 | 5.5 | 2.92** | 0.39 |

* p < .05. ** p < .01.

Development of director support networks. Eighteen months after MPCC ended, graduates were sent a survey that explored the extent to which they had maintained contact with other members of their study groups and the nature of those contacts. At that time, most graduates reported that they were still in contact with their study group members, having sought and given assistance and been involved in joint projects. These projects included (a) providing training sessions for other directors on the Occupational Standards for Child Care Administrators and/or on using the PAS to evaluate one’s own administrative quality and identify areas needing attention, (b) developing and circulating a list of directors in their area who were available for informal support, (c) mentoring assistant directors interested in becoming directors, and (d) reaching out to new or nonparticipating directors to engage them in director support networks. Such activities indicate that MPCC was successful in creating ongoing local director support networks.

Discussion

Findings of this study support previous research indicating a strong relationship between administrative quality as measured by the PAS and global classroom quality as measured by the ECERS-R (Lower & Cassidy, 2007; McCormick Center for Early Childhood Leadership, 2010a). The study also replicates the finding of enhanced classroom quality after directors completed an administrative training program that, like MPCC, combined a formal curriculum with a strong mentoring component (Bloom et al., 1998; Bloom & Sheerer, 1992). MPCC extends the findings of these two studies by demonstrating the successful use of (a) multiple training sites to provide the program simultaneously to 403 directors rather than only to a small group in a single location, (b) collegial mentor-mentee pairs instead of participants matched with a mentor hired by the program, and (c) study groups whose members live in the same area and have frequent contact with each other during training as a method of developing ongoing director support networks.

Our findings indicate that MPCC had a moderate impact on administrative practice (d = 0.57, p < .001) and on global classroom quality (d = 0.44, p < .01). There also was a significant positive correlation (r = .48, p < .01) between administrative and global classroom quality. The fact that MPCC study group members maintained contact with each other after graduation and reached out to other directors not involved in the training indicates that the program was successful in developing ongoing local director support networks.

Several limitations were placed on the evaluation design by the need to use a voluntary sample and by the constraints of the funding available for the program evaluation. We attempted to address these limitations by recruiting as large and as representative a sample as possible, approaching every center in the 10 evaluation areas that had a director enrolled in MPCC. However, the self-selection means that the centers observed in the evaluation may represent only those whose governing bodies and directors are the most self-confident and committed to program quality. Self-selection also resulted in a small sample of 57 centers, which raises the issue of potential sample selection bias.

The limited funds available for the evaluation also meant that MPCC could only use a single-group pre- and postdesign rather than a comparison or control group, which raises the question of possible data collector bias. We attempted to reduce this possibility by using data collectors with no connection to either MPCC or the agency responsible for its delivery. In addition, observation scores were entered electronically and, once input, could be downloaded only by the project coordinator or her data assistant.

MPCC’s first goal was enhancement of administrative practice; therefore, pre- and posttraining evaluations of administration were done in every center. However, resource availability permitted pre- and posttraining evaluation of classroom quality in only one classroom per center, and the observed room might not have been representative of the global program quality in all the center’s classrooms. Resource limitations also meant that we were unable to seek information about variables other than MPCC that may have influenced our participants’ experience during the yearlong program (e.g., a center providing additional teacher training). This lack of information about other factors that might affect classroom quality reduces certainty that the enhanced global classroom quality was associated solely with enhanced administrative practices.

Implications

The pre-MPCC finding of low administrative quality in its sample is unlikely to be unique to the province of Ontario. A study of 564 centers across 25 U.S. states reports a mean overall PAS score of 3.4, just above a minimal rating (McCormick Center for Early Childhood Leadership, 2010c, p. 1). These findings indicate an urgent need to dedicate resources to improving the leadership and administrative abilities of current directors through a variety of strategies. One might be the provision of affordable training, such as that discussed in this article, that enables directors to continue to work full time and can be delivered to multiple sites simultaneously at low cost. Other possible strategies include training directors to use the PAS to identify needed changes and develop a quality enhancement plan, and providing resources to enable the implementation of such plans. Such initiatives would be good first steps toward having skilled leadership and administration in every center, but they are not sufficient. Consideration needs to be given to providing directors who have completed administration and leadership programs with compensation that is commensurate with their knowledge and skills, to reduce the possibility that they will leave the field for higher salaries as managers elsewhere (Ryan et al., 2011).

References

Bloom, Paula Jorde, & Bella, Jill. (2005). Investment in leadership training – The payoff for early childhood education. Young Children, 60(1), 32–40.

Bloom, Paula Jorde; Rafanello, Donna; Eisenberg, Eileen; & Talan, Teri. (1998). Early childhood leadership training: Taking charge of change (Evaluation report – Cohorts #1–#5, unpublished manuscript). Wheeling, IL: The Center for Early Childhood Leadership, National-Louis University.

Bloom, Paula Jorde, & Sheerer, Marilyn. (1992). The effect of leadership training on child care program quality. Early Childhood Research Quarterly, 7, 579–594. doi:10.1016/0885-2006(92)90112-C

Carter, Margie. (2012, Winter). Improving your program with pedagogical leadership. The Director’s Link, 1–3.

Child Care Human Resources Sector Council. (2006). Occupational standards for child care administrators. Ottawa, Canada: Author.

Child Care Human Resources Sector Council. (2013). Occupational standards for child care administrators (Rev. ed.). Ottawa, Canada: Author.

Cohen, Jacob. (1988). Statistical power analysis for the behavioral sciences (2nd ed.). Hillsdale, NJ: Lawrence Erlbaum.

Ferguson, E. Elaine; Ferguson, Tammy McCormick; Singleton, Carmella; & Soave, Alison. (1999). The partners in practice mentoring model: Reflection, caring and sharing. Halifax, Nova Scotia: Partners in Practice, Child Care Connection Nova Scotia.

Guskey, Thomas R. (1998). The age of our accountability. Journal of Staff Development, 19(4), 36–44.

Harms, Thelma; Clifford, Richard M.; & Cryer, Debby. (1998). Early childhood environment rating scale (Rev. ed.). New York, NY: Teachers College Press.

Lower, Joanna K., & Cassidy, Deborah J. (2007). Child care work environments: The relationship with learning environments. Journal of Research in Childhood Education, 22, 189–204. doi:10.1080/02568540709594621

McCormick Center for Early Childhood Leadership. (2010a, Spring). A window on early childhood administrative practices. Research Notes.

McCormick Center for Early Childhood Leadership. (2010b, Winter). Head Start administrative practices, director qualifications, and links to classroom quality. Research Notes.

McCormick Center for Early Childhood Leadership. (2010c, Summer). Connecting the dots: Director qualifications, instructional leadership practices, and learning environments in early childhood programs. Research Notes.

Muijs, Daniel; Aubrey, Carol; Harris, Alma; & Briggs, Mary. (2004). How do they manage? A review of the research on leadership in early childhood. Journal of Early Childhood Research, 2, 157–169. doi:10.1177/1476718X04042974

Phillips, Deborah; Howes, Carollee; & Whitebook, Marcy. (1991). Child care as an adult work environment. Journal of Social Issues, 47(2), 49–70. doi:10.1111/j.1540-4560.1991.tb00287.x

Ryan, Sharon; Whitebook, Marcy; Kipnis, Fran; & Sakai, Laura. (2011). Professional development needs of directors leading in a mixed service delivery preschool system. Early Childhood Research & Practice, 13(1).

Talan, Teri N., & Bloom, Paula Jorde. (2004). Program administration scale: Measuring early childhood leadership and management. New York, NY: Teachers College Press.

Talan, Teri N., & Bloom, Paula Jorde. (2011). Program administration scale: Measuring early childhood leadership and management (2nd ed.). New York, NY: Teachers College Press.

Author Information

Gillian Doherty, Ph.D., was research consultant for the Mentoring Pairs for Child Care (MPCC) study. She has been involved with child and family issues for more than 30 years in a variety of roles, including researcher, consultant, and university and college instructor in child development. Dr. Doherty was the project director and a principal investigator for the You Bet I Care! Project, which collected demographic and work information from center and home-based child care providers across Canada and studied levels of quality in six provinces and one territory. She participated in development of the Canadian Child Care Federation’s training program and other supports to enhance the quality of services provided by child care centers and family child care providers.

Tammy McCormick Ferguson, B.A., E.C.E., was founding executive director of the Early Childhood Community Development Centre and led its growth and development for 20 years. She co-developed the Partners in Practice Canadian Mentoring model and co-developed and led Mentoring Pairs for Child Care. She also co-developed a framework defining the role of owners and boards in child care and has been involved in various initiatives to advance child care in Canada. In 2012, she received the Canadian Child Care Federation’s National Award of Excellence.

Contact information: htpd2@hotmail.com

Glory Ressler, B.A., co-developed and managed the Mentoring Pairs for Child Care program and has led integration research initiatives in Ontario, Canada. Ms. Ressler is a well-known, respected conference presenter and trainer. She is the director of Mothercraft’s College and Community Data Group. In 2011, she was appointed eastern Canadian national representative to the World Forum on Early Care and Education.

Jonathan Lomotey, Ph.D., executive director of Novell Community Development Solutions, provided the data analyses for the research discussed in this article. He has extensive experience in community-based research, community development, and program evaluation. Dr. Lomotey was a co-initiator and lead researcher for two projects to enhance capacity for trauma treatment, promote trauma-informed practice, and increase access to trauma services in the Waterloo and Guelph-Wellington-Dufferin regions in Ontario, Canada.

Funders: The Ontario Ministry of Children and Youth Services and the Ontario Ministry of Education.