“Assessing the Impact of Childcare Initiatives – My AFE at the University of Nevada, Reno” 

By Laila Abed (Duke MIDP ’25)

Summer 2024 AFE Blog Post Series


My passion for data analysis and statistics grew last year when I started my Master of International Development Policy (MIDP) program at Duke University. I took several related courses, such as econometrics, monitoring and evaluation, and evaluation of public expenditure. Data analytics is an interdisciplinary field that allows me to apply the knowledge and skills from those courses to strengthening development and peacebuilding work in the Middle East. My persistence in learning data analytics helped me obtain an internship at University of Nevada, Reno Extension, as part of a large team assessing the impact of more than 35 childcare initiatives in Nevada.

The Coronavirus Response and Relief Supplemental Appropriations (CRRSA) Act, passed in December 2020 to combat the COVID-19 pandemic and its economic consequences, provided Nevada with $93 million in supplemental Child Care and Development Block Grant (CCDBG) funding. Nevada is using these federal funds for a variety of childcare infrastructure expansion projects and program activities, including but not limited to operating grants; staff stipends; expanding childcare networks through recruitment, outreach, mentoring, and support for the early childhood workforce; and improving quality with trainings, resources, and supplies.

University of Nevada, Reno (UNR) Extension evaluates the childcare programs and projects implemented with these funds. The monitoring and evaluation team conducts a summative evaluation to assess the overall effectiveness of the 35 projects statewide.

The team uses the Organization for Economic Co-operation and Development (OECD) Development Assistance Committee’s (DAC) evaluation criteria, which are widely used by evaluators across the globe. The five criteria and their key concepts are as follows:

  • Relevance: The extent to which a program is aligned with the needs and priorities of its target population and the context in which it operates
  • Efficiency: The extent to which the intervention converts inputs into outputs economically, comparing inputs with outputs to determine how to maximize results with the given resources
  • Effectiveness: The extent to which the intervention achieves its desired outcomes, comparing actual outcomes with intended outcomes
  • Impact (Intended and Unintended): How, and the extent to which, the intervention improved the overall quality of life for participants or the community
  • Sustainability: The extent to which the intervention is socially, economically, or environmentally viable to continue beyond the program or initiative

This approach allows the team to examine the projects’ outcomes against their stated objectives and determine whether they are worth continuing in order to address the needs of the communities being served. It also enables the team to offer insights for future program development and implementation.

My role as a data analyst includes several duties: i) designing the evaluation methodology; ii) designing and reviewing the questions, tools, and templates used for the evaluation; iii) building and disseminating surveys in Qualtrics; iv) calculating sample sizes; v) collecting and cleaning data; vi) analyzing data and writing reports; and vii) presenting the findings and lessons learned. So far, I have participated in several trainings on designing evaluation tools, qualitative analysis, SPSS, Qualtrics, and writing evaluation reports.
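To give a sense of the sample-size step, one common approach for surveys like these is Cochran’s formula with a finite population correction. The short Python sketch below is a minimal illustration under that assumption only; the population figure and parameters are hypothetical and not drawn from the Nevada evaluation.

    import math

    # A minimal sketch of a survey sample-size calculation using Cochran's
    # formula with a finite population correction. Parameters are
    # hypothetical, not the actual figures from the Nevada evaluation.
    def cochran_sample_size(population: int,
                            z: float = 1.96,              # z-score for 95% confidence
                            margin_of_error: float = 0.05,
                            proportion: float = 0.5) -> int:
        """Minimum sample size for estimating a proportion in a finite population."""
        # Infinite-population size: n0 = z^2 * p * (1 - p) / e^2
        n0 = (z ** 2) * proportion * (1 - proportion) / margin_of_error ** 2
        # Finite population correction: n = n0 / (1 + (n0 - 1) / N)
        n = n0 / (1 + (n0 - 1) / population)
        return math.ceil(n)

    # Example: a hypothetical frame of 2,000 childcare providers statewide
    print(cochran_sample_size(population=2000))  # -> 323

At a 95% confidence level and a 5% margin of error, the correction shrinks the required sample as the population gets smaller, which matters when the survey frame is a modest statewide list of providers.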

Being part of a large, multicultural, and diverse team is a priceless opportunity to share experiences, learn, and develop communication skills. I accomplish my tasks, propose ideas, discuss them with the team, and feel comfortable in this environment. This significant experience has strengthened my confidence in my skills, in what I can achieve, and in the positions I can reach in the future.
