Analyzing your data is one of the most important stages in the assessment process. At this point, departments should be able to determine how close they are to meeting the targets for each goal. Below are a few things to consider during this stage.
Closing the Loop is central to this stage. It refers to the process of examining results in light of original goals, making informed adjustments, and documenting decisions to ensure meaningful change. By closing the loop, assessment becomes more than data collection—it becomes a catalyst for action, improvement, and accountability.
Steps for Closing the Loop
- Prepare Data: Organize, clean, and format data so it is accurate, complete, and ready for analysis.
- Review Data: Analyze results to determine whether outcomes and goals were met.
- Discuss Findings: Share results with faculty, staff, and stakeholders to gain multiple perspectives.
- Document Decisions: Record what was learned, the changes made, and the rationale behind them.
Prepare Data for Analysis
It’s important to review and prepare your data before you begin the analysis. Here are some steps to help you do that (a brief code sketch follows the list):
- Remove identifiable information: De-identify or anonymize data to protect confidentiality and comply with ethical standards.
- Request additional data, if needed: Identify gaps in the dataset and gather supplementary information to strengthen the analysis.
- Clean and verify data: Review and revise data to ensure accuracy, consistency, and completeness; check for missing values, duplicates, or errors.
- Reformat for analysis: Convert data into the correct structure or format for the software or method you will use (e.g., coding qualitative responses, restructuring spreadsheets, or standardizing scales).
- Organize documentation: Maintain clear records of data sources, transformations, and coding decisions to ensure transparency and reproducibility.
- Check alignment with goals: Confirm that the dataset corresponds with the outcomes or objectives being measured before moving forward.
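To make these steps concrete, here is a minimal sketch in Python with pandas. The export file (`assessment_export.csv`), column names (`student_id`, `email`, `name`, `score_raw`), and the 0–100 to 1–5 scale mapping are all hypothetical; adapt them to your own dataset and tools.

```python
import pandas as pd

# Load the raw assessment export (hypothetical file and columns).
df = pd.read_csv("assessment_export.csv")

# Remove identifiable information: drop direct identifiers.
df = df.drop(columns=["student_id", "email", "name"], errors="ignore")

# Clean and verify: remove exact duplicates and report missing values.
df = df.drop_duplicates().reset_index(drop=True)
print("Missing values per column:")
print(df.isna().sum())

# Assign an anonymous participant code in place of the identifiers.
df["participant"] = [f"P{i:03d}" for i in range(1, len(df) + 1)]

# Reformat for analysis: rescale a hypothetical 0-100 raw score
# onto a 1-5 scale so it matches other measures in the plan.
df["score_5pt"] = (df["score_raw"] / 100 * 4 + 1).round(2)

# Organize documentation: save the cleaned file under a new name so
# the path from raw data to analysis-ready data stays traceable.
df.to_csv("assessment_clean.csv", index=False)
```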
Review Assessment Data
Now it is time to interpret the assessment evidence and look for themes. The tables below pair common guiding questions with data collection methods that can answer them.
Participation and Reach

| Guiding Question | Data Collection Methods |
| --- | --- |
| Who participated in the program or activity? What are their characteristics? | Demographic information, participant lists |
| How many individuals or groups participated? | Attendance records, registration logs |
| How often did participants engage with the program? | Session tracking, participation logs |
| How did participants learn about the program? | Surveys, referral tracking |
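As one illustration of answering these questions, basic reach and frequency numbers can be computed from an attendance log. The sketch below assumes a hypothetical `attendance.csv` with `participant` and `session_date` columns; it is a sketch, not a prescribed tool.

```python
import pandas as pd

# Hypothetical attendance log: one row per participant per session.
attendance = pd.read_csv("attendance.csv")  # columns: participant, session_date

# How many individuals participated?
unique_participants = attendance["participant"].nunique()

# How often did participants engage? Count sessions attended per person.
sessions_per_person = attendance.groupby("participant").size()

print(f"Unique participants: {unique_participants}")
print(f"Median sessions attended: {sessions_per_person.median()}")
```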
Implementation

| Guiding Question | Data Collection Methods |
| --- | --- |
| Were program activities delivered as planned (with fidelity)? | Observation checklists, staff logs |
| Were all components of the program implemented consistently? | Facilitator reports, session notes |
| Did staff follow intended procedures or curriculum? | Staff surveys, observation, audits |
Participant Experience and Satisfaction

| Guiding Question | Data Collection Methods |
| --- | --- |
| How satisfied were participants with the program? | Satisfaction surveys, focus groups |
| What aspects of the program were most or least helpful? | Surveys, interviews, feedback forms |
| Were participants actively engaged throughout the program? | Observation, participation metrics |
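For a quick look at satisfaction results, ratings on a five-point scale can be summarized as in the sketch below; the file name `survey.csv` and the column `satisfaction` are hypothetical.

```python
import pandas as pd

# Hypothetical survey export with a 1-5 satisfaction rating per respondent.
survey = pd.read_csv("survey.csv")  # column: satisfaction (1 = low, 5 = high)

# Overall satisfaction and the distribution of ratings.
print(f"Mean satisfaction: {survey['satisfaction'].mean():.2f}")
print(survey["satisfaction"].value_counts().sort_index())
```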
Staff and Facilitator Feedback

| Guiding Question | Data Collection Methods |
| --- | --- |
| How do staff perceive program delivery? | Staff surveys, interviews, debriefs |
| Did staff encounter challenges implementing the program? | Staff surveys, focus groups |
| How do staff assess participant engagement and outcomes? | Observation, session reports |
Data and Tracking

| Guiding Question | Data Collection Methods |
| --- | --- |
| What data were collected to measure participation, implementation, and outcomes? | Data logs, records, surveys |
| Are there gaps or inconsistencies in the data? | Data review, audits |
| What patterns or trends emerge from the data? | Data analysis, dashboards |
Impact and Next Steps

| Guiding Question | Data Collection Methods |
| --- | --- |
| Did the program achieve its intended outcomes or goals? | Outcome data, assessment reports |
| What lessons can improve future implementation? | Staff reflection, evaluation meetings |
| Are additional assessments or follow-ups needed to measure long-term impact? | Planning documents, follow-up surveys |
Discuss and Reflect on Your Findings
Share assessment results with faculty, staff, and stakeholders to gain multiple perspectives and ensure a comprehensive understanding of the data.
Some steps to help you reflect are included here; a brief benchmark-comparison sketch follows the list:
- Determine Audience and Disseminate Data: Identify which groups or individuals should receive the results, share the data with relevant stakeholders, and schedule a Data Dive session to collaboratively explore the findings.
- Analyze and Interpret Results: Compare outcomes with benchmarks to assess success, evaluate the effectiveness of any changes made, and identify patterns and contextual factors.
- Document Insights and Next Steps: Record key takeaways, recommendations, and decisions to guide action and inform the next steps in the assessment cycle.
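As a minimal sketch of the benchmark-comparison step, the snippet below flags each outcome as met or not met against a target. The outcome names, percentages, and 70 percent benchmark are hypothetical placeholders for your own criteria.

```python
# Hypothetical results: percent of students meeting each outcome.
results = {"SLO 1": 0.82, "SLO 2": 0.64, "SLO 3": 0.71}
benchmark = 0.70  # hypothetical target: 70% of students meet each outcome

# Flag each outcome as met or not met relative to the benchmark.
for outcome, pct in results.items():
    status = "met" if pct >= benchmark else "not met"
    print(f"{outcome}: {pct:.0%} ({status}; target {benchmark:.0%})")
```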
Document and Report
This is the stage to clearly communicate what occurred during the assessment process. Describe the activities performed, how data was collected, and the significance of the findings. Include information on how the data was analyzed, who was involved in discussions or meetings, and any suggestions or insights gathered. Indicate whether assessment criteria were met and, based on the results, determine whether changes are needed or if the assessment cycle can be concluded.
Reporting is completed in Watermark: Planning and Self-Study to ensure all documentation is centralized and accessible. As part of the Annual Assessment Report process, summarize outcomes, improvements suggested, and lessons learned from the assessment cycle. This report provides a comprehensive record for stakeholders and informs future planning and continuous improvement efforts.
Improvement Plan
Include an Improvement Plan to organize and track recommended changes. Each planned improvement should capture the following elements (a simple tracking template is sketched below):
- Type of Improvement: Identify the specific change or enhancement to be made.
- Responsible Party: Assign who will implement the improvement.
- Cost: Note any associated expenses or resources required.
- What Will Be Improved: Clearly describe what aspect of the program, service, or process will be addressed.
- Implementation Timeline: Specify when the improvement will be carried out.
This structured approach ensures accountability, supports resource planning, and provides a clear roadmap for applying lessons learned in the next assessment cycle.
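One lightweight way to maintain such a plan is a simple spreadsheet or CSV tracker. The sketch below writes one in Python; the field names mirror the elements above, and the sample entry is purely illustrative.

```python
import csv

# Fields mirror the Improvement Plan elements listed above.
FIELDS = [
    "type_of_improvement",
    "responsible_party",
    "cost",
    "what_will_be_improved",
    "implementation_timeline",
]

# Purely illustrative sample entry.
entries = [{
    "type_of_improvement": "Curriculum revision",
    "responsible_party": "Program coordinator",
    "cost": "Staff time only",
    "what_will_be_improved": "Alignment of the capstone rubric with SLO 2",
    "implementation_timeline": "Fall semester",
}]

# Write the tracker so it can be revisited each assessment cycle.
with open("improvement_plan.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=FIELDS)
    writer.writeheader()
    writer.writerows(entries)
```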
Questions Your Report Should Answer
A successful report will answer the following questions.
The Plan
- Was the plan achievable?
- Did activities occur as intended? If not, why?
- What changes can be made to improve implementation in the future?
Data
- How effective were the data collection tools?
- How well did the data analysis tools work?
- Is the data meaningful and actionable?
- What additional information is needed to support decisions?
Program Decisions
- What factors explain student successes?
- Why did outcomes differ from expectations?
- What is needed for students to reach the desired level of achievement?
- What changes to curriculum, courses, services, or programs could improve results?
Systems
- How are changes implemented and documented?
- Are current methods effective for managing and tracking improvements?
- Are lines of communication clear and efficient?
- Are documentation, storage, and retrieval processes functioning properly?