Low response rates are a common challenge in randomized controlled trials that base an outcome measure on data collected through a survey or other instrument. In this paper, we describe the process of developing a SAS program to create response rate reports in the context of an evaluation of a teacher professional development program. In this education study, a literacy assessment is hosted online by a testing company, but the project team must continually follow up with participating teachers to ensure their students take the assessment during the testing window. Each week, and sometimes ad hoc, the team (nonprogrammers) requires a generated report of test response rates calculated at the treatment, course, teacher, school, and district levels. This report is used to determine whether additional follow-up is needed and whether a given course has reached an acceptable percentage of completed assessments.
In addition to describing the development of this process, the paper briefly highlights the macro techniques used to automate the response rate calculations and explains how we use the Dynamic Data Exchange (DDE) method to output these rates into an accessible, easy-to-update Microsoft Excel spreadsheet. Lastly, the paper shares successes and challenges encountered along the way, lessons learned in quality assurance and data validation, and possible improvements.