IPEDS Surveys Reporting: ‘Tis the Season (Part 2)

12/05/2017


“Duck Season! Rabbit Season! IPEDS Season!”

While duck seasons and rabbit seasons come and go, it always seems to be IPEDS season. So, we’re here to continue our examination of the IPEDS survey process. In Part 1 we looked at the challenges encountered in each step of the process. Now we’re going to dive into ways you can avoid or remedy those pain points.

So that we’re comparing apples to apples, let’s reiterate and then follow the steps in the IPEDS process.

IPEDS Process Revisited

Once again, the IPEDS process can generally be thought of as comprising the following steps:

  1. Gathering the data
  2. Reviewing the results
  3. Correcting the data errors
  4. (Occasionally) retooling the collection method
  5. Restarting the process until satisfactory results are achieved
  6. Submitting the data & explaining significant differences

Gathering the Data

The institution’s reporting tool usually handles the collection of the data, though sometimes multiple tools are required to pull the necessary data from multiple data sources. Whatever the scenario, Institutional Research must first gather the needed IPEDS survey data into one place before beginning the review and validation process. Doing so allows IR to better manage that entire process.

This will eliminate the headache-inducing procedure of working with multiple spreadsheets or making sure the right people have access to the right web portals. And it will help ensure that IR personnel, and those in other functional areas, are reviewing and validating (or updating) the same survey data. In addition, making it easier for other functional areas to participate in the process will lead to greater buy-in from, and improved working relationships with, those areas.
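To make that concrete, here is a minimal sketch (in Python, using pandas) of pulling extracts from two systems into a single working dataset. The file names, column layout, and “source” tag are hypothetical assumptions, not features of any particular reporting tool; your own extracts will dictate the actual inputs.

```python
import pandas as pd

# Hypothetical extracts: one from the student information system, one
# from a separate continuing-education system.
sis = pd.read_csv("sis_enrollment_extract.csv")
cont_ed = pd.read_csv("continuing_ed_extract.csv")

# Tag each record with its origin so reviewers can trace any figure
# back to the system it came from.
sis["source"] = "SIS"
cont_ed["source"] = "ContinuingEd"

# One combined dataset for IR to review and validate in a single place.
enrollment = pd.concat([sis, cont_ed], ignore_index=True)
enrollment.to_csv("ipeds_fall_enrollment_working.csv", index=False)
```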

Reviewing the Results (and Tracking the Survey)

Putting the entire burden of IPEDS reporting accuracy on one institutional researcher is neither fair nor practical. So, while the researcher may be responsible for compiling the metrics, there must be a coordinated effort with others relevant to that survey to review those metrics and help validate the results. Each reviewer brings a different institutional viewpoint and diverse knowledge to the process. Consequently, the more people who are involved, the greater the chance the survey results will be accurate.

One way to coordinate such involvement is to develop a central method of tracking who is supposed to review the survey results, the status of those reviews, and any comments or responses a reviewer provides. Knowing who has – or hasn’t – looked at the survey results, as well as seeing all relevant communications in one place, would enable the institutional researcher to more effectively usher each IPEDS survey through the review process.

Furthermore, this central method permits IR to track the survey itself, not just the individual reviews. Knowing where an IPEDS survey is in the process is a great help to the institutional researcher, and to others who depend on its completion. As you know, the surveys are not trivial and require a lot of work; you should not have to track everything via spreadsheets and post-it notes.
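As an illustration, here is a minimal sketch in Python of what such a central tracker might record. The fields, statuses, and names are assumptions for the sake of the example; a real implementation would live in a shared tool or database rather than a script.

```python
from dataclasses import dataclass, field
from datetime import date
from typing import List, Optional

@dataclass
class ReviewAssignment:
    reviewer: str                  # who is supposed to review the survey
    area: str                      # e.g., "Registrar", "Financial Aid"
    status: str = "not started"    # "not started" / "in progress" / "complete"
    comments: List[str] = field(default_factory=list)
    completed_on: Optional[date] = None

@dataclass
class SurveyTracker:
    survey: str                    # e.g., "Fall Enrollment"
    reviews: List[ReviewAssignment] = field(default_factory=list)

    def outstanding(self) -> List[str]:
        """Reviewers who haven't finished -- the people IR needs to nudge."""
        return [r.reviewer for r in self.reviews if r.status != "complete"]

# Usage: one place to see who has (or hasn't) looked at the survey.
tracker = SurveyTracker("Fall Enrollment", [
    ReviewAssignment("J. Smith", "Registrar", status="complete",
                     completed_on=date(2017, 11, 20)),
    ReviewAssignment("A. Lee", "Financial Aid"),
])
print(tracker.outstanding())  # ['A. Lee']
```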

Correcting the Data Errors

Errors will occur. After all, people are involved in the process. (To err is human!) But if you document the errors, you’ll reduce the chance of those same mistakes appearing in future survey results. One way to facilitate this documentation is to set up a communication hub that lets everyone involved record each error found during review, who found it, and the necessary fix (sketched below).

Without a method of tracking necessary changes, reviewers may see multiple versions of the survey results without knowing which one is correct, what specific sections need to be checked, what has changed, or why a change was made. Everyone could end up working from inaccurate data and comments that make no sense. The process could quickly dissolve into chaos.

Please note: when an error is found, the correction should be made in the data source (e.g., the institution’s ERP), as that system is the source of truth.
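Here is a minimal sketch of the kind of record such a communication hub might keep for each error, including whether the fix has made it back to the source system. All field names and survey details are hypothetical.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class ErrorRecord:
    survey: str                  # which IPEDS survey the error appeared in
    section: str                 # what part of the survey to re-check
    found_by: str                # who found the error
    found_on: date
    description: str             # what was wrong
    fix: str                     # the necessary correction
    corrected_at_source: bool = False  # has the ERP been updated yet?

# Reviewers append records as they find problems; IR can see at a glance
# what changed, why, and whether the source system has been updated.
error_log = [
    ErrorRecord("12-month Enrollment", "Part A", "A. Lee", date(2017, 11, 22),
                description="Dual-enrolled students were counted twice",
                fix="De-duplicate on student ID before aggregating"),
]
```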

Retooling the Collection Method

Software programs evolve over time. Data that was stored in one place is moved to another. A new module is added, or a new database is brought online. In addition, we now keep more refined information than we did in the past.

The way we collect data needs to change as well, and these changes should reflect the evolution of the data sources. Communication between Institutional Research and the data stewards is therefore essential. An institutional researcher must be able to take such changes into account, because failing to do so could severely slow down the survey process. And failing to document these changes will make it harder to compare survey results across years.
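One lightweight way to note such changes is a per-year mapping from each IPEDS field to wherever its data currently lives. The table and column names below are purely illustrative assumptions.

```python
# Hypothetical source locations for one IPEDS field, documented per year.
FIELD_SOURCES = {
    2016: {"race_ethnicity": "legacy_db.students.ethnicity_code"},
    # A new module came online in 2017 and the field moved (and was refined).
    2017: {"race_ethnicity": "new_module.demographics.ipeds_race_ethnicity"},
}

def source_for(field: str, year: int) -> str:
    """Where a given IPEDS field was pulled from in a given year --
    essential context when comparing survey results across years."""
    return FIELD_SOURCES[year][field]

print(source_for("race_ethnicity", 2017))
```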

Restarting the Process

If, for whatever reason, errors are found or changes are made to the survey results, the process must start again. Remember, corrections to the data must be made at the source. Once those changes are made there, they’ll need to be pulled back into the central location to be reviewed and validated again.

Consequently, sufficient time must be allotted for such restarts. Having a central method from which to manage and monitor the process will certainly help expedite things, but even that won’t compensate for poor planning. You know the submission windows, so plan accordingly!

Submitting the Data & Explaining Significant Differences

The review and validation process is complete and it’s time to submit your IPEDS survey to NCES. Do you remember all the corrections that were made? How about the reasons for those corrections?

If you managed the process using the methods described above, you will have tracked the what and why of all changes made. You’ll now be able to provide clear explanations in your submission, lessening the chance of any penalty being levied against your institution.
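For instance, a tracked change log like the one sketched earlier can be turned into draft explanation text for the submission. The record fields here are the same hypothetical ones used above.

```python
# Hypothetical change-log entries, mirroring the ErrorRecord sketch above.
changes = [
    {"survey": "12-month Enrollment", "section": "Part A",
     "what": "Dual-enrolled students were counted twice",
     "fix": "De-duplicated on student ID before aggregating"},
]

def draft_explanations(changes) -> str:
    """Summarize each tracked change as a sentence IR can adapt for NCES."""
    return "\n".join(
        f"{c['survey']}, {c['section']}: {c['what']}; {c['fix']}."
        for c in changes
    )

print(draft_explanations(changes))
```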

Conclusion

Through organization, communication, and collaboration, IPEDS data collection and review can be efficient, effective, and well understood. Implementing the central tracking methods and communication hubs discussed above is what makes that possible.

No longer does IPEDS season have to be met with dread by Institutional Research. There are ways to avoid the pain and declare “open season on IPEDS,” knowing you’ll spend less time on these reporting tasks and more time doing the strategic research you were meant to do.


Peter Wilbur is a Strategic Solutions Manager on the Client Experience Team at Evisions. Peter graduated from Northern Arizona University with a computer science degree in 1984. After working in several industries and with numerous companies, he joined Evisions in 2010 working on the support desk before moving to Professional Services, where he eventually came to serve as Professional Services Manager. Peter is a member of the Project Management Institute and a PMP. He enjoys spending time with his German Shepherds.
