As a university administrator you likely have at least some responsibility for reporting data — whether to an accrediting body like the American Bar Association, a federal agency like the Department of Education, the institutional research department at your own university, or any number of other agencies, departments, and survey collectors.

As someone who reports data to external stakeholders, you have a responsibility to take this duty seriously and think beyond the numbers.

Reporting incorrect or misleading data can really come back to bite you, and what appears to be a straightforward process may not always be so. Furthermore, there’s a good chance you aren’t a statistician and haven’t had formal training on this topic.

So where does this leave you?

Higher education administrators, especially those responsible for consumer information such as admissions and financial aid data, employment outcomes, enrollment figures, and attrition/graduation rates, face tremendous pressure to report accurate data.

In some cases, you may have clear instructions on how to provide what is being asked for (e.g., how many decimal places to include when reporting a GPA, or which rounding rule to apply in a percentile calculation), and other times you will have the freedom to use your own judgment. You may also err without knowing it because something in the instructions was unclear or misleading, or because of some other honest mistake on your part.
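
Rounding alone can introduce this kind of discrepancy. As a minimal illustration in Python (the GPA value here is hypothetical), two widely used rounding conventions report the very same number differently:

```python
from decimal import Decimal, ROUND_HALF_UP, ROUND_HALF_EVEN

# A hypothetical GPA that sits exactly on a rounding boundary.
gpa = Decimal("3.745")

# "Round half up" (what most people do by hand) reports 3.75.
print(gpa.quantize(Decimal("0.01"), rounding=ROUND_HALF_UP))    # 3.75

# "Round half to even" (banker's rounding), which Python's built-in
# round() also uses, reports 3.74.
print(gpa.quantize(Decimal("0.01"), rounding=ROUND_HALF_EVEN))  # 3.74
```

If the reporting instructions don't specify a rule, two equally careful administrators can submit different figures from identical data.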

There can even be nuances in the method you use to calculate a statistic.

Take this example: a colleague of mine, Elisabeth Steele Hutchison, Director of Admissions & Special Projects at the University of Hawaii, recently sparked a fantastic discussion among law schools after discovering that different software programs calculate percentiles (such as the 25th and 75th percentiles) using different algorithms, and thus often produce different results! You might naturally assume that using a standard statistical function in a computer program ensures accuracy, yet if you calculated a percentile in Excel in one instance and in SPSS or Crystal Reports in another, you would likely arrive at different answers. (Here's a great article by Tom Dierickx on this topic; thanks, Elisabeth!) You could therefore provide stakeholders with differing information while using what most people would consider credible methods. How's that for unsettling?
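
If you want to see the discrepancy for yourself, here is a minimal sketch in Python using NumPy (the scores are hypothetical, and the method keyword requires NumPy 1.22 or newer). NumPy exposes several recognized percentile algorithms through a single parameter, so the same data can yield several defensible answers:

```python
import numpy as np

# Hypothetical LSAT scores for six admitted students.
scores = np.array([155, 158, 160, 163, 166, 170])

# Four recognized percentile algorithms, all defensible, all different.
# 'linear' matches Excel's PERCENTILE.INC; 'weibull' matches the
# (n+1)-based approach of PERCENTILE.EXC and some statistics packages.
for method in ("linear", "weibull", "lower", "nearest"):
    p25 = np.percentile(scores, 25, method=method)
    p75 = np.percentile(scores, 75, method=method)
    print(f"{method:>8}: 25th = {p25:g}, 75th = {p75:g}")
```

From the same six scores, the reported 25th percentile ranges from 157.25 to 158.5, and the 75th from 163 to 167. None of these answers is "wrong"; they simply follow different conventions, which is exactly why it pays to know (and document) which one your tools use.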

This topic gets too little discussion within schools, among colleagues, and at industry organizations. I applaud those willing to tackle the challenges of data reporting by starting these conversations and sharing their experiences.

Some questions to ponder:

1) Do you feel comfortable raising questions and participating in discussions about data reporting and integrity at your institution? Among your colleagues? At industry events?

2) When you report data, do you feel personally accountable for its accuracy, or do you see yourself more as just the messenger?

3) When something in the data you are reporting appears "off" (e.g., the report on your freshman class shows zero students from the Midwest, yet you could swear you just met a group of students from Indiana during orientation week), do you pause, trace the data to its source, and verify that it is correct? Or do you let it pass, assuming you must have been mistaken? Or, worse, do you fail to notice when things seem "off" in the first place? In other words, are you reading and comprehending the data before you pass it along?

4) Do you read the fine print when asked to report data? Little words like "as of," "through," "only," "including," and "excluding" can make all the difference, and they warrant your careful attention.

5) If you report the same data annually, do you take the time to thoroughly read the instructions each year, or do you make assumptions based on “how it’s been done in the past”?

Schools should set clear expectations for their administrators and offer resources and training around these topics to ensure the highest integrity in data reporting across the institution. Defining what your institution believes "data integrity" entails might be a good place to begin.

There is no reason to assume that all personnel in your organization automatically know how to handle and report data properly. Leaving it all to a single person places an incredible amount of pressure on them; it is wise to make sure multiple people know where to find key data and understand the proper way to report it. If leaders aren't providing clear expectations, training, resources, and support, institutions risk unpleasant surprises down the road.

There should be a clear, simple process for employees at every level to come forward safely when they have concerns about the accuracy of data their department or institution is releasing to stakeholders. If you learned that a superior or colleague was knowingly reporting false data, would you know what to do?

To maintain (and improve) integrity and ethics in data reporting, leaders need to cultivate a culture within their institution that supports employees trying to do right. Directors need to feel comfortable asking for help when they don't know or understand something (how to perform a certain calculation, or how to weed "exclusions" out of their data, for example); staff at all levels need to know whom to talk to if they feel uncomfortable with something that's going on; colleagues should feel safe discussing nuances and mistakes in data reporting, always in an effort to improve; and leaders of institutions need to make it clear that they value data integrity and expect the same throughout their organization.