Edulytics Webinar Volume 1

January 20, 2026 · 5 min read

Edulytic Solutions: Is Your PA Program Overwhelmed by Data?

On November 18, I hosted a webinar focused on a challenge I’ve been watching PA programs grapple with for many years: how to manage, analyze, and use the growing volume of assessment data now expected under today’s accreditation environment. That webinar also introduced Edulytic Solutions, a new analytics platform my company has been developing to address this growing need.

As with all of my webinars, the goal was education first. These sessions are always free and designed to provide real value to PA program leaders. This blog series builds on that webinar, summarizing key ideas and placing them in a broader context.

Before we talk about solutions, though, it’s essential to clearly name the problem.

Mountains of Data

If you’re a PA program director or faculty leader, chances are you don’t feel short on data. You feel buried in it. When we think back to how relatively simple data collection was as recently as 15 years ago, recognizing our new normal can be mind-boggling. “Ah, remember the good old days of 2010, when you could just meet twice a year, catch everyone up, and consider it done?”

Now look at all this! Assessment results. Course evaluations. Clinical evaluations. Preceptor feedback. Attrition data. PANCE outcomes. Remediation tracking. Faculty workload. Student progress over time. Add in the ARC-PA Sixth Edition Standards, and expectations around documentation, longitudinal analysis, and continuous program improvement have only intensified.

The challenge facing PA programs today is not a shortage of information. It’s the widening gap between the amount of data programs are expected to collect on an ongoing basis and their realistic capacity to analyze, interpret, and use that data.

Faculty Are Asked to Do More With Less Time

Faculty in today’s PA programs are carrying extraordinary demands. Teaching, service, and scholarship requirements continue to grow, while students entering programs often require more individualized remediation and learning support. At the same time, many programs are stretched thin by the day-to-day realities of operations, leaving little capacity to implement the level of assessment now required to maintain compliance.

Ongoing assessment can be a heavy burden, particularly when there are too few people with the time or expertise to collect, compile, and analyze data effectively. Ultimately, the most significant constraint is time, and reclaiming it matters.

No More Educated Guessing

Historically, many program decisions were made using a combination of experience, professional judgment, small sample sizes, and anecdotal evidence. That approach worked when programs were smaller and expectations were lighter.

Today, it’s under strain. When program leaders are asked, “How do you know this change improved outcomes? What trends are you seeing over time? Where is faculty workload becoming unsustainable?” the honest answer is often: “We have a sense, but it’s hard to prove.”

That’s a reflection of how complex PA education has become.

The “Scattered Spreadsheet” Reality

In most programs, data can be found everywhere. We’ll find assessment data in one system, evaluations in another, various spreadsheets maintained by individual faculty members, and reports assembled manually for retreats or site visits. That’s nobody’s fault; for years, data like this was decentralized, with no compulsion to change the way things were done.

In a decentralized system, each dataset may make sense on its own. But in the big picture, all that data is disconnected and labor-intensive, and may be thoroughly outdated by the time anyone has the bandwidth to analyze it meaningfully.

Faculty may spend countless hours compiling, formatting, and reconciling information. They put in heroic effort, but it’s just not a sustainable model for continuous improvement.

Why Things Are Getting Worse

Several forces are converging at once:

  • growing program sizes

  • expanded accreditation expectations

  • increased emphasis on longitudinal analysis

  • mounting faculty workloads

Annual or episodic review once made sense. Increasingly, it doesn’t.

Programs are now expected to monitor patterns over time, identify trends early, and document how decisions are informed by data, all while continuing to teach, mentor, advise, and support students.

When assessment systems rely heavily on manual labor, the burden falls on the people who are already there, namely, the faculty. Over time, that burden can limit their innovation, draw them further away from the student-facing activities they love, contribute to burnout, and turn data from a “helpful resource” into a significant source of stress.

No profession grounded in care, mentorship, and education should be forced into this conundrum.

What’s the Solution?

If managing data feels harder than it used to, you’re not imagining it, and you’re certainly not alone.

As you know, I’ve been working with PA programs on assessment and accreditation challenges for many years. The problem hasn’t changed, but the tools available to address it thoughtfully and responsibly certainly have. In many ways, the dramatic increase in data expectations over the past decade has been driven by technological advances. Our ability to collect, store, and retrieve information has grown exponentially since those old-school days of 2010. It’s only reasonable that we now expect technology to help us manage and make sense of that data as well. I’m pleased to report that, at long last, the tools are catching up with the need.

So, stick around and see what the future will bring! In next week’s post, we’ll take a closer look at how the ARC-PA Sixth Edition Standards have accelerated these pressures, particularly around assessment expectations, workload, and longitudinal analysis. From there, we’ll explore what it means to move from episodic review to continuous insight and how emerging approaches, including analytics and AI, can support (though not replace) the human judgment at the heart of PA education.

I look forward to seeing you again!



Scott Massey

With over three decades of experience in PA education, Dr. Scott Massey is a recognized authority in the field. He has demonstrated his expertise as a program director at esteemed institutions such as Central Michigan University and as the research chair in the Department of PA Studies at the University of Pittsburgh. Dr. Massey's influence extends beyond practical experience: he has contributed significantly to accreditation, assessment, and student success. His innovative methodologies have guided numerous PA programs to ARC-PA accreditation and improved program outcomes, and his predictive statistical risk modeling has enabled schools to anticipate student results. Dr. Massey has published articles on predictive modeling and educational outcomes and has conducted longitudinal research on stress among graduate health science students. His commitment to advancing the PA field is evident through his participation in PAEA committees, councils, and educational initiatives.

© 2024 Scott Massey Ph.D. LLC