In 2018, a group of professionals from industry, medical education companies and specialty societies identified a dire need: To elevate the quality of the grant lifecycle in continuing professional/medical education. This realization sparked a series of innovative workshops, playfully dubbed the “Alligator Pit”. Over the next six years, these sessions aimed to transform how the CPD community approached grant proposals and outcomes. As we reflect on our efforts, we find ourselves asking: Did we achieve our goals?
This article examines the outcomes of our experiment, offering valuable insights for both grant applicants and industry supporters. By sharing our successes, challenges and lessons learned, we hope to contribute to the ongoing improvement of medical education and healthcare optimization.
The Alligator Pit
In the first year (2019), packed into a too-small room, small groups broke out to develop their own “grant proposals” based on real-world disease states. The following year, the year that shall not be named (2020), we again broke into groups to develop activities on the little-known faux diseases of Jediitis and Walkerexia (it was fun, I promise). In 2021, due to COVID, the session went virtual, but we picked it back up in person in 2022, this time focusing on outcomes. We changed up the themes based on reality TV, with fun concepts like “Shark Tank” (hence, “Alligator Pit”), “Chopped” and “Flip This House” to keep things interesting (or more difficult to understand, depending on how you look at it). Some years we held closed sessions so individuals would not come and go and disrupt the group work. Throughout each iteration, we focused on storytelling: Demonstrating how linking needs assessment, learning objectives, educational format and outcomes through instructional design leads to stronger proposals and, ultimately, more effective activities.
After the 2024 iteration, we conducted a post-mortem to evaluate our efforts, review the sessions, identify areas for improvement, and consider alternative approaches. If this were a morbidity/mortality review, we could probably describe the session as “a seven-year-old reptile suffering from exhaustion and burnout.” When we looked at the results of all those years of work, we found we had not moved the needle as much as we had hoped. In fact, just weeks after an activity, we would sometimes receive a grant application from someone who had attended a session, and the application was disappointingly unchanged from previous submissions.
Unfortunately, the alligator could not be treated and went to hospice.
We found some common issues for both educational providers and industry supporters to consider, including key reasons that the best practices discussed were not implemented.
Lessons Learned
Attendance Demographic Challenges
While the intended audience for the series was grant professionals with intermediate experience, which we defined as 3–5 years in the field, many attendees were new to the profession, and others had been in the field for decades. Participation ranged from 20 to more than 45 people per activity.
The majority of our participants were from medical education companies, and it was not always the same individual year over year, although we acknowledge that turnover is an issue in our industry. Sharing information within your organization is vital; lessons learned aren’t just for the learner, because it takes a village to successfully execute an educational activity.
During every session, there was at least one request of, “Just tell us what to put in our proposals so that we’re guaranteed to get funding.” Sorry, friends; there is no such thing.
Implementation Challenges
One recurring issue was that some attendees came from organizations that were siloed throughout the entire planning, implementation and outcomes process. Often, only grant developers attended without the education team, or the outcomes team was absent, or a mix of both. This gap made it difficult for some attendees to implement the message of incorporating instructional design — from needs assessment to outcomes reporting.
Some attendees expressed the attitude that organizational leadership did not appreciate the need to change how submissions were made because “they’ll just give us money anyway.” Industry supporters across the board are seeing budgets shrink, which forces them to be more strategic (and to decline more grants). Supporting a grant “just because we’ve always done it” or because “it’s the best out of a less-than-ideal lot” is no longer strategic or effective stewardship. Supporters are more particular out of necessity, not only from a budget perspective but from a best practices perspective.
As a corollary, many organizations submit an iteration of the same activity year over year without demonstrating, based on previous outcomes, why it still addresses an unmet educational need: Does the gap still actually exist? An activity for third-year residents is one thing; we obviously want them all learning the same things. But if you’re doing a subspecialty day, for example, just because it’s what you’ve always done, ask yourself why. Is your justification a vague description of why the education is needed, followed by a literature search alone to justify doing the project again? Do your learners express a want or need for what you’re putting on? And, perhaps more importantly, why does this matter for patients?
Something for industry supporters to note: According to several participants in the 2024 Alligator Pit, there is a perception that grantors only want to fund known entities rather than supporting a request from a “new” provider. This isn’t necessarily so, but providers need to show supporters how they differ from other educational providers, or how they are better positioned to address the actual unmet needs of their own learners. If a promise is made, keep it. If something goes wrong, learn from it and share the learnings. One group of participants during this same iteration proposed community leader involvement in their project, which got the session faculty very excited. However, when it came time for them to share their outcomes report, community leaders were nowhere to be found in the information. When asked, they said, “Well, we couldn’t quantify it, so we didn’t include it.” Failing to deliver on what you promised, or at least to explain why it didn’t happen, makes it less likely that your organization will be funded in the future.
Limited Team Training Investments
Make sure your team is trained in ACCME/EACCME/AOA, OIG and other regulatory requirements. Share information in the form of on-the-job training and ensure that everyone has the tools they need to succeed. The Alliance Basics course is a great place to start. The online Career Pathways curriculum is another.
Conclusion
As we close the book on the Alligator Pit experiment, it's clear that improving the quality of grant proposals in continuing professional development is an ongoing journey. The challenges we've identified — from demographic mismatches and implementation hurdles to the need for organization-wide engagement and up-to-date practices — underscore the complexity of this field. Yet, these challenges also present opportunities for growth.
To elevate grant proposal standards, we must cultivate continuous learning, embrace innovation, and prioritize patient impact. Whether seasoned or new to submitting grants, remember: Compelling proposals tell a cohesive story — from needs assessment to outcomes — and show a clear commitment to advancing healthcare. As we move forward, let's carry the lessons from the Alligator Pit with us, always striving to create educational programs that not only secure funding but genuinely improve patient care. The alligator may have retired, but the quest for excellence in medical education funding continues — and it's a journey we're all on together.
--
Disclosure
The information in this article is based on the authors’ experience as part of the Alligator Pit team and does not reflect the policies of their past, current or future employers.
Special thanks to some of the OG members of the team: Patty Jassak, Ann Marie DeMatteo and Gail Radecki.