AI Tools for Fundraisers: Your Ethical Action Plan

More and more fundraisers want to use AI to generate impact. But many are still trepidatious. Here are some AI tools for fundraisers to power donor engagement while keeping ethics and donor privacy as the top priority.

“Does your organization allow you to use a generative AI tool such as ChatGPT, Claude, or Gemini?”

Earlier this year, we polled a large audience at a CASE conference, asking this question.

In March, just over 50% of attendees answered “yes.”

Four months later, during a recent webinar on AI and Advancement, over 84% of attendees answered “yes.”

That’s a fast-moving upward trend. And a testament to the innovative momentum stirring in the higher ed fundraising community.

We love to see this. Innovation is kind of our thing. AND, given the rise in fundraisers using AI, it is more important than ever to call attention to the ethical and responsible use of AI tools for fundraisers.

Tackling AI and donor data privacy isn’t for the faint of heart. Clark University’s Joe Manok is among a small group of bold advancement leaders who have embraced the risks and rewards of AI in fundraising. (Joe and his team have recently launched an AI Microcredential for non-profits to bring more folks on board the innovation train.) 

Recently, Joe joined our EverTrue President Brent Grinna for a conversation about navigating risk and embracing the future of fundraising. You can watch the full (free!) webinar here, or catch a preview below.

 "AI literacy, if you're not investing in it right now, you really need to pay attention to that. Ethics and governance really build trust.  We have to show donors that we take their data very seriously.”
Joe Manok
Vice President for University Advancement, Clark University

Ethics Frameworks: Maturity and Readiness

Donor data privacy is serious stuff. Under Joe’s guidance, the Clark team has created two structural frameworks to assess their risk and readiness to adopt and implement new AI tools: an Ethics Maturity framework and an Ethics Readiness framework.

These two frameworks help to assess, guide, and track progress from informal awareness to fully auditable use of AI tools for fundraisers.

Currently, the Clark Advancement office is operating between Level 3 and Level 4 on these frameworks, meaning they:

  • Have a formal AI policy in place.
  • Use only publicly available information with free AI tools like ChatGPT.
  • Have leveled up their AI usage in paid, secure AI platforms (they are power users of our Signal platform).
  • Conduct annual audits of their AI usage.
  • Are actively seeking ways to implement more AI within secure, policy-compliant boundaries.

These two frameworks are simple but effective. They help the Clark team identify their current state, improve ethical oversight, track their AI integration progress, and plan for future implementations.

And they are shareable visuals that give transparency to university leadership and all involved team members – so everyone knows the risks and the rewards of augmenting fundraising with AI tools.

AI Risk and Readiness Matrix

Using those two AI frameworks (ethics and readiness), the Clark team plotted a Risk and Readiness Matrix to visualize their AI strategy. The four quadrants are:

  1. Vulnerable (Low Ethics, Low Readiness)
  2. Idealistic (High Ethics, Low Readiness)
  3. Aggressive (Low Ethics, High Readiness)
  4. Future Ready (High Ethics, High Readiness)

In 2022, Clark was in the Vulnerable quadrant. By 2025, they’ve moved toward Future Ready. Their ethical considerations and technological readiness advance in tandem – which is the goal for any team.

A super-important point here: there is a need to evaluate not only the ethics of using AI but also the ethics of not using it. Yes, there are risks to consider. But there are tools and frameworks in place to counter those risks, keep donor privacy paramount, and raise more support for life-changing initiatives.

As Joe rightly points out, failing to adopt more efficient tools could be a disservice to donors and your institution.

 "I would encourage you to pause and think about the ethics of NOT using AI. What are you telling your donors when you have an opportunity to be 2x or 3x as efficient – and you decide not to do that?"
Joe Manok
Vice President for University Advancement, Clark University

The Fundraiser AI Action Plan

Change starts with one person. Think of tasks on your plate that could be automated or improved with the help of AI. (Lacking ideas? How about activity reminders? Prospect summaries? Outreach templates?) 

And then get working on a plan to bring the future of fundraising to your shop. Here is a tried-and-true 6-step plan to get started. 

  1. Write your personal AI position statement
    Reflect on your personal stance toward AI. What are your fears? Where do you see the biggest opportunities? You don’t have to share this with anyone (though it’ll come in handy for tech pitches).
  2. Build a working group
    Gather stakeholders from different teams who will help assess and shape a possible AI strategy.
  3. Secure an AI mandate from leadership
    Getting the green light from the top is important. Seek a formal endorsement or direction from leadership to pursue the use of AI.
  4. Launch free pilots
    Start small with publicly available tools (e.g., ChatGPT, Claude, Gemini) for non-sensitive data to explore value and build internal buy-in.
  5. Set data protocols
    Develop guidelines around what data can (and cannot) be shared with free AI tools.
  6. Pursue strategic partnerships
    Up-level with paid, AI-aligned vendors (like EverTrue) to ensure the most effective AI adoption – one that is also safe and secure.

Let Signal do the data digging for you. Book your demo to see how fundraisers use Signal's AI features to connect with the right donors, at the right time.

See it in action!
