Most UX work on AI products starts at the interface — how the user asks the question, how the answer is displayed. This project went a layer deeper: working with data engineers to ensure the AI had the context to answer correctly in the first place.
That meant reviewing every field in the giving data model — transactions, recurring schedules, donor records — understanding what each field meant to an administrator, how they'd talk about it, and what kind of queries they'd expect it to answer.
Labels, descriptions, and semantic context gave the AI the right framing — so when an administrator asks "show me lapsed donors from last quarter," the system understands what "lapsed" means in church giving, not just as a generic data concept.
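A minimal sketch of what that semantic context can look like attached to a field. The structure and names here are hypothetical, not the actual schema — the point is that the label, description, and synonyms carry the domain meaning the AI needs to resolve a term like "lapsed":

```python
# Hypothetical semantic annotation for one field in the giving data model.
# Keys, values, and the 90-day window are illustrative assumptions.
LAPSED_DONOR = {
    "field": "donor_status",
    "value": "lapsed",
    "label": "Lapsed donor",
    "description": (
        "A donor who gave regularly in the past but has no transactions "
        "in the lookback window — a re-engagement opportunity, not churn."
    ),
    "synonyms": ["stopped giving", "inactive donor", "hasn't given lately"],
    "default_lookback_days": 90,
}
```

With context like this in place, a natural-language query layer can map an administrator's phrasing onto the right field and filter instead of guessing at a generic meaning.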
I also worked on calculated fields — experimenting with custom metrics and suggesting them to the data engineering team based on how administrators actually think about giving performance. UX knowledge informed the data model, not just the other way around.
| Term | System definition | What it means to an administrator |
| --- | --- | --- |
| Lapsed donor | No transaction records in defined period | Someone who used to give regularly but has stopped — a re-engagement opportunity |
| Recurring schedule | Automated payment plan with frequency and amount | Predictable giving — the backbone of a church's financial planning |
| Fund allocation | Transaction tagged to a specific designated fund | Where donors are directing their generosity — often tied to specific campaigns or ministries |
| Net giving | Total transactions minus refunds in period | The real number — what actually came in after any reversals or failures |
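Two of the metrics described above lend themselves to simple calculated-field sketches. This is an illustration of the logic, not the production implementation — the record shape and the 90-day lapse threshold are assumptions:

```python
from datetime import date, timedelta

def net_giving(transactions):
    """Net giving: total gifts minus refunds in the period.

    Assumes each transaction is a dict with a "type" ("gift" or "refund")
    and a positive "amount" — an illustrative shape, not the real schema.
    """
    gifts = sum(t["amount"] for t in transactions if t["type"] == "gift")
    refunds = sum(t["amount"] for t in transactions if t["type"] == "refund")
    return gifts - refunds

def is_lapsed(last_gift_date, today, lookback_days=90):
    """A donor is lapsed if their last gift falls outside the lookback window."""
    return (today - last_gift_date) > timedelta(days=lookback_days)
```

Encoding metrics this way keeps the administrator's mental model ("what actually came in") and the system's arithmetic in one place, which is exactly the gap the calculated-field work aimed to close.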