Dec 9, 2024

Analyzing a Scout Troop’s Financial Records with AI: A Journey in Clarity and Collaboration

Reconciling my son’s scout troop financial register with bank records seemed like a perfect task for AI. It’s repetitive, data-driven, and follows clear rules. Yet, my experience taught me that AI isn’t a magic wand—it’s a tool that requires clarity, iteration, and human guidance.

Initial Expectations vs. Reality

The task involved comparing a financial register exported as a PDF with an Excel sheet of bank records. I needed to identify the mismatches behind a reconciliation discrepancy. After converting both files to CSV, I gave the AI detailed instructions: match on amount and on date (within seven days), and flag unmatched rows. The AI produced results quickly, but with issues:

- Duplicates appeared in matched rows.

- Unmatched transactions didn’t make sense.

- The output format was hard to validate.
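In hindsight, the matching I had asked for can be sketched roughly like this (the file layout and column names are hypothetical). Note that nothing stops one bank row from matching several register entries, which is one way the duplicates crept in:

```python
import csv
from datetime import datetime

def load_rows(path):
    # Assumes each CSV has "date" (YYYY-MM-DD) and "amount" columns.
    with open(path, newline="") as f:
        return [
            {"date": datetime.strptime(r["date"], "%Y-%m-%d"),
             "amount": float(r["amount"])}
            for r in csv.DictReader(f)
        ]

def naive_match(register, bank, window_days=7):
    # The rule as I stated it: same amount, dates within seven days.
    # A bank row can be matched more than once -- duplicates by design.
    matches, unmatched = [], []
    for reg in register:
        hits = [b for b in bank
                if b["amount"] == reg["amount"]
                and abs((b["date"] - reg["date"]).days) <= window_days]
        if hits:
            matches.extend((reg, b) for b in hits)
        else:
            unmatched.append(reg)
    return matches, unmatched
```

Two register entries of $25 a day apart will both pair with a single $25 bank deposit, exactly the kind of duplicate I was seeing.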

It became clear that the AI wasn’t operating with the intuitive expertise I’d expected. Frustrated, I paused the project.


Shifting Mindsets: Lessons from *Co-Intelligence*

Reading Ethan Mollick’s *Co-Intelligence: Living and Working with AI* helped me understand my struggle. It was the first AI book that helped me recognize what AI cannot do. Before, I had assumed AI could behave like a seasoned accountant. Instead, I needed to:

1. Interact dynamically with the AI.

2. Normalize and clean data for precise comparisons.

3. Break the problem into smaller, phased tasks.


Adapting the Workflow

With this new mindset, I approached the problem iteratively:

1. Dynamic Interaction: I started asking the AI to review files and ask clarifying questions. For example, I prompted: “You are an accountant. Review this CSV and ask one question at a time to understand it.”

2. Normalization: We aligned date and amount formats, converting withdrawals to negative amounts for consistency.

3. Mapping Table: Transaction comments in the register mentioned scout names, while the bank records had parent names. By extracting last names into a new column, I simplified the matching process.

4. Transaction Types: To reduce complexity, I directed the AI to split files by transaction type (e.g., checks, Zelle, debit cards). This adjustment improved the accuracy of matches.
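Put together, steps 2 through 4 above amount to a small normalization pass. A minimal sketch, assuming hypothetical column names and a last-word-of-comment heuristic for extracting surnames:

```python
from collections import defaultdict

def normalize_row(row):
    # Step 2: uniform amounts -- strip currency formatting, sign withdrawals negative.
    amount = float(row["amount"].replace("$", "").replace(",", ""))
    if row.get("kind", "").lower() == "withdrawal":
        amount = -abs(amount)
    # Step 3: pull a last name out of the free-text comment to bridge
    # scout names (register) and parent names (bank records).
    comment = row.get("comment", "")
    last_name = comment.split()[-1].strip(".,") if comment else ""
    return {"date": row["date"], "amount": amount,
            "last_name": last_name, "type": row.get("type", "other")}

def split_by_type(rows):
    # Step 4: reconcile checks, Zelle, and debit cards separately.
    buckets = defaultdict(list)
    for row in rows:
        buckets[row["type"]].append(row)
    return buckets
```

Splitting by type shrinks each matching problem, so a wrong pairing in one bucket can no longer cascade into the others.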


Key Insights

The iterative process revealed:

- AI lacks domain expertise. It doesn’t “just know” rules like “never match the same transaction twice.”

- Self-debugging signals complexity. When the AI struggled to reconcile records, I realized the task needed breaking down further.

- Human assumptions matter. Recognizing biases (e.g., overestimating AI’s capabilities) is crucial for effective collaboration.
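The first insight above can be made concrete: one-to-one matching has to be stated explicitly, for instance with an "already used" set. A sketch of that rule, which I had wrongly assumed the AI would apply on its own (row layout is hypothetical):

```python
from datetime import datetime

def match_once(register, bank, window_days=7):
    # Greedy one-to-one matching: each bank row may be consumed at most once.
    used = set()
    pairs, unmatched = [], []
    for reg in register:
        hit = next(
            (i for i, b in enumerate(bank)
             if i not in used
             and b["amount"] == reg["amount"]
             and abs((b["date"] - reg["date"]).days) <= window_days),
            None,
        )
        if hit is None:
            unmatched.append(reg)
        else:
            used.add(hit)
            pairs.append((reg, bank[hit]))
    return pairs, unmatched
```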


Broader Reflections

The experience raised larger questions: How do SOC teams prevent hallucinations and evaluate AI confidence in large-scale log analysis? I also reflected on my own biases—expecting AI to replace expertise rather than augment it.


Practical Takeaways

1. Dynamic Conversations: Use prompts like: “You are [expert role] helping with [problem]. Ask questions one at a time to explore solutions.”

2. Engineer’s Mindset: AI interprets problems like a developer. Clarify expectations, iterate, and ask it to restate what it heard so you know you’re aligned. Consider asking it to play Product Manager to surface better questions about the problem you’re trying to solve.

3. Simplify Tasks: View AI self-debugging as a signal to break problems into phased components.

AI isn’t a miracle worker, but it can amplify your efforts when guided thoughtfully. This journey sharpened my own skills in communication, problem-solving, and critical thinking. For anyone exploring AI, start with curiosity and a willingness to collaborate—it’s a skillset that applies to humans as much as machines.

