From RAG-based user guidance to API-driven financial copilots and AI-powered automations
Executive summary
In a recent proof of concept (PoC), we explored how to create custom AI bots that sit alongside Oracle ERP Cloud to help end users find guidance, retrieve transactional insights, and automate high-volume finance processes. The PoC focused on three complementary patterns: (1) a Retrieval-Augmented Generation (RAG) bot for user manuals and how-to guides (BU‑BOT), (2) an API-connected bot for conversational access to financial data (FinBOT), and (3) AI-assisted automations for manual journals, lockbox, and invoice account distribution. This post summarizes the architecture, the build approach, what worked well, and the practical lessons learned when taking AI from a demo to an operational workflow.
Why custom AI bots matter in ERP programs
Modern ERP transformations often succeed or fail based on usability and adoption. Even with well-designed processes, users still spend time searching for guidance, waiting on support teams for “simple” data questions, and performing repetitive activities like coding invoices or creating manual journals.
Custom AI bots can reduce that friction by providing:
- Self-service guidance through natural-language Q&A over approved documentation (policies, user manuals, how-to guides, and knowledge articles).
- Conversational insights to retrieve transaction status, open items, supplier/customer details, and KPIs via controlled APIs.
- Automation with human control to eliminate repetitive work while keeping approvals, confidence thresholds, and auditability intact.
Pattern 1: BU‑BOT — RAG-based guidance over user documentation
The first bot pattern is a “knowledge companion” designed for functional users. BU‑BOT indexes approved documents uploaded to Oracle and answers questions by retrieving the most relevant passages and using an LLM to craft an explanation. In practice, this is Retrieval-Augmented Generation (RAG): the bot searches a curated corpus, brings back the best matches, and then generates an answer grounded in those sources.
How BU‑BOT works (high level)
- Curate and upload content: user manuals, step-by-step guides, and internal knowledge articles.
- Index documents and chunk content for retrieval.
- At question time, retrieve top relevant chunks (semantic search).
- Generate the response using retrieved evidence, returning links/sections back to the user.
- Continuously improve content quality and coverage based on real questions.
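The retrieve-then-generate loop above can be sketched with a toy lexical retriever. This is a minimal illustration only: a production BU‑BOT would use embeddings and a vector index instead of keyword overlap, and the prompt-building step stands in for the actual LLM call, which is not shown.

```python
from dataclasses import dataclass

@dataclass
class Chunk:
    doc: str
    section: str
    text: str

def retrieve(question: str, corpus: list[Chunk], k: int = 2) -> list[Chunk]:
    """Toy lexical retrieval: rank chunks by keyword overlap with the question.
    A real RAG pipeline would use semantic (embedding-based) search instead."""
    q_terms = set(question.lower().split())
    scored = [(len(q_terms & set(c.text.lower().split())), c) for c in corpus]
    scored.sort(key=lambda s: s[0], reverse=True)
    return [c for score, c in scored[:k] if score > 0]

def build_prompt(question: str, evidence: list[Chunk]) -> str:
    """Ground the generation step in retrieved evidence and keep the
    source references so they can be returned to the user."""
    sources = "\n".join(f"[{c.doc} / {c.section}] {c.text}" for c in evidence)
    return f"Answer using ONLY the sources below.\n{sources}\n\nQuestion: {question}"

# Hypothetical two-document corpus for illustration.
corpus = [
    Chunk("AP User Manual", "Invoice Entry",
          "To enter an invoice navigate to Payables and choose Create Invoice"),
    Chunk("Close Guide", "Period Close",
          "Run the Payables period close checklist before closing the period"),
]
hits = retrieve("How do I enter an invoice in Payables?", corpus)
prompt = build_prompt("How do I enter an invoice in Payables?", hits)
```

Returning the `[doc / section]` references alongside the generated answer is what keeps the response grounded and verifiable by the user.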
Practical tips
- Start with a narrow domain (e.g., Payables period close, invoice entry, or procurement receipts) before expanding.
- Use clear document ownership and a publishing workflow—RAG quality is directly tied to document quality.
- Log questions that the bot cannot answer; this becomes your roadmap for content improvement.

Pattern 2: FinBOT — an API-connected financial copilot
Where BU‑BOT answers “how do I…?”, FinBOT answers “what is happening right now?”. FinBOT connects to Oracle ERP Cloud data via controlled REST APIs and presents the results conversationally. The goal is not to replace reporting, but to accelerate the most common inquiries (top overdue invoices, supplier balances, status checks) and reduce reliance on technical teams for everyday questions.
Core building blocks
- Business objects mapped to specific REST endpoints (read-only where possible).
- Worker agents that handle a specific intent (e.g., invoices, suppliers, payments, journals).
- A supervisor agent that routes questions to the correct worker agent and enforces guardrails.
- Deep links that take the user directly into the relevant Oracle page for follow-up actions.
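The supervisor/worker split above can be sketched as a simple intent router. The keyword matching and the worker stubs are illustrative assumptions; in the PoC the routing would be done by the supervisor agent and each worker would call its mapped Oracle REST endpoint.

```python
from typing import Callable

# Worker agents: each handles exactly one intent. These are hypothetical
# stubs; real workers would call Oracle ERP Cloud REST endpoints.
def invoices_worker(q: str) -> str: return "invoices: top overdue list"
def suppliers_worker(q: str) -> str: return "suppliers: balance summary"
def payments_worker(q: str) -> str: return "payments: status lookup"

ROUTES: dict[str, Callable[[str], str]] = {
    "invoice": invoices_worker,
    "supplier": suppliers_worker,
    "payment": payments_worker,
}

def supervisor(question: str) -> str:
    """Route the question to the first matching worker; refuse anything
    outside the allowed intents, which acts as a simple guardrail."""
    q = question.lower()
    for keyword, worker in ROUTES.items():
        if keyword in q:
            return worker(question)
    return "Sorry, I can only help with invoices, suppliers, and payments."

answer = supervisor("Show me the top 3 invoices by amount that are due.")
```

The useful property of this pattern is that guardrails live in one place: the supervisor decides what the bot is allowed to answer, and each worker stays small and testable.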
Example user journeys
- “Show me the top 3 invoices by amount that are due.” → Bot calls invoice endpoint, returns ranked list with supplier and due date.
- “What is the real-time status of payment X?” → Bot pulls payment status and highlights next actions or exceptions.
- “Which suppliers have the most unpaid amount?” → Bot aggregates and presents top suppliers, with drill-down links.
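The first journey above could look roughly like the sketch below: build a read-only query, then rank the response client-side. The host, the filter expression, and the response shape are illustrative assumptions, not verified Oracle ERP Cloud query syntax; only the post-processing runs here, against a mocked payload.

```python
from urllib.parse import urlencode

# Hypothetical base URL for illustration.
BASE = "https://erp.example.com/fscmRestApi/resources/11.13.18.05"

def top_due_invoices_url(limit: int = 3) -> str:
    """Build a read-only invoice query. The q-expression shown is
    illustrative, not a verified Oracle filter syntax."""
    params = {"q": "PaymentStatus=UNPAID", "orderBy": "InvoiceAmount:desc", "limit": limit}
    return f"{BASE}/invoices?{urlencode(params)}"

def rank_invoices(payload: list[dict], n: int = 3) -> list[dict]:
    """Post-process an API response: keep due items, rank by amount."""
    due = [i for i in payload if i["due"]]
    return sorted(due, key=lambda i: i["amount"], reverse=True)[:n]

# Mocked response standing in for the real endpoint call.
mock_response = [
    {"supplier": "Acme", "amount": 1200.0, "due": True},
    {"supplier": "Globex", "amount": 8400.0, "due": True},
    {"supplier": "Initech", "amount": 300.0, "due": False},
    {"supplier": "Umbrella", "amount": 2500.0, "due": True},
]
top = rank_invoices(mock_response)
```

Each ranked row would then carry a deep link into the corresponding Oracle page, so the chat answer hands off cleanly to the transaction for follow-up actions.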

AI-powered automations: moving from answers to actions
The third pattern extends beyond chat. Once you can interpret transactions and apply logic consistently, you can orchestrate an end-to-end process—still with approvals and audit trails. In this PoC, we explored three finance automations that are typically high volume and high friction: manual journals from bank statements, AI-assisted lockbox, and invoice account distribution.
Use case A: Manual journal creation from CAMT.053 bank statements
Manual journals are often created at scale—especially when bank accounts sit outside Oracle or when statement data arrives as files. The PoC pattern uses Oracle Integration Cloud (OIC) to fetch CAMT.053 files from SFTP, invoke an AI agent to classify the transactions and propose accounting, and then generate FBDI journal files for Oracle ERP.
End-to-end flow
- Retrieve CAMT.053 bank statements from SFTP via OIC.
- AI Agent analyzes transactions, proposes accounting, and assigns a confidence score.
- Generate FBDI journal files in OIC.
- Send files to Oracle ERP for journal import.
- Auto-post journals above a confidence threshold.
- Route low-confidence items for manual review and feedback.
- Use feedback to continuously improve the model.
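The confidence-based branch in the flow above (auto-post versus manual review) can be sketched as a deterministic routing step. The proposal fields and the 0.85 threshold are illustrative assumptions; in the PoC this decision would sit in OIC, with the AI agent supplying only the proposal and its score.

```python
from dataclasses import dataclass

@dataclass
class JournalProposal:
    line_ref: str      # reference back to the bank statement line
    account: str       # proposed account combination
    amount: float
    confidence: float  # 0.0-1.0, produced by the AI agent

def route(proposals: list[JournalProposal], threshold: float = 0.85):
    """Deterministic routing (the OIC side of the pattern): auto-post
    high-confidence proposals, queue the rest for manual review."""
    auto, review = [], []
    for p in proposals:
        (auto if p.confidence >= threshold else review).append(p)
    return auto, review

proposals = [
    JournalProposal("stmt-001", "1100.100.000", 5000.0, 0.97),
    JournalProposal("stmt-002", "6200.300.000", 120.5, 0.62),
]
auto, review = route(proposals)
```

Keeping the threshold in the deterministic layer, rather than inside the model, is what makes the control auditable: the rule is explicit, versioned, and adjustable without retraining.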
Key design decisions
- Confidence thresholds are essential for auditability and control.
- Keep orchestration deterministic (OIC) and treat AI as an advisor that produces proposals plus confidence.
- Capture user corrections as structured feedback for retraining.
Use case B: AI Lockbox for receipt processing
Lockbox processes often fail on the “messy middle”: combined payments, partial payments, foreign remittances, and unstructured references. The PoC uses an AI agent to improve matching between credit transactions (from CAMT.54C) and open receivables in Oracle ERP, with OIC orchestrating file handling and lockbox file generation.
Automated lockbox processing flow
- Bank SFTP server stores CAMT.54C files.
- OIC retrieves files from SFTP.
- AI Agent analyzes transactions and proposes matches to open invoices.
- OIC generates lockbox files (and optionally comments for uncertain matches).
- Oracle ERP processes lockbox files.
- Receipts and user decisions are used to retrain and improve match accuracy.
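The matching step at the heart of this flow can be sketched as a scoring function over amount agreement and remittance-reference similarity. The 50/50 weighting and the 0.8 auto-apply bar are illustrative assumptions; a real agent would use richer features (payer, currency, payment history) learned from user decisions.

```python
from difflib import SequenceMatcher

def match_score(credit: dict, invoice: dict) -> float:
    """Blend exact-amount agreement with remittance-reference similarity.
    The 50/50 weighting is an illustrative assumption."""
    amount_ok = 1.0 if abs(credit["amount"] - invoice["open_amount"]) < 0.01 else 0.0
    ref, num = credit["reference"].lower(), invoice["number"].lower()
    ref_sim = 1.0 if num in ref else SequenceMatcher(None, ref, num).ratio()
    return 0.5 * amount_ok + 0.5 * ref_sim

def best_match(credit: dict, open_invoices: list[dict], threshold: float = 0.8):
    """Return the best invoice only if its score clears the auto-apply bar;
    otherwise None, so the suggestion is surfaced for user confirmation."""
    top = max(open_invoices, key=lambda inv: match_score(credit, inv))
    return top if match_score(credit, top) >= threshold else None

credit = {"amount": 990.0, "reference": "Payment INV-2041 thanks"}
open_invoices = [
    {"number": "INV-2041", "open_amount": 990.0},
    {"number": "INV-1999", "open_amount": 990.0},
]
hit = best_match(credit, open_invoices)
```

Note how the “messy middle” shows up even here: both invoices match on amount, and only the reference breaks the tie—exactly the case where a score below the bar should fall back to user confirmation.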
Operational controls
- Apply matches automatically only above a defined score threshold (e.g., >80%).
- For lower scores, surface suggestions and require user confirmation.
- Measure hit rate by company and payment type to prioritize training data.
Use case C: Intelligent invoice account distribution
Coding invoices is repetitive and error-prone, especially when account strings are long and distribution rules vary by supplier, cost center, item category, or project. The PoC explored using historical invoices plus business rules to propose account distributions, accelerating invoice preparation and reducing rework.
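A minimal version of “historical invoices plus business rules” is a frequency model per (supplier, category) key, with the vote share doubling as a confidence score. This is a deliberately simple stand-in for the PoC's approach; the field names and fallback behavior are assumptions for illustration.

```python
from collections import Counter, defaultdict

def learn_rules(history: list[dict]):
    """Learn the most frequent account per (supplier, category) key from
    historical invoice lines - a simple stand-in for a trained model."""
    votes = defaultdict(Counter)
    for line in history:
        votes[(line["supplier"], line["category"])][line["account"]] += 1
    return votes

def propose(line: dict, votes):
    """Propose an account plus a confidence equal to its vote share;
    unseen combinations get no proposal and fall back to manual coding."""
    counter = votes.get((line["supplier"], line["category"]))
    if not counter:
        return None, 0.0
    account, count = counter.most_common(1)[0]
    return account, count / sum(counter.values())

history = [
    {"supplier": "Acme", "category": "IT", "account": "6400.100"},
    {"supplier": "Acme", "category": "IT", "account": "6400.100"},
    {"supplier": "Acme", "category": "IT", "account": "6400.200"},
]
votes = learn_rules(history)
account, confidence = propose({"supplier": "Acme", "category": "IT"}, votes)
```

Even this naive model exposes the key property the PoC relies on: every proposal carries a confidence, so the same threshold-and-review pattern used for journals and lockbox applies here too.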
Business value
- Faster invoice preparation and lower cycle time.
- Reduced coding errors and improved accounting quality.
- Better use of staff time—focus on exceptions rather than routine entry.
- Stronger scalability as invoice volumes grow.
Reference architecture: components and connections
Across all patterns, the architecture stays consistent: a controlled integration/orchestration layer (OIC), an AI agent layer that performs interpretation and scoring, and Oracle ERP Cloud as the system of record for transactions and approvals. Files (bank statements) and APIs (transaction queries) are the primary interfaces. The design deliberately keeps deterministic steps separate from probabilistic AI decisions.
Governance, security, and auditability
Introducing AI into finance processes requires strong guardrails. In the PoC we treated the AI agent as an assistant: it proposes, explains, and scores—while Oracle ERP retains authority for approvals, posting, and audit trails.
- Use least-privilege API access and prefer read-only endpoints for conversational insights.
- Log prompts, retrieved sources, API calls, and responses for traceability.
- Avoid training on sensitive data without clear policies; anonymize where possible.
- Design for segregation of duties: the person approving should not be the one “teaching” the bot in production.
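The traceability point above can be made concrete with one structured record per interaction, written to an append-only log. The field names are illustrative assumptions; the point is that prompt, evidence, API calls, and response are captured together so any answer can be reconstructed later.

```python
import json
import time

def audit_record(user: str, prompt: str, sources: list[str],
                 api_calls: list[str], response: str) -> dict:
    """One structured trace per interaction: who asked what, which
    evidence and endpoints were used, and what came back."""
    return {
        "ts": time.time(),
        "user": user,
        "prompt": prompt,
        "retrieved_sources": sources,
        "api_calls": api_calls,
        "response": response,
    }

record = audit_record(
    user="jdoe",
    prompt="Top overdue invoices?",
    sources=[],
    api_calls=["GET /invoices"],
    response="3 invoices returned",
)
line = json.dumps(record)  # one JSON line, appended to an audit log
```

Storing these as JSON lines keeps the log queryable, which is what makes the “log prompts, sources, API calls, and responses” guardrail operational rather than aspirational.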
Challenges and lessons learned
Every PoC surfaces realities that demos often hide. Here are the biggest learnings:
- Training takes time: High-quality training data and iteration cycles are needed to reach stable accuracy—especially for fuzzy payments and variable accounting logic.
- AI augments, it doesn’t replace: The most valuable pattern is AI + deterministic orchestration + human validation for exceptions.
- Performance matters: If responses are slow, user adoption drops. Optimize retrieval, reduce payload sizes, and cache where appropriate.
- Start small with a strong use case: Pick a narrow process with measurable value and clear success criteria before scaling.
- Continuous monitoring is non-negotiable: Model behavior drifts as business patterns change; monitoring and feedback loops keep accuracy stable.
Where to go next
To move from PoC to production, a phased approach works well:
- Phase 1 – Assist — Deploy BU‑BOT for guidance and a read-only FinBOT for common inquiries. Measure adoption and question coverage.
- Phase 2 – Recommend — Add AI proposals to workflows (e.g., suggested distributions, suggested matches) with approval gates.
- Phase 3 – Automate — Introduce confidence-based straight-through processing for high-confidence items, with continuous retraining and controls.
Closing thoughts
Custom AI bots can make Oracle ERP Cloud significantly easier to use and operate—if they are designed with governance, confidence-based controls, and a clear human-in-the-loop approach. In our PoC, the combination of RAG for guidance, APIs for insights, and OIC-orchestrated automations provided a pragmatic blueprint: start with user enablement, expand to recommendations, and then automate selectively where confidence and auditability are strong.
