Location: Offsite, with regular meetings via video teleconference (VTC)
Requirements Description:
The project aims to develop a GUI-based prototype that enables stakeholders to validate the AI-powered document analysis pipeline on a curated set of Project Management Plans (PMPs). The tool should allow users to upload PMP documents (within NU classification), automatically process them, and inspect extracted information such as classified topics, semantic relationships, and cross-referenced entities across documents. Visualizations such as dashboards should be used to present these connections interactively, offering insights into project overlaps and dependencies. Additionally, the prototype should include a chatbot interface that supports conversational querying over the uploaded and analyzed documents, including their discovered interrelations. This is part of a continuous effort to streamline portfolio management activities and facilitate data analysis.
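The stages described above (upload, automatic processing, topic classification, and cross-referencing across documents) can be sketched as a minimal pipeline. This is an illustrative sketch only: the `TOPIC_KEYWORDS` lookup table, the word-count chunking, and all function names are assumptions standing in for the AI-powered components the prototype would actually use.

```python
# Hedged sketch of the document analysis pipeline: chunk PMP documents,
# tag topics, and cross-reference topics across documents.
# A keyword lookup (an assumption) stands in for the AI classification step.
from dataclasses import dataclass, field

# Hypothetical topic keywords; a real deployment would use an LLM or a
# trained classifier instead of this table.
TOPIC_KEYWORDS = {
    "AI/ML": {"machine learning", "model"},
    "Cyber": {"security", "encryption"},
    "C2": {"command", "control"},
}

@dataclass
class Chunk:
    doc_id: str
    text: str
    topics: set = field(default_factory=set)

def chunk_document(doc_id: str, text: str, max_words: int = 50) -> list:
    """Split a document into word-bounded chunks."""
    words = text.split()
    return [Chunk(doc_id, " ".join(words[i:i + max_words]))
            for i in range(0, len(words), max_words)]

def tag_topics(chunk: Chunk) -> Chunk:
    """Attach every topic label whose keywords appear in the chunk."""
    lowered = chunk.text.lower()
    for topic, keys in TOPIC_KEYWORDS.items():
        if any(k in lowered for k in keys):
            chunk.topics.add(topic)
    return chunk

def cross_reference(chunks: list) -> dict:
    """Map each topic to the set of documents that mention it,
    exposing overlaps and potential duplication across PMPs."""
    links = {}
    for c in chunks:
        for t in c.topics:
            links.setdefault(t, set()).add(c.doc_id)
    return links
```

The `cross_reference` output is the kind of topic-to-document mapping a dashboard could render interactively to surface project overlaps and dependencies.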
The overarching objective is to automate the analysis of NATO PoW project documentation to generate insights, value-added responses to user inquiries, and summaries of results to support Portfolio Management activities. This should include the capability to:
- Identify links, common topics, technologies, and initiatives;
- Prevent duplication across PoWs, projects and work packages;
- Promote collaboration and reuse of outputs and methodologies;
- Improve visibility for strategic alignment with EDTs and NATO;
- Generate reports and visual summaries for dashboards, stakeholder briefings, and strategic reviews;
- Query the system using natural language to explore both high-level insights and deep project documentation;
- Drill into granular content from PMP documents and highlight how concepts, objectives, and timelines relate across PoWs;
- Perform business-level portfolio analysis through aggregation and filtering (e.g., by topic, status, stakeholder).
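The natural-language querying capability listed above can be illustrated with a minimal retrieval sketch. The term-overlap scoring below is an assumption standing in for the LLM-backed "chat with documents" component, and the function names and corpus shape are hypothetical.

```python
# Hedged sketch of natural-language querying over analyzed documents:
# rank passages by bag-of-words overlap with the query (a stand-in for
# semantic retrieval feeding an LLM answer generator).
from collections import Counter

def score(query: str, passage: str) -> int:
    """Count query terms that also appear in the passage."""
    q = Counter(query.lower().split())
    p = Counter(passage.lower().split())
    return sum(min(q[t], p[t]) for t in q)

def answer(query: str, corpus: dict, top_k: int = 2) -> list:
    """Return the top_k (doc_id, passage) pairs most relevant to the query."""
    ranked = sorted(corpus.items(),
                    key=lambda kv: score(query, kv[1]),
                    reverse=True)
    return ranked[:top_k]
```

In the envisaged prototype, the retrieved passages would be passed to an LLM to compose a conversational answer, with the supporting PMP excerpts shown for drill-down.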
Skill, Knowledge & Experience:
The consultancy support for this work requires a Data Analyst with the following qualifications:
- Bachelor’s degree in a relevant field (computer science, data science, software engineering, machine learning, etc.) or equivalent experience.
- 8+ years of experience in applied
- 3+ years of experience deploying and managing Large Language Models (LLMs).
- 3+ years of experience developing GUI-based user interfaces;
- 2+ years of experience developing intelligent document chunking, topic tagging, and semantic classification capabilities;
- 2+ years of demonstrated experience generating visualizations, graphs, and databases from extracted data;
- 2+ years of demonstrated experience developing interactive chatbot solutions that enable users to “Chat with Documents”;
- 2+ years of experience with network filtering, access control, and cloud security architecture;
- Ability to build automated experimentation environments with support for human interaction (benign/adversarial).
- Evidence and a high-level overview of the research methodology and model development process, to build credibility and confidence in the bidder’s ability to deliver.
