AI Service: System Specification

This document provides the system specification for the AI Service (ai_service).

1. Overview

The ai_service is the central orchestration component of the AI-Enhanced Workflow (AIEW), as described in the AI-Enhanced Workflow feature document and the corresponding ADR-011. Its primary responsibilities are to interpret user intent, manage conversation state, construct and propose execution plans, and synthesize final responses for the user.

It functions as the "brain" of the assistant, connecting the user-facing api_gateway with the tool-executing mcp_service.

2. Technology Stack

The service leverages the following core technologies:

  • Orchestration: LangGraph is used to build the core logic of the AI agent as a state machine, allowing for complex, cyclical workflows that include planning, user confirmation, and tool execution steps (see the state-graph sketch after this list).
  • Session Store: Redis is used for short-term, low-latency storage of active conversation sessions, including message history and pending plan details.
  • Long-Term Storage: MongoDB is used for long-term archival of chat histories, enabling users to resume past conversations and providing a dataset for analytics (a combined Redis/MongoDB storage sketch follows this list).
  • Communication:
    • gRPC: The service exposes a gRPC interface for high-performance internal communication with the api_gateway.
    • MCP: The service's LangGraph agent is configured with the langchain-mcp-adapters library to consume tools from the mcp_service over the Model Context Protocol (see the tool-loading sketch below).
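
To make the orchestration concrete, the following is a minimal sketch of how the plan / confirm / execute / synthesize cycle described above could be expressed with LangGraph's StateGraph. The node names, state fields, and routing logic are illustrative assumptions, not the service's actual graph definition.

```python
from typing import Annotated, TypedDict

from langgraph.graph import StateGraph, START, END
from langgraph.graph.message import add_messages


class AgentState(TypedDict):
    # Conversation history; add_messages appends new messages instead of overwriting.
    messages: Annotated[list, add_messages]
    # Hypothetical fields: the proposed plan and the user's approval decision.
    plan: list[str]
    approved: bool


def plan_node(state: AgentState) -> dict:
    # Placeholder: call the LLM to turn the user's intent into an execution plan.
    return {"plan": ["step 1", "step 2"]}


def confirm_node(state: AgentState) -> dict:
    # Placeholder: surface the proposed plan to the user and record their decision.
    return {"approved": True}


def execute_node(state: AgentState) -> dict:
    # Placeholder: invoke MCP tools for each approved plan step.
    return {"messages": [("assistant", "tool results...")]}


def synthesize_node(state: AgentState) -> dict:
    # Placeholder: produce the final natural-language response.
    return {"messages": [("assistant", "final answer")]}


graph = StateGraph(AgentState)
graph.add_node("plan", plan_node)
graph.add_node("confirm", confirm_node)
graph.add_node("execute", execute_node)
graph.add_node("synthesize", synthesize_node)

graph.add_edge(START, "plan")
graph.add_edge("plan", "confirm")
# The cycle: a rejected plan loops back to planning, an approved one proceeds.
graph.add_conditional_edges(
    "confirm", lambda s: "execute" if s["approved"] else "plan"
)
graph.add_edge("execute", "synthesize")
graph.add_edge("synthesize", END)

agent = graph.compile()
```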
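
For the two storage layers, the sketch below shows one way session state might move between Redis and MongoDB, assuming JSON-serialised sessions keyed by session ID. The connection details, key scheme, TTL, and database/collection names are placeholders rather than values prescribed by this specification.

```python
import json

import redis
from pymongo import MongoClient

# Connection details are assumptions for illustration.
r = redis.Redis(host="redis", port=6379, decode_responses=True)
archive = MongoClient("mongodb://mongo:27017")["ai_service"]["chat_histories"]

SESSION_TTL_SECONDS = 3600  # hypothetical expiry for idle sessions


def save_session(session_id: str, messages: list, pending_plan: dict | None = None) -> None:
    # Active sessions live in Redis for low-latency access on every turn.
    payload = {"messages": messages, "pending_plan": pending_plan}
    r.set(f"session:{session_id}", json.dumps(payload), ex=SESSION_TTL_SECONDS)


def load_session(session_id: str) -> dict | None:
    raw = r.get(f"session:{session_id}")
    return json.loads(raw) if raw else None


def archive_session(session_id: str) -> None:
    # Finished conversations are copied to MongoDB for later resumption and analytics.
    session = load_session(session_id)
    if session is not None:
        archive.insert_one({"session_id": session_id, **session})
        r.delete(f"session:{session_id}")
```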
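
Finally, the sketch below illustrates how the MCP integration could be wired up, assuming a recent version of langchain-mcp-adapters and an mcp_service endpoint reachable over streamable HTTP. The server name, URL, and transport are placeholders; the resulting tools would be bound into the execution step of the graph above.

```python
import asyncio

from langchain_mcp_adapters.client import MultiServerMCPClient
from langgraph.prebuilt import ToolNode


async def load_mcp_tool_node() -> ToolNode:
    # The endpoint URL and transport are assumptions for illustration.
    client = MultiServerMCPClient(
        {
            "mcp_service": {
                "url": "http://mcp-service:8000/mcp",
                "transport": "streamable_http",
            }
        }
    )
    # Discover the tools the mcp_service advertises over MCP...
    tools = await client.get_tools()
    # ...and wrap them in a node the graph's execute step can call.
    return ToolNode(tools)


if __name__ == "__main__":
    tool_node = asyncio.run(load_mcp_tool_node())
```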