The Technical Support Problem
A Geneva software publisher offering an ERP solution for SMEs employed 12 support technicians handling about 400 tickets per month. Each ticket required searching a knowledge base of more than 2,000 technical articles covering known issues, solutions, procedures, and best practices. The documentation was stored in SharePoint, but keyword search returned disappointing results, forcing technicians to browse many articles manually before finding the relevant information.
Average ticket resolution time reached 3.5 hours, of which a full hour was devoted to searching for information. New technicians struggled in particular, lacking the experience to know where to look. Clients complained about slow responses, and some were considering switching solutions. Management sought a way to drastically improve support efficiency without increasing headcount.
The Augmented Knowledge Solution
We developed an intelligent knowledge base system combining Azure AI Search with semantic search enabled, Azure OpenAI for response generation, and a modern user interface built with SPFx and integrated directly into the ticketing tool.
The first step was to restructure and enrich the existing knowledge base. We indexed all 2,000 articles in Azure AI Search with semantic search enabled, which understands the intent behind a query rather than merely matching words. For example, a search for "unable to save an invoice" finds articles about "blocking during accounting document validation" even though the exact words differ.
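As a minimal sketch of what such a query can look like, the call below sends a natural-language question to the Azure AI Search REST API with semantic ranking requested; the service endpoint, index name kb-articles, field names, semantic configuration name, and api-version are illustrative assumptions, not the publisher's actual setup.

```typescript
// Sketch: semantic query against Azure AI Search via the REST API.
// Endpoint, index name, fields and api-version are assumptions.
const SEARCH_ENDPOINT = "https://<your-service>.search.windows.net";
const INDEX_NAME = "kb-articles";        // hypothetical index name
const API_VERSION = "2023-11-01";        // api-version with semantic search GA
const SEARCH_API_KEY = "<query-api-key>"; // kept in configuration in practice

interface KbArticle {
  id: string;
  title: string;
  content: string;
}

async function semanticSearch(query: string, top = 3): Promise<KbArticle[]> {
  const res = await fetch(
    `${SEARCH_ENDPOINT}/indexes/${INDEX_NAME}/docs/search?api-version=${API_VERSION}`,
    {
      method: "POST",
      headers: { "Content-Type": "application/json", "api-key": SEARCH_API_KEY },
      body: JSON.stringify({
        search: query,
        queryType: "semantic",            // intent-based ranking, not keyword matching
        semanticConfiguration: "default", // assumed configuration name
        top,
      }),
    }
  );
  if (!res.ok) throw new Error(`Search failed: ${res.status}`);
  const body = await res.json();
  return body.value as KbArticle[];
}

// "unable to save an invoice" can surface an article titled
// "blocking during accounting document validation" thanks to semantic ranking.
semanticSearch("unable to save an invoice").then((hits) =>
  hits.forEach((a) => console.log(a.title))
);
```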
Each article was automatically enriched with vector embeddings generated by Azure OpenAI, enabling semantic similarity measurement between documents. Similar articles are automatically linked, and technicians discover related solutions they wouldn't have thought of.
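The following sketch shows the general pattern, assuming an Azure OpenAI embeddings deployment (the endpoint, deployment name text-embedding-ada-002, api-version, and the 0.85 linking threshold are illustrative assumptions): each article is embedded once, and the cosine similarity between two embeddings decides whether the articles get linked as related.

```typescript
// Sketch: generate embeddings with Azure OpenAI and compare two articles.
// Endpoint, deployment name and api-version are assumptions.
const OPENAI_ENDPOINT = "https://<your-resource>.openai.azure.com";
const EMBEDDING_DEPLOYMENT = "text-embedding-ada-002";
const OPENAI_API_KEY = "<azure-openai-key>";

async function embed(text: string): Promise<number[]> {
  const res = await fetch(
    `${OPENAI_ENDPOINT}/openai/deployments/${EMBEDDING_DEPLOYMENT}/embeddings?api-version=2023-05-15`,
    {
      method: "POST",
      headers: { "Content-Type": "application/json", "api-key": OPENAI_API_KEY },
      body: JSON.stringify({ input: text }),
    }
  );
  const body = await res.json();
  return body.data[0].embedding as number[];
}

// Cosine similarity: close to 1.0 means very similar content, near 0 means unrelated.
function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Articles whose similarity exceeds an assumed threshold (0.85) get linked as "related".
async function areRelated(articleA: string, articleB: string): Promise<boolean> {
  const [ea, eb] = await Promise.all([embed(articleA), embed(articleB)]);
  return cosineSimilarity(ea, eb) > 0.85;
}
```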
The search interface, developed as a SharePoint Framework WebPart, offers a modern user experience. The technician enters a question in natural language, exactly as they would phrase it to a colleague, for example "how do I fix VAT that isn't calculated on an order line". The system then runs several searches in parallel: a semantic search in Azure AI Search to find the most relevant articles, a search of the resolved ticket history to identify similar cases, and a search of client system logs to detect matching error messages.
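A sketch of this fan-out is shown below; searchTicketHistory and searchClientLogs are hypothetical placeholders for the ticket-history and log back ends, and searchKnowledgeBase stands for the semantic query sketched earlier.

```typescript
// Sketch of the parallel fan-out behind the search box.
interface SearchHit { source: "kb" | "tickets" | "logs"; title: string; score: number; }

// Placeholder back ends; in the real WebPart each one calls its own API.
async function searchKnowledgeBase(query: string): Promise<SearchHit[]> { return []; }
async function searchTicketHistory(query: string): Promise<SearchHit[]> { return []; }
async function searchClientLogs(query: string): Promise<SearchHit[]> { return []; }

async function federatedSearch(question: string): Promise<SearchHit[]> {
  // Promise.allSettled keeps partial results if one back end is slow or fails.
  const settled = await Promise.allSettled([
    searchKnowledgeBase(question),
    searchTicketHistory(question),
    searchClientLogs(question),
  ]);
  return settled
    .filter((r): r is PromiseFulfilledResult<SearchHit[]> => r.status === "fulfilled")
    .flatMap((r) => r.value)
    // One relevance-ordered list for the result cards (scores assumed comparable).
    .sort((a, b) => b.score - a.score);
}
```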
Results are presented as cards ranked by relevance. The real added value, however, comes from the automatic generation of a synthesized response. The system sends the technician's question, together with the three most relevant articles, to a GPT-4 model deployed in Azure OpenAI Service with specific instructions to produce a direct, actionable answer. The prompt asks the model to synthesize the information from the source articles, adapt the technical tone to the context, and structure the response as numbered steps when it describes a procedure.
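A minimal sketch of such a grounded call follows; the deployment name, api-version, temperature, and prompt wording are assumptions rather than the exact prompt used in production.

```typescript
// Sketch: grounded answer generation with a GPT-4 deployment in Azure OpenAI.
const OPENAI_ENDPOINT = "https://<your-resource>.openai.azure.com";
const CHAT_DEPLOYMENT = "gpt-4";          // assumed deployment name
const OPENAI_API_KEY = "<azure-openai-key>";

interface SourceArticle { title: string; content: string; }

async function generateAnswer(question: string, articles: SourceArticle[]): Promise<string> {
  // The three most relevant articles become the only material the model may use.
  const sources = articles
    .map((a, i) => `[Article ${i + 1}: ${a.title}]\n${a.content}`)
    .join("\n\n");

  const res = await fetch(
    `${OPENAI_ENDPOINT}/openai/deployments/${CHAT_DEPLOYMENT}/chat/completions?api-version=2024-02-01`,
    {
      method: "POST",
      headers: { "Content-Type": "application/json", "api-key": OPENAI_API_KEY },
      body: JSON.stringify({
        temperature: 0.2, // keep answers factual and close to the sources
        messages: [
          {
            role: "system",
            content:
              "You are a support assistant. Answer only from the source articles below. " +
              "Be direct and actionable; use numbered steps for procedures; cite the articles used.\n\n" +
              sources,
          },
          { role: "user", content: question },
        ],
      }),
    }
  );
  const body = await res.json();
  return body.choices[0].message.content as string;
}
```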
The generated response is displayed at the top of the interface with its sources cited, so the technician can verify its reliability. The technician can copy the response directly into the client communication, adapt it as needed, or dig deeper by consulting the full source articles. A feedback control lets the technician indicate whether the response was useful, feeding continuous improvement.
For cases where no existing article answers the question, the system detects the gap automatically and creates a task assigned to the knowledge base manager to write a new article. This loop ensures the knowledge base is continuously enriched with the real problems encountered in the field.
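One way to sketch this detection is shown below, assuming the gap is recorded as an item in a dedicated SharePoint list via Microsoft Graph; the relevance threshold, list name, and IDs are hypothetical.

```typescript
// Sketch of the gap-detection loop: when even the best hit scores below a
// threshold, the question is logged as a documentation gap for the KB manager.
const GRAPH_TOKEN = "<graph-access-token>"; // obtained via MSAL or a managed identity
const SITE_ID = "<sharepoint-site-id>";
const GAPS_LIST_ID = "<kb-gaps-list-id>";
const RELEVANCE_THRESHOLD = 1.5;            // assumed cut-off on the search score

async function recordGapIfNeeded(question: string, bestScore: number): Promise<void> {
  if (bestScore >= RELEVANCE_THRESHOLD) return; // an article already covers it

  // Create an item in a "KB gaps" SharePoint list; the KB manager triages it.
  await fetch(
    `https://graph.microsoft.com/v1.0/sites/${SITE_ID}/lists/${GAPS_LIST_ID}/items`,
    {
      method: "POST",
      headers: {
        Authorization: `Bearer ${GRAPH_TOKEN}`,
        "Content-Type": "application/json",
      },
      body: JSON.stringify({
        fields: { Title: question, Status: "To document" },
      }),
    }
  );
}
```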
An analysis module also uses AI to identify trends across tickets. If several similar tickets arrive within a short time, the system automatically alerts the development team that a potential bug is affecting multiple clients, enabling a rapid response.
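A simplified sketch of that trend check follows, assuming each ticket description has already been embedded; the similarity threshold, time window, and ticket count are illustrative values.

```typescript
// Sketch of the trend alert: enough near-duplicate tickets in a short window
// suggest a shared bug and trigger a notification to the development team.
interface Ticket { id: string; openedAt: Date; embedding: number[]; }

const SIMILARITY_THRESHOLD = 0.9; // assumed: "looks like the same problem"
const WINDOW_HOURS = 24;
const MIN_SIMILAR_TICKETS = 3;

function cosine(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) { dot += a[i] * b[i]; na += a[i] * a[i]; nb += b[i] * b[i]; }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

function shouldAlertDevTeam(newTicket: Ticket, recentTickets: Ticket[]): boolean {
  const windowStart = newTicket.openedAt.getTime() - WINDOW_HOURS * 3600 * 1000;
  const similar = recentTickets.filter(
    (t) =>
      t.openedAt.getTime() >= windowStart &&
      cosine(t.embedding, newTicket.embedding) >= SIMILARITY_THRESHOLD
  );
  // Counting the new ticket itself, enough near-duplicates raise the alert.
  return similar.length + 1 >= MIN_SIMILAR_TICKETS;
}
```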
Operational Benefits
Eleven months after deployment, the results exceed expectations. Average ticket resolution time dropped from 3.5 hours to 1.6 hours, a reduction of about 54%. Average information search time plummeted from 60 minutes to 15 minutes. Technicians now find the relevant information on the first try in 82% of cases, versus about 40% previously.
Client satisfaction, measured by a post-resolution survey, jumped 24 points, from 67% to 91% of clients satisfied or very satisfied. Clients particularly appreciate the speed and precision of responses. The number of tickets escalated to level 2 (complex problems) decreased by 35%, as level 1 technicians can now resolve more cases thanks to easier access to expertise.
New technicians become operational much faster. The training period dropped from 6 weeks to 3 weeks, because they can rely on the system to compensate for their lack of experience. Staff turnover has decreased, with technicians feeling less frustrated and more effective.
The knowledge base itself improved significantly. Thanks to the gap detection system, 180 new articles were created in eleven months, filling previously undocumented areas. The quality of existing articles has also improved, because relevance feedback flags those that need rewriting.
Intelligence and Learning
The system continuously learns from its use. Each time a technician marks a response as useful or not, this information refines the ranking algorithms. Articles repeatedly judged irrelevant for certain queries see their score lowered for those contexts.
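A possible shape for this adjustment is sketched below, assuming feedback counters are kept per article and applied as a multiplier on the raw search score; the formula and values are illustrative, not the system's actual algorithm.

```typescript
// Sketch of feedback-weighted re-ranking on top of the raw search score.
interface RankedArticle { articleId: string; searchScore: number; }
interface FeedbackStats { helpful: number; notHelpful: number; }

// feedback: per-article counters collected from the "useful / not useful" control.
function rerank(
  results: RankedArticle[],
  feedback: Map<string, FeedbackStats>
): RankedArticle[] {
  return results
    .map((r) => {
      const stats = feedback.get(r.articleId) ?? { helpful: 0, notHelpful: 0 };
      const votes = stats.helpful + stats.notHelpful;
      // Multiplier drifts below 1.0 when an article is repeatedly judged unhelpful
      // for this kind of query, and above 1.0 when it is confirmed useful.
      const multiplier = votes === 0 ? 1 : 0.5 + stats.helpful / votes;
      return { ...r, searchScore: r.searchScore * multiplier };
    })
    .sort((a, b) => b.searchScore - a.searchScore);
}
```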
We also implemented a proactive suggestion system. When a technician opens a new ticket, the system automatically analyzes the problem description and immediately suggests potentially relevant articles, even before the technician launches an explicit search. This anticipation saves precious time.
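Conceptually this hook can be as simple as the sketch below, where the shape of the ticket event is assumed and searchKnowledgeBase stands for the search pipeline described earlier.

```typescript
// Sketch: when the ticketing tool signals a new ticket, run its description
// through the search pipeline and attach the top articles as suggestions.
interface NewTicketEvent { ticketId: string; description: string; }

async function onTicketCreated(
  event: NewTicketEvent,
  searchKnowledgeBase: (query: string) => Promise<{ title: string }[]>
): Promise<string[]> {
  const hits = await searchKnowledgeBase(event.description);
  const suggestions = hits.slice(0, 3).map((h) => h.title);
  console.log(`Suggestions for ticket ${event.ticketId}:`, suggestions);
  return suggestions;
}
```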
A Power BI dashboard allows the support manager to track key metrics: average resolution time by problem type, most consulted articles, detected documentation gaps, and client satisfaction. These indicators guide improvement investments.
Architecture and Security
The architecture relies on Azure AI Search with semantic search option enabled, Azure OpenAI Service for embeddings and response generation, SharePoint for article storage, and an SPFx WebPart developed in TypeScript and React for the user interface.
Security is critical because the articles contain sensitive technical information about the product. Access to the interface is controlled by Azure AD with multi-factor authentication. API calls to Azure OpenAI are secured with managed identities. The GPT-4 deployment is private, with the guarantee that the data is not used to train the models.
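As an illustration of this keyless pattern, the sketch below obtains a token with DefaultAzureCredential from @azure/identity and calls the Azure OpenAI endpoint with it; the endpoint and deployment name are placeholders.

```typescript
// Sketch: keyless call to Azure OpenAI using a managed identity (or any
// credential resolved by DefaultAzureCredential), so no API key is stored.
import { DefaultAzureCredential } from "@azure/identity";

const OPENAI_ENDPOINT = "https://<your-resource>.openai.azure.com";

async function callOpenAiWithManagedIdentity(requestBody: unknown): Promise<unknown> {
  const credential = new DefaultAzureCredential();
  // Token scope for Azure OpenAI is the Cognitive Services resource.
  const { token } = await credential.getToken("https://cognitiveservices.azure.com/.default");

  const res = await fetch(
    `${OPENAI_ENDPOINT}/openai/deployments/gpt-4/chat/completions?api-version=2024-02-01`,
    {
      method: "POST",
      headers: {
        Authorization: `Bearer ${token}`, // bearer token instead of an api-key header
        "Content-Type": "application/json",
      },
      body: JSON.stringify(requestBody),
    }
  );
  return res.json();
}
```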
The monthly Azure cost, including AI Search, OpenAI API calls, and storage, comes to approximately 600 CHF. Against that, the roughly 22 hours of work saved per day (12 technicians × 1.9 hours), or about 500 hours over a 22-working-day month, generate an overwhelmingly positive ROI.
Future Evolutions
The publisher plans to expose a public version of this knowledge base to end clients, enabling self-support and further reducing ticket volume. Integration with Microsoft Teams would allow technicians to search information directly from their conversations. Finally, an automatic translation system could make the base accessible in multiple languages for international expansion.
Conclusion
This AI-augmented knowledge base illustrates how semantic search and natural language generation can transform technical support. By radically facilitating access to the organization's collective expertise, we enable each technician to perform like a senior expert. The result is simultaneous improvement of operational efficiency and client satisfaction—the ideal combination for a support center.