Intelligent Document Search Streamlines Machinery Maintenance for Consumer Electronics Giant

About Customer

The customer is a world leader in high-technology heating, air-conditioning, and refrigeration. They provide sustainable solutions that integrate energy-efficient products, building controls, and energy services for residential, commercial, retail, transport, and food-service customers.

Business Problem

The customer's technical documentation for its electronics products included user manuals, maintenance guides, schematics, and troubleshooting procedures. For technicians in the manufacturing industry, access to accurate, comprehensive technical documentation is crucial: it is the roadmap they need to operate, maintain, and repair complex machinery effectively. However, the documentation was disorganized, and technicians spent 1–2 hours per day on average searching for information, which lengthened issue-resolution times and delayed access to critical troubleshooting content.

Solution Overview

  • Data Ingestion: Configure and set up various data connectors, such as S3, SharePoint, Confluence, and Website connectors, to ingest and index data into Amazon Kendra from multiple sources.

  • User Request via GenAI Chatbot App: Users submit queries through the GenAI chatbot app to find the information they need.

  • App Queries Amazon Kendra Index: The chatbot app directs the user's query to the Amazon Kendra index.

  • Index Provides Relevant Search Results: Amazon Kendra retrieves pertinent data excerpts from ingested enterprise data that align with the user's query.

  • App Sends Contextual Data and User Query to LLM Prompt: The app combines the retrieved excerpts with the user's query into a single LLM prompt, so the model answers from the relevant documentation.

  • LLM Generates Contextualized Response: The LLM processes the contextual data and the user's query to craft a precise, relevant response.

  • Context Based Q&A with Memory: The LangChain framework helps maintain contextual conversation memory, enabling a coherent dialogue. Without it, each query would be treated as an entirely independent input, disregarding past interactions.

  • LLM Response Delivered to User: The response generated by the LLM is sent back to the user through the GenAI app.
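The retrieve-then-prompt steps above can be sketched in Python. The result-item shape (`ResultItems` entries with `DocumentTitle` and `DocumentExcerpt`) follows the Amazon Kendra Query API, but the items below are hard-coded stand-ins for illustration; in the deployed app they would come from `boto3.client("kendra").query(IndexId=..., QueryText=...)`, and the document names and excerpts are invented.

```python
# Sketch: assemble an LLM prompt from Kendra-style search results.
# In production the items come from Amazon Kendra, e.g.:
#   kendra = boto3.client("kendra")
#   result_items = kendra.query(IndexId=index_id, QueryText=user_query)["ResultItems"]
# Here we hard-code items in the same shape for illustration.

def build_prompt(result_items, user_query, max_excerpts=3):
    """Turn the top Kendra result items plus the user's query into one prompt."""
    excerpts = []
    for item in result_items[:max_excerpts]:
        title = item.get("DocumentTitle", {}).get("Text", "Untitled")
        text = item.get("DocumentExcerpt", {}).get("Text", "")
        excerpts.append(f"[{title}] {text}")
    context = "\n".join(excerpts)
    return (
        "Answer the question using only the context below.\n\n"
        f"Context:\n{context}\n\n"
        f"Question: {user_query}\nAnswer:"
    )

# Hard-coded stand-ins shaped like Kendra Query ResultItems (illustrative data).
sample_items = [
    {
        "DocumentTitle": {"Text": "Compressor Maintenance Guide"},
        "DocumentExcerpt": {"Text": "Check refrigerant pressure before restarting."},
    },
    {
        "DocumentTitle": {"Text": "Troubleshooting Manual"},
        "DocumentExcerpt": {"Text": "Error E42 indicates a blocked condenser coil."},
    },
]

prompt = build_prompt(sample_items, "What does error E42 mean?")
print(prompt)
```

The assembled prompt is then sent to the LLM; only the excerpt assembly is shown here, since the model call itself depends on the provider's SDK.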
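The conversation-memory step can be illustrated without LangChain itself: the sketch below keeps a rolling buffer of past (question, answer) turns and prepends it to each new prompt, which is the core idea behind LangChain's conversation memory. The class and method names here are illustrative, not LangChain's real API.

```python
class ConversationMemory:
    """Minimal rolling buffer of past turns, in the spirit of LangChain's
    conversation memory (illustrative sketch, not the real API)."""

    def __init__(self, max_turns=5):
        self.max_turns = max_turns
        self.turns = []  # list of (user_query, llm_answer) pairs

    def add_turn(self, query, answer):
        self.turns.append((query, answer))
        self.turns = self.turns[-self.max_turns:]  # keep only recent turns

    def render_history(self):
        return "\n".join(f"User: {q}\nAssistant: {a}" for q, a in self.turns)

    def build_prompt(self, query, context):
        """Prepend the conversation history so follow-up questions resolve
        references to earlier turns instead of being treated independently."""
        parts = []
        history = self.render_history()
        if history:
            parts.append(f"Conversation so far:\n{history}")
        parts.append(f"Context:\n{context}")
        parts.append(f"Question: {query}\nAnswer:")
        return "\n\n".join(parts)


memory = ConversationMemory(max_turns=2)
memory.add_turn("What does error E42 mean?", "A blocked condenser coil.")
prompt = memory.build_prompt("How do I clear it?", "Clean the coil with low-pressure air.")
print(prompt)
```

Without the history buffer, "How do I clear it?" would reach the LLM with no way to know that "it" refers to the blocked condenser coil from the previous turn.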

Business Outcome

  • Accelerated information retrieval, reducing response times by 50%.
  • Enabled quick, accurate troubleshooting through GenAI-based document search, minimizing operational downtime.
  • Let technicians access information swiftly, reducing time spent on manual searches by 70%.
  • Delivered real-time insights that help stakeholders make well-informed decisions, supporting agile and responsive operations.