Authors: Dileep Palle, SRM Institute of Science and Technology
This paper presents a detailed approach to designing, implementing, and evaluating a RESTful API architecture that enables seamless integration of an AI-powered chatbot into modern customer service systems. The proposed system leverages cloud-based AI services to process live chat input and generate contextually relevant responses to user queries. The REST API acts as a middleware layer connecting front-end interfaces, such as web and mobile clients, with back-end AI services, supporting effective session management, context tracking, and secure data exchange across distributed environments.
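As an illustration of the middleware pattern described above, the sketch below shows the body of a hypothetical `POST /chat/{session_id}` handler that tracks per-session context and forwards each turn to the AI back end. All names here are illustrative assumptions, not the paper's implementation; the in-memory store and the `_call_model` stub stand in for a distributed session cache and a cloud AI service.

```python
import uuid

# Hypothetical in-memory session store; a production deployment would use
# a distributed cache behind the REST layer instead.
_sessions: dict[str, list[dict]] = {}

def create_session() -> str:
    """Open a new conversation and return its session id."""
    sid = uuid.uuid4().hex
    _sessions[sid] = []
    return sid

def _call_model(history: list[dict], message: str) -> str:
    """Stand-in for the cloud AI back end; echoes the turn count."""
    return f"echo({len(history)}): {message}"

def handle_message(sid: str, message: str) -> dict:
    """Body of a POST /chat/{sid} handler: query the model with the
    accumulated context, record the turn, and return the reply."""
    history = _sessions[sid]
    reply = _call_model(history, message)
    history.append({"user": message, "bot": reply})
    return {"session": sid, "reply": reply, "turns": len(history)}
```

Keeping context server-side lets thin web or mobile front ends stay stateless: each client only holds the session id and exchanges plain JSON with the API.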
The API design supports both synchronous and asynchronous response handling, allowing communication to adapt to service requirements and varying network conditions. Through well-defined workflows and feedback-collection mechanisms, the chatbot continuously improves its conversational quality, refining intent recognition and increasing user satisfaction. In addition, robust security and privacy measures, including token-based authentication, encrypted data transmission, and compliance with data protection regulations, safeguard user interactions.
Keywords: AI Chatbot, REST API, Customer Service Automation, Conversational AI, Cloud Deployment, Performance Evaluation, Scalability, Security, Open API Specification, Google Cloud Platform
Published in: 2024 Asian Conference on Communication and Networks (ASIANComNet)
Date of Publication: --
DOI: -
Publisher: IEEE