
Multimodal AI Framework for Personalized and Health-Aware Cooking Recommendations

Publisher: IEEE

Authors: S Swarna Suganthi, Velammal College of Engineering and Technology, Madurai; P Pooja Shree, Velammal College of Engineering and Technology, Madurai


Abstract:

In the current era of growing interest in health-conscious eating and personalized nutrition, traditional recipe recommendation systems often fail to account for diverse user needs, ingredient availability, and practical cooking constraints. The Multimodal Artificial Intelligence (AI) Framework proposed in this study integrates and analyzes multiple data modalities—textual dietary preferences, food images, and cooking videos—to generate personalized and health-aware cooking recommendations. The framework considers individual health profiles, ingredients detected from visual inputs, and user-specific cooking skill levels inferred from video analysis to tailor recipe suggestions effectively. By leveraging multimodal deep learning algorithms, the system delivers contextually aware, precise, and adaptive recommendations. Experimental evaluations on benchmark and hybrid datasets demonstrate its effectiveness in enhancing recommendation relevance, supporting dietary compliance, and improving overall user satisfaction. These results indicate strong potential for real-world deployment in intelligent culinary assistants, personalized diet planning platforms, and smart health applications.
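The abstract does not specify the fusion architecture, so the following is only a minimal late-fusion sketch under assumed conventions: hypothetical per-modality encoders (for dietary-preference text, food images, and cooking videos) each produce a fixed-size embedding, the embeddings are normalized and concatenated into a user profile, and candidate recipes are ranked by dot-product similarity. All function names and dimensions here are illustrative, not from the paper.

```python
import numpy as np

def fuse_modalities(text_emb, image_emb, video_emb):
    """Late fusion: L2-normalize each modality embedding, then concatenate.
    Inputs are 1-D arrays from hypothetical per-modality encoders."""
    parts = []
    for emb in (text_emb, image_emb, video_emb):
        norm = np.linalg.norm(emb)
        parts.append(emb / norm if norm > 0 else emb)
    return np.concatenate(parts)

def rank_recipes(user_vec, recipe_vecs):
    """Score each candidate recipe embedding against the fused user
    profile by dot product; return candidate indices, best first."""
    scores = recipe_vecs @ user_vec
    return np.argsort(scores)[::-1]

# Toy example: 4-dim embeddings per modality -> 12-dim fused profile,
# ranked against 5 random candidate recipe vectors.
rng = np.random.default_rng(0)
user = fuse_modalities(rng.normal(size=4), rng.normal(size=4), rng.normal(size=4))
recipes = rng.normal(size=(5, 12))
order = rank_recipes(user, recipes)
print(order)
```

In a real system each `*_emb` would come from a learned encoder (e.g. a text, image, and video model), and health constraints from the user profile would filter candidates before ranking; this sketch covers only the fusion-and-rank step.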

Keywords: Personalized Recipe Recommendation, Multimodal Deep Learning, Ingredient Recognition, Cooking Skill Estimation

Published in: 2024 Asian Conference on Communication and Networks (ASIANComNet)

Date of Publication: --

DOI: -
