Paper Title: BAT-GPT: Large Language Models for Interactive Digital Twin-based BMS
Conference Name: KICS Summer 2024
Abstract: This study introduces BAT-GPT, a novel digital twin assistant specifically designed for battery management systems (BMS). Built on the Text-To-Text Transfer Transformer (T5) architecture, the model interprets insights provided by digital twins, drawing on a specialized dataset derived from a NASA battery database. The dataset was carefully prepared to provide relevant prompt-response pairs, enabling the model to be fine-tuned effectively for this domain. We employ a gradient accumulation strategy during training to optimize GPU resource usage. Experimental results show that BAT-GPT trained with gradient accumulation achieves over a 5% increase in ROUGE-L score, BLEU score, and accuracy, together with an approximately 6% decrease in perplexity and training time, compared to the model trained without gradient accumulation. These results highlight the model's capability to significantly enhance explainability and user interaction within digital twins for BMS applications.
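
To illustrate the training strategy named in the abstract, below is a minimal PyTorch sketch of T5 fine-tuning with gradient accumulation. It is not the paper's actual pipeline: the model checkpoint ("t5-small"), learning rate, batch sizes, accumulation step count, and the example prompt-response pair are all illustrative assumptions.

```python
# Sketch: fine-tuning T5 with gradient accumulation so a large effective
# batch fits on limited GPU memory. All hyperparameters are assumptions,
# not the values used for BAT-GPT.
import torch
from torch.utils.data import DataLoader
from transformers import T5ForConditionalGeneration, T5TokenizerFast

device = "cuda" if torch.cuda.is_available() else "cpu"
tokenizer = T5TokenizerFast.from_pretrained("t5-small")
model = T5ForConditionalGeneration.from_pretrained("t5-small").to(device)
optimizer = torch.optim.AdamW(model.parameters(), lr=3e-4)

accum_steps = 8  # gradients from 8 micro-batches form one optimizer step

# Hypothetical prompt/response pair in the style of a BMS digital twin dataset.
pairs = [
    ("summarize battery state: voltage=3.9V capacity=1.7Ah cycle=120",
     "The battery shows moderate degradation after 120 cycles."),
] * 32

def collate(batch):
    prompts, targets = zip(*batch)
    enc = tokenizer(list(prompts), padding=True, return_tensors="pt")
    labels = tokenizer(list(targets), padding=True, return_tensors="pt").input_ids
    labels[labels == tokenizer.pad_token_id] = -100  # ignore padding in the loss
    enc["labels"] = labels
    return enc

loader = DataLoader(pairs, batch_size=4, collate_fn=collate, shuffle=True)

model.train()
optimizer.zero_grad()
for step, batch in enumerate(loader):
    batch = {k: v.to(device) for k, v in batch.items()}
    # Scale the loss so the accumulated gradient averages over micro-batches.
    loss = model(**batch).loss / accum_steps
    loss.backward()  # gradients accumulate across calls until zero_grad()
    if (step + 1) % accum_steps == 0:
        optimizer.step()
        optimizer.zero_grad()
```

With a micro-batch of 4 and 8 accumulation steps, each optimizer update reflects an effective batch of 32 examples while only a batch of 4 resides on the GPU at any moment, which is the resource trade-off the abstract credits for the reduced training cost.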