TR2025-059
Electric Motor Cogging Torque Prediction with Vision Transformer Models
- "Electric Motor Cogging Torque Prediction with Vision Transformer Models", IEEE International Electric Machines and Drives Conference (IEMDC), May 2025.
@inproceedings{Sun2025may,
  author = {Sun, Siyuan and Wang, Ye and Koike-Akino, Toshiaki and Yamamoto, Tatsuya and Sakamoto, Yusuke and Wang, Bingnan},
  title = {{Electric Motor Cogging Torque Prediction with Vision Transformer Models}},
  booktitle = {IEEE International Electric Machines and Drives Conference (IEMDC)},
  year = 2025,
  month = may,
  url = {https://www.merl.com/publications/TR2025-059}
}
Abstract:
Motor performance characteristics such as cogging torque and torque ripple are difficult to predict accurately with surrogate models. In this work, we propose Vision Transformer (ViT) based models to tackle this problem. We adopt a ViT model pre-trained on image classification tasks and fine-tune it on a dataset prepared for interior permanent magnet motor designs. Each motor design is represented by a 2D image and fed into the ViT model to predict cogging torque. To further improve the data efficiency of the model, we customize it by using the motor design parameters to initialize the class token of the ViT model. We show that the proposed method significantly outperforms established deep convolutional neural network (CNN) based models and achieves high accuracy on cogging torque prediction on the test dataset.
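The class-token customization described in the abstract can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: the image size, patch size, embedding dimension, and number of design parameters below are all illustrative assumptions. The sketch shows the key idea of deriving the ViT class token from the motor design parameters (via a linear projection) instead of using a learned constant, and prepending it to the patch-embedding sequence.

```python
import numpy as np

# Illustrative dimensions (assumptions, not from the paper).
img_size, patch, embed_dim, n_params = 64, 16, 128, 8
n_patches = (img_size // patch) ** 2  # 16 patches for a 64x64 cross-section image

rng = np.random.default_rng(0)

# 2D motor geometry image (stand-in for an actual IPM cross-section).
image = rng.standard_normal((img_size, img_size))

# Patch embedding: split into non-overlapping patches, flatten, project linearly.
W_patch = rng.standard_normal((patch * patch, embed_dim)) * 0.02
patches = (
    image.reshape(img_size // patch, patch, img_size // patch, patch)
    .transpose(0, 2, 1, 3)
    .reshape(n_patches, patch * patch)
)
tokens = patches @ W_patch  # shape (n_patches, embed_dim)

# Parameter-initialized class token: project the motor design parameter
# vector (e.g. magnet width, slot opening, ...) into the embedding space.
W_cls = rng.standard_normal((n_params, embed_dim)) * 0.02
design_params = rng.standard_normal(n_params)
cls_token = design_params @ W_cls  # shape (embed_dim,)

# Prepend the class token to the patch tokens, as in a standard ViT.
sequence = np.vstack([cls_token, tokens])
print(sequence.shape)  # (1 + n_patches, embed_dim) = (17, 128)
```

In a full model, `sequence` (plus positional embeddings) would pass through the transformer encoder, and the output at the class-token position would feed a regression head predicting cogging torque; here the sketch only demonstrates how design-parameter information can enter through the class token.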