TR2023-072

Synthesizing Building Operation Data with Generative Models: VAEs, GANs, or Something In Between?


    •  Salatiello, A., Wang, Y., Wichern, G., Koike-Akino, T., Yoshihiro, O., Kaneko, Y., Laughman, C.R., Chakrabarty, A., "Synthesizing Building Operation Data with Generative Models: VAEs, GANs, or Something In Between?", ACM e-Energy Conference, DOI: 10.1145/3599733.3600260, June 2023.
      @inproceedings{Salatiello2023jun,
        author = {Salatiello, Alessandro and Wang, Ye and Wichern, Gordon and Koike-Akino, Toshiaki and Yoshihiro, Ohta and Kaneko, Yosuke and Laughman, Christopher R. and Chakrabarty, Ankush},
        title = {Synthesizing Building Operation Data with Generative Models: VAEs, GANs, or Something In Between?},
        booktitle = {ACM e-Energy Conference},
        year = 2023,
        month = jun,
        doi = {10.1145/3599733.3600260},
        url = {https://www.merl.com/publications/TR2023-072}
      }
  • Research Areas:

    Machine Learning, Multi-Physical Modeling, Optimization

Abstract:

The generation of time-series profiles of building operation requires expensive and time-consuming data consolidation and modeling efforts that rely on extensive domain knowledge and need frequent revisions due to evolving energy systems, user behavior, and environmental conditions. Generative deep learning offers an automatic, scalable, data-source-agnostic, and efficient way to synthesize artificial time-series profiles by learning the distribution of the original data. While a range of generative neural networks have been proposed, generative adversarial networks (GANs) and variational autoencoders (VAEs) are the most popular models; GANs typically require considerable customization to stabilize the training procedure, while VAEs are often reported to generate lower-quality samples than GANs. In this paper, we propose a network architecture and training procedure that combines the strengths of VAEs and GANs through Regularized Adversarial Fine-Tuning (RAFT). We imbue the architecture with conditional inputs that reflect ambient/outdoor and operating conditions, and demonstrate its effectiveness using operational data collected over 585 days from SUSTIE: Mitsubishi Electric’s net-zero energy building. Compared against the classical GAN, VAE, Wasserstein-GAN, and VAE-GAN, our proposed conditional RAFT-VAE-GAN outperforms its competitors in terms of mean accuracy, training stability, and several metrics that ascertain how close the synthetic distribution is to the measured data distribution.
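The abstract describes the training procedure only at a high level: VAE-style pretraining followed by Regularized Adversarial Fine-Tuning of the generator. The numpy sketch below illustrates one plausible reading of that two-phase idea on toy daily load profiles: a stand-in linear decoder is first fitted by reconstruction (playing the role of the pretrained VAE decoder), then fine-tuned to raise a fixed discriminator's score while an L2 penalty anchors its weights to the pretrained solution. The linear models, the toy data, and the regularizer form are illustrative assumptions, not the paper's architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for daily building-operation profiles: 24-step load curves.
T, Z, N = 24, 4, 256
t = np.linspace(0.0, 2.0 * np.pi, T)
amp = rng.uniform(0.8, 1.2, size=(N, 1))
phase = rng.uniform(-0.3, 0.3, size=(N, 1))
real = amp * np.sin(t[None, :] + phase)           # (N, T) "real" profiles

# --- Phase 1: VAE-style pretraining (stand-in) ---
# A linear decoder fitted by least squares maps latent codes to profiles,
# playing the role of the pretrained VAE decoder.
z = rng.normal(size=(N, Z))                       # latent codes
W_vae, *_ = np.linalg.lstsq(z, real, rcond=None)  # (Z, T) decoder weights

# A fixed linear discriminator scoring real (+1) vs. synthetic (-1) profiles.
X = np.vstack([real, z @ W_vae])
y = np.concatenate([np.ones(N), -np.ones(N)])
w_disc, *_ = np.linalg.lstsq(X, y, rcond=None)    # (T,) discriminator weights

# --- Phase 2: regularized adversarial fine-tuning (stand-in) ---
# Raise the discriminator's sigmoid score on synthetic samples while an L2
# penalty anchors the decoder weights to the pretrained VAE solution.
W = W_vae.copy()
lam, lr = 0.1, 0.05
for _ in range(200):
    s = 1.0 / (1.0 + np.exp(-(z @ W @ w_disc)))   # sigmoid "realness" scores
    # gradient of -mean(log s) w.r.t. W, plus the anchoring term
    g = -np.outer(z.T @ (1.0 - s), w_disc) / N + 2.0 * lam * (W - W_vae)
    W -= lr * g

synthetic = rng.normal(size=(8, Z)) @ W           # 8 synthetic profiles
```

The anchoring term is what keeps the adversarial phase from drifting arbitrarily far from the pretrained decoder; setting `lam` to zero recovers plain adversarial fine-tuning, which is where GAN-style instability would normally reappear.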

