Energy-Driven Intelligent Generative Urban Design, Based On Deep Reinforcement Learning Method With A Nested Deep Q-R Network
Chenyu Huang School of Architecture and Art, North China University of Technology
Gengjia Zhang Department of Architecture, Tamkang University
Minggang Yin East China Architectural Design and Research Institute Co., Ltd
Jiawei Yao College of Architecture and Urban Planning, Tongji University
To attain “carbon neutrality,” lowering urban energy use and increasing the use of renewable resources have become critical concerns for urban planning and architectural design. Traditional energy consumption evaluation tools have a high operational threshold, requiring specific parameter settings and cross-disciplinary knowledge of building physics. As a result, it is difficult for architects to manage energy issues through ‘trial and error’ in the design process. The purpose of this study is to develop an automated workflow capable of providing urban configurations that minimize energy use while maximizing rooftop photovoltaic power potential. Based on shape grammar, parametric meta-models of three different urban forms were developed and batch-simulated for their energy performance. Deep reinforcement learning (DRL) is introduced to find the optimal urban geometry. A neural network was trained to fit a real-time mapping from urban form indicators to energy performance and used to predict the reward for the DRL process; this Deep R-Network is nested within a Deep Q-Network. The workflow proposed in this paper improves the efficiency of optimizing the energy performance of solutions in the early stages of design and facilitates a collaborative design process with human-machine interaction.
Keywords: Energy-Driven Urban Design, Intelligent Generative Design, Rooftop Photovoltaic Power, Deep Reinforcement Learning, SDG 11, SDG 12
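To make the nested Deep Q-R idea described in the abstract concrete, the following is a minimal sketch of a reward-predictor network ("Deep R-Network") supplying the reward term inside a standard DQN update, in place of a full building-energy simulation. It is not the authors' published implementation: the network sizes, the 10-dimensional urban-form state, the 4-action space, and all class and variable names are illustrative assumptions.

```python
# Hedged sketch: a reward-predictor ("Deep R-Network") nested inside a DQN update.
# All names and dimensions below are illustrative assumptions, not the paper's code.
import torch
import torch.nn as nn
import torch.nn.functional as F

STATE_DIM, N_ACTIONS, GAMMA = 10, 4, 0.99  # assumed sizes of the urban-form state/action space

class RNetwork(nn.Module):
    """Maps urban form indicators to a predicted energy-performance reward."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(STATE_DIM, 64), nn.ReLU(), nn.Linear(64, 1))
    def forward(self, state):
        return self.net(state).squeeze(-1)

class QNetwork(nn.Module):
    """Standard DQN head over the same urban-form state."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(STATE_DIM, 64), nn.ReLU(), nn.Linear(64, N_ACTIONS))
    def forward(self, state):
        return self.net(state)

r_net, q_net = RNetwork(), QNetwork()
q_opt = torch.optim.Adam(q_net.parameters(), lr=1e-3)

def dqn_step(state, action, next_state):
    """One Q-learning update where the reward comes from the nested R-Network
    instead of running an energy simulation for every candidate geometry."""
    with torch.no_grad():
        reward = r_net(next_state)                                  # surrogate energy reward
        target = reward + GAMMA * q_net(next_state).max(dim=1).values
    q_pred = q_net(state).gather(1, action.unsqueeze(1)).squeeze(1)
    loss = F.mse_loss(q_pred, target)
    q_opt.zero_grad(); loss.backward(); q_opt.step()
    return loss.item()

# Toy usage with random tensors standing in for batched urban-form states.
s  = torch.randn(8, STATE_DIM)
a  = torch.randint(0, N_ACTIONS, (8,))
s2 = torch.randn(8, STATE_DIM)
dqn_step(s, a, s2)
```

In this reading, the R-Network would first be fitted offline to the batch simulation results, so that the Q-Network can query it as a fast reward surrogate during DRL training; how the two networks are coupled in practice follows the paper's own method, which the sketch only approximates.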