Level: Ph.D.

AI-powered digital matching for the built environment

Abstract

The contemporary landscape of Building Information Modeling (BIM) has undergone a substantial transformation and now embodies a vast repository of contextual and operational data spanning the lifecycle of built assets. This broad and complex spectrum of data poses a challenge: assimilating compartmentalized, multi-source information and extracting usable insight from it. My doctoral research addresses this complexity by proposing a new framework that synergizes knowledge graphs (as a backbone for data integration), virtual reality (VR), and artificial intelligence (AI)-driven interfaces. This integrative approach aims to revolutionize user interaction with digital twins, facilitating more intuitive and immersive engagement with virtual representations of built environments.

Project results

With the aim of advancing interactivity in building information modeling (BIM), one of the main outcomes of my doctoral research was the development of a prototype BIM-based virtual reality (VR) tool. The tool integrates live streams of sensor data from buildings and renders them in a virtual environment. This capability is essential for professionals planning simulation scenarios, as it provides a dynamic, responsive interface that simulates potential real-world changes and their impacts. The ability to visualize and manipulate building data in real time in a VR space marks a significant leap forward for scenario analysis and decision-making in building management. From a data integration and analysis perspective, my research explored unsupervised learning as a means of enabling machines to understand built-asset lifecycle data through knowledge graphs. Relying on unsupervised learning algorithms, the system can autonomously interpret complex graph representations of built-environment data, discovering patterns and associations without human supervision.
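To make the idea of unsupervised pattern discovery over a built-asset knowledge graph concrete, the following minimal sketch featurizes the nodes of a toy triple store by their per-predicate degrees and groups them with a small k-means routine. The graph content, the feature choice, and the clustering step are all illustrative assumptions, not the algorithms developed in the thesis.

```python
from collections import defaultdict
import random

# Toy knowledge graph: (subject, predicate, object) triples describing
# built-asset lifecycle data. All names are invented for this sketch.
triples = [
    ("Sensor_T1", "monitors", "Room_101"),
    ("Sensor_T2", "monitors", "Room_102"),
    ("Sensor_H1", "monitors", "Room_101"),
    ("Room_101", "partOf", "Floor_1"),
    ("Room_102", "partOf", "Floor_1"),
    ("Floor_1", "partOf", "Building_A"),
    ("HVAC_1", "serves", "Floor_1"),
    ("HVAC_1", "hasStatus", "Running"),
]

def node_features(triples):
    """Featurize each node by its outgoing/incoming degree per predicate."""
    preds = sorted({p for _, p, _ in triples})
    feats = defaultdict(lambda: [0.0] * (2 * len(preds)))
    for s, p, o in triples:
        i = preds.index(p)
        feats[s][i] += 1.0               # outgoing edge of this type
        feats[o][len(preds) + i] += 1.0  # incoming edge of this type
    return dict(feats)

def kmeans(points, k, iters=20, seed=0):
    """Minimal k-means: returns one cluster label per point."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    labels = [0] * len(points)
    for _ in range(iters):
        for i, p in enumerate(points):
            labels[i] = min(range(k), key=lambda c: sum(
                (a - b) ** 2 for a, b in zip(p, centers[c])))
        for c in range(k):
            members = [points[i] for i in range(len(points)) if labels[i] == c]
            if members:
                centers[c] = [sum(col) / len(members) for col in zip(*members)]
    return labels

feats = node_features(triples)
nodes = sorted(feats)
labels = kmeans([feats[n] for n in nodes], k=3)
clusters = defaultdict(list)
for n, lab in zip(nodes, labels):
    clusters[lab].append(n)
for members in clusters.values():
    print(members)  # nodes with the same structural role land together
```

Nodes that play the same structural role in the graph (for example, the two temperature sensors, each with a single outgoing `monitors` edge) receive identical feature vectors and therefore fall into the same cluster, without any labels being supplied.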
This method considerably reduces the time and resources required for BIM data analysis, and thus streamlines the management and operation of built assets. The culmination of these efforts is an AI-powered interface that enables intuitive interaction with digital twins and BIM models. The interface exploits open BIM standards, ensuring broad compatibility with various BIM software packages and data formats. Moreover, thanks to the integration of advanced natural language processing (NLP) techniques, users can interact with BIM data in conversational language, making the system more accessible and user-friendly for professionals regardless of their level of technical expertise. This AI-driven interface acts as a bridge, translating complex BIM data into usable information through a more natural and intuitive user experience.
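The following sketch illustrates the shape of such a conversational query over BIM data. The element records mimic entities from an open-standard IFC model, and simple keyword and pattern matching stands in for the full NLP pipeline; all names and properties are invented for illustration.

```python
import re

# Illustrative BIM elements, as they might be extracted from an IFC model.
elements = [
    {"type": "IfcWall",   "name": "Wall-Ext-01", "storey": "Level 1", "material": "Concrete"},
    {"type": "IfcWall",   "name": "Wall-Int-02", "storey": "Level 2", "material": "Gypsum"},
    {"type": "IfcDoor",   "name": "Door-01",     "storey": "Level 1", "material": "Wood"},
    {"type": "IfcWindow", "name": "Window-03",   "storey": "Level 2", "material": "Glass"},
]

# Map everyday words to IFC entity types (a stand-in for real NLP parsing).
TYPE_SYNONYMS = {"wall": "IfcWall", "walls": "IfcWall",
                 "door": "IfcDoor", "doors": "IfcDoor",
                 "window": "IfcWindow", "windows": "IfcWindow"}

def answer(query, elements):
    """Resolve a conversational query to the names of matching BIM elements."""
    words = re.findall(r"[a-z0-9]+", query.lower())
    wanted_type = next((TYPE_SYNONYMS[w] for w in words if w in TYPE_SYNONYMS), None)
    m = re.search(r"level\s*(\d+)", query.lower())
    storey = f"Level {m.group(1)}" if m else None
    return [e["name"] for e in elements
            if (wanted_type is None or e["type"] == wanted_type)
            and (storey is None or e["storey"] == storey)]

print(answer("Which walls are on level 1?", elements))  # → ['Wall-Ext-01']
```

A production interface would replace the synonym table and regular expressions with a trained language model and resolve queries against a live model server, but the bridge it provides is the same: plain language in, structured BIM elements out.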

Project contributions

From a research perspective, exploring unsupervised learning to interpret knowledge graphs in the context of built assets deepens our understanding of data analysis in BIM and opens avenues for future research in autonomous data-model recognition. The research also makes a substantial academic contribution by merging natural language processing (NLP) and BIM, offering an insightful perspective on harmonizing advanced computing techniques with the pragmatic constraints of the construction industry, such as cost-effectiveness and system responsiveness. From a practical standpoint, the VR tool developed in this research gives industry professionals powerful decision-making capabilities by visualizing sensor data and potential changes in a virtual environment. The AI-powered interface makes BIM models accessible to a wider range of professionals: by using NLP and open BIM standards, it lowers the barrier to entry for interacting with complex BIM data, enabling wider adoption of BIM technologies across the industry.
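The decision-support loop described above, streaming live sensor readings into a virtual scene, can be sketched minimally as follows. Both ends are simulated here: in the actual tool the readings would come from a building IoT gateway and the updates would drive a VR engine, so the functions below are hypothetical stand-ins.

```python
import random
import time

def read_sensor(sensor_id):
    """Simulated live temperature reading in °C; a real system would
    query an IoT gateway or sensor API here."""
    return 20.0 + random.uniform(-2.0, 2.0)

def temperature_to_color(t, lo=18.0, hi=24.0):
    """Map a temperature onto a blue-to-red gradient for in-scene rendering."""
    x = max(0.0, min(1.0, (t - lo) / (hi - lo)))
    return (int(255 * x), 0, int(255 * (1 - x)))  # (R, G, B)

def stream(sensor_ids, apply_update, ticks=3, period=0.0):
    """Poll each sensor once per tick and push a color update to the scene.
    `apply_update` stands in for a VR engine's property-setting call."""
    for _ in range(ticks):
        for sid in sensor_ids:
            apply_update(sid, temperature_to_color(read_sensor(sid)))
        time.sleep(period)

# Here the "scene" is just a dict mapping room IDs to their current color.
frame = {}
stream(["Room_101", "Room_102"], lambda sid, rgb: frame.__setitem__(sid, rgb))
print(frame)
```

The key design point is that the polling loop is decoupled from the rendering backend through the `apply_update` callback, so the same stream can feed a VR headset, a desktop dashboard, or a what-if simulation without change.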

Publications

Publications from this project are available below:

Shahinmoghaddam, M., Nazari, A., & Zandieh, M. (2018). CA-FCM: Towards a formal representation of expert’s causal judgments over construction project changes. Advanced Engineering Informatics, 38, 620–638.

Shahinmoghadam, M., & Motamedi, A. (2019, May). Review of BIM-centred IoT Deployment – State of the Art, Opportunities, and Challenges. In Proceedings of the 36th International Symposium on Automation and Robotics in Construction (ISARC 2019).

Shahinmoghadam, M., & Motamedi, A. (2021). An ontology-based mediation framework for integrating federated sources of BIM and IoT data. In Proceedings of the 18th International Conference on Computing in Civil and Building Engineering: ICCCBE 2020. Springer International Publishing.

Motamedi, A., & Shahinmoghadam, M. (2021). BIM-IoT-integrated architectures as the backbone of cognitive buildings: Current state and future directions. In BIM-enabled Cognitive Computing for Smart Built Environment (pp. 45–68). CRC Press.

Motamedi, A., & Cheriet, M. (2021). Applying Machine Learning and Digital Twinning for the Live Assessment of Thermal Comfort in Buildings. In Proceedings of the CIB W78 Conference, 2021, 11–15.

Davari, S., Shahinmoghadam, M., Motamedi, A., & Poirier, E. A. (2022). Demystifying the Definition of Digital Twin for Built Environment. Proceedings of the ICCEPM: The 9th International Conference on Construction Engineering and Project Management.

Research team

Partners: BESLOGIC.
