

3D Neural Synthesis: Gaining Control with Neural Radiance Fields

ACADIA Conference Denver 2023

CAADRIA Workshop 2023

This research introduces a novel 3D machine-learning-aided design approach for early design stages. Integrating language within a multimodal framework grants designers greater control and agency in generating 3D forms. The proposed method leverages Stable Diffusion and Runway's Gen-1 to generate 3D Neural Radiance Fields (NeRFs), surpassing the limitations of 2D image-based outcomes in aiding the design process. This paper presents a flexible machine-learning workflow taught to students in a conference workshop and outlines the multimodal translations between text, image, video, and NeRFs. The resulting NeRF design outcomes are contextualized within a Unity agent-based virtual environment for architectural simulation and are experienced with real-time VFX augmentations. This hybridized design process ultimately highlights the importance of feedback loops and control within machine-learning-aided design processes.
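The workflow above centers on NeRFs, which represent a scene as a field of densities and colors that is turned into images by compositing samples along each camera ray. As a rough illustration of that underlying representation (not the project's actual code), the standard NeRF volume-rendering rule can be sketched in NumPy; the sample densities, colors, and spacing below are made-up values:

```python
import numpy as np

def render_ray(sigmas, colors, delta):
    """Composite colors along one ray with the NeRF volume-rendering rule.

    sigmas: (N,) densities at N samples along the ray
    colors: (N, 3) RGB color at each sample
    delta:  spacing between samples
    """
    alphas = 1.0 - np.exp(-sigmas * delta)  # opacity contributed by each sample
    # Transmittance: how much light survives to reach each sample.
    trans = np.cumprod(np.concatenate([[1.0], 1.0 - alphas[:-1]]))
    weights = trans * alphas
    return (weights[:, None] * colors).sum(axis=0)  # final pixel RGB

# Hypothetical ray with two samples: a faint red point, then a dense blue one.
sigmas = np.array([0.5, 5.0])
colors = np.array([[1.0, 0.0, 0.0],
                   [0.0, 0.0, 1.0]])
pixel = render_ray(sigmas, colors, delta=0.1)
```

In a full NeRF, a neural network predicts `sigmas` and `colors` from 3D positions and view directions, and this compositing step is what lets text- or image-driven models supervise a consistent 3D form from 2D renders.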

Research Team: George Guida, Daniel Escobar, Carlos Navarro

ACADIA 2023 conference paper on ResearchGate

YouTube recording of the conference presentation


Project Video


Feedback Process

Results + Process
