
1 Introduction

AR was first defined in 1997 as a technological field involving the seamless overlay of computer-generated virtual images aligned with the real world, which can be viewed and interacted with in real time [1]. With the continuous development of technology and equipment, AR has gradually entered our daily lives. In particular, over the past decade, with the invention of AR headsets and the popularisation of AR-ready smartphones and tablets, research on and applications of AR have grown explosively [5]. However AR develops, it does not deviate from its purpose of bridging the gap between virtual data and the real world [2].

Furthermore, more and more architects are interested in AR because it arguably offers new interaction scenarios across all architectural fields [7]. Recently, AR technology has been applied and explored in areas ranging from form-finding and design, construction, and visualisation to education, and more [11].

Architectural design, being the quintessential 3D–4D design field, has throughout its history been limited to 2D or cumbersome 3D representation, such as sketching on a plane surface or building physical scale models [3]. Even though computer-aided architectural design and modelling software is widely used to produce digital 3D models, previews are still confined to a 2D screen, which lacks an intuitive means of onsite visualisation and modification. Additionally, conventional screen-based visualisation methods for design and analysis restrict how well the user understands the space, and because the design is done away from the building site, disparities may arise between the design and the final fabrication [9]. These limitations may be eliminated by AR technology, which has become readily available, together with tools that facilitate the easy creation of 3D–4D models as holograms onsite. Furthermore, with its gesture and voice capture features, AR can increase the potential for interaction between humans and data [4].

Robotic fabrication, an emerging high-tech architectural digital fabrication method, has shown great potential for integrating architectural design and engineering practices, establishing a highly effective interplay between digital design and construction processes [8]. However, operating robots requires complex knowledge and skilled programmers, expertise traditionally not found among architectural practitioners [10]. Although there are already robotic operation plugins for the Rhinoceros/Grasshopper platform, they require architects to programme the process visually, which is usually inefficient, complicated, and accompanied by many debugging and instability errors. Robotic programming in Grasshopper also challenges traditional architects' logic, and even with these programming methods there is a lack of safety simulation and protection for inexperienced architects. Therefore, digital fabrication always needs the help of engineers, and due to the disconnection between architects and robotic engineers, uncertain situations often appear in the robotic fabrication process [6]. AR technology may overcome this limitation: it can capture interactive inputs through a UI and display onsite holographic simulations, providing an easy, safe, and low-threshold way for architects to control robots by themselves.

This paper proposes an onsite AR immersive design to fabrication framework that combines the above characteristics and functions of AR, in order to examine how AR technology is changing traditional design-to-assembly methods in architectural construction.

2 Research Methodology

Our Augmented Bricks research project proposes an onsite AR immersive design to fabrication framework for the assembly of masonry structures. The framework consists of two phases: (a) the algorithmic immersive design of the object and (b) its fabrication by a robotic arm (Fig. 1). To validate this AR-assisted framework, we conducted a design experiment comprising the design and assembly of four parametric brick-based columns, evaluated its workflow, and inspected the advantages and disadvantages of each step. The prototypes were designed and built with styrofoam blocks (150 × 50 × 20 mm) as the prototype brick-based material, which is suitable for parametric design, easy for AR devices to detect, and able to be picked and placed by the robotic gripper.

Fig. 1. The Augmented Bricks project AR-assisted framework flowchart. The framework is divided into two phases: immersive design and AR-assisted robotic assembly. The outcomes of each phase are a digital design and a physical structure.

Our software includes Rhinoceros/Grasshopper, which was used to develop the design algorithms, as well as the structural simulation plugin PhysX, the robotic fabrication plugin Robots, and Fologram, an AR plugin for Grasshopper. We use Fologram to identify AR interactions from hand gestures or screen-based inputs; PhysX to provide real-time structural stability simulation and design modification feedback; and Robots to develop the robotic operation trajectory and gripper commands. Our original contribution is to integrate the complementary functions of these plugins into an onsite AR immersive design to assembly framework for masonry structures.

Our hardware includes a handheld device (iPhone 11) and a headset (Microsoft HoloLens 1) for AR, as well as a Universal Robots UR10 robotic arm with a Robotiq 2F-140 gripper for the robotic equipment. We also use a laptop for back-end running and debugging. All of these devices are connected to a Wi-Fi router on the same IP network to transfer data between the different stages and to live-stream commands between the design software and plugins for visualisation and response output.

3 The Augmented Bricks Design Experiment and Outcomes

3.1 Phase 1: The AR Immersive Design

The AR immersive design process is the first phase of the Augmented Bricks experiment, comprising 3D onsite environment scanning, gesture- or screen-based interactive design input, structural simulation feedback, and multi-designer data sharing. The idea of the immersive design method is to evolve the traditional design method by giving architects a shareable 3D–4D modelling environment and providing them with an onsite virtual spatial experience and structural feedback before the structure is built.

Before the AR immersive design, the user requires a 3D-scanned onsite design base. To achieve that, we provide two spatial environment scanning ports: an AR smart device (smartphone or tablet) and an AR headset. First, the user activates the ‘Track Scan’ function in the Fologram plugin for real-time digitised environment scanning; the scanning operation consists of looking around the onsite environment smoothly with the device camera under a steady light source. Second, the physical environment is transformed into a simple mesh in Grasshopper for architects to use as an onsite design base. Last, to adjust and align the digital environment or set the design boundaries, the user can place Aruco Markers as physical datum reference points on the onsite base and convert them digitally by scanning the markers with AR devices, which improves the accuracy of the design plane significantly (Fig. 2). The converted onsite base mesh is stored in a QR code for subsequent use. This method is only suitable for simple and basic onsite environments; for complex environments or uncertain onsite bases, we recommend activating the Capture app on a smart device, or the spatial mapping function in HoloLens, to scan and import the corresponding highly accurate digital 3D meshes for further editing in software before using Aruco Markers to set the datum reference points.
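As an aside, marker detection of this kind can be reproduced outside Fologram. The following is a minimal sketch of reading Aruco Marker datum points from a camera frame with OpenCV (assuming OpenCV ≥ 4.7 and its aruco module); it is not the authors' pipeline, and the file name and dictionary choice are illustrative.

```python
# Minimal sketch: detect Aruco Markers in a camera frame to obtain datum
# reference points. Assumes OpenCV >= 4.7; the file name and dictionary
# are illustrative, not the Fologram implementation.
import cv2

dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
detector = cv2.aruco.ArucoDetector(dictionary, cv2.aruco.DetectorParameters())

frame = cv2.imread("onsite_frame.jpg")           # hypothetical onsite photo
corners, ids, _rejected = detector.detectMarkers(frame)

if ids is not None:
    for marker_id, quad in zip(ids.flatten(), corners):
        centre = quad[0].mean(axis=0)            # pixel centre of the marker
        print(f"marker {marker_id}: datum point at pixel {centre}")
```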

Fig. 2. The designer uses the 3D onsite environment scanning process, including the Aruco Marker datum reference points, to create the corresponding digitalised environment mesh for the immersive design base and bounding plane

For the immersive design process, we create an open design algorithm platform, followed by an AR immersive UI and a structural stability simulation feedback loop. First, the user chooses an algorithm from our design library, each representing a different brick-based structural shape. These algorithms follow parametric design logic, providing essential shape control and design constraints for AR interaction and reducing the impact of excessively active AR interactive inputs. Each algorithm includes the declared shape generation logic, interactive parameters, UI input factors, etc., and architects can customise the design algorithm according to their needs on our open platform. Second, the user scans the QR code in AR to release the onsite base data from the previous step; the virtual onsite hologram is immediately aligned with the physical environment in AR for preview. Next, the user activates the AR immersive design UI, built with Fologram's open-source functions on AR devices, and uses hand gestures or screen-based inputs to adjust the parameter sliders in real time. These design and modification inputs are connected to the design algorithm, so users can immediately preview and experience their design as onsite 3D holograms (Fig. 3). Besides that, our framework supports multiple participants collaborating on the same onsite base. Finally, the designed structure is simulated by PhysX for its stability; the user can preview the outcomes as holographic animations to find fragile connections and modify them according to the framework's feedback loop. Once all simulations and modifications are complete, the design is sent for robotic assembly.
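To make the notion of a parametric shape generation algorithm concrete, the following is a minimal sketch of one such algorithm under our own assumptions (it is not taken from the authors' design library): a twisting brick column whose slider-style parameters could be bound to the AR UI.

```python
# Illustrative parametric design algorithm: a twisting brick column driven
# by a few slider-style parameters. Names and defaults are assumptions.
import math

BRICK = (150.0, 50.0, 20.0)  # styrofoam block size in mm (L, W, H)

def twisted_column(layers=30, bricks_per_layer=8, radius=220.0,
                   twist_per_layer=math.radians(4.0)):
    """Return (x, y, z, rotation) placements for each brick, bottom first."""
    placements = []
    for layer in range(layers):
        z = layer * BRICK[2]                     # one brick height per layer
        base = layer * twist_per_layer           # layer-wise twist angle
        for i in range(bricks_per_layer):
            a = base + 2 * math.pi * i / bricks_per_layer
            placements.append((radius * math.cos(a),
                               radius * math.sin(a),
                               z,
                               a + math.pi / 2))  # tangent orientation
    return placements

for x, y, z, rot in twisted_column()[:3]:
    print(f"brick at ({x:.1f}, {y:.1f}, {z:.1f}) mm, rotation {rot:.2f} rad")
```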

Fig. 3. The designer uses the AR immersive design UI to modify the structure and preview it in real time onsite with an AR smart device (iPhone 11) and an AR headset (HoloLens 1)

Phase 1 provides accessible QR codes containing the corresponding 3D onsite environment meshes, as well as the immersive design outcome models and data, for users to access and align with the physical robotic operation base in phase 2.
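The paper does not detail how the QR codes carry this data; as a plausible illustration, the snippet below encodes a hypothetical reference string to the stored phase 1 session using the Python `qrcode` package, so that phase 2 can retrieve the meshes and design by scanning.

```python
# Illustrative only: encode a hypothetical reference to the phase 1 output
# in a QR code. The URI scheme is invented for this sketch.
import qrcode

design_ref = "augmented-bricks://session/column-01"  # hypothetical reference
img = qrcode.make(design_ref)
img.save("phase1_output_qr.png")
```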

Design Phase Findings

In summary, the AR immersive design process fulfilled our pre-determined assumptions: we successfully designed four brick-based columns within the AR immersive design framework. The four columns applied different shape generation algorithms to explore the impact of multiple parameter inputs, such as keyboard, option, and slider inputs, on different design algorithms through the AR UI, as well as to explore the flexibility and friendliness of setting up customised design algorithms in the immersive design process. As a result, the immersive design phase is suitable for various interactive input modes and supports different customised algorithm settings. The onsite design and preview function breaks with the conventional 2D-based design method, providing designers with a 3D–4D immersive perception in AR for more practical design. However, the process still has limitations. For example, design algorithms have to be pre-set in the system. Since the current physical masonry structures use neither interlocking units nor adhesives, they rely on their own weight for structural stability, which significantly limits the diversification and complexity of the design algorithms. Moreover, if design algorithms could be set up and realised in real time through user interaction in AR, the user experience would take a qualitative leap, but this depends on software and equipment capabilities. Finally, the natural onsite environment may not be as simple as the lab-based environment: our system incurs tolerances when facing complex onsite environments and unstable lighting. Therefore, extra sensors will be introduced into our system to improve its accuracy in dealing with complex environmental interference onsite.
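As a side note on the self-weight constraint, a crude static check, far simpler than the PhysX simulation actually used, illustrates why non-interlocking, adhesive-free designs are limited: at every interface, the centre of mass of everything above must project onto the contact patch below. The sketch assumes rigid, equal-mass layers in a single 2D stack; it is our own illustration, not the authors' method.

```python
# Crude 2D static stability check for a stack of rigid, equal-mass layers:
# at each interface, the centre of mass of all layers above must project
# onto the contact patch between the two adjacent layers.
def stack_is_stable(layers):
    """layers: list of (centre_x, half_width) per layer, bottom first."""
    for i in range(len(layers) - 1):
        # contact patch: overlap of layer i's top and layer i+1's bottom
        lo = max(layers[i][0] - layers[i][1], layers[i + 1][0] - layers[i + 1][1])
        hi = min(layers[i][0] + layers[i][1], layers[i + 1][0] + layers[i + 1][1])
        above = layers[i + 1:]
        com_x = sum(cx for cx, _ in above) / len(above)  # equal-mass layers
        if not (lo <= com_x <= hi):
            return False  # COM of the upper part falls off the contact patch
    return True

# Example: five 50 mm wide layers, each shifted 5 mm, remain stable (True).
print(stack_is_stable([(5.0 * i, 25.0) for i in range(5)]))
```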

3.2 Phase 2: The AR-Assisted Assembly

The AR-assisted assembly process is the second phase of the Augmented Bricks verification experiment, comprising physical assembly segmentation and AR-assisted robotic operation. The idea of the AR-assisted assembly method is to provide an easy, safe, and low-threshold way for architects to control industrial robots in the construction process by themselves, without computer science knowledge or coding skills. This should reduce the design-build tolerances that arise when architects are absent from operating and supervising the high-tech, complex digital fabrication process.

Having completed the AR immersive design phase, users upload their design output to our system with the help of the Robots plugin for robotic assembly. First, according to the operation radius of the robotic arm in our lab and the size of the structure, we set up an assembly segmentation range box (500 × 500 × 600 mm), which can be changed for different robot brands and assembly situations. Because some structures exceed the working radius of the robotic arm, the designed structure is divided into several parts according to this range box for the robotic operation.
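The segmentation step can be pictured as a simple vertical partition of brick placements by the range box height. The sketch below is our own illustration: the 600 mm height matches the range box above, but the partition logic is assumed, not the authors' implementation.

```python
# Illustrative sketch of assembly segmentation: group brick placements into
# bottom-up parts whose height fits the 600 mm range box.
RANGE_BOX_HEIGHT = 600.0  # mm, from the 500 x 500 x 600 mm range box

def segment_by_height(placements, box_height=RANGE_BOX_HEIGHT):
    """placements: iterable of (x, y, z, rotation); returns parts bottom-up."""
    parts = {}
    for p in placements:
        parts.setdefault(int(p[2] // box_height), []).append(p)
    return [sorted(parts[k], key=lambda q: q[2]) for k in sorted(parts)]

# Example: a 1.5 m tall stack of 20 mm layers splits into three parts.
stack = [(0.0, 0.0, 20.0 * i, 0.0) for i in range(75)]
print([len(part) for part in segment_by_height(stack)])  # [30, 30, 15]
```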

For the AR-assisted assembly process, our system completes the assembly of each part of the designed structure from the bottom up. First, the user scans the QR code generated in phase 1 with an AR device to locate the virtual holographic world, including the virtual robotic arm, environment meshes, range box, and the part of the pre-designed structure to be assembled, onto the physical robotic operation site. Second, the structure is divided into foam brick elements as targets in the robotic workflow. The user manually points at the pre-designed structure hologram to select the target, either with hand gestures in the headset or by tapping the screen of a smart device. According to the user's interactive selection, the robotic operation trajectory is immediately shown as holographic lines from the foam brick pick location to the target location. The user can then preview the robotic pick and place operation as holographic animations over the entire construction set. We provide an AR-assisted robotic operation UI, in which the user can adjust the robotic setting parameters, such as gripper open or close commands, operation mode, and operation speed, during the holographic simulation. Once the simulation shows the expected results, the user operates the robot by pressing the upload button in the AR UI, starting the automated robotic assembly process (Fig. 4). Moreover, when a special assembly sequence is required, the user can manually select each layer or even each brick as the target in the AR UI. The pick and place simulation and operation are repeated for each brick or layer until the end. Finally, after the separate parts are constructed, the user manually assembles them in sequence according to the AR instructions (Fig. 5).
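The per-brick operation the user triggers from the AR UI follows a standard pick-and-place pattern. The sketch below is a library-agnostic illustration of that command sequence; the `Command` type and the clearance value are our assumptions, not the Robots plugin API.

```python
# Library-agnostic sketch of the pick-and-place command sequence for one
# brick: approach, grip, transfer, release. Offsets are illustrative.
from dataclasses import dataclass

CLEARANCE = 100.0  # mm of vertical approach/retreat clearance (assumed)

@dataclass
class Command:
    kind: str           # "move" or "gripper"
    target: tuple = ()  # (x, y, z) in mm for moves
    action: str = ""    # "open" / "close" for gripper commands

def pick_and_place(pick, place):
    """Return the command sequence for one brick."""
    above = lambda p: (p[0], p[1], p[2] + CLEARANCE)
    return [
        Command("move", above(pick)),          # approach above the pick point
        Command("gripper", action="open"),
        Command("move", pick),                 # descend to the brick
        Command("gripper", action="close"),    # grasp
        Command("move", above(pick)),          # lift clear
        Command("move", above(place)),         # travel to above the target
        Command("move", place),                # descend to the target pose
        Command("gripper", action="open"),     # release
        Command("move", above(place)),         # retreat
    ]
```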

Fig. 4. The designer uses the AR-assisted robotic operation method to select the target hologram and send gripper commands (open or close) through the AR environment, achieving the robotic pick and place operation that assembles the foam brick structure step by step

Fig. 5. The designer manually assembles the two parts in sequence according to the onsite AR instructions in the AR-assisted system. The user needs to align the bottom of Part B with the red holographic guideline to complete the assembly of the whole column

Fabrication Phase Findings

In summary, the AR-assisted assembly process achieved a more accessible and intuitive robotic assembly operation for users, in line with our pre-determined assumptions. We finished assembling the four brick-based columns efficiently and precisely, and even architectural students without robotics training could easily manipulate the process. All commands and processes are pre-developed in our system, meaning users do not need training in Grasshopper plugins or a programming language to control an industrial robot; they only need to manipulate the AR UI to preview the virtual simulation and realise the physical robotic operation, which is safer and more manageable for architects and designers to learn and use. In addition, with this AR-assisted method the robotic assembly order can be chosen manually and paused or repeated at any time, which is more flexible than traditional robotic operation and avoids the unstable connections caused by the lengthy code that traditional methods generate. However, the process still has limitations. We have so far only used a robotic gripper; other end-effectors, such as hot-wire cutting tools and 3D printing tools, could be used in the future, allowing a much wider range of applications. Additionally, the current robotic pick and place targets are based on control points tied to the AR design model, and tolerance issues remain if one relies solely on gestures and interactions to command the robots via AR. More sensors will be applied to our AR-assisted system to improve the recognition ability and the physical-virtual alignment of the target location in the AR environment. Furthermore, the four masonry column outcomes are held together only by their own weight; although they all passed the stability simulation before being built, the structures remain significantly unstable, especially with increasing height or environmental disturbance. Finally, the UI works well, but after several parts of the physical structure are assembled, the holograms and physical bricks overlap, making visual selection difficult. The visualisation of our UI should be further improved, for example by displaying only the selected hologram target and rendering the rest as wireframes or hiding them, which would make manipulation easier for users.

4 Conclusion and Discussion

The Augmented Bricks research developed and verified an immersive design to fabrication framework that operates successfully for the design and assembly of masonry structures with AR and robotic technologies. Our framework optimises the traditional architectural design-to-fabrication process by providing users with an immersive spatial experience and design modification through AR immersive design methods, and by empowering them to control industrial robotic arms to construct complex parametric shapes through AR-assisted assembly methods (Fig. 6).

Fig. 6. Through our AR-assisted system, the entire process from design to assembly of four brick-based columns has been realised as preliminary physical tests

However, there are limitations and space for further improvement. Tolerance issues between the physical-virtual alignment and the robotic grasping position are among the most significant obstacles. Tolerances occurring during the design process can be ignored, because a slight hologram offset does not affect the immersive design, modification, and preview. However, tolerances occurring during the assembly process need to be pre-calculated and incorporated into the assembly process, as they affect the accuracy of the physical objects. We found that feeding bricks manually, one by one, to the robotic gripper reduces tolerances to a certain extent but also reduces the flexibility of the robotic automation. In further research, extra sensors, such as the Xbox Kinect or Azure Kinect, are needed to teach the robot to recognise and grab the bricks precisely, solving the feeding issue. These sensors can also help to scan the onsite design environment and the design base precisely, and improve the recognition ability and physical-virtual alignment of the target location in AR.

Additionally, our PhysX simulation proved successful, as it predicted the collapse of one of the masonry walls we tried to fabricate (Fig. 7); the collapsed structure is almost identical to the simulation model. Further research could also investigate the development of interlocking brick joints, as well as the use of adhesives such as mortar and glue, to enhance the stability of the structures and overcome these limitations. With the help of interlocking joints or brick adhesives, structures would no longer be constrained by self-weight stability, and more complex immersive design algorithms and more flexible interactive inputs would stimulate the creativity of architects. Moreover, the experiment needs to be repeated with real bricks in the future, as they may behave differently physically. We aim to optimise our ‘design to fabrication’ framework so that it can be applied to the design and construction of real architectural components.

Fig. 7. The simulation of the wall design (left) and its physical performance during the robotic assembly process (right). The structure contains no interlocking joints or mortar between the bricks to hold it stable.

Finally, we also aim to repeat the experiment by mounting the robotic arm on the MiA mobile robotic platform, which would liberate the fabrication process from the spatial limitations of the lab environment. The final goal is to realise the onsite AR immersive design to fabrication framework in architectural-scale applications. The Augmented Bricks framework will bridge the gap between architectural design and high-tech construction techniques and place parametric design and high-tech manufacturing back into the hands of architects with the help of AR.