
Collaborative Robots Streamline Curtain Wall Installation

A recent article in Biomimetic Intelligence and Robotics presents a human-robot collaboration system built on the "human-centered with machine support" concept. The system is designed to assist in handling large, heavy objects, such as building curtain walls, by intuitively adapting to human intentions.

Study: Human-robot collaborative handling of curtain walls using dynamic motion primitives and real-time human intention recognition. Image Credit: Holmes Su/Shutterstock.com

Background

Construction sites are complex environments that often require multiple workers to manually assemble building curtain walls. Robots provide a promising solution and are increasingly replacing manual methods. However, traditional robotic systems rely on pre-programmed guidance, which becomes inefficient in complex and unpredictable scenarios.

Dynamic motion primitives (DMP) theory equips robots with flexible trajectory learning and generalization capabilities. However, most current approaches focus on optimizing the robot's end-effector position and often overlook speed optimization during motion. In addition, effective handling requires understanding the operator's real-time intentions, which shift with environmental and task conditions.

Achieving efficient human-robot collaboration in these tasks depends on the robot's ability to interpret the operator’s motion intentions through sensor data. This study introduces a human-robot collaborative curtain wall handling system, leveraging a skill-learning approach to enhance both flexibility and real-time responsiveness.

Methods

This study focused on developing a human-robot collaborative handling system for building facade operations, specifically targeting the assembly of curtain walls. The system was designed to perform facade handling tasks using real-time trajectory planning based on human intentions. It comprised three primary modules: intent understanding, motion trajectory planning, and execution.

The intent understanding module gathered real-time force data from the operator's grip on the curtain wall using a six-axis force sensor (M4313M4B model from Sunrise Instruments). It also tracked the motion of the UR5 robot's end-effector to estimate human intentions, namely whether to accelerate, decelerate, or maintain the current speed. A Kalman filtering algorithm was applied to the sensor data to smooth the force curves and improve accuracy.
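
To make the filtering and intention-estimation step concrete, the sketch below shows one way such a pipeline could look in Python. It is not the authors' code: it applies a scalar Kalman filter to a single force channel and maps the smoothed guiding force to an accelerate/decelerate/maintain label using an assumed dead-band threshold.

```python
import numpy as np

def kalman_smooth(raw_force, q=1e-3, r=1e-2):
    """Scalar Kalman filter for one force channel (process noise q, measurement noise r)."""
    x, p = raw_force[0], 1.0          # state estimate and its variance
    smoothed = []
    for z in raw_force:
        p += q                        # predict: variance grows by process noise
        k = p / (p + r)               # Kalman gain
        x += k * (z - x)              # correct with measurement z
        p *= (1.0 - k)
        smoothed.append(x)
    return np.array(smoothed)

def classify_intention(force, dead_band=2.0):
    """Map a smoothed guiding force (N) to an intention label; the threshold is an assumption."""
    if force > dead_band:
        return "accelerate"
    if force < -dead_band:
        return "decelerate"
    return "maintain"

# Example: a noisy 5 N push along the handling direction
raw = 5.0 + np.random.normal(0.0, 1.0, 200)
print(classify_intention(kalman_smooth(raw)[-1]))   # expected: "accelerate"
```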

The motion trajectory planning module generated robot motion trajectories at varying speeds by utilizing trajectory learning and generalization models. These models allowed the robot to adapt its movements in response to the human operator's real-time input.

Finally, the execution module integrated human intentions with the learned motion trajectories to dynamically adjust the robot’s path, enabling seamless cooperation during the curtain wall assembly process.
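
A simple way to picture this integration is a speed-scaling loop in which the recognized intention adjusts how quickly the planned trajectory is played back. The fragment below is only an illustrative sketch with assumed gains and limits, not the controller described in the paper.

```python
def update_speed(intention, speed, dt, accel=0.2, v_min=0.05, v_max=0.5):
    """Adjust trajectory playback speed (m/s) from the recognized intention.
    The gain and limits here are illustrative assumptions."""
    if intention == "accelerate":
        speed = min(speed + accel * dt, v_max)
    elif intention == "decelerate":
        speed = max(speed - accel * dt, v_min)
    return speed                      # "maintain" leaves the speed unchanged

# Example: one second of 10 ms control cycles while the operator pushes forward
speed = 0.1
for _ in range(100):
    speed = update_speed("accelerate", speed, dt=0.01)
print(round(speed, 3))                # 0.3: the speed rose by accel * 1 s
```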

The robot’s skill-learning framework combined trajectory learning and generalization through DMP, with human intention recognition playing a key role. To accommodate various trajectory shapes, a nonlinear control term incorporating time was added to the robot’s trajectory learning model. The system's functionality was demonstrated experimentally using a platform equipped with an intent acquisition module and a human-machine cooperation control module. The effectiveness of the robot’s trajectory learning and generalization was evaluated by testing its ability to follow a given target path on the platform.
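
For readers unfamiliar with DMPs, the sketch below illustrates the general idea behind trajectory learning and generalization: a demonstrated motion is encoded by fitting the weights of a radial-basis forcing term, and the learned shape is then reproduced between new start and goal points. This is a generic, simplified one-dimensional DMP, not the authors' implementation, and it omits the time-dependent nonlinear term added in the paper.

```python
import numpy as np

class DMP1D:
    """Minimal discrete dynamic motion primitive for a single coordinate."""
    def __init__(self, n_basis=30, alpha=25.0, beta=6.25, alpha_x=3.0):
        self.n, self.alpha, self.beta, self.alpha_x = n_basis, alpha, beta, alpha_x
        self.c = np.exp(-alpha_x * np.linspace(0, 1, n_basis))   # basis centres in phase space
        self.h = n_basis ** 1.5 / self.c / alpha_x               # heuristic basis widths
        self.w = np.zeros(n_basis)

    def fit(self, y_demo, dt):
        """Learn forcing-term weights from one demonstrated trajectory (nonzero displacement assumed)."""
        T = len(y_demo)
        self.tau = T * dt
        yd = np.gradient(y_demo, dt)
        ydd = np.gradient(yd, dt)
        g = y_demo[-1]
        x = np.exp(-self.alpha_x * np.arange(T) * dt / self.tau)  # canonical phase variable
        f_target = (self.tau ** 2 * ydd
                    - self.alpha * (self.beta * (g - y_demo) - self.tau * yd)) / (g - y_demo[0])
        for i in range(self.n):                                   # per-basis weighted regression
            psi = np.exp(-self.h[i] * (x - self.c[i]) ** 2)
            self.w[i] = np.sum(psi * x * f_target) / (np.sum(psi * x * x) + 1e-10)

    def rollout(self, y0, g, dt, steps):
        """Generalize the learned motion to a new start y0 and goal g."""
        y, yd, x, traj = y0, 0.0, 1.0, []
        for _ in range(steps):
            psi = np.exp(-self.h * (x - self.c) ** 2)
            f = (psi @ self.w) * x / (psi.sum() + 1e-10) * (g - y0)  # scaled forcing term
            ydd = (self.alpha * (self.beta * (g - y) - self.tau * yd) + f) / self.tau ** 2
            yd += ydd * dt
            y += yd * dt
            x += -self.alpha_x * x / self.tau * dt                # phase decays toward zero
            traj.append(y)
        return np.array(traj)

# Learn a smooth 0 -> 1 demonstration, then reproduce it between new endpoints
t = np.linspace(0, 1, 200)
demo = 10 * t ** 3 - 15 * t ** 4 + 6 * t ** 5
dmp = DMP1D()
dmp.fit(demo, dt=1 / 200)
generalized = dmp.rollout(y0=0.2, g=0.8, dt=1 / 200, steps=200)
print(round(generalized[-1], 3))      # ends close to the new goal of 0.8
```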

Results and Discussion

During the collaborative handling process, the robot successfully gathered data from the operator via the force sensor, interpreting human intentions in real time. The robot then communicated with a computer through a serial port, executing motion instructions based on the planned trajectories, effectively integrating human intentions into the handling tasks.
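
As an illustration of this kind of command loop, the snippet below sends one planned motion step over a serial link using the pyserial package. The port name and the message format are purely hypothetical, since the article does not describe the actual protocol.

```python
import serial  # pyserial package

# Hypothetical port and message format; the real protocol is not described in the article.
with serial.Serial("/dev/ttyUSB0", baudrate=115200, timeout=0.1) as port:
    x, y, z, speed = 0.512, -0.130, 0.420, 0.15              # next waypoint (m) and speed (m/s)
    port.write(f"MOVE {x:.4f} {y:.4f} {z:.4f} {speed:.3f}\n".encode("ascii"))
    reply = port.readline().decode("ascii", errors="ignore").strip()
    print("controller acknowledged:", reply)
```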

The experimental results demonstrated the effectiveness of the trajectory learning model, with the robot accurately learning and reproducing the teaching trajectories: the trajectory error was kept below 0.278 %, indicating that the learned motions closely matched the teaching trajectories. Furthermore, when tested across different starting and target points, the errors in the robot's generated motion trajectories were under 0.07 %, showcasing the precision of the trajectory learning and generalization model.

In experiments involving linear motion within the human-machine cooperative space, the robot's motion trajectories showed an initial acceleration phase followed by deceleration along the X, Y, and Z axes. The errors in these trajectories were measured at 0.03377, 0.02377, and 0.2250, respectively. These results highlight the robot’s ability to accurately recognize and respond to the operator's intentions regarding acceleration and deceleration during motion.

A transport experiment involving multidirectional and curved movement further verified the system's ability to understand operator intentions. The average trajectory error in this scenario was less than 0.00055 m, and the robot achieved 100 % accuracy in intention recognition. This confirmed the robot's capability to effectively collaborate with the operator in handling the curtain wall.

Moreover, while traditional curtain wall handling typically requires at least three workers, the proposed human-robot collaboration system reduced this need to just one operator guiding the robot's actions. As a result, handling efficiency improved by over 60 %, highlighting the potential of this system to significantly enhance productivity and reduce labor demands in construction settings.

Conclusion

Overall, the researchers successfully designed a collaborative handling system for curtain walls by leveraging human intention understanding to enhance robot handling skills. In this “human-centered with machine support” design, the robot served as the main load-bearer, guided by the operator to handle curtain walls.

The human-robot integration ensured a smooth, flexible, and labor-saving handling process, enhancing accuracy and safety in curtain wall assembly tasks. The researchers suggest optimizing the handling process to further enhance efficiency and improve the flexibility of robots in such collaborative handling scenarios.

Journal Reference

Li, F., Sun, H., Liu, E., & Du, F. (2024). Human-robot collaborative handling of curtain walls using dynamic motion primitives and real-time human intention recognition. Biomimetic Intelligence and Robotics, 100183. DOI: 10.1016/j.birob.2024.100183, https://www.sciencedirect.com/science/article/pii/S266737972400041X



Written by

Nidhi Dhull

Nidhi Dhull is a freelance scientific writer, editor, and reviewer with a PhD in Physics. Nidhi has extensive research experience in materials science, focused mainly on biosensing applications of thin films. During her PhD, she developed a noninvasive immunosensor for the cortisol hormone and a paper-based biosensor for E. coli bacteria. Her work has been published in reputable journals from publishers such as Elsevier and Taylor & Francis. She has also contributed significantly to several pending patents.

