Virtual Reality Based Teleoperation of Reusable Rocket Refueling Robot System Utilizing WebXR and WebRTC Protocols

Myeung Un Kim, Yeosang Yoon, Joonhwa Choi and Jangjun Park

Abstract: This paper introduces a pioneering virtual reality (VR) based teleoperation system for robotic fueling of reusable rockets. It addresses the post-landing phase challenges, where residual propellants such as kerosene, methane, and hydrogen present explosion risks, making human proximity hazardous. The proposed system uses a digital twin interface powered by the WebXR API, providing a realistic, high-fidelity view of the robot and worksite. Peer-to-peer connections established via the Web Real-Time Communication (WebRTC) protocol enable real-time control and feedback. Because reusable rockets frequently land on maritime platforms affected by waves and wind, our system includes a VR control user interface (UI) that assists in precise control despite such instability. Experiments use components from the Korea Space Launch Vehicle-II, specifically the fuel filler neck and fuel line inlet, demonstrating the system’s practical application and its potential to enhance the safety and efficiency of space exploration operations.

Keywords: Virtual reality, teleoperation, reusable rocket refueling robot system, remote control

Ⅰ. Introduction

The advent of reusable rockets has marked a new era in space exploration, significantly reducing the cost and increasing the frequency of space missions. However, the post-landing phase of these rockets poses unique challenges that must be addressed to ensure their readiness for subsequent launches. Among these challenges, the safe expulsion of residual propellants and the refueling process are of paramount importance. The presence of combustible materials such as kerosene, methane, and hydrogen in the propellants introduces risks of explosions and hazardous gas emissions, necessitating stringent safety measures that limit human involvement in these operations[1]. In this context, the deployment of a robot remote control system emerges as a critical solution, enabling safe and efficient handling of these tasks without direct human exposure to the dangers inherent in the process[2].

The necessity of such innovations was starkly highlighted by an incident involving SpaceX’s Starship prototype, which exploded during an experiment aimed at testing the integrity of the propulsion system under extreme conditions. The experiment involved filling the prototype with sub-cooled liquid nitrogen and pressurizing it to its limit. The aftermath of such a high-stakes experiment required a meticulous inspection to assess damage and gather valuable data for future tests. To navigate the hazardous environment laden with thick nitrogen clouds and debris, SpaceX deployed a robotic dog named ‘Zeus’, which patrolled the test site, demonstrating the innovative use of robotic assistance in scenarios too dangerous for human operators[3].

This paper advances a novel approach to overcoming the challenges associated with reusable rocket refueling through a virtual reality (VR) based teleoperation system. This system redefines the interaction and control mechanisms between operators and the robot for post-landing operations of reusable rockets. By providing operators with a realistic virtual representation of the worksite, our solution enables precise and intuitive control from a distance, significantly enhancing operational safety and efficiency.
Leveraging the WebXR API[4] and the Web Real-Time Communication (WebRTC) protocol[5], our approach ensures seamless synchronization between the robot and its VR-based digital twin, facilitating smooth and responsive control that accurately reflects real-time movements and adjustments. Moreover, recognizing the unique challenges presented by the maritime platforms where reusable rockets often land, subjected to the unpredictable movements of waves and wind, we have designed a specialized VR control user interface (UI). This interface effectively minimizes the impact of environmental factors, allowing operators to maintain precise control over the robot despite the inherent instability of these platforms. The adaptability of our system to such demanding environments highlights its potential to significantly impact robotic teleoperation in space exploration and related fields.

Ⅱ. Related Works

2.1 VR-Based Robot Teleoperation

The intersection of VR and robot teleoperation represents a paradigm shift in how humans interact with and control machines from a distance. An early work[6] introduces a VR-based telerobotic system designed for compliance tasks, combining visual and haptic feedback with a local intelligence controller. This study is one of the foundational efforts in integrating VR into robotic control, demonstrating that VR can significantly improve the operator’s sense of presence and control efficacy. Since then, the scope of VR-based robot teleoperation has broadened to include various applications, such as deploying robots in environments too hazardous for humans and enhancing the operator’s spatial awareness, accuracy, and overall control experience through immersive VR capabilities. Examples include a VR-based human-robot interaction interface designed for underwater exploration[7], a VR-based rescue detection robot system in an underground coal mine[8], and a VR-based remotely controlled humanoid robot on the International Space Station (ISS)[9].

While these previous studies introduced innovative approaches, including haptic control interfaces, three-dimensional graphics, and VR, they also exhibited certain limitations. For instance, the studies by Kuan et al.[6] and Cruz et al.[7] focused exclusively on control interfaces and robot pose simulations, lacking the ability to visually perceive dynamically changing environments. Although Zhang et al.[8] utilized 3D cameras to gather real-time environmental data, this information was conveyed to the user only indirectly, through a simplified environmental model. Similarly, Goza et al.[9] implemented motion synchronization to control the robot and provided real-time visual feedback via stereo cameras; however, this approach was limited to the camera’s point of view and relied on cumbersome wired equipment for robot control. In contrast, our system facilitates intuitive robot control through motion synchronization while providing a free viewpoint by rendering VR from point cloud data (PCD).

2.2 Real-Time Streaming Using the WebRTC Protocol

The WebRTC protocol has emerged as a critical technology for enabling real-time, peer-to-peer (P2P) communication directly in web browsers, providing standard APIs and built-in real-time audio and video capabilities along with codecs, all without the need for plugins[10]. This integration allows for seamless media and data streaming between browsers, enhancing user experiences by facilitating direct and efficient communication channels.
Recent implementations of WebRTC have showcased its versatility and efficiency, with low-latency streaming of high-quality video feeds in applications such as video conferencing[11], online multiplayer gaming[12], telemedicine[13], and surgical simulation[14]. In this paper, we explore the use of WebRTC to enable a single user to remotely control a robot. Furthermore, as proposed in [15], it is feasible to extend this application to allow multiple users to remotely control a single robot by employing a priority system to manage parallel tasks, leveraging WebRTC’s robust framework for efficient, real-time communication.

2.3 Robotic Refueling System

The advent of robotic refueling systems has significantly advanced vehicle fueling. Focused on car fueling, Autofuel’s robotic refueling system demonstrates the potential to phase out manual fueling: the system uses AI to detect a vehicle’s position and directs a robotic arm to refuel it autonomously[16]. Furthermore, a robotic refueling system for unmanned surface vehicles (USVs) has been developed, showing potential for autonomous shore-to-ship and ship-to-USV refueling. This system can be manually controlled by sailors onboard or operated autonomously by tracking the target refueling receptacle[17]. In this paper, we focus on human-operated robotic refueling of reusable rockets. In contrast to car fueling, the refueling of reusable rockets landing on maritime platforms presents unique challenges, such as variable positions and the effects of waves. These dynamic conditions make the implementation of autonomous robotic fueling systems more complex.

Ⅲ. VR-Based Teleoperation of Reusable Rocket Refueling Robot System

We propose a system that allows a user to remotely control a rocket fueling robot through a VR-based digital twin display. Figure 1 shows the overall configuration of the proposed robot system. We assume the rocket fueling robot is located near the reusable rocket and the user is at a distance from the robot. The scenario assumes the robot is not within the user’s line of sight and that the user is at a sufficiently safe distance in case of a rocket explosion. The rocket fueling robot consists of a robotic arm, gripper, 3D camera, and PC. The user wears VR equipment, which includes a head-mounted display (HMD) equipped with a PC, and controllers. A server system facilitates communication between the robot and the VR equipment. The robot’s pose information and the corresponding control commands are exchanged via the WebSocket protocol, while the RGB-D data collected and preprocessed from the 3D camera is transmitted to the VR equipment via the WebRTC protocol. The server system receives the robot’s pose information from the robot and uses the WebXR API to generate a VR scene that is transmitted to the VR equipment. Additionally, RGB-D data collected from the 3D camera is preprocessed before being sent to the VR equipment. Within the VR scene, the PCD rendered from the preprocessed RGB-D data is aligned to create 3D video streaming. The user views the digital twin 3D video streaming of the robot and surrounding worksite through the HMD and generates gesture-based commands for robot control using the controllers. The control target values generated by these gestures are transmitted to the robot, directing its movements. This entire process repeats continuously.
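To make the split between the two transport paths concrete, the following sketch outlines a hypothetical TypeScript message schema for this architecture: pose updates and control targets ride on WebSocket, while RGB travels as a WebRTC video track and depth as data-channel payloads. All type and field names are illustrative assumptions, not a wire format defined in the paper.

```typescript
// Hypothetical message schema for the two transport paths described above.
// All names and fields are illustrative; the paper does not define a wire format.

// Robot -> VR equipment over WebSocket: current arm and gripper state.
interface PoseUpdate {
  kind: "pose";
  jointAnglesRad: [number, number, number, number, number, number]; // 6-DoF arm
  gripperOpening: number; // 0 (closed) to 1 (fully open)
  timestampMs: number;
}

// VR equipment -> robot over WebSocket: control target generated from gestures.
interface ControlTarget {
  kind: "control";
  position: [number, number, number];            // end-effector target position
  orientation: [number, number, number, number]; // quaternion (x, y, z, w)
  gripperOpening: number;
  timestampMs: number;
}

// RGB frames travel as a compressed WebRTC video track (MediaStream API);
// depth frames travel as binary messages on an RTCDataChannel.
type DepthFrame = ArrayBuffer; // bit-sampled depth, 8 or 12 bits per pixel
```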
The goal of the proposed robot system is three-fold:

· The RGB-D data of the worksite captured by the 3D camera, combined with the robot’s pose information, should be used to render the robot and worksite on the HMD screen as 3D graphics identical to reality, so that the user feels as though they are actually on-site, experiencing the environment first-hand.

· For precise control of the robot, the 3D-graphics-based digital twin synchronized with the worksite must be implemented in real time. To this end, information related to video streaming and robot control must be processed and transmitted quickly.

· To ensure smooth control of the robot on a reusable rocket landing platform subject to significant movement from sea waves and wind, a VR control UI must be developed that minimizes the impact of platform movement and maximizes user convenience.

Each component of the system is described in detail in the following subsections.

3.1 Hardware Configuration

This paper primarily aims to examine whether a robot, remotely controlled via a VR-based interface by an operator situated at a distance, can successfully inject fuel into a reusable rocket. Focusing on the act of fuel injection, this study does not use actual reusable rockets or real fuel pipelines. Instead, it employs specific parts of the rocket and pipeline system, namely the fuel filler neck and the fuel line inlet, which are integral to the Korea Space Launch Vehicle-II’s umbilical system. Figure 2 displays the fuel filler neck and the fuel line inlet utilized in this study. Upon insertion of the fuel line inlet into the fuel filler neck, an internal O-ring ensures a secure seal.

Fig. 2. Fuel filler neck and fuel line inlet used in our study.

The setup of the robot and the surrounding worksite used in our experiment is depicted in Figure 3. The fuel filler neck is adjustable in position with 6 degrees of freedom (DoF), as indicated by the yellow arrows. The fuel line inlet is placed on a table. Through the user’s remote control, the robot performs the task of picking up the fuel line inlet and inserting it into, and removing it from, the fuel filler neck. The robotic arm used is Universal Robots’ UR5e, which has 6 DoF, a reach of 33.5 inches, and a payload capacity of up to 5 kg. The gripper is ROBOTIS’ RH-P12-RN, which is compatible with the UR5e. The 3D camera used to capture the robot and the surrounding worksite is Stereolabs’ ZED 2i, which is positionally adjustable with 6 DoF, as denoted by the green dashed arrows.

The VR equipment used in this system must meet the following criteria:

· It requires a 6 DoF controller suitable for controlling a 6 DoF robotic arm.

· To implement a VR scene using the WebXR API and ensure seamless video streaming via the WebRTC protocol, the equipment must be capable of internet access through a web browser.

In this experiment, the Meta Quest Pro is used, fulfilling both of these conditions.

3.2 Server System

To establish a connection between the robot and the VR equipment, the server system first hosts a VR web page. A user accesses this web page through the VR equipment and requests a connection with the robot, which includes the 3D camera. Once signaling, adhering to the WebRTC protocol, is complete, the VR equipment and the robot establish a connection.
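The signaling step just described can be sketched as follows on the VR side. This is a minimal illustration assuming a hypothetical JSON message format and placeholder server URLs and credentials; the paper does not specify these details.

```typescript
// Minimal sketch of VR-side WebRTC signaling over WebSocket.
// The JSON format ({ type, sdp, candidate }) and all URLs are assumptions.
declare function handleDepthFrame(buf: ArrayBuffer): void; // defined elsewhere (Section 3.3)

const signaling = new WebSocket("wss://server.example/signaling"); // placeholder URL

const pc = new RTCPeerConnection({
  iceServers: [
    // TURN relay used when a symmetric NAT blocks direct P2P (Section 3.2).
    { urls: "turn:turn.example:3478", username: "user", credential: "pass" }, // placeholders
  ],
});

// Forward local ICE candidates to the robot through the signaling server.
pc.onicecandidate = (e) => {
  if (e.candidate) signaling.send(JSON.stringify({ type: "ice", candidate: e.candidate }));
};

signaling.onmessage = async (msg) => {
  const data = JSON.parse(msg.data);
  if (data.type === "offer") {
    // The robot side created the offer; answer it.
    await pc.setRemoteDescription({ type: "offer", sdp: data.sdp });
    const answer = await pc.createAnswer();
    await pc.setLocalDescription(answer);
    signaling.send(JSON.stringify({ type: "answer", sdp: answer.sdp }));
  } else if (data.type === "ice") {
    await pc.addIceCandidate(data.candidate);
  }
};

// Streams set up by the robot side: RGB as a video track, depth as a data channel.
pc.ontrack = (e) => { /* attach e.streams[0] to a <video> element or WebGL texture */ };
pc.ondatachannel = (e) => { e.channel.onmessage = (m) => handleDepthFrame(m.data); };
```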
Leveraging the WebRTC protocol, this setup allows the VR equipment to receive preprocessed RGB-D data via P2P communication, facilitating the seamless streaming of high-quality video with minimal latency. It also enables the exchange of robot pose information and control commands, ensuring precise and responsive control. The server system consists of a WebSocket server[18], a TURN server (Traversal Using Relays around NAT, where NAT stands for Network Address Translation)[19], and a VR web hosting server.

1) The WebSocket server functions both as the signaling server for the WebRTC protocol and as the WebSocket server for transmitting robot control commands. WebSocket offers a reliable, low-latency communication approach, ideal for sending small, frequent control commands and ensuring precise synchronization between user inputs and the robot’s actions.

2) The TURN server serves as a relay in WebRTC communications, enabling data transmission when direct P2P communication is obstructed by symmetric NAT environments.

3) By visiting the server-hosted web page, the user initiates a VR session implemented with the WebXR API. This step is essential for activating the overall remote control robot system. The VR web hosting server is responsible for hosting this web page.

3.3 3D Video Streaming

For a user to remotely control a robot and observe its surroundings in real time, seamless 3D video streaming is essential. This requires (1) the capability to transmit RGB-D data captured by a 3D camera installed at the worksite to the VR equipment, and (2) the ability to visualize the worksite in the HMD so that the user can view it with a realistic sensation. The sequence in which these functions are processed is illustrated in Figure 4.

Fig. 4. Process of 3D video streaming. The left part illustrates the process of transmitting RGB-D data captured by a 3D camera to VR equipment, while the right part depicts the process of visualizing the worksite for immersive viewing by the user.

In this robot system, an RGB-D data stream, which includes both RGB and depth data, is captured by the 3D camera. The large raw RGB-D data undergoes preprocessing, wherein the RGB data is compressed into JPG format to reduce its size and the depth data is bit-sampled to either 8-bit or 12-bit depth. The preprocessed data is then transmitted from the robot’s PC to the VR equipment. Communication between the robot’s PC and the VR equipment occurs via P2P, facilitated by the WebRTC protocol: the RGB data is sent using the WebRTC MediaStream API and the depth data using the RTCDataChannel API. The VR equipment uses the received RGB and depth data, along with camera parameters, to render spatial video as color-informed 3D PCD. The camera parameters include intrinsic parameters such as focal length, principal point, and distortion coefficients, as well as extrinsic parameters that describe the camera’s position and orientation. This spatial video then undergoes a registration process with the VR scene created using the WebXR API. Once registration is complete, it is visualized in the HMD, enabling the user to experience realistic 3D video streaming. To enhance spatial awareness and control, a pre-rendered virtual robot model graphic is integrated within the VR scene, synchronized in real time with the actual robotic arm’s pose. This approach addresses the limitations of 3D video streaming, which often captures only partial views, by providing a comprehensive view of the robot from various perspectives and helping prevent issues such as self-collision or reaching a singularity. Figure 5 illustrates the components that make up the VR scene, displaying an example of the spatial video (a) and the virtual robot model graphic (b). Part (c) shows the outcome of registering parts (a) and (b), which is the comprehensive VR scene viewable through the HMD.
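The receiver-side step of turning a decoded RGB frame and a bit-sampled depth frame into color-informed PCD can be sketched as a standard pinhole-camera unprojection. The sketch below shows the 8-bit case; the near/far dequantization range and all names are our assumptions for illustration, as the paper does not state the quantization parameters.

```typescript
// Receiver-side sketch: rebuild a colored point cloud from one decoded RGB
// frame and one 8-bit-sampled depth frame using pinhole intrinsics.
interface Intrinsics { fx: number; fy: number; cx: number; cy: number; width: number; height: number; }

function depthToPointCloud(
  depth8: Uint8Array,      // 8-bit-sampled depth, row-major
  rgba: Uint8Array,        // decoded RGB frame, 4 bytes per pixel (RGBA)
  K: Intrinsics,
  nearM = 0.3, farM = 5.0, // assumed dequantization range in meters
): Float32Array {
  const out = new Float32Array(K.width * K.height * 6); // x, y, z, r, g, b per point
  let n = 0;
  for (let v = 0; v < K.height; v++) {
    for (let u = 0; u < K.width; u++) {
      const i = v * K.width + u;
      if (depth8[i] === 0) continue;                        // 0 = invalid / no return
      const z = nearM + (depth8[i] / 255) * (farM - nearM); // dequantize to meters
      out[n++] = ((u - K.cx) * z) / K.fx;                   // X (camera frame)
      out[n++] = ((v - K.cy) * z) / K.fy;                   // Y
      out[n++] = z;                                         // Z
      out[n++] = rgba[4 * i] / 255;                         // R
      out[n++] = rgba[4 * i + 1] / 255;                     // G
      out[n++] = rgba[4 * i + 2] / 255;                     // B
    }
  }
  return out.subarray(0, n);
}
```

The resulting buffer can be uploaded directly as a point-primitive vertex attribute; applying the camera extrinsics then registers the cloud with the WebXR scene, as described above.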
3.4 Robot Control

As described in the previous section, seamless 3D video streaming allows a user, distant from the robot, to control the robotic arm and gripper with a 6 DoF VR controller simply by viewing the video streaming screen. The robot continuously transmits pose information, such as the joint angles of the robotic arm and the degree of opening and closing of the gripper, to the VR equipment. This information ensures that the virtual robot displayed on the screen is precisely synchronized with the actual robot. By manipulating the VR controllers, sticks, and buttons, the user can adjust the postures of the robotic arm and gripper. Control target values, generated by the user’s gestures, are transmitted to the robot every frame via the server system, producing movement of the robotic arm and gripper. This process repeats, enabling real-time remote control of the robot.

3.5 VR Control UI for Reusable Rocket Refueling Robot System

Reusable rockets typically land on maritime platforms, similar to SpaceX’s drone ships, to minimize the risk of casualties during landing mishaps, as these platforms are usually far from populated areas. The ocean-based location of these platforms offers the flexibility to adjust their position more easily than on land, aligning with the rocket’s trajectory and mission objectives. However, the movement caused by sea waves and wind in maritime environments presents a significant challenge for remote robot control, as these conditions are inherently unpredictable and cause constant instability. Unlike static environments, where conditions are stable and predictable, maritime environments demand a more adaptable control solution. To address this, we developed a VR control UI, depicted in Figure 6, that enables users to make real-time adjustments to the robotic arm’s translational and rotational movements with greater ease and precision through simple button operations on the VR display. This adaptability is crucial for ensuring accurate and reliable operation even under dynamic maritime conditions.

In Figure 6, the first row of X, Y, Z buttons allows the user to toggle the robotic arm’s end-effector translational movement on or off for each axis. The second row controls the rotation of the end-effector about each axis. The third row permits the user to align the end-effector’s coordinate axes with the reference coordinate system for precise control. Control target values are adjusted based on the button settings before being transmitted to the robot, facilitating accurate operation despite platform movement.
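These axis toggles can be understood as a per-axis gate applied to the end-effector delta computed from controller motion before it is sent as a control target. The sketch below illustrates the idea; all names are ours, not taken from the paper, and rotation is shown as Euler angles purely for clarity.

```typescript
// Sketch of the axis-gating idea behind the UI in Figure 6: per-axis toggles
// zero out unwanted translation/rotation components of the controller delta.
interface AxisMask {
  translate: { x: boolean; y: boolean; z: boolean }; // first button row
  rotate: { x: boolean; y: boolean; z: boolean };    // second button row
}

interface EndEffectorDelta {
  dPos: [number, number, number];      // translation from controller motion
  dRotEuler: [number, number, number]; // rotation, as XYZ Euler for clarity
}

function applyAxisMask(d: EndEffectorDelta, m: AxisMask): EndEffectorDelta {
  return {
    dPos: [
      m.translate.x ? d.dPos[0] : 0,
      m.translate.y ? d.dPos[1] : 0,
      m.translate.z ? d.dPos[2] : 0,
    ],
    dRotEuler: [
      m.rotate.x ? d.dRotEuler[0] : 0,
      m.rotate.y ? d.dRotEuler[1] : 0,
      m.rotate.z ? d.dRotEuler[2] : 0,
    ],
  };
}

// Example: lock everything except insertion-axis translation while docking
// the fuel line inlet on a swaying platform.
const insertionOnly: AxisMask = {
  translate: { x: false, y: false, z: true },
  rotate: { x: false, y: false, z: false },
};
```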
3.6 Real-time Experiments

As shown in Figure 7, we conducted an experiment on VR-based remote control of a robot system designed for refueling reusable rockets, using the hardware configuration described in Section 3.1. In Figure 7, (a) depicts the VR scene viewed by the user, (b) illustrates the user, located at a distance, controlling the robot while wearing VR equipment, and (c) shows the actual robot. In this experiment, the robot performs the task of inserting the fuel line inlet into, and removing it from, the fuel filler neck. The VR scene is synchronized with the robot, enabling the user to control the robot in real time. Regarding latency, which we define as the time it takes for video from the robot’s camera to appear on the user’s VR screen, our system achieved an average latency of about 53 ms when both the robot-side PC and the user-side PC were connected to the same WLAN (Wireless Local Area Network). Additionally, when tested between two locations roughly 200 km apart, the latency remained within an average of 300 ms. These results confirm that our system can perform seamlessly under these conditions. A video of the user controlling the robot can be viewed at the following link: https://youtu.be/OoeqIhOaBm0
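For readers wishing to reproduce such measurements, one simple approximation is a round-trip probe over a dedicated WebRTC data channel, taking one-way network delay as roughly half the round-trip time (RTT), on top of which capture, encode, and render time would be added. The sketch below is an illustration under those assumptions, not necessarily the measurement method used in our experiments.

```typescript
// Estimate average RTT over a dedicated data channel; the robot side is
// assumed to echo "pong" for every "ping". One-way delay is roughly RTT / 2.
function probeRtt(channel: RTCDataChannel, samples = 50): Promise<number> {
  return new Promise((resolve) => {
    const rtts: number[] = [];
    let sentAt = 0;
    channel.onmessage = (e) => {
      if (e.data !== "pong") return;
      rtts.push(performance.now() - sentAt);
      if (rtts.length >= samples) {
        resolve(rtts.reduce((a, b) => a + b, 0) / rtts.length); // mean RTT in ms
      } else {
        sentAt = performance.now();
        channel.send("ping");
      }
    };
    sentAt = performance.now();
    channel.send("ping");
  });
}
```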
Ⅳ. Future Works

In this paper, we propose a robotic system designed for refueling reusable rockets. However, several additional considerations must be addressed before its deployment in real-world scenarios. First, our experiments used only a robot arm fixed to a table; there is a clear need to add mobility capabilities to the robot, such as wheels or legs. Second, while the current setup includes components such as the fuel line inlet and the fuel filler neck, full deployment would require integration with an actual rocket and fuel pipeline. Furthermore, it is necessary to scale up the robot’s size and its capacity to manage the weight of an actual fuel pipeline. Finally, with the robot positioned near the reusable rocket and the user stationed at a safe distance to mitigate the risk of explosion, all remote control information is transmitted wirelessly, introducing significant security concerns that must be addressed to ensure reliable operation. While there is considerable room for development and research in this domain, this paper nonetheless represents a pivotal first step toward establishing a robotic system for fueling reusable rockets, underscoring its potential impact on the field.

Ⅴ. Conclusions

This paper introduces a pioneering approach to teleoperating a reusable rocket refueling robot system with a VR-based digital twin display, substantially advancing operational safety and efficiency. By integrating advanced technologies such as the WebRTC protocol and the WebXR API, we have developed a system that enables precise remote control of a robot with immediate feedback and an immersive 3D display. This immersive environment enhances the user’s ability to perform complex tasks by providing a rich, detailed view of the robot and its surroundings, suitable for dynamic scenarios akin to maritime platforms, even though experiments on such platforms were not conducted. The development of a VR control UI tailored for dynamic environments highlights our system’s robustness and potential for challenging operational scenarios. Our work lays a solid foundation for further research and innovation in robotic teleoperation for space exploration, emphasizing adaptability, immersive interaction, and precision in dynamic environments.

Biography

Myeung Un Kim
Feb. 2015: B.S. in Computer Science Engineering, Ulsan National Institute of Science and Technology, Ulsan, Korea. Aug. 2020: Combined M.S. and Ph.D. in Electrical Engineering, Ulsan National Institute of Science and Technology, Ulsan, Korea. Jul. 2020-Current: Senior Researcher, Korea Aerospace Research Institute, Daejeon, Korea. [Research Interests] Deep learning, aerospace, AR/VR, robot remote control. [ORCID: 0000-0001-5354-267X]

References