A Remote Sharing Method Of 3D Physical Objects Using Instance-Segmented Real-Time 3D Point Cloud For Design Meeting
In the fields of architecture and urban design, physical models such as design study models and building material samples are used in design meetings to help stakeholders, including experts and non-experts, understand the contents of a project. In response to COVID-19 and in preparation for disasters, not only traditional face-to-face meetings but also remote meetings over the Internet, and hybrid meetings that combine the two, are coming into wide use. Conventional web conferencing can share only two-dimensional information such as images and video through the screen; sharing physical objects interactively on the screen is expected to enable smoother remote meetings. This would contribute to better architectural and urban design processes with fewer misunderstandings among stakeholders, and to reducing the unnecessary energy consumed by travel to face-to-face meetings. 3D virtual models can be created with CAD (computer-aided design) or BIM (Building Information Modeling) tools and shared remotely over a network. However, creating 3D virtual models from scratch is time-consuming, and sharing physical objects that move and deform is difficult. Another way to create 3D virtual objects is point cloud technology. A point cloud is a set of points with coordinate and color information that can be acquired by a 3D scanner or an RGB-D camera. Stotko et al. (2019) developed a client-server system that captures a static 3D physical scene in real time and allows a large number of users to explore it. However, although this system reconstructs the scene as a 3D virtual model that can be viewed in mixed reality (MR), individual objects in the scene cannot be manipulated. To address this, a system that manipulates point clouds cluster by cluster using fast point cloud segmentation has been reported ([Authors 2020]).
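The distance-based Euclidean clustering used for such cluster-by-cluster segmentation can be sketched as follows. This is a minimal NumPy illustration, not the reported system's code: points are grouped into connected components under a distance tolerance, so any two groups that come closer than the tolerance are merged into one cluster (real implementations use a k-d tree rather than the O(n²) distance matrix below).

```python
import numpy as np

def euclidean_cluster(points: np.ndarray, tolerance: float) -> np.ndarray:
    """Label points so that two points share a cluster when they are linked
    by a chain of neighbours closer than `tolerance` (a sketch of
    PCL-style Euclidean cluster extraction)."""
    n = len(points)
    # adjacency: True where the pairwise distance is below the tolerance
    diff = points[:, None, :] - points[None, :, :]
    adj = np.linalg.norm(diff, axis=-1) < tolerance
    labels = np.full(n, -1)
    current = 0
    for seed in range(n):
        if labels[seed] != -1:
            continue
        # flood-fill the connected component containing `seed`
        stack = [seed]
        labels[seed] = current
        while stack:
            i = stack.pop()
            for j in np.nonzero(adj[i])[0]:
                if labels[j] == -1:
                    labels[j] = current
                    stack.append(j)
        current += 1
    return labels

# Two point groups 1.0 apart: separate clusters at tolerance 0.5,
# but merged into a single cluster once the tolerance exceeds the gap.
a = np.array([[0.0, 0, 0], [0.1, 0, 0]])
b = a + [1.0, 0, 0]
pts = np.vstack([a, b])
print(len(set(euclidean_cluster(pts, 0.5).tolist())))  # 2 clusters
print(len(set(euclidean_cluster(pts, 1.5).tolist())))  # 1 cluster
```

The usage at the bottom shows the behaviour at issue: the result depends only on inter-point distances, not on which object a point belongs to.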
However, the Euclidean clustering method used in that study divides the point cloud by the distance between clusters, so if two clusters come closer than the distance tolerance, they are merged and treated as a single object. The objective of this research is to develop a system that uses instance segmentation to segment objects in a real-time point cloud captured from 3D physical objects by object region rather than by inter-object distance. With this system, even when the sender touches a shared physical object, the sender and the physical object are each displayed in MR as separately segmented point clouds, and the receiver can manipulate them individually from a remote location. Through experiments, we analyzed the relationship between the number of points transmitted and the frame rate (fps) at which the point clouds are displayed in MR, determined how many points can be transmitted in real time, and applied down-sampling so that more objects can be shared remotely in real time. This system allows the receiver to freely move and rearrange the objects shared from the sender's side, facilitating remote design-review meetings while communicating with the sender.

References
Stotko, P., Krumpen, S., Hullin, M. B., Weinmann, M. and Klein, R.: 2019, SLAMCast: Large-Scale, Real-Time 3D Reconstruction and Streaming for Immersive Multi-Client Live Telepresence, IEEE Transactions on Visualization and Computer Graphics, Vol. 25, No. 5, pp. 2102-2112.
[Authors 2020] This is omitted due to double-blind peer review and will be clearly stated at the camera-ready submission stage.
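Down-sampling to keep the transmitted point count within the real-time budget can likewise be sketched. The abstract does not specify the method used, so the voxel-grid variant below is an illustrative assumption (function and parameter names are hypothetical): all points falling in the same cubic voxel are replaced by their centroid.

```python
import numpy as np

def voxel_downsample(points: np.ndarray, voxel_size: float) -> np.ndarray:
    """Replace every group of points sharing a cubic voxel with the group's
    centroid (a common voxel-grid down-sampling sketch; not the paper's
    implementation)."""
    # integer voxel index of every point
    idx = np.floor(points / voxel_size).astype(np.int64)
    # group points by voxel: `inverse[i]` is the voxel group of point i
    _, inverse = np.unique(idx, axis=0, return_inverse=True)
    inverse = inverse.ravel()
    counts = np.bincount(inverse)
    out = np.zeros((len(counts), points.shape[1]))
    for d in range(points.shape[1]):
        # per-voxel mean of coordinate d
        out[:, d] = np.bincount(inverse, weights=points[:, d]) / counts
    return out

rng = np.random.default_rng(0)
cloud = rng.random((10_000, 3))        # dense cloud in the unit cube
small = voxel_downsample(cloud, 0.2)   # at most 5**3 = 125 voxels remain
print(len(cloud), "->", len(small))
```

Larger voxel sizes trade spatial detail for a lower point count, which is the lever for fitting more shared objects into a fixed transmission rate.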
Keywords: Remote Meeting, Fast Point Cloud, Instance Segmentation, Three-Dimensional Remote Sharing, Mixed Reality, SDG 11, SDG 13.