
<(From Left) Ph.D. candidate Jumin Lee, Ph.D. candidate Woo Jae Kim, Ph.D. candidate Youngju Na, Ph.D. candidate Kyu Beom Han, Professor Sung-Eui Yoon>
Existing 3D scene reconstruction requires a cumbersome process: precisely measuring physical spaces with LiDAR or 3D scanners, or calibrating thousands of photos together with camera pose information. The research team at KAIST has overcome these limitations and introduced a technology that reconstructs 3D scenes, from tabletop objects to outdoor scenes, with just two to three ordinary photographs. The breakthrough suggests a new paradigm in which spaces captured by a camera can be immediately transformed into virtual environments.
KAIST announced on November 6 that the research team led by Professor Sung-Eui Yoon from the School of Computing has developed a new technology called SHARE (Shape-Ray Estimation), which can reconstruct high-quality 3D scenes using only ordinary images, without precise camera pose information.
Existing 3D reconstruction technologies have required precise camera position and orientation information at the time of shooting in order to reproduce a 3D scene from a small number of images. This has necessitated specialized equipment or complex calibration processes, making real-world applications difficult and slowing widespread adoption.
To solve these problems, the research team developed a technology that constructs accurate 3D models by simultaneously estimating the 3D scene and the camera orientation using just two to three standard photographs. The technology has been recognized for its high efficiency and versatility, enabling rapid and precise reconstruction in real-world environments without additional training or complex calibration processes.
While existing methods calculate 3D structures from known camera poses, SHARE autonomously extracts spatial information from images themselves and infers both camera pose and scene structure. This enables stable 3D reconstruction without shape distortion by aligning multiple images taken from different positions into a single unified space.
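The alignment step described above, bringing observations from several viewpoints into a single unified space, can be illustrated with a toy numpy sketch. This is only a conceptual illustration under simplifying assumptions: the function names are hypothetical, and the camera poses are given directly here, whereas SHARE estimates them from the images themselves.

```python
import numpy as np

def rotation_z(theta):
    """Rotation matrix about the z-axis."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0],
                     [s,  c, 0.0],
                     [0.0, 0.0, 1.0]])

def align_views(points_per_view, poses):
    """Map each view's camera-space points into one shared world frame.

    poses: list of (R, t) such that p_world = R @ p_cam + t.
    """
    world = [pts @ R.T + t for pts, (R, t) in zip(points_per_view, poses)]
    return np.vstack(world)

# Toy check: one world point observed from two different camera poses.
p_world = np.array([1.0, 2.0, 3.0])
poses = [(rotation_z(0.3), np.array([0.5, 0.0, 0.0])),
         (rotation_z(-0.7), np.array([0.0, 1.0, 0.0]))]
# Camera-space observation in each view: p_cam = R.T @ (p_world - t)
views = [(R.T @ (p_world - t)).reshape(1, 3) for R, t in poses]
aligned = align_views(views, poses)
# After alignment, both observations land on the same world coordinate.
print(np.allclose(aligned[0], aligned[1]))  # True
print(np.allclose(aligned[0], p_world))     # True
```

The point of the sketch is the consistency requirement: when the estimated poses are correct, independent per-view predictions of the same surface point coincide in the unified frame, which is what prevents the shape distortion mentioned above.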

<Representative Image of SHARE Technology>
"The SHARE technology is a breakthrough that dramatically lowers the barrier to entry for 3D reconstruction,” said Professor Sung-Eui Yoon. “It will enable the creation of high-quality content in various industries such as construction, media, and gaming using only a smartphone camera. It also has diverse application possibilities, such as building low-cost simulation environments in the fields of robotics and autonomous driving."

<SHARE Technology, Precise Camera Information and 3D Scene Prediction Technology>
Ph.D. candidate Youngju Na and M.S. candidate Taeyeon Kim participated as co-first authors of the research. The results were presented on September 17th at the IEEE International Conference on Image Processing (ICIP 2025), where the paper received the Best Student Paper Award.
The award, given to only one paper among 643 accepted papers this year—a selection rate of 0.16 percent—once again underscores the excellent research capabilities of the KAIST research team.
This achievement was carried out with support from the Ministry of Science and ICT's SW Star Lab Project under the task 'Development of Perception, Action, and Interaction Algorithms for Unspecified Environments for Open World Robot Services.'