
An app developed by Cornell researchers uses augmented reality to help users repeatedly capture images from the same location with a phone or tablet to make time-lapse videos—without leaving a camera on site.

Time-lapse photography, which involves combining photos taken over long periods of time, provides a powerful way to visualize phenomena such as changing seasons or the movement of the sun. Traditionally, photographers would leave a camera on a tripod for the duration of the event, but researchers working with Abe Davis, assistant professor of computer science in the Cornell Ann S. Bowers College of Computing and Information Science, have developed a more convenient method. Their iOS app, ReCapture, is now freely available in the Apple App Store.
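For readers who want a concrete picture of that "combining photos" step, here is a minimal Swift sketch that stitches a sequence of already-aligned stills into a video using Apple's AVFoundation. The function names and the fixed frame rate are illustrative assumptions, not ReCapture's actual pipeline.

```swift
import AVFoundation
import UIKit

// Illustrative only: turn a sequence of already-aligned stills into an .mp4,
// one still per video frame. Error handling is pared down for brevity.
func writeTimelapse(images: [UIImage], to url: URL, fps: Int32 = 30) throws {
    guard let first = images.first?.cgImage else { return }
    let size = CGSize(width: first.width, height: first.height)

    let writer = try AVAssetWriter(outputURL: url, fileType: .mp4)
    let input = AVAssetWriterInput(mediaType: .video, outputSettings: [
        AVVideoCodecKey: AVVideoCodecType.h264,
        AVVideoWidthKey: size.width,
        AVVideoHeightKey: size.height,
    ])
    let adaptor = AVAssetWriterInputPixelBufferAdaptor(
        assetWriterInput: input, sourcePixelBufferAttributes: nil)
    writer.add(input)
    writer.startWriting()
    writer.startSession(atSourceTime: .zero)

    for (i, image) in images.enumerated() {
        while !input.isReadyForMoreMediaData { usleep(10_000) }
        // Frame i plays at time i / fps, so 300 stills at 30 fps yield a
        // 10-second clip regardless of how long the capture took.
        if let buffer = pixelBuffer(from: image, size: size) {
            adaptor.append(buffer, withPresentationTime:
                CMTime(value: CMTimeValue(i), timescale: fps))
        }
    }
    input.markAsFinished()
    writer.finishWriting {}   // asynchronous; completion handling omitted here
}

// Draws a UIImage into a CVPixelBuffer so AVAssetWriter can encode it.
private func pixelBuffer(from image: UIImage, size: CGSize) -> CVPixelBuffer? {
    var buffer: CVPixelBuffer?
    CVPixelBufferCreate(kCFAllocatorDefault, Int(size.width), Int(size.height),
                        kCVPixelFormatType_32ARGB, nil, &buffer)
    guard let buffer = buffer, let cg = image.cgImage else { return nil }
    CVPixelBufferLockBaseAddress(buffer, [])
    defer { CVPixelBufferUnlockBaseAddress(buffer, []) }
    let context = CGContext(data: CVPixelBufferGetBaseAddress(buffer),
                            width: Int(size.width), height: Int(size.height),
                            bitsPerComponent: 8,
                            bytesPerRow: CVPixelBufferGetBytesPerRow(buffer),
                            space: CGColorSpaceCreateDeviceRGB(),
                            bitmapInfo: CGImageAlphaInfo.noneSkipFirst.rawValue)
    context?.draw(cg, in: CGRect(origin: .zero, size: size))
    return buffer
}
```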

The researchers believe this is the first application designed for creating time-lapse videos from handheld devices.

Ruyu Yan, a computer science major in Cornell Engineering who was the lead developer of the app, presented the work, "ReCapture: AR-Guided Time-lapse Photography," at the 2022 Association for Computing Machinery (ACM) Symposium on User Interface Software and Technology on Nov. 1.

The app has three capture modes that cover a range of scenarios. One works best for landscapes, one helps capture close-up scenes and a third collects a range of images that can be used to reconstruct the scene in 3D offline.

Each capture mode uses different information about the scene. The simplest mode uses an overlay of previous shots to help the user line up new photos. For close-up scenes, which tend to be more difficult to capture, the application tries to figure out where the camera is in 3D space and uses arrows to tell the user how to move and tilt their phone toward the correct location.
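To give a sense of how that close-up guidance can work, here is a minimal sketch using Apple's ARKit, assuming the app saves the camera pose from the original shot and steers the user back toward it. The class name `PoseGuide`, the 2 cm tolerance, and the UI stubs are all hypothetical; the published paper describes the team's actual method.

```swift
import ARKit
import simd

// Hypothetical sketch: compare the live ARKit camera pose against the pose
// saved when the scene was first captured, and tell the user which way to
// move. Rotation guidance (tilt) is omitted for brevity.
final class PoseGuide: NSObject, ARSessionDelegate {
    let targetTransform: simd_float4x4   // world pose of the original shot

    init(targetTransform: simd_float4x4) {
        self.targetTransform = targetTransform
        super.init()
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // Translation offset from the current camera position to the target.
        let currentPos = simd_make_float3(frame.camera.transform.columns.3)
        let targetPos = simd_make_float3(targetTransform.columns.3)
        let offset = targetPos - currentPos
        let distance = simd_length(offset)

        if distance > 0.02 {   // ~2 cm tolerance; an assumed threshold
            // The normalized offset gives the direction for an on-screen arrow.
            updateArrow(direction: offset / distance, distance: distance)
        } else {
            showReadyToShoot()
        }
    }

    // UI stubs; a real app would drive SceneKit or SwiftUI overlays here.
    func updateArrow(direction: simd_float3, distance: Float) {}
    func showReadyToShoot() {}
}
```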

The work grew out of Yan's summer research with Davis through the Computer Science Undergraduate Research Program. Yan had mentioned an interest in geocaching, an activity where participants use a GPS to locate a box of trinkets called a cache, hidden by other geocaching enthusiasts. Meanwhile, Davis had been envisioning a project that would help field researchers repeatedly find and re-photograph precise locations from their field sites to track any changes. Together, they came up with the idea of "geocaching with pictures," which ultimately evolved into ReCapture.

"Geocaching may be something that people are doing for fun, but if you're a scientist and you're doing field work, then there's a similar kind of problem at play," Davis said.

Jiatian Sun, a doctoral student in the field of computer science, and Longxiulin Deng, a computer science major in Cornell Engineering, also assisted with the study.

Yan said the hardest part was developing the app interface to guide users through the process, because "what works intuitively for me may not work intuitively for others." She sought feedback from 20 beta testers and also worked with the XR Collaboratory at Cornell Tech, which advises researchers on augmented reality, virtual reality and mixed reality applications.

Additionally, she had to figure out how to manage the mountains of data associated with the photos. "The app used to crash a lot," she said. This was a problem because if the app was too slow or constantly crashing, people wouldn't collect enough images, leading to jerky, poor-quality videos.
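The article doesn't say exactly how she solved it, but a common pattern for this kind of memory pressure on iOS is to decode downsampled thumbnails with ImageIO and drain temporary allocations on each iteration, roughly as in this hypothetical `loadThumbnails` sketch:

```swift
import UIKit
import ImageIO

// General technique, not ReCapture's specific fix: decode each photo at a
// reduced size so the full-resolution frame never sits in memory, and use
// autoreleasepool so temporary buffers are freed every loop iteration.
func loadThumbnails(at urls: [URL], maxPixel: CGFloat = 1024) -> [UIImage] {
    var thumbnails: [UIImage] = []
    for url in urls {
        autoreleasepool {
            let options: [CFString: Any] = [
                kCGImageSourceCreateThumbnailFromImageAlways: true,
                kCGImageSourceThumbnailMaxPixelSize: maxPixel,
            ]
            if let source = CGImageSourceCreateWithURL(url as CFURL, nil),
               let cgImage = CGImageSourceCreateThumbnailAtIndex(
                   source, 0, options as CFDictionary) {
                thumbnails.append(UIImage(cgImage: cgImage))
            }
        }
    }
    return thumbnails
}
```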

In future versions of the app, Davis thinks they may be able to smooth out gaps and abrupt transitions in the footage using recent machine learning techniques, which would yield higher quality videos.
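As a rough illustration of what "smoothing a transition" means at the pixel level, a plain cross-dissolve blends neighboring frames by a parameter t; the learned methods Davis has in mind would synthesize genuinely new in-between frames rather than blending. This Core Image snippet (the `blend` helper is hypothetical) shows the baseline version:

```swift
import CoreImage

// Baseline cross-dissolve between two frames: t = 0 returns frame a,
// t = 1 returns frame b. A learned interpolator would replace this.
func blend(_ a: CIImage, _ b: CIImage, t: Float) -> CIImage? {
    guard let filter = CIFilter(name: "CIDissolveTransition") else { return nil }
    filter.setValue(a, forKey: kCIInputImageKey)
    filter.setValue(b, forKey: kCIInputTargetImageKey)
    filter.setValue(t, forKey: kCIInputTimeKey)
    return filter.outputImage
}
```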

Besides making GIFs and videos, the app may also have valuable scientific applications, as Davis had envisioned. The team has shared the app with field researchers in other departments at Cornell, and colleagues in the School of Integrative Plant Science in the College of Agriculture and Life Sciences have already begun using it to collect data.

More information: Ruyu Yan et al., "ReCapture: AR-Guided Time-lapse Photography," Proceedings of the 35th Annual ACM Symposium on User Interface Software and Technology (2022). DOI: 10.1145/3526113.3545641

Provided by Cornell University