A new way to integrate data with physical objects

StructCode modifies fabrication features like finger joints and living hinges in laser-cut objects, allowing for the addition of information such as labels and instructions without adding extra parts or materials. It also includes a robust tag decoding system and ensures the structural integrity of the laser-cut objects is maintained. Credit: MIT CSAIL

To get a sense of what StructCode is all about, says Mustafa Doğa Doğan, think of Superman. Not the "faster than a speeding bullet" and "more powerful than a locomotive" version, but a Superman, or Superwoman, who sees the world differently from ordinary mortals—someone who can look around a room and glean all kinds of information about ordinary objects that is not apparent to people with less penetrating faculties.

That, in a nutshell, is "the high-level idea behind StructCode," explains Doğan, a Ph.D. student in electrical engineering and computer science at MIT and an affiliate of the MIT Computer Science and Artificial Intelligence Laboratory (CSAIL). "The goal is to change the way we interact with objects"—to make those interactions more meaningful and more meaning-laden—"by embedding information into objects in ways that can be readily accessed."

StructCode grew out of an effort called InfraredTags, which Doğan and other colleagues introduced in 2022. That work, as well as the current project, was carried out in the laboratory of MIT Associate Professor Stefanie Mueller—Doğan's advisor, who has taken part in both projects. In last year's approach, "invisible" tags—which can only be seen with cameras capable of detecting infrared light—were used to reveal information about objects.

The drawback there was that many cameras cannot perceive infrared light. Moreover, the method for fabricating these objects and affixing the tags to their surfaces relied on 3D printers, which tend to be very slow and often can only make objects that are small.

StructCode, at least in its original version, relies on objects produced with laser-cutting techniques that can be manufactured within minutes, rather than the hours it might take on a 3D printer. Information can be extracted from these objects, moreover, with the RGB cameras that are commonly found in smartphones; the ability to operate in the infrared range of the spectrum is not required.

In their initial demonstrations of the idea, the MIT-led team decided to construct their objects out of wood, making pieces such as furniture, picture frames, flowerpots, and toys that are well suited to laser-cut fabrication. A key question that had to be resolved was this: How can information be stored in a way that is unobtrusive and durable, unlike externally attached bar codes and QR codes, and that will not undermine an object's structural integrity?

The solution that the team has come up with, for now, is to rely on joints, which are ubiquitous in wooden objects made out of more than one component. Perhaps the most familiar is the finger joint, which has a kind of zigzag pattern whereby two wooden pieces adjoin at right angles such that every protruding "finger" along the joint of the first piece fits into a corresponding "gap" in the joint of the second piece and, similarly, every gap in the joint of the first piece is filled with a finger from the second.

"Joints have these repeating features, which are like repeating bits," Dogan says. To create a code, the researchers slightly vary the length of the gaps or fingers. A standard size length is accorded a 1. A slightly shorter length is assigned a 0, and a slightly longer length is assigned a 2. The encoding scheme is based on the sequence of these numbers, or bits, that can be observed along a joint. For every string of four bits, there are 81 (34) possible variations.

The team also demonstrated ways of encoding messages in "living hinges"—a kind of joint that is made by taking a flat, rigid piece of material and making it bendable by cutting a series of parallel, vertical lines. As with the finger joints, the distance between these lines can be varied: 1 being the standard length, 0 being a slightly shorter length, and 2 being slightly longer. And in this way, a code can be assembled from an object that contains a living hinge.
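However the values are carried, whether by finger lengths or hinge-cut spacings, the recovered sequence can be interpreted as a base-3 number. The sketch below shows that conversion and its inverse; it makes no claim to match StructCode's actual payload format, which the article does not describe.

```python
# Illustrative sketch: reading a recovered sequence of values (from finger
# joints or living-hinge spacings) as a base-3 number. The article does not
# describe StructCode's actual payload format; this is only the arithmetic.

def values_to_int(values):
    """Interpret the sequence as base-3 digits, most significant first."""
    result = 0
    for v in values:
        result = result * 3 + v
    return result

def int_to_values(number, length):
    """Inverse mapping, left-padded with zeros to a fixed code length."""
    digits = []
    for _ in range(length):
        number, digit = divmod(number, 3)
        digits.append(digit)
    return digits[::-1]

print(values_to_int([1, 0, 2, 1]))  # 34
print(int_to_values(34, 4))         # [1, 0, 2, 1]
```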

The idea is described in a paper, "StructCode: Leveraging Fabrication Artifacts to Store Data in Laser-Cut Objects," that was presented this month at the 2023 ACM Symposium on Computational Fabrication in New York City. Doğan, the paper's first author, is joined by Mueller and four co-authors—recent MIT alumna Grace Tang '23, MNG '23; MIT undergraduate Richard Qi; University of California at Berkeley graduate student Vivian Hsinyueh Chan; and Cornell University Assistant Professor Thijs Roumen.

"In the realm of materials and design, there is often an inclination to associate novelty and innovation with entirely new materials or manufacturing techniques," notes Elvin Karana, a professor of materials innovation and design at the Delft University of Technology. One of the things that impresses Karana most about StructCode is that it provides a novel means of storing data by "applying a commonly used technique like laser cutting and a material as ubiquitous as wood."

The idea for StructCode, adds University of Colorado computer scientist Ellen Yi-Luen Do, is "simple, elegant, and totally makes sense. It's like having the Rosetta Stone to help decipher Egyptian hieroglyphs."

Patrick Baudisch, a computer scientist at the Hasso Plattner Institute in Germany, views StructCode as "a great step forward for personal fabrication. It takes a key piece of functionality that's only offered today for mass-produced goods and brings it to custom objects."

Here, in brief, is how it works: First, a laser cutter—guided by a model created via StructCode—fabricates an object into which encoded information has been embedded. After downloading a StructCode app, a user can decode the hidden message by pointing a cellphone camera at the object; aided by StructCode software, the camera detects the subtle variations in length found in the object's outward-facing joints or living hinges.
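As an illustration of that last step only, and assuming feature lengths have already been measured from the camera image (the article does not detail StructCode's vision pipeline), a measured length could be snapped to the nearest encoded value along these lines. The nominal, offset, and tolerance figures are hypothetical.

```python
# Decoding sketch only, assuming feature lengths (in mm) have already been
# measured from the camera image; the vision pipeline itself is not shown.
# Nominal, offset, and tolerance values are hypothetical.

NOMINAL_MM = 10.0
DELTA_MM = 1.0
TOLERANCE_MM = 0.4

def classify_length(measured_mm):
    """Snap a measured feature length to the nearest encoded value (0, 1, 2)."""
    targets = {0: NOMINAL_MM - DELTA_MM, 1: NOMINAL_MM, 2: NOMINAL_MM + DELTA_MM}
    value, target = min(targets.items(), key=lambda kv: abs(measured_mm - kv[1]))
    if abs(measured_mm - target) > TOLERANCE_MM:
        raise ValueError(f"ambiguous measurement: {measured_mm} mm")
    return value

print([classify_length(mm) for mm in (10.1, 9.2, 11.0, 9.9)])  # [1, 0, 2, 1]
```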

The process is even easier if the user is equipped with augmented reality glasses, Doğan says. "In that case, you don't need to point a camera. The information comes up automatically." And that can give people more of the "superpowers" that the designers of StructCode hope to confer.

"The object doesn't need to contain a lot of information," Doğan adds. "Just enough—in the form of, say, URLs—to direct people to places they can find out what they need to know."

Users might be sent to a website where they can obtain information about the object—how to care for it, and perhaps eventually how to disassemble it and recycle (or safely dispose of) its contents. A flowerpot that was made with living hinges might inform a user, based on records that are maintained online, as to when the plant inside the pot was last watered and when it needs to be watered again.

Children examining a toy crocodile could, through StructCode, learn scientific details about various parts of the animal's anatomy. A picture frame made with finger joints modified by StructCode could help people find out about the painting inside the frame and about the person (or persons) who created the artwork—perhaps linking to a video of an artist talking about this work directly.

"This technique could pave the way for new applications, such as interactive museum exhibits," says Raf Ramakers, a computer scientist at Hasselt University in Belgium. "It holds the potential for broadening the scope of how we perceive and interact with everyday objects"—which is precisely the goal that motivates the work of Doğan and his colleagues.

But StructCode is not the end of the line, as far as Doğan and his collaborators are concerned. The same general approach could be adapted to other manufacturing techniques besides laser cutting, and information storage doesn't have to be confined to the joints of wooden objects.

Data could be represented, for instance, in the texture of leather, within the pattern of woven or knitted pieces, or concealed by other means within an image. Doğan is excited by the breadth of available options and by the fact that their "explorations into this new realm of possibilities, designed to make objects and our world more interactive, are just beginning."

More information: Paper: groups.csail.mit.edu/hcie/file … StructCode-paper.pdf

This story is republished courtesy of MIT News (web.mit.edu/newsoffice/), a popular site that covers news about MIT research, innovation and teaching.

