Although built upon many different factors, human identity is largely defined by our commitment to growing under all circumstances. That commitment has delivered some major milestones, with technology emerging as one of the most significant among them. The reason we hold technology in such high regard is its skill set, which has guided us toward a reality nobody could have imagined otherwise. Nevertheless, if we look beyond the surface for a moment, it becomes abundantly clear that this run was equally inspired by the way we applied those skills across real-world environments. That practical component, in fact, did a lot to give the creation a spectrum-wide impact and, as a result, set off a full-blown tech revolution. Of course, this revolution eventually went on to scale up the human experience through some genuinely unique avenues, yet even after achieving a feat so notable, technology continues to bring forth the right goods. The same has only grown more evident in recent times, and assuming one new discovery ends up with the desired impact, it will put that trend on an even higher pedestal moving forward.
The research team at the Massachusetts Institute of Technology has successfully developed a generative-AI-driven tool designed to let users add custom design elements to 3D models without compromising the functionality of the fabricated objects. To understand the significance of such a development, we must start by acknowledging a trend where people are leveraging cheaper 3D printers to fabricate their own objects. They are able to do so by accessing free, open-source repositories of user-generated, downloadable 3D models. However, as simple as it sounds, installing custom design elements into these models remains a massive challenge. This is because the process demands complex and expensive computer-aided design (CAD) software, making the operation particularly difficult when the original representation of the model is not available online. Furthermore, even if a user somehow succeeds in bringing personalized elements to an object, actually ensuring those customizations don't hurt the object's functionality would still demand a degree of domain expertise that many novice makers lack. So how does MIT's latest brainchild solve the problem in question? Well, named Style2Fab, the tool will enable you to personalize 3D models of objects using only natural language prompts to describe the desired design. Once that bit is done, you can go ahead and fabricate the objects with a 3D printer.
“For someone with less experience, the essential problem they faced has been: Now that they have downloaded a model, as soon as they want to make any changes to it, they are at a loss and don’t know what to do. Style2Fab would make it very easy to stylize and print a 3D model, but also experiment and learn while doing it,” said Faraz Faruqi, a computer science graduate student and lead author of a paper introducing Style2Fab.
The researchers kicked off the proceedings with a study of the various 3D model repositories currently available across the web. The exercise taught them which methods they could use to apply AI for segmenting models into functional and aesthetic components. Eventually, they defined two specific kinds of functionality: external functionality, which involves parts of the model that interact with the outside world, and internal functionality, which involves parts of the model that need to mesh together after fabrication. Notably, any stylization tool for this purpose would need to keep the geometry of externally and internally functional segments intact while enabling customization of nonfunctional, aesthetic segments. This requires Style2Fab to figure out which parts of a 3D model are functional. Fortunately, the technology comes through just fine by analyzing the model's topology, including curves or angles where two planes connect, to track how frequently the geometry changes. Such an analysis gives the system the necessary basis to divide the model into a certain number of segments. That's not all, considering Style2Fab then also compares those segments against a dataset the researchers built, which holds 294 models of 3D objects with each segment carrying either a functional or an aesthetic label. A segment is marked as functional if it closely matches one of the labeled examples in that newly formed dataset.
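The pipeline described above — splitting a model wherever the geometry changes sharply, then labeling each segment by comparison with a small annotated dataset — can be sketched in miniature. Everything below is an illustrative assumption, not MIT's actual code: the function names are hypothetical, real Style2Fab operates on 3D mesh topology rather than a 1D curvature list, and the nearest-neighbour lookup stands in for whatever classifier the researchers use.

```python
# Hypothetical sketch of Style2Fab-style segmentation and labeling.
# A real implementation would work on mesh faces, not a flat list.

def segment_by_curvature(vertex_curvatures, threshold=0.5):
    """Split per-vertex curvature values into segments, starting a new
    segment wherever curvature jumps by more than the threshold."""
    segments, current = [], [vertex_curvatures[0]]
    for prev, cur in zip(vertex_curvatures, vertex_curvatures[1:]):
        if abs(cur - prev) > threshold:
            segments.append(current)
            current = []
        current.append(cur)
    segments.append(current)
    return segments

def classify_segment(segment, labeled_examples):
    """Nearest-neighbour label lookup against a labeled dataset.
    Each example is a (mean_curvature, label) pair."""
    mean_c = sum(segment) / len(segment)
    best = min(labeled_examples, key=lambda ex: abs(ex[0] - mean_c))
    return best[1]

# Toy model: a smooth region followed by a sharply curved one.
curvatures = [0.10, 0.12, 0.11, 0.90, 0.95, 0.92]
segments = segment_by_curvature(curvatures)
dataset = [(0.1, "aesthetic"), (0.9, "functional")]
labels = [classify_segment(s, dataset) for s in segments]
print(labels)  # → ['aesthetic', 'functional']
```

As the article notes below, these labels are only an initial recommendation in the real tool; the user can flip any segment's classification by hand.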
“But it is a really hard problem to classify segments just based on geometry, due to the huge variations in models that have been shared. So these segments are an initial set of recommendations that are shown to the user, who can very easily change the classification of any segment to aesthetic or functional,” said Faruqi.
In the next step, the user enters a natural language prompt describing the desired design elements. Once a prompt that matches your needs is in place, an AI system called Text2Mesh starts figuring out what a 3D model fitting that description would look like.
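Putting the two stages together, the stylizer is applied only to segments labeled aesthetic, while functional geometry passes through untouched. The sketch below is a hypothetical workflow, with `stylize` standing in as a placeholder for the real Text2Mesh pipeline, whose actual API is not shown here.

```python
# Hypothetical Style2Fab-style workflow: stylize aesthetic segments only.
# `stylize` is a stand-in for a text-driven system such as Text2Mesh.

def personalize(segments, labels, prompt, stylize):
    """Apply a text-driven stylizer to aesthetic segments only,
    leaving functional geometry untouched."""
    out = []
    for seg, label in zip(segments, labels):
        out.append(stylize(seg, prompt) if label == "aesthetic" else seg)
    return out

# Toy stylizer stand-in: tags a segment with the prompt.
def fake_stylize(seg, prompt):
    return {"geometry": seg, "style": prompt}

result = personalize(["handle", "spout"],
                     ["aesthetic", "functional"],
                     "in the style of carved wood",
                     fake_stylize)
print(result)
# → [{'geometry': 'handle', 'style': 'in the style of carved wood'}, 'spout']
```

The design point this illustrates is the article's central claim: customization is confined to nonfunctional segments, so the printed object still works as intended.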
Beyond helping designers, the new technology also has a possible use case in the emerging field of medical making. This use case stems from research that found both the aesthetic and the functional features of an assistive device to be deciding factors in how effectively a patient will use the object. At the same time, though, the research also revealed that clinicians, much like novice designers, may not have the expertise to personalize 3D-printable models. Enter Style2Fab. As for examples of where the technology can help, it can, for instance, customize the appearance of a thumb splint so it blends in with the patient's clothing without altering the device's functionality.
For the immediate future, the researchers plan to empower Style2Fab with fine-grained control over physical properties as well as geometry. Beyond that, they also hope to enhance the technology enough to let users generate their own custom 3D models from scratch within the system.
Copyright © 2024. All Rights Reserved. Engineers Outlook.