April 17, 2016
As cool as those handheld 3D printing pens are, you need a fair amount of skill (or at the very least, practice) to make anything much more recognizable than a mangled three-dimensional squiggle. A proper 3D printer is basically one of those 3D printing pens stapled to a robot that can move it in three axes and do a much better job of making things that look nice and work well, but it doesn't allow for much artistic participation from you. For some people, that's the point, but if you'd like to be more directly involved, Yeliz Karadayi's thesis project, called "Guided Hand," is a 3D printing pen with a haptic interface that helps keep you from screwing things up too badly.
These haptic interfaces are basically little robot arms (although you can create the same effect with robot arms of any size). It's hard to explain how it feels to use one of these things, and the experience doesn't come through very well on video, but essentially, the end of the arm (being a robot) knows exactly where it is in 3D space, which means it can tell whether it's about to intersect a virtual 3D object or not. Most of the time, the arm is in gravity compensation mode, but if you try to move it into a virtual object, it can kick in its motors and stop you. In practice, this results in a very convincing I'm-poking-an-invisible-object kind of feeling, a lot like what you get when you move a pair of strong magnets around each other. But the robot arm can also replicate textures, tactile sensations, and even 3D objects that are moving around or (sometimes) trying to bite you.
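To make the idea concrete, here's a minimal sketch of that kind of haptic loop: the arm feels free (gravity compensation only) until the pen tip penetrates a virtual object, at which point the motors apply a spring-like "penalty" force pushing it back out. This is an illustration under assumed names and values, not Guided Hand's actual code.

```python
STIFFNESS = 800.0  # N/m, how "hard" the virtual surface feels (made-up value)

def penetration_depth(tip_pos, sphere_center, sphere_radius):
    """Depth of the pen tip inside a virtual sphere (> 0 means inside)."""
    dist = sum((t - c) ** 2 for t, c in zip(tip_pos, sphere_center)) ** 0.5
    return sphere_radius - dist

def haptic_force(tip_pos, sphere_center, sphere_radius):
    """Force to command the motors: zero outside the virtual object,
    an outward spring force proportional to penetration depth inside."""
    depth = penetration_depth(tip_pos, sphere_center, sphere_radius)
    if depth <= 0:
        return (0.0, 0.0, 0.0)  # gravity compensation only; arm feels free
    dist = sphere_radius - depth
    if dist == 0:
        return (0.0, 0.0, 0.0)  # tip exactly at center: no defined normal
    # outward unit normal from the sphere center through the tip
    normal = tuple((t - c) / dist for t, c in zip(tip_pos, sphere_center))
    return tuple(STIFFNESS * depth * n for n in normal)
```

A real device would run a loop like this at around a kilohertz so the pushback feels like a solid surface rather than a mushy one.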
Karadayi's implementation of this robotic innovation uses several different techniques to help guide the user. There's boundary exclusion, which prevents you from drawing inside an area, as well as containment, which prevents you from drawing outside an area. Attraction helps guide the user by providing some physical feedback, making it easier to follow specific paths. Variations on this include snapping and path following, which bias the pen to help you trace lines and curves.
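The guidance modes above can be sketched as simple force rules: exclusion and containment are the same spring force with flipped signs, and attraction pulls the tip toward the nearest point on a target path. A hypothetical illustration (spherical region, straight-line path), not the project's real implementation:

```python
K = 500.0  # spring stiffness, N/m (arbitrary placeholder)

def boundary_force(tip, center, radius, mode):
    """'exclusion' pushes the tip out of a sphere; 'containment'
    pushes it back inside. Returns a 3-tuple force vector."""
    d = [t - c for t, c in zip(tip, center)]
    dist = sum(x * x for x in d) ** 0.5 or 1e-9
    n = [x / dist for x in d]  # outward unit normal
    if mode == "exclusion" and dist < radius:
        return tuple(K * (radius - dist) * x for x in n)   # push outward
    if mode == "containment" and dist > radius:
        return tuple(-K * (dist - radius) * x for x in n)  # pull back in
    return (0.0, 0.0, 0.0)

def attraction_force(tip, a, b):
    """Pull the tip toward the nearest point on segment a-b (snapping)."""
    ab = [y - x for x, y in zip(a, b)]
    ab2 = sum(x * x for x in ab) or 1e-9
    t = max(0.0, min(1.0, sum((p - x) * y for p, x, y in zip(tip, a, ab)) / ab2))
    nearest = [x + t * y for x, y in zip(a, ab)]
    return tuple(K * (n - p) for n, p in zip(nearest, tip))
```

Because these are just spring forces, the user can always overpower them, which is exactly what makes the guidance suggestive rather than rigid.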
Mixing in simulated tactile sensations, like vibration, friction, and damping, contributes another, slightly less forceful way of guiding the user. These sensations also have effects on the output of the 3D printing pen itself, leading to (and this is a technical term) increased squiggliness. The squigglitude also depends on the settings of the 3D printing pen itself, and on whether you're using a medium that is air-cured or UV-cured.
The really cool thing about Guided Hand is that it can be tuned to be as invasive or non-invasive as you want it to be, as Karadayi explains, allowing you to start with a template but make changes to it as you go:
As shown in the sculpture application, Guided Hand provides the user with a lot of potential to learn new crafty ways to print, but it is not strictly necessary to print inside the confines of a model, or to print all of the model shown in the digital space, or to even have a model at all. By allowing some tolerance and freedom within the constraints, and because the user is stronger than the bounding forces and can break away from them, the user is able to make changes to a model in real time as it is being printed. Essentially what this means is that the model in the digital space, rather than being the target output print, may be thought of instead as more of a template. With this in mind the opportunity for prototyping arises, where with each print the user can make slight or subtle changes.
Robotic haptic interfaces like these tend to be most effective when coupled with a virtual environment that the user can experience, whether it's in an augmented reality context (like a visual overlay) or in virtual reality:
An amazing concept in a visual overlay is that the overlay can provide more information than just the object in the haptic space, such as vector maps, or a heat map of what is left to be printed versus what is not, or guidance on where in the volume to extrude ink faster and where to extrude slower. Just imagine a gradient, such that as the color shifts from white to black, a number of variables can change. For example, the vibration of the haptic device may intensify, or maybe the resistance or tolerance level intensifies, or maybe the extrusion speed increases. Perhaps all of these things happen at once, all while the print is still being actuated and completely controlled by the user.
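That gradient idea boils down to mapping one scalar sampled from the overlay to several device parameters at once. A toy sketch, with entirely made-up parameter ranges:

```python
def params_from_gradient(darkness):
    """Map overlay darkness (0.0 = white .. 1.0 = black) to device settings.
    All ranges are illustrative placeholders, not Guided Hand's values."""
    darkness = max(0.0, min(1.0, darkness))  # clamp out-of-range samples
    return {
        "vibration_hz":   50 + darkness * 200,   # buzz harder as it darkens
        "resistance_n_m": 100 + darkness * 900,   # arm pushes back more stiffly
        "extrude_mm_s":   1.0 + darkness * 4.0,   # extrude ink faster
    }
```

The appeal is that the artist only reads one visual cue (the shade under the pen) while the system quietly adjusts everything else in concert.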
Sounds pretty cool, at least for you artistic types, but personally, I'm going to leave the 3D printing entirely up to the robots.
[ Guided Hand ] via [ Reddit ]