Looks to me like you can, going by these examples.

“A new advertising phenomenon is taking the world by storm. 3D Projection Mapping allows marketers and advertisers to create surreal landscapes by projecting images on objects. It allows buildings to move, transform and even communicate! As you watch these videos, notice how much buzz is created as people record and take photos of the event via their mobile phones. For those of you who are not familiar with the term or marketing technique…” (12 MUST SEE 3D Projection Mapping Examples)

“Video Projection Mapping is an exciting new projection technique that can turn almost any surface into a dynamic video display. Specialized software is used to warp and mask the projected image to make it fit perfectly on irregularly shaped screens. When done right, the end result is a dynamic projection installation that transcends ordinary video projection. The goal of this site is to compile examples of impressive uses of video mapping techniques.” (http://www.creativeguerrillamark…, via 12 MUST SEE 3D Projection Mapping Examples)
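The “warp and mask” step that quote mentions boils down to a projective (perspective) transform: given where the four corners of the content should land on the surface, solve for a 3×3 homography and push every content pixel through it. Here is a minimal sketch of that math in plain NumPy; the function names and the example corner coordinates are illustrative, not from any mapping product’s API.

```python
import numpy as np

def homography(src, dst):
    """Solve for the 3x3 matrix H mapping each src corner to its dst corner
    (direct linear transform, with H[2,2] fixed to 1)."""
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); b.append(u)
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y]); b.append(v)
    h = np.linalg.solve(np.array(A, float), np.array(b, float))
    return np.append(h, 1.0).reshape(3, 3)

def warp_point(H, x, y):
    """Push one content-space point through H (with the homogeneous divide)."""
    u, v, w = H @ np.array([x, y, 1.0])
    return u / w, v / w

# Content corners (a unit square) and where they should land on the surface,
# e.g. the four corners of a wall panel seen at an angle.
src = [(0, 0), (1, 0), (1, 1), (0, 1)]
dst = [(10.0, 12.0), (95.0, 18.0), (90.0, 80.0), (5.0, 70.0)]
H = homography(src, dst)
```

Production tools do the same thing per-pixel on the GPU (plus masking), but the four-corner correspondence is the whole trick for flat surfaces.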

This answer originally appeared on this Quora question on Astral Projection.

Hands-On With Lightform Projection Mapping!

The text:

Hey everybody, it’s Norm from Tested, and Jeremy from Tested. Some of you may know that Jeremy and I do a weekly show called Projections, but one of the things we haven’t covered on Tested is, well, actual projections: projection mapping. This doesn’t necessarily fit into that bucket, but projection mapping is just very, very cool. About a year ago we were talking on the podcast about how at Christmastime, or holidays in general, people take these laser lights and project them on the side of their house, and it just looks kind of cheesy. It looks warped, and not like projection mapping should, where the light actually conforms to the geometry of the object you project onto. Exactly. So I was thinking: wouldn’t it be great if somebody made a tool that allowed anybody to projection map onto their house, or in their rooms, or onto an object, whatever you want? And somebody has: a new company called Lightform, aimed at consumers and prosumers. So we went to visit their HQ to chat with their CEO about the technology and check out their Lightform projection mapping device.

Brett, thank you so much for having us here. Of course. I’m super stoked to see Lightform. So you make this device, and you’ve been working on it for a couple of years; it’s a projection mapping device? Yes, exactly. It’s a hardware and software solution purpose-built for projected AR, or projection mapping. The Lightform is a high-resolution camera, a 4K color camera, with a computer inside, and you mount it onto almost any projector. Once it’s attached, we’ve turned that projector into a 3D scanner: we can point it at any object in the real world, get a 3D scan of it, and then apply magical effects, or apply information onto the real world using projection.

I’ve seen projection mapping in a variety of places: amusement parks like Disneyland, retail locations, even eSports events, but not really in the consumer space. This device looks really small and compact; how does it compare to something that, say, Disney would use for a theme park? We actually got into projection about ten years ago: I was at Disney Imagineering, and Phil was touring with Skrillex doing big projected visuals. Lightform is all about taking those big theme-park-level magical experiences and boiling them down into a solution that’s accessible to any designer creating projected AR today.

So let’s talk about what that scanning process is. Lightform is designed to scan static scenes and then, within a couple of minutes, create super compelling magical effects. We have a couple of minutes right now; can you walk me through it? Yes, definitely. We’ll set this one to the side and go to our demo unit over here. Let me grab a stormtrooper helmet, and we actually 3D printed a Tested logo on the Form 1 printer over there. I’m going to set these objects down, and then we’re going to scan them. Right now, on the laptop, you can see a live feed of what the camera is seeing, and Phil is going to trigger a scan, so we’ll see what that looks like. Projection mapping is really kind of backwards: you map first, and then you project onto the computer’s understanding of the shapes and sizes of the world. That’s actually why we call this projected AR and not projection mapping: we want to take the mapping out of projection mapping, so it’s more about the experience and the content you can create, and less about that painful process of mapping pixels onto objects.

In a lot of AR devices we’ve seen, mapping the world, understanding its shapes and sizes, is a huge part of it. How does this do it? Because here I see it’s one lens; you’re not using what a Kinect would have in terms of an IR sensor and blasters, or stereo even. How does this understand the shape of that? It is stereo. Think of stereo as left eye, right eye: that’s how we see in 3D. But here the left eye is the Lightform’s camera and the right eye is the projector, and they’re actually talking to each other through light. Those patterns you saw were black-and-white patterns; those were zeros and ones. It was actually sending binary code through light, and at the end of the day what we get is a projector-resolution scan, so you can get a 1080p scan of whatever you point at: much higher resolution than a real-time depth sensor. We can also use projectors with different lenses, so you can zoom in and out on the projector’s lens or use differently lensed projectors, and we’re capable of scanning any scene from three feet to infinity. You can scan a coffee cup, a stormtrooper mask, or a building across the street.

So you’re projector agnostic? Yes: any normal-throw projector is what the LF1 supports. Projectors have a wide range of throws; could you mount this on something small, like a pico projector you have behind your television, and would it still work fine? Yes. It’s actually designed to be mounted with adhesive, so you literally peel off a sticker, plop it onto any projector, top or bottom, plug in HDMI and power, and you’re good to go: you’ve turned your projector into a 3D scanner. What about the calibration between the camera and the projector? We do that all automatically using computer vision. We’re all PhDs in computer vision and design, and we spent two years building a system so you don’t have to worry about any of that; we do it for you.

All right, the scan seems to be done; that went much faster than a couple of minutes. What does the computer know about this world now? On Phil’s screen you can see that we have the scan: the color information, and the depth behind it. So we have a projector-resolution depth and color scan, and we’re going to use that to outline an object in the scene and apply a magical effect to it. You were saying projector resolution, so the higher the resolution of the projector, the better the fidelity of your scan? Correct, up to 1080p; the hardware is designed to be cheap, so we support up to 1080p. So there’s no understanding of what the scene is, that this is a helmet or a head or a box; you’re manually masking. What tools does the system give you to help with that masking? We have a bunch of the easy things you’d expect from the Adobe suite: a pen tool we spent a lot of time on for quickly outlining objects, where you just trace an object, and a quick-select tool, a magic wand like Photoshop has, but instead of just color we use depth as well. It’s not just looking for outlines or dark shapes, so you could select a white coffee cup on a white table, using depth instead.

OK, so they’re selecting the outline here, picking out those shapes; you can tweak and refine your mask, I assume, and then with that mask put a photo or a video on this? You can do images or videos, so you can use your existing assets, but what we’re really excited about are these projected AR effects: intelligent effects that actually use the color and the 3D scan of the scene. This one is wiping in depth through the object; we call it digital fade. There’s a whole library of these effects.
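Brett’s “magic wand, but with depth” idea can be sketched simply: instead of growing a selection over similar colors, grow it over a smooth depth surface, so a white cup separates cleanly from a white table. A toy version in NumPy; the function name, tolerance, and scene values are invented for illustration, not taken from Lightform’s software.

```python
import numpy as np
from collections import deque

def quick_select_depth(depth, seed, tol=0.05):
    """Flood-fill selection using depth rather than colour: grow a mask from
    `seed`, adding 4-neighbours whose depth is within `tol` metres of the
    pixel they were reached from. Smooth surfaces are absorbed; depth
    discontinuities become the selection edges."""
    h, w = depth.shape
    mask = np.zeros((h, w), dtype=bool)
    queue = deque([seed])
    mask[seed] = True
    while queue:
        r, c = queue.popleft()
        for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if (0 <= nr < h and 0 <= nc < w and not mask[nr, nc]
                    and abs(depth[nr, nc] - depth[r, c]) <= tol):
                mask[nr, nc] = True
                queue.append((nr, nc))
    return mask

# A white cup on a white table: identical in colour, distinct in depth.
scene = np.full((20, 20), 1.00)   # table plane at 1.0 m
scene[5:15, 5:15] = 0.80          # cup region 20 cm closer to the sensor
cup = quick_select_depth(scene, seed=(10, 10))
```

A click at (10, 10) selects exactly the 10×10 cup region: the 0.2 m step to the table exceeds the tolerance, so the fill stops at the rim even though every pixel is the same color.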
These effects run in real time on the device, on the computer that is the Lightform LF1, not compiled and exported as video. These are shaders, right? Yes, shaders running on the device in real time, and we can control them: you can go over here and adjust different parameters of the effects. Then Phil will show you mapping: we have this Tested logo here, and Phil will use the pen tool to map the Tested logo and apply another effect. We’re publishing that right now, so it’s saving the project file, transmitting it to the Lightform, and playing it back into the real world.

So that’s the workflow: you’re creating your mask, adding an effect from your library of effects, maybe adjusting some variables, stacking effects if possible, and once you hit publish it all sends to the Lightform; you don’t need a connected computer? Correct. The coolest thing is Phil could close his laptop and walk away, and we’ve just made permanently deployed magic. It all runs locally. So what’s inside here is essentially a small computer? Yes: HDMI output and a quad-core A9 processor, designed to be permanently installed. If you’re an AV installer, it’s a fanless design with on-board storage, so it’s not like an SD card that’s going to fail over time. The idea is computer vision hardware that makes it really easy to create compelling scenes in seconds, but that you can then permanently deploy in an art gallery, in your house, or at a retail store.

Now, because you’re running real-time effects on here, potentially combined with static images and videos, what’s the load on this machine? Do you lose framerate at some point; is there a maximum complexity? It’s very easy for an artist to throw things at a computer to make it run slowly, because it’s basically a mobile-phone processor, so what we do is intelligently optimize the scene. If you create a bunch of really complex effects, like a particle system, we’ll actually render that to a video file and send the video file to the device, so you can have a mix of passive and interactive content running on the device without having to worry about optimizing your content; we do that for you.

This is super cool. I know this is a demo you’ve run before, but the whole thing, scan to finish, was just a couple of minutes, and it looks really neat. So we actually had you bring in an object. Yes, now we’re going to map an object that you chose, not us, and I want to get my hands on that software. Definitely; check it out. So we brought something from our office: the Zorg ZF-1 from The Fifth Element. Nice! A static prop, pretty complex, and Phil, you’ve already scanned this? We have. We have our projector scan as our background here, so it’s really easy to go in and use it as a reference as we trace the outlines. We can do a quick selection, just selecting all the gray areas, and then convert that to vectors. In this case we went ahead and created a creative mask ourselves; it took only a few minutes. If we want to interactively edit this mask, I can stream the preview to the device, so I have a crosshair now that’s out in the real world, and if I wanted to tweak some of that barrel right there, I could get it really precisely aligned.

So everything that’s bright here is within the mask? Yes, we’re projecting white to represent the mask of the area you’re going to be mapping. What’s really handy is this interactive crosshair: as I’m editing these points, I can see exactly where they lie in the real world. We can bring this point in a little, bring that point out a little, and we have sub-pixel accuracy here, so we can scroll way in and get down to our Bezier handles. That’s when you really want it nice and precise, but for most applications you can just do a rough selection. And that’s going to be important, because the angle of your projector will be different and the shadows will overlap differently; you want to fine-tune that.

If I shift this, though, you’ll need to rescan? Correct, but by vectorizing the mask outline it’s very easy to select the whole shape and do a transformation on it, move it around. On our roadmap we have what we call auto fine-tune, which will account for small shifts in projector or object position. Typically, when an installation runs for a while, things heat up and cool down, buildings move a little, and you notice that five-pixel offset; with computer vision we’ll actually be able to solve for that. And that’s for lateral movements, fixed planes, or scale changes; if I do a big rotation, or even a slight rotation, I’ll need to tweak the mask manually? You will want to go in, if there are significant changes to the scene, and tweak your Bezier handles a little. But what’s nice about our instant effects is that you don’t have to recreate the underlying content: you just apply it again, and with the new scan and the new underlying projector pixels that content works right away, exactly as you’d seen it before. Correct.

So this one actually has some nice depth data. That’s cool. What I’ll do is bring up our effects pane and use a depth trace effect, an effect that uses the underlying depth data. I’ll go ahead and insert that, and then we can publish it.
You see a preview of the effect playing on that same depth data, the same visual data, the edge data? Correct, and in our software we also have a preview tab, which lets you see a more photo-accurate representation of what the scene will look like while you’re still in the software. Let’s also give our favorite, digital fade, a try; we’ll switch back to our color there and publish that. And what can you do after you’ve published the scene; what kind of tweaks can you make to this animation? I actually have this MIDI controller hooked up: we support OSC, as you saw before, MIDI, and other kinds of input/output devices, so I can interactively adjust the speed and color of this effect. I can change it to more of a yellowish or a reddish, and speed it up or slow it down a little. We could spend a fair amount of time mapping the various components and portions of this device, and we’d end up with quite a few effects.

We also have stock video integration. Everything here so far has been rendered, but let’s see what video looks like. I’ll go ahead and import a label into our software, and now we’ve got what you’d expect from a traditional mapping tool: I can adjust our mask, I can adjust the underlying shape or structure. If I wanted to add a little callout, say we’re doing a museum exhibit for our collection, I can go ahead and map that. Luckily I have this nice grid on the pegboard there to help me align it; otherwise mapping text is a pretty difficult problem just to get your alignment with the scene right, but having that underlying scan is really handy for this. And the transforms you can do go beyond just skew? Yes, I can also scale the content, and we have mesh warping, so if you had a cylindrical or spherical object you could take your content and warp it around that very easily.

Seeing text there, it occurs to me that dynamic content would be really interesting. Right now it’s effects, which render in real time, and imported images; what type of dynamic content could you potentially put on this? We’ll have a fair amount of social media integrations, so you could display your Twitter feed; say you’re a business and you wanted to display a hashtag, you could have that integrated and dynamically populated from your hookups to your different social media accounts, and you could have effects and video content running simultaneously. It’s also super cool. Do you mind if I try my hand at it and create some effects? Yeah, please. I might scoot this label up a little for you so we can see it better, but absolutely.

So Brett, I had a chance to use a little bit of the software after Phil gave me a demo on the LF1. It’s definitely intuitive and fun, but I can also see that you’re iterating fast on this. I know you launched the pre-order campaign; where are you at in terms of the software and the hardware? We launched three weeks ago, and you can go online to reserve your Lightform; it ships in November. We actually already have final hardware, it’s FCC certified, and we’re shipping pallets of hardware right now to a storage facility. What we’re doing is working with select early adopters, in a program that’s already sold out, to refine the software and get the process really smooth and completely bug-free, and then we’ll be shipping out 2,000 Lightform units in November. Wow. Is software where you see most of the iteration going forward? Yes: in both user
experience and capability. Definitely. The hardware stays the same, but we have a free, a pro, and an enterprise version of the software, with monthly updates, so we say the hardware gets smarter every month because we’re pushing updates, and that includes the free version. What would you say is on your feature list, your priority list, for things that get to users? Everything we showed you in the demo, outside of the stock video, is actually in the free version; the hundred thousand searchable stock videos are in the pro version. We’re also adding support for multiple projectors: being able to synchronize a timeline across multiple projectors. Just for multiple displays, or on the same object? For example, this globe: the projector hits this face, but there are a bunch of other faces I may want to project on; can you combine them? We’re starting with separate scenes, so in the office you could have all of these demos on a unified timeline, with triggers from something like an iPad app, so they’re one experience; down the road we’ll eventually look into multi-projector support. But what we want to keep is the authoring process really easy, and if you noticed, it’s 3D data but the authoring is in 2D, and that’s what makes it really simple.

Awesome. Well, thank you so much, Brett, for having us here; this has been super fun, and I never thought we’d see projection mapping done so easily with a device like that. So we just scanned this object as well, and, just showing off, here we have an iPad app where I can change the size or the color of the effect: the effect is running in real time on the hardware, we created it in, whatever, ten seconds, and now it’s permanently deployed. I know a lot of your target customers are retail and location-based experiences, but I want this in the home. Well, that’s why we built it: we want it in our own homes too. Awesome. It was a pleasure. Really nice to meet you; thanks.

OK, Jeremy, was that what you expected? It was exactly what I expected. Of course, now that I’ve seen it I can think of what I want next, but what it is, is exactly what I hoped for. I think this fulfills the request I had a year ago, where you could take one of these, aim it at your house at night, and do some pretty interesting mapping. The thing I took away most from chatting with Brett and Phil is how difficult this was previously: why we don’t really see it in this form, why we only see it in places like retail and amusement parks and big setups against the sides of buildings, and why it’s not in the hands of people like you and me. Traditionally, doing good projection mapping has taken multiple disciplines, shoehorning all of these technologies together in a way that’s effective but very specialized. I don’t want to do that coding for the meshes and the geometry and bending that light around; I want to focus on the animation and design, and a lot of the effects they showed us made that really simple.

Now, we were only there briefly and tried a few things: the ZF-1, the globe, and it was effective even for flat surfaces with outlines. Those animations are cool. We didn’t get to see the entire range of animations they’ve built in, though, and I wonder what it would take to make your own. I was really intrigued by that too. All the animations are shader based, and there’s a tool out there called Shadertoy, a website that’s basically a demonstration of what you can do with shaders, where you can contribute your own, and there’s an enormous amount of possibility in that kind of programming. They were saying that eventually you should be able to do just about all of that on this device, maybe limited by the processing power; it is a mobile chipset. And the objects we saw were small; we didn’t get to see what it would look like if they used a Lightform on the side of a building, or on a car, for example. If you’re throwing the projector a longer distance, with a bigger, wider spread, does that take more computation? Or how many effects are you running at the same time? We did see framerate drops with multiple effects going, and they don’t cap that right now; they’re trying to develop these effects with a sweet spot in mind, but it will be up to users to look at the final effect and see whether it meets the framerate they want.

For the examples they had, even at weddings, lighting on the wedding cake, or new signage, those look cool; those are practical applications. I was thinking I want to put this on our set: I want to use it against the Tested backdrop, against those hard edges, and light it up. Or for filmmaking; I could see filmmakers using this. The cool thing about doing stuff like that is you don’t have to use the whole throw of the projector; you don’t have to fill in all the gaps and put effects everywhere, so you don’t see the edge, the framing box you’d normally see when you shoot a projected image against a wall. Exactly. Just doing subtle things, like having something move around in the background of the set, even that would be very effective, and if you do it right you can’t tell it’s projection mapping; you look at it and think that’s a light on that surface. Right, as opposed to putting up an LED string: it’s real light that bounces off and adds fill light when a person is standing in front of it. It’s like watching Alien or Blade Runner, all these science fiction films where you see a lot of projection mapping used in that futuristic world; this can happen in the real world.

I love that these are shaders, because it means it’s real time, and it means you can have external switches or sliders or any kind of input affect the effect, whether that’s the actual content, if it’s text, or the color, or the rate, or anything like that. It’s baby steps toward something highly interactive, whether you’re in a space where people are walking through, or maybe it’s fed by web content, or eventually maybe a video game. Well, that’s the thing: dynamic content is what they don’t have yet, though they talk about adding Twitter feeds or text in the future. You want a computer interface. I want them to turn this ZF-1 into my monitor: I want that globe, any shaped object, to be a mirror of my desktop, because then I can add interactivity, put content on it, and have it change based on how I interact with it. And are you also talking about wanting to move the object around? Well, that’s the other thing: they explicitly do only static objects right now. They talked about how, when they first started as a company, they wanted to track objects in real time, but that’s an incredibly difficult computational challenge, and it’s not what their product is aimed at; of course I want that in the future.

So, very cool: Lightform. It’s available for pre-order, and they said they’re shipping later this year. We’d love to get one in and use it on our set. It really opens up possibilities in what you can do with art and creativity using light. Thanks for watching, and I will see you guys next time.
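The “zeros and ones sent through light” that Brett describes is classic structured-light scanning: the projector flashes a short sequence of black-and-white stripe patterns, and the bit sequence each camera pixel observes spells out which projector column that pixel is looking at. Gray code is the usual encoding because adjacent stripes differ by a single bit, so a misread at a stripe edge is off by at most one column. Here is a toy one-dimensional simulation; the pattern count, column count, and simulated scene are invented for illustration, not details of the LF1.

```python
def gray(n):
    """Binary-reflected Gray code of n (adjacent codes differ in one bit)."""
    return n ^ (n >> 1)

def gray_inv(g):
    """Invert the Gray code by XOR-folding each bit with all higher bits."""
    n = 0
    while g:
        n ^= g
        g >>= 1
    return n

BITS, COLS = 4, 16  # toy projector: 16 columns need only 4 stripe patterns

# One black/white stripe pattern per bit plane (what the projector shows).
patterns = [[(gray(c) >> k) & 1 for c in range(COLS)] for k in range(BITS)]

# Simulated scene: camera pixel p happens to see projector column view[p].
view = [3, 3, 7, 12, 12, 15, 0, 9]

# What the camera records for each pattern, then the per-pixel decode.
observed = [[patterns[k][c] for c in view] for k in range(BITS)]
decoded = [gray_inv(sum(observed[k][p] << k for k in range(BITS)))
           for p in range(len(view))]
```

Once each camera pixel knows its projector column (and, with a second pattern set, its row), triangulating the camera ray against the projector ray yields depth at projector resolution, which is presumably why a 1080p projector gives the LF1 a 1080p scan.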

Anonymous

14 comments

  • Automating the mapping is awesome and mathematically very interesting. I was a bit disappointed, though, by the interviewee’s answers, as he often just said something general about the product that was only vaguely related to the question.

  • Oh cool, they REINVENTED the Kinect! Any technology from 10 years ago, they can claim THEY invented it and Tested will believe them. How much would it cost to have Tested claim that Tesla invented the wheel?

  • Software looks great!
    An alternative is MadMapper, which is very like this software, maybe a bit less plug-and-play, especially with the camera, but also cheaper in some ways 🙂

    I will definitely try Lightform out some day!

  • For those curious, it’s $699.99 for the Lightform LF1, and $1499.99 for a bundle kit that includes the Lightform LF1, a 1080p projector, and some other things like a tripod.

  • Was really excited up until about 3 minutes in, when it basically becomes clear it’s just a medium-res webcam with some software… (edit: where, having watched the whole video, you apparently have to make ‘complex’ object selections yourself by drawing an outline? pff) I love how the guy tries to gloss over that fact with some marketing BS about stereo vision existing between the projector and sensor, but overall disappointed; it would’ve been much cooler if it included multiple cameras/sensors so that no calibration was needed for each change of the scene. Might even allow for moving scenes?

  • I can imagine presentations and instructional activities, once it becomes truly real-time interactive. Projection on white globes could be used to display geographical or planetary information. Projection on blank human figures could show anatomical details; a larger than life torso could show an operation with depth of field as opposed to flat video imaging. Architects could project renderings of a finished space on an empty room or large space. This will only get better.

  • I did not see them select the viewer position at any point, so it looks like this doesn’t compensate for actual curvature and 3D shape, it’s just a masking system for procedural effects, and they don’t even trust the automatic masks (he did the gun totally by hand), which makes it very “meh”. Everything it does can be done easily and interactively by hand (I’ve done it myself for several live shows, drawing the masks and effects while seeing them being projected, which is much more accurate – I did have to render the effects as movies, but once that is done you can play them off sub-$50 hardware, no need to leave a $700 device connected to each projector in the show).

    It doesn’t even seem to have any sensor inputs to change the effect based on external events (like people stopping in front of it, or objects being moved), so being able to render the effects in real time is kind of pointless. It’s effectively just recalculating the same frames over and over. Waste of power.

    I guess this might find a market in shop displays, but it’s not powerful enough for professional installations (also, the cost scales very badly with multiple projectors, since they expect you to have one per projector, whereas other systems use a few Kinects or webcams to cover the whole scene, and can then control dozens of projectors) and it’s kind of pointless for home use.

  • “Permanent onboard storage. Unlike an SD card that can fail over time.”
    So it only has onboard storage that can fail over time, then? And it can’t be replaced.

  • Is that a… Zorg ZF-1 in the background?
    Edit: Never mind, they acknowledge it in the video.
    For those who want to know, this is the weapon developed by the Gary Oldman character in The Fifth Element. It’s the coolest shit: it has a lock-on feature for homing bullets, a rocket launcher, an arrow launcher, a net launcher, a flamethrower (my favorite), and some sort of freezing gas called the ice cube system.

  • It’s 2018 and Jamie Hyneman still doesn’t have a channel about raising small birds in his mustache called “Nested”.