I agree, at least someone is making progress for humanity.
@rolceron3 жыл бұрын
I agree
@KIKURAsky Жыл бұрын
😊😊
@Rallion13 жыл бұрын
While the speed increase is impressive, the visual fidelity takes such a massive hit that I'm not sure the trade-off is worth it in this case. It's good for super-high-density weaves like satin, but for knitted garments IMO it just doesn't hold up.
@patrickquinn31673 жыл бұрын
100%. As a knitter and a programmer, it just looks wrong. It's clearly a 2D sheet and lacks depth at the edges, and for many knit patterns the gaps and stretch in the stitches are as important as the yarn locations. It's sitting in an awkward position, where the old method is probably better for high-fidelity loose, sparse knits, but a generic cloth simulation might be better for high-fidelity dense knits.
@helphelphelphelpmehelphelphelp3 жыл бұрын
Well, the previous one does look better, but if the fabric you are going for is dense enough you might get away with just using a displacement instead of actually simulating all of the individual strands.
@A11V1R153 жыл бұрын
Maybe it could be used so you render the important parts with the lengthier but more accurate method and the rest with this faster one, since he was talking about movie rendering.
@slaveNo-40283 жыл бұрын
Right, it almost looks like the knitted geometry is printed on; the previous results were much better and actually looked like knit fabric. Anyhow, I can't wait for more improvements in the future. I'm sure there'll be a way to get both good looks and fast processing in time.
@bolt73 жыл бұрын
Yeah, it looks a bit wrong, like an HD minecraft texture pack.
@陳以律3 жыл бұрын
I feel like the video glossed over too many details of the paper, resulting in people wondering what the point of all this is, or whether the authors were "cheating" in some way. Which is a shame, since there are some rather interesting innovations in the paper. So anyways, here's my take on a short(ish) summary of the paper. Hopefully this will answer some questions.

DISCLAIMER: I'm not any of the authors, so I might get things wrong.

Like Károly said, while simulating individual yarns for a piece of cloth looks great, it takes an insane amount of time to calculate interactions between individual yarns. This makes yarn-level simulation unfeasible if a) the threads are extremely small and/or dense, like a t-shirt, or b) the piece of cloth is really large.

Essentially, what this paper does to address these problems is that, instead of simulating individual yarns, it creates a 2D surface that mimics the behaviour of patches of knitted cloth. This is akin to, say, fluid simulation. In fluid simulation, we very rarely want to simulate each and every molecule, since that's impossible with even just a few drops of water. Rather, we use a 3D volume that mimics the behaviour of a bulk of fluid molecules. The key insight of the paper is that we can use these fluid-sim-like methods for knitted cloth (the technical term for this is "homogenization", hence the paper title "Homogenized Yarn-Level Cloth").

So how does this paper go about doing this? Since there are a multitude of ways to weave threads together into cloth, there's no one general physical model that can easily describe the behaviour of knitted cloth (if we used threads as the model, this would just be the 2012 paper all over again). The authors instead proposed the following two-step algorithm:

1. Start with a small piece of knitted cloth, bend it in different ways, simulate it with traditional yarn-level simulation, and record the results. A model is then fit over the results so that, given a deformation, it predicts the response of the cloth. This is illustrated in the video from 2:09 to 2:29.

2. Now, given a piece of cloth with some arbitrary shape, note that it can essentially be split into tiny patches. We can put each of these small patches into the pre-trained model above, and it will tell us how the cloth responds to force and deformation at every point. With this, there is now enough information to create a simulation.

This is of course glossing over a lot of technical details. If you observe the two steps above, you will notice an important advantage of this paper: since the individual threads are abstracted into a mathematical model in step 1, the resolution, and hence the total time, of the simulation in step 2 only depends on the resolution of the 2D surface representing the cloth, and not on any properties of the individual threads. Therefore, the threads can be as numerous or as densely knitted together as you please without affecting the performance of step 2. This is the key to how the new method improves the simulation speed so much, and how it simulates densely woven fabrics that were previously infeasible.

Now just to address two of the most common comments:

1. "How is this different from cloth simulators in, say, Blender or Houdini?" Mainly the first step. The way the model is fit is general enough to mimic many different weaving patterns, and this is a huge departure from conventional methods, where there is only one model with a few adjustable parameters.

2. "But it looks like a flat surface with a cheap texture on top!" Indeed. Since the individual threads are abstracted away in step 1, we no longer have information about the weaving patterns in step 2, and this is a major downside of this method. However, there is a silver lining: the model fitted in step 1 contains deformation information from the original yarn-level simulation. Perhaps one can use this to restore the missing yarns, say, two papers down the line? ;)

Obligatory "What a time to be alive!"
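The two-step algorithm described above can be sketched in a few lines of Python. This is my own toy illustration, not the authors' code: a made-up polynomial stands in for the expensive yarn-level simulator, and the "homogenized" model is just a least-squares fit to its recorded responses.

```python
import numpy as np

# Step 1 (offline): probe a small patch with many deformations and record the
# response.  Here a stand-in function plays the role of the expensive
# yarn-level simulator; in the real paper this would be hours of simulation.
def yarn_level_energy(strain):                        # hypothetical fine-scale oracle
    return 0.5 * 3.0 * strain**2 + 0.8 * strain**4   # made-up material response

strains = np.linspace(-0.5, 0.5, 41)                  # sampled deformations
energies = np.array([yarn_level_energy(s) for s in strains])

# Fit a smooth constitutive model (here: an even-powered polynomial) to the data.
A = np.stack([strains**2, strains**4], axis=1)
coeffs, *_ = np.linalg.lstsq(A, energies, rcond=None)

# Step 2 (online): the coarse sheet solver queries only the fitted model per
# patch, never touching individual yarns -- so its cost is independent of the
# knit density.
def homogenized_energy(strain):
    return coeffs[0] * strain**2 + coeffs[1] * strain**4

print(homogenized_energy(0.3))   # close to yarn_level_energy(0.3)
```

The point of the sketch is the cost structure: step 2 touches only `coeffs`, so making the stand-in yarn model arbitrarily denser would not slow the online query down at all.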
@TwoMinutePapers3 жыл бұрын
You are indeed right in many things here. Thank you very much for the feedback and your help! 🙏
@Ceelvain3 жыл бұрын
This comment needs to be much higher.
@benjaminmiller36203 жыл бұрын
Thank you for summarizing! I was indeed one of the people initially underwhelmed by the results. "You reduced the fidelity of the simulation, and it ran faster? Wow. /s" It's much more interesting in light of: abstracting the material properties of an arbitrary cloth geometry into a form a more traditional but faster cloth sim can handle. I'm sure that procedural texturing can restore/approximate some of the higher-fidelity yarn details from the model without too much extra computation.
@mostlyokay3 жыл бұрын
Thank you for the comment!
@DidierSurka3 жыл бұрын
@@TwoMinutePapers Please pin this comment. As a tech artist who has worked with his share of cloth sims, I needed this context very much.
@Wecoc13 жыл бұрын
0:43 We can make sheep that dress like Steve Jobs, what a world we live in
@pikachu-jf2oh3 жыл бұрын
Made me laugh
@DrGold-ks1mp3 жыл бұрын
What a time to be alive!
@iankelley93023 жыл бұрын
What a time to be alive!
@iankelley93023 жыл бұрын
@@DrGold-ks1mp Dangit, took my comment before I even typed it!
@DrGold-ks1mp3 жыл бұрын
@@iankelley9302 what can I say? We really are living in a science fiction world
@SirusStarTV3 жыл бұрын
I like that everyone here isn't blindly impressed by the new algorithm demo
@thedofflin3 жыл бұрын
@randomguy8196 Are you sure? How would you reproduce the original style from the faster method?
@yevgeniygorbachev51523 жыл бұрын
@randomguy8196 The issue is that 2MP is comparing a higher-quality algorithm with a lower-quality one. The comparison that should have been made is between ordinary cloth simulation in 2012 and now.
@notimelikethepresent47393 жыл бұрын
@randomguy8196 And 1/1000th of the quality...
@Potatinized3 жыл бұрын
@randomguy8196 I have to agree with you. With this method, we can get the simulation right, and THEN we can use it as a wrap deformer on the high-poly mesh. Saves TONS of time indeed. Still not as good as the direct method, but it's a good compromise if you know what you're doing.
@YusuphYT3 жыл бұрын
@randomguy8196 Yes, but why use yarn to demonstrate cloth, and not expect people to complain that you faked the yarn in a yarn cloth simulation? lmao
@ag360153 жыл бұрын
Congrats to the student and the teacher!
@Arckil3 жыл бұрын
But if the strings are not simulated one by one, this is just a regular cloth simulation... I would love to see a per-string simulation taking less time !
@ΚωσταντίναΒιτσιλάκη3 жыл бұрын
Yeah, there's nothing new about that. Just go make a cloth simulation in Blender and you'll have similar, if not better and faster, results.
@zetahurley2943 жыл бұрын
@@ΚωσταντίναΒιτσιλάκη You're missing the key fact: it's still taking the weave into account for how the whole cloth deforms, and of course it doesn't have the same small-point issues you get with Blender cloth simulations and similar. It's still taking the forces between individual strands into account in the simulation, even if it's not calculating the shape of each strand, just the whole.
@stemfourvisual3 жыл бұрын
@@zetahurley294 It is doing that, sure, but it's not so immediately visible as in the older examples. While this topic is of course about maths and computer science, it's also about aesthetics! :)
@zetahurley2943 жыл бұрын
@@stemfourvisual yeah he really should have included examples of normal cloth simulations also to show how it is different, because the way it folds over is incredibly different
@TheLegoJungle3 жыл бұрын
@@zetahurley294 Interesting nuance. Thanks for pointing that out.
@ALZulas3 жыл бұрын
As a data scientist and a knitter/sewer, I like the original version much better for knitted items. But I think for woven fabrics it will be incredibly helpful; stitch definition isn't really important for woven fabrics. But stitch definition gives knitting its character, which is lost in the newer version. I love it for the lightweight t-shirts; I think that's a real game changer.
@thomasrosebrough90623 жыл бұрын
Obviously the original version is going to be better, because it costs enormous time to produce. The point of this paper is not to produce a visually final product. This paper is trying to show that it's possible to simulate the behaviors of the cloth as a whole (the curls and tension lines) without simulating each individual strand. A paper down the line will hopefully show that there is a cheap way to introduce those characteristics back in, resulting in a pretty good approximation of the full simulation that's much faster.
@HansMilling3 жыл бұрын
The old method looks way more realistic with the knitted fabric.
@tincoeani95293 жыл бұрын
Yeah, when you have some 3D modeling experience it stands out a lot... For the new method, you can get pretty similar results with about any modern 3D software with cloth simulation applied to a model with a fabric texture. The "new" method feels like trying to fake hyperrealistic ray-traced renders through rasterized rendering cheats; it just can't ever be as convincing with current technology and without AI...
@AuxenceF3 жыл бұрын
yes but the new one is faster, its a tradeoff
@cptant76103 жыл бұрын
@@AuxenceF But it seems like just reducing the mesh density with the old method would probably provide similar results?
@Rem_NL3 жыл бұрын
@@AuxenceF There are real-time cloth physics in games from 8 years ago that do the same, and they don't take 2 hours to render. IDK what is so spectacular about this paper, honestly. Arkham Asylum 2009: kzbin.info/www/bejne/rWG9oWenprGgqNk
@tundrummax62213 жыл бұрын
@@Rem_NL Batman: Arkham Asylum doesn't do strand-based cloth simulation though, does it?
@EadsJasper3 жыл бұрын
I feel like this is a step backwards. The detail loss is too great to call this an advancement.
@xugro3 жыл бұрын
But for woven fabrics that much detail can't be seen so this method is a big improvement for those i think
@killerbug053 жыл бұрын
Yeah, it went from realistic to what looks like a flat PNG given similar physics.
@EadsJasper3 жыл бұрын
@@killerbug05 Exactly what i was thinking.
@xenotronia66813 жыл бұрын
@@EadsJasper this method is an approximation that saves an incredible amount of time while producing similar results (curving, not aesthetically)
@jimbynewchron29013 жыл бұрын
I have no experience in computer graphics, physics simulations, or anything outside of playing games and absorbing media with that technology, but seeing this kinda stuff advance never ceases to amaze me. I've seen a lot of people comment on how the newer way of simulating cloth is more fitting for woven fabrics. I didn't have a way to express it but I could tell that something was off with the newer one compared to the older one. I just hope at some point there's a new technique that comes up or a big improvement in computing power that makes it much easier to do the yarn fabric simulations. TL;DR I love these videos and I wish so many concessions didn't have to be made for progress, but maybe yarn simulations will be much better in the future.
@iGaktan3 жыл бұрын
Hey Károly, Sorry but from your video it wasn't very clear how this new method is different from traditional simplistic cloth simulation models that can run in real time
@homer31893 жыл бұрын
I am usually very impressed by progress, but the old method looks like real cloth and the new one does not.
@namenull73993 жыл бұрын
The new one looks like a 2d plane being deformed like cloth with a texture applied. If you applied raytracing or something could you get the transparency to come back?
@squidbad3 жыл бұрын
I don’t think you understood the method nor raytracing in general.
@games5283 жыл бұрын
The second half of your comment is complete nonsense. Ray tracing has absolutely nothing to do with any of this. You could use an alpha mask to get the holes back, or you could use the simulated mesh to deform a high-density mesh with actual yarn geometry.
@namenull73993 жыл бұрын
@@squidbad and i don't think you understand my question.
@namenull73993 жыл бұрын
@@games528 Who said anything about ray tracing being a part of this video? I was asking a semi-rhetorical question with a yes-or-no answer. The answer is yes: you COULD use ray tracing to make it look transparent as long as the texture has proper alpha values. It could also be used to make the fibers look 3D with the proper normal maps. FFS
@Teturarsyra3 жыл бұрын
@@namenull7399 Ray tracing does not matter here, because you can also simulate transparency with the standard rendering method (a.k.a. rasterization). The point of the method is that this 2D sheet behaves exactly like the *physically accurate* simulation, with the curls and all, but you only need a fraction of the computation time.

You could use many methods to improve the "final look", alpha transparency being the most basic one. You can use displacement maps and normal maps as well (again, this works in both ray tracing and rasterization), or better, use the 2D sheet (whose animation and folds are accurate) to drive a 3D mesh that represents the yarn. Because that mesh only follows the 2D sheet, it won't take much time to render, unlike the past method that simulates every single force and collision for each thread.

Research works like bricks: the topic being researched is always very narrow, and sometimes the authors do not even have time to produce the best possible results/renders (or the necessary knowledge of modeling and other tools, etc.). The goal is to unlock fundamental problems so that other people can go further. What I'm trying to say is that you often have to judge a paper by how it does things (algorithms, math, etc.) rather than by the result itself, so you have to extrapolate what could be done. It's like if I made a standard cloth simulation that runs twice as fast as a previous cloth sim but I had poor skills in lighting and the final video sucked compared to old methods; it would not mean the method is worthless or that it's not a major scientific improvement.
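The "restore the look afterwards" idea from the comments above can be sketched like this (my own illustration with made-up data, not from the paper): displace a dense render mesh along the simulated sheet's normals using a tiled stitch height map, so the cheap 2D sheet drives the motion and the knit pattern is added back at render time.

```python
import numpy as np

# Displace render-mesh vertices along the sheet's normals, sampling a tiling
# "stitch" height map at each vertex's UV coordinate (nearest-neighbour).
def displace(verts, normals, uvs, heightmap, scale=0.02):
    h, w = heightmap.shape
    px = (uvs[:, 0] * w).astype(int) % w
    py = (uvs[:, 1] * h).astype(int) % h
    height = heightmap[py, px]
    return verts + normals * (scale * height)[:, None]

# Toy data: two vertices on a flat sheet with a 2x2 "stitch" height map.
verts = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
normals = np.array([[0.0, 0.0, 1.0], [0.0, 0.0, 1.0]])
uvs = np.array([[0.0, 0.0], [0.6, 0.0]])
pattern = np.array([[1.0, 0.0], [0.0, 1.0]])
displaced = displace(verts, normals, uvs, pattern)
print(displaced)   # first vertex lifted by 0.02, second untouched
```

In a real renderer the same lookup would happen in a displacement or normal-map shader rather than on the CPU, but the division of labour is the same: the homogenized sheet supplies positions and normals, the pattern only supplies surface detail.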
@squeakycamp2073 жыл бұрын
Wow, I didn't know that you work in Vienna! It's nice to see a great person like you work in my home country!
@awwkaw99963 жыл бұрын
If I were Pixar, I think I'd make use of this new method for cloth, but for knitted materials it doesn't really hold up to the old one. That might change in the next paper though.
@emrecelenli1613 жыл бұрын
Well, looks like they found a way to make fabric that looks like a texture. What a time to be alive.
@sebzos11103 жыл бұрын
This paper focused on movement, stretch and curls. It didn't focus on the whole area about fabric which is why it "looks" cheaper and more 2d but behaves like the real deal.
@GoulartGH3 жыл бұрын
Yeah, while I get the point of the new method, this mostly makes me wish for a hybrid one, where it uses the "cheaper" method for the parts without much movement and the expensive one for the parts where you'd notice the pulling effects on the individual yarns. And some of the complex geometry of the expensive method could perhaps be re-added onto the new one via displacement maps.
@netyimeni1693 жыл бұрын
@@GoulartGH exactly what i thought
@insidejaysskull3 жыл бұрын
Wonderfully beautiful papers, Dr.! I love CGI, CGI animation and seeing technology progress. It's amazing what has happened in the past 20 years; can't wait to see the future!
@Uxcis3 жыл бұрын
Why are so many people badmouthing this paper? This is amazing! Look at the time saved: even if the graphical fidelity is a bit worse, it's still a huge improvement in terms of computation. And the CURLS!!!! Amazing.
@RedFire7892 жыл бұрын
The old method is better for detail, for movies etc., but the new method is good if you just want to render a quick animation with some cloth and stuff. But tbh I like the old method because of how realistic it looks compared to the new one.
@Extys3 жыл бұрын
The old method looks way better!
@john_hunter_3 жыл бұрын
So you could probably generate the yarn along the cloth mesh. That way you can preserve the holes & the 3D look while having a faster simulation.
@abdoudjam68463 жыл бұрын
It's insane how fast video game engines adapt to new computing techniques to better simulate cloth, grass and other materials, giving the casual player the satisfying feeling of real-world interactions. Respect!
@ESPlover7073 жыл бұрын
What a time to be alive. The virtual world is going to be indistinguishable from the real world and that’s really cool and really scary at the same time. Great video. Great work.
@igg55893 жыл бұрын
Actually, I do not see anything amazing in those faster versions on the balls (02:40). There are no knitted yarns, only a simple mesh with a yarn texture applied. You can do this in almost any 3D app in near real time nowadays.
@Tgungen3 жыл бұрын
It's weird to think that a couple of years from now, people will find these numbers hilarious.
@内田ガネーシュ3 жыл бұрын
Man, this simulation was impressive. Also, having your teacher say your name on his channel makes it so much more awesome.
@cineblazer3 жыл бұрын
I really love this channel. One of my life goals is to write a paper one day that makes it onto this channel. Thank you for your wonderful work, Dr. Zsolnai-Fehér!
@MatiEP093 жыл бұрын
one of the best channels! I really want to see more content.
@nicko31513 жыл бұрын
As I work in 3D modeling for games, I would love to see this level of cloth simulation in engines.
@davidebic3 жыл бұрын
Maybe in... 15 or 20 years? I find it hard to imagine sufficient computing power any earlier. We might start to see more significant clothing simulations in 4 or 5 years instead of pre-rendered ones, but I doubt they'll be anywhere near this level.
@David-un4cs3 жыл бұрын
That Unreal 5 demo impressed me with the cloth physics on the character. Obviously they aren't as complex as this, but it made a difference for sure.
@avatarion3 жыл бұрын
Consoles don't have the processing power to do this.
@mdyusoof7863 жыл бұрын
One of the best videos on KZbin
@augurelite3 жыл бұрын
AMAZING! Congrats to Georg
@runforitman3 жыл бұрын
I look forward to the day beautiful cloth physics, deformation and such are viable in VR. It would be so much fun to play with.
@kiachi4703 жыл бұрын
Exciting and Amazing Paper looks amazing
@nixel13243 жыл бұрын
I think a possible next step for a future paper would be to map a 3D mesh of a knitted pattern to the simulated flat mesh of this new simulation. It still wouldn't (automatically) bring back the stretching gaps, but would still bring back a lot of visual detail.
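That mapping idea can be sketched as a tiny wrap deformer (hypothetical, not from the paper): bind a point of pre-modelled stitch geometry to a triangle of the simulated sheet using two edge coordinates plus an offset along the normal, then re-evaluate it after the triangle deforms.

```python
import numpy as np

def bind(point, a, b, c):
    """Express a point in a triangle's local frame (edge coords + normal offset)."""
    e1, e2 = b - a, c - a
    n = np.cross(e1, e2)
    n /= np.linalg.norm(n)
    # Solve point - a = u*e1 + v*e2 + w*n for the local coordinates (u, v, w).
    return np.linalg.solve(np.stack([e1, e2, n], axis=1), point - a)

def deform(coords, a, b, c):
    """Re-evaluate the bound point against the triangle's current vertices."""
    u, v, w = coords
    e1, e2 = b - a, c - a
    n = np.cross(e1, e2)
    n /= np.linalg.norm(n)
    return a + u * e1 + v * e2 + w * n

rest = [np.zeros(3), np.array([1.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.0])]
coords = bind(np.array([0.25, 0.25, 0.1]), *rest)       # stitch point above the triangle
moved = [p + np.array([0.0, 0.0, 2.0]) for p in rest]   # the sheet translated upward
new_point = deform(coords, *moved)
print(new_point)   # the stitch point follows: [0.25, 0.25, 2.1]
```

Applied to every vertex of a tiled stitch mesh, this would make the knit geometry ride along with the fast sheet simulation, though (as the comment notes) it would not restore the stretching gaps between stitches.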
@mamborambo3 жыл бұрын
I learn something new every time I tune in. Well done!
@samp-w74393 жыл бұрын
I have to agree with a lot of the other comments here. The yarn-level simulation looks much better for knitted fabrics, I would much prefer that based on the knitted examples shown, despite the time.
@DurianFruit3 жыл бұрын
Simulated cloth is so fun to look at!
@Lttlemoi3 жыл бұрын
Regarding how the simulated cloth looks, perhaps an additional step can generate the actual geometry of the woven threads in order to regain the look of the older simulation.
@notapplicable72923 жыл бұрын
Honestly, this just looks like a yarn texture on a cloth simulation.
@roryhiggins11463 жыл бұрын
I’m not even a computer programmer but I still love to watch these
@chaosfire3213 жыл бұрын
2 more papers down the line, we'll get it down to seconds. A year or so later, Unity's gonna add it to their engine. What an amazing time to be alive!
@HSE_VO3 жыл бұрын
Amazing video as always man!!!
@TheEnde1243 жыл бұрын
imagine getting a shoutout from your professor like this
@chubbymoth58103 жыл бұрын
Wow... just wow. I can imagine games using this method in some years' time. For most purposes, knitted garments can probably be simulated with some bump mapping instead, but these methods should take care of a lot of objects protruding through agents.
@Phyligree3 жыл бұрын
It's really wonderful to see you support your former student!
@wariowashere70173 жыл бұрын
Cloth moment
@shmookins3 жыл бұрын
2:55 When will we get games with that detail? 2040's?
@abraxas26583 жыл бұрын
I really hope the next paper combines the previous two somehow, calculating out the correct overall shape and distant forces and then calculating yarn-level locally using estimates of distant forces
@duxcrassus10403 жыл бұрын
somebody show dream’s animators these papers
@KIKURAsky Жыл бұрын
this is so cool and realistic i love it
@M33f3r3 жыл бұрын
Yarnmadillo smooshing is comfy looking
@kuhantilope89703 жыл бұрын
Huh?! This new method just looks like a regular subdivided plane with a texture slapped on it? Cool that it renders faster, but it's just not the same thing anymore. There are no individual strands...
@lebro44013 жыл бұрын
I think the focus is on how the plane reacts similarly to the knitted one.
@kuhantilope89703 жыл бұрын
@@lebro4401 But you can simulate this in Blender in like 1 minute with the same result 😅 (At least it looks like that to me :D)
@charlieinabox11643 жыл бұрын
3:07 The settling of the fabric was jarring, haha, very unnatural at that moment. 3:13 Visually I prefer the original method that took longer; a lot of the beauty of the wool strands interacting was lost in the new method (looks like you agree). Hopefully they can build on the faster method and capture some of that lost detail. 5:53 Soccadillo. This stuff is so exciting and these videos are always great! Thank you!
@SharkWrestler3 жыл бұрын
It's hard to believe, but eventually we will see video games with physics like this. I'm waiting for a medieval setting with stuff like this. Amazing.
@ericdunthorne19813 жыл бұрын
The aesthetics of the new techniques are very bad compared to the old. The old looks real; the new looks CGI. If the goal is to look real, the new ones fail. The shorter times are irrelevant if the goal isn't achieved. That's my opinion on the topic.
@echodff3 жыл бұрын
Me: Never go to College Two Minute Papers: Hello fellow scholars Me: *giggles in stupid*
@dxnxz533 жыл бұрын
i love this channel
@Beregorn883 жыл бұрын
This paper was clearly cheating: they used a single set of parameters for the whole sheet and pasted the texture on after the computation, instead of computing all the thread interactions. You can clearly see the stretching and deformation across the width of what should be single threads... It's just a rubber sheet with a cloth texture; I fail to see what's innovative in this one.
@GoldSabre3 жыл бұрын
This is the next step in Strand-type gaming
@OG_CK20183 жыл бұрын
1 Million on the way!
@TimeKitt3 жыл бұрын
It might have been good to shift the focus to the overall fabric shape rather than the knit looks. I know this is a big step, and adding that geometry back in might be a small step, but I can't emotionally accept it as much after seeing all the beautiful knit patterns on the left that drew my eye. Also, are these the original render times for the previous method, or modern render times for the old method?
@frostburnspirit90653 жыл бұрын
For me, how much worse the new one looks outweighs how much faster it is.
@dreamingwanderer11243 жыл бұрын
The newer one looks more like paper, even for the lower-density weaves where you can see the large strands individually. It looks more like they applied a texture to a shape rather than simulating the patterns.
@umblapag3 жыл бұрын
Disclosure is probably a better term than disclaimer for this type of situation.
@niklaskarlsson2363 жыл бұрын
Hi! Is there an audio AI, already trained with feedback, that takes a bad recording of a voice plus a good recording of the same voice (but not of exactly the same read text) and outputs a clean version of the bad recording? :-)
@baumkuchen65433 жыл бұрын
Was the rendering for the new and old methods done on the same hardware, or is the old method old footage? It would be good to mention that as well.
@gge60213 жыл бұрын
I really Like These Videos :)
@hacked21233 жыл бұрын
They need to work on a foveated renderer for portions of the scene where object interactions are expected (any point where the torque is in opposition to gravity).
@tattoomaniacsalina3 жыл бұрын
This is the first time that I've seen the "previous method" look way better than the "new method"
@scientistpac3 жыл бұрын
I don't understand the changes. I really prefer the old method even though it took a lot of time to compute. It really feels like a standard Marvelous Designer simulation atm. Can't wait for the next paper!
@c2ashman3 жыл бұрын
Are these times normalized to the machine capabilities of the papers from 2012? I can't find any information in the paper. How did you calculate these speedups?
@AClownsWorld3 жыл бұрын
big congratulations to your student :D
@TheLegoJungle3 жыл бұрын
3:33 This seems like an unfair comparison. The new method's cloth looks clearly 2D and "fake"; you can't make out the strands. The old one takes forever to compute, but the cloth's fidelity (the actual model) is amazing.
@sageelliott35582 жыл бұрын
Imagine if all the simulations like this - yarn, cloth, fluid, light - got so fast that we could combine them together and make a super simulator. How cool would that be?
@Mrdestiny173 жыл бұрын
This is awesome. I have access to a pretty good render farm, but I can't use it for personal use.
@goblinslayer63753 жыл бұрын
What a time to be alive?
@Kargoneth3 жыл бұрын
Yes. He seems to have missed his catch phrase.
@malfattio28943 жыл бұрын
I think the implication is that displacement maps or something similar could be used to add more overall detail to the newer algorithm
@Great.Milenko3 жыл бұрын
Don't get me wrong, this is awesome work, but I'm still not sure the benefits outweigh the still fairly heavy simulation times. Compared to a traditional cloth sim, the overall effects don't seem particularly noticeable. I guess it would be handy in very particular use cases, but generally the output isn't hugely different from a regular cloth sim.
@Jonathan-ex3sl3 жыл бұрын
Amazing stuff!
@Virsconte3 жыл бұрын
Can you somehow use the output of the new technique to initialize the more computationally intensive version?
@Merthalophor3 жыл бұрын
How are these times calculated? Since 2012, hardware might well have sped up by this much on its own. Unless it's standardized, I wonder how useful the comparison is.
@sammikinsderp3 жыл бұрын
Really impressive!
@HeortirtheWoodwarden3 жыл бұрын
In 50 years we'll probably be able to simulate material behavior even better than this and in real time to use in our fantastic virtual worlds.
@blengi3 жыл бұрын
How well do methods do that model fabric more like an intelligent material, instantiating some kind of AI shader per fabric unit type and element to locally infer a trained physics-like response, rather than using a more direct and rigorous physical simulation?
@BadMathGavin3 жыл бұрын
3:26 It's really 46.22 times faster. You added nearly 30% onto that value to push the approximation to 60x, just because 59/1 is almost 60 lol
@DiegoJ10233 жыл бұрын
I was expecting the new method to look much better
@hollandbnu3 жыл бұрын
The tech is finally here so we can have a Babushka Simulator!
@andy-kg5fb3 жыл бұрын
Yarnmadillo go!!!
@welsthe3rd3 жыл бұрын
Whoa!! 2 more papers down the line and the title will be "Life is a Simulation"
@pluto43013 жыл бұрын
Real life cloth looks so much like this!
@eccentricity233 жыл бұрын
Why not use an algorithm that resolves higher detail in the areas that need it, like the first paper, and treats lower detail areas like the second paper? Just like how fluid simulations vary the size of the particles depending on where detail is needed.
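A minimal sketch of that adaptive idea (purely illustrative, not from either paper): estimate how hard each patch is deforming with a cheap proxy, then dispatch only the strongly deforming patches to the expensive yarn-level solver.

```python
import numpy as np

def edge_strain(rest_len, cur_len):
    """Relative length change of a patch edge -- a cheap deformation proxy."""
    return abs(cur_len - rest_len) / rest_len

def choose_methods(strains, threshold=0.1):
    """Route each patch to the expensive or the cheap solver by its strain."""
    return ["yarn-level" if s > threshold else "homogenized" for s in strains]

rest = np.array([1.0, 1.0, 1.0, 1.0])       # rest edge lengths of four patches
current = np.array([1.02, 1.25, 0.93, 1.4])  # current edge lengths
strains = [edge_strain(r, c) for r, c in zip(rest, current)]
print(choose_methods(strains))   # ['homogenized', 'yarn-level', 'homogenized', 'yarn-level']
```

A real hybrid would also need to stitch the two solvers together at patch boundaries, which is the genuinely hard part this sketch ignores.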
@tmsgaming59983 жыл бұрын
I have been watching your videos for a while, and I might just be missing something big, but I still don't know what you mean when you say this sim took 1 hour. What is the baseline for the computational hardware, and is it the same as it was years ago? If the hardware gets better, even old simulation software that took days could take hours on a new setup. How are these things measured?
@ChinoxBoi3 жыл бұрын
They all look nice; I see them as different types of the same material.
@OrderedEntropy3 жыл бұрын
*me just waiting for the day someone makes all the simulation tech out there into a fully fledged open-world game*
@nutzeeer3 жыл бұрын
so lets use the mesh that retains the physicality and marry it with the previous geometry!
@pepsakdoek10293 жыл бұрын
On the "visible yarn" simulations I think this one loses too much fidelity to really be worth it, but if you use satin or other thinner yarn materials I think this is perfect.
@TheLoy713 жыл бұрын
grr, want to have this in games. Would put an end to clipping
@axiezimmah3 жыл бұрын
In a couple of years we'll have stuff like this in real time. Imagine the potential for game physics and movies.
@hardware643 жыл бұрын
So it's faster by taking out all of what made the first method amazing in the first place