‼️ New rayrender update! v0.33.0 introduces instancing with create_instances(): copy/paste an existing scene (either a single 3D model or a collection of objects) and translate/rotate/scale it to a new location! This allows you to generate extremely large scenes with a low memory footprint. For example, here's 4,000,000 dragon models with about half a million vertices each: this would take about 150 terabytes of memory if loaded raw!
1/2 So this small, rather unassuming render is kind of insane...
Earlier today, I posted a video showing an animation where each of the 435,545 vertices was itself a complete Stanford dragon--cool, instancing! However, when I implemented instancing in #rayrender, I made sure to do so in a way that supports nested instances: e.g. you can create an instance, then instance that instance, and so on. So why am I posting a 400x400 still image of a dragon again?
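The memory win behind these posts comes from instancing itself. This is not rayrender's implementation — just a minimal Rust sketch of the idea, with hypothetical `Mesh`/`Instance`/`Scene` types: the geometry is stored once, and each copy only stores a transform, so a million dragons cost a million transforms rather than a million vertex buffers.

```rust
// Minimal instancing sketch: one shared mesh, many cheap transforms.

#[derive(Clone, Copy)]
struct Vec3 { x: f64, y: f64, z: f64 }

struct Mesh { vertices: Vec<Vec3> }

// Real renderers use a full 4x4 matrix; translate + uniform scale keeps this short.
#[derive(Clone, Copy)]
struct Instance { translate: Vec3, scale: f64 }

struct Scene {
    mesh: Mesh,               // stored once, shared by every instance
    instances: Vec<Instance>, // one small transform per copy
}

impl Scene {
    // Resolve a vertex of one instance on demand (as a ray tracer would
    // during traversal), without ever materializing the copied geometry.
    fn vertex(&self, inst: usize, v: usize) -> Vec3 {
        let i = self.instances[inst];
        let p = self.mesh.vertices[v];
        Vec3 {
            x: p.x * i.scale + i.translate.x,
            y: p.y * i.scale + i.translate.y,
            z: p.z * i.scale + i.translate.z,
        }
    }
}

fn main() {
    let mesh = Mesh { vertices: vec![Vec3 { x: 1.0, y: 0.0, z: 0.0 }] };
    // A million instances is just a million small structs, not a million meshes.
    let instances = (0..1_000_000)
        .map(|k| Instance { translate: Vec3 { x: k as f64, y: 0.0, z: 0.0 }, scale: 2.0 })
        .collect();
    let scene = Scene { mesh, instances };
    let p = scene.vertex(999_999, 0);
    assert_eq!(p.x, 1_000_001.0);
    println!("instance 999999 vertex x = {}", p.x);
}
```

Nested instancing is the same trick one level up: an instance list can itself be the shared payload of another instance, multiplying copies while memory stays roughly additive.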
(Finally started working on a new rayrender feature that's been on my to-do list for a while: instancing! This video would have taken about 10 terabytes of memory if rendered raw)
🎇 We've integrated glossy reflections rendering into GI-1.1, based on our research "Real-time #Rendering of Glossy Reflections using Ray Tracing and Two-level Radiance Caching" 🎇
Our latest blog explains how radiance probes are combined with low-rate, denoised #raytracing for an efficient, high-quality solution in real-time apps like games.
Yesterday I took a pile of photos of a Voetstappenpad post (66 of them) and threw them into software to create a 3D model from them. The result is impressive!
The photos in this post are not real photos but the results of the renders.
I'll post a video next so you can see more detail.
Date Captured: Feb 12, 10:46 AM
Location: Laren
Lat/Long: 52.23 N, 5.22 E
Vertices: 25.5k
Since our son Emil was born last summer, I haven't had much time for personal creative practice, but during the Christmas holidays I had some fun getting into a virtual game console called TIC-80, thanks to #lovebytetcc.
So I did my first 256 byte demoscene production for the @lovebyteparty new-talent competition.
Today I released my first (two) #rust crates! 🎉
mesh_to_sdf generates a signed distance field from a 3D mesh, and its client lets you visualize the field to fine-tune the parameters (and more).
It's agnostic to your math library, so you can start using it right now with #bevy, #fyrox, or your favorite math library / game engine.
I couldn't have an SDF visualization without a raymarcher now, could I? It includes trilinear, tetrahedral, and funky interpolations. The only visualization missing would be marching cubes, I guess, but that's a story for another day.
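For readers wondering what trilinear interpolation does here: a raymarcher samples the SDF at arbitrary points, but the field is only stored at grid corners, so the value inside a cell is blended from its 8 corners. This is a generic sketch, not the mesh_to_sdf API; the `trilinear` helper and its corner layout are my own assumptions.

```rust
// Trilinear interpolation inside one cell of a sampled SDF grid.
// `c` holds the 8 corner distances, indexed by bits (x, y, z),
// e.g. c[0b101] is the corner at x=1, y=0, z=1.
// (tx, ty, tz) is the query position within the cell, each in [0, 1].
fn trilinear(c: [f32; 8], tx: f32, ty: f32, tz: f32) -> f32 {
    let lerp = |a: f32, b: f32, t: f32| a + (b - a) * t;
    // Interpolate along x on the 4 edges...
    let x00 = lerp(c[0b000], c[0b100], tx);
    let x10 = lerp(c[0b010], c[0b110], tx);
    let x01 = lerp(c[0b001], c[0b101], tx);
    let x11 = lerp(c[0b011], c[0b111], tx);
    // ...then along y on the 2 faces...
    let y0 = lerp(x00, x10, ty);
    let y1 = lerp(x01, x11, ty);
    // ...then along z.
    lerp(y0, y1, tz)
}

fn main() {
    // A cell whose distance grows linearly with z:
    // corners with the z bit set (odd indices) hold 1.0, the rest 0.0.
    let c = [0.0, 1.0, 0.0, 1.0, 0.0, 1.0, 0.0, 1.0];
    let d = trilinear(c, 0.5, 0.5, 0.25);
    assert!((d - 0.25).abs() < 1e-6);
    println!("interpolated distance = {}", d);
}
```

The tetrahedral variant mentioned in the post splits the cell into tetrahedra and blends 4 corners instead of 8, trading smoothness for fewer fetches.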
I have everything I wanted for the 0.1 release; I just need to figure out how to publish it on crates.io now.
It has already been possible to change an #iPhone's default web browser through the "Settings" app since iOS 14.
#Apple has a March 6 legal deadline to introduce app sideloading in the European Union in order to comply with the Digital Markets Act #DMA, and iOS 17.4 will add support for this. #Sideloading will allow Apple users to download apps outside of the App Store, but the change will be limited to customers in the EU.
Now is a good opportunity to improve your web experience… and your safety…
The first step in turning bacon fat into lard for #SoapMaking is to clean the grease, aka render it.
Place grease into a pot with twice as much water as grease. Bring to a boil and boil for 15 minutes, until grease and water are well combined. Don't allow this mixture to boil over or you'll start a fire. After boiling, refrigerate until the grease hardens. The grease will float to the top and the water and impurities will sink. Scoop the grease from the top and save it. Discard the dirty water.
After much delay, I have finally finished rendering my bacon grease into lard for soap making. I rendered it eight times; it's now free of impurities and virtually scent-free. Soon I'll make more soap. #SoapMaking #BaconGrease #rendering
@sebastianlague's latest video made me want to try his spatial grid algorithm. I've been wanting to try GPU-based spatial partitioning for quite a while, and Sebastian's explanations are really easy to follow, making the concept easy to wrap one's head around.
But first, particles!
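The core of the spatial grid idea can be sketched on the CPU first (Sebastian Lague's video covers a GPU variant using sorted cell keys; the `Grid` type and hash-map layout below are my own simplified assumptions): bucket particles into cells sized to the interaction radius, then only the 3x3 block of cells around a query point needs an exact distance test instead of all N particles.

```rust
use std::collections::HashMap;

// Uniform spatial grid for particle neighbor lookups (2D for brevity).
struct Grid {
    cell: f32,                              // cell size = interaction radius
    cells: HashMap<(i32, i32), Vec<usize>>, // cell coord -> particle indices
}

impl Grid {
    fn build(points: &[(f32, f32)], cell: f32) -> Self {
        let mut cells: HashMap<(i32, i32), Vec<usize>> = HashMap::new();
        for (i, &(x, y)) in points.iter().enumerate() {
            let key = ((x / cell).floor() as i32, (y / cell).floor() as i32);
            cells.entry(key).or_default().push(i);
        }
        Grid { cell, cells }
    }

    // Candidate neighbors of `p`: everything in the 3x3 block of cells
    // around it. Only these candidates need an exact distance check.
    fn neighbors(&self, p: (f32, f32)) -> Vec<usize> {
        let cx = (p.0 / self.cell).floor() as i32;
        let cy = (p.1 / self.cell).floor() as i32;
        let mut out = Vec::new();
        for dx in -1..=1 {
            for dy in -1..=1 {
                if let Some(ids) = self.cells.get(&(cx + dx, cy + dy)) {
                    out.extend_from_slice(ids);
                }
            }
        }
        out
    }
}

fn main() {
    let pts = [(0.1, 0.1), (0.9, 0.2), (5.0, 5.0)];
    let grid = Grid::build(&pts, 1.0);
    let near = grid.neighbors((0.5, 0.5));
    // The two close particles are candidates; the far one is skipped entirely.
    assert!(near.contains(&0) && near.contains(&1) && !near.contains(&2));
    println!("{} candidate neighbors", near.len());
}
```

On the GPU the hash map is typically replaced by sorting particles by cell key and storing per-cell start offsets, which is what makes the approach parallel-friendly.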
Finally back to working on the simulation. It's time to dogfood my mesh_to_sdf crate for the fluid sim mesh collisions. Really happy with the first results, it works!
I need to find a better screen recorder, though; ScreenToGif struggles and drops frames when it's running at 300fps...