One more experiment blending two images with the movement of simple animated shapes. Combining the layers is done with #Java, #Clojure and #OpenGL blend equations.
The flower illustration is by Joseph Constantine Stadler (1780–1812).
Does anyone know of an APL compiler or transpiler that can generate Vulkan or OpenGL shader scripts? (Free/libre would be most appreciated.) I think Aaron Hsu might have engineered something like this at some point, but I can't find anything about it at all right now, probably thanks to our amazing new "AI-enhanced" search engines.
I might have found yet another project I wanna work on.
IMHO it is doable and would involve porting a well known game to #MSDOS with #OpenGL #3dfx #Voodoo support.
I have no idea where I can find the time and energy to do it, but it just went on my list...
I need to display 32-bit/channel floating point images. For various reasons (there are many channels, think multi-spectral images), it is more convenient for me to group the data on the CPU by channel planes.
Before doing multispectral, I was uploading 32-bit floating-point RGBA textures. Would it be a horrible idea (performance-wise) to instead upload each channel as a single-channel 32-bit float texture (GL_RED/MTLPixelFormatR32Float) and then sample from whichever channel textures I want, reassembling a color to visualize within my fragment shader?
I figured that for 8-bit/channel images there would be a big benefit to uploading/storing as RGBA (vs. 4 separate single-channel textures) since it’s only 32 bits per pixel total; but with each floating-point channel already being 32 bits on its own, perhaps this isn’t as much of a concern in my case?
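A minimal sketch of the CPU-side layout question, assuming NumPy (the array shapes and channel indices are illustrative, not from the original post): splitting an interleaved float32 image into per-channel planes costs no extra memory, since each plane is still 32 bits per pixel, and reassembling a displayable pixel from arbitrary planes mirrors what the fragment shader would do after sampling several single-channel textures.

```python
import numpy as np

# Hypothetical multi-spectral image: H x W pixels, C float32 channels,
# stored interleaved (the layout previously uploaded as one RGBA32F texture).
h, w, c = 4, 4, 4
interleaved = np.arange(h * w * c, dtype=np.float32).reshape(h, w, c)

# Planar layout: one contiguous H x W float32 plane per channel, each
# suitable for upload as its own GL_R32F / MTLPixelFormatR32Float texture.
planes = [np.ascontiguousarray(interleaved[:, :, i]) for i in range(c)]

# Total storage is identical either way: 32 bits per channel per pixel.
assert sum(p.nbytes for p in planes) == interleaved.nbytes

# Reassemble an RGBA pixel from a chosen subset of channel planes
# (per-fragment, the shader would instead sample four textures).
r, g, b, a = 0, 1, 2, 3  # hypothetical channel selection
pixel = np.array([planes[r][0, 0], planes[g][0, 0],
                  planes[b][0, 0], planes[a][0, 0]])
assert np.array_equal(pixel, interleaved[0, 0, [r, g, b, a]])
```

So the extra cost on the GPU side is likely in the number of texture fetches and bindings (four instead of one), not in storage or bandwidth, which stay the same at 32 bits per channel.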
:os_apple: Macbooks are being FREED from the clutches of Apple for Linux ✅ :linux:
CPU & other vital parts are already passing with flying colors. What about the Apple Silicon graphics department?
◉OpenGL ES 3.1 now works with Mesa code
◉Geometry shaders as part of OpenGL 3.2 / GLES 3.2 coming nicely
◉Steam on VMs with DRM native contexts: Valve's legendary game "Portal" runs at 60 FPS
◉Battery life is now ~15 hours in desktop use
Passively participating in #Genuary2024 — Day 8 Chaotic System. In 2012/13 I designed an award-winning audio-reactive brand identity system for Leeds College of Music based on the de Jong strange attractor, with tens and hundreds of millions of particles per frame. This massive, almost year-long project consisted of a Mac/PC desktop app (written in Clojure, OpenCL & OpenGL) for exploring the attractor, creating presets and scheduling render jobs for super hi-res print assets (which would take hours to render and were the biggest image sizes I ever had to deal with, up to 3x3 meters @ 150 dpi). I also had to develop an entire AWS-based ad-hoc render farm and asset & user management system for the school to generate personalized video assets: each student could upload their own music, and the system handled audio FFT analysis and beat detection/mapping (all in Clojure) to create individual sound-responsive clips for their in-school digital signage system and for sharing on social media... Most key aspects were handled via various old thi.ng libraries (e.g. https://thi.ng/simplecl for OpenCL interop). The server app also handled transcoding to dozens of video formats (via ffmpeg) and semi-automatic provisioning of EC2 machines for render/transcoding jobs...
An example video is below (music: Heyoka, Blue Towel)
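For anyone curious, the de Jong map itself is tiny to state. Here's a minimal Python sketch (the production system used Clojure + OpenCL with hundreds of millions of particles; these parameter values are illustrative, not the ones used in the project):

```python
import math

def de_jong(x, y, a, b, c, d):
    """One iteration of the Peter de Jong map:
    x' = sin(a*y) - cos(b*x), y' = sin(c*x) - cos(d*y)."""
    return math.sin(a * y) - math.cos(b * x), math.sin(c * x) - math.cos(d * y)

def orbit(n, a=-2.0, b=-2.0, c=-1.2, d=2.0, x=0.1, y=0.1):
    """Yield n successive points of the attractor from a seed point."""
    for _ in range(n):
        x, y = de_jong(x, y, a, b, c, d)
        yield x, y

points = list(orbit(10_000))

# Each coordinate is a difference of a sine and a cosine, so every
# iterate stays inside the square [-2, 2] x [-2, 2].
assert all(-2.0 <= x <= 2.0 and -2.0 <= y <= 2.0 for x, y in points)
```

Rendering is then just accumulating point densities into an image buffer, which is what makes the map such a good fit for a GPU particle system.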
Looking for a new challenge? Want to help drive the Vulkan API standard forward in the industry? LunarG is looking for talented 3D graphics software developers to help deliver world-class 3D graphics solutions. LunarG creates and troubleshoots graphics drivers, developer tools, SDKs, and Vulkan and OpenXR ecosystem components for the game console, desktop, and mobile markets.