@rich lol, I feel that. My current game project uses a custom-built raycaster, and to complete the look I encode each frame as a valid NTSC signal and then decode it for playback in the game window, lol. I love that "rainbowing" on moiré patterns and such actually happens doing it this way, and the encoding plus slight noise hides imperfections pretty well. Test render here, without any of the game stuff, using a checkerboard texture to test lighting.
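For anyone curious how that works: a minimal sketch of composite encode/decode for one scanline, assuming the standard NTSC quadrature scheme (luma plus chroma modulated on the 3.58 MHz subcarrier). The sample rate, filter width, and function names here are my own illustrative choices, not the author's actual pipeline. Rainbowing falls out naturally because high-frequency luma detail (like a fine checkerboard) leaks into the chroma band during demodulation.

```python
import numpy as np

FSC = 3.579545e6          # NTSC color subcarrier frequency (Hz)
SAMPLE_RATE = 4 * FSC     # 4x oversampling, a common convenient choice

def encode_scanline(y, u, v):
    """Modulate one scanline into a composite signal: Y + quadrature chroma."""
    t = np.arange(y.size) / SAMPLE_RATE
    phase = 2 * np.pi * FSC * t
    return y + u * np.sin(phase) + v * np.cos(phase)

def decode_scanline(sig, kernel=31):
    """Demodulate: low-pass for luma, synchronous detection for chroma.

    The crude box filter is exactly why moire patterns 'rainbow':
    luma detail near the subcarrier frequency survives the filter
    and gets demodulated as spurious color.
    """
    t = np.arange(sig.size) / SAMPLE_RATE
    phase = 2 * np.pi * FSC * t
    box = np.ones(kernel) / kernel            # crude low-pass filter
    y = np.convolve(sig, box, mode='same')    # luma = low-frequency part
    u = 2 * np.convolve(sig * np.sin(phase), box, mode='same')
    v = 2 * np.convolve(sig * np.cos(phase), box, mode='same')
    return y, u, v
```

Running each frame through this round trip (per scanline) gives the authentic softness and color fringing without any hand-tuned post effects.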
The International Astronomical Union (#IAU) are hosting an #astrophotography gallery in #VR! The event is on May 27 from 11am - 12:30pm UTC and May 28 from 8pm - 9:30pm UTC, and showcases astronomical photographs captured with a smartphone 📱🌌
The event is being held in spatial.io, which is accessible from a VR headset (Quest), or just from your computer browser (i.e. no headset needed).
It's compatible with popular VR headsets and offers curated collections based on user preferences, so each visitor gets a lightly personalized experience.
#EPSC (Europlanet Science Congress) is being held hybrid this year, and has an exciting program that includes a session on using new tools such as #VR in planetary science outreach.
But... it's 50 Euro per abstract submission, with no guarantee of acceptance. That's a fairly big punch when the Japanese yen is super weak.
@elizabethtasker@YetAnotherGeekGuy Usually those reviews are free, right? Done out of the goodness of researchers' hearts, and so implicitly funded by their own grants. So a good question is: what is this €50 actually for? To prevent junk abstracts? (In the age of A.I. that's borderline plausible... in other fields.) To get the whole thing off the ground?
Shan-Yuan Teng (https://tengshanyuan.info/) presenting his body of work on challenges in #haptics for #VR and #AR, and asking how to move haptics into everyday life! (Photo by Bryan Wang, at U Toronto)