My instinct to capitalise the 'object' in 'sub object' keeps tripping me up when I use Unreal's CreateDefaultSubobject function. Gah! I want to capitalise the object SO MUCH!!! #UE5
Every day I find new ways to be surprised and "delighted" by Unity. Today: turns out that if a vector has a small enough magnitude, normalising it just returns... zero. Even if the magnitude isn't actually all that small. Thanks, Unity. Thunity. 😌
@raodaozao@vfig I'm quite aggressive with epsilons in my math code, and unfortunately, after the 10 years that have passed since I wrote all that, I can't remember why, but I'm almost certain it's to do with comparisons.
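(A quick C sketch of the behaviour being griped about: normalisation that bails out to zero below an epsilon cutoff. The vec3 type and the cutoff value here are made up for illustration; Unity's real threshold lives inside its Vector3 code.)

    #include <math.h>
    #include <stdio.h>

    #define NORM_EPSILON 1e-5f  /* assumed cutoff, for illustration only */

    typedef struct { float x, y, z; } vec3;

    vec3 vec3_normalized(vec3 v)
    {
        float mag = sqrtf(v.x * v.x + v.y * v.y + v.z * v.z);
        if (mag <= NORM_EPSILON)                /* "too small to normalise"... */
            return (vec3){ 0.0f, 0.0f, 0.0f }; /* ...so you get zero back */
        return (vec3){ v.x / mag, v.y / mag, v.z / mag };
    }

    int main(void)
    {
        vec3 n = vec3_normalized((vec3){ 1e-6f, 0.0f, 0.0f });
        printf("(%g, %g, %g)\n", n.x, n.y, n.z);  /* prints (0, 0, 0) */
        return 0;
    }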
Huh, just realized that all X macro examples I could find use exactly one name for the entries (often "X") - I found using several for different purposes, like different member types to (de)serialize, very useful - is this really unusual or did I just not look hard enough?
(and then I #define D3_IMATTR_FLOAT(NAME), D3_IMATTR_VEC2(NAME) and D3_IMATTR_INT(NAME) however I need it to print/write or parse or whatever, before just putting "D3_IMSTYLE_ATTRS" in a line that then expands all that)
The X macro examples/articles I found would probably instead use X(NAME, TYPE, SCANFSTR, PRINTFSTR) or something like that instead of D3_IMATTR_FLOAT(NAME), D3_IMATTR_VEC2(NAME), etc., which would've been more painful overall, IMO
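(For anyone curious, a condensed sketch of the multi-name pattern described in this thread. The attribute names and the contents of the D3_IMSTYLE_ATTRS list are invented here for illustration:)

    #include <stdio.h>

    typedef struct { float x, y; } Vec2;

    /* The list itself: one entry macro per member type. */
    #define D3_IMSTYLE_ATTRS \
        D3_IMATTR_FLOAT(alpha) \
        D3_IMATTR_VEC2(framePadding) \
        D3_IMATTR_INT(frameBorder)

    /* Expand it once to declare the struct members... */
    typedef struct {
    #define D3_IMATTR_FLOAT(NAME) float NAME;
    #define D3_IMATTR_VEC2(NAME)  Vec2 NAME;
    #define D3_IMATTR_INT(NAME)   int NAME;
        D3_IMSTYLE_ATTRS
    #undef D3_IMATTR_FLOAT
    #undef D3_IMATTR_VEC2
    #undef D3_IMATTR_INT
    } Style;

    /* ...and again, with different definitions, to print them. */
    void print_style(const Style *s)
    {
    #define D3_IMATTR_FLOAT(NAME) printf(#NAME " = %g\n", s->NAME);
    #define D3_IMATTR_VEC2(NAME)  printf(#NAME " = (%g, %g)\n", s->NAME.x, s->NAME.y);
    #define D3_IMATTR_INT(NAME)   printf(#NAME " = %d\n", s->NAME);
        D3_IMSTYLE_ATTRS
    #undef D3_IMATTR_FLOAT
    #undef D3_IMATTR_VEC2
    #undef D3_IMATTR_INT
    }

    int main(void)
    {
        Style s = { 1.0f, { 4.0f, 3.0f }, 1 };
        print_style(&s);
        return 0;
    }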
I think part of it is the circumstances that would compel users to construct such a list. Until that thread, it hadn't even occurred to me that someone would present a case against the existence of negative literals that required a rebuttal.
@pervognsen@danluu It doesn't really work well to define it that way on two's complement targets though.
E.g. the canonical way to #define INT_MIN on 32-bit C targets is (-0x7fffffff - 1) or similar. You can't write (-0x80000000) since the 0x80000000 is too large for int32 so it forces the constant to be unsigned, and then you end up with the wrong type.
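(A quick demo of the trap, assuming 32-bit int:)

    #include <stdio.h>

    /* 0x80000000 doesn't fit in a 32-bit int, so the literal's type is
       unsigned int; negating it stays unsigned, with value 0x80000000. */
    #define INT_MIN_WRONG (-0x80000000)
    /* This one stays a plain, genuinely negative int. */
    #define INT_MIN_RIGHT (-0x7fffffff - 1)

    int main(void)
    {
        printf("%d\n", INT_MIN_WRONG > 0);  /* prints 1: unsigned, so "positive" */
        printf("%d\n", INT_MIN_RIGHT > 0);  /* prints 0 */
        return 0;
    }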
Running #define ; anything yields "error: macro names must be identifiers" for both C and C++ in an online compiler. So I don't think the compiler will let you redefine the semicolon.
Well I just tried #define int void in C and C++ before a “hello world” program. C++ catches it because main() has to be an int, but C doesn’t care. I think it is because C just treats main() as returning int by default; older books on C don’t even include the “int” part of “int main()” because it wasn’t strictly necessary (implicit int, which C99 formally removed).
#define int void replaces all ints with type void, which is typically used to write functions with no return value.
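(The experiment, roughly as described; the #define comes after the include so it doesn't mangle <stdio.h> itself:)

    #include <stdio.h>

    #define int void  /* every 'int' token from here on becomes 'void' */

    int main(void)    /* the compiler actually sees: void main(void) */
    {
        printf("hello, world\n");
    }
    /* Typical C compilers accept this, perhaps with a warning; a C++
       compiler rejects it because main must return int. */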
@raptor85 Oh, thanks! This is great info. I used WinMain as a way of getting rid of the extra console terminal that was opening up along with the SDL window.
Also, when I had my own main() I got a compiler error that I could only get rid of by adding a #define SDL_MAIN_HANDLED before the SDL.h include. Is using that #define okay, or is there a better way?
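(For reference, the documented shape of that route in SDL2: define SDL_MAIN_HANDLED before the include to keep your own main(), then call SDL_SetMainReady() before SDL_Init():)

    #define SDL_MAIN_HANDLED  /* tell SDL not to hijack main() */
    #include <SDL.h>

    int main(int argc, char *argv[])
    {
        (void)argc; (void)argv;
        SDL_SetMainReady();  /* required companion to SDL_MAIN_HANDLED */
        if (SDL_Init(SDL_INIT_VIDEO) != 0) {
            SDL_Log("SDL_Init failed: %s", SDL_GetError());
            return 1;
        }
        /* ... create window, run event loop ... */
        SDL_Quit();
        return 0;
    }

(As for the extra console on Windows: that's a linker subsystem thing, so it can also be suppressed with /SUBSYSTEM:WINDOWS on MSVC or -mwindows on MinGW rather than by switching to WinMain.)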
@thephd Please tell me you got rid of #define in C23, yeah? Please, like for real, maybe?
Sigh, I'm trying to extract one function (and its supporting code) from the musl library and convert it into C#. I've been feeling fairly confident about it, though concerned about the volume of code and wondering if my original approach might be "good enough".
Still in explorer mode, I found that what looked like a function turned out to be a macro. NBD. Then a macro to a macro. Then 3 layers deep.
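(A hypothetical C sketch, not actual musl code, of the kind of thing being described: a call site that reads like a function but resolves through three macro layers:)

    #include <stdio.h>

    #define IMPL_ADD(a, b) ((a) + (b))    /* layer 3: the real work */
    #define DO_ADD(a, b)   IMPL_ADD(a, b) /* layer 2: indirection */
    #define add(a, b)      DO_ADD(a, b)   /* layer 1: looks like a function */

    int main(void)
    {
        printf("%d\n", add(2, 3));  /* expands through all three layers to ((2) + (3)) */
        return 0;
    }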
@lanodan me: #define _XOPEN_SOURCE 500 so that nftw is visible, but suddenly that breaks FreeBSD includes on CI, because setting a standard is supposed to hide the extensions, I guess
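(The dance in question; the feature-test macro has to come before any system header for nftw() to be visible:)

    #define _XOPEN_SOURCE 500  /* must precede every #include */
    #include <ftw.h>
    #include <stdio.h>

    static int print_entry(const char *path, const struct stat *sb,
                           int typeflag, struct FTW *ftwbuf)
    {
        (void)sb; (void)typeflag; (void)ftwbuf;
        puts(path);
        return 0;  /* 0 = keep walking */
    }

    int main(void)
    {
        return nftw(".", print_entry, 16, 0);
    }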
So for example this in C/C++: #define true (__LINE__ % 10 != 0). Not sure if that counts as a swear, but put that in some code and you’ll hear lots of swearing hahahahaha
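(Saved exactly as below, the middle printf lands on line 10 of the file and prints 0 while its neighbours print 1:)

    #include <stdio.h>
    #define true (__LINE__ % 10 != 0)

    int main(void)
    {
        /* Each 'true' expands to a check on the current line number,
           so the "constant" goes false on every tenth line. */

        printf("%d\n", true);  /* line 9 */
        printf("%d\n", true);  /* line 10: prints 0 */
        printf("%d\n", true);  /* line 11 */
        return 0;
    }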
Y'all ought to be careful about how much you say I'm cute or awesome or whatever; keep this up and I might start actually believing it, can't imagine, honestly
Python is great, but stuff like this just drives me up the wall (lemmy.world)
Explanation: Python is a programming language. Numpy is a library for python that makes it possible to run large computations much faster than in native python. In order to make that possible, it needs to keep its own set of data types that are different from python’s native datatypes, which means you now have two different...
Destroying friendship (file.coffee)
Which programming languages do you know?