Actually, now that I think about it, is your system little-endian?
I can never remember what the case is with Macs, though I believe modern ones (Intel and Apple silicon alike) are little-endian.
Since RGBA is just defining what each byte is in memory, using a hex literal might be an issue: hex literals are always written most-significant byte first ("big-endian" as you read them), so on a little-endian system the bytes at the end of the literal end up first in memory.
The more I think about it, the more confident I am that this is just a misunderstanding of how the literals work.
If you do something like u32be(0x0000ffff), that should enforce big-endian encoding. But you need to make sure you transmute() it rather than cast() it whenever you need it as a u32: a cast converts the value (byte-swapping it back into native order on a little-endian host), while a transmute reinterprets the same bytes as-is.
Otherwise you’re just going to have to think about your hex literals in reverse.