
Discussion on: Should a modern programming language assume a byte is 8-bits in size?

Dustin King

It sounds like you already think it's "incorrect" to make that assumption.

On the other hand, languages sometimes have to make compromises based on the existing or expected user base and developer community. I don't know what those are like for Leaf.

edA‑qa mort‑ora‑y

I do lean towards thinking byte == 8 bits is wrong. That's the latent learning of C in my background, and the thought that hardware could ultimately change.

For the most part, users of Leaf will never see this byte type; only those doing integration with non-Leaf libraries or direct memory access will.

jakebman

If your byte type is supposed to be for integration with non-Leaf libraries, I believe you should base it on the specifications for those libraries.

If you expect the integration to be via C libraries, then I believe you should base your specification of byte on exactly what C says it is.

It will be better for your interoperability with C if you can say

byte - The specification of a byte according to all laws and regulations of C's implementation on whatever platform you're running on.

rather than

byte - Mostly what C says it is. Except when it isn't, in which case you will have massive headaches and have to implement a lot of workarounds both in your code and any interop layers you might have.
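
For reference, C itself only pins the byte down through CHAR_BIT in <limits.h>: it is guaranteed to be at least 8 bits, not exactly 8. A minimal C sketch of leaning on "exactly what C says it is", with a loud failure where an 8-bit assumption would break:

```c
#include <limits.h> /* CHAR_BIT: bits in a C byte (char); at least 8, not fixed at 8 */
#include <stdio.h>

int main(void)
{
    /* C only guarantees CHAR_BIT >= 8; DSPs with 16- or 32-bit chars exist. */
    printf("This platform's C byte is %d bits wide.\n", CHAR_BIT);

    /* An interop layer that hard-codes byte == 8 bits can at least fail loudly
       on platforms where that assumption does not hold. */
#if CHAR_BIT != 8
#error "byte interop layer assumes an 8-bit char"
#endif
    return 0;
}
```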

edA‑qa mort‑ora‑y

Yes, I already have a series of abi_ types, so abi_byte makes sense. But C integration is only part of the story; there are still some low-level cases that require the same concept as a byte. Those are quite low-level though, so still ABI-relevant. Perhaps just an abi_byte isn't a bad compromise.
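
(A hypothetical C lowering of such an abi_byte, not Leaf's actual implementation: unsigned char is by definition exactly one C byte of CHAR_BIT bits, so aliasing it keeps the type in step with whatever the platform's C ABI says.)

```c
#include <limits.h>
#include <stddef.h>

/* Hypothetical interop definition: abi_byte aliases unsigned char, which is
   exactly one C byte (CHAR_BIT bits) on every conforming implementation. */
typedef unsigned char abi_byte;

/* The width follows the platform instead of being hard-coded to 8. */
enum { ABI_BYTE_BITS = CHAR_BIT };

/* Example FFI-style use: copy raw memory expressed in abi_bytes. */
void abi_copy(abi_byte *dst, const abi_byte *src, size_t count)
{
    for (size_t i = 0; i < count; ++i)
        dst[i] = src[i];
}
```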