Currently, we have a strict requirement for our serialization mechanism: it must be deterministic.
This means that the same logical object must serialize into the same byte representation every time it is serialized, on every node.
That requirement rules out protobuf, since its serialization isn't deterministic. So the plan is to implement our own serialization library in-house that would guarantee determinism.
But do we really need serialization to be deterministic? We only need it for cryptographic signing and hashing: every time we compute the signature or hash of an object, we deterministically serialize it into bytes and get the same signature/hash. What if, instead, we change our logic to serialize an object into bytes and compute the signature/hash only once, when the object is created, and from then on always carry that byte representation along with it? Every other node can verify the object and parse it into a Go type, but it preserves the original bytes, and the next time it needs to save or send the object, it uses those original bytes instead of serializing the object on its own. So instead of requiring the serialization operation to be deterministic, we change our logic to effectively perform that operation only once. If we can follow this approach, we can use protobuf for serialization everywhere and not worry about determinism.
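A minimal Go sketch of this "serialize once, carry the bytes" pattern. The `Block`, `Envelope`, `NewEnvelope`, and `Decode` names are hypothetical, and `encoding/json` stands in for protobuf here just to keep the example self-contained:

```go
package main

import (
	"crypto/sha256"
	"encoding/json"
	"fmt"
)

// Block is a hypothetical domain object.
type Block struct {
	Height uint64 `json:"height"`
	Parent string `json:"parent"`
}

// Envelope carries a decoded object together with the exact bytes it was
// first serialized into; the hash is computed over those bytes once, and
// the bytes are reused for storage and transport instead of re-serializing.
type Envelope struct {
	Object Block
	Raw    []byte   // canonical bytes, produced exactly once at creation
	Hash   [32]byte // sha256 over Raw
}

// NewEnvelope serializes the object once, at creation time.
func NewEnvelope(b Block) (*Envelope, error) {
	raw, err := json.Marshal(b) // stand-in for proto.Marshal
	if err != nil {
		return nil, err
	}
	return &Envelope{Object: b, Raw: raw, Hash: sha256.Sum256(raw)}, nil
}

// Decode verifies received bytes and preserves them verbatim, so the
// receiving node never serializes the object on its own.
func Decode(raw []byte) (*Envelope, error) {
	var b Block
	if err := json.Unmarshal(raw, &b); err != nil {
		return nil, err
	}
	rawCopy := append([]byte(nil), raw...)
	return &Envelope{Object: b, Raw: rawCopy, Hash: sha256.Sum256(rawCopy)}, nil
}

func main() {
	env, _ := NewEnvelope(Block{Height: 7, Parent: "abc"})
	// A receiving node decodes the original bytes and gets the same hash,
	// even if its own serializer would have produced different bytes.
	received, _ := Decode(env.Raw)
	fmt.Println(env.Hash == received.Hash) // true
}
```

The key design point is that the hash commits to `Raw`, not to the Go struct, so determinism of the encoder stops mattering once the bytes exist.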
Here are a few thoughts against implementing an in-house serialization library. First, it would be very hard to guarantee 100% determinism and rely on it for our crypto. We would face the same issues the protobuf library does: bugs, differences in behavior across library versions and programming languages, unknown fields that might be added to objects in the future, and so on.
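As one concrete illustration of how easy it is to lose determinism in a hand-rolled encoder: Go randomizes map iteration order, so any encoder that walks a map must sort the keys to produce a canonical byte representation. A hypothetical sketch (the `encodeDeterministic` helper is illustrative, not from our codebase):

```go
package main

import (
	"fmt"
	"sort"
	"strings"
)

// encodeDeterministic encodes a string map into a canonical byte form.
// Iterating the map directly would emit entries in a random order on
// each run, because Go randomizes map iteration; sorting the keys
// first restores a single canonical representation.
func encodeDeterministic(m map[string]string) []byte {
	keys := make([]string, 0, len(m))
	for k := range m {
		keys = append(keys, k)
	}
	sort.Strings(keys)

	var sb strings.Builder
	for _, k := range keys {
		fmt.Fprintf(&sb, "%s=%s;", k, m[k])
	}
	return []byte(sb.String())
}

func main() {
	m := map[string]string{"b": "2", "a": "1", "c": "3"}
	fmt.Println(string(encodeDeterministic(m))) // a=1;b=2;c=3;
}
```

An in-house library would need to get every such detail right (map ordering, field ordering, defaults, floats, unknown fields) in every implementation, forever.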
Second, I believe the serialization problem is common enough that we shouldn't spend our resources implementing it in-house. We should focus on our core protocol features instead.
If, after all, we can't abandon the requirement for deterministic serialization, we should evaluate existing solutions:
https://pkg.go.dev/github.com/ethereum/go-ethereum/rlp#example-Encoder
https://github.com/near/borsh