Why is Protocol Buffers so much better than .NET binary serialization? I can only find comparisons which talk about how much better it is (in terms of performance and size), but I could not find why. Can it be at least partly explained without getting into too much detail?
2 Answers
Because BinaryFormatter stores type and property information in the serialized data, the payload is larger, and reading and writing all that metadata (via reflection) also makes it slower.
ProtoBuf moves this knowledge to the application side: serializer and deserializer must share a schema, so the code itself specifies which type and which properties are involved, and the wire format identifies each field only by a small numeric tag.
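To make the size difference concrete, here is a minimal sketch in Python, using `pickle` as a rough stand-in for BinaryFormatter (both embed type and member names in the payload). The `Person` message and its field numbers are invented for illustration; the hand-rolled encoder follows protobuf's wire format, where each field is identified only by a numeric tag:

```python
import pickle

class Person:
    """Hypothetical message: field 1 = id (varint), field 2 = name (string)."""
    def __init__(self, id, name):
        self.id = id
        self.name = name

def encode_varint(n):
    """Protobuf base-128 varint: 7 bits per byte, high bit = continuation."""
    out = bytearray()
    while True:
        bits = n & 0x7F
        n >>= 7
        out.append(bits | (0x80 if n else 0))
        if not n:
            return bytes(out)

def encode_person(p):
    """Protobuf-style encoding: tag byte = (field_number << 3) | wire_type."""
    buf = bytearray()
    buf += encode_varint((1 << 3) | 0) + encode_varint(p.id)  # field 1, varint
    name = p.name.encode("utf-8")
    buf += encode_varint((2 << 3) | 2) + encode_varint(len(name)) + name  # field 2, length-delimited
    return bytes(buf)

p = Person(150, "Alice")
proto_bytes = encode_person(p)    # 10 bytes: just tags and values
pickle_bytes = pickle.dumps(p)    # also embeds the module, class and attribute names
print(len(proto_bytes), len(pickle_bytes))
```

Note that the protobuf payload contains no trace of the names `Person`, `id`, or `name` — a decoder can only interpret it because it already knows, from code, that field 1 is the id and field 2 is the name.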
The most important reason is that the binary serialization (BinaryFormatter) is brittle. It encodes exact type names, assembly information, etc., so a rename or refactor can make old data unreadable — which makes it unsuitable for archiving data, e.g. as a document format for an application.
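The brittleness is easy to demonstrate. Again using Python's `pickle` as a stand-in (the failure mode mirrors BinaryFormatter's: the payload references the exact type name), deserialization breaks as soon as the named type no longer exists. The `AppSettings` class here is hypothetical:

```python
import pickle

class AppSettings:                 # hypothetical type, for illustration only
    def __init__(self):
        self.theme = "dark"

blob = pickle.dumps(AppSettings())
assert b"AppSettings" in blob      # the exact type name is baked into the bytes

# Simulate a later refactor: the class was renamed, so the old name is gone.
del AppSettings

try:
    pickle.loads(blob)             # old archives are now unreadable
except AttributeError as exc:
    print("cannot load archived data:", exc)
```

A protobuf payload, by contrast, carries only field tags, so the application is free to rename types and properties as long as the tag numbers stay stable.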
Other reasons include performance and platform availability: protobuf has implementations for many languages and platforms, while BinaryFormatter is .NET-only.
If you don't have any performance problems (i.e. messages are small) and you only want to temporarily store or pass some blob of data, it can still be useful. For example, for passing small messages between two instances of a desktop application on the same computer, it saves you from adding another library just for that task.