These are Avro records serialized with the Confluent Platform (i.e. via the Schema Registry serializer). I would like to find a working example of deserializing them in Spark Structured Streaming. My current attempt:
```scala
kafka
  .select("value")
  .map { row =>
    // Round-tripping locally works: test == testRehydrated
    val test = Foo("bar")
    val testBytes = AvroWriter[Foo].toBytes(test)
    val testRehydrated = AvroReader[Foo].fromBytes(testBytes)

    // ...but deserializing the raw Kafka payload yields mangled Foo data
    val bytes = row.getAs[Array[Byte]]("value")
    val rehydrated = AvroReader[Foo].fromBytes(bytes)
    rehydrated
  }
```
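The mangling is most likely the Confluent wire format: Confluent's serializer prepends a 5-byte header to every message (one magic byte, `0x00`, followed by a 4-byte big-endian Schema Registry ID) before the Avro payload, so a plain Avro reader sees five garbage bytes first. A minimal sketch of stripping that header before deserializing (`stripConfluentHeader` and `schemaId` are hypothetical helper names, not Spark or Confluent APIs):

```scala
import java.nio.ByteBuffer

// Confluent wire format: magic byte 0x00 + 4-byte big-endian schema ID + Avro payload.
def stripConfluentHeader(bytes: Array[Byte]): Array[Byte] = {
  require(bytes.length > 5 && bytes(0) == 0, "not a Confluent-framed message")
  bytes.drop(5)
}

// The schema ID can be recovered too, e.g. to fetch the writer schema from the registry.
def schemaId(bytes: Array[Byte]): Int =
  ByteBuffer.wrap(bytes, 1, 4).getInt
```

Inside the `map` above, the reader would then see plain Avro bytes, e.g. `AvroReader[Foo].fromBytes(stripConfluentHeader(bytes))`. Note this assumes producer and consumer agree on the schema; to be fully correct across schema evolution you would look up the writer schema by `schemaId` in the registry.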