Is this some kind of exercise in programmer torture? I can't imagine why you'd possibly want to do this, not least because your struct-as-a-blob could be subject to padding and alignment that vary from compiler to compiler and platform to platform. On top of that, the byte order will vary between architectures because of endianness differences. At least you used fixed-width types.
Assuming you only care about little-endian, and that your compilers don't add any padding or alignment (likely for a struct that's just three 64-bit fields), it's possible. That doesn't make it a good idea.
My preferred approach would be to use some Python code with struct, e.g. (Python 3):
python3 - "x00911ee3561ac801cb0783462586cf01af00000000000000" <<__END__
import sys
import struct
print("{} {} {}".format(*struct.unpack('<QQQ', bytes.fromhex(sys.argv[1][1:]))))
__END__
as struct can handle endianness and padding explicitly using the appropriate format modifiers, and its output is easy to consume from a shell script.
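For instance, capturing the three decoded fields into shell variables might look like this (a sketch assuming Python 3, where bytes.fromhex replaces Python 2's decode("hex"); '<QQQ' means little-endian, standard sizes, no padding):

```shell
# Hypothetical example: decode the bytea hex string and read the three
# uint64 fields into shell variables in one go.
x=x00911ee3561ac801cb0783462586cf01af00000000000000
read -r a b c < <(python3 - "$x" <<'__END__'
import sys, struct
# sys.argv[1][1:] strips the leading 'x'; '<QQQ' unpacks three
# little-endian unsigned 64-bit integers with no padding.
print(*struct.unpack('<QQQ', bytes.fromhex(sys.argv[1][1:])))
__END__
)
echo "a=$a b=$b c=$c"
# -> a=128381549860000000 b=130470408871937995 c=175
```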
If that's not convenient/suitable, it's also possible in bash, just absolutely horrible. For little-endian, unpadded/packed-unaligned:
To decode each value (adapted from https://stackoverflow.com/a/3678208/398670):
$ x=00911ee3561ac801
$ echo $(( 16#${x:14:2}${x:12:2}${x:10:2}${x:8:2}${x:6:2}${x:4:2}${x:2:2}${x:0:2} ))
so, for the full deal:
x=x00911ee3561ac801cb0783462586cf01af00000000000000
# Byte-swap 16 hex digits (little-endian to big-endian), then let bash
# convert from base 16. Assumes exactly 16 hex digits per value.
uint64_dec() {
    echo $(( 16#${1:14:2}${1:12:2}${1:10:2}${1:8:2}${1:6:2}${1:4:2}${1:2:2}${1:0:2} ))
}
uint64_dec "${x:1:16}"     # skip the leading 'x', first 8 bytes
uint64_dec "${x:17:16}"    # second 8 bytes
uint64_dec "${x:33:16}"    # third 8 bytes
produces:
128381549860000000
130470408871937995
175
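If you'd rather not hand-write the eight substring expansions, a loop does the same byte reversal; this is a sketch, and uint64_dec_le is a made-up name, not anything standard:

```shell
# Reverse the hex string two digits (one byte) at a time, then convert
# from base 16. Little-endian input only; uint64_dec_le is hypothetical.
uint64_dec_le() {
    local hex=$1 reversed= i
    for (( i = ${#hex} - 2; i >= 0; i -= 2 )); do
        reversed+=${hex:i:2}
    done
    echo $(( 16#$reversed ))
}

x=x00911ee3561ac801cb0783462586cf01af00000000000000
uint64_dec_le "${x:1:16}"    # first field
uint64_dec_le "${x:17:16}"   # second field
uint64_dec_le "${x:33:16}"   # third field
```

Note that $(( )) is signed 64-bit arithmetic, so values above 2^63-1 would wrap; these particular values fit.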
Now, I feel dirty and need to go wash. I strongly suggest the following:
CREATE TYPE my_struct AS (a numeric, b numeric, c numeric);
then using my_struct instead of a bytea field. Or just use three numeric columns. You can't use bigint because Pg doesn't have an unsigned 64-bit integer type, and bigint's signed range can't represent all uint64 values.