
I'm trying to get an image from a string message sent over a socket; before sending, the message was encoded in base64. Now, in my Python server, I am using these instructions to convert the string into a real image so that I can pass it as an argument to cv2.imwrite() and cv2.imshow():

image = np.fromstring(string_image, np.uint8)
cv2.imwrite("image.png", image)
cv2.imshow("image", image)

The code like that gives me an image, but not the one it is supposed to be: what I get is a gray image with horizontal edges only and nothing else. And if I add the following instruction between the np.fromstring and the cv2.imwrite:

image = cv2.imdecode(image, cv2.IMREAD_COLOR)

I get an empty-image error. What should I add or change in my code to get the real image?

  • Look at last 4 lines here... stackoverflow.com/a/59346488/2836621 Commented Aug 11, 2020 at 21:39
  • Please share the first 30-40 characters from string_image. Commented Aug 12, 2020 at 13:46
  • @MarkSetchell always the same problem; with the solution in that link I got an empty image Commented Aug 12, 2020 at 14:00
  • @MarkSetchell here is the first 40 characters : gZN+/4CSff9+jof/f4+I/4SWlP+Fl5X/gY+Y/3qI Commented Aug 12, 2020 at 14:04
  • That doesn't look like a base64-encoded PNG. It should start with iVBORw0KGgo= See stackoverflow.com/a/49690539/2836621 What did you actually send? Commented Aug 12, 2020 at 14:12

1 Answer


There are too many things going wrong here. First you need to decide what you are going to send. Then you need to decide how you are going to send it.

As regards what you are going to send - you could send:

  • a JPEG - this has the advantage that it will be small and transmit fastest whilst taking least network bandwidth. It is also self-contained - the recipient can work out the width and height from the JPEG header. Disadvantages: potentially has less quality.

  • a PNG - much the same as JPEG but a bit bigger and better quality.

  • raw pixels - which seems to be what you are currently doing maybe? This will take the most network bandwidth and be the slowest. Also, the receiver will not know the width, the height, the bits per pixel or if there is any sub-sampling, so you will have to transmit this with the pixels.

If you send as JPEG or PNG, you will be able to just use Python write() to write the data to disk as a JPEG or PNG at the receiving end. If you want to process the image at the receiving end, you will want to convert it to a Numpy array with cv2.imdecode().

If you send as raw pixels, you will not use cv2.imdecode() at the receiving end because you will have raw pixels, so you will need to put them into a Numpy array and reshape it according to the height and width that you remembered to transmit along with the pixels, like I said above.
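The raw-pixel path can be sketched like this; the dimensions and the pixel bytes are made-up stand-ins for what the sender would have to transmit alongside the data:

```python
import numpy as np

# Suppose the sender transmitted height, width and channel count first
# (hypothetical values here), followed by the raw BGR pixel bytes.
height, width, channels = 2, 3, 3
raw = bytes(range(height * width * channels))  # stand-in for sock.recv() data

# No cv2.imdecode here: the bytes ARE the pixels, so just wrap and reshape.
image = np.frombuffer(raw, np.uint8).reshape(height, width, channels)
print(image.shape)  # (2, 3, 3)
```

Without the transmitted height and width, the reshape is impossible, which is exactly why raw pixels need that extra metadata and JPEG/PNG do not.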

Then there is the question of how you send it. You seem to be base64-encoding but I wonder why as that just makes the image 33% bigger for no reason.
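If the data really is base64-encoded text on the wire, the receiver must undo that encoding before any cv2.imdecode() or reshape step; a minimal round-trip with the standard base64 module (the payload bytes are made up):

```python
import base64

# What the sender would transmit as text...
payload = base64.b64encode(b"\x00\x01\x02")

# ...and what the receiver must do FIRST to get the original bytes back.
raw = base64.b64decode(payload)
assert raw == b"\x00\x01\x02"
```

Skipping this decode step and feeding the base64 text itself to np.frombuffer would produce exactly the kind of garbage image described in the question.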
