
I am following these examples to convert my CSV file to TFRecords.

This is the code I attempted:

csv = pd.read_csv("ehealth.csv").values
with tf.python_io.TFRecordWriter("ehealth.tfrecords") as writer:
    for row in csv:
        question, answer, question_bert, answer_bert = row[0], row[1] , row[1], row[2]
        example = tf.train.Example()
        example.features.feature["question"].bytes_list.value.extend(question.encode("utf8"))
        example.features.feature["answer"].bytes_list.value.extend(answer.encode("utf8"))
        example.features.feature["question_bert"].float_list.value.extend(question_bert)
        example.features.feature["answer_bert"].float_list.value.append(answer_bert)
        writer.write(example.SerializeToString())

This is the error I get:

TypeError                                 Traceback (most recent call last) <ipython-input-36-0a8c5e073d84> in <module>()
      4         question, answer, question_bert, answer_bert = row[0], row[1] , row[1], row[2]
      5         example = tf.train.Example()
----> 6         example.features.feature["question"].bytes_list.value.extend(question.encode("utf8"))
      7         example.features.feature["answer"].bytes_list.value.extend(answer.encode("utf8"))
      8         example.features.feature["question_bert"].float_list.value.extend(question_bert)

TypeError: 104 has type int, but expected one of: bytes

It looks like there is an issue when encoding the string. I commented out those two lines to make sure everything else was working correctly:

csv = pd.read_csv("ehealth.csv").values
with tf.python_io.TFRecordWriter("ehealth.tfrecords") as writer:
    for row in csv:
        question, answer, question_bert, answer_bert = row[0], row[1] , row[1], row[2]
        example = tf.train.Example()
#         example.features.feature["question"].bytes_list.value.extend(question)
#         example.features.feature["answer"].bytes_list.value.extend(answer)
        example.features.feature["question_bert"].float_list.value.extend(question_bert)
        example.features.feature["answer_bert"].float_list.value.append(answer_bert)
        writer.write(example.SerializeToString())

but then I get this error instead:

TypeError                                 Traceback (most recent call last) <ipython-input-13-565b43316ef5> in <module>()
      6 #         example.features.feature["question"].bytes_list.value.extend(question)
      7 #         example.features.feature["answer"].bytes_list.value.extend(answer)
----> 8         example.features.feature["question_bert"].float_list.value.extend(question_bert)
      9         example.features.feature["answer_bert"].float_list.value.append(answer_bert)
     10         writer.write(example.SerializeToString())

TypeError: 's' has type str, but expected one of: int, long, float

It turns out that the issue is that pandas is reading my array as a string instead of an array:

type(csv[0][2])

-> str

Furthermore, it looks like I have to use example.SerializeToString() since I have an array, but I'm not sure how to go about doing that.
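(For what it's worth, my guess is that the string could be parsed back into numbers with something like the sketch below, assuming the column text is a bracketed list such as "[0.1, 0.2, ...]" — I have not confirmed that format.)

import ast

# Hypothetical: turn the embedding string back into a list of floats,
# assuming it looks like "[0.1, 0.2, 0.3]".
question_bert = [float(x) for x in ast.literal_eval(csv[0][2])]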

Below is the full code to reproduce the errors, including the code that downloads the CSV file from Google Drive.

import pandas as pd
import numpy as np
import requests
import tensorflow as tf

def download_file_from_google_drive(id, destination):
    URL = "https://docs.google.com/uc?export=download"

    session = requests.Session()

    response = session.get(URL, params = { 'id' : id }, stream = True)
    token = get_confirm_token(response)

    if token:
        params = { 'id' : id, 'confirm' : token }
        response = session.get(URL, params = params, stream = True)

    save_response_content(response, destination)    

def get_confirm_token(response):
    for key, value in response.cookies.items():
        if key.startswith('download_warning'):
            return value

    return None

def save_response_content(response, destination):
    CHUNK_SIZE = 32768

    with open(destination, "wb") as f:
        for chunk in response.iter_content(CHUNK_SIZE):
            if chunk: # filter out keep-alive new chunks
                f.write(chunk)

# download_file_from_google_drive('1rMjqKkMnt6_vROrGmlTGStNGmwPO4YFX', 'model.zip') #

file_id = '1anbEwfViu9Rzu7tWKgPb_We1EwbA4x1-'
destination = 'ehealth.csv'
download_file_from_google_drive(file_id, destination)

healthdata=pd.read_csv('ehealth.csv')
healthdata.head()

csv = pd.read_csv("ehealth.csv").values
with tf.python_io.TFRecordWriter("ehealth.tfrecords") as writer:
    for row in csv:
        question, answer, question_bert, answer_bert = row[0], row[1] , row[1], row[2]
        example = tf.train.Example()
        example.features.feature["question"].bytes_list.value.extend(question)
        example.features.feature["answer"].bytes_list.value.extend(answer)
        example.features.feature["question_bert"].float_list.value.extend(question_bert)
        example.features.feature["answer_bert"].float_list.value.append(answer_bert)
        writer.write(example.SerializeToString())


csv = pd.read_csv("ehealth.csv").values
with tf.python_io.TFRecordWriter("ehealth.tfrecords") as writer:
    for row in csv:
        question, answer, question_bert, answer_bert = row[0], row[1] , row[1], row[2]
        example = tf.train.Example()
#         example.features.feature["question"].bytes_list.value.extend(question)
#         example.features.feature["answer"].bytes_list.value.extend(answer)
        example.features.feature["question_bert"].float_list.value.extend(question_bert)
        example.features.feature["answer_bert"].float_list.value.append(answer_bert)
        writer.write(example.SerializeToString())

1 Answer


Try

example.features.feature["question"].bytes_list.value.extend([bytes(question, 'utf-8')])

That fixes the error on line 6; the same change applies to line 7.
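The reason for the original error is that extend iterates over its argument, and iterating over a bytes object in Python 3 yields ints (hence "104 has type int, but expected one of: bytes"). A sketch of the equivalent fix inside the question's loop, using append with a single bytes value:

# Equivalent fix: append one bytes value per feature instead of
# extending the list with an iterable of ints.
example.features.feature["question"].bytes_list.value.append(question.encode("utf-8"))
example.features.feature["answer"].bytes_list.value.append(answer.encode("utf-8"))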

Also check your indexing in

question, answer, question_bert, answer_bert = row[0], row[1] , row[1], row[2]

I think it should be 0, 1, 2 and 3.
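That is, assuming the CSV columns really are in the order question, answer, question_bert, answer_bert, the unpacking would look like:

question, answer, question_bert, answer_bert = row[0], row[1], row[2], row[3]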

Even after correcting the ordering, you still get an error, so add

print(type(question_bert))

and it says it is a string. If it really is a string, then you need to change

float_list.value.append

to

bytes_list.value.extend
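For example, a sketch of that change (assuming question_bert and answer_bert really are plain strings read from the CSV and you just want to store them as-is):

example.features.feature["question_bert"].bytes_list.value.extend([question_bert.encode("utf-8")])
example.features.feature["answer_bert"].bytes_list.value.extend([answer_bert.encode("utf-8")])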

If you have an array, then you need to use

tf.serialize_tensor

Here is a simple example of tf.serialize_tensor

a = np.array([[1.0, 2, 46], [0, 0, 1]])
b = tf.serialize_tensor(a)
b

Output is

<tf.Tensor: id=25, shape=(), dtype=string, numpy=b'\x08\x02\x12\x08\x12\x02\x08\x02\x12\x02\x08\x03"0\x00\x00\x00\x00\x00\x00\xf0?\x00\x00\x00\x00\x00\x00\x00@\x00\x00\x00\x00\x00\x00G@\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\xf0?'>

You then need to save those serialized bytes into the Example as a bytes feature.
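A minimal sketch of that last step, assuming eager execution is enabled (as the output above suggests) and that the embedding has already been converted into a NumPy float array:

import numpy as np
import tensorflow as tf

question_bert = np.array([1.0, 2.0, 46.0])       # placeholder embedding, float64 by default
serialized = tf.serialize_tensor(question_bert)  # scalar string (bytes) tensor

example = tf.train.Example()
example.features.feature["question_bert"].bytes_list.value.append(serialized.numpy())

# When reading the record back, the stored bytes can be decoded with
# tf.parse_tensor(raw_bytes, out_type=tf.float64).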
