
I'm trying to use the logging library in Python to write lists of strings to a CSV log file.

Logger creation:

import logging

export_logger = logging.getLogger('exportLogger')
export_logger.setLevel(logging.INFO)
file_handler = logging.FileHandler('export_log.csv', mode='w')
export_logger.addHandler(file_handler)
export_logger.info(",".join(['status', 'view', 'filename', 'stdout', 'stderr', 'time']))

Logging:

column_list = [status, view, filename, out, err, current_time]
message = ",".join([str(item) for item in column_list])
export_logger.info(message)

My problem is that if any of the strings contain a new line character or comma, it breaks the output. I could surround them in quotes, but then it would break if they contain quotes. I can escape those, but I'd rather not rewrite code for parsing all the edge cases of CSV file writing. Is there a good way to handle this?

Is there a way to easily sanitize strings for writing to CSVs? I could do this: How do I write data into csv format as string (not file)? But that seems very roundabout and unreadable.
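For reference, the approach from that linked question boils down to writing each row through csv.writer into an in-memory buffer and handing the resulting line to the logger. Wrapped in a small helper it reads less roundabout (to_csv_line is a hypothetical name, not anything from the csv or logging libraries):

```python
import csv
import io

def to_csv_line(fields):
    """Return one CSV-escaped line for a list of fields (hypothetical helper)."""
    buf = io.StringIO()
    csv.writer(buf).writerow(fields)
    # writerow appends '\r\n'; strip it since logging adds its own newline
    return buf.getvalue().rstrip('\r\n')

print(to_csv_line(['ok', 'a,b', 'say "hi"']))  # → ok,"a,b","say ""hi"""
```

This gets correct quoting for commas, quotes, and newlines without hand-rolled escaping, at the cost of allocating a StringIO per row.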

I'm only using the logging library because I thought that was best practice, but if it can't handle CSV formatting I may as well just open a module level CSV file and write lines to it using the python csv library, right?

  • Unless you have a very compelling reason to use logging for this, you should be using csv. Commented Oct 30, 2018 at 19:08
  • @CertainPerformance I thought I did have a solution but then realized I didn't and undeleted the question again haha. I thought the solution in my linked article worked because I modified it to keep arrays as arrays, but then found it wasn't even a deep copy at all. screencast.com/t/ZYoHERJt6v Commented Nov 13, 2018 at 3:22

2 Answers


the logging library is mostly for ad-hoc runtime/diagnostic/debug output

intended/expected output (which it seems like you're after) should be handled more directly; in your case I'd suggest directly opening the output file, wrapping it in a csv.writer, and then calling writerow as needed

for example:

import csv

# newline='' is required by the csv docs so embedded newlines survive intact
output = csv.writer(open('export_log.csv', 'w', newline=''))
output.writerow(['status', 'view', 'filename', 'stdout', 'stderr', 'time'])

for foo in bar:
    # do work
    output.writerow([status, view, filename, out, err, current_time])

note that file objects are also context managers, and hence it might make sense to do:

with open('export_log.csv', 'w', newline='') as fd:
    output = csv.writer(fd)
    …

if you want to make sure the file gets closed appropriately.
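As a quick sanity check (not part of the original answer), csv.writer really does handle all the tricky fields from the question: commas, embedded quotes, and newlines all round-trip cleanly through a write/read cycle, shown here with an in-memory buffer:

```python
import csv
import io

# A row containing every character class the question worries about
row = ['ok', 'view,1', 'file "x".txt', 'line1\nline2', '', '12:00']

buf = io.StringIO()
csv.writer(buf).writerow(row)

# Reading it back recovers the original fields unchanged
parsed = next(csv.reader(io.StringIO(buf.getvalue())))
assert parsed == row
```

The same guarantee holds for real files as long as they are opened with newline='', which is why the snippets above pass it to open().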



If you want a dedicated logger class for a large application, you can create separate objects to log to multiple separate files. I'd suggest using the pandas library here, since it makes appending rows convenient.

import pandas as pd

class csvLogger:
    def __init__(self, filename):
        self.count = 1
        self.filename = filename

    def __call__(self, named_dict):
        df = pd.DataFrame(named_dict, index=[self.count])
        # write the header only on the first call, then append rows
        df.to_csv("./results/{}.csv".format(self.filename), mode='a',
                  header=self.count == 1)
        self.count += 1

As you can see, the class appends each new entry as a new row in the CSV file. You can take a hint from the code snippet below.

mycsvLogger = csvLogger("file")  # the class adds the ".csv" extension itself
data = {"x": 1, "y": 2}
mycsvLogger(data)

That should serve as a general pattern for many advanced CSV-logging applications. Enjoy :)

