
I have JSON of the form:

{"abc":
  {
    "123":[45600],
    "378":[78689],
    "343":[23456]
  }
}

I need to convert JSON in the above format to a CSV file in R.

CSV format :

 ds      y
123  45600
378  78689
343  23456

I'm using R library rjson to do so. I'm doing something like this:

jsonFile <- fromJSON(file=fileName)
json_data_frame <- as.data.frame(jsonFile)

but it's not producing the format I need.

3 Answers


You can use jsonlite::fromJSON to read the data into a list, though you'll need to pull it apart to assemble it into a data.frame:

abc <- jsonlite::fromJSON('{"abc":
{
    "123":[45600],
    "378":[78689],
    "343":[23456]
    }
}')


abc <- data.frame(ds = names(abc[[1]]), 
                  y = unlist(abc[[1]]), stringsAsFactors = FALSE)

abc
#>      ds     y
#> 123 123 45600
#> 378 378 78689
#> 343 343 23456

6 Comments

How do I save it into a CSV?
write.csv or readr::write_csv or data.table::fwrite etc.
I did it using write.csv(), but with the code you gave, the ds values appear twice in the output (123, 378, etc.). How do I remove the duplicates?
The default for write.csv is to add the row names. Tell it not to: write.csv(abc, row.names = FALSE)
Or stack(fromJSON(txt)[[1]])
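Putting the answer and the comments together, a minimal end-to-end sketch (the output file name out.csv is just an example):

```r
library(jsonlite)

txt <- '{"abc": {"123":[45600], "378":[78689], "343":[23456]}}'
parsed <- fromJSON(txt)[[1]]   # named list: "123" -> 45600, ...

df <- data.frame(ds = names(parsed),
                 y  = unname(unlist(parsed)),
                 stringsAsFactors = FALSE)

# row.names = FALSE stops the ds values from being written twice
write.csv(df, "out.csv", row.names = FALSE)
```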

I believe you got the JSON file reader, the fromJSON function, right.

df <- data.frame( do.call(rbind, rjson::fromJSON( '{"a":true, "b":false, "c":null}' )) )

1 Comment

Where do I specify the JSON file name?
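To answer the comment: rjson::fromJSON takes the path through its file argument. A minimal sketch, writing a sample file first so it is self-contained:

```r
library(rjson)

# create a sample file; in practice point `file` at your own JSON
writeLines('{"abc": {"123":[45600], "378":[78689], "343":[23456]}}',
           "data.json")

parsed <- fromJSON(file = "data.json")
df <- data.frame(ds = names(parsed$abc),
                 y  = unlist(parsed$abc),
                 stringsAsFactors = FALSE)
```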

The code below parses Google's Location History (JSON) archive from https://takeout.google.com. This applies if you have enabled a 'Timeline' (location tracking) in Google Maps on your phone. Credit to http://rpubs.com/jsmanij/131030 for the original code. Note that JSON files like this can be quite large, and plyr::llply is much more efficient than lapply for parsing a list. data.table gives me the more efficient rbindlist to take the list to a data.table. Google logs between 350 and 800 GPS calls each day for me! A multi-year location history is converted to quite a sizeable list by fromJSON:

format(object.size(doc1),units="MB")
[1] "962.5 Mb"

I found 'do.call(rbind, ...)' unoptimized for this. The timestamp, lat, and long needed some work to be useful to Google Earth Pro, but I am getting carried away. At the end, write.csv takes the data.table to CSV, which is all the original OP wanted here.

              ts       lat        long latitude longitude
1: 1416680531900 487716717 -1224893214 48.77167 -122.4893
2: 1416680591911 487716757 -1224892938 48.77168 -122.4893
3: 1416680668812 487716933 -1224893231 48.77169 -122.4893
4: 1416680728947 487716468 -1224893275 48.77165 -122.4893
5: 1416680791884 487716554 -1224893232 48.77166 -122.4893

library(data.table)
library(rjson)
library(plyr)

doc1 <- fromJSON(file="LocationHistory.json", method="C")
object.size(doc1)

timestamp <- function(x) {as.list(x$timestampMs)}
timestamps <- as.list(plyr::llply(doc1$locations,timestamp))
timestamps <- rbindlist(timestamps)

latitude <- function(x) {as.list(x$latitudeE7)}
latitudes <- as.list(plyr::llply(doc1$locations,latitude))
latitudes <- rbindlist(latitudes)

longitude <- function(x) {as.list(x$longitudeE7)}
longitudes <- as.list(plyr::llply(doc1$locations,longitude))
longitudes <- rbindlist(longitudes)

datageoms <- setnames(cbind(timestamps, latitudes, longitudes), c("ts", "lat", "long"))[order(ts)]
write.csv(datageoms,"datageoms.csv",row.names=FALSE)
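The conversion work mentioned above can be sketched as follows. In this export, the E7 fields are degrees scaled by 1e7 and timestampMs is milliseconds since the Unix epoch, so dividing recovers usable values. The rows here are made up (taken from the printed sample above) to keep the example self-contained:

```r
library(data.table)

# toy rows in the shape produced by the pipeline above
datageoms <- data.table(
  ts   = c("1416680531900", "1416680591911"),
  lat  = c(487716717, 487716757),
  long = c(-1224893214, -1224892938)
)

# E7 integers are degrees * 1e7; timestamps are milliseconds since epoch
datageoms[, latitude  := lat  / 1e7]
datageoms[, longitude := long / 1e7]
datageoms[, when := as.POSIXct(as.numeric(ts) / 1000,
                               origin = "1970-01-01", tz = "UTC")]
```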

