JSON to pandas DataFrame
What I am trying to do is extract elevation data from the Google Maps API along a path specified by latitude and longitude coordinates, as follows:
from urllib2 import Request, urlopen
import json
path1 = '42.974049,-81.205203|42.974298,-81.195755'
request=Request('http://maps.googleapis.com/maps/api/elevation/json?locations='+path1+'&sensor=false')
response = urlopen(request)
elevations = response.read()
This gives me data that looks like this:
elevations.splitlines()
['{',
' "results" : [',
' {',
' "elevation" : 243.3462677001953,',
' "location" : {',
' "lat" : 42.974049,',
' "lng" : -81.205203',
' },',
' "resolution" : 19.08790397644043',
' },',
' {',
' "elevation" : 244.1318664550781,',
' "location" : {',
' "lat" : 42.974298,',
' "lng" : -81.19575500000001',
' },',
' "resolution" : 19.08790397644043',
' }',
' ],',
' "status" : "OK"',
'}']
When I put it into a DataFrame, here is what I get:
pd.read_json(elevations)
And here is what I want:
I'm not sure if this is possible, but mainly what I am looking for is a way to put the elevation, latitude, and longitude data together in a pandas DataFrame (it doesn't have to have fancy multiline headers).
If anyone can help or give some advice on working with this data, that would be great! If you can't tell, I haven't worked much with JSON data before...
EDIT:
This method isn't all that attractive but seems to work:
import pandas as pd

data = json.loads(elevations)
lat, lng, el = [], [], []
for result in data['results']:
    lat.append(result['location']['lat'])
    lng.append(result['location']['lng'])
    el.append(result['elevation'])
df = pd.DataFrame([lat, lng, el]).T
This ends up as a DataFrame with columns for latitude, longitude, and elevation.
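For what it's worth, a slightly tidier sketch of the same idea (assuming data is the parsed dict from above): build a list of flat dicts and hand it straight to the DataFrame constructor, which also keeps the column names:

import pandas as pd

rows = [
    {'lat': r['location']['lat'],
     'lng': r['location']['lng'],
     'elevation': r['elevation']}
    for r in data['results']
]
df = pd.DataFrame(rows, columns=['lat', 'lng', 'elevation'])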
python json google-maps pandas
asked Jan 14 '14 at 1:32 by pbreach, edited Jan 14 '14 at 5:16
6 Answers
I found a quick and easy solution to what I wanted using the json_normalize function included in pandas 0.13.
from urllib2 import Request, urlopen
import json
from pandas.io.json import json_normalize
path1 = '42.974049,-81.205203|42.974298,-81.195755'
request=Request('http://maps.googleapis.com/maps/api/elevation/json?locations='+path1+'&sensor=false')
response = urlopen(request)
elevations = response.read()
data = json.loads(elevations)
json_normalize(data['results'])
This gives a nice flattened DataFrame with the JSON data that I got from the Google Maps API.
This no longer seems to work — I had to use pd.DataFrame.from_records(), as described here: stackoverflow.com/a/33020669/1137803
– avv
Mar 5 '17 at 20:15
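A minimal sketch of the from_records fallback that comment describes (assuming data is the parsed response from the answer above); note that from_records alone leaves the nested location dict as a single column, so it is expanded by hand here:

import pandas as pd

df = pd.DataFrame.from_records(data['results'])
# 'location' is still a dict column; expand it into lat/lng columns
df[['lat', 'lng']] = df['location'].apply(pd.Series)
df = df.drop(columns='location')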
from_records also doesn't work at times if the JSON is sufficiently complex; you have to apply pandas.io.json.json_normalize to get a flat map. Check out stackoverflow.com/questions/39899005/…
– devssh
May 28 '18 at 9:27
answered Jan 21 '14 at 18:17 by pbreach
Check this snippet out.
import json
import pandas as pd

# read the JSON data using json.load()
file = 'data.json'
with open(file) as train_file:
    dict_train = json.load(train_file)

# convert the JSON dataset from a dictionary to a DataFrame
train = pd.DataFrame.from_dict(dict_train, orient='index')
train.reset_index(level=0, inplace=True)
Hope it helps :)
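For context, a toy illustration (with made-up data, not from the answer) of what orient='index' does: each top-level key of the dictionary becomes a row, and reset_index turns those keys back into an ordinary column:

import pandas as pd

d = {'rec1': {'lat': 42.97, 'lng': -81.20},
     'rec2': {'lat': 42.98, 'lng': -81.19}}
train = pd.DataFrame.from_dict(d, orient='index')
train.reset_index(level=0, inplace=True)
# columns: index ('rec1', 'rec2'), lat, lng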
Doesn't work under pandas 0.20
– Aerin
Oct 29 '18 at 23:31
your_df = pd.read_json(path_or_buf = 'filename.json')
– anu
Nov 12 '18 at 19:59
answered Jun 17 '17 at 17:04 by Rishu, edited Jul 19 '17 at 13:17 by Brian Burns
You could first load your JSON data into a Python dictionary:
data = json.loads(elevations)
Then modify data on the fly:
for result in data['results']:
    result['lat'] = result['location']['lat']
    result['lng'] = result['location']['lng']
    del result['location']
Rebuild the JSON string:
elevations = json.dumps(data)
Finally:
pd.read_json(elevations)
You can also probably avoid dumping the data back to a string; I assume pandas can create a DataFrame directly from a dictionary (I haven't used it in a long time :p)
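Indeed, a minimal sketch of that shortcut (assuming the loop above has already flattened each result): the list of flat dicts can be handed straight to the DataFrame constructor, skipping json.dumps and pd.read_json entirely:

import pandas as pd

# after the loop above, each result is a flat dict with
# 'elevation', 'resolution', 'lat' and 'lng' keys
df = pd.DataFrame(data['results'])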
I still end up with the same result using the JSON data and the dictionary that was created. It seems like each element in the dataframe has its own dict. I tried using your approach in a less attractive way, building a separate list for lat, lng, and elevation while iterating through 'data'.
– pbreach
Jan 14 '14 at 5:12
@user2593236: Hello, I made an error while copy/pasting my code into SO: a del was missing (answer edited)
– Raphaël Braud
Jan 14 '14 at 8:37
Hmm.. Still the same thing, where it has 'results' and 'status' as headers while the rest of the JSON data appears as dicts in each cell. I think the solution to this problem would be to change the format of the data so that it is not subdivided into 'results' and 'status'; then the data frame will use 'lat', 'lng', 'elevation', and 'resolution' as the separate headers. Either that, or I will need to find a way to load the JSON data into a dataframe that will have a multilevel header index, as I mentioned in the question.
– pbreach
Jan 14 '14 at 18:47
Which final table do you expect? The one you got after your edit?
– Raphaël Braud
Jan 14 '14 at 22:51
The one I got after my final edit does the job; basically, all I needed was to get the data into a tabular format that I can export and work with.
– pbreach
Jan 16 '14 at 16:48
answered Jan 14 '14 at 2:19 by Raphaël Braud, edited Jan 14 '14 at 8:36
The problem is that you have several columns in the data frame that contain dicts with smaller dicts inside them. Useful JSON is often heavily nested. I have been writing small functions that pull the info I want out into a new column. That way I have it in the format that I want to use.
for row in range(len(data)):
    # First I load the dict (one at a time)
    n = data.loc[row, 'dict_column']
    # Now I make a new column that pulls out the data that I want.
    data.loc[row, 'new_column'] = n.get('key')
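A minimal vectorised sketch of the same idea (the names 'dict_column', 'new_column', and 'key' are the placeholders from the answer, not real data): Series.apply does the per-row dict lookup without an explicit loop:

# pull one key out of every dict in the column at once
data['new_column'] = data['dict_column'].apply(lambda d: d.get('key'))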
answered Oct 20 '14 at 4:54 by billmanH, edited Oct 20 '14 at 5:10
billmanH's solution helped me but didn't work until I switched from:
n = data.loc[row,'json_column']
to:
n = data.iloc[[row]]['json_column']
Here's the rest of it; converting to a dictionary is helpful for working with JSON data.
import json

for row in range(len(data)):
    n = data.iloc[[row]]['json_column'].item()
    jsonDict = json.loads(n)
    if 'mykey' in jsonDict:
        # display() assumes a Jupyter/IPython environment
        display(jsonDict['mykey'])
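A minimal column-wise sketch of the same parsing step (again using the hypothetical 'json_column' and 'mykey' names from the answer), parsing every row at once instead of looping:

import json

parsed = data['json_column'].apply(json.loads)
mykey_values = parsed.apply(lambda d: d.get('mykey'))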
answered Dec 11 '18 at 22:14 by niltoid
Just a new version of the accepted answer, as Python 3.x does not support urllib2:
from requests import request
from pandas.io.json import json_normalize

path1 = '42.974049,-81.205203|42.974298,-81.195755'
response = request(url='http://maps.googleapis.com/maps/api/elevation/json?locations=' + path1 + '&sensor=false',
                   method='get')
# response.json() already parses the body into a dict, so no json.loads is needed
data = response.json()
json_normalize(data['results'])
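As a side note (not part of the original answer): in pandas 1.0 and later, json_normalize is exposed on the top-level namespace, so the import from pandas.io.json can be dropped:

import pandas as pd

df = pd.json_normalize(data['results'])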
Might be better to just edit the accepted answer...
– Christian Hudon
yesterday
answered Feb 15 at 16:40 by AB Abhi