From: John H. <jd...@gm...> - 2011-07-01 18:26:54

On Fri, Jul 1, 2011 at 11:14 AM, Hackett, John (Norcross, GA) <Joh...@un...> wrote:
> After some experimentation (and judicious peeking at the source code), I
> think I've got the hang of writing custom functions to pass into these
> modules - basically, anything that accepts a list of values sliced from a
> single column on the structured array and returns a single list seems to
> work well. In functional programming terms, rec_summarize appears similar
> to "map", rec_groupby appears similar to "reduce".
>
> Now - what if I want to derive a calculation from multiple statistics in
> the original dataset - e.g. create a new column on the array which is
> derived from 2 (or up to n) other fields in a custom function which I pass
> into the process?
>
> For example, conditional counts/summaries (count transactions and sum the
> sales on all orders that weighed > 5K lbs).
>
> Is there a way to do this within numpy or mlab without going all the way
> out to Python and creating a list comprehension?

There are a couple of ways with the existing functions. One is to use a
logical mask::

    mask = r.weight > 5
    rg = mlab.rec_groupby(r[mask], groupby, stats)

You could also create a new categorical variable with one or more values::

    heavy = np.where(r.weight > 5, 1, 0)

attach it to your record array::

    r = mlab.rec_append_fields(r, ['heavy'], [heavy])

and then do a rec_groupby using 'heavy' as your group-by attribute.

Brian Schwartz has a preliminary implementation of rec_query which allows you
to make a SQL query on a record array by converting it to a sqlite table,
running the SQL query, and returning the results as a new record array, which
would solve your problem more cleanly and generically. The code needs a little
more polishing, but perhaps, Brian, you can send over what you have in case
John wants to take a look.

JDH
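
For illustration, a minimal sketch of the two approaches above. The record
array, its 'region', 'weight' (thousands of lbs) and 'sales' fields, and the
threshold of 5 are made up for this example; mlab.rec_groupby and
mlab.rec_append_fields are used as they existed in matplotlib.mlab at the
time::

    import numpy as np
    from matplotlib import mlab

    # Hypothetical transaction data: region, weight (thousands of lbs), sales.
    r = np.rec.fromrecords(
        [('east', 4.0, 10.), ('east', 6.5, 20.),
         ('west', 7.2, 30.), ('west', 3.1, 5.)],
        names=['region', 'weight', 'sales'])

    # Each stats entry is (source column, aggregation function, output name).
    stats = [('sales', len, 'n_orders'), ('sales', np.sum, 'total_sales')]

    # Approach 1: filter with a boolean mask, then group.
    mask = r.weight > 5
    print(mlab.rec_groupby(r[mask], ['region'], stats))

    # Approach 2: attach a categorical 'heavy' flag and group on it as well.
    heavy = np.where(r.weight > 5, 1, 0)
    r2 = mlab.rec_append_fields(r, ['heavy'], [heavy])
    print(mlab.rec_groupby(r2, ['region', 'heavy'], stats))
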
From: Hackett, J. (N. GA) <Joh...@un...> - 2011-07-01 16:15:13

Good morning - got a question for an mlab module guru.

After some experimentation (and judicious peeking at the source code), I think
I've got the hang of writing custom functions to pass into these modules -
basically, anything that accepts a list of values sliced from a single column
on the structured array and returns a single list seems to work well. In
functional programming terms, rec_summarize appears similar to "map",
rec_groupby appears similar to "reduce".

Now - what if I want to derive a calculation from multiple statistics in the
original dataset - e.g. create a new column on the array which is derived from
2 (or up to n) other fields in a custom function which I pass into the
process?

For example, conditional counts/summaries (count transactions and sum the
sales on all orders that weighed > 5K lbs).

Is there a way to do this within numpy or mlab without going all the way out
to Python and creating a list comprehension?

Thanks.
John
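
For illustration, a small sketch of the single-column contract described
above, with hypothetical field names and functions: a rec_summarize function
receives a whole column and returns an array of the same length, while a
rec_groupby statistic receives the column values of one group and returns a
single value::

    import numpy as np
    from matplotlib import mlab

    # Hypothetical data: a grouping key and one numeric column.
    r = np.rec.fromrecords(
        [('east', 4000.), ('east', 6500.), ('west', 7200.)],
        names=['region', 'weight'])

    # For rec_summarize: takes the full column, returns a same-length array,
    # which is appended to the record array as a new column.
    def to_tons(col):
        return col / 2000.0

    r2 = mlab.rec_summarize(r, [('weight', to_tons, 'weight_tons')])

    # For rec_groupby: takes the column values of one group, returns a scalar.
    def spread(col):
        return col.max() - col.min()

    print(mlab.rec_groupby(r2, ['region'],
                           [('weight', spread, 'weight_spread')]))
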
From: marz_cyclone <me...@me...> - 2011-07-01 14:52:38

Hi all,

I'm trying to get the bounding box of a map plotted with Basemap so I can
place a colorbar. In this reduced example from the Basemap examples directory,
the colorbar is set to the height of the axes of the plot:
from mpl_toolkits.basemap import Basemap, shiftgrid
import numpy as np
import matplotlib.pyplot as plt
topoin = np.loadtxt('etopo20data.gz')
lons = np.loadtxt('etopo20lons.gz')
lats = np.loadtxt('etopo20lats.gz')
topoin,lons = shiftgrid(180.,topoin,lons,start=False)
m = Basemap(llcrnrlon=-57,llcrnrlat=37,urcrnrlon=-3,urcrnrlat=60,
resolution='i',projection='tmerc',lon_0=-41,lat_0=45)
# transform to nx x ny regularly spaced native projection grid
nx = int((m.xmax-m.xmin)/40000.)+1; ny = int((m.ymax-m.ymin)/40000.)+1
topodat,x,y = m.transform_scalar(topoin,lons,lats,nx,ny,returnxy=True)
# create the figure.
fig=plt.figure(figsize=(8,8))
# add an axes, leaving room for colorbar on the right.
ax = fig.add_axes([0.1,0.1,0.7,0.7])
# associate this axes with the Basemap instance.
m.ax = ax
# plot image over map with imshow.
im = m.imshow(topodat,plt.cm.jet)
# setup colorbar axes instance.
pos = ax.get_position()
l, b, w, h = pos.bounds
cax = plt.axes([l+w+0.075, b, 0.05, h])
plt.colorbar(im,cax=cax) # draw colorbar
# display the figure.
plt.show()
What I'm interested in is the bounding box of m, the real plotting area, so I
can scale the colorbar axes to that height or width. Is there a way to get
this?

Thanks in advance,
Mario
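
For reference, one way to get at the drawn (aspect-adjusted) box is to force a
draw and then read the axes' active position. The following is a
self-contained sketch of the idea, using a plain fixed-aspect imshow in place
of the Basemap example above; the sizes and positions are arbitrary, and the
same recipe should apply to the map axes, since Basemap fixes the axes
aspect::

    import numpy as np
    import matplotlib.pyplot as plt

    fig = plt.figure(figsize=(8, 8))
    ax = fig.add_axes([0.1, 0.1, 0.7, 0.7])
    # A wide fixed-aspect image stands in for the map: the drawn axes box
    # shrinks vertically to honor the aspect ratio, just as the map does.
    im = ax.imshow(np.random.rand(20, 60))

    # Force a layout pass, then read the active (aspect-adjusted) position.
    fig.canvas.draw()
    l, b, w, h = ax.get_position().bounds

    # Colorbar axes matching the displayed height of the image/map.
    cax = fig.add_axes([l + w + 0.02, b, 0.03, h])
    fig.colorbar(im, cax=cax)
    plt.show()

Alternatively, mpl_toolkits.axes_grid1.make_axes_locatable can append a
colorbar axes that tracks the main axes' drawn size automatically.
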
From: <han...@ar...> - 2011-07-01 08:23:23

Hi,

thanks for the very quick response & fix. We were surprised, too, that we
hadn't found more about this problem. We were put on the right track by this
related report: http://bugs.debian.org/cgi-bin/bugreport.cgi?bug=395867 -
thanks also for the link to the other report. I'll see if I can come up with a
nice example and submit it there.

Cheers
HB

> On 06/30/2011 01:10 PM, Michael Droettboom wrote:
> > I'm surprised this bug (which really lies in Tkinter) isn't more widely
> > known -- searching the Python bug tracker revealed nothing. It would be
> > great to follow up there (with a standalone Tkinter-crashing example) if
> > you're so inclined.
> I did find this bug, which seems to be related.
>
> http://bugs.python.org/issue10647
>
> Mike