From: Paul H. <pmh...@gm...> - 2013-03-06 22:18:02
On Wed, Mar 6, 2013 at 2:00 PM, Clifford Lyon <cli...@gm...> wrote:

> I wish to make a boxplot with data in this format:
>
> Value, Frequency
> 0, 128329
> 1, 20390
> 2, 230
> 3, 32
> 4, 3
>
> etc. Rather than expand this into a flat array, is there some way to pass
> in weights for values? Some of the frequencies I'm working with are very
> large, and so the resulting arrays would be huge. AFAIK, all the summary
> statistics I need for the plot can be computed from data in this form.

Boxplot, as it currently stands, wants the raw data. Some recently added features allow you to manually specify the median and its confidence intervals, but nothing else. I've been meaning to submit a PR for boxplot where it's split into a public method and a private drawing function that just takes a dictionary of the values (R does this, IIRC). That wouldn't directly help you in this situation, but you'd be one step closer.

-paul
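For completeness: later matplotlib releases added Axes.bxp(), which draws a box plot from precomputed statistics, so the flat array never has to be built. Below is a minimal sketch along those lines; it assumes simple nearest-rank quantiles and a simplified whisker rule, and the weighted_quantile helper is illustrative, not part of matplotlib.

import numpy as np
import matplotlib.pyplot as plt

# (value, frequency) pairs as in the question
values = np.array([0, 1, 2, 3, 4])
freqs = np.array([128329, 20390, 230, 32, 3])

def weighted_quantile(values, freqs, q):
    # nearest-rank quantile of data given as (value, frequency) pairs
    order = np.argsort(values)
    v, f = values[order], freqs[order]
    cum = np.cumsum(f)
    return v[np.searchsorted(cum, q * cum[-1])]

q1, med, q3 = (weighted_quantile(values, freqs, q) for q in (0.25, 0.5, 0.75))
iqr = q3 - q1
stats = [{"med": med, "q1": q1, "q3": q3,
          "whislo": max(values.min(), q1 - 1.5 * iqr),  # simplified whiskers
          "whishi": min(values.max(), q3 + 1.5 * iqr),
          "fliers": [],                                 # outliers omitted here
          "label": "weighted"}]

fig, ax = plt.subplots()
ax.bxp(stats, showfliers=False)
plt.show()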
From: Clifford L. <cli...@gm...> - 2013-03-06 22:01:12
I wish to make a boxplot with data in this format:

Value, Frequency
0, 128329
1, 20390
2, 230
3, 32
4, 3

etc. Rather than expand this into a flat array, is there some way to pass in weights for values? Some of the frequencies I'm working with are very large, and so the resulting arrays would be huge. AFAIK, all the summary statistics I need for the plot can be computed from data in this form.

Thanks!
From: Wayne H. <Way...@df...> - 2013-03-06 20:00:28
I am trying to use stacked and normed at the same time. I have thought about it, and I think it does make sense for what I want to do.

Below is some code and a figure that demonstrate the problem. There are two histograms. Both use the same data. Both are stacked. The top histogram uses normed=False. The bottom uses normed=True. Ideally, the histograms would be identical except for the scaling on the y-axis.

The histogram with normed=False looks OK. The left half of the normed=True histogram looks OK, but about half-way through, the second distribution moves from the top of the bars to the bottom. At least that's the way it looks from eyeballing.

Does anybody have a suggestion about how to get around this problem? Should I be reporting a bug somewhere?

Thanks,
Wayne Hajas

==================
from numpy.random import normal, seed
import matplotlib.pyplot as plt

seed(100)
x = normal(loc=0, scale=1, size=1000)
y = normal(loc=1, scale=1, size=1000)
bins = map(lambda i: -2. + 5. * float(i) / float(50), range(51))

plt.close()
plt.subplot(211)
plt.hist([x, y], bins=bins, alpha=0.5, stacked=True, normed=False)
plt.subplot(212)
plt.hist([x, y], bins=bins, alpha=0.5, stacked=True, normed=True)
plt.show()

<http://matplotlib.1069221.n5.nabble.com/file/n40552/StackedProblem.png>

--
View this message in context: http://matplotlib.1069221.n5.nabble.com/conflict-between-stacked-and-normed-in-hist-tp40552.html
Sent from the matplotlib - users mailing list archive at Nabble.com.
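Until the stacked/normed interaction in hist() is sorted out, one possible workaround is to compute the stacked, density-normalized counts with numpy and draw them with bar(), bypassing hist() entirely. This is only a sketch under the assumption that a stacked density should integrate to 1 over the binned range when both samples are combined; it reuses the data from the example above.

import numpy as np
import matplotlib.pyplot as plt

rng = np.random.RandomState(100)
x = rng.normal(loc=0, scale=1, size=1000)
y = rng.normal(loc=1, scale=1, size=1000)
bins = np.linspace(-2.0, 3.0, 51)
widths = np.diff(bins)

counts_x, _ = np.histogram(x, bins=bins)
counts_y, _ = np.histogram(y, bins=bins)
norm = (counts_x.sum() + counts_y.sum()) * widths  # combined stack integrates to 1
dens_x = counts_x / norm
dens_y = counts_y / norm

fig, ax = plt.subplots()
ax.bar(bins[:-1], dens_x, width=widths, align="edge", alpha=0.5)
ax.bar(bins[:-1], dens_y, width=widths, align="edge", alpha=0.5,
       bottom=dens_x)  # stack y on top of x
plt.show()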
From: Benjamin R. <ben...@ou...> - 2013-03-06 14:38:49
On Tue, Mar 5, 2013 at 5:33 PM, Mahe <mah...@gm...> wrote:

> Benjamin Root <ben.root@...> writes:
>
> > On Tue, Feb 1, 2011 at 11:09 AM, Francesco Benincasa
> > <francesco.benincasa-DuY...@pu...> wrote:
> >
> > > Hi all,
> > > I'm using pygrads for plotting maps from netcdf files.
> > > I use the contourf method, but I'm not able to fill the region where
> > > there are no values (there is the missing value -999) with a color.
> > > It seems to ignore the set_bad method that I used to make the colormap.
> > > Any suggestions?
> > > Thank you very much in advance.
> > > --
> > > | Francesco Benincasa
> >
> > Most likely, the issue is that set_bad is more for setting the color
> > when encountering masked values (through masked arrays). As a quick and
> > dirty way to deal with it, try setting that color through the
> > set_under() method. The correct way to do this is to use set_bad, but
> > convert the numpy array you are displaying into a masked array like so:
> >
> > z_ma = np.ma.masked_array(z, mask=(z == -999))
> >
> > and use contourf on z_ma. Let us know how that works for you.
> >
> > Ben Root
>
> Hi!
> I have had the same issue (set_bad not taking effect with nans), and
> transformed the data into a masked array. But it does not seem to work...
> Here is a minimal example:
>
> import matplotlib.pyplot as plt
> import numpy as np
>
> plt.clf()
> x = np.linspace(-180, 180, 100)
> y = np.linspace(-90, 90, 100)
> x, y = np.meshgrid(x, y)
> data = np.cos(x/180*np.pi) + np.sin(y/180*np.pi)
> data[(y<50)&(y>30)&(x<50)&(x>30)] = np.nan
> data = np.ma.masked_array(data, mask=np.isnan(data))  # has no effect
> ncol = 20
> cbar = [-1, 1]
> palette = plt.cm.Blues
> palette.set_bad('green')
> palette.set_over('red')
> palette.set_under('black')
> cs = plt.contourf(x, y, data, np.linspace(cbar[0], cbar[1], ncol),
>                   cmap=palette, extend='both')
> plt.colorbar()
> cs.set_clim(cbar)  # needed for set_over and set_under to take effect
> plt.show()
>
> There is already that small bug where one needs to call set_clim for
> set_over and set_under; maybe something similar is needed for set_bad?
> Any idea?
>
> Many thanks,
> Mahe

Your problem is very, very subtle, and we probably should handle this better. The issue is, I think, that because of the way contourf works, the colormap is applied to the list of polygon (or patch) collections, each having a value for its level. Because there wouldn't be a "nan" level, there is no polygon or patch at all for that spot. Indeed, if you change the background color of the plot, the white patch becomes whatever color the background is.

I hope this clears it up for you.

Ben Root
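Building on that explanation, one workaround is simply to paint the axes background in the color intended for bad data, since contourf draws no polygon over the masked region and lets the background show through. This is a sketch of that idea, assuming a matplotlib version where Axes.set_facecolor() exists (older releases spell it set_axis_bgcolor()); it reuses the data from the minimal example above.

import numpy as np
import matplotlib.pyplot as plt

x = np.linspace(-180, 180, 100)
y = np.linspace(-90, 90, 100)
x, y = np.meshgrid(x, y)
data = np.cos(x / 180 * np.pi) + np.sin(y / 180 * np.pi)
data[(y < 50) & (y > 30) & (x < 50) & (x > 30)] = np.nan
data = np.ma.masked_invalid(data)

fig, ax = plt.subplots()
ax.set_facecolor("green")  # shows through wherever contourf drew no polygon
cs = ax.contourf(x, y, data, np.linspace(-1, 1, 20),
                 cmap=plt.cm.Blues, extend="both")
fig.colorbar(cs, ax=ax)
plt.show()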