From: Jody K. <jk...@uv...> - 2014-09-08 15:59:33
|
It looks like you are calling `pcolor`. Can I suggest you try `pcolormesh`?
75 MB is not a big file!
Cheers, Jody
--
Jody Klymak
http://web.uvic.ca/~jklymak/
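Jody's suggested swap can be sketched as follows (synthetic data stands in for the poster's NetCDF field; pcolormesh builds a single QuadMesh artist, whereas pcolor creates one Polygon per grid cell):

```python
# Minimal sketch of the pcolor -> pcolormesh swap suggested above.
# The data here is synthetic; the original poster's field comes from NetCDF.
import matplotlib
matplotlib.use("Agg")            # render off-screen; no display required
import matplotlib.pyplot as plt
import numpy as np

data = np.random.rand(500, 500)  # stand-in for np.squeeze(t)

fig, ax = plt.subplots()
mesh = ax.pcolormesh(data)       # drop-in replacement for ax.pcolor(data)
fig.colorbar(mesh, ax=ax)
fig.savefig("field.png", dpi=100)
```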
|
|
From: Benjamin R. <ben...@ou...> - 2014-09-08 14:38:32
|
(Keeping this on the mailing list so that others can benefit)
What might be happening is that you are keeping more numpy arrays in
memory than you actually need. Take advantage of memmapping, which most
netcdf tools provide by default; this keeps the data on disk rather than
in RAM. Second, for very large images I would suggest either
pcolormesh() or simply imshow() instead of pcolor(), as they are far
more efficient than pcolor(). In addition, it sounds like you are
dealing with re-sampled data ("at different zoom levels"). Does this mean
that you are re-running contour on re-sampled data? I am not sure what
the benefit of that is when you could simply do the contour once at
the highest resolution.
Without seeing any code, though, I can only provide generic suggestions.
Cheers!
Ben Root
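The memmapping advice above can be sketched with plain numpy (the file name and shapes are illustrative; netCDF libraries expose the same lazy-slicing behaviour through their variable objects):

```python
# Keep large arrays on disk and slice out only what is needed, instead of
# loading everything into RAM.  File name and shapes are illustrative.
import numpy as np

# Write a large array to disk once (stands in for data exported from NetCDF).
arr = np.memmap("field.dat", dtype="float32", mode="w+", shape=(2000, 2000))
arr[:] = 1.0
arr.flush()
del arr

# Re-open as a read-only memory map: nothing is read until it is sliced.
field = np.memmap("field.dat", dtype="float32", mode="r", shape=(2000, 2000))
subset = field[::10, ::10]       # only this decimated view is accessed
```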
On Mon, Sep 8, 2014 at 10:12 AM, Raffaele Quarta <raf...@li...> wrote:
> Hi Ben,
>
> Sorry for the few details I gave you. I'm trying to make a contour plot
> of a variable at different zoom levels using high-resolution data. The
> aim is to produce .PNG output images. I'm working with big data (a
> NetCDF file of about 75 MB). The Matplotlib version on my Ubuntu 14.04
> machine is 1.3.1, and my system has 8 GB of RAM.
> I run into memory problems when I try to make a plot; I get the
> following error message:
>
> --------------------------------------------
> cs = m.pcolor(xi,yi,np.squeeze(t))
>   File "/usr/lib/pymodules/python2.7/mpl_toolkits/basemap/__init__.py", line 521, in with_transform
>     return plotfunc(self,x,y,data,*args,**kwargs)
>   File "/usr/lib/pymodules/python2.7/mpl_toolkits/basemap/__init__.py", line 3375, in pcolor
>     x = ma.masked_values(np.where(x > 1.e20,1.e20,x), 1.e20)
>   File "/usr/lib/python2.7/dist-packages/numpy/ma/core.py", line 2195, in masked_values
>     condition = umath.less_equal(mabs(xnew - value), atol + rtol * mabs(value))
> MemoryError
> --------------------------------------------
>
> By contrast, when I plot a smaller file (around 5 MB), it works very
> well, so I don't believe there is anything wrong in the script; it
> seems to be a memory problem.
> I hope my message is clearer now.
>
> Thanks for the help.
>
> Regards,
>
> Raffaele
>
> -----------------------------------------
>
> Sent: Mon 9/8/2014 3:19 PM
> To: Raffaele Quarta
> Cc: Matplotlib Users
> Subject: Re: [Matplotlib-users] Plotting large file (NetCDF)
>
>
> You will need to be more specific... much more specific. What kind of plot
> are you making? How big is your data? What version of matplotlib are you
> using? How much RAM do you have available compared to the amount of data?
> (Most slowdowns are actually due to swap-thrashing.) Matplotlib can be
> used for large data, but there exist some specialty tools for the truly
> large datasets. The solution depends on the situation.
>
> Ben Root
>
> On Mon, Sep 8, 2014 at 7:45 AM, Raffaele Quarta <raf...@li...> wrote:
>
> > Hi,
> >
> > I'm working with the NetCDF format. When I try to plot a very large
> > file, I have to wait a long time. How can I solve this? Isn't there
> > a solution for this problem?
> >
> > Raffaele
> >
> > --
> > This email was Virus checked by Astaro Security Gateway.
> http://www.sophos.com
> >
> >
> >
> >
> ------------------------------------------------------------------------------
> > Want excitement?
> > Manually upgrade your production database.
> > When you want reliability, choose Perforce
> > Perforce version control. Predictably reliable.
> >
> >
> http://pubads.g.doubleclick.net/gampad/clk?id=157508191&iu=/4140/ostg.clktrk
> > _______________________________________________
> > Matplotlib-users mailing list
> > Mat...@li...
> > https://lists.sourceforge.net/lists/listinfo/matplotlib-users
> >
> >
>
> --
> This email was Virus checked by Astaro Security Gateway.
> http://www.sophos.com
>
>
|
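The MemoryError in the quoted traceback comes from full-resolution temporary arrays allocated during masking; a common workaround (a sketch only, with synthetic arrays named after those in the traceback and an arbitrary decimation factor) is to thin the grid before plotting:

```python
# Sketch of a workaround for the MemoryError above: decimate the grid before
# handing it to the plotting call.  xi, yi, t mirror the names in the quoted
# traceback; the sizes and step factor here are arbitrary illustrations.
import numpy as np

xi = np.linspace(0.0, 360.0, 4000)   # projected x coordinates
yi = np.linspace(-90.0, 90.0, 2000)  # projected y coordinates
t = np.random.rand(2000, 4000)       # the field to plot

step = 4                             # keep every 4th point in each direction
xs, ys, ts = xi[::step], yi[::step], t[::step, ::step]
# cs = m.pcolor(xs, ys, ts)          # same call as before, 16x less data
```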
|
From: Benjamin R. <ben...@ou...> - 2014-09-08 13:20:06
|
You will need to be more specific... much more specific. What kind of plot are you making? How big is your data? What version of matplotlib are you using? How much RAM do you have available compared to the amount of data? (Most slowdowns are actually due to swap-thrashing.) Matplotlib can be used for large data, but there exist some specialty tools for the truly large datasets. The solution depends on the situation.
Ben Root
On Mon, Sep 8, 2014 at 7:45 AM, Raffaele Quarta <raf...@li...> wrote:
> Hi,
>
> I'm working with the NetCDF format. When I try to plot a very large
> file, I have to wait a long time. How can I solve this? Isn't there
> a solution for this problem?
>
> Raffaele |
|
From: Raffaele Q. <raf...@li...> - 2014-09-08 11:46:20
|
Hi,
I'm working with the NetCDF format. When I try to plot a very large file, I have to wait a long time. How can I solve this? Isn't there a solution for this problem?
Raffaele |
|
From: Pierre H. <pie...@cr...> - 2014-09-08 10:23:53
|
On 05/09/2014 21:53, Arnaldo Russo wrote:
> The following code plots my table, but Greek letters are not in Arial.
What about adding the Greek letters directly as a Unicode string and keeping LaTeX only for the table?
Best,
Pierre
(my Greek and math Unicode "copy-pasting files" attached) |
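Pierre's suggestion can be sketched like this (the label text is invented; matplotlib's default DejaVu Sans font ships the Greek glyphs, so a literal Unicode character avoids switching the whole label to TeX markup):

```python
# Sketch of the suggestion: write Greek letters as literal Unicode characters
# so the surrounding sans-serif font is kept, instead of using TeX markup.
import matplotlib
matplotlib.use("Agg")                      # off-screen rendering
import matplotlib.pyplot as plt

fig, ax = plt.subplots()
ax.set_xlabel("wavelength \u03bb (nm)")    # lambda as a Unicode character
ax.set_ylabel("flux \u03a6")               # capital phi likewise
fig.savefig("greek_labels.png")
```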
|
From: Ken M. <ma...@gm...> - 2014-09-08 02:28:31
|
Hi,
I'm having trouble with Basemap south pole stereographic projections. The required "boundinglat" doesn't seem to work properly, and I hope someone can help me figure out the correct syntax.
I have the following information about the data:
extent: -3,333,500 to 3,333,500 m
true scale latitude: -71
south pole stereographic projection
Given this, I try the following:
m = Basemap(resolution='c', projection='spstere', lat_ts=-71, lat_0=0., lon_0=0.)
and it crashes, reporting that I need boundinglat. I can't give width/height, which are accepted by the 'stere' projection. Using http://www.pgc.umn.edu/tools/conversion I determine that 3333500 m is 60 degrees south. If I then add "boundinglat=-60" to the Basemap call, the data appears nowhere near the correct location. If I add "boundinglat=-89.999999" it looks almost perfect; if I change that to "-90" or add a few more 9's, it crashes.
Am I determining the bounding lat correctly from the provided information? Should I be calling Basemap differently? Any help will be much appreciated.
Thanks,
Ken Mankoff |
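For what it's worth, the quoted extent is consistent with roughly 60 degrees south under the spherical polar stereographic formula; a back-of-the-envelope check (spherical approximation only, so it differs slightly from the ellipsoidal PGC converter):

```python
# Back-of-the-envelope check (spherical approximation) that an extent of
# 3,333,500 m from the pole corresponds to roughly 60 degrees south for a
# polar stereographic projection with true scale at 71S.
import math

R = 6371000.0                      # mean Earth radius, m (spherical approx.)
lat_ts = 71.0                      # true-scale latitude, degrees (south)
rho = 3333500.0                    # distance from the pole, m

# rho = R * (1 + sin(lat_ts)) * tan(45 - |lat|/2); invert for |lat|:
k = 1.0 + math.sin(math.radians(lat_ts))
half_colat = math.degrees(math.atan(rho / (R * k)))
lat = -(90.0 - 2.0 * half_colat)   # about -59.9 degrees
```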