From: Eric F. <ef...@ha...> - 2009-04-08 23:38:30

antonv wrote:
> I know that using the csv files is very slow, but I have no knowledge of
> working with the netcdf format and I was in a bit of a rush when I wrote
> this. I will take a look at it again. How would you translate a grib into
> netcdf? Are there any specific applications, or is it done straight
> through numpy?

The program you are already using is said to convert grib2 to netcdf:
http://www.nws.noaa.gov/mdl/NDFD_GRIB2Decoder/
and there are several modules providing a netcdf interface for numpy. I like
this one:
http://code.google.com/p/netcdf4-python/
and it is included in the Enthought Python Distribution.

For GRIB to numpy, googling turned up http://code.google.com/p/pygrib2/ as
well as PyNIO. My guess is that pygrib2 will be exactly what you need. It is
by Jeffrey Whitaker, the author of the above-mentioned netcdf4 interface as
well as of basemap.

> As for pyngl, if I remember correctly I looked at it but it was not
> working on windows.

Well, I recommend switching to linux anyway, but that is another story.

Eric

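A minimal sketch of the reading side with netcdf4-python, assuming a file
produced by such a conversion; the file and variable names here are
hypothetical:

import matplotlib.pyplot as plt
from netCDF4 import Dataset

nc = Dataset('forecast.nc')           # open an existing netcdf file
temp = nc.variables['temperature']    # a Variable object; slicing reads data
data = temp[0, :, :]                  # first time step, as a numpy array
plt.imshow(data, origin='lower')      # quick look at the field
plt.colorbar()
plt.savefig('temperature.png')
nc.close()
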
From: Anton V. <vas...@ya...> - 2009-04-08 23:06:39

Wow Jeff! You saved me again! I remember looking at it last year and
thinking it would be awesome if there were a windows installer for it! I
will install and play with it tonight!

Thanks a lot!
Anton

________________________________
From: Jeff Whitaker <js...@fa...>
To: antonv <vas...@ya...>
Cc: mat...@li...
Sent: Wednesday, April 8, 2009 4:02:22 PM
Subject: Re: [Matplotlib-users] Computer specs for fast matplotlib and
basemap processing

Anton: If these are grib version 2 files, another option is
http://code.google.com/p/pygrib2. I have made a windows installer.

-Jeff

From: Jeff W. <js...@fa...> - 2009-04-08 23:02:26

antonv wrote:
> I know that using the csv files is very slow, but I have no knowledge of
> working with the netcdf format and I was in a bit of a rush when I wrote
> this. I will take a look at it again. How would you translate a grib into
> netcdf? Are there any specific applications, or is it done straight
> through numpy?
>
> As for pyngl, if I remember correctly I looked at it but it was not
> working on windows.
>
> Thanks,
> Anton

Anton: If these are grib version 2 files, another option is
http://code.google.com/p/pygrib2. I have made a windows installer.

-Jeff

--
Jeffrey S. Whitaker         Phone  : (303)497-6313
Meteorologist               FAX    : (303)497-6449
NOAA/OAR/PSD R/PSD1         Email  : Jef...@no...
325 Broadway                Office : Skaggs Research Cntr 1D-113
Boulder, CO, USA 80303-3328 Web    : http://tinyurl.com/5telg

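For illustration, a sketch of going straight from GRIB to numpy; this uses
the API of the later pygrib package (the successor to pygrib2), and the
file and field names are assumptions:

import pygrib

grbs = pygrib.open('ndfd.grb2')             # hypothetical grib2 file
grb = grbs.select(name='Temperature')[0]    # pick the first matching message
data = grb.values                           # the field as a 2-D numpy array
lats, lons = grb.latlons()                  # matching coordinate arrays
grbs.close()
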
From: antonv <vas...@ya...> - 2009-04-08 22:54:24

I know that using the csv files is very slow, but I have no knowledge of
working with the netcdf format and I was in a bit of a rush when I wrote
this. I will take a look at it again. How would you translate a grib into
netcdf? Are there any specific applications, or is it done straight through
numpy?

As for pyngl, if I remember correctly I looked at it but it was not working
on windows.

Thanks,
Anton

From: Eric F. <ef...@ha...> - 2009-04-08 21:37:35

antonv wrote:
> I have a bit of experience programming and I am pretty sure I get my
> parts of the code pretty well optimized. I made sure that in the loop I
> have only the stuff needed and I'm loading all the stuff before.
>
> The biggest bottleneck is happening because I'm unpacking grib files to
> csv files using Degrib in command line. That operation is usually around
> half an

Instead of going to csv files--which are *very* inefficient to write,
store, and then read in again--why not convert directly to netcdf, and
then read your data in from netcdf as needed for plotting? I suspect
this will speed things up quite a bit. Numpy support for netcdf is very
good. Of course, direct numpy-enabled access to the grib files might be
even better, eliminating the translation phase entirely. Have you
looked into http://www.pyngl.ucar.edu/Nio.shtml?

Eric

> hour using no more than 50% of the processor but it maxes out the memory
> usage and it definitely is hard drive intensive as it ends up writing
> over 4 GB of data. I have noticed also that on a lower spec AMD desktop
> this runs faster than on my P4 Intel Laptop, my guess being that the
> laptop hdd is 5400 rpm and the desktop is 7200 rpm.
>
> Next step is to take all those csv files and make images from them. For
> this one I haven't dug too deep to see what is happening but it seems to
> be the other way, using the cpu a lot more while keeping the memory usage
> high too.
>
> Thanks,
> Anton

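A minimal sketch of the suggested conversion step, writing one decoded
field to netcdf with netcdf4-python instead of csv; the variable name and
shape are hypothetical:

import numpy as np
from netCDF4 import Dataset

data = np.random.rand(181, 360)   # stand-in for a field decoded from GRIB
nc = Dataset('decoded.nc', 'w')   # create a new netcdf file
nc.createDimension('lat', data.shape[0])
nc.createDimension('lon', data.shape[1])
var = nc.createVariable('temperature', 'f4', ('lat', 'lon'))
var[:] = data                     # write the whole array at once
nc.close()
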
From: antonv <vas...@ya...> - 2009-04-08 20:57:25

I have a bit of experience programming and I am pretty sure I get my parts
of the code pretty well optimized. I made sure that in the loop I have only
the stuff needed and I'm loading all the stuff before.

The biggest bottleneck is happening because I'm unpacking grib files to csv
files using Degrib in command line. That operation is usually around half
an hour using no more than 50% of the processor but it maxes out the memory
usage and it definitely is hard drive intensive as it ends up writing over
4 GB of data. I have noticed also that on a lower spec AMD desktop this
runs faster than on my P4 Intel Laptop, my guess being that the laptop hdd
is 5400 rpm and the desktop is 7200 rpm.

Next step is to take all those csv files and make images from them. For
this one I haven't dug too deep to see what is happening but it seems to be
the other way, using the cpu a lot more while keeping the memory usage high
too.

Thanks,
Anton

From: João L. S. <js...@fc...> - 2009-04-08 20:40:46

antonv wrote:
> Hi all,
>
> I am processing a lot of grib data from noaa with the use of matplotlib
> and basemap. On my actual laptop (p4 3ghz, 512mb ram) the whole process
> takes close to 3 hours... so it's time for a new machine but still on a
> very tight budget :)

You should profile your application to see why it's taking so long. Maybe
you just coded something in a slow way. Python is a great language, but if
you don't know it well you might have programmed some parts in a way that
takes orders of magnitude more time than other solutions.

Even if your code is reasonably optimized, you should know first why it's
slow: has the computer run out of memory and is swapping? Is the CPU at
100%? I'd recommend you ask a local python expert for some help.

JLS

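A minimal sketch of that profiling step with the standard library;
process_grib is a hypothetical stand-in for the slow part of the
application:

import cProfile
import pstats

cProfile.run('process_grib()', 'profile.out')    # run once, record timings
stats = pstats.Stats('profile.out')
stats.sort_stats('cumulative').print_stats(20)   # top 20 by cumulative time
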
From: Eric F. <ef...@ha...> - 2009-04-08 19:23:34

antonv wrote:
> Hi all,
>
> I am processing a lot of grib data from noaa with the use of matplotlib
> and basemap. On my actual laptop (p4 3ghz, 512mb ram) the whole process
> takes close to 3 hours... so it's time for a new machine but still on a
> very tight budget :)
>
> My main question is what should I emphasize more, a quad core processor
> running on 64 bit vista/xp, or more memory and a fast hard drive, even a
> raid drive? Will python, mpl and basemap take full advantage of multiple
> cores or will they use only one? Also, would they work in a 64 bit
> environment or would I be better off just sticking to XP32? Now memory
> wise, it seems that on my actual machine the app uses all the available
> ram; how much should I buy to make sure that all its needs would be met?

Just a few comments; I am sure others are more knowledgeable about most of
this.

First, I think you need to try to figure out what the bottlenecks are. Can
you monitor disk use, memory use, and cpu use? Is the disk maxed out and
the cpu idle? If the disk is heavily used, is it swapping? From what you
have said, it is impossible to tell whether the disk speed would make a
difference, for example. My guess is that it is going to be low priority.

Second, as part of the above, you might review your code and see whether
there are some very inefficient parts. How much time is spent in loops that
could be vectorized? Are lists being used where arrays would be more
efficient? In basemap, are you re-using instances where possible, or are
you unnecessarily re-extracting coastlines, for example? Is it possible
that you are running out of memory and then swapping because you are using
pylab/pyplot and failing to close figures when you have finished with them?

If your budget is tight, I would be very surprised if SCSI would be
cost-effective. Generally, SATA is the way to go these days. I suspect
there won't be much speed difference between 32-bit and 64-bit OS versions.
RAM: I expect 4GB will be both cheap and adequate.

To use multiple processors efficiently with matplotlib, you will need
multiple processes; mpl and numpy do not automatically dispatch parts of a
single job out to multiple processors. (I'm not sure what happens if you
use threads--I think it will still be one job per processor--but the
general advice is, don't use threads unless you really know what you are
doing, really need them, and are willing to put in some heavy debugging
time.) My guess is that your 3-hour job could easily be split up into
independent jobs working on independent chunks of data, in which case such
a split would give you a big speed-up with more processor cores, assuming
the work is CPU-intensive; if it is disk IO-bound, then the split won't
help. Anyway, dual-core is pretty standard now, and you will want at least
that. Quad might or might not help, as indicated above.

Eric

> Processor wise, I see that both Intel and AMD have a plethora of options
> to choose from... What would you recommend?
>
> And the last question is about hard drives. From your experience, what
> drives should I look at? Is a SCSI raid still that much faster than a
> 10,000 rpm hdd? I've also seen that there are some 15,000 rpm drives that
> have a controller; would they be worth the money or should I just get a
> 10,000 rpm hdd and be done?
>
> Thanks for any help, as lately I haven't kept up with the technology and
> I feel like a noob :(
>
> Anton

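A minimal sketch of that one-process-per-chunk split using the standard
library's multiprocessing module; make_plot and the file list are
hypothetical:

import multiprocessing

import matplotlib
matplotlib.use('Agg')            # non-interactive backend, safe in workers
import matplotlib.pyplot as plt

def make_plot(fname):
    fig = plt.figure()
    # ... load the chunk of data from fname and plot it here ...
    fig.savefig(fname + '.png')
    plt.close(fig)               # close figures so memory is not exhausted

if __name__ == '__main__':
    files = ['chunk0.nc', 'chunk1.nc', 'chunk2.nc', 'chunk3.nc']
    pool = multiprocessing.Pool(processes=4)   # roughly one worker per core
    pool.map(make_plot, files)
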
From: Gideon S. <si...@ma...> - 2009-04-08 18:29:41

Is there a way to save a figure at a specified size?

-gideon

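A minimal sketch of one way to do this: the saved size in pixels is the
figure size in inches times the dpi passed to savefig.

import matplotlib.pyplot as plt

fig = plt.figure(figsize=(8, 6))   # 8 x 6 inches
plt.plot([1, 2, 4])
fig.savefig('out.png', dpi=100)    # 800 x 600 pixels
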
From: antonv <vas...@ya...> - 2009-04-08 18:05:41

Hi all,

I am processing a lot of grib data from noaa with the use of matplotlib and
basemap. On my actual laptop (p4 3ghz, 512mb ram) the whole process takes
close to 3 hours... so it's time for a new machine but still on a very
tight budget :)

My main question is what should I emphasize more, a quad core processor
running on 64 bit vista/xp, or more memory and a fast hard drive, even a
raid drive? Will python, mpl and basemap take full advantage of multiple
cores or will they use only one? Also, would they work in a 64 bit
environment or would I be better off just sticking to XP32? Now memory
wise, it seems that on my actual machine the app uses all the available
ram; how much should I buy to make sure that all its needs would be met?

Processor wise, I see that both Intel and AMD have a plethora of options to
choose from... What would you recommend?

And the last question is about hard drives. From your experience, what
drives should I look at? Is a SCSI raid still that much faster than a
10,000 rpm hdd? I've also seen that there are some 15,000 rpm drives that
have a controller; would they be worth the money or should I just get a
10,000 rpm hdd and be done?

Thanks for any help, as lately I haven't kept up with the technology and I
feel like a noob :(

Anton

From: Ryan M. <rm...@gm...> - 2009-04-08 14:33:38

On Tue, Apr 7, 2009 at 4:29 PM, Jae-Joon Lee <lee...@gm...> wrote:
> Hi,
>
> I'm not a frequent user of matplotlib.dates module, so other expert
> may give you a better answer.
> My understanding is that, for the date time formatting, the (x-) data
> needs to be days (if not datetime instance) from some reference point
> (1, 1, 1? I'm not sure).
>
> The easiest way I can think of in your case is
>
>
> from matplotlib.dates import datetime, SEC_PER_DAY
> ordinal_today=datetime.datetime.today().toordinal()
> xvals = ordinal_today + np.arange(1200, dtype="d")/SEC_PER_DAY
You can also do this without converting to ordinal by hand:

import numpy as np
from datetime import datetime, timedelta

today = datetime.today()
xvals = [today + timedelta(seconds=s) for s in range(1200)]
# matplotlib can use lists, but you can also make this a numpy object array
xvals = np.array(xvals)

Since matplotlib's date formatter and locator require absolute times, you
could also just make your own locator and formatter functions for your
relative time values:
import matplotlib.ticker as mticker
def minsec(sec, unused):
    minutes = sec // 60
    sec = sec - minutes * 60
    return '%d:%02d' % (minutes, sec)
locator = mticker.MultipleLocator(60)
formatter = mticker.FuncFormatter(minsec)
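
# A hedged addition, not in the original message: wiring these into a plot
# would look like the lines below, where 'ax' is a hypothetical Axes
# instance (e.g. from plt.subplots()).
# ax.xaxis.set_major_locator(locator)
# ax.xaxis.set_major_formatter(formatter)
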
Ryan
--
Ryan May
Graduate Research Assistant
School of Meteorology
University of Oklahoma
Sent from Enterprise, AL, United States

From: Andrew S. <str...@as...> - 2009-04-08 14:26:51

jtamir wrote:
> Hi,
>
> I am having trouble installing Basemap. I followed the directions in the
> README file included in the archive (and posted at
> http://matplotlib.sourceforge.net/basemap/doc/html/users/installing.html).
> After successfully installing the GEOS library (also included), I cd to
> the "top level basemap directory" and run the command "python setup.py
> install." The install fails, with multiple compile errors related to
> src/_proj.c.
>
> However, it appears that
> lib/python2.5/site-packages/numpy/core/include/numpy/__multiarray_api.h
> also produces errors, so I suspect it may have to do with gcc...

Does that .h file exist at that location? Typically, it is the first error
that I look at -- can you re-send the output including the first error?

-Andrew

From: LUK S. <shu...@po...> - 2009-04-08 11:31:10

Jesper Larsen wrote:
> Hi matplotlib-users,
>
> I have an application which I am currently translating to other
> languages including Chinese. I was wondering what recommendations you
> have for internationalization with regards to matplotlib. Using the
> default font it seems like Chinese characters are not showing up on
> the plots. I tried running this file:
>
> # -*- coding: utf-8 -*-
> from matplotlib import pyplot as p
> p.plot([1,2,4])
> wind = u'\u98ce'
> p.title(wind)
> p.savefig('test.png')
Try including the font in your matplotlibrc file. For example, I was able
to show the "wind" character "风" by adding AR PL SungtiL GB (no quotes) to
the font.sans-serif property line.

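A minimal sketch of the same change made in code rather than in the
matplotlibrc file; the font is assumed to be installed:

import matplotlib

# put the CJK-capable font first in the sans-serif search list
fonts = matplotlib.rcParams['font.sans-serif']
matplotlib.rcParams['font.sans-serif'] = ['AR PL SungtiL GB'] + fonts
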
Regards,
ST
--

From: Jouni K. S. <jk...@ik...> - 2009-04-08 11:23:54

Jesper Larsen <jes...@gm...> writes:

> wind = u'\u98ce'
> p.title(wind)
> But there is just a box instead of the proper character on the plot.
> Any ideas what went wrong? Do I have to use a special font?

Of course you need a font that contains the Chinese characters you are
using. I have no idea whether matplotlib has any issues with such fonts
(though I think e.g. OpenType is only partially supported), but getting a
box sounds like the font does not have the character you need.

> I also tried using TeX following the example here:
> http://matplotlib.sourceforge.net/examples/pylab_examples/tex_unicode_demo.html
> but it did not work when I put in Chinese symbols.

TeX's Unicode support is not at all complete - matplotlib merely selects
the utf8 input encoding, which isn't enough to make TeX work with Chinese.
Perhaps the instructions at http://www.math.nus.edu.sg/aslaksen/cs/cjk.html
will get you started if you want to go the TeX route.

--
Jouni K. Seppänen
http://www.iki.fi/jks

From: Jesper L. <jes...@gm...> - 2009-04-08 09:57:22

Hi matplotlib-users,
I have an application which I am currently translating to other
languages including Chinese. I was wondering what recommendations you
have for internationalization with regards to matplotlib. Using the
default font it seems like Chinese characters are not showing up on
the plots. I tried running this file:
# -*- coding: utf-8 -*-
from matplotlib import pyplot as p
p.plot([1,2,4])
wind = u'\u98ce'
p.title(wind)
p.savefig('test.png')
But there is just a box instead of the proper character on the plot.
Any ideas what went wrong? Do I have to use a special font?
I also tried using TeX following the example here:
http://matplotlib.sourceforge.net/examples/pylab_examples/tex_unicode_demo.html
but it did not work when I put in Chinese symbols.
Any ideas?
Best regards,
Jesper