Every Jupyter notebook I open seems to hold a steady 900-1200 MB of RAM. Where is most of this overhead coming from? It seems to be relatively independent of the contents of the notebook, as most of the jump in usage happens when the notebook is first opened, before any cells are even executed. In fact, sometimes after executing a number of cells the RAM usage actually goes down.
To give an idea, in the notebook I have open now, the largest variable I've defined is an ~3000-element list of tuples, each of which contains 5 ints and a float. So it's not like I have several million-row data tables open--and as I said, the RAM usage was already up near that 1-GB mark before I even ran the code that generated this array.
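For reference, here's a rough back-of-the-envelope sketch (the exact tuple contents are made up, but the shape matches what I described) showing that a list like mine accounts for well under a megabyte:

```python
import sys

# Approximate the footprint of a ~3000-element list of tuples,
# each holding 5 ints and a float (contents here are placeholders).
rows = [(i, i + 1, i + 2, i + 3, i + 4, float(i)) for i in range(3000)]

total = sys.getsizeof(rows)                      # the list object itself
for row in rows:
    total += sys.getsizeof(row)                  # each tuple
    total += sum(sys.getsizeof(v) for v in row)  # each int/float inside

print(f"approximate size: {total / 1024:.0f} KiB")
```

On my machine this comes out to well under 1 MB, i.e. orders of magnitude below the usage I'm seeing.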
I'm gathering that the Python runtime itself is somehow using most of this RAM, but if I run Python from the command line it doesn't do this. Also, the memory usage is showing up as being used by the browser process, NOT the background Python process (even when the Jupyter notebook is the only thing running in the browser, and the browser was closed prior to launching the notebook, so it isn't leftover data from previously browsed websites that never got cleared).
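To double-check which process is actually heavy, I ran a quick (Unix-only, stdlib-only) cell inside the notebook to report the kernel's own memory use, which confirmed the kernel itself is small:

```python
import resource
import sys

# Peak resident set size of this process (the IPython kernel, when
# run in a notebook cell). Unix-only: uses the stdlib resource module.
ru_maxrss = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss

# ru_maxrss is reported in KiB on Linux but in bytes on macOS.
divisor = 1024 ** 2 if sys.platform == "darwin" else 1024
print(f"kernel peak RSS: {ru_maxrss / divisor:.0f} MiB")
```

That reports the Python kernel using a small fraction of what my system monitor attributes to the browser process, so the bulk of the usage really does seem to live on the browser side.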