
I am fairly new to Python, and come from a C# background. In C#, third-party libraries are commonly stored inside the project folder.

This means that the libraries are entirely internal to the project. The project then does not depend on anything outside of the project folder (other than the .NET Framework, of course).

I really like this structure and have successfully mirrored it in Python by copying the libraries into a lib directory in the project root, and adding that lib folder to the Python path on startup of the application.
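For concreteness, the startup step described above might look like this minimal sketch (the `lib` directory name comes from the question; placing this at the top of the application's entry-point script is an assumption):

```python
import os
import sys

# Prepend the project-local "lib" directory to the import path before
# importing any third-party modules bundled with the project.
PROJECT_ROOT = os.path.dirname(os.path.abspath(__file__))
sys.path.insert(0, os.path.join(PROJECT_ROOT, "lib"))
```

After this runs, `import somelib` resolves against `./lib` before the system site-packages.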

I am worried that there may be something I am overlooking by doing this, as I have looked around a bit and have not really seen anyone else in the Python community doing this.

My question is simply - is this OK? Is there something that I may miss by simply dumping the necessary .py libraries in, rather than using easy_install, and thus storing the libraries in site-packages, at a system level?

Please feel free to let me know of any drawbacks you can see, no matter how simple.

Thanks!

1 Comment

It should be fine... beware of missing DLLs, though, if you are distributing it. A better system is to leave the libraries where they are and use PyInstaller or something similar to package your project for distribution; this will typically bundle in the dependencies. Commented Apr 23, 2013 at 22:25

2 Answers


I'll advocate the use of virtualenv and pip for development purposes. This will give you exactly the sandbox that you are used to. As for distribution, use setup.py, and reuse the requirements.txt file that you would pass to pip install -r to generate the install_requires argument to setuptools.setup. I've been meaning to set up an example that shows this off a little - check out https://github.com/dave-shawley/setup-example for a nice example with some description too. I plan on adding a little more to it as time allows.


2 Comments

Thanks for the reply. I admit that I haven't fully investigated virtualenv. However, say for example I wanted to check out my code onto a computer that has not had it before... does using virtualenv not mean that I still have to install the third-party libs on my system (even if it's done automatically by virtualenv)? Ideally I would like to just checkout/clone the code, then run straight away. This may just be my lack of virtualenv knowledge here... Going to read up on it now anyway.
Had a look at your example there. It seems that it does what I thought - defines dependencies for the project that are auto-installed on the system. Is this correct? This is what I would prefer to avoid... I would like everything to be self-contained within the project folder. Is my current method the way of doing this?

If you want to closely manage the dependencies of your code on a per-project basis, you might want to take a look at virtualenv.

Virtualenv will allow you to keep your dependencies close to your source, but will remove the error-prone manual copying of the .py files.

On top of that, remember that some packages are not pure Python and sometimes contain compiled C code - if you use virtualenv, you do not have to worry about that.
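For reference, the per-project workflow this answer describes looks roughly like the following (the `env` directory name is arbitrary; on modern Pythons the stdlib `venv` module is an equivalent of the virtualenv tool):

```shell
# Create an isolated environment inside the project folder.
python3 -m venv env            # or: virtualenv env

# Activate it (Linux/macOS; on Windows run env\Scripts\activate).
. env/bin/activate

# Packages now install into ./env rather than system site-packages:
# pip install -r requirements.txt
```

The environment directory itself is usually excluded from version control; only requirements.txt is committed.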

2 Comments

Can virtualenv be used for this purpose if the OP wants to distribute the code? I'm genuinely curious because that would make my life easier as well. Is there a guide for doing this?
Yea... have a read of my comment to d shawley above. Can I clone my code onto a fresh Python install and run it immediately?
