
In our company we use Subversion. We have various Python modules (our own and third-party) in different versions in use, and the applications we develop depend on different versions of these shared modules.

One possibility is using virtualenv and installing the modules from a local PyPI server. So on every initial checkout we need to create a virtualenv, activate it, and install the dependencies from requirements.txt.

Disadvantages:

  • A relatively complex procedure for a simple task like "check out and run"
  • You can forget to create the virtualenv and end up working with the modules installed in site-packages
  • Need for a local PyPI server (though you could otherwise use URLs pointing to your VCS)

So we came up with another solution and I'd like your opinion: In the path of the application we use svn:externals (similar to git submodules) to "link" to the specified module (from its release path, and with a specified revision number to keep it read-only), so the module is placed locally in the path of the application. An "import mylib" then works as if the module were installed in site-packages or in a virtualenv. This could be extended to put releases of wx, numpy and other frequently used libraries into our repository and link them in locally.
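For illustration, such an externals definition is just a versioned property set on the application directory; the repository paths and revision number below are made up:

```
# set on the application directory, e.g. with:
#   svn propset svn:externals -F externals.txt .
-r1234 ^/mylib/tags/1.2/src/mylib       mylib
-r1234 ^/thirdparty/numpy/tags/1.9      numpy
```

The `-r` pin plus a tags/ path is what keeps the checked-out copy effectively read-only, and `^/` keeps the definition relative to the repository root so it survives server renames.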

The advantages are:

  • After the initial checkout you're ready to run (a really important point for me)
  • Version dependencies are fixed (like with requirements.txt)

The actual question is: Are there projects out there on github/sourceforge using this scheme? Why is everybody using virtualenv instead of this (seemingly) simpler scheme? I've never seen such a solution, so maybe we're missing a point?

PS: I posted this already on the pypa-dev mailing list, but it seems to be the wrong place for this kind of question. Please excuse the cross-post.

  • I do this all the time at work. Like you say, you can lock down versions. The only problem is that I started doing it before they added the fancy new syntax for svn:externals, so I had to give a fully qualified path, and we've been acquired twice since then and the server names keep changing. So use relative paths if at all possible... Commented Aug 4, 2015 at 1:42
  • This is not something new; you have it e.g. in the Android sources. But I do not like this approach: updating requirements to a new version is quite a painful process (especially when you have different development platforms, e.g. OS X & Linux &| Windows). Having your own PyPI with wheels (precompiled) and installing requirements is quite a fast process; you can use pip all the time and you have all the tools at your disposal (e.g. pip-tools). Commented Jun 29, 2016 at 19:55
  • We used to do this with svn and now do something similar with git. But what's the question, exactly? Commented Jun 30, 2016 at 13:21
  • @Aya We tried it with git's submodule functionality, but it's not possible to get a subpath of a repository (like `/source/lib`). Am I missing a point? You're right that it's not clear what the actual question is; I will edit it to be more precise. Commented Jun 30, 2016 at 17:05
  • @GüntherJena If your filesystem supports symbolic links, you can use that as a workaround. Commented Jun 30, 2016 at 17:12

1 Answer


In the path of the application we use svn:externals (aka git submodules) to "link" to the specified module (from its release path and with a specified revision number to keep it read-only), so the module will be placed locally in the path of the application.

This is a more traditional method for managing package dependencies, and is the simpler of the two options for software which is only used internally. With regards to...

After the initial checkout you're ready to run

...that's not strictly true. If one of your dependencies is a Python library written in C, it will need to be compiled first.


We tried it with git's submodule functionality but it's not possible to get a subpath of a repository (like /source/lib)

This is fairly easy to work around: check out the whole repository in a location outside your PYTHONPATH, then symlink the required files or directories into your PYTHONPATH. It does, however, require a filesystem which supports symlinks.

For example, with a layout like...

myproject
|- bin
|  |- myprogram.py
|
|- lib
|  |- mymodule.py
|  |- mypackage
|  |  |- __init__.py
|  |
|  |- foopackage -> ../submodules/libfoo/lib/foopackage
|  |- barmodule
|     |- __init__.py -> ../../submodules/libbar/lib/barmodule.py
|
|- submodules
   |- libfoo
   |  |- bin
   |  |- lib
   |     |- foopackage
   |        |- __init__.py
   |
   |- libbar
      |- bin
      |- lib
         |- barmodule.py

...you need only have myproject/lib in your PYTHONPATH, and everything should import correctly.
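To make this concrete, here is a small self-contained sketch of the same idea (the names `libfoo` and `foopackage` come from the layout above, but the script builds a throwaway copy in a temporary directory; it assumes a POSIX filesystem with symlink support):

```python
import os
import sys
import tempfile

# Recreate a minimal version of the layout above in a temporary directory.
root = tempfile.mkdtemp()
pkg_src = os.path.join(root, "submodules", "libfoo", "lib", "foopackage")
os.makedirs(pkg_src)
with open(os.path.join(pkg_src, "__init__.py"), "w") as f:
    f.write("NAME = 'foopackage'\n")

# Only lib/ will go on the import path; the submodule checkout stays outside it.
lib = os.path.join(root, "lib")
os.makedirs(lib)

# lib/foopackage -> ../submodules/libfoo/lib/foopackage
# (a relative link keeps the checkout relocatable).
os.symlink(os.path.join("..", "submodules", "libfoo", "lib", "foopackage"),
           os.path.join(lib, "foopackage"))

# With lib/ on sys.path, the package imports as if it were installed.
sys.path.insert(0, lib)
import foopackage
```

After this runs, `foopackage` is importable even though the real files live under `submodules/`, which is the whole point of the symlink arrangement.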


Are there projects out there on github/sourceforge using this scheme?

The submodule information is just stored in a file called .gitmodules, and a quick Google for "site:github.com .gitmodules" returns quite a few results.
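A .gitmodules file is plain INI-style text, which is why it is so easy to find via search; a minimal (made-up) example looks like:

```
[submodule "libfoo"]
    path = submodules/libfoo
    url = https://example.com/libfoo.git
```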


Why is everybody using virtualenv instead of this (seemingly) simpler scheme?

For packages published on PyPI, and installed with pip, it's arguably easier from a dependency-management point-of-view.

If your software has a relatively simple dependency graph, like...

myproject
|- libfoo
|- libbar

...it's no big deal, but when it becomes more like...

myproject
|- libfoo
|  |- libsubfoo
|     |- libsubsubfoo
|        |- libsubsubsubfoo
|           |- libsubsubsubsubfoo
|- libbar
   |- libsubbar1
   |- libsubbar2
   |- libsubbar3
   |- libsubbar4

...you may not want to take on the responsibility of working out which versions of all those sub-packages are compatible, should you need to upgrade libbar for whatever reason. You can delegate that responsibility to the maintainer of the libbar package.
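By contrast, a pip-based setup only pins the top of the graph; the package names and versions below are the illustrative ones from the diagram, not real packages. pip resolves the sub-packages from each package's own declared dependencies:

```
# requirements.txt -- only direct dependencies are pinned
libfoo==1.4.2
libbar==2.0.1
```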


In your particular case, whether your solution is the right one will depend on the answers to these questions:

  1. Are all of the external modules you need to use actually available from svn repositories?
  2. Do those repositories use svn:externals correctly to include compatible versions of any dependencies they require, or if not, are you prepared to take on the responsibility of managing those dependencies yourself?

If the answer to both questions is "yes", then your solution is probably right for your case.

