On Fri, Aug 30, 2013 at 03:27:42PM +0800, Daniel Veillard wrote:
> On Thu, Aug 29, 2013 at 12:24:41PM +0100, Daniel P. Berrange wrote:
> > As everyone knows, we have historically always shipped the python binding
> > as part of the libvirt primary tar.gz distribution. In some ways that has
> > simplified life for people, since we know they'll always have a libvirt
> > python that matches their libvirt C library.
> >
> > At the same time, though, this policy of ours is causing increasing amounts
> > of pain for a number of our downstream users.
> >
> > In OpenStack, in particular, their development and test environments aim
> > to avoid relying on any system-installed python packages. They use
> > a virtualenv and pip to install all python deps from PyPI (the equivalent of
> > Perl's CPAN in the Python world). This approach works for everything except
> > the libvirt Python code, which is not available standalone on PyPI. This
> > is causing so much pain that people have suggested taking the libvirt
> > python code we ship and just uploading it to PyPI themselves[1]. This
> > would obviously be a somewhat hostile action to take, but the way we
> > distribute libvirt python is forcing OpenStack to consider such things.
> >
> > In the RHEL world too, bundling of libvirt + its python binding is causing
> > pain with the fairly recent concept of "software collections"[2]. This
> > allows users to install multiple versions of languages like Python, Perl,
> > etc. on the same box in parallel. To use libvirt python with these alternate
> > python installs, though, requires that they recompile the entire libvirt
> > distribution just to get the Python binding. This is obviously not an
> > approach that works for most people, particularly if they're looking to
> > populate their software collection using 'pip' rather than RPM.
> >
> > Looking on Google, there are a number of other people asking for libvirt
> > python as a separate module, e.g. on Stack Overflow[3].
> >
> >
> > I don't think these issues are going to go away; in fact I think they
> > will likely become more pressing, until the point where some 3rd party
> > takes the step of providing libvirt python bindings themselves. I don't
> > think we want to let ourselves drift into the situation where we lose
> > control over releasing libvirt python bindings.
> >
> > IMHO we should / must listen to our users here before it is too late.
> >
> > We can still release libvirt python at the same time as normal libvirt
> > releases, and require that people update the bindings whenever adding
> > new APIs (if the generator doesn't cope with them). We should simply
> > distribute the python binding as a separate tar.gz, as we do for all other
> > languages, and upload it to PyPI, as well as to libvirt.org FTP, when
> > doing a release.
> >
> > Obviously there will be some work to separate things out, but I don't
> > see that being insurmountable, since all other language bindings manage
> > to be separate, even when doing code generation. We'd also want to
> > change to use distutils, rather than autoconf, since that's what the
> > python world wants.
>
> Okay, message received :-)
> First we keep the status quo for 1.1.2, obvious but better to state it.

Of course, I'm not suggesting we rush into anything. Perhaps the next
release, perhaps the one after, etc....

> Second, the key point is really to have tarballs of the python bindings
> available as separate source from upstream (us!), and make sure we
> remove the bindings from the libvirt tarball releases. Right?

Yes, that is the key point.

> Third, having a separate repository for the python bindings doesn't
> bring much unless we really want to generate tarballs on a regular basis
> for that project independently of the libvirt ones. On the other hand, having
> the merged repository like we do now means the python bindings patches
> tend to be reviewed and tested as the new APIs are added, i.e. when it's
> fresh, and keeps people more inclined to actually think about them :)
> Moreover, moving to a separate repo would likely lose our git history
> (not sure if we can keep it, I doubt it), which would be a bummer IMHO.

IMHO I'd rather see us have a separate python repository, since I
like the clarity of one repo == one dist. In particular, I've
never been a fan of projects where there are multiple different
build systems used for different parts of the git tree.

I'd like to think we can address the issue of API additions via
automated testing. More generally, I'd like to see us get a bit
more serious about several of our language bindings. For both
Perl and Python I'd like to see us guarantee that they will
always be in sync via automated testing, and aim to bring the
other bindings up to parity over time.
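
To give a flavour of what I mean, a sanity test along the following lines
could live in the bindings' repository and run from 'make check'. It is only
a rough sketch: the install path of the API description XML, and the idea
that the generated C extension (libvirtmod) exposes one entry point per C
function of the same name, are assumptions, and a real test would also need
a whitelist of APIs we deliberately don't bind.

import xml.etree.ElementTree as ET

import libvirtmod   # the generated C extension behind libvirt.py

# Assumed install location of the API description the generator consumes
API_XML = "/usr/share/libvirt/api/libvirt-api.xml"

# C APIs the binding intentionally skips would be listed here instead of
# being treated as failures
SKIP = set()

def missing_bindings(api_xml=API_XML):
    """Return C function names with no corresponding entry in libvirtmod."""
    tree = ET.parse(api_xml)
    missing = []
    for func in tree.iter("function"):
        name = func.get("name")
        if name and name not in SKIP and not hasattr(libvirtmod, name):
            missing.append(name)
    return missing

if __name__ == "__main__":
    gaps = missing_bindings()
    assert not gaps, "C APIs with no Python binding: %s" % ", ".join(gaps)

Run against a binding that lags the C library, it would fail and name the
new APIs, which is exactly the point at which we want the build to complain.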

> So I would be tempted by the minimal approach of splitting the tarballs,
> making sure that the python subdirectory that will become the python
> bindings gets a classic and up-to-date setup.py, maybe move some of
> the auto* python-specific code there as equivalent scripts, or focus
> our effort on making sure the setup.py is good enough. Then the
> corresponding spec.in goes there too.
>
> At libvirt release time I regenerate 2 tarballs instead of one,
> and we have the pleasure of creating new component builds for the
> python bindings from there.
>
> Opinions about this plan?
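
For the setup.py side, a minimal sketch along these lines shows roughly
what distutils needs. The module and source file names here are only
assumptions about how the split-out tree might be laid out, and a real
setup.py would also have to run the generator before building:

# Minimal sketch of a standalone setup.py for the bindings, using distutils
# as the python world expects. File/module names are assumptions about the
# layout of the split-out tree, not the final answer.
from distutils.core import setup, Extension
import subprocess

def pkgconfig(*args):
    # Pull compile/link flags for the installed libvirt C library
    out = subprocess.check_output(["pkg-config"] + list(args))
    return out.decode("utf-8").split()

setup(
    name="libvirt-python",
    version="1.1.2",
    description="Python bindings for the libvirt API",
    url="http://libvirt.org",
    license="LGPLv2+",
    # generated pure-python wrappers
    py_modules=["libvirt", "libvirt_qemu"],
    ext_modules=[
        Extension(
            "libvirtmod",
            # hand-written overrides plus the generator's output
            sources=["libvirt-override.c", "libvirt.c"],
            extra_compile_args=pkgconfig("--cflags", "libvirt"),
            extra_link_args=pkgconfig("--libs", "libvirt"),
        ),
    ],
)

From there, 'python setup.py sdist' gives the standalone tarball to push to
PyPI and to libvirt.org FTP, and a pip install of it inside a virtualenv or
a software collection should work as long as the libvirt C headers are
available to build against.
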
Regards,
Daniel
--
|: http://berrange.com -o- http://www.flickr.com/photos/dberrange/ :|
|: http://libvirt.org -o- http://virt-manager.org :|
|: http://autobuild.org -o- http://search.cpan.org/~danberr/ :|
|: http://entangle-photo.org -o- http://live.gnome.org/gtk-vnc :|