[libvirt] [PATCH] build: be smarter about building documentation

I'm tired of cryptic reports on IRC from people who build from git, then type 'make install' and have it fail quite a ways down the road because the documentation wasn't built. It's a feature that documentation is not built during development if the toolchain is not present (not all git developers build tarballs, and the tarballs already contain pre-built docs); but this only works as long as you don't try to install or make a tarball from that setup. With this patch in place, and without xhtml1-dtds, I now get this nice failure:

$ make install
cfg.mk:109: *** ERROR: missing doc toolchain (install xhtml1-dtds and xmllint).  Stop.

and all with no impact to regular 'make' or 'make check'.

Along the way, I tried to discover why 'yum-builddep libvirt' doesn't always tell developers to install xhtml1-dtds - my conclusion was that if the .srpm is unavailable, then yum-builddep can't figure out anything that is required. The spec file already has a BuildRequires on the doc toolchain (and therefore, the docs shipped with an rpm are up-to-date, even if the spec file applied patches that affect the docs). So thankfully I don't have to make any spec file changes in this patch.

* cfg.mk: Let 'make install' and 'make dist' error much earlier if we detect a build from git without doc toolchain.

Signed-off-by: Eric Blake <eblake@redhat.com>
---
Heavily influenced by code in gnulib's GNUMakefile :)

 cfg.mk | 23 +++++++++++++++++++++++
 1 file changed, 23 insertions(+)

diff --git a/cfg.mk b/cfg.mk
index e6584e8..7e17b5c 100644
--- a/cfg.mk
+++ b/cfg.mk
@@ -88,6 +88,29 @@ else
 distdir: sc_vulnerable_makefile_CVE-2012-3386.z
 endif
 
+# We intentionally don't require the doc toolchain during 'make' during
+# development, but 'make dist' must ship pre-built docs, and 'make install'
+# from a git build fails if docs weren't built, both with awkward error
+# messages if we let the build run that far.  Therefore, bail loud and
+# early if we detect a git tree but no doc toolchain.
+ifeq ($(MAKELEVEL),0)
+  _is-dist-target ?= $(filter-out %clean, \
+    $(filter maintainer-% dist% alpha beta stable,$(MAKECMDGOALS)))
+  _is-install-target ?= $(filter-out %check, $(filter install%,$(MAKECMDGOALS)))
+  ifneq (,$(_is-dist-target)$(_is-install-target))
+    ifeq ($(shell \
+        if test -e $(srcdir)/.git; then \
+          if test -x $(XMLLINT) && test -x $(XMLCATALOG) && \
+             $(XMLCATALOG) '$(XML_CATALOG_FILE)' \
+               "-//W3C//DTD XHTML 1.0 Strict//EN" >/dev/null; then \
+            echo works; \
+          else echo oops; fi; \
+        else echo tarball; fi),oops)
+      $(error ERROR: missing doc toolchain (install xhtml1-dtds and xmllint))
+    endif
+  endif
+endif
+
 # Files that should never cause syntax check failures.
 VC_LIST_ALWAYS_EXCLUDE_REGEX = \
   (^(HACKING|docs/(news\.html\.in|.*\.patch))|\.po)$$
-- 
1.8.3.1
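For readers less fluent in GNU make, the goal classification in the hunk above can be sketched in Python. This is a hypothetical re-implementation (the names make_filter and needs_doc_toolchain are invented for illustration) that approximates $(filter)/$(filter-out), where make's '%' wildcard behaves like a '*' glob:

```python
import fnmatch

def make_filter(patterns, words, keep=True):
    # Approximate GNU make's $(filter) (keep=True) and $(filter-out)
    # (keep=False); '%' in a make pattern acts like fnmatch's '*'.
    globs = [p.replace("%", "*") for p in patterns]
    match = lambda w: any(fnmatch.fnmatchcase(w, g) for g in globs)
    return [w for w in words if match(w) == keep]

def needs_doc_toolchain(goals):
    # Mirror the cfg.mk check: dist-like goals (minus *clean) or
    # install-like goals (minus *check) trigger the early failure.
    dist = make_filter(["%clean"],
                       make_filter(["maintainer-%", "dist%",
                                    "alpha", "beta", "stable"], goals),
                       keep=False)
    install = make_filter(["%check"],
                          make_filter(["install%"], goals),
                          keep=False)
    return bool(dist or install)
```

With this model, 'make dist' or 'make install' would demand the doc toolchain, while 'make distclean', 'make installcheck', or a plain 'make check' would not.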

On Thu, Sep 19, 2013 at 04:27:41PM -0600, Eric Blake wrote:
I'm tired of cryptic reports on IRC from people who build from git, then type 'make install' and have it fail quite a ways down the road because the documentation wasn't built. It's a feature that documentation is not built during development if the toolchain is not present (not all git developers build tarballs, and the tarballs already contain pre-built docs); but this only works as long as you don't try to install or make a tarball from that setup. With this patch in place, and without xhtml1-dtds, I now get this nice failure:
$ make install
cfg.mk:109: *** ERROR: missing doc toolchain (install xhtml1-dtds and xmllint).  Stop.
and all with no impact to regular 'make' or 'make check'.
Along the way, I tried to discover why 'yum-builddep libvirt' doesn't always tell developers to install xhtml1-dtds - my conclusion was that if the .srpm is unavailable, then yum-builddep can't figure out anything that is required. The spec file already has a BuildRequires on the doc toolchain (and therefore, the docs shipped with an rpm are up-to-date, even if the spec file applied patches that affect the docs). So thankfully I don't have to make any spec file changes in this patch.
* cfg.mk: Let 'make install' and 'make dist' error much earlier if we detect a build from git without doc toolchain.
Signed-off-by: Eric Blake <eblake@redhat.com>
---
Heavily influenced by code in gnulib's GNUMakefile :)
 cfg.mk | 23 +++++++++++++++++++++++
 1 file changed, 23 insertions(+)

diff --git a/cfg.mk b/cfg.mk
index e6584e8..7e17b5c 100644
--- a/cfg.mk
+++ b/cfg.mk
@@ -88,6 +88,29 @@ else
 distdir: sc_vulnerable_makefile_CVE-2012-3386.z
 endif
 
+# We intentionally don't require the doc toolchain during 'make' during
+# development, but 'make dist' must ship pre-built docs, and 'make install'
+# from a git build fails if docs weren't built, both with awkward error
+# messages if we let the build run that far.  Therefore, bail loud and
+# early if we detect a git tree but no doc toolchain.
+ifeq ($(MAKELEVEL),0)
+  _is-dist-target ?= $(filter-out %clean, \
+    $(filter maintainer-% dist% alpha beta stable,$(MAKECMDGOALS)))
+  _is-install-target ?= $(filter-out %check, $(filter install%,$(MAKECMDGOALS)))
+  ifneq (,$(_is-dist-target)$(_is-install-target))
+    ifeq ($(shell \
+        if test -e $(srcdir)/.git; then \
+          if test -x $(XMLLINT) && test -x $(XMLCATALOG) && \
+             $(XMLCATALOG) '$(XML_CATALOG_FILE)' \
+               "-//W3C//DTD XHTML 1.0 Strict//EN" >/dev/null; then \
+            echo works; \
+          else echo oops; fi; \
+        else echo tarball; fi),oops)
+      $(error ERROR: missing doc toolchain (install xhtml1-dtds and xmllint))
+    endif
+  endif
+endif
Can't we just make our existing rules fatal? I really don't see the point in treating docs errors as non-fatal. If the docs are not built, or are outdated, we should try to build them and fail if the tools aren't present. We already require libxml.so be present, so requiring the libxml/libxslt cli tools really isn't a burden in the great scheme of things.

IOW, we should remove all the

  @if test -x $(XMLLINT) && test -x $(XMLCATALOG) ; then \
  @if [ -x $(XSLTPROC) ] ; then \

conditionals, and just let 'make' do its normal deps calculation and error reporting.

As long as the docs are included in the tar.gz, people building from the tar.gz will still not build the docs.

Daniel
-- 
|: http://berrange.com      -o-    http://www.flickr.com/photos/dberrange/ :|
|: http://libvirt.org              -o-             http://virt-manager.org :|
|: http://autobuild.org       -o-         http://search.cpan.org/~danberr/ :|
|: http://entangle-photo.org       -o-       http://live.gnome.org/gtk-vnc :|

On 09/27/2013 09:48 AM, Daniel P. Berrange wrote:
* cfg.mk: Let 'make install' and 'make dist' error much earlier if we detect a build from git without doc toolchain.
Signed-off-by: Eric Blake <eblake@redhat.com>
---
Can't we just make our existing rules fatal? I really don't see the point in treating docs errors as non-fatal. If the docs are not built, or are outdated, we should try to build them and fail if the tools aren't present. We already require libxml.so be present, so requiring the libxml/libxslt cli tools really isn't a burden in the great scheme of things.
IOW, we should remove all the
  @if test -x $(XMLLINT) && test -x $(XMLCATALOG) ; then \
  @if [ -x $(XSLTPROC) ] ; then \
conditionals, and just let 'make' do its normal deps calculation and error reporting.
As long as the docs are included in the tar.gz, people building from the tar.gz will still not build the docs.
Which means ./autogen.sh (via bootstrap.conf) should start requiring the doc toolchain for a git build. Makes sense, if the created tarball indeed has timestamps new enough to prevent a rebuild of the docs (easy enough to verify). I'll play with that idea, but it means that the v2 of this patch will probably miss 1.1.3.

-- 
Eric Blake   eblake redhat com    +1-919-301-3266
Libvirt virtualization library http://libvirt.org

On 09/27/2013 09:48 AM, Daniel P. Berrange wrote:
On Thu, Sep 19, 2013 at 04:27:41PM -0600, Eric Blake wrote:
I'm tired of cryptic reports on IRC from people who build from git, then type 'make install' and have it fail quite a ways down the road because the documentation wasn't built. It's a feature that documentation is not built during development if the toolchain is not present (not all git developers build tarballs, and the tarballs already contain pre-built docs); but this only works as long as you don't try to install or make a tarball from that setup. With this patch in place, and without xhtml1-dtds, I now get this nice failure:
$ make install
cfg.mk:109: *** ERROR: missing doc toolchain (install xhtml1-dtds and xmllint).  Stop.
Can't we just make our existing rules fatal? I really don't see the point in treating docs errors as non-fatal. If the docs are not built, or are outdated, we should try to build them and fail if the tools aren't present.
This matches my reasoning behind my patch to ditch automake's maintainer mode: https://www.redhat.com/archives/libvir-list/2013-October/msg00226.html
We already require libxml.so be present, so requiring the libxml/libxslt cli tools really isn't a burden in the great scheme of things.
xsltproc and xmllint are easy to come by (it seems that every distro has a way to download them), but having the xhtml dtds is a bit harder (I couldn't find whether FreeBSD supports them by default, and I know that cygwin does not have them available in the distro yet). I can easily make bootstrap fail if the tools aren't present, but I don't know how to make it fail if the dtds are missing.

On the other hand, I also just proved to myself that it is fairly easy to get the dtds set up in a local catalog. A single wget of 4 files from w3c, followed by a few xmlcatalog calls, is sufficient:
cd docs
wget http://www.w3.org/TR/xhtml1/DTD/xhtml{1-strict.dtd,-{lat1,special,symbol}.ent}
xmlcatalog --noout --create catalog
xmlcatalog --noout --add public "-//W3C//DTD XHTML 1.0 Strict//EN" xhtml1-strict.dtd catalog
xmlcatalog --noout --add public "-//W3C//ENTITIES Latin 1 for XHTML//EN" xhtml-lat1.ent catalog
xmlcatalog --noout --add public "-//W3C//ENTITIES Special for XHTML//EN" xhtml-special.ent catalog
xmlcatalog --noout --add public "-//W3C//ENTITIES Symbols for XHTML//EN" xhtml-symbol.ent catalog
cd ..
./configure --with-xml-catalog-file=$PWD/docs/catalog
Would it be appropriate to create a local catalog on any system where xhtml1-dtds is not already present as part of the distro, during the bootstrap phase, to make it much easier to continue to build from git on FreeBSD and Cygwin?
IOW, we should remove all the
  @if test -x $(XMLLINT) && test -x $(XMLCATALOG) ; then \
  @if [ -x $(XSLTPROC) ] ; then \
conditionals, and just let 'make' do its normal deps calculation and error reporting.
Yes, I'd still like to do this, but only if I can get consensus on how to handle development on platforms that don't ship xhtml dtds in an easy-to-access distro location.
As long as the docs are included in the tar.gz, people building from the tar.gz will still not build the docs.
Yes, I still plan on ensuring that this works.

-- 
Eric Blake   eblake redhat com    +1-919-301-3266
Libvirt virtualization library http://libvirt.org

On Fri, Oct 04, 2013 at 05:14:00PM -0600, Eric Blake wrote:
On 09/27/2013 09:48 AM, Daniel P. Berrange wrote:
On Thu, Sep 19, 2013 at 04:27:41PM -0600, Eric Blake wrote:
I'm tired of cryptic reports on IRC from people who build from git, then type 'make install' and have it fail quite a ways down the road because the documentation wasn't built. It's a feature that documentation is not built during development if the toolchain is not present (not all git developers build tarballs, and the tarballs already contain pre-built docs); but this only works as long as you don't try to install or make a tarball from that setup. With this patch in place, and without xhtml1-dtds, I now get this nice failure:
$ make install
cfg.mk:109: *** ERROR: missing doc toolchain (install xhtml1-dtds and xmllint).  Stop.
Can't we just make our existing rules fatal? I really don't see the point in treating docs errors as non-fatal. If the docs are not built, or are outdated, we should try to build them and fail if the tools aren't present.
This matches my reasoning behind my patch to ditch automake's maintainer mode: https://www.redhat.com/archives/libvir-list/2013-October/msg00226.html
We already require libxml.so be present, so requiring the libxml/libxslt cli tools really isn't a burden in the great scheme of things.
xsltproc and xmllint are easy to come by (it seems that every distro has a way to download them), but having the xhtml dtds is a bit harder (I couldn't find whether FreeBSD supports them by default, and I know that cygwin does not have them available in the distro yet). I can easily make bootstrap fail if the tools aren't present, but I don't know how to make it fail if the dtds are missing.
On the other hand, I also just proved to myself that it is fairly easy to get the dtds set up in a local catalog. A single wget of 4 files from w3c, followed by a few xmlcatalog calls, is sufficient:
cd docs
wget http://www.w3.org/TR/xhtml1/DTD/xhtml{1-strict.dtd,-{lat1,special,symbol}.ent}
xmlcatalog --noout --create catalog
xmlcatalog --noout --add public "-//W3C//DTD XHTML 1.0 Strict//EN" xhtml1-strict.dtd catalog
xmlcatalog --noout --add public "-//W3C//ENTITIES Latin 1 for XHTML//EN" xhtml-lat1.ent catalog
xmlcatalog --noout --add public "-//W3C//ENTITIES Special for XHTML//EN" xhtml-special.ent catalog
xmlcatalog --noout --add public "-//W3C//ENTITIES Symbols for XHTML//EN" xhtml-symbol.ent catalog
cd ..
./configure --with-xml-catalog-file=$PWD/docs/catalog
Would it be appropriate to create a local catalog on any system where xhtml1-dtds is not already present as part of the distro, during the bootstrap phase, to make it much easier to continue to build from git on FreeBSD and Cygwin?
Or could we just skip the XMLLINT step on BSD/Cygwin? E.g. still generate the docs, but don't validate for HTML schema compliance. As long as we are validating HTML compliance on Linux I think that would be sufficient to stop problems.

We already have logic to skip XMLLINT, but it does not create the HTML output file. Could we just turn the 'echo missing XHTML1 DTD' into a 'cat $< > $@'?
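That fallback idea can be illustrated with a small Python sketch (a hypothetical stand-in for the make rule; the name emit_html is invented): validate when xmllint is on PATH, otherwise just pass the file through so the target is still produced:

```python
import shutil
import subprocess

def emit_html(src, dst):
    # Validate src with xmllint when the tool is available; otherwise
    # fall back to a plain copy (the 'cat $< > $@' idea), so the docs
    # target still gets built on platforms without the toolchain.
    if shutil.which("xmllint"):
        # --nonet keeps xmllint from fetching DTDs over the network;
        # --noout suppresses output, we only want the validation result.
        subprocess.run(["xmllint", "--nonet", "--noout", src], check=True)
    shutil.copyfile(src, dst)
```

On a Linux box with the full toolchain the validation still runs and failures are fatal (check=True); on BSD/Cygwin without xmllint the copy happens unconditionally.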
IOW, we should remove all the
  @if test -x $(XMLLINT) && test -x $(XMLCATALOG) ; then \
  @if [ -x $(XSLTPROC) ] ; then \
conditionals, and just let 'make' do its normal deps calculation and error reporting.
Yes, I'd still like to do this, but only if I can get consensus on how to handle development on platforms that don't ship xhtml dtds in an easy-to-access distro location.
Daniel
-- 
|: http://berrange.com      -o-    http://www.flickr.com/photos/dberrange/ :|
|: http://libvirt.org              -o-             http://virt-manager.org :|
|: http://autobuild.org       -o-         http://search.cpan.org/~danberr/ :|
|: http://entangle-photo.org       -o-       http://live.gnome.org/gtk-vnc :|
participants (2)
-
Daniel P. Berrange
-
Eric Blake