Hi Haruhiko, *,

On Tue, Oct 14, 2025 at 9:18 PM jijinbei <hallucigenia2002@gmail.com> wrote:
[…]
I certainly don’t deny that gbuild has been stable and proven to work.
However, the fact that it hasn’t been simplified for more than a decade is exactly what concerns 
me.
Even if the macro logic itself is not too complex, the interactions between configure.ac, 
numerous .mk files, and environment-dependent variables make the system very difficult for new 
contributors to follow.

maybe there's just a misunderstanding here. configure.ac doesn't
interact with the gbuild/make system apart from creating
config_host.mk (and config_build.mk in case of cross-compiling),
creating the configuration header files in config_{host|build} and
doing some variable replacement in a few files (foo.in → foo) –
pretty standard stuff, nothing really special or complex.

Also, the build itself doesn't need any environment variables:
everything make needs is in the make variables in config_host.mk,
including the PATH as it was at configure time. make then exports
those into the environment for the recipes it invokes, but in general
there is no dependency on environment variables.

When configure is run, environment variables can be used to bypass
some detections or to provide values directly, so running configure
is the primary part that is influenced by the environment.
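
For illustration, config_host.mk is just a long, flat list of make
variable assignments – a made-up excerpt (not verbatim from a real
build) looks like:

```
# config_host.mk sketch: values captured once, at configure time
export CXX=g++
export PATH=/usr/local/bin:/usr/bin:/bin
export SRCDIR=/home/user/lo/core
```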

There are very few variables where make provides a default that can be
overridden using an environment variable (those using ?= assignment) -
but the vast majority of variables can only be overridden by
specifying them as parameters when invoking make.
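
A minimal sketch of that difference (VERBOSE and OPTFLAGS are made-up
names, not actual LibreOffice variables):

```
# ?= keeps a value that comes from the environment,
# = ignores the environment (only "make VAR=..." overrides it)
VERBOSE ?= false
OPTFLAGS = -O2

show:
	@echo VERBOSE=$(VERBOSE) OPTFLAGS=$(OPTFLAGS)
```

Here "VERBOSE=true make show" prints VERBOSE=true, while an
OPTFLAGS set in the environment is ignored – you'd have to run
"make show OPTFLAGS=-O0" to change it.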

bazel would also have to create the config_foo.h headers, or it would
require a massive list of defines/command-line arguments to the
compiler – and that would be a problem, since we already exceeded
command-line length limits in the past, especially on Windows, and in
some cases have to put arguments into temporary files that the command
reads in order to stay within the limits...
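
For reference, that's the @file response-file mechanism that gcc,
clang and MSVC all support; a rough make sketch (LONG_DEFINE_LIST
stands in for the real flag set):

```
# write the (potentially huge) argument list into a file and let the
# compiler read it from there, keeping the actual command line short
foo.o: foo.cxx
	echo '$(LONG_DEFINE_LIST)' > foo.rsp
	$(CXX) @foo.rsp -c -o $@ $<
```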

And there would still be the necessary complexity of supporting so
many switches – you'd still need to be able to tell bazel to skip
certain parts, to only include some portion of others, to use
different variants of a third thing depending on the circumstances,
and you'd still need to provide stuff like the various foo_LIBS/FLAGS
to pass to the externals that are built. In the current setup that's
in config_host.mk – primarily BUILD_TYPE on the macro level and
ENABLE_FOO (or sometimes WITH_FOO) on the micro level. All of that
would also have to be represented in bazel, as would the flags for the
external components. That part is hard to cut down, but I also don't
think it is hard to understand. People might not know where/what to
look for at first, but that would be the same for a user who never
used bazel before.
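
As a sketch of what those switches look like in config_host.mk (the
values here are made up, but the variable patterns are real):

```
# macro level: which parts of the tree get built at all
export BUILD_TYPE=LibreOffice ODK
# micro level: individual feature toggles
export ENABLE_PDFIUM=TRUE
# per-external flags, e.g. for a system zlib
export ZLIB_CFLAGS=-I/usr/include
export ZLIB_LIBS=-lz
```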

[…]
What I was trying to say is that there are still quite a few parts of LibreOffice that seem to 
reimplement functionality already available in standard or widely-used libraries.

Ah, ok, but that is not due to the build system...

If we could gradually replace some of these with well-maintained external ones, we could reduce 
maintenance cost and modernize the codebase.

For example:

our own smart pointer implementations,
tools/json_writer.cxx (custom JSON writer),

that's 360 lines including comments – and sometimes a generic library
just isn't worth it. Passing the data in a form that the library
understands is just as much effort as generating the JSON right away,
and the direct approach is also faster, since you don't have to
convert your data multiple times.

comphelper/source/misc/base64.cxx (custom Base64 implementation),

That's also quite trivial – remember the small nodejs package that
apparently thousands of projects used, which was just a simple
left-pad function or something, and whose removal by the author broke
all those packages?
For some things it is just not worth using an external library.

A counter-example to that is
https://gerrit.libreoffice.org/c/core/+/191301 – although that one is
replacing one external with another, it shows that there's no general
problem with using external implementations; the large number of
externals is proof of that, and especially for larger stuff that is in
fact what's done.

comphelper/source/misc/threadpool.cxx (in-house thread pool system)

Yeah, but remember that LibreOffice has decades of history, so while
there might be cross-platform alternatives now, that certainly wasn't
the case back when Apple still built PowerPC-based machines...
But again, replacing that is not a matter of the build system making
it hard; it would just be a matter of finding a replacement that's
suitable without having to rewrite the whole codebase to fit that
replacement's philosophy.

Many of these seem to date back to the pre-C++11 era, before the standard library or widely 
adopted dependencies became mature enough.

Exactly. So just imagine LibreOffice already used bazel today:
replacing that would still be the same amount of work as with gbuild,
wouldn't it? You primarily need to change the C++ code, not the build
system. All the build system would do is put the external library's
files into some directory where the rest of the code can find them,
and that part is trivial. Adapting the code to use the new library is
where all the work would go.
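
With gbuild, the build-system side of such a swap is typically a
one-line addition in the affected Library_*.mk files, using the same
gb_Library_use_externals macro shown further down – a sketch, with
"somejsonlib" as a hypothetical external name:

```
$(eval $(call gb_Library_use_externals,comphelper,\
    boost_headers \
    somejsonlib \
))
```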

So using existing, reliable libraries where possible could make maintenance easier in the long 
run.

Yes, emphasis on existing and reliable – and add cross-platform,
well-maintained and non-trivial to that. External
dependencies/libraries can however also make life a lot harder if they
decide to no longer support an operating system version that we'd like
to support, or when they raise their build requirements to something
we don't support, etc. It is a balancing act. Taking the nodejs
example again: it could have been a more complex package that was
removed, and then the effects would have been way worse. You'd end up
with a fork, so you'd have an abstraction/compatibility layer from
your own internals to what the external library expects, without the
benefit of that library being maintained elsewhere.

No idea what chore library is

Ah, that was a typo. I meant the C++ standard library’s chrono

Ah, that one for example would already have required Windows 10, so
it is way too recent – and once you've already written your own stuff,
switching to something else is harder than if you can do it from the
get-go. You also risk running into corner cases that break (i.e.
finding bugs in that library), causing regressions, etc. But again, I
don't think any refactoring of that kind is hindered by the build
system.

For instance, with
```
bazel query 'deps(//vcl:libvcl)'
```

Yeah, but that type of dependency is just one of many, and you'd
still have chains of dependencies – you don't want to list all
dependencies plus the dependencies' dependencies everywhere. So you'd
still have to follow through many files.

we can directly inspect dependency trees, visualize relationships, and automatically detect 
cycles.

You can also do that with GNU make – the dependency graph was created
using the make info. There is no universal tool for that which would
work on any project, that's true.
And make also detects cyclic dependencies.
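
A trivial demonstration of the latter (made-up targets):

```
# two targets depending on each other; running "make a" warns with
# something along the lines of
# "make: Circular b <- a dependency dropped." and breaks the cycle
a: b
b: a
```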

In the current gbuild system, these relationships exist mostly implicitly, across layers of 
macros and Makefiles.

The dependency definition itself is a macro, that's right, but that in
turn makes it possible to use that info to create the dependency graph
for example. But apart from that the dependencies are explicit:
$(eval $(call gb_Library_use_libraries,avmedia,\
    comphelper \
    cppu \
    cppuhelper \
    sal \
    svl \
    ucbhelper \
    vcl \
))

$(eval $(call gb_Library_use_externals,avmedia,\
    boost_headers \
))

e.g. as a sample of two types of dependencies: avmedia depends on the
comphelper, cppu, cppuhelper, sal, svl, ucbhelper and vcl libraries,
and also on the external project "boost_headers".

Bazel’s model makes such structures visible and analyzable.

But then the question is whether all those kinds of dependencies can
be mapped out. And for me as a human the difference between
 bazel query 'deps(//vcl:libvcl)'
and
  $(eval $(call gb_Library_use_externals,self,vcl))

is non-existent. I also don't need a tool to spit out the dependencies
for me, since I can do "make avmedia.allbuild" for example to build
avmedia and everything it depends on – make knowing the dependency
chain is enough for most users/everyday work.
And make dump-deps or dump-deps-png (if you have the dot tool
available) will create a dependency graph/image. (It is collapsed to
module level by default, but you could add -p to the corresponding
command line to see library level/more detail; it isn't the full
dependency tree, but that would be too overwhelming anyway.)
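
I.e. straight from a source checkout:

```
make avmedia.allbuild   # build avmedia and everything it depends on
make dump-deps-png      # dependency graph image (module level, needs dot)
```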

I strongly suggest to aim for an automated conversion process.

Wouldn’t a fully automated conversion risk introducing an extra layer of complexity?

Well, on the one hand yes, it is an added layer of complexity, but it
would also make it a lot easier to review and iterate with. If at some
point there were a "here's the build system converted to bazel" dump
with hundreds of bazel files, it would be very hard to troubleshoot
stuff, especially when you're not familiar with bazel. We support
"real" cross-compilation (where a set of build tools is built and the
resulting binaries are for a different host, i.e. cannot be run on the
build machine) as well as cross-compilation "light" (where the host
binaries can be run on the build machine), across three different
platforms (officially supported by TDF at least, + Android), and we
have lots of configure options, so there inevitably will be bugs in
any conversion attempt that doesn't take something into account.

I’m exploring whether we can design an approach that does not rely heavily on GNU make or 
Autoconf/configure.

For a first demo stage, especially one covering configure, anything
is fine of course, but any actual migration either needs to happen
slowly and incrementally, or it needs to be backed by some kind of
automation that generates the build files, maybe leaving only the
CustomTarget ones to manual conversion.

ciao
Christian

