[cmake-developers] Experiments in CMake support for Clang (header & standard) modules
dblaikie at gmail.com
Thu Jul 19 19:07:01 EDT 2018
(just CC'ing you, Richard, in case you want to read my ramblings/spot any mistakes I make)
Excuse the delay - coming back to this a bit now. The varying
opinions on what modules will take to integrate with build systems still
weigh on me a bit - but I'm trying to find small ways/concrete steps to
make some progress on this rather than being lost in choice/opinion paralysis.
To that end, Stephen, I've made a fork of your example repository & a very
simple/direct change to use C++ modules as currently implemented in Clang.
Some workarounds are required for a few bugs/incomplete features. (One I
didn't comment on in the source is the use of #include for the standard
library - that'd actually be "import legacy" in the current Modules TS/Atom
merged proposal, if I understand correctly - the way I've implemented it in
the example is as if the standard library were not modularized, so the
project wraps it in a modularized header itself.)
The build.sh script shows the commands required to build it (though I
haven't verified that the exact -fmodule-file dependencies are all
necessary, etc) - and with current Clang top-of-tree it does build and
run the example dinnerparty program.
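For reference, the commands in build.sh have roughly the following shape. The flag spellings below follow Clang's Modules TS prototype as of mid-2018 and may differ across versions; the file/module names are placeholders, not the ones from the example repo:

```shell
# Precompile a module interface to a .pcm (Clang's serialized AST).
clang++ -std=c++17 -fmodules-ts --precompile bar_a.cppm -o bar_a.pcm

# In a separate step, lower the .pcm to an object file.
clang++ -std=c++17 -fmodules-ts -c bar_a.pcm -o bar_a.o

# An importer is told where its dependencies' module files live.
clang++ -std=c++17 -fmodules-ts -fmodule-file=bar_a.pcm \
    --precompile foo_a.cppm -o foo_a.pcm
clang++ -std=c++17 -fmodules-ts -c foo_a.pcm -o foo_a.o
```

Note the two-step .cppm -> .pcm -> .o shape: the build system has to schedule both steps and thread the right -fmodule-file= arguments through to every importer.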
There are a few ideas being tossed around currently for how module
dependency discovery could be done by build systems or exposed by the
compiler itself. (GCC has a service protocol it can interact with when it
needs a new compiled module description; the build system could implement
that protocol to fulfill such requests - giving the build system all the
information about inter-module dependencies without a separate ahead-of-time
scan, while still allowing maximal parallelism: the compiler could request
all needed modules and then wait for them all to be ready, rather than one
at a time, so as not to stall parallelism.)
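For concreteness, an exchange with such a dependency server might look something like the following transcript. This is purely illustrative - the verbs, paths, and framing are invented for this sketch, not GCC's actual wire protocol; only the shape matters:

```shell
# Illustrative compiler (C) <-> build-system server (S) exchange.
# The compiler asks for all of its imports up front, then blocks
# until each is ready - so the build system can build them in parallel.
cat <<'EOF'
C: module-import foo_a
C: module-import foo_b
S: module-ready foo_a /build/cmi/foo_a.pcm
S: module-ready foo_b /build/cmi/foo_b.pcm
EOF
```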
The syntax is intended to support a build system that would do the scanning
itself, though - requiring only limited preprocessing. (I think the
preprocessor would need to know where to stop, since it might not be able
to preprocess the whole file without error without importing modules -
those imports could introduce new macro definitions used later in the file.)
If you happen to try experimenting with any ways the commands in the
build.sh file could be run from CMake in a sensible way - even if you have
to hypothesize what -MM support (or other compiler hooks, like the
dependency server I alluded to above) for modules might look like to do so
- I'd love to chat about it/throw ideas around/try mocking up or
prototyping the sort of compiler support that seems most useful (I don't
think there's -MM support yet, but I could see about adding it, for example).
One thing I'm vaguely concerned about is actually the dependency ordering
when building modules within a single library (as in this project/example -
at least the way I've built it for now; I didn't try building it as
separate .so/.a files). Across libraries, at least, we can work at the
library granularity and provide on the command line (or via a file, as GCC
does) the module files for all the modules from dependent libraries. But
I'm not sure how best to determine the order in which to build files within
a library - that's where the sort of -MM-esque support would be necessary.
(Though that sort of thing would be useful even cross-library, to speed up
the build - e.g.: foo_a.cppm depends on bar_a.cppm but not bar_b.cppm, so
don't rebuild foo_a.cppm if bar_b.cppm changes, even though libfoo depends
on libbar.)
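That intra-library ordering is just a topological sort over the scanned provides/requires edges. As a sketch (filenames hypothetical), the classic tsort(1) utility shows the idea: feed it "prerequisite target" pairs and it emits an order in which the interfaces can be precompiled:

```shell
# Each line is "prerequisite target": bar_a must be built before foo_a.
# tsort prints the nodes in a valid build order (and would diagnose a
# cycle, which an ill-formed module graph can contain).
printf '%s\n' \
  'bar_a.cppm foo_a.cppm' \
  'bar_b.cppm foo_b.cppm' \
  'foo_a.cppm app.cpp' | tsort
```

Whatever order it picks, bar_a.cppm comes out before foo_a.cppm, and foo_a.cppm before app.cpp - which is all the build system needs to schedule the precompile steps.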
On Tue, May 15, 2018 at 1:34 AM Stephen Kelly <steveire at gmail.com> wrote:
> David Blaikie wrote:
> >> Nope, scratch that ^ I had thought that was the case, but talking more
> >> with Richard Smith it seems there's an expectation that modules will be
> >> somewhere between header and library granularity (obviously some small
> >> libraries today have one or only a few headers, some (like Qt) have many
> >> - maybe those on the Qt end might have slightly fewer modules than they
> >> have headers - but still several modules to one library most likely, by
> >> the sounds of it)
> >> Why? Richard maybe you can answer that? These are the kinds of things I
> >> was trying to get answers to in the previous post to ISO SG2 in the
> >> google group. I didn't get an answer as definitive as this, so maybe you
> >> can share the reason behind such a definitive answer?
> > It's more that the functionality will allow this & just judging by how
> > people do things today (existing header granularity partly motivated by
> > the cost of headers that doesn't apply to modules), how they're likely to
> > do things in the future (I personally would guess people will probably
> > just port their headers to modules - and in a few places where there are
> > circular dependencies in headers or the like they might glob them up into
> > one module).
> It seems quite common to have one PCH file per shared library (that's what
> Qt does for example). What makes you so sure that won't be the case with
> modules?
Can't say I've worked with code using existing PCH - if that's common
enough, it might be a good analogy/guidance for what people might do with
C++ modules.
> I'd say that what people will do will be determined by whatever their
> tools optimize for. If it is necessary to list all used modules on the compile
> line, people would choose fewer modules. If 'import QtCore' is fast and
> allows the use of QString and QVariant etc and there is no downside, then
> that will be the granularity offered by Qt (instead of 'QtCore.QString').
> That is also comparable to '#include <QtCore>' which is possible today.
Yep, perhaps they will.
> >> I just looked through the commits from Boris, and it seems he made some
> >> changes relating to -fmodule-file=. That still presupposes that all
> >> (transitively) used module files are specified on the command line.
> > Actually I believe the need is only the immediate dependencies - at least
> > with Clang's implementation.
> Ok. That's not much better though. It still means editing/generating the
> buildsystem each time you add an import.
Isn't that true today with headers, though? But today the build system does
this under the covers with header scanning, -MM modes, etc?
What would be different about having a similar requirement for modules (but
a somewhat different discovery process)? I guess the difference is that now
that discovery changes the command used to compile a given source file,
whereas in the past (with headers) it didn't change those commands but did
end up with an implicit dependency ("when this header file changes, rerun
this compile command - even though the command doesn't mention the header;
trust us, we know it depends on it"), whereas now the dependency would be
explicit.
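Concretely, for headers a compiler's -MM output is a make fragment the build system consumes without touching the compile command itself; a module-aware analogue would additionally have to surface information that feeds back into the command line. The second fragment below is hypothetical - no compiler emits it today:

```shell
# What -MM emits for headers today (real gcc/clang behavior): a rerun
# condition only - the compile command itself never mentions foo.h/bar.h.
cat <<'EOF'
foo.o: foo.cpp foo.h bar.h
EOF
# A hypothetical module-aware analogue: the scan result also names the
# module files the compile must be handed (e.g. via -fmodule-file=), so
# it changes the command, not just the rerun condition.
cat <<'EOF'
foo.o foo.pcm: foo.cppm bar_a.pcm
EOF
```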
> I don't think a model with that requirement will gain adoption.
> >> I was talking about the -fprebuilt-module-path option added by Manman
> >> in https://reviews.llvm.org/D23125 because that actually relieves the
> >> user/buildsystem of maintaining a list of all used modules (I hope).
> > *nod* & as you say, GCC has something similar. Though the build system
> > probably wants to know about the used modules to do dependency analysis &
> > rebuilding correctly.
> Yes, presumably that will work with -MM.
> > Yeah, thanks for the link - useful to read.
> There seems to be a slew of activity around modules at the moment. You can
> read some other reactions here which might have input for your paper:
> I look forward to reading your paper anyway.
> >> I think clang outputs the definitions in a separate object file, but GCC
> >> currently doesn't. Perhaps that's a difference that cmake has to account
> >> for or pass on to the user.
> > Clang outputs frontend-usable (not object code, but serialized AST usable
> > for compiling other source code) descriptions of the entire module
> > (whatever it contains - declarations, definitions, etc) to the .pcm file.
> > It can then, in a separate step, build an object file from the pcm. I
> > think GCC produces both of these artifacts in one go - rather than
> > building the object file from the module file in a separate step.
> Ok, I must have misremembered something.
> >> Sure. I didn't notice anything from reading, but I also didn't try it
> >> out. You might need to provide a repo with the module.modulemap/c++
> >> etc that are part of your experiment. Or better, provide something based
> >> on modules-ts that I can try out.
> > *nod* I'll see if I can get enough of modules-ts type things working to
> > provide some examples, but there's some more variance/uncertainty there
> > in the compiler support, etc.
> Something working only with clang for example would be a good start.
> >> I'm guessing that's enough for you to implement what you want as an
> >> experiment?
> > OK, so in that case it requires source changes to cmake? *nod* sounds
> > plausible - I appreciate the pointers. I take it that implies there's not
> > a way I could hook into those file kinds and filters without changing
> > cmake? (ie: from within my project's cmake build files, without modifying
> > a cmake release)
> There is no way to hook into the system I described without patching
> CMake. Your custom command approach might be the way to do that if it is
> the only option.