From Harry.Mallon at codex.online Tue May 1 06:33:45 2018 From: Harry.Mallon at codex.online (Harry Mallon) Date: Tue, 1 May 2018 10:33:45 +0000 Subject: [cmake-developers] COMPILE_OPTIONS or COMPILE_FLAGS Message-ID: <9A9559AE-243B-45C3-83AD-D3659F13A887@codex.online> Hi all, I just noticed that COMPILE_OPTIONS is 3.11.0 only. I can't understand from the docs which of COMPILE_OPTIONS or COMPILE_FLAGS is preferred and why I might use one or the other. I will use COMPILE_FLAGS as it is supported on 3.10 (which we have on our linux build machine) for now. I am specifically talking about adding "-mavx" to one file in particular but would prefer a general answer if anyone knows. Thanks, Harry Harry Mallon Senior Software Engineer T +44 203 7000 989 60 Poland Street | London | England | W1F 7NT From brad.king at kitware.com Tue May 1 09:13:48 2018 From: brad.king at kitware.com (Brad King) Date: Tue, 1 May 2018 09:13:48 -0400 Subject: [cmake-developers] COMPILE_OPTIONS or COMPILE_FLAGS In-Reply-To: <9A9559AE-243B-45C3-83AD-D3659F13A887@codex.online> References: <9A9559AE-243B-45C3-83AD-D3659F13A887@codex.online> Message-ID: <19e6cc86-9d45-628f-bc18-e703a2a4b61c@kitware.com> On 05/01/2018 06:33 AM, Harry Mallon wrote: > I just noticed that COMPILE_OPTIONS is 3.11.0 only. I can't understand > from the docs which of COMPILE_OPTIONS or COMPILE_FLAGS is preferred > and why I might use one or the other. COMPILE_OPTIONS for targets has existed for a long time. What is new in 3.11 is COMPILE_OPTIONS for source files. COMPILE_FLAGS has existed for source files for a long time too. The difference is that COMPILE_FLAGS takes a command-line string of flags that requires manual escaping.
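For illustration, the difference on a single source file looks roughly like this (the file name and the second flag are hypothetical, chosen to match the AVX example above):

```cmake
# COMPILE_FLAGS: a single command-line string; separating and escaping
# multiple flags is the caller's responsibility.
set_source_files_properties(avx_kernel.cpp PROPERTIES
  COMPILE_FLAGS "-mavx -funroll-loops")

# COMPILE_OPTIONS (on source files, CMake >= 3.11): a ;-separated list;
# CMake escapes each item for the native build tool.
set_source_files_properties(avx_kernel.cpp PROPERTIES
  COMPILE_OPTIONS "-mavx;-funroll-loops")
```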
COMPILE_OPTIONS takes a ;-list of flags and will do the escaping automatically. The docs of the two options should be updated to mention each other and make the distinction clear. -Brad From Harry.Mallon at codex.online Tue May 1 09:40:49 2018 From: Harry.Mallon at codex.online (Harry Mallon) Date: Tue, 1 May 2018 13:40:49 +0000 Subject: [cmake-developers] COMPILE_OPTIONS or COMPILE_FLAGS In-Reply-To: <19e6cc86-9d45-628f-bc18-e703a2a4b61c@kitware.com> References: <9A9559AE-243B-45C3-83AD-D3659F13A887@codex.online> <19e6cc86-9d45-628f-bc18-e703a2a4b61c@kitware.com> Message-ID: Thanks Brad, I think I can use FLAGS for now then, but if my requirements get more complicated I'll use OPTIONS and require 3.11. I think the docs change would be helpful. Cheers, Harry Harry Mallon CODEX | Senior Software Engineer 60 Poland Street | London | England | W1F 7NT E harry.mallon at codex.online | T +44 203 7000 989 On 01/05/2018, 14:13, "Brad King" wrote: On 05/01/2018 06:33 AM, Harry Mallon wrote: > I just noticed that COMPILE_OPTIONS is 3.11.0 only. I can't understand > from the docs which of COMPILE_OPTIONS or COMPILE_FLAGS is preferred > and why I might use one or the other. COMPILE_OPTIONS for targets has existed for a long time. What is new in 3.11 is COMPILE_OPTIONS for source files. COMPILE_FLAGS has existed for source files for a long time too. The difference is that COMPILE_FLAGS takes a command-line string of flags that requires manual escaping. COMPILE_OPTIONS takes a ;-list of flags and will do the escaping automatically. The docs of the two options should be updated to mention each other and make the distinction clear. -Brad From duane at duaneellis.com Tue May 1 16:50:58 2018 From: duane at duaneellis.com (duane at duaneellis.com) Date: Tue, 01 May 2018 13:50:58 -0700 Subject: [cmake-developers] new generator question - xml file output for embedded IDE platforms.
Message-ID: <20180501135058.5c1bb9f86d671edec44bb378f25c04cc.60ae5b8a42.wbe@email03.godaddy.com> Hi - I'm looking into adding a new "generator" type, that is basically a fancy form of "configure_file()" At this point, I've been stepping through CMake code trying to understand the general flow and want to ask the question: Is this insane or stupid? Or not a bad idea. Some details to understand where I am headed and what I'm thinking. Generally, CMake produces a makefile, or - for example with Visual Studio - it produces an XML file directly. In my case, I am focusing on micro-controller *embedded* targets - and I need to produce various XML files that are required by IDEs. In other cases I need to create GNU makefiles for command-line gcc-arm; it varies. I also need the ability to create Visual Studio (or linux) projects because it is often very helpful to create unit tests for libraries that can run on a host platform for some embedded libraries - it is this unit test part that makes CMake an interesting solution. For the EMBEDDED target - some assumptions & major compromises for this type of target are important and must be made - ie: You cannot compile and execute something, many of the various tests and such will just not be possible. The IDEs often cannot really manage re-running CMake - (basically some IDEs perform an IMPORT operation) To simplify - I want to limit the supported items to two things: Build (1 to N) static libraries. Build & Link (1 to N) applications, typically this produces an ELF file Optionally extract a HEX or BIN file from the ELF file. The IDEs generally have: A top level Workspace file - Much like a Visual Studio SLN file. Each project has its own file - much like visual studio.
There are sometimes additional files to be created Something for the debugger, or perhaps a "config.h" type file The goal here is not just compiling the code but fully supporting the IDE experience, ie: all debugger features work better if you build using the IDE My hunch is this: What is really needed is a TEMPLATE language - and to some degree that is what I am proposing and want feedback on. Assume the following is present, ie: on the command line or via a CMakeLists.txt file in a subdirectory A variable with the TOOL NAME (with version number) A variable with the CHIP NAME Then - Based on the TOOL & CHIP NAME - I can find (and read) other files with more details. For example Endian, specific ARCH ie: ARM CortexM3 vs. Atmel AVR Chip specific information like: Size of FLASH, starting address of FLASH, RAM, etc. AND - a directory that contains lots of "template" files Here are some example project files that might be generated. ARM - Keil uVision - https://github.com/ARM-software/CMSIS_5/tree/develop/CMSIS/DSP/Projects/GCC Specific examples: https://github.com/ARM-software/CMSIS_5/blob/develop/CMSIS/DSP/Projects/GCC/arm_cortexM_math.uvoptx Source files appear around line 3000 ... there are many of these. IAR project https://github.com/ARM-software/CMSIS_5/tree/develop/CMSIS/DSP/Projects/IAR the EWW file - is the Workspace, the EWP is the PROJECT IAR also has something called an "argvars" file - used to set variables used across projects within a workspace. Example: https://github.com/ti-simplelink/ble-sdk-210-extra/blob/master/Projects/ble/multi_role/CC26xx/IAR/multi_role.custom_argvars IAR also has "icf" files - yet another xml file. Key point: This file is "imported" by the IDE in a *one*way* import step.
TI CCS - supports something called a PROJECT SPEC file (narrowly focusing on the ARM targets: TI has their own compiler, plus they support the GCC compiler) While the TI-CCS is an ARM-eclipse environment - they do support eclipse project files But that has its own list of issues when we talk about cross compiler tools. http://processors.wiki.ti.com/index.php/ProjectSpecs_in_CCS https://github.com/ti-simplelink/ble_examples/blob/master/examples/rtos/CC2640R2_LAUNCHXL/bleapps/peripheral_bidirectional_audio/tirtos/ccs/peripheral_bidirectional_audio_cc2640r2lp_app.projectspec If I narrow the supported list of features to something generally like this: For a given target: a) In some cases, might need to produce a "top level workspace file" (more below) Might need to add a standardized variable, ie: CMAKE_IDE_WORKSPACE_NAME or something. b) For a target - given a list of source files - create one or more static libraries. - CMake has this basic input construct already c) given a list of source files - create an 'executable' - that may link against the above libraries (plus others that are pre-built) - CMake has this basic construct now. d) Some common things, ie: A list of Include Directories, a list of command line defines - CMake has this e) The ability to GENERATE something like "foobar.h" from the file "foobar.h.in" - CMake has this, with some limited supported features f) The ability to copy a file from (a template) directory into the "build" directory An example might be the startup assembly language file. - CMake has this, via the configure_file() command. g) What's missing is the XML file creation - I want to really avoid writing a *CUSTOM* xml creator for each tool and each chip The permutations are horrible.
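As a rough sketch, the items in this list that CMake already covers map onto ordinary commands along these lines (target, file, and define names are hypothetical; the objcopy step assumes a GNU toolchain where CMAKE_OBJCOPY is set):

```cmake
# b) static libraries from source lists
add_library(drivers STATIC uart.c spi.c)

# c) an executable (ELF) linked against them
add_executable(app.elf main.c)
target_link_libraries(app.elf drivers)

# d) include directories and command-line defines
target_include_directories(app.elf PRIVATE include)
target_compile_definitions(app.elf PRIVATE F_CPU=48000000)

# e) generate foobar.h from foobar.h.in; f) copy a template file verbatim
configure_file(foobar.h.in foobar.h)
configure_file(startup_gcc.s startup.s COPYONLY)

# optional HEX extraction after linking
add_custom_command(TARGET app.elf POST_BUILD
  COMMAND ${CMAKE_OBJCOPY} -O ihex $<TARGET_FILE:app.elf> app.hex)
```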
Only SOME of the CMake commands would be used/required/supported, for example (below is an invented XML syntax that shows the types of things that might be needed) @if( ${FEATURE} )@ @else()@ @endif()@ @foreach( SOURCEFILE ${SOURCE_FILE_LIST} )@ @${SOURCE_FILE}@ @endforeach()@ @foreach( this_library ${TARGET_LIBRARY_LIST} )@ @library( GET_LIBDIR this_dirname ${this_library} )@ -L@{this_dirname}@ @endforeach()@ @foreach( this_library ${TARGET_LIBRARY_LIST} )@ @library( GET_LIBNAME this_libname ${this_library} )@ -l@{this_libname}@ @endforeach()@ ============= I think - the above would cover about 80 to 90% of the use cases. Am I insane, and is CMake architecturally not the way to do this? Thanks. From matthias.goesswein at eeas.at Wed May 2 03:28:09 2018 From: matthias.goesswein at eeas.at (Gößwein Matthias / eeas gmbh) Date: Wed, 2 May 2018 09:28:09 +0200 Subject: [cmake-developers] new generator question - xml file output for embedded IDE platforms. In-Reply-To: <20180501135058.5c1bb9f86d671edec44bb378f25c04cc.60ae5b8a42.wbe@email03.godaddy.com> References: <20180501135058.5c1bb9f86d671edec44bb378f25c04cc.60ae5b8a42.wbe@email03.godaddy.com> Message-ID: Hi Duane, As far as I understand from your mail you want to generate project files from CMake. Although I would appreciate a more powerful configure_file command (e.g. a built-in generator like the mustache generator (https://mustache.github.io/) or something like that), I must confess that configure_file is not the right command for the feature you want. It is used to generate some files from templates (e.g. header files), in a simple form of exchanging placeholders with the contents of CMake variables; that is all it does. The feature which you want is done by the CMake Generators, which you can specify by the command line option -G (or you can select it with the cmake-gui).
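A minimal sketch of that configure_file placeholder substitution (file and variable names hypothetical):

```cmake
# CMakeLists.txt
set(FLASH_ORIGIN 0x08000000)
set(FLASH_SIZE   0x40000)

# Copies linker_script.ld.in into the build tree, replacing each
# @VAR@ placeholder with the variable's current value.
configure_file(linker_script.ld.in linker_script.ld @ONLY)

# linker_script.ld.in would contain a line such as:
#   FLASH (rx) : ORIGIN = @FLASH_ORIGIN@, LENGTH = @FLASH_SIZE@
```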
If an IDE is not yet supported by CMake, a generator will have to be implemented for it in the source code of CMake. There are generators which generate IDE projects, where you can build with the IDE itself (e.g. Visual Studio), and there are so-called "Extra generators" which generate Ninja/Makefiles and additional project files for the IDE. The build can be done by the IDE, but it will run the Ninja/Makefiles (e.g. Eclipse CDT generator). https://cmake.org/cmake/help/v3.11/manual/cmake-generators.7.html The Toolchain and the Processor (Chip) may be specified by Toolchain Files, which can be chosen by the cmake-gui or on the command line with -DCMAKE_TOOLCHAIN_FILE=<path-to-file>. You may have several toolchain files for different toolchains (e.g. one for embedded and one for PC). Some information regarding that can be found here: https://gitlab.kitware.com/cmake/community/wikis/doc/cmake/CrossCompiling https://cmake.org/cmake/help/latest/manual/cmake-toolchains.7.html When CMake is run it checks for a working compiler. To check that, CMake tries to compile a simple program. For embedded compilers this is not always possible without some special files (e.g. linker files), so CMake also provides a special mode for that where only a library is created instead of an executable. (See Variable CMAKE_TRY_COMPILE_TARGET_TYPE, https://cmake.org/cmake/help/v3.11/variable/CMAKE_TRY_COMPILE_TARGET_TYPE.html) Best regards, Matthias.
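A minimal toolchain-file sketch along the lines described above, for a bare-metal ARM target (the compiler names assume an arm-none-eabi GCC on the PATH); it would be passed as cmake -DCMAKE_TOOLCHAIN_FILE=arm-gcc.cmake:

```cmake
# arm-gcc.cmake
set(CMAKE_SYSTEM_NAME Generic)           # bare metal, no operating system
set(CMAKE_SYSTEM_PROCESSOR cortex-m3)
set(CMAKE_C_COMPILER arm-none-eabi-gcc)
set(CMAKE_CXX_COMPILER arm-none-eabi-g++)

# The compiler check cannot link a test executable without a linker
# script, so have try_compile build a static library instead.
set(CMAKE_TRY_COMPILE_TARGET_TYPE STATIC_LIBRARY)
```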
-- matthias.goesswein at eeas.at /mail www.eeas.at /web +43 660 1280 131 /phone ------------------------------ eeas gmbh Technologiepark 17 4320 Perg Austria ------------------------------ ATU67456549 /uid FN385458a /firmenbuchnummer landesgericht linz /firmenbuch Am 01.05.2018 um 22:50 schrieb duane at duaneellis.com: > Hi - > > I'm looking into adding a new "generator" type, that is basically a > fancy form of "configure_file()" > > At this point, I've been stepping through Cmake code trying to > understand the general flow > and want to ask the question: Is this insane or stupid? Or not a bad > idea. > > Some details to understand where I am headed and what I'm thinking. > > Generally, cmake produces a makefile, or - for example with Visual > Studio it produces an XML file directly. > > In my case, I am focusing on micro-controller *embedded* targets - and I > need to produce various XML files that are required by IDEs. > In other cases I need to create GNU makefiles for command line gcc-arm > it varies. > > I also need the ability to create Visual Studio (or linux) projects > because it is often very helpful to create unit tests for libraries that > can run on a host platform for some embedded libraries - it is this unit > test part that makes Cmake is an interesting solution. > > For the EMBEDDED target - some assumptions & major compromises for this > type of target is important and must be made - ie: You cannot compile > and execute something, many of the various tests and such will just not > be possible. The IDEs often cannot really manage re-running Cmake - > (basically some IDEs perform an IMPORT operation) > > To simplify - I want to limit the supported items to two things: > Build (1 to N) static libraries. > Build & Link (1 to N) applications, typically this produces an ELF > file > Optionally extract a HEX or BIN file from the ELF file. > > The IDEs generally have: > A top level Workspace file - Much like a Visual Studio SLN file. 
> Each project has its own file - much like visual studio. > There are sometimes additional files to be created > Something for the debugger, or perhaps a "config.h" type file > > The goal here is not just compiling the code but fully supporting the > IDE experience, ie: all debugger features work better if you build using > the IDE > > My hunch is this: > What is really needed is a TEMPLATE language - and to some degree > that is what I am proposing and want feed back on. > > Assume the following is present, ie: on the command line or via a > CMakeLists.txt file in a subdirectory > A variable with the TOOL NAME (with version number) > A variable with the CHIP NAME > > Then - > Based on the TOOL & CHIP NAME - I can find (and read) other files > with more details. > For example Endian, specific ARCH ie: ARM CortexM3 vrs Atmel AVR > Chip specific information like: Size of FLASH, starting address of > FLASH, RAM, etc. > > AND - a directory that contains lots of "template" files > > Here are some example project files that might be generated. > > ARM - Kiel uVision - > > https://github.com/ARM-software/CMSIS_5/tree/develop/CMSIS/DSP/Projects/GCC > Specific examples: > > https://github.com/ARM-software/CMSIS_5/blob/develop/CMSIS/DSP/Projects/GCC/arm_cortexM_math.uvoptx > Source files appear around line 3000 ... there are many of > these. > > IAR project > > https://github.com/ARM-software/CMSIS_5/tree/develop/CMSIS/DSP/Projects/IAR > the EWW file - is the Workspace, the EWP is the PROJECT > > IAR also has something called an "argvars" file- used to set > variables used across projects within a workspace. > Example: > > https://github.com/ti-simplelink/ble-sdk-210-extra/blob/master/Projects/ble/multi_role/CC26xx/IAR/multi_role.custom_argvars > > IAR also has "icf" files - yet another xml file. > Key point: This file is "imported" by the IDE in a *one*way* import > step. 
> > TI CCS - supports something called a PROJECT SPEC file > Narrowly focusing on the ARM targets TI has their own compiler, plus > they support the GCC compiler) > While the TI-CCS is an ARM-eclipse environment - they do support > eclipse project files > But that has its own list of issues when we talk about cross > compiler tools. > > http://processors.wiki.ti.com/index.php/ProjectSpecs_in_CCS > > > https://github.com/ti-simplelink/ble_examples/blob/master/examples/rtos/CC2640R2_LAUNCHXL/bleapps/peripheral_bidirectional_audio/tirtos/ccs/peripheral_bidirectional_audio_cc2640r2lp_app.projectspec > > If I narrow the supported list of features to something generally like > this: > > For a given target: > > a) In some cases, might need to produce a "top level workspace file" > (more below) > Might need to add a standardized variable, ie: > CMAKE_IDE_WORKSPACE_NAME or something. > > b) For a target - given a list of source files - create one or more > static libraries. > - CMake has this basic input construct already > > c) given a list of source files - create an 'executable' - that may link > against the above libraries (plus others that are pre-built) > - Cmake has this basic construct now. > > d) Some common things, ie: A list of Include Directories, a list of > command line defines > - Cmake has this > > e) The ability to GENERATE something like "foobar.h" from the file > "foobar.h.in" > - Cmake has this, with some limited supported features > > f) The ability to copy a file from (a template) directory into the > "build" directory > An example might be the startup assembly language file. > - Cmake has this, via the configure_file() command. > > f) What's missing is the XML file creation - > I want to really avoid writing a *CUSTOM* xml creator for each tool > and each chip > The permutations are horrible. 
> > Only SOME of the Cmake commands would be used/required/supported, for > example > (below is an invented XML syntax that shows the types of things that > might be needed) > > > > @if( ${FEATURE} )@ > > @else()@ > > @endif()@ > > > > @foreach( SOURCEFILE ${SOURCE_FILE_LIST} )@ > @${SOURCE_FILE}@ > @endforeach()@ > > > @foreach( this_library ${TARGET_LIBRARY_LIST} )@ > @library( GET_LIBDIR this_dirname ${this_library} )@ > -L@{this_dirname}@ > @endforeach()@ > > > > @foreach( this_library ${TARGET_LIBRARY_LIST} )@ > @library( GET_LIBNAME this_libname ${this_library} )@ > -l@{this_libname}@ > @endforeach()@ > > > > > ============= > > I think - the above would cover about 80 to 90% of the use cases. > > Am I insane? and architecturally Cmake is not the way to do this? > > > Thanks. > > From duane at duaneellis.com Wed May 2 12:17:53 2018 From: duane at duaneellis.com (duane at duaneellis.com) Date: Wed, 02 May 2018 09:17:53 -0700 Subject: [cmake-developers] new generator question - xml file output for embedded IDE platforms. Message-ID: <20180502091753.5c1bb9f86d671edec44bb378f25c04cc.d4e2eee4e9.wbe@email03.godaddy.com> >> configure_file is not the right command Yea, it's the nearest existing item, and it only does the most simplistic replacement that's why I use that as a basis for my example. It is in effect, like the final last 'sed' step done by gnu autoconfigure tools. Nothing more. >> If an IDE is actually not supported by CMake a generator it will have to be implemented for that in the source code of CMake. 
Yea, I'm trying to avoid that - but I can write that if required :-( It's more than the IDE; it is also the CHIP, effectively the SYSTEM. What I need is the variable data that CMake has already, and I need to be able to tell CMake that it *cannot* run the compiler; instead, all of the information about the compiler will be provided via some CMake script, for example names like this, either on the command line or specified in a CMake file that holds a lot of variables. Cmake-Embedded-${CompilerName}.txt Cmake-Embedded-${ChipName}.txt Possibly: Cmake-Embedded-${RtosName}.txt or Cmake-Embedded-BareMetal.txt And packages (aka: Libraries) that you might want to use would never be discovered and would instead be specified in some form, for example Cmake-Embedded-Package-${PackageName}.txt An Embedded Package (aka: A static library), provided it does not require specific hardware access, should EASILY be re-usable in a host environment. Thus, a package could provide for example Cmake-Embedded-Package-HostLib-${PackageName} And several unit test type applications like this. Cmake-Embedded-Package-HostTest-${PackageName} Key thing to remember: I'm approaching this from the *embedded* side, where most - if not all - of the flexibility found on a HOST simply does not exist; thus moving an embedded package to host is easy because the host side is far more flexible than the embedded. The WIN I am looking for is the "HostLib" and "HostTest" - CMake already provides this, why re-invent the wheel. The other alternative is autoconf, which is very Windows-unfriendly; lots and lots of embedded types use Windows - in some cases the IDE is only available on windows. Bottom line: Given the information in the Cmake-Embedded-*.txt files there is enough information to create the embedded IDE project files, which are generally simple XML files. And that's the idea ... Thanks for your help & comments. -Duane.
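One way to sketch that idea with nothing but existing commands is to expand the chip/tool variables into a per-IDE template via configure_file; everything below (file names, variable names, XML shape) is hypothetical:

```cmake
# Values that the proposed Cmake-Embedded-*.txt files would provide.
set(CHIP_NAME    "cc2640r2")
set(FLASH_ORIGIN 0x00000000)

# Flatten the source list into an XML fragment the template can splice in.
set(SOURCE_FILE_XML "")
foreach(src main.c board.c)
  string(APPEND SOURCE_FILE_XML "    <file>${src}</file>\n")
endforeach()

# Expand @CHIP_NAME@, @FLASH_ORIGIN@, @SOURCE_FILE_XML@ ... in the template.
configure_file(templates/ide_project.xml.in
               ${CMAKE_BINARY_DIR}/${CHIP_NAME}.projectspec @ONLY)
```

This covers simple substitution and lists; the conditional logic in the invented @if()@ syntax earlier in the thread is exactly what configure_file cannot express today.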
From neundorf at kde.org Wed May 2 15:02:55 2018 From: neundorf at kde.org (Alexander Neundorf) Date: Wed, 02 May 2018 21:02:55 +0200 Subject: [cmake-developers] new generator question - xml file output for embedded IDE platforms. In-Reply-To: <20180502091753.5c1bb9f86d671edec44bb378f25c04cc.d4e2eee4e9.wbe@email03.godaddy.com> References: <20180502091753.5c1bb9f86d671edec44bb378f25c04cc.d4e2eee4e9.wbe@email03.godaddy.com> Message-ID: <2716250.omRkvHdh85@linux-l7nd> On 2018 May 2, Wed 09:17:53 CEST duane at duaneellis.com wrote: > >> configure_file is not the right command > > Yea, it's the nearest existing item, and it only does the most > simplistic replacement that's why I use that as a basis for my example. > It is in effect, like the final last 'sed' step done by gnu > autoconfigure tools. Nothing more. > > >> If an IDE is actually not supported by CMake a generator it will have to > >> be implemented for that in the source code of CMake. > yea, i'm trying to avoid that - but I can write that if required :-( I think you'll have to do this. > > It's more then the IDE, it is also the CHIP effectively the SYSTEM > > What I need is the variable data that Cmake has already and I need to be > able to tell CMake that it *cannot* run the compiler instead, all of the > information about the compiler will be provided via some Cmake script, > for example names like this, either on the command line or specified in > a Cmake file that holds alot of variables. > > Cmake-Embedded-${CompilerName}.txt > Cmake-Embedded-${ChipName}.txt > > Possibly: > > Cmake-Embedded-${RtosName}.txt > > or > > Cmake-Embedded-BareMetal.txt This exists basically. The files CMake loads are named <OS>-<compiler>-<language>.cmake. In CMake there are e.g. Generic-SDCC-C.cmake. This would be "BareMetal" using sdcc. You can write your own files for your own platforms. OS and other settings can be set up in your toolchain file. Are you familiar with the cross-compiling support in CMake?
If not, please get into this, it handles many of the issues you have. Alex From duane at duaneellis.com Wed May 2 18:57:21 2018 From: duane at duaneellis.com (duane at duaneellis.com) Date: Wed, 02 May 2018 15:57:21 -0700 Subject: [cmake-developers] new generator question - xml file output for embedded IDE platforms. Message-ID: <20180502155721.5c1bb9f86d671edec44bb378f25c04cc.c88d7048cf.wbe@email03.godaddy.com> > > >> If an IDE is actually not supported by CMake a generator it will have to > > >> be implemented for that in the source code of CMake. > > yea, i'm trying to avoid that - but I can write that if required :-( > I think you'll have to do this. Hmm - Another approach is to use Python to create the IDE files, and from that Python script generate CMake files for the unit-test situation. Sort of backwards from where I am starting, but that said - my focus is the Embedded side, not the host. One thing that is never easy is a "split language" development environment, ie: Some stuff is done in C (the generator) - and some stuff is done in Script (ie: CMake language). You don't have a debugger for CMake, just print statements, and you are always asking yourself the question: "Should this step be in Language A, or Language B?" >> {duane: Various cmake examples, snip snip } >> Cmake-Embedded-${CompilerName}.txt >> Cmake-Embedded-${ChipName}.txt >> Cmake-Embedded-${RtosName}.txt >> Cmake-Embedded-BareMetal.txt > this exists basically. > [snip] >In CMake there are e.g. Generic-SDCC-C.cmake. But that uses/requires/assumes Make files are used to run the build. Having the IDE execute Make - while doable, is *NOT* desired. >> Are you familiar with cmake cross-compiling support in CMake ? Yes, I'm currently building CMake from source, and stepping through the code experimenting and learning how Generators work.
For my generator I basically need to make a dummy (does nothing) generator and let everything be done in the template. Often it always seems to come back to the generator and what needs to be done there. It is the same problem I talked about earlier - some things are in CMake script, some are in C, and others will be in the template itself, and where does feature(X) need to be: in C, in CMake, or in the template? -Duane. From duane at duaneellis.com Wed May 2 19:23:40 2018 From: duane at duaneellis.com (duane at duaneellis.com) Date: Wed, 02 May 2018 16:23:40 -0700 Subject: [cmake-developers] new generator question - xml file output for embedded IDE platforms. Message-ID: <20180502162340.5c1bb9f86d671edec44bb378f25c04cc.8e5536d62f.wbe@email03.godaddy.com> >> Are you familiar with cmake cross-compiling support in CMake ? > yes, I'm currently building Cmake from source, .... > Often it seems to always comes back to the generator and .. I'll add, for example when it is processing languages, CMakeCInformation.cmake (and the CXX version), which assumes that it will be managing not only the compiler but also the linker and librarian. Why this way? Because in this case, there is ZERO need to do all of the "./configure" type testing and determination because at the end of the day, CMake will not be executing the compiler, period. The IDE will do all of that work. This type of change is non-trivial, and I'm wondering if it is the right approach or not. -Duane. From steveire at gmail.com Mon May 7 12:01:47 2018 From: steveire at gmail.com (Stephen Kelly) Date: Mon, 7 May 2018 17:01:47 +0100 Subject: [cmake-developers] Experiments in CMake support for Clang (header & standard) modules In-Reply-To: References: <2a2abd48-dee2-dc1a-b5e6-33e89d9b1472@gmail.com> Message-ID: <4f9ba68d-ae14-4088-a8cb-8a2b23659ad6@gmail.com> I think this discussion is more suited to the cmake-developers mailing list. Moving there.
Hopefully Brad or someone else can provide other input from research already done. On 05/07/2018 12:49 AM, David Blaikie wrote: > >> The basic commands required are: >> >> - clang++ -fmodules -xc++ -Xclang -emit-module -Xclang >> -fmodules-codegen -fmodule-name=foo foo.modulemap -o foo.pcm >> - clang++ -fmodules -c -fmodule-file=foo.pcm use.cpp >> - clang++ -c foo.pcm >> - clang++ foo.o use.o -o a.out > > Ok. Fundamentally, I am suspicious of having to have a > -fmodule-file=foo.pcm for every 'import foo' in each cpp file. I > shouldn't have to manually add that each time I add a new import > to my cpp file. Even if it can be automated (eg by CMake), I > shouldn't have to have my buildsystem be regenerated each time I > add an import to my cpp file either. > > That's something I mentioned in the google groups post I made > which you linked to. How will that work when using Qt or any other > library? > > > - My understanding/feeling is that this would be similar to how a user > has to change their link command when they pick up a new dependency. Perhaps it would be interesting to get an idea of how often users need to change their buildsystems because of a new link dependency, and how often users add includes to existing c++ files. I expect you'll find the latter to be a far bigger number. I also expect that expecting users to edit their buildsystem, or allow it to be regenerated every time they add/remove includes, would lead to less adoption of modules. I can see people trying them and then giving up in frustration. I think I read somewhere that the buildsystem in google already requires included '.h' files to be listed explicitly in the buildsystem, so it's no change in workflow there. For other teams, that would be a change in workflow and something rebelled against. By the way, do you have any idea how much modules adoption would be needed to constitute "success"? Is there a goal there?
> Nope, scratch that ^ I had thought that was the case, but talking more > with Richard Smith it seems there's an expectation that modules will > be somewhere between header and library granularity (obviously some > small libraries today have one or only a few headers, some (like Qt) > have many - maybe those on the Qt end might have slightly fewer > modules than they have headers - but still several modules to one > library most likely, by the sounds of it) Why? Richard maybe you can answer that? These are the kinds of things I was trying to get answers to in the previous post to iso sg2 in the google group. I didn't get an answer as definitive as this, so maybe you can share the reason behind such a definitive answer? > Now, admittedly, external dependencies are a little more complicated > than internal (within a single project consisting of multiple > libraries) - which is why I'd like to focus a bit on the simpler > internal case first. Fair enough. > > Today, a beginner can find a random C++ book, type in a code > example from chapter one and put `g++ -I/opt/book_examples > prog1.cpp` into a terminal and get something compiling and > running. With modules, they'll potentially have to pass a whole > list of module files too. > > > Yeah, there's some talk of supporting a mode that doesn't explicitly > build/use modules in the filesystem, but only in memory for the > purpose of preserving the isolation semantics of modules. This would > be used in simple direct-compilation cases like this. Such a library > might need a configuration file or similar the compiler can parse to > discover the parameters (warning flags, define flags, whatever else) > needed to build the BMI. Perhaps. I'd be interested in how far into the book such a system would take a beginner. Maybe that's fine, I don't know. Such a system might not help with code in stack overflow questions/answers though, which would probably be simpler sticking with includes (eg for Qt/boost).
Library authors will presumably have some say, or try to introduce some 'best practice' for users to follow. And such best practice will be different for each library. > > I raised some of these issues a few years ago regarding the clang > implementation with files named exactly module.modulemap: > > http://clang-developers.42468.n3.nabble.com/How-do-I-try-out-C-modules-with-clang-td4041946.html > > http://clang-developers.42468.n3.nabble.com/How-do-I-try-out-C-modules-with-clang-td4041946i20.html > > Interestingly, GCC is taking a directory-centric approach in the > driver (-fmodule-path=) as opposed to the 'add a file to your > compile line for each import' that Clang and MSVC are taking: > > http://gcc.gnu.org/wiki/cxx-modules > > Why is Clang not doing a directory-centric driver-interface? It > seems to obviously solve problems. I wonder if modules can be a > success without coordination between major compiler and > buildsystem developers. That's why I made the git repo - to help > work on something more concrete to see how things scale. > > 'We' (myself & other Clang developers) are/will be talking to GCC > folks to try to get consistency here, in one direction or another > (maybe some 3rd direction different from Clang or LLVM's). As you > noted in a follow-up, there is a directory-based flag in Clang now, > added by Boris as he's been working through adding modules support to > Build2. I just looked through the commits from Boris, and it seems he made some changes relating to -fmodule-file=. That still presupposes that all (transitively) used module files are specified on the command line. I was talking about the -fprebuilt-module-path option added by Manman Ren in https://reviews.llvm.org/D23125 because that actually relieves the user/buildsystem of maintaining a list of all used modules (I hope). > Having just read all of my old posts again, I still worry things > like this will hinder modules 'too much' to be successful.
The > more (small) barriers exist, the less chance of success. If > modules aren't successful, then they'll become a poisoned chalice > and no one will be able to work on fixing them. That's actually > exactly what I expect to happen, but I also still hope I'm just > missing something :). I really want to see a committee document > from the people working on modules which actually explores the > problems and barriers to adoption and concludes with 'none of > those things matter'. I think it's fixable, but I haven't seen > anyone interested enough to fix the problems (or even to find out > what they are). > > > Indeed - hence my desire to talk through these things, get some > practical experience, document them to the committee in perhaps a > less-ranty, more concrete form along with pros/cons/unknowns/etc to > hopefully find some consistency, maybe write up a document of "this is > how we expect build systems to integrate with this C++ feature", etc. Great. Nathan Sidwell already wrote a paper which is clearer than I am on some of the problems: ?http://open-std.org/JTC1/SC22/WG21/docs/papers/2017/p0778r0.pdf However he told me it 'wasn't popular'. I don't know if he means the problems were dismissed, or his proposed solution was dismissed as not popular. Nevertheless, I recommend reading the problems stated there. > >> My current very simplistic prototype, to build a module file, its >> respective module object file, and include those in the >> library/link for anything that depends on this library: >> >> ? add_custom_command( >> ? ? ? ? ? COMMAND ${CMAKE_CXX_COMPILER} ${CMAKE_CXX_FLAGS} -xc++ >> -c -Xclang -emit-module -fmodules -fmodule-name=Hello >> ${CMAKE_CURRENT_SOURCE_DIR}/module.modulemap -o >> ${CMAKE_CURRENT_BINARY_DIR}/hello_module.pcm -Xclang >> -fmodules-codegen >> ? ? ? ? ? DEPENDS module.modulemap hello.h > > Why does this command depend on hello.h? 
> Because it builds the binary module interface (hello_module.pcm) that is a serialized form of the compiler's internal representation of the contents of module.modulemap, which refers to hello.h (the modulemap lists the header files that are part of the module). This is all using Clang's current backwards semi-compatible "header modules" stuff. In a "real" modules system, ideally there wouldn't be any modulemap. Just a .cppm file, and any files it depends on (discovered through the build system scanning the module imports, or a compiler-driven .d file style thing).
>
> Perhaps it'd be better for me to demonstrate something closer to the actual modules reality, rather than this retro header modules stuff that clang supports.

That would be better for me. I'm interested in modules-ts, but I'm not interested in clang-modules.

> > If that is changed and module.modulemap is not, what will happen?
>
> If hello.h is changed and module.modulemap is not changed? The hello_module.pcm does need to be rebuilt.

Hmm, this assumes that the pcm/BMI only contains declarations and not definitions, right? I think clang outputs the definitions in a separate object file, but GCC currently doesn't. Perhaps that's a difference that cmake has to account for or pass on to the user.

> Ideally all of this would be implicit (maybe with some flag/configuration, or detected based on new file extensions for C++ interface definitions) in the add_library - taking, let's imagine, the .ccm (let's say, for argument's sake*) file listed in the add_library's inputs and using it to build a .pcm (BMI), building that .pcm as an object file along with all the normal .cc files,

Ok, I think this is the separation I described above.
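To make that separation concrete, the quoted prototype can be split into two custom commands - one producing the BMI, one compiling the BMI to an object file for the link. This is only a sketch using the Clang header-modules flags already quoted in this thread; the module name, modulemap, and output paths are hardcoded from the Hello example:

```cmake
# Sketch only: Clang's current header-modules interface, as in the
# quoted prototype; paths and the module name are hardcoded.

# Step 1: build the binary module interface (BMI).
add_custom_command(
  OUTPUT ${CMAKE_CURRENT_BINARY_DIR}/hello_module.pcm
  COMMAND ${CMAKE_CXX_COMPILER} ${CMAKE_CXX_FLAGS} -xc++ -c
          -Xclang -emit-module -fmodules -fmodule-name=Hello
          ${CMAKE_CURRENT_SOURCE_DIR}/module.modulemap
          -o ${CMAKE_CURRENT_BINARY_DIR}/hello_module.pcm
          -Xclang -fmodules-codegen
  DEPENDS module.modulemap hello.h)

# Step 2: compile the BMI itself to an object file, so the module's
# non-inline definitions make it into the link.
add_custom_command(
  OUTPUT ${CMAKE_CURRENT_BINARY_DIR}/hello_module.o
  COMMAND ${CMAKE_CXX_COMPILER} -c
          ${CMAKE_CURRENT_BINARY_DIR}/hello_module.pcm
          -o ${CMAKE_CURRENT_BINARY_DIR}/hello_module.o
  DEPENDS ${CMAKE_CURRENT_BINARY_DIR}/hello_module.pcm)
```

Whether hello.h belongs in DEPENDS is exactly the point discussed here: the BMI serializes the contents of the headers the modulemap lists, so it does.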
> * alternatively, maybe they'll all just be .cc files & a build system would be scanning the .cc files to figure out dependencies & could notice that one of them is the blessed module interface definition based on the first line in the file.

Today, users have to contend with errors resulting from their own code being incorrect, using some 3rd party template incorrectly, linking not working due to incorrect link dependencies, and incorrect compiles due to missing include directories (or incorrect defines specified). I can see incorrect inputs to module generation being a new category of errors to confuse users. For example, if in your list of files there are two files which look like the blessed module interface based on the first line in the file, there will be something to debug.

> So I suppose the more advanced question: Is there a way I can extend handling of existing CXX files (and/or define a new kind of file, say, CXXM?) specified in a cc_library. If I want to potentially check if a .cc file is a module, discover its module dependencies, add new rules about how to build those, etc. Is that do-able within my cmake project, or would that require changes to cmake itself? (I'm happy to poke around at what those changes might look like)

One of the things users can do in order to ensure that CMake works best is to explicitly list the cpp files they want compiled, instead of relying on globbing as users are prone to want to do:

  https://stackoverflow.com/questions/1027247/is-it-better-to-specify-source-files-with-glob-or-each-file-individually-in-cmak

If globbing is used, adding a new file does not cause the buildsystem to be regenerated, and you won't have a working build until you explicitly cause cmake to be run again. I expect you could get into similar problems with modules - needing a module to be regenerated because its dependencies change (because it exports what it imports from a dependency, for example). I'm not sure anything can be done to cause cmake to reliably regenerate the module in that case. It seems similar to the globbing case to me.

But aside from that, you could probably experimentally come up with a way to check whether a file is a module and discover its direct dependencies using file(READ). You might want to delegate to a script in another language to determine transitive dependencies and what add_custom{_command,_target} code to generate.

> > But this isn't ideal - I don't /think/ I've got the dependencies quite right & things might not be rebuilding at the right times. Also it involves hardcoding a bunch of things like the pcm file names, header files, etc.
>
> Indeed. I think part of that comes from the way modules have been designed. The TS has similar issues.
>
> Sure - but I'd still be curious to understand how I might go about modifying the build system to handle this. If there are obvious things I have gotten wrong about the dependencies, etc, that would cause this not to rebuild on modifications to any of the source/header files - I'd love any tips you've got.

Sure. I didn't notice anything from reading, but I also didn't try it out. You might need to provide a repo with the module.modulemap/c++ files etc that are part of your experiment. Or better, provide something based on modules-ts that I can try out.

> & if there are good paths forward for ways to prototype changes to the build system to handle, say, specifying a switch/setting a property/turning on a feature that I could implement that would collect all the .ccm files in an add_library rule and use them to make a .pcm file - I'd be happy to try prototyping that.

cmGeneratorTarget has a set of methods like GetResxSources which return a subset of the files provided to add_library/target_sources by splitting them by 'kind'.
You would probably extend ComputeKindedSources to handle the ccm extension, add a GetCCMFiles() to cmGeneratorTarget, then use that new GetCCMFiles() in the makefiles/ninja generators to generate rules. When extending ComputeKindedSources you could use

  if(Target->getPropertyAsBool("MAKE_CCM_RULES"))

as a condition for populating the 'kind'. Then rules will only be created for targets which use something like

  set_property(TARGET myTarget PROPERTY MAKE_CCM_RULES ON)

in cmake code. I'm guessing that's enough for you to implement what you want as an experiment?

> > Ideally, at least for a simplistic build, I wouldn't mind generating a modulemap from all the .h files (& have those headers listed in the add_library command - perhaps splitting public and private headers in some way, only including the public headers in the module file, likely). Eventually for the standards-proposal version, it's expected that there won't be any modulemap file, but maybe all headers are included in the module compilation (just pass the headers directly to the compiler).
>
> In a design based on passing directories instead of files, would those directories be redundant with the include directories?
>
> I'm not sure I understand the question, but if I do, I think the answer would be: no, they wouldn't be redundant. The system will not have precompiled modules available to use - because binary module definitions are compiler (& compiler version, and to some degree, compiler flags (eg: are you building this for x86 32 bit or 64 bit?)) dependent.

Right. I discussed modules with Nathan Sidwell in the meantime and realised this too.

> > One of the problems modules adoption will hit is that all the compilers are designing fundamentally different command line interfaces for them.
>
> *nod* We'll be working amongst GCC and Clang at least to try to converge on something common.
Different flags would not be a problem for cmake at least, but if Clang didn't have something like -fprebuilt-module-path and GCC did, that would be the kind of 'fundamental' difference I mean.

> > This also doesn't start to approach the issue of how to build modules for external libraries - which I'm happy to discuss/prototype too, though interested in working to streamline the inter-library but intra-project (not inter-project) case first.
>
> Yes, there are many aspects to consider.
>
> Are you interested in design of a CMake abstraction for this stuff? I have thoughts on that, but I don't know if your level of interest stretches that far.
>
> Not sure how much work it'd be - at the moment my immediate interest is to show as much real-world/can-actually-run prototype with cmake as possible, either with or without changes to cmake itself (or a combination of minimal cmake changes plus project-specific recipes of how to write a user's cmake files to work with this stuff) or also showing non-working/hypothetical prototypes of what ideal user cmake files would look like with reasonable/viable (but not yet implemented) cmake support.

Yes, it's specifying the ideal user cmake files that I mean. Given that the granularity of modules can be anywhere on the spectrum between one-module-file-per-library and one-module-file-per-class, I think cmake will need to consider one-module-file-per-library and *not*-one-module-file-per-library separately. In the *not*-one-module-file-per-library case, cmake might have to delegate more to the user, so it would be more inconvenient for them. In the one-module-file-per-library case, I think the ideal is something like:

  add_library(foo foo.cpp)

  # assuming foo.h is a module interface file, this creates
  # a c++-module called foo and makes it an interface usage
  # requirement of the foo target defined above
  add_cxx_module(foo foo.h)

  # bar.cpp imports foo.
  add_library(bar bar.cpp)

  # bar links to foo, and a suitable compile line argument is added if
  # needed for the foo module.
  target_link_libraries(bar foo)

This would work best if foo.h did not contain

  module;
  export module foo;

(after http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2018/p0713r1.html) but instead contained only

  module;

and the module name came from the buildsystem (or from the compiler using the basename). As it is, the above cmake code would have to determine the module name from foo.h and throw an error if it was different from foo. Having the module name inside the source just adds scope for things to be wrong. It would be better to specify the module name on the outside. I wonder what you think about that, and whether it can be changed in the modules TS?

My thoughts on this and what the ideal in cmake would be are changing as the discussion continues.

> > Can you help? It would really help my understanding of where things currently stand with modules.
>
> I can certainly have a go, for sure.

Great, thanks.

> > For example, is there only one way to port the contents of the cpp files?
>
> Much like header grouping - how granular headers are (how many headers you have for a given library) is up to the developer to some degree (certain things can't be split up), similarly with modules - given a set of C++ definitions, it's not 100% constrained how those definitions are exposed as modules - the developer has some freedom over how the declarations of those entities are grouped into modules.

Yes, exactly. This repo is small, but has a few libraries, so if we start with one approach we should be easily able to also try a different approach and examine what the difference is and what it means.

> > After that, is there one importable module per class or one per shared library (which I think would make more sense for Qt)?
> Apparently (this was a surprise to me - since I'd been thinking about this based on the Clang header modules (backwards compatibility stuff, not the standardized/new language feature modules)) the thinking is probably somewhere between one-per-class and one-per-shared-library. But for me, in terms of how a build file would interact with this, more than one-per-shared-library is probably the critical tipping point.

Yes. I think you're talking about the one-module-file-per-library and *not*-one-module-file-per-library distinction I mentioned above.

> If it was just one per shared library, then I'd feel like the dependency/flag management would be relatively simple. You have to add a flag to the linker command line to link in a library, so you have to add a flag to the compile step to reference a module, great. But, no, it's a bit more complicated than that given the finer granularity that's expected here.

"Finer granularity that's *allowed* here", really. If there is a simple thing for the user to do (ie one-module-file-per-library), then cmake can make that simple to achieve (because the dependencies between modules are the same as the dependencies between targets, which the user already specifies with target_link_libraries). If the user wants to do the more complicated thing (*not*-one-module-file-per-library), then cmake can provide APIs for the user to do that (perhaps by requiring the user to explicitly specify the dependencies between modules). My point is that cmake can optimize its design for the easy way, and I think users will choose the easy way most of the time.

> > The git repo is an attempt to make the discussion concrete because it would show how multiple classes and multiple libraries with dependencies could interact in a modules world. I'm interested in what it would look like ported to modules-ts, because as far as I know, clang-modules and module maps would not need porting of the cpp files at all.
> Right, clang header-modules is a backwards compatibility feature. It does require a constrained subset of C++ to be used to be effective (ie: basically your headers need to be what we think of as ideal/canonical headers - reincludable, independent, complete, etc). So if you've got good/isolated headers, you can port them to Clang's header modules by adding the module maps & potentially not doing anything else - though, if you rely on not changing your build system, then that presents some problems if you want to scale (more cores) or distribute your build. Because the build system doesn't know about these dependencies - so if you have, say, two .cc files that both include foo.h then bar.h - well, the build system runs two compiles, both compiles try to implicitly build the foo.h module - one blocks waiting for the other to complete, then they continue and block again waiting for the bar.h module to be built. If the build system knew about these dependencies (what Google uses - what we call "explicit (header)modules") then it could build the foo.h module and the bar.h module in parallel, then build the two .cc files in parallel.

I think that the 'build a module' step should be a precondition to the compile step. I think the compiler should issue an error if it encounters an import for a module it doesn't find a file for. No one expects a linker to compile foo.cpp into foo.o and link it just because it encounters a fooFunc without a definition which was declared in foo.h. That would reduce the magic and expect something like

  add_cxx_module(somename somefile.h otherfiles.h)

to specify a module file and its constituent partitions, which I think is fine.

> > Basically: What do folks think about supporting these sorts of features in CMake C++ builds? Any pointers on how I might best implement this with or without changes to CMake?
>
> > I think some design is needed up front.
> > I expect CMake would want to have a first-class (on equal footing with include directories or compile definitions and with particular handling) concept for modules, extending the install(TARGET) command to install module binary files etc.
>
> Module binary files wouldn't be installed in the sense of being part of the shipped package of a library - because module binary files are compiler/flag/etc specific.

Ok.

Thanks,

Stephen

From brad.king at kitware.com Mon May 7 13:13:14 2018
From: brad.king at kitware.com (Brad King)
Date: Mon, 7 May 2018 13:13:14 -0400
Subject: [cmake-developers] Experiments in CMake support for Clang (header & standard) modules
In-Reply-To: <4f9ba68d-ae14-4088-a8cb-8a2b23659ad6@gmail.com>
References: <2a2abd48-dee2-dc1a-b5e6-33e89d9b1472@gmail.com> <4f9ba68d-ae14-4088-a8cb-8a2b23659ad6@gmail.com>
Message-ID: <17a12188-2e2e-26d8-8ace-7b3f1765f7a3@kitware.com>

On 05/07/2018 12:01 PM, Stephen Kelly wrote:
> Hopefully Brad or someone else can provide other input from research already done.

I'm not particularly familiar with what compiler writers or the modules standard specification expect build systems to do w.r.t. modules. However, IIUC, at least at one time the expectation was that the module files would not be installed like headers and are used only within a local build tree.

Are modules even supposed to be first-class entities in the build system specification that users write? In the Fortran world users just list all the sources, and build systems are expected to figure it out. CMake has very good support for Fortran modules.
Our Ninja generator has rules to preprocess the translation units first, then parse the preprocessed output to extract module definitions and usages, then inject the added dependencies into the build graph, and then begin compilation of sources ordered by those dependencies (this requires a custom fork of Ninja pending upstream acceptance). Is that what is expected from C++ buildsystems for modules too?

-Brad

From dblaikie at gmail.com Wed May 9 18:28:49 2018
From: dblaikie at gmail.com (David Blaikie)
Date: Wed, 09 May 2018 22:28:49 +0000
Subject: [cmake-developers] Experiments in CMake support for Clang (header & standard) modules
In-Reply-To:
References: <2a2abd48-dee2-dc1a-b5e6-33e89d9b1472@gmail.com> <4f9ba68d-ae14-4088-a8cb-8a2b23659ad6@gmail.com>
Message-ID:

Forwarding to the developers list, since my original reply bounced as I wasn't subscribed to the list yet.

---------- Forwarded message ---------
From: David Blaikie
Date: Wed, May 9, 2018 at 3:23 PM
Subject: Re: Experiments in CMake support for Clang (header & standard) modules
To: Stephen Kelly
Cc: cmake-developers at cmake.org, Richard Smith <richard at metafoo.co.uk>

On Mon, May 7, 2018 at 9:01 AM Stephen Kelly wrote:

> I think this discussion is more suited to the cmake-developers mailing list. Moving there.

Sure thing - I was sort of hoping much/all of this functionality could be implemented at the user level without changes to cmake itself at least initially - then upstream convenience mechanisms so everyone doesn't have to rewrite the functionality locally.

> Hopefully Brad or someone else can provide other input from research already done.
>
> On 05/07/2018 12:49 AM, David Blaikie wrote:
>
>> The basic commands required are:
>>
>>   clang++ -fmodules -xc++ -Xclang -emit-module -Xclang -fmodules-codegen -fmodule-name=foo foo.modulemap -o foo.pcm
>>   clang++ -fmodules -c -fmodule-file=foo.pcm use.cpp
>>   clang++ -c foo.pcm
>>   clang++ foo.o use.o -o a.out
>>
>> Ok.
Fundamentally, I am suspicious of having to have a >> -fmodule-file=foo.pcm for every 'import foo' in each cpp file. I shouldn't >> have to manually add that each time I add a new import to my cpp file. Even >> if it can be automated (eg by CMake), I shouldn't have to have my >> buildsystem be regenerated each time I add an import to my cpp file either. >> >> That's something I mentioned in the google groups post I made which you >> linked to. How will that work when using Qt or any other library? >> > > - My understanding/feeling is that this would be similar to how a user has > to change their link command when they pick up a new dependency. > > > Perhaps it would be interesting to get an idea of how often users need to > change their buildsystems because of a new link dependency, and how often > users add includes to existing c++ files. > Right - my mental model was more that modules would be more one-to-one with libraries, rather than the way headers are these days (with many headers for one library). That seems to be incorrect & it's likely there may be multiple modules (perhaps somewhat fewer than headers) for a single library. > I expect you'll find the latter to be a far bigger number. > > I also expect that expecting users to edit their buildsystem, or allow it > to be regenerated every time they add/remove includes would lead to less > adoption of modules. I can see people trying them and then giving up in > frustration. > > I think I read somewhere that the buildsystem in google already requires > included '.h' files to be listed explicitly in the buildsystem, so it's no > change in workflow there. > Not for the users of the library - but yes, a library needs to declare which headers it contains/provides. (I'm not sure it's at the point where that's required across the codebase - but it's a direction things are moving in). > For other teams, that would be a change which could be a change in > workflow and something rebelled against. 
> By the way, do you have any idea how much modules adoption would be needed to constitute "success"? Is there a goal there?

I don't know of any assessment like that.

> Nope, scratch that ^ I had thought that was the case, but talking more with Richard Smith it seems there's an expectation that modules will be somewhere between header and library granularity (obviously some small libraries today have one or only a few headers, some (like Qt) have many - maybe those on the Qt end might have slightly fewer modules than they have headers - but still several modules to one library most likely, by the sounds of it)
>
> Why? Richard, maybe you can answer that? These are the kinds of things I was trying to get answers to in the previous post to ISO SG2 in the Google group. I didn't get an answer as definitive as this, so maybe you can share the reason behind such a definitive answer?

It's more that the functionality will allow this & just judging by how people do things today (existing header granularity partly motivated by the cost of headers that doesn't apply to modules), how they're likely to do things in the future (I personally would guess people will probably try to just port their headers to modules - and in a few places where there are circular dependencies in headers or the like they might glob them up into one module).

> Now, admittedly, external dependencies are a little more complicated than internal (within a single project consisting of multiple libraries) - which is why I'd like to focus a bit on the simpler internal case first.
>
> Fair enough.
>
> > Today, a beginner can find a random C++ book, type in a code example from chapter one and put `g++ -I/opt/book_examples prog1.cpp` into a terminal and get something compiling and running. With modules, they'll potentially have to pass a whole list of module files too.
>> > > Yeah, there's some talk of supporting a mode that doesn't explicitly > build/use modules in the filesystem, but only in memory for the purpose of > preserving the isolation semantics of modules. This would be used in simple > direct-compilation cases like this. Such a library might need a > configuration file or similar the compiler can parse to discover the > parameters (warning flags, define flags, whatever else) needed to build the > BMI. > > > Perhaps. I'd be interested in how far into the book such a system would > take a beginner. Maybe that's fine, I don't know. Such a system might not > help with code in stack overflow questions/answers though, which would > probably be simpler sticking with includes (eg for Qt/boost). > I imagine the expectation is that eventually there will just be modules APIs to some libraries/classes, so it won't always be a user choice. (if there's sufficient penetration of modules, as you say - if there's sufficiently ubiquitous build support, etc - then library vendors will only provide a module API without a "backwards compatible" header based API) > Library authors will presumably have some say, or try to introduce some > 'best practice' for users to follow. And such best practice will be > different for each library. > I would expect so - as they do with documentation today. Though if they only provide a modules or only provide a header API - users will use whatever they're provided. 
> I raised some of these issues a few years ago regarding the clang >> implementation with files named exactly module.modulemap: >> >> >> http://clang-developers.42468.n3.nabble.com/How-do-I-try-out-C-modules-with-clang-td4041946.html >> >> >> http://clang-developers.42468.n3.nabble.com/How-do-I-try-out-C-modules-with-clang-td4041946i20.html >> >> Interestingly, GCC is taking a directory-centric approach in the driver >> (-fmodule-path=) as opposed to the 'add a file to your compile line >> for each import' that Clang and MSVC are taking: >> >> http://gcc.gnu.org/wiki/cxx-modules >> >> Why is Clang not doing a directory-centric driver-interface? It seems to >> obviously solve problems. I wonder if modules can be a success without >> coordination between major compiler and buildsystem developers. That's why >> I made the git repo - to help work on something more concrete to see how >> things scale. >> > > 'We' (myself & other Clang developers) are/will be talking to GCC folks to > try to get consistency here, in one direction or another (maybe some 3rd > direction different from Clang or LLVM's). As you noted in a follow-up, > there is a directory-based flag in Clang now, added by Boris as he's been > working through adding modules support to Build2. > > > I just looked through the commits from Boris, and it seems he made some > changes relating to -fmodule-file=. That still presupposes that all > (transitively) used module files are specified on the command line. > Actually I believe the need is only the immediate dependencies - at least with Clang's implementation. > I was talking about the -fprebuilt-module-path option added by Manman Ren > in https://reviews.llvm.org/D23125 because that actually relieves the > user/buildsystem of maintaining a list of all used modules (I hope). > *nod* & as you say, GCC has something similar. Though the build system probably wants to know about the used modules to do dependency analysis & rebuilding correctly. 
So that's something I'm still trying to figure out - if the dependency information (& indeed, the names of the BMI files) can be wholly known only by the compiler or wholly

> > Having just read all of my old posts again, I still worry things like this will hinder modules 'too much' to be successful. The more (small) barriers exist, the less chance of success. If modules aren't successful, then they'll become a poisoned chalice and no one will be able to work on fixing them. That's actually exactly what I expect to happen, but I also still hope I'm just missing something :). I really want to see a committee document from the people working on modules which actually explores the problems and barriers to adoption and concludes with 'none of those things matter'. I think it's fixable, but I haven't seen anyone interested enough to fix the problems (or even to find out what they are).
>
> Indeed - hence my desire to talk through these things, get some practical experience, document them to the committee in perhaps a less-ranty, more concrete form along with pros/cons/unknowns/etc to hopefully find some consistency, maybe write up a document of "this is how we expect build systems to integrate with this C++ feature", etc.
>
> Great. Nathan Sidwell already wrote a paper which is clearer than I am on some of the problems:
>
>   http://open-std.org/JTC1/SC22/WG21/docs/papers/2017/p0778r0.pdf
>
> However he told me it 'wasn't popular'. I don't know if he means the problems were dismissed, or his proposed solution was dismissed as not popular. Nevertheless, I recommend reading the problems stated there.

Yeah, thanks for the link - useful to read.
> My current very simplistic prototype, to build a module file, its >> respective module object file, and include those in the library/link for >> anything that depends on this library: >> >> add_custom_command( >> COMMAND ${CMAKE_CXX_COMPILER} ${CMAKE_CXX_FLAGS} -xc++ -c >> -Xclang -emit-module -fmodules -fmodule-name=Hello >> ${CMAKE_CURRENT_SOURCE_DIR}/module.modulemap -o >> ${CMAKE_CURRENT_BINARY_DIR}/hello_module.pcm -Xclang -fmodules-codegen >> DEPENDS module.modulemap hello.h >> >> >> Why does this command depend on hello.h? >> > > Because it builds the binary module interface (hello_module.pcm) that is a > serialized form of the compiler's internal representation of the contents > of module.modulemap which refers to hello.h (the modulemap lists the header > files that are part of the module). This is all using Clang's current > backwards semi-compatible "header modules" stuff. In a "real" modules > system, ideally there wouldn't be any modulemap. Just a .cppm file, and any > files it depends on (discovered through the build system scanning the > module imports, or a compiler-driven .d file style thing). > > Perhaps it'd be better for me to demonstrate something closer to the > actual modules reality, rather than this retro header modules stuff that > clang supports. > > > That would be better for me. I'm interested in modules-ts, but I'm not > interested in clang-modules. > > > > >> If that is changed and module.modulemap is not, what will happen? >> > > If hello.h is changed and module.modulemap is not changed? The > hello_module.pcm does need to be rebuilt. > > > Hmm, this assumes that the pcm/BMI only contains declarations and not > definitions, right? > Nah, it might contain definitions, same as a header file would today (inline) - or non-inline definitions. > I think clang outputs the definitions in a separate object file, but GCC > currently doesn't. Perhaps that's a difference that cmake has to account > for or pass on to the user. 
> Clang outputs frontend-usable (not object code, but serialized AST usable for compiling other source code) descriptions of the entire module (whatever it contains - declarations, definitions, etc) to the .pcm file. It can then, in a separate step, build an object file from the pcm. I think GCC produces both of these artifacts in one go - but not in the same file. > Ideally all of this would be implicit (maybe with some flag/configuration, > or detected based on new file extensions for C++ interface definitions) in > the add_library - taking, let's imagine, the .ccm (let's say, for > argument's sake*) file listed in the add_library's inputs and using it to > build a .pcm (BMI), building that .pcm as an object file along with all the > normal .cc files, > > > Ok, I think this is the separation I described above. > > > * alternatively, maybe they'll all just be .cc files & a build system > would be scanning the .cc files to figure out dependencies & could notice > that one of them is the blessed module interface definition based on the > first line in the file. > > > Today, users have to contend with errors resulting from their own code > being incorrect, using some 3rd party template incorrectly, linking not > working due to incorrect link dependencies, and incorrect compiles due to > missing include directories (or incorrect defines specified). I can see > incorrect inputs to module generation being a new category of errors to > confuse users. > > For example, if in your list of files there are two files which look like > the blessed module interface based on the first line in the file, > Two files that both start with "export module foo"? Yes, that would be problematic/erroneous. > there will be something to debug. > > > So I suppose the more advanced question: Is there a way I can extend > handling of existing CXX files (and/or define a new kind of file, say, > CXXM?) specified in a cc_library. 
If I want to potentially check if a .cc > file is a module, discover its module dependencies, add new rules about how > to build those, etc. Is that do-able within my cmake project, or would that > require changes to cmake itself? (I'm happy to poke around at what those > changes might look like) > > > One of the things users can do in order to ensure that CMake works best is > to explicitly list the cpp files they want compiled, instead of relying on > globbing as users are prone to want to do: > > > https://stackoverflow.com/questions/1027247/is-it-better-to-specify-source-files-with-glob-or-each-file-individually-in-cmak > > if using globbing, adding a new file does not cause the buildsystem to be > regenerated, and you won't have a working build until you explicitly cause > cmake to be run again. > > I expect you could get into similar problems with modules - needing a > module to be regenerated because its dependencies change (because it > exports what it imports from a dependency for example). I'm not sure > anything can be done to cause cmake to reliably regenerate the module in > that case. It seems similar to the globbing case to me. > > But aside from that you could probably experimentally come up with a way > to do the check for whether a file is a module and discover its direct > dependencies using file(READ). You might want to delegate to a script in > another language to determine transitive dependencies and what > add_custom{_command,_target} code to generate. > OK > > > >> But this isn't ideal - I don't /think/ I've got the dependencies quite >> right & things might not be rebuilding at the right times. >> Also it involves hardcoding a bunch of things like the pcm file names, >> header files, etc. >> >> >> Indeed. I think part of that comes from the way modules have been >> designed. The TS has similar issues. >> > > Sure - but I'd still be curious to understand how I might go about > modifying the build system to handle this. 
If there are obvious things I > have gotten wrong about the dependencies, etc, that would cause this not to > rebuild on modifications to any of the source/header files - I'd love any > tips you've got. > > > Sure. I didn't notice anything from reading, but I also didn't try it out. > You might need to provide a repo with the module.modulemap/c++ files etc > that are part of your experiment. Or better, provide something based on > modules-ts that I can try out. > *nod* I'll see if I can get enough of modules-ts type things working to provide some examples, but there's some more variance/uncertainty there in the compiler support, etc. > & if there are good paths forward for ways to prototype changes to the > build system to handle, say, specifying a switch/setting a property/turning > on a feature that I could implement that would collect all the .ccm files > in an add_library rule and use them to make a .pcm file - I'd be happy to > try prototyping that. > > > cmGeneratorTarget has a set of methods like GetResxSources which return a > subset of the files provided to add_library/target_sources by splitting > them by 'kind'. You would probably extend ComputeKindedSources to handle > the ccm extension, add a GetCCMFiles() to cmGeneratorTarget, then use that > new GetCCMFiles() in the makefiles/ninja generator to generate rules. > > When extending ComputeKindedSources could use > > if(Target->getPropertyAsBool("MAKE_CCM_RULES")) > > as a condition to populating the 'kind'. Then rules will only be created > for targets which use something like > > set_property(TARGET myTarget PROPERTY MAKE_CCM_RULES ON) > > in cmake code. > > I'm guessing that's enough for you to implement what you want as an > experiment? > OK, so in that case it requires source changes to cmake? *nod* sounds plausible - I appreciate the pointers. I take it that implies there's not a way I could hook into those file kinds and filters without changing cmake? 
(ie: from within my project's cmake build files, without modifying a cmake release) > > > Ideally, at least for a simplistic build, I wouldn't mind generating a >> modulemap from all the .h files (& have those headers listed in the >> add_library command - perhaps splitting public and private headers in some >> way, only including the public headers in the module file, likely). >> Eventually for the standards-proposal version, it's expected that there >> won't be any modulemap file, but maybe all headers are included in the >> module compilation (just pass the headers directly to the compiler). >> >> >> In a design based on passing directories instead of files, would those >> directories be redundant with the include directories? >> > > I'm not sure I understand the question, but if I do, I think the answer > would be: no, they wouldn't be redundant. The system will not have > precompiled modules available to use - because binary module definitions > are compiler (& compiler version, and to some degree, compiler flags (eg: > are you building this for x86 32 bit or 64 bit?)) dependent. > > > Right. I discussed modules with Nathan Sidwell meanwhile and realised this > too. > > > > >> One of the problems modules adoption will hit is that all the compilers >> are designing fundamentally different command line interfaces for them. >> > > *nod* We'll be working amongst GCC and Clang at least to try to converge > on something common. > > > Different flags would not be a problem for cmake at least, but if Clang > didn't have something like -fprebuilt-module-path and GCC did, that would > be the kind of 'fundamental' difference I mean. > > > This also doesn't start to approach the issue of how to build modules for >> external libraries - which I'm happy to discuss/prototype too, though >> interested in working to streamline the inter-library but intra-project >> (not inter-project) case first. >> >> >> Yes, there are many aspects to consider. 
>> >> Are you interested in design of a CMake abstraction for this stuff? I >> have thoughts on that, but I don't know if your level of interest stretches >> that far. >> > > Not sure how much work it'd be - at the moment my immediate interest is to > show as much real-world/can-actually-run prototype with cmake as possible, > either with or without changes to cmake itself (or a combination of minimal > cmake changes plus project-specific recipes of how to write a user's cmake > files to work with this stuff) or also showing non-working/hypothetical > prototypes of what ideal user cmake files would look like with > reasonable/viable (but not yet implemented) cmake support. > > > Yes, it's specifying the ideal user cmake files that I mean. Given that > granularity of modules can be anywhere on the spectrum between > one-module-file-per-library and one-module-file-per-class, I think cmake > will need to consider one-module-file-per-library and > *not*-one-module-file-per-library separately. > > In the *not*-one-module-file-per-library case, cmake might have to > delegate more to the user, so it would be more inconvenient for them. > I'm not sure why that would be the case - they'd hopefully be as easy as the other. The same way that header inclusion discovery doesn't require the user to do anything today - though maybe we'd get more thorough checking that you don't depend on modules from libraries you haven't depended upon, that'd be nice. > > In the one-module-file-per-library case, I think the ideal is something > like: > > add_library(foo foo.cpp) > # assuming foo.h is a module interface file, this creates > # a c++-module called foo and makes it an interface usage > # requirement of the foo target defined above > add_cxx_module(foo foo.h) > > # bar.cpp imports foo. > add_library(bar bar.cpp) > # bar links to foo, and a suitable compile line argument is added if > # needed for the foo module. 
> target_link_libraries(bar foo) > > This would work best if foo.h did not contain > > module; > export module foo; > > (after > http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2018/p0713r1.html) > > but instead contained only > > module; > > and the module name came from the buildsystem (or from the compiler using > the basename). > > As it is, the above cmake code would have to determine the module name > from foo.h and throw an error if it was different from foo. Having the > module name inside the source just adds scope for things to be wrong. It > would be better to specify the module name on the outside. > > I wonder what you think about that, and whether it can be changed in the > modules ts? My thoughts on this and what the ideal in cmake would be are > changing as the discussion continues. > I'm not too involved in the standardization process - though I'm trying to help inform it by getting a better understanding of what's clear, what's not, how it all might come together & helping inform the committee of that. So I'm not sure if/how practical it would be to change that at this stage, but my default assumption would be "not easy" - at least in the sense that, for now, I'm mostly looking at trying to describe how the existing things could work (and what parts of that might be painful - including this name duplication sort of situations like you're pointing out) & likely leaving it to other people on the committee to decide if those pain points are sufficient to consider changing the design. > > > Can you help? It would really help my understanding of where things >> currently stand with modules. >> > > I can certainly have a go, for sure. > > > Great, thanks. > > > > >> For example, is there only one way to port the contents of the cpp files? 
>> > > Much like header grouping - how granular headers are (how many headers you > have for a given library) is up to the developer to some degree (certain > things can't be split up), similarly with modules - given a set of C++ > definitions, it's not 100% constrained how those definitions are exposed as > modules - the developer has some freedom over how the declarations of those > entities are grouped into modules. > > > Yes, exactly. This repo is small, but has a few libraries, so if we start > with one approach we should be easily able to also try a different approach > and examine what the difference is and what it means. > > > > >> After that, is there one importable module per class or one per shared >> library (which I think would make more sense for Qt)? >> > > Apparently (this was a surprise to me - since I'd been thinking about this > based on the Clang header modules (backwards compatibility stuff, not the > standardized/new language feature modules)) the thinking is probably > somewhere between one-per-class and one-per-shared-library. But for me, in > terms of how a build file would interact with this, more than > one-per-shared-library is probably the critical tipping point. > > > Yes. I think you're talking about the one-module-file-per-library and > *not*-one-module-file-per-library distinction I mentioned above. > > > If it was just one per shared library, then I'd feel like the > dependency/flag management would be relatively simple. You have to add a > flag to the linker commandline to link in a library, so you have to add a > flag to the compile step to reference a module, great. But, no, bit more > complicated than that given the finer granularity that's expected here. > > > "finer granularity that's *allowed* here" really. 
If there is a simple > thing for the user to do (ie one-module-file-per-library), then cmake can > make that simple to achieve (because the dependencies between modules are > the same as dependencies between targets, which the user already specifies > with target_link_libraries). > > If the user wants to do the more complicated thing > (*not*-one-module-file-per-library), then cmake can provide APIs for the > user to do that (perhaps by requiring the user to explicitly specify the > dependencies between modules). > > My point is that cmake's design can optimize for the easy way, and I think > users will choose the easy way most of the time. > > > > The git repo is an attempt to make the discussion concrete because it >> would show how multiple classes and multiple libraries with dependencies >> could interact in a modules world. I'm interested in what it would look >> like ported to modules-ts, because as far as I know, clang-modules and >> module maps would not need porting of the cpp files at all. >> > > Right, clang header-modules is a backwards compatibility feature. It does > require a constrained subset of C++ to be used to be effective (ie: > basically your headers need to be what we think of as ideal/canonical > headers - reincludable, independent, complete, etc). So if you've got > good/isolated headers, you can port them to Clang's header modules by > adding the module maps & potentially not doing anything else - though, if > you rely on not changing your build system, then that presents some > problems if you want to scale (more cores) or distribute your build. > Because the build system doesn't know about these dependencies - so if you > have, say, two .cc files that both include foo.h then bar.h - well, the > build system runs two compiles, both compiles try to implicitly build the > foo.h module - one blocks waiting for the other to complete, then they > continue and block again waiting for bar.h module to be built.
If the build > system knew about these dependencies (what Google uses - what we call > "explicit (header)modules") then it could build the foo.h module and the > bar.h module in parallel, then build the two .cc files in parallel. > > > I think that the 'build a module' step should be a precondition to the > compile step. I think the compiler should issue an error if it encounters > an import for a module it doesn't find a file for. No one expects a linker > to compile foo.cpp into foo.o and link it just because it encounters a > fooFunc without a definition which was declared in foo.h. > > That would reduce the magic and expect something like > > add_cxx_module(somename somefile.h otherfiles.h) > > to specify a module file and its constituent partitions, which I think is > fine. > > > Basically: What do folks think about supporting these sort of features in >> CMake C++ Builds? Any pointers on how I might best implement this with or >> without changes to CMake? >> >> >> I think some design is needed up front. I expect CMake would want to have >> a first-class (on equal footing with include directories or compile >> definitions and with particular handling) concept for modules, extending >> the install(TARGET) command to install module binary files etc. >> > > Module binary files wouldn't be installed in the sense of being part of > the shipped package of a library - because module binary files are > compiler/flag/etc specific. > > > Ok. > > Thanks, > > Stephen. > > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From dblaikie at gmail.com Wed May 9 18:36:31 2018 From: dblaikie at gmail.com (David Blaikie) Date: Wed, 09 May 2018 22:36:31 +0000 Subject: [cmake-developers] Experiments in CMake support for Clang (header & standard) modules In-Reply-To: <17a12188-2e2e-26d8-8ace-7b3f1765f7a3@kitware.com> References: <2a2abd48-dee2-dc1a-b5e6-33e89d9b1472@gmail.com> <4f9ba68d-ae14-4088-a8cb-8a2b23659ad6@gmail.com> <17a12188-2e2e-26d8-8ace-7b3f1765f7a3@kitware.com> Message-ID: On Mon, May 7, 2018 at 10:13 AM Brad King wrote: > On 05/07/2018 12:01 PM, Stephen Kelly wrote: > > Hopefully Brad or someone else can provide other input from research > already done. > > I'm not particularly familiar with what compiler writers or the modules > standard specification expects build systems to do w.r.t modules. > However, IIUC at least at one time the expectation was that the module > files would not be installed like headers The module interface source file is, to the best of my knowledge, intended to be installed like headers - and I'm currently advocating/leaning/pushing towards it being installed exactly that way (in the same directories, using the same include search path, etc). > and are used only within a local build tree. The /binary/ representation of the module interface is likely to be only local to a specific build tree - since it's not portable between compilers or different compiler flag sets, even. > Are modules even supposed to be first-class entities > in the build system specification that users write? > > In the Fortran world users just list all the sources and build systems are > expected to figure it out. CMake has very good support for Fortran > modules. 
> Our Ninja generator has rules to preprocess the translation units first, > then parse the preprocessed output to extract module definitions and > usages, > then inject the added dependencies into the build graph, and then begin > compilation of sources ordered by those dependencies (this requires a > custom fork of Ninja pending upstream acceptance). > > Is that what is expected from C++ buildsystems for modules too? > Yes, likely something along those lines - though I'm looking at a few different possible support models. A couple of major different ones (that may be both supported by GCC and Clang at least, if they work out/make sense) are: * the wrapper-script approach, where, once the compiler determines the set of direct module dependencies, it would invoke a script to ask for the location of the binary module interface files for those modules. Build systems could use this to dynamically discover the module dependencies of a file as it is being compiled. * tool-based parsing (more like what you've described Fortran+Ninja+CMake is doing). The goal is to limit the syntax of modules code enough that discovering the set of direct module dependencies is practical for an external (non-compiler) tool - much like just some preprocessing and looking for relatively simple keywords, etc. - then the tool/build system can find the dependencies ahead of time without running the compiler (a 3rd scenario, is what I've been sort of calling the "hello world" example - where it's probably important that it's still practical for a new user to compile something simple like "hello world" that uses a modularized standard library, without having to use a build system for it (ie: just run the compiler & it goes off & builds the in-memory equivalent of BMIs without even writing them to disk/reusing them in any way)) - Dave > > -Brad > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From steveire at gmail.com Tue May 15 03:22:52 2018 From: steveire at gmail.com (Stephen Kelly) Date: Tue, 15 May 2018 08:22:52 +0100 Subject: [cmake-developers] Experiments in CMake support for Clang (header & standard) modules References: <2a2abd48-dee2-dc1a-b5e6-33e89d9b1472@gmail.com> <4f9ba68d-ae14-4088-a8cb-8a2b23659ad6@gmail.com> <17a12188-2e2e-26d8-8ace-7b3f1765f7a3@kitware.com> Message-ID: Brad King wrote: > On 05/07/2018 12:01 PM, Stephen Kelly wrote: >> Hopefully Brad or someone else can provide other input from research >> already done. > > I'm not particularly familiar with what compiler writers or the modules > standard specification expects build systems to do w.r.t modules. > However, IIUC at least at one time the expectation was that the module > files would not be installed like headers and are used only within a > local build tree. Are modules even supposed to be first-class entities > in the build system specification that users write? The answer is probably both 'hopefully not' and 'sometimes'. > In the Fortran world users just list all the sources and build systems are > expected to figure it out. CMake has very good support for Fortran > modules. Our Ninja generator has rules to preprocess the translation units > first, then parse the preprocessed output to extract module definitions > and usages, then inject the added dependencies into the build graph, and > then begin compilation of sources ordered by those dependencies (this > requires a custom fork of Ninja pending upstream acceptance). > > Is that what is expected from C++ buildsystems for modules too? Hopefully. However, in some cases, the step of 'extracting module definitions and usages' might be very hard to do. This document is quite concise about that: http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2018/p1052r0.html So, the answer for cmake might be that CMake can learn to extract that stuff, but ignore certain cases like imports within ifdefs. 
Maybe CMake could then also provide API for users to specify the usages/dependencies explicitly in those cases. I don't know how convenient that would be (or could be made through design). Thanks, Stephen. From steveire at gmail.com Tue May 15 04:34:30 2018 From: steveire at gmail.com (Stephen Kelly) Date: Tue, 15 May 2018 09:34:30 +0100 Subject: [cmake-developers] Experiments in CMake support for Clang (header & standard) modules References: <2a2abd48-dee2-dc1a-b5e6-33e89d9b1472@gmail.com> <4f9ba68d-ae14-4088-a8cb-8a2b23659ad6@gmail.com> Message-ID: David Blaikie wrote: >> Nope, scratch that ^ I had thought that was the case, but talking more >> with Richard Smith it seems there's an expectation that modules will be >> somewhere between header and library granularity (obviously some small >> libraries today have one or only a few headers, some (like Qt) have many >> - maybe those on the Qt end might have slightly fewer modules than they >> have headers - but still several modules to one library most likely, by >> the sounds of it) >> >> >> Why? Richard maybe you can answer that? These are the kinds of things I >> was trying to get answers to in the previous post to iso sg2 in the >> google group. I didn't get an answer as definitive as this, so maybe you >> can share the reason behind such a definitive answer? >> > > It's more that the functionality will allow this & just judging by how > people do things today (existing header granularity partly motivated by > the cost of headers that doesn't apply to modules), how they're likely to > do things in the future (I personally would guess people will probably try > to just port their headers to modules - and a few places where there are > circular dependencies in headers or the like they might glob them up into > one module). It seems quite common to have one PCH file per shared library (that's what Qt does for example). What makes you so sure that won't be the case with modules?
I'd say that what people will do will be determined by whatever their tools optimize for. If it is necessary to list all used modules on the compile line, people would choose fewer modules. If 'import QtCore' is fast and allows the use of QString and QVariant etc and there is no downside, then that will be the granularity offered by Qt (instead of 'QtCore.QString'). That is also comparable to '#include <QtCore>' which is possible today. >> I just looked through the commits from Boris, and it seems he made some >> changes relating to -fmodule-file=. That still presupposes that all >> (transitively) used module files are specified on the command line. >> > > Actually I believe the need is only the immediate dependencies - at least > with Clang's implementation. Ok. That's not much better though. It still means editing/generating the buildsystem each time you add an import. I don't think a model with that requirement will gain adoption. >> I was talking about the -fprebuilt-module-path option added by Manman Ren >> in https://reviews.llvm.org/D23125 because that actually relieves the >> user/buildsystem of maintaining a list of all used modules (I hope). >> > > *nod* & as you say, GCC has something similar. Though the build system > probably wants to know about the used modules to do dependency analysis & > rebuilding correctly. Yes, presumably that will work with -MM. > Yeah, thanks for the link - useful to read. There seems to be a slew of activity around modules at the moment. You can read some other reactions here which might have input for your paper: https://www.reddit.com/r/cpp/comments/8jb0nt/what_modules_can_actually_provide_and_what_not/ https://www.reddit.com/r/cpp/comments/8j1edf/really_think_that_the_macro_story_in_modules_is/ I look forward to reading your paper anyway. >> I think clang outputs the definitions in a separate object file, but GCC >> currently doesn't. Perhaps that's a difference that cmake has to account >> for or pass on to the user.
>> > > Clang outputs frontend-usable (not object code, but serialized AST usable > for compiling other source code) descriptions of the entire module > (whatever it contains - declarations, definitions, etc) to the .pcm file. > It can then, in a separate step, build an object file from the pcm. I > think GCC produces both of these artifacts in one go - but not in the same > file. Ok, I must have misremembered something. >> Sure. I didn't notice anything from reading, but I also didn't try it >> out. You might need to provide a repo with the module.modulemap/c++ files >> etc that are part of your experiment. Or better, provide something based >> on modules-ts that I can try out. >> > > *nod* I'll see if I can get enough of modules-ts type things working to > provide some examples, but there's some more variance/uncertainty there in > the compiler support, etc. Something working only with clang for example would be a good start. >> I'm guessing that's enough for you to implement what you want as an >> experiment? >> > > OK, so in that case it requires source changes to cmake? *nod* sounds > plausible - I appreciate the pointers. I take it that implies there's not > a way I could hook into those file kinds and filters without changing > cmake? (ie: from within my project's cmake build files, without modifying > a cmake release) There is no way to hook into the system I described without patching CMake. Your custom command approach might be the way to do that if it is the priority. Thanks, Stephen. 
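[Editorial note: the "custom command approach" discussed in this thread might look roughly like the following. This is a sketch only, using the file names from the prototype quoted at the top of the thread (module.modulemap, hello.h) plus a hypothetical hello.cc, and Clang's then-current experimental flags; it is an experiment, not an established CMake pattern.]

```cmake
# Step 1: build the binary module interface (.pcm) from the module map.
# File names and flags mirror the prototype earlier in the thread.
set(pcm ${CMAKE_CURRENT_BINARY_DIR}/hello_module.pcm)

add_custom_command(
  OUTPUT ${pcm}
  COMMAND ${CMAKE_CXX_COMPILER} ${CMAKE_CXX_FLAGS} -xc++ -c
          -Xclang -emit-module -fmodules -fmodule-name=Hello
          ${CMAKE_CURRENT_SOURCE_DIR}/module.modulemap
          -o ${pcm}
  DEPENDS module.modulemap hello.h   # rebuild when the map or a listed header changes
  COMMENT "Building binary module interface hello_module.pcm")

# Step 2: compile the .pcm itself into an object file (the separate step
# Clang supports, as described above).
add_custom_command(
  OUTPUT ${CMAKE_CURRENT_BINARY_DIR}/hello_module.o
  COMMAND ${CMAKE_CXX_COMPILER} -c ${pcm}
          -o ${CMAKE_CURRENT_BINARY_DIR}/hello_module.o
  DEPENDS ${pcm}
  COMMENT "Compiling hello_module.pcm to an object file")

# Link the module's object file into the library, and make the BMI a
# usage requirement so consumers can import it.
add_library(hello hello.cc ${CMAKE_CURRENT_BINARY_DIR}/hello_module.o)
target_compile_options(hello PUBLIC -fmodules -fmodule-file=${pcm})
```

As the thread notes, this hardcodes the .pcm name and the header list, and the consumer-side dependency edges still have to be maintained by hand, which is exactly the part that would need first-class CMake support.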
From brad.king at kitware.com Tue May 15 09:18:07 2018 From: brad.king at kitware.com (Brad King) Date: Tue, 15 May 2018 09:18:07 -0400 Subject: [cmake-developers] Experiments in CMake support for Clang (header & standard) modules In-Reply-To: References: <2a2abd48-dee2-dc1a-b5e6-33e89d9b1472@gmail.com> <4f9ba68d-ae14-4088-a8cb-8a2b23659ad6@gmail.com> <17a12188-2e2e-26d8-8ace-7b3f1765f7a3@kitware.com> Message-ID: On 05/15/2018 03:22 AM, Stephen Kelly wrote: > So, the answer for cmake might be that CMake can learn to extract that > stuff, but ignore certain cases like imports within ifdefs. We'd need to do the extraction from already-preprocessed sources. This is how Fortran+Ninja+CMake works. Unfortunately for C++ this will typically require preprocessing twice: once just to extract module dependencies and again to actually compile. With Fortran we compile using the already-preprocessed source but doing that with C++ will break things like Clang's nice handling of macros in diagnostic messages. -Brad From neundorf at kde.org Tue May 15 15:45:06 2018 From: neundorf at kde.org (Alexander Neundorf) Date: Tue, 15 May 2018 21:45:06 +0200 Subject: [cmake-developers] How to handle dependencies of protobuf files ? Message-ID: <4713367.Uu4IFVk9sG@linux-l7nd> Hi, I stumbled upon a problem with protobuf files, I attached a testcase. There is a MyBase.proto, which is "imported" by Complex.proto. If MyBase.proto is modified, protoc is run again on MyBase.proto, but not on Complex.proto, although it should be. You can have a look at the attached example. The message MyData (in Complex.proto) has a member MyBase b1. If I rename the message MyBase (in MyBase.proto) e.g. to MyBaseXYZ, then the build fails, because Complex.pb.h was not regenerated, so it still referred to the now-nonexistent class MyBase. Is there already a solution to handle this? I think to do it properly, there would have to be a dependency scanning for proto files like there is for C/C++ headers.
Parsing the proto files at cmake time wouldn't be good enough (since editing a proto file doesn't trigger a cmake run). Comments? Alex -------------- next part -------------- A non-text attachment was scrubbed... Name: protodeps.tar.gz Type: application/x-compressed-tar Size: 897 bytes Desc: not available URL: From kinga.kasa at doclerholding.com Thu May 17 05:56:25 2018 From: kinga.kasa at doclerholding.com (Kinga Kasa) Date: Thu, 17 May 2018 09:56:25 +0000 Subject: [cmake-developers] Using CMake to build big projects with cross-dependencies Message-ID: <84e20ee802fc4f92b53b5a25482c3c1d@doclerholding.com> Dear CMake Developers Team, I have a question regarding CMake. We are currently working on a rather big project and we would like to build it with CMake. The structure of the project looks like the following: we have a root directory where we have 50+ subdirectories with projects in them with their own CMakeLists.txt files. These subprojects are successfully built with cmake on their own. What we would like to do now is building the whole project together. It would be really easy (with add_subdirectories or include or ExternalProject_Add, we tried all of these) but the problem is that these subprojects depend on each other, and building the whole project with these will result in cmake running for hours and hours (stuck at saying Configuring done; eventually it will finish, but it runs for hours). We implemented in these cmake files our own logic not to generate and build the same subprojects multiple times (using a property, which is a list, where we store the names of the already-included targets, and every time we try to add a new target it checks whether it's already included or not). We would be glad if you could help us out with some advice on how to accomplish building the whole big project without building some targets multiple times but building it effectively without having to wait hours for the cmake to run.
Thank you in advance, Kinga Kása From brad.king at kitware.com Thu May 17 11:46:56 2018 From: brad.king at kitware.com (Brad King) Date: Thu, 17 May 2018 11:46:56 -0400 Subject: [cmake-developers] Using CMake to build big projects with cross-dependencies In-Reply-To: <84e20ee802fc4f92b53b5a25482c3c1d@doclerholding.com> References: <84e20ee802fc4f92b53b5a25482c3c1d@doclerholding.com> Message-ID: <4eba9402-d54a-e645-8eaf-23fb4f717f81@kitware.com> On 05/17/2018 05:56 AM, Kinga Kasa wrote: > cmake running for hours and hours (stuck at saying Configuring done, > eventually it will finish, but it runs for hours). That's not expected. Even on projects with tens of thousands of source files and thousands of libraries and executables it typically takes only a few minutes. You could try breaking in the debugger locally during generation to see what it's doing. Or, you could try bisecting your project's content to get a smaller example that reproduces the long generation time. > ExternalProject_Add We typically use this on very large projects to avoid any individual build taking too long. -Brad From robert.maynard at kitware.com Thu May 17 13:47:24 2018 From: robert.maynard at kitware.com (Robert Maynard) Date: Thu, 17 May 2018 13:47:24 -0400 Subject: [cmake-developers] [ANNOUNCE] CMake 3.11.2 available for download Message-ID: We are pleased to announce that CMake 3.11.2 is now available for download. Please use the latest release from our download page: https://cmake.org/download/ * Calling "add_library()" to create an alias of an imported target that is not globally visible now causes an error again as it did prior to 3.11.0. This diagnostic was accidentally dropped from CMake 3.11.0 and 3.11.1 by the change to allow globally visible imported targets to be aliased.
* The "FindQt4" module "qt4_wrap_cpp", "qt4_wrap_ui" and "qt4_add_resources" macros now set "SKIP_AUTOMOC" and "SKIP_AUTOUIC" on their generated files. These files never need to be processed by moc or uic, and we must say so explicitly to account for policy "CMP0071". Thanks for your support!

-------------------------------------------------------------------------
Changes in 3.11.2 since 3.11.1:

Brad King (8):
      Ninja: Do not add empty custom command for file(GENERATE) outputs
      C++ feature checks: Filter out warnings caused by local configuration
      libuv: linux/sparc64: use fcntl to set and clear O_NONBLOCK
      FindCUDA: Fix regression in separable compilation without cublas
      FindBoost: Remove extra indentation in 1.65/1.66 dependency block
      add_library: Restore error on alias of non-global imported target
      add_custom_{command,target}: Fix crash on empty expanded command
      CMake 3.11.2

Christian Pfeiffer (1):
      IRSL: Fix Intel library list for ifort-only setups

Christof Krüger (1):
      InstallRequiredSystemLibraries: Check for existence of mfcm dlls

Filip Matzner (1):
      FindBoost: Backport versioned python dependencies for v1.35 to v1.66

Marc Chevrier (5):
      Fix CMAKE_DISABLE_SOURCE_CHANGES recognition of top of build tree
      FindJava, FindJNI, UseJava: update for version 10 support
      FindJava, FindJNI: Ensure most recent version is searched first
      FindJava, FindJNI: fix erroneous regex, enhance registry lookup
      Help: Specify COMPILE_OPTIONS and COMPILE_FLAGS source properties usage

Matthew Woehlke (2):
      Qt4Macros: Use get_property/set_property
      Qt4Macros: Don't AUTOMOC or AUTOUIC qt4-generated files

Rolf Eike Beer (2):
      FindPkgConfig: do not unset unused variable
      FindBLAS: do not write an imported target name into BLAS_LIBRARIES

Sebastian Holtermann (1):
      Autogen: Register generated dependency files

From bill.hoffman at kitware.com Thu May 17 13:59:36 2018 From: bill.hoffman at kitware.com (Bill Hoffman) Date: Thu, 17 May 2018 13:59:36 -0400 Subject: [cmake-developers] Using CMake to build big
projects with cross-dependencies In-Reply-To: <4eba9402-d54a-e645-8eaf-23fb4f717f81@kitware.com> References: <84e20ee802fc4f92b53b5a25482c3c1d@doclerholding.com> <4eba9402-d54a-e645-8eaf-23fb4f717f81@kitware.com> Message-ID: On 5/17/2018 11:46 AM, Brad King wrote: > On 05/17/2018 05:56 AM, Kinga Kasa wrote: >> cmake running for hours and hours (stuck at saying Configuring done, >> eventually it will finish, but it runs for hours). > That's not expected. Even on projects with tens of thousands of > source files and thousands of libraries and executables it typically > takes only a few minutes. You could try breaking in the debugger > locally during generation to see what it's doing. Or, you could > try bisecting your project's content to get a smaller example that > reproduces the long generation time. Another quick way to see what is going on is to run cmake with the --trace option: cmake . --trace It will produce a lot of output but you should be able to see if it is repeating something or getting stuck on some operation. -Bill From matthias.goesswein at eeas.at Sun May 20 14:21:35 2018 From: matthias.goesswein at eeas.at (Gößwein Matthias / eeas gmbh) Date: Sun, 20 May 2018 20:21:35 +0200 Subject: [cmake-developers] "Linking" of Object Libraries In-Reply-To: References: <1d60abc2-57ee-11ab-0b30-cbf28cc32a68@eeas.at> Message-ID: Hello, I found a strange behavior within the object libraries in the upcoming CMake 3.12 (I used 3.11.20180519-gdb88f for testing). If I have for example two object libraries, which are used in one executable: add_library(ObjLib1 OBJECT ObjLib1.c) target_include_directories(ObjLib1 PUBLIC ...) add_library(ObjLib2 OBJECT ObjLib2.c) target_include_directories(ObjLib2 PUBLIC ...) add_executable(MyExe main.c) target_link_libraries(MyExe ObjLib1 ObjLib2) Then this works fine.
But if for some reason one object library "links" to the other and this is used for the executable then it does not work: add_library(ObjLib1 OBJECT ObjLib1.c) target_include_directories(ObjLib1 PUBLIC ...) target_link_libraries(ObjLib1 PUBLIC ObjLib2) add_library(ObjLib2 OBJECT ObjLib2.c) target_include_directories(ObjLib2 PUBLIC ...) add_executable(MyExe main.c) target_link_libraries(MyExe ObjLib1) I only get the usage requirements of ObjLib2, but not the object files of ObjLib2 into the executable. If I use STATIC Libraries instead it works. Is this behavior intended? I read the documentation too and I know that there is no link step for object libraries, but I guess it's the same for the static libraries (they are not linked together, instead both are used on the link line of the executable). A similar solution would be nice for object libraries, because otherwise the usage of object libraries which depend on other object libraries is not working well. Right now to get the compilation working I have to either use the TARGET_OBJECTS generator expression at the add_executable command, or I have to link explicitly to ObjLib2 for the executable: add_executable(MyExe main.c $<TARGET_OBJECTS:ObjLib2>) or target_link_libraries(MyExe ObjLib1 ObjLib2) For both possibilities I have to repeat the dependency in some form and it gets worse if the depth of the dependencies is going deeper. Best regards, Matthias. From brad.king at kitware.com Mon May 21 10:54:00 2018 From: brad.king at kitware.com (Brad King) Date: Mon, 21 May 2018 10:54:00 -0400 Subject: [cmake-developers] "Linking" of Object Libraries In-Reply-To: References: <1d60abc2-57ee-11ab-0b30-cbf28cc32a68@eeas.at> Message-ID: On 05/20/2018 02:21 PM, Gößwein Matthias / eeas gmbh wrote: > I found a strange behavior within the object libraries in the upcoming > CMake 3.12 (I used 3.11.20180519-gdb88f for testing). Thanks for trying it out!
> If I have for example two object libraries, which are used in one > executable: > > add_library(ObjLib1 OBJECT ObjLib1.c) > target_include_directories(ObjLib1 PUBLIC ...) > > add_library(ObjLib2 OBJECT ObjLib2.c) > target_include_directories(ObjLib2 PUBLIC ...) > > add_executable(MyExe main.c) > target_link_libraries(MyExe ObjLib1 ObjLib2) > > Then this works fine. Good. > But if for some reason one object library "links" > to the other and this is used for the executable then it does not work: > > add_library(ObjLib1 OBJECT ObjLib1.c) > target_include_directories(ObjLib1 PUBLIC ...) > target_link_libraries(ObjLib1 PUBLIC ObjLib2) > > add_library(ObjLib2 OBJECT ObjLib2.c) > target_include_directories(ObjLib2 PUBLIC ...) > > add_executable(MyExe main.c) > target_link_libraries(MyExe ObjLib1) > > I only get the usage requirements of ObjLib2, but not the object files > of ObjLib2 into the executable. This is expected as things are currently designed. Object files are only linked when the object library is *directly* referenced by a target. Only usage requirements are transitive, not the object files. > If I use STATIC Libraries instead it works. > I read the documentation too and I know that > there is no link step for object libraries, but I guess it's the same > for the static libraries Static libraries have an archiving step rather than a link step, and that step plays the role of collecting object files together. Object libraries have no such step. A major distinction is that listing object files on a link line causes them to be included in the link unconditionally. We can't simply list all object library transitive dependencies or their objects may be included multiple times in one target or duplicated in multiple dependent targets.
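For illustration, here is a sketch of one arrangement that avoids the duplication. It reuses the ObjLib1/ObjLib2 names from the example above; the Lib2Archive wrapper target and the include path are made up, and calling target_link_libraries() on an OBJECT library assumes the 3.12 development behavior discussed in this thread:

```cmake
# Sketch only -- Lib2Archive is a hypothetical intermediate target.
add_library(ObjLib2 OBJECT ObjLib2.c)
target_include_directories(ObjLib2 PUBLIC lib2/include)

# Collect ObjLib2's objects into an archive exactly once.  A linker
# extracts a given archive member at most once per link, so dependents
# can reach Lib2Archive through any number of paths without the
# objects being duplicated.
add_library(Lib2Archive STATIC $<TARGET_OBJECTS:ObjLib2>)
target_include_directories(Lib2Archive PUBLIC lib2/include)

add_library(ObjLib1 OBJECT ObjLib1.c)
target_link_libraries(ObjLib1 PUBLIC Lib2Archive)

add_executable(MyExe main.c)
target_link_libraries(MyExe ObjLib1)  # ObjLib1's objects + the archive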
> usage of object libraries which depend on other object libraries > is not working well...I have to repeat in some sort the dependency Yes, and this is because object libraries are meant to be building blocks for normal libraries and executables. How they are packaged into those libraries and executables needs to be under the full control of project code. For example, maybe your ObjLib2 objects are supposed to go into some static library that MyExe links. -Brad From neundorf at kde.org Fri May 25 15:42:06 2018 From: neundorf at kde.org (Alexander Neundorf) Date: Fri, 25 May 2018 21:42:06 +0200 Subject: [cmake-developers] How to handle dependencies of protobuf files ? In-Reply-To: <4713367.Uu4IFVk9sG@linux-l7nd> References: <4713367.Uu4IFVk9sG@linux-l7nd> Message-ID: <1680541.IPl1eWS8Cu@linux-l7nd> Any comments ? Alex On 2018 M05 15, Tue 21:45:06 CEST Alexander Neundorf wrote: > Hi, > > I stumbled upon a problem with protobuf files, I attached a testcase. > There is a MyBase.proto, which is "imported" by Complex.proto. > If MyBase.proto is modified, protoc is run again on MyBase.proto, but not on > Complex.proto, although it should be. > You can have a look at the attached example. > > The message MyData (in Complex.proto) has a member MyBase b1. > If I rename the message MyBase (in MyBase.proto) e.g. to MyBaseXYZ, then the > build fails, because Complex.pb.h was not regenerated, so it still referred > to the now nonexistent class MyBase. > > Is there already a solution to handle this ? > > I think to do it properly, there would have to be a dependency scanning for > proto files like there is for C/C++ headers. > Parsing the proto files at cmake time wouldn't be good enough (since > editing a proto file doesn't trigger a cmake run). > > Comments ?
> > Alex From kyle.edwards at kitware.com Fri May 25 16:27:35 2018 From: kyle.edwards at kitware.com (Kyle Edwards) Date: Fri, 25 May 2018 16:27:35 -0400 Subject: [cmake-developers] Error handling in dashboard scripts Message-ID: Hi all, I'm working on a set of build scripts that use CMake and CTest, and I'm trying to figure out the best way to handle failures in CTest (I'm using a dashboard script internally.) If the configure or build step fails, I want the failure to be reported to CDash with ctest_submit(), but I also want CTest to exit with an error code so that the calling process can detect a failure. The documentation for ctest_configure(), ctest_build(), and ctest_submit() isn't completely clear on what happens if one of these steps fails. Let's say I have the following dashboard script (this is pseudocode, arguments have been deliberately omitted for brevity, this example won't work): ctest_start() ctest_configure() ctest_build() ctest_test() ctest_submit() What happens if ctest_configure() fails (for example, because CMake failed to find a needed library)? Does the entire script stop right there? Or does it continue and try to execute ctest_build() anyway? Does ctest_build() silently do nothing because the configure step failed? Looking through the documentation for ctest_configure() and ctest_build(), I see some information on the RETURN_VALUE and CAPTURE_CMAKE_ERROR arguments, but it's not clear what happens if these aren't used. If someone could clarify for me what's supposed to happen here, and what the recommended best practices are for making sure that ctest_submit() still gets called in the event of a failure, I will gladly submit a documentation patch with this information. I'm guessing it involves using the RETURN_VALUE and CAPTURE_CMAKE_ERROR arguments, but the documentation doesn't state this. 
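To make the goal concrete, the shape I have in mind is roughly this (still a sketch: the RETURN_VALUE/CAPTURE_CMAKE_ERROR handling below is my guess from the docs rather than verified behavior, and all CTEST_* setup is again omitted for brevity):

```cmake
# dashboard.cmake -- sketch; assumes CTEST_SOURCE_DIRECTORY,
# CTEST_BINARY_DIRECTORY, and the CDash submission settings are
# configured elsewhere.
ctest_start(Experimental)

ctest_configure(RETURN_VALUE cfg_rv CAPTURE_CMAKE_ERROR cfg_err)
if(NOT cfg_rv EQUAL 0 OR cfg_err EQUAL -1)
  ctest_submit()                                # still report the failure to CDash
  message(FATAL_ERROR "configure step failed")  # exit non-zero for the caller
endif()

ctest_build(RETURN_VALUE bld_rv)
ctest_test(RETURN_VALUE tst_rv)
ctest_submit()

if(NOT bld_rv EQUAL 0 OR NOT tst_rv EQUAL 0)
  message(FATAL_ERROR "build or test step failed")
endif()
```

Whether this is the intended usage of those arguments is exactly what I'd like the documentation to state.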
Kyle From brad.king at kitware.com Tue May 29 09:48:44 2018 From: brad.king at kitware.com (Brad King) Date: Tue, 29 May 2018 09:48:44 -0400 Subject: [cmake-developers] Error handling in dashboard scripts In-Reply-To: References: Message-ID: <434d4fc6-2eba-1f6c-8495-99374f1fecc7@kitware.com> On 05/25/2018 04:27 PM, Kyle Edwards wrote: > ctest_start() > ctest_configure() > ctest_build() > ctest_test() > ctest_submit() > > What happens if ctest_configure() fails (for example, because CMake > failed to find a needed library)? Does the entire script stop right > there? Or does it continue and try to execute ctest_build() anyway? IIRC it just moves on to each command in sequence and does what that command would normally do, letting it fail as it may given whatever state in the build tree is left behind by earlier failures, if any. Unless the failure is so bad that ctest_submit hasn't been told where to submit the results it should always be reached. -Brad From brad.king at kitware.com Tue May 29 09:52:16 2018 From: brad.king at kitware.com (Brad King) Date: Tue, 29 May 2018 09:52:16 -0400 Subject: [cmake-developers] How to handle dependencies of protobuf files ? In-Reply-To: <4713367.Uu4IFVk9sG@linux-l7nd> References: <4713367.Uu4IFVk9sG@linux-l7nd> Message-ID: <4bad3d7c-c953-f6dd-48d7-7dfc6362e4e7@kitware.com> On 05/15/2018 03:45 PM, Alexander Neundorf wrote: > I think to do it properly, there would have to be a dependency scanning for > proto files like there is for C/C++ headers. In order to handle implicit dependencies like that implied by import "MyBase.proto"; then they would somehow need to be extracted from source content. Ideally protoc should be able to write a depfile as a side effect. Otherwise all dependencies should be listed explicitly somewhere. 
-Brad From dmitry.gurulev at intel.com Tue May 29 10:30:10 2018 From: dmitry.gurulev at intel.com (Gurulev, Dmitry) Date: Tue, 29 May 2018 14:30:10 +0000 Subject: [cmake-developers] Proposed new CMake module FindITT.cmake Message-ID: <6A0B8B99D901C14DB6772E649B90563136B51F4F@IRSMSX107.ger.corp.intel.com> Hi All, I'd like to maintain a new find module for the ITT (Instrumentation and Tracing Technology) library. You may find more details about ITT at https://github.com/intel/IntelSEAPI/wiki or at https://software.intel.com/en-us/node/544195. Please let me know if I need to provide more details. The proposed module is here: https://github.com/dmitry-gurulev/cmake/blob/master/FindITT.cmake PS Although IntelSEAPI (which contains ITT sources) is a CMake based library, ITT itself might be used as a standalone binary library, so a find module still makes sense. BTW, I'm going to propose a CMake package configuration file for SEAPI, as it is recommended at https://gitlab.kitware.com/cmake/community/wikis/doc/cmake/dev/Module-Maintainers -------------------------------------------------------------------- Joint Stock Company Intel A/O Registered legal address: Krylatsky Hills Business Park, 17 Krylatskaya Str., Bldg 4, Moscow 121614, Russian Federation This e-mail and any attachments may contain confidential material for the sole use of the intended recipient(s). Any review or distribution by others is strictly prohibited. If you are not the intended recipient, please contact the sender and delete all copies. -------------- next part -------------- An HTML attachment was scrubbed...
URL: From brad.king at kitware.com Tue May 29 11:12:16 2018 From: brad.king at kitware.com (Brad King) Date: Tue, 29 May 2018 11:12:16 -0400 Subject: [cmake-developers] Proposed new CMake module FindITT.cmake In-Reply-To: <6A0B8B99D901C14DB6772E649B90563136B51F4F@IRSMSX107.ger.corp.intel.com> References: <6A0B8B99D901C14DB6772E649B90563136B51F4F@IRSMSX107.ger.corp.intel.com> Message-ID: On 05/29/2018 10:30 AM, Gurulev, Dmitry wrote: > Although IntelSEAPI (which contains ITT sources) is a CMake based > library, ITT itself might be used as standalone binary library, so > finding module still makes sense If the upstream is CMake-friendly it should provide a CMake Package Configuration File with its SDK along with the headers and libraries. See documentation here: https://cmake.org/cmake/help/v3.11/manual/cmake-packages.7.html That will allow `find_package` to work without any find module and without guessing which headers and libraries match. Thanks, -Brad From wouter.klouwen at youview.com Tue May 29 12:36:18 2018 From: wouter.klouwen at youview.com (Wouter Klouwen) Date: Tue, 29 May 2018 17:36:18 +0100 Subject: [cmake-developers] Unit testing CMake modules Message-ID: <82707266-5124-1526-bff9-3fdc79690321@youview.com> Hi all, We have a rather large amount of CMake code (4k+ lines) in various modules/functions. They contain the common logic for many of our projects. This makes it quite an important base for everything our team does. Rather shamefully at present these modules are rather poorly tested. I'd like to improve this, but the current way of testing CMake code is typically to run a trimmed project, and to verify whether certain invocations produce a certain output or file hierarchy. This involves a bit of infrastructure and can be a bit cumbersome to maintain, to diagnose when tests fail, and it requires a separate run of the tests. The overall cumbersomeness of the setup in turn discourages our team, myself included, from adding tests.
I'd like a more integrated approach that makes running at least some basic tests part of the build process, and a more direct way of reporting failures. In other programming environments, testing often involves some kind of mocking environment, and CMake helpfully allows the overriding of CMake built-in functions, though this is typically discouraged. In my ideal world it would be possible to save the state of the current function set up, then call a function with a certain given number of parameters, and expect a certain sequence of events, such as functions to be called, environment variables to be set, etc. Something akin to CMakePushCheckState, except for built-in functions. Then a module could provide some functions that would set up expectations and verify them, within a run of cmake, or possibly some other commands could be added, to give some syntactic glossy coat to it. As it wouldn't actually trigger any of the expensive generating functions, it would be lightweight, quick to run and give pretty direct errors in terms of failed expectations, reducing debug time. If it was done with CMake commands, I might imagine it to look something like: function(foobar) # function to test, does something non trivial if ("FOO" IN_LIST ARGV) install(FILES foo DESTINATION foo_dir) elseif("BAR" IN_LIST ARGV) message(FATAL_ERROR "Some error") elseif("BAZ" IN_LIST ARGV) set(BAZ True PARENT_SCOPE) endif() endfunction() test(foobar) expect(WITH "FOO" CALL install FILES foo DESTINATION foo_dir) expect(WITH "BAR" CALL message FATAL_ERROR "Some error") expect(WITH "BAZ" ENVIRONMENT BAZ True) endtest(foobar) What do people think? Is this crazy? Is there a quicker way to get somewhere close? Should I put some effort into making this into an actual proposal/working code? Thanks in advance, W This transmission contains information that may be confidential and contain personal views which are not necessarily those of YouView TV Ltd.
YouView TV Ltd (Co No:7308805) is a limited liability company registered in England and Wales with its registered address at YouView TV Ltd, 3rd Floor, 10 Lower Thames Street, London, EC3R 6YT. For details see our web site at http://www.youview.com From neundorf at kde.org Tue May 29 16:00:24 2018 From: neundorf at kde.org (Alexander Neundorf) Date: Tue, 29 May 2018 22:00:24 +0200 Subject: [cmake-developers] How to handle dependencies of protobuf files ? In-Reply-To: <4bad3d7c-c953-f6dd-48d7-7dfc6362e4e7@kitware.com> References: <4713367.Uu4IFVk9sG@linux-l7nd> <4bad3d7c-c953-f6dd-48d7-7dfc6362e4e7@kitware.com> Message-ID: <16253534.cqdFakOLjg@linux-l7nd> Hi, On 2018 M05 29, Tue 09:52:16 CEST Brad King wrote: > On 05/15/2018 03:45 PM, Alexander Neundorf wrote: > > I think to do it properly, there would have to be a dependency scanning > > for > > proto files like there is for C/C++ headers. > > In order to handle implicit dependencies like that implied by > > import "MyBase.proto"; > > then they would somehow need to be extracted from source content. > Ideally protoc should be able to write a depfile as a side effect. Parsing them using cmake would more or less work; the include dirs are also known, so technically this would probably work. But the parsing would happen at cmake-time, not at compile-time, but editing a proto-file does not trigger a cmake run... Would that have to be implemented similarly to the C dependency scanning ? > Otherwise all dependencies should be listed explicitly somewhere. so the cheap solution would be to add an argument to PROTOBUF_GENERATE_CPP() to list the proto files these proto files depend on, which would be forwarded to (every) add_custom_command() call inside PROTOBUF_GENERATE_CPP(). Not very elegant, but at least it would make it work correctly. This would require that, if different proto files in a set have different dependencies, a separate PROTOBUF_GENERATE_CPP() call would be needed for each group with different dependencies.
A bit ugly, but better than now. Alex From michael.stuermer at schaeffler.com Wed May 30 00:27:24 2018 From: michael.stuermer at schaeffler.com (Stuermer, Michael SP/HZA-ZSEP) Date: Wed, 30 May 2018 04:27:24 +0000 Subject: [cmake-developers] Unit testing CMake modules In-Reply-To: <82707266-5124-1526-bff9-3fdc79690321@youview.com> References: <82707266-5124-1526-bff9-3fdc79690321@youview.com> Message-ID: Hello Wouter, testing CMake code is indeed very important. This is why Kitware does it as well. Did you check the CMake code testing infrastructure in "Tests/RunCMake" in the sources? It is a very flexible concept which makes adding tests easy enough for everyone to contribute (IMO). At the beginning it might be a bit confusing how expected results etc. are handled, but once you understand how it works it's really nice. Another (maybe the main) advantage: you have tons of examples and the RunCMake infrastructure is maintained so you don't have to do it all on your own. best regards, Michael > -----Original Message----- > From: cmake-developers [mailto:cmake-developers-bounces at cmake.org] On > Behalf Of Wouter Klouwen > Sent: Tuesday, 29 May 2018 18:36 > To: CMake Developers > Subject: [cmake-developers] Unit testing CMake modules > > Hi all, > > We have a rather large amount of CMake code (4k+ lines) in various > modules/functions. They contain the common logic for many of our projects. > This makes it quite an important base for everything our team does. > > Rather shamefully at present these modules are rather poorly tested. I'd like > to improve this, but the current way of testing CMake code is typically to run > a trimmed project, and to verify whether certain invocations produce a > certain output or file hierarchy. > This involves a bit of infrastructure and can be a bit cumbersome maintain, to > diagnose when tests fail, and it requires a separate run of the tests.
> The overall cumbersomeness of the setup in turn discourages in our team, > including myself, from adding tests. > > I'd like a more integrated approach that makes running at least some basic > tests part of the build progress, and a more direct way of reporting failures. > > In other programming environments, testing often involves some kind of > mocking environment, and CMake helpfully allows the overriding of CMake > built in functions, though this is typically discouraged. > > In my ideal world it would be possible to save the state of the current > function set up, then call a function with a certain given number of > parameters, and expect a certain sequence of events, such as functions to > be called, environment variables to be set, etc. > Something akin to CMakePushCheckState, except for built in functions. > > Then a module could provide for some functions that would set up > expectations and verify them, within a run of cmake, or possibly some other > commands could be added, to give some syntactic glossy coat to it. > > As it wouldn't actually trigger any of the expensive generating functions, it > would be lightweight, quick to run and give pretty direct errors in terms of > failed expectations, reducing debug time. > > If it was done with CMake commands, I might imagine it to look something > like: > > function(foobar) > # function to test, does something non trivial > if ("FOO" IN_LIST ARGV) > install(FILES foo DESTINATION foo_dir) > else("BAR" IN_LIST ARGV) > message(FATAL_ERROR "Some error") > else("BAZ" IN_LIST ARGV) > set(BAZ True PARENT_SCOPE) > endif() > endfunction() > > test(foobar) > expect(WITH "FOO" CALL install FILES foo DESTINATION foo_dir) > expect(WITH "BAR" CALL message FATAL_ERROR "Some error") > expect(WITH "BAZ" ENVIRONMENT BAZ True) > endtest(foobar) > > What do people think? Is this crazy? Is there a quicker way to get somewhere > close? Should I put some effort into making this into an actual > proposal/working code? 
> > Thanks in advance, > W > -- > > Powered by www.kitware.com > > Please keep messages on-topic and check the CMake FAQ at: > http://www.cmake.org/Wiki/CMake_FAQ > > Kitware offers various services to support the CMake community. For more > information on each offering, please visit: > > CMake Support: http://cmake.org/cmake/help/support.html > CMake Consulting: http://cmake.org/cmake/help/consulting.html > CMake Training Courses: http://cmake.org/cmake/help/training.html > > Visit other Kitware open-source projects at > http://www.kitware.com/opensource/opensource.html > > Follow this link to subscribe/unsubscribe: > https://cmake.org/mailman/listinfo/cmake-developers From smspillaz at gmail.com Wed May 30 01:26:17 2018 From: smspillaz at gmail.com (Sam Spilsbury) Date: Wed, 30 May 2018 13:26:17 +0800 Subject: [cmake-developers] Unit testing CMake modules In-Reply-To: <82707266-5124-1526-bff9-3fdc79690321@youview.com> References: <82707266-5124-1526-bff9-3fdc79690321@youview.com> Message-ID: A little while ago I wrote a framework to do just that: https://github.com/polysquare/cmake-unit I haven't maintained it in a few years, but it works exactly the way you would expect it to (the syntax is a bit different though). On Wed, May 30, 2018 at 12:36 AM, Wouter Klouwen wrote: > Hi all, > > We have a rather large amount of CMake code (4k+ lines) in various > modules/functions. They contain the common logic for many of our > projects. This makes it quite an important base for everything our team > does.
> > Rather shamefully at present these modules are rather poorly tested. I'd > like to improve this, but the current way of testing CMake code > is typically to run a trimmed project, and to verify whether certain > invocations produce a certain output or file hierarchy. > This involves a bit of infrastructure and can be a bit cumbersome > maintain, to diagnose when tests fail, and it requires a separate run of > the tests. > The overall cumbersomeness of the setup in turn discourages in our team, > including myself, from adding tests. > > I'd like a more integrated approach that makes running at least some > basic tests part of the build progress, and a more direct way of > reporting failures. > > In other programming environments, testing often involves some kind of > mocking environment, and CMake helpfully allows the overriding of CMake > built in functions, though this is typically discouraged. > > In my ideal world it would be possible to save the state of the current > function set up, then call a function with a certain given number of > parameters, and expect a certain sequence of events, such as functions > to be called, environment variables to be set, etc. > Something akin to CMakePushCheckState, except for built in functions. > > Then a module could provide for some functions that would set up > expectations and verify them, within a run of cmake, or possibly some > other commands could be added, to give some syntactic glossy coat to it. > > As it wouldn't actually trigger any of the expensive generating > functions, it would be lightweight, quick to run and give pretty direct > errors in terms of failed expectations, reducing debug time. 
> > If it was done with CMake commands, I might imagine it to look something > like: > > function(foobar) > # function to test, does something non trivial > if ("FOO" IN_LIST ARGV) > install(FILES foo DESTINATION foo_dir) > else("BAR" IN_LIST ARGV) > message(FATAL_ERROR "Some error") > else("BAZ" IN_LIST ARGV) > set(BAZ True PARENT_SCOPE) > endif() > endfunction() > > test(foobar) > expect(WITH "FOO" CALL install FILES foo DESTINATION foo_dir) > expect(WITH "BAR" CALL message FATAL_ERROR "Some error") > expect(WITH "BAZ" ENVIRONMENT BAZ True) > endtest(foobar) > > What do people think? Is this crazy? Is there a quicker way to get > somewhere close? Should I put some effort into making this into an > actual proposal/working code? > > Thanks in advance, > W
-- Sam Spilsbury pgp.mit.edu: 0xb8a90fb599bae9c2 From dmitry.gurulev at intel.com Wed May 30 01:38:24 2018 From: dmitry.gurulev at intel.com (Gurulev, Dmitry) Date: Wed, 30 May 2018 05:38:24 +0000 Subject: [cmake-developers] Proposed new CMake module FindITT.cmake In-Reply-To: References: <6A0B8B99D901C14DB6772E649B90563136B51F4F@IRSMSX107.ger.corp.intel.com> Message-ID: <6A0B8B99D901C14DB6772E649B90563136B53144@IRSMSX107.ger.corp.intel.com> Hi Brad, I'm not a maintainer of SEAPI, so I may only propose a package config for it. And second thing - ITT library itself (not a part of SEAPI) is also used and distributed (ex. w/ Intel's VTune app.) -----Original Message----- From: Brad King [mailto:brad.king at kitware.com] Sent: Tuesday, May 29, 2018 6:12 PM To: Gurulev, Dmitry ; cmake-developers at cmake.org Subject: Re: [cmake-developers] Proposed new CMake module FindITT.cmake On 05/29/2018 10:30 AM, Gurulev, Dmitry wrote: > Although IntelSEAPI (which contains ITT sources) is a CMake based > library, ITT itself might be used as standalone binary library, so > finding module still makes sense If the upstream is CMake-friendly it should provide a CMake Package Configuration File with its SDK along with the headers and libraries. See documentation here: https://cmake.org/cmake/help/v3.11/manual/cmake-packages.7.html That will allow `find_package` to work without any find module and without guessing which headers and libraries match.
Thanks, -Brad From sergey.nikulov at gmail.com Wed May 30 01:52:10 2018 From: sergey.nikulov at gmail.com (Sergei Nikulov) Date: Wed, 30 May 2018 08:52:10 +0300 Subject: [cmake-developers] Proposed new CMake module FindITT.cmake In-Reply-To: <6A0B8B99D901C14DB6772E649B90563136B53144@IRSMSX107.ger.corp.intel.com> References: <6A0B8B99D901C14DB6772E649B90563136B51F4F@IRSMSX107.ger.corp.intel.com> <6A0B8B99D901C14DB6772E649B90563136B53144@IRSMSX107.ger.corp.intel.com> Message-ID: On Wed, 30 May 2018 at 8:38, Gurulev, Dmitry wrote: > Hi Brad, > I'm not a maintainer of SEAPI, so I may only propose a package config for it. And second thing - ITT library itself (not a part of SEAPI) is also used and distributed (ex. w/ Intel's VTune app.) You can propose a PR with a CMake package configuration file for the SEAPI project on GitHub. If they find it reasonable, they will merge it. -- Best Regards, Sergei Nikulov From dmitry.gurulev at intel.com Wed May 30 01:56:07 2018 From: dmitry.gurulev at intel.com (Gurulev, Dmitry) Date: Wed, 30 May 2018 05:56:07 +0000 Subject: [cmake-developers] Proposed new CMake module FindITT.cmake In-Reply-To: References: <6A0B8B99D901C14DB6772E649B90563136B51F4F@IRSMSX107.ger.corp.intel.com> <6A0B8B99D901C14DB6772E649B90563136B53144@IRSMSX107.ger.corp.intel.com> Message-ID: <6A0B8B99D901C14DB6772E649B90563136B531C6@IRSMSX107.ger.corp.intel.com> Yes, of course, I can, and I'm going to do that, as I've already written. My question is - what about stand-alone ITT?
-----Original Message----- From: Sergei Nikulov [mailto:sergey.nikulov at gmail.com] Sent: Wednesday, May 30, 2018 8:52 AM To: Gurulev, Dmitry Cc: brad.king at kitware.com; cmake-developers at cmake.org Subject: Re: [cmake-developers] Proposed new CMake module FindITT.cmake On Wed, 30 May 2018 at 8:38, Gurulev, Dmitry wrote: > Hi Brad, > I'm not a maintainer of SEAPI, so I may only propose a package config > for it. And second thing - ITT library itself (not a part of SEAPI) is also used and distributed (ex. w/ Intel's VTune app.) You can propose PR with CMake Package Configuration for SEAPI project on GitHub. If they found it reasonable, they will merge it. -- Best Regards, Sergei Nikulov From brad.king at kitware.com Wed May 30 09:10:33 2018 From: brad.king at kitware.com (Brad King) Date: Wed, 30 May 2018 09:10:33 -0400 Subject: [cmake-developers] Proposed new CMake module FindITT.cmake In-Reply-To: <6A0B8B99D901C14DB6772E649B90563136B531C6@IRSMSX107.ger.corp.intel.com> References: <6A0B8B99D901C14DB6772E649B90563136B51F4F@IRSMSX107.ger.corp.intel.com> <6A0B8B99D901C14DB6772E649B90563136B53144@IRSMSX107.ger.corp.intel.com> <6A0B8B99D901C14DB6772E649B90563136B531C6@IRSMSX107.ger.corp.intel.com> Message-ID: <9c557db0-b21b-5323-a97b-a558a79e8aca@kitware.com> On 05/30/2018 01:56 AM, Gurulev, Dmitry wrote: > what about stand-alone ITT? That project's upstream should be able to provide a package configuration file as well.
-Brad From brad.king at kitware.com Wed May 30 09:12:05 2018 From: brad.king at kitware.com (Brad King) Date: Wed, 30 May 2018 09:12:05 -0400 Subject: [cmake-developers] How to handle dependencies of protobuf files ? In-Reply-To: <16253534.cqdFakOLjg@linux-l7nd> References: <4713367.Uu4IFVk9sG@linux-l7nd> <4bad3d7c-c953-f6dd-48d7-7dfc6362e4e7@kitware.com> <16253534.cqdFakOLjg@linux-l7nd> Message-ID: <7609a51e-6012-cd97-3329-7c713a66e44f@kitware.com> On 05/29/2018 04:00 PM, Alexander Neundorf wrote: >> In order to handle implicit dependencies like that implied by >> >> import "MyBase.proto"; >> >> then they would somehow need to be extracted from source content. > > Would that have to be implemented similar to the C dependency scanning ? Yes, but even better would be if we can ask protoc to print the dependencies out for us so we don't have to parse the sources ourselves. > so the cheap solution would be to add an argument to PROTOBUF_GENERATE_CPP() > to list proto-files these proto-files depend on Yes, that would be a good intermediate solution. -Brad From eric.noulard at gmail.com Wed May 30 09:54:33 2018 From: eric.noulard at gmail.com (Eric Noulard) Date: Wed, 30 May 2018 15:54:33 +0200 Subject: [cmake-developers] How to handle dependencies of protobuf files ? In-Reply-To: <7609a51e-6012-cd97-3329-7c713a66e44f@kitware.com> References: <4713367.Uu4IFVk9sG@linux-l7nd> <4bad3d7c-c953-f6dd-48d7-7dfc6362e4e7@kitware.com> <16253534.cqdFakOLjg@linux-l7nd> <7609a51e-6012-cd97-3329-7c713a66e44f@kitware.com> Message-ID: Le mer. 30 mai 2018 ? 15:12, Brad King a ?crit : > On 05/29/2018 04:00 PM, Alexander Neundorf wrote: > >> In order to handle implicit dependencies like that implied by > >> > >> import "MyBase.proto"; > >> > >> then they would somehow need to be extracted from source content. > > > > Would that have to be implemented similar to the C dependency scanning ? 
> > Yes, but even better would be if we can ask protoc to print the > dependencies > out for us so we don't have to parse the sources ourselves. > protoc can already do something like that but it spits out a makefile-includable file. see --dependency_out=FILE option. Write a dependency output file in the format expected by make. This writes the transitive set of input file paths to FILE moreover the generated makefile depends on the language generator used (--cpp_out, --java_out, --python_out, ...) because dependencies are expressed between proto and generated source files. Maybe it would be possible to write a protoc "plugin" https://www.expobrain.net/2015/09/13/create-a-plugin-for-google-protocol-buffer/ which would spit out an easy-to-digest dependency spec for CMake. Unfortunately I'm not volunteering :-( just giving an idea. > > so the cheap solution would be to add an argument to > PROTOBUF_GENERATE_CPP() > > to list proto-files these proto-files depend on > > Yes, that would be a good intermediate solution. > > -Brad > -- > > Powered by www.kitware.com > > Please keep messages on-topic and check the CMake FAQ at: > http://www.cmake.org/Wiki/CMake_FAQ > > Kitware offers various services to support the CMake community. For more > information on each offering, please visit: > > CMake Support: http://cmake.org/cmake/help/support.html > CMake Consulting: http://cmake.org/cmake/help/consulting.html > CMake Training Courses: http://cmake.org/cmake/help/training.html > > Visit other Kitware open-source projects at > http://www.kitware.com/opensource/opensource.html > > Follow this link to subscribe/unsubscribe: > https://cmake.org/mailman/listinfo/cmake-developers > -- Eric -------------- next part -------------- An HTML attachment was scrubbed... URL: From brad.king at kitware.com Wed May 30 10:04:03 2018 From: brad.king at kitware.com (Brad King) Date: Wed, 30 May 2018 10:04:03 -0400 Subject: [cmake-developers] How to handle dependencies of protobuf files ?
In-Reply-To: References: <4713367.Uu4IFVk9sG@linux-l7nd> <4bad3d7c-c953-f6dd-48d7-7dfc6362e4e7@kitware.com> <16253534.cqdFakOLjg@linux-l7nd> <7609a51e-6012-cd97-3329-7c713a66e44f@kitware.com> Message-ID: <6dd9dba3-5183-5a04-2a5a-44d06df6f2e0@kitware.com> On 05/30/2018 09:54 AM, Eric Noulard wrote: > protoc can already do something like that but it spits out a makefile includable file. > > see?--dependency_out=FILE option. That may work well for the Ninja generator at least. See the add_custom_command `DEPFILE` option. One day it would be nice to teach the Makefile generator to use depfile-style dependencies (on compilers that support it) instead of doing its own scanning. -Brad From dmitry.gurulev at intel.com Wed May 30 10:05:15 2018 From: dmitry.gurulev at intel.com (Gurulev, Dmitry) Date: Wed, 30 May 2018 14:05:15 +0000 Subject: [cmake-developers] Proposed new CMake module FindITT.cmake In-Reply-To: <9c557db0-b21b-5323-a97b-a558a79e8aca@kitware.com> References: <6A0B8B99D901C14DB6772E649B90563136B51F4F@IRSMSX107.ger.corp.intel.com> <6A0B8B99D901C14DB6772E649B90563136B53144@IRSMSX107.ger.corp.intel.com> <6A0B8B99D901C14DB6772E649B90563136B531C6@IRSMSX107.ger.corp.intel.com>, <9c557db0-b21b-5323-a97b-a558a79e8aca@kitware.com> Message-ID: <6A0B8B99D901C14DB6772E649B90563136B55542@IRSMSX107.ger.corp.intel.com> >That project's upstream should be able to provide a package configuration file as well. It might be a binary package, no way to provide package config. ________________________________________ From: Brad King [brad.king at kitware.com] Sent: Wednesday, May 30, 2018 4:10 PM To: Gurulev, Dmitry; Sergei Nikulov Cc: cmake-developers at cmake.org Subject: Re: [cmake-developers] Proposed new CMake module FindITT.cmake On 05/30/2018 01:56 AM, Gurulev, Dmitry wrote: > what about stand-alone ITT? That project's upstream should be able to provide a package configuration file as well. 
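The combination Brad and Eric describe -- protoc's --dependency_out file fed to add_custom_command via DEPFILE -- could be sketched like this for the Ninja generator. The file and target names are hypothetical; only Ninja honored DEPFILE in CMake of this era.

```cmake
# Hedged sketch: regenerate C++ from a .proto and let Ninja track the
# transitive .proto imports through the depfile protoc writes.
# "my.proto" and the paths are illustrative, not from the thread.
set(proto "${CMAKE_CURRENT_SOURCE_DIR}/my.proto")

add_custom_command(
  OUTPUT "${CMAKE_CURRENT_BINARY_DIR}/my.pb.cc"
         "${CMAKE_CURRENT_BINARY_DIR}/my.pb.h"
  COMMAND protoc
          --cpp_out=${CMAKE_CURRENT_BINARY_DIR}
          --dependency_out=${CMAKE_CURRENT_BINARY_DIR}/my.pb.d
          -I ${CMAKE_CURRENT_SOURCE_DIR}
          ${proto}
  # Ninja re-reads this make-style depfile on each build, so edits to
  # imported .proto files trigger regeneration without CMake re-scanning.
  DEPFILE "${CMAKE_CURRENT_BINARY_DIR}/my.pb.d"
  DEPENDS ${proto}
  COMMENT "Generating C++ from my.proto"
  VERBATIM)
```

Without DEPFILE support (e.g. the Makefile generators at the time), the intermediate solution from the thread -- an explicit DEPENDS list of imported .proto files -- remains the fallback.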
-Brad From brad.king at kitware.com Wed May 30 10:12:04 2018 From: brad.king at kitware.com (Brad King) Date: Wed, 30 May 2018 10:12:04 -0400 Subject: [cmake-developers] Proposed new CMake module FindITT.cmake In-Reply-To: <6A0B8B99D901C14DB6772E649B90563136B55542@IRSMSX107.ger.corp.intel.com> References: <6A0B8B99D901C14DB6772E649B90563136B51F4F@IRSMSX107.ger.corp.intel.com> <6A0B8B99D901C14DB6772E649B90563136B53144@IRSMSX107.ger.corp.intel.com> <6A0B8B99D901C14DB6772E649B90563136B531C6@IRSMSX107.ger.corp.intel.com> <9c557db0-b21b-5323-a97b-a558a79e8aca@kitware.com> <6A0B8B99D901C14DB6772E649B90563136B55542@IRSMSX107.ger.corp.intel.com> Message-ID: On 05/30/2018 10:05 AM, Gurulev, Dmitry wrote: > It might be a binary package, no way to provide package config. I don't understand. Does it provide header files? Does it provide a "linktome.lib" file?
-Brad From dmitry.gurulev at intel.com Wed May 30 10:18:54 2018 From: dmitry.gurulev at intel.com (Gurulev, Dmitry) Date: Wed, 30 May 2018 14:18:54 +0000 Subject: [cmake-developers] Proposed new CMake module FindITT.cmake In-Reply-To: References: <6A0B8B99D901C14DB6772E649B90563136B51F4F@IRSMSX107.ger.corp.intel.com> <6A0B8B99D901C14DB6772E649B90563136B53144@IRSMSX107.ger.corp.intel.com> <6A0B8B99D901C14DB6772E649B90563136B531C6@IRSMSX107.ger.corp.intel.com> <9c557db0-b21b-5323-a97b-a558a79e8aca@kitware.com> <6A0B8B99D901C14DB6772E649B90563136B55542@IRSMSX107.ger.corp.intel.com>, Message-ID: <6A0B8B99D901C14DB6772E649B90563136B55573@IRSMSX107.ger.corp.intel.com> For example, ITT comes w/ the VTune app. in binary form (/opt/intel/vtune_amplifier_2018.2.0.551022/lib64/libittnotify.a). But VTune is not a CMake-based distribution, so there is no way to add a CMake package config to it. ________________________________________ From: Brad King [brad.king at kitware.com] Sent: Wednesday, May 30, 2018 5:12 PM To: Gurulev, Dmitry; Sergei Nikulov Cc: cmake-developers at cmake.org Subject: Re: [cmake-developers] Proposed new CMake module FindITT.cmake On 05/30/2018 10:05 AM, Gurulev, Dmitry wrote: > It might be a binary package, no way to provide package config. I don't understand. Does it provide header files? Does it provide a "linktome.lib" file? -Brad
From brad.king at kitware.com Wed May 30 10:25:22 2018 From: brad.king at kitware.com (Brad King) Date: Wed, 30 May 2018 10:25:22 -0400 Subject: [cmake-developers] Proposed new CMake module FindITT.cmake In-Reply-To: <6A0B8B99D901C14DB6772E649B90563136B55573@IRSMSX107.ger.corp.intel.com> References: <6A0B8B99D901C14DB6772E649B90563136B51F4F@IRSMSX107.ger.corp.intel.com> <6A0B8B99D901C14DB6772E649B90563136B53144@IRSMSX107.ger.corp.intel.com> <6A0B8B99D901C14DB6772E649B90563136B531C6@IRSMSX107.ger.corp.intel.com> <9c557db0-b21b-5323-a97b-a558a79e8aca@kitware.com> <6A0B8B99D901C14DB6772E649B90563136B55542@IRSMSX107.ger.corp.intel.com> <6A0B8B99D901C14DB6772E649B90563136B55573@IRSMSX107.ger.corp.intel.com> Message-ID: On 05/30/2018 10:18 AM, Gurulev, Dmitry wrote: > ITT comes w/ VTune app. in binary form > (/opt/intel/vtune_amplifier_2018.2.0.551022/lib64/libittnotify.a) Is there a header file too? > VTune is not CMake based distributive, no way to add cmake > package config to it. It's possible to provide a package configuration file for projects that don't build with CMake. Qt5 does it. IIRC the Intel TBB developers were looking at doing it too. Upstream could manually code an ITTConfig.cmake file with the proper imported targets. 
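Brad's suggestion of a hand-coded config file could be sketched roughly as below. This is an illustrative assumption, not a file from the thread: the relative prefix layout (config file installed under lib64/cmake/ITT/) and the ITT::ittnotify target name are hypothetical.

```cmake
# Hypothetical hand-written ITTConfig.cmake, shipped alongside prebuilt
# binaries by a project that does not itself build with CMake.
if(NOT TARGET ITT::ittnotify)
  # Assume this file lives at <prefix>/lib64/cmake/ITT/ITTConfig.cmake,
  # so the install prefix is two directories up (an assumed layout).
  get_filename_component(_itt_prefix "${CMAKE_CURRENT_LIST_DIR}/../../.." ABSOLUTE)

  add_library(ITT::ittnotify STATIC IMPORTED)
  set_target_properties(ITT::ittnotify PROPERTIES
    IMPORTED_LOCATION "${_itt_prefix}/lib64/libittnotify.a"
    INTERFACE_INCLUDE_DIRECTORIES "${_itt_prefix}/include")

  unset(_itt_prefix)
endif()
```

With such a file on CMAKE_PREFIX_PATH, find_package(ITT CONFIG) gives consumers the same imported-target experience as a CMake-built package.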
-Brad From dmitry.gurulev at intel.com Wed May 30 10:37:41 2018 From: dmitry.gurulev at intel.com (Gurulev, Dmitry) Date: Wed, 30 May 2018 14:37:41 +0000 Subject: [cmake-developers] Proposed new CMake module FindITT.cmake In-Reply-To: References: <6A0B8B99D901C14DB6772E649B90563136B51F4F@IRSMSX107.ger.corp.intel.com> <6A0B8B99D901C14DB6772E649B90563136B53144@IRSMSX107.ger.corp.intel.com> <6A0B8B99D901C14DB6772E649B90563136B531C6@IRSMSX107.ger.corp.intel.com> <9c557db0-b21b-5323-a97b-a558a79e8aca@kitware.com> <6A0B8B99D901C14DB6772E649B90563136B55542@IRSMSX107.ger.corp.intel.com> <6A0B8B99D901C14DB6772E649B90563136B55573@IRSMSX107.ger.corp.intel.com>, Message-ID: <6A0B8B99D901C14DB6772E649B90563136B55599@IRSMSX107.ger.corp.intel.com> >Is there a header file too? Yes, there is - /opt/intel/vtune_amplifier_2018.2.0.551022/include/ittnotify.h >It's possible to provide a package configuration file for projects >that don't build with CMake. Qt5 does it. IIRC the Intel TBB >developers were looking at doing it too. Thanks for the pointer; I can contact the VTune team and suggest that they look at doing that too. BTW, for now VTune doesn't provide it, and I doubt it will in the short term. ________________________________________ From: Brad King [brad.king at kitware.com] Sent: Wednesday, May 30, 2018 5:25 PM To: Gurulev, Dmitry; Sergei Nikulov Cc: cmake-developers at cmake.org Subject: Re: [cmake-developers] Proposed new CMake module FindITT.cmake On 05/30/2018 10:18 AM, Gurulev, Dmitry wrote: > ITT comes w/ VTune app. in binary form > (/opt/intel/vtune_amplifier_2018.2.0.551022/lib64/libittnotify.a) Is there a header file too? > VTune is not CMake based distributive, no way to add cmake > package config to it. It's possible to provide a package configuration file for projects that don't build with CMake. Qt5 does it. IIRC the Intel TBB developers were looking at doing it too. Upstream could manually code an ITTConfig.cmake file with the proper imported targets.
-Brad From shawn.waldon at kitware.com Wed May 30 10:59:19 2018 From: shawn.waldon at kitware.com (Shawn Waldon) Date: Wed, 30 May 2018 10:59:19 -0400 Subject: [cmake-developers] Error handling in dashboard scripts In-Reply-To: References: Message-ID: > The documentation for ctest_configure(), ctest_build(), and ctest_submit() > isn't completely clear on what happens if one of these steps fails. Let's > say I have the following dashboard script (this is pseudocode, arguments > have been deliberately omitted for brevity, this example won't work): > > ctest_start() > ctest_configure() > ctest_build() > ctest_test() > ctest_submit() > > What happens if ctest_configure() fails (for example, because CMake failed > to find a needed library)? Does the entire script stop right there? Or does > it continue and try to execute ctest_build() anyway? Does ctest_build() > silently do nothing because the configure step failed? Looking through the > documentation for ctest_configure() and ctest_build(), I see some > information on the RETURN_VALUE and CAPTURE_CMAKE_ERROR arguments, but it's > not clear what happens if these aren't used. > > If someone could clarify for me what's supposed to happen here, and what > the recommended best practices are for making sure that ctest_submit() > still gets called in the event of a failure, I will gladly submit a > documentation patch with this information.
I'm guessing it involves using > the RETURN_VALUE and CAPTURE_CMAKE_ERROR arguments, but the documentation > doesn't state this. > In our VTK/ParaView dashboards we ensure the submit happens by doing partial submissions after each stage (this also gives incremental results on CDash):

set(success TRUE)
ctest_configure(RETURN_VALUE configure_result)
ctest_submit(PARTS Configure) # Tell submit to submit a partial result with only the configure step
if (configure_result)
  message("Configure failed")
  set(success FALSE) # you could exit here
endif()
if (success)
  ... # run next stage here

HTH, Shawn -------------- next part -------------- An HTML attachment was scrubbed... URL: From robert.maynard at kitware.com Thu May 31 16:11:49 2018 From: robert.maynard at kitware.com (Robert Maynard) Date: Thu, 31 May 2018 16:11:49 -0400 Subject: [cmake-developers] [ANNOUNCE] CMake 3.11.3 available for download Message-ID: We are pleased to announce that CMake 3.11.3 is now available for download. Please use the latest release from our download page: https://cmake.org/download/ Thanks for your support! ------------------------------------------------------------------------- Changes in 3.11.3 since 3.11.2:

Brad King (3):
  cmSystemTools: Revert GetRealPath implementation on Windows
  CPack: Fix cross-compilation of WiX generator
  CMake 3.11.3

Sander Vrijders (1):
  TestDriver: Replace strncpy with strcpy