To make, or not to make – QMake and beyond

Published Monday October 12th, 2009
Posted in Build system

QMake is one of those crucial tools which just has to work, or it totally ruins your day; that is, if your project is using qmake, of course.

Before QMake came around, there was TMake. TMake was a Perl script which did a good job for what it was designed for at the time. The inner workings of QMake were based upon TMake, and a gazillion features and hacks later, QMake has ended up as a very-hard-to-maintain-without-breaking-anything-esoteric beast. The question is, what are we going to do about it?

There have been plenty of internal discussions going on for years about whether we should scrap QMake and write a new one, go for one of the existing build systems, or spend extra resources trying to fix up qmake (without breaking compatibility). A few projects began, but never finished. One project (QBuild) was even released with Qtopia, but was never completed or developed further.
Given the non-uniform set of developers at Qt, the discussions have been fierce, and opinions as plentiful as the number of developers. There’s really no tool out there which satisfies our complete wish list, and we have looked into many of them. Either the language is broken, it’s lacking features, it’s too hard to mold to do trivial things, the language is too verbose for trivial projects (XML for example), the language is not limiting enough (Python with all its libs, for example), too slow, doesn’t parallelize well, doesn’t see all dependencies or takes forever to process them, it’s trigger (fork) happy, and the list goes on and on.

So, in order to make the decision even harder, we seek your opinion on the matter. 🙂

I’ll present some considerations for a potential new build system, and how we might tackle them.

1) “Proprietary” language
As we have already experienced with QMake, maintaining a language of our own is sub-optimal. You actively have to work on both developing the language to support new features, and on the internals/usage of the language. You might end up ‘designing yourself’ into bad constructs which hinder new development, and don’t make as much sense as perhaps other possible language constructs. It can make the language hard to learn for new users. One example of this in QMake can be seen in the file

    QTDIR/mkspecs/features/exclusive_builds.prf

Another problem with a proprietary language is that it creates an additional hurdle for new users to overcome before being able to use the system, as they cannot draw on any previous language skills.

It would therefore be great if the language of this new build system were a language which most people can relate to, for example JavaScript, which is getting more and more widespread (http://www.langpop.com/). By using a language which is already defined, we could focus entirely on how to use the language most efficiently and cleanly, rather than also creating a language.
We should probably also steer away from 3rd party scripting languages, as they usually come with a plethora of utility libraries, making it very hard to enforce a structure of the build system, allowing for too much customization. People going that route should probably rather use a custom script for their builds (and any build system could run that, if need be).

2) IDE integration
The build system needs to integrate well into IDEs.
We have created our own IDE, and that IDE needs to interact properly with the build process of Qt, and Qt-based projects. This means that the IDE must be able to show all the files for all different platforms at the same time, and also know which ones are built on the host and target (mixed cross-platform build) platforms, to ensure that Intellisense/code completion works as expected. It also must be able to alter the build process, like adding/modifying/removing compiler/linker options and files, which is then reflected in the build system project files.

This is not as easy as simply creating a separate file which specifies the sources needed for a build, since we need to list platform specific files, and also change settings depending on platform.

The problem then is of course conditions in the language, as these are evaluated at parse time, and thus normally not known by the IDE which needs to parse the project language.

One possible solution we’ve found to this issue would be to ensure that the settings are visible after the parsing, and easy to evaluate. In JavaScript we could define targets like this:

    var corelib = new Object()
    corelib.target = "QtCore"
    corelib.defines                 = ["QT_BUILD_CORE_LIB",
                                       "QT_NO_USING_NAMESPACE"]
    corelib.defines["mac && bundle"]= ["QT_NO_DEBUG_PLUGIN_CHECK"];
    corelib.sources                 = ["Foo.cpp", "Bar.cpp"]
    corelib.sources["windows"]      = ["Foo_win.cpp", "Bar_win.cpp"]
    corelib.sources["unix && !mac"] = ["Foo_x11.cpp", "Bar_x11.cpp"]

or, even more optimal for an IDE, as JSON:

    var corelib = {
        "target" : "QtCore",
        "defines" : { "all"           : ["QT_BUILD_CORE_LIB",
                                         "QT_NO_USING_NAMESPACE"],
                      "mac && bundle" : ["QT_NO_DEBUG_PLUGIN_CHECK"] },
        "sources" : { "all"           : ["Foo.cpp", "Bar.cpp"],
                      "windows"       : ["Foo_win.cpp", "Bar_win.cpp"],
                      "unix && !mac"  : ["Foo_x11.cpp", "Bar_x11.cpp"] }
        }

(Note that the “all” entry would correspond to the plain array in the non-JSON code above.)
So, the IDE would be able to see all the possible sources, and know which configuration they belong to. Then the backends would evaluate the final sources based on the current configuration, when adding the target “corelib” to the DAG (Directed Acyclic Graph), or project dependency tree, if you will.
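
To make this concrete, here is a rough sketch of how a backend might flatten such a description for one configuration before adding it to the DAG. This is an illustration only, not a design: evaluateTarget and the flags object are made up for the example.

    // Illustrative only: flatten a conditional target description for one configuration.
    // "flags" maps condition tokens (mac, windows, unix, bundle, ...) to booleans.
    function evaluateTarget(target, flags) {
        function holds(condition) {
            if (condition === "all")
                return true;
            // Turn "unix && !mac" into a boolean expression over the flags.
            var expr = condition.replace(/\b\w+\b/g, function (name) {
                return flags[name] ? "true" : "false";
            });
            return eval(expr); // good enough for a sketch; a real tool would parse properly
        }
        var result = {};
        for (var key in target) {
            var value = target[key];
            if (typeof value === "string") { result[key] = value; continue; }
            result[key] = [];
            for (var condition in value)
                if (holds(condition))
                    result[key] = result[key].concat(value[condition]);
        }
        return result;
    }

    // For the corelib example above, a plain X11 configuration yields
    // sources ["Foo.cpp", "Bar.cpp", "Foo_x11.cpp", "Bar_x11.cpp"].
    var x11 = { unix: true, mac: false, windows: false, bundle: false };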

3) Build directly from tool
Since the tool itself has the whole DAG tree, it should be able to build all the targets directly, without going through another build system. CMake, for example, relies on the Makefile generator, and creates Makefiles which in turn call back into CMake to do the build. This solution is not only fork-heavy, but also limiting, since you can only parallelize as well as the backend generator allows you to. So, unless the output Makefile is one single huge Makefile, you’ll run into limitations. SCons will build directly, but is too slow at parsing dependencies, so each build you do takes forever to start. Waf is better in this respect, but both lack a proper set of backends for project generation (vcproj, Xcode, Makefiles, etc.). They are also based on Python, which adds a dependency on a 3rd party library, which we want to avoid due to both the multitude of platforms Qt supports, and because of all the utility libraries.
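
To illustrate the point about having the whole DAG in one process, here is a minimal, purely hypothetical sketch of ordering targets for a direct build; a real tool would of course also schedule independent targets in parallel.

    // Illustrative only: order targets so every dependency is built before its dependents.
    // "deps" maps a target name to the names of the targets it depends on.
    function buildOrder(deps) {
        var order = [], visiting = {}, done = {};
        function visit(target) {
            if (done[target]) return;
            if (visiting[target]) throw new Error("dependency cycle at " + target);
            visiting[target] = true;
            (deps[target] || []).forEach(visit);
            done[target] = true;
            order.push(target);
        }
        Object.keys(deps).forEach(visit);
        return order;
    }

    // Example: buildOrder({ app: ["gui"], gui: ["corelib"], corelib: [] })
    // returns ["corelib", "gui", "app"].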

4) Integration towards distributed build systems
We internally use several distributed build systems, depending on which platform we’re on (Teambuilder on Linux, Distributed Network Builds on Mac and IncrediBuild on Windows), and they all have the same limitation when we use Makefiles: they fork off more processes than is necessary/useful on the system.

Say you have a build farm with 20 open slots. You fire off ‘make -j20’ in good faith to fill the farm. However, at the same time someone else does the same, and you both get only 10 slots each. You’ll now have 10 ‘sleeping’ processes, all of which are just waiting for their slot in the farm, when those resources would be better spent elsewhere.

The building backend should be able to interface with the various tools to see how many resources are available at any given time. That way it can scale the number of parallel jobs up or down to the optimum. Also, the distributed build system probably knows better, as it might NOT be optimal to run 1 local compile and 20 remote ones while your machine is linking on the other core; maybe 0 local and 10 remote is better for *your* machine, even if there are 20 open slots in the farm?

The build tool shouldn’t have to know this, and the developer shouldn’t have to think about how parallel he wants to be. This is why such an interface is important, so that it’s possible for the distributed systems to add these algorithms to the build tool.
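
As a hedged sketch of what such an interface could provide (queryFreeSlots is a stand-in for whatever API a given distributed build system exposes, not a real one):

    // Illustrative only: let the distributed build system decide how parallel to be.
    // queryFreeSlots() would be supplied by the Teambuilder/IncrediBuild/etc. plugin.
    function jobBudget(queryFreeSlots, localCores, localBusy) {
        var remote = queryFreeSlots();                   // open slots in the farm right now
        var local = Math.max(0, localCores - localBusy); // cores not busy linking, etc.
        return { local: local, remote: remote, total: local + remote };
    }

    // Re-evaluated between jobs, so a farm that fills up simply shrinks the number of
    // jobs in flight instead of leaving processes sleeping in the queue.
    var budget = jobBudget(function () { return 10; }, 4, 2); // { local: 2, remote: 10, total: 12 }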

5) Cross-compiling
In large mixed cross-compiled projects (where you compile projects for both host and target systems at the same time), most of the sub-projects normally compile for the target system, while only a few (supporting) projects are intended for the host. This means that projects should easily and automatically pick up the target platform, while a bit more attention should be required for host platform projects only.

The project shouldn’t have to worry too much about which platform, nor how to manipulate the project file to achieve the result. Ideally they would only have to mark that a project is intended for the host platform, like for example:

    var moc = {
        "target" : "moc",
        "platform": "Host",
        "defines" : { "all"           : ["QT_MOC"]},
        ...
        }

As the configure process has already figured out which parameters are needed for the host and target platform, the build system would know which defaults to apply to each project type in the cross-compiling setup; so the project developer need not do anything else to the project, unless he/she needs to override the cross-compiling defaults.
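
Purely as an illustration (the toolchain names and defines below are invented, not what configure actually records), picking the right defaults could then be as simple as a lookup keyed on that one "platform" entry:

    // Illustrative only: configure has produced one parameter set per platform.
    var toolchains = {
        Host:   { compiler: "g++",                   defines: [] },
        Target: { compiler: "arm-linux-gnueabi-g++", defines: ["QT_EMBEDDED"] }
    };

    function toolchainFor(project) {
        return toolchains[project.platform || "Target"]; // target is the default
    }

    // toolchainFor(moc) picks the host compiler; corelib silently gets the cross toolchain.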

6) pkg-config support
The build system needs to both be able to use pkg-config information, and to output the pkg-config files for a built project. Individual projects should not need to maintain a separate .pc file with replacement variables.
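
For illustration, a .pc file could be generated straight from the same target description that drives the build; toPkgConfig, and the prefix/version/description fields it assumes, are hypothetical.

    // Illustrative only: emit pkg-config output from the JSON form of the target above.
    function toPkgConfig(target, prefix, version) {
        var cflags = (target.defines.all || []).map(function (d) { return "-D" + d; });
        return [
            "prefix=" + prefix,
            "libdir=${prefix}/lib",
            "includedir=${prefix}/include",
            "",
            "Name: " + target.target,
            "Description: " + (target.description || target.target),
            "Version: " + version,
            "Cflags: -I${includedir} " + cflags.join(" "),
            "Libs: -L${libdir} -l" + target.target
        ].join("\n");
    }

    // toPkgConfig(corelib, "/usr/local", "4.6.0") yields a QtCore.pc-style file.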

7) Deployment
The build system should be able to create deployment files for the various popular platforms, like .msi, .deb, .rpm, .dmg. Of course it could do this by creating scripts which are run with common platform specific tools to create these deployment files.

8 ) Configuration
The tool should handle the configuration of projects. That way we only need to maintain one set of configuration rules on all platforms, and extending with new options would be uniform. Ideally the system should also automatically generate the -h/--help documentation, based on the configuration script, so we don’t need to maintain a separate doc file together with the configuration script.
This is the same reason we keep as much documentation as possible together with the Qt source code: to make the maintenance of the documentation as easy and coupled as possible.
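
As a sketch of the idea (the option names below are made up): options would be declared once, together with their documentation, and the --help output would fall out of the same data instead of being maintained by hand.

    // Illustrative only: configuration options declared with their own documentation.
    var options = [
        { name: "prefix",    value: "/usr/local", help: "installation prefix" },
        { name: "debug",     value: false,        help: "build with debug symbols" },
        { name: "no-webkit", value: false,        help: "do not build the WebKit module" }
    ];

    function helpText(options) {
        return options.map(function (o) {
            return "  --" + o.name + "\t" + o.help + " (default: " + o.value + ")";
        }).join("\n");
    }

    // "configure --help" would then simply print helpText(options).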

So, these are some of the things we’ve been thinking about; however, there’s a lot more to a build system than just these 8 items. Also, it’s worth mentioning that nothing is set in stone yet. We’re at the thinking stage. No matter what we decide, QMake will still be around for a long time.

So, what’s your opinion?



60 comments

nathan says:

I think the idea of using JavaScript has many advantages, and I’ll list here the ones I can think of.
1. If you have installed Qt, you already have an interpreter for this language. (I.e., the exact opposite of the Python libs problem.)
2. JavaScript is then the QtScript language and the Qt build language, maximizing the value of having to learn the language in the first place.
3. Constant efforts worldwide are going into making interpreters for this language faster. Qt Software is continuing to integrate such efforts into QtScript.
4. Insert here all the advantages of the well-established JSON standard you mentioned above.

On a different note, I ask the following question naïvely, regarding computing dependencies from source code that may have conditions in it that need evaluating: Can’t the build system have a second mode besides “build mode,” something like “just show all the files you would have built?” E.g., ./thebuildtool --justshowfiles. The IDE runs it in build mode for building and in --justshowfiles mode when populating the widgets of the GUI. No need for the GUI to try to understand the conditions in the script language; just let the script interpreter do it perfectly.
The build tool itself which directs the various steps of the build (compiler, linker, other external tools) is a tiny percentage of the build process’s time, and so this second mode (which does no compiling/linking/external tool running, but just outputs files that would have been given to those tools) should be very speedy.
This question probably ignores a lot of the complexities your post raises, but I ask it anyway, to help inform the discussion (or at least me).
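
A minimal sketch of the two modes described here, assuming the project model has already been evaluated for the current configuration (compile is a placeholder, and the flag name is just the one suggested above):

    // Illustrative only: the same evaluated project model either drives the build
    // or is simply printed for the IDE to consume.
    function compile(file) { /* invoke the real compiler here */ }

    function run(projectModel, argv) {
        var files = projectModel.sources; // already evaluated for this configuration
        if (argv.indexOf("--justshowfiles") >= 0) {
            files.forEach(function (f) { console.log(f); }); // IDE populates its tree from this
            return;
        }
        files.forEach(compile); // normal build mode
    }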

SDiZ says:

I vote for CMake.

Yes, it is fork-heavy. But it is best around if you don’t want to invent yet-another-build-system. Fast and clean dependency checking, this is a big plus.

It is also used by kde, so most of the qt-related configure glitches have been sorted out.

Felix says:

As a commercial customer with thousands of projects depending on qmake (we even wrote a build system around qmake), it would be a blast to have a non-backwards-compatible solution. We always found a way to accomplish things using the qmake language, even adapting qmake itself. My vote is 1!

DAVIDB says:

CMake – it’s good enough for KDE, which handles like a billion lines of code in hundreds if not thousands of projects. It’s been around a while and has proven to be a very good cross platform configuration tool. Have you even considered CMake?

Marco says:

To me, CMake is the most powerful and general system out there, it is also widely adopted (KDE anyone?). Why don’t you just contribute to that project, or maybe fork it? Or propose a CMake 3.0?

Michael "CMake" Howell says:

Why not simply add the ability to directly build to CMake? That would solve the only problem I saw you having with it.

Mathieu says:

I’m sure you have been considering CMake. What were your issues with this one and the others you did not mention ?

Coder says:

Why not take an existing Open Source cross-platform build system and contribute to it. For example, you said that SCons wasn’t good because of performance of initial dependency scan. Why not contribute some resources to the project to improve performance or add other features that eliminate or work around the issue. You said that CMake was fairly good except it uses a secondary build system (ps. I agree this is a big problem). Why not contribute to the project and fix this wart. There are many other systems out there that could all use the combined experience and resources of team Troll. I suggest choosing one (or multiple) of them and trying to make them better.

Thomas Zander says:

cmake also fails many of the points made, so it might be better than qmake, but I’m thinking we might just hit a new wall in 2 years time due to all the other issues pointed out.

Bluebird says:

Did you guys read the blog ?

He gives reasons why CMake is suboptimal in some cases:
CMake, for example, relies on the Makefile generator, and creates Makefiles which in turn call back into CMake to do the build. This solution is not only fork-heavy, but also limiting, since you can only parallelize as well as the backend generator allows you to.

Another reason is hinted with the language. CMake language, while superficially trivial, has some nice subtle bugs, for example when you touch string concatenation, it’s a real beast.

gwossum says:

I use SCons for most Qt-based projects. Although I like SCons, it took a long time to add all the support we needed, and it is still evolving. It can take a lot of boilerplate or a good custom SCons tool for small projects. For this reason, I sometimes still use qmake for really small projects. My biggest complaint with qmake is that the dependency tracking is virtually non-existent.

I’ve dabbled with CMake recently. I still like SCons better for my own projects, but it has its charms. CMake might be the best off the shelf tool for Qt.

Just stay away from autotools!!!

mariuz says:

I would choose JavaScript or minimal Python (Waf-like).
I have seen the configure script for php.net (on Windows only) and it is a JavaScript (shocking).

Maybe Qt could use QtScript; I think it’s easier to parse than qmake.

cyril says:

I vote for CMake. I know and I have suffered myself from the painful syntax, the tons of bugs and all, but it’s definitely the most powerful and complete existing solution. My guess is that you should really consider forking it or contributing to CMake 3.0 !

Scorp1us says:

One of the patterns I’ve come across is the structure/event pattern. In effect any structure can be expressed as a combination of the following events:
-push()
-pop()
-item(item designator, value, addtl attribs)

Any N-ary tree implemented in XML, JSON, Python, DICOM, or any other format can be expressed as some combination of the above events. Implementing a parser (data to events, events to data) means that you don’t care about the input/output formats, because what you handle is a structure of events. I store these events in a RecursiveMap data structure.

It seems to me that you could seamlessly support all the above formats trivially.
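
A tiny sketch of the pattern, with made-up names (this is not the actual RecursiveMap mentioned above): any front end that can emit push/pop/item events rebuilds into the same nested structure.

    // Illustrative only: rebuild a nested map from push/pop/item events.
    function TreeBuilder() {
        this.root = {};          // stands in for the "RecursiveMap"
        this.stack = [this.root];
    }
    TreeBuilder.prototype.push = function (key) {        // enter a nested scope
        var child = {};
        this.stack[this.stack.length - 1][key] = child;
        this.stack.push(child);
    };
    TreeBuilder.prototype.pop = function () {            // leave the current scope
        this.stack.pop();
    };
    TreeBuilder.prototype.item = function (key, value) { // a leaf entry
        this.stack[this.stack.length - 1][key] = value;
    };

    // A JSON, XML or .pro front end would emit the same events, so the back end
    // never needs to know which format the project was written in.
    var b = new TreeBuilder();
    b.push("corelib"); b.item("target", "QtCore"); b.pop();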

bastibense says:

I have some good experiences with CMake, not too sure about the small details under the hood, but it has done the job just fine for most of the stuff I did in the past.

No matter what you decide to do, please -NO- Automake. Thanks. 🙂

rae says:

Most cross-platform places I’ve seen use Jam.

http://www.freetype.org/jam

Damien Arthur says:

a build tool based on json would be sweet!

Adam Higerd says:

@rae: I don’t know if I’d say “most;” the only major project I know of that uses jam is Boost. jam is even more arcane than cmake.

I like the idea of using ECMAScript / JSON notation. It’s simple and not overly verbose, but expressive enough to handle non-trivial setups. With some minor extensions to handle importing modules (something ECMAScript2 lacks, although 3 contains) it could very well serve to generate a representation of the project and its build process. I still like the concept of generating makefiles or project files for a targeted IDE, though; that’s been one of the things I’ve particularly liked about qmake.

Max says:

Marius,

First of all – thank you for bringing this issue up.
I believe that many people have thoughts to share on Qmake.

Some background:

In our company we heavily rely on Qt and are pretty happy with the library itself,
but the larger our projects get, the more problems we have with Qmake. Right now we’re truly unhappy.
We produce products for Windows, Linux and Mac platforms,
I’m involved in many questions related to the build & release process.

Some feedback on your opinions list:

> 1) “Proprietary” language
This was never the major problem for us. Projects are composed of multiple subprojects,
which are done “mostly by copy/paste” from the build system perspective. So once the “basis”
of the build system is done – it’s not difficult to maintain.
This could cause problems for you as maintainers, but “proprietary” language was never
a problem for us as users.

> 2) IDE integration
Nice feature, but not dramatically important for us. It is nice when you can integrate directly
with an IDE (Visual Studio is a good example), but –
– There are always much more IDEs that you can handle
– Most IDEs could be integrated with command line

In our projects we’re allowing developers to generate Visual Studio and XCode project files,
but this is not the way build is generated. But even before this was allowed, most devs were
still using their favorite editors / IDEs to do their day-to-day work – lack of this option
didn’t cause serious problems.

> 3) Build directly from tool
These are some internals which I can’t comment

> 4) Integration towards distributed build systems
Yes – this one is truly important. We were not able to efficiently integrate our projects
that use Qmake with distributed builds. On Windows we were not able to use IncrediBuild,
mostly because generated *.vcproj had problems. Similar problems were on Xcode *.xcodeproj.
In the end we’ve decided that we can’t afford to maintain *.vcproj/*.xcodeproj on our own
and stopped trying.

I believe that likely you won’t be able to invent your own distributed build system, so you’ll
likely need to integrate with something already on the market. So usage of native build
systems (vcproj/xcodeproj is a good example, like CMake does) has a lot of advantages.

> 5) Cross-compiling
Cross-compiling was never an issue for us – we’ve dropped this idea long time ago.
The main reason for it is simple: build process is not only about compilation, but also
about packages creation – which involves a lot of tools from the target platform (like
when you create a build for Mac). And as we need to do something on target systems anyway –
we avoid cross-compilation when possible and compile on target systems only

> 6) pkg-config support
> 7) Deployment
> 8 ) Configuration
Items 6,7 and 8 are not related to the build system, from my point of view.
I’d prefer to have a solution that does build process in a proper way, instead of
having something that tries to “cover it all”, but with lack of some functionality.

In our projects we handle all 3 items with our custom scripts (mostly python),
and I can’t imagine a solution that will make our life easier here.

Few more thoughts:

We considered moving to scons or CMake. Scons was too slow, while CMake looked more promising.
But considering our investment in Qt, we were not brave enough to use a “not officially supported” tool;
we didn’t want to be on our own with Qt+Cmake integration problems.

Ranked by priority, my requirements for an ‘improved qmake’ are:
1) Cross-platform project files (the same *.pro file should be used on different platforms in majority of cases)
2) Parallel builds on all platforms (for example the current win32-msvc target relies on nmake; nmake is a pain)
3) Clear and proper handling of dependencies (at least a way to explicitly declare dependencies between subprojects)
4) Usage of native build tools when possible (Visual Studio on Windows, XCode on Mac, make (?) on Linux)
5) Easy integration with custom build tools (imagine you have some drivers compiled through ddkbuild on Windows – you need to integrate them into the build)

Don’t really care about:
1) Language that will be used
2) pkg-config support
3) Deployment
4) Configuration
5) Cross-compiling (as a way to compile multiple target platforms at the same time)
– these are not parts of the build system; they’re just nice, but not mandatory, extensions.

Regards,
Max

Tim says:

Well I think IDE integration is the most important factor to consider, and one that you can’t do right if your build file is a script. I’d strongly vote against any of the existing build systems for this reason. Build files should be static – along the lines of .ini files rather than .exe.

spinynorman says:

Well, whatever you do please provide a painless upgrade path – a way to import QMake .pro files into the new system.

Please don’t base any replacement on JavaScript. It may be a fine language but the origins of Qt are as a desktop, not web, UI/cross-platform library. Your core audience is developers familiar with desktop rather than web languages. I’d much rather have it Python based than JavaScript.

In any new system, please don’t have any hidden behaviors like special include or library paths (such as the library paths that QMake itself decides to add to Makefiles). The workarounds that are necessary when the developer needs something different are a nuisance.

Can’t QMake be fixed rather than replaced? The most developer friendly solution might be a series of incremental changes over the next few Qt releases that deprecate old features that are hard to maintain or crufty, and replace them with new better ways of doing things.

BogDan Vatra says:

IMHO CMake is the best build tool I know, I use it in every project I made, but when it comes to cross-compiling it is a little pain in the ass (no Symbian, no wince, etc.).
If you want to make a new build system and want developer to use it you should consider:
– do not make it Qt only, make it universal.
– do not use python, perl, ruby, etc. I don’t want to install python or perl to build my apps. I want to use ONLY a tool to do the job.
– make its syntax simple and flexible; I need a way to find a library or a header, and I don’t want to use pkg-config or another tool to do that. You should use CMake as an example.

My vote is for CMake, but it needs all your love to make it THE build system.

Some random ideas:

If you consider using your own tool, make it a library, so that it’s easy to integrate into an IDE. I got this idea from LLVM, which is a compiler, but consists of several libraries and is therefore easy to integrate into an application, may it be a special JITer, compiler or an IDE. A project-builder like this would be handy as a library as well. That still means that there could/should be some ready-made command line tool, of course 🙂

If you go the library-way, make it possible to add/remove entries not only by the JSON file, but also via method calls. And then to re-write the changed DAG out as JSON again.

QMake currently has several generators, e.g. unix (make) or windows (nmake). Maybe it can gain a qbuilder (or whatever the new name will be) generator. That would allow you to use qmake to convert a qmake-project into a qbuilder-project.

Reed says:

Jam (Boost Jam or Perforce) are pretty good.

I once used SCons and it seemed at first like it would be really great, but it leaves you to implement lots of what really ought to come with the build system (mainly system checks). But if Qt Software had the time and resources to create a really good “library” of useful stuff for it, that would be neat.

Using a Makefile generator is great when you’re supplying source code, since then the user/customer doesn’t need any special tools or libraries to compile. Though it does add complexity and subtle problems that end up never getting resolved, because you can always just regenerate all the makefiles if something isn’t working right and rebuild the project. It would be great to have all the special features that most “build systems” have (checks for system-specific stuff, packaging rules, rules to run preprocessors to generate files, etc.) implemented entirely within Make, but this is not really practical it seems (though I’d be interested if anyone tried).

The main jobs that qmake does are (1) set up (some) files to have moc run on them, and (2) abstract the platform and compiler specific parts that will actually do the build (by generating compiler/platform-specific makefiles, or a Visual C++ project)… right? Or do other people use it for other things I’m not aware it can do?

Miguel A. Alvarado V. says:

It must be done, rewrite QMake.

Adam says:

+1 for CMake here. We started with qmake but found it lacking for our build process, so we switched to CMake and never looked back. I understand that separating the build from the tool is not ideal, but that can also bring advantages as it DOES allow you to control parallelism and process based on the backend. Not to mention it can handle using platform specific optimizations based on the compiler used.

Max Howell says:

Your points all look pretty damn great to me.

It would be nice if you would build on top of an existing solution that isn’t quite there. I know you guys like doing it yourself. But qmake never evolved into much because only Qt used it, and partly that’s because everyone felt it was your tool, and you wouldn’t care for patches to make it otherwise.

An additional thing I want to see in a real next gen build system is an API so package managers can query the source distribution for its dependencies. The current state of deps in the autoconf scripts and the package management system is tediously duplicated work.

Finally, something you may not have considered, but I think is important, is the ability to create self-contained tarballs. Autotools still wins over CMake because 1. It is much more portable and 2. You can generate a (multi-megabyte!) configure script that only depends on bash and some kind of make. Now you can do what scons does and make a mini-scons script that then only depends on Python (that is acceptable). To be fair I’m not sure how important this point is. It certainly adds “points”, and makes it more likely you’ll beat autotools for uptake.

Honestly, if you guys do this, and pull it off, you’ll be my heroes.

Nothing that currently exists is close to good enough. CMake gets a lot of “wow! it’s great!” comments, but I could write several blog posts about how it is in fact merely mediocre. I guess you guys agree since you are not considering it.

Donald Carr says:

If you are going to assume full responsibility for generating compiler build steps, please allow for intermediate representation at the atomic level. It is currently incredibly useful to debug at the Makefile level, and then fix the issue at the .pro level. I did not invest a large amount of energy in understanding qbuild, but found it something of a blackbox. I would trigger the build, it would take a large pregnant pause then fail at some point. Verbose debug output is no substitute for an atomic intermediate representation, which facilitates direct meddling to establish the point of failure.

I have never attempted to get a large cmake project cross compiling for a given target, but have heard similar charges leveled against it citing similar blackbox activity.

People are going to deviate from any anticipated usage, please make debugging issues a primary consideration.

Javier Jardón says:

I’ve created this page: http://live.gnome.org/JavierJardon/NewBuildSystem to compile some info about build tools; maybe it has some interesting info for someone.

Also, there are some discussions about Non-Recursive Automake ( http://www.flameeyes.eu/autotools-mythbuster/automake/nonrecursive.html and http://danielkitta.org/blog/2009/07/30/non-recursive-automake-performance/ )

Regards

panzi says:

Because it’s late I only comment on one thing (maybe on more tomorrow or some other day):
For the case a new tool would be written I propose you use QML as the syntax. Maybe you’ll need to generalize QML a little bit or at least you’ll have to write some kind of interpreter, definitely some kind of parser that gives the IDE an AST or something like that. The advantage would be, that Qt developers already know the QML syntax. Actually I think the QML syntax is pretty neat and the best thing about it is: it’s declarative! No loops and only very “controlled” ifs etc. (or am I wrong here?). This would make it also pretty comfortable to handle in an IDE. In an IDE you don’t want a make script that has the need to execute code in order to tell you what’s going on. It has to be described at a higher (declarative) level.

panzi says:

One last thing: Did you take a look at Waf? http://code.google.com/p/waf/
Yeah, it is like SCons in the way that you write Python code. Maybe just a declarative frontend for Waf would do (a YAML or QML parser that directly parses to the corresponding Waf object in order to run it, and to an AST for the IDE, provided that the IDE has Python bindings)?

Cliff says:

So how would you solve the bootstrapping problem if you want to use QtScript? It should be possible to build Qt on a system without any previous Qt version installed.
About CMake: Maybe it is the best tool available but for me it still does not feel right. There must be a better solution for this problem.
ECMAScript syntax is better than any strange language, but it won’t fully fix the problem for IDEs in my eyes. Should the IDE evaluate the code? If you want to change the build options graphically you would need refactoring support as well.

Bill King says:

Add a vote for resurrecting qbuild. It was fast, and fit most of these criteria nicely.

panzi says:


    Binary {
        id: corelib
        target: "QtCore"
        defines: {
            ["QT_BUILD_CORE_LIB", "QT_NO_USING_NAMESPACE"] +
            (MAC && BUNDLE ? ["QT_NO_DEBUG_PLUGIN_CHECK"] : [])
        }
        sources: {
            ["Foo.cpp", "Bar.cpp"] +
            (WINDOWS ? ["Foo_win.cpp", "Bar_win.cpp"] : []) +
            (UNIX && !MAC ? ["Foo_x11.cpp", "Bar_x11.cpp"] : [])
        }
    }

Hm, is there a nicer way to write these conditions? Maybe introduce a new condition syntax (the + is the operator that should be used for chaining; maybe skip that and default to operator+?):

    Binary {
        id: corelib
        target: "QtCore"
        defines: ["QT_BUILD_CORE_LIB", "QT_NO_USING_NAMESPACE"]
        defines+ if MAC && BUNDLE: ["QT_NO_DEBUG_PLUGIN_CHECK"]
        sources: ["Foo.cpp", "Bar.cpp"]
        sources+ if WINDOWS: ["Foo_win.cpp", "Bar_win.cpp"]
        sources+ if UNIX && !MAC: ["Foo_x11.cpp", "Bar_x11.cpp"]
    }

Yet another new condition syntax:

    Binary {
        id: corelib
        target: "QtCore"
        defines: ["QT_BUILD_CORE_LIB", "QT_NO_USING_NAMESPACE"]
        if MAC && BUNDLE: {
            defines+: ["QT_NO_DEBUG_PLUGIN_CHECK"]
        }
        sources: ["Foo.cpp", "Bar.cpp"]
        if WINDOWS: {
            sources+: ["Foo_win.cpp", "Bar_win.cpp"]
        }
        if UNIX && !MAC: {
            sources+: ["Foo_x11.cpp", "Bar_x11.cpp"]
        }
    }

Nah, I don’t like that you still have to write the +. Much too procedural. Maybe even more CSS-like:


    Binary {
        id: corelib
        target: "QtCore"
        defines: ["QT_BUILD_CORE_LIB", "QT_NO_USING_NAMESPACE"]
        sources: ["Foo.cpp", "Bar.cpp"]
    }

    Binary {
        id: corelib
        if: MAC && BUNDLE
        defines: ["QT_NO_DEBUG_PLUGIN_CHECK"]
    }

    Binary {
        id: corelib
        if: WINDOWS
        sources: ["Foo_win.cpp", "Bar_win.cpp"]
    }

    Binary {
        id: corelib
        if: UNIX && !MAC
        sources: ["Foo_x11.cpp", "Bar_x11.cpp"]
    }

Well that’s a lot to write. And on 2nd thought it doesn’t really eliminate the need for the + operator, because in CSS this would mean: if … use this value for defines/sources INSTEAD.

panzi says:

Maybe new syntax “conditional list” where all parts that have a true condition are joined:

    Binary {
        id: corelib
        target: "QtCore"
        defines: [true: ["QT_BUILD_CORE_LIB", "QT_NO_USING_NAMESPACE"], MAC && BUNDLE: ["QT_NO_DEBUG_PLUGIN_CHECK"]]
    }

Craig Ringer says:

CMake looks nice … at first.

Like most build systems, it turns out to have some huge, ugly warts. The configuration language is reminiscent of Perl before “use strict” combined with the nastier bits of berkeley shell’s freaky escaping-and-quoting quirks and fuzzy “value”/”command” delineation. It even has TDWTF-esque “TRUE” “FALSE” “NOTFOUND” . The IF() / ELSE() / ENDIF() stuff is ghastly, too.

CMake also can’t make up its mind about whether it’s a batch configuration tool which takes command-line arguments and creates build files (autohell style), or whether it’s a GUI configuration tool where you set variables in an already-established project (IDE-style). The CMake cache lands up caching things you don’t want cached, so when a user changes a command-line argument and re-runs cmake it seems to have no effect. This is particularly aggravating when a user runs cmake, which reports that it can’t find a required library (say libtiff), so they install it and re-run cmake – which still denies the existence of libtiff, because it’s cached the “knowledge” that it’s not present.

The shipped configuration modules are buggy, inconsistent, often don’t handle non-UNIX platforms properly, don’t quote and escape properly so they die if there are spaces on paths, almost always fail to consider platforms on which -debug and -release builds must link to different libraries, and generally suck. I’ve had to copy almost every standard Find module I use into my source tree and modify it to handle one or more basic issues – improper caching of negative results, failure to handle debug library linkage on win32, failure to handle spaces in paths, etc.

CMake seems to be almost opaque to IDEs. Everything it does is in an imperative style that means that until the CMake code actually runs, an IDE can’t know what files will be created/modified, what compiler definitions will be set, etc.

Its Visual Studio integration is /ugly/ too. It generates a fake target that re-runs CMake, but in many cases that can update dependencies that cause everything to rebuild again.

On the other hand, at least it *can* generate Visual Studio projects, which is a godsend when doing Windows development – and especially when working with Windows developers who’ve never used anything else and don’t want to. To me, CMake’s ability to work with external tools to actually do the build is a major plus – you can use Visual Studio on Windows, XCode on OS X (if you want to), stick to `vim’ and `make’ on *nix, etc. It adapts to the developer’s tools rather than trying to force developers into its tools – something that’s always going to cause major resistance in the adoption of any build system.

CMake is a bit of a monster. The fact that it’s preferable to most of what’s out there says a lot about the state of build systems today. That said, CMake is probably the best candidate for being tidied up into something really nice to work with. Is starting again – again – really viable, after all?

Craig Scott says:

Firstly, I cannot express just how ecstatic I am to see this issue being explored. Having used most of the build systems mentioned in this post and comments thus far, I can sympathize with most of the issues raised.

Some additional food for thought. A weakness of qmake is its (understandable) focus on C++. It is, after all, meant for Qt code. That said, trying to integrate other code that is not C++ with your project can be tough work. I had to conjure up Fortran90 support in qmake for our own software builds, and it was not fun. Fixing F90 and Qt support in SCons was marginally better. Whatever path you guys end up taking, please think about the inevitable cases where people need to do something not originally considered by the designers of the build system. E.g. can people write their own plugin or something to add support for feature X or language Y? Being able to hook into the dependency system more robustly for qmake would have made my life much easier.

Another thing that I’ve found is that there perhaps is more focus on smaller builds than truly big builds, or at least this is what I must conclude from seeing first hand how poorly things scale up. When you start working with 100 or so projects that have inter-project dependencies and many directories and thousands of files are involved, pretty much every mainstream build system out there breaks. As a guide, people start rejecting a build system that doesn’t start compiling within a second or two of invoking the command. Rightly so, because longer delays discourage people from kicking off builds, which in turn discourages experimentation with the code.

And then there are dependencies. Just try to specify dependencies between 100 projects in a sensible way for all platforms and then sit back and watch the platform-specific build tools totally make a mess of them! I’d probably have to say this is the single biggest problem we have with every build system we’ve used. SCons was the most robust, but 20+ seconds for your build to start up was the cost of that robustness. I loved that the dependencies considered your compiler flags rather than just relying on time stamps (which can be confused by source code control which sets file timestamps according to when a file was last committed), but having to parse all SConscript files in your build tree to work that out every time is just too expensive.

In our case, we have developers with experience levels at all parts of the spectrum. We have power users right through to those who could not write a script to save themselves. What they can all do, however, is modify a list of source files if they need to add/remove things and they can generally work with a sensible GUI if they are not overwhelmed by a large number of choices. I also wholeheartedly support Holger’s suggestion of making the build system a library so that it can be used within a plugin to some other IDE. This would make it possible to add support for whatever you come up with into things like Visual Studio, KDevelop and of course Qt Creator (my new favourite).

Finally, here’s my summarised take on the 8 points you mentioned:

1) Proprietary language: Not that fussed, but it is soooo much easier when you don’t have to install additional dependencies just for your build system. Even having to rely on Python (like SCons does) caused annoying additional steps for users starting out and getting their build environments in place for us.

2) IDE integration: This makes a big difference to less experienced users. In my experience, those coming from a Visual Studio background often complain loudly when forced to work with text files that have to be “compiled” into VS project files every time they add/remove something or change a compiler flag. Being able to edit things through a UI in their IDE would greatly lower the barrier to acceptance.

3) Build directly from tool: Handy, but most users would have to be convinced why they can’t use the tool they are used to. You take on more maintenance responsibilities if you take this path.

4) Integration towards distributed build systems: I see why others want this. Personally, I’d just be happy with one that made good use of all my available cores and still handle my inter-project dependencies robustly.

5) Cross-compiling: I much prefer to compile on the platform itself. Once you have to integrate third party packages that need to be installed to supply bits of the build (eg libraries, headers, etc.), trying to set up cross-compiling gets too hard. For us, we just set up virtual machines to build for each platform and that works a *treat*. 🙂

6) pkg-config support: Haven’t seen this used on Windows much, so less useful to us. Probably showing my lack of knowledge about this one……

7) Deployment: Nice, but we’ve already got our own system. Still, if the build system supported it, that would be great. Sort it out after the rest of the issues though. You can’t deploy something that you can’t build robustly!

8 ) Configuration: Very important to us. We have switches to turn things on and off in our build tree and we’d die without the ability to do this. Being able to cache options from one invocation to the next has proven very worthwhile for us. SCons has direct support for this, and it is trivial to do it with qmake using file includes and CONFIG+=foo, etc.

End of essay. 😉

Uwe says:

Usually a qmake project definition is written (or copied) once by someone who is familiar with qmake. In the worst case this job takes a couple of hours, which is nothing in a project running over years. Then developers don’t do much more than adding or removing files/subprojects. So IMHO an advanced IDE integration is completely irrelevant.

I’ve been a Qt developer since Qt 1.1, in many commercial projects, and with Qwt I’m maintaining a cross-platform library that runs on many platforms I’ve never seen myself. All problems I had with qmake in the past were related to the weak documentation. For me it would be more important to see better documentation of the existing qmake than a new build environment.

Jeremy says:

Glad to hear you’re examining replacements for QMake!

1/ Proprietary language / tool: In my opinion, using (and improving if necessary) an existing build tool is definitely the way to go. When trying to convince other developers to use Qt, the fact that all of Qt’s demos use a build system which is Qt-specific is a major hurdle. You don’t want to have to learn how to use a build tool which is specific to the framework.

2/ IDE integration: Personally, integration with an IDE is not very high on my priority list but if you go for an existing build tool there’s a better chance that the integration has been done for you, or at least there’s a wider developer base likely to support the IDE integration.

3/ Build directly from the tool : I don’t see how that’s an advantage, I agree with Craig Scott’s statement that you’re just putting more maintenance burden upon yourself and unnecessarily confusing users.

4/ Integration with distributed build systems : nice to have, but not high on my priority list.

5/ Cross-compiling: oh yes please! This is one of the most important points from my point of view if you want to achieve the vision of “Qt everywhere”.

6/ pkg-config support: this is a bit platform-specific. pkg-config should definitely be one of the ways to detect libraries / headers but it’s unfortunately never quite enough as it doesn’t cover Windows. I like CMake’s approach of shipping ‘modules’ which go about detecting the package you’re looking for with whatever means are relevant for your current platform.

7/ deployment : nice to have to kick-start cross-platform deployments. However this should not be too hard-wired into the build system, as there will always be cases where you need more flexibility than the tool can provide. For instance Linux distributions will want to retain fine-grained control over how packages are built and I doubt they will be satisfied with the default out-of-the-box support provided by the build tool.

8/ configuration : automatic documentation is definitely a must-have, otherwise I doubt people will document the build options for their projects.

I am personally a big fan of CMake, so I’d definitely be excited if the trolls picked CMake and applied their skills to improving it. It’s fast and just works whether I am on Linux, MacOS/X, Windows or even cross-compiling for Windows from Linux. Also, you can probably leverage all the heavy lifting which has been done by KDE.

Per says:

Please learn from the terrible mistakes of CMake, if you do this. In particular: Allow the easy and familiar ./configure interface to switches with integrated documentation. Easy integration with package management software, such as RPM. Support for the equivalent of ‘make dist’ (creating distribution tarballs) and ‘make uninstall’. Make it easy to see raw compilation output when you want to, for debugging and checking that it does what you want.

Sylvain says:

Javascript + JSON is really good, please go for it…

Chris says:

How about an XML-based approach, like Ant? I don’t know if Ant supports any languages other than Java, and the interaction with pkg-config wouldn’t work either, but perhaps this could be added somehow.

Etienne says:

Hello,

I think that qmake is nice and small, has low complexity, is easy to write and read, and is well known…
Please keep it simple, why do we need a more complex language?

In my humble opinion, I would prefer enhancing it by:
– adding a direct build command (not using Makefiles), e.g. qmake -build
– build in parallel
– add a C++ preprocessor syntax (#define, #ifdef), especially for debug/release
– add run options (currently they are with Qt Creator in a .user file, not sharable because of absolute paths)
– add dependencies (with quick dependency tree evaluation) with other .pro files

By the way, we use CMake at work in really big projects; it is very painful, especially with dependencies.

Thanks and see you

Etienne

cartman says:

http://code.google.com/p/gyp/ might be the answer to all your problems.

Sebastian says:

First of all, I’d also like to thank you bringing up this topic.

Long ago, I looked at QMake and I believe at that time it wasn’t able to create full-blown VS project files that allowed one to tune all the usual settings from within the IDE, but only crippled wrapper projects that just called nmake and lacked the usual settings. That was unacceptable for me, and I turned towards CMake, never looking back. This was not because CMake was so good! In fact, I almost hate it for its insane syntax, bad documentation (yes, there’s a lot of it, but it’s *bad*) and slowness in development / fixing bugs. But I found it to simply be the most powerful build tool out there, at least if you require native project files to be created (so Scons, Jam, Waf etc. are out of the question).

Recently, another requirement appeared for me: I wanted to be able to set compiler specific command line options (like e.g. “/fp:fast” for VS) that also show up at the correct place in the IDE’s project settings. CMake seems to just put such compiler settings into the generic “Command Line” field for VS, leaving the dedicated “Floating Point Model” field on the “Code generation” tab untouched, which of course is very misleading. In search for a build tool that would properly handle this I checked Bakefile [1] (slowly developed, XML syntax) and Premake [2] (supports “exotic” platforms, LUA scripting), the latter being generally able to do that, but it still lacks support for make compiler settings.

Then today, after reading this article, I checked QMake again to find out it now creates full-blown VS project files, and also properly recognizes compiler settings like “/fp:fast” for VS and adjusts the appropriate setting in the generated project file rather than just appending it to the compiler’s command line! So I guess I’ll be switching back to QMake, then 🙂

In short, my preferences / requirements for a build tool would be:

1) Prefer to contribute to an existing project rather than inventing something new, or forking a project.
2) Prefer to use a standardized language over something home-brewed. Do not use XML because of its bloat.
3) Being able to create project files for well-established build environments is more important than allowing the build tool to compile the sources itself.

[1] http://www.bakefile.org/
[2] http://industriousone.com/premake

Thomas Berg says:

Excellent post 🙂

@ Craig Scott and Jeremy, who both commented that being able to build directly from the tool shouldn’t be important: I just have to say that after getting used to SCons in several large projects, this is the feature I like the most: Change a compiler flag in one subdirectory of a large project, add some new source files, even new files with Q_OBJECT macros in them, and there’s just one command to get a correct, incremental build (no unnecessary rebuilding). Even custom dependencies are handled perfectly. For example the sip tool of PyQt4 or the tblgen tool of LLVM are easy to support.

Also, you get excellent parallel builds, and stuff like retrieval of already built files from a cache (shared between developers), even on Windows.

About language: Python is a heavy dependency, so I can see why you would want something else. Bootstrapping JavaScript shouldn’t be a problem; if I understood correctly qmake already contains a JavaScript interpreter, right?

Marius says:

@Thomas: You’re right, it did. However, with the recent transition to JavaScriptCore we removed this feature, as it wasn’t as deeply integrated as it should have been, and not really used by anyone. But yes, JavaScriptCore can also be bootstrapped, if we just put a little work behind it.

I think all the long comments justify a blog post follow-up though, rather than a long comment reply 🙂

Alex says:

I fully agree with Per here.

I have two projects, one using autotools and the other using cmake.
While cmake does provide some good features (VS integration (even though a bit ugly), no libtool silliness, etc…), there are definitely some very bad things there:
* Lack of ./configure. Nothing beats ./configure, period. All that --help, --prefix, etc… are very easy to use and impossible to forget.
* A simple ./configure --help lists everything relevant to that package. With CMake, there are so many variables that it’s impossible to memorize them all.
* With autotools, cross-compilation is usually as simple as setting CC and CXX and running ./configure --target=other --host=mine. I have yet to discover how to properly do a cross-compilation with cmake, with all its underdocumented variables.
* Lack of make dist and make uninstall really get on the nerves. With autotools they just work by default.
* Ugly CMake language. Sure, autotools have that horrible M4 stuff, but at least I can write configure.ac mostly in shell, which is a lot better than cmake language.

QtFan says:

@Marius:
great post and great work (of course)!

I’m curious, when you are talking about changing/replacing QMake,
are there any thoughts about changing/extending MOC !?

I would like to see things like “QHibernate”, “QIoC”, and so on…
Qt is great but it is still “low-level” compared to Java and DotNET.
I think that we need “out of the box” support for
Dependency Injection/Inversion of Control and Persistence.

I don’t see how this can be achieved without changing MOC?

Anyway, thanks for great framework.

Florian says:

I strongly vote for rewriting QMake and for staying backward compatible (!). That mainly means that you need to write a good unit test for the existing QMake functionality and keep that running while refactoring. I think you should apply the same programming strategies that give Qt such good quality to QMake as well. Most problems we had with QMake were related to the many undocumented features that one has to rely on, which changed from Qt release to Qt release.

In our company we have thousands of QMake profiles and it would really be a bad decision for us if you stop QMake!

You should not make the error of building a new tool that tries to solve all problems (since most of your customers already have solutions for that, e.g. parallel builds, deployment, …).
Especially when you talk about deployment, there are so many ways to build cross-platform installers that I am sure that your solution for each platform will be suboptimal, so in the end everybody will go back to his own installer solution.

I think if you would create a new tool, we would NOT use it in our company and change to CMake instead, because CMake will never go away, but who will give us the guarantee that your new tool will not die after 2-3 years like QBuild?

Adam Higerd says:

@QtFan: Having worked with three different signal/slot implementations (Boost, gobject, Qt) I can say without a doubt that moc is the best thing that’s ever happened to C++. Boost may have a performance benefit over Qt signals, but that difference is dwarfed by the flexibility, dynamicness, and introspective abilities provided by having a metaobject. Furthermore, Qt’s signal dispatch mechanism allows signals to be passed across threads in perfect safety. I’ve written several tools that take advantage of the features offered by QMetaObject, including some that make use of the metacall system directly instead of using generated moc code.

panzi says:

Just a quick note before I go to the university:
Please no procedural code; IMHO declarative code is the way to go. Please no C-like macros. I wish C/C++ had no macros. Please no XML. YAML would definitely be the better choice.

YAML:

    corelib:
        target: QtCore
        defines:
            all: ["QT_BUILD_CORE_LIB", "QT_NO_USING_NAMESPACE"]
            "mac && bundle": ["QT_NO_DEBUG_PLUGIN_CHECK"]
        sources:
            all: ["Foo.cpp", "Bar.cpp"]
            windows: ["Foo_win.cpp", "Bar_win.cpp"]
            "unix && !mac": ["Foo_x11.cpp", "Bar_x11.cpp"]

QtFan says:

@Adam Higerd: Yes, and I think we need a little more information in QMetaObject.

We need decent IoC container. I know about http://qtioccontainer.sourceforge.net/
but that is the maximum that can be done with current Qt.

Guys, GUI is one part of the game, but how do you wire/weave your business code?
There is no need to do this manually. This really slows down the development process.

Anyway, I was just thinking that this is a good time to think about it while you
are changing QMake. “Something” like annotations in Java 5 or whatever …
could open the door to projects like “QIoC” or “QHibernate”.

Stefan says:

Don’t underestimate the power of using a domain-specific language instead of some general-purpose language like JavaScript. So you might think about how to improve the one used by qmake today. Therefore I plead for option 1 and hope that you rewrite qmake.

And stay backward compatible. It will be a lot of pain to rewrite all the build systems that use qmake today!

JubiluM says:

Oh well…any considerations of removing C++ support from Qt any time soon, so that we can be prepared?

No, seriously…qmake has been a major tool for us in doing our work through the years. After trying different solutions (CMake with its null documentation and even less usability, maketools, various scripting alternatives…) and more or less hitting the wall, we accepted qmake and its nature with some minor workarounds we have to apply. And we do anything from simple tests to very complicated things. And we really love working with qmake.

The beauty of qmake is that you say:
-qmake
-make release

And (basically) that’s all there is to it, on all platforms. From a project size of one file to a project size of thousands of files.

If you don’t yet have a (q)makefile for a source tree, qmake generates one for you.

And *.pro files nowadays work wonderfully with QtCreator. Just read one in… change the project from the IDE, or change it manually. So it integrates nicely with the IDE. Actually, qmakefiles let you work with a decent number of different IDEs. And makes… if you will. Probably everyone has an opinion about an IDE or a make, and qmake supports wonderfully many different needs; that is where it shines.

The syntax used in qmakefiles is close to that of traditional makefiles, which I believe is familiar to most C++ users. It’s easy to write well-structured qmakefiles with a good “signal-to-noise ratio”: the simplified yet expressive syntax helps keep extra clutter away. It’s easy to read and comprehend a qmakefile written by someone else.

Now you would be ready to throw qmake away, force people to learn yet another syntax or language for administering projects, and make people rewrite god-knows-how-many thousands of qmakefiles?

JavaScript… are you serious!? Scripting languages come and go. I know that qmake has inbuilt value for its users, value that will last if you keep it alive and improve it.

I realize that all this functionality comes at the price of the qmake code itself being hard to maintain. But the solution is not to replace it with JavaScript or some other horrifying alternative… you cannot get away from the maintenance problems or the complications of a cross-platform make tool itself; you just have to deal with them, irrespective of the language.

So if the code is somewhat inflexible or cluttered in some parts, redesign and recode those parts. If it doesn’t yet parallelize, change it so that it does… you have classes in Qt that ease concurrency-related problems. And there is never enough high-quality documentation; it can save you a day or two.

Ok, bottom line… it would be very disappointing, and actually pure madness, to see you cast away qmake. Just make it even better, maybe modular in a way that makes it easy to add new functionality… and people will come to you :) !

Hi
the truth for my company is… all Unix make systems suck.
OUCH, sorry, but that’s true from a financial point of view.
And once you are done hating on Windows users, well… the reality is that
EVERYONE can use vstudio files.
Ok, the UI is not perfect (setting up complex projects is sometimes really annoying because of UI glitches),
but every user is able to create projects and build them.

We also work on Unix; our flowline simulation system is completely platform independent. And we spent more money on getting the buildsystem running on Linux than on the whole porting effort (and, by the way, we are not using Qt as a base).
The real numbers: 5 days patching system-specific code such as file handling, threading, and timing;
22 days to get a reliably working buildsystem that just works and does not need to be fiddled with all the time.

Important for our buildsystem is just:
easy to use
… adding files, e.g., is simply not necessary; we have a script that searches through our project paths for files and their dependencies.

hierarchical settings
… settings are defined once and are used in all subprojects. Copy&paste of makefiles is the worst thing that can happen on large projects if you change to a new compiler and need to modify the settings (we ran into that).

no intelligence in the buildsystem
… at least we have no place where we need more intelligence than simple rules for how to do what.
We HAVE platform- and also configuration-specific code. But that is handled completely IN our code:
if a file should not be compiled on win32, it has an #ifndef around it. Finished.
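
As a concrete illustration of that last point (my own sketch, with made-up file and class names), the platform guard lives in the source file itself, so the buildsystem can hand every file to the compiler unconditionally:

    // foo_x11.cpp -- always passed to the compiler; compiles to nothing on Windows.
    #ifndef _WIN32

    #include "foo.h"

    void Foo::initX11()
    {
        // X11-specific implementation goes here ...
    }

    #endif // !_WIN32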

But we found the following things useful; they are on our internal list of “how our ideal buildsystem would be”:
* automatically find files to compile. Have a pattern (or optionally a regexp) to find the files you need to work on. Each group of these files is handled separately.
* settings groups
Have a list of settings (compiler, compiler defines, …) grouped together.
Allow a hierarchy of groups (global settings, with the local project modifying just a part of them).
* dependency groups
We do not define for each library which other libraries it needs; we just say “this project depends on
group x” … group x is a collection of other projects.
* multithreaded/distributed compilation …
Here we take a completely different approach, using a render manager. It depends on creating groups of
tasks, and instead of assigning them to all machines (and probably waiting), every system looks for an available
task which is ready to run (all its dependencies are fulfilled) and does it. Therefore there is no central
management system; it is scalable and works on one machine with multiple cores or across the network (a rough sketch follows below).
Although the system works, it is more of a test hack, so the task groups are rather simple at the moment.
Different compilation steps (compiling cpp files, compiling other resources like images, documentation, CUDA, …
) are handled in separate files right now; the send system just creates its own jobs out of them.
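
A rough sketch of that “pull a ready task” idea, under my own assumptions (plain C++ threads, hard-coded task names, busy-waiting instead of a real scheduler). It is only meant to show that no central dispatcher is needed when every worker claims any task whose dependencies are already finished:

    // pull_scheduler.cpp -- illustrative sketch, not the commenter's actual system.
    #include <cstdio>
    #include <mutex>
    #include <string>
    #include <thread>
    #include <vector>

    struct Task {
        std::string name;
        std::vector<int> deps;   // indices of tasks that must finish first
        bool started;
        bool done;
    };

    static std::mutex m;
    static std::vector<Task> tasks = {
        { "compile foo.cpp", {},     false, false },
        { "compile bar.cpp", {},     false, false },
        { "link app",        {0, 1}, false, false },   // ready only after both compiles
    };

    // Claim any unstarted task whose dependencies are all done; -1 if none is ready.
    static int claimReadyTask()
    {
        std::lock_guard<std::mutex> lock(m);
        for (int i = 0; i < (int)tasks.size(); ++i) {
            if (tasks[i].started)
                continue;
            bool ready = true;
            for (int d : tasks[i].deps)
                if (!tasks[d].done) { ready = false; break; }
            if (ready) { tasks[i].started = true; return i; }
        }
        return -1;
    }

    static bool allDone()
    {
        std::lock_guard<std::mutex> lock(m);
        for (const Task &t : tasks)
            if (!t.done) return false;
        return true;
    }

    static void worker(int id)
    {
        while (!allDone()) {
            int i = claimReadyTask();
            if (i < 0) { std::this_thread::yield(); continue; }   // nothing ready yet
            std::printf("worker %d: %s\n", id, tasks[i].name.c_str());
            // ... run the real compiler/linker command here ...
            std::lock_guard<std::mutex> lock(m);
            tasks[i].done = true;
        }
    }

    int main()
    {
        std::vector<std::thread> pool;
        for (int i = 0; i < 4; ++i)
            pool.emplace_back(worker, i);
        for (std::thread &t : pool)
            t.join();
        return 0;
    }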

In the end … we do not have the ideal buildsystem. But it works, because we keep it as simple as possible,
because no one wants to work with a buildsystem or have to learn one.
It should just “collect files to work on based on rules”, “create tasks to do based on rules”, and “execute those tasks”; everything else is not part of a buildsystem, and keeping it to that is what makes it useful.

hg says:

I suspect there are really only two choices:
– fix cmake
– fix qmake

My vote is for fixing CMake. If you opt to rewrite qmake, then the Qt build systems will remain significantly split. If you opt for CMake and put resources into it, then there is quite likely to be only one significant cross-platform Qt build system in a fairly short space of time and, hopefully, more eyeballs on cross-platform issues.

gwright says:

Personally, I think qmake is great. However, there is one huge flaw in it which is a showstopper, at least for me (and I’d guess a few others have hit this problem as well).

Basically, the Makefile generator backends in qmake are paired with specific linkers. For example, on Windows it is currently impossible to use GNU Make (from MSYS) with MSVC. Likewise, it is impossible to use GNU Make with the RealView ARM compilers, or nmake with gcc, or other esoteric combinations.

The main reason for this is that while you can set the linker command in the qmake.conf file of the mkspec, the *syntax* for that linker is hardcoded to whatever the Trolls think that Makefile generator should be used with. For example, the MinGW generator assumes GNU ld-style syntax (with -o, -L, -l, etc.), which isn’t used by other toolchains such as RVCT.

Other such hardcoded problems include the insistence on running the resource compiler for *all* Win32 apps (which isn’t required in many cases and should be optional), and hardcoded resource compiler syntax.

If this could be fixed, qmake would do a much better job of being a fully capable cross-platform buildsystem.

Marius says:

Thanks for all the comments on this blog entry. I’ve posted a follow-up post here: http://labs.trolltech.com/blogs/2009/10/14/to-make-or-not-to-make-qmake-and-beyond-redux/

Please have a read through, and give more comments!

Thanks!

Commenting closed.
