The reason given when this same article was discussed on reddit was that library authors need to use the same compiler as their customers. In other words, a GCC-compiled Windows DLL is not a drop-in replacement for an MSVC-compiled DLL.
I'm sure this is true for C++, which has a crazily complex ABI for things like classes and exceptions. But is this really the case for plain C? (It isn't in the Linux world, AFAIK.)
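For what it's worth, a plain-C DLL interface consumable by both compilers tends to look like this; a minimal sketch with hypothetical names:

#ifdef BUILDING_MYLIB
# define MYLIB_API __declspec(dllexport)
#else
# define MYLIB_API __declspec(dllimport)
#endif

#ifdef __cplusplus
extern "C" {
#endif

/* No classes, no exceptions across the boundary; calling convention spelled
 * out explicitly. Both MSVC and MinGW-gcc understand __declspec and __cdecl. */
MYLIB_API int  __cdecl mylib_open(const char *name);
MYLIB_API void __cdecl mylib_close(int handle);

#ifdef __cplusplus
}
#endif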
x264 has worked fine in many, many MSVC applications, even when compiled with GCC under MinGW.
There are some minor catches:
1. If you want a .lib instead of a .dll, you have to make one yourself separately using a tool. Not a big deal; it's just that gcc doesn't do it for you.
2. No debug symbols, because MSVC doesn't support DWARF/etc.
3. gcc guarantees 16-byte stack alignment on x86_32, but only relative to the alignment its caller provided; MSVC maintains no such guarantee, so the stack may not be aligned as gcc expects. You can use -mpreferred-stack-boundary to change what gcc assumes, or you can just explicitly realign the stack, either with the gcc attribute (requires ~gcc 4.2) or with a small assembly function (see the sketch after this list).
There are probably a few others, but overall, it does work; cdecl is cdecl.
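For point 3 above, presumably the gcc intrinsic meant is the force_align_arg_pointer function attribute; here's a minimal sketch of using it (the function and macro names are hypothetical):

/* Realign the stack on entry so gcc-generated SSE code is safe even when
 * the caller (e.g. MSVC-compiled code) keeps only 4-byte alignment. */
#if defined(__GNUC__) && defined(__i386__)
# define FORCE_ALIGN __attribute__((force_align_arg_pointer))
#else
# define FORCE_ALIGN
#endif

FORCE_ALIGN void scale_buffer(float *dst, const float *src, int n)
{
    int i;
    /* With the attribute, locals here sit on a 16-byte-aligned stack,
     * so vectorized code paths won't fault. */
    for(i = 0; i < n; ++i)
        dst[i] = src[i] * 0.5f;
}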
C works fine, I even toyed with a kernel module build using MinGW instead of the WDK. One caveat is that GCC doesn't generate PDB information, the debugging format used by Microsoft tools.
Just as a comparison, I've heard that Oracle will most likely adopt GCC's C11 ABI when C11 support is added, so that one will actually be able to mix and match objects from both compilers.
The PathScale compiler doesn't work on Windows, even though they claimed a beta would be released at some point. Clang+LLVM on Windows is very immature; I like to play with it and put it to the test once in a while, but it has so many problems under Windows that I wouldn't trust it in production yet.
GCC works and produces good code; unfortunately, you have to deal with the disgrace that is MinGW, and it doesn't integrate with Visual Studio. Most Windows developers I know care about Visual Studio integration.
The Intel compiler produces excellent code, it is C99- and Microsoft-compatible, and it integrates seamlessly with Visual Studio. Unfortunately, it's very expensive; I have never seen it used in production.
When discussing this topic, people like to say that C is more commonly used than C++. But I think the reality is that Microsoft's customers use C++ much more than C. I am such a customer, and in my field 95% or more of programming is C++.
I still wish they would support a few very basic C99 features though.
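For reference, a minimal sketch of the kind of features usually meant (the example itself is mine): it uses designated initializers, for-loop declarations, and declarations mixed with statements, all plain C99 that MSVC's C compiler rejects.

#include <stdio.h>

struct point { int x, y; };

int main(void)
{
    struct point p = { .y = 2, .x = 1 };   // designated initializer
    for(int i = 0; i < p.y; ++i)           // declaration inside for()
        printf("%d\n", i);
    int sum = p.x + p.y;                   // declaration after a statement
    printf("%d\n", sum);
    return 0;
}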
I think there might be a bit more C than people realise - for example, the last company I worked for did primarily C++ Windows applications, but we also built in several third-party C libraries (Python, GDAL, proj, etc.) which we needed to be able to build in VC++. Presumably this kind of use case is exactly the sort of thing which would prevent libraries like Python from moving on to use C99 features.
I definitely agree that it would be much better if MS just supported those few features, which frankly don't seem that hard in the scheme of things. It makes me think that they see some advantage in not supporting them, although I'm not sure what that would actually be.
Yes, a few of the libraries I've been involved with do not want to allow C99 features into the codebase because of MSVC. This, IMO, is the main reason for wanting some C99 in MSVC -- to stop Windows from holding the rest of OSS hostage when it comes to more modern language features.
Surely Windows itself must use tons of C. I wonder what compiler the internal devs are using. Archaic C support seems more like a tactical choice than laziness, in that case.
The same thing could be described as a disincentive for supporting more-recent C++, and yet Microsoft is implementing those features at a pace which is unusually fast (for VC). I'm sure the real issue is much more prosaic: it costs money to implement, and it's not something that most of VC's customers are asking for. The expense here probably isn't in changing the compiler itself; it's in the implementation of the test suites. And while there may be open source test suites available under an acceptable license, the company's willingness to make use of outside source is changing at a glacially-slow pace -- particularly for core products.
Disclaimer: I work at Microsoft and know some people on the C++ compiler team. These comments are purely speculative on my part.
I don't think there's a disincentive on the C++ side - the best Windows apps use C++ so improving C++ helps make Windows a strong platform. The article also points out many of the most missed features of C99 are straightforward, and hopefully the tests would be too.
One is not obligated to support all features of a standard in order to support some, a reality Microsoft has consistently availed itself of during its existence.
You really have no idea what my standards are. All I did was say they didn't have to support VLAs to support other parts of C99. That tells you absolutely nothing, stop assuming so much.
The Perl philosophy is simply to torment the implementors on behalf of the user. -- Larry Wall
Look at the C99 rationale:
The inability to declare arrays whose size is known only at execution time was often cited as a primary deterrent to using C as a numerical computing language.
That's also the main reason for the introduction of complex types and type-generic math functions.
It's not as if the C committee got together and thought "Hey, what can we do to piss off compiler writers?" - there was actual demand for these features!
Sure, it made the language semantics messier (runtime evaluation of sizeof, the introduction of variably-modified types restricted to block or prototype scope, magical macros for math functions - which C11 has since fixed, btw), but C99 is indeed a better language than C90 for doing numerics.
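For illustration, a minimal sketch of what those features buy you (the example is mine; sqrt() dispatching to csqrt() via <tgmath.h> is standard C99):

#include <complex.h>
#include <stdio.h>
#include <tgmath.h> // type-generic macros: sqrt() dispatches on argument type

int main(void)
{
    double complex z = -4.0;        // C99 complex type
    double complex r = sqrt(z);     // tgmath expands this to csqrt(z)
    printf("sqrt(-4) = %g%+gi\n", creal(r), cimag(r));  // prints 0+2i
    return 0;
}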
I don't dispute for a moment that some people wanted VLAs or that they're useful in some contexts. On the other hand, your cited field is relatively small, and the use case could have been satisfied with a mechanism much more in line with the existing language.
In fact, we've de facto had such a mechanism for decades. It hasn't been codified anywhere, partly for some of the same reasons people don't appreciate VLAs, but it at least sits much more comfortably with the rest of the language.
At some point, you go from enhancing C to breaking it in the interests of a small minority (who apparently don't want to use Fortran?). VLAs are, if not over the line, right on it.
// C99
#include <stddef.h> // for size_t

// mat is an n-by-n matrix; the compiler does the index arithmetic
double trace(size_t n, double mat[][n])
{
    double sum = 0;
    for(size_t i = 0; i < n; ++i)
        sum += mat[i][i];
    return sum;
}

/* C90 */
#include <stddef.h> /* for size_t */

/* mat points to n*n doubles in row-major order; indexing done by hand */
double trace(size_t n, double mat[])
{
    double sum = 0;
    size_t i = 0;
    for(; i < n; ++i)
        sum += mat[i * n + i];
    return sum;
}
While this may not look like much of an improvement in simple cases, not having to emulate array subscripting by hand is quite convenient in more complex ones.
2. Allocation of variably-sized objects with automatic storage duration. Some libc implementations provide alloca() for that purpose -- unfortunately, it has issues (see e.g. http://c-faq.com/malloc/alloca.glb.html ).
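A minimal sketch of that second use case (the function and names are mine); unlike alloca(), the array's storage is released when its enclosing block exits, not when the whole function returns:

#include <stddef.h>

double sum_scaled(size_t n, const double in[n], double scale)
{
    double tmp[n];      /* C99 VLA: size known only at run time */
    double sum = 0;
    for(size_t i = 0; i < n; ++i)
        tmp[i] = in[i] * scale;
    for(size_t i = 0; i < n; ++i)
        sum += tmp[i];
    return sum;         /* tmp goes away with its block */
}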
I know this is probably a losing battle, but I feel better at least being able to express my displeasure about the issue.