
Why is it "simply unacceptable" to write code to the least common denominator?

Anything that is successful is bound to have multiple versions in the installed base. You can target a single version of something only if it's not used at all. It seems too many developers live in an "everybody must have what I have" world.

I haven't tried it myself, but I can't believe Python has changed so much that you can't write plain Python code to the least common denominator of versions >= 2.4. If you have specific examples of why you can't, I'd like to know them.

Edit: I've rechecked; he actually mentions 2.2 as the lowest version he saw. Still, I believe that doesn't change much.




You can write code that runs on 2.4, yes. But you miss quite a few useful features of more recent Pythons by doing so (context managers, for example, are a big deal, as are several of the newer standard-library modules). The same thing used to happen with Python 2.3; you could write 2.3-compatible code, but it meant no decorators, no generator expressions, etc.
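
As a concrete illustration of what that costs (a hedged sketch; the filename is just an example): anything that would use a 2.5-style context manager has to be spelled out with try/finally to stay 2.4-compatible.

    # Python 2.5+ version (not available when targeting 2.4):
    #
    #     with open('mongrel2.conf') as f:
    #         data = f.read()
    #
    # Python 2.4-compatible equivalent:
    f = open('mongrel2.conf')
    try:
        data = f.read()
    finally:
        f.close()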

Python has these useful features and has had them for years; why is it acceptable to say that it will be 5-10 years before we can reliably use anything new?


In his particular case, he certainly was not missing the newer Python modules, as he was able to write a "small" C-only replacement. What he wrote in C would certainly be easier to write in Python 2.4 (or 2.2).

Did you understand that he wrote the big C program just to process a small file with syntax like the following example (taken from his own file)?

    main = Server(
        chroot="./",
        hosts = [mongrel2]
    )
    settings = {"zeromq.threads": 1}
    mimetypes = {".txt": "text/plain"}
    servers = [main]
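
For comparison, here is a minimal sketch of how such a file could be handled in an old Python, since the syntax shown is itself valid Python. This is purely illustrative: the Server/Host stand-ins and the load_config helper are hypothetical, not Mongrel2's actual classes or loader.

    # Hypothetical sketch (roughly Python 2.3+): evaluate the config as
    # restricted Python syntax and collect the top-level names it defines.
    # The full config file would define names like 'mongrel2' (a Host)
    # before referencing them.
    class _ConfigObject(object):
        def __init__(self, **kw):
            self.__dict__.update(kw)

    class Server(_ConfigObject): pass
    class Host(_ConfigObject): pass

    def load_config(path):
        namespace = {'Server': Server, 'Host': Host}
        execfile(path, namespace)          # the file is plain Python syntax
        if '__builtins__' in namespace:
            del namespace['__builtins__']
        return namespace                   # e.g. namespace['servers'], namespace['settings']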


"Big"? You keep using that word. I do not think it means what you think it means:

  $ rm src/parser.c src/cli.c src/lexer.c src/linenoise.c 
  $ wc -l src/*.c src/*.rl
  148 src/ast.c
  646 src/commands.c
  267 src/config_file.c
   93 src/constants.c
   36 src/m2sh.c
   13 src/token.c
  186 src/cli.rl
  156 src/lexer.rl
 1545 total
That's dinky tiny; even with the linenoise and generated files it's only 4061 lines long, which isn't much at all.


My count is 4636 lines, and I can imagine much, much smaller Python 2.3 code:

    ast.c 115
    ast.h 45
    cli.c 482
    cli.h 31
    cli.rl 143
    commands.c 498
    commands.h 5
    config_file.c 193
    config_file.h 27
    constants.c 82
    constants.h 14
    lexer.c 360
    lexer.rl 121
    linenoise.c 433
    linenoise.h 40
    m2sh.c 27
    mimetypes.csql 851
    parser.c 1074
    parser.h 15
    parser.y 69
    token.c 11


No, most of that is generated from the .rl files or just SQL that's common to everything. Also, you'll want to throw in all of Storm and PyREPL if you want a real comparison.

Then again, 4600 lines of anything is tiny. You have a massively skewed view of "Big" and haven't disproven anything by finding 600 lines of cruft in one directory.


I fully agree with you that this is tiny for a C project, but it can be big compared to a small script. I also develop/maintain projects measured in zipped megabytes (where I don't even wish to wait for the lines to be counted!); I was only comparing it to a script-based solution.

You're right, 600 lines more or less are irrelevant. I just thought you were measuring something else, as I saw a total of 1000 instead of 4000. Still, it is all tiny for a real C program.

I also think that the C solution is more portable than depending on any version of Python. I have also cross-compiled the code for mipsel platforms with 32 MB of RAM, and I agree that C-only dependencies are better than any scripting-language dependencies (not counting shell, when it's carefully written).

But I actively use both Perl and Python, so I'd still really like to know what was lacking in Python 2.2, 2.3, or 2.4, assuming of course that you already knew you would have some Python on the target computer at all. If you couldn't expect any Python to be there, then the fact that distros still ship older versions isn't of much relevance.


I'm concerned with the general case of writing Python software. Zed's example perhaps could get by, though it presents some minor annoyances.

Distributing a framework like Django, on the other hand, when distros lag six years behind current Python, is a royal pain in the ass.


Django is in a very different situation because Python is central to all of its users; therefore it is reasonable to expect that they maintain a somewhat modern (released in the last 3 to 5 years, say) environment. When Python merely performs a subsidiary role, and your users may never use it themselves, it is far more important to support old versions.


I mentioned Django as a counterexample; the conversation seemed to be heading toward using the single specific example of Mongrel2 to demonstrate either that there is no problem here or that it's not particularly serious.

Meanwhile, anyone who depends on Python for much more than a config-file parser knows what a genuine and large issue this is, and that "just write to the lowest common denominator" is not really a reasonable solution.
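
To illustrate what "writing to the lowest common denominator" tends to mean in practice for a library, here is a hedged sketch of the kind of compatibility shim it forces on authors (a generic pattern, not Django's actual code):

    # Generic compatibility shims for supporting Python 2.3 alongside newer
    # versions (illustrative only; not taken from Django).
    try:
        set                                # builtin set() appeared in 2.4
    except NameError:
        from sets import Set as set

    try:
        from functools import wraps        # functools is new in 2.5
    except ImportError:
        def wraps(wrapped):                # minimal stand-in for older Pythons
            def decorator(f):
                f.__name__ = wrapped.__name__
                f.__doc__ = wrapped.__doc__
                return f
            return decorator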


However, your replies started with your response "That's simply unacceptable" to my "I'd very much like to know why it wasn't easier for him to simply write in Python 2.4." You mentioned Django (actually off-topic in this context) much later.

And I still don't know if he tried adjusting his already existing Python code to simply be backwards compatible -- he never mentions that, nor why that would be a problem in his particular case.

Regarding Django, see the other replies here about side-by-side installations; that's more than reasonable if you have something big which always runs, but not reasonable for a small config script used once.


My point is that the lowest common denominator is higher for users of something like Django, because Python is a central component of what they do; thus it's reasonable to assume that they are not using an 8-year-old Python environment.


Tell that to the many, many people I know who are using Django on RHEL. There's a reason why we just (as of Django 1.2) got around to dropping Python 2.3 support...


We can't drop Python 2.3 until 2012 (when RHEL4 reaches end-of-life).


Sounds like he could've easily embedded Python into his app instead.


What's the point of ever producing newer versions of a language with new features (or, golly, even bug fixes) if no one will ever use them because everyone has a 'lowest common denominator' mentality? More to the point, what is the process of changing the 'lowest common denominator' everyone's using?


Where do you get "no one will ever use them" from? We're talking about the darned config script which the guy first wrote in Python, and then discovered that he would either have to write the processing of that file against a lower version of Python than the one he blindly used, or use C. He decided to use C; I still haven't read anywhere exactly why.

Platform upgrades have their own dynamics. But you just shouldn't be surprised that somebody who has had a server running for years doesn't want to install the latest Python only to process one config script. Nothing more, nothing less.



