Could you tell us anything about why these things are bad?
Regarding the last point (the one that you cannot change), I'm not sure I prefer the behavior of any other language. In Python, for instance, every number-like thing I tried is falsy iff it is 0, and almost every container is falsy iff it is empty, but a Queue is always truthy, even when empty. That is a weird exception, and I would rather not have to remember it. Collections and numeric types defined in external libraries can easily have the same problem.
Automatically declared global variables are a well-known misfeature of some older languages: they give you nothing but an opportunity to make hard-to-detect mistakes through typos.
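To make that concrete, here is a minimal sketch, assuming the language under discussion is Lua (where assigning to an undeclared name creates a global): a one-character misprint quietly creates a new global instead of producing an error.

    local total = 0
    for i = 1, 10 do
      -- Misprint: "totaI" ends in a capital I, not a lowercase l. It was never
      -- declared, so the assignment silently creates a brand-new global variable.
      totaI = total + i
    end
    print(total)  --> 0, and nothing ever points at the typo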
Arbitrary nonsensical type casts must be prohibited. The compiler/interpreter/IDE must infer types and help the programmer to catch their mistakes as early as possible.
As a general rule of language design, if something looks unintentional or ambiguous to a human, it should look that way to the compiler/interpreter too. The more closely the programming environment matches the way you think, the easier programming becomes.
Thanks. I agree that Haskell is a great language, but this is a dynamically typed scripting language. As such, perhaps it would be more useful to compare it to languages that are actually comparable, like Ruby, Perl, your favorite LISP, Javascript, or Python.
This language does not include any nonsensical type casts. You cannot add a table to a string, take the arithmetic negation of a function, or index a number. (Actually, if you wanted to, you could write code that permits most of these things; for instance, I usually use some code that adds the ability to index strings.) String concatenation and arithmetic addition are different operators, so, unlike in some prominent scripting languages, you cannot find values a and b such that +(a+b) == 50005 while +(b+a) == 55 (in JavaScript, for example, a = 5 and b = '0005' will do it).
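For illustration, a quick sketch, assuming the language in question is Lua (whose semantics match the description above): addition and concatenation are separate operators, and the nonsensical operations are runtime errors rather than surprising values.

    print(1 + 2)       --> 3
    print("1" .. "2")  --> 12    (.. is concatenation)
    print("10" + 1)    --> 11    (+ is always arithmetic; numeric strings are converted)
    print(10 .. 1)     --> 101   (.. is always concatenation; numbers are converted to strings)

    -- Nonsensical operations are runtime errors, not silently coerced values.
    -- pcall returns false plus the error message instead of aborting the script.
    print(pcall(function() return {} + "x" end))   --> false  ...attempt to perform arithmetic on a table value
    print(pcall(function() return -print end))     --> false  ...attempt to perform arithmetic on a function value
    print(pcall(function() return (42)[1] end))    --> false  ...attempt to index a number value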
It does, however, include some forms that allow you to branch based on whether or not a value is a member of the set {nil, false}. I suppose it would be better for the compiler to guarantee that the value is a member of the set {true, false} and to yell at you rather than run your code if it is not, but that is a fairly rare property across programming languages. I frequently use a programming language called Scala, and it pretends to provide such checks, but in reality all reference types are silently nullable, and any method call can silently throw (that is, evaluate to the bottom type).
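Concretely, the branching rule looks like this (again assuming Lua): a condition only asks whether the value is nil or false, and everything else, including 0, the empty string, and an empty table, takes the true branch.

    local function describe(v)
      if v then return "truthy" else return "falsy" end
    end

    print(describe(nil))    --> falsy
    print(describe(false))  --> falsy
    print(describe(0))      --> truthy   (unlike Python, C, or JavaScript)
    print(describe(""))     --> truthy
    print(describe({}))     --> truthy   (an empty table is truthy as well)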
-- Only nil and false are falsy; 0 and '' are true!
Didn't read further. Bad language design. Must die.