Launching Mathematica 10 (stephenwolfram.com)
84 points by co_pl_te on July 9, 2014 | 79 comments


This made me laugh:

http://www.wolfram.com/mathematica/new-in-10/key-value-assoc...

Version 10 introduces fundamental new constructs called associations. They associate keys with values, allowing highly efficient lookup and updating, even with millions of elements. Associations provide generalizations of symbolically indexed lists, associative arrays, dictionaries, hashmaps, structs, and a variety of other powerful data structures.


A very broad generalization of hash tables has been built into the language since day 1, so this feature is mostly for people who just want a structure they are already used to (or who don't want the extra behavior, for performance reasons or whatever).

Hash-tables in previous versions of Mathematica:

    >mymap["abc"]=1
    >mymap[2]=2
    >mymap[hello]=3
    >mymap["abc"]
    1
    >mymap[2]
    2
    >mymap[hello]
    3
Whoa:

    >myDerivative[a_ x_^n_] = n a x^(n - 1)
    >myDerivative[1.1y^3]
    3.3 y^2


More specifically, the previous hash-table-like method is just assigning constants as values for a function.

    In[1]:= mymap["abc"]=1;
            mymap[2]=2;
            mpmap[hello]=3;
            DownValues[mymap]
    Out[4]= {HoldPattern[mymap[2]]:>2,HoldPattern[mymap[abc]]:>1}
Defining a function is really just defining a pattern that is matched against the arguments and replaced with an expression that has the arguments substituted into it. And you can set a constant key argument to a constant value, creating the odd built-in hash-table-alike which has been around forever.
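
For instance, a quick sketch of what gets stored for the derivative rule above (the exact output formatting may vary by version):

    In[5]:= myDerivative[a_ x_^n_] = n a x^(n - 1);
            DownValues[myDerivative]
    Out[6]= {HoldPattern[myDerivative[a_ x_^n_]]:>a n x^(-1+n)}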

That said, Association[] looks nice. Presumably it gets a speed improvement by bypassing the pattern matching. And I remember being annoyed when working with JSON data in Mathematica; this looks nicer. The literal syntax is appreciated too. Actually, I'm kind of surprised they didn't choose another weird Unicode symbol like 〚 for the literal syntax.
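
For reference, a rough sketch of that literal syntax with some JSON-ish nested data (the keys here are made up):

    In[1]:= data = <|"name" -> "bob", "tags" -> {"a", "b"}, "meta" -> <|"id" -> 7|>|>;
    In[2]:= data["meta"]["id"]
    Out[2]= 7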


Well, unlike adding and changing definitions for symbols in the symbol table, associations are actually immutable. So you can pass them around as values, derive modified versions, and the old versions are still available unchanged. That's just not true of using downvalues of symbols.

Also, using definitions on symbols to store key-values is incredibly limiting, because you can't do reverse lookups without getting into very low-level (and slow) hackery via DownValues. Or list the keys or values. Or even know how many keys you have.
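
For contrast, all of those are one-liners on an association (a sketch; the reverse lookup via Select is illustrative, not a dedicated built-in):

    In[1]:= assoc = <|"a" -> 1, "b" -> 2, "c" -> 2|>;
    In[2]:= {Keys[assoc], Values[assoc], Length[assoc]}
    Out[2]= {{"a", "b", "c"}, {1, 2, 2}, 3}
    In[3]:= Keys @ Select[assoc, # == 2 &]  (* keys whose value is 2 *)
    Out[3]= {"b", "c"}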

Associations really will replace basically all uses of symbol downvalues for storing key-value pairs. And Associations have really nice general-purpose functions like GroupBy, Counts, Merge, and so on (mentioned below).

Associations even support function application syntax, so you should be able to drop them into existing code that expects symbol downvalues, as you're using them above.
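
E.g., a minimal sketch of that drop-in use, mirroring the mymap example above:

    In[1]:= mymap = <|"abc" -> 1, 2 -> 2, hello -> 3|>;
    In[2]:= mymap[hello]
    Out[2]= 3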

Associations are the single most significant language-level improvement we've had in many versions. And we've integrated them really nicely across the system -- they're not just some special-purpose data structure.


Yeah, using definitions on symbols to store key-values is indeed incredibly limiting.

Association seems very similar to List: immutable, really good Part syntax for accessing things in deeper levels, etc. I've started to read some of the docs on it too, it looks very nice.


Shades of DEF FN from Applesoft BASIC!


Sounds like a hashtable... is that what made you laugh?


I think adding hashtables to a high-level language in its 10th release, 26 years after its inception, made him/her laugh.

As a note: IIRC, SWI-Prolog got hash tables in its 7th release, so a fundamental data structure can arrive in a language very late when the language developers don't see it as needed.


To be fair, fast immutable hash tables have only been around as a technology since around 2005. (Yes, Mathematica is mostly based on immutability.) Outside of internal R&D, Wolfram never adds experimental methods to Mathematica.

Of course, it's always had lists of rules as a far more general (though much slower) form of the hash table concept.


I assume you mean persistent, not immutable.

    a[x] = 1
    b = a
    a[y] = 2  <---- mutates a
    a[x] = 3  <-- is this legal?
    b[x]    <--- persistent- uh, what is Mathematica's semantics here?


Yes, Associations use the same basic algorithm as Scala and Clojure's hashmaps, and are indeed persistent.

And b[x] there gives 1 as you would expect.


Do you know when that change was put into place? I have been using Mathematica since version 5 and my intuition was that `b[x] == 3`. Testing in Mathematica 8, which I have on hand, confirms this.


Mathematica 8 doesn't have associations, so you must be testing symbol assignments.

Write a = <||> before running the above code to test this on associations; associations are persistent, the symbol table is not.
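
That is, a sketch of the adjusted test:

    a = <||>
    a[x] = 1
    b = a
    a[y] = 2    (* a now holds a new association; b still holds the old one *)
    a[x] = 3    (* legal *)
    b[x]        (* 1 *)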


No, I meant lists of rules like {a->1,b->2}, which capture the same kind of patterns as in the previous example but in an immutable (and copy-on-write) way.
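
A sketch of that older style:

    rules = {"abc" -> 1, 2 -> 2, hello -> 3};
    "abc" /. rules                  (* 1 *)
    Append[rules, "xyz" -> 4]       (* a new list; rules itself is unchanged *)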


Just to clarify, you mean fast immutable hash tables that support insertion, right? (Which I agree is hard.) Otherwise you can just use a BST or something and it's fast enough for lookups.


Associations aren't just a data structure, they've been designed to fit in a sensible way into the rest of the language, via a principle I call the "central dogma". This means they work in a predictable way with a huge number of existing functions (though we still have more to do).

For example, Associations interact naturally with the hierarchical part-specification language used by Part (http://reference.wolfram.com/language/ref/Part.html):

   In[1]:= people = { 
      <|"name" -> "bob", "age" -> 20, "sex" -> "M"|>, 
      <|"name" -> "sue", "age" -> 25, "sex" -> "F"|>, 
      <|"name" -> "ann", "age" -> 18, "sex" -> "F"|>
   }; 
   
   In[2]:= people[[ All, "age" ]] (* extract list of ages *)
   Out[2]= {20, 25, 18}

   In[3]:= people[[ All, "sex" ]] (* extract list of sexes *)
   Out[3]= {"M", "F", "F"}

   In[4]:= people[[ 2, "age" ]] (* extract age of 2nd person *)
   Out[4]= 25

   In[5]:= people[[ 2, {"age","sex"} ]] (* extract age and sex *)
   Out[5]= <|"age" -> 25, "sex" -> "F"|>
This naturally generalizes to 'indexed tables', in which the outermost list becomes an association, because associations serve double-duty as "structs" and "hash-maps", just like lists are used for both "vectors" and "tuples":

   In[6]:= people = <|
      236234 -> <|"name" -> "bob", "age" -> 20, "sex" -> "M"|>, 
      253456 -> <|"name" -> "sue", "age" -> 25, "sex" -> "F"|>, 
      323442 -> <|"name" -> "ann", "age" -> 18, "sex" -> "F"|>
   |>; 
   
   In[7]:= people[[ All, "age" ]] (* extract association between ID and age *)
   Out[7]= <| 236234 -> 20, 253456 -> 25, 323442 -> 18|>

   In[8]:= people[[ All, "sex" ]] (* extract association between ID and sex *)
   Out[8]= <| 236234 -> "M", 253456 -> "F", 323442 -> "F"|>

   In[9]:= people[[ Key[323442], "age" ]] (* extract age of person with ID 323442 *)
   Out[9]= 18

   (* extract age and sex of person with ID 323442 *)
   In[10]:= people[[ Key[323442], {"age","sex"} ]] 
   Out[10]= <|"age" -> 18, "sex" -> "F"|>

The uniform addressing scheme behind Part (and Extract, Position, etc) is tremendously useful in day-to-day code, because it makes it much easier to write programs as functions that transform potentially complex, hierarchical data in a series of steps.

This is similar in some ways to the ideas behind Haskell's lens library, Clojure's assoc-in and friends, even the schemes used in jQuery and XPath. But it's core to WL.

The semantics of Part are also extended to become a full-fledged query language, as used by Dataset (http://reference.wolfram.com/language/ref/Dataset.html):

   (* load a dataset of passengers of the Titanic *)
   titanic = ExampleData[{"Dataset", "Titanic"}]

   (* produce a histogram of passenger ages *) 
   titanic[Histogram, "age"] 

   (* produce a histograms for 1st class, 2nd class, etc.. *)
   titanic[GroupBy[Key["class"]], Histogram[#, {0,80,4}]&, "age"]  
There are also some really nice functions to work with associations, like the map-reduce-like GroupBy (http://reference.wolfram.com/language/ref/GroupBy.html):

   (* split sentence into list of words *)
   In[16]:= words = StringSplit["it was the best of times it was the worst of times"] 
   Out[16]= {"it", "was", "the", "best", "of", "times", "it", "was", 
      "the", "worst", "of", "times"}

   (* group words that have the same length *) 
   In[17]:= GroupBy[words, StringLength] 
   Out[17]= <|
      2 -> {"it", "of", "it", "of"}, 
      3 -> {"was", "the", "was", "the"}, 
      4 -> {"best"}, 
      5 -> {"times", "worst", "times"}
   |>
   
   (* reduce each group into an association of counts *)
   In[18]:= GroupBy[words, StringLength, Counts] 
   Out[18]= <|
      2 -> <|"it" -> 2, "of" -> 2|>, 
      3 -> <|"was" -> 2, "the" -> 2|>, 
      4 -> <|"best" -> 1|>, 
      5 -> <|"times" -> 2, "worst" -> 1|>
   |>
And Counts and CountsBy (http://reference.wolfram.com/language/ref/CountsBy.html):

   In[21]:= CountsBy[words, StringLength]
   Out[21]= <|2 -> 4, 3 -> 4, 4 -> 1, 5 -> 3|>
And AssociationMap (http://reference.wolfram.com/language/ref/AssociationMap.htm...):

   In[23]:= AssociationMap[WordData[#, "PartsOfSpeech"]&, words]
   Out[23]= <|
      "it" -> {"Pronoun"}, "was" -> {"Verb"}, 
      "the" -> {"Determiner"}, 
      "best" -> {"Noun", "Adjective", "Verb", "Adverb"}, 
      "of" -> {"Preposition"}, "times" -> {"Noun"}, 
      "worst" -> {"Noun", "Adjective", "Verb", "Adverb"}
   |>
Here's some more info about associations: http://reference.wolfram.com/language/guide/Associations.htm...


I'm very impressed by this stuff--especially the new computational geometry features.

I'm just a little hesitant to do any meaningful work in such a closed ecosystem.


For a non-commercial user, it's hard to see Mathematica as more than a TODO list for http://www.sagemath.org


I'd like that to be true, but I think it's still some years away from being a similar experience. Sage is impressive but still very much shows its seams: it's trying to glue together a bunch of separately developed projects, with their own ideas about things (everything from Maxima to R), and the glue is often pretty noticeable if you do anything remotely complex.

(That said, I also avoid using Mathematica for most things because I'm squeamish about ending up with any significant project too closely tied to a proprietary platform.)


> I'm just a little hesitant to do any meaningful work in such a closed ecosystem.

It's interesting. I do sometimes wonder if it's being held back because it's proprietary. Other companies have proved it's possible to monetise an open source product, and Wolfram Research is a private company so they can really do what they want in that regard. I want to see it succeed because I think it's a great product, and a very clever one at that, but I can't justify doing research that uses it when there are open source implementations I can use instead.


I wish that when Wolfram passes on (no offense!), his heirs will open-source it (at least for non-commercial use, not necessarily BSD-style) as a gift to humanity, while still maintaining a commercial business-support structure.


I once tried to do meaningful work and, while things such as the NDSolve[] function are positively amazing, programming in Mathematica (which means using its editor) is a huge pain when you are used to, say, vim.

In the end I moved everything I had to the Python scientific stack and I'm very happy I did it.


You need to get used to the notebook concept. In the end you'll find that it's much more convenient for interactive work than a traditional command line. For non-interactive work you can still use Vim if you like.

In fact Mathematica shows its advantages most when used interactively (compared to languages, like Python, which were not really designed for this).


Python and Julia have a nice interactive notebook: IPython/IJulia.


Here's the list of what's new since version 9 of Mathematica: http://www.wolfram.com/mathematica/new-in-10/


ZIPCodeData and many other geographic functions are missing and have badly mangled documentation. I assume something went wrong in an automated build process. This is on the Mac OSX release.

So far ZIPCodeData[], NeighborhoodData[], MountainData[], BroadcastStationData[], MovieData[], BuildingData[], and PersonData[] are all mangled.

Worse yet, URLFetch is broken also. Something bad happened, at least with the Mac release.

In[1]:= URLFetch["http://www.google.com/"]

General::unavail: ExportString is not available in this version of the Wolfram Language.

ImportString::string: First argument ExportString[{60, 33, 100, 111, 99, 116, 121, 112, 101, 32, 104, 116, 109, 108, 62, 60, 104, 116, 109, 108, 32, 105, 116, <<19757>>, 111, 100, 121, 62, 60, 47, 104, 116, 109, 108, 62}, <<2>>] is not a string.

Out[1]= ImportString[ExportString[{60, 33, 100, 111, 99, 116, 121, 112, 101,


What is your value of $Version and SystemInformation["Small"]? I can't replicate these problems on any of the 3 platforms using freshly-downloaded versions.


Would having an old version installed make a difference? I renamed my current install to "mathematica backup.app" and then installed Mathematica 10. It feels like some part of Mathematica has failed without adequately reporting it.

Mac OSX 10.9.4

uname -a: Darwin nyx.local 13.3.0 Darwin Kernel Version 13.3.0: Tue Jun 3 21:27:35 PDT 2014; root:xnu-2422.110.17~1/RELEASE_X86_64 x86_64

Session log below.

Mathematica 10.0 for Mac OS X x86 (64-bit) Copyright 1988-2014 Wolfram Research, Inc.

In[1]:= $Version

Out[1]= 10.0 for Mac OS X x86 (64-bit) (June 29, 2014)

In[2]:= $VersionNumber

Out[2]= 10.

In[3]:= SystemInformation["Small"]

Out[3]= {Kernel -> {SystemID -> MacOSX-x86-64, ReleaseID -> 10.0.0.0 (5098698, 5098537), CreationDate -> DateObject[{2014, 6, 29}, TimeObject[{20, 38, 32}]]}, FrontEnd -> {OperatingSystem -> $Failed, ReleaseID -> Missing[NotActive], CreationDate -> DateObject[$Failed]}}

In[4]:= URLFetch["http://www.google.com/"]

General::unavail: ExportString is not available in this version of the Wolfram Language.

ImportString::string: First argument ExportString[{60, 33, 100, 111, 99, 116, 121, 112, 101, 32, 104, 116, 109, 108, 62, 60, 104, 116, 109, 108, 32, 105, 116, <<19641>>, 111, 100, 121, 62, 60, 47, 104, 116, 109, 108, 62}, <<2>>] is not a string.

Out[4]= ImportString[ExportString[{60, 33, 100, 111, 99, 116, 121, 112, 101, 32, 104, 116, 109, 108, 62, 60, 104, 116, 109, 108, 32, 105, 116, 101,

-- Snip --

In[5]:= ExportString[]

ExportString::argrx: ExportString called with 0 arguments; 2 arguments are expected.

Out[5]= ExportString[]


Looks like it... I would delete both from disk, redownload 10, and reinstall it. Also check that you didn't run out of disk space.

If all else fails, phone our technical support at 1-800-WOLFRAM, they're pretty good.


I checked the MD5sum given on the Wolfram website (b58c6bb7393f23137355da923f2734fc); it matches.

Completely removing my previous copy of Mathematica and nuking my ~/Library/Mathematica directory made the documentation formatting errors go away. I'm not sure if that was the cause or not.

However, I still get ExportString messages out of many functions.

General::unavail: ExportString is not available in this version of the Wolfram Language.

EDIT: I'm trying to hunt down why I get the unavail error message. Might this be a limitation of a trial Mathematica license?


Turns out the trial version is (probably accidentally) quite broken: http://community.wolfram.com/groups/-/m/t/290210?p_p_auth=uf...


This is an impressive release.

GeoGraphics is an amazing example of the power of integration between domains. Built-in data about the world, with semantic integration into graphics as well as data analysis. This is way beyond what any existing mapping solution has delivered.

The geometry stuff is also amazing and very useful for generative design and "making"


I hate to be that guy, but I wish to share my opinion about closed source mathematical software.

There is no doubt that what Wolfram Research has done with Mathematica is amazing and tempting. It is a very complete and uniform engine, and can be very useful for very different kinds of mathematics.

However, Wolfram Research deliberately keeps their methods and source code closed. Note that this is more serious than just the "Stallman-esque" open-source-everything philosophy. Wolfram insists that users do not need to know implementation details of their methods. This is plainly in their documentation. You can see the uncompelling argument from Wolfram here [6]. The gist of the argument is that interfaces matter, not implementations.

I strongly argue that users, especially mathematicians and engineers, should care about the internals of mathematical software, especially when it's being used, even in a utilitarian fashion, for research and engineering.

Not only this, but Wolfram has litigated against his own employees for publishing mathematical proofs about cellular automata. Information about this lawsuit is sparse, but evidence of it can be seen in [0]. More information can be found here [1].

Unfortunately, most responses to the above from users of Mathematica are "well, I just use Mathematica as a calculator, nothing serious" or "I wouldn't look at the source code anyway, so what gives?" It's an unfortunate response, and I don't have a technical rebuttal, but a moral one, which many don't want to hear.

It pains me to see the technical reliance on Mathematica (and other software such as MATLAB) among professional mathematicians, scientists, and engineers. It reminds me of an addictive drug; one of the best hackers I know does their work completely in Mathematica and can no longer work without it.

As is the case with a lot of closed source, proprietary software, there aren't a ton of good alternatives. There is a plethora of logistical issues with existing computer algebra systems, but I nonetheless recommend them. Sage [2] is a continuously growing system based on Python which has backing from a lot of mathematicians. They are continually improving it. There's also Maxima [3]. None of these has quite as extensive an array of functionality and graphical capabilities as Mathematica.

I (and others) have written about this issue more extensively here [4] for those interested. This is an extension of the article written by Jordi G. Hermoso [5].

If you took the time to read this, thanks.

[0] https://groups.yahoo.com/neo/groups/theory-edge/conversation...

[1] http://vserver1.cscs.lsa.umich.edu/~crshalizi/reviews/wolfra...

[2] http://www.sagemath.org/

[3] http://andrejv.github.io/wxmaxima/

[4] http://symbo1ics.com/blog/?p=69

[5] http://www.symbo1ics.com/files/jordi.pdf

[6] http://reference.wolfram.com/language/tutorial/WhyYouDoNotUs...


My impression is that you don't hate to be that guy, but actually love to express this opinion.

Open source systems like Sage always look desirable, simply by virtue of being open source. But every time I look at it I'm left with a very bad taste in the mouth because of the constant badmouthing of non-open-source systems that is going on in that community. Companies do that sort of thing, and it doesn't inspire trust. But we know that it can happen just because a few people in the management made bad decisions. But when a community (!) around an open source (!) system takes on that attitude, it looks much worse. Don't you realize you're driving people away?

Why not put all that energy into improving your own system instead of trying to actively hinder others? Examples of that are forking GMP and making it GPL (not LGPL); actively pointing out to people (as Mr. Hermoso did to me) that no, you can't link Octave to Mathematica because Octave is GPL (which is just a hindrance to my research, as well as to others'); and building on the fallacy that results obtained with Sage are inherently better because Sage, being open source software, is _theoretically_ verifiable. All software is buggy, and the only thing that makes a research result more trustworthy is if it is indeed verified, not if it's theoretically verifiable, but no one ever does it. Practical verification is almost never about reading the source code. It's about making sure the result is consistent and computing it with alternative tools.


No, I don't love talking about it. I actually find it saddening and arduous.

I don't consider what I said "bad mouthing". Maybe it was. I tried to be as respectful as possible, and provide links where I could.

I am certainly trying to improve existing systems. I've written a library for doing computational group theory, for which a paper was just published, and I plan to include it in Maxima.

Regarding verifiability, Sage has a lot more going for it than "theoretical verification". Professional mathematicians, especially those in algebraic combinatorics, regularly hold conferences and write software along with papers to show correctness of the system, and write new mathematically grounded functionality.

I apologize to both the authors of open source systems and to potential consumers of such systems if I am driving them away. My goal is to at least spark the idea for one to step back and evaluate what it means/implies/etc. to make use of proprietary mathematical systems, especially in professional or academic settings.


Results obtained with Sage are better because Sage is open: in a mathematical research paper you can say:

"This reduces the problem to computing blah, which we did using the following Sage code. The function foo used here uses the algorithm of X and Y as described in their paper [XY2006]"

You can't say:

"This reduces the problem to calculating blah which the following Mathematica code computes using an unspecified algorithm for which there is no accompanying paper proving correctness."

Of course a paper proving that an algorithm is correct can contain errors, and also even if the proof of correctness is fine the actual implementation can contain bugs. But if you have no way of knowing how something is computed and whether anybody at any point in time even tried to prove mathematically that the method used is correct, you have no moral authority to rely on the result. That's just the standard adopted in mathematics: you can depend on results you have good reason to believe are true and are documented in the literature; you can't rely on stuff that's not written about. I don't know how citing results works in other areas, but that's how it is in mathematics (at least in the fields of mathematics I'm familiar with).


I guess it depends on one's field.

I've never cited software I didn't write myself by simply saying "trust this software, here's the reference." I wouldn't trust a result from Sage any more than one from Mathematica simply based on which system produced it. I'd trust that x is a solution of an equation if substituting it back verifies it. I'll trust that two graphs are isomorphic if the software gives me a vertex permutation that makes the adjacency matrices identical. It doesn't matter how that isomorphism was computed.

When publishing work, I'll aim to make it verifiable this way.
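
(To make the point concrete, a Mathematica-flavored sketch of that kind of check, verifying solutions by substitution rather than by trusting the solver:)

    sol = Solve[x^3 - 2 x + 1 == 0, x];
    Simplify[x^3 - 2 x + 1 /. sol]    (* {0, 0, 0}: the solutions check out *)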

I believe the vast majority of the use of these systems is not of the type where one needs to blindly trust the software and refer back to it in the paper. At least in my field (physics) it isn't. Yet I use programs like this daily, and I clearly depend on them for my work.

Most of the functionality available in Mathematica (or, I'd argue, in most similar systems) is not of the type that one needs to cite. It either uses standard and well-known algorithms that are available in a multitude of systems (do you cite the methods for matrix multiplication or eigenvalue computation, and would it make a difference?), or the results are much easier to verify than to compute.

In those cases when I need to rely on a published method, like you mention, the method is very unlikely to be a built-in part of any system. So I either need to re-implement it, or use the original code of the authors. If the authors implemented their method in Mathematica instead of Python, does that make their program less reliable? No. It's still a published method, anyone can verify it.

My point is that I hear this argument about Sage very often, and the typical generalization is: "if you used Mathematica for your research, that's wrong, because it's not verifiable". This is a fallacy. It completely ignores how these systems are used in practice, and implies that results from open source software are somehow magically reliable (they're not) and don't need verification (they do).

I've yet to come across a situation where the argument does apply at all: point me to a paper which goes truly wrong by citing Mathematica/MATLAB/Maple/etc this way.


Note that I'm not saying that one can claim that a result is correct, a theorem is true, etc. based on the fact that some undocumented algorithm produced it. That's clearly unacceptable.

Nor am I saying that it's never necessary to rely on an algorithm to get such a result.

What I'm saying is that when people use Mathematica or other closed source systems, they do not usually commit these mistakes.

Also note that Mathematica programs can be open source and documented (many are). Several built-in packages have accessible and documented source code (e.g. Combinatorica). There's nothing wrong with using these to obtain such a result, and cite the (public and documented) program used to create it.


I think we're in complete agreement: most of the time you don't need a citation for a program you use because the result can be easily verified. And I agree that what you call a fallacy is a fallacy; I was just pointing out that open source can be citable in a way that closed source isn't, and that that is an advantage.


People are apt to discount Mathematica completely. I don't see any problem using Mathematica to generate some results because the author prefers Mathematica over other offerings. However, it would give the results much more credence if the Mathematica-obtained result was then replicated using open software.

I would expect that the difficulty of the port could vary widely between different use cases.


> actively pointing out to people (as Mr. Hermoso did to me) that no you can't link Octave to Mathematica because Octave is GPL (which is just a hindrance for my research, as well as to others)

What exactly did you want to do?

GPLv3 (which is the license Octave uses) does not always prohibit linking GPLv3 code with proprietary code. In particular, if you want to hack up a private copy of Octave for your own use, and do not distribute that to others, that's fine.

The key grant of rights is this, from section 2: "You may make, run and propagate covered works that you do not convey, without conditions so long as your license otherwise remains in force".

A covered work is "either the unmodified Program or a work based on the Program".

"Convey" means "any kind of propagation that enables other parties to make or receive copies".

If you are just doing stuff for your own private use, you are not conveying, and so that grant of rights to "make, run, and propagate covered works...without conditions" applies to you.


We were considering making http://matlink.org/ compatible with Octave. The feedback I got on this was part of why this wasn't done. To make MATLink user friendly, it needs to come with compiled binaries, which would be linked against Mathematica's closed source MathLink library.

If it is the case that GPL doesn't forbid this, I'd love to hear about it.


Yes, you can do that for internal (ie private) use.

The only caveat would be if your job is at a university, and you plan to give copies to students. Distribution would be legally impossible.


I should add that every time I asked WRI support about implementation details, I did receive an answer with references to the method used.


Did they ever share any of their algorithms created in-house? (They claim many are.) If they share the method, why can't they share the code?

They do have some notes on internal implementation, that do not seem up-to-date, here [0]. Were they any more detailed than this?

[0] http://reference.wolfram.com/language/tutorial/SomeNotesOnIn...


While well-meaning, sentiments like these are part of the problem.

The proposed frame is "mathematica is somewhat better in polish and functionality, but we should stick to OSS on principle."

The problem is, by this argument, no one will actually ever know what Mathematica is, what it does, and what cool ideas it had that could potentially inspire further work.

All the examples of alternative software have lots to learn from Mathematica; but thinking of Mathematica as a collection of algorithms for math is both wrong and misses the point. Mathematica is increasingly about knowledge computing; it's moved on from where other systems are only now thinking about getting to.

Instead of competing with and replacing mathematica, everyone would be better off first learning from it, and then trying to apply and extend the fundamental principles in their own work.


I think you put your finger on it... people talking about Mathematica/Wolfram Language as a CAS are really chasing an idea of what it was 10 or 15 years ago.

Maybe I'm naive but I think that we really can just "all get along" -- notebook-based programming is just one example of how ideas can incubate in closed-source projects, and end up benefiting open-source projects. I'm hoping knowledge-based computation ends up being another.

And the reverse happens too -- Light Table is tremendously exciting, and I hope WRI can learn from that as it develops.


Mathematica is decades ahead of LightTable


Can you list the things that Mathematica does that you think LightTable "should" do?


I am not proposing that Mathematica is "somewhat better". Mathematica is vastly better for many things, especially visualization.

Other systems should "learn" from it, sure. It is hard to change Maxima, unfortunately, since it's entrenched in its roots from the 1960s. That's no excuse for making it difficult to use.

I think Sage's Python interface is doing better and better, learning what it needs to learn from Mathematica, including its documentation and interface.

I definitely don't agree with you that everyone should use or learn Mathematica just to learn from it, though. As a student of computer algebra, I'd have to contest that Mathematica actually does computer algebra even remotely correctly. A much more beautiful system for doing computer algebra was Axiom [0].

[0] http://www.axiom-developer.org/


I am worried that Sage will get stuck in the mud as its Python code base grows, without a powerful, expressive functional core like Mathematica's Lisp-ish core language. For example, see the discussion on this page about simple association lists vs. what Mathematica 10 just launched.


Have you noticed that in Mathematica you're not even able to make your own opaque data types? Only Wolfram has that ability.

Python is usable for actually writing software systems and algorithms. When a Mathematica code base grows over 10 lines, it becomes virtually unmaintainable in my experience.


You can make your own opaque data objects, quite easily, thanks to HoldAll, UpValues, and Internal`SetNoEntry (the nuclear option).
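
For a concrete flavor, here's a minimal sketch of a binary search tree built purely from inert heads and pattern-based rules (node and empty are hypothetical symbols, not built-ins; a genuinely opaque type would additionally hide its internals behind UpValues or the attributes above):

    insert[empty, v_] := node[empty, v, empty]
    insert[node[l_, x_, r_], v_] /; v < x := node[insert[l, v], x, r]
    insert[node[l_, x_, r_], v_] /; v >= x := node[l, x, insert[r, v]]

    Fold[insert, empty, {5, 3, 8, 1}]
    (* node[node[node[empty, 1, empty], 3, empty], 5, node[empty, 8, empty]] *)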

What precisely becomes unmaintainable about Mathematica/Wolfram Language code after 10 lines? You could just be bad at programming in WL.


Name[Field1, Field2, ...] is not an opaque data type. Can you give me an idiomatic example of how to build, say, a binary tree? Or maybe something more complicated like a doubly linked list?

I do not write Mathematica code, but most code I've seen usually ends up being this mess of functions. I'll give you that maybe the code I've seen has just been bad, so we can ignore my point there.


People into traditional programming languages look at Mathematica and see abstractions for which they see no purpose.

People into PL research look at Mathematica and scoff at its low-brow, for-the-masses term rewriting, lacking whatever theoretical property is deemed "essential" that week.

Yet somehow the language has formalized, and made computable and consistent, more domains of math, science, and (increasingly) data than any other.

There is no honor or glory in purposeful ignorance.


I disagree that it has made more domains of math computable and consistent. Its syntax is consistent, but its evaluation semantics are wildly inconsistent. This is noticeable when you use some of their internal simplification algorithms on non-trivial problems.

I think Axiom covered more surface area in terms of mathematics than Mathematica. Mathematica is mostly good at computing over the reals and complexes and doing term-rewriting algebra. Axiom supported arbitrary algebraic structures.


That doesn't preclude Wolfram releasing the source code for the calculation engine.


Sometimes it's just a matter of signing an NDA to see the source code. I use a pretty expensive ($100k) EE simulation package and have questioned model implementations on a few occasions. The developers had no issues sending me the code in question for a simple 1 page NDA.


I think you are a bit confused. I would compare Mathematica to an old digital desktop calculator. Nobody who used those wanted to see their insides. The point is not that Mathematica has some closed source pixie dust that no one else can understand or implement. The value of the software comes from the fact that you need a lot of grunt work in a large software package to maintain the user experience - fixing all the unfun bugs, etc. - and very few open source projects manage to attract enough interest to have people also do the grunt work, not just the interesting new development or 'cool refactorings'.


Interesting features, but I feel like it's a lost cause, as it's not worth investing in a proprietary platform.


Tell that to the millions of engineers that invest in Matlab. But seriously, I've found lots of use for Mathematica's visualization techniques whose results (images and animations) are certainly platform independent.


What platform is not proprietary?


Octave, SciPy, etc.

I'm not aware of a replacement for Mathematica's symbolic manipulation that's even close. It's good enough that it's scared any pretenders out of the castle.


Does anyone know if Mathematica 10 is retina-ready for Mac yet? This may seem petty to some, but when you do a lot of data visualisation, it really does matter.

As an aside, I've been using IPython Notebooks a lot instead of Mathematica over the past year and have generally been really happy with the change – not least because I know I won't be locked out of my notebooks if I can't afford a Mathematica license in future! That all said, I recently revisited a Mathematica notebook I'd made in the past, and it reminded me why I started using it in the first place. It really is a great piece of software: great libraries, great performance, generally a pleasure to use.


> Does anyone know if Mathematica 10 is retina-ready for Mac yet? This may seem petty to some, but when you do a lot of data visualisation, it really does matter.

Yes it is.


Great! Thank you. Will look forward to seeing some of the new visualisation styles in high-DPI :)


We'll see if they fixed the proxy support so it can get through our corporate firewall; otherwise the Wolfram|Alpha integration is useless. No other programs have a problem except Mathematica.


I like to dump on Wolfram as much as anybody else, but tbh, these really are nice looking visualizations backed with a nice set of general purpose tooling.


£2050+VAT = yowsers.


300 for non-commercial use.


Also based on the Mathematica 10 kernel is the Programming Cloud which you can try for free: http://www.wolframcloud.com/


I don't know if it is launch-day blues, but on a new MacBook Pro, the demo notebooks are sluggish, and some of the examples throw errors.

And it uses "CTRL" to mean "Command" throughout. Tantalizing, but not a perfect first impression, and it reinforces the feeling that I'd be let down if I strayed from the path of polished examples and into real work.

...and it seems I corrupted the master copy of the "Things to Try" notebook -- clicking on the link again, from the home page, launches the corrupted notebook I made earlier.


Honest question, who uses Mathematica commercially? What industries?



I do. I use it as an advanced calculator for doing complex mathematics too hard to deal with by hand. To answer what industries, I do everything from semiconductor device physics to databases and machine learning.


Or you can get a Raspberry Pi; I believe it has had a pre-release of version 10 for a while now.

It'll just be slow, though...


Can the RPi version use Wolfram Cloud to speed it up?


You can see it rendering its buttons and menus.. slowly. Typing may take 10 seconds to respond. It is completely unoptimized for ARM, or at least was when I tried it.


It's true it can't use libraries like the Intel Performance Primitives on ARM, but that's not why the UI is slow on RPi. RPi's UI is slow because all graphics-intensive GUI stuff on RPi is slow. Firefox is unusable, for example.



