(no subject)

Feb 21, 2008 16:53

I went to an all-day sales-pitch conference sponsored by Wolfram Research yesterday, all about just how damn cool the new version of Mathematica is. There's a lot to like: as a language, it seems to have a nice blend of Lisp-like and APL-like features, so you can do all your standard functional programming tricks and what looks like a decent subset of array programming tricks, as well as writing normal imperative code. The standard library is, of course, vast, with loads of clever symbolic, numerical, graphics and GUI code built in, and in the new version there's also lots of standard geographical/scientific/financial/etc data available, import and export filters for loads of standard formats, and other niceties.

One thing I really liked was the Manipulate[] function: hand it an expression (which can evaluate to a number, a symbolic form, a graph, a 3D plot, a sound file, an animation, or whatever) and a list of parameters, and it will automagically construct a GUI widget with sliders and checkboxes that allow you to manipulate the parameters interactively and observe the result. You can even control the parameters using a gamepad, if you want...

They seem to have made a major effort to make everything interoperate smoothly in the new version - one slightly silly demo they showed us put slider bars in as the limits of an integral, with the value of the result changing as the bars were dragged about. That was always the major problem with open source mathematics software, from my limited experience - nothing does everything, so you have to learn N different incompatible sublanguages, write loads of glue code, and constantly switch applications. The Sage guys seem to be working on this, though - I'll have to check it out.

Have a look at the big collection of Mathematica demos at http://demonstrations.wolfram.com, which includes a lot of examples of Manipulate[]. There are videos, or you can download a free-as-in-beer notebook reader.
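To give a flavour of what this looks like in code, here's a minimal Manipulate[] sketch of my own - a toy example written from memory rather than taken from the demos site, so treat the details as approximate:

    (* A sine wave whose amplitude and frequency are driven by sliders;
       dragging either slider redraws the plot immediately. *)
    Manipulate[
      Plot[a Sin[f x], {x, 0, 2 Pi}, PlotRange -> {-3, 3}],
      {{a, 1, "amplitude"}, 0, 3},
      {{f, 1, "frequency"}, 1, 5}]

Swap the Plot for a symbolic expression, a 3D plot or a Play[] and the same pair of sliders drives that instead - which is most of the appeal.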

In other news, I've been having a bit of a play with the NetBeans IDE for Java, and really liking it. I've got used to doing everything in vi and on the command line, which has its upsides, but IDEs can make life so much easier for the beginner. In particular, NetBeans' wiggly red underlining has been a huge help in learning the language, and the integrated documentation browser is very nice. I haven't needed the automated refactoring support yet, but it's fun to play with - Select! Click! Extract Method! :-)

But here's my question - why is it so slow? I know it's written in an interpreted language, but so is Emacs, and that runs without too much complaint on 1980s hardware. And the compiler's written in C, unless I'm much mistaken, and that's slow as hell too. Or, conversely, why were the compilers that shipped with Delphi and Turbo Pascal so fast? Simple Java programs take several seconds to compile on my 1GHz machine, whereas their Pascal equivalents would have compiled in an eyeblink on its predecessor's predecessor[1]. Is there something about Pascal that makes it especially easy to compile, and if so, what is it? Java seems at least as regular to me, and generating bytecode ought to be easier than generating native code. Or is Anders Hejlsberg just a ninja?

Thesis now at 63 pages and 22563 words, according to wc *.tex, which means that I've written 20 pages and, um, several thousand words in the last sixteen days (nearly 1000 words today, but many of those were "XXX proof here"). Progress is being made, though there's an awful lot still to do.

[1] I'm sure I've mentioned the fifteen-minute link times for our medium-sized C++ app when I was working at $company: while our network of file dependencies wasn't as bad as it could have been, it still resulted in the linker having to do a lot of work. And while compiling can be distributed easily around a network, linking can't :-(

java, computers, thesis, programming, conferences, maths, mathematica
