Ada

Jan 08, 2008 03:39

So I read about Ada today -- I was prompted by reading this article, in which some CS professors gripe about Java as a first language.

Blind, uninformed, and ignorant prejudice proved fortunate for me in this case -- I am lucky to have avoided Java like the plague until long after learning how to learn a programming language. Let's face it -- even though Java is actually really awesome, Java also sucks, and who could have understood my prescience in avoiding it :) -- no offense to the amazing and wonderful Dr. Trytten, of course!
EDIT: That deserves more thought. Dr. Trytten is a perfectly fantastic professor. Teaching Java first is part of a reasonable strategy that probably works for a lot of people, particularly, I suspect, for those who continue learning after Java. I have heard of alumni returning to campus and recommending that everyone from OU (where Java is taught first) beef up on their C++ -- the article also mentions that several companies are unhappy with the crop of programmers who were first fed Java. The second concern addressed in the paper -- that "[Java-first] students had no feeling for the relationship between the source program and what the hardware would actually do" -- rings true for me.

I began using GW-BASIC with the understanding that, in a rough, indirect way, I was exploiting the capabilities of the underlying hardware by talking to the 12 MHz Intel 80286 chip that was the brain of the system I was on. Programming was for me an exercise intimately tied to the hardware; when it took way too long for my FOR-NEXT loop to iterate through each of the possible background colors afforded in SCREEN mode 0, I was piqued -- how could Wolfenstein 3D scroll so smoothly as I pressed my arrow keys, and how did the muzzle-flash sprites alternate so quickly, when I could not even change the color of the screen without my eyes discerning a delay? Then I found out about assembly language, and my console ANSI palette rotation demos were faster than ever! Since then, I have understood "programming" as writing assembly with varying layers of abstraction on top of it, and things made sense because the microprocessor understood linear streams of opcodes, large chunks of which corresponded roughly to each line of code I wrote in BASIC. I came to learn that certain operations -- SCREEN, for example -- translated into very concise assembler instructions (mov ax, 13h; int 10h), while others -- console printing, for example -- stood for long, complex mini-programs, all of which accessed the screen and keyboard either through interrupts or through peeks and pokes at specified memory addresses.

(How excellent -- now I am waiting for the nanophysicists to develop a chip that interprets a discrete stream of electrons whose spins encode the ones and zeroes of a binary string -- then I can truly program with the greatest efficiency physically possible.)

The chief drawback to all this was that in "programming" I was merely operating the processor as a technician does any piece of equipment -- a very talented and apt technician, to be sure, but merely a technician -- I looked up BASIC instructions and arranged them carefully to produce expected results. There was no "computer science" involved; my most complicated data structure was the stack supplied by the processor to my process, and this I could abuse as a 64K array in whatever way I saw fit. So it seemed to my mind that an efficient and thorough understanding of the architecture from a technical operator's perspective was essential and instrumental to my future application of computer science concepts -- never have I been hindered by the attitude that a pre-existing library could lift the space I design my solutions in to a higher level, that I could be focusing somewhere above code-as-assembler-macro. In effect, I find that this perspective has granted me a strong appreciation for the introduction of encapsulation -- a critical and simply-understood point in programming evolution where design can step up a level. It is now quite apparent to me that computer science is not at all about low-level scrambling of linear streams of opcodes being sucked like a spaghetti noodle by the CPU out of RAM. Indeed, one could do computer science in a classroom on a whiteboard, with symbols and diagrams and graphs and charts where appropriate, but then it is no more than a course in applied modern algebra with occasional throwbacks to calculus.
The excitement, the rush, the fun, the raw, visceral appeal of it all must come from implementing and executing the design, from watching the theories of computer science take shape and spin and glide and whistle as they travel from your brain down your arms, out your fingertips, onto the keyboard, through the processor, and finally out of the display. The thrill of acceleration down a desert highway, the adrenaline rush that hits when the red and blue lights electrify your rear-view mirror, the smug smirk you swallow as the cop saunters back up to your window to return your license and registration along with the ticket for 20 over -- all these would mean nothing if engineers sat in classrooms and talked and drew and thought and designed and no cars were ever built. And which of these engineers does not privately cringe at the boundaries of his fabrication plant, at the limitations inherent in raw materials as the fabric of art -- who would not wish to be able to quickly and inexpensively realize every whimsical design born into his creative mind and race it down the test track, and who would not like to ride the crazy cars his friends dream up?

But this is the luxury of the computer scientist, given the ubiquity and economy of personal computer hardware -- what sense is there in having literally at your fingertips the entire assembly line and production facilities necessary to breathe life into the creative, abstract imagination of your art if you are not going to completely and thoroughly utilize those means of production? If you are going to learn Java and benefit from inheritance and encapsulation and binary searches and generics and the whole gamut of features available to you, well, you have no reason not to take advantage of what is right before you and know and understand and cathect and grok the whole machine!
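For concreteness, here is the kind of leverage generics buy you -- a binary search written once over any Comparable type, usable for numbers and strings alike (a quick sketch in Java; the class and method names here are mine, not from any particular library):

```java
public class Search {
    // Generic binary search over any type that can order itself.
    // Written once; reused for Integer, String, or anything Comparable.
    static <T extends Comparable<T>> int binarySearch(T[] sorted, T key) {
        int lo = 0, hi = sorted.length - 1;
        while (lo <= hi) {
            int mid = (lo + hi) >>> 1;          // unsigned shift avoids overflow
            int cmp = sorted[mid].compareTo(key);
            if (cmp < 0)      lo = mid + 1;      // key is in the upper half
            else if (cmp > 0) hi = mid - 1;      // key is in the lower half
            else              return mid;        // found
        }
        return -1;                               // not found
    }

    public static void main(String[] args) {
        Integer[] nums = {2, 3, 5, 7, 11, 13};
        String[] words = {"ada", "basic", "java"};
        System.out.println(binarySearch(nums, 7));      // prints 3
        System.out.println(binarySearch(words, "ada")); // prints 0
        System.out.println(binarySearch(nums, 4));      // prints -1
    }
}
```

The same dozen lines serve every ordered type -- exactly the sort of step up a level that encapsulation and generics make routine.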

Man, I really want to learn Ada now.

I found it fascinating and cool to read that Ada can be found both in Raytheon Canada's Canadian Automated Air Traffic System and in the Boeing 777's fly-by-wire system. Writing mission-critical code seems very, very exciting.

I will probably try it out a few times before making yet another life-changing decision -- but, man: "[Ada] has been used ... where a software bug can mean fatalities."

I want to write that software. I want to work with people who have a real need for that kind of software.

I am probably going to finish up this Java 2 storm-tracking project first though :)

programming, java, ada, lol rant
