Read The Implementation of Functional Programming Languages over the weekend. Didn’t dig it as much as SICP or EOPL or LiSP, but I’ve finally grokked the basics of graph reduction evaluators and have a couple simplistic ones running on LLVM now :-).
A bunch of books off Amazon. Some classics, some geeky stuff, some that are both. When they get here I’ll be happy as a baby in a topless bar.
The Craft of Text Editing — by Craig A. Finseth.
It’s interesting in that, despite being written in 1991, most of what it covers regarding editor-specific functionality and concerns is still valid. It’s essentially built around a model-view-controller pattern, long before the patterns book was published. There still hasn’t been much improvement on gap buffers, linked lists, and the like for modeling the data. User interface design is obviously important, if you expect to have any users. I like timeless shit like that.
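Since the gap buffer is the book’s bread-and-butter data structure, here’s a minimal sketch of the idea in Python (my own illustration, not Finseth’s code): the text lives in one array with a “gap” at the cursor, so insertion is O(1) and moving the cursor just shifts characters across the gap.

```python
# Minimal gap-buffer sketch. All names here are my own invention.
class GapBuffer:
    def __init__(self, initial="", gap_size=16):
        self.buf = list(initial) + [None] * gap_size
        self.gap_start = len(initial)   # cursor position
        self.gap_end = len(self.buf)

    def text(self):
        # The logical text is everything outside the gap.
        return "".join(self.buf[:self.gap_start] + self.buf[self.gap_end:])

    def move_to(self, pos):
        # Shift characters across the gap until it starts at pos.
        while self.gap_start > pos:
            self.gap_start -= 1
            self.gap_end -= 1
            self.buf[self.gap_end] = self.buf[self.gap_start]
        while self.gap_start < pos:
            self.buf[self.gap_start] = self.buf[self.gap_end]
            self.gap_start += 1
            self.gap_end += 1

    def insert(self, ch):
        if self.gap_start == self.gap_end:          # gap exhausted: grow it
            self.buf[self.gap_end:self.gap_end] = [None] * 16
            self.gap_end += 16
        self.buf[self.gap_start] = ch
        self.gap_start += 1

    def delete(self):
        # Delete the character after the cursor by widening the gap.
        if self.gap_end < len(self.buf):
            self.gap_end += 1

gb = GapBuffer("helo")
gb.move_to(3)
gb.insert("l")
print(gb.text())  # hello
```

The appeal is exactly what the book gets at: for typical editing, where changes cluster around the cursor, this beats naive string splicing without the bookkeeping overhead of fancier structures.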
A few parts, such as chapter seven, cover implementation and performance details which aren’t of particular concern today, but I think they’re still worth the read simply because they’re an interesting look into the sorts of tradeoffs one faces when performance matters (chapter seven itself even says as much, pointing out that for screen management you could just use the prevalent curses library). Thankfully we don’t generally use monochrome text-only terminals over a connection whose speed is measured in baud anymore. And while people typing faster than the networked terminal could respond is no longer a real concern, over time people have gotten lazy, and despite multi-gigahertz machines with megabit/sec connections we still deal with lag. The lag’s just in other areas now, but it’s still there, especially on the web. Especially when someone’s on dialup or a cell phone, neither of which is exactly uncommon, just as 300 baud connections weren’t uncommon in 1991. I’m thinking specifically of XML‘s rampant misuse for non-trivially-sized datasets here as one example, where something like JSON or SQLite or Google’s newly released Protocol Buffers would be far more appropriate. Using scripting languages to implement entire applications (as in RoR, Django, etc.) would rank pretty high as well, but at least in those cases it’s the server’s CPU cycles they’re chewing away, not your own computer’s. Unless of course it’s also an AJAX site. Sigh.
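To make the XML-bloat point concrete, here’s a toy comparison (not a benchmark, and the record shape is made up): the same small dataset serialized as XML and as JSON, with the XML coming out noticeably fatter just from tag overhead.

```python
import json

# A hundred trivial records; field names are arbitrary for illustration.
rows = [{"id": i, "name": f"item{i}"} for i in range(100)]

as_json = json.dumps(rows)
as_xml = "<rows>" + "".join(
    f'<row><id>{r["id"]}</id><name>{r["name"]}</name></row>' for r in rows
) + "</rows>"

# Every value pays for opening AND closing tags in XML,
# so the gap only widens as the dataset grows.
print(len(as_xml), len(as_json))
```

Multiply that overhead by a few megabytes of data and a slow connection, and the lag the book was worried about in 1991 is right back.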
Writing a new editor, of any sort, from scratch today would simply be foolish. You aren’t going to do anything new; editors have been done to death. It would accomplish nothing but a pointless reinvention of the wheel when all you needed to do was pick a well-designed, customizable editor and learn to use it. Emacs, vi, LyX, Eclipse, TextMate: there are shitloads of good editors already. But the fact that the book covers writing a text editor for obsolete equipment seems incidental to me. It’s more about properly designing software, using a text editor as a specific case example. In that regard it will remain valid for many years to come, especially in a world where test-driven development and “agile” development run rampant and naive programmers think “if it works, it’s done”. Not to say that TDD/Agile are bad ideas; I think TDD is awesome. But when they’re misused by the lazy, they’re fucking horrible.
Design, not program. The book won’t teach you how to code in general, let alone anything usable, save for an appendix covering the minimum of C necessary to understand, with effort, the code examples. It was written for programmers; ergo it contains mostly concepts, not code.
 I know there’s going to be some ass chewing over this. Oh well.
Especially handy when writing papers, articles, and the like. While the web has only recently begun to understand the value of separating content from presentation, it’s been around in document editing for eons. If you’re trying to do both at the same time, you’re wasting time. The trick is in these two choice quotes from Bram Moolenaar:
“I want to get the work done, I don’t have time to look through the documentation to find some new command”. If you think like this, you will get stuck in the stone age of computing. Some people use Notepad for everything, and then wonder why other people get their work done in half the time…
Don’t overdo it. If you always try to find the perfect command for every little thing you do, your mind will have no time left to think about the work you were actually doing. Just pick out those actions that take more time than necessary, and train the commands until you don’t need to think about it when using them. Then you can concentrate on the text.
Came across an interesting page w.r.t. firearms.
” [...] Since I was suspicious of the Handgun Control Inc. statistics that everyone quotes, I decided to do my own research. Here’s what I found. I welcome corrections and reliable sources of more recent statistics.
“Out of 30,708 Americans who died by gunfire in 1998, only 316 were shot in justifiable homicides by private citizens with firearms.” -Handgun Control, Inc.
Let’s break that number down. [...]“
Which subsequently falls entirely apart under even the most basic of analysis. From: http://seanbonner.vox.com/library/post/grigsby-why-im-no-longer-antigun.html
Which reminds me of another short post I had seen before regarding firearm permits and gun crime and shows quite literally, “If guns are outlawed, only outlaws will have guns”:
Statistics are the new propaganda for everything ranging from politics to science. Think about it. Are there any beliefs you hold that may be based in large part upon misrepresented “statistics”? Ever bothered to research their data yourself?
Paradigms of Artificial Intelligence Programming
Artificial Intelligence: A Modern Approach
Introduction to Algorithms
The Algorithm Design Manual
Introduction to the Design and Analysis of Algorithms
Lisp in Small Pieces
After busting my ass helping build a deck and screening a porch and moving blocks and blah blah blah the last couple weeks, it’s yay-more-dead-tree time. The thing is, I want all of the above and I can only afford one, maybe two of the books I want, and I have to decide which to get (right now, eventually I will have them all). Being addicted to books really sucks sometimes.
Paradigms of Artificial Intelligence Programming is quite aged; however, it uses Common Lisp as well as a Prolog implementation written in CL, both of which are still massively used and interest me (well, ok, I’m more of a Scheme guy now, but I’d assume it would at worst be an interesting exercise to translate the code to Scheme). And while much of the content itself (ELIZA???) is a bit aged, it’s still all stuff that fascinates me (and which I can think of uses for).
Or there’s Artificial Intelligence: A Modern Approach, which shares an author with PAIP above and is basically the current de facto textbook on the subject, and its code is available in Lisp, Python, and Java. But it’s apparent that each covers subjects the other does not, and given that PAIP contains a good deal of information not directly related to AI, I don’t think it’s fair to entirely discount it yet. AIMA, on the other hand, contains a *lot* more recent material specific to AI. I do know, however, that I would likely feel much more comfortable diving into AIMA if I had already read PAIP, and PAIP would teach me a larger variety of information anyway.
So if I choose an AI book, it will be PAIP. Well, it’s a start.
Lisp In Small Pieces. “The first starts from a simple evaluation function and enriches it with multiple name spaces, continuations and side-effects with commented variants, while at the same time the language used to define these features is reduced to a simple lambda-calculus. [...] The second part focuses more on implementation techniques and discusses precompilation for fast interpretation: threaded code or bytecode; compilation towards C. Some extensions are also described such as dynamic evaluation, reflection, macros and objects.” Pretty much the book to get if you want to design a lisp implementation or understand how they work under the hood. After having read SICP, HTDP, TSPL, TLS, TSS, TRS, etc… its attraction is obvious. I would assume learning how to build something from the ground up would be a great way to learn how to use it more effectively. Sure, other books get into how to write an interpreter, but it’s nothing at all quite like this. It would have practical immediate value to me, and still be challenging enough to prove entertaining. However, I feel that perhaps compiler theory may be a bit over my head without a better background in algorithms in general, even if this book is quite well known for explaining everything it covers in a very clear manner.
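The “simple evaluation function” LiSP starts from is the classic environment-passing interpreter. Just to pin down the shape of the idea, here’s a toy version sketched in Python rather than Scheme (entirely my own illustration, not the book’s code; expressions are nested tuples, single-argument lambdas only):

```python
# Toy environment-passing evaluator. ("lambda", param, body) builds a
# closure, ("if", c, t, e) branches, strings are variable references,
# and everything else is self-evaluating. All names are mine.
def evaluate(expr, env):
    if isinstance(expr, str):                      # variable reference
        return env[expr]
    if not isinstance(expr, tuple):                # self-evaluating literal
        return expr
    op = expr[0]
    if op == "lambda":                             # one-argument abstraction
        _, param, body = expr
        # The closure captures env; application extends it with the binding.
        return lambda arg: evaluate(body, {**env, param: arg})
    if op == "if":
        _, cond, conseq, alt = expr
        return evaluate(conseq if evaluate(cond, env) else alt, env)
    # Otherwise it's an application: operator, then operand, then apply.
    fn = evaluate(expr[0], env)
    return fn(evaluate(expr[1], env))

base = {"add1": lambda n: n + 1, "zero?": lambda n: n == 0}
prog = (("lambda", "n", ("if", ("zero?", "n"), 0, ("add1", "n"))), 41)
print(evaluate(prog, base))  # 42
```

The book’s whole first part is essentially this skeleton enriched step by step with multiple namespaces, continuations, and side effects, which is exactly why it reads like a natural sequel to SICP’s metacircular chapter.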
Then again, the same could be said of the AI books above. Ok, so my choice won’t be any of the above. It’s clear the best choice given my limited funds would be one of the algorithm books.
Introduction to Algorithms is straight out. Despite it being the still-used-everywhere classic, I simply lack the background in math for much of it to be of use to me, and despite its intent as a textbook it’s apparent that it serves best as a reference for algorithms, for which I already have The Art of Computer Programming, Volumes 1-3 Boxed Set (which, as it happens, I quite clearly recall paying $125 (sticker price) for at Barnes & Noble some years ago. Inflation, maybe?).
The Algorithm Design Manual is also out, as a 2nd edition is about to be released and the waiting time for the current one is absurd. By the time it got here, the 2nd edition may well be available. Otherwise I may have simply chosen it to begin with: “THE ALGORITHM DESIGN MANUAL comes with a CD-ROM that contains: * a complete hypertext version of the full printed book. * the source code and URLs for all cited implementations. * over 30 hours of audio lectures on the design and analysis of algorithms are provided, all keyed to on-line lecture notes.” One of the things I loved most about SICP was that there are free lecture videos available that go along with it. They saved me a whole shitload of headache in the stickier parts of the book. While I’m sure somebody somewhere has made the contents of the CD (including the entire book…) available, I’ll wait until I can buy it. Generic star-of-the-week pop music or shitty unoriginal films (read: RIAA/MPAA) crying a river despite record-setting profits are one thing, but a book such as this is probably worth the damn money.
So it’s between Algorithms and Introduction to the Design and Analysis of Algorithms. Hmmm. Both are quite highly praised for being clear and descriptive and well written and all that jazz. IDAA appears to have a novel method of categorizing the algorithms, by how they work, not what they do. That appeals to me greatly. Algorithms however appears to be cited significantly more than IDAA, and is much less expensive. More importantly, while Algorithms seems to clearly explain the concepts behind the algorithms it contains, IDAA seems to be focused towards doing the same for algorithms in general. Naturally, I would prefer the latter.
So Introduction to the Design and Analysis of Algorithms it will be. Here’s to hoping reviews on Amazon are worth a damn. Cheers.
Finished reading those books. Actually, I read the first two within half a day of getting them, the 3rd took a while. They include footnotes as to the differences between Scheme and Common Lisp where relevant, many of which are simply to deal with CL being a Lisp-2. And they were all worth every fucking penny.
The Little Schemer was far more basic than I had expected, but it did get around to covering the Y combinator, in quite an interesting way. Probably not so basic if you haven’t worked through SICP, HTDP, TSPL, and such. Heavy on recursion, it only actually uses a very small subset of Scheme. Much like SICP, it goes on to write a small Scheme interpreter inside Scheme itself.
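For anyone who hasn’t met it, the payoff of that derivation is a definition of recursion using nothing but anonymous functions. Transcribed into Python for illustration (the book derives it in Scheme, step by painstaking step), the applicative-order Y combinator looks like this:

```python
# Applicative-order Y combinator (often called Z). The eta-expansion
# (lambda v: x(x)(v)) delays self-application so a strict language
# doesn't loop forever building the recursive function.
Y = lambda f: (lambda x: f(lambda v: x(x)(v)))(lambda x: f(lambda v: x(x)(v)))

# Factorial defined without ever referring to itself by name.
fact = Y(lambda rec: lambda n: 1 if n == 0 else n * rec(n - 1))
print(fact(5))  # 120
```

The book’s achievement is making that one-liner feel inevitable rather than like a magic trick.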
The Seasoned Schemer is pretty much what you would expect: more Scheme. It gets into let, set!, call/cc, etc., including Y!, which I don’t recall seeing before. Interesting to note, there’s a footnote for Common Lisp in chapter 19 that simply says: “This is impossible in Lisp, but Scheme can do it.” It was referring to a particular use of letcc (aka call/cc), but it gave me a chuckle.
The Reasoned Schemer… wow. Despite being the shortest, I really had to take my time with that one. It’s pretty much entirely on using and writing a Prolog implementation in Scheme.
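The engine underneath that Prolog-in-Scheme is unification, and it’s worth seeing in miniature. Here’s a bare-bones unifier sketched in Python (entirely my own illustration; the book’s miniKanren is far richer, with goals, fresh variables, and interleaved search on top of this):

```python
# Logic variables are ("var", name); a substitution maps names to terms.
def is_var(t):
    return isinstance(t, tuple) and len(t) == 2 and t[0] == "var"

def walk(t, subst):
    # Chase a variable through the substitution until it's fresh
    # or bound to a non-variable term.
    while is_var(t) and t[1] in subst:
        t = subst[t[1]]
    return t

def unify(a, b, subst):
    # Return an extended substitution making a and b equal, or None on failure.
    a, b = walk(a, subst), walk(b, subst)
    if a == b:
        return subst
    if is_var(a):
        return {**subst, a[1]: b}
    if is_var(b):
        return {**subst, b[1]: a}
    if isinstance(a, list) and isinstance(b, list) and len(a) == len(b):
        for x, y in zip(a, b):
            subst = unify(x, y, subst)
            if subst is None:
                return None
        return subst
    return None

x, y = ("var", "x"), ("var", "y")
s = unify(["pea", x], [y, "pod"], {})
print(walk(x, s), walk(y, s))  # pod pea
```

Once you see that solving `["pea", x] == [y, "pod"]` just means threading a substitution through structural matching, the book’s relational programs stop looking like sorcery and start looking like bookkeeping.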
And if you haven’t guessed from the title of this post, they all use foods as example data, in an oft-humorous way.