enjoying salad since 1978.

Sunday, December 23, 2001

The new version of E is out. I'll finish my Beginner's Guide to E over xmas break.

Friday, December 21, 2001

Speaking of Pendleton, look at their hideous website.

Also, I saw Lord of the Rings Wednesday night. Peter Jackson gets it. What an amazing movie for his first time out with a big budget. I'd be very interested in reading a book about the techniques he developed for such seamless actor/CGI integration. It was really excellent.

You know, I lived there for 18 years and I had no idea that the city emblem of P-town was a sheep. A FRIGGIN' SHEEP!

Using textareas to edit is really frustrating. That crappy Pendleton website just crashed NS6.2 and took my entry with it. It reminds me of the lame joke where Jesus and Satan are in an essay contest and, after both of their computers crash, Jesus doesn't lose his text. The punchline? Jesus saves.

Busy, Busy Week.

I have half a micro-essay written about rectifying two competing coding disciplines. Sorry, I don't mean to tease.

Next week I'm on vacation to the frozen tundra of Pendleton. In my spare time, I'll hack out a basic memory profiler using the JVMPI.

Greg Graffin's American Lesion is really a good album.

Sunday, December 16, 2001

My latest bedtime reading drew a wonderful connection between the fact that Samuel Beckett wrote a great deal of his work lying in bed and the fact that many of his protagonists were quadriplegic.

You know, I never cared for Beckett's plays that much but his short stories are some of my favorite reading.

My other bedtime reading is pretty damn interesting as well. Even though I've read many of these articles and chapters in PDF format, I still find that I glean more from Abrash's work every time I read it. I have to pull out my MASM->TASM translator guide again, though. I guess I should port all of my old assembly code to NASM for good measure.

I've been thinking about OpenGL on Playstation2/Linux. I wonder if it actually works...

Friday, December 14, 2001

Wow, I can't believe that Boxer is still around; it was my preferred text editor for hacking assembly in. It had keybindings for every editor known to man at the time, including WordStar. I didn't use the WordStar keybindings, I used the Emacs keybindings, but my assembly mentor was a big WordStar-on-CP/M user.

I feel that I should rewrite a paragraph in my earlier critique. I didn't realize that the concept of implicit context mapping from the URL was such a strange idea. Imagine the following:

  1. You have an Apache webserver with a Content Handler registered at the location "/users"
  2. You find a user via http://junk.com/users/stevej?mode=full
  3. The first character of /stevej?mode=full is ripped off, and you now have the username sitting on top of your stack until the '?', and the rest of the string is easily parsable.

We've hereby completely bypassed the use of any regexes to determine where a user request should be routed. Voilà.
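
Here's a rough sketch of the idea in Python (the handler names and the dispatch table are mine, made up for illustration):

    # The first path segment picks the handler, the next segment is the
    # username, and no regex is involved anywhere.
    def users_handler(username, query):
        return "render profile for %s with query '%s'" % (username, query)

    HANDLERS = {"users": users_handler}  # plays the role of a Content Handler at "/users"

    def dispatch(request_uri):
        # "/users/stevej?mode=full" -> path "/users/stevej", query "mode=full"
        if "?" in request_uri:
            path, query = request_uri.split("?", 1)
        else:
            path, query = request_uri, ""
        segments = path.strip("/").split("/")
        handler = HANDLERS.get(segments[0])
        if handler is None or len(segments) < 2:
            return "404 Not Found"
        return handler(segments[1], query)

    print(dispatch("/users/stevej?mode=full"))

Adding a new location is one new entry in the table, not another alternative in a regex.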

Thursday, December 13, 2001

The LiveJournal crew have released a document describing their new proposed architecture. Let's talk about it (indented text is from the doc, text directly below that is my criticism):

Please realize that my criticisms are friendly. I'm not the God of Programming, I'm just a dude who has ideas and experience and is knowledgeable about the literature.

  Backhander. The backhander will have to look several places in the HTTP request to determine which cluster to throw it to:
  REQUEST_URI =~ m!^/(users|community|~)/(\w+)!
  REQUEST_URI =~ m![\?\&]user=(\w+)!
  Post data: user
Every single HTTP request goes through a regex? I think it would make more sense for every URI to be implicitly mappable to its context. Even a simple string comparison with a bunch of nasty if/elseif blocks would be better than having to parse every single request. No matter how fast you think a regex is, "string" equals "string" is faster. And no matter how fast you think "string" equals "string" is, not doing any comparison at all is blindingly faster. "string" means send them to /path/to/string (don't forget the rest of the request data), "junkXYZ090" means send them to /path/to/junkXYZ090, and if /path/to/junkXYZ090 doesn't exist, 404 'em.
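A crude way to check the claim yourself (a throwaway Python sketch; the absolute numbers don't matter, only their ordering does):

    # Time a compiled regex match against a plain string comparison on the
    # same URI. The dispatch-table approach sketched above skips even this.
    import re, timeit

    uri = "/users/stevej?mode=full"
    pattern = re.compile(r"^/(users|community|~)/(\w+)")

    regex_time = timeit.timeit(lambda: pattern.match(uri), number=100000)
    eq_time = timeit.timeit(lambda: uri[:7] == "/users/", number=100000)

    print("regex match:    %f" % regex_time)
    print("string compare: %f" % eq_time)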
"The ultimate ideal would be to have LiveJournal to scale linearly with the number of servers we buy."
Linear time complexity is never the ideal. I'm not nitpicking! It's an important distinction. If linear time complexity plays into your scalability plans, it must only be as a bottom line. Please remember that the order of growth doesn't tell you the absolute amount of time spent on a specific dataset size, but rather how expensive the algorithm becomes as your dataset grows. If linear is the ideal rather than the floor, what they seem to be saying is that they expect each webserver to handle fewer requests as they add more webservers.
"Currently, there is one master database, 5 slave databases, and a ton of web servers."
This is my largest criticism of LiveJournal. They don't have more users than Blogger and yet they require a whole lot more resources. I feel that their dynamic generation of every user page is to blame for this. Handling user pages as static content generated upon delta rather than upon request would lessen the load on the database tremendously. Of course, since LiveJournal hosts every page, this quickly becomes a matter of using a massive NFS mount or caching whole pages in the database instead of just caching some of the data required to generate a whole page.
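
Something like this is what I'm picturing (the paths and function names are made up for the sake of the sketch; it's not their code): the expensive render happens once, when the user posts, and every page view after that is a flat file the webserver hands back without touching the database.

    import os

    STATIC_ROOT = "/var/www/journals"  # imaginary NFS mount or local cache

    def render_journal(username, entries):
        items = "\n".join("<li>%s</li>" % e for e in entries)
        return "<html><body><h1>%s</h1><ul>%s</ul></body></html>" % (username, items)

    def on_new_entry(username, entries):
        # Called on the delta (the user posting), not on every reader's request.
        path = os.path.join(STATIC_ROOT, username, "index.html")
        os.makedirs(os.path.dirname(path), exist_ok=True)
        with open(path, "w") as f:
            f.write(render_journal(username, entries))

    # on_new_entry("stevej", ["first post", "second post"]) writes the page once;
    # readers after that never hit the database.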

I'll have more criticisms as I read further into their proposal.

Rebuttals welcome. Thanks for listening.

Wednesday, December 12, 2001

I've been fighting off a cold these past few days, trying to balance resting and defeating the enemy: midnight code.

It's 4am and I should definitely be asleep but I'm the victim of that nagging "could be a sore throat but it isn't quite" funkiness.

Geekbooks.org has been on my mind. So have smart contracts and a rebirth of a highly pared-down yet useful AMIX. Don't ask about either of these; I'll deny having mentioned them.

I see that MarkM put a new directory in the E CVS tree for the 8.10delta release. I'm glad to see progress is moving along steadily. I really want to start hacking in E again.

Cannon's having a big xmas bash at his place on Saturday. Chocolate fondue? Speaking of which, Star Wars: Starfighter has cleared a lot of right-brain blockage lately. I've solved some of my hardest code problems blasting my way through Trade Federation blockades and flying through tight canyons. I've also died a lot in the game from lack of focus. Oh well. I'm a much better shot than I was with the original X-Wing for the PC.

Monday, December 10, 2001

>>>>>decompression<<<<<

Tuesday, December 04, 2001

Because my posts have been so few and far between, I present this gift:

A dog in a funny suit

Monday, December 03, 2001

Technology conspired against me this weekend: first, my IMAP-SSL server refuses to talk to Entourage for Mac OS 9, and since I want to move to Entourage/X, this is greatly disappointing. Also, my DSL has been out today (again). The techs were convinced that it was my modem, but just a few hours ago my network connection magically came alive again. Now I have to cancel the Covad support person scheduled to arrive at 8am tomorrow, and I can't email this to support because support no longer accepts email. Hilarity ensues.

Eudora is really pissing me off. Ctrl-E instinctively takes me to the end of the line from my years hacking in Emacs, but it means 'immediate send' in Eudora-speak.

I really wanted to get more coding done this weekend. guh..

In less whiny news, the Arc project looks rather interesting. My resurgence of interest in Lisp/Scheme has been brought about by LispMe 3.0. Plus, I've been hacking in Python on my Visor lately. The Visor port doesn't seem to like all the functional constructs I'm used to in desktop Python. If you've never hacked Python using functional idioms, you really should. It's fun.
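
For a quick taste (a throwaway example of my own, nothing to do with the Visor port):

    # Map, filter, and reduce instead of loops and temporary variables.
    from functools import reduce

    words = ["emacs", "wordstar", "vi", "boxer"]
    lengths = list(map(len, words))                     # [5, 8, 2, 5]
    short = list(filter(lambda w: len(w) <= 5, words))  # ['emacs', 'vi', 'boxer']
    total = reduce(lambda a, b: a + b, lengths, 0)      # 20
    print(lengths, short, total)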

Speaking of which, please note the Sharp SL-5000: Linux, embedded Qt, Java, CompactFlash... I want one.