Tuesday, January 30, 2007

TGD - expanding the field

Some primarily stochastic thought processes which occurred when I tried to apply Richard Dawkins's hypotheses to the Æsir. If your default browsing font doesn't contain a glyph for the ligature, well, tough ;-)


I suppose one of the first things to note is that whereas Xtianity is considered a religion, the Æsir and Vanir (the Norse pantheon) are more often thought of as folklore or mythology.[1] In fact, upon reading the Eddas it's easy to have the impression that the sagas of Odin, Thor, Loki and chums have the same qualities as those associated with, for instance, Siegfried (of Das Nibelungenlied fame), King Arthur or Finn MacCool. Essentially, the gods have the air of being erstwhile real blokes (and of course blokesses), who accumulated stories, feats and powers as people sought to glorify them, so that when those people later claimed to be descended from same they could hope to persuade others of the existence of said powers.[2]



What I find interesting is that the only difference between a quirky set of historically interesting tales and gospel truth is how many people believe in what's said. For instance, it would be easy to apply the same distinction drawn above between religion and folklore to the Roman pantheon, which was equally a major European religion relegated to providing saints and fables once Constantine got splashed in the font. Of course, to do so would be to ignore that there were multiple pre-Christian religions in the Roman Empire. It's not merely that, in the way the Norse posh nobs worshipped Odin while the thains worshipped Thor, different groups favoured different gods; there were completely different mythological universes in play. I'm going to choose one, completely at random.



Between the second century BC and the fourth (maybe fifth) century AD, a particularly popular mythos in the Empire was Mithraism.[3] If there are any modern Mithraists, I don't know about it. Which is not surprising, considering how wacky their religion was. [Update: apparently some Zoroastrians still venerate Mithras.]



Mithras was supposed to have been born around 270 BC to a virgin Mother of God (the date of the celebration of his birth was December 25th). He was worshipped as a member of a trinity, as the mediating force between heaven and earth. In fact, heaven was not only the celestial abode of God but also the place where atoned souls would go when they died, the true believers being absolved of their earthly sins. Those less fortunate were condemned to an infernal hell. Initiates (ceremonies were closed affairs, available only to men who had performed the appropriate rites) were baptised, and Sundays were a sacred time when the Mithraists ate bread, representing the body of their God, and drank wine, representing his blood; these were symbolic of the final supper he shared with his followers before ascending to heaven in about the 64th year of his life. Along with Odin and Osiris, he is supposed to have died and been resurrected before his final ascension.



You'd never get away with that rubbish these days, which is why this is clearly a deluded heathen folk tale as opposed to, um. You can clearly see why Dawkins didn't talk about this one in TGD...



[1] Actually, there is a religion with such a pantheon, called the Ásatrú - the word is Icelandic for Æsir faith. Despite widespread confusion, none of the major organisations of this faith are actually neo-Nazis or supremacy groups.


[2] I suppose this makes Finn and Aragorn the same being.


[3] Just through etymology I am reminded that I haven't yet covered Jainism. I need to.

Monday, January 29, 2007

That syncing feeling

So, iTunes tells me my iPod is up to date, and that it won't copy a few songs to my iPod because the iPod software needs updating first. The word which springs to mind is "erk".

Friday, January 26, 2007

Scary stuff

Possibly the scariest diagram anyone will ever have to look at. It's even less penetrable than that Eric Levenez history of unix thing.

Sacrilege!

As I wrote the @interface to an object today, I found myself wanting for but one thing:



- (NSArray <MyProtocol> *) foo;

Where - as if you hadn't guessed it - the pointy bracket bit (the lengths to which I go to avoid typing out HTML entity names) would specify that all of the objects in the array returned by -foo conform to @protocol(MyProtocol). I then realised that this wouldn't be quite as useful as I might think, but also decided that it wouldn't be too hard even on the existing ObjC runtimes to come up with a nightmare function such as:



Protocol *objc_class_to_protocol(Class cls);

...therefore meaning that my hitherto unattainable pipe dream:



- (NSArray <MythicalNSStringProtocol> *) foo;

may indeed be somewhat closer to realisability. Of course, with all of this being compile-time type checking (as with the similar beasties in Java), there would be no need to frob the runtime.



Update 2007-01-26T13:22: yes, I realise that the snippets above read "an NSArray or subclass, which also conforms to the MyProtocol protocol". I also know what I mean.
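
In the meantime, about the best one can do is a runtime spot-check. Here's a minimal sketch of the sort of thing I mean (the category and its method name are invented for illustration, not anything that actually ships in Foundation):

#import <Foundation/Foundation.h>

// Invented convenience category: answers YES only if every element of the
// receiver conforms to the given protocol. Not part of Foundation.
@interface NSArray (ProtocolConformance)
- (BOOL)allObjectsConformToProtocol:(Protocol *)aProtocol;
@end

@implementation NSArray (ProtocolConformance)
- (BOOL)allObjectsConformToProtocol:(Protocol *)aProtocol
{
    NSEnumerator *enumerator = [self objectEnumerator];
    id object;

    while ((object = [enumerator nextObject]) != nil) {
        if (![object conformsToProtocol:aProtocol]) {
            return NO;
        }
    }
    return YES;
}
@end

So a caller of -foo could at least assert [[bar foo] allObjectsConformToProtocol:@protocol(MyProtocol)] at runtime, which is a poor man's substitute for having the compiler enforce it.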

Friday, January 12, 2007

Eating one's own dog food

I feel foolish for having made this error (especially after having so patiently explained how this stuff works on Mach), but today I did it. I reported the amount of free memory on a Linux system as being the amount reported by free as "free".


My own opinion on this is that I suffer from a view of hardware management which was tainted by using micros like the Dragon 32 (roughly the Tandy TRS-80 Color Computer, to my American readers) and the Amiga, where there were basically two types of memory usage: yes and no. A particular block was either in use by the system, or it wasn't. On the Dragon 32 it was even easier than that, of course; all bytes were available for reading and writing, but what happened was context-dependent. Also, because this was the MC6809E, whether such a peek/poke made sense depended on whether you were trying to hit an I/O address, and on what was plugged in. The Amiga had a particularly lame memory allocator which quickly sucked performance like a vacuum of performance suckage +2; but a byte of RAM was either in use or it was available.


But I digress. The point is that such a simple view of memory availability is no longer sensible, but it's hard for me to think around it without a lot of work, just as it was hard for me to become a programmer after I'd been taught BASIC and Pascal. If I were more involved in UNIX internals (and indeed that would be fun, although I think maybe Linux wouldn't be my first choice to open up), I'd probably be able to think about these things properly, just as I had to throw myself into C programming in a big way before I lost my BASIC-isms.


For the record (and so that it looks like this post is going somewhere), both operating systems have an intermediate state for RAM to be in, between "used" and "not used" (where I'm ignoring kernel wired memory, and Linux kernel buffers). On Mach, there's the "inactive" state which I've already described at el linko above. On Linux, that intermediate memory is used as an I/O cache for (mainly disk, mainly read-ahead) operations, which means that Linux will automatically take almost all (if not all) of the memory during the boot process. The way that inactive memory gets populated on Mach means that on that system (e.g. Darwin) the amount of free memory starts large and inactive starts small, but over time, as the active and inactive counts go up, the free count goes down, and it's rare for memory to be re-freed. On both systems, free memory is really "memory it wasn't worth you buying", as it's not being put to any use at all.
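
To make that concrete (and so that today's penance involves actual code), here's a rough sketch in plain C of printing both figures on Linux by reading /proc/meminfo: the strictly free number, and free plus the reclaimable buffer/page cache, which is roughly what free's "-/+ buffers/cache" line is getting at. Error handling is minimal, and the labels are the ones /proc/meminfo actually uses.

#include <stdio.h>

/* Rough sketch: print "free" versus "free plus reclaimable cache" on Linux,
 * straight from /proc/meminfo. All figures are in kB. */
int main(void)
{
    unsigned long memFree = 0, buffers = 0, cached = 0;
    char line[256];
    FILE *fp = fopen("/proc/meminfo", "r");

    if (fp == NULL) {
        perror("fopen /proc/meminfo");
        return 1;
    }
    while (fgets(line, sizeof(line), fp) != NULL) {
        /* Each sscanf only succeeds on the line whose label matches. */
        sscanf(line, "MemFree: %lu kB", &memFree);
        sscanf(line, "Buffers: %lu kB", &buffers);
        sscanf(line, "Cached: %lu kB", &cached);
    }
    fclose(fp);

    printf("Strictly free:        %lu kB\n", memFree);
    printf("Free + buffers/cache: %lu kB\n", memFree + buffers + cached);
    return 0;
}

The Darwin equivalent would be to ask Mach for its counters via host_statistics() with the HOST_VM_INFO flavour (which is where vm_stat and top get their free/active/inactive/wired page counts), and to treat inactive pages with the same generosity.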