Richard Jones' Log

Thu, 24 Feb 2005
More Roundup wiki spam, ZWiki oddness

The Roundup wiki is using ZWiki, and when I went to undo the spam edit there was no record of the transaction. I've never packed the ZODB, so I'm quite flummoxed as to why there's no undo. After I'd fixed the page I could see my edit actions in the undo log*. Weird.

Thank goodness for the Google cache, or the page would've been lost.

Wed, 23 Feb 2005
Shiny new hardware, obpystone

The old machine (at least 5 years old, a dual P3 at 860MHz):

$ python
Python 1.5.2 ...
$ python /usr/local/lib/python2.3/test/pystone.py
Pystone(1.1) time for 50000 passes = 5.16
This machine benchmarks at 9689.92 pystones/second

*cough*

$ /usr/local/bin/python -V 
Python 2.3.4
$ /usr/local/bin/python /usr/local/lib/python2.3/test/pystone.py
Pystone(1.1) time for 50000 passes = 3.87
This machine benchmarks at 12919.9 pystones/second

The new machines (both have dual Opterons at 1.8GHz):

ellis:~$ python -V
Python 2.3.5
ellis:~$ python /usr/lib/python2.3/test/pystone.py
Pystone(1.1) time for 50000 passes = 1.17
This machine benchmarks at 42735 pystones/second
gaiman:~$ python -V
Python 2.3.5
gaiman:~$ python /usr/lib/python2.3/test/pystone.py
Pystone(1.1) time for 50000 passes = 1.16
This machine benchmarks at 43103.4 pystones/second

Oh, and:

gaiman:~$ python -c 'import sys;print sys.maxint' 
9223372036854775807

Cooool.... :)

Updated to correct speed of new CPUs

Tue, 15 Feb 2005
I take back all I said about __iter__ators

Some time ago, I questioned the wisdom of adding the new iterator protocol to Python. Now I'm wiser, and I understand :)

Roundup 0.8 includes per-item access controls - for example, you can specify that users may only view / edit certain issues, or perhaps certain messages attached to issues. The HTML templating system now automatically filters out inaccessible items from listings. In one situation, it does so using an iterator:

class MultilinkIterator:
    def __init__(self, classname, client, values):
        self.classname = classname
        self.client = client
        self.values = values
        self.id = -1
    def next(self):
        '''Return the next item, but skip inaccessible items.'''
        check = self.client.db.security.hasPermission
        userid = self.client.userid
        while 1:
            self.id += 1
            if self.id >= len(self.values):
                raise StopIteration
            value = self.values[self.id]
            if check('View', userid, self.classname, itemid=value):
                return HTMLItem(self.client, self.classname, value)
    def __iter__(self):
        return self

Doing this using an old-style __getitem__ "iterator" would be much more difficult and messy.
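
For comparison, here's roughly what the old protocol would force on you (a hypothetical sketch, not Roundup code; MultilinkGetitem is made up): for-loops call __getitem__ with 0, 1, 2, ... until IndexError is raised, so the object has to keep its own position, ignore the index it's handed, and only survives a single pass.

class MultilinkGetitem:
    def __init__(self, classname, client, values):
        self.classname = classname
        self.client = client
        self.values = values
        self.pos = -1
    def __getitem__(self, index):
        # 'index' counts all items, not just the viewable ones, so we
        # have to ignore it and track our own position instead
        check = self.client.db.security.hasPermission
        userid = self.client.userid
        while 1:
            self.pos += 1
            if self.pos >= len(self.values):
                raise IndexError    # this is what terminates the for-loop
            value = self.values[self.pos]
            if check('View', userid, self.classname, itemid=value):
                return HTMLItem(self.client, self.classname, value)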

Update: inspired by Bob's comment, I rewrote it as a generator (my second ever ;)

def multilinkGenerator(classname, client, values):
    check = client.db.security.hasPermission
    userid = client.userid
    for value in values:
        # only yield items the current user is permitted to view
        if check('View', userid, classname, itemid=value):
            yield HTMLItem(client, classname, value)

I'm going to have so much fun playing with Python 2.3+ stuff* :)

*: Roundup's been holding me back ... until today's 0.8 release the minimum requirement was Python 2.1.

Wed, 09 Feb 2005
Whee - PyCon flight & accommodation booked

I'll be staying at the Best Western in Arlington, a Metro stop away from the Con (and Sprints). Anyone else staying there?

category: Python | permanent link
Recent reading / listening

I don't usually post about what I've been reading, but there's been a mixed bag recently, and some of it was inspired by other weblogs' posts.

Fables Vol. 1: Legends in Exile (Bill Willingham)
I'm not as enamored of this as some other people are. I found the dialogue quite unnatural, even jarring in places. Also, I felt the overall story wasted the opportunities at hand.
Y: The Last Man Vol. 1: Unmanned (Brian K. Vaughan, Pia Guerra, Jose Marzan Jr.)
Wow. I read this on the train coming in to work and then again on the way home. I'll be reading it again too. Excellent use of the medium. Great artwork, great dialogue. I can't wait to read more.
Going Postal & The Gods Trilogy (Terry Pratchett)
I hadn't read Pratchett for ages until this Christmas when my brother bought me Going Postal. I really liked it. In the post-Christmas sales Rachel picked up the Gods Trilogy. Perfect reading material for just picking up and reading a chapter before nodding off at night.

Finally, Rachel got me '64-'95 by Lemon Jelly yesterday, and I love it :)

Update: I'm such a geek - Rachel corrected my spelling of 'dialog' to 'dialogue'.

category: Noise | permanent link
Tue, 08 Feb 2005
Thank you

Thank you, Justus Pendleton. I don't believe we've conversed, so I can't be sure how else to contact you. You know who you are though :)

category: Python | permanent link
Kids these days

Me: Waiting for my coffee this morning, I saw a couple of guys sitting at one of the café tables with a spread of mobile phone contracts on the table - guy A obviously selling guy B some wonderful new 3G phone.

C*: In my day, the kids were ruining their lives over drugs and alcohol. Now it's mobiles.

G: Yeah, now we've got all these fresh-faced, clean kids in debt to their mobile phones instead.

C: So now instead of seeing the drug dealers at the train station, it's mobile phone dealers.

*: it should be pointed out that C is originally from Sydney :)

category: Noise | permanent link
pyblagg re-blat solved

I finally figured out why pyblagg was being blatted* by some people's weblogs even at this stage. I've been running it in test mode for a little over a week, and figured that all the undated feeds had been fetched by now. Well, they probably have all been fetched. The catch (until this morning) was that when I fetched an undated feed, I assigned the fetch date to the feed's new entries so they'd be reasonably sorted with the other entries which are dated. Unfortunately, I also cleaned out the entries database, removing entries older than a week. Therefore those undated feeds would get re-fetched, and thus re-blat the pyblagg page, once a week. Sigh. Undated feeds. So now I only delete the old feed entries if there are > 200 (arbitrary number) entries for the feed. Hopefully that'll reduce or eliminate the blatting from undated feeds.

*: "blatting" in this instance == all of a given feed's entries show up in a block in the page, even ancient ones.

category: Python | permanent link
Mon, 07 Feb 2005
Still working out the kinks in pyblagg

From the recent updates, it looks like there are still a few undated blogs it hadn't managed to fetch from until now. Sigh. Undated entries mess up an aggregator like this when their blog is first fetched. It'll calm down though.

category: Python | permanent link
Fri, 04 Feb 2005
New pyblagg generator running...

The new, improved pyblagg is up and running. If this post appears on it, in a bit over an hour, then it's working :)

The new parser:

  • handles many more feeds,
  • has better handling of broken feeds (data does leak, script doesn't die, feed is marked broken),
  • uses the latest feedparser which supports many more feed formats, and automatically handles if-modified-since,
  • has an automated scraper to handle new / modified / deleted feeds listed on the wiki (runs once a week and tells me what it's done, so that people spamming the wiki page will be ineffective),
  • uses an SQLite database to store state, and
  • is just generally much neater code than my old massively-hacked script :)

Feeds that don't support if-modified-since are listed under "no-update*" since they're fetched every time but yield no new entries.
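
For the curious, the if-modified-since handling boils down to handing the previous fetch's validators back to feedparser. A rough sketch (the state-handling names are made up; this isn't pyblagg's actual code):

import feedparser

def poll(url, state):
    # 'state' holds whatever we stashed from the last fetch of this feed
    d = feedparser.parse(url, etag=state.get('etag'),
                         modified=state.get('modified'))
    if d.get('status') == 304:
        return []                   # not modified: the "no-update" case
    # remember the validators so the next poll can be conditional
    state['etag'] = d.get('etag')
    state['modified'] = d.get('modified')
    return d.entries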

Wed, 02 Feb 2005
Sometimes it works, sometimes it doesn't

The booklet for the first conference's 245 presentations was exported, copy-edited, fed back into the online system and typeset in two days, and it's off to the printers tomorrow. On the other hand, Zope-onna-OSX-laptop (mark 2) is floundering, big time. Bob Ippolito has been amazing with his support, but I'm really fighting to make Zope work in a py2app world. I need some sleep now, via watching some classic season 2 Buffy.

category: Noise | permanent link