Kamaelia 1.0.12.0 Released

December 28, 2010 at 06:43 PM | categories: python, oldblog | View Comments

I'm happy to announce Kamaelia's 4th release of 2010: 1.0.12.0 (Y.Y.M.r). Kamaelia is a component system based around unix-like concurrency/composition & pipelining. There's a strong focus on networked multimedia systems.

Kamaelia's license changed earlier this year to the Apache 2.0 License.

The release is divided up as follows:
  • Axon - the core component framework. Provides safe and secure message based concurrency & composition using generators as limited co-routines, threads, experimental process based support, and software transactional memory. Includes examples.

  • Kamaelia - A large Ol' Bucket of components, both application specific and generic. Components vary from network systems, through digital tv, graphics, visualisation, data processing etc. These reflect the work and systems that Kamaelia has been used to build. Includes examples.

  • Apps - A collection of applications built using Kamaelia. Whilst Kamaelia includes a collection of examples, these are either releases of internal apps or exemplars created by contributors.

  • Bindings - a collection of bindings we maintain as part of Kamaelia, including things like DVB bindings. (Bindings recently changed over to using Cython to make life simpler)
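
As a quick flavour of what the component model looks like in use, here's a minimal sketch of a small pipeline. The two components are written here purely for illustration (they aren't shipped in the release), and whilst Pipeline and ConsoleEchoer are standard Kamaelia pieces, treat the exact import paths as indicative rather than canonical:

import Axon
from Kamaelia.Chassis.Pipeline import Pipeline
from Kamaelia.Util.Console import ConsoleEchoer   # import path assumed

class Source(Axon.Component.component):
    # Pushes a few messages out of its outbox "outbox", then finishes.
    def main(self):
        for msg in ["olleh", "dlrow"]:
            self.send(msg, "outbox")
            yield 1

class Reverser(Axon.Component.component):
    # Takes messages from its inbox "inbox", sends reversed copies onwards.
    def main(self):
        while 1:
            for msg in self.Inbox("inbox"):
                self.send(msg[::-1] + "\n", "outbox")
            if not self.anyReady():
                self.pause()
            yield 1

# Wire them together; runs until interrupted (no shutdown handling in this sketch).
Pipeline(Source(), Reverser(), ConsoleEchoer()).run()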

Website:
    http://www.kamaelia.org/Home.html

Source:
    http://code.google.com/p/kamaelia

Tutorial:
    http://www.kamaelia.org/PragmaticConcurrency.html

Detail of changes:
    http://groups.google.com/group/kamaelia/browse_frm/thread/db45646ce1790233

Download:
    http://www.kamaelia.org/release/MonthlyReleases/Kamaelia-1.0.12.0.tar.gz

Overview of Changes in this release:
  • This rolls up (primarily) 3 application and examples branches. The core functionality for these, as ever, is in the main Kamaelia.Apps namespace, meaning these applications and examples are designed to be included in, or extracted into, other applications relatively easily. As a result they act as exemplars for things like 3D visualisation, video and audio communications, twitter mining, database interaction and analysis, and django integration. They're also useful (and used) as standalone apps in their own right.
  • Examples (and application components) added for using the 3D graph visualisation (PyOpenGL based) - one based on visualising collaborations, another based on viewed FOAF networks.
  • Whiteboard application extended such that:
    • It supports multiway video comms as well as multiway audio comms.
    • Adds support for "decks" (collections of slides which can be downloaded, saved, loaded, emailed, encrypted, etc)
    • Removes pymedia dependency
    • Changes audio over to use PyAlsaAudio directly.
    • Adds support for calibrated touch screen displays to Pygame Display.
      • For example large digital whiteboards in addition to tablets etc.
  • Adds in a "Social Bookmarking system" that does the following:
    • Harvests a semantic web/RDF data store for realtime search terms (relating to live television broadcast)
    • Uses these search terms to search twitter, to identify conversations around the semantic web data.
    • Takes the resulting tweets, and stores them in a DB
    • Analyses the tweets (including fixing language for analysis using NLTK) for a variety of aspects, storing these in the DB
    • Presents the results (graphs of buzz/popularity around the content)
    • Additionally the system attempts to identify particularly interesting/notable moments based on audience conversations, and provides links back to the actual broadcast programmes.
    • Additionally provides an API for data, generates word clouds etc.
    • Front end uses Django and web graph APIs to present data.

Mailing list:
    http://groups.google.com/group/kamaelia

Have fun :-)

Read and Post Comments

Europython 2010 Videos Now Online

August 14, 2010 at 01:55 PM | categories: python, oldblog | View Comments

Just a brief note to say that all the Europython videos I and helpers recorded are now uploaded and online on blip.tv. Since not everyone is subscribed to mailing lists, please find below the list/summary that I sent to the mailing list. If anyone has any objections to their talk being up, let me know and I will take it down. However, please note it's there because it's great that you were willing to stand up and talk!

Also, the real thanks have to go to John Pinner, Richard Taylor and Alex Wilmer (and the rest of the crew), without whom this year's Europython wouldn't have been the same. Likewise the same can be said about the many speakers willing to step forward and talk about something they love. And many thanks to Marijn, Richard and Walter too for their fantastic help in producing these videos :-)
Europython Community -- Awards Ceremony -- Thankyou Christian Tismer
http://europythonvideos.blip.tv/file/3980661/

Adewale Oshineye TDD on App Engine
http://europythonvideos.blip.tv/file/3980732/

Ali Afshar Glashammer
http://europythonvideos.blip.tv/file/3980733/

Andrew Godwin Fun with Django and Databases
http://europythonvideos.blip.tv/file/3980734/

Austin Bingham Python from the Inside Out
http://europythonvideos.blip.tv/file/3980760/

Bart Demeulenaere Pyradiso
http://europythonvideos.blip.tv/file/3980758/

Bruce Lawson Keynote Open Standards democratising and future proofing the web
http://europythonvideos.blip.tv/file/3980759/

Conference Opening Comments Housekeeping
http://europythonvideos.blip.tv/file/3980755/

Daniel Roseman Advanced Django ORM techniques
http://europythonvideos.blip.tv/file/3980796/

David Read Open Data and coding data.gov.uk
http://europythonvideos.blip.tv/file/3980804/

Denis Bilenko gevent network library
http://europythonvideos.blip.tv/file/3980788/

Donald McCarthy Python and EDA
http://europythonvideos.blip.tv/file/3980791/

Europython Community Awards Ceremony 4 Thanking Christian DryRun
http://europythonvideos.blip.tv/file/3980789/

Geoffrey Bache PyUseCase Testing Python GUIs
http://europythonvideos.blip.tv/file/3980793/

Guido van Rossum Appstats
http://europythonvideos.blip.tv/file/3980802/

Guido van Rossum Keynote
http://europythonvideos.blip.tv/file/3980864/

Henrik Vendelbo Real Time Websites with Python
http://europythonvideos.blip.tv/file/3980849/

Holger Kregel py.test rapid multipurpose testing
http://europythonvideos.blip.tv/file/3980855/

Jonathan Fine JavaScript 4 Pythonistas
http://europythonvideos.blip.tv/file/3980863/

Jonathan Fine JavaScript and MillerColumns
http://europythonvideos.blip.tv/file/3980845/

Jonathan Hartley Hobbyist OpenGL from Python
http://europythonvideos.blip.tv/file/3980846/

Kit Blake Mobi a mobile user agent lib
http://europythonvideos.blip.tv/file/3980862/

Kit Blake Web mobile templating in Silva
http://europythonvideos.blip.tv/file/3980853/

Lennart Regebro Porting to Python 3
http://europythonvideos.blip.tv/file/3980935/

Marc-Andre Lemburg Running Ghana VAT on Python
http://europythonvideos.blip.tv/file/3980937/

Mark Fink Visualizing Software Quality
http://europythonvideos.blip.tv/file/3980933/

Mark Shannon HotPy A comparison
http://europythonvideos.blip.tv/file/3980963/

Matteo Malosio Python Arduino and Mech Music
http://europythonvideos.blip.tv/file/3980961/

Michael Brunton-Spall Open Platform The Guardian API 1 Year On
http://europythonvideos.blip.tv/file/3980959/

Michael Brunton-Spall The Guardian and Appengine
http://europythonvideos.blip.tv/file/3980962/

Michael Foord Unittest New And Improved Part 1 of 2
http://europythonvideos.blip.tv/file/3980985/

Michael Foord Unittest New And Improved Part 2 of 2
http://europythonvideos.blip.tv/file/3980990/

Michael Sparks Arduino and Python
http://europythonvideos.blip.tv/file/3980998/

Nicholas Tollervey Organise a Python code dojo
http://europythonvideos.blip.tv/file/3980996/

Nicholas Tollervey Understanding FluidDB
http://europythonvideos.blip.tv/file/3980986/

Paul Boddie et al Web SIG
http://europythonvideos.blip.tv/file/3981027/

Paul Boddie Web Collaboration and Python
http://europythonvideos.blip.tv/file/3981026/

Peter Howard Aerodynamics and Pianos
http://europythonvideos.blip.tv/file/3981030/

PyPy Status and News Part Overview 1 of 3
http://europythonvideos.blip.tv/file/3981017/

PyPy Status and News Part JIT Compilation 2 of 3
http://europythonvideos.blip.tv/file/3981028/

PyPy Status and News Part cpyext 3 of 3
http://europythonvideos.blip.tv/file/4000720/

Raymond Hettinger Code Clinic 1 of 3
http://europythonvideos.blip.tv/file/4000722/

Raymond Hettinger Code Clinic 2 of 3
http://europythonvideos.blip.tv/file/4000757/

Raymond Hettinger Code Clinic 3 of 3
http://europythonvideos.blip.tv/file/4000761/

Raymond Hettinger Tips and Tricks 1 of 2
http://europythonvideos.blip.tv/file/4000758/

Raymond Hettinger Tips and Tricks 2 of 2
http://europythonvideos.blip.tv/file/4000752/

Richard Barrett Small The Trojan Snake
http://europythonvideos.blip.tv/file/4000778/

Richard Jones Keynote State of Python
http://europythonvideos.blip.tv/file/4000785/

Rob Collins Introduction to SMTPP
http://europythonvideos.blip.tv/file/4000771/

Russel Winder Keynote
http://europythonvideos.blip.tv/file/4000777/

Scott Wilson Flatland Form Processing
http://europythonvideos.blip.tv/file/4000782/

Semen Trygubenko Python and Machine Learning
http://europythonvideos.blip.tv/file/4000783/

Soeren Sonnenburg SHOGUN
http://europythonvideos.blip.tv/file/4000805/

Stefan Schwazer Robust Python Programs
http://europythonvideos.blip.tv/file/4000817/

Steve Holden Awards Ceremony 1 shaky
http://europythonvideos.blip.tv/file/4000798/

Steve Holden Awards Ceremony 2 PSF Community Service Award
http://europythonvideos.blip.tv/file/4000800/

Steve Holden Awards Ceremony 3 Frank Willison Award
http://europythonvideos.blip.tv/file/4000802/

Tomasz Walen Grzegorz Jakacki Codility Testing coders
http://europythonvideos.blip.tv/file/4000812/

Wesley Chun Programming Office with Python
http://europythonvideos.blip.tv/file/4000816/

Zeth What does it all mean
http://europythonvideos.blip.tv/file/4000813/

Lightning Talks:
A better pdb
http://europythonvideos.blip.tv/file/3980837/

Albertas upicasa
http://europythonvideos.blip.tv/file/3980891/

Andreas Klockner PyCUDA
http://europythonvideos.blip.tv/file/3980893/

Ariel Ben Yehuda cfg.parser
http://europythonvideos.blip.tv/file/3980900/

BidForFix.com
http://europythonvideos.blip.tv/file/3980882/

Brett Cannon How to properly package your apps front end code
http://europythonvideos.blip.tv/file/3980888/

Brian Brazil Pycon Ireland
http://europythonvideos.blip.tv/file/3980870/

Care Team Network
http://europythonvideos.blip.tv/file/3980871/

Conference Close
http://europythonvideos.blip.tv/file/3980889/

Ed Crewe From Shell Scripting to Config Management
http://europythonvideos.blip.tv/file/3980890/

Experiences from Python Barcamp Cologne
http://europythonvideos.blip.tv/file/3980898/

Fiona Burrows Write More Games
http://europythonvideos.blip.tv/file/3980894/

Headroid Arduino Robot Face
http://europythonvideos.blip.tv/file/3980892/

Heres What I Think hwit.org
http://europythonvideos.blip.tv/file/3980902/

Jonathan Fine The easiest quiz in the world
http://europythonvideos.blip.tv/file/3980886/

Jonathan Hartley Run Snake Run
http://europythonvideos.blip.tv/file/3980897/

Laurens Van Houtven Python + E == Mont-E
http://europythonvideos.blip.tv/file/3980872/

Luke Leighton A Cry For Help
http://europythonvideos.blip.tv/file/3980885/

Luke Leighton Pyjamas
http://europythonvideos.blip.tv/file/3980869/

Magic Folder File Syncing
http://europythonvideos.blip.tv/file/3980881/

Marc-Andre Lemburg Growing the PSF
http://europythonvideos.blip.tv/file/3980884/

Martijn Faassen How to Fail at Pyweek
http://europythonvideos.blip.tv/file/3980879/

Michael Brunton Spall Python Javascript and Ruby in half an hour
http://europythonvideos.blip.tv/file/3980877/

Michael Sparks Embracing Concurrency
http://europythonvideos.blip.tv/file/3980901/

Moin Moin 2.0
http://europythonvideos.blip.tv/file/3980887/

Monstrum and Mercurial For Legions
http://europythonvideos.blip.tv/file/3980880/

Plone Conference
http://europythonvideos.blip.tv/file/3980895/

Porting Skynet to Python 3
http://europythonvideos.blip.tv/file/3980876/

Pure Python Proxying httplib and urllib2
http://europythonvideos.blip.tv/file/3980874/

Python Status Information 1 of 2
http://europythonvideos.blip.tv/file/3980883/

Python Status Information 2 of 2
http://europythonvideos.blip.tv/file/3980896/

Richard Jones PyWeek
http://europythonvideos.blip.tv/file/3980873/

Richard Jones The Cheese Shop
http://europythonvideos.blip.tv/file/3980867/

Richard Taylor
http://europythonvideos.blip.tv/file/3980899/

Sarah Mount Open Ihm Richard Jones With Gui
http://europythonvideos.blip.tv/file/3980922/

Steve Holden PSF
http://europythonvideos.blip.tv/file/3980917/

ThePythonGameBook.com
http://europythonvideos.blip.tv/file/3980919/

Unladen Swallow
http://europythonvideos.blip.tv/file/3980923/

Zero 14
http://europythonvideos.blip.tv/file/3980925/

Other (aborted) lightning talks:
20100722-2LT-10
http://europythonvideos.blip.tv/file/3980847/

20100722-2LT-5
http://europythonvideos.blip.tv/file/3980850/

20100722-2LT-6
http://europythonvideos.blip.tv/file/3980851/

20100722-LT-11
http://europythonvideos.blip.tv/file/3980854/

Share and Enjoy :-)
Read and Post Comments

If you were 7 again...

June 27, 2010 at 06:28 PM | categories: python, oldblog | View Comments

If you were 7 again, what would you expect to find in a book on beginning programming? I have some thoughts on this, and I'm going to do this, but I'm curious about the thoughts of others.
Read and Post Comments

Python Magazine is dead ?

March 20, 2010 at 09:08 PM | categories: python, oldblog | View Comments

For the past several months the python magazine hasn't sent any new issues out. Indeed, since late last year they have ripped out their website and put up a banner saying "We're busy building a new python magazine", with a link laughably suggesting that there is more information available. This is after they got several months behind with the magazine last year. They also said "Don't worry—your subscription and back issues are safe and will be available when the new site launches." That's fine, in theory. However, consider:
  • Whilst they may let the grass grow under their feet, they haven't bothered telling their subscribers. Paid subscribers.
  • They haven't bothered updating their website telling their customers what they're doing.
  • Indeed, they appear, from a subscriber point of view, to have simply cut and run.
I can't actually think what excuse they can come up with that justifies not bothering to contact subscribers for well over 1/2 a year, but I'm sure they have one. On the flip side, they don't have any contact address on their front page, nor on their content-free "what we're doing" page. Beyond this, last year they decided, of their own volition, to charge my credit card to renew my subscription. Now, I was going to renew anyway - it's been a great magazine in the past - but charging my card without upfront consent struck me as rather dodgy.
Since they've now reneged on their half of the sale contract and not delivered, I actually have a good reason to need to get in contact with them - but I can't. This means I'm left with 2 choices:
  • Either put out a public notice in the hope that someone there will read it, and actually get back in contact to let me know how to contact them
  • Or contact Visa and say that they're a rogue trader, and that they should be banned from making any further transactions against my card (especially given the last one was done without my explicit consent).
Neither is particularly attractive, and hopefully someone knows how to get in contact with them because they sure aren't advertising any contact details right now.

Finally, I get that it's a small publication, one borne out of love rather than profit (at a guess, based on guesstimates of costs), but if you're having trouble getting things restarted, at least have the decency to tell your subscribers, rather than putting up content-free "information" pages.

After all, a lot can change in 1/2 a year... (Last issue I have is from August 2009...)

(Sorry to anyone who reads this who has nothing to do with the python magazine, but if you know someone there, please let me know who is "running" it these days)
Read and Post Comments

Kamaelia components from decorated generators. Pythonic concurrency?

October 04, 2009 at 10:52 PM | categories: python, oldblog | View Comments

A few months ago, there was a thread on the then google group python-concurrency about some standard forms for showing how some libraries deal with concurrent problems. The specific example chosen looked like this:
#!/bin/sh
tail -f /var/log/system.log |grep pants
Pete Fein also posted an example of this using generators, based on David Beazley's talk on python generators being used as (limited) coroutines:
    import time
    import re

    def follow(fname):
        f = file(fname)
        f.seek(0,2) # go to the end
        while True:
            l = f.readline()
            if not l: # no data
                time.sleep(.1)
            else:
                yield l

    def grep(lines, pattern):
        regex = re.compile(pattern)
        for l in lines:
            if regex.match(l):
                yield l

    def printer(lines):
        for l in lines:
            print l.strip()

    f = follow('/var/log/system.log')
    g = grep(f, ".*pants.*")
    p = printer(g)

    for i in p:
        pass

The question/challenge raised on the list was essentially "what does this look like in your framework or system?". For some reason, someone saw fit to move the mailing list from google groups, and delete the archives, so I can't point at the thread, but I did repost my answer for what was called "99 bottles" for kamaelia on the python wiki.

I quite liked the example for describing how to take this and convert it into a collection of kamaelia components, primarily because by doing so we gain a number of reusable components. For me it was about describing how to move from something rather ad-hoc to something somewhat more generally usable.

For me, the point about Kamaelia is really that it's a component framework aimed at making concurrent problems more tractable & maintainable. Basically so that I can get stuff done quicker, that won't need rewriting completely to use concurrency, which someone else can hack on without needing to come back to me to understand it. In practice though, this also means that I tend to focus on building stuff, rather than asking "is it concurrent?". (Axon kinda ensures that it either is, or is concurrent friendly) This does sometimes also mean I focus on getting the job done, rather than "does this look nice"... Whilst that does matter to me, I do have deadlines like the next person :-)

For example, one thing missing from the above is that when you do something like:
    tail -f /var/log/system.log |grep pants
You aren't interested in the fact this uses 3 processes - tail, grep & the parent process - but in the fact that by writing it like this you're able to solve a problem quickly and simply. It also isn't particularly pretty, though personally I view the shell version as rather elegant.

Naturally, being pleased with my version, I blogged about it. Much like anyone else, when I write something it seems like a good idea at the time :-). As sometimes happens, it made it onto reddit with some really nice & honest comments.

And what were those comments? If I had to summarise in one word "ugh!"

Whilst I don't aim for pretty (I aim for safe/correct :), pretty is nice, and pretty is fun. As a result, I've wanted to come back to this. Ugly is no fun :-(. Fun matters :-)

There was also a comment that suggested using decorators to achieve the same goal. However, at that point in time I had a mental block about what that would look like in this case. So I just thought "OK, I agree, can't quite see how to do it". I did recognise though that they're right to say that decorators would improve this case.

In particular, the stumbling block is that the way python generators are used in the above example is effectively one way chaining. printer pulls values from grep. grep pulls values from follow. When one of them exits, they all exit. Essentially this is pull based.

In Kamaelia, components can be push, pull or push & pull. Furthermore they can push and pull in as many directions as you need. At the time, mapping between the two sensibly didn't seem tractable to me. Then this morning, as I woke blearily, I realised the reason why. Essentially the above generator form isn't really directly the same as the shell form - though it is close.

Taking grep, for example, if I do this:
grep "foo" somefile
Then grep will open the file "somefile", read it, and output lines that match the pattern and exit.

However, if I do this:
bla | grep "foo"
Then grep will read values from stdin, and output lines which match the pattern. Furthermore, it will pause outputting values when bla stops pushing values into the chain, and exit when bla exits (after finishing processing stdin). ie It essentially has two modes of operating, based on getting a value or having an absent value.

In essence, the differences in what's happening here are subtle - in the shell we pass in a symbol which represents which stream needs opening, whereas in the example above we pass in, effectively, an open stream. Also the shell is very much a combination of push and pull, whereas the generator pipeline above is essentially pull.

This made me realise that rather than activating the generator we want to read from *outside* the generator we're piping into, if we activate the generator *inside* the generator we're piping into, the problem becomes tractable.

For example, if we change this:
def grep(lines, pattern):
    regex = re.compile(pattern)
    for l in lines: # Note this requires an activated generator, or another iterable
        if regex.match(l):
            yield l
To this:
def grep(lines, pattern):
    "To stop this generator, you need to call it's .throw() method. The wrapper could do this"
    regex = re.compile(pattern)
    while 1:
        for l in lines(): # Note we activate the generator here inside instead
            if regex.search(l):
                yield l
        yield

We gain something that can operate very much like the command line grep. That is, it reads from its equivalent of stdin until stdin is exhausted. To indicate that stdin is exhausted it simply yields - ie yields None. The caller can then go off and get more data to feed grep. Alternatively the caller can shut down this grep at any point in time by throwing in an exception.
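
To make that contract concrete, here's a small plain-Python sketch (independent of Kamaelia) of a caller driving the modified grep. The inbox deque and from_inbox helper are purely illustrative stand-ins for whatever the real wrapper would do:

import re
from collections import deque

def grep(lines, pattern):
    "To stop this generator, you need to call its .throw() or .close() method"
    regex = re.compile(pattern)
    while 1:
        for l in lines():            # activate the supplied source inside
            if regex.search(l):
                yield l
        yield                        # bare yield, ie None: input drained for now

inbox = deque()                      # hypothetical stand-in for stdin / an inbox

def from_inbox():
    while inbox:                     # yield whatever has arrived so far, then stop
        yield inbox.popleft()

g = grep(from_inbox, ".*pants.*")

inbox.extend(["cat\n", "pants on fire\n"])
for line in g:                       # drain grep until it signals "no more input"
    if line is None:
        break
    print line.strip()               # -> pants on fire

inbox.extend(["more pants\n"])       # caller fetches more data and feeds it in again
for line in g:
    if line is None:
        break
    print line.strip()               # -> more pants

g.close()                            # or g.throw(...) to shut it down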

Making this small transform allows the above example to be rewritten as kamaelia components like this:
import sys
import time
import re
import Axon
from Kamaelia.Chassis.Pipeline import Pipeline
from decorators import blockingProducer, TransformerGenComponent

@blockingProducer
def follow(fname):
    "To stop this generator, you need to call it's .throw() method. The wrapper could do this"
    f = file(fname)
    f.seek(0,2) # go to the end
    while True:
        l = f.readline()
        if not l: # no data
            time.sleep(.1)
        else:
            yield l

@TransformerGenComponent
def grep(lines, pattern):
    "To stop this generator, you need to call it's .throw() method"
    regex = re.compile(pattern)
    while 1:
        for l in lines():
            if regex.search(l):
                yield l
        yield

@TransformerGenComponent
def printer(lines):
    "To stop this generator, you need to call it's .throw() method"
    while 1:
        for line in lines():
            sys.stdout.write(line)
            sys.stdout.flush()
        yield

Pipeline(
    follow('/var/log/system.log'),
    grep(None, ".*pants.*"),
    printer(None)
).run()

The implementations of both decorators.py and example.py above can be found here:
http://code.google.com/p/kamaelia/source/browse/trunk/Sketches/MPS/AxonDecorators/
Similarly, if we wanted to use multiple processes, we could rewrite that final pipeline like this:
    from Axon.experimental.Process import ProcessPipeline

    ProcessPipeline(
        follow('/var/log/system.log'),
        grep(None, ".*pants.*"),
        printer(None)
    ).run()
Specifically the above will use 4 processes: one container process, and 3 subprocesses. (ProcessPipeline would benefit from a rewrite using multiprocessing rather than pprocess though)

The other nice thing about this approach: suppose you wanted to define your own generator source like this:
def source():
    for i in ["hello", "world", "game", "over"]:
        yield i
You could use that instead of "follow" above like this:
    Pipeline(
        grep(source, ".*pants.*"),
        printer(None)
    ).run()
For me, this has a certain symmetry with the change from this
tail somefile.txt | grep ".*pants.*" | cat -
to this:
grep ".*pants.*" source | cat -
ie if you pass in an absent value, it processes the standard inbox "inbox", rather than stdin. If you pass in a value, it's assumed to be a generator that needs activating.
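
For the curious, here's a rough sketch of the shape such a decorator could take. This is purely illustrative - it is not the actual decorators.py implementation linked above, and it glosses over proper shutdown propagation from upstream components:

import Axon

def TransformerGenComponent(genfunc):
    # Illustrative sketch only - not the real decorators.py.
    # genfunc(lines, *args) is a generator function which calls lines() to get
    # its input, yields results, and yields None when its input is drained.
    class Wrapped(Axon.Component.component):
        def __init__(self, source, *args):
            super(Wrapped, self).__init__()
            self.source = source     # None means "feed from the inbox 'inbox'"
            self.args = args

        def from_inbox(self):
            while self.dataReady("inbox"):
                yield self.recv("inbox")

        def main(self):
            if self.source is None:
                lines = self.from_inbox
            else:
                lines = self.source
            gen = genfunc(lines, *self.args)
            while not self.dataReady("control"):
                for item in gen:
                    if item is None:          # generator has drained its input
                        break
                    self.send(item, "outbox")
                if not self.anyReady():
                    self.pause()
                yield 1
            gen.close()                       # or gen.throw(...) to stop it
            self.send(self.recv("control"), "signal")
    return Wrapped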

Stepping back, and answering the "why? What does this give you?" question, it becomes more apparent as to why this might be useful when you start monitoring 5 log files at once for POST requests. For example, putting that all together in a single file would look like this:
(assuming you didn't reuse existing components :)
import sys
import time
import re
import Axon
from Kamaelia.Util.Backplane import Backplane, SubscribeTo, PublishTo
from Kamaelia.Chassis.Pipeline import Pipeline
from decorators import blockingProducer, TransformerGenComponent

@blockingProducer
def follow(fname):
    f = file(fname)
    f.seek(0,2) # go to the end
    while True:
        l = f.readline()
        if not l: # no data
            time.sleep(.1)
        else:
            yield l

@TransformerGenComponent
def grep(lines, pattern):
    regex = re.compile(pattern)
    while 1:
        for l in lines():
            if regex.search(l):
                yield l
        yield

@TransformerGenComponent
def printer(lines):
    while 1:
        for line in lines():
            sys.stdout.write(line)
            sys.stdout.flush()
        yield

Backplane("RESULTS").activate()

for logfile in ["com.example.1", "com.example.2", "com.example.3","com.example.4","com.example.5"]:
    Pipeline(
        follow(logfile+"-access.log"),
        grep(None, "POST"),
        PublishTo("RESULTS")
    ).activate()

Pipeline(
    SubscribeTo("RESULTS"),
    printer(None)
).run()
Now, I don't particularly like the word pythonic - maybe it is, maybe it isn't - but hopefully this example does look better than last time! The biggest area needing work, from my perspective, in this example is the names of the decorators.

Since this will be going into the next release of Axon - any feedback - especially on naming - would be welcome :-).

(Incidentally, follow/grep have already been added to kamaelia, so this would really be simpler, but it does make an interesting example IMO :-)
Read and Post Comments

Restarting Python Northwest. 24th Sept

September 14, 2009 at 11:36 PM | categories: python, oldblog | View Comments

A few people will have already noticed some small comments about this, but we're plotting to restart python northwest. Specifically, we're restarting this month.

Details:
  • When: Thursday 24th September, 6pm
  • Who: Who can come? If you're reading this YOU can (assuming you're sufficiently close :-)
    More specifically anyone from beginners, the inexperienced through deeply experienced and all the way back to the plain py-curious.
  • What: The suggestion is to start off with a social meet, and chat about stuff we've found interesting/useful/fun with python recently. Topics likely to include robots and audio generation, the recent unconference, and europython.

How did this happen? I tweeted the idea, a couple of others seconded it, then David Jones pointed out that "it's easier to arrange for a specific 2 people to meet than it is to e-mail a vague cloud of people and get _any_ 2 to meet anywhere", so that's where we'll be.

If twitter feedback is anything to go by, we're hardly going to be alone, so please come along - the more the merrier :-) Better yet, please reply to this post saying you're coming along!

More generally, assuming this continues, pynw will probably be every third Thursday in the month, maybe alternating between technical meets and social ones. (probably a topic for discussion :-)

Please forward this to anyone you think may be interested!

See you there!

Read and Post Comments

Traffic Server to be Open Source?!

July 07, 2009 at 12:02 PM | categories: python, oldblog | View Comments

If this happens it will be awesome. Traffic Server is some really nice code. It's a large codebase, but it's really cool, and it *scales*. (I used to work at Inktomi, so have been inside the code as well.) For those that don't know what it is, it's a very high performance web caching proxy, with a plugin architecture allowing for the addition of other protocols. It used to support HTTP (& obvious friends), NNTP, RTSP, RTP, WMV, etc.

That's pretty much made my day that has.
Read and Post Comments

Europython Videos Transcoding

July 07, 2009 at 01:02 AM | categories: python, oldblog | View Comments

Since I've had a few questions about this, a short status update. At Europython last week I was recording all the talks I was attending. Including the lightning talks this means I have video from 55 talks. The video files from the camera are too large for blip.tv, so I'm transcoding them down to a smaller size, before uploading them. Since these 55 talks are spread over nearly 80 files, that naturally takes time.

Fortunately/obviously, I'm automating this, and it'll come as no shock to some that I'm automating it using kamaelia. This automation needs to be stoppable, since for practicality reasons I need to only do this overnight.

Anyway, for those curious, this is the code I'm using to do the transcode & upload. You'll note that it saturates my CPU, keeping both cores busy. Also, it interleaves an IO bound process (the ftp upload) with a CPU bound one (the transcode).

import os
import re
import Axon

from Kamaelia.Chassis.Graphline import Graphline
from Kamaelia.Chassis.Pipeline import Pipeline

class Find(Axon.Component.component):
    path = "."
    walktype = "a"
    act_like_find = True
    def find(self, path = ".", walktype="a"):
        if walktype == "a":
            addfiles = True
            adddirs = True
        elif walktype == "f":
            addfiles = True
            adddirs = False
        elif walktype == "d":
            adddirs = True
            addfiles = False

        deque = []
        deque.insert(0,  (os.path.join(path,x) for x in os.listdir(path)) )
        while len(deque)>0:
            try:
                fullentry = deque[0].next()
                if os.path.isdir(fullentry):
                    if adddirs:
                        yield fullentry
                    try:
                        X= [os.path.join(fullentry,x) for x in os.listdir(fullentry)]
                        deque.insert(0, iter(X))
                    except OSError:
                        if not self.act_like_find:
                            raise
                elif os.path.isfile(fullentry):
                    if addfiles:
                        yield fullentry
            except StopIteration:
                deque.pop(0)

    def main(self):
        gotShutdown = False
        for e in self.find(path = self.path, walktype=self.walktype):
            self.send(e, "outbox")
            yield 1
            if self.dataReady("control"):
                gotShutdown = True
                break

        if not gotShutdown:
            self.send(Axon.Ipc.producerFinished(), "signal")
        else:
            self.send(self.recv("control"), "signal")

class Sort(Axon.Component.component):
    def main(self):
        dataset = []
        while 1:
            for i in self.Inbox("inbox"):
                dataset.append(i)
            if self.dataReady("control"):
                break
            self.pause()
            yield 1
        dataset.sort()
        for i in dataset:
            self.send(i, "outbox")
            yield 1
        self.send(self.recv("control"), "signal")

class Grep(Axon.Component.component):
    pattern = "."
    invert = False
    def main(self):
        match = re.compile(self.pattern)
        while 1:
            for i in self.Inbox("inbox"):
                if match.search(i):
                    if not self.invert:
                        self.send(i, "outbox")
                else:
                    if self.invert:
                        self.send(i, "outbox")
            if self.dataReady("control"):
                break
            self.pause()
            yield 1
        self.send(self.recv("control"), "signal")

class TwoWayBalancer(Axon.Component.component):
    Outboxes=["outbox1", "outbox2", "signal1","signal2"]
    def main(self):
        c = 0
        while 1:
            yield 1
            for job in self.Inbox("inbox"):
                if c == 0:
                    dest = "outbox1"
                else:
                    dest = "outbox2"
                c = (c + 1) % 2

                self.send(job, dest)
                job = None
            if not self.anyReady():
                self.pause()
            if self.dataReady("control"):
                break
        R=self.recv("control")
        self.send(R, "signal1")
        self.send(R, "signal2")


class Transcoder(Axon.ThreadedComponent.threadedcomponent):
    command = 'ffmpeg >transcode.log 2>&1 -i "%(SOURCEFILE)s" -s 640x360 -vcodec mpeg4 -acodec copy -vb 1500000 %(ENCODINGNAME)s'
    def main(self):
        while 1:
            for sourcefile in self.Inbox("inbox"):
                shortname = os.path.basename(sourcefile)
                encoding_name = shortname.replace(".mp4", ".avi")
                finalname = sourcefile.replace(".mp4", ".avi")
                # Do the actual transcode
                print "TRANSCODING", sourcefile, encoding_name
                os.system( self.command % {"SOURCEFILE": sourcefile, "ENCODINGNAME":encoding_name})

                # file is transcoded, move to done
                print "MOVING DONE FILE", sourcefile, os.path.join("done", sourcefile)
                os.rename(sourcefile, os.path.join("done", sourcefile))

                # Move encoded version to upload queue
                upload_name = os.path.join( "to_upload", encoding_name)
                print "MOVING TO UPLOAD QUEUE", encoding_name, upload_name
                os.rename(encoding_name, upload_name )

                # And tell the encoder to upload it please
                print "SETTING OFF UPLOAD",upload_name, finalname
                self.send( (upload_name, finalname), "outbox")
                print "-----------------"
            if self.dataReady("control"):
                break
        self.send(self.recv("control"), "signal")

class Uploader(Axon.ThreadedComponent.threadedcomponent):
    command = "ftpput --server=%(HOSTNAME)s --verbose --user=%(USERNAME)s --pass=%(PASSWORD)s --binary --passive %(UPLOADFILE)s"
    username = "< edited :-) >"
    password = "< edited :-) >"
    hostname = "ftp.blip.tv"
    def main(self):
        while 1:
            for (upload_name, finalname) in self.Inbox("inbox"):
                print "UPLOADING", upload_name
                os.system( self.command % {
                                        "HOSTNAME":self.hostname,
                                        "USERNAME":self.username,
                                        "PASSWORD":self.password,
                                        "UPLOADFILE":upload_name,
                                     } )
                print "MOVING", upload_name, "TO", os.path.join("encoded", finalname)
                os.rename(upload_name, os.path.join("encoded", finalname))
                print "-----------------"

            if self.dataReady("control"):
                break
            if not self.anyReady():
                self.pause()
        self.send(self.recv("control"), "signal")

Graphline(
    FILES = Pipeline(
                Find(path=".",walktype="f"),
                Sort(),
                Grep(pattern="(done|encoded|unsorted|transcode.log|to_upload)",
                     invert = True),
            ),
    SPLIT = TwoWayBalancer(), # Would probably be nicer as a customised PAR chassis
    CONSUME1 = Pipeline(
                    Transcoder(),
                    Uploader(),
               ),
    CONSUME2 = Pipeline(
                    Transcoder(),
                    Uploader(),
               ),
    linkages = {
        ("FILES","outbox"):("SPLIT","inbox"),
        ("SPLIT","outbox1"):("CONSUME1","inbox"),
        ("SPLIT","outbox2"):("CONSUME2","inbox"),

        ("FILES","signal"):("SPLIT","control"),
        ("SPLIT","signal1"):("CONSUME1","control"),
        ("SPLIT","signal2"):("CONSUME2","control"),
    }
).run()

It should be fairly clear that this will go as fast as it can, so please be patient :-)


Read and Post Comments

Autoloading in python

June 21, 2009 at 04:14 PM | categories: python, oldblog | View Comments

Before I started using python, I'd used perl for several years, and one thing which I'd liked about perl was its autoload facility. Now in python the closest equivalent that I've seen is __getattr__ for classes, but not __getattr__ for a module. This seemed like a real shame since there are times when autoload can be incredibly useful.
If it seems chaotic, consider the Unix PATH variable. Any time you type a name, the shell looks in lots of locations and runs the first one it finds. That's effectively the same sort of idea as autoloading. Yes, you can do some really nasty magic if you want, but then you can do that with the shell too, and generally people get along fine.
Anyway, vaguely curious about it I decided to do some digging around, and came across this post by Leif K Brookes, which suggests this:
You could wrap it in an object, but that's a bit of a hack.

import sys

class Foo(object):
     def __init__(self, wrapped):
         self.wrapped = wrapped

     def __getattr__(self, name):
         try:
             return getattr(self.wrapped, name)
         except AttributeError:
             return 'default'

sys.modules[__name__] = Foo(sys.modules[__name__])

That looked reasonable, so I created a file mymod.py which looks like this:
import sys

def greet(greeting="Hello World"):
   print greeting

class mymod_proxy(object):
    def __init__(self):
        super(mymod_proxy, self).__init__()
        self.wrapped = sys.modules["mymod"]
    def __getattr__(self, name):
        try:
            return getattr(self.wrapped, name)
        except AttributeError:
            def f():
                greet(name)
            return f

sys.modules["mymod"] = mymod()
And tried using it like this:
~> python
Python 2.5.1 (r251:54863, Jan 10 2008, 18:01:57)
[GCC 4.2.1 (SUSE Linux)] on linux2
Type "help", "copyright", "credits" or "license" for more information.
>>> import mymod
>>> mymod.hello()
hello
>>> from mymod import Hello_World
>>> Hello_World()
Hello_World
And as you can see, it seems to work as expected/desired.

Now, the reason I'd been thinking about this is that I'd like to retain the hierarchy of components in Kamaelia that we have at the moment (it's useful for navigating what's where), but given we tend to use them in a similar way to Unix pipelines, it's natural to want to be able to do something like:
from Kamaelia import Pipeline, ConsoleReader, ConsoleWriter
Pipeline(
    ConsoleReader(),
    ConsoleWriter(),
).run()
Rather than the more verbose form specifically pulling them in from particular points. Likewise, we don't really want to import every single module in Kamaelia.py, because of the large number of components there (Kamaelia is really a toolbox IMO where things get wired together, and Axon is the tool for making new tools), the majority of which won't be used in every application!
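
As a sketch of what that could look like, the same sys.modules trick could back a flat Kamaelia namespace with a lazily consulted index of where each name really lives. This is hypothetical code, not anything in Kamaelia today - the index below only lists locations already used elsewhere on this page, and a real version would need to cover the whole component hierarchy:

# Hypothetical Kamaelia/__init__.py sketch
import sys

_index = {   # name -> module that actually provides it (illustrative entries only)
    "Pipeline":  "Kamaelia.Chassis.Pipeline",
    "Graphline": "Kamaelia.Chassis.Graphline",
    "Backplane": "Kamaelia.Util.Backplane",
}

class _LazyNamespace(object):
    def __init__(self, wrapped):
        self.wrapped = wrapped

    def __getattr__(self, name):
        try:
            return getattr(self.wrapped, name)
        except AttributeError:
            try:
                modname = _index[name]
            except KeyError:
                raise AttributeError(name)
            module = __import__(modname, {}, {}, [name])
            value = getattr(module, name)
            setattr(self.wrapped, name, value)   # cache so the lookup only happens once
            return value

sys.modules[__name__] = _LazyNamespace(sys.modules[__name__])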

Now, I haven't done this yet, and wouldn't do it lightly, but the fact that you can actually make autoload functionality work seems kinda cool, and a nice opportunity. But I'm also now wondering just how nasty this approach seems to people. After all, Leif describes it as "a bit of a hack", and whilst it's neat, I'm not just taking the positive view. I'm interested in any views on better ways of doing autoload in python, and also whether people view it as a nice thing at all. (One person's aesthetic is another person's pain after all...)
Read and Post Comments

Europython

June 10, 2009 at 05:35 PM | categories: python, oldblog | View Comments

I've mentioned this in a couple of places, but mentioning it on my blog seems appropriate too.

I'm giving a tutorial on Kamaelia at Europython '09 this year.

Europython details:
   Where: Birmingham UK
   When:
      Tutorial days 28/29th June.
      Main conference: 30th June - 2nd July
      Kamaelia specifically: 28th June, 9am
            http://www.europython.eu/talks/timetable/
   Cost:
       Tutorial days: £100
       Conference days: £190
   More info:
       http://www.europython.eu/
       http://www.europython.eu/talks/timetable/

Blurb for the Kamaelia tutorial:
Kamaelia: Pragmatic Concurrency

Tutorial, Half day (intermediate)

Why use concurrency? Since concurrency is viewed as an advanced topic by many developers, this question is often overlooked. However, many real world systems, including transportation, companies, electronics and Unix systems are highly concurrent and accessible by the majority of people. One motivation can be “many hands make light work” but in software this often appears to be false – in no small part due to the tools we use to create systems. Despite this, the need for concurrency often creeps into many systems.

Kamaelia is a toolset and mindset aimed at assisting in structuring your code such that you can focus on the problem you want to solve, but in a way that results in naturally reusable code that happens to be painlessly concurrent. It was designed originally to make maintenance of highly concurrent network systems simpler, but has general application in a wider variety of problem domains, including desktop applications, web backend systems (eg video transcode & SMS services), through to tools for teaching a child to read and write.

This tutorial will cover:
  • A fast overview in the style of a lightning talk.
  • Kamaelia's core – Axon – which provides the basic tools needed for concurrent systems, followed by a session on implementing your own core.
  • Practical walk throughs of real world Kamaelia systems to demonstrate how to build and design systems of your own.
  • More advanced concepts such as reusing centralised services and constrained data sharing, as well as ongoing open issues will be touched upon.
  • Tips, tricks and rules of thumb when working with concurrent systems.
This is a highly practical tutorial, where you will create your own version of Axon, your own components, and your first Kamaelia based system (bring a laptop!). The course expects a level of familiarity with Python, but no prior experience of concurrency is assumed.
The structure in terms of time is 2 x 1.5 hour sessions, with a 15 minute break in the middle - hopefully enough time to impart enough useful knowledge to help you get started with Kamaelia.

Also, if Kamaelia isn't interesting to you (sob :), Ali Afshar, who hangs out on Kamaelia's IRC channel, is also giving a tutorial there on PyGTK, along with lots of other people giving interesting tutorials and talks :-)
Read and Post Comments
