Tuesday, April 26, 2011

Accepted to GSoC

Yesterday the projects accepted to GSoC were announced. Among them are several interesting projects under the PSF umbrella, including mine.

During GSoC I will create a benchmark suite (based on existing ones) with "real world" benchmarks that can easily be used with every Python interpreter. Up until now each interpreter has more or less rolled its own suite of benchmarks of varying quality. This makes comparisons unnecessarily hard and ties up resources that would be better spent elsewhere.

Furthermore I will create an application that is able to download and build interpreters and execute the benchmarks with them, driven by a simple configuration. No such application exists so far; for example, http://speed.pypy.org compares released and current (from trunk) PyPy versions with released CPython versions. As nice as that is, being able to compare the most current versions of the various implementations is clearly preferable.
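To give a rough idea of what I mean by a simple configuration, here is a minimal sketch. The configuration format, interpreter paths and benchmark names below are purely hypothetical and not the final design; it only illustrates the general shape of such a runner:

    # Hypothetical sketch only: the configuration format and paths are not
    # final; the real application will likely look different.
    import subprocess

    # A "simple configuration": which interpreters to benchmark and which
    # benchmark scripts to run with each of them.
    CONFIG = {
        "interpreters": {
            "cpython": "/usr/bin/python",
            "pypy": "/opt/pypy/bin/pypy",
        },
        "benchmarks": ["richards", "django", "spambayes"],
    }

    def run_benchmarks(config):
        for interp_name, interp_path in config["interpreters"].items():
            for bench in config["benchmarks"]:
                # Assumes each benchmark is a standalone script that prints
                # its timings to stdout.
                proc = subprocess.Popen(
                    [interp_path, "benchmarks/%s.py" % bench],
                    stdout=subprocess.PIPE)
                out, _ = proc.communicate()
                print("%s %s: %s" % (interp_name, bench, out.strip()))

    if __name__ == "__main__":
        run_benchmarks(CONFIG)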

Once that work is completed I will port the benchmark suite to Python 3.x. Several benchmarks have dependencies that do not support 3.x yet, so I will not be able to port the entire suite, but it will at least be a start when it comes to benchmarks for 3.x.

I'm currently compiling a list with information on the available benchmarks (what each one tests and how) so that people unfamiliar with them can get an easy overview. Once that is finished I will send e-mails to the CPython, PyPy, IronPython, Jython and Cython mailing lists with the benchmarks I propose, asking for additional benchmarks or changes to my proposed list.

Further information on my project will be published here and on Twitter as soon as possible.

2 Comments:

  1. Hi there,

    I am working on a benchmarking package for Python which is available at https://bitbucket.org/tleeuwenburg/benchmarker.py and on the cheese shop. Naturally, I'm curious to learn as much as I can whenever I see anyone else working on benchmarking.

    The benchmarker.py package provides a decorator which you can wrap around a function, which will trigger the collection of profiling stats for that function call. At the end of the session, the stats object will get archived out onto a sensible directory layout for later use. There is also a codespeed integration module (non-functional but currently under active development) which will crawl the archived files and submit them to codespeed.
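    To illustrate the general idea, a minimal sketch of such a decorator could look roughly like the following (purely illustrative; this is not the actual benchmarker.py API, and the archive layout here is made up):

        # Illustrative sketch only -- not the benchmarker.py API.
        import cProfile
        import functools
        import os

        def profiled(archive_dir="profiles"):
            """Collect profiling stats for each call and archive them to disk."""
            def decorator(func):
                @functools.wraps(func)
                def wrapper(*args, **kwargs):
                    profiler = cProfile.Profile()
                    try:
                        return profiler.runcall(func, *args, **kwargs)
                    finally:
                        if not os.path.isdir(archive_dir):
                            os.makedirs(archive_dir)
                        profiler.dump_stats(
                            os.path.join(archive_dir, func.__name__ + ".prof"))
                return wrapper
            return decorator

        @profiled()
        def busy_work(n):
            return sum(i * i for i in range(n))

        if __name__ == "__main__":
            busy_work(100000)
            # The archived .prof files can later be inspected with pstats
            # or submitted to codespeed by a separate tool.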

    Is what you're doing available on a public repo somewhere? I'd be interested to take a look at your approach and understand what is going on there. It sounds like what you're doing is more advanced than I have a need for, but I'd still be interested to take a look.

    Cheers,
    -Tennessee
