Yesterday the projects accepted to GSoC were announced. Among them are several interesting projects under the PSF umbrella, including mine.
During GSoC I will create a benchmark suite (based on existing ones) with "real world" benchmarks which can easily be used with every Python interpreter. Up until now each interpreter has more or less rolled its own suite of benchmarks of varying quality. This makes comparisons unnecessarily hard and ties up resources that would be better used elsewhere.
Furthermore I will create an application which is able to download and build interpreters and execute the benchmarks with them, using a simple configuration. Up until now no such application exists: http://speed.pypy.org, for example, compares released and current (from trunk) PyPy versions with other released CPython versions. As nice as that is, being able to compare the most current versions of the various implementations is clearly preferable.
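To give a rough idea of what "a simple configuration" could mean, here is a purely hypothetical sketch: the names, keys, URLs and the `plan_runs` helper are all made up for illustration, since the actual format for this application has not been decided yet.

```python
# Hypothetical sketch of a benchmark-runner configuration.
# All names, keys and URLs are illustrative placeholders, not a real format.
interpreters = {
    "cpython-trunk": {"vcs": "svn", "url": "http://example.org/cpython"},
    "pypy-trunk": {"vcs": "hg", "url": "http://example.org/pypy"},
}

benchmarks = ["templates", "pickle", "startup"]

def plan_runs(interpreters, benchmarks):
    """Pair every configured interpreter with every benchmark."""
    return [(name, bench) for name in sorted(interpreters) for bench in benchmarks]

# Each pair would then mean: build that interpreter, run that benchmark with it.
runs = plan_runs(interpreters, benchmarks)
```

The point of such a configuration is that adding a new interpreter or benchmark should be a one-line change rather than a new ad-hoc script.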
Once that work is completed I will port the benchmark suite to Python 3.x. As several benchmarks have dependencies that do not support 3.x yet, I will not be able to port the entire suite; however, it will at least be a start when it comes to benchmarks for 3.x.
I'm currently compiling a list with information on the available benchmarks (what they test and how), so that people unfamiliar with them can get an easy overview. Once that is finished I will send e-mails to the CPython, PyPy, IronPython, Jython and Cython mailing lists with the benchmarks I propose, asking for other benchmarks or changes to my proposed list.
Further information on my project will be published here and on Twitter as soon as possible.