I have now updated my Mother library to work with Processing 2.0, and took the opportunity to also address the usability problems of the previous version. This one should be much easier to get started with!
If this is the first you're hearing of it, Mother is a standalone host for running multiple Processing sketches in parallel and mixing their output, in a manner analogous to VJing (more info and videos at www.onar3d.com/mother/).
I’d greatly appreciate it if you would give it a try and let me know how you get along, treating this release as a beta for now!
It works fine on Windows 7 and 8, both 32-bit and 64-bit, and on OS X.
Getting started instructions are at the top of the “Mother Documentation.pdf” included in the library zip file, along with more info on the library.
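To give a rough sense of what “mixing their output” means, here is a toy Processing sketch that composites two off-screen layers with additive blending, in the way a VJ mixer combines video channels. It is only an illustration of the idea and does not use Mother’s actual API; the two layers simply stand in for what two separate sketches might draw.

```java
// Toy illustration: two off-screen layers composited additively.
// Not Mother's API — just the general idea of mixing sketch output.
PGraphics layerA, layerB;

void setup() {
  size(640, 480, P2D);
  layerA = createGraphics(width, height, P2D);
  layerB = createGraphics(width, height, P2D);
}

void draw() {
  // Each layer stands in for one sketch drawing independently.
  layerA.beginDraw();
  layerA.background(0);
  layerA.fill(255, 0, 0);
  layerA.ellipse(width/2 + 100 * sin(frameCount * 0.05), height/2, 80, 80);
  layerA.endDraw();

  layerB.beginDraw();
  layerB.background(0);
  layerB.fill(0, 0, 255);
  layerB.rect(mouseX, mouseY, 60, 60);
  layerB.endDraw();

  // The "mixer" step: combine both layers additively into the final image.
  background(0);
  blendMode(ADD);
  image(layerA, 0, 0);
  image(layerB, 0, 0);
  blendMode(BLEND);
}
```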
Two new research papers I’ve been working on are now available to view, while a few more are at various stages of completion or publication and will also be linked from here once they’re publicly available. The new papers are:
Using Music as a Signal for Biofeedback
Ilias Bergstrom, Sofia Seinfeld, Jorge Arroyo-Palacios, Mel Slater, Maria V. Sanchez-Vives, International Journal of Psychophysiology, 2013
Drumming in Immersive Virtual Reality: The Body Shapes the Way We Play
Konstantina Kilteni, Ilias Bergstrom, Mel Slater, IEEE Transactions on Visualization and Computer Graphics, 2013
UCL has now posted my PhD thesis document for online download here.
The description of my PhD work is more up to date in this document than in the academic papers that preceded it, so if you have come here after reading my articles, you will most likely also find my thesis of interest!
I was invited to present a seminar on my PhD research at Queen Mary University of London’s Centre for Digital Music, on Monday the 8th of March, at 16:15. Entrance is free and open to all. For more information, including directions, please refer to this link.
A presentation of a novel, entirely custom-developed system for facilitating the live performance of Visual Music / abstract animation. The hypothesis is that by using musical instruments as the primary user interface for the performance, we may usefully re-map the embodied/enactive knowledge that musicians have of their instruments. Musicians may then perform live visual music, taking advantage of the expressivity their instruments afford them. For this work, a new control data mapping strategy, ‘Mutable Mapping’, had to be developed: it entails manually manipulating the mapping during a performance, gradually altering and re-routing digital control data.
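To make the notion of mutable mapping a little more concrete, here is a minimal Processing-style sketch in which control data arrives on named channels, and the routing from channels to visual parameters can be changed while the sketch is running. The channel names, parameter names and helper functions are hypothetical illustrations, not the actual implementation described in the thesis.

```java
// Minimal sketch of 'mutable mapping': routing from control channels to
// visual parameters is itself editable during the performance.
import java.util.HashMap;

HashMap<String, String> routing = new HashMap<String, String>();      // channel -> parameter
HashMap<String, Float> visualParams = new HashMap<String, Float>();   // parameter -> value

void setup() {
  size(400, 400);
  visualParams.put("size", 50.0f);
  visualParams.put("brightness", 255.0f);
  // Initial mapping: the instrument's pitch channel controls the shape's size.
  routing.put("pitch", "size");
}

// Called whenever a control value arrives from the instrument.
void onControl(String channel, float value) {
  String target = routing.get(channel);
  if (target != null) {
    visualParams.put(target, value);
  }
}

// Re-routing a channel mid-performance is the core of the idea:
// what the instrument controls is gradually altered as the piece unfolds.
void reroute(String channel, String newTarget) {
  routing.put(channel, newTarget);
}

void draw() {
  background(0);
  fill(visualParams.get("brightness"));
  ellipse(width/2, height/2, visualParams.get("size"), visualParams.get("size"));
}

void keyPressed() {
  if (key == 'c') {
    onControl("pitch", random(20, 200));   // simulate an incoming control value
  }
  if (key == 'r') {
    reroute("pitch", "brightness");        // pitch now drives brightness instead
  }
}
```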
In order to find out which one of two different visual music technologies audiences prefer, I have created an online survey for comparing them. Please take part by filling it out, and when I have enough responses I will post the results here!
The survey can be reached via the link below, and should not take more than 10-15 minutes to complete:
I have not disclosed many details yet, because that might influence the way you answer. Rest assured, I will post all the details once the survey is closed.
EDIT: I have now compiled the results and written up a research paper describing them, which has been submitted for review towards publication in a journal. As soon as the results have been published, I’ll also post a summary here!
0.6.1 is only a minor update, adding no new functionality. The only change was to re-compile the Foetus library so that it is also compatible with Java 1.5, as there were reports that on OS X even recent versions of Processing were running on JVM 1.5. If Mother 0.6 is running fine for you, there is no need to get 0.6.1.
My PhD supervisor Beau Lotto recently gave a TED talk in Oxford, which features two applications I have written.
The first is a live video-to-sound perceptual substitution program, followed by an application that uses an image as a score for a virtual 32-piece classical orchestra. Both are featured from 12:20 onwards, but of course the whole talk is well worth watching!
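For readers curious what video-to-sound substitution might look like in code, here is a toy Processing sketch that maps the average brightness of each webcam frame to the pitch of a sine tone, assuming the Video and Minim libraries are available. It is only a minimal illustration of the general concept, not the program shown in the talk.

```java
// Toy illustration of video-to-sound substitution: frame brightness -> pitch.
import processing.video.*;
import ddf.minim.*;
import ddf.minim.ugens.*;

Capture cam;
Minim minim;
AudioOutput out;
Oscil tone;

void setup() {
  size(320, 240);
  cam = new Capture(this, width, height);
  cam.start();
  minim = new Minim(this);
  out = minim.getLineOut();
  tone = new Oscil(440, 0.4f, Waves.SINE);
  tone.patch(out);
}

void draw() {
  if (cam.available()) {
    cam.read();
  }
  image(cam, 0, 0);

  // Average the brightness over the frame and map it to a pitch range.
  cam.loadPixels();
  float sum = 0;
  for (int i = 0; i < cam.pixels.length; i++) {
    sum += brightness(cam.pixels[i]);
  }
  if (cam.pixels.length > 0) {
    float avg = sum / cam.pixels.length;            // 0..255
    tone.setFrequency(map(avg, 0, 255, 110, 880));  // darker = lower pitch
  }
}
```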