A few people have expressed interest in checking out my MFA thesis on women in electronic instrument design. It's available open access on UMI ProQuest, so you can download and read it in full. I'm still sort of traumatized from the whole process of writing it, and it veers dangerously close to becoming a feminist rant at times (not sorry).
Anyway, here’s a LINK.
The development of optical sound recording in the early 20th century, for either independent playback or accompaniment with film, resulted in a number of experiments with synthetic and graphically generated sound. In Russia and the Soviet Union, various methods of graphical and ornamental sound proliferated from the late 1920s until the beginning of WWII. The ANS Synthesizer, conceived by Evgeny Murzin in Moscow in 1938, generated sound through the electronic translation of light passing through a hand-scored glass plate. In Germany, experiments with drawn and ornamental sound took place from the early 1930s onward.
Following WWII, Norman McLaren, with the support of the National Film Board of Canada, continued developing his graphical sound techniques to accompany hand-drawn animation. The Oramics Machine, designed by Daphne Oram in the UK and realized with the expertise of engineer Graham Wrench, used ten painted film strips positioned above an array of photoelectric cells to control various sound parameters. Experiments with optical sound-on-disc instruments and film sound-sync systems, such as the Optigan, the Welte Light-Tone, and the "Singing Keyboard," which applied the principles of optical sound recording and playback, are also relevant to this discourse.
The increasing availability of low-cost and open-source physical computing technology enables artists and creators to work elegantly with forms of data that were previously inaccessible or prohibitively complex, while also casting the artist or creator as technologist and engineer. Physical computing relies on microcontroller-based embedded systems to bring sensors and other physical data into interactive digital systems, expanding human-computer interfaces far beyond their original conventions. New media and interactive telecommunications curricula are being introduced at many levels of education and academia; microcontroller platforms like Arduino and single-board computers like Raspberry Pi have become familiar household names as the Maker Movement spreads through technology, art, and culture. This and related hardware continues to become smaller, more affordable, more powerful, and better documented.
Custom build for my friend Jenny, who plays as Gossimer. I'll try to dig up some video because this thing sounds kind of demonic. It's two cassette decks hacked together with a single loop running between them. You can record continuously onto the tape or simply play it back. You can also control the speed and direction of the motor with a wimpy little 555 PWM circuit I threw together (heat sink bobby pin). LM386 amps pick up radio signals REALLY well, so I spent a substantial amount of time thwarting those signals with various components strategically placed on different pins.
I received a commission to circuit bend a number of different things, including the Casio PT-80 shown. This particular Casio is less sought after for circuit bending: not many tone-bending or glitch points can be found, since the sound parameters are stored on a memory card. There are, however, a number of interesting percussion effects available with some exploration. I added three switches that each affect the percussion in a different way, plus a pot that allows a sort of filter sweep and another pot to rapidly alter the tempo. A fun circuit to bend, despite its limitations.
Hello friends! I have moved my site to a new host and am doing a bit of housekeeping. All that sweet sweet content–and technical documentation–will return ASAP.
Friday and Saturday went very well at Signal Flow. I spent the day on Friday going through my code to see if I could get things running more consistently and managed to figure out a workaround. The Timer library for Arduino ended up being the answer to my woes: I was able to send a signal to Processing at set intervals to keep the videos active when the board hadn't been touched for a while.
I also made some fliers to get people in the door, since the installation is so far removed from the music building and other Signal Flow festivities. L and I put them up all over campus and they definitely did the trick.
Lots of people came through and stayed awhile to stare at the weather balloon. I switched up the videos a lot yesterday to make sure it always stayed pretty fresh. My friend came to document everything and filmed for a good hour or so, so I'll have plenty of good material to work with. People were really interested in the balloon and the projections, and I had SuperCollider throwing out my binary sequences as well as a semi-randomized pattern for my retrigger signal from Arduino. I pointed the speakers at the walls on either side of the room and the sound spatialized nicely. Most of the frequencies I used were harmonically related, so they resonated with each other in the room and it became very immersive. It was interesting talking with people about the history of the Ouija board and Spiritualism, and about early electronic music and the Occult (some interesting connections there). A few groups used the Ouija board to ask some questions and had fun with the answers. One lady asked it when her granddaughter would be born, which I thought was sweet.
The new code worked so well that I was even able to leave everything unattended for a couple of hours to go watch everyone's performances! That was an exciting change, as I had mostly been babysitting the installation in case things froze up. The tricky part is that the capacitance threshold seems to be well calibrated to my body but not all bodies, so some people had a harder time triggering the letters. The fact that each letter has a different threshold doesn't help either. After this weekend I'm going to test each individual point and calibrate everything separately, as well as output an integer range so I can use the board as a MIDI controller with Max/MSP and Ableton.