Tags : programming, midi, music, code, coding, sciart, guitar, alda-tabs, english, open-source, software, ableton

Note
One of the main goals of the SciArt Lab is the open exploration of innovative ideas from a maker/hacker perspective, finding innovation through prototyping rather than relying on purely theoretical approaches. In that sense, we try to combine disruptive technologies and scientific knowledge with unconventional developments and real implementations/prototypes of our ideas. If you want to know more about the research of the SciArt Lab, check out our website.

What is this article about?

This article is an introduction to some of the projects developed by the SciArt Lab around topics related to digital music creation.

In this post I will summarize part of my hands-on experience at the intersection of DIY electronics, MIDI controllers, and the development of new tools (coded in Java, Groovy, Processing, and JavaScript) in combination with existing software such as Ableton Live.

This is an ongoing exploration, so follow us on Twitter to stay updated.

Music and digital creation

I can summarize the current projects of the SciArt Lab as a set of fun experiments.

Basically, we are hacking music through sound synthesis, MIDI experiments, DIY electronics, and algorithmic composition, combining experimental music with brand-new technologies: discovering how coding and music can be combined by prototyping domain-specific languages, generating self-composed songs with genetic algorithms, or rediscovering MIDI controllers to create audio art.

A. Genetic algorithms, mathematical compositions and generative music

We are exploring the potential of applying Artificial Intelligence techniques and software development to create programs that can write and play their own music.

Take a look at our first experiments by watching our videos of cellular automata with emergent properties for music composition.

Each cellular automaton is able to compose its own music based on simple rules, evolving while it plays synthetic instruments in Ableton Live or drives external devices through MIDI events.
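To give a flavor of the approach, here is a minimal, self-contained Java sketch of the idea (an illustration, not the actual code of our prototypes): a one-dimensional cellular automaton evolving under Rule 90, whose live cells trigger notes through the standard javax.sound.midi API. The rule and the pentatonic scale are arbitrary choices for the example.

import javax.sound.midi.*;

// Illustrative sketch: a 1D cellular automaton (Rule 90) whose live cells
// trigger MIDI notes on every generation.
public class AutomatonComposer {

    // C major pentatonic scale over two octaves (an arbitrary choice).
    static final int[] SCALE = {60, 62, 64, 67, 69, 72, 74, 76};

    public static void main(String[] args) throws Exception {
        Receiver out = MidiSystem.getReceiver(); // default MIDI device/synth
        boolean[] cells = new boolean[SCALE.length];
        cells[SCALE.length / 2] = true;          // seed: a single live cell

        for (int step = 0; step < 32; step++) {
            for (int i = 0; i < cells.length; i++)
                if (cells[i])
                    out.send(new ShortMessage(ShortMessage.NOTE_ON, 0, SCALE[i], 90), -1);

            Thread.sleep(250);                   // one generation per beat

            for (int i = 0; i < cells.length; i++)
                if (cells[i])
                    out.send(new ShortMessage(ShortMessage.NOTE_OFF, 0, SCALE[i], 0), -1);

            cells = next(cells);
        }
        out.close();
    }

    // Rule 90: a cell is alive iff exactly one of its two neighbours was alive.
    static boolean[] next(boolean[] cells) {
        boolean[] nextGen = new boolean[cells.length];
        for (int i = 0; i < cells.length; i++) {
            boolean left = cells[(i - 1 + cells.length) % cells.length];
            boolean right = cells[(i + 1) % cells.length];
            nextGen[i] = left ^ right;
        }
        return nextGen;
    }
}

Routing the receiver to a virtual MIDI port would let the same loop drive instruments in Ableton Live.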

B. Domain Specific Languages

Alda-tabs is the first Domain Specific Language for Guitar Players on the Java Virtual Machine. This piece of software helps guitar players to “execute” their music notes in the JVM, write songs, and get audio feedback with basic tab syntax. You can read more about it in this article.

Take a look at the potential of Alda-tabs with chords and arpeggios by listening to this example (code also provided):

./alda-tabs.sh examples/01-guitartabs-example.alda

C. Digital instruments and physical interfaces

A couple of years ago, when I was working as a Visiting Researcher at the Critical Making Lab (University of Toronto), I discovered how a humanistic approach to DIY electronics, coding, and making could forever change my conception of research. That experience helped me see the importance of hands-on learning and the role that tangible objects can play in theoretical or intellectual explorations.

Currently I am working on a prototype of a physical MIDI interface to control digital instruments directly from a guitar fret. The same concept will be explored with different objects and materials (conductive or not) in the coming months.

The idea is to go beyond the keyboard as the standard digital music interface and build physical MIDI controllers with wood, cardboard, fabric, etc. More details about this project will be published soon.

In the meantime, I have also been testing some JavaScript libraries for sound synthesis, and playing around with p5.js to develop the foundations of SoundBox, an experimental digital environment for synthetic music creation. Basically, the idea behind this tool is to turn a human voice or an instrument (captured through a microphone) into a MIDI interface. Right now, it detects the fundamental frequency of the microphone’s sound signal, allowing the user to transform a continuous signal into a set of discrete notes. It then parses that information and reproduces the played sequence with a sine oscillator.
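The core of that transformation, quantizing a detected fundamental frequency into the nearest discrete note, is easy to sketch. Here is a minimal Java version of the math (SoundBox itself is written in JavaScript with p5.js; this is just an illustration, with the pitch-detection step omitted and A4 = 440 Hz assumed):

// Quantize a detected fundamental frequency (Hz) to the nearest MIDI note,
// then back to the exact frequency a sine oscillator should play.
public class PitchQuantizer {

    // MIDI note 69 is A4 (440 Hz); each semitone is a factor of 2^(1/12).
    static int toMidiNote(double frequencyHz) {
        return (int) Math.round(69 + 12 * Math.log(frequencyHz / 440.0) / Math.log(2));
    }

    static double toFrequency(int midiNote) {
        return 440.0 * Math.pow(2, (midiNote - 69) / 12.0);
    }

    public static void main(String[] args) {
        double detected = 329.0;         // e.g. a slightly flat E4 from the mic
        int note = toMidiNote(detected); // -> 64 (E4)
        System.out.printf("%.1f Hz -> MIDI note %d -> %.2f Hz%n",
                detected, note, toFrequency(note));
    }
}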

SoundBox is still a very simple prototype that has trouble with some harmonics, but it has been a good experiment for learning how these issues work. Let’s see; maybe it will be the starting point for something more sophisticated.


D. Music Visualization

One of the research interests of the SciArt Lab is information visualization in unusual ways. I have always been fascinated by synesthesia, and lately I have been testing visual approaches to music. The idea behind some of the prototypes I have been working on is to map MIDI inputs both to physical visualizations (e.g. LEDs) and to computational ones.

In this second category, I have been testing the possibility of creating my own 2D graphics with Processing and SVG and animating them while controlling their movements and shapes directly from external MIDI inputs. This is one example of a program/animation that I have implemented recently:

In the previous example, an animation is created dynamically in Processing while the behavior of an animated cartoon responds to the inputs received from an external DIY electronics device. Both the graphics and the sound are produced by the messages received through MIDI inputs.
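As a rough illustration of how a Processing sketch can be wired to MIDI input, here is a minimal example assuming The MidiBus library (my actual prototypes are more elaborate, and the device indices depend on your setup): incoming note-on messages change the size and color of a simple shape.

import themidibus.*;

MidiBus midi;
float diameter = 40;
int hue = 0;

void setup() {
  size(400, 400);
  colorMode(HSB, 127);
  MidiBus.list();                  // print available MIDI devices to the console
  midi = new MidiBus(this, 0, 1);  // input device 0, output device 1; adjust as needed
}

void draw() {
  background(0);
  fill(hue, 127, 127);
  ellipse(width / 2, height / 2, diameter, diameter);
  diameter = max(40, diameter - 2);  // decay back to the resting size
}

// Called by The MidiBus whenever a note-on message arrives.
void noteOn(int channel, int pitch, int velocity) {
  diameter = map(velocity, 0, 127, 40, width);  // louder notes -> bigger circle
  hue = pitch;                                  // higher notes -> different color
}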

E. Postsynaptic Symphonies

I have always liked music. I started writing songs with a keyboard as a kid and continued with a guitar as a teenager. Nowadays, I enjoy playing several kinds of instruments. Besides the keyboard, my acoustic guitar, and my wife’s classical guitar, I have two harmonicas, some flutes, an ocarina, a ukulele, and a guitalele.

Recently, as part of the open-ended exploration of the SciArt Lab, I have also been writing some digital music. I call these pieces postsynaptic symphonies because I find the cognitive experience of listening to such unpredictable songs interesting.

I have published some postsynaptic symphonies on SoundCloud:

More information about the evolution of my music-related projects will be coming soon :)