How much control do you have with the solenoids, and the hammers? Is it just control of the velocity of strike, or is it possible to have positional control of the hammer?
If you could control the position, maybe you could create some interesting tonal effects that a human pianist could not achieve, by holding the hammer close to the string to partially damp it
Positional control of the keys (not the hammers) is possible, but the solenoids get very hot. The strike velocity response is not proportional either, so I will have to add some filtering to ensure that loud passages don't break a hammer shank.
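The kind of velocity filtering mentioned above might look something like this minimal sketch. All the constants (the duty-cycle cap, the curve exponent, the minimum drive needed to sound a note) are illustrative guesses, not measured values from the actual instrument:

```python
MAX_DUTY = 0.80   # hard cap on solenoid drive, to protect the hammer shank
MIN_DUTY = 0.15   # assumed floor below which the key won't sound at all

def velocity_to_duty(velocity: int) -> float:
    """Map a MIDI velocity (0-127) to a solenoid PWM duty cycle.

    Since the solenoid-driven action does not respond proportionally,
    a power curve compresses the loud end before the hard cap applies.
    """
    if velocity <= 0:
        return 0.0
    v = min(velocity, 127) / 127.0
    # exponent < 1 flattens the top of the range (an assumed tuning choice)
    duty = MIN_DUTY + (MAX_DUTY - MIN_DUTY) * (v ** 0.6)
    return min(duty, MAX_DUTY)
```

The point of the curve plus cap is that even a velocity-127 fortissimo never drives the solenoid past the mechanical safety margin.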
As for the tonal effects: you can't really manipulate a piano hammer like that through the action, and this mechanism still uses the normal action the way it is intended; it just strikes the back of the key, below the pivot rather than above it. The whole function of the piano action is to give a uniform response to uniform input, so the hammer can only describe a particular arc at a speed relative to the force with which the key is struck. You can't 'freeze' the hammer at the top of that arc; it will just fall back down until it is caught by the back-checks.
What you can do, and what would be impossible for a single pianist using only their fingers, is to strike key combinations that would otherwise be unplayable: for instance, a chord that requires more than the normal number of fingers, extreme contortions, or more than two hands.
Then you could create some interesting effects by using each human key strike to trigger extra keys: one or two octaves higher or lower, or something more colorful like a note a fifth higher at reduced velocity, or perhaps time-delayed.
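The coupling idea above can be sketched as a pure function from one key strike to a list of events. Note numbers are MIDI semitones; the interval choices and velocity scaling are just the examples from the comment, not a fixed design:

```python
def couple(note: int, velocity: int, t: float):
    """Return (time, note, velocity) events derived from one key strike."""
    events = [(t, note, velocity)]                       # the original note
    for interval in (+12, -12):                          # one octave up and down
        if 21 <= note + interval <= 108:                 # stay on an 88-key range
            events.append((t, note + interval, velocity))
    fifth = note + 7                                     # a fifth higher...
    if fifth <= 108:
        events.append((t + 0.05, fifth, velocity // 2))  # ...softer and delayed
    return events

# Middle C (60) at velocity 100 spawns octaves 72 and 48 immediately,
# plus a softer G (67) 50 ms later.
```

A real implementation would also have to couple the matching note-off events, otherwise the extra keys stay held down.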
edit - just noticed that it seems you haven't yet been able to get the key sensors working, only the output
Church organs have such functionality, typically limited to activating octaves higher and lower than the rank played. Quite impressive given that it all works mechanically.
And yes, that would require input, but I will be working on that next. I can already play the piano from a digital keyboard (which gives a very eerie effect), so that way I can do such things.
The Windows 3.1-era Band-in-a-Box had a feature (and probably still has, if that software is still around) that allowed playing along with a pre-programmed chord progression in different styles, and it always produced a harmony that matched the current chord from a single key press. You could play in the style of e.g. Erroll Garner, even if you only knew how the "bare" melody of the song goes.
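One guess at how such a one-finger harmony feature might work: treat the pressed key as the top voice and fill in tones of the current chord below it. The voicing rule here is invented for illustration; the real Band-in-a-Box algorithm is surely more sophisticated:

```python
def harmonize(melody_note: int, chord_pcs: set, voices: int = 4):
    """Return a block chord (MIDI notes, descending) topped by melody_note.

    chord_pcs is the set of pitch classes (0-11) of the current chord.
    """
    notes = [melody_note]
    n = melody_note - 1
    while len(notes) < voices and n >= melody_note - 24:
        if n % 12 in chord_pcs:        # keep only tones of the current chord
            notes.append(n)
        n -= 1
    return notes

# C major triad (pitch classes 0, 4, 7) under a pressed E5 (MIDI 76):
# harmonize(76, {0, 4, 7}) -> [76, 72, 67, 64]
```

So as the pre-programmed progression advances, the same physical key press lands on different voicings, which is what made a single finger sound like a full arrangement.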
That was amazing "AI" for the 1990s. It would play never-ending bebop solos that were pretty legit. The author must have been a well-versed musician and programmed in the standard licks from the jazz vocabulary.
My friend and serial colleague David Levitt wrote his PhD thesis about applying AI to Jazz improvisation at MIT EECS in 1981, titled "A Melody Description System for Jazz Improvisation". His PhD advisor was Marvin Minsky.
>A Melody Description System for Jazz Improvisation
>This thesis describes a program which embodies several computational aspects of music improvisation and composition. Given an initial melody and harmonic outline, the program produces a single voice of improvisation. While the improvisation is intended to resemble a jazz (eg. "swing" or "bebop") solo, the program's descriptive techniques reflect computational elements common to a range of tonal music. These include building up phrases from partial descriptions, recognizing interesting features, and re-examining a remembered phrase to produce a variation in a new context. Throughout, the program uses various ideas of similarity, repeated patterns, defaults, and mutually constraining structures, trying to anticipate the audience's expectations to make the improvisations understandable and interesting.
David also developed an early musical/graphical Mac app called "Harmony Grid", and a visual programming language called "Hookup" for controlling and integrating MIDI and hardware and software like the Macromedia Director player library. David shared an office at the MIT AI Lab's "Music Hacker's Hangout" with Miller Puckette, who developed Max/MSP and PD. He later worked on Body Electric aka Bounce, another MIDI supporting visual programming language for real time VR simulation developed by Chuck Blanchard at VPL.
>Representing Musical Relationships in the Harmony Grid
>One of the most difficult questions of multimedia design is when is it appropriate to use a given medium or mode of interaction. Sometimes the answer is not as obvious as it might seem. For instance, a program which manipulates music obviously has to be capable of playing music, presenting auditory information, but to what extent should the interface to the program be an auditory one? This chapter describes one such program which teaches about music, but with which the student interacts using a two-dimensional spatial representation of musical relationships. In other words a cross-modality mapping occurs. This kind of mapping appears to be especially successful because variants of it have been applied to good educational effect more than once. Indeed, Chapter 12 describes a successful use of a different but closely related cross-modality mapping. A very interesting and open question is: why is this mapping appropriate, where others may not be?
>1. Introduction
>The Harmony Grid was developed to help musicians and non-musicians visualize harmonic relationships in the traditional western 12-semitone chromatic system. It spatially represents some of the thinking that goes on when a composer or improvisor describes a melody or chord progression: 'Now the chord root is descending on the circle of fifths; now the melody is ascending chromatically; now it's descending on the whole tone scale...' and so on. Articulately or not, we create these structures when we make and appreciate music - dynamic ways of thinking about notes as 'near' each other. The Harmony Grid lets us visualize these musical relationships as adjacencies in fields of spatial intervals, displayed as two-dimensional grids.
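The grid idea quoted above can be sketched in a few lines, under the common assumption that one axis steps by fifths (7 semitones) and the other by major thirds (4 semitones); the actual Harmony Grid let the user choose its interval axes, so this particular layout is only one example:

```python
def pitch_class_at(x: int, y: int, root: int = 0) -> int:
    """Pitch class at grid cell (x, y): fifths along x, major thirds along y."""
    return (root + 7 * x + 4 * y) % 12

def grid(width: int = 4, height: int = 3, root: int = 0):
    """Rows of pitch classes; horizontal neighbors are a fifth apart,
    vertical neighbors a major third apart."""
    return [[pitch_class_at(x, y, root) for x in range(width)]
            for y in range(height)]

# From C (root 0), the bottom row walks the circle of fifths: 0, 7, 2, 9
```

With this layout, harmonically 'near' notes (fifths, thirds, and their combinations) land in adjacent cells, which is exactly the kind of spatial adjacency the quoted passage describes.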
Here he is demonstrating his "Music Box" software, and talking about his research at the MIT AI Lab and Atari Cambridge Research Lab:
>If you could control the position, maybe you can create some interesting tonal effects that a human pianist could not achieve, by holding the hammer close to the string to partially damp it
(Similar to what guitarists can do)