This paper describes Cross-talk, a musical work and performance system for two instruments augmented with infrared motion-tracking capability and an artificial software improviser. Cross-talk was commissioned by the Ammerman Center for Arts and Technology at Connecticut College for the 13th Biennial Symposium on Arts and Technology. The work is part of an ongoing collaboration focused on developing integrated hardware and software performance systems that extend the timbral and expressive capabilities of traditional musical instruments and generate musical structure in response to information retrieved from human performers in real time. Artistic motivations and prior related work are presented here, along with a summary of the programmatic narrative behind Cross-talk and an accompanying qualitative description of the piece. Technical details are provided for important components of the work, including the Gesturally Extended Piano and the "factorOracle" software module, which enables the system's machine improvisation capability.