The goal of this paper is to describe a system-aided performance and composition tool that aims to expand a guitarist's capabilities by providing novel ways for the user to interact with the system. To achieve this, we adopt an agent-based approach, independently modeling the active elements of a guitar performance as autonomous agents, named Left-Hand, Right-Hand, and Speaker (the guitar itself). These agents communicate with each other to make musical decisions, especially regarding the choice of chord shape. The musical elements, harmony and rhythm, are defined independently by the Left-Hand and Right-Hand agents, respectively. The most relevant aspects of this work, however, are the algorithms and strategies used to process the harmonic and rhythmic data. Finally, we evaluate the system and discuss the results of the implemented techniques.