In this article, we present our research program in bio-affective-social computing, which aims to guide and contribute to the development of multimodal socially intelligent agents along two dimensions: (1) decision-making and (2) communication. We report results obtained on the modules of this research paradigm that we chose to explore: emotion recognition from sensed physiological signals, the construction of emotion user models from the output of the emotion recognition module, and the design and implementation of an architecture that enables socially intelligent agents to use these user models to interact with the user more intuitively in a variety of contexts.