This paper presents a rule-based approach for both offline and real-time recognition of Activities of Daily Living (ADL), leveraging events produced by a non-intrusive multi-modal sensor infrastructure deployed in a residential environment. Novel aspects of the approach include: the ability to recognise arbitrary scenarios of complex activities using bottom-up multi-level reasoning, starting from sensor events at the lowest level; an effective heuristics-based method for distinguishing between actual and ghost images in video data; and a highly accurate indoor localisation approach that fuses different sources of location information. The proposed approach is implemented as a rule-based system using Jess and is evaluated on data collected in a smart home environment. Experimental results show high accuracy and performance, demonstrating the effectiveness of the approach in real-world settings.
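To make the rule-based formulation concrete, the sketch below shows a hypothetical Jess rule in the spirit of the bottom-up, multi-level reasoning described above: two lower-level sensor-event facts are combined to assert a higher-level activity fact. The templates, slot names, and the meal-preparation scenario are illustrative assumptions for exposition only, not the actual rules used in the system.

    ;; Illustrative fact templates (assumed for this sketch, not taken from the paper)
    (deftemplate sensor-event (slot type) (slot location) (slot timestamp))
    (deftemplate activity (slot name) (slot location) (slot start))

    ;; Hypothetical rule: infer a meal-preparation activity when a kitchen
    ;; presence event is later followed by a cooker-usage event.
    (defrule infer-meal-preparation
      (sensor-event (type presence) (location kitchen) (timestamp ?t1))
      (sensor-event (type cooker-on) (location kitchen) (timestamp ?t2&:(> ?t2 ?t1)))
      =>
      (assert (activity (name meal-preparation) (location kitchen) (start ?t1))))

In this style, activities asserted at one level can themselves appear as conditions in rules at the next level, which is how arbitrary scenarios of complex activities can be composed from raw sensor events.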