Wearable sensors, such as action cameras and smartphones, can collect content-rich information that can be used to characterize human activities. In this talk, I will describe how wearable sensors can be used to understand, predict, and assist human activity. First-person vision systems (wearable cameras) are excellent for recognizing hand-object manipulations. I will describe our recent work on automatically recognizing first-person activities, understanding scene functionality, and predicting what a person will do in the future. I will also describe how a wearable sensor — the smartphone — can be used as an assistive technology. I will present NavCog, a smartphone-based indoor navigation app that helps people with visual impairments travel in new environments.