Abstract: An approach to representing visual spacetime with a distributed oriented energy representation will be presented. This representation systematically exposes the structure of visual spacetime in terms of local 3D spacetime orientation. Advantages of this abstraction will be illustrated via two important applications: (1) spatiotemporal grouping and (2) human action detection and localization. The overarching goal is to establish a unified approach to the representation and analysis of visual image dynamics that is broadly applicable to the diverse phenomena encountered in the natural world. Previous research has largely approached the analysis of visual dynamics by appealing to representations based on image motion. Although of obvious importance, motion represents only one instance of the myriad spatiotemporal patterns encountered in image sequences. Examples of significant non-motion-related patterns include unstructured regions (e.g., a “blank wall”), temporal flicker (pure temporal intensity change), and dynamic texture (e.g., as typically associated with stochastic phenomena, such as windblown vegetation and turbulent water). These types of dynamic patterns have received far less attention than motion and are not captured by the standard vector field representation embodied in optical flow-based approaches. In contrast, the presented representation allows motion and non-motion-related patterns to be described in a uniform manner.
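
To make the notion of a distributed oriented energy representation concrete, the following Python sketch illustrates one simple way such measures can be computed from a spacetime volume: directional Gaussian-derivative filtering along a small set of 3D spacetime orientations, pointwise squaring (energy), local pooling, and normalization across the orientation set. This is a minimal sketch under stated assumptions, not the specific filter set of the presented approach; the filter order, scales (sigma, pool_sigma), orientation sampling (dirs), and the function name oriented_energy_volume are all illustrative choices, and the code relies on NumPy and SciPy.

import numpy as np
from scipy.ndimage import gaussian_filter

def oriented_energy_volume(video, directions, sigma=2.0, pool_sigma=3.0, eps=1e-6):
    """video: 3D array indexed (t, y, x); directions: 3D spacetime vectors (dt, dy, dx).
    Returns normalized oriented energies with shape (len(directions), t, y, x)."""
    video = np.asarray(video, dtype=np.float64)
    # First derivatives of the Gaussian-smoothed volume along each axis (t, y, x).
    grads = [gaussian_filter(video, sigma=sigma, order=o)
             for o in ((1, 0, 0), (0, 1, 0), (0, 0, 1))]
    energies = []
    for d in directions:
        d = np.asarray(d, dtype=np.float64)
        d = d / np.linalg.norm(d)
        # Directional derivative: projection of the spacetime gradient onto d.
        resp = d[0] * grads[0] + d[1] * grads[1] + d[2] * grads[2]
        # Rectify (square) and pool locally to obtain an oriented energy measure.
        energies.append(gaussian_filter(resp ** 2, sigma=pool_sigma))
    energies = np.stack(energies)
    # Normalize across the orientation set so the measures at each point sum to
    # (approximately) one, yielding a distributed, contrast-robust description.
    return energies / (energies.sum(axis=0, keepdims=True) + eps)

# Example: four sample spacetime orientations applied to a synthetic 16-frame clip.
clip = np.random.default_rng(0).standard_normal((16, 64, 64))
dirs = [(1.0, 0.0, 0.0), (0.0, 1.0, 0.0), (0.0, 0.0, 1.0), (1.0, 1.0, 1.0)]
E = oriented_energy_volume(clip, dirs)
print(E.shape)  # (4, 16, 64, 64)

In this simplified construction, pure temporal flicker concentrates its energy along the temporal orientation, whereas an unstructured region yields uniformly small responses across all orientations, illustrating how motion and non-motion-related patterns can be treated within a single descriptive framework.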