Perceiving the environment for better and more efficient situational awareness is essential in applications such as wildlife surveillance, wildfire detection, crop irrigation, and building management. Energy-harvesting, intermittently-powered sensors have emerged as a zero-maintenance solution for long-term environmental perception. However, these devices suffer from an intermittent and varying energy supply, which presents three major challenges for executing perceptual tasks: (1) intelligently scaling computation in light of constrained resources and dynamic energy availability, (2) planning communication and sensing tasks, and (3) coordinating sensor nodes to increase the total perceptual range of the network. We propose an adaptive framework, AdaSens, which adapts the operations of intermittently-powered sensor nodes in a coordinated manner to cover as much of the targeted scene as possible, both spatially and temporally, under interruptions and constrained resources. We evaluate AdaSens on a real-world surveillance video dataset, VideoWeb, and show at least a 16% improvement in the coverage of important frames compared with other methods.