At FXPAL we developed a system, ReBoard, for automatically capturing whiteboard content using a nearby camera pointed at the board. We also built several tools in the space of intelligent capture, including an app that would give you live feedback about the quality of captured documents.
For some reason, until now it hadn't occurred to me that combining the two ideas would be quite useful. Captures in the ReBoard system can be low quality for all kinds of reasons: someone is blocking the board, the light is too low, shadows fall across the content, etc. ReBoard has built-in mechanisms to cope with these as well as it can, but surely it would also be useful to add a post hoc step that evaluates the quality of each shot. The results could be fed back to users on a small display near the board or in a back-end web UI (in the initial ReBoard system we used a Chumby for the in situ display).
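As a rough illustration, a post hoc check like this could boil down to a few cheap image statistics. The sketch below uses plain NumPy; the two metrics (mean brightness and variance of a Laplacian as a sharpness proxy) and their thresholds are my own illustrative guesses, not anything from the actual ReBoard pipeline:

```python
import numpy as np

def capture_quality(gray):
    """Score a grayscale capture on two simple heuristics.

    `gray` is a 2-D float array of pixel values in 0..255. Returns a
    dict mapping metric name -> (value, passed). Thresholds are
    hypothetical and would need tuning per installation.
    """
    # Too-dark shots (lights off, heavy shadow) have a low mean.
    brightness = gray.mean()
    # Variance of a discrete Laplacian approximates sharpness: blurry
    # or motion-smeared shots have little high-frequency content.
    lap = (-4 * gray[1:-1, 1:-1]
           + gray[:-2, 1:-1] + gray[2:, 1:-1]
           + gray[1:-1, :-2] + gray[1:-1, 2:])
    sharpness = lap.var()
    return {
        "brightness": (brightness, brightness > 60),
        "sharpness": (sharpness, sharpness > 50),
    }
```

A shot failing either check could trigger the in situ display (or the web UI) to prompt users to re-capture.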
This concept could be extended now that external capture (e.g., from drones) is becoming commonplace. Imagine giving a drone a high-level command to capture high-quality media from an event. You could even specify certain requirements (the presence or absence of people, for example, or a mixture of different kinds of shots). The drone could capture media as well as it can and then run this post hoc process to get near-real-time feedback about the quality of recent shots. Any unusable shots would be culled and their targets requeued for the drone.
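The cull-and-requeue step could be as simple as the sketch below. Everything here is hypothetical: `assess` stands in for whatever quality check is available, and `max_retries` caps how often the drone re-shoots a target so a hopeless shot doesn't loop forever:

```python
from collections import deque

def review_shots(shots, assess, max_retries=2):
    """Cull unusable shots and requeue their targets for the drone.

    `shots` is an iterable of (target, image) pairs; `assess(image)`
    returns True when a shot is usable. Returns the kept shots and the
    list of targets the drone should re-shoot.
    """
    keep, retry, attempts = [], deque(), {}
    for target, image in shots:
        if assess(image):
            keep.append((target, image))
        else:
            # Unusable: cull it and, within the retry budget,
            # hand the target back to the drone's task queue.
            attempts[target] = attempts.get(target, 0) + 1
            if attempts[target] <= max_retries:
                retry.append(target)
    return keep, list(retry)
```

In a real system this loop would run continuously as shots arrive, so the drone gets fresh targets while it is still airborne.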
By the way, I recently built a new version of ReBoard that works with a confederation of GoPro cameras. Email me if you're interested (carter at fxpal.com)!