Fluid Views is a web-based search environment designed to bridge overview and detail by integrating dynamic queries, semantic zooming, and dual layers. Search results are most commonly presented as long, ranked, paginated lists that are seldom examined beyond the top ten items. To support more exploratory forms of information seeking, we combine the notion of relevance with the power of visual encoding. In Fluid Views, results convey relevance via size and detail in a dynamic top layer, and semantic similarity via position on an underlying base map. We designed Fluid Views with temporal, spatial, and content-defined base maps for both textual and visual resources, and tested our prototype system on collections of books, blogs, and photos.
This project was carried out at the University of Calgary.