The paper presents a framework for integrating human-centered AI (HCAI) into visualization and visual analytics, mapping four HCAI tool capabilities (amplify, augment, empower, enhance) onto the four phases of visual sensemaking (view, explore, schematize, report). This mapping identifies existing tools, future possibilities, challenges, and ethical considerations for each combination, serving as an R&D agenda. The framework aims to guide visualization researchers and practitioners in integrating AI into their work and understanding how visualization can support HCAI research.
Forget simply automating tasks – this paper charts a course for AI to truly *enhance* human visual sensemaking, not replace it.
The emergence of generative AI, large language models, and foundation models is fundamentally reshaping computer science, and visualization and visual analytics are no exception. We present a systematic framework for understanding how human-centered AI (HCAI) can transform the visualization discipline. Our framework maps four key HCAI tool capabilities—amplify, augment, empower, and enhance—onto the four phases of visual sensemaking: view, explore, schematize, and report. For each combination, we review existing tools, envision future possibilities, identify challenges and pitfalls, and examine ethical considerations. This design space can serve as an R&D agenda for both visualization researchers and practitioners to integrate AI into their work and to understand how visualization can support HCAI research.