Mapping Our Actions

As artificial intelligence continues to grow rapidly—with major investments from governments, universities, and tech companies—it is important to ask: what are we leaving behind? This inquiry explores the tension between AI’s logic—rooted in data, prediction, and abstraction—and ways of knowing that are place-based, relational, and embedded in local or Indigenous traditions. To investigate this tension, we draw on creative practices, community partnerships, and interdisciplinary research that foreground ecological and cultural perspectives. In particular, we ask how AI may be reshaping our relationship to land and knowledge, and whether it can be reimagined to reflect ethical, ecological, and culturally grounded values. Rather than simply critique AI, this section considers how art-based and community-driven approaches can challenge dominant technological paradigms and open up more just and sustainable futures. 


AI, after all, is not neutral. It reflects the assumptions, biases, and priorities of those who build it. This means it can reinforce dominant systems while sidelining other ways of knowing. For example, AI might be used to monitor forests or predict crop yields, but these tools rarely account for Indigenous knowledge about the land or respect the protocols communities have for engaging with nature. In this way, AI becomes not just a tool but a terrain of struggle—one where different visions of the future compete. 

This is not to say that AI and local knowledge are inherently incompatible. However, their coexistence is neither straightforward nor guaranteed, because AI’s operational logic—its emphasis on speed, prediction, and optimization—often conflicts with relational, place-based knowledge. The stakes are tangible: decisions made by AI in environmental monitoring or resource management can reinforce dominant priorities while marginalizing community voices, making outcomes uncertain and contested. 

In response, some artists, researchers, and community leaders are experimenting with ways to "slow down" AI design and deployment. Drawing on philosopher Alia Al-Saji’s notion of “hesitation,” we ask: What if AI could be trained to pause, reflect, and listen—rather than immediately act on data? By introducing moments of deliberation and ambiguity, these approaches create a generative ground where AI can engage with multiple perspectives, support ethical reflection, and reveal dimensions of environmental and cultural knowledge that would otherwise remain invisible. 

Creative projects in digital humanities—like thick mapping and locative media—offer inspiring examples. These approaches use digital tools to tell layered stories about place, combining personal memory, community knowledge, and environmental data. When used thoughtfully, AI could support this kind of storytelling, helping communities map their land in ways that honor their histories and perspectives. 

In the end, the challenge is not just improving AI, but rethinking how we relate to knowledge, land, and each other. From the perspectives of science, literature, and ethics, we can see how AI’s biases and infrastructures shape environmental understanding. By combining creativity and collaboration, we can imagine technologies that learn from local and Indigenous practices and build more just, ecologically attuned futures. 

Proudly supported by Western University through the Western Sustainable Impact Fund and the Department of Gender, Sexuality, and Women's Studies.

© 2025 RhizomeMind. All rights reserved.
