Irsila is an autonomous architectural machine that continuously adapts itself to negotiate the changing goals of a near-future cultural centre.
Irsila proposes a library of voxelised spatial parts, assembled through an algorithmic process that combines spatial-assembly and spatial-planning algorithms. The system generates an open-ended range of outcomes that satisfy the user’s needs alongside multiple dynamic objectives, including occupancy, physical stability, and spatial distribution. Spatial outcomes are continuously evaluated, and the system derives each new goal state from the latest spatial demands. Machine learning is applied at both the reconfiguration scale and the spatial-planning scale.
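The multi-objective evaluation described above can be sketched in code. This is an illustrative assumption, not the project’s actual system: the scoring functions, weights, and voxel representation below are all hypothetical stand-ins for how a candidate assembly might be scored against occupancy, stability, and distribution.

```python
# Hypothetical sketch of multi-objective evaluation for a voxel assembly.
# Voxels are integer (x, y, z) grid cells; all scores lie in [0, 1].

def occupancy_score(voxels, demand):
    """Fraction of the demanded capacity met by the assembly."""
    return min(len(voxels) / demand, 1.0)

def stability_score(voxels):
    """Crude stability proxy: each voxel above ground level must rest
    directly on another voxel."""
    occupied = set(voxels)
    supported = [v for v in voxels
                 if v[2] == 0 or (v[0], v[1], v[2] - 1) in occupied]
    return len(supported) / len(voxels)

def distribution_score(voxels):
    """Spread across the ground plane: unique footprint cells per voxel,
    rewarding distributed rather than stacked layouts."""
    footprint = {(x, y) for x, y, _ in voxels}
    return len(footprint) / len(voxels)

def evaluate(voxels, demand, weights=(0.4, 0.4, 0.2)):
    """Weighted aggregate of the three dynamic objectives (weights are
    illustrative assumptions)."""
    scores = (occupancy_score(voxels, demand),
              stability_score(voxels),
              distribution_score(voxels))
    return sum(w * s for w, s in zip(weights, scores))

# A flat 2x2 ground-level assembly: fully supported and well spread.
flat = [(0, 0, 0), (1, 0, 0), (0, 1, 0), (1, 1, 0)]
# A tower with one floating voxel: partly unsupported, fully stacked.
tower = [(0, 0, 0), (0, 0, 1), (0, 0, 2), (0, 0, 4)]

print(evaluate(flat, 4))   # flat scores higher than tower
print(evaluate(tower, 4))
```

A generate-and-evaluate loop over many such candidates would then let the system keep whichever assemblies best match the current goal state.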
Irsila’s multi-scalar system is physically reconfigured by a bespoke multi-agent robotic system that inhabits the building and its library of connectable parts. The building learns effective collaborative sequences of reversible assembly within the physical constraints of the robots.
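One way to picture reversible assembly sequencing is a bottom-up ordering in which every part is supported at the moment it is placed, so disassembly simply replays the sequence backwards. The sketch below is an assumed illustration, not the project’s learned planner; the greedy ordering rule is a hypothetical stand-in.

```python
# Illustrative reversible-assembly sequencer for (x, y, z) voxels.

def assembly_sequence(voxels):
    """Greedy bottom-up ordering: a voxel may be placed once it sits on
    the ground (z == 0) or on an already-placed voxel."""
    placed, sequence = set(), []
    remaining = set(voxels)
    while remaining:
        ready = {v for v in remaining
                 if v[2] == 0 or (v[0], v[1], v[2] - 1) in placed}
        if not ready:
            raise ValueError("remaining voxels can never be supported")
        # Deterministic tie-break: lowest layer first, then lexicographic.
        step = min(ready, key=lambda v: (v[2], v))
        sequence.append(step)
        placed.add(step)
        remaining.remove(step)
    return sequence

def disassembly_sequence(voxels):
    """Reversibility: parts come down in reverse placement order."""
    return list(reversed(assembly_sequence(voxels)))

tower = [(0, 0, 2), (0, 0, 0), (0, 0, 1), (1, 0, 0)]
print(assembly_sequence(tower))
```

A multi-agent version would additionally assign each step to a robot and check reachability, but the support constraint above is what makes the sequence replayable in reverse.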
With a multi-scalar system formed through interconnected robotic, spatial, and algorithmic research, Irsila is a spatial sandbox that brings together the architect, the user, and artificial intelligence in a collaborative architectural proposal.
Augmented reality merges virtual and physical spaces into a hybrid experience.
The robotic body was generatively designed under varying materials and loading conditions to minimise weight while retaining the capacity to bear high loads.
A demonstration of a robot’s three-dimensional movement and the rearrangement of modular components.
A demonstration of collaboration between multiple robotic agents.
A demonstration of the physical interaction with the robotic system.
The robots are trained to self-navigate around a building assembly.
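Self-navigation training of this kind is often demonstrated with reinforcement learning. The sketch below is a hedged, minimal illustration and not the project’s method: a single agent learns, by tabular Q-learning, to reach a goal cell on a small grid while treating placed assembly parts as obstacles. Grid size, rewards, and hyperparameters are all assumptions.

```python
import random

random.seed(0)
SIZE, GOAL = 5, (4, 4)
OBSTACLES = {(2, 1), (2, 2), (2, 3)}          # a wall of placed parts
ACTIONS = [(1, 0), (-1, 0), (0, 1), (0, -1)]  # four grid moves

def step(state, action):
    """One grid move; blocked moves keep the agent in place."""
    nxt = (state[0] + action[0], state[1] + action[1])
    if not (0 <= nxt[0] < SIZE and 0 <= nxt[1] < SIZE) or nxt in OBSTACLES:
        return state, -1.0                     # bump penalty
    return nxt, (10.0 if nxt == GOAL else -0.1)

Q = {}                                         # (state, action) -> value
for _ in range(2000):                          # training episodes
    s = (0, 0)
    for _ in range(50):
        if random.random() < 0.1:              # epsilon-greedy explore
            a = random.randrange(4)
        else:
            a = max(range(4), key=lambda i: Q.get((s, i), 0.0))
        nxt, r = step(s, ACTIONS[a])
        best_next = max(Q.get((nxt, i), 0.0) for i in range(4))
        old = Q.get((s, a), 0.0)
        Q[(s, a)] = old + 0.5 * (r + 0.9 * best_next - old)
        s = nxt
        if s == GOAL:
            break

# Greedy rollout with the learned values.
s, path = (0, 0), [(0, 0)]
while s != GOAL and len(path) < 30:
    a = max(range(4), key=lambda i: Q.get((s, i), 0.0))
    s, _ = step(s, ACTIONS[a])
    path.append(s)
print(path[-1])
```

A fleet of agents navigating a real building assembly would of course need shared collision avoidance and a 3D state space, but the learn-by-reward loop is the same idea in miniature.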