I. Starting with a problem

Have you ever found yourself lost?

Imagine yourself on your way somewhere. Suddenly your phone dies. You don’t know how to proceed, you don’t remember how you got there, and you have no idea where you are. 

This is because your phone has taken over tasks (perception, planning, decision-making) that your brain otherwise performs when you are aware of your movements in the space around you. Your brain automatically creates a mental map, which later serves as a repository for thinking and planning future steps. So if your phone does that instead of you, and it suddenly dies, your mind is left as empty as a computer that has just lost all its data. We also no longer remember phone numbers, because our phone knows them all.

BrAInfu.cc emerges from a dystopian question: how might we experience space in the future? The project observes our dependence on mobile devices and speculates how AI could affect our own intelligence and cognition.

We like to define humankind as intelligent. What distinguishes us from other species is our need to control phenomena by creating tools – technologies – injecting them with the intelligence we have, expecting them to become immortal and make our lives effortlessly prosperous. However, technologies can be useless if not mediated by design, which applies an interface to them and so enables or enriches our experiences.

Interfaces facilitate interaction. Currently, those embodied in our smartphones give us access to almost anything, anytime, anywhere: a never-ending wave of informational pollution, which also brings sensory overload, possibly causing cognitive impairment. Even though everything functions in the realm of publicly shared data, do we still have the privacy to experience the present with joy and ascribe it an intimate meaning that cannot be tracked or algorithmically predicted?

Fuc#, in all its metaphors, stands for something we all can relate to: an experience that results in physical and emotional sensation, which is either indescribably horrible or awesome. And so brAInfu.cc was initially based on the most unpleasant consequence of our interaction with AI. As a result, the project delivers a concept: a tangible interface in which AI works in favour of our intelligence and enhances our cognitive performance while we experience the space around us.





BrAInfu.cc, a watch for space

Imagine two small marbles in your hand. One shows where you are right now, and the other marks the location of where you want to go. They roll in response to the direction you are facing and, through their movement, give you hints that make your way there easier.

BrAInfu.cc results in a playful tool for experiencing space: a conceptual tangible interface that forms an abstract framework of spatial and temporal reference through two small, skin-adhesive marbles. While it draws attention to joy, its haptic feedback encourages psychomotor learning and so facilitates active memorisation. It uses synthetic technology to achieve data privacy and lets people control actionable data through latent interaction. As such, it shifts the focus from functionality to meaning.


An abstract representation provokes people to process raw data in their own way, and to apply it individually to their plans, without being persuaded.





The marbles' surfaces display time: the larger shows minutes and the smaller hours, like the hands of a watch. The time you have already spent, and the time you might still spend to get there, are marked beside.


The larger marble shows your current location, while the smaller proportionally maps your final destination. The marbles update their positions according to the direction you are facing.
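The proportional, facing-relative mapping described above can be illustrated with simple plane geometry. This is a hypothetical sketch, not part of the project: the function name, the flat 2-D palm coordinates, and the unit palm radius are all assumptions made only to show how a destination could be rotated into the wearer's frame of reference.

```python
import math

def marble_offset(current, destination, facing_deg, palm_radius=1.0):
    """Map the line from 'here' to the destination into palm coordinates,
    rotated so that 'up' on the palm is the direction the wearer faces.

    current, destination: (east, north) positions in metres.
    facing_deg: compass bearing the wearer faces (0 = north, 90 = east).
    Returns the (x, y) offset of the destination marble on the palm.
    """
    dx = destination[0] - current[0]
    dy = destination[1] - current[1]
    if dx == 0 and dy == 0:
        return (0.0, 0.0)  # already there: both marbles coincide
    # Compass bearing of the destination, then relative to the facing
    bearing = math.degrees(math.atan2(dx, dy))
    rel = math.radians(bearing - facing_deg)
    # The destination marble sits on the palm edge, towards the goal
    return (palm_radius * math.sin(rel), palm_radius * math.cos(rel))
```

Turning on the spot changes `facing_deg`, so the destination marble rolls around the palm while the current-location marble stays put, which matches the behaviour the text describes.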


As the marbles touch each other, voice recognition is triggered, expecting you to say where you want to go. They then roll apart, mapping the locations of here and there.

[control] pull marbles together

Detach from skin 

On a particular finger gesture, skin adhesion is deactivated and the marbles are released from the skin.

[control] spock finger move


The marbles become warmer the closer you get to your final destination, which also reminds you that you are walking in the right direction.
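The warming behaviour could be modelled as a simple interpolation between a skin-neutral and a noticeably warm surface temperature. Everything in this sketch is an assumption for illustration: the function name, the temperature range, and the linear ramp.

```python
def marble_warmth(distance_m, start_distance_m,
                  min_temp_c=30.0, max_temp_c=37.0):
    """Surface temperature rises linearly as the wearer nears the goal.

    distance_m: remaining distance to the destination.
    start_distance_m: distance when the route was set.
    Returns a target surface temperature in degrees Celsius.
    """
    if start_distance_m <= 0:
        return max_temp_c  # degenerate route: already at the destination
    # Progress clamped to [0, 1] so detours never cool below the minimum
    progress = 1.0 - min(distance_m, start_distance_m) / start_distance_m
    return min_temp_c + progress * (max_temp_c - min_temp_c)
```

A wrong turn increases the remaining distance, so the marbles cool again: the wearer feels the error without ever looking at a screen.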

Moves to give directions

When you don't know where to turn, the bigger marble rolls, e.g. left, left, right, to suggest which direction you could take.

[control] tap index to the thumb

Because following a navigational dot on the screen of your phone won't make you realise, nor remember, that this street is actually parallel to the one you walk along every other day.

Mind info 

A representation as you would have developed it on your own – a mind map. (Places you have visited get encoded in your brain. The electrodic surface of the marbles can access, but not store, that data, and then project it onto your skin.)

[control] bend thumb finger

Pop info

Common spatial data, organised in a simplified way, to focus only on the points of decision-making. In this way information is reduced to its minimum.

[control] bend index finger

Boost info

Something that makes your spatial experience easier (e.g. transportation nodes).

[control] bend middle finger


Something that makes your spatial experience harder (e.g. traffic, areas with high stress levels, areas with frequencies that do not suit your current physical condition).

[control] bend ring finger

ALT info

Non-algorithmic personalisation (locations where you have experienced delight, either sensed by the marbles or input manually).

[control] bend pinky finger

Overlaid information

You can see as many info layers at the same time as you want.

[control] bend more fingers at the same time

Zooming out

Clustered information.

[control] spreading fingers apart

Zooming in

Getting detailed information.

[control] bringing fingers together
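Taken together, the finger gestures above form a small control vocabulary: one info layer per bent finger, overlaid when several fingers bend at once. A minimal sketch of that mapping, with hypothetical layer identifiers (the project does not specify internal names), could look like this:

```python
# Hypothetical gesture-to-layer table summarising the controls above.
GESTURE_LAYERS = {
    "thumb":  "mind_info",    # your own mind-map representation
    "index":  "pop_info",     # common spatial data, decision points only
    "middle": "boost_info",   # things that ease the spatial experience
    "ring":   "hinder_info",  # traffic, stressful or unsuitable areas
    "pinky":  "alt_info",     # non-algorithmic, delight-based locations
}

def active_layers(bent_fingers):
    """Overlay one layer per bent finger; unknown gestures are ignored."""
    return [GESTURE_LAYERS[f] for f in bent_fingers if f in GESTURE_LAYERS]
```

Bending thumb and index together would thus overlay the mind-map and the common spatial data, matching the "Overlaid information" behaviour described above.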

When we access data on our phones, someone else can often see what we do with it. BrAInfu.cc imagines a situation in which data about our interactions is neither stored nor shared behind our backs. It concentrates on data generated by a person's own psychomotorics, not on something algorithmically predicted to be general for us all.

Electrodic surface

Our brain works as a kind of plug-and-play device: it doesn't need our biological receptors, but can process any kind of data that comes into contact with our nerves. The surface of the interface contains electrodes that, while in contact with the skin, react and enable data transfer to the nerves.

Skin adhesion

The surface of the interface is made of a skin adhesive material, so it stays attached to the skin, unless released from it on a gesture trigger (the material follows the logic of the changing structure of a gecko’s skin under different emotional conditions).

AI thingies

The Bio Processor works as a data transmitter and therefore has no capacity to store data. It operates independently from Computer B, so once it loses contact with the skin, it also loses all traces of personal data.

Computer B holds a memory of geo-representative data streamed from public satellites. It communicates only one way, to update the data it contains.

The main computer synchronises geodata from Computer B with what is perceived by the sensors and bio-processed. It monitors the marbles and responds to a person's gesture control through movement, or controls laser projection onto the skin.
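The privacy argument above rests on a separation of concerns: a stateless Bio Processor, a Computer B that only receives public data, and a main computer that merely fuses the two at the moment of use. This can be sketched as follows; all class, method, and field names are hypothetical illustrations of that separation, not the project's actual design.

```python
from dataclasses import dataclass, field

@dataclass
class ComputerB:
    """One-way receiver of public geodata; never transmits anything personal."""
    geodata: dict = field(default_factory=dict)

    def update(self, satellite_feed: dict) -> None:
        self.geodata.update(satellite_feed)

class BioProcessor:
    """Pure pass-through: transforms a neural signal but keeps no state,
    so losing skin contact leaves no trace of personal data behind."""
    def transmit(self, signal):
        return {"decoded": signal}  # nothing is ever written to self

@dataclass
class MainComputer:
    geo_source: ComputerB
    bio: BioProcessor

    def sync(self, sensor_reading, neural_signal):
        """Fuse public geodata with what is perceived right now."""
        return {**self.geo_source.geodata,
                "perceived": sensor_reading,
                **self.bio.transmit(neural_signal)}
```

The design choice the sketch makes visible: only `ComputerB` persists anything, and what it persists is public by construction, so the fused result exists only for the duration of the call.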

Bio processor  

Besides the hippocampus (responsible for the majority of spatial orientation and memorisation), our brain contains a variety of cells that encode our spatial position when we pay attention to our surroundings. In this way our brain constructs a mental representation of the space.

When the interface is in touch with our skin, the Bio Processor can read data from our brain and then project a schematic representation of the locations we have visited onto our skin. (One option would be a synthetic subcutaneous projection, where the marbles inject biosynthetic fibres capable of visualising information underneath the surface of the skin for a couple of seconds, until those fibres innocuously evaporate.)



(The) Future of Our Spatial Experience?

As such, design doesn't only organise information and make it perceivable; it also takes responsibility for the effect an interaction will have on the end user. This information should therefore be derived from the real sciences, and above all designers should be aware of how our cognition works and how it might be affected by the technologies on top of which design interfaces are applied.

Design mediates between technology and people. It enables interaction by providing an interface. Interfaces transform the role of technology into sensory aids, sensory replacements, or sensory extensions.



brAInfu.cc aims to make everyone involved rethink where we are going, and how to be aware of our path now, before we suddenly get lost in the future without knowing what went wrong.


I started this project with the goal of finding, or speculating about, a design solution for an environment where Artificial Intelligence (AI) might enhance or replace our inborn ability of spatial orientation. I then concluded the research with a set of principles and applied them to a concrete design solution: a tangible interface. The proposed tangible interface is more a case of a speculative future; however, its design principles could be functionally applied to current problems (e.g. visualising the relation between the user's current location and final destination with dots on a smartphone's lock screen, alongside the obvious information about time).