BrAInfu.cc, a watch for space
Imagine two small marbles in your hand. One shows where you are right now, and the other marks the location you want to go to. They roll accurately according to the direction you are facing, and through their movement give you hints that make your way there easier.
BrAInfu.cc is a playful tool for experiencing space: a conceptual tangible interface that forms an abstract framework of spatial and temporal reference through two small, skin-adhesive marbles. While it draws attention through joy, its haptic feedback encourages psychomotor learning and so facilitates active memorisation. It uses synthetic technology to achieve data privacy and lets people control actionable data through latent interaction. As such, it shifts the focus from functionality to meaning.
An abstract representation provokes people to process raw data in their own way, and to apply it individually to their plans, without being persuaded.
The marbles' surfaces display time: the larger shows minutes and the smaller hours, like the hands of a watch. The time you have already spent, and the time you might spend to get there, are marked beside them.
The larger marble shows your current location, while the smaller proportionally maps your final destination. The marbles change their position accurately according to the direction you are facing.
as the marbles touch each other, voice recognition is triggered, expecting you to say where you want to go. They then roll apart, mapping the locations of here and there.
[control] pull marbles together
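As a rough illustration of the proportional mapping described above, here is a minimal Python sketch. The function name, the hand-span constant, and the logarithmic distance compression are all assumptions made for the sketch; the project itself specifies no algorithm.

```python
import math

def marble_offset(here, there, heading_deg, hand_span=0.08):
    """Map the destination onto a small offset (metres) from the 'here'
    marble, scaled to fit within the span of a hand and rotated so the
    offset stays true to the direction the wearer is facing."""
    dx = there[0] - here[0]          # east displacement (m)
    dy = there[1] - here[1]          # north displacement (m)
    dist = math.hypot(dx, dy)
    if dist == 0:
        return (0.0, 0.0)
    bearing = math.atan2(dx, dy)               # bearing of the destination
    rel = bearing - math.radians(heading_deg)  # relative to facing direction
    # Compress any real-world distance into the hand span; a log scale
    # keeps nearby and faraway destinations distinguishable.
    r = hand_span * math.log1p(dist) / math.log1p(dist + 1000)
    return (r * math.sin(rel), r * math.cos(rel))
```

Facing the destination places the smaller marble directly ahead of the larger one; turning the hand (or body) rotates the offset accordingly.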
Detach from skin
on a particular finger gesture, skin adhesion is deactivated and the marbles are released from the skin
[control] Spock finger move
marbles become warmer the closer you get to your final destination, also reminding you whether you are walking in the right direction
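A thermal feedback like this could be modelled very simply; the sketch below is illustrative only, and all constants (skin temperature, maximum warming, effective range) are assumptions, not specifications of the project.

```python
def marble_temperature(distance_m, skin_temp=33.0, max_delta=6.0, range_m=2000.0):
    """Warmth grows as the wearer nears the destination: beyond range_m
    the marble sits at skin temperature, at the goal it is max_delta
    degrees warmer. All constants are illustrative."""
    closeness = max(0.0, 1.0 - min(distance_m, range_m) / range_m)
    return skin_temp + max_delta * closeness
```

Walking the wrong way makes the distance grow and the marble cool back towards skin temperature, which is the reminder the text describes.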
Moves to give directions
when you don't know where to turn, the bigger marble will roll, e.g. left, left, right, to suggest which direction you could take
[control] tap index to the thumb
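The roll sequence could be derived from the bearing changes along a route. The following Python sketch assumes a list of route bearings and a turn-detection tolerance; both are illustrative choices, not part of the project.

```python
def roll_hints(route_bearings, tolerance_deg=30.0):
    """Turn each change of bearing along a route into a roll hint for
    the bigger marble: 'left', 'right', or nothing when the path is
    effectively straight."""
    hints = []
    for prev, cur in zip(route_bearings, route_bearings[1:]):
        turn = (cur - prev + 180) % 360 - 180   # signed turn, in (-180, 180]
        if turn > tolerance_deg:
            hints.append("right")
        elif turn < -tolerance_deg:
            hints.append("left")
    return hints
```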
Because following a navigational dot on the screen of your phone won’t make you even realise, nor remember, that this street is actually parallel to the one you walk along every other day.
representation as you would have developed it on your own - a mind map (places you have visited get encoded in your brain; the electrode surface of the marbles can access, but not store, that data, and then project it onto your skin)
[control] bend thumb finger
common spatial data, organised in a simplified way, to focus on the points of decision making only. In this way information is reduced to its minimum.
[control] bend index finger
something that makes your spatial experience easier (e.g. transportation nodes)
[control] bend middle finger
something that makes your spatial experience harder (e.g. traffic, areas with high stress level, areas with frequency levels that do not respond well to your current physical condition)
[control] bend ring finger
non-algorithmic personalisation (locations where you have experienced delight, either sensed by the marbles or input manually)
[control] bend pinky finger
you can see as many info layers at the same time as you want
[control] more fingers bend at the same time
[control] spreading fingers apart
getting detailed information
[control] bringing fingers together
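The layer selection above could be sketched as a simple mapping from bent fingers to overlays. The finger names, layer labels, and function are hypothetical, chosen only to mirror the gesture list.

```python
# Hypothetical mapping of bent fingers to the information layers listed
# above; the labels are illustrative, not part of any real API.
LAYERS = {
    "thumb":  "mind map of visited places",
    "index":  "decision points",
    "middle": "helpers (e.g. transport nodes)",
    "ring":   "obstacles (traffic, stress areas)",
    "pinky":  "personal delight locations",
}

def active_layers(bent_fingers):
    """Several fingers bent at once overlay several layers, mirroring
    'you can see as many info layers at the same time as you want'."""
    return {LAYERS[f] for f in bent_fingers if f in LAYERS}
```

Spreading and bringing fingers together would then adjust the detail level within whichever layers are active.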
When we access data on our phones, someone else can often see what we do with it. BrAInfu.cc imagines a situation in which data about our interactions does not get stored or shared behind our backs. It concentrates on data generated by a person’s own psychomotorics, not on something algorithmically predicted to be general for us all.
Our brain works as a kind of plug-and-play device. It doesn’t need our biological receptors, but can process any kind of data that comes into contact with our nerves. The surface of the interface contains electrodes that, while in contact with the skin, react and enable data transfer to the nerves.
The surface of the interface is made of a skin-adhesive material, so it stays attached to the skin unless released from it by a gesture trigger (the material follows the logic of the changing structure of a gecko’s skin under different emotional conditions).
The Bio Processor works as a data transmitter, and therefore has no capacity to store data. It operates independently from Computer B, so once it loses contact with the skin, it also loses all traces of personal data.
Computer B has a memory of geo-representative data streamed from public satellites. It communicates one-way only, to update the data it contains.
The main computer synchronises geodata from Computer B with what is being perceived by the sensors and bio-processed. The main computer monitors the marbles, and responds to the gesture control of a person by movement, or controls laser projection onto skin.
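The privacy-by-architecture separation described above can be sketched as two components: a read-only geodata store and a stateless relay. Class and method names are invented for this sketch; the project describes the architecture only conceptually.

```python
class ComputerB:
    """Holds geo-representative data; communication is one-way
    (satellite updates in, reads out, never personal data back)."""
    def __init__(self):
        self._geodata = {}

    def update(self, region, data):   # one-way satellite stream
        self._geodata[region] = data

    def read(self, region):           # the main computer only reads
        return self._geodata.get(region)


class BioProcessor:
    """Pure transmitter: relays sensor readings but stores nothing
    durably, so detaching from the skin leaves no trace of personal
    data."""
    def __init__(self):
        self._attached = False
        self._buffer = None           # transient only

    def attach(self):
        self._attached = True

    def relay(self, reading):
        if self._attached:
            self._buffer = reading    # held only while attached
            return reading
        return None

    def detach(self):
        self._attached = False
        self._buffer = None           # all traces of personal data lost
```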
Besides the hippocampus (responsible for the majority of spatial orientation and memorisation), our brain contains a variety of cells that encode our spatial position when we pay attention to our surroundings. In this way our brain constructs a mental representation of the space.
When the interface is in touch with our skin, the Bio Processor can read data from our brain and then project a schematic representation of the locations we have visited onto our skin (one option would be a synthetic subcutaneous projection, where the marbles inject bio-synthetic fibres capable of visualising information underneath the surface of the skin for a couple of seconds, until those fibres innocuously evaporate).