A sense is a faculty by which the body perceives an external stimulus: one of the faculties of sight, smell, hearing, taste, and touch. It basically helps you perceive and understand the world around you, a way for your body to gather information. That definition seems limited, though, because in reality we have far more than five senses. Take eyesight alone: being able to see is really at least two senses, because the rods in your eyes sense brightness while the cones sense color, and your eyes also have a sense of movement. So when we look at the senses, we have to think about what they're really there for. Sight absorbs light and motion so you can experience the world around you visually; sound is the detection of air vibrations; touch is the detection of physical contact; smell is the pickup of volatile chemicals in the air around you. All of these are fairly well understood, and even lower-order animals use them to understand the world. Taste is very interesting because it evolved as a chemical process to decide whether food was poisonous or rotten.
The concept of a "sense" is very poorly defined. How many do we actually have? There's the sense of pressure, distinct from the sense of touch: you know there is force on your skin even when you can't actually feel the object itself. Pain has its own network of nerve receptors, and itch has its own system, separate from the sense of touch. Pain and itch used to be considered parts of touch, but we found that they have their own receptor systems in the body. There's thermoreception, the ability to tell when something is warm or cold, and proprioception, the sense that tells your body where your arm is in relation to your leg, where your limbs are in space. Proprioception is what police are testing when they make you close your eyes and touch your nose, and what you're using when you pat your head and rub your belly at the same time. There's also balance, or equilibrioception: a sense of gravity, of up and down. It seems simple, but some people with inner-ear problems can't tell up from down and have trouble even standing. There are stretch and tension receptors in your muscles and gastrointestinal tract, and in your blood vessels; if you get a headache, part of that has to do with those receptors. Chemoreceptors are connected to vomiting, and hunger and thirst receptors tell you when to eat or drink. And those are just the basic ones, the physical receptors the body uses to get word to your brain about what's happening around it. There are more beyond that, some we have and some we don't. Magnetoreception lets birds tell which direction north is; dogs have this too, which is why they spin before they poop: they prefer lining up on a north-south axis. Humans have it, but ours is weak, tucked up in the nose, and doesn't really work for most people.
Electroreception lets sharks detect electrical fields in the water nearby. Some animals create electrical fields to sense what's happening around them, and some just pick up the fields other animals give off, since many animals generate low-level bioelectrical fields all the time. There's also the sense of being watched: you can tell out of the corner of your eye when someone's been looking at you. And there's an innate, abstract sense of time; even children have an internal clock.
How close are we to the Matrix, where the outside world is brought to us on wires? We can hear, see, smell, and taste, all with this biological brain. But electronics are starting to learn to interact directly with some of these channels. The newest digital sense is electronic taste, popular with the food-show crowd. Scientists at the National University of Singapore created a device that uses electrodes to stimulate the taste buds directly, and it can even use temperature to alter the flavor experience. At the moment it can replicate the four basic tastes, salty, sweet, sour, and bitter, but not the fifth, umami. Without smell, though, taste simulation only gets you so far. Back in 2003 the iSmell was introduced, which worked something like an inkjet printer for smells, with scent cartridges. Unfortunately, because the nose is so sensitive, we can tell when tiny imperfections make the tech seem off. Digital-olfaction researchers have gathered at conferences to work on the problem, and many agree there is a long way to go, but the potential benefits are vast: think downloadable colognes, room fresheners or bug repellents, aromatherapies to cure ills and relieve stress at the push of a button in your coat, all as digital downloads. We just need to perfect it. The most dreamed-about electronic sense is probably bionic eyeballs. The FDA approved a retinal prosthesis called the Argus II, which implants directly into the high-resolution center of the retina. The electronics can restore some shape recognition and light sensitivity, and while this doesn't cure blindness, it's pretty advanced. Bionic eyes involve equipment a bit like Geordi La Forge's visor: a person wears a device with a camera, and the signals are interpreted by the brain after some training and experience. But the best and most widely used sensory prosthesis is definitely electronic hearing.
We can enhance and clean up sounds with computers, but the technology can only get so small. We can't cure deafness, but hearing loss can be digitally repaired with a cochlear implant, which works like a super hearing aid: it receives sound waves, translates them into electronic signals, and sends them directly into the auditory nerve. Unfortunately, instead of clear sound, users say the implant sounds kind of clunky and digital. Like the retinal implant, it attempts to replace damaged human parts and sends signals to the brain on its own, so it takes some training, and it's not perfect yet.
The most famous type of sense entanglement is synesthesia, the combination of senses in the brain: the stimulation of one sensory or cognitive pathway leads to an automatic, involuntary experience in a second sensory or cognitive pathway. It occurs in roughly 5 to 15 percent of people, and in less scientific terminology, it means sometimes you'll "taste music" or "hear colors".
What senses could you have in the future? Some body hackers have them already, and you might be able to get them soon. The senses we could add are practically unlimited: if we define a sense as any consistent input the brain can process, the sky is the limit. Because the brain is plastic, we could add any number of new input streams and it would learn to handle them. We can also augment our existing senses in novel ways. Just as we know things are wet even though we have no dedicated water receptors, we could use the brain's plasticity to do almost anything. David Eagleman wears something called a sensory vest: a computerized vest that takes data from the internet and turns it into a series of vibrations across different parts of the garment. After he wore it long enough to understand what the vibrations were doing, he stopped consciously feeling them, much the way you get used to your phone ringing in your pocket and just know it rang without really attending to the sensation. In one experiment, live weather data was fed into the vest, and eventually Eagleman could tell what the weather was going to be from that input alone. He didn't think, "the left pocket of my vest is vibrating, so there's going to be rain"; he just kind of knew it would rain in 20 minutes. That's how good the brain is at soaking up information. It's so good that people who lose senses can compensate: people who become blind can learn to echolocate. Children who are born blind often learn to click, and that sound bounces off the room around them and comes back to their ears. The spacing of the ears lets them determine the angles of the walls, what's in the room, and whether surfaces are soft or hard. This is how bats get around at night.
IUt’s a myth that bats can see, they can see, but they use the sonar information because it’s so precise, more prescise. They can bounce sound off the environment and get spacial data, intersstingly most children who natively and intuitively learn to click are actually discouraged from doing so because it’s weird. If you learn to do this you can so much with it, Daniel Kish rides his bike outside with people walking, he can ride and click at the same time. His clicks and then uses that sound inforamtion. And it’s not just hearing, the brain has wired into the visual cortex as well and the visual centers of his brain fires when he clicks. Meaning, his brain has taught itself to see without using his eyeballs. There are other things we can do to augment senses like with prosthetics, MIT came up with a pressure sensative prosthetic which is only one of many senses we use to grab onto things and it does help give feedback when you’re gripping something. This new prosthetic senses how much grip its giving and gives that information back to the person wearing, how they do that is incredible. The DARPA created an interface that actually works with current and existing nerve cells so if you have an arm that had to be ampoutated for whatever reason you still have nerve cells at the end of the arm, it uses a hub and wraps around the nerve to a piece of metal and has little spokes where the nerve can grow around the device, that with a little bit of electrical stimulation allows signals to be sent to the brain. The brain leanrns to pick those up and it can touch things, sense pressure.. They call it “the Luke Arm” after luke skywalker’s stump he got after he yelled at his dad. But speaking of that, altering your body is not just for people who’ve been in accidents, there is also a community of body modification people, body hackers are very populatr when it comes to adding sensors. 
The best-known mods are external, like Eagleman's vest, but the internal body mods are really interesting. For example, you can embed a rare-earth magnet in your fingertip, which not only lets you sense electromagnetic fields, like where north is, but also reacts to electrical devices: anything creating electromagnetic flux makes the magnet vibrate ever so slightly, so the person can use their finger to sense power-cord transformers and microwaves, feel laptop fans and how fast they're spinning, and interact with the world in a way you simply can't without that electromagnetic sense. There are also things like the sensor suit, similar in spirit to Eagleman's vest. It uses ultrasonic pulses to detect objects up to 60 feet away and applies pressure to the skin depending on how close those objects are. If you programmed it to sense open Wi-Fi networks, you could be walking down the street and intuit that 60 feet ahead and to your right is an open network, which could be very useful when your data runs low. Neil Harbisson is colorblind, so he created a camera that he attached to his head; he is actually a registered cyborg. Harbisson's camera looks at whatever he's looking at and senses its color. Because he can't see the color himself, the device converts each hue into a musical tone and sends it into his body using bone conduction, so no one else hears it and he doesn't have to wear headphones; it just vibrates slightly and he hears it. When he looks at a painting, he can perceive a whole field of different hues. All of these advances, prosthetics, Harbisson's implant that lets him hear color, are built on advancing mobile technology. The things we take for granted in our pockets can do a lot.
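The distance-to-pressure mapping such a suit needs can be sketched in a few lines. This is a hypothetical linear mapping written for illustration, not the actual device's firmware; the 60-foot range comes from the description above:

```python
MAX_RANGE_FT = 60.0   # detection range of the ultrasonic sensor (from the text)

def pressure_level(distance_ft: float) -> float:
    """Map a detected object's distance to a skin-pressure intensity
    in [0.0, 1.0]: closer objects press harder, and objects at or
    beyond the maximum range produce no pressure at all.

    A linear falloff is assumed purely for simplicity.
    """
    if distance_ft >= MAX_RANGE_FT:
        return 0.0
    if distance_ft <= 0.0:
        return 1.0
    return 1.0 - distance_ft / MAX_RANGE_FT
```

An object halfway across the detection range would press at half intensity; as it approaches, the pressure ramps up smoothly, which is what lets the wearer read distance intuitively rather than symbolically.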
Welcome to the future brain: we can now start importing technology to enhance our senses, our natural perception of the world. As it stands, as biological creatures we only see a very small strip of what's going on. Take electromagnetic radiation: there's a little strip of it we can see, which we call visible light, but the rest of the spectrum, radio waves, cell phone signals, TV, gamma rays, X-rays, infrared heat, is invisible to us because we don't have biological receptors for it. CNN is passing through your body right now and you don't know it, because you don't have the right receptors. It turns out the part of the spectrum we see is about one ten-trillionth of it, so we're not seeing most of what's going on. What's very interesting, I think, is that as we keep pushing with technology, we'll be able to take more and more data from those invisible parts of the world and start feeding it into our brains. Snakes see in the infrared range and honeybees see in the ultraviolet range, and there's no reason we can't start building devices to see those ranges and feed the data directly into our brains. What the brain is really good at is extracting information from streams of data, and it doesn't matter how those data streams get there. David Eagleman's lab is building a vibratory vest that feeds sensory information in through the skin of the torso rather than through the typical sensory channels. For example, for people who are deaf and want to hear, a microphone on the vest captures the auditory stream and turns it into a matrix of vibrations on the skin. That feeds the brain electrical signals representing the auditory information. And if it sounds crazy that you'd ever be able to understand all these signals through your skin, remember that all the auditory system is doing is taking signals and turning them into electrical signals in your brain.
So we’re developing this right now so that deaf people will be able to hear through their skin, but the next stage is to feed not just auditory information but other data streams into the vest, for example stock market data or weather data, and people will be able to percieve these data streams just by walking around all day and unconsciousnly having this stream of information coming into their body and it will expand their sensory world. I think this is where technology and the brain have a very fertile meeting ground because we will be able to enhance the window of reality that we’re able to see.
We used to think of the brain as a fixed system with different parts dedicated to specific jobs, like seeing, deciding, or moving. But no region works in isolation. The brain is a vast, dynamic, interconnected network that's always changing. Instead of hardwired, I like to think of the brain as "livewired", and that flexibility opens up new possibilities for our future. It could be argued that this future has been with us since the 1970s in the form of a simple piece of technology: the cochlear implant, which can give hearing to deaf people. It picks up sounds and converts them to electrical signals that plug directly into the cells of the inner ear. When it was first introduced, scientists didn't think it would work, because biology is wired with such precision and specificity, while the implant just takes crude signals and shoves them into the brain in ways the brain isn't expecting. The cochlear implant represents a marriage between metal electrodes and biological cells, and yet it works. Around the world, almost a quarter of a million people have had the chance to hear for the first time thanks to these implants. Here's how. Whether it comes from your ears, your eyes, or a touch on your skin, all the information that enters your brain is converted into the same stuff: electrochemical signals, the common currency of the brain. When the implant produces these signals, however crudely, the brain finds a way to make sense of them. It hunts for patterns, cross-referencing with the other senses. At first the signals are unintelligible, but soon meaning emerges. Cochlear implants reveal something amazing about the brain: whatever signals you feed into it, it will figure out how to extract something useful. As long as the incoming data has a structure that maps onto the outside world, the brain will figure out how to decode it. And this turns out to be one of nature's greatest tricks.
And now that we know about it, it opens up a world of possibilities. Why restrict ourselves to replacing lost or damaged senses? There must be ways to enhance or add to the senses we already have. In Eagleman's lab he's created a vest that turns sound into patterns of vibration felt on the skin of the torso. The idea is that, given enough time, the wearer's brain will learn to automatically decode these vibrations, and they'll instinctively feel and understand the information. There's an "alien language game" in which a word is presented to you as a pattern of vibration on your torso; over time you get better and better as the brain starts decoding how these inputs map onto words you know, and your job is just to figure out what the language of the vest is. At first the vibrations make no sense, just random buzzing, but eventually the brain starts to pick up the patterns. It seems strange that you could understand information through your torso, but that's the surprise: it doesn't matter how signals find their way to the brain. Our eyes, nose, and mouth are simply the peripheral senses we inherited from our evolutionary past, shaped by what they were used for at the time. But we don't have to stick with them. It might be possible to plug a sensory channel into some unusual port on the brain, and the brain will just figure it out. Maybe in the near future we can invent new kinds of sensory devices and plug them directly into the brain. In theory, there's no limit to the new sensory expansions we can create. Imagine feeding in real-time weather data so you could feel that it's raining 100 miles away, or feel that it's going to snow tomorrow. Or imagine feeding in real-time Wall Street data and developing an intuitive sense of how the markets are moving; you'd be plugged into the global economy.
Because of the brain's capacity to take on new inputs, we should be able to expand the experience of being human. We could enjoy things that wouldn't be possible with the traditional senses we arrived with. It may be that the evolution of our technology, rather than our biology, is what guides the journey of our species from here on out. As we move into the future, we'll increasingly design our own portals onto the world. As far as we can tell, there's no limit on what the brain can incorporate. We now have the tools to shape our own sensory experiences and widen our small windows on reality.
Can we create new senses for humans? David Eagleman says we are built out of very small stuff and embedded in a very large cosmos, and the fact is we are not very good at understanding reality at either of those scales, because our brains haven't evolved to understand the world there. Instead we're trapped on a very thin slice of perception in the middle. And even on that slice of reality we call home, we're not seeing most of the action. Take the colors of our world: light waves, electromagnetic radiation that bounces off objects and hits specialized receptors in the back of our eyes. But we're not seeing all the waves out there; in fact, what we see is less than a ten-trillionth of what's out there. So you have radio waves, microwaves, X-rays, and gamma rays passing through your body right now, and you're completely unaware of it, because you don't come with the proper biological receptors for picking them up. There are thousands of cell phone conversations passing through you right now, and you're utterly blind to them. It's not that these things are inherently unseeable: snakes include some of the infrared spectrum in their reality, and honeybees include the ultraviolet in their view of the world. Butterflies can see far more colors than us because they have more types of cones, and mantis shrimp have more receptor types still. Of course, we build machines in the dashboards of our cars to pick up signals in the radio-frequency range, and machines in hospitals to pick up the X-ray range, but you can't sense any of those by yourself, at least not yet, because you don't come equipped with the proper sensors. What this means is that our experience of reality is constrained by our biology, and that goes against the common-sense notion that our eyes, ears, and fingertips just pick up the reality that's out there. Instead, our brains are sampling just a little bit of the world. Across the animal kingdom, different animals pick up on different parts of reality.
So in the blind and deaf world of the tick, the important signals are temperature and butyric acid. In the world of the black ghost knifefish, the sensory world is lavishly colored by electrical fields. And for the echolocating bat, reality is constructed out of air compression waves. That's the slice of their ecosystem that each can pick up on, and we have a word for this in science: the "Umwelt", the German word for "the surrounding world". Presumably every animal assumes its Umwelt is the entire objective reality out there, because why would you ever stop to imagine there's something beyond what you can sense? Instead, we accept reality as it's presented to us. So let's run a thought experiment. Imagine you are a bloodhound: your whole world is about smelling. You've got a long snout with 200 million scent receptors in it, wet nostrils that attract and trap scent molecules, and nostrils with slits so you get big nosefuls of air. Everything is about smell for you. One day you stop in your tracks with a revelation. You look at your human owner and think: what is it like to have the pitiful, impoverished nose of a human? What is it like to take in a feeble noseful of air? How can you not know there's a cat 100 yards away, or that your neighbor was on this very spot six hours ago? Because we're humans, we've never experienced that world of smell, so we don't miss it; we are firmly settled into our Umwelt. But the question is: do we have to be stuck there? I'm interested in how technology might expand our Umwelt and change the human experience into something post-human. We already know we can marry our technology to our biology, because there are hundreds of thousands of people walking around with artificial hearing and artificial vision. Artificial hearing works like this: you take a microphone, digitize the signal, and put an electrode strip directly into the inner ear.
Or with the retinal implant: you take a camera, digitize the signal, and plug an electrode grid directly into the optic nerve. As recently as 15 years ago, many scientists thought these technologies wouldn't work, because they speak the language of Silicon Valley, which is not exactly the same dialect as our natural biological sense organs. But the fact is, it works: the brain figures out how to use the signals just fine. How do we understand that? Well, here's the big secret: your brain is not hearing or seeing any of this. Your brain is locked in a vault of silence and darkness inside your skull. All it ever sees are electrochemical signals that come in along different data cables; that's all it has to work with, nothing more. Amazingly, the brain is really good at taking in these signals, extracting patterns, and assigning meaning, so that it takes this inner cosmos and puts together a story: "this", your subjective world. But here's the key point: your brain doesn't know, and doesn't care, where it gets the data from. Whatever information comes in, it just figures out what to do with it. It's a very efficient kind of machine, essentially a general-purpose computing device: it takes in everything and figures out what to do with it. And that frees up Mother Nature to tinker with different sorts of input channels. Eagleman calls this the P.H. model of evolution, where P.H. stands for Potato Head; he uses the name to emphasize that all the sensors we know and love, our eyes, ears, and fingertips, are merely peripheral plug-and-play devices. You stick them in and you're good to go; the brain figures out what to do with the data that comes in. And when you look across the animal kingdom, you find lots of peripheral devices. Snakes have heat pits with which to detect infrared, and the ghost knifefish has electroreceptors.
And the star-nosed mole has an appendage with 22 fingers on it, with which it feels around and constructs a 3D model of the world, and many birds have magnetite so they can orient to the magnetic field of the planet. This means nature doesn't have to continually redesign the brain; with the principles of brain operation established, all nature has to worry about is designing new peripherals. The lesson that surfaces is that there's nothing really special or fundamental about the biology we come to the table with. It's just what we inherited from a complex road of evolution, but it's not what we have to stick with. Our best proof of principle comes from what's called "sensory substitution": feeding information into the brain via unusual sensory channels, and letting the brain figure out what to do with it. That might sound speculative, but the first paper demonstrating it was published in the journal Nature in 1969. A scientist named Paul Bach-y-Rita put blind people in a modified dental chair, set up a video feed, and placed objects in front of the camera; whatever the camera saw, the person would feel poked into their back by a grid of solenoids. So if you wiggled a coffee cup in front of the camera, you'd feel that in your back. Amazingly, blind people got pretty good at determining what was in front of the camera just by feeling it in the small of their back. There have been many modern incarnations of this. The sonic glasses take a video feed of what's in front of you and turn it into a sonic landscape; as things move around, it sounds like a cacophony at first, but after several weeks, blind people start getting pretty good at understanding what's in front of them based on what they're hearing. And it doesn't have to be through the ears: one system uses an electrotactile grid on the forehead, so whatever's in front of the video feed, you feel it on your forehead. Why the forehead?
Because you're not using it for much else. The most modern incarnation is called the BrainPort: a little electrode grid that sits on your tongue, with the video feed turned into little electrotactile signals. Blind people get so good at using this that they can throw a ball into a basket or navigate complex obstacle courses. They come to see through their tongue. That sounds completely insane, but remember, all vision ever is, is electrochemical signals coursing around in the brain; the brain doesn't know where the signals come from, it just figures out what to do with them. Eagleman's lab is working on sensory substitution for the deaf, a project he's undertaken with graduate students. He wants sound from the world to be converted in a way that lets a deaf person understand what is being said, using portable computing, running on cell phones and tablets, as wearable tech you could put on under your clothing. As he speaks, the sound is captured by a tablet and mapped onto a vest covered in vibratory motors, just like the motor in your cell phone, so the sound is translated into a pattern of vibration on the vest. This is not just conceptual: as Eagleman speaks, the sound becomes dynamic patterns of vibration, and he's feeling the sonic world around him. He's been testing it with deaf people, and it turns out that after a bit of time, people start understanding the language of the vest. A deaf grad student in Eagleman's lab named Jonathan is able to interpret the vibrational patterns of spoken words. This has been used for sensory substitution, but we can also use it for sensory addition: how can we use tech like this to add a completely new kind of sense and expand the human Umwelt?
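The sound-to-vibration mapping the vest performs can be sketched in a few lines. This is a simplified guess at the kind of pipeline involved, not Eagleman's actual implementation: split each short audio frame into frequency bands and drive one motor per band at an intensity proportional to that band's energy. The motor count and sample rate are assumptions.

```python
import numpy as np

N_MOTORS = 8           # assumed number of vibratory motors on the vest
SAMPLE_RATE = 16_000   # Hz; assumed audio sample rate

def frame_to_motor_levels(frame: np.ndarray) -> np.ndarray:
    """Turn one short audio frame into per-motor vibration levels in [0, 1].

    The frame's magnitude spectrum is split into N_MOTORS frequency
    bands; each motor's level is that band's energy, normalized so the
    strongest band vibrates at full intensity.
    """
    spectrum = np.abs(np.fft.rfft(frame))        # magnitude spectrum
    bands = np.array_split(spectrum, N_MOTORS)   # one band per motor
    energy = np.array([band.sum() for band in bands])
    peak = energy.max()
    return energy / peak if peak > 0 else energy

# A pure 2 kHz tone concentrates its energy in a single band,
# so it should drive mostly one motor.
t = np.arange(512) / SAMPLE_RATE
levels = frame_to_motor_levels(np.sin(2 * np.pi * 2000 * t))
```

The design point this illustrates is that the mapping preserves structure: nearby frequencies land on nearby motors, which is what gives the brain a consistent pattern it can learn to decode.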
For example: could we feed real-time data from the internet into somebody's brain, and could they develop a direct perceptual experience of it? In one experiment, a subject wearing the vest feels a real-time feed from the internet for five seconds; then two buttons appear on a smartphone and he has to make a choice. He makes the choice and gets feedback one second later. The subject has no idea what the patterns mean, but we're watching to see whether he gets better at figuring out which button to press. He doesn't know that what we're feeding him is real-time stock market data, that he's making buy and sell decisions, and that the feedback tells him whether he did the right thing or not. So can we expand the human Umwelt until, after several weeks, he has a direct perceptual experience of the economic movements of the planet? You could also link the vest to your Twitter feed and funnel in the flow of positive and negative words to feel how your audience thinks of you. Eagleman's lab is also expanding the Umwelt of pilots. In this case the vest streams nine different measures from a quadcopter, including pitch, yaw, roll, orientation, and heading, and that improves the pilot's ability to fly it; it's as if the pilot's skin extends up there, far away. The vision for the future is to take a cockpit full of gauges, like the Millennium Falcon's, and instead of trying to read the whole thing, you feel it. We live in a world of information now, and there's a difference between accessing big data and experiencing it; Han Solo could fly the ship with just his vest. The same approach is being explored in the military to help Air Force pilots become aces. There's no end to the possibilities on the horizon for human expansion. Imagine an astronaut being able to feel the overall health of the International Space Station, or feeling the invisible states of your own health, like blood sugar and your microbiome, or having 360-degree vision, or seeing in infrared or ultraviolet, or being linked to your social network and the appliances in your house.
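The buy/sell experiment boils down to a simple trial loop: present a pattern, collect a binary choice, reveal feedback. Here is a minimal sketch of that protocol under stated assumptions; the subject is simulated as a random guesser (the real experiment feeds vibration patterns to a person), and all function names are hypothetical:

```python
import random

def run_trial(market_up: bool, subject_choice: str) -> bool:
    """One trial: the subject felt a vibration pattern encoding a
    market move and chose 'buy' or 'sell'. Returns True if the choice
    matched the hidden market direction (the feedback signal)."""
    correct = "buy" if market_up else "sell"
    return subject_choice == correct

def session(n_trials: int = 100, seed: int = 0) -> float:
    """Simulate a session with a subject who guesses at random,
    establishing the ~50% baseline a learning subject must beat."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_trials):
        market_up = rng.random() < 0.5        # hidden real-time data
        guess = rng.choice(["buy", "sell"])   # subject's button press
        hits += run_trial(market_up, guess)
    return hits / n_trials
```

The experimental question is then just whether a human wearing the vest drifts measurably above that chance baseline as the weeks go by.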
So as we move into the future, we'll increasingly be able to choose our own peripheral devices. We no longer have to wait for Mother Nature's sensory gifts on her evolutionary time scales; we have the tools to define our own trajectory and worldview. So the question now is: how do you want to go out and experience your universe?
Maybe we can direct this power in new ways, and open a new chapter in the human story.
Imagine sensory augmentation that lets you see in infrared or hear ultrasonic frequencies, that lets you zoom in on things too small or too far away to see clearly, that enhances your sense of taste, smell, or touch, maybe even adding new things we can't normally taste. There's some debate about what we normally CAN taste. We know five for sure: sweet, sour, bitter, salty, and umami, the savory taste we associate with MSG. Others have been suggested, and there's argument about what exactly taste is, but with just those five we get a panoply of dishes and tastes by mixing their intensities and ratios. It's the same with color: we essentially see only red, green, and blue, yet we can still make a huge spectrum of hues and textures. If we added a fourth cone, say one sensitive to the infrared, could we alter the brain to see it as a new distinct color, or several, given that the infrared spectrum is far bigger than the visible one? A device that saw infrared but showed it to us as red would be handy, but what new experiences become available when it's a genuinely new color? How much better is a meal if you can taste three or four entirely new flavors? It would be handy to hear ultrasonics or see behind you, but how much neater to hear instruments designed JUST for ultrasonic ranges, or to truly see 50 or 60 colors, not just combinations of three? Of course, our brains would need to be physically modified for things like that. It's also worth remembering that sometimes you don't want your senses augmented; you might want them attenuated. It would be rather nice to shut off your smell around a foul scent, or tune out loud, distracting noises, particularly if it could be done automatically: removing glare and blinding lights, or filtering out just the sounds you don't want to hear so you can focus on what someone's saying in a crowded room.
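A toy calculation shows why a fourth receptor channel would matter so much more than a slightly wider red. Assuming, purely for illustration, that each channel can report about 100 distinguishable intensity levels:

```python
def distinct_colors(levels_per_channel: int, n_channels: int) -> int:
    """Count the distinguishable combinations if each receptor channel
    can report a given number of intensity levels. A crude toy model:
    real color discrimination is messier, but the multiplicative
    growth is the point."""
    return levels_per_channel ** n_channels

# A fourth channel doesn't add colors, it multiplies them:
print(distinct_colors(100, 3))  # 1000000
print(distinct_colors(100, 4))  # 100000000
```

So an extra infrared cone wouldn't just append a new stripe to the rainbow; it would multiply the whole perceptual palette by the resolution of the new channel, which is why "a new distinct color, or several" undersells it.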
The alternative to that is ears you can swivel to focus on sounds, the way cats can, but that would be a physical augmentation, so let's move on to those.