HUMANITY+

Bringing you tomorrow, today


The Swarm: Microbots, Kilobots, Biomimicry, and SRMRs

Many people, even children, are at least somewhat familiar with the concept of swarm bots, a concept they have probably seen explored in the movie Big Hero 6, where they are called Microbots.

MICROBOT: Any tiny robot. Like a nanite (nanobot) but larger, varying in size from insect-sized down to microscopic. (Note: the distinction between microbot and nanobot is somewhat arbitrary; generally, anything smaller than about 10 microns as a total unit is considered a nanite.)

KILOBOTS

The robot apocalypse is now closer than ever. Behold the wonder of the world’s first ever robot swarm. Writing in the journal Science, engineers at Harvard reported that they designed 1,024 tiny robots able to organize themselves into any pattern they are instructed to make, without any human intervention. It is being hailed as a milestone in collective artificial intelligence. The robots, nicknamed Kilobots (a scary name), are small, just a few centimeters across, cheap to make, and very simple. They communicate with each other using infrared signals and follow the same basic principles that ants, fish, and even individual cells use to organize themselves.

The Harvard engineers programmed each Kilobot to do three relatively simple things: (1) figure out where it is in relation to its fellow robots, (2) identify the edge of the group of robots, and (3) move along that edge until it finds a spot where it is allowed to stop. The Kilobots were given instructions to make a certain shape, and four seed robots were set out as markers for where the formation was supposed to start. After that, the bots simply moved along the outer edge of the group until each reached a coordinate that filled in the shape they were trying to make. Over a few hours, all 1,000-plus robots followed this pattern until the shape was complete. It is the first time a large group of robots has been shown to follow a collective algorithm, or “shared set of instructions and rules.” The engineers say this technology could one day be used to have robots construct buildings, treat disease inside the body, or even form a network of self-driving vehicles that coordinate without traffic jams.
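To make those three rules concrete, here is a toy sketch of the same self-assembly idea in a tiny grid world. It is a heavy simplification and only an illustration: real Kilobots localize themselves from infrared distance measurements to their neighbors, whereas here the robots are simply handed grid coordinates, and the 4x4 target shape, seed positions, and helper names are all made up for the example, not the published Harvard algorithm.

```python
# Toy sketch of Kilobot-style self-assembly in a grid world.
# Rule 1 (localisation) is assumed away; rules 2 and 3 (find the edge of the
# group, move along it until a stopping spot is allowed) are modelled crudely.

TARGET = {(x, y) for x in range(4) for y in range(4)}    # desired 4x4 square
occupied = {(0, 0), (1, 0), (0, 1), (1, 1)}               # four "seed" robots

def around(cell):
    """8-connected neighbourhood of a grid cell."""
    x, y = cell
    return [(x + dx, y + dy) for dx in (-1, 0, 1) for dy in (-1, 0, 1)
            if (dx, dy) != (0, 0)]

def edge(occ):
    """Rule 2: the edge is the set of empty cells touching the group."""
    return {n for c in occ for n in around(c) if n not in occ}

# Each new robot walks along the current edge (rule 3) until it reaches an
# empty cell that belongs to the target shape, then stops there for good.
while not TARGET <= occupied:
    frontier = edge(occupied)
    stop_spots = sorted(frontier & TARGET)    # cells where stopping is allowed
    occupied.add(stop_spots[0])               # the robot halts at the first one found

print(sorted(occupied))                       # the completed 4x4 shape
```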

Swarm behavior is when a group of animals (birds, fish, termites, ants) acts as one big thing: they can perfectly synchronize their movements or build huge mounds. Wouldn’t it be great if we could get machines to act like that? The science of swarming behavior is inspiring scientists to build robots that, in the future, might be able to help with everything from building construction to search and rescue missions. The thing that makes swarming behavior so perfect for robotics is that in a swarm, no one member does anything too complicated; each animal is just following a few simple rules, like staying the same distance from all of its neighbors. This means you don’t have to make the robots super fancy, and you don’t need to program each one with exactly what to do. Instead you just give a bunch of robots the same basic rules, and because of how those rules play out in large numbers, the group will self-organize and figure out how to do whatever complicated thing we want it to do. This is already a reality in today’s robotics. In 2014, researchers at Harvard made over 1,000 robots that could arrange themselves into almost any shape or pattern the scientists wanted. The scientists never told any individual robot where to go; instead they just gave each one the same simple rules to follow, like measuring how far away you are from your neighbor, or finding an outer edge of the swarm and moving along that edge. By doing those things over and over, the robots figured out exactly where to go.

The same Harvard engineers also took inspiration from termites to build robots that could assemble pyramids, castles, and other structures out of foam blocks. In this case they borrowed a strategy termites use known as stigmergy, a method of indirectly communicating with each other to reach a common goal. When humans work on huge construction projects we need checklists, blueprints, and chains of command, and all of that requires communication. Termites instead build by paying attention to tiny clues their fellow termites leave in the environment. When they make mud balls, they add in pheromones that tell other termites where to build, which lets them coordinate their actions. The Harvard researchers used a similar idea to design robots that could place blocks based on what the structure looked like at that moment, so one robot could put its block somewhere that indicated where the robots behind it should put their own blocks down. Instead of blindly building according to a fixed program, they could adapt on the fly, even when the researchers tried to mess with them by moving blocks the robots had previously put down: each robot placed its block based on how the block placed before it was oriented. These robots are so far confined to the lab, but the idea is to eventually have them work for us and solve real-world problems. Robots might be able to build things in dangerous places like disaster areas, or even on Mars. There might be robot swarms all over the place one day.
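Here is a minimal sketch of that stigmergy idea in code: each robot reads only the cue left by the previously placed block and decides where to build from that, with no blueprint and no central coordinator. The specific rule used (finish the current row, then start the next one) and the wall dimensions are illustrative assumptions, not the actual termite-robot rule set.

```python
# Minimal stigmergy sketch: the structure itself is the only "message"
# passed between robots; each one extends whatever the last robot left behind.

structure = [(0, 0)]                       # the first block acts as the seed cue

def next_site(blocks):
    """Read the cue left by the previous robot: the last block placed
    tells the next robot where to build."""
    x, y = blocks[-1]
    return (x + 1, y) if x < 4 else (0, y + 1)   # fill a row, then start the next

for _ in range(14):                        # 14 more robots each add one block
    structure.append(next_site(structure))

print(structure)                           # a 5x3 wall built with no blueprint
```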

MICROBOTS

Scientists like Bradley Nelson have created a microbot that can help cure blindness. He is now developing microscopic robots to treat and even destroy deadly illnesses. Microrobots are already used to perform surgeries on the eye. The device is only a hundredth of an inch wide, small enough to fit into the needle of a syringe. The robot delivers an extremely small dose of medicine at the retina to treat a type of blindness caused by blood vessels in the back of the eye. The microbots are made from an alloy of samarium and cobalt that is highly sensitive to magnetic fields, which means the movement of the robot can be directed from outside the patient’s body without even touching it. By adjusting the strength of eight electromagnets, the surgeon can move the robot in any direction in three dimensions. A light probe and an LED let the surgeon see what they are doing while driving the robot. The magnets are arranged in a housing that surrounds the patient’s head while the doctor peers through a microscope to guide the drug-filled robot.

Another robot being developed by Nelson is one so small it swims in the blood. The problem is that if you get too small, the laws of physics no longer let you swim the way you do now. A swimmer the size of a bacterium could never get around using flippers or the breaststroke, because the smaller you get, the bigger the surrounding molecules are relative to you, until it becomes too hard to push them out of the way. If you made yourself that small and tried to swim, it would be like swimming in honey, because the friction from the water molecules becomes a major drag. For a long time, scientists couldn’t figure out how bacteria manage to swim, but eventually they discovered the secret. The tail, or flagellum, seems to move back and forth, but viewed from another angle it is clear it moves in a totally different way: it twists like a corkscrew. Brad and his team developed corkscrew-tailed robots that mimic bacteria like E. coli and Salmonella. The tail twists, propelling the robot forward without having to do the backstroke; instead of pushing itself along or relying on inertia, it is actually cutting through the fluid, pulling itself rather than pushing itself. The smallest microbot made today is only 30 micrometers long, that is 30 millionths of a meter, about a third the width of a human hair. They are virtually robotic bacteria that work for us now. Nelson has modified these microbots to overcome the physical obstacles of the microscopic realm.
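The "adjust eight electromagnets to move the robot in any direction" step can be pictured as a small control problem: if each coil's current contributes roughly linearly to the force on the magnetic microbot, the controller solves for the set of currents that best produces the desired push. The sketch below illustrates that idea with a made-up 3x8 actuation matrix; a real system would calibrate this matrix for its actual coil geometry, and this is only an illustration of the concept, not the clinical control software.

```python
# Hedged sketch: solve for eight coil currents that produce a desired force.
import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=(3, 8))           # placeholder: force per unit current, per coil

def coil_currents(desired_force):
    """Minimum-norm currents whose combined effect pushes the bot as requested."""
    currents, *_ = np.linalg.lstsq(A, np.asarray(desired_force, float), rcond=None)
    return currents

i = coil_currents([0.0, 0.0, 1.0])    # e.g. nudge the bot "up" toward the retina
print(np.round(i, 3), "-> achieved force:", np.round(A @ i, 3))
```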

BIOMIMETIC ROBOTS

Imagine tiny robots made of living heart cells swimming toward you, guided by laser light. Well, now that’s a thing. Scientists have created tiny swimming robots shaped like jellyfish and stingrays. They say they are using these mechanical copycats to study propulsion, test biological materials, and find new vehicle designs. But what about swimming armies of tiny robots? When mechanical objects like this mimic biological systems, it’s called biomimicry. Some scientists believe that because biomimicry builds on millions of years of evolution, it is a more effective and efficient approach to engineering. A team of scientists led by Kevin Parker at Harvard created a one-tenth-scale version of a ray fish: a tiny gold skeleton and a rubber body powered by 200,000 rat heart muscle cells. This biological hybrid machine can swim through an obstacle course in a salt solution with a little help from lasers. It’s so creepy. The reason they use heart cells is the way that they beat, contracting in response to stimuli. They started by placing the heart cells on top of the robot, but these weren’t ordinary heart cells: they were modified to contain a light-detecting protein usually found in our eyes, making them activate in response to light. This meant the researchers could signal the cells to contract simply by flashing a light at them. When activated, the cells contract downward, bending the skeleton, and when they release, the flexible wings spring back to neutral, just like a live ray or the beating of a heart. So now they can make the robots swim, but since the contraction of the cells happened on both sides of the mechanical ray at the same time, the robot was impossible to steer. So they further modified the cells to make the ones on the left and right wings responsive to different wavelengths of light: flashing both wavelengths at the same time made the ray swim forward, while flashing one or the other activated only one wing and turned the vehicle. After this, the team started building obstacle courses for the robot as if they were designing a biomimetic video game.

Bio-robot hybrids are relatively new, and others have mimicked aquatic animals like sea turtles, jellyfish, and now the stingray. Obviously this is really cool, but why would we want to build robots out of biological materials in the first place? A Stanford professor of engineering says it all comes down to energy: traditional robotic systems would run out of energy before they could take over the world, but biological systems are really efficient at using energy. If you want to make smaller, faster, more powerful robots, improving energy use is a good way to do that. Biomimetic jellyfish or stingrays don’t require batteries, gas, or solar panels. They use the same source of energy as the rest of the cells in our bodies: glucose, a form of sugar, which the scientists put in the water to feed the cells. At this point the robot only swims if it is in a glucose and salt solution at body temperature with particular wavelengths of light flashing at it, but it is still a step toward making microbots that sense, respond, and move on their own. In other words, intelligent robots made from living biological materials could be coming soon.
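The steering trick described above boils down to a tiny lookup: one wavelength per wing, flash both to go straight, flash only one to turn. The wavelengths and command names in this sketch are illustrative assumptions, not the values used in the Harvard ray.

```python
# Minimal sketch of two-wavelength steering for a light-guided biohybrid ray.

LEFT_WING_NM = 470     # hypothetical wavelength the left-wing cells respond to
RIGHT_WING_NM = 530    # hypothetical wavelength the right-wing cells respond to

def light_command(direction):
    """Return which wavelengths to flash for a desired manoeuvre."""
    if direction == "forward":
        return [LEFT_WING_NM, RIGHT_WING_NM]   # both wings contract together
    if direction == "left":
        return [RIGHT_WING_NM]                 # only the right wing beats, ray turns left
    if direction == "right":
        return [LEFT_WING_NM]                  # only the left wing beats, ray turns right
    return []                                  # no light, the ray glides

for move in ["forward", "forward", "left", "forward"]:
    print(move, "->", light_command(move))
```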

BUGBOTS

There’s a special branch of science that seeks to understand nature by imitating it. It’s called bionics: the science of designing mechanical systems based on living systems. We’ve gotten closer than ever to learning and duplicating some of nature’s hardest tricks: robots that can fly and robots that can feel. First of all, humans have been trying to build machines that use flapping wings to fly since around 400 BC. These are called ornithopters, and even Leonardo da Vinci tried and failed to imitate the design that works so well for birds, bats, and insects. Nature is simply better than we are at designing these things. Flapping wings are more efficient, more wind-tolerant, and more agile than fixed wings; they can react to unexpected obstacles and even stop mid-flight to hover in the air. But in the journal Science, graduate students at the Harvard School of Engineering and Applied Sciences reported in 2013 that they had succeeded in making a bionic fly: a bug-sized robot that can hover in place and perform controlled maneuvers much like a housefly. Described by the researchers as the first of its kind, the robot gives researchers a new way to study flight dynamics. It’s also a step toward imitating flight as nature does it. Imagine robotic swarms that can maneuver through obstacles and acclimate to changing weather conditions, or bird-like drones that can hover in place for hours to gather surveillance. The US Army already has an eye on this technology, which relies on piezoelectric materials: solids, usually crystals, that generate an electric charge in response to physical pressure. It’s how some lighters work: a tiny hammer hits a piezoelectric crystal and creates a current that ignites the gas. The robots contain similar crystals that work in reverse, translating minute changes in charge into minute changes in motion, allowing the robot to flap its wings and navigate.
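As a rough illustration of how a piezoelectric flapping drive is commanded, the sketch below turns a sinusoidal drive voltage into a wing stroke angle through a linearized bending gain. The frequency, voltage, and gain numbers are assumptions chosen for the example, not the real robot's parameters.

```python
# Toy sketch: a sinusoidal voltage on a piezo actuator produces flapping.
import math

FLAP_HZ = 120          # assumed wingbeat frequency
V_PEAK = 200.0         # assumed peak drive voltage for the piezo actuator
DEG_PER_VOLT = 0.3     # assumed (linearised) bending gain

def wing_angle(t):
    """Wing stroke angle (degrees) at time t for a sinusoidal drive voltage."""
    drive = V_PEAK * math.sin(2 * math.pi * FLAP_HZ * t)
    return DEG_PER_VOLT * drive

for k in range(5):                       # sample the start of one wingbeat
    t = k / (FLAP_HZ * 8)
    print(f"t={t*1000:5.2f} ms  angle={wing_angle(t):6.1f} deg")
```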

A hundred years ago we barely knew how to make an airplane fly, but insects have been doing it for hundreds of millions of years. Researchers figure we could probably learn a thing or two from all that experience: if nature has already figured out how to do something, why do the work all over again? There’s a whole field of research focusing on technology based on biology. It’s called biomimicry, and it’s particularly useful in microrobotics, because robots are just machines designed to accomplish a task, and often those tasks have already been mastered by a living thing, a human or an animal. So in one of the fastest-growing areas of robotics, engineers are actively studying nature to see how it could teach us to build better robots. Some robots are designed to fly like bugs, and there’s more to flying than just flapping wings. A group of Swiss researchers modelled the AirBurr robot after insects. Its designers were trying to solve one of the biggest problems robots face when exploring unknown territory: how do they get around without crashing into things? To avoid collisions, robots have to tell where they’re going and map the terrain as they go, but mapping systems tend to be complicated, fragile, and expensive, so if a robot does crash, it breaks, and that’s kind of a big financial deal. The AirBurr is housed inside a big flexible frame designed to bump into walls and survive. If it does fall out of the air, four legs extend to get it upright so it can start flying again. In this way, it doesn’t need complex systems to get around; it just bumps around but eventually makes its way. Eventually, robots like the AirBurr might help with search and rescue missions, flying through unknown, debris-filled places and bumping into things as they go. In the sci-fi show Black Mirror, the government solves the problem of colony collapse disorder and the declining bee population by creating robotic bee swarms that pollinate the flowers in their stead.
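The "just bump into things and keep going" strategy is easy to caricature in code: no map, no obstacle sensing, fly straight, and on any collision survive the bump and pick a new heading. The walled grid world and step counts below are toy assumptions, not a model of the actual AirBurr controller.

```python
# Toy sketch of contact-tolerant exploration: bump, re-orient, keep flying.
import random

random.seed(1)
WALLS = {(x, y) for x in range(10) for y in range(10)
         if x in (0, 9) or y in (0, 9)}          # a walled 10x10 room

pos, heading = (5, 5), (1, 0)
visited = {pos}

for _ in range(200):
    nxt = (pos[0] + heading[0], pos[1] + heading[1])
    if nxt in WALLS:                              # collision: survive the bump...
        heading = random.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])  # ...and re-orient
    else:
        pos = nxt                                 # otherwise keep flying straight
        visited.add(pos)

print(f"covered {len(visited)} of {8*8} open cells just by bumping around")
```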

MICROFISH (CLEANING BLOODSTREAM)

The “fantastic voyage,” where scientists shrink down, suit up in some type of submarine or robot, and enter a patient’s body, is a common trope in popular culture. In The Adventures of Jimmy Neutron, Archer, Dexter’s Lab, Futurama, Invader Zim, and (of course) The Magic School Bus, characters all shrink down to explore.

Now, Professors Shaochen Chen and Joseph Wang of the Nano Engineering Department at the University of California, San Diego, are using tiny 3D printed robots in a similar way. While they’re not shrinking down to pilot the technology, the scientists will still be using the robots to target microscopic ailments, much as Ms. Frizzle set out to do. Chen and Wang will just be doing it from the safety of the lab.


Micro-Fish

Thinner than the width of a human hair, these robots, called “micro-fish,” may one day be prescribed by your M.D. Why the name micro-fish? Well, they’re actually designed to look like fish.

The fish shape isn’t achieved using a typical 3D printer. You can’t head over to Home Depot, grab a MakerBot, and start printing these guys. The fish were printed using a high-resolution 3D printing technology called “microscale continuous optical printing,” or μCOP, which was actually developed by Chen.

The printer uses 2 million tiny mirrors that function and aim individually. As a beam of UV light is projected, the mirrors direct the light onto a photosensitive material, which solidifies wherever the UV light hits it.
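Conceptually, each of those mirrors acts like one pixel of a per-layer mask: "on" pixels receive UV light and cure a voxel of resin, "off" pixels leave it liquid. The sketch below builds one such binary mask for a crude fish outline; the shape, resolution, and layer logic are placeholders for illustration, not the actual micro-fish design or the μCOP control software.

```python
# Toy sketch of a single-layer mirror mask for projection micro-printing.

WIDTH, HEIGHT = 40, 16

def fish_mask(x, y):
    """True where UV should hit for this layer: an ellipse body plus a tail."""
    body = ((x - 15) / 14) ** 2 + ((y - 8) / 6) ** 2 <= 1.0
    tail = 29 <= x <= 36 and abs(y - 8) <= (x - 29)
    return body or tail

layer = [["#" if fish_mask(x, y) else "." for x in range(WIDTH)] for y in range(HEIGHT)]
for row in layer:
    print("".join(row))      # '#' = mirror on (voxel cured), '.' = mirror off
```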

The fish are essentially a “delivery vehicle” for nanoparticles that Chen and Wang attached to their tails. The platinum particles react with hydrogen peroxide, which propels the fish. The fish even have magnetic particles attached to their heads, which help them steer.

 

 

Mini-Robots With Guns

These aren’t the first mini-robots designed for use with magnets. Researchers from Boston Children’s Hospital and the University of Houston invented “millirobots,” small magnetic robots designed to swim through a patient’s bloodstream and spinal fluid. Once inside, they are programmed to assemble into an electromagnetic gun.

The whole time, the doctor mans the controls, guiding the bots to the fluid buildup or blocked passageway, and then uses them to inject drugs directly into the area. The scientists on the project are still working to scale the robots down appropriately.

Special Delivery

Speaking of scale, the next step for the fish was to figure out what they would try delivering first.

Chen and his team attached particles to the fish that react with toxins such as bee venom; when the two meet, the particles glow red. In the experiment, the fish were placed in a solution containing the toxin. The fish then “swam through the solution and nabbed the toxin along the way.” Throughout the experiment, the scientists monitored the intensity of the red glow.

The fish can remove substances, but they can also deliver them. Chen and his team hope to try using the fish for “targeted drug delivery or as sensors.” The fish seem to be doing exactly what the scientists intended: delivering “treatment to an otherwise difficult to access part of the body without causing ill effects.”

Our bodies won’t be full of fish any time soon. The technology is still in development, but it has been backed by grants from the National Science Foundation and the National Institutes of Health. That means this technology, which seems possible only in fiction, may soon be coming to a doctor’s office near you.

SRMRs

Let’s get even smaller. Big robots are great, but big robots made up of smaller robots are even better. Robots are taking over: gutter-cleaning robots, lawn-mower robots, and pretty soon robotic cars. What do these robots all have in common? They’re all really good at certain tasks, but if they try to do anything else they’re pretty lousy. These robots lack versatility; there’s no single robot that’s good at everything. But what if you could have a big robot that’s made of a bunch of tiny robots, and those tiny robots could change the shape of the larger robot to complete various tasks? This is a real thing, and it has a name: self-reconfigurable modular robots. Imagine the robot needs to screw in a lightbulb, so thousands of tiny robots join together to create legs, giving it a firm stance on the ground. More tiny robots form a long arm that reaches up to the shelf and picks up a fresh lightbulb; then they reconfigure again, giving the arm 360 degrees of rotation, and the robot screws in the lightbulb. Once the task is done, the robots sever their connections and the big robot disintegrates.

But for this to happen, a lot of different components need to come together. For example, communication: each module has to be able to communicate with all the other modules as well as with the overall system that’s issuing the commands. We’re talking layer upon layer of artificial intelligence, from the overall system down to the individual components. And this modular design means those modules have to join together somehow. The connections might be made by magnetic contacts, or even by plugs and sockets; whatever the method, it has to be both flexible and strong. There are already several examples in the real world. MIT students have built M-Blocks, cube robots that can join together to make more complex shapes. At Harvard University you’ve got the Kilobots (don’t worry, they’re friendly to humans), which display swarm behavior to complete tasks. In the future, these robots made up of smaller robots could revolutionize everything from manufacturing to our experience in the home.

But what if we got even smaller than that? Let’s talk smart dust: not the dust on your bookshelves, but machines developed at the microscale to accomplish a certain task, like sensing the environment. The Michigan Micro Mote, developed at the University of Michigan, aims to do just that: the team hopes to get a sensor down to one cubic millimeter in size. Maybe in the future we’ll have sensors and robots so small that we’re no longer talking about larger robots made of smaller robots; every single object we come into contact with is its own swarm. That means endlessly customizable environments, a concept known as utility fog. Maybe this future will never come around, maybe we’ll run into miniaturization issues, but even if I never get my own personal robo-couch, I know that these miniature machines are set to make a big impact.
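At its core, a self-reconfigurable modular robot is just a set of identical modules plus the list of connections between them, and reconfiguring means breaking and making connections until a new target topology is reached. The sketch below shows that idea with two made-up configurations; the module IDs, link representation, and configurations are illustrative assumptions, not how M-Blocks or Kilobots are actually programmed.

```python
# Minimal SRMR sketch: a robot is a set of module-to-module connections,
# and reconfiguration is the diff between two connection sets.

ARM    = {(0, 1), (1, 2), (2, 3), (3, 4)}    # five modules in a line: a long reach
WALKER = {(0, 1), (0, 2), (0, 3), (0, 4)}    # same modules re-joined as a hub plus legs

def reconfigure(current, target):
    """Yield the connect/disconnect steps that morph one configuration into another."""
    for link in sorted(current - target):
        yield ("disconnect", link)
    for link in sorted(target - current):
        yield ("connect", link)

for step in reconfigure(ARM, WALKER):
    print(step)
```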

INGESTIBLE MACHINES

We know that even common medications can have side effects. Exacerbating this is the fact that about half of people take their meds incorrectly, according to one survey by the WHO. Scientists are working on this problem in the form of nano-implants. One team, whose work was published by the American Chemical Society, built nanosheets that combine anti-inflammatory drugs with electrodes in a polymer film; they were then able to control the release of the drug with small electrical pulses. It’s still in the development stage, but they say this technique could be useful in treating diseases like epilepsy, where a medication already sitting in your body could be released right away as a seizure happens. This study is one of many projects in the works to create programmable pharmaceuticals and nanomedicine. For example, swallowable microchips have already been approved by the FDA. A company called Proteus Digital Health created microchip-embedded pills with tiny sensors that react to digestive juices. The pill relays a signal to a patch on your skin, which can then be relayed to your doctors, so if something goes wrong or you’re not taking your meds correctly, they can be alerted automatically.

Ingestible technology is exactly what it sounds like: tiny sensors embedded in pills, made of metals that are safe to ingest, like copper and magnesium. The coating dissolves in stomach acid, which activates the metal sensor and starts its tracking of your vitals, like temperature and heart rate. It sends that information outside your body, via an adhesive patch worn on your skin, straight to your smartphone using Bluetooth. Because it’s in your digestive tract, you pass it just like you would anything else. There are a few of these devices under development right now, but the closest to launching is a digital pill from Proteus Digital Health. Sanctioned by the FDA, the company’s ingestible sensor marks the time of ingestion, then monitors how many steps you take, your rest periods, and your heart rate, and sends all that health data to your smartphone. Another device, called Proteus Discover, takes it even further: these sensors are packed inside each pill of a prescription, logging the time you take each dose along with how it’s working inside your body. These devices can help manage medication intake and check for dangerous combinations, potentially preventing complications that stem from mixing certain drugs. These ingestible devices are actually in use today (2016). Both Proteus products focus on patient monitoring, with an emphasis on chronic patients, because some people don’t always tell doctors the truth about their habits, and those situations lead to more serious and more expensive illnesses. These medication-based problems don’t just affect the individual; they affect the whole country. The economic cost of medication-based problems, including costs to nursing homes, hospitals, and ambulance care, totals nearly 85 billion dollars annually.

But digital pills aren’t just about monitoring patients; some ingestible devices are also used for screening and preventive medicine. PillCam Colon is a miniaturized camera embedded in a disposable capsule, used to non-invasively check colon health. The camera bot is about the size of a vitamin pill: you swallow it, and as it passes through your digestive system, doctors can get a close look at the colon. They can check the images for polyps or other early signs of colorectal cancer without having to do an invasive exam involving sedation or radiation, and it’s an FDA-approved screening method for patients who can’t undergo a regular colonoscopy. Even though these microsensors pass harmlessly through your body, the technology does raise some interesting ethical questions; that comes with the territory when you have a sensor or a camera inside your body transmitting images and information. But those are all questions that will be worked out as the tech becomes more widespread.
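One simple thing such a system enables is an automatic adherence check: compare the dose times the sensor actually reported against the prescribed schedule and flag anything missed or badly mistimed. The sketch below shows the idea; the schedule, the detected events, and the one-hour tolerance are made-up examples, not Proteus's actual logic.

```python
# Hedged sketch of an adherence check from ingestion-sensor events.
from datetime import datetime, timedelta

scheduled = [datetime(2016, 5, 2, h, 0) for h in (8, 20)]   # 8 am and 8 pm doses
detected  = [datetime(2016, 5, 2, 8, 12)]                   # the patch only saw one pill

def adherence_report(scheduled, detected, tolerance=timedelta(hours=1)):
    report = []
    for due in scheduled:
        taken = any(abs(t - due) <= tolerance for t in detected)
        report.append((due.strftime("%H:%M"), "taken" if taken else "MISSED"))
    return report

for due, status in adherence_report(scheduled, detected):
    print(due, status)        # the missed 20:00 dose could trigger an alert to the doctor
```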

PILLCAM

In Fantastic Voyage, a sci-fi smash hit from 1966, scientists shrink a team of doctors and send them into a sick man’s body on a mission to cure him. Today, as our devices get smaller and smaller, we are getting closer to micromachines that can take pictures from inside the human body. The PillCam is a capsule containing a miniaturized camera; every time it “blinks” it takes a picture, at two frames per second. As the capsule travels through the GI tract just like a piece of food, it gives us a picture of what’s going on inside, taking over 55,000 pictures over the course of about eight hours, pictures that can provide a diagnosis that once would have required surgery. The PillCam is made of an inert plastic that doesn’t provoke a toxic response in the body. Inside the pill is a mini-catalogue of the electronics industry: a tiny video camera, a flash, a radio transmitter, a battery, and a computer chip to drive it all. Twenty-five years ago all those components would have taken up a cubic yard of space; today it all fits inside a one-inch capsule weighing only a fraction of an ounce. The next generations will not only take pictures but will be able to biopsy and sample the tissue in an area, or even deliver an entire treatment, like placing a clip on a bleeding site or delivering medications to specific areas.
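The picture count is easy to sanity-check: at two frames per second over roughly eight hours of transit, the capsule records on the order of the "over 55,000" images quoted (the exact number depends on the actual transit time).

```python
# Quick check of the PillCam frame count quoted above.
frames_per_second = 2
transit_hours = 8
total_frames = frames_per_second * transit_hours * 3600
print(total_frames)   # 57600, consistent with "over 55,000 pictures"
```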

BEE SWARMBOTS

Researchers in Japan have developed tiny insect-sized drones that can artificially pollinate plants, in a bid to take the pressure off declining bee populations.

The team from the National Institute of Advanced Industrial Science and Technology in Tokyo designed the “artificial pollinator” drones so that, rather than replacing bees, they might in the future aid in carrying the pollination burden that modern agriculture demands have put on bee populations.

However, they have yet to be tested outside the lab and are currently remote-controlled, not autonomous, so this is just the first step. Their study is published in the journal Chem.

The drones, measuring just 4 centimeters (1.6 inches) across, have a small patch of horse hair stuck to the underside of them to mimic the fuzzy body of a bee, which helps pick up pollen when they are visiting flowers for their nectar. The horse hair is coated in a type of ionic liquid sticky gel, accidentally discovered and then forgotten about for nearly a decade by the study’s lead author Eijiro Miyako.

[Image: OK, so it doesn’t actually look much like a bee. Credit: Dr Eijiro Miyako]

Inspired by concerns for declining bee populations and the rise in robotic insects, Miyako thought the gel could be applied to act in a similar way to bees, being sticky enough to collect pollen but not too sticky that it can’t be deposited elsewhere.

“This project is the result of serendipity,” Miyako said in a statement. “We were surprised that after eight years, the ionic gel didn’t degrade and was still so viscous. Conventional gels are mainly made of water and can’t be used for a long time, so we decided to use this material for research.”

Miyako’s team tested the remote-controlled drones in the lab on pink-leaved Japanese lilies. They successfully managed to pollinate them, causing them to begin the process of producing seeds.

The researchers think their development could help to counter the stress on declining bee populations. “We believe that robotic pollinators could be trained to learn pollination paths using global positioning systems and artificial intelligence,” Miyako said.

However, the drones are not quite ready for mass pollination – they’re still tricky to control and they’ve only been tested on one kind of flower. As we know, flowers come in a variety of shapes and sizes, often requiring bees to crawl inside them.

But if you’re worried about the scarily prophetic Black Mirror episode where robot pollinator bees are hacked and programmed to kill people, don’t worry. This has been addressed already by The Verge who, when they put it to Miyako, elicited this wonderful response: “Come on! All of the robots must be used for peace, right?”

Er, yes please!

[Video: Japanese scientists successfully pollinate flowers via a bee-inspired drone, from designboom on Vimeo]

CAN THEY COMPETE WITH THE REAL THING?

The latest service to be revolutionised by drones might not be package delivery or internet connections but the far more valuable service of pollination. Researchers in Japan have been exploring the potential of using miniature drones covered with sticky hairs to act like robotic bees to counter the decline of natural pollinators.

Writing in a paper in the journal Chem, the team demonstrated their drone on an open bamboo lily (Lilium japonicum) flower. With a bit of practice, the device could pick up 41% of the pollen available within three landings and successfully pollinated the flower in 53 out of 100 attempts. It used a patch of hairs augmented with a non-toxic ionic liquid gel that used static electricity and stickiness to be able to “lift and stick” the pollen. Although the drone was manually operated in this study, the team stated that by adding artificial intelligence and GPS, it could learn to forage for and pollinate plants on its own.

But it takes more than just sticky hairs to be a good pollinator. As someone who studies pollinating insects, I think these drones have a lot of catching up to do to match our existing pollinators, which include bees, butterflies and even some larger animals, in all their diversity. But it is always good to see science learning from nature and these studies also help us to appreciate the wonders of what nature has already provided.

Pollination is a complex task and should not be underrated. It involves finding flowers and deciding if they are suitable and haven’t already been visited. The pollinator then needs to successfully handle the flower, picking pollen up and putting it down in another plant, while co-ordinating with its team and optimising its route between flowers. In all of these tasks, our existing pollinators excel, their skills honed through millions of years of evolution. In some cases, our technology can match them and in others it has some way to go.

The three major factors that make insect pollinators such as bees so good at what they do are their independent decision making, learning and teamwork. Each bee can decide what flowers are suitable, manage their energy usage and keep themselves clean of stale pollen.

[Image: Sticky hairs. Credit: Dr Eijiro Miyako]

Modern drones can already achieve this level of individual management. As they have the technology to track faces, they could track flowers as well. They could also plot routes via GPS and return to base for recharging on sensing a low battery. In the long run, they may even have a potential advantage over natural pollinators as pollination would be their sole function. Bees, on the other hand, are looking to feed themselves and their brood, and pollination happens as a by-product.
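As a small illustration of the "plot routes via GPS" point, a drone could at least plan a greedy nearest-neighbour tour between known flower coordinates, as in the sketch below. The coordinates are made-up examples, and real foraging routes, including the bees' own, are far more sophisticated than this.

```python
# Toy sketch: greedy nearest-neighbour route between flower coordinates.
import math

flowers = [(0.0, 0.0), (3.0, 4.0), (6.0, 1.0), (2.0, 7.0), (8.0, 5.0)]

def greedy_route(start, sites):
    route, here, todo = [start], start, list(sites)
    while todo:
        here = min(todo, key=lambda p: math.dist(here, p))   # hop to the closest unvisited flower
        todo.remove(here)
        route.append(here)
    return route

print(greedy_route(flowers[0], flowers[1:]))   # visit order for the drone
```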

The areas where drones need development, however, are learning and teamwork. Flowers are also not always as open and simple as those of the bamboo lily and quite a few of our commercially pollinated food resources have much trickier flowers (such as beans) or need repeated visits (such as strawberry flowers) to produce good fruit.

To solve this, bees learn and specialise on a specific flower so they can handle them quickly and efficiently. They also learn the position of rewards to learn the best routes. With all individuals in the team doing this, they divide their labour and get a lot more done. To replicate this in drones would involve some serious programming and the ability of the drone to change its behaviour or shape to adjust to flowers, or having different drones for different jobs as we have different species of pollinator.

Having more than one drone requires co-ordination and preferably non-centralised control, whereby individual drones can make their own decisions based on information from their colleagues and a set of simple rules. Honeybees have the ability to recruit others to rich floral rewards using movements known as the waggle dance. Bumblebees can tell if a flower has already been visited by the smell of the footprints left by previous visitors. All these adaptations make our pollinators very efficient at what they do. Similar skills would have to be developed into a team of pollinating drones in order for them to work as efficient pollinators.

Although I feel that these robots are a long way away from becoming the optimal pollinators, they may well have a place in our future. I could see these drones being used in the environments that are unsuitable for natural pollinators, such as a research lab where precision is needed in the crossing of plant breeds. Or even in a biodome on Mars where a swarm of honeybees may not be the safest solution. It will be interesting to see what else robotics can learn from our insect pollinators and what they can improve upon.

Elizabeth Franklin, Demonstrator (Biosciences), Bournemouth University

This article was originally published on The Conversation. Read the original article.

 

