How do robots see?

Ask students to each draw a map of the classroom or room they are in. Give them a few minutes to make their maps. After students have finished making their maps, ask: What senses did you use to make your maps?

Most students will have relied heavily on their sense of sight. Ask: If you could not see, what other senses could you use? Explain that some robots make maps of their environment using their various senses. Explain that unlike humans or animals, robots do not have naturally occurring senses.

Engineers must build these senses for robots in the form of sensors. Robots use sensors to build a picture of whatever environment they are in. LIDAR (Light Detection and Ranging) is one such technology: it uses a laser to measure distance.

The laser illuminates objects in the environment, and those objects reflect light back to the sensor. The robot analyzes these reflections to create a map of its environment. LIDAR tells robots what is around them and where those things are located. Before playing the video clip, ask students to keep the following questions in mind: Why does Chimp need to use sensors? How does Chimp sense its environment?
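
To make the measurement concrete, here is a minimal sketch in Python of the arithmetic a LIDAR unit performs (the pulse timing below is invented for illustration): the laser pulse travels out and back, so the distance is half the round-trip time multiplied by the speed of light.

```python
# Minimal sketch: converting a laser pulse's round-trip time into a distance.
# The timing value below is invented for illustration.

SPEED_OF_LIGHT = 299_792_458  # metres per second

def lidar_distance(round_trip_seconds: float) -> float:
    """Distance to the object: the pulse travels out and back, so halve the trip."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2

# A pulse that returns after about 66.7 nanoseconds came from roughly 10 m away.
print(f"{lidar_distance(66.7e-9):.2f} m")
```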

Play the clip, and then repeat the questions, eliciting student responses. Explain that sometimes scientists purposefully use nature as inspiration for creating technology. This is called biomimicry. Remind students of the senses they used to create a map of what was around them in the first step. Explain that some animals use sound waves to determine the location of objects. This is called echolocation. Show the provided diagram of echolocation. Explain that when animals use echolocation, they make sounds that reflect off surfaces and return to their ears.

This information is then interpreted by the animal so it can understand its environment and decide what action to take next. Compare echolocation to hearing an echo, an experience many students will be familiar with. When a loud sound travels a large distance and hits a reflective surface, some of the sound waves may bounce back in a way that a person can hear. Ask: What are some animals that use echolocation? Answers may include bats and toothed whales, like dolphins.
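
The same round-trip arithmetic applies to echolocation, just with the much slower speed of sound. A rough sketch, assuming sound travels at about 343 m/s in air:

```python
# Rough sketch: how long an echo takes to return from a surface.
# Assumes sound travels at roughly 343 m/s in dry air at 20 °C.

SPEED_OF_SOUND = 343.0  # metres per second

def echo_delay(distance_m: float) -> float:
    """Round-trip time for a sound to reach a surface and bounce back."""
    return 2 * distance_m / SPEED_OF_SOUND

# A wall 50 m away returns an echo after about 0.29 seconds --
# long enough for a person to hear the reflection as a separate sound.
print(f"{echo_delay(50):.2f} s")
```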

Tell students they are going to participate in a game to experience what it is like to use sound rather than sight to locate objects. One student will play the role of a bat and another will play the role of an insect. In a space with plenty of room, mark off a 6 x 6 m (19 x 19 ft) square using masking tape. This is the field of play.

The insect may not leave the square during play. Once the insect takes a position inside the square, the bat is blindfolded and led to the square. Each time the bat calls out, the insect must answer from wherever it is standing, like an echo bouncing back. This call-and-response continues until the bat locates the insect and places one hand gently on the insect, indicating that dinner is served.

Remind students there is no running. Afterward, ask: Are there differences in the way bats and robots use senses to gain a better understanding of their environments?

[Image: the checker shadow illusion. The squares marked A and B are the same shade of grey. Source: Wikipedia.]

Although it might seem like the machine wins this round, robots have problems recognising shadows and accounting for the way they change the appearance of the landscape.

This is why autonomous vehicles need more than a pair of suitably advanced cameras. Radar and laser scanners are necessary because machine intelligences need much more information to recognise an object than we do.

To be faithful assistants and useful workers, robots need to recognise people and our intentions. Recognising objects, people, and intentions are all pattern-recognition problems. The contextual awareness needed to safely navigate the world is not to be taken lightly.

Beiker gives the example of a plastic ball rolling into the road. Most human drivers would expect that a child might follow it, and would slow down accordingly. A robot's sensors must supply the raw data behind that kind of judgement. Multi-beam sensors simultaneously produce multiple detection beams and are well suited to object detection and collision avoidance.

Rotational sensors, by contrast, produce a single beam that sweeps the scene as the device rotates, and are often used for object detection and avoidance.
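
As an illustration of how a rotational scan might feed collision avoidance, here is a hedged Python sketch: the (angle, distance) readings are invented, and any return closer than a chosen safety radius is flagged as an obstacle.

```python
import math

# Illustrative sketch: processing one sweep from a rotational sensor.
# Each reading is (angle in degrees, measured distance in metres);
# the values here are invented for the example.
sweep = [(0, 4.2), (45, 1.1), (90, 3.8), (135, 0.6), (180, 5.0)]

SAFETY_RADIUS = 1.5  # metres; anything closer is a collision risk

for angle_deg, distance in sweep:
    # Convert the polar reading into x/y coordinates around the robot.
    theta = math.radians(angle_deg)
    x, y = distance * math.cos(theta), distance * math.sin(theta)
    status = "OBSTACLE" if distance < SAFETY_RADIUS else "clear"
    print(f"{angle_deg:>3} deg: ({x:5.2f}, {y:5.2f}) m -> {status}")
```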

An important task often assigned to robots, especially within the manufacturing industry, is to pick up objects. Part detection sensors are commonly used in industrial robots, and can detect whether or not a part has arrived at a particular location. There are a number of different types of these sensors, each with unique capabilities, including detecting the presence, shape, distance, color and orientation of an object.
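
In software, part detection often reduces to a simple presence check in the control loop. The sketch below is illustrative only: read_part_sensor() is a hypothetical stand-in for whatever input a real sensor provides.

```python
import time

def read_part_sensor() -> bool:
    """Hypothetical stand-in for a real part-detection input.

    On an actual industrial robot this would query a photoelectric or
    proximity sensor; here it always reports that a part is present.
    """
    return True

def wait_for_part(poll_interval: float = 0.1, timeout: float = 5.0) -> bool:
    """Poll the sensor until a part arrives or the timeout expires."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if read_part_sensor():
            return True  # part detected -- safe to start the pick
        time.sleep(poll_interval)
    return False  # no part arrived in time

if wait_for_part():
    print("Part in position: begin pick-up.")
else:
    print("No part detected: hold and raise an alert.")
```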

Robot vision sensors offer several high-tech benefits to collaborative robots across industries. Both 2D and 3D vision allow robots to manipulate different parts without reprogramming, pick up objects whose position and orientation are not known in advance, and correct for inaccuracies.
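
One common pattern behind this capability is pose correction: the vision system reports how far the part has shifted and rotated from its expected position, and the robot applies that rigid transform to its taught pick point. A minimal 2D sketch with made-up numbers:

```python
import math

# Minimal 2D sketch of vision-based pose correction. The nominal pick
# point is where the program expects the part; the vision system reports
# the part's actual shift and rotation. All numbers are invented.

nominal_pick = (250.0, 100.0)  # taught pick point, millimetres
offset = (12.5, -4.0)          # measured translation, millimetres
rotation_deg = 15.0            # measured rotation

# Apply the reported rigid transform (rotation, then translation) so the
# gripper lands on the part wherever it actually lies.
theta = math.radians(rotation_deg)
x, y = nominal_pick
corrected = (
    x * math.cos(theta) - y * math.sin(theta) + offset[0],
    x * math.sin(theta) + y * math.cos(theta) + offset[1],
)
print(f"corrected pick point: ({corrected[0]:.1f}, {corrected[1]:.1f}) mm")
```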

The introduction of robots into more intimate aspects of our lives such as in our homes requires a deeper and more nuanced understanding of three-dimensional objects.

Will robots take away human jobs? That is a question we get asked a lot. In fact, they may lead to more human jobs. According to a report from the World Economic Forum, robots may displace 75 million jobs globally by 2022 but create 133 million new ones, a net gain of 58 million!

Take the example of the spreadsheet. Prior to its existence, accountants spent a lot of time adding numbers manually. The advent of the spreadsheet made work so much easier for accountants, who were now able to spend their time making informed business decisions, improving communications, increasing reporting, and streamlining efficiencies.

These advancements increased profitability, which actually created far more jobs, even though the role of the person who added up the numbers by hand no longer existed.

In fact, the future of robots and humans very much goes hand in hand. A funny thing happens when you take motors and put them in a configuration that looks like an arm.

Yes, in the future robots will be functionally capable of taking over most, if not all, automation tasks.


