This application claims benefit of provisional application No. 60/166,237 filed Nov. 18, 1999.
The present invention is directed to autonomous, microprocessor controlled home cleaning robots having useful functions. More specifically, the present invention relates to autonomous, mobile home cleaning robots having a low energy cleaning apparatus. Even more specifically, the present invention relates to autonomous, mobile home cleaning robots having a low energy cleaning apparatus and a capability of adaptively performing and being trained to perform useful chores.
Toys have provided play value and entertainment to children when the child imagines the toys are capable of independent behavior. Microprocessor-controlled toys have recently offered limited simulations of living behavior for the non-productive enjoyment of children, including violence-oriented video games. Microprocessor-based toys, until now, do not educate by engaging in useful task-oriented behaviors with the child. Ideally, a toy should benefit the child by not only providing play value, but also transparently encouraging creative, task-oriented behavior which benefits the child and reduces the workload of working families. This invention is directed toward that end.
Principles of toys can be adapted to useful home cleaning robots. A toy that serves that purpose would be capable of performing useful tasks, capable of easily being trained by the child to perform tasks, and would be adaptive in operation to account for less than ideal training. Further, the toy should have the appearance of some real or imaginary thing consistent with the useful behavior the child and toy would be engaged in, so that the child's interaction is with an emotionally engaging plaything. Once learned, the task-oriented behavior should be storable, transferable, and recallable.
Non-functional toys intended to encourage task-oriented behavior in children have traditionally approximated tools and appliances used to perform tasks. For example, U.S. Pat. No. 5,919,078 (Cassidy, issued Jul. 6, 1999) discloses a toy which has the appearance of a cyclone-type vacuum cleaner. However, it does not vacuum, learn, or adapt.
Toys are also known to the art which, while they do not perform useful functions, do have some level of behavioral response to their environment. Recent examples of such toys are “Electronic Furby” available from Tiger Electronics, Vernon Hills, Ill., and various “Actimates” interactive dolls from Microsoft Corp., Redmond, Wash. These toys are not suitable for teaching children to perform useful tasks, although some of the better toys may build intellectual skills in reading, writing, or math. They do not learn tasks, nor are they substantially adaptive to their environment.
Toys are also known to the art which are programmable by some means but which do not respond to environmental changes. For example U.S. Pat. No. 4,702,718 (Yanase, issued Oct. 27, 1987) discloses a mobile toy wherein the toy responds optically to prerecorded, rotating disks.
Toys are known which are mobile and to a limited degree have some means to perform a useful function but which are not trainable or adaptive. An example is a Dustbot toy previously sold by Radio Shack/Tandy Corporation, Fort Worth, Tex., catalog number 60-2556 which was a motorized, mobile toy capable of lightly vacuuming crumbs from a table-top. The toy was not trainable or adaptive.
Expensive consumer robots primarily intended for entertainment are known. A recent example is a robotic entertainment dog called “Aibo” available briefly from the Sony Corporation at a cost two orders of magnitude beyond most toys. Various devices of this type including commercially available research robots have been promoted as home robots for many years without widespread commercial success. Typically they require complex user interactions including programming, are not designed to perform useful tasks and are too costly to serve as children's toys as opposed to prestigious adult entertainment devices.
Many industrial and military “robots” exist which are trainable or adaptively interact with their environment or both. This robotic art is not directed at toys or the home. It focuses exclusively on utility without regard to play value. U.S. Pat. No. 3,952,361 (Wilkins, issued Apr. 27, 1976) discloses the general principle of task training in a self-guided floor cleaner which is manually operated through a floor-cleaning task. The device is trained by recording pulse-driven wheel motor signals during the manual operation onto a tape recorder. The tape subsequently is played to generate motor-driving pulses for automated operation.
Other “training” means used in mobile commercial robots include making a digital image map of the ceiling during manual operation from an upward-focused, robot-mounted video camera as in U.S. Pat. No. 5,155,684 (Burke et al., issued Oct. 13, 1992) which is hereby incorporated by reference; setting up external beacons for triangulation as in U.S. Pat. No. 5,974,347 (Nelson, issued Oct. 26, 1999) which is hereby incorporated by reference; or using combinations of directional cues present in the operating environment such as gravity, the earth's magnetic field (multi-axis magnetometers), inertial guidance systems, global positioning via satellite (GPS), and radar imaging as in the case of guided missiles. Examples of such missile guidance technologies include U.S. Pat. No. 5,451,014 (Dare et al., issued Sep. 19, 1995) disclosing an inertial guidance system not requiring initialization; U.S. Pat. No. 5,943,009 (Abbot, issued Aug. 24, 1999) disclosing a simple GPS guidance system; and U.S. Pat. No. 5,917,442 (Manoongian et al., issued Jun. 29, 1999) disclosing guidance means where the target is illuminated (by radar). Related in technology, but not purpose, is U.S. Pat. No. 5,883,861 (Moser et al., issued May 12, 1998) disclosing an electronic compass in a wristwatch. Although many of these guidance technologies have been reduced to compact solid-state devices, they have not, sans warheads, heretofore been adapted for use in educational toys.
There is an unfilled need for home cleaning robots that use low-energy cleaning techniques and thus make chores easier for the user.
The present invention relates to autonomous, mobile, microprocessor-controlled home cleaning robots provided with the means to perform useful functions and capable of learning and adaptively performing useful functions.
In one embodiment, the present invention is a mobile, microprocessor-controlled home cleaning robot. The robot comprises a platform and a motive force attached to the platform. This motive force moves the platform on a substantially horizontal surface. The robot also includes a computer processing unit, attached to said platform, capable of storing, receiving and transmitting data. The robot also includes at least one sensor attached to the platform, which is capable of detecting a change on the horizontal surface. The sensor provides input to the computer processing unit. The robot further includes a cleaning implement operatively associated with the platform and a power source connected to the motive force and the computer processing unit, whereby the computer processing unit directs horizontal movement of the platform based upon input data received from the at least one sensor.
In one embodiment the present invention is comprised of an autonomous, adaptive mobile home cleaning robot provided with a detachable or dischargeable electrostatic cleaning cloth.
In one embodiment the present invention is comprised of an autonomous, trainable, adaptive mobile home cleaning robot provided with a detachable or dischargeable electrostatic cleaning cloth.
FIG. 1 is a perspective view of one embodiment of the platform of the robot of the present invention;
FIG. 2 is a side elevational view of the platform shown in FIG. 1;
FIG. 3 is a side elevational view of one embodiment of a cover for the platform, wherein the cover is designed to look like a turtle;
FIG. 4 is a top planar view of a further embodiment of a cover for the platform, wherein the cover is designed to look like a mouse;
FIG. 5 is a block diagram of one embodiment of a robot control system of the present invention;
FIG. 6 is a schematic plan view of an alternative robot platform and control system in accordance with the present invention;
FIG. 7 is a diagram explanatory of a deviation of the robot from a predetermined straight path in accordance with the control system of FIG. 6.
FIG. 8a is an illustrative block diagram showing a mobile robot, constructed and operated in accordance with one embodiment of the invention, which includes a camera having an upwardly pointing field of view for viewing a ceiling above the robot, the ceiling having a plurality of ceiling fixture light sources;
FIG. 8b is a block diagram of the image processor 118 of FIG. 8a;
FIG. 8c is a block diagram which illustrates a feedback control system wherein ceiling-related position measurements function as an error signal;
FIGS. 9a and 9b illustrate an image plane of the ceiling vision system of FIG. 8a;
FIGS. 10a, 10b and 10c are illustrative views of the control system in FIG. 8a within an environment having a plurality of ceiling fixtures;
FIGS. 11a, 11b, 11c, 11d, 11e and 11f are graphical representations of the mathematical derivation of robot position relative to ceiling light fixtures;
FIG. 12 is a perspective view of a robot having a triangulation control system;
FIG. 13 shows a perspective view of the rotating directional loop antenna;
FIG. 14A shows a diagram of two circle equations together showing the intersection which provides the x-y coordinates defining the location of the robot using the triangulation control system in FIG. 12;
FIG. 14B shows a diagram of one circle defined by the angle A and the chord between transmitters T1 and T2, with the offset a and radius r1;
FIG. 14C shows a diagram of another circle defined by the angle B and the chord between transmitters T2 and T3, with the offsets b, c, and radius r2;
FIG. 15 shows a functional block diagram of that part of the control system of FIG. 12 located on the robot along with three continuous wave transmitters;
FIG. 16 shows the functional blocks associated with signal detection and pulse generation of the system in FIG. 12; and
FIG. 17 is a schematic diagram of the sequencer of the control system in FIG. 12.
As used herein, the word “autonomous” is meant to describe the characteristic of independent mobility as opposed to active guidance by a human. For example a radio-controlled home cleaning robot relying on human operation of the remote control would not be autonomous. A similar home cleaning robot being instantly navigated by an onboard or off-board microprocessor and sensors without immediate human guidance would be autonomous.
As used herein the word “learning” is meant to describe mapping by being guided through a desired path or task manually and electronically recording the motions made to follow the path or perform the task. This may also be referred to as “training” the home cleaning robot. The recording can be of encoders on motors or wheels, recording an environment map of images or sonar responses, images, or various forms of beacons such as radio frequency sources, or passive RF beacons, or reflective or active optical beacons. Other mapping means can be used such as off board imaging or sensing of the mobile home cleaning robot in its environment while being guided. Learning in this sense can be accomplished by physically manipulating the home cleaning robot or by remotely controlling the home cleaning robot through a desired task, or by reinforcing desired behaviors as they occur by any communicative means. Programming such as the writing of non-variant software is not “learning” in the instant sense.
As used herein the word “adaptive” refers to storage of prior actions with respect to a desired goal or endpoint and changing the map of desired motor actions to optimize various behavior goals. For example, if a goal is to avoid light, and traveling along a first path does not reduce the level of incident light, that action would not be repeated but others would be tried successively until a direction or motion was found that resulted in reduced levels of light. In other words the behavior to a stimuli is not fixed, but varied until the desired goal is substantially achieved. Similar adaptive behaviors include, but are not limited to, tactile or sonar detection of obstacles that are discovered after programming and selecting actions which result in planning a path around the obstacle. It is to be understood that adaptive behavior is not limited to path selection but may also be applied to other output parameters such as light projection, audio output, speech patterns, and so on—dynamic selection of a behavior in accordance with the environment as found.
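By way of a non-limiting illustration, the light-avoidance example above might be sketched in Python as follows; the sensor and actuator interfaces, and all names, are hypothetical stand-ins for the robot's actual hardware and form no part of the disclosure:

```python
import random

def adapt_to_reduce_light(read_light, try_heading, headings):
    """Try candidate motions in succession, keeping the first heading
    that reduces the incident light level below the starting level.
    All interfaces here are illustrative stand-ins for real hardware."""
    baseline = read_light()
    for heading in random.sample(headings, len(headings)):
        try_heading(heading)          # attempt one candidate motion
        if read_light() < baseline:   # goal substantially achieved?
            return heading            # remember the successful behavior
    return None                       # no tried action reduced the light
```

The behavior is not fixed: actions are varied until the goal is substantially achieved, and the successful action is retained for reuse.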
The primary emphasis of the instant invention is to provide an automated home cleaning robot having a low-energy cleaning device, which will free the user from such tasks. The present invention may optionally have play value, which can be achieved through the inclusion of a personality by animalistic appearance, actions, sound, and the like, and which distinguishes the instant invention from non-toys.
As used herein the phrase “play value” refers to the quality of home cleaning robots that provides pleasure, recreation, and training for the user. One optional aspect of the instant invention is that it could provide play value to children (of all ages) while they learn to perform useful tasks and teach and watch their toys perform such tasks.
As used herein the word “platform” refers to an electromechanical device under microprocessor or computer control capable of some physical action in response to sensor inputs such as sensed light, odor, contact, sound, radar, magnetic fields, electromagnetic fields, moisture, and the like. Such actions include, but are not limited to, motion (including movement across a surface such as a horizontal surface), heating, spraying, and moving air. Typically a platform will be comprised of a microprocessor, a locomotion means, sensors, and a power source. A platform may be embodied in a single physical device or be distributed. For example a mobile platform may be guided by a remote computer or by wireless Internet access means to a remote computer. A data storage means may be on-board the mobile home cleaning robot or at a remote site.
The general design principles of robot platforms are well known and described in the prior art. For applications which require movement on a relatively flat, horizontal surface, the most suitable platform for the present invention is a wheeled or tracked locomotion form in which the wheels may be selectively driven. The wheel or track alignment is substantially parallel. In two-wheeled, as opposed to tracked, platforms, one or more additional castered wheels or sphere-in-sockets may be used to support the body in addition to the independent drive wheels. A track-driven platform may be entirely supported by the tracks, as in the case of a bulldozer. Wheeled robotic platforms are available from Cybermotion, Salem, Va.; IS Robotics, Somerville, Mass.; Poulan, Robotic Solar Mower Dept., Shreveport, La.; and Nomadic Technologies, Mountain View, Calif.
The robot of the present invention is “autonomously movable”. “Autonomously movable”, as used herein, illustratively means that the robot can move or translate within, preferably throughout, the boundaries of a substantially horizontal surface that is desired to be cleaned, without input from the user. “Movable”, as used herein, means the movement or translation of the entire robot body; in other words, the robot does not have a fixed base. The robot body can translate and optionally can rotate. In contrast, a robot that has a fixed base that rotates to accomplish tasks, such as sweeping an arm of the robot, is not included within the meaning of the present invention.
The home cleaning robot of the present invention typically weighs less than 10 kilograms, preferably less than 8 kilograms.
FIG. 1 illustrates one embodiment of the platform of the present invention provided with motor-driven wheels. The drive wheels 2 are separately and independently driven by encoder-equipped motors 1 mounted on a common printed circuit board on the platform 10. The platform is provided with fastening points 3 for attachment of the cover by a fastening means not illustrated. Sensors 4 and 6, the power cell 5, and microprocessor control unit 9 are likewise mounted on the platform printed circuit board. In an alternative embodiment, a sound producing means 7 and an infrared port 8, for downloading or uploading instructions and remote operation of the platform, are provided. It should be noted that tracks rather than wheels could be used when the application involves locomotion on other than a relatively smooth surface.
FIG. 2 is a side view of the platform showing a front-mounted contact sensor 4, the printed circuit board 11 mounted on the platform structure, and a ball support means 12.
FIGS. 3 and 4 illustrate typical covers that might be applied to the platform to provide an animalistic appearance. FIG. 3 illustrates a turtle shell cover, and FIG. 4 illustrates an animalistic cover, which may be fabricated from an electrostatic dusting material. The covers typically will extend beyond the wheels unless otherwise noted so that the wheels cannot be caught on vertical obstacles.
Other means of locomotion may be used without changing the scope of this invention. It is to be understood that a wheeled or tracked platform is to be applied to tasks that are to be performed on substantially level, horizontal surfaces such as floors, counter tops, lawns, gardens, roofs with low angles of inclination, and the like. The wheeled or tracked platform provides a motive force to move the platform on a substantially horizontal surface.
Generally, the robot is placed onto a substantially horizontal surface that is desired to be cleaned and then is powered on. Next, the robot moves randomly about the substantially horizontal surface performing a useful chore, such as cleaning with a nonwoven electrostatic cloth. Upon coming in contact with either a horizontal or vertical obstacle, the at least one sensor will trigger the platform to stop motion and then reorient itself and proceed with its task. This random motion robot does not include or require a navigation system.
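The bump-and-reorient behavior just described might be sketched, purely by way of illustration, with a toy simulation in Python; the class and method names are hypothetical and stand in for the platform's actual sensors and motors:

```python
import math
import random

class SimRobot:
    """Toy simulation of the platform on a square, bounded floor;
    the contact sensor fires at the floor boundary (a vertical
    obstacle).  Names are illustrative only."""
    def __init__(self, size=100.0):
        self.x = self.y = size / 2
        self.heading = 0.0          # degrees
        self.size = size

    def bumped(self):
        m = 1.0                     # sensor trip margin at the walls
        return not (m <= self.x <= self.size - m and
                    m <= self.y <= self.size - m)

    def forward(self, step=1.0):
        self.x += step * math.cos(math.radians(self.heading))
        self.y += step * math.sin(math.radians(self.heading))

def random_clean(robot, steps):
    """Bump-and-reorient random walk: drive forward until an obstacle
    is contacted, then stop, back off, turn randomly, and resume."""
    for _ in range(steps):
        if robot.bumped():
            robot.forward(-1.0)                      # undo last advance
            robot.heading = (robot.heading +
                             random.uniform(90, 270)) % 360
        else:
            robot.forward()
    return robot
```

As the specification notes, no navigation system is needed for this mode: over many steps the random walk passes the cleaning cloth over the whole bounded surface.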
As used herein the word “map” or “mapping” refers to a data structure stored in a computer memory means such as read and write memory, magnetic media, optical media, or the like which represents a task environment. This data may include but is not limited to a stored schedule of actions such as the number of encoder pulses per unit time from each of the locomotion motors, the compass direction per unit time, or relative position coordinates (e.g. triangulated position from sonar, light, or other beacon means), and other stored or calculated data against which real-time sensor inputs can be compared to guide a mobile, computer-operated platform or task-performing components thereof such as manipulators, projectors, dispensing means, spray pumps, and so on. The map typically is initially built by a user manually leading the home cleaning robot through a set of desired actions or motions, or by the user doing so by remote direction. More data may be added adaptively during operation, such as when obstacles are encountered. In a simple example a platform with two drive wheels may be manually pushed along a desired path. The output of optical, magnetic, or mechanical encoders on each drive wheel, a series of pulses, is recorded as a count per unit time for each encoder and stored in a memory means by the microprocessor under program control. The data storage means may be onboard the mobile home cleaning robot or located remotely via a wireless communications link or the Internet or some combination thereof.
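The simple two-wheel example above, recording a per-interval pulse count from each drive-wheel encoder and later replaying it, might be sketched as follows; the callables and names are hypothetical (a real implementation would sample the encoders on a timer interrupt):

```python
def record_training(left_encoder, right_encoder, intervals):
    """While the robot is manually led through a task, store the pulse
    count per unit-time interval from each drive-wheel encoder.  The
    encoder callables are assumed to return cumulative pulse counts."""
    path_map = []
    prev_l, prev_r = left_encoder(), right_encoder()
    for _ in range(intervals):
        l, r = left_encoder(), right_encoder()
        path_map.append((l - prev_l, r - prev_r))   # pulses this interval
        prev_l, prev_r = l, r
    return path_map

def replay(path_map, drive):
    """Drive each wheel to reproduce the recorded pulse counts,
    interval by interval, re-running the trained motion."""
    for dl, dr in path_map:
        drive(dl, dr)
```

The returned list of (left, right) pulse counts is the "map" in this simple sense: a stored schedule of actions against which the motors are later driven under program control.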
One example of the microprocessor-based control and mapping system suitable for the guidance system of the present invention is shown and described in expired U.S. Pat. No. 4,674,048 (Okumura, issued Jun. 16, 1987), which is herein incorporated by reference. The guidance system comprises position identification means for sensing a distance traveled by the robot and a change in a direction of travel of the robot, calculating a position of the robot in two-dimensional coordinates in response to the sensed distance and the sensed change in direction, and generating a position signal representative of the robot position. Such a guidance system is known in the art. Obstruction sensor means senses an obstruction to generate an obstruction signal. The obstruction sensor means are mounted on a front end and both sides of the robot with respect to an intended direction of travel of the robot. Storage means stores a map consisting of a number of unit blocks, which are defined by parallel columns and parallel rows in the two-dimensional coordinates. Teaching means causes the robot to make a round along a boundary of a range to be traveled by the robot, so that the range is stored in the map of the storage means in response to the position signal output from the position identification means.
Referring to FIG. 5 of the drawing, a distance sensor 20 produces a pulse signal which is proportional to a distance traveled by the mobile robot, e.g. the number of rotations of the drive wheels. A direction sensor 22, such as a gas rate gyro, is sensitive to a change in the traveling direction of the robot. The pulse signal output from the distance sensor 20 and the output of the direction sensor 22 are supplied to position identification means 24. The position identification means 24 is constructed to measure a distance traveled by the robot by counting incoming pulses from the distance sensor 20 and to identify a moving direction of the robot from the output of the direction sensor 22, thereby identifying by operation instantaneous positions of the robot in two-dimensional coordinates for each unit travel distance. Obstruction sensors 26 are mounted on the front, opposite sides and back of the robot with respect to a direction of movement of the robot. Each of the obstruction sensors 26 is adapted to sense a wall, column or like obstruction and a distance to the obstruction by emitting a supersonic wave and receiving the reflection. Also mounted on the robot are touch sensors 4 which locate obstructions by mechanical contact therewith, independently of the obstruction sensors 26. The outputs of the sensors 4 and 26 are routed via an amplifier 28 and an input/output (I/O) port 29D to a control circuit 9, which comprises a microprocessor. Also, the output of the position identification means 24 is applied to the control circuit 9 via an I/O port 29A.
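The dead-reckoning computation performed by the position identification means 24 can be illustrated with a short sketch: each interval's distance increment (the pulse count scaled to distance units) is resolved along the heading reported by the direction sensor 22 for that interval and summed into (x, y). The function name and the pulses-per-unit scale are assumptions for illustration only:

```python
import math

def identify_position(samples, pulses_per_unit=10.0):
    """Integrate (pulse_count, heading_degrees) samples into an (x, y)
    position in two-dimensional coordinates, in the manner of the
    position identification means 24.  The scale factor converting
    pulses to distance units is an illustrative assumption."""
    x = y = 0.0
    for pulses, heading_deg in samples:
        d = pulses / pulses_per_unit          # distance this interval
        x += d * math.cos(math.radians(heading_deg))
        y += d * math.sin(math.radians(heading_deg))
    return x, y
```

For example, ten pulses heading 0 degrees followed by ten pulses heading 90 degrees places the robot one unit east and one unit north of its start.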
The control circuit 9 comprises a central operational circuitry (CPU) 30, and a storage 32 made up of a read only memory (ROM) and a random access memory (RAM). The control circuit 9 further comprises an oscillator 34A for generating clock pulses, and an interrupt controller 34B. As will be described, the CPU 30 delivers a drive signal to a drive circuit 36 via an I/O port 29C in order to reversibly control the rotation of drive motors (servo motors or stepping motors) 1A and 1B, which are respectively associated with right and left drive wheels of the robot. At the same time, the control 9 may optionally control the rotation of an optional drive motor 36 for cleaning sweepers, which are mounted on the robot. A control console 38 is accessible for selectively turning on and off a system power source, switching a running mode, setting a start position, adjusting a sensitivity of the direction sensor 22, etc. In order to teach the robot a boundary of a travel range assigned thereto, a command may be applied to the drive 36 by interruption with priority on a radio control basis. This is effected by a remote control transmit unit 40 and a receive unit 42. The outputs of the control console 38 and remote control receive unit 42 are routed also to the control circuit 9 via an I/O port 29B.
Referring to FIG. 6, one particular embodiment of the mobile robot is shown in a schematic plan view. As shown, the robot comprises a platform 10 which is substantially entirely surrounded by a front bumper 50, side bumpers 51 and 52, and a rear bumper 53, each carrying the touch sensor 4 therewith. An obstruction is sensed by the contact of any one of the bumpers 50-53 therewith.
As shown in FIG. 7, assume that the robot is deviated to the right from the reference path by a distance “d” with respect to the travelling direction of the robot, and that it is misoriented by an angle Θ relative to the reference path. It is then determined that the deviation of the robot is to the right of the reference path. Also, whether the sign of d+tan Θ is positive or negative is determined by operation. Let it be assumed that either d+tan Θ≧0 or d+tan Θ<0 holds.
In the first-mentioned condition, d+tan Θ≧0, the distance d is large, or the angle Θ is relatively small, or the orientation of the robot lies in the positive angular range. Then, the rotation speed V of the left drive wheel is controlled to be V=V0−(d+tan Θ) (where the minimum value of V is assumed to be V0), while the rotation speed of the right drive wheel is kept at V0, whereby the robot is caused to make a leftward turn or rotate leftwardly about an axis thereof.
The other condition, d+tan Θ<0, represents a situation in which the angle Θ is negative and the robot is directed toward the path at a large angle. In this case, while the rotation of the left drive wheel is maintained the same, the rotation speed V of the right drive wheel is controlled to be V=V0+(d+tan Θ), thereby turning or rotating the robot to the right.
In this manner, the actual path of the robot is controlled to the reference path if dislocated therefrom, that is, the position of the robot is corrected.
The compensation effected for rightward deviation of the actual robot path from the reference path as described above similarly applies to leftward deviation of the robot, except for the reversal of angles and that of the control over the right and left drive wheels.
Due to the use of a tan function as a compensation term for the angle Θ, so long as MAX in the relation −MAX<tan Θ<MAX is sufficiently large, there exists a position and an angle where d+tan Θ=0 holds, even if the deviation d from the path is substantial. At such a specific point, the right and left drive wheels of the robot are equal in velocity, and the robot approaches the path at an angle which becomes closer to a right angle as the distance d increases and shallower as the distance d decreases. Stated another way, the orientation of the robot is compensated sharply when the distance d is large and the compensation is slowed down as the distance d becomes smaller. This insures smooth compensation. If desired, the term d may be multiplied by a positive constant “α” and the term tan Θ by a positive constant β so that any desired path compensation characteristic is established up to the point where αd+βtan Θ=0 holds, that is, the point where the robot advances straight with the right and left drive wheels running at an equal speed.
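For illustration only, the wheel-speed rule of FIGS. 6 and 7 for a rightward deviation d might be expressed as follows; the function name, the return convention, and the default constants are assumptions, not part of the disclosed control circuit:

```python
import math

def wheel_speeds(d, theta_deg, v0=10.0, alpha=1.0, beta=1.0):
    """Path-correction rule for a rightward deviation d and
    misorientation theta (degrees): when alpha*d + beta*tan(theta) >= 0
    the left wheel is slowed (leftward turn, back toward the path);
    otherwise the right wheel runs slower than the left (rightward
    turn).  Returns (left_speed, right_speed)."""
    e = alpha * d + beta * math.tan(math.radians(theta_deg))
    if e >= 0:
        return v0 - e, v0      # left wheel slowed: leftward turn
    return v0, v0 + e          # right wheel slowed (e < 0): rightward turn
```

At the balance point where alpha*d + beta*tan(theta) = 0, both wheels run at v0 and the robot advances straight, as described above.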
Teaching the robot a desired range of movement may be implemented by the supersonic wave sensors 4A, 4B and 4C and the touch sensors 5 which are mounted on the robot itself, instead of the remote control transmit and receive units. Some of the supersonic wave sensors 4A, 4B and 4C are capable of identifying short and medium ranges and the others, long ranges. Such self-teaching with the various sensors is optimum for cleaning, for example, the floor of a room which is surrounded by walls; the robot will make one round automatically along the walls of the room by sensing the walls with the sensors.
Another example of mapping suitable for use as a navigation system in the present invention includes mapping via imaging ceiling lights, which is known in the art. Such a system is shown and described in expired U.S. Pat. No. 4,933,864 (Evans et al., issued Jun. 12, 1990) and is herein incorporated by reference.
In such a mapping and navigation system, the robot microprocessor uses an imaged input to make a map of an environment, such as a kitchen, and determines the home cleaning robot's position and orientation on that map from an image input, such as the ceiling lights in that room. In particular, the robot's guidance system images light patterns on the ceiling. By extension, an off-board camera could instead image the robot itself on the two-dimensional surface.
Referring now to FIG. 8a there is shown a side view of one embodiment of a mobile robot 110 comprising an electronic imaging device, such as a camera 112. In accordance with the invention this optical configuration is arranged to view a ceiling 114 having a plurality of light fixtures 116, the ceiling 114 being disposed above the desired path of the robot 110. The camera 112 preferably includes a CCD imaging device having a square or rectangular field of view (FOV) which is directed obliquely upward such that it images the ceiling 114 within the forward path of the robot 110. The camera 112 generates a plurality of pixels, individual ones of which have a value indicative of an intensity of radiation incident upon a corresponding surface area of the camera radiation sensing device. Robot 110 further comprises an image processor 118 which is coupled to the output of camera 112. Image processor 118, as shown in greater detail in FIG. 8b, comprises a video memory 118A which stores a representation of one video frame output of camera 112. An input to video memory 118A may be provided by an analog to digital (A/D) converter 118B which digitizes the analog output of camera 112. The digital output of A/D 118B may form an address input to a lookup table (LUT) 118C wherein pixel brightness values may be reassigned. The LUT 118C may also be employed for image thresholding and/or histogram correction. Image processor 118 further comprises an image processing device, such as a microcomputer 118D, which is coupled to the video memory 118A and which is operable for reading the stored video frame data therefrom. Image processor 118 further comprises memory 118E which includes memory for storing program instructions, constants and temporary data. The program data may be operable for performing calculations of the type which will be described in detail hereinafter. 
An output of image processor 118 which is expressive of position information relating to ceiling fixtures 116 within the FOV of camera 112 may be supplied, via an RS-232 or parallel data link, to a navigation control processor 120 which derives navigation data based upon the perceived image of the ceiling environment, particularly the orientation of ceiling light fixtures. This data may be employed to steer the robot down a hallway or to orient the robot within a coordinate system of a room or other enclosure having ceiling light fixtures. An output of navigation control processor 120 is supplied to a drive and steering control 122 which has outputs coupled to drive and steering wheels 124. The wheels 124 are in contact with a supporting surface 126 which is typically a floor. Navigation control processor 120 typically receives an output from the drive and steering control 122, the output being expressive of odometer readings which relate to the distance traveled by the robot 110. Navigation control processor 120 comprises a data processing device having associated memory and support circuitry. An enclosure is provided to contain the aforementioned apparatus and to provide protection therefor.
As can be seen in FIG. 8c the navigation control processor 120 is generally responsible for interpreting robot 110 position measurements generated by ceiling navigation image processor 118, in conjunction with possible inputs from other sensor systems, to control the drive system 122 in order to guide the robot 110 along a desired path. Thus, position measurements function as an error signal in a feedback control system wherein the drive and steering mechanisms serve as the actuators which change the position of the robot.
The camera 112 may be a model TM440 CCD camera manufactured by Pulnix. The camera 112 may have a relatively short focal length of, for example, 8.5 mm in order to maximize the field of view. Microcomputer 118D may be a member of the 68000 family of microprocessor devices manufactured by Motorola, Inc. LUT 118C and video memory 118A may be contained within a frame grabber pc-board such as a type manufactured by Coreco or Imaging Technologies.
Referring briefly to FIG. 10a there is illustrated a typical institutional hallway. In a suitably thresholded camera image ceiling lights 116 are the overwhelmingly prominent visual features. The linear edges, or straight line boundaries, of the ceiling lights define, in accordance with the method and apparatus of the invention, reference lines for visual navigation.
As can be appreciated, when searching for and identifying the centers and edges of ceiling lights it is important to examine as few pixels as possible in order to reduce overall processing time. This search operation is facilitated by providing for an image threshold or a camera 112 aperture setting which causes the ceiling lights to appear as bright regions embedded within a dark background. A binary threshold technique may then be utilized to distinguish bright, illuminated pixels from dark pixels.
To initially locate a ceiling light in the image, a preliminary search may be performed over the entire image, beginning at the top row of pixels and working toward the bottom row. Once a pixel is detected that has a value above a predetermined search threshold value, the preliminary search is terminated. The predetermined threshold value is influenced by such factors as the type of camera employed, the camera aperture setting and/or the particular type of pixel thresholding. The preliminary search is preferably begun from the top of the image so that the ceiling light nearest to the robot is detected first.
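The preliminary top-down scan may be sketched as follows. This is an illustrative sketch only, assuming the thresholded camera frame is held in a NumPy array; the function name, array layout, and threshold value are assumptions, not taken from the specification.

```python
import numpy as np

def first_bright_pixel(image, threshold):
    """Scan the image from the top row toward the bottom row and return
    the (row, col) of the first pixel whose value exceeds the search
    threshold, or None if no light appears in the frame.  Scanning
    top-down causes the ceiling light nearest the robot to be found
    first, as described above."""
    for row in range(image.shape[0]):
        cols = np.nonzero(image[row] > threshold)[0]
        if cols.size > 0:
            return row, int(cols[0])
    return None
```

The scan terminates at the first qualifying pixel, so in the common case only a small fraction of the frame is examined.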
When a pixel above the threshold is detected, a method of the invention, as described below, may thereafter employ a binary subdivision search. As an example, given a white point or pixel within a ceiling light there is next located an edge of the light where a transition from white to black occurs. This may be accomplished by moving outwards from the white point while examining pixel values to detect a transition from a pixel value which corresponds to that of the light to a pixel value which corresponds to the dark background. Of course, the pixel values may not normally correspond to fully white or fully black but will typically be expressed as varying shades of gray. Sampling every pixel while moving towards an edge of the light may be less than optimum in that the edge may be hundreds of pixels removed from the initially detected pixel. Therefore, a preferred method involves stepping initially by some relatively large increment of pixels, such as by 16 pixels per step. Stepping outward in 16 pixel increments continues until a pixel value indicates that the search has entered the dark background. At this time the search increment is divided by two and the search direction is reversed. This process of dividing the stepping increment and reversing the stepping direction continues until the step size is divided down to one. At that point the pixel under consideration is either one pixel into the bright light or one pixel into the dark background. This search technique is repeated, as described below, to detect multiple edges of a ceiling light in order to obtain sufficient information to accurately locate the left and the right edges and a center point of the light.
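The binary subdivision search just described can be sketched in one dimension as follows, assuming a predicate `is_bright` that tests a thresholded pixel (the function and parameter names are illustrative):

```python
def find_edge(is_bright, x0, direction, step=16):
    """Binary subdivision edge search along one image row: starting from
    a bright pixel x0 inside a light, step outward by `step` pixels until
    the dark background is entered, then halve the step and reverse the
    search direction, repeating until the step size reaches one.  The
    returned column is one pixel past the light/background transition."""
    x, inside = x0, True              # the starting pixel lies inside the light
    while True:
        x += direction * step
        if is_bright(x) != inside:    # the edge has been crossed
            if step == 1:
                return x              # one pixel into light or background
            direction = -direction    # reverse the search direction
            step //= 2                # halve the step increment
            inside = not inside
```

With a step of 16 the edge is bracketed in a handful of probes rather than the hundreds of per-pixel samples a linear scan could require.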
Referring to FIG. 11f it can be seen that after a pixel, designated by the point (X), within a light is found, a vertical line (1) and a horizontal line (2) are projected through the point (X) to the edges of the light using the above described pixel search method. If the vertical line (1) is longer than the horizontal line (2), a new horizontal line (3) is projected from the center of line (1). If instead the horizontal line (2) is longer, a new vertical line is projected from the center of the horizontal line (2). These steps bring the initial point, which may have been at an extreme edge of the light, farther into the center of the light, as indicated by the point X′. Thereafter, the slope of the edges of the light is determined as described below.
A plurality of vertical lines (4, 5, and 6) are projected, one line (5) at the middle of the horizontal line (3) and the other two lines (4,6) approximately 25% in from the ends of the horizontal line (3). Thereafter, from the points (a, b, c, d, e, f) which define the ends of the vertical lines (4,5,6) there is found an average slope for the light. A line (7) is then projected which passes through the center of vertical line (5), the line (7) having a slope equal to the average slope of the light as previously calculated. It should be noted that the vertical lines (4, 5, 6) may have been drawn so close together that the calculated average slope may not be of high accuracy. Thus, the line (7) may not intersect the two ends of the light. Therefore, at points approximately 25% of the way in from the ends of line (7) two additional vertical lines (8,9) are projected and the average slope from the end points (g, h, i, j) of lines (8,9) is determined. From the center point of each of the two vertical lines (8,9) a line (10 and 11, respectively) is projected toward the nearest edge of the light along the most recently computed average slope. The edge transition between illuminated and nonilluminated pixels sensed along lines 10 and 11 indicate the true ends of the light (A,B). At a point halfway between the edges (A,B) is the center point of the light (CP).
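The slope-averaging and center-point steps above can be sketched as follows. The chord endpoints are assumed to have been located with the edge search already described; the pairwise averaging of midpoint slopes is an illustrative reading, since the specification does not fix the exact arithmetic.

```python
def average_slope(chords):
    """Estimate the slope of a light's long axis from several vertical
    chords drawn through it.  Each chord is (x, y_top, y_bottom); the
    slopes between successive chord midpoints are averaged."""
    mids = [(x, (y_top + y_bottom) / 2.0) for x, y_top, y_bottom in chords]
    slopes = [(mids[i + 1][1] - mids[i][1]) / (mids[i + 1][0] - mids[i][0])
              for i in range(len(mids) - 1)]
    return sum(slopes) / len(slopes)

def center_point(end_a, end_b):
    """The center (CP) of the light lies halfway between its true ends
    A and B, found where lines 10 and 11 cross the edge transition."""
    return ((end_a[0] + end_b[0]) / 2.0, (end_a[1] + end_b[1]) / 2.0)
```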
After accurately locating one light a second light is found and analyzed in a substantially identical manner in order to generate a set of points with which to project lines (C,D) to the vanishing point at the horizon.
To find the second light a line (12) is projected downwards in the image from the center (CP) of the first light and perpendicular to the slope of line (7). Pixels along the line (12) are analyzed to determine if another light is encountered. Because of the differing angles which the lights may assume relative to one another, line (12) may not intersect a second light. If this is the case, two more lines (13,14) are projected from the ends of the first light perpendicularly to the line (7) to determine where and if a second light is intersected. Of the lines (12,13,14), at least one is assured of intersecting another light if one is present.
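The perpendicular projection of lines 12, 13, and 14 might be sketched as below, again assuming a thresholded-pixel predicate `is_bright`; the step bound and helper names are illustrative assumptions.

```python
import math

def find_second_light(is_bright, center, ends, slope, max_steps=500):
    """Search for a second light along lines projected perpendicular to
    the first light's axis: first from its center point (line 12), then
    from each of its ends (lines 13 and 14).  The walk must first leave
    the first light (seen_dark) before a newly bright pixel is taken to
    lie within a second light."""
    norm = math.hypot(1.0, slope)
    px, py = -slope / norm, 1.0 / norm   # unit perpendicular, pointing down-image
    for sx, sy in [center] + list(ends):
        seen_dark = False
        for k in range(1, max_steps):
            x, y = round(sx + k * px), round(sy + k * py)
            if not is_bright(x, y):
                seen_dark = True
            elif seen_dark:
                return x, y              # a pixel inside the second light
    return None                          # no second light along lines 12-14
```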
It should be realized that the preceding description of a method of locating edges of ceiling lights is but one suitable technique. For example, known methods of finding straight line patterns in a video image include the use of Hough transforms, edge detection and linking, and curve fitting.
Referring to FIG. 9a it is shown that the camera 112 configuration is treated geometrically as a viewpoint 130 and an image plane 132. The viewpoint 130