Full Version: Slippage Detection
Unmanned Spaceflight.com > Mars & Missions > Past and Future > MER > Opportunity
Tman
If they manage to rescue Oppy from its awkward position, what could they do to guard against such a thing in the future?

Does anyone know whether they can gauge the electric power consumption of each motor? If so, I'd expect a sudden change in consumption when a wheel slips, because the reduced resistance changes the power draw - I would think it drops.
Edward Schmitz
Slippage detection from current draw is very problematic. Slippage can increase the current load or decrease it, or it can stay in the normal range. The current draw is constantly fluctuating. If the wheel treads are clogged with soil (like they are now), the current might drop because the wheels are smooth and the rover's not moving. If the soil is soft but little slippage is occurring, it is like going uphill: the current rises. If the soil is really soft and sticky, the current can go way up. So there is no hard and fast way to tell it is slipping from current draw.
paxdan
Combine wheel rotation with visual odometry. Each rotation gives a nominal 80cm of travel. If you move less than 40cm per rotation as measured by visual odometry, halt and await further command. It would stop you spinning the wheel on the spot.
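[Editor's sketch] paxdan's rule can be written down in a few lines. The 80 cm nominal travel per rotation and the 50% halt threshold come straight from the post above; the function name and interface are invented for illustration, not anything from the actual flight software:

```python
def should_halt(wheel_rotations, visual_distance_m,
                nominal_travel_m=0.8, slip_threshold=0.5):
    """Halt when visual odometry reports less than half the travel
    that the commanded wheel rotations should have produced."""
    if wheel_rotations <= 0:
        return False
    expected_m = wheel_rotations * nominal_travel_m
    return visual_distance_m / expected_m < slip_threshold

print(should_halt(1, 0.3))  # True: only 0.3 m of the 0.8 m expected
print(should_halt(1, 0.6))  # False: 75% of expected travel, keep going
```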
djellison
Visual Odometry would work - but it would be PAINFULLY slow, and I mean

"Today we covered 18 metres" would be a remarkable achievement

Doug
garybeau
I would think a simple solution for future rovers would be to use the same or similar technology as an optical mouse: a small, self-contained imaging system mounted under the rover that compares frames hundreds of times per second to give you real-time speed and direction over the ground.
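[Editor's sketch] What an optical mouse sensor does boils down to block matching between successive frames. A toy, brute-force version in plain Python (a real sensor does this correlation in silicon at over a thousand frames per second; the function and test pattern here are illustrative):

```python
def estimate_shift(prev, curr, max_shift=2):
    """Try every small (dy, dx) shift and keep the one where the
    shifted current frame best matches the previous frame."""
    h, w = len(prev), len(prev[0])
    best = None  # (mean squared error, dy, dx)
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            err, n = 0, 0
            for y in range(h):
                for x in range(w):
                    cy, cx = y + dy, x + dx
                    if 0 <= cy < h and 0 <= cx < w:
                        err += (prev[y][x] - curr[cy][cx]) ** 2
                        n += 1
            # Ignore shifts with too little overlap to be trustworthy.
            if n >= (h * w) // 2 and (best is None or err / n < best[0]):
                best = (err / n, dy, dx)
    return best[1], best[2]

# A 4x4 gradient frame, then the same scene moved one pixel to the
# right (the newly exposed left column is filled with zeros):
prev = [[1, 2, 3, 4],
        [5, 6, 7, 8],
        [9, 10, 11, 12],
        [13, 14, 15, 16]]
curr = [[0] + row[:-1] for row in prev]
print(estimate_shift(prev, curr))  # (0, 1)
```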

Gary
djellison
Interesting concept - I wonder if such a system could be deployed so as to be far enough away from the ground to not risk being 'ground out' - but close enough to work well.

Could make an interesting science instrument as well actually - partway between PC and MI

Doug
slinted
I believe that is the purpose of the 256x256 navcams - similar to the optical mouse concept. I have no idea if that is how they are used (or even whether they are used onboard the rover, or just for analysis on the ground), although obviously the frame rate is much lower than what a mouse would use. One navcam frame (at least in terms of what gets downlinked; potentially many more could be taken and used onboard) looks down and to the side for each driving segment.

Edward Schmitz
QUOTE (slinted @ May 9 2005, 04:41 PM)
I believe that is the purpose of the 256x256 navcams, similar to the optical mouse concept. I have no idea if that is how they are used (or even if they are used onboard the rover, or if they are just for analysis on the ground) although obviously the frame rate is much much lower than what a mouse would use. One navcam (at least in terms of what gets downlinked to the ground, many more could be taken and used onboard the rover potentially) looking down and to the side for each driving segment.



I believe that is what they are doing.
Edward Schmitz
QUOTE (djellison @ May 9 2005, 01:20 AM)
Visual Odometry would work - but it would be PAINFULLY slow, and I mean

"Today we covered 18 metres" would be a remarkable achievement

Doug

Visual odometry is not painfully slow. They have been doing it for a long time. They combine a short blind drive with an extended autonav drive and get upwards of 150 meters. It is flash-memory intensive: three days on autonav and they fill the drive. But I say use it and lose it - discard the autonav data after it's driven. I think they never want to discard any data that hasn't been downlinked, but it is holding them back.
wyogold
QUOTE (Edward Schmitz @ May 10 2005, 02:20 AM)
QUOTE (djellison @ May 9 2005, 01:20 AM)
Visual Odometry would work - but it would be PAINFULLY slow, and I mean

"Today we covered 18 metres" would be a remarkable achievement

Doug

Visual odometry is not painfully slow. They have been doing it for a long time. They combine a short blind drive with an extended autonav drive and get upwards of 150 meters. It is flash-memory intensive: three days on autonav and they fill the drive. But I say use it and lose it - discard the autonav data after it's driven. I think they never want to discard any data that hasn't been downlinked, but it is holding them back.



I think it's because the funding and the mission revolve around the data sets. I agree: quality over quantity.

scott
djellison
Autonav is not the same as Visual Odometry

Autonav takes a stereo hazcam pair - checks for hazards, continues driving, checks for hazards, continues driving, checks for hazards, continues driving, checks for hazards.
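[Editor's sketch] That drive/check cycle can be sketched as a loop. Every name here is an illustrative stub standing in for the real flight software, not the actual rover code:

```python
def autonav_drive(step_m, max_m, imaging, hazard_detector, drive):
    """Image, check for hazards, advance one short step, repeat
    until the goal distance is reached or a hazard is seen."""
    travelled = 0.0
    while travelled < max_m:
        frame = imaging()             # e.g. a stereo hazcam pair
        if hazard_detector(frame):
            break                     # stop short of the hazard
        drive(step_m)
        travelled += step_m
    return travelled

# Toy run: the fourth hazard check comes back positive.
checks = iter([False, False, False, True])
dist = autonav_drive(0.5, 10.0,
                     imaging=lambda: None,
                     hazard_detector=lambda frame: next(checks),
                     drive=lambda metres: None)
print(dist)  # 1.5
```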

Visual odometry has been used a few times to line the rover up in front of an IDD target - it takes hazcam images, identifies key features specified from the ground, and drives until they match the positions requested by the ground. It was used in Eagle crater, I believe.


Just to check I hadn't gone mad...

"Spirit began sol 178 by stowing the robotic arm and then backing up 1.5 meters (about 5 feet) from "Hank's Hollow" in order to properly place the miniature thermal emission spectrometer to get a good view of "Pot of Gold" and nearby rover tracks.

Engineers also took this opportunity to use visual odometry for the first time on Spirit. This is a technique in which the rover takes successive images of its surroundings during a drive and then matches features in those images on-board to compute how far and in what direction it has moved." - It had been auto-naving for almost 5 months by this point.
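[Editor's sketch] The "matches features ... to compute how far and in what direction it has moved" step can be illustrated with a translation-only toy. Real onboard visual odometry solves for the full change in pose from stereo pairs; this invented function just averages the displacement of matched features between the before and after images of a drive step:

```python
def estimate_motion(matches):
    """Each match pairs a feature's old (x, y) pixel position with
    its new one; the mean displacement estimates the image motion."""
    n = len(matches)
    dx = sum(bx - ax for (ax, ay), (bx, by) in matches) / n
    dy = sum(by - ay for (ax, ay), (bx, by) in matches) / n
    return dx, dy

# Three features, each tracked from its old to its new position:
matches = [((10, 10), (13, 11)),
           ((40, 25), (43, 26)),
           ((7, 50), (10, 51))]
print(estimate_motion(matches))  # (3.0, 1.0)
```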





It ain't autonav :)

The VisOdom Navcam work done so far has been in areas of high slippage (Burns Cliff, climbing to Larry's Lookout, etc.) - and while I'm unsure of the exact data-volume burden, these have always been short drives of a few metres, not the 50-100 m+ Autonav drives that Spirit and Opportunity have both managed.


Doug
JES
Should we look for another software upload and reboot in the near future?
:unsure:
djellison
I think the software onboard is able enough - but I do share the thoughts of others here... what is so special about the Autonav data (equivalent of a 360°, 4-filter panorama generated in about 4 sols) that dictates it must be downlinked? Surely after a successful autonav drive, the data isn't THAT important?

Doug
Marslauncher
Well, I guess they will want to check the autonav data for the next few months, or until they get to an area that isn't as prone to trapping as the dunes. On a side note, do we know if any commands have been sent to Oppy, or whether we have even attempted any extraction techniques yet?

Thanks

John
jaredGalen
If things work out and they manage to get Oppy out I think the next worst thing that could happen is this.

http://mars.systemsfirst.com/PepsiMarsRover-1.wmv

Frankly, it is too funny not to post. :lol:
garybeau
QUOTE (djellison @ May 9 2005, 05:26 PM)
Interesting concept - I wonder if such a system could be deployed so as to be far enough away from the ground to not risk being 'ground out' - but close enough to work well.

Could make an interesting science instrument as well actually - partway between PC and MI

Doug



I wasn't too sure how well it would work from several inches away, so I decided to give it a try. I took an optical mouse, removed the existing lens, and mounted a lens from a webcam. Getting it focused was a problem because I had no visual feedback; I just kept moving it back and forth and adjusting the focus until the cursor on the screen started to move. It ended up working surprisingly well: I was able to get it to work reliably from 3" to 24". With the mouse in its original configuration, the cursor on the screen would move about 7" for every one inch of movement of the mouse. With the lens attached and the mouse held 12" away, the cursor would move 2" for every 20" I moved the mouse. It seemed pretty repeatable: when I moved the mouse back to its original position, the cursor ended up back where it started. The only drawback I saw was that it took quite a bit of light to work, which must be due to the high shutter speed (1500 fps).

In conclusion, I think the concept would work well for measuring ground speed. There would be a little variation due to the distance to the ground not being fixed. The beauty of using something like this is that you get real-time feedback on your movement, and since all the computing is done on the chip, you free up the main computer for more important tasks such as obstacle avoidance and route planning. If something like this were ever used on a rover, the best place to put it would be in the belly, with a peephole pointing straight down at the ground, where it would be protected from the elements. To get the light sensitivity up, the frame rate could be slowed way down. Doing a little math, it looks like there was about a 1:63 reduction in the speed of the cursor. You could easily bring the frame rate down to 50 - 100 fps, which should increase the light sensitivity significantly. With a little bit of on-chip software modification, I'm sure you could get it to output speed and direction instead of x,y position.
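[Editor's note] A quick sanity check of the scale factors quoted in this post. From the measurements given (7" of cursor per 1" of mouse stock, 2" per 20" with the lens), the reduction works out to roughly 1:70 - the same ballpark as the ~1:63 eyeballed in the post:

```python
# Sanity-checking the scale factors quoted above.
bare_gain = 7.0 / 1.0        # stock mouse: ~7" of cursor per 1" of mouse
lensed_gain = 2.0 / 20.0     # with webcam lens at 12": 2" per 20"
reduction = bare_gain / lensed_gain
print(round(reduction))      # ~70, close to the 1:63 quoted
```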
I don't think the images would be of much use to anyone. The image sensor is only 18x18 pixels. I'm sure someone at NASA would want to stitch together a picture 18 pixels high by 6 kilometers long. :D

Gary
Invision Power Board © 2001-2024 Invision Power Services, Inc.