Full Version: Bad Performance?
Unmanned Spaceflight.com > Outer Solar System > Saturn > Cassini Huygens > Cassini general discussion and science results
Wyl2006
I am wondering about two things in particular:


1. Why was the imaging system of Cassini not mounted on a scan platform? I am really wondering why more than half the opportunities for radar cartography are lost in order to run other experiments. This does not seem an excellent technical solution.

2. Why was the Huygens camera built with such low resolution? Even the "test images" taken in the LPI parking area would raise a lot of questions if people were not told that the things in the picture are trees.

Some experiments with a simple CCD camera at 1/1000 of the illumination found on Earth show clearly that better pictures could be taken without problems, and there are better compression algorithms.

As the Soviet space probes showed during the 70s and 80s, it is possible to take panorama photos of a torrid surface and relay them during a flyby of the bus without serious problems.

I am really wondering at the funny "self beloving" of some ESA "professionals" and at the things said about this bunch of lousy photos they have gotten.


I work in this field, building cameras for day-and-night surveillance and inspection.

Greetings: Wyl
Richard Trigaux
The main constraint on Huygens was the data rate: very low. For this reason alone, the images were of poor quality, and the sound data were only audiograms, not actual audio recordings.


Of course this is disappointing, especially when we compare with the beautiful MER images. But it was this or nothing. I hope the next Titan lander will have a larger data bandwidth. But for this it will need a radio relay orbiting Titan, and... nuclear power (an RTG).

The scan platform on Cassini? Cancelled because of cost, and no anticipated use for it. A decision that many will regret when it comes time to fly by Enceladus at 25 km altitude.

Design problems? See the thread on ITAR.

Other problems? None as far as I know. Cassini does well and the engineers and technicians have all reasons to be proud.
remcook
QUOTE (Richard Trigaux @ May 11 2006, 07:54 PM) *
The scan platform on Cassini? Cancelled because of cost, and no anticipated use for it.


no anticipated use?! The science planning has increased many times in complexity.
centsworth_II
QUOTE (Richard Trigaux @ May 11 2006, 02:54 PM) *
The scan platform on Cassini? Canceled because of cost....A decision that many will regret...


A stuck scan platform would also be cause for much regret.
tedstryk
The problem is that the radar uses the same antenna that is used to communicate with Earth. So if it were aligned with the remote sensing instruments, Cassini could never use its High Gain Antenna while observing anything!
djellison
QUOTE (Wyl2006 @ May 11 2006, 07:19 PM) *
I am really wondering at the funny "self beloving" of some ESA "professionals" and at the things said about this bunch of lousy photos they have gotten.


Welcome to UMSF - but careful with the 'self beloving' stuff - no need for insults.

The camera was built in the very early '90s, with a tiny volume, mass, power and data budget, and sent to a very dim, cold, harsh environment - and given those constraints it was about as good as it could possibly have been. Given that, its origin is unimportant - but just to set the record straight, the camera was in fact a US contribution to Huygens:
http://www.lpl.arizona.edu/DISR/ - so if you feel the need for totally unjustified criticism, then lay it at their door, not ESA's.

Doug
remcook
For all those people who say 'my camera can do better': it would be fun to fully flight-test a normal digital camera. That means a full environmental test - and shake it! It would be interesting to see how a normal camera would do after such a test.
BruceMoomaw
The scan platform was removed from Cassini in 1992 -- along with a second scan platform for its fields and particles instruments, and a separate receiver antenna for Huygens' signals -- because at that time Cassini was in an extremely severe fiscal crisis. Congress was on the very brink of cancelling it because of its high cost -- only those design simplifications, plus very intense pressure from the ESA, kept Congress from doing so. And NASA asked the Space Studies Board for an appraisal to make sure that the new design would still be scientifically cost-effective: http://www7.nationalacademies.org/ssb/crafcassini1092.html .

As for the DISR: as other commenters have pointed out, the cameras' resolution was kept low simply because cameras are first-class data hogs, and making their resolution higher would have required much stronger transmitters and power supplies for the probe. (Without the cameras, Huygens' data rate could have been cut from 8000 bits/second to only about 200 -- as was the case with the Pioneer 13 Venus entry probe.) And, given those necessary limitations, the DISR team -- who were American, not European (led by Martin Tomasko of the Univ. of Arizona) -- chose to make a tradeoff between somewhat reduced resolution and greater total area coverage of Titan's surface. I don't think anyone can say they were wrong to do so. (The GCMS on Huygens, by the way, was also provided by the U.S.)
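As a back-of-envelope check on that bandwidth argument, here is a sketch. Only the 8000 bit/s link rate comes from the post above; the frame size, bit depth, compression ratio, descent duration and imaging share are all illustrative assumptions, not DISR specifications:

```python
# Rough budget: how many frames fit through a Huygens-class link?
# Only the 8000 bit/s link rate is taken from the discussion; every
# other number here is an assumed, illustrative value.

LINK_RATE_BPS = 8000           # total probe-to-orbiter rate, bits/s
DESCENT_S = 2.5 * 3600         # assume a ~2.5 hour descent
IMAGING_SHARE = 0.5            # assume imaging gets half the bandwidth

# Hypothetical small frame: 256 x 256 pixels, 12 bits/pixel,
# compressed about 8:1 before transmission.
raw_bits = 256 * 256 * 12
compressed_bits = raw_bits / 8

budget_bits = LINK_RATE_BPS * DESCENT_S * IMAGING_SHARE
n_images = int(budget_bits // compressed_bits)
print(n_images)  # a few hundred small frames at best
```

Even with generous assumptions the budget tops out at a few hundred small frames, which is the right order of magnitude for the imagery Huygens actually returned.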
Mariner9
It has already been noted, but it is worth repeating: the descent camera was built a looooong time ago. At least, a long time ago in digital technology time frames.

Whenever you look at data being returned by a space mission, you really have to factor in the timeframe that the mission was developed.

In Cassini's case, it started being studied by JPL around 1980-81. JPL entered into talks with ESA a few years later. Preliminary designs for Cassini were undoubtedly being looked at by the mid-to-late 80s. Formal approval for the mission happened around 1989. As soon as these spacecraft enter the true design period, one of NASA's big priorities is flying proven hardware. So the designers were probably somewhat hindered from even including the state of the art for 1989... so roll the clock back a few years from there.

For missions such as Galileo it was even worse. Essentially the spacecraft was re-designed several times, split into a probe carrier and orbiter, then merged back, and then the whole thing put into storage for a couple years after Challenger exploded. When it flew, it took a 6.5 year course to Jupiter, instead of the originally planned 3. So all told, when the first pictures of Ganymede started rolling in around July of 1996, the camera producing them was based on CCD technology from the mid 70s.
BruceMoomaw
Yes -- I still find it staggering that the Galileo Project, as a whole, stretched over half my lifetime. It was first seriously planned in 1976 (then called "Jupiter Orbiter and Probe"); it was cancelled by the House Appropriations Committee in 1977, only to have its funding restored in a very rare override of that committee by a full House vote; and its instruments were selected way back yonder in late 1977 (except for the Extreme UV Spectrometer and a total-radiation-dose monitor, which were added in the mid-1980s).

In the 1980s -- thanks entirely to the Shuttle's foibles, culminating in the Challenger disaster -- its launch was delayed by almost 8 years, its arrival at Jupiter by 11 years, and the mission's basic design was radically changed FIVE times. (And, as the crowning irony, if it hadn't been for every one of those delays -- including Challenger -- the mission would have failed totally and it wouldn't even have been America's fault. Messerschmitt built its small thrusters as part of a NASA deal with West Germany, and it wasn't until mid-1987 that it was discovered that they would all have burned out or exploded within a few months of launch, requiring a frantic last-ditch thruster redevelopment effort.) When that craft finally burned up in Jupiter's atmosphere, it was literally the end of an era for me.
centsworth_II
QUOTE (BruceMoomaw @ May 11 2006, 05:13 PM) *
...the DISR team... chose to make a tradeoff between somewhat reduced resolution and greater total area coverage of Titan's surface. I don't think anyone can say they were wrong to do so.


Limitations or no, the Huygens images stand as some of the most fantastic ever taken anywhere in the solar system. They are more than sufficient to reveal the totally alien yet eerily familiar landscape of Titan. I'm no space exploration history expert, but as far as I know they are the best (only?) descent panoramas ever taken.
Mariner9
I think you are correct.

The Veneras only transmitted from the surface of Venus.

MERs had descent imagers, but I only recall a few fuzzy pics, and those are not panoramas.

Similar with Ranger, except the descent photos on approach to the Moon were the entire point of the mission. Lots of pics: straight down. No panoramas.

I don't remember any other missions having descent imaging at all, although the Lunar Surveyors might have.


I'm voting for a Discovery mission with a Venus probe... one that parachutes very, very slowly to the surface, the entire point being to take as many panoramas as possible as it drifts slowly across the landscape. And, well, I suppose while it is there it might as well do some sort of mineralogical study of the surface that the Veneras didn't get to, this time with an incredible amount of context information from the descent imaging... but that's the scientists for you, always wanting some actual meat to the mission. Me, I admit I mostly want the pics. smile.gif
Richard Trigaux
Having worked in this area, especially in component testing, I can confirm that the electronic parts used in spacecraft today are in fact well behind the state of the art. The reasons are many, as well explained above by several posters; but there is also a delay simply in qualifying a component for space: it must work across a large range of temperatures (which usually constrains designers to military-grade components) and especially under high radiation doses. The Huygens camera in particular had to endure, in turn:
-Earth's Van Allen belt radiation
-interplanetary radiation
-neutron flux from the RTGs, for twice the time planned (no, just the planned time; I confused it with Galileo and corrected this after the two posts below, thanks for spotting the bug)
-Saturn's Van Allen belts, which are intense enough to kill a man.
Integrated circuits are especially sensitive to radiation, which can cause errors, break them abruptly, or degrade their performance until they stop working. Radiation-testing a complex part like a CPU or a CCD takes years: build test circuits, expose the components to various radiation conditions, gather statistics, etc. I don't know how they did it (it is included in the design time), but we can be relieved that the Huygens camera was still working and returned images.

So it is really unfair to compare a 2006 digital camera working in a comfortable human environment with integrated circuits from decades earlier still working after years of such cold temperatures and destructive radiation.

A deep-space mission is a bit like a human life: it takes a long time, has its infancy, its great moments, its long waits, and its death. And the world evolves significantly during all this time...
Analyst
QUOTE (Richard Trigaux @ May 12 2006, 07:55 AM) *
A deep-space mission is a bit like a human life: it takes a long time, has its infancy, its great moments, its long waits, and its death. And the world evolves significantly during all this time...


Well said.

Why "neutron flux from the RTGs, for twice the time planned"?

Analyst
ugordan
I'd just add that the primary factor behind the low resolution of the images was the data rate. Do you even realize how low a bit rate 8 kilobits per second is? That's 7 times slower than today's dial-up modems! Plus you have other instruments requiring bandwidth as well!
Another factor determining the quality of the images is that the CCD chip in the camera accumulated 7 years of cosmic-ray radiation damage -- you can't possibly expect the same level of quality as from a test device back here on Earth!
Saying today's digital cameras would blow the Huygens DISR away is nonsense. Your typical digital camera produces 8-bit images, as opposed to the Huygens camera's 12-bit images. Huygens had 16 times finer intensity quantization than an ordinary camera -- a capability that was crucial to making out the faint surface features at all through the omnipresent haze. If you launched an 8-bit camera, you'd get nothing except haze (which reduces contrast and hence usable dynamic range) all the way down to the vicinity of the surface. So much for the superiority of commercial cameras. Finally, the compression algorithm for the images was, IIRC, developed even before JPEG was standardized.
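That dynamic-range point can be illustrated with a small sketch. The haze level and the 0.5% contrast figure are illustrative assumptions, not measured Titan values:

```python
import numpy as np

# Sketch of why bit depth matters in a hazy, low-contrast scene.
# Suppose surface detail modulates the signal by only ~0.5% of full
# scale, sitting on top of a bright haze level (assumed numbers).

rng = np.random.default_rng(0)
detail = rng.random(10000) * 0.005      # faint 0.5%-contrast detail
scene = 0.80 + detail                   # bright haze + faint detail

q8 = np.round(scene * 255) / 255        # 8-bit quantization
q12 = np.round(scene * 4095) / 4095     # 12-bit quantization

# Count the distinct levels the detail survives as after quantization:
print(len(np.unique(q8)))    # ~2 levels: the detail is crushed
print(len(np.unique(q12)))   # ~20 levels: the detail survives
```

At 8 bits the whole faint-contrast scene collapses into a couple of gray levels; at 12 bits there is still enough quantization headroom to stretch the detail back out on the ground.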
You really can't stress enough that any technology we launch into space is "obsolete" the moment it's launched. Think about that the next time you wish to complain about the quality of the data. When I first read the details about Huygens's data rate and the number of images they were expecting to take in two hours' time, I immediately thought "no way! that's going to look ugly!" When the images finally came back, I was actually pleasantly surprised, because I had no expectations of superb imagery. In fact, the surface image was mindblowingly clear.
IMHO, given the circumstances and the time when the probe was developed, it performed perfectly. The people who made the damn thing have every right to be "self beloving".

QUOTE (Analyst @ May 12 2006, 09:15 AM) *
Why "neutron flux from the RTGs, for twice the time planned"?

I was wondering about that as well. Maybe he meant twice as long as a direct trajectory to Saturn would have taken. That option was never doable, given the sheer mass of the spacecraft. Cassini actually did pretty well - under 7 years to Saturn (as opposed to how long Galileo took to get to Jupiter), and in fact there was no need for the backup 11-year trajectory. On the other hand, Huygens stayed attached to Cassini only a month or so longer than planned, so that didn't have a big impact.
Richard Trigaux
QUOTE (Analyst @ May 12 2006, 08:15 AM) *
Well said.

Why "neutron flux from the RTGs, for twice the time planned"?

Analyst


Oops, it is Galileo which took twice the travel time. Sorry, I mixed the two up. I have corrected the original post.
BruceMoomaw
QUOTE (Mariner9 @ May 12 2006, 07:42 AM) *
I think you are correct.

The Veneras only transmitted from the surface of Venus.

MERs had descent imagers, but I only recall a few fuzzy pics. and those are not panoramas.

Similar with Ranger, except the descent photos on the approach to the moon was the entire point of the mission. Lots of pics: straight down. No panoramas.

I don't remember any other missions having descent imaging at all, although the Lunar Surveyors might have.
I'm voting for a Discovery Mission with a Venus probe... that parachutes very, very slowly to the surface, with the entire point being to take as many panoramas as possible as it drifts slowly across the landscape. And, well I suppose while it is there it might as well do some sort of mineralogical study on the surface that the Veneras didn't get, this time with an incredible amount of context information from the descent imaging .... but that's the scientists for you, always wanting some actual meat to the mission. Me, I admit I mostly want the pics. smile.gif


We mustn't forget NEAR, Hayabusa and Deep Impact, must we? Or, for that matter, the LMs. (No, none of the Lunar Surveyors did it -- only the first two even carried descent cameras, and the decision was finally made not to try using them. The Soviets seriously considered adding a descent camera to a Venera, but finally ended up not doing so.)

We really should appreciate how lucky Huygens was to come down next to a shoreline and actually in the lakebed -- it's hard to think of a landing site that could have given us a clearer understanding of what is actually occurring on that puzzling world.

For the record, there has already been at least one proposal before (presented twice) for a Discovery mission that would have done what you suggest at Venus -- in fact, it would have dropped four multispectral camera-equipped descent probes onto different parts of Venus' surface from a cloud-level balloon gondola, although they wouldn't have survived their landing. (It would also have featured a separate large entry probe for atmospheric analysis, and a few instruments on the balloon itself.) This concept was called "VEVA", and you can learn quite a lot about it in its final form at http://trs-new.jpl.nasa.gov/dspace/bitstre...6/1/00-0365.pdf .
Cugel
There is, however, one thing about DISR - or rather, about transmitting the Huygens images - that went horribly wrong:
that of course being the loss of one of the two data channels.
With that we lost almost 50% of all imagery, although some of it was redundant. We could have had a lot more ground covered, and with that a better performance.
As far as I know (though I don't hear much about the issue) it was human error, and it should have been caught by some review procedure. If you want to criticize anyone, this should be your point.
ngunn
I'm amazed that anyone's grumbling. As was pointed out at the time the extra data channel was designed for redundancy, not for extra data. Since the landing was successful and one channel worked the Huygens engineering team delivered the goods 100%: a wonderful example of human exploration working at full stretch - daring ambition, technical thoroughness and a real dash of luck with the place of landing. As for the pictures they're a superb first peek at a whole new and very strange environment - quite magic, I say.

On lucky landing locations: how likely was it that Spirit would descend right in the middle of dust devil alley?
Holder of the Two Leashes
QUOTE (Mariner9 @ May 12 2006, 02:42 AM) *
MERs had descent imagers, but I only recall a few fuzzy pics. and those are not panoramas.

Similar with Ranger, except the descent photos on the approach to the moon was the entire point of the mission. Lots of pics: straight down. No panoramas.


Too bad the Vikings didn't have descent imaging, especially Viking 2. It would have saved us a lot of grief over the years.

As for Ranger, you're right that there were no panoramas. But some of the cameras were offset just a little from "straight down". Not really a major point, though, just an FYI.
tty
QUOTE (remcook @ May 11 2006, 11:12 PM) *
for all those people who say 'my camera can do better', it would be fun to fully flight-test a normal digital camera. That means full environmental test and shake it! It would be interesting to see how a normal camera would do after such a test.


I fully agree. Having seen, on three (3) separate occasions, new expensive digital cameras simply lie down and die when encountering tropical rainforest conditions (hot, humid), my opinion is that they are not quite ready for use on Terra yet. sad.gif

tty
djellison
Well - I'd be happy to spend £25 on a cheap crap digital camera to see if it works after a night in the freezer smile.gif

Doug
Cugel
QUOTE (ngunn @ May 12 2006, 12:32 PM) *
I'm amazed that anyone's grumbling. As was pointed out at the time the extra data channel was designed for redundancy, not for extra data. Since the landing was successful and one channel worked the Huygens engineering team delivered the goods 100%...


As far as I know (which might not be the ultimate truth), the redundancy was that the images were split between the two channels, not that all images were transmitted on both. So we ended up with 350 images instead of 700. Some of the ground covered in those missing 350 images is probably also in the set we did receive, but as DISR builds panoramas from many small individual frames, I think we suffered a substantial loss. Having two channels to begin with was a good idea of course, and one might argue that in the end it saved the day. (Depending on whether they would have shut down a single data channel... better not think about that.)
Richard Trigaux
You don't need to break a camera to see what happens. Qualifying such a camera for space implies completely re-doing the electronics (I did this with a PC; it no longer looked like a PC), potting it, and especially radiation- and temperature-qualifying all the large integrated circuits and CCD parts.
djellison
Actually - I was given a Tamagotchi (you know, the little electronic pet things) - I killed it by putting it in the freezer overnight biggrin.gif

Doug
Bob Shaw
QUOTE (Richard Trigaux @ May 12 2006, 10:56 PM) *
You don't need to break a camera to see what happens. Qualifying such a camera for space implies completely re-doing the electronics (I did this with a PC; it no longer looked like a PC), potting it, and especially radiation- and temperature-qualifying all the large integrated circuits and CCD parts.


Richard:

I seem to recall that one of the differences between the early US Lunar probes and the Soviet ones was that the US made the error of assuming that vacuum was a good insulator - which is true, unless your TV subsystem goes live during launch, and HT fries everything possible in the nicely rarefied upper atmosphere! The Soviets, on the other hand, went for hermetically sealed compartments where their cruder electronics lived quite happily. I suspect that their clockwork spacecraft were lubricated with bicycle oil, too!

Bob Shaw
The Messenger
QUOTE (ngunn @ May 12 2006, 06:32 AM) *
I'm amazed that anyone's grumbling. As was pointed out at the time the extra data channel was designed for redundancy, not for extra data. Since the landing was successful and one channel worked the Huygens engineering team delivered the goods 100%: a wonderful example of human exploration working at full stretch - daring ambition, technical thoroughness and a real dash of luck with the place of landing. As for the pictures they're a superb first peek at a whole new and very strange environment - quite magic, I say.

On lucky landing locations: how likely was it that Spirit would descend right in the middle of dust devil alley?

100% is a stretch: the DWE was lost, and even though some of it was recovered by the big ears on Earth, there are still unconstrained parameters and a lot of guessing about the orientation and timing of the DISR images. Bruce also told us that some (all?) of the close-up spectra that were planned to be collected just before landing were lost - that hurts! Throw in the counter-rotation, and there was a significant loss of scientific data... albeit any and all complaints have to be tempered by the bonanza of information gathered... and, hopefully, some day shared.
BruceMoomaw
Some -- not all -- of the final pre-landing spectra, being taken at a high rate of speed and in large numbers, were lost. I doubt that the science loss from this was serious.
edstrick
Regarding Ranger: Ranger 6's television cameras' high-voltage electronics were destroyed during launch when hot gas backflow along the side of the rocket shorted out a pin on an umbilical socket, causing the TV system to cycle on and then off during passage through the wrong part of the atmosphere, where high-voltage breakdown (like in a neon tube) destroyed critical circuitry.

High-voltage arcing in hard vacuum is not a problem, but it has been a recurring problem in just-launched spacecraft, when high voltage was turned on in spacecraft or instruments before their in-vacuum outgassing had declined to sufficiently low rates that there was negligible gas inside the high-voltage circuitry. Zap. Sput. AAAAAERRRRRRGGGGGGG! <whimper> I THINK this was one factor in the post-launch failure of the Orbiting Astronomical Observatory OAO-1. I believe there was a full-up congressional investigation into the troubled and way-over-budget program. I never did see a full report/post-mortem on the flight and the failure. I'd be interested in a link to a scanned report.

Ranger 8 was intentionally left in its in-flight orientation and not reoriented to put the impact point in the cameras' field of view. This resulted in much greater coverage of the surface at high resolution, but (expected) image motion smear in the very last few images.
Richard Trigaux
QUOTE (edstrick @ May 13 2006, 08:35 AM) *
Regarding Ranger: Ranger 6's television cameras' high-voltage electronics were destroyed during launch when hot gas backflow along the side of the rocket shorted out a pin on an umbilical socket, causing the TV system to cycle on and then off during passage through the wrong part of the atmosphere, where high-voltage breakdown (like in a neon tube) destroyed critical circuitry.

...



High voltage too is a problem, as around a spacecraft (and inside it) there are always some residual gases. It is now part of space design practice to use low-outgassing materials and to pot and insulate dangerous or sensitive equipment. But this is not always possible.

Vacuum is also a problem in designing electronic equipment: there is no air convection in space, only radiation. Even in inhabited places, low gravity makes air much more insulating. So we must take special care with heat dissipation.

Once I designed a piece of equipment that was to be worn attached to an astronaut's belt, and they asked me to add a thermal cutoff, to shut the equipment down in case the box became too hot and burned the guy. Apparently it never tripped.

In another instance I worked on the maintenance of a furnace which was to be installed in the Shuttle bay, and which used vacuum as super thermal insulation. For this it had a large port hole, to allow easy escape of all gases. But they fitted this port hole with a quick-closing mechanism, driven automatically by a spring in case of a power cut, so that a malfunction of the furnace could not vent gases onto the other equipment around it.
ngunn
QUOTE (The Messenger @ May 13 2006, 01:40 AM) *
100% is a stretch: The DWE was lost, and even though some of this was recovered by the big ears on the earth, there are still unconstrained parameters and a lot of guessing about the orientation and timing of the DISR images. Bruce also told us that (some/all (?)) of the close spectra that were planned to be collected just before landing were lost - that hurts!


I only said that the two Huygens radio channels were DESIGNED for redundancy. I know they weren't actually used exactly that way. If I recall, the comment made at the time was something like: "That's the scientists trying to screw the system." I read it in a press report - it could have been New Scientist. If it's wrong I apologise. Still, I don't understand why the Doppler wind data wasn't on both channels. Why risk a whole experiment in this way?
Bob Shaw
QUOTE (ngunn @ May 15 2006, 11:44 AM) *
I only said that the 2 Huygens radio channels were DESIGNED for redundancy. I know they weren't actually used in that way exactly. If I recall the comment made at the time was something like: "That's the scientists trying to screw the system." I read it in a press report - could have been New Scientist. If it's wrong I apologise. Still I don't understand why the doppler wind data wasn't on both channels. Why risk a whole experiment in this way?


I think that's another case of disconnected mission planning, which appears to have been a bit more common on Cassini-Huygens than it should have been - not only did a radio link simply not get turned on, but they came close to losing the whole Huygens mission as a result of getting the Huygens-to-Cassini comms design wrong! OK, things like ITAR can't have helped - but the very presence of ITAR restrictions should have acted as a warning sign that any sort of interface issue (including comms) needed to be given a lot of attention. These are managerial failures - it's the job of the managers to direct the technical staff!

Bob Shaw
tedstryk
"If you launched an 8-bit camera, you'd get nothing except haze (which reduces contrast and hence usable dynamic range) all the way down to the vicinity of the surface."

You could actually do this if you had software to maximize the use of the 8 bits, so that the brightest parts of the image were at the top of the scale and the dimmest at the bottom. But it would be too risky - if the gain were calculated wrong, you could end up with pure white or pure black images!
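A minimal sketch of that auto-stretch idea, and of what a badly estimated gain does (all values illustrative):

```python
import numpy as np

# Sketch of the on-board auto-stretch idea described above: map the
# scene's actual min..max range onto the full 8-bit scale.
# The scene values are assumed, illustrative numbers.

scene = np.linspace(0.78, 0.82, 256)    # faint-contrast hazy scene

def stretch_to_8bit(img, lo, hi):
    """Map [lo, hi] onto 0..255, clipping anything outside."""
    scaled = (img - lo) / (hi - lo) * 255.0
    return np.clip(np.round(scaled), 0, 255).astype(np.uint8)

# With the right range estimate, the full 8-bit scale is used:
good = stretch_to_8bit(scene, scene.min(), scene.max())
print(good.min(), good.max())           # 0 255

# With a badly estimated gain, the whole frame saturates:
bad = stretch_to_8bit(scene, 0.0, 0.5)  # wrong guess for the range
print(np.unique(bad))                   # [255] -- a pure white image
```

Which is exactly the risk: the stretch only helps if the on-board gain estimate matches the scene, and a wrong guess clips everything to white or black.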

"MERs had descent imagers, but I only recall a few fuzzy pics. and those are not panoramas. "

The individual frames were better, but there were only three of them, and they were transmitted from the surface... we got lucky that Huygens survived landing, but that was not a mission requirement. I believe DISR is the first camera in history to transmit images during an atmospheric descent onto another world.

"If it's wrong I apologise. Still I don't understand why the doppler wind data wasn't on both channels. Why risk a whole experiment in this way?"

Actually, we were lucky in the channel we lost: the DWE data was recovered on the ground, but the other instruments besides DISR would have been complete losses.
ugordan
QUOTE (tedstryk @ May 15 2006, 01:40 PM) *
Actually, we are lucky in the channel we lost, as the DWE data was recovered on the ground, but the other instruments besides DISR, would have been complete losses.

I thought the reasoning behind the two channels was to provide redundancy, if the instrument team decided to exercise it. All the other instruments except DISR pretty much used this redundancy (as opposed to the DWE, for some reason); the DISR team instead wanted to cram in as many images as possible, so they used it as a 2x bandwidth increase. So I gather that had we lost the other channel instead, we'd still have gotten "all" the science (plus the Doppler wind data).
The Messenger
The Doppler Wind Experiment required ultra-stable oscillators in both the receiver and the transmitter, so the loss of Channel A reception (which is reported to have happened because the Channel A ultra-stable oscillator on Cassini was not switched on) doomed the data. So even if this data had been transferred to Channel B, it was garbage. Low-cost redundancy could have been incorporated by using a phase-locked loop on each channel, rather than ping-ponging between ultra-stable oscillators on one channel, and it would have been even more accurate.
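To put the oscillator requirement in numbers, a quick sketch. The 2 GHz carrier is an assumed round S-band figure for illustration, not the exact Huygens channel frequency:

```python
# Sketch of the Doppler wind measurement principle: a line-of-sight
# wind speed v shifts the received carrier frequency f by f * v / c.
# The ~2 GHz S-band carrier is an assumed, illustrative value.

C = 299_792_458.0        # speed of light, m/s
F_CARRIER = 2.0e9        # assumed S-band carrier, Hz

def doppler_shift_hz(wind_ms: float) -> float:
    """First-order Doppler shift for a line-of-sight speed in m/s."""
    return F_CARRIER * wind_ms / C

# A 1 m/s wind moves the carrier by only a few hertz...
print(round(doppler_shift_hz(1.0), 2))    # ~6.67 Hz

# ...so the link must hold better than ~3e-9 fractional frequency
# stability to resolve it -- hence the ultra-stable oscillators.
print(doppler_shift_hz(1.0) / F_CARRIER)  # ~3.3e-9
```

A few hertz out of two billion is why an ordinary oscillator on either end of the link swamps the wind signal entirely.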

It is inaccurate to state that the DWE was recovered by analyzing the signal from Earth: the current descent profile is not constrained in every axis, and is, at best, an approximation of wind directions and magnitudes. I don't think this profile has even been correlated with either the accelerometer or VLA triangulation data, but I would like to be wrong about this.

One last thought - I was under the impression that some of the Doppler wind results (lateral winds, not dependent on the two oscillators) were supposed to be transmitted on both the A and B channels, so the lateral winds should have been characterized; and I am under the impression that David Atkinson was puzzled that they ended up with garbage in all of the B channel Doppler wind data. This should have provided the lateral constraints, and with the A channel data recovered by the big ears on Earth, the wind profile would have been complete.
tedstryk
QUOTE (ugordan @ May 15 2006, 12:47 PM) *
I thought the reasoning behind two channels was to provide redundancy, if the instrument team decided to exercise it. All other instruments except DISR pretty much used this redundancy (opposed to the DWE, for some reason), the DISR team instead wanted to cram as many images as possible so they used it as a 2x bandwidth increase. So I gather that had we lost the other channel instead, we'd still get "all" the science (plus the Doppler wind data).


You are correct. I was thinking that they used all of channel A for extra images, but it was only the space that would have been allocated to imagery, and the other instruments did indeed use it for redundancy.
BruceMoomaw
Once again: a few of the final pre-landing DISR spectra -- taken at high speed -- were also stored only on Channel A and thus lost. But there was enough duplication in those spectra that I doubt anything at all important was lost.
ugordan
QUOTE (BruceMoomaw @ May 16 2006, 02:28 AM) *
Once again: a few of the final pre-landing DISR spectra -- taken at high speed -- were also stored only on Channel A and thus lost.

This is news to me. What was to be gained by transmitting those spectra on only one channel? It seems an unnecessary risk with no real benefit.

Hm... come to think of it, does the fact that channel B transmits telemetry with a roughly six-second delay relative to channel A have anything to do with it? High-speed, pre-landing spectra would be lost if the probe were destroyed on impact, since delayed information wasn't expected to be transmitted at all; hence the channel-A-only option.
The Messenger
What is really sad is that no one anticipated the long life of Huygens on the surface:

Hello Huygens, ya, this is Cassini, Aw got nothin but goose eggs on ma '"A" Channel reciever, would you flop what you'all got left on 'A' Channel over to 'B', and resend all you can until you fade?
ugordan
QUOTE (The Messenger @ May 16 2006, 02:50 PM) *
What is really sad, is no one anticipated the long life of Huygens on the surface:

First, Huygens was an atmospheric probe. Anything sent back from the surface would be merely bonus science.
Second, the impact speed was not trivial at all and it was uncertain whether the probe would ever survive it.
Third, it was probably seriously considered back then that Titan's surface might be covered with oceans or lakes of liquid methane. There was a distinct possibility the probe would land in liquid. If that happened, the probe's on-surface life would most likely be cut to only a few minutes as the cryogenically cold methane froze it to death.

From an engineering point of view, it was unrealistic to prepare for an extended surface mission, especially since no great scientific benefit would have resulted from a greatly prolonged surface life.
If, on the other hand, they had anticipated (and prepared for) the probe's survival and it had crashed instead, some people would have had a nice new reason to bash ESA over a failed "lander".
djellison
And of course the baseline mission at the time of launch had Cassini off over the horizon much, much quicker after landing than the altered mission design did.

Of course, we're making assumptions. How much memory was on board Huygens? Would it even have been possible to retain all the data from both channels for the entire sequence up to the point of landing, and then begin transmitting it all again on swapped channels (for ultimate redundancy), perhaps only for imagery? So: could ALL the imagery have been retained on board for the duration of EDL? Was the ability to reprogram Huygens post-launch to that extent even possible?

Doug
ugordan
QUOTE (djellison @ May 16 2006, 03:04 PM) *
How much memory was on board Huygens? Would it even have been possible to retain all the data from both channels for the entire sequence up to the point of landing, and then begin transmitting it all again on swapped channels (for ultimate redundancy), perhaps only for imagery? So: could ALL the imagery have been retained on board for the duration of EDL? Was the ability to reprogram Huygens post-launch to that extent even possible?

The ability to store all onboard data in a buffer for later playback would serve one purpose only: redundancy. The DISR team already had the option to use redundancy and chose not to. The buffer redundancy would only actually be redundant if the probe survived the landing, so it's a kind of maybe-redundant system. It was probably possible (I imagine a one-megabit buffer would more than suffice), but really, would you re-send all the EDL imagery back once you're on the surface, or would you send surface data instead and, just maybe, watch the probe slowly ride the Titanian waves or the like...
If you re-sent all the images again after landing, but on the opposite channel from before, you'd effectively make the folks who provided two channels for REDUNDANCY purposes look like idiots, wouldn't you?
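For what it's worth, the buffer question lends itself to a quick back-of-envelope calculation. The ~8 kbit/s per-channel rate is the commonly quoted Huygens figure, and the descent duration is rounded, so treat both as assumptions:

```python
# Rough sizing of a "replay everything" EDL buffer (illustrative numbers).
RATE_BPS = 8_192        # approximate per-channel downlink rate, bits/s
DESCENT_S = 2.5 * 3600  # roughly 2.5-hour descent, in seconds

total_bits = RATE_BPS * DESCENT_S  # one channel's worth of EDL data
total_mbytes = total_bits / 8 / 1e6
print(f"~{total_bits / 1e6:.0f} Mbit (~{total_mbytes:.1f} MB) per channel")
```

So a buffer holding one channel's full EDL stream would be on the order of ten megabytes; whether memory of that size was feasible on a probe designed in the early 1990s is another question.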
Richard Trigaux
What is perfectly redundant, on the contrary, is the view of the surface, which is present maybe more than 100 times.

I wonder whether it would be possible to build a super-resolution view from all these single images? It would be possible, provided that Huygens moved slightly during its time on the ground, for whatever reason (soil melting underneath, wind, internal vibrations...).

But for this we would need to know the encoding algorithm for the images. It seems they used a tremendous compression method, with only two colours (black and white), so that the images are mere sequences of bits.

To produce their final image, it is likely that they simply did a statistical averaging of all the available images, but that is still more blurred than a true super-resolution result.
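For anyone curious, the difference between plain averaging and super-resolution comes down to exploiting sub-pixel shifts between frames. A minimal shift-and-add sketch (NumPy; in practice the shifts would have to be estimated from the images themselves):

```python
import numpy as np

def shift_and_add(frames, shifts, upscale=2):
    """Naive super-resolution: place each sub-pixel-shifted frame onto a
    finer grid and average. `shifts` are (dy, dx) in original-pixel units."""
    h, w = frames[0].shape
    fine_h, fine_w = h * upscale, w * upscale
    acc = np.zeros((fine_h, fine_w))
    count = np.zeros((fine_h, fine_w))
    for frame, (dy, dx) in zip(frames, shifts):
        # nearest-neighbour placement of each coarse pixel on the fine grid
        ys = (np.arange(h)[:, None] * upscale + round(dy * upscale)) % fine_h
        xs = (np.arange(w)[None, :] * upscale + round(dx * upscale)) % fine_w
        acc[ys, xs] += frame
        count[ys, xs] += 1
    return acc / np.maximum(count, 1)
```

Averaging identically-aligned frames only beats down noise (and helps dynamic range); it is the sub-pixel motion between frames that lets the finer grid recover real detail.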
djellison
What was released soon after landing was very highly stretched JPEGs (much like the MER JPEGs).

Once the proper 12-bit images are released to the PDS (this summer, I believe), it might be worth trying to do something with the data w.r.t. super-res imaging.

As Jim Bell and I mentioned in a Pancam update, super-res with JPEGs just isn't worth bothering with, really.

Doug
tedstryk
Actually, the full-quality versions were out for a while. The problem is that the compression patterns (from Huygens, not something done on Earth) are too similar because the camera didn't move at all. Therefore, super-res processing and stacking does little. You need a bit of camera motion for it to be of use.
djellison
I've not seen any Huygens stuff on the PDS at all, Ted - what 'full quality' do you mean?

Also - the SSP team said that the vehicle did move after landing. Not a lot, but still...

" The probe had penetrated about 10 cm into surface, and settling gradually by a few millimetres after landing and tilting by a fraction of a degree"

It might not be much, but it's something - it depends on the angular resolution of the camera, I suppose - but if nothing else, stacking a lot of images of exactly the same spot should help increase dynamic range.

Doug
The Messenger
QUOTE (ugordan @ May 16 2006, 08:00 AM) *
From an engineering point of view, it was unrealistic to prepare for an extended surface mission, given also the fact that no great scientific benefit would result from greatly prolonged surface life.
If, on the other hand, they anticipated (and prepared for) the probe's survival and instead it crashed, some people would get a nice new reason to bash ESA on a failed "lander".

I don't agree. If you follow the Cassini event logs, they have recently added programming that allows retransmission of critical data in the event of a loss during transmission. During T-13, an ultra-stable clocking system on one channel switched off, and much of the close-pass data would have been lost if Cassini engineers had not planned for this contingency. This was not an expensive programming change in terms of computer resources and real estate, just a thoughtful acceptance of the fact that seemingly unnecessary redundancy is a virtue.

There is no doubt this decision was prompted by the transmission failure during T-7(?), and also influenced by the loss of data from Huygens. In the case of Huygens, no, they did not have big enough buffers to retransmit all of the data, but they did have a seven-year flight, during which they had plenty of time to contemplate contingencies; and they did know, after changing the deployment scheme, that there might be more ground time. Retransmitting critical data, even if the original data was fully redundant, would have been a very cheap safety play. So what if Cassini wasn't even listening? And speaking of not listening, my clients and servers do an automatic cycle reset when they don't see any valid data within a specified timeout period before giving up. Why was Cassini so dumb?
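The timeout-and-reset pattern described above is simple enough to sketch. Here `poll_fn` and `reset_fn` are hypothetical callbacks standing in for "check the receiver for valid data" and "cycle the link":

```python
import time

def receive_with_reset(poll_fn, reset_fn, timeout_s=5.0, max_resets=3):
    """Watchdog loop: poll for valid data; if nothing arrives within
    timeout_s, cycle the link (reset_fn) and listen again."""
    for attempt in range(max_resets + 1):
        deadline = time.monotonic() + timeout_s
        while time.monotonic() < deadline:
            data = poll_fn()
            if data is not None:
                return data
            time.sleep(0.01)
        if attempt < max_resets:
            reset_fn()  # no valid data this cycle: reset and retry
    return None  # gave up after max_resets cycles
```

Whether such autonomy would have been wise on a one-shot relay pass is, of course, exactly the point argued below in the thread.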

Finally, if ESA had not revealed the loss of channel A data, would the Cassini team have taken the time to plan for such a contingency during T-13? What lessons are there to learn, buried in the secret report on the DART debacle? Let there be light.
Bob Shaw
QUOTE (ugordan @ May 16 2006, 03:46 PM) *
If you re-sent all the images again after landing, but on the opposite channel from before, you'd effectively make the folks who provided two channels for REDUNDANCY purposes look like idiots, wouldn't you?


I think the appropriate term is 'belt and braces' (well, in the UK, anyway!).

Bob Shaw
ugordan
QUOTE (The Messenger @ May 16 2006, 06:15 PM) *
If you follow the Cassini event logs, they have recently added programming that allows retransmission of critical data in the event of a loss during transmission.

Recently? Not a chance. They had this ability from the start. This was more of a mission requirement than a contingency capability. That's the sort of contingency used, for example, when there's heavy rain over the DSN station and the data doesn't all come down in one piece. It had nothing to do with either Huygens or T-7. Cassini's SSR has two pointers, a record pointer and a playback pointer. The two can be moved around to accommodate various changes. That's an oversimplification, but you get the point. Read Emily's blog on the recent anomaly.
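To make the two-pointer description concrete, here is a toy model (a sketch of the concept only, not how the Cassini SSR flight software is actually implemented):

```python
class SolidStateRecorder:
    """Toy recorder with independent record and playback pointers,
    loosely modelled on the description above."""
    def __init__(self, size):
        self.mem = [None] * size
        self.size = size
        self.record_ptr = 0    # next slot to record into
        self.playback_ptr = 0  # next recorded frame to play back

    def record(self, frame):
        # Wraps around; a real recorder must guard against overwriting
        # data that has not yet been played back.
        self.mem[self.record_ptr % self.size] = frame
        self.record_ptr += 1

    def playback(self):
        if self.playback_ptr < self.record_ptr:
            frame = self.mem[self.playback_ptr % self.size]
            self.playback_ptr += 1
            return frame
        return None  # caught up with the record pointer

    def rewind(self, n):
        """Move the playback pointer back n frames to retransmit them,
        e.g. after a rain fade at the DSN station."""
        self.playback_ptr = max(0, self.playback_ptr - n)
```

Because the two pointers move independently, recording new science and replaying earlier data are decoupled; retransmission is just a pointer move, not a special data path.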
QUOTE
There is no doubt this decision was prompted by the transmission failure during T-7(?), and also influenced by the loss of data from Huygens.

As I said, this had nothing to do with it.

QUOTE
In the case of Huygens, no, they did not have big enough buffers to retransmit all of the data, but they did have a seven-year flight, during which they had plenty of time to contemplate contingencies; and they did know, after changing the deployment scheme, that there might be more ground time. Retransmitting critical data, even if the original data was fully redundant, would have been a very cheap safety play.

Just how exactly do you propose transmitting critical data again if you don't have a big enough buffer, as you yourself admit above? The second channel WAS a contingency in itself. And it worked.

QUOTE
So what if Cassini wasn't even listening? And speaking of not listening, my clients and servers do an automatic cycle reset when they don't see any valid data within a specified timeout period before giving up. Why was Cassini so dumb?

Do you really think giving the S/C that much autonomy at mission-critical times is anywhere near wise? Cassini was probably in an effective safe mode during the probe relay, and its only orders were to listen, record and play back. Giving it an option to automatically switch circuits on or off might have done more harm than good.
Spacecraft designers aren't that naive; sometimes, less is more.

Also, your client-server analogy reminds me of comparisons of the sort "my digital camera would blow away the Huygens DISR"... Comparisons like that don't really do justice to anything.

QUOTE
Finally, if ESA had not revealed the loss of channel A data, would the Cassini team have taken the time to plan for such a contingency during T-13? What lessons are there to learn, buried in the secret report on the DART debacle? Let there be light.

You are trying to find connections in places where there simply aren't any. Period.
alan
QUOTE (djellison @ May 16 2006, 11:22 AM) *
I've not seen any Huygens stuff on the PDS at all Ted - what 'full quality' do you mean?

Doug

DISR had them on their site, briefly. I heard ESA had them pulled, and stretched versions were put on their Huygens site. Most of the amateur mosaics were based on copies of the images posted at the DISR site. There are still links to them on the Liekens site. http://anthony.liekens.net/index.php/Main/Huygens