Full Version: Perseverance Imagery
Unmanned Spaceflight.com > Mars & Missions > Perseverance- Mars 2020 Rover
Greenish
Putting this here for reference: the payload of the JSON feed link for Perseverance raws (see source in the other thread here)
https://mars.nasa.gov/rss/api/?feed=raw_ima...;&extended=
Looks like there's a ton of good data in addition to just the (PNG! Bayer color separated!) images biggrin.gif

This is for the first image shown on the page at the moment.
CODE
"images":
    [
        {
        "extended":
            {
            "mastAz":"UNK",
            "mastEl":"UNK",
            "sclk":"667129493.453",
            "scaleFactor":"4",
            "xyz":"(0.0,0.0,0.0)",
            "subframeRect":"(1,1,1280,960)",
            "dimension":"(1280,960)"
            },
        "sol":2,
        "attitude":"(0.415617,-0.00408664,-0.00947025,0.909481)",
        "image_files":
            {
            "medium":"https://mars.nasa.gov/mars2020-raw-images/pub/ods/surface/sol/00002/ids/edr/browse/rcam/RRB_0002_0667129492_604ECM_N0010052AUT_04096_00_2I3J01_800.jpg",
            "small":"https://mars.nasa.gov/mars2020-raw-images/pub/ods/surface/sol/00002/ids/edr/browse/rcam/RRB_0002_0667129492_604ECM_N0010052AUT_04096_00_2I3J01_320.jpg",
            "full_res":"https://mars.nasa.gov/mars2020-raw-images/pub/ods/surface/sol/00002/ids/edr/browse/rcam/RRB_0002_0667129492_604ECM_N0010052AUT_04096_00_2I3J01.png",
            "large":"https://mars.nasa.gov/mars2020-raw-images/pub/ods/surface/sol/00002/ids/edr/browse/rcam/RRB_0002_0667129492_604ECM_N0010052AUT_04096_00_2I3J01_1200.jpg"
            },
        "imageid":"RRB_0002_0667129492_604ECM_N0010052AUT_04096_00_2I3J01",
        "camera":
            {
            "filter_name":"UNK",
            "camera_vector":"(-0.7838279435884001,0.600143487448691,0.15950407306054173)",
            "camera_model_component_list":"2.0;0.0;(46.176,2.97867,720.521);(-0.701049,0.00940617,0.713051);(8.39e-06,0.0168764,-0.00743155);(-0.00878744,-0.00869157,-0.00676256);(-1.05782,-0.466472,-0.724517);(-0.702572,0.0113481,0.711523);(-448.981,-528.002,453.359)",
            "camera_position":"(-1.05782,-0.466472,-0.724517)",
            "instrument":"REAR_HAZCAM_RIGHT",
            "camera_model_type":"CAHVORE"
            },
        "caption":"NASA's Mars Perseverance rover acquired this image of the area in back of it using its onboard Rear Right Hazard Avoidance Camera. \n\n This image was acquired on Feb. 21, 2021 (Sol 2) at the local mean solar time of 15:37:11.",
        "sample_type":"Full",
        "date_taken_mars":"Sol-00002M15:37:11.985",
        "credit":"NASA/JPL-Caltech",
        "date_taken_utc":"2021-02-21T02:16:26Z",
        "json_link":"https://mars.nasa.gov/rss/api/?feed=raw_images&category=mars2020&feedtype=json&id=RRB_0002_0667129492_604ECM_N0010052AUT_04096_00_2I3J01",
        "link":"https://mars.nasa.gov/mars2020/multimedia/raw-images/?id=RRB_0002_0667129492_604ECM_N0010052AUT_04096_00_2I3J01",
        "drive":"52",
        "title":"Mars Perseverance Sol 2: Rear Right Hazard Avoidance Camera (Hazcam)",
        "site":1,
        "date_received":"2021-02-21T23:12:58Z"
        },



Here's hoping that one of you skilled characters can make good use...
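For anyone wanting to script against it, here's a minimal Python sketch that pulls the full-resolution PNG URLs out of one page of the feed. The field names come straight from the excerpt above; everything else (error handling, paging) is left out:

```python
import json
import urllib.request

# Base feed URL as used elsewhere in this thread
FEED_URL = "https://mars.nasa.gov/rss/api/?feed=raw_images&category=mars2020&feedtype=json"

def full_res_urls(feed):
    """Pull the full-resolution PNG URL out of each image record.

    Field names ("images", "image_files", "full_res") are taken from
    the JSON excerpt above.
    """
    return [img["image_files"]["full_res"] for img in feed["images"]]

def fetch_feed(url=FEED_URL):
    """Download and parse one page of the raw-image feed (needs network access)."""
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)
```

From there the list of URLs can be handed to whatever downloader you like.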
phase4
A small Perseverance update for Marslife: implementing the camera pointing data is a bit troublesome.
It suffers from hit-and-miss accuracy. biggrin.gif

Greenish
So cool!

Maybe they're still refining which way is up... lots of local coordinate frames to sort out and probably need some time (and the sun shots it looks like they just took, and some 3d processing of the local area, and some radio ranging) to get absolute position & orientation info.

The instrument position details will certainly be different vs MSL. And in my skim of the data it looked to me like they re-jiggered some of the pointing data formats, did you see that too? Not sure which parts you're using.
djellison
For the less technically minded among us... Ryan Kinnett has put up a page that grabs a listing of links to the PNG files, which you can then feed to any batch-download browser plugin

https://twitter.com/rover_18/status/1364309922167488512

I tried Firefox with 'DownThemAll' and it worked perfectly.

Meanwhile THIS GUY has python code to also grab the data
https://twitter.com/kevinmgill/status/1364311336000258048




ugordan
Speaking of raw images, are these weird colorations due to something wrong with the pipeline or the cameras themselves (I'm thinking and hoping it's the former)?
Examples attached, one is a NavCam-L from yesterday (seems to have been pulled since), the other an RDC frame.
mcaplinger
QUOTE (ugordan @ Feb 23 2021, 12:45 PM) *
Speaking of raw images, are these weird colorations due to something wrong with the pipeline...

Yes.
ugordan
QUOTE (mcaplinger @ Feb 23 2021, 09:56 PM) *
Yes.

That's reassuring to hear. One could sort of expect this from (well, almost) off-the-shelf commercial EDL cameras, but the navcams are on a whole other level.

QUOTE (Steve G @ Feb 23 2021, 09:57 PM) *
New raw images just trickling in.

Yep, looks like they started downlinking the full-quality, Bayered RDC camera views.

On a side note, I found it interesting how the color balance between RDC and DDC was so different. Yes, different detector resolutions, but same vendor.
phase4
Yes, I hope this erroneous positioning will improve when the rover is calibrated properly. Otherwise my code is to blame... biggrin.gif
I use the rover attitude, mastAz and mastEl values for camera pointing; I don't know if that has changed much since Curiosity.
nprev
Hey, all. This thread is for the imagewizards among us & will focus on the abundance of data Mars 2020 will provide. Please post your products, methods, and tips, and use this thread to share information & likewise learn from others. smile.gif
bdshort
Okay, I hope this is the right forum for this question - one of the things I really like about both rovers being able to take stereo imagery is the ability to see stuff in 3D. Is there a utility out there that lets us view images in 3D using a VR headset? I've briefly searched around but haven't seen much. Anaglyphs are cool and everything, but they leave a bit to be desired.
Cargo Cult
QUOTE (bdshort @ Feb 24 2021, 07:01 AM) *
Okay, I hope this is the right forum for this question - one of the things I really like about both rovers being able to take stereo imagery is the ability to see stuff in 3D. Is there a utility out there that lets us view images in 3D using a VR headset? I've briefly searched around but haven't seen much. Anaglyphs are cool and everything, but they leave a bit to be desired.

Stereo imagery viewed in a VR headset is a bit underwhelming - reconstructing geometry using photogrammetry to create a fully 3D representation of a landscape is much more interesting.

I did this with some Curiosity imagery a few years ago, with fascinating results. If you have a SteamVR capable VR headset, you can have a look here: https://steamcommunity.com/sharedfiles/file...s/?id=928142301 - I typed up some fairly detailed notes in the description which will broadly apply to Perseverance imagery.

(Full disclosure: I work for Valve, creators of SteamVR. The Mars stuff was a fun personal project which turned into something a bit larger...)

Getting right back on topic for this thread - some notes on photogrammetry involving Perseverance imagery!

Some decent camera parameters* to start with in Agisoft Metashape (formerly PhotoScan):

Navcam
Camera type: Fisheye
Pixel size (mm): 0.0255 x 0.0255 (for 1280x960 images)
Focal length (mm) 19.1

Hazcam
Camera type: Fisheye
Pixel size (mm): 0.0255 x 0.0255 (for 1280x960 images)
Focal length (mm) 14

Using separate calibration profiles for left and right cameras may make sense - stuff worked better for Curiosity's navcams when I did this. (They're beautifully hand-made one-off scientific instruments, after all.) Metashape will further refine camera parameters once given these reasonable starting points.

It's all looking like really exciting data to play around with - navcam imagery in high resolution and full colour after a bit of processing. I'm starting to figure out the debayering stuff - I'm sure this thread will be of great use!



* Derived from "The Mars 2020 Engineering Cameras and Microphone on the Perseverance Rover: A Next-Generation Imaging System for Mars Exploration", Table 2: Perseverance Navcam, Hazcam, and Cachecam characteristics
bdshort
Great! Thanks for the answer. I have an Index, so I'll check that out!

Edit: Just tried it - that's awesome! Gives a great sense of scale to the rover, and I liked all the point-of-interest spots. I'm surprised there isn't more stuff out there like this; it would be a fantastic educational tool, and it's just fun to stand or sit around in for a while.

QUOTE (Cargo Cult @ Feb 24 2021, 04:44 AM) *
Stereo imagery viewed in a VR headset is a bit underwhelming - reconstructing geometry using photogrammetry to create a fully 3D representation of a landscape is much more interesting.
scalbers
QUOTE
There are what I assume to be quaternions (?) regarding attitude in the search/query api. see https://mars.nasa.gov/rss/api/?feed=raw_ima...04096_034085J01 for example. There are altazimuth compass bearings and elevation for the regular people as well, which could be used to project the images onto a compass ball according to the metadata, rather than manually overlapping and tie-pointing the images. I do not know of software or libraries that can help with this; if you do, please tell.


Regarding the API "quaternion" coordinate info with the raw images, the Linux program 'jq' is good for manipulating these. While this is just a starting point, it would be interesting to see how an automated program might handle putting together a mosaic. I would just need some time to hook up some of my Fortran code with some scripts.
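If Python is handier than jq for this, the parenthesised number strings used throughout the metadata (attitude, xyz, camera_vector, ...) can be unpacked with a small helper - a sketch that assumes only the string format shown in the excerpts above:

```python
import re

# Matches signed decimals with optional exponent, e.g. "8.39e-06"
_NUM = re.compile(r"[-+]?[0-9.]+(?:[eE][-+]?[0-9]+)?")

def parse_tuple(s):
    """Parse a feed value like "(0.415617,-0.00408664,-0.00947025,0.909481)"
    into a tuple of floats."""
    return tuple(float(x) for x in _NUM.findall(s))
```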

Cargo Cult - can your SteamVR creation be viewed somehow on a regular computer? With Curiosity, sittingduck (YouTube link below) had some really nice videos moving through a 3D landscape that might be interesting to see in VR. Sittingduck back in 2016 had used a Blender plug-in obtained from phase4.

https://www.youtube.com/watch?v=7zW9yISB01Y...eature=youtu.be
fredk
About image pointing, the algorithm I use for MSL, based on this post, seems to work fine for Percy, with the "rover_attitude" quaternion field replaced with "attitude". So for the "shiny rock" image:
https://mars.nasa.gov/mars2020-raw-images/p...6_034085J01.png
I get elevation,azimuth = 1.2, 251.5 degrees, which looks about right. That presumably is for the centre of the FOV, which happens to correspond almost exactly with the shiny rock.
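For anyone who wants to experiment with this kind of pointing calculation, here is a rough Python sketch. Be warned that the (w,x,y,z) ordering of the "attitude" quaternion, the rotation direction, and the X-north/Y-east/Z-down site frame are all assumptions on my part - check the result against a known image like the one above before trusting it:

```python
import math

def quat_rotate(q, v):
    """Rotate vector v by quaternion q = (w, x, y, z).
    NOTE: the (w,x,y,z) ordering of the JSON 'attitude' field is an assumption."""
    w, x, y, z = q
    vx, vy, vz = v
    # v' = v + w*t + u x t, where u = (x,y,z) and t = 2*(u x v)
    tx = 2 * (y * vz - z * vy)
    ty = 2 * (z * vx - x * vz)
    tz = 2 * (x * vy - y * vx)
    return (vx + w * tx + (y * tz - z * ty),
            vy + w * ty + (z * tx - x * tz),
            vz + w * tz + (x * ty - y * tx))

def az_el_deg(attitude, camera_vector):
    """Elevation and azimuth of the camera boresight, assuming an
    X-north, Y-east, Z-down rover/site frame convention."""
    n, e, d = quat_rotate(attitude, camera_vector)
    az = math.degrees(math.atan2(e, n)) % 360.0
    el = math.degrees(math.asin(max(-1.0, min(1.0, -d))))
    return el, az
```

With the "attitude" and "camera_vector" values parsed from the JSON, az_el_deg(attitude, camera_vector) should give the boresight elevation and azimuth for the FOV centre.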
djellison
It's not great - but these are my Agisoft Metashape results with the Navcam images so far
https://sketchfab.com/3d-models/m2020-landi...7215aa7db2fb0c8
scalbers
QUOTE (djellison @ Feb 25 2021, 02:04 AM) *
It's not great - but these are my Agisoft Metashape results with the Navcam images so far
https://sketchfab.com/3d-models/m2020-landi...7215aa7db2fb0c8


Looks pretty nice given the available vantage points (one rover location). Is there a possibility the "1st Person" navigation mode would work with this model?
MarT
QUOTE (Cargo Cult @ Feb 24 2021, 02:44 PM) *
Stereo imagery viewed in a VR headset is a bit underwhelming - reconstructing geometry using photogrammetry to create a fully 3D representation of a landscape is much more interesting.

I did this with some Curiosity imagery a few years ago, with fascinating results. If you have a SteamVR capable VR headset, you can have a look here: https://steamcommunity.com/sharedfiles/file...s/?id=928142301 - I typed up some fairly detailed notes in the description which will broadly apply to Perseverance imagery.

(Full disclosure: I work for Valve, creators of SteamVR. The Mars stuff was a fun personal project which turned into something a bit larger...)



That's awesome! Great to see you here. I just ran into that two days ago. It is something I always wanted to do, but never quite found the time to work on for a longer period. I have tried photogrammetry on InSight; this is the result from two years ago: https://www.youtube.com/watch?v=cBYAwTm_ArE...eature=youtu.be

Big thanks for the info here and on Steam! It will surely be very helpful to others too.

To stay on topic: I had a run with a short sequence of the true raw data from the down-look cams: https://www.youtube.com/watch?v=l4WKIoTjE4c...p;pbjreload=101
I will just wait for all the EDL data to download and will have a run with the data in Agisoft. I tried with the MARDI cam. While it was capable of clearly registering the images, I must have done something wrong, as the last image ended up "under the ground". Anyway the Perseverance data surely look promising in that matter.

I am working on a stabilized 360 video as with the MARDI cam. This is a quick version of the underlying layer, simulating approximately the view from 10 km above the surface. I made it from the debayered data. Probably impacted by the FFmpeg compression.
phase4
QUOTE (fredk @ Feb 25 2021, 02:13 AM) *
About image pointing, the algorithm I use for MSL, based on this post, seems to work fine for Percy, with the "rover_attitude" quaternion field replaced with "attitude".

Thank you for this hint, fredk. I didn't realize the Spice toolkit works just fine without kernels.
For future use, does anyone know how to obtain the zoom value from the JSON information? Should it be derived from the CAHVOR data?

Click to view attachment
mcaplinger
QUOTE (phase4 @ Feb 25 2021, 01:00 PM) *
For future use, does anyone know how to obtain the zoom value from the JSON information? Should it be derived from the CAHVOR data?

Should be if the CAHVOR model is set correctly, which they may or may not be at this point. See https://github.com/bvnayak/CAHVOR_camera_model and https://agupubs.onlinelibrary.wiley.com/doi...29/2003JE002199
John Whitehead
Do color calibration targets ever fade (change color) from solar (and worse) radiation on Mars?
How are they tested on Earth and proven to not fade?
mcaplinger
QUOTE (John Whitehead @ Feb 25 2021, 03:03 PM) *
Do color calibration targets ever fade (change color) from solar (and worse) radiation on Mars?
How are they tested on Earth and proven to not fade?

https://mastcamz.asu.edu/mars-in-full-color/
QUOTE
At the University of Winnipeg, the effect of intense Mars-like ultraviolet (UV) light on the colors of the eight materials was studied, confirming that the materials will only change very little with UV-exposure through a long mission at the Martian surface.

Typically, they get dusty before fading would be an issue. Hopefully these magnets work better than the last time.
JohnVV
just came across a paper on the cameras and mic

"The Mars 2020 Engineering Cameras and Microphone on the Perseverance Rover: A Next-Generation Imaging System for Mars Exploration"

https://link.springer.com/article/10.1007/s11214-020-00765-9
phase4
QUOTE (mcaplinger @ Feb 25 2021, 10:56 PM) *
Should be if the CAHVOR model is set correctly, which they may or may not be at this point. See https://github.com/bvnayak/CAHVOR_camera_model and https://agupubs.onlinelibrary.wiley.com/doi...29/2003JE002199


Thanks for the links. I tried the code but it gave very low values and fluctuating results for the Mastcam-Z focal length.
Will try again when new images arrive.


mcaplinger
QUOTE (phase4 @ Feb 26 2021, 04:32 AM) *
Thanks for the links. I tried the code but it gave very low values and fluctuating results for the Mastcam-Z focal length.

Seems like, for the one case I looked at, that the values in the JSON are in the order VORCAH instead of CAHVOR as one would expect. I don't know if this is intentional or a bug. And I'm not sure if these models are actually correct anyway. But you could take a look.
djellison
Jim has put out a Mastcam-Z filename decode guide

https://mastcamz.asu.edu/decoding-the-raw-p...mage-filenames/

It includes three digits that encode the focal length in mm.

All the stereo pan images report 34mm for that - which matches the widest angle of Mastcam-Z (edit: MSL Mastcam Left).
mcaplinger
QUOTE (djellison @ Feb 26 2021, 09:30 AM) *
It includes digits in there that describe the focal length in mm as three digits.

Oh yeah, duh. Well, it was much more interesting to extract it from the camera model unsure.gif

FYI, widest angle of MCZ is 26mm.
djellison
Ahh ok - they misspoke at yesterday's thing then. This matches MSL Mastcam Left then smile.gif
mcaplinger
QUOTE (djellison @ Feb 26 2021, 09:45 AM) *
This matches MSL Mastcam Left then smile.gif

Correct. Original MCZ spec was for 34mm to 100mm but we ended up with some extra credit range and can do 26mm to 110mm. Typically values 26, 34, 48, 63, 79, 100, and 110 will be used, but AFAIK nobody is sure yet how they'll be chosen. Zoom has to be stowed at 26mm for driving, so that might motivate use of that setting sometimes.
phase4
QUOTE (mcaplinger @ Feb 26 2021, 05:44 PM) *
Seems like, for the one case I looked at, that the values in the JSON are in the order VORCAH instead of CAHVOR as one would expect.

So close. Processing the CAHVOR as VORCAH gave more plausible results, although one decimal off.
It was a fun exercise but no longer necessary... thanks for posting the decode guide Doug!

Click to view attachment
mcaplinger
QUOTE (phase4 @ Feb 26 2021, 10:51 AM) *
So close. Processing the CAHVOR as VORCAH gave more plausible results, although one decimal off.

Pixel size is 7.4e-6 m = 7.4e-3 mm, is that what you were using?
phase4
QUOTE (mcaplinger @ Feb 26 2021, 08:00 PM) *
Pixel size is 7.4e-6 m = 7.4e-3 mm, is that what you were using?


Ah thank you, I used 0.074 mm instead of 0.0074. Things are obviously ok now. biggrin.gif
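Putting this exchange together for future readers: the horizontal scale factor hs of a CAHV/CAHVOR model is |A x H| in pixels, and multiplying by the pixel pitch gives the focal length in mm. A pure-Python sketch (the 0.0074 mm pitch is from the post above; the component-ordering caveat discussed earlier still applies):

```python
def cahvor_focal_length_mm(a, h, pixel_size_mm=0.0074):
    """Focal length from a CAHV/CAHVOR model.

    |A x H| is the horizontal scale factor hs in pixels (assuming A is a
    unit vector); multiplying by the pixel pitch gives millimetres.
    0.0074 mm is the Mastcam-Z pixel size quoted above.
    """
    # Cross product A x H, component by component
    cx = a[1] * h[2] - a[2] * h[1]
    cy = a[2] * h[0] - a[0] * h[2]
    cz = a[0] * h[1] - a[1] * h[0]
    hs = (cx * cx + cy * cy + cz * cz) ** 0.5
    return hs * pixel_size_mm
```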
fredk
In a previous post I estimated the horizontal FOV to be 19.5 degrees for an image with a "34" focal length filename field. I said there that that was near the maximum FOV of 19.2 deg, but I mistakenly used vertical FOV, not horizontal. My measured 19.5 deg corresponds very closely to the max horizontal 25.6 scaled from 26 to 34 mm, so everything is consistent with a 34 mm focal length (apart from the statement yesterday as Doug mentioned).

And we seem to have a little bonus of around 1608 horizontal pixels, vs the 1600 photoactive stated in Bell et al.
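That scaling can be checked with the simple pinhole relation FOV = 2*atan(w/2f), holding the sensor width w fixed - a sketch that ignores lens distortion:

```python
import math

def scaled_fov_deg(fov_deg, f_from_mm, f_to_mm):
    """Rescale a field of view to a new focal length under a pinhole model
    (FOV = 2*atan(w/2f) with fixed sensor width w; distortion ignored)."""
    half = math.radians(fov_deg) / 2
    return 2 * math.degrees(math.atan(math.tan(half) * f_from_mm / f_to_mm))

# 25.6 deg max horizontal FOV at 26 mm, rescaled to 34 mm -> about 19.7 deg,
# consistent with the measured 19.5 deg
fov_34 = scaled_fov_deg(25.6, 26.0, 34.0)
```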
MahFL
Is there a way to tell the zoom used from the Mastcam-Z file names?
mcaplinger
QUOTE (MahFL @ Feb 27 2021, 03:19 PM) *
Is there a way to tell the zoom used from the Mastcam-Z file names?

Yes. See post #25 in this thread.
Greenish
It appears the JSON data for the Sol 9 images has corrected the CAHVORE formatting issue we saw in the earlier image metadata (i.e. the components are now in the correct order).
I wonder if it was an effect of the different software version in the cruise load.
CODE
            "camera": {
                "filter_name": "UNK",
                "camera_vector": "(0.6929857012250769,-0.7042489328574726,-0.1542862873579498)",
                "camera_model_component_list": "(1.10353,-0.008945,-0.729124);(0.884935,-0.17181,0.432865);(2606.49,1675.44,1273.54);(747.819,-326.9,2757.19);(0.885293,-0.171227,0.432363);(1e-06,0.00889,-0.006754);(-0.006127,0.010389,0.004541);2.0;0.0",
                "camera_position": "(1.10353,-0.008945,-0.729124)",
                "instrument": "FRONT_HAZCAM_LEFT_A",
                "camera_model_type": "CAHVORE"


djellison
The front hazcams have quite a significant toe-out between them, ~20 deg.
See page 17 of https://www.ncbi.nlm.nih.gov/pmc/articles/P...Article_765.pdf
MarkL
QUOTE (JohnVV @ Feb 26 2021, 12:56 AM) *
just came across a paper on the cameras and mic



Thank you!
PDP8E
The cameras used for rover lookdown and lookup are AMS CMV20000

here is the datasheet from AMS

https://ams.com/documents/20143/36005/CMV20...e1-428cb363ab0a

Maybe the lookup camera can be used for stargazing, cloud studies, tau....

My debayer program is still a little wonky green

(are there R and B multiplier 'factors' known for this camera?)

I downsized it to fit in 3MB here
Andreas Plesch
Since I believe this may not have been mentioned, it is useful to know that the rss API json feed allows for just returning a single record based on id. It has an id query parameter.
So one can browse to find an image, say

CODE
https://mars.nasa.gov/mars2020-raw-images/pub/ods/surface/sol/00002/ids/edr/browse/edl/EDF_0002_0667111022_758ECV_N0010052EDLC00002_0010LUJ01.png


and then use the id (filename without suffix) to get the record:

CODE
https://mars.nasa.gov/rss/api/?feed=raw_images&category=mars2020&feedtype=json&id=EDF_0002_0667111022_758ECV_N0010052EDLC00002_0010LUJ01

For example, this record is interesting in that it shows the down-look rover camera does not carry any interesting additional data.

[edit:] If the filename for the image has trailing digits as in https://mars.nasa.gov/mars2020-raw-images/p...2_06_0LLJ01.png, for some cameras (not sure which ones) it is necessary to omit the trailing digits (here "01") for the json id: https://mars.nasa.gov/rss/api/?feed=raw_ima...AZ00102_07_0LLJ

EDF_0002_0667111022_758ECV_N0010052EDLC00002_0010LUJ01 (for EDL) vs.
FLE_0009_0667754529_115ECM_N0030000FHAZ00102_07_0LLJ (for FHAZ)
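A small Python helper capturing the rule above - the filename-to-id conversion, with the trailing-digit trimming behind a flag since it's not yet clear which cameras need it:

```python
def feed_id(filename, strip_trailing_digits=False):
    """Derive the json API 'id' from a raw-image filename or URL.

    Per the note above, some cameras appear to need trailing digits
    ("01" etc.) removed from the id - which ones is not yet clear,
    hence the flag.
    """
    # Drop any directory/URL prefix, then the file extension
    image_id = filename.rsplit("/", 1)[-1].rsplit(".", 1)[0]
    if strip_trailing_digits:
        image_id = image_id.rstrip("0123456789")
    return image_id
```

The resulting id is then appended to the json feed URL's id parameter, as in the examples above.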

Greenish
Good info from Emily Lakdawalla on twitter:
QUOTE
@elakdawalla 8:54 PM · Mar 1, 2021
- I just got off WebEx with Justin Maki, who leads the Perseverance engineering camera team. I've learned a lot and gotten a lot of confused questions sorted out. I'll try to bang out a blog entry with lots of techy detail about raw images tomorrow.
- The TL;DR: of the interview was: a lot of the things that are weird and confusing in the raw image metadata from sols 1-4 have to do with the rover being on the cruise flight software at the time.
- For example, the cruise flight software did not "know" how to automatically create image thumbnails. So they had to instruct the rover computer with separate commands to make thumbnails for each image, which is why sequence IDs don't match up between thumbnail and full-res.
- Many of the more confusing issues were solved by the flight software update. They're going to continue to tweak parameters over the next week or two, testing to see what modes they like best for returning their data, but before long they'll settle into some routines.
- It's SO FUN to see this process working out in real time. They *could* hold all the images back until they're happy with their tweaking, but they're not. They're just shunting the images out, never mind the temporarily wonky metadata.
JohnVV
QUOTE
My debayer program is still a little wonky green


have you tried the debayer in G'Mic
https://gmic.eu/
CODE
gmic input.png bayer2rgb -o output.png

fredk
Yeah, gmic gives the same wonky green/yellow cast. The particular deBayering interpolation algorithm shouldn't determine the overall hue (but may affect pixel-scale chroma details). What a deBayering by itself gives is known as "raw colour", and won't generally look right because the relative sensitivities of the RGB channels differ from those of the eye. DeBayered images released have similar casts, eg:
https://mars.nasa.gov/mars2020-raw-images/p...0_01_295J02.png

A simple relative scaling between RGB channels, ie a whitebalance, should help a lot with these images.
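As one concrete form of that scaling, a crude gray-world balance in Python with NumPy - it simply forces the scene-average colour to neutral, and assumes the black level is near zero:

```python
import numpy as np

def gray_world_balance(rgb):
    """Scale each channel of an HxWx3 uint8 image so the scene-average
    colour becomes neutral gray (gray-world assumption; black level ~0)."""
    data = rgb.astype(np.float64)
    means = data.reshape(-1, 3).mean(axis=0)   # per-channel averages
    scale = means.mean() / means               # bring each channel to the overall mean
    return np.clip(data * scale, 0, 255).round().astype(np.uint8)
```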
fredk
QUOTE (fredk @ Mar 2 2021, 07:02 PM) *
A simple relative scaling between RGB channels, ie a whitebalance, should help a lot with these images.

If the black level isn't maintained during the autostretch which is done on the public engineering cam frames, then such a simple fixed RGB scaling won't work for all frames and we're left with trial and error. So it could be that different tiles of a full frame navcam would need different colour adjustment.
lingyuk
So, as I understand it - Perseverance has the ability to compress frames into .MP4 files? And the landing videos posted by NASA on YouTube were uploads of those MP4 files? And the rover will later send all of the full-resolution frames (1000s of them) of the landing?

My question is: are those MP4 files available anywhere to download? Has NASA made them available?

Because Youtube compresses videos a lot, and the original files would have a lot more detail. Thanks!
MarkL
QUOTE (lingyuk @ Mar 3 2021, 08:58 AM) *
My question is: are those MP4 files available anywhere to download? Has NASA made them available?

Great question. JPL was able to get video footage very quickly, and there are still a lot of individual frames to be published, so these must have been videos created by the cameras and downlinked on Sol 1.

Where are the raw video files do you suppose?

Can we get our hands on them?
lingyuk
QUOTE (MarkL @ Mar 3 2021, 05:54 PM) *
Great question. JPL was able to get video footage very quickly, and there are still a lot of individual frames to be published, so these must have been videos created by the cameras and downlinked on Sol 1.

Where are the raw video files do you suppose?

Can we get our hands on them?


I found this file:

https://mars.nasa.gov/system/downloadable_i..._deployment.mp4

It looks like it's the original MP4 file. I couldn't find any others though.
fredk
That video is slowed down and has duplicated frames, so must've been re-encoded from the original.
Osvold
Hello guys, I am new here and have no experience with pictures from Mars, so this may be obvious to you, but I am struggling to figure out how this works. I would like to know if there is any way to find out the direction the rover is looking from a picture. As an example, I would love to know if the hill in the attached photo is the red-encircled hill on the attached map. How is it possible to figure this out, apart from guessing from the map? Thanks a lot for any help.
john_s
One easy way to get oriented is to look for Phil Stooke's circular projections which he posts regularly to support his mapping efforts. These always have north at the top, and show vertically-exaggerated images of features in the distance. The one linked here confirms your hunch about the identity of that mesa.
Pando
QUOTE (Osvold @ Mar 8 2021, 08:05 AM) *
I would love to know if the hill on the attached photo is the red-encircled hill on the map


Yes, it is.