Most of these project ideas were recorded by Alex from our group discussions thus far (thanks!). Editing needed; the ideas need to be distilled and explained a little more; anyone with time to work on it would be welcome.
red: text I tried to capture but whose details I couldn't entirely record. (Gigapans shot with a digital SLR.)
Earthquake viewer for stars (Tony to elaborate here)
Convert star formation animation to 3D animation. The 2D video was shown by Bo Rippert; the work is from someone in England. He indicated the creator might share some data on a collaborative basis. (Bate, Bonnell & Bromm 2003 cluster formation simulation, animated over 266,000 years)
Comet Garradd: tracking to study its rotation. Animate? (Bo Rippert on the Big Island)
Panoramics at high enough resolution take on a 2.5-D aspect (you can zoom into the picture quite deeply). You can make the 3D effect stronger by placing objects closer to the gigapan camera.
zoom-in: embedded in the image
Try to plan deploying a bunch of equipment in the scene: you have 3D models of the equipment you're trying to deploy, and you composite them onto the 2D panoramic. With the right setup, as you zoom in it looks like the real environment (makes some of the traverse planning easier).
point-to-point aspect
DEM (digital elevation model): LIDAR is the best bet for that
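A DEM is just a regular grid of elevation posts (e.g. from LIDAR), so placing 3D content on it comes down to sampling an elevation at an arbitrary point. A minimal sketch, assuming a tiny made-up grid and unit cell size (neither comes from the notes):

```python
# Minimal sketch: sample an elevation from a DEM grid by bilinear
# interpolation. The grid values and cell size are illustrative only.
def dem_elevation(grid, x, y, cell=1.0):
    """grid[row][col] holds elevations; (x, y) are map coordinates."""
    gx, gy = x / cell, y / cell
    x0, y0 = int(gx), int(gy)
    fx, fy = gx - x0, gy - y0
    # Blend the four surrounding elevation posts.
    top = grid[y0][x0] * (1 - fx) + grid[y0][x0 + 1] * fx
    bot = grid[y0 + 1][x0] * (1 - fx) + grid[y0 + 1][x0 + 1] * fx
    return top * (1 - fy) + bot * fy

dem = [[10.0, 12.0],
       [14.0, 16.0]]
print(dem_elevation(dem, 0.5, 0.5))  # 13.0, the mean of the four posts
```

Real LIDAR-derived DEMs add georeferencing and nodata handling, but the sampling step is the same.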
binocular vision: pleasing effects in big wide-open spaces
== 360° panorama resolution:
Impressive: in gigapan 68571 you can see Mauna Kea from all the way
Low resolution: 4 pictures in height, 360° around, 40 images total; between 5 and 10 minutes to take the pictures. For the massive Singapore gigapan, the camera sits there for 1 hr.
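The shot counts above follow from simple coverage arithmetic. A sketch of that calculation; the per-image field-of-view and overlap values below are assumed for illustration, not taken from the actual rig:

```python
import math

def panorama_shot_count(h_fov, v_fov, v_span, overlap=0.25):
    """Images needed for a full 360-degree panorama.

    h_fov, v_fov: per-image field of view in degrees (assumed values).
    v_span: total vertical coverage desired, in degrees.
    overlap: fraction of each frame shared with its neighbor for stitching.
    """
    step_h = h_fov * (1 - overlap)   # effective horizontal advance per shot
    step_v = v_fov * (1 - overlap)   # effective vertical advance per row
    cols = math.ceil(360 / step_h)
    rows = math.ceil(v_span / step_v)
    return rows, cols, rows * cols

# Example with made-up lens numbers: a 40x30 degree frame covering 80
# degrees vertically gives 4 rows of 12 shots.
print(panorama_shot_count(40, 30, 80, 0.25))  # (4, 12, 48)
```

Capture time then scales linearly with the total shot count, which is why the giant panoramas tie up the camera for an hour.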
Google Street View car: 2D acting as 3D. We're most interested in the planning/execution/debrief cycle for sorties. It would be worthwhile to have a mini-CAVE in the habitat for planning and for training; your team back at home can mark up the environment with tags and information.
The crew on Mars can explore Mars; take the tags out with you for a heads-up display in the environment. (How much does the 3D aspect add?)
Something NASA is very much into: participatory exploration. During the 4-month study, see how much of that experience we can give to the outside world. How much can people participate from their own computers and cellphones?
How much can they go into our own facility to explore? The NASA Astrobiology Institute has really gotten into these "virtual field trips" integrated into virtual field sites, aimed at junior high students. Next month in Iceland: they put together virtual field trips.
Tradeoff between 3D and 2D? Consider the tracking system and the screens you need, especially if it's a habitat in space; even if it's 2D we need a screen too.
flat-panel display showing virtually generated content (as in an FPS): adding 3D is straightforward, and depth perception comes "free" there
Head tracking: an optical system with extra cameras to track extra devices (precision opto-tracking). For situational awareness, just IR cameras.
multi-use cameras. extra markers to track stuff.
pros/cons to doing tracking.
Can't get the camera miniature enough ("computer actuator with camera band"). "Pixel matching to get a rough metric for distances in each pixel."
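"Pixel matching" for distance presumably means the classic stereo relation: match a feature between two views, measure its pixel disparity, and convert that to depth. A minimal sketch, with illustrative focal length and baseline values (the actual rig's numbers are not in the notes):

```python
# Sketch of getting a rough per-pixel distance metric from pixel matching:
# depth = focal_length * baseline / disparity. All numbers are illustrative.
def depth_from_disparity(disparity_px, focal_px, baseline_m):
    """Depth in meters of a feature matched between two camera views."""
    if disparity_px <= 0:
        return float("inf")  # no measurable parallax: treat as infinitely far
    return focal_px * baseline_m / disparity_px

# A 20-pixel disparity with a 1000-px focal length and 10 cm baseline:
print(depth_from_disparity(20.0, 1000.0, 0.1))  # 5.0 meters
```

Running this per matched pixel over a panorama pair would give the rough distance map the note describes; accuracy falls off quickly with range, hence "rough metric".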
Comet: find 3D positions of objects orbiting the nucleus; run a visualization for EPO and to help scientists determine the origin vents.