Hey, marekc, I'm a teacher too, right there with you on creating labs.
I didn't mess with any of the *raw* Hubble data; as you note, that would be incredibly difficult to use.
My first step was going to WikiSky. In the DSS menu, I selected Astro Photo Survey, which allows you to see Hubble and lots of other imagery. From there, you basically just dive deeply into a given object and look for two kinds of frames: the classic 'stair step' frame of Hubble's WFPC2 camera and the double-paned rectangular frames of the Advanced Camera for Surveys (ACS).
The trick with the contest was to find nice, rich areas without duplicating an image they had already done. This proved to be the most time-consuming part of the contest! I actually did a full process on one nice galaxy only to find that the European Hubble team had already done it. From that point on, I was careful to cross-check every press release to make sure the image hadn't already been done. For a lab, you wouldn't have that problem!
Once I had a promising frame that looked like it had some good features, I did the HLA search on it. I stuck to ACS frames only because I didn't want to deal with the differing quality of the tiny center portion of the WFPC2 stairstep.
In HLA, I went right to Advanced Search, then unchecked All Instruments and selected only ACS. I filled in the object name (NGC 1763 for the winning image) and chose Combined (Level 2) from the pull-down menu. The Level 1 data is just too difficult to clean up, especially when the experts have already done it, and above Level 2 there are a lot of already-combined images.
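(If you ever want to script that search instead of clicking through the form, here's a rough sketch using astroquery's MAST module. It's not what I did for the contest, and the exact field values, like the instrument name and the calibration level, are my guesses at the MAST equivalents of the HLA form, so treat it as a starting point.)

```python
# Rough sketch: an astroquery version of the Advanced Search described above.
# Requires astroquery (pip install astroquery). The field values below are my
# assumptions about the MAST vocabulary, so double-check them against the docs.
from astroquery.mast import Observations

obs = Observations.query_criteria(
    objectname="NGC 1763",       # resolved by name, like the object box on the form
    obs_collection="HST",        # Hubble observations
    instrument_name="ACS/WFC",   # ACS only, skipping the WFPC2 stairstep
    dataproduct_type="image",
    calib_level=3,               # combined products, roughly the 'Level 2' idea
)
print(obs["obs_id", "target_name", "filters"][:10])
```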
From there, I used the Images tab almost exclusively to see what I was looking at. The whole field can be inspected with the nice Footprints interface, and when I do this for a lab, I'm going to incorporate a view from it.
Once you find a promising set of frames in the pages of images, you can use the Interactive Display link to really look at the data.
Here is the Interactive Display for the field that I used for the winning image. It can be zoomed, panned, and even contrast-controlled with the FITS2web interface. You might even be able to make a lab right there without bringing frames into FITSLiberator.
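(And if you do end up bringing frames down for the lab anyway, that contrast play is easy to reproduce locally. Here's a quick sketch with astropy and matplotlib; the zscale/asinh stretch is just my stand-in for whatever FITS2web does, and the filename is a placeholder.)

```python
# Quick sketch: a local stand-in for the FITS2web contrast control, using astropy
# and matplotlib. 'ngc1763_drz.fits' is a placeholder filename; for drizzled HST
# products the science image usually sits in extension 1, but check first.
import matplotlib.pyplot as plt
from astropy.io import fits
from astropy.visualization import AsinhStretch, ImageNormalize, ZScaleInterval

with fits.open("ngc1763_drz.fits") as hdul:
    hdu = hdul[1] if len(hdul) > 1 and hdul[1].data is not None else hdul[0]
    data = hdu.data.copy()   # copy so the array survives closing the file

# zscale picks reasonable black/white points; asinh keeps the bright core from blowing out
norm = ImageNormalize(data, interval=ZScaleInterval(), stretch=AsinhStretch())
plt.imshow(data, origin="lower", cmap="gray", norm=norm)
plt.title("NGC 1763 field (placeholder file)")
plt.show()
```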
The data downloads go through a 'cart' system where you queue files up to be downloaded in sequence. I had never used something like that for downloading, but Chrome handled it just fine after a warning. The resulting FITS files may contain multiple frames, and the downloads can be hefty.
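(On the multiple-frames point: if students get confused about which part of a downloaded file actually holds the image, a quick peek at the extensions with astropy sorts it out. The filename is again a placeholder for whatever the cart delivers.)

```python
# Quick sketch: list what is inside one of the downloaded files, so you can see
# which extension holds the science image. 'hst_download.fits' is a placeholder.
from astropy.io import fits

with fits.open("hst_download.fits") as hdul:
    hdul.info()                              # one summary line per extension
    for i, hdu in enumerate(hdul):
        if hdu.data is not None:
            print(i, hdu.name, hdu.data.shape)
```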
Once I had the data files, I brought them into PI and away I went! But yes, I'll probably get my students to use FITSLiberator and Photoshop on the lab machine as well.
Hope this helps! You actually inspired me to start putting together my talk about the HLA and the imaging contest.