Sum that up, and you see these images do show what is "out there", just not what is visible to our eyes. Astronomers collect data from the same spot in the night sky for hours, even days, across as wide (and useful) a spectrum as possible. As a matter of detail, almost all of the light they collect is still very dim, and they do two things to overcome that and end up with a beautiful picture: "stretch" the low light, so that something only a little brighter than totally dark comes out almost white, and filter out the read noise and other artifacts that cover up the darker areas, where much of that faint background data emitted by space objects sits.
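In case you're curious what "stretching" looks like in practice, here's a toy sketch in Python (real tools use much fancier curves, and the function name and parameter values here are just made up for illustration):

```python
import numpy as np

def stretch(image, black_point=0.01, gamma=0.25):
    """Simple nonlinear stretch: clip away the noise floor, then
    raise the faint signal toward white with a power curve (gamma < 1)."""
    img = np.clip((image - black_point) / (1.0 - black_point), 0.0, 1.0)
    return img ** gamma

# A pixel only slightly brighter than total darkness ends up
# quite bright after the stretch:
faint = np.array([0.02])
print(stretch(faint))  # roughly 0.32, instead of nearly 0
```

The gamma exponent below 1 is what lifts the shadows: dark values get boosted a lot, while already-bright values barely move.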
OK, thank you for elaborating.
Genuine question: Telescopes continuously move to stay focused on whatever they are watching, since everything is in constant motion, right?
How does your camera relate to that?
Finally I found some time to answer this.
The camera is mounted at the scope's eye-end, but the ocular (eyepiece) is replaced by a corrector lens, which flattens the curved focal surface of the optics (parallelizing the rays of light) so the image stays sharp in the corners of the camera's sensor. From the camera's point of view everything is static, to a great extent, but the alignment of the base, the precision of the motors and drives, vibrations from wind or heavy vehicles passing nearby etc. all play into that staticness. Most advanced astrophotographers use a parallel, cheaper scope with a cheap monochrome video camera (like an old cellphone camera) to track a bright star near the object they are capturing, so the mount and motors can correct for imprecisions in real time. If you want to take single exposures longer than about 2 minutes, there's no way around it if you need sharp images.
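The guiding idea itself is simple: measure where the guide star sits in each video frame, and nudge the mount by however much it has drifted. A toy sketch of that loop (real guiding software does calibration, smoothing, backlash compensation and so on; the frame size and pixel values here are invented):

```python
import numpy as np

def centroid(frame):
    """Intensity-weighted centroid (y, x) of a guide-star image."""
    ys, xs = np.indices(frame.shape)
    total = frame.sum()
    return (ys * frame).sum() / total, (xs * frame).sum() / total

# Reference position recorded at the start, then each new guide
# frame yields a correction to send to the mount's motors.
ref = (8.0, 8.0)
frame = np.zeros((16, 16))
frame[9, 7] = 1.0                 # star drifted: down 1 px, left 1 px
dy = centroid(frame)[0] - ref[0]  # +1.0
dx = centroid(frame)[1] - ref[1]  # -1.0
print(dy, dx)  # 1.0 -1.0
```

The mount then moves by the opposite of (dy, dx), many times per minute, which is how those multi-minute exposures stay pin-sharp.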
I also remember playing around with cameras when I was younger, and what I enjoyed the most was making those LSD kind of photos, just by letting in more light at night. Stable objects would sort of “move” in the resulting photo, creating the effect. I’m sure you know what I’m talking about, despite my lack of proper terminology. Similar to the wobbly whatever, live, on the telescope. And what about colour? Space and colour? Hm?
Not quite sure what you mean with the camera, but the "wobble" in the scope is just warm (or cold) air moving between the object and the scope. The lower the scope is pointed toward the horizon, the stronger the effect, mostly in cities during the warm times of the year. This can become a problem at large magnification, like when observing the planets of our solar system. When stacking a lot of short exposures, the effect is cancelled out in the resulting photo. Nowadays APs (astrophotographers) use video cameras to capture planets, because they deliver many exposures per second, which are all stacked (added on top of each other) in processing. Still, there is a lot of atmosphere between the camera and the planet, so the quality isn't really comparable to Webb or Hubble pics.
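You can see why stacking works with a tiny simulation: each short exposure is the real signal plus random noise, and averaging many of them cancels the noise while the signal stays put. A made-up one-dimensional example (one bright "star" pixel, Gaussian noise standing in for seeing and read noise):

```python
import numpy as np

rng = np.random.default_rng(42)

# Simulated "true" signal: one bright pixel (a star) on a dark strip.
truth = np.zeros(32)
truth[16] = 1.0

# Each short exposure is the truth plus random noise.
frames = [truth + rng.normal(0, 0.5, truth.shape) for _ in range(500)]

# Stacking = averaging the frames; random noise cancels, signal stays.
stacked = np.mean(frames, axis=0)

single_err = np.abs(frames[0] - truth).mean()
stacked_err = np.abs(stacked - truth).mean()
print(single_err > 10 * stacked_err)  # noise drops roughly as sqrt(N)
```

With 500 frames the noise shrinks by about a factor of sqrt(500) ≈ 22, which is why planetary imagers happily shoot thousands of video frames.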
Don’t get me wrong, I enjoy looking at the photos, and certainly do not want to spoil your hobby, but shit man - they do not represent reality.
Looking with your own two eyes at the moonless summer night sky, far from any light-polluted area, is all the camera I will ever need.
No worries. I see your point, and I'd translate this aspect into "not real, in relation to human vision". All the extra light data, the "invisible" parts of the spectrum, contain the most beauty out there, but they would look much like the view of a night-vision-capable animal, only not monochrome. The colors are either false colors or shifted colors: for example the H-alpha emission from those cloudy nebulas, which is technically a deep red (around 656 nm, right at the edge of our visual range), and we would see it reddish if our retinas were sensitive enough to capture it.
And this is one of the fascinating aspects of AP: making the unseeable visible. It's there, and if we had eyes of a different design, we could actually see much of it with the naked eye.