Fun little Astronomy 101 factoid: surface brightness, or flux per unit of solid angle, is constant over distance.
To unpack that statement a bit, consider that the angular resolution of your retina is around one arcminute, or 1/60 of a degree (the full moon is about 30 arcminutes across). So one "pixel" on your retina (ignoring for now the biology behind that) covers about (one arcminute)² of solid angle when looking at the scene in front of you. Similarly, if you point a digital camera at a scene, the focal length of the camera and the size of the detector would let you calculate the corresponding solid angle for one pixel of the CCD sensor. If you set the focal length such that one pixel spans one arcminute, similar to your eye, the full moon would be about 30 pixels across in the resulting image.
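If it helps to make that concrete, here's a rough sketch of the plate-scale arithmetic (in Python, with a made-up 5-micron pixel size): the angle one pixel subtends is roughly the pixel size divided by the focal length, and the Moon's ~30 arcminutes divided by that angle gives its width in pixels.

```python
import math

ARCMIN_PER_RAD = 60 * 180 / math.pi    # ~3437.75 arcminutes in one radian

def pixel_scale_arcmin(pixel_size_mm, focal_length_mm):
    """Angle one pixel subtends, in arcminutes (small-angle approximation)."""
    return (pixel_size_mm / focal_length_mm) * ARCMIN_PER_RAD

pixel_mm = 0.005                        # hypothetical 5-micron pixels
focal_mm = pixel_mm * ARCMIN_PER_RAD    # focal length giving ~1 arcmin per pixel, about 17 mm

print(pixel_scale_arcmin(pixel_mm, focal_mm))       # -> 1.0 arcmin per pixel
print(30 / pixel_scale_arcmin(pixel_mm, focal_mm))  # full moon (~30 arcmin) -> ~30 pixels across
```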
When pointing this camera at a scene, the sensor is going to return a signal reporting how much light hit each pixel while the shutter was open. If you save your photos as RAW you will get these readouts as-is, otherwise they are lost to the image processing. The detectors used by professional astronomers can be precise enough to record the actual number of photons that hit the sensor, in fact! The surface brightness of a scene is the light flux per unit time per unit solid angle, or in terms of our camera, the number of photons (or amount of radiant energy) per second that hits one pixel.
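In toy-calculation form (the counts and exposure time here are invented, and real calibration would also account for quantum efficiency and the like), surface brightness falls straight out of a pixel's photon count, the exposure time, and the pixel's solid angle:

```python
def surface_brightness(photon_count, exposure_s, pixel_solid_angle_arcmin2):
    """Photons per second per square arcminute collected by one pixel."""
    return photon_count / exposure_s / pixel_solid_angle_arcmin2

# Toy numbers: a pixel covering 1 arcmin^2 collects 1200 photons in a 30 s exposure.
print(surface_brightness(1200, 30.0, 1.0))   # -> 40.0 photons / s / arcmin^2
```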
So the fact up top means that if you take a photo of a uniform-brightness object (say a piece of paper under constant lighting) and move the camera closer or farther away, the light hitting each pixel will not change. In astronomical terms, imagine looking at two identical galaxies through a telescope, one twice as far away as the other. The farther one will appear smaller, and thus will yield less total light, but each pixel of the nearer galaxy's image will be just as bright as the corresponding pixel of the farther one's. If you think in terms of the 1/R² law, in the picture of the farther galaxy each star (while too faint to resolve individually) is 1/4 as bright, but each pixel of the farther galaxy contains 4 times as many stars, so the two factors cancel out.
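Here's a quick numerical sketch of that cancellation (arbitrary units, purely illustrative): the flux from each star drops as 1/R², the number of stars projected into a fixed solid angle grows as R², and the product, the surface brightness, stays constant.

```python
# Toy check of the 1/R^2 cancellation (arbitrary units, purely illustrative).
def flux_per_star(distance):
    return 1.0 / distance**2            # inverse-square law: each star looks fainter

def stars_per_pixel(distance):
    return 100.0 * distance**2          # a fixed solid angle takes in more of the galaxy

for d in (1.0, 2.0, 4.0):
    # Surface brightness per pixel = flux per star * stars per pixel
    print(d, flux_per_star(d) * stars_per_pixel(d))   # -> 100.0 at every distance
```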
So, to apply this to @Old Duck's original question: go somewhere with very dark skies and look up at the Pleiades, or the Orion Nebula. If the Earth were right up next to them, they would be exactly that brightness, but that brightness would cover a much larger portion of the sky. In the case of the Pleiades, it would probably be about as bright as the Milky Way. Orion would be somewhat brighter, but still too faint to make out the vivid colors you see in photographs. But something like Barnard's Loop or the Witch Head Nebula would mostly be too faint to see at all with the naked eye.
And a final point to consider: dusty nebulae -- which include most star-forming regions, like Orion, the Coalsack, etc. -- are also fairly opaque. So they would also be highlighted by the fact that you'd see a few bright foreground stars in front of them, but no stars behind them.