Thanks for the answer.
My example is a thought experiment, and I deliberately made it simple; I know of no 4-pixel, 1-bit image files in real life. I agree that my question doesn't have "meaning" without some sort of context, so let me try to explain it better.
Years ago I started experimenting with astrophotography. That led me to experimenting with what data (information) really is. I started out by finding as many different amateur astronomer images of the Andromeda galaxy as I could, then aligning all the images, adding each pixel across all the images, and dividing the result by the number of images used. This method is well known, called shift-and-add or stacking, but to my knowledge it had not been done with hundreds or even thousands of images. The result was quite mind-blowing, and the process was time-consuming.
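To make the procedure concrete, here is a minimal sketch of the shift-and-add step in Python/NumPy, assuming the frames have already been registered (aligned) onto a common pixel grid; the function name and the use of NumPy are my own choices for illustration, not part of the original workflow:

```python
import numpy as np

def shift_and_add(aligned_images):
    """Average a set of already-aligned exposures (shift-and-add / stacking).

    aligned_images: list of 2-D NumPy arrays of identical shape,
    one per exposure of the same field after registration.
    """
    stack = np.stack([img.astype(np.float64) for img in aligned_images])
    # Add each pixel across all frames, then divide by the number of frames.
    return stack.sum(axis=0) / len(aligned_images)
```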
During the home isolation these days, I decided to try to make an image again. This time, however, I wanted to be more analytic in my approach. I went for the galaxy NGC 4565, and you might not think so, but by crawling the internet I managed to collect roughly 600 different images of it. The goal was also to see how "deep" I could get the image, that is, how distant the objects I could still see were. I'm not finished, but I have some results: judging from my current result, the most distant galaxies in the image are roughly 10-15 billion light-years away from Earth.
I've worked a lot with entropy and information, and since making images like the one above is a question of separating signal (stars, galaxies, etc.) from noise, I have been looking at the ways you normally do that (there are several), to see if I could come up with something better. That led to the question described in the OP.
The question is more general than my actual task of making an image of the Universe. Information and entropy are closely related, to the point where some even argue that they are the same thing:
Source: https://www.youtube.com/watch?v=sMb00lz-IfE
I was thinking more about all this yesterday, and of course it doesn't make much practical sense without an observer, perception and a lot of other factors, but that only makes it more difficult for me to understand.
So I've tried to look the other way, towards the basic axioms of information theory. Instead of transferring information, I decided to look at information that isn't being used, like the pixels in an image file. They just exist in the thought experiment above, but they are still "stable" information (it takes energy to flip a bit). The information in an image has some sort of pattern that we can decode using eyes and brain, just like a spoken sentence carries information that we can decode by understanding the words and their meaning.
When you speak, you transfer bits from your own brain to the brain of the person receiving the message. Some information gets lost during the transfer, for instance when you use words the receiver doesn't understand. Basically, Claude Shannon discovered much of this back in 1948. Likewise, in the image of the galaxy there is information in what we perceive as noise. Once the signal-to-noise ratio gets close to 1 it becomes difficult to distinguish signal from noise, but the information is still there, and by adding and dividing all the images, thereby increasing the SNR, I'm able to extract more signal, though it is still, to a certain degree, a mystery to me why this works.
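For what it's worth, the standard statistical account is that averaging N frames with independent noise reduces the noise level by a factor of about the square root of N while the signal itself is unchanged. Here is a small numerical sketch of that effect for a single pixel (the numbers are made up purely for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

signal = 1.0        # the "true" value of one pixel
noise_sigma = 10.0  # per-frame noise level, so the single-frame SNR is 0.1
n_frames = 600      # roughly the number of images collected

# Simulate 600 noisy measurements of the same pixel and average them.
frames = signal + rng.normal(0.0, noise_sigma, size=n_frames)
stacked = frames.mean()

print("single-frame SNR:", signal / noise_sigma)                            # 0.1
print("expected stacked SNR:", signal / (noise_sigma / np.sqrt(n_frames)))  # ~2.4
print("stacked estimate:", stacked, "vs true value", signal)
```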
I find information and entropy very interesting, and even though they connect the dots between Quantum Mechanics, Thermodynamics, Biology, Computing, and much more (with remarkable resemblance, though the concepts are not identical), we still have a very limited understanding of what information and entropy really are.
So: I could arrange the bits in a file so that the file carries information about how a galaxy "looks", or I could rearrange exactly the same number of bits, with what seems to be the same amount of information entropy, so that the image would be perceived as a picture of Cleopatra. I still don't understand why. It's the same amount of information, but a different pattern, message, or whatever it should be called. That, to me, seems to indicate that there is more information in the files in the OP than just the four bits.
Seen from the outside, those files could be reduced to the number of bits they contain, but those bits alone don't show the pattern unless they are arranged in a very specific way.
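To illustrate what I mean, here is a small sketch (the bit strings are made up) showing that the usual first-order Shannon entropy only looks at how often each symbol occurs, so two very different arrangements of the same bits score exactly the same:

```python
import math
from collections import Counter

def bit_entropy(bits):
    """First-order Shannon entropy (bits per symbol) of a bit string,
    based only on symbol frequencies, not on their arrangement."""
    counts = Counter(bits)
    n = len(bits)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

# Two hypothetical files containing the same bits in different orders.
galaxy_like    = "0011010110100101"  # one arrangement of eight 0s and eight 1s
cleopatra_like = "0000000011111111"  # a different arrangement of the same bits

print(bit_entropy(galaxy_like))     # 1.0 bit per symbol
print(bit_entropy(cleopatra_like))  # 1.0 bit per symbol -- same entropy, different pattern
```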