It is unlikely I can cope with the entire galaxy the way I currently do things. We anticipate that the galaxy contains around 400 billion systems. If I miraculously managed to compress every system into only 10 bytes, the galaxy would take up 4TB of space, which would in theory be doable (but 10 bytes is not a lot of space to store all of the information about a system).
Currently we know of roughly 105 million systems, which contain 400 million stars/planets (and even more with rings, belts and barycentres), and the entire data warehouse takes roughly 500GB. That puts the average size of a system at roughly 5KB, and at that size storing the entire galaxy would take up around 2PB (petabytes) of space.
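To make the arithmetic concrete, here is a quick back-of-envelope sketch (plain Python, nothing from the real codebase; the figures are just the rounded numbers quoted above):

```python
# Rough back-of-envelope estimates; all inputs are the rounded
# figures quoted above, not measurements from the real system.

GALAXY_SYSTEMS = 400e9     # anticipated number of systems in the galaxy
KNOWN_SYSTEMS = 105e6      # systems currently known
WAREHOUSE_BYTES = 500e9    # current data warehouse size (~500GB)

# Hypothetical best case: every system squeezed into 10 bytes.
best_case = GALAXY_SYSTEMS * 10
print(f"10 bytes/system: {best_case / 1e12:.1f} TB")    # ~4.0 TB

# Observed average: warehouse size divided by known systems.
avg_per_system = WAREHOUSE_BYTES / KNOWN_SYSTEMS         # ~4.8 KB, call it 5 KB
print(f"average system:  {avg_per_system / 1e3:.1f} KB")

# Extrapolated to the whole galaxy at that average.
full_galaxy = GALAXY_SYSTEMS * avg_per_system
print(f"full galaxy:     {full_galaxy / 1e15:.1f} PB")   # ~1.9 PB, i.e. roughly 2 PB
```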
That is just the data warehouse; the indexed part of the galaxy which powers the search is currently 600GB, so for the entire galaxy it would probably come to a similar amount (around 2PB). That only covers storage, though: I would almost certainly have to increase CPU power by adding more machines to the search, potentially dozens, to keep it responsive.
I have other indexes as well which power some of the more modern plotters, but those would probably be able to cope without much change. They currently use 40 bytes per system, although if I were pushed I could probably cut that down to 16 bytes with no change in functionality.
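The real plotter index layout isn't described here, so the following is only a hypothetical sketch of how a fixed-width per-system record could plausibly shrink from around 40 bytes to 16, for example by swapping a 64-bit id and double-precision coordinates for a 32-bit table index and single-precision coordinates:

```python
import struct

# Hypothetical layouts only -- not the actual index format, just an
# illustration of where a 40 byte -> 16 byte saving could come from.

# "Wide" record: 64-bit system id + 3 double-precision coordinates + 64-bit flags.
WIDE = struct.Struct("<Q3dQ")    # 8 + 24 + 8 = 40 bytes
# "Packed" record: 32-bit index into a system table + 3 single-precision coordinates.
PACKED = struct.Struct("<I3f")   # 4 + 12 = 16 bytes

print(WIDE.size, PACKED.size)    # 40 16

# Packing one made-up system at the origin.
record = PACKED.pack(0, 0.0, 0.0, 0.0)
assert len(record) == 16
```

The trade-off in the packed layout is coordinate precision and id range, which is why this is only an example of the kind of saving that's possible rather than a description of the real index.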
That is just the space; it would almost certainly take more processing power to index all of that as well, and I would be unable to produce the galaxy dumps each night, which currently take 5 hours to generate.
All that said, with the current rate of exploration I don't think we're in any danger of getting anywhere close to running me out of space. Also, at some point we'd have enough data that we could probably reverse-engineer enough of the Stellar Forge to correctly predict star positions (if people were so inclined and determined to work it out). That could reduce storage costs by several orders of magnitude as well.
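The actual Stellar Forge algorithm isn't known here, so this is purely an illustration of why reverse-engineering it would cut storage so dramatically: if positions are generated deterministically from a seed, you only need to store the seed (plus any hand-placed exceptions), not every position. The generator below is made up and uses an arbitrary extent; it is not the real algorithm.

```python
import random

def sector_systems(sector_seed: int, count: int) -> list[tuple[float, float, float]]:
    """Regenerate the same pseudo-random positions from a seed every time."""
    rng = random.Random(sector_seed)
    # Arbitrary cube extent, purely for illustration.
    return [(rng.uniform(0, 1000), rng.uniform(0, 1000), rng.uniform(0, 1000))
            for _ in range(count)]

# The whole "database" for this sector is one integer seed,
# not count * ~5KB of stored records.
assert sector_systems(42, 1000) == sector_systems(42, 1000)
```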