I'm just guessing (or wondering, or... dreaming), so don't take this as more than a general idea: AFAIK, the Humanize function is English-based, and as such it doesn't draw upon a table. I believe proper localization would require a function written from the ground up with a particular language in mind, or one built from the ground up on a completely language-agnostic structure relying on an expanding library of cultural interpretations. Otherwise you get just what you got: literal word-for-word substitution of translated counterparts, with no regard for the proper syntax or sentence structure of the target language.
From what I hear, I think the function could "pick the desired phrase" ("far more than...", "less than...", etc.) and then pass the "desired phrase index", along with the rounded numbers and "units" (Ly, credits, tons, whatever), to a "localization function". That function would look up the translated phrase[index] in a table and search/replace its placeholders with the passed values.
Not exactly easy, but all we have to manage are... ten or so different kinds of "colloquial numbering". Seems feasible, to me.
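The idea above could be sketched roughly like this (just an illustration: the phrase keys, locale codes, and function names are all made up, not the actual Humanize API):

```python
# Each locale gets its own table of phrase templates with named placeholders,
# so word order can differ freely between languages.
PHRASES = {
    "en": {
        "far_more_than": "far more than {amount} {unit}",
        "less_than": "less than {amount} {unit}",
    },
    "it": {
        "far_more_than": "ben piu' di {amount} {unit}",
        "less_than": "meno di {amount} {unit}",
    },
}

def localize(locale, phrase_key, amount, unit):
    """Look up the translated phrase template and substitute the passed values."""
    template = PHRASES[locale][phrase_key]
    return template.format(amount=amount, unit=unit)

def humanize(locale, value, unit):
    """Decide WHICH phrase fits and round the number; all language-specific
    wording lives in the locale table, not here."""
    if value > 1000:
        return localize(locale, "far_more_than", 1000, unit)
    return localize(locale, "less_than", round(value), unit)

print(humanize("en", 2500, "credits"))   # far more than 1000 credits
print(humanize("it", 2500, "crediti"))   # ben piu' di 1000 crediti
```

The point being: the humanize step only chooses the phrase index and the numbers, while the translated phrasing (including word order) stays entirely in the per-language table.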
OR
Just provide a default personality script called by the Humanise function (or turn Humanise itself into a script?), so the whole job would be done on our side.