I am very curious but I honestly have no idea.
For example, in 1908 Honus Wagner fielded .943 at shortstop and the league fielding percentage was .929. That means his error rate was about 80% of the league average (57/71).
In 1998, Nomar Garciaparra fielded .962 at shortstop and the league average was .972. That means his error rate was about 136% of the league average (38/28).
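Just to make the arithmetic concrete, here's that relative-error-rate calculation as a little Python sketch (the function is just something I made up for illustration; the numbers are the ones quoted above):

```python
# Relative error rate: the player's error rate as a fraction of the league's.
def relative_error_rate(player_fpct, league_fpct):
    return (1 - player_fpct) / (1 - league_fpct)

print(round(relative_error_rate(0.943, 0.929), 2))  # Wagner 1908: ~0.80
print(round(relative_error_rate(0.962, 0.972), 2))  # Garciaparra 1998: ~1.36
```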
So do they go to the extreme and make it a straight adjustment? Say a nominal fielding percentage of .970 and go from there? That would put Honus at .976 (error rate of .030 * 80% = error rate of .024) and Nomar at .959.
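Something like this, I'd guess (my own sketch, with .970 as the made-up nominal baseline, not anybody's actual method):

```python
# Straight adjustment: scale a nominal error rate (1 - .970 = .030) by each
# player's error rate relative to his league, then convert back to a
# fielding percentage.
NOMINAL_FPCT = 0.970

def straight_adjustment(player_fpct, league_fpct, nominal_fpct=NOMINAL_FPCT):
    relative = (1 - player_fpct) / (1 - league_fpct)
    return 1 - (1 - nominal_fpct) * relative

print(round(straight_adjustment(0.943, 0.929), 3))  # Wagner: ~0.976
print(round(straight_adjustment(0.962, 0.972), 3))  # Garciaparra: ~0.959
```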
Or do they do something less extreme that also takes their real fielding percentages into account? If you took Honus' real error rate (0.057) and multiplied it by 0.8, you'd get about 0.046, making his adjusted fielding percentage .954. Doing the same for Nomar (0.038 * 1.36), you'd get about 0.052, making his adjusted fielding percentage .948. Nomar still gets worse, but Honus doesn't become the best fielder of all time.
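And the gentler version in the same style (again just my sketch of the idea):

```python
# Less extreme: scale the player's *actual* error rate (not a nominal one)
# by the same league-relative ratio.
def scaled_adjustment(player_fpct, league_fpct):
    relative = (1 - player_fpct) / (1 - league_fpct)
    return 1 - (1 - player_fpct) * relative

print(round(scaled_adjustment(0.943, 0.929), 3))  # Wagner: ~0.954
print(round(scaled_adjustment(0.962, 0.972), 3))  # Garciaparra: ~0.948
```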