I have noticed that the lightmaps look pretty bad in some places; see the attached screenshot for a rather clear example on ffa_bespin. I used mono-colored textures for the screenshot to make the problem more apparent.
It seems like the current interpolation (from upsizing the tiny lightmaps) creates a weird tile-like structure, which is clearly faulty. I extracted the lightmaps from ffa_bespin and resized them with a few methods: very simple resizing algorithms produced similarly poor results, but more advanced ones like Lanczos 3 gave fine results and created exactly the circle-like gradients you would expect.
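For reference, this is roughly what a "Lanczos 3" resize computes: each output texel is a weighted average of nearby input texels, with weights taken from a windowed sinc. A minimal 1-D Python sketch (function names and the edge-clamping choice are mine, not from any engine or image-editor code):

```python
import math

def lanczos3(x):
    """Lanczos-3 kernel: a windowed sinc with support |x| < 3.

    Just the textbook kernel definition, for illustration only.
    """
    if x == 0.0:
        return 1.0
    if abs(x) >= 3.0:
        return 0.0
    px = math.pi * x
    # sinc(x) * sinc(x / 3), using the normalized sinc sin(pi*x)/(pi*x)
    return 3.0 * math.sin(px) * math.sin(px / 3.0) / (px * px)

def resample_1d(samples, factor):
    """Upscale a 1-D row of texels by `factor` using the Lanczos-3 kernel."""
    out = []
    for i in range(len(samples) * factor):
        # position of this output texel in input-sample coordinates
        center = (i + 0.5) / factor - 0.5
        total = weight_sum = 0.0
        for j in range(int(center) - 3, int(center) + 5):
            w = lanczos3(center - j)
            # clamp at the edges instead of wrapping
            s = samples[min(max(j, 0), len(samples) - 1)]
            total += w * s
            weight_sum += w
        # normalize so constant regions stay constant
        out.append(total / weight_sum)
    return out
```

The same thing applied separably along both axes is what produces the smooth circular gradients instead of the tile structure.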
I also attached a screenshot where the same happens with saber glows, where the problem is much more apparent, and ugly even with normal textures.
Neither of these seems to be affected by r_textureMode, so I was wondering whether this is a limitation of OpenGL or of the engine itself, which may be doing the upsizing on its own. If the engine is doing it, it should be fairly trivial to replace that code with some Gaussian (?) interpolation, at the cost of a certain performance hit.
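To make the Gaussian idea concrete, here is a sketch of the kind of software upsampling I mean, in 1-D Python (the function, the sigma value, and the edge handling are all my own assumptions, not anything from the engine):

```python
import math

def gaussian_upscale_1d(samples, factor, sigma=0.6):
    """Upscale a row of texels with a normalized, truncated Gaussian filter.

    `sigma` is an arbitrary smoothing choice for illustration,
    not an engine parameter.
    """
    radius = int(math.ceil(3 * sigma))  # truncate the kernel at ~3 sigma
    out = []
    for i in range(len(samples) * factor):
        # position of this output texel in input-sample coordinates
        center = (i + 0.5) / factor - 0.5
        total = weight_sum = 0.0
        lo = int(math.floor(center)) - radius
        hi = int(math.floor(center)) + radius + 2
        for j in range(lo, hi):
            w = math.exp(-((center - j) ** 2) / (2.0 * sigma * sigma))
            # clamp at the edges instead of wrapping
            s = samples[min(max(j, 0), len(samples) - 1)]
            total += w * s
            weight_sum += w
        # normalize so flat lighting stays flat
        out.append(total / weight_sum)
    return out
```

Applied along both axes of the lightmap before upload, something like this would at least smooth out the blocky tiles, though a windowed-sinc kernel like Lanczos would preserve detail better than a pure Gaussian blur.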
If it is an OpenGL thing, can it be worked around without much effort?
I know I could dig through the code myself for this, but being the noob that I am, I suspect it would take me ages to find the right place, so if any of you already know something about the process, I would appreciate input.