
Comments (9)

klei1984 commented on May 22, 2024

Hello @segatrade ,

> Yes. My screen and OS is 4k 42.5" scale 100% (no scale) so difference is noticeable

I think the best option for you now is to use screen_mode=2 and a virtual resolution that can still be rendered at 60+ FPS by the game's software renderer on your computer.

In screen_mode=2, SDL2 keeps the native 4K resolution and pixel format requested by your monitor / screen, while the software renderer renders the game window at the configured smaller virtual resolution, e.g. Full HD. In this case SDL2 is responsible for scaling the rendered game window up to your native 4K screen using HW acceleration.
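
In SDL2 terms this roughly corresponds to a desktop-fullscreen window whose renderer is given a smaller logical size. A minimal sketch, assuming a plain SDL2 setup (this is not the actual M.A.X. Port code):

```c
/* Minimal sketch: render at a Full HD virtual resolution, let SDL2 scale to 4K.
 * This mirrors the screen_mode=2 idea; it is not the M.A.X. Port source code. */
#include <SDL.h>

int main(void) {
    if (SDL_Init(SDL_INIT_VIDEO) != 0) return 1;

    /* Keep the native desktop mode (e.g. 4K) instead of switching video modes. */
    SDL_Window *window = SDL_CreateWindow("M.A.X.",
                                          SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED,
                                          1920, 1080, SDL_WINDOW_FULLSCREEN_DESKTOP);
    SDL_Renderer *renderer = SDL_CreateRenderer(window, -1, SDL_RENDERER_ACCELERATED);

    /* The game is rendered at 1920x1080; SDL2 upscales it to the 4K screen
     * using the GPU, so the expensive software rendering stays at Full HD. */
    SDL_RenderSetLogicalSize(renderer, 1920, 1080);

    SDL_RenderClear(renderer);
    SDL_RenderPresent(renderer);
    SDL_Delay(2000);

    SDL_DestroyRenderer(renderer);
    SDL_DestroyWindow(window);
    SDL_Quit();
    return 0;
}
```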

On a 64" 4K HDR TV via hdmi at 30 Hz refresh rate I got the best picture quality with such a setup.


klei1984 commented on May 22, 2024

As documented in the previous comments, the slowness at high resolutions is not a defect; the software renderer has simply reached its performance limits due to indexed palette color management.

Migration to a hardware render pipeline only makes sense after the game architecture is migrated to a modern event loop and to a HW accelerated GPU API such as Vulkan.

I am closing the defect report with no actions planned in the short term.


klei1984 commented on May 22, 2024

> Is scale_quality=1 or 2 better? What is the difference?

The SDL2 documentation has this to say about SDL_HINT_RENDER_SCALE_QUALITY: https://wiki.libsdl.org/SDL2/SDL_HINT_RENDER_SCALE_QUALITY

  • 0 or "nearest": nearest pixel sampling
  • 1 or "linear": linear filtering (supported by OpenGL and Direct3D)
  • 2 or "best": anisotropic filtering (supported by Direct3D)

"2 or best" means that SDL will use the best feature from that list that is supported by the HW and the OS. It is documented that on Windows, under Direct3D, this could be anisotropic filtering, which adds absolutely nothing over linear filtering for a top-down rendered 2D texture. So I would say 1 and 2 are probably equal in display quality, but may not be equal from a performance point of view. Who knows... maybe anisotropic filtering is accelerated better than a bi- or tri-linear filter.
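
For reference, the hint is set through the normal SDL2 hint API and has to be set before any texture is created. A minimal sketch, assuming a plain SDL2 program (not the M.A.X. Port setup code):

```c
/* Minimal sketch: selecting the SDL2 scale filter before textures are created.
 * "1" requests linear filtering; "2" requests anisotropic where available. */
#include <SDL.h>

int main(void) {
    if (SDL_Init(SDL_INIT_VIDEO) != 0) return 1;

    /* The hint must be set before SDL_CreateTexture for it to take effect. */
    SDL_SetHint(SDL_HINT_RENDER_SCALE_QUALITY, "1");

    SDL_Window *window = SDL_CreateWindow("scale_quality demo",
                                          SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED,
                                          640, 480, 0);
    SDL_Renderer *renderer = SDL_CreateRenderer(window, -1, SDL_RENDERER_ACCELERATED);

    /* Textures created from now on are filtered according to the hint. */
    SDL_Texture *texture = SDL_CreateTexture(renderer, SDL_PIXELFORMAT_ARGB8888,
                                             SDL_TEXTUREACCESS_STREAMING, 640, 480);

    SDL_DestroyTexture(texture);
    SDL_DestroyRenderer(renderer);
    SDL_DestroyWindow(window);
    SDL_Quit();
    return 0;
}
```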

> SDL2 seems to work with Vulkan

Yes, it seems so. The SDL2 render API tries to do various texture or surface blitting operations in video memory, accelerating them using the video card. SDL2 can also create a system window and initialize a surface for the OpenGL, Direct3D or even Vulkan API so that people can create their own render pipelines if they know how to do that. I tried to learn OpenGL, but it would take me years to write even the most basic fragment shaders that would be required.
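
For illustration, a minimal sketch of the SDL2 side of such a custom pipeline: creating a Vulkan-capable window and querying the instance extensions a render pipeline would have to enable. The VkInstance, device and shader setup are left out, and the names here are illustrative, not from any real project.

```c
/* Minimal sketch: SDL2 creating a Vulkan-capable window and listing the
 * instance extensions a custom Vulkan render pipeline would have to enable. */
#include <SDL.h>
#include <SDL_vulkan.h>
#include <stdio.h>

int main(void) {
    if (SDL_Init(SDL_INIT_VIDEO) != 0) return 1;

    SDL_Window *window = SDL_CreateWindow("vulkan window sketch",
                                          SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED,
                                          1280, 720, SDL_WINDOW_VULKAN);
    if (!window) {
        printf("Vulkan window not available: %s\n", SDL_GetError());
        SDL_Quit();
        return 1;
    }

    /* First call queries the count, second call fills the names. */
    unsigned int count = 0;
    SDL_Vulkan_GetInstanceExtensions(window, &count, NULL);

    const char **names = SDL_malloc(count * sizeof(const char *));
    SDL_Vulkan_GetInstanceExtensions(window, &count, names);

    for (unsigned int i = 0; i < count; ++i) {
        printf("required instance extension: %s\n", names[i]);
    }

    SDL_free(names);
    SDL_DestroyWindow(window);
    SDL_Quit();
    return 0;
}
```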

M.A.X. uses a software renderer, which means that the graphics are rendered onto a surface by the CPU, 100% by software algorithms, without HW acceleration. Then SDL2 takes the rendered surface, converts it to the screen format and draws the result onto the screen. This SDL2 part is HW accelerated as mentioned earlier.
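
A minimal sketch of that path, assuming a plain SDL2 setup (the real M.A.X. Port code is organized differently): the CPU fills an 8-bit indexed surface, SDL2 converts it to the screen pixel format and the renderer scales it onto the window.

```c
/* Minimal sketch: CPU-rendered 8-bit indexed surface presented via SDL2.
 * Names and sizes are illustrative, not taken from the M.A.X. Port sources. */
#include <SDL.h>

int main(void) {
    if (SDL_Init(SDL_INIT_VIDEO) != 0) return 1;

    SDL_Window *window = SDL_CreateWindow("software render sketch",
                                          SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED,
                                          640, 480, 0);
    SDL_Renderer *renderer = SDL_CreateRenderer(window, -1, SDL_RENDERER_ACCELERATED);

    /* 8-bit indexed surface: this is what the CPU-side renderer draws into. */
    SDL_Surface *indexed = SDL_CreateRGBSurfaceWithFormat(0, 640, 480, 8,
                                                          SDL_PIXELFORMAT_INDEX8);

    /* Fill a 256-entry grayscale palette as a stand-in for the game palette. */
    SDL_Color palette[256];
    for (int i = 0; i < 256; ++i) {
        palette[i].r = palette[i].g = palette[i].b = (Uint8)i;
        palette[i].a = 255;
    }
    SDL_SetPaletteColors(indexed->format->palette, palette, 0, 256);

    /* "Software rendering": the CPU writes palette indices into the surface. */
    SDL_memset(indexed->pixels, 42, (size_t)indexed->pitch * indexed->h);

    /* SDL2 converts the indexed image to the screen format and scales it up. */
    SDL_Surface *converted = SDL_ConvertSurfaceFormat(indexed, SDL_PIXELFORMAT_ARGB8888, 0);
    SDL_Texture *texture = SDL_CreateTextureFromSurface(renderer, converted);

    SDL_RenderClear(renderer);
    SDL_RenderCopy(renderer, texture, NULL, NULL);
    SDL_RenderPresent(renderer);
    SDL_Delay(2000);

    SDL_DestroyTexture(texture);
    SDL_FreeSurface(converted);
    SDL_FreeSurface(indexed);
    SDL_DestroyRenderer(renderer);
    SDL_DestroyWindow(window);
    SDL_Quit();
    return 0;
}
```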

> 4K not working; it takes forever to see the main menu when the app opens, and there are multiple "not responding" warnings.

TL;DR: This is not an actual bug. It is a performance bottleneck from the CPU.

Complete story:
M.A.X. itself is unable to use more than one CPU core or more than one thread. The GUI and game logic, the software graphics renderer, the path finding engine and the computer player algorithms are all executed sequentially on a single CPU core. Please do not forget that M.A.X. Port recreates the original M.A.X. from 1996, which was written for MS-DOS and ran on a 100-200 MHz single core, single thread CPU; the OS was not able to manage parallel execution of tasks at all (IRQs excluded), and until 3Dfx there was no hardware acceleration for graphics except for one very important feature of the VGA and SVGA standards: 256-color indexed video modes supported double buffered color palette swaps basically for free.

This means that by swapping out 768 bytes using the CPU we were able to change the entire color map that appeared on the screen. The caustic effects like water animation, blinking buildings and smoke from factories are all just color effects. At 7 FPS the CPU copied 768 bytes to the VGA's system palette array and the VGA HW did all the rest for free: the VGA hardware took each pixel (640x480 = 307,200 pixels) and changed its color on the screen without any CPU computational power, instantaneously, at 60 FPS or better.
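
For illustration, this is roughly what such a palette swap looked like on DOS-era hardware. A minimal sketch, assuming a DOS compiler that provides outportb(); ports 0x3C8/0x3C9 are the standard VGA DAC write index and data ports.

```c
/* Minimal sketch of a full VGA DAC palette upload under MS-DOS.
 * Assumes a DOS compiler (Borland/DJGPP style) providing outportb().
 * 256 entries x 3 components = 768 bytes; each component is 6 bits (0..63). */
#include <dos.h>

#define VGA_DAC_WRITE_INDEX 0x3C8
#define VGA_DAC_DATA        0x3C9

void vga_set_palette(const unsigned char palette[768])
{
    int i;

    outportb(VGA_DAC_WRITE_INDEX, 0); /* start at color index 0 */

    /* The DAC index auto-increments after every third data write,
     * so the whole 768-byte palette is streamed in one loop. */
    for (i = 0; i < 768; ++i) {
        outportb(VGA_DAC_DATA, palette[i]);
    }
}
```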

Now that operating systems like Windows do not support indexed color modes at all, there is no VGA palette array and no free, cheap color animation; we have to do all of these costly operations in SW, executed on the CPU or the GPU.

M.A.X. Port therefore simulates an indexed system palette and a VGA palette array in SW. SDL2 performs the color palette translation pixel by pixel (with some or no hardware acceleration), translates the indexed-palette rendered image to the OS screen format, automatically rescales the rendered texture to fit the screen resolution and performs various filtering to remove artifacts and aliasing errors. And on top of all of this, the software renderer simulates a Z buffer in SW.

So while under MS-DOS the single CPU core copied 768 bytes and the HW instantly did everything else for the 640x480 screen and its 307,200 pixels, M.A.X. Port, still running on a single CPU core, copies the 768 bytes, then converts 3840x2160 = 8,294,400 pixels to use the new palette colors, then translates the resulting indexed image of 8,294,400 pixels to the OS's preferred texture format, and then renders the resulting texture onto the screen after all the post-processing effects done by SDL2 that I mentioned earlier, which transform the 8,294,400 pixels yet again.
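
A minimal sketch of just the palette-resolution step for a single 4K frame (buffer and palette names are illustrative, not the real M.A.X. Port code); every one of the 8,294,400 pixels has to be touched by the CPU on every palette change:

```c
/* Minimal sketch: resolving an 8-bit indexed 4K frame to 32-bit ARGB on the CPU.
 * Buffer and palette names are illustrative, not taken from M.A.X. Port. */
#include <stdint.h>
#include <stddef.h>

#define FRAME_WIDTH  3840
#define FRAME_HEIGHT 2160
#define FRAME_PIXELS ((size_t)FRAME_WIDTH * FRAME_HEIGHT) /* 8,294,400 pixels */

void resolve_palette(const uint8_t *indexed_frame,   /* FRAME_PIXELS bytes      */
                     const uint32_t palette[256],    /* 256 ARGB palette colors */
                     uint32_t *argb_frame)           /* FRAME_PIXELS entries    */
{
    /* One table lookup and one store per pixel: ~8.3 million iterations
     * per frame, repeated on every palette animation tick. */
    for (size_t i = 0; i < FRAME_PIXELS; ++i) {
        argb_frame[i] = palette[indexed_frame[i]];
    }
}
```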

Our single CPU core is unable to perform that much computation, and this is why at 4K resolution the game slows down so much that the OS thinks the application is not responding.

The solution to this is to use a 3D HW render API like Vulkan or OpenGL, something like this:

  • Create a window and multiple surfaces or textures that could hold the ~50 layers of graphics that M.A.X. draws to.
  • Load all game assets like sprites, palettes, etc. into the GPU API specific data formats at the beginning of the game, scaled to the selected screen resolution, or create mipmaps.
  • Set up a top-down viewport, Z buffer and whatever else is required to optimize the processing of the multi-layer 2D graphics rendering steps, probably by defining various maps.
  • Create basic vertex and geometry shaders representing 2D sprites, effects and graphics layers.
  • Create complex fragment shaders that could simulate the color animations and indexed color blending operations on HW.
  • Create a new game logic integration that can move stuff around on the resulting screen, change viewport positions based on mouse and keyboard input, etc.
  • Split the graphics rendering and the game logic so that the graphics and audio renderers can run asynchronously and independently from the frame rate of the game logic.

I cannot do any of these work packages due to a lack of skills and resources. Maybe later, when the game itself is defect-free in the sense that the game logic is at least on par with the original M.A.X.

Alternative design to Vulkan and OpenGL: as of now GNW provides the windowing system for M.A.X., and GNW windows are wrapped by a game-specific windowing module. It would be possible to replace GNW's windowing system with imGui, SDL2 surfaces or directly with SDL2 textures. This way one or two layers of simulation could be removed and the resulting renderer would potentially be much faster.

The big question is what is better: a big bang (Vulkan) or a step-by-step approach that might turn out to be a waste of time in the long run (replacing the GNW library and API with imGui, SDL2 or similar backends).

I hope this answers your questions and describes the root cause of the unplayable 4K resolution.


segatrade commented on May 22, 2024

Thank you. 4K probably isn't important; FHD and QHD are more than enough after VGA. It would be more interesting to get access to create AI or scripts for unit management.


klei1984 commented on May 22, 2024

I forgot one more important aspect. The game renders to virtual surfaces that are scaled by SDL2 to match the screen resolution.

E.g. if the game is rendered at 640x480 (the internal game resolution is small) and the screen is actually 1920x1080, then SDL2 automatically scales the small image up to the big screen size.

If the internal game resolution is 1920x1080 and the screen is also 1920x1080, then SDL2 does not have to scale anything, as the internal resolution and the screen resolution are the same.

It can also happen that you set the internal game resolution to 3840x2160 while your screen is actually smaller, e.g. 2560x1440; then SDL2 needs to scale a huge texture down to the smaller screen resolution.

There are 3 screen_mode settings:

  • 0: windowed.
  • 1: the hardware video mode is changed to match the internal game resolution as closely as possible.
  • 2: the internal game resolution is scaled to match the currently used video mode and screen resolution of the OS.

E.g. if your OS sets the screen size to 1920x1080 and your screen_mode setting is 2, then the resulting image rendered onto the screen will always be 1920x1080. The internal resolution could be set to 4K; the game will be rendered in 4K, then SDL2 will shrink the 4K image to fit the Full HD screen.

On the other hand, if screen_mode is set to 1, your monitor actually supports 4K and you set a 4K internal game resolution, then in theory the game will switch to a 4K screen mode even if your normal OS desktop resolution is just Full HD.
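
A minimal sketch of what the three modes roughly map to on the SDL2 side, assuming a plain SDL2 window setup (function and variable names are illustrative, not the M.A.X. Port code):

```c
/* Minimal sketch: the three screen_mode options expressed as SDL2 window setups.
 * game_w/game_h are the internal game resolution; names are illustrative only. */
#include <SDL.h>

void setup_presentation(int screen_mode, int game_w, int game_h,
                        SDL_Window **out_window, SDL_Renderer **out_renderer)
{
    Uint32 flags = 0;

    if (screen_mode == 1) {
        /* 1: ask the display for a real video mode close to the game resolution. */
        flags = SDL_WINDOW_FULLSCREEN;
    } else if (screen_mode == 2) {
        /* 2: keep the OS desktop video mode and let SDL2 scale the image. */
        flags = SDL_WINDOW_FULLSCREEN_DESKTOP;
    } /* 0: plain window with the internal game resolution. */

    *out_window = SDL_CreateWindow("M.A.X.",
                                   SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED,
                                   game_w, game_h, flags);
    *out_renderer = SDL_CreateRenderer(*out_window, -1, SDL_RENDERER_ACCELERATED);

    /* The renderer always presents the internal game resolution; in mode 2
     * SDL2 scales it up or down to the actual desktop resolution. */
    SDL_RenderSetLogicalSize(*out_renderer, game_w, game_h);
}
```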


klei1984 commented on May 22, 2024

I agree, scriptable or limitless external AI modules are much more interesting :)


segatrade commented on May 22, 2024

Yes. My screen and OS are 4K, 42.5", scale 100% (no scaling), so the difference is noticeable (~110 PPI). That is unlike 4K on a 15.6" screen, where 4K is almost useless because it looks like QHD and even FHD looks 'retina'.

FHD and QHD make M.A.X. look modern and are, I think, enough for new players who can't play in VGA. A 15.6" FHD panel is probably the most popular PC screen now.

OpenGL seems legacy (though it can take decades to die). Vulkan (which is also the next OpenGL, just renamed because the changes are so big) is the future.

