
template_mister's People

Contributors

bellwood420, grabulosaure, hackshed, jimmystones, jotego, kitrinx, mankeli, matijaerceg, milanpolle, mjy71, nanoant, paulb-nl, rampa069, robertpeip, s0urceror, schmid, skooterblog, sorgelig, wickerwaka, zakk4223


template_mister's Issues

adder for no reason?

It looks like ar=0 selects the internally defined aspect ratio. Then the core tells the framework to use an arbitrary aspect ratio by setting VIDEO_ARX to ar-1. But does it have to be ar-1? ar-1 means that an adder will be instantiated. Could this just be zero instead?

Proposed:

assign VIDEO_ARX = (!ar) ? 12'd4 : 12'd0;

Current:

assign VIDEO_ARX = (!ar) ? 12'd4 : (ar - 1'd1);
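To make the trade-off concrete, here is a small Python model of the two assignments (Python purely for illustration; `ar` is assumed to be a small unsigned selector where 0 means "use the core's internal aspect ratio"):

```python
def video_arx_current(ar):
    # Current behavior: a non-zero ar is passed through minus one,
    # which instantiates a 12-bit subtractor in hardware.
    return 4 if ar == 0 else (ar - 1) & 0xFFF

def video_arx_proposed(ar):
    # Proposed: any non-zero ar yields 0, so no subtractor is needed.
    return 4 if ar == 0 else 0
```

Note that for ar = 1 both versions already yield 0, so the question reduces to whether the framework ever distinguishes the values produced for ar >= 2.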

Direct Video interlace

Some cores, like Genesis and PSX, have two video modes: normal (240 lines) and high-resolution interlaced (480 lines). The normal mode is effectively 240p and does not need to be deinterlaced by the scaler, so the cores keep VGA_F1 at 0 every field.

This causes sync issues with Direct Video because it uses the field signal to generate the vde signal and the cores still output 2 different line counts every field.

Template_MiSTer/sys/sys_top.v

Lines 1162 to 1170 in c1080e2

if(~old_hs && vga_hs_osd) begin
old_vs <= vga_vs_osd;
if(~&vcnt) vcnt <= vcnt + 1'd1;
if(~old_vs & vga_vs_osd & ~f1) vsz <= vcnt;
if(old_vs & ~vga_vs_osd) vcnt <= 0;
if(vcnt == 1) vde <= 1;
if(vcnt == vsz - 3) vde <= 0;
end

To fix this, we can either change the cores to always output the field signal when Direct Video is enabled, or change the framework to detect the differing field lengths instead of relying on the field signal. However, some cores, such as Atari 2600/7800, may not have the field signal available.

I have tested below with some cores and it works well.

Change this

if(~old_vs & vga_vs_osd & ~f1) vsz <= vcnt;

to

if(~old_vs & vga_vs_osd) begin
    if (vcnt != vcnt_ll || vcnt < vcnt_l) vsz <= vcnt;
    vcnt_l <= vcnt;
    vcnt_ll <= vcnt_l;
end
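As a sanity check, here is a behavioral Python model of the proposed update rule (a sketch, not the actual Verilog). At each vsync edge it compares the just-finished field's line count with the counts from one and two fields ago, so alternating 262/263-line interlaced fields lock vsz to the shorter field, while a constant progressive line count is tracked directly:

```python
def step(vcnt, state):
    """One vsync edge of the proposed logic.

    state = (vsz, vcnt_l, vcnt_ll): the stored frame height and the
    line counts of the previous one and two fields.
    """
    vsz, vcnt_l, vcnt_ll = state
    # Update vsz when the count differs from two fields ago (progressive
    # change) or is the shorter of an alternating interlaced pair.
    if vcnt != vcnt_ll or vcnt < vcnt_l:
        vsz = vcnt
    return (vsz, vcnt, vcnt_l)

def run(fields):
    """Feed a sequence of per-field line counts; return the final vsz."""
    state = (0, 0, 0)
    for vcnt in fields:
        state = step(vcnt, state)
    return state[0]
```

For example, run([262, 263] * 4) settles on 262 (the shorter field), while run([240] * 4) simply tracks 240.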

Current version does not compile

Template_MiSTer was compiling as of May 16, but currently (as of June 1) does not compile.
Looks to be related to the May 30 commit.

Error (10996): Verilog HDL error at hps_io.sv(28): parameter "CONF_STR" has no initial or actual value

(EDIT ... no bug - my mistake, sorry!) Video info sometimes shows the wrong original resolution

This happens e.g. in MINIMIG, I think. Also, GAUNT2's original resolution is shown as 336:240, but it is actually using 448:256.

I've tested this by tuning vscale_border and custom aspect ratios to make the signal match the pixels on my flatpanel.

In GAUNT2 at integers of 448 and 256 (AR 1.75) it matches the pixels exactly/perfectly (all pixels become even) but the video info says it should be integers of 336 and 240 (AR 1.5 which doesn't create integer scaling).

Screenshots are saved as 336x240, though, so why must the output be scaled up using integers of 448x256? Quite confusing :)

In Minimig, some screenshots are saved as 1024x256 or so, even though that's not what it's using on the flat panel.

I tested this by using integer scaling and found the perfect scaling for both Gauntlet 1 and Gauntlet 2: they both report the same video info and screenshot size of 336x240, but to achieve perfect integer scaling they need to be treated as 448x256.

arcade_video: support for rotation of 24bpp images

Currently the arcade_video module only nominally supports 24bpp images; in practice, compilation fails with an insufficient-BRAM error, because the rotator module needs more BRAM than is available for 24bpp images (even at low resolutions such as 384x224).
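A back-of-the-envelope calculation shows the scale of the problem. The double-buffering assumption and the ~5.5 Mbit M10K total for the DE10-Nano's Cyclone V are my own figures, not taken from the rotator code:

```python
# Storage needed to rotate one 384x224 frame at 24 bits per pixel.
W, H, BPP = 384, 224, 24
frame_bits = W * H * BPP                 # 2,064,384 bits (~2 Mbit)

# If the rotator double-buffers (write one frame while reading the other):
double_buffer_bits = 2 * frame_bits      # ~4.1 Mbit

# Approximate total M10K block RAM on the DE10-Nano's Cyclone V (assumption):
TOTAL_M10K_BITS = 5_570_000
fraction_used = double_buffer_bits / TOTAL_M10K_BITS  # roughly 3/4 of the chip
```

Under these assumptions, a double-buffered 24bpp rotation of even a 384x224 frame would consume roughly three quarters of the FPGA's block RAM before the rest of the core gets any.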

No Video at 1080P for cores built with the latest Template_MiSTer.

TLDR

It appears that some recent change in the MiSTer Template makes 1080P through my A/V Receiver to my TV no longer work, though 720P continues to work. When hooked directly to the TV, 1080P works. Older builds of cores, including the PSX one, work at both 720P and 1080P. The receiver is older, but previously had no issues. It's a Yamaha RX-V667.

I believe the issue may be due to this change:
MiSTer-devel/PSX_MiSTer@5aeb434

Which I believe was part of this issue:
#59

Full Issue Description

I've been using the PSX core betas for months without issue, running at 1080P with vsync_adjust set to 2, though also using 0 on occasion for some games that switch resolution frequently.

Since the PSX core was recently released I updated via the 1.4 Downloader to the latest MiSTer Main and the released PSX core, both dated 5/11/2022. I've also verified the SHA hashes of both to confirm they weren't corrupted. I'm also using a CEC-Less cable out of the MiSTer to the input.

After updating, I can no longer get 1080P with any vsync_adjust value, though 720P continues to work regardless of the vsync_adjust setting. The most recent PSX core test build from 5/9/2022 also continues to work fine at 1080P on the same MiSTer Main. In this case 'No Video' means the TV does not see a signal at all; it doesn't even attempt to sync.

Upon advice in the #debug channel on the discord I tried another recent core that would have the same change as the official PSX release from 5/11/2022, this being the Jackal Arcade core. It also exhibits the same issue, with 1080P output being broken, but 720P output being fine.

After some further recommendations from the #debug chat, I tried with a different TV, where 1080P worked as expected. I then tried plugging the MiSTer directly into the problem TV, rather than through my A/V setup, and surprisingly 1080P also worked there. However, it stopped working again once I plugged MiSTer back into my A/V Receiver, a Yamaha RX-V667.

So it seems like some recent change in the template has broken 1080P when routed through my receiver to my TV. With my current A/V Setup having the MiSTer plugged directly into the TV isn't ideal, and I would prefer to continue routing it through my Yamaha RX-V667, but I do understand this may be an edge case and I'll just have to redo my A/V setup to accommodate having the MiSTer attached directly to the TV.

MiSTer sends the wrong VIC code in the AVI InfoFrame for resolutions above 1920x1080, forcing an incorrect aspect ratio (16:9).

Since I bought a Magewell USB Capture HDMI I noticed that it always detects my MiSTer output as 16:9, even when I'm using resolutions with other aspect ratios, like 1920x1200 (8:5) or 1920x1440 (4:3). That didn't happen with other sources, like an OSSC in 1920x1200, for instance, which is correctly identified as 8:5.

After some research I found out MiSTer is sending the wrong VIC info in the AVI InfoFrame. Apparently, the automatic VIC detection of the HDMI chip (ADV7513) fails when we set resolutions or frequencies above its specifications. According to its specs, the limit is 165 MHz and 1920x1080.

When MiSTer is set to 1920x1440, the VIC is set to 16, which corresponds to 1920x1080 @ 60 Hz with a 16:9 aspect ratio. When the sink (in this case Magewell USB Capture HDMI) realizes this mismatch between the resolution and refresh rate in the actual image and in the VIC field, it uses what is in the image. However, there is no aspect ratio in the image, so it keeps using the wrong information set in VIC.

I confirmed this issue by building the SNES core with the VIC detection disabled. This is accomplished by disabling pixel repetition and setting VIC manually, according to the ADV7513 Programming Guide.

So what I did in hdmi_config.sv was:

  • set register 0x3B[6:5] to '10'
  • set register 0x3C to '00000000'

This fixed the issue and the aspect ratio is correctly identified in any resolution.

My suggestion is to update the framework so the automatic VIC detection is not used when the resolution is above 1920x1080 or the frequency is above 165 MHz. In these cases, VIC should be set to zero. There will be no loss since there are no valid VIC numbers for 1920x1440 resolutions (or any resolution above 1080p and below 2160p).
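The suggested rule is simple to state. A sketch in Python (for illustration only; the 165 MHz and 1920x1080 limits are the ADV7513 figures quoted above):

```python
ADV7513_MAX_PCLK_MHZ = 165.0  # pixel clock limit per the ADV7513 specs

def use_auto_vic(width, height, pclk_mhz):
    """True if the ADV7513's automatic VIC detection can be trusted.

    Outside the chip's specified limits, pixel repetition should be
    disabled and VIC forced to 0 instead.
    """
    return (width <= 1920 and height <= 1080
            and pclk_mhz <= ADV7513_MAX_PCLK_MHZ)
```

For example, use_auto_vic(1920, 1440, 185.0) is False, so VIC would be forced to 0 for that mode.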

While I was working in hdmi_config.sv, I also noticed that the A0 field in the AVI InfoFrame (0x55[4]) is set to 1, which indicates that Active Format Data is present in Data Byte 2 bits R3 through R0. However, R3 to R0 (0x56[3:0]) is currently set to '0000', which is not valid; it should be '1000', meaning "Same as Picture Aspect Ratio". Moreover, when the aspect ratio is 16:9 or 4:3, the picture aspect ratio bits (0x56[5:4]) can be set accordingly. That is not mandatory, but I think it would increase compatibility with some devices. When the aspect ratio is anything else, those bits should be set to '00'.

Finally, I also did some tests by setting the IT content to Game (0x57[7] to '1' and 0x59[5:4] to '11') and it makes my TV automatically change to game mode, like modern systems like PlayStation 4, PlayStation 5 and Nintendo Switch do. I believe this should be a mister.ini option, so people could choose the IT mode (Game, Graphics, etc.) or IT disabled according to what is best for their TVs/monitors.

I would implement this stuff myself but I have no experience with FPGA programming and I have no idea how to bring the variables I need to check into hdmi_config.sv, like frequency and resolution.

My current test hdmi_config.sv file is here. It is always sending VIC=0, but no VIC is always better than the wrong VIC, since with no VIC the capture device defaults to the square pixels aspect ratio.
https://github.com/skooterblog/Template_MiSTer/blob/master/sys/hdmi_config.sv

A video demonstrating the correct behavior with 1920x1440 and VIC=0 is here:
https://www.youtube.com/watch?v=2AW-cYGV8Kc
The video starts with the current core running showing the wrong aspect ratio (16:9) and later I load my test build which shows the correct aspect ratio (4:3).

I'd like to thank wickerwaka and Zakk for all the knowledge shared on the official Discord server, which helped me a lot in figuring out what was causing these issues and how to fix them.

Scaler won't downscale

I noticed this several months ago when the scaler code changed: if you try to display something that has more vertical resolution than your display, it won't be downscaled. Instead, you get the top rows of pixels and the bottom rows are cut off. Any chance this can be fixed? It used to downscale correctly. This affects at least most arcade cores and ao486. I know you would lose horizontal lines in the image, but at least you could see the whole screen instead of only the top half. Specifically, I am using a 240p mode on my CRT TV and can't see the bottom half of anything higher than 240 lines of vertical resolution.

Also, since I am on the subject of the scaler, it would be amazing if 480i support were added. This would allow the full vertical resolution of 480 lines on CRT TVs.

Thanks for your consideration to whoever can implement these features.

Adding a CRT Adjust option

I would like to know if you'd consider adding a CRT adjust option. Plenty of people use CRTs with MiSTer, and this option would be useful for both SD CRT TVs and VGA monitors.

Every time I switch cores I have to re-center the image on my monitor, which gets tiresome. There isn't one perfect setup for everyone across all monitors, either, since we are talking about analog signals. Having a way to set the CRT centering in the core would allow everything to work as-is, with an overscan area as originally intended on consoles/arcade boards.

Jotego's own cores have a CRT adjust, so that's possibly something that could be implemented directly into the MiSTer framework, if I am not mistaken.

Adding a framedoubler for 240p@120hz support

Adding a framedoubler option would allow outputting a 240p@120Hz image (for native 240p on VGA monitors or on some supported HDMI monitors).

It could also eventually be extended with an optional BFI implementation that inserts a black frame between each of the 60 source frames per second, improving motion clarity with no additional input lag.

Consider moving the emu ports to an include file

I have made this change to my framework which you may consider useful for the general MiSTer framework:

  • the core ports are defined in a common file to all games

It looks like this when you use it:

module jtkicker_game(
    `include "jtframe_game_ports.inc" // see $JTFRAME/hdl/inc/jtframe_game_ports.inc
);

You can check out the included file here.

From a maintenance point of view, this is very practical because features can be added and fixed more easily for all cores. From the developer's point of view, it takes clutter away and removes the need to keep track of new ports added via MISTER_ macros.

It may fit my development tool chain better than yours, but I just wanted to share my two cents. Please close the issue at any time. I am not requesting that you actually implement it, but just sharing some thoughts.

MiSTer volume settings not respected during core boot sequence

Many arcade cores emit an obnoxious beep, siren, or other noise when starting up. This sound plays at apparent full volume regardless of the MiSTer volume or mute settings. Note that once the core has booted and the display starts to output (the resolution banner is displayed), the volume settings are correctly applied.

The expected behaviour is that the current MiSTer volume settings should be respected at all times.

This is an annoying and possibly dangerous issue, particularly when playing through headphones or with amplified speakers.

I have compiled a list of the arcade cores that I have found so far that exhibit this issue:

  • a.galaxn (Mr. Do’s Nightmare)
  • azurian
  • blkhole
  • canyon
  • Clean Sweep
  • devilfsg
  • dominos (loud!)
  • galaxian
  • kingball
  • luctoday
  • mars ?
  • mooncrgx? (Moon cresta)
  • mpatrol (popping)
  • omegab
  • orbitron
  • phoenix (siren)
  • pleiads (siren!)
  • shtrider
  • travrusa (popping)
  • ultratnk (engine sound)
  • uniwars
  • victorycb
  • warofbug

I would recommend trying dominos and pleiads to get a good sense of how "alarming" these startup sounds, played at full volume, can be. :)

Adding a 100% scanlines option in the scandoubler fx

The way the scandoubler fx is written only allows up to a 75% scanlines option. Since the scanline filters do not work while the scandoubler is in use, this feature remains relevant: a 100% scanlines option would make it possible to simulate a lower-resolution PVM on a computer monitor.

On a CRT, 480p already has scanlines, so fake scanlines have to match/scale the existing ones. A 100% option therefore makes sense, while 25% and 75% look odd when paired with the monitor's native scanlines.
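For reference, the scanline percentage is just how strongly the inserted lines are darkened. A minimal model (assuming linear attenuation of an 8-bit channel value):

```python
def scanline_pixel(value, strength_pct):
    """Attenuate an 8-bit channel value on a scanline row.

    strength_pct=100 makes the row fully black, matching the dark
    gaps of a genuine low-resolution CRT.
    """
    return value * (100 - strength_pct) // 100
```

At 100% the inserted lines go fully black, which is exactly what this request asks for; 25% and 75% leave them partially lit.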

Separate audio for clock: why?

I found today that a new PLL and a new clock signal were defined for audio. This makes no sense: what you care about is the sampling rate of the signals, not the actual clock. You can use whatever clock you want as long as your sampling period is uniform.

With this extra PLL you are consuming resources, and now there is another clock tree to balance in the design. Compilation time goes up too. And what about the synchronizer flip-flops needed to cross from the emu clock domain to the audio clock? Did you see all the timing violations the new clock brings up?

What you need is not a different clock. What you need is each core to provide a sample strobe to tell you when there is a new sample.

I propose to remove the clock signal. If you want to let the users play with cheap IIR filters you can do it with a clock enable signal to serve as your sample period.
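A clock-enable strobe like this is a plain fractional (phase-accumulator) divider and needs no extra PLL. A behavioral sketch in Python (the clock and sample-rate figures are examples, not taken from the framework):

```python
def count_strobes(clk_hz, sample_hz, n_cycles):
    """Phase-accumulator divider: counts how many sample strobes fire.

    Each clock cycle adds the sample rate to an accumulator; whenever
    it reaches the clock frequency, one sample period has elapsed and
    the strobe fires (in hardware, a one-cycle clock enable).
    """
    acc = 0
    fired = 0
    for _ in range(n_cycles):
        acc += sample_hz
        if acc >= clk_hz:
            acc -= clk_hz
            fired += 1
    return fired
```

Over one simulated second at a 1 MHz clock, count_strobes(1_000_000, 48_000, 1_000_000) fires exactly 48,000 times; in hardware the same accumulator becomes a clock enable in the core's existing clock domain.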

Typo in comparison

There appears to be a typo in osd.v on line 134.

v_cnt_h <= v_cnt <= osd_t;

should be

v_cnt_h <= v_cnt < osd_t;

Integer scaling modes work incorrectly on a 4:3 display

The current logic for the integer scaling options (both those in the menu core and in the MiSTer ini file) is based only on the vertical resolutions of the source and destination images, assuming a wide-screen output display. On a 4:3 aspect-ratio display, when using cores with a wider aspect ratio (Atari Lynx, Game Boy Advance, WonderSwan), this strategy fails: the resulting image is too wide for the available 4:3 horizontal resolution and gets squashed down to fit. This results in both an incorrect aspect ratio and non-integer horizontal scaling.

The LCD handheld cores need proper integer scaling for shadow masks to work correctly.

Example with the Atari Lynx core and a 4:3 iPad monitor:
Atari Lynx resolution: 160 x 102, square pixels, 160:102 display aspect-ratio
iPad monitor display resolution: 2048 x 1536, square pixels, 4:3 display aspect-ratio

Horizontal upscaling (desired in this case):
2048 / 160 = 12.8 → 12x up-scaling, 12 x 12 pixel shadow mask pattern, resulting resolution: 1920 x 1224. This yields a correct aspect ratio and shadow mask pattern on this monitor.

Vertical upscaling (current logic):
1536 / 102 = 15.06 → 15x up-scaling, 15 x 15 pixel shadow mask pattern, resulting resolution: 2400 x 1530.

As 2400 is much wider than the available 2048 pixels, the image gets squashed down horizontally to 2048 x 1530, resulting in an aspect ratio close to 4:3. Shadow masks also no longer work correctly.

One correct solution would be to calculate scaling factors from both the horizontal and vertical resolutions and then choose the smaller of the two. This would work correctly for all combinations of source and destination aspect ratios.
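The proposed rule can be sketched in a few lines (Python for illustration; integer_scale is a hypothetical helper matching the numbers above):

```python
def integer_scale(src_w, src_h, dst_w, dst_h):
    """Largest integer factor whose scaled image fits both dimensions."""
    return min(dst_w // src_w, dst_h // src_h)
```

For the Lynx example, integer_scale(160, 102, 2048, 1536) returns 12, giving the 1920 x 1224 image described above instead of the 15x factor chosen by the vertical-only logic.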

DAC board (PCM5102) no sound in some cores

I connected a little DAC board (PCM5102) like this:

  • GPIO1 pin 2: BCK
  • GPIO1 pin 7: DIN
  • GPIO1 pin 8: LCK
  • VIN: can be 3.3 V or 5 V; it does not matter.
  • GND: any available GND
  • The big switch near the Ethernet port is ON (I guess SW[0]?).

On some cores (SNES, MegaDrive, ao486, Minimig, SMS, NES, Neo Geo, menu with BGM script), audio through GPIO1 is completely absent.

I found a change in sys_top which causes the issue:

This is from a working core (for example Gameboy):
assign AUDIO_SPDIF = SW[3] ? 1'bZ : SW[0] ? HDMI_LRCLK : spdif;
assign AUDIO_R = SW[3] ? 1'bZ : SW[0] ? HDMI_I2S : analog_r;
assign AUDIO_L = SW[3] ? 1'bZ : SW[0] ? HDMI_SCLK : analog_l;

This is from a non working core (Megadrive):
assign AUDIO_SPDIF = av_dis ? 1'bZ : (SW[0] | mcp_en) ? HDMI_LRCLK : spdif;
assign AUDIO_R = av_dis ? 1'bZ : (SW[0] | mcp_en) ? HDMI_I2S : analog_r;
assign AUDIO_L = av_dis ? 1'bZ : (SW[0] | mcp_en) ? HDMI_SCLK : analog_l;

I replaced this part in the Megadrive core with the working version, compiled it, and sound worked fine.

Maybe it is not an issue at all, and I forgot to configure something else on .ini files, I don't know.

Thanks.

Adding support for 960i (2*480i) in the scandoubler or another way to display an interlaced source using the scandoubler

A few games output 480i (for example, Sonic 2's multiplayer mode), which the scandoubler does not support. This means that when outputting to a VGA monitor you see no image, as the core attempts to output a 15 kHz signal whenever the game switches to 480i.

The scandoubler could, at no additional cost, double the interlaced signal to 960i so an image can be displayed.

If this is not an option for whatever reason, would it be possible to force the scaler for interlaced resolutions only, or to use it on a per-game basis? (I assume bob deinterlacing can't be implemented in the scandoubler?)

pixel jitter

Some games, especially Super Hang On, show pixel jittering.

Looking closely, all pixels jitter a tiny bit around. This becomes especially apparent when combined with filters like shadow masks.

A similar thing happened with the first split versions of the Xain'd Sleena core (just in case that might help).

I am using vsync_adjust=2 (mentioning it in case this only happens in this low-latency mode).

According to Jotego, this is a problem in the framework if I understand correctly:
jotego/jtcores#226

Code comments in shadow mask code

I was looking at the shadow mask code and explanative comments.

cdcb29f#diff-31aef25ffcf4833fc2d16e5f3d8f327e12febf4e6fb25e731cdf16cd16d5968aR99

I don't know Verilog; I can barely follow what is going on.
The comments seem to be outdated.
It seems to me that the LUT has 64 slots and that the mask can effectively be 8 by 8.

Both the reference to the maximum height and the claim that the top-right corner stores the width/height max index seem incorrect or outdated.

Reading a second mouse

It is not clear, at least to me, how to read the data for a second mouse. The module hps_io seems to provide data for a single device. Does MiSTer support connecting two mouse devices?

The use case is arcade games for two players, where the mouse is the most convenient for most users.

PS: Two mouse devices are supported on MiST, so this seems to be a more common need than one would have thought.
