
RETROARCH - The all-in-one emulator dreams are made of, son

Yep, that's working pretty well. Thanks!

You can then compress the resulting .bin/.cue files into .PBP files to save disk space if you want. I use a program called PSX2PSP to convert my rips into EBOOT.PBP files, which can then be renamed to whatever you want to call them. RetroArch will play them in that format. There are a handful of EA titles that won't work if compressed (Diablo, Need for Speed, PGA Tour 98), so be sure to try them out before you delete the BIN/CUE files.
 

The Stealth Fox

Junior Member
I can't seem to use 2-player mode in Genesis Plus GX games on the web app. Player 1 controls player 2, and player 2's controller won't work. Why is that?
 

maks

Member
I've come across a minor issue with using the 'F1' key to access the Main Menu. I run RetroArch in an arcade cabinet. Sometimes I'll use a keyboard, but I'm mostly using joystick controls that are recognized as an Xbox 360 pad.

To access the Main Menu on the joystick I need to use a Select+Start button combination. It looks like this...

input_enable_hotkey_btn = "7"
input_menu_toggle_btn = "6"


With this set I can no longer access the Main Menu with just the F1 key. It seems to require a hotkey even though I keep...

input_enable_hotkey = "nul"

Any way to keep it to just 'F1' on keyboard and Start+Select on joystick?
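(One arrangement that sometimes gets suggested is to keep the joypad-side hotkey binds in the pad's autoconfig file rather than in retroarch.cfg, so the keyboard toggle isn't gated by them; whether that actually restores bare F1 can depend on the build, so treat this as a sketch, with the file name below being just an example:)

# autoconfig/xinput/XInput Controller.cfg (example name - use whatever file matches your cabinet's pad)
input_enable_hotkey_btn = "7"
input_menu_toggle_btn = "6"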
 
Can someone explain what resolution and settings to use for a 1080p set? I am seeing people talk about integer scaling, aspect ratios, and whether to have black borders or fill the screen.

There is a guy on the libretro forums named Nesguy who seems to be the only one who has tried to get shaders and a proper aspect ratio working on a 1080p set. It feels like he's the only one who's bothered to dig for the answers, and even then there doesn't seem to be a definitive one.

For NES and Super NES, some say 8:7 is correct, or 6:5 at 1536x1200 to maximize screen real estate, BUT others say to run at 720p and let the TV scale for integer scaling, which seems to be what the new RetroUSB AVS is going for and what the Retron5 does.

What I have tried is rendering at 1536x1200, which causes the extra 120 vertical pixels to get cropped, leaving a 1536x1080 image on screen. This is actually quite pleasing to my eyes. It seems to look really nice, and I believe it's an integer scale? What is the "best" or most popular way to display on 1080p?
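(For reference, that kind of viewport can be set directly in retroarch.cfg. The numbers below are only a sketch of the 1536x1200-with-crop idea on a 1920x1080 screen, and whether a negative vertical offset is accepted can depend on the video driver and version:)

aspect_ratio_index = "22"          # "Custom" - the index can differ between versions
video_scale_integer = "false"
custom_viewport_width = "1536"     # 256 * 6
custom_viewport_height = "1200"    # 240 * 5
custom_viewport_x = "192"          # (1920 - 1536) / 2 to center horizontally
custom_viewport_y = "-60"          # 1200 - 1080 = 120 px of overflow, split 60 px top and bottom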
 

nkarafo

Member
I would like to see an answer to that as well.

I use a 1080p TV, and most of the time I have to choose between uneven scanlines that look hideous or huge black bars around the screen. This becomes an even bigger problem with games that already have black bars.

And there's also the N64: Mupen64 lacks overscan cropping, so you get black bars around the screen in many games regardless, and with integer scale on top you get a postage stamp for a screen.

Is there at least a way for the integer scale to use smaller black bars?
 

alr1ght

bish gets all the credit :)
For NES and Super NES, some say 8:7 is correct, or 6:5 at 1536x1200 to maximize screen real estate, BUT others say to run at 720p and let the TV scale for integer scaling, which seems to be what the new RetroUSB AVS is going for and what the Retron5 does.

There's no correct answer. Internal resolution may be 8:7, but those games were always displayed in 4:3. You'd think devs would design their games with 4:3 in mind, but some drew their pixels in 8:7 and stretched them.

Then you get into how 8:7 doesn't scale properly to 4:3 (really only an issue with fixed-pixel displays), so you get artifacting and oddly stretched pixels. You could display them scaled up at 8:7, but that means the aspect ratio is wrong (IMO) and it doesn't scale properly on 1080p sets.

a little workaround with the AVS
https://www.youtube.com/watch?v=lGSidnlOhd4&feature=youtu.be&t=416
 

Lork

Member
Is there a recommended CRT shader or two to start with? There seem to be literally hundreds of them and I have no idea what's what. It doesn't help that changes persist even if you don't explicitly save them and there doesn't seem to be any "default" option or a clear way to remove any shaders you've applied, all of which serves to discourage experimentation. I noticed that you can set shader passes to 0, but that seems like a really roundabout way to do it and it's tough to judge whether or not that's actually leaving things in the exact same state they were before any presets were applied.
 
When I crop overscan on an SNES game 256x240 crops to 256x224.

When dealing with scaling, do I assume all integer math works for 256x240?

ie: is a 3x vertical scale 720px or 672px?
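(A quick worked check, assuming the core outputs 256x240 and "Crop Overscan" trims that to 256x224:)

# full frame:    240 lines * 3 = 720 px tall
# cropped frame: 224 lines * 3 = 672 px tall
# so with overscan cropping on, an integer 3x image is 672 px tall; 720 px is 3x of the uncropped 240 lines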
 
Was there ever a fix for the horrible audio static in DeSmuME on RetroArch? I'm trying to run Portrait of Ruin and the sound is just horrendous. It's easy to fix in the standalone DeSmuME, but I can't seem to in the libretro build.
 

Knurek

Member
Was there ever a fix for the horrible audio static in DeSmuME on RetroArch? I'm trying to run Portrait of Ruin and the sound is just horrendous. It's easy to fix in the standalone DeSmuME, but I can't seem to in the libretro build.

Did you try disabling rewind?
The DeSmuME core just can't seem to hit 100% speed with rewind enabled.
 
I only notice it on the title screen. After that the music and sound fx are fine.

Hmmmm, I might try again

Interesting, it sounds like it only occurs with certain sounds. I get it horribly on the title screen and with the sound effects of the crickets on the first screen. I wonder if something is borked in one of the sound channels.

I wish DeSmuME was as good as Drastic on Android is. Drastic is damn near flawless.
 

Knurek

Member
Is there a way to have two gamepads configured as the same player?
I have both a DualShock 4 and a Wii U Pro Controller connected to my PC, and every time I want to use the other controller, I have to go to the Input menu and change the User 1 Device Index to match the controller I'm using.
 

Awakened

Member
Is there a way to have two gamepads configured as the same player?
I have both a DualShock 4 and a Wii U Pro Controller connected to my PC, and every time I want to use the other controller, I have to go to the Input menu and change the User 1 Device Index to match the controller I'm using.
There's no way to do this through the frontend AFAIK. You could map your keyboard user 1 keys and use AntiMicro to map the second gamepad to those keys, but you can't get analog out of that.
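(For reference, the user 1 keyboard binds AntiMicro would target live in retroarch.cfg. These are roughly the stock defaults, so double-check your own config in case they've been remapped:)

input_player1_up = "up"
input_player1_down = "down"
input_player1_left = "left"
input_player1_right = "right"
input_player1_b = "z"
input_player1_a = "x"
input_player1_y = "a"
input_player1_x = "s"
input_player1_start = "enter"
input_player1_select = "rshift"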
 

Knurek

Member
Just unplug one of them?

Doesn't seem to work - even if I turn off the one I'm currently using and sync the new one, RetroArch doesn't automatically switch between them. It detects the new XInput pad fine and autoconfigures it, but doesn't switch to it for some reason.
I have to go to the menu and change the device index. Kind of a bother when the pad RetroArch wants to use is disabled and I'm AFK.
 

Radius4

Member
So both are working under XInput?
XInput reserves the pad number, so there isn't much that can be done about that; you have to unplug and re-plug for a pad to become user 1.

You can unplug both and then plug in the one you want to use (make sure to reset the User X Device Index value to its default).
 

Knurek

Member
You can unplug both and then plug in the one you want to use (make sure to reset the User X Device Index value to its default).

Doesn't seem to work for wireless pads, unfortunately.
Oh well, Wii U Pro Pad is probably the best controller for Retroarch games anyway, I guess I'll just default to that from now on.
 

Radius4

Member
Works for me with a DS3 and a DS4; it depends on the driver, though. I use SCP Toolkit, and it notifies the OS of the unplug event.

Also, Steam is doing something weird with XInput now: if I plug and unplug my pads while Steam is running, they get assigned to another slot, so keep that in mind too.
 

Iced

Member
Curious if anyone has gotten this to run well with Gsync monitors yet. I'm on a 144hz AOC Gsync monitor. In Nvidia control panel I have a custom profile with the following settings:

[Screenshots of the NVIDIA Control Panel custom profile settings]


With these settings, games will run smoothly in Retroarch but only if I have Vsync enabled in Retroarch's settings. I also have full screen (not windowed) and hard gpu sync enabled. I would think with Gsync I wouldn't want Vsync enabled in Retroarch, would I?
 

EasyMode

Member
Curious if anyone has gotten this to run well with Gsync monitors yet. I'm on a 144hz AOC Gsync monitor. In Nvidia control panel I have a custom profile with the following settings:

With these settings, games will run smoothly in Retroarch but only if I have Vsync enabled in Retroarch's settings. I also have full screen (not windowed) and hard gpu sync enabled. I would think with Gsync I wouldn't want Vsync enabled in Retroarch, would I?

For my G-Sync monitor I have vsync on in control panel, and vsync off in Retroarch (or any game).

I also have full screen on + windowed off and hard gpu sync on (I don't know if this has any effect with vsync off).
 

Iced

Member
For my G-Sync monitor I have vsync on in control panel, and vsync off in Retroarch (or any game).

I also have full screen on + windowed off and hard gpu sync on (I don't know if this has any effect with vsync off).

Tried out your settings and it works, but it seems to just be another way of accomplishing the same thing.

I guess the real question is: can Retroarch output individual games at their native refresh rates, or is it locked to whatever is in the Video settings (default 59.94)? I know there would only be slight variations, but for example: Mortal Kombat 1 arcade benefits greatly from gsync, as it runs at a significantly lower refresh rate than what is commonly seen. I've run it on Mame with gsync, and motion is gorgeous. I'd love to see similar performance in Retroarch. Maybe a configuration file setting?
 

Awakened

Member
Tried out your settings and it works, but it seems to just be another way of accomplishing the same thing.

I guess the real question is: can Retroarch output individual games at their native refresh rates, or is it locked to whatever is in the Video settings (default 59.94)? I know there would only be slight variations, but for example: Mortal Kombat 1 arcade benefits greatly from gsync, as it runs at a significantly lower refresh rate than what is commonly seen. I've run it on Mame with gsync, and motion is gorgeous. I'd love to see similar performance in Retroarch. Maybe a configuration file setting?
I've seen turning off vsync and audio rate control, then turning on audio sync as recommended settings for gsync here. Supposedly there is some stuttering with that setup though.
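(As a rough retroarch.cfg sketch of that suggestion - video_vsync and audio_sync are standard keys, while the exact way to disable rate control varies by version:)

video_vsync = "false"
audio_sync = "true"
# dynamic rate control: depending on the build this is a boolean toggle or a delta, where 0 disables it
audio_rate_control_delta = "0.0"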
 

Iced

Member
I've seen turning off vsync and audio rate control, then turning on audio sync as recommended settings for gsync here. Supposedly there is some stuttering with that setup though.

I've seen that thread before. In that case, I would prefer to keep using vsync. I was just curious if there have been any developments recently.
 

Knurek

Member
Not sure if this is a bug or works as intended, but if you start a game and mute the sound, you can use the mute function again to get sound back.
But if you start the game with a config override that has audio_mute_enable = "true" set (either the core one or the game one), there is no way to get sound back; pressing the mute button doesn't seem to do anything.
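(For context, the override being described is just a small cfg inside the config folder; the path below shows the usual core-override layout, with "CoreName" as a placeholder:)

# config/CoreName/CoreName.cfg (core override; a game override is named after the ROM instead)
audio_mute_enable = "true"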
 
What is the easiest way to setup Retroarch to display games on a 1080p monitor without doing per-core resolution configs?

Also, can someone inform me on how best to utilize Hard GPU Sync and the pros/cons?

Finally, to Vsync or not?
 

Awakened

Member
What is the easiest way to setup Retroarch to display games on a 1080p monitor without doing per-core resolution configs?

Also, can someone inform me on how best to utilize Hard GPU Sync and the pros/cons?

Finally, to Vsync or not?
You can turn on integer scaling to get a perfect multiple of the original resolution, or turn it off and use a shader like pixellate to scale to full 1080 with very minimal blur.

Turning on hard sync will reduce display lag quite a bit. There is a hard sync frames option under the on/off toggle. The default 0 cuts the most amount of lag, but is also the most CPU intensive. If you experience sound crackling, setting it to 1 usually eases the CPU load enough to fix that. It's kind of hard to tell the difference between 0 and 1 unless you're very sensitive to input lag, but 1 has slightly more lag than 0.

Turning off vsync creates tearing and stuttering unless you have vsync forced on in your driver or gsync, so generally you want it enabled.
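(A minimal retroarch.cfg sketch of those suggestions, with integer scaling being the one that's a matter of taste:)

video_scale_integer = "true"   # or "false" if you'd rather fill the screen with a shader like pixellate
video_hard_sync = "true"
video_hard_sync_frames = "0"   # raise to "1" if the audio starts crackling
video_vsync = "true"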
 
You can turn on integer scaling to get a perfect multiple of the original resolution, or turn it off and use a shader like pixellate to scale to full 1080 with very minimal blur.

Turning on hard sync will reduce display lag quite a bit. There is a hard sync frames option under the on/off toggle. The default 0 cuts the most amount of lag, but is also the most CPU intensive. If you experience sound crackling, setting it to 1 usually eases the CPU load enough to fix that. It's kind of hard to tell the difference between 0 and 1 unless you're very sensitive to input lag, but 1 has slightly more lag than 0.

Turning off vsync creates tearing and stuttering unless you have vsync forced on in your driver or gsync, so generally you want it enabled.

This is probably the most helpful response I have ever received. I never looked into Pixellate before and it is pretty great. Thank you sir.
 
I don't mean to double reply, but there doesn't seem to be a lot of action in this thread. I know Pixellate was mentioned for non-integer scaling at 1080p, but I am wondering if there is anything that will give the BVM look without integer scale? I know crt-royale-kurozumi is kind of the top of the list for that scanline look, but as I understand it, you need to run it in integer mode, which causes jail bars and shrinks screen real estate.
 

Brhoom

Banned
I don't mean to double reply, but there doesn't seem to be a lot of action in this thread. I know Pixellate was mentioned for non-integer scaling at 1080p, but I am wondering if there is anything that will give the BVM look without integer scale? I know crt-royale-kurozumi is kind of the top of the list for that scanline look, but as I understand it, you need to run it in integer mode, which causes jail bars and shrinks screen real estate.

You can find a thread on the RetroArch forums called "analog tv pack 3 shaders", I think; it includes a BVM shader that looks really nice. Give it a try!
 

Awakened

Member
I don't mean to double reply, but there doesn't seem to be a lot of action in this thread. I know Pixellate was mentioned for non-integer scaling at 1080p, but I am wondering if there is anything that will give the BVM look without integer scale? I know crt-royale-kurozumi is kind of the top of the list for that scanline look, but as I understand it, you need to run it in integer mode, which causes jail bars and shrinks screen real estate.
CRT-Easymode-Halation is pretty good about scaling to non-integer. I only notice uneven scanlines with that on certain shades of blue.
 

vanty

Member
What can you do to get the clean and sharp look of the NES/Famicom Mini? I use crt-easymode 99% of the time, but playing the Famicom Mini has made me want to give it a shot without the scanlines and everything, so I turned it off but just got a soft/blurry image.

These are my video settings on a 1920x1200 monitor. Anything I need to change regardless of using a shader or not?

 

Knurek

Member
Is there a way to permanently define the hotkey for Desmume's quick switch function?
It's mapped to R3, and while it can be changed in the Quick Menu, the change doesn't get retained between sessions. :\

Related - is there a way to have the quick switch mode start with the bottom screen?
 

Iced

Member
What can you do to get the clean and sharp look of the NES/Famicom Mini? I use crt-easymode 99% of the time, but playing the Famicom Mini has made me want to give it a shot without the scanlines and everything, so I turned it off but just got a soft/blurry image.

These are my video settings on a 1920x1200 monitor. Anything I need to change regardless of using a shader or not?

The two major offenders I see here are windowed scale and hw bilinear filtering. For the former, you optimally would want a multiple of four (or I suppose, if you're on a 4K display, a multiple of 8) for pixel-perfect presentation. For the latter, you want to disable it. That is what's causing your image to be soft and blurry.
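(In retroarch.cfg terms those two settings are the ones below; the values just illustrate the advice above:)

video_scale = "4"       # "Windowed Scale" - an integer multiple, per the advice above
video_smooth = "false"  # bilinear filtering off, to avoid the soft/blurry look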
 

Lettuce

Member
I was always under the impression the 'Windowed Scale' option only applied when you were running the program in a window and not fullscreen, as I have never noticed a difference no matter what I set the option to.
 
The two major offenders I see here are windowed scale and hw bilinear filtering. For the former, you optimally would want a multiple of four (or I suppose, if you're on a 4K display, a multiple of 8) for pixel-perfect presentation. For the latter, you want to disable it. That is what's causing your image to be soft and blurry.

Change to a custom viewport that suits the 1200p monitor. It's perfect for 240p x 5. Just turn off bilinear. That's what's causing the blur.

Now, a question of my own. Would it make more sense to use downsampling to render at 4K and then let the video card size it back down to 1080p? My NVIDIA card supports DSR (Dynamic Super Resolution). It seems like it might solve the integer scaling problem with 1080p.
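(On the viewport suggestion above: a 4:3 image at 5x of 240 lines on a 1920x1200 panel works out to roughly the following, as an illustrative sketch:)

aspect_ratio_index = "22"          # "Custom" - the index can differ between versions
custom_viewport_width = "1600"     # 320 * 5, which is also 4:3 at 1200 px tall
custom_viewport_height = "1200"    # 240 * 5, fills the panel vertically
custom_viewport_x = "160"          # (1920 - 1600) / 2 to center
custom_viewport_y = "0"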
 

Radius4

Member
Is there a way to permanently define the hotkey for Desmume's quick switch function?
It's mapped to R3, and while it can be changed in the Quick Menu, the change doesn't get retained between sessions. :\

Related - is there a way to have the quick switch mode start with the bottom screen?

It saves just fine; the remappings under controls save when you select "Save Core remap file" or "Save Game remap file".

I just pushed a few changes worth mentioning:
- Analog d-pad mode can be saved on overrides now
- Input device can be saved on overrides now
- Emulated input device can be saved on overrides now
- We now have a toggle to switch B/A to A/B
 

EasyMode

Member
Change to a custom viewport that suits the 1200p monitor. It's perfect for 240p x 5. Just turn off bilinear. That's what's causing the blur.

Now, a question of my own. Would it make more sense to use downsampling to render at 4K and then let the video card size it back down to 1080p? My NVIDIA card supports DSR (Dynamic Super Resolution). It seems like it might solve the integer scaling problem with 1080p.

Downsampling from 4K to 1080p with integer scale on will get you a nice ~26% increase in viewing area. Here's a comparison with crt-aperture:

http://screenshotcomparison.com/comparison/191321

That's a shader I recently wrote, going for a PVM-style, bold scanline look for my 1440p monitor:


It's available through the online updater.
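(The ~26% figure checks out if you assume a 224-line, overscan-cropped image:)

# native 1080p, integer scale:  224 * 4 = 896 px tall
# 4K via DSR, integer scale:    224 * 9 = 2016 px, shown as 2016 / 2 = 1008 px on the 1080p panel
# area gain: (1008 / 896)^2 ≈ 1.27, i.e. roughly the ~26% increase mentioned above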
 