PC's HDMI to CRT TV's Composite - Any Advice?

So I am going to be converting the HDMI signal from my PC into the Yellow/White RCA inputs on my 13" JVC CRT VCR Combo. I may even convert THAT signal into RF, just so I can ‘have’ to change the channel to 3 before I can see my games :smiley:

Now, obviously this thing won’t be doing 1080p with 16:9 stretching, which is my default setting.

Are there any sort of settings/shaders/things I should know about in order to get the ‘best’ quality signal for this setup? I know it won’t be pretty, but I don’t want pretty. I want it, if possible, to look like I plugged (let’s say) my Genesis into the old TV.

I’ve used some retro-shaders and they are fantastic, but I want to use THIS TV because I used it to play NES in 1990, all the way through X360 in 2008, when I got an HD set because I couldn’t read the small text in Dead Rising lol.

Are you sure that HDMI is your best option? Is there any way to get an analog signal output? In any event, retro-shaders are meant to emulate a CRT display. Since you are outputting to a CRT, the raw video should be sufficient. However, take a look at https://github.com/libretro/common-shaders/blob/master/crt/shaders/tvout-tweaks.cg

Well, on the video card the options I have are HDMI, DisplayPort, or DVI-D. So unfortunately there’s no analogue option.

I’m not sure what I’m looking at with the link you posted. It sounds cool, a shader made for this kind of thing, but I don’t know what all this code is I’m looking at…

I’m wondering what resolution I should set the PC to for this, too. I have it set to 1080p as standard, and I know the composite converter will downscale it to 480i, but I always hear people talk about how the signal for retro games is 240p. I assume converting composite to RF keeps the same resolution.

In display settings 800x600 is the lowest available, so I’m not sure what to do about that either.

I would do my research (if you haven’t) and make sure that HDMI is the best conversion. I know it is equivalent to DVI-D, but you might want to compare HDMI to DisplayPort. I have a hunch that DisplayPort might be the better option here, but I’m not sure.

That link is the actual shader (you can find it in your shaders directory in RetroArch and open it with Notepad). The first part is the settings, and the rest is the code. You just tweak the parameters to whatever you want them to be: the comment after the // tells you what each value means, and you change the numbers to adjust it. For example:

#pragma parameter TVOUT_COMPOSITE_CONNECTION "TVOut Composite Enable" 0.0 0.0 1.0 1.0 // default, minimum, maximum, optional step

If you want it to default to On, you set the first 0.0 to 1.0 (there’s a worked example right after the listing below). You might be able to edit these within RetroArch as well – I think there is an “Adjust Parameters” option after you load the shader (if you can do that, it is probably better). It’s not as hard as it looks; it’s really just editing some options, and you play around with it until you get a signal you like. You only need to pay attention to this part of the file and you can and should ignore the rest:

[TABLE="class: highlight tab-size js-file-line-container"]

///////////////



//    TV-out tweaks



//    Author: aliaspider - [email protected]



//    License: GPLv3



////////////////////////////////////////////////////////











// this shader is meant to be used when running



// an emulator on a real CRT-TV @240p or @480i



////////////////////////////////////////////////////////



// Basic settings:







// signal resolution



// higher = sharper



#pragma parameter TVOUT_RESOLUTION "TVOut Signal Resolution" 256.0 0.0 1024.0 32.0 // default, minimum, maximum, optional step







// simulate a composite connection instead of RGB



#pragma parameter TVOUT_COMPOSITE_CONNECTION "TVOut Composite Enable" 0.0 0.0 1.0 1.0







// use TV video color range (16-235)



// instead of PC full range (0-255)



#pragma parameter TVOUT_TV_COLOR_LEVELS "TVOut TV Color Levels Enable" 0.0 0.0 1.0 1.0



////////////////////////////////////////////////////////







////////////////////////////////////////////////////////



// Advanced settings:



//



// these values will be used instead



// if COMPOSITE_CONNECTION is defined



// to simulate different signal resolutions(bandwidth)



// for luma (Y) and chroma ( I and Q )



// this is just an approximation



// and will only simulate the low bandwidth anspect of



// composite signal, not the crosstalk between luma and chroma



// Y = 4MHz I=1.3MHz Q=0.4MHz



#pragma parameter TVOUT_RESOLUTION_Y "TVOut Luma (Y) Resolution" 256.0 0.0 1024.0 32.0



#pragma parameter TVOUT_RESOLUTION_I "TVOut Chroma (I) Resolution" 83.2 0.0 256.0 8.0



#pragma parameter TVOUT_RESOLUTION_Q "TVOut Chroma (Q) Resolution" 25.6 0.0 256.0 8.0







// formula is MHz=resolution*15750Hz



// 15750Hz being the horizontal Frequency of NTSC



// (=262.5*60Hz)



////////////////////////////////////////////////////////

[/TABLE]
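
To make that concrete, here is what the two enable lines would look like after editing if, say, you wanted composite simulation and TV color levels both on by default (only the first number, the default, changes):

#pragma parameter TVOUT_COMPOSITE_CONNECTION "TVOut Composite Enable" 1.0 0.0 1.0 1.0
#pragma parameter TVOUT_TV_COLOR_LEVELS "TVOut TV Color Levels Enable" 1.0 0.0 1.0 1.0

Alternatively, if your RetroArch build supports parameters in shader presets, you can leave the .cg file untouched and override the values in a .cgp preset instead (the shader path below is just an example; point it at wherever your copy lives):

shaders = "1"
shader0 = "crt/shaders/tvout-tweaks.cg"
parameters = "TVOUT_COMPOSITE_CONNECTION;TVOUT_TV_COLOR_LEVELS"
TVOUT_COMPOSITE_CONNECTION = "1.0"
TVOUT_TV_COLOR_LEVELS = "1.0"

And as a sanity check, the formula in the advanced section matches the comments: 256 × 15750 Hz ≈ 4.03 MHz for luma, 83.2 × 15750 Hz ≈ 1.31 MHz for I, and 25.6 × 15750 Hz ≈ 0.4 MHz for Q – the exact bandwidth figures listed above.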

You can see there are basically three options in this shader: signal resolution, composite simulation, and TV color levels. I would think you would definitely want the color level adjustment, and probably the resolution setting, but the composite simulation is probably unnecessary because you are going to use an actual composite connection. Might be worth a try to see what it does, though. It looks like the default resolution of 256 may address your concerns about outputting at 240p, in which case it wouldn’t matter very much what resolution you are running the PC at, but I’m just guessing, having never tried it myself. Integer scaling may be an issue, so keep that option in the Video settings in mind when you are messing with it, and it is also possible that a higher resolution that is an integer multiple of 256 (such as 1024x768) may be better for you than 800x600.
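
For what it’s worth, the integer math backs up that guess: 1024 = 4 × 256 and 768 = 3 × 256 exactly, while 800 ÷ 256 = 3.125 and 600 ÷ 256 ≈ 2.34, so 800x600 can never line up with the shader’s default 256 signal resolution at a clean integer multiple.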

Just like video cards that have TV-out options, an HDMI-to-composite box will likely take whatever signal you throw at it and crunch it down to 480i. EDIT: vookvook’s advice about trying an integer multiple is a good idea.

People say 240p, but there wasn’t a standard called 240p until the iPod Video days. NTSC TVs accept 480i at 60 fields (~30 full frames) per second. Period. End of story. When people say 240p to refer to retro console output, they really mean non-interlaced 480i, or “double-strike” as Nintendo refers to it in its docs. This signal was a riff on the 480i standard: a normal interlaced signal alternates between two fields per frame (a field being half the scanlines, either the even set or the odd set), while double-strike sends the same field (even or odd) every time, so the same lines are redrawn at double the frame rate (that is, the familiar ~60 fps). The fact that it’s not interlaced is why people call it progressive, even though that’s not strictly accurate (a lack of interlacing doesn’t necessarily imply a progressive signal).
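
The shader comments quoted above actually contain the arithmetic for this: NTSC scans 15750 lines per second, i.e. 262.5 lines per field × 60 fields per second. Interlacing pairs an odd field with an even field to build one ~525-line frame about 30 times per second; double-strike drops the half-line offset and redraws the same ~262 lines (roughly 240 of them visible) every field, which is where both the ~60 fps and the “240” figures come from.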

In a typical interlaced signal, persistence of vision makes the alternating fields look like a single, whole image while consuming half the bandwidth. In non-interlaced “240p”, half of those lines stay dark all the time, and that’s where the “scanline” look comes from. Most home TVs and connections were blurry enough that the dark gaps weren’t very visible, but they were very apparent on broadcast monitors and through high-quality connections like S-Video and SCART (the latter commonplace in PAL-land but not in NTSC locales). It’s common parlance to refer to those dark in-between lines as scanlines, but the lit lines are actually the scanlines produced by the console, and the dark lines are the gaps between them.