CRT update! It hasn't made any more weird flashes or unusual sounds. I've discovered that I can't use the highest brightness, because it's too bright! It makes black more gray. Which is a good sign, I suppose? The image is phenomenal. It's like Waifu2x on crack. I don't know how I'm going to cope when it's time to say goodbye to it. I'm also on the hunt for backup monitors. But man, they're hard to find nowadays.
Yeah, unlike LCDs, where 100% brightness means the backlight is at full strength, CRTs should have their brightness set based on a calibration image. If you load up the 240p test suite, the SMPTE color bars and PLUGE pattern can help you dial it in properly. For me, I just raise it to the point just before the scanlines (that is, the lines of resolution, not the gaps between them) become visible on black screens.
That's exactly what I did: turned the brightness down until the lines aren't visible on black. The image is really something. It's so sad that CRT technology was abandoned, or at least that SED technology wasn't developed further.
Found myself another one. Electron 21, 21". I thiiink it's a Trinitron? I can't quite make out how it looks, but it looks gooood…
It's apparently a "Diamondtron" screen, which is Mitsubishi's Trinitron clone, developed once Sony's patents expired. People speak very highly of them.
I thought so. Well… I can't say the image is better… no, that's the wrong word. The image is probably better than the other one's, but I can't say I prefer it over that one. Both are equally good, just different. This one is much brighter, though; I only have the brightness at 15%. It kind of burns my eyes if I set it too high.
Also, this one might have darker blacks, if that's possible.
I would say: if you really want one of these but already have a regular VGA CRT, it's not worth much. But if you can get one rather cheap and easy, then go for it.
Reset the OSD settings, or simply set brightness to 50 and contrast to 100, then do a cold power-on. If the screen gets super bright at first, like brightness/100 would (black turns gray), and it takes a very long time to settle back to normal, that means the tube was heavily used during its lifetime.
All CRTs are dim at the sRGB black level. Even a high-end Trinitron with a barely used tube, at the default brightness/50, can barely (or can't) resolve the sRGB black level. But high-end Trinitrons such as the Sony G520 (and its OEM variants) have a built-in hardware sRGB mode; with that mode, black levels look the way they normally do on today's LCDs. All in all, every CRT needs to be tweaked in software (gamma, etc.) to fit today's sRGB.
Great, nice find. Also, for what it's worth… I believe it's a pretty unique one. It's a rebrand of an early Diamondtron model (first or second generation), judging by the ejecting OSD and the rear input layout.
Thank you. It seems like it gets brighter at first, but then the blacks almost instantly settle back to normal. I know it's old, but hopefully these two can last me a while.
1.) "true" 240p running at a doubled refresh rate, 2.) 480p with tvout-tweaks plus an interlacing shader, 3.) high-res with a CRT shader with the mask disabled.
Hunter, I’m trying to do this but I have some doubts.
- What exactly do I need in order to use this method?
- I can find the tvout-tweaks shader but I don’t know how to add an interlacing shader.
- Why do you say to disable the mask? A 1600x1200 monitor is incredibly sharp; shaders look better than flat scaling.
-
You need to be able to create custom modelines. You can make them really specific, matching each individual resolution/refresh rate (always doubled, to maintain the 31 kHz horizontal scan rate), or you can just use a general super-res (usually something like 3840x240 or 2560x240; or, if you're on a Raspberry Pi, use 1920x240, as the others will never sync due to a GPU limitation).
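As a rough sketch of what such a super-res modeline involves: the pixel clock follows from the total (visible plus blanking) timings, and the horizontal scan rate should land near 31 kHz. The timing numbers below are illustrative examples, not measured values for any particular monitor, and the output name `VGA-1` is an assumption (check `xrandr` for yours):

```shell
#!/bin/sh
# Illustrative 2560x240 "super-res" at 120 Hz.
# pixel clock (MHz) = htotal * vtotal * refresh / 1e6
HTOTAL=3200; VTOTAL=262; REFRESH=120
CLOCK=$(awk -v h="$HTOTAL" -v v="$VTOTAL" -v r="$REFRESH" \
    'BEGIN { printf "%.2f", h * v * r / 1e6 }')
# horizontal scan rate (kHz) = pixel clock / htotal
HFREQ=$(awk -v c="$CLOCK" -v h="$HTOTAL" \
    'BEGIN { printf "%.2f", c * 1000 / h }')
echo "pixel clock: $CLOCK MHz, hfreq: $HFREQ kHz"
# With X running, the mode would then be registered roughly like:
#   xrandr --newmode "2560x240_120" $CLOCK 2560 2624 2864 3200 \
#          240 243 246 262 -hsync +vsync
#   xrandr --addmode VGA-1 2560x240_120
#   xrandr --output VGA-1 --mode 2560x240_120
```

The point of the arithmetic is the sanity check: 3200 × 262 × 120 gives roughly a 100.6 MHz pixel clock, which divided by the horizontal total lands at about 31.4 kHz, i.e. standard VGA scan rate.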
-
Go to the 'presets' directory; there are some tvout-tweaks presets there that include the interlacing shader.
-
Because the CRT's own mask typically interferes with the mask simulation. Plus, why fake it when you have the real thing?
- I'm looking at the GitHub page for all the resolutions I need to create modelines for… I think for now I'll go with the easiest option. For the general super-resolution, do I also have to create a 2560x240 @ 120 Hz modeline? I don't know if my monitor supports that. I ask because Ubuntu is giving me errors when I try to create resolutions, at least until I fix it.
'This is a very advanced and somewhat obscure feature; friendlier documentation is needed.'
-
There they all are. I was confused because in the crt folder, at the end of the list, there is a "tvout-tweaks" shader without scanlines. For the lines to look right, integer scaling has to be enabled; I've tried some games, and Mortal Kombat (arcade) looks very small.
-
There's no comparison: a TV's mask is visible from more than a meter away, while a monitor's at 1600x1200 is imperceptible.
Option 2 is very similar to a professional monitor, even sharper than a BVM. Too sharp for me; I prefer the slot-mask look.
I did some tests, and with a 640x240 resolution the games look very nice: no scanlines and smooth contours.
I had a CRT monitor some years ago running a custom 320x240 @ 120 Hz (via PowerStrip), and yes, it was way too sharp. Probably similar to an LCD with an interlacing shader, but without the blur when things start moving, and brighter. I would at least go with a guest.r shader with adjustable scanlines to avoid the blur (LCDs' poor response time).
- Yeah, you have to create the modeline manually. On Linux, it's usually done through xrandr.
Yes, it's easy. The problem was that when I went to add the resolution, "--addmode" would add it to the HDMI output and not the VGA one, so I took the easy route: removed the video card and used the integrated one, and it worked. I wanted to see how "wonderful" it looks, as they say. It really does look very good, but it's excessively sharp; with a tvout shader you can add some imperfections and it looks more natural.
An anecdote: when I rebooted, I forgot to use Xorg, and it took me about 3 hours to realize it.
2D console and arcade games look great. 3D games look horrible at half resolution, as do Saturn games without interlacing. I think I should open a new post with everything I'm seeing.
How does choosing between 15 kHz and 31 kHz affect me? And choosing between 3840 and 2560?
So does that xrandr method work with any PC, e.g. a laptop with Intel graphics? I just want to experiment a bit, maybe. I can't insert any 240p modeline on that machine under Windows (soft15khz, CRU, etc.; none of them work).
Normally you should be able to specify the output port, like addmode HDMI-0 (or whatever is listed via xrandr).
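To avoid the "it added the mode to the wrong output" problem, you can look up the connected output name first and pass it to `--addmode` explicitly. The xrandr output below is canned sample text for illustration (the names and mode `2560x240_120` are assumptions; run `xrandr` to see your actual ports):

```shell
#!/bin/sh
# Pick the first *connected* output from xrandr's listing.
# Canned sample of what `xrandr` might print:
sample='VGA-1 connected 1600x1200+0+0 410mm x 310mm
HDMI-1 disconnected (normal left inverted right)'
# " connected" (with a leading space) will not match "disconnected".
OUT=$(printf '%s\n' "$sample" | awk '/ connected/ { print $1; exit }')
echo "target output: $OUT"
# With a real X session, replace the canned sample with:
#   OUT=$(xrandr | awk '/ connected/ { print $1; exit }')
# and then add the custom mode to that specific port:
#   xrandr --addmode "$OUT" 2560x240_120
```

This way the mode lands on the VGA port even when an HDMI output is also present, without having to pull the discrete card.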
On my Samsung VGA CRT, sharpness is a function in the monitor's OSD, which is great.
It's supposed to work. Aside from bad drivers, Intel is limited by needing high pixel clocks, so superwide resolutions are the only option.