My monitor (1920x1080) refuses to display anything smaller than 1024x768 without overscan. That means that only the center part of the image is displayed.

I mean, for 4:3 resolutions, you get a black border at both sides (hopefully), but the visual window within those black borders for 640x480 is about 600x440. And for 320x200, it's less than 300x180.

If I cannot get my GPU/drivers/monitor to respect the aspect ratio, it gets worse, as the image tends to be stretched to fill the screen in both dimensions. The visible window ends up something like 600x380 or 300x140.
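As a rough sanity check on those numbers, here is a small sketch (the ~4% per-edge crop is an assumption reverse-engineered from the figures above, not a measured value):

```python
# Estimate how much of a low-res picture survives when a display
# crops some fraction of the image at every edge (overscan).
# The 4%-per-edge default is an assumption inferred from the
# "640x480 shows about 600x440" observation above.
def visible_area(width, height, crop_per_edge=0.04):
    vis_w = round(width * (1 - 2 * crop_per_edge))
    vis_h = round(height * (1 - 2 * crop_per_edge))
    return vis_w, vis_h

print(visible_area(640, 480))  # in the ballpark of the reported ~600x440
print(visible_area(320, 200))  # and of the reported "less than 300x180"
```

The exact crop percentage varies per TV, so treat this only as an illustration of why low resolutions lose proportionally so much of the picture.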

This renders many older GOG games unplayable.

Yes, I do have an old, tiny, 4:3 monitor. No, I am not going to use it. I want it on my 41" HD monitor, with black borders to the side.

I want to know up front whether I will be able to play a game I buy here on my monitor. After buying most of my old-time favorites, I found that many of them suffer from this problem, which makes them largely unplayable.

Ok, I do agree that part of the blame lies with nVidia and AMD: neither offers an easy solution, even though simply doubling the resolution would work.
Post edited November 15, 2015 by SymbolicFrank
I don't know much about graphics cards, but I suggest you check your card's options, because I have a 1920x1080 monitor and a modern AMD card and I have not had any problem with any GOG game.

P.S. The 1920x1080 resolution is by far the most common resolution among MaGog users (and I assume by extension among GOG users), with 26% of the user base using that resolution, so you are not in any way an exception.
Post edited November 15, 2015 by mrkgnao
Which game do you plan on buying? Because that may be important.
Which monitor is it?
I would also suggest you look in the control center/configuration tool of your graphics driver. With an Nvidia card there is an option to stretch the picture or to disable any scaling method. Without scaling you should get your resolution with black borders.
If that doesn't work, you could also try to play the old game in windowed mode, if possible.
I would disable scaling in the video driver and then tweak your DOSBox settings.
SymbolicFrank: I mean, for 4:3 resolutions, you get a black border at both sides (hopefully), but the visual window within those black borders for 640x480 is about 600x440. And for 320x200, it's less than 300x180.
Does your monitor have an auto-adjust option? Does pushing it while on those low resolutions help any? I think that matters only if you are using a VGA-connection on a flat monitor, though.

Or if it is some kind of "TV monitor" (considering its size), does it have any option to disable overscan? On my 47" HDTV, I had to disable overscan so that part of the desktop would not go beyond the edges of the screen. This happened with all resolutions though, including 1920x1080. The TV actually had some kind of computer mode which, among other things, removed overscan, but I think it also disabled some picture enhancements that would otherwise have added lag to the picture.
Post edited November 15, 2015 by timppu
SymbolicFrank: snip
Oh, you're trying to shrink your resolution and play old games in "fullscreen"? That's...something I'd never want to do. ^^

A workaround would be setting a black background, setting your taskbar to auto-hide, and playing it in windowed mode. But in general it's not gonna be a clean process unless you're great at editing configurations.
SymbolicFrank: snip
MaximumBunny: Oh, you're trying to shrink your resolution and play old games in "fullscreen"? That's...something I'd never want to do. ^^
I play most games fullscreen, including 640x480 games from the 80's, and I have had no problem doing so, almost always without changing anything in either my graphics card or the game/dosbox configuration. So it is quite possible if someone wants to do it.
Post edited November 15, 2015 by mrkgnao
It sounds as if the monitor (TV?) is set to use overscan (the default setting on most TVs), and the computer is set to compensate for that by shrinking the image slightly and adding black borders around it. That compensation is a likely default if you connect via HDMI, because HDMI is primarily used to connect to TVs rather than computer monitors, and TVs use overscan by default (a perfect example that two wrongs don't make a right, just more wrong). Except the computer only compensates at certain resolutions.

Disable overscan mode on both the TV and in the graphics drivers and it should work fine.

CRT TVs can't use the extreme edges of the picture because the quality there degrades heavily (the edges of the tube sit behind a wood or plastic bezel to hide the imperfection), so 1990s broadcast and production companies made sure everything they wanted to show was inside those borders rather than outside them.

Fast-forward to today, where LCD panels can actually be fully utilised, but are in reality often used for media created for 90s CRT TVs. Producers of modern TVs have therefore decided that the default setting should be to (in the TV's software) expand the incoming image beyond the panel's borders, in order to show roughly what an 80s/90s CRT would have displayed and to hide what would have been outside its borders.

Then we have computers. Computer CRT monitors can/could be adjusted for how large the image should be and where it should be placed, so we always used the full image. Now connect a computer to an LCD TV that assumes the edges of the image are insignificant, and you get annoyed users who can't see their top/bottom panels. So GPU manufacturers decided that, because TV manufacturers are idiots, they have to account for the latter's idiocy by shrinking the image slightly and adding a black border to pad it back to the full resolution.

What you get is a 1920x1080 image, shrunk to 1728x972 (or thereabouts), given a black border so that the total is 1920x1080 again, fed to the TV which does the opposite, expands the incoming image to roughly 2133x1200, cuts out the middle 1920x1080 and displays that. Yeah, quality loss. Better to disable all overscan-related idiocy and just get the actual 1920x1080 image as unchanged as possible from computer to display panel, it'll help all other resolutions as well.
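Those round-trip numbers can be sanity-checked in a few lines, assuming the 0.9 underscan factor implied by 1920x1080 shrinking to 1728x972:

```python
# Model the overscan/underscan round trip described above.
# SCALE = 0.9 is an assumption derived from 1920x1080 -> 1728x972.
SCALE = 0.9

def gpu_underscan(w, h):
    # GPU shrinks the picture and pads it with black back to full size
    return round(w * SCALE), round(h * SCALE)

def tv_overscan(w, h):
    # TV blows the incoming picture up and crops away the overshoot
    return round(w / SCALE), round(h / SCALE)

print(gpu_underscan(1920, 1080))  # (1728, 972), as in the post
print(tv_overscan(1920, 1080))   # (2133, 1200), roughly as in the post
```

The actual percentage differs per TV and driver, but the shape of the problem (shrink, then re-expand, then crop, losing quality twice) is the same.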
Post edited November 16, 2015 by Maighstir
mrkgnao: I play most games fullscreen, including 640x480 games from the 80's, and I have had no problem doing so, almost always without changing anything in either my graphics card or the game/dosbox configuration. So it is quite possible if someone wants to do it.
I know it's possible. Lots of games start in low res mode. I'd just never *want* to do it. xP
SymbolicFrank: Ok, I do agree that part of the blame lies with nVidia and AMD: both don't have an easy solution either.
Uhm, NVidia does. At least I have no problems in that regard.

However, given that you claim your monitor is 41", it sounds like it's actually a TV, which is probably what is causing you problems. I had the same issue connecting my computer to my Samsung TV. However, out of all the HDMI ports my TV has, one is specifically marked with a PC label. Additionally, there is a setting in the TV menu which lets you switch off overscan for that port. So that's what I've done, and it works perfectly.

I've attached a screenshot of my NVidia scaling settings.
Depending on the game, I suggest using a source port. That's what I do for Doom, Quake, and Descent.
SymbolicFrank: My monitor (1920x1080) refuses to display anything smaller than 1024x768 without overscan. That means that only the center part of the image is displayed.

I mean, for 4:3 resolutions, you get a black border at both sides (hopefully), but the visual window within those black borders for 640x480 is about 600x440. And for 320x200, it's less than 300x180.

If I cannot get my GPU/drivers/monitor to respect the aspect ratio, it gets worse, as the display tends to be stretched equally in all dimensions. It becomes like 600x380 or 300x140.

This renders many older GOG games unplayable.

...
Ok, I do agree that part of the blame lies with nVidia and AMD: neither offers an easy solution, even though simply doubling the resolution would work.
no, it's not a video card problem. most video cards allow almost any resolution, including custom resolutions. it's your monitor/TV, and most likely it's a TV. best that you post the model name or serial number of your TV.

without knowing what you're using, i would guess there's a "game mode" on your tv. try that first.

if not, does the TV have ports other than the HDMI port? if it has an older VGA port, it might be more "willing" to accept an SD-like resolution signal. but depending on what video card you have, you might need another overpriced conversion cable to plug into that older port on your computer's end.

alternatively you could get one of those hardware HD converters/upscalers. there's an extensive review of them here:
retrogaming.hazard-city.de/micomsoft.html

but since those upscalers are made for connecting old consoles to HDTVs, there might be compatibility issues and need for more conversion cables.

EDIT: well, scratch that last suggestion, NVidia cards SHOULD have even better upscaling and upscan capabilities than those overpriced XRGB thingies. you really should tell us what settings you're using, and give us your TV's model if you can't find its manual.

TL;DR: Maighstir probably has the solution.
Post edited November 16, 2015 by dick1982
In the DOSBox config, set the scaler to the multiplier you want: normal2x, normal3x, etc.
Set the scaler to forced, so it is applied even when the game would otherwise skip it.
Then make sure aspect correction is turned on.

This should scale the game's initial resolution accordingly. For example, a typical VGA mode of 320x200 at 2x will produce 640x400, and so on.
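To make those three settings concrete, here is a sketch of the relevant dosbox.conf fragment (the normal3x value and the [sdl] lines are illustrative choices, not the only valid ones):

```ini
[render]
# Stretch 320x200 modes to a proper 4:3 picture
aspect=true
# Force the scaler even when the game's resolution would normally skip it
scaler=normal3x forced

[sdl]
fullscreen=true
# Stay at the desktop resolution so the monitor/TV never mode-switches
# (this sidesteps the overscan problem entirely)
fullresolution=desktop
output=openglnb
```

Keeping fullresolution=desktop means the low-res modes are scaled on the computer side and the display only ever sees its native 1920x1080 signal.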

However, DOSBox should REALLY implement a proper flexible upscaler: something that lets you specify a TARGET resolution and an optional filter, and does the magic on one of the CPU cores.
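Such a target-resolution scaler might, as a hypothetical sketch (not actual DOSBox code), boil down to picking the largest integer scale of the aspect-corrected source that fits the target, then centering it with black borders:

```python
# Hypothetical sketch of a "specify the target resolution" upscaler:
# choose the biggest integer scale that fits, then letterbox/pillarbox.
def fit_to_target(src_w, src_h, dst_w, dst_h, aspect=4/3):
    # Aspect correction: a 320x200 VGA mode is meant to fill a 4:3
    # screen, so treat its effective height as width / aspect (240 here).
    eff_h = round(src_w / aspect)
    scale = max(1, min(dst_w // src_w, dst_h // eff_h))
    out_w, out_h = src_w * scale, eff_h * scale
    # Center the scaled image on the target; the rest stays black.
    x = (dst_w - out_w) // 2
    y = (dst_h - out_h) // 2
    return scale, (out_w, out_h), (x, y)

# 320x200 on a 1920x1080 panel: 4x, a 1280x960 picture at offset (320, 60)
print(fit_to_target(320, 200, 1920, 1080))
```

The filtering step (nearest-neighbour vs. smoothing) would then run on the CPU over the scaled buffer; the geometry above is the easy part.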