Using a TV as a monitor - Why won't it work?

Started by Scavenger, Sun 27/11/2005 13:22:29


Scavenger

Okay, I seem to have a huge problem with my computer.  I want to set the S-Video out to go to my TV, since viewing movies and games on a 21" TV is much easier for two people to see and interact with than on my 15" LCD screen, which is most useful for sprite art.

I have a RADEON 9550 with S-Video out. Unfortunately, the TV I have only has SCART sockets, so I rigged up an S-Video → component video → SCART setup, set my computer to use a secondary monitor, and flipped the channel over to EXT-1 (which is the SCART port). Nothing. EXT-2. Still nothing. EXT-3. Again, nothing.

What was I doing wrong? The switchbox certainly worked, as it played my PS2 and DVD Player perfectly. The SCART input on my TV worked, as I had Freeview working on it a little while ago.

So, I unplugged my monitor, and left only the S-Video cable in. Again, with the TV on and connected. Restarted the computer and... nothing. Then I got thoroughly confused.

Is there some archaic and cryptic way of getting the video to come out of the S-Video port instead of the VGA port? Or are the S-Video and DV-Out just little fancy things to fill space?

Will I have to make a VGA-Component video adaptor to force it to comply? (Not recommended. I can solder and wire stuff, but I suck at making prototypes. And I have too many bad memories of the electronics factory.... *shudder*)

Any help is welcomed! ^^

Ashen

#1
Obvious point but: Have you 'activated' the secondary monitor (the TV in this case)? For my Radeon (9800), under XP, you have to go to the 'Display properties' dialogue (right-click on desktop, select 'Properties'), to the 'Settings' tab, set the secondary monitor as attached, and you should be able to send to the TV. There might be a few more things you need to configure to make the picture on the TV look better (refresh rates, etc) - sorry, it's been a while since I did this, and I don't remember all the details.

EDIT: OK, apparently not that easy - I've just discovered mine's stopped working (it's been a while since I tried). I think my problem is not having the right drivers installed any more - could that be your problem too?
I know what you're thinking ... Don't think that.

Scavenger

#2
Edit: Meh. My big TV doesn't support 60 Hz. =P

I'll try it later on my upstairs one o.o Which is an American TV, if I'm not mistaken.

TheYak

#3

Most standard TVs won't support 60 Hz.   Depending upon whether the card supports dualview (displaying both outputs at once) while maintaining each at different refresh rates, you may be able to override the display settings to put it at 30-ish Hz (standard NTSC runs at roughly 30 Hz frames / 60 Hz interlaced fields, if I remember correctly).  Obviously, this would look like crap on a monitor and could even potentially damage a monitor trying to display it.  Since you're talking LCD, it shouldn't give a rat's ass.
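For what it's worth, the field/frame-rate arithmetic behind those numbers works out like this - these are the standard published NTSC/PAL figures, nothing specific to any particular card (quick Python sketch):

```python
# TV timing arithmetic: why a PC's progressive "60 Hz" isn't the same thing
# as what an interlaced TV expects. Each interlaced frame is delivered as
# two fields, so the frame rate is half the field rate.

NTSC_FIELD_RATE = 60 / 1.001            # ~59.94 fields per second
NTSC_FRAME_RATE = NTSC_FIELD_RATE / 2   # ~29.97 full frames per second

PAL_FIELD_RATE = 50.0                   # exactly 50 fields per second
PAL_FRAME_RATE = PAL_FIELD_RATE / 2     # 25 full frames per second

print(f"NTSC: {NTSC_FIELD_RATE:.2f} fields/s -> {NTSC_FRAME_RATE:.2f} frames/s")
print(f"PAL:  {PAL_FIELD_RATE:.2f} fields/s -> {PAL_FRAME_RATE:.2f} frames/s")
```

So the "30-ish Hz" figure is the frame rate, while the tube itself still refreshes at ~60 (NTSC) or 50 (PAL) fields per second.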

As for the custom refresh, if you can't find a setting in the control panel to obtain a rate that low, try going with TV Tool.  It's got a lot more customization ability.  For some reason, graphics card manufacturers don't really take into account the various ways that TV-out might be used.  My old GeForce 2 GTS wouldn't work with any nVidia software, but worked perfectly with TV Tool.

m0ds

#4

Hmm, well, I can run my TV screen output at 60 or 70 Hz, with very little difference between the two. To get it working, all I needed to do was "clone" the main source monitor, which automatically puts the picture on the TV. You may need to look for an "S-Video" or "Composite" checkbox, and also set it to the right country & region - mine is PAL - UK, from a list of hundreds.

As for the S-Video SCART thing, are you talking about a SCART adaptor which you can stick S-Video plus red and white phonos into? If so, that's what I use, and it works fine. The connection is a bit dodgy though.
