Question: Problems connecting PC with TV

DaveS

Addon Developer
Donator
Beta Tester
Joined: Feb 4, 2008 · Messages: 9,478 · Reaction score: 732 · Points: 203
I just bought a new 32" Philips LCD TV, and now I want to connect it to my PC. The connection is an HDMI cable running from an HDMI port on the back of the TV to the HDMI port on the graphics card.

But when I turn on the TV and select the HDMI input as the source, it just complains about an "unsupported video format".

I have been troubleshooting this for the last two days and nothing has worked so far.

Anyone have any ideas? The graphics card in my PC is an NVIDIA GeForce 8600 GT, which works fine otherwise, and so does the TV. It's just that I can't get the TV and the computer to work together.
 

TSPenguin

The Seeker
Joined: Jan 27, 2008 · Messages: 4,075 · Reaction score: 4 · Points: 63
You should run your HDMI output at the native resolution of the TV panel. Experimenting with refresh rates might be in order.
Some TVs designate a specific HDMI port for PC use.
Also make sure that HDCP is not enabled on either end.
 

dbeachy1

O-F Administrator
Administrator
Orbiter Contributor
Addon Developer
Donator
Beta Tester
Joined: Jan 14, 2008 · Messages: 9,220 · Reaction score: 1,568 · Points: 203
Location: VA · Website: alteaaerospace.com · Preferred Pronouns: he/him
As I understand it, HDCP only comes into play when the original video source (e.g., a Blu-ray player or player software) enables it while playing copyrighted (and therefore encrypted) video; a normal Windows desktop feed should not cause the video card to send HDCP-encrypted data. However, I'm no expert on HDCP, so I may be wrong on that.

In any case, I have a home-theater PC hooked up to my TV via DVI, and I had to use the "PC-compatible port" on my TV (the second of two DVI ports) and then set that port's source to "PC" in the TV's setup menu. The first port is not compatible with PCs even though the connector is the same, so I suspect there are some protocol or signal differences between a standard digital feed and a PC video card feed. You could check your TV's manual (or Google) for information about connecting a PC to your specific TV model.

If worst comes to worst, you can always use the analog HD component inputs (YPbPr) on your TV (assuming it has them; most HD sets do). Some TVs also have a standard 15-pin VGA port, so that would work as well. Of course, HDMI or DVI would look best, since the data would then reach the TV in its original form rather than being converted to analog by the video card and then back to digital by the TV.
 

Notebook

Addon Developer
News Reporter
Donator
Joined: Nov 20, 2007 · Messages: 11,822 · Reaction score: 644 · Points: 188
Silly question, have you tried another source on the HDMI input on your TV? Just to make sure that input works.

N.
 

DaveS

Silly question, have you tried another source on the HDMI input on your TV? Just to make sure that input works.
Thanks for the tip but no joy.

I have attached some screenshots that might be of use in troubleshooting this problem. Is anything improperly set up?
 

Attachments

  • NVIDIA_CP1.jpg (179.5 KB)
  • NVIDIA_CP2.jpg (174.9 KB)
  • Monitor_settings.jpg (69.3 KB)
  • Monitor_settings_2.jpg (69.9 KB)

n122vu

Addon Developer
Donator
Joined: Nov 1, 2007 · Messages: 3,196 · Reaction score: 52 · Points: 73
Location: KDCY
Refresh Rate

I believe this was mentioned by dbeachy1, but have you checked the refresh rate on the HDMI port? From the Settings tab, highlight monitor 2, click Advanced, then Monitor tab. Check the box concerning modes this monitor cannot display, then from the drop-down list, select the highest refresh rate the monitor will support. This may or may not solve the problem.
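For what it's worth, the logic behind "select the highest refresh rate the monitor will support" can be sketched like this (a toy illustration of my own; the mode list and function name are invented, not a real Windows API):

```python
# Toy sketch: given the modes a driver reports, pick the highest refresh
# rate available at a target resolution. The mode list here is invented
# for illustration; it is not read from any real driver.
modes = [
    (1024, 768, 60),
    (1024, 768, 75),
    (1920, 1080, 50),
    (1920, 1080, 60),
]

def highest_refresh(modes, width, height):
    """Return the highest refresh rate offered at the given resolution,
    or None if the resolution is not offered at all."""
    rates = [hz for (w, h, hz) in modes if (w, h) == (width, height)]
    return max(rates) if rates else None

print(highest_refresh(modes, 1920, 1080))  # 60
```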

Also, if you have your HDTV's documentation, it should tell you what PC input settings it supports.

**EDIT** Sorry, it was TSPenguin who mentioned refresh rates. Want to give proper credit.
 

DaveS

I believe this was mentioned by dbeachy1, but have you checked the refresh rate on the HDMI port? From the Settings tab, highlight monitor 2, click Advanced, then Monitor tab. Check the box concerning modes this monitor cannot display, then from the drop-down list, select the highest refresh rate the monitor will support. This may or may not solve the problem.
It didn't, but thanks for the procedure anyway.

Also, if you have your HDTV's documentation, it should tell you what PC input settings it supports.
The only thing I can find is a list of supported resolutions and refresh rates for the computer, which are:

  • 640x480 60 Hz
  • 800x600 60 Hz
  • 1024x768 60 Hz
  • 1920x1080i 60 Hz
  • 1920x1080p 60 Hz
My computer will only go as high as 1440x900. Could the graphics card be at fault here? Maybe it can't support HD output? It is a bit sluggish playing back 1080 HD movies.
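To make the mismatch concrete, here is a tiny sketch (my own illustration; the set and function below are hypothetical, not from any driver or TV firmware) of why a 1440x900 feed would draw an "unsupported video format" complaint from a TV whose manual lists only the modes above:

```python
# Hypothetical check against the PC modes listed in the TV's manual.
SUPPORTED_MODES = {
    (640, 480, 60),
    (800, 600, 60),
    (1024, 768, 60),
    (1920, 1080, 60),  # covers both the 1080i and 1080p entries at 60 Hz
}

def tv_accepts(width, height, refresh_hz):
    """Return True if the TV's manual lists this mode for PC input."""
    return (width, height, refresh_hz) in SUPPORTED_MODES

print(tv_accepts(1440, 900, 60))   # False: the desktop's 1440x900 feed
print(tv_accepts(1024, 768, 60))   # True: a safe fallback mode
```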
 

Quick_Nick

Passed the Turing Test
Donator
Joined: Oct 20, 2007 · Messages: 4,088 · Reaction score: 204 · Points: 103
Location: Tucson, AZ
Shot in the dark: from what I've recently 'learned' about HDTVs (though without hands-on experience), PC output, which uses progressive scanning, may not be easily compatible with interlaced HDTVs.

Stuff that may be related:
http://en.wikipedia.org/wiki/Interlace - Intro paragraph
***grr... I can't find the video I'm looking for (a really good one :p)***
http://techdigs.net/content/view/53/42/ (have this instead :p)


Also important: My post almost definitely has no 'purpose' if you have a Progressive HDTV.
 

TSPenguin

Quick Nick is right. If your TV does not support 1080p but only 1080i, then you will have some very interesting stories to tell once you get it working.
As I understand it, only very few cards support interlaced output.
Also, in the NVIDIA panel, select 1080p24 instead of 1080p50. Maybe that helps (if you didn't already try that, it's probably the issue).
 

DaveS

Quick Nick is right. If your TV does not support 1080p but only 1080i, then you will have some very interesting stories to tell once you get it working.
As I understand it, only very few cards support interlaced output.
Also, in the NVIDIA panel, select 1080p24 instead of 1080p50. Maybe that helps (if you didn't already try that, it's probably the issue).
From reading the specs on the TV, it's progressive scan, not interlaced scan. And changing it to 1080p24 did nothing.
 

TSPenguin

By this time, I dare ask: have you checked whether the cable is working correctly...
 

DaveS

By this time, I dare ask: have you checked whether the cable is working correctly...
OK, how do I do that? The TV and the computer are the only HDMI devices I have.
 

Bj

Addon Developer
Donator
Joined: Oct 16, 2007 · Messages: 1,886 · Reaction score: 11 · Points: 0
Location: USA-WA · Website: www.orbiter-forum.com
What's your graphics card?

Perhaps it's your driver, but if it's a new card then it should already be updated.

...and do you have a DVI-to-HDMI cable/adapter? Perhaps that would work.
 

DaveS

Addon Developer
Addon Developer
Donator
Beta Tester
Joined
Feb 4, 2008
Messages
9,478
Reaction score
732
Points
203
Using an adapter is a good idea. I guess you have one DVI and one HDMI port?
Yes, but only on the card. There are no DVI connections on the TV, and the card's DVI port is connected to my regular computer monitor via a DVI-to-VGA adapter.
 

Hielor

Defender of Truth
Donator
Beta Tester
Joined: May 30, 2008 · Messages: 5,580 · Reaction score: 2 · Points: 0
Since it's a laptop, try disabling the internal screen (using the Fn key plus whichever number or F-key on your keyboard is labeled with a box with a vertical line on each side; it should be a similar icon to the one next to the DVI or VGA-out port).

Pressing that key combination will cycle between 3 modes: repeat internal-external, side-by-side internal-external, and external only. (maybe also internal only? try it).

What that will do is force your computer to find a resolution that the TV supports.
 

Hielor

Not a laptop, a stationary one.

So it is. I partially retract my statements then...

Try having the TV as the only thing plugged into the computer at boot-up.


-----Post Added-----


Can the TV work with any of the lower modes in the signal format?
 

DaveS

Try having the TV as the only thing plugged into the computer at boot-up.
Aside from the mouse and keyboard? Not even the monitor?


Can the TV work with any of the lower modes in the signal format?
I have tried every mode listed in the NVIDIA Control Panel, with no success.
 