
Topic: Why do I get an out of range error when I use DVI but VGA is fine? (Read 2497 times)

sr. member
Activity: 500
Merit: 253
Got it: my refresh rate has to be 60 Hz, but Windows keeps trying to set it to 62. Hopefully someone else with this issue finds this thread and it helps them out.
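If Windows keeps snapping back to the wrong rate, you can also pin the refresh rate from a script. Below is a minimal sketch, assuming Windows with the pywin32 package installed (pywin32 and the primary-display/60 Hz choices are my assumptions, not something confirmed in this thread):

Code:
# Minimal sketch: pin the primary display to 60 Hz on Windows.
# Assumes pywin32 is installed (pip install pywin32).
import win32api
import win32con

# Read the current display mode of the primary monitor.
devmode = win32api.EnumDisplaySettings(None, win32con.ENUM_CURRENT_SETTINGS)

# Leave the resolution alone; only ask the driver to change the refresh rate.
devmode.DisplayFrequency = 60
devmode.Fields = win32con.DM_DISPLAYFREQUENCY

# Apply the change; DISP_CHANGE_SUCCESSFUL (0) means it took effect.
result = win32api.ChangeDisplaySettings(devmode, 0)
print("Applied 60 Hz" if result == win32con.DISP_CHANGE_SUCCESSFUL else "Failed: %d" % result)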
legendary
Activity: 1022
Merit: 1001
Most monitors will auto-detect, but just make sure the input isn't set manually to analog instead of digital.
sr. member
Activity: 313
Merit: 251
I can only think of a defective DVI cable, where maybe one of the pins doesn't make good contact.

Maybe try another cable?
sr. member
Activity: 500
Merit: 253
HD 5970 with dual DVI and one mini DVI port. If I use a cheap DVI-to-VGA converter, I can get all ranges shown. If I take off the adapter and go straight through DVI, I can't see anything other than an "Out of range" red box in the center of a black screen.

I've tried multiple operating systems and drivers; nothing seems to work. Has anybody ever seen anything like this before? My monitor accepts both DVI and VGA; it's a 22-inch Westinghouse L2610NW.