HD scam

December 17, 2007 3:32:09 AM

I was surfing the net and suddenly I saw this forum and stating that all the video card that advertising their card that can play a HD is simply isn't true. Most of them only put out a 480i and I thought we need at least 1080i-1080p to have a true HD inspite of being HDCP compliant... So, can anyone confirm this here ???


December 17, 2007 4:36:36 AM

Can anyone else make sense of the question being asked? I'm a bit buzzed at the moment and it doesn't make sense to me at all. If I'm understanding the OP's question, it's not a scam. Nvidia's 8400/8500/8600 and ATI's 2400/2600 can handle HD content at 1080p with no problems, if I remember correctly.
December 17, 2007 4:41:07 AM

I can't quite translate what's being asked, lol. :D 
December 17, 2007 4:51:24 AM

I assume you both all those card and saw it yourself that can do a 1080p output. I saw people complaining that their card will not output 1080p inspite of the card saying it can do it...

It was just a question that I like to be sure about it because I don't want to waste my time on having a HD if there is no technology that will support it...
December 17, 2007 5:01:54 AM

ITS NOT MY FAULT IF YOU PEOPLE ARE LACKING OF BRAIN TO COMPREHEND A SIMPLE SENTENCE. ITS O.K. TO BE IGNORANT BUT TO BE DUMB AS YOU ALL ARE,,, ARE SIMPLY PATHETIC.... WHY NOT JUST DONATE 99% OF YOUR BRAIN TO SOMEONE THAT WILL HAVE GOOD USED FOR IT....
December 17, 2007 5:09:51 AM

Are you like 5 years old or something?
December 17, 2007 5:24:27 AM

skittle said:
Are you like 5 years old or something?
:lol: 
December 17, 2007 5:30:32 AM

nel said:
I was surfing the net and suddenly I saw this forum and stating that all the video card that advertising their card that can play a HD is simply isn't true.


I was surfing the net and suddenly I saw this forum. It states that all the advertisements of video cards that can play High-Definition video are false.

nel said:
Most of them only put out a 480i and I thought we need at least 1080i-1080p to have a true HD inspite of being HDCP compliant... So, can anyone confirm this here ???


Most of those video cards advertised could only output 480i. I thought that the output needed to be at least 1080i or 1080p to be called HD, instead of just being HDCP compliant.




Is this what you were actually asking? It is quite ironic to call someone dumb when you can't even form a proper (American) English sentence, let alone a paragraph. Also, shouting and ranting at the very people you are asking for help is what I'd call being a general a**hole.


To answer "parts" of your question:

From what I remember, nVidia's 8400, 8500 & 8600 and ATI's 2400 and 2600 can handle output all the way up to 1080p.

HDCP is one requirement for streaming protected video at 1080p/i or 720p/i; if the HDCP part of the card is missing and the software requires it, the video will be downsampled to 480i.

Of course, this generally depends on which HD resolution you're talking about, whether it be 720p, 720i, 1080i or 1080p. In some respects 480i was high-def, but since it became so common, I think people started calling 480i just "Standard Definition".

You're welcome.
December 17, 2007 5:37:44 AM

Why don't you construct proper sentences first? It'll make it easier for us to understand what exactly you're on about. Although I did think runswindows95's response was pretty good, considering.
December 17, 2007 5:51:20 AM

I think I remember reading a Tom's review on this matter. The 2400 and the 8400/8500 were having trouble with 1080p playback, but if I remember correctly it was concluded that it may have been a driver issue.

http://www.tomshardware.com/2007/10/26/avivo_hd_vs_pure...

Too lazy to read it, but there's the link.

And nel, just breathe and calm down.

December 17, 2007 6:00:03 AM

Sorry, next time I will try to speak in plain English. lol !!!
December 17, 2007 6:02:06 AM

Thanks, Geef.
December 17, 2007 6:10:56 AM

by the way:

HD vs ED vs SD

Refers to the resolution or number of pixels used to represent a single video image frame. Standard Definition refers to having about 350,000 pixels per frame & High Definition refers to having about 2,000,000 pixels per frame (or about 6 times more than SD). Therefore, 720i/p is NOT HD.
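
For what it's worth, the arithmetic behind those round numbers is easy to check. A quick sketch, assuming the usual 720x480 SD frame and the 1280x720 / 1920x1080 HD frames (the labels are mine):

# Per-frame pixel counts behind the "350,000 vs 2,000,000" figures above.
frames = {
    "480 (SD)": 720 * 480,
    "720 (HD)": 1280 * 720,
    "1080 (HD)": 1920 * 1080,
}

sd = frames["480 (SD)"]
for name, pixels in frames.items():
    print(f"{name}: {pixels:,} pixels ({pixels / sd:.1f}x SD)")

# 480 (SD): 345,600 pixels (1.0x SD)     -> the "about 350,000"
# 720 (HD): 921,600 pixels (2.7x SD)
# 1080 (HD): 2,073,600 pixels (6.0x SD)  -> the "about 2,000,000", ~6x SD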
December 17, 2007 6:24:20 AM

nel said:
ITS NOT MY FAULT IF YOU PEOPLE ARE LACKING OF BRAIN TO COMPREHEND A SIMPLE SENTENCE. ITS O.K. TO BE IGNORANT BUT TO BE DUMB AS YOU ALL ARE,,, ARE SIMPLY PATHETIC.... WHY NOT JUST DONATE 99% OF YOUR BRAIN TO SOMEONE THAT WILL HAVE GOOD USED FOR IT....


I lol'd

1/10

troll softer
December 17, 2007 6:34:40 AM

nel said:
by the way:

HD vs ED vs SD

Refers to the resolution or number of pixels used to represent a single video image frame. Standard Definition refers to having about 350,000 pixels per frame & High Definition refers to having about 2,000,000 pixels per frame (or about 6 times more than SD). Therefore, 720i/p is NOT HD.


Wrong, noob. HD video is defined as 720i/p and 1080i/p. There is no such thing as "ED", at least in the sense that you are referring to.

How old are you? The forum demands to know!
December 17, 2007 7:12:53 AM

skittle said:
Wrong, noob. HD video is defined as 720i/p and 1080i/p. There is no such thing as "ED", at least in the sense that you are referring to.

How old are you? The forum demands to know!



SDTV / EDTV / HDTV Specifications

1080i/p HDTV format is 1920 X 1080 pixels.
720i/p HDTV [aka SDTV (S=Super) or EDTV] is 1280 X 720.
480p EDTV (E=Enhanced or Extended) [may/may not be widescreen] is 640 X 480 or 704 X 480.
<420 SDTV (S=Standard) Less than 480p

HOW MANY RETARD THAT IS HERE NOW... OR IS IT JUST FULL MOON ???
December 17, 2007 7:20:17 AM

You know what,, the hell with the HD monitor and a HD video card. I'm just going to buy me a HD eye glasses or a HD contact lens then I will see everything in HD at any given moment... YEAH !!! LOL !!!!
December 17, 2007 7:48:26 AM

Or you could just buy a 60" HDTV and a Blu-ray or HD DVD player.
December 17, 2007 8:02:54 AM

I don't understand the point of this post, other than the fact that it started with misinformation.

720i/p is the "old" HDTV standard.

1080i/p is the "new" HDTV standard.

You need an HDCP compliant video card and monitor/TV to watch copy protected HD content.
December 17, 2007 8:07:32 AM

nel said:

HOW MANY RETARD THAT IS HERE NOW... OR IS IT JUST FULL MOON ???


If English is not your native language, I would strongly urge you to go with another translation site / dictionary.

I think the one that you're using now may just be screwing with you.
December 17, 2007 8:25:51 AM

nel said:
SDTV / EDTV / HDTV Specifications

1080i/p HDTV format is 1920 X 1080 pixels.
720i/p HDTV [aka SDTV (S=Super) or EDTV] is 1280 X 720.
480p EDTV (E=Enhanced or Extended) [may/may not be widescreen] is 640 X 480 or 704 X 480.
<420 SDTV (S=Standard) Less than 480p

HOW MANY RETARD THAT IS HERE NOW... OR IS IT JUST FULL MOON ???


You also forgot 720x480. EDTV is essentially the same thing as S(standard)DTV; it's just progressive at 60Hz (or 50 in Europe), which is outside the NTSC/PAL specifications.

Don't keep lying to yourself. 1080i/p and 720i/p ARE part of the HDTV specification. I love how you just make up acronyms.
December 17, 2007 8:53:01 AM

nel said:
SDTV / EDTV / HDTV Specifications

1080i/p HDTV format is 1920 X 1080 pixels.
720i/p HDTV [aka SDTV (S=Super) or EDTV] is 1280 X 720.
480p EDTV (E=Enhanced or Extended) [may/may not be widescreen] is 640 X 480 or 704 X 480.
<420 SDTV (S=Standard) Less than 480p

HOW MANY RETARD THAT IS HERE NOW... OR IS IT JUST FULL MOON ???



Well according to this ATSC document: http://www.atsc.org/standards/is_095.pdf

"...ranging from standard definition 480-line interlaced format to 720 or 1080-line HDTV formats."

Now, unless the governing body for TV standards is wrong about 480i being Standard Definition and 720 and 1080 being HD, and you are actually right, I'll go with the ATSC.

(By the way, the EDTV term seems to have been swallowed up by the HDTV definition)
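
For reference, here are the picture formats the ATSC defines, as they are usually summarized, with the side of the SD/HD line each sits on. A rough sketch only; the grouping follows the quote above, and the list is illustrative rather than the standard's exact table:

# ATSC picture formats as commonly summarized, tagged SD or HD.
ATSC_FORMATS = [
    (1920, 1080, "HD"),  # 1080i / 1080p
    (1280, 720,  "HD"),  # 720p
    (704,  480,  "SD"),  # 480i / 480p
    (640,  480,  "SD"),  # 480i / 480p
]

for width, height, tier in ATSC_FORMATS:
    print(f"{width}x{height}: {tier}")
# The 720-line formats land on the HD side of the line, not "Super" SD.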

December 17, 2007 9:16:41 AM

amnotanoobie said:

(By the way, the EDTV term seems to have been swallowed up by the HDTV definition)


EDTV really doesn't have much of a place... as far as standards go. Stations can't broadcast it, and the only things that do make use of it are some DVD players and gaming consoles.
December 17, 2007 9:38:39 AM

What is EDTV Resolution?

Reviewer: Phil Conner

Copyright © 2007 PlasmaTVBuyingGuide.com. All Rights Reserved.

What is an Enhanced Definition (EDTV) TV? It is simply a TV—be it a plasma television, or other technology that has 853 x 480 native pixel resolution.

We have all heard certain plasma displays described as "enhanced" definition displays. "Enhanced" certainly makes these plasma TVs sound good … desirable even. Lucky for us there's more to this than just clever marketing. Enhanced-definition (or ED) plasma TVs are actually better than conventional, tube-type TVs—not just slimmer and wider.

Let me explain: Standard-definition (SD) TV—the sort most of us have been watching for years—has 480 visible lines of detail. This is the number of horizontal lines found on your TV screen. Remember, TVs are measured on the diagonal: The width of the screen changes, while its height remains more or less constant. Thus, it is the number of pixels on the vertical axis that really determines how much detail is visible.

Like SDTV, EDTV contains 480 horizontal lines of picture detail, but the difference is that these 480 lines are displayed differently on standard- versus enhanced-definition televisions. SDTV utilizes a process called "interlacing" to display these 480 active lines of information. An interlaced picture is actually a single frame of video "painted," line-by-line, onto the screen in two passes. On the first pass, all the odd numbered lines from top to bottom (i.e., numbers 1, 3, 5, … 479) are displayed. This takes 1/60th of a second to accomplish. In the next 1/60th of a second, all the even lines are painted. So, it takes exactly 1/30 of a second to display a full picture at 480i ("i" for "interlaced"). The refresh rate of such displays is 30 Hz.

With EDTVs, these 480 lines are displayed progressively, meaning all the lines are "painted" onto the screen sequentially (1, 2, 3, … 480) in one pass as opposed to two. Which essentially means that more information can be displayed with progressive scanning since there is not 1/60 of a second lag between "takes." What would otherwise take 1/30 of a second to be displayed using interlacing can actually be shown in half the time progressively. Progressive-scan formats provide full vertical resolution at all times, at a refresh rate of 60 Hz. Hence the enhanced picture quality that comes with progressive scanning.

Which is why EDTV has been called by some the biggest advance in video quality since color TV. Now, HDTV plasma is taking center stage as prices come down quickly on LCD and plasma TVs. The simple answer on what an HDTV plasma would be is 1024 x 768 resolution or higher (1366 x 768 is common on 50" plasmas, and 1024 x 768 is common on 42" plasma TVs).
If you want to learn more about how enhanced-definition plasma TVs stack up against high-definition ones, see my article, "EDTV Plasmas Vs. HDTV Plasmas: Drawing Some Conclusions About Native Resolutions".

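A minimal sketch of the scan orders the pasted article describes, for 480 active lines (pure arithmetic, no real video API assumed):

# Interlaced: odd lines in one 1/60 s field, even lines in the next field.
# Progressive: all 480 lines painted sequentially in a single 1/60 s pass.
LINES = 480

def interlaced_fields(lines=LINES):
    odd = list(range(1, lines + 1, 2))   # field 1: lines 1, 3, ..., 479
    even = list(range(2, lines + 1, 2))  # field 2: lines 2, 4, ..., 480
    return odd, even

def progressive_frame(lines=LINES):
    return list(range(1, lines + 1))     # lines 1, 2, ..., 480

odd, even = interlaced_fields()
print(len(odd), len(even), len(progressive_frame()))  # 240 240 480
print("480i full frame:", 1 / 60 + 1 / 60, "s")       # two fields -> 1/30 s
print("480p full frame:", 1 / 60, "s")                # one pass   -> 1/60 s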
December 17, 2007 10:03:19 AM

skittle said:
EDTV really doesn't have much of a place... as far as standards go. Stations can't broadcast it, and the only things that do make use of it are some DVD players and gaming consoles.




By 2009 everything will be switch to HD. So all the analog signals will be obsolete....

What is HD to begin with: is a t.v. or a monitor with a better clarity thats all there is to it...
Now,,, do not forget my ORIGINAL POST QUESTION "TRUE HD"

back in the day we only have a black and white t.v. then it came a much better one, a color t.v. and you know what people just didn't came up with a HD that's why they named it a color t.v.
December 17, 2007 11:02:23 AM

nel said:
ITS NOT MY FAULT IF YOU PEOPLE ARE LACKING OF BRAIN TO COMPREHEND A SIMPLE SENTENCE. ITS O.K. TO BE IGNORANT BUT TO BE DUMB AS YOU ALL ARE,,, ARE SIMPLY PATHETIC.... WHY NOT JUST DONATE 99% OF YOUR BRAIN TO SOMEONE THAT WILL HAVE GOOD USED FOR IT....


:sleep:  :cry: 

Did someone get out of the wrong side of bed this morning?
December 17, 2007 11:40:42 AM

For years, high-end graphics cards have easily been able to render well beyond HDTV resolutions; 2560x1600 is standard on any decent card you get today. Also, starting with the 6800 series, integrated video decoding has become standard on all high-end video cards.

What definitely isn't standard is a card's media decoding capability. If it supports it, it can range from MPEG-2 decoding to H.264 to Xvid or DivX. Even then, on a computer this feature may not even be used if the media player doesn't support the extended capability of the integrated decoder.

Although, since quad cores seem to be easily available to anyone, there should be more than enough processing power even without a video card's help. That said, you can still enable advanced filters that suck up enough processing resources that even SD media will slow to a crawl.

So, case in point: whatever you were reading was a lie. Even an integrated graphics chip in a notebook that only handles DirectX 7 can render HD media. That doesn't mean it will run smoothly, though. It just means it's possible.
December 17, 2007 1:31:02 PM

nel said:
ITS NOT MY FAULT IF YOU PEOPLE ARE LACKING OF BRAIN TO COMPREHEND A SIMPLE SENTENCE.


True, but it's your fault if you can't communicate a question clearly.
Which seems to be the situation here. :sol: 

In any case, even a lowly 2400 PRO or 8400 GS can display 1080p video...
December 17, 2007 2:22:09 PM

I read somewhere that Vista will downgrade the resolution if it thinks the content is pirated. Maybe that's what's happening here?

Nel, when you grow up and go to your first job interview, try and remember: don't tell the interviewer that he/she has a useless brain that should be donated to science. They usually hate that. :) 

Also, if we're such idiots, then you're an even bigger idiot for asking our advice. Go somewhere else, where people are smart enough for your taste. We'll try to manage somehow without you. :kaola: 

December 17, 2007 2:27:06 PM

nel said:
What is EDTV Resolution?

Reviewer: Phil Conner

What is an Enhanced Definition (EDTV) TV? It is simply a TV—be it a plasma television, or other technology that has 853 x 480 native pixel resolution.


Like SDTV, EDTV contains 480 horizontal lines of picture detail, but the difference is that these 480 lines are displayed differently on standard- versus enhanced-definition televisions.


As a footnote: long before the age of plasma sets and even a whisper of broadcast digital TV, there was some consideration of "Extended Definition" NTSC. This would work by phase-modulating a second subcarrier with heterodyned information that could be detected and used by an EDTV either to add more information to the left and right of the picture, giving a 15:9 aspect, or to increase horizontal resolution within the 4:3 frame. The idea never went beyond a technical curiosity because the signal-to-noise ratio had to be so high for clean results, and it caused shimmering on standard sets.
December 17, 2007 2:29:34 PM

nel said:
By 2009 everything will be switch to HD. So all the analog signals will be obsolete....

What is HD to begin with: is a t.v. or a monitor with a better clarity thats all there is to it...
Now,,, do not forget my ORIGINAL POST QUESTION "TRUE HD"

back in the day we only have a black and white t.v. then it came a much better one, a color t.v. and you know what people just didn't came up with a HD that's why they named it a color t.v.


In 2009, everything is NOT switching to HD... everything is switching to digital signals. BIG difference. They are not phasing out SD. Curiously, though, the last part you got right: analog signals will be phased out (though I can see them delaying this).

What is HD? It certainly does not specifically refer to a TV or monitor. It is a specification for digital video with capabilities far above those of SD.


December 17, 2007 2:34:37 PM

aevm said:
I read somewhere that Vista will downgrade the resolution if it thinks the content is pirated. Maybe that's what's happening here?


I don't think so.

What you've heard is probably a twisted misunderstanding of a content protection flag that can be enabled by the HD-DVD and Blu-ray manufacturers, which will lower the resolution if the content is played through analog - and not HDCP-protected digital - outputs.

But I don't think they've enabled that flag on any movies to date. Once they do (if ever) it'll probably be quite widely publicised.

December 17, 2007 2:35:05 PM

aevm said:
I read somewhere that Vista will downgrade the resolution if it thinks the content is pirated.


That's sort of the case. Basically, if an AACS source calls for HDCP encrypted channels, you need to have an HDCP compliant video card, monitor and cable; otherwise it will degrade the video quality.

Just another reason that HD DVD is superior to Blu-ray: AACS isn't even mandatory on HD DVD (and there is no region coding). HD DVD is the consumer-friendly format :]
December 17, 2007 2:41:08 PM

skittle said:
That's sort of the case. Basically, if an AACS source calls for HDCP encrypted channels, you need to have an HDCP compliant video card, monitor and cable; otherwise it will degrade the video quality.


Not quite: if you're using a digital signal (HDMI or DVI) and don't have an HDCP compliant video card and monitor, there will be no output at all. You will be completely restricted. (The cable doesn't need to be special; any digital cable will do.)

This is the case on the PC anyway, I've tested and reviewed it.

If you don't have HDCP components, you can still play over analog: VGA or component video.
But like I said, they have a mechanism to restrict full-resolution analog that they may enable on future HD-DVD and Blu-ray discs...
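
Putting cleeve's and skittle's rules together, the protected-playback logic comes out roughly like this. A hedged sketch only: the type and function names are made up for illustration, and the ~960x540 constrained figure is the commonly cited ICT value, not something from these posts:

from dataclasses import dataclass

@dataclass
class Setup:
    digital: bool  # True for HDMI/DVI, False for VGA/component
    hdcp: bool     # the whole chain (video card AND monitor) is HDCP compliant
    ict: bool      # the disc sets the analog Image Constraint Token

def protected_playback(s: Setup) -> str:
    if s.digital:
        # Digital output without an HDCP-compliant card and monitor: no picture at all.
        return "full resolution" if s.hdcp else "no output"
    # Analog always plays; the ICT flag (reportedly unused on discs so far)
    # would constrain it below full resolution (roughly 960x540).
    return "constrained resolution" if s.ict else "full resolution"

print(protected_playback(Setup(digital=True, hdcp=False, ict=False)))   # no output
print(protected_playback(Setup(digital=False, hdcp=False, ict=True)))   # constrained resolution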
December 17, 2007 11:07:26 PM

cleeve said:
But I don't think they've enabled that flag on any movies to date. Once they do (if ever) it'll probably be quite widely publicised.

They are going to wait until they get enough hardware sold that they are fully entrenched, and then they absolutely will enable the flag.

It was hacked within the first 30-60 days of Vista's official release anyway.

I really don't like all the pirating that's going on, but I like Vista and the strong-arm tactics even less. Don't these guys know that the lion's share of the pirating is coming, and will continue to come, from China, Russia, etc.? They will rip the HD content and burn it to hundreds of thousands of discs, and there isn't anything that Hollywood, MS or anyone else can do to stop them. Irritating the crap out of the end user will make very little difference.
December 17, 2007 11:43:21 PM

nel said:
You know what,, the hell with the HD monitor and a HD video card. I'm just going to buy me a HD eye glasses or a HD contact lens then I will see everything in HD at any given moment... YEAH !!! LOL !!!!

I think you may find reading text a bit hard with all those pixels squished into contacts. Maybe it will show you how we saw your OP?
December 18, 2007 12:16:08 AM

nel said:
back in the day we only have a black and white t.v. then it came a much better one, a color t.v. and you know what people just didn't came up with a HD that's why they named it a color t.v.



Hmmmm. They called it a color TV back then because most TVs were black and white. At the time, by definition, a TV was a set that projected images onto a glass tube in black and white, so saying "color TV" differentiated it from the standard notion of a TV, which included having only black and white. Today the standard notion of a TV includes color, so saying "TV" now automatically means what "color TV" meant back in the day. If you have a black-and-white set now, you don't just call it a TV, because black-and-white-only falls outside the definition of a regular TV, which now includes color by default.

Now, why would people back then have called it an HDTV, when the major improvement was the color and not the resolution/viewable image? In the future, when HDTV is the standard and it's almost impossible to get the old 480-line-and-below TVs, HDTV will just become "TV". And the regular 480-line TV will probably be called a Standard Definition TV, as it becomes less common than before.

Every time I try to read the sentences that you actually wrote, it really gives me a headache. Probably due to the very, very, very bad grammar; I'm alright with bad spelling.
December 18, 2007 12:21:46 AM

Ah, but it doesn't project images onto a glass tube. It fires electrons at a glass tube coated in fluorescent material. On colour (learn to spell properly, it's 'colour' not 'color', gee ;) ) TVs, a shadow mask is used to prevent electrons from the wrong "colour" electron gun from hitting certain parts of the screen. A magnet near the screen screws it up by causing the electrons to deviate from their intended path, and thus you get screwed-up colours, which looks very cool :D 

Now, if someone could buy me an LCD I'd move away from this ancient tech :) 
December 18, 2007 12:41:26 AM

randomizer said:
On colour (learn to spell properly, it's 'colour' not 'color' gee ;) )
In the US it's color. Colour is British and uh... er... Australian. :lol: 
December 18, 2007 12:46:41 AM

I know, and it is wrong. In fact, the US misspells more words than it gets right :kaola: 
December 18, 2007 1:20:06 AM

Communication is the response one receives.

One must not lay blame on the hearer if the question is not communicated properly.
December 18, 2007 1:25:58 AM

randomizer said:
I know, and it is wrong. In fact, the US misspells more words than it gets right :kaola: 


Blasphemy, GW will smite you!
December 18, 2007 2:00:58 AM

randomizer said:
I know, and it is wrong. In fact, the US misspells more words than it gets right :kaola: 
Yup, and they change the dictionary to accommodate us. :kaola:  :lol: 
December 18, 2007 2:06:33 AM

*shakes head in disgust*
December 18, 2007 2:07:53 AM

What really pisses me off is nuclear. Clearly it is supposed to be pronounced noo-klee-er, but everyone is now pronouncing it noo-kyuh-ler. There is no "U" between the "C" and the "L" damn it. [:zorg:2]
December 18, 2007 3:36:27 AM

Zorg said:
What really pisses me off is nuclear. Clearly it is supposed to be pronounced noo-klee-er, but everyone is now pronouncing it noo-kyuh-ler. There is no "U" between the "C" and the "L" damn it. [:zorg:2]


Only one dangerously incompetent bigshot that I know of mispronounces it that way.
December 18, 2007 3:38:52 AM

Well, you should look around then; he is not alone, in either his ability to pronounce words or his incompetence.
December 18, 2007 4:37:41 AM

For the record, the way my brain works:

1% is used for computers, 9% on beer, and the other 90% is used for TNA.

Also, you'd figure Australians would have some humour, considering they spell that word funny. I guess I don't see the humor of it all. :kaola: 
December 18, 2007 9:17:22 AM

Like armor and armour?
OMG, why don't you complicate it further for us non-English peeps?!