Best card for HD video?

Toby

Archived from groups: alt.comp.periphs.videocards.nvidia

Hi folks,

Which video card would you recommend for playing HD video (720p mostly)
- MPEG4, Quicktime, WMV etc.? I can find loads of recommendations on
gaming cards (something this one won't be doing a lot of), but straight
video playback isn't something I've seen covered much in reviews.

I need DVI output (preferably dual) and PCI-Express architecture, and I
have $1000 to spend on the board (possibly more if necessary). I'm after
as much expansion potential as possible; I don't want to be buying
another card soon!

Doesn't have to be nVidia - just more familiar with these cards than
others.

Thanks in advance for any thoughts,

Toby
--
Toby Marsden
 
Archived from groups: alt.comp.periphs.videocards.nvidia

toby@toby.org.uk wrote:
> Which video card would you recommend for playing HD video (720p mostly)
> - MPEG4, Quicktime, WMV etc.? [snip]

Hi,

I've recently been interested in this issue. It seems that recent nVidia
cards can do hardware-assisted decoding of HD Windows Media Video 9 and
MPEG2. nVidia refers to this as "PureVideo"; read more here:
http://www.nvidia.com/page/purevideo.html

Note that although all the series 6xxx and now 7xxx cards support some
PureVideo features, generally only the top-end card supports all the
features.

However, the catch is that to get this to work, you need to use the
nVidia software codec, available for download here:
http://www.nvidia.com/object/dvd_decoder.html

Plus you need to use Windows Media Player 10 or InterVideo WinDVD 7 as
the player. (There are a few other software players that are integrating
the support, but things are still sketchy; a lot of the most popular
players don't work.)

If you do successfully get it to work, the graphics card takes over from
the CPU to perform most of the decoding. In this review of the new
GeForce 7800 GTX, on an Athlon 64 4000+ processor, an HD MPEG2 file is
played back with just under 20% CPU load!
http://www.guru3d.com/article/content/229/8/

Without that graphics card, that system would probably require 75%+ CPU
power just to decode that stream - a 50-minute HD video file that is 6
GB in size.
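
To put a rough number on that stream, here is a back-of-the-envelope
average-bitrate calculation (the 6 GB and 50 minute figures are from the
review; the Python sketch is just my illustration):

    # Average bitrate of a 6 GB, 50-minute HD clip
    size_bits = 6 * 1024**3 * 8          # 6 GB expressed in bits
    duration_s = 50 * 60                 # 50 minutes in seconds
    print(size_bits / duration_s / 1e6)  # ~17.2 Mbit/s average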

There are some older PureVideo tests here:
http://www.anandtech.com/video/showdoc.aspx?i=2305

If you have $1000 to spend, you may want to wait for the GeForce 7800
Ultra, which will be even faster than the GTX. If you want to buy now,
the simple answer is the 7800 GTX, plus the nVidia DVD codec, and maybe
a copy of WinDVD 7 (if you don't want to play your HD content from
Windows Media Player).

Lastly, the 7800 can apparently also do hardware-assisted decoding of HD
H.264 streams. H.264 is the new video codec that Apple has put in
QuickTime 7, and it will probably be used in the new HD DVD format,
whenever they agree on exactly what that is going to be.

Simon Howson
 
Archived from groups: alt.comp.periphs.videocards.nvidia

I should've mentioned that the new ATi GPU (codename R520) is also going
to offer HD hardware decoding abilities. That GPU is set to be released
on the 11th of August, and who knows, it may be even better than what
nVidia is currently offering with the 7800...

Simon Howson
 
Archived from groups: alt.comp.periphs.videocards.nvidia

The nVidia decoder is only for MPEG2. GeForce 6 MPEG2 decoding will work
with any software DVD player that supports DXVA. The only feature that
won't work is MPEG-2 bad-edit correction, which only works with the
nVidia DVD decoder.

For WMV HD, you need the beta WMV HD DXVA patch from Microsoft.


"Simon Howson" <simonhowson@NOSPAMyahoo.com.au> wrote in message
news:sD8Ge.65378$oJ.57125@news-server.bigpond.net.au...
> [snip]
 
Archived from groups: alt.comp.periphs.videocards.nvidia

Remember, that's with 1080p content at 60 fps. A 30 fps clip drops the CPU
utilization to about 40%. 40% usage is not really discernible from 20%
usage unless you've got some pretty CPU-intensive task running in the
background. Running WMV9 content with DXVA enabled also requires an exact
combination of bloated software apps (WMP10, DRM patch, etc.).

Get the 7800GTX for its breakneck speed in games, but don't "bet" on its
video acceleration abilities.

--
"War is the continuation of politics by other means.
It can therefore be said that politics is war without
bloodshed while war is politics with bloodshed."


"Simon Howson" <simonhowson@NOSPAMyahoo.com.au> wrote in message
news:N8tGe.66254$oJ.18123@news-server.bigpond.net.au...
> He's got $1000 to spend, so I think the 7800GTX would be the best option
> (or the Ultra if he wants to wait). 20% CPU usage sounds better to me than
> 70% usage. The only question I would have is how good the R520 from ATi
> will be at HD decoding. It sounds like at this stage nVidia haven't
> released drivers that support hardware-assisted decoding of H.264,
> whereas ATi are promising that it will be a standard feature when the
> R520-based card is released.
>
> Simon Howson
 
Archived from groups: alt.comp.periphs.videocards.nvidia

HDTV is 30 new frames per second, non-interlaced, with each frame duplicated
to reduce flicker on CRTs. There are only 30 frames of content per second.

Phil Weldon

"First of One" <daxinfx@yahoo.com> wrote in message
news:zpqdnRsoJqao4nffRVn-1Q@rogers.com...
> [snip]
 

Bob

Archived from groups: alt.comp.periphs.videocards.nvidia

"Phil Weldon" <notdiscosed@example.com> wrote in message
news:9OvGe.18956$aY6.4202@newsread1.news.atl.earthlink.net...
> HDTV is 30 new frames per second, non-interlaced, with each frame
> duplicated to reduce flicker on CRTs. There are only 30 frames of content
> per second.
>
> Phil Weldon
Glad someone was here to clear that up. Seems most people think that a
higher FPS means HD... If that were the case, then the cameras that did
all the high-speed filming of the shuttle would have an incredible
resolution that couldn't be played back on anything known to mankind at
this point!
Bob
 
Archived from groups: alt.comp.periphs.videocards.nvidia

'Bob' wrote:
| Glad someone was here to clear that up. Seems most people think that a
| higher FPS means HD... [snip]
_____

And 60-frame-per-second display of NTSC broadcasts has been around ever
since RAM became cheap enough to put a frame buffer in a television set
to provide progressive scan (non-interlaced, with two duplicate frames
replacing two interlaced, half-vertical-resolution fields).

Phil Weldon


"Bob" <luna5nospam@earthlink.net> wrote in message
news:B2yGe.455$4F5.296@fe06.lga...
> [snip]
 
Archived from groups: alt.comp.periphs.videocards.nvidia

Hi all,

Here's an update:-

Nvidia to support H.264 in next driver
http://www.theinquirer.net/?article=25000

I'll go for the 7800 after all. Now where is that debit card ...

Matt U.K.


"Phil Weldon" <notdiscosed@example.com> wrote in message
news:9OvGe.18956$aY6.4202@newsread1.news.atl.earthlink.net...
> [snip]
 
Archived from groups: alt.comp.periphs.videocards.nvidia

So the video runs at 60 fps, but every two frames are duplicated? That's
ass-backwards, especially since the video clips were clearly intended for
computer monitors.

--
"War is the continuation of politics by other means.
It can therefore be said that politics is war without
bloodshed while war is politics with bloodshed."


"Phil Weldon" <notdiscosed@example.com> wrote in message
news:9OvGe.18956$aY6.4202@newsread1.news.atl.earthlink.net...
> [snip]
 
Archived from groups: alt.comp.periphs.videocards.nvidia

"Phil Weldon" <notdiscosed@example.com> wrote in message
news:lByGe.19046$aY6.1528@newsread1.news.atl.earthlink.net...
> And 60-frame-per-second display of NTSC broadcasts has been around ever
> since RAM became cheap enough to put a frame buffer in a television set
> to provide progressive scan (non-interlaced, with two duplicate frames
> replacing two interlaced, half-vertical-resolution fields).


That's done on the TV set. The source signal is still NTSC interlaced.

The video clips referred to in the XBox 360 link are WMV files, 1440x1080,
60 fps. Microsoft also has a few clips at 22-23 fps, here:
http://www.microsoft.com/windows/windowsmedia/content_provider/film/ContentShowcase.aspx

Keep in mind these are computer video clips; they don't necessarily follow a
broadcast standard.

--
"War is the continuation of politics by other means.
It can therefore be said that politics is war without
bloodshed while war is politics with bloodshed."
 
Archived from groups: alt.comp.periphs.videocards.nvidia

Phil Weldon wrote:
> HDTV is 30 new frames per second, non-interlaced, with each frame duplicated
> to reduce flicker on CRTs. There are only 30 frames of content per second.

If you're referring to broadcast HD, that is definitely not true. That
is either 1080i, which is 60 interlaced frames per second, or 720p, which
is 60 non-interlaced frames per second. Either has 60 new frames of
content per second.

Some WMV HD content is 1080p, however this is not a broadcast HD
standard and is basically for computer displays only at this point.

--
Robert Hancock Saskatoon, SK, Canada
To email, remove "nospam" from hancockr@nospamshaw.ca
Home Page: http://www.roberthancock.com/
 
Archived from groups: alt.comp.periphs.videocards.nvidia

'First of One' wrote:
| That's done on the TV set. The source signal is still NTSC interlaced.

_____
Of course it's done on the TV set. There is certainly no reason for
duplicated information using up bandwidth. The interlace format was chosen
to solve the problem of persistence (flicker) on CRTs without increasing
drive power and horizontal frequency in the sweep circuits. Advances have
made interlace unnecessary.



| The video clips referred to in the XBox 360 link are WMV files, 1440x1080,
| 60 fps. Microsoft also has a few clips at 22-23 fps, here:
| http://www.microsoft.com/windows/windowsmedia/content_provider/film/ContentShowcase.aspx
|
| Keep in mind these are computer video clips; they don't necessarily follow
| a broadcast standard.
_____

There is no need for more than 30 frames of new information per second for
display. More than 30 frames of new information per second are redundant.

Phil Weldon

"First of One" <daxinfx@yahoo.com> wrote in message
news:dcGdnaCIgsROlnbfRVn-tQ@rogers.com...
> [snip]
 
Archived from groups: alt.comp.periphs.videocards.nvidia

The frames are duplicated AT THE DISPLAY. The frames are NOT duplicated
in the data stream.
Thirty frames per second meet the persistence-of-vision requirement;
displaying 30 frames per second at a 60 frames per second rate solves
the CRT phosphor persistence requirement.
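
In other words, something like this (a minimal Python sketch of the
idea, my illustration, not any actual decoder or driver code):

    # 30 fps content on a 60 Hz CRT: each decoded frame is scanned out
    # twice at the display; the stream still carries only 30 frames/s.
    def refresh_at_60hz(frames_30fps):
        for frame in frames_30fps:  # 30 new frames per second
            yield frame             # refresh 1
            yield frame             # refresh 2 (duplicate, cuts flicker)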

The video clips are a demo, and not anything that represents normal
practice or information content.

Phil Weldon

"First of One" <daxinfx@yahoo.com> wrote in message
news:HqWdnbcHZI7QlXbfRVn-vA@rogers.com...
> [snip]
 
Archived from groups: alt.comp.periphs.videocards.nvidia

'Robert Hancock' wrote:
| If you're referring to broadcast HD, that is definitely not true. That
| is either 1080i which is 60 interlaced frames per second, or 720p which
| is 60 non-interlaced frames per second. Either has 60 new frames of
| content per second.
|
| Some WMV HD content is 1080p, however this is not a broadcast HD
| standard and is basically for computer displays only at this point.
_____

'Computer display' is exactly that, computer display.
An interlaced frame is TWO fields, one of the odd numbered lines, one of the
even numbered lines. The odd numbered line field is alternated with the
even numbered line field, with a half line delay between. That is
interlaced display. TWO fields make one frame. Sixty FIELDS per second
make 30 FRAMES per second. It takes two fields to make ONE complete screen
of information.

Since computer display adapters have a built-in frame buffer, interlace
is not a necessary technique. Once you have a frame buffer, there is a
disconnect between the order of transmitted lines and the order of
displayed lines.

In both the 1080i and 720p there are 30 FRAMES displayed per second. The
differences between the two are resolution and the transmitted order of
lines. Display consists, for both, of one new complete screen image thirty
times per second.

HD video, HDTV, whatever; 30 complete frames per second are sufficient.
More frames of new information per second add no more definition. Some
VERY high definition formats have fewer than 30 frames per second (70 mm
theatrical film, for example). Some displays of 25 frames per second are
marginal (25 frames per second / 50 fields per second PAL and SECAM) because
of CRT limitations.

THERE IS A DIFFERENCE between the way information is stored/transmitted
and the way it is displayed. Once computers and frame buffers enter,
interlaced transmission/storage order can be displayed in progressive
order and vice versa; resolutions can be changed, and aspect ratio can
be changed. Don't confuse display with content, or content with
transmission/storage.
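
To make the field/frame distinction concrete, here is a minimal Python
sketch (my own illustration, not anyone's broadcast code) of weaving two
interlaced fields back into one progressive frame, which is essentially
what a frame buffer lets the display do:

    # Each field is a list of scan lines. The odd field carries display
    # lines 1, 3, 5, ...; the even field carries lines 2, 4, 6, ...
    def weave(odd_field, even_field):
        assert len(odd_field) == len(even_field)
        frame = []
        for odd_line, even_line in zip(odd_field, even_field):
            frame.append(odd_line)   # lines 1, 3, 5, ...
            frame.append(even_line)  # lines 2, 4, 6, ...
        return frame

    # Two 540-line fields of a 1080i signal weave into one 1080-line
    # frame: 60 fields per second in, 30 complete frames per second out.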


Phil Weldon

"Robert Hancock" <hancockr@nospamshaw.ca> wrote in message
news:ag0He.71002$5V4.68398@pd7tw3no...
> [snip]
 
Archived from groups: alt.comp.periphs.videocards.nvidia

"Phil Weldon" <notdiscosed@example.com> wrote in message
news:_ZaHe.1025$ns.303@newsread1.news.atl.earthlink.net...
> There is no need for more than 30 frames of new information per second for
> display. More than 30 frames of new information per second are redundant.

90% of first-person shooter players will disagree with you. True, for
the slow camera motion normally seen in movies, 30 fps is sufficient.
However, for anything with fast panning and zooming, 50-60 fps
definitely makes a noticeable difference. A demo video clip depicting
*game* content can be justifiably encoded with 60 fps of new
information.

--
"War is the continuation of politics by other means.
It can therefore be said that politics is war without
bloodshed while war is politics with bloodshed."
 
Archived from groups: alt.comp.periphs.videocards.nvidia

Gaming is only slightly related to movies (DV or film).
Gamers may argue that they can see the impossible.
Impossible.

Phil Weldon


"First of One" <daxinfx@yahoo.com> wrote in message
news:FMadnTshCq8B2HPfRVn-qg@rogers.com...
> [snip]
 
Archived from groups: alt.comp.periphs.videocards.nvidia

Impossible? It's easy enough to experiment:

In Quake2, look around with the mouse. Most systems today easily push 60 fps
in this game. Now type "cl_maxfps 30" at the console prompt to cap the
framerate at 30 fps and move the mouse again. The difference is fairly
noticeable.

The only reason movies can get away with 25-30 fps is that violent
camera movement of such magnitude seldom happens.
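
A quick back-of-the-envelope illustration of why the cap is so visible
(the 180 degree/s pan speed is just a number I picked for illustration):

    # Image movement per displayed frame during a fast horizontal pan
    pan_speed = 180.0  # degrees per second
    for fps in (30, 60):
        print(fps, "fps ->", pan_speed / fps, "degrees per frame step")
    # 30 fps -> 6.0 degrees per step; 60 fps -> 3.0. The bigger jumps
    # between frames are what make fast pans look choppy at 30 fps.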

--
"War is the continuation of politics by other means.
It can therefore be said that politics is war without
bloodshed while war is politics with bloodshed."

"Phil Weldon" <notdiscosed@example.com> wrote in message
news:NquHe.8758$6f.1221@newsread3.news.atl.earthlink.net...
> [snip]