Firewire capture

Archived from groups: uk.rec.video.digital,rec.video.desktop (More info?)

Would there be any difference in capture quality from a cheaper miniDV
single chip cam rather than using the original 3 chip cam used for videoing
the footage when transferring to PC? The reason is my intention to save on
wear from my good camera.

Thanks for any comments if there is anything I should be aware of ?
 

"David Donegan" wrote ...
> Would there be any difference in capture quality from
> a cheaper miniDV single chip cam rather than using the
> original 3 chip cam used for videoing the footage when
> transferring to PC? The reason is my intention to save
> on wear from my good camera.

The "camera" part of the camcorder plays no part in the
transfer of the digital data from the tape into the computer.
Therefore there will be no difference in capture quality
because of using a single-chip camera.

> Thanks for any comments if there is anything I should be
> aware of ?

Avoid the "long-playing" or "slow" speed. It is more
problematic to achieve proper "interchange" (the ability
to read the tape on a different machine than it was written
on.)
 

In message <kpjKe.2001$CM.1192@newsfe7-win.ntli.net>, David Donegan
<david.donegan@ntlworld.com> writes
>Would there be any difference in capture quality from a cheaper miniDV
>single chip cam rather than using the original 3 chip cam used for videoing
>the footage when transferring to PC? The reason is my intention to save on
>wear from my good camera.
>
>Thanks for any comments if there is anything I should be aware of ?
>
I don't think there would be an issue. However, I can't see any wear
problems with using your "good" camcorder, providing you use premium
quality tapes. IMHO too many people go hunting around for the cheapest
possible tapes then wonder why they have to regularly clean their
camcorder's heads.
--
Tony Morgan
http://www.camcord.info
 

"David Donegan" <david.donegan@ntlworld.com> wrote in message
news:kpjKe.2001$CM.1192@newsfe7-win.ntli.net...
> Would there be any difference in capture quality from a cheaper miniDV
> single chip cam rather than using the original 3 chip cam used for
> videoing the footage when transferring to PC? The reason is my intention
> to save on wear from my good camera.
>
> Thanks for any comments if there is anything I should be aware of ?

I do it routinely - our Canon XM1 (GL1) seems to produce random "beeps"
during capture with MediaStudio Pro.

I was told (some time ago, on this group) that using Scenalyzer to capture
would fix the problem, but a cheaper solution was to use the little Sony
Handycam I already had.
 

"Richard Crowley" <rcrowley@xpr7t.net> wrote in message
news:11fjtptjuupg424@corp.supernews.com...
> "David Donegan" wrote ...
>> Would there be any difference in capture quality from
>> a cheaper miniDV single chip cam rather than using the
>> original 3 chip cam used for videoing the footage when
>> transferring to PC? The reason is my intention to save
>> on wear from my good camera.
>
> The "camera" part of the camcorder plays no part in the
> transfer of the digital data from the tape into the computer.
> Therefore there will be no difference in capture quality
> because of using a single-chip camera.
>
<snip>

Whilst I don't know this to be true in every case, manufacturers tend to
design systems so that no part is much better than the weakest link
(because there would be no point in "wasting" the money). Thus I can
envisage that the playback/output circuitry in a single-CCD camera MIGHT
not be as good as that in a 3-CCD camera, because the quality of the
captured image (in the camera) would not be as good. This COULD affect
external capture quality. However, I've not investigated this, so I
could be talking complete B*****cks. No doubt someone will delight in
telling me so, if this is the case.

Chas
 

"Chas Gill" <Chas.Gill@gollum.btinternet.com> wrote in message
news:ddcv8m$1fl$1@nwrdmz03.dmz.ncs.ea.ibs-infra.bt.com...
>
> "Richard Crowley" <rcrowley@xpr7t.net> wrote in message
> news:11fjtptjuupg424@corp.supernews.com...
>> "David Donegan" wrote ...
>>> Would there be any difference in capture quality from
>>> a cheaper miniDV single chip cam rather than using the
>>> original 3 chip cam used for videoing the footage when
>>> transferring to PC? The reason is my intention to save
>>> on wear from my good camera.
>>
>> The "camera" part of the camcorder plays no part in the
>> transfer of the digital data from the tape into the computer.
>> Therefore there will be no difference in capture quality
>> because of using a single-chip camera.
>>
> <snip>
>
> Whilst I don't know this to be true in every case, it has come to my
> attention in the past that manufacturers design systems so that no
> part is better than the weakest link (because there would be no point
> in "wasting" the money). Thus I can envisage that the playback/output
> circuitry in a single CCD camera MIGHT not be as good as that in a
> 3-CCD camera, because the quality of the captured image (in the
> camera) would not be as good. This COULD affect external capture
> quality. However, I've not investigated this, so I could be talking
> complete B*****cks. No doubt someone will delight in telling me so,
> if this is the case.

If we were talking about ANALOG camcorders, you would
have an excellent point. But we are talking about DIGITAL.

One of the blessings/curses of digital is that it either transfers
the bitstream PERFECTLY or it completely FAILS. There is no
"graceful degradation" as we see in analog media/methods.
 

> Would there be any difference in capture quality from a cheaper miniDV
> single chip cam rather than using the original 3 chip cam used for videoing
> the footage when transferring to PC? The reason is my intention to save on
> wear from my good camera.

No. The DV stream pulled off any tape, regardless of what it was
filmed with, is transferred bit-for-bit accurately to the PC through any
other DV camcorder you play the tape in.
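The bit-for-bit claim is easy to test yourself: capture the same tape once through each camcorder and hash the resulting files - identical digests mean identical bits. A minimal stdlib Python sketch (the file names and the synthetic payload below are illustrative, not real captures):

```python
import hashlib
import os
import tempfile

def digest(path, algo="sha256", chunk=1 << 20):
    """Hash a capture file in chunks so large DV files don't need to fit in RAM."""
    h = hashlib.new(algo)
    with open(path, "rb") as f:
        while block := f.read(chunk):
            h.update(block)
    return h.hexdigest()

# Demo with two synthetic "captures" of the same DV bitstream:
data = os.urandom(1 << 16)  # stand-in for the data recorded on the tape
with tempfile.TemporaryDirectory() as d:
    a = os.path.join(d, "capture_via_canon.dv")
    b = os.path.join(d, "capture_via_sony.dv")
    for p in (a, b):
        with open(p, "wb") as f:
            f.write(data)
    # Identical digests => the playback deck added and lost nothing.
    print(digest(a) == digest(b))  # True for bit-identical captures
```

If two real captures ever hash differently, the cause is dropped frames or tape errors, not the "quality" of the playback camcorder.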
 

In message <ddcv8m$1fl$1@nwrdmz03.dmz.ncs.ea.ibs-infra.bt.com>, Chas
Gill <Chas.Gill@gollum.btinternet.com> writes
>
>"Richard Crowley" <rcrowley@xpr7t.net> wrote in message
>news:11fjtptjuupg424@corp.supernews.com...
>> "David Donegan" wrote ...
>>> Would there be any difference in capture quality from
>>> a cheaper miniDV single chip cam rather than using the
>>> original 3 chip cam used for videoing the footage when
>>> transferring to PC? The reason is my intention to save
>>> on wear from my good camera.
>>
>> The "camera" part of the camcorder plays no part in the
>> transfer of the digital data from the tape into the computer.
>> Therefore there will be no difference in capture quality
>> because of using a single-chip camera.
>>
><snip>
>
>Whilst I don't know this to be true in every case, it has come to my
>attention in the past that manufacturers design systems so that no part is
>better than the weakest link (because there would be no point in "wasting"
>the money). Thus I can envisage that the playback/output circuitry in a
>single CCD camera MIGHT not be as good as that in a 3-CCD camera, because
>the quality of the captured image (in the camera) would not be as good.
>This COULD affect external capture quality. However, I've not investigated
>this, so I could be talking complete B*****cks. No doubt someone will
>delight in telling me so, if this is the case.
>
I would suggest that you are mistaken in this instance. On record, the
CCD(s) output goes to the DSP in one configuration, and the DSP feeds
the result to the drivers for the tape recording.

On replay, the signal from the tape goes to the DSP in a different
configuration, then on to the firewire driver/control circuits. It
doesn't go anywhere near the CCD. So I fail to see how the CCD(s)
characteristics can have any effect whatsoever during replay.
--
Tony Morgan
http://www.camcord.info
 

"Tony Morgan" <Tony@82.69.78.126> wrote in message
news:kprTq4FsCj+CFw1I@82.69.78.126...
> In message <ddcv8m$1fl$1@nwrdmz03.dmz.ncs.ea.ibs-infra.bt.com>, Chas Gill
> <Chas.Gill@gollum.btinternet.com> writes
>><snip>
> I would suggest that you are mistaken in this instance. The CCD(s) output
> goes to the DSP with one configuration, then feeds it to drivers for the
> tape recording.
>
> On replay, the signal from the tape goes to the DSP with a different
> configuration, then on to the firewire driver/control circuits. It doesn't
> go anywhere near the CCD. So I fail to see that the CCD(s) characteristics
> can have any effect whatsoever during replay.
> --
> Tony Morgan
> http://www.camcord.info

This I understand. I think the point I was making was that the bitstream
carries less data in a single CCD camera than it does in a 3-CCD camera.
Forgive me if I misunderstand something here, but if the 3-CCD camera is
capable of better quality images, then my logic tells me that each image
frame must have more bits to describe it. This would mean more bandwidth
required and consequently better quality output circuitry than might exist
in a single CCD camera. I have no idea if this is the case, however, and
would welcome enlightenment.
 

"Chas Gill" wrote ...
> ... I think the point I was making was that
> the bitstream carries less data in a single CCD camera than it does in a
> 3-CCD camera.

No. The bitstream neither knows nor cares what the source of
the image was. It is standardized across all camera types, else
we couldn't use it interchangeably.

> Forgive me if I misunderstand something here, but if the 3-CCD camera is
> capable of better quality images, then my logic tells me that each image
> frame must have more bits to describe it. This would mean more bandwidth
> required and consequently better quality output circuitry than might exist
> in a single CCD camera. I have no idea if this is the case, however, and
> would welcome enlightenment.

The "bandwidth" of the DV codec exceeds the resolution of
1-chip cameras. The bandwidth is always there whether the
camera can generate sufficiently detailed images to expolit it
or not.
 

In message <dddgg6$g27$1@nwrdmz01.dmz.ncs.ea.ibs-infra.bt.com>, Chas
Gill <Chas.Gill@gollum.btinternet.com> writes
Snipped...

>> I would suggest that you are mistaken in this instance. The CCD(s) output
>> goes to the DSP with one configuration, then feeds it to drivers for the
>> tape recording.
>>
>> On replay, the signal from the tape goes to the DSP with a different
>> configuration, then on to the firewire driver/control circuits. It doesn't
>> go anywhere near the CCD. So I fail to see that the CCD(s) characteristics
>> can have any effect whatsoever during replay.
>> --
>> Tony Morgan
>> http://www.camcord.info
>
>This I understand. I think the point I was making was that the
>bitstream carries less data in a single CCD camera than it does in a
>3-CCD camera.

I don't think so. On record (with 3-CCD), the DSP resolves the three
CCD inputs to a DV25 digital signal and feeds that to the drivers for
the tape.

On record with a single CCD, the DSP converts its output to a DV25
digital signal and feeds that (as before) to the drivers for the tape.
Essentially the same DV25 signal, except that it might be of slightly
lower quality w.r.t. the 3-CCD operation.

The sampling rate is exactly the same in both cases, and the DV25
bit-rate is exactly the same also.

>Forgive me if I misunderstand something here, but if the 3-CCD camera
>is capable of better quality images, then my logic tells me that each
>image frame must have more bits to describe it.

I think you are looking at an analogue signal, where the bandwidth is
very significant. But here we are in the digital domain.

> This would mean more bandwidth required and consequently better
>quality output circuitry than might exist in a single CCD camera. I
>have no idea if this is the case, however, and would welcome
>enlightenment.

Perhaps the best way of understanding it is to consider the process
where you use your camcorder to convert an analogue signal to a digital
DV25 signal to record to tape. Most camcorders with DV-in allow you to
do this. I have "captured" analogue signals from both VHS tape (quite
low quality, i.e. low analogue bandwidth) and from my Sky box
(appreciably better quality than VHS). The sampling rate is the same
for both. The DV25 digital signal recorded to tape is also exactly the
same for both. The difference is not in bit rate (anywhere in the
signal path): when the signal is ultimately decoded from digital back to
analogue, you are left with essentially the same lower quality analogue
(from VHS), and the higher quality analogue (from the Sky box).

For any particular sampling rate, there is indeed a maximum analogue
bandwidth (i.e. quality) that can be encoded - but DV25 encompasses both
the VHS source and the Sky box source within its resolving range.

A similar situation exists when comparing 3-CCD input/output with
single-CCD input/output. The sampling rate is the same for both, and
the DV25 bit-rate is the same for both.

Three years ago, there was a more noticeable difference between 3-CCD
video and single-CCD video, but the gap has closed considerably today -
especially with the upper-range single-CCD camcorders. Sony kicked it
off about three years ago with the development of their proprietary HAD
sensor technology and their 16-bit DSPs. Prior to that, camcorder DSPs
were 12 bits wide, and today the low-end camcorders still use 12-bit
technology. Because of the wider DSP bus, the newer camcorders can do
more real-time processing to improve the quality of the image presented
to the tape recording drivers - but the recording always uses the same
DV25 bit-rate. Taking analogies further, it's like comparing 16-bit PCs
with 32-bit PCs and (now) 64-bit PCs: all it means is that more
processing can be done in a given "window" of time.

Another example might also illustrate it. Consider the digital storage
of inputs from high quality microphones (with a frequency response of
20 Hz to 20 kHz) compared with low quality microphones (with a frequency
response of 50 Hz to 16 kHz). The sampling rate of both is the same, as is
the bit-rate of the resulting signal. But when the digital signal is
extracted, there is the same difference in bandwidth (aka quality) as
the original analogue signals.

Before someone jumps in as so often happens, I have here been discussing
in the context of miniDV camcorders, not the professional recording
technologies.

--
Tony Morgan
http://www.camcord.info
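Tony's fixed-bit-rate point can be made concrete with the standard DV frame-size figures (these numbers come from the DV format spec, not from his post; the calculation is mine): a PAL DV frame is 12 DIF sequences of 150 blocks of 80 bytes, an NTSC frame is 10 such sequences, and the resulting tape rate is a constant whatever camera wrote the tape.

```python
# DV frame structure (per the DV spec; identical for 1-CCD and 3-CCD cams):
# a frame is N DIF sequences x 150 DIF blocks x 80 bytes.
DIF_BLOCK_BYTES = 80
DIF_BLOCKS_PER_SEQ = 150

def dv_rate(dif_sequences, fps):
    """Total DV tape data rate in bits per second (video + audio + subcode)."""
    frame_bytes = dif_sequences * DIF_BLOCKS_PER_SEQ * DIF_BLOCK_BYTES
    return frame_bytes * fps * 8

pal = dv_rate(12, 25)             # PAL: 12 DIF sequences, 25 fps
ntsc = dv_rate(10, 30000 / 1001)  # NTSC: 10 DIF sequences, ~29.97 fps
print(round(pal / 1e6, 2))        # 28.8 (Mbit/s, total tape rate)
print(round(ntsc / 1e6, 2))       # 28.77 (Mbit/s)
# The video payload inside this stream is ~25 Mbit/s - hence "DV25" - and
# it is the same constant regardless of the camera section that fed it.
```

Note the PAL and NTSC totals come out essentially the same by design; the camera section can only change what the bits describe, not how many of them there are.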
 

On 8/10/2005, Tony Morgan managed to type:
> In message <dddgg6$g27$1@nwrdmz01.dmz.ncs.ea.ibs-infra.bt.com>, Chas Gill
> <Chas.Gill@gollum.btinternet.com> writes
><snip>

Reading these posts gave me another idea about how to express the cheap
vs. expensive problem here. This will clarify (I hope) or muddy the
discussion ;-)

Both cameras use the same sampling rate and the same digital tape
format to create their digital data from the analog input. The cheaper
camera might produce a poorer analog signal before it gets to digital,
and might convert it less accurately, but both cameras produce an
equally precise encoding of their respective signals.

Thus the expensive circuitry produces a very accurate rendition of a
good signal and the cheap circuitry produces an equally accurate
rendition of a poor signal.

I'm fudging a bit here, in that poor A-to-D conversion means the
cheaper camera produces a very accurate rendition of a converted
signal, which is not necessarily the exact input to the A-D conversion.

On analog playback the cheaper camera may also do more poorly, due to
poor D-A conversion and/or poor analog circuitry after the conversion,
but on digital transfer, you get the bits that are on either tape (the
digital part was already said, in slightly different words, in this
thread).

Gino

--
Gene E. Bloch (Gino)
letters617blochg3251
(replace the numbers by "at" and "dotcom")
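Gene's point - that the same quantizer is equally precise whatever it is fed - can be sketched numerically: quantize a clean signal and a noisier one with one uniform quantizer, and the encoding error is bounded by half a step in both cases. A toy stdlib-Python illustration (synthetic signals, not DV's actual coding):

```python
import math
import random

def quantize(x, step):
    """Uniform quantizer: snap x to the nearest multiple of step."""
    return round(x / step) * step

STEP = 2 / 256  # 8-bit-ish quantizer over the range [-1, 1]
random.seed(1)

clean = [math.sin(2 * math.pi * t / 100) for t in range(1000)]
# The noisier "cheap camera" signal: same content plus analog noise, clipped.
noisy = [max(-1.0, min(1.0, s + random.uniform(-0.05, 0.05))) for s in clean]

for name, sig in (("clean", clean), ("noisy", noisy)):
    worst = max(abs(s - quantize(s, STEP)) for s in sig)
    print(name, worst <= STEP / 2 + 1e-12)  # True: equally precise for both
# The noisy signal is still noisy after encoding - the quantizer faithfully
# records whatever it is given. The loss happened before digitization.
```

The design point is that precision is a property of the quantizer (step size), not of the signal: a poor front end gives you an exact record of a poor signal.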
 

In message <4ctEHlI3Jl+CFw1h@82.69.78.126>, Tony Morgan
<Tony@82.69.78.126> writes
>But when the digital signal is extracted, there is the same difference
>in bandwidth (aka quality) as the original analogue signals.

Sorry, that should have read "But when the analogue signal is extracted
from the digital, there is..... etc"
--
Tony Morgan
http://www.camcord.info
 

One thing to keep in mind - for some odd reason, tapes recorded on some
Canon digital cam models can't be played back for capture on just any
other digital video cam. For example, I can't capture a tape on my
Panasonic 3-chip PV-GS120 that was filmed on a Canon ZR series cam. I
have to get my Sony cam out to do this.

joe
 

"Gene E. Bloch" <spamfree@nobody.invalid> wrote in message
news:mn.53a47d58d22c5763.1980@nobody.invalid...
> On 8/10/2005, Tony Morgan managed to type:
><snip>

Thank you, gentlemen. It's all a lot clearer now.

Chas
 

"Gene E. Bloch" wrote ...
> Thus the expensive circuitry produces a very accurate rendition of a good
> signal and the cheap circuitry produces an equally accurate rendition of a
> poor signal.
>
> I'm fudging a bit here, in that poor A-to-D conversion means the cheaper
> camera produces a very accurate rendition of a converted signal, which is
> not necessarily the exact input to the A-D conversion.

Good A/D conversion etc. is pretty cheap, and it wouldn't surprise me
if you found the same circuitry/chips/"quality" in very low-end
camcorders as in high-end ones.

But the pickup device(s) (optical chips) and the "glass" (lens assembly,
etc.) are the expensive and labor-intensive parts. That is where the
biggest difference is to be found between low-end (1-chip) and high-end
(3-chip) cameras. Then, of course, you additionally have the optical
splitters/filters required for 3-chip operation.

You could buy two or three top-of-the-line 1-chip camcorders for the
price of the average lens for a 3-chip camera.
 

"eunma" wrote...
> One thing to be keep in mind - for some odd reason tapes recorded using
> some Canon digital cam models, can't be played back for capture using
> just any other digital video cam. For example, I can't capture a tape
> using my Panasonic 3-chip PV-GS120 that was filmed using a Canon ZR
> series cam. I have to get my Sony cam out to do this.

Standard speed, or long-play?

This is called an "interchange" problem and indicates that the
camcorders are not all in proper alignment. (Unless it is long-
play, where this is very tricky.)
 

peter


Tony Morgan <Tony@82.69.78.126> wrote:

>Three years ago, there was a more noticeable difference between 3-CCD
>video and single CCD video, but the gap has closed considerably today -
>especially with the upper-range single-CCD camcorders.

Especially as the sub-£1000 "3-CCD" cams are IMV no better than the
average 1-CCD cam. I had a 2003 Panasonic ...70 (can't remember the
exact P/N) 3-CCD, and I tried very hard on comparisons with a 1999 Sony
PC100; even after transferring to MPEG and looking at frame detail I
could not see any difference between the two.

I don't think 3-CCD yields better quality until one gets to, say, a
TRV950. The question I posted here is whether the TRV950 is still
better than the HC1E, but nobody seems to know.

Anyway, I agree with everyone here re the playback.
 
