
The SSD Power Consumption Hoax

Last response: in Reviews comments
Anonymous
June 27, 2008 5:49:16 AM

This is interesting, but makes sense. So many power-saving features have been implemented in HDDs, but SSDs are very young. I'm sure that at least some companies understand and recognize this, and are developing new features to address it.

Technically, the claim of "energy efficiency" may not actually be true. Efficiency could be measured in mW spent per KB of information retrieved. For example, an HDD doing sequential reads uses very little power, and retrieves only the information needed. It could be that SSDs have to check many more bits of data in order to find the information requested. This is just a theory.
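Sketching that "energy per amount of data" idea in Python (the wattage and throughput figures below are made up for illustration, not taken from the article):

```python
def joules_per_mb(power_watts, throughput_mb_per_s):
    """Energy spent moving one MB: power draw divided by throughput."""
    return power_watts / throughput_mb_per_s

# Hypothetical figures for illustration only:
hdd_seq = joules_per_mb(1.1, 40)   # HDD streaming sequentially
ssd_seq = joules_per_mb(1.0, 80)   # SSD doing the same task
print(hdd_seq, ssd_seq)            # roughly 0.0275 vs 0.0125 J/MB
```

On this metric a slower drive at similar wattage is the less efficient one, whatever its idle and load numbers look like in isolation.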

I believe the salespeople took a logical step in claiming energy improvements, given that an SSD uses no visible motors. However, this testing (at least in my eyes) shows that they simply "forgot" to check the actual statistics.

It could also be that as cell density increases, read and write speed will increase while power consumption stays almost static. Since the 1.8" SSDs eat up almost the same number of watts as the larger SSDs, it could be said that the power is not being consumed in the memory itself, but in the controller that accesses the information. It is very interesting to see the small difference between the different sizes of SSDs, and it might be a window into the full potential of these devices.
Score
0
Anonymous
June 27, 2008 7:46:01 AM

A strange picture comes from this article.
The overview seems reasonable, but compared to the benchmark results it looks quite different.
Just look at the Battery Runtime and Energy Consumption charts!
The SanDisk SSD drive at LOAD requires 1.0 W while the Hitachi HDD requires 1.1 W at IDLE (which to me means in POWER SAVING MODE).
So if laptop runtime with the Hitachi HDD is longer than with the SanDisk SSD, the benchmarks seem VERY unreliable.
Score
9
June 27, 2008 8:10:32 AM

Some improvements could be made to the "disks" themselves (although can we still talk about disks in that case?), but I think there would be quite a lot to gain with a software solution - like was done with Native Command Queuing on SATA disks.

For example, better-working read and write caching would be a boon: since SSDs have pretty much no access-time penalty, a large write cache becomes much less interesting - grouped writes, optimized for the SSD's internal cache, may reduce the time the drive stays active. Deactivating the swap file (or not using it as often as Windows does) would also probably reduce 'active' time quite a lot.

It would be interesting to see the difference when using a more disk-efficient OS - say, any Linux-based distribution? After all, Dell provides Ubuntu on some laptops, and UMPCs currently often ship with SSDs - so it would actually be quite relevant.

Why the Linux comment? Typically on these systems, the swap gets written to only when RAM reaches around 60% (or more) utilization, while Windows preemptively copies 'dirty' RAM pages to swap and frees RAM only when actually needed (leading to faster freeing of RAM pages, but also frequently unnecessary disk access).

Running these tests again (with all disks and some extra RAM) with:
- Windows using no page file
- Linux with a swap partition
- Linux with deactivated swap
may give us a better idea of SSD advantages.

About those who would complain that relying on RAM instead of swap is not possible:
- current systems (except lower Windows editions) can handle more than 4 GB of RAM; Windows 2003, for example, supports PAE and extended memory sizes (64 GB)
- if you have cash to pay for an SSD, you have cash to pay for 4 or 8 GB of RAM

Of course, that doesn't prevent SSD makers from looking for extended power schemes (reads should require less power than writes, and idle should require no power) and a low power SATA mode (allowed by shorter cabling) could be designed.

Mitch
Score
0
June 27, 2008 10:07:59 AM

Man, my laptop would be lucky to last 1 hour :lol: 
Score
-1
June 27, 2008 10:25:36 AM

This SHOULD have been included in the article... since it's the basis of your claims. Copied from the MobileMark website:

Quote:
MobileMark 2007 incorporates the following applications:

* Adobe® Acrobat Reader 7.0
* Adobe® Illustrator® CS2
* Adobe® Photoshop® CS2
* Apple® Quicktime 7.1
* Intervideo® WinDVD® 8
* Macromedia® Flash 8
* Microsoft® Office® 2003 Pro
* Microsoft® Project 2003
* Winzip® 10.0


Quote:
Recognizing the increasing diversity in the use of notebook computing, MobileMark 2007 features DVD, Wireless and Reader modules in addition to the existing core Productivity module. Each module includes a robust and refreshed set of applications and can be run individually to show battery life in a greater variety of specific scenarios.
Score
3
Anonymous
June 27, 2008 10:34:31 AM

There could be a systematic error in the benchmarks shown: if the flash-based "disks" are faster, then the whole system (CPU/memory/chipset) would draw much more power with flash "disks" than with conventional disks - simply because the benchmarks could run more often in the same time.

Maybe one should compare something like playing video from disk, where it is assured that the systems do precisely the same work?
Score
11
June 27, 2008 10:36:39 AM

I am thinking the idle power of an SSD could in theory be zero, since it is a flash drive and there is no "spin up". This is interesting... it seems the drive electronics (not the actual flash chips) are eating a lot of juice.

An OS patch with an option to put the drive to sleep when inactive for something like 1 second seems like it would save power, and since power-on should be instant, it would have almost no negative impact.
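A rough simulation of that idle-timeout policy (pure sketch, not a real OS patch; the 1-second threshold is the figure from the comment and the access pattern is invented):

```python
def awake_seconds(access_times, idle_timeout=1.0):
    """Total time the drive stays powered: each access keeps it awake
    until the next access arrives or the idle timeout expires."""
    times = sorted(access_times)
    total = 0.0
    # Pair each access with the next one; the last access is followed
    # by a sentinel one full timeout later.
    for now, nxt in zip(times, times[1:] + [times[-1] + idle_timeout]):
        total += min(nxt - now, idle_timeout)
    return total

# Three accesses in quick succession, then a long gap, then one more:
print(awake_seconds([0.0, 0.2, 0.5, 10.0]))  # 0.2 + 0.3 + 1.0 + 1.0 = 2.5 s
```

With near-instant wake-up, everything outside those awake windows could sit at (close to) zero watts.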
Score
2
June 27, 2008 11:13:20 AM

I agree with Fritz, I think it's very important to make sure the work the laptop does with SSD and HDD is the same. And I'm not sure having a laptop run a benchmark until it runs out of battery comes close to real world usage. Playing a movie again and again would probably be a simple and clean test that is not questionable. At the moment I'm not convinced that your conclusions are correct.
Score
14
Anonymous
June 27, 2008 11:27:59 AM

What about just turning on a laptop and leaving it running Word and Outlook and browsing the web for the complete battery life? Now that's a typical session...
Score
0
June 27, 2008 11:41:32 AM

As I pointed out in the conclusion, an application scenario that does not tax the drive very much can very well lead to increased battery runtime. But does this reflect user behavior?

Think of it this way: you hibernate the system because you have to change location. You start applications, you close them. Antivirus software does its job. The swap file causes frequent access. Indexing triggers more drive activity. A P2P client causes constant access as well... the more we multitask, the more drive access we cause. This leads to power disadvantages for flash SSDs, at least for the current product generation.

I decided to run the tests because I had changed from a Seagate Momentus 7200.1 hard drive to the Mtron Flash SSD drive half a year ago - and I saw a decrease in battery runtime of my Lenovo T60 that could not be attributed to wear of the battery. Hence we followed up and found this.

If you want to send me more specific suggestions on additional tests, such as specific applications, I'll be happy to run them when we compare the next Flash SSDs. Email pschmid@bestofmedia.com.

Thanks for your comments,
Patrick
Score
-4
June 27, 2008 12:25:46 PM

In reply to neg@infostrade.com.pl:

That is the trick in marketing SSDs. If you plainly compare IDLE and LOAD wattage, yes, SSDs SHOULD be using less energy than conventional platter drives. But what you have forgotten is read/write performance. You have to count that in when measuring power consumption. For example, if the SanDisk SSD takes 2 seconds where the TravelStar takes 1 second to read or write the same data, tell me, which one would be in the LOAD state longer? Let's say a noname SSD takes 1 second to write 1 MB and a noname platter HDD takes 1 second to write 10 MB. The LOAD wattage of the SSD would have to be 1 W and the HDD's 10 W for their power consumption to be equal for the same amount of data written! If you only look at the wattage, it doesn't tell you anything at all.

This is what we call "weighted".
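Spelling out that "weighted" arithmetic, using the hypothetical "noname" figures from the comment:

```python
def joules_per_mb(load_watts, mb_per_second):
    """Energy to move one MB = load power x seconds spent per MB."""
    return load_watts * (1.0 / mb_per_second)

ssd = joules_per_mb(1, 1)    # 1 W drive writing 1 MB/s  -> 1 J/MB
hdd = joules_per_mb(10, 10)  # 10 W drive writing 10 MB/s -> 1 J/MB
print(ssd == hdd)            # True: equal energy per MB despite 10x wattage
```

The point being that wattage alone, without throughput, says nothing about energy per unit of work.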

I'm also eager to buy an SSD - not because of power consumption, but because it seems less prone to shock than a conventional HDD. I hate sudden HDD deaths without much warning, even when you think you keep the drive in a good environment. At least an SSD has a "life expectancy" you can predict.
Score
-2
June 27, 2008 12:56:06 PM

To Tom's Hardware:

I also partly agree with Fritz and mastrom, and partly disagree! Why? Because as long as the drives (both SSD and HDD) are fast enough to keep up with the system in playing a VIDEO file, it is, in theory, READ-ONLY performance to me. So I can't conclude, but I will assume that SSD drives will use less power than HDDs, because we all see the IDLE and LOAD wattage for both types of drives.

Instead, measurement of power consumption IMO should use a fixed but large file to process, and see which drive finishes faster, given that we use the SAME MACHINE (just swap the drive). It is like the "controlled experiment" we learn about in high school science class: just one difference while doing the same amount of work - the drive.

Whatever you call "real life" is very subjective and really varies between individuals. For benchmarking purposes, I think we should just "benchmark" it like you used to do with CPUs - on the same board, with the same other hardware.

If you want to give different types of people an idea of which drive they should choose, haha, maybe you should do another two tests besides the "benchmarking". One is plain light use (web browsing, office applications, typing) - that sort of less intensive work. The other is on the intensive side, like people who edit in Photoshop or do video editing for their whole office day.

Sorry, I know it's gonna take a lot of time to do these. It's a matrix of drives x 3 working environments x the time you need to test each scenario.

I don't see this article as conclusive at all, but it's good work. It makes people think and reconsider.
Score
2
June 27, 2008 1:38:01 PM

I'm surprised nobody touched on this: SSD drives scaled down to 1.8" take about the same power as 2.5" ones. This would imply that scaling up to 3.5" (or maybe 5.25" - remember the Bigfoot drives?) would take the same power (or nearly so) as the 2.5" drives.

You'd be looking at huge power savings against 3.5" drives, not to mention performance increases. Then you could also add capacity almost indiscriminately, and power consumption probably wouldn't change much, since it sounds like most of the power usage is coming from the controller, which wouldn't likely change.

No, we aren't talking mobile power saving benefits with this scenario, but think of all the desktop workstations and home PCs in the world and all the potential power savings there.
Score
0
June 27, 2008 2:10:29 PM

Perhaps SSD makers need to seriously invest in the memristor technology HP has invented recently. This would solve their issue of power consumption.
Score
-1
June 27, 2008 3:14:26 PM

I agree with fritz (above & quoted below). Maybe since the drives are quite a bit faster, the system is actually doing more work in the same time, meaning CPU use is higher and power consumption is up. Just an idea - I don't know how the benchmark program is designed.

"There could be a systematic error in the benchmarks shown: if the flash based "disks" are faster then the whole system CPU/MEM/Chipset would draw much more power with flash "disks" compared with conventional disks - just because the benchmarks could run more often in the same time.

Maybe one should compare something like playing video from disk where it is assured that the systems do precisely the same work?"
Score
4
June 27, 2008 3:14:39 PM

Wow, I've always respected Tom's Hardware as the most reliable source for tests... well, not anymore after this article...

Two of the BIG mistakes have already been mentioned on this comment page:
1. "Additional tasks you would not have normally done on a laptop (media processing, HD video, etc.)" and the possibility of "Windows doing more background stuff (desktop search, adware/AV, etc.) because it doesn't take as long to finish maintenance tasks" would DEFINITELY influence the consumption of the whole system, but this isn't even considered.
2. "The SanDisk SSD drive at LOAD requires 1.0 W while the Hitachi HDD requires 1.1 W at IDLE." So clearly, either the "power consumption" graph is WRONG or the whole article is wrong. So I ask the editors to either take down the article or correct the graph.

Until they correct the errors of this article, do not believe its conclusions. SSDs do consume less for any X amount of tasks in a Y amount of time, and that is the only factor that counts. Full stop.

This article measures Y amount of time with a different amount of tasks... very clever... :-s
Score
17
June 27, 2008 3:32:57 PM

Meeooww!!! SNARF! Interesting how what I assume to be early SSD adopters viciously defend their prized and overpriced new toys... LOL!

Face it, guys, you were spoofed. Current-generation SSDs are overpriced and not ready for prime time. You are bankrolling R&D. The results are very valid, as most users who actually do "work" on their laptop use multiple programs at once and have various services and applications running simultaneously. They don't just turn their notebooks on and occasionally surf the web or check email, and no, it's not realistic, as some goofus proposed, to run the test on a Linux system... yeah, 'cause we all run Linux. Another guy was clutching at straws about saving a second here and a second there because the awesome performance increase of the SSD will save so much time.

LOL! This has been an entertaining comments thread to read. Keep 'em coming.
Score
-5
June 27, 2008 3:56:57 PM

But if the flash drives are as much as three times faster in many respects...
That would mean that whatever you are doing will take less time to complete, and the drive would stop reading earlier.
Score
6
Anonymous
June 27, 2008 5:30:56 PM

Why is there a huge spread in I/O operations per second on the charts? Why are the numbers for MobileMark07 Performance and "File Write Performance (PCMark05)" identical? Where did they get the "Performance*Runtime/1000" numbers? I can't replicate them from any numbers provided. Not only that, but the runtime minutes were pretty close, so those results would suggest that the Hitachi drive's performance was better than the SanDisk SSD's, and... well, here are the performance numbers I recreate from the runtime minutes: Hitachi 284, SanDisk 274, MemoRight 290, etc. Go look at all those performance charts and tell me: does that look right to you?
Score
4
Anonymous
June 27, 2008 6:23:36 PM

Third last paragraph:
"do not even provide convincing performance while they help to suck your battery empty quicker than before."

Last paragraph:
"you can at least be sure that you get the best drive performance"

must....resist.....lol....
Score
0
Anonymous
June 27, 2008 6:44:57 PM

Well, so far the SSD drives have been compared to only one HDD.
Probably a good performer.
I would like to see these tests versus a Toshiba MK1234 and other mobile drives.

Most certainly SSD drives have a better performance/watt ratio than most desktop drives.
It can also be mentioned that as flash size increases, power consumption increases. SSD drives below 16 GB are usually more energy efficient than (mobile) HDDs.

I thought that the speed difference of flash (MLC/SLC) depended on the controller, not the flash cells themselves (since they only hold the memory bits). I thought SLC and MLC flash cells run at the same speed, but that a simpler design with fewer 'lanes' (so to speak) from controller to cells is the reason MLC drives are slower than SLC.

On the other hand, in defence of HDDs: SSD drives have really low latency (read after read), but their write-after-read/write and read-after-write, as well as random reads/writes of small blocks, are extremely slow.
Score
1
Anonymous
June 27, 2008 6:46:15 PM

Also, they forgot to include the power loss on drive spin-up.
Score
0
June 27, 2008 6:49:49 PM

There is value in the fact that when he replaced his notebook drive, he got lower battery life - that speaks to the article's credibility, and no one can deny that.

However, I think that the test can be tweaked a bit.
Run the test on the same notebook, same OS, same everything (the MobileMark test, that is).
When one test completes, note the time it took and the amount of battery it used (either by seeing what % is left, or by measuring the battery's voltage and current).

Then divide the % used (or power used) by the time to complete, and you'll get a ratio - and you can use this to compare which drive uses the most battery, taking into account the traditional hard disk's variable-load advantage versus the SSD's high-transfer-rate advantage.
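That ratio could be computed along these lines (the battery percentages and minutes below are placeholders, not measured values):

```python
def battery_per_minute(pct_battery_used, minutes_to_finish):
    """Battery percentage burned per minute of one complete test pass."""
    return pct_battery_used / minutes_to_finish

# Placeholder numbers: drive A finishes a pass in 50 min using 30% battery,
# drive B finishes in 40 min using 28%.
drive_a = battery_per_minute(30, 50)   # 0.6 %/min
drive_b = battery_per_minute(28, 40)   # 0.7 %/min
```

Because both drives complete the same pass, the drain per pass (and per minute) becomes comparable on an equal-work basis.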
Score
3
June 27, 2008 7:44:10 PM

@pschmid

I think system uptime at idle should be used to state generically whether SSDs use more power - to answer what the minimal power cost of having one drive over another is.

These ratio ideas people are trying to hit at are close but not quite there. I think the ratio the road warriors care about should be "performance increase(or time saved)/battery life lost" instead of "work done/power used" or "work done/time".

If the work I need to get done takes 7hrs 3min on a HDD and 6hrs 3min on an SSD...who bloody cares? I get done an hour earlier!

But if the SSD only saves me 25 minutes and costs me an hour of battery... I won't even finish. All the other ratios are informational, but "performance increase (or time saved) / battery life lost" is what the road warriors will want to know.
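A sketch of that road-warrior ratio, using hypothetical figures that echo the 25-minutes-saved / hour-of-battery-lost scenario:

```python
def minutes_saved_per_battery_minute_lost(hdd_task_min, ssd_task_min,
                                          hdd_batt_min, ssd_batt_min):
    """Task time saved by the SSD per minute of battery runtime given up."""
    saved = hdd_task_min - ssd_task_min   # how much sooner the work is done
    lost = hdd_batt_min - ssd_batt_min    # how much runtime the SSD costs
    return saved / lost

# Hypothetical: task drops from 240 to 215 min, battery from 300 to 240 min.
print(minutes_saved_per_battery_minute_lost(240, 215, 300, 240))  # ~0.42
```

A value below 1 means each minute of battery sacrificed buys back less than a minute of work - the losing trade described above.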

That's not even fair, though, because it's the opportunity for a performance increase more than anything else; the ratio fluctuates with user efficiency. You'd have to consider the work I can get done on espresso vs. my grandmother on her meds.

I won't even go into the "time worked/system load" differences between a mobile videographer and an author on the road.

These current tests don't consider the ratio, or user efficiency, or varying workloads, so I don't think they can represent valuable data about realistic usage any better than an idle uptime test could.
Score
0
June 27, 2008 9:07:34 PM

Anyone remember the Quantum Bigfoot 4200 RPM drive? I'd like to see a 5.25" SSD version. I bet they could hit at least a TB with that much space, easy! If the power usage really doesn't scale with size, clearly this is an interface-electronics power issue.

I agree that the benchmark results seem kind of odd. If everything else were equal, then yes, these drives consume more power. But certainly, SSDs spend less time seeking, which helps accomplish the overall task faster. Once the task is complete, the drive returns to an idle state. The problem with a burn-in style benchmark is that it is designed to run a device at full load.

If we used this system to determine the performance characteristics of a hybrid automobile, it would perform the same as a standard engine (or worse), because the SSD's strength (like the hybrid auto's) is in the short bursts of energy it consumes. All these benchmarks prove is that in the worst possible scenario (constant [sequential] full load) the SSD loses... what you should do is run the SAME test on a heavily fragmented drive, then run both tests again.
Score
0
Anonymous
June 27, 2008 9:50:18 PM

While focused tests are of some value, they will always be open to valid "do they fit the real-world" criticism.

Most importantly:
a) I swap "x" disk with "y" disk in my laptop.
b) Over a period of weeks and doing approximately the same amount/type of work, I observe that I am getting a marked difference in battery life.
c) I swap "x" back in and once again notice the battery life change back to the original "x" value.
-----
d) We observe markedly similar results with a number of different systems and a number of users under these circumstances.

The result is a reasonably-close-to-best-case probabilistic view of what the average user is likely to encounter in terms of power-consumption vs productivity.

While Tom's Hardware deserves a great deal of backlash for the relatively recent and drastic drop in quality content, I believe, in this case, Tom's Hardware should be commended for alerting users to a reasonable possibility of an important discrepancy between current thinking and real-world applicability in this subject matter.
Score
1
Anonymous
June 27, 2008 10:17:58 PM

Igot1forya (2 posts up)
"Anyone remember the Quantum Bigfoot 4200 RPM drive? I'd like to see a 5.25" SSD version. I bet they can hit at least TB with that much surface space easy!"

I bet they could come close to repeating the Deathstar failure rate, too.
Score
0
June 27, 2008 10:19:50 PM

An interesting article was written by Anand at anandtech.com. He swapped a 1.8" HDD (PATA) for a 1.8" SSD in a MacBook Air (which is basically plain old vanilla Intel Centrino except for the processor, but the point to take away is that the test was performed on the exact same platform with no changes except the disk drive).

To roughly paraphrase his review of the impact of battery life: It improved, depending on the usage.

Link to the review:
http://www.anandtech.com/mac/showdoc.aspx?i=3226&p=16


To be fair though, he also did a swap using a MacBook Pro and a 2.5" HDD vs. a 2.5" SSD. Anand's conclusion (Final Words):

In many ways, the SSD option on the MacBook Air is an easier decision to make. In many cases, performance went down but the improvements in battery life and application launch time make the option worth it if you've got the gold to spare.

Adding the Memoright MR25.2-128S to your MacBook Pro is a much more difficult decision to make. Battery life doesn't improve, but performance can increase anywhere from 0 - 60% depending on what you're doing. Within an application it's unlikely that you'll see any huge gains, you'd need a faster CPU for that. But, launching applications, interacting with the filesystem, booting your machine, all of these things get significantly quicker with the Memoright drive.

The problem is that despite the performance increases, the cost of entry is nothing short of tremendous. At $3,819 for 128GB the most expensive part of your notebook would be the hard drive, in fact it'd cost more than your entire notebook put together. Then there's the fact that the cost of Flash memory decreases by around 40% every year, meaning that your nearly $4K SSD would depreciate faster than a Honda Civic.

Link to this review: http://www.anandtech.com/showdoc.aspx?i=3287&p=7
Score
3
June 27, 2008 11:14:27 PM

nekatreven is absolutely correct.

The use of a benchmark suite in this article that stressed the computer as a whole is a completely invalid way to measure these devices. The computer as a whole is not what we're interested in; we're interested in which devices are more efficient and therefore should lead to longer battery time for an equal amount of work.

nekatreven suggests a completely idle laptop battery time measurement be used to see how idle power consumption affects battery run time, comparing SSDs to conventional HDs. This is very correct, and should be done.

I also propose a load efficiency test, where you use a benchmark program that does constant disk reads, writes, or a combination of reads/writes, for instance IOMeter. You measure battery run time, and total amount of data moved. Now you can express an efficiency of the storage device in terms of megabytes per Watt-hour. This is the actual efficiency measurement we're interested in -- when the devices are compared on an equal-work basis. Note that because conventional HDs are more efficient in terms of MB/Wh when working with sequential data than when working with random data, you have to do two suites of IOMeter tests - one working with sequential data and another working with random data, which should theoretically show a widening gap of MB/Wh between the SSDs and conventional HDs as the data accesses get more random. This further implies that energy efficiency of SSDs is greater for applications using random accesses (virus scan, Windows startup) than applications using sequential accesses (movie playback).
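The proposed metric itself is simple to compute; here is a minimal sketch (the 90 GB moved and the 56 Wh battery below are placeholder figures, not IOMeter results):

```python
def mb_per_watt_hour(total_mb_moved, battery_watt_hours):
    """Storage efficiency on an equal-work basis: data moved per Wh."""
    return total_mb_moved / battery_watt_hours

# Placeholder run: 90 GB moved before a 56 Wh battery is exhausted.
print(mb_per_watt_hour(90 * 1024, 56))  # ~1645 MB/Wh
```

Running this once with a sequential IOMeter pattern and once with a random pattern would give the two MB/Wh figures whose gap the comment predicts.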

Given how the tests in this article were conducted, the results and conclusions are meaningless and irrelevant, as is typical for Tom's.
Score
5
Anonymous
June 27, 2008 11:15:39 PM

OUTSTANDING article. It does remind me a bit of cold fusion, though. Has anyone else managed to duplicate these results? If so, why the silence? If not, it would be nice to see some contradictory results.
Score
-4
June 27, 2008 11:58:59 PM

I would tend to agree with this testing. RMHDs do have the ability to consume less power while sustaining reads when the data is sequential. It just comes down to the efficiency of the SSD's controller.
Score
-3
June 28, 2008 12:21:56 AM

Good article, BAD graphs!!! It's so annoying that every single graph in this article is labeled (in bold, too) "Battery Runtime on Flash", while only one of those graphs actually deserves that description. You should just get rid of that "Battery Runtime on Flash" bold header and bold the text right under it.

Sorry to be anal, but if you really take yourselves seriously, you should realize that clear and concise graphs are really important. Having graph headers that don't correspond to the information in the graph is just BAD!
Score
0
June 28, 2008 7:12:27 AM

The sad part is that I don't think Tom's Hardware was biased on purpose. They just don't get it.

It's obvious that since the performance is so much better with SSDs, you can do more in less time, so you can finish your work faster. And after that the system sits idle or even hibernating, using 0 (zero) Watts.

Also, a faster drive means the system will spend less time waiting for storage, so the CPU (and other components) will be used more - meaning less battery life, depending on how the CPU and other components use power under higher load.

So Tom's probably ended up testing the laptop's CPU or something. Even then the test would be inconclusive, because when the CPU uses more power, it usually gets more work done.

I can't think of any situation when Tom's test is even remotely relevant.

It's like you could get from A to B with one car using 5 gallons of fuel in 2 hours, and you could do the same with another car, but in 4 hours. Then, if you were Tom's, you could argue that the fuel lasted 2 more hours in the slower car, so the slower car is better, because the faster car uses more fuel per hour :) 

For example my car computer has two operating modes for measuring fuel consumption: when the engine runs but the car is not moving, it measures in gallons per hour. When you start moving it, it starts measuring miles per gallon.

Obviously, if Tom's were to test this car, they would have tried to measure gallons per hour when the car was moving, and in miles per gallon when the car was not moving, if that would have been possible :) 

Now seriously, Tom's may be right and the SSD may use more power. But the article is so poor that it can't prove anything either way. Even an article saying "We swapped the HDD in one of our laptops for an SSD and the battery lasted less time. The end." would have been better than this graph-supported stupidity.

I have read Tom's Hardware since the Celeron Mendocino days, but I have never seen such a poor article.

And to answer one poster, I don't try to protect "my baby". I use a normal HDD on my laptop, because SSDs are way too expensive, and because I don't really trust them from a reliability point of view.
Score
7
Anonymous
June 28, 2008 9:30:04 AM

OK, now I'm confused. This means the Asus Eee PC 1000 with a hard disk would last longer than with an SSD? In spite of what everyone (even Asus) says?
Score
-1
June 28, 2008 10:35:15 AM

I'm sorry, I must have been living under a rock. Can laptops last 7 hours under load nowadays? I always thought the max they could go was like 3 hours when idle. Wow, this is shocking to me.
Score
2
June 28, 2008 11:16:03 AM

I would agree that running apps which are not likely real-world usage for laptops is troublesome.

That should be addressed by the benchmark people: come up with not just one specific test but various workloads.

For example who uses their laptop as a server?

A big fault here, from my viewpoint, is trying to make the two storage media substitutes for each other.

I have a hard drive plus an SSD, and I use the SSD for the most frequently used data, which tends to be read-only and doesn't incur the write penalty.

To take an SSD and cram it with 50 GB of MP3 files is a waste, since thumb drives are more suitable for accessing that kind of content in your favorite playlists.

Another issue I have is the lousy utilization of RAM by most operating systems. I have seen my 2 GB of RAM barely get used while the disk thrashes due to poor caching.

It takes a whole lot of tweaking and buffer-size adjustment, and even setting up RAM disks in memory, to lower the need for access to external storage.

Booting off an SSD when a hard drive is present is a waste of SSD space, especially if you sleep the laptop between the rare boots.

For me, SSD-only setups are for rugged usage that would give an HDD heartburn.

If a manufacturer wants that, simply up the battery capacity to compensate.

The whole agenda of making a laptop so thin that I feel like I have to treat it like fine china to keep from breaking it in half really makes it difficult to fit a decent-sized battery.

Sorry folks, but I am looking for a tool to do the work I want, not a potato chip I have to worry about shattering.
Score
0
June 28, 2008 4:14:25 PM

This article is heavily biased. You take the worst SSDs when it comes to power consumption (Mtron, Memoright and Crucial) and then stack them against a normal hard drive. Most people go with these brands because of performance, not power consumption.

Secondly, are you blind not to see that the SanDisk SSD outperforms all of these in terms of power consumption (only 1 W when active)?

Why don't you go test a Samsung SSD, and at least relearn how to draw unbiased conclusions?
Score
3
June 28, 2008 6:17:01 PM

Something doesn't seem right here with flash drives actually decreasing battery life in laptops... it just doesn't pass the 'sniff test'. I was recently reading an article at PC World where they did these same tests, and they found that SSDs save quite a bit of battery life... if you are smart and disable the page/swap file, which I do even on systems with mechanical hard drives, because they seem to page stuff to the disk WAY too much if the pagefile is present.
Score
2
June 28, 2008 10:30:13 PM

Quote (warezme):
Meeooww!!!, SNARF! Interesting how what I assume to be early SSD adopters viciously defend their prized and over priced new toys..., LOL! Face it you guys you where spoofed. Current generation SSD's are over priced and not ready for prime time. You are bankrolling R&D. The results are very valid as most users who actually do "work" on their laptop use multiple programs at once and have various services and applications running simultaneously. They don't just turn their notebooks on and occasionally surf the web or check email and no its not realistic...



Couple of points here:

1) Without early adopters to invest in new technology, you wouldn't have your "economical" existing technology, so thank those who have pushed technology forward; don't ridicule them as being stupid or ignorant.

2) I think several posters were trying to say that battery life should be measured under an equal total workload. If an SSD can do twice the work at the cost of 30% of the battery life, does that mean it is worse than an HDD?
So they are suggesting that Tom's use an equal-workload test: maybe set up the systems to run a script at bootup that checks email, composes 10 emails, browses 100 web pages, watches 10 YouTube videos, composes a ten-thousand-character Word document, composes a three-thousand-character Excel document, runs a DirectX 3D test five times, sits idle for 10 minutes, increments a "number of times run" counter, and then powers down.
This way we can compare the number of cycles of simulated work completed per battery charge. This permits a comparison of work performed... after all, if I can get my work done in 2/3 of the time and shut off the computer, I'd be okay with a 25% loss in overall time powered on. EXAMPLE: My battery lasts four hours and one cycle takes an hour with an HDD, so I get four work cycles before recharging. With the SSD I get 3 hours, and each work cycle is 40 minutes, so I get 4.5 work cycles in before recharging. To me, getting more work done per battery charge at a higher speed of productivity sounds better: I save an hour each day and get 12.5% more work done!
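The arithmetic in that example can be sketched in a few lines. The numbers are the commenter's hypothetical ones, not measurements:

```python
def work_cycles(battery_minutes, cycle_minutes):
    """Simulated work cycles completed on a single battery charge."""
    return battery_minutes / cycle_minutes

# HDD: 4-hour battery, each work cycle takes 60 minutes.
hdd_cycles = work_cycles(4 * 60, 60)   # 4.0 cycles per charge

# SSD: 3-hour battery life, but each work cycle takes only 40 minutes.
ssd_cycles = work_cycles(3 * 60, 40)   # 4.5 cycles per charge

print(f"SSD completes {ssd_cycles / hdd_cycles - 1:.1%} more work per charge")
```

Running this prints `SSD completes 12.5% more work per charge`, matching the example: a shorter raw runtime can still mean more total work done.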
Score
1
June 29, 2008 2:30:09 AM

It's easy to look at the specs of the drives in isolation and criticize the results of these tests. But a hard drive is useless without the rest of the computer system, and it is the system with SSDs that used more power.

The SSDs' specs indicate that they use less power, so there must be something in the way the rest of the system uses the drives. Remember, the tests were initiated because of observed behavior under normal operating conditions.

My view is that the manufacturers need to do more work on the drivers and recommended OS settings. I expect that the default settings have been optimized over many years for spinning disks.

Score
-1
Anonymous
June 30, 2008 3:30:23 AM

I can't believe it! This result is completely beyond what I imagined.
I think either the benchmark software or the SSD tech is not yet mature.
Maybe by the time we are all using SSDs, this will be corrected?
Score
0
June 30, 2008 4:03:24 AM

This article should be renamed The SSD Power Consumption Hoax hoax.
Score
9
July 1, 2008 3:20:43 AM

I agree with the doubters; there really needs to be additional testing. It seems strange that the advertised stats are so far off.

Signed: PHYSICIAN THOMAS STEWART VON DRASHEK M.D.
Score
2
July 1, 2008 3:36:33 PM

To Patrick:
First of all, it's totally unacceptable to use this title and make such accusations about a "hoax" when all but one of your graphs address performance and only one measures battery life. If you want to prove a "hoax", you need to perform all the battery life tests you can, and then more.
Now to the really important stuff. Since I didn't know anything about your benchmark program, I couldn't be sure how it works, and I was unable to prove you wrong about your conclusion.
Going to http://www.bapco.com/techdocs.html I found the white paper for Mobilemark 2007. Under paragraph 2.5.2 Battery life rating methodology I read: "The benchmark generates battery life ratings as its principal metric. The battery life rating in MobileMark 2007 is measured in minutes. This metric reflects the number of minutes the system can remain operational while executing a chosen module. Each module will produce a different battery life rating, reflecting differences in system loading."
Everything is OK until now. Each module should accommodate different usage patterns. And it continues: "The battery life is established by recording the start time of the benchmark, then repeatedly performing the workload. When the remaining battery capacity has fallen to 7% the benchmark records a timestamp once per minute. Once the battery has been depleted and the computer plugged in and rebooted, the benchmark compares the “start” timestamp and last recorded (“end”) timestamp. The battery life rating is the number of minutes between these timestamps."
Did you notice the important detail? "REPEATEDLY performing the workload". Well, my friend, this is the reason your conclusion is completely off base. It is proven and accepted that SSD drives are faster (on average) than conventional hard disk drives; even your own graphs prove this. However, when you test battery life by repeating the same workload again and again, you force the system to perform more cycles of the workload when the SSD is used compared to the HDD, because most tasks in the workload wait for the hard disk to finish before moving to the next. So unless you tell us how many times the workload was repeated by each configuration, you can't compare the battery life times...
That is why the DVD playback test is so popular in battery life tests: it makes sure the computer performs the same amount of work per unit of time. So please explain to me why you didn't publish the results of the "DVD2007: Battery Life" and "Reader 2007: Battery Life" modules, which are part of MobileMark 2007, as I see in paragraph "3.0 MobileMark 2007 Scoring Methodology" of the white paper.
I suggest you change your title and publish more tests on the subject. Not only were your graphs wrong; your logic is flawed too.
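The repeated-workload objection can be made concrete by dividing each system's battery minutes by the number of workload passes it completed. The iteration counts below are hypothetical, since the article did not publish them:

```python
def battery_minutes_per_iteration(battery_minutes, iterations_completed):
    """Battery cost of one full pass through a repeated benchmark workload."""
    return battery_minutes / iterations_completed

# Hypothetical: the HDD system lasted 200 minutes and finished 10 passes;
# the faster SSD system lasted only 180 minutes but squeezed in 12 passes.
hdd_cost = battery_minutes_per_iteration(200, 10)  # 20.0 min per pass
ssd_cost = battery_minutes_per_iteration(180, 12)  # 15.0 min per pass

# By this metric the SSD is the more efficient drive, even though its raw
# battery-runtime number (180 vs. 200 minutes) looks worse.
print(hdd_cost, ssd_cost)
```

Without the per-configuration iteration counts, the raw runtime chart alone cannot distinguish these two situations.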
Score
6
July 1, 2008 5:10:16 PM

pschmid: Okay: we made a mistake when creating the charts; hence the Mobilemark performance results were off. [...]


Just the tip of the iceberg, my friend.
Score
3
July 1, 2008 9:23:59 PM

The whole article doesn't make sense to me and is, quite frankly, disappointing. The articles on Tom's Hardware are normally of the highest quality, but the use of words like "hoax" and completely unfounded speculation about the testing procedures of SSD manufacturers leaves a foul taste in my mouth. This isn't Fox News; you don't have to SHOCK US into paying attention. We're a bunch of computer geeks (well, I am... to speak for myself)... I love this stuff. Just give me the data and let me draw my own conclusions... or at least be a little more considerate of the audience you're writing for.

If an SSD is consuming a maximum of 1 W of power, then there is no possible way it can drain a fixed number of Wh from a battery faster than another drive that draws a minimum of 1.1 W. The benchmark results must be put in context, which, despite the lengthy article, the author failed to do. The SSDs are not, and cannot be, the immediate cause of the reduction in battery life, despite the correlation, and despite the attempt to change only one component in the tests. As stated above, it's likely that the increase in power consumption is due to software not being optimized for use with SSDs... with a very simple and obviously overlooked possibility being that the processor consumes more power in an SSD-equipped system because it has less idle time...
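A minimal sketch of that argument, with made-up platform wattages (only the 1.0 W and 1.1 W drive figures come from the discussion above): the drive alone cannot produce the reversal, but whole-system draw can.

```python
def runtime_hours(battery_wh, total_watts):
    """Hours a fixed-capacity battery lasts at a constant power draw."""
    return battery_wh / total_watts

BATTERY_WH = 50.0  # hypothetical battery capacity

# Drives in isolation: the 1.0 W SSD must outlast the 1.1 W HDD.
ssd_alone = runtime_hours(BATTERY_WH, 1.0)   # 50.0 h
hdd_alone = runtime_hours(BATTERY_WH, 1.1)   # ~45.5 h
assert ssd_alone > hdd_alone

# Whole system (hypothetical numbers): if the SSD leaves the CPU less
# idle time, the extra platform draw swamps the 0.1 W the drive saves.
ssd_system = runtime_hours(BATTERY_WH, 1.0 + 14.0)  # busier CPU
hdd_system = runtime_hours(BATTERY_WH, 1.1 + 12.0)  # CPU idles more
assert ssd_system < hdd_system  # the reversal seen in the benchmarks
```

The asserts encode the commenter's point: a measured system-level reversal implies the extra draw lives outside the drive.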

I'll be looking forward to the SSD manufacturers' responses and further testing results.
Score
2
July 1, 2008 11:02:10 PM

Change the battery.

Whenever you install an SSD you must replace the old battery with a new one at the same time.

Although battery makers will tell you they have all but eliminated the 'memory effect', batteries still 'draw power' in patterns. An HDD creates a certain pattern; if you just change over to an SSD, the pre-SSD power draw pattern is still replicated in a similar fashion within the battery.

A new battery with a new drive needs to 'bed in' the power draw pattern for that particular drive.
Score
0
July 2, 2008 12:10:30 AM

Why did you run this test using Windows Server 2003 Enterprise? Isn't that like testing a hybrid car's fuel efficiency by racing it at Indy?

Please re-run the battery life tests with a sane use case.

First, select an OS that matches your real-world use case. I recommend Windows Vista SP1 with maximum power-saving settings (and Aero disabled; if you care about battery life, that's what you already do). It would be very useful to also run this test on the latest Windows XP.

Second, pick a real-world use case for this battery life test. Perhaps watching a playlist of fansubbed anime (certain high-quality fansub sites distribute 170 MB H.264-encoded episodes that run approximately 24 minutes). Then for the next use case, set it on 5-6 auto-reloading content-controlled test websites.

Two runs times two OSes times six drives. You've got a lot of work ahead of you, but then perhaps you'll finally be able to give us some real results and not these contrived and meaningless statistics.
Score
2