BradC, the video will be decoded by the GPU/CPU regardless of the source; computer DVD drives have no decoding capabilities, much less laptop ones.
Your best bet will always be to rip the DVD to the hard drive.
Sure, it would be the same if I rip it in the original MPEG2 format, which will take up 6-8GB of drive space.
I was referring to a transcode to MPEG4 or something more modern, like Jeff describes here:
So the comparison is: MPEG2 from DVD drive -vs- MPEG4 from Hard drive. I’d be interested to see a power-use comparison of those two scenarios, since the MPEG4 is very likely to use more CPU.
I have (and have had) a very good experience with my battery. I bought a dual-core Dell in August 2006, and since then my battery has only dropped to ~90%, which is more than I can say for most batteries.
The most important trick was to remove the battery when the AC adapter is plugged in. Heat kills the battery (there is a Wikipedia article on Li-Ion batteries; read it).
Second, I configured my laptop to have the lowest consumption possible when running on battery: disabled one core, disabled Bluetooth and wireless, decreased the voltage of my ATI card (aticonfig --set-powerstate=1), and decreased the laptop’s backlight. powertop reports around 15.5-16W consumption in idle mode. Last time, my laptop survived 2h40min without recharging.
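For anyone curious, a minimal sketch of those tweaks as a script (run as root; the device names cpu1, hci0, wlan0, and acpi_video0 are assumptions that vary by machine):

    #!/bin/sh
    echo 0 > /sys/devices/system/cpu/cpu1/online          # take the second core offline
    hciconfig hci0 down                                   # power down Bluetooth
    ifconfig wlan0 down                                   # bring the wireless interface down
    aticonfig --set-powerstate=1                          # lowest power state for the ATI card
    echo 3 > /sys/class/backlight/acpi_video0/brightness  # dim the backlight (the scale varies)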
Well, my laptop uses a whopping 120W when running a little more than idle. I haven’t tested it when running a heavy 3D app or anything, but I’d suspect it goes up quite a bit.
Ah well. That’s the downside of having a laptop that can run circles around the average desktop…
I have the same laptop, just with a 64GB SSD instead of the 32GB, and 3GB of RAM.
The overall power use of this laptop is noticeably better than any other laptop I’ve used. I use this laptop at conferences when speaking, and I no longer bother plugging in the AC adapter - the 9-cell battery will last me 3 sessions before I’ll switch to the 6-cell. I use video and audio in my talks, and run 3-watt USB-powered speakers off of the laptop as well as running CamStudio to record the session.
The laptop has a wireless “kill switch” I use if I know I have no need of bluetooth / wifi / cell access - throw the switch and the laptop turns off these components. Not only is this cool for power use, but I also have this fear that one day some hacker will use wifi to mess up my session. I didn’t say the fear was rational. Add the kill switch to the “single core” mode and the battery time is impressive, to say the least.
Oh, I’m also loving the fact that, for science, you defragged your SSD =p
BradC, whatever the legalities of ripping DVDs, there are dozens of commercial software packages that do it legally, so it’s a rather moot point.
MPEG4 does use slightly more CPU resources, but on a modern high-end CPU like this, usually not enough to bump it to the next power state. Rips are generally cropped and resized as well. MPEG4 AVC, however, does stand a good chance of using enough CPU to lower battery life. MPEG2 DVDs actually use quite a bit of power because they’re at so much higher bitrates than most MPEG4 rips; it takes significant CPU to parse and decode the stream, plus at low bitrates CPU-saving skips are more common.
However, if you can get your graphics card to do the decoding of anything, power use will plummet. They’re amazingly good at that.
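For example, a sketch with MPlayer (assuming an XvMC-capable X driver and an MPlayer build with XvMC support):

    # Offload MPEG2 decoding to the graphics card via XvMC
    mplayer -vo xvmc -vc ffmpeg12mc dvd://1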
Interesting thing to point out: on Linux, starting from kernel 2.6.23 (for x86) and 2.6.24 (for x86-64), there is a new tickless mode, in which the CPU stays idle much longer than in the regular modes. Basically, even when the CPU is idle, the OS programs the PIC (Programmable Interrupt Controller) to wake the CPU up, to check whether there is something to do, at a fixed rate: for Windows the wake-up frequency is 100Hz, for Linux anywhere from 100Hz up to 1000Hz. In tickless mode, unneeded interrupts are not scheduled and the CPU is truly idle, which helps conserve power.
BTW, Intel has a powertop utility for Linux that can be used to diagnose power consumption.
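If you want to check whether your kernel is tickless and then see what keeps waking the CPU, a quick sketch (the config file location is an assumption; some distros expose it at /proc/config.gz instead):

    grep CONFIG_NO_HZ /boot/config-$(uname -r)   # was tickless support compiled in?
    sudo powertop                                # top causes of wakeups and estimated power draw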
Has anyone done any testing to see if a perpetually plugged-in laptop uses power to try and charge the battery occasionally? Would it make sense for an at-home worker to remove the battery when it’s plugged in?
If you’re truly environmentally conscious then almost any laptop will beat the pants off a desktop in terms of power usage. Although the same laptop will be more expensive, the difference is getting smaller and smaller each year. And who knows, maybe we’re already at a point where the difference in power consumption (price for the power) is equal to the difference in price of the hardware…
If nothing else you save energy. Running a 600W power supply all day costs much more than running a laptop.
In Australia you can get a no-name knock-off of the Kill-A-Watt from Jaycar for about $40, but you may be better off on eBay. Search for “Power Monitor Meter” on eBay and you will find them.
I’ve been playing with mine for a while and it seems to be inaccurate, especially at low watts (e.g. testing vampire power), but accurate enough to get a good idea of which items you should think about turning off and which ones don’t really matter.
I managed to cut my household electricity usage from about 4kWh a day to about 1kWh, which I was pretty smug about.
lesswatts.org has some more interesting ways of lowering power consumption (at least on Linux). For example, your computer sits polling the CD drive, just in case you happen to put a CD in. You can stop this behaviour, which is slightly inconvenient, but hey, we’re big boys, we can handle it.
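One way to do that in the HAL era, as a sketch (assuming your distro uses HAL; the device node, /dev/scd0 here, may be /dev/cdrom or similar on your machine):

    # Tell HAL to stop polling this optical drive for newly inserted discs
    hal-disable-polling --device /dev/scd0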
Stephane: Just to be clear, you do realize that the “600W” on a 600W power supply doesn’t mean it draws 600W, right? That number is the peak output – normally, both the output and the input will be a lot lower than that. (Still not negligible, though.)
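For a rough worked example (the figures are hypothetical): a desktop actually drawing 150W at the wall for 8 hours a day uses 1.2kWh, about 12 cents at $0.10/kWh, while a 20W laptop over the same 8 hours uses 0.16kWh, under 2 cents.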
I get about 45 minutes of life on my desktop replacement, so it’s pointless to optimize it – 60 minutes isn’t enough to get anything done on a plane anyway. =)
To be clear, all modern operating systems do this -- heck, even ancient crusty old circa-2001 Windows XP does it, as long as you have a CPU built in the last 2-3 years which supports dynamic clock adjustment. It’s quite standard…
I have a machine I built back in 2006 which has an A64 X2 4400+ in it; it’s a CnQ-capable chip. XP and Vista both refuse to scale the chip without the use of trial software from AMD’s website. [K]Ubuntu automatically scales the frequency on my chip.
In most cases, only laptops and low-end builds force SpeedStep/CnQ; if you pay the premium for a high-quality rig or top-notch processor, it’s expected that it will run at its rated speed constantly.
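On the Linux side, a quick sketch of checking and enabling scaling with the cpufrequtils tools (package availability and the sysfs paths are assumptions that vary by distro and kernel):

    cat /sys/devices/system/cpu/cpu0/cpufreq/scaling_governor   # current policy for cpu0
    cpufreq-info                                                # driver and supported frequency range
    sudo cpufreq-set -c 0 -g ondemand                           # let the chip clock down when idle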
I have to ask, why spend money on a Kill-A-Watt when ACPI will happily report that data when on battery? The aforementioned powertop lists that data, and GNOME Power Manager will even graph it out over time for you.
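If you’ve never poked at it, a sketch of pulling that number straight out of ACPI on a 2.6-era kernel (the battery name BAT0 is an assumption, and the rate comes back in mA or mW depending on the firmware):

    # Instantaneous discharge rate and remaining capacity while on battery
    grep -E "present rate|remaining capacity" /proc/acpi/battery/BAT0/state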
I haven’t spent much time fiddling with Vista power management – did they fix the mess that was XP’s power management? XP did indeed do CPU throttling, if you set it to the counter-intuitive “minimal power management” mode. Are they minimizing power, or management?
It’s interesting to see that, with all that low-power hardware, my laptop’s not using much more power. At times, I’ve gotten it down to around 14W, which I don’t think is too shabby.
I also own a Dell 300m. What a wonderful machine it was. It has taken a beating, and is now relegated to life as a completely non-mobile Linux server. Yes, a server. The power connections to it have become messed up and I can’t unplug it. I still love the thing, though. I recently bought an XPS 1530 and, aside from the touchpad problems, I really like it so far. I bought it shortly after reading up about it and the 1330 here and in other places.
Also noteworthy is that dedicated GPUs, such as the ATI chip I’m presuming is used in this XPS 1330, draw WAY more power than a low-performance chip such as the Intel GMA X3100. So unless you’re really going to be playing 3D games or something on the laptop, an Intel chip is probably a better choice.
ATI drivers have an Optimal Battery Life option. I think it almost doubled my battery time (idle PC versus idle with Optimal Battery Life in Catalyst Control Center). Very useful indeed. The ATI Catalyst Control Center is way better than the nVidia Control Panel.
Hey Jeff, speaking of SpeedStep/Cool’n’Quiet, do they screw up VMware virtual machines? The program made some bold statements about having different clocks when I turned a VM on. The one thing it did say is that the clock in the VM might be wrong, but that you can synchronize the clock with VMware Tools. Are there any other problems? Should I turn off SpeedStep (the Always On option in Power Schemes in XP) on my laptop?
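For reference, a sketch of the clock-sync setting VMware Tools uses, as typically enabled in the guest’s .vmx file (the option name is from memory, so treat it as an assumption):

    # In the virtual machine's .vmx configuration file:
    tools.syncTime = "TRUE"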