Quote:
Originally Posted by auroraglacialis
I am only in a certain respect, in that I say that there needs to be deep cultural change. Jevons Paradox is IMO not an inevitability; it is only so in the context of a culture that rests on the paradigms of progress and growth as the basis for its existence. People in this culture prefer growth and progress over stability.
|
This culture relies on growth; it's not a matter of choice or preference, because it's built that way. Isn't it funny when two people who essentially agree just nitpick on things like this?
Quote:
|
Stability is even sort of a bad word now, because it has the connotation of something static and boring. But in a culture that values growth above all, it is inevitable that increases in efficiency end up powering more growth and progress instead of reducing consumption. In a different culture that has at its root things like preservation, sustainability and caretaking, an increase in efficiency might actually lead to a true reduction in consumption while just maintaining what is, or even allowing some slow growth.
|
The system is unfortunately built that way, yet there are still certain limits to consumption on an individual basis. Though that's offset by the fact that there are simply so many individuals, and they all consume resources separately instead of sharing them efficiently. Compare a household of 2 people with a household of 10: the household of 10 doesn't consume five times as much as the one with only 2.
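To put rough numbers on that sharing effect, here's a toy model where a household's consumption is a shared base plus a per-person share. The base/per-person split is purely illustrative, not real data:

```python
# Toy model: household consumption = shared base + per-person share.
# The base=60 / per_person=40 values are made up for illustration only.

def household_consumption(people, base=60.0, per_person=40.0):
    """Return notional consumption units for a household of `people`."""
    return base + per_person * people

small = household_consumption(2)   # 60 + 80  = 140
large = household_consumption(10)  # 60 + 400 = 460

print(large / small)  # roughly 3.3x the consumption for 5x the people
```

The bigger the shared base (heating, fridge, lighting of common rooms), the further below 5x the ratio falls, which is the point about sharing resources efficiently.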
Quote:
|
Of course. That is why I said that this also has some value. Nowadays, with the same energy, we can have bright lights during dark nights where our ancestors had a candle. But the fact is that overall we use a lot more energy than in the past, despite (or in part because of) increasing energy efficiency.
|
Somehow I lost my train of thought there completely, so I'll just go Derpy on this one and say: Money
Quote:
Yes, but before, the computer system did not use any power to create fancy lights. OK, maybe the power LED.
|
One would have to build a computer that puts the discos of the '70s to shame to get near comparable figures, but I guess you are talking about the principle.
Quote:
|
I did not quite understand the first part, but I think what happens a lot is that people will rather buy a computer that uses more power in total than one that consumes less but only has the capabilities of one that existed 10 years ago.
|
The main problem is that while computers have become better, the interface has remained the same for ages now. We still input commands via keyboard and move the mouse around, and if one is futuristic enough, you might even have a touch screen display on your desktop. The majority of computer users, the usual internet, youtube, e-mail etc. crowd, would be just fine with one of
these. (I would go with the E-350, and I thought about building an eco machine from that for my parents, but then that idea kind of got lost somewhere along the way.)
Most people just don't understand anything about the stuff they buy, computers included, so they trust everything the salesmen tell them. And if the salesmen happen to have good relations with the local energy company, chances are they are not going to recommend the most efficient models.
I don't think people would buy something just because it uses more energy, even if they could do as well with a less powerful model; they just buy something that works. And as it happens, like I said earlier, the majority of users would be completely happy with a computer that has the computational capabilities of a 10-year-old model but uses far less energy. That's not where the mass markets are, though, so you know how it goes...
Quote:
|
This is because one wants to play 3D high resolution games, or do crazy 3D modelling, or edit large photos. Mostly the first one, though, I guess. The extreme alternative would be to reduce consumption drastically. One can certainly now play Pac-Man on a microprocessor that uses just the energy from a small solar cell instead of needing a C64 computer.
|
Remind me to give you a personal presentation about this subject, because when I start to think about it, I could write a page or two if I got my creativity properly going.
Quote:
|
Lately I see some hope in that people are starting to place more value on energy consumption because they want mobile devices and battery capacity is limited. If battery capacity goes up again, I am sure energy consumption by these devices will go up too, as a tradeoff for vastly increased computational capacity.
|
I'm fascinated by energy efficiency even on my desktop, and I have a power meter connected to the wall socket so I can see how much power the system draws at any given time... 166 W right now, but that is because I have folding@home running on all four cores.
I'm tempted to upgrade to Ivy Bridge when it comes out next year, because it's even more energy efficient than my current model, and it's compatible with my current motherboard. I could even downgrade a bit on the performance side if the promised TDP values are anything to go by. Can't say for sure until I see final performance benchmarks, but so far it's looking really good, if only Intel keeps their prices reasonable...
Quote:
|
Yes sure. And I think that this "system" is based on a culture of growth and progress at all costs which is killing the planet. Capitalism, Jevons Paradox and all the lot are things that come from that foundation of the cultural narrative that describes members of industrial human nations as infinitely innovative and progress as unidirectional and ever increasing. The money system with interest is based on growth as is the whole financial industry but also technology is drawn into that spiral.
|
The problem is that technology is harnessed to serve the money system, so technology has also become corrupt because of this.
Quote:
|
People cherish "Moore's Law", which predicts exponential growth of computational power, because growth is always good; just the costs have to be reduced. But in so many cases, over and over again, the costs have decreased less strongly than that sector has grown. I just read that while computational power doubles every 18 months, it takes 20 months to halve the energy consumption per operation. This means that if one makes full use of the new computational capabilities that Moore's Law provides, overall total energy consumption has to rise. This is not an inevitability for individuals, but for society as a whole I think it is true that energy consumption by computing devices has risen strongly over the past years, in part because they have gotten more efficient and thus much more ubiquitous.
|
You have piqued my interest, and now I'll have to look into that later on, but for now I'm too tired to go chasing information. Suffice to say that most people don't need the raw computational power that is available these days, but the mass markets dictate what is sold and what is not, and so on...
Quote:
Originally Posted by Moco Loco
I assure you, if I didn't need a computer that happens to use more power, I'd be pretty much ecstatic  Unfortunately, I need to run Avid, Ableton, and a ****ton of Adobe things for school, and probably Avid even after I get out (one possible career path).
|
Video editing is one of those things that is an endless computational capacity sink, and it does require pretty good hardware if you are not patient enough to wait a whole day for a single task to complete.
Then again, hardware video encoding and the like are pretty neat, although the quality is not as good as with software encoding. But this is something I don't know nearly enough about to really talk about.
Quote:
Originally Posted by Clarke
But we must remember that technology is not constrained by Moore's Law. The only thing it is constrained by is the laws of physics, and it is AFAIK both Intel's and AMD's belief that transistors are old hat.
And they're right. How does an energy saving of 100,000,000% sound? 
|
I think Intel, for one, won't be too pleased about the transition away from transistor-based chips, because they have so much invested in their fabs that it's not even funny. OK, I lied, it is funny... At least the pun is...
I'm way too tired...