Quote:
Originally Posted by Advent
And then what? We'll eventually be demonstrating about the ethics of that. Humans are generally senseless in the grand scheme of things. If you don't want animals slaughtered, don't eat them.
|
See later when I'm talking to aurora. In the direction I'm talking about, there won't be any demonstrations about ethics, because there won't be any ethical questions.
Quote:
Originally Posted by auroraglacialis
our relationship to nonhumans is no longer a balanced one. We don't even feed the soil with our manure, the minimal service predators do to their environment. And not only do we take much more than we need, we dont even give what we do not need to those who could use it. And keeping cows in feedlots, fur animals or chicken in small cages and all the other stuff is IMO a sign of a relationship gone horribly wrong. How do we relate to each other and to the nonhuman world, that is the issue for me, not that there should be no death in the world.
|
This is, IMO, only a question of insufficiently advanced technology.

Modern biotechnology has produced the concept, and soon the prototypes, of "synthetic meat": meat grown directly in vats, no animal required. If this were successful, it would remove almost all ethical questions about meat-eating, since, literally, no animals were harmed in the making of this steak.
More generally, a "relationship with nonhumans" is only necessary to the extent that we take things from nonhumans. From an engineering perspective, that is simply inefficient, and every attempt should be made to eliminate the inefficiency; hence synthetic meat. The ideal is not to establish a relationship with nature, where we pay back for what we take; it is to take nothing from nature at all. (Or at least nothing from the delicate parts of nature: fusing seawater for fuel, for instance, rather than modifying the atmosphere in large amounts.)
Quote:
Does intention require self-awareness?
What about a bug crawling up a leaf - does it not have an intention?
|
I don't think so; to intend to do something requires a concept of "I" to do the intending. IMO, the bug is just a computer running unknown software on neural nets rather than silicon, and is merely using the same sort of problem-solving techniques our own computers use: it has a goal state (FOOD!) and a current state, and has worked out what steps are needed to turn the current state into the goal state. It can then mindlessly execute the steps. We, in contrast, recognise "I" as a human in the world, and when dealing with humans, we can extrapolate goals from behaviour. (Except in the case of "I", when we don't have to extrapolate at all; we just consult our own "things to be done" list.)
(I'm finding this difficult to explain, so that was probably not really helpful, sorry.)
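To make the "goal state vs. current state" idea concrete: this is how a classical search-based planner works. A minimal sketch in Python (my own illustration of the general technique, not a model of any actual insect; the toy "leaf" world and all names are invented for the example):

```python
from collections import deque

def plan(start, goal, neighbors):
    """Breadth-first search from start to goal.

    Returns the sequence of states from start to goal, or None if
    the goal is unreachable. `neighbors` maps a state to the states
    reachable from it in one step. Once the plan is found, an agent
    can "mindlessly execute" the steps, with no concept of "I" needed.
    """
    frontier = deque([start])
    came_from = {start: None}  # each visited state -> its predecessor
    while frontier:
        state = frontier.popleft()
        if state == goal:
            # Walk back through predecessors to recover the step list.
            steps = []
            while state is not None:
                steps.append(state)
                state = came_from[state]
            return steps[::-1]
        for nxt in neighbors(state):
            if nxt not in came_from:
                came_from[nxt] = state
                frontier.append(nxt)
    return None

# Toy world: positions 0..5 along a leaf stem; the bug is at 0, food at 5.
path = plan(0, 5, lambda s: [s + 1] if s < 5 else [])
print(path)  # [0, 1, 2, 3, 4, 5]
```

The point of the illustration is only that "work out the steps, then execute them" requires no self-awareness; it is plain mechanical search.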
Quote:
Even if you give Gaia only that much similarity to a living being, it means that climate, ecosystems and species, individual animals and plants, water flows and nutrient cycles are all part of her and that she can also react on them, just like a hamster can react to you pinching him in the belly even if he has not the level or form of intelligence a scientist would demand to call something a "higher being".
|
I think a more apt metaphor is that the hamster will "react" to being fed, or for that matter, to being poisoned. The entity only superficially reacts as a whole; the behaviour we ascribe as a single reaction is actually many interconnected parts behaving in very different ways. (In the poisoning example, some proteins may collapse, some may malfunction, and others may continue working.)