1. How much should we rely on reason and how much on emotion when making decisions about our personal lives? Setting law? Other decisions?
2. Are there circumstances where purely reason should be used to make decisions? Only emotion?
I'll try to answer these two together, since they're very intertwined in my head. All below is imho, of course.
I view reason and emotion as two aspects that work together, but also must serve each other.
In any decision, i believe there should be -- at least at the end -- a rational process based on reason. However, emotions can, and very often should, serve as some of the inputs to that process. (The post i wrote -- and still must follow up on -- about "irrationality as the axiom" is about that.) I do not believe it's wise to react in emotion, but very much believe in reasonably acting upon it, so to speak.
However, just as emotions can inform our rational process, so reason should also be used to check our emotions. Not only must we use reason to guide us in any dealings with others (since they may not share the emotional inputs behind our conclusions), but it's also very advisable for us to spend time analyzing our emotions to help encourage healthy personal growth.
3. How much of our laws are based on morality? How much should be?
The first half of this question i cannot speak to. I don't really know much about the formal definitions of morality, nor the history/basis of law. If forced to guess, i'd probably assume that most of law is rooted in the "social contract" idea, and the rest wobbles around that mean at the whim of political climate and power structures' inclinations.
How much should be, however, is something upon which i can at least venture an opinion.
All law should be based on something "moral", in the sense that it should be based on placing value on something. Those values are probably things akin to "happiness", "liberty" or "stability", and whether or not they're truly moral values or just values based on assumptions of human existence, the end goal of law is nonetheless to try to establish and guarantee them.
The key, then, is to try to decide what, exactly, we wish to establish and guarantee. My personal vote is the idea of "free will": that one should be able to do as they wish, provided they harm or impede no other.
The catch, of course, is that there is nothing any of us can do that does not affect everyone else. My breathing takes oxygen away from the entire planet. My driving a car uses up fuel and pollutes the planet. My need for morning coffee drives a market with large economic, political and environmental implications; a person who wishes to use heroin drives an even more impactful one. Et cetera.
Even if the entire world could agree on a basic premise of trying to establish laws that provide "free will" to all (which is sadly a far cry from where we are now), it would then need to set itself to the task of deciding on the optimal point in all of the trade-offs involved. Breathing's probably going to be OK... but how much fuel consumption and pollution is too much? Does a society in which addictive substances are legal automatically endanger someone somewhere somehow?
It's not an easy problem, nor does it have a solution that will be static. I believe that all we can attempt to do, once we establish the "moral" basis we're shooting for, is build a system of laws that provide as much flexibility as possible and enable efficient review/revision as we learn more, but also protect the most basic of rights.