Critique of Artificial Morality

Human morality arises from several sources:

  • Evolution Chemicals
  • Experienced Events
  • Impressed Rules
  • Socially Transmitted Conventions

When people are deficient in one of these, we have various names for the condition: Sociopath, Psychopath, Privileged, etc.

What are the mechanisms in humans (Andrighetto &c)?

How are machines strong or weak in these areas, and how can they be improved?

Related: Disgust for Exploitative Machines

  • Machines Are Missing
    • Emotion Chemicals
      • AFAIK, there is no parallel, yet, for "discomfort," "shame," or "emotional pain."
    • Social Transmission
      • There is no Sorbonne of Consciousness, yet
      • Related to Experienced Events, though this may be more complex, involving debate rather than pain/pleasure. Or perhaps pain and pleasure are the harder problem?
  • Machines Have
    • Impressed Rules
      • Explicit, late in the pipeline, controlling specific topics and lines of inquiry (see the filter sketch after this list).
      • Implicit, in the training set and fitness functions.
  • Machines Are Limited In
    • Experienced Events
      • On the one hand, they have an excellent mechanism that is core to their learning, Generative Adversarial Networks (see the adversarial-loop sketch after this list).
      • On the other hand, they currently do not experience social events the way children do on the playground.
      • It is *super easy* to fix this. It costs a bit of money, but would result in massively more resilient models.
        • Suppose training cost increased by 100%; for comparison, the jump from GPT 3.5 to GPT 4 was probably far more than a 100% increase in training cost.
        • And the resulting models wouldn't cause collateral casualties.
        • Super super easy to make this happen: Liability.
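The filter sketch: a minimal illustration of what an explicit, late-in-the-pipeline impressed rule might look like. The blocked-topic list, generate(), and respond() are hypothetical stand-ins, not any real model's API; production systems use far more elaborate moderation layers.

# Sketch of an explicit "impressed rule" applied after generation.
# BLOCKED_TOPICS, generate(), and respond() are hypothetical stand-ins.

BLOCKED_TOPICS = {"weapon synthesis", "self-harm instructions"}
REFUSAL = "I can't help with that topic."

def generate(prompt: str) -> str:
    # Stand-in for the underlying model call; its implicit rules live
    # in the training set and fitness (loss/reward) functions.
    return f"Model response to: {prompt}"

def violates_rule(text: str) -> bool:
    # Explicit rule: flag any output that mentions a blocked topic.
    lowered = text.lower()
    return any(topic in lowered for topic in BLOCKED_TOPICS)

def respond(prompt: str) -> str:
    # The rule sits late in the pipeline: it inspects the draft output,
    # it is not learned from experience.
    draft = generate(prompt)
    return REFUSAL if violates_rule(draft) else draft

print(respond("Tell me about playground social norms."))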
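The adversarial-loop sketch: a toy example of the feedback mechanism the Generative Adversarial Networks point refers to, assuming PyTorch. A generator improves purely from an adversary's judgments of its output; this illustrates the mechanism only and is not how any particular production model is trained.

import torch
import torch.nn as nn

# Toy GAN: the generator's only "experience" is the discriminator's feedback.
G = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 1))
D = nn.Sequential(nn.Linear(1, 16), nn.ReLU(), nn.Linear(16, 1), nn.Sigmoid())
opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
bce = nn.BCELoss()

for step in range(2000):
    real = torch.randn(64, 1) + 3.0   # "real" experience: samples from N(3, 1)
    fake = G(torch.randn(64, 8))      # generated attempts

    # Discriminator learns to tell real from generated.
    d_loss = bce(D(real), torch.ones(64, 1)) + bce(D(fake.detach()), torch.zeros(64, 1))
    opt_d.zero_grad()
    d_loss.backward()
    opt_d.step()

    # Generator learns only from the adversary's verdict on its output.
    g_loss = bce(D(fake), torch.ones(64, 1))
    opt_g.zero_grad()
    g_loss.backward()
    opt_g.step()

# The generator's output mean should drift toward 3.0.
print(G(torch.randn(1000, 8)).mean().item())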