
Sunday, May 29, 2016

When a robot takes your life

I think one of the underappreciated potential revolutions is the ability of "fast, cheap, and out of control" weapons to destroy any semblance of psychological and social stability.

If you think about it, we are extremely lucky that so far people have needed to do quite a bit of planning in order to kill large numbers of people, and the easiest way to do that, with machine guns and handguns, seems to cap out at something like 50 murders per killer.

Our great mental limitation as humans is that we insist on seeing meaning in everything. That means when something bad isn't constrained by social and storytelling principles like just deserts and hubris, we underestimate its danger. We sort of assume the universe wouldn't allow disturbed or motivated individuals to kill thousands of people on a whim and then do it again the next day, and the next.

Weapons are coming that require no training, no ammunition, no forensic residue, and no complex sourcing--they just exploit bootstrapping patterns of self-replication.

What exactly am I imagining is coming? Well, I'm speculating about the convergence of several trends.

The model for the threat from replication is cancer. Can you produce a cheap microbot that runs on energy it collects from the sun and the air? Can you make a robot that builds them? Can you tell the difference between them? Can you equip a fleet of cheap drones with cheap mini-guns? Can you project a razor-thin, nearly one-dimensional beam of focused radiation with the right power source?

The engineering of all these things is tricky, but not the physics. And while we human animals are pretty physically robust to a small chance of major trauma, we're very physically fragile to a high chance of micro damage; and that damage need involve very little matter and energy.

You might be able to kill all large animals on earth with a few thousand watts and a few pounds of matter. You might end up doing that by accident, just because your efficient goal-focused AI organizes a replicating pattern of distributed intelligence and robotics that never stops replicating.

The specific scenarios aren't yet likely enough that we can describe them in detail, at least not casual observers like me. But there are no clear barriers to this threat; and if the threat is real, there's really no way to stop it from happening.
