brooklyn nine nine fic where gina’s been a vampire since her early twenties and jake doesn’t know that this is news to anyone else, it’s not like she keeps it a big secret, she’s always drinking blood from kool-aid boxes at her desk and all surrounded by detectives, why is amy freaking out about this it’s not like gina’s suddenly any more inscrutably malevolent today

I rewrote this comic because it originally said “talking to party strangers on the regular” and I gotta admit, the phrase “party strangers” is SUPER-DUPER enticing

If you like Dinosaur Comics, you can become a patron and get RAD PRIZES

talking to people: the absolute worst thing

(Source: prismcess)


This account gives me life


India teen tells US how to save $400 million by changing font (via The Hindu)

A 14-year-old Indian-origin boy has come up with a unique plan that could help the U.S. save nearly $400 million a year by merely changing the font used on official documents.

Alas, the facts don’t bear this out.




It’s GaMzEe




[Previous posts snipped for space reasons.]

Ahhhhhhh. I see. :|a

The average human neuron is between 4 and 100 micrometers in length. That means that in a single-cell layer about the size of a dime, there are between 100 and 2,500 neurons. The human brain contains about 100,000,000,000 neurons. As far as “creating an artificial neuron” goes… it seems to be a complete waste of time? Set aside for a moment the matter of the hard limit of economies of scale. Even if you manage to create an artificial neuron that can do the work of 1,000 organic ones, that’s still a hundred million replacements to make.

Additionally, each neuron in the brain releases only one type of neurotransmitter into the synaptic cleft, which makes it less and less feasible to replace more than a few neurons with a single artificial one - it would disrupt the communication chain of serotonin neuron to dopamine neuron to GABA neuron if a large patch were suddenly replaced by a neuron that only “spoke” serotonin. (The artificial neuron would have to be implanted with either a self-sustaining bacterial culture that produced serotonin from the surrounding environment, or a store of serotonin large enough to last a very long time, after which it would need to be replaced.)

The human brain can translate chemical signals into the ones and zeroes of an electrical impulse; but that action takes place inside the neuron, and has nothing to do with communication between neurons.
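The replacement arithmetic above can be written out directly. The 1,000-to-1 consolidation ratio is the paragraph’s own hypothetical, not a measured figure:

```python
# Back-of-envelope arithmetic for the neuron-replacement claim above.
# Both figures are the paragraph's own estimates, not measurements.

NEURONS_IN_BRAIN = 100_000_000_000  # ~10^11 neurons
CONSOLIDATION_RATIO = 1_000         # hypothetical: 1 artificial neuron does the work of 1,000

replacements_needed = NEURONS_IN_BRAIN // CONSOLIDATION_RATIO
print(replacements_needed)  # 100000000 -- a hundred million implants
```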

It would be far more sensible and much more cost-efficient to culture types of neurons from stem cells, and use those to bolster the brain. But a lack of neurons is not the problem which leads to death. It is either the failure of the body to support the brain, or a neurodegenerative disease such as the one which attacks the fatty myelin “wire casing” around the axon and causes the neuron to fail to create or accurately transmit an electrical signal along the axon, aka multiple sclerosis; and other such problems. The neurons are almost always fine. It is the failure of the supporting structures which leads to health issues.

Also:  ????????? The cost of creating and running a care system for clone bodies versus having a doctor or nurse do it is on an extraordinary scale - in the millions, if not billions or trillions. The matter of brain transfer is likewise difficult to an insane degree. Your “spinal cord” is not a cord which attaches to the brain like a plug and a socket - that “cord” is a bundle of neuron axons, strings attached to single neurons within the brain. Your “nervous system” - the nerves that tell you that a part of your body is saturating its surrounding tissues with the type of chemical that means “pain” - are the long, stringy ends of neurons inside your head. The level of neurosurgical expertise you would need to keep your brain intact in the transfer process… is bewildering to even imagine.

The nature of human consciousness itself - to give you the best guess of some of my friends in AI research - is that the pattern-noticing portions of your brain notice their own activity. (A gross oversimplification, but.) To create an “ego” or “consciousness” is a process that takes years of pattern accumulation and comparison. There’s a reason that “being able to tell the difference between the world and your own body” is a specific developmental stage for babies.

The evolution of a consciousness is one in which your brain picks up every pattern it can find while you’re awake, then prunes out and destroys those connections while you experience REM sleep, deciding on some basis which links between neurons should be maintained and which may be snipped. “You” are a pattern-finding mechanism that has learned to associate stimulus A with result B. How could a system of replacement neurons replicate that process of self-destruction and continuous re-shaping without being about as useful or long-lasting as an actual neuron? A still image could be captured, sure, but from that point onward your brain and the stored version would begin to diverge. It would not take long at all for “you” and your copy to become wholly separate “personalities”.

Basically - “consciousness” is not a transferable quantity, it is an ongoing and continuous process of accumulation of data and destruction of data. It is the process which we hope to be able to replicate in an AI. To simply “copy” or record for a period of time what an organic human brain was doing would involve a lot of MRI time, a lot of radioactive isotopes (because that is how brain activity can be measured - by radioactively stained blood flow to certain regions after they have become active - OR by measuring electrical pulses from the outside in an electroencephalogram, which gives a precise electrical readout but no information about where exactly it happened), and a lot of money, and would serve only as an… example? A list of functions? There is no reason to assume that a computer or artificial intelligence, given that recording, would perpetuate the functions in the same way a human brain would - or perpetuate them at all.

Wanting to go on doing things forever may seem appealing to you! To me, the concept of eternal life is like someone walked up to me and said: “For several trillion dollars of research grants, how would you like to spend an eternity inside an airport terminal from which you can never leave?” That may seem harsh, but for me that is what being alive inside this particular universe is like. This is my airport terminal. Eighty more years here is quite long enough.

The “neurons of Theseus” approach was meant as a thought experiment about the continuity problem, not an actual proposal. (There’s no point in haggling over the price of a martini in the Tranquility Base Hilton hotel bar when that hotel doesn’t even exist!) I agree we’re only just beginning to understand the problems to be solved in understanding consciousness.

Philosophically, I believe that there is no such thing as a “soul” which is “really me”; that which I identify as myself is something done by the brain as it does what it does. Therefore, since the brain operates according to known physical laws, there’s no reason it should be impossible to copy a “running” consciousness from one computing substrate (a sack of meat) to another computing substrate (a manufactured computer). Possible doesn’t mean easy and I’m not confident it will happen in my lifetime. But I think it might.

As you say, “it would not take long at all for ‘you’ and your copy to become wholly separate ‘personalities’.” But this is a feature, not a bug! Suppose you want to join the expedition to Alpha Centauri, but you also want to stay behind on Earth and master surfing. Just fork and do both! To make a Homestuck analogy, sure, life is a little different for Davesprite now that he’s fused with a game construct, and yeah, Davesprite has had different experiences than the other Dave whom he came back in time to help, but he’s still a “real Dave” just as much as meteor-Dave is.

I’m very sorry if existence is really that hellish for you. Personally, I approach things from the other way around; suppose someone walked up to you and offered to painlessly euthanize you tomorrow. Perhaps you would take the offer, but I wouldn’t! I have things I want to do tomorrow! And I imagine that for any day N, if offered the chance to die on day N+1, I wouldn’t take it. Perhaps after 80 or 800 years I would get really bored and take the offer, but I don’t see why that gives any reason to not try and make it possible to live that long.



(Source: tldrwikipedia)


inklesspen replied to your post: (transhumanists are the “global elites should turn…

Eh, I’m an egalitarian sort of transhumanist; anyone who wants to should be able to turn themselves into an immortal computer/person. You may be okay with death, but I’m not.

That’s fair. I don’t think it’s an ethical or responsible use of time and funding to investigate transhumanist possibilities until or unless the rest of the planet has access to medical care on par with America’s. I think that to do so - if the research was successful - would foster a system in which that kind of access to “immortality” was permanently skewed towards the members of the dominant hegemony. I also see no practical use for storing copies of people’s minds, apart from the possible collaborative math/science/art aspect.

Based on my knowledge of the workings of the brain, I do not believe such a “transfer” would be possible, feasible, or useful; if it does happen to be possible, the only means by which I can imagine it being done are ludicrously expensive and time-consuming. That’s the digital version of transhumanist life. “You” don’t magically enter the computer; you get copypasted into the hardware, and that copy is what’s “immortal”.

Organically, I think it’ll eventually be possible to clone oneself in order to have a new meat casing for one’s brain. (The cloned bodies could not, of course, develop brains - if they did then taking the body to house your own brain in it would be murder.) In that case, again - the expense would be extraordinary, and there’s no getting around the economic, labor, and social costs of having to nurture cloned bodies for future occupants. Someone’s got to be checking the temperature and the vital signs and adding fresh IV packets to the drip when the old ones run out. Assuming a nurse could do this as part of their job, your new body would cost about $810,000 to grow to adulthood, not counting any of the material or housing expenses. Then there’s the issue of brain transfer itself. Add another million to the price tag. Only the billionaires would be able to “live forever.”
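One way to read the $810,000 figure above is as roughly $45,000 a year of nursing-level monitoring over eighteen years of growth. The per-year figure is an assumption for illustration; the post gives only the total:

```python
# A possible breakdown of the ~$810,000 clone-body care estimate above.
# The per-year care figure is assumed; the original post states only
# the total and the separate ~$1M brain-transfer estimate.

YEARS_TO_ADULTHOOD = 18
CARE_COST_PER_YEAR = 45_000   # assumed nursing/monitoring cost per year

body_cost = YEARS_TO_ADULTHOOD * CARE_COST_PER_YEAR
total_cost = body_cost + 1_000_000  # plus the post's brain-transfer price tag
print(body_cost, total_cost)  # 810000 1810000
```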

It’s good to not want to accept the inevitability of death. That feeling indicates a healthy evolutionary aversion to nonexistence! I simply find the prospect of being alive forever a lot less exciting. I intend to live out my natural lifespan in a healthy manner. On good days I am enthusiastic about it. On worse days I force myself to go through the motions out of sheer cussedness and a determination that if I’m going to live then I’m damn well going to be good at it. On the worst days I can only trudge forwards with the understanding that I am serving out a very lengthy prison sentence. For me, being alive is only tolerable or enjoyable or precious because it is finite. I can understand that other people feel differently, but I find the prospect of immortality horrific.

My apologies if this came out sounding argumentative - when I find that someone disagrees with me I like to puzzle out why, so I wanted to explain my position.

There’s a lot of different ways to discuss the “uploading into computer” thing. I think a lot of them hinge on the same sort of questions as the classic “teleporter” problem; if you have a teleporter that (like in Star Trek) works by taking you apart and putting you back together again in another place, are you still the same person? (I say yes.) What if the “teleporter” just induces unconsciousness, makes an identical copy of you in the other place, and then painlessly kills the “original” you? Are you still the same person? (I also say yes, but I recognize a lot of people say no.)

But there’s also a sort of Ship of Theseus approach. Suppose we invent an artificial neuron. It works exactly the same as the regular neuron, except it also builds a software model of its operation. Suppose then that we replace one of your neurons with this artificial neuron. Surely you’re still the same person after this, just as you’d be the same person if you lost your leg and got an artificial leg. And then suppose we repeat the process on each neuron, one at a time, until your entire brain is composed of artificial neurons emulating the original ones. At that point, since the artificial neurons have a software map of how they work, that software is you. Perhaps you could detach from your body and go spend some time in a virtual environment, then come back to your body later or not, depending on how you feel.

As far as clone bodies go, I think there would eventually be economies of scale, particularly as we figure out how to automate parts of it. Why not have a computer system to monitor vitals and switch out IV drips? You still need someone on-call in case something goes wrong, but if we have the technology to grow a brainless clone in any degree of health, routine care ought to be simple to automate. In a mere ten years we went from it costing $100 million to sequence an entire human genome to “only” $10,000. The first couple hundred times will be unbelievably expensive, yeah, but I don’t see a reason we can’t drive down the marginal costs if we have the technological base to do it at all.
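The genome-sequencing figures above imply a steep yearly rate of decline, which can be checked with the numbers as given:

```python
# Implied annual cost decline from the sequencing figures above
# ($100 million down to $10,000 over roughly ten years).

start_cost = 100_000_000
end_cost = 10_000
years = 10

annual_factor = (start_cost / end_cost) ** (1 / years)
print(round(annual_factor, 2))  # 2.51 -- about 2.5x cheaper every year
```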

To be honest, I don’t understand a viewpoint that says “I’m glad I only get to do a limited number of things in my life.” If writing two novels is good, why not take ten times as long and write twenty? If exploring three cities is fun, why not explore thirty? I don’t believe there’s any meaning to life beyond what we make of it, so I want to make it the best I can.