Trying out a new theme with better navigation tools, to make it easier for visitors to enjoy the depths of my barely-one-year-old blog. Let me know what you think! Any and all suggestions are more than welcome!
Reading the intro to this post, I thought I was in for a juicy slice of biblical literalism that I could pick apart piece by idiotic piece. But as I continued, I was pleasantly surprised to find a genuinely thoughtful analysis of the ‘literalism’ of the Bible. The author, though obviously religiously biased (but to each his own), chastises the oversimplification of a vast collection of texts into a dichotomy of literal vs. non-literal interpretations, and instead favors an approach that focuses on the authors’ intentions, which is how any good book should be analyzed, fiction or nonfiction (which is up to you to decide).
“You can’t take the Bible literally!” This is an assertion I’ve heard many times. I don’t agree with it (as stated), but I understand the sentiment behind it. Often it is a reaction to fundamentalist claims about the Bible, such as that it teaches a young earth or creation in six twenty-four-hour days. Christians in the sciences, and other thinking Christians interested in the sciences, rightly raise questions about such views. Sometimes they insist, “You can’t take the Bible literally!”
I want to affirm the sentiment behind the assertion, but not the assertion itself.
I reject it primarily because it oversimplifies the matter and because it rests on a false dichotomy (i.e., literal vs. non-literal are not the only options). Very briefly, consider the following thoughts:
- No one actually reads the whole Bible literally. No one, for example, interprets the dragon in the book of Revelation as an…
This is an excerpt from an attempt to expand an idea I started with this short story. I’d love as much feedback as possible. Enjoy!
I think I was part of the last generation that didn’t grow up with BIT. I mean, it was around, but nobody really understood how to use it. It wasn’t everywhere like it is now. I’ve seen kids with chips who couldn’t have been older than 10. That’s just crazy. You don’t even know how to use your brain yet. But I guess it’s just progress. I mean, I’m glad I have my chip. I think BIT is probably mankind’s greatest achievement, but I’m glad I had my brain to myself for as long as I did. Take nostalgia, for example. With post-chip memory it’ll be a thing of the past. All those little annoyances of an experience that fade away as you get older, they’ll be there in vivid detail, so no nostalgia bias.
See, one of the first mainstream chip features was memory upgrades. Problem was, everyone did them differently. The first ones all tried basic SDB, sensory data backup, which is basically just recording the standard sense-data to the cloud. But then to recall the memory, you had to redirect the neural pathway from the hippocampal formation to the memory database in the cloud, which everyone soon found out was a bad idea. With kids it was OK, their brains are still pretty plastic and open to rewiring. But in adults, this is one of the strongest neural pathways in the brain. So strong it would actually override the chip’s rewiring, which consumers saw as paying for shit that didn’t work. So then they tried storing the sense-data in the hippocampus, but that got tricky too, because the brain already stores the memories itself, so they had to find a way to attach the sense-data to the memory the brain created. Problem is, the way memory works, the brain only really stores the information from the memory that it feels is important. Over time, whatever information you actually use when you recall the event, that’s what the brain reinforces. So if you don’t use all the sense-data stored for the memory – and who does? – the brain naturally dismisses it. Which, again, just comes off as faulty programming. So then someone said, ‘Well, why are we trying to store information where the brain stores memories? Why don’t we just store it where the brain naturally stores information?’ Which, yeah, it’s obvious once you know it, but just no one had thought of it yet. So what we did was send the sense-data to the parahippocampal cortices, which is where the brain stores semantic memory (facts, information, data). Then all it took was strengthening the neural connection that already existed between the parahippocampal cortices, where the sense-data now sits, and the hippocampus itself, where the memory sits.
Just fire a few hundred neurons through it at setup and the brain automatically reinforces it. Then it stays strong from there, because it’s actually useful information to have when recalling memories.
So what resulted was two different sets of memories, pre-chip and post-chip, though technically it should be pre- and post-SDB, but no one says that. The pre-chip memories are still vague and fuzzy, like natural ones, but the post-chip ones are fully detailed, vivid. You can almost relive the moment in a weird way. What’s really weird is when you recall your pre-chip memories, the recollection gets stored by the SDB processors, so you have a post-chip memory of recalling a pre-chip memory. This really made people realize how faulty pre-chip memory was, because they could look at every time they recalled the memory and see that it changed each time. And the post-chip hippocampus tries to assimilate all of these different details into one memory, but it can’t. They’re too different, and they contradict each other.
Anyway, the younger you get the chip, the fewer of those memories you have. And kids are getting the chip younger and younger. Pretty soon they’ll just be putting it in babies, and no one will even remember what natural memories were like. I’m just glad I was born when I was. I like my pre-chip memories. Somehow they feel more real, even though they’re actually less accurate. I think there’s something kind of special about that process. Maybe it’s just the thought of it going extinct that makes it feel that way, but I don’t know. Maybe we’re supposed to forget certain things. Maybe the past is supposed to look different every time you remember it. I mean, our brains could have adapted to keep memories exactly as they happened, but they didn’t. I’m not saying there’s a reason, like intelligent design or anything like that; I’m just saying the brain saw some reason to do it this way. But I guess we know better than our brains now, don’t we?
To my friend The Ark: I found your favorite beer mug. Looks like you smashed it somewhere in San Diego, must have been a crazy night!
On a Labor Day excursion to San Diego, my girlfriend and I stopped by Balboa Park’s “Museum of Man”, a decidedly non-PC name for one of the few museums dedicated to anthropology. Among the mediocre exhibits on evolution and Mesoamerican culture, containing mostly replicas and casts of artifacts, I was pleasantly surprised to find this gem. Tucked into the gimmicky “Beerology” exhibit, undoubtedly set up to draw visitors to the small and not-so-noteworthy museum, was an actual beer cup buried alongside Pharaoh Akhenaten in his tomb at Amarna. We can pretty safely assume that the ‘first monotheist’ actually used this now gangrenous chalice to knock back his royal brewskies. I know it’s pretty nerdy, but this unexpected encounter with an ancient man with whom I happen to have a slight fan-boy fascination made me a little giddy for a fleeting moment. Anyways, I just thought I’d let the guy know his cup is waiting for him in SoCal if he wants to come pick it up sometime.