Dec 30, 2017 16:37
It's quite cold out there today.
I've been reading "Man and His Symbols" because I really don't know as much about Jung as I would like--and because I finished playing Horizon: Zero Dawn, the first video game of that sort I've played in probably 10 years. I got excited about Jung after getting a glimpse from Dr. Harris; Jung is a mysterious and quirky character to me, and maybe some of this reading will give more clarity. I'm interested especially in the way he came to think about dreams, personality, and the unconscious--during an era when neuroscience was in a very different place.
The other thing that has really been on my mind is a human-like artificial intelligence that is...conscious. I really only know the pop science of AI and how it relates to things like self-driving cars, NPCs in video games, or 'intelligent' systems that play chess or Go, which are fun to read about every once in a while. Oh, and Watson--I want to know about that project too. One aspect of AI that I have no idea about is the way mood and emotion could be modeled. I'm about to do some digging into it, but before my thoughts are influenced by the work that has already been done, I'm going to spend some time just thinking it through for a while.
I'm also thinking about Slavney and McHugh's perspectives of psychiatry. What happens if you apply the perspectives to artificial intelligence? Or, put another way, what happens if you develop an artificial intelligence in such a way that it has features of each of the perspectives as they envisioned them? Take the life story perspective, for example. For a person, the life story in some ways starts before conception, with the production of gametes and ultimately the circumstances of conception. The first elements of my own life story that I can remember are from around two and a half or three years old, but that doesn't mean I can't conceive of how things that happened before my earliest memories influenced me. Would it be that way for an AI, with some kind of nascent consciousness forming early memories that later developed into something best described as a 'life story', or would an AI--in a way--'wake up' more fully formed, since it might have access to past data and information? I know this is a favorite trope in science fiction: a computerized artificial intelligence trying to learn what emotions are. Data from Star Trek is a fun example, or the ship computer on Dark Matter more recently (they played with the idea of machines with emotions, too). So actually, it makes sense to me that an artificial intelligence would have a life story no matter what, if it were sufficiently complex to experience anything remotely like consciousness. Anyway, that's the kind of thing I want to examine a little bit before looking at what the people actually working on this kind of thing have done.
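Just to pin the distinction down for myself before I read anything, here is a toy sketch in Python--purely hypothetical, with made-up names like Episode and wake_up_fully_formed, not based on any actual AI work--of the two routes I'm imagining: memories accumulated one experience at a time versus a past handed over wholesale.

# A toy sketch, nothing more: two imagined ways an artificial agent could
# come to have something like a "life story". All names are made up
# for the sake of thinking it through.

from dataclasses import dataclass, field
from typing import List


@dataclass
class Episode:
    """One remembered event: what happened and how it felt at the time."""
    description: str
    felt_valence: float  # crude stand-in for mood/emotion, -1.0 (awful) to 1.0 (great)


@dataclass
class Agent:
    episodes: List[Episode] = field(default_factory=list)

    def experience(self, description: str, felt_valence: float) -> None:
        """The 'nascent consciousness' route: memories accumulate one at a time."""
        self.episodes.append(Episode(description, felt_valence))

    @classmethod
    def wake_up_fully_formed(cls, archive: List[Episode]) -> "Agent":
        """The other route: the agent starts out already holding a past it never lived."""
        return cls(episodes=list(archive))

    def life_story(self) -> str:
        """A 'life story' as a narrative built over whatever episodes are there."""
        return " Then ".join(e.description for e in self.episodes) or "Nothing yet."

Either way the life story ends up being a narrative over whatever episodes exist; the difference is only whether they were lived or inherited, which seems like it might matter a great deal for something like mood.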
I have two candles on my desk. They make me feel a kind of warmth even though the thermostat is set far colder than really makes sense. I guess I'm a little miserly when it comes to heating costs!!
The year has been good; there have been great fluctuations between 'challenges' and 'successes', but on balance I'd say that this has been a year of successes. It wasn't a year where new ground was broken so much as it was a year of maintenance, persistence, and careful thought. Next year will be a time of increased preparation, sort of like winding up a mechanical toy or a music box, moving toward the final platform to jump off and out into the professional world more fully formed. There will be parallels to 2014, I think, in the best way. Okay, let me go do some reading!