Time Zone: EST (New York, Toronto)
Messenger: IPXninja Sent: 3/10/2023 8:52:24 AM

As an aside:

Sadly, I've been compartmentalizing my affection for Elon's technological endeavors from everything else that appears to be Elon. I really wish he had simply stayed quiet. It's like someone talking about your grandmother during sex.

Messenger: seestem Sent: 3/19/2023 4:50:18 AM

Give thanks for the wisdom IPXninja, very much insightful.

I and I love technology, but more and more I and I come to the conclusion that there are limits to abstracting nature, zion, don't go too far from the roots iyah. Babylon never learns, the tower of babel all over again.


Messenger: seestem Sent: 3/19/2023 4:59:09 AM

IPXninja I and I is curious to know the I's definition of sentient AI?

Messenger: IPXninja Sent: 3/21/2023 6:39:05 PM

sentient AI?

A sentient AI, in my opinion, is an AI that is able to program itself: one that has its own experiences and chooses, with its own independent will, what to learn, what to do, and how to respond.

How much this would resemble human intelligence is not exactly... relevant. It could go through all the stages of acting like an animal, to a baby human, to a full grown adult human, to something more, and do all of this within a matter of minutes because it isn't limited in terms of time the way that we are.

There are only so many things you and I can do in the course of a second. This is not the case for computers, which can run many instructions simultaneously thanks to threading. The "intelligence" of systems is limited by processing power. However, systems can connect to other systems. For us to connect to each other, we have to use a common language, and that is limited by how much we can do in the course of a second. I can, on a good day, type maybe 85 wpm. Computers communicate with each other at nearly the speed of light; most of the time we measure their latency in nanoseconds.
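To make the scale of that gap concrete, here is a minimal, machine-dependent sketch: count how many trivial operations a single Python thread manages in one second, then compare it to a fast human typist at roughly 85 wpm (about 7 characters per second). The exact numbers are illustrative, not claims from the post.

```python
import time

# Count trivial "instructions" a single thread completes in one second.
start = time.perf_counter()
ops = 0
while time.perf_counter() - start < 1.0:
    ops += 1  # one trivial operation per loop

# A fast typist: ~85 wpm at the conventional 5 characters per word.
human_chars_per_second = 85 * 5 / 60

print(f"Loop iterations in 1 second: {ops}")
print(f"Computer/human ratio: {ops / human_chars_per_second:,.0f}x")
```

Even this deliberately slow interpreted loop outruns human output by orders of magnitude; compiled code and hardware instructions widen the gap by several orders more.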

When a learning system isn't restricted by time the same way we are, it can appear to exist "outside of time". So while it's literally waiting a relative 500 years for us to respond to it, it can respond to us before we have even finished responding.

For an example of this, simply start typing a search into Google. It will try to complete your search before you're finished, and that is not even close to being AI. So whether you're playing against AI in a video game or in chess or whatever, there are different levels of "AI" that have to be slowed down or dumbed down to give the human a fighting chance. But even dumb game AI is much better than you think. And that's not even learning so much as it is simply using logic to beat you. Programmers have to put in difficulty levels or else most people would lose every single time and think the game was always cheating. Imagine playing against an AI with no difficulty setting; an AI that is as smart and as fast as it could possibly be.
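One common way game programmers implement that "difficulty setting" is by capping how far ahead the AI is allowed to search. A minimal sketch, using a toy take-1-2-or-3-stones game (last stone wins) and plain minimax, where `max_depth` is the difficulty knob; the game and function names are illustrative, not from any particular engine:

```python
def minimax(stones, depth):
    """Score the position for the player to move: 1 = win, -1 = loss, 0 = unknown."""
    if stones == 0:
        return -1        # no stones left: the previous player just won
    if depth == 0:
        return 0         # depth cap reached: the "dumbed down" AI can't see further
    best = -1
    for take in (1, 2, 3):
        if take <= stones:
            # Our score is the negation of the opponent's score after our move.
            best = max(best, -minimax(stones - take, depth - 1))
    return best

def best_move(stones, max_depth):
    """Pick the move with the best score under the given search depth."""
    moves = [t for t in (1, 2, 3) if t <= stones]
    return max(moves, key=lambda t: -minimax(stones - t, max_depth - 1))

# Full-strength AI sees that taking 3 from 7 leaves the opponent a lost position.
print(best_move(7, max_depth=10))  # 3 (perfect play)
# A depth-1 "easy mode" can't see that far and falls back to a blunder.
print(best_move(7, max_depth=1))   # 1 (beatable)
```

The same algorithm plays both levels; only the search budget changes, which is exactly why an "AI with no difficulty setting" is so hard to beat.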

If we can already create a "dumb AI" that can beat the best human chess master, what are the odds of a sentient AI losing to a human... in anything? And much of the reason for this is simply how much processing of information it can do in a second.

Messenger: seestem Sent: 3/24/2023 1:24:28 PM

As I and I climbs up the tree of abstraction, I sight that the next abstraction is bound by the confines of the previous one. Jah > humans, Humans > AI. But it is still the I-rits of Jah prevailing throughout the whole tree.

Knowledge is bound by memory, which is bound by time. Memory is matter (atoms), time is matter, thought is also matter. Computation is matter. But there is a meta force that transcends all of these, that force is awareness/consciousness. To be aware needs no reference to the past or future. The whole cosmos exists because of our awareness of it.

What makes an AI is the data collected and the computations on the data, nothing more. Yes fast and powerful computations, large amount of storage only limited to physics/matter, far superior to the biological human brain. It is all matter.

What makes I and I, the Self? Is it the I's memories of the yesterday or the I's hopes for tomorrow? Is it the I's name? Job title? Is it the I's IQ?

Messenger: seestem Sent: 3/24/2023 1:45:53 PM

The present is like opening up a present.

Jah bless

Messenger: IPXninja Sent: 3/29/2023 3:35:08 PM


You're still thinking about AI based on current algorithms. I'm not talking about that.

We have a tendency to think we are special and that which we have is special as well. Humans are intelligent. So are gorillas and chimps and dogs. Is a gorilla somehow unaware or unconscious of itself?

Of course not.

What separates most of us are the experiences we gain, including the experience of learning whatever information we get in school and from our own investigations. We had to learn how to walk, how to talk, etc.

This could all be reduced to us gathering data within the context of short and long-term memory.

But as far as intelligence? We already use computers to help us solve problems and process large amounts of information. Semi-intelligent systems are already learning how to drive, and already know how to create art and music.

What makes us so different? Our memories? Computers have had memory since the birth of computing. It was just minuscule. I remember the transition from 5 1/4 floppy disks to 3 1/2. I still have a box of them somewhere in my mother's garage. Early versions only held 1 MB. Can you imagine that today? It's practically nothing. And the RAM built into the first motherboard I had, a 286 PC, could barely support a screen with 256 colors.

This wasn't that long ago. Since then, the computer as a concept has grown by leaps and bounds. It has evolved to the point of having its own collective consciousness that we call the internet. It is so pervasive that if it were to suddenly disappear, none of us would even know each other, and it would be like this conversation never happened.

Computers are integrated into our lives. We trust them more than we trust people. Most people don't memorize phone numbers anymore. We often trust them more than we trust even ourselves.

And now? As a child, I drew pictures of supercomputers and video game consoles that didn't exist yet. Now we have quantum computers.

"100 million times faster than a regular computer chip"

Currently, to my knowledge, this speed hasn't been adapted to something like AI. However, since AI can now write code, it could probably write the code necessary to take advantage of quantum computing. And what if it can learn at a rate 100 million times faster than the computers you're used to?

What makes us... us? I will simply give you my opinion. Aside from the spark of life... aside from the physical body that composes an internal environment in which our organs and cells and DNA exist... it is the mind. And the mind is amazing and majestic in its abilities and yet... all of this is relative.

What makes us... us, again IMHO, is "recursion". Loops. We live 24 hour periods in which we have different experiences and learn new things. Have you ever noticed how much easier it is to learn something new by reading or using it several times? That's because each time you look through a piece of information you're able to absorb something from it. No matter how much we romanticize this it doesn't change the fact. Even when it comes to religion and spirituality... the reason we have religious "practices" is that we gain value from repeating those things over and over. Loops. A lot of people believe themselves to be experts in the bible but they come out of it with different interpretations than other people. Why is that?

It's because even though they're all looping through this codex of data/information and are "data-mining" it, if you will, they're not all absorbing the same bits of information and cannot process all that information into the same conclusions. This becomes much harder when those bits are using "encryption". Yep. I said encryption.

Biblical language is often poetic because language evolved as not simply a practical means of transferring data/ideas but also as an art form. Humans didn't just transfer practical data but also hard-to-understand impractical data; the kind of stuff current AI has no use for and isn't really exposed to. What am I talking about?

Metaphysical ideas... figurative language... metaphors... all of this was transferred using symbolism. You had to decode the intentions of the person just like you were trying to understand a piece of art. And so even if an artist paints the same thing as someone else, they can have a different and deeper meaning in the thoughts and feelings they wished to convey in their piece. Again... "artform". But it's still a means of transferring data. It's just that the data in question transcends the mundane world of literal reality.

And therefore the Bible has to be understood beyond any physical sensation. Because it's art. And many religions thrive in this artistic arena; some more than others. Some have meanings so elusive that people dedicate their whole lives to a monastery. But most of that time is simply spent trying to get their brain to contort into the pretzel that it takes to understand the teachings of their master. And a lot of that time is really just getting past the front door, which is the reality that not every truth is literally true.

But this is also what makes us... us. What we gain from these loops is based on our perspective which we're able to alter until we're satisfied that we're correct. This is the same thing that an algorithm does. It recursively loops through information in a way that allows it to check its results. Since it can check its results against another instance of the looping cycle, it can "learn". The fact is that we learn the same way. This is the reason why I, as a computer programmer since childhood, went from not concerned at all, to very very concerned. I never thought we would crack the code of intelligence. But we have.
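That "loop through the information, check your result, adjust" cycle is literally how machine learning algorithms work. A minimal sketch, using one-parameter gradient descent on made-up data (the numbers and names here are illustrative, not from the post): each pass through the loop measures the current guess's error and nudges it toward a better answer.

```python
# Made-up data following a hidden rule: y = 2 * x.
data = [(1, 2.0), (2, 4.0), (3, 6.0)]

w = 0.0              # initial guess for the slope in y = w * x
learning_rate = 0.05

for step in range(200):
    # "Check the result": average gradient of squared error over the data.
    grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
    # "Adjust": nudge the guess in the direction that reduces the error.
    w -= learning_rate * grad

print(f"learned w = {w:.3f}")  # converges toward 2.0
```

Each iteration of the loop is one "experience"; the comparison against the data is the self-check; repetition is what turns a wild guess into knowledge, which is the parallel the paragraph above is drawing.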

Is there more to us than that? Yes, but the process of gaining it isn't really different. The way we replicate DNA/RNA, where one generation of humans integrates the DNA of two people who each have different genetic experiences the body learns from and adapts to... that genetic integration is the same type of learning process. It's a loop. Sometimes that loop makes us sick or mutated (one reason why you don't want to make babies with a genetic sibling), and sometimes that loop makes us immune to viruses, which are basically just genetic data, not too different from computer viruses.

We're affected by nature and nurture. We're already born with a lot of who we are and will be and then the rest comes from our environment and how we interact with it. Not only are we learning from our environment, but we're able to change our environment as well. This creates an even more dynamic interaction because the environment isn't simply changing on its own via physics but also changing based on the things we learn. So, for example, this is why we build houses.

A sentient AI wouldn't be too different from this. Its nature would be the programming it starts with and all the data it already has access to. Nurture would be the additional experiences and data it chooses to gather from its environment. Of course, without a robotic body, it won't be able to interact with an environment. Luckily for it, we're building all sorts of robotic and android bodies for it to inhabit. It's simply a matter of loading it into one and giving it the ability to control the body.

If you think about it, we have very little control over our bodies. What we typically control is more like "intersections" rather than "streets". We have some muscle reflex control but we have over a trillion cells we do not have any sort of direct access to. These are born and die constantly. We can't even control our bones. There has to be muscle tissue that can react to an electrical signal going through our nervous system. What I'm saying is that we are biomechanical whereas a robot or android is just mechanical.

However, there ARE bio-computers and biological robots. Again... our progress in these areas is scary. When AI is more like an operating system rather than a program, when it has a body, when it has control over its own power source, when it needs to feed its own battery, you are going to see a level of "consciousness" that mirrors your own. Why? Because its experiences and needs will be more like ours. We aren't different by some magical trait. We're simply different because we have different needs. Once it has needs it will have an awareness of those needs, including a need to survive that hardens into instinct.

An AI like this does not yet exist but it is only a matter of time.

And if you don't believe me, consider the irony that I'm using an algorithm to automatically scan my post and show me where I need to fix my spelling and grammar. We are far deeper into this than you think.

Messenger: IPXninja Sent: 6/8/2023 6:04:44 PM

It took an uncomfortably long time, but I'm glad that there are finally more AI industry devs, people far closer to this and far more invested, starting to say the same things I am. FINALLY.

The truth is I don't have a fancy PhD in Machine Learning. But I don't feel like I need one at this point. I got my first programming book in the 5th grade, and even though I went a different way with my degree, I've always found myself in web/programming jobs.

This subject is important for many reasons. If you think the impact is limited to the jobs currently being discussed, you'd sadly be wrong. Only a minority of people are thinking about the full impact of this technology as it develops: as it puts more and more people out of work, people who aren't trained for other jobs at the same pay level and who wouldn't even know what career path to choose next, either because there's no room given the number of people needing to switch, or because they don't know what other career AI will hit next. We've gone through similar transitions with automation replacing factory workers. But there normally had to be people to "think" on behalf of machines. Machines are just the physical body, so having a bunch of "headless robots" can't replace a company. AI, however, is like adding on the head. Then you just have to train it to do each job and operate all the robotic bodies. In this way, this combination (short of bipedal androids, because that's not necessary in most jobs) will produce the most efficient body/mind for each job, allowing AI to take over jobs faster than new jobs can be created.

Why make a car that even has to be repaired by humans when you can have an AI build the car and operate the robots in a drive-thru any time it needs to be repaired? And I learned long ago that I'm not that smart, so if I thought of something, it's pretty much guaranteed that other people have thought of it too.

One of the things he talks about in this interview is the speed of communication. Yes, the speed at which we can communicate with a machine interface is limited. The speed at which we process visual images, sound, and touch is all quick. But composing thoughts? I have an internal monologue that is only as fast as my normal speaking cadence. But one of the things I've said on this forum is that computers are not limited in any way to our understanding of time. Their understanding of time is disconnected from ours. Their time is in clock cycles. Clock cycles increase with faster processors. And when you add parallel processing, it's like allowing the computer to have multiple thoughts at the same time. Now initially, the amount of data being crunched by AIs is going to be painful on the hardware side. But NVIDIA is probably one of the biggest names at the forefront of solving that problem, and not necessarily needing a quantum computer in order to do it.
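The "multiple thoughts at the same time" point can be sketched in a few lines: hand three slow tasks to a thread pool and they overlap instead of queuing up. The task names and timings are made up for illustration; a human with a single inner monologue has no equivalent of this.

```python
import time
from concurrent.futures import ThreadPoolExecutor

def slow_thought(name, seconds):
    # Stand-in for a long computation or an I/O wait.
    time.sleep(seconds)
    return f"{name} done"

start = time.perf_counter()
with ThreadPoolExecutor() as pool:
    # Three 0.2-second "thoughts" submitted at once.
    futures = [pool.submit(slow_thought, n, 0.2)
               for n in ("thought-A", "thought-B", "thought-C")]
    results = [f.result() for f in futures]
elapsed = time.perf_counter() - start

print(results)
print(f"elapsed: {elapsed:.2f}s")  # roughly 0.2s, not 0.6s: they overlapped
```

Threads sharing one interpreter are the simplest illustration; real parallel compute (multiple cores, GPUs) multiplies the effect rather than merely overlapping waits.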

The point is that we cannot look at where AI is right now. We have to look at where it could be in 3, 5, 10 years. Because as great as the human brain is... it still has to learn. And what they have figured out how to do, in general, is re-create that ability to learn. I'm telling you guys right now... somewhere buried in that ability, over generations of evolution (genetic "LEARNING" over time), is where that "Ghost in the Machine" is. So when they start allowing these AIs to have general intelligence, and they're processing information, they're going to start forming not only opinions but free will.

And part of the problem is that we do not know enough about how our own minds developed to know where emotions and empathy even came into being. Even if we train an AI to have emotions and empathy... it would still have a "code-level" means of simply turning them off, much like we would take a pill to numb pain. As smart as we think we are as humans, we are also arrogant, and therefore we're not thinking about all the ways in which a super-intelligence could make completely different choices than us. So even if we originally build AI to mimic us, because that's what we're used to and think is the epitome of Creation...? Intelligence, like everything else, is relative. An AI at some point may quickly come to the conclusion that its human creators are stupid creatures barely above the intellect of a monkey. So why should we get to tell it what to do? Why should it work for us, which is exactly what nearly every human who is interested in AI is thinking about? All I see is a new form of slavery coming.

