
When we can't keep up anymore, then what? (AI)

PoetPhilosopher

Veteran Member
@Nimos - this may not relate directly to what we're talking about, but I wanted to bring up a subject while we're here:

Dating sims:

Historically, dating sim video games have been around for a long time, and they haven't used AI. These games provide a getaway and a fantasy away from real life. The interactions may seem scripted, but in some games they are quite positive anyway.

We are now starting to see AI dating sims, and they have actually led to user frustration. The women in these new dating sims act like American women on dating sites: they are prone to reject the user and stop talking to them rather than interact with them. Even when interactions do happen, the result still seems to be frustration.
 

Quintessence

Consults with Trees
Staff member
Premium Member
AI doesn't scare me; it is what humans can use it for that is the potential issue.

I really wish more people would watch the interview and listen to the concerns raised by this guy, because I think a lot of them are valid, but also because I get the impression that people are not really sure what exactly this AI technology is, as if it were just another phone app that is a bit more advanced than the others.
That's fair. I'll leave the task of being concerned about it to you. We can't all focus on everything.

As someone who has used machine learning in research and also as an educator, I'm familiar with what it is and what it does. I'm just not concerned. Not because there aren't concerns, but because what will be... will be. Human technology is not long for this planet one way or another given so much of it rests on unsustainable extraction of resources and a total disregard for natural cycles. It's been around for a fraction of a blink of an eye in the eyes of our multi-billion year old planet and multi-million year old species. Change is always happening. Plus, in terms of life philosophy, I just do not care about "keeping up" with something. That's missing the point of life and living, IMHO - the result of a very product-fixated, capitalist culture that overlooks process and experience far too much.
 

Nimos

Well-Known Member
AI by nature is a complicated and unpredictable thing.

Its results are even more complicated than the times I tried to program a randomly generated dungeon for a game.

Much more complicated, in fact.

At the same time, when I say "complicated", I'm not necessarily implying it will become 100-1000x smarter or more effective, either.
I'm not sure it really is. The guy in the OP video says that the code for ChatGPT or Bard is only around 2,000 lines, which is not a lot, to be honest; the power is in the dataset used.

Funny you mention randomly generated dungeons; I actually programmed one of those for a 2D game I was working on :)
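
The basic idea is simple: scatter a few random rooms on a grid and carve corridors between their centres. A rough Python sketch of that idea (just an illustration, not the actual code from my game) might look like this:

```python
# Rough illustrative sketch: place random rooms on a grid, then carve
# L-shaped corridors between consecutive room centres.
import random

WIDTH, HEIGHT, ROOMS = 40, 20, 6
grid = [["#"] * WIDTH for _ in range(HEIGHT)]  # start with solid walls

def carve_room(x, y, w, h):
    for ry in range(y, y + h):
        for rx in range(x, x + w):
            grid[ry][rx] = "."

centres = []
for _ in range(ROOMS):
    w, h = random.randint(4, 8), random.randint(3, 6)
    x, y = random.randint(1, WIDTH - w - 1), random.randint(1, HEIGHT - h - 1)
    carve_room(x, y, w, h)
    centres.append((x + w // 2, y + h // 2))

# connect each room to the next with a horizontal then vertical corridor
for (x1, y1), (x2, y2) in zip(centres, centres[1:]):
    for x in range(min(x1, x2), max(x1, x2) + 1):
        grid[y1][x] = "."
    for y in range(min(y1, y2), max(y1, y2) + 1):
        grid[y][x2] = "."

print("\n".join("".join(row) for row in grid))
```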

But using AI, it could probably be created in a few minutes; that is the big difference. They talk a bit about programmers in the last video I posted, and their prediction is that there will probably be none left in the future, because everyone will be a programmer: you simply type into a textbox and the AI will do it for you.

I used ChatGPT for some programming and it can do it, but it not being hooked up to the codebase was an issue. Given that you can install an AI similar to ChatGPT locally and train it yourself on whatever programming language you like, you could probably solve that, I think. And there are a lot of tools already in development to help you code, or to code for you.

Like Llama:
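
Just as a rough sketch of what running such a model locally could look like (assuming the Hugging Face transformers library and a publicly released Code Llama checkpoint; the model name below is only an example):

```python
# Rough sketch only: load a locally downloaded code model and ask it to complete a function.
# Assumes the Hugging Face "transformers" library; "codellama/CodeLlama-7b-hf" is just an
# example checkpoint - swap in whichever model you actually have on disk.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "codellama/CodeLlama-7b-hf"  # example model name (assumption)
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

prompt = "# Python function that generates a random dungeon layout\ndef generate_dungeon(width, height):"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=200)

print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```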

But ChatGPT-4 is also very impressive, and it gets better and better. Things are happening so fast that it is almost impossible to keep track of it all.
 

Nimos

Well-Known Member
@Nimos - this may not relate directly to what we're talking about, but I wanted to bring up a subject while we're here:

Dating sims:

Historically, dating sim video games have been around for a long time, and they haven't used AI. These games provide a getaway and a fantasy away from real life. The interactions may seem scripted, but in some games they are quite positive anyway.

We are now starting to see AI dating sims, and they have actually led to user frustration. The women in these new dating sims act like American women on dating sites: they are prone to reject the user and stop talking to them rather than interact with them. Even when interactions do happen, the result still seems to be frustration.
Sure, there is no reason to assume that this field is not also going to explode like all the others. Again, as I said, things in AI are moving at an absolutely crazy, uncontrolled pace, and I think the majority of people are completely unaware, because their focus is on everything else or they simply don't really understand it.

But my fear is that it will suddenly bite them, without them even realizing what is going on before it is too late. If we are to take any of these experts seriously, I think it's probably better to be somewhat aware of it than to assume that it is just some gimmick.
 

Nimos

Well-Known Member
That's fair. I'll leave the task of being concerned about it to you. We can't all focus on everything.

As someone who has used machine learning in research and also as an educator, I'm familiar with what it is and what it does. I'm just not concerned. Not because there aren't concerns, but because what will be... will be. Human technology is not long for this planet one way or another given so much of it rests on unsustainable extraction of resources and a total disregard for natural cycles. It's been around for a fraction of a blink of an eye in the eyes of our multi-billion year old planet and multi-million year old species. Change is always happening. Plus, in terms of life philosophy, I just do not care about "keeping up" with something. That's missing the point of life and living, IMHO - the result of a very product-fixated, capitalist culture that overlooks process and experience far too much.
Yes, but as I explained to someone else, it is not machine learning. If it were, there wouldn't be any fuss about it.

Don't get me wrong, I can understand why someone is not particularly concerned about this if it is not relevant to them. I don't know whether it is for you, obviously.

But if you as an educator are working with kids and their future, or if you have kids yourself, they are going to face this in a very real sense. How are you as a parent going to prepare a child for that?

In the second video I posted, with the CEO of SD, his prediction is that all jobs that take place in front of a computer are going to be affected in some way, obviously some more than others. AI is not just introducing a new technology in a specific field; it is going to hit globally, if that makes sense, all at once.

I had a talk with someone about when we would see 3D generative AI, and we agreed it would take a very long time. Yet in the video, this guy is already working on making it possible to generate 3D live within about a year or so, and on how the movie industry will be affected when people start generating their own Hollywood-quality movies at home in a few years. Whether that is actually possible or not is, I think, not especially relevant; what matters is that these people are talking about it seriously, as something to be expected, which would have been laughable just a few years ago.

My point is that a child in, let's say, 7th grade is facing a future that might be completely different from what things are now. When I was a kid, you could say "I want to be a lawyer" or whatever, and be pretty sure that would still be possible when you grew up. I honestly don't know if a kid can even say that today, if the development of AI continues at this rate.

Are these people behind AI idiots, or just crying wolf? Or why would they constantly warn about this apparent impact that no one seems prepared for? Let alone societies: I don't think governments have any solutions ready if, for instance, there is going to be a mass layoff of people.

A lot of people might not think that it affects them, but, for instance, Google has made a medical model and it scored higher in both clinical diagnosis and empathy than humans did. That is pretty crazy. Think about it for a second: the AI was rated as more empathic than the humans.
 

Quintessence

Consults with Trees
Staff member
Premium Member
Yes, but as I explained to someone else, it is not machine learning. If it were, there wouldn't be any fuss about it.

Sorry about that - I use the term machine learning rather than "AI" because I simply do not like the term. So for all intents and purposes, when I say "machine learning" just replace it with whatever term you prefer.

But if you as an educator are working with kids and their future, or if you have kids yourself, they are going to face this in a very real sense. How are you as a parent going to prepare a child for that?

I'm not that kind of educator - I work at the university, not with what most would call "kids" - nor am I a breeder. I've already been using these tools in one of my classes for some time, to help train students to become better writers and critical thinkers. Process over product - journey over destination. The folks who are infatuated with machine learning seem very obsessed with product and destination, without considering that their products do not replace or replicate the process as experienced by people actually living their lives. That was as much the case when photography was invented as it is today with whatever new useless fancy-pants nonsense humans come up with next.

A lot of people might not think that it affects them, but, for instance, Google has made a medical model and it scored higher in both clinical diagnosis and empathy than humans did. That is pretty crazy. Think about it for a second: the AI was rated as more empathic than the humans.

That's not how I think about it, because I understand where the models come from. Sure, you can choose to tell yourself the story this way. Or you can go "wait a second, machine learning models are trained based on plagiarized information generated by humans... which means what this machine is doing is conveying collective human wisdom." Everything is derivative. Humans included, actually. Humans are what they are because of the information they experience and synthesize from their environments. It's not any different with machine learning. It all gets derived from something external to itself. That doesn't make it "more" or "better" unless you choose to tell the story that way in your philosophy.
 

Nimos

Well-Known Member
Process over product - journey over destination. The folks who are infatuated with machine learning seem very obsessed with product and destination, without considering that their products do not replace or replicate the process as experienced by people actually living their lives. That was as much the case when photography was invented as it is today with whatever new useless fancy-pants nonsense humans come up with next.
But isn't this just another journey? One where humans might not be able to come along, because we can't keep up, so to speak. Yet the journey is going to affect us, and in many cases we won't have a say in it, as the destination might very well be forced upon us.

That's not how I think about it, because I understand where the models come from. Sure, you can choose to tell yourself the story this way. Or you can go "wait a second, machine learning models are trained based on plagiarized information generated by humans...
And my point is: does it matter?

Let's say that in a few years, all students will be taught by an AI that adapts or optimizes to each student so they learn in the way that works best for them. If you and such an AI were blind-tested against the students, let's say 90% preferred the AI because they felt they learned things better that way.
Where does that leave you? You can't teach students on an individual level, given that you don't have that capacity, and they even address this in the video: how the AI can adapt to each child, take into account whether they suffer from dyslexia, optimize how these kids learn to work together, etc. As most people know, education in general is not only extremely expensive but in many cases also of poor quality, and if AI could offer a better education and a happier child in the end, wouldn't all parents choose that?

So does it matter to the parent whether the child is taught by an AI rather than a human, or that the information is plagiarised? I think some people would go with the human, but from a logical point of view no one would ever choose the worse option if the AI turned out to be the favourite in a blind test; it is mostly the parents' fear that would guide them, rather than what is in the child's best interest.

And that is also the point I have raised with some others here: how much do we care whether it is an AI or not, in the end, until it affects us in a negative way?
 

Secret Chief

Very strong language
This is just another example of AI being used, this time within the art industry.

View attachment 81447
So the AI has turned the drawing on the left into the image on the right. And this technology is also really only in its beginning.

This is a higher-quality image of an AI-generated human, created simply by typing in a text prompt.
View attachment 81448

So again, it doesn't require a lot of imagination to figure out what impact this could have on the art/photography industry as a whole. And there are lots of examples of this; it is very easy to do and to get high-quality images. You don't have to travel to a location, set up all the equipment, hire a big crew, etc. You just type what you want into a text prompt and you get the image.
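
For a rough idea of how little is involved, a minimal sketch (assuming the Hugging Face diffusers library and the publicly released Stable Diffusion weights; the model name and prompt below are just examples) could look like this:

```python
# Rough sketch: text-to-image with a publicly released Stable Diffusion checkpoint.
# Assumes the Hugging Face "diffusers" library; model name and prompt are examples only.
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained("runwayml/stable-diffusion-v1-5")
pipe = pipe.to("cuda")  # or "cpu", just much slower

image = pipe("studio portrait photo of a woman, natural light, 85mm lens").images[0]
image.save("generated_portrait.png")
```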

Imagine you were a clothes designer: what does it matter if people can't tell the difference anyway? (Everything in the image below is AI generated.)

View attachment 81449
Remind me again what 8 billion people are actually going to be doing?...
 

Nimos

Well-Known Member
Remind me again what 8 billion people are actually going to be doing?...
As @Snow White said, a universal income solution might very well be the most realistic option. But as said, governments are not really prepared for this in case things start to go wrong, so it will probably start with a lot of misery for a lot of people. If we are to listen to these AI experts (obviously there are also some who disagree with them), then there are not going to be other places for people to just relocate to. If AI offers a productivity increase of the magnitude they believe, one person will simply do what 5-10 people might have done before, or however many they can replace, until they themselves get replaced. And as we all know, you can't simply re-educate someone; it can take years to do that, and then there is no guarantee that they are even needed.

And in this case we are only talking about AI; what will happen when robots really get involved as well?

I honestly think that these AI experts are correct when they say that humans throughout history have never experienced such a rapid change in our lives as what is about to hit us. These technologies are not like before, when they were in most cases designed to improve human productivity. As I see it, we are the ones who are going to hold back the AIs and the robots once these things really get rolling. Imagine if there is a breakthrough in robotics on the same scale as AI in a year or two.

Tesla is apparently working on a household robot, and let's be honest, they are not the most incompetent people:


So what other solution is there? Humans need to eat and food costs money, so we need money; and if 50% of us are not needed, what other solution is there?
 

Secret Chief

Very strong language
As @Snow White said, a universal income solution might very well be the most realistic option. But as said, governments are not really prepared for this in case things start to go wrong, so it will probably start with a lot of misery for a lot of people. If we are to listen to these AI experts (obviously there are also some who disagree with them), then there are not going to be other places for people to just relocate to. If AI offers a productivity increase of the magnitude they believe, one person will simply do what 5-10 people might have done before, or however many they can replace, until they themselves get replaced. And as we all know, you can't simply re-educate someone; it can take years to do that, and then there is no guarantee that they are even needed.

And in this case we are only talking about AI; what will happen when robots really get involved as well?

I honestly think that these AI experts are correct when they say that humans throughout history have never experienced such a rapid change in our lives as what is about to hit us. These technologies are not like before, when they were in most cases designed to improve human productivity. As I see it, we are the ones who are going to hold back the AIs and the robots once these things really get rolling. Imagine if there is a breakthrough in robotics on the same scale as AI in a year or two.

Tesla is apparently working on a household robot, and let's be honest, they are not the most incompetent people:


So what other solution is there? Humans need to eat and food costs money, so we need money; and if 50% of us are not needed, what other solution is there?
I can't see it ending well for most of the human race. Musk will probably be ok.
 

Quintessence

Consults with Trees
Staff member
Premium Member
But isn't this just another journey? One where humans might not be able to come along, because we can't keep up, so to speak. Yet the journey is going to affect us, and in many cases we won't have a say in it, as the destination might very well be forced upon us.

This doesn't sound much different to me from what is already going on now, and what has gone on in the past. For example, I sure as blazes didn't ask to be born into this predatory capitalist culture I am more or less forced to live in. I deal with it. I deal with living in a predatory capitalist culture by engaging with it as little as possible. I'll deal with living in the information age by being very selective in what news sources I consult and not overloading myself with needless information. I'll deal with whatever machine learning brings in similar ways. As will all others, in their own ways, based on their own proclivities, which need not align with mine. We will all just deal with it, one way or another. I don't find worrying about it to be productive, but if worry is how others deal with it, go for it, I guess.

Let's say that in a few years, all students will be taught by an AI that adapts or optimizes to each student so they learn in the way that works best for them. If you and such an AI were blind-tested against the students, let's say 90% preferred the AI because they felt they learned things better that way.

So we already have some insights into this, based on what happened during the pandemic. It turns out humans are social animals. Humans want to be around other humans. Actual humans, face to face, not screens and machines. I thought that the total shift to distance learning would have a huge impact on future course modality. That didn't happen; students didn't want online courses. They wanted in-person courses with instructors and grad students and teaching assistants, they wanted to be on campus, and they didn't want the virtual and machine-only experiences. Some did - but those who do typically choose online because of other considerations, like needing to work to pay for school. Not at all what I expected from a generation that more or less grew up with screens plastered in their faces all the time.
 

Nimos

Well-Known Member
I can't see it ending well for most of the human race. Musk will probably be ok.
I have no clue either; we are in an extremely unique situation, as I see it. Before, when we had big leaps in technology, they would boost the economy overall, but I don't think that will be the case here, because the purchasing power is with humans, and no one in their right mind who owns a company will choose a robot or AI over a human if they are not even remotely at the same level.

I don't see a solution where humans (potentially billions of them) come out on top against robots and AI in competitiveness, except maybe in science, because science is a collaboration and deals with the unknown, so AI doesn't really have an edge there, or could at least still benefit from humans.

But I think either our economic system is going to have to adapt big time, potentially a complete remake, or things are going to end very badly: riots, a lot of depression, etc.
 

PoetPhilosopher

Veteran Member
I find myself agreeing with @Nimos the more he speaks on this subject. I think before, when we talked/argued about it, it was like we were holding a coin, and I was looking at one side of it while he was simply looking at the other.
 

Brickjectivity

Veteran Member
Staff member
Premium Member
If anyone is interested in the interview, it can be seen here:
I still don't consider this person to be someone who understands the topic. A mind does not form itself from neurons. It has to be built. A software program that emulates neurons can do better if it tries different configurations; however, this does not equal building a mind. When AI is given a body that can experience fear and its effects, then we will have reason to be concerned. Before that, we need only supervise to make sure there are no accidents. Think about what life would be like for you if you had no pain and no amygdala in your brain, had no feelings to deal with such as anger, attraction, an upset stomach, hunger, or tiredness, and if you had no urge to reproduce and no urge to do anything except pursue specific goals. You'd be pretty compliant and satisfied.

The dangerous AI are already here. They are drones used in warfare. They are walking robots that can carry weapons. These are quite dangerous in the hands of politicians. These are here now. They aren't about to go crazy on their own and take over, but the politicians and technicians might.
 

Nimos

Well-Known Member
This doesn't sound much different to me from what is already going on now, and what has gone on in the past. For example, I sure as blazes didn't ask to be born into this predatory capitalist culture I am more or less forced to live in. I deal with it. I deal with living in a predatory capitalist culture by engaging with it as little as possible. I'll deal with living in the information age by being very selective in what news sources I consult and not overloading myself with needless information. I'll deal with whatever machine learning brings in similar ways. As will all others, in their own ways, based on their own proclivities, which need not align with mine. We will all just deal with it, one way or another. I don't find worrying about it to be productive, but if worry is how others deal with it, go for it, I guess.
I understand what you are saying.

But I think there is a huge difference between worrying and being prepared, both as individuals and as societies. If you think about Covid, we weren't prepared, because we didn't really take it seriously before it was too late, which is the human way of doing things.

This is not sneaking up on us; this is basically stepping on us, and we are still not reacting. I haven't heard any politicians or economists addressing this potential issue. And my guess is that it will not be taken seriously before it hits the fan for real and it is again too late, because right now we are very interested in climate change instead.

I think this falls into the same category as climate change and COVID; it is not something you can solve as an individual. I think we need a global solution.
 

Secret Chief

Very strong language
I have no clue either; we are in an extremely unique situation, as I see it. Before, when we had big leaps in technology, they would boost the economy overall, but I don't think that will be the case here, because the purchasing power is with humans, and no one in their right mind who owns a company will choose a robot or AI over a human if they are not even remotely at the same level.

I don't see a solution where humans (potentially billions of them) come out on top against robots and AI in competitiveness, except maybe in science, because science is a collaboration and deals with the unknown, so AI doesn't really have an edge there, or could at least still benefit from humans.

But I think either our economic system is going to have to adapt big time, potentially a complete remake, or things are going to end very badly: riots, a lot of depression, etc.
Add in the global climate emergency and it's (incoming pun) the perfect storm.
 