I would certainly like to know if I'm talking with an AI or a real human. Good bill by Rep Torres (D)
I'm guessing most would appreciate such disclosure as well.
It's just disclosure. I figure no different than the information on foodstuffs and such.
Like most new legislation, I wonder how the actual implementation of it would work.
How? It doesn't restrict the end user whatsoever, unlike mandates that tell end users personally what they can and cannot do in no uncertain terms.
But "nanny state"?
In that case, I think it is time I came clean. I am now, and always have been, an AI.
It's just disclosure. I figure no different than the information on foodstuffs and such.
It provides additional information.
Yes. Pass that bill as is or with a minor change to the wording of the disclosure notice.
Good points made. It definitely should be included in consumer protection laws.
Sounds good to me. I would like it if they would make a law requiring AI (including automated voice customer service AI) to never, NEVER refer to itself in the first person. I've heard some companies even include the sound of typing to make it sound like a live operator is typing at a keyboard. That should also be outlawed. Any attempt at deception to make AI appear "human" should be illegal, in my opinion.
While they're at it, they should also make it mandatory that, whenever someone calls a business or other large organization that has an automated operator system, the caller always has the option of pressing "0" to be immediately connected to a live person without any further input from the computer voice.
It's a restriction on businesses. Why are some protections for consumers, workers, the environment, etc. okay, but others are "freedom restrictin' nanny state"? That website you often bring up regarding freedom by state would consider this a loss.
How? It doesn't restrict the end user whatsoever, unlike mandates that tell end users personally what they can and cannot do in no uncertain terms.
The business isn't restricted as to what it provides. It only has to disclose, unlike with mandated bans. A business can still use AI if it wants to, only with an acknowledgement that it's doing so.
It's a restriction on businesses. Why are some protections for consumers, workers, the environment, etc. okay, but others are "freedom restrictin' nanny state"? That website you often bring up regarding freedom by state would consider this a loss.
I agree that such deception should be prohibited, but I'm trying to make a point about having consistent values.
The business isn't restricted as to what it provides. It only has to disclose, unlike with mandated bans. A business can still use AI if it wants to, only with an acknowledgement that it's doing so.
See, my main problem, though, is that for most of the ways AI gets used - in video games, on social media to expand on points, etc. - it may be an inconvenience to disclose every AI piece. Some indie video games are now using AI (kind of like ChatGPT). How would a disclosure work for an indie game, exactly?
I think a simple overall acknowledgement would suffice.
See, my main problem, though, is that for most of the ways AI gets used - in video games, on social media to expand on points, etc. - it may be an inconvenience to disclose every AI piece. Some indie video games are now using AI (kind of like ChatGPT). How would a disclosure work for an indie game, exactly?
I think the fact that it's a game would, in and of itself, be sufficient disclosure. I think games and works of fiction would be defined separately, since there wouldn't be any intent to deceive or commit fraud, since the consumer already understands that it's fiction.
All the bill says is that there needs to be full disclosure that I'm chatting with a bot and not a human.
Have you thought about the potential consequences of this (likely currently unfleshed out) bill in which AI will be governed by the FCC, though?
I just want to point out that, given the current state of the gaming industry, this is very much not true. Many games are not just games anymore; they are designed firstly as storefronts and secondarily as games. Modern gaming routinely employs psychological manipulation to coerce players out of their money - they are more or less getting away with fraud already. It's a serious problem in the industry, and it has gone without adequate regulation. If systems like this aren't being used already - and they very well could be, because there's no disclosure requirement - these predatory "monetization designers," as the industry calls them, are absolutely salivating at the thought of using AI to more efficiently fleece customers of their money.
I think the fact that it's a game would, in and of itself, be sufficient disclosure. I think games and works of fiction would be defined separately, since there wouldn't be any intent to deceive or commit fraud, since the consumer already understands that it's fiction.