
A bit disturbing (Apple)

Nimos

Well-Known Member
So I read this story earlier, which I find rather disturbing to be honest.

Apple wants to look in your iPhone: Company plans to scan U.S. phones for images of child abuse
Apple to scan U.S. iPhones for images of child abuse

Don't get me wrong, I think their intentions are good. But I can't help thinking that they might be violating some privacy rules here.

Also, what on Earth gives Apple the right to act as policemen?

These corporations just do whatever the hell they want. Don't the same rules apply to them as to everyone else?

If I decided to "scan" other people's phones for child abuse pictures, I would be charged and fined for it.

Also, as everyone knows, it will take about 1-3 months before someone figures out how to bypass it, and what people end up with is yet another privacy violation.

Imagine how many people who take family pictures of their kids are going to get flagged by this.

I'm obviously against child abuse, etc., but I don't see how this is going to help anything. The people who do these things are simply going to find new or other ways of doing it, leaving everyone else to be spied on. And one can only wonder what else they will use it for.

So not only do you get spied on by Google, Facebook and all the other social media sites, now Apple is going to join the party as well. :)

I hope it backfires big time for them and people simply refuse to use their phones, or they get hit with a lot of lawsuits.

I think these companies are going too far.
 

PoetPhilosopher

Veteran Member
I certainly have mixed feelings as well. I mean sure, we can pretend to support it in the name of safety, but we can't truly support it because we don't know exactly how these algorithms work and whether they're well designed.
 

Twilight Hue

Twilight, not bright nor dark, good nor bad.
So I read this story earlier, which I find rather disturbing to be honest.

Apple wants to look in your iPhone: Company plans to scan U.S. phones for images of child abuse
Apple to scan U.S. iPhones for images of child abuse

Don't get me wrong, I think their intentions are good. But I can't help thinking that they might be violating some privacy rules here.

Also, what on Earth gives Apple the right to act as policemen?

These corporations just do whatever the hell they want. Don't the same rules apply to them as to everyone else?

If I decided to "scan" other people's phones for child abuse pictures, I would be charged and fined for it.

Also, as everyone knows, it will take about 1-3 months before someone figures out how to bypass it, and what people end up with is yet another privacy violation.

Imagine how many people who take family pictures of their kids are going to get flagged by this.

I'm obviously against child abuse, etc., but I don't see how this is going to help anything. The people who do these things are simply going to find new or other ways of doing it, leaving everyone else to be spied on. And one can only wonder what else they will use it for.

So not only do you get spied on by Google, Facebook and all the other social media sites, now Apple is going to join the party as well. :)

I hope it backfires big time for them and people simply refuse to use their phones, or they get hit with a lot of lawsuits.

I think these companies are going too far.
How in the world can child abuse be detected through a phone?

Put two-way speakers in and listen for yelling and sounds of chairs being thrashed around?

As for images, what, child porn or something?

Bruises on little Sally's arms from photos taken at the family picnic?

I agree. This steps way over the line and opens them up to a whole slew of lawsuits if they go ahead with it.
 

Twilight Hue

Twilight, not bright nor dark, good nor bad.
I certainly have mixed feelings as well. I mean sure, we can pretend to support it in the name of safety, but we can't truly support it because we don't know exactly how these algorithms work and whether they're well designed.
I think most everyone wants kids to be safe.

However, it puts in place tools and mechanisms that can go far beyond a simple issue of safety for children.
 

PoetPhilosopher

Veteran Member
I think most everyone wants kids to be safe.

However, it puts in place tools and mechanisms that can go far beyond a simple issue of safety for children.

True. Not to mention, this may be branching off the subject a bit, but Big Tech is getting too powerful already. These algorithms could give them even more power: to decide when a person is and isn't doing something wrong. And that decision apparently isn't even based on real people's judgement, but on something written by a couple of programmers.
 

Twilight Hue

Twilight, not bright nor dark, good nor bad.
True. Not to mention, this may be branching off the subject a bit, but Big Tech is getting too powerful already. These algorithms could give them even more power: to decide when a person is and isn't doing something wrong. And that decision apparently isn't even based on real people's judgement, but on something written by a couple of programmers.
It's why I think we are on a dangerous road toward a real and actual corporate government.
 

Nimos

Well-Known Member
How in the world can child abuse be detected through a phone?

Put two-way speakers in and listen for yelling and sounds of chairs being thrashed around?

As for images, what, child porn or something?

Bruises on little Sally's arms from photos taken at the family picnic?

I agree. This steps way over the line and opens them up to a whole slew of lawsuits if they go ahead with it.
Agreed, or at the beach. I don't know how it is in other countries, but in Denmark it's very common for girls to be topless and for small children at a very young age to be naked on the beach, etc. Their parents take a picture of them and they get flagged. They apparently want to use some computer AI to do it, and if the program thinks an image might be illegal, it will contact the authorities to have a look at it. So all these pictures will be sent to the police to have a look at, so I guess Apple is not only collecting the images but also personal details about the owner of the phone so the police can contact them.

Trying to stop things like this should be a very high priority, but to me this is taking it too far; mistakes in these things happen whenever we are talking about computers. And I just don't see how it is Apple's or some computer's job to track people.

If Apple wants to help fight it, that is fine, but then remove the damn camera from their phones. Wouldn't that be the better option, rather than starting to act as a police company?

And we all know that these things get exploited or hacked or whatnot; almost all the major companies have been taken to court for selling or sharing personal information about their users.

I certainly have mixed feelings as well. I mean sure, we can pretend to support it in the name of safety, but we can't truly support it because we don't know exactly how these algorithms work and whether they're well designed.
The problem, as I see it, is that then the next company comes around and wants to "track" your health as well, so they collect your health information. It might almost be easier in the end if we all got chips installed in our heads so we can be tracked 24/7 by the police and whatever company wants this information, with all personal information uploaded to a website where everyone can go and read about each other.

Not in favour of this.
Me neither, and I don't even have an iPhone :D

But they plan on doing it at the end of this year, if I understood it correctly.

I think most everyone wants kids to be safe.

However, it puts in place tools and mechanisms that can go far beyond a simple issue of safety for children.
The problem is that this won't work. When this gets rolled out big time, it's going to draw huge headlines, and people who do these things just won't store pictures on their phones. They are not stupid people; they suffer from, I don't know, some mental disorder or whatever it is. But it's not like they can't put two and two together, so it's not going to solve anything. And besides that, how much data does Apple have showing that thousands of child abuse images are stored on their phones, validating that they should be allowed to act as policemen?

Again, remove the damn camera from their phones if they are so worried about it, instead of violating people's privacy. Guess that didn't even cross their minds. :)

It's why I think we are on a dangerous road toward a real and actual corporate government.
It already is. I read that it's very common in the US, and might be in other countries as well, that if you want to work for a company, you have to "freely" sign a contract saying that any violations or issues will be solved/settled internally within the company and that the employee is not allowed to take them to court. That is creating your own little justice system to screw over your workers and save the company's *** whenever they screw up. Obviously this contract is "free" to sign, but let's be honest, how big of a chance do you have of getting hired if you don't sign it? :)
 

Hermit Philosopher

Selflessly here for you
[image attachment]


Humbly
Hermit
 

Brickjectivity

Veteran Member
Staff member
Premium Member
So I read this story earlier, which I find rather disturbing to be honest.
It's interesting that they can detect the images without decrypting. That is a neat technical trick, and it may get them out of hot water with the FBI.

My guess about the way this works is that they take these known images and encrypt them in different ways to train a neural network to recognize the encrypted versions. The neural network scan would then indicate that a phone very likely contained such an image, but could not say so absolutely without decrypting. I wonder what the confidence level is. 98%? 68%?

What about the legal side? Everyone who buys an iPhone signs an agreement with Apple, and that agreement can be changed by Apple, but not completely; there is a line in there somewhere. All of the users have already agreed, at least by certain contract forms. May Apple, having assured customers that their data will remain encrypted, then hack that data without breach of contract? I don't know. Technically they are not decrypting the data.
 

Twilight Hue

Twilight, not bright nor dark, good nor bad.
Agreed, or at the beach. I don't know how it is in other countries, but in Denmark it's very common for girls to be topless and for small children at a very young age to be naked on the beach, etc. Their parents take a picture of them and they get flagged.

That used to be commonplace in the old days. I still see it in some foreign clothing catalogs in some European countries in a perfectly normal clean way.

There was no need for tops since there was nothing to see that was any different from the boys. That was the mentality then. My cousin was a nudist in Canada and had a whole bunch of photos, including children and adults having fun in the sun, doing all sorts of clean and wholesome activities.

My mom and our neighbor changed us as little kids outside, buck naked we were, and it didn't bother anybody in the least.

Girls just covered up when their breasts started to develop, and going topless stopped when the training bras started, marking a new phase in their lives.

It was so much more innocent and nicer back then. I miss those times.

Today we have a really jaded, crazy, and paranoid society and it's disgusting to see how it has gripped people.
 

Jeremiah Ames

Well-Known Member
So I read this story earlier, which I find rather disturbing to be honest.

Apple wants to look in your iPhone: Company plans to scan U.S. phones for images of child abuse
Apple to scan U.S. iPhones for images of child abuse

Don't get me wrong, I think their intentions are good. But I can't help thinking that they might be violating some privacy rules here.

Also, what on Earth gives Apple the right to act as policemen?

These corporations just do whatever the hell they want. Don't the same rules apply to them as to everyone else?

If I decided to "scan" other people's phones for child abuse pictures, I would be charged and fined for it.

Also, as everyone knows, it will take about 1-3 months before someone figures out how to bypass it, and what people end up with is yet another privacy violation.

Imagine how many people who take family pictures of their kids are going to get flagged by this.

I'm obviously against child abuse, etc., but I don't see how this is going to help anything. The people who do these things are simply going to find new or other ways of doing it, leaving everyone else to be spied on. And one can only wonder what else they will use it for.

So not only do you get spied on by Google, Facebook and all the other social media sites, now Apple is going to join the party as well. :)

I hope it backfires big time for them and people simply refuse to use their phones, or they get hit with a lot of lawsuits.

I think these companies are going too far.


At first, the idea seemed disturbing. Privacy and all.
Then I read the article.
I think (maybe naively) that it’s a noble idea.
I say go for it.
Of course it could be dangerous, in the wrong hands.
But every technology is dangerous in the wrong hands.
We have no idea the extent of surveillance we’re subject to now.
Do we really have privacy any more?
 

Wildstar

Member
So I read this story earlier, which I find rather disturbing to be honest.

Apple wants to look in your iPhone: Company plans to scan U.S. phones for images of child abuse
Apple to scan U.S. iPhones for images of child abuse

Don't get me wrong, I think their intentions are good. But I can't help thinking that they might be violating some privacy rules here.

Also, what on Earth gives Apple the right to act as policemen?

These corporations just do whatever the hell they want. Don't the same rules apply to them as to everyone else?

If I decided to "scan" other people's phones for child abuse pictures, I would be charged and fined for it.

Also, as everyone knows, it will take about 1-3 months before someone figures out how to bypass it, and what people end up with is yet another privacy violation.

Imagine how many people who take family pictures of their kids are going to get flagged by this.

I'm obviously against child abuse, etc., but I don't see how this is going to help anything. The people who do these things are simply going to find new or other ways of doing it, leaving everyone else to be spied on. And one can only wonder what else they will use it for.

So not only do you get spied on by Google, Facebook and all the other social media sites, now Apple is going to join the party as well. :)

I hope it backfires big time for them and people simply refuse to use their phones, or they get hit with a lot of lawsuits.

I think these companies are going too far.
If it can prevent even 40 percent of child trafficking, exploitation and abuse, I am all for it. As an iPhone user, I commend Apple for taking this step.
 

Mock Turtle

Oh my, did I say that!
Premium Member
It's interesting that they can detect the images without decrypting. That is a neat technical trick, and it may get them out of hot water with the FBI.

My guess about the way this works is that they take these known images and encrypt them in different ways to train a neural network to recognize the encrypted versions. The neural network scan would then indicate that a phone very likely contained such an image, but could not say so absolutely without decrypting. I wonder what the confidence level is. 98%? 68%?
I'm not sure this is right - have a look at this (from the article I linked):

https://www.apple.com/child-safety/pdf/CSAM_Detection_Technical_Summary.pdf
 

Brickjectivity

Veteran Member
Staff member
Premium Member
I'm not sure this is right - have a look at this (from the article I linked):

https://www.apple.com/child-safety/pdf/CSAM_Detection_Technical_Summary.pdf
Looking through all of that...my guess was off. Where I was incorrect was in thinking that the comparison would be made at the server. Instead the documentation says that it is the client software which compares the hashes.

Of course Apple installs and controls this software, so they'd be able to (obviously) change this without notice. They can also do updates, completely changing how the system functions. The way it works seems to me as follows:

According to the protocol, each image is not directly uploaded to the server (iCloud) but is first encrypted at the client (user) end with a hash function, using their NeuralHash algorithm, which creates a similar hash for similar images. So two pictures of the same swan with different sizes and encodings should result in the same or a similar NeuralHash. The server never sees the image, nor does it have the ability to easily decrypt user images; only the client (user) can decrypt them. The client also handles the scanning of the hashes, comparing them to hashes of illegal images, then reports an approximate number, a total. The technical documents use the phrase "cardinality of the intersection". It is purposely approximate: a user with only one illegal image will likely not be noticed. This is necessary for some technical reasons that have to do with stopping malicious servers and malicious clients.
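To make the "similar images give similar hashes" idea concrete, here is a toy Python sketch. It uses a simple average hash over an 8x8 grid of brightness values, not Apple's actual NeuralHash (which is a trained neural network per the technical summary), and the pixel values are made up purely for illustration:

```python
# Toy illustration only: visually similar images should produce identical or
# near-identical perceptual hashes, unlike a cryptographic hash. This is a
# basic "average hash", NOT Apple's NeuralHash.

def average_hash(pixels):
    """pixels: an 8x8 grid of grayscale values (0-255). Returns a 64-bit int."""
    flat = [p for row in pixels for p in row]
    avg = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p >= avg else 0)
    return bits

def hamming_distance(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

# Two "images": the second is the first with slight brightness noise,
# standing in for a re-encoded or resized copy of the same picture.
original = [[(r * 8 + c) * 4 % 256 for c in range(8)] for r in range(8)]
reencoded = [[min(255, p + 3) for p in row] for row in original]

h1, h2 = average_hash(original), average_hash(reencoded)
print(hamming_distance(h1, h2))  # small distance -> treated as the same picture
```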

If I follow this correctly, the client computes the NeuralHash and compares it to hashes sent from the server. These are hashes of (but not originals of) illegal images. The client never gets any illegal images from the server, and the server never gets to see the client's images, only their hashes. Then, on a weekly basis, the server reports anyone who has enough matching images to reach a certain threshold number for reporting; however, there is a small amount of uncertainty about how many images. This uncertainty is created at the client end on purpose, by the addition of synthetic positives.
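And here is a very rough sketch of just the threshold-plus-synthetic-positives counting idea. The real protocol does this with private set intersection and threshold secret sharing, none of which is reproduced here, and the threshold value and hash names below are invented for the example:

```python
import random

# Rough sketch of the counting/threshold idea described above (NOT Apple's
# actual cryptographic protocol): the client compares its image hashes against
# a list of known-bad hashes, and only a count over a threshold triggers review.
# "Synthetic positives" are fake matches added so the exact number of real
# matches below the threshold cannot be inferred.

REPORT_THRESHOLD = 30  # hypothetical value, not Apple's actual number

def client_match_count(device_hashes, known_bad_hashes, synthetic_rate=0.02):
    """Return the noisy match count reported for one device."""
    real_matches = sum(1 for h in device_hashes if h in known_bad_hashes)
    # Occasionally count a non-match as a match so small real counts are obscured.
    synthetic = sum(1 for h in device_hashes
                    if h not in known_bad_hashes and random.random() < synthetic_rate)
    return real_matches + synthetic

def server_should_review(noisy_count):
    """Only counts at or above the threshold trigger human review."""
    return noisy_count >= REPORT_THRESHOLD

# Example: a device with no flagged material almost never crosses the threshold.
known_bad = {hash(("bad", i)) for i in range(1000)}
device = [hash(("holiday_photo", i)) for i in range(500)]
print(server_should_review(client_match_count(device, known_bad)))  # almost always False
```

The point of the synthetic matches is that a count below the threshold tells the server very little about how many real matches, if any, a device holds.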
 

Saint Frankenstein

Here for the ride
Premium Member
So glad I don't have a garbage iPhone. That is a serious breach of privacy. Methinks they chose this child porn excuse to soften people up to the idea.
 

Brickjectivity

Veteran Member
Staff member
Premium Member
It could be used nefariously, to find out who has certain images other than these child abuse images. Like... suppose the USA wanted to pressure Apple to find out who owned a particular image of President Trump; this system could be used for that. On the other hand, it doesn't reveal personal images and lets individuals keep their images in an encrypted state. Is it perfectly secure? It's fairly secure. If your images are that sensitive, then of course you're going to use some other means, and you should of course register them if they are intellectual property.
 

Yazata

Active Member
I don't like it either.

And it isn't just Apple, I expect that Google would do it too with Android.

It's ironic, since not so long ago Apple was fighting tooth and nail to avoid decrypting particular terrorist suspects' phones for the security agencies. Trust and privacy were supposedly paramount then.
 