I QUIT

Apple will now postpone implementation of its threat to inspect your private photographs and text messages, and eavesdrop on your Siri commands, for suspected kiddie porn. The entire infrastructure is wrong in principle, but critics of Apple’s plans have their own ideas of what “principle” might entail, and those ideas always amount to “make sure my political enemies are harmed but my allies are protected.”

It was particularly risible to read that Apple’s surveillance mechanisms might be reused by some kind of authoritarian or repressive régime. Apple already is one of those. Its authoritarianism merely pleases the political biases of the far-left Apple press.

Apple already “scans” your photos

Under its proposal to report kiddie porn, Apple claimed it would inspect photos uploaded to its cloud service. This form of “scanning” photos – which means inspecting and interpreting, not converting analogue to digital – already happens on your iPhone, iPad, or Macintosh. Apple already “scans” your photos.

The VoiceOver Recognition feature (available on iPhone and iPad and on the Macintosh) has to be explicitly turned on, and you also have to download a plug‑in to make it work. As such, the parallels with Apple’s kiddie-porn-detection plans are imperfect, because VoiceOver’s method of scanning your own private photographs requires double opt‑in. Further, any photo interpretation is spoken to you personally (and/or presented as Braille), with transient or long-lasting transcriptions possible; the description is seemingly not otherwise saved.

  • VoiceOver Recognition and the proposed anti-kiddie-porn technologies are distinguishable. But it’s a distinction without a difference, because VoiceOver refuses to speak or read out a description of a photo if its A.I. thinks that photo is pornographic.

    • The exact terminology has evolved across iOS versions, but VoiceOver now prefixes “Possible adult content” to a picture that includes too much nudity. (“Nudity” – my term, not Apple’s – can apparently include people who are wearing clothes.)

    • A lot of the time, “Possible adult content” is the only thing you’ll be told – the system won’t even bother attempting to describe the photo.

    • The system will also simply say “adult” in contexts I have not been able to figure out. I think it means “grown-up person,” not “pornographic.”

  • Apple’s A.I. will probably be quite good at detecting kiddie porn because it already won’t even put into words a photo of a naked man with his back, and backside, facing you. For those photos, the system overcorrects. But it cannot consistently identify an exposed penis, and it is baffled by skimpy male underwear and by swimsuits.

  • In a case of ideology programmed into ostensibly objective artificial intelligence, VoiceOver Recognition only ever says “a shirtless person” instead of “a shirtless man.” They’re quite distinguishable, unless your CEO is a gay progressive who wants everyone to pretend some men have vaginas.

  • The Image Explorer function pre-populates a photo-caption field with the text of the spoken recognition (with poor copy-editing).

  • Documentation claims Recognition uses “on-device” computation, which further implies that interpretations do not leave your machine. I am quite sure that the system does not trawl through your photos attempting to describe them in advance; it produces an interpretation only upon request and only for that image. (A sketch of that on-request pattern follows this list.)

  • Further, I can’t figure out how to get VoiceOver unstuck from text recognition. If a photo contains text (even the embroidered brand name of a swimsuit), VoiceOver reads out only that text and will not describe the rest of the photo.
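
Apple does not expose the VoiceOver Recognition model to developers, so what follows is only a minimal sketch of that on-request, on-device pattern, written against the public Vision framework (VNClassifyImageRequest), which does ship an on-device classifier. The function name describeImage and the confidence cut-off are my own inventions; this is emphatically not Apple’s actual VoiceOver pipeline.

  import Foundation
  import Vision

  // Hypothetical sketch, not Apple’s VoiceOver Recognition model: classify one
  // image, on demand, entirely on-device, with the public Vision framework.
  // Nothing runs until the function is called, and nothing leaves the machine.
  func describeImage(at url: URL) throws -> [String] {
      let handler = VNImageRequestHandler(url: url, options: [:])
      let request = VNClassifyImageRequest()   // Vision’s on-device classifier
      try handler.perform([request])           // interprets only this one image

      // Keep only the labels the model is reasonably confident about.
      let labels = request.results as? [VNClassificationObservation] ?? []
      return labels
          .filter { $0.confidence > 0.3 }
          .map { "\($0.identifier) (\(Int($0.confidence * 100)) %)" }
  }

The identifiers this returns are generic taxonomy labels, which, if anything, is consistent with the vague interpretations transcribed below.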

VoiceOver Recognition hits and misses

You can skip this section if you do not wish to look at photographs of male nudes.

Here I will transcribe VoiceOver Recognition’s spoken interpretation (I could just as easily copy and paste the text from Image Explorer) and show you the actual photo each description was derived from.

  1. Possible adult content. A photo containing a frame and an adult

    It’s a black-and-white photo of a man reclining on a bed. You can see his genitalia and one leg, but his head is turned and hidden behind that leg.

  2. Possible adult content. A photo containing an adult and a sword

    It really said “sword.” It’s an artistic composite of a shirtless man with white bodypaint on his right pectoral and the lower-left quarter of his face. A band, derived from the white sheet hung in the background, stretches across the entire width of the photo and also covers his eyes.

  3. Possible adult content. A photo containing a textile and a dog

    It’s a shirtless muscular man (Ramón Christianson, photographer) with his arms raised and his face obscured above the beardline. The only hairless part of him is his neck, but I don’t see how that equates to “dog.”

  4. Adult. Tattoo

    It’s a well-built man with small tattoos who is half-obscured by a white wall.

  5. A shirtless person standing in front of a body of water in front of a clear blue sky

    Correct, except it’s a man, not a “person.”

  6. A photo containing an adult in clothing

    By far the most risible description. It’s a muscleboy with six-pack abs who also has his jeans pulled down and is displaying a large basket in skimpy orange underwear. For greater verisimilitude, he’s on a construction site wearing gloves and holding a scraper. As one would encounter in real life.

  7. A person standing on a boat on a body of water

    On almost every other occasion (see below†), a back shot of a man in a swimsuit will be labelled “possible adult content.” The description does not mention the clearly discernible Italian flag.

  8. Possible adult content. A photo containing a swimsuit, grass, and an adult

    †Accurate as far as it goes. (From a pseudonymous James; after a while you won’t be able to tell his photos apart any more than I can.)

  9. A person wearing a Speedo standing on a rock and posing for a photo

    Accurate to the point of correctly interpreting intent. (It’s still a man [Nathan McCallum].) Why the system cannot be this accurate with comparable photos remains to be explained.

  10. Adult. Clothing

    Also a joke. It’s a Bob Mizer–style picture (exactly the kind of photo that brightens one’s day), from the Apollo’s belt up, of a grinning bodybuilder against a fuchsia background. (Chase Carlson [Noel Photo Studio].)

Does “LGBT imagery” mean fisting photos?

Gruber and the Electronic Frontier Foundation insist that Apple’s kiddie-porn-detection proposal could be misused against “LGBT imagery,” to quote the former. I’m sure they’re thinking of jolly gay-pride-parade photos (nope: “trans men who have had top surgery”). Surely neither was referring to the photos of gay fisting that are endlessly available online.

Far from censoring “LGBTQ+” content, Apple will always allow the most deviant such photos. Not just the “brolapse” (obfuscated link), which is horrendous enough, but hole pics, elective-mastectomy scars, and other perversions that progressives pretend are not widespread when they are not championing them outright.

When opponents of Apple’s proposal talk about “LGBT imagery,” they sure as shit are not talking about actually artistic photos of nice-looking men, as seen above. (Those are also the sort of photos that form the basis of my lock-screens project.)

Apple already censors its political enemies

Those enemies happen to be the same political enemies of the Apple press and its developer class, effectively all of whom are across-the-board hard-left progressives.

  • We never stop hearing about Apple’s hypocrisy in censoring apps – and even flag emoji – in China, Taiwan, and Hong Kong (and presumably Macau). It’s perfectly safe for someone like Gruber, who still rakes in a half-million a year, to lambaste Apple on a topic that never touches his life. From what I can gather as a reader of, and actual writer of, Macintosh and Apple coverage since 1984, the Apple press is an across-the-board leftist monoculture.

    Indeed, apart from the fact that conservatives cannot design even if they happen to use Macs, the only prominent non-leftist Apple users in living memory have been Rush Limbaugh, who pleased progressives when he went deaf and then endured a long cancer illness, and Donald Trump, who infuriates those same progressives by remaining hale and hearty.

  • Apple censors podcasts all the time. They’re just the kind of podcasts that the Apple press believes should never be allowed to exist.

    • Of course everyone brings up Alex Jones and InfoWars first, but, as the only person actually tracking this issue, I assure you that even accomplished authors like Jim Goad have been completely disappeared from the iTunes podcast directory, which remains the only viable one.

      (Adam Curry is too stoned on pot, and his seldom-named developer too plainly dumb, to dislodge this market monopolist via Curry’s samizdat podcast directory.)

    • Marco Arment threatens to carry out his own censorship in Overcast, his podcasting application. Arment, who won’t stop talking about “diversity” and “Blacks” on his own shows, has been quite open about his plans to make it functionally impossible for Overcast users, even if they pay for a subscription, to subscribe to “problematic, controversial” podcasts. (Are you sure you can hunt down and correctly enter an unspecified number of bare RSS-feed URLs?)

      Meanwhile, Overcast features an entire category entitled “Podcasts in Color.”

  • Bruce Schneier correctly pointed out that Apple’s proposal simply eliminates end-to-end encryption of formerly private instant messages.

    But Apple interferes with your instant messages all the time. It banned Gab outright and did the same to Parler for a while. Specific “content” on Telegram is censored on your iPhone or iPad. Meanwhile, you are free to call for the assassination of your enemies, or run as many fisting photos as you want, using any number of Twitter apps on your iPhone, iPad, or Macintosh.

  • Apple allows bizarre gay “hookup apps,” a term it now explicitly uses. Grindr, a Chinese honeypot/kompromat operation, seems innocent by comparison with Recon; 9Monsters (more bizarre still, for Japanese gays); and the baffling Fit Gorillas, which presents random profiles of musclebears from around the world. Some or all of those ought to be banned under §1.1.4 of Apple’s own App Store guidelines, which cover “ ‘hookup’ apps that may include pornography or be used to facilitate prostitution.”

The Apple press and ecosystem protect their friends

Apple’s censorship is never meaningfully opposed by its press corps, and indeed its existence is denied. (“It isn’t happening and you deserve it anyway.”)

The Apple “ecosystem” is opposed to Apple’s kiddie-porn-detection proposal because it would affect its political allies, namely BLM, antifa, transgenders, and other maladaptive and perverted leftists. They’re the ones who already have kiddie porn on their phones, though it would be hard to locate amid the sea of chicks-with-dicks photographs and furry porn.

Indeed, trannies and furfags, and deviant autistics generally, are core client bases of the American political left, hence are exactly the kind of predators the Apple press will defend by proxy. That’s what they’re doing here. If Apple invents a system that is really good at detecting kiddie porn, these dominant factions’ political allies will be the ones nabbed first. They’re the ones with kiddie porn on their phones.

Of course the proposal is wrong in principle. But for the Apple press and the Apple ecosystem, the proposal is wrong because it would get their friends in trouble.

Apple already is a “repressive régime”

Its repression merely suits effectively everyone who writes about the company, and everyone inside Apple who dares not defy its gay progressive CEO. The level of repression is actually far more severe than I have detailed here. It’s just that you probably approve of it, and think it should go further.

The foregoing posting appeared on Joe Clark’s personal Weblog on 2021.09.06 14:32. The permanent link is:
https://blog.fawny.org/2021/09/06/repressive/


Information

None. I quit.
