Why are people happy with, or approving of, AI on Apple products, when the same thing was (rightly) treated horribly when Microsoft did it?

Is Apple doing it better in some way? Both said it’ll be local-only, but Apple is now doing some cloud processing. Do people really just trust Apple more???

  • vermyndax@lemmy.world
    4 months ago

    I actually like Apple’s approach to AI more than all of the others. I don’t care for Microsoft’s implementation at all. On top of that, I just try to avoid Microsoft in general, so there’s no need for me to complain about it.

    But I do think Apple’s approach to AI from a privacy and implementation perspective is what I would prefer from a software vendor.

    • Blisterexe@lemmy.zip
      4 months ago

      I like how Apple seems to understand where AI is useful and doesn’t shove it everywhere; it’s also opt-in.

      I don’t like Apple, but their AI implementation is quite nice.

  • garretble@lemmy.world
    4 months ago

    The biggest thing in the last couple of weeks is Microsoft showing off the half-baked Recall “feature” that lets your computer take screenshots of basically everything you do. The idea that you could search for something you did in the past using normal language is interesting, but the implementation was terrible. That’s a big strike against MS, so much so that they’re now pulling the beta release. MS also doesn’t have a good track record with things that are supposed to be local somehow ending up not local; I believe there was a big issue on Xbox where local screenshots were still being monitored in the cloud somewhere. And MS loves shoving ads down your throat and turning features back on after you’ve explicitly turned them off. There’s no trust.

    Apple certainly has their own issues, but as others have said, they have at least outwardly been a privacy-first company, at least in their marketing materials. They were one of the first to build “secure enclaves” into phones and PCs so biometrics couldn’t get off your device, for example. There’s a bit of a history, earned or otherwise, of Apple not doing bad things with your data, so when they say their AI junk is private, it’s easier to swallow.
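
    To make the “secure enclave” point concrete, here’s a rough sketch (my illustration, not anything from Apple’s AI announcement) of how an app asks the Secure Enclave to generate a key that can never be exported: the hardware performs operations with it, gated by biometrics, but the key material itself never leaves the chip. The application tag is made up, and it only runs on devices that actually have a Secure Enclave.

    ```swift
    // Sketch: generate a private key that lives inside the Secure Enclave.
    // Only signing happens in hardware; the key material never leaves it.
    import Foundation
    import Security

    // Require biometric auth (current enrollment) before the key can be used.
    guard let access = SecAccessControlCreateWithFlags(
        kCFAllocatorDefault,
        kSecAttrAccessibleWhenUnlockedThisDeviceOnly,
        [.privateKeyUsage, .biometryCurrentSet],
        nil
    ) else { fatalError("could not create access control") }

    let attributes: [String: Any] = [
        kSecAttrKeyType as String: kSecAttrKeyTypeECSECPrimeRandom,
        kSecAttrKeySizeInBits as String: 256,
        kSecAttrTokenID as String: kSecAttrTokenIDSecureEnclave, // hardware-backed key
        kSecPrivateKeyAttrs as String: [
            kSecAttrIsPermanent as String: true,
            kSecAttrApplicationTag as String: "com.example.demo-key".data(using: .utf8)!, // hypothetical tag
            kSecAttrAccessControl as String: access,
        ],
    ]

    var error: Unmanaged<CFError>?
    guard let privateKey = SecKeyCreateRandomKey(attributes as CFDictionary, &error) else {
        fatalError("key generation failed: \(error!.takeRetainedValue())")
    }

    // The public key can be exported and shared; the private half cannot.
    let publicKey = SecKeyCopyPublicKey(privateKey)
    print("Created enclave-backed key, public half:", publicKey as Any)
    ```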

    That said, I have yet to find a use for any of this AI junk on any platform. I wish it all just stayed in the realm of intelligently making your photos a little sharper or whatever, and not hallucinating things out of whole cloth. I’m actually happy my iPhone isn’t new enough to take advantage of this new stuff.

  • Jackiechan@lemm.ee
    4 months ago

    Unsure, but I think it’s because one of Apple’s main talking points in recent years has been privacy (think the Apple logo with the lock). Whether it’s true or not, Apple has built trust with its users (misplaced or otherwise), whereas MS has lost a lot of that trust (especially with the recent Recall fiasco).

    I think both companies are storing and using user data while telling you they aren’t, but I think it’s like you said: people, for whatever reason, trust Apple more.

  • Thekingoflorda@lemmy.world
    4 months ago

    I think it’s also partly because Apple’s business model is more compatible with not earning money by selling your data (not saying they’re also doing that). You buy a Windows key once and that’s it, but Apple customers often buy into the whole ecosystem, meaning Apple can earn recurring money from overpriced hardware.

  • mojoaar@lemmy.world
    4 months ago

    I don’t trust either of them with my data; the same goes for Google and Meta, for that matter 🙂

    But comparing what I saw in the WWDC video with M365 Copilot, I have to say I’m looking forward to seeing what Apple brings in terms of functionality. To be specific, it’s the actions part I’m looking forward to seeing.

    We have been POC’ing M365 Copilot at work, and I have to say that for something they charge 30 dollars a month for (with no ability to do actions), the feeling is: meeeh…

    • Dojan@lemmy.world
      4 months ago

      Apple talks big about privacy but we only have their word for it, and they’re a corporation just as likely to lie and muddle things to fool their customers. That’s my main problem concerning Apple and privacy.

      On paper, their approach to ML is probably the best I’ve seen from any corp, and probably the best I could have hoped for, since there was no way they’d just not go down that route at all.

  • umami_wasabi@lemmy.ml
    4 months ago

    It’s about perceived reputation. I have to admit Apple is really good at software and integration, with a somewhat balanced approach, although I’d rate their business practices a negative score.

  • crossover@lemmy.world
    4 months ago

    Apple lay out some details here: https://security.apple.com/blog/private-cloud-compute/

    They control the cloud hardware. Information used for cloud requests is deleted as soon as the request is done. Everything is end-to-end encrypted. Server builds are publicly available to inspect. And all of this is only used when on-device processing can’t handle a request.

    If somebody wanted to actually create a private AI system, this is probably how they’d do it.
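
    As a rough, purely illustrative sketch of that “on-device first, cloud only when needed” flow (the types and the heuristic here are made up; Apple doesn’t expose a public API for this routing):

    ```swift
    // Hypothetical illustration of the routing described above -- not a real Apple API.
    import Foundation

    struct AIRequest { let prompt: String }
    struct AIResponse { let text: String }

    enum AIError: Error { case tooComplexForDevice }

    struct OnDeviceModel {
        // Answers locally when the request fits the small on-device model.
        func respond(to request: AIRequest) throws -> AIResponse {
            guard request.prompt.count < 500 else { throw AIError.tooComplexForDevice } // toy heuristic
            return AIResponse(text: "answered on device")
        }
    }

    struct PrivateCloudClient {
        // Conceptually: verify the node's attestation against the published builds,
        // encrypt the request to that verified node, and the node discards the data
        // once the response comes back. All of that is elided in this sketch.
        func respond(to request: AIRequest) -> AIResponse {
            return AIResponse(text: "answered in Private Cloud Compute")
        }
    }

    func handle(_ request: AIRequest) -> AIResponse {
        do {
            // On-device processing is always tried first...
            return try OnDeviceModel().respond(to: request)
        } catch {
            // ...and the cloud is only involved when the device can't handle it.
            return PrivateCloudClient().respond(to: request)
        }
    }
    ```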

    You can disagree with this or claim somehow that they are actually accessing and selling people’s data, but Apple are going out of their way to show (and cryptographically prove) how they’re not. It would also be incredibly fraudulent and illegal for them to make these claims and not follow through.

  • lemmylommy@lemmy.world
    4 months ago

    Well. One company stared down the FBI when they wanted assistance unlocking a terrorist’s phone, because doing so would have weakened security for everyone else.

    The other keeps adding „features“ to my operating system that are designed to siphon data from me. They build, at the very least, misleading dialogs for those „features“ to trick me into enabling them (not even allowing „no“ as a choice; usually it’s just „yes“ or „not now“), and even when meticulously disabled, they have a tendency to magically re-enable themselves after updates.

    Who would you trust more?

  • EarMaster@lemmy.world
    4 months ago

    I think it is, because Siri is barely usable anymore. Other solutions have shown how bad it is, and everyone hopes real AI will make it better…

  • TechNerdWizard42@lemmy.world
    4 months ago

    People who use and trust Apple are idiots. They gasp in wonder when they receive such new advances as arranging icons on your screen. I did this with my Palm Pilot in the ’90s and on every phone since, even Windows Mobile phones in 2002/2003. But now it’s “NEW”!

    Same thing with AI and Apple. Too stupid to actually know any better. But when Daddy Apple says you are going to use it, everyone fawns.

    Sending your data, all of it, to a cloud is not privacy. I guarantee this is part of the content-scanning and reporting requirements being seen across the globe. It gets wrapped in public-relations marketing about preventing CSAM, human trafficking, drug crimes, etc. But anyone with a brain cell knows that’s not the real why; that’s just how it can be sold to fear-mongering groups.