Everything Apple iOS 18 Will Do, Android Already Does

https://lemmy.world/post/16505118


Yes. Android already does all these things. But the things I’m most excited about aren’t on this list at all.

  • A private local LLM. With the on-device context of my notes, messages, calendar, etc., I’m rather excited to have a more personal LLM than ChatGPT.

  • Personal messaging via satellite. I love that I can stay in touch with people outside of a cell network.

  • It’s a very short list but powerful.

    A private local LLM

    Running on a phone? No way, not without being horribly slow or churning through your battery anyway.

    Good LLMs are already slow on a GTX 1080, which is already miles faster than any phone out there.

    I hear you, but I would also be shocked if Apple rolled this out and it turned out to be an absolutely terrible experience. Their MO is “luxury” products with “premium” experiences; it would not be fitting of the brand to have a piece-of-crap experience in their flagship announcement.

    I’m willing to give them the benefit of the doubt on this one.

    You might wanna check with Siri on that. Apple regularly failed at that, even under the leadership of Jobs. And Tim Cook is no Steve Jobs. It’s already looking like it’s going to be just standard remote ChatGPT, hallucinations and all.
    Apple Maps has entered the chat.
    But Apple Maps is much better now lederp.jpeg.

    Apple Maps was bad, yes. But they had their hand forced. Google started charging for their API (enough to cripple their app), and they had very little time to create one of their own.

    That’s not happening here. No one is forcing their hand. If they didn’t release an updated Siri this year, nothing would happen.

    I think it’s running on their “Private Cloud Compute” platform, not locally (I’m not sure though).
    some things are run locally.

    It’s not an LLM; it’s a much smaller model (~3B) which is closer to what Microsoft labels an SLM (Small Language Model, e.g. MS Phi-3 Mini).

    …apple.com/…/introducing-apple-foundation-models

    Introducing Apple’s On-Device and Server Foundation Models

    At the 2024 Worldwide Developers Conference, we introduced Apple Intelligence, a personal intelligence system integrated deeply into iOS 18…

    Apple Machine Learning Research
    Microsoft’s penchant for making up names for things that already have names is neither here nor there. It is an LLM; in fact, it’s already twice as large as GPT-2 (1.5B params).
    I do think it’s a useful distinction, considering open models can be more than 100B params nowadays and GPT-4 is rumored to be 1.7T params. Plus, this class of models is far more likely to be on-device.
    You would be surprised. If you haven’t tried to run an LLM on Apple silicon, it’s pretty snappy, but like all the others, RAM can be a significant limiting factor unless the model is trimmed down to do very specific things to reduce its size.
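    To put the RAM concern in numbers, here’s a rough back-of-envelope sketch. The 20% overhead factor for the KV cache and runtime is my own assumption for illustration, not a published figure:

    ```python
    # Rough RAM estimate for running a language model on-device.
    # Weights dominate memory use: params * bytes-per-parameter, plus a
    # fudge factor for the KV cache and runtime (assumed ~20% here).

    def model_ram_gb(params_billions: float, bits_per_param: float,
                     overhead: float = 1.2) -> float:
        """Approximate resident memory in GiB for a (quantized) model."""
        weight_bytes = params_billions * 1e9 * (bits_per_param / 8)
        return weight_bytes * overhead / 2**30

    # A ~3B model, the class Apple describes for on-device use:
    fp16 = model_ram_gb(3, 16)  # full half-precision weights
    q4 = model_ram_gb(3, 4)     # 4-bit quantized weights
    print(f"3B @ fp16 ~ {fp16:.1f} GiB, @ 4-bit ~ {q4:.1f} GiB")
    ```

    Under these assumptions a 3B model fits in phone RAM only once quantized, which is why on-device models are both small and aggressively trimmed.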
    Also excited for hands-free unlock of smart door locks. Not sure if Android/Google Home does that.
    Did I understand correctly that this is only going to be on the iPhone 15 Pro? Because that’s a lot more expensive than a Pixel, more than I’d ever spend on a phone tbh.
    The satellite feature, I’m fairly sure, is Pro only.
    Satellite is on 14+
    We went hiking and tried calling emergency services, but my friend’s 14 Pro couldn’t get any signal for some reason. Idk what was wrong; my friend and I with Pixels had no issues though.
    Where do you live?
    This was in the LA mountains, Angeles National Forest.
    I don’t know. Pixels don’t support calls via satellite though.
    Which really surprised me, since the Pixels were able to call but all the iPhones, including the iPhone 14 Pro, weren’t.
    Satellite messaging is already available in the Android 15 beta, and the on-device LLM will only be for the iPhone 15 Pro and above, so most iPhone users won’t be able to use it.
    But there are literally zero phones that can use it.

    A private local LLM. With the on-device context of my notes, messages, calendar, etc, I’m rather excited to have a more personal LLM than ChatGPT.

    llmfarm.site

    LLaMA and other LLM on iOS and MacOS

    LLM Farm is an app for running LLaMA and other LLMs on iOS and macOS.

    That’s great, but the point of it being local and private is that it can consume my personal data and be a more personal LLM. This just doesn’t hit that mark.
    Yeah I guess it doesn’t allow access to those things yet although I don’t see why they couldn’t add that in a future release. The APIs for that already exist.
    I’d add to that list: if Siri is 3/4 as capable as shown in the presentation, that’s sick. Android does not have that.
    Surprising no one?
    Been that way for a long long time now.

    Me watching WWDC: “Android already does that.”

    Me watching Google I/O: “iOS already does that.”

    Windows Phone 10 had most of these things in 2015.
    It had everything except apps.
    Oof. I felt the heat from that burn from all the way over here
    QNX had a lot of these features before Windows Phone, back in 2013.
    I miss Windows phone, still the most intuitive phone UI I’ve ever seen.
    Microsoft: “I think we really nailed this phone UI. We should make this the desktop computer experience too.”

    Oh my god. I had a Samsung Omnia 7 and I loved the Metro UI.

    Remote desktop into a Windows Server 2008 box and try to open the Start Menu by clicking a single pixel in the lower-left corner… Shoot me.

    In ten years all phones will be crabs
    if only i could be as successful as mr. krabs…
    I knew there was a link between cell phones and cancer!!
    nah that’s social media
    I’m experiencing déjà vu…
    They often steal features from each other. Technological innovation has slowed down a lot because of monopolies.

    I would argue that it’s the nature of having a mature and complex product. Adding new stuff is hard because you have a lot of legacy code / UX that you have to accommodate. You need to move slower because it’s easier to break stuff in a more mature product.

    I’d also argue that Apple’s and Google’s research teams are generally hearing similar things from their end users, so it’s to be expected that both companies will prioritize similar functionality.

    That was my experience when I worked on massive products. The complexity of the product impacts development speed, and shared understandings of user desires result in similar feature sets between competitors.

    Exactly. You get it. At the end of the day they are all going to get many of the same features.

    They both copy from webOS anyway, at the end of the day. Palm’s webOS was way ahead at the time but lacked the hardware and carrier support needed to succeed.

    My last brand-new Pixel phone had debug strings in the user interface, and the UI was not responsive. It’s the daily annoyances and details that made me get an iPhone. Comparisons have been stupid since the beginning of smartphones.
    Can you elaborate on these problems?

    I know this is an old feature, but do we have the NFC money transfer thingy?

    I mean the one where you touch another phone with your phone and transfer money.

    If we do, I’m unaware of it; I use AOSP.

    I think Google Pay used to have something similar, until Google axed the whole thing in favor of Google Wallet
    … And that is why competition and “stealing features” is always good, whether it’s Apple’s competition or Google’s.

    Nope. Apple only.

    I’m switching to iPhone because Google has let Android languish for years now. Samsung does more for android than Google does for goodness sake.

    Apple users get fun and cool updates, which is why they love it. Plus best-in-class photos and videos, so they can share photos with friends and family with confidence, as opposed to Android, which has shit cameras and even shittier video.

    Samsung does more for android than Google does for goodness sake.

    I agree with you, even though I dislike One UI… I think Google’s, although simpler, is more to my liking… But oh boy, stock AOSP is so limited that for some users it is even too similar to iOS, with the distinction that you can sideload more easily (for now).

    Lots of features that Google releases with each Android iteration have literally been in One UI, MIUI, Color OS, you name it, for ages.

    But Android doesn’t run the LLM locally. Can someone please make a FOSS app that can do similar things?
    You could use this
    GitHub - Mobile-Artificial-Intelligence/maid: Maid is a cross-platform Flutter app for interfacing with GGUF / llama.cpp models locally, and with Ollama and OpenAI models remotely.

    The point of a local LLM is you can give it access to contacts, notes, app control, fucking everything, similar to what Apple’s done. Cool app, though.

    No phones can run “LLMs” currently, because by definition they’re large.

    Some Android phones, however, can and do run smaller models locally. Gemini Nano runs on the Pixel 8 and can run on Samsung phones.

    deepmind.google/technologies/gemini/nano/

    Gemini 1.0 Nano

    Gemini 1.0 Nano is our most efficient model for on-device tasks

    Google DeepMind