On raw performance, the M4 really does seem to live up to Apple’s promises. Single-core is up about 20% compared to the M3 and more than 40% compared to the M2; the generational leap from the previous M2 iPad Pro is at least a 42% jump in single-core and multi-core.
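As a rough sanity check, the claimed percentages are mutually consistent. Normalizing the M2’s single-core score to 100 (an arbitrary baseline for illustration, not a real benchmark number):

```python
# Hypothetical sanity check of the claimed generational gains.
# The baseline of 100 is arbitrary; only the ratios come from the claims above.
m2_single = 100.0
m4_single = m2_single * 1.42      # "at least a 42% jump" over the M2
m3_single = m4_single / 1.20      # M4 is "~20% over the M3", so back out the M3
print(round(m4_single, 1))   # 142.0
print(round(m3_single, 1))   # 118.3 -> implies the M3 was ~18% over the M2
```

So the two claims together imply an M3 roughly 18% faster than the M2, which is in the right ballpark for that generation.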

    • Altomes@lemm.ee · 6 months ago

      Right, and they don’t really have many AAA games; the main thing holding this back is firmly the OS. I just truly don’t get it

      • maegul (he/they)@lemmy.ml · 6 months ago

        Market segregation is worth it for them, and the chips will be used in plenty of other hardware anyway, so dumping them in iPads doesn’t hurt, even if it’s mostly just marketing for the products. Nor does it necessitate a product change.

        • AggressivelyPassive@feddit.de · 6 months ago

          It’s a waste of computing power, though.

          I have an M1 MacBook Air and barely ever actually used the CPU. Putting these chips in iPads, which are mostly used for drawing at most, is just a waste, and one of the reasons they’re so incredibly expensive. Apple could have just kept producing M1s and putting those in current iPads.

          The reality is, there’s zero innovation in Apple products. The switch to M1 was really great, but everything since then was just “more M is more better”, utility stayed the same, price went up. Awesome.

          • ji17br@lemmy.ml · 6 months ago

            It’s not a waste at all. The extra computing power allows them to get much better performance than the previous model, or the same performance at half the power draw. That’s pretty important in a mobile device.

          • skulblaka@startrek.website · 6 months ago

            It isn’t a waste if people buy it. Putting M4s in the iPad lets them market it to rubes who think bigger number is better without reading the spec sheet or understanding their own requirements, and if they’re already manufacturing M4s to put in other things, that’s one less production line that needs to run. Sure, they could release an iPad Cheapass Edition with an M1 in it and sell it comfortably at a profit for like $80, but the market for those is likely to be small, they won’t make nearly the overhead profit that the M4 iPad will, it requires an entire extra production line setup, and most importantly it isn’t flashy enough for Apple. They don’t want to release a product that feels cheap, even if it was specifically intended to be cheap. It’s bad brand optics, and they care about that a lot. Let China sell a bunch of bootleg tablets to people that want them; they’re gonna do that anyway regardless of whether Apple gets on the train, and this way Apple isn’t tarnishing their product lineup with a PoorPad™.

      • kratoz29@lemm.ee · 6 months ago

        Perhaps with a more robust OS, such as Linux or macOS, the battery and thermals would just not suffice?

        I mean, an iPad is basically a larger phone, which I think can get hot enough when pushed to its limit.

        Also, I don’t think the RAM would be enough for intensive tasks. The device as it is could be pretty good for gaming, though, if only the game library weren’t mostly shit.

        But at the same time, a MacBook Air doesn’t seem much bigger than the biggest iPad available.

        • BaroqueInMind@lemmy.one · 6 months ago

          Isn’t iOS just a heavily modified Unix clone? My jailbroken old iPad has /var/log and misc GNU directories, as well as an APT package manager for accessing Cydia repos.
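          For anyone curious, the Unix underpinnings are easy to see from any scripting environment on a Darwin-based (or other Unix-like) system; a hypothetical check might look like:

          ```python
          # Hypothetical check of the Unix-style layout; these paths are
          # standard Unix conventions, not iOS-specific claims.
          import os
          import platform

          print(platform.system())           # "Darwin" on iOS/macOS, "Linux" elsewhere
          print(os.path.isdir("/var/log"))   # standard Unix log directory
          ```

          The same layout shows up on macOS, which shares the Darwin foundation.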

          • anlumo@lemmy.world · 6 months ago

            Not a clone: it descends from an OS that was once certified UNIX. It’s just a heavily modified UNIX.

          • kratoz29@lemm.ee · 6 months ago

            It is, but that would be like saying Android is just another Linux variant.

            What I wanted to stress in my initial comment is that the OS is so heavily modified and focused on optimization and RAM management that it can hardly work for power users once multitasking is on the board.

      • helenslunch@feddit.nl · 4 months ago

        just truly don’t get it

        It’s the same reason Macs don’t have touchscreens. If they can both do the same tasks, why would you buy both? And LOTS of people buy both.

    • NOT_RICK@lemmy.world · 6 months ago

      Maybe they’ll finally announce something interesting at WWDC. I’m ready to be hurt again

    • Ghostalmedia@lemmy.world · 6 months ago

      I get it if you’re doing photo editing on an iPad. That stuff is still a CPU hog.

      That said, the M3 is on an end-of-life manufacturing process, and now that these things are getting updated every 2 years, it just makes sense to put the M3’s successor in this thing. A Pro M2 is going to stick out like a sore thumb in 2 years, and the M3s are going to start to disappear from the line up soon.

      • GamingChairModel@lemmy.world · 6 months ago

        That’s why they also announced multi-camera synced video editing in the iPad version of Final Cut Pro. In theory it can make use of the CPU, since video editing involves a ton of compute, especially with many source videos. Other than that, though, it’s hard to marry that overpowered hardware with underpowered software.

    • Monument@lemmy.sdf.org · 6 months ago

      I’m so annoyed they announced this.

      I have a slew of Raspberry Pis kicking around, doing various things. I also have a name-brand NAS that reportedly lets you run other software, including containerized apps, but their implementation is whack and doesn’t work super well.
      I want to get a more powerful machine to use as a replacement server. I’d like to spin up my own LLM tools, use it with software like PhotoPrism to auto-tag my pictures, or even spin up Frigate on it.

      My leading contender had been either a Jetson Orin Nano or a system with the Core Ultra 155H chip. But now I might have to wait until they announce/release M4 Mac minis, which is really annoying because I want instant gratification for my half-baked ideas.

      • skulblaka@startrek.website · 6 months ago

        Now you have the time to actually write up a design document and let your half-baked idea become a fully cooked one before you drop a bunch of cash on it.

    • Garry@lemmy.dbzer0.com · 6 months ago

      We’ll find out the future of iPadOS in one month! They have raised the price on the Pro models; hopefully they have a big-ass update readied up, or all the reviewers are gonna say the same thing: “great hardware let down by shit software”

      • anlumo@lemmy.world · 6 months ago

        Reviewers have been reaching the same conclusion since the very first iPad; it hasn’t deterred Apple yet.

  • Fern@lemmy.world · 6 months ago

    Curious who uses this for pro purposes. With FCP, Logic, and Resolve on there now, who would choose an iPad for these?

    Great way for a kid to start learning them, I imagine, but I would wager that most pro peeps are using it for illustration and art.

    • evident5051@lemm.ee · 6 months ago

      Whoever can afford this can already afford the laptop alternatives. My guess is that this will be a convenient “nice to have” item whenever bringing along a tablet over a laptop feels like less of a hassle.

      • locuester@lemmy.zip · 6 months ago

        Yeah, it will be interesting, because an Air is what I use for that. I need the keyboard…

        I have a powerful PC laptop, then a MacBook Air for days at conferences, airplane, etc.

        iPad seems useless for me at least. I have a phone.

        • TheRealKuni@lemmy.world · 6 months ago

          Personally I love my iPad as a larger browsing/watching device, for creative uses like vector image work, photo editing, and drawing, occasionally for CAD work (which is remarkably simple with Shapr3D), and of course streaming from my Xbox or PS5 to play remotely. Also it can run Stable Diffusion, which can be fun to play around with.

          But the primary reason I originally bought it was for sheet music. 😅

          I don’t really need the pro performance, but it’s nice to have for some of the creative stuff. And learning to redo workflows with the pencil and touch inputs can be frustrating and slow at first, but I find once I get the hang of them it can be really intuitive and quick. I recently designed a T-shirt for my dad in a vector app I had never used before; it took me only an hour or so to feel proficient enough to be satisfied with the work, and further practice will only make it better.

          Obviously it isn’t for everyone, I’m not trying to be an iPad evangelist. But even though I don’t use mine for my primary job I really enjoy working with it when I get to.

    • some_guy@lemmy.sdf.org (OP) · 6 months ago

      Accidentally replied in the wrong spot.

      I posted more because I’m happy about the chip performance gains. That it’s in an iPad is interesting, but I’m excited for the Mac.

  • FluffyPotato@lemm.ee · 6 months ago

    I don’t use Apple’s stuff, but alternatives to x86 could be the future. The one thing they need is compatibility with x86 software; otherwise mass adoption is heavily hampered. It doesn’t matter as much for Apple, since their whole ecosystem is under strict control, but for general-purpose consumer hardware that compatibility is required first.

    • SMillerNL@lemmy.world · 6 months ago

      Apple already stopped selling x86 devices, and even the stuff that is not under their control seems to work fine.

    • eleitl@lemmy.ml · 6 months ago

      You seem not to be using open-source software packaged for multiple architectures, or which can be built for your binary target. Most people will just be using a browser and an office suite.

      • FluffyPotato@lemm.ee · 6 months ago

        Yeah, obviously, that’s the case for most people. A lot of people for whom a Chromebook would be enough would not be affected. But, for example, software that isn’t getting new updates, and basically all gaming, would just not work on other architectures currently.

    • TheRealKuni@lemmy.world · 6 months ago

      I have a friend who said that on his M2 MacBook, even before the Apple Silicon build of Factorio was released, the game ran better under x86 emulation than on his previous machine. And much cooler.

      The battery life and thermals that come out of these powerful ARM chips are amazing, and anything that can be multithreaded is going to perform brilliantly on these chips.

      Obviously for stuff where thermals and power consumption aren’t as important the gains aren’t as large, but I can’t remember the last time I worked on an actual desktop machine rather than a laptop with or without a docking station.

      • FluffyPotato@lemm.ee · 6 months ago

        That heavily depends on what the previous machine was. Factorio runs on my laptop without taxing the system much more than idling, and on my desktop I can’t even tell it’s running based on performance monitoring. So yeah, I’m not sure Factorio is a good indicator.

        • TheRealKuni@lemmy.world · 6 months ago

          Sure, definitely not a perfect benchmark. I’m not saying it’s going to outperform a current x86 machine in general. But if it can perform as well as or better than a relatively powerful x86 machine from a few years prior, while emulating, that’s impressive.

          But I don’t know, I don’t have a MacBook.

          • FluffyPotato@lemm.ee · 6 months ago

            I’m pretty sure the old AMD APUs from the Bulldozer era can run Factorio, and that’s like a decade old.

            Sure, it’s some metric, but I’m pretty sure any computer produced currently can run Factorio.

        • olympicyes@lemmy.world · 6 months ago

          I’ve got a high-end Intel MacBook Pro and a low-end M1 Mac Mini. The Mac Mini runs x86 apps like Civ 6 faster and smoother than the Intel MacBook can.

          • FluffyPotato@lemm.ee · 6 months ago

            I don’t doubt it; Apple has never had good gaming performance. But a non-Apple x86 laptop in the same price range aimed at gaming can run it a lot better.

    • erwan@lemmy.ml · 6 months ago

      The performance is not inherent to ARM; x86 can definitely catch up.

  • suction@lemmy.world · 6 months ago

    If Lenovo were really clever, they’d now spend some money on creating a Linux desktop that is as polished and usable as macOS, and use truly Retina-level displays. I’m ready to ditch Apple like I’ve never been before.

    • thatKamGuy@sh.itjust.works · 6 months ago

      In general, I would love for any OEM to step in and provide similar build quality to a Mac… doesn’t even have to be Lenovo (who IMO are a pale imitation of IBM’s line of laptops).

      • suction@lemmy.world · 6 months ago (edited)

        The Lenovo additions to the ThinkPad lines (like the foldable ones or tablet hybrids) are pretty horrible; the classic ones (T, P) are still good.

        The ultrabooks, X1 Carbon or whatever they’re called, are also OK for the weight.

        I bought a used P51 and love developing on it, because using Docker on an OS where it’s natively integrated is a game changer. But at the same time, looking at the ugly font rendering on a dim 4K screen with huge one-inch bezels spoils it again. Developing on a Mac feels less like work because of their attention to design.

        • Addv4@lemmy.world · 6 months ago

          I have an X380; it’s pretty decent for what it is. Sure, there are plenty of things I’d change (RAM slots instead of soldered memory, a 3:2 or 4:3 aspect ratio for the touch screen, maybe a bit lighter), but later gens actually have a few of those improvements. It’s not really a great replacement for an iPad, but it’s a pretty decent work machine (provided you don’t need a ton of power or RAM).

      • AnUnusualRelic@lemmy.world · 6 months ago

        That’s only if you like macOS. I tried it and ran back to the usability heaven of KDE (and someone was gifted an Apple laptop).

  • Garry@lemmy.dbzer0.com · 6 months ago

    If the leaked score is true, isn’t it beating every CPU in single-core performance?

    • Evilcoleslaw@lemmy.world · 6 months ago

      In Geekbench, yes. From other reporting I’ve seen, the major improvements here come from the Scalable Matrix Extensions (SME) on the M4, which Geekbench supports. Real-world benefit would be limited to certain scenarios and would require application support for SME.
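      For context, the core operation SME accelerates is a tiled outer-product accumulate (C += a ⊗ b), the inner kernel of matrix multiplication. A plain-Python stand-in, purely conceptual — real SME hardware handles an entire tile per instruction:

      ```python
      # Conceptual sketch of the outer-product-accumulate step that SME
      # accelerates. Plain Python for illustration only; not an SME API.
      def outer_product_accumulate(C, a, b):
          """C[i][j] += a[i] * b[j] for every (i, j)."""
          for i, ai in enumerate(a):
              for j, bj in enumerate(b):
                  C[i][j] += ai * bj
          return C

      C = [[0.0, 0.0], [0.0, 0.0]]
      outer_product_accumulate(C, [1.0, 2.0], [3.0, 4.0])
      print(C)  # [[3.0, 4.0], [6.0, 8.0]]
      ```

      That’s why the gains show up mainly in workloads with heavy matrix math (and in benchmarks that exercise it), rather than across the board.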

    • some_guy@lemmy.sdf.org (OP) · 6 months ago

      No, but the metric is performance at that power draw. And I don’t know that it’s the best there, even. But I’m excited for what it means for the future of my platform (macOS).

  • helenslunch@feddit.nl · 4 months ago

    I see a lot of “Apple says” here. I’ll believe it when I see it, and not on their shitty graphs with no numbers that compare against four-year-old processors.