The Apple A7 – A is for Ambition

A great many articles have come out trying to sour the reception of Apple’s A7 processor and the fact that Apple has moved its ARM architecture from 32-bit to 64-bit. The narrative claims that there are no tangible benefits to moving from 32-bit to 64-bit in mobile devices today.

Much of that narrative is rooted in a desktop perspective on 64-bit architectures. Folks were quick to point out that unless the iPhone has 4GB of memory, 64-bit will be useless. Those folks are applying a desktop computing mindset to 64-bit, not a mobile one. iOS, for example, is extremely memory efficient, as are the apps that run on it.

This kind of thinking also discounts Apple’s vertical approach, from the SoC to the hardware to the software. Because they control the OS, they can “tune” or “optimize” the software to wring the most efficiency out of every bit of the core they designed. Apple needs only to design the A7 for one purpose: iOS. Therefore, they can focus their optimization on performance gains in all the areas they feel are important.

Now, while we can certainly make the case that Apple could have achieved many performance gains by staying 32-bit, we need to understand the context of how they would have achieved that.

Throw More Cores At It

If you study trends in the semiconductor industry you know that more area of the chip is being dedicated to graphics. It is almost as if the GPU is becoming more important than the CPU. But that is a discussion for another time.

Apple could have achieved performance gains of some magnitude simply by designing a quad-core version of the A7. Apple has yet to design a quad-core SoC, and doing so would have delivered gains. However, it would have come at a power cost; more cores require more power. There are very good and power-efficient quad-core ARM processors available today from Qualcomm and Nvidia. But by moving to 64-bit and staying dual-core, Apple has delivered performance equal to, and perhaps better than, competitors running quad-core chipsets, with a dual-core solution.

So where will this matter? Battery life is the biggest beneficiary in the short term. Every task that utilizes the CPU or the GPU will complete faster, allowing the processor to return to a low-power state sooner.

Many will claim that consumers don’t care about specs, and this is true in most regions. However, there is a group that cares very much about specs, a group that is very important to Apple: developers.

Never Before Seen

One of my favorite lines when talking with developers is: “you can never have enough performance.” I hear this exact line so often you would think it is their motto.

Recently, I heard a wise man say: “Performance doesn’t matter. Until you don’t have enough of it.”

The key to understanding the value of the A7 being 64-bit is what developers will do with it.

We saw a great example of this from the team at Epic Games at the Apple event Tuesday. While showing off a preview of their upcoming game Infinity Blade III, they made an important observation: they could keep “turning on features” they wanted to use in the game. Games like Infinity Blade use engines built around the graphics libraries. Many of the features in those libraries can’t be used if the CPU or GPU can’t support them; when that happens, developers simply turn those features off. When game developers are given more performance, they take advantage of it, and their applications, and the experience with those applications, benefit.
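To give a flavor of what “turning on features” looks like in practice, here is a minimal sketch of the kind of capability check a game engine might make before enabling an effect. It is illustrative only; the extension chosen here is just an example of a GPU-gated feature, not anything Epic has said Infinity Blade uses.

```c
#include <stdio.h>
#include <string.h>
#include <GLES2/gl2.h>   /* on iOS the equivalent header is <OpenGLES/ES2/gl.h> */

/* Ask the driver which extensions this GPU exposes (requires a current
 * GL context) and only enable an engine feature the hardware supports. */
static int gpu_has_extension(const char *name) {
    const char *exts = (const char *)glGetString(GL_EXTENSIONS);
    return exts != NULL && strstr(exts, name) != NULL;
}

void configure_renderer(void) {
    if (gpu_has_extension("GL_EXT_shader_framebuffer_fetch")) {
        printf("enabling programmable blending\n");   /* feature "turned on" */
    } else {
        printf("feature off: falling back to a simpler path\n");
    }
}
```

The point is not the specific call; it is that the feature list an engine ships with is gated by what the silicon underneath can actually do.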

We will look back in a few months, when extremely talented and creative developers have taken advantage of the A7’s performance, and see classes of applications never before seen or possible on mobile devices.

Pundits and the media may look at the A7 being 64-bit and say, big deal. However, developers who make a living creating best-in-class software to push the future of mobile computing forward will look at the A7 being 64-bit and say, BIG DEAL!

Foundation For the Future

Going 64-bit now sets Apple up for greater performance gains from the architecture in the future. This year’s gains are 2X with better power efficiency. Next year’s will be 2X or greater with even better power efficiency, and so forth, each generation delivering better performance-per-watt. Within this context it is easier to understand the “why now” angle to the A7 being 64-bit.

Apple certainly could have kept riding the 32-bit curve and just added cores and optimization with each new process node. But going 64-bit now means they have grander ambitions to push the envelope of what is possible computationally with their smartphones, tablets, and perhaps more, much sooner than expected.

The iPhone will benefit from this, as I pointed out, with battery gains and new classes of applications (particularly those that are graphically or computationally complex), but the real winner in this move will be the iPad.

I am, of course, speculating but I think it is reasonable to assume that the next iPad will run the 64-bit A7. Bringing a true desktop class processor to the iPad has the potential to change the game dramatically in terms of how the iPad is used and the types of applications possible on it.

And of course, this move will fuel the fire that Apple may have intentions of bringing the A7 to ‘some’ Mac products.

If you are interested in a much deeper dive on this new ARMv8 64-bit architecture, I recommend this article written by David Kanter. (If you do read it, pay particular attention to the sections on register states, memory, and virtual addressing.) I found something he wrote in the conclusion of interest.

“In some respects though, the more significant changes came not from adding features, but removing them.”

Sounds kind of familiar.

Published by

Ben Bajarin

Ben Bajarin is a Principal Analyst and the head of primary research at Creative Strategies, Inc., an industry analysis, market intelligence and research firm located in Silicon Valley. His primary focus is consumer technology and market trend research, and he is responsible for studying over 30 countries.

67 thoughts on “The Apple A7 – A is for Ambition”

  1. From the article:
    “with batter gains”

    Other than that a great piece. In fact, for a while there I thought his dad had written it.

  2. Quick. Name the desktop applications that gain more than 10% performance running in the 64-bit version of Windows vs. the 32-bit version.

    Chances are you will have a hard time naming a handful, and this is for a desktop environment.

    It is great that Apple is getting to a new generation ARM CPU ahead of competitors, but the importance of the 64 bit aspect is marginal to non-existent.

    5 years from now when we finally crack 4GB of RAM in a portable device, then it will matter.

    For the next couple of years, being 64 bit in mobile is all but irrelevant. Simply switching to 64 bit doesn’t automatically increase performance by any significant amount and can increase code size needlessly.

    1. Take a look at that deep dive article I linked to, and in particular how it addresses the memory points in 64-bit ARM. I had LONG discussions with all the silicon guys this week. The memory stuff seems to be less of an issue in their minds in mobility. That is a desktop point, not a mobile one.

      I also talked to many developers who rattled off a list of technical things they can do now that the integer paths are wider, things like more complex image recognition (again, using desktop-class libraries). Of course there are many things in OpenGL that were not possible before in mobile that are now with the 64-bit chip. Image and video processing will benefit greatly on this foundation, all without a CPU tax. (A rough sketch of what the wider integer path buys is at the end of this comment.)

      I again think the point that lower CPU tax has an immediate benefit can’t be overstated. Playing a graphically intense game will tax the CPU less, thus allowing you to play those types of games longer. Of course, once the apps catch up and utilize all the threads we are back to even, but it will be with apps not possible before.

      OS X benefits greatly from being 64-bit. It also takes advantage of a heterogeneous architecture, leveraging the benefits of the CPU and GPU efficiently for core processes. This is also referred to as GPU computing, and it is now the standard process for even the most visually intense aspects of an operating system.
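      Back to the wider integer path point above: a minimal sketch, with a made-up buffer walk rather than anything from a real app, of why 64-bit registers mean fewer trips through a loop.

      ```c
      #include <stdint.h>
      #include <stddef.h>
      #include <string.h>

      /* Walk a pixel buffer 8 bytes at a time. A 64-bit register holds twice
       * the data a 32-bit one does, so this loop makes half the iterations a
       * 32-bit walk would need (tail bytes omitted for brevity). */
      uint64_t sum_bytes(const uint8_t *buf, size_t len) {
          uint64_t total = 0;
          for (size_t i = 0; i + 8 <= len; i += 8) {
              uint64_t word;
              memcpy(&word, buf + i, sizeof word);   /* one 64-bit load */
              for (int b = 0; b < 8; b++)
                  total += (word >> (8 * b)) & 0xFF;
          }
          return total;
      }
      ```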

      1. First major step forward to a 10x performance gain:
        the iOS flash memory will be placed on the RAM bus at the higher locations.
        Programs then only need RAM to execute code, not load it.
        Apps are effectively loaded when they are installed.
        Think back to Mac OS 6, which had large portions installed on ROMs.

        1. 1. NAND/NOR/SPI memory (Flash) is much slower than RAM.
          2. Apps are stored compressed to save limited space, and must be decompressed before use.
          3. Apps must be “fixed up” with the addresses of system libraries so that they can run. All those system calls are replaced with a “reference thunk”. When loaded, the reference thunk is replaced with the correct address for the system library. Sometimes a library isn’t even in memory, and is demand-loaded once an app that needs it has been launched. (A minimal sketch of the thunk idea is below.)
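          To make the thunk idea concrete, here is a minimal sketch of lazy binding through a function pointer. The names are made up and this is only the general mechanism, not Apple’s actual loader.

          ```c
          #include <stdio.h>

          /* The "real" library routine, which may not be resolved yet. */
          static void real_library_call(const char *msg) {
              printf("library: %s\n", msg);
          }

          static void binding_stub(const char *msg);

          /* Calls initially go through this pointer (the "reference thunk"). */
          static void (*library_call)(const char *msg) = binding_stub;

          /* First call: resolve the real address, patch the pointer, then jump. */
          static void binding_stub(const char *msg) {
              library_call = real_library_call;   /* the fix-up happens once */
              library_call(msg);
          }

          int main(void) {
              library_call("first call resolves the thunk");
              library_call("later calls go straight to the library");
              return 0;
          }
          ```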

    2. Incorrect. The big win for mobile is NOT access to more memory. It’s being able to more quickly rifle through the actual work you need to get done and PUT THE CPU TO SLEEP as quickly as you can. The more powerful apps are a side-benefit. Most apps spend most of their time waiting for user input or other slow tasks like network transfers.

      Apple has had a huge focus on power lately, educating its developers that, in their findings, the most important thing is to get the work done and get to idle.

      Going 64-bit, offloading to low-energy processors like the M7 so the CPU can sleep, reworking background network transfers, allowing apps to wait until power conditions are better…rewatch the WWDC keynotes. Apple is trying to get a huge lead in battery life for mobile.
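      A rough sketch of that “finish the work, get to idle” pattern, using a hypothetical worker rather than any Apple API: instead of waking on a timer to nibble at a queue, the thread drains everything in one burst so the CPU can stay in a low-power state between bursts.

      ```c
      #include <pthread.h>

      static pthread_mutex_t lock = PTHREAD_MUTEX_INITIALIZER;
      static pthread_cond_t  wake = PTHREAD_COND_INITIALIZER;
      static int pending = 0;                        /* queued work items */

      static void do_one_item(void) { /* ... real work ... */ }

      /* Race-to-idle worker: wake once, drain everything, go back to sleep. */
      static void *worker(void *arg) {
          (void)arg;
          for (;;) {
              pthread_mutex_lock(&lock);
              while (pending == 0)
                  pthread_cond_wait(&wake, &lock);   /* CPU can power down here */
              int batch = pending;
              pending = 0;
              pthread_mutex_unlock(&lock);
              for (int i = 0; i < batch; i++)
                  do_one_item();                     /* burst through the batch */
          }
          return NULL;
      }

      /* Producers enqueue and signal; the worker batches instead of polling. */
      void enqueue(int items) {
          pthread_mutex_lock(&lock);
          pending += items;
          pthread_cond_signal(&wake);
          pthread_mutex_unlock(&lock);
      }
      ```

      The batching is the whole trick: one wakeup that does ten items costs far less energy than ten wakeups that do one.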

          1. Yeah, it was JUST a compliment. I can be hyperbolic at times.

            The quality of your posts on this subject has been very high IMO. They are giving me a lot of context on the A7.

          2. Phew! Thanks. 😉

            I’m enjoying your posts as well. I’ve been reading your back-catalog (old posts).

            Now, let’s see if we can get Defendor to explain why/how the 5s isn’t paying a battery life penalty.

            The 64-bit move may force Google to invest more effort into Android/Dalvik than they intended, so that they can make a 64-bit JIT. Google seems to be trying to reduce its losses in supporting Android just so others can ship without their apps and profit (e.g. Amazon, Samsung).

            The real loser may be Microsoft and any newcomers who are struggling to gain traction.

            Apple seems to be playing this a bit like the Cold War. The mostly white UI makes many display technologies power-hungry. 64-bit done poorly (e.g. the stock ARMv8 tech) is comparatively power-hungry for mobile. Bluetooth, even Bluetooth LE, done poorly is power-hungry compared to NFC and peer-to-peer Wi-Fi, but doesn’t require “bumping” phones. Always-on motion detection done poorly…blah blah blah. WWDC was all about power this year, both iOS and OS X.

            I detect a trend, and Apple tends to play the “long game.”

            Perhaps they’re trying to raise the bar high enough to keep new entrants away and make Google/Samsung/Microsoft/Nokia ante up if they want to stay in the game.

          3. I thought the same thing when I saw iOS7. Android would grind to a screeching halt if Google attempted to implement comparable visual effects. Apple is using its technology advances to open up distance now.

      1. Going 64 bit really doesn’t improve performance enough to matter. So that isn’t the reason either. There are also no real power efficiency gains shown.

        The real reason they are going 64 bit, is because that is the next step in the ARM pipeline for everyone.

        Nothing more.

        If ARM V8 was 32 bit, this would be a 32 bit chip.

        So the question isn’t why is Apple going 64 bit. It is why ARM holdings went 64 bit.

        1. Well, I think we would agree ARM went 64-bit for server architectures. Apple’s team from PA Semi were some solid architects. No one in the silicon industry doubts this. So Apple must have some vision for how to utilize this uniquely in phones and tablets.

          The 64-bit gurus tell me that Apple could have done some really interesting stuff adding ISA extensions to take unique advantage of the chipset design.

  3. While I agree with Ben that the move to 64-bit was relevant, I was also hard-pressed to understand why Apple would put such a ridiculously powerful processor on a phone. My first thought was that, at some point, Apple intends to make the iPhone a dockable computer. It would kind of be its unique answer to building a “hybrid.” Carry the iPhone around regularly, then place it in a docking station to use it as a computer. I’m not certain this makes sense in the short term with Apple making the less capable 5C its mainstream device, but long-term…

    Apple could choose to apply that strategy to the iPad once it has an A7. Indeed, it makes an Apple “hybrid” a lot more sensible. The re-designed iPad is rumored to have the smaller iPad mini bezels; if Apple chooses to maintain the overall size of the device and increase the screen size to something in the 11″ range, all it has to do is add a keyboard attachment. With TouchID, it becomes a pretty secure mobile solution more suitable for business. I’m just spitballing with this though because I don’t see Apple doing something as linear as this.

    I’ve always thought that a 4K iPad was the next step, and the A7 can certainly handle that load. I know the comments on this will be “that doesn’t make sense,” but it would, for all intents and purposes, allow Apple to “prime the pump” for 4K content. If and when 4K becomes mainstream, then an Apple TV becomes a logical choice because of Apple’s lead in 4K content. I see a 4K iPad as an ecosystem play rather than a hardware one.

    I will put this out there now: this is just speculation. I’m not stating that Apple will do any of these moves other than the 4K one. I really think Apple is going to release a 4K iPad in the near future.

    1. Good thoughts James. I am with you on many points. In fact, when I first heard about the A7 being 64-bit I started thinking about 4K video. There are 32-bit ARM core chips that can handle 4K capture, encode, and decode, but the 64-bit foundation for doing that opens more doors.

      The other bit I’m speculating on with regards to the iPad is that it could go to quad-core with the A7. Call it the A7x. That would truly be something for mobile devs to turn the iPad into a real computing beast.

      1. Another thing to consider is that putting the A7 in the iPhone is a pretty good way to reach economies of scale. Assuming Apple has bigger plans for its 64-bit architecture, using the iPhone to scale it up is a pretty smart way to leverage its efficiencies.

        1. Very much agree regarding the A7 and economies of scale. Apple always plays a bigger game rather than just shipping the next product. I suspect they’re also trying to leverage getting developers thinking about (and shipping) 64-bit apps for the real bombshell to be dropped next year. They’re reinventing the entire iOS platform underneath everyone’s noses.

          The M7 is another big leverage point.

    2. Graphics processing is generally handled on the GPU, not the CPU. GPUs are best described as executing 32-bit instructions over very wide (256-bit, 384-bit, 512-bit) data paths, where each operation works on multiple data items at once (SIMD; a small sketch of the idea follows below). Handling 4K images/video will be done by the GPU or, most likely, by special hardware engines for things like H.264/H.265, which are much more efficient for such tasks.

      There’s a reason why the new Mac Pro has 7 TeraFLOPS of GPU power, and Schiller made a point of saying that developers should focus on using the GPU/OpenCL.
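      A minimal sketch of that SIMD idea, using the Clang/GCC vector extension purely as an illustration (real GPU shaders and NEON code look different):

      ```c
      #include <stddef.h>

      /* Four 32-bit floats packed into one 128-bit lane. */
      typedef float float4 __attribute__((vector_size(16)));

      /* Scale and bias four values per operation: one "instruction",
       * multiple data items, which is the essence of SIMD. */
      void scale_bias(float4 *pixels, size_t count, float scale, float bias) {
          const float4 s = { scale, scale, scale, scale };
          const float4 b = { bias,  bias,  bias,  bias  };
          for (size_t i = 0; i < count; i++)
              pixels[i] = pixels[i] * s + b;   /* 4 multiplies + 4 adds per step */
      }
      ```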

      1. My speculation was based purely on the overall improvement in graphical performance of the A7. My technical knowledge as it relates to silicon of any type is rudimentary at best.

    3. “why Apple would put such a ridiculously powerful processor on a phone”

      There is a very simple reason: Competition.

      Even the claimed “up to double” CPU performance would put the A7 CPU only about EQUAL to Snapdragon 800/Tegra 4 quad-core CPUs. For CPU performance, this only keeps Apple in the running.

      Also, a 4K tablet is essentially a pipe dream and pointless. Even at 326 dpi, a tablet would need to be larger than 14″ to hold that resolution. I don’t see Apple exceeding 326 dpi on screens bigger than 10″ this decade. It is really just a pointless waste.

      1. There is already a mass produced device with dpi that exceeds the 520+ necessary for a 4K iPad. Whether a company could produce screens in the size and yield that it would take for a mass produced iPad is another story. As for “pointless waste,” even the best current display technology doesn’t approach the limits of what the human eye can see. Especially when it comes to gaming, display tech has a very long way to go. The developers of the Oculus Rift claim that even 8K isn’t enough to get a truly realistic level of detail in a video game.

        1. You aren’t going to hold a 10+ inch screen tablet, 8 inches from your face to see the pixels unless it is just to nerd rage about it. So it is just a pointless waste.

          Oculus Rift is a poor example since you wear it like glasses, so it is extremely close to your eyes, with focusing lenses to allow viewing at that distance.

          Even at 264 dpi, pixels are invisible on the Retina iPad at normal (~15″-18″) tablet usage distance. Though sometimes you can pull it in closer in some scenarios and see them. Bump it to 326 and you would never see them in any normal usage scenario.

            1. The mistake people make when discussing pixel density is focusing on the ability to view pixels rather than the ability to view detail. Anti-aliasing, even on a Retina display, negatively affects visual detail; it’s an optical illusion designed to mask the shortcomings of current display technology, which can’t produce the full range of detail perceptible to the human eye. Pixel density is important when viewing the SMALLEST objects and detail on a screen. From that perspective, even the most dense iPad display is nowhere close to the human eye. There are studies that place the threshold for true realism at roughly 600 dpi. The additional visual information at that pixel density is orders of magnitude higher than what is offered on a current iPad Retina display.

            “Good enough” is a relative statement. If you only use apps on your iPad, the current technology is probably good enough. But the finer detail would greatly benefit movies and games.

          2. It doesn’t make any sense whatsoever to claim ~ double the pixel density would provide orders of magnitude increases in visual information (1 order of magnitude is 10 times as much, 2 orders of magnitude = 100 times as much).

            In reality the opposite is happening, we are already deep into diminishing returns at iPad Retina resolution, at typical viewing distance (~15″ to 18″). Unless pointed out to them, I would bet the vast majority would never notice an increase beyond this point.

            I walked into an electronics shop with a buddy to check out the new iPad 3 display. He was looking at the iPad 2 instead, and when I tried to point out the Retina advantage to him, he had a hard time even noticing the difference, much less caring about it.

            Apple continued to sell millions of iPad 2s to a huge swath of the market that couldn’t care less about Retina.

            Both of the above were against very low 132 dpi. Now with 264 as a starting point, the perceived benefit for an increase is diminished dramatically.

            Your assertion about movies has it precisely backwards. Video is an extremely forgiving kind of visual information. Its capture creates a natural anti-aliasing effect, making any kind of pixel-related artifact much less visible.

            This argument about uber resolution at normal viewing distances is essentially the equivalent of the self-proclaimed “Golden Eared” audiophiles who claim you need 24-bit/96KHz encoding rates for proper playback of home audio.

            Anything beyond 300 dpi on a tablet, is marketing wars and nothing more.

          3. Okay dude, I have no intention of getting into a religious war with you about it.

            As for me, I want the resolution on my devices to be high enough for me to see into the future. To each his own.

          4. Agree with James again. Not long ago, 72 dpi was considered more than good enough.

            Heck, not long ago, NTSC/PAL/SECAM broadcast television was considered more than good enough, and that was at a resolution of 352×480, across 25 and 27″ CRT tubes, with a color-mapping that had half that resolution for green and blue (YGrGb).

            Even when we say that 300 dpi is some magical density for human eyesight, it doesn’t apply universally. Many people, especially the young, have finer resolution than that. The human eye doesn’t use a pixel grid, so you will find people and situations where even 3000 dpi can be resolved.

            I believe James mentioned the Oculus device because it, also, would be measured differently, and by virtue of being closer to the viewer’s eyes, would require higher resolution to appear “retina-like.”

    4. Very similar to what I’m thinking, except why would you ever need to dock it? AirPlay to monitors. Peer-to-peer Wi-Fi or Bluetooth LE to anything else you might need.

  4. The benefits of the iPhone 5S 64-bit environment (64-bit A7 processor, running 64-bit apps, on 64-bit iOS 7) are HUGE!!!

    There is an article on All Things D today that explains what a radical improvement this is. Here are a few quotes from the article (a small sketch illustrating the register point follows the quotes):

    “The fact that the A7 has twice as many processor registers means that more operations can occur without the processor using main memory, which is slower to access,” Carl Howe, VP of research and data sciences at the Yankee Group, told AllThingsD. “This means that, for some codes, the A7 will be twice as fast (or faster, depending on how many memory accesses the original code had) to run code, because the processor doesn’t have to use main memory as much.”

    “The ARMv8 instruction set is clean-slate approach with many improvements. Even without 4GB of RAM, the A7 should make it easier to build larger applications like PC-class games and programs. Apps can now become real desktop-class programs and games.”

    “with the 64-bit A7, Apple has made it possible for developers to take the 64-bit apps they’re written for the Mac and bring them to iOS 7 with relative ease. And that is a huge benefit”

    “This will not be true with Android, by the way. The Android Java app and native app environment will need support from Oracle, who owns the Java environment as well as 64-bit support from the Android kernel. Android has a lot more moving pieces to coordinate, and will take longer to go to 64-bit.”
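    To put the register-count quote in concrete terms: with roughly twice the general-purpose registers, the compiler can keep more live values out of memory. A minimal sketch with a made-up hot loop (not from the article): on AArch64 all eight 64-bit accumulators below fit in registers, while on 32-bit ARM each one needs a register pair and some of them spill to the stack.

    ```c
    #include <stdint.h>
    #include <stddef.h>

    uint64_t sum_array(const uint64_t *data, size_t n) {
        /* Eight independent accumulators; whether they all stay in
         * registers depends on how many registers the ISA provides. */
        uint64_t a0 = 0, a1 = 0, a2 = 0, a3 = 0;
        uint64_t a4 = 0, a5 = 0, a6 = 0, a7 = 0;
        size_t i = 0;
        for (; i + 8 <= n; i += 8) {
            a0 += data[i];     a1 += data[i + 1];
            a2 += data[i + 2]; a3 += data[i + 3];
            a4 += data[i + 4]; a5 += data[i + 5];
            a6 += data[i + 6]; a7 += data[i + 7];
        }
        uint64_t total = a0 + a1 + a2 + a3 + a4 + a5 + a6 + a7;
        for (; i < n; i++)
            total += data[i];   /* leftover elements */
        return total;
    }
    ```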

    1. Android has no dependencies on Oracle. The JIT compiler is, effectively, Dalvik, and can be easily moved to 64-bit. In fact, Android has an advantage in being able to instantly move to 64-bit just by shipping a 64-bit version of Android with the proper JIT.

      1. Why was this downvoted? He’s right about Dalvik, though many of those iOS recompiles will be trivial. Native Android games will need to be recompiled.

        1. Sometimes the truth is hard to hear. I’m a very intense Apple-fan, but as a software developer who needs to have a solid grasp of what’s happening in this industry, I prefer to plumb the depths of the truth rather than stick my fingers in my ears and say “I can’t hear you!”

          In a way, this is one of the best things that can happen to the iOS app stores. Old apps that were first put on sale during the initial gold rush will quickly fade away. Many (not all) of the poor apps were done by first-time programmers who have likely moved on by now.

          There’s no reason for an Android/Dalvik app to ever die, other than DMCA takedown.

      2. Yes, it should not be all that difficult to create a 64-bit build of Android. However, the primary problem for Android as a platform is that developers are not using it. It’s not a premier platform for novel, interesting apps.

        That being the case, why would delivering a 64-bit high end Galaxy S6, say, result in any useful new applications benefitting from it? Developers aren’t even taking serious advantage of Android 3.x/4.x features. How many years will it be before a substantial fragment of the Android installed base has access to some 64-bit build of Android 5.x? 2016?

        The second largest platform behind iOS 6 is still Android 2.x, and Google has no control over the majority of the Android installed base. Witness the fate of the Upgrade Alliance, or how well “pure Android” has worked to sell hardware.

        Look how long it took Windows to get its long tail of users to 64-bit. Google has much less control over its platform than Microsoft did, and shows no real ambition in making any portion of Android “great,” because the goal of Android is not to deliver exceptional products, but merely to protect Google’s ad platform, originally as a defense against Windows Mobile or the more generic Java Mobile on Linux. Google’s largest ad threat today is the forked versions of its own platform.

        Google isn’t just herding cats, it’s herding cats who are direct competitors with no interest in doing things that benefit Google or the rest of Android collectively.

        1. Apple has been able to avoid the 32-bit tail the same way Canada got rid of $1 and $2 bills–they stopped making them. Apple forced the Mac switch to 64 bit by squeezing support for 32-bit out of OS X; Snow Leopard really wanted x64 and Lion and Mountain Lion demand it. Microsoft, meanwhile, still has to support 32-bit Windows 8 because they want it to run on Intel’s Atom Z-2760, a 32-bit processor.

          If Apple follows its usual course, new versions of iOS will support 32-bit iPhones for the next three years. As these things go, this is a speedy transition.

          1. I find it curious that Apple is getting good battery life with 64-bit mobile processors while Intel’s 32-bit Atom is on the very far edge of being suitable for mobile. It’s not just the hardware, it’s the entire system, including software, that must be tailored to get power consumption in line.

            I suspect all of next year’s iPhones will be 64-bit, with only some variant of the 5, perhaps an 8 GB 5c?, left with a 32-bit processor.

          2. Yep. Microsoft would’ve been much better off if they just didn’t support x86 in Windows 7 x64 Pro.

            If you wanted 4+ gigs of ram, signed drivers, etc, you had to have all 64 bit code. Had they done that, I think Windows would be fully 64 bit today.

        2. DED,

          I am glad to see you commenting here. Your take on Apple is exactly in line with the rest of the excellent analysis here at TechPinions. I do enjoy your articles on AppleInsider and wish that you had time to do more on RoughlyDrafted. Keep up the great work!

        3. You are right about Android being very fragmented.

          Anyone, including Google engineers, is aware of this, and that is why they are trying to solve the problem by building something much more powerful and meaningful called the Chrome App Launcher, which developers can use to build a single mobile application and a single PC application that works wonders on any computing device, including iDevices where the Chrome browser is installed.

    2. Replying to my own comment, I just want to point out the possibilities envisioned by one of the quotes:
      “with the 64-bit A7, Apple has made it possible for developers to take the 64-bit apps they’re written for the Mac and bring them to iOS 7 with relative ease. And that is a huge benefit”

      Imagine if OS X developers did convert their current 64-bit OS X desktop applications to 64-bit iOS apps. Now imagine an upcoming MacBook Air* (or other Apple computer) running a 64-bit iOS desktop OS on an Apple designed “A-class” processor.

      The benefits would be:
      1) A notebook computer that has 2 to 3 times the battery life of one with an Intel processor, with the same processing power. The demonstration of Infinity Blade III running on the iPhone 5S shows how powerful the A7 processor is… you will not be able to run Infinity Blade III (when it comes out for OS X) with the same level of rendering on any of the current MacBook Air models!

      2) Apple would be able to eventually jettison Intel as a processor supplier, and use its own custom-designed 64-bit A-class processors in all of its products (desktop, notebooks, tablets, and smartphones).

      3) Apple would only need to develop and support one operating system, instead of the two currently!

      4) With both desktop and mobile versions of iOS, it would be much easier for developers to create and code app titles that work on all Apple devices.

      5) Apple device users would benefit from having just iOS to deal with, instead of having to maintain, update, and use apps on the current two separate platforms (OS X and iOS).

      *The desktop version of iOS on a MacBook Air, would probably have a user interface more similar to OS X than to what you would find on the iPad, since it would be optimized for use on that type of computer hardware.

      1. Don’t fool yourself. The fact that Apple is so heavily invested in Thunderbolt and the new Mac Pro means it’s more likely they’ll buy Intel than kill OS X.

        iOS is for TOUCH devices. OS X is for non-TOUCH devices. They won’t make the same mistake that Microsoft has, though they will certainly share technologies.

        Apple is soon to re-enter the server-like market…with a vengeance, and with a very different looking product that will catch everyone off-guard.

          1. I’m exceptionally familiar with the Magic Trackpad. However, it requires an indirect usage model…I move my fingers on this thing over here, and something on screen over there moves. iOS is modeled around the semblance of direct modification, as though you’re touching the paper, your fingers are spraying ink, or directly dragging things around.

            If you’re trying to write grade A software, that’s a massive difference.

          2. Also, the Magic Trackpad is designed for pixel-precise control through a cursor. iOS touch assumes much larger targets.

  5. Regarding “Why now?”, has anyone considered that the announcement comes at an ideal time to get most of their third-party software compiled for the 64-bit architecture? Although 64-bit was a rumor, almost nobody believed it. Developers were already pushing hard to get up to speed with iOS 7. Now, if their first bug update involves minor tweaks and a rebuild with Xcode 5, nearly every new app will be iOS 7 **AND** have a 64-bit software repertoire going, even before October starts.

    As for why 64-bit, code efficiency trumps simple expansion to 64 bits. Apple is all about power management for this round. It’s 64-bit INSTRUCTION and 64-bit DATA. Everyone has focused on the DATA part of 64-bit, being able to handle bigger data and pointers. The big win for Apple is 64-bit CODE…instructions that can include 32-bit offsets, constants and other data, but still be a single instruction. The ability to make single instructions that consolidate multiple instructions ties into Apple’s LLVM work. Getting work done quickly so that you can gate most of the chip and save power…that’s why battery life won’t suffer, but in the future, when they leech GPU work back to the CPU side of the chip…oh my…

    Methinks everyone so much underestimates them.

    1. It would also be good to consider that the M7, which exists solely to reduce power consumption, can be used elsewhere. Apple could have easily combined the M7’s functionality onto the A7 die. Multiple power planes on a single ASIC have been around for quite a while. They CHOSE to make the M7 a separate chip, and it wasn’t just for marketing reasons.

  6. Real confusion actually comes because the A15 was never designed for phones.
    So Apple designed the Swift core with some features of the A15 and some of the A9.

    So everyone assumed that Apple was the one who asked ARM for big.LITTLE. ARM even came out with the A12 because the A9 needed an evolutionary step.

    But it is clear that Apple has gone with 64-bit, combined it with Swift,
    created the M7, and even licensed the GPU so it can add additional things. So all the competitors are mighty confused about what to copy.

    1. Apple’s A7 is modeled after the ARM A57, but has a great deal of Apple-specific enhancement. How much? Have a look at the transistor count difference.

  7. This may be the first time in history that a single vendor controls the chip design, the programming language, the compiler and code optimizer, the operating system and the finished consumer product.
    The possibilities for extracting efficiencies and delivering unmatchable features and experiences should be equally unprecedented.

      1. IBM still does on the Power and Z-series systems–if you program in Fortran. (Actually, I don’t think IBM technically controls Fortran any more, but Apple doesn’t technically control Objective-C either.)

        1. I’m not familiar with the formal status of Objective C, but it’s clear that in practice Apple and only Apple decides everything about its evolution.

  8. Over the past year, Apple has seemed more focused on features that are difficult to copy to prevent the clone-makers (Android/Samsung/etc.) from catching up as quickly as they were able to in 2009-2012: iOS 7, the ultra-thin iPhone 5/5s aluminum case, 64-bit processor, cutting edge 5s camera features, fingerprint recognition. These complex features will permit Apple to retain its leadership position in mobile computing.

    1. 64 bit processing is not some new cutting-edge technology that people need to copy. It’s a standard commercial off-the-shelf technology that anyone can put into their devices when they think it makes sense.

      1. “64 bit processing is not some new cutting-edge technology that people need to copy. It’s a standard commercial off-the-shelf technology…” – Anders CT

        Not in mobile where watts are what matter.

      2. @Anders – I agree a 64-bit processor for mobile phones represents off-the-shelf technology, but the shelf belongs to Apple. This article strongly suggests that the A7 took Samsung by surprise, and that it has no competitive response other than to announce a 64-bit processor constructed of vapor:

        appleinsider.com/articles/13/09/14/after-its-disastrous-exynos-5-octa-samsung-may-have-lost-apples-a7-contract-to-tsmc

      3. Samsung didn’t seem to think so. 64-bit mobile is pretty darn cutting edge and will have a longer life in mobile than NFC.

      4. The move is up to Google. Sure, you can drop a 64-bit chip into a device and run a 32-bit operating system on it, but that makes no sense because users (and developers) derive no benefit from it.

  9. I mainly don’t understand why the pundits are so quick to harshly judge what Apple is doing. That’s up to Apple to decide about what architecture it wants to use. People are so quick with the hate about anything Apple does.

  10. 1) 64 bit computing does not magically improve performance, apart from some rather esoteric computing tasks.

    2) 64-bit processors use more power than 32-bit processors at similar single-threaded performance.

    3) Moving to ARMv8 makes a lot of sense, but it will not be a big improvement for iPhone 5s owners. It’s a move for the future, to improve software support when 64-bit becomes the norm on future mobile devices.

    4) When Android moves on to the desktop this fall, it will be 64-bit enabled, running on 22 nm Intel processors. That was a pretty clear takeaway from IDF. That also makes a lot of sense.

    http://images.dailytech.com/nimage/Wintel_Android_Enhancements.png
