Wednesday, 10 April 2019

Amazon Kindle review (2019): The Paperwhite gets a run for its money - Engadget

Like many gadgets, the Kindle line follows the "good, better, best" marketing strategy. A few years ago, this would mean the difference between features like a touchscreen, better contrast on the display or a much-needed front-lit screen. Today, the distinctions between Kindle models are subtler. When Amazon recently announced that the "All New Kindle" (that's the basic, non-Paperwhite, non-Oasis model) would come with a front-lit screen, the last big deal-breaker for the most basic e-reader was finally dissolved (unless waterproofing is a must-have). For less than $90 (with "special offers"), there's finally a Kindle you can read in the dark that has a touchscreen and supports Audible over Bluetooth. I'd wager that for a large slice of Kindle readers, the reasons to spend extra on a Paperwhite are getting more specific.


Pros
  • Finally a front-lit screen on the lower-end Kindle
  • Improved display contrast
  • Smaller and sleeker design
Cons
  • Only one storage option
  • Recessed display will catch dust and dirt
  • Not waterproof

Summary

After years in the darkness, the more affordable Kindle finally gets an illuminated display. This squeezes the gap between this and the Paperwhite to the point where you’ll really have to want waterproofing or more storage to justify paying the higher price. The lower resolution display won’t please everyone, but for the bulk of your reading you likely won’t ever notice, making the All New Kindle attractive to both upgraders and those new to e-readers alike.


The big news here is clearly that illuminated display. There was a time when even the premium Kindle meant reaching for the reading light (much as I loved my Kindle Keyboard, I also like reading at night). In fact, for about five years after the first one, all Kindle owners were consigned to squinting when the lights went down -- that is, until the Paperwhite arrived in 2012 with its four LEDs in tow. Since then, those LEDs have likely been the main reason to make the leap from the budget model to the Paperwhite.

Of course, that's not all that's new here. Amazon gave the latest Kindle a modest redesign, even if that's mostly a matter of smoother edges and a different logo embossed on the back. (No "Amazon" text, but the trademark smile/arrow remains.) The device is also slightly smaller than the model it replaces, by about 2mm (barely 1/8th of an inch) in both depth and width. Despite the sleeker footprint, it gains a little weight -- an additional 13g, or just under half an ounce. The size difference is more noticeable when you place it beside the Paperwhite. Also, I find the proportion of the bezels around the display less attractive on the Paperwhite, but that's entirely subjective.

Amazon Kindle (2019) Review

The pixel density remains the same as the previous model's at 167 ppi, but the contrast is much better than on older models. The E Ink panel used is similar to that in earlier Paperwhites (E Ink Carta 1.2, for those asking). Of course, the density is lower than the current Paperwhite's 300 ppi, but depending on what you like to read (or rather, how graphical it is), I don't know how much that will matter.
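As a sanity check on those figures, pixel density is just the diagonal pixel count divided by the diagonal screen size. The panel resolutions below (800 x 600 for the basic Kindle, 1,448 x 1,072 for the Paperwhite, both on 6-inch displays) are assumed from the quoted ppi numbers rather than stated in this review, but the math lines up:

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: diagonal pixel count divided by diagonal size in inches."""
    return math.hypot(width_px, height_px) / diagonal_in

# Basic Kindle (assumed 800 x 600 on a 6-inch panel)
print(round(ppi(800, 600, 6)))    # -> 167

# Paperwhite (assumed 1448 x 1072 on a 6-inch panel)
print(round(ppi(1448, 1072, 6)))  # -> 300
```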

When I compared the home screens of a current Paperwhite and the All New Kindle, the difference in quality was visible. The small images of book covers show less detail on the new budget model than on the Paperwhite. Another tell: the spinning progress circle that pops up in the corner when you download something is much "smoother" on the Paperwhite. But once you open a book to a full page of text (i.e., when you're actually reading), the difference in sharpness is less obvious. And given that reading is mostly just text, the lower resolution might not bother you much, if at all.

As for that contrast, I almost think the cheaper Kindle goes toe to toe with the Paperwhite, though I always found the "white" in Paperwhite to be more like "Paper-a bit less gray." I keep checking the two next to each other on the same page of the same book, and while there is a difference, it's not significant in my opinion.

Amazon Kindle (2019) Review

Top: Base Kindle. Bottom: Kindle Paperwhite 2018. Spot the difference?

There are two things about the display here that I'm less thrilled about. One of the more underrated aspects of the latest Paperwhite screen is how it sits flush with the bezels. A small detail, but one that prevents lint, dust and small hairs getting trapped in the lip where the screen meets the bezel. That said, I can see that some might find that a flush screen means it's also easier to accidentally rest your thumb on the display, causing unwanted page turns. I've never had that problem, though, so the presence of a lint trap here is a small grumble. In a similar vein, some might prefer physical page turn buttons, but those folks are out of luck here.

The other thing, based on my testing so far, is that I often have to tap twice to turn a page, as my first try isn't recognized. This might be how I am holding it, or the slight change in weight and dimensions, but I have noticed it enough to mention it. Perhaps the lack of muscle memory for the extra millimeter or two my finger needs to travel to hit the recessed screen is causing it? Hard to say.

The Paperwhite's flush screen serves another practical purpose: waterproofing. That's not a feature shared with the new budget model. Waterproofing is definitely a big "nice to have," but for the amount of time I spend reading near water, I personally can live without it. If it's important to you, you'll definitely want the Paperwhite, which has had waterproofing since the latest model came out back in November.

I'm particularly interested in this Kindle as I've always opted for the higher-end models, mostly because of that front light. If I'm going to spend a lot of time with this thing in my hand, anything that makes the experience better seems like money well spent. I've never felt the need for the Oasis, though, and that's as much about its curious form factor as anything else. My wife has an older, basic Kindle with physical buttons and no front light. Her reading habits are different from mine, and it works for her, but I always find it a bit restrictive when I use it: harder to read in changing light conditions, and pecking in text with a four-way button is just no fun.

Amazon Kindle (2019) Review

The detail-oriented might have spotted that the display on the All New Kindle only has four LEDs, compared to the Paperwhite's five. It's hard to quantify how much of a difference this makes, but when I tried various brightness settings on both (side by side), I didn't spot any gaps in light coverage or even much difference in how bright they were. Battery life also doesn't seem to be affected either way; after several hours of reading, I'm still well over 70 percent.

If you can live with one less LED and a lower (yet perfectly legible) text resolution and don't mind the lack of waterproofing, you might think this is a no brainer. And for most people, it probably is. The only other major difference worth noting is that the All New Kindle only comes with one storage option: 4GB. The Paperwhite starts at 8GB, with an option for upgrading to 32GB. Again, this is something that won't faze a lot of people as the average ebook doesn't take up much space and reading takes time, so even with a dozen books stored you're probably okay for a while.

But if you read long, graphically intense books and definitely if you love Audible, this lower amount of storage might start to feel restrictive. In many ways, it's the new "big difference" between the lower end and the Paperwhite. For those who rotate the library on their Kindle infrequently, it's not an issue, but for everyone else, it's something to consider.
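That storage gap can be put in rough numbers. The figures below are assumptions for illustration (reserved system space, average title sizes), not Amazon specs, but they show why 4GB is plenty for text and tight for audio or graphic-heavy libraries:

```python
def books_that_fit(storage_gb: float, reserved_gb: float, avg_title_mb: float) -> int:
    """Rough count of titles that fit after subtracting reserved system space."""
    usable_mb = (storage_gb - reserved_gb) * 1024
    return int(usable_mb // avg_title_mb)

# Assumed figures: ~1 GB reserved for the OS, ~1 MB per text-only ebook,
# ~100 MB per graphic-heavy title or Audible download.
print(books_that_fit(4, 1, 1))    # text-only ebooks: ~3,000 titles
print(books_that_fit(4, 1, 100))  # graphic novels / audiobooks: ~30 titles
```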

Kindles last a good few years if you treat them well or (ahem) don't keep leaving them on planes. (I've done it three times and counting now.) So for most people on an older Kindle, now is a very interesting time to upgrade. The gap between the All New Kindle and the Paperwhite has never been smaller in terms of key reading features, while the price gap remains more or less the same.

Amazon Kindle (2019) Review

There's a reason I haven't spent much time comparing the previous-generation basic Kindle with this one: the lit display makes that upgrade a no-brainer. The real question is whether the All New Kindle is going to eat some of the Paperwhite's lunch, and I think it might.

Of course, things get complicated thanks to Amazon's special offers (ads) pricing. Opting out of the offers bumps the All New Kindle up to $110, putting you just $20 away from the Paperwhite (with the offers). So the real decision then becomes how averse you are to being advertised to during your reading time. For me: very.

Like for like, however, there's a good case to be made for the All New Kindle being the best choice for most people. It's small, comfortable to hold, easy to read and now it's also bedside ready. It'd be nice to have the option of more storage, without making the leap to the Paperwhite, but I guess Amazon needs to hold something back for next time?

All products recommended by Engadget were selected by our editorial team, independent of our parent company, Verizon Media. If you buy something through one of our links, we may earn an affiliate commission.



https://www.engadget.com/2019/04/10/amazon-kindle-review-2019/

2019-04-10 07:00:51Z


Tuesday, 9 April 2019

Android spyware Exodus makes the leap to iOS devices - Engadget

NurPhoto via Getty Images

Researchers at security firm Lookout recently discovered an iOS version of Exodus spyware that typically targets Android devices. Before you go wiping your iPhone to ensure you aren't being spied on, it's worth noting that the iOS version of the malware has only been found in third-party app marketplaces and hasn't made its way into the walled garden that is Apple's official App Store.

According to Lookout, Exodus for iOS was found on a number of phishing sites designed to trick customers of mobile carriers in Italy and Turkmenistan. The spyware was determined to be a stripped-down port of the Android version. If installed on a device, the malicious software could steal contacts, photos, videos and audio recordings, GPS information and device location data. An attacker could also use the app to make on-demand audio recordings. The iOS variant of Exodus uploaded the stolen information to the same server as the Android malware, suggesting a direct connection between the attacks.

The Exodus attack initially used enterprise certificates signed by Apple, which made it possible for victims to install the app on their device despite downloading it outside of the App Store. Apple has since revoked those certificates, meaning the attack has largely been squashed. Still, it's a good reminder that iOS devices aren't immune to attacks. It's best to stick to Apple's official App Store to avoid falling victim to spyware.



https://www.engadget.com/2019/04/09/exodus-spyware-ios/

2019-04-09 20:57:24Z

AT&T announces more 5G markets, still won't sell you a 5G device - Android Police

AT&T was, by some measures, the first carrier to roll out 5G wireless services in late 2018. However, it still only has a single 5G device, a $500 Netgear hotspot you probably won't be allowed to buy. The carrier is rolling out its 5G ghost town to seven more cities today, bringing its total to 19.

The newly added markets are Austin, Los Angeles, Nashville, Orlando, San Diego, San Francisco, and San Jose. You won't get 5G signals in all parts of those cities, though. Like most carriers, AT&T is launching 5G on millimeter wave only. Those signals offer high download speeds but don't travel very far. AT&T says that lower frequency "sub-6" 5G will come next year.

If you live in one of AT&T's new 5G markets, good luck actually using 5G. The Netgear hotspot remains the only way you can access the new network, and AT&T won't sell it to just anyone. Currently, only select businesses and consumers have been able to use AT&T's 5G. We expect the Galaxy S10 5G to launch on AT&T later this year, and hopefully you won't have to ask nicely to buy one.



https://www.androidpolice.com/2019/04/09/att-announces-more-5g-markets-still-wont-sell-you-a-5g-device/

2019-04-09 17:43:00Z

Slack joins forces with Microsoft Office 365 - TechRadar

Businesses use a wide variety of apps, tools and services to communicate and collaborate every day, and now Slack is making things easier for Office 365 users by integrating Microsoft's services into its platform.

The messaging service is getting a new Outlook calendar and mail app, an updated OneDrive app and users will now be able to preview Office files directly within Slack.

The company is making it easier to keep track of all your meetings and calendar invites by bringing them into Slack through the new Outlook calendar app. Users will receive a message when a meeting invite arrives and they will even be able to respond with just one click.

Reminders to join Skype, Webex or Zoom meetings will also appear and the Outlook calendar app will now be able to set your Slack status automatically based on your calendar including setting “out of office” as your status if it has been enabled in Outlook.
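Under the hood, that calendar-driven status behavior presumably comes down to posting a payload to Slack's `users.profile.set` Web API method (a real endpoint, with real `status_text`, `status_emoji` and `status_expiration` fields). The sketch below is illustrative only; the calendar-event fields (`busy`, `out_of_office`, `end_ts`) and the mapping itself are hypothetical, not Slack's actual implementation:

```python
def status_for_event(event: dict) -> dict:
    """Map a (hypothetical) calendar event to a Slack users.profile.set payload."""
    if event.get("out_of_office"):
        text, emoji = "Out of office", ":no_entry:"
    elif event.get("busy"):
        text, emoji = "In a meeting", ":calendar:"
    else:
        return {}  # free slot: leave the user's status untouched
    return {
        "profile": {
            "status_text": text,
            "status_emoji": emoji,
            # clear the status automatically when the event ends (epoch seconds)
            "status_expiration": event.get("end_ts", 0),
        }
    }

payload = status_for_event({"busy": True, "end_ts": 1554840000})
# `payload` would then be POSTed to https://slack.com/api/users.profile.set
```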

Deeper integration

Slack users can now bring emails right into their channels thanks to the addition of Outlook mail integration. They will even be able to forward emails directly from Outlook into a Slack channel with the new Outlook add-in.

Importing files from Microsoft's cloud storage service will also be possible as a result of an update to Slack's OneDrive app. This functionality is similar to the company's existing Dropbox and Google Drive integration which allows users to browse files and add them into a channel or direct message.

Working with Office documents will now be easier in Slack as the company is enabling full previews of PowerPoint slides, Word documents and Excel spreadsheets. These files can be previewed without having to open them and the firm hopes to bring this functionality to OneDrive files as well.

While Microsoft Teams has been gaining ground in its fight against Slack, many businesses often rely on both products for their workloads. By offering greater integration with Microsoft's products, Slack is giving its users another reason to continue using its platform as opposed to searching for an alternative.

Via The Verge



https://www.techradar.com/news/slack-joins-forces-with-microsoft-office-365

2019-04-09 18:45:00Z

The AI Race Expands: Qualcomm Reveals “Cloud AI 100” Family of Datacenter AI Inference Accelerators for 2020 - AnandTech

The impact that advances in convolutional neural networks and other artificial intelligence technologies have made on the processor landscape in the last decade is inescapable. AI has become the buzzword, the catalyst, the thing that all processor makers want a piece of, and that all software vendors are eager to invest in to develop new features and new functionality. A market that outright didn't exist at the start of this decade has over the last few years become a center of research and revenue, and already some processor vendors have built small empires out of it.

But this modern era of AI is still in its early days and the market has yet to find a ceiling; datacenters continue to buy AI accelerators in bulk, and deployment of the tech is increasingly ratcheting up in consumer processors as well. In a market that many believe is still up for grabs, processor makers across the globe are trying to figure out how they can become the dominant force in one of the greatest new processor markets in a generation. In short, the AI gold rush is in full swing, and right now everyone is lining up to sell the pickaxes.

In terms of the underlying technology and the manufacturers behind it, the AI gold rush has attracted interest from every corner of the technology world. This has ranged from GPU and CPU companies to FPGA firms, custom ASIC makers, and more. There is a need for inference at the edge, inference in the cloud, training in the cloud – AI processing at every level, served by a variety of processors. But among all of these facets of AI, the most lucrative market of all is the one at the top of this hierarchy: the datacenter. Expansive, expensive, and still growing by leaps and bounds, the datacenter market is the ultimate feast-or-famine setup, as operators are looking to buy nothing short of massive quantities of discrete processors. And now, one of the last juggernauts to sit on the sidelines of the datacenter AI market is finally making its move: Qualcomm.

This morning at their first Qualcomm AI Day, the 800lb gorilla of the mobile world announced that they are getting into the AI accelerator market, and in an aggressive way. At their event, Qualcomm announced their first discrete dedicated AI processors, the Qualcomm Cloud AI 100 family. Designed from the ground up for the AI market and backed by what Qualcomm is promising to be an extensive software stack, the company is throwing their hat into the ring for 2020, looking to establish themselves as a major vendor of AI inference accelerators for a hungry market.

But before we get too far into things here, it's probably best to start with some context for today's announcement. What Qualcomm is announcing today is almost more of a teaser than a proper reveal – and certainly far from a technology disclosure. The Cloud AI 100 family of accelerators are products that Qualcomm is putting together for the 2020 timeframe, with samples going out later this year. In short, we're probably still a good year out from commercial products shipping, so Qualcomm is playing things cool, announcing their efforts and their rationale behind them, but not the underlying technology. For now it's about making their intentions known well in advance, especially to the big customers they are going to try to woo. But still, today's announcement is an important one, as Qualcomm has made it clear that they are going in a different direction than the two juggernauts they'll be competing with: NVIDIA and Intel.

The Qualcomm Cloud AI 100 Architecture: Dedicated Inference ASIC

So what exactly is Qualcomm doing? In a nutshell, the company is developing a family of AI inference accelerators for the datacenter market. Though not quite a top-to-bottom initiative, these accelerators will come in a variety of form factors and TDPs to fit datacenter operator needs. And within this market Qualcomm expects to win by virtue of offering the most efficient inference accelerators on the market, offering performance well above current GPU and FPGA frontrunners.

The actual architectural details on the Cloud AI 100 family are slim; however, Qualcomm has given us just enough to work with. To start with, these new parts will be manufactured on a 7nm process – presumably TSMC's performance-oriented 7nm HPC process. The company will offer a variety of cards, but it's not clear at this time if they are actually designing more than one processor. And, we're told, this is an entirely new design built from the ground up – so it's not, say, a Snapdragon 855 with all of the AI bits scaled up.

In fact it’s this last point that’s probably the most important. While Qualcomm isn’t offering architectural details for the accelerator today, the company is making it very clear that this is an AI inference accelerator and nothing more. It’s not being called an AI training accelerator, it’s not being called a GPU, etc. It’s only being pitched for AI inference – efficiently executing pre-trained neural networks.

This is an important distinction because, while the devil is in the details, Qualcomm's announcement very strongly points to the underlying architecture being an AI inference ASIC – à la something like Google's TPU family – rather than being a more flexible processor. Qualcomm is of course far from the first vendor to build an ASIC specifically for AI processing, but while other AI ASICs have either been focused at the low end of the market or reserved for internal use (Google's TPUs again being the prime example), Qualcomm is talking about an AI accelerator to be sold to customers for datacenter use. And, relative to the competition, what they are talking about is much more ASIC-like than the GPU-like designs everyone is expecting in 2020 out of front-runner NVIDIA and aggressive newcomer Intel.

That Qualcomm’s Cloud AI 100 processor design is so narrowly focused on AI inference is critical to its performance potential. In the processor design spectrum, architects balance flexibility with efficiency; the closer to a fixed-function ASIC a chip is, the more efficient it can be. Just as how GPUs offered a massive leap in AI performance over CPUs, Qualcomm wants to do the same thing over GPUs.

The catch, of course, is that a more fixed-function AI ASIC is giving up flexibility. Whether that’s the ability to handle new frameworks, new processing flows, or entirely new neural networking models remains to be seen. But Qualcomm will be making some significant tradeoffs here, and the big question is going to be whether these are the right tradeoffs, and whether the market as a whole is ready for a datacenter-scale AI ASIC.

Meanwhile, the other technical issue that Qualcomm will have to tackle with the Cloud AI 100 series is the fact that this is their first dedicated AI processor. Admittedly, everyone has to start somewhere, and in Qualcomm’s case they are looking to translate their expertise in AI at the edge with SoCs into AI at the datacenter. The company’s flagship Snapdragon SoCs have become a force to be reckoned with, and Qualcomm thinks that their experience in efficient designs and signal processing in general will give the company a significant leg up here.

It doesn’t hurt either that with the company’s sheer size, they have the ability to ramp up production very quickly. And while this doesn’t help them against the likes of NVIDIA and Intel – both of which can scale up at TSMC and their internal fabs respectively – it gives Qualcomm a definite advantage over the myriad of smaller Silicon Valley startups that are also pursuing AI ASICs.

Why Chase the Datacenter Inferencing Market?

Technical considerations aside, the other important factor in today’s announcement is why Qualcomm is going after the AI inference accelerator market. And the answer, in short, is money.

Projections for the eventual size of the AI inferencing market vary widely, but Qualcomm buys in to the idea that datacenter inference accelerators alone could be a $17 billion market by 2025. And if this proves to be true, then it would represent a sizable market that Qualcomm would otherwise be missing out on. One that would rival the entirely of their current chipmaking business.

It’s also worth noting here that this is explicitly the inference market, and not the overall datacenter inference + training market. This is an important distinction because while training is important as well, the computational requirements for training are very difference from inferencing. While accurate inferencing can be performed with relatively low-precision datatypes like INT8 (and sometimes lower), currently most training requires FP16 or more. Which requires a very different type of chip, especially when we’re talking about ASICs instead of something a bit more general purpose like a GPU.

This also leans into scale: while training a neural network can take a lot of resources, it only needs to be done once. Then it can be replicated out many times over to farms of inference accelerators. So as important as training is, potential customers will simply need many more inference accelerators than they will training-capable processors.

Meanwhile, though not explicitly said by the company, it’s clear that Qualcomm is looking to take down market leader NVIDIA, who has built a small empire out of AI processors even in these early days. Currently, NVIDIA’s Tesla T4, P4, and P40 accelerators make up the backbone of datacenter AI inference processors, with datacenter revenues as a whole proving to be quite profitable for NVIDIA. So even if the total datacenter market doesn’t grow quite as projected, it would still be quite lucrative.

Qualcomm also has to keep in mind the threat from Intel, who has very publicly telegraphed their own plans for the AI market. The company has several different AI initiatives, ranging from low-power Movidius accelerators to their latest Cascade Lake Xeon Scalable CPUs. However for the specific market Qualcomm is chasing, the biggest threat is probably Intel’s forthcoming Xe GPUs, which are coming out of the company’s recently rebuilt GPU division. Like Qualcomm, Intel is gunning for NVIDIA here, so there is a race for the AI inference market that none of the titans wish to lose.

Making It to the Finish Line

Qualcomm’s ambitions aside, for the next 12 months or so, the company’s focus is going to be on lining up its first customers. And to do this, the company has to show that it’s serious about what it’s doing with the Cloud AI 100 family, that it can deliver on the hardware, and that it can match the ease of use of rivals’ software ecosystems. None of this will be easy, which is why Qualcomm has needed to start now, so far ahead of when commercial shipments begin.

While Qualcomm has had various dreams of servers and the datacenter market for many years now, perhaps the most polite way to describe those efforts are “overambitious.” Case in point would be Qualcomm’s Centriq family of ARM-based server CPUs, which the company launched with great fanfare back in 2017, only for the entire project to collapse within a year. The merits of Centriq aside, Qualcomm is still a company that is largely locked to mobile processors and modems on the chipmaking side. So to get datacenter operators to invest in the Cloud AI family, Qualcomm not only needs a great plan for the first generation, but a plan for the next couple of generations beyond that.

The upshot here is that in the young, growing market for inference accelerators, datacenter operators are more willing to experiment with new processors than they are, say, CPUs. So there’s no reason to believe that the Cloud AI 100 series can’t be at least moderately successful right off the bat. But it will be up to Qualcomm to convince the otherwise still-cautious datacenter operators that Qualcomm’s wares are worth investing so many resources into.

Parallel to this is the software side of the equation. A big part of NVIDIA’s success thus far has been in their AI software ecosystem – itself is an expansion of their decade-old CUDA ecosystem – which has vexed GPU rival AMD for a while now. The good news for Qualcomm is that the most popular frameworks, runtimes, and tools have already been established; TensorFlow, Caffe2, and ONNX are the big targets, and Qualcomm knows it. Which is why Qualcomm is promising an extensive software stack right off the bat, because nothing less than that will do. But Qualcomm does have to get up to speed very quickly here, as how well their software stack actually works can make or break the whole project. Qualcomm needs to deliver good hardware and good software to succeed here.

But for the moment at least, Qualcomm's announcement today is a teaser – a proclamation of what’s to come. The company has developed a very ambitious plan to break into the growing AI inference accelerator market, and to deliver a processor significantly unlike anything else on the open market. And while getting from here to there is going to be a challenge, as one of the titans of the processor world Qualcomm is among the most capable out there, both in funding and engineering resources. So it’s as much a question of how badly Qualcomm wants the inference accelerator market as it is their ability to develop processors for it; and how well they can avoid the kind of missteps that have sunk their previous server processor plans.

Above all else, however, Qualcomm won’t simply take the inference accelerator market: they’re going to have to fight for it. This is NVIDIA’s market to lose and Intel has eyes on it as well, never mind all the smaller players from GPU vendors, FPGA vendors, and other ASIC players. Any and all of which can quickly rise and fall in what’s still a young market for an emerging technology. So while it’s still almost a year off, 2020 is quickly shaping up to be the first big battle for the AI accelerator market.

Let's block ads! (Why?)


https://www.anandtech.com/show/14187/qualcomm-reveals-cloud-ai-100-family-of-datacenter-ai-inference-accelerators-for-2020

2019-04-09 17:30:00Z
52780265129342

The AI Race Expands: Qualcomm Reveals “Cloud AI 100” Family of Datacenter AI Inference Accelerators for 2020 - AnandTech

The impact that advances in convolutional neural networking and other artificial intelligence technologies have had on the processor landscape in the last decade is inescapable. AI has become the buzzword, the catalyst, the thing that all processor makers want a piece of, and that all software vendors are eager to invest in to develop new features and new functionality. A market that outright didn’t exist at the start of this decade has over the last few years become a center of research and revenue, and already some processor vendors have built small empires out of it.

But this modern era of AI is still in its early days and the market has yet to find a ceiling; datacenters continue to buy AI accelerators in bulk, and deployment of the tech is increasingly ratcheting up in consumer processors as well. In a market that many believe is still up for grabs, processor makers across the globe are trying to figure out how they can become the dominant force in one of the greatest new processor markets in a generation. In short, the AI gold rush is in full swing, and right now everyone is lining up to sell the pickaxes.

In terms of the underlying technology and the manufacturers behind them, the AI gold rush has attracted interest from every corner of the technology world. This has ranged from GPU and CPU companies to FPGA firms, custom ASIC makers, and more. There is a need for inference at the edge, inference in the cloud, training in the cloud – AI processing at every level, served by a variety of processors. But among all of these facets of AI, the most lucrative market of all is the market at the top of this hierarchy: the datacenter. Expansive, expensive, and still growing by leaps and bounds, the datacenter market is the ultimate feast-or-famine setup, as operators are looking to buy nothing short of massive quantities of discrete processors. And now, one of the last juggernauts to sit on the sidelines of the datacenter AI market is finally making its move: Qualcomm.

This morning at their first Qualcomm AI Day, the 800lb gorilla of the mobile world announced that they are getting into the AI accelerator market, and in an aggressive way. At their event, Qualcomm announced their first discrete dedicated AI processors, the Qualcomm Cloud AI 100 family. Designed from the ground up for the AI market and backed by what Qualcomm is promising to be an extensive software stack, the company is throwing their hat into the ring for 2020, looking to establish themselves as a major vendor of AI inference accelerators for a hungry market.

But before we get too far into things here, it’s probably best to start with some context for today’s announcement. What Qualcomm is announcing today is almost more of a teaser than a proper reveal – and certainly far from a technology disclosure. The Cloud AI 100 family of accelerators are products that Qualcomm is putting together for the 2020 timeframe, with samples going out later this year. In short, we’re probably still a good year out from commercial products shipping, so Qualcomm is playing things cool, announcing their efforts and their rationale behind them, but not the underlying technology. For now it’s about making their intentions known well in advance, especially to the big customers they are going to try to woo. But still, today’s announcement is an important one, as Qualcomm has made it clear that they are going in a different direction than the two juggernauts they’ll be competing with: NVIDIA and Intel.

The Qualcomm Cloud AI 100 Architecture: Dedicated Inference ASIC

So what exactly is Qualcomm doing? In a nutshell, the company is developing a family of AI inference accelerators for the datacenter market. Though not quite a top-to-bottom initiative, these accelerators will come in a variety of form factors and TDPs to fit datacenter operator needs. And within this market Qualcomm expects to win by offering the most efficient inference accelerators available, with performance well above current GPU and FPGA frontrunners.

The actual architectural details on the Cloud AI 100 family are slim; however, Qualcomm has given us just enough to work with. To start with, these new parts will be manufactured on a 7nm process – presumably TSMC’s performance-oriented 7nm HPC process. The company will offer a variety of cards, but it’s not clear at this time if they are actually designing more than one processor. And, we’re told, this is an entirely new design built from the ground up; so it’s not, say, a Snapdragon 855 with all of the AI bits scaled up.

In fact it’s this last point that’s probably the most important. While Qualcomm isn’t offering architectural details for the accelerator today, the company is making it very clear that this is an AI inference accelerator and nothing more. It’s not being called an AI training accelerator, it’s not being called a GPU, etc. It’s only being pitched for AI inference – efficiently executing pre-trained neural networks.
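
To make the distinction concrete, here's a minimal sketch – in plain NumPy, purely illustrative and in no way Qualcomm's actual programming model – of what "executing a pre-trained neural network" amounts to: the weights are frozen, and the chip's only job is the forward pass.

```python
import numpy as np

# Pre-trained (frozen) weights for a tiny two-layer network. On an
# inference accelerator these are loaded once and reused across every
# request; no gradients are ever computed.
rng = np.random.default_rng(0)
W1, b1 = rng.standard_normal((4, 8)), np.zeros(8)
W2, b2 = rng.standard_normal((8, 3)), np.zeros(3)

def infer(x):
    """Forward pass only: the entire workload of an inference chip."""
    h = np.maximum(x @ W1 + b1, 0.0)  # dense layer + ReLU
    return h @ W2 + b2                # output logits

logits = infer(rng.standard_normal((1, 4)))
print(logits.shape)  # (1, 3)
```

Because the computation is this regular and this fixed, an ASIC can hard-wire the matrix math and skip everything a GPU carries around for training.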

This is an important distinction because, while the devil is in the details, Qualcomm’s announcement very strongly points to the underlying architecture being an AI inference ASIC – à la Google’s TPU family – rather than being a more flexible processor. Qualcomm is of course far from the first vendor to build an ASIC specifically for AI processing, but while other AI ASICs have either been focused on the low end of the market or reserved for internal use (Google’s TPUs again being the prime example), Qualcomm is talking about an AI accelerator to be sold to customers for datacenter use. And, relative to the competition, what they are talking about is much more ASIC-like than the GPU-like designs everyone is expecting in 2020 out of front-runner NVIDIA and aggressive newcomer Intel.

That Qualcomm’s Cloud AI 100 processor design is so narrowly focused on AI inference is critical to its performance potential. In the processor design spectrum, architects balance flexibility with efficiency; the closer to a fixed-function ASIC a chip is, the more efficient it can be. Just as how GPUs offered a massive leap in AI performance over CPUs, Qualcomm wants to do the same thing over GPUs.

The catch, of course, is that a more fixed-function AI ASIC is giving up flexibility. Whether that’s the ability to handle new frameworks, new processing flows, or entirely new neural networking models remains to be seen. But Qualcomm will be making some significant tradeoffs here, and the big question is going to be whether these are the right tradeoffs, and whether the market as a whole is ready for a datacenter-scale AI ASIC.

Meanwhile, the other technical issue that Qualcomm will have to tackle with the Cloud AI 100 series is the fact that this is their first dedicated AI processor. Admittedly, everyone has to start somewhere, and in Qualcomm’s case they are looking to translate their expertise in AI at the edge with SoCs into AI at the datacenter. The company’s flagship Snapdragon SoCs have become a force to be reckoned with, and Qualcomm thinks that their experience in efficient designs and signal processing in general will give the company a significant leg up here.

It doesn’t hurt either that with the company’s sheer size, they have the ability to ramp up production very quickly. And while this doesn’t help them against the likes of NVIDIA and Intel – both of which can scale up at TSMC and their internal fabs respectively – it gives Qualcomm a definite advantage over the myriad of smaller Silicon Valley startups that are also pursuing AI ASICs.

Why Chase the Datacenter Inferencing Market?

Technical considerations aside, the other important factor in today’s announcement is why Qualcomm is going after the AI inference accelerator market. And the answer, in short, is money.

Projections for the eventual size of the AI inferencing market vary widely, but Qualcomm buys into the idea that datacenter inference accelerators alone could be a $17 billion market by 2025. And if this proves to be true, then it would represent a sizable market that Qualcomm would otherwise be missing out on, one that would rival the entirety of their current chipmaking business.

It’s also worth noting here that this is explicitly the inference market, and not the overall datacenter inference + training market. This is an important distinction because while training is important as well, the computational requirements for training are very different from those for inferencing. While accurate inferencing can be performed with relatively low-precision datatypes like INT8 (and sometimes lower), currently most training requires FP16 or more. That calls for a very different type of chip, especially when we’re talking about ASICs instead of something a bit more general purpose like a GPU.
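
The precision gap can be illustrated with a toy symmetric-quantization sketch (plain NumPy, illustrative only – production INT8 pipelines use calibrated, often per-channel scales): rounding weights and activations down to 8-bit integers costs only a small amount of accuracy, which is exactly the property an inference ASIC exploits.

```python
import numpy as np

def quantize_int8(x):
    """Symmetric per-tensor quantization: FP32 -> INT8 plus a scale."""
    scale = np.abs(x).max() / 127.0
    q = np.clip(np.round(x / scale), -127, 127).astype(np.int8)
    return q, scale

rng = np.random.default_rng(1)
w = rng.standard_normal((64, 64)).astype(np.float32)
x = rng.standard_normal((1, 64)).astype(np.float32)

qw, sw = quantize_int8(w)
qx, sx = quantize_int8(x)

# Real hardware accumulates the INT8 products in INT32; the result is
# rescaled back to floating point using the two scale factors.
y_int8 = (qx.astype(np.int32) @ qw.astype(np.int32)) * (sx * sw)
y_fp32 = x @ w

rel_err = np.abs(y_int8 - y_fp32).max() / np.abs(y_fp32).max()
print(f"max relative error: {rel_err:.3%}")  # small; "good enough" for inference
```

Training, by contrast, has to accumulate tiny gradient updates, which is why it generally needs FP16-or-wider arithmetic and a different class of silicon.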

This also leans into scale: while training a neural network can take a lot of resources, it only needs to be done once. Then it can be replicated out many times over to farms of inference accelerators. So as important as training is, potential customers will simply need many more inference accelerators than they will training-capable processors.

Meanwhile, though not explicitly said by the company, it’s clear that Qualcomm is looking to take down market leader NVIDIA, who has built a small empire out of AI processors even in these early days. Currently, NVIDIA’s Tesla T4, P4, and P40 accelerators make up the backbone of datacenter AI inference processors, with datacenter revenues as a whole proving to be quite profitable for NVIDIA. So even if the total datacenter market doesn’t grow quite as projected, it would still be quite lucrative.

Qualcomm also has to keep in mind the threat from Intel, who has very publicly telegraphed their own plans for the AI market. The company has several different AI initiatives, ranging from low-power Movidius accelerators to their latest Cascade Lake Xeon Scalable CPUs. However for the specific market Qualcomm is chasing, the biggest threat is probably Intel’s forthcoming Xe GPUs, which are coming out of the company’s recently rebuilt GPU division. Like Qualcomm, Intel is gunning for NVIDIA here, so there is a race for the AI inference market that none of the titans wish to lose.

Making It to the Finish Line

Qualcomm’s ambitions aside, for the next 12 months or so, the company’s focus is going to be on lining up its first customers. And to do this, the company has to show that it’s serious about what it’s doing with the Cloud AI 100 family, that it can deliver on the hardware, and that it can match the ease of use of rivals’ software ecosystems. None of this will be easy, which is why Qualcomm has needed to start now, so far ahead of when commercial shipments begin.

While Qualcomm has had various dreams of servers and the datacenter market for many years now, perhaps the most polite way to describe those efforts is “overambitious.” Case in point would be Qualcomm’s Centriq family of ARM-based server CPUs, which the company launched with great fanfare back in 2017, only for the entire project to collapse within a year. The merits of Centriq aside, Qualcomm is still a company that is largely locked to mobile processors and modems on the chipmaking side. So to get datacenter operators to invest in the Cloud AI family, Qualcomm not only needs a great plan for the first generation, but a plan for the next couple of generations beyond that.

The upshot here is that in the young, growing market for inference accelerators, datacenter operators are more willing to experiment with new processors than they are, say, CPUs. So there’s no reason to believe that the Cloud AI 100 series can’t be at least moderately successful right off the bat. But it will be up to Qualcomm to convince the otherwise still-cautious datacenter operators that Qualcomm’s wares are worth investing so many resources into.

Parallel to this is the software side of the equation. A big part of NVIDIA’s success thus far has been in their AI software ecosystem – itself an expansion of their decade-old CUDA ecosystem – which has vexed GPU rival AMD for a while now. The good news for Qualcomm is that the most popular frameworks, runtimes, and tools have already been established; TensorFlow, Caffe2, and ONNX are the big targets, and Qualcomm knows it. Which is why Qualcomm is promising an extensive software stack right off the bat, because nothing less than that will do. But Qualcomm does have to get up to speed very quickly here, as how well their software stack actually works can make or break the whole project. Qualcomm needs to deliver good hardware and good software to succeed here.
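
The value of a common exchange format can be sketched in a few lines of pure Python. The graph format below is hypothetical (it is not ONNX's actual schema), but it captures the idea: if every framework can export to one neutral graph representation, an accelerator vendor only needs to write one importer and one runtime, rather than one integration per framework.

```python
# A toy framework-neutral model graph: an ordered list of ops, analogous
# in spirit (not in schema) to an ONNX graph. An accelerator's software
# stack consumes this instead of framework-specific formats.
model = [
    {"op": "add", "operand": 10},
    {"op": "mul", "operand": 3},
    {"op": "relu"},
]

def run(graph, x):
    """A trivial interpreter standing in for an accelerator's runtime."""
    for node in graph:
        if node["op"] == "add":
            x = x + node["operand"]
        elif node["op"] == "mul":
            x = x * node["operand"]
        elif node["op"] == "relu":
            x = max(x, 0)
        else:
            raise ValueError(f"unsupported op: {node['op']}")
    return x

print(run(model, -2))  # (-2 + 10) * 3 = 24
```

The hard part of a real stack is, of course, mapping each op onto the silicon efficiently; the "unsupported op" branch is where the flexibility tradeoffs of an ASIC show up in practice.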

But for the moment at least, Qualcomm's announcement today is a teaser – a proclamation of what’s to come. The company has developed a very ambitious plan to break into the growing AI inference accelerator market, and to deliver a processor significantly unlike anything else on the open market. And while getting from here to there is going to be a challenge, as one of the titans of the processor world, Qualcomm is among the most capable out there, both in funding and engineering resources. So it’s as much a question of how badly Qualcomm wants the inference accelerator market as it is their ability to develop processors for it; and how well they can avoid the kind of missteps that have sunk their previous server processor plans.

Above all else, however, Qualcomm won’t simply take the inference accelerator market: they’re going to have to fight for it. This is NVIDIA’s market to lose and Intel has eyes on it as well, never mind all the smaller players from GPU vendors, FPGA vendors, and other ASIC players. Any and all of which can quickly rise and fall in what’s still a young market for an emerging technology. So while it’s still almost a year off, 2020 is quickly shaping up to be the first big battle for the AI accelerator market.



https://www.anandtech.com/show/14187/qualcomm-reveals-cloud-ai-100-family-of-datacenter-ai-inference-accelerators-for-2020

2019-04-09 16:50:22Z