News


MSI Z97 Gaming Motherboard Giveaway

On Sunday Intel lifted its embargo on its Haswell refresh, including the updated Z97 chipset. From a high level, Z97 looks a lot like last year’s Z87. Z97 primarily adds official support for M.2 and SATA Express, as well as RST support for PCIe storage devices. This year we’ll see the transition to native PCIe/NVMe SSDs, so Z97 marks the beginning of a platform transition to a world where PCIe storage is more widespread. 

We’ll be publishing Z97 motherboard reviews over the coming days, but to kick off the launch MSI is giving AnandTech readers the chance to win one of four Z97 motherboards. We’ve got two MSI Z97 Gaming 9 AC motherboards and two Z97 Gaming 7 boards to give away.

The Z97 Gaming 9 AC is a standard ATX motherboard with integrated Intel 802.11ac WiFi as well as Killer’s E2205 gigabit Ethernet. The board features a single M.2 connector and three PCIe x16 slots (physical). 

The Z97 Gaming 7 drops the 802.11ac but keeps most of the other features, including the M.2 slot. 

Both are high-end options for your next Haswell build, whether with one of the existing SKUs, the newly refreshed CPUs, or the upcoming Devil’s Canyon parts.

We’ll be accepting entries for the next three days. To enter you must be a US resident – simply leave a comment below (please leave only one). We’ll randomly pick four winners: two will receive a Z97 Gaming 9 AC and the other two will receive a Z97 Gaming 7, with the winner/prize combinations chosen at random. Good luck!

Microsoft Unbundles Kinect, $399 Xbox One Model Available Starting June 9th

In an interesting reversal of what happened last generation, Microsoft’s Xbox One launched at a $100 price premium to Sony’s PlayStation 4. Despite Sony building the higher performing console, Microsoft’s Xbox One actually had a higher silicon budget (thanks to eSRAM increasing the SoC’s total die area). It was ultimately the bundling of Microsoft’s Kinect that forced the Xbox One to launch at $499 instead of $399. Committed to making the Xbox One more than just a game console, Microsoft seemingly hoped Kinect would be a non-negotiable part of the Xbox experience. That all changes in early June, however.

Microsoft just announced a $399 version of the Xbox One, without Kinect, available starting June 9th. The console hardware appears unchanged; it’ll simply be sold without Kinect. Microsoft will offer a standalone Kinect option later this fall. In June Microsoft will also begin offering its Games with Gold Xbox Live program to Xbox One owners. Any Xbox One user with an Xbox Live Gold subscription will get access to free games every month (similar to the program already available for Xbox 360 owners; a single subscription will give you access to Games with Gold on both platforms).

Putting the Xbox One at price parity with the PS4 makes a lot of sense, and should help Microsoft in the near term. The real question is whether $100 is enough to move users over to the Xbox One or if the market views the PS4’s spec/performance advantage as being more valuable than the Xbox ecosystem. 

The real tragedy in all of this is that both Microsoft and Sony appear to have hedged their bets a little too much with the Xbox One/PS4. I get the feeling that neither company felt the market for ultra high-end consoles was all that solid to begin with, and both instead aimed lower on the performance ladder than they did last round (relatively speaking). It’s a bit of a self-fulfilling prophecy at this point. Going more conservative on performance out of fear that a market is disappearing is a great way to ensure that the market is open for a higher performing alternative (read: Steambox, PCs) to come in and steal users away.

When we spoke with NVIDIA prior to the Tegra K1 launch, their view was that the clock is ticking on how long it will take mobile SoCs to equal the performance of the new consoles. I’m sure the other mobile players are focused on the same thing. We’ll likely see Xbox 360-ish performance out of mobile silicon in the next 12 months. Add another few generations (and process nodes) and we’ll be a lot closer to Xbox One/PS4 performance. We’re already pretty close on the CPU side.

Motorola Introduces the Moto E: The $129 Smartphone for Everyone

It doesn’t seem like it was long ago that everyone was excited by rumors of a new Motorola phone after Google’s short-lived acquisition of the company in May 2012. Motorola had long been known as a company that layered its own custom interface onto Android, which users felt offered little benefit in return for the performance impact it had on their devices. After the acquisition by Google, it was hoped that Motorola would offer an experience closer to that of Google’s Nexus devices. Those wishes came true, and Motorola has since been creating devices that are very close to stock Android, with only minor modifications which, for the most part, are generally seen as helpful. Motorola also made a move for the mid-range segment of the market with the Moto G, which offered users a formidable smartphone relative to its price.

Today Motorola has continued along their new path by introducing the Moto E, the most inexpensive device in their new lineup of smartphones.

Motorola’s Smartphone Lineup
|  | Motorola Moto E | Motorola Moto G | Motorola Moto X |
| --- | --- | --- | --- |
| SoC | Qualcomm Snapdragon 200 (MSM8x10): 2x ARM Cortex A7 at 1.2GHz, Adreno 302 at 400MHz | Qualcomm Snapdragon 400 (MSM8x26): 4x ARM Cortex A7 at 1.2GHz, Adreno 305 at 450MHz | Qualcomm Snapdragon S4 Pro (MSM8960Pro): 2x Krait 300 at 1.7GHz, Adreno 320 at 400MHz; Motorola X8 System (SoC + NLP processor + contextual processor) |
| RAM/NAND | 1GB LPDDR2 + 4GB NAND w/ MicroSDHC | 1GB LPDDR2 + 8/16GB NAND | 2GB LPDDR2 + 16/32GB NAND |
| Display | 4.3″ 960×540 LCD | 4.5″ 1280×720 IPS LCD | 4.7″ 1280×720 RGB Stripe AMOLED |
| Network | 2G / 3G (MSM8x10, 21.1Mbps HSDPA, 850/900/1900/2100MHz or 850/1700/1900/2100MHz) | 2G / 3G (Qualcomm MSM8x26, 21.1Mbps HSDPA) | 2G / 3G / 4G LTE (Qualcomm MDM9x15, UE Category 4 LTE) |
| Dimensions | 124.38 x 64.8 x 12.3 mm, 142g | 129.9 x 65.9 x 11.6 mm, 143g | 129.3 x 65.3 x 10.4 mm, 130g |
| Camera | 5MP (2592×1944) rear facing | 5MP (2592×1944) rear facing w/ 1.4µm pixels and F/2.4 aperture; 1.3MP front facing | 10MP (4320×2432) Clear Pixel (RGBC) rear facing w/ 1.4µm pixels and F/2.4 aperture; 2MP 1080p front facing |
| Battery | 1980 mAh (7.52 Whr) | 2070 mAh (7.87 Whr) | 2200 mAh (8.36 Whr) |
| OS | Android 4.4.2 | Android 4.4.2 | Android 4.4.2 |
| Connectivity | 802.11 b/g/n + BT 4.0, USB 2.0, GPS/GNSS | 802.11 b/g/n + BT 4.0, USB 2.0, GPS/GNSS | 802.11 a/b/g/n/ac + BT 4.0, USB 2.0, GPS/GNSS, DLNA, NFC |
| SIM Size | Micro-SIM (dual SIM SKU) | Micro-SIM | Nano-SIM |
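The battery capacities above are quoted in both mAh and Whr; the two figures are related by the cells' nominal voltage, which for these three phones works out to roughly 3.8V (e.g. 1980mAh × 3.8V ≈ 7.52Whr). As a quick sketch of the conversion (the 3.8V figure is our inference from the table, not an official Motorola spec):

```python
def mah_to_whr(capacity_mah, nominal_voltage=3.8):
    """Convert battery capacity from mAh to watt-hours at a nominal cell voltage."""
    return capacity_mah * nominal_voltage / 1000.0

for name, mah in [("Moto E", 1980), ("Moto G", 2070), ("Moto X", 2200)]:
    print(f"{name}: {mah} mAh ~ {mah_to_whr(mah):.2f} Whr")
```

Running this reproduces the 7.52/7.87/8.36 Whr figures in the table to within rounding.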

The Moto E sports a similar design to the other smartphones in Motorola’s lineup, with a rounded plastic body and a lip at the top for the 3.5mm headphone jack. As with the Moto G, customers will be able to swap the back cover for one of many different colors, which adds a level of customization to the phone’s design. I hope that Motorola has not compromised on build quality to attain such a low price point; the Moto G felt exceptionally well crafted given its cost.

On the face of the device we have a 960×540 qHD display. Motorola isn’t saying whether or not this is an IPS panel, although they do confirm that it is covered with Corning Gorilla Glass 3. To attain its low price point the Moto E does not come with a front facing camera. There is also a large front facing speaker along the bottom of the device. On the back we have the 5MP camera, which is capable of FWVGA (854×480) video recording, and the indented Motorola logo that has become standard across Motorola’s new devices.

Inside the device we have Qualcomm’s Snapdragon 200 platform, with two Cortex A7 cores running at 1.2GHz and an Adreno 302 GPU at 400MHz. On the cellular side the device supports 21.1Mbps HSDPA on bands 1, 2, 5, and 8 for the North American and European versions, and bands 1, 2, 4, and 5 in Latin America. A special dual SIM SKU will also be available in certain markets, with support for dual SIM standby. The device packs a 7.52 Whr battery which Motorola promises will last a full day on a single charge. The device ships with only 4GB of NAND (only 2.2GB of which is available to the user), but to compensate Motorola has included a MicroSD card slot for up to 32GB of storage expansion.

Motorola is marketing this as the smartphone to kill the dumbphone, and they have set up a website which encourages users to tell their friends still using flip phones to make the switch. The website states “Life before mobile apps should be a thing of the past. Introducing the Moto E. Made to last. Priced for all.” It really is something to see such a capable device at a price accessible to the masses. It was only a few years ago that hardware of this caliber was reserved for the fastest and most expensive smartphones.

The Moto E comes with Android 4.4.2 out of the box. It is available now in the US and India for $129 USD and 6999 Rs respectively, with Motorola planning to launch in over 40 countries in the coming weeks.

Source: Official Motorola Blog

TechEd NA 2014 - Services In The Cloud

On Monday Microsoft kicked off TechEd North America 2014 in Houston. TechEd is the technology conference geared towards IT Professionals and Enterprise developers, and focuses on the tools, software, and services that many enterprises rely on for s…

VESA Adds Adaptive-Sync to DisplayPort 1.2a Standard; Variable Refresh Monitors Move Forward

The last half-year or so has seen the concept of variable refresh desktop monitors advance rather quickly. After sitting on the technology backburner for a number of years, the issue came to the forefront of the PC graphics industry late last year when NVIDIA announced G-Sync, the first such desktop implementation (and unfortunately a proprietary one) of the concept. AMD in turn fired back at NVIDIA at CES this year, demonstrating their FreeSync concept, which could implement variable refresh through features found in the embedded DisplayPort (eDP) standard. Since then the technology has been in something of a holding pattern – NVIDIA and their partners are still prepping retail G-Sync monitors, while AMD and the VESA have needed to bridge the specification gap between eDP and DisplayPort.

To that end, the VESA sends word today that they have done just that with the latest update to the DisplayPort 1.2a standard. Adaptive-Sync (not to be confused with NVIDIA’s Adaptive V-Sync), the eDP feature that allows for variable refresh monitors, has been added to the DisplayPort 1.2a standard as an optional feature. We’ve been expecting this addition since AMD first announced their FreeSync concept, however until now it wasn’t clear whether Adaptive-Sync would first be added to DisplayPort 1.2a or rolled into the forthcoming DisplayPort 1.3 standard, so we’re glad to see that it’s the former rather than the latter.


AMD’s FreeSync Demo


NVIDIA’s G-Sync Demo

With the standard now settled, GPU manufacturers and display manufacturers are free to move forward on implementing it in hardware and drivers. The good news is that the underlying technology is fairly old – eDP was ratified in 2009 – so while we’re not accustomed to seeing Adaptive-Sync on desktop hardware, there are GPU and display/controller manufacturers who have experience with the technology. That said, since this feature isn’t present in today’s display controllers there’s still a need to iterate on the hardware and its firmware, even if it’s just a matter of making small modifications to existing designs (this being the advantage of doing a DP 1.2a extension).

AMD for their part sent over a notice that they’re already working with display manufacturers to get the technology into future monitors, with their estimate being 6-12 months for Adaptive-Sync capable displays to hit the market. There’s no real precedent for this kind of change, so it’s hard to say just what a realistic number within that window is; but historically vendors have been slow to update their hardware for new DisplayPort standards, and even NVIDIA’s own efforts have taken many months despite the company’s extra muscle and close partner relationships. With that in mind we suspect 12 months is more realistic than 6, though we’d be happy to be wrong.


VESA’s Adaptive-Sync Promo Material

Meanwhile the VESA for their part is touting the full range of benefits for Adaptive-Sync. This includes both the gaming angle that NVIDIA and AMD have recently been pushing, and the power savings angle that drove the creation of Adaptive-Sync and eDP in the first place. Admittedly the power gains are minuscule and generally unimportant in a desktop scenario, but they are there. Outside of gaming, what’s more interesting is the ability to apply Adaptive-Sync to video playback, allowing for the elimination of the judder that’s common when playing back 24fps/25fps content on today’s 60Hz displays.
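That judder comes from simple arithmetic: 60Hz is not an integer multiple of 24fps (60/24 = 2.5), so on a fixed-rate panel frames must alternate between being held for 3 refreshes and 2 refreshes (the familiar 3:2 pulldown cadence), while a variable refresh display can simply run at 48Hz and hold every frame for an even 2 refreshes. A small illustration (the function name and structure are ours, purely for demonstration):

```python
from fractions import Fraction

def frame_hold_pattern(content_fps, refresh_hz, frames=6):
    """Number of display refreshes each source frame occupies on a fixed-rate panel."""
    ratio = Fraction(refresh_hz, content_fps)  # refreshes available per source frame
    shown = 0
    pattern = []
    for i in range(1, frames + 1):
        target = int(i * ratio)  # total refreshes consumed after frame i
        pattern.append(target - shown)
        shown = target
    return pattern

print(frame_hold_pattern(24, 60))  # uneven 3:2 cadence -> judder
print(frame_hold_pattern(24, 48))  # even cadence -> smooth playback
```

The first call yields an alternating [2, 3, 2, 3, ...] pattern, the second a uniform [2, 2, 2, ...] – which is exactly the difference Adaptive-Sync can eliminate.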

Along with the addition of Adaptive-Sync to the DisplayPort standard, the VESA will also be putting together a new (yet to be revealed) logo for the technology. Since Adaptive-Sync is an optional feature, not every DisplayPort device will support it, so those devices that do support it will sport a logo to visibly indicate their compliance. The logo will go hand-in-hand with the VESA’s forthcoming Adaptive-Sync compliance test, so manufacturers will need to pass the test before they’re able to use the logo.

Moving on, coinciding with today’s announcement from the VESA AMD sent along their own release on the subject. In it, AMD notes that they’re immediately preparing for Adaptive-Sync, though they will be continuing to promote it under the FreeSync brand. AMD is telling us that as of this point most of their GCN 1.1 products will support Adaptive-Sync, including the desktop Radeon 290 and 260 series, along with most of AMD’s current APUs: Beema/Mullins, Kaveri (AMD had mistakenly omitted this from their list), and even the GCN 1.0 Kabini/Temesh. Meanwhile AMD has not yet commented on whether their GCN 1.0 video cards will support Adaptive-Sync, so the outcome of that remains to be seen. But for all of the supported products the underlying hardware is already Adaptive-Sync capable, so it’s just a matter of AMD rolling out support for it in their drivers.


There are some technical differences between G-Sync and Adaptive-Sync, but the goals and resulting behaviors are the same

AMD’s release also contains an interesting note on supported refresh rates: “Potential ranges include 36-240Hz, 21-144Hz, 17-120Hz and 9-60Hz.” While the upper bounds of those ranges are in line with numbers we’ve seen before, the sub-30Hz refresh rates are unexpected. As you might recall from our look at G-Sync, even though LCD monitors don’t suffer from anything quite like the phosphor decay of CRT monitors, an LCD still needs to be refreshed periodically to keep its pixels from drifting. As a result G-Sync has a minimum refresh rate of 30Hz, whereas AMD is explicitly promising lower refresh rates. Since pixel drift is an underlying issue with LCD technology, there is presumably something in Adaptive-Sync to compensate for it – the display is likely initiating a self-refresh – though at the end of the day variable refresh means the source can always drive the display at a multiple of the content’s frame rate and get the same results.
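One way to picture that multiple-of-the-frame-rate workaround: if a panel's minimum refresh rate is 30Hz and the content runs at 24fps, the source can drive the panel at 48Hz and present each frame twice, which is visually equivalent. A minimal sketch, assuming simple whole-frame repetition (the function name is ours):

```python
import math

def effective_refresh(content_fps, panel_min_hz):
    """Smallest integer multiple of the content's frame rate that sits at or
    above the panel's minimum refresh rate; each frame is shown n times."""
    n = max(1, math.ceil(panel_min_hz / content_fps))
    return n * content_fps

print(effective_refresh(24, 30))  # 48: each 24fps frame shown twice
print(effective_refresh(45, 30))  # 45: already above the minimum, shown once
```

This is why a 30Hz floor (as on G-Sync) is workable in practice, and why AMD's promised sub-30Hz floors mostly reduce how often such frame doubling is needed.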

Finally, we also had a brief chat with NVIDIA about whether they would support Adaptive-Sync on current generation hardware. NVIDIA tells us that they can’t comment at this time since there aren’t any Adaptive-Sync displays available. It’s entirely possible this is just NVIDIA being coy, however like all device vendors they do have to pass the VESA’s compliance tests. So if nothing else NVIDIA’s “no comment” is technically correct: until they pass that test they are limited in what they can say about being Adaptive-Sync compliant.

Though while we’re on the subject, this also brings up the matter of NVIDIA’s competing G-Sync technology. Because of NVIDIA’s head start on the variable refresh concept with G-Sync, for the next year or so they will continue to be the only vendor with retail support for variable refresh. The modified Asus monitors have been available for a few months now, and the last we heard from NVIDIA, the retail G-Sync monitors are still due this quarter. So until Adaptive-Sync monitors hit the market, G-Sync is the only option.

Ultimately it remains to be seen what will become of G-Sync – NVIDIA seems to be in this for the long haul as part of their broader ecosystem plans – and there is the matter of whether the technical differences between Adaptive-Sync and G-Sync result in meaningful performance differences between the two technologies. With that said, even if NVIDIA keeps G-Sync around we would hope to see them support Adaptive-Sync just as well as AMD.