Review of LipSense Long-Lasting Lipstick

The new LipSense lipstick range by SeneGence has been listed as one of the trendiest new lipstick lines around. It has become essential to research a product before buying it, especially a cosmetic product, which can have adverse effects on the body. With that in mind, here are some key facts about LipSense lipstick.

There are many moments when your lipstick can spoil the day. Suppose you have to travel somewhere far from home: would you be willing to reapply your lipstick every two hours? Or what if you are lost in thought and your hand unconsciously reaches for your lips? Your favorite lip color will be gone in seconds. What about that shine you always wanted on your lips? And yes, what about the marks an ordinary lipstick leaves behind?

LipSense long-lasting lipstick comes in a range of about 72 colors. It will not wear away even if you are drinking, touching your face, chatting, or rubbing your lips. Its staying power comes from SeneGence's patented super-adhesive lip color formula, a multi-layered application process, and sealing 'top coats' (aka glosses). Not only that, it is waterproof, smear-proof, and budge-proof, and it does not come off even after vigorous rubbing. The only way to remove it is with a suitable makeup remover.

Application follows a simple procedure. Make sure your lips are clean, with no other product left on them. First, line your lips with the color of your choice and allow it to dry for five minutes; the layer will feel fairly tacky because of the formula's adhesive ingredients. Apply the second layer and let it dry. If you make a mistake while applying it, you can correct it with the Oops remover. Then apply the final coat of lip color and allow it to dry completely. If you run the brush over the same spot a second time before it dries, it can lift off some of the product and create a patchy look.

Then apply the gloss, which comes in shiny or matte finishes. There you are, ready to shine!

At the end of the day, you can remove the lip color with the Oops remover. Alternatively, you can apply some coconut oil and gently scrub with a washcloth, or clean it off with ordinary soap and water. When you are out of the house, the Oops remover is the most convenient option.

The lipstick feels almost weightless to wear. It does not leave lips feeling stiff or caked, so you can move them effortlessly. The formula is thinner and runnier than a conventional lipstick, and a tube costs a little more than comparable products, but it does not need to be reapplied several times a day, so on average it lasts four to six months.

The only remaining question is whether it contains any harmful chemicals. The product is made with FDA-approved ingredients, so it should not harm your lips. It does not penetrate the skin or enter the bloodstream; the patented molecular bond simply adheres to the skin's surface, leaving you with clean and dry lips when it is removed. If you want to learn more about this long-lasting lipstick, you can find further details online.

Quern – Undying Thoughts review: The closest we may ever come to a Riven sequel

Quern - Undying Thoughts

2016 was one hell of a year for Myst-alikes, eh? Straight from the source, from Cyan itself, you had official spiritual successor Obduction—a game which crept onto our Games of the Year list. Then there was The Eyes of Ara, which utilized the same sort of puzzles although in quite a different sort of setting.

But I wish I’d gotten around to playing Quern – Undying Thoughts. I don’t know why I didn’t, although a late-November release and an impenetrable title certainly didn’t help.

Quern – Undying Thoughts

I recently delved into Quern’s world though—nothing like those slow Januarys for catching up on your backlog—and it’s excellent. Like Myst of old, you arrive in an area with no idea how you got there or what you should be doing. In this case you’ve been teleported to an island by way of a massive gateway which promptly self-destructs upon your arrival.

There’s a way off the island, though: approximately 50 locked doors and dozens of puzzles stand in your future. Puzzles involving gears and simple mechanical contraptions, puzzles involving audio cues, puzzles involving some lightweight botany, and puzzles involving a whole series of crystals each of which has its own unique properties.

Quern – Undying Thoughts

Along the way you’ll delve into the origins of the titular island, Quern. The island’s previous occupant has left behind directives, notes on the peculiar properties Quern possesses, the secrets he’s uncovered and what he’s done with said knowledge. His journal entries lead you from area to area, sometimes giving context, sometimes giving clues, and all apparently part of some plan. But what plan? And why?

It’s as close to a Myst game as I’ve ever seen from another developer. Or rather, a Riven game. Just look at the architecture here:

Quern – Undying Thoughts

This is the first village you enter. It’s your main hub and where you’ll spend a good portion of Quern. And it’s incredibly reminiscent of the bleached-white adobe buildings that dotted Riven’s Jungle Island:

Riven

Those huts may represent the most obvious parallel, but the game reeks of Cyan’s mid-’90s output. It has the same sense of weary isolation, the same unexpected warmth to its grimy copper gizmos, and that feeling of “Normal” and “Alien” bashed together in one place. Here, a library. There, a forge. Everyday objects, but all of it residing in a world so unique and unusual.

Between the focus on mechanical puzzles and the aesthetic nod, Quern feels like the Riven successor we never really got (and likely never will get). It’s grim and lonesome in the same way that made Riven a classic—and much closer in tone to Riven than Cyan’s actual sequel, Myst III.

Quern – Undying Thoughts

And it takes the same approach to puzzles—not just in terms of aesthetics, but philosophically. Quern loves introducing new ideas and puzzle constraints at a rapid pace and then disposing of them at will, adhering to a certain internal logic but unconcerned with building on its own foundation.

Hell, there’s even a puzzle similar to Riven’s famous spinning room.

Quern’s not nearly as bash-your-head-against-a-wall obtuse, of course. While Riven occasionally stumps me even today, twenty or so years after I first played it, Quern is breezier. Benefiting from an additional two decades of puzzle design and fully 3D environments, it took me maybe eight or nine hours to get through. Not too bad.

Quern – Undying Thoughts

It’s also not a perfect game. Some of its puzzles fall into the typical adventure-game trap: a very obvious solution buried under a cumbersome series of steps. One puzzle is just a retread of the Knights of the Old Republic puzzle where you do a bunch of column math to blow up injector pods on Manaan. (Yes, it’s as boring as it sounds.) And you do it three times here. Ugh. There are a few other stinkers, but that one’s the worst.

The game also includes a seemingly handy Notebook feature, which is supposed to function like in-game screenshots. You see something important, a clue to a puzzle, and you can “jot it down in your notebook” at the touch of a button. Quern then makes a cool little sketch-representation of whatever item you were looking at.


Amazon Fire TV Stick with Alexa Voice Remote review: A peppier processor makes this streamer worthwhile

Picking the best low-cost media streamer just got a lot more complicated.

firetvstick2016
AT A GLANCE
  • Amazon Fire TV Stick with Alexa Voice Remote (2016)

    $39.99 MSRP

    The new Fire TV Stick combines good-enough performance with powerful voice controls and a smart user interface.

Before Amazon’s second-generation Fire TV Stick showed up for review, I thought it could be the cheap streaming device to beat. Amazon has claimed a 30 percent performance boost over the original Fire TV Stick, whose biggest problem was sluggishness. Apps took a long time to load, and were prone to bouts of freezing and stuttering. The device didn’t seem built to last in an age of increasingly sophisticated streaming apps such as Sling TV and PlayStation Vue.

The 2016 Fire TV Stick—still priced at $40—alleviates those issues, but it doesn’t completely solve them. And while Amazon’s apps and interface are forward-thinking, they lack some of the nice touches that make competing devices such as the latest Roku Streaming Stick and Google’s Chromecast worth considering.

A gentle upgrade

Compared to the first-generation Fire TV Stick, the 2016 version is a bit larger, not that the size matters much. Like other streaming sticks, this one plugs into your TV’s HDMI port (there’s a short extension cable in the box if you need more room around the port).

There’s a new 1.3GHz quad-core processor inside, a step up from the 1.0GHz dual-core chip in the original Fire TV Stick. Storage and RAM are unchanged at 8GB and 1GB respectively, and playback resolution still tops out at 1080p. The hardware upgrade is enough to keep home-screen animations running smoothly, and to avoid the more egregious performance hang-ups that made certain apps borderline unusable on the previous Fire TV Stick.

firetvstick2016ext

Jared Newman

An HDMI extension cable is included in case the stick blocks any nearby ports.

That said, you might notice times when the new stick struggles. HBO Go, for instance, seems especially stutter-prone, and app loading times as a whole can border on annoying. PlayStation Vue is one of the bigger offenders, taking around 20 seconds to get up and running.

Speaking of Vue, it doesn’t run at 60 frames per second on the Fire TV Stick, even though Sling TV and MLB.tv do; so you won’t get the smooth motion you’d expect from live sports. On the other hand, the Vue app on Fire TV is far superior to the Roku version overall.

Amazon has also upgraded the Wi-Fi to 802.11ac. This should provide faster speeds and improved range—provided you have an 802.11ac router, of course. And while Amazon isn’t publicizing this, the new Fire TV Stick can support Bluetooth headphones, although I’ll qualify that by noting the volume controls didn’t work on the pair I tested. It’s possible that this feature isn’t complete yet.

The other big improvement is in the remote: You get a voice-enabled “Alexa” remote by default. (Amazon had been selling this as a $10 upgrade with the old Fire TV Stick.) The new remote is comfier and has a more premium feel than the previous default remote, and it enables some powerful software features that I’ll dive into later. On the downside, it still doesn’t have any sort of built-in volume control, so you’ll need to keep the TV remote (or a Sideclick) handy.

Software in stasis

Right now, there’s not much point in critiquing the Fire TV Stick user interface, because it’s about to get a major facelift. But even after that happens, the general idea isn’t going to change: Amazon is more interested in pulling TV and movie recommendations onto the home screen than pushing you out to dozens of siloed apps.

This used to mean that Amazon would simply bombard you with recommendations from its own Prime video service, scattering thumbnails across the vast majority of the main menu. Last month, however, Amazon added Netflix and HBO Go recommendations to the home screen, while promising more streaming services to come. This emphasis on content over apps is how streaming video should work, and Amazon has traveled further down that road than anyone else.

firetvstickhome

The Fire TV home screen no longer limits its recommendations to Amazon Video.

Amazon has also bulked up its voice controls, adding new content sources (including Netflix) and an ever-expanding list of skills via its Alexa virtual assistant. You can use the voice remote to play music, call an Uber, control your smart thermostat and light bulbs, or order more diapers, among other things.

Unfortunately, some old criticisms of Amazon’s voice search haven’t gone away: Amazon still hides alternative content sources behind a “more ways to watch” button, and there’s no way to get a quick look at which seasons of a show are available on which streaming service. And while Alexa is great at hunting down shows, movies, actors, directors and apps, it’s lacking the advanced genre search powers of Apple TV. Failure to integrate with other Alexa devices also seems like a missed opportunity, with no way to synchronize music or tell Amazon Echo to start playing a video.

firetvsticktunes
Playing music on the Fire TV Stick is nice, but whole-home audio via Amazon Echos would be nicer.

Elevating the stick game

Amazon’s first Fire TV Stick got a middling review the last time we checked in on it, as rival devices had raised the bar for cheap media streamers.

The latest Roku Streaming Stick, for instance, is an excellent value at $50. Although it lacks Amazon’s high-quality apps and forward-thinking interface, and it doesn’t include a voice remote, it feels a lot faster than even the new Fire TV Stick. It also has some smart flourishes of its own, such as private headphone listening through your smartphone, and a dedicated “rewind 10 seconds with closed captions” button.

And then there’s Google’s $35 Chromecast. It hasn’t changed a whole lot, but it doesn’t need to. By offloading navigation to the apps on your phone or tablet, Chromecast enables cheap hardware without compromise. It also integrates with Google’s new smart-home device, Google Home, much more effectively than the Fire TV integrates with Amazon’s Echo. (We’ll have a hands-on review of the new 4K Chromecast Ultra soon.)

The second-generation Amazon Fire TV Stick does not mop the floor with its competitors, but its hardware has improved enough—and its user interface has advanced enough—to nudge the market forward while also making your purchasing decision a bit more agonizing.


Toshiba OCZ’s TL100 review: A budget SSD that’s not a bargain

box shot

There are two types of products that send the PCWorld lab into retest mode: those that perform better than expected, and those that perform worse. Toshiba’s 2.5-inch, TLC NAND-based, OCZ-branded TL100 definitely falls into the latter category. As a matter of fact, after seeing a sustained write speed of less than 100MBps, a far cry from the “up to 530MBps” you’ll see advertised, we started kicking tires with particular energy.

Three TL100s were tested on four separate PCs and three different operating systems (Windows 7, 8.1, and 10) to confirm that the bad write numbers weren’t somehow related to our MO or hardware. Results varied only slightly. The drive will write around 500MBps, but only for a few seconds; then speeds decline precipitously to the 100MBps level and stay there.
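To see why that pattern tanks large transfers, here’s a minimal back-of-the-envelope sketch. The cache size is a made-up figure purely for illustration; only the fast and slow rates come from our measurements.

    # Illustrative only: the cache size is an assumed figure; the 500MBps
    # and 100MBps rates are what we measured on the TL100.
    cache_gb, fast_mbps, slow_mbps = 3, 500, 100
    file_gb = 20
    secs = cache_gb * 1000 / fast_mbps + (file_gb - cache_gb) * 1000 / slow_mbps
    print(f"{file_gb}GB copy: {secs:.0f}s, {file_gb * 1000 / secs:.0f}MBps average")

With those numbers, a 20GB copy averages barely above the slow rate—the brief fast burst is a rounding error on a big file.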

drive

There’s no nice way to say this: Unless priced ridiculously low, buy something else.

HD Tune, CrystalDiskMark, and several other benchmarks not normally used were also run to make sure it wasn’t an issue with our usual go-to, AS SSD. CrystalDiskMark didn’t see a problem with the drive in its 1GB test, but beyond that, it was the same slow-sustained-write story.

Still faster than a hard drive

To be fair, read speed and read random-access have more to do with the apparent speed of a system than write speeds, because they are what you experience when the operating system and programs load. In that regard, while the TL100 isn’t the fastest SSD we’ve tested, it is indeed an SSD; replacing the hard drive in your system with one will make it seem like you just strapped a rocket to it. Okay, a bottle rocket. However, the first time you copy an even moderately large file to it, you’re going to wonder if you somehow broke it.

The TL100 might’ve gotten more love around here if it had hit the market at 10 or 15 cents per gigabyte. But it showed up at $45 for the 120GB version and $68 for the 240GB version, or around 38 and 28 cents per gigabyte, respectively. That’s inexpensive, but a dollar or two in savings is not enough to offset a four-fold drop in sustained write performance. Not nearly.
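The cost-per-gigabyte math is easy to check yourself; a quick sketch using the street prices quoted above:

    # Street prices quoted in this review; divide to get cents per gigabyte.
    for name, price, gigabytes in [("TL100 120GB", 45, 120), ("TL100 240GB", 68, 240)]:
        print(f"{name}: {100 * price / gigabytes:.1f} cents/GB")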

ocz tl100 enlarged
We were a bit stunned by the TL100’s performance: The preceding Trion 150 was a great improvement over the original Trion 100. This drive is two steps backward.

At first, because the write speeds declined so quickly, we suspected a lack of cache. When we consulted OCZ/Toshiba, the company explained that there is indeed cache in the drive, but it’s used differently than in previous drives in an attempt to smooth performance. It didn’t work, unless you consider the 60MBps to 100MBps we saw in our real-world tests a successful leveling.

Note that our real-world copy tests utilize compressed files that negate the minimal performance advantages offered by SSD controllers that compress data. That’s the beauty of AS SSD as well. This was a choice made several years ago to eliminate fantastical test results generated using highly compressible files that didn’t remotely mirror users’ actual experiences. Highly compressible files such as text or raw images comprise a very small percentage of most users’ data.

As you can see below, the TL100 is an okay reader. As mentioned, you will definitely notice a boost in performance if this SSD replaces your hard drive, just not as much as you would with similarly priced SSDs that read 15 percent faster and write 400 percent faster.

tl100 as ssd

So far, Samsung’s 750 EVO is the only TLC NAND drive that’s gotten it right. Longer bars are better.

20gb copies tl100

When it comes to writing large amounts of data, the TL100 is actually slower than some hard drives. Shorter bars are better.

This next chart was a real puzzler. We’ve never seen an SSD write access-time even remotely as slow as the TL100’s. This may have been an interaction between the TL100 and AS SSD. The drive handled operating system duties, which stress random access, just fine. We were still investigating at the time of this article.

access times tl100

Shorter bars are better.

TLC NAND

It would be nice if TLC always stood for tender loving care, and not triple-level cell (3-bit) NAND. At least when it comes to writing data. It’s inherently slower than MLC (multi-level cell/2-bit) and SLC (single-level cell/1-bit) at writing, but the TL100 is uniquely slothful. Samsung manages to wring a very respectable 400MBps out of its TLC on the 750 EVO, and Toshiba manages 240MBps with its Trion 150. The TL100’s 94MBps? Wow. And that’s not a good wow.
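The write penalty follows from how many distinct voltage states the controller has to program and verify per cell. A quick illustration of the scaling:

    # Each additional bit per cell doubles the number of voltage states a
    # NAND cell must hold, which is the root of TLC's slower write speeds.
    for name, bits in [("SLC", 1), ("MLC", 2), ("TLC", 3)]:
        print(f"{name}: {bits} bit(s)/cell -> {2 ** bits} voltage states")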

tl100 laying correct

Toshiba

It’s hard to believe Toshiba could put out an SSD this slow. But here it is.

Note that this is not a diatribe against TLC per se. Not in the least. If the performance shortcomings and cost savings are appropriately balanced, TLC can be a great thing. But so far, only Samsung’s 750 EVO has delivered sustained write performance commensurate with price, when compared with MLC NAND SSDs.

Also, this is not about whether you’ll get a boost in performance when you replace your hard drive with an SSD. Again, any SSD, even a slow one such as the TL100, will do that. This is about how much of a boost you get, and for how much.

Conclusion

Let’s keep this short and sweet. Unless you see this drive in a bargain bin for $20 (120GB) or $40 (240GB), don’t buy it. There’s no pleasure in saying that—in fact, it pains me to see what Toshiba is doing to the OCZ brand.

Instead of the TL100, brown-bag it for a day and get the Samsung 750 EVO ($70 for 256GB on Amazon) or a nice MLC drive such as OCZ’s VX500 ($97 for 256GB on Amazon) or the Transcend SSD370 ($93 for 256GB on Amazon). If you absolutely must go with the cheapest drive available, there are several like-priced TLC drives, including OCZ’s own Trion 150 ($75 for 240GB on Amazon) and Crucial’s MX300 ($70 for 275GB on Amazon) that offer significantly better all-around performance.


MSI GS63VR Stealth review: A game-changing amount of performance in a laptop

msi gs63vr beauty 3

There’s an old saying: You can have performance or portability in a gaming laptop, but not both.

Well, not anymore.

The MSI GS63VR Stealth is the first laptop I’ve seen using Nvidia’s GeForce GTX 1060, and I’m suitably impressed. In fact, I’m blown away by the amount of performance that can be squeezed into a laptop that’s actually portable.

How portable? On our postal scale, the laptop pushes just over four pounds sans power brick. Granted, it’s not as featherlight as an ultrabook, but the Stealth is right in the neighborhood of the svelte Dell XPS 15 and Apple MacBook Pro 15.

msi gs63vr mac book pro 15 xps 15 front

The MSI GS63VR Stealth is a little bigger than a MacBook Pro 15 (2013-2016) and Dell’s latest XPS 15, but it packs a much bigger punch with its GeForce GTX 1060 GPU.

TABLE OF CONTENTS
  • The build
  • Performance
  • Battery life
  • Room for improvement
  • Conclusion

The build

Inside the Stealth, you get an Intel 6th-gen quad-core Core i7-6700HQ paired with 16GB of DDR4/2133 RAM. For storage, our $2,099 review sample (available for $1,999 on Amazon) came with a 512GB Samsung SM951 SSD and 1TB hard drive. Who makes the hard drive? Do you care? The 15.6-inch monitor is a wide-viewing-angle 4K panel with a light anti-glare coating.

Before you start complaining that this system should have a quad-core Kaby Lake CPU, you should know 7th-gen quad-core Kaby Lake CPUs won’t grace computers until early next year, so this is the current state of the art.

The real star of the show is, of course, that GeForce GTX 1060. If you’re not caught up on current events, Nvidia has dropped the “M” from its laptop GPUs because it feels its mobile parts are the equal of their desktop counterparts.

For the most part that’s true, as you can see from this quick spec sheet I cobbled together: The mobile GPU has the same CUDA core count and memory bandwidth as its equivalent desktop chip. The main difference between the desktop and laptop GPUs is clock speed, where the notebook gives up a little bit.

gpus

Nvidia’s GeForce GTX 1060 for notebooks gives up a little clock speed but is otherwise identical to its desktop counterpart.

On the exterior, the laptop has the MSI “look and feel” but with plenty of gamer appeal. The backlit keyboard has three-zone lighting, and there’s a 10-key number pad, but it’s so small that anyone with big meat claws will find it challenging. The key action is okay, but not noteworthy. The same can be said of the piano-hinge trackpad, which feels metallic, but also works reasonably well.

msi gs63vr kb

The keyboard features a three-zone backlight but it isn’t particularly bright.

The shell itself is brushed magnesium-lithium. There’s a slight give when you squeeze the body but nothing that would appear to compromise the device.

For ports, you get three 5Gbps USB Type-A ports, one 480Mbps USB Type-A port, a Thunderbolt 3 port, HDMI 2.0, and a Mini DisplayPort. You also get gigabit ethernet via a Killer NIC controller, an SD card reader, a Kensington lock slot, and two analog audio jacks. The Wi-Fi is also a Killer NIC part supporting 802.11ac.

msi gs63vr felt

The entire bottom of the MSI GS63VR Stealth is covered with a suede-like material that actually makes it comfortable to use on your lap.

Performance

Since this is the first laptop I’ve received with a GeForce GTX 1060, I was very interested to see if it hit the bar Nvidia set for the desktop chip: equivalence with the GeForce GTX 980.

3DMark Fire Strike Extreme Graphics

The first test is 3DMark Fire Strike Extreme. I’ve broken out the graphics score, a subscore of the overall performance, to assess the GPU.

For comparison, I looked at the results of behemoth gaming laptops that weigh between 8 and 10 pounds. I also threw in an older Razer Blade laptop with a GeForce GTX 970M.

The Stealth outpaces the Razer laptop as well as the Acer Predator 17’s GeForce GTX 980M by a healthy margin. But it doesn’t quite catch the GeForce GTX 980 chip in the Acer Predator 17X. Still, it’s damned close.

msi gs63vr stealth 3dmark fire strike extreme graphics

Considering its weight and size, the Stealth does very well against big burly laptops.

Most people like to see the overall score too, so I’ve included the results here. Remember, the overall score factors in the CPU, so it’s not a surprise that we see the Acer Predator 17X with its Core i7-6820HK and better cooling push a little further ahead of the Stealth. And just to give you a sense of the full performance range, I’ve thrown in the overall scores I have from lower-powered GPUs as well.

msi gs63vr stealth 3dmark fire strike extreme overall

These scores include lower-end 15-inch laptops and show you the huge gulf between the bottom-end GPUs and the GTX 1060 in the MSI Stealth.

Tomb Raider

Tomb Raider is old at this point, but still a worthy real-world test if you want to see how a 3-year-old game performs on a laptop. The results are mostly weighted toward the GPU, but the benchmark does tend to like faster CPUs, too.

The results basically mirror 3DMark Fire Strike Extreme’s overall scores, with the GTX 1060 a step behind the GTX 980 in the Acer Predator 17X. Our tests were run at 1920×1080, which is the most common gaming resolution right now.

hp omen 17 gtx1070 tomb raider ultimate 19x10

Tomb Raider is about three years old at this point, but scales well with graphics and CPU performance.

Middle-earth: Shadow of Mordor

I also ran a newer game, Middle-earth: Shadow of Mordor, on the Ultra setting at 1920×1080, and found the Stealth to be exceptional. The Stealth’s GTX 1060 proves to be just as fast here as the larger and heavier Predator 17X’s once-mighty GTX 980. And no, this isn’t because the GTX 1060 has 6GB of memory versus the standard 4GB frame buffer of GTX 980 desktop cards. Like most big gaming laptops, the Predator 17X packs the 8GB version of the card.

This is all Pascal at work here, and it’s a good showing for the Stealth and GeForce GTX 1060 card.

hp omen 17 gtx1070 shadows of mordor 4k textures 19x10

You won’t be wanting for frame rates with the GeForce GTX 1060 in the MSI Stealth.

4K performance

Naturally, you’re probably wondering about 4K performance, what with the Stealth’s 4K panel. But as awesome as the GeForce GTX 1060 is, it can’t handle 4K gaming on the highest settings.

You can see this in the results for Shadow of Mordor at 4K resolution. I don’t have results for the Razer Blade at this resolution, nor for the Acer Predator 17, but I think it’s okay to excuse both the GeForce GTX 970M and GeForce GTX 980M from the room, because gaming at 4K is only for real GPUs.

Looking at the chart, you may think the GeForce GTX 1060 and GTX 960 can join the 970M and 980M for early recess too, since it’s clear neither are up to the task without turning down knobs. The good news for the Stealth is that the GTX 1060 runs just about dead even with the GTX 980. When you remember that the GTX 980 was the top dog in gaming laptops just a few months ago, that’s a hell of an accomplishment.

hp omen 17 gtx1070 shadows of mordor 4k textures 4k resolution

I’d say that expecting 4K gaming from the GTX 1060 is too much.

Rise of the Tomb Raider performance

One last gaming test comes from the fairly new Rise of the Tomb Raider game.

I ran it on the Very High setting, at 1920×1080. (I also have 4K results, but like Shadows of Mordor, it’s not pretty.) Here, the GTX 1060 lags much more significantly behind the GTX 980 in the Acer Predator 17X. Why? Looking back at the green comparison chart up top, I’d surmise that it’s due to the GTX 980’s massive CUDA core advantage. Even though Nvidia says the CUDA cores in the newer Pascal chips are 135 percent more efficient, it may not be enough to out-muscle sheer numbers.

The good news is the Stealth and GTX 1060 are basically capable of running this game at high-quality settings at around 60fps.

hp omen 17 gtx1070 rise of the tomb raider very high 19x10

The GTX 980’s raw CUDA core advantage may give it a big step up over the GTX 1060.

Handbrake encoding performance

The vast majority of our tests focus on the thing that matters the most for gaming: the graphics card. However, since people actually do things outside of gaming on their computers, I pitted the Stealth against the thicker, beefier laptops in our Handbrake encoding test.

This benchmark uses the free and popular Handbrake encoder to convert a 30GB 1080p MKV file into a smaller MP4 using the Android Tablet preset. Handbrake hearts CPUs, and you see improvements as you add more cores and more megahertz.
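If you want to replicate the test at home, the invocation looks roughly like the sketch below. The input path is hypothetical, and the preset name is the one quoted above; check the preset list of your HandBrakeCLI version before running it.

    import subprocess

    # Rough equivalent of our Handbrake benchmark: transcode a 30GB 1080p
    # MKV to MP4 using the Android Tablet preset. Input path is hypothetical.
    subprocess.run([
        "HandBrakeCLI",
        "--preset", "Android Tablet",  # preset name taken from the article
        "-i", "source_1080p.mkv",
        "-o", "encoded.mp4",
    ], check=True)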

As the thin Stealth uses pretty much the same Skylake quad-core as just about every other gaming laptop we’ve seen all year, performance is, well, about the same. The fastest of the bunch is the Origin EON17-X with its overclocked desktop CPU, but for the most part, the Stealth can hang with the rest of its competition just fine.

hp omen 17 gtx1070 handbrake 0.9.9 encode

Battery life

Typically we don’t care about battery life on burly gaming laptops. That’s because the moment you crank up a 130-watt GPU to play a game on a battery, you’ll be lucky to get an hour out of it. Really, the battery in gaming laptops is there so you can unplug it and move from the kitchen to another room where you plug it in again.

With the Stealth though, we’re now in an interesting world. It’s a “true” gaming laptop that lets you run modern games on Ultra at 60fps, but at four pounds, it’s also a laptop you can use on the move.

To see how the Stealth does in battery life, I used our standard video rundown test, where we play a 4K-resolution file using Windows 10’s Movies & TV player. The screen is set between 250 nits and 260 nits, which is a good brightness for a daytime office or home setting, and we plug in a set of earbuds and leave audio on.

Rather than compare it to the big gaming laptops, which very few will regularly bring onto an airplane, I pitted the Stealth’s battery life against that of the Dell XPS 15 (which has a smaller battery), and a Samsung Book 9 Pro. Both are similar in size to the Stealth and also pack quad-cores and discrete graphics (albeit with far lower performance). For a reference, I also included HP’s Spectre X360 15T, quite a different beast with a dual-core and integrated graphics.

stealth

Gordon Mah Ung

The MSI GS63VR Stealth uses a lithium-polymer battery and features but two small speakers.

The Stealth’s results revealed pretty mediocre battery life, even though the Stealth uses Nvidia’s Optimus, which shifts the workload over to the integrated graphics for lighter duties such as this test. For comparison, the Samsung Book 9 Pro also has a 4K-resolution screen (higher-resolution screens use more power), but its battery is more than 10 percent smaller than the MSI’s.

For what it’s worth, this battery performance was also generated with the laptop in its “eco” mode. I saw about half an hour less runtime when it was in “sport” mode with the keyboard backlight on.

Basically, expect around three or four hours of battery life in general use. You can get more, but you’ll have to crank down the brightness. The best option for those chasing better battery life might be to opt for the 1920×1080 version of the Stealth.

msi gs63vr stealth battery life

The Stealth’s battery life isn’t great, but it’s not as bad as you’d expect either.

Room for improvement

As much as the GTX 1060-equipped Stealth is a game changer, there’s plenty that’s far from perfect.

First, the panel. Its colors just don’t pop, and the screen isn’t terribly bright overall. Just to hit the minimum brightness for our video rundown test, I had to run the Stealth’s brightness at 100 percent.

The 4K resolution panel also doesn’t support G-Sync (Nvidia’s adaptive refresh technology). This isn’t all MSI’s fault though, as G-Sync currently doesn’t work with Optimus—and if MSI hadn’t implemented Optimus, you’d likely get worse battery life, since the discrete GPU would run all the time. That said, trying to push gaming at 4K without G-Sync is pretty futile on a GTX 1060 (it isn’t even a great idea on the next-step-up GeForce GTX 1070).

Since you don’t have the horsepower to play at 4K UHD resolution, should you bother with the 4K panel? Especially if the panel doesn’t zing like other 4K panels we’ve seen? Nope. Opt for the 1920×1080 version instead, so you can run everything at or near Ultra settings.

Another weakness is the Stealth’s audio. It consists of two small bottom-firing drivers in the front of the laptop. There’s just no low- or midrange to them, and when they’re cranked up, they’ll occasionally rattle.

Conclusion

Despite its drawbacks, I’m still blown away by what the MSI GS63VR Stealth delivers: a 4-pound laptop that can play the newest games with settings at or near Ultra. That’s a tectonic shift in portable gaming capability, something we’ve just never seen before.

Previously, the best GPU we could expect in a laptop this thin was Nvidia’s GeForce GTX 970M. The GTX 1060 in the Stealth is at least 50 percent faster than that. That means that with the MSI GS63VR Stealth, you can finally have your portability and your performance at the same time.



SpotCam HD Eva review: The security features are good, but the app isn’t

book shelf eva

SpotCam is a relatively recent entrant in the DIY security sweepstakes, offering a trio of HD (720p) indoor and outdoor cameras. The $160 SpotCam HD Eva looks to muscle in—perhaps literally, given its bulky body—on the crowded Dropcam/Nestcam space with attractive features such as 24-hour continuous recording and motorized pan/tilt. A few peccadilloes, however, keep it from the top tier.

You can forget about hiding the SpotCam HD Eva in plain sight—there’s nothing inconspicuous about this camera; it’s easily the bulkiest we’ve reviewed so far. The head unit is bigger than a tennis ball—which is typically the largest comparative size we see on DIY security cams—and the base is nearly as big. In fact, it almost looks as if they stacked two of the same cone-shaped modules one on top of the other.

Inside that ample head is an IP camera ringed with 18 high-power infrared LEDs for night vision. The camera has a 110-degree field of view, which should be enough to take in most rooms.

In the event it’s not, you can take advantage of the SpotCam Eva’s pan-and-tilt functionality. Using the SpotCam mobile or web app, you can pan the camera 360 degrees—180 degrees in either direction—tilt it up 50 degrees, or tilt it down 20 degrees. Motorized pan-and-tilt is also a nice feature for tracking pets or (hopefully not) intruders as they move through your home.

SpotCam HD Eva also includes a speaker and microphone, so you can communicate remotely with family members or pets at home, or scare off burglars with your voice. And, like all SpotCams, the Eva can be used with IFTTT recipes to connect with other smart-home devices, including Philips Hue lighting, Nest Labs’ Nest Protect smoke/CO detector, and Amazon’s Alexa digital assistant.

eva front

SpotCam

SpotCam HD EVA boasts many desirable features, but its pan-and-tilt is hampered by clunky app control.

All surveillance footage is recorded to SpotCam’s cloud servers, and buyers get the SpotCam Live plan, which stores a rolling 24 hours of recordings for free. To store more, you’ll need to purchase one of the SpotCam NVR packages: 3-day recording for $3.95 per month or $39 per year, 7-day recording for $5.95 per month or $59 per year, or 30-day recording for $19.95 per month or $199 per year.
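If you’re weighing monthly versus annual billing, the yearly plans work out cheaper; a quick check with the prices above:

    # SpotCam NVR plan prices from this review: (monthly, yearly) in dollars.
    plans = {"3-day": (3.95, 39), "7-day": (5.95, 59), "30-day": (19.95, 199)}
    for name, (monthly, yearly) in plans.items():
        print(f"{name}: paying yearly saves ${monthly * 12 - yearly:.2f}")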

Setup and Usage

Setting up SpotCam HD Eva isn’t difficult, but it is clunky. Using the SpotCam mobile app, I first had to connect the camera to SpotCam’s own Wi-Fi, then toggle over to my phone’s settings to find my Wi-Fi network. Once I entered my Wi-Fi password, I had to flip a switch on the back of the camera to put it in “client” mode before toggling back to the SpotCam app to wait for it to connect.

Confirmation that this rigmarole worked is a still shot from the camera’s live feed displayed on the SpotCam app’s home screen. Tapping this picture takes you to the live feed itself.

The SpotCam app is clean and intuitive, if a little unpolished. Directly beneath the live-feed window are icons for turning audio on and off, capturing still photos of the feed, and a 30-second rewind. There’s also a button for sharing access to the camera in case you want multiple family members to peek in on the homestead. Tapping this opens a window where you enter the party’s email address, customize a couple of access settings, and add an invitation message.

book shelf eva

SpotCam

SpotCam HD Eva’s size makes it tough to conceal.

The bottom half of the screen features a horizontally scrolling timeline and icons for viewing a chronology of detected sound and motion events, triggering the microphone, opening the settings tab, and getting back to the live stream.

Both live and recorded video were sharp and vibrant during my use. Even in night-vision mode, the image had good contrast with plenty of detail. Many wide-angle lenses suffer from distortions around the edges of the image, but the Eva’s showed little warping.

You control the camera’s pan-and-tilt feature by swiping horizontally or vertically in the desired direction on the live feed image. This operation is hardly precise. In theory, the length of your swipe should determine how far the camera pans or tilts (the app displays the degree of rotation). But sometimes the slightest swipe would send the camera rotating the full 180 degrees, while what seemed like a longer swipe would barely budge it. Other times, the app wouldn’t respond to swiping at all.

spotcam app
All detected motion and audio events are easily accessible from a chronological log.

The SpotCam HD Eva can detect both motion and sound—the latter is always a bonus as you’re still getting some protection if something is happening outside the camera’s field of view. Detection alerts were timely and accurate in my tests. The SpotCam app offers a fair amount of customization to reduce the incidence of false alarms: You can set motion and audio sensitivity to low, normal, or high, and opt in to be alerted to motion or audio events, or both. You can also schedule alerts to be deactivated during certain hours or days, which means you won’t need to unplug the camera when you’re home.

When an event is detected, it’s added to the event log. You just tap the event icon and select the event to see video of the triggering action. You can also access recorded events from the live feed screen by sliding through the timeline, but pinpoint accuracy is tough on a phone’s small screen. Also, sometimes trying to view videos this way caused the app to close and reboot.

Although you can view recorded videos on the SpotCam mobile app, you can’t download them. For that you need to log in to the SpotCam web portal (which has all the same features as the app) and create a video from your surveillance clips. Once you select a period of time, you can opt to make a “normal” or time-lapsed video. The result is saved to a tab called My Film on the portal, from which you can download the video to your local drive.

Bottom Line

The SpotCam HD Eva has most of the makings of a standout home security camera; unfortunately, some shortcomings in execution hinder its performance. Its bulky, nondescript design is curious considering how much emphasis other manufacturers—MyFox and Netatmo in particular—have put on the aesthetics of their products. The clumsy pan-and-tilt operation diminishes what should be a standout feature, and the need to “create” a video before downloading feels like extra work.

Maybe these things will be improved down the line. If you can overlook them in the meantime, the SpotCam HD Eva still offers solid security for a reasonable price.


Intel Kaby Lake Review: What optimization can do for a 14nm CPU

kaby lake

When Intel’s Kaby Lake CPU arrived at our doorstep in the form of Dell’s XPS 13 laptop, it was wrapped in foreboding. Hardware fans have long been in denial about the inevitable end of Moore’s Law. With Kaby Lake, Intel’s abandoned its relentless “tick-tock” march in favor of a slower “process-architecture-optimize” stroll. Semiconductor doomsday seemed nigh.

Kaby Lake is the first CPU produced under Intel’s new plan. Under the old cadence, a “tick” was a die shrink (22nm Haswell to 14nm Broadwell) and a “tock” was a new architecture on that process (14nm Broadwell to 14nm Skylake). Intel wasn’t ready to shrink the process again, though. Instead we got a third 14nm step, the “optimized” Kaby Lake. The question is whether there really is much improvement from Skylake to Kaby Lake, or whether Intel is just stalling while it looks for a way to stretch Moore’s Law even further.

I can say now that it’s a decent step forward. To find out just how much you get and where it’s from, you’ll have to read on.

kaby lake wafer new

Gordon Mah Ung

Intel’s 7th-gen Kaby Lake is built on a 14nm process similar to that of Skylake CPUs, but manufacturing tweaks give it a performance boost, the company says.

How we tested

A laptop CPU is inherently more difficult to review than a desktop CPU. You can’t isolate a laptop CPU like a desktop CPU—it’s part of a complete package. Each laptop vendor makes different performance decisions for thermals, power and weight, so it can be hard to suss out the CPU’s impact unless the laptops are exactly the same. Otherwise, you’re really comparing laptop A with laptop B.

Fortunately, I had access to three generations of Dell’s excellent XPS 13. They’re not identical, of course, but they’re close enough that the comparison has some validity. I purposely selected tests that will minimize those differences.

Meet the XPS 13s

dsc01005

Gordon Mah Ung

Kaby Lake, Skylake, and Broadwell power these three generations of XPS 13 laptops.

The newest XPS 13 costs $1,100. It’s equipped with a 7th-gen Kaby Lake Core i5-7200U; 8GB of LPDDR3/1866 in dual-channel mode; a 1920×1080, non-touch, IPS panel; and a 256GB Lite-On NVMe M.2 SSD.

I reviewed its predecessor last year. It has a 6th-gen Skylake Core i5-6200U, 8GB of LPDDR3/1866, a 1920×1080, non-touch, IPS panel and a 256GB NVMe M.2 Samsung SSD.

Then there’s the original XPS 13, configured with a 5th-gen Broadwell Core i5-5200U. Because it was the entry-level model, it had 4GB of DDR3L-RS/1600 in dual-channel mode and a 128GB M.2 SATA drive. It also had a 1920×1080, non-touch IPS panel.

Prior to testing, all three laptops were reset, and all three were updated to the same Windows 10 version (1607, Build 14393.222). All three also had the latest drivers and BIOSes applied from Dell’s support site.

If your PC world tends to swing desktop, know that we’re looking only at laptop Kaby Lake performance here. Desktop CPUs (and quad-core laptop chips) are due early next year. Consider what you see here a preview of what you might see with the upcoming desktop CPUs.

xps 13s

Gordon Mah Ung

Three generations of XPS 13 laptops: On the left is the 1st-gen Broadwell. On the right is 2nd-gen Skylake, and in the middle is the latest: A 3rd-gen Kaby Lake version.

Three 14nm CPUs enter a bar…

Unlike previous generations, Intel now has three CPUs built on its 14nm process. I’ve lined up the comparative details from Intel’s ARK page. I’ve also included the Core i7-6560U chip used in the gold XPS 13 that occasionally appears in my benchmark charts.

If you look at the specs below, you can find Kaby Lake’s main performance advantage: clock speed. Kaby Lake cores are, for the most part, identical to Skylake cores, but by massaging the 14nm process, Intel squeezed out a CPU that can hit higher clock speeds. For example, where the 6th-gen Skylake Core i5-6200U maxes out at 2.8GHz, the 7th-gen Kaby Lake Core i5-7200U can hit 3.1GHz.
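As a quick sanity check, the turbo-clock deltas line up nicely with the benchmark deltas you’ll see below (max turbo clocks per Intel’s ARK listings):

    # Max turbo clocks (GHz) for the three XPS 13 CPUs in this comparison.
    clocks = {"Core i5-5200U (Broadwell)": 2.7,
              "Core i5-6200U (Skylake)": 2.8,
              "Core i5-7200U (Kaby Lake)": 3.1}
    kaby = clocks["Core i5-7200U (Kaby Lake)"]
    for name, ghz in clocks.items():
        print(f"vs {name}: Kaby Lake clocks {100 * (kaby / ghz - 1):.0f}% higher")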

So what does that 10 percent increase in clock give you? Let’s find out.

kaby sky broad

Cinebench R15 multi-threaded performance

First up is Cinebench R15. This test is based on Maxon’s real-world rendering engine and is a pure CPU test. That roughly 10-percent clock speed advantage the current Core i5-7200U has over the prior Core i5-6200U results in roughly a 10 percent bump. Against the older Broadwell-based Core i5-5200U, the gap widens to 20 percent.

As for the gold XPS 13, you’d think its Intel Core i7-6560U and Iris 540 graphics core, with the ability to hit a maximum of 3.2GHz, would prove slightly faster than the Kaby Lake version, but it doesn’t. In fact, it’s slightly slower. I think the Kaby Lake Core i5, with its improved process, is able to run at its highest clock speed for longer than the Skylake Core i7. That’s another testament to Intel’s “optimize” step.

dell xps 13 kaby lake cinebench vs skylake broadwell

The 7th-gen Kaby Lake Core i5 unit is, surprisingly, a smidge faster than the Core i7.

Geekbench 4.01 performance

While Maxon’s Cinebench R15 is based on a 3D rendering engine the company sells, Geekbench is a synthetic test that tries to simulate what it believes are real-world workloads. The test isn’t without controversy, but much of that is due to Primate Labs’ attempts to make it a usable cross-platform tool that would let you compare an iPad Pro to a PC.

When you get involved in the quasi-religious wars between tech tribes, expect mudslinging. I won’t re-litigate the politics of it here, but the good news is Geekbench 4.01 is completely new, and Primate Labs seems to have taken much of the criticism to heart.

Geekbench 4.01 includes cryptography, integer, floating-point, and memory tests. As I didn’t have Geekbench 4.01 results for the gold XPS 13, I’ve omitted that laptop. The result is an unsurprising 11 percent performance difference over Skylake. Moving on to the Broadwell XPS 13, Kaby Lake shows a very respectable 24 percent improvement in performance. That pretty much echoes what I’m seeing elsewhere.

dell xps 13 kaby lake geekbench 4 multi threaded performance

Geekbench 4.01 shows the 7th-gen Kaby Lake also ahead of the 6th-gen Skylake by a reasonable margin.

Handbrake Encoding Performance

Moving on to our final CPU-centric test, we use the free and popular Handbrake encoder to transcode a 30GB 1080p MKV file using the Android Tablet preset. The test is mostly CPU-limited and loves multiple cores.

On most ultrabooks, we use it as both a performance test and a thermal soak test, to find out what happens to a laptop when it’s forced to heat up the CPU for almost two hours. Typically, this is where you see the limits of the laptop’s cooling system: Performance often falls off as the laptop heats up—or the fans get loud. Historically, Dell’s XPS 13s have done really well in this test, because Dell generally isn’t afraid to make a little noise in favor of performance.

The results for the 5th-gen Broadwell and 6th-gen Skylake are as expected, with the 7th-gen Kaby Lake coming in about 11 percent faster than the 6th-gen chip and about 24 percent faster than the 5th-gen. As for the gold XPS 13 with its Skylake Core i7-6560U: Despite a higher maximum clock speed of 3.2GHz, the Core i7 actually finishes slightly behind the Core i5 unit.

This seems to back up my suspicion that the Kaby Lake Core i5, with its “optimized” process, can sit at higher clock speeds far longer than the Skylake Core i7 chip.

dell xps 13 kaby lake handbrake vs skylake broadwell

It’s no surprise that the Kaby Lake CPU aces both its Skylake and Broadwell siblings by a reasonable amount. The surprise is beating the Core i7 Skylake chip.

3DMark Cloud Gate Graphics performance

I didn’t get too heavy into the gaming performance of Kaby Lake because, as on the CPU side, it’s very similar. The Kaby Lake chip has Intel HD 620 graphics, with 24 execution units and clock speeds of 300MHz to 1,050MHz. That probably sounds pretty similar to what you got out of Skylake’s Intel HD 520, which has 24 execution units and a clock speed range of 300MHz to 1,050MHz.

To gauge the performance of the laptops, I used Futuremark’s 3DMark Cloud Gate. It’s a synthetic benchmark, but its value lies in being a neutral title that doesn’t favor any particular vendor’s optimizations. The Cloud Gate test is well suited for integrated graphics gaming, too.

In 3DMark, Kaby Lake comes in just 7 percent faster than the comparable Skylake chip. Over Broadwell’s Intel HD 5500, though, you get a very respectable 27 percent difference. And based on the 3DMark Cloud Gate score from the gold XPS 13 with its Core i7-6560U chip, you can see where the Iris 540 graphics and its embedded 64MB of eDRAM pay real dividends.

One thing that should be said: Intel says Kaby Lake graphics can play some eSports-level games at 30 fps with resolutions and settings turned down. It’s nice that you’re getting near Iris-level performance in a standard CPU, but if games are going to be something of a focus in your laptop, get one with a discrete graphics chip (and with a Kaby Lake chip too).

dell xps 13 kaby lake 3dmark cloud gate vs skylake and broadwell

Using the graphics test of 3DMark Cloud Gate, Kaby Lake is faster than Skylake, but not by quite the same margin as in the CPU tests. The 64MB eDRAM cache in the Core i7 also pays off handsomely.

Battery life is better, maybe

One of the more difficult things to measure in a laptop is the battery consumption of different CPUs. Ideally, you’d test a laptop, then swap the CPU out and test again. That world doesn’t really exist anymore, because all of Intel’s mobile CPUs are soldered to the motherboard.

The best-case scenario is what I have today. It’s not perfect but still an interesting comparison to make.

In the interest of full disclosure, though, I should mention there are some key differences. Dell uses cells that are physically the same size, but denser batteries have increased the capacity. The Skylake-based XPS 13 has a 57.5 watt-hour (57,532mWh) battery, while the Kaby Lake-based XPS 13 has a 60 watt-hour (59,994mWh) battery.

The other key difference, which may matter more, is the SSD. Both have 256GB SSDs, but the brands and rated power consumption are different. The Skylake unit has a Samsung PM951 NVMe drive, which can use up to 4.5 watts, while the Kaby Lake unit has a Lite-On CX2, which absolutely sips power at 1.32 watts under load. What’s not clear is whether the Samsung drive uses a full 4.5 watts while being read from, or if that’s only a worst-case scenario. Ideally, both would have the same drive, but that’s out of my hands today. At least both pack 8GB of LPDDR3/1866.
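For what it’s worth, the battery-capacity gap alone is modest next to the runtime gap reported below:

    # Battery capacities quoted by Dell, converted from mWh to Wh.
    skylake_wh, kaby_wh = 57.532, 59.994
    print(f"Kaby Lake pack is {100 * (kaby_wh / skylake_wh - 1):.1f}% larger")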

For our rundown test, I stuck with our standard workload, which is to play the open-source Tears of Steel using Windows 10’s Movies & TV player with the brightness set to around 255 nits. Audio was on, using a pair of earbuds, and the tests were conducted in airplane mode.

The result? The Kaby Lake chip in the new XPS 13 gives you about an hour more battery life than the Skylake version. Because these are complete laptops, I can’t separate the CPU’s role from the battery’s, but at least we’re not going backwards.

dell xps 13 kaby lake 4k avc tears of steel 255 nits battery life

The Kaby Lake XPS 13 delivers slightly better battery life than the Skylake XPS 13, but other factors contribute to that outcome.

Greatly improved video engine

You know Kaby Lake’s magic on the CPU side is mostly a clock speed bump. Graphics? Another bump. The big upgrade is to the video engine, where Intel added a wealth of hardware support for such things as VP9 at 4K resolution and 10-bit color support for HEVC.

To find out the practical upshot, I used an Intel-supplied 10-bit HEVC file at 4K resolution. The file was actually the same open-source Tears of Steel video, but encoded at a 10-bit color depth.

Using the same settings as the previous battery rundown test, I wanted to see how much that hardware support for 10-bit HEVC mattered. As it turns out, it matters a lot.

dell xps 13 kaby lake 4k 10 bit hevc tears of steel 255 nits battery life

Battery life during playback of 10-bit HEVC video barely dented the Kaby Lake XPS 13, thanks to its hardware support. I can’t say the same for the Skylake XPS 13.

As you can see, the Kaby Lake XPS 13’s battery life is just over 10 hours. On the Skylake XPS 13, you take a yuge hit with Skylake tapping out at just under three hours. Ouch.

Kaby Lake’s dedicated hardware for playing back 10-bit HEVC on the graphics cores means the CPU can sit idle most of the time during playback. Skylake doesn’t have the same video hardware support, so decoding of that 10-bit content is shifted to the CPU cores, which must run at far higher speeds to process it. The more you use the CPU, the more power you use.

As a comparison test, I downloaded a pair of videos encoded in HEVC at 10-bit color depth from jell.yfish.us. The files are provided so people can test network streaming performance.

I didn’t stream them, though; I just played the files directly from the SSD using Movies & TV. In this first screenshot you can see the Kaby Lake XPS 13 playing the 1080p version of the 10-bit HEVC file. Because the GPU is doing all of the work, the CPU is basically idling.
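You can reproduce this observation yourself. A minimal sketch (using the third-party psutil package, my own substitute for eyeballing Task Manager, not anything used in the original testing) that samples CPU load while the video plays:

    import psutil  # third-party package: pip install psutil

    # Sample overall CPU utilization every 5 seconds for a minute while the
    # 10-bit HEVC file plays. Hardware decode should read near idle.
    for _ in range(12):
        print(f"CPU utilization: {psutil.cpu_percent(interval=5):5.1f}%")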

jelly 1080 kabylake 10bit

With hardware acceleration of 10-bit HEVC video, the 7th-gen Kaby Lake Core i5-7200U is buttery-smooth playing both 1080p and 4K video.

In the next screenshot, you can see the same file being played on the Skylake XPS 13 in Movies & TV. It’s pretty apparent what’s using all the power on the Skylake XPS 13: The CPU is working hard to decode that 10-bit content, with both cores running at 2.7GHz during playback.

What’s worse: The Skylake XPS 13 constantly drops frames even with the CPU running at near top speed. To put a finer point on it: The Kaby Lake XPS 13 can play the file at 4K resolution with 10-bit color HEVC and still cruise, while the Skylake XPS 13 can barely play the 1080p version. Ouch.

Intel says you can expect similar results for Google’s VP9 video in the Chrome browser. VP9 is the codec Google uses to encode all videos on YouTube. I tried replicating Intel’s tests using Chrome on YouTube, but I didn’t have the network bandwidth at home to reliably stream two 4K VP9 videos simultaneously (one to each laptop). I see no reason to doubt Intel’s claims, though, as the results with 10-bit HEVC content are crystal-clear.

One thing that should be pointed out: 10-bit content is very rare today. One area where I think it does matter, though, is HTPC use. If I were building or buying a machine to run my 4K-and-up television, I’d want Kaby Lake over Skylake in a heartbeat.

jelly1080p skylake

This 1080p HEVC file encoded with 10-bit color depth will drag a Skylake Core i5-6200U to the ground.

Conclusion

In the end, Kaby Lake is a decent step forward. Is it as exciting as reading about a 10nm-based CPU? No. But it does deliver a margin of improvement that’s in line with the evolutionary steps we’ve seen from the last few generations of chips. Haswell to Broadwell gave us maybe 10 percent. Going from Broadwell to Skylake yielded similar results.

Does that mean you should dump your Skylake laptop and rush out to buy a Kaby Lake laptop? No, not at all. However, when you’re looking at a 20 to 25 percent difference between Kaby Lake and Broadwell, you start to wonder. By the time you get back to Haswell, where the gap probably opens up to 30 to 35 percent, or to Ivy Bridge and even Sandy Bridge, then yes, an upgrade would be a game-changer.



Virginia review: This detective thriller adapts the language of film to games

Virginia

When I reviewed Event[0] last month I said, “This might just be one of the most important indie games of the year.” And then along came Virginia.

Like Event[0], Virginia belongs to a growing subset of experimental games, ones focused on expanding the language of games as much as—or sometimes more than—commercial appeal or approachability.

Whether or not their individual experiments are successful, I feel these games are important to experience if you care about the growth of the medium—not least because you can expect their better ideas to be “borrowed” by the bigger-budget games of the future.

Lynchian

Long preface aside, I don’t think Virginia is a particularly successful experiment.

You play as FBI Agent Anne Tarver, a recent graduate of the academy who’s ostensibly assigned to work a missing persons case. This is pretense, though. Your main charge is your partner, Maria Halperin, who you’re investigating as part of an internal affairs matter.

Never before, not even in the heyday of ‘90s full motion video, have film and games drawn so close together. What initially purports to be some sort of detective game is essentially a two-hour short film where the player is occasionally required to click the mouse.

It raises questions about interactivity. How much agency do you need to give the player for a character to resonate, for a game to instill that very particular sort of empathy unique to this medium? We keep pushing this idea further and further.

“Okay, Telltale games are sort-of like a long, semi-interactive movie but controlling the dialogue seems to keep players interested and build empathy. Can we go further? What about a game where you just walk around a family’s home? That still works? Okay, let’s go further still.”

Virginia dispenses with even the trappings of a game in the traditional sense, instead co-opting the visual language of film and television. Letterboxing, sure. Thirty frames per second, sure. But mostly I mean its heavy reliance on jump cuts and match cuts—a nod to Thirty Flights of Loving in the end credits is almost unnecessary, as it’s clear where Virginia’s inspirations lie. You’re in the middle of some action, and then suddenly you’re teleported through time and space to the completion of that action.

So a long drive through the countryside is instead presented as three short clips, the sun arcing across the sky and the scenery changing outside the windshield. Or a walk to the basement is ten seconds of a hallway, ten of you walking downstairs, and then ten of wandering the basement itself.

No matter that we’ve abridged time and space. The cut is a standard part of film, excising the boring bits to tell a more concise story or draw the audience’s attention to matters it might not notice otherwise. With over a hundred years of film behind us, we’ve grown accustomed to the jump cut as it exists in filmed media.

But it’s decidedly non-standard in gaming, and its use in Virginia is somewhat novel, allowing for trippy dream sequences and a larger sense of scope than could be handled if the player had to physically travel to each location. A whole town can be rendered in a few key points.

The downside is that it undercuts the very empathy games are so brilliant at establishing. Despite some criticism from vocal parts of the games community, the so-called “walking simulator” genre has mostly been successful in its own way, I think because there’s something psychologically satisfying about controlling even something as basic as a character’s movement, about charting a space on your own terms.

Virginia excises that last bit of agency, aiming both to railroad players through a story and to disorient them. As a result, it’s hard to feel any attachment to the characters or the town of Kingdom, Virginia.

An aside: It could be argued that this is done on purpose, as in Bertolt Brecht’s distancing effect. From this viewpoint, Virginia purposefully pushes the audience away emotionally in order to focus it intellectually on the injustices of our political system, or America’s relation to race/gender. This is a possibility, but I think it’s more likely that Virginia tried to instill true emotional empathy with its characters and failed by nature of the game’s structure.

And that’s partially because the language of film isn’t really translated into Virginia. Superficially, it works: There are film-esque jump cuts, and they make for an intriguing visual style. But we don’t get the benefits of other important film techniques—different shot types, framing, the other elements that make up a coherent scene.

The lack of dialogue certainly doesn’t help either. I actually love Virginia’s lo-fi style, and think the game creates some strong environments out of fairly simple tools. But the lack of dialogue puts a larger burden on Virginia’s animations—particularly its faces—to communicate emotion to the player, and that’s difficult when characters have all the expressiveness of a wooden doll.

And then there are genre concerns. I think a detective story was perhaps Virginia’s gravest error, if only because it implies some sort of investigation or—dare I say it?—interaction on the part of the player. Even in mystery novels, half the appeal is trying to solve the story before the characters. What often makes a good mystery is feeling like all the pieces were there, if only you’d been able to put them together.

Virginia’s mystery is set dressing, though, and is of secondary concern to a more character-centric vein of storytelling layered on top. That’s not bad in and of itself, but in a game already set on subverting so many player expectations it’s just another layer of abstraction between the audience and the game’s emotional core.

But as I said up top, I think Virginia will prove to be one of 2016’s most important indie games even if it’s not entirely successful, and I meant it.

Games are just starting to experiment with the language of film in an artistic way, rather than as mere utility (i.e. cutscenes). I expect that trend to continue, and Virginia adapts film editing in a far more audacious manner than Thirty Flights of Loving or even Quadrilateral Cowboy.

Virginia also makes extensive use of symbolism—still a rarity in games, which is baffling considering its importance in storytelling and its pervasiveness in other mediums. Virginia isn’t especially sophisticated in its approach: Its symbols are rather ham-handedly presented to the audience, and ad nauseam.

Nevertheless, there’s something to be said for a game that works in subtext, that encourages a deeper reading of the story beyond “What you see is what you get.” Virginia’s recurring bison and bird motifs may be overdone by the standards of film/TV, but their mere presence is still sophisticated vis-à-vis stories in video games.

Bottom line

I don’t like Virginia, nor do I think it’s successful in accomplishing what it set out to achieve, but I do think someone’s going to look at it and be inspired, and maybe that person will make something great. Or maybe the person after that.

And so we throw Virginia into the growing pile of games I think are important for people who care about the avant-garde in gaming—alongside The Beginner’s Guide, Papers, Please, Event[0], Soma, and others. These games are almost all flawed in some way, and our scores often reflect that fact, but they’re important. They broaden the way we discuss games, even as they change what games discuss.

Source: GSMArena

Akitio Thunder3 PCIe SSD review: Blazingly fast external storage for rich kids

If you’d walked by our test bed while we were testing Akitio’s Thunder3 PCIe SSD external drive, you’d probably have stopped and done a double-take. 2GBps transfers from an external drive? No way.

How could this possibly be, you ask? The Thunderbolt 3 interface is basically PCIe over a wire, and features a massive 5GBps transfer rate. The Thunder3 PCIe SSD is a classy-looking Thunderbolt 3 enclosure that contains a single PCIe slot. Put a 1.2TB Intel 750 NVMe SSD in said slot, and for all intents and purposes, it’s the same as putting it in an internal PCIe slot. 2GBps? Thunderbolt 3 isn’t even breaking a sweat.
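
That 5GBps figure isn’t magic; it falls straight out of Thunderbolt 3’s 40Gbps line rate. A quick sketch of the arithmetic (protocol overhead ignored):

```python
# The 5GBps figure is just Thunderbolt 3's 40Gbps line rate divided by
# 8 bits per byte (protocol overhead ignored for simplicity).
line_rate_gbps = 40
ceiling_gbytes_per_sec = line_rate_gbps / 8
print(f"Theoretical ceiling: {ceiling_gbytes_per_sec:.0f}GBps")

observed_gbytes_per_sec = 2  # the transfer rate we saw from this drive
share = observed_gbytes_per_sec / ceiling_gbytes_per_sec
print(f"A 2GBps drive uses about {share:.0%} of the link's raw capacity")
```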

Alas, this marvelous demonstration of advanced technology isn’t cheap. At $1,299 ($1,270 on Amazon), which includes the $800-plus drive, it’s not even within shouting distance of affordable. But dang, if its speed isn’t enticing.

Hardware

The silver-hued Thunder3 PCIe SSD measures approximately 9.2 inches deep by 6 inches tall by 3 inches wide. That’s large, but necessary to accommodate all types of PCIe cards, including full-height and double-width models. In this case, the card is the single-width, low-profile Intel 750, which means more room for airflow to keep it cool. The back of the drive sports two Thunderbolt 3 ports, as well as what the online specs say is a single DisplayPort 1.1 port.

The Thunderbolt 3 chip is an Intel Alpine Ridge, which supports DisplayPort 1.2, but Akitio informed us that there have been issues with some 1.2 displays, so it’s sticking with DP 1.1. Note that Thunderbolt 3 uses the same Type-C connector as USB 3.x, but requires a more expensive cable with embedded electronics. A Thunderbolt 3 cable is included with the drive. And no, the Thunder3 PCIe SSD won’t work with USB 3.1.

If your laptop features Thunderbolt 3, it’s quite possible that the Thunder3 PCIe SSD will be multiple times faster than your internal storage.

Our test box came pre-populated with the Intel 750 SSD, but captive thumbscrews on the back make opening the unit and adding your own PCIe card SSD a snap. So yes, you can pay $400 for the box and roll your own adapter card/M.2 PCIe solution.

Performance

We’ve already told you that the Thunder3 PCIe SSD is really, really fast, but there’s nothing like a chart or two to drive home that point. We’ve included internally mounted Samsung 950 Pro M.2 NVMe and Intel 750 SSDs to illustrate the relatively even performance between internal and external PCIe.

The Akitio Thunder3 PCIe SSD at the end of a Thunderbolt 3 cable actually outperformed two internal NVMe drives in pure sequential read speed.

The Atech Flash VX-2SSD, a very fast external USB 3.1/SATA RAID enclosure that would dominate most external drive graphs, seems slothful by comparison. Though not shown, random access times are in line with an internal NVMe SSD as well—0.03 to 0.06 milliseconds.

As shown below, the Thunder3 PCIe SSD’s performance simply squashes that of other external drives, especially plain USB 3.0 hard drives such as the Seagate Backup Plus Slim and the WD My Passport Wireless Pro. Then again, the Akitio drive costs nearly 30 times more than a plain 2TB USB 3.0 drive. Ouch.

There’s simply no comparison between the Akitio Thunder3 PCIe SSD and normal USB storage in terms of raw throughput.
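
To translate that gap into wall-clock time, here’s a quick sketch of how long a 20GB copy would take at each class of drive’s ballpark sequential speed; the throughput figures are rough approximations for illustration, not our measured numbers:

```python
# Time to move a 20GB payload at rough sequential speeds. The MBps
# figures are illustrative approximations, not this review's measured
# results.
drives = {
    "Akitio Thunder3 PCIe SSD (Thunderbolt 3)": 2000,  # MBps
    "Fast USB 3.1/SATA RAID enclosure": 700,           # MBps, assumed
    "Plain USB 3.0 hard drive": 130,                   # MBps
}

payload_mb = 20 * 1024  # a 20GB copy
for name, mbps in drives.items():
    seconds = payload_mb / mbps
    print(f"{name}: {seconds:.0f}s ({seconds / 60:.1f} min)")

# Roughly ten seconds versus two and a half minutes for the same copy.
```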

We used an Asus X99 Deluxe motherboard with an Asus ThunderboltEX 3 expansion card attached to its Thunderbolt header for testing. Note that the Thunder3 PCIe SSD is not compatible with OS X, and you can’t boot from it.

Final thoughts

The Akitio Thunder3 PCIe SSD is a great product, and a joy to use. But, of course, we didn’t have to pay for it. The nearly $1,300 price tag pretty much relegates the pre-populated version to deep-pocketed early adopters and speed freaks, or multimedia production types in a hurry. Even $400 for the enclosure is sure to make the average user blanch. I, and I suspect the majority of the public, will make do with those $80 2TB, 130MBps USB 3.0 drives for a bit longer. Sigh.

Source: GSMArena

Xbox One S controller review: New features and custom colors make for a great successor

My $80 powder-blue-and-orange Xbox Design Lab controller arrived recently, and I’ve fallen in love with its look. Maybe you’re not so easily swayed. Maybe you hate it. That’s fine. More than fine, actually—that’s kind of the whole point.

Last year Microsoft released the $150 high-end Elite controller for a segment of the market traditionally supported by third parties and aftermarket parts dealers. That undertaking was by all accounts a rousing success, way beyond Microsoft’s predictions.

And so at E3 this year Microsoft announced it would be stepping into another traditionally aftermarket realm: custom controllers. Xbox Design Lab lets you construct your own Xbox One S controller: For $20 more than the stock Xbox One S controller (available for its list price of $60 on Amazon), you can select the color of the front, rear, bumpers and triggers, D-pad, thumbsticks, face buttons, and the Menu and View buttons.

Artistic expression

I took a spin through the process, which is how I ended up with my GT40-inspired controller. The Design Lab website is easy enough to navigate, though much harder to use effectively. You can create some truly awful monstrosities.

Honestly, the pool of colors to choose from isn’t that broad, especially for some of the components, like the thumbsticks. Many of the colors aren’t complementary, either. It’s much, much easier to design something hideous than attractive. Microsoft can tout however many millions of color combinations, but the reality is there are probably a hundred that look reasonably inoffensive. Maybe a few dozen that look capital-G good.
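
As for where the “millions of combinations” marketing math comes from, it’s simple multiplication across the customizable parts. The sketch below uses assumed component and color counts for illustration, not Microsoft’s official figures:

```python
# Why "millions of combinations" is easy to hit: per-part choices
# multiply. The parts and color counts below are illustrative
# assumptions, not Microsoft's official Design Lab figures.
color_choices = {
    "body": 15,
    "back": 15,
    "bumpers_and_triggers": 15,
    "d_pad": 15,
    "thumbsticks": 8,
    "face_buttons": 15,
    "menu_view_buttons": 15,
}

total = 1
for colors in color_choices.values():
    total *= colors
print(f"{total:,} theoretical combinations")  # ~91 million with these counts

# The catch, as noted above: only a tiny fraction of that space looks good.
```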

But you still get quite a bit more choice than I’m used to with first-party controllers. I spent about an hour tinkering with the website, creating various Frankensteins and even a tongue-in-cheek rendition of the stock Xbox One controller before looking at the highest-rated community designs for inspiration and settling on powder blue and orange.

I’m pretty excited by the result. The color here is infused into the plastic, not painted on afterward like with many third-party shops’ custom controllers—a small distinction, but vital to maintaining that “official first-party controller” aesthetic. And the pad looks even better in person than it did on the site—the difference between the matte plastic of the body and the glossy plastic of the bumpers/triggers is a subtle touch I didn’t anticipate.

Place in the lineup

Now, the big question: How does this controller (custom colors or not) stack up against the original Xbox One controllers?

From left to right: The original stock Xbox One controller, custom Xbox One S, and Xbox One Elite (with rounded sticks).

Short answer: The Xbox One Elite controller is still best of the best. That’s no surprise to me—I’ve still got an Elite controller hooked to my PC and use it weekly for various console-first games.

However, the Xbox One S controller is a step up from the original version of the Xbox One controller. It even has a lead over last year’s slight redesign. First and foremost, the Xbox One S controller has Bluetooth. You no longer need to buy Microsoft’s massive Xbox One wireless dongle and then try to fit it into one of your USB ports (while blocking half a dozen others in the process).

You can only connect one Xbox One S controller at a time over Bluetooth, but I doubt most people care about that limitation. And the benefits have rippled into other environments: Valve issued an update so the Steam Link works seamlessly with the Xbox One S controller over Bluetooth. That makes the Xbox One S controller more viable for wireless PC use (say, in the living room) than the Xbox One controller.

It also folds in improvements Microsoft has introduced into the stock Xbox One controller since 2013, like the 3.5mm jack on the bottom of the controller. The requisite chat adapter thingamajig you needed for the original Xbox One controller was an ugly wart, and I’m happy to see it die. You also get the myriad small design tweaks: The rear of the controller is now textured where the original stock Xbox One controller was smooth. The central Xbox logo jewel is slightly smaller, and the contour lines around it have been adjusted. The controller’s overall shape has been slightly tweaked for a more ergonomic grip.

A close-up of the textured rear grips.

And the bumpers are now softer and squishier—more like the Elite controller. That’s excellent news, because the original Xbox One’s bumpers were so loud and stiff in 2013 as to be obnoxious. Unfortunately the Xbox One S D-pad seems similarly soft and squishy, which I don’t like. It’s still better than the Xbox 360’s mushy D-pad, but it’s a step down from the original Xbox One controller.

Bottom line

My official ranking overall? The Xbox One Elite controller on top, then the Xbox One S, Xbox 360, and in last place the original stock Xbox One controller. The Elite still has quite a bit going for it—even after almost a year of use, my Elite’s analog sticks still glide like they’ve been newly oiled. It’s a beautiful controller, and I’d love for Microsoft to create an Xbox Design Lab for custom Elite controllers.

But the Xbox One S controller, in all its color variants? Sure, you can get the stock Xbox One S controller for less, but the customization is damn nice. The Xbox Design Lab would’ve been great to have when I was growing up in a household with three siblings. We could’ve settled the “No, this is my controller” argument once and for all.

Hell, that’s pretty much my plan for this one. The powder blue controller? That’s mine now. You want to play at my house, you can use the Xbox One controller that came with my console, clicky bumpers and all.

Source: GSMArena