Well, it's been over a week and I've now gotten to test it out with a variety of programs beyond Railworks. From testing, I can now confirm that the CPU does in fact go up to the 3.1GHz Intel claims, and not the 2.6GHz max that Windows 8.1's system properties states.
The power management of these mobile Intel CPUs is very impressive. My particular ultra-low-voltage chip idles in the 500MHz (0.5GHz) range. My AMD's idle was 1.4GHz, to the best of my recollection.
Both the CPU and GPU have yet to go beyond 69°C at peak full load. I'm very happy with that. I'm sure they would go higher under higher ambient temperatures.
Here's an interesting guideline on temps I found that should help everybody with chip lifespans...
Microcircuitry engineering is something like cooking: it involves a lot of probability and often yields rather random results. So you don't know how good a microchip is until you have fabricated it. Even then, deterioration also has a bit of probabilistic behavior.
40°C (104°F) or below is heaven for every microchip.
50°C (122°F) is not a bad temperature for any microchip.
Microchips start taking lifespan damage at 60°C (140°F).
A chip running at 70°C (158°F) 24 hours a day, 7 days a week, will probably last 2-6 years.
A chip running at 80°C (176°F) 24 hours a day, 7 days a week, will probably last 1-3 years.
A chip running at 90°C (194°F) 24 hours a day, 7 days a week, will probably last 6-20 months.
In this matter there is no difference between the main computer chips: GPU, CPU, northbridge, southbridge, etc.
At a given temperature, a chip suffers more under high processor usage than under low processor usage. For example: a CPU that reaches 70°C (158°F) over 10 hours on a nearly-idle Windows desktop suffers less than a CPU that reaches 70°C (158°F) over 10 hours of intensive processing (e.g. SuperPI). Some hardware engineers report this could be because in the second case the CPU is exercising most of its microcircuitry, while in the first case only a small part of it.
The general rule: microcircuitry is like an electrical printed circuit board with tracks packed very close together (often only 4-5 molecules apart), so heat slowly degrades the tracks as time goes by. Keep things as cold as possible.
The general rule when reading the manufacturer's data: they would rather you not worry about cooling anything, because then it will break just after the warranty period (sometimes only a few weeks after; incredible, I know). "It's just business," as Al Capone said.
Prevention is important (better than waiting for failures and then repairing): when things start to fail, it could be due to tracks degrading in the microcircuitry, or to minor thermal expansion of the tracks. The second case is a temporary problem; the first is probably a permanent one.
http://superuser.com/questions/749146/s ... m-cpu-temp
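Since those bands read like a lookup table, here's a minimal Python sketch that just encodes them. The band boundaries and lifespan ranges come straight from the guideline above; the function name and the wording for the 50-60°C gap (which the guideline doesn't cover) are my own guesses, so take it as illustrative only:

[code]
# Rule-of-thumb lifespan lookup based on the guideline quoted above.
# Band boundaries and lifespan ranges are the guideline's; the function
# name and the 50-60°C wording are my own (the guideline is silent there).

def rough_lifespan(temp_c):
    """Map a sustained 24/7 chip temperature in °C to the guideline's verdict."""
    if temp_c <= 40:
        return "heaven for every microchip"
    elif temp_c <= 50:
        return "not a bad temperature"
    elif temp_c < 60:
        return "okay, but nearing the damage threshold"  # my guess; guideline skips this band
    elif temp_c < 70:
        return "lifespan damage begins"
    elif temp_c < 80:
        return "probably 2-6 years"
    elif temp_c < 90:
        return "probably 1-3 years"
    else:
        return "probably 6-20 months"

# Example: my 69°C peak lands just under the 70°C band.
temp = 69
print("%d°C (%.0f°F): %s" % (temp, temp * 9 / 5 + 32, rough_lifespan(temp)))
[/code]

By those numbers my 69°C peak sits right at the edge of the "damage begins" band, which is exactly why I'm glad it tops out there.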
I have noticed something about GPU performance: the GTX 850M does DX10 & 11 much better than DX9. Comparing it to the Intel HD 4600 in the brief time I had that chip... in DX9 at optimal settings I got about 2x the frame rate in "Far Cry 2" with the GTX 850M, but 3x the frame rate when I switched the game to DX10 with all settings maxed.
The game "L.A. Noire" was even more interesting. In DX9 with the graphics at default levels, the GTX 850M only got about 6fps more than the Intel GPU at the same settings (28fps vs. 22fps), and it also shared the Intel's very noticeable stutters in some cutscenes and gameplay.
But here is what surprised me... in DX11 with all graphics settings upped to max levels, I got a locked 30fps vs. 10fps... and this time I didn't get a single stutter. Buttery smooth animations! Amazing how well it runs DX11. That, or Rockstar doesn't write good DX9 code. These tests were all done in the game's multi-threaded mode. I haven't set it to single-core mode (with its slight turbo clock speed advantage vs. its threading disadvantage) yet to compare the two modes performance-wise.
Trainz 12 runs better than it did on my AMD laptop, but it's still a stuttering mess in track-side camera mode... just not quite as bad as before. That game engine sucks...period! Can't wait to try it with the DX11-based Trainz: A New Era.
Run 8 runs at 32fps using the Intel HD 4400 of my current CPU (FYI... the HD 4400 is the exact same GPU as the HD 4600, just clocked lower). My 1GB AMD 7670M ran the sim at 26fps. The sim runs at about 56fps with the GTX 850M. The Nvidia control panel's default auto-select setting strangely chooses my lower-powered Intel HD 4400 over the GTX 850M. It's the only 3D game for which it suggests the lower GPU. What's crazy about that is... it's actually a good recommendation in the end. Other than populating the objects on screen faster when first launching a route, there's really no need for more than 30fps in this graphically simple, physics-heavy sim. It uses less power and runs cooler as a result... although you can override it and run it on the better GPU if you really want to.
As for my impressions of Nvidia's game profiles in GeForce Experience? It does pretty well at picking the best settings for good performance at good quality based on your hardware specs. The exception so far is with Borderlands 2. It recommended maxing out all settings, except it suggested setting PhysX to low. I compared that to having it set to high and didn't notice a performance difference, maybe a 2fps hit at most. It still hovered around 56fps a good chunk of the time. I found on other games that I could go beyond Nvidia's recommendations and still have very good performance.
In conclusion....
Well, that wraps up my review. I made this thread not so much to talk about my new laptop (everybody gets new computers... no big deal really), but mainly to show other people who are looking into a new computer that laptops from the last year or so are finally true desktop-class replacement candidates.
When I first joined here, most of the desktop crowd steered people away from laptops for Railworks. From a performance-gap perspective, they were pretty spot on: to get good frame rates in Railworks (especially on certain routes), some settings sometimes had to be turned down. Today I can confidently say that's no longer the case... although I concede desktops are still best overall if money is no object. You've seen the performance I'm getting in Railworks with only an $800 laptop... and to think Nvidia currently makes 5 mobile GPUs that outperform mine!
So if you want to go portable, there's little excuse not to now!

So thread title changed!

