Product Chat / GameGuru is CPU bound - Upgraders beware

Author
Message
DVader
20
Years of Service
User Offline
Joined: 28th Jan 2004
Location:
Posted: 30th Aug 2016 14:47
Hi all. Thought I'd post this as I remember many discussions about it in the past. Many people argued against this at the time, but I always believed it was the case. Recent tests have shown me it is pretty much correct.

Ages ago I owned a GeForce 260; it ran GG (Reloaded at the time) okay, but was never going to be great. I upgraded to a 660 later, which should have been a lot faster, and I got a boost from it, but not all that much. At that point I guessed the CPU was holding back the card's performance. People showed screens displaying really low CPU usage to argue against it, but of course none were using a Q6600; all were in the i5-i7 range.

I've now upgraded to a 970. Again, a significant performance upgrade GPU-wise. Guess what? Virtually no difference AT ALL. It's a smidgen faster, but I can see it's running pretty much the same as my old 660. The editor, if anything, seems slightly worse, but that could be my imagination.

Now, a friend wanted a system checked over, and it got my old 660 to make it a reasonable games unit. It's an i5 running at 3.4GHz. So once it was set up I tried the same GG game on it. Way faster; you could see it was not struggling at all. Way faster than my Q6600 with a 970. I'll get better feedback when he sorts Steam out on it and can try GG on it directly.

This not only confirms my suspicions of years, it also shows why many laptops struggle so badly even though they have a reasonable video card. My CPU is the main factor slowing GG down at the moment. Anyone thinking of buying a beefy new video card who has a similar CPU to mine: don't bother, unless you plan to update the CPU not long after as well.

So next an i7, or possibly an i5; when, I have no idea. They're not cheap by any means. I knew the video card wouldn't help vastly, but I expected a tiny bit better than what I got.


SPECS: Q6600 CPU. Nvidia 660GTX. 8 Gig Memory. Win 7.
smallg
Community Leader
18
Years of Service
User Offline
Joined: 8th Dec 2005
Location:
Posted: 30th Aug 2016 15:00
Yep, and a faster-clocked CPU with fewer cores will outperform a slower-clocked CPU with more cores, so that's why it's hard to compare.
Lee did mention not too long ago the possibility of putting the AI onto another core, which might help a little, but I fear GG will always underperform on a quad core or such due to the way it was made.

That's not to say it doesn't run on a quad core; I can personally run all the test levels fine on my i5, but it is already a really close thing, and indoor levels especially really struggle for me.

It's not really a surprise though that the CPU is more important than the GPU. At this stage GG doesn't really tax a modern graphics card in anything except memory, as there are very few flashy effects to process, just textures.
lua guide for GG
https://steamcommunity.com/sharedfiles/filedetails/?id=398177770
windows 10
i5 @4ghz, 8gb ram, AMD R9 200 series , directx 11
Belidos
3D Media Maker
9
Years of Service
User Offline
Joined: 23rd Nov 2015
Playing: The Game
Posted: 30th Aug 2016 15:06
Part of the problem here is what they call bottlenecking. Although GG should be improved by a better graphics card, part of the work for the graphics card has to be fed to it by the processor; if you have a significantly more powerful GPU than CPU, you will begin to hit an upper limit on how fast the GPU and CPU can trade information.
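
A toy model makes the effect easy to see: if each frame costs roughly the larger of the CPU time and the GPU time, a faster GPU does nothing while the CPU is the slower side. The timings below are made-up illustrative numbers, not measurements of GameGuru or any real hardware:

```python
# Toy model of a pipelined renderer: a frame costs roughly
# max(cpu_ms, gpu_ms), so the slower side sets the frame rate.

def fps(cpu_ms, gpu_ms):
    """Approximate frames per second when CPU and GPU work overlap."""
    return 1000.0 / max(cpu_ms, gpu_ms)

q6600_cpu_ms = 40.0   # hypothetical per-frame CPU cost on an old quad core
i5_cpu_ms = 15.0      # hypothetical per-frame CPU cost on a faster CPU

gtx660_gpu_ms = 20.0  # hypothetical per-frame GPU cost
gtx970_gpu_ms = 10.0  # a much faster GPU

# With the slow CPU, a faster GPU barely moves the needle...
print(fps(q6600_cpu_ms, gtx660_gpu_ms))  # 25.0 fps
print(fps(q6600_cpu_ms, gtx970_gpu_ms))  # 25.0 fps - still CPU bound

# ...while a faster CPU with the old GPU is a big jump.
print(fps(i5_cpu_ms, gtx660_gpu_ms))     # 50.0 fps
```

Which matches what DVader saw: the 660-to-970 swap changed nothing, while the same old card behind a faster CPU was "way faster".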

i5, NV960 2GB, 16GB memory, 2x 2TB Hybrid, Win10.
i3, Intel integrated graphics, 6GB memory, 512GB Generic SATAIII, Win8.1.
Intel Celeron (dual core), Radeon integrated graphics, 4GB memory, 180GB Generic SATAII, WinVista.
Q6600, Intel integrated graphics, 8GB memory, 512GB Generic SATAII, Win7.
DVader
20
Years of Service
User Offline
Joined: 28th Jan 2004
Location:
Posted: 30th Aug 2016 15:10
Yes, a dual core variant of my CPU runs faster as well due to its slightly higher clock speed. The i5 is a quad core too, but obviously a quad running at 3.4GHz compared to 2.4. The i5 with my old video card runs it way better though. I have been holding back (saving) for a new CPU for ages, but saw this 970 at 185 or so new and thought it worth getting, despite my reservations about it (the memory issue Nvidia were taken to court over not long back). It's there for the future, but it seems I would have been better off looking for a decent base unit.


Belidos
3D Media Maker
9
Years of Service
User Offline
Joined: 23rd Nov 2015
Playing: The Game
Posted: 30th Aug 2016 15:53
It's a shame; the Q6600 was one of my favourite CPUs. It was way ahead of its time, and in fact was better than most of the newer Q chips; it took quite some time for Intel to actually make a better chip. But everything has its time, and it's probably time to move on from the Q6600. I still keep my machine with the Q6600 in it, still using 4GB of DDR2 memory with my old 6670 GPU. I use it every now and again to play games on, and it still performs well considering how old it is.

3com
10
Years of Service
User Offline
Joined: 18th May 2014
Location: Catalonia
Posted: 30th Aug 2016 16:28
I'm close to buying my desktop, but I must admit I'm still not clear on what specs it should meet to be considered suitable for working with GG and getting some decent results.
Talking about CPU + GPU.

3com

Laptop: Lenovo - Intel(R) Celeron(R) CPU 1005M @ 1.90GHz

OS: Windows 10 (64) - Ram: 4 gb - Hd: 283 gb - Video card: Intel(R) HD Graphics

DVader
20
Years of Service
User Offline
Joined: 28th Jan 2004
Location:
Posted: 30th Aug 2016 16:49
Well, the i5 CPU seems to handle things well with the 660 (streets ahead of my Q6600 and the 970). I would recommend the fastest CPU you can get, really. A top GPU won't hurt, but it is less important with GG. I've been set on an i7 for ages, but new they cost more than my friend paid for his entire base unit. I may have to break my normal aversion to secondhand kit, I think. An i5 is certainly much better bang for buck generally, and so far runs GG stuff well. Still to test it properly though; I will see if I can get some better tests run when I can.


Zigi
15
Years of Service
User Offline
Joined: 5th Jul 2009
Location:
Posted: 30th Aug 2016 17:00
In my opinion the optimization of GG as a whole is a complete joke on both sides, CPU and GPU.
I have also upgraded my graphics card recently, from a GT430 to a GTX750Ti, and I'm experiencing a huge performance boost.

On the Big Escape map with the GT430 I was getting only 8 FPS if I set everything to High, and 30 FPS if I set everything to Low, but with the draw distance set to almost 0, so it was ugly as hell.
With the GTX750Ti, everything set to Highest, I get a stable 30 FPS, in some places even 40 FPS; if I set everything to Low I get a stable 60, in some places even 70 FPS.
Finally, after years of waiting now I can run GameGuru.

My first config, which I tried to use Reloaded on and was told should get decent FPS after optimizations:
Intel Core 2 Duo 3.0 GHz
4GB DDR2 667MHz
Geforce GT430 1GB GDDR3
Windows 7 32bit

My current config, required to finally get that decent FPS mentioned before:
Intel Core 2 Quad Q6600 2.4GHz
8GB DDR2 667Mhz
Geforce GTX750Ti 2GB GDDR5
Windows 10 Pro 64bit

Everything I throw at it runs smoothly at a stable 30 FPS at High and Ultra settings, including GameGuru.

BUT!!! When I run GameGuru my graphics card runs so hot that it's painful to hear how fast and noisy the cooling fans are. Even at LOW settings!!
Honestly, I'm worried that GG is just going to kill my shiny new card.

I'm running Subnautica at ULTRA settings and my card operates at a low temperature; the fans are so quiet I can't even hear them, and the game is beautiful, the most beautiful game I have ever played, and it runs smooth and cool on my config.

So, yeah, finally I have the config to run GameGuru; the only problem is that this time it runs my graphics card too hot, even at LOW settings.
So, at the end of the day, I still can't use GameGuru, because I won't, as I really don't want to risk my card being killed.

GameGuru has serious problems with optimization.
My next upgrade plan (in 1-2 years) is a config that would be able to run VR games; I really hope I'll be able to run GameGuru on a VR Ready config.
25-WATTS
8
Years of Service
User Offline
Joined: 23rd Feb 2016
Location:
Posted: 30th Aug 2016 20:29
DVader is right. I've been testing GG on all sorts of rigs and my findings are the same.

GameGuru will not damage your video card; the video card will only take what it can and no more. Some video cards run their power off the PCIe slot and have to stick to a wattage limit (TDP). What will damage your card is heat, and you will find the extra heat is coming off the CPU, which as pointed out above runs flat out; this will heat the space inside the PC case to a very high temperature, and that could overheat and kill your video card. Fitting an extra case fan and cleaning the dust out of the heatsinks every 6 months will help.

I use a plug-in amp meter, and GameGuru running a game like Father's Island takes no more power than a AAA 3D game.
Nomad Soul
GameGuru Tool Maker
17
Years of Service
User Offline
Joined: 9th Jan 2007
Location: United Kingdom
Posted: 30th Aug 2016 20:35
GG is more CPU bound than I would prefer.

It would be great if physics and AI could be put on separate cores, allowing the main game thread to run on its own core. I think TGC have done as much as they can in terms of moving the engine to C++ and improving the culling system. Going multi-core is about the only big win left for performance.
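
The pattern being suggested can be sketched in a few lines: hand the AI work to a worker thread through a queue, so the main (render) thread never blocks on it. This is only an illustration of the threading pattern, not GameGuru code; the entity values and the doubling "AI update" are stand-ins.

```python
import threading
import queue

def ai_worker(jobs, results):
    """Consume entities from the jobs queue and run their AI updates."""
    while True:
        entity = jobs.get()
        if entity is None:              # sentinel tells the worker to stop
            break
        # stand-in for an expensive AI/pathfinding update
        results.put((entity, entity * 2))

jobs, results = queue.Queue(), queue.Queue()
worker = threading.Thread(target=ai_worker, args=(jobs, results))
worker.start()

for entity_id in range(4):              # main thread queues AI work...
    jobs.put(entity_id)                 # ...and would keep rendering here
jobs.put(None)
worker.join()

updates = sorted(results.get() for _ in range(4))
print(updates)  # [(0, 0), (1, 2), (2, 4), (3, 6)]
```

On a multi-core CPU the worker runs on another core, so the frame loop only pays the cost of queueing and collecting results rather than the AI itself.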

The 32-bit memory cap would be resolved by moving the engine to 64-bit, but I don't think that's even on the table at the moment. With the EBE system and then AI next to be looked at, Lee is probably going to have his hands full until 2017.
MXS
Valued Member
15
Years of Service
User Offline
Joined: 10th Jan 2009
Location: Cybertron
Posted: 30th Aug 2016 21:59 Edited at: 30th Aug 2016 22:01
@Nomad Soul, we don't need a 64-bit version of GameGuru. Lee has already got the memory cap to 4GB when you run GameGuru on a 64-bit system.

Saying all this now is pointless, because you guys voted for that dumb EBE and everything after it. We are just going to have to deal with the performance as-is for now and learn to utilize what we've got. If you all want more performance, then you have to vote up these three things:

Hardware Instancing
Rendering distance for static meshes
Tree batching

These can really help how we utilize the performance we have. Tree batching is a must, seeing as we use a lot of trees in our levels. It's all about draw call usage, and tree batching can save on how many draw calls we use in our levels. The problem is GameGuru uses up the performance too fast: on super terrain I can have 1000 FPS, but that can quickly drop by adding a few models to the level. The fewer the draw calls, the more FPS we will have, and for that to happen we need things like tree batching, which will help utilize draw calls and save us performance. These three features would complete the performance overhaul GameGuru got from the C++ upgrade. So if you all really feel performance is a must, then go to the voting board and vote for them; if not, then this thread is for nothing.
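
A small sketch of why batching cuts draw calls (illustrative only; the model names and counts are made up, and this is not engine code): a naive renderer issues one draw call per tree, while a batching renderer groups trees by model and issues one instanced call per unique model.

```python
from collections import defaultdict

# A hypothetical forest: many instances of a few unique tree models.
trees = ["pine"] * 500 + ["oak"] * 300 + ["birch"] * 200

# Naive renderer: one draw call per tree instance.
naive_draw_calls = len(trees)

# Batched renderer: group instances by model, then submit one
# instanced draw call per unique model (per-instance transforms
# would go into a single buffer).
batches = defaultdict(int)
for model in trees:
    batches[model] += 1
batched_draw_calls = len(batches)

print(naive_draw_calls)    # 1000
print(batched_draw_calls)  # 3
```

Since each draw call carries fixed CPU-side overhead, collapsing a thousand calls into a handful is exactly the kind of saving described above.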
More than what meets the eye. Welcome to SciFi Summer.

Windows 7 home premium 64bit gtx770 sc acx 2gb gpu boost 2.0

Honkeyboy
3D Media Maker
9
Years of Service
User Offline
Joined: 16th Sep 2015
Location: Doesnt know half the time ;)
Posted: 30th Aug 2016 22:19
Tbh guys, I use two machines, neither brilliant, but both x64 with plenty of RAM and roughly 3.4GHz. My graphics cards, although supposed to be not that good (an NVS 315 on the i5 quad core and a 210 Silent on the dual core), behave oddly: the dual core sometimes runs GG better than the i5. I don't know if you saw my post about the mad FPS I got with the Intel HD 2000, but it was true up until updating the driver: I got massively stupid FPS, in some cases well over 100, compared to a max of 30 with the others. I do agree with DVader on this; something is amiss and needs looking at.
Remember, at the end of the day, if we are to make games for the general public, they are all going to have different specs, some high end, some low end, and therefore GG seriously needs the performance looked at, whether that's going x64 or DirectX 12 or whatever, I don't know. I even considered trying it out on my old Shuttle 3.4 single core, which maybe would have produced better results. Just my thoughts.
Intel i5 4950 Quad core 3.3ghz
8gb Ram
Nvidia NVS 315 1gb
and a well fed mouse on a wheel

When Reality is broken. Game designers can fix it and make it more realistic
OldFlak
GameGuru TGC Backer
9
Years of Service
User Offline
Joined: 27th Jan 2015
Location: Tasmania Australia
Posted: 30th Aug 2016 23:00
Yeah, performance should always be top of the list - it should not be a voting thing - the engine should always be optimized and re-optimized after any new additions to the engine.

My system (specs in signature) runs ok for fps, but the system is definitely working hard just for a few models in a level.

Performance is not a feature - it is crucial - nothing should be put ahead of it in the development cycle; after all, it is pointless to have bells and whistles if you can't ring them even on an average machine!

Reliquia....
Intel(R) Core(TM) i3-4160 @ 3,60GHz. 8GB Ram. NVidia GeForce GTX 750. Acer 24" Monitors x 2 @ 1920 x 1080. Windows 10 Pro 64-bit.
Nomad Soul
GameGuru Tool Maker
17
Years of Service
User Offline
Joined: 9th Jan 2007
Location: United Kingdom
Posted: 30th Aug 2016 23:28
I've said this a number of times but I don't think the voting board is the way to go when developing software.

There are core elements such as performance, lighting, physics and AI which should be done to a high standard before getting into things like EBE and menu editors. The voting board doesn't work because it gives the community too much control and allows 'features' which can be achieved with a little extra work to become higher priority than core engine functionality required to deliver a quality, stable game.

Lee said himself after several releases dedicated to performance that he wanted to work on other things, which is fair enough. When you are working on a long-term project it's important to keep yourself motivated and interested. However, I would prefer the limited resources TGC has to be used on core work, and ideally open up the engine / source code for the community to work on the nice-to-have stuff.

The reason engines like Unity and Unreal have made huge strides is because of the amount of input the community has, not just with media but with tools and plugins. LUA was a step in the right direction, as was exposing more variables, but I would like to see GG go open source in 2017.

synchromesh
Forum Support
10
Years of Service
User Offline
Joined: 24th Jan 2014
Location:
Posted: 31st Aug 2016 00:29 Edited at: 31st Aug 2016 00:32
Quote: "Yeah, performance should always be top of the list - it should not be a voting thing - the engine should always be optimized and re-optimized after any new additions to the engine."


I think this is what you need to vote for on the voting board... currently at No. 15. As stated, given the time the work would take, I personally would prefer some other features first, ones that address the things that actually prevent us doing much; then, once a game can at least be made, the full optimisation to make it look and run perfectly. Just my thoughts.

Rendering engine overhaul


This will involve a large overhaul of the graphics engine used by GameGuru. Possibly including features such as PBR, HDR, advanced shaders, reflections, better lighting engine and advanced shadows.

Please note: This feature may take a number of months to implement without other updates or bug fixes.

Quote: "The reason engines like Unity and Unreal have made huge strides is because of the amount of input the community "

I would think it's more down to the number of programmers they have on their teams.

Again just my thoughts
The only person ever to get all his work done by "Friday" was Robinson Crusoe..
DVader
20
Years of Service
User Offline
Joined: 28th Jan 2004
Location:
Posted: 31st Aug 2016 02:02
I didn't intend this to become a performance gripe thread :0 I just wanted to let people know my findings on this; upgrading is expensive, so if something is not going to help you, it's useful to know. It also proved my suspicion that my CPU is just not cutting it with GG and is actually holding back performance way more than the graphics card.

I'm fairly happy with performance in general (it's keeping up well in my current WIP), but it certainly needs improving; having the polygon meter nearly maxed out with just the basic terrain is not the best. It's always annoyed me that it's so high, in honesty. It seems the terrain could do with some improvement; those poly counts are really high for a blank map. Not sure if this can be optimised much, but it would seem so. I know Lee talked about it in his blogs a long time ago; he was working on terrain occlusion at the time, but I'm not sure if it ever made it in...

Saying that, I find if you stick mostly to non-transparent, reasonably low-polycount objects and avoid masses of foliage, GG can run pretty well, even on my Q6600. It's a balancing act: pushing speed through clever level design can be rewarding, but it can also be costly in time and restrict your game to some degree. Also, having to lower the draw distance to a really low value is not the best for open-world scenarios, while at the same time it is really the most effective FPS booster you can use at the moment.

The real point of the thread, however, as I have been getting off point, is that if you have a CPU of a similar generation to mine and already have a GeForce 660 or similar video card, upgrade the CPU before you bother with the video card! For GG dev, at least.

For me, I'm hoping AI is worked on next. An improvement there would be good, especially if the speed is increased at the same time :/ which is probably asking too much, but we can all hope. Lighting as well; I want those thousands of lights that are listed, not baked. I want moving, pulsating lights! That alone would transform GG's look.


synchromesh
Forum Support
10
Years of Service
User Offline
Joined: 24th Jan 2014
Location:
Posted: 31st Aug 2016 02:40 Edited at: 31st Aug 2016 02:41
Quote: "want those thousands of lights that are listed ... Not baked "

That alone would certainly improve the look of interiors
Jerry Tremble
GameGuru TGC Backer
12
Years of Service
User Offline
Joined: 5th Nov 2012
Location: Sonoran Desert
Posted: 31st Aug 2016 03:58
Quote: "saying all this now is pointless because you guys voted for that dumb EBE and everything after it."


I think you're blaming the wrong crowd in this thread. I doubt anyone on this thread at the time of my posting voted EBE to the top. It is what it is (I know, cliche). I've recently upgraded my desktop (video card, power supply and SSD, specs in sig) and saw not much of a performance increase. It confirms DVader's findings. I upgraded not necessarily for GG, but to experience VR at its best (currently). I really don't have any performance issues with GG on either my desktop or laptop, and I never really have, but I can definitely see which changes make a difference. This engine DOES need to focus on performance alongside any bells and whistles. Whether it's DX11/12, 64-bit, or multi-core, it shouldn't matter; they should do them ALL, in my opinion. Anything would help, and I'm pretty sure most people today are on 64-bit systems. If they are not, they should be!
Desktop: i7 4770@3.4Ghz, 12GB RAM, Win 10/64, GeForce GTX 1080, 1TB SSD, 1TB HDD; Laptop: i7 4800MQ@2.7Ghz, 16GB RAM, Win 10/64, GeForce GTX870M , 1TB SSD.
pepesilvia
8
Years of Service
User Offline
Joined: 8th Jul 2016
Location:
Posted: 31st Aug 2016 05:37
"Performance is not an feature - it is crucial - nothing should be put ahead of it in the development cycle, after all it is pointless to have bells and whistles if you can't ring them even on an average machine!" Well said!

I'm new here and even I can tell this engine is in beta, and we are all testing it, providing feedback for the final product. The voting system is a clever way to run surveys, as a lot of the stuff in there should be fixed before the final version, e.g. "bots to navigate stairs", which is really "allowing waypoints to change y position".
wizard of id
3D Media Maker
18
Years of Service
User Offline
Joined: 16th Jan 2006
Playing: CSGO
Posted: 31st Aug 2016 05:58 Edited at: 31st Aug 2016 06:01
@25-WATTS
Quote: "What will damage your card is heat and you will find the extra heat is coming off the CPU which has been pointed out above that the CPU runs flat out, this will heat the space inside the PC case to a"


I don't know where you got your information, but unless the air is stagnant in your case due to incorrect fan placement, heat rises, so even if your CPU manages to get to 100 degrees it will not adversely affect the GPU's temps. Modern GPUs have a push-pull configuration and use a double- or single-slot design with a grill that pushes air out the back of the card.

The laws of thermodynamics come into play here. A fan doesn't actually cool down the air; as it sucks fresh air over the fins it might feel like it is doing so, but what is actually happening is that the fan is removing hot air from the case and replacing it with air at room temperature, creating a temperature differential.

You are simply circulating air at room temperature; the hotter the room, the hotter the air you are circulating. Which is why, depending on the case you have, fans should always be configured in a push-pull arrangement, so that you have positive airflow that stops air stagnating in the case.

I, for example, have a Cooler Master case with 5x 120mm fans, and placed an aftermarket Cooler Master cooler on the CPU with an additional two 120mm fans in a push-pull configuration.

GameGuru still managed to kill the 660GTX I had in that case. Not sure where the OP gets the assertion that GameGuru is CPU bound; it's not, it's GPU bound. GameGuru runs any GPU at near 100% constantly, unlike a game such as Skyrim or Fallout 4, which has twice the graphics fidelity; the GPU temperature difference between GameGuru and Fallout 4 is quite large.

You sir have it the wrong way around: GameGuru is GPU bound, not CPU bound. While the CPU works hard when doing lightmap calculations, it otherwise twiddles its thumbs most of the time in GameGuru. Some non-GPU tasks have been offloaded to the CPU, but it is in no way taking the brunt of the work.

And while your assumption that a new CPU will work better may hold for your system, picking a better CPU over a GPU is the worst possible advice you can give to anyone here.
The reason an i5 will run better than a Q6600, in fact any Core 2 Quad CPU, is that a Core 2 Quad is actually two dual-core dies, each die having its own level 2 cache; if I am not mistaken the Q6600 had 8MB of cache, basically 4MB per die shared between that die's two cores, and the cache on one die could not be shared with the other.

The iCore series of CPUs changed things considerably. While the i5, for example, has 4 physical cores, each with access to local level 1 and level 2 cache, it now makes use of a level 3 cache shared globally between the cores, which means, depending on the application, any given core can make use of that cache and even get the lion's share; some cores won't touch the level 3 cache in most instances and stay within their allotted local cache.

Another important factor to take into account is the PCI Express bus itself. Depending on which LGA775 motherboard is used and which graphics card, it may only be running at PCIe 1.x speed, which creates a significant bottleneck: even if you put a new PCIe 3.0 graphics card in the older system, it will definitely improve performance, but only at the link speed of the motherboard. You will lose a fair bit of performance as a result, and that will not be directly due to the older CPU, but due to the bus speed as well.
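
For scale, the commonly quoted one-direction bandwidth per lane is roughly 250 MB/s for PCIe 1.x, 500 MB/s for 2.0 and about 985 MB/s for 3.0, so an x16 slot works out as in this quick back-of-envelope sketch:

```python
# One-direction bandwidth of an x16 slot per PCIe generation,
# using the commonly quoted per-lane rates in MB/s.
PER_LANE_MB_S = {"PCIe 1.x": 250, "PCIe 2.0": 500, "PCIe 3.0": 985}
LANES = 16

bandwidth_gb_s = {gen: rate * LANES / 1000
                  for gen, rate in PER_LANE_MB_S.items()}

for gen, gb in bandwidth_gb_s.items():
    print(f"{gen} x16: ~{gb:.1f} GB/s")
# PCIe 1.x x16: ~4.0 GB/s
# PCIe 2.0 x16: ~8.0 GB/s
# PCIe 3.0 x16: ~15.8 GB/s
```

So a PCIe 3.0 card dropped into a PCIe 1.x board is limited to roughly a quarter of the transfer bandwidth it was designed for, which is the bus bottleneck described above.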


@Jerry Tremble
Moving the engine over to 64-bit isn't going to make it faster; the assumption that 64-bit makes apps run faster is wholly incorrect. 64-bit computing adds more physical memory that the app has at its disposal, and bar a few CPU instruction sets, it isn't going to make a massive improvement.
Where you will gain a lot of performance, and this is addressed to everyone in this thread, is below.

Offload the non-GPU-dependent tasks to the CPU, and move the engine over to a new DirectX version to make use of things like GPU instancing and the specific rendering optimizations of a newer DirectX version. For example, DirectX 11 is capable of taking meshes and slicing them into smaller chunks for rendering. There are probably a bunch more benefits of moving to a new DirectX version that I don't even pretend to understand.

As for which version: it will likely be DirectX 11. DirectX 12 is too new, and the requirement is that you need Windows 10; older versions don't support DX12 as far as I am aware. You simply will not catch me dead on Windows 10 with all the privacy issues it has. If the time comes I will rather upgrade to whatever comes after 10, if and when Microsoft resolves its privacy issues; otherwise I am quite happy to stay with Windows 7 or 8.1 till support ends entirely.

TLDR version.
The thread is full of inaccuracies; GameGuru is fully GPU bound with some CPU tasks, and you will be better off with a better GPU than a better CPU.
Win7 pro, Intel 2500K @3.7ghz 660GTX 8gig ram 16tb HDD
JohnS_CZ
9
Years of Service
User Offline
Joined: 4th Mar 2015
Location: Czech republic
Posted: 31st Aug 2016 08:25
wizard of id wrote: "You simply will not catch me dead on windows 10, with all the privacy issues it has"

If you're talking about telemetry, I can assure you that it's in your Windows 7/8/8.1 too; they added it in an update last year.
So unless you're reading about every update released, and/or you don't install updates, it's already in your OS.
That said, Google probably knows more about us than Microsoft, so it doesn't really bother me that much.

About performance, GG definitely needs some optimization. But since (afaik) it's just Lee and a few other people, it would pause additional development for a while (again), so I can't really think of a 'best' solution.
Belidos
3D Media Maker
9
Years of Service
User Offline
Joined: 23rd Nov 2015
Playing: The Game
Posted: 31st Aug 2016 08:44 Edited at: 31st Aug 2016 08:48
I agree with most of what you're saying Wizard, it kind of reflects what I was saying.

But, something that does need to be taken into account when deciding on whether to replace a GPU or replace a CPU to increase performance, is the "gap" between the two.

When choosing components for a PC you can't just mix and match and expect maximum performance; there are what they call bottlenecks between components. In DVader's case, I think the reason he hasn't noticed a considerable increase in performance isn't just because of how GameGuru behaves (although GameGuru could seriously do with some more optimizing); part of it is because his new GPU is too powerful for his CPU.

The GPU and CPU need to interact, and the further apart they are in specs, the less efficient that interaction. When you upgrade a GPU you need to take into account that once you reach a certain level of speed/power in the GPU relative to the CPU, anything more powerful will not run significantly better. It looks to me, with my experience of the Q6600 and what I have read online, that the threshold for that processor is around the GTX 760 and its AMD equivalent (about the 7870, I think); anything beyond that doesn't show much of an improvement because of the CPU/GPU bottleneck.

This is actually one of the reasons I moved on from my Q6600 to an i5. I tried to upgrade from an AMD 6670 to an NVIDIA 960 and it showed at most a few FPS increase; it had got to the point where more memory and better graphics cards just weren't making a difference, because it was limited by the CPU. I switched to an i5 and that increase changed from a few FPS to a 20-30 FPS increase.

wizard of id
3D Media Maker
18
Years of Service
User Offline
Joined: 16th Jan 2006
Playing: CSGO
Posted: 31st Aug 2016 08:44
Quote: "If you're talking about telemetry, I can assure you that it's in your Windows 7/8/8.1 too. They have added it in one update last year.
So unless you're reading about every update released and/or you don't install updates, it's already in your OS."
Yet you can uninstall the offending updates. While they bypass the hosts file, you can actively choose not to install them; Windows 10 has the telemetry by default. Besides, I don't install updates or have it set to automatic. While I have SP1 installed, I have a firewall installed, as well as a router set up to block unwanted connections, period. If something really needs an update I will gladly install it; otherwise I don't waste my time updating Windows unnecessarily. A router and firewall combined with anti-spyware and antivirus have served me well enough for the last 16-odd years. Bandwidth is at a premium here, so nothing gets updated or past the firewall unless I say so.

It is much easier to deal with the privacy issues in Windows 7 and 8 than in 10, where they come by default; Windows 10 does far more spying than Windows 7 or 8 updates ever can. Windows 10 can go take a hike.
wizard of id
3D Media Maker
18
Years of Service
User Offline
Joined: 16th Jan 2006
Playing: CSGO
Posted: 31st Aug 2016 09:09
Quote: "part of it is because his new GPU is too powerful for his CPU. "
It's not "too powerful"; that isn't what a bottleneck implies, and the apt definition of a bottleneck is totally different from what you're thinking of. You could have a 640GT in that system and it would still bottleneck, due to memory speed, bus speed and CPU processing power. It doesn't mean you shouldn't get a better graphics card over a CPU. My issue is that right off the bat the OP said to rather get a better CPU than a GPU, which is wholly incorrect regardless of system specs. Obviously there is a point where upgrading the GPU would be pointless because the CPU isn't fast enough.

While it is almost always better, when doing a complete system upgrade, to take the better CPU, as you can always upgrade the GPU later, when upgrading individual components in an existing system, unless you are running a Pentium dual core, the GPU trumps the CPU in most cases; after that, more memory trumps an SSD, and an SSD trumps a mechanical HDD. The OP has an 11-odd-year-old CPU, so it's fair to say he needs a better CPU, but advising everyone else that the CPU plays the more important part, irrespective of their system specs, is irresponsible.

With my CPU I am basically on the edge of having to upgrade to an entirely new system or getting a new GPU.

While you can still decently game on an i3 CPU, higher-end games require an i5 as a minimum.

Win7 pro, Intel 2500K @3.7ghz 660GTX 8gig ram 16tb HDD
Belidos
3D Media Maker
9
Years of Service
User Offline
Joined: 23rd Nov 2015
Playing: The Game
Posted: 31st Aug 2016 09:33
Quote: "You can have a 640GT in that system and it will still bottleneck, due to memory speed, bus speed and CPU processing power.It doesn't means you shouldn't get a better graphics card over a CPU.My issue is that right off the bat the op mentioned rather get a better CPU then a GPU, which is wholly incorrect regardless of system specs.Obviously there is a point where upgrading the GPU would be pointless as the CPU isn't fast enough. "


That's exactly what I meant, you just had better words for it than me. When I said too powerful, I didn't mean it literally; I meant exactly what you said, that the CPU isn't fast enough to keep up. And yes, a 640 will still bottleneck, but it works on a curve, and the point where it tips the balance with the Q6600, to where upgrading the GPU further is pointless, is about the 760 mark.

We're pretty much agreeing, you just explain it much better than me :p

Oh, and yes I agree, a flat "get a better CPU over GPU" to everyone is the wrong thing to say. I only suggested it in DVader's case because he seems to have reached that point: he has a Q6600 with 8GB of memory, which, if I remember correctly, is the maximum most of the boards that take the Q6600 can have, so more memory is out of the question; he already has a new GPU which is bottlenecked; and swapping an HDD for an SSD wouldn't improve FPS much, it's more likely to improve load times. So in his case I'd say a new CPU is the better option.

i5, NV960 2GB, 16GB memory, 2x 2TB Hybrid, Win10.
i3 , Intel integrated graphics, 6GB memory, 512GB Generic SATAIII Win8.1.
Intel Celeron (dual core), Radeon integrated graphics, 4GB memory, 180GB Generic SATAII, WinVista.
Q6600, Intel integrated graphics, 8GB memory, 512GB Generic SATAII, Win7.
25-WATTS
8
Years of Service
User Offline
Joined: 23rd Feb 2016
Location:
Posted: 31st Aug 2016 15:02
Wizard, I get my information from 37 years of testing in the real world. No two PCs are the same; e.g. my Steam PC has a 25-watt CPU (AMD 5350 x4 at 2.05 GHz) and a 35-watt GPU (R7 240 DDR3), 60 watts total flat out. I don't need any case fans, and there's no fan on the power supply unit as I use a laptop-type power brick.

Most video cards get their cooling air from inside the case, which will never be at room temperature; the hotter the case temperature, the hotter the card will run, and most cards still do not push the hot air out the back of the PC. Think about this: your case fans are making more heat than my CPU. It is also well known that too many case fans fill the computer with dust faster, which in turn fills your video card with dust faster, so it overheats, and that can kill it within 12 months. Game Guru did not kill your card; heat or bad workmanship did.

To get back to the OP's post: I then fitted a GTX 750 Ti to the above system. Did it run Game Guru any faster? Did it hell (R7 240 3D score = 970, GTX 750 3D score = 3500; the GTX is over 3x faster, so why was Game Guru not 3x faster?). It's all down to the slow core GHz of the CPU, which goes to show it's all about matching the GPU and CPU, even more so with 3D engines like Game Guru that mainly use one core.

Is the AMD 5350 a good match for the GTX 750? Yes; with other games I see a very big jump over the R7 240 card, like 3x.

If you are looking to build a PC just for Game Guru, pick a CPU with the best single-thread rating you can; the AMD 5350 is just too low at a score of 808 per core. Getting a fast 2-core would be a much better idea; even an old Intel Pentium G2020 has a single-thread rating of 1545, and Game Guru flies with that CPU.
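The bottleneck effect described above can be sketched with a toy frame-time model: the frame rate is capped by whichever of the CPU or GPU takes longer per frame, so a GPU 3x faster changes nothing once the CPU is the slower side. All numbers here are illustrative, not measurements of any of the hardware in this thread.

```python
# Toy bottleneck model: frame time is dominated by the slower of CPU and GPU.
# The millisecond figures are made up purely to illustrate the shape of the problem.

def fps(cpu_ms: float, gpu_ms: float) -> float:
    """Approximate FPS when the frame can't finish before the slower unit does."""
    return 1000.0 / max(cpu_ms, gpu_ms)

cpu_ms = 40.0        # slow single-thread CPU: 40 ms of engine work per frame
slow_gpu_ms = 30.0   # an R7 240-class card
fast_gpu_ms = 10.0   # a GTX 750-class card, roughly 3x the throughput

print(fps(cpu_ms, slow_gpu_ms))  # 25.0 - already CPU-bound
print(fps(cpu_ms, fast_gpu_ms))  # 25.0 - the 3x faster GPU gains nothing
```

On the same model, halving `cpu_ms` would double the FPS with either card, which is the whole argument for spending on the CPU first in this particular situation.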

I've picked a few chips below that all have the same single-thread rating as the old Intel Pentium G2020, give or take 10%: powerful chips, but they may not run Game Guru much faster.


Intel Core i7-965 @ 3.20GHz
Intel Core i5-3330 @ 3.00GHz
Intel Core i5-2310 @ 2.90GHz
Intel Core i7-870S @ 2.67GHz
Intel Core i5-2300 @ 2.80GHz
Intel Core i7-940 @ 2.93GHz
Intel Core i5-2320 @ 3.00GHz
AMD A10-7870K
AMD FX-4350 Quad-Core


I was running Game Guru last night with Father's Island and, to give you some idea of the power draw on the high and low settings: 120 watts at the wall for low and 125 watts for high, a very small rise. The test computer (AMD A4 3400 @ 2.7 GHz with AMD HD 5670) ran at 20 FPS on high settings and was playable.
PM
DVader
20
Years of Service
User Offline
Joined: 28th Jan 2004
Location:
Posted: 31st Aug 2016 15:06 Edited at: 31st Aug 2016 15:08
WizardofID, I'm pretty sure you were arguing that GG was GPU bound last time. I am sorry, but you are wrong. Yes, it may be when you have a top CPU, but with my spec GG is definitely slower because of my CPU. My tests here prove it, and no amount of spec talk will change that. If I add a new video card that is TWICE as powerful as my old one and get about a 1 fps increase, that says it has made no difference for me. Then adding my old video card to a faster system, which showed a much better improvement than I ever had with it, also proves it.

Normally the CPU/GPU bottleneck is a bit of a myth, but here, in actual practice, it is without doubt a fact. I have just bought a new GPU without any advantage; I am merely warning others not to do the same if they have a rig of similar age and spec. It isn't the GPU holding it back, it's the CPU. Blank map, nothing added, and the polycount bar is screaming into the red. That never changes no matter which video card I use, and it's been 3 now.

Edit - 25 watts got in before I finished.

Quote: "Most Video cards gets its cooling air from inside the case which will never be at room temperature"


LOL, would be in mine I never have the sides on...


SPECS: Q6600 CPU. Nvidia 660GTX. 8 Gig Memory. Win 7.
Belidos
3D Media Maker
9
Years of Service
User Offline
Joined: 23rd Nov 2015
Playing: The Game
Posted: 31st Aug 2016 15:16
Another part of your situation that can be improved by upgrading your processor, DVader: if you go for a new processor you will have to replace the board, which means you will be able to put more and better memory in it. If I remember correctly, the old boards that took the Q6600 took DDR2, and I think it was just on the cusp of the original release of DDR3, so some boards took early DDR3; that means your memory speed is capped at something like 833MHz. So although it won't improve FPS hugely, it will make a big difference in other areas.

i5, NV960 2GB, 16GB memory, 2x 2TB Hybrid, Win10.
i3 , Intel integrated graphics, 6GB memory, 512GB Generic SATAIII Win8.1.
Intel Celeron (dual core), Radeon integrated graphics, 4GB memory, 180GB Generic SATAII, WinVista.
Q6600, Intel integrated graphics, 8GB memory, 512GB Generic SATAII, Win7.
DVader
20
Years of Service
User Offline
Joined: 28th Jan 2004
Location:
Posted: 31st Aug 2016 15:41
When I first bought it I had up to 1333 FSB support on the M/B and spent a lot on the fastest memory I could. That died at some point and I have 8 gig of slightly slower RAM now, but yes, a new CPU means all new parts in that regard, so all bases would be covered. Not gonna happen for a while now though, budget spent.


SPECS: Q6600 CPU. Nvidia 660GTX. 8 Gig Memory. Win 7.
Belidos
3D Media Maker
9
Years of Service
User Offline
Joined: 23rd Nov 2015
Playing: The Game
Posted: 31st Aug 2016 15:44 Edited at: 31st Aug 2016 15:51
Ah yes, you're right, it was 1333MHz. I knew there was a 3 in it somewhere :p

Well, I hope you get some improvement soon. You do some great work in GG; imagine how much more you could do with better specs. If I had known a few months ago, you could have had an i5 with board and 4GB of DDR3 memory for nothing: a guy I know was having a clear-out and offered me a few desktops, and if I'd known then, I would have grabbed them off him and posted one to you. The only thing I have spare at the moment is a Q8300, which won't be much better than what you have (in fact I think the Q6600 was better, because they messed up on the later Q Core architecture). If I hear of anything else I'll let you know.

i5, NV960 2GB, 16GB memory, 2x 2TB Hybrid, Win10.
i3 , Intel integrated graphics, 6GB memory, 512GB Generic SATAIII Win8.1.
Intel Celeron (dual core), Radeon integrated graphics, 4GB memory, 180GB Generic SATAII, WinVista.
Q6600, Intel integrated graphics, 8GB memory, 512GB Generic SATAII, Win7.
wizard of id
3D Media Maker
18
Years of Service
User Offline
Joined: 16th Jan 2006
Playing: CSGO
Posted: 31st Aug 2016 17:03 Edited at: 31st Aug 2016 18:42
@25-WATTS Lol

I used to be a hardware reviewer for a (now defunct) website and specialized in PSUs. Besides that I was quite active in the overclocking community as well; I've pretty much done the whole peltier cooling, dry ice and water cooling thing, even case modding.

Quote: "Think about this your case fans are making more heat than my CPU"
I don't know if you are intentionally playing the fool, or simply don't grasp the laws of thermodynamics. Let's give you a crash course.

Your desktop fan, or ceiling fan for that matter, moves air around at room temperature. Positive airflow in a case means there isn't stagnant air inside it; with proper cable management you can minimize hot spots within a case, and with a more suitable case with a bottom-mounted PSU you can cut out any problem areas.

If you have positive airflow it is really a moot point, as you are constantly shifting the air. Thermodynamics says you can't cool air below room temperature without active cooling (refrigerated cooling fins, as a quick example). So you are constantly moving room-temperature air through to avoid trapping air around components that run hotter than their surroundings, which means you are removing heat from the cooling fins. This cooling method is trying to keep everything at room temperature; while it isn't the best method out there, it is the cheapest and easiest to maintain.

I specifically have a Cooler Master 690 II case with a bottom-mounted PSU, a 120mm side fan drawing air directly off the GPU, a 120mm front intake fan over the 5 HDDs I have installed, a bottom 120mm intake fan blowing air in, and two top-mounted 120mm fans drawing air out. The CPU has an aftermarket cooler in a push-pull configuration that draws air from the empty 5-inch bays and out the back.

Oh, and let's not forget the custom Cooler Master clip-on dust filters, and the monthly cleaning with my air compressor. I suggest you go look up the laws of thermodynamics and come back when you have a better understanding of air cooling.

As for the GPU: explain to me, with your 37-odd years of experience, how the game Tomb Raider manages an 80% load on my GPU while Game Guru runs at 100% load on a practically empty map with not even half the detail of Tomb Raider. Perhaps you should load up a monitoring app and have a look at how different the CPU/GPU load is in Game Guru versus any game you enjoy with decent graphics.

You do know what they say about assumptions.

@DVader
You might be CPU bound, but not everyone else is in the same boat as you. Game Guru is not CPU bound.

Your exact words !!!!!
Quote: "This not only confirms my suspicions for years, it also shows why many laptops struggle so badly even though they have a reasonable video card on them"


What I think is important to mention here: what I mean by CPU bound, in the context I explained, is that Game Guru is not reliant on the CPU to do the brunt of the work; most of the work is being done by the GPU. It doesn't, however, mean you can run it on a single-core CPU. It simply means that Game Guru itself isn't coded well or efficiently enough to make better use of the CPU, which is why I pointed out to DVader that putting the blame on the CPU isn't entirely correct.

Consider how low the system requirements are for Unity, as an example: https://unity3d.com/unity/system-requirements (though it greatly depends on the complexity of the project you're working on). Game Guru has far bigger issues if you consider what the system requirements are for other tools out there; the problem is hardly DVader's CPU, old or not. With the same system he could develop a higher-detail game in Leadwerks, for example, perhaps even UDK. Which is why I am arguing that Game Guru isn't CPU "bound"; it has a much higher reliance on the GPU.

This isn't a healthy thing for a GPU. Real-world usage would see a GPU with fluctuating load, sometimes a little, sometimes a lot, but never at 100% load unless you have a really old GPU. Extended Game Guru use isn't particularly good for the GPU in the long run, considering there is often no reduction in load at all for the GPU, and that can quite easily kill your GPU. If you disagree, well, that is your opinion. But you should remember I work full time on Game Guru and content; I hardly game any more. Game Guru is the app you will more than likely find open on my desktop.

While I managed to bring the GPU back by baking it with a heat gun, I really need to put it in the oven to properly reflow the solder; I just need to get new thermal paste before doing it again.




Thanks for the good laugh either way.

BTW
Quote: "power draw using high and low setting, 120watts at the wall for low and 125watts for high very small rise."
How exactly did you measure this? What is your power draw in a commercial game? It won't be much different, considering the CPU's rated draw is 65 watts and the GPU's is 61-odd watts. Irrespective of what you throw at the system, neither the CPU nor the GPU can draw more than that; it is their maximum power draw. It's really a moot point even mentioning this, and I'm not even sure why you did; however, I find it hard to believe that the draw at the wall socket would be that low. It entirely depends on the PSU's efficiency: unless you have a platinum-rated PSU in your system, with bronze, silver and gold ratings the wattage drawn is almost always going to be higher, as a PSU isn't 100% efficient; that is physically impossible. PSUs are rated for efficiency at different load levels, load tested at 20%, 50% and 100% load, and the higher the rating, the less the efficiency drops.

The draw at the wall socket is always going to be higher than the DC load; anything else is physically impossible. Ironically, high-quality PSUs often have problems being efficient at low loads. This also greatly depends on how good the topology is and how much transient line filtering sits before the actual PSU components.

To add insult to injury, efficiency can drop even more when you apply heat to a PSU, and that is where things get interesting, as these rated PSUs often fail their ratings under heat. In a double-forward PSU design, as a quick example, you have a primary side and a secondary side; depending on which capacitors are used (CapXon, most often used by lower-end and some midrange PSUs, isn't a particularly great capacitor and is considered tier 2), they do get pretty hot, and their performance starts dropping off as low as 55 degrees, which causes ripple, and in return you can kiss your efficiency rating goodbye once heat is applied.

Unless you are running a bare-minimum system, I would definitely like to see a 125-odd-watt draw from the wall socket, considering just your CPU and GPU account for 120 watts without taking into account the rest of the system: RAM, HDD and the motherboard itself. While RAM and an HDD don't take much, depending on the motherboard there is some usage there too, not much either, a few watts at most. Seeing is definitely believing here.
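The efficiency argument above is just a division: wall draw = DC load / efficiency, so the socket always reads higher than the components' DC load. A quick sketch, where the efficiency figures are illustrative 80 PLUS-tier ballparks at roughly 50% load, not measurements of any particular PSU:

```python
# Wall-socket draw for a given DC load at different PSU efficiencies.
# Efficiency values are illustrative approximations, not tested figures.

def wall_draw(dc_load_w: float, efficiency: float) -> float:
    """AC power pulled from the socket to deliver dc_load_w to the components."""
    return dc_load_w / efficiency

dc_load = 120.0  # e.g. ~65 W CPU + ~55 W GPU flat out
for name, eff in [("Bronze", 0.85), ("Gold", 0.90), ("Platinum", 0.92)]:
    print(f"{name}: {wall_draw(dc_load, eff):.0f} W at the wall")
# Bronze: 141 W, Gold: 133 W, Platinum: 130 W
```

The same division run backwards shows why a 125 W wall reading implies the components are drawing well under their combined TDP.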
Win7 pro, Intel 2500K @3.7ghz 660GTX 8gig ram 16tb HDD
wizard of id
3D Media Maker
18
Years of Service
User Offline
Joined: 16th Jan 2006
Playing: CSGO
Posted: 31st Aug 2016 18:27
@Belidos
Quote: "DDR3 so some boards took the original DDR3, which means that your memory speed's capped at something like 833mhz, so although that won't improve FPS hugely it will make a big difference in other areas."


You could actually throw in some higher-speed DDR3 memory for the sole purpose of overclocking. I had a Q6600 G0 stepping, which ran much cooler than the B3, and also had a decent DFI motherboard at the time; with air cooling and decent RAM you could get it to 3.8GHz, though it wasn't 100% stable. The PSU had too much ripple, and even a small ripple would often cause benchmark fails.

With loads of Vaseline and dry ice you could have some serious fun

Also, the later versions of the Core 2 Quad series were faster than the Q6xxx range due to having a larger cache and a new instruction set. The Q8 series was the exception, as they were the entry-level quad cores, but they were still a fair bit faster in most instances; the Q9xxx series really moved, but wasn't as great at overclocking as the fabled Q6600 G0.
Win7 pro, Intel 2500K @3.7ghz 660GTX 8gig ram 16tb HDD
25-WATTS
8
Years of Service
User Offline
Joined: 23rd Feb 2016
Location:
Posted: 31st Aug 2016 19:21 Edited at: 31st Aug 2016 19:39
@Wizard Said "It's really a moot point even mentioning this, not even sure why you did"

It was a bit of info for all users. BUT the main reason was something Zigi said about his fans running fast and noisy even on the low setting; it was to show the power draw is much the same for the high and low settings.

You seem to think I have made up the total power for my test. You are right that the CPU and GPU take about 64 watts each, so if they were both running flat out that would be 128 watts, and then with the loss in the power supply (10-20%) the total power would be a lot higher; you are thinking along those lines, yes.

Lets give you a crash course.

What you have missed is that the A4 3400 APU has on-board graphics, which has been shut down as I have a plug-in card, so the A4 3400 APU would not be taking the full 64 watts, only about 50% of it; that is where the missing watts have gone.

Thanks for the good laugh either way
PM
wizard of id
3D Media Maker
18
Years of Service
User Offline
Joined: 16th Jan 2006
Playing: CSGO
Posted: 31st Aug 2016 20:05
While you don't use the APU, it does go into an idle state, and there is still a power draw regardless of whether the APU is being used or not; lesson taught. Your 50% wattage figure is a best guess without actual evidence, so it's still pretty much hot air. Nor have you actually given any evidence of the wall-socket draw; more hot air. Unless you have actual voltmeter readings to back up a statement, it is just more hot air; software-based readings are not to be trusted with a barge pole (any major PC hardware website worth their salt will tell you this). PSU efficiency entirely depends on the rating; simply stating efficiency minus 10-20% as a fair idea of what it might be is completely incorrect. In real-world tests, especially if you go to a website like JonnyGuru, you will find that the ridiculously low temperatures used in rating tests don't apply when real-world heat is involved, and some PSUs fail their rating quite easily. You will find an efficiency curve for PSUs in general, though depending on the brand this can be quite fictitious, as can the rated wattage output; that isn't to be trusted with a barge pole either. Accurate output wattage and efficiency can only be taken with a voltmeter.

What PSU do you have? That should be interesting.

Unfortunately I can't let you near my PC; currently there is a fair bit of hot air coming from your direction.




Win7 pro, Intel 2500K @3.7ghz 660GTX 8gig ram 16tb HDD
science boy
16
Years of Service
User Offline
Joined: 3rd Oct 2008
Location: Up the creek
Posted: 31st Aug 2016 20:52
As read up there, it would take months to overhaul, with no updates etc. in the meantime.

Ermm, hello, EBE anyone? That was a total waste of time to me, so why is a total overhaul a problem?
an unquenchable thirst for knowledge of game creation!!!
synchromesh
Forum Support
10
Years of Service
User Offline
Joined: 24th Jan 2014
Location:
Posted: 31st Aug 2016 21:01
Quote: "Ermm hello ebe anyone? Total waste of time to me so why a total overhaul a problem? "

Not a problem ... Just not voted for enough yet ...
The only person ever to get all his work done by "Friday" was Robinson Crusoe..
PM
wizard of id
3D Media Maker
18
Years of Service
User Offline
Joined: 16th Jan 2006
Playing: CSGO
Posted: 31st Aug 2016 21:41
Quote: "Ermm hello ebe anyone? Total waste of time to me so why a total overhaul a problem? "
Not really a waste; it was needed for users who can't model. It does need a fair bit of work though, and some limitations apply. Synchromesh won't tell you, but he is as happy as a pig in mud.... Still a fair bit of work on it before I would be happy.
Win7 pro, Intel 2500K @3.7ghz 660GTX 8gig ram 16tb HDD
synchromesh
Forum Support
10
Years of Service
User Offline
Joined: 24th Jan 2014
Location:
Posted: 31st Aug 2016 21:53 Edited at: 31st Aug 2016 22:08
I'm happy as a pig in mud ..



For the record ...
Personally I would rather have had Third person for every character or bots on all levels before the EBE...
But I don't let that get me down ... Make the most of what we do get next that's what I say
The only person ever to get all his work done by "Friday" was Robinson Crusoe..
PM
25-WATTS
8
Years of Service
User Offline
Joined: 23rd Feb 2016
Location:
Posted: 31st Aug 2016 22:05
@wizard

The meter is a plug-in mains power and energy monitor from Maplin. It measures voltage, amps, watts, volt-amps, hertz and power factor; it does a bit more than a voltmeter.

I'm sorry you think the reading is wrong I can't do much about that.

I think the A4 3400 would be about 30 watts for the CPU and about 35 watts for the GPU; that's my best guess based on how each performs.

Spock from Star Trek could never understand the human best guess.

The on-board graphics is turned off in the BIOS, and if it is taking any power it would be so small it's not worth talking about.

Let me post this again.


I was running Game Guru last night with Father's Island and, to give you some idea of the power draw on the high and low settings: 120 watts at the wall for low and 125 watts for high, a very small rise. The test computer (AMD A4 3400 @ 2.7 GHz with AMD HD 5670) ran at 20 FPS on high settings and was playable.


and add this for wizard

Reading taken with a plug-in mains power meter and energy monitor from Maplin; it measures voltage, amps, watts, volt-amps, hertz and power factor.

Sorry about the hot air coming from my end, but with all the case fans you have, your computer will be just fine. LOL

always good to talk to you.

@synchromesh Love the pig
PM
Bored of the Rings
GameGuru Master
19
Years of Service
User Offline
Joined: 25th Feb 2005
Location: Middle Earth
Posted: 31st Aug 2016 22:09 Edited at: 31st Aug 2016 22:13
Love the pig synchromesh... APU, CPU, GPU... who gives two hoots... I've had a lot of wine tonight, so excuse my lack of enthusiasm with this thread.......
Professional Programmer: Languages- SAS (Statistical Analysis Software) , C++, SQL, PL-SQL, JavaScript, HTML, Darkbasic Pro (still love this language), Purebasic, others
Hardware: Dell Precision 490; AMD Radeon HD 7570; 12GB.
FPSC to GameGuru Tools: SegAutoWelder, Entity+Weapon Welder
25-WATTS
8
Years of Service
User Offline
Joined: 23rd Feb 2016
Location:
Posted: 31st Aug 2016 22:13
only an owl
PM
Bored of the Rings
GameGuru Master
19
Years of Service
User Offline
Joined: 25th Feb 2005
Location: Middle Earth
Posted: 31st Aug 2016 22:16
- oh dear, drunk too much wine, this thread is becoming hazy.... ha, ok - enjoy your evenings all
Professional Programmer: Languages- SAS (Statistical Analysis Software) , C++, SQL, PL-SQL, JavaScript, HTML, Darkbasic Pro (still love this language), Purebasic, others
Hardware: Dell Precision 490; AMD Radeon HD 7570; 12GB.
FPSC to GameGuru Tools: SegAutoWelder, Entity+Weapon Welder
25-WATTS
8
Years of Service
User Offline
Joined: 23rd Feb 2016
Location:
Posted: 31st Aug 2016 22:24 Edited at: 31st Aug 2016 22:25
@Bored of the Rings: you just made my day. I'm off to bed happy now; I was just about to turn to drink myself, so you have saved me.
PM
Bored of the Rings
GameGuru Master
19
Years of Service
User Offline
Joined: 25th Feb 2005
Location: Middle Earth
Posted: 31st Aug 2016 22:33
@25-WATTS - glad I could help ....
Professional Programmer: Languages- SAS (Statistical Analysis Software) , C++, SQL, PL-SQL, JavaScript, HTML, Darkbasic Pro (still love this language), Purebasic, others
Hardware: Dell Precision 490; AMD Radeon HD 7570; 12GB.
FPSC to GameGuru Tools: SegAutoWelder, Entity+Weapon Welder
Wolf
Forum Support
17
Years of Service
User Offline
Joined: 8th Nov 2007
Location: Luxemburg
Posted: 31st Aug 2016 23:49 Edited at: 31st Aug 2016 23:50


...and please minimise the amount of hot air blown in either direction, everyone



-Wolf
"When I contradict myself, I am telling the truth"
"absurdity has become necessity"
wizard of id
3D Media Maker
18
Years of Service
User Offline
Joined: 16th Jan 2006
Playing: CSGO
Posted: 31st Aug 2016 23:49
Quote: "The meter is a plug-in mains power and energy monitor from Maplin
measurse voltage, amp, watts, volt-amps,hertz and power factor, does a bit more than a voltmeter"


A 13-pound-odd meter vs my trusty Fluke multimeter; by all means, lol.

Meh, you're not worth the effort. Thanks anyway.
Win7 pro, Intel 2500K @3.7ghz 660GTX 8gig ram 16tb HDD
MK83
GameGuru TGC Backer
18
Years of Service
User Offline
Joined: 10th Jun 2006
Location: Greeneville, TN USA
Posted: 1st Sep 2016 00:14
Quote: "Yeah, performance should always be top of the list - it should not be a voting thing - the engine should always be optimized and re-optimized after any new additions to the engine."
I totally agree.
AMD Phenom x4 9850 2.58 Ghz , 6 gb ram, 2GB EVGA Geforce GTX 750, Win 10 x64



PM
synchromesh
Forum Support
10
Years of Service
User Offline
Joined: 24th Jan 2014
Location:
Posted: 1st Sep 2016 01:00 Edited at: 1st Sep 2016 01:06
Would that really be viable ?

Add a feature ...then a few weeks optimising ... Add another ... then a few more weeks optimising ... and again and again..
Or .... get some features in ... Then optimise because you may need to hit the same code a few times for related features that don't necessarily go in at the same time ?

Usually if everyone notices a huge drop after an update like the SOA it gets reported and dealt with ..
The only person ever to get all his work done by "Friday" was Robinson Crusoe..
PM
wizard of id
3D Media Maker
18
Years of Service
User Offline
Joined: 16th Jan 2006
Playing: CSGO
Posted: 1st Sep 2016 07:27
Quote: "Would that really be viable ?

Add a feature ...then a few weeks optimising ... Add another ... then a few more weeks optimising ... and again and again..
Or .... get some features in ... Then optimise because you may need to hit the same code a few times for related features that don't necessarily go in at the same time ?

Usually if everyone notices a huge drop after an update like the SOA it gets reported and dealt with .."
2018: that is when we can expect something complete. Seriously though, Lee didn't expect the EBE to take this long, so there are going to be setbacks sometimes..... But I'd rather have all the core features in before there is performance work or graphics updates..... You can't really have performance work done if you have nothing to actually do the performance work on.
Win7 pro, Intel 2500K @3.7ghz 660GTX 8gig ram 16tb HDD
25-WATTS
8
Years of Service
User Offline
Joined: 23rd Feb 2016
Location:
Posted: 1st Sep 2016 11:07
@wizard

Good to see you have a multimeter. Over the years I've had a few of those myself; my first one was an AVO meter back in the 60s. At the age of 10 I would come home from school and work together with my father, helping him with his business making and designing electrical equipment.

He designed, fitted and commissioned the control gear for the steam catapult system on the Royal Navy's aircraft carrier Ark Royal, and he was also one of the first people in the U.K. to make a working T.V. set. He passed his skills on to me and then sent me to college for 3 years to train as an Electrical and Electronic engineer. We worked together for many years before I took the business over; we designed, made, fitted and serviced control equipment for security alarms, fire alarms, warden control systems etc. for many Government and Local Government organisations.

I would not be seen dead sticking meter leads into a 240V AC wall socket... but maybe one day you will.

Will stick to my £15 tester for this sort of job, thanks.
PM
