Python is too slow for Portage… why not write a C/C++ module to do CPU bound tasks like dependency resolution?

I've been using Gentoo since the early-to-mid 2000s, and one thing that's always bothered me, especially after using apt/dnf/pacman and seeing how much faster they all are, is that emerge -va takes a long time just to get to the point where you answer y/n. Obviously, on a source-based distro there are a lot more factors to consider (USE flags, multiple versions of dependencies, etc.), but at the end of the day Python is really slow at CPU-bound tasks, and if you profile emerge, basically all of the runtime is in a couple of tight functions (_select_files, IIRC).

Why not implement a C/C++ replacement for these few functions and load/call that from the existing Python? It could even have the same interface as the existing function(s) and spit out results in the same return format.
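
To be concrete, the glue code for that is small. A minimal sketch of the pattern, using ctypes (the library name, exported function, and signature below are invented for illustration; this is not Portage's actual _select_files interface):

    # Sketch only: assumes you've compiled a hypothetical libdepsolve.so that
    # exports   int best_match(const char **candidates, int n);
    # Names are invented for illustration, not Portage's real API.
    import ctypes

    _lib = ctypes.CDLL("./libdepsolve.so")
    _lib.best_match.argtypes = [ctypes.POINTER(ctypes.c_char_p), ctypes.c_int]
    _lib.best_match.restype = ctypes.c_int

    def select_best(candidates):
        """Same Python-facing interface as before; the heavy loop runs in C."""
        arr = (ctypes.c_char_p * len(candidates))(*[c.encode() for c in candidates])
        return candidates[_lib.best_match(arr, len(candidates))]

Cython or a hand-written CPython extension module would do the same job; the point is that only the hot path moves to native code while the rest of Portage stays Python.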

For reference, I don't have a lot of exotic USE flags or anything, but my ~amd64 system with GNOME/KDE takes 1 min 41 sec to run emerge -uDvpU --keep-going --with-bdeps=y @world (quad-core Haswell Xeon with 3.6 GHz max turbo boost). I've tried using PyPy instead of regular Python, but that gives mixed results (sometimes faster, sometimes slower).

I'd wager rewriting Portage in Go or something would also give good speedups, but that would take a lot more work. Optimizing a few hot functions in C/C++ would be relatively easy and most likely be a huge win.

👍︎ 52
📰︎ r/Gentoo
👀︎ u/EatMeerkats
📅︎ Jun 16 2020
Performance: any chance of a (distant) future version of FMG not being CPU-bound?

D3 is an amazing library; however, it has two major problems (and one minor one):

  1. It is heavy-handed in DOM manipulation, which is a notoriously expensive operation.
  2. It does not use dedicated hardware acceleration (GPU).
  3. Not a major problem, but being written in JS doesn't help (although there are ways around this, which is why it isn't major). I'm not suggesting this should change either; JS has other advantages that make it worth keeping.

The result is that FMG struggles on all but the smallest maps (points, zoom, or literal size), or with all but the beefiest of processors, which is really frustrating for me (and I suspect for a few others as well). D3 just isn't spec'd to be used with something as big and amazing as FMG, and it's only going to get more difficult the more you try to do with it.

I'm not saying this is something that could necessarily happen in the near future, because it would require a huge effort. My concern is that as you add more features, the map will become less usable, because end-user computers are trending towards more parallelism and more specialized processors for certain tasks rather than larger base clock speeds. In fact, processors are currently set to reduce their base clock speeds to better conserve power on mobile devices and laptops.

If improvements along this line are desirable then I'm pretty sure that there are a few of us in the community that could provide some good ideas on how to go about it; sometimes all you need is a rubber duck.

Either way, keep doing what you're doing, u/azgarr. FMG is an amazing piece of software.

👍︎ 29
👀︎ u/EternalDeiwos
📅︎ Jul 01 2020
Is Valorant CPU or GPU bound?

As the title mentions, I was wondering whether the game is CPU or GPU bound, as from what I've read in some posts it depends on the PC too. The reason I'm asking is that I want to OC my i5-3570k for better stability in-game, although Riot Support says it's GPU bound. What are your experiences so far? Thanks.

👍︎ 7
📰︎ r/VALORANT
👀︎ u/Wakaastrophic
📅︎ Jun 10 2020
Anno 1800 cpu/gpu bound settings

Can I just ask, what settings impact the CPU and the GPU respectively?

My specs:

Ryzen 5 3600, RX 580 4GB, 16GB 3600MHz RAM, on an M.2 SSD.

Playing the game at 1440p, aiming for 60 fps, and I only achieved that on ultra low in DX12... The GPU is dying all the time, while the CPU is practically sleeping at 5% usage on only 2 out of 6 cores.

Too damn lazy to experiment around, as it requires a full restart for every individual setting change. I want to turn up the settings that are CPU-bound, to make the CPU work harder and get some nicer quality out of the game without lowering the already barely stable 60 fps mark; usually the CPU-intensive tasks are physics based. (The game is usually around 80 fps in huge late-game cities, but I can't play it without vsync as the tearing is horrendous, and it sometimes dips to 50 fps for 2-3 seconds, which messes up vsync completely and looks horrendous if I don't have 60+ all the time.)

Also, what are some general tips for better performance in this game? Like, what's the difference between DX11 and DX12? Usually DX12 is a lot better, but it's horribly implemented in some other games, things like that... New player, so sorry if these are common-knowledge questions :'(

👍︎ 5
📰︎ r/anno
👀︎ u/Mataskarts
📅︎ May 30 2020
This game is absurdly CPU bound

Generally my experience with these types of survival games is that they are very pretty, so they are usually GPU bound. In Last Oasis, however, my GPU is sitting at 30% while my CPU is at 100%. My rig is an overclocked i5-6600k and a GTX 1080. In comparison, I run at 80% CPU and 80% GPU in Ark at max settings. It's unfortunate too, because lowering your settings only increases fps when you're GPU bound. So it doesn't matter if I'm at the lowest or highest settings, I still get 30fps in big battles.

👍︎ 16
📰︎ r/lastoasis
👀︎ u/Xanjis
📅︎ Apr 15 2020
Gears Tactics - Incredibly CPU bound

Is anyone else having issues with very low framerates in Gears Tactics? It does not matter whether I'm on all-low or all-ultra settings; my framerate averages 35 on the former and 15 on the latter.

When running the benchmark, it shows the game to be 99% CPU bound in all cases. But my CPU is well above the recommended requirements, and from what I can see from other users my GPU should be getting 60 to 90fps in this game, not 15.

I have a 6600k set to all-core turbo up to 4.5GHz, a GTX 1070, 16GB of 3200 RAM, a Z170 Gaming 7 mobo, and the game installed on a SATA3 Samsung EVO SSD. This is all on Windows 10 Pro, latest 1909 version, with the latest Intel and Nvidia drivers. All the drivers have been reinstalled, and so has the game, on Xbox Game Pass.

I'm out of ideas on this one.

👍︎ 5
📰︎ r/GearsOfWar
👀︎ u/Cafuddled
📅︎ May 04 2020
CPU-bound or I/O-bound approach with clustering in a high-stress environment?

Hi!

I'm building a web app and we are facing major performance issues, with heavy CPU load.

In this web app we use Express, Handlebars, Passport, Sequelize with MariaDB, node-redis with Redis, and some other modules. We have an i9 9900K, 8 cores/16 threads, and we run a pm2 cluster on 16 threads. That should be a powerful machine, but I don't know why there's some bottleneck which causes the CPU to hit 80%-90% or even 100% when 1000 or 2000 requests happen in around 50 seconds. Profiling can't help me, even StackOverflow can't, and I don't know how to handle it anymore; it's been over a week that I've been trying to fix this CPU stress. Some people on StackOverflow agree with me that what my web app does is a lot of data processing, but it's really simple and shouldn't be stressing the CPU this much.

In the entire environment I have chosen to do this approach:

Get data from Redis (or query it if it doesn't exist) -> JSON.parse -> filter the JSON array with Array.filter and other common array operations to get specific data.

Now, we have to get very specific data, and often: those operations run at least 10 times per request, on small or medium arrays of objects, from 6 entries up to 30k entries, ranging from Array.filter to Array.sort to Array.some and Array.find.

This happens LOOOOTS of times, and we sometimes have to handle 3000 concurrent requests. This causes high CPU stress and latency, even though I managed to reduce the event loop latency a bit by inspecting it with the blocked-at package. Also, from profiling, it looks like a lot of processing time is spent in garbage collection. That looks like a memory leak.

People always say that Node should do I/O-bound and not CPU-bound work because of its event loop. So I thought this may not be the best approach for handling this kind of data, and that it would be better to avoid using lots of Array operations and just do SQL queries, which are I/O and not CPU.
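
For what it's worth, the contrast between the two approaches looks roughly like this (a sketch only, written with Python/SQLite purely for brevity; the table, columns, and row counts are made up, but the same shape applies with Sequelize/MariaDB):

    # Sketch of the two approaches; schema and data are invented.
    import json, sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE items (id INTEGER PRIMARY KEY, kind TEXT, score REAL)")
    conn.executemany("INSERT INTO items (kind, score) VALUES (?, ?)",
                     [("a" if i % 2 else "b", i * 0.1) for i in range(30000)])
    conn.execute("CREATE INDEX idx_kind_score ON items (kind, score)")

    # Approach 1: cache everything, then JSON.parse + filter/sort per request
    # (the CPU pays for all 30k rows on every request).
    cached = json.dumps([{"id": i, "kind": k, "score": s}
                         for i, k, s in conn.execute("SELECT id, kind, score FROM items")])

    def top_app_side(kind, limit=10):
        rows = [r for r in json.loads(cached) if r["kind"] == kind]
        rows.sort(key=lambda r: r["score"], reverse=True)
        return rows[:limit]

    # Approach 2: let the indexed database filter and sort, returning only the
    # handful of rows the request actually needs (mostly I/O, little CPU).
    def top_db_side(kind, limit=10):
        return conn.execute("SELECT id, kind, score FROM items "
                            "WHERE kind = ? ORDER BY score DESC LIMIT ?",
                            (kind, limit)).fetchall()

Neither is free, but the second keeps the per-request work small and pushes the filtering onto an index, instead of parsing and scanning tens of thousands of objects on every request.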

Is it right? Should I use more database queries and caching, instead of doing simple queries and handling large data with array operations? Won't lots of database queries stress my CPU anyway? Will it be better to do 3-4 non-cached queries each request or do 10-15 operations each request on cached queries?

Thanks for helping and sorry for such a stupid question, probably, but really I don't know what to do anymore.

👍︎ 3
📰︎ r/node
👀︎ u/DanielVip3
📅︎ May 26 2020
CPU bound question

Excuse me if I am wrong, but in general, doesn't a need for more RAM go hand in hand with CPU load? Will we likely need 8 cores to run 4K ultra settings?

👍︎ 16
👀︎ u/dodgerspilot
📅︎ May 03 2020
Cpu tech has advanced by leaps and bounds. Except for the IHS. Why?

Processors have gotten faster over the years, as one would expect. Why is it, then, that there has never really been any attempt to advance the IHS? Processors have been using the same heat spreader for as long as I can remember. Modders who delid get a very substantial drop in temps; I've seen guys shave 15-20 degrees off their chips. Seeing as temps play such a huge role in frequency, why are we still stuck with the same IHS that hasn't changed since the Pentium days?

👍︎ 3
📰︎ r/pcmasterrace
👀︎ u/Randommx5
📅︎ May 18 2020
Can anyone explain this to me a little? Like, what's up with GPU and CPU bound, and can I change it? Please help
👍︎ 3
📰︎ r/pcmasterrace
👀︎ u/GenTrapstar
📅︎ Apr 19 2020
VR might be more CPU bound than we thought.

I recently upgraded from an i7 4790 to an R7 3800X. I was not expecting any real improvement in VR performance, as common wisdom is that VR is GPU bound, especially at higher supersampling settings.

I found this to be true for games like Beat Saber, where there was no difference in performance, and in synthetic benchmarks. However, I was pleasantly surprised to discover improved performance in a few games, including Pavlov. I run Pavlov at 200% SS, and when using the i7 4790 I would experience some stutter or frame drops when the screen was busy or when smoke was used, with FPS dipping down to ~70-80 FPS. Zooming out on some custom maps while dead would also cause a significant reduction in FPS. With the 3800X I now maintain a constant, stable 90 FPS in both these situations.

I saw an even bigger improvement in Scraper: First Strike, some scenes used to reduce my GPU and CPU FPS to ~50 at 200% SS resulting in an overall unpleasant experience. Now FPS never drops below 70 and CPU FPS sits around 80 at the same SS settings. That’s a massive improvement and it looks like something in my old system was acting as a bottleneck.

I am wondering if some of this improvement could be attributed to switching from slower DDR3 RAM to faster DDR4 RAM. This is all obviously anecdotal; I am hoping that at some point someone can do more rigorous testing.

GPU: GTX 1080

Headset: Lenovo Explorer @ 200% Steam SS

👍︎ 22
📰︎ r/virtualreality
👀︎ u/AnAttemptReason
📅︎ Feb 05 2020
What graphic settings are more cpu bound?

I'm trying to squeeze every single fps out of my Ryzen 5 1600 and I was wondering which video settings are more CPU bound than GPU bound.

There are games like Apex that tell you if a graphics setting has more impact on the CPU than the GPU, like for example "Effect Settings", but I'm wondering whether this translates to a game like Overwatch with a completely different game engine.

👍︎ 26
👀︎ u/axaro1
📅︎ Mar 04 2020
Is Tarkov really a CPU bound game?

Here is the thing: I was playing with an i7 [email protected], 16GB RAM and an RTX 2070. I just bought a new setup with an i9 9900KS (not OC'd yet) and 32GB RAM, and migrated the RTX from the older setup.

All settings are the same in game and I have not seen any difference in FPS.

All other games I play improved in FPS; some even got a huge jump. 😩😖😞

👍︎ 2
📰︎ r/EscapefromTarkov
👀︎ u/BendakBR
📅︎ Mar 06 2020
I love playing in open-air stadiums because I can kick the ball ALMOST out of bounds and the dumbass CPU lets it hit the ground. But this. THIS. Is a new one. SAFETY. v.redd.it/0jrfphtsbud41
👍︎ 37
📰︎ r/Madden
👀︎ u/pinkocommieboi
📅︎ Jan 30 2020
Intense CPU bound game ideas

What if I told you I coded a 'They Are Billions' type of game in 3 months, handling 20k units at once with normal RTS mechanics along with 50k resources on a 2D map using a special programming paradigm?

I'm now looking for your most CPU intensive, never-done-before game ideas. I'll try to make a game out of that with this out of the box programming paradigm.

👍︎ 16
📰︎ r/gameideas
👀︎ u/randomgaia
📅︎ Dec 09 2019
Under ANY logical circumstances he’s DIVING into the end zone. But since there’s a CPU controlled player it makes sense for him to spin so he can get pushed out of bounds. This game is hilarious sometimes v.redd.it/st97xwix3mb41
👍︎ 5
👀︎ u/Idiotfromthebook
📅︎ Jan 18 2020
Settings to reduce stutter when CPU bound in games? (other than capping FPS)

So I mainly play Fortnite and Modern Warfare 2019, and my CPU is a pretty old 4670k @ 4.3GHz (4 threads), so in game it's usually pinned at 100% and not really using all of my GPU.

I have a 144Hz monitor so I would like to keep the FPS as high as possible (I never really hit above 144 anyway on my 1060 6GB).

Is there a way to stop the micro stuttering and freezes I get from being limited by my CPU?

I remember there being a setting like maximum pre-rendered frames or something that was supposed to help, but I can't find that setting anymore.

What settings do people recommend i change to help with this (cpu bottleneck)?

Thanks

Gershy13

👍︎ 2
📰︎ r/pcmasterrace
👀︎ u/Gershy13
📅︎ Mar 02 2020
2070 Super @ 4K is GPU Bound but would slightly faster CPU help??

I have a 2070 Super and recently upgraded to an awesome Dell UP3216Q 4K monitor. Awesome, except that it has no adaptive refresh rate. That's fine, as I don't play FPS games much, so I just need to hit 60 FPS to avoid tearing. But I love eye candy, so I still want High or Max visual settings.

Any idea what a slightly better CPU would do for me? I have an i5-6600K running all 4 cores at 4.5GHz. I can see in Afterburner, GPU-Z, etc. that GPU usage is pegged at 100% all the time at 4K, but CPU usage is around 70% or less.

What would replacing my i5-6600K with an i7-7700K do for my FPS? Anything?

👍︎ 2
📰︎ r/nvidia
👀︎ u/SteverinoLA
📅︎ Dec 20 2019
new to the game and playing as qb1 and i hate how the cpu receivers comeback for the ball like this, am i doing something wrong? i also hate how the receiver runs out of bounds like that but had so much space to turn up field, it’s still a problem like in previous maddens v.redd.it/344wlo8r4dc41
👍︎ 7
📰︎ r/Madden
👀︎ u/momorenos
📅︎ Jan 22 2020
/u/vblanco has been working on an experimental fork of Godot that increases performance by around 50% and makes the engine no longer CPU bound. github.com/godotengine/go…
👍︎ 661
📰︎ r/gamedev
👀︎ u/Two-Tone-
📅︎ Jun 03 2019
Developer vblanco20-1 has been working on an experimental fork of the engine that increases performance by around 50% and makes it no longer CPU bound. github.com/godotengine/go…
👍︎ 373
📰︎ r/godot
👀︎ u/Two-Tone-
📅︎ Jun 02 2019
Going to an R5 3600 from an i5 7400, how much more performance (CPU bound games) should I expect?

I'm gonna upgrade in the next few months and I'm curious to see how much of a boost I'm gonna get, as I can't seem to find a proper benchmark comparing these two particular processors.

👍︎ 3
📰︎ r/Amd
👀︎ u/ctf_sawmill_Fan
📅︎ Nov 16 2019
Is this game CPU bound?

Can this game still be enjoyed if your CPU is slightly below the minimum requirements? I'm still rocking a Phenom II X4 which is ancient by modern standards but has been enough for my needs (mostly indies and older AAA titles). Kingdom Come: Deliverance was my wake up call that my setup isn't adequate for modern AAA titles but that game is built in CryEngine which is notoriously CPU intensive. The rest of my rig is in line with the recommended specs for Outer Wilds (GTX 1060, 8GB RAM, SSD). The only info I've found is anecdotal reports that this game suffers from stuttering like some other games built in Unity due to asset streaming. My display is just a standard 1920x1080 60Hz panel.

Will the game be playable on my setup or should I wait until I upgrade so I don't have a bad experience?

Update: Got the game; no issues at all so far, but I haven't explored too much yet. Seems like your GPU matters more than anything else, as is usually the case.

👍︎ 2
📰︎ r/outerwilds
👀︎ u/primehunter326
📅︎ Dec 31 2019
PC: Great game, but TERRIBLY optimized graphically (CPU bound)

The engine, as usual, is highly CPU bound.

Currently I am running an i5 6500 and a GTX 1060 6GB. I know that my CPU isn't on the best side, but it's decent. The game seems to utilize 100% of my CPU and 4% of my GPU and gives microstutters all the time while playing.

This same PC runs BF1 and The Witcher 3 just fine.

This single point makes this game a miss for me, at least until I upgrade my CPU. Seriously, optimize your game engines.

👍︎ 21
📰︎ r/modernwarfare
👀︎ u/persason
📅︎ Sep 22 2019
Will an eGPU improve performance in CPU bound games?

At the moment I mainly play games like Stellaris that appear CPU bound (fps and game speed tank late game, but it easily hits 60fps early game), running off the iGPU.

XTU shows my laptop (8560u) is power-limit throttling. My question really is: how much of this power limit is being used by the iGPU, and will moving the graphics load to an eGPU indirectly improve performance by freeing up more of the power budget for my CPU?

Running furmark on its own will use the full 25W boost power limit with 4% CPU load so I assume the iGPU is capable of hogging a load of power.

👍︎ 8
📰︎ r/eGPU
👀︎ u/ICWiener_
📅︎ Oct 21 2019
Revive CPU or GPU bound

I've been trying to play Asgard's Wrath and Defector. They play reasonably well during low movement but stutter badly during combat. I have a 1070, OG Vive and Odyssey+. Does anyone know how much better it would be if I went to an Nvidia 2080?

👍︎ 4
📰︎ r/virtualreality
👀︎ u/dupdup7833
📅︎ Nov 17 2019
How does the Low Latency mode on Ultra / On affect Performance in CPU or GPU Bound scenarios ?

So I searched YouTube and Googled but didn't find an answer to this.

How does Low Latency mode affect FPS in GPU-limited games versus CPU-limited games?

👍︎ 5
📰︎ r/nvidia
👀︎ u/Evonos
📅︎ Oct 02 2019
The npm Registry uses Rust for its CPU-bound bottlenecks rust-lang.org/static/pdfs…
👍︎ 179
📰︎ r/programming
👀︎ u/steveklabnik1
📅︎ Feb 26 2019
Bad FPS/drops to low 100 on 144hz, CPU bound on i5-2500k @ 4.5 Ghz

Hi, I've been having FPS problems for a while now, after being able to play successfully at 144Hz for over 3 years, from 2015 to 2018, on the same hardware.

Specs:

  • CPU: i5-2500k @ 4.5 Ghz
  • RAM: 16 gb
  • GPU: GTX 1070 (historically had a GTX 680)
  • Screen: BenQ XL2411Z

I upgraded to the 1070 in early june 2017 and had no problem with csgo at the time and for a while after that. The troubles only started when the Danger Zone update hit. I went from smooth gameplay where I did not notice stutter at all to the point of having janky aim because of stuttering during gun fights.

So I started trying to figure out if I can fix it and so far no dice. I seem to be CPU bound where I'm pretty much always at 100%, regardless of graphics settings. Setting it on medium yields roughly the same frames as everything high/very high and anisotropic x16 (which were my settings on the 680 anyways) and low gives more frames overall but sees the same dips and usage.

Figuring it might be background app or the such, I set up a windows 10 on a fresh Samsung EVO 860 SSD and installed only the following: firefox, steam, csgo, nvidia control panel (no geforce experience). Disabled the usual tracking stuff, game DVR, windows defender. My test map is going into dust 2 deathmatch and csgo uses about 85%-90% on average, highest I saw was 93% lol. On medium settings, it is the same thing, it eats up everything. Disabling multi-core rendering (for testing purposes) puts me around 80-85% total, so like a good 70-75% csgo but I have trash frames, like sub 80. (launch options: -novid -nojoy)

There are absolutely no processes running in the background, the highest I saw was windows desktop manager that spiked to 10% but was generally 3% or less.

I remember that before my troubles, the lowest frames I'd get was looking down mid doors from T spawn and iirc it dropped to about 120 fps while right now, watching down mid doors from top mid I hover around 75-80 fps..

That win 10 install is from today (was hoping to get this working to play the operation) but I have also tried a win 7 and a linux (ubuntu) installation in the past (also another win 10) all yielding fairly the same results.

I would say the gameplay/ fps drops feels very similar on my partition that only has csgo on it vs my main partition running my 2nd screen and having discord and firefox open on the side.

At this point I'm out of ideas after scouring the web for this long and I'm just left concluding that the danger zone upd

... keep reading on reddit ➑

👍︎ 5
📰︎ r/csgo
👀︎ u/Rayfloyd
📅︎ Nov 20 2019
Lag stuttering in CPU bound games

Hey,

I've been having this problem recently where my game stutters every 2-3 minutes or so. This mainly happens in CSGO and Fortnite, which are both CPU bound games. I can see that my game's FPS drops from around 150 to 70 in CSGO, but it completely freezes. I think it has something to do with the power supply, that it's too few watts. I currently have a 450W power supply. Any tips or advice is highly appreciated.

👍︎ 2
📰︎ r/buildapc
👀︎ u/Thyman_
📅︎ Sep 24 2019
[no spoilers] What to do when cpu bound?

I’m CPU bound in this game to the point where my graphics card can be chilling at 50-60% usage whilst my cpu sits at 90-100% a lot of the time. How can I either make use of my gpu more or reduce the cpu load? I’ve already tried usin nvidia dsr to adjust the resolution but unfortunately it doesn’t work in subnautica.

Thanks for the input guys!

👍︎ 3
📰︎ r/subnautica
👀︎ u/titanmongoose
📅︎ Nov 29 2019
R5 3600x or i7 9700k for CPU bound games?

Soo... I play mainly CS:GO, which is extremely CPU bound and also extremely dependent on high FPS. I know that usually it would make more sense to go with a more expensive GPU and a less expensive CPU, but does that rule also apply to CPU bound games? I'm really struggling with what I should buy. Edit: Probably going with a 5700 XT, maybe a 5800 (XT) if it still releases this year and isn't too expensive. Gaming at 1440p 144Hz.

👍︎ 2
📰︎ r/intel
👀︎ u/Joramsim
📅︎ Aug 03 2019
PSA for alienware m15 r1 users who play CPU bound games

I play league of legends 90% of the time, it's my favorite game. As you guys are all well aware, these laptops do not have gsync so we have very limited options for a smooth gameplay experience

  1. Vsync (No tearing but input lag)
  2. No Vsync (No input lag but tearing)
  3. Scanline Sync (little input lag + no tearing) [this is set up through RivaTuner Statistics Server, look it up]

So naturally I went for scanline sync, because I didn't want the tearing and I wanted to minimize the input lag, and I capped League of Legends at 120 fps (because I could never get a solid 144 fps).

This was fine for the most part, but I always noticed a huge dip in my frames from time to time, even when nothing was going on. It was inconsistent, but it seemed to happen when minions show up, other champs appear, sometimes when I use abilities, etc. This drove me mad for MONTHS, since I have owned the m15 since it first came out last November, and I tried EVERYTHING to resolve this issue.

  1. Nvidia control panel tweaks for more FPS
  2. Setting all graphic settings to minimum on league
  3. Looking up some config files

But no matter what, I always had the dip somewhere.

I can officially say that I have pretty much minimized the dips to the point that I don't even notice them anymore and the answer was just to buy a new kit of RAM! How crazy?

All I did was buy a kit of "Kingston Technology HyperX Impact 16 GB 2666MHz DDR4 CL15 260-Pin SODIMM Laptop Memory" and the rest was history! Now I can even play league closer to 144 fps with minimal dips in fps to the point I don't notice! Just for reference the stock RAM has a CL19 so it's got a little more latency than this kit despite them both having the same speeds.

👍︎ 7
📰︎ r/Alienware
👀︎ u/LiquidShadowFox
📅︎ Oct 01 2019
ESO CPU Bound or GPU Bound?

Been playing on my new setup, a 3800X + Vega 64 @ 3440x1440 100Hz. I limit my fps to 100 and it sits there nicely pretty much constantly, except when I'm in a heavily populated area or city, where it can dip to 55 fps or so.

Is the game still heavily CPU bound? or have they managed to actually start pushing more towards the GPU?

I read that ESO is going to be released on Google Stadia which means it will get converted to Vulkan, which imho is a good thing IF they push it over to PC (probably Xbox as well) as it should then be able to harness GPU power better.

I have just bought a 5700 XT which arrives tomorrow, so I will see for myself if it makes much of a difference. I'm thinking probably not, though, as I suspect the game is still more CPU bound.

Anyone have any ideas?

👍︎ 3
👀︎ u/FalcUK
📅︎ Aug 22 2019
How do you know if you are GPU, CPU, or RAM bound in a given application? ELI 5 to get a decent experience in the Index

No troubles on Vive for years. Now on the Index it is too choppy to play Beat Saber, even at 90Hz. So it is either my hardware or my settings.

HARDWARE

-i7-4770k

-1080ti

-8Gb 1600MHz RAM

Trying to track this with 2 tools at the moment: Advanced Frame Timing & the default Windows 10 Task Manager

  • Task Manager's Performance tab tracks CPU/RAM/GPU utilization. These are never quite maxed on any scale even as the game chops and motion smoothing kicks on and off

  • Advanced frame timing... I always use this, but honestly it is Egyptian to me. I watch it as others play hoping that stats will start to correlate.

STEAM VR SETTINGS

  • "Custom Resolution" under Applications controls a specific application? Beat Saber already has a default of 66%; all others seem to be 100%. Odd, no?

  • "Enable Custom Resolution" is something I don't even mess with, but I can tell you that Steam sets it to 150% for 90Hz, 114% for 120Hz, and 94% for 144Hz

------------------------------EDIT---------------------------------

Of the (many) VR games I have installed, it looks like Beat Saber and Pavlov had themselves adjusted to 66% in SteamVR -> Applications -> Resolution. I set these back to 100% and performance was much better. Good enough to get me to 144Hz. Go fig... That makes no sense.

Lastly, at 144Hz in Beat Saber (all default resolutions in SteamVR) I was consistently barely above the reprojection line (smoothing). But here is the thing, utilization was constantly:

  • CPU 43%

  • Ram 3.9GB out of 8GB

  • GPU 58%

WTF???

👍︎ 7
📰︎ r/ValveIndex
👀︎ u/CaptnYestrday
📅︎ Jul 12 2019
BFV Graphic Settings which are CPU vs GPU Bound ?

Hi all,

I have an old i7 4970 with an RTX 2080, and I'm wondering if anyone can help explain which settings push the CPU and which push the GPU in BFV? I'm looking at putting the settings which affect the GPU at max while maybe lowering the CPU-bound effects down a bit, you see.

Thanks !

👍︎ 2
📰︎ r/BattlefieldV
👀︎ u/AngrySixInches
📅︎ Sep 06 2019
Am I CPU Bound?

I haven't found a whole lot of data on this, but while I've been upgrading my PC, the wifey's is having some issues, which... well, seems odd.

She's dropping frames, and I'll share her specs with you so you'll understand why, for the life of me, I cannot imagine why she's dropping frames in Overwatch, or in any game:

CPU: i7-6700k (@ stock)

CPU Cooler: CORSAIR Hydro Series, H110i RGB PLATINUM 240mm RAD

RAM: 32 GB G.Skill 3200Mhz Timing 16-18-18-38 (Model F4-3200C16D-16GVKB 2x Kits)

GPU: EVGA GeForce RTX 2070 XC GAMING (Case Mods Vertical Mount)

Motherboard: Asus Z170-TUF Sabertooth Mark 1

SSD: SAMSUNG 970 EVO PLUS 500GB

Case: NZXT s340

Monitor: LG 34UC89G 34" 144Hz G-Sync 21:9 ratio widescreen monitor (connected via DisplayPort)

So... spec wise I can't imagine how this rig is getting any drop in frames. My only possible thought is: Could the CPU be limiting the GPU?

👍︎ 3
📰︎ r/techsupport
👀︎ u/Zithero
📅︎ Aug 14 2019
Do PCI Express, SSD readwrite speed and RAM speed have anything to do with the performance of a seemingly CPU-bound python script?

I personally don't have much experience on the hardware side of things, or with how any hardware improvement may affect the performance of a Python script, which is why I am here asking for help.

So I have a loop-based number-crunching script that does transformations on NumPy arrays and pandas DataFrames. This script only reads the data once and saves once during the run, and most of its time is spent on the number-crunching side, so it does not seem to be I/O bound. It seems to be CPU-bound (I could be wrong, but the reason I suspect it is CPU-bound is that when I use multiprocessing on it, all of my cores/threads are at 100% utilization).
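
Roughly the pattern in question, as a generic sketch (the transform below is a stand-in, not the real number crunching):

    # Generic sketch: if each chunk's transformation is independent, a process
    # pool uses every core (matching the "all cores at 100%" observation).
    # transform() is a placeholder, not the real work.
    import numpy as np
    from multiprocessing import Pool

    def transform(chunk):
        return np.sqrt(chunk) * np.log1p(chunk)

    if __name__ == "__main__":
        data = np.random.rand(8_000_000)
        chunks = np.array_split(data, 16)
        with Pool() as pool:
            result = np.concatenate(pool.map(transform, chunks))
        print(result.shape)

If a run like that already pegs every core while the one-time read and save finish quickly, faster PCIe or NVMe storage is unlikely to change much; RAM speed can matter somewhat, since large NumPy arrays stream through memory.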

Here is my question: let's say in the future I update my motherboard to a new one that has PCIe 4.0, update my RAM to 3600MHz RAM, and also update my hard drive to an NVMe SSD. Will these upgrades have any effect on the performance of my number-crunching Python script?

Thank you in advance.

Note: not sure if it is relevant, but I am running script on Windows 10 64-bit, Python 3.7.3, Python IDE Spyder 3.3.6 and IPython 7.7.0

👍︎ 2
📰︎ r/Python
👀︎ u/Mathwizzy
📅︎ Aug 02 2019
Faster/lower CAS latency RAM worth it for heavily CPU-bound and unoptimized UE3 game? (Rising Storm 2)

Specs: 3900X paired with G.Skill Trident Z RGB (Samsung B-die) 3200MHz with 16-18-18-38 timings. The exact memory kit model number F4-3200C16D-16GTZRX.

I play a game called Rising Storm 2: Vietnam almost exclusively, but the game is notoriously unoptimized. It runs on Unreal Engine 3, has almost no support for multithreading (I generally see 2 threads at nearly 100% load, with a few others at medium loads, but those are probably background processes), and the large maps place tons of draw calls on the CPU. It's 32v32 with fairly realistic physics, so there are a lot of physics calculations being done. There is a setting I have enabled that supposedly offloads some of these to the GPU, but I can't say for sure whether it really does or to what degree.

To give an idea of how badly optimized the game is, on certain maps I'll average 70-90fps, while on other maps I'll average in excess of 120fps, even getting up to 140+fps. And this is at 1440p on Ultra settings, so a decent part of the CPU bottleneck should be alleviated. Neither lowering the resolution (even down to 720p) nor lowering the graphics settings significantly changes the frame rate. The game was released in 2017, about 2 months after the 1080 Ti. I have an RTX 2080, which is nearly identical to the 1080 Ti performance-wise, so for any optimized game of that time with similarly mediocre UE3-based graphics I would imagine I'd get 140fps or more, fairly consistently.

I've tried disabling one of the CCDs, disabling SMT, and even both at the same time, but these don't seem to make a noticeable difference. However, I should note that there is no in-game benchmark and frame rates are influenced by highly variable factors (like player count or how many explosions are occuring), so it's possible that there is some small difference.

I'm wondering if higher speed or lower latency RAM (or both) will help. I'm thinking about either 3200CL14, 3600CL16, or (if I can find it at a good price) even 3600CL14. When I upgraded from generic, non-B-die 3000CL16 to B-die 3200CL16 (though this was when I was running a 2700X), I did notice a change in frame rate of maybe 10-30fps on the menu screen, although there are likely no real physics calculations being made on the menu screen, and my frame rate there is usually double or more my actual in-game frame rate.

So if any of you have experience with upgrading to faster/lower latency RAM and seeing performance differences in CPU-bound titles, I'd

... keep reading on reddit ➑

👍︎ 3
📰︎ r/Amd
👀︎ u/tedshino
📅︎ Aug 05 2019
Will motion smoothing from 62 Hz to 144 Hz save me from being CPU bound or will it make things even worse?

I am currently on a long vacation and am suffering from this serious first world problem: When I come back home to my PC (1080ti, i7 3770k PC @ 4200 MHz, 32 GB RAM @ 1833 MHZ) and my first wave Index in the end of June, I am very much afraid of being denied the 120 Hz VR I now feel I urgently need by my antique CPU/RAM. So will motion smoothing (or some other form of reprojection etc) from 72 Hz to 144 Hz enable me to play CPU intense games like Skyrim with a higher refresh rate? Will my graphic card calculate the missing frames and my CPU only has to keep up with 72 frames/s or does it shift the load to my CPU and fuck me even worse? I am having a hard time justifing upgrading my CPU/Motherboard/RAM if most of the games will run at 120 Hz just fine, but I think about being CPU bound way more than I should (like really a lot more). Sorry for the rant, maybe I will just keep refreshing r/valveindex every 5 minutes for the next month and hope for a better CPU overclock when I get home.

tldr: Does motion smoothing help with being CPU bound or does it make it even worse?

PS: I originally posted in r/valveindex and the post was deleted. I realise this post is to some degree a shitpost and to some degree a PC Hardware and Steam VR question, but I still feel it is pretty Index specific and will probably come up once the index is released. Sorry if this is the wrong sub as well

PPS: 72 Hz, obviously

👍︎ 8
📰︎ r/vive_vr
👀︎ u/Cows_are_scary
📅︎ Jun 05 2019
Python threading and subprocesses explained Take advantage of Python’s ability to parallelize workloads using threads for I/O-bound tasks and subprocesses for CPU-bound tasks infoworld.com/article/331…
👍︎ 3
👀︎ u/ashish316
📅︎ Oct 01 2019
No Man's Sky: CPU or GPU bound?

In your experience have you found No Man's Sky to be more heavily limited by CPU or GPU performance?

👍︎ 3
📰︎ r/NoMansSkyTheGame
👀︎ u/JimHart64
📅︎ Jun 06 2019
Just your regular solo battle cpu td. Clearly out of bounds but ea thinks otherwise. Good thing it was a garbage time td and not a deciding factor on the outcome.
👍︎ 2
👀︎ u/infamousprophet3
📅︎ Sep 22 2019
BTW DX11 uses more GPU so if you get input lag try making it more CPU bound

Tricks like 1.41 renderscale in the ini or ultra textures can actually be detrimental; experiment with settings and see what feels better.

👍︎ 12
📰︎ r/Planetside
👀︎ u/wrezl
📅︎ Apr 27 2019
Current build on staging has a fix for CPU bound clients that may increase fps by 20%, Helk says twitter.com/Helkus/status…
👍︎ 100
📰︎ r/playrust
👀︎ u/dennisxD
📅︎ Aug 11 2018
How to benchmark django to know whether the server is CPU-bound or IO-bound?
  1. Figure out CPU-bound or IO-bound

  2. Find the offending code in each case

How do you go about them?
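
One cheap first pass, before reaching for a full profiler: compare process CPU time against wall-clock time around a representative code path. If the two are close, the work is CPU-bound; if CPU time is a small fraction, the time is going to I/O. A minimal sketch (the handler below is a stand-in, not Django-specific):

    # Sketch: compare CPU time vs wall time around a representative code path.
    # cpu_share near 1.0 -> CPU-bound; cpu_share small -> waiting on I/O.
    import cProfile, pstats, time

    def handle_request():
        # stand-in for a view / ORM-heavy code path
        return sum(i * i for i in range(2_000_000))

    wall0, cpu0 = time.perf_counter(), time.process_time()
    handle_request()
    wall = time.perf_counter() - wall0
    cpu = time.process_time() - cpu0
    print(f"wall={wall:.3f}s  cpu={cpu:.3f}s  cpu_share={cpu / wall:.2f}")

    # Step 2: find the offending code
    cProfile.run("handle_request()", "profile.out")
    pstats.Stats("profile.out").sort_stats("cumulative").print_stats(10)

For step 2, cProfile output like the above (or a sampling profiler such as py-spy attached to the running server) points at the offending functions.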

Thanks!

👍︎ 14
📰︎ r/django
👀︎ u/netok
📅︎ Feb 03 2019
According to TPU and many reviewers, the R5 1600X and 1500X outperform the i5 in CPU bound games and in streaming as well.
👍︎ 757
📰︎ r/Amd
👀︎ u/MadCri
📅︎ Apr 12 2017
Poor CPU performance in CPU bound games?

Hello there!

Recently I've started noticing more and more that games that rely a lot on the CPU often perform very badly. Prime examples are Kerbal Space Program and Civilization VI.

I ran cpupower frequency-set -g performance but no luck.

I doubt my CPU can't handle those two games; I have an AMD Ryzen 2600X.

👍︎ 8
📰︎ r/ManjaroLinux
👀︎ u/danielsuarez369
📅︎ Feb 06 2019
Network security concern with CPU bound PoW

Disclaimer: I am not a miner and I have no economic self interest in which mining algo Monero decides on.

Currently our network security is compromised by a centralized group of ASIC miners, and this problem needs to be addressed ASAP. A lot of work has been done on CPU bound PoW ideas (RandomJS, RandomX, etc), and while I value the research, I see a few serious issues with CPU bound PoW that I would like to see addressed.

Firstly, I think many people in this community hate ASICs as an inherently centralizing force in the network. While I am no ASIC fanboy myself, I think we need to be careful about this line of thinking. ASICs specifically aren't the problem. The problem is that 85% of hashpower is owned by one group. This can happen in CPUs and GPUs also. I think we need to change our design goal away from "ASIC resistance" to something more like "equal access to competitive mining hardware". To this point, I don't think CPU mining will allow equal access for miners, as I will outline below.

Consider this: CPUs are widespread and mostly centralized into a handful of server farms around the world. Amazon, for example, has exaflops of compute power that at any moment may be sitting idle (when not in peak usage). Microsoft, Google, Facebook, etc all have similarly large excess compute capacity. If these large farms direct even a fraction of their idle power towards mining Monero, then Monero difficulty will rise to the point that no dedicated mining operation would be able to stay in business. The big corps will always buy the latest and greatest servers for their web hosting businesses, and don't need to consider Monero profit in their decision to buy better hardware. Dedicated miners on the other hand need to justify new hardware upgrades with profit margins from mining Monero. Profit margins will likely be so slim, that dedicated mining operations will not be able to compete and will shut down.

Sure, anyone can buy a CPU or a server rig, but can anyone really setup a mining operation that competes with Google or Amazon? Especially considering Google and Amazon don't need to consider Monero profit as a justification to buy more servers?

This will result in a network whose global hashpower will be centralized to a few large server farms owned by companies that do not care about Monero. How is this different from a few ASIC companies that dominate the network? Well one difference is that the new owners (Amazon, Facebook, etc) have repeatedly proven th

... keep reading on reddit ➑

👍︎ 4
📰︎ r/Monero
👀︎ u/accenthammer
📅︎ Feb 10 2019
Intel 9700k or 9900k for CPU bound games

I play a lot of 4x and simulation games (Civilization VI, Stellaris, Cities Skylines, etc.). I have a 4790k CPU currently, which does ok, but late-stage games do take their time somewhat.

For example, Civilization VI late-game on a large map can take about 15 seconds to run through an entire turn with several AIs (single player, plus fast animations/battles on), and in Cities Skylines the fastest speed becomes the normal speed after a city of 200k+ population.

If I upgrade to a 9900k CPU, should I expect any sort of noticeable improvement in turn times and simulation speeds, or any other as yet unrealized benefits?

Thanks

👍︎ 12
📰︎ r/intel
👀︎ u/Lordberek
📅︎ Sep 15 2018
i5 7400 Reason for CPU Bound Reprojection in SteamVR?

Title. I'm currently running an i5 7400 cpu with an RX480 8gb GPU and 8gb DDR4 ram and per the indicator light you can turn on for WMR headsets using SteamVR, I'm constantly having reprojection due to being CPU bound. I don't really understand why. This is sometimes happening even in not very intensive games like Eleven Table Tennis. Can anyone offer some insight? Is it really my CPU or something else? Would more and better ram help? Any insight would be more than welcome. I'm very rarely GPU bound, despite it not being the tippy top of the line.

👍︎ 2
📰︎ r/WindowsMR
👀︎ u/kinsarc
📅︎ Mar 05 2019
Meanwhile in the average YouTube comment section of a crucial CPU-bound benchmark video.
👍︎ 617
📰︎ r/pcmasterrace
👀︎ u/hambopro
📅︎ Aug 30 2017
ESP Async Web Server - Serves requests but never finishes them when CPU bound

I have an app that contains a web server implemented with the ESPAsyncWebServer component, which works pretty well.

My problem is that once I have created all the other tasks in my app, across two cores, the web server still works but is very slow and in fact never finishes serving multipart requests.

Is there perhaps an idle task or something that is not getting serviced or any other ideas? I added delay(100); into the loops of my other tasks temporarily to make sure that there is CPU time free, but that did not help. So it's not that all the CPU is consumed, necessarily, but the web response never gets completed.

Any ideas? Since I posted this I went and looked at the code for delay() and noticed that FastLED has overridden it with a busy wait, which I don't understand, but that worries me!

👍︎ 3
📰︎ r/esp32
👀︎ u/davepl
📅︎ Apr 30 2019
CPU bound reprojection always on in Elite Dangerous

Hi all. I'm trying to troubleshoot some performance issues in Elite Dangerous with my Odyssey+. I was getting frequent frame skips/stutters, so I decided to try out enabling motion reprojection. I opted in to both the SteamVR and Windows Mixed Reality for SteamVR betas and uncommented the line "motionReprojectionMode" : "auto", in the config file.

What I noticed is that now the light blue indicator is always on in Elite Dangerous, meaning reprojection is always on due to being CPU bound. I've tried using the VR Low profile in Elite Dangerous and disabling all supersampling. I've even tried running at 50% SS (which looks awful), but CPU bound motion reprojection is still always enabled except for very brief periods where the indicator will turn green for a few seconds.

I feel like something must be wrong, as my hardware should be able to hold 90fps no problem. I have a GTX 1080ti and i7 4790k

👍︎ 8
📰︎ r/WindowsMR
👀︎ u/alether2
📅︎ Dec 02 2018
Understanding CPU and I/O bound for asynchronous operations hellsoft.se/https-hellsof…
👍︎ 32
📰︎ r/Kotlin
👀︎ u/dayanruben
📅︎ Jan 30 2019
PSA: "Reduce Buffering" increases lag if you are CPU-bound; enable only if you are getting "three dots" next to the FPS counter

Reduce buffering syncs your CPU simulation-start to the GPU render-end. If your game is GPU bound, the total lag is equal to

1/FPS + (t_GPU - t_CPU) = (2 * t_GPU) - t_CPU

Enabling "reduce buffering" forces the total lag to be

1/FPS = t_GPU + t_CPU

If your game is CPU-bound, the total lag is equal to

1/FPS = t_CPU

Enabling "Reduce Buffering" forces the CPU to start simulating only when the frame finishes rendering, which makes the total lag equal to

1/FPS' = t_GPU + t_CPU

To see if your game is CPU-bound or GPU-bound, run your game in your typical scenario with reduce buffering off and framerate uncapped, and see if there is one dot or three dots next to the FPS counter. One dot means CPU-bound, and three dots means GPU-bound. (you can verify this yourself by turning up to an absurdly high graphics setting. It will show three dots. Conversely, turning everything down including the resolution will result in one dot.)

For absolutely minimum lag, you should run the game CPU-bound (one dot) with "Reduce Buffering" off.
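
Plugging illustrative numbers into the formulas above makes the advice concrete (frame times are made up; this just evaluates the model as written):

    # Made-up frame times (ms), using the formulas above as written.
    def lag_gpu_bound(t_cpu, t_gpu, reduce_buffering):
        return t_gpu + t_cpu if reduce_buffering else 2 * t_gpu - t_cpu

    def lag_cpu_bound(t_cpu, t_gpu, reduce_buffering):
        return t_gpu + t_cpu if reduce_buffering else t_cpu

    # GPU-bound (t_gpu = 12, t_cpu = 4): 20 ms off vs 16 ms on -> setting helps
    print(lag_gpu_bound(4, 12, False), lag_gpu_bound(4, 12, True))
    # CPU-bound (t_cpu = 12, t_gpu = 4): 12 ms off vs 16 ms on -> setting hurts
    print(lag_cpu_bound(12, 4, False), lag_cpu_bound(12, 4, True))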

"Reduce Buffering" should only be used as a last resort if you cannot run your game CPU-bound.

EDIT 2: Made a mistake in my imgur illustration. Will update the diagram with a correction, as well as making it clearer. The gist of it is that if you draw out how all the different render processes fit together, you can easily find exactly the time differential between the game reading the input and the GPU finishing drawing the frame.

EDIT 3: People pointed out that Battle(non)sense's tests seems to contradict what should logically happen. I suspect that there might be something else going on involving the FPS cap, where some faulty coding might be causing the FPS cap to introduce extra latency that should not happen. I will be investigating this with testing.

In addition, some people have cited sources that describe a different interpretation of the dots next to the FPS counter. However, it seems to contradict the actual behavior in-game, instead my description matches how it's currently working. Interestingly enough, on the PTR build, it does behave as described by that source, so this seems to be further evidence that Blizzard has some faulty coding in the render engine that they might be working on fixing.

For the present, my advice is now preliminarily changed to three scenarios:

If your system is entirely overkilling the FPS cap: enable reduce buffering

**

... keep reading on reddit ➑

👍︎ 296
👀︎ u/everythingllbeok
📅︎ May 06 2017
Windows 10 Fall Creators Update -Ryzen CPU-bound benchmark youtu.be/PMU-ZFDwJW8
👍︎ 136
📰︎ r/Amd
👀︎ u/andrei_pelle
📅︎ Oct 20 2017
Optimizing FPS in CPU Bound Systems?

Hi guys, I have been following the optimization tweaks and tricks OW players have been discussing to maximize fps for competitive play for about two years now. I have tried just about every single "tip" promising increased frames, and in almost every combination, and some things seem to work but most dont seem to do much noticeable. What are some of the ways to *actually* limit CPU usage and consumption by the system to prevent significant frame drops during high action sequences?

My computer specs:

Intel i5-7400 3.0GHz (since it is the non-K chip, I cannot really overclock my CPU, but the multiplier will show up to 3.3GHz in "High Performance" power settings during turbo boost).

GeForce GTX 1060 3GB

8GB 2133mhz DDR4 RAM (I believe from what I have read from OW fps tuners online that this may be an unseen bottleneck in fps as the game seems to prefer more and higher speed RAM)

Windows 10 64-bit

144hz monitor - ultimate goal is to achieve a STABLE ~140fps at the peak of team fights, meaning it never drops below 140fps.

In game statistics and performance metrics:

I have found that for those who dont run a steady 250+ fps in uncapped framerate setting, setting an FPS Limit via the in game setting for "Display-Based" gives much more stable frames and overall *lower* overall valleys during the FPS drop sequences. When my game runs uncapped fps it will start around 180~ fps when the gate opens and down to around 100fps during the peak of 6v6 team fights. When Display-Based limit is enabled, I start the rounds at a steady 154 fps (the game's 144hz frame cap) but team fights will bring me to about 120fps lows for one or two seconds before jumping back up to around 135-140 fps (where I wish the fps always stayed steady).

During the course of a competitive match, my **GPU Usage** never exceeds around 50-55% and the temp never over 70C, yet my **CPU Usage %** starts the round's 1 minute wait period at around 60% and will bounce around the 90%+ region for the entirety of the match once doors open, and the 6v6 brawls with heavy action reach 99-100% easily and cause 20-30 fps stutters before re-normalizing. **Yet I notice despite my CPU and all 4 cores seemingly being maxed out at the height of the competitive match, my CPU temp never exceeds 50-55 degrees C, which seems very low for a fully taxed processor at "peak" loads.**

MSI Afterburner never shows FB Usage % of my GPU to ever exceed

... keep reading on reddit ➑

👍︎ 8
👀︎ u/skrilla76
📅︎ Jul 12 2018
ELI5: Why does the GIL slow down CPU-bound multithreaded programs?

I understand why CPython needs the GIL: to prevent race conditions when counting references, which could cause memory leaks.

What I don't understand is:

  1. How do other high-level languages (like, say, Java) avoid this problem? Is it because Java uses garbage collection? Could Python discard the GIL by using garbage collection? Would that slow single-threaded performance? Why?

  2. Say you have a single-process, multithreaded program. My understanding is that within this program, two threads cannot simultaneously execute instructions. In Python, I assume that rather than switching at the instruction level, threads must switch at the 'bytecode' level because of the GIL. Why is this a bad thing? (See the sketch below.)
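
To make the cost concrete, here is a small generic benchmark: the same CPU-bound loop run serially, on two threads, and in two processes. Under CPython's GIL the threaded run takes roughly as long as the serial one, while the two-process run scales.

    # CPU-bound work run serially, in 2 threads, and in 2 processes.
    # With the GIL, threads cannot execute Python bytecode in parallel,
    # so the threaded time stays close to the serial time.
    import time
    from threading import Thread
    from multiprocessing import Process

    N = 5_000_000

    def burn():
        s = 0
        for i in range(N):
            s += i * i

    def timed(label, workers):
        t0 = time.perf_counter()
        for w in workers:
            w.start()
        for w in workers:
            w.join()
        print(f"{label}: {time.perf_counter() - t0:.2f}s")

    if __name__ == "__main__":
        t0 = time.perf_counter()
        burn(); burn()
        print(f"serial : {time.perf_counter() - t0:.2f}s")
        timed("threads", [Thread(target=burn) for _ in range(2)])
        timed("procs  ", [Process(target=burn) for _ in range(2)])

That is also much of the answer to question 1: the JVM uses tracing garbage collection plus fine-grained locking rather than reference counts guarded by one global lock, so Java threads really can execute bytecode in parallel.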

👍︎ 13
📰︎ r/learnpython
📅︎ Sep 23 2018
BFV Dual vs. Quad Channel Memory: Surprising Gains up to +30% when CPU-Bound youtu.be/8EM0g0Szg_U?t=85
👍︎ 12
📰︎ r/BattlefieldV
👀︎ u/octiceps
📅︎ Apr 06 2019
CPU Power needed / Game highly CPU bound in Falkreath Hold?

I have not yet played much of Skyrim VR. I have a 6600k @ 4.4 GHz. I had a GTX 1060 6GB before, but recently upgraded to an RTX 2080. I cannot yet test the game with the new card, since my new monitor is still not here and I had to connect my old one via HDMI (its only other connector is DVI, which the RTX cards no longer have).

Anyway, I did try out SkyrimVR a bit with the GTX 1060, lightly modded (some texture mods). Perfomance was okay, but quite bad near Riverwood and deeper down inside Falkreath Hold.

Now, with the new card, and messing around in Skyrim SE for a rough estimate, it seems that Falkreath is limited by the CPU, even in the unmodded game. The FPS is highly unstable. Even without any mods, the FPS will drop as low as 70 at some viewing angles, but will go up to 120 by just slightly moving the head. At 70 FPS, the GPU usage drops to 60%. With my current mod setup on SE, the FPS goes down as low as 45 at some viewing angles, but up to 100 at others. Indeed, playing at 1440p instead of 1080p makes no difference to the lower FPS numbers, so it does seem to be limited by the CPU.

Since I currently cannot try it myself in VR, my question: is it the same for SkyrimVR? Do the framerates drop consistently way below 90 (or does ASW activate) in Falkreath Hold, but stay at a constant 90 everywhere else? I feel that even the most powerful CPU cannot fix it, because with this game probably only single-core performance matters (I doubt Skyrim uses more than 4 cores), which has only improved by roughly 20% since the release of the 6600k.

It may be a weird question, but I simply want to know if the problem is on my end or if everyone suffers from large framedrops near Falkreath, probably limited by CPU power.

Bonus question: does anyone know why exactly this region takes such a heavy hit on the processor?

👍︎ 2
📰︎ r/skyrimvr
👀︎ u/Malaktus
📅︎ Jan 23 2019
The latest update for Rise of the Tomb Raider significantly improves DX12 performance in CPU-bound scenes dsogaming.com/news/the-la…
👍︎ 230
📰︎ r/pcgaming
👀︎ u/Ghost_LeaderBG
📅︎ May 31 2017
Quantifiable results on how CPU bound Overwatch really is

http://www.techspot.com/articles-info/1180/bench/CPU_01.png

We knew Overwatch was somewhat CPU intensive, but I'm not sure everyone knows exactly to what degree. Since my CPU (2600k) performs almost exactly like the i5 2500k, this means a 6700k would net me ~40% more fps in Overwatch, since I have a GTX 1080.

This is colossal.

Just tested avg fps in OW @ ultra 1080p 100% render; it's dead on 170fps avg, just like the 2500k test they did.

👍︎ 64
👀︎ u/MaximusCactus
📅︎ Mar 14 2017
If I upgrade to 2080ti from 980ti on a 4770K @ 4.7GHz, will I be CPU bound at 3440x1440?

I've been thinking of upgrading my graphics card now and the rest later. Would this show me a decent boost in performance at 3440x1440?

👍︎ 2
📰︎ r/intel
👀︎ u/PedalMonk
📅︎ Nov 04 2018
