Questions tagged [gpu]
A GPU (graphics processing unit) is a specialized processor designed to accelerate the process of building images.
183 questions
0
votes
2
answers
182
views
Do GPUs re-draw all of a character's vertices/triangles/fragments every frame?
I'm a beginner in game programming and I want someone to confirm my understanding.
Let's say there's a 3D model for a character in the game. This character is made up of triangles. Each vertex in ...
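For readers confirming the same intuition: yes, in a typical pipeline the full vertex stream is re-submitted and re-processed every frame. A minimal sketch of such a per-frame loop, assuming plain OpenGL with GLFW; window, characterShader, characterVao and indexCount are hypothetical handles set up at load time:

    // The mesh data was uploaded to GPU buffers once at load time, but this
    // draw call re-runs the vertex shader on every vertex and re-rasterizes
    // every triangle of the character each frame.
    while (!glfwWindowShouldClose(window)) {
        glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
        glUseProgram(characterShader);
        glBindVertexArray(characterVao);
        glDrawElements(GL_TRIANGLES, indexCount, GL_UNSIGNED_INT, nullptr);
        glfwSwapBuffers(window);
        glfwPollEvents();
    }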
0
votes
1
answer
150
views
How to efficiently construct model matrices in a vertex shader from a small amount of data, for instanced rendering of many moving 3D game characters
I am trying to efficiently render many 3D game characters: many instances of the same 3D model which can change their positions and z rotations each frame. For reference I'm using OpenGL, but this ...
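One common approach, sketched below under stated assumptions (OpenGL; instanceVbo, indexCount and instances are hypothetical names): send only four floats per instance (xyz position plus a z-rotation angle) and rebuild the transform in the vertex shader, rather than uploading a full 64-byte matrix per instance.

    // Vertex shader (GLSL) that reconstructs the per-instance transform
    // from a single vec4: xyz = position, w = rotation about the z axis.
    const char* vsSource = R"(
        #version 330 core
        layout(location = 0) in vec3 aPos;
        layout(location = 1) in vec4 aInstance; // xyz = position, w = z angle
        uniform mat4 uViewProj;
        void main() {
            float c = cos(aInstance.w), s = sin(aInstance.w);
            vec3 p = vec3(c * aPos.x - s * aPos.y,
                          s * aPos.x + c * aPos.y,
                          aPos.z) + aInstance.xyz;
            gl_Position = uViewProj * vec4(p, 1.0);
        }
    )";

    // Host side, per frame: re-upload only 16 bytes per instance, then issue
    // one instanced draw. instances is a hypothetical std::vector of 4-float
    // structs; glVertexAttribDivisor(1, 1) was set at init so attribute 1
    // advances once per instance instead of once per vertex.
    glBindBuffer(GL_ARRAY_BUFFER, instanceVbo);
    glBufferSubData(GL_ARRAY_BUFFER, 0,
                    instances.size() * 4 * sizeof(float), instances.data());
    glDrawElementsInstanced(GL_TRIANGLES, indexCount, GL_UNSIGNED_INT,
                            nullptr, (GLsizei)instances.size());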
0
votes
2
answers
314
views
How can I efficiently render lots of moving objects in a game?
I'm using OpenGL but this question should apply generally to rendering.
I understand that for efficient rendering in games, you want to minimize communication between the CPU and GPU. This means pre-...
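The usual pattern the excerpt is heading toward, as a sketch (OpenGL assumed; meshVbo, transformVbo and the data pointers are hypothetical): upload everything that never changes once, and confine per-frame traffic to the small data that actually moves.

    // At load time: static mesh data crosses the CPU-GPU bus exactly once.
    glBindBuffer(GL_ARRAY_BUFFER, meshVbo);
    glBufferData(GL_ARRAY_BUFFER, meshBytes, meshData, GL_STATIC_DRAW);

    // Per frame: only the object transforms are re-sent, not the geometry.
    glBindBuffer(GL_ARRAY_BUFFER, transformVbo);
    glBufferData(GL_ARRAY_BUFFER, transformBytes, transforms, GL_DYNAMIC_DRAW);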
1
vote
1
answer
107
views
Efficiently passing data to the GPU corresponding to a collection of objects
I am beginning to work on a physics system, where there are multiple objects in the world that get rendered. The current approach I am considering consists of pseudo-OOP in C where each object has ...
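Whatever the C-side object layout, one workable sketch is to mirror only the GPU-visible fields into a contiguous array, so the whole collection is uploaded in a single call (hypothetical names; plain OpenGL assumed):

    // One tightly packed struct per object, holding only what rendering needs.
    struct GpuObjectData {
        float position[3];
        float scale;
    };

    // A single buffer update covers the entire collection, instead of one
    // upload per object.
    void upload_objects(const GpuObjectData* data, size_t count, GLuint vbo) {
        glBindBuffer(GL_ARRAY_BUFFER, vbo);
        glBufferSubData(GL_ARRAY_BUFFER, 0, count * sizeof(GpuObjectData), data);
    }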
1
vote
0
answers
881
views
Completely independent dual GPU setup for VR with 100% "SLI efficiency"?
I have a simple (and maybe quite naive) question regarding dual-GPU use for Virtual Reality (VR); it has nagged me for years and I couldn't figure out why this can't work, at least in principle:
I ...
1
vote
1
answer
320
views
Is DirectX 12 or lower just an API?
I am programming a game using DirectX 12. Will it support all GPUs, or just newer GPUs? What about the versions of the Windows OS supported?
What changes when a new DirectX version comes out?
0
votes
1
answer
2k
views
Int vs. float: which one is faster on the GPU?
My game needs to loop through a massive amount of data, and the amount of data can increase a lot depending on world settings set by the player. The data is too big for the CPU, so I need to use the GPU for it ...
0
votes
0
answers
51
views
How many divisions does the GPU's texture mapper do in parallel?
Perspective-correct texture mapping requires one division per pixel. Before the advent of GPUs this was a problem because this was quite heavy to do on the CPU (especially back in the days of non-SSE ...
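For context on where that division sits: the rasterizer can interpolate u/w, v/w and 1/w linearly in screen space, but recovering (u, v) still costs one divide (or reciprocal) per pixel. A CPU-side sketch of a scanline inner loop; lerp, plot and sampleTexture are hypothetical helpers, with lerp(a, b, t) = a + (b - a) * t:

    // Perspective-correct texturing across one scanline from x0 to x1.
    for (int x = x0; x < x1; ++x) {
        float t = (x - x0) / float(x1 - x0);
        // These three quantities interpolate linearly in screen space.
        float uOverW   = lerp(u0 / w0, u1 / w1, t);
        float vOverW   = lerp(v0 / w0, v1 / w1, t);
        float oneOverW = lerp(1.0f / w0, 1.0f / w1, t);
        // Recovering the true texture coordinates needs the per-pixel divide.
        float u = uOverW / oneOverW;
        float v = vOverW / oneOverW;
        plot(x, y, sampleTexture(u, v));
    }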
1
vote
1
answer
412
views
Computations on the GPU in Unity
I've made a fluid simulation using particles in Unity, but now it is painfully slow because all computations are done using the CPU. In order to make it faster, I need to do computations on the GPU, ...
0
votes
0
answers
53
views
Balance load between CPU and GPU [duplicate]
I am making a game with Unreal Engine 5, but it uses more GPU power while the CPU is used much less.
I want to optimize it to use both CPU and GPU so it can be playable on low-end PCs or laptops. Is ...
0
votes
0
answers
1k
views
AsyncGPUReadback.RequestIntoNativeArray - owner has been invalidated
I have the following C# code in Unity version 2022.2.0a12:
...
0
votes
1
answer
989
views
Use of CPU vs. GPU on mobile devices
I was always told that if a task can be parallelized, I should put it on the GPU for better performance. Although this is definitely true for desktop GPUs, I was wondering if mobile GPUs were so ...
0
votes
0
answers
235
views
Is it possible to use hardware acceleration in OpenCL?
I built a small game engine using OpenCL and SDL2, but it's not running very fast compared to Vulkan and OpenGL. I wrote rasterization code, but when I did some research, Vulkan and OpenGL use hardware ...
1
vote
1
answer
3k
views
Why do we use GLSL (shaders) instead of CUDA?
I mean that GLSL and CUDA both utilize the GPU to its maximum power, and in some cases I've heard that CUDA runs faster on Nvidia graphics cards. So my question is: why don't we use CUDA more often for GPU graphic ...
1
vote
1
answer
754
views
GPU Instanced Transparent Mesh Not Rendering
I'm trying to render a bunch of clouds via gpu instancing. This works perfectly fine with the default HDRP Lit Shader, and I get the following result:
However, as soon as I change the surface type ...
0
votes
2
answers
10k
views
Low FPS in Unreal Engine, but GPU usage is low as well
I am running an Unreal Engine 4 project which has many high quality assets. My computer is fairly strong:
CPU: AMD Ryzen 5 3600 6-Core
GPU: GeForce RTX 3060
SSD: Lexar 500GB NM610 M.2 NVMe SSD
RAM: 2 ...
0
votes
1
answer
326
views
How to temporarily set an additional system environment variable only in 'play' mode inside the Godot editor?
I'm learning Godot on a laptop that has an AMD discrete GPU. My OS is Arch Linux, so if I want to use the discrete GPU I have to set a system environment variable ...
1
vote
1
answer
1k
views
Map() fails when reading back GPU Texture
I need to read back a GPU texture (stored in the GPU as D3D11_USAGE_DEFAULT). I am doing this via creating a staging ID3D11Texture. The whole application is running ...
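For reference, the staging readback pattern the question describes usually looks like the sketch below (C++, error handling omitted; device, context and srcTexture are the hypothetical D3D11 objects involved). A Map() failure often traces back to one of the descriptor fields shown here:

    // Describe a staging twin of the DEFAULT-usage source texture.
    D3D11_TEXTURE2D_DESC desc;
    srcTexture->GetDesc(&desc);
    desc.Usage          = D3D11_USAGE_STAGING;
    desc.BindFlags      = 0;                     // staging resources can't be bound
    desc.CPUAccessFlags = D3D11_CPU_ACCESS_READ;
    desc.MiscFlags      = 0;

    ID3D11Texture2D* staging = nullptr;
    device->CreateTexture2D(&desc, nullptr, &staging);

    // GPU-side copy into the staging texture, then map it for CPU reads.
    context->CopyResource(staging, srcTexture);
    D3D11_MAPPED_SUBRESOURCE mapped;
    HRESULT hr = context->Map(staging, 0, D3D11_MAP_READ, 0, &mapped);
    // On success, mapped.pData and mapped.RowPitch expose the texel data.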
0
votes
0
answers
40
views
Errors from not clearing an FBO's texture in battery economy mode
When rendering into an FBO's texture, I'm not using glClear() but overwriting each fragment; GL_BLEND is enabled.
This works just fine, but I just realised that when my laptop switches to economy mode, ...
1
vote
1
answer
358
views
How to share constant variables between Compute Shaders?
So, I have two compute shaders, A and B (using Unity and HLSL). Right now, I send my mouse coordinates to both of them before every dispatch.
So, from my understanding, you can actually ...
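Unity specifics aside, the underlying idea at the raw D3D11 level is that constant buffers are bound to the context, not to a shader, so one buffer can serve both dispatches. A sketch with hypothetical names (context, mouseCb, shaderA, shaderB, groupsX, groupsY):

    // Update the shared constants once per frame...
    struct MouseCB { float mouseX, mouseY, pad0, pad1; }; // 16-byte multiple
    MouseCB cbData = { mouseX, mouseY, 0.0f, 0.0f };
    context->UpdateSubresource(mouseCb, 0, nullptr, &cbData, 0, 0);

    // ...bind it once, and both compute shaders see the same slot b0.
    context->CSSetConstantBuffers(0, 1, &mouseCb);
    context->CSSetShader(shaderA, nullptr, 0);
    context->Dispatch(groupsX, groupsY, 1);
    context->CSSetShader(shaderB, nullptr, 0); // binding at b0 persists
    context->Dispatch(groupsX, groupsY, 1);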
0
votes
0
answers
2k
views
SDL2 for hardware-accelerated graphics?
I am attempting to make a 3D game using SDL2, just to learn and have a bit of fun. I was wondering if there is any way to get SDL2 to do calculations on the GPU. I have read that SDL2 textures use the GPU for ...
0
votes
1
answer
2k
views
Unity Build GPU Performance
I have been banging my head against the wall with this for a few days now with no improvement.
The problem is that after building, my project keeps using over 30% of the GPU. Even in the editor it takes 20% ...
0
votes
1
answer
284
views
Why was 24-bit color support introduced twice in GPUs?
I was doing research, trying to answer the question "Which was the first GPU to support 24-bit color". I know all color since 1992 is 24-bit, even in games like Doom. I mean 16 million ...
0
votes
0
answers
49
views
How can I use the GPU to crop unnecessary pixels from an image?
I would like to keep only the color and x,y coordinates of the pixels that are touching a pixel of a different color than their own (basically the edges), and remove the rest so that the GPU can ...
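The per-pixel rule itself is easy to state; a CPU reference of the "keep only pixels with a differently colored neighbor" test is sketched below (getPixel, width and height are hypothetical image accessors). Each iteration is independent, which is what makes it a good fit for a compute or fragment shader where one invocation handles one pixel:

    // True if any 4-connected neighbour has a different colour, i.e. the
    // pixel lies on an edge and should be kept.
    bool isEdge(int x, int y) {
        uint32_t c = getPixel(x, y);
        const int dx[4] = { 1, -1, 0, 0 };
        const int dy[4] = { 0, 0, 1, -1 };
        for (int i = 0; i < 4; ++i) {
            int nx = x + dx[i], ny = y + dy[i];
            if (nx >= 0 && nx < width && ny >= 0 && ny < height &&
                getPixel(nx, ny) != c)
                return true;
        }
        return false;
    }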
0
votes
0
answers
68
views
How is data written from two different GPU cores to the same memory?
Does each core’s data get written to the shared memory one at a time or both at the same time? For instance, when two cores that are next to each other need to write to the same memory spot does the ...
0
votes
1
answer
564
views
How can I make a custom mesh class in Unity?
I'm doing something in Unity where I need to specify the position and orientation of vertices with two Vector4s, and they're not just position and normal vectors. I'...
0
votes
1
answer
240
views
What is the difference between these two shaders in terms of performance?
I have implemented a two pass Gaussian blur shader in GLSL like this:
...
5
votes
1
answer
3k
views
Which Terrain LOD algorithm should I use for super large terrain?
My game needs a terrain, the requirements are:
Free zoom in & out, like Google Earth. Max resolution when zooming in: ~100 meters; max range when zooming out: ~2000 km (a whole-country scale).
...
3
votes
1
answer
1k
views
How to render a grid of dots being exactly 1x1 pixel wide using a shader?
I would like to render a grid with many dots, all equally spaced and being exactly 1 pixel wide.
What I was able to achieve so far is this:
What I would like is (simulated using image editing ...
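One way this is often attempted, sketched here as a GLSL fragment shader held in a C++ string (spacing is a hypothetical uniform giving the grid period in pixels): gl_FragCoord gives exact pixel coordinates, so a dot can be emitted for exactly one fragment per grid cell.

    // Fragment shader: light up only the fragments whose integer pixel
    // coordinates are exact multiples of the grid spacing.
    const char* dotGridFs = R"(
        #version 330 core
        out vec4 fragColor;
        uniform int spacing; // grid period, in pixels
        void main() {
            ivec2 p = ivec2(gl_FragCoord.xy);
            bool onDot = (p.x % spacing == 0) && (p.y % spacing == 0);
            fragColor = onDot ? vec4(1.0) : vec4(0.0, 0.0, 0.0, 1.0);
        }
    )";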
0
votes
0
answers
2k
views
How many triangles should I expect to be able to render in a second?
Assuming that I'm doing everything right and all I'm doing when I render my scene is going through all the vertex arrays that I want to render, how many triangles should I expect to be able to render ...
1
vote
1
answer
350
views
Understanding buffer swapping in more detail
This is more a theoretical question. This is what I understand regarding buffer swapping and vsync:
I - When vsync is off, whenever the developer swaps the front/back buffers, the buffer that the GPU ...
0
votes
0
answers
994
views
gpu_rotate a Texture2D or RenderTexture
How does one GPU-rotate a Texture2D or RenderTexture using Unity C#? Or is that even possible?
I think it might be something like this...
I'm also trying to understand how gpu_scale seems to work here. ...
3
votes
2
answers
2k
views
Why isn't more culling being done on the GPU?
I'm interested in game development and 3D graphics; however, I'm not very experienced, so I apologize in advance if this comes across as ignorant or overly general.
The impression I get is that quite ...
0
votes
0
answers
234
views
How do lots of ComputeBuffers at once affect performance in Unity?
How will lots of ComputeBuffer instances affect performance? And why?
I know that I should call ComputeBuffer.Release() on every ...
4
votes
3
answers
4k
views
Should calculations be done on the CPU or GPU?
I'm currently learning OpenGL and it's become obvious that I can do most calculations on the CPU or the GPU. For example, I can do something like lightColor * objectColor on the CPU, then send it to ...
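For the concrete example in the question, the usual rule of thumb is: anything constant across a whole draw call can be computed once on the CPU and passed as a uniform, while anything that varies per vertex or per fragment belongs in the shader. A sketch of the CPU-side version (program, lightColor and objectColor are hypothetical):

    // One multiply on the CPU per draw call...
    float tint[3] = {
        lightColor[0] * objectColor[0],
        lightColor[1] * objectColor[1],
        lightColor[2] * objectColor[2],
    };
    glUniform3f(glGetUniformLocation(program, "uTintedColor"),
                tint[0], tint[1], tint[2]);
    // ...versus multiplying lightColor * objectColor in the fragment shader,
    // which would repeat the same work for every covered pixel.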
0
votes
0
answers
218
views
Why won't Java use the GPU?
I have a Java game which runs at 3200 frames per second on my PC. On my laptop, it runs at 100 frames per second, despite the fact that they have very similar hardware.
I think the issue might be ...
1
vote
1
answer
599
views
Reading from a Texture2D resource in DirectX 11
Hi, I am trying to read data from a resource, which I used to do without any problems, but suddenly it is not working.
First I made an immutable resource that has data in it, which is
...
0
votes
1
answer
433
views
Graphic render speed in libgdx using many sprite sheets
I am currently working on a customizable player character for my top-down-view 2D pixel game. I am using libgdx and aiming for Android devices as well as desktop applications.
I am wondering about an ...
1
vote
0
answers
337
views
How to render decoded video that is on GPU memory without copying to CPU
I'm reading this example of ffmpeg hardware decoding: https://github.com/FFmpeg/FFmpeg/blob/release/4.2/doc/examples/hw_decode.c
At line 109 it does this:
...
1
vote
1
answer
3k
views
Windows 10 GPU Engine Performance Counters - Phys / Eng Meaning
To trace intermittent performance degradation, I wanted to use the Performance Counters available in Windows 10 1809 under GPU Engine -> Utilization percentage. This particular ...
0
votes
1
answer
285
views
OpenGL Texture Zig-Zag Artifacts Over Time
I'm working on a deferred shading renderer in OpenGL, where I write all geometry output to Colour, Normal and Depth textures and then apply lighting effects later.
Everything seems fine except once ...
0
votes
1
answer
201
views
Is it possible to achieve the same performance of CUDA on OpenCL?
I am planning on porting some of my CPU code to the GPU. I want my code to run on all GPUs, so OpenCL seems to be the right choice. Will I be able to achieve the same performance in OpenCL as with CUDA? ...
1
vote
1
answer
182
views
GPU render time increases if screen size increases
I created a simple 2D scene in Unity 2017.3.1f1.
I changed the size (height and width) in the Game View and profiled to see how it affects the rendering (photo below).
I saw that the rendering time ...
1
vote
1
answer
2k
views
Does a custom Unity3D shader use the default GPU or CPU skinning automatically?
I'm writing my own shader for mobile, but was wondering whether it uses Unity3D's default GPU or CPU skinning feature.
I'd like to use GPU skinning and have already enabled it. Is there any way to ...
6
votes
0
answers
92
views
How can I make sure my OpenCL code works correctly on different graphics cards?
I'm testing adding some OpenCL code to my game, but I only have a single Nvidia card and I'm not sure the code will run normally on other platforms.
Is there any way to make sure my code runs ...
-1
votes
1
answer
2k
views
How is OpenGL running without a GPU?
Every tutorial on OpenGL mentions that
OpenGL code consists of commands executed by the GPU.
A few days ago, my GPU crashed and I removed it from my computer. But the same OpenGL programs are still running ...
1
vote
1
answer
664
views
Why do GPUs have a limited number of allocations?
I've been learning Vulkan lately, and I read that you can allocate VRAM memory only a set number of times, and it doesn't matter if each allocation is 2 GB or 2 KB. Why is that?
I'm specifically referring to ...
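For anyone hitting this in Vulkan: the cap being described is VkPhysicalDeviceLimits::maxMemoryAllocationCount (the spec requires at least 4096), and it counts vkAllocateMemory calls regardless of their size, which is why allocators sub-allocate from a few large blocks. A sketch of the query:

    // Query the maximum number of simultaneously live device-memory
    // allocations; the size of each allocation is irrelevant to this limit.
    VkPhysicalDeviceProperties props;
    vkGetPhysicalDeviceProperties(physicalDevice, &props);
    uint32_t maxAllocs = props.limits.maxMemoryAllocationCount;
    // Common practice: make a handful of large vkAllocateMemory calls and
    // bind many resources into them at offsets (or use a library such as
    // the Vulkan Memory Allocator) rather than one allocation per resource.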
3
votes
0
answers
218
views
Are GPU drivers hand optimized for specific games using low-level APIs?
GPU drivers often behave slightly differently depending on the game or program that is using them. This optimizes performance, bypasses bugs, and improves the overall experience in popular games, ...
1
vote
1
answer
693
views
Does texture splatting always sample 4 x N textures per fragment (regardless of the weights)?
Texture splatting is usually done by vertex painting, where each channel (R, G, B, A) is assigned as a different texture weight.
Due to the way shaders are executed, doesn't it mean that the fragment ...
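For reference, the blend the question has in mind typically looks like the sketch below (a GLSL fragment shader held in a C++ string; tex0..tex3 and weightMap are hypothetical uniforms). Written without branches, all four samples are indeed executed even where a weight is zero:

    // Classic 4-channel splat: every texture is sampled unconditionally and
    // the results are weighted, regardless of how many weights are zero.
    const char* splatFs = R"(
        #version 330 core
        in vec2 uv;
        out vec4 fragColor;
        uniform sampler2D tex0, tex1, tex2, tex3;
        uniform sampler2D weightMap; // RGBA weights from vertex painting
        void main() {
            vec4 w = texture(weightMap, uv);
            fragColor = w.r * texture(tex0, uv)
                      + w.g * texture(tex1, uv)
                      + w.b * texture(tex2, uv)
                      + w.a * texture(tex3, uv);
        }
    )";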
1
vote
0
answers
240
views
Can running Unity on a system with an Nvidia graphics card speed up the simulation?
I have an Nvidia GTX 1080 graphics card in my computer, but I don't know if Unity makes use of the GPU by default. Does having a GPU help to speed up Unity simulations?