Questions tagged [gpu]

A GPU (graphics processing unit) is a specialized processor designed to accelerate the construction of images.

0 votes
2 answers
182 views

I'm a beginner in game programming and I want someone to confirm my understanding. Let's say there's a 3D model for a character in the game. This character is made up of triangles. Each vertex in ...
marcelo kmsaw
0 votes
1 answer
150 views

I am trying to efficiently render many 3D game characters: many instances of the same 3D model, which can change their positions and Z rotations each frame. For reference, I'm using OpenGL, but this ...
greenlagoon
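A minimal sketch of the instanced-rendering setup this question is after, assuming an OpenGL 3.3+ context with the model's VAO already bound; `instanceVBO`, `instances`, and `indexCount` are placeholder names:

```cpp
// Per-instance data: position plus a Z rotation, rebuilt each frame.
struct InstanceData { float x, y, z, rotZ; };

glBindBuffer(GL_ARRAY_BUFFER, instanceVBO);
// GL_DYNAMIC_DRAW because this buffer is re-uploaded every frame.
glBufferData(GL_ARRAY_BUFFER,
             instances.size() * sizeof(InstanceData),
             instances.data(), GL_DYNAMIC_DRAW);

// Feed attribute 3 one vec4 per instance (not per vertex).
glEnableVertexAttribArray(3);
glVertexAttribPointer(3, 4, GL_FLOAT, GL_FALSE, sizeof(InstanceData), (void*)0);
glVertexAttribDivisor(3, 1);

// Every character is drawn with a single call.
glDrawElementsInstanced(GL_TRIANGLES, indexCount, GL_UNSIGNED_INT, nullptr,
                        (GLsizei)instances.size());
```

The vertex shader picks the data up via `layout(location = 3) in vec4 aInstance;` and applies the translation and rotation itself, so the CPU only touches one small buffer per frame.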
0 votes
2 answers
314 views

I'm using OpenGL but this question should apply generally to rendering. I understand that for efficient rendering in games, you want to minimize communication between the CPU and GPU. This means pre-...
greenlagoon
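A sketch of that principle under the same OpenGL assumptions (all names are placeholders): geometry crosses the bus once at load time, and per frame only a handful of bytes do:

```cpp
// At load time: upload the mesh once and keep it resident on the GPU.
glBindBuffer(GL_ARRAY_BUFFER, meshVBO);
glBufferData(GL_ARRAY_BUFFER, vertexBytes, vertexData, GL_STATIC_DRAW);

// Per frame: only a single matrix travels CPU -> GPU.
glUseProgram(program);
glUniformMatrix4fv(glGetUniformLocation(program, "uMVP"), 1, GL_FALSE, mvp);
glBindVertexArray(vao);
glDrawElements(GL_TRIANGLES, indexCount, GL_UNSIGNED_INT, nullptr);
```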
1 vote
1 answer
107 views

I am beginning to work on a physics system, where there are multiple objects in the world that get rendered. The current approach I am considering consists of pseudo-OOP in C where each object has ...
CPlus
  • 153
1 vote
0 answers
881 views

I have a simple (and maybe quite naive) question regarding dual GPU use for Virtual Reality (VR); it has nagged me for years and I couldn't figure out why this can't work, at least in principle: I ...
Felix Tritschler
1 vote
1 answer
320 views

I am programming a game using DirectX 12. Should it support all GPUs, or just newer GPUs? What about the versions of Windows supported? What changes when a new DirectX version is released?
Praveen Kumar
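For context: Direct3D 12 requires Windows 10 or later and a GPU whose driver exposes feature level 11_0 or above, and the usual way to find out whether the user's hardware qualifies is to ask at device creation. A hedged sketch:

```cpp
#include <d3d12.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

// 11_0 is the lowest feature level D3D12 accepts; older GPUs fail here.
ComPtr<ID3D12Device> device;
HRESULT hr = D3D12CreateDevice(nullptr,                 // default adapter
                               D3D_FEATURE_LEVEL_11_0,
                               IID_PPV_ARGS(&device));
if (FAILED(hr)) {
    // No D3D12-capable GPU/driver: fall back (e.g. to a D3D11 path) or report it.
}
```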
0 votes
1 answer
2k views

My game needs to loop through a massive amount of data, and the amount can grow considerably depending on the world settings chosen by the player. The data is too big for the CPU, so I need to use the GPU for it ...
pi squared
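One common answer is a compute shader. A minimal OpenGL 4.3+ sketch, assuming `computeProgram` and `dataSSBO` already exist and the shader declares `layout(local_size_x = 256) in;`:

```cpp
glUseProgram(computeProgram);
glBindBufferBase(GL_SHADER_STORAGE_BUFFER, 0, dataSSBO);
// Round up so every one of itemCount elements is covered by a work group.
GLuint groups = (itemCount + 255) / 256;
glDispatchCompute(groups, 1, 1);
// Make the shader's writes visible before anything reads the buffer back.
glMemoryBarrier(GL_SHADER_STORAGE_BARRIER_BIT);
```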
0 votes
0 answers
51 views

Perspective-correct texture mapping requires one division per pixel. Before the advent of GPUs this was a problem because this was quite heavy to do on the CPU (especially back in the days of non-SSE ...
Warp
  • 171
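For readers who haven't met the trick being referenced: u/w, v/w, and 1/w are linear in screen space, so a software rasterizer can step them with additions and spend only one true division per pixel. A sketch, where the `lerp_*` names stand for the linearly interpolated quantities:

```cpp
// At each vertex, divide the attributes by w once.
float u_over_w   = u / w;
float v_over_w   = v / w;
float one_over_w = 1.0f / w;

// Across the triangle these interpolate linearly; per pixel, one division
// recovers the perspective-correct texture coordinates.
float w_pixel = 1.0f / lerp_one_over_w;   // the single divide per pixel
float u_pixel = lerp_u_over_w * w_pixel;
float v_pixel = lerp_v_over_w * w_pixel;
```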
1 vote
1 answer
412 views

I've made a fluid simulation using particles in Unity, but now it is painfully slow because all computations are done using the CPU. In order to make it faster, I need to do computations on the GPU, ...
UserUser
  • 171
0 votes
0 answers
53 views

I am making a game with Unreal Engine 5, but it uses a lot of GPU power while the CPU is used much less. I want to optimize it to use both the CPU and GPU so it can be playable on low-end PCs or laptops. Is ...
Pranav Upadhyay
0 votes
0 answers
1k views

I have the following C# code in Unity version 2022.2.0a12: ...
Corvus Ultima
0 votes
1 answer
989 views

I was always told that if a task can be parallelized, I should put it on the GPU for better performance. Although this is definitely true for desktop GPUs, I was wondering whether mobile GPUs were so ...
Gyoo
  • 286
0 votes
0 answers
235 views

I built a small game engine using OpenCL and SDL2, but it's not running very fast compared to Vulkan and OpenGL. I wrote rasterization code, but when I did some research I found that Vulkan and OpenGL use hardware ...
is code
  • 31
1 vote
1 answer
3k views

I mean that GLSL and CUDA both utilize the GPU to its maximum power, and in some cases I've heard that CUDA runs faster on Nvidia graphics cards. So my question is: why don't we use CUDA more often for GPU graphics ...
is code
  • 31
1 vote
1 answer
754 views

I'm trying to render a bunch of clouds via gpu instancing. This works perfectly fine with the default HDRP Lit Shader, and I get the following result: However, as soon as I change the surface type ...
person132
0 votes
2 answers
10k views

I am running an Unreal Engine 4 project which has many high-quality assets. My computer is fairly strong: CPU: AMD Ryzen 5 3600 6-Core; GPU: GeForce RTX 3060; SSD: Lexar 500GB NM610 M.2 NVMe SSD; RAM: 2 ...
Quantum Guy 123
0 votes
1 answer
326 views

I'm learning Godot with a laptop that has an AMD discrete GPU. My OS is Arch Linux, so if I want to use the discrete GPU I have to set a system environment variable ...
ArchBug
  • 21
1 vote
1 answer
1k views

I need to read back a GPU texture (stored in the GPU as D3D11_USAGE_DEFAULT). I am doing this via creating a staging ID3D11Texture. The whole application is running ...
fred26
  • 113
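The staging pattern that question describes, sketched for D3D11 with `device`, `context`, and `srcTex` assumed to exist:

```cpp
#include <d3d11.h>

// Describe a staging twin of the default-usage texture.
D3D11_TEXTURE2D_DESC desc;
srcTex->GetDesc(&desc);
desc.Usage          = D3D11_USAGE_STAGING;
desc.BindFlags      = 0;
desc.CPUAccessFlags = D3D11_CPU_ACCESS_READ;
desc.MiscFlags      = 0;

ID3D11Texture2D* staging = nullptr;
device->CreateTexture2D(&desc, nullptr, &staging);
context->CopyResource(staging, srcTex);          // GPU -> staging copy

D3D11_MAPPED_SUBRESOURCE mapped;
if (SUCCEEDED(context->Map(staging, 0, D3D11_MAP_READ, 0, &mapped))) {
    // Rows are mapped.RowPitch bytes apart, not tightly packed.
    context->Unmap(staging, 0);
}
staging->Release();
```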
0 votes
0 answers
40 views

When rendering into an FBO's texture, I'm not using glClear() but overwriting each fragment; GL_BLEND is enabled. This works just fine, but I just realised that when my laptop switches to economy mode, ...
ebkgne
  • 21
1 vote
1 answer
358 views

So, I have two compute shaders A and B (using unity and HLSL). Right now, before every dispatch, I send my mouse coordinates to both of them every update. So, from my understanding, you can actually ...
SoftwareMan
0 votes
0 answers
2k views

I am attempting to make a 3D game using SDL2, just to learn and have a bit of fun. I was wondering if there is any way to get SDL2 to do calculations on the GPU. I have read that SDL2 textures use the GPU for ...
Apple_Banana
0 votes
1 answer
2k views

I have been banging my head against the wall with this for a few days now with no improvement. The problem is that after building, my project keeps using over 30% of the GPU. Even in the editor it takes 20%...
theCodeHermit
0 votes
1 answer
284 views

I was doing research, trying to answer the question "Which was the first GPU to support 24-bit color?" I know all color since 1992 is 24-bit, even in games like Doom. I mean 16 million ...
Boris Rusev
0 votes
0 answers
49 views

I would like to keep only the color and x,y coordinates of the pixels that are touching a pixel of a different color than their own (basically the edges) and remove the rest, so that the GPU can ...
user11937382
0 votes
0 answers
68 views

Does each core’s data get written to the shared memory one at a time or both at the same time? For instance, when two cores that are next to each other need to write to the same memory spot does the ...
user11937382
0 votes
1 answer
564 views

I'm doing something in Unity where I need to specify the position and orientation of vertices with two Vector4s, and they're not just position and normal vectors. I'...
A. Kriegman
0 votes
1 answer
240 views

I have implemented a two pass Gaussian blur shader in GLSL like this: ...
racz16
  • 171
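A typical horizontal pass for a two-pass (separable) Gaussian blur, shown as a GLSL fragment shader in a C++ string; the coefficients are the widely used 9-tap weights, and a second pass with the offset on Y finishes the blur:

```cpp
const char* kBlurHorizontalFS = R"(
#version 330 core
uniform sampler2D uImage;
uniform vec2 uTexelSize;   // 1.0 / texture resolution
in vec2 vUV;
out vec4 fragColor;
const float kWeights[5] = float[](0.227027, 0.194594, 0.121621, 0.054054, 0.016216);
void main() {
    vec3 sum = texture(uImage, vUV).rgb * kWeights[0];
    for (int i = 1; i < 5; ++i) {
        vec2 off = vec2(uTexelSize.x * float(i), 0.0);  // use Y in the second pass
        sum += texture(uImage, vUV + off).rgb * kWeights[i];
        sum += texture(uImage, vUV - off).rgb * kWeights[i];
    }
    fragColor = vec4(sum, 1.0);
}
)";
```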
5 votes
1 answer
3k views

My game needs a terrain; the requirements are: free zoom in and out, like Google Earth; maximum resolution when zoomed in ~100 m; maximum range when zoomed out ~2000 km (a whole-country scale). ...
IlIlijl1Ili
3 votes
1 answer
1k views

I would like to render a grid with many dots, all equally spaced and being exactly 1 pixel wide. What I was able to achieve so far is this : What I would like is (simulated using image editing ...
tigrou
  • 3,279
0 votes
0 answers
2k views

Assuming that I'm doing everything right and all I'm doing when I render my scene is going through all the vertex arrays that I want to render, how many triangles should I expect to be able to render ...
Clearer
  • 101
1 vote
1 answer
350 views

This is more of a theoretical question. This is what I understand regarding buffer swapping and vsync: I - When vsync is off, whenever the developer swaps the front/back buffers, the buffer that the GPU ...
felipeek
  • 131
0 votes
0 answers
994 views

How does one GPU-rotate a Texture2D or RenderTexture using Unity C#? Or is that possible? I think it might be something like this... I'm also trying to understand how gpu_scale seems to work here. ...
ina
  • 294
3 votes
2 answers
2k views

I'm interested in game development and 3D graphics; however, I'm not very experienced, so I apologize in advance if this comes across as ignorant or overly general. The impression I get is that quite ...
Time4Tea
  • 133
0 votes
0 answers
234 views

How will lots of ComputeBuffer instances affect performance? And why? I know that I should call ComputeBuffer.Release() on every ...
Ely Shaffir
4 votes
3 answers
4k views

I'm currently learning OpenGL and it's become obvious that I can do most calculations on the CPU or the GPU. For example, I can do something like lightColor * objectColor on the CPU, then send it to ...
Adam
  • 86
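Both places work; the trade-off, sketched with placeholder names, is how often the multiply runs:

```cpp
// Option A: multiply once on the CPU and upload the result as a uniform.
float premul[3] = { light[0] * object[0],
                    light[1] * object[1],
                    light[2] * object[2] };
glUniform3fv(glGetUniformLocation(program, "uPremulColor"), 1, premul);

// Option B: upload both colors and write `lightColor * objectColor` in the
// fragment shader, where the GPU repeats the multiply for every fragment.
// For a value that is constant per draw, Option A saves that per-fragment work.
```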
0 votes
0 answers
218 views

I have Java code (a game) which runs at 3200 frames per second on my PC. On my laptop, it runs at 100 frames per second, despite the fact that they have very similar hardware. I think the issue might be ...
ss5151
  • 1
1 vote
1 answer
599 views

Hi^^ I am trying to read data from a resource, which I used to do without any problems, but suddenly it is not working. First I made an immutable resource that has data in it, which is ...
KIM CHANGJUN
0 votes
1 answer
433 views

I am currently working on a customizable Player-character for my top down view 2D pixel game. I am using libgdx and aiming for android devices as well as desktop applications. I am wondering about an ...
Yesyoor
1 vote
0 answers
337 views

I'm reading this example of ffmpeg hardware decoding: https://github.com/FFmpeg/FFmpeg/blob/release/4.2/doc/examples/hw_decode.c At line 109 it does this: ...
Poperton
  • 111
1 vote
1 answer
3k views

For performance tracing of intermittent degradation in performance, I wanted to use the Performance Counters available in Windows 10 1809 under GPU Engine -> Utilization percentage. This particular ...
Malcolm McCaffery
0 votes
1 answer
285 views

I'm working on a deferred shading renderer in OpenGL, where I write all geometry output to Colour, Normal and Depth textures and then apply lighting effects later. Everything seems fine except once ...
Vercidium
0 votes
1 answer
201 views

I am planning on porting some of my CPU code to the GPU. I want my code to run on all GPUs, so OpenCL seems to be the right choice. Will I be able to achieve the same performance as CUDA in OpenCL? ...
Pravinkumar
1 vote
1 answer
182 views

I created a simple 2D scene in Unity 2017.3.1f1. I changed the size (height and width) in the Game View and profiled to see how it affects the rendering (photo below). I saw that the rendering time ...
sam
  • 9
1 vote
1 answer
2k views

I'm writing my own shader for mobile, but I was wondering whether Unity3D uses the GPU or CPU skinning feature by default. I'd like to use GPU skinning and have already enabled it. Is there any way to ...
kkl
  • 43
6 votes
0 answers
92 views

I'm testing adding some OpenCL code to my game, but I only have a single Nvidia card and I'm not sure the code will run correctly on other platforms. Is there any way to make sure my code runs ...
ravenisadesk
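One low-cost option is to enumerate every OpenCL platform installed on the machine and build and test the kernels on each of them; a CPU runtime (for example POCL) catches many vendor-specific assumptions without extra hardware. A sketch:

```cpp
#include <CL/cl.h>
#include <vector>

cl_uint numPlatforms = 0;
clGetPlatformIDs(0, nullptr, &numPlatforms);
std::vector<cl_platform_id> platforms(numPlatforms);
clGetPlatformIDs(numPlatforms, platforms.data(), nullptr);

for (cl_platform_id p : platforms) {
    cl_uint numDevices = 0;
    clGetDeviceIDs(p, CL_DEVICE_TYPE_ALL, 0, nullptr, &numDevices);
    std::vector<cl_device_id> devices(numDevices);
    clGetDeviceIDs(p, CL_DEVICE_TYPE_ALL, numDevices, devices.data(), nullptr);
    // For each device: create a context, build the program with -Werror,
    // run the test suite, and compare results across devices.
}
```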
-1 votes
1 answer
2k views

Every tutorial on OpenGL mentions that OpenGL calls are commands executed by the GPU. A few days ago, my GPU crashed and I removed it from my computer, but the same OpenGL programs are still running ...
Shuvo Sarker
1 vote
1 answer
664 views

I've been learning Vulkan lately, and I read that you can allocate VRAM only a set number of times, regardless of whether each allocation is 2 GB or 2 KB. Why is that? I'm specifically referring to ...
Werem
  • 148
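The cap being described is `VkPhysicalDeviceLimits::maxMemoryAllocationCount`, which counts `vkAllocateMemory` calls regardless of their size (the spec guarantees only 4096), and it is the reason Vulkan programs grab a few large blocks and sub-allocate. Querying it:

```cpp
VkPhysicalDeviceProperties props;
vkGetPhysicalDeviceProperties(physicalDevice, &props);
// Often exactly 4096; every vkAllocateMemory call counts against it,
// whether the allocation is 2 GB or 2 KB.
uint32_t maxAllocs = props.limits.maxMemoryAllocationCount;
```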
3 votes
0 answers
218 views

GPU drivers often behave slightly differently depending on the game or program using them. This optimizes performance, works around bugs, and improves the overall experience in popular games, ...
CodeSandwich
1 vote
1 answer
693 views

Texture splatting is usually done by vertex painting, where each channel (R, G, B, A) is assigned a different texture weight. Given the way shaders are executed, doesn't that mean the fragment ...
JBeurer
  • 467
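A sketch of the fragment shader that question is reasoning about (GLSL in a C++ string). With plain vertex-painted splatting the answer is yes: all four textures are sampled for every fragment, whatever the weights happen to be:

```cpp
const char* kSplatFS = R"(
#version 330 core
uniform sampler2D uTex0, uTex1, uTex2, uTex3;
in vec2 vUV;
in vec4 vWeights;   // painted per vertex, interpolated per fragment
out vec4 fragColor;
void main() {
    vec3 c = texture(uTex0, vUV).rgb * vWeights.r
           + texture(uTex1, vUV).rgb * vWeights.g
           + texture(uTex2, vUV).rgb * vWeights.b
           + texture(uTex3, vUV).rgb * vWeights.a;
    fragColor = vec4(c, 1.0);
}
)";
```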
1 vote
0 answers
240 views

I have an Nvidia GTX 1080 graphics card in my computer, but I don't know whether Unity makes use of the GPU by default. Does having a GPU help to speed up Unity simulations?
NAnn
  • 339