Use vsync event instead of a 16ms timer?

I found that in game development, most people set a timer that gets triggered every 16ms to do the rendering.
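For illustration, here is a minimal Python sketch of that timer-based pattern (the question contains no code, so the names here are made up): `render()` is invoked on a fixed 16 ms cadence, independent of when the display actually refreshes.

```python
import time

def run_timer_loop(render, frame_ms=16, frames=3):
    """Call render() on a fixed timer, the common game-loop approach.

    The timer's period (16 ms here) is chosen by the program, not by the
    display, so ticks are not aligned with the screen's refresh.
    """
    next_deadline = time.monotonic()
    for _ in range(frames):
        render()
        next_deadline += frame_ms / 1000.0
        # Sleep until the next tick; a real loop would also handle overruns.
        delay = next_deadline - time.monotonic()
        if delay > 0:
            time.sleep(delay)

calls = []
run_timer_loop(lambda: calls.append(time.monotonic()), frames=3)
```

With `frames=3`, `calls` ends up holding three timestamps roughly 16 ms apart.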

As an Android programmer, I'm familiar with the Choreographer. You set a callback that will do the drawing stuff, and each time vsync happens, the callback is called.
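The Choreographer pattern can be sketched like this, again in Python purely as a simulation (there is no real vsync signal here; the loop just wakes on computed "vsync" deadlines, roughly analogous to Choreographer handing `doFrame` a frame timestamp):

```python
import time

def frame_loop(callback, refresh_hz=60, frames=3):
    """Simulate a vsync-paced loop: wake on each (simulated) vsync edge
    and pass the callback that frame's timestamp."""
    period = 1.0 / refresh_hz
    next_vsync = time.monotonic()
    for _ in range(frames):
        now = time.monotonic()
        if next_vsync > now:
            time.sleep(next_vsync - now)  # wait for the vsync edge
        callback(next_vsync)              # draw for this frame
        next_vsync += period              # schedule the next frame

stamps = []
frame_loop(stamps.append, refresh_hz=60, frames=3)
```

The key design difference from the timer loop is that the cadence comes from the display's refresh rate, so the callback fires once per refresh instead of at an arbitrary fixed interval.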

I know the original use of vsync is to prevent modifying the buffer while the front buffer and the back buffer are being swapped. But don't you think making the drawing/render code have the same period as the actual screen refresh rate (in an ideal situation) is better?

So is there any reason why people don't prefer the vsync-callback or vsync-event-based design?

One reason I can come up with is that if a monitor is 75Hz, it will be too high-frequency to do the rendering, right?
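The mismatch in that 75 Hz case is simple arithmetic, shown here as a small illustrative calculation: a 75 Hz panel refreshes every ~13.33 ms, so a fixed 16 ms timer runs slower than the display and drifts relative to vsync.

```python
# A 75 Hz display refreshes more often than a 16 ms timer fires:
period_75hz_ms = 1000 / 75          # ~13.33 ms per refresh
timer_fps = 1000 / 16               # 62.5 renders per second from a 16 ms timer
missed_per_second = 75 - timer_fps  # ~12.5 refreshes per second get no new frame
```

So a hard-coded 16 ms timer assumes a 60 Hz display; on anything else it either skips refreshes or renders frames the screen never shows, which is one argument for pacing off the vsync event instead.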
