
A viewport is just a rectangular region of the back buffer (usually a 2D texture sized to the client window) that we render into.

However, in Direct3D, when we describe the viewport it asks for a width and height and a top-left X,Y position.
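For context on what those viewport fields are used for: the rasterizer takes them to map normalized device coordinates onto a pixel rectangle in the window. Here is a minimal sketch of that mapping in Python; the field names mirror `D3D11_VIEWPORT` (`TopLeftX`, `TopLeftY`, `Width`, `Height`), but the function itself is just an illustration of the math, not an API call:

```python
def viewport_transform(x_ndc, y_ndc, top_left_x, top_left_y, width, height):
    """Map a point from normalized device coordinates (x, y in [-1, 1],
    +y pointing up) to window pixel coordinates (+y pointing down),
    the way the rasterizer's viewport transform does."""
    px = top_left_x + (x_ndc + 1.0) * 0.5 * width
    py = top_left_y + (1.0 - y_ndc) * 0.5 * height
    return px, py

# With an 800x600 viewport whose top-left corner is at (0, 0):
viewport_transform(-1.0,  1.0, 0, 0, 800, 600)  # top-left     -> (0.0, 0.0)
viewport_transform( 0.0,  0.0, 0, 0, 800, 600)  # center       -> (400.0, 300.0)
viewport_transform( 1.0, -1.0, 0, 0, 800, 600)  # bottom-right -> (800.0, 600.0)
```

So the viewport numbers describe the destination rectangle in pixels, not the coordinate system your vertices live in.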

From what I understand, in normalized device coordinates the top-left corner would be represented by (-1, 1), while in another coordinate convention the top-left might be represented by (0, 0).

How does this translate to a coordinate system for the primitives I draw on screen? In some file formats, like Wavefront's OBJ format, there are coordinates that are outside the range -1 to 1.
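The usual answer is that model-file coordinates are not expected to be in NDC at all: they pass through world, view, and projection transforms, and the perspective divide by w is what squeezes visible geometry into the [-1, 1] cube. As an illustration, here is a sketch of a D3D-style perspective projection applied to a point far outside [-1, 1]; the helper names (`perspective`, `project`) and the specific numbers are mine, chosen only for the example:

```python
import math

def perspective(fov_y_deg, aspect, near, far):
    """Build a left-handed, D3D-style perspective projection matrix
    (z mapped to [0, 1]), for use with column vectors."""
    f = 1.0 / math.tan(math.radians(fov_y_deg) / 2.0)
    return [
        [f / aspect, 0.0, 0.0,                0.0],
        [0.0,        f,   0.0,                0.0],
        [0.0,        0.0, far / (far - near), -near * far / (far - near)],
        [0.0,        0.0, 1.0,                0.0],
    ]

def project(m, p):
    """Multiply the 4x4 matrix by (x, y, z, 1), then divide by w."""
    x, y, z = p
    v = [m[r][0] * x + m[r][1] * y + m[r][2] * z + m[r][3] for r in range(4)]
    w = v[3]
    return (v[0] / w, v[1] / w, v[2] / w)

proj = perspective(60.0, 16 / 9, 0.1, 100.0)
# A vertex well outside [-1, 1] in model/world space...
ndc = project(proj, (5.0, 3.0, 20.0))
# ...lands inside the NDC cube after projection and the w-divide.
```

So the OBJ file's coordinates only need to end up in range after your matrices and the perspective divide have been applied, not before.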


  • Hint: you're never drawing just the model. You're drawing a view of the model, from a specific perspective. Commented Oct 19, 2019 at 0:20
  • But is the "true world" I'm rendering my model into the coordinate system of my back buffer (the width and height of my window), or is it (in Direct3D) the top-left X,Y coordinate system that ranges from 0 to 1? Maybe I'm misunderstanding. Commented Oct 19, 2019 at 2:24
  • Neither. Those are measures of the canvas the artist is painting on, not measures of the scenery the artist is painting. Commented Oct 19, 2019 at 2:26
  • So do the primitives sent through the pipeline get converted to the pixel coordinates that make up the width and height of my window, and get mapped there during the rasterization stage? Commented Oct 19, 2019 at 3:36
  • It sounds like you're skipping a few pipeline stages. Wouldn't that be the job of the vertex shader stage? Commented Oct 19, 2019 at 12:39
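Putting the stages the comments are debating in order: the vertex shader outputs clip-space positions (model → world → view → projection), the hardware then divides by w to get NDC, and only after that does the fixed-function rasterizer apply the viewport transform to produce pixel coordinates. A single-vertex sketch of that ordering, with placeholder identity matrices standing in for real transforms:

```python
def transform(m, v):
    # 4x4 matrix times the column vector (x, y, z, w).
    return tuple(sum(m[r][c] * v[c] for c in range(4)) for r in range(4))

def pipeline(vertex, world, view, proj, vp_x, vp_y, vp_w, vp_h):
    """One vertex through the stages: vertex shader (to clip space),
    perspective divide (to NDC), then viewport transform (to pixels)."""
    v = (*vertex, 1.0)
    for m in (world, view, proj):         # what the vertex shader computes
        v = transform(m, v)
    x, y, z, w = v
    x, y, z = x / w, y / w, z / w         # perspective divide -> NDC
    px = vp_x + (x + 1.0) * 0.5 * vp_w    # fixed-function viewport transform
    py = vp_y + (1.0 - y) * 0.5 * vp_h
    return px, py, z

# With identity matrices, a vertex at the NDC origin lands at the
# center of an 800x600 viewport:
I = [[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]]
pipeline((0.0, 0.0, 0.5), I, I, I, 0, 0, 800, 600)  # -> (400.0, 300.0, 0.5)
```

In other words, the vertex shader gets the geometry into clip space, and the conversion to window pixels happens afterwards, during rasterization.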
