Sometimes this question can arise not because of the cost of performing distance calculations, but because of the number of times the calculation is being made.

In a large game world with many actors, it is not scalable to keep checking the distance between one actor and all the others. As more players, NPCs and projectiles enter the world, the number of comparisons that need to be made grows quadratically, as O(N^2).
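To make the quadratic growth concrete, here is a minimal sketch of the naive all-pairs approach (the function name and test data are illustrative, not from any engine):

```python
# Naive all-pairs range check: the number of comparisons is
# N * (N - 1) / 2, which grows as O(N^2).
def pairs_in_range(positions, radius):
    hits = []
    for i in range(len(positions)):
        for j in range(i + 1, len(positions)):
            (x1, y1), (x2, y2) = positions[i], positions[j]
            # Compare squared distances to avoid a square root.
            if (x1 - x2) ** 2 + (y1 - y2) ** 2 <= radius ** 2:
                hits.append((i, j))
    return hits

# 100 actors in a row: 4950 comparisons to find 99 adjacent pairs.
actors = [(i, 0) for i in range(100)]
print(len(pairs_in_range(actors, radius=1.5)))  # prints 99
```

With 1000 actors the same loop performs 499500 comparisons per update, which is the growth the data structures below are designed to avoid.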

One way to reduce that growth is to use a good data structure to quickly discard unwanted actors from consideration.

We are looking for a way to efficiently iterate all the actors that might be in range, whilst excluding the majority of actors which are definitely out-of-range.

If your actors are fairly evenly spread across the world space, then a grid of buckets should be a suitable structure (as the accepted answer suggests). By keeping references to actors in a coarse grid, you need only check a few of the nearby buckets to cover all the actors which could be in range, ignoring the rest. When an actor moves, you may need to move it from its old bucket to a new one.
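A minimal sketch of such a grid of buckets, assuming 2D positions (the class and method names are illustrative; pick a cell size at least as large as your biggest query radius so only a small block of cells needs checking):

```python
import math
from collections import defaultdict

class SpatialGrid:
    def __init__(self, cell_size):
        self.cell_size = cell_size
        self.buckets = defaultdict(set)   # (cx, cy) -> set of actor ids
        self.positions = {}               # actor id -> (x, y)

    def _cell(self, x, y):
        return (math.floor(x / self.cell_size),
                math.floor(y / self.cell_size))

    def insert(self, actor, x, y):
        self.positions[actor] = (x, y)
        self.buckets[self._cell(x, y)].add(actor)

    def move(self, actor, x, y):
        old = self._cell(*self.positions[actor])
        new = self._cell(x, y)
        if old != new:                    # only rebucket on a cell change
            self.buckets[old].discard(actor)
            self.buckets[new].add(actor)
        self.positions[actor] = (x, y)

    def query(self, x, y, radius):
        """Actors within `radius` of (x, y), checking only nearby buckets."""
        cx, cy = self._cell(x, y)
        reach = math.ceil(radius / self.cell_size)
        result = []
        for dx in range(-reach, reach + 1):
            for dy in range(-reach, reach + 1):
                for actor in self.buckets.get((cx + dx, cy + dy), ()):
                    ax, ay = self.positions[actor]
                    if (ax - x) ** 2 + (ay - y) ** 2 <= radius ** 2:
                        result.append(actor)
        return result

grid = SpatialGrid(cell_size=10.0)
grid.insert("npc", 3.0, 4.0)
grid.insert("far_npc", 500.0, 500.0)
print(grid.query(0.0, 0.0, 5.0))   # only "npc"; far_npc is never examined
```

The distance check inside `query` is still needed, because a bucket can contain actors slightly outside the radius; the structure's job is only to shrink the candidate set.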

For actors which are less evenly spread, a quadtree may do better for a two-dimensional world, or an octree for a three-dimensional world. These are more general-purpose structures that can efficiently partition large areas of empty space and small areas containing lots of actors. For static actors there is binary space partitioning (BSP), which is very fast to search but unsuitable for realtime updates. BSPs partition the space by repeatedly cutting it in half with planes, and can be applied to any number of dimensions.
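A sketch of a point quadtree, to show the adaptive subdivision (the names and the capacity of 4 points per node are illustrative assumptions): nodes split only when they fill up, so dense regions get finer cells while empty space stays coarse.

```python
class QuadTree:
    def __init__(self, x, y, half, capacity=4):
        self.x, self.y, self.half = x, y, half   # centre and half-width
        self.capacity = capacity
        self.points = []
        self.children = None

    def insert(self, px, py):
        if abs(px - self.x) > self.half or abs(py - self.y) > self.half:
            return False                          # outside this node
        if self.children is None:
            self.points.append((px, py))
            if len(self.points) > self.capacity:
                self._split()
            return True
        return any(c.insert(px, py) for c in self.children)

    def _split(self):
        # Create four quadrants and push the points down into them.
        h = self.half / 2
        self.children = [QuadTree(self.x + dx * h, self.y + dy * h, h,
                                  self.capacity)
                         for dx in (-1, 1) for dy in (-1, 1)]
        for px, py in self.points:
            any(c.insert(px, py) for c in self.children)
        self.points = []

    def query(self, qx, qy, radius, found=None):
        """Collect points within `radius`, skipping subtrees out of reach."""
        if found is None:
            found = []
        # Prune: this node's square cannot intersect the query circle.
        if (abs(qx - self.x) > self.half + radius or
                abs(qy - self.y) > self.half + radius):
            return found
        for px, py in self.points:
            if (px - qx) ** 2 + (py - qy) ** 2 <= radius ** 2:
                found.append((px, py))
        if self.children:
            for c in self.children:
                c.query(qx, qy, radius, found)
        return found
```

The pruning test in `query` is what discards whole clusters of far-away actors in one comparison, which is where the savings over the naive loop come from.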

Of course there are overheads to keeping your actors in such a structure, especially when they are moving between partitions. But in a large world with many actors but small ranges of interest, the savings should be far greater than the costs.

Consideration of how the expense of an algorithm grows as it receives more data (its computational complexity) is crucial for scalable software design. Sometimes simply choosing the correct data structure is enough. Costs are usually described using Big O notation.
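An idealised cost model makes the difference in growth visible (the bucket density and the 3x3 block of checked cells are assumptions, not measurements):

```python
# All-pairs checking vs. one grid query per actor that inspects a
# 3x3 block of buckets. Not a benchmark, just the asymptotic shapes.
def all_pairs_checks(n):
    return n * (n - 1) // 2              # O(N^2)

def grid_checks(n, actors_per_bucket=2):
    return n * 9 * actors_per_bucket     # O(N) while density stays bounded

for n in (100, 1000, 10000):
    print(n, all_pairs_checks(n), grid_checks(n))
```

At 10000 actors the naive loop does around 50 million comparisons per update while the grid model stays in the hundreds of thousands, which is why the bookkeeping overhead of the structure pays for itself.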

(I realise this is not a direct answer to the question, but it may be useful for some readers. My apologies if I have wasted your time!)
