
I think I shall reframe my question from

Where should you use BlockingQueue implementations instead of simple Queue implementations?

to

What are the advantages/disadvantages of BlockingQueue over Queue implementations, taking into consideration aspects like speed, concurrency, or other properties that vary, e.g. the time to access the last element?

I have used both kinds of queues. I know that a BlockingQueue is normally used in concurrent applications. I was writing a simple ByteBuffer pool where I needed a placeholder for ByteBuffer objects, and I needed the fastest thread-safe queue implementation. There are even List implementations like ArrayList, which has constant access time for its elements.

Can anyone discuss the pros and cons of BlockingQueue vs. Queue vs. List implementations?

Currently I am using an ArrayList to hold these ByteBuffer objects.

Which data structure shall I use to hold these objects?

3 Answers


A limited-capacity BlockingQueue is also helpful if you want to throttle some sort of request. With an unbounded queue, producers can get far ahead of the consumers. The tasks will eventually be performed (unless there are so many that they cause an OutOfMemoryError), but the producer may long since have given up, so the effort is wasted.

In situations like these, it may be better to signal a would-be producer that the queue is full, and to give up quickly with a failure. For example, the producer might be a web request, with a user that doesn't want to wait too long, and even though it won't consume many CPU cycles while waiting, it is using up limited resources like a socket and some memory. Giving up will give the tasks that have been queued already a better chance to finish in a timely manner.
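
As a rough sketch of that idea (the class name, queue capacity, and timeout below are illustrative, not part of the answer), a producer can use a timed offer on a bounded queue and fail fast instead of waiting forever:

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.TimeUnit;

public class ThrottledSubmitter {
    // Bounded queue: at most 100 pending tasks before producers are pushed back.
    private final BlockingQueue<Runnable> pending = new ArrayBlockingQueue<>(100);

    // Try to enqueue a task, waiting briefly if the queue is full.
    // Returning false lets the caller (e.g. a web request) give up quickly
    // instead of piling up work that may never be consumed in time.
    public boolean submit(Runnable task) throws InterruptedException {
        return pending.offer(task, 50, TimeUnit.MILLISECONDS);
    }

    // Consumers block until a task is available.
    public Runnable nextTask() throws InterruptedException {
        return pending.take();
    }
}
```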


Regarding the amended question, which I'm interpreting as, "What is a good collection for holding objects in a pool?"

An unbounded LinkedBlockingQueue is a good choice for many pools. However, depending on your pool management strategy, a ConcurrentLinkedQueue may work too.

In a pooling application, a blocking "put" is not appropriate. Controlling the maximum size of the queue is the job of the pool manager—it decides when to create or destroy resources for the pool. Clients of the pool borrow and return resources from the pool. Adding a new object, or returning a previously borrowed object to the pool should be fast, non-blocking operations. So, a bounded capacity queue is not a good choice for pools.

On the other hand, when retrieving an object from the pool, most applications want to wait until a resource is available. A "take" operation that blocks, at least temporarily, is much more efficient than a "busy wait"—repeatedly polling until a resource is available. The LinkedBlockingQueue is a good choice in this case. A borrower can block indefinitely with take, or limit the time it is willing to block with poll.
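
To make that concrete, here is a hedged sketch of such a pool (the ByteBufferPool class, method names, and sizes are my own invention, not from the answer): returning a buffer never blocks because the LinkedBlockingQueue is unbounded, while borrowing can block with take or wait a limited time with poll.

```java
import java.nio.ByteBuffer;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.TimeUnit;

public class ByteBufferPool {
    private final BlockingQueue<ByteBuffer> free = new LinkedBlockingQueue<>();

    public ByteBufferPool(int poolSize, int bufferCapacity) {
        // The pool manager decides how many buffers exist; the queue itself is unbounded.
        for (int i = 0; i < poolSize; i++) {
            free.add(ByteBuffer.allocateDirect(bufferCapacity));
        }
    }

    // Block until a buffer is available.
    public ByteBuffer borrow() throws InterruptedException {
        return free.take();
    }

    // Block at most the given time; returns null if no buffer became available.
    public ByteBuffer borrow(long timeout, TimeUnit unit) throws InterruptedException {
        return free.poll(timeout, unit);
    }

    // Returning a buffer never blocks, since the queue has no capacity bound.
    public void giveBack(ByteBuffer buffer) {
        buffer.clear();
        free.offer(buffer);
    }
}
```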

A less common case is when a client is not willing to block at all, but has the ability to create a resource for itself if the pool is empty. In that case, a ConcurrentLinkedQueue is a good choice. This is sort of a gray area where it would be nice to share a resource (e.g., memory) as much as possible, but speed is even more important. In the worst case, this degenerates to every thread having its own instance of the resource; then it would have been more efficient not to bother trying to share among threads.
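
A rough sketch of that non-blocking variant (again, the class and method names are illustrative): poll returns null when the pool is empty, and the client simply allocates a new buffer instead of waiting.

```java
import java.nio.ByteBuffer;
import java.util.Queue;
import java.util.concurrent.ConcurrentLinkedQueue;

public class NonBlockingBufferPool {
    private final Queue<ByteBuffer> free = new ConcurrentLinkedQueue<>();
    private final int bufferCapacity;

    public NonBlockingBufferPool(int bufferCapacity) {
        this.bufferCapacity = bufferCapacity;
    }

    // Never blocks: reuse a pooled buffer if one is free, otherwise allocate a new one.
    public ByteBuffer borrow() {
        ByteBuffer buffer = free.poll();
        return (buffer != null) ? buffer : ByteBuffer.allocateDirect(bufferCapacity);
    }

    public void giveBack(ByteBuffer buffer) {
        buffer.clear();
        free.offer(buffer);
    }
}
```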

Both of these collections give good performance and ease of use in a concurrent application. For non-concurrent applications, an ArrayList is hard to beat. Even for collections that grow dynamically, the per-element overhead of a LinkedList allows an ArrayList with some empty slots to stay competitive memory-wise.


1 Comment

Thanks Erickson for such a nice explanation. It has solved my problem.

You would see BlockingQueue in multi-threaded situations. For example, you need to pass a BlockingQueue as a parameter to create a ThreadPoolExecutor if you want to create one using its constructor. Depending on the type of queue you pass in, the executor can behave differently.
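
For instance, here is a sketch of how different queue types change the behavior of an executor built via the constructor (the pool sizes and queue capacities are arbitrary examples):

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.SynchronousQueue;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

public class ExecutorQueueExamples {
    public static void main(String[] args) {
        // Unbounded queue: the pool never grows past corePoolSize (2 here),
        // because extra tasks queue up instead of triggering new threads.
        ThreadPoolExecutor unbounded = new ThreadPoolExecutor(
                2, 4, 60L, TimeUnit.SECONDS, new LinkedBlockingQueue<>());

        // Bounded queue: once 10 tasks are queued, the pool grows toward
        // maximumPoolSize; beyond that, new submissions are rejected.
        ThreadPoolExecutor bounded = new ThreadPoolExecutor(
                2, 4, 60L, TimeUnit.SECONDS, new ArrayBlockingQueue<>(10));

        // SynchronousQueue: no internal capacity, so each task needs a free or
        // new thread immediately (the strategy used by newCachedThreadPool).
        ThreadPoolExecutor handoff = new ThreadPoolExecutor(
                0, Integer.MAX_VALUE, 60L, TimeUnit.SECONDS, new SynchronousQueue<>());

        unbounded.shutdown();
        bounded.shutdown();
        handoff.shutdown();
    }
}
```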



It is a Queue implementation that additionally supports operations that

- wait for the queue to become non-empty when retrieving an element, and

- wait for space to become available in the queue when storing an element.

If you require this behavior from your Queue implementation, then use a BlockingQueue.
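
A minimal sketch of both behaviors (the class name and capacity are arbitrary): with an ArrayBlockingQueue of capacity 1, take waits for an element and put waits for space.

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

public class PutTakeDemo {
    public static void main(String[] args) throws InterruptedException {
        BlockingQueue<String> queue = new ArrayBlockingQueue<>(1);

        Thread consumer = new Thread(() -> {
            try {
                // take() waits until the queue becomes non-empty.
                System.out.println("consumed: " + queue.take());
                System.out.println("consumed: " + queue.take());
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });
        consumer.start();

        // put() waits for space when the queue is full (capacity 1 here).
        queue.put("first");
        queue.put("second");
        consumer.join();
    }
}
```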
