
Consider the case below

Suppose we are using a C API inside a class to create some data that is allocated on the heap using malloc (e.g. Object* create_obj()), and we have to call a certain function (void free_obj()) at the end of the class's lifetime to free the memory manually.

When a language has destructors, we can simply put the free_obj call in the class destructor, so the user does not have to call free_obj manually or wait until the object gets garbage collected.

My questions

  • Why do some garbage-collected, OOP programming languages (Java [which has been deprecating its finalize] and Ruby) not have destructors?

  • Isn't a destructor necessary when you're interfacing with a low-level API as in the case above? If it's not necessary, what is the best practice for solving that problem?

  • In Java, classes which hold resources that must be closed typically implement AutoCloseable. That interface has the close() method which the developer is responsible for calling at the appropriate time. It was rarely, if ever, a good idea to rely on finalize. And while finalize has been deprecated they did add Cleaner. But it's still best to avoid relying on Cleaner as much as possible. For objects which don't hold open resources, and thus don't implement AutoCloseable, there's really no reason to have a destructor—the GC will take care of it. Commented Aug 21, 2020 at 10:40
  • If you use try-with-resources (from Java 7 onwards) and your component implements AutoCloseable, you don't need finalize Commented Aug 21, 2020 at 10:44
  • This is a pointless question. In Java, you don’t use malloc, so you don’t have a need to put a corresponding free into a destructor. Commented Aug 21, 2020 at 13:44
  • That’s a rare corner case. Why should a language get designed around the rare corner cases? Commented Aug 24, 2020 at 8:17
  • When you “want to know what's the best approach to solve this kind of problem” you shouldn’t ask a “why” question. Commented Aug 25, 2020 at 16:28
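As the comments suggest, the usual Java pattern for the question's C-API scenario is an AutoCloseable wrapper used with try-with-resources. A minimal sketch, where createObj/freeObj are stand-in stubs for the hypothetical native create_obj/free_obj bindings (not a real library):

```java
// Wraps a hypothetical native resource; close() frees it deterministically.
final class ObjHandle implements AutoCloseable {
    private long ptr;          // opaque pointer a native create_obj() would return
    private boolean closed;

    ObjHandle() {
        this.ptr = createObj(); // a real binding would call the native create_obj()
    }

    long pointer() {
        if (closed) throw new IllegalStateException("already closed");
        return ptr;
    }

    @Override
    public void close() {
        if (!closed) {         // idempotent: safe to call close() twice
            freeObj(ptr);      // a real binding would call the native free_obj()
            closed = true;
        }
    }

    boolean isClosed() { return closed; }

    // Stubs standing in for the native calls so the sketch is runnable.
    private static long createObj() { return 1L; }
    private static void freeObj(long p) { /* native free would happen here */ }
}
```

With try-with-resources, `try (ObjHandle h = new ObjHandle()) { ... }` calls close() automatically at the end of the block, even if an exception is thrown, which gives the deterministic cleanup a destructor would provide.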

2 Answers


Languages like Java and Ruby have finalizers, but not destructors. The main reason is that deterministic destruction constrains the implementation in a way that the language designers did not want to do.

Many of the performance tricks that modern high-performance garbage collectors employ would not be possible with deterministic finalization. Ruby and Java do not even guarantee that an object will be collected at all. It is perfectly legal for a Ruby or Java implementation to never collect an object even if it is unreachable.

Even CPython, which has a very simple garbage collector, cannot guarantee deterministic finalization in general; it only guarantees it for non-cyclic object graphs. And the Python community has made it very clear that this is a private internal implementation detail of CPython and not part of Python language semantics, meaning that other implementations (e.g. PyPy, IronPython, Jython) do not have to implement it and are thus free to implement much better garbage collectors.
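Since finalize is deprecated, Java's replacement for last-resort cleanup is java.lang.ref.Cleaner (Java 9+). A minimal sketch: the cleanup state is a separate object that must not reference the wrapper (or the wrapper could never become unreachable), and close() remains the deterministic path. The handle value and the freedCount bookkeeping are illustrative assumptions, not part of a real binding:

```java
import java.lang.ref.Cleaner;

class NativeBuffer implements AutoCloseable {
    private static final Cleaner CLEANER = Cleaner.create();
    static int freedCount = 0; // bookkeeping for the sketch, not part of the pattern

    // Cleanup state: deliberately does NOT hold a reference to the
    // NativeBuffer instance, so the buffer can still be collected.
    private static final class State implements Runnable {
        private final long handle; // hypothetical native handle
        State(long handle) { this.handle = handle; }
        @Override public void run() {
            // a real binding would call the native free_obj(handle) here
            freedCount++;
        }
    }

    private final Cleaner.Cleanable cleanable;

    NativeBuffer(long handle) {
        this.cleanable = CLEANER.register(this, new State(handle));
    }

    // Deterministic path: callers should close(); the Cleaner only runs
    // the action later, as a safety net, if they forget.
    @Override public void close() { cleanable.clean(); }
}
```

Cleanable.clean() runs the action at most once, so an explicit close() also prevents the Cleaner from freeing the handle a second time.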


1 Comment

Though finalizers are highly discouraged, and have been deprecated since Java 9, for example.

Destructors are necessary in manual-allocation languages, but optional in GC languages like Ruby. Destructors are not to be confused with garbage collection; as you said, they tie an object's lifespan to a scope.

Objects live for a while, and then the section of memory an object consumes is marked as available for future objects. Ruby has two pools of memory: the malloc heap and the Ruby object heap. The malloc heap is not released back to the OS unless the memory is unused by Ruby at the end of a GC cycle. The latter (a subset of the malloc heap) is where most Ruby objects live. The Ruby garbage collector focuses its work there and cleans up often, meaning destructors are, for the most part, unnecessary. Not every object will be collected, but languages like Ruby make no guarantee that every object is.

In Ruby, variables reference objects: the object itself is stored somewhere, and the variable only holds the object's id. If we called a destructor on an object that had already been collected, or destructed through another variable, it would return nil but possibly the same object id, which could cause issues at run time.

Ruby's define_finalizer is not common practice, and developers are discouraged from using it. The callback cannot refer to the object it is freeing, since it is executed after the object is freed, and there is no guarantee it will be called at all. Worse, if the finalizer proc holds a reference to self, the object can never become unreachable, meaning it will never be garbage collected.
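If you do use ObjectSpace.define_finalizer, the documented idiom is to build the finalizer proc in a class method so it cannot accidentally capture self. A minimal sketch; NativeHandle, release, and the ptr bookkeeping are hypothetical names standing in for a real C binding's free_obj:

```ruby
class NativeHandle
  # Finalizer factory: building the proc here, with only the pointer in
  # scope, guarantees it cannot capture the instance. A proc created
  # inside initialize would close over self and pin the object forever.
  def self.finalizer(ptr)
    proc { |_object_id| release(ptr) }
  end

  # Stand-in for the native free_obj(ptr); records what was freed.
  def self.release(ptr)
    released << ptr
  end

  def self.released
    @released ||= []
  end

  def initialize(ptr)
    @ptr = ptr
    ObjectSpace.define_finalizer(self, self.class.finalizer(ptr))
  end
end
```

The proc receives the object id of the collected object; all the cleanup it needs (here, ptr) must already be baked in when the proc is created.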

