Is it a good idea to make a destructor protected to disable delete calls via a base class with virtual functions?

Tags:

c++

I work within an environment where I know that child classes of an abstract base class will never be deleted via a pointer to this abstract base class. So I don't see the need for this base class to provide a virtual destructor. Instead, I make the destructor protected, which seems to do what I want.

Here is a simplified example of this:

#include <iostream>

struct Base
{
    virtual void x() = 0;
protected:
    ~Base() = default;
};

struct Child: Base
{
    void x() override
    {
        std::cout << "Child\n";
    }
};

int main()
{
    // new and delete are used here only to keep the example simple;
    // the target platform does not provide them
    Child *c = new Child{};
    Base *b = c;

    b->x();
    // delete b; // does not compile, as requested
    delete c;
    return 0;
}

Is it sufficient to make the destructor protected to be safe against unwanted base class deletions, or am I missing something important here?

Asked by Rudi on Dec 11 '25

1 Answer

Is it a good idea to [...]

From a safety point of view I answer with a clear 'no':

Consider the case that the child class is inherited from again – perhaps by someone other than you. That person might overlook that you violated good practice in Base, assume that the destructor of Child is already virtual – and delete a GrandChild via a pointer to Child...
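
Here is a minimal sketch of that pitfall, building on the code from the question (GrandChild is a hypothetical class that somebody might add later):

#include <iostream>
#include <string>

struct Base
{
    virtual void x() = 0;
protected:
    ~Base() = default;
};

struct Child : Base
{
    void x() override { std::cout << "Child\n"; }
    // implicitly generated destructor: public, but not virtual
};

// added later, possibly by somebody else
struct GrandChild : Child
{
    std::string name{"grandchild"}; // owns a resource
    void x() override { std::cout << "GrandChild\n"; }
};

int main()
{
    Child *c = new GrandChild{};
    c->x();
    delete c; // compiles, but undefined behaviour: Child's destructor
              // is not virtual, so ~GrandChild() (and the std::string
              // member's destructor) are not guaranteed to run
    return 0;
}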

To avoid that situation, you could

  • make the destructor virtual in Child again – so in the end, nothing is gained anyway,
  • or declare Child final, which imposes quite some limits that most likely are meaningless from any other point of view (both options are sketched below).
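
A brief sketch of both options (Base as in the question; ChildWithVirtualDtor and ChildFinal are made-up names for illustration):

struct Base
{
    virtual void x() = 0;
protected:
    ~Base() = default;
};

// Option 1: reintroduce the virtual destructor in the child --
// the virtual call on deletion you wanted to avoid is back.
struct ChildWithVirtualDtor : Base
{
    void x() override {}
    virtual ~ChildWithVirtualDtor() = default;
};

// Option 2: forbid further derivation altogether.
struct ChildFinal final : Base
{
    void x() override {}
};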

And you'd have to opt for one of these for every derived class. All that to avoid a single virtual function call on object deletion?

How often would you delete objects at all? If the frequency of object deletion really is an issue, then also consider the overhead of allocating and freeing the memory again and again. In such a case it is most likely more efficient to allocate memory just once (sufficiently large and appropriately aligned to hold any of the objects you intend to create), use placement new and explicit destructor calls instead, and finally free the memory just once when you are completely done and don't need it any more. That will more than compensate for the virtual function call...
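
A rough sketch of that idea (placement new on storage obtained just once; here the buffer is simply a local array to keep the toy example self-contained, and buffer is a made-up name):

#include <cstddef>  // std::byte
#include <iostream>
#include <new>      // placement new

struct Base
{
    virtual void x() = 0;
protected:
    ~Base() = default;
};

struct Child : Base
{
    void x() override { std::cout << "Child\n"; }
};

int main()
{
    // storage obtained once, sized and aligned for the largest
    // type you intend to create in it
    alignas(Child) std::byte buffer[sizeof(Child)];

    // construct in place instead of new
    Child *c = new (buffer) Child{};
    c->x();

    // destroy explicitly instead of delete; the storage stays
    // around and can be reused for the next object
    c->~Child();

    // reuse the very same storage
    Child *c2 = new (buffer) Child{};
    c2->x();
    c2->~Child();

    // here the buffer has automatic storage, so there is nothing
    // to free; on the real platform the memory would be released
    // just once at the very end
    return 0;
}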

Answered by Aconcagua on Dec 14 '25

