Old question, but still...
I use a couple of simple helpers for this. They give a fairly readable link-time error:
#include <exception> // for std::exception, thrown below
// Not implemented is not implemented :-) If you call it anywhere, it'll break:
struct NotImplHelper { static void notimplemented(); };
#define notimplemented() NotImplHelper::notimplemented()
#if defined(DEBUG) || defined(_DEBUG)
#define notimplementedvirtual() throw std::exception()
#else
#define notimplementedvirtual() static_assert(false, "You should implement virtual function calls before moving to production.")
#endif
Usage:
// let's say this is some class that still doesn't support...
// ...all the functionality that it should based on the design docs
void MyClass::MyFunction()
{
notimplemented();
// or notimplementedvirtual() if MyFunction() is virtual...
}
Rationale:
IMHO, if you use a function in your program, it should be available. Compiling code that calls something you haven't implemented yet should give a compile-time or link-time error.
F.ex., in MSVC++ this'll give:
1>Test.obj : error LNK2019: unresolved external symbol "public: static void __cdecl NotImplHelper::notimplemented(void)" (?notimplemented@NotImplHelper@@SAXXZ) referenced in function "[blahblahblah]"
Note that the 'referenced function' is there in MSVC++. I haven't tested it in other compilers.
As for not-implemented virtual function calls, the only option you have is to throw an exception. Not having these implemented in your debug builds while developing is fine - however, the moment things get serious, they might actually be called by your program, so they should be available. The static_assert ensures the latter: a release build simply won't compile while an unimplemented virtual remains. (So, combined with any continuous integration setup, the build will fail.)
Obviously most people will accidentally mix up notimplemented and notimplementedvirtual. In practice this isn't a big problem: a simple rule is to always use the former, unless you need to silence the link error because the function is a work in progress.