François Chollet (@fchollet) · Deep learning @google. Creator of Keras, neural networks library. Author of 'Deep Learning with Python'. Opinions are my own. · Nov. 25, 2019 · 1 min read

Built-in losses and metrics in Keras follow the signature `loss(y_true, y_pred, sample_weight=None)`. If you have exotic losses or metrics, a simple way to add them w/o having to implement your own training loop from scratch is to define them in an "endpoint layer". Like this:
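The original tweet attached a code screenshot; a minimal sketch of the pattern (the `LogisticEndpoint` name and the binary-classification task are illustrative, chosen to match the Colab linked below) looks roughly like this:

```python
import tensorflow as tf
from tensorflow import keras

class LogisticEndpoint(keras.layers.Layer):
    """Endpoint layer: computes the loss and a metric from targets + logits."""

    def __init__(self, name=None):
        super().__init__(name=name)
        self.loss_fn = keras.losses.BinaryCrossentropy(from_logits=True)
        self.accuracy_fn = keras.metrics.BinaryAccuracy()

    def call(self, targets, logits, sample_weight=None):
        # Compute the training-time loss value and register it
        # on the layer via `self.add_loss()`.
        loss = self.loss_fn(targets, logits, sample_weight)
        self.add_loss(loss)

        # Log accuracy as a metric via `self.add_metric()`.
        acc = self.accuracy_fn(targets, tf.math.sigmoid(logits), sample_weight)
        self.add_metric(acc, name="accuracy")

        # Return the inference-time prediction tensor (for `.predict()`).
        return tf.math.sigmoid(logits)
```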

It feels a bit more like the "Estimator" training style.

Full Colab notebook:  https://colab.research.google.com/drive/1zzLcJ2A2qofIvv94YJ3axRknlA6cBSIw 

This usage pattern enables you to use `fit`/etc. with losses or metrics that have completely arbitrary signatures. Such an endpoint layer may also behave differently during training and during inference.
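For instance, wiring the sketch above into a functional model could look like this (input names, shapes, and the dummy data are illustrative, not taken from the thread):

```python
import numpy as np
from tensorflow import keras

inputs = keras.Input(shape=(3,), name="inputs")
targets = keras.Input(shape=(1,), name="targets")
logits = keras.layers.Dense(1)(inputs)

# The endpoint layer sees both targets and logits, so the
# loss/metric signature is entirely up to you.
predictions = LogisticEndpoint(name="predictions")(targets, logits)

model = keras.Model(inputs=[inputs, targets], outputs=predictions)
model.compile(optimizer="adam")  # no `loss` argument: the layer already added it

data = {
    "inputs": np.random.random((8, 3)).astype("float32"),
    "targets": np.random.randint(0, 2, size=(8, 1)).astype("float32"),
}
model.fit(data, epochs=1)
```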

