What Does It Mean To "get Variable" In TensorFlow?
Solution 1:
If a variable already exists, why can't I reuse it (or "get it") by calling its name?
You can, and that's usually done when the whole model lives in one file. However, a big model is likely to be split across different source files and libraries. In that case, tf.get_variable is simply convenient: "tf.get_variable also allows you to reuse a previously created variable of the same name, making it easy to define models which reuse layers".
Out-of-the-box layers and functions in TensorFlow often define their variables with tf.get_variable, for example tf.contrib.crf.crf_log_likelihood (source code), which allows the client to pass a transitions matrix even if the crf_log_likelihood invocation is in another module or even in third-party code.
The possibility of sharing is another use case, as already suggested in the comments, so calling tf.get_variable deep within a layer is a step toward better compositionality.