"""Public APIs for the HParams plugin.

This module supports a spectrum of use cases, depending on how much
structure you want. In the simplest case, you can collect your hparams
into a dict and use a Keras callback to record them:

>>> from tensorboard.plugins.hparams import api as hp
>>> hparams = {
...     "optimizer": "adam",
...     "fc_dropout": 0.2,
...     "neurons": 128,
...     # ...
... }
>>>
>>> model = model_fn(hparams)
>>> callbacks = [
...     tf.keras.callbacks.TensorBoard(logdir),
...     hp.KerasCallback(logdir, hparams),
... ]
>>> model.fit(..., callbacks=callbacks)

The Keras callback requires that TensorFlow eager execution be enabled.
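
Under TF 1.x you can opt in to eager execution at program startup (a
minimal sketch; TF 2.x is eager by default):

>>> # Must run before any graphs or sessions are created.
>>> tf.compat.v1.enable_eager_execution()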

If not using Keras, use the `hparams` function to write the values
directly:

>>> # In eager mode:
>>> with tf.summary.create_file_writer(logdir).as_default():
...   hp.hparams(hparams)
>>>
>>> # In legacy graph mode:
>>> with tf.compat.v2.summary.create_file_writer(logdir).as_default() as w:
...   sess.run(w.init())
...   sess.run(hp.hparams(hparams))
...   sess.run(w.flush())

To control how hyperparameters and metrics appear in the TensorBoard UI,
you can define `HParam` and `Metric` objects, and write an experiment
summary to the top-level log directory:

>>> HP_OPTIMIZER = hp.HParam("optimizer")
>>> HP_FC_DROPOUT = hp.HParam(
...     "fc_dropout",
...     display_name="f.c. dropout",
...     description="Dropout rate for fully connected subnet.",
... )
>>> HP_NEURONS = hp.HParam("neurons", description="Neurons per dense layer")
>>>
>>> with tf.summary.create_file_writer(base_logdir).as_default():
...   hp.hparams_config(
...       hparams=[
...           HP_OPTIMIZER,
...           HP_FC_DROPOUT,
...           HP_NEURONS,
...       ],
...       metrics=[
...           hp.Metric("xent", group="validation", display_name="cross-entropy"),
...           hp.Metric("f1", group="validation", display_name="F&#x2081; score"),
...           hp.Metric("loss", group="train", display_name="training loss"),
...       ],
...   )
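
Then, for each run, write your metric values as ordinary scalar
summaries into a subdirectory of the run's log directory named after
the metric's `group`. A sketch (here `run_logdir` and `xent_value` are
placeholder names, not part of this API):

>>> # "validation" matches the group= argument to hp.Metric above.
>>> with tf.summary.create_file_writer(run_logdir + "/validation").as_default():
...   tf.summary.scalar("xent", xent_value, step=1)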

You can continue to pass a string-keyed dict to the Keras callback or
the `hparams` function, or you can use `HParam` objects as the keys. The
latter approach enables better static analysis: your favorite Python
linter can tell you if you misspell a hyperparameter name, your IDE can
help you find all the places where a hyperparameter is used, and so on:

>>> hparams = {
...     HP_OPTIMIZER: "adam",
...     HP_FC_DROPOUT: 0.2,
...     HP_NEURONS: 128,
...     # ...
... }
>>>
>>> model = model_fn(hparams)
>>> callbacks = [
...     tf.keras.callbacks.TensorBoard(logdir),
...     hp.KerasCallback(logdir, hparams),
... ]
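
The same `HParam`-keyed dict works with the `hparams` function, too. A
sketch (here `run_logdir` is a placeholder for one run's log directory;
`trial_id` optionally names the run):

>>> with tf.summary.create_file_writer(run_logdir).as_default():
...   hp.hparams(hparams, trial_id="run-1")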

Finally, you can choose to annotate your hparam definitions with domain
information:

>>> HP_OPTIMIZER = hp.HParam("optimizer", hp.Discrete(["adam", "sgd"]))
>>> HP_FC_DROPOUT = hp.HParam("fc_dropout", hp.RealInterval(0.1, 0.4))
>>> HP_NEURONS = hp.HParam("neurons", hp.IntInterval(64, 256))

The TensorBoard HParams plugin does not provide tuners, but you can
integrate these domains into your preferred tuning framework. The
domains will also be reflected in the TensorBoard UI.
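
For instance, a random-search driver could draw one value from each
domain (a sketch using each domain's `sample_uniform` method, as in
`hparams_demo.py`; the `rng` seeding is illustrative):

>>> import random
>>> rng = random.Random(0)
>>> hparams = {
...     h: h.domain.sample_uniform(rng)
...     for h in [HP_OPTIMIZER, HP_FC_DROPOUT, HP_NEURONS]
... }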

See the `Experiment`, `HParam`, `Metric`, and `KerasCallback` classes
for API specifications. Consult the `hparams_demo.py` script in the
TensorBoard repository for an end-to-end MNIST example.
"""

from tensorboard.plugins.hparams import _keras
from tensorboard.plugins.hparams import summary_v2

Discrete = summary_v2.Discrete
Domain = summary_v2.Domain
HParam = summary_v2.HParam
IntInterval = summary_v2.IntInterval
Metric = summary_v2.Metric
RealInterval = summary_v2.RealInterval
hparams = summary_v2.hparams
hparams_pb = summary_v2.hparams_pb
hparams_config = summary_v2.hparams_config
hparams_config_pb = summary_v2.hparams_config_pb
KerasCallback = _keras.Callback

del _keras
del summary_v2