Index
A
- abs function, 36
- accumulation, momentum algorithm, 75–76
- activation functions
- activation parameter, 157, 185
- activation_fn parameter, 138, 144, 145, 147
- activity_regularizer parameter, 157
- AdagradOptimizer class, 76–77
- AdamOptimizer class, 77–78
- adaptive gradient algorithm, 76–77
- add function, 36
- add_event method, 60, 61–62
- add_graph method, 60, 62
- add_loss method, 185
- add_meta_graph method, 61, 62, 85
- add_meta_graph_and_variables function, 85, 86
- add_n function, 36
- add_run_metadata method, 61
- add_session_log method, 61
- add_summary method, 60, 61, 64, 89–90
- add_to_collection method, 47
- add_to_collections method, 47
- add_update method, 185
- add_variable method, 185
- adjust_brightness function, 169, 170
- adjust_contrast function, 169, 170, 175
- adjust_gamma function, 169, 170
- adjust_hue function, 169, 170
- adjust_saturation function, 169, 170
- Advanced Package Tool (APT), 20
- after_create_session method, 91
- after_run method, 91, 92
- allocator_type field, 238
- allow_growth field, 237, 238
- allow_soft_placement option, 227
- AlphaGo program, Google, 1, 129
- app group, gcloud, 282
- apply_regularization function, 141
- arg_scope function, 143, 147
- argmax function, 37, 38
- argmin function, 37, 38
- args field, 301
- argument scope, 143
- arrays. See tensors
- artificial neural networks (ANNs). See neural networks
- as_cluster_def method, 270
- as_default method, 46