Random-number generator.
View aliases
Main aliases
tf.random.experimental.Generator
Compat aliases for migration. See the Migration guide for more details.
tf.compat.v1.random.Generator, tf.compat.v1.random.experimental.Generator
tf.random.Generator(copy_from=None, state=None, alg=None)
Example:
Creating a generator from a seed:
g = tf.random.Generator.from_seed(1234)
g.normal(shape=(2, 3))
<tf.Tensor: shape=(2, 3), dtype=float32, numpy=
array([[ 0.9356609 , 1.0854305 , -0.93788373],
[-0.5061547 , 1.3169702 , 0.7137579 ]], dtype=float32)>
Creating a generator from a non-deterministic state:
g = tf.random.Generator.from_non_deterministic_state()
g.normal(shape=(2, 3))
<tf.Tensor: shape=(2, 3), dtype=float32, numpy=...>
All the constructors allow explicitly choosing a Random-Number-Generation (RNG) algorithm. Supported algorithms are "philox" and "threefry". For example:
g = tf.random.Generator.from_seed(123, alg="philox")
g.normal(shape=(2, 3))
<tf.Tensor: shape=(2, 3), dtype=float32, numpy=
array([[ 0.8673864 , -0.29899067, -0.9310337 ],
[-1.5828488 , 1.2481191 , -0.6770643 ]], dtype=float32)>
CPU, GPU and TPU with the same algorithm and seed will generate the same integer random numbers. Floating-point results (such as the output of normal) may have small numerical discrepancies between different devices.
This class uses a tf.Variable to manage its internal state. Every time random numbers are generated, the state of the generator will change. For example:
g = tf.random.Generator.from_seed(1234)
g.state
<tf.Variable ... numpy=array([1234, 0, 0])>
g.normal(shape=(2, 3))
<...>
g.state
<tf.Variable ... numpy=array([2770, 0, 0])>
The shape of the state is algorithm-specific.
There is also a global generator:
g = tf.random.get_global_generator()
g.normal(shape=(2, 3))
<tf.Tensor: shape=(2, 3), dtype=float32, numpy=...>
When creating a generator inside a tf.distribute.Strategy scope, each replica will get a different stream of random numbers.
For example, in this code:
strat = tf.distribute.MirroredStrategy(devices=["cpu:0", "cpu:1"])
with strat.scope():
  g = tf.random.Generator.from_seed(1)
  def f():
    return g.normal([])
  results = strat.run(f).values
results[0] and results[1] will have different values.
If the generator is seeded (e.g. created via Generator.from_seed), the random numbers will be determined by the seed, even though different replicas get different numbers. One can think of a random number generated on a replica as a hash of the replica ID and a "master" random number that may be common to all replicas. Hence, the whole system is still deterministic.
(Note that the random numbers on different replicas are not correlated, even if they are deterministically determined by the same seed. They are not correlated in the sense that no matter what statistics one calculates on them, there won't be any discernable correlation.)
Generators can be freely saved and restored using tf.train.Checkpoint. The checkpoint can be restored in a distribution strategy with a different number of replicas than the original strategy. If a replica ID is present in both the original and the new distribution strategy, its state will be properly restored (i.e. the random-number stream from the restored point will be the same as that from the saving point) unless the replicas have already diverged in their RNG call traces before saving (e.g. one replica has made one RNG call while another has made two RNG calls). We don't have such a guarantee if the generator is saved in a strategy scope and restored outside of any strategy scope, or vice versa.
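As a brief sketch of this (not part of the original example set), saving and restoring a generator with tf.train.Checkpoint could look like the following; the checkpoint prefix /tmp/rng_ckpt is a hypothetical placeholder:
g = tf.random.Generator.from_seed(1)
ckpt = tf.train.Checkpoint(generator=g)
path = ckpt.save("/tmp/rng_ckpt")  # hypothetical checkpoint prefix
g.normal([3])       # advances the generator's state
ckpt.restore(path)  # rolls the state back to the saved point
g.normal([3])       # repeats the draw made right after saving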
When a generator is created within the scope of tf.distribute.experimental.ParameterServerStrategy, the workers will share the generator's state (placed on one of the parameter servers). In this way the workers will still get different random-number streams, as stated above. (This is similar to replicas in a tf.distribute.MirroredStrategy sequentially accessing a generator created outside the strategy.) Each RNG call on a worker will incur a round-trip to a parameter server, which may have performance impacts. When creating a tf.distribute.experimental.ParameterServerStrategy, please make sure that the variable_partitioner argument won't shard small variables of shape [2] or [3] (because generator states must not be sharded). Ways to avoid sharding small variables include setting variable_partitioner to None or to tf.distribute.experimental.partitioners.MinSizePartitioner with a large enough min_shard_bytes (see tf.distribute.experimental.ParameterServerStrategy's documentation for more details).
Methods
binomial
binomial(shape, counts, probs, dtype=tf.dtypes.int32, name=None)
Outputs random values from a binomial distribution.
The generated values follow a binomial distribution with specified count and probability of success parameters.
Example:
counts = [10., 20.]
# Probability of success.
probs = [0.8]
rng = tf.random.Generator.from_seed(seed=234)
binomial_samples = rng.binomial(shape=[2], counts=counts, probs=probs)

counts = ...  # Shape [3, 1, 2]
probs = ...   # Shape [1, 4, 2]
shape = [3, 4, 3, 4, 2]
rng = tf.random.Generator.from_seed(seed=1717)
# Sample shape will be [3, 4, 3, 4, 2]
binomial_samples = rng.binomial(shape=shape, counts=counts, probs=probs)
Args | |
---|---|
shape | A 1-D integer Tensor or Python array. The shape of the output tensor. |
counts | Tensor. The counts of the binomial distribution. Must be broadcastable with probs, and broadcastable with the rightmost dimensions of shape. |
probs | Tensor. The probability of success for the binomial distribution. Must be broadcastable with counts and broadcastable with the rightmost dimensions of shape. |
dtype | The type of the output. Default: tf.int32 |
name | A name for the operation (optional). |
Returns | |
---|---|
samples | A Tensor of the specified shape filled with random binomial values. For each i, each samples[i, ...] is an independent draw from the binomial distribution on counts[i] trials with probability of success probs[i]. |
from_key_counter
@classmethod
from_key_counter(key, counter, alg)
Creates a generator from a key and a counter.
This constructor only applies if the algorithm is a counter-based algorithm. See method key for the meaning of "key" and "counter".
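For example (a sketch assuming the Philox algorithm, whose counter has length 2):
g = tf.random.Generator.from_key_counter(key=1234, counter=[0, 0], alg="philox")
g.normal(shape=(2, 3))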
Args | |
---|---|
key | the key for the RNG, a scalar of type STATE_TYPE. |
counter | a vector of dtype STATE_TYPE representing the initial counter for the RNG, whose length is algorithm-specific. |
alg | the RNG algorithm. If None, it will be auto-selected. See __init__ for its possible values. |
Returns | |
---|---|
The new generator. |
from_non_deterministic_state
@classmethod
from_non_deterministic_state(alg=None)
Creates a generator by non-deterministically initializing its state.
The source of the non-determinism will be platform- and time-dependent.
Args | |
---|---|
alg | (optional) the RNG algorithm. If None, it will be auto-selected. See __init__ for its possible values. |
Returns | |
---|---|
The new generator. |
from_seed
@classmethod
from_seed(seed, alg=None)
Creates a generator from a seed.
A seed is a 1024-bit unsigned integer represented either as a Python integer or a vector of integers. Seeds shorter than 1024 bits will be padded. The padding, the internal structure of a seed and the way a seed is converted to a state are all opaque (unspecified). The only semantic specification of seeds is that two different seeds are likely to produce two independent generators (but with no guarantee).
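For example, the seed may be given either as a Python integer or as a vector of integers (a sketch):
g1 = tf.random.Generator.from_seed(1234)       # scalar seed
g2 = tf.random.Generator.from_seed([1, 2, 3])  # vector seed; will be padded
g2.uniform(shape=(2,))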
Args | |
---|---|
seed | the seed for the RNG. |
alg | (optional) the RNG algorithm. If None, it will be auto-selected. See __init__ for its possible values. |
Returns | |
---|---|
The new generator. |
from_state
@classmethod
from_state(state, alg)
Creates a generator from a state.
See __init__ for description of state and alg.
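For example (a sketch assuming the Philox algorithm, whose state is a vector of three int64 values):
state = tf.Variable([1234, 0, 0], dtype=tf.int64)
g = tf.random.Generator.from_state(state, alg="philox")
g.normal(shape=(2, 3))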
Args | |
---|---|
state | the new state. |
alg | the RNG algorithm. |
Returns | |
---|---|
The new generator. |
make_seeds
make_seeds(count=1)
Generates seeds for stateless random ops.
For example:
seeds = get_global_generator().make_seeds(count=10)
for i in range(10):
  seed = seeds[:, i]
  numbers = stateless_random_normal(shape=[2, 3], seed=seed)
  ...
Args | |
---|---|
count | the number of seed pairs (note that stateless random ops need a pair of seeds to invoke). |
Returns | |
---|---|
A tensor of shape [2, count] and dtype int64. |
normal
normal(shape, mean=0.0, stddev=1.0, dtype=tf.dtypes.float32, name=None)
Outputs random values from a normal distribution.
Args | |
---|---|
shape | A 1-D integer Tensor or Python array. The shape of the output tensor. |
mean | A 0-D Tensor or Python value of type dtype. The mean of the normal distribution. |
stddev | A 0-D Tensor or Python value of type dtype. The standard deviation of the normal distribution. |
dtype | The type of the output. |
name | A name for the operation (optional). |
Returns | |
---|---|
A tensor of the specified shape filled with random normal values. |
reset
reset(state)
Resets the generator by a new state.
See __init__ for the meaning of "state".
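For example (a sketch assuming the Philox algorithm's length-3 state):
g = tf.random.Generator.from_seed(1)
g.reset(tf.constant([1234, 0, 0], dtype=tf.int64))
g.state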
Args | |
---|---|
state | the new state. |
reset_from_key_counter
reset_from_key_counter(key, counter)
Resets the generator by a new key-counter pair.
See from_key_counter for the meaning of "key" and "counter".
Args | |
---|---|
key | the new key. |
counter | the new counter. |
reset_from_seed
reset_from_seed(seed)
Resets the generator by a new seed.
See from_seed for the meaning of "seed".
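For example, reseeding an existing generator to replay a stream (a brief sketch):
g = tf.random.Generator.from_seed(1)
g.normal(shape=(2,))  # advances the state
g.reset_from_seed(1)  # back to the state produced by seed 1
g.normal(shape=(2,))  # repeats the first draw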
Args | |
---|---|
seed | the new seed. |
skip
skip(delta)
Advance the counter of a counter-based RNG.
Args | |
---|---|
delta | the amount of advancement. The state of the RNG after skip(n) will be the same as that after normal([n]) (or any other distribution). The actual increment added to the counter is an unspecified implementation detail. |
Returns | |
---|---|
A Tensor of type int64. |
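For example, the state equivalence between skip(n) and drawing n samples (noted in the description of delta above) can be sketched as:
g1 = tf.random.Generator.from_seed(1)
g2 = tf.random.Generator.from_seed(1)
g1.skip(3)
g2.normal([3])
# g1 and g2 are now in the same state, so their subsequent draws coincide.
g1.normal([2])
g2.normal([2])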
split
split(count=1)
Returns a list of independent Generator objects.
Two generators are independent of each other in the sense that the random-number streams they generate don't have statistically detectable correlations. The new generators are also independent of the old one. The old generator's state will be changed (like other random-number generating methods), so two calls of split will return different new generators.
For example:
gens = get_global_generator().split(count=10)
for gen in gens:
  numbers = gen.normal(shape=[2, 3])
  # ...
gens2 = get_global_generator().split(count=10)
# gens2 will be different from gens
The new generators will be put on the current device (possibly different from the old generator's device), for example:
with tf.device("/device:CPU:0"):
  gen = tf.random.Generator.from_seed(1234)  # gen is on CPU
with tf.device("/device:GPU:0"):
  gens = gen.split(count=10)  # gens are on GPU
Args | |
---|---|
count | the number of generators to return. |
Returns | |
---|---|
A list (length count) of Generator objects independent of each other. The new generators have the same RNG algorithm as the old one. |
truncated_normal
truncated_normal(shape, mean=0.0, stddev=1.0, dtype=tf.dtypes.float32, name=None)
Outputs random values from a truncated normal distribution.
The generated values follow a normal distribution with specified mean and standard deviation, except that values whose magnitude is more than 2 standard deviations from the mean are dropped and re-picked.
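For example (a small sketch):
g = tf.random.Generator.from_seed(42)
samples = g.truncated_normal(shape=(1000,), mean=0.0, stddev=2.0)
# All values lie within two standard deviations of the mean, i.e. in (-4, 4).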
Args | |
---|---|
shape | A 1-D integer Tensor or Python array. The shape of the output tensor. |
mean | A 0-D Tensor or Python value of type dtype. The mean of the truncated normal distribution. |
stddev | A 0-D Tensor or Python value of type dtype. The standard deviation of the normal distribution, before truncation. |
dtype | The type of the output. |
name | A name for the operation (optional). |
Returns | |
---|---|
A tensor of the specified shape filled with random truncated normal values. |
uniform
uniform(shape, minval=0, maxval=None, dtype=tf.dtypes.float32, name=None)
Outputs random values from a uniform distribution.
The generated values follow a uniform distribution in the range [minval, maxval). The lower bound minval is included in the range, while the upper bound maxval is excluded. (For floating-point numbers, especially low-precision types like bfloat16, the result may sometimes include maxval because of rounding.)
For floats, the default range is [0, 1). For ints, at least maxval must be specified explicitly.
In the integer case, the random integers are slightly biased unless maxval - minval is an exact power of two. The bias is small for values of maxval - minval significantly smaller than the range of the output (either 2**32 or 2**64).
For full-range random integers, pass minval=None and maxval=None with an integer dtype (for integer dtypes, minval and maxval must be both None or both not None).
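For example (a sketch covering the float, bounded-integer and full-range-integer cases described above):
g = tf.random.Generator.from_seed(1)
g.uniform(shape=(2, 2))                                             # floats in [0, 1)
g.uniform(shape=(2, 2), maxval=10, dtype=tf.int32)                  # ints in [0, 10)
g.uniform(shape=(2, 2), minval=None, maxval=None, dtype=tf.int64)   # full int64 range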
Args | |
---|---|
shape | A 1-D integer Tensor or Python array. The shape of the output tensor. |
minval | A Tensor or Python value of type dtype, broadcastable with shape (for integer types, broadcasting is not supported, so it needs to be a scalar). The lower bound (included) on the range of random values to generate. Pass None for full-range integers. Defaults to 0. |
maxval | A Tensor or Python value of type dtype, broadcastable with shape (for integer types, broadcasting is not supported, so it needs to be a scalar). The upper bound (excluded) on the range of random values to generate. Pass None for full-range integers. Defaults to 1 if dtype is floating point. |
dtype | The type of the output. |
name | A name for the operation (optional). |
Returns | |
---|---|
A tensor of the specified shape filled with random uniform values. |
Raises | |
---|---|
ValueError | If dtype is integral and maxval is not specified. |
uniform_full_int
uniform_full_int(shape, dtype=tf.dtypes.uint64, name=None)
Uniform distribution on an integer type's entire range.
This method is the same as setting minval and maxval to None in the uniform method.
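For example (a brief sketch):
g = tf.random.Generator.from_seed(7)
g.uniform_full_int(shape=(3,), dtype=tf.uint32)  # uniform over the full uint32 range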
Args | |
---|---|
shape | the shape of the output. |
dtype | (optional) the integer type; defaults to uint64. |
name | (optional) the name of the node. |
Returns | |
---|---|
A tensor of random numbers of the required shape. |