How can I see the exact formula from the book, the same as in the email?
When I copy formulas from any book online and paste them into Yahoo Mail, they do not show the same as they are in the book. Example:
Based on the single-input neuron formula a = f(wp + b), with weight w = 1.3 and bias b = 3.0, the input p is calculated by rearranging to p = (f⁻¹(a) − 3.0)/1.3. Possible transfer functions (Table 2.1) and corresponding inputs p for outputs a are generally: Linear (a = 1.3p + 3), Saturating Linear, or Sigmoid.
i. Output = 1.6: Linear (1.3p + 3 = 1.6 ⇒ p ≈ −1.077) or Positive Linear (same input; Saturating Linear cannot reach 1.6, since it saturates at 1).
ii. Output = 1.0: Linear (1.3p + 3 = 1.0 ⇒ p ≈ −1.538) or Saturating Linear (1.3p + 3 ≥ 1, i.e. p ≥ −1.538, for its maximum output of 1).
iii. Output = 0.9963: Log-Sigmoid (0.9963 = 1/(1 + e^−(1.3p + 3)) ⇒ p ≈ 2.0) or Hyperbolic Tangent Sigmoid (0.9963 = tanh(1.3p + 3) ⇒ p ≈ 0.112).
iv. Output = −1.0: Linear (1.3p + 3 = −1.0 ⇒ p ≈ −3.077) or Symmetric Hard Limit (1.3p + 3 < 0, i.e. p < −2.308).
Common Transfer Functions (Table 2.1 Context):
Linear: f(n) = n
Saturating Linear: f(n) = 0 for n < 0; f(n) = n for 0 ≤ n ≤ 1; f(n) = 1 for n > 1
Log-Sigmoid: f(n) = 1/(1 + e^(−n))
Symmetric Hard Limit: f(n) = −1 for n < 0; f(n) = +1 for n ≥ 0
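The inversion p = (f⁻¹(a) − b)/w for each invertible transfer function can be checked numerically. A minimal sketch with w = 1.3 and b = 3.0 (the function names here are my own, not the book's MATLAB names):

```python
import math

w, b = 1.3, 3.0

def p_linear(a):
    # Linear: a = n, so n = a and p = (a - b) / w
    return (a - b) / w

def p_logsig(a):
    # Log-Sigmoid: a = 1 / (1 + e^(-n)), so n = ln(a / (1 - a))
    n = math.log(a / (1 - a))
    return (n - b) / w

def p_tansig(a):
    # Hyperbolic Tangent Sigmoid: a = tanh(n), so n = atanh(a)
    n = math.atanh(a)
    return (n - b) / w

print(round(p_linear(1.6), 3))     # i.   -1.077
print(round(p_linear(1.0), 3))     # ii.  -1.538
print(round(p_logsig(0.9963), 3))  # iii.  1.997 (log-sigmoid)
print(round(p_tansig(0.9963), 3))  # iii.  0.112 (tan-sigmoid)
print(round(p_linear(-1.0), 3))    # iv.  -3.077
```

Note that the log-sigmoid input for case iii comes out at p ≈ 2, and the tan-sigmoid input at p ≈ 0.112; the two sigmoids give different answers because their inverses differ.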
This is from the email: E2.1 A single-input neuron has a weight of 1.3 and a bias of 3.0. What possible kinds of transfer functions, from Table 2.1, could this neuron have, if its output is given below. In each case, give the value of the input that would produce these outputs.
Based on the single-input neuron formula a = f(wp + b), with weight w = 1.3 and bias b = 3.0, the input p is calculated by rearranging to p = (f⁻¹(a) − 3.0)/1.3. Possible transfer functions (Table 2.1) and corresponding inputs p for outputs a are generally: Linear (a = 1.3p + 3), Saturating Linear, or Sigmoid.
i. Output = 1.6: Linear (1.3p + 3 = 1.6 ⇒ p ≈ −1.077) or Positive Linear (same input; Saturating Linear cannot reach 1.6, since it saturates at 1).
ii. Output = 1.0: Linear (1.3p + 3 = 1.0 ⇒ p ≈ −1.538) or Saturating Linear (1.3p + 3 ≥ 1, i.e. p ≥ −1.538, for its maximum output of 1).
iii. Output = 0.9963: Log-Sigmoid (0.9963 = 1/(1 + e^−(1.3p + 3)) ⇒ p ≈ 2.0) or Hyperbolic Tangent Sigmoid (0.9963 = tanh(1.3p + 3) ⇒ p ≈ 0.112).
iv. Output = −1.0: Linear (1.3p + 3 = −1.0 ⇒ p ≈ −3.077) or Symmetric Hard Limit (1.3p + 3 < 0, i.e. p < −2.308).
Common Transfer Functions (Table 2.1 Context):
Linear: f(n) = n
Saturating Linear: f(n) = satlin(n), i.e. 0 for n < 0; n for 0 ≤ n ≤ 1; 1 for n > 1
Log-Sigmoid: f(n) = 1/(1 + e^(−n))
Symmetric Hard Limit: f(n) = −1 for n < 0; f(n) = +1 for n ≥ 0
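To double-check which of these functions can even produce a given output, the Table 2.1 definitions can be coded directly. A sketch assuming the usual conventions (the MATLAB names purelin, satlin, hardlims, logsig come from the book; the Python bodies are mine):

```python
import math

def purelin(n):   # Linear: passes n through unchanged
    return n

def satlin(n):    # Saturating Linear: clamps n into [0, 1]
    return min(max(n, 0.0), 1.0)

def hardlims(n):  # Symmetric Hard Limit: -1 for n < 0, +1 for n >= 0
    return -1.0 if n < 0 else 1.0

def logsig(n):    # Log-Sigmoid: squashes n into (0, 1)
    return 1.0 / (1.0 + math.exp(-n))

w, b = 1.3, 3.0

# Case ii: any p >= -1.538 drives n = 1.3p + 3 past 1, saturating satlin at 1.0
print(satlin(w * 0.0 + b))

# Case iv: p = -4/1.3 gives n = -1, so purelin outputs -1 and hardlims outputs -1
n = w * (-4.0 / 1.3) + b
print(round(purelin(n), 3), hardlims(n))
```

This also shows why Saturating Linear cannot explain an output of 1.6: satlin never exceeds 1 for any input.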