Back Propagation

Joined: Jul 17, 2023 · Messages: 1 · Reaction score: 0
Hi, I'd like to know why exactly my back propagation function keeps producing the same pattern of weights for every neuron within a layer. (The first layer is the exception; its weights differ, which I assume is because it has non-linear activation functions.)
This is the sheet with the weights:

If I knew how to, I would post the code. If it's needed, I'll try to figure out how to post it.
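Without seeing the code, a common cause of this symptom is symmetric initialization: if every neuron in a layer starts with identical weights, each neuron computes the same output, receives the same gradient, and so stays identical after every update. The sketch below is a hypothetical toy network (not the poster's code, whose details are unknown) that demonstrates the effect:

```python
import numpy as np

def grads(W1, W2, x, y):
    """One forward/backward pass for a tiny tanh -> linear net with MSE loss."""
    h = np.tanh(x @ W1)                   # hidden layer activations
    out = h @ W2                          # linear output layer
    d_out = 2 * (out - y) / len(x)        # dLoss/dout for mean-squared error
    dW2 = h.T @ d_out                     # gradient w.r.t. output weights
    d_h = (d_out @ W2.T) * (1 - h ** 2)   # backprop through tanh: tanh'(z) = 1 - tanh(z)^2
    dW1 = x.T @ d_h                       # gradient w.r.t. hidden weights
    return dW1, dW2

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 3))               # 4 samples, 3 inputs
y = rng.normal(size=(4, 2))               # 2 targets

# Symmetric init: every hidden neuron has the same weight vector,
# so every column of the gradient is identical and the neurons never diverge.
W1 = np.full((3, 5), 0.5)
W2 = np.full((5, 2), 0.5)
dW1, _ = grads(W1, W2, x, y)
print(np.allclose(dW1, dW1[:, [0]]))      # True: all columns (neurons) equal

# Small random init breaks the symmetry: each neuron gets a distinct gradient.
W1r = rng.normal(size=(3, 5)) * 0.1
W2r = rng.normal(size=(5, 2)) * 0.1
dW1r, _ = grads(W1r, W2r, x, y)
print(np.allclose(dW1r, dW1r[:, [0]]))    # False: neurons differ
```

This also matches the reported exception: a layer can look "different" when something upstream of it (inputs or non-linearities applied to varying data) breaks the symmetry, while layers fed identical activations stay locked together.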
 
