Got two books on Neural Nets; neither covers anything more than 1 hidden layer.
So, can anyone recommend a book? I'd really like one that covers two or more hidden layers.
Is anyone here good with Neural Nets, or equally good at maths? Have a look at the attached document with the full question listed!
Try this... it has code you can try, and you can install Python for free (get Python at https://www.python.org/)
Another decent read:
I somewhat remember a course I took around 1993 using my 100 MHz PC to train on Lenna:
When in the dark, remember: the future looks brighter than ever. I look forward to being able to predict the future!
Hello freaks. I was wondering if you could help me: I'm trying to work out the back-propagation calculations for a network with 2 hidden layers. Have a look at the diagram below.
To calculate w1 I have the following.
The calculation of the first term on the right-hand side of the equation above is a bit more involved than the previous calculations, since that term affects the error through more than one downstream node.
Now my question is: if I had a hypothetical weight input into x1 called wi, what is the equation for back-propagating that weight? I'm thinking maybe it is this:
∂E/∂wi = ∂E/∂x1 . ∂x1/∂zx1 . ∂zx1/∂wi
∂E/∂x1 = (∂E/∂o1 . ∂o1/∂zo1 . ∂zo1/∂x1) + (∂E/∂o2 . ∂o2/∂zo2 . ∂zo2/∂x1)
Or maybe it is this:
∂E/∂x1 = (∂E/∂h1 . ∂h1/∂zh1 . ∂zh1/∂x1) + (∂E/∂h2 . ∂h2/∂zh2 . ∂zh2/∂x1)
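For what it's worth: assuming the diagram is a layered net where x1 feeds only h1 and h2 (and reaches o1/o2 only through them), the second form is the one the chain rule gives, with the h-layer deltas themselves accumulated over both output paths. Here is a minimal NumPy sketch that checks this against a finite-difference estimate. All layer sizes, weight values, inputs, and targets are made-up example numbers, not anything from the thread's diagram:

```python
import numpy as np

# Hypothetical 2-2-2-2 net with sigmoid activations and squared error.
# Layer names (x, h, o) follow the thread's diagram; values are invented.
rng = np.random.default_rng(0)
sig = lambda z: 1.0 / (1.0 + np.exp(-z))

inp = np.array([0.05, 0.10])      # network inputs
t   = np.array([0.01, 0.99])      # targets
Wx  = rng.normal(size=(2, 2))     # input -> x layer (the wi in question)
Wh  = rng.normal(size=(2, 2))     # x -> h layer
Wo  = rng.normal(size=(2, 2))     # h -> o layer

def forward(Wx):
    x = sig(Wx @ inp)
    h = sig(Wh @ x)
    o = sig(Wo @ h)
    return x, h, o

def error(Wx):
    return 0.5 * np.sum((forward(Wx)[-1] - t) ** 2)

x, h, o = forward(Wx)

# Back-propagate layer by layer (sigmoid derivative is a*(1-a)):
dE_dzo = (o - t) * o * (1 - o)
dE_dh  = Wo.T @ dE_dzo            # sum over both o paths
dE_dzh = dE_dh * h * (1 - h)
dE_dx  = Wh.T @ dE_dzh            # sum over both h paths (the second form)
dE_dzx = dE_dx * x * (1 - x)
dE_dWx = np.outer(dE_dzx, inp)    # dE/dwi for every input->x weight

# Central-difference check of dE/dwi for the weight into x1 from input 1:
eps = 1e-6
Wp = Wx.copy(); Wp[0, 0] += eps
Wm = Wx.copy(); Wm[0, 0] -= eps
numeric = (error(Wp) - error(Wm)) / (2 * eps)
# analytic and numeric gradients should agree to roughly 1e-8 here
```

The point of the `Wh.T @ dE_dzh` line is exactly your summation: each bracketed term in the second formula is one h-path's contribution to ∂E/∂x1.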
I really hope you can help me with this, thanks for having a look!
Ah, I didn't see this thread before I replied to your other one here:
... so never mind ;-)
"Experience is what enables you to recognise a mistake the second time you make it."
"Good judgement comes from experience. Experience comes from bad judgement."
"Wisdom is always wont to arrive late, and to be a little approximate on first possession."
"When you hear hoofbeats, think horses, not unicorns."
"Fast. Cheap. Good. Pick two."
"We see a lot of arses on handlebars around here." - [J Ekdahl]
This function does it; just substitute db1 for the weights.