I started taking an interest in machine learning and AI about a year and a half ago. I don't consider myself any kind of genius (although I'm reasonably intelligent), and I was terrible at math in school--to the point where I'd come to the conclusion that I simply "wasn't good at math".
After a good deal of reading, trial and error, and banging my head against the wall, I've managed to get myself to pretty much the cutting edge of ML research as it applies to neural networks. There's quite a bit of math involved, and it would have been easy for me to write it off as "too hard" in the beginning. However, I'm glad I stuck with it because I'm actually using it for some pretty neat applications.
My point being: if you're interested in something that seems like you'd have to be a genius to be good at it, don't let that stop you, because it probably isn't true.
If you're just getting started, I highly recommend Andrew Ng's online ML class. I had started reading up before this was available, but it really tied a lot of basics together that I was confused about.
From there, read papers. For me, the primary interest is in neural networks. Geoffrey Hinton and Yoshua Bengio each lead very good groups that have contributed a great deal to research in this area, and their websites provide lots of good material.
After spending a bit of time on a survey of the field, try to come up with a practical goal as quickly as you can: I want to use ML to do X. Then, try to do that. When you get stuck, get back to reading until you find the answer. Rinse and repeat. The math naturally falls into this--you'll get stuck on things that you can't fix without a decent understanding of the math. So figure out how to formulate the question you're really asking, hit google, and read. Then try again. Rinse and repeat.
If you're interested in neural networks, I can also recommend the deep learning tutorials associated with Theano, a Python library that compiles numerical expressions down to CUDA code. Running certain operations on the GPU this way will let you train your models roughly 10-40x faster than on the CPU.
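If it helps to see what that looks like in practice, here's a rough sketch of the kind of thing the tutorials walk you through (this isn't taken from them, and the sizes and learning rate are made-up placeholders): you define the model symbolically, ask Theano for the gradients, and it compiles the whole update step, on the GPU if one is available.

    # Minimal sketch of a Theano logistic-regression training step.
    # Layer sizes and learning rate below are arbitrary placeholders.
    import numpy as np
    import theano
    import theano.tensor as T

    n_in, n_out = 784, 10    # e.g. MNIST-sized inputs and classes

    x = T.dmatrix('x')       # minibatch of input vectors
    y = T.ivector('y')       # integer class labels

    # Parameters live in shared variables so Theano can keep them on the GPU.
    W = theano.shared(np.zeros((n_in, n_out)), name='W')
    b = theano.shared(np.zeros(n_out), name='b')

    p_y = T.nnet.softmax(T.dot(x, W) + b)
    loss = -T.mean(T.log(p_y)[T.arange(y.shape[0]), y])

    # Theano differentiates the loss symbolically...
    g_W, g_b = T.grad(loss, [W, b])

    # ...and compiles the parameter update into a single callable function.
    lr = 0.1
    train_step = theano.function(
        inputs=[x, y],
        outputs=loss,
        updates=[(W, W - lr * g_W), (b, b - lr * g_b)],
    )

Calling train_step(batch_x, batch_y) in a loop then does one gradient-descent update per call; the point is that you never write the backward pass or the CUDA kernels yourself.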
For me, putting things into practice has helped me make the biggest leaps in understanding, but of course I wouldn't have been able to do that at all without getting a basic grasp of the mechanisms involved. So it's a bit of push and pull between practice and learning, like anything worth doing.
I'm not sure you'll find many places that cover both in great detail, since most machine learning material assumes some knowledge of the specific math, but you could always write down the terms you don't understand from any ML material and look them up on Khan Academy. For Andrew Ng's class on Coursera, for example, you can find pretty much all of the required background (matrices, probability, etc.) on Khan Academy.