This post is the first of a series of posts that serve as an introduction to the field of Machine Learning for those with a mathematical background. We'll start here by introducing features, labels, hypothesis spaces, loss functions and model generalization.
Logistic Regression is one of the first techniques taught in Machine Learning, and for many applications it makes a good baseline model. Here I'd like to share some details I've discovered about it over the last year, which helped me better understand how and why it works.
During a hackathon at work, TensorFlow and its API finally "clicked" for me. This article shares some of what made it click, along with the other work I did to better understand the weights of the trained network.
When I was getting this blog set up, I decided I needed to make creating new posts as easy as possible. I want to use my daily commute (about 1.5 hours each day) to create new content, so to keep the barrier as low as possible I started looking into options for posting from some kind of notebook format.
Well, this is going to be my first post. As I'm sure many first-time bloggers have experienced, you feel a lot of pressure to make the first post amazing. You've told all your friends you're going to start a blog, and some of them are eagerly awaiting that first post. And then the pressure builds…