This summer is all about training. Yes, I'm trying to run regularly, but I'm actually talking about training machine-learning algorithms.
I've been trying to learn machine learning for about three years — only to feel hopelessly overwhelmed. It was as though someone said, "With a chicken, a cow, and a field of wheat, you can make a lovely soufflé!"
I took online classes, read books, and tried to modify sample code. But it seemed that unless I devoted myself to the computer version of animal husbandry, I was stuck.
Then someone at work mentioned fast.ai. Its fastai library, a Python deep-learning library built on PyTorch, got me to the eggs-milk-flour stage and provided some great starter recipes. Thanks to free guides and videos, I was soon baking algorithms that actually worked.
Now I want to get good, and experiment with different flavors and styles.
So this summer, I'm setting out to train and use new machine-learning models, at least one each week. I'll try several techniques, use different kinds of data, and solve a variety of problems. It's a little like my Make Every Week project, providing constraints to inspire and motivate me.
I'll share what I learn both here and at qz.ai, where the Quartz AI Studio is helping journalists use machine learning and where I get to practice it at work.
In the fall I'll be teaching a few workshops and classes that will incorporate, I hope, some of the things I've learned this summer. If you'd like to hear about those once they're announced, drop your email address into the signup box on this page and I'll keep you posted.
Time to train!