Blog Posts by Andrew B. Collier / @datawookie


Day 9: Input/Output

Month of Julia
Your code won’t be terribly interesting without ways of getting data in and out. Ways to do that with Julia will be the subject of today’s post.

Console IO

Direct output to the Julia terminal is done via print() and println(), where the latter appends a newline to the output.

    julia> print(3, " blind "); print("mice!\n")
    3 blind mice!
    julia> println("Hello World!")
    Hello World!

Terminal input is something that I never do, but it’s certainly possible. Read More →
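
As for terminal input, here is a minimal sketch (my own illustration, not code from the post) using readline() to read a line from standard input and parse() to convert text to a number:

    # Read a line of text from standard input (returned as a String).
    name = readline()
    println("Hello, ", name, "!")

    # Convert the next line of input to an integer. parse() throws an
    # error if the text is not a valid Int.
    n = parse(Int, readline())
    println("You entered ", n)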

Day 3: Variables and Data Types

Month of Julia
Most coding involves the assignment and manipulation of variables. Julia is dynamically typed, which means that you don’t need to explicitly declare a variable’s data type. It also means that a single variable name can be associated with different data types at various times. Julia has a sophisticated, yet extremely flexible, system for dealing with data types, which is covered in great detail in the official documentation. My notes below simply highlight some salient points I uncovered while digging around. Read More →
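
To illustrate the point about a single name taking on different types, here is a quick example of my own (not from the post), with output as shown on a recent 64-bit Julia build:

    julia> x = 5
    5
    julia> typeof(x)        # Int64 on a 64-bit machine
    Int64
    julia> x = "five"       # the same name rebound to a String
    "five"
    julia> typeof(x)
    String
    julia> x = 2.5          # ... and now to a floating point value
    2.5
    julia> typeof(x)
    Float64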

Day 1: Installation and Orientation

Month of Julia

As a long-term R user I’ve found that there are few tasks (analytical or otherwise) that R cannot immediately handle. Or be made to handle after a bit of hacking! However, I’m always interested in learning new tools. A month or so ago I attended a talk entitled Julia’s Approach to Open Source Machine Learning by John Myles White at ICML in Lille, France. What John told us about Julia was impressive and intriguing. I felt compelled to take a closer look. As with most research tasks, my first stop was the Wikipedia entry, which was suitably informative.

Read More →

Shiny Bayesian Updates

Reading Bayesian Computation with R by Jim Albert (Springer, 2009) inspired a fit of enthusiasm. Admittedly, I was on a plane coming back from Amsterdam and looking for distractions. I decided to put together a Shiny app to illustrate successive Bayesian updates (the idea is sketched briefly below). I had not yet seen anything that did this to my satisfaction. I like to think that my results come pretty close.

Read More →
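
For context on what “successive Bayesian updates” means here, this is a minimal sketch assuming a Beta prior on a success probability and Binomial observations (a conjugate pair, so each update simply adds counts to the prior parameters). The app itself is built with Shiny in R; this Julia snippet is purely illustrative and the batch counts are made up.

    # Beta(a, b) prior on a success probability, updated with Binomial counts.
    # Conjugacy means each batch of data just adds successes to a and
    # failures to b.
    update(a, b, successes, failures) = (a + successes, b + failures)

    function run_updates(batches; a = 1.0, b = 1.0)   # Beta(1, 1) = flat prior
        for (s, f) in batches
            a, b = update(a, b, s, f)
            println("Posterior: Beta($a, $b), mean = ", a / (a + b))
        end
        return (a, b)
    end

    # Three hypothetical batches of (successes, failures).
    run_updates([(7, 3), (4, 6), (9, 1)])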

Constructing a Word Cloud for ICML 2015

Word clouds have become a bit cliché, but I still think that they have a place in giving a high-level overview of the content of a corpus. Here are the steps I took in putting together the word cloud for the International Conference on Machine Learning (2015).

Read More →

ICML 2015 (Lille, France): Day 4

Sundry notes from the fourth day of the International Conference on Machine Learning (ICML 2015) in Lille, France. Some of this might not be entirely accurate. Caveat emptor.

Celeste: Variational inference for a generative model of astronomical images (Jeffrey Regier, Andrew Miller, Jon McAuliffe, Ryan Adams, Matt Hoffman, Dustin Lang, David Schlegel, Prabhat)

Colour is modelled as a 4-dimensional vector. The physics (Planck’s Law) places some constraints on the components of these vectors. Read More →

ICML 2015 (Lille, France): Day 3

Selected scribblings from the third day at the International Conference on Machine Learning (ICML 2015) in Lille, France. I’m going out on a limb with some of this, since the more talks I attend, the more acutely aware I become of my limited knowledge of the cutting edge of Machine Learning. Caveat emptor.

Adaptive Belief Propagation (Georgios Papachristoudis, John Fisher)

Belief Propagation describes the passage of messages across a network. The focus of this talk was Belief Propagation within a tree. Read More →
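
To make the message-passing idea concrete, here is a toy sum-product sketch on a three-node chain (the simplest tree). This is my own illustration with made-up potentials, not material from the talk.

    # Sum-product belief propagation on the chain x1 - x2 - x3.
    # phi1..phi3 are unary potentials; psi12[x1, x2] and psi23[x2, x3] are
    # pairwise potentials over two discrete states.
    phi1 = [0.7, 0.3]
    phi2 = [0.5, 0.5]
    phi3 = [0.2, 0.8]
    psi12 = [0.9 0.1; 0.1 0.9]
    psi23 = [0.8 0.2; 0.2 0.8]

    # Message from x1 to x2: sum over x1 of phi1(x1) * psi12(x1, x2).
    m12 = vec(sum(phi1 .* psi12, dims = 1))

    # Message from x3 to x2: sum over x3 of psi23(x2, x3) * phi3(x3).
    m32 = psi23 * phi3

    # The belief at x2 is its unary potential times all incoming messages,
    # normalised to give the marginal distribution.
    belief2 = phi2 .* m12 .* m32
    belief2 /= sum(belief2)
    println(belief2)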