This year, on October 7th, we remember Ada Lovelace, the daughter of Lord Byron who, in the mid-19th century, wrote the world’s first machine algorithm for Charles Babbage’s Analytical Engine. That basically means that yes, she was, in fact, the world’s first computer programmer.
Programming back then wasn’t as simple as knowing a computer language, however — Babbage’s machine was designed as a gigantic mechanical calculator, programmed entirely with physical punched cards. These cards — which remained a standard method of feeding programs and data into computers well into the twentieth century — encoded information in its most basic form, as the presence or absence of holes, and engineering even the simplest mathematical operation was a challenging and mentally taxing task.
Fortunately, Lovelace was a brilliant mathematician. Between 1842 and 1843, she devised a method of calculating Bernoulli numbers on Babbage’s machine, published in her notes to a translated account of the engine. It was an achievement so impressive that Babbage himself called her “The Enchantress of Numbers.”