In the last 50 years, life has been simplified by awe-inspiring advancements in computer science and technology. In 1976, Steve Jobs and Steve Wozniak unveiled the Apple I, the first-ever computer to run on a single circuit board, just five years after a team of IBM engineers introduced the "floppy disk," which revolutionized data sharing. In 1981, the first personal computer, IBM's Acorn, equipped with an optional color monitor, two floppy disk drives, and an Intel chip, was rolled out to the masses, and the dynamic evolution of the World Wide Web soon followed. While many are familiar with those facts, and a recent film (The Imitation Game) revived interest in Alan Turing's achievements with computing during World War II, it was Charles Babbage who first conceived the notion of a programmable, automatic, universal computer, one that, beyond its ability to calculate any mathematical equation at unmatched speed, could also be put to a seemingly infinite number of other applications. In other words, he envisioned the precursor to the modern computer.

At first blush, Babbage hardly seemed the type; in many ways, he was the antithesis of the debonair, silver-tongued, and effortlessly charismatic CEOs of present-day tech giants. Babbage was a quirky individual, to say the least. He was highly observant, yet at the same time a habitual daydreamer, often caught in a trance of deep thought. He spoke with a stutter, cared little about his appearance, often sporting stained collars and rumpled coats, and in his later years became something of an agoraphobe, developing a disdain for crowds and music. His unquenchable thirst for knowledge and his brilliant mind were unparalleled, but they came paired with a restless, addictive, and extreme nature, as well as an obsession with precision and factual accuracy. This was the same man who once reached out to the celebrated poet Alfred Tennyson and asked the wordsmith to correct the wording of his poem "The Vision of Sin." His letter to England's legendary poet read: "In your otherwise beautiful poem, one verse reads: 'Every moment dies a man, every moment one is born'...If this were true, the population of the world would be at a standstill. In truth, the rate of birth is slightly in excess of that of death. I would suggest that the next version...should read: 'Every moment dies a man, Every moment 1 1/16 is born.'"

Working with Babbage was certainly an unusual path for any woman, but Ada Lovelace managed to do so before her tragically premature death. Augusta Ada King, Countess of Lovelace, died in 1852 at the age of 36, but during her short and tumultuous life, she was one of the first to recognize that computers could do far more than complex calculations. This was all the more surprising given that no computers existed during her lifetime; her ideas were based on an implicit understanding of the theoretical work of Charles Babbage, who is today recognized as the "Father of the Computer." What makes her work truly startling is that she arrived at these insights roughly a century before the first computer had even been built. But Ada Lovelace was a woman, and in Victorian England, her work was generally either ignored or disparaged; after all, Britons of the Victorian era assumed that women weren't mentally equipped to be scientists or mathematicians.
For that reason alone, it was perhaps inevitable that Ada would receive very little recognition for her work. But on top of that, she wasn't simply ahead of her time: she was so far ahead that almost no one was able to recognize the importance of what she had done until more than 100 years after her death. Moreover, Ada was regarded with suspicion by her contemporaries, in large part because she was the daughter of the era's most famous and controversial poet, Lord Byron.