The Gambler’s Fallacy – Why Betting on Luck Is A Very Bad Idea

Poker Chips

I owe a great deal to the gambling industry, and I am not even kidding. No, I have never had a gambling addiction or even been inside a casino, but gambling is a very big part of my life and one of the main reasons why I still have employment. Gambling, as a matter of fact, is the reason the field of probability exists.

You see, back in 1654, there was a dispute between two gamblers whose game had come to a premature end. Chevalier de Méré, a well-known gambler, posed the problem to prominent mathematicians including Pascal, who later corresponded with Fermat, and that exchange was the first step in the development of probability as a whole. (Read about the history of Statistics and Probability here.)

Let's now move on to our topic, the Gambler's Fallacy.

In simple terms, the gambler's fallacy is the incorrect belief that if something has happened more often than usual in the past, it is less likely to happen in the future (or the other way round). Truth be told, for independent events there is no relationship between past outcomes and the probability of the event occurring in the future. For example: while tossing a fair coin, suppose we get three heads in a row. The fourth toss still has a 50% chance of landing heads; the previous three tosses tell us nothing about it. That is all there is to the idea.
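To see this in actual numbers, here is a minimal Python sketch (the function name and the one-million-trial count are just illustrative choices of mine) that simulates sequences of four fair coin tosses and estimates how often the fourth toss comes up heads given that the first three were heads.

```python
import random

def heads_on_fourth_after_three_heads(n_trials: int = 1_000_000) -> float:
    """Estimate P(4th toss is heads | first three tosses were heads) for a fair coin."""
    conditioning_cases = 0   # sequences whose first three tosses were all heads
    fourth_also_heads = 0
    for _ in range(n_trials):
        tosses = [random.random() < 0.5 for _ in range(4)]  # True = heads
        if all(tosses[:3]):
            conditioning_cases += 1
            if tosses[3]:
                fourth_also_heads += 1
    return fourth_also_heads / conditioning_cases

if __name__ == "__main__":
    print(f"P(heads on toss 4 | three heads already) ~ {heads_on_fourth_after_three_heads():.3f}")
```

Run it a few times and the estimate hovers around 0.500: the coin has no memory, and the three previous heads change nothing. With that settled, let us look at a few examples of the gambler's fallacy in day-to-day life.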

The first three children of a couple have been girls, so the next one is bound to be a boy, right?

Your team has always lost in the finals of the championship, surely this year has to be your year.

You have lost consistently at poker over the last few weeks; this time you are going to take home all the winnings.

Ok, you get the picture by now. So, let us try to see why we still make such an obvious logical error in our day-to-day lives.

Answering the question of why we fall for the gambler's fallacy is somewhat equivalent to answering the question of why gambling is so addictive. The main reason is obviously the belief that one day you are going to make up for all the losses you have incurred, but the gambler conveniently forgets the old adage, "the house always wins", which holds true almost every single time. You may point out one or two people who made it big at the casino, but do not forget that they are just one winner among the millions of bets the casino takes. No matter what, the house always wins.

As human beings, we are an optimistic bunch when it comes to our future, and for the most part we should be. But gambling is one of those few things where it is better to be pessimistic about your chances.

Another point I must stress is that many people simply do not like the idea of randomness. We as human beings like stability in our lives. We look for historical precedents and patterns even when none are present. People like things that they can easily understand and comprehend, rather than thinking about a problem from a logical point of view.

How can we tackle and overcome gambler’s fallacy?

The only way to overcome the gambler's fallacy is by making better, more logical decisions. That sounds simple, but think about it this way: you need to be able to determine whether something is worth pursuing or not. A good mental exercise is to treat the current event as the start of a series rather than the end of one. This way, we can train our minds to make more logical choices. Remember that correlation does not imply causation, and we should not look for patterns where none exist.

This is all I have to say for now. I am finally back writing this blog after a long self-imposed hiatus. Thank you for reading.

References

https://timesofmalta.com/articles/view/the-origins-of-probability.684474#

Internet of Things: Basics, Hype and Potential

Internet Of Things

The internet has become a basic necessity in our daily lives. According to Statista, more than four billion people had access to the internet as of 2020. Over the last decade, the internet has become the number one medium for exchanging information and communicating, and naturally its applications have entered various areas of our daily life. The Internet of Things can be considered a collection of such applications.

The Internet of Things (IOT) is a network of physical objects embedded with sensors, software and other technologies for the purpose of exchanging data and information with other devices over the internet. In other words, we can take an everyday object like a pen and fit it with sensors so that it can transmit information over the internet.

Let us now understand how the Internet of Things works.

Fig: The basic block diagram of an IOT system.

You can divide the basic architecture of an IOT system into three tiers. The first tier is the devices: sensors, actuators, antennas and microcontrollers. These first-tier devices operate over protocols like Zigbee, Bluetooth or Wi-Fi, and their job is to take information from the environment and pass it on to the IOT hub or gateway. The gateway's job is to process the data and transfer it to the third tier, and also to help manage the device layer below it. The final tier is the application built on top of the IOT, typically using a microservice architecture. It also includes the database, which stores and analyzes the data, converting it into useful information.
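To make the three tiers a little more concrete, here is a toy Python sketch of the flow described above. Everything in it is hypothetical and exists purely to illustrate the device → gateway → application pipeline; the class names and the fake temperature sensor are my own inventions, and a real deployment would sit on an actual protocol stack such as MQTT or HTTP rather than plain function calls.

```python
import random
import statistics

class Sensor:
    """Tier 1: a device that samples the environment (here, a fake temperature reading)."""
    def read(self) -> float:
        return round(random.uniform(18.0, 30.0), 2)

class Application:
    """Tier 3: stores the data and turns it into useful information."""
    def __init__(self):
        self.database = []  # list of readings; a stand-in for a real database

    def store(self, reading: float) -> None:
        self.database.append(reading)

    def report(self) -> str:
        return f"{len(self.database)} readings, average {statistics.mean(self.database):.1f} °C"

class Gateway:
    """Tier 2: collects raw readings from devices and forwards them upstream."""
    def __init__(self, application: Application):
        self.application = application

    def forward(self, reading: float) -> None:
        # A real gateway might batch, filter or translate protocols here.
        self.application.store(reading)

if __name__ == "__main__":
    app = Application()
    gateway = Gateway(app)
    sensor = Sensor()
    for _ in range(10):   # the sensor samples, the gateway forwards, the application stores
        gateway.forward(sensor.read())
    print(app.report())
```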

Now that we understand how IOT works, let us take a look at some of its applications in different aspects of our lives.

Wearable Technology

Wearable Technology.

One of the most common uses of IOT is wearable technology. Wearable devices are generally used to monitor things like heart rate, blood pressure and time spent working out, and can be used to forecast ailments and even changes in mood and athletic performance, among other things. Wearables are also used for non-health purposes like communication and, to some extent, even fashion.

Smart Home

Smart Home

If you have ever dabbled in hardware programming, you have almost certainly tried to automate the electronic appliances in your home. Smart homes employ IOT technologies to control lighting, heating, media and security cameras, to name a few. This is especially useful in places where saving energy is a great concern. Smart homes can generally be controlled with a smartphone, from anywhere in the world. So, if you forgot to turn the electric heater off when you left home, you can do it from your phone.

Other common examples of the Internet of Things include traffic monitoring, fleet management, agriculture, smart grids, industrial management and so on. In fact, the applications of IOT are so numerous that you can probably find them in every facet of life.

Issues with IOT

IOT is still not the finished article, and a number of problems come with it, the biggest being security. Common security issues in IOT stem from a lack of standards and proper protocols. New devices keep coming out of the woodwork, and they pretty much work to their own standards. This can make data theft and phishing a common occurrence.

IOT devices are also susceptible to malware attacks. IOT software is not updated as regularly as, say, your Android devices, and as a result malware attacks are very much prevalent on these devices. This is not as big a problem now as it was a few years ago, but there is still a lot of room for improvement.

The Future of IOT

I am not exaggerating when I say that the possibilities of IOT are limitless. Currently, the overall revenue from IOT devices is in the billions, and this is only going to increase in the coming years. With the advent of artificial intelligence and machine learning, and the continuous expansion of the internet, IOT devices will become an integral part of our lives. AI and machine learning have the capacity to solve the problem of connecting billions of devices around the world and making them work to the best of their ability. The development of better encryption technologies will only make IOT more secure in the future.

That is it for this article. Thank you for reading.

Summary Statistics – A Way To Deceive Using Numbers.

Statistical Charts

In the early days of the coronavirus lockdown, I wrote an article on How Statistics are Misused to Manipulate The Public. To my utter amazement and surprise, the article was very well received, and I was encouraged to write a follow-up to that piece. This is my attempt to do so.

Before I criticize summary statistics, let me make it clear that I understand the importance of these statistical measures. The sole point of this article is to make clear to the reader that looking at a single statistic is not enough, and that the reader has to be vigilant about the information they are fed.

Summary statistics are used to convey as much information about a dataset as possible in a compact form. Often a large amount of data is condensed into a single statistical measure, such as the mean, median, mode, standard deviation, kurtosis or skewness. Statisticians, and people who work with data in general, spend a large chunk of their time trying to condense a large volume of data into a single statistic, but they will also be the first to tell you that a single statistic cannot describe the data in its entirety (if they're any good, they will). Today we will see why it is dangerous to draw conclusions about a large amount of data from a single summary statistic, and what we can do to get a more accurate picture of our data. So, let us begin.
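To make "condensing a large volume of data into a single statistic" concrete, here is a small Python sketch using pandas; the twelve numbers are made up purely for illustration.

```python
import pandas as pd

# A made-up sample of twelve observations, just for illustration.
data = pd.Series([4, 7, 7, 8, 9, 10, 11, 11, 11, 13, 15, 42])

summary = {
    "mean": data.mean(),
    "median": data.median(),
    "mode": data.mode().tolist(),  # mode() can return more than one value
    "std dev": data.std(),
    "skewness": data.skew(),
    "kurtosis": data.kurt(),
}
for name, value in summary.items():
    print(f"{name:>9}: {value}")
```

Twelve observations collapse into half a dozen numbers, and each of those numbers hides something; the rest of this post is about what gets hidden.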

The flaws of summary statistics were famously illustrated by the British statistician Francis Anscombe, who constructed what is now known as Anscombe's quartet. Let us look at the datasets he used and the information they convey.

Fig 1: Anscombe's datasets and their summary statistics.

Let's look at the data visually to get a better idea.

Fig 2: Anscombe's Quartet

The figure above shows the disparities between the four datasets used by Anscombe. It doesn't even take a second glance to see that the four datasets are completely different. Interestingly, though, if you looked only at their summary statistics, you would infer that the datasets are very similar to each other. This is the danger of looking only at a summary statistic and not at the data in its entirety.
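You can verify this yourself with a few lines of Python. The quartet's published values are hard-coded below so the sketch is self-contained; every dataset comes out with essentially the same mean, variance and correlation.

```python
import numpy as np

# Anscombe's quartet (Anscombe, 1973). Datasets I-III share the same x values.
x123 = [10, 8, 13, 9, 11, 14, 6, 4, 12, 7, 5]
quartet = {
    "I":   (x123, [8.04, 6.95, 7.58, 8.81, 8.33, 9.96, 7.24, 4.26, 10.84, 4.82, 5.68]),
    "II":  (x123, [9.14, 8.14, 8.74, 8.77, 9.26, 8.10, 6.13, 3.10, 9.13, 7.26, 4.74]),
    "III": (x123, [7.46, 6.77, 12.74, 7.11, 7.81, 8.84, 6.08, 5.39, 8.15, 6.42, 5.73]),
    "IV":  ([8, 8, 8, 8, 8, 8, 8, 19, 8, 8, 8],
            [6.58, 5.76, 7.71, 8.84, 8.47, 7.04, 5.25, 12.50, 5.56, 7.91, 6.89]),
}

for name, (x, y) in quartet.items():
    x, y = np.array(x), np.array(y)
    r = np.corrcoef(x, y)[0, 1]
    print(f"{name:>3}: mean_x={x.mean():.2f}  mean_y={y.mean():.2f}  "
          f"var_x={x.var(ddof=1):.2f}  var_y={y.var(ddof=1):.2f}  corr={r:.3f}")
# Every dataset prints mean_x=9.00, mean_y~7.50, var_x=11.00, var_y~4.12, corr~0.816,
# even though the scatter plots above look nothing alike.
```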

Let me demonstrate the insufficiency of summary statistics with an example.

Fig 3: A lightbulb.

An electrical company launches two new lines of lightbulbs: sample A, which costs more, and sample B, which is cheaper. It is reported that both lines last an average of 10,000 hours. Told only that statistic, you would probably choose the cheaper of the two lightbulbs, and why wouldn't you?

But let me add a few details. Imagine that 1,000 bulbs were tested from each line. In sample A, 500 bulbs lasted an average of 11,000 hours while the other 500 lasted 9,000 hours, bringing the overall average to 10,000 hours. The bulbs in sample A are fairly consistent, and you would probably get your money's worth even though they are more expensive.

Now imagine that sample B initially contained only 750 bulbs, and they were not very good: on average they lasted just 7,500 hours. The company had to sell the product no matter what, so they added 250 more bulbs with an average life of 17,500 hours, bringing the overall average of sample B up to 10,000 hours.

So you can clearly see that, despite having the same average lifetime of 10,000 hours, the two products are not the same. The probability of getting a lightbulb that lasts close to 10,000 hours is much higher in sample A than in sample B. If you look only at the summary statistic, you will never reach this important conclusion.
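A quick Python sketch of this hypothetical lightbulb experiment makes the difference visible: the means match exactly, but the spread, and your chance of getting a bulb anywhere near the advertised lifetime, do not.

```python
import numpy as np

# Hypothetical test data, built to match the story above.
sample_a = np.array([11_000] * 500 + [9_000] * 500)   # consistent bulbs
sample_b = np.array([7_500] * 750 + [17_500] * 250)   # one poor batch, one exceptional batch

for name, hours in [("A", sample_a), ("B", sample_b)]:
    near_advertised = np.mean(np.abs(hours - 10_000) <= 1_500)  # within 1,500 h of 10,000 h
    print(f"Sample {name}: mean = {hours.mean():.0f} h, "
          f"std dev = {hours.std():.0f} h, "
          f"near the advertised lifetime: {near_advertised:.0%}")
# Both samples report a mean of 10,000 hours, but sample A's standard deviation is
# 1,000 hours against sample B's ~4,330 hours, and 75% of sample B's bulbs fall
# 2,500 hours short of the advertised figure.
```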

That is it for this post. Its sole purpose was to introduce the idea of summary statistics to the reader and encourage them to look beyond a single number used to describe a large set of data.

Thank you for reading this. I will see you shortly.

References

https://www.statisticshowto.com/summary-statistics/

https://towardsdatascience.com/importance-of-data-visualization-anscombes-quartet-way-a325148b9fd2

Five Lesser Known German Inventions During World War Two

German Empire, 1942.

Germany during World War Two is not something you can talk about without your stomach churning. During wars it is not always possible to separate the good guys from the bad guys, but during the Second World War it was obvious that the Third Reich was pretty much evil. The things they did and the crimes against humanity they committed have filled the pages of thousands of books. Today, we will learn about technologies that came out of this evil empire, technologies that have shaped the world we live in today and helped it become a better place, even though the reasons for their conception were truly evil.

Disclaimer: This post will not include the horrific death traps created solely to take the lives of innocent human beings in the concentration camps. It covers only neutral technologies (a vehicle, a communication technology, etc.) that can be used for good rather than exclusively for evil.

So, let us begin. Here are some of the greatest technologies that came out of Germany during the second world war.

Methadone 

Methadone

Methadone is a widely used drug, mostly prescribed in therapy for people dealing with opioid addiction and also used as a painkiller. Its effect is similar to that of morphine, and it can provide pain relief for anywhere between 8 and 36 hours depending on the dose and frequency of use.

There was an opium shortage in Germany in 1937, just two years before the war. Scientists working for I.G. Farbenindustrie AG developed a synthetic alternative to address the nation's problem. It was patented under the name Polamidon, entered the market in 1943 and was used extensively by troops during the war.

In 1947, the drug was renamed "methadone" by the Council on Pharmacy and Chemistry of the American Medical Association, and it was introduced to the United States market that same year by Eli Lilly and Company under the name Dolophine. Today, methadone is available in all parts of the world under names such as Symoron, Amidone, Methadose and Physeptone.

Messerschmitt Me 262

Messerschmitt Me 262

The Messerschmitt Me 262 was the first operational jet-powered fighter aircraft. Preliminary design work on what would become the Me 262 began in 1938, but persistent problems with the turbojets intended for the aircraft delayed the project: a prototype first flew in April 1941 under piston power, and the first flight on jet power alone did not take place until 18 July 1942. The Luftwaffe only committed the Me 262 to battle in 1944, by which point it was far too late to have any significant impact against the Allies.

However, it was the only jet fighter to see air-to-air combat in the Second World War, and its superiority genuinely surprised and shocked the Allies. A remarkable machine, nonetheless.

The Z- Series Computers

Replica of the Z1 in the German Museum of Technology in Berlin

The Z-series computers were a series of mechanical and electromechanical computers that are considered to be among the first freely programmable computers ever developed. They were built by Konrad Zuse, often regarded as one of the inventors of the modern computer. There were four machines in the series (Z1, Z2, Z3 and Z4), and Zuse was the chief designer of all of them. Let us talk about the first one, the Z1, in some detail.

The Z1 was built between 1936 and 1938 in the living room of Zuse's parents' home. It had all the components of a modern computer: a control unit, memory, micro-sequences, floating-point logic and input-output devices.

Konrad Zuse

It weighed around 1,000 kg, had around 20,000 parts and used a binary switching system. Input was provided via a keyboard, and programs were stored on punched tape.

The Z1 was one of the many German inventions of this era destroyed by Allied bombing; it was lost in December 1943. Zuse successfully rebuilt the machine in 1989 for the purpose of historical preservation.

The Z2 was something of an upgrade on the Z1, weighing somewhere around 300 kilograms. The Z3 and Z4 were huge upgrades over the previous models, and honestly speaking, these machines and their inventor deserve an article of their own, which I will get to shortly.

The Jerrycan

A Jerrycan

This is one of those everyday objects we use without thinking about where it came from. Well, now you know: it came from Nazi Germany. The Müller engineering company designed the jerrycan in 1937, and it came into heavy use in 1939 when the German army attacked Poland. Jerrycans were used then for what they are used for now: carrying fuel and water over long distances.

The jerrycan may well have played a huge part in the Allied victory in the war. These pressed-steel containers made resupply possible, and records show that over 20 million jerrycans were in use by U.S. forces in Europe alone by the end of the war. The cans were so important that President Roosevelt remarked that "without these cans it would have been impossible for our armies to cut their way across France at a lightning pace which exceeded the German Blitzkrieg of 1940".

Flettner Fl 282

Flettner Fl 282

Nicknamed the Hummingbird, the Flettner Fl 282 was the first series-production helicopter, designed by Anton Flettner in the early 1940s. It was a single-seat helicopter, and its best feature was that it required servicing only every 400 flight hours, instead of the 25 hours that was typical for helicopters of the time.

Initially it was used to ferry items between ships and to fly reconnaissance missions, but later the Luftwaffe considered converting it for battlefield use. These helicopters proved so useful that BMW was commissioned to build 1,000 of them, but the Munich plant was destroyed by Allied bombing and only 24 units were ever delivered. The Hummingbird was slowly phased out as the aircraft fell victim to Soviet fighters and anti-aircraft fire.

So, that was a small list of lesser-known things developed by Germany during the Second World War. Of course, many more things were developed, some of which served only evil purposes and have not been used since. Others, like the ones on this list, have stood the test of time and have contributed directly and indirectly to the development of modern technologies. Lastly, I would ask my readers not to forget that "technology by itself is a neutral entity"; the good and the evil in these machines are the result of the hands operating them.

Thank you for reading.

References

https://www.rafmuseum.org.uk/research/collections/messerschmitt-me-262a-2a-schwalbe-swallow/

https://www.i-programmer.info/history/people/253-konrad-zuse.html

A Brief History of Alcohol

-Bidhan Bhattarai

Alcoholic Beverage

In my extremely humble opinion, alcohol is the greatest invention in the history of mankind. When enjoyed responsibly, it is a great companion in joy and in sorrow. Whether it is marriage or divorce, relationships or hardships, this family of fermented spirits has always remained a constant. Throughout human history, alcohol has accompanied everyone from the richest of the rich to the poorest of the poor. Oftentimes, while having an exceptionally brewed craft beer, I wonder how this wonderful spirit came into existence and what roles it has played throughout history. Well, crack open a cold one, because we are going to learn about that today.

Alcohol has a very convoluted history, and it is impossible to pinpoint the exact moment it was first invented. It exists in a million different forms and in thousands of different proportions. We have evidence that fermented beverages existed as early as the Neolithic era. An article by Richa Malhotra for the BBC stated that alcohol might have shaped the evolution of fruit-eating primates like ourselves: we were accustomed to eating fermented fruits before we even evolved into Homo sapiens. That, I suppose, is scientific proof that alcohol is in our genes (I always knew it).

Robert Dudley, in his book The Drunken Monkey: Why We Drink and Abuse Alcohol, writes that our attraction to and consumption of alcohol goes back tens of millions of years: the smell of ripening and fermenting fruit signaled a good source of calories for primates living in rainforests, and our tendency to abuse alcohol may stem from that. All this research points to one thing: we can't tell exactly when we started consuming alcohol, but we can study the evidence and learn more about this fun, social and potentially detrimental activity. So here it goes, a brief history of alcohol.

An article on the BBC in 2018 reported the discovery of a brewery in a burial cave near Haifa in Israel. Archaeologists found traces of beer some 13,000 years old, which was very probably used at ritual sites to honor the dead. The residue was verified to be man-made, and the scientists who examined it concluded that the brew was not as strong as modern-day beer. So, I will have to pass on that one.

Then we go to ancient China, as early as 7000 BC. Early wine there was made by fermenting rice, honey and fruit. We know this because residue found on pottery from that period proves the presence of alcohol among Neolithic Chinese people. This means the use of alcohol in China is older than recorded history itself.

Yellow River Basin

The civilization along the Yellow River developed its own kind of alcohol from fermented millet. The ancient Chinese treated alcohol as a religious and pious drink, and its moderate use was thought to be a mandate prescribed by heaven.

Guinness

The Guinness Book of World Records lists the oldest known wine as dating back to 6000-5800 BC, in the Neolithic era, near Tbilisi, Georgia (the country, not the American state). The residues were identified as wine because they contained tartaric acid, which is present in large amounts in Eurasian grapes.

Evidence of alcohol can clearly be traced in Egypt around the time the Pyramids of Giza are said to have been constructed. Alcohol was also present in Mexico by about 2000 BC and in Sudan by 1500 BC.

Medicinally, alcohol was in use by around 2000 BC, according to Egyptian and Sumerian texts. The Hebrew Bible also recommends giving alcoholic drinks to the dying and the depressed so that they can drown their sorrows.

Venetian Explorer Marco Polo

The Venetian explorer Marco Polo, writing around the turn of the fourteenth century, stated that grain and rice alcohol were extremely common in China, consumed in large volumes almost every day, and that the state made a lot of money from liquor. Alcohol was used for hospitality and as an antidote to tiredness, much as it is today, and of course it was heavily abused; so much so that we find evidence of laws against the abuse of alcohol being enacted on a regular basis. It was never completely outlawed, though, since it was both culturally and financially significant.

Wine- A spirit from the gods

According to the Penn Museum website, we began to understand early wine-making better thanks to the analysis of a yellowish residue in a jar excavated by archaeologist Mary M. Voigt in the Zagros Mountains of Iran. The jar held about 9 liters, and five similar jars were subsequently found embedded in a Neolithic kitchen dating back to 5400-5000 BC. Wine-making caught on largely because it was the best way to store grapes, which would otherwise perish very quickly.

The Pyramids of Giza- Egypt

The Egyptians were not far behind when it came to booze. In Drink: A Cultural History of Alcohol, Iain Gately states that around 3400 BC the brewery at Hierakonpolis was capable of producing 1,136 liters of beer every day. That is no surprise at all, because the Egyptians worshiped Osiris, the god of the dead, of life, of vegetative regeneration and, most importantly for our purposes, of wine.

In the same book it is also stated that the Egyptians brewed at least 17 different types of beer and 24 different types of wine. Alcohol was used for pleasure, nutrition (don’t ask me how), medicine, cultural functions and even funerals.

The North Africans, however, loathed taverns and pubs and considered moderation to be extremely important. Overall, the Egyptians were responsible alcohol users around the 3000 BC mark.

The Code of Hammurabi

The Code of Hammurabi, one of the oldest legal texts in the world, regulated alcohol around 1750 BC. The Babylonians consumed and worshiped wine as early as 2700 BC.

In the book Genomics and Health in the Developing World, a collection of articles, Meera Vasani argues that the process of distillation started in India. India has a rich history of alcohol use, and there is evidence of it in the Indus Valley Civilization between 3000 and 2000 BC. A beverage called sura, made from rice, wheat and sugarcane, was popular among Kshatriya warriors and was considered India's favorite alcoholic drink.

The ancient Ayurvedic texts describe the benefits and pitfalls of alcohol consumption. Alcohol was considered a medicine in moderation and a poison in excess (I am starting to notice a pattern here). Attitudes toward alcohol were much stricter in India than in other parts of the world, largely because the Hindu scriptures held that good people were vegetarians while the devils were booze-drinking, meat-eating psychopaths like myself.

Mead was extremely popular among the Greeks, who were also making their own wine by 1700 BC. Wine became so popular in Greece that those who did not drink it were considered uncivilized barbarians.

Alexander the Great

The Macedonians were particularly known for their drunkenness. Their king, Alexander the Great, was famous both as an absolute drunkard and as one of the greatest military commanders of all time.

Now we need to talk about alcohol culture in the Americas before the arrival of Columbus.

Landing of Columbus

The alcohol history of the Americas before the arrival of Columbus is an extremely rich one. The natives of that part of the world made alcoholic drinks that we still consume today. A paper entitled "Historical and Cultural Roots of Drinking Problems Among American Indians" states that drinking problems were present in the Americas long before colonization. Overall, the natives are said to have produced over forty different types of alcohol using all sorts of ingredients, such as honey, fruits and grains.

Dionysus

We can't have a conversation about history without alluding to the Romans, and this is true even in the case of alcohol. The Romans worshiped Dionysus (whom they also knew as Bacchus), the god of wine and of the harvest. The divine mission of Dionysus was to mingle music and the arts and to bring an end to worry. The Romans were noted for throwing wild dinner parties where meals were served in three courses, each accompanied by plenty of alcohol. Roman aristocrats, forever a classy bunch, preferred wine over beer.

Of course, not all Romans preferred wine over beer. Beer was extremely popular among Roman soldiers and legionaries, so much so that it was part of garrison supplies.

Jabir ibn Hayyan

The Persian chemist Jabir ibn Hayyan, the Arab philosopher Al-Kindi and Muhammad ibn Zakariya al-Razi were among the first to use distillation extensively in their research. Hayyan is credited with the alembic still, whose principles still govern alcohol production, and Al-Kindi is said to have unambiguously described the distillation of wine in the ninth century.

The Art of Distillation- John French, 1651

Hieronymus Brunschwig, a German alchemist, published Liber de arte distillandi de simplicibus, the first book solely dedicated to distillation, in 1500. In 1651, John French published The Art of Distillation, the first book on distillation in the English language.

History testifies that alcohol consumption was massive wherever water-borne diseases were present, as it was assumed to be safer to drink alcohol than water.

As we can see from the writings of the 16th to 18th centuries, all sects of Christianity considered alcohol to be a gift from God, but getting hammered was considered a sin (again, I see a pattern here).

To show the hold alcohol had in Europe in the early modern period, let me quote a passage extract. “In spite of the ideal of moderation, consumption of alcohol was often high. In the 16th century, alcohol beverage consumption reached 100 liters per person per year in Valladolid, Spain, and Polish peasants consumed up to three liters of beer per day. In Coventry, England, the average amount of beer and ale consumed was about 17 pints per person per week, compared to about three pints today; nationwide, consumption was about one pint per day per capita. Swedish beer consumption may have been 40 times higher than in modern Sweden. English sailors received a ration of a gallon of beer per day, while soldiers received two-thirds of a gallon. In Denmark, the usual consumption of beer appears to have been a gallon per day for adult laborers and sailors.”

Champagne made its debut in the seventeenth century, an invention credited to Dom Pérignon. It was something of an accident, and it took another hundred years before champagne was produced commercially.

Whiskey and rum were later arrivals to mainstream attention; the former was first recorded in Ireland around 1405.

Since then, alcohol has been an integral part of every culture and has been consumed in every form imaginable. The lesson to take away is that, used in moderation, alcohol is a great spirit; misused, it can cause chaos and havoc on an unprecedented scale.

Thank You and Drink Responsibly.

References

https://www.bbc.co.uk/earth/story/20170222-our-ancestors-were-drinking-alcohol-before-they-were-human

https://www.georgianjournal.ge/discover-georgia/34000-guinness-book-of-records-declares-georgian-wine-as-worlds-oldest-wine.html

https://www.jstor.org/stable/2478422

Grace Hopper – One Of The Greatest Computer Scientists Of All Time.

Commodore Grace M. Hopper

“It is often easier to ask for forgiveness than to ask for permission.” – Grace Hopper

If I asked you to name the top five computer programmers of all time, you would probably think of pioneers like Ada Lovelace, Alan Turing, Howard Aiken, John von Neumann and Tim Berners-Lee, among others. However, I don't expect Grace Hopper to show up on your list unless you are deeply interested in the history of programming and computers in general. She absolutely should be on it, and I will explain why. In the paragraphs that follow, we will learn about Hopper's career and achievements, and how her work has shaped programming as we know it.

Let us begin.

Grace Hopper was a United States Navy rear admiral and a computer programmer. She was one of the pioneers of computer programming and is widely credited with developing the first compiler, which by modern standards worked more like a linker (a program that takes several files and combines them into a single executable, like a .exe file on Windows; read more about linkers here).

Grace Hopper was born in New York on December 9, 1906. She was a curious student from a very young age, a trait that stuck with her throughout her life. Her father, Walter Fletcher Murray, and his father before him were both in the insurance business, a field that makes heavy use of statistics, while Grace's mother, Mary Campbell Van Horne Murray, had as a girl gone on trips with her own father, a civil engineer for the city of New York. Grace was encouraged by her well-educated family to pursue education and become self-reliant. At age 16 she failed the entrance exam for Vassar College, but she only had to wait one year, entering Vassar at 17 in 1923. She received a master's degree in 1930 and a PhD in mathematics in 1934, both from Yale University, and began teaching at Vassar in 1931.

Hopper had a very distinguished career in the United States Navy. After the bombing of Pearl Harbor in December 1941, she decided to join the Navy. However, she was initially denied the opportunity to enlist: at 34 she was considered too old, and her weight-to-height ratio was too low. The main reason, however, was that Hopper was a mathematician at a reputable institution, and her job at Vassar was considered important to the war effort.

If you think that made Hopper forget about the Navy, you would be very much mistaken. In 1943, she took a leave of absence from Vassar and joined WAVES (Women Accepted for Volunteer Emergency Service), the women's branch of the United States Naval Reserve in World War Two. Despite weighing nearly 7 kg less than the minimum requirement, she obtained an exemption and was allowed to join. Hopper was assigned to the Bureau of Ships Computation Project at Harvard University, where she worked under Howard Aiken, who had developed the Mark I, the IBM Automatic Sequence Controlled Calculator. She was responsible for programming the Mark I and punching the instructions onto tape, and she co-authored three papers on the machine with Aiken. She turned down a full professorship at Vassar to stay at Harvard under a Navy contract. She was also involved in "running the numbers" for calculations used in the atomic bomb dropped on Nagasaki.

Grace Hopper in her office in Washington

On an interesting side note, Hopper is popularly credited with the terms "bug" and "debugging". She and the team working on the Mark II found a moth stuck in one of the system's relays, and the process of hunting down errors and problems in computers came to be called debugging. So, the next time you find yourself editing a piece of dodgy code, think of Grace Hopper.

Hopper at the UNIVAC console, 1960

In 1949, Hopper joined the Eckert-Mauchly Computer Corporation in Philadelphia, where she did some of the best work of her career. The company was soon taken over by Remington Rand, and Hopper was highly influential in the development of the UNIVAC I. Here she and her team designed the first compiler, called the A-0. A compiler converts human-readable code into machine-readable binary code, and this made it possible to write programs for a multitude of machines rather than just a single one. Hopper wasn't done: she and her team then designed FLOW-MATIC, the first English-like programming language. (If you wish to understand why this was a big deal, take a look at early FORTRAN, which was built around mathematical formulas rather than English words.) Hopper had the foresight to understand that not everyone who programs is well acquainted with complex mathematics, and that a word-based programming language was the way to go. She was right: word-based programming languages are now the norm, and basically anyone with a computer can learn how to program.

You may have heard of COBOL (Common Business-Oriented Language), one of the first standardized business-oriented programming languages. While Hopper did not invent the language, she can be considered one of the most important contributors to its adoption in both the civilian and military sectors. COBOL came out in 1959, and by 1970 it was the most extensively used programming language in the world. Much of that can be attributed to Grace Hopper.

In 1966, at age 60, she retired from the Navy, but less than a year later she was recalled to work on standardizing the Navy's multiple computer languages. Her second stint lasted 19 years, and she finally retired from the United States Navy in 1986, as a rear admiral, at the age of 79. She didn't retire from working life, however, serving as a consultant at the Digital Equipment Corporation until 1991, one year before her death.

Grace Hopper passed away on New Year's Day 1992, at the age of 85.

Hopper was the recipient of a multitude of awards, in both the civilian and military spheres. She held more than 40 honorary degrees, along with numerous scholarships and professorships. Her more prestigious honors include Yale's Wilbur Lucius Cross Medal in 1972 and the National Medal of Technology, the nation's highest technology award, which she received from President Bush in 1991, making her the first female recipient of that award. In 1996 the Navy launched the USS Hopper, named in her honor, and in 2016 she was posthumously awarded the Presidential Medal of Freedom, the highest civilian award, for her contributions to the field of computer science.

Grace Hopper did it all. She was an amazing mathematician, computer scientist, naval officer, programmer and so much more. It is impossible to say what modern computing would look like if Grace Hopper had not existed, but one thing is for sure: she is one of the greatest computer scientists of all time, and we are forever grateful for the legacy she left behind in the field of computer science and programming.

References

https://news.yale.edu/2017/02/10/grace-murray-hopper-1906-1992-legacy-innovation-and-service

https://www.biography.com/scientist/grace-hopper

https://www.public.navy.mil/surfor/ddg70/pages/namesake.aspx

https://www.thoughtco.com/the-younger-years-of-grace-murray-hopper-4077488

A Case Study About the Misinterpretation of Statistics: Part One

A Bar Graph Showing an upward curve

A while ago, I wrote an article on this blog about How Statistics are Misused to Manipulate the General Public. In it, I cited some examples that had actually taken place, gave some stereotypical examples of how statistics can be misused, and discussed what we, as the general public, can do about it. I did not go into many real-life scenarios because the post was already long and would have been unbearable to read had it been any longer. However, examples of such misuse and manipulation of numbers are extremely common, and they can fool and scare even the most conscientious readers, never mind the general public. So, starting with this post, I will bring you, on a biweekly basis, cases (one or maybe two per article) where statistics have been heavily misused, generally to serve an agenda, whether consciously or unconsciously. Without wasting too much time, let us get straight into it.

Entry Number One: Eating Meat Increases the Risk of Cancer by 18%

Apparently this fine specimen increases risk of cancer by 18 %.

This story is what motivated me to start writing about the misuse of statistics all over again. I remember vividly when the claim came out: I was a freshman in my undergraduate days, ate meat on a regular if not daily basis, and the news was a bombshell to me. Earlier this week I picked up Sir David Spiegelhalter's book The Art of Statistics, in which he thoroughly debunks the scare and shows how statistics can be misinterpreted to cause mass outrage. I will give you a summary of his argument and explain why such misinterpretations are far too common.

The highly reputed International Agency for Research on Cancer (IARC), which is part of the World Health Organization, announced in late 2015 that processed meat was being placed in the same carcinogen category as cigarettes. In the same announcement, the IARC stated that "50g of processed meat a day was associated with an increased risk of bowel cancer of 18%." Of course, this sent the entire world into a frenzy. The statistic sounds alarming, and naturally people were very scared, but a deeper dive into the underlying numbers shows that much of the fear was unwarranted.

To fully understand the situation, we need to learn about two terms: absolute risk and relative risk. Instead of boring you with long definitions, I will explain the concepts using a real-world example.

"A study finds that, after the introduction of a new prescription drug, deaths due to cancer in a city decreased from 2 per 100 people to 1 per 100." A statement written this way expresses absolute risk.

"A study finds that a new prescription drug decreased the risk of death among patients by 50%." This statement expresses relative risk.

If the first version is published, people tend not to give it much notice and forget about the new drug almost instantly, whereas the same statistic expressed in relative terms grabs the public's attention. The 18% rise in bowel cancer was a relative risk, but it was widely reported and read as though it were an absolute one. Whether that was intentional or just a misinterpretation, I can't tell, but it was certainly not the right way to present the statistic.

Going back to our meat-causes-cancer study: normally around 6 out of every 100 people who do not eat bacon every day get bowel cancer at some point in their lives, while among those who eat bacon every day, about 7 out of 100 do. So there was roughly an 18% relative increase in cancer between people who didn't eat bacon every day and those who did, but an absolute increase of just one extra case per 100 people.
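The arithmetic is tiny; here is a short Python sketch using the rounded 6-in-100 and 7-in-100 figures quoted above (with these rounded counts the relative increase works out to about 17%, which the study reported as 18%).

```python
baseline = 6 / 100   # lifetime bowel-cancer risk among people who do not eat bacon daily
exposed = 7 / 100    # lifetime bowel-cancer risk among daily bacon eaters

absolute_increase = exposed - baseline                # 0.01 -> one extra case per 100 people
relative_increase = (exposed - baseline) / baseline   # ~0.17 -> the headline-grabbing figure

print(f"Absolute risk increase: {absolute_increase:.1%} (one extra case per 100 people)")
print(f"Relative risk increase: {relative_increase:.1%}")
```

The same underlying numbers sound trivial in absolute terms and alarming in relative terms, which is precisely how the headlines ended up the way they did.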

Whenever possible, statistics should be expressed in terms of absolute risk rather than relative risk. One can only wonder how many such misinterpretations have occurred in the past, and how much hysteria they have caused among the general public.

That is all I have to say regarding this particular case; it is a perfect example of the misinterpretation of statistics. As conscientious readers, we need to understand what the data actually indicates and listen to what it is trying to tell us. That is it for today. Expect another article on this topic very soon.

Thank You For Reading.

References

https://www.forbes.com/sites/jvchamary/2015/10/27/bacon-cancer/#5d7caa858e36

https://www.theguardian.com/society/2018/dec/25/bacon-cancer-link-head-of-un-agency-at-heart-of-furore-defends-its-work

https://www.theguardian.com/lifeandstyle/2015/oct/26/processed-meats-pose-same-cancer-risk-as-smoking-and-asbestos-reports-say

http://www.statslab.cam.ac.uk/~david/

The Greatest US Inventions To Come Out of The Cold War – Part Three

The Map of the United States of America

This is the third of five articles taking a look at the ten most significant inventions made by the United States during the Cold War. This article focuses on the third decade of the conflict (1966-1975). This was a decade in which technology took a giant leap, and as an electronics and computer nerd, it is for me the most exciting decade of Cold War invention. So, without wasting any more time, let us dive straight into the greatest inventions and innovations to come out of the USA in the third decade of the Cold War.

Number 10 – The C Programming Language(1972)

The C Programming Language

In the world of technology, the only constant is change. Thousands of technologies, including programming languages, are launched and forgotten every year, never to be seen again. So it is a remarkable fact that the C programming language has been around for nearly half a century and is still one of the most widely used programming languages. C has had a wide range of influence over the years, and I am sure it will continue to do so. Much of Microsoft Windows is still written in C, and the same is true of macOS and Linux. In addition, it has been ever present in the design of web browsers, network tools and basically every kind of software you could think of.

So, it is important that we understand the history of this fantastic tool. The C programming language was developed in 1972 by Dennis Ritchie, who was working at Bell Labs. Ritchie wanted a language powerful and expressive enough to capture complex, abstract, high-level ideas, yet simple and efficient to read and write.

Soon enough, almost all of Unix itself, and the programs people used under Unix, including the C compiler itself, would be written in C, which suited the task admirably. C was a follow-up to an earlier Bell Labs language called B (no surprises there), developed mainly by Ken Thompson. The first formal guide to the language was The C Programming Language, written by Dennis Ritchie and Brian Kernighan. A formal standard followed soon after, and the language has been a roaring success, to say the least. For the impact it has had on the software industry, C makes our list.

Number 9 – Email (1971)

Email Logo

I am not going to insult my audience's intelligence by explaining what email is. In the digital world we live in, it is almost impossible not to use email on a regular basis. Email is the most widely used form of online communication, with over 2.6 billion active users and over 5 billion accounts. Think about that number for a second: 2.6 billion people represents over 30% of the world's population. So it is safe to say email has been a success. Let us see how it came to be and meet its inventor, Ray Tomlinson.

You may be surprised to learn that email is actually older than the internet itself. The exchange of mail between computers started after the creation of ARPANET in 1969, a network connecting numerous computers across the US Department of Defense for the purpose of improving communication within the department.

In 1971, Ray Tomlinson created electronic mail by building ARPANET's networked mail system, and the idea of efficient communication between different departments in an office caught fire and soon began to spread. Tomlinson was also the man responsible for incorporating the @ sign into email addresses. Half a decade later, 75% of all the traffic on ARPANET was electronic mail. Email proved so useful that it was only a matter of time before people devised ways to send it outside of an internal network, and the advent of the Internet in the early 80s made this possible. For its use and importance, email makes our list.

Number 8 – Virtual Reality (1968)

VR Set in Action

This one caught me off guard, for I had assumed virtual reality was a newer invention, developed in the 90s or the early 2000s, but no: virtual reality devices were out and about as early as 1968. Virtual reality is a technology that allows the user to actively interact with a computer-simulated environment. It is primarily a visual experience, displayed either on a computer screen or through special headsets.

In 1968, Ivan Sutherland, along with his student Bob Sproull, created the first head-mounted display for use in immersive simulation applications. Despite being primitive in both user interface and realism, it was still extremely significant. It was so large that it had to be suspended from the ceiling, and was aptly called "The Sword of Damocles".

Virtual reality devices gained popularity in the 70s, when a large number of VR devices were manufactured and widely used for medical purposes, flight simulation, automobile design and military training. Virtual reality has added a new dimension of thought and at times becomes almost as realistic as the real world, and it is a no-brainer that VR devices will continue to improve. Hence, they make our list.

Number 7 – Markup Language (1969)

A Markup Language

A markup language uses tags to define the elements within a document. The advantage of a markup language over a programming language is that it is easier to understand and uses syntax familiar to the average English speaker. To describe the history of markup languages, I am going to quote a passage from an article published on codepunk.io (https://codepunk.io/a-brief-history-of-markup-and-xml/):

It was William Tunnicliffe and the Graphic Communication Association’s (GCA) Composition Committee that started the move towards generic coding in order to promote a separation of content from formatting. Tunnicliffe presented this idea to the Canadian Government’s Printing Office in 1967. Also at this time, Stanley Rice, a book designer from New York, was in the midst of proposing an idea for editorial tags meant for structure. Later, Norman Scharpf, the director of the GCA, started a generic coding project inside of the Composition Committee of the GCA as a result of noticing trends towards this generic markup. This committee created a GenCode(R) concept that established the idea that different generic codes would be necessary for different types of documents. This GenCode(R) concept eventually evolved into the GenCode Committee, which had a large role in developing the Standardized Generalized Markup Language (SGML).

The importance of this invention can hardly be overstated. SGML was the precursor to the HyperText Markup Language (HTML), which powers most web pages today. For this reason, markup languages make our list.

Number 6 – Digital Cameras (1975)

The Canon PowerShot A95

Everyone knows what a digital camera is; all of us carry at least one in our smartphones these days, and I would wager most of us have also owned or used a standalone digital camera like the one pictured above. Digital cameras use a memory chip or internal storage to hold photographs, unlike analog cameras, which used photographic film.

Sasson's Digital Camera

Fairchild Semiconductor had only just developed its new type 201 charge-coupled device (CCD). A young engineer at the Eastman Kodak Company by the name of Steven Sasson was tasked with using a 100-by-100-pixel CCD to digitize an image. The captured image was passed to an analog-to-digital converter, stored temporarily in DRAM and then written to a cassette tape. The camera weighed 3.6 kilograms and required 16 batteries to operate. It is a shame that Kodak didn't take advantage of its own invention; the rest, as they say, is history. The digital camera makes our list for changing the way photographs are taken and making photography far more convenient.

Number 5 – Wireless Local Area Network (WLAN )(1970)

An example of WLAN Network

A wireless LAN (WLAN) is a wireless computer network that links devices using wireless communication to form a local area network (LAN) within a limited area such as a home, school or office building. This gives users the ability to move around within the area and remain connected to the network.

Norman Abramson

The first recognized WLAN was developed by Norman Abramson in 1970 and was called ALOHAnet. It enabled wireless communication between the Hawaiian islands and can be considered a precursor to technologies such as Wi-Fi and Ethernet. The ability to communicate without the dreaded telephone lines was one of the greatest achievements in communication technology in the seventies.

Considering that most of you would not be reading this without a Wi-Fi connection, is it any surprise that wireless LANs have made the list? To learn more about WLANs, go here.

Number 4 – Mobile Phones

The iPhone 11

Should I explain what a mobile phone is, what it does and what role it plays in our lives in the twenty-first century? I thought not. So, let us dive straight into the history of mobile phones, shall we?

Martin Cooper- Father of Handheld Phones

On April 3rd, 1973, Martin Cooper, general manager of Motorola's Communications Systems Division, made a phone call to Joel S. Engel, his counterpart at AT&T's Bell Labs. He said, "Joel, this is Marty Cooper. I'd like you to know that I'm calling you from a cellular phone," and just like that, Motorola had won the cellular phone race.

Motorola took another decade to bring a cellular phone to market, but they were still the first to do it. It weighed almost seven times as much as current mobile phones, cost $3,995 in 1983, and its battery lasted a grand total of 20 minutes. For completely changing the way human beings communicate, mobile phones make our list.

Number 3 – Microprocessor

The Intel 4004 Microprocessor

The microprocessor is the main processing unit of a computer: the controlling unit of a micro-computer, fabricated on a small chip, capable of performing ALU (arithmetic logic unit) operations and communicating with the other devices connected to it.

Stan Mazor, Ted Hoff, and Federico Faggin 

Fairchild Semiconductor (founded in 1957) built one of the first integrated circuits in 1959, which marked the beginning of microprocessor history. In 1968, Gordon Moore and Robert Noyce resigned from Fairchild Semiconductor and started their own company, Integrated Electronics (Intel), with Andrew Grove following them there. In 1971, the first commercial microprocessor, the Intel 4004, was released. A microprocessor is essentially a central processing unit in which numerous components are fabricated on a single chip: an ALU (arithmetic and logic unit), a control unit, registers, bus systems and a clock to perform computational tasks.

Number 2 – Personal Computers

A Personal Computer

If I were to explain to you what a personal computer is, you would chase me with a stick. The personal computer is one of the greatest inventions in the history of technology. They are used in all sorts of applications, by all sorts of people, in all sorts of environments; there is hardly a field in which personal computers of some kind are not being used.

The KENBAK – 1

The Kenbak-1 is credited as the first commercially available personal computer in the world. I will quote from the article at https://history-computer.com/ModernComputer/Personal/Kenbak-1.html to explain the importance and workings of the Kenbak-1, as it does so much better than I ever could.

It was created in 1971 by John Blankenbaker, working in his garage in Los Angeles. Initial sales commenced in September of 1971. It was intended to be educational and the professionals in the field were enthusiastic but it was a struggle to convince the non-professionals that they could buy a real computer at this price ($750), thus only some 40 devices were sold, mainly to schools.

The creator of Kenbak-1—John V. Blankenbaker (born 1929), had a long experience in the field of computers. He started the design of a computing device as early as in the winter of 1949, when he was a 19 y.o. physics freshman at Oregon State College, inspirited by an article in a magazine. After graduation from the college in 1952, he worked at Hughes Aircraft Co. in the department for digital computers, designing the arithmetic unit for a business data processor. Some time in the late 1950s he began to think there could be simple computers which could be afforded by individuals.

As late as in the fall of 1970 he found himself unemployed and decided to investigate what might be done to make a computer for personal use. He wanted the computer to be low cost, educational, and able to give the user satisfaction with simple programs. The computer could be serial and slow which would reduce the cost yet create the environment that was desired. It should demonstrate as many programming concepts as was possible. Because of the small size, the native language of the unit would be the machine language. Above all, it had to be a stored program machine in the von Neumann sense. To keep the costs low, switches and lights were the input and output of the machine. (Some thought was given to punched card input, but it was never developed.)

By the spring of 1971, the logic printed circuit board had been built and the computer was assembled. Designed before microprocessors were available, the logic consisted of small and medium scale integrated circuits, mounted on one printed circuit board. MOS shift registers implemented the serial memory. Switches in the front keyed the input and lights displayed the output. The memory was two MOS shift registers, each of 1024 bits. The computer executed several hundred instructions per second.

(I am using content from another site for this entry because I intend to write a detailed piece on the KENBAK-1 very soon.)

Number 1 – Lunar Module

Apollo16LM.jpg
The Apollo 16 LM

If you think I am not going to include any space-related entries on this list, you are much mistaken. Our entry from outer space is the Apollo 16 Lunar Module. The Lunar Module was the first crewed spacecraft to operate exclusively in the airless vacuum of space, and it remains the only crewed vehicle to have landed anywhere beyond Earth.

It was structurally and aerodynamically incapable of being launched from Earth, so it was carried into space with the Apollo Command and Service Module (CSM) and descended to the surface from lunar orbit.

Apollo 16 crew.jpg
Thomas K. Mattingly II, Command Module pilot; John W. Young, Commander; and Charles M. Duke Jr., Lunar Module pilot.

The mission was crewed by three men: Thomas K. Mattingly II, John W. Young, and Charles M. Duke Jr. Launched from the Kennedy Space Center in Florida at 12:54 p.m. EST on April 16, 1972, the mission lasted 11 days, 1 hour, and 51 minutes, concluding at 2:45 p.m. EST on April 27.

Young and Duke spent three days on the lunar surface, conducting three moonwalks that totaled 20 hours and 14 minutes. They drove the Lunar Roving Vehicle for 26.7 kilometres and collected 95.8 kilograms of lunar samples to return to Earth, while Command Module Pilot Mattingly remained in orbit aboard the Command and Service Module. It was, and still is, one of the most daring feats of humankind, and for that reason it tops our list.

References

https://www.toptal.com/c/after-all-these-years-the-world-is-still-powered-by-c-programming

https://h2g2.com/edited_entry/A425846

https://www.section.io/engineering-education/history-of-c-programming-language/

https://www.fi.edu/virtual-reality/history-of-virtual-reality

https://codepunk.io/a-brief-history-of-markup-and-xml/

https://spectrum.ieee.org/tech-history/silicon-revolution/how-the-digital-camera-transformed-our-concept-of-history

https://www.thevintagenews.com/2016/10/16/priority-martin-cooper-invented-mobile-phone-1973-says-inspired-captain-kirks-gold-flip-top-communicator-star-trek/

https://www.uswitch.com/mobiles/guides/history-of-mobile-phones/

https://history-computer.com/ModernComputer/Basis/microprocessor.html

https://www.elprocus.com/microprocessor-history-and-brief-information-about-its-generations/

https://history-computer.com/ModernComputer/Personal/Kenbak-1.html

https://www.hq.nasa.gov/alsj/a16/a16ov.html

https://www.nasa.gov/mission_pages/apollo/missions/apollo16.html

Top Ten US Inventions and Innovations During the Cold War- Part Two

The Flag of the United States of America

In the first part of this series, we looked at American inventions and innovations from the first decade of the Cold War (1946-1955). Now we turn to American inventions from the second decade. This was a very significant decade for space exploration and for the development of computers and their accessories, so there were more than ten worthy candidates, but we have a format to stick to. That being said, let us start straight away.

Number 10 – Computer Mouse (1963)

The first-ever computer mouse – the Engelbart mouse

Let me start with something everyone knows about: the humble computer mouse. It is difficult to imagine how we would have used computers had the mouse never been invented; fortunately, we don't have to put ourselves through that misery.

The first computer mouse was invented by Douglas Engelbart at the Augmentation Research Center in 1963, with funding from the agency now known as the Defense Advanced Research Projects Agency (DARPA). The idea of a mouse, an interactive device that could improve computational efficiency, had been in his head since 1961. The first mouse was carved out of wood and tracked motion using two wheels on the bottom. The wheels were later replaced by a single ball, and today almost all mice use light to track movement. This one makes our list because it is impossible to imagine modern computing without the mouse or the technologies that grew out of its success.

Number 9 – FORTRAN (1956-57)

Fortran acs cover.jpeg
Cover of The Fortran Automatic Coding System for the IBM 704 EDPM, said to be the first book about Fortran.

Fortran is a general-purpose programming language designed especially for numeric computation and scientific computing. Despite being invented in the 1950s, Fortran is still used to this day in scientific fields such as geology and computational physics. It was one of the first high-level programming languages, and its compiler pioneered many of the optimization techniques found in modern compilers.

John Backus 2.jpg
John Backus- Inventor of Fortran

The credit for the creation of Fortran goes to John Backus, who developed it with a team of researchers at IBM, specifically for the IBM 704 mainframe computer. The main motivation was to provide a more practical alternative to assembly-language coding on the 704. This made human-machine interaction much easier and made Fortran a precursor to other popular languages such as BASIC. Fortran makes the list for its longevity and for the impact it has had on the development of other programming languages.

Number 8 – Zero Gravity Pen (Space Pen)(1965)

An AG-7 Astronaut Space Pen 

We finally get to the space stuff. Let us start with one of the most famous space-related inventions, the space pen. The space pen is a pen that can write in zero gravity, underwater, on greasy paper, at any angle, and across a wide range of temperatures. The tip of the pen is made of tungsten carbide (a compound containing equal parts tungsten and carbon).

The credit for inventing the space pen goes to Paul Fisher of Boulder City, Nevada, who first patented the AG7 "anti-gravity" pen in 1965. The ink is thixotropic: it flows more easily when agitated, much like tomato ketchup. The ink is said to last three times as long as that of a normal ballpoint pen; official literature claims one refill can draw a line 49.8 km long, and the pen has been proven to write at an altitude of 3,800 m. NASA tested the pen for two years and deployed it during the Apollo 7 mission in 1968. Documentation in space is a difficult task, and the space pen made it practical, which is why it features on our list.

Number 7- Cordless Telephone (1965)

Cordless Telephone

If you were a middle-school student in the 2000s, these were an integral part of your communications. A cordless telephone has a wireless handset that communicates via radio waves with a base station connected to a fixed telephone line, usually within a limited range of the base station. It is, in effect, a movable landline phone, which greatly increased the flexibility and range of the static landline telephone.

George Sweigert Radiotelephone Inventor.jpg
George Sweigert, the first person to patent the Cordless Phone

In 1965, an American woman named Teri Pall invented a cordless telephone but never got around to patenting it. In 1966, American inventor George Sweigert applied for a patent describing the technology behind cordless telephones. Sweigert had been a radio operator in the Second World War and was interested in improving the quality of military radio phones. The commercialization of cordless telephones only began in the 1980s, but the groundwork was laid nearly two decades earlier by Pall and Sweigert. This invention is still in use today and has been crucial in improving the range and flexibility of the landline Public Switched Telephone Network (PSTN), which is why the cordless telephone makes our list.

Number 6 – Light Emitting Diodes (LED)(1962)

RBG-LED.jpg
Light Emitting Diodes

Light-emitting diodes (LEDs) are semiconductor diodes that emit light when current flows through them. Light is produced when positive and negative charge carriers (holes and electrons) recombine within the semiconductor material. LEDs are used in almost every application that requires light, including LED lamps, traffic lights, exit signs, emergency vehicle lights, and even as light sources for optical fiber links. This makes the LED one of the most consequential inventions of the Cold War.

The first LED was an infrared one, invented by Gary Pittman and James R. Biard in 1961 while the duo were working for Texas Instruments. It was an accidental invention, as the pair were actually working on a laser diode. However, infrared light is invisible to humans, so its immediate practical uses as a light source were limited.

The first visible-light LED was invented by Nick Holonyak while he was working for General Electric in Syracuse, New York. Holonyak's diode produced red light, and for this invention he has been given the title "Father of the Light-Emitting Diode".

During the 1960s, two companies drove the commercialization of LED products: Hewlett-Packard (HP) and Monsanto. HP built LEDs using materials supplied by Monsanto, a chemical company that produced gallium arsenide phosphide (GaAsP), which turned out to be a useful semiconductor for making LEDs. After that, optoelectronic products built around LEDs became the norm, and they remain so to this day. For their longevity and breadth of applications, LEDs deserve to make this list.

Number 5 – Laser(1957)

Lasers of different wavelengths projecting light of different colors

Lasers are another light source that has changed the world. LASER is actually an acronym that stands for Light Amplification by Stimulated Emission of Radiation. Lasers are, along with LEDs, among the most widely used light sources, appearing in applications such as surgery, communications, nuclear fusion research, laser printing, and pretty much every optoelectronic field. In optical fiber links, lasers are preferred to LEDs as light sources because of their narrow spectral width. Let us take a look at the laser's history.

We have to start with Charles H. Townes, who featured in our Soviet version of this article for his work on masers, the precursor to the laser; he played an integral role in the development of lasers as well. Townes, along with Arthur Leonard Schawlow (another Nobel Prize-winning physicist), began studying infrared lasers seriously, soon shifted their focus to visible light, and initially called the device an "optical maser".

At almost the same time, a Columbia University graduate student, Gordon Gould, was studying the energy levels of excited thallium (atomic number 81). Gould can be credited with the first use of the acronym LASER: at a conference in 1959, he published the term in the paper "The LASER, Light Amplification by Stimulated Emission of Radiation".

Maiman and his laser

Theories aside, the first fully functioning laser was built by Theodore H. Maiman at Hughes Research Laboratories in Malibu, California. Maiman's laser used a ruby crystal, which absorbs green and blue light while emitting red light; the chromium atoms in the ruby were energized to produce the red beam. Using ruby to create a laser was thought impossible at the time, but Maiman achieved it with relative ease.

Lasers have changed many fields and impacted the lives of millions of people in a positive way. In medicine, many modern surgical procedures would be impossible without them, and for these reasons lasers make our list.

Number 4 – Global Navigation Satellite Systems (1960)

Transit – the first GNSS system

GNSS stands for Global Navigation Satellite System, known in popular lingo as SatNav or GPS (GPS is just one example of a GNSS). These are satellite systems that provide autonomous geo-spatial positioning with global coverage. A GNSS allows small electronic receivers, like the one in your phone, to determine their position on Earth with a very high degree of accuracy. GNSS also provides highly accurate time synchronization, and for the most part it works without cellular coverage or an internet connection. You can read about how satellite phones work here.

Like everything else on this list, GNSS was a product of the Cold War. The first such system was called Transit, deployed by the US military in the 1960s. The Transit satellite system was developed at Johns Hopkins University's Applied Physics Laboratory (APL) for the United States Navy. The brains behind it were two APL physicists, William Guier and George Weiffenbach, who studied the microwave signals emanating from Sputnik-1, the first-ever satellite, and managed to determine its orbit by analyzing the Doppler shift of its radio signals during a single pass. Frank McClure, the chairman of APL's Research Center, then suggested turning the idea around: if the satellite's position were known and predictable, the Doppler shift could be used to locate a receiver on Earth.
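
To make the Doppler idea concrete, here is a minimal sketch in Python (a toy, flat two-dimensional geometry with made-up numbers, not the actual Transit algorithm): a satellite on a known track passes overhead, a fixed receiver records the Doppler-shifted frequency over time, and a simple search recovers the receiver's position from that curve.

```python
import numpy as np

C  = 3.0e8        # speed of light, m/s
F0 = 400.0e6      # assumed transmit frequency, Hz (Transit broadcast near 400 MHz)

def received_freq(rx, sat_pos, sat_vel):
    """Doppler-shifted frequency seen by a stationary receiver at position rx."""
    rng = sat_pos - rx                                         # range vectors, shape (T, 2)
    range_rate = np.sum(sat_vel * rng, axis=1) / np.linalg.norm(rng, axis=1)
    return F0 * (1.0 - range_rate / C)                         # classic Doppler formula

# A satellite on a known, straight track 800 km up, moving at 7.5 km/s (made-up pass).
t       = np.arange(0.0, 600.0, 10.0)                          # a 10-minute pass, 10 s steps
sat_pos = np.stack([-2250e3 + 7500.0 * t, np.full_like(t, 800e3)], axis=1)
sat_vel = np.tile([7500.0, 0.0], (t.size, 1))

true_rx  = np.array([120e3, 0.0])                              # the "unknown" receiver
measured = received_freq(true_rx, sat_pos, sat_vel)            # pretend measurements

# Brute-force search along the ground for the position whose predicted
# Doppler curve best matches the measured one.
candidates = np.arange(-500e3, 500e3, 1e3)
errors = [np.sum((received_freq(np.array([x, 0.0]), sat_pos, sat_vel) - measured) ** 2)
          for x in candidates]
print("estimated receiver x ~", candidates[int(np.argmin(errors))] / 1e3, "km")
```

The real system did essentially this with the roles reversed in rigor: once the satellite's orbit was known precisely, a few minutes of recorded Doppler data were enough to fix a receiver's position.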

So the team at Johns Hopkins began developing the TRANSIT system in 1958. A first satellite, Transit 1-A, was launched but failed to reach orbit; a second satellite, Transit 1-B, was successfully deployed on April 13, 1960, and the era of satellite navigation began. What it has led to in the subsequent six decades is there for all to see: satellite navigation has become an integral part of human life and will remain so in the decades to follow. For this reason, GNSS makes our list.

Number 3 – Operating Systems (1956)

Everyone knows what an operating system is, but let me define it anyway. An operating system is system software that manages computer hardware and software resources and provides common services for computer programs. Operating systems are a prerequisite for using computers, and today we will talk about their history.

The first operating system was GM-NAA I/O, created in 1956 by Owen Mock of North American Aviation and Bob Patrick of General Motors Research. It was designed for IBM's 704 mainframe and is considered the first "batch processing" operating system. In batch processing, jobs run to completion with little or no human interaction; the name comes from the stacks ("batches") of punched cards that were fed through the machine one after another. The operating system would read in a program and its data, run that program to completion (including outputting its results), and then load the next program, as long as additional jobs were available.
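
As a toy illustration of that idea, here is a sketch in Python (the job names and the `Job`/`run_batch` helpers are invented for illustration and have nothing to do with the real GM-NAA I/O): a simple monitor runs each queued job to completion before loading the next one.

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Job:
    name: str
    program: Callable[[str], str]   # takes the job's input data, returns its output
    data: str

def run_batch(jobs: List[Job]) -> None:
    """Run each job to completion, then load the next -- no interaction in between."""
    for job in jobs:
        print(f"loading {job.name}")
        output = job.program(job.data)      # run to completion
        print(f"{job.name} output: {output}")

# Invented example jobs standing in for stacks of punched cards.
run_batch([
    Job("payroll", lambda d: f"processed {len(d.split())} records", "alice bob carol"),
    Job("report",  lambda d: d.upper(), "quarterly totals"),
])
```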

Batch processing systems were very popular from the late fifties to the early sixties and were eventually superseded by multiprogramming and time-sharing operating systems, which were much more efficient and led to operating systems as we know them today. For this reason, the operating system makes our list.

Number 2 – Integrated Circuits (IC) (1958)

The Integrated Circuits of an EEPROM

Integrated circuits are the most important components of modern electronic equipment. An integrated circuit is a miniature electronic circuit manufactured on a thin substrate of semiconductor material. It packs a large number of transistors and other electronic components onto a single chip, making the resulting circuit smaller, faster, and less expensive than one built from discrete components. The metal-oxide-semiconductor (MOS) transistor is the building block that made integrated circuits practical and, eventually, omnipresent in every piece of equipment.

Jack Kilby’s Integrated Circuit

A large number of people were responsible for the development of the integrated circuit. The first person we should talk about is Geoffrey Dummer, a radar scientist working for the British Ministry of Defence's Royal Radar Establishment. At a conference in Washington, D.C. in 1952 he stated: "With the advent of the transistor and the work on semi-conductors generally, it now seems possible to envisage electronic equipment in a solid block with no connecting wires. The block may consist of layers of insulating, conducting, rectifying and amplifying materials, the electronic functions being connected directly by cutting out areas of the various layers." We take this to be the first description of the integrated circuit. Next comes Jack Kilby, who proposed the idea of integrated circuits to the US Army in 1957 and successfully demonstrated a working example in 1958. It was a hybrid integrated circuit: it contained external wires and was extremely difficult to mass-produce. Kilby won the Nobel Prize in Physics in 2000 for his invention.

Robert Noyce invented the first monolithic IC chip. Noyce's design was made from silicon, and its great advantage was that it could easily be mass-produced, which kick-started the computing revolution. Unlike Kilby's IC, which had external wire connections, Noyce's monolithic chip put all the components on a single piece of silicon and connected them with metal lines deposited on the chip itself.

Integrated circuits are omnipresent in electronic equipment and have fueled the computer revolution; it is for that reason that they make this list. (Integrated circuits deserve an article of their own, and I will pen it once I am done with this series.)

Number 1 – The Explorer 1 (1958)

Explorer1.jpg
The Explorer 1

Explorer 1 was the first United States satellite launched into outer space, on January 31, 1958. It was one of the most important statements of the Cold War space race. An extract from NASA states: "Following the launch of the Soviet Union's Sputnik 1 on October 4, 1957, the U.S. Army Ballistic Missile Agency was directed to launch a satellite using its Jupiter C rocket developed under the direction of Dr. Wernher von Braun. The Jet Propulsion Laboratory received the assignment to design, build and operate the artificial satellite that would serve as the rocket's payload. JPL completed this job in less than three months."

The primary job of Explorer 1 was to measure the radiation environment in Earth orbit. Its orbit ranged from as close as 354 kilometres to as far as 2,515 kilometres from Earth, and it circled the planet every 114.8 minutes, making 12.54 orbits a day. The satellite weighed 14 kilograms and burned up on March 31, 1970, after nearly 60,000 orbits.

The impact of Explorer 1 cannot be overstated. It showed the Soviets that the Americans were not far behind in the space race, and it was a statement of intent to the world that America was a superpower. To this day the USA operates more satellites than any other country and remains the leader in space, and it all started with Explorer 1 in 1958. This is why it tops our list.

Thank you for reading. Part 3 will be out shortly.

References

https://www.dougengelbart.org/content/view/162/000/

https://www.livephysics.com/computational-physics/fortran/history-fortran-language/

https://www.obliquity.com/computer/fortran/history.html

https://www.spacepen.com/about-us.aspx

https://www.thespacereview.com/article/613/1

https://www.spacepen.com/

https://www.shineretrofits.com/knowledge-base/lighting-learning-center/a-brief-history-of-led-lighting.html#

http://www.lamptech.co.uk/Spec%20Sheets/LED%20Monsanto%20MV1.htm

https://history-computer.com/ModernComputer/Basis/IC.html

https://ethw.org/Theodore_Maiman_and_the_Laser

https://www.photonics.com/Articles/A_History_of_the_Laser_1960_-_2019/a42279

https://www.historyofinformation.com/detail.php?id=83

https://www.ibm.com/support/knowledgecenter/zosbasics/com.ibm.zos.zconcepts/zconc_whatisbatch.htm

http://www.osdata.com/kind/history.htm#batchsystems

https://jpl.nasa.gov/missions/explorer-1/

https://www.nasa.gov/mission_pages/explorer/explorer-overview.html

Top Ten US Inventions and Innovations During the Cold War- Part One

The Flag of the United States of America

Disclaimer – This is the first part of a possible four- or five-part series, because there are far too many inventions to cover in a single article. The lists are divided by time frame but are not strictly chronological. This first article ranks the top ten inventions from the first ten years of the Cold War (1946-1955).

The Cold War was one of the most significant periods for invention and innovation in history. Both sides created great things, and most of them shaped the world we live in today. The inventions made by the Soviets have already been covered; now, across four or five parts, we will look at American inventions and innovations that have withstood the test of time, starting with the 1946-1955 period.

Number 10- Tupperware (1948)

Tupperware

Great inventions don't have to be big and bulky. Some are small and simple, but they still withstand the test of time, and our first entry, Tupperware, falls into that category. Invented by Earl Tupper, Tupperware is a line of plastic containers used to store, seal, and serve food (mostly perishables). When it was introduced in 1948, it was marketed as "the plastic that could withstand anything". Tupperware began gaining popularity in the 1950s thanks to Brownie Wise, a Detroit resident with a knack for hostess parties. Back then, housewives covered leftovers with shower caps; Wise showed there was no longer any need to, and with its patented lids that made the famous "burping" sound, Tupperware revolutionized the kitchen in the 1950s. There are two reasons Tupperware makes this list. The first is the sheer usefulness of the invention: Tupperware is found in almost every kitchen in the world. The second is the success of Tupperware Brands Corporation, which generated over two billion dollars in revenue in 2017. It is one of the most important American inventions of the Cold War era.

Number 9 – Teleprompter (1950)

A Teleprompter

When I was a child, I used to wonder how news presenters remembered every single bit of information. A few years later I learnt about the teleprompter, and the illusion was shattered. Hubert Schlafly, an electrical engineer working at the 20th Century Fox film studios in Los Angeles, invented the teleprompter in 1950. A teleprompter is a device that allows a presenter to read a script while maintaining eye contact with the audience. Teleprompters are used in many settings, especially by television presenters and by politicians making televised speeches; in the twenty-first century their use extends to independent video creators and even artists performing on stage.

A Teleprompter and Camera Setup

The main principle of the teleprompter is that text is displayed on a monitor mounted below a piece of partially reflective glass (a beam splitter). The camera shoots straight through the back of the glass, so the text is invisible to the audience, while the front surface reflects the script so the presenter can read it. The teleprompter makes this list because it revolutionized television; it is impossible to imagine what TV, and especially the news industry, would look like without it.

Number 8- Radiocarbon Dating (1949)

Radiocarbon dating is a method of determining the age of organic material using radiocarbon (carbon-14), a radioactive form of carbon. Living things constantly take in carbon-14; when they die the intake stops and the carbon-14 decays at a known rate, so the fraction remaining reveals how long ago the organism died. The technique can date carbon-containing materials as old as 60,000 years with a high degree of accuracy. It was developed by the American chemist Willard Libby at the University of Chicago in 1949, and it won Libby the Nobel Prize in Chemistry in 1960.

File:Willard Libby.jpg
Willard Libby

The biggest use of radiocarbon dating has been in archaeology. Analysis of soil samples and of animal and human remains has allowed us to put dates on many civilizations, which would not have been possible otherwise. Radiocarbon dating is also used in fields such as geology and sedimentology, and to trace carbon released from ecosystems, for example to measure how much carbon previously stored in soil has been released.

Despite not being a tangible physical product, radiocarbon dating's impact can hardly be overstated: it is one of the most significant scientific techniques ever devised, and we have learnt far more about our planet as a result. (You can learn more about radiocarbon dating here.)
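
As a rough sketch of the arithmetic behind the method (using the commonly quoted carbon-14 half-life of about 5,730 years; real laboratories apply calibration curves and corrections this toy calculation ignores):

```python
import math

HALF_LIFE_YEARS = 5730.0          # approximate half-life of carbon-14

def age_from_fraction(remaining_fraction: float) -> float:
    """Age in years, given the fraction of the original carbon-14 still present."""
    # N(t) = N0 * (1/2)**(t / half_life)  =>  t = half_life * log2(N0 / N)
    return HALF_LIFE_YEARS * math.log2(1.0 / remaining_fraction)

# A sample retaining 25% of its original carbon-14 is about two half-lives old.
print(f"{age_from_fraction(0.25):.0f} years")   # ~11460 years
```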

Number 7 – The Transistor (1947)

File:Replica-of-first-transistor.jpg
The First Transistor

A transistor is an electronic component used to amplify and switch signals. Its biggest advantage is that the output power can be significantly larger than the input power, which is what provides signal amplification. The transistor is used in virtually every electronic device created since the 1950s and is the most widely used and most widely studied single electronic component.

The first transistor was a point-contact transistor created in 1947 by John Bardeen and Walter Brattain, working under William Shockley at Bell Labs. The trio went on to win the Nobel Prize in Physics in 1956.

In their experiments, Bardeen and Brattain observed that when two gold point contacts were applied to a germanium crystal, the output signal was much stronger than the input signal. Shockley saw the potential in this, and their work eventually led to the development of the bipolar junction transistor. Considering the impact transistors have had on the electronics industry, it would be foolish not to include them here.

Number 6 – Compiler (1949)

Commodore Grace M. Hopper, USN (covered).jpg
Rear Admiral Grace M. Hopper

If you have ever written a line of code, you know exactly what a compiler is. If you are unfamiliar with them, compilers are computer programs that translate code written in one language into another language. A compiler generally converts a high-level language (the code that the programmer writes) into a low-level language (machine code); this process is known as compilation. In essence, the compiler takes the entire program written by the user and translates it into instructions the machine can execute.
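
As a minimal sketch of the idea, and nothing like Hopper's actual A-0, here is a toy translator (in Python, with an invented instruction set) that compiles a tiny arithmetic expression into instructions for an imaginary stack machine and then executes them.

```python
import ast

def compile_expr(source: str) -> list:
    """Translate a small arithmetic expression into toy stack-machine instructions."""
    ops = {ast.Add: "ADD", ast.Sub: "SUB", ast.Mult: "MUL", ast.Div: "DIV"}

    def emit(node):
        if isinstance(node, ast.Constant):
            return [("PUSH", node.value)]
        if isinstance(node, ast.BinOp) and type(node.op) in ops:
            return emit(node.left) + emit(node.right) + [(ops[type(node.op)], None)]
        raise ValueError("unsupported construct")

    return emit(ast.parse(source, mode="eval").body)

def run(program: list) -> float:
    """Execute the toy machine code on a simple operand stack."""
    stack = []
    for op, arg in program:
        if op == "PUSH":
            stack.append(arg)
        else:
            b, a = stack.pop(), stack.pop()
            stack.append({"ADD": a + b, "SUB": a - b,
                          "MUL": a * b, "DIV": a / b}[op])
    return stack.pop()

code = compile_expr("(2 + 3) * 4")
print(code)        # [('PUSH', 2), ('PUSH', 3), ('ADD', None), ('PUSH', 4), ('MUL', None)]
print(run(code))   # 20
```

A real compiler adds parsing of a full language, type checking, optimization, and code generation for actual hardware, but the translation step is the same in spirit.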

The first compiler was written by United States Navy Rear Admiral and computer scientist Grace Hopper for the A-0 programming language. Hopper is another entry in the long list of computer scientists we do not talk about enough. She also built one of the first linkers, programs that take one or more object files and combine them into a single executable, and she was integral to the programming of the Harvard Mark I and to the development of the COBOL programming language, which is still in use today. (Expect a biography of her soon.)

Compilers make this list because they are among the most essential programs ever written and have helped bridge the gap between humans and computers. They are one of the main reasons computers can understand what humans write, and their importance simply can't be overstated.

Number 5 – Cable Television (1948)

TV Screen

Here is one invention that needs no introduction. Everyone knows what cable television is, and if, like me, you were born in the mid-90s or earlier, cable TV was your primary source of entertainment for a long time. Cable TV primarily used coaxial cables to deliver television content to your TV set as radio-frequency signals. Though coaxial cables are now being replaced by optical fiber, the principles of cable TV remain the same.

The invention of cable TV has been credited to John Walson and Margaret Walson in the mountains of Pennsylvania. John Walson was a lineman for Pennsylvania Power & Light and also owned a local appliance store that held an inventory of unsold TV sets. He ran a wire from a receiving antenna on a local hill down to his store so the sets could pick up Philadelphia stations; as a result, cable TV was born, and frankly it has never looked back. This one makes the list due to the sheer success of cable TV and the broadcasting industry as a whole.

Number 4 – The Nuclear Submarine USS Nautilus (1955)

USS Nautilus SSN571.JPG
The USS Nautilus

The USS Nautilus was the first nuclear submarine, and it completely changed naval warfare; that is not an exaggeration of any kind, if anything it is an understatement. Submarines before the Nautilus needed two propulsion systems: diesel engines to run on the surface and electric motors to travel underwater. The Nautilus, powered by nuclear energy, could make do with a single propulsion plant, which meant it could travel for hundreds or even thousands of kilometres without worrying about refueling or recharging. Congress authorized the construction of a nuclear-powered submarine in 1951, and the project was planned and supervised by Admiral Hyman Rickover, the "father of the nuclear navy". The USS Nautilus became fully operational in January 1955 and roamed the Atlantic and the Pacific until it was decommissioned in 1980; it is now preserved as a museum ship at the United States Navy Submarine Force Library and Museum in Groton, Connecticut. The fact that it was the first nuclear-powered submarine is enough to grant it a place on our list.

Number 3 – The Bell X-1 (Bell Model 44)(1947)

File:Bell X-1 46-062 (in flight).jpg
The Bell X-1

I have made you wait a while before unleashing an actual aircraft, but with good reason. The Bell X-1 is one of the most significant inventions ever created, and it has to rank highly on this list; let us understand why.

ChuckYeager.jpg
Brigadier General  Chuck Yeager, US Air Force

Supersonic travel means moving faster than the speed of sound (about 767 mph, or 1,235 km/h, at sea level). The Bell X-1 was the first aircraft to fly supersonically, eventually reaching a top speed of approximately 1,000 mph (about 1,600 km/h). It was the first in the series of X-planes used by the United States military to test new technologies and aerodynamic concepts. The man who achieved this remarkable feat in the cockpit of the X-1 was the flying ace Chuck Yeager, on 14 October 1947. The legacy left behind by the X-1 can't be overstated: we saw in the Soviet article how many supersonic aircraft they created, and the X-1 can be considered a forerunner of later supersonic fighters, bombers, and even passenger jets. Hence, its high position on this list is easily justified.
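
A quick sanity check of those numbers (using the rounded sea-level figure quoted above; the speed of sound actually falls with altitude, so this is only a ballpark):

```python
SPEED_OF_SOUND_MPH = 767.0      # approximate sea-level value used in the text

def mach(speed_mph: float) -> float:
    """Mach number: airspeed divided by the local speed of sound."""
    return speed_mph / SPEED_OF_SOUND_MPH

print(f"Mach {mach(1000.0):.2f}")   # the X-1's ~1,000 mph works out to roughly Mach 1.3
```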

Number 2 – Hard Disk Drive(1956)

File:Laptop-hard-drive-exposed.jpg
Hard Disk Drive

Based on longevity and sheer usefulness, the hard disk drive might well have topped this list; in truth, there is very little between the first- and second-place entries. Everyone knows what a hard disk is, and I can safely say that hard disks are among the most used inventions of all time: all of us reading this article use one on an almost daily basis.

A hard disk is an electro-mechanical storage device used to store and retrieve digital data. Hard disks come in different shapes and sizes and can be external or built into the computer. But you already knew that, and this is not a sixth-grade computer science lecture, so let me tell you about the first hard disk, introduced in 1956.

It was IBM that designed the first commercial hard drive. It shipped with the IBM 305 RAMAC computer, was the size of two refrigerators, and weighed about a thousand kilograms. The capacity of this revolutionary device was 5 MB (yes, megabytes, not gigabytes), and the machine cost nearly $50,000.

BRL61-IBM 305 RAMAC.jpeg
The IBM 305 RAMAC

That works out to roughly $10,000 per megabyte of storage. The main brain behind the hard disk's development was Reynold Johnson, a long-time IBM employee (1937-1971) who also played a key role in the development of videocassette tape. The hard disk makes this list as one of the most important inventions since the Industrial Revolution.
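
A quick check of that figure, using the round numbers quoted above:

```python
PRICE_USD   = 50_000     # approximate price of the RAMAC 305's drive, as quoted above
CAPACITY_MB = 5          # capacity in megabytes

print(f"${PRICE_USD / CAPACITY_MB:,.0f} per megabyte")   # -> $10,000 per megabyte
```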

Number 1- Artificial Heart (1952)

File:Artificial-heart-london.JPG
Artificial Heart

I deliberated for a long time over whether to place the hard disk or the artificial heart higher on my list: the former changed human lives, while the latter saves them. As you can see, I went with the latter. An artificial heart is an electro-mechanical device that essentially replaces the human heart. The replacement can be either permanent or temporary; temporary replacements are more common, used while a heart transplant or other cardiac surgery is taking place.

Dr Forest Dewey Dodrill

On July 3, 1952, Henry Opitek became the first recipient of an artificial heart, which took over while his left ventricle underwent open-heart surgery. The surgeon who took on the task was Forest Dewey Dodrill, who performed the feat using the Dodrill-GMR, a machine he developed together with researchers at General Motors. Since then, open-heart surgeries have saved hundreds of thousands of lives, and artificial hearts have prolonged the lives of many people who would otherwise have died. For this reason, the artificial heart tops our list.

So that was it for the greatest inventions from the United States during the first ten years of the Cold War. You may have noticed that none of the entries concerned space exploration; that is because the United States was still recovering from the effects of World War II during those years. The space exploration projects and other mega-projects will feature in the upcoming parts.

Thank you for reading.

References

https://people.howstuffworks.com/tupperware2.htm

https://www.autocue.com/blog/what-is-a-teleprompter

https://theconversation.com/explainer-what-is-radiocarbon-dating-and-how-does-it-work-9690

https://www.nobelprize.org/prizes/chemistry/1960/libby/biographical/

https://news.yale.edu/2017/02/10/grace-murray-hopper-1906-1992-legacy-innovation-and-service

http://www.columbia.edu/cu/computinghistory/mark1.html

https://hbswk.hbs.edu/item/cable-tv-from-community-antennas-to-wired-cities

https://electricalfundablog.com/barcodenumbersystem/#How_Barcode_Reader_Reads_1D_Barcodes

https://airandspace.si.edu/collection-objects/bell-x-1-glamorous-glennis/nasm_A19510007000

https://www.nasa.gov/centers/armstrong/history/experimental_aircraft/X-1.html

https://www.history.com/this-day-in-history/uss-nautilus-commissioned

https://science.howstuffworks.com/innovation/everyday-innovations/artificial-heart.htm

https://www.sjsu.edu/faculty/watkins/transist.htm#