Do you ever use the term ‘jiffy’ when talking about how quickly something can be done? As IT experts, we’re all about precision and accuracy – but where did this quirky little term come from?
Well, it turns out that ‘jiffy’ is actually a real unit of time used in the world of computing. Specifically, it refers to the duration of one tick of the system timer interrupt – on Linux, for instance, a jiffy has traditionally been 10 milliseconds, though the exact length depends on how the kernel is configured.
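For the curious, here’s a minimal sketch in C (assuming a POSIX system such as Linux) that asks the operating system for its user-space tick rate and converts it into a jiffy length. One caveat: sysconf(_SC_CLK_TCK) reports the tick rate exposed to programs, which can differ from the kernel’s internal timer frequency.

```c
/* Minimal sketch: estimate the length of one "jiffy" on a POSIX system.
 * sysconf(_SC_CLK_TCK) returns the number of clock ticks per second
 * that the OS exposes to user space (commonly 100 on Linux). */
#include <stdio.h>
#include <unistd.h>

int main(void) {
    long ticks_per_second = sysconf(_SC_CLK_TCK);
    if (ticks_per_second > 0) {
        printf("Ticks per second: %ld\n", ticks_per_second);
        printf("One jiffy is roughly %.1f ms\n", 1000.0 / ticks_per_second);
    }
    return 0;
}
```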
Where did the term ‘jiffy’ originate? It actually dates back to the 18th century, when it was used to refer to a brief moment of time. Over the years, it’s been adopted by various industries to denote different lengths of time. In physics, for example, a jiffy is defined as the time it takes light to travel one fermi… that’s a real term, by the way, not something from Star Trek.
So next time you hear someone say they’ll get something done ‘in a jiffy’, you can impress them with your knowledge of its surprisingly precise meaning in computing.
And who knows, maybe you’ll even start using it yourself to measure how long it takes to complete annoying IT tasks… that you should be giving to my team anyway!
#jiffy #ComputingHistory #ITHumor
https://measurement.en-academic.com/1235/jiffy