
FLOPS - what they mean



FLOPS.  What is a FLOPS? The acronym stands for floating point operations per second: the number of floating point operations (multiplications, additions, and the like) a computer can perform in one second.  A floating point number has a fixed number of significant digits (e.g. 12) multiplied by a power of ten.  For example, most computers have trouble handling the number 12345678901234546789012345 natively, but they do well with approximations such as 1234567890 x 10^16. This means you lose some precision, but you keep the magnitude.
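The precision loss described above is easy to see in practice. Here is a small sketch in Python; note that Python floats are binary rather than decimal, but the idea is the same: a fixed-precision significand times an exponent.

```python
# A 26-digit integer: Python stores integers exactly.
exact = 12345678901234546789012345

# Converting to a 64-bit float keeps the magnitude but only
# about 15-17 significant digits; the trailing digits are lost.
approx = float(exact)

print(f"{approx:e}")        # roughly 1.234568e+25 -- magnitude preserved
print(approx == exact)      # False -- some precision was lost
```

The relative error is tiny (on the order of one part in 10^16), which is exactly the trade the text describes: magnitude for free, precision within limits.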

Let’s not digress into software that can handle very high precision numbers, or into how floating point numbers are precisely represented inside the computer. It is enough here to know that we are working with numbers having a fixed number of digits together with a power of ten.  Let’s just talk about the FLOPS, floating point operations per second, of the native hardware.

You know computers are fast, and expensive computers are really fast.  MegaFLOPS refers to a million floating point operations per second.  A computer rated at one gigaFLOPS performs a billion floating point operations per second, and a one teraFLOPS machine performs a trillion floating point operations per second. A one petaFLOPS machine is a thousand times faster again: a quadrillion floating point operations per second.
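The ladder of prefixes above can be written down as a quick reference, each rung a factor of 1000 above the last:

```python
# FLOPS prefixes as powers of ten (a quick-reference sketch).
prefixes = {
    "megaFLOPS": 10**6,   # a million operations per second
    "gigaFLOPS": 10**9,   # a billion
    "teraFLOPS": 10**12,  # a trillion
    "petaFLOPS": 10**15,  # a quadrillion
}

# Each step up the ladder is a factor of 1000.
assert prefixes["petaFLOPS"] == 1000 * prefixes["teraFLOPS"]
```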

Is this fast?  You bet. At the human level, if you could multiply two such numbers every second, working 24/7, you would make about 32 million computations in a year. At that pace it would take you more than 31,000 years to match what a one-teraFLOPS machine does in a single second. Modern supercomputers are much faster still.  Let’s start with the fastest.
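The 31,000-year claim is easy to verify with the rough figures from the text (32 million seconds per year, one computation per second by hand):

```python
# Back-of-envelope check: how many years of one-per-second hand
# computation equal one second of a 1-teraFLOPS machine?
seconds_per_year = 32_000_000   # rough figure used in the text
tera = 10**12                   # operations in one teraFLOPS-second

years_by_hand = tera / seconds_per_year
print(round(years_by_hand))     # 31250 -- "more than 31,000 years"
```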

The current record as of June 2016 is held by the Sunway TaihuLight at the National Supercomputing Center in Wuxi, in China*. It is rated at 93 petaFLOPS, now written PFLOPS.  They want to bump this up to 130 petaFLOPS, presumably so they can do some serious computing – or maybe just show off. A really fast modern desktop you can buy clocks in at about 10.5 gigaFLOPS (get the Intel i7-5820K @3.3 GHz chip set).  The ratio is about 9,000,000.  So, the Sunway TaihuLight can make about 9 million calculations in the time your “speedy” machine makes one.  Japan is investing $173 million to build machines even faster.
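That 9,000,000 ratio is just the two ratings from the text divided out:

```python
# Supercomputer vs. fast desktop, using the figures quoted above.
taihulight = 93e15    # 93 petaFLOPS
desktop = 10.5e9      # 10.5 gigaFLOPS (the i7-5820K figure from the text)

ratio = taihulight / desktop
print(f"{ratio:,.0f}")   # about 8,857,143 -- roughly 9 million
```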

Historically, in late 1996 Intel's ASCI Red was the world's first computer to achieve one teraFLOPS. More recently in the USA, we have the Cray Titan at the Oak Ridge National Laboratory, clocking in at 17.59 petaFLOPS in 2012.  In the same year, the IBM Sequoia at the Livermore National Laboratory clocked in at 17.17 petaFLOPS.  Fast though these computers are, speeds keep increasing by large factors year by year.  Getting a bit technical, the computer architectures play a role, and those used for these machines have been about maxed out.  Further speedups can come only from shrinking the scale of components and from changes in architecture. All of these machines use multiple processors.

Returning to the real world, I use a Surface Pro 3 computer, which features an Intel Core i7-4650U @1.7 GHz processor.  It clocks in at about 2.5 gigaFLOPS. I thought it was fast, but this speed is really slow compared with its big brothers, fully 37 million times slower. Put another way, it would take just over one year for my computer to compute what the Sunway TaihuLight machine can compute in one second.  (This uses the approximation of about 32 million seconds in one year. More exactly, there are 31,557,600 seconds in a year of 365 ¼ days.) You can test your computer with software located at http://www.passmark.com/.  Note that my humble computer is a mere 400 times slower than the fastest computer of just 20 years ago!
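The "just over one year" figure follows directly from the numbers in the paragraph above:

```python
# How long would my laptop need to match one second of TaihuLight?
taihulight = 93e15            # FLOPS, from the text
laptop = 2.5e9                # the Surface's i7-4650U figure from the text
seconds_per_year = 31_557_600 # a year of 365.25 days

slowdown = taihulight / laptop          # about 37.2 million times slower
years = slowdown / seconds_per_year     # about 1.18 years
print(f"{slowdown:,.0f}x slower; {years:.2f} years per TaihuLight-second")
```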

This is all ridiculous, you say.  Whatever could we need such speed for?  Actually, we do need the speed, and even more speed.  Many problems these days involve many thousands of variables, and solving even simple systems with so many variables can require trillions of calculations.  And that may be just one step of a trillion-step iteration.  Supercomputers are applied in a variety of fields including quantum mechanics, weather forecasting, climate research, airplane aerodynamics, nuclear weapons, biological macromolecules, and cryptanalysis.  So, speed is a vital factor when paired with modern problems that genuinely need solving.

*  These records are in flux: the record has changed 18 times in the last 23 years.  Expect a new record in 2017. Linux seems to be the operating system of choice, overwhelmingly dominating all contenders.  The Japanese, via Hitachi, hope to break the exaFLOPS (quintillion) barrier by 2020. An exaFLOPS machine is rated at 1000 PFLOPS. Such speeds are, to this author, utterly incomprehensible.

References
https://en.wikipedia.org/wiki/Supercomputer - the idea and details of supercomputers
http://www.passmark.com/ - benchmark testing your machine
https://en.wikipedia.org/wiki/TOP500 - the fastest 500 supercomputers
