
Microprocessor Architectures in the top500 list.

The top500 list has been swamped by x86 machines over the last several years.  Anyone who knows me is aware that I am NO fan of the x86 ISA.

Somewhat heartening is that the most recent list has seen a resurgence of other architectures.

The Power architecture from IBM has been a fixture on the list for a long time and is still a strong contender, even though it has fallen to No. 12 from No. 9 last time and No. 1 in November of 2007.

A SPARC machine has captured the No. 1 spot this year and is actually as powerful as the No. 2 through No. 6 systems combined.
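
As a quick sanity check on that claim, here is a rough back-of-the-envelope comparison in Python. The Rmax figures are approximate numbers for the June 2011 list and are included here only as assumptions for illustration; the post above does not quote them.

# Compare the No. 1 machine's sustained Linpack performance (Rmax)
# against the sum of the No. 2 through No. 6 systems.
# These Rmax values (in petaflops) are approximate June 2011 figures,
# used purely for illustration.
rmax_pflops = {
    "K computer (SPARC64)": 8.16,
    "Tianhe-1A": 2.57,
    "Jaguar": 1.76,
    "Nebulae": 1.27,
    "Tsubame 2.0": 1.19,
    "Cielo": 1.11,
}

top = rmax_pflops["K computer (SPARC64)"]
rest = sum(v for name, v in rmax_pflops.items() if name != "K computer (SPARC64)")

print(f"No. 1:            {top:.2f} PFLOPS")
print(f"Nos. 2 through 6: {rest:.2f} PFLOPS")

With these numbers the No. 1 machine comes out just ahead of the next five systems put together.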

The Chinese MIPS-based Loongson processor is starting to make some noise, and the Dawning 6000 is reported to eventually reach a petaflop of computing power.

One other note is that an increasing number of machines on the top500 list rely heavily on GPUs.  The No. 2 machine on the current list, the Chinese Tianhe-1A, comprises 14,336 Intel Xeon processors and 7,168 Nvidia Tesla GPUs.
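
To get a rough feel for where that performance comes from, here is a sketch that estimates the theoretical peak from the processor counts above. The per-device peak figures are assumptions on my part (roughly the double-precision peaks of a 2.93 GHz six-core Xeon and a Tesla M2050), not official TOP500 numbers.

# Back-of-the-envelope estimate of theoretical peak (Rpeak) for a hybrid
# CPU+GPU machine, using the processor counts quoted above.
# Per-device peaks are assumptions, in double-precision GFLOPS.
CPU_COUNT = 14_336
GPU_COUNT = 7_168

CPU_PEAK_GFLOPS = 70.3   # assumed: 2.93 GHz * 6 cores * 4 flops/cycle
GPU_PEAK_GFLOPS = 515.0  # assumed: Tesla-class GPU, double precision

cpu_total = CPU_COUNT * CPU_PEAK_GFLOPS
gpu_total = GPU_COUNT * GPU_PEAK_GFLOPS

print(f"CPU contribution: {cpu_total / 1e6:.2f} PFLOPS")
print(f"GPU contribution: {gpu_total / 1e6:.2f} PFLOPS")
print(f"Estimated peak:   {(cpu_total + gpu_total) / 1e6:.2f} PFLOPS")

Run as written this lands at roughly 4.7 PFLOPS, with the GPUs contributing the large majority of the theoretical peak.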

Now I guess we just have to wait to see an ARM-based system hit the list.
