Wednesday, April 20, 2011
If you thought your PC was fast, wait until you see what Tianhe-1A can do: the Intel- and Nvidia-powered supercomputer can do in a day what it would take a dual-core personal computer 160 years to complete.
It's a serious bit of kit, and it's not the only supercomputer with more cores than we've had hot dinners.
Every six months, the TOP500 project ranks the world's most powerful computers - and right now, these are the top ten machines the world has ever seen.
1. The slightly mysterious Chinese one: Tianhe-1A
China's supercomputer is currently the world's fastest: it can run at a sustained 2.5 petaflops (a petaflop is a thousand trillion floating point operations per second) thanks to its 186,368 cores and 229,376GB of RAM.
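A quick back-of-envelope check shows how the "160 years in a day" comparison works. The desktop figure below is an assumption, not from the article: a 2011-era dual-core chip sustaining roughly 45 gigaflops.

```python
# Back-of-envelope check of the "160 years in a day" claim.
# Assumption (not from the article): a dual-core desktop of the era
# sustains roughly 45 gigaflops; the real figure varies by chip.

TIANHE_1A_FLOPS = 2.5e15   # 2.5 petaflops, sustained
DESKTOP_FLOPS = 45e9       # assumed dual-core throughput

# Operations Tianhe-1A completes in one day
ops_per_day = TIANHE_1A_FLOPS * 86_400

# Time the desktop would need for the same work, in years
desktop_years = ops_per_day / DESKTOP_FLOPS / (86_400 * 365)

print(f"{desktop_years:.0f} years")
```

With these assumed numbers the desktop needs roughly 150 years, which is in the same ballpark as the headline claim.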
While the horsepower comes from off-the-shelf Intel and Nvidia chips, the New York Times says that the Chinese machine's speed is down to its interconnect, the networking technology that connects the individual nodes of the computer together, which is twice as fast as the InfiniBand technology used in many other supercomputers.
It's located at the National Supercomputing Centre in Tianjin, where it's used by universities and Chinese companies.
Image credit: Nvidia
2. The one with a quarter of a million cores: Jaguar
Jaguar, a Cray XT5-HE supercomputer located at the US Oak Ridge National Laboratory, has quite a few cores: TOP500 says there are nearly a quarter of a million since its most recent upgrade.
Jaguar's 224,162 cores come courtesy of a whole bunch of six-core Opteron chips, and its performance is a hefty 1.76 petaflops. Oak Ridge says it's the world's fastest supercomputer for unclassified research.
Image credit: NCCS.gov
3. The other slightly mysterious Chinese one: Dawning Nebulae
When it launched in early 2010, the Chinese Dawning Nebulae supercomputer was the world's fastest, with performance of 1.27 petaflops, but it has already slipped to third place behind Jaguar and China's own newer, faster Tianhe-1A. Nebulae is housed in the National Supercomputing Centre in Shenzhen.
Image credit: Nvidia
4. The one with the rubbish name: TSUBAME 2.0
Tokyo's TSUBAME 2.0 offers similar performance to Jaguar - it peaks at 2.3 petaflops, with sustained performance of around 1.4 petaflops - but it's one-quarter of the size and uses one-quarter of the power thanks to its heavy reliance on Nvidia Fermi GPUs as well as Intel CPUs.
According to project lead Professor Satoshi Matsuoka, TSUBAME will really shine in climate and weather forecasting, biomolecular modelling and tsunami simulations.
5. The planet-saver: Hopper
Hopper is working on the big stuff: climate change, clean energy, astrophysics, particle physics and more. Its home, the US Department of Energy's National Energy Research Scientific Computing Center, offers its services to more than 3,000 researchers in climate research, chemistry, new materials development and other crucial fields.
Image credit: NERSC
6. The French one: Tera-100
The first petaflop-scale supercomputer to be designed and built in Europe is pretty fast: "its capacity to transfer information is equivalent to a million people watching high-definition films simultaneously", the press release says.
Built around Intel Xeon 7500 processors, it's 20 times faster and seven times more energy-efficient than its predecessor, 2005's Tera 10. It's another nuclear one: Tera-100's mission is to help guarantee the reliability of France's nuclear deterrent.
Image credit: Bull
7. The former champion: Roadrunner
Supercomputing is a fast-moving field, and Roadrunner is proof: in 2008 it was the first supercomputer to break the petaflop barrier for sustained performance, but at 1.04 petaflops it fell to seventh place in just two years.
Built by IBM for the US Department of Energy, it was designed to work out whether the US's nuclear weapons would remain safe as they age - although like most supercomputers it's also available to industry, with car and aerospace industries paying for a go.
Image credit: IBM
8. The answer to life, the universe and everything: Kraken
Can your computing project be handled by a machine with 511 cores? Then don't bother coming to Kraken: it's best suited to jobs that use "at least 512 cores". It's got plenty to spare: the National Institute for Computational Sciences reports that the Cray supercomputer has 112,895 compute cores spread across 9,408 nodes.
Its purpose? To help "solve the world's greatest scientific challenges, such as understanding the fundamentals of matter and unlocking the secrets to the origin of our universe".
Image credit: UTK
9. The ultimate DVD ripper: JUGENE
Germany's supercomputer was designed for low power consumption as well as high performance, and it's been involved in some interesting projects - including trying to work out how DVDs work. According to Scientific Computing, it's improving our understanding of "the processes involved in writing and erasing a DVD", which should lead to storage media that works better, lasts longer and provides higher capacity.
Image credit: Forschungszentrum Jülich
10. The one keeping nukes safe: Cielo
Nuclear science and supercomputers are a match made in heaven: the former can use the latter to test things without blowing anybody up or irradiating them for generations. Cielo is used for "classified operations" by the US National Nuclear Security Administration, and it's getting a big upgrade this year: its 6,704 computing nodes will be upped to 9,000, and its memory will go from 221.5TB to around 300TB.
Image credit: NNSA
The ones coming soon: Titan and Sequoia
Will China lose its fastest computer crown in 2012? That's when the US Titan supercomputer, a $100 million machine for the US Department of Energy, comes online - and it's also when IBM's Tianhe-1A-rivalling Sequoia gets plugged in too. Titan is designed to analyse complex energy systems, while Sequoia will work on simulations of nuclear explosions to reduce the need for real-world tests.
Last year, Twitter made it a goal to increase monetization of its social networking site by implementing promoted tweets and running ad campaigns with major brands.
This year, Twitter hopes to make the site more accessible and broaden its appeal to the masses.
To that end, the San Francisco-based company promoted Dick Costolo to CEO in 2010, hoping he could successfully monetize Twitter. Co-founder Evan Williams has described Costolo's tenure as successful, saying, "During his year at Twitter, he has been a critical leader in devising and executing our revenue efforts, while simultaneously and effectively making the trains run on time in the office."
This year, Twitter has re-hired Jack Dorsey, whose main role will be to make Twitter more useful.
Dorsey officially returned to Twitter three weeks ago, after being pushed out by the board two and a half years ago.
At a talk at Columbia, Dorsey said, "We have a lot of mainstream awareness, but mainstream relevancy is still a challenge.”
The majority of people still see Twitter as a way to update a status, much like Facebook. One of Twitter’s main goals for 2011 is to highlight the usefulness of Twitter by taking a hyper local approach.
Sources say that when someone signs on to Twitter, the company hopes to highlight tweets from their immediate area. Whether it's from a local politician or musician, the idea is to show new users how Twitter is relevant to their lives, thus prompting them to use the site.
Jonathan Strauss, chief executive at Awe.sm, said: "Most people understand Twitter exists, but they don't understand what Twitter is and how they can participate."
Besides promoting local tweets to add relevance, Twitter is hoping to add tools for power users. Recently, TG Daily reported rumours that Twitter might buy TweetDeck, a tool that lets users filter the Twitter conversation and track their accounts in a more advanced way, for $50 million.
Aside from promoting local tweets, Twitter already lets users create lists that group particular accounts together. A person familiar with the matter said Twitter is looking into ways to promote certain lists to make them more accessible to newbies.
Twitter is also exploring "EdgeRank," a system that highlights posts by a user’s closest friends.
"Most of the time what people want is the most relevant and important information, and without filtering its content for individual users that's difficult for Twitter to satisfy," said Strauss.
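The EdgeRank-style approach the article describes can be illustrated with a toy sketch. Everything here is hypothetical - the article only says Twitter is exploring such a system - but the classic shape is affinity (how close the author is to the reader) times engagement weight times a freshness decay:

```python
# Toy sketch of an EdgeRank-style feed score. All names and weights are
# hypothetical illustrations, not a real Twitter or Facebook algorithm.

def edge_score(affinity, engagement_weight, age_seconds, half_life=3600.0):
    """Higher affinity/engagement and fresher posts score higher."""
    decay = 0.5 ** (age_seconds / half_life)   # halve the score every hour
    return affinity * engagement_weight * decay

tweets = [
    {"author": "close_friend",  "affinity": 0.9, "engagement_weight": 1.0, "age_seconds": 7200},
    {"author": "local_band",    "affinity": 0.3, "engagement_weight": 3.0, "age_seconds": 600},
    {"author": "distant_brand", "affinity": 0.1, "engagement_weight": 5.0, "age_seconds": 60},
]

ranked = sorted(
    tweets,
    key=lambda t: edge_score(t["affinity"], t["engagement_weight"], t["age_seconds"]),
    reverse=True,
)
for t in ranked:
    print(t["author"])
```

Note how the decay term lets a fresh post from a loosely connected account outrank an hours-old post from a close friend - exactly the kind of per-user filtering Strauss is talking about.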
Twitter is clearly an awesome idea that has caught on in a major way. That said, it’s about time Twitter finally understands that users want to access and filter information in new ways. No doubt the social networking site will be seeing some serious changes (hopefully for the better) in the coming months.
Don't bother buying a PlayBook if you have an AT&T BlackBerry. Apparently the carrier blocks syncing with RIM's new tablet, which is one of its most important features.
The PlayBook doesn't have its own email client, calendar app, instant messenger or contact list. But when it's wirelessly linked to a BlackBerry, all those features are instantly enabled.
Pairing the two devices is pretty neat, giving you a bigger screen for all the content on your phone.
That is, when it actually works. Customers with a BlackBerry on AT&T - ironically one of RIM's strongest partners - receive the following message instead: "This application is not available on your device or for your carrier."
Neither AT&T nor RIM has explained why it is the only carrier blocking the connectivity option, but some have suggested AT&T wants to charge an additional fee for the service.
When it comes to the PlayBook, RIM can't afford any bad publicity or snags. It is the most important product the company has released in a long time, and could be the thing that brings it back from impending doom. But it could also be a nail in the coffin...