What is Larrabee? Well, before we get into that, let me explain it from the beginning, and damn, I'm sure it's full of confusing terms and history.
A recap of history
Back in the Pentium 4 era, Intel swore they would break the 4 GHz stock clock, but instead the Pentium 4 topped out at 3.8 GHz on air cooling and heated up like unstable plutonium; any further clock increase needed extreme cooling, and it lost out to the Athlon 64 series in price, performance, and thermals. That was the end of the MHz/GHz race and the start of the performance-per-MHz race, the so-called clock-efficiency race.
What made the AMD Athlon 64 so much better than the P4? A 2.0 GHz Athlon 64 cost almost 30~40% less and was about as fast as a 3.2 GHz Pentium 4.
This was because of AMD's integrated memory controller and HyperTransport (not to be confused with Hyper-Threading), a two-way direct link between the processor, chipset, and RAM. The older Athlon XP and Athlon designs, like the Pentium designs they followed, used a single shared bus, except that the P4's was a quad-pumped bus. Intel kept using the quad-pumped bus right up through the recent Core 2 series.
When Intel noticed they could no longer improve the old tech, they invested heavily in designing a new one during the Core 2 fever. At the time, Core 2 was ahead of the Athlon 64 in price, thermals, and performance, and even triumphed over AMD's first Phenom line.
After Core 2, Intel jumped to integrating the memory controller into the Core i7 processor and adopted a new bus design named QPI (QuickPath Interconnect). As we all believe, this too will eventually be unable to improve anything except clock speed and efficiency, until it hits the wall.
As history repeats itself, every new thing becomes the old thing, and we keep searching for the new. The future of the CPU is the combination of a CPU with a GPU (processor plus graphics processor).
The recently hyped Larrabee made the news by combining CPU and GPU to render 3D games in real time. Some sources claim it ran the games at lower quality and resolution, while others defended it as running at high resolution and quality. From what I gathered, Larrabee still hasn't run anything as taxing as Crysis, UT3, or STALKER; the graphs shown on Wikipedia and other sites are simulated performance projections, not actual benchmark results.
Well, based on my experience, if it really can run at high quality, then it's a seriously hot piece of silicon that needs to be cooled by running the computer remotely at the North Pole, or by a heat sink the size of your head with heat pipes as thick as hot dogs.
(The north is running out of ice!)
Somehow, just recently, Intel cancelled Larrabee for the consumer market and repositioned it for general-purpose and enterprise use, for example the movie and entertainment industries or scientific organizations.
Still, it's not as cool as having these...
You could fold your wallet 200x if you could do that. :P