New Intel Chip Threatens Video Card Obsolescence with Discrete Graphics Processor

With a new chip unveiled by hardware maker Intel, the company hopes to take graphics processing back to the x86 instruction set while still offering DirectX and OpenGL support, according to TechReport. The chip will be offered as a discrete chip on motherboards as well as a standalone processor to compete directly with the GeForce and Radeon products.

Dubbed "Larrabee," the chip was unveiled at this year's SIGGRAPH conference and sports a variety of new technical features, including a fully coherent memory subsystem that allows for more efficient multi-chip implementations.

Should the x86-based graphical rendering capabilities prove viable to developers, Larrabee could serve as a cost-efficient alternative to expensive PC video cards, such as those produced by AMD and Nvidia. TechReport explains the chip and its features as follows:

One potential positive of Larrabee's fully coherent memory subsystem is the possibility of much more efficient multi-chip implementations. Nvidia and AMD are essentially managing the coherency problem manually via custom game profiles for multi-GPU setups right now. When I asked about this issue, Intel said it didn't expect to have the same pain as its competitors in this area.

The possible downsides of all-software implementations of things like the render back-end are also rather apparent. We saw this illustrated nicely when the Radeon HD 4800 series brought a vast improvement over the shader-based MSAA resolve used in the Radeon HD 2900 and 3800 series products.

Somewhat comically, the initial reactions to the Larrabee architecture disclosures from Intel are mixed along clear dividing lines. When I pinged David Kanter, the CPU guru at Real World Tech, about it, he was very much impressed with the choices Intel had made and generally positive about the prospects. Meanwhile, Rys over at Beyond3D expressed quite a bit of skepticism about the chip's likely performance and area efficiency for graphics versus more traditional architectures.

Intel expects engineering samples of the chip by the end of the year, followed by a general market release in late 2009 to early 2010.

From The Chatty
  • reply
    August 4, 2008 1:31 PM

    John Carmack rails against this sort of thing in his QuakeCon interview on

    • reply
      August 4, 2008 1:41 PM

      What are his reasons? If it's better than the current technology, or at least has a better quality/price ratio, then there's no reason to be 'against' it, unless you're AMD, Nvidia, or someone who just bought a monster graphics card. Even then, late 2009 is still far away, from a consumer standpoint at least.

      • reply
        August 4, 2008 6:48 PM

        Because John Carmack just made you his bitch.

    • reply
      August 4, 2008 1:45 PM

      If you remove the software layer for APIs like DX or OpenGL, then yes. For me it's still too early to get a clue about where Intel is heading with Larrabee: rasterized graphics <> software rendering

    • reply
      August 4, 2008 2:43 PM

      You're totally off. (But then, so is the main article.) John Carmack railed against ray tracing. Ray tracing is a stupid, low-value idea that Intel is using to FUD the graphics market until they can get their own raster GPU in.

      His main point was that although he thinks ray tracing is inefficient, and rasterization is fast and starting to hit its peak, there are other techniques that neither GPU companies nor CPU companies are pursuing, because nobody has created any actual software to run on the hardware.

      He's right, but it is kind of a stupid chicken-and-egg complaint. What there needs to be is an open-source project to create a good sparse voxel octree library/engine, so that GPU companies have no reason not to develop for it.
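      For readers unfamiliar with the structure being discussed: a sparse voxel octree recursively divides a cubic volume into eight octants, allocating child nodes only along paths that actually contain geometry, so empty space costs nothing. This is a minimal illustrative sketch (not Carmack's or anyone's actual engine code; the class and method names are invented for the example):

```python
# Minimal sparse voxel octree sketch: stores occupied voxels in a cubic
# grid of side 2**depth, allocating child nodes only where voxels exist.
class SVONode:
    __slots__ = ("children",)

    def __init__(self):
        self.children = [None] * 8  # one slot per octant; None = empty


class SparseVoxelOctree:
    def __init__(self, depth):
        self.depth = depth          # tree height; grid side = 2**depth
        self.root = SVONode()

    def _octant(self, x, y, z, level):
        # Child index comes from the bit of each coordinate at this level:
        # bit 0 = x half, bit 1 = y half, bit 2 = z half.
        bit = 1 << level
        return ((1 if x & bit else 0)
                | (2 if y & bit else 0)
                | (4 if z & bit else 0))

    def insert(self, x, y, z):
        # Walk from the root toward the leaf, creating nodes as needed.
        node = self.root
        for level in range(self.depth - 1, 0, -1):
            i = self._octant(x, y, z, level)
            if node.children[i] is None:
                node.children[i] = SVONode()
            node = node.children[i]
        node.children[self._octant(x, y, z, 0)] = True  # leaf: occupied

    def contains(self, x, y, z):
        # Walk the same path; any missing child means empty space.
        node = self.root
        for level in range(self.depth - 1, 0, -1):
            node = node.children[self._octant(x, y, z, level)]
            if node is None:
                return False
        return node.children[self._octant(x, y, z, 0)] is True
```

      The point of the sparseness is that a mostly empty 16x16x16 volume here holds only the handful of nodes along occupied paths, rather than 4,096 cells.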

      • reply
        August 4, 2008 2:54 PM

        Sparse voxel octrees are Carmack's way to implement ray tracing, slated for Tech 6, so maybe you're off here too.
