WE GOT INTEL’S PROTOTYPE GRAPHICS CARD!!


Intel keeps very careful track of its internal engineering samples, going to great lengths to ensure that if they leave the lab, it is in pieces so small that they could never be reassembled. So the first question we need to answer is: how did we get our hands on this thing? eBay, obviously. What can’t you buy on eBay? A sponsorship on Linus Tech Tips. For that you have to talk to Colton, like Corsair did. Corsair’s Dark Core SE RGB wireless mouse features 1ms 2.4GHz and low-latency Bluetooth connectivity.
Check it out at the link below. [intro music] So the seller turned out to have been a contractor at Intel a few years ago who, in the middle of a sixth-floor renovation, went dumpster diving through the boxes full of “junk” destined for the e-waste pile. Apparently it was a treasure trove of press samples, laptops, and this GPU-looking thing, which was noteworthy for being blue instead of green or red. You see, by that point project Larrabee – this, sort of – had been cancelled for years, and how many years depends on which cancellation you’re going by. Talking to Tom Forsyth, who was one of the key team members, he figures they got cancelled anywhere from four to five times, and he remembers getting these weird memos. “Yeah… you guys are gonna see some headlines. It’s just a thing. None of you were laid off. Just keep on working. BT-dubs, you’ve been rebranded Xeon Phi. Thanks, bye.”

So what is this thing? That’s actually a somewhat complicated question, but because it’s got a DVI port, not to mention DisplayPort and HDMI, soldered onto it, it is technically an engineering sample board for Intel’s first and, to date, only dedicated graphics card. Now, most people who follow the mainstream tech press believe that project Larrabee was an abject failure, but as is so often the case, the truth is stranger than fiction. Not only was it a success, but it powered Tianhe-2, which was the world’s most powerful supercomputer for over two years, and ten years later you can actually still buy its descendants, either in socketed form, as we reviewed just last year, or on Amazon for a cool 1,500 greenbacks.

So as it turns out, the goal of the program was never actually to create a gaming GPU. That was just a workload that was already fairly well understood at the time because, you’ve got to remember, back in the mid-2000s the idea of using a GPU as a general-purpose computing unit was just emerging. So this idea of using it for gaming was actually just a small part of a business case to build a processor with many highly efficient x86 cores that could be easily slotted into these powerful supercomputers. But that doesn’t mean it couldn’t have been used for gaming. In fact, by the time they wound down the units that were working on graphics, they had about 300 of the top-selling games on Steam running on the thing, with a card just like this one as the only GPU in the system.

And the way this whole thing worked is incredible. Now, a normal graphics card, or GPU rather, uses a lot of fixed-function hardware, so if you told it “okay look, I don’t need shaders, just draw a ton of tiny lines with really nice anti-aliasing” – so pretty much CAD, in a nutshell – it would use only a fraction of its hardware. But with Larrabee, everything is software, so the whole chip is lit up doing that, which actually helped to offset the x86 overhead a fair bit. This was the fastest CAD card at the time, and it had other benefits. With regular GPUs, you might run into a situation where enabling a particular feature in a game hits the AMD users a lot harder than the Nvidia users, or vice versa. So during development, AMD and Nvidia both have to guess as best they can what the next couple of years of games will demand, then look into their crystal ball and build their hardware around that. Larrabee? No such limitation. This thing is a full-blown computer with up to 61 quad-threaded cores running a normal operating system like FreeBSD. Like, you could actually telnet into the thing, run a top command, and see a list of all the processes running on it, and if you were running a game you’d see, I don’t know, 128 or 200 processes called “DirectX graphics” – and you could do that while the thing was working! So if you wanted, you could cordon off some of the cores and use them for something else, or you could just yolo it, throw another workload into the mix, and let the processor manage itself.
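To make “everything is software” a bit more concrete, here’s a minimal sketch – not Intel’s code, just an illustration, and every name and number in it is made up – of how a many-core x86 part keeps the whole chip busy: carve the framebuffer into tiles and let every hardware thread pull tiles from a shared counter, whether the work is shaded triangles or plain anti-aliased CAD lines.

#include <algorithm>
#include <atomic>
#include <cstdint>
#include <thread>
#include <vector>

constexpr int kWidth = 1024, kHeight = 1024, kTile = 64;
constexpr int kTilesX = kWidth / kTile, kTilesY = kHeight / kTile;

std::vector<uint32_t> framebuffer(kWidth * kHeight);

// Stand-in for "whatever the software pipeline does to a pixel":
// here just a trivial pattern based on position.
uint32_t shade(int x, int y) {
    return static_cast<uint32_t>((x * 31 + y * 17) & 0xFFFFFF);
}

// Each tile is rendered entirely in software; there is no fixed-function
// stage left idle if the workload is "just lines" or "just shading".
void renderTile(int tx, int ty) {
    for (int y = ty * kTile; y < (ty + 1) * kTile; ++y)
        for (int x = tx * kTile; x < (tx + 1) * kTile; ++x)
            framebuffer[y * kWidth + x] = shade(x, y);
}

int main() {
    std::atomic<int> nextTile{0};
    // On Knights Corner this would be up to 61 cores x 4 threads each;
    // here we just use whatever the host machine offers.
    unsigned nThreads = std::max(1u, std::thread::hardware_concurrency());
    std::vector<std::thread> workers;
    for (unsigned t = 0; t < nThreads; ++t) {
        workers.emplace_back([&] {
            for (;;) {
                int i = nextTile.fetch_add(1);   // pull-based scheduling
                if (i >= kTilesX * kTilesY) return;  // no tiles left
                renderTile(i % kTilesX, i / kTilesX);
            }
        });
    }
    for (auto& w : workers) w.join();
}

The pull-based tile counter is the point: tiles cost wildly different amounts of work, and grabbing tiles dynamically keeps a couple hundred hardware threads fed without any static assignment – which is also why you could cordon some cores off for another job and let the rest carry on.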
The only non-programmable hardware on this puppy is the texture unit, which takes very simple commands. I mean, wrap your brain around this: the thing that I’m looking at right here is Intel’s first-ever DirectX 11 GPU, even though it was built before DirectX 11. This was possible because all of those graphics card features that normally run in hardware are just running in software, so you could actually update it to DirectX 11 or DirectX 12 with a driver update.

Now, there are some caveats here. I mean, there’s a reason that the thing never made it into a computer near you. It wasn’t as efficient as a dedicated graphics card for a lot of things, so it only got about a quarter of the performance in games of a comparably power-hungry card from AMD or Nvidia at the time. But it was really good at certain graphics workloads for a number of reasons, and if you think about how far off they were, considering that they were effectively emulating dedicated hardware, it’s damn impressive.

So… what happened? Well, management happened. Intel, at its core, ha-ha, is a hardware company, so they wanted all the features completed so they could either ship this thing or can it, because in the hardware world, making up a four-times difference in performance is impossible and you might as well just pull the plug. But the team wanted to work on performance optimization instead, because in the software world it’s not unheard of to go from, like, two pixels showing up on a screen and dog slow to a hundred times faster in a week, if you have a breakthrough. And it got to the point where they had to have separate teams for performance and for features, just to get management off their backs. The performance team actually got Quake running really fast, but then they found out that Quake was this weird edge case and the architecture would have to be completely redone. To give you some idea of the dysfunction, at one point there were three or four software teams with different ideas, working on different rendering architectures.

But depending on who you ask, continued development would have been worth it. I mean, imagine this: instead of turning anti-aliasing on for an entire scene, imagine if a game developer could say, “Well, you know what? This sky is not important to be anti-aliased. Why don’t we just focus all of our AA on, you know, these characters here, or this foliage there?” Or how about this: “Oh crap, that texture wasn’t loaded. You know what, let’s just procedurally generate a placeholder.” Boom.
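That per-region anti-aliasing pitch is the most concrete of those ideas, so here’s a second hedged sketch in the same spirit: nothing in a software pipeline forces one sample count for the whole frame, so a renderer could – hypothetically, none of this is Larrabee’s actual interface – let the game tag regions by importance and spend supersamples only where they matter.

#include <cstdio>

enum class Importance { Sky, Background, Hero };

// Hypothetical budget: samples per axis, so 1x1 for sky (no AA),
// 2x2 for background, 4x4 for characters and foliage.
int samplesPerAxis(Importance imp) {
    switch (imp) {
        case Importance::Sky:        return 1;
        case Importance::Background: return 2;
        case Importance::Hero:       return 4;
    }
    return 1;
}

// Stand-in for evaluating the scene at a sub-pixel position.
float sampleScene(float x, float y) {
    return 0.5f * (x - int(x)) + 0.5f * (y - int(y));
}

// Plain supersampling, but with the sample count chosen per pixel
// (in practice per tile or per object) instead of per frame.
float shadePixel(int px, int py, Importance imp) {
    int n = samplesPerAxis(imp);
    float sum = 0.0f;
    for (int sy = 0; sy < n; ++sy)
        for (int sx = 0; sx < n; ++sx)
            sum += sampleScene(px + (sx + 0.5f) / n, py + (sy + 0.5f) / n);
    return sum / float(n * n);
}

int main() {
    // Same pixel, very different cost: one sample for sky, sixteen for a hero.
    std::printf("sky:  %f\n", shadePixel(10, 20, Importance::Sky));
    std::printf("hero: %f\n", shadePixel(10, 20, Importance::Hero));
}

Same pixel, wildly different cost – a per-region knob that the fixed-function pipelines of that era didn’t really expose to developers.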
Arguably the stupidest decision that was made was to have the Larrabee graphics team and the Gen graphics team – which is what Intel calls its integrated graphics internally – compete for the same budget, and then make internal presentations arguing why their approach was the future and the other group’s was not, even though they were both perfectly suited to what they were doing. Larrabee was never going to be a 5-watt part that you could fit right into a CPU, and a 200-watt PCI Express part was nowhere on the roadmap for Gen.

So, what I’ve got here (come on, come on, come on) is not “Knights Ferry” – that was the first Larrabee revision, and it had some deal-breaking bugs. Apparently the saying in the hardware industry is “always plan to make a prototype, since you’ll end up making one anyway”. So this is Knights Corner, and it probably has anywhere from 6 to 16 gigs of RAM and up to 62 cores, depending on how many of them had manufacturing flaws.

Should we fire it up? I mean, come on, I wasn’t not gonna do that at this point. I spent like $400 on this thing off of eBay. I’ve got no drivers for it, and this is the first time I’ve turned it on, so it is very possible that it won’t manage to display anything even in 2D, but I definitely… have to try. By the way, if anyone out there has the secret-sauce drivers, or has access to the secret-sauce drivers, that would make this run games, please hit me up. I mean, assuming that it even works, which we don’t know yet. I actually haven’t tried this; I wanted to save the suspense for the video.

This is, like, far more POST codes than I’m accustomed to seeing. But it hasn’t stopped, and it hasn’t, like, rebooted. We’ve got some kind of uh… we’ve got some kind of LED on here. It looks like it stalled on D6, but I don’t know what that is. Now, when I talked to Tom, he did specifically mention it’s got DVI soldered to it. I don’t know if that’s because DVI was the most relevant output at the time, so that’s what they used internally, or if the DisplayPort and HDMI were just dummies and DVI was the only thing that actually worked, so… take two: I’m gonna run and grab a DVI monitor and try this again.

Like, I kind of wonder about… you know what, it’s PCIe… I mean, would that even be gen two at that point? Like two thousand… 2007? 2009? I wonder about compatibility with a new board and stuff like that. You know what, I don’t think it’s gonna boot. Well, that’s pretty disappointing. I thought I might be onto something with the whole DVI thing. I’m just gonna try- I’m gonna try one other slot, just uh… I think there’s only one other one of these out in the wild, and some, like, Russian collector of weird hardware has it. Yeah. Not you, a different one. Okay, sometimes it hangs on 79 for a bit and then this thing boots, so that might have been a good sign. Oh no, that’s D6 again. I think it’s not going anywhere.

Well, that was disappointing, but I’m… I’m gonna let it keep trying while I tell you guys about Massdrop. Oh, I’m, like, sad. It’s, like, hard to have any energy. Okay, we’ll try that again. Massdrop! Massdrop is featuring the Sennheiser PC37X gaming headset. It’s got angled drivers and an open-back design, and the drivers actually come from the same family as the HD 598 and HD 600 headphones. They offer superior stereo imaging and locational accuracy, and come with a noise-cancelling microphone. They’re available on Massdrop at the link below, for a limited time, for just $120, so go check them out.

So thanks for watching, guys. If this video sucked, you know what to do, but if it was awesome, get subscribed, hit that like button – you can especially hit that like button if you want to make me feel better about how disappointed I feel right now. Uh… or you can check out the link to where to buy the stuff we featured in the video description.
Also linked in the description is our merch store, which has cool shirts like this one, and our community forum, which you should totally join. Oh, I really… I was really hoping I was just gonna get the screen to light up. That was all I really wanted. Good night, sweet prince. You were too good for this world.


100 thoughts on “WE GOT INTEL’S PROTOTYPE GRAPHICS CARD!!”

  1. I want that too. I want to do a lot of different testing on it. Looks too expensive tho😭😭😭😭😭

  2. Intel was doing ray tracing in-game on this thing back in 2010: https://www.youtube.com/watch?v=XVZDH15TRro Crazy. Nice find, Linus – did you ever get it working?

  3. 11:13 Linus, I think the "secret sauce" you were looking for is real! You can find it in Intel's internal memo about how AMD is a formidable competitor.

  4. I prefer the days when LTT wasn't a scripted, paid-actors YouTube channel… this over-the-top presenting is pretty annoying.

  5. Intel: WE GOT A GRAPHICS CARD LOL. Ryzen: but we've been selling graphics cards and processors since we were created

  6. But I mean, why hasn't Intel just glued a bunch of their integrated graphics cores together and made one larger GPU? They have drivers that run pretty much every game, so I don't really get what the issue is after some R&D work to make this thing legit.

  7. Stop humble bragging, stop taking individual donations. LMG is so strong you should take it public. That ship has sailed. You pay for a name brand on Corsair, not performance. Crucial ram rated at the same speed does the same thing. Just not so flashy looking. You tools that buy that shit are retarded. And he KNOWS this. But yet he will let you over pay for ram or a gpu. Hence the advertising.

  8. You are trying to control absolutely everything within your little micro dot of a company. Keep that cute face. You will need it come actual business time.

  9. He tests shit on cinebench instead of a voltmeter and an O scope. Which means he believes manufacturers' specs. Don't ever do that.

  10. If I had unlimited money I could just keep throwing money at it until the problem disappears. Or if I invested 2000.00 in a quality meter and an O scope, i could probably figure out what the fuck was going on.

    I am old. They come in a combined unit nowadays.

  11. I have never seen Linus explain anything on a DMM. He really doesn't know what the fuck he is talking about.

  12. Intel should continue the project even if they cancelled it. It's a really good idea to have it mostly software-based, as they could have a program in it that automatically updates the drivers, or automatically puts your games at the best performance without sacrificing the graphics. And with all of that, from what I can understand, it's basically a computer (all graphics cards are, but this one actually uses a CPU, from what I understand), so if they were to make it they could pretty much have the same system Google Stadia has right now, but able to play any game, with them having the beefy hardware. I don't know, it's just a theory

  13. Could you use something like this as a physics processor? Dedicated PhysX-style? (I never really got that to work for me.)

  14. If you still have this, LTT, there seems to have been a driver – since pulled – which seemed to have support for upcoming discrete GPUs from team Blue. Maybe those drivers would make this dev board function (somewhat…) as well.

  15. I actually felt bad for him. He looked like a kid hoping to find an Xbox under the Christmas tree and finding a sweater instead XD

  16. 9:26 sounds like an idea from a certain kind of manager… people did a lot of good prep work, then this manager dude stepped in and ruined everything, then the dude just transferred to another team and repeated it…

  17. And less than a year later, Intel announces discrete graphics cards:

    https://www.techradar.com/news/intel-graphics-cards

  18. Some cards don't POST when they are PCIe v1. Try an older motherboard such as an Intel "945GCNL"; if the card is good, it should POST on that board.

  19. I had my own Intel i740, PCI version. The card had an "AGP" interface as well as a standard "PCI" variant. Basically, Intel made GPUs before, but they were never truly successful, and even some of the higher-end ones had issues: one game would run great, and another game using a similar kind of Direct3D would be slow or have strange graphical bugs, much like newer onboard graphics by Intel still do, e.g. the HD 4000, HD 4600 and even later higher-end onboard chips. Fortnite with an HD 4600 has glitched shadowing, flickering colours and other problems, yet there are other 3D games that may work fine with that video chip. So, much like with the i740 back in the day, Intel has never made a truly powerful card that really works properly like AMD or Nvidia. Intel CPUs are very powerful and stable – I believe Intel has always made nice stable CPUs and has come a long way – but in the GPU world they never really "got it".

  20. [Question @Linus Tech Tips] Can you build a PC that unfortunately nobody does – a PC for really fast clicking, switching, loading and browsing for heavy users – and, for the idea, send it to me, please?

    The issue I am having with most PCs is that they can't cope with my speed of work. What I mean: for example, when I am programming I often need to copy-paste names of variables, and when I click too fast (with my perfect and cheap Dell cabled mouse) it selects the full line instead of just the variable name. Very frustrating, because when I then paste it somewhere, I need to go back and copy again. Or loading: when I start my PC I need to open 5-10 apps straight away, e.g. Chrome (with up to 130 tabs), an IDE, a git client, another IDE, a text editor, a music player, and a few more. When I work on a bigger project I sometimes need 4-5 IDEs open at the same time, and many times I open and close them because I am changing something in core files and need a restart. This costs me a lot of time, standing and looking at loading bars.

    So, in a nutshell: testing a setup which best fits loading files, loading data into RAM, scrolling, switching, browsing, loading websites – many of everything. My current setup was a very good setup 1-2 years ago:

    Samsung 970 EVO (500GB, M.2 2280)

    Intel Core i7-8700K (6, LGA 1151, 3.70GHz)

    Corsair Vengeance LPX (2x, 8GB, DDR4-3000, DIMM 288)

    be quiet! Dark Rock Pro 4 (16.28cm)

    ASUS ROG STRIX Z370-F GAMING (LGA 1151, Intel Z370, ATX)

    Phanteks Eclipse P400S Tempered Glass (Midi Tower, Insulated)

    Thermal Grizzly Kryonaut (11.10g, 12.50W/m K)

    GTX 950

    With a little XMP and an overclocked GPU, and still it is too slow, sometimes terribly slow!

    If you could do that, you would save me a lifetime!

  21. I have the drivers if you still want them. We received this card for test purposes at our Crytek headquarters in Germany.

  22. WE GOT INTEL'S PROTOTYPE GRAPHICS CARD!!
    Seriously, I got this bullshit 1 year later.
    So +1 year later, Linus, you are good.

  23. Honestly, if that's where you're going to give up, give it to me. I'll work on this thing until I have it working. I can't stand to see such a cool piece of tech go to waste and sit on a shelf again.

  24. Linus: stands still for more than 1 second
    Me: ok it’s sponsor time
    Linus: announces the sponsor
    Me: awwww maaaaan

  25. Wish we could see more of these cards being made. Like having an all-Intel setup: SSD, GPU, mobo, PSU, CPU, and RAM, all made by them!

  26. This was not Intel's "first dedicated graphics card." I owned Intel's first dedicated graphics card in, what, 1998? It was an AGP abomination called the i740. Meant to compete with the 3DFX Voodoo cards of the day, it had, well, lackluster driver support at best. Actually, it had no official driver support. It was a dog. For your edification: https://en.wikipedia.org/wiki/Intel740

  27. Linus is all lies when he says that this was the only/first Intel video card! The Intel i740 was their first discrete video card! https://en.wikipedia.org/wiki/Intel740
