Thunderbolting Your Video Card

I have been using the Core with a Titan XP for three months, and for GPU-intensive games it is awesome. The issue is with games that require heavy CPU usage. Doom runs at 100+ FPS with all settings maxed, but The Division, on the other hand, runs at 50 FPS. Both on a 3440x1440 ultrawide.

This CPU bottleneck gives me benchmark and FPS envy.

Despite the lost performance in some games, the ability to incrementally build a “desktop-like” experience with my existing laptop was key.

The question now is: do I give in to a full desktop and spend another $2000, or use the Skull Canyon with the Core for another year or so, for only $800?

You’re spot on about Mac users. I still have an aging, monster gaming rig. But given the hassles of Windows 10 and, now, NVIDIA’s ridiculous addition of a LOGON to their driver configuration software, I’ve decided that I’m not buying another PC. However, many of the games I like run just fine on a Mac, given sufficient horsepower. Civ V plays OK on my 15" MBPr, just on the embedded Intel graphics, but I hate the heat and fan noise it generates.

I’ve been looking at these external enclosures for a long time. What keeps me from buying them (in addition to the cost) is that you have to do some very foreign things to the kernel on a Mac to get them to work. After 19 years of jerking around with Linux on the desktop, it’s a hassle I’m not willing to undertake.

I said all of that to provide the context to say this: if Apple would officially support an external Thunderbolt enclosure (and popular game-playing video cards), it would be a DAY ONE purchase for me.

EDIT: I see from a comment above that the “Bizon” box has an installer, so there’s no monkeying around with kernel modules directly… I may have to reconsider this.

Seems like there might be a market opportunity for a graphics manufacturer who’s paying attention. Making an integrated unit would save on redundant connectors and improve the cooling. Might be cheaper too.

Once upon a time there were no external hard drives either.

I guess I should have read your comment before I made mine below. I should know that I’m not clever enough to have an original idea!

Very much a rumour, but there’s also this possibility:

If anyone is interested, I wrote an article a few months ago about building a super-simple, hack-free TB2 setup for use with older MacBooks. It’s not guaranteed to work with all configurations, and it’s only good for Boot Camp, but now I don’t feel the need to upgrade for another year or more.

I’d imagine laptops, mostly. Most PC gaming I’ve done has been sitting on the couch with a laptop while also watching TV with my wife.

The Akitio Node does the same thing as the Razer Core, for $299:

http://www.anandtech.com/show/10828/akitio-introduces-node-thunderbolt-3-egfx-box-for-299

Why are you using Razer? The Alienware 13 R3 has a built-in OLED display, and Dell’s Alienware Graphics Amplifier is way better than Thunderbolt 3.

You know, you could build a complete PC, with a GTX 1080, in a box that’s the same size as the Razer Core. And again, Linus has you covered: https://youtu.be/s2W0Lsf7hec

OLED truly is an awesome visual experience, but aren’t you concerned about what your display will look like in just under a year?
I’ve been using AMOLED phones for years now, and daily usage of Google Maps burns in two bars (top and bottom) in under a year; the display is almost unusable after two.
I bought an upgrade, also AMOLED, because people claimed burn-in was a thing of the past, and I am typing this comment over those exact two burnt-in bars…

Although I will SORELY miss OLED’s contrast and colours, I can’t be dealing with this anymore. I own a Dell XPS 12 convertible with the screen-ghosting issue that plagued the lineup. It gives me a very good indication of what burn-in will look like on a PC monitor: the taskbar with pinned/frequent apps, subtitle blur if you watch a lot of subbed content, and your browser’s top bar, complete with the address bar, icons, and frequented tabs.

My honest, unbiased two cents. Take it how you will.

I was playing the PS4’s new flagship game Horizon Zero Dawn on the old non-Pro version of the PS4 in 1080p, and when I turned on HDR it seemed like an entirely different set of shaders was used. The realism provided by better shadows was much more than a bump in pixels per inch would have provided… Maybe it’s a quirk of the PS4, and of how ruthless they have to be to get just a basic frame rate, but I wonder if the focus on pixels is a brute-force approach to a better experience. Google has improved the JPEG compression algorithm with “psychology”, and maybe something like that will be incorporated into future shaders… Also, putting more grunt into modelling characters’ expressions, lip-syncing, etc. would help more than more pixels, IMO.

There have been big improvements in the last couple of years, from what I understand. And phones (small displays) were the testbed. It is interesting that Apple won’t be using OLED until the upcoming iPhone 8 later this year, and given their… strictness… there might have been a reason for that.

That’s probably why there are literally zero OLED monitors you can buy for the PC yet. I hope that changes in the next 5 years, though!

Certainly 2016 was the sweet spot to date for OLED in TVs, where the price got down to “reasonable” and the technology got to a mature place with all the bugs worked out.

That is a cool case; I wish I could buy it, even with infinite cash… but the whole point of this exercise is to decouple the GPU upgrade cycle from the system upgrade cycle, so you can plug a monster GPU into any system and transform it.

Connecting a GPU via a cutting-edge external Thunderbolt 3 connection, at 40 Gbit/s (roughly 5 gigabytes/sec), is where the action is! That’s fascinating. Building up a whole new PC system is easy, it’s common, it’s just not… interesting.
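
For anyone who wants to sanity-check that figure, here’s the back-of-the-envelope arithmetic, sketched in Python. These are the published link rates, so treat them as upper bounds; real-world throughput is lower once protocol overhead is taken into account.

```python
# Thunderbolt 3's headline 40 Gbit/s link rate, converted to bytes.
tb3_gbit_per_s = 40
tb3_gbyte_per_s = tb3_gbit_per_s / 8   # 8 bits per byte -> 5.0 GB/s

# For comparison, a desktop PCIe 3.0 x16 slot moves roughly
# 985 MB/s per lane across 16 lanes, about three times as much.
pcie3_x16_gbyte_per_s = 0.985 * 16     # ~15.8 GB/s

print(f"Thunderbolt 3: {tb3_gbyte_per_s:.1f} GB/s")
print(f"PCIe 3.0 x16:  {pcie3_x16_gbyte_per_s:.1f} GB/s")
```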

This is pretty neat, I have to say. Pity my XPS 13 doesn’t have a TB3 port. Though I see there are cheapo Chinese alternatives (about US$40 if you buy directly from China) that connect via mini PCI Express. My laptop appears to only have a PCIe 2.0 slot in it, though, so that would likely cause some serious bottlenecks.
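
To put a rough number on that bottleneck: those mini PCI Express adapters typically expose a single lane, and a PCIe 2.0 x1 link carries only a tenth of what Thunderbolt 3 does. A quick sketch using the published per-lane rates (actual throughput varies by laptop and adapter):

```python
# Rough comparison: a mini PCIe (PCIe 2.0 x1) eGPU adapter
# versus Thunderbolt 3 (40 Gbit/s, effectively PCIe 3.0 x4).
mini_pcie_x1 = 0.5   # PCIe 2.0 carries ~500 MB/s per lane
tb3 = 5.0            # Thunderbolt 3, GB/s

print(f"mini PCIe x1 (PCIe 2.0): {mini_pcie_x1} GB/s")
print(f"Thunderbolt 3:           {tb3} GB/s")
print(f"Thunderbolt 3 advantage: {tb3 / mini_pcie_x1:.0f}x")
```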

Having only a laptop.

With an external GPU you can get the best of both worlds: an ultrabook that you can have with you at all times, and the performance of a heavy rig when you want to play.

With your micro/mini-ITX setup you’d have to buy two computers, with one used only for gaming.

I didn’t even know that this type of thing existed. The price is a little high for my liking, but I will have to check out some Thunderbolt GPUs for my crusty HP laptop!

There’s a company called EXP that makes a device called the “GDC Beast” that breaks out an ExpressCard, mini PCIe, or M.2 slot into a fairly small, externally powered dock. I’m kinda hoping for a Thunderbolt 3 version, since those things are pretty cheap ($50-70 USD) compared to most of these, though they aren’t hot-swappable or as flexible as most of these other options. That would probably put this in a price range most people wouldn’t feel too terrible about.

Razer’s hardware is sweet, but at that price + a decent mainstream video card, I might as well get a desktop and stream things over.

I totally forgot about this post, but the new one that went up today had a link to it. Funny timing, as I just read this article yesterday. You can spend that much on just one 1070 if you really want to, though the pricier cards might perform better. Still, it might be a good sign of a real market emerging.

I actually picked up the Blade Stealth + Core bundle at the end of last year, when the Intel 7th-gen Blade Stealth came out, to replace my aging i5 2500K desktop and my cheapo commodity laptop, and it’s been phenomenal. If you buy the bundle, Razer knocks $100 off the price of the Core, so it’s still pricey, but a little more manageable. I have an older GTX 980 in it that works like a champ when I’m at my gaming station, but I can disconnect it and have a nice, light, portable laptop to take with me anywhere without feeling weighed down by the likes of a “gaming” laptop. It has some occasional hiccups, but overall the system is very stable and fills my needs perfectly. Plus, what’s better than having one main computer for everything!?

I can understand why this would be desirable on laptops and other all-in-one devices, but isn’t the GPU upgrade cycle already decoupled from the system upgrade cycle on the desktop PC?

I see this more as an interesting option for gamers who prefer laptops over desktops.

Even integrated graphics are becoming quite capable at 1080p these days, so this is really only of interest as 4K becomes more mainstream, or to those with high-refresh-rate displays, at which point your bandwidth requirements start going way up.

Unfortunately, I don’t really see Thunderbolt as a future-proof solution to these problems just yet. Once you compare the bandwidth of PCIe, the bandwidth of DisplayPort, and the bandwidth of Thunderbolt itself, and consider just what displays these things can drive, you start to see that we’re already approaching the limits of the tech, and that new cables/standards will be needed to drive the very high-bandwidth displays that would lead you to consider these devices in the first place.
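
To make that concrete, here’s a rough calculation of the raw, uncompressed pixel bandwidth a few display targets need, next to what the links can carry. This is a sketch only: it assumes 24 bits per pixel and ignores blanking overhead, so real requirements run somewhat higher.

```python
# Raw pixel bandwidth for an uncompressed video signal, in Gbit/s.
# Assumes 24 bits/pixel and no blanking overhead (real needs are higher).
def display_gbit_per_s(width, height, refresh_hz, bits_per_pixel=24):
    return width * height * refresh_hz * bits_per_pixel / 1e9

targets = {
    "1080p @ 60 Hz":      (1920, 1080, 60),
    "3440x1440 @ 100 Hz": (3440, 1440, 100),
    "4K @ 120 Hz":        (3840, 2160, 120),
    "5K @ 60 Hz":         (5120, 2880, 60),
}

for name, (w, h, hz) in targets.items():
    print(f"{name:>20}: {display_gbit_per_s(w, h, hz):5.1f} Gbit/s")

# Thunderbolt 3 tops out at 40 Gbit/s total, shared between PCIe data
# and DisplayPort, and DisplayPort 1.2 offers ~17.3 Gbit/s of usable
# video bandwidth -- so 4K @ 120 Hz already outruns a single DP 1.2
# stream, and 5K @ 60 Hz isn't far behind.
```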

I very much think there’s a small niche for them to carve out with current tech, and it appears that you are in it! So congratulations on your purchase, and I hope you and your son enjoy many hours of gaming fun.