Thunderbolt 3, eGPUs

(This is the first post in a planned series, focusing on various aspects of contemporary information and communication technologies.)

Contemporary computing is all about the flow of information: be it a personal computer, a mainframe server, a mobile device, or an embedded system in a vehicle, the computers of today are not isolated. For better or worse, increasingly all things are integrated into world-wide networks of information and computation. This also means that the ports and interfaces carrying all that data take on even higher prominence and priority than in the old days of more locally situated processing.

Thinking about the transfer of data, some older-generation computer users might still remember floppy disks and other magnetic media that were used both for saving work files and, often, for distributing and sharing that work with others. Later, optical disks, external hard drives, and USB flash drives superseded floppies, but a more fundamental shift was brought about by the Internet and “cloud-based” storage options. In some sense this development has meant that personal computing has returned to the historical roots of distributed computing in ARPANET and its motivation in the sharing of computing resources. But regardless of what kind of larger network infrastructure mediates the operations of the user and the service provider, all that data still needs to flow around, somehow.

The key technologies for information and communication flows today appear to be largely wireless. Mobile phones and tablets communicate with networks over wireless technologies, either WiFi (wireless local area networking) or cellular networks (GSM, 3G and their successors). However, all those wireless connections end up linking into wired backbone networks that operate at much higher speeds and reliability standards than the often flaky local wireless links. As algorithms for coding, decoding and compressing data have evolved, it is possible to use wireless connections today to stream 4K Ultra HD video or to play fast-paced multiplayer games online. In most cases, however, wired connections still provide lower latency (meaning more immediate response), better resilience to errors, and higher speeds. And while there are efforts to bring wireless charging to mobile phones, for example, most of the information technology we use today still needs to be plugged into some kind of wire, at least for charging its batteries.

Thunderbolt 3 infographic, (c) Intel

This is where new standards like USB-C and Thunderbolt come into the picture. Thunderbolt (Thunderbolt 3 being the most recent version) is a “hardware interface”, meaning a physical, electronics-based system that allows two computing systems to exchange information. This is a different thing, though, from the actual physical connector: “USB Type-C” is the full name of the most recent incarnation of the “Universal Serial Bus”, an industry standard of protocols, cables, and connectors originally released in 1996. The introduction of the original USB was a major step towards the interoperability of electronics, as the earlier situation had been developing into a jungle of proprietary, incompatible connectors – and USB is a major success story, with several billion connectors (and cables) shipped every year. Somewhat confusingly, the physical, reversible USB-C connector can hide many different kinds of electronics behind it, so that some USB-C ports comply with the USB 3.1 mode (with data transfer speeds up to 10 Gbit/s in the “USB 3.1 Gen 2” version), some are implemented with Thunderbolt – and some support both.
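To make the difference in those nominal speeds concrete, here is a small, purely illustrative Python sketch comparing how long a large file transfer would take at the raw signalling rates mentioned above (real-world throughput is lower due to encoding and protocol overhead, and the 50 GB file is just a hypothetical example):

```python
# A purely illustrative comparison of the nominal link speeds mentioned above.
# These are raw signalling rates; real-world throughput is lower because of
# encoding and protocol overhead. The 50 GB file is a hypothetical example.

GIGABIT = 1_000_000_000  # bits per second

links = {
    "USB 2.0": 0.48 * GIGABIT,
    "USB 3.1 Gen 2": 10 * GIGABIT,
    "Thunderbolt 3": 40 * GIGABIT,
}

file_size_bytes = 50 * 1_000_000_000  # 50 GB

for name, bits_per_second in links.items():
    seconds = file_size_bytes * 8 / bits_per_second
    print(f"{name:>15}: ~{seconds:.0f} s to move 50 GB (theoretical best case)")
```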

USB-C and Thunderbolt have in a certain sense achieved a considerable engineering marvel: with backward compatibility to older USB 2.0 devices, this one port and cable should be able to connect to multiple displays at 4K resolutions and to external data storage devices (at speeds up to 40 Gbit/s), while also working as a power cable: with Thunderbolt support, a single USB-C port can supply, or draw, up to 100 watts of electric power – making it possible to remove separate power connectors and to share power bricks between phones, tablets, laptop computers and other devices. The small-form-factor Apple MacBook (“Retina”, 2015) is an example of this line of thinking. One downside of this beautiful single-port simplicity is the need to carry various adapters to connect with anything outside the brave new USB-C world. In an ideal situation, however, life would be much simpler if there were only this one connector type to worry about, and a single cable could dock any device to the network and give access to large displays, storage drives, high-speed networks, and even external graphics solutions.
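Where does the 100-watt figure come from? As a hedged aside, the commonly cited USB Power Delivery voltage/current combinations top out at 20 V × 5 A (the 5 A level requiring an electronically marked cable); a tiny sketch of those nominal levels:

```python
# Nominal USB Power Delivery voltage/current combinations, as commonly cited
# for USB PD 2.0/3.0 (listed here as an illustrative assumption, not an
# exhaustive or authoritative table). The 100 W figure above corresponds to
# the 20 V / 5 A level, which also requires an electronically marked cable.

profiles = [
    (5.0, 3.0),   # volts, amps
    (9.0, 3.0),
    (15.0, 3.0),
    (20.0, 3.0),
    (20.0, 5.0),
]

for volts, amps in profiles:
    print(f"{volts:>4.0f} V x {amps:.0f} A = {volts * amps:>5.0f} W")
```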

The heterogeneity and historical layering of everyday technologies complicate the landscape that electronics manufacturers would like to paint for us. As any student of the history of science and technology can tell, even the most successful technologies did not replace the earlier ones immediately, and there have always been reasons why people resist the adoption of new technologies. For USB-C and Thunderbolt, the process of wider adoption is clearly well underway, but there are also multiple factors slowing it down. The typical peripheral does not yet come with USB-C, but rather with the older connectors. Even among expensive, high-end mobile phones, there are still multiple models that manufacturers ship with older USB connectors rather than with the new USB-C ones.

A potentially more crucial issue for most regular users is that Thunderbolt 3 and USB-C are still relatively new and immature technologies. The setup is also rather complex: with its integration of DisplayPort (video), PCI Express (PCIe, data) and DC power into a single hardware interface, it typically requires firmware and driver updates from multiple manufacturers to work together seamlessly before the TB3 magic starts happening. An integrated systems provider such as Apple is best positioned to make this work, as they control both the hardware and the software of their macOS computers. Apple is also, together with Intel, the developer of the original Thunderbolt, and the interface was first made commercially available in the 2011 version of the MacBook Pro. Today, however, there is an explosion of USB-C and Thunderbolt compatible devices coming to the market from multiple manufacturers, and users are eager to explore the full potential of this new, high-speed, interoperable wired ecosystem.
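When debugging such a chain of firmware and drivers, a useful first sanity check is simply whether the operating system sees the Thunderbolt device at all. Here is a minimal sketch (macOS only, and assuming the built-in system_profiler tool is available) that dumps the Thunderbolt section of System Information:

```python
# A minimal sketch (macOS only; assumes the built-in system_profiler tool) that
# dumps the Thunderbolt section of System Information – a quick way to check
# whether an attached TB3 device or enclosure is recognised at all before
# digging into firmware and driver issues.
import subprocess

def thunderbolt_report() -> str:
    result = subprocess.run(
        ["system_profiler", "SPThunderboltDataType"],
        capture_output=True, text=True, check=True,
    )
    return result.stdout

if __name__ == "__main__":
    print(thunderbolt_report())
```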

The eGPU, or external graphics processing unit, is a good example of this. There are entire hobbyist forums, such as the eGPU.io website, dedicated to the fine art of connecting a full-powered desktop graphics card to a laptop computer via a fast connection – either ExpressCard or Thunderbolt 3. The rationale (apart from the sheer joy of tweaking) is that in this manner one can have a slim ultrabook with long battery life for daily use, which then transforms into an impressive workstation or gaming machine when plugged into an external enclosure housing the power-hungry graphics card (these TB3 boxes typically have a full-length PCIe slot for installing the GPU, various sets of connection ports, and a separate desktop-PC-style power supply). VR (virtual reality) applications are one example of an area where the current generation of laptops has problems: while there are, for example, laptops equipped with Nvidia GeForce GTX 10 series cards (1060 etc.) available today, most of them are not thin and light enough for everyday mobile use, or, if they are, their battery life and/or fan noise present issues.

Razer, an American-Chinese computing hardware manufacturer, is known as a pioneer in popularizing the field of eGPUs with its Razer Blade Stealth ultrabook, which can be plugged via a TB3 cable into the Razer Core enclosure (sold separately) in order to utilize a powerful GPU card installed inside the Core unit. Another popular use case for TB3/eGPU connections is plugging a powerful external graphics card into a MacBook Pro, to make it a more capable gaming machine. In practice, early adopters have struggled with firmware and drivers that, on either the macOS side or the eGPU side, do not support the Thunderbolt 3 implementation well enough for it to actually work. (See e.g. https://egpu.io/akitio-node-review-the-state-of-thunderbolt-3-egpu/ .) However, more and more manufacturers have added support and shipped firmware updates, so the situation is already much better than a few months ago (see instructions at: https://egpu.io/setup-guide-external-graphics-card-mac/ .) For PC laptops running Windows 10, the situation is comparable: a work in progress, with more software support slowly emerging.

Still, it is easy to get lost in this still-evolving field. For example, Dell revealed in January that they had restricted the Thunderbolt 3 PCIe data lanes in their implementation on the premium XPS 15 notebook: rather than using the full four lanes, the XPS 15 had only two PCIe lanes connected to its TB3 port. There is, for example, a discussion on Reddit comparing the effect this has in the typical case where the eGPU feeds the image to an external display rather than back to the laptop's internal display (see: https://www.reddit.com/r/Dell/comments/5otmir/an_approximation_of_the_difference_between_x2_x4/). The effects are not that radical, but it is one of the technical details that early users of eGPU setups have struggled with.
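On a Linux machine, the negotiated lane count is something one can inspect directly. Below is a rough sketch (assuming the standard lspci utility is installed, and best run as root so the link-status lines are visible) that looks for GPU devices and reports the negotiated PCIe link width – the x2 versus x4 detail discussed above:

```python
# A rough sketch (Linux only; assumes the standard lspci utility is installed,
# and is best run as root so the "LnkSta" capability lines are visible) that
# scans for GPU devices and reports their negotiated PCIe link width.
import re
import subprocess

def gpu_link_widths():
    out = subprocess.run(["lspci", "-vv"], capture_output=True, text=True).stdout
    results = []
    device = None
    for line in out.splitlines():
        if line and not line[0].isspace():
            device = line  # device header, e.g. "0c:00.0 VGA compatible controller: NVIDIA ..."
        elif device and "LnkSta:" in line and ("VGA" in device or "3D" in device):
            match = re.search(r"Width (x\d+)", line)
            if match:
                results.append(f"{device}\n  negotiated link width: {match.group(1)}")
    return results

if __name__ == "__main__":
    for entry in gpu_link_widths():
        print(entry)
```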

While fascinating from an engineering or hobbyist perspective, the current state of technologies for connecting everyday devices is still far from perfect. In thousands of meeting rooms and presentation auditoriums every day, people fail to connect their computers, to get anything onto the screen, or to access their presentations due to failures of online connectivity. A universal, high-speed wireless standard for sharing data and displaying video would no doubt be the best solution for all. Meanwhile, a reliable and flexible high-speed standard in wired connectivity would already go a long way. The future will show whether Thunderbolt 3 can reach that kind of ubiquitous support. The present situation is pretty mixed and messy at best.

iPhone 6: boring, but must-have?

iPhone 6 & 6 Plus © Apple.

There have been substantial delays in my advance order for the iPhone 6 Plus (apparently Apple underestimated the demand), and I have had some time to reflect on why I want to get the damned thing in the first place. There are no unique technological features in this phone that really set it apart in today’s hi-tech landscape (Apple Pay, for example, does not work in Finland). The screen is nice, and the phones (both models, 6 and 6 Plus) are well designed and thin – but then again, so are many other flagship smartphones today. Feature-wise, Apple has never really been the one to play the “we have the most, we get there first” game; rather, they are famous for coming in later and perfecting a few selected ideas that have often been introduced previously by someone else.

I have never been an active “Apple fan”, even though it has been interesting to follow what they have to offer. Apple pays very close attention to design, but on the other hand closes down many options for hacking, personalising and extending their systems – something that the typical power user or geek type abhors, or at least used to.

What has changed then, if anything? On one hand, the crucial thing is that in the tech ecosystem, devices are increasingly just interfaces and entry points to content and services that reside in the cloud. My projects, documents, photos, and increasingly also the applications I use, live in the cloud. There is simply not as much need as before for tweaking the operating system, installing specific software, or customising keyboard shortcuts and system parameters – or is it just that I have got lazy? Moving all the time from the office to the meeting room, then to the lecture hall, next to the seminar room, then home, and next to the airport, I rely on multiple devices on the road that serve as portals to the information, documents and services needed then and there. Internet connectivity and electricity, rather than CPU cycles or available RAM, are the key currencies today.

While on the move, I carry four tools with me today: a Samsung Galaxy S4 (work phone), an iPhone 4S (personal phone), an iPad Air (main work tablet), and a MacBook Pro 13 Retina (personal laptop). I also use three Windows laptops (an Asus Vivobook at home, plus a Vaio Z and a Vaio Z3 that I run in tandem in the office), and in the basement is the PC workstation/gaming PC that I self-assembled in December 2011. (The video game consoles, alternative tablets, media servers and streaming media boxes are not included in the discussion here.) All in all, it is the S4 that is the most crucial element here, simply because it is usually at hand whenever I need to check some discussion or document, look up a fact, or reply to someone – and while it is a rather large smartphone, it is still compact enough to carry with me all the time; it is also fast and responsive, and it has a large enough, sharp touchscreen that allows interacting with all that media and communication in a timely and effortless manner. I use the iPhone 4S much less, mainly because its screen is so small. (Also, since both iOS 8 and today’s apps have been designed for much speedier iPhone versions, it is terribly slow.) Yet Android apps regularly fall short when compared to their iOS counterparts: there are missing features, updates arrive later, and the user experience is not optimised for the device. For example, I really like the Samsung Note 10.1 2014 Edition, which is – with its S Pen and multitasking features – arguably a better professional tablet than the iPad; yet I do not carry it with me daily, simply because the Android tablet apps are still often terrible. (Have you used, for example, the official Facebook app on a large-screen Android tablet? The user interface looks like the smartphone UI blown up to 10 inches, and the text is so small you have to squint.)

The iPhone 6, and particularly the 6 Plus, show Apple rising to the challenge of the screen sizes and performance levels that Android users have enjoyed for some time already. Since many US-based tech companies still have an “iOS first” strategy, the app ecosystem of the iPhone is so much stronger than its Android counterpart that, in my kind of use at least, investing in the expensive Apple offering makes sense. I study digital culture, media, the Internet and games by profession, and many interesting games and apps only become available in Apple land, or their Android versions arrive later or in stripped-down form. I am also an avid mobile photographer, and while the iPhone 6 and 6 Plus offer fewer megapixels than their leading rivals, their fast auto-focus, natural colours, and good low-light performance make the new iPhones good choices from the mobile photography angle as well. (Top Lumia phones would have even better cameras from this standpoint, but the Windows Phone app ecosystem is even worse than the Android one, where at least the number of apps has been rising, as the world-wide adoption of Android handsets creates demand for low-cost apps in particular.)

To summarise, mobile is where the spotlight of information and communication technologies lies at the moment, and where games and digital culture in general are undergoing powerful developments. Raw processing power or piles of advanced features are no longer the pinnacle of, or guarantee for, the best user experiences; it is the minimalistic design and the unified software and service ecosystem, supporting smooth and effortless access to content, that really count. And while the new iPhone is, in terms of its technology and UI design, frankly pretty boring, it is for many people the optimal entrance to the services, discussions and creative efforts they really care about.

So, where is that damned 6 Plus of mine, again? <sigh>
