
The accelerated computing landscape is full of terms, acronyms and jargon.
Navigating your way through them all can be confusing and time-consuming. Look no further – we have created a handy guide to 20 terms from the world of accelerated computing.
Ready? Let’s get scrolling through part one of our accelerated computing terms series!
1. Accelerated computing
Accelerated computing is the use of the CPU, GPU and specialised software together to accelerate the processing of graphics and data workloads. It works by offloading intensive graphics and data processing from the CPU to the GPU. In turn, this frees the CPU to focus on its main function of processing and executing instructions (such as opening an application), whilst the GPU handles the demanding graphics and data. This allows workloads and applications to be executed more quickly and efficiently.
Without accelerated computing, the modern computing we see today wouldn’t exist – workloads from AI to GPU-accelerated VDI rely on it.
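To make the idea of offloading concrete, here is a minimal Python sketch, assuming the CuPy library and an NVIDIA GPU are available (both are assumptions for the illustration, not requirements of accelerated computing in general). The same matrix multiplication is run on the CPU with NumPy and on the GPU with CuPy:

```python
import numpy as np   # arrays processed on the CPU
import cupy as cp    # arrays processed on the GPU (assumes an NVIDIA GPU + CUDA)

size = 2000
a_cpu = np.random.rand(size, size)
b_cpu = np.random.rand(size, size)

# CPU path: NumPy performs the multiplication on the CPU
result_cpu = a_cpu @ b_cpu

# GPU path: the data is copied to the GPU and CuPy performs the
# multiplication there, leaving the CPU free for other work
a_gpu = cp.asarray(a_cpu)
b_gpu = cp.asarray(b_cpu)
result_gpu = a_gpu @ b_gpu
cp.cuda.Stream.null.synchronize()  # wait for the GPU to finish

# Copy the result back and check both paths agree
print(np.allclose(result_cpu, cp.asnumpy(result_gpu)))
```

For large enough workloads the GPU path typically finishes much sooner, which is the speed-up accelerated computing is after.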
2. GPU
GPU is an acronym for Graphics Processing Unit. It is a specialist programmable processor designed to accelerate graphics. GPUs are commonly – and incorrectly – referred to as graphics cards or video cards (see below). A GPU is an essential computing component for 3D design, rendering, digital twins and artificial intelligence workloads, all of which rely on it to process their rich graphics. Without one, graphics are processed less efficiently by the CPU, which isn’t designed for the job.
3. Graphics card
A GPU is one component of a graphics card, working alongside various other components such as dedicated RAM and monitor connections to process rich graphics and data. Graphics cards aren’t standard in most PCs; they are an expansion card used to improve performance. Whilst they have commonly been used for rendering visuals in gaming and graphics production, they are also used for processing large batches of data at high speed for AI and data science.
4. CPU
The CPU, or Central Processing Unit, is the brain of a computer. It runs the operating system and applications and is responsible for processing and executing instructions. All computing devices – phones, laptops, tablets, even smart TVs – have a CPU.
Typically, large enterprises run high-spec servers with multi-core CPUs so that multiple virtual machines can share CPU resources, making their IT infrastructure highly scalable.
5. vGPU
Delivering complex graphics and data in virtual and remote desktop environments with sufficient performance can be difficult. That’s where virtual GPU (vGPU) comes in.
Created by NVIDIA, vGPU is graphics acceleration technology that enables a physical GPU’s resources to be shared across multiple virtual machines, accessed from any device, anywhere. vGPU software is installed on a physical GPU in a data centre server and gives users powerful GPU performance across a range of graphics- and data-intensive workloads. IT departments also benefit from the centralised management and security that vGPU software brings to their infrastructure.
6. GPU-acceleration
You may have come across phrases such as ‘GPU acceleration’ or ‘GPU-accelerated rendering’. This is when a GPU is employed alongside a CPU to perform an operation more efficiently. Typically, GPUs are used to enhance the performance of graphically intensive workloads, such as engineering applications like AutoCAD and Revit. However, GPU acceleration is also commonly used for data-intensive workloads like data science and artificial intelligence.
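Applications like AutoCAD and Revit handle this behind the scenes, but in many programming frameworks you opt into GPU acceleration explicitly. A hedged sketch in Python using PyTorch (PyTorch is simply the example framework chosen here):

```python
import torch

# Pick the GPU if one is available, otherwise fall back to the CPU
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Create a large tensor directly on the chosen device
x = torch.randn(2048, 2048, device=device)

# The same line of code runs GPU-accelerated when the device is "cuda"
# and on the CPU otherwise
y = x @ x.T

print(f"Computed on: {y.device}")
```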
7. VDI
Virtual desktop infrastructure is one of the technologies many of us will be familiar with that enables remote working. It is a form of virtualisation in which desktop environments are hosted on a centralised data server.
The virtual desktop is accessed by a user over a network from an endpoint device such as a laptop, thin client or phone, and performs tasks as if it were running locally. Because the desktop computing takes place on a centralised server rather than on the endpoint device, there are several benefits, including better security, lower endpoint hardware requirements and simpler configuration management.
8. Persistent VDI and Non-persistent VDI
There are two types of VDI. The first, persistent VDI, is where a user’s desktop is customisable and keeps that customised setup from one session to the next. Persistent VDI is also called stateful VDI because the customised data is saved between sessions.
The second is non-persistent VDI. Whilst persistent VDI keeps the same customised setup whenever a user logs out and back in, non-persistent VDI is the opposite: settings and data are not saved when a user logs out, and the desktop reverts to its original state when they log back in. Non-persistent VDI is commonly used for self-service tills and warehouse logistics.
9. Thin client
A thin client is a basic computer without a hard drive that uses the resources of a centralised server rather than local hardware. Thin clients are commonly used within a virtual desktop computing model (VDI) for their security, scalability and management benefits. They run lean operating systems such as Linux and work by connecting remotely to a server-based computing environment. Because thin clients rely on the server to do the processing, they are a low-cost alternative to desktop PCs and laptops.
10. Virtual machine
A virtual machine, commonly known as a VM, is a virtual environment created on a host machine (a server or computer) that functions just like a physical computer. It has its own CPU, storage, memory and operating system and operates independently of the host machine. This is possible because software called a hypervisor partitions the host’s hardware resources and allocates them to each VM.
VMs are predominantly used for server consolidation. One physical server can host multiple virtual machines, so there is less need for additional hardware to serve a whole workforce. For example, you can run several different operating systems simultaneously on a single computer. You may sometimes hear a VM referred to as a virtual workstation.
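As an illustration of the hypervisor handing out resources, the sketch below uses the libvirt Python bindings to list each VM on a host along with the vCPUs and memory it has been allocated. It assumes a Linux host running KVM/QEMU with libvirt installed, which is only one of many possible hypervisor setups:

```python
import libvirt  # Python bindings for the libvirt virtualisation API

# Connect to the local hypervisor (KVM/QEMU in this example)
conn = libvirt.open("qemu:///system")

for dom in conn.listAllDomains():
    # info() returns (state, max memory KiB, memory KiB, vCPU count, CPU time ns)
    state, max_mem_kib, mem_kib, vcpus, cpu_time = dom.info()
    print(f"{dom.name()}: {vcpus} vCPU(s), {mem_kib // 1024} MiB of memory")

conn.close()
```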
11. Virtualisation
This is technology that creates computer-generated versions of hardware, applications, operating systems and environments using resources from a physical machine. That machine could be a server, a workstation or a PC.
Virtualisation enables computer hardware to be used more efficiently and is core to creating virtual machines and cloud computing.
12. Visualisation
Visualisation (or visualization) is the technique of creating and delivering graphics from processed data to communicate a message. Professionally, visualisation technology is essential for designers, engineers, architects and data scientists, whose workflows range from 3D rendering and ray tracing to real-time simulation and AI.
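At the data end of the spectrum, even a simple chart counts as visualisation. A small Python sketch using Matplotlib, with made-up figures purely for illustration:

```python
import matplotlib.pyplot as plt

# Made-up render times for ten frames of an animation
frames = list(range(1, 11))
render_ms = [42, 40, 45, 43, 60, 58, 44, 41, 39, 40]

# Plotting the numbers turns raw data into a message:
# frames 5 and 6 clearly took longer than the rest
plt.plot(frames, render_ms, marker="o")
plt.xlabel("Frame number")
plt.ylabel("Render time (ms)")
plt.title("Illustrative render times per frame")
plt.show()
```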
13. HPC
High performance computing (HPC) is the practice of aggregating the power of multiple servers to process complex calculations at high speed. It lets users process large volumes of data far more quickly than a standard computer and can be located on premises, off premises or in the cloud.
A common use case for HPC is supercomputers used for data science, analytics and artificial intelligence to solve complex problems. For businesses, HPC means doing more, in less time, while spending less.
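The aggregation typically happens through message passing between processes spread across the servers. A minimal hedged sketch in Python using mpi4py (assuming an MPI installation), where each process sums part of a range and the partial results are combined:

```python
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()   # this process's ID
size = comm.Get_size()   # total number of processes, possibly across many servers

# Each process sums its own slice of the numbers 0..9,999,999
total_n = 10_000_000
chunk = total_n // size
start = rank * chunk
end = total_n if rank == size - 1 else start + chunk
partial = sum(range(start, end))

# Combine the partial sums on process 0
grand_total = comm.reduce(partial, op=MPI.SUM, root=0)
if rank == 0:
    print(f"Sum computed by {size} processes: {grand_total}")
```

Launched with something like `mpirun -n 4 python sum.py`, the same script scales from a single laptop to a cluster of servers.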
14. IoT
The internet of things (IoT) is the network of physical devices that are connected to the internet and communicate with each other. Devices ranging from printers and workstations to factory robots and autonomous vehicles are all examples of IoT devices.
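IoT devices usually communicate over lightweight protocols such as MQTT. A hedged Python sketch using the paho-mqtt library – the broker address and topic are placeholders, not real endpoints:

```python
import paho.mqtt.publish as publish

# Publish a single sensor reading to an MQTT broker, as a connected
# device on a factory floor might do (placeholder broker and topic)
publish.single(
    "factory/line1/temperature",   # topic other devices can subscribe to
    payload="21.5",                # the reading, e.g. degrees Celsius
    hostname="broker.example.com",
)
```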
15. Workflow
A workflow can be defined as a series of activities involved in completing a task. All businesses have workflows for various tasks, and they help ensure that processes are completed correctly. An example of a workflow would be a designer creating a first draft of a car design, sending it to a colleague for review and feedback, actioning that feedback and then submitting the final draft for approval.
16. Workload
In computing, a workload is an activity that requires specific computer resources to execute. For example, 3D rendering requires GPU power to process correctly – that is a workload. The same goes for other technologies such as VDI, AI and digital twins.
17. Remote workspace
Any environment you work in that isn’t your office is a remote workspace. This could be a café, the spare room in your house, even your living room! Remote work has become increasingly popular over the last two years, requiring businesses and workforces to adapt to working more flexibly from both the office and home.
18. NVIDIA
NVIDIA is a leading manufacturer of GPUs and GPU software. The company has pioneered accelerated computing, from inventing a GPU originally aimed at gaming to creating GPU technology that has revolutionised modern computing and artificial intelligence. NVIDIA’s technology makes professional workloads such as data science and high-performance computing more accessible and higher performing.
19. Data server
This is a server used for storing, managing and distributing data to authorised users from a centralised location. A data server makes data easier for IT departments to manage, whilst also being more secure, as data is not stored on individual physical devices.
With hybrid working becoming more common, data servers are important for giving users who work remotely, away from the office, access to data and applications.
20. GPU utilisation
This refers to how much of your GPU’s processing capacity is in use at a given moment (memory usage is tracked separately). High GPU utilisation, between 90-100%, usually occurs when rendering graphics or playing computer games. This isn’t a bad thing – GPUs are built to be used to their maximum.
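If you want to check utilisation on your own machine, NVIDIA exposes it through its management library. A hedged Python sketch using the pynvml bindings (assumes an NVIDIA GPU and the nvidia-ml-py package):

```python
from pynvml import (
    nvmlInit,
    nvmlShutdown,
    nvmlDeviceGetHandleByIndex,
    nvmlDeviceGetUtilizationRates,
    nvmlDeviceGetMemoryInfo,
)

nvmlInit()
handle = nvmlDeviceGetHandleByIndex(0)  # the first GPU in the system

util = nvmlDeviceGetUtilizationRates(handle)  # percentage busy over the last sample
mem = nvmlDeviceGetMemoryInfo(handle)         # memory figures in bytes

print(f"GPU utilisation: {util.gpu}%")
print(f"Memory in use: {mem.used / mem.total:.0%}")

nvmlShutdown()
```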
ebb3 is a leading accelerated computing specialist. They design and deliver IT platforms for businesses with graphics- and data-intensive workloads, ranging from artificial intelligence and virtual reality to VDI and HPC.