Installing TensorFlow with GPU Support on Python 3.7 in Lubuntu 20.04

Amit Bhargav
3 min read · Jul 13, 2023


Introduction: In this article, we will guide you through the process of installing TensorFlow with GPU support on Python 3.7 in Lubuntu 20.04. TensorFlow is a popular open-source machine learning framework that lets you train and deploy deep learning models efficiently. GPU support can significantly accelerate model training and inference, resulting in faster and more efficient computations. Let’s dive into the step-by-step installation procedure.

Step 1: Verify NVIDIA GPU Compatibility. Before proceeding, it’s essential to make sure that your NVIDIA GPU is compatible with TensorFlow. Visit the official TensorFlow documentation to check the supported GPU models and the corresponding CUDA and cuDNN versions.
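Before consulting the docs, it helps to confirm what GPU the machine actually has. A minimal check, assuming the stock pciutils tools that ship with Lubuntu 20.04:

```shell
# Query the PCI bus for any NVIDIA device; the fallback message covers
# machines with no discrete NVIDIA GPU.
lspci | grep -i nvidia || echo "No NVIDIA GPU detected"
```

The device name printed here is what you look up in TensorFlow's compatibility table.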

Step 2: Install NVIDIA Drivers. To use GPU acceleration, you need to install the appropriate NVIDIA drivers. Open a terminal and run the following commands to add the graphics-drivers PPA and install the recommended NVIDIA driver version:

$ sudo add-apt-repository ppa:graphics-drivers/ppa
$ sudo apt-get update
$ sudo apt-get install nvidia-driver-<VERSION>

Replace <VERSION> with the recommended driver version for your GPU.

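Once the driver is installed (and typically after a reboot), you can sanity-check it with nvidia-smi; the fallback message below is only a hedge for machines where the kernel module has not loaded yet:

```shell
# nvidia-smi prints the driver version and every GPU it can see.
nvidia-smi || echo "Driver not loaded yet; a reboot may be required"
```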

Step 3: Install the CUDA Toolkit. Next, install the CUDA Toolkit, which is a prerequisite for TensorFlow’s GPU support. Visit the NVIDIA CUDA Toolkit download page and select the appropriate version for your GPU. Download the installer and run it with the following command:

$ sudo sh cuda_<VERSION>_<ARCH>.run

Replace <VERSION> and <ARCH> with the version and architecture of the CUDA Toolkit you downloaded.

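One thing the installer does not do for you: the toolkit's binaries and libraries must be on your paths for TensorFlow and nvcc to find them. A sketch, assuming the default install prefix of /usr/local/cuda:

```shell
# Make the CUDA binaries and libraries visible to the shell
# (append these lines to ~/.bashrc to persist them across sessions).
export PATH=/usr/local/cuda/bin:$PATH
export LD_LIBRARY_PATH=/usr/local/cuda/lib64${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}
# nvcc --version   # should now report the CUDA release you installed
```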

Step 4: Install cuDNN. cuDNN (CUDA Deep Neural Network library) is a GPU-accelerated library of primitives for deep neural networks. Download the cuDNN library from the NVIDIA Developer website, choosing the version that matches your installed CUDA Toolkit. Extract the downloaded archive and copy the necessary files into the CUDA Toolkit installation directory using the following commands:

$ tar -xzvf cudnn-<VERSION>.tgz
$ sudo cp cuda/include/cudnn*.h /usr/local/cuda/include
$ sudo cp cuda/lib64/libcudnn* /usr/local/cuda/lib64
$ sudo chmod a+r /usr/local/cuda/include/cudnn*.h /usr/local/cuda/lib64/libcudnn*
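As an optional sanity check, you can read the cuDNN version macros back out of the headers you just copied (cuDNN 8+ keeps them in cudnn_version.h, which the glob below also matches):

```shell
# Print the version macros from the installed cuDNN headers.
grep -h -e "#define CUDNN_MAJOR" -e "#define CUDNN_MINOR" -e "#define CUDNN_PATCHLEVEL" \
  /usr/local/cuda/include/cudnn*.h 2>/dev/null || echo "cuDNN headers not found"
```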

Step 5: Create and Activate a Python Virtual Environment. It’s good practice to work inside a Python virtual environment to manage package dependencies. Create a new virtual environment using the following command:

$ python3.7 -m venv tensorflow_env

Activate the virtual environment with:

$ source tensorflow_env/bin/activate

Step 6: Install TensorFlow with GPU Support. Now you can install TensorFlow with GPU support using pip. Run the following command:

$ pip install tensorflow-gpu==<VERSION>

Replace <VERSION> with the desired TensorFlow version you want to install.

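Choosing <VERSION> matters: each TensorFlow release is only tested against one specific CUDA/cuDNN pair. An illustrative lookup with a few entries taken from TensorFlow's tested-build table (verify against the current official docs before relying on them):

```python
# Illustrative sketch: a few tensorflow-gpu releases and the CUDA/cuDNN
# versions they were built against, per TensorFlow's tested-build table.
TESTED_BUILDS = {
    "1.15": {"cuda": "10.0", "cudnn": "7.4"},
    "2.3":  {"cuda": "10.1", "cudnn": "7.6"},
    "2.4":  {"cuda": "11.0", "cudnn": "8.0"},
}

def required_stack(tf_version: str) -> tuple:
    """Return the (CUDA, cuDNN) pair a given release expects."""
    build = TESTED_BUILDS[tf_version]
    return build["cuda"], build["cudnn"]

print(required_stack("2.4"))  # -> ('11.0', '8.0')
```

If the pair you installed in Steps 3 and 4 does not match the row for your chosen TensorFlow version, the import in the next step will fail to find the GPU.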

Step 7: Verify the Installation. To make sure TensorFlow is installed correctly and is using the GPU, run a simple script that prints the available GPUs. Create a new Python file, e.g., gpu_test.py, and add the following code:

import tensorflow as tf

print("Num GPUs Available: ", len(tf.config.experimental.list_physical_devices('GPU')))

Execute the script:

$ python gpu_test.py

If everything is set up correctly, you should see the number of available GPUs printed.

Conclusion:

Congratulations! You have successfully installed TensorFlow with GPU support on Python 3.7 in Lubuntu 20.04. This setup lets you leverage the power of your NVIDIA GPU for accelerated machine learning tasks. Remember to consult the official documentation for further details and to explore the many capabilities TensorFlow offers. Happy coding!
