
This repository contains the install scripts needed to add PYNQ to the official Ubuntu SD card image of your:

  • Kria KV260 Vision AI Starter Kit
  • Kria KR260 Robotic Starter Kit
  • Kria KD240 Drives Starter Kit

After installation, a complete Python and Jupyter environment is available on the Kria SOM, along with multiple ready-to-use programmable logic overlays.

Installation

1. Get the Ubuntu SD Card Image

Users of Ubuntu 22.04 LTS

If you wish to run Ubuntu 22.04 LTS, you may need to update the boot firmware to the latest version. For example, the 2022.1 boot firmware is recommended for Ubuntu 22.04 LTS users; Ubuntu may not boot with mismatched firmware. Update instructions can be found in the Kria Wiki.

KV260

Follow the steps to Get Started with Kria KV260 Vision AI Starter Kit until you complete the Booting your Starter Kit section.

KR260

Follow the steps to Get Started with Kria KR260 Robotic Starter Kit until you complete the Booting your Starter Kit section.

KD240

Follow the steps to Get Started with Kria KD240 Drives Starter Kit until you complete the Booting your Starter Kit section.

2. Install PYNQ

Next, install PYNQ on your Kria device. Clone this repository onto the Kria and run the install.sh script, specifying the target board with the -b flag.

git clone https://github.com/Xilinx/Kria-PYNQ.git
cd Kria-PYNQ/
sudo bash install.sh -b { KV260 | KR260 | KD240 } 

This script installs the required Debian packages, creates a Python virtual environment, and configures a Jupyter portal. The process takes around 25 minutes.

3. Open Jupyter

JupyterLab can now be accessed from a web browser at <ip_address>:9090/lab or kria:9090/lab. The password is xilinx.
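Once JupyterLab is open, a quick notebook-cell check confirms that the PYNQ environment installed by the script is active:

# Run in a JupyterLab notebook cell
import pynq
print(pynq.__version__)   # prints the installed PYNQ release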

Included Overlays

Base Overlay [GitHub]

This overlay includes support for the KV260's Raspberry Pi camera and PMOD interfaces. A Digilent Pcam 5C camera can be attached to the KV260 and controlled from Jupyter notebooks. Additionally, a variety of Grove and PMOD devices are supported on the PMOD interface - all controllable from a Xilinx MicroBlaze processor in programmable logic.

Supported boards: KV260
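As a minimal sketch, the base overlay can be loaded and inspected from a notebook as shown below. The kv260 import path and no-argument constructor are assumptions about the helper package this repository installs; the PMOD, Grove, and camera drivers themselves live under pynq.lib and are demonstrated in the bundled notebooks.

from kv260 import BaseOverlay    # helper package installed by this repository (name assumed)

base = BaseOverlay()             # downloads the base bitstream to the programmable logic
print(base.ip_dict.keys())       # list the IP cores available in the overlay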

DPU-PYNQ (v2.5) [GitHub] [PYPI]

This overlay contains a Vitis-AI 2.5.0 Deep Learning Processor Unit (DPU) and comes with a variety of notebook examples with pre-trained ML models.

Supported boards: KV260, KR260, KD240
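A minimal sketch of driving the DPU from a notebook follows; the .xmodel file name is a placeholder, and the bundled notebooks download real pre-trained models to use in its place.

from pynq_dpu import DpuOverlay
import numpy as np

overlay = DpuOverlay("dpu.bit")            # load the DPU bitstream
overlay.load_model("model.xmodel")         # compiled Vitis-AI 2.5 model (placeholder name)

dpu = overlay.runner                       # VART runner for the loaded model
in_dims = tuple(dpu.get_input_tensors()[0].dims)
out_dims = tuple(dpu.get_output_tensors()[0].dims)
in_buf = [np.empty(in_dims, dtype=np.float32)]
out_buf = [np.empty(out_dims, dtype=np.float32)]

job = dpu.execute_async(in_buf, out_buf)   # run inference on the DPU
dpu.wait(job)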

Composable Pipeline (v1.1 soft-release) [GitHub]

The Composable pipeline is an overlay with a novel architecture that allows the data flow between a series of IP cores to be re-wired at runtime.

Supported boards: KV260
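The sketch below illustrates the idea under stated assumptions: the bitstream and IP attribute names are placeholders, and the installed notebooks use the actual names shipped with this release. The key call is compose(), which re-routes the dataflow between IP cores at runtime.

from pynq import Overlay
from pynq_composable import *              # composable-pipeline package (import pattern assumed)

ol = Overlay("cv_dfx_4_pr.bit")            # composable vision overlay (bitstream name assumed)
cpipe = ol.composable                      # handle to the composable region

# Route video through a grayscale conversion; changing the pipeline is just re-composing
cpipe.compose([cpipe.ps_video_in, cpipe.rgb2gray_accel, cpipe.ps_video_out])   # IP names are placeholders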

PYNQ-Helloworld [GitHub] [PYPI]

One of PYNQ's first overlays, PYNQ-Helloworld includes an image resizer block in programmable logic. It demonstrates a simple but powerful use of programmable logic HLS blocks for image processing.

Supported boards: KV260, KR260
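A hedged sketch of the dataflow is below: an input image is moved into the programmable logic over DMA, resized, and read back. The bitstream and DMA names are assumptions, and the real notebook also programs the resizer IP's dimension registers before starting the transfer.

import numpy as np
from pynq import Overlay, allocate

ol = Overlay("resizer.bit")                # bitstream name assumed
dma = ol.axi_dma_0                         # DMA engine feeding the resize IP (name assumed)

in_buf = allocate(shape=(1080, 1920, 3), dtype=np.uint8)
out_buf = allocate(shape=(540, 960, 3), dtype=np.uint8)
in_buf[:] = np.random.randint(0, 256, in_buf.shape, dtype=np.uint8)   # stand-in image

dma.sendchannel.transfer(in_buf)           # send the source image to the PL
dma.recvchannel.transfer(out_buf)          # receive the resized image
dma.sendchannel.wait()
dma.recvchannel.wait()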

Selftest

A self-test script for each board is generated at the end of the installation. It runs a subset of the overlay tests; some tests require specific peripherals to be connected to the board:

Board   Test name   Peripherals
KV260   test_apps   1. A monitor connected to either the HDMI or DisplayPort output
                    2. A USB webcam

To run the self-test, navigate to the Kria-PYNQ install directory and run:

sudo ./selftest.sh




Copyright (C) 2021 Xilinx, Inc

SPDX-License-Identifier: BSD-3-Clause