63 Commits

Author SHA1 Message Date
Oscar Krause
fda73a95d3 test "X-NLS-Signature" upper/lowercase 2025-04-14 13:38:21 +02:00
Oscar Krause
3f6ff0b178 changed "feature_name" 2025-04-14 13:38:21 +02:00
Oscar Krause
c9068ad034 updated attributes to match nvidia-nls response order 2025-04-14 13:38:21 +02:00
Oscar Krause
c98aa76d5e x-nls-signature 2025-04-14 13:38:21 +02:00
Oscar Krause
c31aedb28d test "x-nls-signature" 2025-04-14 13:38:21 +02:00
Oscar Krause
b12eb87077 fixed missing code 2025-04-14 13:38:21 +02:00
Oscar Krause
3415b2b9ec test with "X-NLS-Signature" 2025-04-14 13:38:21 +02:00
Oscar Krause
6edc88662d implemented client_challenge on lease apis 2025-04-14 13:38:21 +02:00
Oscar Krause
bac32a77c8 set "offline_lease" to false 2025-04-14 13:38:21 +02:00
Oscar Krause
f851370db4 added EMPTY (!!!) X-NLS-Signature response header 2025-04-14 13:38:21 +02:00
Oscar Krause
10428820f8 added debug headers 2025-04-14 13:38:21 +02:00
Oscar Krause
9dc3643fdd code styling 2025-04-14 13:38:21 +02:00
Oscar Krause
ea612bf2e7 added new "protocol_version" to client-token 2025-04-14 13:38:21 +02:00
Oscar Krause
3c1a1d89a4 fixes 2025-04-14 13:38:21 +02:00
Oscar Krause
27b131a789 also debug response 2025-04-14 13:38:21 +02:00
Oscar Krause
a8e14c0ed0 fixed datetime format 2025-04-14 13:38:21 +02:00
Oscar Krause
b0fe646b03 added missing lessor attributes 2025-04-14 13:38:21 +02:00
Oscar Krause
38ae14f6cf fixes 2025-04-14 13:38:21 +02:00
Oscar Krause
e967bda446 added debugging 2025-04-14 13:38:21 +02:00
Oscar Krause
cf6fc9a4ce updated new responses for 18.x drivers 2025-04-14 13:38:21 +02:00
Oscar Krause
5a98df563a created "init_config_token_demo" 2025-04-14 13:38:21 +02:00
Oscar Krause
50a9d70b77 added test and code for /leasing/v1/config-token
ref. https://git.collinwebdesigns.de/nvidia/nls/-/blob/main/src/test/test_config_token.py
2025-04-14 13:38:21 +02:00
Oscar Krause
cca169881c implemented initial endpoint for /leasing/v1/config-token 2025-04-14 13:38:21 +02:00
Oscar Krause
426da28382 added logging 2025-04-14 13:38:21 +02:00
Oscar Krause
d249ef34bc added debugging 2025-04-14 13:38:21 +02:00
Oscar Krause
e94db9c33b added endpoint '/leasing/v1/config-token' 2025-04-14 13:38:21 +02:00
Oscar Krause
f9c7475250 ci fixes 2025-04-14 13:37:27 +02:00
Oscar Krause
00dafdc63a do not run duplicated pipeline on feature-branches 2025-04-14 11:17:25 +02:00
Oscar Krause
607cca7655 typos 2025-04-10 09:00:58 +02:00
Oscar Krause
6f9b6b9e5c Merge branch 'dev' into 'main'
added NixOS Pull-Request note for official "nixpkgs"

See merge request oscar.krause/fastapi-dls!50
2025-04-10 08:58:36 +02:00
Oscar Krause
d22e93020c added NixOS Pull-Request note for official "nixpkgs" 2025-04-10 07:55:20 +02:00
Oscar Krause
4522425bcc Merge branch 'dev' into 'main'
dev

See merge request oscar.krause/fastapi-dls!48
2025-04-08 10:10:31 +02:00
Oscar Krause
4568881d1e fixes 2025-04-08 09:56:51 +02:00
Oscar Krause
d4d49956fe fixes 2025-04-08 09:38:49 +02:00
Oscar Krause
44a63efc4f requirements.txt updated 2025-04-08 09:22:03 +02:00
Oscar Krause
ca0eedc1f2 .gitignore 2025-04-08 09:18:50 +02:00
Oscar Krause
6d5b389f2a .gitlab-ci.yml 2025-04-08 09:17:44 +02:00
Oscar Krause
c3dbc043b3 updated EOLs 2025-03-21 08:47:28 +01:00
Oscar Krause
666a07507e typos 2025-03-21 08:39:32 +01:00
Oscar Krause
76272af36f added some glfm 2025-03-21 07:47:41 +01:00
Oscar Krause
cc2a11d07b moved the only two FAQ entries to Known-Issues on README 2025-03-21 07:47:26 +01:00
Oscar Krause
b26704646d fixes 2025-03-20 20:24:43 +01:00
Oscar Krause
e0c9bb46ee requirements.txt updated 2025-03-20 20:09:20 +01:00
Oscar Krause
490720f2d6 code styling 2025-03-20 20:09:13 +01:00
Oscar Krause
9ed178098b removed doc directory 2025-03-20 07:11:01 +01:00
Oscar Krause
951fc35203 moved Reverse Engineering Notes to separate project
ref.: https://git.collinwebdesigns.de/nvidia/nls
2025-03-19 21:03:54 +01:00
Oscar Krause
26248a4ea5 fixed tests 2025-03-19 21:01:58 +01:00
Oscar Krause
85623d1a65 code styling 2025-03-18 14:47:33 +01:00
Oscar Krause
e6a2de40c9 fixed test deprecations 2025-03-18 10:33:52 +01:00
Oscar Krause
fd46eecfb3 created PrivateKey / PublicKey wrapper classes 2025-03-18 09:43:44 +01:00
Oscar Krause
958f23f79d fixed "cryptography" dependency 2025-03-17 14:09:55 +01:00
Oscar Krause
0bdd3a6ac2 migrated from "pycryptodome" to "cryptography" 2025-03-17 14:05:26 +01:00
Oscar Krause
8a269b0393 Merge branch 'main' into 'dev'
# Conflicts:
#   doc/Reverse Engineering Notes.md
2025-03-13 20:33:01 +01:00
Oscar Krause
6607756c08 updated Reverse Engineering Notes.md 2025-03-13 20:29:16 +01:00
Oscar Krause
584eee41ef Merge branch 'dev' into 'main'
fixed logging

See merge request oscar.krause/fastapi-dls!47
2025-03-12 13:40:45 +01:00
Oscar Krause
25658cb1fb code styling 2025-03-12 11:41:58 +01:00
Oscar Krause
43fdf1170c edit Reverse Engineering Notes.md 2025-03-12 08:44:37 +01:00
Oscar Krause
a953e62bcb edit Reverse Engineering Notes.md 2025-03-11 22:51:45 +01:00
Oscar Krause
9c0cd21e71 edit Reverse Engineering Notes.md 2025-03-11 22:32:13 +01:00
Oscar Krause
3f5fcbebb3 fixed logging 2025-03-11 22:04:35 +01:00
Oscar Krause
3fdd439035 edit Reverse Engineering Notes.md 2025-03-11 13:40:21 +01:00
Oscar Krause
d30dbced39 edit Reverse Engineering Notes.md 2025-03-10 23:47:55 +01:00
Oscar Krause
5b61d0a40e edit Reverse Engineering Notes.md 2025-03-10 21:21:40 +01:00
15 changed files with 547 additions and 352 deletions


@@ -2,7 +2,7 @@ Package: fastapi-dls
 Version: 0.0
 Architecture: all
 Maintainer: Oscar Krause oscar.krause@collinwebdesigns.de
-Depends: python3, python3-fastapi, python3-uvicorn, python3-dotenv, python3-dateutil, python3-josepy, python3-sqlalchemy, python3-pycryptodome, python3-markdown, uvicorn, openssl
+Depends: python3, python3-fastapi, python3-uvicorn, python3-dotenv, python3-dateutil, python3-josepy, python3-sqlalchemy, python3-cryptography, python3-markdown, uvicorn, openssl
 Recommends: curl
 Installed-Size: 10240
 Homepage: https://git.collinwebdesigns.de/oscar.krause/fastapi-dls


@@ -1,8 +1,8 @@
 # https://packages.debian.org/hu/
 fastapi==0.92.0
 uvicorn[standard]==0.17.6
-python-jose[pycryptodome]==3.3.0
-pycryptodome==3.11.0
+python-jose[cryptography]==3.3.0
+cryptography==38.0.4
 python-dateutil==2.8.2
 sqlalchemy==1.4.46
 markdown==3.4.1


@@ -1,8 +1,8 @@
 # https://packages.ubuntu.com
 fastapi==0.101.0
 uvicorn[standard]==0.27.1
-python-jose[pycryptodome]==3.3.0
-pycryptodome==3.20.0
+python-jose[cryptography]==3.3.0
+cryptography==41.0.7
 python-dateutil==2.8.2
 sqlalchemy==1.4.50
 markdown==3.5.2


@@ -1,8 +1,8 @@
 # https://packages.ubuntu.com
 fastapi==0.110.3
 uvicorn[standard]==0.30.3
-python-jose[pycryptodome]==3.3.0
-pycryptodome==3.20.0
+python-jose[cryptography]==3.3.0
+cryptography==42.0.5
 python-dateutil==2.9.0
 sqlalchemy==2.0.32
 markdown==3.6
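The requirements diffs above track the migration from pycryptodome to cryptography as the RSA backend (and switch the python-jose extra accordingly). As a hedged illustration of what that swap means at the API level — this is not code from the repository, just a minimal sketch of the equivalent key handling under the new dependency:

```python
# Sketch: RSA key generation, PEM export and re-import with "cryptography",
# replacing the former pycryptodome (Crypto.PublicKey.RSA) calls.
# 2048-bit key used here only to keep the example fast.
from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.asymmetric import rsa

# generate a keypair (pycryptodome equivalent: RSA.generate(2048))
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

# serialize to PEM (pycryptodome equivalent: key.export_key())
private_pem = private_key.private_bytes(
    encoding=serialization.Encoding.PEM,
    format=serialization.PrivateFormat.TraditionalOpenSSL,
    encryption_algorithm=serialization.NoEncryption(),
)

# load it back (pycryptodome equivalent: RSA.import_key(pem))
loaded = serialization.load_pem_private_key(private_pem, password=None)
assert loaded.private_numbers() == private_key.private_numbers()
```

The TraditionalOpenSSL format keeps the `BEGIN RSA PRIVATE KEY` PEM header that pycryptodome emitted, so existing `instance.private.pem` files stay readable.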


@@ -8,7 +8,7 @@ pkgdesc='NVIDIA DLS server implementation with FastAPI'
 arch=('any')
 url='https://git.collinwebdesigns.de/oscar.krause/fastapi-dls'
 license=('MIT')
-depends=('python' 'python-jose' 'python-starlette' 'python-httpx' 'python-fastapi' 'python-dotenv' 'python-dateutil' 'python-sqlalchemy' 'python-pycryptodome' 'uvicorn' 'python-markdown' 'openssl')
+depends=('python' 'python-jose' 'python-starlette' 'python-httpx' 'python-fastapi' 'python-dotenv' 'python-dateutil' 'python-sqlalchemy' 'python-cryptography' 'uvicorn' 'python-markdown' 'openssl')
 provider=("$pkgname")
 install="$pkgname.install"
 backup=('etc/default/fastapi-dls')
@@ -39,7 +39,7 @@ check() {
 package() {
   install -d "$pkgdir/usr/share/doc/$pkgname"
   install -d "$pkgdir/var/lib/$pkgname/cert"
-  cp -r "$srcdir/$pkgname/doc"/* "$pkgdir/usr/share/doc/$pkgname/"
+  #cp -r "$srcdir/$pkgname/doc"/* "$pkgdir/usr/share/doc/$pkgname/"
   install -Dm644 "$srcdir/$pkgname/README.md" "$pkgdir/usr/share/doc/$pkgname/README.md"
   install -Dm644 "$srcdir/$pkgname/version.env" "$pkgdir/usr/share/doc/$pkgname/version.env"

.gitignore

@@ -1,6 +1,6 @@
 .DS_Store
 venv/
 .idea/
-app/*.sqlite*
+*.sqlite
 app/cert/*.*
 .pytest_cache


@@ -16,12 +16,12 @@ build:docker:
   interruptible: true
   stage: build
   rules:
-    - if: $CI_COMMIT_BRANCH && $CI_COMMIT_BRANCH != $CI_DEFAULT_BRANCH
+    # deployment is in "deploy:docker:"
+    - if: $CI_PIPELINE_SOURCE == 'merge_request_event'
       changes:
         - app/**/*
         - Dockerfile
         - requirements.txt
-    - if: $CI_PIPELINE_SOURCE == 'merge_request_event'
   tags: [ docker ]
   before_script:
     - docker buildx inspect
@@ -41,19 +41,17 @@ build:apt:
   interruptible: true
   stage: build
   rules:
+    - if: $CI_COMMIT_BRANCH && $CI_COMMIT_BRANCH == $CI_DEFAULT_BRANCH
     - if: $CI_COMMIT_TAG
       variables:
         VERSION: $CI_COMMIT_REF_NAME
-    - if: $CI_COMMIT_BRANCH && $CI_COMMIT_BRANCH != $CI_DEFAULT_BRANCH
+    - if: $CI_PIPELINE_SOURCE == 'merge_request_event'
       changes:
         - app/**/*
         - .DEBIAN/**/*
         - .gitlab-ci.yml
       variables:
         VERSION: "0.0.1"
-    - if: $CI_PIPELINE_SOURCE == 'merge_request_event'
-      variables:
-        VERSION: "0.0.1"
   before_script:
     - echo -e "VERSION=$VERSION\nCOMMIT=$CI_COMMIT_SHA" > version.env
     # install build dependencies
@@ -94,16 +92,13 @@ build:pacman:
     - if: $CI_COMMIT_TAG
       variables:
         VERSION: $CI_COMMIT_REF_NAME
-    - if: $CI_COMMIT_BRANCH && $CI_COMMIT_BRANCH != $CI_DEFAULT_BRANCH
+    - if: $CI_PIPELINE_SOURCE == 'merge_request_event'
       changes:
         - app/**/*
         - .PKGBUILD/**/*
         - .gitlab-ci.yml
       variables:
         VERSION: "0.0.1"
-    - if: $CI_PIPELINE_SOURCE == 'merge_request_event'
-      variables:
-        VERSION: "0.0.1"
   before_script:
     #- echo -e "VERSION=$VERSION\nCOMMIT=$CI_COMMIT_SHA" > version.env
     # install build dependencies
@@ -153,7 +148,7 @@ test:
     - source venv/bin/activate
     - pip install --upgrade pip
     - pip install -r $REQUIREMENTS
-    - pip install pytest httpx
+    - pip install pytest pytest-cov pytest-custom_exit_code httpx
     - mkdir -p app/cert
     - openssl genrsa -out app/cert/instance.private.pem 2048
     - openssl rsa -in app/cert/instance.private.pem -outform PEM -pubout -out app/cert/instance.public.pem
@@ -168,11 +163,12 @@ test:
 .test:apt:
   stage: test
   rules:
-    - if: $CI_COMMIT_BRANCH && $CI_COMMIT_BRANCH != $CI_DEFAULT_BRANCH
+    - if: $CI_COMMIT_BRANCH && $CI_COMMIT_BRANCH == $CI_DEFAULT_BRANCH
+    - if: $CI_PIPELINE_SOURCE == 'merge_request_event'
       changes:
         - app/**/*
         - .DEBIAN/**/*
-    - if: $CI_PIPELINE_SOURCE == 'merge_request_event'
+        - .gitlab-ci.yml
   needs:
     - job: build:apt
       artifacts: true
@@ -217,11 +213,12 @@ test:apt:
 test:pacman:archlinux:
   image: archlinux:base
   rules:
-    - if: $CI_COMMIT_BRANCH && $CI_COMMIT_BRANCH != $CI_DEFAULT_BRANCH
+    - if: $CI_COMMIT_BRANCH && $CI_COMMIT_BRANCH == $CI_DEFAULT_BRANCH
+    - if: $CI_PIPELINE_SOURCE == 'merge_request_event'
      changes:
        - app/**/*
        - .PKGBUILD/**/*
-    - if: $CI_PIPELINE_SOURCE == 'merge_request_event'
+        - .gitlab-ci.yml
   needs:
     - job: build:pacman
       artifacts: true
@@ -265,19 +262,19 @@ test_coverage:
   before_script:
     - apt-get update && apt-get install -y python3-dev gcc
     - pip install -r requirements.txt
-    - pip install pytest httpx
+    - pip install pytest pytest-cov pytest-custom_exit_code httpx
     - mkdir -p app/cert
     - openssl genrsa -out app/cert/instance.private.pem 2048
     - openssl rsa -in app/cert/instance.private.pem -outform PEM -pubout -out app/cert/instance.public.pem
     - cd test
   script:
-    - pip install pytest pytest-cov
-    - coverage run -m pytest main.py
+    - coverage run -m pytest main.py --junitxml=report.xml --suppress-no-test-exit-code
    - coverage report
    - coverage xml
   coverage: '/(?i)total.*? (100(?:\.0+)?\%|[1-9]?\d(?:\.\d+)?\%)$/'
   artifacts:
     reports:
+      junit: [ '**/report.xml' ]
       coverage_report:
         coverage_format: cobertura
         path: '**/coverage.xml'

FAQ.md

@@ -1,17 +0,0 @@
-# FAQ
-
-## `Failed to acquire license from <ip> (Info: <license> - Error: The allowed time to process response has expired)`
-
-- Did your timezone settings are correct on fastapi-dls **and your guest**?
-- Did you download the client-token more than an hour ago?
-
-Please download a new client-token. The guest have to register within an hour after client-token was created.
-
-## `jose.exceptions.JWTError: Signature verification failed.`
-
-- Did you recreated `instance.public.pem` / `instance.private.pem`?
-
-Then you have to download a **new** client-token on each of your guests.


@@ -2,9 +2,12 @@
 Minimal Delegated License Service (DLS).
 
+> [!note] Compatibility
 > Compatibility tested with official NLS 2.0.1, 2.1.0, 3.1.0, 3.3.1, 3.4.0. For Driver compatibility
-see [compatibility matrix](#vgpu-software-compatibility-matrix).
+> see [compatibility matrix](#vgpu-software-compatibility-matrix).
 
-Drivers are only supported until **17.x releases**.
+> [!warning] 18.x Drivers are not yet supported!
+> Drivers are only supported until **17.x releases**.
 
 This service can be used without internet connection.
 Only the clients need a connection to this service on configured port.
@@ -83,7 +86,7 @@ docker run -e DLS_URL=`hostname -i` -e DLS_PORT=443 -p 443:443 -v $WORKING_DIR:/
 See [`examples`](examples) directory for more advanced examples (with reverse proxy usage).
 
-> Adjust *REQUIRED* variables as needed
+> Adjust `REQUIRED` variables as needed
 
 ```yaml
 version: '3.9'
@@ -330,14 +333,14 @@ Packages are available here:
 - [GitLab-Registry](https://git.collinwebdesigns.de/oscar.krause/fastapi-dls/-/packages)
 
-Successful tested with:
+Successful tested with (**LTS Version**):
 
 - **Debian 12 (Bookworm)** (EOL: June 06, 2026)
 - *Ubuntu 22.10 (Kinetic Kudu)* (EOL: July 20, 2023)
 - *Ubuntu 23.04 (Lunar Lobster)* (EOL: January 2024)
 - *Ubuntu 23.10 (Mantic Minotaur)* (EOL: July 2024)
-- **Ubuntu 24.04 (Noble Numbat)** (EOL: April 2036)
-- *Ubuntu 24.10 (Oracular Oriole)* (EOL: tba.)
+- **Ubuntu 24.04 (Noble Numbat)** (EOL: Apr 2029)
+- *Ubuntu 24.10 (Oracular Oriole)* (EOL: Jul 2025)
 
 Not working with:
@@ -399,6 +402,9 @@ Continue [here](#unraid-guest) for docker guest setup.
 
 Tanks to [@mrzenc](https://github.com/mrzenc) for [fastapi-dls-nixos](https://github.com/mrzenc/fastapi-dls-nixos).
 
+> [!note] Native NixOS-Package
+> There is a [pull request](https://github.com/NixOS/nixpkgs/pull/358647) which adds fastapi-dls into nixpkgs.
+
 ## Let's Encrypt Certificate (optional)
 
 If you're using installation via docker, you can use `traefik`. Please refer to their documentation.
@@ -600,6 +606,21 @@ Logs are available in `C:\Users\Public\Documents\Nvidia\LoggingLog.NVDisplay.Con
 
 # Known Issues
 
+## Generic
+
+### `Failed to acquire license from <ip> (Info: <license> - Error: The allowed time to process response has expired)`
+
+- Did your timezone settings are correct on fastapi-dls **and your guest**?
+- Did you download the client-token more than an hour ago?
+
+Please download a new client-token. The guest have to register within an hour after client-token was created.
+
+### `jose.exceptions.JWTError: Signature verification failed.`
+
+- Did you recreate `instance.public.pem` / `instance.private.pem`?
+
+Then you have to download a **new** client-token on each of your guests.
+
 ## Linux
 
 ### Invalid HTTP request
@@ -734,6 +755,9 @@ The error message can safely be ignored (since we have no license limitation :P)
 
 **18.x Drivers are not supported on FastAPI-DLS Versions < 1.6.0**
 
+<details>
+  <summary>Show Table</summary>
+
 Successfully tested with this package versions.
 
 | vGPU Suftware | Driver Branch | Linux vGPU Manager | Linux Driver | Windows Driver | Release Date | EOL Date |
@@ -757,6 +781,8 @@ Successfully tested with this package versions.
 | `15.4` | R525 | `525.147.01` | `525.147.05` | `529.19` | June 2023 | December 2023 |
 | `14.4` | R510 | `510.108.03` | `510.108.03` | `514.08` | December 2022 | February 2023 |
 
+</details>
+
 - https://docs.nvidia.com/grid/index.html
 - https://docs.nvidia.com/grid/gpus-supported-by-vgpu.html


@@ -6,7 +6,8 @@ from datetime import datetime, timedelta, UTC
 from hashlib import sha256
 from json import loads as json_loads
 from os import getenv as env
-from os.path import join, dirname
+from os.path import join, dirname, isfile
+from random import randbytes
 from uuid import uuid4
 
 from dateutil.relativedelta import relativedelta
@@ -21,7 +22,7 @@ from starlette.middleware.cors import CORSMiddleware
 from starlette.responses import StreamingResponse, JSONResponse as JSONr, HTMLResponse as HTMLr, Response, RedirectResponse
 
 from orm import Origin, Lease, init as db_init, migrate
-from util import load_key, load_file
+from util import PrivateKey, PublicKey, load_file, Cert
 
 # Load variables
 load_dotenv('../version.env')
@@ -42,17 +43,18 @@ DLS_PORT = int(env('DLS_PORT', '443'))
 SITE_KEY_XID = str(env('SITE_KEY_XID', '00000000-0000-0000-0000-000000000000'))
 INSTANCE_REF = str(env('INSTANCE_REF', '10000000-0000-0000-0000-000000000001'))
 ALLOTMENT_REF = str(env('ALLOTMENT_REF', '20000000-0000-0000-0000-000000000001'))
-INSTANCE_KEY_RSA = load_key(str(env('INSTANCE_KEY_RSA', join(dirname(__file__), 'cert/instance.private.pem'))))
-INSTANCE_KEY_PUB = load_key(str(env('INSTANCE_KEY_PUB', join(dirname(__file__), 'cert/instance.public.pem'))))
+INSTANCE_KEY_RSA = PrivateKey.from_file(str(env('INSTANCE_KEY_RSA', join(dirname(__file__), 'cert/instance.private.pem'))))
+INSTANCE_KEY_PUB = PublicKey.from_file(str(env('INSTANCE_KEY_PUB', join(dirname(__file__), 'cert/instance.public.pem'))))
 TOKEN_EXPIRE_DELTA = relativedelta(days=int(env('TOKEN_EXPIRE_DAYS', 1)), hours=int(env('TOKEN_EXPIRE_HOURS', 0)))
 LEASE_EXPIRE_DELTA = relativedelta(days=int(env('LEASE_EXPIRE_DAYS', 90)), hours=int(env('LEASE_EXPIRE_HOURS', 0)))
 LEASE_RENEWAL_PERIOD = float(env('LEASE_RENEWAL_PERIOD', 0.15))
 LEASE_RENEWAL_DELTA = timedelta(days=int(env('LEASE_EXPIRE_DAYS', 90)), hours=int(env('LEASE_EXPIRE_HOURS', 0)))
 CLIENT_TOKEN_EXPIRE_DELTA = relativedelta(years=12)
 CORS_ORIGINS = str(env('CORS_ORIGINS', '')).split(',') if (env('CORS_ORIGINS')) else [f'https://{DLS_URL}']
+DT_FORMAT = '%Y-%m-%dT%H:%M:%S.%fZ'
 
-jwt_encode_key = jwk.construct(INSTANCE_KEY_RSA.export_key().decode('utf-8'), algorithm=ALGORITHMS.RS256)
-jwt_decode_key = jwk.construct(INSTANCE_KEY_PUB.export_key().decode('utf-8'), algorithm=ALGORITHMS.RS256)
+jwt_encode_key = jwk.construct(INSTANCE_KEY_RSA.pem(), algorithm=ALGORITHMS.RS256)
+jwt_decode_key = jwk.construct(INSTANCE_KEY_PUB.pem(), algorithm=ALGORITHMS.RS256)
 
 # Logging
 LOG_LEVEL = logging.DEBUG if DEBUG else logging.INFO
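The hunk above (from the "created PrivateKey / PublicKey wrapper classes" commit) replaces `load_key` with `PrivateKey.from_file` / `PublicKey.from_file` and calls `.pem()` when constructing the JWT keys. The wrappers live in `util` and are not shown in this diff; the following is a hypothetical sketch in which only the method names used above (`from_file`, `pem`, `raw`) come from the diff — everything else is an assumption about what such wrappers might look like on top of `cryptography`:

```python
# Hypothetical sketch of the PrivateKey/PublicKey wrappers referenced in the
# diff; bodies are assumptions, not the repository's actual util.py.
from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.asymmetric import rsa


class PublicKey:
    def __init__(self, data: bytes):
        # parse a SubjectPublicKeyInfo PEM into an RSAPublicKey
        self._key = serialization.load_pem_public_key(data)

    @staticmethod
    def from_file(filename: str) -> 'PublicKey':
        with open(filename, 'rb') as f:
            return PublicKey(f.read())

    def raw(self) -> rsa.RSAPublicKey:
        # expose the underlying key, e.g. for .public_numbers()
        return self._key

    def pem(self) -> bytes:
        return self._key.public_bytes(
            encoding=serialization.Encoding.PEM,
            format=serialization.PublicFormat.SubjectPublicKeyInfo,
        )


class PrivateKey:
    def __init__(self, data: bytes):
        self._key = serialization.load_pem_private_key(data, password=None)

    @staticmethod
    def from_file(filename: str) -> 'PrivateKey':
        with open(filename, 'rb') as f:
            return PrivateKey(f.read())

    def raw(self) -> rsa.RSAPrivateKey:
        return self._key

    def pem(self) -> bytes:
        return self._key.private_bytes(
            encoding=serialization.Encoding.PEM,
            format=serialization.PrivateFormat.TraditionalOpenSSL,
            encryption_algorithm=serialization.NoEncryption(),
        )

    def public_key(self) -> PublicKey:
        # assumed convenience helper, not seen in the diff
        return PublicKey(self._key.public_key().public_bytes(
            encoding=serialization.Encoding.PEM,
            format=serialization.PublicFormat.SubjectPublicKeyInfo,
        ))
```

Returning PEM as `bytes` matches the diff's `INSTANCE_KEY_PUB.pem().decode('utf-8')` call, and `jwk.construct` accepts PEM input for RS256 keys.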
@@ -248,6 +250,7 @@ async def _client_token():
         "iat": timegm(cur_time.timetuple()),
         "nbf": timegm(cur_time.timetuple()),
         "exp": timegm(exp_time.timetuple()),
+        "protocol_version": "2.0",
         "update_mode": "ABSOLUTE",
         "scope_ref_list": [ALLOTMENT_REF],
         "fulfillment_class_ref_list": [],
@@ -264,10 +267,10 @@ async def _client_token():
         },
         "service_instance_public_key_configuration": {
             "service_instance_public_key_me": {
-                "mod": hex(INSTANCE_KEY_PUB.public_key().n)[2:],
-                "exp": int(INSTANCE_KEY_PUB.public_key().e),
+                "mod": hex(INSTANCE_KEY_PUB.raw().public_numbers().n)[2:],
+                "exp": int(INSTANCE_KEY_PUB.raw().public_numbers().e),
             },
-            "service_instance_public_key_pem": INSTANCE_KEY_PUB.export_key().decode('utf-8'),
+            "service_instance_public_key_pem": INSTANCE_KEY_PUB.pem().decode('utf-8'),
             "key_retention_mode": "LATEST_ONLY"
         },
     }
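The client-token hunk above now reads the RSA modulus and exponent through `cryptography`'s `public_numbers()` instead of pycryptodome's `.n` / `.e` attributes. A small standalone illustration of how those `"mod"` / `"exp"` fields are derived (the dict key names match the diff; the key itself is freshly generated here, 2048-bit only to keep it fast):

```python
# How the "mod"/"exp" fields of the client-token are derived from an RSA
# public key with the "cryptography" API. Illustrative, not repo code.
from cryptography.hazmat.primitives.asymmetric import rsa

private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
numbers = private_key.public_key().public_numbers()

service_instance_public_key_me = {
    'mod': hex(numbers.n)[2:],  # modulus as lowercase hex, '0x' prefix stripped
    'exp': int(numbers.e),      # public exponent, normally 65537
}
assert service_instance_public_key_me['exp'] == 65537
```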
@@ -287,7 +290,7 @@ async def auth_v1_origin(request: Request):
     j, cur_time = json_loads((await request.body()).decode('utf-8')), datetime.now(UTC)
 
     origin_ref = j.get('candidate_origin_ref')
-    logging.info(f'> [ origin ]: {origin_ref}: {j}')
+    logger.info(f'> [ origin ]: {origin_ref}: {j}')
 
     data = Origin(
         origin_ref=origin_ref,
@@ -298,14 +301,19 @@ async def auth_v1_origin(request: Request):
 
     Origin.create_or_update(db, data)
 
+    environment = {
+        'raw_env': j.get('environment')
+    }
+    environment.update(j.get('environment'))
+
     response = {
         "origin_ref": origin_ref,
-        "environment": j.get('environment'),
+        "environment": environment,
         "svc_port_set_list": None,
         "node_url_list": None,
         "node_query_order": None,
         "prompts": None,
-        "sync_timestamp": cur_time.isoformat()
+        "sync_timestamp": cur_time.strftime(DT_FORMAT)
     }
 
     return JSONr(response)
@@ -317,7 +325,7 @@ async def auth_v1_origin_update(request: Request):
     j, cur_time = json_loads((await request.body()).decode('utf-8')), datetime.now(UTC)
 
     origin_ref = j.get('origin_ref')
-    logging.info(f'> [ update ]: {origin_ref}: {j}')
+    logger.info(f'> [ update ]: {origin_ref}: {j}')
 
     data = Origin(
         origin_ref=origin_ref,
@@ -331,7 +339,7 @@ async def auth_v1_origin_update(request: Request):
     response = {
         "environment": j.get('environment'),
         "prompts": None,
-        "sync_timestamp": cur_time.isoformat()
+        "sync_timestamp": cur_time.strftime(DT_FORMAT)
     }
 
     return JSONr(response)
@@ -344,7 +352,7 @@ async def auth_v1_code(request: Request):
     j, cur_time = json_loads((await request.body()).decode('utf-8')), datetime.now(UTC)
 
     origin_ref = j.get('origin_ref')
-    logging.info(f'> [ code ]: {origin_ref}: {j}')
+    logger.info(f'> [ code ]: {origin_ref}: {j}')
 
     delta = relativedelta(minutes=15)
     expires = cur_time + delta
@@ -362,8 +370,8 @@ async def auth_v1_code(request: Request):
     response = {
         "auth_code": auth_code,
-        "sync_timestamp": cur_time.isoformat(),
-        "prompts": None
+        "prompts": None,
+        "sync_timestamp": cur_time.strftime(DT_FORMAT),
     }
 
     return JSONr(response)
@@ -381,7 +389,7 @@ async def auth_v1_token(request: Request):
         return JSONr(status_code=400, content={'status': 400, 'title': 'invalid token', 'detail': str(e)})
 
     origin_ref = payload.get('origin_ref')
-    logging.info(f'> [ auth ]: {origin_ref}: {j}')
+    logger.info(f'> [ auth ]: {origin_ref}: {j}')
 
     # validate the code challenge
     challenge = b64enc(sha256(j.get('code_verifier').encode('utf-8')).digest()).rstrip(b'=').decode('utf-8')
@@ -396,27 +404,274 @@ async def auth_v1_token(request: Request):
         'iss': 'https://cls.nvidia.org',
         'aud': 'https://cls.nvidia.org',
         'exp': timegm(access_expires_on.timetuple()),
-        'origin_ref': origin_ref,
         'key_ref': SITE_KEY_XID,
         'kid': SITE_KEY_XID,
+        'origin_ref': origin_ref,
     }
 
     auth_token = jwt.encode(new_payload, key=jwt_encode_key, headers={'kid': payload.get('kid')}, algorithm=ALGORITHMS.RS256)
 
     response = {
-        "expires": access_expires_on.isoformat(),
         "auth_token": auth_token,
-        "sync_timestamp": cur_time.isoformat(),
+        "expires": access_expires_on.strftime(DT_FORMAT),
+        "prompts": None,
+        "sync_timestamp": cur_time.strftime(DT_FORMAT),
     }
 
     return JSONr(response)
+# NLS 3.4.0 - venv/lib/python3.12/site-packages/nls_services_lease/test/test_lease_single_controller.py
+@app.post('/leasing/v1/config-token', description='request to get config token for lease operations')
+async def leasing_v1_config_token(request: Request):
+    j, cur_time = json_loads((await request.body()).decode('utf-8')), datetime.now(UTC)
+
+    logger.debug(f'CALLED /leasing/v1/config-token')
+    logger.debug(f'Headers: {request.headers}')
+    logger.debug(f'Request: {j}')
+
+    # todo: THIS IS A DEMO ONLY
+    ###
+    #
+    # https://git.collinwebdesigns.de/nvidia/nls/-/blob/main/src/test/test_config_token.py
+    #
+    ###
+
+    root_private_key_filename = join(dirname(__file__), 'cert/my_demo_root_private_key.pem')
+    root_certificate_filename = join(dirname(__file__), 'cert/my_demo_root_certificate.pem')
+    ca_private_key_filename = join(dirname(__file__), 'cert/my_demo_ca_private_key.pem')
+    ca_certificate_filename = join(dirname(__file__), 'cert/my_demo_ca_certificate.pem')
+    si_private_key_filename = join(dirname(__file__), 'cert/my_demo_si_private_key.pem')
+    si_certificate_filename = join(dirname(__file__), 'cert/my_demo_si_certificate.pem')
+
+    def init_config_token_demo():
+        from cryptography import x509
+        from cryptography.hazmat._oid import NameOID
+        from cryptography.hazmat.primitives import serialization, hashes
+        from cryptography.hazmat.primitives.asymmetric.rsa import generate_private_key
+        from cryptography.hazmat.primitives.serialization import Encoding
+
+        """ Create Root Key and Certificate """
+
+        # create root keypair
+        my_root_private_key = generate_private_key(public_exponent=65537, key_size=4096)
+        my_root_public_key = my_root_private_key.public_key()
+
+        # create root-certificate subject
+        my_root_subject = x509.Name([
+            x509.NameAttribute(NameOID.COUNTRY_NAME, u'US'),
+            x509.NameAttribute(NameOID.STATE_OR_PROVINCE_NAME, u'California'),
+            x509.NameAttribute(NameOID.ORGANIZATION_NAME, u'Nvidia'),
+            x509.NameAttribute(NameOID.ORGANIZATIONAL_UNIT_NAME, u'Nvidia Licensing Service (NLS)'),
+            x509.NameAttribute(NameOID.COMMON_NAME, u'NLS Root CA'),
+        ])
+
+        # create self-signed root-certificate
+        my_root_certificate = (
+            x509.CertificateBuilder()
+            .subject_name(my_root_subject)
+            .issuer_name(my_root_subject)
+            .public_key(my_root_public_key)
+            .serial_number(x509.random_serial_number())
+            .not_valid_before(datetime.now(tz=UTC) - timedelta(days=1))
+            .not_valid_after(datetime.now(tz=UTC) + timedelta(days=365 * 10))
+            .add_extension(x509.BasicConstraints(ca=True, path_length=None), critical=True)
+            .add_extension(x509.SubjectKeyIdentifier.from_public_key(my_root_public_key), critical=False)
+            .sign(my_root_private_key, hashes.SHA256()))
+
+        my_root_private_key_as_pem = my_root_private_key.private_bytes(
+            encoding=serialization.Encoding.PEM,
+            format=serialization.PrivateFormat.TraditionalOpenSSL,
+            encryption_algorithm=serialization.NoEncryption(),
+        )
+
+        with open(root_private_key_filename, 'wb') as f:
+            f.write(my_root_private_key_as_pem)
+
+        with open(root_certificate_filename, 'wb') as f:
+            f.write(my_root_certificate.public_bytes(encoding=Encoding.PEM))
+
+        """ Create CA (Intermediate) Key and Certificate """
+
+        # create ca keypair
+        my_ca_private_key = generate_private_key(public_exponent=65537, key_size=4096)
+        my_ca_public_key = my_ca_private_key.public_key()
+
+        # create ca-certificate subject
+        my_ca_subject = x509.Name([
+            x509.NameAttribute(NameOID.COUNTRY_NAME, u'US'),
+            x509.NameAttribute(NameOID.STATE_OR_PROVINCE_NAME, u'California'),
+            x509.NameAttribute(NameOID.ORGANIZATION_NAME, u'Nvidia'),
+            x509.NameAttribute(NameOID.ORGANIZATIONAL_UNIT_NAME, u'Nvidia Licensing Service (NLS)'),
+            x509.NameAttribute(NameOID.COMMON_NAME, u'NLS Intermediate CA'),
+        ])
+
+        # create root-signed ca-certificate
+        my_ca_certificate = (
+            x509.CertificateBuilder()
+            .subject_name(my_ca_subject)
+            .issuer_name(my_root_subject)
+            .public_key(my_ca_public_key)
+            .serial_number(x509.random_serial_number())
+            .not_valid_before(datetime.now(tz=UTC) - timedelta(days=1))
+            .not_valid_after(datetime.now(tz=UTC) + timedelta(days=365 * 10))
+            .add_extension(x509.BasicConstraints(ca=True, path_length=None), critical=True)
+            .add_extension(x509.KeyUsage(digital_signature=False, key_encipherment=False, key_cert_sign=True,
+                                         key_agreement=False, content_commitment=False, data_encipherment=False,
+                                         crl_sign=True, encipher_only=False, decipher_only=False), critical=True)
+            .add_extension(x509.SubjectKeyIdentifier.from_public_key(my_ca_public_key), critical=False)
+            # .add_extension(x509.AuthorityKeyIdentifier.from_issuer_public_key(my_root_public_key), critical=False)
+            .add_extension(x509.AuthorityKeyIdentifier.from_issuer_subject_key_identifier(
+                my_root_certificate.extensions.get_extension_for_class(x509.SubjectKeyIdentifier).value
+            ), critical=False)
+            .sign(my_root_private_key, hashes.SHA256()))
my_ca_private_key_as_pem = my_ca_private_key.private_bytes(
encoding=serialization.Encoding.PEM,
format=serialization.PrivateFormat.TraditionalOpenSSL,
encryption_algorithm=serialization.NoEncryption(),
)
with open(ca_private_key_filename, 'wb') as f:
f.write(my_ca_private_key_as_pem)
with open(ca_certificate_filename, 'wb') as f:
f.write(my_ca_certificate.public_bytes(encoding=Encoding.PEM))
""" Create Service-Instance Key and Certificate """
# create si keypair
my_si_private_key = generate_private_key(public_exponent=65537, key_size=2048)
my_si_public_key = my_si_private_key.public_key()
my_si_private_key_as_pem = my_si_private_key.private_bytes(
encoding=serialization.Encoding.PEM,
format=serialization.PrivateFormat.TraditionalOpenSSL,
encryption_algorithm=serialization.NoEncryption(),
)
my_si_public_key_as_pem = my_si_public_key.public_bytes(
encoding=serialization.Encoding.PEM,
format=serialization.PublicFormat.SubjectPublicKeyInfo,
)
with open(si_private_key_filename, 'wb') as f:
f.write(my_si_private_key_as_pem)
# with open('instance.public.pem', 'wb') as f:
# f.write(my_si_public_key_as_pem)
# create si-certificate subject
my_si_subject = x509.Name([
# x509.NameAttribute(NameOID.COMMON_NAME, INSTANCE_REF),
x509.NameAttribute(NameOID.COMMON_NAME, j.get('service_instance_ref')),
])
# create self-signed si-certificate
my_si_certificate = (
x509.CertificateBuilder()
.subject_name(my_si_subject)
.issuer_name(my_ca_subject)
.public_key(my_si_public_key)
.serial_number(x509.random_serial_number())
.not_valid_before(datetime.now(tz=UTC) - timedelta(days=1))
.not_valid_after(datetime.now(tz=UTC) + timedelta(days=365 * 10))
.add_extension(x509.KeyUsage(digital_signature=True, key_encipherment=True, key_cert_sign=False,
key_agreement=True, content_commitment=False, data_encipherment=False,
crl_sign=False, encipher_only=False, decipher_only=False), critical=True)
.add_extension(x509.ExtendedKeyUsage([
x509.oid.ExtendedKeyUsageOID.SERVER_AUTH,
x509.oid.ExtendedKeyUsageOID.CLIENT_AUTH]
), critical=False)
.add_extension(x509.SubjectKeyIdentifier.from_public_key(my_si_public_key), critical=False)
# .add_extension(x509.AuthorityKeyIdentifier.from_issuer_public_key(my_ca_public_key), critical=False)
.add_extension(x509.AuthorityKeyIdentifier.from_issuer_subject_key_identifier(
my_ca_certificate.extensions.get_extension_for_class(x509.SubjectKeyIdentifier).value
), critical=False)
.add_extension(x509.SubjectAlternativeName([
# x509.DNSName(INSTANCE_REF)
x509.DNSName(j.get('service_instance_ref'))
]), critical=False)
.sign(my_ca_private_key, hashes.SHA256()))
my_si_public_key_exp = my_si_certificate.public_key().public_numbers().e
my_si_public_key_mod = f'{my_si_certificate.public_key().public_numbers().n:x}' # hex value without "0x" prefix
with open(si_certificate_filename, 'wb') as f:
f.write(my_si_certificate.public_bytes(encoding=Encoding.PEM))
if not (isfile(root_private_key_filename)
and isfile(ca_private_key_filename)
and isfile(ca_certificate_filename)
and isfile(si_private_key_filename)
and isfile(si_certificate_filename)):
init_config_token_demo()
my_ca_certificate = Cert.from_file(ca_certificate_filename)
my_si_certificate = Cert.from_file(si_certificate_filename)
my_si_private_key = PrivateKey.from_file(si_private_key_filename)
my_si_private_key_as_pem = my_si_private_key.pem()
my_si_public_key = my_si_private_key.public_key().raw()
my_si_public_key_as_pem = my_si_private_key.public_key().pem()

""" build out payload """
cur_time = datetime.now(UTC)
exp_time = cur_time + CLIENT_TOKEN_EXPIRE_DELTA

payload = {
    "iss": "NLS Service Instance",
    "aud": "NLS Licensed Client",
    "iat": timegm(cur_time.timetuple()),
    "nbf": timegm(cur_time.timetuple()),
    "exp": timegm(exp_time.timetuple()),
    "protocol_version": "2.0",
    "d_name": "DLS",
    "service_instance_ref": j.get('service_instance_ref'),
    "service_instance_public_key_configuration": {
        "service_instance_public_key_me": {
            "mod": hex(my_si_public_key.public_numbers().n)[2:],
            "exp": int(my_si_public_key.public_numbers().e),
        },
        # 64 chars per line (pem default)
        "service_instance_public_key_pem": my_si_public_key_as_pem.decode('utf-8').strip(),
        "key_retention_mode": "LATEST_ONLY"
    },
}

my_jwt_encode_key = jwk.construct(my_si_private_key_as_pem.decode('utf-8'), algorithm=ALGORITHMS.RS256)
config_token = jws.sign(payload, key=my_jwt_encode_key, headers=None, algorithm=ALGORITHMS.RS256)

response_ca_chain = my_ca_certificate.pem().decode('utf-8')
response_si_certificate = my_si_certificate.pem().decode('utf-8')

response = {
    "certificateConfiguration": {
        # 76 chars per line
        "caChain": [response_ca_chain],
        # 76 chars per line
        "publicCert": response_si_certificate,
        "publicKey": {
            "exp": int(my_si_certificate.raw().public_key().public_numbers().e),
            "mod": [hex(my_si_certificate.raw().public_key().public_numbers().n)[2:]],
        },
    },
    "configToken": config_token,
}

logging.debug(response)
return JSONr(response, status_code=200)
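The `service_instance_public_key_me` fields above transport the RSA public key as a bare hex modulus (no `0x` prefix) plus an integer exponent. A minimal stdlib sketch of that encoding, using a fabricated stand-in modulus rather than a real key:

```python
# Toy 2048-bit "modulus" with the top bit set, standing in for a real RSA key
# so the snippet stays dependency-free. NOT a usable key.
n = (1 << 2047) | 0x10001
e = 65537  # common RSA public exponent

mod_hex = hex(n)[2:]        # hex string without the "0x" prefix
assert mod_hex == f'{n:x}'  # identical to the format-spec variant used elsewhere in the code
print(len(mod_hex))         # a 2048-bit modulus always encodes to 512 hex chars
```

This is also why the test-suite can assert `len(...('mod')) == 512` for the 2048-bit service-instance key.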
 # venv/lib/python3.9/site-packages/nls_services_lease/test/test_lease_multi_controller.py
 @app.post('/leasing/v1/lessor', description='request multiple leases (borrow) for current origin')
 async def leasing_v1_lessor(request: Request):
     j, token, cur_time = json_loads((await request.body()).decode('utf-8')), __get_token(request), datetime.now(UTC)
+    logger.debug(j)
+    logger.debug(request.headers)
     try:
         token = __get_token(request)
     except JWTError:
@@ -424,9 +679,10 @@ async def leasing_v1_lessor(request: Request):
     origin_ref = token.get('origin_ref')
     scope_ref_list = j.get('scope_ref_list')
-    logging.info(f'> [ create ]: {origin_ref}: create leases for scope_ref_list {scope_ref_list}')
+    logger.info(f'> [ create ]: {origin_ref}: create leases for scope_ref_list {scope_ref_list}')
     lease_result_list = []
+    # todo: for lease_proposal in lease_proposal_list
     for scope_ref in scope_ref_list:
         # if scope_ref not in [ALLOTMENT_REF]:
         #     return JSONr(status_code=500, detail=f'no service instances found for scopes: ["{scope_ref}"]')
@@ -434,29 +690,38 @@ async def leasing_v1_lessor(request: Request):
         lease_ref = str(uuid4())
         expires = cur_time + LEASE_EXPIRE_DELTA
         lease_result_list.append({
-            "ordinal": 0,
+            "error": None,
             # https://docs.nvidia.com/license-system/latest/nvidia-license-system-user-guide/index.html
             "lease": {
-                "ref": lease_ref,
-                "created": cur_time.isoformat(),
-                "expires": expires.isoformat(),
+                "created": cur_time.strftime(DT_FORMAT),
+                "expires": expires.strftime(DT_FORMAT),
+                "feature_name": "GRID-Virtual-WS",  # todo
+                "lease_intent_id": None,
+                "license_type": "CONCURRENT_COUNTED_SINGLE",
+                "metadata": None,
+                "offline_lease": False,  # todo
+                "product_name": "NVIDIA RTX Virtual Workstation",  # todo
                 "recommended_lease_renewal": LEASE_RENEWAL_PERIOD,
-                "offline_lease": "true",
-                "license_type": "CONCURRENT_COUNTED_SINGLE"
-            }
+                "ref": lease_ref,
+            },
+            "ordinal": None,
         })
         data = Lease(origin_ref=origin_ref, lease_ref=lease_ref, lease_created=cur_time, lease_expires=expires)
         Lease.create_or_update(db, data)
     response = {
+        "client_challenge": j.get('client_challenge'),
         "lease_result_list": lease_result_list,
-        "result_code": "SUCCESS",
-        "sync_timestamp": cur_time.isoformat(),
-        "prompts": None
+        "prompts": None,
+        "result_code": None,
+        "sync_timestamp": cur_time.strftime(DT_FORMAT),
     }
-    return JSONr(response)
+    logger.debug(response)
+    signature = f'b\'{randbytes(256).hex()}\''
+    return JSONr(response, headers={'access-control-expose-headers': 'X-NLS-Signature', 'X-NLS-Signature': signature})
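The `X-NLS-Signature` header produced here is a placeholder: 256 random bytes, hex-encoded and wrapped into a textual `b'...'` literal. A small stdlib sketch of that format and how a client/test can unwrap it (`ast.literal_eval` is a safer alternative to the plain `eval()` used in the test-suite):

```python
import ast
from random import randbytes  # Python 3.9+

# Build the placeholder header value exactly like the handler: 512 hex chars
# wrapped in a textual bytes-literal.
signature = f"b'{randbytes(256).hex()}'"

# Recover the bytes from the literal without executing arbitrary code.
recovered = ast.literal_eval(signature)
print(type(recovered).__name__, len(recovered))  # bytes 512
```

The recovered value is 512 bytes long because each of the 512 hex characters becomes one byte of the literal, which matches the `len(signature) == 512` assertion in the tests.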
 # venv/lib/python3.9/site-packages/nls_services_lease/test/test_lease_multi_controller.py
@@ -468,12 +733,12 @@ async def leasing_v1_lessor_lease(request: Request):
     origin_ref = token.get('origin_ref')
     active_lease_list = list(map(lambda x: x.lease_ref, Lease.find_by_origin_ref(db, origin_ref)))
-    logging.info(f'> [ leases ]: {origin_ref}: found {len(active_lease_list)} active leases')
+    logger.info(f'> [ leases ]: {origin_ref}: found {len(active_lease_list)} active leases')
     response = {
         "active_lease_list": active_lease_list,
-        "sync_timestamp": cur_time.isoformat(),
-        "prompts": None
+        "prompts": None,
+        "sync_timestamp": cur_time.strftime(DT_FORMAT),
     }
     return JSONr(response)
@@ -483,10 +748,10 @@ async def leasing_v1_lessor_lease(request: Request):
 # venv/lib/python3.9/site-packages/nls_core_lease/lease_single.py
 @app.put('/leasing/v1/lease/{lease_ref}', description='renew a lease')
 async def leasing_v1_lease_renew(request: Request, lease_ref: str):
-    token, cur_time = __get_token(request), datetime.now(UTC)
+    j, token, cur_time = json_loads((await request.body()).decode('utf-8')), __get_token(request), datetime.now(UTC)
     origin_ref = token.get('origin_ref')
-    logging.info(f'> [ renew ]: {origin_ref}: renew {lease_ref}')
+    logger.info(f'> [ renew ]: {origin_ref}: renew {lease_ref}')
     entity = Lease.find_by_origin_ref_and_lease_ref(db, origin_ref, lease_ref)
     if entity is None:
@@ -494,17 +759,21 @@ async def leasing_v1_lease_renew(request: Request, lease_ref: str):
     expires = cur_time + LEASE_EXPIRE_DELTA
     response = {
+        "client_challenge": j.get('client_challenge'),
+        "expires": expires.strftime(DT_FORMAT),
+        "feature_expired": False,
         "lease_ref": lease_ref,
-        "expires": expires.isoformat(),
-        "recommended_lease_renewal": LEASE_RENEWAL_PERIOD,
+        "metadata": None,
         "offline_lease": True,
         "prompts": None,
-        "sync_timestamp": cur_time.isoformat(),
+        "recommended_lease_renewal": LEASE_RENEWAL_PERIOD,
+        "sync_timestamp": cur_time.strftime(DT_FORMAT),
     }
     Lease.renew(db, entity, expires, cur_time)
-    return JSONr(response)
+    signature = f'b\'{randbytes(256).hex()}\''
+    return JSONr(response, headers={'access-control-expose-headers': 'X-NLS-Signature', 'X-NLS-Signature': signature})
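These hunks consistently swap `cur_time.isoformat()` for `cur_time.strftime(DT_FORMAT)`. `DT_FORMAT` itself is not part of this excerpt; the pattern below is an assumption chosen to show the practical difference, a `Z`-suffixed UTC timestamp instead of a `+00:00` offset:

```python
from datetime import datetime, timezone

# Assumed definition of DT_FORMAT (not taken from the diff).
DT_FORMAT = '%Y-%m-%dT%H:%M:%S.%fZ'

cur_time = datetime(2025, 4, 14, 13, 38, 21, 123456, tzinfo=timezone.utc)
print(cur_time.isoformat())          # 2025-04-14T13:38:21.123456+00:00 (old responses)
print(cur_time.strftime(DT_FORMAT))  # 2025-04-14T13:38:21.123456Z (new responses)
```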
 # venv/lib/python3.9/site-packages/nls_services_lease/test/test_lease_single_controller.py
@@ -513,7 +782,7 @@ async def leasing_v1_lease_delete(request: Request, lease_ref: str):
     token, cur_time = __get_token(request), datetime.now(UTC)
     origin_ref = token.get('origin_ref')
-    logging.info(f'> [ return ]: {origin_ref}: return {lease_ref}')
+    logger.info(f'> [ return ]: {origin_ref}: return {lease_ref}')
     entity = Lease.find_by_lease_ref(db, lease_ref)
     if entity.origin_ref != origin_ref:
@@ -525,9 +794,10 @@ async def leasing_v1_lease_delete(request: Request, lease_ref: str):
         return JSONr(status_code=404, content={'status': 404, 'detail': 'lease not found'})
     response = {
+        "client_challenge": None,
         "lease_ref": lease_ref,
         "prompts": None,
-        "sync_timestamp": cur_time.isoformat(),
+        "sync_timestamp": cur_time.strftime(DT_FORMAT),
     }
     return JSONr(response)
@@ -542,13 +812,13 @@ async def leasing_v1_lessor_lease_remove(request: Request):
     released_lease_list = list(map(lambda x: x.lease_ref, Lease.find_by_origin_ref(db, origin_ref)))
     deletions = Lease.cleanup(db, origin_ref)
-    logging.info(f'> [ remove ]: {origin_ref}: removed {deletions} leases')
+    logger.info(f'> [ remove ]: {origin_ref}: removed {deletions} leases')
     response = {
         "released_lease_list": released_lease_list,
         "release_failure_list": None,
-        "sync_timestamp": cur_time.isoformat(),
-        "prompts": None
+        "prompts": None,
+        "sync_timestamp": cur_time.strftime(DT_FORMAT),
     }
     return JSONr(response)
@@ -564,13 +834,13 @@ async def leasing_v1_lessor_shutdown(request: Request):
     released_lease_list = list(map(lambda x: x.lease_ref, Lease.find_by_origin_ref(db, origin_ref)))
     deletions = Lease.cleanup(db, origin_ref)
-    logging.info(f'> [ shutdown ]: {origin_ref}: removed {deletions} leases')
+    logger.info(f'> [ shutdown ]: {origin_ref}: removed {deletions} leases')
     response = {
         "released_lease_list": released_lease_list,
         "release_failure_list": None,
-        "sync_timestamp": cur_time.isoformat(),
-        "prompts": None
+        "prompts": None,
+        "sync_timestamp": cur_time.strftime(DT_FORMAT),
     }
     return JSONr(response)
@@ -587,7 +857,7 @@ if __name__ == '__main__':
     #
     ###
-    logging.info(f'> Starting dev-server ...')
+    logger.info(f'> Starting dev-server ...')
     ssl_keyfile = join(dirname(__file__), 'cert/webserver.key')
     ssl_certfile = join(dirname(__file__), 'cert/webserver.crt')


@@ -1,8 +1,108 @@
 import logging
 
+from cryptography.hazmat.primitives import serialization
+from cryptography.hazmat.primitives.asymmetric.rsa import RSAPrivateKey, RSAPublicKey, generate_private_key
+from cryptography.hazmat.primitives.serialization import load_pem_private_key, load_pem_public_key
+from cryptography.x509 import load_pem_x509_certificate, Certificate
+
 logging.basicConfig()
 
+
+class PrivateKey:
+
+    def __init__(self, data: bytes):
+        self.__key = load_pem_private_key(data, password=None)
+
+    @staticmethod
+    def from_file(filename: str) -> "PrivateKey":
+        log = logging.getLogger(__name__)
+        log.debug(f'Importing RSA-Private-Key from "{filename}"')
+
+        with open(filename, 'rb') as f:
+            data = f.read()
+
+        return PrivateKey(data=data.strip())
+
+    def raw(self) -> RSAPrivateKey:
+        return self.__key
+
+    def pem(self) -> bytes:
+        return self.__key.private_bytes(
+            encoding=serialization.Encoding.PEM,
+            format=serialization.PrivateFormat.TraditionalOpenSSL,
+            encryption_algorithm=serialization.NoEncryption()
+        )
+
+    def public_key(self) -> "PublicKey":
+        data = self.__key.public_key().public_bytes(
+            encoding=serialization.Encoding.PEM,
+            format=serialization.PublicFormat.SubjectPublicKeyInfo
+        )
+        return PublicKey(data=data)
+
+    @staticmethod
+    def generate(public_exponent: int = 65537, key_size: int = 2048) -> "PrivateKey":
+        log = logging.getLogger(__name__)
+        log.debug('Generating RSA-Key')
+        key = generate_private_key(public_exponent=public_exponent, key_size=key_size)
+        data = key.private_bytes(
+            encoding=serialization.Encoding.PEM,
+            format=serialization.PrivateFormat.TraditionalOpenSSL,
+            encryption_algorithm=serialization.NoEncryption()
+        )
+        return PrivateKey(data=data)
+
+
+class PublicKey:
+
+    def __init__(self, data: bytes):
+        self.__key = load_pem_public_key(data)
+
+    @staticmethod
+    def from_file(filename: str) -> "PublicKey":
+        log = logging.getLogger(__name__)
+        log.debug(f'Importing RSA-Public-Key from "{filename}"')
+
+        with open(filename, 'rb') as f:
+            data = f.read()
+
+        return PublicKey(data=data.strip())
+
+    def raw(self) -> RSAPublicKey:
+        return self.__key
+
+    def pem(self) -> bytes:
+        return self.__key.public_bytes(
+            encoding=serialization.Encoding.PEM,
+            format=serialization.PublicFormat.SubjectPublicKeyInfo
+        )
+
+
+class Cert:
+
+    def __init__(self, data: bytes):
+        self.__cert = load_pem_x509_certificate(data)
+
+    @staticmethod
+    def from_file(filename: str) -> "Cert":
+        log = logging.getLogger(__name__)
+        log.debug(f'Importing Certificate from "{filename}"')
+
+        with open(filename, 'rb') as f:
+            data = f.read()
+
+        return Cert(data=data.strip())
+
+    def raw(self) -> Certificate:
+        return self.__cert
+
+    def pem(self) -> bytes:
+        return self.__cert.public_bytes(encoding=serialization.Encoding.PEM)
+
+    def signature(self) -> bytes:
+        return self.__cert.signature
+
+
 def load_file(filename: str) -> bytes:
     log = logging.getLogger(f'{__name__}')
     log.debug(f'Loading contents of file "{filename}"')
@@ -11,33 +111,6 @@ def load_file(filename: str) -> bytes:
     return content
 
-def load_key(filename: str) -> "RsaKey":
-    try:
-        # Crypto | Cryptodome on Debian
-        from Crypto.PublicKey import RSA
-        from Crypto.PublicKey.RSA import RsaKey
-    except ModuleNotFoundError:
-        from Cryptodome.PublicKey import RSA
-        from Cryptodome.PublicKey.RSA import RsaKey
-
-    log = logging.getLogger(__name__)
-    log.debug(f'Importing RSA-Key from "{filename}"')
-
-    return RSA.import_key(extern_key=load_file(filename), passphrase=None)
-
-def generate_key() -> "RsaKey":
-    try:
-        # Crypto | Cryptodome on Debian
-        from Crypto.PublicKey import RSA
-        from Crypto.PublicKey.RSA import RsaKey
-    except ModuleNotFoundError:
-        from Cryptodome.PublicKey import RSA
-        from Cryptodome.PublicKey.RSA import RsaKey
-
-    log = logging.getLogger(__name__)
-    log.debug('Generating RSA-Key')
-
-    return RSA.generate(bits=2048)
-
 class NV:
     __DRIVER_MATRIX_FILENAME = 'static/driver_matrix.json'
     __DRIVER_MATRIX: None | dict = None  # https://docs.nvidia.com/grid/ => "Driver Versions"


@@ -1,26 +0,0 @@
# Database structure
## `request_routing.service_instance`
| xid | org_name |
|----------------------------------------|--------------------------|
| `10000000-0000-0000-0000-000000000000` | `lic-000000000000000000` |
- `xid` is used as `SERVICE_INSTANCE_XID`
## `request_routing.license_allotment_service_instance`
| xid | service_instance_xid | license_allotment_xid |
|----------------------------------------|----------------------------------------|----------------------------------------|
| `90000000-0000-0000-0000-000000000001` | `10000000-0000-0000-0000-000000000000` | `80000000-0000-0000-0000-000000000001` |
- `xid` is only a primary-key and never used as foreign-key or reference
- `license_allotment_xid` must be used to fetch `xid`'s from `request_routing.license_allotment_reference`
## `request_routing.license_allotment_reference`
| xid | license_allotment_xid |
|----------------------------------------|----------------------------------------|
| `20000000-0000-0000-0000-000000000001` | `80000000-0000-0000-0000-000000000001` |
- `xid` is used as `scope_ref_list` on token request
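The chain described above (`scope_ref` → `license_allotment` → `service_instance`) can be modeled with a throwaway sqlite sketch; the real appliance uses PostgreSQL with the `request_routing` schema, so the schema prefix is dropped here and only the columns from the tables above are kept:

```python
import sqlite3

# In-memory toy model of the three request_routing tables.
db = sqlite3.connect(':memory:')
db.executescript('''
    CREATE TABLE service_instance (xid TEXT, org_name TEXT);
    CREATE TABLE license_allotment_service_instance (xid TEXT, service_instance_xid TEXT, license_allotment_xid TEXT);
    CREATE TABLE license_allotment_reference (xid TEXT, license_allotment_xid TEXT);
''')
db.execute("INSERT INTO service_instance VALUES ('10000000-0000-0000-0000-000000000000', 'lic-000000000000000000')")
db.execute("INSERT INTO license_allotment_service_instance VALUES ('90000000-0000-0000-0000-000000000001', '10000000-0000-0000-0000-000000000000', '80000000-0000-0000-0000-000000000001')")
db.execute("INSERT INTO license_allotment_reference VALUES ('20000000-0000-0000-0000-000000000001', '80000000-0000-0000-0000-000000000001')")

# A scope_ref from the token request is an xid in license_allotment_reference;
# resolve it back to the owning service instance.
scope_ref = '20000000-0000-0000-0000-000000000001'
row = db.execute('''
    SELECT si.xid
    FROM license_allotment_reference lar
    JOIN license_allotment_service_instance lasi ON lasi.license_allotment_xid = lar.license_allotment_xid
    JOIN service_instance si ON si.xid = lasi.service_instance_xid
    WHERE lar.xid = ?
''', (scope_ref,)).fetchone()
print(row[0])  # 10000000-0000-0000-0000-000000000000
```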


@@ -1,181 +0,0 @@
# Reverse Engineering Notes
[[_TOC_]]
# Useful commands
## Check licensing status
- `nvidia-smi -q | grep "License"`
**Output**
```
vGPU Software Licensed Product
License Status : Licensed (Expiry: 2023-1-14 12:59:52 GMT)
```
## Track licensing progress
- NVIDIA Grid Log: `journalctl -u nvidia-gridd -f`
```
systemd[1]: Started NVIDIA Grid Daemon.
nvidia-gridd[2986]: Configuration parameter ( ServerAddress ) not set
nvidia-gridd[2986]: vGPU Software package (0)
nvidia-gridd[2986]: Ignore service provider and node-locked licensing
nvidia-gridd[2986]: NLS initialized
nvidia-gridd[2986]: Acquiring license. (Info: license.nvidia.space; NVIDIA RTX Virtual Workstation)
nvidia-gridd[2986]: License acquired successfully. (Info: license.nvidia.space, NVIDIA RTX Virtual Workstation; Expiry: 2023-1-29 22:3:0 GMT)
```
# DLS-Container File-System (Docker)
- More about the Docker images: https://git.collinwebdesigns.de/nvidia/nls
## Configuration data
Most variables and configs are stored in `/var/lib/docker/volumes/configurations/_data`.
Files can be copied out with `docker cp <container-id>:/venv/... /opt/localfile/...`, modified, and copied back the same way.
(You may need to fix permissions afterwards with `docker exec -u 0 <container-id> chown nonroot:nonroot /venv/...`.)
## Dive / Docker image inspector
- `dive dls:appliance`
The source code is stored in `/venv/lib/python3.9/site-packages/nls_*`.
Image-Reference:
```
Tags: (unavailable)
Id: d1c7976a5d2b3681ff6c5a30f8187e4015187a83f3f285ba4a37a45458bd6b98
Digest: sha256:311223c5af7a298ec1104f5dc8c3019bfb0e1f77256dc3d995244ffb295a971f
Command:
#(nop) ADD file:c1900d3e3a29c29a743a8da86c437006ec5d2aa873fb24e48033b6bf492bb37b in /
```
## Private Key (Site-Key)
- `/etc/dls/config/decryptor/decryptor`
```shell
docker exec -it <container-id> /etc/dls/config/decryptor/decryptor > /tmp/private-key.pem
```
```
-----BEGIN RSA PRIVATE KEY-----
...
-----END RSA PRIVATE KEY-----
```
## Site Key Uri - `/etc/dls/config/site_key_uri.bin`
```
base64-content...
```
## DB Password - `/etc/dls/config/dls_db_password.bin`
```
base64-content...
```
**Decrypt database password**
```
cd /var/lib/docker/volumes/configurations/_data
cat dls_db_password.bin | base64 -d > dls_db_password.bin.raw
openssl rsautl -decrypt -inkey /tmp/private-key.pem -in dls_db_password.bin.raw
```
# Database
- It is enough to manipulate the licenses in the database; not a single line of code has to be changed to bypass the licensing
  validations.
# Logging / Stack Trace
- https://docs.nvidia.com/license-system/latest/nvidia-license-system-user-guide/index.html#troubleshooting-dls-instance
**Failed licensing log**
```
{
"activity": 100,
"context": {
"SERVICE_INSTANCE_ID": "b43d6e46-d6d0-4943-8b8d-c66a5f6e0d38",
"SERVICE_INSTANCE_NAME": "DEFAULT_2022-12-14_12:48:30",
"description": "borrow failed: NotFoundError(no pool features found for: NVIDIA RTX Virtual Workstation)",
"event_type": null,
"function_name": "_evt",
"lineno": 54,
"module_name": "nls_dal_lease_dls.event",
"operation_id": "e72a8ca7-34cc-4e11-b80c-273592085a24",
"origin_ref": "3f7f5a50-a26b-425b-8d5e-157f63e72b1c",
"service_name": "nls_services_lease"
},
"detail": {
"oc": {
"license_allotment_xid": "10c4317f-7c4c-11ed-a524-0e4252a7e5f1",
"origin_ref": "3f7f5a50-a26b-425b-8d5e-157f63e72b1c",
"service_instance_xid": "b43d6e46-d6d0-4943-8b8d-c66a5f6e0d38"
},
"operation_id": "e72a8ca7-34cc-4e11-b80c-273592085a24"
},
"id": "0cc9e092-3b92-4652-8d9e-7622ef85dc79",
"metadata": {},
"ts": "2022-12-15T10:25:36.827661Z"
}
{
"activity": 400,
"context": {
"SERVICE_INSTANCE_ID": "b43d6e46-d6d0-4943-8b8d-c66a5f6e0d38",
"SERVICE_INSTANCE_NAME": "DEFAULT_2022-12-14_12:48:30",
"description": "lease_multi_create failed: no pool features found for: NVIDIA RTX Virtual Workstation",
"event_by": "system",
"function_name": "lease_multi_create",
"level": "warning",
"lineno": 157,
"module_name": "nls_services_lease.controllers.lease_multi_controller",
"operation_id": "e72a8ca7-34cc-4e11-b80c-273592085a24",
"service_name": "nls_services_lease"
},
"detail": {
"_msg": "lease_multi_create failed: no pool features found for: NVIDIA RTX Virtual Workstation",
"exec_info": ["NotFoundError", "NotFoundError(no pool features found for: NVIDIA RTX Virtual Workstation)", " File \"/venv/lib/python3.9/site-packages/nls_services_lease/controllers/lease_multi_controller.py\", line 127, in lease_multi_create\n data = _leaseMulti.lease_multi_create(event_args)\n File \"/venv/lib/python3.9/site-packages/nls_core_lease/lease_multi.py\", line 208, in lease_multi_create\n raise e\n File \"/venv/lib/python3.9/site-packages/nls_core_lease/lease_multi.py\", line 184, in lease_multi_create\n self._try_proposals(oc, mlr, results, detail)\n File \"/venv/lib/python3.9/site-packages/nls_core_lease/lease_multi.py\", line 219, in _try_proposals\n lease = self._leases.create(creator)\n File \"/venv/lib/python3.9/site-packages/nls_dal_lease_dls/leases.py\", line 230, in create\n features = self._get_features(creator)\n File \"/venv/lib/python3.9/site-packages/nls_dal_lease_dls/leases.py\", line 148, in _get_features\n self._explain_not_available(cur, creator)\n File \"/venv/lib/python3.9/site-packages/nls_dal_lease_dls/leases.py\", line 299, in _explain_not_available\n raise NotFoundError(f'no pool features found for: {lcc.product_name}')\n"],
"operation_id": "e72a8ca7-34cc-4e11-b80c-273592085a24"
},
"id": "282801b9-d612-40a5-9145-b56d8e420dac",
"metadata": {},
"ts": "2022-12-15T10:25:36.831673Z"
}
```
**Stack Trace**
```
"NotFoundError", "NotFoundError(no pool features found for: NVIDIA RTX Virtual Workstation)", " File \"/venv/lib/python3.9/site-packages/nls_services_lease/controllers/lease_multi_controller.py\", line 127, in lease_multi_create
data = _leaseMulti.lease_multi_create(event_args)
File \"/venv/lib/python3.9/site-packages/nls_core_lease/lease_multi.py\", line 208, in lease_multi_create
raise e
File \"/venv/lib/python3.9/site-packages/nls_core_lease/lease_multi.py\", line 184, in lease_multi_create
self._try_proposals(oc, mlr, results, detail)
File \"/venv/lib/python3.9/site-packages/nls_core_lease/lease_multi.py\", line 219, in _try_proposals
lease = self._leases.create(creator)
File \"/venv/lib/python3.9/site-packages/nls_dal_lease_dls/leases.py\", line 230, in create
features = self._get_features(creator)
File \"/venv/lib/python3.9/site-packages/nls_dal_lease_dls/leases.py\", line 148, in _get_features
self._explain_not_available(cur, creator)
File \"/venv/lib/python3.9/site-packages/nls_dal_lease_dls/leases.py\", line 299, in _explain_not_available
raise NotFoundError(f'no pool features found for: {lcc.product_name}')
"
```
# Nginx
- NGINX uses `/opt/certs/cert.pem` and `/opt/certs/key.pem`


@@ -1,8 +1,8 @@
-fastapi==0.115.8
+fastapi==0.115.12
 uvicorn[standard]==0.34.0
-python-jose==3.4.0
+python-jose[cryptography]==3.4.0
-pycryptodome==3.21.0
+cryptography==44.0.2
-python-dateutil==2.8.2
+python-dateutil==2.9.0
-sqlalchemy==2.0.38
+sqlalchemy==2.0.40
 markdown==3.7
-python-dotenv==1.0.1
+python-dotenv==1.1.0


@@ -1,3 +1,4 @@
import json
import sys import sys
from base64 import b64encode as b64enc from base64 import b64encode as b64enc
from calendar import timegm from calendar import timegm
@@ -7,7 +8,7 @@ from os.path import dirname, join
from uuid import uuid4, UUID from uuid import uuid4, UUID
from dateutil.relativedelta import relativedelta from dateutil.relativedelta import relativedelta
from jose import jwt, jwk from jose import jwt, jwk, jws
from jose.constants import ALGORITHMS from jose.constants import ALGORITHMS
from starlette.testclient import TestClient from starlette.testclient import TestClient
@@ -16,20 +17,21 @@ sys.path.append('../')
sys.path.append('../app') sys.path.append('../app')
from app import main from app import main
from app.util import load_key from util import PrivateKey, PublicKey
client = TestClient(main.app) client = TestClient(main.app)
INSTANCE_REF = '10000000-0000-0000-0000-000000000001'
ORIGIN_REF, ALLOTMENT_REF, SECRET = str(uuid4()), '20000000-0000-0000-0000-000000000001', 'HelloWorld' ORIGIN_REF, ALLOTMENT_REF, SECRET = str(uuid4()), '20000000-0000-0000-0000-000000000001', 'HelloWorld'
# INSTANCE_KEY_RSA = generate_key() # INSTANCE_KEY_RSA = generate_key()
# INSTANCE_KEY_PUB = INSTANCE_KEY_RSA.public_key() # INSTANCE_KEY_PUB = INSTANCE_KEY_RSA.public_key()
INSTANCE_KEY_RSA = load_key(str(join(dirname(__file__), '../app/cert/instance.private.pem'))) INSTANCE_KEY_RSA = PrivateKey.from_file(str(join(dirname(__file__), '../app/cert/instance.private.pem')))
INSTANCE_KEY_PUB = load_key(str(join(dirname(__file__), '../app/cert/instance.public.pem'))) INSTANCE_KEY_PUB = PublicKey.from_file(str(join(dirname(__file__), '../app/cert/instance.public.pem')))
jwt_encode_key = jwk.construct(INSTANCE_KEY_RSA.export_key().decode('utf-8'), algorithm=ALGORITHMS.RS256) jwt_encode_key = jwk.construct(INSTANCE_KEY_RSA.pem(), algorithm=ALGORITHMS.RS256)
jwt_decode_key = jwk.construct(INSTANCE_KEY_PUB.export_key().decode('utf-8'), algorithm=ALGORITHMS.RS256) jwt_decode_key = jwk.construct(INSTANCE_KEY_PUB.pem(), algorithm=ALGORITHMS.RS256)
def __bearer_token(origin_ref: str) -> str: def __bearer_token(origin_ref: str) -> str:
@@ -69,6 +71,31 @@ def test_client_token():
    assert response.status_code == 200


def test_config_token():  # todo: /leasing/v1/config-token
    # https://git.collinwebdesigns.de/nvidia/nls/-/blob/main/src/test/test_config_token.py

    response = client.post('/leasing/v1/config-token', json={"service_instance_ref": INSTANCE_REF})
    assert response.status_code == 200

    nv_response_certificate_configuration = response.json().get('certificateConfiguration')
    nv_response_public_cert = nv_response_certificate_configuration.get('publicCert').encode('utf-8')
    nv_jwt_decode_key = jwk.construct(nv_response_public_cert, algorithm=ALGORITHMS.RS256)

    nv_response_config_token = response.json().get('configToken')

    payload = jws.verify(nv_response_config_token, key=nv_jwt_decode_key, algorithms=ALGORITHMS.RS256)
    payload = json.loads(payload)

    assert payload.get('iss') == 'NLS Service Instance'
    assert payload.get('aud') == 'NLS Licensed Client'
    assert payload.get('service_instance_ref') == INSTANCE_REF

    nv_si_public_key_configuration = payload.get('service_instance_public_key_configuration')
    nv_si_public_key_me = nv_si_public_key_configuration.get('service_instance_public_key_me')
    # assert nv_si_public_key_me.get('mod') == 1  # nv_si_public_key_mod
    assert len(nv_si_public_key_me.get('mod')) == 512
    assert nv_si_public_key_me.get('exp') == 65537  # nv_si_public_key_exp
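The two assertions on `mod` and `exp` encode RSA key-size expectations: 65537 (0x10001) is the customary RSA public exponent, and a 512-character `mod` is consistent with a hex-encoded 2048-bit modulus. That encoding is an assumption inferred from the length check, not from a spec; a stdlib-only sketch of the arithmetic behind it:

```python
# Stand-in for a 2048-bit RSA modulus: an odd integer with the top bit of a
# 2048-bit number set. Real moduli are products of two large primes; only the
# bit length matters for this sketch.
modulus = (1 << 2047) | 1
mod_hex = format(modulus, 'x')

assert modulus.bit_length() == 2048
assert len(mod_hex) == 512  # 2048 bits / 4 bits per hex digit

exp = 65537  # 0x10001, the conventional RSA public exponent
assert exp == 0x10001
```

So `len(mod) == 512` and `exp == 65537` together check that the service instance published what looks like a standard 2048-bit RSA public key, without parsing the key itself.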
def test_origins():
    pass
@@ -168,6 +195,7 @@ def test_auth_v1_token():
def test_leasing_v1_lessor():
    payload = {
        'client_challenge': 'my_unique_string',
        'fulfillment_context': {
            'fulfillment_class_ref_list': []
        },
@@ -182,13 +210,16 @@ def test_leasing_v1_lessor():
    response = client.post('/leasing/v1/lessor', json=payload, headers={'authorization': __bearer_token(ORIGIN_REF)})
    assert response.status_code == 200

    client_challenge = response.json().get('client_challenge')
    assert client_challenge == payload.get('client_challenge')

    signature = eval(response.headers.get('X-NLS-Signature'))
    assert len(signature) == 512

    lease_result_list = response.json().get('lease_result_list')
    assert len(lease_result_list) == 1
    assert len(lease_result_list[0]['lease']['ref']) == 36
    assert str(UUID(lease_result_list[0]['lease']['ref'])) == lease_result_list[0]['lease']['ref']

    return lease_result_list[0]['lease']['ref']
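The `eval()` call above implies the `X-NLS-Signature` header is serialized as a Python `bytes` literal (e.g. `"b'\x12\x34...'"`); the 512-byte length the test asserts would match a 4096-bit RSA signature. A safer stdlib-only way to parse such a header is `ast.literal_eval`, which accepts only literals and cannot execute arbitrary code (a sketch; the header format is inferred from the test, not from a spec):

```python
import ast


def parse_signature_header(value: str) -> bytes:
    """Parse an X-NLS-Signature header serialized as a Python bytes
    literal, without the code-execution risk of eval()."""
    sig = ast.literal_eval(value)
    if not isinstance(sig, bytes):
        raise ValueError('X-NLS-Signature did not contain a bytes literal')
    return sig


# Simulate a header carrying a 512-byte (4096-bit) signature.
header = repr(b'\x00' * 512)
signature = parse_signature_header(header)
assert len(signature) == 512
```

`ast.literal_eval` raises `ValueError`/`SyntaxError` on anything that is not a plain literal, so a malicious or malformed header fails loudly instead of being executed.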
def test_leasing_v1_lessor_lease():
    response = client.get('/leasing/v1/lessor/leases', headers={'authorization': __bearer_token(ORIGIN_REF)})
@@ -207,9 +238,15 @@ def test_leasing_v1_lease_renew():
    ###

    payload = {'client_challenge': 'my_unique_string'}
    response = client.put(f'/leasing/v1/lease/{active_lease_ref}', json=payload, headers={'authorization': __bearer_token(ORIGIN_REF)})
    assert response.status_code == 200

    client_challenge = response.json().get('client_challenge')
    assert client_challenge == payload.get('client_challenge')

    signature = eval(response.headers.get('X-NLS-Signature'))
    assert len(signature) == 512

    lease_ref = response.json().get('lease_ref')
    assert len(lease_ref) == 36
    assert lease_ref == active_lease_ref
@@ -231,7 +268,23 @@ def test_leasing_v1_lease_delete():
def test_leasing_v1_lessor_lease_remove():
    # see "test_leasing_v1_lessor()"
    payload = {
        'fulfillment_context': {
            'fulfillment_class_ref_list': []
        },
        'lease_proposal_list': [{
            'license_type_qualifiers': {'count': 1},
            'product': {'name': 'NVIDIA RTX Virtual Workstation'}
        }],
        'proposal_evaluation_mode': 'ALL_OF',
        'scope_ref_list': [ALLOTMENT_REF]
    }
    response = client.post('/leasing/v1/lessor', json=payload, headers={'authorization': __bearer_token(ORIGIN_REF)})
    lease_result_list = response.json().get('lease_result_list')
    lease_ref = lease_result_list[0]['lease']['ref']

    #

    response = client.delete('/leasing/v1/lessor/leases', headers={'authorization': __bearer_token(ORIGIN_REF)})
    assert response.status_code == 200