64 Commits

Author (all 64 commits): Oscar Krause

16f80cd78b  added "16.3" support  2024-03-04 21:19:59 +01:00
07aec53787  requirements.txt updated  2024-03-04 21:19:59 +01:00
3e87820f63  removed todo  2024-03-04 21:19:59 +01:00
a927e291b5  fixes  2024-03-04 21:19:59 +01:00
72054d30c4  make tests interruptible  2024-03-04 21:19:59 +01:00
00dc848083  only run test matrix when "app" or "test" changes  2024-03-04 21:19:59 +01:00
78b6fe52c7  fixed CI/CD path from "/builds" to "/tmp/builds"  2024-03-04 21:19:59 +01:00
f82d73bb01  run different jobs on "$CI_DEFAULT_BRANCH"  2024-03-04 21:19:59 +01:00
416df311b8  removed pylint  2024-03-04 21:19:59 +01:00
a6ea8241c2  disabled pylint  2024-03-04 21:19:59 +01:00
e70f70d806  disabled code_quality debug  2024-03-04 21:19:59 +01:00
77be5772c4  Update .codeclimate.yml  2024-03-04 21:19:59 +01:00
6c1b05c66a  fixed test_coverage (fail on matrix)  2024-03-04 21:19:59 +01:00
a54411a957  added code_quality debug  2024-03-04 21:19:59 +01:00
90e0cb8e84  added code_quality "SOURCE_CODE" variable  2024-03-04 21:19:59 +01:00
eecb59e2e4  removed "cython" from "test"  2024-03-04 21:19:59 +01:00
4c0f65faec  removed tests for "23.04"  2024-03-04 21:19:59 +01:00
            > gcc -Wsign-compare -DNDEBUG -g -fwrapv -O3 -Wall -fPIC -I/tmp/pip-install-sazb8fvo/httptools_694f06fa2e354ed9ba9f5c167df7fce4/vendor/llhttp/include -I/tmp/pip-install-sazb8fvo/httptools_694f06fa2e354ed9ba9f5c167df7fce4/vendor/llhttp/src -I/usr/local/include/python3.11 -c httptools/parser/parser.c -o build/temp.linux-x86_64-cpython-311/httptools/parser/parser.o -O2
            httptools/parser/parser.c:212:12: fatal error: longintrepr.h: No such file or directory
e41084f5c5  added tests for Ubuntu "Mantic Minotaur"  2024-03-04 21:19:59 +01:00
36d5b83fb8  requirements.txt updated  2024-03-04 21:19:59 +01:00
11138c2191  updated debian bookworm 12 dependencies  2024-03-04 21:19:59 +01:00
ff9e85e32b  updated test to debian bookworm  2024-03-04 21:19:59 +01:00
cb6a089678  fixed testing dependency  2024-03-04 21:19:59 +01:00
085186f82a  added gcc as dependency  2024-03-04 21:19:59 +01:00
f77d3feee1  fixes  2024-03-04 21:19:59 +01:00
f2721c7663  fixed debian package versions  2024-03-04 21:19:59 +01:00
40cb5518cb  fixed versions & added 16.2 as supported  2024-03-04 21:19:59 +01:00
021c0ac38d  added os specific requirements.txt  2024-03-04 21:19:59 +01:00
9c22628b4e  implemented python test matrix for different python dependencies on different os releases  2024-03-04 21:19:59 +01:00
966b421dad  README.md updated  2024-03-04 21:19:59 +01:00
7f8752a93d  updated ubuntu from 22.10 (EOL) to 23.04  2024-03-04 21:19:59 +01:00
30979fd18e  requirements.txt updated  2024-03-04 21:19:59 +01:00
72965cc879  added 16.1 as supported nvidia driver release  2024-03-04 21:19:59 +01:00
1887cbc534  added macOS as supported host (using python-venv)  2024-03-04 21:19:59 +01:00
2e942f4553  added Docker supported system architectures  2024-03-04 21:19:59 +01:00
3dda920a52  added linkt to driver compatibility section  2024-03-04 21:19:59 +01:00
765a994d83  requirements.txt updated  2024-03-04 21:19:59 +01:00
23488f94d4  added support for 16.0 drivers to readme  2024-03-04 21:19:59 +01:00
f9341cdab4  fixed docker image name (gitlab registry)  2024-03-04 21:19:59 +01:00
cad81ad1d6  fixed deploy docker  2024-03-04 21:19:59 +01:00
b07b7da2f3  fixed new docker registry image path  2024-03-04 21:19:59 +01:00
1ef7dd82f6  toggle api endpoints  2024-03-04 21:19:25 +01:00
5a1b1a5950  typos  2024-03-04 21:19:25 +01:00
83f4b42f01  added information about ipv6 may be must disabled  2024-03-04 21:19:25 +01:00
a3baaab26f  removed mysql from included docker drivers  2024-03-04 21:19:25 +01:00
aa4ebfce73  added docker command to logging section  2024-03-04 21:19:25 +01:00
            thanks to @libreshare (https://gitea.publichub.eu/oscar.krause/fastapi-dls/issues/2)
aa746feb13  improvements  2024-03-04 21:19:25 +01:00
            thanks to @AbsolutelyFree (https://gitea.publichub.eu/oscar.krause/fastapi-dls/issues/1)
fce0eb6d74  fixed "deploy:pacman"  2024-03-04 21:19:25 +01:00
32806e5cca  push multiarch image to docker-hub  2024-03-04 21:19:25 +01:00
50eddeecfc  fixed mariadb-client installation  2024-03-04 21:19:25 +01:00
            ref. https://github.com/PyMySQL/mysqlclient/discussions/624
092e6186ab  added missing "pkg-config" for "mysqlclient==2.2.0"  2024-03-04 21:19:25 +01:00
            ref. https://stackoverflow.com/questions/76533384/docker-alpine-build-fails-on-mysqlclient-installation-with-error-exception-can
acbe889fd9  fixed versions  2024-03-04 21:19:25 +01:00
05cad95c2a  refactored docker-compose.yml so very simple example, and moved proxy to "examples" directory  2024-03-04 21:19:25 +01:00
c9e36759e3  added 15.3 to supported drivers list  2024-03-04 21:19:25 +01:00
d116eec626  updated compatibility list  2024-03-04 21:19:25 +01:00
b1620154db  docker-compose.yml - added note for TZ  2024-03-04 21:19:25 +01:00
4181095791  requirements.txt updated  2024-03-04 21:19:25 +01:00
248c70a862  Merge branch 'dev' into db  2023-06-12 15:19:28 +02:00
39a2408d8d  migrated api to database-config (Site, Instance)  2023-06-12 15:19:06 +02:00
18807401e4  orm improvements & fixes  2023-06-12 15:14:12 +02:00
5e47ad7729  fastapi openapi url  2023-06-12 15:13:53 +02:00
20448bc587  improved relationships  2023-06-12 15:13:29 +02:00
5e945bc43a  code styling  2023-06-12 14:47:32 +02:00
b4150fa527  implemented Site and Instance orm models including initialization  2023-06-12 12:42:13 +02:00
38e1a1725c  code styling  2023-06-12 12:40:10 +02:00
29 changed files with 927 additions and 2123 deletions

@@ -2,7 +2,7 @@ Package: fastapi-dls
 Version: 0.0
 Architecture: all
 Maintainer: Oscar Krause oscar.krause@collinwebdesigns.de
-Depends: python3, python3-fastapi, python3-uvicorn, python3-dotenv, python3-dateutil, python3-jwt, python3-sqlalchemy, python3-cryptography, python3-markdown, uvicorn, openssl
+Depends: python3, python3-fastapi, python3-uvicorn, python3-dotenv, python3-dateutil, python3-jose, python3-sqlalchemy, python3-pycryptodome, python3-markdown, uvicorn, openssl
 Recommends: curl
 Installed-Size: 10240
 Homepage: https://git.collinwebdesigns.de/oscar.krause/fastapi-dls

@@ -1,9 +1,6 @@
 # Toggle debug mode
 #DEBUG=false
 
-# Cert Path
-CERT_PATH="/etc/fastapi-dls/cert"
-
 # Where the client can find the DLS server
 DLS_URL=127.0.0.1
 DLS_PORT=443
@@ -24,3 +21,7 @@ DATABASE=sqlite:////etc/fastapi-dls/db.sqlite
 #SITE_KEY_XID="00000000-0000-0000-0000-000000000000"
 #INSTANCE_REF="10000000-0000-0000-0000-000000000001"
 #ALLOTMENT_REF="20000000-0000-0000-0000-000000000001"
+
+# Site-wide signing keys
+INSTANCE_KEY_RSA=/etc/fastapi-dls/instance.private.pem
+INSTANCE_KEY_PUB=/etc/fastapi-dls/instance.public.pem

@@ -3,7 +3,13 @@
 WORKING_DIR=/usr/share/fastapi-dls
 CONFIG_DIR=/etc/fastapi-dls
 
-source $CONFIG_DIR/env
+if [ ! -f $CONFIG_DIR/instance.private.pem ]; then
+  echo "> Create dls-instance keypair ..."
+  openssl genrsa -out $CONFIG_DIR/instance.private.pem 2048
+  openssl rsa -in $CONFIG_DIR/instance.private.pem -outform PEM -pubout -out $CONFIG_DIR/instance.public.pem
+else
+  echo "> Create dls-instance keypair skipped! (exists)"
+fi
 
 while true; do
   [ -f $CONFIG_DIR/webserver.key ] && default_answer="N" || default_answer="Y"
@@ -27,32 +33,27 @@ if [ -f $CONFIG_DIR/webserver.key ]; then
   if [ -x "$(command -v curl)" ]; then
     echo "> Testing API ..."
+    source $CONFIG_DIR/env
     curl --insecure -X GET https://$DLS_URL:$DLS_PORT/-/health
   else
     echo "> Testing API failed, curl not available. Please test manually!"
   fi
 fi
 
-echo "> Create Certificate-Chain folder ..."
-mkdir -p $CERT_PATH
-
-echo "> Set permissions ..."
 chown -R www-data:www-data $CONFIG_DIR
 chown -R www-data:www-data $WORKING_DIR
-
-echo "> Done."
 
 cat <<EOF
 # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # #
 #                                                                              #
 # fastapi-dls is now installed.                                                #
 #                                                                              #
-# Service should be up and running (if you choose to auto-generate             #
-# self-signed webserver certificate).                                          #
+# Service should be up and running.                                            #
+# Webservice is listen to https://localhost                                    #
 #                                                                              #
-# - Webservice is listen to https://localhost                                  #
-# - Configuration is stored in /etc/fastapi-dls/env                            #
+# Configuration is stored in /etc/fastapi-dls/env.                             #
 #                                                                              #
 # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # #
EOF
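The keypair handling added to the install script above can be exercised end to end. A minimal sketch (throwaway filenames in the current directory, not the package's real install paths): generate the pair the same way the script does, then sign and verify a sample payload to confirm the two keys actually belong together.

```shell
# Generate an instance keypair exactly as the install script does.
openssl genrsa -out instance.private.pem 2048
openssl rsa -in instance.private.pem -outform PEM -pubout -out instance.public.pem

# Sign a sample payload with the private key, then verify with the public key.
echo -n 'sample-payload' > payload.bin
openssl dgst -sha256 -sign instance.private.pem -out payload.sig payload.bin
openssl dgst -sha256 -verify instance.public.pem -signature payload.sig payload.bin
```

The final command prints `Verified OK` and exits 0 when the public key matches the private key used for signing.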

@@ -1,8 +1,8 @@
 # https://packages.debian.org/hu/
 fastapi==0.92.0
 uvicorn[standard]==0.17.6
-pyjwt==2.10.1
-cryptography==38.0.4
+python-jose[pycryptodome]==3.3.0
+pycryptodome==3.11.0
 python-dateutil==2.8.2
 sqlalchemy==1.4.46
 markdown==3.4.1

@@ -1,11 +0,0 @@
-# https://packages.debian.org/hu/
-fastapi==0.115.11
-uvicorn[standard]==0.32.0
-pyjwt==2.10.1
-cryptography==43.0.0
-python-dateutil==2.9.0
-sqlalchemy==2.0.40
-markdown==3.7
-python-dotenv==1.0.1
-jinja2==3.1.6
-httpx==0.28.1

@@ -0,0 +1,10 @@
+# https://packages.ubuntu.com
+fastapi==0.91.0
+uvicorn[standard]==0.15.0
+python-jose[pycryptodome]==3.3.0
+pycryptodome==3.11.0
+python-dateutil==2.8.2
+sqlalchemy==1.4.46
+markdown==3.4.3
+python-dotenv==0.21.0
+jinja2==3.1.2

@@ -0,0 +1,10 @@
+# https://packages.ubuntu.com
+fastapi==0.101.0
+uvicorn[standard]==0.23.2
+python-jose[pycryptodome]==3.3.0
+pycryptodome==3.11.0
+python-dateutil==2.8.2
+sqlalchemy==1.4.47
+markdown==3.4.4
+python-dotenv==1.0.0
+jinja2==3.1.2

@@ -1,10 +0,0 @@
-# https://packages.ubuntu.com
-fastapi==0.101.0
-uvicorn[standard]==0.27.1
-pyjwt==2.10.1
-cryptography==41.0.7
-python-dateutil==2.8.2
-sqlalchemy==1.4.50
-markdown==3.5.2
-python-dotenv==1.0.1
-jinja2==3.1.2

@@ -1,10 +0,0 @@
-# https://packages.ubuntu.com
-fastapi==0.110.3
-uvicorn[standard]==0.30.3
-pyjwt==2.10.1
-cryptography==42.0.5
-python-dateutil==2.9.0
-sqlalchemy==2.0.32
-markdown==3.6
-python-dotenv==1.0.1
-jinja2==3.1.3

@@ -8,7 +8,7 @@ pkgdesc='NVIDIA DLS server implementation with FastAPI'
 arch=('any')
 url='https://git.collinwebdesigns.de/oscar.krause/fastapi-dls'
 license=('MIT')
-depends=('python' 'python-pyjwt' 'python-starlette' 'python-httpx' 'python-fastapi' 'python-dotenv' 'python-dateutil' 'python-sqlalchemy' 'python-cryptography' 'uvicorn' 'python-markdown' 'openssl')
+depends=('python' 'python-jose' 'python-starlette' 'python-httpx' 'python-fastapi' 'python-dotenv' 'python-dateutil' 'python-sqlalchemy' 'python-pycryptodome' 'uvicorn' 'python-markdown' 'openssl')
 provider=("$pkgname")
 install="$pkgname.install"
 backup=('etc/default/fastapi-dls')
@@ -17,7 +17,7 @@ source=("git+file://${CI_PROJECT_DIR}"
         "$pkgname.service"
         "$pkgname.tmpfiles")
 sha256sums=('SKIP'
-            'a4776a0ae4671751065bf3e98aa707030b8b5ffe42dde942c51050dab5028c54'
+            'fbd015449a30c0ae82733289a56eb98151dcfab66c91b37fe8e202e39f7a5edb'
             '2719338541104c537453a65261c012dda58e1dbee99154cf4f33b526ee6ca22e'
             '3dc60140c08122a8ec0e7fa7f0937eb8c1288058890ba09478420fc30ce9e30c')
@@ -30,6 +30,8 @@ pkgver() {
 check() {
   cd "$srcdir/$pkgname/test"
   mkdir "$srcdir/$pkgname/app/cert"
+  openssl genrsa -out "$srcdir/$pkgname/app/cert/instance.private.pem" 2048
+  openssl rsa -in "$srcdir/$pkgname/app/cert/instance.private.pem" -outform PEM -pubout -out "$srcdir/$pkgname/app/cert/instance.public.pem"
   python "$srcdir/$pkgname/test/main.py"
   rm -rf "$srcdir/$pkgname/app/cert"
 }
@@ -37,7 +39,7 @@ check() {
 package() {
   install -d "$pkgdir/usr/share/doc/$pkgname"
   install -d "$pkgdir/var/lib/$pkgname/cert"
-  #cp -r "$srcdir/$pkgname/doc"/* "$pkgdir/usr/share/doc/$pkgname/"
+  cp -r "$srcdir/$pkgname/doc"/* "$pkgdir/usr/share/doc/$pkgname/"
   install -Dm644 "$srcdir/$pkgname/README.md" "$pkgdir/usr/share/doc/$pkgname/README.md"
   install -Dm644 "$srcdir/$pkgname/version.env" "$pkgdir/usr/share/doc/$pkgname/version.env"

@@ -19,6 +19,10 @@ DATABASE="sqlite:////var/lib/fastapi-dls/db.sqlite"
 SITE_KEY_XID="<<sitekey>>"
 INSTANCE_REF="<<instanceref>>"
 
+# Site-wide signing keys
+INSTANCE_KEY_RSA="/var/lib/fastapi-dls/instance.private.pem"
+INSTANCE_KEY_PUB="/var/lib/fastapi-dls/instance.public.pem"
+
 # TLS certificate
 INSTANCE_SSL_CERT="/var/lib/fastapi-dls/cert/webserver.crt"
 INSTANCE_SSL_KEY="/var/lib/fastapi-dls/cert/webserver.key"

@@ -7,4 +7,8 @@ post_install() {
   echo
   echo 'A valid HTTPS certificate needs to be installed to /var/lib/fastapi-dls/cert/webserver.{crt,key}'
   echo 'A self-signed certificate can be generated with: openssl req -x509 -nodes -days 3650 -newkey rsa:2048 -keyout /var/lib/fastapi-dls/cert/webserver.key -out /var/lib/fastapi-dls/cert/webserver.crt'
+  echo
+  echo 'The signing keys for your instance need to be generated as well. Generate them with these commands:'
+  echo 'openssl genrsa -out /var/lib/fastapi-dls/instance.private.pem 2048'
+  echo 'openssl rsa -in /var/lib/fastapi-dls/instance.private.pem -outform PEM -pubout -out /var/lib/fastapi-dls/instance.public.pem'
 }

@@ -18,6 +18,9 @@ Make sure you create these certificates before starting the container for the first time.
 WORKING_DIR=/mnt/user/appdata/fastapi-dls/cert
 mkdir -p $WORKING_DIR
 cd $WORKING_DIR
+# create instance private and public key for singing JWT's
+openssl genrsa -out $WORKING_DIR/instance.private.pem 2048
+openssl rsa -in $WORKING_DIR/instance.private.pem -outform PEM -pubout -out $WORKING_DIR/instance.public.pem
 # create ssl certificate for integrated webserver (uvicorn) - because clients rely on ssl
 openssl req -x509 -nodes -days 3650 -newkey rsa:2048 -keyout $WORKING_DIR/webserver.key -out $WORKING_DIR/webserver.crt
 ```
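Since the integrated webserver only serves TLS correctly when `webserver.key` and `webserver.crt` belong together, a quick sanity check before the first container start can save a debugging round. A minimal sketch, assuming the filenames from the snippet above and that you run it inside the same working directory: compare the public-key digests derived from both files.

```shell
# The two digests must be identical; a mismatch means the certificate
# was not issued for this private key.
openssl x509 -in webserver.crt -noout -pubkey | openssl sha256
openssl rsa  -in webserver.key -pubout | openssl sha256
```

Both commands print a hash of the same public key material, so the lines should match character for character.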

.gitignore (vendored, 2 changes)

@@ -1,6 +1,6 @@
 .DS_Store
 venv/
 .idea/
-*.sqlite
+app/*.sqlite*
 app/cert/*.*
 .pytest_cache

@@ -16,12 +16,11 @@ build:docker:
   interruptible: true
   stage: build
   rules:
-    # deployment is in "deploy:docker:"
-    - if: $CI_PIPELINE_SOURCE == 'merge_request_event'
+    - if: $CI_COMMIT_BRANCH && $CI_COMMIT_BRANCH != $CI_DEFAULT_BRANCH
       changes:
         - app/**/*
         - Dockerfile
-        - requirements.txt
+    - if: $CI_PIPELINE_SOURCE == 'merge_request_event'
   tags: [ docker ]
   before_script:
     - docker buildx inspect
@@ -44,13 +43,16 @@ build:apt:
     - if: $CI_COMMIT_TAG
       variables:
         VERSION: $CI_COMMIT_REF_NAME
-    - if: ($CI_PIPELINE_SOURCE == 'merge_request_event') || ($CI_COMMIT_BRANCH && $CI_COMMIT_BRANCH == $CI_DEFAULT_BRANCH)
+    - if: $CI_COMMIT_BRANCH && $CI_COMMIT_BRANCH != $CI_DEFAULT_BRANCH
       changes:
         - app/**/*
         - .DEBIAN/**/*
         - .gitlab-ci.yml
       variables:
         VERSION: "0.0.1"
+    - if: $CI_PIPELINE_SOURCE == 'merge_request_event'
+      variables:
+        VERSION: "0.0.1"
   before_script:
     - echo -e "VERSION=$VERSION\nCOMMIT=$CI_COMMIT_SHA" > version.env
     # install build dependencies
@@ -91,13 +93,16 @@ build:pacman:
     - if: $CI_COMMIT_TAG
       variables:
         VERSION: $CI_COMMIT_REF_NAME
-    - if: ($CI_PIPELINE_SOURCE == 'merge_request_event') || ($CI_COMMIT_BRANCH && $CI_COMMIT_BRANCH == $CI_DEFAULT_BRANCH)
+    - if: $CI_COMMIT_BRANCH && $CI_COMMIT_BRANCH != $CI_DEFAULT_BRANCH
       changes:
         - app/**/*
         - .PKGBUILD/**/*
         - .gitlab-ci.yml
       variables:
         VERSION: "0.0.1"
+    - if: $CI_PIPELINE_SOURCE == 'merge_request_event'
+      variables:
+        VERSION: "0.0.1"
   before_script:
     #- echo -e "VERSION=$VERSION\nCOMMIT=$CI_COMMIT_SHA" > version.env
     # install build dependencies
@@ -120,12 +125,13 @@ build:pacman:
   artifacts:
     paths:
       - "*.pkg.tar.zst"
 
-test:python:
-  image: $IMAGE
+test:
+  image: python:3.11-slim-bookworm
   stage: test
   interruptible: true
   rules:
     - if: $CI_COMMIT_BRANCH == $CI_DEFAULT_BRANCH
+    - if: $CI_COMMIT_TAG
     - if: $CI_PIPELINE_SOURCE == "merge_request_event"
     - if: $CI_COMMIT_BRANCH && $CI_COMMIT_BRANCH != $CI_DEFAULT_BRANCH
       changes:
@@ -135,49 +141,33 @@ test:python:
   variables:
     DATABASE: sqlite:///../app/db.sqlite
   parallel:
     matrix:
-      - IMAGE:
-          # https://devguide.python.org/versions/#supported-versions
-          # - python:3.14-rc-alpine # EOL 2030-10 => uvicorn does not support 3.14 yet
-          - python:3.13-alpine # EOL 2029-10
-          - python:3.12-alpine # EOL 2028-10
-          - python:3.11-alpine # EOL 2027-10
-          # - python:3.10-alpine # EOL 2026-10 => ImportError: cannot import name 'UTC' from 'datetime'
-          # - python:3.9-alpine # EOL 2025-10 => ImportError: cannot import name 'UTC' from 'datetime'
+      - REQUIREMENTS:
+          - requirements.txt
+          - .DEBIAN/requirements-bookworm-12.txt
+          - .DEBIAN/requirements-ubuntu-23.10.txt
   before_script:
-    - apk --no-cache add openssl
-    - python3 -m venv venv
-    - source venv/bin/activate
-    - pip install --upgrade pip
-    - pip install -r requirements.txt
-    - pip install pytest pytest-cov pytest-custom_exit_code httpx
+    - apt-get update && apt-get install -y python3-dev gcc
+    - pip install -r $REQUIREMENTS
+    - pip install pytest httpx
     - mkdir -p app/cert
+    - openssl genrsa -out app/cert/instance.private.pem 2048
+    - openssl rsa -in app/cert/instance.private.pem -outform PEM -pubout -out app/cert/instance.public.pem
     - cd test
   script:
     - python -m pytest main.py --junitxml=report.xml
   artifacts:
     reports:
-      dotenv: version.env
       junit: ['**/report.xml']
 
-test:apt:
-  image: $IMAGE
+.test:linux:
   stage: test
   rules:
-    - if: ($CI_PIPELINE_SOURCE == 'merge_request_event') || ($CI_COMMIT_BRANCH && $CI_COMMIT_BRANCH == $CI_DEFAULT_BRANCH)
+    - if: $CI_COMMIT_BRANCH && $CI_COMMIT_BRANCH != $CI_DEFAULT_BRANCH
       changes:
         - app/**/*
         - .DEBIAN/**/*
-        - .gitlab-ci.yml
-  parallel:
-    matrix:
-      - IMAGE:
-          - debian:trixie-slim # EOL: t.b.a.; "python3-jose" not available, but "python3-josepy" or "python3-jwt"
-          - debian:bookworm-slim # EOL: June 06, 2026
-          - debian:bullseye-slim # EOL: June 06, 2026
-          - ubuntu:24.04 # EOL: April 2036 (Noble Numbat)
-          - ubuntu:25.04 # EOL: January 2026 (Plucky Puffin); "python3-jose" not available, but "python3-josepy" or "python3-jwt"
-          - ubuntu:25.10 # EOL: July 2026 (Questing Quokka);
+    - if: $CI_PIPELINE_SOURCE == 'merge_request_event'
+      variables:
+        VERSION: "0.0.1"
   needs:
     - job: build:apt
       artifacts: true
@@ -209,14 +199,22 @@
     - apt-get purge -qq -y fastapi-dls
     - apt-get autoremove -qq -y && apt-get clean -qq
 
-test:pacman:archlinux:
+test:debian:
+  extends: .test:linux
+  image: debian:bookworm-slim
+
+test:ubuntu:
+  extends: .test:linux
+  image: ubuntu:23.10
+
+test:archlinux:
   image: archlinux:base
   rules:
-    - if: ($CI_PIPELINE_SOURCE == 'merge_request_event') || ($CI_COMMIT_BRANCH && $CI_COMMIT_BRANCH == $CI_DEFAULT_BRANCH)
+    - if: $CI_COMMIT_BRANCH && $CI_COMMIT_BRANCH != $CI_DEFAULT_BRANCH
       changes:
         - app/**/*
         - .PKGBUILD/**/*
-        - .gitlab-ci.yml
+    - if: $CI_PIPELINE_SOURCE == 'merge_request_event'
   needs:
     - job: build:pacman
      artifacts: true
@@ -250,7 +248,7 @@ semgrep-sast:
 test_coverage:
   # extends: test
-  image: python:3.13-slim-trixie
+  image: python:3.11-slim-bookworm
   allow_failure: true
   stage: test
   rules:
@@ -260,17 +258,19 @@ test_coverage:
   before_script:
     - apt-get update && apt-get install -y python3-dev gcc
     - pip install -r requirements.txt
-    - pip install pytest pytest-cov pytest-custom_exit_code httpx
+    - pip install pytest httpx
     - mkdir -p app/cert
+    - openssl genrsa -out app/cert/instance.private.pem 2048
+    - openssl rsa -in app/cert/instance.private.pem -outform PEM -pubout -out app/cert/instance.public.pem
     - cd test
   script:
-    - coverage run -m pytest main.py --junitxml=report.xml --suppress-no-test-exit-code
+    - pip install pytest pytest-cov
+    - coverage run -m pytest main.py
     - coverage report
     - coverage xml
   coverage: '/(?i)total.*? (100(?:\.0+)?\%|[1-9]?\d(?:\.\d+)?\%)$/'
   artifacts:
     reports:
-      junit: [ '**/report.xml' ]
       coverage_report:
         coverage_format: cobertura
         path: '**/coverage.xml'
@@ -289,12 +289,15 @@ gemnasium-python-dependency_scanning:
     - if: $CI_PIPELINE_SOURCE == "merge_request_event"
     - if: $CI_COMMIT_BRANCH == $CI_DEFAULT_BRANCH
 
+.deploy:
+  rules:
+    - if: $CI_COMMIT_TAG
+
 deploy:docker:
+  extends: .deploy
   image: docker:dind
   stage: deploy
   tags: [ docker ]
-  rules:
-    - if: $CI_COMMIT_TAG
   before_script:
     - echo "Building docker image for commit $CI_COMMIT_SHA with version $CI_COMMIT_REF_NAME"
     - docker buildx inspect
@@ -313,10 +316,9 @@ deploy:docker:
 deploy:apt:
   # doc: https://git.collinwebdesigns.de/help/user/packages/debian_repository/index.md#install-a-package
+  extends: .deploy
   image: debian:bookworm-slim
   stage: deploy
-  rules:
-    - if: $CI_COMMIT_TAG
   needs:
     - job: build:apt
       artifacts: true
@@ -353,10 +355,9 @@ deploy:apt:
     - 'curl --header "JOB-TOKEN: $CI_JOB_TOKEN" --upload-file ${EXPORT_NAME} "${CI_API_V4_URL}/projects/${CI_PROJECT_ID}/packages/generic/${PACKAGE_NAME}/${PACKAGE_VERSION}/${EXPORT_NAME}"'
 
 deploy:pacman:
+  extends: .deploy
   image: archlinux:base-devel
   stage: deploy
-  rules:
-    - if: $CI_COMMIT_TAG
   needs:
     - job: build:pacman
       artifacts: true
@@ -377,7 +378,7 @@ deploy:pacman:
 release:
   image: registry.gitlab.com/gitlab-org/release-cli:latest
   stage: .post
-  needs: [ deploy:docker, deploy:apt, deploy:pacman ]
+  needs: [ test ]
   rules:
     - if: $CI_COMMIT_TAG
   script:
@@ -392,4 +393,4 @@ release:
     - name: 'Package Registry'
       url: 'https://git.collinwebdesigns.de/oscar.krause/fastapi-dls/-/packages'
     - name: 'Container Registry'
-      url: 'https://git.collinwebdesigns.de/oscar.krause/fastapi-dls/container_registry/70'
+      url: 'https://git.collinwebdesigns.de/oscar.krause/fastapi-dls/container_registry/40'

@@ -1,4 +1,4 @@
-FROM python:3.13-alpine
+FROM python:3.11-alpine
 
 ARG VERSION
 ARG COMMIT=""
@@ -10,7 +10,7 @@ RUN apk update \
     && apk add --no-cache --virtual build-deps gcc g++ python3-dev musl-dev pkgconfig \
     && apk add --no-cache curl postgresql postgresql-dev mariadb-dev sqlite-dev \
     && pip install --no-cache-dir --upgrade uvicorn \
-    && pip install --no-cache-dir psycopg2==2.9.10 mysqlclient==2.2.7 pysqlite3==0.5.4 \
+    && pip install --no-cache-dir psycopg2==2.9.6 mysqlclient==2.2.0 pysqlite3==0.5.1 \
     && pip install --no-cache-dir -r /tmp/requirements.txt \
     && apk del build-deps
FAQ.md (new file, 17 lines added)

@@ -0,0 +1,17 @@
+# FAQ
+
+## `Failed to acquire license from <ip> (Info: <license> - Error: The allowed time to process response has expired)`
+
+- Did your timezone settings are correct on fastapi-dls **and your guest**?
+- Did you download the client-token more than an hour ago?
+
+Please download a new client-token. The guest have to register within an hour after client-token was created.
+
+## `jose.exceptions.JWTError: Signature verification failed.`
+
+- Did you recreated `instance.public.pem` / `instance.private.pem`?
+
+Then you have to download a **new** client-token on each of your guests.
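For the signature error above, it helps to first check whether the deployed public key still matches the private key before regenerating client-tokens. A minimal sketch, assuming the key locations from the Debian env file shown earlier (adjust the paths for your install):

```shell
# Re-derive the public key from the private key and compare it byte-for-byte
# with the deployed one; a difference means tokens signed earlier can no
# longer be verified and new client-tokens must be downloaded.
openssl rsa -in /etc/fastapi-dls/instance.private.pem -pubout 2>/dev/null \
  | cmp -s - /etc/fastapi-dls/instance.public.pem \
  && echo "keypair matches" \
  || echo "keypair differs - download new client-tokens on every guest"
```

`cmp -s` exits 0 only on identical input, so the printed message directly reflects whether the pair is consistent.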

330
README.md
View File

@@ -2,39 +2,22 @@
Minimal Delegated License Service (DLS). Minimal Delegated License Service (DLS).
> [!warning] Branch support Compatibility tested with official NLS 2.0.1, 2.1.0, 3.1.0. For Driver compatibility see [here](#setup-client).
> FastAPI-DLS Version 1.x supports up to **`17.x`** releases. \
> FastAPI-DLS Version 2.x is backwards compatible to `17.x` and supports **`18.x`**, **`19.x`**, releases in combination
> with [gridd-unlock-patcher](https://git.collinwebdesigns.de/vgpu/gridd-unlock-patcher).
> Other combinations of FastAPI-DLS and Driver-Branches may work but are not tested.
> [!note] Compatibility
> Compatibility tested with official NLS 2.0.1, 2.1.0, 3.1.0, 3.3.1, 3.4.0. **For Driver compatibility
> see [compatibility matrix](#vgpu-software-compatibility-matrix)**.
This service can be used without internet connection. This service can be used without internet connection.
Only the clients need a connection to this service on configured port. Only the clients need a connection to this service on configured port.
**Official Links** **Official Links**
* https://git.collinwebdesigns.de/oscar.krause/fastapi-dls (Private Git) - https://git.collinwebdesigns.de/oscar.krause/fastapi-dls (Private Git)
* https://hub.docker.com/r/collinwebdesigns/fastapi-dls (Docker-Hub `collinwebdesigns/fastapi-dls:latest`) - https://gitea.publichub.eu/oscar.krause/fastapi-dls (Public Git)
- https://hub.docker.com/r/collinwebdesigns/fastapi-dls (Docker-Hub `collinwebdesigns/fastapi-dls:latest`)
*All other repositories are forks! (which is no bad - just for information and bug reports)* *All other repositories are forks! (which is no bad - just for information and bug reports)*
[Releases & Release Notes](https://git.collinwebdesigns.de/oscar.krause/fastapi-dls/-/releases)
**Further Reading**
* [NVIDIA vGPU Guide](https://gitlab.com/polloloco/vgpu-proxmox) - This document serves as a guide to install NVIDIA vGPU host drivers on the latest Proxmox VE version
* [vgpu_unlock](https://github.com/DualCoder/vgpu_unlock) - Unlock vGPU functionality for consumer-grade Nvidia GPUs.
* [vGPU_Unlock Wiki](https://docs.google.com/document/d/1pzrWJ9h-zANCtyqRgS7Vzla0Y8Ea2-5z2HEi4X75d2Q) - Guide for `vgpu_unlock`
* [Proxmox 8 vGPU in VMs and LXC Containers](https://medium.com/@dionisievldulrincz/proxmox-8-vgpu-in-vms-and-lxc-containers-4146400207a3) - Install *Merged Drivers* for using in Proxmox VMs and LXCs
* [Proxmox All-In-One Installer Script](https://wvthoog.nl/proxmox-vgpu-v3/) - Also known as `proxmox-installer.sh`
--- ---
[TOC] [[_TOC_]]
# Setup (Service) # Setup (Service)
@@ -50,9 +33,6 @@ Tested with Ubuntu 22.10 (EOL!) (from Proxmox templates), actually its consuming
- Make sure your timezone is set correct on you fastapi-dls server and your client - Make sure your timezone is set correct on you fastapi-dls server and your client
This guide does not show how to install vGPU host drivers! Look at the official documentation packed with the driver
releases.
## Docker ## Docker
Docker-Images are available here for Intel (x86), AMD (amd64) and ARM (arm64): Docker-Images are available here for Intel (x86), AMD (amd64) and ARM (arm64):
@@ -68,6 +48,9 @@ The images include database drivers for `postgres`, `mariadb` and `sqlite`.
WORKING_DIR=/opt/docker/fastapi-dls/cert
mkdir -p $WORKING_DIR
cd $WORKING_DIR
# create instance private and public key for signing JWTs
openssl genrsa -out $WORKING_DIR/instance.private.pem 2048
openssl rsa -in $WORKING_DIR/instance.private.pem -outform PEM -pubout -out $WORKING_DIR/instance.public.pem
# create ssl certificate for integrated webserver (uvicorn) - because clients rely on ssl
openssl req -x509 -nodes -days 3650 -newkey rsa:2048 -keyout $WORKING_DIR/webserver.key -out $WORKING_DIR/webserver.crt
```
@@ -85,7 +68,7 @@ docker run -e DLS_URL=`hostname -i` -e DLS_PORT=443 -p 443:443 -v $WORKING_DIR:/
See [`examples`](examples) directory for more advanced examples (with reverse proxy usage).

> Adjust `REQUIRED` variables as needed
```yaml
version: '3.9'
@@ -119,7 +102,7 @@ volumes:
  dls-db:
```
## Debian / Ubuntu / macOS (manual method using `git clone` and python virtual environment)

Tested on `Debian 11 (bullseye)`, `Debian 12 (bookworm)` and `macOS Ventura (13.6)`; Ubuntu may also work.

**Please note that setup on macOS differs from Debian-based systems.**
@@ -152,6 +135,9 @@ chown -R www-data:www-data $WORKING_DIR
WORKING_DIR=/opt/fastapi-dls/app/cert
mkdir -p $WORKING_DIR
cd $WORKING_DIR
# create instance private and public key for signing JWTs
openssl genrsa -out $WORKING_DIR/instance.private.pem 2048
openssl rsa -in $WORKING_DIR/instance.private.pem -outform PEM -pubout -out $WORKING_DIR/instance.public.pem
# create ssl certificate for integrated webserver (uvicorn) - because clients rely on ssl
openssl req -x509 -nodes -days 3650 -newkey rsa:2048 -keyout $WORKING_DIR/webserver.key -out $WORKING_DIR/webserver.crt
chown -R www-data:www-data $WORKING_DIR
@@ -251,6 +237,9 @@ CERT_DIR=${BASE_DIR}/app/cert
SERVICE_USER=dls
mkdir ${CERT_DIR}
cd ${CERT_DIR}
# create instance private and public key for signing JWTs
openssl genrsa -out ${CERT_DIR}/instance.private.pem 2048
openssl rsa -in ${CERT_DIR}/instance.private.pem -outform PEM -pubout -out ${CERT_DIR}/instance.public.pem
# create ssl certificate for integrated webserver (uvicorn) - because clients rely on ssl
openssl req -x509 -nodes -days 3650 -newkey rsa:2048 -keyout ${CERT_DIR}/webserver.key -out ${CERT_DIR}/webserver.crt
chown -R ${SERVICE_USER} ${CERT_DIR}
@@ -320,23 +309,22 @@ EOF
Now you have to run `systemctl daemon-reload`. After that you can start the service
with `systemctl start fastapi-dls.service` and enable autostart with `systemctl enable fastapi-dls.service`.
## Debian / Ubuntu (using `dpkg` / `apt`)

Packages are available here:

- [GitLab-Registry](https://git.collinwebdesigns.de/oscar.krause/fastapi-dls/-/packages)
Successfully tested with (**LTS versions** in bold):

- **Debian 13 (Trixie)** (EOL: June 30, 2028)
- **Debian 12 (Bookworm)** (EOL: June 06, 2026)
- *Ubuntu 22.10 (Kinetic Kudu)* (EOL: July 20, 2023)
- *Ubuntu 23.04 (Lunar Lobster)* (EOL: January 2024)
- *Ubuntu 23.10 (Mantic Minotaur)* (EOL: July 2024)
- **Ubuntu 24.04 (Noble Numbat)** (EOL: Apr 2029)
Not working with:

- Debian 11 (Bullseye) and lower (missing `python-jose` dependency)
- Ubuntu 22.04 (Jammy Jellyfish) (not supported as of 15.01.2023 due to a [fastapi - uvicorn version mismatch](https://bugs.launchpad.net/ubuntu/+source/fastapi/+bug/1970557))
**Run this on your server instance**
@@ -390,13 +378,6 @@ Now you have to edit `/etc/default/fastapi-dls` as needed.
Continue [here](#unraid-guest) for docker guest setup.
## NixOS
Thanks to [@mrzenc](https://github.com/mrzenc) for [fastapi-dls-nixos](https://github.com/mrzenc/fastapi-dls-nixos).
> [!note] Native NixOS-Package
> There is a [pull request](https://github.com/NixOS/nixpkgs/pull/358647) which adds fastapi-dls to nixpkgs.
## Let's Encrypt Certificate (optional)

If you're using installation via docker, you can use `traefik`. Please refer to their documentation.
@@ -413,148 +394,13 @@ acme.sh --issue -d example.com \
After the first success you have to replace `--issue` with `--renew`.
## Nginx Reverse Proxy (experimental)
- This guide is written for Debian/Ubuntu systems; others may work, but you have to do the setup on your own
- Uvicorn no longer serves requests directly
- NGINX is used as HTTP & HTTPS entrypoint
- Assumes you have already set up the webserver certificate and private key
**Install Nginx Webserver**
```shell
apt-get install nginx-light
```
**Remove default vhost**
```shell
rm /etc/nginx/sites-enabled/default
```
**Create fastapi-dls vhost**
<details>
<summary>`/etc/nginx/sites-available/fastapi-dls`</summary>
```
upstream dls-backend {
server 127.0.0.1:8000; # must match dls listen port
}
server {
listen 443 ssl http2 default_server;
listen [::]:443 ssl http2 default_server;
root /var/www/html;
index index.html;
server_name _;
ssl_certificate "/etc/fastapi-dls/cert/webserver.crt";
ssl_certificate_key "/etc/fastapi-dls/cert/webserver.key";
ssl_session_cache shared:SSL:1m;
ssl_session_timeout 10m;
ssl_protocols TLSv1.3 TLSv1.2;
# ssl_ciphers "ECDHE-ECDSA-CHACHA20-POLY1305";
# ssl_ciphers PROFILE=SYSTEM;
ssl_prefer_server_ciphers on;
location / {
# https://www.uvicorn.org/deployment/
proxy_set_header Host $http_host;
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
proxy_set_header X-Forwarded-Proto $scheme;
proxy_set_header Upgrade $http_upgrade;
proxy_set_header Connection $connection_upgrade;
proxy_redirect off;
proxy_buffering off;
proxy_set_header X-Real-IP $remote_addr;
proxy_pass http://dls-backend$request_uri;
}
location = /-/health {
access_log off;
add_header 'Content-Type' 'application/json';
return 200 '{"status":"up","service":"nginx"}';
}
}
map $http_upgrade $connection_upgrade {
default upgrade;
'' close;
}
server {
listen 80;
listen [::]:80;
root /var/www/html;
index index.html;
server_name _;
location /leasing/v1/lessor/shutdown {
proxy_set_header Host $http_host;
proxy_set_header X-Real-IP $remote_addr;
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
proxy_set_header X-Forwarded-Proto $scheme;
proxy_pass http://dls-backend/leasing/v1/lessor/shutdown;
}
location / {
return 301 https://$host$request_uri;
}
}
```
</details>
**Enable and test vhost**
```shell
ln -s /etc/nginx/sites-available/fastapi-dls /etc/nginx/sites-enabled/fastapi-dls
nginx -t
# nginx: the configuration file /etc/nginx/nginx.conf syntax is ok
# nginx: configuration file /etc/nginx/nginx.conf test is successful
```
**Override default fastapi-dls systemd service**
```shell
mkdir /etc/systemd/system/fastapi-dls.service.d
```
<details>
<summary>`/etc/systemd/system/fastapi-dls.service.d/override.conf`</summary>
```
[Service]
ExecStart=
ExecStart=uvicorn main:app \
--env-file /etc/fastapi-dls/env \
--host 127.0.0.1 --port 8000 \
--app-dir /usr/share/fastapi-dls/app \
--proxy-headers
```
</details>
**Run**
```shell
systemctl daemon-reload
service nginx start
service fastapi-dls start
```
# Configuration (Service)
| Variable               | Default                                | Usage                                                                                                  |
|------------------------|----------------------------------------|--------------------------------------------------------------------------------------------------------|
| `DEBUG`                | `false`                                | Toggles `fastapi` debug mode                                                                           |
| `DLS_URL`              | `localhost`                            | Used in client-token to tell guest driver where dls instance is reachable                              |
| `DLS_PORT`             | `443`                                  | Used in client-token to tell guest driver where dls instance is reachable                              |
| `CERT_PATH`            | `None`                                 | Path to a directory where generated certificates are stored. Defaults to `/<app-dir>/cert`.            |
| `TOKEN_EXPIRE_DAYS`    | `1`                                    | Client auth-token validity (used to authenticate the client against the api, **not the `.tok` file!**) |
| `LEASE_EXPIRE_DAYS`    | `90`                                   | Lease time in days                                                                                     |
| `LEASE_RENEWAL_PERIOD` | `0.15`                                 | The percentage of the lease period that must elapse before a licensed client can renew a license \*1   |
@@ -563,19 +409,42 @@ service fastapi-dls start
| `SITE_KEY_XID`         | `00000000-0000-0000-0000-000000000000` | Site identification uuid                                                                               |
| `INSTANCE_REF`         | `10000000-0000-0000-0000-000000000001` | Instance identification uuid                                                                           |
| `ALLOTMENT_REF`        | `20000000-0000-0000-0000-000000000001` | Allotment identification uuid                                                                          |
| `INSTANCE_KEY_RSA`     | `<app-dir>/cert/instance.private.pem`  | Site-wide private RSA key for signing JWTs \*3                                                         |
| `INSTANCE_KEY_PUB`     | `<app-dir>/cert/instance.public.pem`   | Site-wide public key \*3                                                                               |
\*1 For example, if the lease period is one day and the renewal period is 20%, the client attempts to renew its license
every 4.8 hours. If network connectivity is lost, the loss of connectivity is detected during license renewal and the
client has 19.2 hours in which to re-establish connectivity before its license expires.
\*2 Always use `https`, since guest-drivers only support secure connections!
\*4 If you recreate instance keys you need to **recreate client-token for each guest**!
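The renewal arithmetic from footnote \*1 is easy to verify. A minimal sketch (the helper name `renewal_schedule` is made up for illustration; it mirrors the percentage logic described above):

```python
from datetime import timedelta

def renewal_schedule(lease: timedelta, renewal_period: float) -> tuple[timedelta, timedelta]:
    """Return (renewal interval, grace window) for a lease.

    The client renews once `renewal_period` of the lease has elapsed;
    the remainder of the lease is the time it has to re-establish
    connectivity before the license expires.
    """
    renew_after = lease * renewal_period
    grace = lease - renew_after
    return renew_after, grace

# values from the example: one-day lease, 20% renewal period
renew_after, grace = renewal_schedule(timedelta(days=1), 0.2)
print(renew_after)  # 4:48:00  -> 4.8 hours
print(grace)        # 19:12:00 -> 19.2 hours
```

With the default `LEASE_RENEWAL_PERIOD` of `0.15` and a 90-day lease, the same function gives a renewal attempt every 13.5 days.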
# Setup (Client)

**The token file has to be copied! It's not enough to copy & paste the file contents, because there can be special characters.**
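To sidestep the copy-and-paste problem entirely, transfer the token file byte-for-byte. A small sketch (the function name and paths are illustrative; the target path follows the guest-driver convention but adjust it to your setup):

```python
import shutil
from pathlib import Path

def copy_token(src: str, dst: str) -> None:
    """Copy the client-token byte-for-byte instead of pasting its text.

    shutil.copyfile performs a binary copy, so special characters and
    line endings survive unchanged - unlike a terminal copy & paste.
    """
    shutil.copyfile(src, dst)
    # sanity check: both files must be byte-identical
    assert Path(src).read_bytes() == Path(dst).read_bytes()

# usage sketch (paths are examples):
# copy_token('client_configuration_token.tok',
#            '/etc/nvidia/ClientConfigToken/client_configuration_token.tok')
```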
This guide does not show how to install vGPU guest drivers! Look at the official documentation packed with the driver
releases.

Successfully tested with these package versions:
| vGPU Software | Linux vGPU Manager | Linux Driver | Windows Driver | Release Date  |
|---------------|--------------------|--------------|----------------|---------------|
| `16.3` | `535.154.02` | `535.154.05` | `538.15` | January 2024 |
| `16.2` | `535.129.03` | `535.129.03` | `537.70` | October 2023 |
| `16.1` | `535.104.06` | `535.104.05` | `537.13` | August 2023 |
| `16.0` | `535.54.06` | `535.54.03` | `536.22` | July 2023 |
| `15.3` | `525.125.03` | `525.125.06` | `529.11` | June 2023 |
| `15.2` | `525.105.14` | `525.105.17` | `528.89` | March 2023 |
| `15.1` | `525.85.07` | `525.85.05` | `528.24` | January 2023 |
| `15.0` | `525.60.12` | `525.60.13` | `527.41` | December 2022 |
| `14.4` | `510.108.03` | `510.108.03` | `514.08` | December 2022 |
| `14.3` | `510.108.03` | `510.108.03` | `513.91` | November 2022 |
- https://docs.nvidia.com/grid/index.html
*To get the latest drivers, visit Nvidia or search in Discord-Channel `GPU Unlocking` (Server-ID: `829786927829745685`) on channel `licensing`.*

`biggerthanshit` driver archive: https://archive.biggerthanshit.com/NVIDIA/ (login: nvidia / b1gg3rth4nsh1t)
## Linux
@@ -651,37 +520,33 @@ Done. For more information check [troubleshoot section](#troubleshoot).
8. Set schedule to `At First Array Start Only`
9. Click on Apply
# API Endpoints
<details>
  <summary>show</summary>
**`GET /`**

Redirect to `/-/readme`.
**`GET /-/health`**

Status endpoint, used for *healthcheck*.
**`GET /-/config`**

Shows current runtime environment variables and their values.
**`GET /-/config/root-certificate`**

Returns the Root-Certificate which is in use.
This is required for patching `nvidia-gridd` on `18.x` and `19.x` releases.

**`GET /-/readme`**

HTML rendered README.md.
**`GET /-/manage`**

Shows a very basic UI to delete origins or leases.
**`GET /-/origins?leases=false`**

List registered origins.
@@ -689,11 +554,11 @@ List registered origins.
|-----------------|---------|--------------------------------------|
| `leases`        | `false` | Include referenced leases per origin |
**`DELETE /-/origins`**

Deletes all origins and their leases.
**`GET /-/leases?origin=false`**

List current leases.
@@ -701,20 +566,20 @@ List current leases.
|-----------------|---------|-------------------------------------|
| `origin`        | `false` | Include referenced origin per lease |
**`DELETE /-/lease/{lease_ref}`**

Deletes a lease.
**`GET /-/client-token`**

Generate client token (see [installation](#installation)).
**Others**

There are many other internal api endpoints for handling the authentication and lease process.

</details>
# Troubleshoot / Debug

**Please make sure that fastapi-dls and your guests are in the same timezone!**
@@ -734,26 +599,11 @@ Logs are available in `C:\Users\Public\Documents\Nvidia\LoggingLog.NVDisplay.Con
# Known Issues
## Generic

### `Failed to acquire license from <ip> (Info: <license> - Error: The allowed time to process response has expired)`

- Are your timezone settings correct on fastapi-dls **and your guest**?
- Did you download the client-token more than an hour ago?

Please download a new client-token. The guest has to register within an hour after the client-token was created.
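A quick way to check whether a token file is already past that one-hour registration window (a sketch; it uses the file's modification time as a stand-in for the creation time embedded in the token, which is close enough right after download):

```python
from datetime import datetime, timedelta, timezone
from pathlib import Path

def token_is_stale(token_path: str, max_age: timedelta = timedelta(hours=1)) -> bool:
    """Return True if the client-token file is older than max_age.

    Uses the filesystem mtime as an approximation of the token's
    creation time - good enough to decide whether to re-download.
    """
    mtime = datetime.fromtimestamp(Path(token_path).stat().st_mtime, tz=timezone.utc)
    return datetime.now(tz=timezone.utc) - mtime > max_age
```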
### `jose.exceptions.JWTError: Signature verification failed.`
- Did you recreate any certificate or keypair?
Then you have to download a **new** client-token on each of your guests.
## Linux

### Invalid HTTP request

This error message: `uvicorn.error:Invalid HTTP request received.` can be ignored.
- Ref. https://github.com/encode/uvicorn/issues/441
@@ -879,52 +729,12 @@ The error message can safely be ignored (since we have no license limitation :P)
</details>
# vGPU Software Compatibility Matrix
Successfully tested with these package versions:

<details>
  <summary>Show Table</summary>

| FastAPI-DLS Version | vGPU Software | Driver Branch | Linux vGPU Manager | Linux Driver | Windows Driver |  Release Date |      EOL Date |
|---------------------|:-------------:|:-------------:|--------------------|--------------|----------------|--------------:|--------------:|
| `2.x` | `19.0` | **R580** | `580.65.05` | `580.65.06` | `580.88` | August 2025 | July 2028 |
| `2.x` | `18.4` | **R570** | `570.172.07` | `570.172.08` | `573.48` | July 2025 | March 2026 |
| | `18.3` | **R570** | `570.158.02` | `570.158.01` | `573.36` | June 2025 | |
| | `18.2` | **R570** | `570.148.06` | `570.148.08` | `573.07` | May 2025 | |
| | `18.1` | **R570** | `570.133.08` | `570.133.07` | `572.83` | April 2025 | |
| | `18.0` | **R570** | `570.124.03` | `570.124.06` | `572.60` | March 2025 | |
| `1.x` & `2.x` | `17.6` | **R550** | `550.163.02` | `550.63.01` | `553.74` | April 2025 | June 2025 |
| | `17.5` | | `550.144.02` | `550.144.03` | `553.62` | January 2025 | |
| | `17.4` | | `550.127.06` | `550.127.05` | `553.24` | October 2024 | |
| | `17.3` | | `550.90.05` | `550.90.07` | `552.74` | July 2024 | |
| | `17.2` | | `550.90.05` | `550.90.07` | `552.55` | June 2024 | |
| | `17.1` | | `550.54.16` | `550.54.15` | `551.78` | March 2024 | |
| | `17.0` | **R550** | `550.54.10` | `550.54.14` | `551.61` | February 2024 | |
| `1.x` | `16.11` | **R535** | `535.261.04` | `535.261.03` | `539.41` | July 2025 | July 2026 |
| `1.x` | `15.4` | **R525** | `525.147.01` | `525.147.05` | `529.19` | June 2023 | December 2023 |
| `1.x` | `14.4` | **R510** | `510.108.03` | `510.108.03` | `514.08` | December 2022 | February 2023 |
</details>
- https://docs.nvidia.com/grid/index.html
- https://docs.nvidia.com/grid/gpus-supported-by-vgpu.html
*To get the latest drivers, visit Nvidia or search in Discord-Channel `GPU Unlocking` (Server-ID: `829786927829745685`)
on channel `licensing`.*
# Credits

Thanks to the vGPU community and all who use this project and report bugs.

Special thanks to:

- `samicrusader` who created the build file for **ArchLinux**
- `cyrus` who wrote the section for **openSUSE**
- `midi` who wrote the section for **unRAID**
- `polloloco` who wrote the *[NVIDIA vGPU Guide](https://gitlab.com/polloloco/vgpu-proxmox)*
- `DualCoder` who created the `vgpu_unlock` functionality ([vgpu_unlock](https://github.com/DualCoder/vgpu_unlock))
- `Krutav Shah` who wrote the [vGPU_Unlock Wiki](https://docs.google.com/document/d/1pzrWJ9h-zANCtyqRgS7Vzla0Y8Ea2-5z2HEi4X75d2Q/)
- `Wim van 't Hoog` for the [Proxmox All-In-One Installer Script](https://wvthoog.nl/proxmox-vgpu-v3/)
- `mrzenc` who wrote [fastapi-dls-nixos](https://github.com/mrzenc/fastapi-dls-nixos)
- `electricsheep49` who wrote [gridd-unlock-patcher](https://git.collinwebdesigns.de/vgpu/gridd-unlock-patcher)
And thanks to all people who contributed to all these libraries!


@@ -2,17 +2,6 @@
I am planning to implement the following features in the future.
## Patching Endpoint

An (optional) path variable for `gridd-unlock-patcher` which enables an additional endpoint.
Here you can upload your `nvidia-gridd` binary or `nvxdapix.dll`, which is then patched and returned.

## All-In-One Installer Script Endpoint

A new all-in-one installer endpoint: a script is returned for Linux or Windows which can be called like
`curl https://<fastapi-dls>/-/install/deb | sh`. It then downloads and places a client-token in the right directory,
patches your gridd / dll and restarts the nvidia-gridd service.
## HA - High Availability


@@ -1,109 +1,50 @@
import logging
from base64 import b64encode as b64enc
from calendar import timegm
from contextlib import asynccontextmanager
from datetime import datetime, timedelta, UTC
from hashlib import sha256
from json import loads as json_loads, dumps as json_dumps
from os import getenv as env, listdir
from os.path import join, dirname, isfile, isdir, exists
from textwrap import wrap
from uuid import uuid4
from dateutil.relativedelta import relativedelta
from dotenv import load_dotenv
from fastapi import FastAPI
from fastapi.requests import Request
from fastapi.staticfiles import StaticFiles
from fastapi.responses import Response, RedirectResponse, StreamingResponse
import jwt
from sqlalchemy import create_engine
from sqlalchemy.orm import sessionmaker
from starlette.middleware.cors import CORSMiddleware
from orm import Origin, Lease, init as db_init, migrate
from util import CASetup, PrivateKey, Cert, ProductMapping, load_file
# Load variables
load_dotenv('../version.env')
# Get current timezone
TZ = datetime.now().astimezone().tzinfo

# Load basic variables
VERSION, COMMIT, DEBUG = env('VERSION', 'unknown'), env('COMMIT', 'unknown'), bool(env('DEBUG', False))

# Database connection
db = create_engine(str(env('DATABASE', 'sqlite:///db.sqlite')))
db_init(db), migrate(db)
# Load DLS variables (everything prefixed with "INSTANCE_*" is used as "SERVICE_INSTANCE_*" or "SI_*" in the official dls service)
DLS_URL = str(env('DLS_URL', 'localhost'))
DLS_PORT = int(env('DLS_PORT', '443'))
CERT_PATH = str(env('CERT_PATH', None))
SITE_KEY_XID = str(env('SITE_KEY_XID', '00000000-0000-0000-0000-000000000000'))
INSTANCE_REF = str(env('INSTANCE_REF', '10000000-0000-0000-0000-000000000001'))
ALLOTMENT_REF = str(env('ALLOTMENT_REF', '20000000-0000-0000-0000-000000000001'))
TOKEN_EXPIRE_DELTA = relativedelta(days=int(env('TOKEN_EXPIRE_DAYS', 1)), hours=int(env('TOKEN_EXPIRE_HOURS', 0)))
LEASE_EXPIRE_DELTA = relativedelta(days=int(env('LEASE_EXPIRE_DAYS', 90)), hours=int(env('LEASE_EXPIRE_HOURS', 0)))
LEASE_RENEWAL_PERIOD = float(env('LEASE_RENEWAL_PERIOD', 0.15))
LEASE_RENEWAL_DELTA = timedelta(days=int(env('LEASE_EXPIRE_DAYS', 90)), hours=int(env('LEASE_EXPIRE_HOURS', 0)))
CLIENT_TOKEN_EXPIRE_DELTA = relativedelta(years=12)
CORS_ORIGINS = str(env('CORS_ORIGINS', '')).split(',') if (env('CORS_ORIGINS')) else [f'https://{DLS_URL}']
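The `CORS_ORIGINS` one-liner above is easy to misread. Pulled out into a standalone function it behaves like this (a sketch; the function name is made up, the logic mirrors the expression):

```python
import os

def cors_origins(dls_url: str) -> list[str]:
    """Mirror the CORS_ORIGINS one-liner: a comma-separated list from the
    environment, or a single https origin built from DLS_URL as fallback."""
    raw = os.getenv('CORS_ORIGINS')
    return str(raw).split(',') if raw else [f'https://{dls_url}']

os.environ.pop('CORS_ORIGINS', None)
print(cors_origins('localhost'))  # ['https://localhost']
os.environ['CORS_ORIGINS'] = 'https://a.example,https://b.example'
print(cors_origins('localhost'))  # ['https://a.example', 'https://b.example']
```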
DRIVERS_DIR = env('DRIVERS_DIR', None)
DT_FORMAT = '%Y-%m-%dT%H:%M:%S.%fZ'
PRODUCT_MAPPING = ProductMapping(filename=join(dirname(__file__), 'static/product_mapping.json'))
# Create certificate chain and signing keys
ca_setup = CASetup(service_instance_ref=INSTANCE_REF, cert_path=CERT_PATH)
my_root_private_key = PrivateKey.from_file(ca_setup.root_private_key_filename)
my_root_public_key = my_root_private_key.public_key()
my_root_certificate = Cert.from_file(ca_setup.root_certificate_filename)
my_ca_certificate = Cert.from_file(ca_setup.ca_certificate_filename)
my_si_certificate = Cert.from_file(ca_setup.si_certificate_filename)
my_si_private_key = PrivateKey.from_file(ca_setup.si_private_key_filename)
my_si_public_key = my_si_private_key.public_key()
jwt_encode_key = my_si_private_key.pem() # todo: replace directly in code
jwt_decode_key = my_si_private_key.public_key().pem() # todo: replace directly in code
# Logging
LOG_LEVEL = logging.DEBUG if DEBUG else logging.INFO
logging.basicConfig(format='[{levelname:^7}] [{module:^15}] {message}', style='{')
logger = logging.getLogger(__name__)
logger.setLevel(LOG_LEVEL)
logging.getLogger('util').setLevel(LOG_LEVEL)
logging.getLogger('NV').setLevel(LOG_LEVEL)
# FastAPI
@asynccontextmanager
async def lifespan(_: FastAPI):
# on startup
logger.info(f'''
Using timezone: {str(TZ)}. Make sure this is correct and match your clients!
Your clients renew their license every {str(Lease.calculate_renewal(LEASE_RENEWAL_PERIOD, LEASE_RENEWAL_DELTA))}.
If the renewal fails, the license is {str(LEASE_RENEWAL_DELTA)} valid.
Your client-token file (.tok) is valid for {str(CLIENT_TOKEN_EXPIRE_DELTA)}.
''')
logger.info(f'Debug is {"enabled" if DEBUG else "disabled"}.')
yield
# on shutdown
logger.info(f'Shutting down ...')
config = dict(openapi_url=None, docs_url=None, redoc_url=None) # dict(openapi_url='/-/openapi.json', docs_url='/-/docs', redoc_url='/-/redoc')
app = FastAPI(title='FastAPI-DLS', description='Minimal Delegated License Service (DLS).', version=VERSION, lifespan=lifespan, **config)
if DRIVERS_DIR is not None:
app.mount('/-/static-drivers', StaticFiles(directory=str(DRIVERS_DIR), html=False), name='drivers')
# fastapi middleware
app.debug = DEBUG
app.add_middleware(
    CORSMiddleware,
@@ -113,17 +54,29 @@ app.add_middleware(
    allow_headers=['*'],
)
# Helper
def __get_token(request: Request) -> dict:
    authorization_header = request.headers.get('authorization')
    token = authorization_header.split(' ')[1]
    # python-jose (old): jwt.decode(token=token, key=jwt_decode_key, algorithms=ALGORITHMS.RS256, options={'verify_aud': False})
    return jwt.decode(jwt=token, key=jwt_decode_key, algorithms=['RS256'], options={'verify_aud': False})
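The tokens handled here are ordinary JWTs (`header.payload.signature`, each part base64url-encoded). For debugging, the payload can be inspected with the standard library alone (a sketch; this deliberately skips signature verification and must never replace `jwt.decode` in production):

```python
import json
from base64 import urlsafe_b64decode

def peek_jwt_payload(token: str) -> dict:
    """Decode a JWT payload WITHOUT verifying the signature (debug only)."""
    payload = token.split('.')[1]
    payload += '=' * (-len(payload) % 4)  # restore the stripped base64 padding
    return json.loads(urlsafe_b64decode(payload))

# example token with payload {"sub": "origin-1"} (signature is irrelevant here)
token = 'eyJhbGciOiJSUzI1NiJ9.eyJzdWIiOiAib3JpZ2luLTEifQ.sig'
print(peek_jwt_payload(token))  # {'sub': 'origin-1'}
```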
# Endpoints
@app.get('/', summary='Index')
async def index():
@@ -137,41 +90,37 @@ async def _index():
@app.get('/-/health', summary='* Health')
async def _health():
    return Response(content=json_dumps({'status': 'up'}), media_type='application/json', status_code=200)
@app.get('/-/config', summary='* Config', description='returns environment variables.')
async def _config():
    response = {
        'VERSION': str(VERSION),
        'COMMIT': str(COMMIT),
        'DEBUG': str(DEBUG),
        'DLS_URL': str(DLS_URL),
        'DLS_PORT': str(DLS_PORT),
        'SITE_KEY_XID': str(SITE_KEY_XID),
        'INSTANCE_REF': str(INSTANCE_REF),
        'ALLOTMENT_REF': [str(ALLOTMENT_REF)],
        'TOKEN_EXPIRE_DELTA': str(TOKEN_EXPIRE_DELTA),
        'LEASE_EXPIRE_DELTA': str(LEASE_EXPIRE_DELTA),
        'LEASE_RENEWAL_PERIOD': str(LEASE_RENEWAL_PERIOD),
        'CORS_ORIGINS': str(CORS_ORIGINS),
        'TZ': str(TZ),
    }
    return Response(content=json_dumps(response), media_type='application/json', status_code=200)
@app.get('/-/config/root-certificate', summary='* Root Certificate', description='returns Root-Certificate needed for patching nvidia-gridd')
async def _config_root_certificate():
    return Response(content=my_root_certificate.pem().decode('utf-8').strip(), media_type='text/plain')
@app.get('/-/readme', summary='* Readme')
async def _readme():
    from markdown import markdown
    content = load_file(join(dirname(__file__), '../README.md')).decode('utf-8')
    response = markdown(text=content, extensions=['tables', 'fenced_code', 'md_in_html', 'nl2br', 'toc'])
    return Response(response, media_type='text/html', status_code=200)
 @app.get('/-/manage', summary='* Management UI')
@@ -209,26 +158,7 @@ async def _manage(request: Request):
     </body>
     </html>
     '''
-    return Response(response, media_type='text/html', status_code=200)
+    return HTMLr(response)
-
-@app.get('/-/drivers/{directory:path}', summary='* List drivers directory')
-async def _drivers(request: Request, directory: str | None):
-    if DRIVERS_DIR is None:
-        return Response(status_code=404, content=f'Variable "DRIVERS_DIR" not set.')
-    path = join(DRIVERS_DIR, directory)
-    if not exists(path) and not isfile(path):
-        return Response(status_code=404, content=f'Resource "{path}" not found!')
-    content = [{
-        "type": "file" if isfile(f'{path}/{_}') else "folder" if isdir(f'{path}/{_}') else "unknown",
-        "name": _,
-        "link": f'/-/static-drivers/{directory}{_}',
-    } for _ in listdir(path)]
-    return Response(content=json_dumps({"directory": path, "content": content}), media_type='application/json', status_code=200)
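The removed `/-/drivers` endpoint classifies each directory entry with chained conditional expressions. A standalone sketch of that listing logic, using the same field names and `/-/static-drivers/` link prefix as the code above (the sample directory here is a throwaway fixture, not part of the project):

```python
from os import listdir
from os.path import isdir, isfile
from pathlib import Path
from tempfile import TemporaryDirectory


def list_directory(path: str, directory: str = '') -> dict:
    # Mirrors the endpoint: classify each entry as file / folder / unknown.
    content = [{
        "type": "file" if isfile(f'{path}/{name}') else "folder" if isdir(f'{path}/{name}') else "unknown",
        "name": name,
        "link": f'/-/static-drivers/{directory}{name}',
    } for name in listdir(path)]
    return {"directory": path, "content": content}


# Demo against a throwaway directory structure.
with TemporaryDirectory() as tmp:
    Path(tmp, 'driver.run').touch()
    Path(tmp, 'archive').mkdir()
    listing = list_directory(tmp)

print(sorted(entry['type'] for entry in listing['content']))  # ['file', 'folder']
```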
 @app.get('/-/origins', summary='* Origins')
@@ -238,11 +168,10 @@ async def _origins(request: Request, leases: bool = False):
     for origin in session.query(Origin).all():
         x = origin.serialize()
         if leases:
-            serialize = dict(renewal_period=LEASE_RENEWAL_PERIOD, renewal_delta=LEASE_RENEWAL_DELTA)
-            x['leases'] = list(map(lambda _: _.serialize(**serialize), Lease.find_by_origin_ref(db, origin.origin_ref)))
+            x['leases'] = list(map(lambda _: _.serialize(), Lease.find_by_origin_ref(db, origin.origin_ref)))
         response.append(x)
     session.close()
-    return Response(content=json_dumps(response), media_type='application/json', status_code=200)
+    return JSONr(response)
 @app.delete('/-/origins', summary='* Origins')
@@ -256,15 +185,14 @@ async def _leases(request: Request, origin: bool = False):
     session = sessionmaker(bind=db)()
     response = []
     for lease in session.query(Lease).all():
-        serialize = dict(renewal_period=LEASE_RENEWAL_PERIOD, renewal_delta=LEASE_RENEWAL_DELTA)
-        x = lease.serialize(**serialize)
+        x = lease.serialize()
         if origin:
             lease_origin = session.query(Origin).filter(Origin.origin_ref == lease.origin_ref).first()
             if lease_origin is not None:
                 x['origin'] = lease_origin.serialize()
         response.append(x)
     session.close()
-    return Response(content=json_dumps(response), media_type='application/json', status_code=200)
+    return JSONr(response)
 @app.delete('/-/leases/expired', summary='* Leases')
@@ -277,15 +205,20 @@ async def _lease_delete_expired(request: Request):
 async def _lease_delete(request: Request, lease_ref: str):
     if Lease.delete(db, lease_ref) == 1:
         return Response(status_code=201)
-    response = {'status': 404, 'detail': 'lease not found'}
-    return Response(content=json_dumps(response), media_type='application/json', status_code=404)
+    return JSONr(status_code=404, content={'status': 404, 'detail': 'lease not found'})
 # venv/lib/python3.9/site-packages/nls_core_service_instance/service_instance_token_manager.py
 @app.get('/-/client-token', summary='* Client-Token', description='creates a new messenger token for this service instance')
 async def _client_token():
-    cur_time = datetime.now(UTC)
-    exp_time = cur_time + CLIENT_TOKEN_EXPIRE_DELTA
+    cur_time = datetime.utcnow()
+
+    default_instance = Instance.get_default_instance(db)
+    public_key = default_instance.get_public_key()
+    # todo: implement request parameter to support different instances
+    jwt_encode_key = default_instance.get_jwt_encode_key()
+    exp_time = cur_time + default_instance.get_client_token_expire_delta()

     payload = {
         "jti": str(uuid4()),
@@ -294,17 +227,15 @@ async def _client_token():
         "iat": timegm(cur_time.timetuple()),
         "nbf": timegm(cur_time.timetuple()),
         "exp": timegm(exp_time.timetuple()),
-        "protocol_version": "2.0",
         "update_mode": "ABSOLUTE",
         "scope_ref_list": [ALLOTMENT_REF],
         "fulfillment_class_ref_list": [],
         "service_instance_configuration": {
-            "nls_service_instance_ref": INSTANCE_REF,
+            "nls_service_instance_ref": default_instance.instance_ref,
             "svc_port_set_list": [
                 {
                     "idx": 0,
                     "d_name": "DLS",
-                    # todo: {"service": "quick_release", "port": 80} - see "shutdown for windows"
                     "svc_port_map": [{"service": "auth", "port": DLS_PORT}, {"service": "lease", "port": DLS_PORT}]
                 }
             ],
@@ -312,19 +243,17 @@ async def _client_token():
         },
         "service_instance_public_key_configuration": {
             "service_instance_public_key_me": {
-                "mod": my_si_public_key.mod(),
-                "exp": my_si_public_key.exp(),
+                "mod": hex(public_key.public_key().n)[2:],
+                "exp": int(public_key.public_key().e),
             },
-            "service_instance_public_key_pem": my_si_public_key.pem().decode('utf-8').strip(),
+            "service_instance_public_key_pem": public_key.export_key().decode('utf-8'),
             "key_retention_mode": "LATEST_ONLY"
         },
     }

-    # content = jws.sign(payload, key=jwt_encode_key, headers=None, algorithm=ALGORITHMS.RS256)
-    content = jwt.encode(payload=payload, key=jwt_encode_key, headers=None, algorithm='RS256')
+    content = jws.sign(payload, key=jwt_encode_key, headers=None, algorithm=ALGORITHMS.RS256)

-    # response = StreamingResponse(iter([content]), media_type="text/plain")
-    response = StreamingResponse(iter(content), media_type="text/plain")
+    response = StreamingResponse(iter([content]), media_type="text/plain")
     filename = f'client_configuration_token_{datetime.now().strftime("%d-%m-%y-%H-%M-%S")}.tok'
     response.headers["Content-Disposition"] = f'attachment; filename={filename}'
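Both sides wrap the signed token in a `StreamingResponse`, but `iter(content)` and `iter([content])` behave very differently: iterating a string yields it one character per chunk, while iterating a one-element list yields the whole token as a single chunk. A minimal stdlib illustration (the token string is a stand-in, not a real JWT):

```python
token = 'eyJhbGciOi.example.token'  # stand-in for the signed client token

# iter(token) streams one character per chunk:
char_chunks = list(iter(token))

# iter([token]) streams the whole token as one chunk:
single_chunk = list(iter([token]))

print(len(char_chunks), len(single_chunk))  # 24 1
```

Functionally the client receives the same bytes either way, but the single-chunk form avoids emitting one transfer chunk per character.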
@@ -334,10 +263,10 @@ async def _client_token():

 # venv/lib/python3.9/site-packages/nls_services_auth/test/test_origins_controller.py
 @app.post('/auth/v1/origin', description='find or create an origin')
 async def auth_v1_origin(request: Request):
-    j, cur_time = json_loads((await request.body()).decode('utf-8')), datetime.now(UTC)
+    j, cur_time = json_loads((await request.body()).decode('utf-8')), datetime.utcnow()

     origin_ref = j.get('candidate_origin_ref')
-    logger.info(f'> [ origin ]: {origin_ref}: {j}')
+    logging.info(f'> [ origin ]: {origin_ref}: {j}')

     data = Origin(
         origin_ref=origin_ref,
@@ -348,31 +277,26 @@ async def auth_v1_origin(request: Request):

     Origin.create_or_update(db, data)

-    environment = {
-        'raw_env': j.get('environment')
-    }
-    environment.update(j.get('environment'))
-
     response = {
         "origin_ref": origin_ref,
-        "environment": environment,
+        "environment": j.get('environment'),
         "svc_port_set_list": None,
         "node_url_list": None,
         "node_query_order": None,
         "prompts": None,
-        "sync_timestamp": cur_time.strftime(DT_FORMAT)
+        "sync_timestamp": cur_time.isoformat()
     }

-    return Response(content=json_dumps(response, separators=(',', ':')), media_type='application/json', status_code=200)
+    return JSONr(response)
 # venv/lib/python3.9/site-packages/nls_services_auth/test/test_origins_controller.py
 @app.post('/auth/v1/origin/update', description='update an origin evidence')
 async def auth_v1_origin_update(request: Request):
-    j, cur_time = json_loads((await request.body()).decode('utf-8')), datetime.now(UTC)
+    j, cur_time = json_loads((await request.body()).decode('utf-8')), datetime.utcnow()

     origin_ref = j.get('origin_ref')
-    logger.info(f'> [ update ]: {origin_ref}: {j}')
+    logging.info(f'> [ update ]: {origin_ref}: {j}')

     data = Origin(
         origin_ref=origin_ref,
@@ -386,68 +310,70 @@ async def auth_v1_origin_update(request: Request):

     response = {
         "environment": j.get('environment'),
         "prompts": None,
-        "sync_timestamp": cur_time.strftime(DT_FORMAT)
+        "sync_timestamp": cur_time.isoformat()
     }

-    return Response(content=json_dumps(response, separators=(',', ':')), media_type='application/json', status_code=200)
+    return JSONr(response)
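The diff swaps every `sync_timestamp` from `strftime(DT_FORMAT)` to plain `datetime.isoformat()`. `DT_FORMAT` itself is defined outside these hunks; assuming it is the ISO-8601-with-`'Z'` pattern that an inline comment later in the diff implies (`'%Y-%m-%dT%H:%M:%S.%fZ'` is an assumption, not quoted from the project), the two renderings differ in the forced microseconds and the trailing `Z`:

```python
from datetime import datetime

DT_FORMAT = '%Y-%m-%dT%H:%M:%S.%fZ'  # assumed definition; the real constant lives elsewhere in the project

ts = datetime(2024, 3, 4, 21, 19, 59)

print(ts.strftime(DT_FORMAT))  # 2024-03-04T21:19:59.000000Z
print(ts.isoformat())          # 2024-03-04T21:19:59
```

Note that `isoformat()` on a naive datetime omits both the microseconds (when zero) and any timezone designator, so strict clients expecting the `Z` suffix would see a different string after this change.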
 # venv/lib/python3.9/site-packages/nls_services_auth/test/test_auth_controller.py
 # venv/lib/python3.9/site-packages/nls_core_auth/auth.py - CodeResponse
 @app.post('/auth/v1/code', description='get an authorization code')
 async def auth_v1_code(request: Request):
-    j, cur_time = json_loads((await request.body()).decode('utf-8')), datetime.now(UTC)
+    j, cur_time = json_loads((await request.body()).decode('utf-8')), datetime.utcnow()

     origin_ref = j.get('origin_ref')
-    logger.info(f'> [ code ]: {origin_ref}: {j}')
+    logging.info(f'> [ code ]: {origin_ref}: {j}')

     delta = relativedelta(minutes=15)
     expires = cur_time + delta

+    default_site = Site.get_default_site(db)
+    jwt_encode_key = Instance.get_default_instance(db).get_jwt_encode_key()
+
     payload = {
         'iat': timegm(cur_time.timetuple()),
         'exp': timegm(expires.timetuple()),
         'challenge': j.get('code_challenge'),
         'origin_ref': j.get('origin_ref'),
-        'key_ref': SITE_KEY_XID,
-        'kid': SITE_KEY_XID
+        'key_ref': default_site.site_key,
+        'kid': default_site.site_key,
     }

-    # auth_code = jws.sign(payload, key=jwt_encode_key, headers={'kid': payload.get('kid')}, algorithm=ALGORITHMS.RS256)
-    auth_code = jwt.encode(payload=payload, key=jwt_encode_key, headers={'kid': payload.get('kid')}, algorithm='RS256')
+    auth_code = jws.sign(payload, key=jwt_encode_key, headers={'kid': payload.get('kid')}, algorithm=ALGORITHMS.RS256)

     response = {
         "auth_code": auth_code,
-        "prompts": None,
-        "sync_timestamp": cur_time.strftime(DT_FORMAT),
+        "sync_timestamp": cur_time.isoformat(),
+        "prompts": None
     }

-    return Response(content=json_dumps(response, separators=(',', ':')), media_type='application/json', status_code=200)
+    return JSONr(response)
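The `iat`/`exp` claims above are built with `timegm(dt.timetuple())`, which converts a UTC time tuple into Unix epoch seconds, truncating any sub-second precision. A quick check of that pattern with the same 15-minute auth-code window (plain `timedelta` stands in for `relativedelta` here):

```python
from calendar import timegm
from datetime import datetime, timedelta

cur_time = datetime(2024, 3, 4, 21, 19, 59)
expires = cur_time + timedelta(minutes=15)  # same window as the auth-code payload

iat = timegm(cur_time.timetuple())
exp = timegm(expires.timetuple())

print(exp - iat)  # 900
```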
 # venv/lib/python3.9/site-packages/nls_services_auth/test/test_auth_controller.py
 # venv/lib/python3.9/site-packages/nls_core_auth/auth.py - TokenResponse
 @app.post('/auth/v1/token', description='exchange auth code and verifier for token')
 async def auth_v1_token(request: Request):
-    j, cur_time = json_loads((await request.body()).decode('utf-8')), datetime.now(UTC)
+    j, cur_time = json_loads((await request.body()).decode('utf-8')), datetime.utcnow()
+    default_site, default_instance = Site.get_default_site(db), Instance.get_default_instance(db)
+    jwt_encode_key, jwt_decode_key = default_instance.get_jwt_encode_key(), default_instance.get_jwt_decode_key()

     try:
-        # payload = jwt.decode(token=j.get('auth_code'), key=jwt_decode_key, algorithms=ALGORITHMS.RS256)
-        payload = jwt.decode(jwt=j.get('auth_code'), key=jwt_decode_key, algorithms=['RS256'])
-    except jwt.PyJWTError as e:
-        response = {'status': 400, 'title': 'invalid token', 'detail': str(e)}
-        return Response(content=json_dumps(response), media_type='application/json', status_code=400)
+        payload = jwt.decode(token=j.get('auth_code'), key=jwt_decode_key, algorithms=[ALGORITHMS.RS256])
+    except JWTError as e:
+        return JSONr(status_code=400, content={'status': 400, 'title': 'invalid token', 'detail': str(e)})

     origin_ref = payload.get('origin_ref')
-    logger.info(f'> [ auth ]: {origin_ref}: {j}')
+    logging.info(f'> [ auth ]: {origin_ref}: {j}')

     # validate the code challenge
     challenge = b64enc(sha256(j.get('code_verifier').encode('utf-8')).digest()).rstrip(b'=').decode('utf-8')
     if payload.get('challenge') != challenge:
-        response = {'status': 401, 'detail': 'expected challenge did not match verifier'}
-        return Response(content=json_dumps(response), media_type='application/json', status_code=401)
+        return JSONr(status_code=401, content={'status': 401, 'detail': 'expected challenge did not match verifier'})

-    access_expires_on = cur_time + TOKEN_EXPIRE_DELTA
+    access_expires_on = cur_time + default_instance.get_token_expire_delta()

     new_payload = {
         'iat': timegm(cur_time.timetuple()),
@@ -455,298 +381,238 @@ async def auth_v1_token(request: Request):
         'iss': 'https://cls.nvidia.org',
         'aud': 'https://cls.nvidia.org',
         'exp': timegm(access_expires_on.timetuple()),
-        'key_ref': SITE_KEY_XID,
-        'kid': SITE_KEY_XID,
         'origin_ref': origin_ref,
+        'key_ref': default_site.site_key,
+        'kid': default_site.site_key,
     }

-    auth_token = jwt.encode(payload=new_payload, key=jwt_encode_key, headers={'kid': payload.get('kid')}, algorithm='RS256')
+    auth_token = jwt.encode(new_payload, key=jwt_encode_key, headers={'kid': payload.get('kid')}, algorithm=ALGORITHMS.RS256)

     response = {
+        "expires": access_expires_on.isoformat(),
         "auth_token": auth_token,
-        "expires": access_expires_on.strftime(DT_FORMAT),
-        "prompts": None,
-        "sync_timestamp": cur_time.strftime(DT_FORMAT),
+        "sync_timestamp": cur_time.isoformat(),
     }

-    return Response(content=json_dumps(response, separators=(',', ':')), media_type='application/json', status_code=200)
+    return JSONr(response)
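The code-challenge validation above is the PKCE S256 scheme: the SHA-256 digest of the verifier, base64url-encoded with `=` padding stripped, must equal the challenge embedded in the auth code. A self-contained sketch of that check, assuming `b64enc` in the handler is `base64.urlsafe_b64encode` per RFC 7636 (the actual import sits outside these hunks, and the verifier value here is made up):

```python
from base64 import urlsafe_b64encode as b64enc
from hashlib import sha256


def make_challenge(code_verifier: str) -> str:
    # Same expression as the handler: S256 digest, base64url, '=' padding stripped.
    return b64enc(sha256(code_verifier.encode('utf-8')).digest()).rstrip(b'=').decode('utf-8')


verifier = 'example-code-verifier'  # hypothetical client-side secret
challenge = make_challenge(verifier)

print(make_challenge(verifier) == challenge)        # True
print(make_challenge('wrong-verifier') == challenge)  # False
```

Because only the challenge travels inside the (server-signed) auth code, a caller must hold the original verifier to complete the token exchange.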
-
-# NLS 3.4.0 - venv/lib/python3.12/site-packages/nls_services_lease/test/test_lease_single_controller.py
-@app.post('/leasing/v1/config-token', description='request to get config token for lease operations')
-async def leasing_v1_config_token(request: Request):
-    j, cur_time = json_loads((await request.body()).decode('utf-8')), datetime.now(UTC)
-    exp_time = cur_time + CLIENT_TOKEN_EXPIRE_DELTA
-
-    payload = {
-        "iss": "NLS Service Instance",
-        "aud": "NLS Licensed Client",
-        "iat": timegm(cur_time.timetuple()),
-        "nbf": timegm(cur_time.timetuple()),
-        "exp": timegm(exp_time.timetuple()),
-        "protocol_version": "2.0",
-        "d_name": "DLS",
-        "service_instance_ref": j.get('service_instance_ref'),
-        "service_instance_public_key_configuration": {
-            "service_instance_public_key_me": {
-                "mod": my_si_public_key.mod(),
-                "exp": my_si_public_key.exp(),
-            },
-            "service_instance_public_key_pem": my_si_public_key.pem().decode('utf-8').strip(),
-            "key_retention_mode": "LATEST_ONLY"
-        },
-    }
-
-    # my_jwt_encode_key = jwk.construct(my_si_private_key.pem().decode('utf-8'), algorithm=ALGORITHMS.RS256)
-    # config_token = jws.sign(payload, key=my_jwt_encode_key, headers=None, algorithm=ALGORITHMS.RS256)
-    config_token = jwt.encode(payload=payload, key=jwt_encode_key, headers=None, algorithm='RS256')
-
-    response_ca_chain = my_ca_certificate.pem().decode('utf-8').strip()
-    # 76 chars per line on original response with "\r\n"
-    """
-    response_ca_chain = my_ca_certificate.pem().decode('utf-8').strip()
-    response_ca_chain = response_ca_chain.replace('-----BEGIN CERTIFICATE-----', '')
-    response_ca_chain = response_ca_chain.replace('-----END CERTIFICATE-----', '')
-    response_ca_chain = response_ca_chain.replace('\n', '')
-    response_ca_chain = wrap(response_ca_chain, 76)
-    response_ca_chain = '\r\n'.join(response_ca_chain)
-    response_ca_chain = f'-----BEGIN CERTIFICATE-----\r\n{response_ca_chain}\r\n-----END CERTIFICATE-----'
-    """
-
-    response_si_certificate = my_si_certificate.pem().decode('utf-8').strip()
-    # 76 chars per line on original response with "\r\n"
-    """
-    response_si_certificate = my_si_certificate.pem().decode('utf-8').strip()
-    response_si_certificate = response_si_certificate.replace('-----BEGIN CERTIFICATE-----', '')
-    response_si_certificate = response_si_certificate.replace('-----END CERTIFICATE-----', '')
-    response_si_certificate = response_si_certificate.replace('\n', '')
-    response_si_certificate = wrap(response_si_certificate, 76)
-    response_si_certificate = '\r\n'.join(response_si_certificate)
-    """
-
-    response = {
-        "certificateConfiguration": {
-            "caChain": [response_ca_chain],
-            "publicCert": response_si_certificate,
-            "publicKey": {
-                "exp": my_si_certificate.public_key().exp(),
-                "mod": [my_si_certificate.public_key().mod()],
-            },
-        },
-        "configToken": config_token,
-    }
-
-    return Response(content=json_dumps(response, separators=(',', ':')), media_type='application/json', status_code=200)
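The removed config-token endpoint carries a commented-out recipe for re-wrapping a PEM body into the 76-character, CRLF-terminated lines of the original DLS response. The core of that recipe is `textwrap.wrap`, which breaks an unspaced base64 run at fixed width (the body below is a dummy filler string, not a real certificate):

```python
from textwrap import wrap

# Dummy single-line base64 body standing in for a stripped certificate payload.
body = 'A' * 100

lines = wrap(body, 76)  # break_long_words=True splits the unspaced run at 76 chars
pem = '-----BEGIN CERTIFICATE-----\r\n' + '\r\n'.join(lines) + '\r\n-----END CERTIFICATE-----'

print([len(line) for line in lines])  # [76, 24]
```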
 # venv/lib/python3.9/site-packages/nls_services_lease/test/test_lease_multi_controller.py
 @app.post('/leasing/v1/lessor', description='request multiple leases (borrow) for current origin')
 async def leasing_v1_lessor(request: Request):
-    j, token, cur_time = json_loads((await request.body()).decode('utf-8')), __get_token(request), datetime.now(UTC)
+    j, cur_time = json_loads((await request.body()).decode('utf-8')), datetime.utcnow()
+    default_instance = Instance.get_default_instance(db)
+    jwt_decode_key = default_instance.get_jwt_decode_key()

     try:
-        token = __get_token(request)
+        token = __get_token(request, jwt_decode_key)
     except JWTError:
-        response = {'status': 401, 'detail': 'token is not valid'}
-        return Response(content=json_dumps(response), media_type='application/json', status_code=401)
+        return JSONr(status_code=401, content={'status': 401, 'detail': 'token is not valid'})

     origin_ref = token.get('origin_ref')
     scope_ref_list = j.get('scope_ref_list')
-    lease_proposal_list = j.get('lease_proposal_list')
-    logger.info(f'> [ create ]: {origin_ref}: create leases for scope_ref_list {scope_ref_list}')
+    logging.info(f'> [ create ]: {origin_ref}: create leases for scope_ref_list {scope_ref_list}')

-    for scope_ref in scope_ref_list:
-        # if scope_ref not in [ALLOTMENT_REF]:
-        #     response = {'status': 400, 'detail': f'service instances not found for scopes: ["{scope_ref}"]')}
-        #     return Response(content=json_dumps(response), media_type='application/json', status_code=400)
-        pass
-
     lease_result_list = []
-    for lease_proposal in lease_proposal_list:
+    for scope_ref in scope_ref_list:
+        # if scope_ref not in [ALLOTMENT_REF]:
+        #     return JSONr(status_code=500, detail=f'no service instances found for scopes: ["{scope_ref}"]')
+
         lease_ref = str(uuid4())
-        expires = cur_time + LEASE_EXPIRE_DELTA
-
-        product_name = lease_proposal.get('product').get('name')
-        feature_name = PRODUCT_MAPPING.get_feature_name(product_name=product_name)
+        expires = cur_time + default_instance.get_lease_expire_delta()

         lease_result_list.append({
-            "error": None,
+            "ordinal": 0,
+            # https://docs.nvidia.com/license-system/latest/nvidia-license-system-user-guide/index.html
             "lease": {
-                "created": cur_time.strftime(DT_FORMAT),
-                "expires": expires.strftime(DT_FORMAT),  # todo: lease_proposal.get('duration') => "P0Y0M0DT12H0M0S"
-                "feature_name": feature_name,
-                "lease_intent_id": None,
-                "license_type": "CONCURRENT_COUNTED_SINGLE",
-                "metadata": None,
-                "offline_lease": False,  # todo
-                "product_name": product_name,
-                "recommended_lease_renewal": LEASE_RENEWAL_PERIOD,
                 "ref": lease_ref,
-            },
-            "ordinal": None,
+                "created": cur_time.isoformat(),
+                "expires": expires.isoformat(),
+                "recommended_lease_renewal": default_instance.lease_renewal_period,
+                "offline_lease": "true",
+                "license_type": "CONCURRENT_COUNTED_SINGLE"
+            }
         })

-        data = Lease(origin_ref=origin_ref, lease_ref=lease_ref, lease_created=cur_time, lease_expires=expires)
+        data = Lease(instance_ref=default_instance.instance_ref, origin_ref=origin_ref, lease_ref=lease_ref, lease_created=cur_time, lease_expires=expires)
         Lease.create_or_update(db, data)

     response = {
-        "client_challenge": j.get('client_challenge'),
         "lease_result_list": lease_result_list,
-        "prompts": None,
-        "result_code": None,
-        "sync_timestamp": cur_time.strftime(DT_FORMAT),
+        "result_code": "SUCCESS",
+        "sync_timestamp": cur_time.isoformat(),
+        "prompts": None
     }

-    content = json_dumps(response, separators=(',', ':'))
-    content = f'{content}\n'.encode('ascii')
-    signature = my_si_private_key.generate_signature(content)
-    headers = {
-        'Content-Type': 'application/json',
-        'access-control-expose-headers': 'X-NLS-Signature',
-        'X-NLS-Signature': f'{signature.hex().encode()}'
-    }
-    return Response(content=content, media_type='application/json', headers=headers)
+    return JSONr(response)
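One detail of the removed signing block worth noting: `f'{signature.hex().encode()}'` formats a `bytes` object with an f-string, so the `X-NLS-Signature` header value is the Python literal `b'…'` rather than the bare hex digest. A quick stdlib demonstration (the signature bytes are fabricated):

```python
signature = bytes.fromhex('deadbeef')  # fabricated stand-in for an RSA signature

as_sent = f'{signature.hex().encode()}'  # what the removed code puts in the header
plain = signature.hex()                  # the hex digest itself

print(as_sent)  # b'deadbeef'
print(plain)    # deadbeef
```

A client verifying the signature would have to strip the `b'…'` wrapper first, so `signature.hex()` alone is the value one would normally expect in the header.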
 # venv/lib/python3.9/site-packages/nls_services_lease/test/test_lease_multi_controller.py
 # venv/lib/python3.9/site-packages/nls_dal_service_instance_dls/schema/service_instance/V1_0_21__product_mapping.sql
 @app.get('/leasing/v1/lessor/leases', description='get active leases for current origin')
 async def leasing_v1_lessor_lease(request: Request):
-    token, cur_time = __get_token(request), datetime.now(UTC)
+    cur_time = datetime.utcnow()
+    jwt_decode_key = Instance.get_default_instance(db).get_jwt_decode_key()
+
+    try:
+        token = __get_token(request, jwt_decode_key)
+    except JWTError:
+        return JSONr(status_code=401, content={'status': 401, 'detail': 'token is not valid'})

     origin_ref = token.get('origin_ref')

     active_lease_list = list(map(lambda x: x.lease_ref, Lease.find_by_origin_ref(db, origin_ref)))
-    logger.info(f'> [ leases ]: {origin_ref}: found {len(active_lease_list)} active leases')
+    logging.info(f'> [ leases ]: {origin_ref}: found {len(active_lease_list)} active leases')

     response = {
         "active_lease_list": active_lease_list,
-        "prompts": None,
-        "sync_timestamp": cur_time.strftime(DT_FORMAT),
+        "sync_timestamp": cur_time.isoformat(),
+        "prompts": None
     }

-    return Response(content=json_dumps(response, separators=(',', ':')), media_type='application/json', status_code=200)
+    return JSONr(response)
 # venv/lib/python3.9/site-packages/nls_services_lease/test/test_lease_single_controller.py
 # venv/lib/python3.9/site-packages/nls_core_lease/lease_single.py
 @app.put('/leasing/v1/lease/{lease_ref}', description='renew a lease')
 async def leasing_v1_lease_renew(request: Request, lease_ref: str):
-    j, token, cur_time = json_loads((await request.body()).decode('utf-8')), __get_token(request), datetime.now(UTC)
+    cur_time = datetime.utcnow()
+    default_instance = Instance.get_default_instance(db)
+    jwt_decode_key = default_instance.get_jwt_decode_key()
+
+    try:
+        token = __get_token(request, jwt_decode_key)
+    except JWTError:
+        return JSONr(status_code=401, content={'status': 401, 'detail': 'token is not valid'})

     origin_ref = token.get('origin_ref')
-    logger.info(f'> [ renew ]: {origin_ref}: renew {lease_ref}')
+    logging.info(f'> [ renew ]: {origin_ref}: renew {lease_ref}')

     entity = Lease.find_by_origin_ref_and_lease_ref(db, origin_ref, lease_ref)
     if entity is None:
-        response = {'status': 404, 'detail': 'requested lease not available'}
-        return Response(content=json_dumps(response), media_type='application/json', status_code=404)
+        return JSONr(status_code=404, content={'status': 404, 'detail': 'requested lease not available'})

-    expires = cur_time + LEASE_EXPIRE_DELTA
+    expires = cur_time + default_instance.get_lease_expire_delta()
     response = {
-        "client_challenge": j.get('client_challenge'),
-        "expires": expires.strftime('%Y-%m-%dT%H:%M:%S.%f'),  # DT_FORMAT => trailing 'Z' missing in this response
-        "feature_expired": False,
         "lease_ref": lease_ref,
-        "metadata": None,
-        "offline_lease": False,  # todo
+        "expires": expires.isoformat(),
+        "recommended_lease_renewal": default_instance.lease_renewal_period,
+        "offline_lease": True,
         "prompts": None,
-        "recommended_lease_renewal": LEASE_RENEWAL_PERIOD,
-        "sync_timestamp": cur_time.strftime(DT_FORMAT),
+        "sync_timestamp": cur_time.isoformat(),
     }

     Lease.renew(db, entity, expires, cur_time)
-
-    content = json_dumps(response, separators=(',', ':'))
-    content = f'{content}\n'.encode('ascii')
-    signature = my_si_private_key.generate_signature(content)
-    headers = {
-        'Content-Type': 'application/json',
-        'access-control-expose-headers': 'X-NLS-Signature',
-        'X-NLS-Signature': f'{signature.hex().encode()}'
-    }
-    return Response(content=content, media_type='application/json', headers=headers)
+    return JSONr(response)
 # venv/lib/python3.9/site-packages/nls_services_lease/test/test_lease_single_controller.py
 @app.delete('/leasing/v1/lease/{lease_ref}', description='release (return) a lease')
 async def leasing_v1_lease_delete(request: Request, lease_ref: str):
-    token, cur_time = __get_token(request), datetime.now(UTC)
+    cur_time = datetime.utcnow()
+    jwt_decode_key = Instance.get_default_instance(db).get_jwt_decode_key()
+
+    try:
+        token = __get_token(request, jwt_decode_key)
+    except JWTError:
+        return JSONr(status_code=401, content={'status': 401, 'detail': 'token is not valid'})

     origin_ref = token.get('origin_ref')
-    logger.info(f'> [ return ]: {origin_ref}: return {lease_ref}')
+    logging.info(f'> [ return ]: {origin_ref}: return {lease_ref}')

     entity = Lease.find_by_lease_ref(db, lease_ref)
     if entity.origin_ref != origin_ref:
-        response = {'status': 403, 'detail': 'access or operation forbidden'}
-        return Response(content=json_dumps(response), media_type='application/json', status_code=403)
+        return JSONr(status_code=403, content={'status': 403, 'detail': 'access or operation forbidden'})
     if entity is None:
-        response = {'status': 404, 'detail': 'requested lease not available'}
-        return Response(content=json_dumps(response), media_type='application/json', status_code=404)
+        return JSONr(status_code=404, content={'status': 404, 'detail': 'requested lease not available'})

     if Lease.delete(db, lease_ref) == 0:
-        response = {'status': 404, 'detail': 'lease not found'}
-        return Response(content=json_dumps(response), media_type='application/json', status_code=404)
+        return JSONr(status_code=404, content={'status': 404, 'detail': 'lease not found'})

     response = {
-        "client_challenge": None,
         "lease_ref": lease_ref,
         "prompts": None,
-        "sync_timestamp": cur_time.strftime(DT_FORMAT),
+        "sync_timestamp": cur_time.isoformat(),
     }

-    return Response(content=json_dumps(response, separators=(',', ':')), media_type='application/json', status_code=200)
+    return JSONr(response)
 # venv/lib/python3.9/site-packages/nls_services_lease/test/test_lease_multi_controller.py
 @app.delete('/leasing/v1/lessor/leases', description='release all leases')
 async def leasing_v1_lessor_lease_remove(request: Request):
-    token, cur_time = __get_token(request), datetime.now(UTC)
+    cur_time = datetime.utcnow()
+    jwt_decode_key = Instance.get_default_instance(db).get_jwt_decode_key()
+
+    try:
+        token = __get_token(request, jwt_decode_key)
+    except JWTError:
+        return JSONr(status_code=401, content={'status': 401, 'detail': 'token is not valid'})

     origin_ref = token.get('origin_ref')

     released_lease_list = list(map(lambda x: x.lease_ref, Lease.find_by_origin_ref(db, origin_ref)))
     deletions = Lease.cleanup(db, origin_ref)
-    logger.info(f'> [ remove ]: {origin_ref}: removed {deletions} leases')
+    logging.info(f'> [ remove ]: {origin_ref}: removed {deletions} leases')

     response = {
         "released_lease_list": released_lease_list,
         "release_failure_list": None,
-        "prompts": None,
-        "sync_timestamp": cur_time.strftime(DT_FORMAT),
+        "sync_timestamp": cur_time.isoformat(),
+        "prompts": None
     }

-    return Response(content=json_dumps(response, separators=(',', ':')), media_type='application/json', status_code=200)
+    return JSONr(response)
 @app.post('/leasing/v1/lessor/shutdown', description='shutdown all leases')
 async def leasing_v1_lessor_shutdown(request: Request):
-    j, cur_time = json_loads((await request.body()).decode('utf-8')), datetime.now(UTC)
+    j, cur_time = json_loads((await request.body()).decode('utf-8')), datetime.utcnow()
+    jwt_decode_key = Instance.get_default_instance(db).get_jwt_decode_key()

     token = j.get('token')
-    token = jwt.decode(jwt=token, key=jwt_decode_key, algorithms='RS256', options={'verify_aud': False})
+    token = jwt.decode(token=token, key=jwt_decode_key, algorithms=ALGORITHMS.RS256, options={'verify_aud': False})
     origin_ref = token.get('origin_ref')

     released_lease_list = list(map(lambda x: x.lease_ref, Lease.find_by_origin_ref(db, origin_ref)))
     deletions = Lease.cleanup(db, origin_ref)
-    logger.info(f'> [ shutdown ]: {origin_ref}: removed {deletions} leases')
+    logging.info(f'> [ shutdown ]: {origin_ref}: removed {deletions} leases')

     response = {
         "released_lease_list": released_lease_list,
         "release_failure_list": None,
-        "prompts": None,
-        "sync_timestamp": cur_time.strftime(DT_FORMAT),
+        "sync_timestamp": cur_time.isoformat(),
+        "prompts": None
     }

-    return Response(content=json_dumps(response, separators=(',', ':')), media_type='application/json', status_code=200)
+    return JSONr(response)
@app.on_event('startup')
async def app_on_startup():
default_instance = Instance.get_default_instance(db)
lease_renewal_period = default_instance.lease_renewal_period
lease_renewal_delta = default_instance.get_lease_renewal_delta()
client_token_expire_delta = default_instance.get_client_token_expire_delta()
logger.info(f'''
Using timezone: {str(TZ)}. Make sure this is correct and match your clients!
Your clients will renew their license every {str(Lease.calculate_renewal(lease_renewal_period, lease_renewal_delta))}.
If the renewal fails, the license is valid for {str(lease_renewal_delta)}.
Your client-token file (.tok) is valid for {str(client_token_expire_delta)}.
''')
validate_settings()
if __name__ == '__main__': if __name__ == '__main__':
@@ -760,7 +626,7 @@ if __name__ == '__main__':
# #
### ###
logger.info(f'> Starting dev-server ...') logging.info(f'> Starting dev-server ...')
ssl_keyfile = join(dirname(__file__), 'cert/webserver.key') ssl_keyfile = join(dirname(__file__), 'cert/webserver.key')
ssl_certfile = join(dirname(__file__), 'cert/webserver.crt') ssl_certfile = join(dirname(__file__), 'cert/webserver.crt')
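The shutdown endpoint above verifies the client's JWT before releasing its leases. As a rough, self-contained sketch of what that verification involves — using stdlib HMAC/HS256 in place of the server's RS256 keys handled by python-jose; all names here are illustrative, not the project's API:

```python
import base64, hashlib, hmac, json

def _b64url(data: bytes) -> str:
    # base64url without padding, as used in JWT segments
    return base64.urlsafe_b64encode(data).rstrip(b'=').decode()

def jwt_encode(payload: dict, key: bytes) -> str:
    # header + payload, base64url-encoded, then HMAC-signed
    header = _b64url(json.dumps({'alg': 'HS256', 'typ': 'JWT'}).encode())
    body = _b64url(json.dumps(payload).encode())
    sig = _b64url(hmac.new(key, f'{header}.{body}'.encode(), hashlib.sha256).digest())
    return f'{header}.{body}.{sig}'

def jwt_decode(token: str, key: bytes) -> dict:
    header, body, sig = token.split('.')
    expected = _b64url(hmac.new(key, f'{header}.{body}'.encode(), hashlib.sha256).digest())
    if not hmac.compare_digest(sig, expected):
        raise ValueError('token is not valid')  # the endpoint maps this case to HTTP 401
    return json.loads(base64.urlsafe_b64decode(body + '=' * (-len(body) % 4)))

token = jwt_encode({'origin_ref': 'demo-origin'}, b'demo-key')
claims = jwt_decode(token, b'demo-key')
```

With a wrong key, `jwt_decode` raises instead of returning claims — the same failure the endpoint turns into a 401 response.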

View File

@@ -1,20 +1,143 @@
-from datetime import datetime, timedelta, timezone, UTC
+import logging
+from datetime import datetime, timedelta
 from dateutil.relativedelta import relativedelta
-from sqlalchemy import Column, VARCHAR, CHAR, ForeignKey, DATETIME, update, and_, inspect, text
+from sqlalchemy import Column, VARCHAR, CHAR, ForeignKey, DATETIME, update, and_, inspect, text, BLOB, INT, FLOAT
 from sqlalchemy.engine import Engine
-from sqlalchemy.orm import sessionmaker, declarative_base
-from util import DriverMatrix
+from sqlalchemy.orm import sessionmaker, declarative_base, Session, relationship
+from app.util import parse_key
+logging.basicConfig()
+logger = logging.getLogger(__name__)
+logger.setLevel(logging.INFO)
 Base = declarative_base()
+class Site(Base):
+    __tablename__ = "site"
+    INITIAL_SITE_KEY_XID = '00000000-0000-0000-0000-000000000000'
+    INITIAL_SITE_NAME = 'default'
+    site_key = Column(CHAR(length=36), primary_key=True, unique=True, index=True)  # uuid4, SITE_KEY_XID
+    name = Column(VARCHAR(length=256), nullable=False)
+    def __str__(self):
+        return f'SITE_KEY_XID: {self.site_key}'
+    @staticmethod
+    def create_statement(engine: Engine):
+        from sqlalchemy.schema import CreateTable
+        return CreateTable(Site.__table__).compile(engine)
+    @staticmethod
+    def get_default_site(engine: Engine) -> "Site":
+        session = sessionmaker(bind=engine)()
+        entity = session.query(Site).filter(Site.site_key == Site.INITIAL_SITE_KEY_XID).first()
+        session.close()
+        return entity
+class Instance(Base):
+    __tablename__ = "instance"
+    DEFAULT_INSTANCE_REF = '10000000-0000-0000-0000-000000000001'
+    DEFAULT_TOKEN_EXPIRE_DELTA = 86_400  # 1 day
+    DEFAULT_LEASE_EXPIRE_DELTA = 7_776_000  # 90 days
+    DEFAULT_LEASE_RENEWAL_PERIOD = 0.15
+    DEFAULT_CLIENT_TOKEN_EXPIRE_DELTA = 378_432_000  # 12 years
+    # 1 day = 86400 (min. in production setup, max 90 days), 1 hour = 3600
+    instance_ref = Column(CHAR(length=36), primary_key=True, unique=True, index=True)  # uuid4, INSTANCE_REF
+    site_key = Column(CHAR(length=36), ForeignKey(Site.site_key, ondelete='CASCADE'), nullable=False, index=True)  # uuid4
+    private_key = Column(BLOB(length=2048), nullable=False)
+    public_key = Column(BLOB(length=512), nullable=False)
+    token_expire_delta = Column(INT(), nullable=False, default=DEFAULT_TOKEN_EXPIRE_DELTA, comment='in seconds')
+    lease_expire_delta = Column(INT(), nullable=False, default=DEFAULT_LEASE_EXPIRE_DELTA, comment='in seconds')
+    lease_renewal_period = Column(FLOAT(precision=2), nullable=False, default=DEFAULT_LEASE_RENEWAL_PERIOD)
+    client_token_expire_delta = Column(INT(), nullable=False, default=DEFAULT_CLIENT_TOKEN_EXPIRE_DELTA, comment='in seconds')
+    __origin = relationship(Site, foreign_keys=[site_key])
+    def __str__(self):
+        return f'INSTANCE_REF: {self.instance_ref} (SITE_KEY_XID: {self.site_key})'
+    @staticmethod
+    def create_statement(engine: Engine):
+        from sqlalchemy.schema import CreateTable
+        return CreateTable(Instance.__table__).compile(engine)
+    @staticmethod
+    def create_or_update(engine: Engine, instance: "Instance"):
+        session = sessionmaker(bind=engine)()
+        entity = session.query(Instance).filter(Instance.instance_ref == instance.instance_ref).first()
+        if entity is None:
+            session.add(instance)
+        else:
+            x = dict(
+                site_key=instance.site_key,
+                private_key=instance.private_key,
+                public_key=instance.public_key,
+                token_expire_delta=instance.token_expire_delta,
+                lease_expire_delta=instance.lease_expire_delta,
+                lease_renewal_period=instance.lease_renewal_period,
+                client_token_expire_delta=instance.client_token_expire_delta,
+            )
+            session.execute(update(Instance).where(Instance.instance_ref == instance.instance_ref).values(**x))
+        session.commit()
+        session.flush()
+        session.close()
+    # todo: validate on startup that "lease_expire_delta" is between 1 day and 90 days
+    @staticmethod
+    def get_default_instance(engine: Engine) -> "Instance":
+        session = sessionmaker(bind=engine)()
+        site = Site.get_default_site(engine)
+        entity = session.query(Instance).filter(Instance.site_key == site.site_key).first()
+        session.close()
+        return entity
+    def get_token_expire_delta(self) -> "dateutil.relativedelta.relativedelta":
+        return relativedelta(seconds=self.token_expire_delta)
+    def get_lease_expire_delta(self) -> "dateutil.relativedelta.relativedelta":
+        return relativedelta(seconds=self.lease_expire_delta)
+    def get_lease_renewal_delta(self) -> "datetime.timedelta":
+        return timedelta(seconds=self.lease_expire_delta)
+    def get_client_token_expire_delta(self) -> "dateutil.relativedelta.relativedelta":
+        return relativedelta(seconds=self.client_token_expire_delta)
+    def __get_private_key(self) -> "RsaKey":
+        return parse_key(self.private_key)
+    def get_public_key(self) -> "RsaKey":
+        return parse_key(self.public_key)
+    def get_jwt_encode_key(self) -> "jose.jwk":
+        from jose import jwk
+        from jose.constants import ALGORITHMS
+        return jwk.construct(self.__get_private_key().export_key().decode('utf-8'), algorithm=ALGORITHMS.RS256)
+    def get_jwt_decode_key(self) -> "jose.jwk":
+        from jose import jwk
+        from jose.constants import ALGORITHMS
+        return jwk.construct(self.get_public_key().export_key().decode('utf-8'), algorithm=ALGORITHMS.RS256)
+    def get_private_key_str(self, encoding: str = 'utf-8') -> str:
+        return self.private_key.decode(encoding)
+    def get_public_key_str(self, encoding: str = 'utf-8') -> str:
+        return self.public_key.decode(encoding)
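For orientation, the `Instance` default delta constants above correspond to these human-readable durations — a stdlib sketch using `timedelta` for readability; the class itself returns `dateutil` relativedeltas:

```python
from datetime import timedelta

# the Instance defaults, expressed as stdlib timedeltas
token_expire = timedelta(seconds=86_400)              # DEFAULT_TOKEN_EXPIRE_DELTA
lease_expire = timedelta(seconds=7_776_000)           # DEFAULT_LEASE_EXPIRE_DELTA
client_token_expire = timedelta(seconds=378_432_000)  # DEFAULT_CLIENT_TOKEN_EXPIRE_DELTA
```

That is 1 day, 90 days, and 4380 days (12 × 365), matching the inline comments.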
 class Origin(Base):
     __tablename__ = "origin"
     origin_ref = Column(CHAR(length=36), primary_key=True, unique=True, index=True)  # uuid4
     # service_instance_xid = Column(CHAR(length=36), nullable=False, index=True)  # uuid4 # not necessary, we only support one service_instance_xid ('INSTANCE_REF')
     hostname = Column(VARCHAR(length=256), nullable=True)
     guest_driver_version = Column(VARCHAR(length=10), nullable=True)
@@ -25,8 +148,6 @@ class Origin(Base):
         return f'Origin(origin_ref={self.origin_ref}, hostname={self.hostname})'
     def serialize(self) -> dict:
-        _ = DriverMatrix().find(self.guest_driver_version)
         return {
             'origin_ref': self.origin_ref,
             # 'service_instance_xid': self.service_instance_xid,
@@ -34,7 +155,6 @@ class Origin(Base):
             'guest_driver_version': self.guest_driver_version,
             'os_platform': self.os_platform,
             'os_version': self.os_version,
-            '$driver': _ if _ is not None else None,
         }
     @staticmethod
@@ -66,17 +186,7 @@ class Origin(Base):
         if origin_refs is None:
             deletions = session.query(Origin).delete()
         else:
             deletions = session.query(Origin).filter(Origin.origin_ref.in_(origin_refs)).delete()
-        session.commit()
-        session.close()
-        return deletions
-    @staticmethod
-    def delete_expired(engine: Engine) -> int:
-        session = sessionmaker(bind=engine)()
-        origins = session.query(Origin).join(Lease, Origin.origin_ref == Lease.origin_ref, isouter=True).filter(Lease.lease_ref.is_(None)).all()
-        origin_refs = [origin.origin_ref for origin in origins]
-        deletions = session.query(Origin).filter(Origin.origin_ref.in_(origin_refs)).delete()
         session.commit()
         session.close()
         return deletions
@@ -85,18 +195,24 @@ class Origin(Base):
 class Lease(Base):
     __tablename__ = "lease"
+    instance_ref = Column(CHAR(length=36), ForeignKey(Instance.instance_ref, ondelete='CASCADE'), nullable=False, index=True)  # uuid4
     lease_ref = Column(CHAR(length=36), primary_key=True, nullable=False, index=True)  # uuid4
     origin_ref = Column(CHAR(length=36), ForeignKey(Origin.origin_ref, ondelete='CASCADE'), nullable=False, index=True)  # uuid4
     # scope_ref = Column(CHAR(length=36), nullable=False, index=True)  # uuid4 # not necessary, we only support one scope_ref ('ALLOTMENT_REF')
     lease_created = Column(DATETIME(), nullable=False)
     lease_expires = Column(DATETIME(), nullable=False)
     lease_updated = Column(DATETIME(), nullable=False)
+    __instance = relationship(Instance, foreign_keys=[instance_ref])
+    __origin = relationship(Origin, foreign_keys=[origin_ref])
     def __repr__(self):
         return f'Lease(origin_ref={self.origin_ref}, lease_ref={self.lease_ref}, expires={self.lease_expires})'
-    def serialize(self, renewal_period: float, renewal_delta: timedelta) -> dict:
+    def serialize(self) -> dict:
+        renewal_period = self.__instance.lease_renewal_period
+        renewal_delta = self.__instance.get_lease_renewal_delta()
         lease_renewal = int(Lease.calculate_renewal(renewal_period, renewal_delta).total_seconds())
         lease_renewal = self.lease_updated + relativedelta(seconds=lease_renewal)
@@ -104,10 +220,10 @@ class Lease(Base):
             'lease_ref': self.lease_ref,
             'origin_ref': self.origin_ref,
             # 'scope_ref': self.scope_ref,
-            'lease_created': self.lease_created.replace(tzinfo=timezone.utc).isoformat(),
-            'lease_expires': self.lease_expires.replace(tzinfo=timezone.utc).isoformat(),
-            'lease_updated': self.lease_updated.replace(tzinfo=timezone.utc).isoformat(),
-            'lease_renewal': lease_renewal.replace(tzinfo=timezone.utc).isoformat(),
+            'lease_created': self.lease_created.isoformat(),
+            'lease_expires': self.lease_expires.isoformat(),
+            'lease_updated': self.lease_updated.isoformat(),
+            'lease_renewal': lease_renewal.isoformat(),
         }
     @staticmethod
@@ -178,7 +294,7 @@ class Lease(Base):
     @staticmethod
     def delete_expired(engine: Engine) -> int:
         session = sessionmaker(bind=engine)()
-        deletions = session.query(Lease).filter(Lease.lease_expires <= datetime.now(UTC)).delete()
+        deletions = session.query(Lease).filter(Lease.lease_expires <= datetime.utcnow()).delete()
         session.commit()
         session.close()
         return deletions
@@ -206,38 +322,104 @@ class Lease(Base):
         return renew
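The body of `Lease.calculate_renewal` falls outside this hunk. Assuming it scales the lease lifetime by the renewal period (a hypothetical reimplementation, not the project's code), the arithmetic in `serialize()` above works out like this with the defaults (period 0.15, lifetime 90 days):

```python
from datetime import datetime, timedelta

def calculate_renewal(renewal_period: float, renewal_delta: timedelta) -> timedelta:
    # assumed behaviour: the client renews after this fraction of the lease lifetime
    return timedelta(seconds=renewal_delta.total_seconds() * renewal_period)

renew = calculate_renewal(0.15, timedelta(days=90))
# serialize() adds the renewal offset to the last update, as above
lease_updated = datetime(2024, 3, 4, 12, 0, 0)
lease_renewal = lease_updated + timedelta(seconds=int(renew.total_seconds()))
```

Under these assumptions a 90-day lease is renewed every 13.5 days.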
+def init_default_site(session: Session):
+    from uuid import uuid4
+    from app.util import generate_key
+    private_key = generate_key()
+    public_key = private_key.public_key()
+    site = Site(
+        site_key=Site.INITIAL_SITE_KEY_XID,
+        name=Site.INITIAL_SITE_NAME
+    )
+    session.add(site)
+    session.commit()
+    instance = Instance(
+        instance_ref=Instance.DEFAULT_INSTANCE_REF,
+        site_key=site.site_key,
+        private_key=private_key.export_key(),
+        public_key=public_key.export_key(),
+    )
+    session.add(instance)
+    session.commit()
 def init(engine: Engine):
-    tables = [Origin, Lease]
+    tables = [Site, Instance, Origin, Lease]
     db = inspect(engine)
     session = sessionmaker(bind=engine)()
     for table in tables:
-        if not db.dialect.has_table(engine.connect(), table.__tablename__):
+        exists = db.dialect.has_table(engine.connect(), table.__tablename__)
+        logger.info(f'> Table "{table.__tablename__:<16}" exists: {exists}')
+        if not exists:
             session.execute(text(str(table.create_statement(engine))))
             session.commit()
+    # create default site
+    cnt = session.query(Site).count()
+    if cnt == 0:
+        init_default_site(session)
+    session.flush()
     session.close()
 def migrate(engine: Engine):
+    from os import getenv as env
+    from os.path import join, dirname, isfile
+    from util import load_key
     db = inspect(engine)
-    def upgrade_1_0_to_1_1():
-        x = db.dialect.get_columns(engine.connect(), Lease.__tablename__)
-        x = next(_ for _ in x if _['name'] == 'origin_ref')
-        if x['primary_key'] > 0:
-            print('Found old database schema with "origin_ref" as primary-key in "lease" table. Dropping table!')
-            print('  Your leases are recreated on next renewal!')
-            print('  If an error message appears on the client, you can ignore it.')
-            Lease.__table__.drop(bind=engine)
-            init(engine)
-    # def upgrade_1_2_to_1_3():
-    #     x = db.dialect.get_columns(engine.connect(), Lease.__tablename__)
-    #     x = next((_ for _ in x if _['name'] == 'scope_ref'), None)
-    #     if x is None:
-    #         Lease.scope_ref.compile()
-    #         column_name = Lease.scope_ref.name
-    #         column_type = Lease.scope_ref.type.compile(engine.dialect)
-    #         engine.execute(f'ALTER TABLE "{Lease.__tablename__}" ADD COLUMN "{column_name}" {column_type}')
-    upgrade_1_0_to_1_1()
-    # upgrade_1_2_to_1_3()
+    # todo: add update guide to use 1.LATEST to 2.0
+    def upgrade_1_x_to_2_0():
+        site = Site.get_default_site(engine)
+        logger.info(site)
+        instance = Instance.get_default_instance(engine)
+        logger.info(instance)
+        # SITE_KEY_XID
+        if (site_key := env('SITE_KEY_XID', None)) is not None:
+            site.site_key = str(site_key)
+        # INSTANCE_REF
+        if (instance_ref := env('INSTANCE_REF', None)) is not None:
+            instance.instance_ref = str(instance_ref)
+        # ALLOTMENT_REF
+        if (allotment_ref := env('ALLOTMENT_REF', None)) is not None:
+            pass  # todo
+        # INSTANCE_KEY_RSA, INSTANCE_KEY_PUB
+        default_instance_private_key_path = str(join(dirname(__file__), 'cert/instance.private.pem'))
+        if (instance_private_key := env('INSTANCE_KEY_RSA', None)) is not None:
+            instance.private_key = load_key(str(instance_private_key))
+        elif isfile(default_instance_private_key_path):
+            instance.private_key = load_key(default_instance_private_key_path)
+        default_instance_public_key_path = str(join(dirname(__file__), 'cert/instance.public.pem'))
+        if (instance_public_key := env('INSTANCE_KEY_PUB', None)) is not None:
+            instance.public_key = load_key(str(instance_public_key))
+        elif isfile(default_instance_public_key_path):
+            instance.public_key = load_key(default_instance_public_key_path)
+        # TOKEN_EXPIRE_DELTA
+        if (token_expire_delta := env('TOKEN_EXPIRE_DAYS', None)) not in (None, 0):
+            instance.token_expire_delta = token_expire_delta * 86_400
+        if (token_expire_delta := env('TOKEN_EXPIRE_HOURS', None)) not in (None, 0):
+            instance.token_expire_delta = token_expire_delta * 3_600
+        # LEASE_EXPIRE_DELTA, LEASE_RENEWAL_DELTA
+        if (lease_expire_delta := env('LEASE_EXPIRE_DAYS', None)) not in (None, 0):
+            instance.lease_expire_delta = lease_expire_delta * 86_400
+        if (lease_expire_delta := env('LEASE_EXPIRE_HOURS', None)) not in (None, 0):
+            instance.lease_expire_delta = lease_expire_delta * 3_600
+        # LEASE_RENEWAL_PERIOD
+        if (lease_renewal_period := env('LEASE_RENEWAL_PERIOD', None)) is not None:
+            instance.lease_renewal_period = lease_renewal_period
+        # todo: update site, instance
+    upgrade_1_x_to_2_0()
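One subtlety in the `VAR := env(...)` override pattern used above: the walrus operator binds looser than comparison operators, so without parentheses around the assignment the name captures the boolean result of the comparison rather than the environment value. A minimal stdlib demonstration:

```python
import os

os.environ['SITE_KEY_XID'] = 'example-site-key'

# without parentheses, the walrus captures the comparison result (a bool)
if buggy := os.getenv('SITE_KEY_XID', None) is not None:
    pass

# with parentheses, it captures the value itself
if (fixed := os.getenv('SITE_KEY_XID', None)) is not None:
    pass
```

Here `buggy` ends up as `True` while `fixed` holds the actual string, which is why the parenthesized form is required for the `str(site_key)` assignments to work.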

View File

@@ -1,643 +0,0 @@
{
"product": [
{
"xid": "c0ce7114-d8a5-40d4-b8b0-df204f4ff631",
"product_family_xid": "bda4d909-2cdb-11ec-9838-061a22468b59",
"identifier": "NVIDIA-vComputeServer-9.0",
"name": "NVIDIA-vComputeServer-9.0",
"description": null
},
{
"xid": "2a99638e-493f-424b-bc3a-629935307490",
"product_family_xid": "bda4d909-2cdb-11ec-9838-061a22468b59",
"identifier": "vGaming_Flexera_License-0.1",
"name": "vGaming_Flexera_License-0.1",
"description": null
},
{
"xid": "a013d60c-3cd6-4e61-ae51-018b5e342178",
"product_family_xid": "bda4d909-2cdb-11ec-9838-061a22468b59",
"identifier": "GRID-Virtual-Apps-3.0",
"name": "GRID-Virtual-Apps-3.0",
"description": null
},
{
"xid": "bb99c6a3-81ce-4439-aef5-9648e75dd878",
"product_family_xid": "bda4d909-2cdb-11ec-9838-061a22468b59",
"identifier": "GRID-vGaming-NLS-Metered-8.0",
"name": "GRID-vGaming-NLS-Metered-8.0",
"description": null
},
{
"xid": "c653e131-695c-4477-b77c-42ade3dcb02c",
"product_family_xid": "bda4d909-2cdb-11ec-9838-061a22468b59",
"identifier": "GRID-Virtual-WS-Ext-2.0",
"name": "GRID-Virtual-WS-Ext-2.0",
"description": null
},
{
"xid": "6fc224ef-e0b5-467b-9bbb-d31c9eb7c6fc",
"product_family_xid": "bda4d909-2cdb-11ec-9838-061a22468b59",
"identifier": "GRID-vGaming-8.0",
"name": "GRID-vGaming-8.0",
"description": null
},
{
"xid": "3c88888d-ebf3-4df7-9e86-c97d5b29b997",
"product_family_xid": "bda4d909-2cdb-11ec-9838-061a22468b59",
"identifier": "GRID-Virtual-PC-2.0",
"name": "GRID-Virtual-PC-2.0",
"description": null
},
{
"xid": "66744b41-1fff-49be-a5a6-4cbd71b1117e",
"product_family_xid": "bda4d909-2cdb-11ec-9838-061a22468b59",
"identifier": "NVAIE_Licensing-1.0",
"name": "NVAIE_Licensing-1.0",
"description": null
},
{
"xid": "1d4e9ebc-a78c-41f4-a11a-de38a467b2ba",
"product_family_xid": "bda4d909-2cdb-11ec-9838-061a22468b59",
"identifier": "NVIDIA-vComputeServer NLS Metered-9.0",
"name": "NVIDIA-vComputeServer NLS Metered-9.0",
"description": null
},
{
"xid": "2152f8aa-d17b-46f5-8f5f-6f8c0760ce9c",
"product_family_xid": "bda4d909-2cdb-11ec-9838-061a22468b59",
"identifier": "vGaming_FB_License-0.1",
"name": "vGaming_FB_License-0.1",
"description": null
},
{
"xid": "54cbe0e8-7b35-4068-b058-e11f5b367c66",
"product_family_xid": "bda4d909-2cdb-11ec-9838-061a22468b59",
"identifier": "Quadro-Virtual-DWS-5.0",
"name": "Quadro-Virtual-DWS-5.0",
"description": null
},
{
"xid": "07a1d2b5-c147-48bc-bf44-9390339ca388",
"product_family_xid": "bda4d909-2cdb-11ec-9838-061a22468b59",
"identifier": "GRID-Virtual-WS-2.0",
"name": "GRID-Virtual-WS-2.0",
"description": null
},
{
"xid": "82d7a5f0-0c26-11ef-b3b6-371045c70906",
"product_family_xid": "bda4d909-2cdb-11ec-9838-061a22468b59",
"identifier": "vGaming_Flexera_License-0.1",
"name": "vGaming_Flexera_License-0.1",
"description": null
},
{
"xid": "bdfbde00-2cdb-11ec-9838-061a22468b59",
"product_family_xid": "bda4d909-2cdb-11ec-9838-061a22468b59",
"identifier": "NVIDIA Virtual Applications",
"name": "NVIDIA Virtual Applications",
"description": null
},
{
"xid": "bdfbe16d-2cdb-11ec-9838-061a22468b59",
"product_family_xid": "bda4d909-2cdb-11ec-9838-061a22468b59",
"identifier": "NVIDIA Virtual PC",
"name": "NVIDIA Virtual PC",
"description": null
},
{
"xid": "bdfbe308-2cdb-11ec-9838-061a22468b59",
"product_family_xid": "bda4d909-2cdb-11ec-9838-061a22468b59",
"identifier": "NVIDIA RTX Virtual Workstation",
"name": "NVIDIA RTX Virtual Workstation",
"description": null
},
{
"xid": "bdfbe405-2cdb-11ec-9838-061a22468b59",
"product_family_xid": "bda4d909-2cdb-11ec-9838-061a22468b59",
"identifier": "NVIDIA vGaming",
"name": "NVIDIA vGaming",
"description": null
},
{
"xid": "bdfbe509-2cdb-11ec-9838-061a22468b59",
"product_family_xid": "bda4d909-2cdb-11ec-9838-061a22468b59",
"identifier": "GRID Virtual Applications",
"name": "GRID Virtual Applications",
"description": null
},
{
"xid": "bdfbe5c6-2cdb-11ec-9838-061a22468b59",
"product_family_xid": "bda4d909-2cdb-11ec-9838-061a22468b59",
"identifier": "GRID Virtual PC",
"name": "GRID Virtual PC",
"description": null
},
{
"xid": "bdfbe6e8-2cdb-11ec-9838-061a22468b59",
"product_family_xid": "bda4d909-2cdb-11ec-9838-061a22468b59",
"identifier": "Quadro Virtual Data Center Workstation",
"name": "Quadro Virtual Data Center Workstation",
"description": null
},
{
"xid": "bdfbe7c8-2cdb-11ec-9838-061a22468b59",
"product_family_xid": "bda4d909-2cdb-11ec-9838-061a22468b59",
"identifier": "GRID vGaming",
"name": "GRID vGaming",
"description": null
},
{
"xid": "bdfbe884-2cdb-11ec-9838-061a22468b59",
"product_family_xid": "bda4d909-2cdb-11ec-9838-061a22468b59",
"identifier": "NVIDIA Virtual Compute Server",
"name": "NVIDIA Virtual Compute Server",
"description": null
},
{
"xid": "f09b5c33-5c07-11ed-9fa6-061a22468b59",
"product_family_xid": "bda4d909-2cdb-11ec-9838-061a22468b59",
"identifier": "NVIDIA OVE Licensing",
"name": "NVIDIA Omniverse Nucleus",
"description": null
}
],
"product_fulfillment": [
{
"xid": "cf0a5330-b583-4d9f-84bb-cfc8ce0917bb",
"product_xid": "07a1d2b5-c147-48bc-bf44-9390339ca388",
"qualifier_specification": null,
"evaluation_order_index": 0
},
{
"xid": "90d0f05f-9431-4a15-86e7-740a4f08d457",
"product_xid": "1d4e9ebc-a78c-41f4-a11a-de38a467b2ba",
"qualifier_specification": null,
"evaluation_order_index": 0
},
{
"xid": "327385dd-4ba8-4b3c-bc56-30bcf58ae9a3",
"product_xid": "2152f8aa-d17b-46f5-8f5f-6f8c0760ce9c",
"qualifier_specification": null,
"evaluation_order_index": 0
},
{
"xid": "6733f2cc-0736-47ee-bcc8-20c4c624ce37",
"product_xid": "2a99638e-493f-424b-bc3a-629935307490",
"qualifier_specification": null,
"evaluation_order_index": 0
},
{
"xid": "f35396a9-24f8-44b6-aa6a-493b335f4d56",
"product_xid": "3c88888d-ebf3-4df7-9e86-c97d5b29b997",
"qualifier_specification": null,
"evaluation_order_index": 0
},
{
"xid": "6c7981d3-7192-4bfd-b7ec-ea2ad0b466dc",
"product_xid": "54cbe0e8-7b35-4068-b058-e11f5b367c66",
"qualifier_specification": null,
"evaluation_order_index": 0
},
{
"xid": "9bd09610-6190-4684-9be6-3d9503833e80",
"product_xid": "66744b41-1fff-49be-a5a6-4cbd71b1117e",
"qualifier_specification": null,
"evaluation_order_index": 0
},
{
"xid": "a4282e5b-ea08-4e0a-b724-7f4059ba99de",
"product_xid": "6fc224ef-e0b5-467b-9bbb-d31c9eb7c6fc",
"qualifier_specification": null,
"evaluation_order_index": 0
},
{
"xid": "5cf793fc-1fb3-45c0-a711-d3112c775cbe",
"product_xid": "a013d60c-3cd6-4e61-ae51-018b5e342178",
"qualifier_specification": null,
"evaluation_order_index": 0
},
{
"xid": "eb2d39a4-6370-4464-8a6a-ec3f42c69cb5",
"product_xid": "bb99c6a3-81ce-4439-aef5-9648e75dd878",
"qualifier_specification": null,
"evaluation_order_index": 0
},
{
"xid": "e9df1c70-7fac-4c84-b54c-66e922b9791a",
"product_xid": "c0ce7114-d8a5-40d4-b8b0-df204f4ff631",
"qualifier_specification": null,
"evaluation_order_index": 0
},
{
"xid": "6a4d5bcd-7b81-4e22-a289-ce3673e5cabf",
"product_xid": "c653e131-695c-4477-b77c-42ade3dcb02c",
"qualifier_specification": null,
"evaluation_order_index": 0
},
{
"xid": "9e162d3c-0c26-11ef-b3b6-371045c70906",
"product_xid": "82d7a5f0-0c26-11ef-b3b6-371045c70906",
"qualifier_specification": null,
"evaluation_order_index": 0
},
{
"xid": "be2769b9-2cdb-11ec-9838-061a22468b59",
"product_xid": "bdfbde00-2cdb-11ec-9838-061a22468b59",
"qualifier_specification": null,
"evaluation_order_index": 0
},
{
"xid": "be276d7b-2cdb-11ec-9838-061a22468b59",
"product_xid": "bdfbe16d-2cdb-11ec-9838-061a22468b59",
"qualifier_specification": null,
"evaluation_order_index": 0
},
{
"xid": "be276efe-2cdb-11ec-9838-061a22468b59",
"product_xid": "bdfbe308-2cdb-11ec-9838-061a22468b59",
"qualifier_specification": null,
"evaluation_order_index": 0
},
{
"xid": "be276ff0-2cdb-11ec-9838-061a22468b59",
"product_xid": "bdfbe405-2cdb-11ec-9838-061a22468b59",
"qualifier_specification": null,
"evaluation_order_index": 0
},
{
"xid": "be2770af-2cdb-11ec-9838-061a22468b59",
"product_xid": "bdfbe509-2cdb-11ec-9838-061a22468b59",
"qualifier_specification": null,
"evaluation_order_index": 0
},
{
"xid": "be277164-2cdb-11ec-9838-061a22468b59",
"product_xid": "bdfbe5c6-2cdb-11ec-9838-061a22468b59",
"qualifier_specification": null,
"evaluation_order_index": 0
},
{
"xid": "be277214-2cdb-11ec-9838-061a22468b59",
"product_xid": "bdfbe6e8-2cdb-11ec-9838-061a22468b59",
"qualifier_specification": null,
"evaluation_order_index": 0
},
{
"xid": "be2772c8-2cdb-11ec-9838-061a22468b59",
"product_xid": "bdfbe7c8-2cdb-11ec-9838-061a22468b59",
"qualifier_specification": null,
"evaluation_order_index": 0
},
{
"xid": "be277379-2cdb-11ec-9838-061a22468b59",
"product_xid": "bdfbe884-2cdb-11ec-9838-061a22468b59",
"qualifier_specification": null,
"evaluation_order_index": 0
},
{
"xid": "c4284597-5c09-11ed-9fa6-061a22468b59",
"product_xid": "f09b5c33-5c07-11ed-9fa6-061a22468b59",
"qualifier_specification": null,
"evaluation_order_index": 0
}
],
"product_fulfillment_feature": [
{
"xid": "9ca32d2b-736e-4e4f-8f5a-895a755b4c41",
"product_fulfillment_xid": "5cf793fc-1fb3-45c0-a711-d3112c775cbe",
"feature_identifier": "GRID-Virtual-Apps",
"feature_version": "3.0",
"license_type_identifier": "CONCURRENT_COUNTED_SINGLE",
"evaluation_order_index": 0
},
{
"xid": "d8b25329-f47f-43dc-a278-f2d38f9e939b",
"product_fulfillment_xid": "f35396a9-24f8-44b6-aa6a-493b335f4d56",
"feature_identifier": "GRID-Virtual-PC",
"feature_version": "2.0",
"license_type_identifier": "CONCURRENT_COUNTED_SINGLE",
"evaluation_order_index": 0
},
{
"xid": "e7102df8-d88a-4bd0-aa79-9a53d8b77888",
"product_fulfillment_xid": "cf0a5330-b583-4d9f-84bb-cfc8ce0917bb",
"feature_identifier": "GRID-Virtual-WS",
"feature_version": "2.0",
"license_type_identifier": "CONCURRENT_COUNTED_SINGLE",
"evaluation_order_index": 0
},
{
"xid": "30761db3-0afe-454d-b284-efba6d9b13a3",
"product_fulfillment_xid": "6a4d5bcd-7b81-4e22-a289-ce3673e5cabf",
"feature_identifier": "GRID-Virtual-WS-Ext",
"feature_version": "2.0",
"license_type_identifier": "CONCURRENT_COUNTED_SINGLE",
"evaluation_order_index": 0
},
{
"xid": "10fd7701-83ae-4caf-a27f-75880fab23f6",
"product_fulfillment_xid": "a4282e5b-ea08-4e0a-b724-7f4059ba99de",
"feature_identifier": "GRID-vGaming",
"feature_version": "8.0",
"license_type_identifier": "CONCURRENT_COUNTED_SINGLE",
"evaluation_order_index": 0
},
{
"xid": "cbd61276-fb1e-42e1-b844-43e94465da8f",
"product_fulfillment_xid": "9bd09610-6190-4684-9be6-3d9503833e80",
"feature_identifier": "NVAIE_Licensing",
"feature_version": "1.0",
"license_type_identifier": "CONCURRENT_COUNTED_SINGLE",
"evaluation_order_index": 0
},
{
"xid": "6b1c74b5-1511-46ee-9f12-8bc6d5636fef",
"product_fulfillment_xid": "90d0f05f-9431-4a15-86e7-740a4f08d457",
"feature_identifier": "NVIDIA-vComputeServer NLS Metered",
"feature_version": "9.0",
"license_type_identifier": "CONCURRENT_COUNTED_SINGLE",
"evaluation_order_index": 0
},
{
"xid": "db53af09-7295-48b7-b927-24b23690c959",
"product_fulfillment_xid": "e9df1c70-7fac-4c84-b54c-66e922b9791a",
"feature_identifier": "NVIDIA-vComputeServer",
"feature_version": "9.0",
"license_type_identifier": "CONCURRENT_COUNTED_SINGLE",
"evaluation_order_index": 0
},
{
"xid": "1f62be61-a887-4e54-a34e-61cfa7b2db30",
"product_fulfillment_xid": "6c7981d3-7192-4bfd-b7ec-ea2ad0b466dc",
"feature_identifier": "Quadro-Virtual-DWS",
"feature_version": "5.0",
"license_type_identifier": "CONCURRENT_COUNTED_SINGLE",
"evaluation_order_index": 0
},
{
"xid": "8a4b5e98-f1ca-4c18-b0d4-8f4f9f0462e2",
"product_fulfillment_xid": "327385dd-4ba8-4b3c-bc56-30bcf58ae9a3",
"feature_identifier": "vGaming_FB_License",
"feature_version": "0.1",
"license_type_identifier": "CONCURRENT_COUNTED_SINGLE",
"evaluation_order_index": 0
},
{
"xid": "be531e98-2cdb-11ec-9838-061a22468b59",
"product_fulfillment_xid": "be2769b9-2cdb-11ec-9838-061a22468b59",
"feature_identifier": "GRID-Virtual-Apps",
"feature_version": "3.0",
"license_type_identifier": "CONCURRENT_COUNTED_SINGLE",
"evaluation_order_index": 0
},
{
"xid": "be53219e-2cdb-11ec-9838-061a22468b59",
"product_fulfillment_xid": "be276d7b-2cdb-11ec-9838-061a22468b59",
"feature_identifier": "GRID-Virtual-PC",
"feature_version": "2.0",
"license_type_identifier": "CONCURRENT_COUNTED_SINGLE",
"evaluation_order_index": 0
},
{
"xid": "be5322f0-2cdb-11ec-9838-061a22468b59",
"product_fulfillment_xid": "be276d7b-2cdb-11ec-9838-061a22468b59",
"feature_identifier": "Quadro-Virtual-DWS",
"feature_version": "5.0",
"license_type_identifier": "CONCURRENT_COUNTED_SINGLE",
"evaluation_order_index": 1
},
{
"xid": "be5323d8-2cdb-11ec-9838-061a22468b59",
"product_fulfillment_xid": "be276d7b-2cdb-11ec-9838-061a22468b59",
"feature_identifier": "GRID-Virtual-WS",
"feature_version": "2.0",
"license_type_identifier": "CONCURRENT_COUNTED_SINGLE",
"evaluation_order_index": 2
},
{
"xid": "be5324a6-2cdb-11ec-9838-061a22468b59",
"product_fulfillment_xid": "be276d7b-2cdb-11ec-9838-061a22468b59",
"feature_identifier": "GRID-Virtual-WS-Ext",
"feature_version": "2.0",
"license_type_identifier": "CONCURRENT_COUNTED_SINGLE",
"evaluation_order_index": 3
},
{
"xid": "be532568-2cdb-11ec-9838-061a22468b59",
"product_fulfillment_xid": "be276efe-2cdb-11ec-9838-061a22468b59",
"feature_identifier": "Quadro-Virtual-DWS",
"feature_version": "5.0",
"license_type_identifier": "CONCURRENT_COUNTED_SINGLE",
"evaluation_order_index": 0
},
{
"xid": "be532630-2cdb-11ec-9838-061a22468b59",
"product_fulfillment_xid": "be276efe-2cdb-11ec-9838-061a22468b59",
"feature_identifier": "GRID-Virtual-WS",
"feature_version": "2.0",
"license_type_identifier": "CONCURRENT_COUNTED_SINGLE",
"evaluation_order_index": 1
},
{
"xid": "be5326e7-2cdb-11ec-9838-061a22468b59",
"product_fulfillment_xid": "be276efe-2cdb-11ec-9838-061a22468b59",
"feature_identifier": "GRID-Virtual-WS-Ext",
"feature_version": "2.0",
"license_type_identifier": "CONCURRENT_COUNTED_SINGLE",
"evaluation_order_index": 2
},
{
"xid": "be5327a7-2cdb-11ec-9838-061a22468b59",
"product_fulfillment_xid": "be276ff0-2cdb-11ec-9838-061a22468b59",
"feature_identifier": "GRID-vGaming",
"feature_version": "8.0",
"license_type_identifier": "CONCURRENT_COUNTED_SINGLE",
"evaluation_order_index": 0
},
{
"xid": "be532923-2cdb-11ec-9838-061a22468b59",
"product_fulfillment_xid": "be2770af-2cdb-11ec-9838-061a22468b59",
"feature_identifier": "GRID-Virtual-Apps",
"feature_version": "3.0",
"license_type_identifier": "CONCURRENT_COUNTED_SINGLE",
"evaluation_order_index": 0
},
{
"xid": "be5329e0-2cdb-11ec-9838-061a22468b59",
"product_fulfillment_xid": "be277164-2cdb-11ec-9838-061a22468b59",
"feature_identifier": "GRID-Virtual-PC",
"feature_version": "2.0",
"license_type_identifier": "CONCURRENT_COUNTED_SINGLE",
"evaluation_order_index": 0
},
{
"xid": "be532aa0-2cdb-11ec-9838-061a22468b59",
"product_fulfillment_xid": "be277164-2cdb-11ec-9838-061a22468b59",
"feature_identifier": "Quadro-Virtual-DWS",
"feature_version": "5.0",
"license_type_identifier": "CONCURRENT_COUNTED_SINGLE",
"evaluation_order_index": 1
},
{
"xid": "be532b5c-2cdb-11ec-9838-061a22468b59",
"product_fulfillment_xid": "be277164-2cdb-11ec-9838-061a22468b59",
"feature_identifier": "GRID-Virtual-WS",
"feature_version": "2.0",
"license_type_identifier": "CONCURRENT_COUNTED_SINGLE",
"evaluation_order_index": 2
},
{
"xid": "be532c19-2cdb-11ec-9838-061a22468b59",
"product_fulfillment_xid": "be277164-2cdb-11ec-9838-061a22468b59",
"feature_identifier": "GRID-Virtual-WS-Ext",
"feature_version": "2.0",
"license_type_identifier": "CONCURRENT_COUNTED_SINGLE",
"evaluation_order_index": 3
},
{
"xid": "be532ccb-2cdb-11ec-9838-061a22468b59",
"product_fulfillment_xid": "be277214-2cdb-11ec-9838-061a22468b59",
"feature_identifier": "Quadro-Virtual-DWS",
"feature_version": "5.0",
"license_type_identifier": "CONCURRENT_COUNTED_SINGLE",
"evaluation_order_index": 0
},
{
"xid": "be532d92-2cdb-11ec-9838-061a22468b59",
"product_fulfillment_xid": "be277214-2cdb-11ec-9838-061a22468b59",
"feature_identifier": "GRID-Virtual-WS",
"feature_version": "2.0",
"license_type_identifier": "CONCURRENT_COUNTED_SINGLE",
"evaluation_order_index": 1
},
{
"xid": "be532e45-2cdb-11ec-9838-061a22468b59",
"product_fulfillment_xid": "be277214-2cdb-11ec-9838-061a22468b59",
"feature_identifier": "GRID-Virtual-WS-Ext",
"feature_version": "2.0",
"license_type_identifier": "CONCURRENT_COUNTED_SINGLE",
"evaluation_order_index": 2
},
{
"xid": "be532efa-2cdb-11ec-9838-061a22468b59",
"product_fulfillment_xid": "be2772c8-2cdb-11ec-9838-061a22468b59",
"feature_identifier": "GRID-vGaming",
"feature_version": "8.0",
"license_type_identifier": "CONCURRENT_COUNTED_SINGLE",
"evaluation_order_index": 0
},
{
"xid": "be53306d-2cdb-11ec-9838-061a22468b59",
"product_fulfillment_xid": "be277379-2cdb-11ec-9838-061a22468b59",
"feature_identifier": "NVIDIA-vComputeServer",
"feature_version": "9.0",
"license_type_identifier": "CONCURRENT_COUNTED_SINGLE",
"evaluation_order_index": 0
},
{
"xid": "be533228-2cdb-11ec-9838-061a22468b59",
"product_fulfillment_xid": "be277379-2cdb-11ec-9838-061a22468b59",
"feature_identifier": "NVIDIA-vComputeServer NLS Metered",
"feature_version": "9.0",
"license_type_identifier": "CONCURRENT_UNCOUNTED_SINGLE",
"evaluation_order_index": 2
},
{
"xid": "be5332f6-2cdb-11ec-9838-061a22468b59",
"product_fulfillment_xid": "be277379-2cdb-11ec-9838-061a22468b59",
"feature_identifier": "NVAIE_Licensing",
"feature_version": "1.0",
"license_type_identifier": "CONCURRENT_COUNTED_SINGLE",
"evaluation_order_index": 3
},
{
"xid": "15ff4f16-57a8-4593-93ec-58352a256f12",
"product_fulfillment_xid": "eb2d39a4-6370-4464-8a6a-ec3f42c69cb5",
"feature_identifier": "GRID-vGaming-NLS-Metered",
"feature_version": "8.0",
"license_type_identifier": "CONCURRENT_COUNTED_SINGLE",
"evaluation_order_index": 3
},
{
"xid": "0c1552ca-3ef8-11ed-9fa6-061a22468b59",
"product_fulfillment_xid": "be276ff0-2cdb-11ec-9838-061a22468b59",
"feature_identifier": "vGaming_Flexera_License",
"feature_version": "0.1",
"license_type_identifier": "CONCURRENT_COUNTED_SINGLE",
"evaluation_order_index": 1
},
{
"xid": "31c3be8c-5c0a-11ed-9fa6-061a22468b59",
"product_fulfillment_xid": "c4284597-5c09-11ed-9fa6-061a22468b59",
"feature_identifier": "OVE_Licensing",
"feature_version": "1.0",
"license_type_identifier": "CONCURRENT_COUNTED_SINGLE",
"evaluation_order_index": 0
},
{
"xid": "6caeb4cf-360f-11ee-b67d-02f279bf2bff",
"product_fulfillment_xid": "be277379-2cdb-11ec-9838-061a22468b59",
"feature_identifier": "NVAIE_Licensing",
"feature_version": "2.0",
"license_type_identifier": "CONCURRENT_COUNTED_SINGLE",
"evaluation_order_index": 4
},
{
"xid": "7fb1d01d-3f0e-11ed-9fa6-061a22468b59",
"product_fulfillment_xid": "be276ff0-2cdb-11ec-9838-061a22468b59",
"feature_identifier": "vGaming_FB_License",
"feature_version": "0.1",
"license_type_identifier": "CONCURRENT_COUNTED_SINGLE",
"evaluation_order_index": 2
},
{
"xid": "8eabcb08-3f0e-11ed-9fa6-061a22468b59",
"product_fulfillment_xid": "be2772c8-2cdb-11ec-9838-061a22468b59",
"feature_identifier": "vGaming_FB_License",
"feature_version": "0.1",
"license_type_identifier": "CONCURRENT_COUNTED_SINGLE",
"evaluation_order_index": 2
},
{
"xid": "a1dfe741-3e49-11ed-9fa6-061a22468b59",
"product_fulfillment_xid": "be2772c8-2cdb-11ec-9838-061a22468b59",
"feature_identifier": "vGaming_Flexera_License",
"feature_version": "0.1",
"license_type_identifier": "CONCURRENT_COUNTED_SINGLE",
"evaluation_order_index": 1
},
{
"xid": "be53286a-2cdb-11ec-9838-061a22468b59",
"product_fulfillment_xid": "be276ff0-2cdb-11ec-9838-061a22468b59",
"feature_identifier": "GRID-vGaming-NLS-Metered",
"feature_version": "8.0",
"license_type_identifier": "CONCURRENT_UNCOUNTED_SINGLE",
"evaluation_order_index": 3
},
{
"xid": "be532fb2-2cdb-11ec-9838-061a22468b59",
"product_fulfillment_xid": "be2772c8-2cdb-11ec-9838-061a22468b59",
"feature_identifier": "GRID-vGaming-NLS-Metered",
"feature_version": "8.0",
"license_type_identifier": "CONCURRENT_UNCOUNTED_SINGLE",
"evaluation_order_index": 3
},
{
"xid": "be533144-2cdb-11ec-9838-061a22468b59",
"product_fulfillment_xid": "be277379-2cdb-11ec-9838-061a22468b59",
"feature_identifier": "Quadro-Virtual-DWS",
"feature_version": "0.0",
"license_type_identifier": "CONCURRENT_COUNTED_SINGLE",
"evaluation_order_index": 1
},
{
"xid": "bf105e18-0c26-11ef-b3b6-371045c70906",
"product_fulfillment_xid": "9e162d3c-0c26-11ef-b3b6-371045c70906",
"feature_identifier": "vGaming_Flexera_License",
"feature_version": "0.1",
"license_type_identifier": "CONCURRENT_COUNTED_SINGLE",
"evaluation_order_index": 0
}
]
}
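Each `product_fulfillment_feature` row above links a product fulfillment to one candidate feature, and `evaluation_order_index` fixes the order in which candidates are tried. A minimal sketch of that selection, using a few rows copied from the data above (the selection logic is a reading of the schema, not code from NLS):

```python
# Three candidate features of the same product fulfillment, copied from the data above.
rows = [
    {"product_fulfillment_xid": "be277379-2cdb-11ec-9838-061a22468b59",
     "feature_identifier": "NVIDIA-vComputeServer", "evaluation_order_index": 0},
    {"product_fulfillment_xid": "be277379-2cdb-11ec-9838-061a22468b59",
     "feature_identifier": "NVIDIA-vComputeServer NLS Metered", "evaluation_order_index": 2},
    {"product_fulfillment_xid": "be277379-2cdb-11ec-9838-061a22468b59",
     "feature_identifier": "NVAIE_Licensing", "evaluation_order_index": 3},
]


def candidates(fulfillment_xid: str) -> list[dict]:
    # all features of one fulfillment, ordered by evaluation_order_index
    found = [r for r in rows if r["product_fulfillment_xid"] == fulfillment_xid]
    return sorted(found, key=lambda r: r["evaluation_order_index"])


print(candidates("be277379-2cdb-11ec-9838-061a22468b59")[0]["feature_identifier"])
# NVIDIA-vComputeServer
```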


@@ -1,423 +1,40 @@
Added side of the hunk (+1,40):

def load_file(filename) -> bytes:
    with open(filename, 'rb') as file:
        content = file.read()
    return content


def load_key(filename) -> "RsaKey":
    try:
        # Crypto | Cryptodome on Debian
        from Crypto.PublicKey import RSA
        from Crypto.PublicKey.RSA import RsaKey
    except ModuleNotFoundError:
        from Cryptodome.PublicKey import RSA
        from Cryptodome.PublicKey.RSA import RsaKey

    return RSA.import_key(extern_key=load_file(filename), passphrase=None)


def parse_key(content: bytes) -> "RsaKey":
    try:
        # Crypto | Cryptodome on Debian
        from Crypto.PublicKey import RSA
        from Crypto.PublicKey.RSA import RsaKey
    except ModuleNotFoundError:
        from Cryptodome.PublicKey import RSA
        from Cryptodome.PublicKey.RSA import RsaKey

    return RSA.import_key(extern_key=content, passphrase=None)


def generate_key() -> "RsaKey":
    try:
        # Crypto | Cryptodome on Debian
        from Crypto.PublicKey import RSA
        from Crypto.PublicKey.RSA import RsaKey
    except ModuleNotFoundError:
        from Cryptodome.PublicKey import RSA
        from Cryptodome.PublicKey.RSA import RsaKey

    return RSA.generate(bits=2048)

Removed side of the hunk (-1,423):

import logging
from datetime import datetime, UTC, timedelta
from json import loads as json_loads
from os.path import join, dirname, isfile, isdir

from cryptography import x509
from cryptography.hazmat._oid import NameOID
from cryptography.hazmat.primitives import serialization, hashes
from cryptography.hazmat.primitives.asymmetric.padding import PKCS1v15
from cryptography.hazmat.primitives.asymmetric.rsa import RSAPrivateKey, RSAPublicKey, generate_private_key
from cryptography.hazmat.primitives.hashes import SHA256
from cryptography.hazmat.primitives.serialization import Encoding
from cryptography.hazmat.primitives.serialization import load_pem_private_key, load_pem_public_key
from cryptography.x509 import load_pem_x509_certificate, Certificate

logging.basicConfig()


def load_file(filename: str) -> bytes:
    log = logging.getLogger(f'{__name__}')
    log.debug(f'Loading contents of file "(unknown)')
    with open(filename, 'rb') as file:
        content = file.read()
    return content


class CASetup:
    ###
    #
    # https://git.collinwebdesigns.de/nvidia/nls/-/blob/main/src/test/test_config_token.py
    #
    ###

    ROOT_PRIVATE_KEY_FILENAME = 'root_private_key.pem'
    ROOT_CERTIFICATE_FILENAME = 'root_certificate.pem'
    CA_PRIVATE_KEY_FILENAME = 'ca_private_key.pem'
    CA_CERTIFICATE_FILENAME = 'ca_certificate.pem'
    SI_PRIVATE_KEY_FILENAME = 'si_private_key.pem'
    SI_CERTIFICATE_FILENAME = 'si_certificate.pem'

    def __init__(self, service_instance_ref: str, cert_path: str = None):
        cert_path_prefix = join(dirname(__file__), 'cert')
        if cert_path is not None and len(cert_path) > 0 and isdir(cert_path):
            cert_path_prefix = cert_path

        self.service_instance_ref = service_instance_ref
        self.root_private_key_filename = join(cert_path_prefix, CASetup.ROOT_PRIVATE_KEY_FILENAME)
        self.root_certificate_filename = join(cert_path_prefix, CASetup.ROOT_CERTIFICATE_FILENAME)
        self.ca_private_key_filename = join(cert_path_prefix, CASetup.CA_PRIVATE_KEY_FILENAME)
        self.ca_certificate_filename = join(cert_path_prefix, CASetup.CA_CERTIFICATE_FILENAME)
        self.si_private_key_filename = join(cert_path_prefix, CASetup.SI_PRIVATE_KEY_FILENAME)
        self.si_certificate_filename = join(cert_path_prefix, CASetup.SI_CERTIFICATE_FILENAME)

        if not (isfile(self.root_private_key_filename)
                and isfile(self.root_certificate_filename)
                and isfile(self.ca_private_key_filename)
                and isfile(self.ca_certificate_filename)
                and isfile(self.si_private_key_filename)
                and isfile(self.si_certificate_filename)):
            self.init_config_token_demo()

    def init_config_token_demo(self):
        """ Create Root Key and Certificate """

        # create root keypair
        my_root_private_key = generate_private_key(public_exponent=65537, key_size=4096)
        my_root_public_key = my_root_private_key.public_key()

        # create root-certificate subject
        my_root_subject = x509.Name([
            x509.NameAttribute(NameOID.COUNTRY_NAME, u'US'),
            x509.NameAttribute(NameOID.STATE_OR_PROVINCE_NAME, u'California'),
            x509.NameAttribute(NameOID.ORGANIZATION_NAME, u'Nvidia'),
            x509.NameAttribute(NameOID.ORGANIZATIONAL_UNIT_NAME, u'Nvidia Licensing Service (NLS)'),
            x509.NameAttribute(NameOID.COMMON_NAME, u'NLS Root CA'),
        ])

        # create self-signed root-certificate
        my_root_certificate = (
            x509.CertificateBuilder()
            .subject_name(my_root_subject)
            .issuer_name(my_root_subject)
            .public_key(my_root_public_key)
            .serial_number(x509.random_serial_number())
            .not_valid_before(datetime.now(tz=UTC) - timedelta(days=1))
            .not_valid_after(datetime.now(tz=UTC) + timedelta(days=365 * 10))
            .add_extension(x509.BasicConstraints(ca=True, path_length=None), critical=True)
            .add_extension(x509.KeyUsage(
                digital_signature=False,
                key_encipherment=False,
                key_cert_sign=True,
                key_agreement=False,
                content_commitment=False,
                data_encipherment=False,
                crl_sign=True,
                encipher_only=False,
                decipher_only=False),
                critical=True
            )
            .add_extension(x509.SubjectKeyIdentifier.from_public_key(my_root_public_key), critical=False)
            .add_extension(x509.AuthorityKeyIdentifier.from_issuer_public_key(my_root_public_key), critical=False)
            .sign(my_root_private_key, hashes.SHA256()))

        my_root_private_key_as_pem = my_root_private_key.private_bytes(
            encoding=serialization.Encoding.PEM,
            format=serialization.PrivateFormat.TraditionalOpenSSL,
            encryption_algorithm=serialization.NoEncryption(),
        )

        with open(self.root_private_key_filename, 'wb') as f:
            f.write(my_root_private_key_as_pem)

        with open(self.root_certificate_filename, 'wb') as f:
            f.write(my_root_certificate.public_bytes(encoding=Encoding.PEM))

        """ Create CA (Intermediate) Key and Certificate """

        # create ca keypair
        my_ca_private_key = generate_private_key(public_exponent=65537, key_size=4096)
        my_ca_public_key = my_ca_private_key.public_key()

        # create ca-certificate subject
        my_ca_subject = x509.Name([
            x509.NameAttribute(NameOID.COUNTRY_NAME, u'US'),
            x509.NameAttribute(NameOID.STATE_OR_PROVINCE_NAME, u'California'),
            x509.NameAttribute(NameOID.ORGANIZATION_NAME, u'Nvidia'),
            x509.NameAttribute(NameOID.ORGANIZATIONAL_UNIT_NAME, u'Nvidia Licensing Service (NLS)'),
            x509.NameAttribute(NameOID.COMMON_NAME, u'NLS Intermediate CA'),
        ])

        # create ca-certificate signed by the root
        my_ca_certificate = (
            x509.CertificateBuilder()
            .subject_name(my_ca_subject)
            .issuer_name(my_root_subject)
            .public_key(my_ca_public_key)
            .serial_number(x509.random_serial_number())
            .not_valid_before(datetime.now(tz=UTC) - timedelta(days=1))
            .not_valid_after(datetime.now(tz=UTC) + timedelta(days=365 * 10))
            .add_extension(x509.BasicConstraints(ca=True, path_length=None), critical=True)
            .add_extension(x509.KeyUsage(
                digital_signature=False,
                key_encipherment=False,
                key_cert_sign=True,
                key_agreement=False,
                content_commitment=False,
                data_encipherment=False,
                crl_sign=True,
                encipher_only=False,
                decipher_only=False),
                critical=True
            )
            .add_extension(x509.SubjectKeyIdentifier.from_public_key(my_ca_public_key), critical=False)
            .add_extension(x509.AuthorityKeyIdentifier.from_issuer_subject_key_identifier(
                my_root_certificate.extensions.get_extension_for_class(x509.SubjectKeyIdentifier).value
            ), critical=False)
            .sign(my_root_private_key, hashes.SHA256()))

        my_ca_private_key_as_pem = my_ca_private_key.private_bytes(
            encoding=serialization.Encoding.PEM,
            format=serialization.PrivateFormat.TraditionalOpenSSL,
            encryption_algorithm=serialization.NoEncryption(),
        )

        with open(self.ca_private_key_filename, 'wb') as f:
            f.write(my_ca_private_key_as_pem)

        with open(self.ca_certificate_filename, 'wb') as f:
            f.write(my_ca_certificate.public_bytes(encoding=Encoding.PEM))

        """ Create Service-Instance Key and Certificate """

        # create si keypair
        my_si_private_key = generate_private_key(public_exponent=65537, key_size=2048)
        my_si_public_key = my_si_private_key.public_key()

        my_si_private_key_as_pem = my_si_private_key.private_bytes(
            encoding=serialization.Encoding.PEM,
            format=serialization.PrivateFormat.TraditionalOpenSSL,
            encryption_algorithm=serialization.NoEncryption(),
        )
        my_si_public_key_as_pem = my_si_public_key.public_bytes(
            encoding=serialization.Encoding.PEM,
            format=serialization.PublicFormat.SubjectPublicKeyInfo,
        )

        with open(self.si_private_key_filename, 'wb') as f:
            f.write(my_si_private_key_as_pem)

        # with open(self.si_public_key_filename, 'wb') as f:
        #     f.write(my_si_public_key_as_pem)

        # create si-certificate subject
        my_si_subject = x509.Name([
            # x509.NameAttribute(NameOID.COMMON_NAME, INSTANCE_REF),
            x509.NameAttribute(NameOID.COMMON_NAME, self.service_instance_ref),
        ])

        # create si-certificate signed by the intermediate ca
        my_si_certificate = (
            x509.CertificateBuilder()
            .subject_name(my_si_subject)
            .issuer_name(my_ca_subject)
            .public_key(my_si_public_key)
            .serial_number(x509.random_serial_number())
            .not_valid_before(datetime.now(tz=UTC) - timedelta(days=1))
            .not_valid_after(datetime.now(tz=UTC) + timedelta(days=365 * 10))
            .add_extension(x509.KeyUsage(digital_signature=True, key_encipherment=True, key_cert_sign=False,
                                         key_agreement=True, content_commitment=False, data_encipherment=False,
                                         crl_sign=False, encipher_only=False, decipher_only=False), critical=True)
            .add_extension(x509.ExtendedKeyUsage([
                x509.oid.ExtendedKeyUsageOID.SERVER_AUTH,
                x509.oid.ExtendedKeyUsageOID.CLIENT_AUTH]
            ), critical=False)
            .add_extension(x509.SubjectKeyIdentifier.from_public_key(my_si_public_key), critical=False)
            # .add_extension(x509.AuthorityKeyIdentifier.from_issuer_public_key(my_ca_public_key), critical=False)
            .add_extension(x509.AuthorityKeyIdentifier.from_issuer_subject_key_identifier(
                my_ca_certificate.extensions.get_extension_for_class(x509.SubjectKeyIdentifier).value
            ), critical=False)
            .add_extension(x509.SubjectAlternativeName([
                # x509.DNSName(INSTANCE_REF)
                x509.DNSName(self.service_instance_ref)
            ]), critical=False)
            .sign(my_ca_private_key, hashes.SHA256()))

        with open(self.si_certificate_filename, 'wb') as f:
            f.write(my_si_certificate.public_bytes(encoding=Encoding.PEM))


class PrivateKey:

    def __init__(self, data: bytes):
        self.__key = load_pem_private_key(data, password=None)

    @staticmethod
    def from_file(filename: str) -> "PrivateKey":
        log = logging.getLogger(__name__)
        log.debug(f'Importing RSA-Private-Key from "(unknown)"')

        with open(filename, 'rb') as f:
            data = f.read()

        return PrivateKey(data=data.strip())

    def raw(self) -> RSAPrivateKey:
        return self.__key

    def pem(self) -> bytes:
        return self.__key.private_bytes(
            encoding=serialization.Encoding.PEM,
            format=serialization.PrivateFormat.TraditionalOpenSSL,
            encryption_algorithm=serialization.NoEncryption()
        )

    def public_key(self) -> "PublicKey":
        data = self.__key.public_key().public_bytes(
            encoding=serialization.Encoding.PEM,
            format=serialization.PublicFormat.SubjectPublicKeyInfo
        )
        return PublicKey(data=data)

    def generate_signature(self, data: bytes) -> bytes:
        return self.__key.sign(data=data, padding=PKCS1v15(), algorithm=SHA256())

    @staticmethod
    def generate(public_exponent: int = 65537, key_size: int = 2048) -> "PrivateKey":
        log = logging.getLogger(__name__)
        log.debug(f'Generating RSA-Key')
        key = generate_private_key(public_exponent=public_exponent, key_size=key_size)
        data = key.private_bytes(
            encoding=serialization.Encoding.PEM,
            format=serialization.PrivateFormat.TraditionalOpenSSL,
            encryption_algorithm=serialization.NoEncryption()
        )
        return PrivateKey(data=data)


class PublicKey:

    def __init__(self, data: bytes):
        self.__key = load_pem_public_key(data)

    @staticmethod
    def from_file(filename: str) -> "PublicKey":
        log = logging.getLogger(__name__)
        log.debug(f'Importing RSA-Public-Key from "(unknown)"')

        with open(filename, 'rb') as f:
            data = f.read()

        return PublicKey(data=data.strip())

    def raw(self) -> RSAPublicKey:
        return self.__key

    def pem(self) -> bytes:
        return self.__key.public_bytes(
            encoding=serialization.Encoding.PEM,
            format=serialization.PublicFormat.SubjectPublicKeyInfo
        )

    def mod(self) -> str:
        return hex(self.__key.public_numbers().n)[2:]

    def exp(self):
        return int(self.__key.public_numbers().e)

    def verify_signature(self, signature: bytes, data: bytes) -> None:
        self.__key.verify(signature=signature, data=data, padding=PKCS1v15(), algorithm=SHA256())


class Cert:

    def __init__(self, data: bytes):
        self.__cert = load_pem_x509_certificate(data)

    @staticmethod
    def from_file(filename: str) -> "Cert":
        log = logging.getLogger(__name__)
        log.debug(f'Importing Certificate from "(unknown)"')

        with open(filename, 'rb') as f:
            data = f.read()

        return Cert(data=data.strip())

    def raw(self) -> Certificate:
        return self.__cert

    def pem(self) -> bytes:
        return self.__cert.public_bytes(encoding=serialization.Encoding.PEM)

    def public_key(self) -> "PublicKey":
        data = self.__cert.public_key().public_bytes(
            encoding=serialization.Encoding.PEM,
            format=serialization.PublicFormat.SubjectPublicKeyInfo
        )
        return PublicKey(data=data)

    def signature(self) -> bytes:
        return self.__cert.signature

    def subject_key_identifier(self):
        return self.__cert.extensions.get_extension_for_class(x509.SubjectKeyIdentifier).value.key_identifier

    def authority_key_identifier(self):
        return self.__cert.extensions.get_extension_for_class(x509.AuthorityKeyIdentifier).value.key_identifier


class DriverMatrix:
    __DRIVER_MATRIX_FILENAME = 'static/driver_matrix.json'
    __DRIVER_MATRIX: None | dict = None  # https://docs.nvidia.com/grid/ => "Driver Versions"

    def __init__(self):
        self.log = logging.getLogger(self.__class__.__name__)

        if DriverMatrix.__DRIVER_MATRIX is None:
            self.__load()

    def __load(self):
        try:
            with open(DriverMatrix.__DRIVER_MATRIX_FILENAME, 'r') as f:
                DriverMatrix.__DRIVER_MATRIX = json_loads(f.read())
            self.log.debug(f'Successfully loaded "{DriverMatrix.__DRIVER_MATRIX_FILENAME}".')
        except Exception as e:
            DriverMatrix.__DRIVER_MATRIX = {}  # init an empty dict so the file is not re-opened on every call, only after an app restart
            # self.log.warning(f'Failed to load "{NV.__DRIVER_MATRIX_FILENAME}": {e}')

    @staticmethod
    def find(version: str) -> dict | None:
        if DriverMatrix.__DRIVER_MATRIX is None:
            return None
        for idx, (key, branch) in enumerate(DriverMatrix.__DRIVER_MATRIX.items()):
            for release in branch.get('$releases'):
                linux_driver = release.get('Linux Driver')
                windows_driver = release.get('Windows Driver')
                if version == linux_driver or version == windows_driver:
                    tmp = branch.copy()
                    tmp.pop('$releases')

                    is_latest = release.get('vGPU Software') == branch.get('Latest Release in Branch')

                    return {
                        'software_branch': branch.get('vGPU Software Branch'),
                        'branch_version': release.get('vGPU Software'),
                        'driver_branch': branch.get('Driver Branch'),
                        'branch_status': branch.get('vGPU Branch Status'),
                        'release_date': release.get('Release Date'),
                        'eol': branch.get('EOL Date') if is_latest else None,
                        'is_latest': is_latest,
                    }
        return None


class ProductMapping:

    def __init__(self, filename: str):
        with open(filename, 'r') as file:
            self.data = json_loads(file.read())

    def get_feature_name(self, product_name: str) -> str:
        product = self.__get_product(product_name)
        product_fulfillment = self.__get_product_fulfillment(product.get('xid'))
        feature = self.__get_product_fulfillment_feature(product_fulfillment.get('xid'))
        return feature.get('feature_identifier')

    def __get_product(self, product_name: str):
        product_list = self.data.get('product')
        return next(filter(lambda _: _.get('identifier') == product_name, product_list))

    def __get_product_fulfillment(self, product_xid: str):
        product_fulfillment_list = self.data.get('product_fulfillment')
        return next(filter(lambda _: _.get('product_xid') == product_xid, product_fulfillment_list))

    def __get_product_fulfillment_feature(self, product_fulfillment_xid: str):
        feature_list = self.data.get('product_fulfillment_feature')
        features = list(filter(lambda _: _.get('product_fulfillment_xid') == product_fulfillment_xid, feature_list))
        features.sort(key=lambda _: _.get('evaluation_order_index'))
        return features[0]
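The `ProductMapping` chain above (product → product_fulfillment → product_fulfillment_feature, lowest `evaluation_order_index` wins) can be exercised without a mapping file; the product name and xids below are made up for illustration:

```python
# Hypothetical minimal mapping exercising the same three-table chain.
data = {
    "product": [{"xid": "p-1", "identifier": "DEMO-PRODUCT"}],
    "product_fulfillment": [{"xid": "pf-1", "product_xid": "p-1"}],
    "product_fulfillment_feature": [
        {"product_fulfillment_xid": "pf-1", "feature_identifier": "Feature-B", "evaluation_order_index": 1},
        {"product_fulfillment_xid": "pf-1", "feature_identifier": "Feature-A", "evaluation_order_index": 0},
    ],
}


def get_feature_name(mapping: dict, product_name: str) -> str:
    # product -> fulfillment -> features, sorted by evaluation_order_index
    product = next(p for p in mapping["product"] if p["identifier"] == product_name)
    fulfillment = next(f for f in mapping["product_fulfillment"] if f["product_xid"] == product["xid"])
    features = [f for f in mapping["product_fulfillment_feature"]
                if f["product_fulfillment_xid"] == fulfillment["xid"]]
    features.sort(key=lambda f: f["evaluation_order_index"])
    return features[0]["feature_identifier"]


print(get_feature_name(data, "DEMO-PRODUCT"))  # Feature-A
```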

doc/Database.md Normal file

@@ -0,0 +1,26 @@
# Database structure
## `request_routing.service_instance`
| xid | org_name |
|----------------------------------------|--------------------------|
| `10000000-0000-0000-0000-000000000000` | `lic-000000000000000000` |
- `xid` is used as `SERVICE_INSTANCE_XID`
## `request_routing.license_allotment_service_instance`
| xid | service_instance_xid | license_allotment_xid |
|----------------------------------------|----------------------------------------|----------------------------------------|
| `90000000-0000-0000-0000-000000000001` | `10000000-0000-0000-0000-000000000000` | `80000000-0000-0000-0000-000000000001` |
- `xid` is only a primary-key and never used as foreign-key or reference
- `license_allotment_xid` must be used to fetch `xid`'s from `request_routing.license_allotment_reference`
## `request_routing.license_allotment_reference`
| xid | license_allotment_xid |
|----------------------------------------|----------------------------------------|
| `20000000-0000-0000-0000-000000000001` | `80000000-0000-0000-0000-000000000001` |
- `xid` is used as `scope_ref_list` on token request
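Putting the three tables together: resolving the `scope_ref_list` entries that belong to a `SERVICE_INSTANCE_XID` means joining through `license_allotment_xid`. A sketch with the example rows from the tables above (`scope_refs` is a name chosen here, not an NLS function):

```python
# Example rows copied from the tables above.
license_allotment_service_instance = [
    {"xid": "90000000-0000-0000-0000-000000000001",
     "service_instance_xid": "10000000-0000-0000-0000-000000000000",
     "license_allotment_xid": "80000000-0000-0000-0000-000000000001"},
]
license_allotment_reference = [
    {"xid": "20000000-0000-0000-0000-000000000001",
     "license_allotment_xid": "80000000-0000-0000-0000-000000000001"},
]


def scope_refs(service_instance_xid: str) -> list[str]:
    # service_instance -> license_allotment(s) -> reference xids (the scope_ref_list)
    allotments = {row["license_allotment_xid"]
                  for row in license_allotment_service_instance
                  if row["service_instance_xid"] == service_instance_xid}
    return [row["xid"] for row in license_allotment_reference
            if row["license_allotment_xid"] in allotments]


print(scope_refs("10000000-0000-0000-0000-000000000000"))
# ['20000000-0000-0000-0000-000000000001']
```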


@@ -0,0 +1,177 @@
# Reverse Engineering Notes
# Useful commands
## Check licensing status
- `nvidia-smi -q | grep "License"`
**Output**
```
vGPU Software Licensed Product
License Status : Licensed (Expiry: 2023-1-14 12:59:52 GMT)
```
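The same check can be scripted; the sketch below parses a captured sample of the output above (in practice you would feed it `subprocess.run(['nvidia-smi', '-q'], capture_output=True, text=True).stdout`):

```python
import re

# Captured sample of `nvidia-smi -q` output, as shown above.
sample = """vGPU Software Licensed Product
    License Status                    : Licensed (Expiry: 2023-1-14 12:59:52 GMT)"""

match = re.search(r'License Status\s*:\s*(\w+)', sample)
print(match.group(1))  # Licensed
```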
## Track licensing progress
- NVIDIA Grid Log: `journalctl -u nvidia-gridd -f`
```
systemd[1]: Started NVIDIA Grid Daemon.
nvidia-gridd[2986]: Configuration parameter ( ServerAddress ) not set
nvidia-gridd[2986]: vGPU Software package (0)
nvidia-gridd[2986]: Ignore service provider and node-locked licensing
nvidia-gridd[2986]: NLS initialized
nvidia-gridd[2986]: Acquiring license. (Info: license.nvidia.space; NVIDIA RTX Virtual Workstation)
nvidia-gridd[2986]: License acquired successfully. (Info: license.nvidia.space, NVIDIA RTX Virtual Workstation; Expiry: 2023-1-29 22:3:0 GMT)
```
# DLS-Container File-System (Docker)
## Configuration data
Most variables and configs are stored in `/var/lib/docker/volumes/configurations/_data`.
Files can be copied out of the container with `docker cp <container-id>:/venv/... /opt/localfile/...`, edited, and copied back the same way.
(You may need to fix permissions afterwards with `docker exec -u 0 <container-id> chown nonroot:nonroot /venv/...`.)
## Dive / Docker image inspector
- `dive dls:appliance`
The source code is stored in `/venv/lib/python3.9/site-packages/nls_*`.
Image-Reference:
```
Tags: (unavailable)
Id: d1c7976a5d2b3681ff6c5a30f8187e4015187a83f3f285ba4a37a45458bd6b98
Digest: sha256:311223c5af7a298ec1104f5dc8c3019bfb0e1f77256dc3d995244ffb295a971f
Command:
#(nop) ADD file:c1900d3e3a29c29a743a8da86c437006ec5d2aa873fb24e48033b6bf492bb37b in /
```
## Private Key (Site-Key)
- `/etc/dls/config/decryptor/decryptor`
```shell
docker exec -it <container-id> /etc/dls/config/decryptor/decryptor > /tmp/private-key.pem
```
```
-----BEGIN RSA PRIVATE KEY-----
...
-----END RSA PRIVATE KEY-----
```
## Site Key Uri - `/etc/dls/config/site_key_uri.bin`
```
base64-content...
```
## DB Password - `/etc/dls/config/dls_db_password.bin`
```
base64-content...
```
**Decrypt database password**
```
cd /var/lib/docker/volumes/configurations/_data
cat dls_db_password.bin | base64 -d > dls_db_password.bin.raw
openssl rsautl -decrypt -inkey /tmp/private-key.pem -in dls_db_password.bin.raw
```
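The same pipeline works from Python with the `cryptography` package. `openssl rsautl -decrypt` defaults to PKCS#1 v1.5 padding, so the sketch below assumes that scheme; it uses a freshly generated key and a simulated `dls_db_password.bin` so it runs stand-alone (against a real container you would load `/tmp/private-key.pem` instead):

```python
from base64 import b64decode, b64encode

from cryptography.hazmat.primitives.asymmetric import padding, rsa

# Stand-in for the site private key extracted via the decryptor binary.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

# Simulate dls_db_password.bin: RSA-encrypted with PKCS#1 v1.5, then base64-encoded.
blob = b64encode(private_key.public_key().encrypt(b'secret-db-password', padding.PKCS1v15()))

# Equivalent of: cat dls_db_password.bin | base64 -d | openssl rsautl -decrypt -inkey ...
plaintext = private_key.decrypt(b64decode(blob), padding.PKCS1v15())
print(plaintext.decode())  # secret-db-password
```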
# Database
- Manipulating the database licenses is sufficient; not a single line of code has to be changed to bypass the licensing
  validations.
# Logging / Stack Trace
- https://docs.nvidia.com/license-system/latest/nvidia-license-system-user-guide/index.html#troubleshooting-dls-instance
**Failed licensing log**
```
{
"activity": 100,
"context": {
"SERVICE_INSTANCE_ID": "b43d6e46-d6d0-4943-8b8d-c66a5f6e0d38",
"SERVICE_INSTANCE_NAME": "DEFAULT_2022-12-14_12:48:30",
"description": "borrow failed: NotFoundError(no pool features found for: NVIDIA RTX Virtual Workstation)",
"event_type": null,
"function_name": "_evt",
"lineno": 54,
"module_name": "nls_dal_lease_dls.event",
"operation_id": "e72a8ca7-34cc-4e11-b80c-273592085a24",
"origin_ref": "3f7f5a50-a26b-425b-8d5e-157f63e72b1c",
"service_name": "nls_services_lease"
},
"detail": {
"oc": {
"license_allotment_xid": "10c4317f-7c4c-11ed-a524-0e4252a7e5f1",
"origin_ref": "3f7f5a50-a26b-425b-8d5e-157f63e72b1c",
"service_instance_xid": "b43d6e46-d6d0-4943-8b8d-c66a5f6e0d38"
},
"operation_id": "e72a8ca7-34cc-4e11-b80c-273592085a24"
},
"id": "0cc9e092-3b92-4652-8d9e-7622ef85dc79",
"metadata": {},
"ts": "2022-12-15T10:25:36.827661Z"
}
{
"activity": 400,
"context": {
"SERVICE_INSTANCE_ID": "b43d6e46-d6d0-4943-8b8d-c66a5f6e0d38",
"SERVICE_INSTANCE_NAME": "DEFAULT_2022-12-14_12:48:30",
"description": "lease_multi_create failed: no pool features found for: NVIDIA RTX Virtual Workstation",
"event_by": "system",
"function_name": "lease_multi_create",
"level": "warning",
"lineno": 157,
"module_name": "nls_services_lease.controllers.lease_multi_controller",
"operation_id": "e72a8ca7-34cc-4e11-b80c-273592085a24",
"service_name": "nls_services_lease"
},
"detail": {
"_msg": "lease_multi_create failed: no pool features found for: NVIDIA RTX Virtual Workstation",
"exec_info": ["NotFoundError", "NotFoundError(no pool features found for: NVIDIA RTX Virtual Workstation)", " File \"/venv/lib/python3.9/site-packages/nls_services_lease/controllers/lease_multi_controller.py\", line 127, in lease_multi_create\n data = _leaseMulti.lease_multi_create(event_args)\n File \"/venv/lib/python3.9/site-packages/nls_core_lease/lease_multi.py\", line 208, in lease_multi_create\n raise e\n File \"/venv/lib/python3.9/site-packages/nls_core_lease/lease_multi.py\", line 184, in lease_multi_create\n self._try_proposals(oc, mlr, results, detail)\n File \"/venv/lib/python3.9/site-packages/nls_core_lease/lease_multi.py\", line 219, in _try_proposals\n lease = self._leases.create(creator)\n File \"/venv/lib/python3.9/site-packages/nls_dal_lease_dls/leases.py\", line 230, in create\n features = self._get_features(creator)\n File \"/venv/lib/python3.9/site-packages/nls_dal_lease_dls/leases.py\", line 148, in _get_features\n self._explain_not_available(cur, creator)\n File \"/venv/lib/python3.9/site-packages/nls_dal_lease_dls/leases.py\", line 299, in _explain_not_available\n raise NotFoundError(f'no pool features found for: {lcc.product_name}')\n"],
"operation_id": "e72a8ca7-34cc-4e11-b80c-273592085a24"
},
"id": "282801b9-d612-40a5-9145-b56d8e420dac",
"metadata": {},
"ts": "2022-12-15T10:25:36.831673Z"
}
```
**Stack Trace**
```
"NotFoundError", "NotFoundError(no pool features found for: NVIDIA RTX Virtual Workstation)", " File \"/venv/lib/python3.9/site-packages/nls_services_lease/controllers/lease_multi_controller.py\", line 127, in lease_multi_create
data = _leaseMulti.lease_multi_create(event_args)
File \"/venv/lib/python3.9/site-packages/nls_core_lease/lease_multi.py\", line 208, in lease_multi_create
raise e
File \"/venv/lib/python3.9/site-packages/nls_core_lease/lease_multi.py\", line 184, in lease_multi_create
self._try_proposals(oc, mlr, results, detail)
File \"/venv/lib/python3.9/site-packages/nls_core_lease/lease_multi.py\", line 219, in _try_proposals
lease = self._leases.create(creator)
File \"/venv/lib/python3.9/site-packages/nls_dal_lease_dls/leases.py\", line 230, in create
features = self._get_features(creator)
File \"/venv/lib/python3.9/site-packages/nls_dal_lease_dls/leases.py\", line 148, in _get_features
self._explain_not_available(cur, creator)
File \"/venv/lib/python3.9/site-packages/nls_dal_lease_dls/leases.py\", line 299, in _explain_not_available
raise NotFoundError(f'no pool features found for: {lcc.product_name}')
"
```
# Nginx
- NGINX uses `/opt/certs/cert.pem` and `/opt/certs/key.pem`


@@ -15,7 +15,7 @@ services:
    <<: *dls-variables
    volumes:
      - /etc/timezone:/etc/timezone:ro
      - /opt/docker/fastapi-dls/cert:/app/cert # instance.private.pem, instance.public.pem
      - db:/app/database
    entrypoint: ["uvicorn", "main:app", "--host", "0.0.0.0", "--port", "8000", "--app-dir", "/app", "--proxy-headers"]
    healthcheck:


@@ -1,8 +1,8 @@
Removed side of the hunk (-):

fastapi==0.116.1
uvicorn[standard]==0.35.0
pyjwt==2.10.1
cryptography==45.0.6
python-dateutil==2.9.0
sqlalchemy==2.0.43
markdown==3.8.2
python-dotenv==1.1.1

Added side of the hunk (+):

fastapi==0.110.0
uvicorn[standard]==0.27.1
python-jose==3.3.0
pycryptodome==3.20.0
python-dateutil==2.8.2
sqlalchemy==2.0.27
markdown==3.5.2
python-dotenv==1.0.1


@@ -1,123 +0,0 @@
import logging

logging.basicConfig()
logger = logging.getLogger(__name__)
logger.setLevel(logging.INFO)

URL = 'https://docs.nvidia.com/vgpu/index.html'

BRANCH_STATUS_KEY = 'vGPU Branch Status'
VGPU_KEY, GRID_KEY, DRIVER_BRANCH_KEY = 'vGPU Software', 'vGPU Software', 'Driver Branch'
LINUX_VGPU_MANAGER_KEY, LINUX_DRIVER_KEY = 'Linux vGPU Manager', 'Linux Driver'
WINDOWS_VGPU_MANAGER_KEY, WINDOWS_DRIVER_KEY = 'Windows vGPU Manager', 'Windows Driver'
ALT_VGPU_MANAGER_KEY = 'vGPU Manager'
RELEASE_DATE_KEY, LATEST_KEY, EOL_KEY = 'Release Date', 'Latest Release in Branch', 'EOL Date'
JSON_RELEASES_KEY = '$releases'


def __driver_versions(html: 'BeautifulSoup'):
    def __strip(_: str) -> str:
        # removes content after linebreak (e.g. "Hello\n World" to "Hello")
        _ = _.strip()
        tmp = _.split('\n')
        if len(tmp) > 0:
            return tmp[0]
        return _

    # find wrapper for "DriverVersions" and find tables
    data = html.find('div', {'id': 'driver-versions'})
    items = data.find_all('bsp-accordion', {'class': 'Accordion-items-item'})
    for item in items:
        software_branch = item.find('div', {'class': 'Accordion-items-item-title'}).text.strip()
        software_branch = software_branch.replace(' Releases', '')
        matrix_key = software_branch.lower()

        branch_status = item.find('a', href=True, string='Branch status')
        branch_status = branch_status.next_sibling.replace(':', '').strip()

        # driver version info from table-heads (ths) and table-rows (trs)
        table = item.find('table')
        ths, trs = table.find_all('th'), table.find_all('tr')
        headers, releases = [header.text.strip() for header in ths], []
        for tr in trs:
            tds = tr.find_all('td')
            if len(tds) == 0:  # skip empty
                continue
            # create dict with table-heads as key and cell content as value
            x = {headers[i]: __strip(cell.text) for i, cell in enumerate(tds)}
            x.setdefault(BRANCH_STATUS_KEY, branch_status)
            releases.append(x)

        # add to matrix
        MATRIX.update({matrix_key: {JSON_RELEASES_KEY: releases}})


def __debug():
    # print table head
    s = f'{VGPU_KEY:^13} | {LINUX_VGPU_MANAGER_KEY:^21} | {LINUX_DRIVER_KEY:^21} | {WINDOWS_VGPU_MANAGER_KEY:^21} | {WINDOWS_DRIVER_KEY:^21} | {RELEASE_DATE_KEY:>21} | {BRANCH_STATUS_KEY:^21}'
    print(s)

    # iterate over dict & format some variables to not overload table
    for idx, (key, branch) in enumerate(MATRIX.items()):
        for release in branch.get(JSON_RELEASES_KEY):
            version = release.get(VGPU_KEY, release.get(GRID_KEY, ''))
            linux_manager = release.get(LINUX_VGPU_MANAGER_KEY, release.get(ALT_VGPU_MANAGER_KEY, ''))
            linux_driver = release.get(LINUX_DRIVER_KEY)
            windows_manager = release.get(WINDOWS_VGPU_MANAGER_KEY, release.get(ALT_VGPU_MANAGER_KEY, ''))
            windows_driver = release.get(WINDOWS_DRIVER_KEY)
            release_date = release.get(RELEASE_DATE_KEY)
            is_latest = release.get(VGPU_KEY) == branch.get(LATEST_KEY)
            branch_status = __parse_branch_status(release.get(BRANCH_STATUS_KEY, ''))

            version = f'{version} *' if is_latest else version
            s = f'{version:<13} | {linux_manager:<21} | {linux_driver:<21} | {windows_manager:<21} | {windows_driver:<21} | {release_date:>21} | {branch_status:^21}'
            print(s)


def __parse_branch_status(string: str) -> str:
    string = string.replace('Production Branch', 'Prod. -')
    string = string.replace('Long-Term Support Branch', 'LTS -')
    string = string.replace('supported until', '')
    string = string.replace('EOL since', 'EOL - ')
    string = string.replace('EOL from', 'EOL -')
    return string


def __dump(filename: str):
    import json

    with open(filename, 'w') as file:
        json.dump(MATRIX, file)


if __name__ == '__main__':
    MATRIX = {}

    try:
        import httpx
        from bs4 import BeautifulSoup
    except Exception as e:
        logger.error(f'Failed to import module: {e}')
        logger.info('Run "pip install beautifulsoup4 httpx"')
        exit(1)

    r = httpx.get(URL)
    if r.status_code != 200:
        logger.error(f'Error loading "{URL}" with status code {r.status_code}.')
        exit(2)

    # parse html
    soup = BeautifulSoup(r.text, features='html.parser')

    # build matrix
    __driver_versions(soup)

    # debug output
    __debug()

    # dump data to file
    __dump('../app/static/driver_matrix.json')
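The string normalization in `__parse_branch_status` above is easy to exercise in isolation. A standalone sketch with the same replacement chain (the sample inputs are illustrative, not taken from the NVIDIA page):

```python
def parse_branch_status(string: str) -> str:
    # same replacement chain as __parse_branch_status in the script above
    string = string.replace('Production Branch', 'Prod. -')
    string = string.replace('Long-Term Support Branch', 'LTS -')
    string = string.replace('supported until', '')
    string = string.replace('EOL since', 'EOL - ')
    string = string.replace('EOL from', 'EOL -')
    return string


# abbreviates the long branch labels so they fit the debug table
print(parse_branch_status('Production Branch supported until Jul 2026'))
print(parse_branch_status('EOL since Jan 2024'))
```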

View File

@@ -1,93 +1,51 @@
-import json
+from os import getenv as env
-import sys
 from base64 import b64encode as b64enc
-from calendar import timegm
-from datetime import datetime, UTC
 from hashlib import sha256
-from json import loads as json_loads, dumps as json_dumps
+from calendar import timegm
-from uuid import uuid4, UUID
+from datetime import datetime
+from uuid import UUID, uuid4
-from cryptography.hazmat.primitives.asymmetric.padding import PKCS1v15
-from cryptography.hazmat.primitives.hashes import SHA256
 from dateutil.relativedelta import relativedelta
-import jwt
+from jose import jwt
+from jose.constants import ALGORITHMS
 from starlette.testclient import TestClient
+from sqlalchemy import create_engine
+import sys
 
 # add relative path to use packages as they were in the app/ dir
 sys.path.append('../')
 sys.path.append('../app')
 from app import main
-from util import CASetup, PrivateKey, PublicKey, Cert
+from app.orm import init as db_init, migrate, Site, Instance
 
-client = TestClient(main.app)
-
-# Instance
-INSTANCE_REF = '10000000-0000-0000-0000-000000000001'
 ORIGIN_REF, ALLOTMENT_REF, SECRET = str(uuid4()), '20000000-0000-0000-0000-000000000001', 'HelloWorld'
 
-# CA & Signing
-ca_setup = CASetup(service_instance_ref=INSTANCE_REF)
-my_root_private_key = PrivateKey.from_file(ca_setup.root_private_key_filename)
-my_root_certificate = Cert.from_file(ca_setup.root_certificate_filename)
-my_ca_certificate = Cert.from_file(ca_setup.ca_certificate_filename)
-my_ca_private_key = PrivateKey.from_file(ca_setup.ca_private_key_filename)
-my_si_private_key = PrivateKey.from_file(ca_setup.si_private_key_filename)
-my_si_private_key_as_pem = my_si_private_key.pem()
-my_si_public_key = my_si_private_key.public_key()
-my_si_public_key_as_pem = my_si_private_key.public_key().pem()
-my_si_certificate = Cert.from_file(ca_setup.si_certificate_filename)
-
-jwt_encode_key = my_si_private_key.pem()
-jwt_decode_key = my_si_private_key.public_key().pem()
+# fastapi setup
+client = TestClient(main.app)
+
+# database setup
+db = create_engine(str(env('DATABASE', 'sqlite:///db.sqlite')))
+db_init(db), migrate(db)
+
+# test vars
+DEFAULT_SITE, DEFAULT_INSTANCE = Site.get_default_site(db), Instance.get_default_instance(db)
+SITE_KEY = DEFAULT_SITE.site_key
+jwt_encode_key, jwt_decode_key = DEFAULT_INSTANCE.get_jwt_encode_key(), DEFAULT_INSTANCE.get_jwt_decode_key()
 
 def __bearer_token(origin_ref: str) -> str:
-    # token = jwt.encode({"origin_ref": origin_ref}, key=jwt_encode_key, algorithm=ALGORITHMS.RS256)
-    token = jwt.encode(payload={"origin_ref": origin_ref}, key=jwt_encode_key, algorithm='RS256')
+    token = jwt.encode({"origin_ref": origin_ref}, key=jwt_encode_key, algorithm=ALGORITHMS.RS256)
     token = f'Bearer {token}'
     return token
 
-def test_signing():
-    signature_set_header = my_si_private_key.generate_signature(b'Hello')
-
-    # test plain
-    my_si_public_key.verify_signature(signature_set_header, b'Hello')
-
-    # test "X-NLS-Signature: b'....'
-    x_nls_signature_header_value = f'{signature_set_header.hex().encode()}'
-    assert f'{x_nls_signature_header_value}'.startswith('b\'')
-    assert f'{x_nls_signature_header_value}'.endswith('\'')
-
-    # test eval
-    signature_get_header = eval(x_nls_signature_header_value)
-    signature_get_header = bytes.fromhex(signature_get_header.decode('ascii'))
-    my_si_public_key.verify_signature(signature_get_header, b'Hello')
-
-def test_keypair_and_certificates():
-    assert my_root_certificate.public_key().mod() == my_root_private_key.public_key().mod()
-    assert my_ca_certificate.public_key().mod() == my_ca_private_key.public_key().mod()
-    assert my_si_certificate.public_key().mod() == my_si_public_key.mod()
-
-    assert len(my_root_certificate.public_key().mod()) == 1024
-    assert len(my_ca_certificate.public_key().mod()) == 1024
-    assert len(my_si_certificate.public_key().mod()) == 512
-    # assert my_si_certificate.public_key().mod() != my_si_public_key.mod()
-
-    my_root_certificate.public_key().raw().verify(
-        my_ca_certificate.raw().signature,
-        my_ca_certificate.raw().tbs_certificate_bytes,
-        PKCS1v15(),
-        SHA256(),
-    )
-
-    my_ca_certificate.public_key().raw().verify(
-        my_si_certificate.raw().signature,
-        my_si_certificate.raw().tbs_certificate_bytes,
-        PKCS1v15(),
-        SHA256(),
-    )
+def test_initial_default_site_and_instance():
+    default_site, default_instance = Site.get_default_site(db), Instance.get_default_instance(db)
+    assert default_site.site_key == Site.INITIAL_SITE_KEY_XID
+    assert default_instance.instance_ref == Instance.DEFAULT_INSTANCE_REF
 
 def test_index():
@@ -106,12 +64,6 @@ def test_config():
     assert response.status_code == 200
 
-def test_config_root_ca():
-    response = client.get('/-/config/root-certificate')
-    assert response.status_code == 200
-    assert response.content.decode('utf-8').strip() == my_root_certificate.pem().decode('utf-8').strip()
 
 def test_readme():
     response = client.get('/-/readme')
     assert response.status_code == 200
@@ -127,41 +79,6 @@ def test_client_token():
     assert response.status_code == 200
 
-def test_config_token():
-    # https://git.collinwebdesigns.de/nvidia/nls/-/blob/main/src/test/test_config_token.py
-    response = client.post('/leasing/v1/config-token', json={"service_instance_ref": INSTANCE_REF})
-    assert response.status_code == 200
-
-    nv_response_certificate_configuration = response.json().get('certificateConfiguration')
-    nv_ca_chain = nv_response_certificate_configuration.get('caChain')[0].encode('utf-8')
-    nv_ca_chain = Cert(nv_ca_chain)
-    nv_response_public_cert = nv_response_certificate_configuration.get('publicCert').encode('utf-8')
-    nv_response_public_key = nv_response_certificate_configuration.get('publicKey')
-    nv_si_certificate = Cert(nv_response_public_cert)
-    assert nv_si_certificate.public_key().mod() == nv_response_public_key.get('mod')[0]
-    assert nv_si_certificate.authority_key_identifier() == nv_ca_chain.subject_key_identifier()
-
-    # nv_jwt_decode_key = jwk.construct(nv_response_public_cert, algorithm=ALGORITHMS.RS256)
-    nv_response_config_token = response.json().get('configToken')
-    # payload = jws.verify(nv_response_config_token, key=nv_jwt_decode_key, algorithms=ALGORITHMS.RS256)
-    payload = jwt.decode(jwt=nv_response_config_token, key=nv_si_certificate.public_key().pem(), algorithms=['RS256'], options={'verify_signature': False})
-    assert payload.get('iss') == 'NLS Service Instance'
-    assert payload.get('aud') == 'NLS Licensed Client'
-    assert payload.get('service_instance_ref') == INSTANCE_REF
-    nv_si_public_key_configuration = payload.get('service_instance_public_key_configuration')
-    nv_si_public_key_me = nv_si_public_key_configuration.get('service_instance_public_key_me')
-    assert len(nv_si_public_key_me.get('mod')) == 512  # nv_si_public_key_mod
-    assert nv_si_public_key_me.get('exp') == 65537  # nv_si_public_key_exp
 
 def test_origins():
     pass
@@ -199,7 +116,6 @@ def test_auth_v1_origin():
     assert response.json().get('origin_ref') == ORIGIN_REF
 
 def auth_v1_origin_update():
     payload = {
         "registration_pending": False,
@@ -230,12 +146,12 @@ def test_auth_v1_code():
     response = client.post('/auth/v1/code', json=payload)
     assert response.status_code == 200
 
-    payload = jwt.decode(response.json().get('auth_code'), key=my_si_public_key_as_pem, algorithms=['RS256'])
+    payload = jwt.get_unverified_claims(token=response.json().get('auth_code'))
     assert payload.get('origin_ref') == ORIGIN_REF
 
 def test_auth_v1_token():
-    cur_time = datetime.now(UTC)
+    cur_time = datetime.utcnow()
     access_expires_on = cur_time + relativedelta(hours=1)
 
     payload = {
@@ -247,7 +163,7 @@ def test_auth_v1_token():
         "kid": "00000000-0000-0000-0000-000000000000"
     }
     payload = {
-        "auth_code": jwt.encode(payload, key=jwt_encode_key, headers={'kid': payload.get('kid')}, algorithm='RS256'),
+        "auth_code": jwt.encode(payload, key=jwt_encode_key, headers={'kid': payload.get('kid')}, algorithm=ALGORITHMS.RS256),
         "code_verifier": SECRET,
     }
@@ -255,19 +171,18 @@ def test_auth_v1_token():
     assert response.status_code == 200
 
     token = response.json().get('auth_token')
-    payload = jwt.decode(token, key=jwt_decode_key, algorithms=['RS256'], options={'verify_signature': False})
+    payload = jwt.decode(token=token, key=jwt_decode_key, algorithms=ALGORITHMS.RS256, options={'verify_aud': False})
     assert payload.get('origin_ref') == ORIGIN_REF
 
 def test_leasing_v1_lessor():
     payload = {
-        'client_challenge': 'my_unique_string',
         'fulfillment_context': {
             'fulfillment_class_ref_list': []
         },
         'lease_proposal_list': [{
             'license_type_qualifiers': {'count': 1},
-            'product': {'name': 'NVIDIA Virtual Applications'}
+            'product': {'name': 'NVIDIA RTX Virtual Workstation'}
         }],
         'proposal_evaluation_mode': 'ALL_OF',
         'scope_ref_list': [ALLOTMENT_REF]
@@ -276,21 +191,12 @@ def test_leasing_v1_lessor():
     response = client.post('/leasing/v1/lessor', json=payload, headers={'authorization': __bearer_token(ORIGIN_REF)})
     assert response.status_code == 200
 
-    client_challenge = response.json().get('client_challenge')
-    assert client_challenge == payload.get('client_challenge')
-
-    signature = eval(response.headers.get('X-NLS-Signature'))
-    assert len(signature) == 512
-    signature = bytes.fromhex(signature.decode('ascii'))
-    assert len(signature) == 256
-    my_si_public_key.verify_signature(signature, response.content)
-
     lease_result_list = response.json().get('lease_result_list')
     assert len(lease_result_list) == 1
     assert len(lease_result_list[0]['lease']['ref']) == 36
     assert str(UUID(lease_result_list[0]['lease']['ref'])) == lease_result_list[0]['lease']['ref']
-    assert lease_result_list[0]['lease']['product_name'] == 'NVIDIA Virtual Applications'
-    assert lease_result_list[0]['lease']['feature_name'] == 'GRID-Virtual-Apps'
-
-    return lease_result_list[0]['lease']['ref']
 
 def test_leasing_v1_lessor_lease():
@@ -310,18 +216,9 @@ def test_leasing_v1_lease_renew():
     ###
 
-    payload = {'client_challenge': 'my_unique_string'}
-    response = client.put(f'/leasing/v1/lease/{active_lease_ref}', json=payload, headers={'authorization': __bearer_token(ORIGIN_REF)})
+    response = client.put(f'/leasing/v1/lease/{active_lease_ref}', headers={'authorization': __bearer_token(ORIGIN_REF)})
     assert response.status_code == 200
 
-    client_challenge = response.json().get('client_challenge')
-    assert client_challenge == payload.get('client_challenge')
-
-    signature = eval(response.headers.get('X-NLS-Signature'))
-    assert len(signature) == 512
-    signature = bytes.fromhex(signature.decode('ascii'))
-    assert len(signature) == 256
-    my_si_public_key.verify_signature(signature, response.content)
-
     lease_ref = response.json().get('lease_ref')
     assert len(lease_ref) == 36
     assert lease_ref == active_lease_ref
@@ -343,23 +240,7 @@ def test_leasing_v1_lease_delete():
 def test_leasing_v1_lessor_lease_remove():
-    # see "test_leasing_v1_lessor()"
-    payload = {
-        'fulfillment_context': {
-            'fulfillment_class_ref_list': []
-        },
-        'lease_proposal_list': [{
-            'license_type_qualifiers': {'count': 1},
-            'product': {'name': 'NVIDIA Virtual Applications'}
-        }],
-        'proposal_evaluation_mode': 'ALL_OF',
-        'scope_ref_list': [ALLOTMENT_REF]
-    }
-    response = client.post('/leasing/v1/lessor', json=payload, headers={'authorization': __bearer_token(ORIGIN_REF)})
-    lease_result_list = response.json().get('lease_result_list')
-    lease_ref = lease_result_list[0]['lease']['ref']
-
-    #
+    lease_ref = test_leasing_v1_lessor()
     response = client.delete('/leasing/v1/lessor/leases', headers={'authorization': __bearer_token(ORIGIN_REF)})
     assert response.status_code == 200