It's a simple integration between Cisco's Webex Skills SDK and the OpenAI (ChatGPT) API: it forwards every inquiry you send to this Webex Assistant Skill straight on to ChatGPT. Yeah, that's basically all it does, and yes, I agree it's pretty cool.
For now we'll settle for this (very) high-level design to give you an idea of how it's done.
(Yes, the diagram was made with Excalidraw.)
I would say it's pretty simple, since I've containerized the whole thing to make it portable and as easy to use as possible, so you don't have to fiddle with SDKs and the like. One day I'll explain the logic behind it in the following section; for now, just jump to the TL;DR section.
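If you're curious what "forwarding to ChatGPT" actually boils down to: conceptually, every utterance the skill receives ends up as a request to the OpenAI Chat Completions API. The sketch below is purely illustrative; the model, the prompt and the way the key is read are assumptions, not necessarily what the backend really sends.

```bash
# Illustrative only: roughly the kind of request the backend makes for each utterance.
# OPENAPI_KEY is the same token you will later put in the .env file.
curl -s https://api.openai.com/v1/chat/completions \
  -H "Authorization: Bearer $OPENAPI_KEY" \
  -H "Content-Type: application/json" \
  -d '{
        "model": "gpt-3.5-turbo",
        "messages": [{"role": "user", "content": "Explain BGP in one sentence"}]
      }'
```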
- A Linux server with Docker installed (or any other container tool of your choice, e.g. Podman + Buildah... I used Docker, though).

  - TIP: you can use the Docker install script to quickly install Docker on supported distros in a development environment (you can get more details on installing Docker here):

    ```bash
    curl -fsSL https://get.docker.com -o get-docker.sh
    sudo sh get-docker.sh
    ```
- A public fully qualified domain name (FQDN) pointing to the VPS that hosts the containers, reachable from the Internet on TCP port 443 (a quick sanity check of these prerequisites is sketched right after this list).

- A public certificate for your FQDN.

  - TIP: you can use certbot to generate a 3-month public certificate from Let's Encrypt. Certbot can be launched in a container:

    ```bash
    sudo docker run -it --rm --name certbot \
      -v "/etc/letsencrypt:/etc/letsencrypt" \
      -v "/var/lib/letsencrypt:/var/lib/letsencrypt" \
      -p 80:80 \
      certbot/certbot certonly
    ```

    This will start a wizard, which will ask for your FQDN, an email address and a couple more pieces of personal information. If you allow reachability from the Internet on port 80, you can do everything locally (option 1 of the certbot wizard). Once finished, your certificate and key will be saved in:

    ```
    /etc/letsencrypt/live/<your fqdn>/fullchain.pem
    /etc/letsencrypt/live/<your fqdn>/privkey.pem
    ```
- An OpenAI account with an API token.
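Before moving on, a quick sanity check of the items above can save you a troubleshooting round later. This is just a convenience sketch: `MY_FQDN` is a placeholder, and the certificate paths assume you used the certbot container from the tip above.

```bash
# Sanity-check the prerequisites (adjust MY_FQDN to your own domain).
MY_FQDN="skill.example.com"   # placeholder

docker --version              # Docker is installed
getent hosts "$MY_FQDN"       # the FQDN resolves to your VPS's public IP

# The certificate and key are readable, valid PEM (Base64) and not about to expire
sudo openssl x509 -in "/etc/letsencrypt/live/$MY_FQDN/fullchain.pem" -noout -subject -enddate
sudo openssl pkey -in "/etc/letsencrypt/live/$MY_FQDN/privkey.pem" -noout && echo "key parses OK"
```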
This is what it looks like to fire it up on a virtual private server, but you could just as well build the container images and deploy them on your preferred cloud container service, or scrap the nginx layer altogether and use a lambda function... I'll try to enrich this with other deployment scenarios in the future.
1. Clone the repo:

   ```bash
   git clone https://github.com/bbird81/webex-skill-gpt.git
   cd webex-skill-gpt
   ```
2. Use your certificate:

   - Rename your certificate chain and key to `fullchain.pem` and `privkey.pem`; they must be in Base64 (PEM) format (if you used certbot, the files already have these names).
   - Copy your certificate chain and key into the `containers/nginx-reverse-proxy/certificates` directory (a copy example is sketched right after this step list).
3. Fire up your containers.

   Use docker compose to build and start the containers.

   3.1. Edit the `.env` file with your favourite editor and fill in your OpenAI API token and the FQDN of your server:

   ```
   # Insert OpenAPI token below
   OPENAPI_KEY = "..."

   # Fully Qualified Domain Name of your Webex Skill URL
   FQDN = "..."
   ```

   3.2. Run docker compose:

   ```bash
   docker compose up -d
   ```
4. Create the skill.

   Create the skill following this tutorial.

   **Please pay attention: the URL in the Webex Skill portal MUST end with `/parse`.**

   The secret and public key can be found in the output of the docker compose logs, by issuing:

   ```bash
   docker compose logs
   ```

   For further details on logs, please check the Troubleshooting section.
5. Proceed to enable the skill in Webex Control Hub.
6. Test & invoke it!

   You might want to test it using this web tool or, if you're feeling lucky, just try it on a compatible device using the expression:

   "Hey Webex, tell <your skill name> <your request to ChatGPT>".

   (A quick reachability/TLS check for your deployment is sketched a bit further below.)
The complete guide for creating a generic Webex Assistant Skill can be found here:
https://developer.webex.com/docs/api/guides/webex-assistant-skills-guide
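Once the containers are up and DNS is in place, you can quickly confirm that nginx is presenting your certificate and that the skill URL is reachable from the outside. Note that this is only a reachability/TLS check: it does not exercise the skill logic itself, since Webex sends a signed payload to `/parse`, so don't read too much into whatever HTTP status code comes back.

```bash
# Reachability / TLS smoke test (does not exercise the skill logic itself).
MY_FQDN="skill.example.com"   # placeholder

# Show the certificate nginx presents on port 443
openssl s_client -connect "$MY_FQDN:443" -servername "$MY_FQDN" </dev/null 2>/dev/null \
  | openssl x509 -noout -subject -issuer -enddate

# Any HTTP response at all (even an error code) means nginx answered over HTTPS
curl -sS -o /dev/null -w "HTTP %{http_code}\n" "https://$MY_FQDN/parse"
```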
Everybody knows this: you can't win them all :-/
You might have run into some errors during the installation; I'm going to address the most common ones here. Please open an issue if none of this works for you and I'll see what I can do.
Never forget that this is a proof of concept provided "as is", so keep in mind that any support from my side is BEST EFFORT: please be kind and patient.
- If you don't set the OpenAI token or the FQDN of your VPS in the `.env` file, you will get the following messages:

  ```
  webex-skill-gpt-uvicorn-backend-1      | OpenAI API token not set: ABORTING!
  webex-skill-gpt-nginx-reverse-proxy-1  | FQDN env var not set: ABORTING!
  ```

  Set the values in the `.env` file and rebuild the container images with the following command:

  ```bash
  docker compose up -d --build --force-recreate
  ```

  To see only the latest run in the docker compose logs, you can use the `--since` parameter; in this example you'll see the logs of the last 3 minutes:

  ```bash
  docker compose logs --since 3m
  ```
- Certificate/private key not in Base64 format.

  If the certificate/key you put in the `containers/nginx-reverse-proxy/certificates` directory are not in Base64 (PEM) format, an error like this might show up:

  ```
  webex-skill-gpt-nginx-reverse-proxy-1 | 2023/06/28 16:30:28 [emerg] 10#10: cannot load certificate "/etc/ssl/certs/nginx/fullchain.pem": PEM_read_bio_X509_AUX() failed (SSL: error:04800064:PEM routines::bad base64 decode)
  webex-skill-gpt-nginx-reverse-proxy-1 | nginx: [emerg] cannot load certificate "/etc/ssl/certs/nginx/fullchain.pem": PEM_read_bio_X509_AUX() failed (SSL: error:04800064:PEM routines::bad base64 decode)
  ```

  If the extension of your certificate/private key is .cer, .der or .pfx, then it might be in binary format. You can check with cat/less/more followed by the file name; the certificate and key should look something like this:

  ```
  -----BEGIN CERTIFICATE-----
  IUUouds90udsab91290ueubei39329
  ... some multi-line rubbish here ...
  -----END CERTIFICATE-----
  -----BEGIN PRIVATE KEY-----
  UUouds90udsab91290ueubei39329
  ... some multi-line rubbish here ...
  -----END PRIVATE KEY-----
  ```

  To convert a certificate from binary to Base64 format you might use `openssl` (a sketch follows below). Also Google (or DuckDuckGo!) is your friend :-).
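If your certificate or key did turn out to be binary, the conversion with `openssl` typically looks like the sketch below. The input file names are placeholders; pick only the lines that match what you actually have (for a .pfx you will be prompted for its export password).

```bash
# DER-encoded certificate (.cer / .der) -> PEM (Base64)
openssl x509 -inform DER -in your-cert.cer -out fullchain.pem

# DER-encoded private key -> PEM
openssl pkey -inform DER -in your-key.der -out privkey.pem

# PKCS#12 bundle (.pfx / .p12) -> certificate chain and key in PEM
openssl pkcs12 -in your-bundle.pfx -nokeys  -out fullchain.pem
openssl pkcs12 -in your-bundle.pfx -nocerts -nodes -out privkey.pem
```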