Model testing using the Phi-3 model and IBM's Granite model
This is a Flask application that connects both the Phi-3 model and IBM's Granite 3B model to help with tasks such as code generation and whatever else the user sees fit. The Flask routing is set up so that you can use either the Phi-3 model or the Granite model depending on the task.
Example curl request that doesn't specify a model -> defaults to Phi-3
curl -X POST -H "Content-Type: application/x-www-form-urlencoded" -d "query=Write me a script to post files to Box.com from a Windows 10 machine." http://169.231.231.109:5000/query/
This request returns setup instructions from the Phi-3 model for posting files to Box.com from a Windows machine.
Example curl request that does specify a model -> Granite 3B
curl -X POST -H "Content-Type: application/x-www-form-urlencoded" -d "query=Write me a script to post files to Box.com from a Windows 10 machine." http://169.231.231.109:5000/query/granite
As you can see, we specify the model at the end of the URL as granite; that gets recognized in the testing.py file, which changes the tokenizer and model accordingly.
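A minimal sketch of how that dispatch could look inside testing.py; the function name, registry, and default here are assumptions for illustration, not the repo's actual code:

```python
# Hypothetical model-selection logic (names and structure assumed).
# In the real app, the chosen key would pick which Hugging Face
# tokenizer and model to use for the request.
SUPPORTED_MODELS = {"phi", "granite"}

def select_model(name=None):
    """Map the trailing URL segment to a model key; default to phi."""
    if not name:                      # no segment -> default model
        return "phi"
    if name not in SUPPORTED_MODELS:  # mirrors the error output shown below
        raise ValueError("Only models phi and granite are supported.")
    return name
```

In the Flask app, a helper like this would presumably sit behind routes such as `@app.route('/query/')` and `@app.route('/query/<model>')`, so that the path segment drives which tokenizer/model pair handles the query.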
If you request a model that isn't in the set of available models, you will get an error:
curl -X POST -H "Content-Type: application/x-www-form-urlencoded" -d "query=Write me an example of Go programming code." http://169.231.231.109:5000/query/phi-5
Only models phi and granite are supported.
These models can be very helpful with code generation; test using these requests:
Phi-3 generation with Go code:
curl -X POST -H "Content-Type: application/x-www-form-urlencoded" -d "query=Write me an example of Go programming code." http://169.231.231.109:5000/query/phi
Granite 3b generation with Go code:
curl -X POST -H "Content-Type: application/x-www-form-urlencoded" -d "query=Write me an example of Go programming code." http://169.231.231.109:5000/query/granite
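The curl calls above can also be issued from Python. Here is a sketch using only the standard library that builds the same form-encoded POST; the host and port are the example values from above, and `build_query` is a made-up helper, not part of the repo:

```python
from urllib import parse, request

def build_query(prompt, model=None, host="http://169.231.231.109:5000"):
    """Build a POST request matching the curl examples above."""
    url = f"{host}/query/" + (model or "")
    data = parse.urlencode({"query": prompt}).encode()
    headers = {"Content-Type": "application/x-www-form-urlencoded"}
    return request.Request(url, data=data, headers=headers)

req = build_query("Write me an example of Go programming code.", model="granite")
# request.urlopen(req) would send it once the Flask server is running
```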
How to run this on your own
- Start by setting up a virtual environment within the directory of your choice.
python -m venv env
source env/bin/activate
- You will need to pip install torch, transformers, accelerate, and flask in order to properly run the application.
pip freeze > requirements.txt
will help set up a requirements text file as shown in the repo.
- If you haven't yet, also make sure to fork Phi-3: https://huggingface.co/microsoft/Phi-3-mini-4k-instruct#sample-inference-code and IBM Granite 3B: https://huggingface.co/ibm-granite/granite-3b-code-instruct#generation from Hugging Face, and change the
@app.route
in the testing.py file to match the locations where you host those models.
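For reference, a requirements.txt produced this way would list at least the packages named above; pip freeze also pins exact versions and transitive dependencies, which this minimal sketch omits:

```
accelerate
flask
torch
transformers
```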
Adding more models for testing
In progress...
Overall, the Phi-3 model may be better used as a way to generate text, while the Granite model may be better at developing code and generating software solutions.