
# Model testing using the Phi-3 model and IBM's Granite model

This is a Flask application that connects both the Phi-3 model and IBM's Granite 3b model to help with tasks such as code generation and anything else the user might see fit. The Flask routing is set up so that you can use either the Phi-3 model or the Granite model depending on the task.

Example curl request that does not specify a model (defaults to Phi-3):

```shell
curl -X POST -H "Content-Type: application/x-www-form-urlencoded" \
  -d "query=Write me a script to post files to Box.com from a Windows 10 machine." \
  http://169.231.231.109:5000/query/
```

This request returns setup instructions for posting files to Box.com from a Windows machine, generated by the Phi-3 model.

Example curl request that does specify a model (Granite 3b):

```shell
curl -X POST -H "Content-Type: application/x-www-form-urlencoded" \
  -d "query=Write me a script to post files to Box.com from a Windows 10 machine." \
  http://169.231.231.109:5000/query/granite
```

As you can see, we specify the model at the end of the URL as `granite`; that value then gets recognized in the testing.py file, which switches the tokenizer and model accordingly.

If you request a model that isn't in the set of available models, you will get an error:

```shell
curl -X POST -H "Content-Type: application/x-www-form-urlencoded" \
  -d "query=Write me an example of Go programming code." \
  http://169.231.231.109:5000/query/phi-5
```

This returns: `Only models phi and granite are supported.`

These models can be very helpful with code generation; try them with these requests.

Phi-3 generation with Go code:

```shell
curl -X POST -H "Content-Type: application/x-www-form-urlencoded" \
  -d "query=Write me an example of Go programming code." \
  http://169.231.231.109:5000/query/phi
```

Granite 3b generation with Go code:

```shell
curl -X POST -H "Content-Type: application/x-www-form-urlencoded" \
  -d "query=Write me an example of Go programming code." \
  http://169.231.231.109:5000/query/granite
```
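The routing behavior shown in the examples above can be sketched roughly as follows. This is a minimal sketch, not the actual code in testing.py: the route shapes and error message mirror the curl examples, but the handler and the generation call are hypothetical placeholders.

```python
from flask import Flask, request

app = Flask(__name__)

SUPPORTED_MODELS = {"phi", "granite"}

# /query/ defaults to phi; /query/<model> selects a model explicitly.
@app.route("/query/", defaults={"model": "phi"}, methods=["POST"])
@app.route("/query/<model>", methods=["POST"])
def query(model):
    if model not in SUPPORTED_MODELS:
        # Matches the error the server returns for e.g. /query/phi-5.
        return "Only models phi and granite are supported.", 400
    prompt = request.form.get("query", "")
    # testing.py would pick the matching tokenizer/model here and
    # generate a real response; this sketch just echoes the selection.
    return f"[{model}] response to: {prompt}"
```

With this wiring, a POST to `/query/` runs the default (phi) path, `/query/granite` selects Granite, and any other name falls through to the error branch.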

## How to run this on your own

In progress.

## Adding more models for testing

- Head over to Hugging Face's website -> Models -> select the model of your choice -> `git clone` it into the same directory as testing.py -> add a condition within the try block of the `generate_response()` function to recognize the model you're adding.
- Additionally, add a tokenizer that matches the new model, for example `tokenizer_phi = AutoTokenizer.from_pretrained("microsoft/Phi-3-mini-4k-instruct")` for the Phi-3 model.
- Finally, load the model using `AutoModelForCausalLM.from_pretrained`, similarly to the phi and granite models shown in testing.py.
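The steps above can be sketched as a small registry that maps the short names used in the URL to Hugging Face model IDs. This is a sketch, not code from testing.py: `load_model` is a hypothetical helper, and the Granite model ID below is an assumption, so verify the exact repository name on huggingface.co before using it.

```python
# Map the short names used in the URL to Hugging Face model IDs.
# The granite ID is an assumption; check the exact name on huggingface.co.
MODEL_IDS = {
    "phi": "microsoft/Phi-3-mini-4k-instruct",
    "granite": "ibm-granite/granite-3b-code-instruct",
}

def load_model(name):
    """Return (tokenizer, model) for a registered short model name."""
    if name not in MODEL_IDS:
        raise ValueError(
            "Only models %s are supported." % ", ".join(sorted(MODEL_IDS))
        )
    # Imported lazily so the name check above works even before
    # transformers is installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer
    tokenizer = AutoTokenizer.from_pretrained(MODEL_IDS[name])
    model = AutoModelForCausalLM.from_pretrained(MODEL_IDS[name])
    return tokenizer, model
```

With a registry like this, adding another model is a one-line change to `MODEL_IDS` plus cloning or downloading the weights, rather than another branch in the try block.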

Overall, the Phi-3 model may be better used for general text generation, while the Granite model may be better at developing code and generating software solutions.