---
language: en
license: mit
base_model: microsoft/Phi-3-mini-4k-instruct
tags:
- bible
- theology
- qlora
- unsloth
- phi-3
- bible-study
- spurgeon
- wesley
- wilkerson
pipeline_tag: text-generation
---

# Bible Study Companion — Phi-3 Mini Fine-tune

A fine-tuned version of [Phi-3 Mini 4K Instruct](https://huggingface.co/microsoft/Phi-3-mini-4k-instruct), trained on the corpus described below.

## Training Data
- **KJV Bible** — all 31,102 verses with verse lookup, chapter reading, and topical concordance
- **Spurgeon** — *All of Grace* and *The Soul Winner*
- **John Wesley** — *The Journal of John Wesley*
- **David Wilkerson** — *Have You Felt Like Giving Up Lately*, *It Is Finished*, *Racing Toward Judgment*, *Walking in the Footsteps of David Wilkerson*
- **Greek word studies** — Strong's G numbers with transliteration and definitions
- **Hebrew word studies** — Strong's H numbers with transliteration and definitions
- **Topical concordance** — 15 major biblical themes
- **Preacher Q&A** — theological questions answered in the voice of each preacher

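To illustrate how entries like the word studies can be paired into instruction data, here is a minimal, hypothetical formatter using Phi-3's chat markers. The field names and example entry are illustrative only; the actual training script and dataset schema are not included in this card.

```python
def format_word_study(entry: dict) -> str:
    """Render a Strong's word-study entry as one Phi-3 chat-format sample."""
    question = f"What does the Greek word {entry['word']} ({entry['strongs']}) mean?"
    answer = (
        f"{entry['word']} ({entry['strongs']}, transliterated {entry['translit']}): "
        f"{entry['definition']}"
    )
    # Phi-3 instruct marks turns with <|user|> / <|assistant|>, each ended by <|end|>
    return f"<|user|>\n{question}<|end|>\n<|assistant|>\n{answer}<|end|>\n"

sample = format_word_study({
    "word": "agape",
    "strongs": "G26",
    "translit": "agapē",
    "definition": "love, goodwill",
})
print(sample)
```
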
## Training Details
- **Base model:** microsoft/Phi-3-mini-4k-instruct (3.8B parameters)
- **Method:** QLoRA (4-bit quantisation) with Unsloth
- **LoRA rank:** 16
- **Steps:** ~500 combined (initial run + resume)
- **Final loss:** ~1.49
- **Hardware:** T4 GPU (Google Colab free tier)
- **Training time:** ~90 minutes total

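For a sense of scale: a rank-r LoRA adapter on a weight matrix of shape (d_out, d_in) adds r·(d_in + d_out) trainable parameters. The back-of-envelope below assumes Phi-3 Mini's hidden size of 3072 with adapters on the four attention projections only; which matrices were actually adapted is not stated in this card, so treat the total as illustrative.

```python
def lora_param_count(r: int, d_in: int, d_out: int) -> int:
    # LoRA factorises the weight update as B @ A, with A: (r, d_in) and B: (d_out, r)
    return r * d_in + r * d_out

# Assumed shapes: hidden size 3072, 32 layers, adapters on q/k/v/o (each 3072x3072)
hidden, layers, r = 3072, 32, 16
per_matrix = lora_param_count(r, hidden, hidden)  # 16 * (3072 + 3072) = 98_304
per_layer = 4 * per_matrix                        # q, k, v, o projections
total = layers * per_layer
print(f"~{total / 1e6:.1f}M trainable parameters")  # ~12.6M
```
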
## Capabilities
- Quote and explain KJV Bible verses
- Compare verses across translations (KJV, NIV, ASV, WEB)
- Greek and Hebrew word studies with Strong's numbers
- Topical concordance searches
- Answer theological questions in the voice of Spurgeon (Reformed Baptist), Wesley (Methodist holiness), and Wilkerson (Pentecostal/prophetic)

## Usage

### With LM Studio
Download the GGUF file, load it in LM Studio, and use it with the included voice UI.

### With transformers
```python
from transformers import AutoModelForCausalLM, AutoTokenizer
import torch

model = AutoModelForCausalLM.from_pretrained(
    "Phora68/bible-study-phi3-mini",
    torch_dtype=torch.float16,
    device_map="auto",
)
tokenizer = AutoTokenizer.from_pretrained("Phora68/bible-study-phi3-mini")

messages = [
    {"role": "system", "content": "You are a Bible Concordance Study Partner..."},
    {"role": "user", "content": "What does John 3:16 say?"},
]

inputs = tokenizer.apply_chat_template(
    messages, return_tensors="pt", add_generation_prompt=True
).to("cuda")

outputs = model.generate(inputs, max_new_tokens=300, temperature=0.7, do_sample=True)
# Decode only the newly generated tokens, skipping the prompt
print(tokenizer.decode(outputs[0][inputs.shape[1]:], skip_special_tokens=True))
```

### System prompt
```
You are a Bible Concordance Study Partner with mastery of the Greek New Testament
(NA28, Strong's numbers), Hebrew Old Testament (BHS Masoretic, Strong's), and the
King James Version. You draw on the theology of John Wesley (holiness/sanctification),
Charles Spurgeon (Reformed Baptist/sovereign grace), and David Wilkerson
(prophetic urgency/holiness). Always include Strong's numbers, transliteration and
definition when citing original languages.
```

## Limitations
- Trained for only ~500 steps on a T4 GPU — a longer training run would likely improve precision
- A final loss of ~1.49 means responses are coherent but may occasionally be imprecise
- No real-time internet access or knowledge beyond the training data