---
language: en
license: apache-2.0
tags:
- neo4j
- cypher
- construction
- fine-tuned
base_model: mistralai/Mistral-7B-v0.1
---

# Mistral-7B Cypher - Voronode Construction

Fine-tuned Mistral-7B model for generating Neo4j Cypher queries for construction management systems.

## Model Details

- **Base Model**: Mistral-7B-v0.1
- **Fine-tuning Method**: LoRA (Low-Rank Adaptation)
- **Training Data**: 33 construction-specific Cypher examples
- **Domain**: Construction Management & Neo4j Graph Databases
|
## Usage

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained("baderanaas/mistral-cypher-voronode")
tokenizer = AutoTokenizer.from_pretrained("baderanaas/mistral-cypher-voronode")

# The tokenizer prepends the <s> BOS token automatically, so it is not
# written into the prompt string here.
prompt = "[INST] Your Cypher query task here [/INST]"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
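The `[INST] … [/INST]` wrapping above can be factored into a small helper so prompts stay consistent with the format used at training time. This is a minimal sketch; the example task string is hypothetical.

```python
def build_prompt(task: str) -> str:
    """Wrap a natural-language task in the Mistral [INST] instruction format.

    The tokenizer inserts the leading <s> BOS token itself, so it is
    deliberately not included here.
    """
    return f"[INST] {task.strip()} [/INST]"

# Example (hypothetical task):
prompt = build_prompt("List all invoices for the 'Harbor Tower' project.")
print(prompt)
```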
|
|
## Training Details

- **Epochs**: 3
- **Batch Size**: 16 (effective)
- **Learning Rate**: 2e-4
- **LoRA Rank**: 16
- **Max Sequence Length**: 2048
|
## Schema

The model is trained on construction management schemas including:
- Projects, Invoices, Contractors, Contracts
- Budgets, Budget Lines, Line Items
- Relationships between entities
|
## License

Apache 2.0
|
|